Group D / Demonstrator

Robots controlled by models

The demonstrator is made up of several robots (Raspberry Pi and LEGO based) and a track. This lets us test in practice how well controlling physical objects with models at runtime works.

How were the robots built?

The robots used in this project are several Pi2Go robots and one Pi2Go-Lite, which are kits for the Raspberry Pi, as well as one LEGO EV3.
All robots provide distance measurement using ultrasonic sensors, line following, and color detection.
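To illustrate the distance measurement, an ultrasonic sensor reports how long its echo pulse was high, and that duration is converted to a distance. This is a minimal sketch of the conversion only, not the actual Pi2Go driver code; the class and method names are invented:

```java
// Sketch: converting an ultrasonic echo pulse to a distance.
// The sensor emits a ping; the echo pin stays high for the time the
// sound needs to travel to the obstacle and back.
public class UltrasonicDistance {

    // Speed of sound in air, roughly 343 m/s at 20 °C.
    private static final double SPEED_OF_SOUND_CM_PER_US = 0.0343;

    /**
     * Converts the measured echo duration (microseconds) into the
     * one-way distance in centimeters. The pulse covers the round
     * trip, so the result is halved.
     */
    public static double pulseToCentimeters(long echoMicros) {
        return echoMicros * SPEED_OF_SOUND_CM_PER_US / 2.0;
    }

    public static void main(String[] args) {
        // An echo of ~583 µs corresponds to an obstacle ~10 cm away.
        System.out.println(pulseToCentimeters(583));
    }
}
```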

How were the robots programmed?

The Pi2Go was programmed using the Pi4J library, which “provides a friendly object-oriented I/O API and implementation libraries for Java Programmers to access the full I/O capabilities of the Raspberry Pi platform”.
For the communication between the robots, MQTT was used: a lightweight connectivity protocol developed for machine-to-machine communication.
Additionally, an Android app was developed to remotely control the robots.

The robots receive messages from the runtime and map them to actions they can physically execute. The set of possible messages is determined by the modeled scenario: every message that occurs in the specified scenario has to be understood by the affected physical objects.
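The mapping from runtime messages to physical actions could be sketched as a simple lookup table. The message names ("drive", "stop") and class names below are invented for illustration; the real message set comes from the modeled scenario:

```java
import java.util.Map;

// Sketch: mapping runtime messages to physical actions.
// Messages outside the modeled scenario are simply ignored.
public class MessageDispatcher {

    private final Map<String, Runnable> actions;

    public MessageDispatcher(Map<String, Runnable> actions) {
        this.actions = actions;
    }

    /** Executes the action for a message; returns false if unknown. */
    public boolean handle(String message) {
        Runnable action = actions.get(message);
        if (action == null) {
            return false; // message is not part of the scenario
        }
        action.run();
        return true;
    }

    public static void main(String[] args) {
        MessageDispatcher dispatcher = new MessageDispatcher(Map.of(
                "drive", () -> System.out.println("motors forward"),
                "stop",  () -> System.out.println("motors off")));
        dispatcher.handle("drive");   // prints "motors forward"
        dispatcher.handle("unknown"); // ignored: not in the scenario
    }
}
```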

What does the track look like?


How are relevant messages broadcasted?

We use MQTT as the communication protocol. It is currently considered well suited for the Internet of Things (IoT) because it supports communication between many devices and machines.
The protocol relies on a server that acts as a message broker. Every client can publish messages under a specified topic (path). If a client wants to listen to certain messages, it has to subscribe to a corresponding topic. The broker receives all messages and distributes them to all subscribers.
In our setup this means that every robot and every part of the infrastructure that sends SDL messages is an MQTT client.
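The broker's routing can be illustrated by how MQTT matches a published topic against a subscription filter: `+` matches exactly one path level and `#` matches all remaining levels. This is a simplified sketch of the matching rule with invented topic names, not part of any broker implementation:

```java
// Sketch: simplified MQTT topic matching, as the broker applies it
// when deciding which subscribers receive a published message.
public class TopicMatcher {

    /**
     * Returns true if the published topic matches the subscription
     * filter. '+' matches one level, '#' (last level only) matches
     * all remaining levels.
     */
    public static boolean matches(String filter, String topic) {
        String[] f = filter.split("/");
        String[] t = topic.split("/");
        for (int i = 0; i < f.length; i++) {
            if (f[i].equals("#")) {
                return true;            // wildcard for everything below
            }
            if (i >= t.length) {
                return false;           // topic has fewer levels
            }
            if (!f[i].equals("+") && !f[i].equals(t[i])) {
                return false;           // literal level differs
            }
        }
        return f.length == t.length;    // no topic levels left over
    }

    public static void main(String[] args) {
        // A subscriber to "robots/+/position" receives position
        // messages from every robot.
        System.out.println(matches("robots/+/position", "robots/r1/position"));
    }
}
```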

What are the results?

A video says more than many words, so see the following video: