When we started this project, we defined our goal as designing a robust and interesting robot platform for research, education and, last but not least, promotion. In that context, we have recently been demonstrating what Zeppy can do at two different events, which resulted in some nice pictures.
The first event was held in honour of our first-year students at the AI department receiving their first 60 points, their propedeuse. Their parents were all invited to come and see for themselves what their children actually do all day, and we were there to show off Zeppy. Though we must admit, it had been a while since Zeppy last flew, so some last-minute hardware changes had to be made.
But after that we were all set and ready to fly!
The second event was the open house the university held. All bachelor's programs were there to promote what they're doing, and Zeppy was there, among other cool stuff, to show what AI can be like.
Both were really fun events and we were very happy that we could once again show our work.
First of all, let me refresh your memory on the architecture we had in mind for our project. As you might recall, there are two computers involved in controlling our robot: the Ground Station runs the software called Uplink (introduced about three posts below), which can send and receive UDP messages to and from the other computer, which is Zeppy itself, flying around. The two communicate via Bluetooth, so we can tell Zeppy to 'do stuff' from Uplink, and Zeppy can then report back to Uplink what it senses (if needed). Our idea is that we can then implement different types of cognition in Zeppy, ranging from simple remote control, where the messages entirely determine what action is to be taken, to completely autonomous behavior, where Zeppy decides everything for itself (perhaps communicating with Uplink in the process, so that hard calculations can be offloaded to the Ground Station).
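To make this setup concrete, here is a minimal Java sketch of how the Uplink side might fire a command at Zeppy over UDP. The host name, port, and the textual "MOTOR &lt;id&gt; &lt;speed&gt;" message format are illustrative assumptions for this sketch, not our actual protocol.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Minimal sketch of Uplink sending a command to Zeppy over UDP.
// Host, port, and message format are illustrative assumptions.
public class UplinkSketch {

    // Encode a textual command as a UDP payload (plain bytes here).
    public static byte[] encode(String command) {
        return command.getBytes();
    }

    // Fire a single command at Zeppy; UDP is connectionless, so we
    // simply send the datagram and move on.
    public static void send(String host, int port, String command) throws Exception {
        DatagramSocket socket = new DatagramSocket();
        try {
            byte[] payload = encode(command);
            socket.send(new DatagramPacket(payload, payload.length,
                    InetAddress.getByName(host), port));
        } finally {
            socket.close();
        }
    }
}
```

In the real system Zeppy would answer over the same channel with its sensor readings, received the same way with `DatagramSocket.receive`.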
As it stands, the entire software architecture on Zeppy has been completed! Also, the "remote control" cognition has been implemented, so we have essentially created an RC blimp (which can even be controlled with a joystick or a compass). This may not seem all that impressive, but the framework is far more flexible and robust than what we had before, and it allows us to easily implement very different types of behavior.
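As a rough illustration of what we mean by pluggable cognitions, one could imagine an interface like the following. The names and the string-based commands are hypothetical, chosen for this sketch only; they are not our actual classes.

```java
// Hypothetical sketch of a pluggable cognition: each cognition maps the
// latest sensor values plus any Uplink message to a motor command string.
interface Cognition {
    String step(double[] sensors, String uplinkMessage);
}

// Remote control: the Uplink message alone determines the action;
// the sensor values are ignored entirely.
class RemoteControlCognition implements Cognition {
    public String step(double[] sensors, String uplinkMessage) {
        return uplinkMessage == null ? "IDLE" : uplinkMessage;
    }
}
```

A fully autonomous cognition would instead compute its command from the sensor values, consulting Uplink only occasionally, if at all.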
Encouraged by this and by the ideas of our supervisors, we have chosen to take on the challenge of implementing SLAM: Simultaneous Localization And Mapping. This would mean that we could let Zeppy fly around a room, or an entire building, for a while and have it tell us what it 'sees' while simultaneously working out where it is. To be realistic, we don't expect this to be an overnight implementation exercise. However, we're looking forward to trying, since the end result would be really cool.
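To hint at one ingredient of such a system: SLAM needs a map representation that is updated from sensor readings. A simple occupancy grid, sketched below, covers only the mapping half and, unlike full SLAM, assumes the position of each observation is already known; all names here are our own invention for illustration.

```java
// Sketch of an occupancy grid: for each cell we count how often a sensor
// (e.g. a sonar) reported it occupied versus how often it was observed.
// This assumes known poses, i.e. only the "mapping" half of SLAM.
class OccupancyGrid {
    private final int[][] hits;          // times a cell was seen occupied
    private final int[][] observations;  // times a cell was observed at all

    OccupancyGrid(int width, int height) {
        hits = new int[height][width];
        observations = new int[height][width];
    }

    // Record one sensor observation of cell (x, y).
    void observe(int x, int y, boolean occupied) {
        observations[y][x]++;
        if (occupied) hits[y][x]++;
    }

    // Estimated probability that cell (x, y) is occupied (0.5 = unknown).
    double occupancy(int x, int y) {
        int n = observations[y][x];
        return n == 0 ? 0.5 : (double) hits[y][x] / n;
    }
}
```

The hard part of SLAM, of course, is estimating the pose at the same time as building the map; this sketch only shows why a map that improves with repeated observations is useful at all.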
However, we're unfortunately still held back by some other issues. A lot of work is still needed at the hardware level. First of all, we noticed that the motors we use to propel the blimp are simply not strong enough. We're also still worried about our current envelope: although the engineering students have put a lot of work into it, it still isn't as robust as we hoped, and it is still very big. We're now looking into other options, such as a professionally made blimp.
As you see, although we have been silent on here for a long time, there’s quite a lot going on right now. Work in progress…
Last week the students at Electrical Engineering delivered the PCB, and what a change! Compare the pictures in the 'Sort of completed' post with the image below. The breadboard has been replaced, so the electronics are mostly finished.
This is not the first PCB that was made; due to layout errors in the previous design and the large amount of time it took to solder the SMD components, we decided that a different approach was needed.
The new PCB uses through-hole components instead of the surface-mounted components of the previous PCB, which makes soldering it a lot less work, and this time the design is solid.
Next up: attaching the third motor, used for controlling vertical speed, and making attachment points for the PCB.
For the past few months we've been using Assembla to manage our code. Assembla provides easy-to-use SVN, Trac and wiki capabilities for free to open-source projects. Since we strongly believe in Open Science, we've decided to share our code under the famous MIT license.
Before you check out our code, remember that we're undergrad students with little to no software development experience outside our classes. We'll be gradually improving and expanding our code as we acquire new knowledge.
Compiling and running the Java code for the client side should be no problem with Java 1.6. The C/C++ code should compile under Linux with GCC. To cross-compile for the Verdex Pro (OpenEmbedded), you need to install the OpenEmbedded toolchain.
Wednesday was judgment day for the people at the Hanze University, as that day marked the end of their eight-week project. Both Electrical Engineering and Mechanical Engineering worked hard to complete a working Zeppy before the 13:00 deadline. And they succeeded: Zeppy flew beautifully!
Some parts still need a finishing touch, like the printed circuit board (PCB) and the helium balloon. For the demo we replaced the PCB with an experiment board (breadboard), which we were able to lift with a slightly oversized balloon. Below are some pictures of this historical day!
In other news, our slightly outdated article for the Dutch magazine for Artificial Intelligence (de Connectie) got published! Head over to their site to see how you can get a copy!
We've also made great progress on the client side. We're now able to remote-control Zeppy with a joystick and have made a GUI to read and set Zeppy's status. Below is a screenshot of this system, which we call Uplink!
Our cooperation with the Hanze University is coming to an end. We expect to integrate the fruits of nine weeks of hard labour into a great product by the end of next week. It has been an awesome experience, and I hope the students from the Hanze University can only agree. A robotic zeppelin is emerging out of nothing, which was not an easy task. Finding the right material for the helium envelope proved extremely hard, and even after acquiring the desired material, coated mylar, working with it was problematic. The imagination of the engineering students was taxed with finding a suitable way of sealing the mylar envelope. Quite a few unorthodox sealing methods have been explored and, ultimately, rejected. At the moment they are trying a combination of a special mylar glue and aluminium tape, which seems to be working. As for the gondola, the finalized blueprints are being sent to a 3D printer that was made available to us.
On the electrical engineering side of the story, we were faced with the failure of our Gumstix module. This setback was quickly overcome by buying a new Gumstix and Robostix. However easy this simple swap might seem, it pulled some nasty tricks on us. There is no organized documentation for either component, so for the electrical engineers this meant browsing the internet for hours to get the system back to its original state. Not an easy task, considering that the platform had switched from Buildroot to OpenEmbedded, rendering most of our I2C modules useless. The missing I2C modules seemed easy to fix by compiling them from the available repositories, except those repositories turned out to be unavailable. We were able to get the system running again, however, and the AVR (a microcontroller) software almost immediately worked its magic. The AVR will communicate with a custom printed circuit board (PCB) which will provide access to the sensors and actuators. It's a strange thing to realize that you're communicating through a Bluetooth connection to a computer chip, talking to another computer chip, which is making motors run. At a distance!
While the students at the Hanze University were developing the hardware, we focused on creating good software for developing the robot's behavior. On the client side we now have a functional graphical user interface, written in Java, which is able to send and receive UDP (User Datagram Protocol) messages to and from Zeppy. On Zeppy's side we are now able to receive these messages and parse them into appropriate C++ objects. It now only seems to be a matter of tying the generated C++ objects to the I2C layer, and we have a remote-controlled blimp again! We've also started working on a simulation environment that will be able to simulate Zeppy's behavior on a basic level and replay Zeppy's log files so we can analyze them.
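To sketch what this parsing step looks like (in Java here, rather than the C++ we actually use on Zeppy), assume a hypothetical textual "MOTOR &lt;id&gt; &lt;speed&gt;" message; the format and names are illustrative, not our real protocol.

```java
// Sketch of turning an incoming textual UDP message into a structured
// command object, mirroring what Zeppy's C++ side does. The
// "MOTOR <id> <speed>" format is an illustrative assumption.
public class CommandParser {

    public static final class MotorCommand {
        public final int motorId;
        public final int speed;
        MotorCommand(int motorId, int speed) {
            this.motorId = motorId;
            this.speed = speed;
        }
    }

    public static MotorCommand parseMotor(String message) {
        String[] parts = message.trim().split("\\s+");
        if (parts.length != 3 || !parts[0].equals("MOTOR")) {
            throw new IllegalArgumentException("Not a MOTOR command: " + message);
        }
        return new MotorCommand(Integer.parseInt(parts[1]),
                                Integer.parseInt(parts[2]));
    }
}
```

Once a message is an object like this, tying it to the I2C layer is just a matter of writing the right register values for the addressed motor.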
When Zeppy is completed, the fun for us will really begin. We're considering multiple techniques for autonomous behavior, ranging from simple reactive control (like Braitenberg vehicles) to complex Simultaneous Localization and Mapping (SLAM) systems integrated in a hybrid control architecture. Keeping the zeppelin stable in an ever-changing environment will also provide the necessary challenges, and we are eager to try various control systems, like PID control or recurrent neural networks.
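As a taste of the simplest of these options, a minimal discrete PID controller looks like the sketch below; the altitude-holding use case and any concrete gains are illustrative assumptions.

```java
// Minimal discrete PID controller sketch, e.g. for holding altitude:
// the output could drive the vertical motor's thrust. Gains and the
// use case are illustrative assumptions.
public class PidController {
    private final double kp, ki, kd;
    private double integral = 0.0;
    private double previousError = 0.0;

    public PidController(double kp, double ki, double kd) {
        this.kp = kp;
        this.ki = ki;
        this.kd = kd;
    }

    // One control step: setpoint is the desired value (e.g. target
    // altitude), measured is the current sensor value, dt the time
    // since the previous step in seconds.
    public double update(double setpoint, double measured, double dt) {
        double error = setpoint - measured;
        integral += error * dt;
        double derivative = (error - previousError) / dt;
        previousError = error;
        return kp * error + ki * integral + kd * derivative;
    }
}
```

The appeal of PID here is that it needs no model of the blimp; a recurrent neural network, by contrast, could learn the blimp's sluggish dynamics but would be much harder to tune and debug.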
Of course we will flood this blog with pictures and videos when we are able to show the brand-new Zeppy, but for now we have some fun videos and pictures!
Testing variable motor speeds using Pulse Width Modulation (PWM)
The students from Mechanical Engineering have been working on designing a new gondola for Zeppy. The gondola has to be lightweight and small, but also strong and big enough to house components such as the Gumstix and the Robostix. The separate container at the bottom is designed to hold the battery.
We have decided not to improve, but to rebuild most of the zeppelin's hardware. To do this, we have called in the help of the Hanze University Groningen. The departments of Electrical Engineering and Mechanical Engineering are willing to cooperate; they have agreed to let the Zeppy team guide a group of approximately eight students, four from each department. We will lead these students in designing and building the hardware and the low-level software (the implementation of the API).
From February until April these students will build those elements of the zeppelin for which we lack the technical knowledge, like the balloon, the gondola and the steering system.
The progress and outcome of this cooperation will be posted later on this website.
The university has invited Castel Mediaproductions to create a promotional video for artificial intelligence, focussing on Zeppy. We are very proud of the result and very eager to show you. It can be found here.
We have developed a prototype for a new gondola for Zeppy. We thought it was time to replace the old one, which is basically just a plastic margarine tub with some things duct-taped to it. We want a gondola that is easier to adjust when we come up with new ideas. Here is a (very rough) indication of what we want the gondola to look like in the future: only slightly bigger, but with more ways to extend and improve it (like adding more sonars). And don't worry, the smiley is just for our fun. Currently we are working on having some engineers build this for us, completely customized. As for the material, we were thinking of something like Lexan, but maybe those engineers have some other great ideas for us.