Droid Racing Challenge 2017
On July 4, our team from the Griffith Robotics Lab went up to QUT in Brisbane to pit our car against the other universities. After everything that could possibly go wrong went wrong, we persevered and hacked together a working car. How did we go? Pretty amazingly, if I may say so myself. Alongside Macquarie Uni, QUT, UNSW, UQ and the University of Wollongong, we made it to the start line for the finals. Not a bad run with 11 teams from across Australia and New Zealand. During our development, we learned a few things.
Murphy was right. Everything went wrong. We initially had two cars: the car built for last year's challenge and another car that was supposed to use differential turning. Our second car had unrecoverable control problems, and we struggled to make it turn the way we wanted. Running off an Arduino and a Raspberry Pi, the car was originally designed to send visual data to a laptop, with the laptop running the control logic and returning driving instructions. Our first problem was that the Raspberry Pi B+ was not powerful enough to encode and stream a USB webcam, which we solved by upgrading to a Raspberry Pi 3. We then ran into the power and control limitations of the actual RC vehicle. A couple of weeks out from race day, the NVIDIA Jetson TX1 board for the first car died. Yes, we managed to kill a $300 embedded device. That car was designed to run the vision and turning logic onboard without any external systems. After a week of trying to salvage the situation, we concluded that the only solution was to Frankenstein our two cars together: a new car using the chassis of the first with the computer systems from the second. We had 4 days left.
Luckily, we didn't need to touch the Arduino code from the previous weeks. Our motor controls and car settings were perfect -- props to the IT and Engineering guys who did that for us! We then set about porting the Raspberry Pi code to C++ so we could quickly get it talking to our server. Every hour was critical and every milestone felt like an achievement. This was quickly turning into the most ridiculous hackathon. Using a single Logitech C920 webcam, we added a fish-eye lens to increase its field of view. The video feed was sent over WiFi to our laptop (UDP, if anyone is interested), where we started doing some pre-processing -- in this case, any image processing done on the video prior to our business logic (the automated driving). Our immediate goal was to find the edges of the blue and yellow track lines. Using fuzzy logic, we identified the colours, then applied edge detection to find the outlines. From there, we derived edge "points" and used a system of weights to determine the direction and intensity of the turns required to keep the car on track. Awesome theory... it didn't quite work in practice. Our car was too fast.
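To give a flavour of the weighting idea, here is a simplified Python sketch. To be clear: our real pipeline ran in C++ with OpenCV, and every name and constant below is invented for illustration. The principle is that each detected lane-edge point "votes" on a steering direction, with points lower in the frame (closer to the car) weighted more heavily.

```python
# Sketch of the edge-point weighting idea. All names and constants are
# invented for illustration; the real system ran OpenCV in C++.

FRAME_WIDTH = 640
FRAME_HEIGHT = 480

def steering_from_edges(edge_points):
    """edge_points: list of (x, y, colour) tuples, colour in {'blue', 'yellow'}.

    Returns a value in [-1.0, 1.0]: negative steers left, positive steers right.
    """
    total, weight_sum = 0.0, 0.0
    for x, y, colour in edge_points:
        proximity = y / FRAME_HEIGHT                         # nearer points matter more
        offset = (x - FRAME_WIDTH / 2) / (FRAME_WIDTH / 2)   # -1 (far left) .. 1 (far right)
        # A blue (left-hand) edge intruding toward the centre pushes us right;
        # a yellow (right-hand) edge intruding toward the centre pushes us left.
        if colour == 'blue':
            push = max(0.0, 1.0 + offset)
        else:
            push = -max(0.0, 1.0 - offset)
        total += push * proximity
        weight_sum += proximity
    if weight_sum == 0:
        return 0.0
    return max(-1.0, min(1.0, total / weight_sum))
```

A blue edge point dead-centre and close to the car produces a hard right (1.0), a yellow one a hard left (-1.0), and opposing edges cancel out -- which is roughly the behaviour we wanted, before the car's speed outran the decision loop.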
I will always be the first to say that our car looks awesome. The engineering team built an amazing car last year. Seriously amazing. Customisable suspension, differentials, 4WD, and the fact that we could fine-tune the wheel alignment helped a lot in getting our car to go straight (which is more difficult than you'd think). As it was originally an RC race car, it was amazingly fast too. How fast? The slowest we could get it to run was still too fast for our control system. Grasping at our last straw, we decided to send bursts of power to the motor in the hope of getting something slow enough for our system to make decisions. With the driving speed fixed, it was back to the driving logic to get the car to turn. Close to midnight on July 2, two days before the race, we managed to get the car to turn around a short corner in my garage. If we weren't so tired, we would have been cheering. We called it a day and relied on the rest of the team to tweak the code so that it would work for the next day's live trials at QUT and the actual race the day after.
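The burst trick is simple enough to sketch. This is illustrative Python with made-up timing constants -- the real version lived in the Arduino loop -- but the idea is the same: pulse the motor on for a short slice of each cycle so the average speed drops below what the throttle's minimum setting allows.

```python
# Sketch of the "bursts of power" trick: instead of a continuous throttle
# (whose minimum was still too fast), power the motor for only a short
# window of each cycle. Period and duty values are invented for the sketch.

BURST_PERIOD_MS = 200   # length of one on/off cycle
BURST_ON_MS = 60        # motor powered for this slice of each cycle

def throttle_at(t_ms, drive_speed):
    """Return the throttle to apply at time t_ms (milliseconds since start)."""
    phase = t_ms % BURST_PERIOD_MS
    return drive_speed if phase < BURST_ON_MS else 0
```

In effect this is a very slow, hand-rolled duty cycle: the car lurches forward in pulses, but the average speed finally gave the vision loop time to think.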
I wasn't able to go up for the two days, but as we hadn't been able to test in outdoor conditions until then, there were a number of teething issues. Thanks to the hard work of the team on site, the car managed to get around two-thirds of the course and come home with great feedback on the problems and challenges faced on the day.
What We Achieved
Looking back at our 4 months of working together, I think we achieved a lot with very little budget. First, we gained greater insight and experience in machine vision and fuzzy logic. I suspect this knowledge will be useful in future projects where a system has to make decisions based on how close its parameters are to the goals it's aiming for. Second, we worked with an interdisciplinary team spanning 1st to 3rd years and got the most out of each team member. I don't think we were always successful, but we managed to get a lot out of a team with backgrounds in engineering, programming, networking and systems engineering. While finding out what worked and what was doable was critical to the success of the project, it was finding out what doesn't work -- through experimenting and pushing the boundaries of our experience -- that led us there.
Be more agile, but make executive decisions. We dived into the project with an agile approach to development: trying something out and switching gears if it didn't work. At first, we wrote a lot of Python code for our Raspberry Pis, trying to get existing libraries working and finding ways to stream data between the laptop and the Pi. The results were mixed. In hindsight, we should have made the call to switch to running OpenCV on the Pi and streaming from there much earlier. Which leads to the second point:
There are no shortcuts. Libraries are great and should be used for development efficiencies but overly relying on them as a shortcut can result in compromises that just don't work.
Thanks for All the Support!
If it wasn't for the support from Assoc. Prof. Jun Jo from Griffith University's School of ICT and School of Engineering, we wouldn't have had the cars or the expertise to tap into. We'd also like to thank Griffith Enterprise for the space they made available to us, and profusely apologise for the amount of noise our cars made.
Most of all, I'd like to thank the awesome bunch of people who contributed to the project. Many thanks to
- Cailen Robertson
- Matthew Bourgeois
- Matthew Lee
- Hayden Norris
- Ryan Dennis
- Wilhelm Pinto
- Shane Cress
- Duncan Kirkland
Don't stop being awesome.