(Nb. All resources for this post can be found on GitHub at https://github.com/mikeisted/maaxxeurope2018)
It’s been a while since my last post. My research has since moved towards the use of machine learning in UAVs, so my trusty Groundhog now sports a Jetson TX2 instead of a Raspberry Pi, plus an Intel RealSense depth camera for ‘deep vision’ to match its deep learning capabilities. But I digress… I’ll blog more on this another time…
This post is about the DroneJam coding masterclass for autonomous UAV ‘newbies’ I ran for this year’s MAAXX Europe autonomous drone competition, held in March at the University of the West of England.
In common with last year, the competition was to build a drone to autonomously navigate around a circuit marked out by a red line. However, the event was altogether bigger this year, with three flight arenas, many activities organised by sponsors, FPV racing and even a full two-seat glider flight simulator. It was great to see this event continue to build on last year’s strengths and become more popular still. It has also attracted international attention, with one team coming all the way from Moscow (more of them later!).
The idea was to produce a competitive drone equipped with camera, optical flow, Pixhawk flight control unit and enough code to get drone-coding newbies off the ground, so to speak. The code was to be implemented in Python on a Raspberry Pi companion computer. So it wasn’t particularly expected that the students would compete in the main competition, more that they would learn from a real-world experience.
The drone was built minimally with the following components:
- S500 standard quad frame
- Pixhawk running Arducopter
- PX4flow for optical flow
- Standard (cheap) USB webcam
- Raspberry Pi 3 running Stretch, Dronekit
- Benewake TFMini rangefinder
- 3S or 4S lipo
- 5V voltage regulator to supply both the Pixhawk and Raspberry Pi.
The core configuration was based on the tried-and-tested combination of the Groundhog hexacopter (previously blogged), but reduced in size and cost. In particular, the PX4flow replaced the GPS (for indoor use), a £5 webcam replaced the Picam and the TFMini (£35) replaced the much more expensive TeraRanger One.
The main challenge was to convert the system for indoor use. The GPS was replaced with a PX4flow and the Arducopter parameters configured accordingly. Arducopter was also reconfigured to place priority on the rangefinder to control altitude as previous testing at the Bristol Robotics Laboratory flight arena indicated that using the barometer resulted in unacceptable altitude deviations when flying only a metre or two above the ground.
A parameters file for Arducopter is included in my GitHub repository at https://github.com/mikeisted/maaxxeurope2018. Alternatively, the key parameters can be configured by hand as follows.
To set up the Pixhawk, it is necessary to adjust parameters for the following circumstances:
1. Use of Lidar
2. Use of Optical Flow
3. No use of GPS
4. Prioritisation of lidar over barometer
Lidar Set Up (TFMini connected over serial)
RNGFND_TYPE = 8
RNGFND_MAX_CM = 700
RNGFND_MIN_CM = 30
SERIAL2_PROTOCOL = 9 (Lidar)
SERIAL2_BAUD = 115 (115200 baud)
RNGFND_GNDCLEAR = 13
Optical Flow and GPS
FLOW_ENABLE = 1
FLOW_ORIENT_YAW = 9000
Follow the Optical Flow instructions at: http://ardupilot.org/copter/docs/common-px4flow-overview.html
AHRS_GPS_USE = 0
EK2_GPS_TYPE = 3 (use optical flow, not GPS)
Other Serial Interfaces
SERIAL1_BAUD = 57 (57600 baud)
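For those who prefer to script the setup from the companion computer, the same parameters can be written over MAVLink using DroneKit’s `vehicle.parameters` mapping. This is a minimal sketch only: the connection string and the `apply_params` helper are my own illustration, not part of the competition code, so adapt them to your own wiring.

```python
# Key indoor-flight parameters from the section above.
INDOOR_PARAMS = {
    "RNGFND_TYPE": 8,        # Benewake TFMini rangefinder
    "RNGFND_MAX_CM": 700,
    "RNGFND_MIN_CM": 30,
    "SERIAL2_PROTOCOL": 9,   # Lidar
    "SERIAL2_BAUD": 115,     # 115200 baud
    "RNGFND_GNDCLEAR": 13,
    "FLOW_ENABLE": 1,
    "FLOW_ORIENT_YAW": 9000,
    "AHRS_GPS_USE": 0,
    "EK2_GPS_TYPE": 3,       # use optical flow, not GPS
    "SERIAL1_BAUD": 57,      # 57600 baud
}

def apply_params(param_store, params):
    """Write each parameter in turn; param_store may be
    vehicle.parameters or any dict-like object (handy for testing)."""
    for name, value in params.items():
        param_store[name] = value

if __name__ == "__main__":
    # DroneKit import kept here so the helper above works without it.
    from dronekit import connect
    # A typical Pixhawk-over-serial connection from a Raspberry Pi.
    vehicle = connect("/dev/ttyAMA0", baud=57600, wait_ready=True)
    apply_params(vehicle.parameters, INDOOR_PARAMS)
    vehicle.close()
```

Remember to reboot the Pixhawk after writing the parameters so they all take effect.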
All software is included in my GitHub repository at https://github.com/mikeisted/maaxxeurope2018.
There are three folders of code, each utilising a different line detection or control technique. The Activity documents are designed for students to work through each algorithm, developing a progressive understanding of each ‘level’ of coding.
The code itself is fully documented and with the Activity sheets there is no need to go over it again here. However, there are a couple of points I would make for those interested in MAAXX Europe line-following.
- The code that works is in the folder ‘Algorithm 3’. The other two algorithms are designed to help students learn progressively what does and doesn’t work.
- Some modifications were made to both the drone itself and the code in Algorithm 3. These are described in the next section, but are necessary!
Our team of newbies came second! We were the only team to successfully follow the red line, which we did for 19 laps. Here’s how we did it… very briefly!
Students arrived individually and in small groups, some local, some having travelled from across the UK. It was decided to create our own team (IOS) and try to compete – great!
We went through the basics of the drone, components, flight etc. and started flying using the first two line-following algorithms. These simply find a point on the line and yaw towards it. As expected, it was not a very effective technique, but it did give a platform for discussion of drone control, including the exciting (but often misunderstood) topic of PID control (what fun!!).
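The essence of those first two algorithms can be sketched in a few lines. This is not the repository code (which uses OpenCV); it is an illustrative NumPy-only version, with the function names and thresholds my own, showing the idea of “find a point on the red line, yaw towards it” with a single proportional gain:

```python
import numpy as np

def find_line_point(frame_rgb, red_thresh=120, dominance=1.5):
    """Return the (x, y) centroid of 'red enough' pixels, or None.

    A pixel counts as red when its R channel is above red_thresh and
    dominant over its G and B channels. (The real code used OpenCV;
    this NumPy version is purely for illustration.)
    """
    r = frame_rgb[:, :, 0].astype(float)
    g = frame_rgb[:, :, 1].astype(float)
    b = frame_rgb[:, :, 2].astype(float)
    mask = (r > red_thresh) & (r > dominance * g) & (r > dominance * b)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return xs.mean(), ys.mean()

def yaw_rate_towards(point_x, frame_width, kp=0.5):
    """Yaw proportionally to the horizontal offset of the
    detected line point from the image centre."""
    error = (point_x - frame_width / 2) / (frame_width / 2)  # -1 .. 1
    return kp * error  # positive -> yaw right
```

As the discussion with the students brought out, yawing at a single point is crude: it says nothing about where the line is going next, which is why the later algorithm does better.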
We progressed onto Algorithm 3, which was based on the (previously blogged) tests run with Groundhog last year.
Test runs showed the drone was oscillating side-to-side over the line. It was also yawing in the same way. The team decided to turn down the P constants in the respective control loops. We also added texture to the ground with squares of duct tape to help the optical flow unit gain ‘traction’. Unfortunately this had little effect and we ended Day 1 none the wiser. Mind you – pizza was on offer by way of consolation and a chance to rethink…
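“Turning down the P constants” is easy to see in a textbook PID loop: the proportional term scales the correction directly with the error, so a lower Kp gives gentler (but slower) responses. A minimal sketch of such a controller — gains here are illustrative, not the drone’s actual tuning:

```python
class PID:
    """A textbook PID controller; gains are illustrative only."""

    def __init__(self, kp, ki=0.0, kd=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """Return the control output for the current error sample."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Halving kp halves the correction applied for the same error --
# which damps overshoot, but makes the drone more sluggish.
```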
So it’s all about problem solving – someone suggested that ‘real progress’ was made by solving one problem, simply to have the opportunity to solve the next thing that doesn’t work! How true.
Overnight we had a chance to reflect with the benefit of some sleep. We knew that:
- The basic algorithm works (I showed the team the YouTube video of the Groundhog at https://www.youtube.com/watch?v=UGCw27Atork)
- The only thing that had changed was that we were now indoors, with no GPS.
- We were in danger of having P constants so low that the drone would just be completely unresponsive.
Eureka Moment 1…
We turned our attention to the optical flow unit used to replace GPS. This had worked successfully in tests outdoors on grass, and in the brightly lit flight arena of the Bristol Robotics Laboratory, but this arena was much darker and had a dark, finely textured carpet…
We hooked the px4flow unit up to Mission Planner and hey presto, this just had to be part of the problem:
- The image was soooo very dark.
- The focus was optimised for 2m+ (as in general setup instructions, of course) so was fuzzy.
- At a flight height of 1 m, the field of view on the ground was only 10 cm across!
So this meant that:
- The optical flow unit had little chance of working, as the images were near black.
- Even if it did, the squares of tape put down previously were probably too far apart.
The solution – refocus and much, much more light to illuminate the grain of the carpet itself.
We quickly rigged up some high-brightness LEDs (although one melted its own solder joints), but this was not enough. Then of course, in true Wallace and Gromit style, we deployed an LED bicycle torch, set to maximum and fixed to point down with some Dual Lock. Success!
The drone started to respond much more decisively; however, it was still oscillating from side to side (along the y axis), to the point that it lost the line.
Eureka Moment 2…
In order to keep the drone as simple as possible, and as a late decision, the camera gimbal used the previous year had been removed and the camera set at a fixed pitch of -45 degrees. To compensate, the algorithm detected the roll of the drone and rotated the image. I can just hear you now… “What about transforms?”, “Didn’t you at least calculate the homography matrix?”… So as a developer I have no excuse! But of course this was an educational exercise, and it was one of the students who not only noticed this, but also realised it was giving rise to a positive feedback loop, thus increasing the oscillations! So how successful was that?! Hats off, as they say.
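For the curious, that software compensation amounts to rotating the image (or equivalently each detected point) about the image centre by the measured roll angle. A sketch of the point version — the function name and signature are mine, not the repository’s. With a fixed camera this is only an approximation of the true perspective transform, and since every roll measurement shifts the ‘corrected’ line position, the controller ends up chasing its own corrections:

```python
import math

def correct_for_roll(x, y, roll_rad, cx, cy):
    """Rotate a detected image point about the image centre (cx, cy)
    by the measured roll angle -- the gimbal-less compensation step.
    (Illustrative only; the real code rotated the whole image.)"""
    dx, dy = x - cx, y - cy
    cos_r, sin_r = math.cos(roll_rad), math.sin(roll_rad)
    return (cx + dx * cos_r - dy * sin_r,
            cy + dx * sin_r + dy * cos_r)
```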
Fortunately we were able to fit a camera gimbal (more Dual Lock) and the change was immediate. We were able to complete 16 laps of the track, putting us in 2nd position, only behind the Russian team.
It did indeed turn out to be all about the illumination for the optical flow unit. As the natural daylight dwindled and our torch battery became depleted we were only able to add a few more laps, and were not able to catch the team from Moscow.
Nevertheless, in just 2 days our team of students who were new to this had faced many unpredictable problems and had responded with tenacity and skill to overcome them. They had gone well beyond tweaking a few parameters in the code to dissecting most aspects of it and highlighting key issues. Their 2nd place was well-earned.
And how did the team from Russia do it? With some lateral thinking, they surrounded the track with block images (ArUco markers, similar to QR codes) which served to map the circuit. Their drone traversed the circuit very successfully by ‘knowing’ the pre-programmed marker sequence. So it was nothing at all to do with following the red line, but an excellent solution nevertheless and a well-deserved win.
MAAXX Europe 2019?
Next year, perhaps we will run another Masterclass – but maybe using cameras for localisation and machine learning for navigation. Whatever we do, it is clear that the MAAXX Europe Autonomous Drone Competition is not only great fun, but is becoming instrumental in building the ‘real world’ skills of our UAV developers of the future.
A huge thanks to Dr Steve Wright and Miles Isted s’Jacob of The University of the West of England for organising another great event.