Part of a series of videos and blogs tracking the development of The Groundhog, which was entered into the MAAXX Europe 2017 competition earlier this year.
Having successfully tested the re-written code to follow straight lines, using velocity vectors for control and NED-space mapping for line detection, we test it around a 50m track comprising 50mm-wide red webbing – and we speed it up a bit as well.
The test turned out to be quite successful, with following speeds of 1.5m/s achieved under autonomous control provided by an on-board Raspberry Pi 3. This is significantly faster than the winning UAV in MAAXX Europe this year, which is quite pleasing!
The YouTube video shows both on-board and off-board camera footage, the former demonstrating the roaming regions of interest used by OpenCV to maintain a lock under varying lighting conditions.
Whilst the test was successful, there remain significant barriers to the higher-speed, lower-level flight that I want to achieve.
Currently, altitude sensing is by barometer only, the accuracy of which is clearly limiting at low altitudes. I’m going to introduce a sonar in the first instance to give programmatic altitude control, adjustable in-flight by a pot on the transmitter.
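The intended sonar loop is simple enough to sketch. The target-altitude range, gain and rate limit below are assumptions for illustration, not tested values: the transmitter pot arrives as a PWM channel (1000–2000µs), is mapped to a target height, and a proportional term commands a clamped climb rate:

```python
def climb_rate_cmd(sonar_alt_m, pot_pwm_us, kp=1.2, max_rate=0.5):
    """Map a transmitter pot (1000-2000 us PWM) onto an illustrative
    0.2-1.2 m target altitude, then command a proportional climb rate
    toward it, clamped to +/- max_rate m/s."""
    pwm = min(max(pot_pwm_us, 1000), 2000)           # guard against glitches
    target = 0.2 + (pwm - 1000) / 1000.0             # metres above the floor
    rate = kp * (target - sonar_alt_m)               # P-only control
    return max(-max_rate, min(max_rate, rate))       # clamp, m/s
```

The clamped output would then go out as the vertical component of the velocity vector already used for line following.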
Surface reflection ‘white-out’ of the line. This considerably reduces the range at which the line can be reliably spotted. At higher speeds a forward-looking camera is essential, but that also means a higher angle of incident light, making the line prone to surface reflections which cause it to appear white. There are a number of physical and programmatic changes that can be made to account for this.
Manoeuvrability. At 3 kg, the Groundhog only turns so fast. If the line curves faster than it can turn, the lock is lost. Again, there are several solutions here, but remember the Groundhog is not just following the line, it is plotting it out in NED space. This means it should be able to learn where the line should be. I think that’s enough of a clue…
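To give a flavour of what plotting the line in NED space makes possible (the function below is purely illustrative, not the Groundhog's code): from recent fixes you can extrapolate a bearing along which the line should continue, and steer the search there when the camera loses it.

```python
import math

def predict_bearing(track, step=2):
    """Extrapolate the line's bearing in degrees from recent fixes plotted
    in NED space as (north, east) pairs, so the search can be steered
    toward where the line should reappear."""
    (n0, e0), (n1, e1) = track[-step - 1], track[-1]
    return math.degrees(math.atan2(e1 - e0, n1 - n0)) % 360.0
```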
In this short blog series I’m outlining the hardware and software of The Groundhog, my entry into the recent MAAXX-Europe autonomous drone competition held at the University of the West of England, Bristol.
In this post I shall give an overview of the approach taken by the image-recognition system used to track the line being followed around the track. Remember, the line is red, about 50mm across, and forms an oval track 20m by 6m. We are attempting to race around as fast as we can, avoiding other UAVs if necessary.
The Groundhog was at least twice as big and probably three times as heavy as many other competitors. Why? Because it is built for endurance (flight time 35 mins+) and also because it’s what I have as my development platform. It normally flies outdoors, of course…
Ah, so that means no GPS, and flying less than 30cm from the ground also rules out an optical-flow camera (they can’t focus that close). So how to control this thing?
I recently entered the first MAAXX-Europe competition to be held at the University of the West of England (UWE). On the surface it’s a simple challenge in which UAVs must follow an oval circuit marked by a single red line, around 20m by 6m. However, this proved to be anything but simple, and with few rules about how many UAVs could be on the circuit at the same time… you get the idea!
So, before you read further: I didn’t win (I came 4th of 6, which I am pleased with as a hobbyist up against university teams). However, my hexacopter did complete a lap, and I now know exactly what worked really well and what didn’t! It’s that knowledge I wish to share, as part of the payback for all the open-source community help I have had in the last year.