Part of a series of videos and blogs tracking the development of The Groundhog, which was entered into the MAAXX Europe 2017 competition earlier this year.
Having successfully tested the rewritten code to follow straight lines, using velocity vectors for control and NED-space mapping for line detection, we now test it around a 50 m track comprising 50 mm wide red webbing, and we speed it up a bit as well.
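Under the hood, velocity setpoints like these are streamed to the Pixhawk as MAVLink SET_POSITION_TARGET_LOCAL_NED messages. Below is a minimal sketch of such a sender using the MAVLink C headers; the helper name, the system/component IDs and the serial handle are illustrative assumptions on my part, not the Groundhog's actual code:

```cpp
#include <unistd.h>
#include <cstdint>
// MAVLink C headers; the include path varies with how they were generated.
#include <common/mavlink.h>

// Illustrative helper: send one velocity setpoint (m/s, NED frame) to the
// autopilot over an already-opened serial descriptor. Setpoints must be
// re-streamed at a few Hz or the flight controller falls back to hover.
void send_velocity_ned(int fd, uint32_t ms_since_boot,
                       float vn, float ve, float vd) {
    mavlink_message_t msg;
    uint8_t buf[MAVLINK_MAX_PACKET_LEN];

    mavlink_msg_set_position_target_local_ned_pack(
        1, 191, &msg,            // our system id / component id (companion PC)
        ms_since_boot,
        1, 1,                    // target system / component (the Pixhawk)
        MAV_FRAME_LOCAL_NED,
        0b0000111111000111,      // type_mask: use only the velocity fields
        0, 0, 0,                 // position: ignored
        vn, ve, vd,              // velocity: north, east, down
        0, 0, 0,                 // acceleration: ignored
        0, 0);                   // yaw, yaw rate: ignored
    write(fd, buf, mavlink_msg_to_send_buffer(buf, &msg));
}
```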
The test turned out to be quite successful, with following speeds of 1.5 m/s achieved under autonomous control provided by an on-board Raspberry Pi 3. This is significantly faster than the winning UAV at MAAXX Europe this year, which is quite pleasing!
The YouTube video shows both on-board and off-board camera footage, the former demonstrating the roaming regions of interest used by OpenCV to maintain a lock on the line under varying lighting conditions.
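For the curious, a roaming region of interest boils down to searching only a small window around the line's last known position and re-centring that window each frame. The OpenCV sketch below illustrates the idea; the function name, the HSV thresholds and the pixel-count cutoff are my own illustrative choices, not values from the Groundhog source:

```cpp
#include <algorithm>
#include <opencv2/opencv.hpp>

// Look for the red line only inside a small window (the ROI) centred on its
// last known position, then re-centre the window on the detected centroid so
// it "roams" with the line from frame to frame.
bool trackLine(const cv::Mat &frame, cv::Rect &roi) {
    cv::Mat hsv, mask;
    cv::cvtColor(frame(roi), hsv, cv::COLOR_BGR2HSV);
    // Red sits at the low end of the hue wheel (it also wraps around near
    // 180, ignored here for brevity).
    cv::inRange(hsv, cv::Scalar(0, 120, 70), cv::Scalar(10, 255, 255), mask);

    cv::Moments m = cv::moments(mask, true);
    if (m.m00 < 50)             // too few red pixels: report the lock as lost
        return false;

    int cx = roi.x + static_cast<int>(m.m10 / m.m00);  // centroid in frame coords
    roi.x = std::max(0, std::min(frame.cols - roi.width, cx - roi.width / 2));
    return true;
}
```

If the lock is lost, the caller can progressively widen the window, or fall back to a full-frame scan, until the line is reacquired.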
In this short blog series I’m outlining the hardware and software of The Groundhog, my entry into the recent MAAXX Europe autonomous drone competition held at the University of the West of England, Bristol.
My Masters project within the Bristol Robotics Laboratory is to design a system of UAVs that can be deployed in groups to co-operatively map the structure of their environment. This is envisaged as an internal environment; however, it is expected that the technologies developed may also be adapted for external mapping. This series of posts documents key elements of the project. So far we have set the objectives and built an airframe based on a standard 450 quadcopter configuration.
An on-board Raspberry Pi will have overall control of the UAV. This post shows how we can set up communications between the Raspberry Pi 2 and a Pixhawk flight management unit, using the MAVLink messaging protocol, so that the Raspberry Pi can take control of navigation.
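As a taste of what that involves, the sketch below opens the Pi's UART, announces itself to the Pixhawk with a HEARTBEAT, and prints the IDs of whatever messages come back. It uses the standard MAVLink C headers; the device path /dev/serial0, the 57600 baud rate and the ID numbers are typical-setup assumptions, not taken from the Groundhog code:

```cpp
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstdio>
// MAVLink C headers; the include path varies with how they were generated.
#include <common/mavlink.h>

int main() {
    // Open and configure the Pi's UART: raw 8N1, 57600 baud being the
    // usual Pixhawk telemetry rate.
    int fd = open("/dev/serial0", O_RDWR | O_NOCTTY);
    termios tio{};
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetispeed(&tio, B57600);
    cfsetospeed(&tio, B57600);
    tcsetattr(fd, TCSANOW, &tio);

    // Announce ourselves: system 1, component 191 (onboard computer).
    mavlink_message_t msg;
    uint8_t buf[MAVLINK_MAX_PACKET_LEN];
    mavlink_msg_heartbeat_pack(1, 191, &msg, MAV_TYPE_ONBOARD_CONTROLLER,
                               MAV_AUTOPILOT_INVALID, 0, 0, 0);
    write(fd, buf, mavlink_msg_to_send_buffer(buf, &msg));

    // Feed incoming bytes to the parser and report each decoded message.
    mavlink_status_t status;
    uint8_t byte;
    while (read(fd, &byte, 1) == 1) {
        if (mavlink_parse_char(MAVLINK_COMM_0, byte, &msg, &status))
            printf("msg id %u from system %u\n", msg.msgid, msg.sysid);
    }
    close(fd);
}
```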
This post shows the construction of the new airframe being used for development.
I wanted to do something to connect the MicroView to the real world, so this project begins a short series to display GPS data on the MicroView. All we will do here is connect a GPS unit and show some of the data it is sending. Next, we will interpret that data into GPS co-ordinates.
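As a sketch of this wiring-and-echo stage, something like the following would do, assuming the stock MicroView and SoftwareSerial libraries and a 9600 baud GPS module with its TX wired to MicroView pin 2 (all assumptions on my part):

```cpp
#include <MicroView.h>
#include <SoftwareSerial.h>

SoftwareSerial gps(2, 3);     // RX on pin 2, TX on pin 3 (TX unused here)

void setup() {
  uView.begin();
  uView.clear(PAGE);
  gps.begin(9600);            // common default rate for GPS modules
}

void loop() {
  // Echo raw NMEA characters to the screen so we can see what the unit sends.
  if (gps.available()) {
    char c = gps.read();
    uView.print(c);
    uView.display();
    if (c == '\n') {          // roughly one NMEA sentence per screenful
      uView.clear(PAGE);
      uView.setCursor(0, 0);
    }
  }
}
```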
The MicroView has a dial-type gauge that can be implemented in two styles, larger and smaller. This project is an adaptation of Project 003, which used a potentiometer to demonstrate the MicroView sliders.
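A minimal sketch along these lines, assuming the stock SparkFun MicroView library and the pot's wiper on A0 (both my assumptions), might be:

```cpp
#include <MicroView.h>

// Dial gauge centred on the 64x48 screen; WIDGETSTYLE1 is the larger style,
// WIDGETSTYLE0 the smaller. Range 0-1023 matches analogRead().
MicroViewGauge gauge(32, 24, 0, 1023, WIDGETSTYLE1);

void setup() {
  uView.begin();
  uView.clear(PAGE);
}

void loop() {
  gauge.setValue(analogRead(A0));  // pot wiper assumed on A0
  uView.display();
}
```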
It’s possible to draw multiple components of different kinds on the MicroView screen at once.
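For instance, the sketch below (again with my assumed A0 pot, and screen coordinates that are only illustrative) drives a small gauge and a slider from the same input:

```cpp
#include <MicroView.h>

// A small dial gauge and a slider sharing the screen, both fed by the pot.
MicroViewGauge  gauge(20, 16, 0, 1023, WIDGETSTYLE0);
MicroViewSlider slider(8, 38, 0, 1023);   // default WIDGETSTYLE0

void setup() {
  uView.begin();
  uView.clear(PAGE);
}

void loop() {
  int val = analogRead(A0);   // one input driving two widgets
  gauge.setValue(val);
  slider.setValue(val);
  uView.display();
}
```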
In this project, the three inputs to an RGB LED are controlled using pulse width modulation (PWM) signals. The three sliders are set up with simple labels ‘R’, ‘G’ or ‘B’ to the left. Each colour is raised to full brightness, then dimmed in sequence. The sliders show this graphically and display the current PWM value from 0-255 in real time.
Pins referenced in code as 3, 5 and 6 (physical pins 12 to 14) are the PWM-capable pins on the MicroView. Each is connected to one leg of the LED via a 330 ohm current-limiting resistor, with the common (cathode) leg of the LED connected to physical pin 8 (GND).
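Putting it together, a sketch of the kind described might look like the following; the widget coordinates and fade pacing are my own guesses rather than the project's actual code:

```cpp
#include <MicroView.h>

const uint8_t pins[3]  = {3, 5, 6};          // the MicroView's PWM pins
const char   labels[3] = {'R', 'G', 'B'};

// Three default-style sliders stacked down the 64x48 screen, range 0-255
// to match analogWrite().
MicroViewSlider sliders[3] = {
  MicroViewSlider(10, 2, 0, 255),
  MicroViewSlider(10, 18, 0, 255),
  MicroViewSlider(10, 34, 0, 255)
};

void setup() {
  uView.begin();
  uView.clear(PAGE);
  for (int i = 0; i < 3; i++) {              // channel label left of each slider
    uView.setCursor(0, 4 + i * 16);
    uView.print(labels[i]);
  }
}

// Ramp one channel between two PWM levels, mirroring it on its slider.
// The screen refresh in uView.display() paces the fade.
void fade(int ch, int from, int to) {
  int step = (to > from) ? 1 : -1;
  for (int v = from; v != to + step; v += step) {
    analogWrite(pins[ch], v);
    sliders[ch].setValue(v);
    uView.display();
  }
}

void loop() {
  for (int ch = 0; ch < 3; ch++) {           // full brightness, then dim, in turn
    fade(ch, 0, 255);
    fade(ch, 255, 0);
  }
}
```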