In this post we will build a ROS node on a companion computer to subscribe to data published by the flight control unit (FCU). This gives us access to the many data streams available from the flight controller as inputs to our system, so we can make decisions about how the UAV should be controlled.
This series of posts is for those who want to take more control of their UAV (or other robotic vehicle) using a companion computer connected to the flight control unit. Typically to get started, this might be a Raspberry Pi connected to a Pixhawk FCU.
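As a taste of where this is heading, here is a minimal sketch of such a subscriber, assuming MAVROS is bridging the FCU into ROS (the `/mavros/state` topic and `mavros_msgs/State` message are from a standard MAVROS install; this is an illustration, not the full code from the post):

```python
#!/usr/bin/env python
# Minimal sketch: subscribe to FCU state published by MAVROS on a companion computer.

def state_callback(msg):
    """Summarise a mavros_msgs/State message (flight mode and armed flag)."""
    return "mode=%s armed=%s" % (msg.mode, msg.armed)

if __name__ == "__main__":
    # ROS imports kept here so the callback above can be exercised without ROS installed
    import rospy
    from mavros_msgs.msg import State

    rospy.init_node("fcu_listener")
    rospy.Subscriber("/mavros/state", State,
                     lambda m: rospy.loginfo(state_callback(m)))
    rospy.spin()
```

The same pattern applies to any other MAVROS topic (IMU, battery, local position and so on): one subscriber per stream, with the decision logic living in the callbacks.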
22 students from France will be spending the next two weeks building and coding autonomous drones as part of the UWE Bristol Summer School. Along with Miles Isted s’Jacob, I am delighted to be leading on this activity and have produced a short sneak peek video of the challenge to share.
So six team drones racing autonomously on a single track? What’s not to like?
Code is based on that used for MAAXX Europe: Python with Dronekit, and ArduCopter on a Pixhawk. The final code will be posted on my GitHub at the end of the Summer School.
If you have heard of Robot Operating System and want to use it to monitor and control UAV flight, this post will get you started…
More specifically, this post details how to set up a Pixhawk flight controller running PX4 firmware, with a Raspberry Pi 3 companion computer running Robot Operating System. This combination gives flexible control over the flight control unit and the ability to integrate a very wide range of features, such as depth-sensing cameras and machine learning networks.
It’s been a while since my last post. My research has since moved towards the use of machine learning in UAVs, and so my trusty Groundhog now sports a Jetson TX2 instead of a Raspberry Pi, plus an Intel RealSense depth camera for ‘deep vision’ to match its deep learning capabilities. But I digress… so I’ll blog more on this another time…
This post is about the DroneJam coding masterclass for autonomous UAV ‘newbies’ I ran for this year’s MAAXX Europe autonomous drone competition, held in March at the University of the West of England.
The Groundhog is being developed to line-follow at low altitude and higher speeds. This video is of the field testing, taking the speed up to 1.75 m/s. It also explains why the Groundhog now sports sunglasses.
The track is an oval of 50 mm red webbing, with bends of radius approximately 3 m and straights of approximately 15 m. Testing took place in early morning with glancing sunlight on dew-soaked webbing – great for walking the dog but not so good for computer vision.
Part of a series of videos and blogs tracking the development of The Groundhog, which was entered into the MAAXX Europe 2017 competition earlier this year.
Having successfully tested the re-written code to follow straight lines using velocity vectors for control and NED space mapping for line detection, we test it around a 50m track comprising 50mm wide red webbing – and we speed it up a bit as well.
The test turned out to be quite successful, with following speeds of 1.5m/s achieved under autonomous control provided by an on-board Raspberry Pi 3. This is significantly faster than the winning UAV in MAAXX Europe this year, which is quite pleasing!
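For those wondering what velocity-vector control looks like in practice: in Dronekit it is typically done by sending SET_POSITION_TARGET_LOCAL_NED messages through the vehicle's message factory. A minimal sketch along those lines (the gain, speed and clamp values are illustrative, not the ones tuned for the Groundhog):

```python
def velocity_setpoint(offset_m, k_p=1.0, forward_speed=1.5, max_lateral=1.0):
    """Simple P-controller: constant forward speed, plus a clamped lateral
    correction proportional to the line's lateral offset (metres)."""
    vy = max(-max_lateral, min(max_lateral, k_p * offset_m))
    return forward_speed, vy

def send_ned_velocity(vehicle, vx, vy, vz=0.0):
    """Send one velocity setpoint (m/s, NED frame) to the FCU via Dronekit."""
    from pymavlink import mavutil  # deferred so the controller above runs without pymavlink
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0, 0, 0,                            # time_boot_ms, target system, component
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,
        0b0000111111000111,                 # type_mask: enable only the velocity terms
        0, 0, 0,                            # x, y, z position (ignored)
        vx, vy, vz,                         # velocity in m/s
        0, 0, 0,                            # accelerations (ignored)
        0, 0)                               # yaw, yaw_rate (ignored)
    vehicle.send_mavlink(msg)
```

The setpoint needs to be re-sent at a steady rate (a few Hz or more) or ArduCopter will stop acting on it.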
The YouTube video shows both on-board and off-board camera footage, the former demonstrating the roaming regions of interest used by OpenCV to maintain a lock under varying lighting conditions.
Several lessons were identified from the entry of The Groundhog hexacopter in the MAAXX Europe competition earlier this year.
Current developments are around correcting the issues so that we get a UAV successfully lapping the oval track at a minimum average speed of 1m/s.
A number of changes have been made to the approach previously blogged. Recall the platform is based on a combination of Pixhawk, Raspberry Pi 3, OpenCV and Dronekit.
The bird’s-eye-view image transformation in OpenCV was causing segmentation faults on the RPi, so the position and bearing of the detected line are now calculated using straight trigonometry instead.
Improvements have been made to the ranging ROI bands to further speed up the frame rate, now reported at 50 fps (faster than the PiCam is supplying frames).
The use of quaternions has been temporarily suspended in favour of control by velocity vectors.
As in MAAXX Europe, it makes sense to test on a straight line first. Initial testing was conducted outdoors using red seatbelt webbing for the line. It was not possible to fly below about 2 m, as the propwash blew the line away (will sort that next time!).
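The straight trigonometry mentioned above can be illustrated like this. The sketch assumes a downward-facing camera over flat ground; the field-of-view and altitude figures are placeholders, and the bearing comes from comparing the line's centre x in a near and a far ROI band:

```python
import math

def pixel_to_ground_offset(px_offset, img_width, hfov_deg, altitude_m):
    """Convert a horizontal pixel offset from the image centre into a lateral
    ground offset in metres (downward camera, flat-ground assumption)."""
    # angle subtended by the offset relative to the optical axis
    angle = (px_offset / (img_width / 2.0)) * math.radians(hfov_deg / 2.0)
    return altitude_m * math.tan(angle)

def line_bearing(x_near, x_far, band_separation_px, m_per_px):
    """Bearing of the line in radians (0 = straight ahead), from its centre x
    in a near band and a far band of the image."""
    dx_m = (x_far - x_near) * m_per_px
    dy_m = band_separation_px * m_per_px
    return math.atan2(dx_m, dy_m)
```

Compared with the full perspective warp, this is a handful of floating-point operations per frame, which is why it sits so much more comfortably on the RPi.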
In this last post of the series I shall give an overview of the main program, including the control algorithms for the Groundhog. Code is written in Python, using Dronekit and OpenCV, all running on a Raspberry Pi 3.
As we are flying indoors without GPS and also without optical flow, we are using quaternions to control the vehicle in the GUIDED_NOGPS flight mode of ArduCopter. To be honest, I’ve not come across anyone else doing this before, so it must be a good idea…
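For the curious, attitude control of this kind boils down to sending SET_ATTITUDE_TARGET messages carrying a quaternion. A sketch of how that might look with Dronekit's message factory (the type_mask and thrust handling are simplified assumptions, not the Groundhog's exact code):

```python
import math

def euler_to_quaternion(roll, pitch, yaw):
    """Convert roll/pitch/yaw (radians) to a [w, x, y, z] quaternion, the
    ordering used by the MAVLink SET_ATTITUDE_TARGET message."""
    cr, sr = math.cos(roll / 2.0), math.sin(roll / 2.0)
    cp, sp = math.cos(pitch / 2.0), math.sin(pitch / 2.0)
    cy, sy = math.cos(yaw / 2.0), math.sin(yaw / 2.0)
    return [cr * cp * cy + sr * sp * sy,
            sr * cp * cy - cr * sp * sy,
            cr * sp * cy + sr * cp * sy,
            cr * cp * sy - sr * sp * cy]

def send_attitude_target(vehicle, roll, pitch, yaw, thrust):
    """Pack and send SET_ATTITUDE_TARGET via Dronekit's message factory.
    thrust is 0..1, with 0.5 approximately hover in GUIDED_NOGPS."""
    q = euler_to_quaternion(roll, pitch, yaw)
    msg = vehicle.message_factory.set_attitude_target_encode(
        0, 1, 1,        # time_boot_ms, target system, target component
        0b00000111,     # type_mask: ignore body rates, use attitude + thrust
        q,              # attitude quaternion [w, x, y, z]
        0, 0, 0,        # body roll/pitch/yaw rates (ignored)
        thrust)
    vehicle.send_mavlink(msg)
```

As with velocity setpoints, the message must be re-sent continuously; without GPS there is no position hold to fall back on, so the control loop is doing all the work.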