Whilst I’m not into FPV, I use an Eachine LCD5800D monitor to check the view from the Raspberry Pi companion computer on The Groundhog. With its superimposed graphics, it gives a constant view of the status of the image lock on the target, and it also has a nifty built-in recorder.
Recently the link refused to work, and after first replacing the transmitter, I realised it was actually the receiver that had failed. I decided to upgrade the receiver, hopefully fitting a new unit within the existing case.
A good briefing on the internals can be had from Albert Kim’s excellent YouTube video at https://www.youtube.com/watch?v=P7A6fKBtbXM.
Several lessons were identified from The Groundhog hexacopter’s entry in the MAAXX Europe competition earlier this year. Current development is focused on correcting those issues so that the UAV can successfully lap the oval track at a minimum average speed of 1 m/s.
A number of changes in approach have been made from those previously blogged. Recall that the platform is based on a combination of Pixhawk, Raspberry Pi 3, OpenCV and Dronekit.
- The bird’s-eye-view image transformation in OpenCV was causing segmentation faults on the RPi. Instead, the position and bearing of the detected line are now calculated using straight trigonometry.
- Improvements have been made to the ranging ROI bands to further speed up the frame rate, now reported at 50 fps (faster than the PiCam is supplying frames).
- The use of quaternions has been temporarily suspended in favour of control by velocity vectors.
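To illustrate the trigonometric approach above, here is a minimal sketch (not the Groundhog’s actual code, and all names are my own): given the line’s centroid in two horizontal ROI bands, the lateral offset and bearing fall straight out of right-triangle geometry.

```python
import math

def line_offset_and_bearing(cx_near, cx_far, y_near, y_far, frame_width):
    """Estimate (offset_px, bearing_rad) of a detected line.

    cx_near/cx_far: centroid x of the line in the lower/upper ROI bands.
    y_near/y_far:   row positions of those bands (y_near > y_far, image
                    coordinates, so "near" is the bottom of the frame).
    """
    offset = cx_near - frame_width / 2   # lateral error from image centre
    dx = cx_far - cx_near                # how far the line leans between bands
    dy = y_near - y_far                  # vertical separation of the bands
    bearing = math.atan2(dx, dy)         # 0 rad = line pointing straight ahead
    return offset, bearing
```

The two band centroids would come from something like `cv2.moments()` on each thresholded ROI; the point is simply that two centroids and `atan2` replace the full perspective warp.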
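For the velocity-vector control mentioned above, the usual route is the MAVLink SET_POSITION_TARGET_LOCAL_NED message via Dronekit. A minimal sketch, assuming a `vehicle` from `dronekit.connect()`; the helper name is hypothetical, while the frame and type-mask values are from the MAVLink common message set.

```python
MAV_FRAME_BODY_OFFSET_NED = 9                  # velocities relative to current heading
TYPE_MASK_VELOCITY_ONLY = 0b0000111111000111   # ignore position/accel/yaw fields

def send_body_velocity(vehicle, vx, vy, vz):
    """Command a velocity vector (m/s): vx forward, vy right, vz down."""
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0, 0, 0,                   # time_boot_ms, target_system, target_component
        MAV_FRAME_BODY_OFFSET_NED,
        TYPE_MASK_VELOCITY_ONLY,
        0, 0, 0,                   # x, y, z position (ignored by the mask)
        vx, vy, vz,                # velocity components
        0, 0, 0,                   # accelerations (ignored)
        0, 0)                      # yaw, yaw_rate (ignored)
    vehicle.send_mavlink(msg)
```

ArduCopter treats such a setpoint as valid for a short window, so in practice it would be resent every control cycle with the latest line error folded into `vy`.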
As at MAAXX Europe, it makes sense to test on a straight line first. Initial testing was conducted outdoors using red seatbelt webbing for the line. It was not possible to fly below about 2 m as the prop wash blew the line away (I’ll sort that next time!).
Initial testing (links to YouTube video):
Video – Groundhog UAV initial testing on a straight line.
In this last post of the series I shall give an overview of the main program, including the control algorithms, for the Groundhog. The code is written in Python using Dronekit and OpenCV, all running on a Raspberry Pi 3.
As we are flying indoors without GPS and without optical flow, we are using quaternions to control the vehicle in ArduCopter’s GUIDED_NOGPS flight mode. To be honest, I’ve not come across anyone else doing this before, so it must be a good idea…
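For reference, attitude control in GUIDED_NOGPS goes through the MAVLink SET_ATTITUDE_TARGET message, which takes a quaternion. A hedged sketch follows, assuming a `vehicle` from `dronekit.connect()`; the helper names are my own, the Euler-to-quaternion conversion is the standard one, and a thrust of about 0.5 roughly holds altitude on ArduCopter.

```python
import math

def to_quaternion(roll, pitch, yaw):
    """Convert Euler angles (radians) to a [w, x, y, z] quaternion."""
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    return [cr * cp * cy + sr * sp * sy,
            sr * cp * cy - cr * sp * sy,
            cr * sp * cy + sr * cp * sy,
            cr * cp * sy - sr * sp * cy]

def send_attitude_target(vehicle, roll, pitch, yaw, thrust=0.5):
    """Command an attitude (radians) plus normalised thrust (0..1)."""
    msg = vehicle.message_factory.set_attitude_target_encode(
        0, 0, 0,                       # time_boot_ms, target system/component
        0b00000111,                    # type_mask: ignore body rates
        to_quaternion(roll, pitch, yaw),
        0, 0, 0,                       # body roll/pitch/yaw rates (ignored)
        thrust)
    vehicle.send_mavlink(msg)
```

As with velocity setpoints, this would be resent continuously, with the line-following controller mapping lateral error onto `roll` and heading error onto `yaw`.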