Groundhog UAV curved line following

Part of a series of videos and blogs tracking the development of The Groundhog, which was entered into the MAAXX Europe 2017 competition earlier this year.
Having successfully tested the rewritten code for following straight lines, using velocity vectors for control and NED-space mapping for line detection, we now test it around a 50m track marked out with 50mm-wide red webbing – and we speed it up a bit as well.

The test turned out to be quite successful, with following speeds of 1.5m/s achieved under autonomous control provided by an on-board Raspberry Pi 3.  This is significantly faster than the winning UAV in MAAXX Europe this year, which is quite pleasing!

The YouTube video shows both on-board and off-board camera footage, the former demonstrating the roaming regions of interest used by OpenCV to maintain a lock under varying lighting conditions.
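For anyone curious what a roaming region of interest looks like in practice, here is a minimal C++/OpenCV sketch of the idea – an illustration rather than the Groundhog's actual code. It thresholds the red webbing in HSV inside the current ROI, re-centres the ROI on the detected centroid, and roams back out to the full frame when the lock is lost. Camera index, colour thresholds and ROI size are all assumptions.

#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture cap(0);                         // down-facing camera (assumed index 0)
    if (!cap.isOpened()) return 1;

    cv::Mat frame;
    if (!cap.read(frame)) return 1;
    cv::Rect full(0, 0, frame.cols, frame.rows);
    cv::Rect roi = full;                             // start by searching the whole image

    while (cap.read(frame)) {
        cv::Mat hsv, lowRed, highRed, mask;
        cv::cvtColor(frame(roi), hsv, cv::COLOR_BGR2HSV);

        // Red wraps around the hue axis, so threshold two bands and combine them.
        cv::inRange(hsv, cv::Scalar(0, 100, 80),   cv::Scalar(10, 255, 255),  lowRed);
        cv::inRange(hsv, cv::Scalar(170, 100, 80), cv::Scalar(180, 255, 255), highRed);
        mask = lowRed | highRed;

        cv::Moments m = cv::moments(mask, true);
        if (m.m00 > 50) {                            // enough red pixels: we have a lock
            cv::Point c(roi.x + int(m.m10 / m.m00),  // centroid in full-frame coordinates
                        roi.y + int(m.m01 / m.m00));
            roi = cv::Rect(c.x - 80, c.y - 60, 160, 120) & full;
            // c would now be mapped into NED space and fed to the velocity-vector controller.
        } else {
            roi = full;                              // lock lost: roam back to the full frame
        }
    }
    return 0;
}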

Continue reading “Groundhog UAV curved line following”

Post 4. MAAXX Europe. Connecting the Pi 3 to the Pixhawk

In this short blog series I’m outlining the hardware and software of The Groundhog, my entry into the recent MAAXX Europe autonomous drone competition held at the University of the West of England, Bristol.

Connecting the Raspberry Pi 3 to the Pixhawk took quite some working out, so I am hoping that publishing my own step-by-step checklist may save others a little time. Continue reading “Post 4. MAAXX Europe. Connecting the Pi 3 to the Pixhawk”
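The full checklist is behind the link above. As a quick flavour of the end result, the hedged C++ sketch below just listens for the Pixhawk’s heartbeat over the Pi’s UART using the generated MAVLink C headers. The port name, baud rate and include path are assumptions to adjust for your own wiring – this is not the post’s own code.

// Minimal heartbeat listener - an illustration, not the checklist itself.
// Assumes: Pixhawk TELEM2 wired to the Pi's UART at 57600 baud, the serial
// console disabled, and the generated MAVLink C headers (e.g. the
// mavlink/c_library_v2 repo) on the include path.
#include <common/mavlink.h>
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstdio>

int main() {
    // On a Pi 3 the GPIO UART is usually /dev/serial0 (the PL011 is given to Bluetooth).
    int fd = open("/dev/serial0", O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    termios tio{};
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetispeed(&tio, B57600);
    cfsetospeed(&tio, B57600);
    tcsetattr(fd, TCSANOW, &tio);

    mavlink_message_t msg;
    mavlink_status_t status;
    uint8_t byte;

    while (read(fd, &byte, 1) == 1) {
        if (mavlink_parse_char(MAVLINK_COMM_0, byte, &msg, &status) &&
            msg.msgid == MAVLINK_MSG_ID_HEARTBEAT) {
            mavlink_heartbeat_t hb;
            mavlink_msg_heartbeat_decode(&msg, &hb);
            printf("Heartbeat from system %u (type %u)\n",
                   (unsigned)msg.sysid, (unsigned)hb.type);
        }
    }
    close(fd);
    return 0;
}

Seeing a steady stream of heartbeats is a good sign that the wiring, baud rate and port configuration are all correct before moving on to anything more ambitious.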

03. MRes in UAV Co-operative Mapping. Getting Control of the Flight Management Computer

Connecting the Pixhawk FMU to a Raspberry Pi 2

Project Recap:

My Masters project within the Bristol Robotics Laboratory is to design a system of UAVs that can be deployed in groups to co-operatively map the structure of their environment.  This is envisaged as an internal environment, although it is expected that the technologies developed may also be adapted for external mapping.  This series of posts documents key elements of the project.  So far we have set the objectives and built an airframe based on a standard 450 quadcopter configuration.

Post Objective:

An on-board Raspberry Pi will have overall control of the UAV.  This post shows how we can set up communications between the Raspberry Pi 2 and a Pixhawk flight management unit using the MAVLink messaging protocol, so that the Raspberry Pi can take control of navigation.
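To make the “take control of navigation” part concrete, here is a hedged sketch (not the code from the post) of how a companion computer can stream velocity set-points to the Pixhawk with the MAVLink C headers, using the SET_POSITION_TARGET_LOCAL_NED message. The system and component IDs, and the assumption that the serial port is already open and configured, are mine.

// Illustrative only: pack a velocity set-point and write it to a serial port
// that is already open and configured. IDs, frame and mask values are
// assumptions - check them against your own setup and flight mode (GUIDED).
#include <common/mavlink.h>
#include <unistd.h>

// vx, vy, vz are metres per second in the NED frame (x north, y east, z down).
void send_velocity(int fd, float vx, float vy, float vz) {
    mavlink_message_t msg;
    uint8_t buf[MAVLINK_MAX_PACKET_LEN];

    mavlink_msg_set_position_target_local_ned_pack(
        255, 190, &msg,            // our (companion computer) system/component ID
        0,                         // time_boot_ms
        1, 1,                      // target system/component - Pixhawk defaults
        MAV_FRAME_LOCAL_NED,
        0b0000111111000111,        // type_mask: use only the velocity fields
        0, 0, 0,                   // position (ignored)
        vx, vy, vz,                // velocity command
        0, 0, 0,                   // acceleration (ignored)
        0, 0);                     // yaw, yaw rate (ignored)

    uint16_t len = mavlink_msg_to_send_buffer(buf, &msg);
    write(fd, buf, len);
}

Set-points like this generally need to be re-sent at a regular rate; ArduPilot stops acting on a velocity command that is not refreshed.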

Continue reading “03. MRes in UAV Co-operative Mapping. Getting Control of the Flight Management Computer”

02. MRes in UAV Co-operative Mapping. Airframe Construction.

Two on-board computers and plenty of space for sensors. Oh yes, it flies rather well too!

Project Recap:

My Masters project within the Bristol Robotics Laboratory is to design a system of UAVs that can be deployed in groups to co-operatively map the structure of their environment.  This is envisaged as an internal environment, although it is expected that the technologies developed may also be adapted for external mapping.  This series of posts documents key elements of the project.

Post Objective:

This post shows the construction of the new airframe being used for development.

Continue reading “02. MRes in UAV Co-operative Mapping. Airframe Construction.”

01. MRes in UAV Co-operative Mapping. Objectives.

Initial research platform. Pixhawk and Raspberry Pi.

Research

A major part of my Masters by Research at the Bristol Robotics Laboratory is the research project itself.  I am developing unmanned aerial vehicles (UAVs) with the following capabilities:

  1. Able to fly autonomously in a confined space and to map that space.
  2. Able to join with others in a swarm to map the space about them.
  3. Able to identify and locate others in the swarm by:
    1. recognising broadcast ID signals;
    2. using other means to recognise and identify other vehicles (e.g. image recognition).
  4. Eventually based on fixed-wing, rather than multi-rotor, airframes.

Continue reading “01. MRes in UAV Co-operative Mapping. Objectives.”

Electric Vehicle/Robot Sound Synthesiser

EV Synthesiser arrangement in project box.

Introduction

A potential issue for electric vehicles, and robotics in general, is that they move relatively silently. This can pose safety issues when the vehicle or robot is in close proximity to people.

As an aside to my main UAV research project, the synthesiser explores how sounds can be created that relate to the movement of an electric vehicle or robot.
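As a toy illustration of the idea only – not the synthesiser’s actual design – the Arduino-style C++ sketch below simply maps a “speed” voltage on an analogue pin to the pitch of a tone, so the faster the vehicle, the higher the note. The pin choices and ranges are assumptions.

// Toy sketch: pitch follows speed. Assumed wiring: a 0-5V speed signal on A0
// and a small speaker or piezo (via a suitable driver) on pin 9.
const int speedPin = A0;
const int audioPin = 9;

void setup() {
  pinMode(audioPin, OUTPUT);
}

void loop() {
  int speed = analogRead(speedPin);             // 0-1023
  if (speed < 20) {
    noTone(audioPin);                           // effectively stationary: stay quiet
  } else {
    int pitch = map(speed, 0, 1023, 120, 1200); // Hz - rises with speed
    tone(audioPin, pitch);
  }
  delay(20);
}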

Continue reading “Electric Vehicle/Robot Sound Synthesiser”

MicroView GPS – displaying data

 


Introduction

This builds on Project 006a, in which we simply connected a GPS unit to our MicroView and checked we were getting some data.  Now to make some sense of that data and display it!

This is a bit of a step up in terms of code, which is why I have left an intermediate Project 006b available in case I need to retrace my steps.  I’m sure people will let me know…
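For a sense of what the finished sketch does (the post’s own code is behind the link), here is a minimal Arduino C++ outline: feed the raw NMEA characters from the GPS into a parser and print the fix on the MicroView. I have assumed TinyGPS++ and SoftwareSerial here, and the wiring pins are illustrative.

// Minimal sketch of the idea, not the project's exact code. Assumes a GPS
// module on a software serial port at 9600 baud and the TinyGPS++ library.
#include <MicroView.h>
#include <SoftwareSerial.h>
#include <TinyGPS++.h>

SoftwareSerial gpsSerial(2, 3);   // RX, TX - assumed wiring
TinyGPSPlus gps;

void setup() {
  gpsSerial.begin(9600);          // common default for hobby GPS modules
  uView.begin();
  uView.clear(PAGE);
}

void loop() {
  while (gpsSerial.available())
    gps.encode(gpsSerial.read()); // feed raw NMEA characters to the parser

  if (gps.location.isUpdated()) { // a fresh fix: redraw the screen
    uView.clear(PAGE);
    uView.setCursor(0, 0);  uView.print("Sats: "); uView.print(gps.satellites.value());
    uView.setCursor(0, 10); uView.print(gps.location.lat(), 5);
    uView.setCursor(0, 20); uView.print(gps.location.lng(), 5);
    uView.display();
  }
}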

Continue reading “MicroView GPS – displaying data”

MicroView sliders with text labels using RGB LED

Introduction

It’s possible to draw multiple components of different kinds on the MicroView screen at once.

In this blog, the three inputs to an RGB LED are controlled using pulse width modulation (PWM) signals.  The three sliders are set up with simple labels ‘R’, ‘G’ or ‘B’ to the left.  Each colour is raised to full brightness, then dimmed in sequence.  The sliders show this graphically and display the current PWM value from 0-255 in real time.

Circuit

Pins referenced in code as 3, 5 and 6 (physical pins 12 to 14) are capable of PWM on the MicroView.  These are connected to the LED via 330 ohm resistors in the usual way, with the common pin of the LED connected to pin 8 (GND).
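Below is a hedged outline of how this might look in code. The pin numbers follow the circuit above, but the slider positions, timing and overall structure are my own assumptions rather than the post’s exact sketch.

// Three labelled MicroView sliders mirroring the PWM values driven to an RGB LED.
#include <MicroView.h>

const uint8_t ledPin[3] = {3, 5, 6};       // R, G and B legs - the PWM-capable pins
const char    label[3]  = {'R', 'G', 'B'};
MicroViewSlider *slider[3];

void setup() {
  uView.begin();
  uView.clear(PAGE);
  for (int i = 0; i < 3; i++) {
    pinMode(ledPin[i], OUTPUT);
    uView.setCursor(0, i * 16);            // 'R', 'G' or 'B' label to the left
    uView.print(label[i]);
    slider[i] = new MicroViewSlider(8, i * 16, 0, 255);  // x, y, min, max
  }
  uView.display();
}

void setChannel(int i, int value) {
  analogWrite(ledPin[i], value);           // drive the LED
  slider[i]->setValue(value);              // mirror the PWM value on screen
  uView.display();
  delay(2);                                // slow the fade enough to see it
}

void loop() {
  for (int i = 0; i < 3; i++) {                         // each colour in turn...
    for (int v = 0; v <= 255; v++) setChannel(i, v);    // ...raised to full brightness
    for (int v = 255; v >= 0; v--) setChannel(i, v);    // ...then dimmed again
  }
}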


Continue reading “MicroView sliders with text labels using RGB LED”