I really like the idea of two or more of these cars chasing each other around a track, so that’s now the first objective. And I think this also suggests a name for these vehicles – Serial Pursuit Vehicles, or SPV1 and SPV2. I’ll need to decide on the colour at some stage, too ;-).
I have not raced RC cars since I was 15, so I’m no expert on modern RC cars. In those days everything was simpler; we used to charge our NiMH cells from a car battery through a single, fat 1 Ohm resistor. And speed controllers got so hot they would do much more than fry an egg if you got near them (that’s linear regulators for you). But I am quite impressed with the Quanum Vandal kit from what I can see so far. It certainly looks sturdy enough for my planned use on smooth-ish concrete, and it has a decent amount of adjustment in the steering and suspension.
For the build, there are unused holes in the chassis tray that can be used to mount a platform for more components. It also looks as though the driveshaft could be used for an optical odometer when the time comes.
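If a slotted disc were fitted to the driveshaft with an optical sensor reading it, odometry would reduce to counting pulses. A minimal sketch of the conversion, where every constant is a placeholder guess rather than a measured value from the Vandal:

```python
import math

PULSES_PER_REV = 8        # slots on a hypothetical optical encoder disc
GEAR_RATIO = 2.6          # driveshaft revs per wheel rev (assumed)
WHEEL_DIAMETER_M = 0.085  # assumed wheel diameter in metres

def pulses_to_distance(pulse_count):
    """Distance travelled (m) for a given number of encoder pulses."""
    wheel_revs = pulse_count / (PULSES_PER_REV * GEAR_RATIO)
    return wheel_revs * math.pi * WHEEL_DIAMETER_M
```

The same counts divided by elapsed time give a speed estimate, which is what the controller will actually want.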
The battery area can also be used for other components as we will mount the batteries themselves elsewhere.
On that note, we will be adding a certain amount of weight, so it seems sensible to adjust the mounting points on the suspension to the stiffest position, front and back.
One potential issue is the orientation of the ESC, which has the signal wiring coming out immediately adjacent to the drive shaft. This might create a wear risk over time, so I have simply reversed the ESC. The motor wires now come out by the drive shaft, but they exit vertically and so there is no chance of them interfering.
Lastly I am removing the small bash plate at the front. This takes up space that will be required for other components and is not needed anyway, as the car will not be crashing ;-). It also frees up two more screw holes at the front that might prove useful for mounting more stuff.
Apart from that, we are done with the basic chassis adjustments. In the next post we will add extra capacity for components with some flat plates.
This is something I have wanted to do for some while for fun. Note that it has nothing to do with my work at Manna, so don’t make any assumptions about the technologies being used! I also don’t get much spare time, so updates are likely to be irregular.
Objectives

This is experimenting for fun, so may go in a number of different directions. However, I do want to see how close I can get to a car that accurately stays over a line, steered using Ackermann steering rather than a differential drive. This sounds simple, but most examples I see seem to be too ‘approximate’, too slow or not very smooth.
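For the record, ‘accurately staying over a line’ with Ackermann geometry usually comes down to computing a front-wheel steering angle from the cross-track and heading errors. One common formulation is a Stanley-style law; the gains and limits below are illustrative, and not necessarily what I’ll end up using:

```python
import math

def steering_angle(cross_track_m, heading_err_rad, k=2.0, speed_mps=1.0):
    """Stanley-style steering law: heading error plus a speed-scaled
    cross-track term, clipped to a plausible servo range.
    Gains and the +/-0.5 rad limit are illustrative, not tuned values."""
    delta = heading_err_rad + math.atan2(k * cross_track_m, speed_mps)
    return max(-0.5, min(0.5, delta))
```

The cross-track term dominates at low speed and washes out at high speed, which is one reason this kind of law tends to look smoother than naive proportional steering.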
Software / Hardware Approach

I’ll predominantly be using Python and open source libraries such as OpenCV and ROS. Probably ROS2, in fact. Where we need speed, we’ll use C++.
The usual suspects will be present, such as the Raspberry Pi and Jetson. I’m also interested in the OpenMV unit, stereo and depth vision, RFID, mesh networking and others as things progress. For example, maybe having two cars maintaining distancing on the same circuit.
Chassis Selection

There are a vast number of notable car projects based on different chassis, with different levels of aspiration from simple line following through to full autonomy. Many examples of the great Donkey Car project are based on the Exceed 1/16 scale truck. At the other end of the spectrum is the MIT RaceCar project and derivations of that such as Racecar/J. This also has some great support material available from others, such as Jim at JetsonHacks.
I want something a little larger than 1/16th, so have picked a 1/10 scale car from Hobbyking. The Quanum Vandal 1/10 buggy is particularly affordable and comes as a kit or assembled with ESC and motor. I picked the latter for simplicity. With a 2S battery pack, it should hopefully not be too fast for my purposes.
From the images on the website, it looks as though it should be simple to build on a couple of platforms for components – there are some unused holes in the baseplate. I’ll probably also want to set up odometry of some sort, and there look to be some options for that, too, maybe from the driveshaft, wheels or electronically.
In the next post, I’ll go over prepping the kit before making the first modifications.
I’m rebuilding The Groundhog to a more professional level, with the level of accuracy required for the AI and computer vision work planned. It’s also getting an upgrade to the avionics to make it more resilient. This post details the rebuild and also has links to the 3D printed parts used.
In this post we will build a ROS node on a companion computer to subscribe to data being published by the flight control unit (FCU). This will allow us to use the many data streams available from the flight controller as inputs to our system and then be able to make decisions over how the UAV should be controlled.
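As a flavour of what’s to come, here is a minimal sketch of such a subscriber, assuming ROS1 with MAVROS bridging the FCU’s MAVLink telemetry onto ROS topics; the node name is illustrative, and the ROS wiring is kept out of the import path so the maths is testable on its own:

```python
import math

def quaternion_to_yaw(x, y, z, w):
    """Yaw (rad) from the orientation quaternion in a
    geometry_msgs/PoseStamped message."""
    return math.atan2(2.0 * (w * z + x * y),
                      1.0 - 2.0 * (y * y + z * z))

def pose_callback(msg):
    # Called by ROS each time MAVROS republishes the FCU's local position.
    yaw = quaternion_to_yaw(msg.pose.orientation.x, msg.pose.orientation.y,
                            msg.pose.orientation.z, msg.pose.orientation.w)
    print("x=%.2f y=%.2f yaw=%.2f" % (msg.pose.position.x,
                                      msg.pose.position.y, yaw))

if __name__ == "__main__":
    import rospy
    from geometry_msgs.msg import PoseStamped
    rospy.init_node("fcu_listener")
    rospy.Subscriber("/mavros/local_position/pose", PoseStamped, pose_callback)
    rospy.spin()
```

The same pattern applies to any of the other MAVROS topics (state, battery, IMU and so on): subscribe, unpack the message, decide.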
This series of posts is for those who want to take more control of their UAV (or other robotic vehicle) using a companion computer connected to the flight control unit. Typically to get started, this might be a Raspberry Pi connected to a Pixhawk FCU.
Part of a series of videos and blogs tracking the development of The Groundhog, which was entered into the MAAXX Europe 2017 competition earlier this year.
Having successfully tested the re-written code to follow straight lines using velocity vectors for control and NED space mapping for line detection, we test it around a 50m track comprising 50mm wide red webbing – and we speed it up a bit as well.
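For those following the code, the velocity-vector approach boils down to expressing the commanded motion as a NED (North-East-Down) vector. A minimal sketch of that conversion, with the frame convention assumed (0 rad = North, clockwise positive); the real node sends the result to the FCU as a setpoint rather than returning it:

```python
import math

def ned_velocity(speed_mps, heading_rad):
    """Desired forward speed along a heading, expressed as a NED
    velocity vector (north, east, down). Down is zero to hold altitude."""
    return (speed_mps * math.cos(heading_rad),
            speed_mps * math.sin(heading_rad),
            0.0)
```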
The test turned out to be quite successful, with following speeds of 1.5m/s achieved under autonomous control provided by an on-board Raspberry Pi 3. This is significantly faster than the winning UAV in MAAXX Europe this year, which is quite pleasing!
The YouTube video shows both on-board and off-board camera footage, the former demonstrating the roaming regions of interest used by OpenCV to maintain a lock under varying lighting conditions.
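The roaming region-of-interest idea is simple: only search a window around the last known line position, and re-centre the window on every hit, so lighting changes elsewhere in the frame cannot steal the lock. A sketch of that logic in numpy (the real pipeline uses OpenCV colour thresholding to produce the binary mask; the window size and names here are illustrative):

```python
import numpy as np

def track_line(mask, roi_centre, roi_half_width=40):
    """Find the line's column within a roaming region of interest.
    `mask` is a 2-D boolean array, True where a pixel was classed as
    'red'. Returns (detected column or None, new ROI centre)."""
    lo = max(0, roi_centre - roi_half_width)
    hi = min(mask.shape[1], roi_centre + roi_half_width)
    cols = np.flatnonzero(mask[:, lo:hi].any(axis=0))
    if cols.size == 0:
        return None, roi_centre   # lost: keep (or widen) the last ROI
    centre = lo + int(round(cols.mean()))
    return centre, centre         # re-centre the ROI on the detection
```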
In this short blog series I’m outlining the hardware and software of The Groundhog, my entry into the recent MAAXX-Europe autonomous drone competition held at the University of the West of England, Bristol.
In this post I shall overview the approach taken to the image recognition system used to track the line being followed around the track. Remember the line is red, about 50mm across and forms an oval track 20m by 6m. We are attempting to race around as fast as we can, avoiding other UAVs if necessary.
The Groundhog was at least twice as big and probably three times as heavy as many other competitors. Why? Because it is built for endurance (flight time 35mins+) and also because it’s what I have as my development platform. It normally flies outdoors of course…
Ah, so that means no GPS, and flying less than 30cm from the ground also rules out an optical flow camera (they can’t focus that close). So how to control this thing?
I recently entered the first MAAXX-Europe competition to be held at the University of the West of England (UWE). On the surface it’s a simple challenge in which UAVs must follow an oval circuit marked by a single red line, around 20m by 6m. However, this proved to be anything but simple, and with few rules about how many could be on the circuit at the same time… you get the idea!
So before you read further, I didn’t win (I came 4th of 6, which I am pleased with as a hobbyist up against university teams). However, my hexacopter did complete a lap and I now know exactly what worked really well and what didn’t! And it’s that knowledge I wish to share as part of the payback for all the open-source community help I have had in the last year.