2016 Robot Snow Test

This was our 2016 robot’s full-up test video, mining some snow.


Hawaii robot mining competition press

A few articles came out this morning about our next robot mining competition, the PRISM event on the Big Island of Hawaii.  We’re currently the only team already in Hawaii, so we were able to talk with the press early.

This Tribune-Herald article gives a good overview of the contest, and has a nice shot of the Iowa State robot, which I think uses custom-fabricated tracks.


Judges’ Innovation Award at NASA 2014 Robotic Mining Competition

UAF’s team in the 2014 NASA Robotic Mining Competition this May won the Judges’ Innovation Award for our four-wheeled mining robot.  The team’s trophy was presented at the NASA Kennedy Space Center visitor complex, directly underneath the space shuttle Atlantis on display there.  
The team’s robot is nearly 5 feet wide, weighs just over 100 pounds, and mined a total of 170 pounds of dusty simulated Martian regolith during two 10-minute mining missions, operated via a combination of onboard autonomy and teleoperation.  
The judges especially liked the UAF robot’s compact stowed configuration, which unfolded at startup to a much larger and more stable wheelbase, allowing it to reliably navigate the obstacle field and remain mobile in the fluffy dust.  The team’s software also impressed the judges, particularly the fly-by-wire computer vision robot localization system, which used an onboard wide-angle camera and a large marker similar to a coarse QR code to automatically determine the robot’s position in the mining arena.  This marker also blocked the pilot’s direct view of the robot, but provided such reliable positioning that it was more useful than directly seeing the robot.  They also liked our use of off-the-shelf parts like Barbie Jeep wheels and gearboxes, recycled steel, and commercial conveyor tracks. 
It has been an intense five days of official competition runs, practice missions, and final tuning of hardware and software.  The competition began with a literal bang early Monday, when an errant wire shorted a motor controller input pin, resulting in smoke and fire; luckily we had brought a spare motor controller board.  Our first practice session on Monday started with an unexpected camera dropout, resulting in the robot climbing the far wall and ending up on its back, which we addressed by adding front bumper hardware and camera error handling software.

We encountered intermittent problems the first few days with electromagnetic interference from the motors disconnecting our USB connection, but we built opto-isolators for our serial connection to the motor controller breakout board, and found a software fix to re-enable the USB ports so the robot could continue operating despite the interference.  Our fully autonomous navigation software performed beautifully during our testing both before and during the competition, and during a final non-scored exhibition run, but lost position while crossing the obstacle field during the scored mission runs.  We will continue to tune our automatic vehicle path planning software.
The traveling student team members are:
 Arsh Chauhan, UAF Computer Engineering
 John Pender, UAF Computer Science
 Tyler Pruce, UAF Electrical Engineering
 Dalton Newbrough, UAF Mechanical Engineering 
The faculty advisor is Dr. Orion Lawlor, UAF Computer Science, who donated shop space for the Sunday construction sessions and backyard space for a dusty test area.  Travel support was provided by the College of Engineering and Mines, and the Undergraduate Research and Scholarly Activity program.
We also received help from students Joe Tallan and Aven Bross earlier this year.  We built on the successful basic design constructed last year with Noah Betzen, Robert Parsons, and funding from the Alaska Space Grant.

Mined 27 kg, plus autonomy!

We just finished our first official NASA Robotic Mining Competition run, which worked amazingly well!  We mined 27 kg of dusty regolith (over half the robot’s 46.1 kg body weight) in ten minutes, and we used full autonomy to locate the dump bin and drive across the obstacle field into the mining area.

Just this morning, we finished adding homebrew opto-isolators to our control Arduino, which separate the electrically noisy main battery ground plane (shared with the big robot motors) from the sensitive USB input to the control laptop.  This noise had been causing annoying intermittent cutouts, as the laptop’s internal protection electronics disable the USB port.

New modular Aurora Robotics frame

We’re now two weeks from flying to Florida for the finals of the NASA Robotic Mining Competition, and we’ve been trying to figure out how to get our robot’s chassis transported there.  Shipping time matters, since we’re still tuning autonomy (more about that once the contest finishes!), and price matters too since our travel budget is limited.

Last year, Robert Parsons welded up a neat little bolt-together frame, which worked out well, so I’m going to try that again.  Step one is to de-bone the robot, removing the black steel chassis without touching the wiring or sensors. 


Step two was cutting and welding.  A *lot* of cutting and welding.  Like, all day.  My welding skills are improving, but I have a really hard time figuring out how to design my way from point A to point B in steel.  Anyway, I ended up with a design in three major parts: the two sides hold the wheels, and the raised center portion holds the excavation conveyor/dump mechanism.  Raising the center gives us more ground clearance, but it sure makes the chassis harder to figure out!


The cool part is I was able to fit everything inside *one* airline-legal checked bag, with plenty of space left for tools and robot parts.



Hopefully tomorrow we can put the electronics and actuators back onto the new frame, and see how it works!

Robot needs sensors

Currently, we drive the robot using teleoperation, and don’t even have any onboard video.  To move in the direction of autonomous operation, we’re going to need sensors.

I built some little encoder wheels using a rotary mechanical switch (ALPS 688-EC12E24204A8 from Mouser).  The switch just grounds the outer two quadrature wires to the central ground line, and should be easy to hook to an Arduino.  Because the encoder wheels aren’t connected to the drive wheels, we should be able to detect when the robot’s just spinning its wheels rather than driving normally.

Bump sensors would be handy, especially when docking with the bin at the back wall.  I’ve been looking at ordinary AC light switches, since they’re incredibly robust, and cheaper than microswitches.

It’d be really nice to be able to measure current draw.  Measuring current draw to the mining head (about 4A) would let the Arduino instantly detect a stall and raise the head in mining mode, which would make mining much more efficient.  Current draw to the front arms (about 1A unloaded) would let us know when they’ve finished deploying, and could even tell us how hard they’re pushing.  Microcontrollers can directly detect voltage: the 10-bit Arduino A/D can in theory measure a voltage difference of 5 mV, which at 1A only needs a 5 mOhm shunt.  We should reach that with just two feet of 14-gauge wire (2.5 mOhm/ft), and might already have more than that available at the motor controller directly.

I’m thinking we should make an Arduino shield to collect and noise-filter all the sensors we’ll be adding.  It should have RJ-45 female connections for the sensor wiring, so we can just unplug, test, and re-plug sensors.  A shield would also be less likely to vibrate loose than last year’s “mess of wires” technique!