Wednesday, February 15, 2017

Rubik's Cube Tracker using OpenCV

I've been working on my own Rubik's Cube solver for the past few months. It is a mashup of the Lego 42009 Mobile Crane MK II and an EV3 Mindstorms brick and can solve cubes from 2x2x2 up to 5x5x5. The robot is called CraneCuber; it uses a USB webcam to see the cube and extract the colors of each square. I'll do a post on CraneCuber soon, but I wanted to do a separate post about what is involved in finding a Rubik's Cube in an image and extracting the colors.

Here is a quick video showing all this in action.


Cube Location

The robot uses a webcam to take a picture of each side; we'll focus on this photo of a 4x4x4 cube. I use the Python OpenCV bindings (cv2) for all of the image processing.


Canny Edge Detection

The first steps are to load the image, convert it to grayscale, blur it a little, and then use Canny edge detection to locate all of the edges.

    self.image = cv2.imread(filename)
    gray = cv2.cvtColor(self.image, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (3, 3), 0)
    canny = cv2.Canny(blurred, 20, 40)

Canny Edges

Dilate

The next step is to dilate the lines a little; this makes the lines thicker, which will make it easier to find the contours of the squares.

    kernel = np.ones((3, 3), np.uint8)  # requires: import numpy as np
    dilated = cv2.dilate(canny, kernel, iterations=2)

Dilated Canny Edges

Contours

At this point we have a black and white image with nice thick lines around each square. OpenCV can find all of the contours in an image, where a contour is defined as "a curve joining all the continuous points (along the boundary), having same color or intensity".

(contours, hierarchy) = cv2.findContours(dilated.copy(), 
                                         cv2.RETR_TREE,
                                         cv2.CHAIN_APPROX_SIMPLE)

Contours

Contour Approximations

It looks like someone spilled a plate of blue spaghetti noodles all over the cube, doesn't it? We need to reduce the complexity of the shapes of the contours. We can do this via OpenCV's approxPolyDP, which in a nutshell approximates the shape of a contour. In the image below the contours are again in blue but their approximations are in red.

Contours (blue) with approximations (red)

Square Contour Approximations

The approximations look much easier to work with in terms of figuring out which ones are squares. We look at each approximation and use the following three rules to determine if it is a square:
  • There must be four corners
  • All four lines must be roughly the same length
  • All four corners must be roughly 90 degrees
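Those three rules are easy to sketch in plain Python. Here is roughly what such a check could look like, assuming the approximation has already been flattened into a list of (x, y) corner points — the helper names and tolerances below are mine, not from the tracker code:

```python
import math

def side_lengths(corners):
    """Length of each side of the polygon, walking corner to corner."""
    return [math.dist(corners[i], corners[(i + 1) % len(corners)])
            for i in range(len(corners))]

def corner_angles(corners):
    """Interior angle (in degrees) at each corner."""
    angles = []
    n = len(corners)
    for i in range(n):
        px, py = corners[i - 1]            # previous corner
        cx, cy = corners[i]                # this corner
        nx, ny = corners[(i + 1) % n]      # next corner
        a = math.atan2(py - cy, px - cx) - math.atan2(ny - cy, nx - cx)
        a = abs(math.degrees(a)) % 360
        angles.append(a if a <= 180 else 360 - a)
    return angles

def is_square(approx, side_tolerance=0.30, angle_tolerance=20):
    """Our three rules: four corners, roughly equal sides, roughly 90 degree corners."""
    if len(approx) != 4:
        return False
    sides = side_lengths(approx)
    if max(sides) > min(sides) * (1 + side_tolerance):
        return False
    return all(abs(a - 90) <= angle_tolerance for a in corner_angles(approx))
```

Tightening or loosening the tolerances trades missed squares for false positives.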
In the image below the contour is green instead of blue if the approximation for that contour satisfies our rules for being a square.


Contours with square approximations are green

At this point we can eliminate all of the contours that are not squares, which cleans things up a lot.

Removed non-squarish contours

Squares within squares

Things are looking better, but in the middle we have three places where there is a square drawn around a square. This happens because we also find the contours of the black borders between the squares. We eliminate these outer squares, which leaves us with:

Removed squares around squares
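One way to sketch that elimination is with bounding boxes (the (x, y, w, h) tuples that cv2.boundingRect returns): if a square's box fully contains another square's box, it is one of the outer squares and can be dropped. This is a simplified stand-in for whatever the real code does:

```python
def contains(outer, inner):
    """True if bounding box outer (x, y, w, h) fully contains inner."""
    ox, oy, ow, oh = outer
    ix, iy, iw, ih = inner
    return (ox <= ix and oy <= iy and
            ix + iw <= ox + ow and iy + ih <= oy + oh)

def drop_outer_squares(boxes):
    """Keep only boxes that do not fully contain another box."""
    return [b for b in boxes
            if not any(b != other and contains(b, other) for other in boxes)]
```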

Cube dimensions and boundaries

We can now analyze the remaining contours and figure out two key pieces of information:
  • We are dealing with a 4x4x4 cube
  • We can find the boundaries of the cube
Since we now know the boundaries of the cube, we can throw away any contours outside them, which leaves us with:

Removed contours outside cube boundary
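A rough sketch of the dimension/boundary logic, again assuming each detected square has been reduced to an (x, y, w, h) bounding box (my own simplification, not the actual tracker code):

```python
import math

def cube_size_and_boundary(boxes):
    """Given the bounding boxes (x, y, w, h) of the detected squares,
    guess the cube dimension and the outer boundary of the cube face."""
    left = min(x for x, y, w, h in boxes)
    top = min(y for x, y, w, h in boxes)
    right = max(x + w for x, y, w, h in boxes)
    bottom = max(y + h for x, y, w, h in boxes)
    # With a clean detection the square count is a perfect square: 16 -> 4x4x4
    size = round(math.sqrt(len(boxes)))
    return size, (left, top, right, bottom)
```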
Presto, we have identified the 16 squares in this image!! We compute the mean RGB (Red, Green, Blue) values for each of the 16 contours and later on we hand this data off to the rubiks-color-resolver (more on that in a minute).
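The real code presumably masks each contour with OpenCV when computing its mean color; a simplified pure-Python version that just averages a rectangular region looks like this (the name mean_rgb is mine):

```python
def mean_rgb(image, box):
    """Mean (R, G, B) over a rectangular region of the image.
    image is a 2-D grid of (r, g, b) pixels; box is (x, y, w, h)."""
    x, y, w, h = box
    pixels = [image[row][col]
              for row in range(y, y + h)
              for col in range(x, x + w)]
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) // n for i in range(3))
```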

Handling Errors

Sometimes the squares of a cube are dented, or the sticker is peeling off, or there is some really bad glare, or it is in the shadows, or there is some other crazy problem that makes the contour approximation fail one of our three "is it a square" rules when in fact it is a square. That happened in the image below: the square in the 3rd row, 5th column has a contour approximation that doesn't look like a square (the contour is blue, the approximation is red). Here the approximation is a triangle, so it clearly failed the "There must be four corners" check.

Non-square approximation for a square
There were enough good squares for us to determine that this is a 6x6x6 cube, though, and that we are missing one square.
Missing One Square
We then look at all of the non-square contours inside the boundaries of the cube and see if any of them are in a location that will give us six squares in the 5th column and six squares in the 3rd row. The contour in yellow below (approximation in light blue) satisfies that condition.
All Squares Located

Color Extraction

Once we've processed the images for all six sides we have the RGB (Red, Green, Blue) value for each square. The 4x4x4 image from our example is side F in the cube below; the colors shown are the mean RGB values as extracted from the images. Notice how much variation there is...square 40 is a good bit darker than square 17, but these are both yellow squares.
Raw RGB values

The next step is to take those RGB values and identify which side (U, L, F, R, B, or D) each square belongs to. We do this because all Rubik's Cube solvers need to know the exact state of the cube.

The actual mechanics of how this works aren't terribly exciting to write about in a blog, but in a nutshell it does the following:

  • Identify the six colors used by the cube.  In this case the colors are red, orange, green, blue, yellow and white.
  • Create a list of the color combinations used by all edges (red/blue, yellow/red, etc)
  • Create a list of the color combinations used by all corners (white/orange/green, red/blue/white, etc)
  • Create a list of the colors used by the center squares
  • For each edge pair (say U5 L18, for example) compute the color distance of that pair vs. all of the combinations in the "list of edge color combinations that we need" and find the one that is the closest match. The distance between two colors is calculated using the CIEDE2000 algorithm. For the U5 L18 example the needed color combination that matches with the lowest color distance is white/orange, so we assign white/orange to U5 L18 and remove white/orange from the needed list (since this is a 4x4x4 there is one more white/orange left on the needed list).
  • Repeat the process for the corners and centers
The end result is that we can nail down exactly which side each square belongs to.  We then redraw the cube with some brighter colors so we can sanity check against the "Raw RGB values" image above to make sure we got everything correct.
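To make the matching step concrete, here is a heavily simplified sketch of the greedy assignment. It uses plain Euclidean RGB distance where the real resolver uses CIEDE2000 in a proper color space, and it only tries one orientation per edge pair; the function and variable names are mine:

```python
def distance(c1, c2):
    """Euclidean RGB distance; the real resolver uses CIEDE2000 instead."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def match_edges(scanned_edges, needed_combos, reference):
    """Greedily assign each scanned edge pair the needed color combination
    whose reference colors are the closest match, then consume it."""
    needed = list(needed_combos)
    assignments = {}
    for name, (rgb1, rgb2) in scanned_edges.items():
        best = min(needed,
                   key=lambda combo: distance(rgb1, reference[combo[0]]) +
                                     distance(rgb2, reference[combo[1]]))
        assignments[name] = best
        needed.remove(best)  # each needed combination can only be used once
    return assignments
```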


And finally we print out a big string that represents the state of the cube. This string is what is passed to various Rubik's Cube solvers so they can compute a solution. For the cube above the state string is:
RRBBURURBULLLBBUFFFRRRFLFBFDRDFDDRRLFBDDBBLRRLUBFDLUDLDRDRUUBBUFDLUBDDRLUBDLLFFUDDUFBLULBFFRLFUU

The color resolver code is available at https://github.com/dwalton76/rubiks-color-resolver

The Code

If you have a Linux box, a webcam, and a Rubik's Cube you can install the software that I used in the video. It is available at https://github.com/dwalton76/rubiks-cube-tracker...just follow the install instructions in the README.

References


  • http://www.pyimagesearch.com/2015/04/06/zero-parameter-automatic-canny-edge-detection-with-python-and-opencv/
  • http://www.pyimagesearch.com/2016/02/08/opencv-shape-detection/
  • http://docs.opencv.org/trunk/d9/d8b/tutorial_py_contours_hierarchy.html
  • http://www.alanzucconi.com/2015/09/30/colour-sorting/
  • https://en.wikipedia.org/wiki/Color_difference

Sunday, October 16, 2016

SeanBot 2.0 - smartphone web interface for lego tank robots

SeanBot

Two years ago I did the SeanBot robot, which was the EV3D4 build but with ev3dev running under the covers and a basic web interface for driving. The implementation was a bit clunky though, as it used an Apache web server to serve the web pages, images, etc. but a second web server for the REST API. Despite being a bit clunky to get going, I've seen several projects use the SeanBot code to provide a web interface for their robots. pandaeatsbamboo was the most recent project to do so; they have a nice blog post on their build.

Since there were several projects using the SeanBot code, and since many of the basic Lego Mindstorms builds are tank-style robots (EV3D4, TRACK3R, GRIPP3R, EV3RSTORM, etc. are all tank robots), I decided to revamp the SeanBot code to make it a little easier to use. Here it is in action; details after the video.


Goals


  • Only run one web server
  • Implement the web server in python
  • Take advantage of the Tank class in ev3dev-lang-python's helper.py
  • Provide a desktop interface and a touch based smartphone interface
  • Make it easy for someone to modify the existing code for their own robot

Web Interface

The web server listens on port 8000, so go to http://x.x.x.x:8000/ where x.x.x.x is the IP address of your EV3. The main interface will present you with a choice between a desktop interface and a mobile interface.


Desktop Interface

The desktop interface looks very similar to the original SeanBot interface. There are two sliders: one to control the speed of the medium motor and one to control the speed of the tank's movements. This page supports touch events, so you can use the desktop interface from your smartphone if you like.


Mobile Interface

The mobile interface uses a virtual joystick: if you drag the black dot towards the top of the gray circle, the robot will move forward. Drag it to the right to turn clockwise, drag it to the left to turn counterclockwise, etc.

The closer the black dot is to the edge of the gray circle, the faster the robot moves.
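I won't claim this is exactly how WebControlledTank mixes the joystick into motor speeds, but a typical mapping looks something like the sketch below: the dot's distance from center sets the overall speed, and its x offset steers by speeding up one track and slowing the other:

```python
def joystick_to_tank(x, y, radius=100, max_speed=100):
    """Map the joystick dot position (x = right, y = forward) to
    (left_track, right_track) speeds for a tank-style robot."""
    # how far the dot is from center, as a fraction of the circle radius
    magnitude = min((x * x + y * y) ** 0.5, radius) / radius
    left = y + x   # steering: pushing right speeds up the left track...
    right = y - x  # ...and slows the right track, turning clockwise
    biggest = max(abs(left), abs(right)) or 1
    scale = magnitude * max_speed / biggest
    return left * scale, right * scale
```

Pushing the dot straight up drives both tracks forward at full speed; pushing it straight right spins the robot clockwise in place.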


The Code


The code is available at ev3dev-lang-python in the demo/EV3D4 directory; just run EV3D4WebControl and off you go (I am assuming that you are running python-ev3dev 0.7.0 or later). WebControlledTank lives in ev3dev/helper.py, so take a look at that if you want to see the details of what is going on behind the scenes.

If you want your own tank class robot with a web interface you could do something similar to this:

#!/usr/bin/env python3

import logging
from ev3dev.auto import OUTPUT_A, OUTPUT_B, OUTPUT_C, OUTPUT_D
from ev3dev.helper import WebControlledTank, MediumMotor

log = logging.getLogger(__name__)

class MyTankRobot(WebControlledTank):

    def __init__(self, 
                 medium_motor=OUTPUT_A, 
                 left_motor=OUTPUT_C,
                 right_motor=OUTPUT_B):
        WebControlledTank.__init__(self, left_motor, right_motor)

        self.medium_motor = MediumMotor(medium_motor)
        self.medium_motor.reset()
        # code specific to your robot (sensors, etc) goes here

if __name__ == '__main__':
    mybot = MyTankRobot()
    mybot.main()

Tuesday, August 9, 2016

Save the Clock Tower!!

I am way behind on posting recent projects...and by recent I mean last Halloween. One of my friends goes all out for his annual Halloween party, there is always a theme and some elaborate prop. A couple of years ago there was a pirate theme so he built a partial pirate ship in his garage.

The theme for last Halloween was Back to the Future and the prop was going to be a large clock tower. I volunteered to build a clock motor out of an EV3. The goals were:
  • Run until 10:04 and then freeze
  • Signal an Arduino to trigger a lightning strike (white rope lights)
  • After five minutes of sitting at 10:04, start back up but spin extra fast so that we get back to 10:04 in 15 minutes
I found a pretty cool analog clock design online. It used Technic parts and the instructions were only a few bucks. The gear ratios are such that for every full rotation of the long hand the short hand makes 1/12 of a rotation. Here is a test run of the clock being driven by a Power Functions motor.



And the finished clock tower with a lightning strike at 10:04

Tuesday, October 20, 2015

MindCuber EV3 in Python

MindCuber

If you have an EV3 you have probably watched a video of the awesome MindCub3r robot solving a Rubik's Cube. I too built this robot and made a video :). One twist is that mine is running ev3dev (Linux) and all of the code is written in Python. I am a big Python fan and would much rather program crazy Lego robots in it than in the graphical programming language provided by Lego.


The Code

cavenel on GitHub wrote the original MindCuber Python implementation based on a release of ev3dev from early 2014. It no longer worked on the latest ev3dev, though, so I started fixing bugs and got the basics working. I also rewrote most of the code that does the color analysis.

The code is available here:

Color Analysis

The Lego color sensor can scan something and return one of a few basic colors (blue, green, yellow, red, etc.), but for scanning a Rubik's Cube this doesn't work very well. The reason is that certain colors on the cube are too similar: red vs. orange are really close, and even blue vs. green can confuse the sensor because they are both so dark.

There is hope though: you can tell the color sensor to return the raw RGB (Red, Green, Blue) value instead of trying to guess at the color. I take the RGB values for all 54 squares and figure out which of the six possible colors each square is. I learned a lot about color difference and Rubik's Cube parity :)
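A naive version of that classification just picks the reference color with the smallest RGB distance, something like the sketch below (the anchor values are my rough guesses, not calibrated readings from the sensor). This is exactly where red vs. orange gets dicey, which is why proper color difference formulas matter:

```python
REFERENCE = {
    # Rough RGB anchors for the six sticker colors -- illustrative guesses,
    # not values from the MindCuber code.
    'white':  (200, 200, 200),
    'yellow': (200, 200, 0),
    'red':    (180, 0, 0),
    'orange': (220, 110, 0),
    'blue':   (0, 0, 160),
    'green':  (0, 140, 0),
}

def classify(rgb):
    """Name the reference color with the smallest RGB distance to this reading."""
    return min(REFERENCE, key=lambda name: sum(
        (a - b) ** 2 for a, b in zip(rgb, REFERENCE[name])))
```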

Solution Code

Once you know what color each square is you can feed that data to one of the many Rubik's Cube solving programs out there on the web and they will return a list of instructions that tell you how to solve the cube. This is some pretty complex stuff; people have done their PhD theses on algorithms for solving Rubik's Cubes.

I did not dive into this in depth; this problem was solved long ago by people a lot smarter than me. I wanted to keep everything in Python though, so I grabbed a copy of the following and am using that:
https://github.com/muodov/kociemba

Sunday, November 9, 2014

SeanBot - An EV3 robot controlled via a web interface

I built a cool little robot recently and even managed to convince myself that it was somewhat work-related. Sean, one of my co-workers at Cumulus Networks, and his wife were expecting their first child, so Sean wasn't going to risk making the trip to California for a quarterly meeting since it was too close to their due date. I hadn't done a cool Mindstorms project recently, so I offered to build a little robot that he could control via the web and drive around the offices at headquarters. The idea was to have the robot hold an iPhone running Skype so that Sean could drive it around during the meetings and chat with everyone.

I decided to use one of the EV3 bonus models, EV3D4, as the robot. It is small but very mobile, so it seemed like a good choice. That and it is based on the Star Wars robot R5-D4 (the little red robot the Jawas were selling along with R2-D2 and C-3PO in Episode IV), so that just makes it extra cool. Originally I told my 5-year-old that it was R2-D2, but he called me out with "Daddy that isn't R2-D2, it needs to be white and blue to be R2-D2, not white and red". Outsmarted by a 5-year-old...

HiTechnic's Rotacaster
In terms of the hardware, the main change I made was to use an omniwheel, specifically a Rotacaster from HiTechnic. An omniwheel rotates like a normal wheel but can also slide from side to side...they are handy if you have a robot that needs to spin in circles. I also used a very small WiFi dongle from The Pi Hut, but be warned: you can't use it with the standard EV3 software...more on that in a bit.



You can see the robot in action here with the omniwheel front and center, iPhone mounted, etc. Sean used Skype to dial the iPhone from his desk, so that is him on the phone there.

For the hardware I didn't change much from the stock Lego instructions, but for the software I took a very different approach. I needed the robot to run a web server that would allow Sean to control it via a web browser. I couldn't find a way to do this via the standard software that ships with the EV3, but good news: there is a great little project called ev3dev that built a Debian Linux distribution that runs on the EV3. In other words, you just boot Debian Linux on your EV3!! This gives you tons of flexibility and it means that you no longer have to use the graphical programming language provided by Lego. Python is my favorite language and it turns out that with ev3dev you can program the EV3 in Python...perfect!!

All of the code for this project is available on github, feel free to use it:

How does the code work? The /var/www/ content is served by the main web server (Apache, etc.). LegoR2D2.py runs a REST API on port 8000. When the user clicks a button on the web page served by Apache, AJAX sends an HTTP GET to LegoR2D2.py's REST API to tell it to move forward, backward, etc. When the user releases the button, we send another HTTP GET to tell it to stop.
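As a rough sketch of that kind of REST API (the paths and names here are illustrative, not the actual URLs LegoR2D2.py uses), a tiny Python 3 version could look like:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# drive commands the API accepts -- illustrative, not LegoR2D2.py's exact set
COMMANDS = {'forward', 'backward', 'left', 'right', 'stop'}

def parse_command(path):
    """Extract the drive command from a GET path like '/move/forward'."""
    parts = [p for p in path.split('/') if p]
    if len(parts) == 2 and parts[0] == 'move' and parts[1] in COMMANDS:
        return parts[1]
    return None

class RobotHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        command = parse_command(self.path)
        if command is None:
            self.send_error(404)
            return
        # a real robot would start/stop its motors here
        self.send_response(200)
        self.end_headers()
        self.wfile.write(command.encode())

# To actually serve:  HTTPServer(('', 8000), RobotHandler).serve_forever()
```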

The Star Wars page on the left of Sean's screen is the web page served by the robot.  You just click the arrows to make the robot move.

Here is SeanBot in action for the quarterly meeting; SeanBot is in Mountain View, CA but Sean is driving it from Apex, NC...cool stuff.

And last but not least here is Sean's beautiful daughter :)  She was born a few weeks after the meeting in California ended and is keeping Sean and his wife plenty busy.

Friday, June 27, 2014

Lego Digital Clock

I built a copy of an awesome Lego clock designed by Hans Andersson:



Mine is a little different in that it uses a Raspberry Pi to control the motors instead of two NXT bricks. Dexter Industries makes a Pi add-on board called BrickPi which allows you to control Lego Mindstorms motors, etc. from a Pi:

Raspberry Pi + BrickPi

All of the code is written in Python and is available on GitHub:

Sunday, September 22, 2013

Remote Controlled EV3 Lego Excavator 42006

Remote Controlled Excavator

I got the LEGO Excavator a few months ago, but the controls on it were way too difficult for my kids to use. Heck, I had a hard time remembering what combination the three switches had to be in to make it open/close the claw or raise/lower the arm. I found some videos on YouTube where people used a lot of Power Functions gear to make their excavators remote controllable. The way they did it was cool, but I don't own tons of Power Functions parts, so I decided to make mine remote controllable by combining it with my EV3.

The hardest part was figuring out how to control six movements with only four motors. I built a motor multiplexer that uses two motors to control four movements. This is used to control the claw, the arm, rotating the platform, etc. I used the other two motors to control the tracks.

It is a big hit with my kids :)


Instructions


  • LDD Instructions - I didn't include instructions for the arm/boom since they are exactly the same as on the standard 42006. When you build it, turn the mux so that it is engaging the gear at the top.
  • EV3 Software - There is a program called RunFirst that you need to run once when you first build the excavator. This program creates a text file that is used to remember where the multiplexer is located. That way, if you turn the EV3 off and back on, we know which gear is currently engaged by the mux.