Exploring the Pixy sensor with Mathematica


Tracking Number

We show how to track objects and plot their movement using the Pixy camera and the Mathematica package that's included with Raspbian.

The Pixy camera [1] was developed at Carnegie Mellon University and is a fairly specialized vision sensor. Does it take gorgeous 10-megapixel photographs? No. It tracks objects and sends data about them to microcontrollers, Raspberry Pis, BeagleBones, and notebook computers.

Mathematica [2] is a data analysis and plotting package that's bundled into the Raspbian OS [3] on the Raspberry Pi. It can crunch numbers and help you visualize all kinds of mathematical functions, from trigonometry to calculus.

I wanted to develop a basic process for getting data from the Pixy into Mathematica on the Raspberry Pi in preparation for more in-depth projects. The premise was to track an object with the Pixy and then plot its x and y coordinates. In the future, I might want to suspend the Pixy over a robot arena and analyze bot movements or predict the path of a ping pong ball as captured by the Pixy. I'm sure you are already thinking up ideas of your own. Exploring new ideas and tools requires prototyping and testing to get an understanding of what you have to work with.

Note that I work almost exclusively on Linux machines, which is also what the Raspberry Pi runs. I use Xubuntu Linux on my workhorse Asus notebook.

Meet the Pixy

The Pixy is a $75 open source, open hardware board that can track hundreds of objects (using seven different color signatures) at 50 frames per second. It outputs x-y coordinates, object size, and the corresponding object name over a variety of hardware interfaces, including UART serial, SPI, I2C, USB, or digital/analog outputs. My experiments used the USB connection, because that worked with both the Pi and my Asus Linux notebook.

The front view (Figure 1) shows the camera sensor and lens assembly at the top and the GPU (large square IC) just below it. The little white button, at the top right of the lens, is the mode selector. Simply push and hold the button to get the Pixy to "learn" and track an object.

Figure 1: Pixy front view.

The back side of the Pixy (Figure 2) shows the large black SPI connector at the top right. The mode selection button is at the top left. Just below it and to the left is the USB micro-B (five-pin) connector. At the lower right, you'll see the six-pin servo header, and just below that is a two-pin power header. The Pixy can draw power from the USB, SPI, or two-pin connectors.

Figure 2: Pixy reverse view.

The Pixy sensor packs a lot of computational power (and its own graphics processing unit) along with complex algorithms in its firmware to perform its object tracking feats. It integrates with other microcontrollers and acts as a sensor, not so much as a standalone device; however, it does have basic servo control capabilities. The device is easily hooked up to a Raspberry Pi (Figure 3), an Arduino, the BeagleBone Black, and notebook computers.

Figure 3: Connecting the Pixy and Raspberry Pi.

To capture data and send it to a Rasp Pi or your Linux notebook, you'll need to install some libraries and the Pixy software. Comprehensive Pixy download and installation instructions can be found online [4] [5].

If you are familiar with the Linux command line, Listing 1 shows a summary.

Listing 1

Download and Install

sudo apt-get install libusb-1.0-0-dev
sudo apt-get install libboost-all-dev
sudo apt-get install cmake
git clone https://github.com/charmedlabs/pixy.git
cd pixy/scripts
./build_libpixyusb.sh
./build_hello_pixy.sh

Once the software is installed, simply center a brightly colored object in front of the Pixy lens and hold the mode selection button until the colored LED (at the bottom) turns red. Release the button and then tap it briefly, and the Pixy will have "learned" the first color signature. Then, run the hello_pixy program below to get a textual readout of the x, y, and height values of the object as it moves through the Pixy's field of view. Figure 4 shows a picture of a blue battery pack I tracked for data.

Figure 4: The blue battery I used for tracking.

cd /home/pi/pixy/build/hello_pixy
sudo ./hello_pixy

Use sudo to run the program so you don't have to tweak the USB port permissions. As the object is moved in front of the Pixy, the data stream will print out the x and y coordinates, along with the object height and name. You can capture the data to a text file using standard Linux command-line redirection. For example:

sudo ./hello_pixy > data.txt

A portion of the output I captured from the Pixy is shown in Listing 2. At 50 frames per second, you get a lot of data very quickly.

Listing 2

Output from Pixy

[sig: 1 w: 51 h: 35 x: 25 y: 17]
-------------------------------
[sig: 1 w:129 h:112 x: 64 y: 56]
-------------------------------
[sig: 1 w:142 h:135 x: 71 y: 67]
-------------------------------
[sig: 1 w:149 h:141 x: 74 y: 70]
[sig: 1 w: 16 h:  2 x: 12 y:161]
-------------------------------
[sig: 1 w:141 h:143 x: 70 y: 71]
-------------------------------
[sig: 1 w:141 h:144 x: 70 y: 72]
-------------------------------
[sig: 1 w:165 h:151 x: 82 y: 75]
[sig: 1 w:  5 h:  5 x:  3 y:190]

This output is easy for humans to understand but not overly useful as input to another program or for automation.

Time-honored Unix/Linux philosophy dictates that I use small general-purpose programs, strung together, to process data (text) files. In this case, I'll use Awk with redirection to a file. Awk is your text filtering and transformational friend [6]. It's a complete text processing application that runs from the command line. With Awk, you can easily get the data into a form suitable for analysis with Mathematica. Here's what I used:

awk -F: '/x/ {print "{" substr($5,1,3)","substr($6,1,3)"}"}' data.txt > data2.txt

In this case, the -F option sets the colon as the field delimiter. I only wanted lines containing the letter "x," so I used the /x/ pattern to match that string; this step eliminates the data-less lines of hyphens. Finally, the substr function extracts characters 1 through 3 of the fifth and sixth fields (the x and y values), which print wraps in curly braces. The input file is data.txt, as captured from the Pixy. The data2.txt output file is then read into Mathematica.

Listing 3 shows a snippet of what comes out of the Awk program: just one x and y coordinate pair per line.

Listing 3

Awk Output

{ 38, 23}
{ 40, 28}
{ 39, 12}
{ 36, 27}
{ 35, 25}
{ 35, 14}
{ 32, 13}
{ 29, 10}
{ 15, 5}

Because I was only tracking one object, I had no pressing need to include the signature (sig: 1) field. Plotting multiple objects would mean carrying each object's signature along with its coordinates, as sketched below.
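
Jumping ahead a bit, here is a minimal sketch of how several signatures might be handled once the data reaches Mathematica. It assumes a hypothetical data4.txt in which the Awk step also emits the signature number as the first element of each triple:

(* Hypothetical: data4.txt would hold triples such as {1, 38, 23}, *)
(* with the signature number first, then x and y.                  *)
allpoints = ReadList["/home/pi/pixy/build/hello_pixy/data4.txt"];
tracks = Map[Rest, GatherBy[allpoints, First], {2}];  (* one {x, y} list per signature *)
ListLinePlot[tracks, PlotLegends -> Automatic]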

The next step is to get the data file into Mathematica and plot some points.

Mathematica on the Pi

Start Mathematica from the icon on the main Raspbian desktop or the panel. It will take a minute or two to initialize. Then, use the ReadList command to pull in the data file and assign it to a variable.

mydata = ReadList["/home/pi/pixy/build/hello_pixy/data2.txt"]

Press Shift+Return to evaluate the command. After a short pause, the imported data appears in the next cell. I had several hundred data points, of which only a portion is visible in Figure 5.

Figure 5: Data points.
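
Before plotting, a quick sanity check confirms that the list imported cleanly (these two commands are my own additions, not part of the original session):

Length[mydata]     (* how many {x, y} pairs were imported *)
Take[mydata, 5]    (* peek at the first few points *)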

Figure 6 shows the last couple dozen data point pairs, along with the ListLinePlot command that plots the variable (which is simply the list of imported data points) as a line, and the resulting plot of all the points.

Figure 6: Data point pairs and resultant plot.
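
The exact input is only visible in the screenshot, but a command along these lines reproduces the plot (the AxesLabel and PlotMarkers options are my additions):

ListLinePlot[mydata, AxesLabel -> {"x", "y"}, PlotMarkers -> Automatic]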

To make it easier to visualize the motion of the object in front of the Pixy, I cut the list down to the last 20 data points with the tail command (on the Linux command line) and redirected the result to a new file. The tail command outputs the end of a file, and the -n option specifies how many lines to grab, counting from the end:

tail -n 20 data2.txt > data3.txt

Using the data3.txt file as input to the ReadList command in Mathematica produced Figure 7.

Figure 7: Visualizing the movement.
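
The same trimming can also be done without leaving Mathematica; here is a minimal sketch, assuming the full list is still stored in mydata:

lastpoints = Take[mydata, -20];   (* keep only the last 20 {x, y} pairs *)
ListLinePlot[lastpoints]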

The object is moving in a more or less circular path. That's how you prototype: clearly, there is plenty of room to improve both the process and the results by refining the Pixy data acquisition and tweaking its recognition settings.


    Welcome to Raspberry Pi Geek – the first and only print magazine dedicated to the amazing Raspberry Pi mini-PC and the open hardware revolution. We ring in the new and old in this issue. (Actually, nothing is really very old with the Raspberry Pi, but we follow up on some previous themes, including a report on how it went for the wind-turbine-powered Raspberry Pi we described last time.)