MouseAir launches toy mice for the amusement of cats

Lead Image © sergey vasiliev, 123RF.com

Cat and Mouse

In a valiant effort to entertain a certain black cat, the author creates a weapon of mass distraction. In the process, you learn how to use servomotors, solenoids, RFID readers, and more with the Raspberry Pi.

The cat toy launcher is one of the more whimsical projects to come out of SwitchDoc Labs [1]. It originated from a discussion surrounding a cat named Panther, who loves to bat fake furry mice around on the floor but is continually knocking them under the oven, refrigerator, or couch. Clearly, I needed to provide the cat with a greater ongoing supply of mice, especially when no one was around. Thus, MouseAir was born.

A block diagram (Figure 1) was developed in a pub during a wine-tasting adventure, and development started the next day. Although a number of new products have appeared for remote pet interaction, such as iCPooch [2], I have seen nothing like a Raspberry Pi-based WMD (weapon of mass distraction) for cats. The goal is to have the device detect the cat when it walks by and fire a toy mouse into the hallway.

Figure 1: Block diagram mapping configuration and communication of MouseAir.

As cat owners everywhere know, you have only about a 20 percent chance that your cat will like a new toy. I hope that by combining a new cat toy delivery system with a proven cat toy as ammunition, this probability can be dramatically improved. MouseAir clearly fills a need that should be addressed. At least the project provides a platform for learning how to deal with servomotors, DC motors, solenoids, and RFID readers.

The MouseAir system is built around a Raspberry Pi controlling all the associated devices (see the "Parts List" box) and connected to an iPad-based control panel using RasPiConnect over a WiFi connection. I use a Pi Camera to capture cat events, examine the launching mechanism for jams, detect motion, and even stream video. The actuators are a major part of MouseAir and, in some ways, the hardest part of the project to implement, because they have to be mounted properly.

Parts List

  • Raspberry Pi Model B
  • Pi Camera
  • Flex cable for Raspberry Pi Camera – 300mm/12 inches
  • RS002B mini pan-and-tilt kit
  • Adafruit bicolor LED square pixel matrix with I2C backpack
  • MaxBotix LV-EZ1 ultrasonic rangefinder
  • PN532 NFC/RFID controller breakout board
  • SparkFun RFID button – 16mm (125kHz)
  • SparkFun RFID USB reader x1
  • SparkFun RFID reader ID-3LA (125kHz) x1
  • SainSmart 2-channel 5V relay module x2
  • Adafruit 16-channel 12-bit PWM/servo driver – I2C interface
  • Handmade 125kHz RFID antenna
  • PVC pipe
  • Hitech HS300 servomotors x2
  • BaneBots RS-555 12V 7,750rpm brushed DC motor
  • BaneBots hub, hex, series 40, set screw, 4mm bore, 1 wide, 1/2-inch hex
  • BaneBots wheel, 1-7/8x0.4-inch, 1/2-inch hex mount, 50A, black/blue
  • Absolute DLA110 universal power door lock two-wire actuator kit
  • 5V power supplies x2 (one for the Pi, one for servomotors)
  • 12V 2A power supply for DC motors
  • Toy mice

Ultrasonic Sensors

The ultrasonic device is a MaxBotix LV-EZ1 ultrasonic rangefinder, which communicates with the Raspberry Pi over a serial interface. I used a SparkFun RaspiRobot board basically as a physical stand for the ultrasonic device. I had planned to use the board to control the DC motors, but its onboard supply ran far too hot, and powering both the DC motors and the Raspberry Pi from the RaspiRobot board caused the Raspberry Pi to reboot whenever the motors turned on. I finally gave up on that option and used relays and an external 12V supply instead.
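
The relay wiring isn't detailed here, but as a rough sketch, switching the 12V motor supply through one channel of the SainSmart relay board could look like the following. The pin number is a hypothetical choice (BCM pin 22), and it assumes the relay input is active low:

import time
import RPi.GPIO as GPIO

MOTOR_RELAY_PIN = 22    # hypothetical BCM pin wired to the relay input

GPIO.setmode(GPIO.BCM)
GPIO.setup(MOTOR_RELAY_PIN, GPIO.OUT, initial=GPIO.HIGH)   # relay off (active low)

def spin_motors(seconds):
    """Pulse the 12V DC launch motors through the relay for a short burst."""
    GPIO.output(MOTOR_RELAY_PIN, GPIO.LOW)     # energize relay, motors on
    time.sleep(seconds)
    GPIO.output(MOTOR_RELAY_PIN, GPIO.HIGH)    # motors off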

I had to use a transistor to invert the data coming in on the TTL port. Newer devices from MaxBotix have an option to invert the TTL serial signal without the use of an external device.
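
The article doesn't show the sensor-reading code itself; the following is a minimal sketch, assuming the (now non-inverted) TTL output is wired to the Pi's serial port at /dev/ttyAMA0 and the LV-EZ1 is sending its standard 9600-baud ASCII frames, an "R" followed by the range in inches:

import serial

# Open the Pi's on-board UART; the LV-EZ1 talks at 9600 baud, 8N1.
ser = serial.Serial("/dev/ttyAMA0", baudrate=9600, timeout=1)

def read_range_cm():
    """Return one range reading in centimeters, or None on a bad frame."""
    frame = ser.read_until(b"\r")              # frames look like b"R023\r"
    if frame.startswith(b"R") and frame[1:4].isdigit():
        return int(frame[1:4]) * 2.54          # sensor reports inches
    return None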

The LV-EZ1 reads from about 15cm to about 2m with very good accuracy, but the signal does bounce around, depending on what it is pointed at. I fixed this by using only readings of less than 18cm to indicate the cat. Although the mouse-launching trigger can be turned on and off, the ultrasonic device is always running in MouseAir. I take this ongoing stream of readings and send it to a live graph running on the iPad under RasPiConnect.
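
The detection logic then reduces to a threshold test on those readings. Here is a sketch of the polling loop, assuming the read_range_cm() helper above and a hypothetical launch_mouse() routine standing in for the actual launch sequence:

import time

CAT_THRESHOLD_CM = 18        # readings closer than this are treated as the cat

def watch_for_cat():
    while True:
        distance = read_range_cm()
        if distance is not None and distance < CAT_THRESHOLD_CM:
            launch_mouse()   # hypothetical stand-in for the launch sequence
            time.sleep(30)   # give the cat time to play before the next mouse
        time.sleep(0.1)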

Camera

I used a standard off-the-shelf Pi Camera designed to plug directly into the Raspberry Pi. To activate the camera and then add a time stamp, I use Python subprocess commands. The camera is activated by the start of a mouse launch sequence or by a command from RasPiConnect. Because of the time it takes to acquire and modify a picture, this code runs in a thread while the main loop continues (Listing 1).

Listing 1

Take and Tag a Picture

import subprocess

# take picture
cameracommand = "raspistill -o /home/pi/RasPiConnectServer/static/picameraraw.jpg -t 750 -ex " + "auto"
output = subprocess.check_output(cameracommand, shell=True, stderr=subprocess.STDOUT)

# add time-stamp tag to picture
output = subprocess.check_output("convert '/home/pi/RasPiConnectServer/static/picameraraw.jpg' -pointsize 72 \
  -fill white -gravity SouthWest -annotate +50+100 'Mouse Air %[exif:DateTimeOriginal]' \
  '/home/pi/RasPiConnectServer/static/picamera.jpg'", shell=True, stderr=subprocess.STDOUT)
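
Listing 1 itself runs synchronously; as described above, MouseAir pushes the capture into a thread so the main loop can keep running. A minimal sketch of that wrapper, assuming the listing's commands are collected into a take_picture() function:

import subprocess
import threading

def take_picture():
    # The raspistill and convert commands from Listing 1 go here.
    subprocess.check_output(
        "raspistill -o /home/pi/RasPiConnectServer/static/picameraraw.jpg -t 750 -ex auto",
        shell=True, stderr=subprocess.STDOUT)

def take_picture_async():
    # Run the capture in a daemon thread so the main loop keeps servicing
    # the ultrasonic sensor and RasPiConnect while the camera works.
    worker = threading.Thread(target=take_picture)
    worker.daemon = True
    worker.start()
    return worker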

In the future, I will add streaming video to the device as a pure software solution that won't require any new hardware.
