Introduction to the project
This project is about upgrading my LG Hom-Bot vacuum robot, nicknamed ‘Edgar’, with a Raspberry Pi brain. This mod allows for some cool and unique features that you won’t find in any household vacuum robot.
For this project I didn’t want to go hacking, slashing and blasting through a perfectly fine vacuum robot, potentially destroying Edgar in the process. I figured I could use the infrared remote to control the robot: by sticking a Raspberry Pi equipped with an IR transmitter onto the vacuum robot, I wouldn’t need to open the robot and void my warranty. This ‘soft upgrade’ reminds me of the headcrabs seen in the Half-Life game series: the intelligent Raspberry Pi latches onto the head of the vacuum robot, telling zombie Edgar what to do. A project was born.
Here are the project requirements:
- Edgar can be controlled from anywhere in the world.
- Edgar can stream a camera feed so the user is able to navigate the room (and check up on my cats).
- Take pictures of my cats.
- Work as a surveillance camera, motion triggered recording.
- Turn on the airco (which is also IR controlled :D).
- All of Edgar’s functions, like starting a cleaning cycle and returning to the home station, should be controllable through an interface of some sort; this allows for remotely activated cleaning.
- After this project Edgar still needs to be able to do his job: cleaning. This means that the Raspberry Pi and all of its electronics (a.k.a. the headcrab) should be easy to remove.
Setting up the Raspberry Pi
The Raspberry Pi is completely new to me, so the first thing on my list was to get to know the Pi. I ordered the Raspberry Pi User Guide from Bookdepository.com and a Raspberry Pi learning kit from Banggood.com. While I was waiting for these items to arrive I did a basic setup of my Raspberry Pi following these tutorials from Element14. This process went well, apart from the dreadfully long time it took to write the Raspbian OS to the micro-SD card (around 30 minutes).
The video / surveillance stuff
The book and learning kit were still on their way to me, so I decided to investigate the topic of streaming video and motion detection a bit further. I found this wonderful tutorial from Christoph Buenger about installing and setting up a program called Motion. I didn’t follow the steps as strictly as I should have, so I ran into some problems, which I was able to fix. A great way of learning how to use a Raspberry Pi, though! Motion is an awesomely feature-rich program which, I think, is suitable for this project. Next up: having the Pi send and receive IR signals to be able to control mindless Edgar.
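For reference, these are the kinds of settings I ended up tweaking in /etc/motion/motion.conf. Take the values below as an illustration rather than my exact config; option names also vary a bit between Motion versions:

```
# /etc/motion/motion.conf (excerpt) -- example values, not a drop-in config
daemon on            # run Motion in the background
width 640            # capture resolution
height 480
framerate 10         # keep the frame rate low on the Pi
threshold 1500       # number of changed pixels that counts as 'motion'
stream_port 8081     # live stream (called webcam_port in older Motion versions)
target_dir /home/pi/motion-captures   # where snapshots and recordings end up
```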
|-> To do: accessing the webcam through the internet: http://www.cnet.com/how-to/how-to-set-up-a-cheap-home-security-system-using-dynamic-dns/
Controlling the robot through infrared…
Yay! The book and learning kit from Banggood arrived! Tonight I quickly followed the tutorial from Alex and got LIRC up and running in no time. I still need to do the ‘sending’ infrared part of the tutorial, but I need to order some parts for that.
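For anyone wondering what the ‘recording’ part produces: irrecord writes a lircd.conf file describing the remote. The snippet below only shows the general shape of such a file; the remote name and the hex codes are made-up placeholders, not the actual Hom-Bot codes:

```
# lircd.conf (illustrative layout only -- name and codes are placeholders)
begin remote
  name   EDGAR            # hypothetical name; irrecord asks you to pick one
  bits   16
  ...                     # timing parameters are filled in by irrecord
  begin codes
      KEY_POWER  0x48B7   # placeholder code, not a real Hom-Bot code
      KEY_HOME   0x58A7   # placeholder
  end codes
end remote
```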
A few moments later…
Ok, I didn’t want to wait for a simple transistor to test the IR sending, so I just used the IR LED with a resistor, and it works! With a limited range of course, but since the Raspberry Pi will be sitting on top of Edgar, this doesn’t matter.
Here is another interesting tutorial on how to use LIRC from a simple Python program (I haven’t had the chance to test this out yet).
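Until I properly test the Python route, my plan is to simply shell out to LIRC’s irsend command from Python. A minimal sketch of that idea (the remote name ‘EDGAR’ and key name are assumptions, they depend on what you called things during irrecord):

```python
import subprocess

def irsend_args(remote, key):
    """Build the irsend command line for a single SEND_ONCE."""
    return ["irsend", "SEND_ONCE", remote, key]

def send_ir(remote, key, dry_run=False):
    """Send one IR command via LIRC; with dry_run=True just show the command."""
    args = irsend_args(remote, key)
    if dry_run:
        print(" ".join(args))
        return
    # Needs lircd running and the IR LED wired up:
    subprocess.run(args, check=True)

# 'EDGAR' and 'KEY_POWER' are hypothetical names from my (future) lircd.conf
send_ir("EDGAR", "KEY_POWER", dry_run=True)
```

Splitting the command-building from the actual call makes it easy to test the logic on a machine that doesn’t have LIRC installed.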
Next up: controlling the IR commands through the interwebs. Alex also happens to have written a nice tutorial on this! 😀
Unfortunately I couldn’t get Node.js to compile on my Raspberry Pi for some reason; I got this after two hours of compiling:
deps/v8/tools/gyp/v8_snapshot.target.mk:13: recipe for target '/home/pi/nodejs/node-v0.12.7/out/Release/obj.target/v8_snapshot/geni/snapshot.cc' failed
make: *** [/home/pi/nodejs/node-v0.12.7/out/Release/obj.target/v8_snapshot/geni/snapshot.cc] Error 132
make: Leaving directory '/home/pi/nodejs/node-v0.12.7/out'
Makefile:45: recipe for target 'node' failed
make: *** [node] Error 2
A few moments later…
pi@raspberrypi ~ $ node -v
Next: installing lirc_web, an application written by Alex Bain to control LIRC from the interwebs using Node.js! Installing the software took quite a while, but it works like a charm!
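The nice thing about lirc_web is that, besides the web interface, it exposes a simple HTTP API: if I read the README right, a POST to /remotes/&lt;remote&gt;/&lt;command&gt; fires the command, and it listens on port 3000 by default. That means any script on the network can poke Edgar. A little Python sketch (host, remote and key names are placeholders for my own setup):

```python
from urllib import request

def command_url(host, remote, key, port=3000):
    """Build the lirc_web URL for one command (port 3000 is, I believe, the default)."""
    return "http://%s:%d/remotes/%s/%s" % (host, port, remote, key)

def press(host, remote, key):
    """Fire one IR command through lirc_web's HTTP API (POST with an empty body)."""
    req = request.Request(command_url(host, remote, key), data=b"", method="POST")
    with request.urlopen(req) as resp:
        return resp.status

# 'EDGAR' and 'KEY_CLEAN' are placeholders for names from my own lircd.conf
print(command_url("raspberrypi.local", "EDGAR", "KEY_CLEAN"))
```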
Constructing the Raspberry Pi headcrab
The goal for the construction of the Raspberry Pi headcrab is a bit strange compared with the actual headcrab in the game Half-Life: this headcrab shouldn’t damage or kill its host, so bolting the thing onto the Hom-Bot with screws won’t work for this project. I started researching ways to attach the headcrab to the Hom-Bot and came up with a solution that involved some kind of arms reaching around Edgar’s body, but figured that Edgar’s sensors might get blocked.
While cleaning out a drawer I came across those suction cups used to attach things to a window. Aha! Suction cups it is!
Next, I used masking tape to mark out a square with known dimensions, loaded a photo of it into Photoshop to correct the perspective and lens distortion, and traced over the reference image in Illustrator. I then exported it to .dxf for use in SolidWorks to make all the technical drawings and files.
This is a list of all the stuff that needs to be mounted for the headcrab:
- raspberry pi
- battery pack
- pcb with IR components and other stuff
- servo for the tilt of the picam
These are all the extras I’m thinking of using:
- temperature sensor
- PIR (motion) sensor
- battery charge circuit
- sound module / speaker
Still To Do:
- design the headcrab in SolidWorks
- set up my router so that I can access the local network from anywhere in the world
- figure out how to combine the video stuff (Motion) with the IR sending stuff (LIRC / Node.js)
New updates soon!