Digression #2: PiMony, a smart remote prototype

And so I digress again…

Presenting PiMony, a smart remote prototype built on a Raspberry Pi!


As prototypes go, this is very… prototypey! It is quite large and kludgy (e.g. trailing wires). The goals at the moment are to learn more about the hardware side, and try out ideas for both hardware and software. Then work on the size, look and feel!


For an idea of scale, here’s the remote next to a Raspberry Pi Model B. In total, it’s about three-and-a-bit times the length of a naked Pi.

My motivation for making this may seem familiar if you’ve read my earlier post about Squeezeboxes…

I’ve been the owner of a Harmony 655 smart remote for quite a few years, and it has done sterling service controlling our entertainment gear. However it is now ageing, and the number of buttons that are intermittent in operation is slowly growing. I researched newer models… and was rather disappointed. The less pricey of the new models only control a limited number of devices (my 655 could handle loads more), and the higher-end remotes have bells and whistles that draw more power and increase fragility (judging by customer reviews). And Logitech bought the original manufacturer a while back…

So I decided to go DIY – that way I would only be limited by my imagination and skills, would certainly learn a few things and have a device I could easily repair myself if it got damaged.


So what’s in the PiMony? I won’t go into much detail here, but these are the main component parts with embedded links to information and resources that helped me with this project.

Raspberry Pi Model B+ and PiTFT


Custom circuitry

Custom matrix keypad (left), GPIO extender (centre) and load sharing circuit (right)

Power supply

LiPo battery (lower left) and PowerBoost 500C (lower right) supported by PiBow extended baseplate.

Supporting Information

There’s a good deal of information on infra-red remote control hardware and protocols about. Here are some of the most useful links I found:


To get things going swiftly, I re-purposed some early Python code from the SqueezePi project that used Pygame to handle rendering on the PiTFT and input from its touch screen. I rounded this out to present a basic data-driven user interface, and added GPIO button polling and IR sending via LIRC – more details on this below.
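As a sketch of what ‘data-driven’ means here, a button layout can be described as plain data that both the renderer and the touch handler consult. The names and layout below are illustrative, not the actual PiMony structures:

```python
# Hypothetical data-driven layout: each button maps a screen rectangle
# to the LIRC remote/key pair it should send. Illustrative names only -
# the real structures are in the Github repository.
BUTTONS = [
    {"label": "Power", "rect": (10, 10, 70, 40),  "ir": ("sony_tv", "KEY_POWER")},
    {"label": "Vol+",  "rect": (10, 60, 70, 40),  "ir": ("sony_tv", "KEY_VOLUMEUP")},
    {"label": "Vol-",  "rect": (10, 110, 70, 40), "ir": ("sony_tv", "KEY_VOLUMEDOWN")},
]

def hit_test(buttons, x, y):
    """Return the button containing touch point (x, y), or None."""
    for b in buttons:
        bx, by, bw, bh = b["rect"]
        if bx <= x < bx + bw and by <= y < by + bh:
            return b
    return None
```

With this shape, the renderer just iterates `BUTTONS` to draw, and the touch handler calls `hit_test` to decide which IR codes to send.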

PiMony User Interface on touch screen

Here’s a shot of it running.

The Python code is dependent on Pygame, smbus and RPIO and runs on Raspbian configured with support for PiTFT, LIRC and I2C.

You can find the PiMony code in this Github repository.

Phase 1: IR Basics


I started out creating a basic IR receiver and sender circuit on a Raspberry Pi with LIRC installed. This comes from some very helpful posts on alexba.in’s blog. I used a TSOP39238 IR receiver and TSUS5400 940nm IR LEDs – both purchased from Bitsbox (in the UK).

I proceeded to try capturing the IR signals from a couple of remotes (Sony TV, Philips DVD player). Initially, I found recording the IR signals via LIRC to be somewhat hit-and-miss – some codes got recorded while others didn’t. Then I found certain sequences of button presses on my Sony TV remote that reproduced the problem every time. Some research into Sony TV IR codes brought to light that Sony remotes can output 12-bit, 15-bit or 20-bit codes. So I tried some ‘raw’ recording of the button output and found that most of my remote’s buttons generate 12-bit codes, but some generate 15-bit codes.

By trial and error, I was able to build a list of buttons for each code length and then captured each group independently.

Once captured, the IR codes played back perfectly through my IR sender circuit. The only downside I found was that my IR LEDs have quite a narrow output angle, so the remote needs to be pointed more accurately at the target device to work.

With the basic record and send capabilities operational, I spent a couple of hours capturing all the IR signals of all the remotes used by our household entertainment devices, building a LIRC database of remotes and their codes.

Phase 2: Python Prototype

With that all running, I moved on to writing the Python code. As mentioned above, I reused some early code from the SqueezePi project to get rendering and touch screen input quickly up and running, then went on to create a simple touch-screen UI that triggered the sending of IR codes. My Github repository contains the code if you want to refer to it.

At the start, I sent the IR codes by spawning LIRC’s irsend application with a suitable set of command line parameters. This quickly proved problematic; spawning a new process was too slow for a responsive user experience, and LIRC didn’t cope well with more than one irsend process being active (which was needed for buttons that send more than one IR command). I replaced this with code that communicates directly with LIRC via its socket interface; this proved faster and reliable for sending multiple commands.
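A minimal sketch of the socket approach – lircd listens on a Unix domain socket (commonly /var/run/lirc/lircd, though the path can vary by distribution) and accepts plain-text commands like SEND_ONCE; the reply parsing here is simplified:

```python
import socket

LIRCD_SOCKET = "/var/run/lirc/lircd"  # default path; may differ per distro

def build_command(remote, key, count=1):
    # SEND_ONCE's optional trailing count asks lircd to repeat the code.
    return "SEND_ONCE %s %s %d\n" % (remote, key, count)

def send_ir(remote, key, count=1, socket_path=LIRCD_SOCKET):
    """Send an IR code by talking to lircd over its Unix socket.

    Much faster than spawning irsend, and safe to call repeatedly for
    buttons that fire several commands in a row.
    """
    s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        s.connect(socket_path)
        s.sendall(build_command(remote, key, count).encode("ascii"))
        # lircd replies with a BEGIN ... SUCCESS/ERROR ... END block;
        # a real client should read until END rather than one recv().
        reply = s.recv(4096).decode("ascii")
        return "SUCCESS" in reply
    finally:
        s.close()
```

Because the connection stays in-process, consecutive sends for a multi-command button don’t race each other the way multiple irsend processes did.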

The Touch Screen Challenge

I also found that the touch screen was flickering at times, and touches were ‘bouncing’ – i.e. one touch on a button would cause that button’s IR commands to be sent multiple times. Debugging showed that phantom ‘touch release’ events were being generated, immediately followed by detection of a new touch even though the user had not stopped touching the button. This caused the rapid changes to button visuals (the flickering) and the unwanted command repeats.

To get to the bottom of it, I created a standalone C app to see if the problem was repeatable outside of Pygame. I first used SDL (on which Pygame is based) to read from the touch screen – and this showed exactly the same issue with phantom release events. I then replaced SDL with direct calls to the underlying tslib touch screen library; and this too showed the same problem.

I then tracked down the source code for the PiTFT touch screen driver and began my study. It has a timeout mechanism implemented in one of its functions (stmpe_work) that generates a release event if there has not been an interrupt from the touch screen for around 20 milliseconds. So if something stalls the CPU in the right way for long enough, this will cause a phantom touch release event to occur.

As IR sending coincides with the phantom release events, I dived into LIRC and the lirc_rpi kernel module source. The Pi module source is available as a very readable patch. Lo and behold – the IR sending function lirc_write uses a kernel spinlock which disables interrupts while the IR pulse-generating loop is running (so that the software timing correctly reproduces the output signal’s frequency). Basic math shows that a Sony 12-bit IR code takes between 17ms (all zero bits) and 24ms (all one bits) to send. Given that this timing is close to the touch screen driver’s ‘timeout’ timing, and that LIRC repeats each command twice for robustness, it seems highly likely that having interrupts disabled for this long is causing the touch screen driver to generate phantom releases.
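That frame-time arithmetic can be checked against the standard SIRC timings (2.4ms header burst plus 0.6ms gap; 1.2ms/0.6ms bit marks, each followed by a 0.6ms space; the trailing idle gap not counted):

```python
# Standard Sony SIRC timings, in milliseconds.
HEADER_MS = 2.4 + 0.6          # start burst plus its gap
BIT0_MS = 0.6 + 0.6            # 0.6ms mark + 0.6ms space
BIT1_MS = 1.2 + 0.6            # 1.2ms mark + 0.6ms space
TRAILING_SPACE_MS = 0.6        # the final space is just idle line - not counted

def sirc_frame_ms(bits, ones):
    """Duration of one SIRC frame, excluding the final idle gap."""
    zeros = bits - ones
    return HEADER_MS + ones * BIT1_MS + zeros * BIT0_MS - TRAILING_SPACE_MS

fastest = sirc_frame_ms(12, 0)   # all-zero 12-bit code: ~16.8 ms
slowest = sirc_frame_ms(12, 12)  # all-one 12-bit code:  ~24.0 ms
```

Both ends of that range sit right around the touch driver’s ~20ms timeout, which is what makes the phantom-release theory so plausible.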

For the time being, I dealt with this by revising the Python code to work around it – if a touch screen button press is detected, any further events are flushed and touch screen events are disabled until all IR codes are sent (plus a short delay). At the cost of some responsiveness, this avoids the flickering and unwanted repeats.
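The workaround boils down to a gate around IR transmission. This sketch keeps the logic hardware-free by injecting the block/flush/unblock actions; in the real UI these would be wired to Pygame’s event blocking and clearing functions (an assumption about the wiring – the actual code is in the repository):

```python
import time

def send_codes_debounced(codes, send_fn, block_touch, unblock_touch,
                         flush_touch, settle_s=0.1):
    """Send IR codes while suppressing phantom touch events.

    Touch input is disabled for the duration of the transmission, then a
    short grace period (settle_s is an assumed value) lets any phantom
    events arrive before they are flushed and input is re-enabled.
    """
    block_touch()
    try:
        for remote, key in codes:
            send_fn(remote, key)
        time.sleep(settle_s)
        flush_touch()   # discard any phantom release/press pairs
    finally:
        unblock_touch()
```

The cost is exactly the responsiveness hit described above: the screen ignores the user for the length of the IR burst plus the grace period.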

Improving IR Sending

The timing of the IR signal is critical. A 38kHz square wave is used, meaning the driver has to toggle the GPIO pin state every 13 microseconds. The pulses of the square wave also have to match timings very accurately – e.g. durations of 2.4, 1.2 and 0.6 milliseconds for Sony codes. Driving this in software on a Pi running Linux with interrupts enabled is very likely to make the timing too variable to work.
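Some quick numbers behind those claims – the 13µs figure is the carrier half-period, and each millisecond-scale pulse is built from dozens of those toggles:

```python
CARRIER_HZ = 38_000
half_period_us = 1e6 / (2 * CARRIER_HZ)   # ~13.2 us between GPIO toggles

# A 2.4ms Sony header burst is ~91 carrier cycles, i.e. ~182 toggles,
# every one of which must land within a few microseconds for the
# receiver to see a clean signal.
header_cycles = round(0.0024 * CARRIER_HZ)
```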

This could be tackled in the LIRC driver a number of ways:

  • Use less CPU-intensive 38kHz square wave generation.
  • Use an interrupt to accurately control the timing of the IR square wave pulses, turning the IR output on or off as required.

In all cases, this requires kernel module coding – not for the faint-hearted!

There are also some external hardware solutions:

  • For generating just the square wave – e.g. a PWM breakout board, or a 555 timer IC.
  • For generating the square wave output and its pulses – e.g. a small microcontroller with dedicated code for driving the IR sender circuit.

I successfully breadboarded a circuit using a 555 timer, and was able to drive this from a Pi GPIO pin and control the TV. However it required a fair few supporting components and some trial-and-error to get the timing accurate.
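For reference, the standard 555 astable frequency formula shows the kind of component values involved – the values below are illustrative, not the ones from my breadboard:

```python
def astable_freq_hz(r1_ohms, r2_ohms, c_farads):
    """Standard 555 astable frequency: f = 1.44 / ((R1 + 2*R2) * C)."""
    return 1.44 / ((r1_ohms + 2 * r2_ohms) * c_farads)

# Example component values (assumed, for illustration):
# R1 = 1k, R2 = 18.2k, C = 1nF gives roughly 38.5 kHz - and component
# tolerances explain why trial-and-error trimming was needed in practice.
f = astable_freq_hz(1_000, 18_200, 1e-9)
```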

For this project, I am going to follow the hardware route, building an LPC810 micro-controller based ‘IR blaster’ device, controlled via an I2C interface. This completely decouples the IR sending, so that the UI can remain responsive – and will be a fun further learning opportunity!



Phase 3: Physical Buttons and Keypad

I wanted some physical buttons on the prototype – partly because it will be a nice user experience to combine touch screen and tactile buttons, and partly to learn more hardware stuff!

As the PiTFT has tactile button wiring (and buttons were already soldered on from a past project), I extended the Python code using RPIO to poll the pins on those buttons and send IR commands when presses were detected. To keep the wiring simpler, I set the GPIO pins to be pulled up to high internally on the Pi, and the buttons were connected to ground – so when a button is pressed, the GPIO pin goes from high to low. This was quick to set up and worked well. The only downside is that the code has to be run as root because of RPIO.
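In outline, the polling looks like this. The edge detection is factored out so it can run anywhere, while poll_loop (with the RPIO calls) only makes sense on the Pi; the poll rate is an assumed value:

```python
import time

def newly_pressed(prev_levels, curr_levels):
    """Pins that went high -> low since the last poll (active-low buttons)."""
    return [pin for pin in curr_levels
            if prev_levels.get(pin, 1) == 1 and curr_levels[pin] == 0]

def poll_loop(pins, on_press):
    """Poll PiTFT buttons forever - run this on the Pi itself."""
    import RPIO  # requires root, as noted above
    for pin in pins:
        # Internal pull-up, button wired to ground: idle = 1, pressed = 0.
        RPIO.setup(pin, RPIO.IN, pull_up_down=RPIO.PUD_UP)
    prev = {pin: 1 for pin in pins}
    while True:
        curr = {pin: RPIO.input(pin) for pin in pins}
        for pin in newly_pressed(prev, curr):
            on_press(pin)   # e.g. look up and send the pin's IR commands
        prev = curr
        time.sleep(0.02)    # ~50Hz poll rate (assumed value)
```

Note that a held button only fires once, because only the high-to-low edge is reported.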

A remote needs a numeric keypad. That’s a whole ten GPIO pins if one is used per digit! However, the buttons can be wired in a three-column, four-row matrix that reduces the count to seven GPIO pins – one per row and column, with column pins as outputs and row pins as inputs. The matrix is read by driving each column pin in turn and reading the row pins at each step – a set row pin means the button at that row in the driven column is pressed.

Seven pins is still too many for the original Raspberry Pi model on which I started this project – and even using a B+ would mean a lot of extra wiring to connect it together. So I opted to use a GPIO extender chip connected to the Pi’s I2C bus – that meant only four wires from the Pi (two for I2C, one for power and one ground). Plus if I decide not to use a Pi in the future, I can easily reuse the keypad with other small CPUs that support I2C.

As with the buttons on the PiTFT, I set the row input pins to be internally pulled high – so the polling signal needs to be inverted (drive each column low in turn; any low row pin means a button is pressed). In addition, each button is wired through a diode (oriented to suit the active-low signalling). These diodes prevent the phantom button presses that would otherwise occur when certain combinations of buttons are pressed at the same time.
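The resulting scan logic is simple. In this sketch the column-drive and row-read steps are injected functions, standing in for the smbus traffic to the GPIO extender:

```python
ROWS, COLS = 4, 3  # 12-key numeric pad in a 4x3 matrix

def scan_keypad(set_column_low, read_rows):
    """Return (row, col) pairs for every pressed key.

    set_column_low(col) drives one column low (the rest high);
    read_rows() returns the row levels, where 0 = pressed. On the
    device, both would be register reads/writes over I2C via smbus.
    """
    pressed = []
    for col in range(COLS):
        set_column_low(col)
        levels = read_rows()
        for row, level in enumerate(levels):
            if level == 0:
                pressed.append((row, col))
    return pressed
```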

This was all pretty straightforward to wire up and implement in Python – the existing smbus library makes I2C a doddle. The only awkward part was that my original IR sender board connector masked the I2C pins on the Pi, so I had to construct a further breakout board to sit between the connector on the PiTFT and the original sender board.

Phase 4: Putting It Together


The next main phase of work was to get all these pieces to work together in a form that could actually be used as a remote. This would require a case for the Pi and PiTFT which would also allow me to attach the keypad and IR sending circuits.

I had previously bought a Pibow case for a Model B. These cases are made from horizontal layers of laser cut acrylic screwed together. I realised that I could create an extended replacement base layer to which my other parts could be fitted.

To test this out, I tried using this case with the screen and IR sender circuit – but it wouldn’t fit together without adding a very tall spacer between the screen and the Pi. That set-up wasn’t stable enough for the screen to be touched, and proved too bulky to be practical. So instead I switched to a Model B+ Pi in a Pibow Coupe case. This allowed all the parts to fit and, though quite hefty, was not too bulky for use.

Switching to the B+ highlighted some issues with RPIO. I found that the mapping of GPIO identifiers to expansion port pins wasn’t working right; it turned out that some of the functions used the wrong identifier (pin vs GPIO or vice versa) and one of the internal pin numbers was incorrect for revision 2 Pis (and the B+). I fixed these in the dc-dev branch of the RPIO fork in my Github.

Next up was a power supply! Adafruit provide some straightforward LiPo battery driven power boosters/chargers; I tracked down a supplier and purchased a PowerBoost 500C and a 2500mAh LiPo battery. It charged up fine (though that took a good 7+ hours) and ran the PiMony continuously for about 6 hours in testing – which should be fine for further development purposes. The only problem was that though power could be supplied to the PiMony while charging, it wasn’t enough for it to run properly.
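Rough arithmetic from those figures – the 500mA charge rate is an assumption about the PowerBoost 500C, and charge-current taper towards full would account for the observed time beyond the 5-hour floor:

```python
battery_mah = 2500
runtime_h = 6.0
avg_draw_ma = battery_mah / runtime_h      # ~417 mA average load while running

charge_ma = 500                            # assumed charge rate for the 500C
min_charge_h = battery_mah / charge_ma     # 5 h minimum at constant current
```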

The solution was to create a load sharing circuit that would switch off the PowerBoost output when a USB power supply was connected and instead allow it to supply current direct to the PiMony as well as the battery charger. This wasn’t hard to build, but the only components I could obtain were rather over-specified for the task and consequently added some unwanted bulk.

Creating the extended base plate meant measuring up the existing base plate carefully with digital calipers, and reproducing it as a vector image in Inkscape. Extending the base to fit my custom circuit and power supply was straightforward, with careful measurements taken to position screw holes for the supports. I also allowed some extra space between the edge of the Pi and the top of the keypad so I could get access to the network and USB ports on the Pi. I sent this file off to Razorlab for some trial cuts in cardboard; after testing and tweaking those, they made a final acrylic cut for me. It all fitted together beautifully.

So now I have a usable if somewhat bulky prototype that works fine at controlling my TV and surround sound system!

What’s Next?

There’s plenty in the pipeline for this project!

The software side needs further development to make the remote actually ‘smart’ – so it can handle different combinations of devices for different activities (e.g. just watching TV as opposed to watching a movie on Blu-ray), and track which devices are on at any given time. The remote configuration is currently ‘baked in’ to the Python code as data; it should be moved into data files, with a GUI tool created to allow easy editing.
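As an illustration only (not the project’s actual format), such a data file might look like this, with each activity naming the devices to power up and which device handles which class of button:

```python
import json

# Hypothetical activity configuration - device and field names are
# made up for illustration.
CONFIG_JSON = """
{
  "activities": {
    "watch_tv": {
      "devices_on": ["sony_tv", "surround_amp"],
      "volume_device": "surround_amp",
      "channel_device": "sony_tv"
    },
    "watch_bluray": {
      "devices_on": ["sony_tv", "surround_amp", "bd_player"],
      "volume_device": "surround_amp",
      "transport_device": "bd_player"
    }
  }
}
"""

config = json.loads(CONFIG_JSON)
tv_activity = config["activities"]["watch_tv"]
```

With the configuration externalised like this, a GUI editor only has to read and write data files rather than patch Python source.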

The IR sending circuitry should be rebuilt using a microcontroller as mentioned above, to avoid the interference between the touch screen and the current GPIO IR driver.


Freescale FRDM-KL26Z board

The next iteration of the hardware should be smaller and sleeker, and run for longer on a single battery charge. I’m thinking of replacing the Pi with an ARM microcontroller board like the FRDM-KL26Z. It is much lower profile than the Pi and less power hungry but should be able to handle the required behaviour. It may even be able to do IR sending, or that could still be done by an external device.


FRDM-KL26Z alongside a Model B+ Pi

The software would need to be reimplemented in C/C++, and the big advantage would be complete control – no Linux overhead!

I could also reuse the keypad matrix as I2C is available on this type of device, and the battery based power supply and load sharing could also be reused.

Speaking of load sharing… that part of the circuit should also be improved by finding a smaller solution!

And some sort of low battery detection and indicator would be a good idea! There’s an unconnected white wire you may have spotted – this is the low battery output from the PowerBoost board, which can be directly connected to a GPIO and read in software. Unfortunately when connected it turns on the low battery LED on the PowerBoost regardless, which isn’t helpful!

(c) 2015 Nicholas Tuckett.