Replacing the PiMony IR sender

It has been way too long since my last post. Not that nothing has been going on… in fact a very great deal has occurred in the intervening time! This and subsequent posts should fill that gap, in roughly chronological order.

So, first things first. I’ve been focussing my energies on projects other than Pithesiser – namely the PiMony smart remote project, and a related parallel side-project for a ‘smart’ kitchen timer. Both projects are learning devices for me – experiences and discoveries on one cross-fertilise the other.

Microcontroller-based IR sender

If you recall my previous post, I expounded at some length about problems with the touch screen response ‘freezing’ while the PiMony device blasted out IR. Towards the end of that post, I mentioned using a microcontroller-based sender circuit to solve this. Well, that is what I have now created – and it works a treat!

My requirements for this circuit were simple, based on the experience with PiMony:

  • Use a simple protocol over I2C to set up pulse timings, repeat count and repeat delay, trigger IR blasting and send back a ‘busy’ flag to indicate activity.
  • Consume as little power as possible – power down under internal timer control, and wake up when sent data over I2C.
  • Use a 5V power supply – direct for LEDs, and potentially via a voltage regulator for the microcontroller if required.

The ultimate goal is to create a device to which the Pi can delegate the low-level aspects of IR sending, and carry on with its own processing (e.g. refreshing UI) while the sending is performed.

Choosing components

The fundamental IR sender circuit of LEDs, resistors and transistor is the same as for the full Pi-based version.

The new stuff that needs sorting out is everything microcontroller-related, from the controller IC itself to any ancillary components needed to power it and interface it with the LEDs and Raspberry Pi.

This was an area relatively new to me – I’ve used many different ‘full’ CPU computers over the years, but not tried any microcontroller devices such as Arduino.

Microcontroller central

I wanted a microcontroller that meets these requirements:

  • Capable of very low power consumption.
  • Easy to use on a breadboard for quick prototyping.
  • Supports I2C for interfacing with the Pi.
  • Supports GPIO for controlling the IR LEDs.
  • Has good free tools for coding, compiling and debugging.
  • Has plenty of good documentation and sample code.
  • Affordable and easily available.
  • Based on an architecture with which I am familiar.

Taking the last requirement first – that would be ARM for me. And to satisfy the low power consumption, an MCU from the Cortex M0 or M0+ family would suit. That makes for quite a big field, with chips from TI, ST Microelectronics, Freescale, NXP, Atmel, Cypress… how to choose between them?

Support for I2C and GPIO is very common, so that isn’t a strong distinguishing factor. Free coding tools, documentation and examples – I found lots on NXP, Freescale and ST for starters. Easy for breadboarding… hmm, that’s where it gets more interesting. Either the chips have to be available in chunky DIP packaging, or as a breakout board designed for breadboarding. Not quite so many of those around!

After very little Googling, I hit on the LPC810 from NXP. This comes in a small 8-pin DIP package, so can very easily be used on a breadboard. It also fits all the other requirements, including affordability – you can buy them from Farnell here in the UK for under £2 each.

The LPC810 is a 3.3V part, so some extra bits are needed to regulate the 5V supply – I settled on a 3.3V MCP1700, which requires a couple of 1uF capacitors to support it.

Breadboard version

The schematic for the circuit is:
This proved pretty straightforward to source parts and build. I did need to wire up a few extras for development purposes with the LPC810 chip:

  • Pin 1 (RESET/P0_5) is used to reset the chip by pulling it low. I wired a tactile switch between that pin and ground so I could manually reset the chip – and also pulled it high via a 10K resistor so it wouldn’t float when the switch was open.
  • Pin 5 (PIO0_1/BOOT) is used to put the chip in ‘ISP’ mode if it is low when the chip is reset – this mode is used to download code to the chip via a serial connection. I wired up a similar tactile-switch-to-ground as for Pin 1, so I could manually trigger this mode for easy development.
  • To download code when in the ISP mode requires a serial connection – pins 2 and 8 (P0_4/WKUP and PIO0_0) are used for this purpose. I wired these up to a couple of header pins on the breadboard so I could connect a USB-to-serial cable from my PC. As pin 8 also drives the IR LEDs, I used a three pin slide switch so I could switch between the serial line (for programming) and the LEDs.
  • For power, I connected up the 5V power and ground pins from the aforementioned USB/serial cable to the breadboard power rails, and fed these into the circuit appropriately. For extra flexibility, I routed the power connection via a slide switch.

Note that the LPC810 pins are not 5V safe, so it needs to be a 3.3V USB-to-serial cable.

To test this worked, I downloaded and built the lpc21isp tool, and used its ‘detect only’ option to check that the MCU was working and ready to receive code. With the LPC810 powered up and reset into ISP mode and the USB/serial connection in place, this reported back the correct type of microcontroller.

You can see the breadboard version under development below – I’m using part of the IR stripboard circuit from the GPIO version to provide the actual IR sending hardware here, this time under control of the PIO0_0 pin of the LPC810.


Development environment

With working hardware under my belt, I now needed the tools and system libraries required to write the code. Some Googling yielded a number of different codebases and toolchains for LPC (e.g. LPCOpen, LPCXpresso, LPC810 Code Base). I wanted to keep things very small, simple and open, so I used the approach adopted by JeeLabs and their LPC codebase (see examples in the ‘explore’ folder). JeeLabs have lots of good articles on working with the LPC800 family of microcontrollers.

I acquired the GCC ARM toolchain from Launchpad (at the time of development, 2014q4 was the latest version), and set that up in a folder. To keep the dev environment lean, I used a simple text editor and make to write and build the code, then used lpc21isp to download it to the microcontroller. By leaving pin 2 connected to the serial cable, I could print characters back to the PC via lpc21isp’s terminal mode for simple debugging.

Software design

A little forward thought and planning was required, as I was going to be working under some tight restrictions.

Memory use

The LPC810 has very limited memory – 4KB of flash and 1KB of RAM. Therefore my design and implementation had to be very lean, so I used the following strategies:

  • Write just the code that’s needed.
  • Pack data structures – don’t allow any padding.
  • Avoid any C runtime library code if at all possible, especially printf (even for debug builds).
  • Don’t perform any runtime division by a variable, as this requires library support (which bulks up the code size).
  • Use -Os to let the compiler optimise for size.
  • Generate a map file and use GCC tools to rigorously monitor code and data memory use.

The LPC810 does include a ROM with various useful routines, which can save memory – however, after some experimentation and research, I found these didn’t suit. Using the ROM I2C driver would have saved some memory, but it didn’t seem to work reliably enough with interrupts for my needs.

Having the compiler optimise for size might lead to performance issues, but as the speeds of I2C (100kHz) and IR (38kHz) are relatively low while the LPC810 can clock up to 30MHz, that wasn’t likely to be an issue.

Managing power

To minimise power use, I wanted the LPC810 to drop to as low a power state as possible while idle – and only use the resources it needs when active. That meant adopting a few principles:

  • Use of an interrupt driven approach to the design, as the ARM Cortex M low power modes occur when waiting for interrupts to trigger.
  • Only enable the exact peripherals and pins required – don’t turn on extraneous features, and turn off anything that’s on by default if not required. E.g. there are a couple of outputs not connected to pins on the LPC810 that need to be set up to avoid leaking power.
  • Lower the clock rate if possible.

Execution flow

I used this flow of execution for the app:

  • Low-level processor startup.
  • Initialise clock and required peripherals – disable anything not required.
  • Set up I2C for interrupt operation.
  • Set up IR sending via timers and interrupts.
  • Main loop to service IR sending and wait on interrupts if idle.
  • Timer interrupt to trigger low power mode if idle for a set period – wake up on I2C interrupt.

IR sending

The app has a set of ‘registers’ (memory locations) that are used to contain the data describing the IR sending:

  • Status byte to indicate if sending is active or not.
  • Number of repeats.
  • Delay between repeats in milliseconds.
  • Count of timing values in signal.
  • Array of timing values, up to 64.

When sending is triggered, the 38kHz carrier wave is turned on and off according to the array of timing values – which generates a sequence of pulses used to control the IR LEDs, thus sending the coded signal out.

I used the LPC810’s State Configurable Timer peripheral to both generate the 38kHz carrier wave, and to control the pulses of that wave. The timer is set to operate as two 16-bit counters – one acting as a straightforward PWM controller for the carrier wave, the other driven from the IR timing data and using an interrupt to control the process of iterating through the array of timings.

The main loop of the application calls a service routine after any interrupt when IR sending is active – this detects when a complete set of pulses has been sent, and either triggers any required repeat (after the required delay) or stops the sending process.

To make the timing as accurate as possible for the 38kHz carrier wave, the main system clock is set to 10MHz – which my calculations indicate yields the least amount of error when trying to achieve the 38kHz carrier.

I2C implementation

This is the key to the operation of the app. It is configured as an I2C slave at address 0x70 on the bus, and its behaviour is defined in a small state machine implemented inside the I2C interrupt handler. This state machine handles sending and receiving based on the I2C hardware registers. When receiving data over the I2C bus, it writes values into the IR data ‘register’ locations, and triggers IR sending when the last timing value is written. When asked to transmit I2C data, it returns the content of the IR status ‘register’, to indicate if IR sending is active or not.

If it receives data while currently sending out IR, it will indicate an error back via I2C NAK.

This interrupt handler is kept simple and short in order to maintain performance, and I2C reliability.


With the breadboard version set up and the key design issues decided, I was able to progressively write the software and test it directly from the PC for the setup and IR sending side. To check that the expected behaviour occurred, I employed the following:

  • TTY output from the LPC over the serial connection.
  • Use of a webcam or phone camera to view the IR LEDs flashing.
  • Use of a digital oscilloscope to capture the pulses used to drive the IR LEDs.

After a few iterations and bugs, this worked reliably – I could set up an IR pulse sequence then measure the expected timings on the oscilloscope, and verify this drove the IR LEDs via a webcam.

I2C testing required a different approach – for this, I connected the circuit to a Raspberry Pi and used the i2ctools package of command line tools to interactively read and write the IR ‘registers’ on my circuit, and trigger IR sending. Again, the webcam/oscilloscope combo was used to verify the behaviour.

Once I worked through a few bugs and glitches, this worked well. I learned how sensitive I2C can be to noise or level mismatches during this process, and to timing – keeping that interrupt handler fast was key!

And overall the system worked. The memory footprint stayed pretty low at 1.7KB, in part due to the simple functionality, simple design and careful monitoring. By far the biggest memory ‘ugly’ was the presence of C runtime library code for printf and to support non-power-of-two division – I eliminated these by supplying my own basic putchar/puts and ASCII numeric conversion code for the debug build, and ensuring all division in the code was by a constant.

The stripboard version

With the circuit and firmware fairly stable, I decided to make the hardware side more permanent so I could move on to trying it with real IR codes. So I rebuilt the circuit on stripboard, but left out the switches for reset and ISP, the serial connection and the power switch. Instead I added a header socket so I could connect it easily to the Pi’s GPIO header pins, and wired it up to use the Pi’s 5V and I2C connections.

I also added a couple of pull-up resistors on the I2C lines, and a four pin header to break out power, ground and I2C (this is to connect to the keypad matrix circuit described in the previous post).

I used this circuit to replace the stripboard circuits connected to the underside of the PiTFT screen on my prototype. Those circuits were directly driven via GPIO and the corresponding LIRC driver – next I had to provide my own I2C driver for LIRC to drive this new circuit.

Putting it all together on the Pi

LIRC has a plugin driver model with lots of existing code to control different IR devices. However, none of those would work with my circuit. So I forked the LIRC codebase and created a new branch for my driver code. This was straightforward to integrate with the LIRC build system, and I was able to carry out all the work directly on the Raspberry Pi. Configuring LIRC to use the driver was trivial: specify it in the config file instead of the default driver, and comment out the options for the Pi GPIO driver.

Testing on the command line and with the PiMony Python code showed it to work successfully! I could now remote control my devices as before. So I took a deep breath, and removed the timing workarounds for the touch screen issues in the Python code – and it worked smoothly! No more missed touch screen events.

To be a fully working version, this I2C driver requires some more effort (but not much) – it should really have options to specify the I2C bus to use and the I2C device address, as currently these are hard-coded.


So this was a trip around the houses to get back to where we started – but yielded the desired results and taught me a whole lot of new stuff into the bargain. And it’s opened up doors for further work on this project…


Here’s where it starts to get dangerous – from here on in, I’ll be taking off the safety wheels of the Raspberry Pi and diving deeper into pure microcontroller territory. There will be bare-metal coding, parallel interfaces, custom PCB creation and DIY surface mount soldering before this tale is fully told…


(c) 2015 Nicholas Tuckett.