Replacing the PiMony IR sender

It has been way too long since my last post. Not that nothing has been going on… in fact a very great deal has occurred in the intervening time! This and subsequent posts should fill that gap, in roughly chronological order.

So, first things first. I’ve been focussing my energies on projects other than Pithesiser – namely the PiMony smart remote project, and a related parallel side-project for a ‘smart’ kitchen timer. Both projects are learning devices for me – experiences and discoveries on one cross-fertilise the other.

Microcontroller-based IR sender

If you recall my previous post, I expounded at some length on the problems with the touch screen response ‘freezing’ while the PiMony device blasted out IR. Towards the end of that post, I mentioned using a microcontroller-based sender circuit to solve this. Well, that is what I have now created – and it works a treat!

My requirements for this circuit were simple, based on the experience with PiMony:

  • Use a simple protocol over I2C to set up pulse timings, repeat count and repeat delay, trigger IR blasting and send back a ‘busy’ flag to indicate activity.
  • Consume as little power as possible – power down under internal timer control, and wake up when sent data over I2C.
  • Use a 5V power supply – direct for LEDs, and potentially via a voltage regulator for the microcontroller if required.

The ultimate goal is to create a device to which the Pi can delegate the low-level aspects of IR sending, and carry on with its own processing (e.g. refreshing UI) while the sending is performed.

Choosing components

The fundamental IR sender circuit of LEDs, resistors and transistor is the same as in the full Pi-based version.

The new stuff that needs sorting out is everything microcontroller-related, from the controller IC itself to any ancillary components needed to power it and interface it with the LEDs and Raspberry Pi.

This was an area relatively new to me – I’ve used many different ‘full’ CPU computers over the years, but not tried any microcontroller devices such as Arduino.

Microcontroller central

IMG_20150620_161524

I wanted a microcontroller that meets these requirements:

  • Capable of very low power consumption.
  • Easy to use on a breadboard for quick prototyping.
  • Supports I2C for interfacing with the Pi.
  • Supports GPIO for controlling the IR LEDs.
  • Has good free tools for coding, compiling and debugging.
  • Has plenty of good documentation and sample code.
  • Affordable and easily available.
  • Based on an architecture with which I am familiar.

Starting with the last one first – that would be ARM for me. And to satisfy the low power consumption, an MCU from the Cortex M0 or M0+ family would suit. That makes for quite a big field, with chips from TI, ST Microelectronics, Freescale, NXP, Atmel, Cypress… how to choose between them?

Support for I2C and GPIO is very common, so that isn’t a strong distinguishing factor. Free coding tools, documentation and examples – I found lots on NXP, Freescale and ST for starters. Easy for breadboarding… hmm, that’s where it gets more interesting. Either the chips have to be available in chunky DIP packaging, or as a breakout board designed for breadboarding. Not quite so many of those around!

After very little Googling, I hit on the LPC810 from NXP. This comes in a small 8-pin DIP package, so can very easily be used on a breadboard. It also fits all the other requirements, including the last – you can buy them from Farnell here in the UK for under £2 each.

The LPC810 is a 3.3V part, so some extra bits are needed to regulate the 5V supply – I settled on a 3.3V MCP1700, which requires a couple of 1uF capacitors to support it.

Breadboard version

The schematic for the circuit is:
i2c-ir-schematic
This proved pretty straightforward to source parts and build. I did need to wire up a few extras for development purposes with the LPC810 chip:

  • Pin 1 (RESET/P0_5) is used to reset the chip by pulling it low. I wired a tactile switch between that pin and ground so I could manually reset the chip – and also pulled it high via a 10K resistor so it wouldn’t float when the switch was open.
  • Pin 5 (PIO0_1/BOOT) is used to put the chip in ‘ISP’ mode if it is low when the chip is reset – this mode is used to download code to the chip via a serial connection. I wired up a similar tactile-switch-to-ground as for Pin 1, so I could manually trigger this mode for easy development.
  • To download code when in the ISP mode requires a serial connection – pins 2 and 8 (P0_4/WKUP and PIO0_0) are used for this purpose. I wired these up to a couple of header pins on the breadboard so I could connect a USB-to-serial cable from my PC. As pin 8 also drives the IR LEDs, I used a three pin slide switch so I could switch between the serial line (for programming) and the LEDs.
  • For power, I connected up the 5V power and ground pins from the aforementioned USB/serial cable to the breadboard power rails, and fed these into the circuit appropriately. For extra flexibility, I routed the power connection via a slide switch.

Note that the LPC810 pins are not 5V safe, so it needs to be a 3.3V USB-to-serial cable.

To test this worked, I downloaded and built the lpc21isp tool, and used its ‘detect only’ option to check that the MCU was working and ready to receive code. With the LPC810 powered up and reset into ISP mode and the USB/serial connection in place, this reported back the correct type of microcontroller.

You can see the breadboard version under development below – I’m using part of the IR stripboard circuit from the GPIO version to provide the actual IR sending hardware here, this time under control of the PIO0_0 pin of the LPC810.

IMG_20150222_101032

Development environment

With working hardware under my belt, I now needed the tools and system libraries required to write the code. Some Googling yielded a number of different codebases and toolchains for LPC (e.g. LPCOpen, LPCXpresso, the LPC810 Code Base). I wanted to keep things very small, simple and open, so used the approach adopted by Jeelabs and their LPC codebase (see examples in the ‘explore’ folder). Jeelabs have lots of good articles on working with the LPC800 family of microcontrollers.

I acquired the GCC ARM toolchain from Launchpad (at the time of development, the 2014q4 release was the latest), and set that up in a folder. To keep the dev environment lean, I used a simple text editor and make to write and build the code, then used lpc21isp to download it to the microcontroller. By leaving pin 2 connected to the serial cable, I could print characters back to the PC via lpc21isp’s terminal mode for simple debugging.

Software design

A little forward thought and planning was required, as I was going to be working under some tight restrictions.

Memory use

The LPC810 has very limited memory – 4KB of flash and 1KB of RAM. My design and implementation therefore had to be very lean, so I used the following strategies:

  • Write just the code that’s needed.
  • Pack data structures – don’t allow any padding.
  • Avoid any C runtime library code if at all possible, especially printf (even for debug builds).
  • Don’t perform any runtime division by a variable, as this requires library support (which bulks up the code size).
  • Use -Os to let the compiler optimise for size.
  • Generate a map file and use GCC tools to rigorously monitor code and data memory use.

The LPC810 does include a ROM with various useful routines that can save memory – however, after some experimentation and research, I found these didn’t suit. It would have saved some memory if the ROM I2C driver could be used, but it didn’t seem to work reliably enough with interrupts for my needs.

Having the compiler optimise for size can cost performance, but as the speeds of I2C (100kHz) and IR (38kHz) are low relative to the LPC810’s maximum 30MHz clock, that was unlikely to matter.

Managing power

To minimise power use, I wanted the LPC810 to drop to as low a power state as possible while idle – and only use the resources it needs when active. That meant adopting three principles:

  • Use of an interrupt driven approach to the design, as the ARM Cortex M low power modes occur when waiting for interrupts to trigger.
  • Only enable the exact peripherals and pins required – don’t turn on extraneous features, and turn off anything that’s on by default if not required. E.g. there are a couple of outputs not connected to pins on the LPC810 that need to be set up to avoid leaking power.
  • Lower the clock rate if possible.

Execution flow

I used this flow of execution for the app:

  • Low-level processor startup.
  • Initialise clock and required peripherals – disable anything not required.
  • Set up I2C for interrupt operation.
  • Set up IR sending via timers and interrupts.
  • Main loop to service IR sending and wait on interrupts if idle.
  • Timer interrupt to trigger low power mode if idle for a set period – wake up on I2C interrupt.

IR sending

The app has a set of ‘registers’ (memory locations) that are used to contain the data describing the IR sending:

  • Status byte to indicate if sending is active or not.
  • Number of repeats.
  • Delay between repeats in milliseconds.
  • Count of timing values in signal.
  • Array of timing values, up to 64.
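As a concrete sketch, here is how that register block might pack down using Python’s `struct` notation – the field order and one-byte widths are my assumptions for illustration, not the actual firmware layout:

```python
import struct

MAX_TIMINGS = 64  # upper bound on timing values per signal

def pack_ir_registers(status, repeats, repeat_delay_ms, timings_us):
    """Pack the IR 'registers' as contiguous little-endian bytes, no padding.
    Assumes one byte each for status/repeats/delay/count and 16-bit timings."""
    assert len(timings_us) <= MAX_TIMINGS
    header = struct.pack("<4B", status, repeats, repeat_delay_ms, len(timings_us))
    body = struct.pack("<%dH" % len(timings_us), *timings_us)
    return header + body

# 4 header bytes + 4 two-byte timings = 12 bytes in total
block = pack_ir_registers(0, 2, 45, [2400, 600, 1200, 600])
```

Packed like this, even the full 64-entry timing array plus header only occupies 132 bytes – comfortably within the 1KB RAM budget.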

When sending is triggered, the 38kHz carrier wave is turned on and off according to the array of timing values – which generates a sequence of pulses used to control the IR LEDs, thus sending the coded signal out.

I used the LPC810’s State Configurable Timer peripheral to both generate the 38kHz carrier wave, and to control the pulses of that wave. The timer is set to operate as two 16-bit counters – one acting as a straightforward PWM controller for the carrier wave, the other driven from the IR timing data and using an interrupt to control the process of iterating through the array of timings.

The main loop of the application calls a service routine after any interrupt when IR sending is active – this detects when a complete set of pulses has been sent, and either triggers any required repeat (after the required delay) or stops the sending process.
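The burst-and-repeat flow above is easy to sanity-check with a little arithmetic – this is a model of the timing, not the firmware code itself:

```python
def total_send_time_ms(timings_us, repeats, repeat_delay_ms):
    """Duration of one complete send: the timing array is played back
    'repeats' times with 'repeat_delay_ms' between bursts."""
    burst_ms = sum(timings_us) / 1000.0
    return repeats * burst_ms + (repeats - 1) * repeat_delay_ms

# e.g. two 6 ms bursts separated by a 40 ms gap -> 52 ms of busy time
duration = total_send_time_ms([600] * 10, repeats=2, repeat_delay_ms=40)
```

This is the period during which the status ‘register’ reports busy – and, crucially, time the Pi no longer has to spend with interrupts disabled.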

To make the timing as accurate as possible for the 38kHz carrier wave, the main system clock is set to 10MHz – which my calculations indicate yields the least amount of error when trying to achieve the 38kHz carrier.
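The reasoning behind that clock choice can be reproduced in a few lines – divide a candidate system clock by the nearest whole count and compare the resulting carrier error (this mirrors my understanding of the calculation, not the firmware itself):

```python
def carrier_at(clock_hz, target_hz=38000):
    """Nearest achievable carrier with an integer divider, and its relative error."""
    divider = round(clock_hz / target_hz)
    actual_hz = clock_hz / divider
    return actual_hz, abs(actual_hz - target_hz) / target_hz

# At 10 MHz, a divider of 263 gives ~38,023 Hz - well under 0.1% off target
actual, error = carrier_at(10_000_000)
```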

I2C implementation

This is the key to the operation of the app. It is configured as an I2C slave at address 0x70 on the bus, and its behaviour is defined in a small state machine implemented inside the I2C interrupt handler. This state machine handles sending and receiving based on the I2C hardware registers. When receiving data over the I2C bus, it writes values into the IR data ‘register’ locations, and triggers IR sending when the last timing value is written. When asked to transmit I2C data, it returns the content of the IR status ‘register’, to indicate if IR sending is active or not.

If it receives data while currently sending out IR, it will indicate an error back via I2C NAK.

This interrupt handler is kept simple and short in order to maintain performance and I2C reliability.

Testing

IMG_20150201_153939_crop

With the breadboard version set up and the key design issues decided, I was able to progressively write the software and test it directly from the PC for the setup and IR sending side. To check that the expected behaviour occurred, I employed the following:

  • TTY output from the LPC over the serial connection.
  • Use of a webcam or phone camera to view the IR LEDs flashing.
  • Use of a digital oscilloscope to capture the pulses used to drive the IR LEDs.

After a few iterations and bugs, this worked reliably – I could set up an IR pulse sequence then measure the expected timings on the oscilloscope, and verify this drove the IR LEDs via a webcam.

I2C testing required a different approach – for this, I connected the circuit to a Raspberry Pi and used the i2ctools package of command line tools to interactively read and write the IR ‘registers’ on my circuit, and trigger IR sending. Again, the webcam/oscilloscope combo was used to verify the behaviour.
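In Python on the Pi, the equivalent of those interactive i2ctools pokes looks roughly like this. The message layout is my assumed mirror of the IR ‘registers’, and note that smbus block writes carry at most 32 data bytes, so a full 64-entry timing array would need chunking across several writes:

```python
I2C_ADDR = 0x70  # the sender's slave address on the bus

def build_ir_message(repeats, delay_ms, timings_us):
    """Byte payload mirroring the IR 'registers' (assumed layout)."""
    data = [repeats, delay_ms, len(timings_us)]
    for t in timings_us:
        data += [t & 0xFF, (t >> 8) & 0xFF]  # 16-bit timings, little-endian
    return data

def send_ir(bus, repeats, delay_ms, timings_us):
    # Writing the last timing value triggers the blast on the firmware side.
    bus.write_i2c_block_data(I2C_ADDR, 0x00,
                             build_ir_message(repeats, delay_ms, timings_us))

def ir_busy(bus):
    # The status 'register' reads non-zero while sending is active.
    return bus.read_byte(I2C_ADDR) != 0
```

Here `bus` would be an `smbus.SMBus(1)` instance on a revision 2 Pi or B+.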

Once I worked through a few bugs and glitches, this worked well. I learned how sensitive I2C can be to noise or level mismatches during this process, and to timing – keeping that interrupt handler fast was key!

And overall the system worked. The memory footprint stayed pretty low at 1.7KB, in part due to the simple functionality, simple design and careful monitoring. By far the biggest memory ‘ugly’ was the presence of C runtime library code for printf and to support non-power-of-two division – I eliminated these by supplying my own basic putchar/puts and ASCII numeric conversion code for the debug build, and ensuring all division in the code was by a constant.

The stripboard version

IMG_20150620_114209

With the circuit and firmware fairly stable, I decided to make the hardware side more permanent so I could move on to trying it with real IR codes. So I rebuilt the circuit on stripboard, but left out the switches for reset and ISP, the serial connection and the power switch. Instead I added a header socket so I could connect it easily to the Pi’s GPIO header pins, and wired it up to use the Pi’s 5V and I2C connections.

I also added a couple of pull-up resistors on the I2C lines, and a four-pin header to break out power, ground and I2C (this is to connect to the keypad matrix circuit described in the previous post).

I used this circuit to replace the stripboard circuits connected to the underside of the PiTFT screen on my prototype. Those circuits were directly driven via GPIO and the corresponding LIRC driver – next I had to provide my own I2C driver for LIRC to drive this new circuit.

Putting it all together on the Pi

LIRC has a plugin driver model with lots of existing code to control different IR devices. However, none of those would work with my circuit. So I forked the LIRC codebase and created a new branch for my driver code. This was straightforward to integrate with the LIRC build system, and I was able to carry out all the work directly on the Raspberry Pi. Configuring LIRC to use the driver was trivial: specify it in the config file instead of the default driver, and comment out the options for the Pi GPIO driver.

Testing on the command line and with the PiMony Python code showed it to work successfully! I could now remote control my devices as before. So I took a deep breath, and removed the timing workarounds for the touch screen issues in the Python code – and it worked smoothly! No more missed touch screen events.

To be a fully working version, this I2C driver requires some more effort (but not much) – it should really have options to specify the I2C bus to use and the I2C device address, as currently both are hard-coded.

Conclusions

So this was a trip around the houses to get back to where we started – but yielded the desired results and taught me a whole lot of new stuff into the bargain. And it’s opened up doors for further work on this project…

Next

Here’s where it starts to get dangerous – from here on in, I’ll be taking off the safety wheels of the Raspberry Pi and diving deeper into pure microcontroller territory. There will be bare-metal coding, parallel interfaces, custom PCB creation and DIY surface mount soldering before this tale is fully told…

Resources

(c) 2015 Nicholas Tuckett.

Digression #2: PiMony, a smart remote prototype

And so I digress again…

Presenting PiMony, a smart remote prototype built on a Raspberry Pi!

IMG_20150101_170254

As prototypes go, this is very… prototypey! It is quite large and kludgy (e.g. trailing wires). The goals at the moment are to learn more about the hardware side, and try out ideas for both hardware and software. Then work on the size, look and feel!

IMG_20150101_170518

For an idea of scale, here’s the remote next to a Raspberry Pi Model B. In total, it’s about three-and-a-bit times the length of a naked Pi.

My motivation for making this may seem familiar if you’ve read my earlier post about Squeezeboxes…

I’ve been the owner of a Harmony 655 smart remote for quite a few years, and it has done sterling service controlling our entertainment gear. However it is now ageing, and the number of buttons that are intermittent in operation is slowly growing. I researched newer models… and was rather disappointed. The less pricey of the new models only control a limited number of devices (my 655 could handle loads more), and the higher-end remotes have bells and whistles that draw more power and increase fragility (judging by customer reviews). And Logitech bought the original manufacturer a while back…

So I decided to go DIY – that way I would only be limited by my imagination and skills, would certainly learn a few things and have a device I could easily repair myself if it got damaged.

Hardware

So what’s in the PiMony? I won’t go into much detail here, but these are the main component parts with embedded links to information and resources that helped me with this project.

Raspberry Pi Model B+ and PiTFT


Custom circuitry

Custom matrix keypad (left), GPIO extender (centre) and load sharing circuit (right)

Power supply

LiPo battery (lower left) and PowerBoost 500C (lower right) supported by PiBow extended baseplate.

Supporting Information

There’s a good deal of information on infra-red remote control hardware and protocols about. Here are some of the most useful links I found:

Software

To get things going swiftly, I re-purposed some early Python code from the SqueezePi project that used Pygame to handle rendering on the PiTFT and input from its touch screen. I rounded this out to present a basic data-driven user interface, and added GPIO button polling and IR sending via LIRC – more details on this below.

PiMony User Interface on touch screen

Here’s a shot of it running.

The Python code is dependent on Pygame, smbus and RPIO and runs on Raspbian configured with support for PiTFT, LIRC and I2C.

You can find the PiMony code in this Github repository.

Phase 1: IR Basics

IR_0

I started out creating a basic IR receiver and sender circuit on a Raspberry Pi with LIRC installed. This comes from some very helpful posts on alexba.in’s blog. I used a TSOP39238 IR receiver and TSUS5400 940nm IR LEDs – both purchased from Bitsbox (in the UK).

I proceeded to try capturing the IR signals from a couple of remotes (Sony TV, Philips DVD player). Initially, I found recording the IR signals via LIRC to be somewhat hit-and-miss – some codes got recorded while others didn’t. Then I found certain sequences of button presses on my Sony TV remote that reproduced the problem 100% of the time. Some research into Sony TV IR codes brought to light that Sony remotes can output 12-bit, 15-bit or 20-bit codes. So I tried some ‘raw’ recording of the button output and found that most of my remote’s buttons generate 12-bit codes, but some generate 15-bit codes.

By trial and error, I was able to build a list of buttons for each code length and then captured each group independently.

Once captured, the IR codes played back perfectly through my IR sender circuit. The only downside I found was that my IR LEDs have a narrower output angle than those in the original remotes, so the remote needs to be pointed more accurately at the target device to work.

With the basic record and send capabilities operational, I spent a couple of hours capturing all the IR signals of all the remotes used by our household entertainment devices, building a LIRC database of remotes and their codes.

Phase 2: Python Prototype

IMG_20150101_182432

With that all running, I moved on to writing the Python code. As mentioned above, I reused some early code from the SqueezePi project to get rendering and touch screen input quickly up and running, then went on to create a simple touch-screen UI that triggered the sending of IR codes. My Github repository contains the code if you want to refer to it.

At the start, I sent the IR codes by spawning LIRC’s irsend application with a suitable set of command line parameters. This quickly proved problematic; spawning a new process was too slow for a responsive user experience and LIRC didn’t cope well if more than one irsend process was active (which was needed for buttons that send more than one IR command). I replaced this with code that communicated directly to LIRC via its socket interface; this proved to be faster and reliable for sending multiple commands.
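The socket approach boils down to writing lircd’s line protocol over its Unix socket and reading back the BEGIN…END reply block. A minimal sketch (the socket path is the Raspbian default of the time, and error handling is omitted):

```python
import socket

LIRCD_SOCKET = "/var/run/lirc/lircd"

def lirc_command(remote, code):
    """Format a single-send request in lircd's line protocol."""
    return "SEND_ONCE %s %s\n" % (remote, code)

def lirc_send_once(remote, code, sock_path=LIRCD_SOCKET):
    """Issue the command and wait for lircd's reply block to complete."""
    s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        s.connect(sock_path)
        s.sendall(lirc_command(remote, code).encode())
        reply = b""
        while not reply.endswith(b"END\n"):
            reply += s.recv(256)
        return b"SUCCESS" in reply
    finally:
        s.close()
```

Because one connection can issue several commands back to back, multi-command buttons no longer need multiple irsend processes.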

The Touch Screen Challenge

I also found that the touch screen was flickering at times, and touches were ‘bouncing’ – i.e. one touch on a button would cause that button’s IR commands to be sent multiple times. Debugging this showed that there were phantom ‘touch release’ events being generated, immediately followed by detection of a new touch even though the user had not stopped touching the button. This led to quick changes to button visuals (the flickering) and the unwanted command repeats.

To get to the bottom of it, I created a standalone C app to see if the problem was repeatable outside of Pygame. I first used SDL (on which Pygame is based) to read from the touch screen – and this showed exactly the same issue with phantom release events. I then replaced SDL with direct calls to the underlying tslib touch screen library; and this too showed the same problem.

I then tracked down the source code for the PiTFT touch screen driver and began my study. It has a timeout mechanism implemented in one of its functions (stmpe_work) that generates a release event if there has not been an interrupt from the touch screen for around 20 milliseconds. So if something stalls the CPU in the right way for long enough, this will cause a phantom touch release event to occur.

As IR sending coincides with the phantom release events, I dived into LIRC and the lirc_rpi kernel module source. The Pi module source is available as a very readable patch. Lo and behold – the IR sending function lirc_write uses a kernel spinlock which disables interrupts while the IR pulse generating loop is running (in order for the software’s timing to correctly reproduce the right frequency of output signal). Basic math shows that a Sony 12-bit IR code takes between 17ms (all zero bits) and 24ms (all one bits) to send. Given that this timing is close to the touch screen driver’s ‘timeout’ timing, and that LIRC is repeating each command twice for robustness, it seems highly likely that having interrupts disabled for this time could be causing the touch screen driver to generate phantom releases.
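That basic math is easy to reproduce. Using the Sony SIRC timings (a 2.4ms header, then for each bit a 0.6ms space followed by a 0.6ms mark for a zero or 1.2ms mark for a one):

```python
HEADER_MS, SPACE_MS, ZERO_MS, ONE_MS = 2.4, 0.6, 0.6, 1.2

def sirc_duration_ms(bits):
    """Duration of one Sony SIRC frame for the given bit values."""
    return HEADER_MS + sum(SPACE_MS + (ONE_MS if b else ZERO_MS) for b in bits)

shortest = sirc_duration_ms([0] * 12)  # all zero bits, ~16.8 ms
longest = sirc_duration_ms([1] * 12)   # all one bits, ~24.0 ms
```

Both figures sit right around the driver’s ~20ms release timeout – and LIRC sends each frame twice.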

For the time being, I dealt with this by revising the Python code to work around it – if a touch screen button press is detected, any further events are flushed and touch screen events are disabled until all IR codes are sent (plus a short delay). At the cost of some responsiveness, this avoids the flickering and unwanted repeats.
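The workaround reduces to a small piece of state – a deadline before which touch events are dropped. A simplified standalone version of the idea (names are mine; the real code lives in the Pygame event loop):

```python
import time

class TouchGuard:
    """Ignore touch events until IR sending has finished plus a settle delay."""

    def __init__(self, settle_s=0.1):
        self.settle_s = settle_s
        self.ignore_until = 0.0

    def start_send(self, expected_send_s, now=None):
        """Call when a button press kicks off IR sending."""
        now = time.time() if now is None else now
        self.ignore_until = now + expected_send_s + self.settle_s

    def accept(self, now=None):
        """True if a touch event arriving now should be processed."""
        now = time.time() if now is None else now
        return now >= self.ignore_until
```

The `now` parameter just makes the logic testable; in use, the event loop calls `accept()` on every touch event and discards the ones that fall inside the window.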

Improving IR Sending

IR_1

The timing of the IR signal is critical. A 38kHz square wave is used, meaning the driver has to toggle the GPIO pin state every 13 microseconds. The pulses of the square wave have to match timings very accurately – e.g. durations of 2.4, 1.2 and 0.6 milliseconds for Sony codes. Driving this in software on a Pi running Linux with interrupts enabled is very likely to make the timing too variable for it all to work.

This could be tackled in the LIRC driver a number of ways:

  • Use less CPU-intensive 38kHz square wave generation.
  • Use an interrupt to accurately control the timing of the IR square wave pulses; turning the IR wave output on or off as required.

In all cases, this requires kernel module coding – not for the faint-hearted!

There are also some external hardware solutions:

  • For generating just the square wave – e.g. a PWM breakout board, or a 555 timer IC.
  • For generating the square wave output and its pulses – e.g. a small microcontroller with dedicated code for driving the IR sender circuit.

I successfully breadboarded a circuit using a 555 timer, and was able to drive this from a Pi GPIO pin and control the TV. However it required a fair few supporting components and some trial-and-error to get the timing accurate.

For this project, I am going to follow the hardware route, building an LPC810 micro-controller based ‘IR blaster’ device, controlled via an I2C interface. This completely decouples the IR sending, so that the UI can remain responsive – and will be a fun further learning opportunity!

Phase 3: “NEEDS MOAR BUTTONS!”

IMG_20150102_192658

I wanted some physical buttons on the prototype – partly because it will be a nice user experience to combine touch screen and tactile buttons, and partly to learn more hardware stuff!

As the PiTFT has tactile button wiring (and buttons were already soldered on from a past project), I extended the Python code using RPIO to poll the pins on those buttons and send IR commands when presses were detected. To keep the wiring simpler, I set the GPIO pins to be pulled up to high internally on the Pi, and the buttons were connected to ground – so when a button is pressed, the GPIO pin goes from high to low. This was quick to set up and worked well. The only downside is that the code has to be run as root because of RPIO.

A remote needs a numeric keypad. That’s a whole ten GPIO pins if one is used per button! However, they can be wired in a three column, four row matrix arrangement that reduces the number of GPIO pins to seven – one pin for each row and column with column pins as outputs, row pins as inputs. The matrix is read by sending a signal to the column pins in turn and reading back the row pins at each step – any set row pin means the corresponding button in that column is pressed.

Seven pins is still too many for the original Raspberry Pi model on which I started this project – and even using a B+ would mean a lot of extra wiring to connect it together. So I opted to use a GPIO extender chip connected to the Pi’s I2C bus – that meant only four wires from the Pi (two for I2C, one for power and one ground). Plus if I decide not to use a Pi in the future, I can easily reuse the keypad with other small CPUs that support I2C.

As with the buttons on the PiTFT, I set the row input pins to be internally pulled high – so the polling signal needs to be reversed (set each column to low in turn; any row pin reading low means its button is pressed). In addition, each button is wired up through a diode (oriented to suit the active-low signalling). These diodes prevent phantom button presses that would otherwise happen when certain patterns of buttons are pressed at the same time.
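The inverted scan described above can be sketched in a hardware-independent way – `write_columns` and `read_rows` stand in for the GPIO-extender accesses, and everything is active low:

```python
def scan_keypad(write_columns, read_rows, n_cols=3, n_rows=4):
    """One polling pass: drive each column low in turn and record any
    row reading low, which means that row/column's button is pressed."""
    all_high = (1 << n_cols) - 1
    pressed = []
    for col in range(n_cols):
        write_columns(all_high & ~(1 << col))  # only this column driven low
        rows = read_rows()
        for row in range(n_rows):
            if not rows & (1 << row):          # pulled-up row dragged low
                pressed.append((row, col))
    write_columns(all_high)                    # idle with all columns high
    return pressed
```

The diodes matter once several buttons are held at once: without them, a pressed button can back-feed a column that isn’t being driven, making an unpressed button in another column read as low.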

This was all pretty straightforward to wire up and implement in Python – the existing smbus library makes I2C a doddle. The only awkward part was that my original IR sender board connector masked the I2C pins on the Pi, so I had to construct a further breakout board to sit between the connector on the PiTFT and the original sender board.

Phase 4: Putting It Together

IMG_20150102_193236

The next main phase of work was to get all these pieces to work together in a form that could actually be used as a remote. This would require a case for the Pi and PiTFT which would also allow me to attach the keypad and IR sending circuits.

I had previously bought a Pibow case for a Model B. These cases are made from horizontal layers of laser cut acrylic screwed together. I realised that I could create an extended replacement base layer to which my other parts could be fitted.

To test this out, I tried using this case with the screen and IR sender circuit – but it wouldn’t fit together without having to add a very tall spacer between the screen and the Pi. That set-up wasn’t stable enough to allow the screen to be touched and proved too bulky to be practical. So instead I switched to using a model B+ Pi in a Pibow Coupe case. This allowed all the parts to fit, and though quite hefty was not too bulky for use.

Switching to the B+ highlighted some issues with RPIO. I found that the mapping of GPIO identifiers to expansion port pins wasn’t working right; it turned out that some of the functions used the wrong identifier (pin vs GPIO or vice versa) and one of the internal pin numbers was incorrect for revision 2 Pis (and the B+). I fixed these in the dc-dev branch of the RPIO fork in my Github.

Next up was a power supply! Adafruit provide some straightforward LiPo battery driven power boosters/chargers; I tracked down a supplier and purchased a PowerBoost 500C and a 2500mAh LiPo battery. It charged up fine (though that took a good 7+ hours) and ran the PiMony continuously for about 6 hours in testing – which should be fine for further development purposes. The only problem was that though power could be supplied to the PiMony while charging, it wasn’t enough for it to run properly.

The solution was to create a load sharing circuit that would switch off the PowerBoost output when a USB power supply was connected and instead allow it to supply current direct to the PiMony as well as the battery charger. This wasn’t hard to build, but the only components I could obtain were rather over-specified for the task and consequently added some unwanted bulk.

Creating the extended base plate meant measuring up the existing base plate carefully with digital calipers, and reproducing it as a vector image in Inkscape. Extending the base to fit my custom circuit and power supply was straightforward, with careful measurements taken to position screw holes for the supports. I also allowed some extra space between the edge of the Pi and the top of the keypad so I could get access to the network and USB ports on the Pi. I sent this file off to Razorlab for some trial cuts in cardboard; after testing and tweaking those, they made a final acrylic cut for me. It all fitted together beautifully.

So now I have a usable if somewhat bulky prototype that works fine at controlling my TV and surround sound system!

What’s Next?

There’s plenty in the pipeline for this project!

The software side needs further development to make the remote actually ‘smart’ – so it can handle different combinations of devices for different activities (e.g. just watching TV as opposed to watching a movie on Blu-ray), and track which devices are on at any given time. The remote configuration is currently data ‘baked in’ to the Python code; this should be moved into data files, and a GUI tool created to allow easy editing.

The IR sending circuitry should be rebuilt using a microcontroller as mentioned above, to avoid the interference between the touch screen and the current GPIO IR driver.

IMG_20150101_181757

Freescale FRDM-KL26Z board

The next iteration of the hardware should be smaller and sleeker, and run for longer on a single battery charge. I’m thinking of replacing the Pi with an ARM microcontroller board like the FRDM-KL26Z. It is much lower profile than the Pi and less power hungry but should be able to handle the required behaviour. It may even be able to do IR sending, or that could still be done by an external device.

IMG_20150101_181931

FRDM-KL26Z alongside a Model B+ Pi

The software would need to be reimplemented in C/C++, and the big advantage would be complete control – no Linux overhead!

I could also reuse the keypad matrix as I2C is available on this type of device, and the battery based power supply and load sharing could also be reused.

Speaking of load sharing… that part of the circuit should also be improved by trying to find a solution that’s smaller!

And some sort of low battery detection and indicator would be a good idea! There’s an unconnected white wire you may have spotted – this is the low battery output from the PowerBoost board, which can be directly connected to a GPIO and read in software. Unfortunately when connected it turns on the low battery LED on the PowerBoost regardless, which isn’t helpful!

(c) 2015 Nicholas Tuckett.

SqueezePi: Booting and PiTFT

Bootup Failures

I mentioned in the last post that occasionally the SqueezePi would not boot up. The behaviour seemed very random, so I initially suspected my soldering, or power supply issues. But after a morning spent checking and cross-checking both thoroughly, I found nothing of concern.

Then while digging into GPIO information for another project (more to come in a future post on this) I learnt that some versions of the firmware will boot into ‘safe mode’ if GPIO3 is held low on startup. Hmmm… one of my rotary encoders (for volume) in the control panel connects to that pin…

After some careful positioning of the rotary encoder and gentle probing with a multimeter to achieve GPIO3 held low, I was able to replicate the bootup failure every time. Plugging in an HDMI cable showed the Pi was booting partway, and the video output signal was low resolution (640×480). Further adjusting the encoder to ensure GPIO3 was connected high allowed perfectly stable boot every time. Culprit found!

The random behaviour of this issue was purely down to the position the volume control happened to be in at boot. This meant that you could on one day have the SqueezePi working fine and turn the volume up or down while listening, then on the next boot it wouldn't start. Accidentally knock the volume control while trying to sort it out, and it would 'magically' start booting again.

PiTFT and Upgrading Firmware Challenges

So to resolve this, I decided to see about upgrading the Pi firmware and Raspbian version in the SqueezePi. But because I’m using the prebuilt kernel modules for the Adafruit PiTFT touchscreen, options are limited as Adafruit package up the modules with an entire kernel. Fortunately the latest version at the time of writing is new enough not to have the safe mode boot check, so I went for that.

Lo and behold… booting became reliable, no matter the volume control position. However there was a not-so-nice and very obvious side effect – every time the screen updated, very audible noise was emitted from the speakers. And the screen update speed was visibly slower.

Rolling back to the previous firmware and kernel I was using fixed both of these issues (see here on Adafruit) – but means exposure to the safe mode boot scenario.

What to do?

Well, the situation still stands – I haven't dug into this further, and neither Adafruit nor notro (who did the PiTFT work) currently has an answer. I'm sticking with the older firmware, and have just found out that there's a config.txt option to force safe mode to be ignored, so I'm going to try that for now.

EDIT: I’ve tried the above ‘ignore safe mode’ option, and it works. So I’m going with that and the earlier firmware for now.
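For reference, the setting in question lives in /boot/config.txt. This applied to the firmware of that era (the safe mode mechanism, and with it this option, was later removed entirely):

```shell
# /boot/config.txt – tell older firmware to skip the GPIO3 safe-mode check
avoid_safe_mode=1
```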

(c) 2014 Nicholas Tuckett

SqueezePi: Audio output and an enclosure to call my own

Audio Cannibalisation

For this first go at making a usable digital music player, I decided to forego custom analogue audio circuitry and use off-the-shelf stuff and/or cannibalise. In fact, I wound up doing both.

The speakers in my previous post are active (USB powered) so I decided to pull them apart and see if I could reuse anything – lo and behold, there was a neat little amplification circuit with volume control.

IMG_20140621_133510

I just chopped off the USB plug and spliced the power connections into a 2.1mm socket alongside an adapter cable that connected to the Pi’s micro USB socket – and with a good quality 5v power supply, I was able to successfully power the Pi and the speakers.

The little USB audio dongle was also co-opted into my build, as it was small, cheap and worked well with the Pi – and better than the onboard audio!

So now I had all the main parts ready and working:

  • Raspberry Pi
  • Software: SqueezeLite, JiveLite, pikeyd.
  • Custom control panel.
  • Adafruit TFT touchscreen.
  • Analogue audio output, amplification and reproduction.

Act of Enclosing

I started out with the intention of finding a ready-made housing using something “retro” – an old radio or radio cassette player. Others have done this successfully; but after some initial research I realised I would either have to fork out for a refurbished item from an online seller (not cheap) or spend time grubbing around and hope to get lucky.

My next thought was – why not try 3D printing one? A chance to try out this cool new technology with a project… So I set to with a will, roughed out some ideas on paper based on the Squeezebox Radio (but with two speakers and fewer buttons), picked up the free version of Sketchup and created a box-like structure with suitable holes for speakers, controls, etc. The intention was to export the front and back faces as separately printable parts, and the main body as a hollow “sleeve” into which the electronics would fit.

SqueezePiFullSqueezePiBody

This seemed like a good idea, until I got further into researching 3D printing – confirmed by feedback from a commercial “on-demand” 3D printing firm. The size of just the front face of my enclosure meant it would be expensive (£60) and risky (likely to distort) to print.

Then I came across laser cutting services – and this looked like exactly the thing! At least one of my Raspberry Pi cases had been made with laser-cut acrylic, and I also found more designs online for electronics enclosures made from acrylic, plywood and MDF. I found some great tutorials and a local cutting service with starting-out advice and templates.

I went back to Sketchup and revised my design, producing a box whose faces fit together using interlocking “crenellation” and a bit of tab-and-slot. No screws or glue needed!

SqueezePiLaserCutBodySqueezePiSide

In the pictures above, I’ve also cut out holes for speakers, screen, control panel and speaker surround mounts (cannibalised from the original USB active speakers). The image on the right is a face-on view of the right side, with tabs for connecting the front face and a hole for the 2.1mm power jack in the bottom right.

I decided to follow the advice I'd read on various sites, and prototype the design with the cheapest material – cardboard! I first printed each face out on paper at the right size, stuck the printouts onto cardboard, carefully cut them out with a craft knife and put it all together – and it worked! It was a bit fiddly to assemble, and I used a bit of Sellotape, glue and a few internal cardboard strips to help rigidity. That lack of rigidity wouldn't be a problem for a final acrylic or plywood version.

IMG_20140719_202421

So as a trial run, I exported the faces as DXF files from Sketchup, brought them into Inkscape and followed Razorlab’s instructions on how to set them up for cutting (using their template sizes and styles). I adjusted my crenellations and tabs to allow a little extra width to account for material lost by the cutting action of the laser (kerfing).

I sent these off and in a week got back the first cut – it went together like a dream! The cardboard was a lot more rigid than my first homemade version (no glue to dampen it), so assembly was easier.

IMG_20140721_194309

The crenellation technique seemed to work well, and the general sizing and location of holes was accurate enough to facilitate a working mock-up assembly.

I tried several iterations of the front panel over the next few weeks via Razorlab, all still in cardboard – until settling on the design in the earlier Sketchup screenshot.

Then it was time to get real… I put in the order for a black acrylic version, uploaded the design and waited…

Meanwhile

All this trial assembly of the components with various cases unfortunately took its toll on a weak point – the wires connecting the speakers to the amp. These were thin, and never designed to be manipulated – so they broke very easily. I re-cut and re-soldered them a few times, then decided to replace them with thicker wires and stronger soldering. However at this point, some of the solder pads on the amp circuit board lifted away from the board and broke off – again, they were never designed for repeated re-soldering.

Adafruit and ModMyPi came to the rescue in the form of the Adafruit TS2012 class D mini amplifier. This was just the ticket – easy to assemble, the right kind of power output and input – and worked like a charm. I just had to solder on the connectors, wire up a new cable to take the output from the USB audio dongle, and splice a connection to the power jack.

IMG_20140814_215007

The Coming of Acrylic

Shortly after resolving the amplification woes, the acrylic case parts arrived from Razorlab. With nervous excitement, I unpacked it, checked the component fit and tried assembling it…

IMG_20140820_213714

It wasn’t without some worrying moments:

  • The fit tolerance of the crenellations and slots/tabs was much tighter – because acrylic is obviously less flexible than cardboard. I also think that the laser cutting may have burned away less material, so I probably didn’t need to compensate in my kerfing quite as much. Some careful filing eased the fitting.
  • I’d left screw holes out of my design, assuming I could drill them OK. This turned out to be scary, again because of the rigidity and brittle nature of acrylic. I managed in the end, but did cause some cracking (thankfully covered by one of the speaker surrounds).
  • My lower “springy” tabs on the sides proved too fragile in real use and broke off – fortunately, they also turned out not to be vital.
  • One corner crenellation was also a little too small on the front panel, and also broke off – again not a vital part. To be robust, the minimum dimension for a part probably needs to be 4 mm or higher (this was 3 mm x 6 mm).

If I were to make this again, I would:

  • Allow less for kerfing, or disregard adjusting it altogether. My adjustment was 0.1 mm each side of a tab, so I think I would probably try again with no adjustment.
  • Let the laser cut screw holes where possible – or if they’re too small, get it to etch a mark on the right position.
  • Ensure no feature is smaller than 4 mm.

I then proceeded with final assembly, which was a bit fiddly (small screws, nuts, washers, etc) but pretty much went according to plan.

IMG_20140823_174912 IMG_20140830_121028 IMG_20140830_121041 IMG_20140829_192321

Testing and Bumps in the Road

Next up, some testing in the real world. It worked fine “in the lab”, so I tried it out around the house over a weekend – and it stood up rather well. Sound quality occasionally got a bit boomy and distorted – but adjusting the amp settings resolved this (I’d confused the onboard DIP switch settings as their action turned out to be the reverse of what I expected).

The only way to properly power off the device was to unplug it – this isn’t a good idea with the stock Pi operating system, as it can cause corruption to the SD card. So I did some research and reconfigured it to boot up in read-only mode, writing log and temporary files to a RAM disc.
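The general shape of that read-only setup (paths and sizes here are illustrative, not my exact configuration) is to remount the root filesystem read-only and point the volatile directories at tmpfs, e.g. in /etc/fstab:

```shell
# /etc/fstab – illustrative read-only root with RAM-backed scratch areas
/dev/mmcblk0p2  /         ext4   defaults,ro,noatime   0  1
tmpfs           /tmp      tmpfs  defaults,size=16m     0  0
tmpfs           /var/log  tmpfs  defaults,size=16m     0  0
```

With this in place, nothing is ever written to the SD card in normal operation, so pulling the plug is safe.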

Somewhere along the line when fiddling around in the internals during testing, I managed to kill the left audio channel. This turned out to be on the amp, so I ordered a replacement from ModMyPi. That arrived promptly, but also turned out to be faulty (DIP switches missing). ModMyPi were brilliant about that and sent me a replacement the very next day.

When the Pi boots up, and when it shuts down, strange noise comes through the analogue audio circuit. This happens when the USB audio dongle is in an uninitialised state (i.e. before the drivers are started, and after they’re shut down). It’s not the end of the world, just a little unpleasant.

Occasionally, the touch screen either doesn’t work after boot up or stops working after a while. I suspect my soldering might be the cause here…

And also occasionally the Pi doesn’t boot up. I haven’t pinned down the exact cause, but experiments suggest it may be related to my custom control board.

What’s Next?

I plan to tackle the following over time:

  • Sort out the bootup/shutdown noise.
  • Improve boot speed and experience – get some sort of boot image on the screen.
  • Sort out the intermittent boot up and touchscreen failures.
  • Improve the custom control board – use better components and a custom PCB.

(c) 2014 Nicholas Tuckett

SqueezePi: The Joy of lite Jive, and custom controls

Community Power

There’s a great community that has grown up around Slim Devices’ Squeezebox products. I think that stems from the company’s very open approach, with a wiki and forums that are still very active today – despite the company’s assimilation into Logitech, and the discontinuation of its devices.

It was through these resources that I discovered JiveLite, Squeezebox controller software created by members of this community. This is a very fully featured app, written in a mixture of C and Lua with full source available, that runs on Linux. With a little digging and interpretation of the available instructions, I built and ran it on my Pi connected to a conventional LCD monitor via HDMI.

The next challenge was to get it working with the little 2.8″ LCD. Fortunately others had been there before me, so that didn’t take long – and with a little extension to the settings, I also got JiveLite to respond to the touch screen. This was proving a bit too easy…

Other People’s Software

Yes, it was a little too easy. It turned out that JiveLite had a few issues – some minor ones arising from using a very small display, some in the build configuration files and a crash when navigating music collections (e.g. when selecting by artist). I’m jumping ahead a bit here… I’ve now fixed these issues in a clone of the original codebase, and made it public. Hopefully I can persuade the originators to consider taking some of these back into the main repository.

I also added some extras – the necessary user interface data files for the small size LCD, the ability to turn off the mouse pointer for touch screen use, and the ability to control GPIO pins from the Lua script. The latter I used to modify one of the screen-savers, so it would turn off the LCD backlight when active to save some power.

Prototyping Custom Controls

Inspired by these strong steps forward, I decided to try driving JiveLite via controls attached to the Pi via GPIO pins. You can see what my efforts look like in the picture below. I used an Adafruit Pi Cobbler kit to connect my Pi to the breadboard, to make it easier to wire up the GPIOs to the control components.

IMG_20140614_162041 (1)

Yet more research turned up another really useful bit of Pi software made for this purpose – pikeyd. This package runs in the background, polling GPIO pins connected to switches and generating keystrokes when the switches are closed. So it was a relatively easy task to find out the keys that control JiveLite and configure pikeyd to generate those from GPIO inputs. Et voila, those little switches under the LCD were usable for navigating the menus, playing tracks and changing the volume.
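pikeyd itself is written in C and injects real keystrokes via the uinput device, but its core behaviour can be sketched in a few lines of Python. The pin numbers, key names and `read_pin` callback here are all invented for illustration:

```python
# Sketch of a pikeyd-style poll loop: watch active-low switches on GPIO
# pins and emit a key name on each closing (1 -> 0) edge.
PIN_TO_KEY = {17: "KEY_UP", 22: "KEY_DOWN", 23: "KEY_ENTER"}  # hypothetical mapping

def poll_switches(read_pin, previous):
    """Compare current pin levels against the previous sample; return
    (events, new_state). Pins default to 1 (pulled up, switch open)."""
    events = []
    current = {}
    for pin, key in PIN_TO_KEY.items():
        level = read_pin(pin)
        if previous.get(pin, 1) == 1 and level == 0:
            events.append(key)
        current[pin] = level
    return events, current
```

Calling `poll_switches` repeatedly with the latest GPIO readings yields one event per press, and holding a switch down produces no repeats – which is exactly the behaviour you want before mapping the events to keystrokes.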

Next I wanted to get a rotary encoder control to work. My Squeezebox Radio has one of these, which is used primarily to scroll up and down through menus and make choices by pressing on it. Rotary control components are relatively cheap and easy to obtain – both RS and Farnell carry a bewildering range. I got a couple of cheap ones from here. They’re okay quality for this project; however you could wind up spending a lot if you want really top quality rotary controls!

These SparkFun encoders act as a pair of switches that make and break contact with ground, out of phase with each other, as you turn the control shaft. By wiring the two “output” pins of the encoder via pull-up resistors to GPIO pins on the Pi, those pins will alternate between low and high in a pattern known as a Gray code, and the direction of rotation can be determined by tracking how the pin values change over time. So you can turn rotation of the controller into steps in the two different directions, and map those steps into changing something in your software (e.g. the current selection in a menu). This tutorial for Arduino provides more detail.
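The tracking step boils down to a small lookup table: each valid (previous, current) pair of the two-bit Gray code maps to a step of +1 or -1, and anything else (a missed or bouncy sample) counts as 0. A minimal, hardware-free sketch, with pin polarity assumed for illustration:

```python
# Quadrature decoding. A state packs the two encoder pins as (A << 1) | B;
# the Gray-code sequence for one direction is 00 -> 01 -> 11 -> 10 -> 00.
STEP = {
    (0, 1): +1, (1, 3): +1, (3, 2): +1, (2, 0): +1,  # clockwise transitions
    (0, 2): -1, (2, 3): -1, (3, 1): -1, (1, 0): -1,  # anticlockwise transitions
}

def decode(samples):
    """Accumulate net rotation steps from a sequence of (A, B) pin samples."""
    total = 0
    prev = (samples[0][0] << 1) | samples[0][1]
    for a, b in samples[1:]:
        cur = (a << 1) | b
        total += STEP.get((prev, cur), 0)  # invalid transitions contribute 0
        prev = cur
    return total
```

Feeding it one full clockwise cycle of samples yields +4 steps, and the reversed sequence yields -4, which is all the software needs to turn rotation into menu movement.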

Unfortunately, pikeyd doesn’t support rotary encoders, just simple switches on GPIO. But it wasn’t hard to add this support… you can find my modified fork of the software here for public consumption. Once pikeyd had gained the ability to map rotary encoder steps into key presses (two per encoder, one for each rotation direction), I added a configuration option to set up the internal pull-up resistors on the Pi’s GPIO pins, saving extra components and wiring.

With my modified pikeyd configured to generate the corresponding keys from a rotary control, I could now navigate the JiveLite interface through it. There was also a built-in push switch on the controller, which I wired up to another GPIO pin to act as a “select” key stroke when pushed.

You can see all of it working together on this video.

The Control Board

With all of this prototyping under my belt, I went on to build a first working version of the controls for the SqueezePi – two rotary encoders (one for navigation, one for volume) and four switches for “home”, “back”, “on/off” and “play”. I constructed this on a thin strip of Veroboard with tracks like a breadboard, and attached a couple of single-track strips top and bottom running at ninety degrees as ground tracks. I also used another (cheap) Pi cobbler from Maplin soldered onto the Veroboard as a quick way of attaching the Pi GPIOs via ribbon cable.

IMG_20140629_133259IMG_20140629_133317

Extending my custom pikeyd configuration to read these controls was a walk in the park.

Next, I’ll go on to describe what I did for audio output and the process of creating the case.

Digression #1: SqueezePi

I’ve wandered away a bit from Pi synthesiser exploration, and spent some time on another Pi audio project – the SqueezePi!

IMG_20140829_192321

As a long-term owner and user of a couple of Slim Devices Squeezebox networked digital audio players, I’ve been feeling let down by Logitech’s purchase of that company and its subsequent discontinuation of the products. So I set myself the challenge to see if I could create an equivalent to the Squeezebox Radio or Boom players – taking inspiration from the work of others on open source versions of the software, and the efforts to get that running on the Pi.

In this post and subsequent ones, I’ll provide a potted, illustrated history.

First steps: Software and Key Goals

Fairly quickly, I discovered SqueezeLite. This is a free Squeezebox compatible player that has been successfully ported to the Pi in various Linux distributions. There’s a good thread about it on the Pi forums. I used a prebuilt (now deprecated) binary version.

It worked straight away on the Pi with my existing Squeezebox server – and I’m using it still (I should really attempt to build it from source soon, though).

So that got me off to a good start. Now some forward planning was in order…

To be as standalone as the real Squeezebox players, I was going to need some sort of dedicated display on the Pi. Also some push switches and a rotary control, analogue audio output to amplification and speakers, a power supply to drive the Pi and amplifier, and a nice case to package it all up. I drew up that list and began investigating my options.

Display

I was already aware of a wide range of dedicated displays for the Pi – many easily affordable and fairly easy to set up and use. But to be like the original Radio player, the display would have to be colour, which led me to research the various small TFT LCD displays for the Pi. So I decided to try the Adafruit 2.8″ TFT LCD which had the added bonus of a (resistive) touch screen – so my SqueezePi would be a blend of the Radio, Boom and Touch players!

I ordered one from the marvellous folks at ModMyPi, and on its prompt arrival dusted off my soldering skills and put it together. This was a little nerve-wracking, but ultimately more straightforward than I feared! The Adafruit site provided all the info and software needed, and I easily had it up and running in under an afternoon. I also soldered some small push switches onto the display’s circuit board that I could try as dedicated controls for the SqueezePi.

With a little research, I was able to write a simple prototype UI in Python that pulled information from my Squeezebox Server via PyLMS, rendered it with PyGame, and controlled playback on the Pi’s instance of SqueezeLite. It was straightforward to get the touchscreen working via PyGame, and I used the RPIO Python GPIO library to add support for those little push switches – et voilà, custom dedicated controls worked!
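The touchscreen side of such a UI largely reduces to hit-testing touch coordinates against on-screen button rectangles. A minimal, framework-free sketch of that logic – the layout and action names below are invented for illustration, not taken from my prototype:

```python
# Hypothetical button layout for a 320x240 screen: (x, y, w, h) -> action name.
BUTTONS = {
    (10, 190, 90, 40):  "prev",
    (115, 190, 90, 40): "play_pause",
    (220, 190, 90, 40): "next",
}

def action_at(x, y):
    """Return the action under a touch point, or None if no button was hit."""
    for (bx, by, bw, bh), action in BUTTONS.items():
        if bx <= x < bx + bw and by <= y < by + bh:
            return action
    return None
```

In the PyGame event loop, each touch event's coordinates would be passed through a function like this, and the resulting action sent to the player as a server command.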

IMG_20140531_184444

In the image above, you can also see the first part of the audio stage – a “no-name” brand cheap USB audio card.

This was great as a prototype, but it fell way short of a Squeezebox-like experience. So next I had to either do more research into open source options, or start some serious UI coding…

(c) 2014 Nick Tuckett.

Beginnings of a more friendly user interface

To make at least my life easier, I’ve added some new UI features to the Pithesiser – namely text rendering and screenshot taking.

Now I can render the values of settings in a straightforward fashion; next I need a nice data-driven way to set up display of all the settings, as there are a fair few already and will be more soon.

Image

All the UI rendering is via Open VG, including the anti-aliased text. It’s done on a separate thread which is sent “events” by the main thread to cause re-rendering of portions of the UI when changes occur.

To drive the oscilloscope display, each newly mixed buffer of sound triggers an event that causes the render thread to add line segments for the sample data onto the oscilloscope path. And when the path fills the display area, it triggers rendering of the path and a buffer swap to update the screen.

Other UI elements receive refresh events when their data changes; these events mark the elements as “dirty” which causes them to be redrawn on the next oscilloscope triggered buffer swap.
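Stripped of the OpenVG specifics, the dirty-flag scheme described above amounts to something like this – a hypothetical Python reduction, where `render` stands in for the actual OpenVG drawing:

```python
from queue import Queue

class Element:
    """A UI element that can be marked dirty and redrawn on demand."""
    def __init__(self, name):
        self.name = name
        self.dirty = False

    def render(self):
        self.dirty = False
        return self.name  # stand-in for real OpenVG drawing calls

def render_step(events, elements):
    """Drain pending refresh events, mark the named elements dirty, then
    redraw only the dirty ones on the next (oscilloscope-triggered) swap."""
    while not events.empty():
        elements[events.get()].dirty = True
    return [e.render() for e in elements.values() if e.dirty]
```

The main thread only ever posts element names onto the queue; all drawing stays on the render thread, and elements that haven't changed cost nothing on each swap.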

(c) 2013 Nicholas Tuckett