Raspberry Shake: IoT for your boring appliances

My latest project (now I have some spare time again!) has been something quite simple – I have a terrible habit of putting the washing machine on and then forgetting about it for the rest of the day, leaving a load of wet clothes inside to fester for hours on end. I know some washers and dryers come with alarms that beep at you when they’re finished, and I wanted to emulate this with an Internet of Things vibe.

So I came up with the Raspberry Shake – quite simply it’s an accelerometer connected up to a Raspberry Pi Zero with some LEDs to indicate status, all shoved inside a small box with some magnets attached, so it can cling to the side of any appliance. The Pi Zero runs a bit of Python code that checks for any movement, and sends notifications when the appliance starts and stops. I’ve made two so far, with plans for a third, and they’re working great!
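The detection logic itself is nothing clever – roughly along the lines of the sketch below (the read_axes helper, threshold and timings here are placeholders rather than the real code): the appliance is “running” while the accelerometer keeps wobbling, and “finished” once it has been still for a few minutes.

# Rough sketch of the Raspberry Shake detection loop - helper name, threshold and timings are placeholders
import time

MOVEMENT_THRESHOLD = 0.05 # change (in g) between readings that counts as "moving" - tune per appliance
QUIET_PERIOD = 300 # seconds of stillness before we declare the cycle finished

def monitor(sensor, notify):
   last = sensor.read_axes() # hypothetical (x, y, z) reading from the accelerometer
   running = False
   last_movement = time.time()

   while True:
      reading = sensor.read_axes()
      moved = sum(abs(a - b) for a, b in zip(reading, last)) > MOVEMENT_THRESHOLD
      last = reading

      if moved:
         last_movement = time.time()
         if not running:
            running = True
            notify("Appliance started")
      elif running and time.time() - last_movement > QUIET_PERIOD:
         running = False
         notify("Appliance finished")

      time.sleep(1)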

You can see a full writeup and a video of the build on the Raspberry Shake project page.

Talking to a LIS3DH via Python on a Raspberry Pi

For my latest project (details available here) I acquired a couple of LIS3DH triple-axis accelerometers. As most of the products available through Adafruit are fairly widely used, I didn’t bother checking what libraries were available before buying, but unfortunately for me only a C++ library had been written. I didn’t feel like learning C++ just for the purpose of this project, and so the only option left was to write my own Python library!

Thankfully I had some excellent starting points with the aforementioned C++ library, as well as the Python I2C library that Adafruit have published. I found myself referring back to the manufacturer datasheet quite often as well, mainly to clarify what each register contained.

While the task initially looked rather daunting (I had zero prior experience of bit-bashing through registers), I found that with some pre-existing code to crib from, the various functions took shape rather quickly, and within an afternoon I’d produced a library exposing all the basic functions I’m likely to need for this project. I’ve put my code on GitHub in the hope that people will contribute to filling in the gaps, and improving where necessary.
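The full library is on GitHub, but to give a flavour of what the register work involves, here’s a minimal sketch that checks the WHO_AM_I register and enables all three axes. The register addresses and values come straight from the ST datasheet; the device address of 0x18 and I2C bus 1 are assumptions that may differ depending on your wiring and Pi revision.

# Minimal sketch of talking to a LIS3DH over I2C - not the library itself
import smbus

LIS3DH_ADDRESS = 0x18 # default address; 0x19 if the SDO pin is pulled high
REG_WHO_AM_I   = 0x0F # identity register, always reads 0x33 on a LIS3DH
REG_CTRL_REG1  = 0x20 # data rate and axis-enable register

bus = smbus.SMBus(1) # bus 1 on newer Pis, bus 0 on the original revision

# Make sure we're actually talking to a LIS3DH before poking at it
assert bus.read_byte_data(LIS3DH_ADDRESS, REG_WHO_AM_I) == 0x33

# 0x57 = 100Hz data rate, normal power mode, X/Y/Z axes all enabled
bus.write_byte_data(LIS3DH_ADDRESS, REG_CTRL_REG1, 0x57)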

AlarmPi: The Raspberry Pi Smart Alarm Clock

When I left my previous job around 18 months ago, I promised myself I’d do something productive with the time I had between jobs. During that time, I realised how much I hated my alarm clock going off every morning, and also how stupid and inflexible most alarm clocks are. I managed to achieve very little with that spare time, but this hatred of alarm clocks has been driven home even further since I started working shifts in my new job – no alarm clock I could find was able to vary the alarm time based on a shift pattern (I suppose that’s a fairly niche feature!), and very few had decent internet radio connectivity to let me listen to music I like in the morning.

That productive feeling drew me to buy some parts from Adafruit and have a play with some electronics projects – the furthest I got was playing around with an LCD display, as documented in this other blog post. More recently, my old alarm clock started to fail in rather interesting ways (ever been woken up at 3:27AM by a piercing screech of static?), so I decided it was time to build my own, and the AlarmPi was born!

The core of the project is a Raspberry Pi connected up to a series of fairly basic components, all controlled by a Python script which takes input from all manner of sources and shows information through the two front displays. I’ve put together a short video explaining some of the main features, which can be viewed below, and you can read more about the AlarmPi on the project page.

Text to Speech on a Raspberry Pi using Google Translate

For a couple of upcoming projects, I’ve been trying to find a way of making a Raspberry Pi take a piece of text as input and vocalise it through a pair of connected speakers (so-called speech synthesis). There are a number of methods listed on the eLinux wiki page on the subject, however I found the suggested packages produced rather robotic-sounding results, and I was after something a bit more natural and pleasant, rather than something to scare the bejeezus out of me every time it speaks. The most natural-sounding offering is a hidden and unofficial API provided through the Google Translate service, which produces some very nice sounding audio and is very accurate most of the time. Unfortunately, it’s limited to 100 characters at a time, which starts to be a problem when you want to read out large swathes of text.

There are a few scripts that I found (including this one from Dan Fountain) that offer an interface to this API, however the majority of them just split the input at the 100-character mark (or at the last space before it), which leads to broken-sounding sentences in cases where the existing punctuation could have been used instead. In order to get something slightly more natural sounding, I set about bodging together some Python, and came up with the following:

Please note: this script no longer works! Google made some changes to their TTS engine during July 2015, meaning the translate_tts request now gets redirected to a CAPTCHA page. There is an updated version of the script available in my SVN repository, and now on GitHub as well.

#!/usr/bin/python

# googletts
# Created by Matt Dyson (mattdyson.org)
# http://mattdyson.org/blog/2014/07/text-to-speech-on-a-raspberry-pi-using-google-translate/
# Some inspiration taken from http://danfountain.com/2013/03/raspberry-pi-text-to-speech/

# Version 1.0 (12/07/14)

# Process some text input from our arguments, and then pass it to the Google Translate engine
# for Text-To-Speech in nicely formatted chunks (the API cannot handle more than 100
# characters at a time).
# Splitting is done first by any punctuation (.,;:) and then by splitting by the MAX_LEN defined
# below.
# mpg123 is required for playing the resultant MP3 file that is returned by Google TTS

from subprocess import call
import sys
import re

MAX_LEN = 100 # Maximum length of a segment to send to Google for TTS
LANGUAGE = "en" # Language to use with TTS - this won't do any translation, just the voice it's spoken with

fullMsg = ""
i = 1

# Read our system arguments and add them into a single string
while i<len(sys.argv):
   fullMsg += sys.argv[i] + " "
   i+=1

# Split our full text by any available punctuation
parts = re.split(r"[.,;:]", fullMsg)

# The final list of parts to send to Google TTS
processedParts = []

while len(parts)>0: # While we have parts to process
   part = parts.pop(0) # Get first entry from our list

   if len(part)>MAX_LEN:
      # We need to do some cutting
      cutAt = part.rfind(" ",0,MAX_LEN) # Find the last space within the bounds of our MAX_LEN

      cut = part[:cutAt]

      # We need to process the remainder of this part next,
      # so push it back onto the front of the queue
      parts.insert(0, part[cutAt:])
   else:
      # No cutting needed
      cut = part

   cut = cut.strip() # Strip any whitespace
   if cut != "": # Make sure there's something left to read
      # Add into our final list
      processedParts.append(cut)

for part in processedParts:
   # Use mpg123 to play the resultant MP3 file from Google TTS
   call(["mpg123","-q","http://translate.google.com/translate_tts?tl=%s&q=%s" % (LANGUAGE,part)])

This can also be downloaded from my projects repository at http://projects.mattdyson.org/projects/speech/googletts, where updated versions may be available. The package mpg123 is required to play the resulting MP3 file that Google Translate returns. The easiest way to get this script installed is with the following (run as root on your Raspberry Pi):

$ apt-get install mpg123
$ cd /usr/bin/
$ svn co http://projects.mattdyson.org/projects/speech speech
$ chmod +x speech/googletts
$ ln -s speech/googletts
$ googletts "Hello world, the installation of the text to speech script is now complete"

Unfortunately, if a clause of a sentence is longer than 100 characters there will still be an unwanted pause in the middle, as the script does not know where best to split the text, and if you’re using a lot of punctuation you might find the text takes a long time to read back. I’d welcome any improvements people may suggest!

Blinkytape

Yet another one of my Kickstarter jaunts turned up just before Christmas – the Blinkytape by BlinkinLabs. Essentially, this product is a strip of 60 LEDs connected to a USB interface, which allows you to address each “pixel” individually through a little bit of coding, so you can build up your own programmable lighting show! So far I’ve only had the chance to use this as a very nerdy alternative to Christmas lighting, and more generally for expanding my knowledge of Python, but I’ve got big plans for it in future!

First up – getting started. I decided to use this in conjunction with a Raspberry Pi I had going spare from another project, as it gives me network connectivity and a platform to write and run Python scripts on. Conveniently, no powered external USB hub is required to run the Blinkytape off a Pi (though I had no other peripherals plugged in, so your mileage may vary!), so it was just a case of plugging it in and installing the necessary Python libraries:

$ sudo apt-get install python-pip
$ sudo pip install pyserial
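To give an idea of how simple the tape is to drive once pyserial is in place, the sketch below pushes a single frame straight down the serial port. It assumes the tape shows up as /dev/ttyACM0 and follows the BlinkyTape protocol of three bytes per pixel (each capped at 254) with a 255 byte to latch the frame – my own class just wraps this sort of thing up with a few conveniences.

# Minimal sketch of driving the BlinkyTape directly with pyserial
# Assumes the tape enumerates as /dev/ttyACM0 - check dmesg if yours differs
import serial

LED_COUNT = 60
tape = serial.Serial("/dev/ttyACM0", 115200)

def show(pixels):
   # Send a full frame of (r, g, b) tuples, then latch it onto the LEDs
   data = bytearray()
   for r, g, b in pixels:
      # 255 is reserved as the end-of-frame marker, so cap each channel at 254
      data.extend((min(r, 254), min(g, 254), min(b, 254)))
   data.append(255) # end-of-frame: display what we just sent
   tape.write(bytes(data))
   tape.flush()

# Light the whole strip a dim red
show([(50, 0, 0)] * LED_COUNT)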

There is an official Blinkytape python library available from their GitHub repository (along with some other languages), however at the time when I was playing with this (before Christmas) their base class was lacking a lot of features – so I wrote my own! To get my integration script, run the following:

$ svn co http://projects.mattdyson.org/projects/blinkytape blinkytape

This will give you the main class (BlinkyTapeV2.py) and a couple of example files, all of which are commented in a (hopefully!) helpful manner to show what’s going on. The following video shows an example of the BouncingBlocks.py class in action (by running sudo python BouncingBlocks.py) followed by a more ‘festive’ example, something I knocked together very quickly to cycle through a series of effects in very Christmas-y red and green colours!

Overall, I’m very impressed by the quality of this product. I was expecting something very rough-and-ready, being a rather specialist product marketed through Kickstarter – however the LEDs themselves are very bright, and nicely packaged up in a plastic flexible strip in order to protect the circuitry. The ease with which I managed to write my own integration library is also a testament to how simple the electronic design of this product is.

So what am I planning on using this for? First up, I’m looking at building my own alarm clock that reads from Google Calendar to only wake me up when I need to be up – normal alarms don’t seem to have been built with shift work in mind! I’m hoping to integrate the Blinkytape into this project by creating an ambient light that gradually fades up after the alarm has gone off, hopefully easing the transition into daylight hours! There are also plenty of projects I was hoping to do with a Moore’sCloud Light, another Kickstarter project that sadly failed to meet its funding goal, but hopefully Blinkytape will fill the void! I’ll make sure to post back here with further updates when my Blinkytape gets put to use!

Kano: ICT education, easy as Pi!

I stumbled across the Kano Kickstarter project this evening, and felt compelled to take to this blog in order to say what an excellent idea this really is!

There has been a lot of negative media coverage over the state of ICT education in the UK, and from my perspective this seems fairly justified. As far as I can remember, none of my ICT teachers in high school actually had any qualification in the field, and only one or two had any relevant experience to bring to the classroom. The majority were almost completely unaware of anything beyond the allocated syllabus, but it only took one particular teacher (who has influenced my career path much more than anyone will ever realise!) with a passion for programming and the subject in general to get me hooked. It’s worth noting that this inspiration didn’t come from the taught subject matter itself; it was the extra-curricular activities that really got me started in the field. ICT will continue to be a niche subject until the curriculum is updated to actively engage kids, rather than subjecting them to endless lessons on dry subjects like network architectures and database schemas. I may be biased as a kinaesthetic learner, but I think the best way to get kids to engage with and learn this subject is by getting hands-on.

The Raspberry Pi Foundation have done a fantastic job bringing a cheap (£30) computer within reach of everyone. I own a couple myself for general tinkering and hacking about, and can honestly say it’s the main reason I started playing with electronics, and gave me the confidence to start on a whole raft of new projects (such as this) which I never would have considered before. However, sold as raw components as it is, the machine can seem weird and scary, out of reach of the majority of educators and parents. While this is not a failing of the Foundation itself (I think they’ve been slightly overwhelmed by demand from the hobbyist sector), it’s crying out for someone to take this excellent system and package it in a more friendly way. Enter Kano.

Kano appears on the face of things to be a very simple project – they’re packaging up the Pi with the majority of peripherals needed to run it, and crucially they’re including kid-friendly instructions on how to get the whole thing working. Their use of the phrase “Simple as Lego” really struck a chord with me – that’s exactly the right way to approach this kind of teaching, by letting the kids play, hack around and figure it out themselves.

I really hope that the guys behind Kano take some of the money they’ve made from this project (it’s already three times over their target as I write this, with another 27 days left to run!) and take these kits into schools at a lower per-unit cost for educational use, just to make them a truly irresistible purchase for any ICT department. I genuinely believe that giving kids access to this kind of kit as part of their curriculum will not only educate them, but will help inspire a future generation of hacker nerds – and that’s no bad thing in my view!

Using 20×4 RGB LCD over i2c with a Raspberry Pi

Now there’s a specialist blog post title if ever there were one…

Recently, I’ve been dabbling with electronics to fill the void of spare time I’ve found myself with while I’m between jobs. I’m currently working on a half-baked idea to create some sort of digital assistant that will take instructions in some form, and then read stuff back to me in a Siri-esque manner. Nothing sounds more awesome than having Twitter @replies read out to you, right?! To kick off this project, and get me motivated to actually do something, I ordered a boatload of parts from Adafruit and set about learning how to use them. First challenge – connecting up their 20×4 RGB backlight negative LCD screen to my Raspberry Pi.

In order to assist with this, I also bought the i2c / SPI character LCD backpack to save some GPIO pins for other uses. Due to my lack of attention while ordering, I failed to notice that the LCD backpack only has 16 pins, whereas the LCD screen I ordered has 18 (two more for the extra backlight LEDs). Rather than giving up and being limited to only a single channel of control for the backlights, I decided to connect the backlight pins (15 to 18) up separately, and mash two separate libraries together to give myself full control. This is what I ended up with (click for big):

[Photo of the finished wiring]

Now, that looks like an absolute mess. That’s because it is. In an attempt to make that a bit more readable, here’s a Fritzing diagram of how it’s wired (again, click for big).

[Fritzing wiring diagram]

Now, that’s even more confusing, as I couldn’t find a Fritzing library with the right parts – so I’ve fudged a few things. Imagine there are pins 17 and 18 on the LCD, and that the LCD itself is 20×4 rather than 16×2. Secondly, imagine the chip in the middle is actually the i2c backpack mentioned above, so everything on the bottom is connected straight to pins 1 to 16 on the LCD, and the VCC/GND/CLK/DAT are connected to the Pi. So, in terms of wiring we get:

  • LCD #1 to #14 -> i2c backpack #1 to #14
  • LCD #15 -> 5V0
  • LCD #16 -> Raspberry Pi GPIO 17
  • LCD #17 -> Raspberry Pi GPIO 27
  • LCD #18 -> Raspberry Pi GPIO 22
  • i2c backpack GND -> GND
  • i2c backpack VCC -> 5V0
  • i2c backpack CLK -> Raspberry Pi SCL
  • i2c backpack DAT -> Raspberry Pi SDA

Now that’s all set up, you can use the standard AdafruitLcd Python library (the nice adaptation I used can be found here) to control the text shown on screen, but we need something bespoke for our backlighting. For future projects, I wanted the ability to control each colour individually, so I can set arbitrary RGB values on the screen, and also brighten/dim appropriately. The latest version of RPi.GPIO will let you do software Pulse Width Modulation (PWM), which achieves this quite nicely for us. To install the latest version (0.5.2a at the time of writing), you’ll need to run the following on your Pi (as root):

$ wget https://pypi.python.org/packages/source/R/RPi.GPIO/RPi.GPIO-0.5.2a.tar.gz
$ tar xf RPi.GPIO-0.5.2a.tar.gz
$ cd RPi.GPIO-0.5.2a
$ python setup.py install
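The software PWM side boils down to something like the sketch below, which drives the three backlight channels using the GPIO numbers from the wiring list above. Note the duty cycle is inverted – pin 15 of the LCD is the shared backlight anode on 5V, so the GPIO pins sink current and 100% duty means “off”. This is a rough illustration rather than the actual library code:

# Rough sketch of software PWM on the three backlight pins (BCM numbering, as wired above)
import time
import RPi.GPIO as GPIO

RED, GREEN, BLUE = 17, 27, 22

GPIO.setmode(GPIO.BCM)
pwms = []
for pin in (RED, GREEN, BLUE):
   GPIO.setup(pin, GPIO.OUT)
   p = GPIO.PWM(pin, 100) # 100Hz software PWM on each colour channel
   p.start(100)           # LEDs are sunk by the GPIO, so 100% duty = off
   pwms.append(p)

def setColour(r, g, b):
   # Values 0-100 per channel, inverted because of the common-anode wiring
   for pwm, value in zip(pwms, (r, g, b)):
      pwm.ChangeDutyCycle(100 - value)

setColour(100, 0, 50) # mostly red with a touch of blue
time.sleep(5)
GPIO.cleanup()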

So, combining some standard example code for PWM on the Pi with the AdafruitLcd library, I developed my own little library for controlling an LCD wired up in this manner. To get up and running with the code I wrote, you will need (again, as root):

$ mkdir lcdtest
$ cd lcdtest
$ svn co http://projects.mattdyson.org/projects/LCDControl@889 .
$ git clone https://github.com/PDKK/RpiLcdBackpack.git
$ touch RpiLcdBackpack/__init__.py
$ python testLCD.py

Note: If you see IOError: [Errno 5] Input/output error when running testLCD.py, you may need to edit RpiLcdBackpack/RpiLcdBackpack.py and change the line self.__bus=smbus.SMBus(0) to self.__bus=smbus.SMBus(1). This should only happen on newer versions of the Pi, where the i2c bus number changed from 0 to 1.

Note 2 (added 15/10/14): The version of my LCDControl library that you’re checking out with the above command is now outdated. I’ve updated the library to use pigpio instead of RPi.GPIO, as the latter was causing me flickering problems when the Pi was under load. To get the latest version, remove the @889 from the svn co command; you will need to have pigpio installed and running for this to work.

Once you run testLCD.py, you should see the screen flash a series of colours, followed by some messages appearing on the screen. Yaaay – it works!

The LCDControl class I’ve written is pretty basic (I’m still learning Python… slowly!) but allows you to set RGB or individual colour values for the backlights, and also to pass in any message without worrying about formatting. Currently (version 1.0 at the time of writing), the LCDControl.setMessage method will split by the newline character (\n) and handle the logic regarding line numbers for you (as the third display line on the LCD is actually carried over from the first line passed to the controller, and the fourth from the second) – future iterations of this code will allow things such as full text wrapping and scrolling text.
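For anyone curious, that line juggling comes down to the fact that on these displays the third row is a continuation of the first row’s memory (and the fourth continues the second), so a four-line message has to be padded and re-ordered before it’s written out – something along these lines (a sketch, not the actual library code):

# Sketch of re-ordering a four-line message for a 20x4 display where
# row 3 continues row 1 in memory, and row 4 continues row 2
WIDTH = 20

def interleave(message):
   lines = message.split("\n")
   lines += [""] * (4 - len(lines)) # pad out to four lines
   padded = [line.ljust(WIDTH)[:WIDTH] for line in lines]
   # Write order: line 1, line 3 (its continuation), then line 2 and line 4
   return padded[0] + padded[2] + padded[1] + padded[3]

print(interleave("Hello\nWorld"))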

So there we have it – a 20×4 RGB LCD screen talking to a Raspberry Pi over i2c, retaining individual control over the background LEDs. As always, please leave a comment if you spot anything wrong with what I’ve written here, or have any feedback/suggestions/requests!

Using an Xbox 360 Wireless Controller with Raspberry Pi

As part of a project I’m working on at the moment (more information here) I’ve been attempting to get my Xbox 360 Wireless Controller for Windows talking to my Raspberry Pi. Having spent a fair amount of time chasing various options around the internet, I thought I would share my eventual (and rather simple) solution here.

The first thing I found was PyGame – a Python library that offers support for joysticks and gamepads, but is primarily designed for game development. This post suggested that PyGame may solve the connectivity problems, and gives some example code for echoing out events; however, I could not get this to work.

The Ubuntu wiki suggested a module called xpad, which is included by default on Ubuntu, but not on the Raspbian image I am using (Raspbian “wheezy”), although it is available through apt-get in the default repository. Unfortunately, this didn’t work for me either.

The eventual solution that worked for me came up in a blog post by Zephod about using an Xbox controller to run a remote control car, which suggested using Xboxdrv – a userspace driver for Xbox controllers on Linux. There were suggestions on the Raspberry Pi forums that this would require building from source, but a simple `apt-get install xboxdrv` on the Pi worked for me. Once installed, execute the program (as root), and then re-sync the Xbox controller – this had me stumped for quite a while, and seemingly only needs to be done the first time you attempt to use the controller after the module has been loaded. A re-sync for the wireless controller means holding the button on the receiver for ~3 seconds until it starts to flash, and then holding the sync button on top of the controller (to the right of where ‘Microsoft’ is written) until the lights flash. Once this has completed, you should see a new line on stdout for every event that happens on the controller – so press a button and see what happens!

This is the output I saw when starting up xboxdrv (having already done the sync) and pressing the “A” button on my controller – notice the A:1 changing to A:0 as I release the button (about 3/4 of the way across the terminal). Success! (Click for a larger image)

Zephod has written a small Python class to read the output of xboxdrv and present it in a more usable format – I’ve not yet had the chance to fully digest what it does and how it works (this is an exercise in me learning Python as well!), but it looks very promising, and I’m looking forward to continuing with my project!

Update 05/01/2013
I’ve had the chance to play around with the xbox_read.py file I linked above – and it works a treat! I’ve set up a quick test Python script that you can use to print out events. In order to use it, do:

$ git clone https://github.com/zephod/lego-pi.git legopi
$ touch legopi/__init__.py
$ wget http://projects.mattdyson.org/projects/robotarm/readEventTest.py
$ sudo python readEventTest.py

You may have to re-sync your controller as described above, and then once you start to move sticks/press buttons you should see a single line for each event! Magic!
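For reference, the test script doesn’t do much more than the sketch below. I’m writing this from memory, so treat the event_stream interface as an assumption and check the legopi source if it doesn’t line up:

# Minimal sketch of reading controller events via the legopi xbox_read module
# (the event_stream generator and its event objects are assumed - check the legopi source)
from legopi import xbox_read

# Small deadzone so tiny stick wobbles don't flood the output
for event in xbox_read.event_stream(deadzone=4000):
   print("%s: %s" % (event.key, event.value))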