Thursday, 28 September 2017

Pivarium: care for your reptiles with a Raspberry Pi

It's been a while since there’s been a new addition to the family and I’d forgotten how much preparation is involved. All the familiar concerns that I remembered from when the boys were born came flooding back: you need to make sure the new arrival is not too hot or cold, provide a safe environment that they can’t escape from, and ensure they’re safe at night...

Snakes, I've discovered, are a lot like children. Except easier to feed. 





Anyway, faced with such challenges there is only one response: make some stuff to help.

Of course, when the boys were born the Maker community wasn’t as well established, and the Raspberry Pi, the core of many IoT projects, had yet to be born itself. So this project gave me an opportunity to make up for it now. The Pivarium deals with the main requirements mentioned above.

Monitoring the environment inside the vivarium


Snakes are ectotherms - they have no internal means of regulating metabolic function and maintaining homeostasis. In cold weather, snakes tend to be sluggish as their metabolisms slow down, whereas in warm weather they tend to eat more and move more quickly. So in captivity they need to have a temperature gradient available so that they can move to a cooler place when they get too hot and vice versa. The Kernel's vivarium has a thermostatically controlled ceramic heat lamp at one end, but I wanted to monitor the temperature under it, as well as at the opposite end where there is more ventilation.
I opted for DS18B20 temperature sensors because they are cheap and reasonably water- (and hopefully snake-pee-) resistant. I was too lazy to convert the ends of the leads to female jumper sockets, so I just used screw terminals and a small breadboard. My excuse is that this will make it easier to replace a faulty sensor in future.
DS18B20 (left) next to probe from thermostat at 'hot end'

A DS18B20 at 'cool end'
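For reference, reading a DS18B20 from Python is just a matter of parsing the file that the 1-Wire driver exposes under /sys. Here is a minimal sketch rather than my exact code: it assumes the w1-gpio and w1-therm overlays are enabled and works for however many sensors are attached.

import glob

def read_ds18b20(device_dir):
    """Return the temperature in Celsius from a DS18B20's w1_slave file."""
    with open(device_dir + '/w1_slave') as f:
        lines = f.read().splitlines()
    if lines[0].strip().endswith('YES'):                 # CRC check passed
        return int(lines[1].split('t=')[1]) / 1000.0     # value is reported in milli-degrees C
    return None

# DS18B20 device IDs all start with '28-'
for sensor in glob.glob('/sys/bus/w1/devices/28-*'):
    print(sensor.split('/')[-1], read_ds18b20(sensor))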

The humidity in the Pivarium is also important, especially when the snake is shedding its skin, so I wanted to monitor that too. I opted for a DHT22, which records temperature as well and is more accurate than the cheaper DHT11. I placed it in the centre of the vivarium, attached to the roof: this gives me a third temperature reading and a good overall idea of what the gradient is like in the vivarium.
DHT22 mounted on roof
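The DHT22 can be read with the (now legacy) Adafruit_DHT Python library. A quick sketch of the idea; the GPIO 4 data pin below is an assumption, not necessarily the pin used in the build.

import Adafruit_DHT

DHT_PIN = 4   # assumed pin for the DHT22 data line

# read_retry re-tries a few times because DHT sensors occasionally fail to respond
humidity, temperature = Adafruit_DHT.read_retry(Adafruit_DHT.DHT22, DHT_PIN)
if humidity is not None:
    print('Temp: {0:.1f}C  Humidity: {1:.1f}%'.format(temperature, humidity))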

While I was testing and The Kernel was settling in,  I used a Pi Touchscreen to display a dynamic plot of the temperature and humidity values in real time. This was produced using Jupyter Notebook and the Python Matplotlib library. 
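The live plot is nothing fancy: the notebook just re-reads the sensors every few seconds and redraws the figure in place. A cut-down sketch showing the idea with only the DHT22 (run in Jupyter with %matplotlib inline; the full notebook plotted all three temperatures as well).

import time
from collections import deque

import Adafruit_DHT
import matplotlib.pyplot as plt
from IPython.display import clear_output

temps = deque(maxlen=120)     # keep roughly the last 10 minutes at 5 s per sample
humids = deque(maxlen=120)

while True:
    humidity, temperature = Adafruit_DHT.read_retry(Adafruit_DHT.DHT22, 4)   # GPIO 4 assumed
    temps.append(temperature)
    humids.append(humidity)
    clear_output(wait=True)               # replace the previous plot rather than stacking new ones
    fig, ax = plt.subplots(figsize=(8, 3))
    ax.plot(temps, label='temperature (C)')
    ax.plot(humids, label='humidity (%)')
    ax.legend(loc='upper left')
    plt.show()
    time.sleep(5)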




Once I was happy that everything was working and reasonably stable, I removed the touchscreen and swapped the Pi 2 for a Zero W. I still wanted to keep a graphical record of the environment though, so I created a data bucket on InitialState and regularly upload the data to produce a nice dashboard that I can check from anywhere I can get on the InterWebs.
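The uploading uses Initial State's ISStreamer library. A rough sketch of the kind of loop involved; the bucket name and keys below are placeholders, and the sensor reads are the same as in the earlier sketches.

import time, glob
import Adafruit_DHT
from ISStreamer.Streamer import Streamer

def read_ds18b20(device_dir):
    with open(device_dir + '/w1_slave') as f:
        lines = f.read().splitlines()
    return int(lines[1].split('t=')[1]) / 1000.0

# Placeholder bucket name and keys: use the values from your own Initial State account
streamer = Streamer(bucket_name='Pivarium', bucket_key='PIVARIUM',
                    access_key='YOUR_ACCESS_KEY')

sensors = sorted(glob.glob('/sys/bus/w1/devices/28-*'))   # the two DS18B20s
while True:
    for i, sensor in enumerate(sensors):
        streamer.log('DS18B20 {} (C)'.format(i), read_ds18b20(sensor))
    humidity, temperature = Adafruit_DHT.read_retry(Adafruit_DHT.DHT22, 4)
    streamer.log('Humidity (%)', humidity)
    streamer.log('Centre (C)', temperature)
    streamer.flush()
    time.sleep(300)    # one update every 5 minutes is plenty for the dashboard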


InitialState dashboard


For a local display, I don't really need graphs: just the headline readings are fine. I also didn't want something bright and flashy like an LED matrix, so I opted for the excellent Pimoroni InkyPhat. This is perfect as I only need to update the values every 5 minutes or so; things don't change that rapidly in the Pivarium. Although the InkyPhat is limited to three colours, that's fine: I want to easily see if a temperature is outside the desirable range for that area of the vivarium, in which case the value is displayed in red text, otherwise black. I also display the time so I can tell that the readings are current and that the Pi or InkyPhat hasn't stuck or crashed.
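Driving the display amounts to drawing a PIL image and pushing it to the panel. A simplified sketch using Pimoroni's inky library (the original code may have used the older inkyphat module); the temperature band, reading and layout here are illustrative only.

from datetime import datetime
from inky import InkyPHAT
from PIL import Image, ImageDraw, ImageFont

display = InkyPHAT('red')
img = Image.new('P', (display.WIDTH, display.HEIGHT))
draw = ImageDraw.Draw(img)
font = ImageFont.load_default()

HOT_OK = (28, 32)            # illustrative acceptable range (C) for the hot end
hot_temp = 33.5              # in reality this comes from the DS18B20 under the lamp

# Red text if the reading is outside the desired band, black otherwise
colour = display.BLACK if HOT_OK[0] <= hot_temp <= HOT_OK[1] else display.RED
draw.text((10, 10), 'Hot end: {:.1f}C'.format(hot_temp), colour, font)

# Timestamp so it's obvious if the display has stopped updating
draw.text((10, 85), datetime.now().strftime('%H:%M'), display.BLACK, font)

display.set_image(img)
display.show()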




Night Vision

The next task was to see what The Kernel was getting up to at night. At a later stage I might put a camera in the vivarium itself, but for now I’m happy to have a second Pi outside with a Pi NoIR camera and some IR LEDs, all mounted in a Lego frame. Reflection from the vivarium glass is minimal and I can easily position the whole thing to focus on a particular area (e.g. the hot or cold hide, the water bowl, etc.). I didn't want to record all the time, so I used the Motion software to detect when activity was occurring and only record then.

I've been able to capture some good footage of The Kernel's nocturnal behaviour. My favourite so far has been this clip of him having a big drink and then burping.



Houdini

The final task was to prevent The Kernel from unexpectedly leaving his lovely home. Corn snakes are renowned escape artists and will find a way out through any small crack or opening. A common escape route is caused by the sliding vivarium doors not being closed properly and the patient and surprisingly strong guest levering their way out when nobody is looking. A simple remedy is to always fit the sliding vivarium lock and mark its position on the bar so that you know the doors are fully closed. However, it is really easy to forget this, or to not lock the doors immediately and then forget to do it later (especially when removing snake poo). So I wanted to add some blinkin' LEDs to provide an immediate visual indication that the doors were not secure.
I found some nice chunky contact switches (with a satisfying click). These have three spade-connector terminals; a different pair of them is connected together depending on whether the switch is open or closed. I designed some 3D-printable brackets to hold them and allow easy fixing to the side of the vivarium. I also added a hole and channel for a blinking LED.
I also 3D printed some handles to fit onto the existing glass squares that are used to slide the doors. These press against and close the contact switches: the advantage of 3D printing these was that I could precisely adjust the size so that the switch is only fully depressed when the door is completely closed. 



As usual, the Python gpiozero  library makes the code to control these switches embarrassingly easy, especially using source/values.


from gpiozero import Button, LED

# Contact switch and indicator LED for the left-hand door
leftside = Button(20)
leftside_led = LED(21)
leftside_led.source = leftside.values   # the LED continuously mirrors the switch state

# ...and the same for the right-hand door
rightside = Button(19)
rightside_led = LED(26)
rightside_led.source = rightside.values

You can see the complete Python code for the Pivarium here. The STL files for the various 3D-printed parts and the Jupyter Notebook file used in testing are also there.

Wednesday, 2 August 2017

Introducing your Pi-rsonal Trainer

Since starting a new job earlier in the year I've had to make a lot of adjustments to my fitness schedule. One thing I really miss is the exercise classes I used to attend. I particularly miss the High Intensity Interval Training sessions, which were a great way to get a quick cardiovascular workout.

I'd been rolling my own version at home in the back garden for a few weeks and found that although these sessions worked reasonably well, I missed the class instructor who used to call out the next exercises and provide motivational commentary. Keeping track of what exercise I've just done gets increasingly difficult the more tired I get. I also found it annoying to have to check my watch to know when to pause between exercises, and re-setting a timer every 20 seconds gets old very quickly.

Obviously it would be hard to simulate the enthusiasm of a trainer but the exercise choice and timings seemed like a straightforward thing to automate with a Raspberry Pi.


I wanted a visual and audible alerting mechanism and something that would be small enough to be readily portable and run from a standard power bank.

Here is the finished prototype: my Pi-rsonal Trainer.


BoM

Pi Zero running Raspbian Jessie-lite and with the ScrollPhat HD Python library installed.
Pimoroni ScrollPhat HD
Pimoroni Pico HAT Hacker
Buzzer
Two push buttons

Construction



1) Solder a standard male header onto the Pi Zero.
2) Solder on the Pico HAT Hacker, being careful not to let the solder wick too far up the pins.
3) Solder a small buzzer directly on to the Pico HAT Hacker between the holes for Ground and GPIO 18
4) Solder two buttons onto the Pico HAT Hacker, one between Ground and GPIO 9 and the other between a different Ground pin and GPIO 19.
5) Solder a female header on to the ScrollPhat HD and then mount this onto the Pi.


Operation

For my HIIT sessions I like to have 4 repetitions of 10 x 20s periods of exercise interspersed with 10 second rests.

Some simple Python code runs at start-up. When the bigger button (on GPIO 19) is pressed, the sequence begins with an on/off flash of all the LEDs on the ScrollPhat. An exercise is selected at random from a list and this choice is scrolled across the matrix for 5 seconds, followed by a 5 second countdown accompanied by beeps. Then there's another flash and the 20 second exercise period begins. The exercise being undertaken is scrolled continuously across the LED matrix until the last 5 seconds, when there's another beeping countdown. Then there's a 10 second rest period during which the next exercise is scrolled. And repeat 10 times.

Then a 30 second break.

Then repeat all that 4 times!
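To give a flavour of the code, here is a stripped-down sketch of a single announce/work/rest cycle using the scrollphathd and gpiozero libraries. The pins match the build above, but the exercise list, brightness and the full-matrix flashes are simplified placeholders.

import random, time
import scrollphathd
from gpiozero import Button, Buzzer

start_button = Button(19)     # the bigger button
buzzer = Buzzer(18)
exercises = ['BURPEES', 'SQUATS', 'PRESS UPS', 'HIGH KNEES']   # edit to taste

def scroll_text(message, seconds):
    """Scroll a message across the ScrollPhat HD for roughly the given number of seconds."""
    scrollphathd.clear()
    scrollphathd.write_string(message + '   ', brightness=0.5)
    end = time.time() + seconds
    while time.time() < end:
        scrollphathd.show()
        scrollphathd.scroll()
        time.sleep(0.05)

start_button.wait_for_press()
exercise = random.choice(exercises)
scroll_text(exercise, 5)                                        # announce the next exercise
buzzer.beep(on_time=0.1, off_time=0.9, n=5, background=False)   # 5 second countdown beeps
scroll_text(exercise, 20)                                       # the 20 second work period
scroll_text('REST', 10)                                         # then a 10 second rest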


The LEDs of the ScrollPhat HD are nice and bright, and visible in full sunshine. The beeps are annoyingly shrill, so I can hear them even though I'll normally be listening to music on my earbuds. This means I don't have to keep watching the LED display to know when the various intervals have finished.

The code is easily customisable if you want different exercises or want to alter the timings. You could just as easily program a Yoga session rather than a HIIT workout!

Thursday, 9 February 2017

MakeTronix Alarm and GPIOZero on Raspberry Pi

I funded the MakeTronix Alarm on IndieGoGo because it contains many of the components I use with Raspberry Pi at CoderDojo, but in one compact package. Don't get me wrong, I'm all in favour of having learners build the circuit using a breadboard, but sometimes there just isn't the time to do that and code it.

The board arrived very quickly and it is a nice, neat package. I was pleased that they kept the board to 26 pins, as this will enable it to work with my trusty set of original model Bs.

The online documentation is easy to find but I have to say I found it a little disappointing. It seems weird not to show a picture of the board mounted on a Pi in the “connecting the board” section. The written description is entirely accurate, but a simple diagram would be much better, especially for younger users. I was also surprised to find that the code examples downloaded from the MakeTronix GitHub don't use gpiozero. There's nothing wrong with using the RPi.GPIO library – it works perfectly – but I believe gpiozero is much better for those who are new to Python, and this board is being marketed as a “fantastic educational circuit board for learning programming”. They've tried to abstract out the key functions into a library file that the other examples import, but this feels a bit unwieldy, again especially for novice Pythonistas.

I also couldn't get the full alarm code to work. It turned out this was nothing to do with the code: the PIR that shipped with my kit didn't work correctly and would not trigger on motion at any sensitivity. Luckily I have plenty of these PIRs and substituting in a replacement got everything working.

Nevertheless I decided to write a gpiozero version. It uses a couple of lambda functions which you might argue are too opaque for beginners (and I'd probably agree) but they do remove the need to give each button its own when_pressed function individually.

Here's why I love gpiozero – all sorts of hardware control is possible with simple, easy-to-understand syntax. My MTAlarm code does the following (a trimmed-down sketch follows the list):

1. Activates the PIR and waits for motion to be detected. The LED glows/pulses to indicate that the alarm is active.

2. When motion is detected, the LED starts flashing.


3. If the correct code is entered within 10 seconds (each key beeps when pressed and DEL acts as backspace), the alarm is disabled.






If not, the buzzer starts to beeeeeep constantly.
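For anyone curious, here's a heavily trimmed sketch of the gpiozero approach. Every pin number is a placeholder (check the MakeTronix documentation for the real assignments), the keypad is reduced to two keys, and the 10 second grace period is left out for brevity.

from signal import pause
from gpiozero import MotionSensor, PWMLED, Buzzer, Button

# Placeholder pins: the MakeTronix board's real pin assignments will differ
pir = MotionSensor(4)
led = PWMLED(17)
buzzer = Buzzer(18)

SECRET = ['1', '2']
entered = []

def key_pressed(digit):
    buzzer.beep(on_time=0.05, n=1)        # key-click beep
    entered.append(digit)
    if entered[-len(SECRET):] == SECRET:  # correct code: disarm
        buzzer.off()
        led.pulse()                       # back to the 'armed and idle' pulse

def motion_detected():
    led.blink()                           # motion: start flashing
    # (the real code waits 10 seconds for the code before sounding the buzzer)

# One lambda per key avoids writing an individual when_pressed handler for each button
for button, digit in {Button(5): '1', Button(6): '2'}.items():
    button.when_pressed = lambda d=digit: key_pressed(d)

led.pulse()                               # armed: pulsing LED
pir.when_motion = motion_detected
pause()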



Friday, 23 December 2016

Amazon Alexa controlling Raspberry Pi GPIOZero Xmas lights

I've been looking for a home IoT project to setup with Amazon Alexa and my Raspberry Pi powered Christmas Tree seemed like an obvious choice.




Here's how it works:

There are 5 sets of cheapo Xmas LEDs connected directly to GPIO pins on the Pi. I then use Python and GPIOZero to turn them on and off, do a bit of PWM and run some fancy sequences. I've integrated this code into a simple Python Flask app that essentially provides a web interface for activating the LEDs.
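The Flask app itself is tiny: each route just wraps a gpiozero call, and the HA Bridge is later pointed at those URLs. A cut-down sketch along the lines of the single-LED version mentioned in step 6 below (the route names are illustrative; they just need to match whatever you put in the bridge's On/Off URL fields).

from flask import Flask
from gpiozero import LED

app = Flask(__name__)
lights = LED(14)      # the single LED string version uses GPIO 14 and ground

# Illustrative routes: use whatever paths you configure in the HA Bridge
@app.route('/lights/on')
def lights_on():
    lights.on()
    return 'on'

@app.route('/lights/off')
def lights_off():
    lights.off()
    return 'off'

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)   # matches the 'Running on http://0.0.0.0:5000/' output in step 7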

The Pi is also running this awesome Home Automation Bridge code  from BWS Systems. This emulates the Philips Hue light system and can be easily configured to send web requests to other devices, in this case my Flask app.

Finally, Alexa discovers the HAB and merrily passes requests to it based on your voice commands.

All this runs fine on an old 512MB Raspberry Pi model B.

Step-by-step

1) Get an SD card with the latest version of Raspbian. Boot it and connect to your wifi.

2) Find the IP address of your Pi:


pi@raspberrypi:~ $ ifconfig wlan0
wlan0     Link encap:Ethernet  HWaddr 00:c1:41:39:0c:3e  
          inet addr:192.168.49.213  Bcast:192.168.49.255  Mask:255.255.255.0
          inet6 addr: fe80::1185:ec1e:49f9:d936/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:2270 errors:0 dropped:2 overruns:0 frame:0
          TX packets:646 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000 
          RX bytes:327701 (320.0 KiB)  TX bytes:94619 (92.4 KiB)

3) Download the jar file for the HAB:


wget https://github.com/bwssytems/ha-bridge/releases/download/v3.5.1/ha-bridge-3.5.1.jar

4) Run this jar file:


sudo java -jar -Dconfig.file=/home/pi/habridge/data/habridge.config /home/pi/habridge/ha-bridge-3.5.1.jar

5) Point a browser at the IP of your Pi (from step 2). You should be able to connect to the HAB - click on the Bridge Control tab and you should see a screen like this:


6) Clone my GitHub repo with the Flask app. There are two versions of the code available: one is set up to use 5 sets of LEDs, while the other is a cut-down version with just a single LED string configured, connected to GPIO 14 and ground. I'm assuming you've just got one LED (or a string of LEDs) from now on.

7)  Run the app:

pi@raspberrypi:~/gp0web $ python singleLED.py 
 * Running on http://0.0.0.0:5000/
 * Restarting with reloader

8) Now we can configure our HAB. Click on the 'Manual Add' tab and fill in the fields as shown below. The 'Unique ID' field will self-populate once you save your settings, so don't worry about that.



Don't forget to change the IP address based on your setup.

9) Now save the settings and then test them. Click on the Bridge Devices tab and try to turn your LEDs on/off using the Test buttons.

If you encounter problems, check you've typed the URLs correctly and see if the Flask app is showing any errors.

10) If all is working, the final configuration  step is to ask Alexa to 'discover devices'. It should respond that it found one (unless you already have other devices on your network).

11) Now you should be able to ask Alexa to "turn on tree" (or whatever name you gave to your device in step 8). 

Friday, 16 December 2016

Getting started with Android Things and Pimoroni Rainbow HAT


I was really excited to try out Android Things on the Raspberry Pi, especially with the sweet-looking Pimoroni Rainbow HAT, but found getting everything up and running initially a bit fiddly. Here are my notes on how it all works. I used a Mac as my developer box, but I assume things will be fairly similar if you're using Windows.


1. First of all, download Android Studio and install it. On a Mac this just means dragging the app into the Applications folder.
2. Now run it. You'll be asked if you want to import anything – you probably don't. The only other optional configuration is to select a colour theme. I chose Dracula! 


3. Now we'll be able to connect to our Pi. Boot it from the Android SD card and wait for the Android Things splash screen to appear. It will show you the IP address being used by the Pi – make sure you connect it to your network via an Ethernet cable. Now open up a Terminal (or a CMD prompt if you're on a Windows box I guess) and cd to the Android SDK directory just installed. On a Mac that'll be /Users/<your user>/Library/Android/sdk/platform-tools.

4. Run the adb command to connect to your Pi (adb connect <IP address>), using the IP address shown on the splash screen.


5. Now head back to Android Studio. We'll check out one of the sample apps from the GitHub repo. Let's start with the simple Button app, which will illuminate one of the HAT's touchpad LEDs when the pad is pressed.

Click on 'Check out project from Version Control' from the Android Studio welcome window and select 'Git'. Copy the URL of the Button repo into the appropriate box. Create a folder on your developer machine where you'd like to store your Android Things projects and browse to it using the button next to the 'Parent Directory' text box. The sub-directory name should be filled in for you.


6. Then click the 'Clone' button, and once it has downloaded, click 'yes' on the next dialog box. 



7. Then Click 'OK' on the next one.

8. Android Studio will start to build your code, but because you haven't yet downloaded the correct SDK, you'll get an error. Just click 'OK' – we'll fix that now.


9. Back at the main Studio welcome screen (step 2), click on 'Open an existing Android Studio Project' and browse to the sample-Button folder which has just been created under the folder you made in step 5. Click OK.

10. You'll see a window like this. If you look closely at the bottom of the window, you'll see some status messages.


11. After a few seconds you'll see a 'Failed to sync' error and a blue 'Install missing platforms' link. Click on the link.

12. Accept the license agreement (after reading it thoroughly, of course) and click Next.

13. Then you'll get another sync error with a link to install missing Platform Tools. Click the link.

14. You might see a message about updating plugins. Click 'update'.

15. That should be everything installed. Now we can run the app. Click on 'Run App' or the green triangle. You'll be asked to select a deployment target. You should see an Unknown Iot_rpi3 device listed (if not, try and reconnect as per step 3). Click OK.
16. The screen on the Pi should change to be almost entirely white, with a black band at the top containing the word 'Button'. Try pressing the A touchpad on the Pi and the LED on that pad should illuminate. Notice that touching either of the other two buttons doesn't do anything (yet).

17. Success! Your first Android Things app. Now let's hack it a bit. We'll keep it simple: let's enable touchpad B instead, but have it turn on the LED on button C. Open the Project view in Android Studio and select the BoardDefaults file from the tree in the left pane.



18. Double click on the file and have a look at the contents. You'll see two functions which we'll need to modify: getGPIOforLED and getGPIOforButton.

19. By consulting the excellent pinout.xyz entry for Rainbow HAT we can see that pin 26 is the LED for pad C and pin 20 is used for pad B itself.


20. Modify getGPIOforLED so that it returns BCM26 for Pi boards and similarly modify getGPIOforButton so that it returns BCM20.


21. Now re-run the app and deploy the new version to your Pi (click the green play button). You should see the Pi reset. Test the HAT to check that your changes have had the desired effect.  

22. Now why not try the awesome Weather Station app, which really lights up the Rainbow HAT. Follow steps 4-17 above but choose the weather station GitHub repo instead. Once you've deployed it, reboot the Pi. When it restarts you should be given the option of which app you want to run. Connect a mouse to the Pi and choose the Weatherstation.


23. That's it! Now get stuck into some Java coding and explore some of the other sample code available. 



  

Friday, 18 November 2016

Pimoroni Flotilla Offline - instructions

After last week's Wimbledon Raspberry Jam, a few people asked me for instructions for getting Pimoroni Flotilla working in a completely offline environment. Obviously you need an Internet connection to do all the installing, but following these instructions should enable you to use Flotilla and all the great Cookbook stuff when you don't have Net access.

Here is a brief description of the steps I performed. This will almost certainly not work exactly the same way with future releases of the software and may not even be necessary!

1. Start with a fresh install of Raspbian and PIXEL. Do the usual update and upgrade:
sudo apt-get update
sudo apt-get upgrade -y

2. Install the Flotilla software as normal.
curl -sS get.pimoroni.com/flotilla | bash
Start it up and update the Flotilla Dock firmware if required.

3. Clone the flotilla-cookbook and flotilla-rockpool repositories
git clone https://github.com/pimoroni/flotilla-cookbook.git
git clone https://github.com/pimoroni/flotilla-rockpool.git
4. Change into the flotilla-rockpool directory and switch to the Edge branch.
cd flotilla-rockpool
git checkout edge
git fetch
5. Now for the tedious bit. All the hyperlinks in the Cookbook examples point at online resources. We need to change them so that they point at our local cloned copy of Rockpool. Basically you need to go through each of the Cookbook recipe directories (in /home/pi/flotilla-cookbook) and edit the index.html file: change wherever it says 'http://flotil.la/rockpool/' to 'file:///home/pi/flotilla-rockpool/' (note the three /s after file:). You could use a script to trawl through the whole directory (a Python version is sketched below) or just open each one manually (Geany on the Pi is a good editor) and do a find/replace.
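A minimal Python sketch of that find/replace, assuming each recipe folder sits directly under /home/pi/flotilla-cookbook with its own index.html:

import glob

OLD = 'http://flotil.la/rockpool/'
NEW = 'file:///home/pi/flotilla-rockpool/'

# Rewrite every recipe's index.html in place
for path in glob.glob('/home/pi/flotilla-cookbook/*/index.html'):
    with open(path) as f:
        html = f.read()
    if OLD in html:
        with open(path, 'w') as f:
            f.write(html.replace(OLD, NEW))
        print('updated', path)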

6. Finally, to make sure all the examples (especially Synth) work correctly with local files, you need to launch Chromium with some extra flags. I created a separate Desktop shortcut to make this simple. Create a file called flotilla-offline.desktop in /home/pi/Desktop

[Desktop Entry]
Encoding=UTF-8
Version=1.0
Type=Application
Exec=chromium-browser --allow-file-access-from-files
Icon=rockpool
Terminal=false
Name=Flotilla Offline
Comment=Flotilla Offline

7. An icon should appear on the Desktop. Run it! I recommend adding Chromium bookmarks to each of the modified index.html files in each of the Cookbook recipe subdirectories. I also added a bookmark for the local version of Rockpool (file:///home/pi/flotilla-rockpool/index.html)

Thursday, 23 June 2016

Pi Zero CCTV with ZeroView

One of the first things I used a Raspberry Pi for was to build a simple CCTV system. In fact, that original model B is still up and running today. It is a nice compact, self-contained system.

However the addition of a camera capability to the Pi Zero opened up the possibilities for even smaller photography ideas.

Those rambunctious pirates at Pimoroni were quick off the mark with their dinky Little Bro kit. This is a smart CCTV sign that has a hole for the camera and comes with a mounting bracket for the Pi Zero, which also allows you to fix the whole thing to a wall. When I wanted to make a timelapse of some garden work that was happening, I simply used some string to hang the Little Bro from the window handle. This worked fine, although it was a bit wobbly (especially when the window was opened), and when I had to adjust something I didn't manage to put it back up without altering the field of view slightly (you can see that the picture jumps slightly at about 00:10). It was also difficult to get the camera close enough to the glass to keep reflections to a minimum.


Then The Pi Hut launched the excellent ZeroView, designed for exactly this sort of thing. It uses two sucker pads to stick to a window, thus keeping the camera as close to the glass as possible. Because the frame can be removed from the suckers while leaving them attached to the window, it also allows for easy adjustment and replacement without altering the field of view.


Initial tests confirmed that this was genius!




 Remixing my original CCTV build (which uses Motion to detect movement and then uploads the resulting movie to Dropbox) was easy. On the first night of testing I managed to catch this wily fox sneaking through my garden at dawn.



However there were a couple of issues about making this a permanent installation that I wanted to address:

1) My experience of headless Pis and wifi is that sometimes the computer drops off the network. Although the power management for dongles seems to be much improved now, I wanted an easy way of seeing that the Zero was still connected.
2) Suckers are effective but not always reliable, especially in a window which might experience big changes in temperature.

So as a combined solution to both of these, I used a ProtoZero board, an accelerometer and an RGB LED.


Some simple Python checks the accelerometer and looks for abrupt changes in the reported values, to detect if the Pi falls off the window. In this event it uploads a picture of Humpty Dumpty to Dropbox. The same code also periodically checks that it is still connected to the Internet (by pinging Google) and, if the check fails, changes the colour of the RGB LED from green to red. There are a few other visual cues too: the LED flashes blue while the baseline accelerometer readings are being measured at start-up and flashes green when a ping test is underway. This is run at startup via /etc/rc.local.

#!/usr/bin/python3
from gpiozero import RGBLED
from adxl345 import ADXL345
import time, numpy
from datetime import datetime
import dropbox
import urllib3.contrib.pyopenssl
urllib3.contrib.pyopenssl.inject_into_urllib3()
import os,sys,logging

logfile = "/home/pi/cctv-"+str(datetime.now().strftime("%Y%m%d-%H%M"))+".csv"
logging.basicConfig(filename=logfile, level=logging.DEBUG,
    format='%(asctime)s %(message)s',
    datefmt='%Y-%m-%d, %H:%M:%S,')

led = RGBLED(26,13,6)
eggfile='/home/pi/Humpty-Dumpty.gif'
adxl345 = ADXL345()
client = dropbox.client.DropboxClient('<ENTER YOUR DROPBOX KEY HERE>')
#print('linked account: ', client.account_info())
logging.info('linked account: %s', client.account_info())
    
axes = adxl345.getAxes(True)
#print("ADXL345 on address 0x%x:" % (adxl345.address))
logging.info("ADXL345 on address 0x%x:" % (adxl345.address))
t = 0
x_av = 0
y_av = 0
x_values = []
y_values = []
#print('Calibrating....')
logging.info('Calibrating....')
led.blink(on_time=0.1,off_time=0.1,on_color=(0,0,1), n=50,background=True)
while t < 100:
    axes = adxl345.getAxes(True)
    x = axes['x']
    y = axes['y']
#    print(x)
    x_values.append(x) 
    y_values.append(y) 
    time.sleep(0.1)
    t+=1
x_av = numpy.mean(x_values) 
y_av = numpy.mean(y_values) 
x_range = numpy.max(x_values) - numpy.min(x_values)
y_range = numpy.max(y_values) - numpy.min(y_values)
logging.info('Set mean x: ' + str(x_av))
logging.info('Set range x: ' + str(x_range))
logging.info('Set mean y: ' + str(y_av))
logging.info('Set range y: ' + str(y_range))
led.color = (0,0.2,0)
counter = 0
jiggle = 5
while True:
    axes = adxl345.getAxes(True)
    x = axes['x']
    y = axes['y']
    if ((x > (x_av + (jiggle*x_range))) or (x < (x_av - (jiggle*x_range)))) and  ((y > (y_av + (jiggle*y_range))) or (y < (y_av - (jiggle*y_range)))):
        #print('Humpty')
        logging.info('Humpty')
        led.color = (1,0,0)
        f = open(eggfile, 'rb')
        response = client.put_file(eggfile,f)
        f.close()
        time.sleep(1)
    time.sleep(0.1)
    counter +=1
    if counter > 6000:
        #print('Checking network comms...')
        logging.info('Checking network comms...')
        counter = 0
        response = os.system("ping -c 3 8.8.8.8")
        if response == 0:
            logging.info('I pinged Google successfully')
            led.blink(on_time=0.1, off_time=0.1, on_color=(0,0.5,0), n=3, background=False)
            led.color = (0,0.2,0)
        else:
            logging.info('Comms Down :-(')
            led.blink(on_time=0.1, off_time=0.1, on_color=(1,0,0), background=True)


Notes:


1) I used numpy to calculate the means and ranges which is really lazy - especially as numpy takes ages to compile on a Pi Zero.

2) I started from a fresh install of jessie-lite. In addition to the Python libraries imported at the top of the code, you'll also need to install the following packages (requests[security] comes from pip; the rest are Raspbian packages):

   python3-dev
   libffi-dev
   libssl-dev
   requests[security]
   git

3) You'll also need to enable the camera and the I2C interface through raspi-config.