Smarter Streetlights

I know it's already passed, but I saw this on the BBC website, posted today, and thought it was so relevant to our project it had to be blogged…

Few people in the UK now have a clear view of the night sky because of light pollution. The fight is on to reclaim the stars, but what are the possible solutions?

Look at the sky at night and what do you see? Not much most probably. Even in the countryside the stars are becoming much harder to spot, with the sky glow caused by light pollution now visible for up to 50 miles (80km)…. (Continued here)

Wanted: meaningful actors... humans need not apply.

Now about halfway through my borrowed copy of Adam Greenfield’s Everyware, I came upon an insightful observation. Mr. Greenfield proposes that there is, at some level, a universal appeal to everyware/ubicomp. My instinctual reaction was one of disbelief, but as I read further I was intrigued.

He writes that humans across most cultures throughout history have instilled consciousness and sentience into the physical world and seemingly inanimate objects… “spirits” if you will. This anthropomorphism was pervasive and

“…indeed, most of the humans who ever walked the planet would have found it utter folly to conceive of the natural world as mainstream Western culture did until very recently: a passive, inert, purely material stage, on which the only meaningful actors are human ones.”

With our newfound advances in technology, we now seem to be on the cusp of giving a voice and a personality to anything that operates on electricity or batteries. The wise old oak tree used to speak to us; now it’s the vending machine next to the bus stop. As the ability to connect to these machines no longer relies on a keyboard, gestural or voice-recognition commands seem like the logical way to interact with another ‘living’ being.

Nowadays we pretty much expect our devices and interfaces to possess some sort of charisma—the works exhibited at MoMA’s Talk to Me exhibit are prime examples of this way of thinking. Computers, consumer electronics and phones are now quirky, funny, helpful, not-so-helpful-but-apologetic, courteous, kind… anything but the sterile command-line interfaces of yesteryear.

In the process, have people become duller, more robotic? People text one another far more than they actually talk. The abbreviated, acronym-laden style of texting reminds me of Newspeak, the thought-reducing language of George Orwell’s 1984. Have we given so much of our personality and behavior to computers that we have none left for ourselves?

Part of what I want to explore during thesis is how to retain our humanity, while still celebrating all of the wonderful things technology can provide us. To be continued…

QR Codes

In my research into QR Codes, specifically in relation to the Tales of Things app, I learnt that inverting the colours of a QR Code (to make a white QR Code on a black background) simply doesn’t work.

I know that it ought to work in theory, however most QR readers on phones - the most common device used to read QR Codes - cannot read inverted codes.
So if you were thinking about making a snazzy black business card or advert with a white QR Code on it, guess again. For most readers, it just won’t work.
This is because of the way QR Codes are read. The reader identifies the information encoded in the QR by the dark modules that make up the code. Inverting the colours effectively turns that data into a different set of information, one the reader can’t identify, so the scan fails.
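To make the failure concrete, here’s a tiny Python sketch (not a real QR decoder - the image data and function names are all made up for illustration): a reader that treats dark pixels as data modules extracts the exact opposite bit pattern from a colour-inverted image.

```python
# Illustrative sketch only: most phone readers assume dark modules
# on a light background. Inverting the colours flips every module,
# so the extracted bits no longer match the encoded data.

def sample_modules(image):
    """Treat dark pixels (value 0) as 'on' modules, as a typical reader does."""
    return [[1 if px == 0 else 0 for px in row] for row in image]

def invert(image):
    """White-on-black version of the same code."""
    return [[255 - px for px in row] for row in image]

# A tiny stand-in pattern: 0 = black pixel, 255 = white pixel
original = [
    [0, 255, 0],
    [255, 0, 255],
    [0, 255, 0],
]

normal_read = sample_modules(original)
inverted_read = sample_modules(invert(original))

print(normal_read == inverted_read)  # False: every single bit has flipped
```

Every module comes out flipped, which to the reader is just a different (and invalid) code.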

This is a shame, because making white QR Codes for black shadows would have been pretty cool aesthetically.

Here is a test I was playing around with in Processing. I was able to track the blue laser pen on the opposite wall using the webcam on my Mac (you can see the laser reflecting off the wall onto the screen). This then draws onto the blank Processing sketch. It would be possible to expand on this idea by projecting the sketch onto a wall and drawing directly onto it with the light; with everything aligned, this would look great. However, I decided against this as it would be too much like a previous project (L.A.S.E.R. Tag) I had been looking at for inspiration - it would pretty much be stealing an idea that has already been created.
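The tracking logic itself is simple. Here’s a rough Python sketch of the idea (the frame data and the scoring rule are my own stand-ins, not the actual Processing code): scan each webcam frame for the pixel where blue most dominates red and green, and treat that as the laser dot’s position.

```python
# Rough sketch of the blue-laser tracking idea, in Python rather
# than Processing. The scoring heuristic is an assumption for
# illustration, not the sketch's real code.

def find_laser(frame):
    """Return (x, y) of the most blue-dominant pixel in the frame.

    frame is a list of rows, each row a list of (r, g, b) tuples.
    """
    best_score, best_pos = -1, None
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            score = b - (r + g) / 2  # how much blue stands out here
            if score > best_score:
                best_score, best_pos = score, (x, y)
    return best_pos

# A 3x3 test frame with a bright blue "laser" pixel at (2, 1)
frame = [
    [(10, 10, 20), (10, 10, 25), (12, 11, 22)],
    [(10, 10, 20), (15, 12, 30), (40, 40, 255)],
    [(10, 10, 20), (10, 10, 22), (11, 12, 24)],
]
print(find_laser(frame))  # (2, 1)
```

In the real sketch you would run this per frame and draw a point at each returned position to build up the trail.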

There are different directions I could take with this project, such as graffiti that isn’t controlled by a human, but I feel that this would be second best to the main idea. I will post the code for anyone interested in the next post.


Monthly Talk at Lighthouse - Adam Greenfield

Pre-Presentation Fountain Test

Success! Here’s a video of the pre-presentation test on the fountain. For the test, when the program runs, the fountain is on for 15 seconds, then off for 4 seconds, then on again. The reason I look so relieved when it comes on for the third time is that it had been getting “stuck” after the second run-through. It turned out we’d had incompatible versions of Processing/Firmata.

I’ve got to say, I’m really proud of this project! Not because it was the most brilliant idea, or the most complicated, but because I have never been good at Arduino/Processing/coding in general, and for this project I was the only one working on getting the prototype to work and all the wiring and programming that entailed. Granted, I did have to run to Simon (Lock, our lecturer) once or twice when I couldn’t work out why it wasn’t working, but in general I did it on my own.

The main problem was getting Arduino to talk to Processing (using Firmata) and taking three scripts (two Arduino, one Processing) and merging them into one. I’ll post the code in the post after this.
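For anyone curious about the sequence itself, here it is sketched as a pure Python function of elapsed time. The 15 s/4 s durations are the ones from the test; the function is a stand-in for the real Firmata relay calls, not our actual code, but it makes the timing easy to check without an Arduino attached.

```python
# The fountain sequence from the test: on for 15 s, off for 4 s,
# then on again. Written as a pure function of elapsed seconds so
# the logic can be verified on its own.

ON_SECS, OFF_SECS = 15, 4

def fountain_state(elapsed):
    """True if the pump should be on at `elapsed` seconds into the run."""
    if elapsed < ON_SECS:
        return True               # first 15 s: fountain on
    if elapsed < ON_SECS + OFF_SECS:
        return False              # next 4 s: fountain off
    return True                   # back on after the pause

print(fountain_state(5), fountain_state(17), fountain_state(25))  # True False True
```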


"Every City uses between 20 to 40% of their budget for streetlighting"

That quote comes from a great video I found on YouTube about another lighting initiative called Echelon. It seems they are installing special chips that turn streetlights into a networked grid that can be dimmed at certain times, saving up to 70% in some areas - somewhat similar to our idea, but ours will be a lot more intelligent!


Yesterday evening I went to Plymouth Hoe along with Ollie to get some pictures of my installation in action, as it was a really clear night. The plan was to aim the laser projection onto Smeaton’s Tower (the lighthouse) and colour it in depending on the data the sensors picked up, which would also trigger the camera to take a long-exposure image, recording the data. However, as we approached the Hoe we saw a mass of scaffolding covering the tower. How annoying. Thankfully there are many other landmarks on the Hoe that I could make use of. I chose one of the memorials with the wheel in the background, as I thought it would make an awesome image.

Above you can see a range of images taken from the night. The first few are from when the installation was set up and started triggering the camera before I had positioned it. As you can see, the next few images show a range of data: loud, quiet, lots of movement and not so much movement. Since there was a fair amount of light on the Hoe I was unable to get the effect that I wanted, but overall I was really happy with the results.


Arduino MIDI Drum Kit and Spooky Sound Trigger

Just found this video, which is quite similar to what I’m hoping to do in my Final Year Project, and also nice and Halloween-themed!

As you can see, the designer has made an Arduino pressure-sensitive drum, which is part of what I want to do; now I just need to find out how to visualise it well!

I’ve been thinking about how the drum could be integrated into the Internet of Things, and I like the idea of it forming some kind of social intervention: putting it into locations like Drake Circus, perhaps hidden under a mat so that it makes a sound when you tread on it, and then seeing how people react.
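As a first stab at the trigger logic, here’s a hedged Python sketch (the threshold and scaling are my guesses, not taken from the video): read the pressure sensor as a 10-bit Arduino analog value and map anything above a noise threshold onto a MIDI velocity.

```python
# One way the drum trigger might work (an assumption for
# illustration, not the designer's actual code): scale a 10-bit
# analog reading (0-1023) to a MIDI note velocity (0-127),
# ignoring readings too soft to count as a hit.

THRESHOLD = 100  # made-up noise floor for light touches

def pressure_to_velocity(reading):
    """Map an analog reading to a MIDI velocity, or None if too soft."""
    if reading < THRESHOLD:
        return None
    return min(127, reading * 127 // 1023)

print(pressure_to_velocity(50))    # None: below the threshold
print(pressure_to_velocity(1023))  # 127: hardest possible hit
```

The returned velocity could then drive both the MIDI note-on message and whatever visualisation ends up on screen.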

This map shows the spots in which we have chosen to hide our geocaches for our everyware project. The red markers are the three sites we will use for the demo on Friday. The blue markers are the rest of the sites we would use in the real version of the game. Finally, the yellow markers are already-existing geocaches.


We decided to go with the “smart” streetlights for our IBM project. Here is an animation I created to support the presentation.

Streetlights - The Problem
On average it costs 15p per night to power one streetlight in winter, and with 7.5 million streetlights in the UK alone, that adds up to over £1,000,000 every winter night. This is a massive amount of money, much of it wasted, and as you can see in the video, 830,000 tonnes of CO2 pollution a year is produced by the energy wasted by streetlights alone! These facts are what swayed us into wanting to create a smart solution.
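A quick sanity check of those figures in Python, using only the numbers quoted above:

```python
# 15p per light per winter night, across 7.5 million UK streetlights.

cost_per_light_pence = 15
num_lights = 7_500_000

nightly_cost_pounds = cost_per_light_pence * num_lights / 100
print(f"£{nightly_cost_pounds:,.0f} per night")  # £1,125,000 per night
```

So the nightly bill comes out at £1,125,000, comfortably over the £1,000,000 mark.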

Our idea is to create streetlights that sense when a car is near. This activates that streetlight and (on a motorway or main road) the streetlights for one mile in front of and half a mile behind the car. These fade on and off as the car moves along (as shown in the video).

By doing this, the lights are only used when they are needed which means no energy is wasted.
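The activation rule can be sketched in a few lines of Python. Only the one-mile/half-mile distances come from our idea; the light spacing and positions are illustration values I’ve made up.

```python
# Which lights should be lit for a car travelling in the direction
# of increasing position? Lit zone: one mile ahead, half a mile
# behind (distances from our design; spacing below is made up).

MILE = 1609.0          # metres
AHEAD, BEHIND = MILE, MILE / 2

def lights_on(car_pos, light_positions):
    """Return the positions (metres) of lights that should be lit."""
    return [p for p in light_positions
            if car_pos - BEHIND <= p <= car_pos + AHEAD]

# Lights every 50 m along a 4 km stretch, car at the 2 km mark
lights = [i * 50 for i in range(81)]
lit = lights_on(2000, lights)
print(len(lit), min(lit), max(lit))  # 49 1200 3600
```

In the real system each lamppost would evaluate something like this locally from the sensor messages passed along the grid, fading rather than switching hard on and off.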

As well as the massive amounts of money and energy saved, there are additional benefits to this system. With the Arduinos and WiFly modules in place, you could access WiFi from each lamppost. All of the lights could turn on or off in an emergency. The system could tell you if a bulb was blown, and where. Finally, it could act as a very accurate traffic report system, since each sensor knows where the cars are.