Cloud Pink is an interactive installation by Seoul-based creative studio Everyware (formed by Hyunwoo Bang and Yunsil Heo) that lets the public live out their childhood dream of touching and manipulating clouds.
Everyware is a creative computing group from Seoul, Korea consisting of Hyunwoo Bang and Yunsil Heo. Bang and Heo are new media artists exploring intuitive and playful communication between the real and virtual worlds.
Their art installation Cloud Pink invites participants to “touch the pink clouds” drifting on a giant fabric screen suspended in the air.
“Lying down on a hill with your pupils filled with the endless blue sky, perspective of your eyesight suddenly gets distorted and clouds drift at the tip of your nose. You stretch your arms up to the sky to touch the clouds but can’t reach. Another world right above your head, clouds….”
Developed by the creative group Everyware, led by the media artists Hyunwoo Bang and Yunsil Heo.
Soak is a video installation that responds to the pressure of fingers on an elastic fabric, colouring the point of contact to simulate ink falling and spreading across the cloth. It was developed in Processing, using the depth sensor of the Kinect camera.
The most interesting thing I found about this project is that the paint that appears on contact with the fabric is not a pre-recorded video: it is a cellular automaton that slowly evolves under the pressure of the finger on the fabric.
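Soak's actual rules aren't published, and the original runs in Processing, but the general technique can be sketched. Here is a minimal Python sketch of a diffusion-style cellular automaton seeded at a "touch" point; the grid size and the diffusion rate are arbitrary assumptions, not the installation's real parameters.

```python
# Minimal sketch of a diffusion-style cellular automaton, seeded where
# a "finger" touches the grid. Soak's real rules are not published;
# this only illustrates the general technique.

def step(grid, rate=0.2):
    """One CA update: each cell keeps most of its ink and shares the
    rest with its four neighbours (ink falling off the edges is lost)."""
    h, w = len(grid), len(grid[0])
    new = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ink = grid[y][x]
            new[y][x] += ink * (1 - rate)
            share = ink * rate / 4
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    new[ny][nx] += share
    return new

# "Touch": drop ink at one cell, then let it evolve for a few steps.
grid = [[0.0] * 5 for _ in range(5)]
grid[2][2] = 1.0
for _ in range(3):
    grid = step(grid)
print(round(grid[2][2], 3))  # 0.536 - the centre has bled outward
```

Each frame of the projected "paint" would simply render the grid, with fresh ink injected wherever the Kinect reports the fabric being pushed.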
I know it's already passed, but I saw this on the BBC website, posted today, and thought it was so relevant to our project it had to be blogged…
Few people in the UK now have a clear view of the night sky because of light pollution. The fight is on to reclaim the stars, but what are the possible solutions?
Look at the sky at night and what do you see? Not much, most probably. Even in the countryside the stars are becoming much harder to spot, with the sky glow caused by light pollution now visible for up to 50 miles (80km)…. (Continued here)
Wanted: meaningful actors... humans need not apply.
Now about halfway through my borrowed copy of Adam Greenfield’s Everyware, I came upon an insightful observation. Mr. Greenfield proposes that there is, at some level, universal appeal for everyware/ubicomp. My instinctual reaction was one of disbelief, but as I read further I was intrigued.
He writes that humans across most cultures throughout history have instilled consciousness and sentience into the physical world and seemingly inanimate objects… “spirits” if you will. This anthropomorphism was pervasive and
“…indeed, most of the humans who ever walked the planet would have found it utter folly to conceive of the natural world as mainstream Western culture did until very recently: a passive, inert, purely material stage, on which the only meaningful actors are human ones.”
With our newfound advances in technology, we now seem to be on the cusp of giving a voice and a personality to anything that runs on electricity or batteries. The wise old oak tree used to speak to us; now it’s the vending machine next to the bus stop. As connecting to these machines no longer relies on a keyboard, gestural or voice-recognition commands seem like the logical way to interact with another ‘living’ being.
Nowadays we pretty much expect our devices and interfaces to possess some sort of charisma—the works exhibited at MoMA’s Talk to Me exhibit are prime examples of this way of thinking. Computers, consumer electronics, phones are now quirky, funny, helpful, not-so-helpful but apologetic, courteous, kind… anything but the sterile command-line interface of yesteryear.
In the process, have people become duller, more robotic? People text one another far more than they actually talk. The abbreviated, acronym-laden style of texting reminds me of Newspeak, the thought-reducing language of George Orwell’s 1984. Have we given so much of our personality and behavior to computers that we have none left for ourselves?
Part of what I want to explore during thesis is how to retain our humanity, while still celebrating all of the wonderful things technology can provide us. To be continued…
Here is an image of the ultrasonic sensor and the LED module connected to the sensor shield, which sits on the WiFly shield, which is connected to the Arduino board. This is the setup we would use for each lamppost.
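The photo only covers the wiring, not the behaviour. As a purely hypothetical illustration (in Python rather than the Arduino sketch we'd actually run), here is one way a lamppost could map an ultrasonic distance reading to an LED brightness; the function name and the near/far thresholds are invented for the example.

```python
# Hypothetical lamppost logic: read a distance (cm) from the ultrasonic
# sensor and fade the LED up as a passer-by gets closer. The 30cm/200cm
# working range is made up for illustration.

def distance_to_brightness(cm, near=30, far=200):
    """Map a distance reading (cm) to an LED brightness (0-255).
    Closer than `near` -> full brightness; beyond `far` -> off."""
    cm = max(near, min(far, cm))          # clamp to the working range
    return round(255 * (far - cm) / (far - near))

print(distance_to_brightness(30))   # 255 (someone right under the lamp)
print(distance_to_brightness(200))  # 0   (nobody nearby)
```

On the Arduino itself this would just be a `map()`-style calculation feeding a PWM pin, with the WiFly shield reporting readings between lampposts.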
In my research for QR Codes, specifically in relation to the Tales of Things app, I learnt that inverting the colours of a QR Code (to make a white QR code on a black background) simply does not work.
I know that it ought to work in principle, but most QR readers on phones - the most common devices used to scan QR Codes - cannot read inverted codes. So if you were thinking of making a snazzy black business card or advert with a white QR Code on it, guess again: for most readers it will just not work. This comes down to how QR Codes are read. The reader identifies the encoded information from the dark modules that make up the code; inverting the image changes what counts as data, turning it into a different (and invalid) set of information, so the reader can’t identify it.
This is a shame, because making white QR Codes for black shadows would have been pretty cool aesthetically.
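The failure is easy to picture in code. Below is a toy Python sketch, assuming (as standard decoders do) that the reader hunts for the dark 7×7 finder pattern in the corners of the code; invert the image and that pattern becomes its complement, so the search fails.

```python
# Toy illustration of why inversion breaks most readers: decoders look
# for the dark 7x7 finder pattern (dark border, light ring, dark 3x3
# centre). Inverting the image turns it into its complement, so a
# reader keyed to dark modules never finds it.

def finder_pattern():
    """The standard 7x7 QR finder pattern (1 = dark module)."""
    p = [[1] * 7 for _ in range(7)]
    for y in range(1, 6):       # light 5x5 ring inside the dark border
        for x in range(1, 6):
            p[y][x] = 0
    for y in range(2, 5):       # dark 3x3 centre
        for x in range(2, 5):
            p[y][x] = 1
    return p

def has_finder(m):
    """A naive 'reader' that only accepts the dark-on-light pattern."""
    return m == finder_pattern()

original = finder_pattern()
inverted = [[1 - v for v in row] for row in original]
print(has_finder(original))  # True
print(has_finder(inverted))  # False
```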
like when you're on a plane and suddenly everything around you is pink, as if inside a giant Big Babol bubble. you see the clouds so puffy and soft that you dream of touching them, of diving in and letting them wrap around you like cotton wool.
your finger is like the tip of a pen dipped in ink: you brush the pink clouds and the sky above you opens up and paints itself, like when you use watercolours and a drop of blue falls onto the paper.
and what if you put your nose in too? who knows, it might smell of strawberries.
Here is a test I was playing around with in Processing. I was able to track the blue laser pen on the opposite wall using the webcam on my Mac (you can see the laser reflecting off the wall onto the screen). This then draws onto the blank Processing sketch. It would be possible to expand on this idea by projecting the sketch onto a wall and drawing directly onto it with the light; with everything aligned, this would look great. However, I decided against it, as it would be too much like a previous project I was looking at for inspiration (L.A.S.E.R. Tag); it would pretty much be stealing an idea that has already been created.
There are different directions I could take this project in, such as non-human-controlled graffiti, but I feel that would be second best to the main idea. I will post the code for anyone interested in the next post.
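For anyone who can't wait for the next post, the core idea can be sketched in a few lines. The real sketch runs in Processing on live webcam frames; the Python below only illustrates the logic, with a hand-made "frame" standing in for camera input and a made-up blue-dominance score for spotting the laser dot.

```python
# Sketch of the tracking idea: scan each frame for the strongest
# blue-dominant pixel (the laser dot) and append that point to the
# drawing path. Frames here are 2D lists of (r, g, b) tuples; the
# scoring formula and threshold are illustrative choices.

def find_laser(frame, threshold=200):
    """Return (x, y) of the strongest blue-dominant pixel, or None."""
    best, best_score = None, threshold
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            score = b - (r + g) / 2      # favour blue over general glare
            if score > best_score:
                best, best_score = (x, y), score
    return best

# One fake 3x3 "frame": a dim wall with a bright blue dot at (2, 1).
frame = [[(10, 10, 20)] * 3 for _ in range(3)]
frame[1][2] = (30, 40, 255)

path = []                        # points drawn so far
point = find_laser(frame)
if point:
    path.append(point)
print(path)  # [(2, 1)]
```

In the Processing version the same search runs over `video.pixels` each frame, and the accumulated path is what gets drawn to the sketch.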