accelerometer


Smartwatch-controlled Drone

Sony Developer has put together a demo of controlling a drone with a smartwatch (with SmartEyeglass AR glasses as a display) - video embedded below:

The drone used is a Parrot AR.Drone 2.0. The SmartWatch 2 accelerometer provides the control signals for flying the drone, and the watch’s display is used for further control inputs (for example, swiping triggers the drone to do rolls in the air). The SmartEyeglass prototype displays flight information and live video from the camera on the drone.

It is about time smartwatches were explored for possibilities beyond notifications and fitness tracking, complementing other tech to enable new experiences.

The developers (Peter Bartos, Jonas Hellström and Alexander Najafi) have put together some background and a tutorial for setting up your own version of this project, which you can find here.

The Sensors Are Coming! - NYTimes.com

A microscopic view of a gyroscope sensor created by STMicroelectronics. It is as thin as a piece of paper and can detect the movement of a mobile phone.

If you own a smartphone, you probably know just how smart those little gadgets are. They know where you’re standing, the direction you’re moving in and if you’re holding the phone vertically or horizontally.

To perform these clever tricks, the inside of your phone is stuffed with a number of sensors, which are little electronic components that can sometimes be as thin as a piece of paper.

The coming generation of mobile phones — and other gadgets for that matter — will have so many new types of smart sensors that your current mobile phone will look pretty dumb.

In a recent interview, Benedetto Vigna, general manager of the MEMS division of STMicroelectronics, a company based in Geneva that creates sensors for mobile devices and other consumer electronics, talked about some of the smart technologies we can expect to see.

Mr. Vigna said the next smartphones would have altimeter sensors that would be able to detect your elevation. “These sensors will tell people what floor they are on in a building, or could be used to more precisely determine where you are in relation to your friends on a location-based service,” he said.

Other sensors built into your next-generation phone could include heart monitors to keep tabs on your health. There will also be sensors that can detect perspiration and could be used to monitor your excitement level and even mood. Additionally, phones will include more microphones, and temperature and humidity sensors to better determine their location and surroundings.

Orientation & Accelerometer sensors

One thing I’m interested in is the sensors in the N9. These include the following:

  • Accelerometer
  • Ambient Light Sensor
  • Compass
  • Gyroscope
  • Light Sensor
  • Magnetometer
  • Orientation Sensor
  • Proximity Sensor
  • Rotation Sensor
  • Tap Sensor

From the list, I’m particularly interested in the orientation sensor and the accelerometer, as they are very versatile and you can do a lot with them.

Starting with sensors

Although there is an overview page of the sensor API containing all the individual sensor classes, it only contains C++-style documentation and examples. There is, however, documentation for QML wrappers around the sensor API that you can conveniently use directly from QML pages. This one would be for the orientation sensor.

The QML sensor classes are mostly just thin layers over the regular C++ classes (QSensor). At their core, they have only one property and one signal:

onReadingChanged: This is a signal that is sent every time the sensor registers a change, for example when the orientation changes.

reading: This property holds the information that was gathered during the last reading. So if you get an onReadingChanged signal, this will hold the new data.

Using the orientation sensor should be easy then: just wait for the onReadingChanged signal and read out the reading property.

The Orientation Sensor

The orientation sensor basically measures which side of your phone is pointing upwards. There are six states the sensor can report:

  • TopUp (1)
  • TopDown (2)
  • LeftUp (3)
  • RightUp (4)
  • FaceUp (5)
  • FaceDown (6)

To start coding, I again used the standard QML “Hello World” application and removed the button and the label from MainPage.qml.

All sensor handling is located in QtMobility, so I started by adding the sensor classes to the project:

import QtMobility.sensors 1.1

Then I added a text field to receive the sensor output:

    Text {
        id: txtOrientation
        anchors.centerIn: parent
        text: "Orientation starting.."
        font.pointSize: 25
    }

And finally, I added the sensor:

    OrientationSensor {
        id: oriData
        active: true

        onReadingChanged: {
            txtOrientation.text = "Orientation: " + reading.orientation;
        }
    }

And that was it. Note that you actually have to activate the sensor with the “active: true” flag, otherwise it won’t work.

If you compile and run the example, you will see the number representing the orientation in the text field. If you want a convenient way of reading out the orientation and matching it to states, look at this example.
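As a rough illustration of what such a mapping could look like, here is a sketch of my own (not the linked example) that translates the numeric codes into the state names listed above:

    OrientationSensor {
        id: oriNames
        active: true

        // Indices 1..6 match the orientation codes listed above;
        // index 0 is the sensor's "Undefined" state.
        property variant names: ["Undefined", "TopUp", "TopDown",
                                 "LeftUp", "RightUp", "FaceUp", "FaceDown"]

        onReadingChanged: {
            txtOrientation.text = "Orientation: " + names[reading.orientation];
        }
    }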

The accelerometer

The accelerometer is a sensor that measures, well, acceleration. So if you move the phone along the x-axis, you get a positive x reading (or a negative one, depending on the direction). Read more about accelerometers here.

All sensors work the same way in QML; only the reading differs. So instead of reading.orientation, we ask for the x, y and z values.

    Accelerometer {
        id: accData
        active: true

        onReadingChanged: {
            txtAccelerometer.text = "Accelerometer: \nX: " + reading.x +
                                    "\nY: " + reading.y +
                                    "\nZ: " + reading.z;
        }
    }

And there you go: you can output the readings from the accelerometer.
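As a small extension (my own sketch, not part of the original example), the raw x, y and z values can also be turned into tilt angles: when the phone is held still, the accelerometer essentially measures gravity, so pitch and roll follow from a bit of trigonometry:

    Accelerometer {
        id: tiltData
        active: true

        onReadingChanged: {
            // At rest the reading is dominated by gravity (about 9.81 m/s^2),
            // so the per-axis components reveal how the device is tilted.
            var pitch = Math.atan2(reading.y,
                        Math.sqrt(reading.x * reading.x +
                                  reading.z * reading.z)) * 180 / Math.PI;
            var roll = Math.atan2(-reading.x, reading.z) * 180 / Math.PI;
            txtAccelerometer.text = "Pitch: " + pitch.toFixed(1) +
                                    "\nRoll: " + roll.toFixed(1);
        }
    }

Note that this only makes sense while the phone is not otherwise accelerating; during a shake, the gravity component is swamped by the motion itself.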

That is really all there is to working with sensors in QML. I have to say I’m pretty impressed by how easy this was. You can find the source to this example here.

[Embedded YouTube video]

(via “Making deliveries with the iPhone! A real-world gamification example from Sweden’s postal service” « WIRED.jp)

In the previous post we saw how we can simulate the rotations of the gravity vector (thus measuring the exact tilt) with the help of the gyroscope. During that measurement we moved the device only slowly, to validate the claim that the gyroscope is able to track the gravity vector for a certain period of time. We were aware that measurement errors in this type of measurement eventually accumulate, and therefore we have to re-acquire the correct gravity vector from time to time….
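To make the idea concrete, here is a minimal sketch of my own (not the original post’s code) of how such gyroscope tracking could look in QML. It assumes the Gyroscope element from QtMobility.sensors 1.2, angular rates reported in degrees per second, and a roughly constant update rate:

    import QtQuick 1.1
    import QtMobility.sensors 1.2

    Item {
        // Current gravity estimate; starts as "lying flat on the table".
        property variant gravity: [0, 0, -9.81]
        // Assumed time between readings (roughly 50 Hz).
        property real dt: 0.02

        Gyroscope {
            active: true

            onReadingChanged: {
                // Angular rates converted to radians per second.
                var wx = reading.x * Math.PI / 180;
                var wy = reading.y * Math.PI / 180;
                var wz = reading.z * Math.PI / 180;
                // In the rotating device frame dg/dt = -w x g, so for a
                // small time step: g' = g - (w x g) * dt. The integration
                // error grows over time, which is exactly why the estimate
                // has to be re-anchored periodically.
                var g = gravity;
                gravity = [g[0] - (wy * g[2] - wz * g[1]) * dt,
                           g[1] - (wz * g[0] - wx * g[2]) * dt,
                           g[2] - (wx * g[1] - wy * g[0]) * dt];
            }
        }
    }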

[Embedded YouTube video]

The Gaits is an app that composes music from your steps using the accelerometer in the iPhone. A walk becomes a great audio/visual experience.


Digitally Communicating Touch

…we have an innate sense of how things feel
Can technology leverage this?

In this TED-Ed lesson on YouTube, learn about the field of haptics and how it could change everything from the way we shop online to how dentists learn the telltale feel of a cavity.

TED-Ed talk by Katherine Kuchenbecker,
Mechanical Engineering professor
at University of Pennsylvania

Great TED talk about this here.

I’m exhausted, but I have to keep running. I have to warn the Brunswick settlement that a horde of zombies is on its way to attack them. The song I’m listening to ends, I hear a crackle over the comms channel, and then Sam relays the bad news: The zombies are closer to the settlement than I am, and I’m going to have to pick up the pace to get there in time to deliver the warning. With human lives at stake, I have no choice. I have to find a burst of speed.

Never mind that all of this happens in the safety of my home, on my treadmill. The running and resultant exhaustion are very real. And the dire survival scenario? That’s courtesy of “Zombies, Run!” — an app designed to gamify running and walking workouts by layering an apocalyptic storyline over your training. I love being able to lose myself in these fictional missions while I run, but the reality is that I’m training for a half marathon. I need to achieve certain goals in my running workouts to ensure I’m prepared for my upcoming event. I need to know how far I’ve run, and at what pace. So how does my device’s accelerometer know?

The motion sensors of an accelerometer measure changes in direction and velocity — acceleration. If you have a mobile device in your pocket while you run, the accelerometer constantly assesses and calculates your movement and cadence as you accelerate into each stride and decelerate slightly after foot strike.
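As a toy illustration of that idea (my own sketch, not how any particular running app actually works), a crude step counter can watch the magnitude of the acceleration vector and count a step each time it spikes above roughly 1 g. In the QML sensor API shown earlier on this page, it might look like this:

    Accelerometer {
        id: stepSensor
        active: true

        property int steps: 0
        property bool aboveThreshold: false

        onReadingChanged: {
            // Overall acceleration magnitude; about 9.81 m/s^2 at rest.
            var mag = Math.sqrt(reading.x * reading.x +
                                reading.y * reading.y +
                                reading.z * reading.z);
            // One step per excursion above the threshold. Real pedometers
            // add filtering and adaptive thresholds; the hysteresis here
            // merely avoids counting the same spike twice.
            if (mag > 12.0 && !aboveThreshold) {
                steps += 1;
                aboveThreshold = true;
            } else if (mag < 10.5) {
                aboveThreshold = false;
            }
        }
    }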


[Embedded Vimeo video]

NECLUMI: ‘a probable future of jewellery’ - PanGenerator

Warsaw-based new media design studio PanGenerator has developed a piece of programmable virtual jewellery formed entirely of light. Controlled from your smartphone, the necklace offers four modes: it can respond to your walking pace (Airo), directional movement (Movi), rotation (Roto) or your surroundings’ ambient noise (Sono).

A small pico projector worn on the chest and connected to the smartphone via an HDMI cable projects the white patterns that dance around the neck. The studio intends to develop the design further as wearable tech becomes more prevalent. Keep an eye out for this one!

More info at: The Creators Project