We’ll go ahead and add this to the list of things we never knew we needed until right now. Design studio Richard Clarkson Studio joined forces with Crealev to create the coolest floating mini-cloud: a cloud that hovers indoors, plays your favorite music, and lights up in time with the beat to replicate a storm. Watch it in action here!
Deep space radio signals might be trying to tell us something. IBM and the SETI Institute are working together to analyze six terabytes of these complex signals to listen for patterns of life. Researchers are using IBM Analytics on Apache Spark to sift through signals gathered by the Allen Telescope Array, and cognitive machine learning to determine which signals are from humans and which might be from aliens. Maybe they’ll ask us to call in.
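The triage step described above — sorting human-made interference from candidate signals worth a closer look — can be sketched in miniature. This is not IBM's or SETI's actual pipeline; the frequency bands and labels below are illustrative assumptions:

```python
# Minimal sketch (not the actual SETI/IBM pipeline): triage detected radio
# signals by checking whether their carrier frequency falls inside a known
# band of human radio-frequency interference (RFI).
KNOWN_RFI_BANDS_MHZ = [
    (88.0, 108.0),     # FM broadcast radio (illustrative)
    (1575.0, 1576.0),  # GPS L1 (illustrative)
]

def classify_signal(freq_mhz):
    """Label a signal 'human' if it sits in a known RFI band, otherwise
    'candidate' for deeper (e.g. machine-learning) analysis."""
    for lo, hi in KNOWN_RFI_BANDS_MHZ:
        if lo <= freq_mhz <= hi:
            return "human"
    return "candidate"

detections = [101.5, 1420.4, 1575.4]  # MHz; 1420.4 is the hydrogen line
labels = [classify_signal(f) for f in detections]
```

In the real system, of course, the "candidate" bucket is where the machine-learning models take over.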
First it was the cognitive dress; now Australian designer Jason Grech has leveraged insights from Watson to create the world’s first cognitive couture collection for Melbourne Spring Fashion Week. Jason ran ten years of fashion data and real-time social posts through the Watson Visual Recognition API to analyze and predict trends, which helped him find new ways to work with fabrics, colors and textures. Jason also used Watson to infuse the couture with his love of architecture by matching architectural images with fashion images and taking inspiration from the lines, curves and corners. Now that’s some fashion-forward thinking.
IBM announced today something rather interesting: Sword Art Online The Beginning Project, a “virtual MMO project” in collaboration with Kadokawa, Bandai Namco, Aniplex and more, inspired by the popular anime series.
Not much is known at the moment, besides the fact that the project uses IBM’s cloud computing tech SoftLayer, and aims to explore the idea of using cognitive computing (the simulation of human thought processes in a computerized model) for the future of gaming.
Apparently, the prototype uses body motions to control in-game movement. The anime’s protagonists, Kirito and Asuna, are also involved in some form, retaining their original voice actors.
You wouldn’t expect to pay a local tax when you stream a movie on Netflix, but Chicago has decided that such cloud-based services should be taxed just like tickets for live entertainment.
There was no debate or public hearing over the city’s “cloud tax” — a 9 percent tax on streaming entertainment like Netflix and Spotify.
The city says that’s because the tax isn’t new; it’s a clarification, not an expansion, of two taxes that have long been in effect. One is the Personal Property Lease Transaction Tax, and the other is the Amusement Tax, which has traditionally been tacked onto tickets for concerts and sporting events.
But now just about everybody who pays to stream a video or television show will have to pay more.
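To put the 9 percent figure in concrete terms, here is the back-of-the-envelope arithmetic. The rate comes from the article; the $9.99 subscription price is an assumption for illustration:

```python
# What Chicago's 9% amusement tax adds to a hypothetical $9.99/month
# streaming subscription. The rate is from the article; the price is
# an assumed example, not a quoted figure.
AMUSEMENT_TAX_RATE = 0.09

def monthly_total(base_price):
    """Subscription price plus the 9% amusement tax, rounded to cents."""
    return round(base_price * (1 + AMUSEMENT_TAX_RATE), 2)

monthly_total(9.99)  # → 10.89
```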
Close your eyes and imagine you’re on a boat. You wave to the people on the shore as you speed by at 150 mph. This boat has heat sensors, load sensors and pressure sensors, and all of that data is streaming to the IBM Cloud. SilverHook Powerboats makes boats just like the one you’re thinking of for professional racing teams. The cloud lets these teams keep an eye on 2,000 bits of data per second so they can spot hazards before accidents happen. Which means even in your wildest dreams, you can be totally safe.
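A stream of roughly 2,000 readings per second, like the one described above, is usually batched up before it goes over the wire. This is a simulated sketch only — the sensor names, sample shape, and batch size are assumptions, not details of SilverHook's actual telemetry system:

```python
# Illustrative only: simulate one second of boat telemetry (~2,000 sensor
# readings) and serialize it as the kind of payload that would be shipped
# to a cloud endpoint. Sensor names and record format are assumed.
import json
import random

READINGS_PER_SECOND = 2000
SENSORS = ["heat", "load", "pressure"]

def collect_batch(n=READINGS_PER_SECOND):
    """Simulate one second of telemetry as a list of sensor samples."""
    return [{"sensor": random.choice(SENSORS), "value": random.random()}
            for _ in range(n)]

batch = collect_batch()
payload = json.dumps(batch)  # what would be POSTed to the cloud service
```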
Somewhere along the way, the new basecom had gained somewhat of a reputation. He had become Someone-You-Don’t-Fuck-With, because if you did, you could be sure there would be people after you. It didn’t matter who. If you were lucky, you got a group of Thirds still learning the limits of their new strength. If you were unlucky, you got a thorough lecture from Zack Fair, followed by a visit from one of the Generals.
A tech-art student project from Michelle Ma experiments with a photographic technique that delays capture over time within a 3D environment:
I invited several of my friends to the Kinoptic Dome in order to capture their movements for intervals of 3-5 seconds. Ideally, I would have liked to use the panoptic dome for the video data. However, those resources were not available to me, so I switched gears to point cloud data. Keep in mind that there was 80GB of point cloud data for 5 scenes, each being 3-5 seconds long. The point clouds ended up being 650K-700K points each. In addition, I really struggled with finding the right software to handle the data but finally settled on OpenFrameworks and MeshLab to convert the point clouds into meshed OBJs. After that, writing the program to manipulate the point cloud data was fun and fluid.
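Point clouds at the 650K-700K scale mentioned above usually need thinning before meshing tools stay responsive. A minimal sketch of that preprocessing step, assuming points are simple (x, y, z) tuples — this is not the project's actual OpenFrameworks code:

```python
# Rough sketch: uniformly subsample a large point cloud (here ~700K
# synthetic points) so downstream meshing tools like MeshLab have less
# data to chew on. Point format and sample size are assumptions.
import random

def subsample(points, keep=10000, seed=0):
    """Return a random subset of the point cloud, without replacement."""
    rng = random.Random(seed)
    if len(points) <= keep:
        return list(points)
    return rng.sample(points, keep)

# A stand-in cloud at roughly the scale described in the write-up.
cloud = [(i, i * 0.1, i * 0.2) for i in range(700_000)]
small = subsample(cloud, keep=10000)
```

Fixing the seed makes the thinning reproducible across runs, which matters when you are comparing meshes built from the same scene.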