I'll look through the blog of every person who reblogs this and then send them a message with my impressions and opinion of its owner, and many will get a follow along the way. ☺
And who cares that we don't know each other, your blog will tell me plenty :))
As soon as he sees them, his arms are wrapped around your waist. “Looks like you had fun last night,” he says, lips now pressing in the spots that he previously marked. You giggle and reach around to brush your fingers through his hair. “I did, actually. I had an amazing time last night,” you muse as you turn around to peck his lips.
The two of you stand there for a moment, still locked in an innocent kiss. You pull away and grin at him. “Shouldn’t you be doing your monster homework?” you tease before kissing him one last time and walking away to grab a book off the shelf.
Another machine learning experiment from Samim applies regression methods to moving images, breaking each frame down into visual compartments that create a polygon / Modernist style:
Regression is a widely applied technique in machine learning … Regression analysis is a statistical process for estimating the relationships among variables. Let’s have some fun with it ;-)
… This experiment tests a regression-based approach to video stylisation. The following video was generated using Stylize by Alec Radford. Alec extends Andrej’s implementation and uses a fast Random Forest Regressor. The source video is a short by JacksGap.
You can find out more about the machine learning experiment here.
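To make the idea concrete, here is a minimal sketch of regression-based image stylisation: fit a regressor that maps pixel coordinates (x, y) to RGB values, then repaint the image from the model’s predictions. This illustrates the general technique, not Alec Radford’s actual Stylize code; the function name and parameters are my own.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def stylize(image, n_estimators=10, max_depth=12):
    """Repaint an H x W x 3 image from a coordinate -> color regressor.

    A shallow random forest can only carve the coordinate plane into
    axis-aligned regions of constant color, which is what produces the
    blocky, polygon-like look.
    """
    h, w, _ = image.shape
    # Build one training row per pixel: features are (row, col), targets are RGB.
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([ys.ravel(), xs.ravel()], axis=1)
    colors = image.reshape(-1, 3)
    model = RandomForestRegressor(n_estimators=n_estimators, max_depth=max_depth)
    model.fit(coords, colors)
    # Predict a color for every coordinate and reassemble the frame.
    return model.predict(coords).reshape(h, w, 3)
```

For video, the same fit/predict step would simply be applied frame by frame; limiting `max_depth` is what controls how coarse the “compartments” look.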
An experiment and overview put together by samim explores the subject of ‘Augmented Creativity’, looking at graphics research that complements artistic practice with computational technology:
Drawing is one of the oldest forms of human expression. According to Wikipedia it is a “visual art in which a person uses various instruments to mark paper or another medium”. Drawing helps us to communicate. The history of drawing is intertwined with the evolution of tools. From Rocks to Photoshop: Tools help us to change perspective and draw differently.
This experiment started with an exploration of the research project “How Do Humans Sketch Objects?” by Mathias Eitz, James Hays and Marc Alexa. The researchers collected 20,000 sketches across 250 categories (741h of drawing time) and built a system that classifies drawings in real time. For the following video, I asked a professional illustrator to test the system.
A primary aim of this experiment was to explore how humans draw differently
when a machine is continuously trying to guess what is being drawn. To
intensify the user/machine feedback loop, a Text2Speech system was