

A Bionic Fingertip Let an Amputee Feel Texture Again

The world currently has some awesome prosthetics. We have low-cost 3D-printed ones, prosthetics based on Metal Gear Solid aesthetics that send emails, and slick-looking hands that (finally) come in more than one size.

Yet despite this proliferation of designs, researchers are still working toward one holy grail of prosthetics: how can we endow these devices with a sense of touch, so that users can feel the world around them?

Researchers from the Ecole Polytechnique Federale de Lausanne (EPFL) and the Scuola Superiore Sant’Anna (SSSA) say they’ve created an artificial fingertip that lets amputees differentiate between smooth and rough surfaces in real time.

“The touch sensation is quite close to what you would feel with your normal finger,” says amputee Dennis Aavo Sorensen in a video released by EPFL. In the video, Sorensen is asked whether he can feel the ridges on a small rectangular piece of material as the artificial finger is passed along it. “You can feel the coarseness of the plates, and the different gaps and ribs,” he notes.

In a study published Tuesday in the journal eLife, the researchers describe surgically connecting the artificial fingertip, replete with sensors, to electrodes implanted in the peripheral nervous system of Sorensen’s left arm stump. They believe their invention could accelerate the development of prostheses equipped with advanced sensory feedback, the main aim being for wearers to regain a sense of the weight, surface texture, and temperature of the objects around them.

To test the bionic finger, the researchers recruited four healthy, non-amputee volunteers to act as control subjects alongside Sorensen. They connected the artificial fingertip to each of them via an electrode inserted into a nerve in the arm. The fingertip was then moved over a piece of plastic engraved with smooth and rough surfaces, causing the sensors to produce “patterns of electrical pulses that stimulated the nerve,” according to their paper.
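
As a rough sketch of the kind of encoding at work here (sensor deflections translated into pulse trains whose rate follows the texture), the following Python snippet maps a normalized fingertip-sensor signal to stimulation pulse times. The parameters and the rate-coding rule are assumptions for illustration, not the encoding used by the EPFL/SSSA team.

import numpy as np

def sensor_to_pulse_times(sensor_signal, sample_rate_hz, max_pulse_rate_hz=100.0):
    # Hypothetical scheme: stronger sensor deflection -> higher pulse rate.
    # The real nerve-stimulation encoding from the study is not reproduced here.
    sensor_signal = np.clip(np.asarray(sensor_signal, dtype=float), 0.0, 1.0)
    dt = 1.0 / sample_rate_hz
    pulse_times = []
    phase = 0.0
    for i, s in enumerate(sensor_signal):
        phase += s * max_pulse_rate_hz * dt   # integrate the instantaneous pulse rate
        if phase >= 1.0:                      # emit a pulse each time the phase wraps
            pulse_times.append(i * dt)
            phase -= 1.0
    return pulse_times

# Example: a ridged surface produces a periodic deflection as the finger slides over it.
t = np.arange(0, 2.0, 1e-3)
ridges = 0.5 + 0.5 * np.sin(2 * np.pi * 4 * t)   # four ridges per second
print(len(sensor_to_pulse_times(ridges, sample_rate_hz=1000)))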

They found that the control volunteers were able to distinguish between different textures 77 percent of the time; Sorensen answered correctly 96 percent of the time. The researchers also used an EEG device to compare brain-wave activity in the non-amputees and the amputee, and found that the same brain regions were activated. These findings confirmed that amputees could indeed distinguish different surface textures, paving the way for a study involving more participants and more advanced prototypes of the bionic finger.

While it might still be early days, there are high hopes that touch-sensitive systems will have applications not just for prosthetics, but also for robotics used in surgery, rescue and manufacturing. For example, humanoid rescue robots with tactile sensors in their feet would be able to sense what kind of terrain they were on and adjust their balance accordingly.

Image Credit: Hillary Sanchez/EPFL

Source: Motherboard (by EMIKO JOZUKA)


Lucia N°03 is a revolutionary lamp system that combines a stroboscopic, flickering light source of variable speed and intensity with a constant light that can be operated at different brightness levels. The stroboscopic lamp can induce various hypnagogic effects, such as intense perception of colour and form, bodilessness, and meditative, dream-like visionary experiences. It is widely used in therapeutic settings as well as in research on altered states of consciousness and near-death experiences (NDEs).

www.lucialightexperience.com


PsychicVR

Project from the MIT Fluid Interfaces Group combines a VR head-mounted display with an EEG mind-reading headset to produce the experience of having superpowers when you concentrate:

We present PsychicVR, a proof-of-concept system that integrates a brain-computer interface device and Virtual Reality headset to improve mindfulness while enjoying a playful immersive experience. The fantasy that any of us could have superhero powers has always inspired people around the world. By using Virtual Reality and real-time brain activity sensing we are moving one step closer to making this dream real. We non-invasively monitor and record electrical activity of the brain and incorporate this data in the VR experience using an Oculus Rift and the MUSE headband. By sensing brain waves using a series of EEG sensors, the level of activity is fed back to the user via 3D content in the virtual environment. When the user is focused they are able to make changes in the 3D environment and control their powers. Our system increases mindfulness and helps achieve higher levels of concentration while entertaining the user.
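
The core coupling described in the abstract, an EEG-derived focus estimate gating what happens in the 3D scene, can be sketched in a few lines. The threshold and gain below are invented for illustration; they are not taken from PsychicVR or the MUSE software.

def focus_to_power_level(attention, threshold=0.6, gain=2.5):
    # Map a normalized attention estimate (0..1) to a 'superpower' intensity.
    # Below the threshold nothing happens; above it, the effect ramps up.
    if attention < threshold:
        return 0.0
    return min(1.0, gain * (attention - threshold))

# Example frame loop: feed the latest EEG attention estimate into the VR scene.
for attention in [0.2, 0.55, 0.7, 0.9]:
    print(attention, "->", focus_to_power_level(attention))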

More Here

Researchers can identify you by your brain waves with 100 percent accuracy

Your responses to certain stimuli – foods, celebrities, words – might seem trivial, but they say a lot about you. In fact (with the proper clearance), these responses could gain you access to restricted areas of the Pentagon.

A team of researchers at Binghamton University, led by Assistant Professor of Psychology Sarah Laszlo and Assistant Professor of Electrical and Computer Engineering Zhanpeng Jin, recorded the brain activity of 50 people wearing an electroencephalogram headset while they looked at a series of 500 images designed specifically to elicit unique responses from person to person – e.g., a slice of pizza, a boat, Anne Hathaway, the word “conundrum.” They found that participants’ brains reacted differently to each image, enough that a computer system was able to identify each volunteer’s “brainprint” with 100 percent accuracy.
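
To see in principle how EEG responses to images can identify a person (leaving aside the actual CEREBRE classifier, which is more sophisticated), a nearest-template scheme is enough: enrol each person's averaged responses, then match a new recording against them.

import numpy as np

def identify_by_brainprint(enrolled_templates, new_responses):
    # enrolled_templates: dict person_id -> array (n_images, n_samples) of averaged responses
    # new_responses:      array (n_images, n_samples) recorded from the unknown person
    # Returns the enrolled person whose templates correlate best overall.
    # This is an illustrative nearest-template scheme, not the published method.
    def corr(a, b):
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        return float(np.mean(a * b))

    scores = {person: np.mean([corr(t, r) for t, r in zip(templates, new_responses)])
              for person, templates in enrolled_templates.items()}
    return max(scores, key=scores.get)

# Toy example: two enrolled people, then a noisy re-recording of the first one.
rng = np.random.default_rng(0)
alice = rng.standard_normal((5, 200))
bob = rng.standard_normal((5, 200))
templates = {"alice": alice, "bob": bob}
print(identify_by_brainprint(templates, alice + 0.3 * rng.standard_normal((5, 200))))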

“When you take hundreds of these images, where every person is going to feel differently about each individual one, then you can be really accurate in identifying which person it was who looked at them just by their brain activity,” said Laszlo.

In their original study, titled “Brainprint” and published in 2015 in Neurocomputing, the research team was able to identify one person out of a group of 32 by that person’s responses, with only 97 percent accuracy, and that study only incorporated words, not images.

Maria V. Ruiz-Blondet, Zhanpeng Jin, Sarah Laszlo. CEREBRE: A Novel Method for Very High Accuracy Event-Related Potential Biometric Identification. IEEE Transactions on Information Forensics and Security, 2016; 11 (7): 1618 DOI: 10.1109/TIFS.2016.2543524

Woman wearing an EEG headset. Credit: Jonathan Cohen/Binghamton University


Controlling genes with your thoughts

It sounds like something from the scene in Star Wars where Master Yoda instructs the young Luke Skywalker to use the force to release his stricken X-Wing from the swamp: Marc Folcher and other researchers from the group led by Martin Fussenegger, Professor of Biotechnology and Bioengineering at the Department of Biosystems (D-BSSE) in Basel, have developed a novel gene regulation method that enables thought-specific brainwaves to control the conversion of genes into proteins – called gene expression in technical terms.

“For the first time, we have been able to tap into human brainwaves, transfer them wirelessly to a gene network and regulate the expression of a gene depending on the type of thought. Being able to control gene expression via the power of thought is a dream that we’ve been chasing for over a decade,” says Fussenegger.

A source of inspiration for the new thought-controlled gene regulation system was the game Mindflex, where the player wears a special headset with a sensor on the forehead that records brainwaves. The registered electroencephalogram (EEG) is then transferred into the playing environment. The EEG controls a fan that enables a small ball to be thought-guided through an obstacle course.

Wireless transmission to implant

The system, which the Basel-based bioengineers recently presented in the journal Nature Communications, also makes use of an EEG headset. The recorded brainwaves are analysed and wirelessly transmitted via Bluetooth to a controller, which in turn controls a field generator that generates an electromagnetic field; this supplies an implant with an induction current.

A light then literally goes on in the implant: an integrated LED lamp that emits light in the near-infrared range turns on and illuminates a culture chamber containing genetically modified cells. When the near-infrared light illuminates the cells, they start to produce the desired protein.
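
Functionally, the chain described above is a control loop: classify the EEG state, then switch the field generator (and hence the implant's near-infrared LED) on or off accordingly. A minimal sketch, with a made-up alpha-band threshold standing in for whatever state detection the Basel group actually uses:

import numpy as np

def alpha_band_power(eeg_window, sample_rate_hz):
    # Estimate 8-12 Hz (alpha) power of one EEG channel via the FFT.
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / sample_rate_hz)
    band = (freqs >= 8) & (freqs <= 12)
    return float(spectrum[band].mean())

def led_command(eeg_window, sample_rate_hz, relaxed_threshold=50.0):
    # Hypothetical rule: a relaxed (high-alpha) state switches the field generator on,
    # which powers the near-infrared LED and so triggers protein production.
    return alpha_band_power(eeg_window, sample_rate_hz) > relaxed_threshold

# Example: one second of simulated EEG with a strong 10 Hz component.
fs = 256
t = np.arange(fs) / fs
window = 30 * np.sin(2 * np.pi * 10 * t) + np.random.default_rng(1).standard_normal(fs)
print(led_command(window, fs))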

Thoughts control protein quantity

The implant was initially tested in cell cultures and mice, and controlled by the thoughts of various test subjects. The researchers used SEAP for the tests, an easy-to-detect human model protein which diffuses from the culture chamber of the implant into the mouse’s bloodstream.

To regulate the quantity of released protein, the test subjects were categorised according to three states of mind: bio-feedback, meditation and concentration. Test subjects who played Minecraft on the computer, i.e. who were concentrating, induced average SEAP values in the bloodstream of the mice. When completely relaxed (meditation), the researchers recorded very high SEAP values in the test animals. For bio-feedback, the test subjects observed the LED light of the implant in the body of the mouse and were able to consciously switch the LED light on or off via the visual feedback. This in turn was reflected by the varying amounts of SEAP in the bloodstream of the mice.

New light-sensitive gene construct

“Controlling genes in this way is completely new and is unique in its simplicity,” explains Fussenegger. The light-sensitive optogenetic module that reacts to near-infrared light is a particular advancement. The light shines on a modified light-sensitive protein within the gene-modified cells and triggers an artificial signal cascade, resulting in the production of SEAP. Near-infrared light was used because it is generally not harmful to human cells, can penetrate deep into the tissue and enables the function of the implant to be visually tracked.

The system functions efficiently and effectively in the human-cell culture and human-mouse system. Fussenegger hopes that a thought-controlled implant could one day help to combat neurological diseases, such as chronic headaches, back pain and epilepsy, by detecting specific brainwaves at an early stage and triggering and controlling the creation of certain agents in the implant at exactly the right time.


Observe the Heart · 观心

Performance from Shi Weili presents the artist in meditative state, illustrated with realtime projected visuals read from an EEG headset:

If you ask a Zen master how to meditate, he might answer, “Observe the heart.” But the heart is so abstract that it is hard even to imagine, let alone observe. Observe the Heart is an artistic attempt to represent the meditator’s mental state, generating visuals and sounds based on realtime brainwave input. The generative visuals are projected back onto the meditator, transforming the introspective meditation into an observable performance, in a sense.

There is more to tell about the concept. While third-party audiences can watch and hear one’s meditation, the meditator themselves cannot experience the generative content in real time (given that they close their eyes during the meditation, and may even wear earplugs to block the sound). It then becomes questionable who this meditation is for. Moreover, the meditator will nonetheless be curious about what their meditation looks and sounds like, and this mental activity will be captured by the brainwave sensor and reflected in the generative output. It could therefore become even harder for the meditator to really “observe the heart”.

More Here

Reposted from pro_choice on instagram:
(Here’s the exact text for those who might not be able to fully read it)

“Life is determined by electroencephalogram (EEG). If you no longer have EEG this means you are ‘Brain Dead’ which is considered the legal, medical, and scientific definition of death, despite your beating heart. Brain Death is the final cessation of activity in the central nervous system especially as indicated by a flat EEG for a predetermined length of time. Those who are Brain Dead show no clinical signs of brain activity including no response to pain and no cranial nerve reflexes. This means an organism such as a fetus is by definition not alive due to the lack of EEG. Which means a fetus CANNOT be murdered, only disposed of. By legal scientific and medical terms, abortion is not murder.“

Can I get a “hell yeah” for science??
(Also: if anyone wants to read more on EEGs then I suggest: http://en.m.wikipedia.org/wiki/Electroencephalography )

-asha

Text messaging with smartphones triggers a new type of brain rhythm

Sending text messages on a smartphone can change the rhythm of brain waves, according to a new study published in Epilepsy & Behavior.  

People communicate increasingly via text messaging, though little is known about the neurological effects of smartphone use. To find out more about how our brains work during textual communication on smartphones, a team led by Mayo Clinic researcher William Tatum analyzed data from 129 patients. Their brain waves were monitored over a period of 16 months through electroencephalograms (EEGs) combined with video footage.

Dr. Tatum, professor of neurology and director of the epilepsy monitoring unit and epilepsy center at Mayo Clinic in Jacksonville, Florida, found a unique ‘texting rhythm’ in approximately 1 in 5 patients who used their smartphone to send text messages while their brain waves were being monitored.

The researchers asked patients to perform activities such as text messaging, finger tapping and audio cellular telephone use, in addition to tests of attention and cognitive function. Only text messaging produced the newly observed brain rhythm, which was different from any previously described rhythm.
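
The Mayo team characterized the rhythm by expert visual review of the EEG; as a hedged illustration of how one might compare texting segments against a baseline in software, a simple band-power ratio does the job. The 5-7 Hz band below is an arbitrary placeholder, not the band reported in the paper.

import numpy as np
from scipy.signal import welch

def band_power_change(baseline, texting, sample_rate_hz, band=(5.0, 7.0)):
    # Ratio of band power during texting to band power at baseline (one channel).
    def band_power(x):
        freqs, psd = welch(x, fs=sample_rate_hz, nperseg=min(len(x), 2 * sample_rate_hz))
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return float(psd[mask].mean())
    return band_power(texting) / band_power(baseline)

# Toy example: the texting segment carries an extra burst in the chosen band.
fs = 200
rng = np.random.default_rng(2)
t = np.arange(20 * fs) / fs
baseline = rng.standard_normal(len(t))
texting = baseline + 2 * np.sin(2 * np.pi * 6 * t)
print(round(band_power_change(baseline, texting, fs), 2))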

More information: William O. Tatum et al. Cortical processing during smartphone text messaging, Epilepsy & Behavior (2016). DOI: 10.1016/j.yebeh.2016.03.018

Credit: Elsevier    

Brainflight: Brain Computer Interface for controlling drones

Researchers from the EU project Brainflight (Portugal, Germany and the Netherlands) have developed a brain-to-computer interface that enables people to control drones with their minds.

During a public presentation in Lisbon (Portugal), the TEKEVER and Champalimaud teams used high-performance electroencephalogram (EEG) systems to measure brain waves noninvasively, then used specially conceived algorithms to convert the brain signals into drone commands. The “drone operator”, wearing a cap that measures brain activity, influences the drone’s path using nothing but simple thoughts. Essentially, the electricity flowing through the pilot’s brain acts as an input to the drone’s control system, in order to perform, in the air, a mission with objectives previously defined by the research team.
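
The last step of that pipeline, turning a decoded mental command into something a flight controller understands, is conceptually simple. The command labels, setpoint values and confidence threshold below are invented for illustration; the BRAINFLIGHT decoding and command set are not reproduced here.

# Hypothetical mapping from a decoded mental command to a drone velocity setpoint.
COMMAND_TO_SETPOINT = {
    "neutral": {"roll": 0.0, "pitch": 0.0, "yaw_rate": 0.0, "climb": 0.0},
    "left":    {"roll": -0.2, "pitch": 0.0, "yaw_rate": 0.0, "climb": 0.0},
    "right":   {"roll": 0.2, "pitch": 0.0, "yaw_rate": 0.0, "climb": 0.0},
    "forward": {"roll": 0.0, "pitch": 0.2, "yaw_rate": 0.0, "climb": 0.0},
    "up":      {"roll": 0.0, "pitch": 0.0, "yaw_rate": 0.0, "climb": 0.3},
}

def decode_to_setpoint(decoded_command, confidence, min_confidence=0.7):
    # Fall back to 'neutral' whenever the classifier is not confident enough,
    # a common safety choice in BCI control loops.
    if confidence < min_confidence or decoded_command not in COMMAND_TO_SETPOINT:
        return COMMAND_TO_SETPOINT["neutral"]
    return COMMAND_TO_SETPOINT[decoded_command]

print(decode_to_setpoint("left", confidence=0.85))
print(decode_to_setpoint("left", confidence=0.40))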

“The project has successfully demonstrated the use of the brain computer interface (BMI) on a simulator for the Diamond DA42 aircraft, where one pilot controlled the simulator through the BRAINFLIGHT system. We’ve also integrated the BMI into the UAV ground systems and have successfully tested it in UAV simulators. We’re now taking it one step further, and performing live flight tests with the UAV,” said Ricardo Mendes, TEKEVER’s COO.

[read more]

New prosthetic arm controlled by neural messages

The design aims to identify the memory of movement in the amputee’s brain and translate it into commands for manipulating the device.

Controlling a prosthetic arm by just imagining a motion may become possible through the work of Mexican scientists at the Centre for Research and Advanced Studies (CINVESTAV), who are developing an arm replacement that identifies movement patterns from brain signals.

“First, it is necessary to know if there is a memory pattern in the amputee’s brain that recalls how the arm moved, so that it can be translated into instructions for the prosthesis,” says Roberto Muñoz Guerrero, researcher at the Department of Electrical Engineering and project leader at Cinvestav.

He explains that the electric signal won’t come from the muscles that form the stump, but from the movement patterns of the brain. "If this phase is successful, the patient would be able to move the prosthesis by imagining different movements.”

However, Muñoz Guerrero acknowledges this is not an easy task, because the brain registers a wide range of activities occurring in the human body, and the movement pattern has to be extracted from among them. “Therefore, the first step is to recall the patterns in the EEG and define there the memory that can be electrically recorded. Then we need to evaluate how sensitive the signal is to other external stimuli, such as light or blinking.”
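
A standard recipe for this kind of imagined-movement decoding (common in BCI research generally, and not necessarily the CINVESTAV pipeline) is to compute band-power features over the mu/beta range for each EEG epoch and train a simple classifier on them:

import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def bandpower_features(epochs, sample_rate_hz, band=(8.0, 30.0)):
    # epochs: array (n_epochs, n_channels, n_samples).
    # Returns log band power per channel for each epoch (8-30 Hz covers mu/beta rhythms).
    feats = []
    for epoch in epochs:
        freqs, psd = welch(epoch, fs=sample_rate_hz, nperseg=sample_rate_hz, axis=-1)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        feats.append(np.log(psd[:, mask].mean(axis=-1) + 1e-12))
    return np.array(feats)

# Toy example: two imagined movements with different channel power profiles.
rng = np.random.default_rng(3)
fs, n_channels, n_samples = 128, 8, 256
class0 = rng.standard_normal((40, n_channels, n_samples))
class1 = rng.standard_normal((40, n_channels, n_samples))
class1[:, :4, :] *= 2.0                      # stronger activity on half the channels
X = bandpower_features(np.concatenate([class0, class1]), fs)
y = np.array([0] * 40 + [1] * 40)
classifier = LinearDiscriminantAnalysis().fit(X, y)
print(classifier.score(X, y))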

It should be noted that the prosthesis could only be used by individuals who once had their entire arm and lost it to an accident or illness. Such patients were once able to move the arm naturally and have stored in their memory the movement processes that the prosthesis would draw on.

According to the researcher, the prosthesis must be provided with a mechanical and electronic system, the elements necessary to activate it, and a section that interprets the brain signals. “The material it will be built from has not yet been fully defined, because the prosthesis must weigh between two and three kilograms, which is similar to the weight of the missing arm.”

The unique prosthesis represents a new topic in bioelectronics called BCI (Brain Computer Interface): a direct communication pathway between the brain and an external device, intended to assist or restore sensory and motor functions. “An additional benefit is the ability to create motion paths for the prosthesis, which is not possible with commercial products,” says Muñoz Guerrero.


Illumino

Homemade wearable tech project is a beanie hat with an LED bobble and an EEG brain-reading attachment that changes colour depending on which mental state you are in - video embedded below:

Ever wanted to visualize your brain activity in real-time? Move an object on a screen with your mind? EEG devices are fantastic fun and allow you to do such things!

This tutorial will show you how to make an illumino: a recreational EEG hat that turns your brainwaves into an array of colorful light, using Neopixel RGB LEDs hidden inside a white pompom. Don’t like the pompom idea? Put the LEDs on a bracelet, clothing, or other accessory!

The device runs with Neurosky’s ThinkGear™ ASIC Module and the TinyLily Arduino microcontroller. All electronic components are discreetly hidden, so it looks and feels as though you’re just wearing a cool & comfy beanie.
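
The project itself implements its colour logic in Arduino code on the TinyLily; purely as an illustration of the idea, here is a hypothetical Python rendering of an attention-to-colour mapping for eSense-style values in the 0-100 range (the actual ramp used by illumino may differ):

def attention_to_rgb(attention):
    # Low attention -> blue, mid -> green, high -> red.
    # The colour ramp is made up; the Instructables project defines its own.
    a = max(0, min(100, attention)) / 100.0
    red = int(255 * a)
    blue = int(255 * (1.0 - a))
    green = int(255 * (1.0 - abs(2 * a - 1.0)))
    return (red, green, blue)

for value in (10, 50, 90):
    print(value, attention_to_rgb(value))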

To make your own, there is an Instructable tutorial here or check out the project page here

Cauldron of Psychic Goo Reads Your Mind : DNews
Interactive art installation uses EEGs and magnets to make your thoughts come to life.

One of science fiction’s greatest books is “Solaris,” the 1961 novel by Polish author Stanislaw Lem, which imagines a sentient, psychic ocean on a faraway planet. That’s the short version, anyway.

“Solaris” has been turned into two good films, as well — the 1972 classic from Russian director Andrei Tarkovsky and the underrated 2002 remake with George Clooney.

Now the “Solaris” idea has been turned into a techie, interactive art piece, featuring ferromagnetic liquids, EEG headsets and a cauldron of fluorescent psychic goo.

EEG Test to Help Understand and Treat Schizophrenia

Researchers at University of California, San Diego School of Medicine have validated an EEG test to study and treat schizophrenia. The findings, published in two separate studies, offer a clinical test that could be used to help diagnose persons at risk for developing mental illness later in life, as well as an approach for measuring the efficacies of different treatment options.

One of the studies, reported online Oct. 23 in Schizophrenia Research, shows that schizophrenia patients don’t register subtle changes in recurring sounds as well as others do, and that this deficit can be measured by recording patterns of electrical brain activity obtained through electroencephalography (EEG).

The second, published online earlier this month in NeuroImage: Clinical, establishes a link between certain EEG tests and patients’ cognitive and psychosocial impairments, suggesting that the EEG test could be used to objectively measure the severity of a patient’s condition, and conversely that it might be possible to alleviate some of the symptoms of schizophrenia with specialized cognitive exercises designed to strengthen auditory processing.
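
The auditory measure in the first study rests on comparing responses to repeated (standard) sounds with responses to occasional deviants. A generic difference-wave computation of that kind (illustrative only, not the UCSD analysis) looks like this:

import numpy as np

def deviance_response(standard_epochs, deviant_epochs, sample_rate_hz,
                      window_s=(0.10, 0.25)):
    # standard_epochs / deviant_epochs: arrays (n_trials, n_samples), time-locked to tone onset.
    # Returns the mean difference (deviant - standard) in the chosen post-stimulus window.
    # The window and sign convention are placeholders, not values from the study.
    standard = standard_epochs.mean(axis=0)
    deviant = deviant_epochs.mean(axis=0)
    start = int(window_s[0] * sample_rate_hz)
    stop = int(window_s[1] * sample_rate_hz)
    return float((deviant - standard)[start:stop].mean())

# Toy example: deviant tones evoke an extra negative deflection around 150 ms.
fs = 250
t = np.arange(int(0.4 * fs)) / fs
rng = np.random.default_rng(4)
standard = rng.standard_normal((200, len(t)))
deviant = standard + (-2.0 * np.exp(-((t - 0.15) ** 2) / 0.001))
print(round(deviance_response(standard, deviant, fs), 3))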

“We are at the point where we can begin to bring simple EEG tests into the clinical setting to help patients,” said Gregory Light, PhD, associate professor of psychiatry and a co-author on both studies. “We think it may be possible to train some patients’ auditory circuits to function better. This could improve their quality of life, and possibly reduce common symptoms of schizophrenia such as hearing voices.”

Read more here

Neuroscientists study our love for deep bass sounds

Have you ever wondered why bass-range instruments tend to lay down musical rhythms, while instruments with a higher pitch often handle the melody?

According to new research from Laurel Trainor and colleagues at the McMaster Institute for Music and The Mind, this is no accident, but rather a result of the physiology of hearing.

In other words, when the bass is loud and rock solid, we have an easier time following along to the rhythm of a song.

Read more

Fast talking

A couple of great videos showing world-record-holding fast-talkers. Both of these videos are pretty old, but a video from 2014 suggests that Fran Capo, at least, still holds her record. The later portion of this first video includes an interesting demonstration of fast-talking, EEG, and Broca’s area: 

As for non-record-breaking fast talking though, this report from Mic suggests that it’s more a choice than specialized brain function: 

In a study published in 2011, Bakker and colleagues compared the rate and clarity of speech in three groups of people: fast-talkers, clutterers and control group members who spoke normally. All study participants had to read the “rainbow passage” and a phonetically balanced string of nonsense words, as well as recite their nursery rhyme of choice, first at a comfortable pace and then a second time as quickly as their lips could move.
While reading at a natural pace, fast-talkers and clutterers both spoke faster than control group members. But when participants made a concerted effort to speed up, everyone spoke at about the same rate. 

Thought Crimes

Could your own brain betray you?

by Pearl Tesler

Imagine this: You get scooped up by police, fitted with electronic headgear, and shown a series of random pictures. Among the random pictures is a not-so-random one: A crime scene. A tiny electric twinge on your scalp tips off police; the crime scene is familiar to you. You are under arrest.

This story is made up—but the technology is not.

Decades ago, researchers discovered a neural phenomenon known as the P300 wave, a voltage spike that occurs in your brain whenever you see something familiar to you. Detectable by EEG, P300 is the basis of a lie-detection technique known as brain fingerprinting. Brain fingerprinting evidence has been deemed admissible in several court cases.
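
In outline, the probe-versus-irrelevant comparison behind P300-based techniques can be sketched as a toy computation (this is not Farwell's brain-fingerprinting protocol, just the shape of the idea):

import numpy as np

def p300_familiarity_score(probe_epochs, irrelevant_epochs, sample_rate_hz,
                           window_s=(0.25, 0.50)):
    # Compare the average EEG amplitude after the 'probe' image with the average after
    # irrelevant images, inside a typical P300 window (~250-500 ms post-stimulus).
    # A markedly larger probe response suggests familiarity. Thresholds, electrode
    # choice and statistics are all left out of this sketch.
    start = int(window_s[0] * sample_rate_hz)
    stop = int(window_s[1] * sample_rate_hz)
    probe_mean = probe_epochs.mean(axis=0)[start:stop].mean()
    irrelevant_mean = irrelevant_epochs.mean(axis=0)[start:stop].mean()
    return float(probe_mean - irrelevant_mean)

# Toy example: the probe evokes a positive bump around 300 ms, the irrelevants do not.
fs = 200
t = np.arange(int(0.8 * fs)) / fs
rng = np.random.default_rng(5)
irrelevant = rng.standard_normal((100, len(t)))
probe = rng.standard_normal((30, len(t))) + 3.0 * np.exp(-((t - 0.3) ** 2) / 0.002)
print(round(p300_familiarity_score(probe, irrelevant, fs), 2))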

Newer techniques promise to go even further in getting inside your head.

At the 2015 AAAS Annual Meeting last weekend in San Jose, brain researcher Jack Gallant of UC Berkeley presented surprisingly faithful reconstructions of brain activity using fMRI scans, a technique called brain decoding. He also cautioned that there was, as yet, no way to distinguish between real and distorted or fabricated memories. “We have a long way to go before this stuff is reliable.”

Still, rapid strides in neuroscience raise new questions about just how far the justice system can or should go in peeking into peoples’ minds. At what point, if any, do you lose your right to the privacy of your own thoughts?

Legal scholar Nita Farahany of Duke University is already on the case. She sees two potential bulwarks against neural prying: the 4th and 5th constitutional amendments, which protect against unlawful search and self-incrimination, respectively.

But even these redoubtable legal barriers may not be enough to guard against an Orwellian future in which your own brain betrays you in a court of law. Farahany cites fingerprint and DNA evidence, both routinely collected, as examples in which the body “testifies” against itself.

As scanning technologies improve, the open question of whether scrutiny of grey matter constitutes a reasonable search will become an increasingly grey area. For now, at least, your thoughts are your own.

Locked-in people’s awareness revealed by rainbow hair

These colourful Mohawks might be key to identifying people who are locked in their bodies with no way to communicate.

People who are in a vegetative state can sometimes have some awareness of their surroundings, but it can be difficult for doctors and family to work out how much when they can’t respond physically or verbally.

Researchers at the University of Cambridge measured 32 patients’ brain activity using an electroencephalograph (EEG) machine. The team analysed the networks of signals between brain regions in a bid to discover what they call “the neural signatures of consciousness”.
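
A generic version of this kind of between-region analysis (not the Cambridge group's published pipeline) builds a channel-by-channel coupling matrix, for example from band-limited correlations:

import numpy as np
from scipy.signal import butter, filtfilt

def connectivity_matrix(eeg, sample_rate_hz, band=(8.0, 13.0)):
    # eeg: array (n_channels, n_samples). Band-pass each channel, then take absolute
    # pairwise correlations between channels. The alpha band and the correlation
    # measure are generic choices for illustration.
    nyquist = sample_rate_hz / 2.0
    b, a = butter(4, [band[0] / nyquist, band[1] / nyquist], btype="band")
    filtered = filtfilt(b, a, eeg, axis=-1)
    return np.abs(np.corrcoef(filtered))

# Toy example: four channels, two of which share an alpha rhythm.
fs = 256
t = np.arange(10 * fs) / fs
rng = np.random.default_rng(6)
eeg = rng.standard_normal((4, len(t)))
shared = np.sin(2 * np.pi * 10 * t)
eeg[0] += 2 * shared
eeg[1] += 2 * shared
print(np.round(connectivity_matrix(eeg, fs), 2))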

Learn more from newscientist.