Brain stethoscope will use music to help identify epileptic seizures

An artistic project developed by two researchers at Stanford University has turned into a quest to build a biofeedback tool that will help identify epileptic seizures through music.

Inspired by a performance based on radio signals from outer space, Stanford neurologist Josef Parvizi embarked on a project to discover what the brain’s electrical activity would sound like set to music.

He enlisted the help of Chris Chafe, a professor of music research, and passed on to him the electroencephalogram (EEG) recording data of a consenting patient. From there Chafe, who has experience in converting natural signals into electronic music, set the electrical spikes of the rapidly firing neurons against music with human voice-like tones.
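
Chafe's actual synthesis chain isn't detailed here, but the underlying idea – sonification, mapping a signal's moment-to-moment amplitude onto audible parameters – is simple to sketch. The Python snippet below is a minimal illustration, with a synthetic trace standing in for real EEG data: louder brain activity is rendered as a higher-pitched tone, and the result is written to a WAV file.

    import numpy as np
    import wave

    fs_audio = 44100   # audio sample rate (Hz)
    fs_eeg = 256       # a typical EEG sample rate (Hz)

    # Synthetic stand-in for an EEG trace: 10 s of noise whose amplitude
    # swells in the middle, loosely mimicking a seizure.
    t = np.linspace(0, 10, 10 * fs_eeg)
    eeg = (1 + 4 * np.exp(-((t - 5) ** 2))) * np.random.randn(t.size)

    # Upsample the EEG amplitude to audio rate and map amplitude -> pitch.
    amp = np.interp(np.linspace(0, 10, 10 * fs_audio), t, np.abs(eeg))
    amp /= amp.max()
    freq = 200 + 600 * amp                  # 200-800 Hz; louder EEG = higher pitch
    phase = 2 * np.pi * np.cumsum(freq) / fs_audio
    audio = (0.5 * np.sin(phase) * 32767).astype(np.int16)

    with wave.open("sonified_eeg.wav", "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(fs_audio)
        f.writeframes(audio.tobytes())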

When the pair listened back to the music, they realised they had gone beyond creating something artistic: by chance, they had stumbled upon a way to differentiate seizure activity in the brain from non-seizure activity. The two states were clearly discernible from the change in the music, says Chafe. “It was like turning a radio dial from a static-filled station to a clear one.”

The results don’t make for particularly easy listening (as you can hear in the embedded video), but the distinct phases of a seizure are easy to pick out as the music changes throughout the piece.

The first, fairly rhythmic sounds represent the brain activity during the pre-ictal stage, which is before the seizure begins. Just before the seizure occurs, the sounds become increasingly louder, more frenzied and more unpredictable, peaking as the seizure takes place and the brain enters the ictal state. Suddenly the chaos subsides and the high frequency noises die down as the seizure tails off and the brain enters the fatigued, post-ictal state.
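
The phase structure the piece makes audible can also be read off the signal directly. As a rough illustration – not the researchers' method – the sliding-window sketch below flags windows whose high-frequency power exceeds a multiple of a baseline, one crude way to separate ictal activity from the quieter stages around it (it assumes the recording starts in a non-seizure state):

    import numpy as np
    from scipy.signal import welch

    def label_windows(eeg, fs, win_s=2.0, band=(15.0, 40.0), ratio=5.0):
        """Label each window 'ictal' if its power in `band` (Hz) exceeds
        `ratio` times the power of the first, baseline window."""
        step = int(win_s * fs)
        labels, baseline = [], None
        for start in range(0, len(eeg) - step, step):
            f, pxx = welch(eeg[start:start + step], fs=fs, nperseg=step // 2)
            power = pxx[(f >= band[0]) & (f <= band[1])].sum()
            if baseline is None:
                baseline = power
            labels.append("ictal" if power > ratio * baseline else "interictal")
        return labels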

Caregivers of people with epilepsy often struggle to identify when a seizure is occurring, or when one might be about to occur. Chafe and Parvizi concluded that if they could work out how to achieve the same result using real-time brain activity data, it might be possible to develop a tool – a brain stethoscope – that could tell them.

The pair is currently developing the tool, which could be used for listening for seizures or for distinguishing when the brain is in a post-ictal state.

“Someone – perhaps a mother caring for a child – who hasn’t received training in interpreting visual EEGs can hear the seizure rhythms and easily appreciate that there is a pathological brain phenomenon taking place,” says Parvizi.

A prototype of the stethoscope, which will consist of a headset that will transmit an EEG of the wearer’s brain activity to a handheld device, is due to go on display at Stanford next year.

Eunoia II

Tech art performance piece by Lisa Park creates sounds and visuals from brainwaves using an EEG headset and an arrangement of 48 speakers, each topped with a plate of water - video embedded below:

“Eunoia II” is an iteration of “Eunoia”, which was my first performance using a commercial EEG (brainwave sensor) headset to obtain real-time feedback of my brainwaves and emotional reactions.

“Eunoia II” uses the emotional values (frustration, excitement, engagement, meditation) picked up by the Emotiv EEG headset, which are then translated into sound waves that create vibrations in the pools of water placed atop the speakers. Throughout the performance, the intensity of my feelings is mirrored in real time by the intensity of the sound - its volume, speed, and panning.

“Eunoia II” comprises 48 speakers with metal plates of various sizes. The number 48 refers to the 48 emotions that philosopher Baruch Spinoza defined in his book “Ethica”.
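
Park's own mapping isn't documented in detail, but the general approach – scaling audio parameters by normalised emotion scores – can be sketched. In the hypothetical Python function below, a dict of the four Emotiv-style values (each assumed to lie in 0–1) is turned into volume, playback rate, and stereo pan; the specific formulas are illustrative, not Park's:

    def emotion_to_audio(scores):
        """Map normalised emotion scores (0-1) to audio parameters.
        A hypothetical mapping for illustration, not Park's own."""
        volume = 0.2 + 0.8 * scores["excitement"]          # louder when excited
        rate = 0.75 + 0.5 * scores["frustration"]          # faster when frustrated
        pan = scores["engagement"] - scores["meditation"]  # -1 (left) .. +1 (right)
        return {"volume": volume, "rate": rate, "pan": max(-1.0, min(1.0, pan))}

    print(emotion_to_audio(
        {"excitement": 0.9, "frustration": 0.3, "engagement": 0.6, "meditation": 0.2}))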

You can find out more about the artist at her website here, including the first Eunoia project.

Researcher controls colleague’s motions in 1st human brain-to-brain interface

University of Washington researchers have performed what they believe is the first noninvasive human-to-human brain interface, with one researcher able to send a brain signal via the Internet to control the hand motions of a fellow researcher.

Using electrical brain recordings and a form of magnetic stimulation, Rajesh Rao sent a brain signal to Andrea Stocco on the other side of the UW campus, causing Stocco’s finger to move on a keyboard.

While researchers at Duke University have demonstrated brain-to-brain communication between two rats, and Harvard researchers have demonstrated it between a human and a rat, Rao and Stocco believe this is the first demonstration of human-to-human brain interfacing.

“The Internet was a way to connect computers, and now it can be a way to connect brains,” Stocco said. “We want to take the knowledge of a brain and transmit it directly from brain to brain.”

The researchers captured the full demonstration on video recorded in both labs.

Rao, a UW professor of computer science and engineering, has been working on brain-computer interfacing in his lab for more than 10 years and just published a textbook on the subject. In 2011, spurred by the rapid advances in technology, he believed he could demonstrate the concept of human brain-to-brain interfacing. So he partnered with Stocco, a UW research assistant professor in psychology at the UW’s Institute for Learning & Brain Sciences.

On Aug. 12, Rao sat in his lab wearing a cap with electrodes hooked up to an electroencephalography machine, which reads electrical activity in the brain. Stocco was in his lab across campus wearing a purple swim cap marked with the stimulation site for the transcranial magnetic stimulation coil that was placed directly over his left motor cortex, which controls hand movement.

The team had a Skype connection set up so the two labs could coordinate, though neither Rao nor Stocco could see the Skype screens.

Rao looked at a computer screen and played a simple video game with his mind. When he was supposed to fire a cannon at a target, he imagined moving his right hand (being careful not to actually move his hand), causing a cursor to hit the “fire” button. Almost instantaneously, Stocco, who wore noise-canceling earbuds and wasn’t looking at a computer screen, involuntarily moved his right index finger to push the space bar on the keyboard in front of him, as if firing the cannon. Stocco compared the feeling of his hand moving involuntarily to that of a nervous tic.

“It was both exciting and eerie to watch an imagined action from my brain get translated into actual action by another brain,” Rao said. “This was basically a one-way flow of information from my brain to his. The next step is having a more equitable two-way conversation directly between the two brains.”

The technologies used by the researchers for recording and stimulating the brain are both well-known. Electroencephalography, or EEG, is routinely used by clinicians and researchers to record brain activity noninvasively from the scalp. Transcranial magnetic stimulation is a noninvasive way of delivering stimulation to the brain to elicit a response. Its effect depends on where the coil is placed; in this case, it was placed directly over the brain region that controls a person’s right hand. By activating these neurons, the stimulation convinced the brain that it needed to move the right hand.

Computer science and engineering undergraduates Matthew Bryan, Bryan Djunaedi, Joseph Wu and Alex Dadgar, along with bioengineering graduate student Dev Sarma, wrote the computer code for the project, translating Rao’s brain signals into a command for Stocco’s brain.
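
The team's code isn't published with this article, but the pipeline it describes – detect an imagined-movement signature in the EEG, then send a trigger over the network – can be sketched. Imagined hand movement typically suppresses mu-band (8–12 Hz) power over the motor cortex, so a toy detector might look like the following (the `get_window` callback, the threshold, and the addresses are all illustrative assumptions):

    import socket
    import numpy as np
    from scipy.signal import welch

    def mu_power(window, fs=256):
        """Power in the mu band (8-12 Hz) of one EEG window."""
        f, pxx = welch(window, fs=fs, nperseg=len(window))
        return pxx[(f >= 8) & (f <= 12)].sum()

    def run(get_window, baseline, host="10.0.0.2", port=9999):
        """Poll 1-second EEG windows; when mu power drops well below
        baseline (imagined movement), send a trigger to the remote lab."""
        with socket.create_connection((host, port)) as sock:
            while True:
                window = get_window()                  # caller supplies EEG samples
                if mu_power(window) < 0.5 * baseline:  # illustrative threshold
                    sock.sendall(b"FIRE\n")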

“Brain-computer interface is something people have been talking about for a long, long time,” said Chantel Prat, assistant professor in psychology at the UW’s Institute for Learning & Brain Sciences, and Stocco’s wife and research partner who helped conduct the experiment. “We plugged a brain into the most complex computer anyone has ever studied, and that is another brain.”

At first blush, this breakthrough brings to mind all kinds of science fiction scenarios. Stocco jokingly referred to it as a “Vulcan mind meld.” But Rao cautioned this technology only reads certain kinds of simple brain signals, not a person’s thoughts. And it doesn’t give anyone the ability to control your actions against your will.

Both researchers were in the lab wearing highly specialized equipment and under ideal conditions. They also had to obtain approval under, and follow, a stringent set of international human-subject testing rules to conduct the demonstration.

“I think some people will be unnerved by this because they will overestimate the technology,” Prat said. “There’s no possible way the technology that we have could be used on a person unknowingly or without their willing participation.”

Stocco said years from now the technology could be used, for example, by someone on the ground to help a flight attendant or passenger land an airplane if the pilot becomes incapacitated. Or a person with disabilities could communicate his or her wish, say, for food or water. The brain signals from one person to another would work even if they didn’t speak the same language.

Rao and Stocco next plan to conduct an experiment that would transmit more complex information from one brain to the other. If that works, they then will conduct the experiment on a larger pool of subjects.

Artist Manipulates 48 Pools of Water with Her Mind

“Brain power” takes on a literal meaning when it comes to EEG painting, mind-responsive furniture, and the work of Lisa Park. Park combines EEG scanning with speakers and pools of water to visualize her thoughts and emotions. Last year, she exposed her brain patterns to the world with Eunoia, in which she placed five water-filled metal plates atop speakers designed to respond to her real-time brain data. In that project, Park sorted the data into five emotions—sadness, anger, desire, happiness, and hatred, one per plate. But the latest iteration of the project takes the experiment to the next level:

Eunoia II is outfitted with 48 vibration pools, inspired by the 48 emotions philosopher Baruch Spinoza outlined in his book, Ethica, like frustration, excitement, engagement, and meditation. Each speaker vibrates according to Park’s brain wave-interpreting algorithm, which transforms intense signals from Park’s Emotiv EEG headset into correspondingly intense vibrations in the pools of water. Here, Park is literally putting her inner struggles on display, and the whole show depends on how she deals with her feelings.

“I started working with biosensors especially EEG headset, because I questioned, ‘how can I take this invisible energy and emotions and make it visible?’” Park told The Creators Project. “When I am feeling certain emotions (anger, sadness, happiness), I believe that what’s inside me, more than 60% of water in human body, will create vibrations/energy within myself. So, I wanted to create an artwork that represents the inner part of myself.”

Eunoia II metaphorically gives Park’s inner self agency and visibility, continuing the exploration she began in her first Eunoia performance, tenfold. (via The Creators Project)

BIOMEDIATION

Audiovisual performance project by João Beira in which a meditating performer wears an EEG brainwave reader - the readings distort the Kinect-captured visuals - video embedded below:

BIOMEDIATION is a sensor-based audiovisual performance that digitizes the practice of meditation.
Through the use of an EEG headset, the cognitive and emotional experience of the performer is translated dynamically into sound and video compositions. It connects the body to brain activity, merging the physical world with the psychic dimension. A collaboration between João Beira and Yago de Quay.

[Link]

Smartphone thumb skills are altering our brains

Every region of the body – from the toes to the jaw and tongue – has a particular processing area in the somatosensory cortex of the brain. These areas are flexible and can change. In the case of violinists, for instance, the area representing the fingers that guide the instrument is larger than in other people. Arko Ghosh from the Institute of Neuroinformatics of the University of Zurich and ETH Zurich decided to investigate the impact that the finger dexterity of smartphone users has on the brain, and discovered that the day-to-day plasticity of the human brain can be researched through our smartphone usage: the usage records kept by the devices provide a fertile source of data on this behavior. “Smartphones offer us an opportunity to understand how normal life shapes the brains of ordinary people,” explains Ghosh.

Teaming up with colleagues from the University of Fribourg, he studied the activation in the sensorimotor cortex that is triggered by finger movements. The scientists used electroencephalography (EEG) to measure the cortical brain activity in 37 right-handed people, of whom 26 were touchscreen smartphone users and 11 users of old cellphones. Sixty-two electrodes placed on the test subjects’ heads recorded the potentials evoked by movements of the thumb, forefinger and middle finger. The results revealed that the cortical representation in touchscreen smartphone users differed from that in people with conventional cellphones.
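
The cortical potentials measured here are event-related potentials: tiny voltage deflections time-locked to each finger movement, recovered by averaging many repetitions so that unrelated brain activity cancels out. A minimal sketch of that averaging step, assuming `eeg` holds one electrode's samples and `events` the sample index of each movement:

    import numpy as np

    def erp(eeg, events, fs=500, pre=0.1, post=0.4):
        """Average EEG epochs time-locked to movement events.
        eeg: 1-D array for one electrode; events: event sample indices."""
        a, b = int(pre * fs), int(post * fs)
        epochs = np.array([eeg[s - a: s + b] for s in events
                           if s - a >= 0 and s + b <= len(eeg)])
        epochs -= epochs[:, :a].mean(axis=1, keepdims=True)  # baseline-correct
        return epochs.mean(axis=0)  # noise averages out; the potential remains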

Cortical activity depends on daily usage

Ghosh was also able to demonstrate that the frequency of smartphone usage influences cortical activity. The more the smartphone had been used in the previous ten days, the greater the signal in the brain. The correlation was strongest – essentially proportional – in the area representing the thumb.

“At first glance, this discovery seems comparable to what happens in violinists,” explains Ghosh. However, the researchers were able to draw two distinctions: firstly, how long smartphone users have owned and used a device plays no role; in violinists, by contrast, the activity in the brain depended on the age at which they started playing. Secondly, there is a linear connection between activation in the brain and the recency of smartphone use, whereas earlier studies found no evidence of this for violinists.

“The digital technology we use on a daily basis shapes the sensory processing in our brains – and on a scale that surprised us,” says the neuroscientist in summary.

Lucia N°03 is a revolutionary lamp system that combines a stroboscopic light source, flickering at variable speeds and intensities, with a constant light that can be operated at different brightness levels. The stroboscopic lamp can induce various hypnagogic effects, such as intense perception of colour and form, bodilessness, and meditative, dream-like visionary experiences. It is widely used in therapeutic settings as well as in research on altered states, consciousness and near-death experiences (NDEs).

www.lucialightexperience.com

Controlling genes with your thoughts

It sounds like something from the scene in Star Wars where Master Yoda instructs the young Luke Skywalker to use the force to release his stricken X-Wing from the swamp: Marc Folcher and other researchers from the group led by Martin Fussenegger, Professor of Biotechnology and Bioengineering at the Department of Biosystems (D-BSSE) in Basel, have developed a novel gene regulation method that enables thought-specific brainwaves to control the conversion of genes into proteins – called gene expression in technical terms.

“For the first time, we have been able to tap into human brainwaves, transfer them wirelessly to a gene network and regulate the expression of a gene depending on the type of thought. Being able to control gene expression via the power of thought is a dream that we’ve been chasing for over a decade,” says Fussenegger.

A source of inspiration for the new thought-controlled gene regulation system was the game Mindflex, where the player wears a special headset with a sensor on the forehead that records brainwaves. The registered electroencephalogram (EEG) is then transferred into the playing environment. The EEG controls a fan that enables a small ball to be thought-guided through an obstacle course.

Wireless transmission to implant

The system, which the Basel-based bioengineers recently presented in the journal Nature Communications, also makes use of an EEG headset. The recorded brainwaves are analysed and wirelessly transmitted via Bluetooth to a controller, which in turn controls a field generator that generates an electromagnetic field; this supplies an implant with an induction current.

A light then literally goes on in the implant: an integrated LED lamp that emits light in the near-infrared range turns on and illuminates a culture chamber containing genetically modified cells. When the near-infrared light illuminates the cells, they start to produce the desired protein.
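
The chain described above – EEG headset, analysis, Bluetooth link, field generator, induction-powered implant LED – is in essence a simple control loop. A highly simplified sketch of the controller side, with hypothetical `read_mental_state` and `set_field_generator` hooks standing in for the actual hardware interfaces:

    import time

    def control_loop(read_mental_state, set_field_generator, period=1.0):
        """Poll the EEG classifier and gate the field generator, which
        powers the implant's near-infrared LED by induction."""
        while True:
            state = read_mental_state()  # e.g. 'meditation', 'concentration'
            # Hypothetical policy mirroring the reported results:
            # meditation drives strong illumination, concentration moderate.
            level = {"meditation": 1.0, "concentration": 0.5}.get(state, 0.0)
            set_field_generator(level)   # 0.0 = off, 1.0 = full power
            time.sleep(period)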

Thoughts control protein quantity

The implant was initially tested in cell cultures and mice, and controlled by the thoughts of various test subjects. For the tests, the researchers used SEAP, an easy-to-detect human model protein that diffuses from the culture chamber of the implant into the mouse’s bloodstream.

To regulate the quantity of released protein, the test subjects were categorised according to three states of mind: bio-feedback, meditation and concentration. Test subjects who played Minecraft on the computer, i.e. who were concentrating, induced average SEAP values in the bloodstream of the mice. When completely relaxed (meditation), the researchers recorded very high SEAP values in the test animals. For bio-feedback, the test subjects observed the LED light of the implant in the body of the mouse and were able to consciously switch the LED light on or off via the visual feedback. This in turn was reflected by the varying amounts of SEAP in the bloodstream of the mice.

New light-sensitive gene construct

“Controlling genes in this way is completely new and is unique in its simplicity,” explains Fussenegger. The light-sensitive optogenetic module that reacts to near-infrared light is a particular advancement. The light shines on a modified light-sensitive protein within the gene-modified cells and triggers an artificial signal cascade, resulting in the production of SEAP. Near-infrared light was used because it is generally not harmful to human cells, can penetrate deep into the tissue and enables the function of the implant to be visually tracked.

The system functions efficiently and effectively in the human-cell culture and human-mouse system. Fussenegger hopes that a thought-controlled implant could one day help to combat neurological diseases, such as chronic headaches, back pain and epilepsy, by detecting specific brainwaves at an early stage and triggering and controlling the creation of certain agents in the implant at exactly the right time.

Brainflight: Brain Computer Interface for controlling drones

Researchers from the EU project Brainflight (Portugal, Germany and the Netherlands) have developed a brain-to-computer interface that enables people to control drones with their minds.

During a public presentation in Lisbon, Portugal, the TEKEVER and Champalimaud teams used high-performance electroencephalogram (EEG) systems to measure brain waves noninvasively, and specially designed algorithms to convert the brain signals into drone commands. The “drone operator”, wearing a cap that measures brain activity, influences the drone’s path using nothing but simple thoughts. Essentially, the electricity flowing through the pilot’s brain acts as an input to the drone’s control system, carrying out in the air a mission with objectives previously defined by the research team.
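
The project's actual classification algorithms aren't described here, but the general shape of such a converter – EEG features in, a small discrete command set out – can be sketched. One common approach in hands-free BCI demos (assumed here for illustration, not confirmed for Brainflight) is to steer using the alpha-power asymmetry between left- and right-hemisphere electrodes:

    import numpy as np
    from scipy.signal import welch

    def alpha_power(channel, fs=256):
        """Power in the alpha band (8-13 Hz) of one EEG channel."""
        f, pxx = welch(channel, fs=fs, nperseg=len(channel))
        return pxx[(f >= 8) & (f <= 13)].sum()

    def eeg_to_command(left_ch, right_ch, margin=1.3):
        """Map left/right alpha asymmetry to a steering command.
        An illustrative scheme only, not the Brainflight algorithm."""
        l, r = alpha_power(left_ch), alpha_power(right_ch)
        if l > margin * r:
            return "TURN_RIGHT"   # relatively idle left hemisphere
        if r > margin * l:
            return "TURN_LEFT"
        return "HOLD_COURSE"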

“The project has successfully demonstrated the use of a brain-computer interface (BMI) on a simulator for the Diamond DA42 aircraft, where one pilot controlled the simulator through the Brainflight system. We’ve also integrated the BMI into the UAV ground systems and have successfully tested it in UAV simulators. We’re now taking it one step further, and performing live flight tests with the UAV,” said Ricardo Mendes, TEKEVER’s COO.

[read more]

Mood Controlled Lights

Project by CEDE which alters the colour of networked lights based on real-time data from an EEG brain-reading headset - video embedded below:

So, we wondered, what would be the effects of controlling the ambient lighting of a room using our own – or others’ – mood state? What if my friends, family or coworkers could “see” how I am feeling in real-time? What if we could design a system that reacts to our mental state, in a way that enhances or balances a certain mental state? What could this mean for digital communications, exchanges and relationships?

And so, we built such an affect-aware system, using a few Hue lights (recently launched by Philips) and an EEG-based emotion detection system. The principle is simple: Emotiv reads the brain activity of the user and translates it into five different emotional states: excitement, frustration, engagement, meditation and long-term excitement. Our approach was to use these states as variables to control the colour emitted by the lamp.
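
CEDE's own code isn't shown, but wiring an emotion score to a Hue lamp is a short exercise against the Hue bridge's REST API (v1), whose `hue` value runs 0–65535. The bridge IP, API username, and the colour mapping below are placeholders for illustration:

    import requests

    BRIDGE = "192.168.1.10"     # placeholder bridge IP
    USER = "your-api-username"  # placeholder Hue API username

    def show_emotion(light_id, excitement, frustration):
        """Blend two 0-1 emotion scores into a lamp colour:
        calm -> blue (hue ~46000), frustrated -> red (hue ~0)."""
        hue = int(46000 * (1.0 - frustration))  # Hue API colour scale 0-65535
        bri = int(54 + 200 * excitement)        # brightness 1-254
        requests.put(
            f"http://{BRIDGE}/api/{USER}/lights/{light_id}/state",
            json={"on": True, "hue": hue, "sat": 254, "bri": bri},
            timeout=2,
        )

    show_emotion(1, excitement=0.8, frustration=0.1)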

More at CEDE here

New High-Tech Lab Records the Brain and Body in Action

How does an autistic child take in information when he sits in a classroom abuzz with social activity? How long does it take someone with multiple sclerosis, which slows activity in the brain, to process the light bouncing off the windshield while she drives?

Until recently, the answers to basic questions of how diseases affect the brain – much less the ways to treat them – were lost to the limitations on how scientists could study brain function under real-world conditions. Most technology immobilized subjects inside big, noisy machines or tethered them to computers that made it impossible to simulate what it’s really like to live and interact in a complex world.

But now UC San Francisco neuroscientist Adam Gazzaley, MD, PhD, is hoping to paint a fuller picture of what is happening in the minds and bodies of those suffering from brain disease with his new lab, Neuroscape, which bridges the worlds of neuroscience and high-tech.

In the Neuroscape lab, wireless and mobile technologies set research participants free to move around and interact inside 3-D environments, while scientists make functional recordings with an array of technologies. Gazzaley hopes this will bring his field closer to understanding how complex neurological and psychiatric diseases really work and help doctors like him repurpose technologies built for fitness or fun into targeted therapies for their patients.

“I want us to have a platform that enables us to be more creative and aggressive in thinking how software and hardware can be a new medicine to improve brain health,” said Gazzaley, an associate professor of neurology, physiology and psychiatry and director of the UCSF Neuroscience Imaging Center. “Often, high-tech innovations take a decade to move beyond the entertainment industry and reach science and medicine. That needs to change.”

As a demonstration of what Neuroscape can do, Gazzaley’s team created new imaging technology that he calls GlassBrain, in collaboration with the Swartz Center at UC San Diego and Nvidia, which makes high-end computational computer chips. GlassBrain creates vivid, color visualizations of the structures of the brain and the white matter that connects them, as they pulse with electrical activity in real time.

These brain waves are recorded through electroencephalography (EEG), which measures electrical potentials on the scalp. Ordinary EEG recordings look like wavy horizontal lines, but GlassBrain turns the data into bursts of rhythmic activity that speed along golden spaghetti-like connections threading through a glowing, multi-colored glass-like image of a brain. Gazzaley is now looking at how to feed this information back to his subjects, for example by using the data from real-time EEG to make video games that adapt as people play them to selectively challenge weak brain processes. 
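
The closed loop Gazzaley describes – estimate a brain signal in real time, then adapt the game to keep the player near the edge of their ability – is essentially a feedback controller. A toy sketch, with a hypothetical 0–1 `engagement` metric derived from the EEG:

    def adapt_difficulty(engagement, difficulty, target=0.7, gain=0.1):
        """Nudge game difficulty so the player's measured engagement
        (a hypothetical 0-1 EEG metric) tracks a target level."""
        difficulty += gain * (engagement - target)
        return min(1.0, max(0.0, difficulty))

    # Each trial: difficulty = adapt_difficulty(current_engagement, difficulty)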

Gazzaley has already used the technology to image the brain of former Grateful Dead drummer Mickey Hart as he plays a hypnotic, electronic beat on a Roland digital percussion device with NeuroDrummer, a game the Gazzaley Lab is designing to enhance brain function through rhythmic training. Hart, whose brain is healthy, is collaborating with Gazzaley to develop the game and performed on NeuroDrummer while immersed in virtual reality on an Oculus Rift at the Neuroscape lab opening on March 5.

The Neuroscape lab will be available to all UCSF researchers who study the brain. And Gazzaley ultimately hopes it will aid in the development of therapies for diseases as varied as Alzheimer’s, post-traumatic stress disorder, attention deficit hyperactivity disorder, schizophrenia, autism, depression and multiple sclerosis.

Reposted from pro_choice on instagram:
(Here’s the exact text for those who might not be able to fully read it)

“Life is determined by electroencephalogram (EEG). If you no longer have EEG this means you are ‘Brain Dead’ which is considered the legal, medical, and scientific definition of death, despite your beating heart. Brain Death is the final cessation of activity in the central nervous system especially as indicated by a flat EEG for a predetermined length of time. Those who are Brain Dead show no clinical signs of brain activity including no response to pain and no cranial nerve reflexes. This means an organism such as a fetus is by definition not alive due to the lack of EEG. Which means a fetus CANNOT be murdered, only disposed of. By legal scientific and medical terms, abortion is not murder.“

Can I get a “hell yeah” for science??
(Also: if anyone wants to read more on EEGs, I suggest: http://en.m.wikipedia.org/wiki/Electroencephalography )

-asha

New prosthetic arm controlled by neural messages

The design aims to identify the memory of movement in an amputee’s brain and translate it into commands for manipulating the device.

Controlling a prosthetic arm by just imagining a motion may be possible through the work of Mexican scientists at the Centre for Research and Advanced Studies (CINVESTAV), who are developing an arm prosthesis that identifies movement patterns in brain signals.

“First, it is necessary to know whether there is a memory pattern in the amputee’s brain that records how the arm used to move, so that it can be translated into instructions for the prosthesis,” says Roberto Muñoz Guerrero, researcher at the Department of Electrical Engineering and project leader at CINVESTAV.

He explains that the electric signal won’t come from the muscles that form the stump, but from the movement patterns of the brain. “If this phase is successful, the patient would be able to move the prosthesis by imagining different movements.”

However, Muñoz Guerrero acknowledges this is not an easy task, because the brain registers the wide range of activities occurring throughout the human body, and the movement pattern must be extracted from among them all. “Therefore, the first step is to locate the patterns in the EEG and define there the memory that can be electrically recorded. Then we need to evaluate how sensitive the signal is to other external stimuli, such as light or blinking.”
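
The sensitivity to blinks that Muñoz Guerrero mentions is usually handled by rejecting contaminated segments before any pattern matching. A minimal amplitude-threshold sketch of that screening step (real systems use subtler methods, such as independent component analysis):

    import numpy as np

    def reject_blinks(epochs, threshold_uv=100.0):
        """Drop EEG epochs whose peak amplitude exceeds a threshold,
        a crude screen for eye-blink artifacts.
        epochs: array of shape (n_epochs, n_samples), in microvolts."""
        keep = np.abs(epochs).max(axis=1) < threshold_uv
        return epochs[keep]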

It should be noted that the prosthesis could only be used by individuals who once had an entire arm that was later amputated because of an accident or illness. Such patients were once able to move the arm naturally, and so have stored in memory the movement processes that the prosthesis would draw on.

According to the researcher, the prosthesis must combine a mechanical and electronic system, the elements necessary to actuate it, and a module to interpret the brain signals. “The material it will be built from has not yet been fully defined, because the prosthesis must weigh between two and three kilograms, similar to the weight of the missing arm.”

The prosthesis belongs to a newer area of bioelectronics called BCI (brain-computer interface): a direct communication pathway between the brain and an external device, used to assist or restore sensory and motor functions. “An additional benefit is the ability to create motion paths for the prosthesis, which is not possible with commercial products,” says Muñoz Guerrero.

Illumino

Homemade wearable tech project: a beanie hat with an LED bobble and an EEG brain-reading attachment, which changes colour depending on your mental state - video embedded below:

Ever wanted to visualize your brain activity in real-time? Move an object on a screen with your mind? EEG devices are fantastic fun and allow you to do such things!

This tutorial will show you how to make an illumino: a recreational EEG hat that turns your brainwaves into an array of colorful light, using Neopixel RGB LEDs hidden inside a white pompom. Don’t like the pompom idea? Put the LEDs on a bracelet, clothing, or other accessory!

The device runs on NeuroSky’s ThinkGear™ ASIC Module and the TinyLily Arduino microcontroller. All electronic components are discreetly hidden, so it looks and feels as though you’re just wearing a cool & comfy beanie.
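
The mapping at the heart of the project – a 0–100 attention or meditation value from the ThinkGear module into an LED colour – is easy to sketch; here in Python, with a colour ramp that is illustrative rather than the tutorial's exact palette:

    def attention_to_rgb(attention):
        """Map a ThinkGear-style attention value (0-100) onto a
        blue-to-red colour ramp for the pompom LEDs."""
        x = max(0, min(100, attention)) / 100.0
        return (int(255 * x), 0, int(255 * (1.0 - x)))  # (R, G, B)

    # Relaxed reads blue, focused reads red:
    print(attention_to_rgb(20), attention_to_rgb(90))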

To make your own, there is an Instructable tutorial here or check out the project page here