Brain-to-brain interface

techcrunch.com
Elon Musk reiterates the need for brain-computer interfaces in the age of AI
By Darrell Etherington

He’s spoken about the potential of brain interfaces, including a “neural lace,” before, but at the launch of Tesla in the UAE during the World Government Summit in Dubai on Monday, Musk articulated more clearly why we might seek to deepen our ties to our computing devices in the near future.

Such an interface, he said on Monday in Dubai, helps “achieve a symbiosis between human and machine intelligence, and maybe solves the control problem and the usefulness problem,” reports CNBC.

Source: https://techcrunch.com/2017/02/13/elon-musk-reiterates-the-need-for-brain-computer-interfaces-in-the-age-of-ai/

[Embedded YouTube video]

Brain-computer interface allows locked-in syndrome patients to communicate:

Scientists have developed a brain-computer interface that reads the brain’s blood oxygen levels and enables communication by deciphering the thoughts of patients who are totally paralyzed and unable to talk.

In a trial of the system in four patients with complete locked-in syndrome - incapable of moving even their eyes to communicate - it helped them use their thought waves to respond yes or no to spoken questions.

People who are paralyzed except for up and down eye movements and blinking are classified as having locked-in syndrome. If all eye movements are lost, the condition is referred to as complete locked-in syndrome.

Researchers leading this trial said the brain-computer interface (BCI), which is non-invasive, could transform the lives of such patients, allowing them to express feelings and opinions to their loved ones and carers.
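
To make the yes/no decoding idea more concrete, here is a minimal Python sketch of one way such an answer could be read out: average the blood-oxygen change over the answer window and compare it to a boundary calibrated on questions with known answers. This is purely illustrative; the actual system in the study uses far more elaborate signal processing, and every name and number below is invented.

```python
import numpy as np

def calibrate_boundary(training_trials, training_labels):
    """Toy calibration: midpoint between the mean 'yes' and mean 'no' response.

    training_trials: list of 1-D arrays of blood-oxygen change (arbitrary units)
    recorded during the answer window of questions with known answers."""
    yes = [t.mean() for t, lab in zip(training_trials, training_labels) if lab == "yes"]
    no = [t.mean() for t, lab in zip(training_trials, training_labels) if lab == "no"]
    return (np.mean(yes) + np.mean(no)) / 2.0

def classify_answer(trial, boundary):
    """Label a new answer-window trace as 'yes' or 'no'."""
    return "yes" if trial.mean() > boundary else "no"

# Synthetic stand-ins for real recordings: 'yes' trials run a bit higher.
rng = np.random.default_rng(0)
trials = [rng.normal(0.8, 0.3, 100) for _ in range(5)] + [rng.normal(0.2, 0.3, 100) for _ in range(5)]
labels = ["yes"] * 5 + ["no"] * 5
boundary = calibrate_boundary(trials, labels)
print(classify_answer(rng.normal(0.8, 0.3, 100), boundary))  # most likely "yes"
```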

Controlling turtle motion with human thought

Korea Advanced Institute of Science and Technology (KAIST) researchers have developed a technology that can remotely control an animal’s movement with human thought.

In the 2009 blockbuster “Avatar,” a human remotely controls the body of an alien by projecting his intelligence into a remotely located biological body. Although still in the realm of science fiction, researchers are nevertheless developing so-called ‘brain-computer interfaces’ (BCIs) following recent advances in electronics and computing. These technologies can ‘read’ human thought and use it to control machines, for example, humanoid robots.


[Embedded YouTube video]

This Robot is Controlled by Your Thoughts

“Lucy is a Brain-Machine Interface (BMI) wearable that can detect your brain signal and allow you to directly control electronic devices (including toys, smart home appliances and robotic devices). Lucy is designed to make your daily life more adventurous, enjoyable and efficient. Just by wearing Lucy, you can explore a new way of interacting with your environment!”

jpartspace asked:

Why doesn't Bioware show the full dialog options before you select them? It's been a criticism for a while now, but they have seemingly stuck to it. Seeing as Fallout 4 decided to adopt a similar system, they have to have a good reason, right?

The issue primarily stems from using a voiced protagonist. Because the player character speaks aloud, several implementation issues arise, some logistical and some cognitive. These issues vary in scope, some being bigger than others. Localization and screen space, for example, are problematic. The biggest issue, however, is probably the subvocalization element, which just doesn’t play well with a lot of players.

Here’s an example of a line from Fallout 4. If you choose “Sarcastic” as a response in one case, you get the line “I’m here to pick up an order. Two large pepperoni and a calzone. Name is ‘Fuck you’.” As you can see, the full line takes up a lot more screen space than the ‘Sarcastic’ keyword. If you replace all of the shorthand options on screen with full lines, you’re going to overwhelm the player with too many words. You can also lose clarity of purpose - the words “Tell me more” could be sarcastic or sincere depending purely on tone. This still doesn’t take into account localization: making the same line fit in the same screen space in other languages like German or Spanish can prove difficult.
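
To make the screen-space point concrete, here is a small hypothetical Python sketch: each option stores both the short keyword shown on the dialogue wheel and the full voiced line (plus translations), and a simple check flags text that would not fit a wheel slot. This is not BioWare’s or Bethesda’s actual code; the field names and the character budget are invented.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a dialogue option: the short keyword shown on the
# wheel versus the full voiced line, with localized variants of the full text.
@dataclass
class DialogueOption:
    keyword: str                  # what the wheel displays, e.g. "Sarcastic"
    full_line: str                # what the protagonist actually says
    localized_full: dict = field(default_factory=dict)  # language code -> translation

WHEEL_BUDGET = 24  # invented per-slot character budget for the dialogue wheel

def fits_on_wheel(text: str) -> bool:
    """Would this text fit where the keyword normally goes?"""
    return len(text) <= WHEEL_BUDGET

opt = DialogueOption(
    keyword="Sarcastic",
    full_line="I'm here to pick up an order. Two large pepperoni and a calzone.",
    localized_full={"de": "Ich bin hier, um eine Bestellung abzuholen. "
                          "Zwei grosse Peperoni und eine Calzone."},
)
print(fits_on_wheel(opt.keyword))               # True: the keyword fits
print(fits_on_wheel(opt.full_line))             # False: the full English line does not
print(fits_on_wheel(opt.localized_full["de"]))  # False: the German line is even longer
```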

However, the biggest problem is cognitive. There’s a documented term in cognitive science called “subvocalization” - when humans read, they actually say the words to themselves in order to understand what they are reading. Researchers have actually observed tiny motions in the human larynx while subjects read silently. Thus, when we present the entire line to the player, the vast majority of them will literally say it to themselves first, and then the character on screen will immediately say the same line aloud, so the player effectively gets every line twice. This gets annoying very, very quickly. Have you ever had a situation where a child imitates somebody else by repeating every word said? It’s a lot like that, except it applies to the protagonist (who has more lines than any other character in the game), so it happens constantly. It absolutely kills any humor and timing that depends on delivery. We’ve tested it out. It was not well received.

I will admit that games have had problems with paraphrasing not matching the actual intent of the line. It’s a bug, and those happen. It is a major problem for some of the fans, and I know Bioware has recognized it. However, the paraphrasing deals a lot better with localization and the subvocalization problem. There might be a better way overall to represent dialogue options in game that we haven’t thought of yet, but showing the entire line is not it. Really.



Elon Musk's latest target: Brain-computer interfaces

NEW YORK (AP) – Tech billionaire Elon Musk is announcing a new venture called Neuralink focused on linking brains to computers.

The company plans to develop brain implants that can treat neural disorders — and that may one day be powerful enough to put humanity on a more even footing with possible future superintelligent computers, according to a Wall Street Journal report citing unnamed sources.

Musk, a founder of both the electric-car company Tesla Motors and the private space-exploration firm SpaceX, has become an outspoken doomsayer about the threat artificial intelligence might one day pose to the human race. Continued growth in AI cognitive capabilities, he and like-minded critics suggest, could lead to machines that can outthink and outmaneuver humans with whom they might have little in common.

In a tweet Tuesday, Musk gave few details beyond confirming Neuralink’s name and tersely noting the “existential risk” of failing to pursue direct brain-interface work.

STIMULATING THE BRAIN

Some neuroscientists and futurists, however, caution against making overly broad claims for neural interfaces.

Hooking a brain up directly to electronics is itself not new. Doctors implant electrodes in brains to deliver stimulation for treating such conditions as Parkinson’s disease, epilepsy and chronic pain. In experiments, implanted sensors have let paralyzed people use brain signals to operate computers and move robotic arms. Last year, researchers reported that a man regained some movement in his own hand with a brain implant.

Musk’s proposal goes beyond this. Although nothing is developed yet, the company wants to build on those existing medical treatments as well as one day work on surgeries that could improve cognitive functioning, according to the Journal article.

Neuralink is not the only company working on artificial intelligence for the brain. Entrepreneur Bryan Johnson, who sold his previous payments startup Braintree to PayPal for $800 million, last year started Kernel, a company working on “advanced neural interfaces” to treat disease and extend cognition.

RISK OF OVERHYPE

Neuroscientists posit that the technology that Neuralink and Kernel are working on may indeed come to pass, though it’s likely to take much longer than the four or five years Musk has predicted. Brain surgery remains a risky endeavor; implants can shift in place, limiting their useful lifetime; and patients with implanted electrodes face a steep learning curve being trained how to use them.

“It’s a few decades down the road,” said Blake Richards, a neuroscientist and assistant professor at the University of Toronto. “Certainly within the 21st century, assuming society doesn’t implode, that is completely possible.”

Amy Webb, CEO of Future Today Institute, pointed out that the Neuralink announcement is part of a much larger field of human-machine interface research, dating back over a decade, performed at the University of Washington, Duke University and elsewhere.

Too much hype from one “buzzy” announcement like Neuralink, she said, could lead to another “AI Winter.” That’s a reference to the overhype of AI during the Cold War, which was followed by a backlash and reduced research funding when its big promises didn’t materialize.

“The challenge is, it’s good to talk about potential,” Webb said. “But the problem is if we fail to achieve that potential and don’t start seeing all these cool devices and medical applications we’ve been talking about then investors start losing their enthusiasm, taking funding out and putting it elsewhere.”

__

AP Science Writer Malcolm Ritter in New York contributed to this report.

Can the brain feel it? The world’s smallest extracellular needle-electrodes

A research team in the Department of Electrical and Electronic Information Engineering and the Electronics-Inspired Interdisciplinary Research Institute (EIIRIS) at Toyohashi University of Technology developed 5-μm-diameter needle-electrodes on 1 mm × 1 mm block modules. This tiny needle may help solve the mysteries of the brain and facilitate the development of a brain-machine interface. The research results were reported in Scientific Reports on Oct 25, 2016.

(Image caption: Extracellular needle-electrode with a diameter of 5 μm mounted on a connector)

The neuron networks in the human brain are extremely complex. Microfabricated silicon needle-electrode devices have been expected to enable recording and analysis of the electrical activity of microscale neuronal circuits in the brain.

However, smaller needle technologies (e.g., needle diameter < 10 μm) are necessary to reduce damage to brain tissue. In addition to the needle geometry, the device substrate should be minimized, not only to reduce the total damage to tissue but also to improve the electrode’s accessibility within the brain. Such electrode technologies would enable new experimental concepts in neurophysiology.

A research team in the Department of Electrical and Electronic Information Engineering and the EIIRIS at Toyohashi University of Technology developed 5-μm-diameter needle-electrodes on 1 mm × 1 mm block modules.

The individual microneedles are fabricated on the block modules, which are small enough to use in the narrow spaces within brain tissue, as demonstrated by recordings from the mouse cerebral cortex. In addition, the block modules markedly improve design flexibility in packaging, opening up numerous in vivo recording applications.

“We demonstrated the high design variability in the packaging of our electrode device, and in vivo neuronal recordings were performed by simply placing the device on a mouse’s brain. We were very surprised that high quality signals of a single unit were stably recorded over a long period using the 5-μm-diameter needle,” explained the first author, Assistant Professor Hirohito Sawahata, and co-author, researcher Shota Yamagiwa.

The leader of the research team, Associate Professor Takeshi Kawano said: “Our silicon needle technology offers low invasive neuronal recordings and provides novel methodologies for electrophysiology; therefore, it has the potential to enhance experimental neuroscience.” He added, “We expect the development of applications to solve the mysteries of the brain and the development of brain–machine interfaces.”
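
As a rough picture of what recording “high quality signals of a single unit” involves on the analysis side, the sketch below band-pass filters a synthetic extracellular trace and marks threshold crossings as candidate spikes. This is a generic textbook approach, not the Toyohashi group’s actual pipeline; the sampling rate, filter band, and threshold rule are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def detect_spikes(raw, fs=20000.0, band=(300.0, 3000.0), k=4.5):
    """Return sample indices of candidate spikes in an extracellular trace.

    Generic approach: band-pass to the spike band, estimate noise with the
    median absolute deviation, then find negative threshold crossings."""
    b, a = butter(3, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="bandpass")
    filtered = filtfilt(b, a, raw)
    noise = np.median(np.abs(filtered)) / 0.6745   # robust noise estimate
    below = filtered < (-k * noise)
    # Keep only the first sample of each contiguous threshold crossing.
    return np.flatnonzero(below & ~np.roll(below, 1))

# Synthetic demo: 1 s of noise (in µV) plus a few injected spike-like deflections.
fs = 20000.0
trace = np.random.default_rng(1).normal(0, 10, int(fs))
for t in (2000, 9000, 15000):
    trace[t:t + 20] -= 80.0                        # fake spikes
print(detect_spikes(trace, fs))
```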

Scientists prove that communication between minds separated by distance is possible.

Telepathy is the purported transmission of information from one person to another without using any of our known sensory channels or physical interaction. A brain-to-brain communication study organized by Harvard Medical School has revealed that human brains can “talk” directly to one another despite being physically separated by thousands of miles. Conducted by a group of robotics engineers and neuroscientists from across the globe, the study proves that information can be transmitted between two human brains through leveraging different passageways to the mind.

The participants in the study were between 28 and 50 years old. On the sending end, electrodes attached to one person’s scalp monitored brain currents and were hooked up to a computer that interpreted the signal. On the receiving end, subjects were tasked with interpreting the message through a computer-brain interface. When the messages were sent, the receivers experienced brain stimulation as flashes of light in their peripheral vision, and they were able to decipher simple messages.
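
As a very rough sketch of the encode/transmit/decode idea behind such a study, the Python below turns a short word into bits, maps each bit to a perceptible event for the receiver (“flash” or “no flash”), and decodes the receiver’s log of events back into text. The bit encoding and event names are invented for illustration and are not the study’s actual protocol.

```python
def word_to_bits(word: str) -> list[int]:
    """Encode a word as a flat list of bits (7 bits per ASCII character)."""
    return [int(b) for ch in word for b in format(ord(ch), "07b")]

def bits_to_events(bits: list[int]) -> list[str]:
    """Map each bit to the stimulus the receiver would perceive."""
    return ["flash" if b else "no-flash" for b in bits]

def events_to_word(events: list[str]) -> str:
    """Decode the receiver's log of perceived events back into text."""
    bits = "".join("1" if e == "flash" else "0" for e in events)
    return "".join(chr(int(bits[i:i + 7], 2)) for i in range(0, len(bits), 7))

events = bits_to_events(word_to_bits("hi"))
print(events_to_word(events))  # -> "hi"
```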

You have probably experienced something like telepathy with someone you are close to. Perhaps you have thought of an individual and all of a sudden received a call from them. Many people have claimed to know when a loved one has passed away despite thousands of miles of physical separation. For telepathy to work, both the sender and receiver must have an open-minded attitude and believe that it will work. An environment free of distractions is ideal, and your mind must be clear of thoughts. The next step is visualizing the receiver sitting beside you and imagining a tube connecting the two of you. Picture your thoughts being transmitted through the tube. Paint a mental picture of whatever you are thinking of and infuse your thoughts with positive emotion.

This study represents only a small step toward engineering telepathy, which might take years - or decades - to perfect. Ultimately, the goal is to remove the computer middleman from the transmission equation and allow direct brain-to-brain communication between people. 

The Collective Consciousness. During this interactive, audio-driven experience, participants are actively confronted with all sorts of philosophical questions that arise from the use of a brain-to-brain interface. The program was developed by the theater in collaboration with Radboud University and was previously presented at the Over het IJ Festival and Stukafest.

Photography by © ilovisual

Watching thoughts — and addiction — form in the brain

More than a hundred years ago, Ivan Pavlov conducted what would become one of the most famous and influential psychology studies — he conditioned dogs to salivate at the ringing of a bell. Now, with a new technique, scientists are able to see in real time what happens in the brains of live animals during this classic experiment. Ultimately, the approach could lead to a greater understanding of how we learn, and how we develop and break addictions.

(Image caption: In a mouse brain, cell-based detectors called CNiFERs change their fluorescence when neurons release dopamine. Credit: Slesinger & Kleinfeld labs)

Scientists presented their work at the 252nd National Meeting & Exposition of the American Chemical Society (ACS).

The study presented is part of the event: “Kavli symposium on chemical neurotransmission: What are we thinking?” It includes a line-up of global research and thought leaders at the multi-disciplinary interfaces of the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative with a focus on chemists’ contributions. The effort was launched in 2013 by the Obama Administration to enable researchers to study how brain cells interact to form circuits.

“We developed cell-based detectors called CNiFERs that can be implanted in a mouse brain and sense the release of specific neurotransmitters in real time,” says Paul A. Slesinger, Ph.D., who used this tool to revisit Pavlov’s experiment. Neurotransmitters are the chemicals that transmit messages from one neuron to another.

CNiFERs stands for “cell-based neurotransmitter fluorescent engineered reporters.” These detectors emit light that is readable with a two-photon microscope and are the first optical biosensors to distinguish between the nearly identical neurotransmitters dopamine and norepinephrine. These signaling molecules are associated respectively with pleasure and alertness.

Slesinger, of the Icahn School of Medicine at Mount Sinai in New York, collaborated on the project with David Kleinfeld, Ph.D., at the University of California at San Diego. Their team conditioned mice by playing a tone and then, after a short delay, rewarding them with sugar. After several days, the researchers could play the tone, and the mice would start licking in anticipation of the sugar.

“We were able to measure the timing of dopamine surges during the learning process,” Slesinger says. “That’s when we could see the dopamine signal was measured initially right after the reward. Then after days of training, we started to detect dopamine after the tone but before the reward was presented.”

Slesinger and colleagues will also share new results on the first biosensors that can detect a subset of neurotransmitters called neuropeptides. Ultimately, Slesinger says they’d like to use this sensing technique to directly measure these neuromodulators, which affect the rate of neuron firing, in real time.
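
A toy way to picture the timing result described above: given a fluorescence trace and the times of the tone and the sugar reward, find when the signal first rises well above baseline and see whether that onset follows the reward or already follows the tone. The sketch below uses made-up numbers and a crude threshold; it only illustrates the idea, not the CNiFER analysis itself.

```python
import numpy as np

def surge_onset(fluorescence, times, baseline_mask, k=3.0):
    """Return the first time the signal exceeds baseline mean + k * std."""
    base = fluorescence[baseline_mask]
    above = np.flatnonzero(fluorescence > base.mean() + k * base.std())
    return times[above[0]] if above.size else None

# Synthetic traces standing in for early vs. late training sessions.
times = np.linspace(0, 10, 1000)           # seconds; tone at t=2, reward at t=5
rng = np.random.default_rng(2)

early = rng.normal(0, 0.05, times.size)
early[times > 5.2] += 1.0                  # surge only after the reward
late = rng.normal(0, 0.05, times.size)
late[times > 2.3] += 1.0                   # surge already after the tone

baseline = times < 2.0
for label, trace in (("early training", early), ("late training", late)):
    print(label, "-> dopamine surge onset at", round(surge_onset(trace, times, baseline), 2), "s")
```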

In a first, brain computer interface helps paralyzed man feel again

Imagine being in an accident that leaves you unable to feel any sensation in your arms and fingers. Now imagine regaining that sensation, a decade later, through a mind-controlled robotic arm that is directly connected to your brain.

That is what 28-year-old Nathan Copeland experienced after he came out of brain surgery and was connected to the Brain Computer Interface (BCI), developed by researchers at the University of Pittsburgh and UPMC. In a study published online today in Science Translational Medicine, a team of experts led by Robert Gaunt, Ph.D., assistant professor of physical medicine and rehabilitation at Pitt, demonstrated for the first time ever in humans a technology that allows Mr. Copeland to experience the sensation of touch through a robotic arm that he controls with his brain.


[Video via The Scene]

Ian Burkhart is paralyzed from the neck down, but thanks to an array of electrodes implanted in his brain he’s able to swipe credit cards and play video games with his own hands.

More: A Brain Implant Brings a Quadriplegic’s Arm Back to Life

A direct brain-to-brain interface in humans

Humans have communicated using only their brains and a relatively simple computer setup.
One person, the sender, sat in front of a gaming console with an EEG device attached to her head. Just by thinking “fire,” she could make her partner a mile away press a touchpad to fire a cannon.
The receiver was wearing a cap with a coil near the part of the brain that controls hand movements. He was in a dark room and couldn’t see what was happening in the game. If he received his partner’s brain signal, his hand would jerk upward, pressing a touchpad that fired a cannon.
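
Stripped of the neuroscience, the plumbing of a demo like this is a small event pipeline: classify the sender’s EEG window as “fire intent” or not, ship that decision over a network link, and have the receiver-side machine trigger the magnetic coil, whose pulse produces the hand jerk that presses the touchpad. The sketch below mocks every hardware-facing piece with invented function names; it is not the University of Washington team’s software.

```python
import random

# Hypothetical end-to-end event pipeline for a sender/receiver BCI demo.
# Every hardware interface is mocked; only the flow of decisions is real.

def classify_eeg_window(eeg_window) -> bool:
    """Stand-in for a classifier that detects the sender's 'fire' intent."""
    return sum(eeg_window) / len(eeg_window) > 0.5   # toy decision rule

def send_over_network(decision: bool) -> bool:
    """Stand-in for shipping the decision to the receiver's machine."""
    return decision

def trigger_tms_pulse():
    """Stand-in for pulsing the coil over the receiver's motor cortex."""
    print("TMS pulse -> hand jerks -> touchpad pressed -> cannon fires")

def pipeline(eeg_window):
    if send_over_network(classify_eeg_window(eeg_window)):
        trigger_tms_pulse()
    else:
        print("no fire intent detected; nothing happens at the receiver")

pipeline([random.random() for _ in range(64)])   # may or may not fire
pipeline([0.9] * 64)                             # strong 'fire' signal
```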


Scientists just invented a rudimentary form of mind reading 

Using a “brain-to-brain interface,” University of Washington researchers hooked up two participants to specialized machines for a simple game of “20 Questions.” In the game, one subject would ask a series of yes-or-no questions like, “Can it fly?” The other would think the answer, which was signaled back to the first participant; a “yes” stimulated the first participant’s visual cortex, causing them to see a flash of light. The results were impressively consistent.

wired.com
Ultrathin Silk-Based Electronics Make Better Brain Implants
Silk has made its way from the soft curves of the body to the spongy folds of the brain. Engineers have now designed silk-based electronics that stick to the surface of the brain.

The research team printed electrode arrays onto silk films that disintegrate after they are placed on the brain’s surface and flushed with saline. They’re just 2.5 microns thick, so thin that they need to rest on a platform so they don’t fall apart during fabrication or implantation. After the silk film dissolves, the array wraps around the curves on the brain.