Johns Hopkins researchers have demonstrated the ability to "feel" virtual objects by integrating neural stimulation in a mixed-reality environment. The study participant, an incomplete quadriplegic, demonstrated virtual tactile perception as part of a larger study exploring neural multiplexing and new modes of perception enabled by brain-computer interfaces.
"All organisms rely exclusively on their sensory organs to perceive information about the world around them," says Mike Wolmetz at the Johns Hopkins Applied Physics Laboratory. "BCI creates a new pathway to perceive information directly, in ways that are not constrained by, filtered through, or aligned with our specific sensory organs."
The research is part of the Neurally Enhanced Operations project, funded by the U.S. Defense Advanced Research Projects Agency to investigate neural multiplexing.
The team has been focusing on augmenting tactile perception, but the possibilities likely go further, Wolmetz says. "Having direct access to the mind and brain could change everything, and this is just the tip of the iceberg," he says.
From Johns Hopkins University