There were a lot of stories about human-computer interfaces in the news this week. Almost too many to choose from for our weekly news roundup! Two major breakthroughs made global headlines: one from Brown University, and another from MIT. A few other exciting announcements caught our attention as well. At Trinity College, a thesis on gaze-controlled human-computer interfaces has been made available to the public. It’s well worth the read, and a perfect example of the variety of areas being researched with non-invasive neurotechnology. And at the ACM SIGCHI conference earlier this month, a Digital Arts component was added for the first time ever. This special interest group conference is a premier destination for human-computer interaction research, and celebrated its 30th anniversary this year as well.
Read more after the jump.
1// Research Breakthrough for Paralysis Patients
A report authored by neuroscientists at Brown University demonstrated that patients with severe brain injuries or paralysis are able to move and control objects with the help of a brain-computer interface. The report concludes that this interface could support such patients by providing day-to-day autonomy. Two test patients were given sensor implants in the motor cortex, which recorded neuron activity from that area of the brain and transmitted it to a computer. Even after significant lengths of time in paralyzed states, both patients successfully learned the mechanics of the interface, and how to translate thought into action through it.
A few interesting opinion pieces have even appeared in response, including one by computer scientist Peter Bentley.
2// Relax: Let This BCI Handle The Rest
Researchers at MIT and Tufts University have just released Brainput, a wearable brain-computer interface that detects when a user is overwhelmed by their workload. Essentially, Brainput would enable the computer to “learn” the mental boundaries of its user and assist with completing tasks. The system is designed around fNIRS technology, which is especially effective for observing how a user handles multitasking. It can also be used to interpret a user’s brain scan data and to understand how an interface could help offset workload. In the report, researchers describe a variety of other signals the interface could use to detect when a user is overwhelmed, such as typing speed or facial expressions.
The InteraXon news roundup is published weekly, every Sunday night, to recap trends and breaking news in the world of brain computer interfaces and thought controlled computing. Do you have a story you’d like to submit or share? Contact us at email@example.com (subject line “News Story”) or leave a comment here.