Brain-to-digital interfaces have been the stuff of science fiction for decades, but they are edging closer to science fact. The recent use of high-density electrocorticography (ECoG) to achieve ~60% success rates in “mind reading” points toward these capabilities becoming reality sooner rather than later. When combined with IoT sensor networks (including body computing and wearables) and rapidly evolving machine learning techniques, we’re probably looking at a viable capability in the lab within the next five years. Moving from the lab to commercial deployment will likely take longer, for several reasons.
The first set of developments will be technical: a set of problems we’ve seen and solved before – lighter, smaller, faster, less obtrusive, untethered, and so on. Cell phones evolved from shoebox-sized handsets to the current smartphone, AR and VR devices are in the middle of that technological improvement, and Brain Interfaces are in the earlier stages of that arc. We know what the technical “right answers” look like, so we can work towards them.
The second set of developments will be around the user experience. New interfaces and media require rethinking what humans do, how they interact with technology and the world, and why they should embrace a new capability. Just as the first movies were crude, the first computer user interfaces clunky, and the first Augmented and Virtual Reality experiences exciting but not particularly useful or compelling, Brain Interfaces will need to evolve a new set of interactions and affordances to move from curiosity to compelling to necessary. We don’t yet know what the “right answers” look like, so this is a challenging area.
The third set of developments will be around improving accuracy, efficacy, and safety. We’re in a world where technology capabilities are ubiquitous, literally from cradle to grave, and adoption rates continue to increase. Like any consumer product, we’ll need to sort out which agencies examine which aspects of how these devices affect humans, at both the individual and societal level. This is where independent standards organizations like IEEE can play an important role. There needs to be evidence-based analysis beyond marketing and hype as we move deeper into the digital epoch.
The fourth set of developments dovetails with the above and will center on policy. Which situations are appropriate for these interfaces and which are not, along with the new societal norms and ethical mores they imply, will need to be considered. Because of the complexity of the technology, it is critical that scientists, engineers, and developers are involved in the conversation along with other stakeholders. The future of technology needs informed input from technologists, policy makers, and the public. It is not solely a government decision, nor does it rest entirely with corporations. It is a human question that needs engagement from all sides.
These emerging technologies raise the question, “what does it mean to be human?” The answer needs to be an ongoing negotiation for society.