Philosopher John Locke said, “I have always thought the actions of men the best interpreters of their thoughts.” Locke lived during the Age of Enlightenment, so he probably wasn’t pondering machines that act on human thoughts. But what does it mean when machine actions are the result of human thoughts? No longer confined to science fiction, brain-machine and brain-computer interfaces are, many would argue, the next way we will communicate with machines and even with one another.
Brain-machine interfaces (BMIs) and brain-computer interfaces (BCIs) are devices that enable direct communication between a brain and an external device. BCIs let someone type onto a screen without a keyboard. Brain-machine interfaces make it possible for amputees to move robotic limbs. These interfaces range from invasive devices placed directly on the brain to non-invasive devices that communicate with machines without surgery.
This type of technology opens up a whole world of business applications, from dangerous jobs that already use robots, to manufacturing, to the consumer space. Brain-machine interfaces create a new way for humans to interact with technology, whether it be their smartphones, smart speakers, voice assistants, cars, or even each other. Startups and established companies alike see the promise of brain-machine interfaces. They are racing to link humans to machines, allowing people to control digital technology using only their minds, which in turn opens up a new world of opportunities for businesses and brands to reach the customer of the future.
Here are 10 companies that are working on connecting our brains or actions to our machines and creating the future of input.
Neurable’s mission is very exciting. Ramses Alcaide, founder of Neurable, got the idea of helping people through technology as a kid, after his uncle lost his legs in a trucking accident. Alcaide said, “the idea of developing technology for people who are differently abled has been my big, never-ending quest.” Neurable launched onto the brain-computer interface scene in 2017 at SIGGRAPH with a proof-of-concept BCI game called “The Awakening,” in which users put on a VR headset and escape from a room using only their minds. In December 2019, Neurable raised a $6 million Series A round to develop an everyday, consumer-based brain-computer interface in the form of headphones.
Alcaide sees neurotechnology built into a pair of headphones as the first step towards a BCI for consumers. Think about stopping, starting, or skipping songs with your mind without ever touching your phone. Interacting with smart devices with just our thoughts through a headphone-like device is impressive enough on its own. For Neurable, it’s the data behind the interactions that show the real value of BCIs.
Cognitive analytics are “measures of different mental states, especially those aligned with performance.” BCI-enabled headphones could help a person “enter their desired emotion and then have a customized playlist generated to provoke that response.” They could also open a whole new world of metrics for marketers, trainers, health professionals, and a variety of other industries. Alcaide believes computing is going to become more spatial: “As it continues to go down that path, we need forms of interaction that enable us to more seamlessly interact with our technology.”
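As a toy illustration of the mood-to-playlist idea, suppose a BCI exposed a single normalized “calm” score. The score source, thresholds, and playlist names below are all invented for illustration, not anything Neurable has published:

```python
# Hypothetical sketch: mapping a BCI-reported mental-state score to music.
# The calm score, thresholds, and playlist names are invented examples.

def pick_playlist(calm_score: float) -> str:
    """Map a normalized calm score (0.0-1.0) to a playlist bucket."""
    if not 0.0 <= calm_score <= 1.0:
        raise ValueError("calm_score must be in [0, 1]")
    if calm_score < 0.33:
        return "wind-down"      # listener is stressed: play calming tracks
    if calm_score < 0.66:
        return "steady-focus"   # neutral state: sustain concentration
    return "deep-work"          # already calm: reinforce the state

print(pick_playlist(0.8))  # deep-work
```

A real system would smooth the score over time rather than reacting to a single sample, but the core loop — measure a mental state, act on it — is the same.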
MindX believes the next frontier in computing is a direct link from the brain to the digital world. They’re creating this link by “combining neurotechnology, augmented reality and artificial intelligence to create a ‘look-and-think’ interface for next-generation spatial computing applications.” Part of spatial computing is being able to interact with computers beyond a two-dimensional screen.
MindX uses smart glasses to create a link between human brains and technology. Julia Brown, MindX’s CEO, said smart glasses will let wearers access information with a single thought. The glasses connect to the mind through eye movements, while brain waves signal what the wearer is thinking and where they are looking. BCI-enabled smart glasses open a world of opportunities for visual search: think about your lost car keys and the glasses can locate them, or wonder what someone is wearing and get the brand and links to places to buy it, all with a thought.
While some brain-computer interface companies focus on understanding the brain and cognitive metrics, others focus on real-time device control. NextMind, headquartered in Paris, France, uses a non-invasive BMI “that translates brain signals instantly from the user’s visual cortex into digital commands for any device in real-time.” NextMind debuted their device at CES 2020. Visitors to the booth demoed changing channels on a TV with just their thoughts.
Users wear the NextMind device on the back of their head. It “creates a symbiotic connection with the digital world” by combining neural networks and neural signals. The NextMind SDK is open to developers, and the device is priced at a point where the industry believes consumers are ready for the next phase of computer interaction.
Neurosity’s goal is to help developers get focused faster and stay focused longer. Notion, Neurosity’s thought-powered computer, has eight sensors as part of an EEG headset. In one demo, a woman scrolls through a recipe on her tablet while cooking; in another, a man changes the lighting in a room with his mind. The Notion brain sensor can be pre-ordered, and the device touts its secure design, saying “it never stores your brainwaves,” a feature worth looking for in any BCI.
Neurosity launched dev kits in 2019, and its developer community is one sign that brain-computer interfaces have arrived. Developers can write apps for the Notion brain sensor, which is designed to do two things: detect human intent and quantify the self. Think of it as a fitness tracker for the brain. In April 2020, Neurosity temporarily cut the price of Notion pre-orders to $799 for developers. Neurosity pledges its support to “developers interested in helping quantify the human mind even further” by making its team available to brainstorm, code, and deploy neuro apps.
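Neuro apps of this kind typically follow a subscribe/callback pattern: the app registers interest in a metric and the headset streams values to it. The sketch below mocks that pattern end to end; `MockBrainSensor` and its method names are invented for illustration and are not the actual Neurosity SDK:

```python
# Hypothetical sketch of the subscribe/callback pattern a neuro-app SDK
# might expose. "MockBrainSensor" and its methods are invented stand-ins,
# not the real Neurosity API.
from typing import Callable, Dict, List

class MockBrainSensor:
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[float], None]]] = {}

    def subscribe(self, metric: str, callback: Callable[[float], None]) -> None:
        """Register a callback for a named metric, e.g. 'focus'."""
        self._subscribers.setdefault(metric, []).append(callback)

    def emit(self, metric: str, value: float) -> None:
        """Simulate the headset publishing a new metric sample."""
        for cb in self._subscribers.get(metric, []):
            cb(value)

readings: List[float] = []
sensor = MockBrainSensor()
sensor.subscribe("focus", readings.append)
sensor.emit("focus", 0.72)
print(readings)  # [0.72]
```

The real device streams continuously, so production apps would debounce or aggregate samples rather than act on every callback.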
Kernel is a neurotechnology company based in Los Angeles, California. Its aim is to create “a brain interface that develops real-world applications of high-resolution brain activity.” Kernel’s founder and CEO, Bryan Johnson, believes in a world where people are empowered by technology, not limited by it. He sees neuroscience, specifically Kernel’s “neuroscience as a service (NaaS),” as the way to get there. Kernel was featured in I Am Human, a 2020 award-winning documentary about “the scientists and entrepreneurs on a quest to unlock the secrets of the brain.”
Kernel has run two experiments with its technology. One is Speller, which lets participants type using only their gaze and a visual keyboard. The other is Sound ID, which can identify the song a listener is hearing from their brain signals. These experiments show that with just a helmet, brain scientists can run the same kinds of experiments as labs with room-sized equipment, and study thousands more people than they currently can. Johnson believes this can help people who have suffered strokes and are unable to speak, as well as those dealing with mental disorders.
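The gaze-typing idea behind interfaces like Speller can be sketched very simply: select a key once the user’s gaze rests on it long enough. The dwell threshold and the assumption of pre-labeled gaze samples below are simplifications for illustration, not Kernel’s actual decoding pipeline:

```python
# Toy sketch of dwell-based gaze typing, in the spirit of gaze keyboards
# like Speller. The dwell threshold and pre-labeled gaze samples are
# simplifying assumptions, not Kernel's real method.
from typing import Iterable, List, Optional

def dwell_type(gaze_keys: Iterable[Optional[str]], dwell: int = 3) -> str:
    """Emit a key each time the gaze rests on it for `dwell` samples."""
    typed: List[str] = []
    current: Optional[str] = None
    count = 0
    for key in gaze_keys:
        if key is not None and key == current:
            count += 1
        else:
            current, count = key, 1
        if key is not None and count == dwell:
            typed.append(key)
            count = 0  # require a fresh dwell before repeating the key
    return "".join(typed)

# Gaze settles on H, drifts away (None), then settles on I.
print(dwell_type(["H", "H", "H", None, "I", "I", "I"]))  # HI
```

Real systems replace the labeled samples with an eye tracker or decoded neural signal, but the dwell logic is a standard building block.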
There are so many applications when it comes to brain-machine interfaces and neuroscience. Nectome’s technology is designed to preserve human memory by studying how the brain physically creates memories. Nectome isn’t just creating a BMI for the present. They’re hoping to change how people “preserve the languages, cultures, and wisdom of the past, and how health care engages with individuals’ memories and personal narratives.”
Sam Altman, president of Y Combinator, is “one of 25 people who have put down a $10,000 refundable deposit to join a waiting list at Nectome.” The catch: Nectome needs a living brain to capture the memories, and the procedure kills the patient. “Nectome planned to test it with terminally ill volunteers in California, which permits doctor-assisted suicide for those patients.” Altman said of the procedure, “I assume my brain will be uploaded to the cloud.” The startup has faced setbacks but appears to still be in operation.
Eventually, Nectome believes its biological preservation techniques will make something like Amazon’s Upload TV series possible: at the end of their lives, patients could choose to “upload” themselves into a digital afterlife.
CTRL-labs uses non-invasive neural interfaces to “expand human bandwidth.” CTRL-labs recreates the “0s and 1s” of neurons by listening to muscle twitches, feeding those signals into machine-learning models that decode the wearer’s intention. The decoded output is fed back to the wearer to create a symbiotic relationship. Thomas Reardon, CEO of CTRL-labs, said, “AI and Machine Learning can be dominated by us.” CTRL-labs does all of this with a wristband.
Facebook’s leadership has talked about a new type of interface “that includes work around direct brain interfaces that are going to, eventually, one day, let you communicate using only your mind.” In September 2019, Facebook bought CTRL-labs and said of the acquisition, “The goal is to eventually make it so that you can think something and control something in virtual or augmented reality.”
Neuralink is a company founded by none other than Elon Musk. The man who made electric cars cool (Tesla) and sends astronauts to space in his own spacecraft (SpaceX) also wants to connect humans to machines. Neuralink takes a slightly different approach to brain-machine interfaces by placing “threads” into the brain. Musk “wants his brain implants to stop humans being outpaced by artificial intelligence.”
Neuralink’s threads connect to a 4mm chip called the N1. The chips are “placed close to important parts of the brain and are able to detect messages as they are relayed between neurons, recording each impulse and stimulating their own.” The chip connects to a Bluetooth-enabled wireless device worn over the ear. Currently, the chip is implanted via traditional brain surgery, but Musk envisions a future in which insertion is virtually painless. Applications for Neuralink are endless, from treating neurological disorders to replacing language, and eventually turning humans into cyborgs.
Paradromics developed its brain-computer interface technology to help those disconnected from the world by mental illness, paralysis, or other brain disorders. Paradromics believes it can meet these medical challenges with a technical solution: a high-data-rate brain-computer interface. Like Neuralink, Paradromics places electrode arrays on the brain, using a “computer chip that plugs into a part of the brain called the cortex.” The company hopes its technology means mental disorders and injuries no longer have to be debilitating, and that it can connect those affected back to the world.
Zoolingua – Decoding Fido’s Thoughts
Brain-computer interfaces aren’t just for people. Zoolingua, founded by Con Slobodchikoff, wants people to understand dogs. Its device will allow dog and human to communicate in both directions; the translating dog collar from the movie Up is coming to real life. Zoolingua bases its technology on research: “Observing (through video) dog vocalizations and behavior in specific contexts; classifying the complex forms of communication that occur; and working with computer programs to effectively and accurately decode and translate into US-English.”
According to an Amazon report, “advances in AI and machine learning will enable companies to make devices that can accurately translate a cat’s meows and a dog’s barks into English.” William Higham, co-author of the report, believes devices that can talk dog could be less than 10 years away.
Separate from Zoolingua, Dr. Gregory Berns, a neuroscientist at Emory University, is also interested in what dogs think. Dr. Berns developed a “go/no-go” test to scan dog brains in MRI machines. The results show “dogs use corresponding parts of their brain to solve similar tasks as people do,” something that hadn’t been demonstrated in non-primates before.
What Comes Next
Neuroscience technology is a quickly developing field. It’s one with endless applications for understanding the brain, unlocking human potential, and preserving today’s minds for the future. Some of the companies listed above are working towards specific use cases. Some use direct-brain sensors while others use non-invasive devices.
What these brain-machine interface companies have in common is that they see the world as a connected place, one that is going to become even more so. The future of computing is beyond two-dimensional interactions. It’s more than voice, facial recognition, artificial intelligence, and augmented reality.
It’s all these things coming together under the power of the human brain.