The Ultimate Telepresence
By Jon Sung | Stardate 68900.4
Stardate 47215.5: The crew of the USS Enterprise NCC-1701-D is testing a brand-new probe operated by a unique control scheme: linking directly to chief engineer Geordi La Forge’s brain through his VISOR inputs, providing an unparalleled level of control and “you are there”-style verisimilitude. Geordi can see what the probe sees and feel what the probe’s sensors detect. For all intents and purposes, it’s as if he’s there in person.
This comes in super handy when the Enterprise is tasked with the rescue of the USS Raman, a Federation science vessel trapped in the atmosphere of a gas giant. It comes in less handy when it turns out the interface works almost too well: above a certain signal strength, feedback from the probe can physically burn, shock, or otherwise injure Geordi while he’s operating it. They didn’t make him chief engineer of the flagship for nothing, though…he stays at the probe’s controls and risks frying his entire nervous system to bring to safety some subspace beings native to the gas giant, beings who accidentally killed the crew of the Raman and briefly took the form of his mother (it’s a long story). After being safely brought back from the brink of neural overload by Data and Dr. Crusher, Geordi packs the interface suit away and (one can safely assume) puts a few appointments on Counselor Troi’s calendar.
This episode of Star Trek: The Next Generation was called “Interface” for a reason. Let’s talk about this thing! Let’s talk about how it’s the ultimate in telepresence: It’s got instinctive movement. Geordi doesn’t have to mess around with a joystick or even a steering wheel; he just thinks about walking somewhere, and the interface correctly interprets his mental impulses. A thought like “hey legs, let’s head this way…” becomes instructions like “turn to heading 026 and advance at 4 km/h” for the probe.
Geordi can manipulate objects without using a controller or even a waldo: he just thinks about pressing buttons or pulling levers with his hands. Here too, the interface is able to understand these neural commands and translate them into pulses from a miniaturized tractor beam mounted on the probe.
Interpolated through the probe and its interface, Geordi suddenly gets superpowers! Confronted by a locked door the probe can’t open, Geordi requests a phaser burst at a specific power level, then aims his hand at the door’s controls. A phaser beam leaps from his palm and fries the panel, popping open the sealed door.
Now that’s what I call an interface! It’s the kind to which any biomedical engineer or cyberneticist would surely aspire. Who wouldn’t want to be able to pilot something with their mind? Will we ever achieve that level of control?
The first steps have already been taken. A couple of years ago, the Defense Advanced Research Projects Agency (DARPA) found a paralyzed woman named Jan Scheuermann who agreed to have electrode arrays implanted on the surface of her brain, which were then wired to a robot arm. Eventually, Scheuermann learned to control the arm with her mind, and gained enough dexterity to feed herself and give people high-fives.
Already we see some interesting parallels with Geordi’s 24th-century marvel, like the fact that it uses a brain implant; the probe interface works so well for Geordi because it goes through his VISOR inputs. DARPA later wired Scheuermann’s implants to an F-35 flight simulator, and despite not being a fighter pilot, she was able to keep the virtual jet in the air using nothing but intuitive neural impulses, similar to how Geordi operated the probe by walking around and touching things.
Of course, there are also gaps. An F-35 can break the sound barrier and carry air-to-surface missiles, but it can’t traverse the corridors of a downed ship, pull control levers to activate a fire suppression system, or take someone’s pulse. Nor does DARPA have anything like Geordi’s bodysuit, which transmitted useful tactile information to the rest of him.
But those are engineering problems to be solved, refinements to be made; the basic concept, incredibly, seems sound. What can we do with it? Since it’s a DARPA project, we can bet there’ll be military applications, but what about civilian ones? Interestingly, “Interface” dealt with this directly: the technology has amazing possibilities for search and rescue. Imagine the aftermath of an earthquake, where there are zones of unstable rubble, fires everywhere, and you want to look for survivors. What if you could send out a swarm of remote-operated robots instead of risking more lives? What if these robots were able to do serious work instead of just carrying cameras — things like rendering medical aid with a suite of onboard tools, putting out fires, or shifting wreckage? Intuitive telepresence combined with advanced robotics would also do wonders for the nascent field of remote surgery. I’m for all of this, especially if we can get interface suits like the one Geordi wore; it’s not the future unless somebody’s in a super-advanced techno-unitard.
Jon Sung is a contributing writer for XPRIZE and copywriting gun-for-hire to startups and ventures all over the San Francisco Bay Area. When not wrangling words for business or pleasure, he serves as the captain of the USS Loma Prieta, the hardest-partying Star Trek fan club in San Francisco.
XPRIZE is an innovation engine. We design and operate prize competitions to address global crises and market failures, and incentivize teams around the world to solve them. Currently, we are operating numerous prizes, including the $30M Google Lunar XPRIZE, challenging privately funded teams to successfully land a robot on the Moon’s surface, and the $10M Qualcomm Tricorder XPRIZE, challenging teams around the world to create a portable, wireless, Star Trek-inspired medical device that allows you to monitor your health and medical conditions anywhere, anytime. The result? Radical innovation that will help us all live long and prosper.