Eyefluence: the Missing Link in MR Headsets
[NOTE: This is an extract from Chapter 6 of Beyond Mobile: Life After Headsets, my book-in-progress with Robert Scoble. The chapter spotlights only one company: Eyefluence, which you have probably never heard of. Because it deals only business-to-business, you may not hear about it often. But chances are that if you use AR or MR glasses at some point in the future, the quality of your experience will rely on their technology.]
Jim Marggraff and David Stiehr co-founded Eyefluence in 2012, but by then they had already been collaborating on eye-tracking software technology they had acquired. They started playing with ways to make it more valuable in every sense of the word.
Eye tracking, in itself, is no big deal. It predates the PC by over 100 years. Scientists, equipped with pens and paper, developed a method to study how the brain works when people read. The approach came to computers in the ‘80s when two psychologists named Marcel Just and Patricia Carpenter theorized that brains process what they see as fast as the eye sees it.
You can’t get much faster if humans are involved.
Scientists found a string of valuable ways to use eye tracking in psychology, psycholinguistics and other health-related fields. It has been used to enable quadriplegics to manipulate objects on screens by nodding and blinking. Stephen Hawking's speech-synthesis system uses it to translate his thoughts into simulated speech. More recently, Fove, a startup headset maker, incorporated it into a VR headset, exciting action gamers who enjoy zapping aliens faster than was previously possible.
Yet none of these works at anything close to the speed at which the brain understands what the eye reads, and that is precisely the gap Eyefluence closes.
They have taken the old eye-tracking software and built on it extensively, filing 30 patents to protect what they have created. The result is a new category of headset operating software that they call Eye Interaction (EI).
Instead of just watching a user's eye and taking cues from staring, blinking or nodding, as conventional eye tracking does, Eyefluence's EI software watches where the headset user's eyes go and responds to natural ocular motions. The company hopes those patents will make it the single source of this type of software, giving it a very strong position in a nascent but inevitably huge market.
Marggraff told us that Eyefluence is engaged at various stages with most major headset makers.
EI software is best appreciated when used rather than read about in books. We received our demos separately. Scoble saw it at SXSW and was blown away. Of the two of us, he has the faster, more intuitive understanding of new technology.
Israel looks more at business strategies, with the skepticism acquired over years as a reporter who watched many dazzling tech launches fizzle when the same products failed under less controlled market conditions. He visited Eyefluence's Milpitas, California headquarters in May, determined not to be impressed.
By that time, Israel had tried most new headsets and doubted that what he had seen so far would survive, never mind prevail, in everyday situations the way smartphones do. He was particularly concerned about personal productivity.
When he had talked with Marggraff earlier, the entrepreneur argued that within the next decade headsets would be the only digital technology people would need, and he predicted that Israel might write his next book with his eyes in a headset rather than with fingers on a keyboard.
It wasn't just word processing that made him skeptical; Israel also wondered about other productivity tools, including spreadsheets and presentation slides.
Like most writers, Israel spends more time on a desktop computer than on a mobile phone. He has never conceived of writing a long piece without an external keyboard and a large screen. The idea of using MR glasses to write a book sounded to him like an impractical fantasy.
Yet after a fifteen-minute demo that began with Marggraff's two-minute lesson on how to use Eyefluence, Israel was looking forward to writing anything he wanted on a smart glasses device in the near future.
After that brief lesson, Israel found himself effortlessly moving and opening objects with his eyes. He was relaxed, yet navigating faster than he ever had with a point-and-click device. He booked a flight and saw how a doctor could call up x-rays during a visit just by looking at an icon.
Israel experienced a reversal of his usual process: instead of his going to the technology to make something happen, Eyefluence brought the technology to him.
Prior to the demo, Marggraff had talked about how his PhD-dominated team had spent years studying the human eye. Israel had paid little attention. Nearly all entrepreneurs boast of team talents, educational degrees and unique cultures.
Now, as Israel tested the new tech, he quickly realized while watching the virtual screens: the technology was watching him, seeing where he looked and where his eye stopped. He intentionally wandered off from where Marggraff was directing him, and the technology followed his eyes, not Marggraff's guidance. The demo was real, and the software understood what he wanted because it was watching and, he was told, remembering.
Software that watches you and gets smart about you is freaky stuff, we wrote previously. And while this could have been extremely freaky, Israel was having too much fun to let it distract him, particularly when Whac-a-Mole, a game introduced in 1976 that he had never played, came up.
If, like Israel, you are among the few people on Earth unfamiliar with it, the object of the game is to clobber cute little rodents as they pop their smiley heads out of the ground. The more you whack, the more points you get.
First, Marggraff had him whack moles using conventional eye tracking, which required head nodding. Every time a mole showed itself, Israel nodded and bopped. As he got the hang of it, his nodding-and-bopping rate accelerated. When it was over, he thought he had done pretty well for a rookie.
Next, he went mole whacking with Eyefluence. Instead of nodding, he just moved his eyes. The device kept score: with Eyefluence, Israel whacked 40 percent more moles in the same amount of time.
Eventually, Israel found himself looking at 40 screens surrounding him through 360 degrees horizontally and 180 degrees vertically, the full span of any enhanced-reality environment. It was far different from a computer desktop home screen, where content is arranged in tabs. With the headset, Israel could easily zoom to any of the 40 sites just by looking; he could also scroll or manipulate them as he saw fit.
This is what a computing environment will look like in the near future, and it did not seem even slightly freaky; it seemed natural and productive. He knew his personal data was being collected, but that seemed a fair price for the experience he was having.
Typing by Eye
Marggraff showed Israel how Eyefluence could expedite the daunting task of organizing content for a book project. Authors often take notes on multiple devices and applications, and end up in their own quagmires of text, video and audio snippets. Israel uses everything from Post-it notes to video and audio clips. In extreme cases he has jotted URLs on the back of his hand.
The hard part is always assembling these disparate informational tidbits into a single, cohesive chapter and then into a full book where thousands of pieces fit snugly together like a jigsaw puzzle.
Like most authors, Israel uses Microsoft Word for assembly and production. For him it has always been awkward and inefficient, and he has lost more than one valuable vignette along the way.
As he sat looking through the headset at the 40 screens, Marggraff showed him how he could cut and paste content from all of them into a single consolidated screen, streamlining his process in a way that could save weeks, perhaps months.
After the demo, Marggraff told Israel about a killer app for authors that would be coming soon: a QWERTY keyboard for headsets that you could type on by eye. Eyefluence had already built it, though it was still too slow to demo, but he predicted that when it was ready, people would type as fast as the eye could move.
In the tech community, paradigm moments are proclaimed far more often than they actually occur. For Israel, this was one. He could think of nothing that would not be better on a headset than on a mobile or desktop device, and he was eager to make the swap. For him, that eye-typed keyboard is the killer app. For others, there would be equally dazzling and useful apps.