The View from Inside Epson's Moverio Smart Glasses Program

Is It Live? Or Is It Augmented Reality?


“You know, there’s probably a few million people playing video games in Brooklyn right now,” Mark Skwarek guessed. “I don’t know who they are. They’re having an isolated experience.”

And Mark should know—he used to be one of them. Before he was a full-time faculty member at New York University Polytech and director of its mobile augmented reality lab, Skwarek did a lot of virtual reality work. He spent a lot of time in multi-user online games and, as many in that field do, got completely immersed in his work. He confesses that it took a lot to get him out from behind the computer.

But then he started working with augmented reality—before there were any mobile devices to run it on. It took a significant amount of work to build an app without an actual phone, but he and his collaborators had it ready as soon as the iPhone 3GS launched.

Frustrated by the amount of effort it took to make any changes, Skwarek found Hoppala, a web-based content service with an app that let people place augments anywhere on Earth. For example, he could create a model of a Tyrannosaurus rex at the Brooklyn Bridge, and if you went there with a mobile device and the app, you could see it in all its 3-D glory. And even though this app made changes easier than their first iteration had, you still couldn’t make changes to the augments onsite.

Soon, a crossover occurred between Skwarek’s augmented reality work and his art career, and it didn’t just get him out from behind the computer—he also started to travel. He began creating high-profile AR “installations,” like one in the Demilitarized Zone between North and South Korea. “I would basically travel along the DMZ and create this artwork,” remembered Skwarek. Other destinations included the Israeli–Palestinian border, the U.S.–Mexico border, South America, Australia and Europe.

Finally, the digital work had released him from being shackled to the computer, and it opened his eyes.

“It made the real world more exciting for me,” Skwarek said. “I could do anything I wanted, without all the permits you needed to have art realized in the public space. You can put art anywhere you want, even the White House—and we did.”

When Skwarek accepted the full-time position at NYU, it gave him access to better tools, and allowed him to do more with his project. Being able to make changes onsite had always been his goal, but now Skwarek also wanted to make it as easy as possible for any level of user to get on board and become part of the augmented reality movement that was just taking off.

“To me, people have great ideas but are held back by the technology barrier,” explained Mark. “Our app, Create AR, allowed people to make stuff anywhere and everywhere, and to modify it at the location where they created it. We found that this was wildly popular.”

So, they were on the right track. But the app still didn’t have the complex interactivity they were looking for. “I could create my own digital world, but it’s still not interactive. We wanted to make the real world exciting enough to go out and explore, without breaking someone’s digital experience.”

The team took a chance and created Play AR, which basically turns the entire planet into a multi-user online video game, but one that’s skewed towards people working together in physical reality. Each player claims their own space in the world, customizes it as they wish, and fortifies it against the advances of other players. “I could make a giant billboard above my office at NYU or have a waterfall,” explained Skwarek. “I could even put unicorns on my front lawn.”

Right now, Skwarek and his team are focusing on gameplay, but they are also looking to refine the Play AR experience with first-person view (FPV) technology like the Moverio BT-200. “If you want to hit the larger markets, gaming on a phone is not enough. It’s still great, but the next step with the glasses is going to be a lot cooler.”

Unlocking a City’s Secrets with Moverio



The panel discussion Neon Roots hosted at the Los Angeles Public Library on Saturday, November 1, 2014, discussing the collaboration with USC and the library. From left: Ben Lee, Robert Hernandez and Serhan Ulkumen.

A library card. How many people reading this will have one in their wallet? How many will have visited a library within the last month? Year? Decade?

For bibliophiles, it’s a sad reality that many people never visit libraries anymore unless they need to for school or research. And in truth, won’t most people try searching the Internet for information first? Perhaps that’s why USC’s Annenberg School for Communication and Journalism chose to bring a neglected but historically fascinating public library together with cutting-edge technology.

Neon Roots has come a long way from their start in 2011, when they set out to be the first AR shop in L.A. “It was very new, and very novel…very much in its infancy,” remembered Ben Lee, managing partner of Neon Roots, one of the developers who joined us at CES this past January. “There weren’t even any wearables, or anything like that.” They got their first big break working with Steve Angello’s electronic dance music label Size Matters, then moved on to producing AR experiences for companies like Radical Studios. “Then last year was our biggest break when we started working with Moverio,” explained Lee. That’s when Annenberg and the library project came calling.

“The class was called ‘Storytelling Through Augmented Reality’,” recalled Lee, speaking from the company’s studio in Los Angeles, California. “We taught the students how to develop the content to augment the Los Angeles Public Library.” The building has a rich history, including a fire in the 80s started by an arsonist whose affections were spurned by a librarian—and who was never apprehended.

“We created this really fun scavenger hunt with ghosts and clues using object tracking,” Lee added. “There’s a tremendous amount of awesome things in that library, and so many stories we found sifting through old content.” Using a platform from Metaio, another of Epson’s development partners, the Neon Roots team created an augmented reality (AR) app that would overlay information when pointed at a specific object.

And if you think the librarians were looking over their reading glasses disapprovingly at the students and their technology, you’d be wrong. “They were so excited to see a way to productize something to help augment their library,” said Lee. “We heard a lot of ‘We love this!’ from the librarians. It was a really cool experience providing a fun attraction for people of any age.”

In fact, the librarians liked the Annenberg project so much that they invited Lee and his team to sit on a panel to discuss creating AR solutions for other libraries. This garnered the attention of the mayor’s office. Now Neon Roots is working on many initiatives to help the Los Angeles Tourist Bureau, such as helping foreign tourists navigate the city and find public transport. Lee feels that consumers are starting not only to understand the technology but also to use it on a day-to-day basis. He sees it as a boon for resurrecting magazines, newspapers and other physical-content media—even augmenting a movie poster.

“Augmented reality will be the reality,” explained Lee. “Whether we like it or not, we can expect holograms and some form of wearable device around us to be part of our daily workflow.” Just like our mobile devices and smart phones? “Exactly.”

(Photo courtesy: Napoleon Martinez/Neon Roots)

Fighting Fire With Wearable Technology


“Our escape route has been cut off.” These were some of the last words spoken by Eric Marsh, the group leader for the Granite Mountain Hotshots, an elite crew of wildland firefighters in Arizona. This last radio transmission from Marsh on June 30, 2013 shows the miscommunication between the firefighters and dispatch, and the disarray of the team surrounded by the flames.

None of the 19 Granite Mountain Hotshots made it out alive.

No one becomes a firefighter without fully understanding the dangers of the job. And, out of the almost 1.2 million firefighters in the United States, over 780,000 are volunteer firefighters, meaning that they don’t get paid to run into burning buildings to save lives.

Rob Dearden is one of those volunteer firefighters, but he’s also an employee at New Frontier Technologies in North Kansas City, Missouri. His experience and interest in technology have helped him develop an application for the Moverio BT-200 glasses with the goal of making such deadly miscommunications a thing of the past.

It all started at a Google hackathon, where Dearden met Mike Sterle-Contala from McGill University in Montreal, Canada. From a series of initial ideas at the event, the team finally struck on creating an augmented-reality training program for first responders.

“Based on this, we started doing geolocation inside of buildings, creating waveforms, or ‘breadcrumbs,’ as we called them, to mark where you came in and how to find your way out again,” explained Dearden. “It allows firefighters to go through a search and rescue session with virtually no visibility.”

The “breadcrumb” digital markers could also be used to mark potential hazards, like weak spots in a floor, or potential access or exit points, like windows and doors. Being able to use these virtual markers would affect firefighters’ safety in many ways, but two in particular: One, if a firefighter became disoriented due to low oxygen or injury, the markers would guide them to safety; and two, if a firefighter called mayday, the markers could lead the Rapid Intervention Team right to them.
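To make the idea concrete, here is a rough sketch of how such a breadcrumb trail could work in software. This is not New Frontier's actual implementation—the class, coordinates and labels below are all hypothetical—but it shows the core logic: drop geolocated markers on the way in, then follow them in reverse from wherever you are to retrace a route out.

```python
import math

class BreadcrumbTrail:
    """Hypothetical sketch of the 'breadcrumb' idea: drop geolocated
    markers on the way in, then follow them in reverse to get out."""

    def __init__(self):
        self.crumbs = []  # markers in the order they were dropped

    def drop(self, x, y, label="waypoint"):
        """Record a marker at position (x, y) with a descriptive label."""
        self.crumbs.append({"pos": (x, y), "label": label})

    def nearest(self, x, y):
        """Find the marker closest to the firefighter's current position."""
        return min(self.crumbs, key=lambda c: math.dist((x, y), c["pos"]))

    def route_out(self, x, y):
        """Labels to follow back to the entry point, starting from the
        marker nearest the current position."""
        i = self.crumbs.index(self.nearest(x, y))
        return [c["label"] for c in self.crumbs[i::-1]]

trail = BreadcrumbTrail()
trail.drop(0, 0, "entry door")
trail.drop(5, 0, "hallway")
trail.drop(5, 4, "stairwell")
trail.drop(9, 4, "weak floor")  # hazards can be marked too
print(trail.route_out(8, 5))
# → ['weak floor', 'stairwell', 'hallway', 'entry door']
```

A real system would of course use indoor-positioning data rather than hand-entered coordinates, but the reverse-trail lookup is the same idea that would guide a disoriented firefighter, or a Rapid Intervention Team, along the markers.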

Many fire departments have started to use drones, and Dearden imagines how they could combine the video from the air with GPS technology to communicate to the firefighters on the ground. “The base could warn the team if a dangerous situation was developing so they don’t get caught, like they did in Arizona,” Dearden said. “Or, if they’re already surrounded by fire, base can show them where the weakest point is. They may come out burned, but at least they come out.”

Dearden has many more goals for the device’s functionality, including temperature sensors, accelerometers and thermal imaging—being able to “see” if there’s a fire on the other side of the door. “I literally have to reel myself in at times,” said Dearden about how many ideas he has for development. “I have to remind myself that you have to get the first things working before you can do the rest.”

The Moverio BT-200 is Dearden’s choice for an eventual commercially viable product, and he is currently looking for additional funding. “It started out as developing a training software, but the concept goes way beyond that.”

Vision Quest for the Legally Blind

Oxford prototype: BT-100 and Xtion

Oxford Prototype using Moverio BT-100

There are over 360,000 people in Britain alone who are partially sighted or registered legally blind. In the United States, this number rises astronomically to over 6.7 million people. And “legally blind” does not mean that these men, women and children are completely without vision; many simply have severe vision problems like macular degeneration, retinitis pigmentosa or retinopathy.

“For most people, it’s a gradual continuum of sight loss,” explained Dr. Stephen Hicks, a clinical neuroscientist at the University of Oxford in the United Kingdom. “Naturally, no one really wants to tell everyone about their condition. Many people have reported that they’ve really embarrassed their friends by just walking past them. It’s hard to maintain relationships like that.”

Not being able to detect faces or expressions is just one of the problematic situations that visually impaired people experience. Almost four years ago, Dr. Hicks began designing his Smart Specs, a pair of glasses that will help people with limited vision “see” their surroundings. He was interested in building his own product, something that looked like a regular pair of glasses—something discreet, so that people wearing them wouldn’t be recognized as visually impaired.

Although Dr. Hicks and his team had created prototypes, they would be costly to produce in terms of both time and money. They tried many of the other augmented reality glasses on the market, but Epson’s first version of the Moverio glasses, the BT-100, won out.

“There were a lot of things I was looking for in a display that I found in the BT-100s. I wanted something really clear and transparent. I wanted people to use sight as they normally do, which meant they needed to be binocular,” said Dr. Hicks. “The characteristics of the display were great. Plus, the people we were testing it on preferred it to the other choices.”

Dr. Hicks and his team installed a depth camera, a combination of an infrared projector and an infrared camera, on top of the BT-100 using a custom 3-D printed frame. It projects a structured light pattern, and the camera interprets the patterns to find objects nearby. “It could be a wall, a desk or a person,” Dr. Hicks said. “They’re coded by brightness: If you’re very close, the objects become very bright. If they’re further away they become darker.”
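The brightness coding Dr. Hicks describes can be sketched in a few lines. This is only an illustration of the principle—the near/far cutoffs and the linear falloff here are my assumptions, not the Oxford team's actual calibration—but it captures the mapping: the closer an object, the brighter it is drawn on the display.

```python
def depth_to_brightness(depth_m, near=0.5, far=4.0):
    """Map a depth-camera reading (in meters) to a display brightness
    in 0–255: closer objects render brighter, farther ones darker.
    The `near`/`far` cutoffs are illustrative guesses."""
    if depth_m <= near:
        return 255  # very close: fully bright
    if depth_m >= far:
        return 0    # beyond range: not drawn
    # linear falloff between the near and far cutoffs
    t = (depth_m - near) / (far - near)
    return int(round(255 * (1.0 - t)))
```

Applied per pixel to the depth camera's output, a mapping like this turns a raw depth map into the bright-is-near silhouette view that lets a wearer pick out a wall, a desk or an approaching person.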

Recently, Dr. Hicks and his team received funding from Google to take the latest Moverio version, the BT-200, and build on what they’ve already developed in order to start trials in 2015.

And so far, the glasses have been well received. “I’ve always liked augmented reality. I’ve always thought that it would obviously be an awesome thing,” said Dr. Hicks. “The fact that I can create something to help bring that about is incredibly satisfying.”

The Royal National Institute for the Blind feels that the glasses could help over 150,000 legally blind people in the United Kingdom, as well as over 15 million people worldwide. See the Smart Specs in action in the two videos below:

Epson Featured at the Largest Software Event Ever…

What do Hillary Rodham Clinton, Bruno Mars, former Vice President Al Gore, Michael Leyva, Brian Ballard (APX Labs), and I have in common?  We’re all speakers at Dreamforce 2014, a conference hosted by Salesforce that is being billed as the largest software event ever.  Based on the fact that I couldn’t find a hotel room under $1,000 in San Francisco, I wouldn’t doubt that claim!  The Epson Moverio team is thrilled to be participating at the event in a number of capacities, including:


  • A booth in the Developer Zone/IOT (Internet of Things). We’ll be here with our friends at APX Labs showing off our Moverio smart glasses running on APX’s Skylight platform powered off of data from the Service Cloud.

Monday, October 13th: 8:00 AM – 8:00 PM

Tuesday, October 14th: 8:00 AM – 6:00 PM

Wednesday, October 15th: 8:00 AM – 6:00 PM

Thursday, October 16th: 8:00 AM – 2:00 PM

  • A panel: Wearing the Future – how wearables will impact the enterprise.  This panel is all about showcasing real use cases and applications that can impact customer experience and employee productivity within the enterprise.

Tuesday, October 14th, 11:30 – 12:30 / Location: Palace Hotel – Ralston Ballroom

  • A workshop: Optimizing your app for smart glasses. In this workshop, you’ll learn more about optimizing your application for the Epson Moverio smart glasses.  The hands-on component of this session will include:
    • an exhibition of a cutting-edge integrated demo built by APX Labs
    • the opportunity to bring your own Android app (APK file) to test on a pair of Epson Moverio smart glasses
    • a detailed overview of developer tools and resources available for the Epson Moverio platform

Wednesday, October 15th, 1:30pm-2:15pm / Location: Moscone Center West IOT Lab #1

Register here:

We’ve committed to give away a limited number of glasses to those developers who can port and demo their Android apps on the Moverio smart glasses, so please swing by our booth or workshop to find out more.

See you there!

Eric (@wteric)

Propeller Panoramas


Key Hole – Big Sur

“I really didn’t think they were such a great idea until I had them on and I was flying,” remembered Romeo Durscher about the first time he tried flying a drone with the Moverio BT-200 glasses. “It completely changed my perception of the possibilities.”

He and his DJI partner in crime, Mark Johnson, had used other FPV (first-person view) solutions before, but none offered transparency. “Once you put on any other kind of glasses, you forget everything around you. While that’s really cool, it’s not very safe when you’re flying a drone,” explained Romeo. “Moverio gives you the option to see what’s happening around you, and you can quickly focus on the screen to see what the camera sees.”

Romeo finds that the Moverio’s dark shades help him shoot better when it’s really bright outside, especially compared to a tablet or smartphone, where the glare from the sun hinders the view of the screen. They allow him to fly more confidently while lining up better shots.

Shooting photographs using a drone can make terrestrial-bound photographers green with envy. Romeo recalls his trip to Big Sur at the winter solstice, when for just 10 days the setting sun shines through a rocky archway just offshore. “Literally hundreds of photographers travel from all over the world to see this,” said Romeo. “Your position on the beach becomes extremely important. Once the time was right, I flew the drone over the beach and grabbed a shot that none of them could hope to capture. It was truly a unique perspective.”

Romeo shared his top four techniques for shooting panoramas with a drone:

  1. God’s eye view: “If you just want to take a single image, go for the god’s eye view. With the Phantom Vision+, you can look straight down on any sort of tower or tall object. You get a perspective that looks very surreal because it’s rare we ever see it.”
  2. Panorama–horizontal: Using the drone’s camera and some good flying skills, photographers can take a number of stills along the horizon to “stitch” together in Photoshop to create this common form of panorama.
  3. Panorama–vertical: “Few people do this because it’s not very intuitive,” Romeo explained. Start from the god’s eye view and then begin tilting the camera up until it reaches the horizon. “Once the panorama is stitched together, it really screws with your brain. It realizes that something isn’t right—you can’t look straight down and at the horizon at the same time.”
  4. Tiny planet: “You take a panorama and work it in Photoshop so that the edges match each other and you wrap it around. Essentially, you make a little planet out of it.”
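The tiny-planet effect Romeo describes is, under the hood, a polar-coordinate remap of a stitched panorama. As a rough sketch (this is an illustration of the math, not the actual Photoshop filter—the function name and pixel conventions are my own), each pixel of the round output image maps back to a pixel of the flat panorama: the angle around the center becomes the horizontal position, and the distance from the center becomes the vertical position, with the ground at the center and the sky wrapped around the edge.

```python
import math

def tiny_planet_source(px, py, out_size, pano_w, pano_h):
    """For output pixel (px, py) in a square tiny-planet image of side
    `out_size`, return the (x, y) source pixel of a pano_w x pano_h
    equirectangular panorama to sample. A bare-bones polar remap;
    real tools add interpolation and edge blending."""
    cx = cy = out_size / 2.0
    dx, dy = px - cx, py - cy
    r = math.hypot(dx, dy)
    theta = math.atan2(dy, dx)  # angle around the planet, -pi..pi
    # angle -> horizontal position along the panorama
    x = (theta + math.pi) / (2 * math.pi) * (pano_w - 1)
    # radius -> vertical position: center = bottom row (ground),
    # outer edge = top row (sky)
    r = min(r, cx)
    y = (1.0 - r / cx) * (pano_h - 1)
    return int(x), int(y)
```

Run over every output pixel, this wraps the panorama's bottom edge into the planet's surface and its sky around the rim, which is why a vertical panorama (god's eye view tilted up to the horizon) makes such good source material.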

You can see more of Romeo’s panoramas and photographs on his website, and follow him on Twitter, Instagram and Vimeo to see what he and Mark get up to next.



God’s Eye view


Panorama – horizontal


Panorama – vertical


Tiny Planet

Photos Courtesy: Romeo Durscher

Note: This is the second in a series of four posts inspired by a pre-show workshop arranged by Adobe’s Russell Brown for Photoshop World 2014. Check out the first and third posts, Drones on a Plain and A Stitch of Nine in No Time.

Drones on a Plain


“Oh my God,” Abbé Lyle said softly. Her reaction made the rest of us nervous: Would she crash the drone? Would she blow the last shot of the day? Tears ran down her face, followed by something peculiar: a wide smile.

Abbé was one of 45 participants who left the hubbub of the Las Vegas strip and headed southeast to a dusty ghost town nestled in a red-rock valley to attend Russell Brown’s Lights, Aerial Camera, Action! Workshop.

Nelson, Nevada is a photographer’s playground and the perfect place to fly a dozen camera-toting DJI Phantom drones. Tilted, hollow barns surrounded by antique trucks sitting on melted tires lined the single road. Detritus of years past, including a crashed WWII plane, covered the desert. It was surreal, hot and…dangerous. We were warned to watch out for fire ants, rattlesnakes and cacti that will literally shoot their spines at those who pass too close.

Our goal was to create a collaborative video. The participants, their skill levels ranging from beginner to advanced, were separated into eight teams. The instructors put the participants through their paces, making sure that each team had the skills required not only to take great shots, but to keep the drones safely in the air.

As the sun set, the shadows grew long and the sky turned a deep amber. Magic hour had finally arrived. Up until this point, everyone had flown the drones using a smartphone mounted on top of the drone’s controller to frame their shots. But Abbé’s flight was different. She was the first to pilot her drone wearing Epson’s Moverio augmented reality glasses. Her tears of joy resulted from her first-person view (FPV) from the drone.

The rest of the participants tried their hand at FPV flying until it was too dark to continue. And while a dusty, tired crowd of newly minted drone photographers filed back onto the bus, the Adobe video editors had their work cut out for them—to comb through the day’s aerial footage and assemble a video to commemorate the day—a video as mind-bending as the town of Nelson itself.

Here’s the result of our day in the desert.

Note: This is the first in a series of four posts inspired by a pre-show workshop arranged by Adobe’s Russell Brown for Photoshop World 2014. Check out the second and third posts called Propeller Panoramas and A Stitch of Nine in No Time.

Our Second Moverio Hackathon: AWE 2014

Team HUDHowTo: From left to right: David Lee, Lesley Bell, Liza Gere, and Tim Hayes.

One of the best parts about working on the New Ventures team is that I get to interact with developers from around the world. We rely on this developer community both to come up with augmented reality applications and to port them to our Moverio BT-200 smart glasses. One of the coolest places to watch developers bring their ideas to life is at a hackathon.

We sponsored our first hackathon last summer at our corporate headquarters and had been looking for the right opportunity to hold another event. We found it at the Augmented World Expo (AWE 2014) in Santa Clara, Calif., where we hosted our second hackathon May 26th–27th.

More than 80 developers gathered for the 36-hour, two-day event. With a prestigious Auggie Award and more than $24,000 in prizes on the line, several teams worked through the night to get working demos built in the short time available.

So what did the teams come up with? The teams presented their concept demos to a judging panel of AR industry leaders and the assembled audience. The winning team, HUDHowTo, created an app for DIY augmented reality “how-to” tutorials.

I spoke with team member Tim Hayes to get his insights on the winning app. Here’s what he told me:

After deciding that the original idea wasn’t really going to work, we all quickly pivoted into brainstorming mode and landed on the HUDHowTo application. The app allows the user to play a how-to video that has been segmented into individual steps. After each step, users have the option to continue to the next if they are ready, replay the last step if they need to review it, or call an expert.

The big value-add for business is that when the user calls for help, the customer support representative also receives the exact point the user is in the procedure. This means they no longer need to ask introductory questions to ensure the user has successfully completed the initial steps. This could greatly reduce call times. Future versions of the app could also allow the service representative to see what the user is looking at via a streamed live camera feed, giving them further context and allowing them to provide more accurate assistance in even less time.
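The stepped playback Tim describes could be modeled as a small state machine. As a purely illustrative sketch (the class, step names and the shape of the support payload below are my invention, not HUDHowTo's code), the app tracks the current step and, on a call for help, hands the support representative the user's exact position in the procedure:

```python
class HowToSession:
    """Hypothetical sketch of a how-to video segmented into steps,
    with next / replay / call-expert actions as Tim describes."""

    def __init__(self, steps):
        self.steps = list(steps)
        self.index = 0  # position in the procedure

    @property
    def current(self):
        return self.steps[self.index]

    def next_step(self):
        """Advance to the next segment (stays put on the last one)."""
        if self.index < len(self.steps) - 1:
            self.index += 1
        return self.current

    def replay(self):
        """Re-play the current segment for review."""
        return self.current

    def call_expert(self):
        """Report the user's exact place in the procedure to support,
        so no introductory questions are needed."""
        return {"step": self.index + 1,
                "of": len(self.steps),
                "name": self.current}

session = HowToSession(["drain the tank", "remove the valve", "fit the seal"])
session.next_step()
print(session.call_expert())
# → {'step': 2, 'of': 3, 'name': 'remove the valve'}
```

The value-add Tim highlights lives in `call_expert`: because the session already knows where the user is, the payload alone replaces the usual round of "have you done step one yet?" questions.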

Tim also loved hacking Moverio, telling me:

The Moverio smart glasses were incredibly easy to work with. Any Android developer should have no trouble getting applications up and running on these devices. The touch pad allows for user interaction nearly equivalent to a mouse or touch screen, which makes it perfect for getting prototypes out very quickly as you don’t need to support gesture-based input from the start. Obviously, this is perfect for a hackathon scenario.

It was clear that everyone was excited to be hacking on some of the most advanced technology on the market — I was no exception.

Thanks to Tim, AWE, and everyone who came out to participate in the hackathon. Now, when will our next one be?

What the Oculus VR Acquisition Means for Smart Glasses

The tech world was buzzing over Facebook’s acquisition of Oculus VR a few weeks ago, especially among those interested in the smart glasses industry.

First, my congratulations go out to my friends at Oculus. They’ve built a tremendous platform and have managed the company very well to get where they are today.

For Epson and its development partners and customers, news of the acquisition has created a lot of excitement. Countless developers and companies have asked me about the Oculus acquisition and how our technology compares to theirs and Google Glass. I think the buzz is giving people a framework for how to think about the industry, and is bringing additional insight and clarity to the space. For Epson, that means people are gaining a better understanding of the Moverio BT-200 platform’s strength in augmented reality.

What many people haven’t realized yet is that specific use cases are determined by how each platform is built. For instance, Google Glass has a single, monocular, transparent “look up” display, which is great for social media notifications, transcribing data, and taking photos and videos, but is not necessarily the best platform for AR overlays or VR gaming.

The Oculus Rift platform, on the other hand, is binocular and opaque, meaning that the user is completely immersed in the digitally created VR content, and cannot see or interact with the real world.

Epson’s Moverio BT-200 smart glasses are binocular and transparent, allowing users to see and interact with digital content and their real-world surroundings. With the Moverio BT-200, content is at the center of the user’s field of view. Because the display is transparent, users can project 3-D overlays on top of real objects, enabling an endless number of augmented reality app development opportunities. Some of the best early use cases are being found in logistics, training, wellness, education and consumer gaming.


The acquisition is a validation ($2 billion worth!) that VR and AR platforms will play a major role in the future of computing. While Oculus is strictly a VR play, the potential for additional innovative technologies is evident, with Zuckerberg himself mentioning AR during his conference call with Facebook investors. I believe the acquisition will inspire even more developers outside of gaming to explore the exciting possibilities of VR and AR smart glasses.

You’re so Vein


When we announced the Moverio BT-200 smart glasses at CES this year, we got all sorts of reactions–some of them listed in the comments of CNET’s YouTube video Epson’s Moverio BT-200 Smart Glasses deliver Android Apps in Augmented Reality.

  • “Stupid.”
  • “pointless”
  • “LOL, hysterical 🙂 Who would wear this… Unbelievable”
  • “My goodness that thing looks so dumb”
  • “Why do they even bother making this stuff”

I totally get it. Moverio is an early augmented reality development platform, and without the context of a real-life application, the glasses might look silly. But, what about applications where looking stylish is secondary to functionality?

For example, have you ever had an intravenous injection? If so, did the nurse stick you more than once while trying to find a vein? Three times? More? What if the nurse was wearing a device that highlighted exactly where veins were located, helping to guide the needle to hit its mark the first time? Would you care how the nurse looked?

Evena Medical has developed an application that does exactly that. Using the Moverio platform, Evena has created a wearable device to fulfill the following mantra: “One nurse, one stick, in one minute.”

At first glance, new wearable technologies may look strange, but put into the context of a useful application, their looks may not matter.

Check out the demo that Evena Medical gave at our booth at CES 2014.

Photo Credit: Library of Congress