Andrew Wheeler posted on August 25, 2016 |
IDF 2016 – San Francisco
This city is probably best known for gold-mining, Joe Montana, earthquakes, Robin Williams, hippies, Steph Curry and tech-induced rapid-fire gentrification. Cultural fluctuations and pop culture icons aside, last week saw the late Andy Grove’s legendary processor company Intel put on Intel Developer Forum (IDF) 2016, where it made some big announcements in the “mixed reality” space. Mixed reality is a term that encompasses both virtual reality (VR) and augmented reality.
Before getting into the details of Project Alloy, it is worth noting that there is a small “language battle” going on to describe technology born from the convergence of fields that were previously thought of as separate. For the most part, these terms live in the 3D space.
Project Alloy was built by Intel. The hardware requirements and design files are going to be released later this year in an effort to promote both Intel hardware and Windows Holographic, the new OS in development by Microsoft. (Image courtesy of Intel.)
For example, HP came out with “blended reality” to describe a workflow realized in its Sprout computers—3D scanning (capturing reality data), 3D computation (augmenting or changing the data with software) and 3D printing (turning captured reality data into new reality data).
There is the “digital thread” from 3D Systems and “reality computing” from Autodesk, and now “mixed reality” (a term popularized, perhaps, by Magic Leap) seems to be catching on. At least “mixed reality” combines two of the many interesting 3D technologies that appear to be converging in the near future.
The technological combinations include photogrammetry, light detection and ranging (LiDAR), generative design, spatial design, artificial intelligence, sensor data, robotics, robotic digital fabrication, machine vision, additive manufacturing, computer numerical controlled (CNC) machining, drones and mobile devices, as well as augmented reality and virtual reality. They’re all bumping into each other. Each of these umbrella terms is an attempt, so far unsuccessful, to find a pithy yet cohesive name for a workflow that moves between the digital and physical realms, or circulates within either one alone.
But on top of all of these terms, Intel is distinguishing its VR escapade from even the catchy and popular “mixed reality” by calling it “merged reality.” Whatever it is, Intel’s leap of faith is certainly noteworthy to those who have a vested interest in producing VR experiences.
After the presentation of Project Alloy at San Francisco’s Moscone Center, Intel appeared to have jumped headfirst into the VR space. But the big reveal left a lot of details under wraps.
Remember, this was itself a surprise move by Intel. Nobody was expecting the reveal of an untethered VR headset that lets you move around multiple rooms while integrating external reality data (your hands, other people, walls, furniture, stairs) in real time, “merging” physical and virtual worlds.
The headset uses embedded cameras and motion sensors to detect when objects are close enough to cause problems. When such objects are detected, the headset does something perhaps similar to “breaking the fourth wall” in film: the objects transform into digital data and peek through into your virtual world. Pretty cool stuff. It’s like a digital doorway. You walk through the physical and into the digital.
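Intel hasn’t published how Alloy’s pass-through actually works, but the basic idea described above can be sketched in a few lines: take a depth frame from the headset’s cameras, find the pixels closer than some cutoff, and composite the real-world imagery for just those pixels into the rendered virtual frame. Everything here, including the function name and the threshold, is an illustrative assumption, not Intel’s pipeline.

```python
import numpy as np

# Hypothetical "close enough to matter" cutoff, in meters
NEAR_THRESHOLD_M = 1.0

def merge_nearby_objects(depth_m, real_rgb, virtual_rgb):
    """Return the virtual frame with nearby real objects passed through.

    depth_m:     (H, W) per-pixel distance from the headset, in meters
    real_rgb:    (H, W, 3) camera view of the physical room
    virtual_rgb: (H, W, 3) rendered virtual scene
    """
    mask = depth_m < NEAR_THRESHOLD_M    # pixels near the wearer
    merged = virtual_rgb.copy()
    merged[mask] = real_rgb[mask]        # real imagery "peeks through"
    return merged, mask
```

On toy frames, a pixel 0.4 m away would show the camera feed while a pixel 2.5 m away would keep the virtual scene; a real system would add temporal smoothing and object segmentation so that a hand or chair appears as a coherent shape rather than scattered pixels.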
Intel isn’t actually going to manufacture the headset, which may come as a surprise, given that what it presented onstage looked like a finished, mass-produced hardware product.
CEO Brian Krzanich presented the user experience of Project Alloy, and then the big denouement: the hardware requirements and design files will be made available as open-source documents by next year for people to create their own versions. However, no website has been set up for Project Alloy, and no exact date was specified for when Intel would post the files.
Look, Ma, all hands! (Image courtesy of Intel.)
Before leaving the stage, Krzanich introduced Terry Myerson of Microsoft to talk about holographic computing. As you may or may not know, Microsoft HoloLens is a mixed reality device that allows users to see the digital and physical intertwined. In a way, the HoloLens is the reverse of what Intel demonstrated with Project Alloy: with HoloLens, you see all physical information, but digital information is projected over it, and you can interact with it physically. Funny enough, at that digital doorway, where the user interacts with digital data, the difference between the two approaches becomes negligible from the user’s perspective.
Digital mock-up of Windows Holographic OS. Skype and Microsoft Edge are lurking in the corners. (Image courtesy of Microsoft.)
That’s good news for individuals and small teams: if you’re interested in electronics design and want to work with whichever manufacturer you’d like, using hardware designs and specs released free by Intel, this could be your year.
For now, you’ll just have to wait for more announcements from Intel.
But forget waiting. There are a lot of start-ups and small companies that are jumping into the mixed reality space in innovative ways that impact engineers, machinists and mechanics.
Take Boston-based AMA, for example.
The Massachusetts Bay Transportation Authority (MBTA) and Keolis Commuter Services (KCS) will be giving KCS mechanics pairs of Osterhout Design Group (ODG) R-7 Smartglasses. Here’s the hook: the R-7 Smartglasses come packed with customized software designed by AMA XpertEye, which is based out of Cambridge, Mass.
Much as Microsoft is creating a universal holographic OS for mixed reality headsets (which it calls Windows Holographic), intended as a standard OS for open-source headset hardware designs like Intel’s Project Alloy, AMA XpertEye built its own Android-based OS for mixed reality devices from Epson, Vuzix, ODG and Google.
AMA XpertEye’s software allows field mechanics wearing the ODG R-7 Smartglasses to augment their vision by streaming a remote video connection to expert technicians in KCS’s central maintenance station.
The field mechanic’s Smartglasses are tethered to a smartphone, which provides the connection; the technician at the central facility accesses the connection from a laptop.
During the video communication, either party can take snapshots of the video, annotate them with written messages or instructions, and send them back and forth. The video can also be saved and repurposed as a training video for a specific maintenance job or function.
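AMA hasn’t published its message format, but a snapshot-annotation exchange like the one described above can be sketched as a simple serialized payload passed back and forth over the same link as the video. Every field name here is invented for illustration; this is not AMA XpertEye’s actual protocol.

```python
import json
import time

def make_annotated_snapshot(frame_id, author, note, marks):
    """Bundle a frozen video frame reference with annotations.

    Hypothetical structure; AMA XpertEye's real format is not public.
    """
    return {
        "frame_id": frame_id,    # which video frame was frozen
        "author": author,        # "mechanic" or "technician"
        "note": note,            # free-text instruction
        "marks": marks,          # [(x, y), ...] points marked on the frame
        "sent_at": time.time(),  # for ordering the back-and-forth
    }

# Either side serializes the snapshot and sends it to the other
payload = json.dumps(make_annotated_snapshot(
    1042, "technician", "Check this brake-line fitting", [(120, 340)]))
```

Because both sides keep the frame IDs and annotations, the same records could later be stitched back into the saved video to produce the training material the article mentions.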
In the field, location shapes the character of even a routine issue. KCS covers a wide swath of MBTA tracks and stations, stretching from the New Hampshire border south to Providence, R.I., and west to Worcester, Mass.
Testing the Smartglasses in Three Locations
If you’re near Somerville, Mass., you may see a field mechanic at an MBTA maintenance station wearing some unusually large-looking sunglasses with a wire hanging off them, talking to him- or herself. You might see the same thing near the MBTA maintenance facility in the Boston community of Readville. But you probably won’t see the Smartglasses in action at the third location, the end of an MBTA train line, unless you work for KCS.
Organizations like KCS are interested in mixed reality Smartglasses running AMA XpertEye’s custom Android OS because they make remote field mechanics and other workers more versatile. Remote experts will save workers in Somerville from a 30-minute walk back and forth between the site and the main facility, and field mechanics equipped with the R-7 Smartglasses could help KCS save resources by fixing issues on trains where they sit, instead of transporting them to the main facility for repairs.
For field mechanics, it certainly beats the radio connection headsets they use now. As for AMA XpertEye, the fact that its custom Android OS can be ported to different mixed reality headset hardware shows who Microsoft is competing with in the mixed reality OS sector: Google.
Android smartphones have traditionally used Qualcomm processors, while Windows tablets and 2-in-1s have run on Intel chips. And if you’re curious, Apple uses its own A9 processors in the more recent iPhones, manufactured in part by its smartphone rival Samsung.