At the 2017 Augmented World Expo (AWE) — the world’s largest conference and expo dedicated to AR/VR innovation — Kyle Nel joined the likes of Google, Microsoft and Niantic (of Pokémon Go® fame) to share how Lowe’s is shaping the future of retail using breakthrough technology. I sat down with Kyle at the event.
Q. Some people might be surprised to find Lowe’s at a conference like AWE, yet it was one of only two retailers chosen as finalists for an award at this year’s event. So, how is a home improvement company in the same conversation as Pokémon Go® and Snapchat®, applications that have helped to mainstream augmented reality?
A. Visualization is critical to planning and executing a home improvement project, and AR/VR technology allows customers to “see” their projects come to life before they ever make a purchase – giving them a sense of confidence that transforms the experience.
Lowe’s is leading the way in developing virtual tools that help customers in the home design process. Our immersive in-store Holoroom experiences place users in realistic home improvement scenarios, like designing a kitchen or learning DIY skills to complete a home improvement project. The Lowe’s Vision application turns Tango-enabled smartphones into digital power tools that let customers easily capture room dimensions and preview appliances and home décor in their homes. The platform’s newest feature enables in-store navigation to help customers find products in Lowe’s stores.
Just recently, we launched LIL 3D, a proprietary, end-to-end 3D scanning technology platform that produces high-definition virtual content.
Q. Why is Lowe’s developing 3D scanning technology in-house? What’s the story, why was it developed and how?
A. We’ve learned through extensive research that mixed and virtual reality experiences are only as good as their content. If assets don’t look real, the ability to produce a truly immersive user experience is limited. Much of the 3D product content available in retail today is computer-generated or simulated graphics, and not at all lifelike. LIL 3D creates photo-real 3D product images with ultra-high-definition textures and color accuracy, offering detailed 360-degree product views that are nearly indistinguishable from reality.
Early on, when building our first AR/VR applications, we scoured the tech industry for partners who could help us build a system for generating high-fidelity 3D content. We quickly realized that our needs exceeded what was available – so we got to work developing our own extremely high-quality 3D scanning solution. LIL 3D was built from the ground up by our Lowe’s Innovation Labs Seattle team, a group of former AAA game developers and designers.
Q. How are virtual images changing the shopping experience, and what are your plans for future applications?
A. LIL 3D assets are available as part of a virtual image collection on The Mine, our premier online home furnishings destination. In many cases, customers prefer to see products before they make a purchase. With LIL 3D, they can easily interact with virtual products online, examining the texture and viewing items from every angle to get a better sense of how they will actually look in their homes. For instance, they can view a wooden coffee table and observe all the details, including wood grain, variance in color, even down to the type of screws used. This level of digital visualization isn’t available anywhere else in retail. And we’ve observed a significant increase in page views, cart additions and orders for products available in the virtual imaging collection.
Additionally, our 3D product assets are the backbone of the Lowe’s Vision application. We’ll continue to make LIL 3D more widely available in the months to come, including through Lowe’s digital platforms and potentially well beyond.
Q. In your talk, you shared that behavioral research informs just about every AR/VR initiative at Lowe’s. Why is this important, and is it a common practice within the retail industry?
A. Lowe’s Innovation Labs is paving the way in behavior-driven innovation, which means that we don’t develop our applications based on technology trends. Instead, we develop our ideas based on how people behave when interacting with technology. Since 2013, we have tested all of our AR and VR experiences with applied neuroscience to study what motivates people, what captures their attention and what overwhelms them, among other measures.