We use augmented reality (AR) to achieve a great user experience while also collecting enough data for our algorithms to accurately understand the key characteristics of each shopper’s feet. In building the Volumental Mobile app, we needed to solve a range of complex technical problems.
Accurate 3D foot measurement is hard to achieve, even with a laser depth sensor such as LiDAR in the camera. In Volumental Mobile, we don’t use LiDAR or other depth sensors for measuring; we needed a completely different approach to reach the accuracy required for great size recommendations. Instead, we rely on correctly tracking where the camera is within the room. Tracking the camera’s location within a space is a key feature of Apple’s ARKit and is what enables augmented reality experiences like ours. However, we had to build our own camera tracking, far more precise than Apple’s, in order to scan feet accurately in a wide variety of settings. To get there, we ran thousands of experiments with an industrial robot that never tires of repeating the exact same camera movement. Solving a hard computer vision problem better than anyone in the world, in order to get accurate size recommendations, is just one of many technical challenges we have overcome to deliver a user-friendly product.
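To make the robot experiments concrete, here is a minimal sketch of the kind of evaluation they enable: comparing a tracker’s estimated camera trajectory against the ground-truth trajectory the robot actually executed. Everything below is illustrative, not Volumental’s actual code; the metric shown is the standard root-mean-square position error between two aligned, time-synchronised trajectories.

```python
import math

def absolute_trajectory_error(estimated, ground_truth):
    """RMSE of per-frame position error between two aligned trajectories.

    Each trajectory is a list of (x, y, z) camera positions in metres,
    assumed to be time-synchronised and expressed in the same frame.
    """
    if len(estimated) != len(ground_truth):
        raise ValueError("trajectories must have the same length")
    squared_errors = [
        sum((e - g) ** 2 for e, g in zip(est, gt))
        for est, gt in zip(estimated, ground_truth)
    ]
    return math.sqrt(sum(squared_errors) / len(squared_errors))

# Example: a hypothetical tracker that drifts 1 mm per frame along x,
# measured against the robot's repeatable ground-truth sweep.
gt = [(0.01 * i, 0.0, 0.0) for i in range(5)]
est = [(0.011 * i, 0.0, 0.0) for i in range(5)]
drift = absolute_trajectory_error(est, gt)
```

Because the robot repeats the exact same movement every run, any change in this error number can be attributed to the tracking algorithm rather than to how the camera was moved.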
We have scanned 10 million people in retail stores around the world over the last few years. For the mobile app, we’ve had thousands of test users take top-down images of their feet at home, with and without socks, to make sure the app works anywhere. This exercise showed us how people intuitively think about photographing their feet and surfaced the common errors shoppers make, which directly informed the design of our user interface (UI). Further, this data now also feeds the machine learning behind the foot detection algorithm that makes using the app faster and easier than other options on the market.
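A quick sketch of how such a user test surfaces common errors: tally detection failures by capture condition. The session records and field names below are hypothetical, purely to illustrate the shape of the analysis.

```python
from collections import Counter

# Hypothetical outcomes from at-home test sessions: the capture
# condition and whether the foot detector succeeded on that attempt.
sessions = [
    {"condition": "socks", "detected": True},
    {"condition": "socks", "detected": False},
    {"condition": "barefoot", "detected": True},
    {"condition": "barefoot", "detected": True},
]

totals = Counter(s["condition"] for s in sessions)
failures = Counter(s["condition"] for s in sessions if not s["detected"])

# Failure rate per condition: the conditions that fail most often are
# the ones the UI needs to guide users through, and the ones the
# detection model needs more training data for.
failure_rate = {c: failures[c] / totals[c] for c in totals}
```

Aggregations like this point at both UI fixes (guide the shot better) and model fixes (train on the hard cases).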
Because Volumental has already scanned millions of people and paired those scans with purchase data showing which shoes those shoppers chose, we have a statistical understanding of which feet fit which styles and sizes of each shoe. This means we can match scans from our app against that existing foot data to recommend the likeliest fit for each shopper.
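In spirit, matching a new scan against feet with known purchase outcomes can be sketched as a nearest-neighbour vote. The measurements, shoppers, and sizes below are made up for illustration, and the real system is certainly richer than this, but the sketch shows the core idea of letting similar feet and their kept purchases drive the recommendation.

```python
import math
from collections import Counter

# Hypothetical history: (foot length mm, foot width mm) paired with the
# size that shopper bought and kept for a given shoe style.
past_scans = [
    ((262.0, 98.0), "EU 42"),
    ((263.5, 99.5), "EU 42"),
    ((270.0, 101.0), "EU 43"),
    ((255.0, 95.0), "EU 41"),
]

def recommend_size(scan, history, k=3):
    """Majority vote among the k most similar past scans.

    Similarity here is plain Euclidean distance over the measurement
    vector; a production system would use many more foot dimensions.
    """
    nearest = sorted(history, key=lambda item: math.dist(scan, item[0]))[:k]
    votes = Counter(size for _, size in nearest)
    return votes.most_common(1)[0][0]
```

For a new scan of (261.0, 97.5), the two closest past feet both kept an EU 42, so that size wins the vote.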