The future is now. Yannick Boers of Boldly-XR explains how they’re making your Electrified dreams reality – augmented reality, that is.
How did the VanMoof AR experience start?
It’s funny, we actually reached out to you guys directly – we built a pretty crude model of the Electrified and put it in an augmented reality environment, then said, “this is what we can do for you.” Augmented reality is still very new for most people, so it’s something you have to make very tangible to show people the benefits.
The use cases for AR are particularly effective when you have a strong brand with a carefully considered visual design and artistic impact. VanMoof ticks all those boxes in that the meeting of form and function is one of the Electrified’s key USPs – it’s a beautiful thing, but it’s also a physical tool that you use in real life.
After this ‘demo’ version, what were the next steps in bringing the Electrified to virtual life?
Well, it’s relatively easy to build an extremely detailed 3D model of a product. The VanMoof Product Design team sent us their CAD renders of the Electrified S2, and we took hundreds of reference photos of the real thing – we used these to build our own model from the ground up.
The problem is that, in its current state, augmented reality is always limited by having to run on a smartphone. Phones have advanced massively in the past few years, but they’re still much less powerful than a ‘real’ computer. This means we then downsize our first model until it can run on a standard phone. There are lots of clever ways you can do this without compromising on visual quality.
You’re always ‘budgeting’ your polygon count in AR, so we made heavy use of a process called normal mapping. That’s when you take an intricate 3D surface – such as the tread of a bike tire – and render it as a 2D image with the illusion of depth. If you account for the different angles you view the model from, it’s impossible to tell the difference. That allowed us to downsize a 700,000-polygon model into one that looks the same with just 100,000 polygons.
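As a rough illustration of the idea (a minimal sketch, not Boldly-XR’s actual pipeline), normal mapping amounts to lighting a flat polygon per pixel using stored surface normals baked from the detailed model – the shader fakes the grooves instead of modelling them. The vectors and light direction below are invented for the example:

```python
def shade(normal_map, light_dir):
    """Lambertian shading: brightness = max(0, N . L) per texel.

    normal_map: 2D grid of unit normals (nx, ny, nz) baked from the
    high-poly model; the flat low-poly surface "borrows" their detail.
    light_dir: unit vector pointing toward the light source.
    """
    lx, ly, lz = light_dir
    return [[max(0.0, nx * lx + ny * ly + nz * lz)
             for (nx, ny, nz) in row]
            for row in normal_map]

# A tiny 1x2 "normal map": one texel faces the camera, one is tilted
# as if it sat on the wall of a tire groove.
flat = (0.0, 0.0, 1.0)
tilted = (0.6, 0.0, 0.8)   # leans toward +x, still unit length
lit = shade([[flat, tilted]], (0.0, 0.0, 1.0))
# The tilted texel shades darker than the flat one, so the single
# flat polygon appears to have real depth under the light.
```

Because the geometry stays flat, the polygon count never changes – only a texture lookup and a dot product are added per pixel, which is why the trick is so cheap on phone hardware.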
When does the animation factor into the process?
It’s really a red thread throughout the whole project. You develop these elements as static components, but you always have to consider how they’ll work together in motion. It was important that we made clear splits in the model from an early stage, so we could work out which elements would be ‘swapped out’ with a moving part when the user activates that feature.
How do you account for the different phones and operating systems riders will use?
Different platforms are always going to prioritize certain aspects of the experience. The Facebook AR experience clearly sets out to be as inclusive as possible, so everything has to be optimized for older and less powerful devices. That means they can be quite picky in dictating the amount of detail you can show.
iOS and Android both have slightly different focuses, which largely comes down to the way their occlusion engines work. Simply put, this is the tech which allows real-world objects to move in front of the rendered objects without breaking the illusion. Apple’s engine is more optimized for people, whereas Google’s is better at estimating the positioning of objects. It’s a subtle difference, but it really affects the way this ‘reality’ works.
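Stripped of the platform specifics (this is my own simplified sketch, not Apple’s or Google’s engine), occlusion boils down to a per-pixel depth test: compare the camera’s estimated real-world depth against the virtual object’s depth, and draw the virtual pixel only where it is nearer. The pixel labels and depths below are invented:

```python
def composite(camera_rgb, camera_depth, virtual_rgb, virtual_depth):
    """Per-pixel occlusion: keep a virtual pixel only where the
    rendered object is closer to the camera than the real scene.

    Depths are in meters; None in virtual_depth means no virtual
    object covers that pixel.
    """
    out = []
    for cam_px, cam_d, vir_px, vir_d in zip(
            camera_rgb, camera_depth, virtual_rgb, virtual_depth):
        if vir_d is not None and vir_d < cam_d:
            out.append(vir_px)   # bike sits in front of the real scene
        else:
            out.append(cam_px)   # real object occludes the bike
    return out

# A hand passes in front of the rendered bike at the middle pixel:
frame = composite(
    camera_rgb=["wall", "hand", "wall"],
    camera_depth=[3.0, 0.4, 3.0],
    virtual_rgb=["bike", "bike", None],
    virtual_depth=[1.5, 1.5, None],
)
# frame == ["bike", "hand", "wall"]
```

The quality of the illusion then hinges entirely on how well the phone estimates `camera_depth` – which is exactly where the two platforms’ engines differ.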
We were in close contact with both parties to really push the limits of their platforms. It was important that we could showcase the full functionality of the bike, so Apple actually extended the capabilities of their Reality Composer software for us. We’re pretty proud of that.
We also wanted to make this VanMoof experience as accessible as possible. People can’t be bothered to download a separate app, so we had to make something that runs natively, no matter what platform you’re using. You can share this experience via the web and it opens instantly – no installs, nothing extra.
Let’s get practical. How does this benefit companies like VanMoof?
In any retail environment, you have two parties: the retailer and the consumer. Augmented reality is fairly unique in that it can benefit both. The advantage for the retailer is that it makes customers so much more certain regarding their purchase – they can get to know the product way better, so they’re much less likely to return anything.
In the age of e-commerce, retailers need to find ways to cut down on unnecessary shipping, and AR really helps with this. We also see customers spending more, as they’re more comfortable purchasing high-end products after going ‘hands-on’ in AR.
On the consumer side, we’re still benefiting from what we call the ‘goldilocks zone’. We’re at a stage where the tech is accessible to literally anyone with a smartphone, but it still entices customers as something that’s kind of magical. It helps consumers make the right choice for them, but it’s also a lot of fun – which helps streamline any retail experience.
Is it difficult to bring a very physical-tactile product to life in a virtual space?
It’s not so much a challenge as an inherent benefit of the platform. Even though society is moving more and more towards buying goods online, we can see that people still depend on brick-and-mortar stores. Being able to experience a product in person is something that’s very difficult to replace; humans really like to see and touch things before making a decision.
And we know there’s some psychology behind this – our brains are wired to have a much better grasp of three-dimensional objects than of 2D images. People have a much higher attention rate when using AR, which leads to a much deeper understanding of a product like the Electrified.
And the really cool thing about the current tech is that we can now place objects completely true-to-scale. Your phone now has such a deep understanding of your physical space that an AR experience is just as lifelike as seeing it for real… except you can’t touch it, of course.
What are the next steps for AR? What excites you about the tech?
The entire XR spectrum – that’s augmented and virtual reality – has been the stuff of science fiction for as long as we can remember. We’ve always said “one day, we’ll be able to…”
What’s really exciting for me is that we’re now in the age where we can make these things reality. And we’re really on the advent of the mass adoption of AR – there are all sorts of signs across the tech world that the major players are working on developing their own AR systems and content. Apple has been buying up lots of AR companies, so I think we can definitely expect something big from them in the next couple of years.
It’s going to completely change the way we use computers. Right now, everything is confined to a screen, but all sorts of workflows become more efficient and flexible when you move them into a physical space. Our digital lives are about to flow way more into our physical lives, and that can have a positive and negative impact. It’s really up to us to make sure these advances are beneficial, and not just a way to broadcast more ads to us all the time.
And where do you see the real innovation coming from?
Up to this point, there’s been a certain disconnect between agencies and brand stakeholders. There are all sorts of dreams from brands that have been deemed too ambitious or just not possible. We used to call it a story of ‘buts’ – “Augmented reality could be wonderful for your brand, but it can only work on this platform... you need to place a marker here… you need to have a custom app installed…”
But the tech is moving so fast that the possibilities now stretch way beyond people’s current ambitions. Something that wasn’t possible a year ago might be very doable right now. As brands begin to understand that, we’re seeing so much more creativity on both sides of the equation – people are discovering new ways to dream.
Take the Electrified S2 for a virtual test ride right now.