On my second day here at AWE US 2021, I tested a lot of interesting things, and of course, I will describe my experiences to you over the next few days. There were a few highlights: the most mindblowing one was trying the demo of Mojo Vision’s AR contact lenses, while the product that won my developer’s heart was The 8th Wall’s new Reality Engine. Let me tell you about this one, and then if you want to know more about my other experiences, subscribe to my newsletter so you don’t miss my upcoming posts about them!
AWE 2021 Expo area
Yesterday I told you about the fantastic energy around AWE 2021, with all the people meeting, hugging, and having fun together. On the second day of the event, a cool new place opened: the expo area, with tons of demos of cool hardware. Some of the interesting companies showcasing here are Tilt Five, HaptX, Niantic, The 8th Wall, Varjo, and bHaptics, just to name a few. There are also the amazing creatives of Lucas Rizzotto’s AR House showing their artistic projects. The expo area is filled with stands, and two days won’t be enough for me to try everything I would like to… I’ll have to make some choices. Kudos to AWE’s organizers for putting together such a cool place.
And for me, the experience of visiting the stands was even better because, during the first hour that was dedicated only to us communicators, I was able to meet fellow members of the “press” like the cool Erik “e-minus” and the king of podcasts, Kent Bye. I loved meeting so many amazing people from our community!
The only downside of the expo area is that, as many people complained, there is nothing “new”. AWE is not CES, so every company is here with products it has already announced… with the only exception being The 8th Wall, which launched its game Summit Scramble here. Honestly, I didn’t care much: even among already-announced products, there are a lot of cool ones I haven’t tried yet, so this expo was a great opportunity to try these pieces of hardware and chat with their creators (…and as part of this chatting, I told Jeri Ellsworth I love her, but that’s another story I will maybe tell you another day).
Among the many cool things I have tested, today I want to focus on The 8th Wall and its just-launched Reality Engine.
The 8th Wall Reality Engine
In the expo area, I visited The 8th Wall, an interesting startup that I have been following for a long time and that now has Tom Emrich, an XR professional I hold in high esteem (I love the monthly roundups of AR news he publishes on LinkedIn), as its VP of Product. I met with Tom, and after some selfies, I had a chat with him and the VP of Engineering about the just-launched The 8th Wall Reality Engine, which is for sure the product that has won my developer’s heart at this AWE.
The 8th Wall, in case you didn’t know it yet, is a startup focused on developing a framework that lets you easily create web-based AR experiences. With it you can create marker-based or markerless AR experiences, AR try-on solutions, or AR face filters in an easy way, with everything running inside the browser on iOS and Android phones. This means that users won’t have to download an app just to try a simple marketing experience for a brand (which is pretty annoying, because marketing experiences are usually used only once, while installing an app is a permanent action): they can just follow a link and enjoy it with no friction. The 8th Wall is, to my knowledge, one of the best (if not the best) frameworks offering this, because it works well across all phones, not only the ones compatible with ARCore and ARKit, and with very solid tracking. It also offers many ready-made templates out of the box to speed up development times. Its only big problem is that its licenses are quite expensive, so this engine is usually chosen when you have a good budget to invest.
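Just to make the “no friction” point concrete, here is a tiny sketch, in plain JavaScript, of the kind of capability routing a link-based Web AR framework has to do under the hood. This is purely my own illustration of the concept, not The 8th Wall’s actual API (the function name and capability flags are hypothetical):

```javascript
// Hypothetical capability routing for a link-based Web AR framework.
// This is my own illustration of the idea, NOT The 8th Wall's real API.
//
// caps describes what the visiting browser supports:
//   webxrAr      - true if the browser reports 'immersive-ar' WebXR support
//   cameraAccess - true if we can get the camera feed via getUserMedia
function chooseArBackend(caps) {
  if (caps.webxrAr) {
    // Best case: the browser exposes native AR tracking through WebXR.
    return "webxr-ar";
  }
  if (caps.cameraAccess) {
    // No native AR, but computer-vision tracking can still run in
    // JavaScript/WebAssembly on the raw camera feed - this is how a
    // framework can reach phones without ARCore/ARKit support.
    return "cv-on-camera-feed";
  }
  // No camera at all (e.g. a desktop that denies webcam access):
  // fall back to showing the content in a plain 3D viewer.
  return "plain-3d-viewer";
}
```

In a real page, the caps flags would come from probes like `navigator.xr.isSessionSupported('immersive-ar')` and a `getUserMedia` camera request; the point is simply that all this routing happens behind the link, with no app install.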
Reality Engine is a just-announced total rebuild of The 8th Wall engine so that it can work on all devices. The questions it tries to answer are: since mobile AR experiences are accessible via a link, and so can be opened by any browser, is it possible to offer a pleasant experience not only on mobile phones but on all devices? And how can the developer’s work in this cross-device deployment be made easy?
The solution offered by the company comes in five main points:
- Reality Application Runtime: the Reality Engine integrates with the WebXR APIs, so it can run Web AR experiences on almost all devices. The 8th Wall calls this build-once, run-everywhere capability “Metaversal Deployment”, and my eyes bleed at the fact that they used the M-word
- Spatialized UI: the system can bring the classic 2D UI made of buttons and sliders that you use on a phone to all XR-compatible devices, including VR headsets. This is not done by attempting some complicated mapping of 2D elements onto 3D ones, but simply by showing you a sort of tablet with the original UI in your 3D world, so that you can interact with it via ray-casting (point and click) if you are in VR
- Interaction mapping: The 8th Wall engine automatically translates the most common gestures of a mobile AR experience into inputs with a similar meaning on other devices. E.g. if you tap the screen to make something happen on mobile, on PC you can use a mouse click, and in VR you can point with a controller and click the index trigger to perform the same action
- Environment mapping: if you run your AR experience on a device without AR vision, the user will see a 3D environment around the elements of the application. By default it is a simple 3D space, but the developer can supply a custom one. So, for instance, if your AR experience is about a Christmas tree, people in AR see the tree inside their houses, while PC users see the tree inside a 3D model of a house that the developer has chosen. This makes the experience more coherent across all devices
- Responsive Scale: the experience adapts itself to the point of view of the user. This means that the camera position and the scale of the depicted objects change automatically to ensure that the view of the application’s elements is optimal for the device it is running on.
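To give a feeling of how interaction mapping might look from a developer’s point of view, here is a toy sketch of the idea in JavaScript. The gesture names, device names, and function are all hypothetical, just meant to illustrate the “one authored gesture, many device inputs” concept described above, not Reality Engine’s actual implementation:

```javascript
// Toy illustration of "interaction mapping": one abstract gesture
// authored for mobile AR, translated to the idiomatic input of each
// device. My own sketch of the concept, not The 8th Wall's real code.
const INPUT_MAP = {
  tap:  { mobile: "screen-tap",  pc: "mouse-click", vr: "point-and-trigger", hololens: "air-tap" },
  drag: { mobile: "touch-drag",  pc: "mouse-drag",  vr: "grip-and-move",     hololens: "tap-and-swipe" },
};

// Returns the device-specific input that should trigger the same action
// the developer authored for mobile AR.
function mapInteraction(mobileGesture, device) {
  const row = INPUT_MAP[mobileGesture];
  if (!row || !(device in row)) {
    // Uncommon interactions have no automatic translation: the
    // developer would have to handle them per-device by hand.
    throw new Error(`no mapping for ${mobileGesture} on ${device}`);
  }
  return row[device];
}
```

Note how a gesture outside the table simply has no translation: this is exactly the kind of gap I talk about below when the automatic mapping covers only the most common use cases.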
Long story short: you make an AR experience for mobile, and some magic makes sure that it runs well, with a nice UX, also on AR headsets, VR headsets, PC, and Nintendo 64 (ok no, I was joking about that last one). For me, as a developer, this is simply fantastic. I love how they thought about making the life of us content creators easy, letting us focus on just one platform and then giving us all the other ones for free. They thought about the best ways to translate user input and visualization from mobile to the other devices, and this is pretty amazing. I find what they did super useful.
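As an aside on the Responsive Scale point: a toy version of such a mechanism could be as simple as framing the content according to each device’s field of view. The formula and function below are my own guess at how the concept could work, not Reality Engine’s actual code:

```javascript
// Toy "responsive scale": given the bounding radius of the scene content
// and the device's vertical field of view, compute how far to place the
// camera so the content spans a fixed fraction of the view.
// Purely illustrative - not how Reality Engine actually does it.
function cameraDistance(contentRadius, fovDegrees, fillFraction = 0.8) {
  const halfFov = (fovDegrees * Math.PI) / 360; // half the FOV, in radians
  // The content (radius r) fills fillFraction of the view when
  // r / distance = tan(halfFov) * fillFraction, so solve for distance.
  return contentRadius / (Math.tan(halfFov) * fillFraction);
}
```

With the same content, a narrow-FOV device like HoloLens 2 would get the camera placed further back than a wide-FOV phone camera, so the scene always fills a comfortable portion of the view.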
They also used the M-word, and you know how much I dislike it, but for once it makes quite a lot of sense. We always say that the m**averse should be accessible from all devices, and I have criticized this statement, saying that for us developers it would be a pain to maintain the compatibility of every experience with all possible platforms. We simply don’t have the time and the money to do that. Frameworks like The 8th Wall’s that handle this compatibility themselves are what we need so that the various experiences in the m**averse can offer coherent behaviour on all possible devices.
I am pretty amazed by Reality Engine, but if my developer’s heart is in love, my developer’s brain knows that nothing is as perfect as it seems. Whenever I use cross-platform development tools (like Unity.XR), there is always a little extra work to do to ensure compatibility with different devices: it’s quite rare that everything in your application just works everywhere without any effort on your part. If that happens, you are probably developing something very simple.
For instance, I’m sure The 8th Wall engineers worked on ensuring cross-compatibility for the most common use cases, but if your app has some fancy interactions, e.g. it makes something happen when the user does a somersault, you have to work on the cross-compatibility of that interaction yourself. It is also clearly a “mobile AR first” approach: it lets you translate your mobile AR app into a pleasant experience on all the other devices. This means that, for instance, VR users will just get a “porting” of your mobile experience. If you want to privilege other categories of users, e.g. making something for VR headsets first and then porting it to mobile AR and PC, that is not possible. The 8th Wall is a framework for building mobile AR experiences, so this makes sense, but it is worth noting.
Basically, what you have to remember is that The 8th Wall built a system that lets you port the most common and most used interactions of mobile AR to all the other devices. It is not a solution to all your cross-platform development problems, but it is enough for the most common mobile AR use cases. And that’s why I think it’s a very cool product and I totally endorse it.
Hands-on with Summit Scramble
After a quick chat with people from the company, I went inside the booth and tried Summit Scramble, the multiplayer mobile AR game that The 8th Wall was using to showcase its Reality Engine. It was a multiplayer game where the players, who had customizable Ready Player Me avatars, had to go up and down a hill to collect as much gold as possible.
I don’t want to go into much detail about the game, because honestly speaking, I didn’t like it that much. The gameplay wasn’t fun, it was hard to understand what to do without some hints, and the graphics were a bit simple. I don’t know how it plays at home, but as an expo demo it was not fun. In my experience, when you are at an expo showcasing a product, you have to propose a demo that is catchy, easy to play, and full of eye candy and explosions (everyone loves explosions). Here there were no explosions, and that was kinda sad.
So the game was not amazing, but the Reality Engine worked very well in it. I tried the same game on PC, on my mobile phone (which is not ARCore-compatible), on Oculus Quest 2, and on HoloLens 2, and the game worked in an enjoyable way on all of them. The interactions were different on every platform, but each was the best fit for the platform the game was running on: on PC I moved my avatar with WASD, on Quest with the thumbstick, on mobile with a virtual thumbstick. It all felt so natural to use. On HoloLens 2, since there were no controllers, an “air-tap and then swipe your hand” mechanic was implemented, and it was nice, but quite non-standard, so someone had to explain to me how it worked. Probably what is also needed in some cases is some kind of “cross-platform” tutorial mechanism to explain the interactions on every device. But it was impressive that all of this worked out of the box, on all devices, with everyone joining through a simple web link, and with the experience built once and deployed everywhere.
So the Reality Engine worked, and it was cool that it was demoed with a little AR game and not with the standard commercial AR experience where you just have a 3D object in your environment and you tap on it to make some animations happen. It was a more difficult use case, also considering that the Ready Player Me avatar system was integrated too. This shows that the engine is really versatile across a good range of mobile AR experiences. Very cool.
And that’s it for my experience with The 8th Wall! If you want to discover more about the cool Reality Engine, you can find more info in its official blog post.
You will hear more from me tomorrow with new stories from AWE! Cheers!
(Header image by The 8th Wall)
Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I’ll be very happy because I’ll earn a small commission on your purchase. You can find my boring full disclosure here.