EPHAS: an augmented reality (AR) storytelling platform

It is an early morning in June. It’s still a little chilly as you stand and gaze out across a plain from a vantage point near Dinosaur National Monument on the Colorado and Utah border. The sun has risen over the mountains and it looks like it will be a beautiful day. Looking out across the plain, you see several brontosaurs walking towards you in the distance. Above them, a few pterosaurs fly overhead. They swoop so low that you can see these imposing flying reptiles very clearly – the details on their wings catch your eye. The sun reflects off their bodies and their shadows pass you by as they continue their journey up a ridge.

Returning your attention to the plain, you see that the brontosaurs have gradually moved closer to where you are standing. As they pass about 50 yards away, a female stops to eat from a tree, and you can see how her heavy torso moves as she breathes. She swings her remarkably long and detailed neck around, and for a moment you think she’s watching you.

It could have been a scene from one of the Jurassic Park movies, or from a book depicting the lives of dinosaurs in the Morrison Formation in the United States. In this case, however, it is a description of the augmented reality (AR) experiences that will be possible to create and offer in the very near future.

We’re confident that over the next few years you’ll be able to stand in the Morrison Formation (or other locations of your choosing) and experience augmented reality stories so lifelike you’ll barely be able to tell what’s real and what’s not. To use my example, the level of detail will be so convincing that you’ll be able to see which brontosaurs bear scars from a fight with a T-Rex the year before. You will see the sun reflecting off the dinosaurs and their shadows cast in the right direction.

What is Augmented Reality?

So what is AR and why is it getting so much hype right now? With AR technology, you can add digital assets to the real world. In other words, you can enhance the real world by merging the digital with the physical.

In the past, you had to be in front of a 2D screen to consume digital content superimposed on the real world. Traditionally, the most common way to experience this would be an action movie with digital special effects. As technology evolved, we were able to add digital assets to real-time experiences in apps like Pokémon Go. The more advanced the technology, the more realistic the experience becomes, and the blurrier the lines between the real world and the digital world. This is true both from a timing and realism perspective. There is convergence. But this convergence is not just happening between the real and digital worlds.

Let’s think about where this evolution and convergence will take us. Eventually, we may move from interacting with the digital world on the 2D screens of smartphones and tablets to interacting with the converged real/digital world through head-mounted devices (HMDs). As Meta (formerly Facebook) described at its 2021 rebranding event, the possibilities and opportunities are endless. This is also why tech giants such as Meta, Google and Apple are investing huge sums in AR/MR/VR. Think about it: the software and hardware platforms we use today will evolve over time to become 3D AR/MR platforms on HMDs.

EPHAS: how we facilitate the current evolution of AR

Over the past few years, the EPHAS team and I, with support from Ericsson ONE, have built an end-to-end (E2E) augmented reality storytelling platform called EPHAS – a location-based platform that uses 5G, edge computing and AR to share stories and events. It addresses both the creative and the consumer side of immersive AR storytelling, democratizing it in the process.

Today, anyone wanting to create AR experiences will typically need to hire a skilled developer to produce the experience and create the application through which it can be consumed. The problem is that if the storyteller wants to make changes or add more stories in the future, the developer has to come back and make those changes. It is essentially an expensive, non-scalable and bespoke solution.

Moreover, the solution offered by the developer will become obsolete as the technology evolves. To solve this problem, we built a storyteller studio where a lay storyteller can easily compose advanced AR experiences using any media they choose. Think of the solution as a cross between a movie editing tool and a game creation engine. Storytellers can then publish these AR stories, which consumers can access through an app on their own personal devices. Today, these devices are usually smartphones and tablets. Tomorrow, they will be head-mounted smart glasses. Stories are stored in the cloud, and as technology evolves, EPHAS is updated so that the latest and greatest technology is always used.
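To make the create-and-publish flow above concrete, here is a minimal sketch of how a location-anchored AR story might be represented and serialized for cloud storage. All class and field names here are illustrative assumptions for the sake of the example – EPHAS's actual data model is not public.

```python
from dataclasses import dataclass, field

# Hypothetical data model for a location-anchored AR story.
# Names and fields are illustrative, not EPHAS's real schema.

@dataclass
class GeoAnchor:
    latitude: float    # WGS84 degrees
    longitude: float
    altitude_m: float = 0.0

@dataclass
class ARAsset:
    asset_id: str      # reference to a 3D model or media file in cloud storage
    anchor: GeoAnchor  # where in the real world the asset appears
    scale: float = 1.0

@dataclass
class ARStory:
    title: str
    author: str
    assets: list = field(default_factory=list)

    def publish(self) -> dict:
        """Serialize the story into a plain dict for upload to the cloud."""
        return {
            "title": self.title,
            "author": self.author,
            "assets": [
                {
                    "asset_id": a.asset_id,
                    "position": (a.anchor.latitude,
                                 a.anchor.longitude,
                                 a.anchor.altitude_m),
                    "scale": a.scale,
                }
                for a in self.assets
            ],
        }

# Example: a single brontosaurus placed near Dinosaur National Monument
story = ARStory(title="Morrison Formation Morning", author="storyteller@example.com")
story.assets.append(ARAsset("brontosaurus_v2", GeoAnchor(40.4383, -109.3046)))
manifest = story.publish()
print(manifest["title"])  # → Morrison Formation Morning
```

The key design point is that the story is just data: because the published manifest only references assets and anchors rather than embedding rendering code, the consumer app (and the rendering technology behind it) can be upgraded without touching the stories themselves.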

 Learn more about EPHAS

Under the hood of an augmented reality platform

Several other components are needed to bring super-immersive augmented reality stories to life. First, you need extremely powerful computing to manage many detailed, data-rich assets: how they respond to your device (how you move and where you look, for example) and how they interact with the surrounding environment (such as how light reflects and how shadows are cast).

Second, once the experience has been computed at the edge or in a nearby cloud, you need to communicate it to the device in near real time. If you want the human brain to believe the experience is real, you are only allowed a few tens of milliseconds of delay; a side effect of too much delay is that the viewer may feel nauseous. This is where low-latency, high-throughput 5G solutions come in. They provide the critical connectivity for immersive AR solutions: a fast pipeline to transport the experience from the nearby cloud to the device.
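A back-of-the-envelope budget illustrates why edge computing and 5G matter here. The sketch below assumes a motion-to-photon budget of roughly 20 ms (a commonly cited comfort threshold for head-tracked AR/VR) and uses purely illustrative stage timings – these are not measured EPHAS figures.

```python
# Illustrative motion-to-photon latency budget for edge-rendered AR.
# All numbers below are assumptions for the example, not measurements.

BUDGET_MS = 20.0  # commonly cited comfort threshold for head-tracked AR/VR

stages_ms = {
    "pose capture + encode": 2.0,  # sample device pose, package the uplink
    "5G uplink": 3.0,              # device -> edge over a low-latency link
    "edge rendering": 7.0,         # render the frame on nearby edge GPUs
    "5G downlink": 3.0,            # stream the encoded frame back
    "decode + display": 4.0,       # decode and scan out on the device
}

total = sum(stages_ms.values())
print(f"total: {total:.1f} ms (budget {BUDGET_MS:.1f} ms)")
print("within budget" if total <= BUDGET_MS else "over budget")
```

With these assumed numbers the round trip just fits; move the rendering from a nearby edge site to a distant data center (adding, say, 30 ms of transport each way) and the budget is blown many times over, which is the core argument for pairing 5G with edge compute.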

So what needs to be done to make this vision come true? First, the necessary technologies must be capable enough to deliver these experiences. Second, the technology ecosystem must be connected and work together. And third, to achieve mass adoption, the cost to the end user must be low and the platforms must be easy to use.

On the first criterion, the technologies needed to deliver fully immersive, interactive AR experiences are becoming a reality. AR technology is readily available, devices are becoming more powerful, HMD development is progressing rapidly, and 5G and edge computing are being rolled out. The industry is working hard to connect these technologies into a well-functioning ecosystem while reducing costs and improving ease of use for end users. It may not be possible to say exactly when the explosion in mass adoption will occur, but all signs suggest it will happen within the next few years.

What is Ericsson’s role in the ecosystem described above and in the converging real and digital worlds? Throughout our 150-year history, Ericsson has created technology to connect people. A cornerstone of human connection is storytelling. With EPHAS, our new augmented reality storytelling platform, we continue that tradition of human communication and connection.

AR is one of the most tangible 5G use cases right now. By connecting the possibilities of 5G and edge computing to an easy-to-use AR storytelling platform, imagine what historical stories and events you could immerse yourself in. Imagine the things you might discover.

Want to know more?

Read our blog post on 5G and augmented reality.

Learn more about augmented reality games
