Ericsson Research introduced Bowser in December 2012 as the world’s first WebRTC-enabled browser for mobile devices. Last week, the company announced it was discontinuing Bowser, which disappointed some users. But the disappointment didn’t last long, as Ericsson has since set a new standard for WebRTC-enabled solutions.
Imagine an operator controlling a backhoe from outside the machine with a remote control. Now imagine an operator sitting in New York controlling a backhoe located across the country in Wyoming. That capability is what Ericsson demonstrated this week at Mobile World Congress: a remote-controlled excavator using WebRTC.
Ericsson Research Labs uses a WebRTC app to remotely control an excavator. An iPhone with a mounted 360-degree lens sends a low-latency video stream to a Mac app that integrates with Oculus virtual reality goggles, creating the illusion that the user is sitting inside the digger.
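Ericsson has not published the internals of the demo apps, which are built on its own native framework rather than a browser. Still, the data path it describes maps onto the standard WebRTC API that Bowser exposed to web pages. The sketch below, written in TypeScript against the browser WebRTC API, shows the general shape of such a sender: capture a camera stream and attach it to a peer connection. The `signaling` channel and the STUN server URL are hypothetical placeholders, not part of Ericsson's demo.

```typescript
// Minimal sketch of the sender side of a low-latency WebRTC video link,
// using the standard browser WebRTC API (not Ericsson's native framework).
// `signaling` is a hypothetical channel (e.g. a WebSocket wrapper) used to
// exchange SDP offers/answers and ICE candidates; a real app must supply one.

declare const signaling: {
  send(msg: object): void;
  onMessage(cb: (msg: any) => void): void;
};

async function startCameraSender(): Promise<RTCPeerConnection> {
  // Capture video from the local camera (in the demo, a 360-degree rig on an iPhone).
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { width: 1280, height: 720, frameRate: 30 },
    audio: false,
  });

  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.example.org" }], // placeholder STUN server
  });

  // Add each captured track so it is streamed to the remote peer (the viewer app).
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream);
  }

  // Trickle ICE candidates to the remote peer via the signaling channel.
  pc.onicecandidate = (event) => {
    if (event.candidate) {
      signaling.send({ type: "candidate", candidate: event.candidate });
    }
  };

  // Create and send the SDP offer; apply the remote answer when it arrives.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signaling.send({ type: "offer", sdp: pc.localDescription });

  signaling.onMessage(async (msg) => {
    if (msg.type === "answer") {
      await pc.setRemoteDescription(new RTCSessionDescription(msg.sdp));
    } else if (msg.type === "candidate") {
      await pc.addIceCandidate(msg.candidate);
    }
  });

  return pc;
}
```

On the receiving side, the same peer connection API delivers the remote stream through the `track` event, which a viewer app would render into the headset display.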
To create that illusion, the emphasis is on the user experience. If HD video doesn’t deliver enough quality to the user, the next step is to move to 4K streams. Latency and a sense of depth also matter: for the solution to succeed, users have to feel as if they really are sitting in the machine, which demands strong, reliable networks able to handle these factors. With virtual reality goggles, a joystick controller and a miniature digger equipped with a 360-degree camera, the team showcased the demands that industrial applications will put on next-generation (5G) mobile networks in terms of capacity, throughput and latency.
“If the user experience isn’t good enough, we won’t be using it,” the team explained at Mobile World Congress.
Stefan Alund, manager at Ericsson Research, explained that both the iOS and OS X applications use the same underlying framework that Ericsson Research developed for Bowser.
“This is a research prototype that we use to highlight certain requirements that future networks need to be able to handle,” he said.
Hans Vestberg, Ericsson CEO, covered the project in some detail with the Research Lab in his Mobile World Congress keynote (jump to about the 5:15 mark for coverage of this project).
The main takeaway from this part of the keynote, as well as from the project in general, is a look at what the network society will look like in the future. What will it contain? What kind of technology needs to be built for the user experience to be as good as, or even better than, driving the machine in person? The question becomes more interesting when you consider that the machine might be in Paris while the operator is in Barcelona.
Ericsson asks, “Why should he leave his family for six months to dig a hole in Paris?” The data the machine itself generates, along with data from the sensors around the machine and from the users themselves, helps Ericsson and developers form an idea of what next-generation mobile networks will need to deliver.
“What we see, if you ask us here and now, is a really end-to-end perspective on the next generation of mobile networks,” the team explained.
“We believe real-time communications technology will be integrated in many different contexts in the future. It will be essential for solving some industry use cases; the remote-controlled excavator is one such example,” Alund told me. “In order to be able to integrate communication into different contexts, we needed a flexible client framework running on multiple platforms. Bowser is built on the same technology and adds another level of abstraction with Web APIs. For some scenarios, a browser-based approach is best; for others, you need a more deeply integrated solution.”
Edited by Cassandra Tucker