One of the more serious limitations facing the robotics industry today is that each bot it produces is an island unto itself. Worse, robots' primitive AI doesn’t allow for intuitive thinking or problem solving — what’s known as artificial general intelligence. Looking to overcome this problem, researchers from several different European universities have developed a cloud-computing platform for robots that will allow them to collaborate — and make each other smarter — over the Internet.
Essentially, the new system, called Rapyuta: The
RoboEarth Cloud Engine, is an open source repository of accumulated information
for robots. Its name is taken from the movie Castle in the Sky by Hayao
Miyazaki, in which Rapyuta is the castle inhabited by robots. The name is quite
perfect, actually. In terms of the technology required, the developers
implemented a Platform-as-a-Service (PaaS) framework designed specifically for
robotics applications. Each robot can kickstart its own computational
environment or launch any node that has already been set up in advance
(typically by another developer). It can also communicate with other nodes
using the WebSockets protocol. Data stored in the cloud will include software
components, maps for navigation (including the location of objects and world
models), task knowledge (like action scripts and manipulation strategies),
human voice command processing, and object recognition models.
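To make the request/response flow concrete, here is a minimal sketch of how a robot might ask the cloud engine to launch a compute node. RoboEarth's actual wire format isn't described in the announcement, so the message fields and function names below are illustrative assumptions, not the project's real schema:

```python
import json

def make_launch_request(robot_id, node, params):
    """Build a JSON message a robot might send over its WebSocket
    connection to spin up (or attach to) a compute node in the cloud.
    Field names are illustrative, not RoboEarth's real schema."""
    return json.dumps({
        "op": "launch_node",
        "robot_id": robot_id,
        "node": node,            # e.g. a navigation or vision node
        "params": params,
    })

def parse_response(raw):
    """Decode the cloud engine's reply and surface errors explicitly,
    rather than letting a failed launch pass silently."""
    msg = json.loads(raw)
    if msg.get("status") != "ok":
        raise RuntimeError(f"cloud engine error: {msg.get('error')}")
    return msg

# A robot requesting an object-recognition node with a particular model:
request = make_launch_request("bot-7", "object_recognition",
                              {"model": "kitchen_objects"})
```

In a real deployment the JSON would travel over a WebSockets connection, as the platform specifies; the point here is just the shape of the conversation: a robot names the node it wants, and the engine either confirms or reports an error.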
By allowing robots to collaborate and share
information in this way, each bot will essentially offload its “brain” into the
cloud. Moreover, each unit can be considerably “lighter” in terms of its
processing and software requirements; when in doubt, it just needs to hit the
cloud. Ultimately, this will make robots cheaper, more efficient — and more intelligent.
The platform will allow robots that are connected to
the Internet to directly access powerful computational, storage, and
communications technologies, including those of modern data centers.
“The RoboEarth Cloud Engine is particularly
useful for mobile robots, such as drones or autonomous cars, which require lots
of computation for navigation,” noted Mohanarajah Gajamohan, a researcher at
the Swiss Federal Institute of Technology (ETH Zurich) and technical lead of
the project, in an official statement. “It also offers significant benefits
for robot co-workers, such as factory robots working alongside humans, which
require large knowledge databases, and for the deployment of robot teams.”
As exciting as this appears, the concept is not
without its problems. Two things concern me in particular.
First, anything that’s connected to the Internet is
inherently hackable. This system will need to be extremely secure; otherwise,
the robots could be controlled by a malicious actor, either individually or
collectively.
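One basic defense (a sketch, not anything RoboEarth has announced) is for each robot to accept only commands cryptographically signed by the cloud engine. The shared-secret HMAC scheme below is purely illustrative; a production system would use proper key provisioning and transport security on top of it:

```python
import hmac
import hashlib

# In practice this secret would come from secure per-robot provisioning,
# not a hardcoded constant.
SECRET = b"per-robot shared secret"

def sign_command(command: bytes) -> bytes:
    """Prepend an HMAC-SHA256 tag so the robot can verify the command
    really came from the trusted engine. Illustrative scheme only."""
    tag = hmac.new(SECRET, command, hashlib.sha256).digest()
    return tag + command

def verify_command(blob: bytes) -> bytes:
    """Reject any command whose tag doesn't match: a tampered or forged
    message raises instead of being executed."""
    tag, command = blob[:32], blob[32:]
    expected = hmac.new(SECRET, command, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("rejected: command not signed by trusted engine")
    return command
```

The key design point is that verification failure is a hard stop: a robot that can't authenticate an instruction should refuse it rather than run it.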
And second, the query response-and-match algorithms
will need to be very strict to prevent a robot from getting the wrong
instructions. For example, a robot could ask the cloud for instructions on how
to perform task x, but the cloud engine could misunderstand and provide it with
instructions for task y. The robot, because it’s stupid, will then execute task
y. This could be dangerous, and even potentially catastrophic in some contexts.
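The strictness I have in mind can be shown with a toy task lookup. The task library and function below are hypothetical, not part of Rapyuta; the point is the policy of exact matching with an explicit refusal, rather than a fuzzy "closest match" fallback:

```python
# A toy knowledge base of task scripts (hypothetical contents).
TASK_LIBRARY = {
    "open_fridge": ["approach fridge", "grasp handle", "pull"],
    "open_drawer": ["approach drawer", "grasp handle", "pull"],
}

def fetch_instructions(task_name):
    """Exact-match lookup: an unknown task name raises instead of
    falling back to the nearest entry, so the robot never silently
    executes task y when it asked for task x."""
    try:
        return TASK_LIBRARY[task_name]
    except KeyError:
        raise LookupError(
            f"no exact match for {task_name!r}; refusing to guess"
        ) from None
```

A smarter matcher could still use fuzzy search to *suggest* candidates back to a human operator, but the robot itself should only act on an unambiguous hit.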