
Robots are becoming smarter and smarter thanks to a brain provided by artificial intelligence. However, this brain, like any brain, has to learn about its environment, understand what the goal is, and work out how to achieve it by selecting among different strategies that take its surroundings into account.
This learning phase takes time (even Roomba, the first widespread intelligent vacuum cleaner, takes some time to get the lay of the land) and, along with it, consumes computing time and can cost quite a bit of money.
Facebook and Matterport are working to develop virtual houses, each one modelled as a digital twin that can interact with the robot (actually, with the robot's digital twin) to speed up the learning phase. You can get more info and see a nice video clip here.
The Digital Twins are used to generate a variety of situations stemming from a given virtual home, like different positions of the furniture… The home digital twin interacts with the robot digital twin, supporting the actual (although virtual) execution of actions, such as picking up a fork from the floor and placing it in the sink.
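To make the idea a bit more concrete, here is a minimal, purely illustrative sketch of how a home digital twin could generate different situations and let a robot digital twin act on them in cyberspace. The names used (HomeTwin, RobotTwin, randomize_layout, pick, place) are hypothetical and are not taken from Facebook's or Matterport's software; this is just a toy model of the interaction described above.

```python
# Illustrative sketch only: HomeTwin, RobotTwin and their methods are
# hypothetical names, not the API of any real simulator.
import random
from dataclasses import dataclass, field


@dataclass
class HomeTwin:
    """A toy digital twin of a home: objects mapped to locations."""
    layout: dict = field(default_factory=dict)

    def randomize_layout(self, rng: random.Random) -> None:
        # Generate a new training situation by shuffling movable items,
        # mimicking the "different positions of the furniture" idea.
        movable = ["fork", "sofa", "table"]
        places = ["kitchen floor", "living room", "kitchen", "hallway"]
        for item in movable:
            self.layout[item] = rng.choice(places)


@dataclass
class RobotTwin:
    """A toy digital twin of the robot, acting only inside the virtual home."""
    holding: str | None = None

    def pick(self, home: HomeTwin, item: str) -> bool:
        if item in home.layout and self.holding is None:
            self.holding = item
            del home.layout[item]
            return True
        return False

    def place(self, home: HomeTwin, location: str) -> bool:
        if self.holding is not None:
            home.layout[self.holding] = location
            self.holding = None
            return True
        return False


def run_training_episodes(episodes: int = 3, seed: int = 42) -> None:
    rng = random.Random(seed)
    for ep in range(episodes):
        home = HomeTwin()
        home.randomize_layout(rng)   # each episode is a different home situation
        robot = RobotTwin()
        picked = robot.pick(home, "fork")
        placed = robot.place(home, "sink") if picked else False
        print(f"episode {ep}: layout={home.layout} task_done={picked and placed}")


if __name__ == "__main__":
    run_training_episodes()
```

In a real system the learning would of course involve perception, navigation and trial-and-error over thousands of such virtual episodes, but the basic loop, randomise the home, let the robot's twin act, check the outcome, is the one sketched here.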
It is a new way of seeing Digital Twins and of supporting their fully autonomous interaction in cyberspace. Actually, this is one of the first examples I am aware of where Digital Twins operate at stage V.
In the image you can see, in the upper part, a photo of a real home and, in the lower part, its digital model. The conversion was made using 3D scanners (if you have one of the latest iPhone models, it contains a LiDAR sensor, and there are a few apps that leverage it to create pretty accurate 3D digital models).