Working with smart machines
From what I have discussed so far, it is clear that in this decade work will become more and more intertwined with intelligent machines ("machines" here includes both hard ones, like robots, and soft ones, like software applications).
As pointed out in previous posts, artificial intelligence extends the applicability of automation, and in doing so some jobs are lost. At the same time, the growing use of AI stimulates job growth in companies providing AI-based systems, while the companies using those systems are on a quest for AI skills.
However, most of the impact of AI will be felt in the way work is performed, and mostly in high-skill jobs, the ones that so far have not been affected by automation.
Let’s look first at the way work changes as a result of AI and of the Digital Transformation (which fuels AI and integrates it into work processes), using as an example the evolution of work in a manufacturing industry.
Today we have plenty of robots on the shop floor, in warehouses and in the supply/delivery chain. Each of these robots has a certain level of autonomy that will increase over the coming years. Interaction with blue-collar workers takes place at “machine level”: there are meters to be read, levers to be pulled, buttons to be pushed, dials to be turned… see case 1 in the drawing.
What we are seeing is that machines are progressively being equipped with a Digital Twin that models their behaviour, mirrors their status (DT at stage III) and progressively participates in the delivery of functionality (DT at stage IV). By the end of this decade we will likely see this DT cooperating with other DTs on the shop floor and beyond (DT at stage V).
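To keep the stage numbers used in this post straight, here is a minimal sketch of the DT maturity ladder as I read it from the text. The names of the stages (and the first two rungs, which the post does not discuss) are my own labels, not an official taxonomy:

```python
# A hypothetical encoding of the Digital Twin maturity stages referred
# to in the text. Stage names are illustrative labels of my own; only
# stages III-V are described in the post itself.
from enum import IntEnum

class DTStage(IntEnum):
    MODEL = 1        # a static digital model of the machine (assumed)
    SIMULATION = 2   # the model can simulate behaviour (assumed)
    SHADOW = 3       # stage III: real-time mirror of the machine's status
    COOPERATION = 4  # stage IV: the DT co-delivers some functionality
    FEDERATION = 5   # stage V: the DT cooperates with other DTs

# The stages form a strict progression.
assert DTStage.SHADOW < DTStage.COOPERATION < DTStage.FEDERATION
```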
Blue-collar workers can interact with the DT, which can provide richer information on the status of the machine: the data provided by the machine (through shadowing, which ensures a real-time mirror of the machine’s status) can be interpreted using AI to create a meaningful picture of what is going on. As an example, a meter indicating the temperature inside a component of the machine (the one read directly by the blue-collar worker in case 1) can now be correlated by AI with other parameters, resulting in a situation awareness that is much more meaningful to the worker operating the machine. The interaction with the DT can occur via a normal screen showing images and text messages or, better, via AR goggles that let the worker look “inside” the machine, “seeing” both the temperature and its potential impact on other components. This is represented as case 2 in the drawing. Notice the green dashed line indicating that the initiative rests with the blue-collar worker: there is no interaction initiated by the machine (nor by the DT); everything results from proactivity on the human side. Also, the same digital twin can be inspected by a white-collar worker, such as a designer, to verify the actual behaviour of a product using VR (since the white-collar worker will not be co-located with the machine).
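The temperature example above can be sketched in a few lines of code. This is a toy illustration, not a real DT platform: the class, field names and thresholds are all invented, and a simple rule stands in for the AI model that would correlate the parameters in practice:

```python
# Hypothetical sketch: a machine Digital Twin that turns a raw meter
# reading into situation awareness by correlating it with other
# shadowed parameters. All names and thresholds are invented.
from dataclasses import dataclass

@dataclass
class MachineState:
    spindle_temp_c: float   # the value the worker used to read on a meter
    load_pct: float         # current load on the component
    vibration_mm_s: float   # vibration level from a nearby sensor

class MachineDigitalTwin:
    """Mirrors the machine state (stage III) and interprets it."""
    def __init__(self):
        self.state = MachineState(0.0, 0.0, 0.0)

    def shadow(self, state: MachineState) -> None:
        # Real-time mirroring of the physical machine's status.
        self.state = state

    def situation(self) -> str:
        s = self.state
        # A simple rule standing in for an AI model: the same high
        # temperature means different things at different loads.
        if s.spindle_tem_c if False else s.spindle_temp_c > 80 and s.load_pct < 50:
            return "WARNING: high temperature at low load - check coolant"
        if s.spindle_temp_c > 80:
            return "OK: temperature high but consistent with current load"
        return "OK: all parameters nominal"

twin = MachineDigitalTwin()
twin.shadow(MachineState(spindle_temp_c=85.0, load_pct=30.0, vibration_mm_s=1.2))
print(twin.situation())  # the raw 85 degrees becomes an actionable warning
```

The point is the shift in the interface: the worker no longer reads a bare number but receives an interpretation that already accounts for the machine’s context.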
A more advanced situation occurs when the DT reaches stage IV (case 3). At this point some of the functionality of the machine can actually be implemented (or co-implemented) by the DT, which will now autonomously interact with the blue-collar (or white-collar) worker (solid green line) when the need arises.
Further down the road, by the end of this decade, many workers will have their own Personal Digital Twin, or PDT. In this case (case 4 in the graphic) the interaction may occur between the machine DT and the worker’s PDT, which in turn will convert the information into a personalised form to maximise the effectiveness of the interaction. As a trivial example, two workers accessing the same machine could get the information in two different languages, because one is an Italian worker and the other a Korean one: the machine is being used in an Italian factory and was produced by a South Korean company. The same event may require notifying both the user (Italian) and the producer (Korean) for different purposes: the user needs to know that a fine tuning is ongoing that will increase the yield, so that more pieces will be produced in the next 24 hours, while the producer is notified of the increasing yield resulting from that fine tuning. The fine tuning, and the decision on who should be informed, is driven by AI, and the actual flow of information is the result of adaptation taking place in the PDTs.
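The two-worker scenario can be made concrete with a small sketch: one event emitted by the machine DT, adapted by each PDT to its owner’s role and language. Everything here (event shape, templates, message wording) is invented for illustration:

```python
# Hypothetical sketch: a machine DT emits a single event; each worker's
# Personal Digital Twin (PDT) adapts it to its owner's role and
# language, so the Italian operator and the Korean producer receive
# different, personalised notifications. All names are invented.

EVENT = {"type": "fine_tuning", "expected_yield_gain_pct": 4}

# Per-role, per-language templates a PDT might carry.
TEMPLATES = {
    ("operator", "it"): "Messa a punto in corso: +{gain}% di pezzi nelle prossime 24 ore.",
    ("producer", "ko"): "Fine tuning applied: yield up {gain}% (Korean-localised in practice).",
}

class PersonalDigitalTwin:
    def __init__(self, role: str, language: str):
        self.role = role
        self.language = language

    def notify(self, event: dict) -> str:
        # The PDT adapts the shared machine event to its owner:
        # same event, different rendering per role and language.
        template = TEMPLATES[(self.role, self.language)]
        return template.format(gain=event["expected_yield_gain_pct"])

italian_worker = PersonalDigitalTwin(role="operator", language="it")
korean_producer = PersonalDigitalTwin(role="producer", language="ko")

print(italian_worker.notify(EVENT))
print(korean_producer.notify(EVENT))
```

The design choice to note is that the machine DT knows nothing about the workers: personalisation and routing happen entirely at the PDT edge, which is where the adaptation described in the text takes place.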
Although this example is a trivial one, it helps in imagining how much the working environment will change in this decade and how much more interconnected the various players will be. This brings in the issue of open data, which is a double-edged sword: on the one hand, open data increases the value of data by bringing in more players that can invest and create services; on the other hand, the protection of data becomes trickier.
Whilst today a factory is a closed, controlled environment with well-defined processes, where it is clear who has access (and responsibility) to what, in the future machines and humans will share responsibility through data, and these data can be visible to third parties, like the manufacturer. Notice that we are already seeing this happening (though perhaps not perceiving it) when using an app on our smartphone or PC: the producer of that app will most likely get information on the way we are using it and on possible problems. This is (I hope) done to improve the app’s performance and fix bugs, yet it opens the door to unexpected side effects.
Industry 4.0 is facing these kinds of issues; healthcare, with personal data shared with a number of players (doctors, hospitals, pharma, health institutions), is another case in point. Workers will need to understand the broader implications of their activities.
Work is going on within the European Union to define a comprehensive architecture for data management (sharing, protection, ownership) based on a distributed cloud and a federated data infrastructure, Gaia-X. It now involves hundreds of European and non-European companies and is being “tested” through a number of use cases, including manufacturing and healthcare.