I had the opportunity, a few days ago, to attend a talk in the framework of IEEE 2050 on the evolution of scientific publishing. Having a peek into what and how people will be publishing, and what and how people will access the published content, some 30 years from now is really challenging. Hence the speaker basically focused on the signs of change that are starting to be perceived today, looking at them as weak signals that might become stronger and more relevant as time goes by.
Among the several considerations and observations he put forward, I was most impressed by the one pointing to a new readership of scientific publications: machines!
According to the most recent data, there is increasing access to scientific publications by machines (software applications) that are “reading” the text made available on the web. What these applications are doing with these readings is still an open question. For sure, data analytics is being applied: extracting information and then running trend analyses on where certain topics are being addressed, with what frequency, the affiliation of the authors… Possibly, more advanced AI software is able (and for sure will be able) to dig into the meaning of the articles, comparing this meaning with the meanings emerging from other articles. Even more advanced AI might be able to extend the conclusions so that they become the starting point for new investigations.
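The simplest form of this machine “reading” is just metadata analytics. A minimal sketch, using invented article metadata (titles, years, affiliations are made up for illustration; real applications would pull this from a publisher's API or a crawled corpus), shows how a topic's frequency over time can be counted as a crude trend signal:

```python
from collections import Counter

# Hypothetical article metadata: (year, title, author affiliation).
# These entries are invented; a real pipeline would ingest them
# from a publication database or web crawl.
articles = [
    (2019, "Deep learning for protein folding", "Lab A"),
    (2020, "Deep learning in drug discovery", "Lab B"),
    (2020, "Graph methods for materials science", "Lab C"),
    (2021, "Deep learning for climate modeling", "Lab D"),
]

topic = "deep learning"

# Count, per year, how often the topic appears in titles:
# the kind of trend analysis a machine "reader" might run.
trend = Counter(year for year, title, _ in articles if topic in title.lower())

for year in sorted(trend):
    print(year, trend[year])
```

The same counting, grouped by affiliation instead of year, would show *where* a topic is being worked on rather than *when*.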
Interestingly, the discussion pointed out that in the future we might be facing both an increasing readership of machines and an increasing “authorship” of machines: GPT-3 and its siblings are already showing the possibility of creating articles that include the generation of new ideas, not just syntheses of what is already available.
For sure, the exponential growth in scientific publication makes it impossible for any single person to digest it. Having many people reading in parallel does not help much, since the sharing of knowledge is not “free”: it requires effort (and time). This points to the need for some sort of autonomous reading able to distill meaning and instantiate it where and when it matters.
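What such “autonomous reading” might look like at its very simplest is extractive summarization. A toy sketch (the scoring rule and the sample text are my own illustration, not anything the talk described) scores each sentence by the corpus frequency of the words it contains and keeps the top-scoring one:

```python
import re
from collections import Counter

def summarize(text, n=1):
    """Return the n sentences whose words are most frequent in the text:
    a crude stand-in for machine reading that distills a key point."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z]+", text.lower()))
    # Score each sentence as the sum of its words' overall frequencies.
    scored = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r"[a-z]+", s.lower())),
        reverse=True,
    )
    return scored[:n]

text = ("Knowledge is growing faster than any reader can follow. "
        "Machines can read knowledge at scale. "
        "Machines that read knowledge at scale can distill knowledge for us.")

print(summarize(text))
```

Real systems use far richer models than word counts, of course, but the shape of the task is the same: reduce more text than a person can read to the part that matters in context.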
Enter the space of Executable Knowledge, a major shift in value perception. Whilst in the past “knowledge” was perceived as a value (the Knowledge Society), worth pursuing and having, now knowledge has become so huge, on the one hand, and so accessible, on the other, that it is both too difficult to digest and a commodity. The value is now in the possibility of using the knowledge when it can be useful, packaged in ways that can be digested and applied.
Whilst publishers like IEEE were in the past concerned with storing knowledge (and guaranteeing that what is stored is “sound”; in the case of IEEE, the content can be trusted because it is peer reviewed), in the future publishers can continue to be relevant (generate value) only if they add to the “storage” a packaging and delivery of knowledge that makes it executable, hence fitting the context where it is needed.
A huge shift.