
What will “knowledge” mean in 2050? – VII

The roadmap followed by the Trentino Region in the development of their Digital Platform. Image credit: Francesco Flammini

Executable knowledge

If you go back to the web of the last century and, partly, of the first decade of this century, you'll notice that you accessed the web via a browser. The web was a gigantic, and growing, warehouse of documents (much as IEEE Xplore is a warehouse of articles). Ever-better browsers, flanked by search engines, gave you the capability to find and look at what you wanted.

The landscape changed dramatically with the advent of apps (and the iPhone was surely a breaking point in this respect). Nowadays we still use browsers and search engines, but a significant portion of our web usage goes through apps. Apps mediate between our needs and service providers of some sort.

Apps are evolving, becoming smarter. They learn about us as we use them and are thus able to fine-tune their interactions to our needs. Some apps keep working in the background, acquiring data that might be of interest to us so that they are ready the next time we use the app. Some go even further, generating messages to capture our attention.

The appearance of intelligent assistants began in the last decade. They have moved from being a curiosity to taking centre stage in our daily interactions with the cyberspace, and even with other people, socially and in the business environment. Progress in artificial intelligence has been the enabling factor.

A very similar path will be taken by knowledge and knowledge access. Knowledge services embedded in apps will provide the knowledge we need, when we need it and in a form we will be able to apply. This is what is called executable knowledge.

The packaging of “data” with an interpreter of those data that can interact with an access agent is transforming the knowledge landscape. Data in a data centre, in a company repository, in a research centre or in academia will no longer be a static entity. Rather, they will become, in a way (and I know I will generate some discussion and opposition), self-aware of their potential “usefulness”.
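To make the idea concrete, here is a minimal sketch in Python of what such a packaging could look like. Everything in it (the KnowledgePackage class, the Query structure, the interpreter function) is my own hypothetical illustration, not an existing system: the point is simply that the raw data never leave the package, and an access agent only ever receives interpreted answers.

```python
# A minimal sketch of "executable knowledge": data are bundled with the
# interpreter that gives them meaning. All names here (KnowledgePackage,
# Query, the interpreter) are hypothetical illustrations.

from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class Query:
    question: str             # what the access agent wants to know
    context: dict[str, Any]   # who is asking, about what, for what purpose


class KnowledgePackage:
    """Bundles a data set with the interpreter that gives it meaning."""

    def __init__(self, data: list[dict],
                 interpret: Callable[[list[dict], Query], Any]):
        self._data = data              # kept private: never exposed raw
        self._interpret = interpret

    def answer(self, query: Query) -> Any:
        # The package is "self-aware" of its usefulness in the sense that
        # it decides how (and whether) to answer a given query.
        return self._interpret(self._data, query)


# Example interpreter: returns an aggregate meaning, not the raw records.
def average_temperature(data: list[dict], query: Query) -> float:
    readings = [d["temp"] for d in data
                if d["station"] == query.context["station"]]
    return sum(readings) / len(readings)


pkg = KnowledgePackage(
    data=[{"station": "Trento", "temp": 21.0},
          {"station": "Trento", "temp": 23.0}],
    interpret=average_temperature,
)
print(pkg.answer(Query("average temperature", {"station": "Trento"})))  # 22.0
```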

I have seen this happening, although on a smaller scale, with the “Open Data Trentino” initiative of the Province of Trento, when I was at the EIT Digital Italian Node in Trento. The idea was to make available to anyone interested all the data stored in the many (some 120) databases of the Province. The data were encapsulated and made accessible only via APIs (Application Programming Interfaces): pieces of software that respond to a user’s query not with the raw data but with the meaning significant to that specific query, a meaning that could be derived by performing data analytics on (potentially) all the data sets. The idea was that as more data became available (the Open Data Trentino initiative intended to stimulate the sharing of private companies’ data as well) and data analytics evolved (read: the inclusion of artificial intelligence), it would become possible to offer more and more meaning to a growing variety of users; in other words, to extract more and more value from the data. To my knowledge it was a first, basic attempt at developing a system to deliver executable knowledge.

What was notable was that all aspects were tackled (a minimal sketch follows the list):

  • providing a meaning derived from many data sets: the system tried to “understand” the query and worked out the actions needed to retrieve the meaning disseminated across the data;
  • providing a framework that preserved the ownership of the data;
  • protecting the data owners from potential misuse of their data;
  • providing a mechanism to extract value from the data and share the resulting economic value with the owner(s).
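As a purely illustrative sketch (the class and method names below are my own assumptions, not the actual platform’s API), this is roughly how a service in the spirit of Open Data Trentino could combine the four aspects: meaning derived across data sets, ownership recorded, consented purposes enforced, and value shared back with the owners.

```python
# Hypothetical sketch of a knowledge service in the spirit of Open Data
# Trentino: queries return derived meaning (here, a simple average), raw
# records stay inside, owners declare allowed purposes, and each answer
# credits the contributing owners. Not the actual platform's API.

from dataclasses import dataclass


@dataclass
class DataSet:
    owner: str
    records: list[dict]
    allowed_purposes: set[str]   # misuse protection: declared by the owner


class KnowledgeService:
    def __init__(self, datasets: list[DataSet]):
        self.datasets = datasets
        self.royalties: dict[str, int] = {}   # value shared back to owners

    def query(self, purpose: str, field_name: str) -> float:
        """Return a meaning derived across all data sets whose owners
        allow this purpose; the raw records are never exposed."""
        values, contributors = [], []
        for ds in self.datasets:
            if purpose not in ds.allowed_purposes:
                continue                  # owner did not consent to this use
            values += [r[field_name] for r in ds.records if field_name in r]
            contributors.append(ds.owner)
        if not values:
            raise PermissionError("no data set allows this purpose")
        for owner in contributors:        # share the value with the owners
            self.royalties[owner] = self.royalties.get(owner, 0) + 1
        return sum(values) / len(values)


svc = KnowledgeService([
    DataSet("Province of Trento", [{"pm10": 18}, {"pm10": 22}], {"research"}),
    DataSet("ACME S.p.A.", [{"pm10": 30}], {"research", "marketing"}),
])
print(svc.query("research", "pm10"))   # 23.33..., derived across both data sets
print(svc.royalties)                   # both owners credited for the answer
```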

To my knowledge the Open Data Trentino initiative has proved its value in stimulating the opening of data by private companies; however, its full potential has not been pursued (i.e. it still looks more like an advanced data centre than a knowledge service centre).

About Roberto Saracco

Roberto Saracco fell in love with technology and its implications a long time ago. His background is in math and computer science. Until April 2017 he led the EIT Digital Italian Node and then was head of the Industrial Doctoral School of EIT Digital up to September 2018. Previously, up to December 2011, he was the Director of the Telecom Italia Future Centre in Venice, looking at the interplay of technology evolution, economics and society. At the turn of the century he led a World Bank-Infodev project to stimulate entrepreneurship in Latin America. He is a senior member of IEEE, where he leads the New Initiative Committee and co-chairs the Digital Reality Initiative. He is a member of the IEEE in 2050 Ad Hoc Committee. He teaches a Master’s course on Technology Forecasting and Market Impact at the University of Trento. He has published over 100 papers in journals and magazines and 14 books.