
Tag Archives: Nvidia

Can AI use far less computational power?

A very interesting article published on Forbes in June 2018 describes how progress in Artificial Intelligence over the last decade has been tied largely to the increased availability of processing power, both in the core and at the edges. In an interview, Jihan Wu, CEO of Bitmain, a Chinese company specialised …

From virtual to real …with AI

OK, rather than reading this post, start by watching the clip provided by NVIDIA, which shows how they used Artificial Intelligence to let people create images that look like real photos. I think it is just amazing. So, watch it! Now, if you paid attention to the …

Beyond Moore’s law

Moore’s law is over, in both of its implications: keep doubling the number of transistors on a chip (in a given area) every 18 months, and keep decreasing the cost per transistor as density increases. The second implication ceased around 2014/2015, when the cost per transistor flattened out as density kept …
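To make the first implication concrete, here is a minimal Python sketch of what a clean 18-month doubling law implies over a decade. The starting transistor count is an illustrative placeholder, not actual foundry data:

```python
# Moore's law, first implication: the transistor count in a given area
# doubles every 18 months. Starting figure is illustrative only.

def transistors(t_months, start=1_000_000, doubling_months=18):
    """Transistor count after t_months under an idealised doubling law."""
    return start * 2 ** (t_months / doubling_months)

for years in range(0, 11, 2):
    print(f"year {years:2d}: {transistors(years * 12):,.0f} transistors")
```

Over ten years the law compounds to roughly 6.7 doublings, about a 100x increase in density, which is why even a small slowdown in the doubling period has large long-run effects.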

AI can slow you down

Artificial Intelligence is usually associated with improving performance, with “faster”. Well, here is an application of AI that slows things down! NVIDIA researchers have used AI to transform a standard video into a slow-motion one while preserving its quality (watch the clip). Of course, one could in principle have the …
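To get a feel for what frame interpolation does, here is a minimal Python sketch of the naive baseline: linearly blending new frames between consecutive ones. To be clear, this is not NVIDIA’s method, which trains a neural network to estimate per-pixel motion; the function name and frame shapes here are assumptions for illustration only:

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, n_new=1):
    """Naive slow motion: insert n_new linearly blended frames between
    two consecutive video frames (H x W x 3 uint8 arrays).
    A learned method would estimate per-pixel motion instead of blending."""
    frames = []
    for i in range(1, n_new + 1):
        t = i / (n_new + 1)  # blend weight between the two source frames
        blended = (1 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
        frames.append(blended.astype(np.uint8))
    return frames

# Example: stretch a 2-frame clip to 5 frames by inserting 3 blended ones.
a = np.zeros((4, 4, 3), dtype=np.uint8)          # black frame
b = np.full((4, 4, 3), 255, dtype=np.uint8)      # white frame
slow = [a] + interpolate_frames(a, b, n_new=3) + [b]
print(len(slow), "frames")
```

Linear blending produces ghosting on moving objects, which is exactly the artifact a motion-aware neural network avoids, and why the AI result can preserve quality.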

Ever wished you were there in Summertime?

Did you ever wish you could take a photo of a place in a different season or in different weather? It has happened to me several times. Well, get ready for your wish to be granted: Nvidia researchers have created Artificial Intelligence-based software that can take an image, or …
