The state of Artificial Intelligence


You might have heard about OpenAI: the AI research organization co-founded by Elon Musk to address big AI challenges.

At the end of last year, following the publication of the AI Index Report (a dense outlook on the state of AI), Azeem Azhar, technologist, founder, and host of www.exponentialview.co, interviewed Jack Clark, policy director at OpenAI.

This is what we took away from an incredibly rich discussion covering both the societal and geopolitical aspects of the AI revolution.

  1. AI Monks vs. AI Peasants. The staggering speed of the AI transition is causing tension between AI Monks (AI researchers) and AI Peasants (AI practitioners). AI researchers have kept the flame of AI alive for more than two decades following the AI winter of the '80s. However, it is hard for them to redefine their role now that interest in AI has surged. The upside is that AI practitioners make the AI and Machine Learning landscape more diverse, heterogeneous, and vibrant.

  2. Speed is the most defining feature of this wave of AI innovation: while mobile phones and the internet took a decade to thrive, AI is ramping up much faster. This was confirmed by a recent statement from Accenture's CTO, who declared, "AI is the fastest growing IT trend I have seen in thirty years of [my] career."

 
 

  3. The AI lords and the data farmers. We live in an era of Neo-Feudalism where big data and AI companies (the lords) govern the rest of the people by providing some level of service and security (services like Google, Twitter, etc.) and by getting resources (data) in return. The big difference from Feudalism is that the resources of the past were food, and hence were consumed, whereas data can be re-used infinitely.

  4. NEW TREND - 🚀 Every data center can be turned into a configurable, gigantic learning machine. The factories where AI is made (data centers) are now gigantic, multi-purpose computational machines that can be reconfigured depending on the problem you need to solve. You can re-train the same deep neural network stack to understand language, to recognize images, or to move a robotic arm.
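The "same stack, different task" idea can be illustrated with a toy sketch: the identical training code, fed different input/target arrays, learns different functions. This is a minimal NumPy illustration written for this article (the function names and hyperparameters are our own, and it is of course nothing like the scale of a real data center):

```python
import numpy as np

def train_mlp(X, y, hidden=8, steps=3000, lr=0.1, seed=0):
    """Train a tiny two-layer network with plain gradient descent.
    The same code 'reconfigures' itself for any task whose inputs
    and targets fit the array shapes (an illustrative toy only)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 1.0, (X.shape[1], hidden))
    W2 = rng.normal(0.0, 1.0, (hidden, y.shape[1]))
    for _ in range(steps):
        h = np.tanh(X @ W1)                    # hidden layer
        out = 1.0 / (1.0 + np.exp(-(h @ W2)))  # sigmoid output
        grad_logits = out - y                  # BCE-with-sigmoid gradient
        grad_h = (grad_logits @ W2.T) * (1.0 - h ** 2)
        W2 -= lr * h.T @ grad_logits
        W1 -= lr * X.T @ grad_h
    return W1, W2

def predict(X, W1, W2):
    h = np.tanh(X @ W1)
    return 1.0 / (1.0 + np.exp(-(h @ W2)))

# "Re-train" the identical stack on two different problems:
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
xor_y = np.array([[0], [1], [1], [0]], dtype=float)  # task 1: XOR
and_y = np.array([[0], [0], [0], [1]], dtype=float)  # task 2: AND
W1x, W2x = train_mlp(X, xor_y)
W1a, W2a = train_mlp(X, and_y)
```

The point is not the network itself but that nothing task-specific lives in `train_mlp`: swapping the data swaps the learned behavior, which is what makes a fleet of identical accelerators reconfigurable.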

  5. Deep learning researchers are GPU-greedy. The human brain consumes 25 watts; a single GPU consumes 10 times as much. Researchers at Google's DeepMind and at Alibaba are training deep neural networks with thousands of GPUs! At the moment, the most common approach to deep learning is brute force, because we are still in an exploration phase and computational resources are still relatively cheap.

Our Suggestions