A survey of artificial intelligence in industry
Artificial Intelligence (AI) is fast becoming a buzzword in the technology industry. If you follow technology news, you will come across eye-catching AI stories almost every day. Large companies are realising that they will lose out if they fall behind in acquiring competence in AI. Prominent technology companies like Google, Facebook, Microsoft and IBM are investing billions of dollars in AI teams and technologies. Venture capitalists are funding a number of start-ups that promise to build AI products.
This article is an attempt to introduce the various developments within the AI landscape and arrange them in a way that is easy to connect to the technology industry as a whole. I will also try to add a historical perspective that will give more meaning to the current events.
Firstly, AI is not the future; it is the present. You are already using many tools based on AI, though you may not be aware of it. If you use any sophisticated mail service, like Gmail or Hotmail (now Outlook.com), you are using its AI-based spam filtering mechanism. The early spam filtering systems were simply not up to the task of matching the ingenuity of the spammers. The modern systems, however, are based on an AI technique called Bayesian classification, which, as most users will agree, works quite well.
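To make the idea concrete, here is a minimal sketch of Bayesian (naive Bayes) classification applied to spam filtering. The tiny training corpus and word lists are invented for illustration; a real filter learns from millions of labelled messages and uses far more sophisticated features.

```python
import math
from collections import Counter

# Toy training corpus -- invented for illustration only.
spam_msgs = ["win cash prize now", "cheap pills free offer", "win free prize"]
ham_msgs = ["meeting agenda attached", "lunch tomorrow", "project status attached"]

def word_counts(msgs):
    counts = Counter()
    for msg in msgs:
        counts.update(msg.split())
    return counts

spam_counts, ham_counts = word_counts(spam_msgs), word_counts(ham_msgs)
vocab = set(spam_counts) | set(ham_counts)

def log_score(msg, counts, class_size, total_size):
    # log P(class) + sum of log P(word | class), with Laplace (add-one)
    # smoothing so an unseen word does not zero out the whole product.
    total_words = sum(counts.values())
    score = math.log(class_size / total_size)
    for word in msg.split():
        score += math.log((counts[word] + 1) / (total_words + len(vocab)))
    return score

def classify(msg):
    n = len(spam_msgs) + len(ham_msgs)
    spam_score = log_score(msg, spam_counts, len(spam_msgs), n)
    ham_score = log_score(msg, ham_counts, len(ham_msgs), n)
    return "spam" if spam_score > ham_score else "ham"
```

The classifier simply picks the class whose words best explain the message; the spammers' own vocabulary ("win", "free", "prize") becomes the evidence against them.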
Personal assistants like Siri or Google Now incorporate speech recognition engines, which use Machine Learning (ML), a prominent branch of AI. Google uses ML to automatically suggest responses to the emails you receive, which, I admit, is a slightly creepy thing.
And of course, all those ‘People who bought this also bought’ and ‘Recommended for you’ labels that you see on almost all e-commerce sites like Amazon and Flipkart are produced by AI programs too. They use a variety of techniques, including machine learning and symbolic AI. This wonderful but scary capability is part of a whole gamut of techniques collectively called Big Data, which puts together a huge number of pieces of information about something (like you!) and draws meaningful conclusions. Big Data relies heavily on AI, and its earlier (and more humble) avatar, Data Mining, is a subfield of AI.
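One simple flavour of such a recommender can be sketched with item co-occurrence: recommend whatever is most often bought together with the item being viewed. The purchase baskets below are invented for illustration; production systems combine many signals and far richer models.

```python
from collections import Counter

# Toy purchase history -- each set is one customer's basket (invented data).
baskets = [
    {"phone", "case", "charger"},
    {"phone", "case"},
    {"phone", "charger", "headphones"},
    {"book", "bookmark"},
]

def also_bought(item, top_n=2):
    co = Counter()
    for basket in baskets:
        if item in basket:
            co.update(basket - {item})  # count items co-purchased with `item`
    return [other for other, _ in co.most_common(top_n)]
```

Calling `also_bought("phone")` returns the accessories most often bought alongside a phone; this is the essence of the ‘People who bought this also bought’ label, scaled down to a few lines.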
You must have heard that AI-based programs are now world champions in chess, Scrabble and backgammon, and AI is on its way to becoming the Go champion. In 2011, IBM’s AI supercomputer Watson won the TV quiz show Jeopardy! against two of the world’s best players. All in all, AI is very much a part of the current technology industry.
It has been a long journey, though. Artificial Intelligence as a separate discipline was launched at a 1956 meeting of prominent researchers at Dartmouth College, New Hampshire, USA. In the initial years, great things were expected. Scientists promised that AI would match human capabilities within a decade. Popular movies like 2001: A Space Odyssey (1968) added to the hype.
But reality turned out to be very different. It took 40 years to beat human beings at chess, a comparatively well-defined task. To cut a long story short, AI could not live up to its promises, which resulted in the withdrawal of funding by governments and investors. This went on till the turn of the century, a period known as the ‘AI Winter’.
But in the first decade of this century, a number of factors resulted in a strong comeback. The first was the patient work done by a handful of researchers in universities who, despite the scarcity of funding, kept at it. Another major factor was the availability of cheap and massive computing power in the form of the cloud. But the most important was the advent of the content-rich internet, which gave AI the right problems to solve. Products such as the ones I have mentioned above were introduced and found a sound foothold in the industry. Building on their success, more and more ambitious products are being made.
As I write, AI technology is experiencing a kind of explosion. Its power is growing so fast that prominent scientists like Stephen Hawking are worried about AI dominating human beings. But for the near future, we can set this fear aside and look at what AI will mean for the industry.
A welcome trend we saw in the last year is that the leaders in AI are opening up parts of their technology. Google’s TensorFlow and Microsoft’s CNTK (both machine learning frameworks) are but two examples. Other such initiatives are the OpenAI foundation promoted by the likes of Elon Musk, and the AI-specific open server architecture from Facebook. These open source technologies will enable independent developers to experiment with the latest in AI, allowing them to either design products or seek opportunities in companies that need AI expertise.
Currently, there is a shortage of talent skilled in AI areas such as machine learning and genetic algorithms. More and more professionals are updating their skill sets by learning these technologies. While some have preferred to pursue PhDs at universities, many are taking online courses from eLearning platforms such as Coursera and Udacity. The most important area as far as current job opportunities are concerned is, of course, Big Data. While Big Data includes a large array of separate technologies, its critical aspects are dominated by AI techniques such as natural language processing (NLP), pattern recognition and genetic algorithms. It is no wonder that a lot of the current progress in AI is driven by Big Data.
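For readers unfamiliar with the last of those techniques, here is a minimal sketch of a genetic algorithm on a standard toy problem: evolving bit strings towards all ones ("OneMax"). The population size, mutation rate and number of generations are arbitrary choices for illustration, not recommended settings.

```python
import random

random.seed(0)  # fixed seed so the toy run is repeatable
LENGTH, POP, GENS = 20, 30, 60

def fitness(bits):
    return sum(bits)  # number of ones; the maximum is LENGTH

def mutate(bits, rate=0.02):
    # Flip each bit with a small probability.
    return [b ^ (random.random() < rate) for b in bits]

def crossover(a, b):
    cut = random.randrange(1, LENGTH)  # single-point crossover
    return a[:cut] + b[cut:]

# Random initial population of bit strings.
pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]
for _ in range(GENS):
    # Selection: keep the fitter half, breed the next generation from it.
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]
    pop = [mutate(crossover(*random.sample(parents, 2))) for _ in range(POP)]

best = max(pop, key=fitness)
```

The loop of selection, crossover and mutation is the whole idea: solutions that score well get to pass their "genes" on, and after a few dozen generations the population converges on high-fitness strings.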
However impressive these achievements of AI might look, scientists classify all of these techniques as Artificial Narrow Intelligence (ANI). The reason for this somewhat deprecatory name is that each program is good at doing only one thing. The program that beats a chess champion cannot read natural language. Real artificial intelligence should be able to learn to do many different things, just as we do. Such intelligence is called Artificial General Intelligence (AGI). As of now, we are far away from AGI. But we are making huge strides towards it, and this is why the likes of Stephen Hawking are worried.