Adventures of Natural Language Processing

“Alexa! Turn off the alarm” is probably how many of us start our day: a simple command in our mother tongue that sets our schedule, opens doors, turns off lights, and books appointments with the barber, among many other things. And yet many of us fail to grasp the complexity behind this technology. Natural Language Processing (NLP) neither developed overnight nor is it the brainchild of a single person; it is the product of decades, even centuries, of compound effort from scientists around the globe.

NLP: Origins


The theoretical debates date back to the 17th century, when certain philosophers proposed codes that related words between languages. Throughout the following centuries, similar research papers, theories, and thought experiments appeared. These were only remotely related to NLP, or even to computers processing unstructured language, and were often concerned merely with translating from one language to another. The first person to argue for and present a view of human-computer interaction through an unstructured language medium was none other than the father of theoretical computer science, Alan Turing. In his 1950 paper “Computing Machinery and Intelligence”, he proposed what is now called the Turing Test, thereby tying a computer’s ability to process and produce human-understandable language to intelligence. This was followed in 1952 by the Hodgkin-Huxley model, which explained how neurons in the brain generate and transmit electrical signals, giving a subsequent boost to the fields of AI and NLP.

This was followed by Noam Chomsky publishing his book Syntactic Structures in 1957, which revolutionized linguistic concepts by showing that sentence structure could be described with formal rules that a computer might work with. To this end, Chomsky proposed his own style of grammar, called phrase structure grammar. Then, in the mid-1960s, Joseph Weizenbaum developed ELIZA, a program imitating a psychotherapist. This program played little to no part in genuine language understanding by computers, as it worked by rearranging sentences and following relatively simple pattern-matching rules. Afterwards, research in Natural Language Processing stalled for more than a decade as funding from the majority of organizations was diverted. The justification was that the need for machine translation was limited, human translators were far more affordable than machines, and the machines and hardware of the day were too primitive to handle such mammoth tasks.
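To illustrate just how shallow ELIZA's technique was, here is a minimal sketch of an ELIZA-style responder: a few hand-written regular-expression rules plus pronoun "reflection", with no real language understanding. The rules below are invented for illustration and are not Weizenbaum's original script.

```python
import re

# Pronoun "reflection" table: swap first- and second-person words so
# the echoed fragment reads naturally in the response.
REFLECTIONS = {"i": "you", "am": "are", "my": "your", "me": "you"}

# Each rule pairs a pattern with a response template. These rules are
# illustrative; the real ELIZA script was far larger.
RULES = [
    (re.compile(r"i am (.*)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r".*"), "Please tell me more."),
]

def reflect(fragment):
    # Replace each word with its reflected form if one is defined.
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(sentence):
    # Return the response for the first rule whose pattern matches.
    for pattern, template in RULES:
        match = pattern.match(sentence.strip())
        if match:
            return template.format(*(reflect(g) for g in match.groups()))

print(respond("I am tired of studying"))  # -> Why do you say you are tired of studying?
```

The program never parses or understands the sentence; it simply rearranges whatever text follows a matched pattern, which is exactly why ELIZA contributed so little to genuine language understanding.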

NLP: Returns

After almost 15 years, NLP and AI were resurrected, abandoning the earlier concepts of machine translation in favor of newer ideas in both domains. The earlier mixture of statistics and linguistics that had been a beacon for the prior era of NLP was supplanted by a purely statistical perspective. This shift was driven by the adoption of machine learning algorithms, made possible by increased computational capabilities. Further research focused on statistical models capable of making soft, probabilistic decisions. In 1997, LSTM recurrent neural networks (RNNs) were introduced, and they found their niche in 2007 in text and voice processing. Before that, in 2001, Yoshua Bengio and his team proposed the first neural language model, built on a feed-forward neural network. A major breakthrough came in 2011, when Apple Inc. announced Siri, a mobile voice assistant.

NLP: Forever


NLP is now an integral part of artificial intelligence, as many experts believe that flawless communication in unstructured languages plays a major part in creating truly intelligent machines. It is also one of the topics being extensively researched by thousands of researchers at top colleges and professionals at huge multinational companies. Today, Natural Language Processing goes far more advanced than just translating from one language to another, and it can take on humongous tasks like:

And many others besides these.

But the process of practicing and researching these highly advanced tools is quite simple, thanks to open-source libraries like NLTK and TensorFlow, which make creating your own NLP models a lot easier. NLP is widely implemented in the fields of:

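To give a flavor of how approachable basic text processing has become, here is a minimal sketch of a classic first step in many NLP pipelines, tokenization followed by word counting, using only Python's standard library. Libraries such as NLTK build far more sophisticated tokenizers and models on top of ideas like this.

```python
import re
from collections import Counter

def tokenize(text):
    # Toy tokenizer: lowercase the text and pull out runs of letters.
    # Real NLP libraries ship much more careful tokenizers.
    return re.findall(r"[a-z']+", text.lower())

sentence = "NLP is fun, and learning NLP is easier than ever."
tokens = tokenize(sentence)
print(Counter(tokens).most_common(2))  # -> [('nlp', 2), ('is', 2)]
```

Steps like this feed nearly every downstream NLP task, from sentiment analysis to language modeling, which is why good open-source tooling has lowered the barrier to entry so dramatically.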
To conclude, although NLP faced some hard times in its beginnings, it is surely a mighty field in its own right now. The discoveries in this field are nowhere near a halt, and in the right hands it will help the human race reach new heights in the forthcoming years.