Adventures of Natural Language Processing

“Alexa, turn off the alarm!” is probably how many of us start our day: a simple command given in our mother tongue that sets your schedule, opens doors, turns off lights and books appointments with your barber, among many other things. And yet many of us fail to grasp the complexity behind this technology. Natural Language Processing (NLP) neither developed overnight nor is it the brainchild of a single person; it is the product of centuries of compounded effort from scientists around the globe.

NLP: Origins

The theoretical debates date back to the 17th century, when certain philosophers proposed codes that related words between languages. Throughout the following centuries, similar research papers, theories and thought experiments appeared. These were only remotely related to NLP, or even to computers processing unstructured language, and were often concerned merely with converting from one language to another. The first person to argue for and present a view of human-computer interaction in an unstructured language medium was none other than the father of theoretical computer science, Alan Turing. In his 1950 paper “Computing Machinery and Intelligence”, he proposed what is now called the Turing Test, thereby tying a computer’s ability to process and produce human-understandable language to intelligence. This was followed by the “Hodgkin-Huxley model” in 1952, which described how neurons in the brain communicate through electrical signals, giving a subsequent boost to the fields of AI and NLP.

Then came Noam Chomsky’s book Syntactic Structures in 1957, which revolutionized linguistics by showing that the grammar of a language could be described by formal rules, the kind a computer could in principle process. With this in mind, Chomsky proposed his own style of grammar, called phrase structure grammar. In 1964, “ELIZA”, a program imitating a psychotherapist, was developed; it involved little to no genuine language understanding, as it simply rearranged the user’s sentences by following relatively simple pattern-matching rules. Soon after, research in natural language processing stalled for more than a decade as funding from most organizations was diverted elsewhere. The justification was that demand for machine translation was low and human translators were far more affordable than machines, and besides, the machines and hardware of the day were too primitive to handle such mammoth tasks.

NLP: Returns

After almost 15 years, NLP and AI were resurrected, abandoning the earlier concepts of machine translation in favor of newer ideas in both domains. The earlier mixture of statistics and linguistics, which had been a beacon for the prior era of NLP, was supplanted by a purely statistical perspective. This shift was driven by the adoption of machine learning algorithms, made possible only by increased computational capability. Further research focused on statistical models capable of making soft, probabilistic decisions. In 1997, LSTM recurrent neural networks (RNNs) were introduced, and by 2007 they had found their niche in text and voice processing. Before that, in 2001, Yoshua Bengio and his team proposed the first neural language model, built on a feed-forward neural network. The next big breakthrough came in 2011, when Apple Inc. announced Siri, a mobile voice assistant.

NLP: Forever


NLP is now an integral part of artificial intelligence, as many experts believe that flawless communication in unstructured language plays a major part in creating truly intelligent machines. It is also being extensively researched by thousands of researchers at top universities and professionals at huge multinational companies. Today, natural language processing is far more advanced than just translating from one language to another; it can take on humongous tasks like:

  1. Speech recognition, also known as speech-to-text: converting spoken audio into text, commonly used in applications such as voice notes and live captions.
  2. Part-of-speech tagging, also known as grammatical tagging: labelling each word in a sentence with its part of speech, such as noun or verb.
  3. Word sense disambiguation: selecting the appropriate meaning of a word that has several senses, based on the sentence and scenario it appears in.
  4. Named entity recognition: identifying words and phrases as useful entities, such as “Atharva” as a name and “India” as a country.
  5. Sentiment analysis: making sense of the human emotion behind a given sentence.

And many more besides.
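To get a feel for one of these tasks, here is a toy sketch of sentiment analysis in plain Python. The word lists are made up for the example; real systems rely on far richer lexicons and statistical models, but the core idea of scoring a sentence by the emotion of its words is the same.

```python
# Toy lexicon-based sentiment scorer. The word sets below are
# illustrative only, not a real sentiment lexicon.
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "awful", "sad"}

def sentiment(sentence: str) -> str:
    """Label a sentence positive, negative, or neutral by word counts."""
    words = [w.strip(".,!?") for w in sentence.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this great phone"))   # positive
print(sentiment("The service was terrible"))  # negative
```

A real model would also handle negation (“not good”), intensity (“very happy”) and context, which is exactly where the machine learning approaches described above come in.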

But practicing and researching these highly advanced techniques is quite approachable, thanks to open-source libraries like NLTK and TensorFlow, which make building your own NLP models a lot easier. NLP is widely implemented in fields such as:

  1. Spam detection: NLP is used to classify texts and emails with suspicious content, filing them into the relevant folder or blocking the sender entirely.
  2. Machine translation: a lifesaver when traveling in places where your mother tongue isn’t spoken; tools like Google Translate can translate things for you in real time.
  3. Virtual assistants and chatbots: this blog opened with an example of a virtual assistant. These tools can boost your business by attending to multiple clients at a time, or simply prove a great pastime in your free hours.
  4. Social media: features like sentiment analysis can help with flagging and removing hateful posts and messages, while text summarization can surface the most relevant posts and articles across a platform.
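The spam-detection idea above can be sketched in a few lines. This is a deliberately naive keyword filter, with made-up spam words chosen for the example; production systems instead train statistical classifiers (such as naive Bayes) on large labelled corpora, but the shape of the problem is the same: turn a message into features, then decide.

```python
# Minimal keyword-based spam filter, a toy sketch only.
# The spam-word set is invented for illustration.
SPAM_WORDS = {"winner", "free", "prize", "urgent", "lottery"}

def is_spam(message: str, threshold: int = 2) -> bool:
    """Flag a message as spam if enough suspicious words appear."""
    tokens = [t.strip(".,!?") for t in message.lower().split()]
    hits = sum(t in SPAM_WORDS for t in tokens)
    return hits >= threshold

print(is_spam("URGENT! You are a winner, claim your free prize"))  # True
print(is_spam("Meeting moved to 3 pm tomorrow"))                   # False
```

The threshold keeps a single unlucky word from condemning an honest email; a learned classifier replaces this hand-set cutoff with probabilities estimated from data.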

To conclude: though NLP faced hard times in its beginnings, it is surely a mighty field in its own right now. The discoveries in this field are nowhere near a halt, and in the right hands it will help humanity reach new heights in the forthcoming years.
