Overview
- Here is a list of the most important natural language processing (NLP) frameworks you need to know from the last two years
- From Google AI's Transformer to Facebook Research's XLM/mBERT, we trace the rise of NLP through the lens of these seismic advances.
Introduction
Have you heard of the latest natural language processing framework that was recently released? I don't blame you if you're still catching up on the excellent StanfordNLP library or the PyTorch-Transformers framework!
There has been a notable increase in the amount of research and advancements in NLP in recent years.
I can trace this recent surge to one seismic paper: "Attention Is All You Need," published by Google AI in June 2017. This breakthrough has spawned so many exciting new NLP libraries that allow us to work with text in ways previously limited to our imagination (or Hollywood).
Here's the interest in natural language processing based on Google searches over the last 5 years in the United States:
We can see a similar pattern when we broaden the search to the whole world!
Today, we have cutting-edge approaches to language modeling, transfer learning, and many other important and advanced NLP tasks. Most of these involve the application of deep learning, especially the Transformer architecture introduced in the paper mentioned above.
So we decided to collect all these important developments in one place, in a neat timeline. You will love this infographic, and it's a must-have to keep on hand during your own NLP journey.
I have listed some tutorials below to help you get started with these frameworks:
Without further ado, here's the infographic in all its glory! And if you want to download the high-resolution PDF (which you really should), head over here.