Emerging technologies: NLP language models
Developers worldwide adopt the newest #tech, and the questions they ask online are a useful source of leading indicators.
#AI applications in text processing have been trending. To make machines understand text, language models are trained on a sufficiently large corpus so that they can capture the context and meaning of words. Once trained, these models are used for a variety of tasks, such as text classification, speech-to-text conversion, etc.
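As a rough illustration of one of those tasks, here is a deliberately tiny text-classification sketch. It is not a trained language model; it is a hypothetical bag-of-words example (made-up labels and sentences) that labels a new text by cosine similarity to the closest training example, just to make the idea of "text classification" concrete.

```python
from collections import Counter
import math

# Hypothetical labeled examples, invented for illustration only.
TRAIN = [
    ("the match ended in a draw after extra time", "sports"),
    ("the striker scored a late goal to win the game", "sports"),
    ("the central bank raised interest rates again", "finance"),
    ("stocks fell as bond yields climbed", "finance"),
]

def bow(text):
    """Bag-of-words vector: word -> count."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b.get(w, 0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify(text):
    """Label a text with the label of its most similar training example."""
    vec = bow(text)
    best = max(TRAIN, key=lambda ex: cosine(vec, bow(ex[0])))
    return best[1]

print(classify("the goalkeeper saved the penalty in the game"))  # sports
print(classify("interest rates and bond markets"))               # finance
```

Real systems replace the word-count vectors with learned embeddings from a language model, but the classify-by-similarity shape stays recognizable.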
Earlier models were built around the corpus to be analyzed (e.g. word2vec), and they continue to hold sway. These models are "context-free" in that they assign only one meaning to a word such as "Apple". Recently, BERT has led to a significant shift: the model is pre-trained on a big corpus such as Wikipedia and develops a richer understanding of words. These models are "context-aware" in the sense that "Apple" in "Apple shares" and "apple" the fruit mean different things. #BERT has trended in usage.
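The context-free vs. context-aware distinction can be sketched in a few lines. This is a toy, not word2vec or BERT: the two-dimensional vectors and the neighbour-averaging "context" step are invented stand-ins (real contextual models use self-attention over learned high-dimensional embeddings). The point is only that a static lookup returns the same vector for "apple" in every sentence, while a context-aware encoding does not.

```python
# Toy static embedding table: one fixed vector per word, as in word2vec.
# All vectors are made up for illustration.
STATIC = {
    "apple":  [1.0, 0.0],
    "shares": [0.0, 1.0],
    "fell":   [0.2, 0.8],
    "tasty":  [0.9, 0.1],
    "pie":    [0.5, 0.5],
}

def static_embed(word, sentence):
    """Context-free: the surrounding sentence is ignored entirely."""
    return STATIC[word]

def contextual_embed(word, sentence):
    """Context-aware (a crude stand-in for self-attention):
    blend the word's vector with the average of the sentence's vectors."""
    vecs = [STATIC[w] for w in sentence.split()]
    avg = [sum(v[i] for v in vecs) / len(vecs) for i in range(2)]
    base = STATIC[word]
    return [(base[i] + avg[i]) / 2 for i in range(2)]

s1 = "apple shares fell"   # Apple the company
s2 = "tasty apple pie"     # apple the fruit

# Context-free: identical vector in both sentences.
assert static_embed("apple", s1) == static_embed("apple", s2)
# Context-aware: the two occurrences get different vectors.
assert contextual_embed("apple", s1) != contextual_embed("apple", s2)
```

The asserts capture the whole contrast: downstream tasks that care about word sense benefit from the second behaviour.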
OpenAI advanced this further and released multiple versions of #GPT, which has picked up steam.
Another trend is the emergence of models fine-tuned for specific domains like #finance, #healthcare, etc., and for functions like hypothesis testing, Q&A, etc.