Neural language models

Natural Language Processing, Natural language understanding, Biomedical natural language processing, Language models for health records, Language models for genetics

A Language model implemented with Neural networks.

Neural language models perform the language-modeling task with a neural network: rather than explicitly estimating conditional probabilities from counts, they learn good vector-space representations of words, sentences, etc., and compute the probabilities from those representations.
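As a minimal sketch of the idea (toy corpus and hyperparameters made up for illustration): a bigram neural language model that learns a vector per word and predicts the next word with a softmax over scores computed from that vector, trained with plain SGD.

```python
import numpy as np

# Toy neural bigram language model (illustrative sketch, not from any
# specific paper): each word gets a learned embedding, and the next-word
# distribution is a softmax over scores from the current word's embedding.
rng = np.random.default_rng(0)

corpus = "the cat sat on the mat the cat ate".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8          # vocabulary size, embedding dimension

E = rng.normal(0, 0.1, (V, D))   # input word embeddings
W = rng.normal(0, 0.1, (D, V))   # output projection

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Train on (current word -> next word) pairs with cross-entropy SGD.
pairs = [(idx[a], idx[b]) for a, b in zip(corpus, corpus[1:])]
lr = 0.5
for _ in range(200):
    for x, y in pairs:
        h = E[x]                  # current word's vector
        p = softmax(h @ W)        # predicted next-word distribution
        dz = p.copy(); dz[y] -= 1.0   # cross-entropy gradient
        E[x] -= lr * (W @ dz)
        W -= lr * np.outer(h, dz)

p = softmax(E[idx["the"]] @ W)
print("most likely word after 'the':", vocab[int(p.argmax())])
```

The point of the sketch is that the conditional probabilities are never tabulated; they fall out of the learned word vectors.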

Empirically, you can keep scaling up model size, dataset size, and compute and keep improving the model, following smooth power laws: Kaplan2020scaling.
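The reported relationship for parameter count is a simple power law, L(N) ≈ (N_c / N)^α_N. A small sketch of what that curve looks like, with the approximate constants reported in Kaplan2020scaling (treat them as rough values):

```python
# Power-law scaling of test loss with non-embedding parameter count N,
# L(N) = (N_c / N) ** alpha_N. Constants are the approximate values
# reported by Kaplan et al. (2020).
ALPHA_N = 0.076
N_C = 8.8e13

def predicted_loss(n_params: float) -> float:
    return (N_C / n_params) ** ALPHA_N

for n in (1e6, 1e8, 1e10):
    print(f"N={n:.0e}  predicted loss = {predicted_loss(n):.2f}")
```

The loss keeps falling as N grows, which is the empirical basis for the "keep scaling" observation.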

They can also suffer from Training data leakage in language models.

They can be applied to “non-language data” as well. For instance, see Language models for health records or Language models for genetics.

Pre-trained language models currently dominate the field because fine-tuning on a small dataset benefits from a bigger, better pre-trained model. In other words, even with a limited dataset, using a better pre-trained model improves performance on the downstream task. Here is a review of such pre-trained language models: https://github.com/thunlp/PLMpapers
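A minimal sketch of the fine-tuning idea (the "pretrained" encoder, data, and hyperparameters here are all made up for illustration): keep the pre-trained encoder frozen and train only a small head on the limited labeled dataset.

```python
import numpy as np

# Illustrative sketch: a frozen "pretrained" encoder (random weights stand
# in for real pre-trained parameters) plus a small logistic-regression head
# trained on a tiny labeled dataset.
rng = np.random.default_rng(1)

D_IN, D_FEAT = 20, 16
W_pre = rng.normal(0, 1 / np.sqrt(D_IN), (D_IN, D_FEAT))  # frozen encoder weights

def encode(x):
    # Frozen pre-trained encoder: one linear layer + tanh; never updated.
    return np.tanh(x @ W_pre)

# Tiny labeled dataset: 40 points, binary labels from a hidden linear rule.
X = rng.normal(size=(40, D_IN))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Fine-tune only the head with full-batch logistic-regression SGD.
w = np.zeros(D_FEAT); b = 0.0
for _ in range(500):
    h = encode(X)
    p = 1 / (1 + np.exp(-(h @ w + b)))
    g = p - y
    w -= 0.1 * h.T @ g / len(X)
    b -= 0.1 * g.mean()

acc = ((p > 0.5) == (y > 0.5)).mean()
print(f"train accuracy with frozen encoder: {acc:.2f}")
```

The head has only D_FEAT + 1 parameters, so even 40 labeled examples are enough to train it; the heavy lifting is done by the (frozen) representation, which is the benefit the paragraph above describes.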

Text generation