Natural Language Processing

47 Posts

Toward Machines That LOL: Scientists Teach a Speech Recognition Robot to Laugh

Even if we manage to stop robots from taking over the world, they may still have the last laugh. Researchers at Kyoto University developed a series of neural networks that enable a robot engaged in spoken conversation to chortle along with its human interlocutor.

Large Language Models Unbound: BLOOM is the Largest Open Source NLP Model to Date

A worldwide collaboration produced the biggest open source language model to date. BLOOM is a family of language models built by the BigScience Research Workshop, a collective of over 1,000 researchers from 250 institutions around the globe.

Cutting the Carbon Cost of Training: A New Tool Helps NLP Models Lower Their Greenhouse Gas Emissions

You can reduce your model’s carbon emissions by being choosy about when and where you train it.

Order in the Court: Machine Learning Tool from Everlaw Finds Legal Evidence

Machine learning is helping lawyers sift through mountains of documents to find evidence. The legal technology company Everlaw launched a clustering feature that automatically organizes up to 25 million documents for lawyers gathering evidence to be used during a trial.
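
Document clustering of this kind typically works by representing each document as a vector and grouping nearby vectors. Everlaw's actual pipeline is proprietary; the sketch below is a generic k-means illustration on synthetic "document embedding" vectors, with all names and numbers invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for document embeddings: two topical groups of vectors,
# e.g. contract emails vs. invoices. (Purely synthetic data.)
docs = np.vstack([
    rng.normal(loc=0.0, scale=0.3, size=(5, 4)),
    rng.normal(loc=3.0, scale=0.3, size=(5, 4)),
])

def kmeans(X, k=2, iters=10):
    # Crude but deterministic init: start the two centers at the pair of
    # points farthest from each other (adequate for this k=2 sketch).
    d = ((X[:, None] - X[None]) ** 2).sum(-1)
    i, j = np.unravel_index(d.argmax(), d.shape)
    centers = X[[i, j]]
    for _ in range(iters):
        # Assign each document to its nearest cluster center,
        # then move each center to the mean of its documents.
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == c].mean(0) for c in range(k)])
    return labels

labels = kmeans(docs)
print(labels)  # the first five docs share one label, the last five the other
```

A production system would cluster millions of high-dimensional text embeddings rather than ten toy vectors, but the grouping principle is the same.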

LaMDA Comes Alive?: Google Engineer Says LaMDA AI Is Sentient

A chatbot persuaded at least one person that it has feelings. A senior engineer at Google announced his belief that the company’s latest conversational language model is sentient.

The Batch: Special Issue! Foundational Algorithms, Where They Came From, Where They're Going

Years ago, I had to choose between a neural network and a decision tree learning algorithm. I needed an efficient one, because we planned to apply the algorithm to a very large number of users on a limited compute budget.

Decision Trees: From Root to Leaves — Decision Trees for Machine Learning Explained

What kind of beast was Aristotle? The philosopher's follower Porphyry, who lived in Syria during the third century, came up with a logical way to answer the question...
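
Porphyry's method, a chain of yes/no questions narrowing from genus toward species, is exactly how a decision tree classifies. A minimal hand-built sketch, loosely following Porphyry's tree; the attributes and labels are illustrative, not taken from the article:

```python
# Each node asks one yes/no question, narrowing from genus toward species,
# just as Porphyry did. Attributes and labels here are illustrative.
def classify(being):
    if not being["living"]:
        return "mineral"
    if not being["sentient"]:
        return "plant"
    if not being["rational"]:
        return "animal"
    return "human"

aristotle = {"living": True, "sentient": True, "rational": True}
print(classify(aristotle))  # prints "human"
```

Machine-learned decision trees have the same shape; the difference is that an algorithm chooses the questions and their order from data rather than from philosophy.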

Neural Networks: Find the Function — A Basic Introduction to Neural Networks

Let’s get this out of the way: A brain is not a cluster of graphics processing units, and if it were, it would run software far more complex than the typical artificial neural network. Yet neural networks were inspired by the brain’s architecture.
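
The article's framing, a neural network as a trainable function approximator, can be sketched in a few lines of NumPy. This toy two-layer network learns XOR, a function no single linear layer can represent; the architecture, seed, and hyperparameters are all illustrative choices, not from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# A network is a parameterized function; training searches for parameters
# that make it match the target function. Here the target is XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 16)); b1 = np.zeros(16)   # hidden layer
W2 = rng.normal(size=(16, 1)); b2 = np.zeros(1)    # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10000):
    h = np.tanh(X @ W1 + b1)            # hidden activations
    p = sigmoid(h @ W2 + b2)            # predicted probabilities
    dz2 = (p - y) / len(X)              # cross-entropy gradient at the output
    dW2, db2 = h.T @ dz2, dz2.sum(0)
    dh = dz2 @ W2.T * (1 - h ** 2)      # backpropagate through tanh
    dW1, db1 = X.T @ dh, dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

pred = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2)
print((pred > 0.5).astype(int).ravel())
```

The loop is gradient descent with manual backpropagation; deep learning frameworks automate exactly these derivative computations at scale.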

Logistic Regression: Follow the Curve — A Basic Introduction to Logistic Regression for Machine Learning

There was a moment when logistic regression was used to classify just one thing: If you drink a vial of poison, are you likely to be labeled “living” or “deceased”? Times have changed.
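
That framing is easy to reproduce: logistic regression passes a weighted input through the sigmoid curve to get a probability, then thresholds it to classify. A minimal sketch on made-up poison-dose data; every number below is illustrative:

```python
import numpy as np

def sigmoid(z):
    # Map any real number to a probability in (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: dose of poison (ml) and outcome (1 = deceased, 0 = living).
doses = np.array([0.1, 0.5, 1.0, 2.0, 3.0, 4.0])
outcomes = np.array([0, 0, 0, 1, 1, 1])

# Fit a weight and bias by gradient descent on the log-loss.
w, b = 0.0, 0.0
lr = 0.5
for _ in range(2000):
    p = sigmoid(w * doses + b)
    w -= lr * np.mean((p - outcomes) * doses)
    b -= lr * np.mean(p - outcomes)

# A large dose should be classified "deceased", a tiny one "living".
print(sigmoid(w * 3.5 + b))   # high probability for a high dose
print(sigmoid(w * 0.2 + b))   # low probability for a low dose
```

Modern uses swap the single dose for many features, but the curve and the loss are unchanged.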

GPT-Free: Meta Releases Open Source Large Language Models OPT

Itching to get your hands on a fully trained large language model? The wait is over. Meta introduced the OPT family of transformer-based language models with nearly unfettered access to source code and trained weights.

Less Data for Vision Transformers: Boosting Vision Transformer Performance with Less Data

Vision Transformer (ViT) outperformed convolutional neural networks in image classification, but it required more training data. New work enabled ViT and its variants to outperform other architectures with less training data.

Efficiency Experts: Mixture of Experts Makes Language Models More Efficient

The emerging generation of trillion-parameter language models takes significant computation to train. Activating only a portion of the network for any given input can cut the requirement dramatically and still achieve exceptional results.
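
The core idea, activating only a few "expert" subnetworks per input, can be sketched as a gated routing layer. Everything below (shapes, random weights, top-k routing) is an illustrative simplification, not GLaM's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

# Each expert is a feed-forward weight matrix; a gating layer scores
# which experts each token should activate. All shapes are illustrative.
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
gate = rng.normal(size=(d_model, n_experts))

def moe_layer(x):
    scores = x @ gate                        # one score per expert
    chosen = np.argsort(scores)[-top_k:]     # keep only the top-k experts
    weights = np.exp(scores[chosen] - scores[chosen].max())
    weights /= weights.sum()                 # softmax over the chosen experts
    # Only top_k of the n_experts matrices are multiplied, so compute per
    # token stays roughly constant as more experts are added.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))

out = moe_layer(rng.normal(size=d_model))
print(out.shape)  # (16,)
```

Total parameter count grows with the number of experts, but each token touches only two of them, which is why mixture-of-experts models can be huge yet comparatively cheap to run.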

Neural Nets + Rules = Truer Text: Jurassic-X NLP Can Solve Math, Check Facts, and More

A new approach aims to cure text generators of their tendency to produce nonsense. AI21 Labs launched Jurassic-X, a natural language processing system that combines neural networks and rule-based programs.

Spot the Bad Mutation: AI Model Spots Disease-Linked Protein Mutations

Every gene in the human genome exists in a variety of mutations, and some encode protein variants that cause cells to malfunction, resulting in illness. Yet which mutations are associated with disease is largely unknown.

Transformers See in 3D: Using Transformers to Estimate Depth in 2D Images

Visual robots typically perceive the three-dimensional world through sequences of two-dimensional images, but they don’t always know what they’re looking at. For instance, Tesla’s self-driving system has been known to mistake a full moon for a traffic light.