Different videoclips showing windmills

Wind in the Forecast: AI Tool Predicts Wind Turbine Energy Output

Machine learning is making wind power more predictable. Engie SA, a multinational energy utility based in France, is the first customer for an AI-powered tool from Google that predicts the energy output of wind farms.
Example of text generated by LaMDA

LaMDA Comes Alive?: Google Engineer Says LaMDA AI is Sentient

A chatbot persuaded at least one person that it has feelings. A senior engineer at Google announced his belief that the company’s latest conversational language model is sentient.
Metaverse illustration with Meta AI product names

Meta Decentralizes AI Effort: Meta Restructures its AI Research Teams

The future of Big AI may lie with product-development teams. Meta reorganized its AI division. Henceforth, AI teams will report to departments that develop key products.
Contentedge screen video capture

Winning The Google Game: 14 Companies Using GPT-3 to Top SEO

AI startups are helping writers tailor articles that appear near the top of Google’s search results. At least 14 companies sell access to software that uses GPT-3, the language model from OpenAI, to generate headlines, product descriptions, blog posts, and video scripts.
Didactic diagram of a hypothetical embedded-model architecture

Image Generation + Probabilities: New Method Boosts Performance for Normalizing Flow

If you want to both synthesize data and find the probability of any given example — say, generate images of manufacturing defects to train a defect detector and identify the highest-probability defects — you may use the architecture known as a normalizing flow.
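The two abilities mentioned above, sampling new examples and computing an exact probability for any given example, both follow from a flow's invertible transform via the change-of-variables formula. The sketch below is a minimal, hypothetical single-layer affine flow (real normalizing flows stack many learned invertible layers); the parameters `log_scale` and `shift` stand in for learned values.

```python
import numpy as np

# Minimal 1D affine normalizing flow (illustrative only; real flows stack
# many learned invertible layers). Base distribution: standard normal.
# Forward transform: x = exp(log_scale) * z + shift
log_scale, shift = 0.5, 2.0  # hypothetical "learned" parameters

def sample(n, rng):
    """Draw samples by pushing base noise through the forward transform."""
    z = rng.standard_normal(n)
    return np.exp(log_scale) * z + shift

def log_prob(x):
    """Exact log-density via change of variables:
    log p(x) = log N(z; 0, 1) - log|dx/dz|, with z the inverse transform."""
    z = (x - shift) * np.exp(-log_scale)           # invert the transform
    log_base = -0.5 * (z ** 2 + np.log(2 * np.pi))  # log N(z; 0, 1)
    return log_base - log_scale                     # minus log-determinant

rng = np.random.default_rng(0)
xs = sample(100_000, rng)
print(xs.mean())  # samples center near `shift`
```

Because the transform is invertible with a tractable Jacobian, the same parameters serve both directions: sampling runs the transform forward, while density evaluation runs it backward.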
GLaM model architecture

Efficiency Experts: Mixture of Experts Makes Language Models More Efficient

The emerging generation of trillion-parameter language models takes significant computation to train. Activating only a portion of the network at a time can cut the requirement dramatically and still achieve exceptional results.
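The idea of activating only part of the network can be sketched as top-1 ("switch"-style) routing: a small router scores each token, and only the single highest-scoring expert runs on it. The code below is a toy numpy illustration under assumed shapes and randomly initialized weights, not any model's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_experts, n_tokens = 8, 4, 16  # hypothetical small dimensions

# Hypothetical parameters: a linear router plus one weight matrix per expert.
router_w = rng.standard_normal((d, n_experts))
expert_w = rng.standard_normal((n_experts, d, d))

def switch_layer(tokens):
    """Top-1 mixture-of-experts routing: each token activates one expert."""
    logits = tokens @ router_w                      # (n_tokens, n_experts)
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)       # softmax gate
    choice = probs.argmax(axis=1)                   # one expert per token
    out = np.empty_like(tokens)
    for e in range(n_experts):
        mask = choice == e
        if mask.any():                              # only chosen experts compute
            out[mask] = (tokens[mask] @ expert_w[e]) * probs[mask, e, None]
    return out, choice

tokens = rng.standard_normal((n_tokens, d))
out, choice = switch_layer(tokens)
```

Per token, only one of the four expert matrices is multiplied, so compute per token stays roughly constant even as total parameters grow with the number of experts.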
No-code AI platform

Who Needs Programming? Behind the rise of no-code AI.

The next killer AI application may be developed by someone who has never heard of gradient descent. A rising generation of software development platforms serves users who aren't familiar with AI, or even with programming.
The performance of different downstream (DS)

The Limits of Pretraining: More pretraining doesn't guarantee a better fine-tuned AI.

The higher the accuracy of a pretrained model, the better its performance after fine-tuning, right? Not necessarily. Researchers conducted a meta-analysis of image-recognition experiments and performed some of their own.
Robot cleaning a table

Robots in the Workplace: Google uses robot janitors to clean up offices.

Machines are doing light janitorial work in the uncontrolled environment of Google’s offices. Everyday Robots, a new spin-out from Google’s experimental X Development division, unleashed 100 robots to perform an array of cleanup tasks.
A living room made out of cups of coffee: the people, the seats, the chimney, the lamp, all gather around a cozy fire.

One Architecture to Do Them All: The AI architecture that can do it all.

The transformer architecture extended its reach to a variety of new domains. Originally developed for natural language processing, transformers are becoming the Swiss Army knife of deep learning.
An illustration shows a cozy cabin where all the furniture is made out of coffee mugs.

Transformers Take Over: Transformers Applied to Vision, Language, Video, and More

In 2021, transformers were harnessed to discover drugs, recognize speech, and paint pictures — and much more.
Illustration of a woman riding a sled

Multimodal AI Takes Off: Multimodal Models, such as CLIP and DALL·E, are taking over AI.

While models like GPT-3 and EfficientNet, which work on text and images respectively, are responsible for some of deep learning's highest-profile successes, approaches that find relationships between text and images made impressive strides.
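At the heart of models like CLIP is a simple scoring rule: separate image and text encoders map their inputs into a shared embedding space, and cosine similarity between the embeddings measures how well an image matches a caption. The sketch below fakes the encoders with hypothetical pre-computed embeddings (each "image" vector is placed near its caption's vector) to show only the matching step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pre-computed embeddings. In a CLIP-style model, trained
# encoders would produce these; here we fake them by placing each image
# embedding near its matching caption's embedding.
text_emb = rng.standard_normal((3, 64))                     # 3 captions
image_emb = text_emb + 0.1 * rng.standard_normal((3, 64))   # matching images

def normalize(v):
    """L2-normalize rows so dot products become cosine similarities."""
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Each row scores one image against every caption.
sims = normalize(image_emb) @ normalize(text_emb).T
best_caption = sims.argmax(axis=1)
print(best_caption)  # each image picks its own caption: [0 1 2]
```

The same similarity matrix, fed through a softmax, is what a contrastive training objective would push toward the identity: each image most similar to its own caption and vice versa.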
Illustration of giant Christmas tree in a town plaza

Trillions of Parameters: Are AI models with trillions of parameters the new normal?

The trend toward ever-larger models crossed the threshold from immense to ginormous. Google kicked off 2021 with Switch Transformer, the first published work to exceed a trillion parameters, weighing in at 1.6 trillion.
Two images showing RETRO Architecture and Gopher (280B) vs State of the Art

Large Language Models Shrink: Gopher and RETRO prove lean language models can push boundaries.

DeepMind released three papers that push the boundaries — and examine the issues — of large language models.
Google's Decision Transformer

Reinforcement Learning Transformed: Transformers succeed at reinforcement learning tasks.

Transformers have matched or exceeded earlier architectures in language modeling and image classification. New work shows they can achieve state-of-the-art results in some reinforcement learning tasks as well.
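One way this works, used by Decision Transformer, is to recast reinforcement learning as sequence modeling: a trajectory becomes a token stream of (return-to-go, state, action) triples, and the model learns to predict actions conditioned on a desired future return. The sketch below builds such a sequence from a hypothetical toy trajectory with scalar states and actions; it shows only the data preparation, not the transformer itself.

```python
import numpy as np

# Hypothetical toy trajectory: rewards, states, and actions at 4 timesteps.
rewards = np.array([1.0, 0.0, 2.0, 1.0])
states = np.array([0.1, 0.2, 0.3, 0.4])
actions = np.array([1.0, 0.0, 1.0, 1.0])

# Return-to-go at step t: sum of rewards from t to the end of the episode.
returns_to_go = np.cumsum(rewards[::-1])[::-1]
print(returns_to_go)  # [4. 3. 3. 1.]

# Interleave into the sequence a transformer would consume:
# R_0, s_0, a_0, R_1, s_1, a_1, ...
sequence = np.stack([returns_to_go, states, actions], axis=1).ravel()
```

At inference time, prompting the model with a high return-to-go asks it to generate the actions of a high-reward trajectory, turning policy optimization into conditional sequence generation.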
