Vanilla Neural Network

19 Posts

Long-Range Weather Forecasts: This ML-based forecast simulator outperformed medium-range forecast systems.

Machine learning models have predicted the weather a few days in advance. A new approach substantially extends that time horizon. Remi Lam and colleagues at Google developed GraphCast, a weather-forecasting system based on graph neural networks (GNNs).
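
The blurb doesn't detail GraphCast's architecture; the core operation of any graph neural network is message passing between neighboring nodes. Here is a minimal sketch of one such step, with an invented toy graph and random weights standing in for learned parameters:

```python
import numpy as np

# Minimal sketch of one message-passing step, the basic operation behind GNN
# forecasters such as GraphCast. The toy graph, feature sizes, and random
# weights below are illustrative, not GraphCast's actual mesh or parameters.

rng = np.random.default_rng(0)

num_nodes, feat_dim, hidden_dim = 5, 8, 16
node_feats = rng.normal(size=(num_nodes, feat_dim))   # e.g., weather state per grid node
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]      # toy directed ring graph

W_msg = rng.normal(size=(feat_dim, hidden_dim)) * 0.1             # message transform
W_upd = rng.normal(size=(feat_dim + hidden_dim, feat_dim)) * 0.1  # node update

# 1) Each node sends a transformed message along its outgoing edges.
messages = np.zeros((num_nodes, hidden_dim))
for src, dst in edges:
    messages[dst] += np.maximum(node_feats[src] @ W_msg, 0.0)     # ReLU message

# 2) Each node updates its state from its old features plus aggregated messages.
node_feats = np.tanh(np.concatenate([node_feats, messages], axis=1) @ W_upd)

print(node_feats.shape)  # (5, 8): per-node state after one step
```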

Massively Multilingual Translation: Machine Learning Model Trained to Translate 1,000 Languages

Recent work showed that models for multilingual machine translation can increase the number of languages they translate by scraping the web for pairs of equivalent sentences in different languages. A new study radically expanded the language repertoire through training on untranslated web text.
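
The mining step mentioned above is commonly done by embedding sentences from both languages into a shared vector space and keeping high-similarity pairs. The sketch below illustrates that idea with random placeholder embeddings; a real pipeline would use a multilingual sentence encoder:

```python
import numpy as np

# Sketch of embedding-based bitext mining: sentences from two languages are
# mapped into a shared vector space, and mutual nearest neighbors with high
# cosine similarity are kept as candidate translation pairs. The embeddings
# below are random placeholders; with a real multilingual encoder, true
# translations would score far higher than unrelated sentences.

rng = np.random.default_rng(0)
emb_lang_a = rng.normal(size=(100, 512))   # stand-in embeddings, language A
emb_lang_b = rng.normal(size=(120, 512))   # stand-in embeddings, language B

def normalize(x):
    return x / np.linalg.norm(x, axis=1, keepdims=True)

sim = normalize(emb_lang_a) @ normalize(emb_lang_b).T   # cosine similarity matrix

threshold = 0.1
pairs = []
for i in range(sim.shape[0]):
    j = int(sim[i].argmax())
    if sim[i, j] > threshold and int(sim[:, j].argmax()) == i:
        pairs.append((i, j))

print(f"mined {len(pairs)} candidate pairs")
```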

Bridge to Explainable AI: AI System Outplays Human Bridge Champions

DeepMind’s AlphaGo famously dominated Go, a game in which players can see the state of play at all times. A new AI system demonstrated similar mastery of bridge, in which crucial information remains hidden.

Roadblocks to Regulation: Why laws to regulate AI usually fail.

Most U.S. state agencies use AI without limits or oversight. An investigative report probed why efforts to rein them in have made little headway: since 2018, nearly every bill proposed to study or control how state agencies use automated decision systems has failed to pass.

How to Learn Machine Learning

Want to become an AI practitioner? Here’s a program that will take you from beginner to job-ready. You may already have a head start, depending on your background. For a motivated person who starts with a solid high-school education, the program may take around two years.

AI Versus the Garbage Heap: How Amazon uses AI to cut waste.

Amazon reported long-term success using machine learning to shrink its environmental footprint. The online retailer developed a system that fuses product descriptions, images, and structured data to decide how an item should be packed for shipping.
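
The blurb describes a multimodal classifier at a high level; the sketch below shows one simple way to fuse text, image, and structured features, with all feature extractors, sizes, and packaging classes invented for illustration rather than taken from Amazon's system:

```python
import numpy as np

# Sketch of the multimodal fusion described above: embeddings of the product
# description, product image, and structured attributes are concatenated and
# scored against packaging types. Everything here is a placeholder.

rng = np.random.default_rng(0)

text_emb = rng.normal(size=768)        # stand-in for a text-encoder embedding
image_emb = rng.normal(size=512)       # stand-in for an image-encoder embedding
tabular = np.array([0.3, 1.2, 0.0])    # e.g., weight, volume, fragility flag

fused = np.concatenate([text_emb, image_emb, tabular])

packaging_types = ["box", "padded mailer", "paper bag", "no added packaging"]
W = rng.normal(size=(fused.size, len(packaging_types))) * 0.01

logits = fused @ W
probs = np.exp(logits - logits.max())
probs /= probs.sum()

print(dict(zip(packaging_types, probs.round(3))))
```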

Which Drug Helps Your Depression?: AI System Matches Patients With the Right Depression Drug

Deep learning can predict how patients will respond to two antidepressant medicines.

A Deeper Look at Graphs: Graph Neural Networks Work Better With More Layers

New research shows that drastically increasing the number of layers in a graph neural network improves its performance on large datasets.
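
The paper's specific techniques aren't reproduced here; the sketch below simply shows what a very deep stack of graph-convolution layers looks like, using residual connections, which are one common ingredient for training such depth (graph, sizes, and weights are toy values):

```python
import numpy as np

# Sketch of a very deep stack of graph-convolution layers. Residual
# connections (used here) help depth remain trainable; this illustrates deep
# GNNs in general, not the exact architecture from the paper above.

rng = np.random.default_rng(0)

num_nodes, dim, num_layers = 6, 16, 64
adj = (rng.random((num_nodes, num_nodes)) < 0.3).astype(float)
adj = np.maximum(adj, adj.T)        # undirected toy graph
adj += np.eye(num_nodes)            # self loops
deg_inv_sqrt = 1.0 / np.sqrt(adj.sum(axis=1))
norm_adj = deg_inv_sqrt[:, None] * adj * deg_inv_sqrt[None, :]  # symmetric normalization

h = rng.normal(size=(num_nodes, dim))
layer_weights = [rng.normal(size=(dim, dim)) * 0.05 for _ in range(num_layers)]

for W in layer_weights:
    h = h + np.maximum(norm_adj @ h @ W, 0.0)   # residual GCN-style layer

print(h.shape)  # (6, 16): node representations after 64 layers
```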

Finding Useful Points in Space: Keypoint3D Helps Robots Locate Spatial Coordinates

A new machine learning method aims to improve a robot’s ability to identify and locate points of interest in space.

Team Players: Football-Playing AI Blends Individual and Group Skills

Playing a team sport involves a fluid blend of individual and group skills. Researchers integrated both types of action into realistic humanoid agents that play football (known as soccer in the U.S.).

Perceptrons Are All You Need: Google Brain's Multi-Layer Perceptron Rivals Transformers

The paper that introduced the transformer famously declared, “Attention is all you need.” To the contrary, new work shows you may not need transformer-style attention at all. Hanxiao Liu and colleagues at Google developed gMLP, a simple architecture that performed some language and vision tasks as well as transformers.
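
The sketch below shows a simplified gMLP-style block: tokens interact through a learned linear map over the sequence dimension inside a spatial gating step, with no attention anywhere. It omits normalization and uses illustrative sizes, so treat it as a rough picture rather than the paper's exact design:

```python
import numpy as np

# Simplified gMLP-style block. Token interaction comes from a learned linear
# map over the sequence dimension inside a "spatial gating" step; there is no
# attention. Normalization is omitted; sizes and weights are illustrative.

rng = np.random.default_rng(0)
seq_len, d_model, d_ffn = 32, 64, 128

x = rng.normal(size=(seq_len, d_model))                  # token embeddings

W_in = rng.normal(size=(d_model, d_ffn)) * 0.05          # channel projection in
W_spatial = rng.normal(size=(seq_len, seq_len)) * 0.05   # mixes across token positions
W_out = rng.normal(size=(d_ffn // 2, d_model)) * 0.05    # channel projection out

u = np.maximum(x @ W_in, 0.0)           # activation (ReLU here for simplicity)
u1, u2 = np.split(u, 2, axis=-1)        # split channels for gating
gated = u1 * (W_spatial @ u2)           # spatial gating: mix across the sequence
y = x + gated @ W_out                   # residual connection

print(y.shape)  # (32, 64)
```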

Sharper Attention: NLP transformer technique for more efficient token usage.

Self-attention enables transformer networks to track relationships between distant tokens — such as text characters — in long sequences, but the computational resources required grow quadratically with input size.
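
A bare-bones implementation makes the quadratic cost visible: the score matrix has one entry per pair of tokens. The article's subject, Expire-Span, lets attention ignore tokens that are no longer useful to the task; the expiration mask below is only a rough nod to that idea, not the paper's exact mechanism, and all sizes and lifespans are invented:

```python
import numpy as np

# Bare-bones scaled dot-product self-attention. The score matrix is n x n,
# which is why cost grows quadratically with sequence length. The expiration
# mask is only a rough nod to the token-dropping idea; lifespans are invented.

rng = np.random.default_rng(0)
n, d = 128, 64
q, k, v = (rng.normal(size=(n, d)) for _ in range(3))

scores = q @ k.T / np.sqrt(d)            # shape (n, n): quadratic in n

# Hypothetical per-token lifespans: queries ignore token j once it is more
# than lifespan[j] positions in the past.
lifespan = rng.integers(low=8, high=64, size=n)
positions = np.arange(n)
expired = (positions[:, None] - positions[None, :]) > lifespan[None, :]
scores = np.where(expired, -np.inf, scores)

weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
output = weights @ v

print(output.shape)  # (128, 64)
```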

Upgrade for ReLU: The sin(x) activation function is an alternative to ReLU.

The activation function known as ReLU builds complex nonlinear functions across layers of a neural network, making functions that outline flat faces and sharp edges. But how much of the world breaks down into perfect polyhedra?
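
A toy comparison shows the difference in character: with ReLU, a small network's output is piecewise linear, while swapping in sin(x) yields a smooth, periodic output (weights are random and purely for illustration):

```python
import numpy as np

# One-hidden-layer network with two activation choices: ReLU gives a
# piecewise-linear output (flat faces, sharp edges), while sin(x) gives a
# smooth, wavy output. Weights are random, purely for illustration.

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200)[:, None]

W1 = rng.normal(size=(1, 32))
b1 = rng.normal(size=32)
W2 = rng.normal(size=(32, 1)) * 0.3

relu_out = np.maximum(x @ W1 + b1, 0.0) @ W2   # piecewise-linear function of x
sine_out = np.sin(x @ W1 + b1) @ W2            # smooth, periodic function of x

print(relu_out.shape, sine_out.shape)  # (200, 1) (200, 1)
```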

Revenge of the Perceptrons: Perceptrons do some AI tasks on par with complex AI.

Why use a complex model when a simple one will do? New work shows that the simplest multilayer neural network, with a small twist, can perform some tasks as well as today’s most sophisticated architectures.
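
One popular form of that "small twist" is to apply plain perceptron layers along two different axes, so positions exchange information without attention or convolution. The sketch below assumes that variant, with toy sizes and random weights, and is an illustration rather than the paper's exact architecture:

```python
import numpy as np

# Plain MLPs applied along two axes: once across the sequence of input
# patches (token mixing, via transposes) and once across feature channels
# (channel mixing). Sizes and weights are toy values for illustration.

rng = np.random.default_rng(0)
num_patches, channels, hidden = 16, 32, 64

x = rng.normal(size=(num_patches, channels))

def mlp(inp, d_in, d_hidden, d_out):
    W1 = rng.normal(size=(d_in, d_hidden)) * 0.1
    W2 = rng.normal(size=(d_hidden, d_out)) * 0.1
    return np.maximum(inp @ W1, 0.0) @ W2

# Token mixing: the MLP runs over the patch dimension (note the transposes).
x = x + mlp(x.T, num_patches, hidden, num_patches).T
# Channel mixing: the same kind of MLP runs over the feature dimension.
x = x + mlp(x, channels, hidden, channels)

print(x.shape)  # (16, 32)
```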

3D Scene Synthesis for the Real World: Generating 3D scenes with radiance fields and image data

Researchers have used neural networks to generate novel views of a 3D scene based on existing pictures plus the positions and angles of the cameras that took them. In practice, though, you may not know the precise camera positions and angles.
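
View-synthesis methods of this kind cast a ray through each pixel, which is why the camera's exact position and orientation matter. The sketch below only builds those rays from a known toy pose; the work described here tackles the harder case where the pose is unknown and must be estimated or optimized along with the scene:

```python
import numpy as np

# Build one camera ray per pixel from a known camera-to-world pose. The
# pose, image size, and focal length here are toy values for illustration.

height, width, focal = 4, 6, 10.0

# Toy camera-to-world pose: identity rotation, camera placed at z = 2.
pose = np.eye(4)
pose[2, 3] = 2.0

i, j = np.meshgrid(np.arange(width), np.arange(height), indexing="xy")
dirs = np.stack(
    [(i - width / 2) / focal, -(j - height / 2) / focal, -np.ones_like(i, dtype=float)],
    axis=-1,
)
ray_dirs = dirs @ pose[:3, :3].T                      # rotate directions into world space
ray_origins = np.broadcast_to(pose[:3, 3], ray_dirs.shape)

print(ray_origins.shape, ray_dirs.shape)  # (4, 6, 3) (4, 6, 3)
```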