The DeepMind technique that will supercharge AI training with less energy
Image Credits: Curto News/Bing AI Creator


Researchers from DeepMind, Google's artificial intelligence (AI) division, have published revolutionary new research introducing JEST (Joint Example Selection Technique), a method that dramatically speeds up the training of AI models while significantly reducing the need for computational power.


How does JEST work?

  • The technique uses two AI models: a pre-trained reference model and a “learner” model in training, which together identify the most valuable data examples.
  • JEST intelligently selects the most instructive batches of data, making AI training up to 13 times faster and 10 times more efficient than current state-of-the-art methods.
  • In benchmark testing, JEST achieved top-notch performance using just 10% of the training data required by previous leading models.
  • The method enables “data quality bootstrapping” using small, well-selected data sets to guide learning on larger, unstructured sets.
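The selection step described above can be sketched in code. The following is a minimal, hypothetical illustration (not DeepMind's implementation): it scores each candidate example by the gap between the learner's loss and the reference model's loss, then keeps the highest-scoring ones. The scoring rule and function names are assumptions based on the article's description, and JEST's actual joint batch selection is simplified here to an independent top-k pick.

```python
import numpy as np

def select_batch(learner_loss, reference_loss, batch_size):
    """Pick the most 'learnable' examples from a super-batch of candidates.

    Learnability is scored as learner loss minus reference loss: examples
    the learner still struggles with, but that a pre-trained reference
    model finds easy, are treated as the most instructive.
    (Hypothetical scoring rule, inferred from the article's description.)
    """
    learnability = np.asarray(learner_loss) - np.asarray(reference_loss)
    # Keep the batch_size examples with the highest learnability scores.
    return np.argsort(learnability)[::-1][:batch_size]

# Toy example: per-example losses over a super-batch of 8 candidates.
learner = [2.0, 0.1, 1.5, 0.2, 3.0, 0.3, 1.0, 0.05]
reference = [1.9, 0.1, 0.2, 0.2, 0.5, 0.3, 0.9, 0.05]
chosen = select_batch(learner, reference, batch_size=3)
print(chosen)  # indices of the most instructive examples
```

In this toy run, examples 4 and 2 score highest (the learner's loss far exceeds the reference's), so they are selected first; examples where both models already agree contribute little and are skipped.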

Why is it important?

  • AI's massive energy consumption is under increasing scrutiny. JEST's ability to drastically reduce computational requirements could be a game-changer for training models more energy-efficiently.
  • Additionally, faster training could accelerate the rollout of advanced models, and that acceleration is just beginning.

With JEST, AI training can become faster, more efficient and much more environmentally friendly. This innovation paves the way for the development of even more powerful AI models without overwhelming the environment.
