Reported By: Shaurya Sharma
Last Updated: February 16, 2024, 09:03 IST
Mountain View, California, USA
Google’s Gemini LLM gets an upgrade.
Google’s Gemini 1.5 is out, just two months after the release of Gemini 1.0, bringing forth a slew of meaningful changes. Here’s all you need to know.
Google Gemini has been out for roughly two months, but the search giant has already launched its successor, Gemini 1.5, its most capable Large Language Model (LLM) to date. This version is currently only available to enterprises and developers, with a full consumer rollout expected soon.
Google CEO Sundar Pichai says Gemini 1.5 shows “dramatic improvements across a number of dimensions” while achieving quality comparable to Gemini 1.0 Ultra, the company’s most advanced LLM, using less compute.
Pichai added that the new generation achieves a breakthrough in long-context understanding: Google has been able to “increase the amount of information our models can process—running up to 1 million tokens consistently, achieving the longest context window of any large-scale foundation model yet.”
Gemini 1.5: What’s New?
Firstly, the Gemini 1.5 model comes with a new Mixture-of-Experts (MoE) architecture, making it more efficient and easier to serve.
Initially, Google is releasing the 1.5 Pro version for early testing, and it performs at a level similar to 1.0 Ultra. Gemini 1.5 Pro ships with a standard 128,000-token context window, but a limited group of developers and enterprises can try it with a context window of up to 1 million tokens.
Google also emphasized that Gemini 1.5 is more efficient because it is built on the Transformer and MoE architectures: MoE models are divided into smaller “expert” neural networks, and only the experts most relevant to a given input are activated, so not every parameter runs on every request.
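To make the routing idea concrete, here is a minimal sketch of top-1 expert routing in plain Python with NumPy. This is an illustration of the general MoE technique, not Google’s actual implementation; the expert count, dimensions, and gating function are all illustrative assumptions.

```python
# Minimal Mixture-of-Experts sketch (illustrative only, not Gemini's design).
# A learned gate scores each "expert" network and only the top-scoring expert
# runs for a given input, so compute per token stays small even though the
# total parameter count across all experts can be large.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts = 16, 4  # illustrative sizes

gate_w = rng.normal(size=(d_model, n_experts))               # gating weights
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route the input vector x to its single best-scoring expert (top-1)."""
    scores = x @ gate_w                  # one score per expert
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()                 # softmax over experts
    best = int(np.argmax(probs))         # top-1 routing
    return probs[best] * (x @ experts[best])  # only one expert is evaluated

x = rng.normal(size=d_model)
print(moe_layer(x).shape)  # (16,): same output shape, but 3 of 4 experts idle
```

The design point the sketch captures is the one Google highlights: the model’s capacity grows with the number of experts, while the per-input cost stays close to that of a single small network.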
Moreover, understanding context is another key area where Gemini 1.5 has been improved. “1.5 Pro can process vast amounts of information in one go — including 1 hour of video, 11 hours of audio, codebases with over 30,000 lines of code, or over 700,000 words. In our research, we’ve also successfully tested up to 10 million tokens,” Google said.
Simply put, if you give Gemini 1.5 a large body of text, such as a novel, a research article, or, as Google mentions, Apollo 11’s 402-page mission transcript, and ask it to summarise the material, it can do that. You can then ask detailed follow-up questions based on what Gemini 1.5 has understood from it.
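For the developers and enterprises in the early-access programme, that workflow could look something like the sketch below, which uses Google’s `google-generativeai` Python SDK. The model identifier and the transcript file are assumptions for illustration; the exact model name available in the preview may differ.

```python
# Hedged sketch: summarising a long document with Gemini 1.5 Pro via
# Google's google-generativeai Python SDK. The model identifier below is
# an assumption; check the current model list in your API console.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder credential

model = genai.GenerativeModel("gemini-1.5-pro-latest")  # assumed model name

# Load a long transcript (hypothetical file) entirely into the prompt;
# the large context window is what makes this single-shot approach viable.
with open("apollo11_transcript.txt", encoding="utf-8") as f:
    transcript = f.read()

response = model.generate_content(
    ["Summarise the key events in this mission transcript:", transcript]
)
print(response.text)
```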
Performance Sees A Jump
The new 1.5 Pro model outperforms Gemini 1.0 Pro on 87% of the benchmarks Google uses to develop its LLMs, and it performs at a level comparable to Gemini 1.0 Ultra.
Another big change is the model’s improved “in-context learning” ability. This means that Gemini 1.5 Pro can “learn a new skill from information given in a long prompt, without needing additional fine-tuning.”
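As a rough illustration of what in-context learning means in practice, the sketch below teaches a made-up word-reversal task entirely inside the prompt, with no fine-tuning. The task, examples, and model name are hypothetical; Google’s own demonstration involved far longer prompts.

```python
# Illustrative sketch of in-context learning: a made-up word-reversal
# "skill" is taught entirely inside the prompt, with no fine-tuning.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder credential
model = genai.GenerativeModel("gemini-1.5-pro-latest")  # assumed model name

prompt = (
    "Learn the pattern from these examples, then complete the last line.\n"
    "input: hello  -> output: olleh\n"
    "input: gemini -> output: inimeg\n"
    "input: tokens -> output:"
)
print(model.generate_content(prompt).text)  # expected: snekot
```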