🧙🏼‍♂️ Mistral just dropped a top-tier model
Also: Grok-3 by December
Howdy, wizards.
A quick one for today. Mistral had a surprise launch right after Meta yesterday, and Elon's got some big things brewing.
Let’s jump in!
Dario’s Picks
The most important news stories in AI this week
Mistral quietly ships new flagship model. Mistral Large 2 is the French startup's answer to OpenAI's and Meta's newest models. It's another open model that competes with the leading closed models, and it's "only" 123B parameters in size (roughly a third of Llama 3.1 405B). It also features improved multilingual support over Mistral's previous models. You can test the new model on Le Chat.
Why it matters With zero fanfare, Mistral just dropped a model whose performance is on par with, and in some cases even exceeds, models like GPT-4o, Claude 3.5 Sonnet and the brand-new Llama 3.1 405B. Important to note: while Mistral's Large models are more open than closed models like GPT-4o, commercial applications require a paid license.
Continued after the ad…
This issue is brought to you by
Learn AI in 5 Minutes a Day
AI Tool Report is one of the fastest-growing and most respected newsletters in the world, with over 550,000 readers from companies like OpenAI, Nvidia, Meta, Microsoft, and more.
Our research team spends hundreds of hours a week summarizing the latest news, and finding you the best opportunities to save time and earn more using AI.
Elon Musk announces Grok-3 by December. He claims it'll be the most powerful AI "by every metric". Training has already started on a compute cluster dubbed the Memphis Supercluster: 100,000 Nvidia H100 GPUs (yes, a lot of compute).
Why it matters Looks like the xAI team is putting its fresh $6 billion in funding to work. While Musk is known for bold predictions that don't always come to fruition, there's a lot of expertise and compute going into the mix here, so their next model could well put xAI up there with the leading labs.
OpenAI is in talks with Broadcom on a new AI chip. OpenAI is in discussion with Broadcom and potentially other semiconductor companies about developing its own AI chip. Like all the major AI companies, OpenAI is highly reliant on chips from Nvidia, and this is a strategy to reduce that dependency.
Why it matters Chips are a crucial component in building increasingly advanced AI. OpenAI is playing the long game here, aiming to secure access to AI chips, which have long been in short supply.
Perplexity improves its interface for navigational searches. If a user types the name of a website into Perplexity, the first thing that pops up now is a big link to that website (similar to when Googling). After that, Perplexity's default answer and the other related links follow.
Why it matters This is definitely a UI upgrade, making it easier to find websites with Perplexity. Between the lines, it means people are actually using Perplexity like a regular search engine, finding websites by searching their names instead of typing the URL. It's not just for summarising, getting quick answers, etc. Makes me wonder at what point the "Sponsored" links will start appearing.
The wizard’s favourite AI newsletters
what i’m reading right now
TLDR 💨 - essential tech news in 5 mins. Read by 1.2 million people
simple.ai 🕵🏻‍♂️ - deep-dives on AI by Hubspot's co-founder
Bagel Bots 🥯 - best hands-on tips & tricks
The Neuron 😸 - easy weekday read on AI’s latest developments
Was this email forwarded to you? Sign up here. Want to get in front of 13,000 AI enthusiasts? Work with me. This newsletter is written & curated by Dario Chincha.
What's your verdict on this week's email?
Affiliate disclosure: To cover the cost of my email software and the time I spend writing this newsletter, I sometimes link to products and other newsletters. Please assume these are affiliate links. If you choose to subscribe to a newsletter or buy a product through any of my links then THANK YOU – it will make it possible for me to continue to do this.