French AI darling Mistral is keeping the new releases coming this summer. Just days after announcing its own domestic AI-optimized cloud service, Mistral Compute, the well-funded company has released ...
When Liquid AI, a startup founded by MIT computer scientists back in 2023, introduced its Liquid Foundation Models series 2 (LFM2) in July 2025, the pitch was straightforward: deliver the fastest ...
Large language models (LLMs) use vast amounts of data and computing power to create answers to queries that look and sometimes even feel “human”. LLMs can also generate music, images or video, write ...
There’s a paradox at the heart of modern AI: The kinds of sophisticated models that companies are using to get real work done and reduce head count aren’t the ones getting all the attention. Ever-more ...
Large language models work well because they’re so large. The latest models from OpenAI, Meta and DeepSeek use hundreds of billions of “parameters” — the adjustable knobs that determine connections ...
Small language models shine in domain-specific and specialized use cases, while making it easier for enterprises to balance performance, cost, and security concerns. Since ChatGPT arrived in late 2022 ...