The latest AI news from Meta
September 18, 2024
To inspire developers who build on Llama, Together AI built the LlamaCoder app—an open source web app that allows people to generate an entire app from a prompt using Llama 3.1 405B.
September 13, 2024
Refik Anadol’s studio trains its own models, leveraging Meta’s open source Llama collection of models in the process.
September 11, 2024
Powered by Meta’s Llama 3 and 3.1, Upeo Labs' Somo-GPT serves as a multi-subject teaching assistant.
April 25, 2024
Meditron, a suite of open source large multimodal foundation models tailored to the medical field, was built on Meta Llama 2.
August 29, 2024
Llama models are approaching 350 million downloads to date, and they were downloaded more than 20 million times in the last month alone, making Llama the leading open source model family.
August 14, 2024
Our partners at NVIDIA explain how they used structured weight pruning and model distillation to create Llama-Minitron 3.1 4B—their first work within the Llama 3.1 open source collection of models.
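NVIDIA’s post describes their actual recipe in detail; as a rough illustration of the two ideas it combines, here is a toy sketch of structured pruning (dropping whole neurons by importance) and a temperature-scaled distillation loss. The function names and scoring heuristic are hypothetical simplifications, not NVIDIA’s implementation.

```python
import math

def prune_rows(weight, keep):
    """Structured pruning sketch: drop entire rows (neurons) whose
    L2 norm is smallest, keeping only the `keep` most important rows.
    Real pipelines score importance with activation or gradient data."""
    norms = [math.sqrt(sum(w * w for w in row)) for row in weight]
    keep_idx = sorted(range(len(weight)), key=lambda i: -norms[i])[:keep]
    return [weight[i] for i in sorted(keep_idx)]

def distill_loss(teacher_logits, student_logits, temperature=2.0):
    """Distillation sketch: KL divergence between temperature-softened
    teacher and student distributions, scaled by T^2 as is conventional."""
    def softmax(xs, T):
        m = max(xs)
        exps = [math.exp((x - m) / T) for x in xs]
        total = sum(exps)
        return [e / total for e in exps]
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature * temperature
```

In practice the pruned student is then trained to minimize this loss against the larger teacher’s outputs, which is how a compact model can retain much of the teacher’s behavior.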
August 09, 2024
Zoom uses its own models as well as closed-source and open-source large language models, including Llama, to power its AI Companion.
August 08, 2024
Leveraging Llama 2, LyRise has been able to build robust matching pipelines that present relevant candidates to clients—reducing time-to-hire by as much as 50%.
August 07, 2024
In this post, we explore some rules of thumb for curating a good training dataset.
In this post, we’ll discuss the following question: “When should we fine-tune, and when should we consider other techniques?”
In this post, we’ll take a look at the various approaches available to adapt LLMs to domain data.
August 05, 2024
We’re excited to begin accepting applications for the Llama 3.1 Impact Grants, the next iteration of our broader effort to support organizations pursuing ideas for how Llama can address social challenges in their communities.