Meta Platforms unveiled the latest iteration of its large language model, Llama 4, on Saturday.
The company introduced two new versions, Llama 4 Scout and Llama 4 Maverick, describing them as its most advanced models yet and positioning them as leaders in AI technology. These models are designed to process and integrate various types of data, including text, images, video and audio, allowing them to convert content across these formats.
Meta also announced that both Llama 4 Scout and Llama 4 Maverick will be open source, broadening their accessibility to developers. In addition, Meta previewed Llama 4 Behemoth, which it describes as one of the most powerful LLMs in the world, intended to serve as a teacher for its newer models.
The launch of these models follows a surge in investment in AI infrastructure by major technology companies, spurred by the success of OpenAI's ChatGPT. However, Meta had delayed the launch of Llama 4 after the model failed to meet internal technical benchmarks, particularly on reasoning and math tasks.
There were also concerns that Llama 4 could not match OpenAI's capabilities in humanlike voice conversations.
To support its artificial intelligence efforts, Meta plans to invest up to $65 billion this year in expanding its AI infrastructure, partly in response to mounting investor pressure to deliver returns on those investments.