Highlights
Meta Platforms has officially launched early versions of its new large language model, Llama 3, in a strategic move to solidify its position within the rapidly evolving AI landscape. The release comprises two models with powerful computational abilities, built to improve the Meta AI virtual assistant and being integrated into major platforms such as Facebook, Instagram, WhatsApp, and Messenger.
The early versions of Llama 3 have 8 billion and 70 billion parameters, respectively, and Meta claims they have outperformed rival models from Google and various startups on benchmarks for reasoning, coding, and creative writing, among others.
These models are being integrated into Meta AI, giving users more advanced and seamless digital assistance directly in the applications they use every day. The assistant's new capabilities are also being promoted on a standalone website, marking the company's move into direct competition with arguably more advanced offerings from industry leaders such as OpenAI's ChatGPT.
In addition to the computational improvements, Llama 3 brings enhancements in handling nuanced language, an area where predecessors such as Llama 2 struggled. These improvements matter as Meta prepares to launch its AI assistant in more than a dozen international markets beyond the U.S., including Australia, Canada, Singapore, Nigeria, and Pakistan. The rollout is part of Meta's wider plan to use its global platform reach to put advanced AI tools in front of a larger audience.
Chris Cox, Meta’s Chief Product Officer, highlighted that adding combined text and image data to Llama 3’s training is expected to make the assistant more useful. Further improvements along these lines are planned later this year, with the potential to reshape interactions with devices such as Ray-Ban Meta smart glasses, which would be able to detect objects and offer contextual information in real time.
Although Meta has disclosed that Llama 3 was trained on a dataset seven times larger than its predecessor’s, the details of the data employed remain vague. The company says no user data was used and that training drew on sources including publicly available internet data and synthetic AI-generated data. This reflects a broader industry trend of balancing the enormous data requirements of AI training against mounting concerns about privacy and ethical data usage.
The Llama 3 roadmap includes gradual improvements to reasoning and multimodal abilities, as well as a possible version with over 400 billion parameters. Although the decision to release the largest model is still pending, Meta CEO Mark Zuckerberg has spoken positively about its capabilities and the company’s open-source approach.