FalconLM open-source language model beats Meta’s LLaMA



Summary

The open-source language model FalconLM offers better performance than Meta’s LLaMA and can also be used commercially. However, commercial use is subject to royalties once revenues exceed $1 million.

FalconLM is being developed by the Technology Innovation Institute (TII) in Abu Dhabi, United Arab Emirates. The organization claims that FalconLM is the most powerful open-source language model to date, although the largest variant, with 40 billion parameters, is significantly smaller than Meta’s LLaMA with 65 billion parameters.

On the Hugging Face OpenLLM Leaderboard, which summarizes the results of various benchmarks, the two largest FalconLM models, one of which is instruction-tuned, currently hold the top two positions by a significant margin. TII also offers a 7-billion-parameter model.

The falcon outflies its rivals. | Picture: Hugging Face (screenshot).

FalconLM trains more efficiently than GPT-3

An important aspect of FalconLM’s competitive edge, according to the development team, is the data selection for training. Language models are sensitive to data quality during training.


The research team developed a process to extract high-quality data from the well-known Common Crawl dataset and remove duplicates. Despite this thorough cleaning, five trillion text tokens remained – enough to train powerful language models.

The research team’s goal is to use the RefinedWeb dataset to filter out only the highest-quality original data from Common Crawl. | Picture: TII
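
TII has not published the full pipeline, but the exact-deduplication step it describes can be illustrated with a short, hypothetical Python sketch (the function name and the hash-based approach here are assumptions for illustration; the real RefinedWeb pipeline also applies fuzzy matching and quality filters):

```python
import hashlib

def dedup_exact(documents):
    """Keep only the first occurrence of each exact document.

    A toy stand-in for one step of a Common Crawl cleaning pipeline;
    real pipelines such as RefinedWeb also use fuzzy (near-duplicate)
    matching and quality heuristics.
    """
    seen = set()
    unique = []
    for doc in documents:
        # Hash the normalized text so the seen-set stays small at web scale.
        digest = hashlib.sha256(doc.strip().lower().encode("utf-8")).digest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

docs = ["The falcon flies.", "  the falcon flies. ", "A different page."]
print(dedup_exact(docs))  # ['The falcon flies.', 'A different page.']
```

Hashing normalized text rather than storing the documents themselves keeps the memory footprint manageable when the input is all of Common Crawl.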

FalconLM was trained on a trillion tokens. Data from the RefinedWeb dataset was enriched with “a few” curated datasets from scientific articles and social media discussions. The best-performing variant, the instruction-tuned chatbot version, was fine-tuned using the Baize dataset.

TII also mentions an architecture optimized for performance and efficiency, but does not provide details. The paper is not yet available.

According to the team, the optimized architecture combined with the high-quality dataset meant that FalconLM required only 75 percent of GPT-3’s training compute, while significantly outperforming the older OpenAI model. Inference costs are said to be one-fifth of GPT-3’s.

Available as open source, but commercial use can get expensive

TII’s suggested use cases for FalconLM include text generation, solving complex problems, use as a personal chatbot, and commercial applications such as customer service or translation.


FalconLM is available as a free download from Hugging Face. Along with the models, the team is also releasing a portion of the “RefinedWeb” dataset – 600 billion text tokens – as open source under an Apache 2.0 license. The dataset is also said to be ready for multimodal extension, as the examples already include links and alt text for images.
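
As a minimal sketch of what using the released models looks like, the 7-billion-parameter checkpoint can be loaded with the Hugging Face transformers library (assuming the tiiuae/falcon-7b model ID; at release, the custom architecture required trust_remote_code=True):

```python
import torch
import transformers
from transformers import AutoTokenizer

model_id = "tiiuae/falcon-7b"  # base model; an instruct variant is also available
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Build a text-generation pipeline; device_map="auto" spreads the weights
# across available GPUs, and bfloat16 halves the memory footprint.
pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)

output = pipeline(
    "Falcons are impressive birds because",
    max_new_tokens=50,
    do_sample=True,
    top_k=10,
)
print(output[0]["generated_text"])
```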
