Technology Innovation Institute
Research interests
Large language models
Do you believe in a better tomorrow? We do. Our team of expert researchers lives the dream and works to build it every day.
🔥 Falcon-180B is now available in open access! Try it now in our chat demo!
News
- 🔥 TII has open-sourced Falcon-180B for research and commercial use! Access the 180B model, as well as the 7B/40B models, and explore our high-quality web dataset, RefinedWeb.
- ✨ Falcon-7B/40B are now available under the Apache 2.0 license; TII has waived all royalties and commercial usage restrictions.
Falcon LLM
Falcon LLM is TII's flagship series of large language models, built from scratch using a custom data pipeline and distributed training library. Papers coming soon.
To promote collaboration and drive innovation, we have open-sourced a number of artefacts:
- The Falcon-180B pretrained and chat models, under the Falcon-180B TII license. Falcon-180B is the largest and most powerful open-access model available.
- The Falcon-7B/40B pretrained and instruct models, under the Apache 2.0 software license. The Falcon-7B/40B models are state-of-the-art for their size, outperforming other open-source models on NLP benchmarks; a minimal loading sketch follows this list.
- The RefinedWeb dataset, a massive web dataset with stringent filtering and large-scale deduplication, enabling models trained on web data alone to match or outperform models trained on curated corpora. See the paper for more information. RefinedWeb is licensed under ODC-By 1.0; a streaming sketch follows the table below.
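As a quick orientation, here is a minimal sketch of loading one of these checkpoints with the Hugging Face `transformers` library. The model ID, precision, and generation settings are illustrative defaults, not a prescribed recipe, and you should scale the checkpoint to your available hardware.

```python
# Minimal sketch: text generation with a Falcon checkpoint via transformers.
# Assumes `transformers`, `accelerate`, and `torch` are installed; older
# transformers versions may additionally need trust_remote_code=True.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-7b-instruct"  # or another Falcon checkpoint that fits your hardware

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to roughly halve memory use
    device_map="auto",           # let accelerate place layers on available devices
)

prompt = "Write a short poem about Abu Dhabi."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, top_k=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```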
See below for a detailed list of artefacts in the Falcon LLM family:
| Artefact | Link | Type | Details |
|---|---|---|---|
| 🔥 Falcon-180B | Here | pretrained model | 180B parameters trained on 3,500 billion tokens. |
| Falcon-180B-Chat | Here | chat model | Falcon-180B finetuned on a mixture of Ultrachat, Platypus, and Airoboros. |
| 🔥 Falcon-40B | Here | pretrained model | 40B parameters trained on 1,000 billion tokens. |
| Falcon-40B-Instruct | Here | instruction/chat model | Falcon-40B finetuned on the Baize dataset. |
| 🔥 Falcon-7B | Here | pretrained model | 6.7B parameters trained on 1,500 billion tokens. |
| Falcon-7B-Instruct | Here | instruction/chat model | Falcon-7B finetuned on the Baize, GPT4All, and GPTeacher datasets. |
| RefinedWeb | Here | pretraining web dataset | ~600 billion "high-quality" tokens. |
| Falcon-RW-1B | Here | pretrained model | 1.3B parameters trained on 350 billion tokens. |
| Falcon-RW-7B | Here | pretrained model | 7.5B parameters trained on 350 billion tokens. |
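Since RefinedWeb weighs in at roughly 600 billion tokens, downloading it in full is rarely practical. The sketch below streams a few records with the `datasets` library; the `content` field name reflects the dataset's published schema, but verify it against the current dataset card before relying on it.

```python
# Sketch: stream a handful of RefinedWeb documents without downloading the corpus.
# Assumes the `datasets` library is installed; `content` is assumed to be the
# document text field per the dataset card.
from datasets import load_dataset

refinedweb = load_dataset("tiiuae/falcon-refinedweb", split="train", streaming=True)

for i, record in enumerate(refinedweb):
    print(record["content"][:200])  # first 200 characters of each document
    if i == 2:
        break
```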
About us
The Technology Innovation Institute (TII) is a leading global research center dedicated to pushing the frontiers of knowledge. Our teams of scientists, researchers and engineers work in an open, flexible and agile environment to deliver discovery science and transformative technologies. Our work means we will not only prepare for the future; we will create it. Working together, we are committed to inspiring innovation for a better tomorrow.
We are part of the Abu Dhabi Government's Advanced Technology Research Council, which oversees technology research in the emirate. As a disruptor in science, we set new standards and serve as a catalyst for change.
Faced with a future of limitless possibilities and supported by strategically funded investments, we are encouraging a culture of discovery. Our work reinforces Abu Dhabi and the UAE's status as an R&D hub and a global leader in breakthrough technologies.
Collections
- The RefinedWeb Dataset for Falcon LLM: Outperforming Curated Corpora with Web Data, and Web Data Only (paper, arXiv:2306.01116)
- tiiuae/falcon-refinedweb (pretraining dataset)
- tiiuae/falcon-rw-1b (text generation model)
- tiiuae/falcon-rw-7b (text generation model)