Introducing Falcon-40B, a new language model trained on 1,000B tokens.
What's included:
– 7B and 40B models released by the Technology Innovation Institute (TII)
– surpasses LLaMA-65B and other models such as MPT and RedPajama on the Open LLM Leaderboard
– architecture optimized for inference, with FlashAttention and multi-query attention
– instruct-tuned versions available
– license permits personal and research use, and commercial use with limitations
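The instruct model can be tried out with the Hugging Face `transformers` library. Below is a minimal sketch: the model ID is the one TII published on the Hub, but the generation settings (sampling, token count) are illustrative assumptions, not recommended values. Since the weights are many gigabytes, the load/generate step is gated behind an environment variable.

```python
# Minimal sketch of querying Falcon-7B-Instruct via transformers.
# Generation parameters below are illustrative assumptions.
import os

MODEL_ID = "tiiuae/falcon-7b-instruct"  # tiiuae/falcon-40b-instruct needs far more VRAM

def build_prompt(instruction: str) -> str:
    # Assumption: a plain instruction string is enough for the
    # instruct models; no special chat template is applied here.
    return instruction.strip() + "\n"

if os.environ.get("RUN_FALCON_DEMO"):
    import torch
    from transformers import AutoTokenizer, pipeline

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    generator = pipeline(
        "text-generation",
        model=MODEL_ID,
        tokenizer=tokenizer,
        torch_dtype=torch.bfloat16,  # halves memory vs. float32
        device_map="auto",           # spread layers across available devices
    )
    out = generator(
        build_prompt("Explain multi-query attention in one sentence."),
        max_new_tokens=64,
        do_sample=True,
        top_k=10,
    )
    print(out[0]["generated_text"])
```

Set `RUN_FALCON_DEMO=1` in your environment to actually download the weights and run generation.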
❤️ If you want to support the channel ❤️
Support here:
Patreon –
Ko-Fi –