πŸ‘‘ Falcon LLM beats LLaMA

Introducing Falcon-40B, a new language model trained on 1,000B tokens.

What's included:

– 7B and 40B models made available by TII
– surpasses LLaMA 65B and other models such as MPT and RedPajama on the Open LLM Leaderboard
– architecture is optimized for inference, with FlashAttention and multi-query attention
– Instruct variants are also available
– license allows personal and research use, and commercial use with some limitations
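The models listed above are published on the Hugging Face Hub, so a quick way to try them is via the transformers library. Below is a minimal sketch, assuming the Hub model ids `tiiuae/falcon-7b-instruct` and `tiiuae/falcon-40b-instruct`; `trust_remote_code=True` is needed because Falcon ships its custom architecture code (the multi-query attention mentioned above) with the checkpoint. The `load_falcon` helper name is my own, not part of any API.

```python
# Sketch of loading a Falcon model with Hugging Face transformers.
# Model ids assumed from the Hugging Face Hub; the 40B variant needs
# roughly 90 GB of accelerator memory, so the 7B id is the default here.
MODEL_ID = "tiiuae/falcon-7b-instruct"  # or "tiiuae/falcon-40b-instruct"

def load_falcon(model_id: str = MODEL_ID):
    """Return (tokenizer, model) for a Falcon checkpoint.

    Imports are done inside the function so the sketch can be read
    without transformers installed; the (large) download only happens
    when the function is actually called.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        trust_remote_code=True,  # Falcon's custom modeling code
        device_map="auto",       # spread layers across available devices
        torch_dtype="auto",      # use the checkpoint's native dtype
    )
    return tokenizer, model
```

Once loaded, generation follows the usual pattern: tokenize a prompt, call `model.generate(...)`, and decode the output with the tokenizer.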

❀️ If you want to support the channel ❀️
Support here:
Patreon –
Ko-Fi –

Amazon Affiliate Disclaimer

β€œAs an Amazon Associate I earn from qualifying purchases.”

Learn more about the Amazon Affiliate Program