A family of pretrained and fine-tuned language models in sizes from 7 to 70 billion parameters.
How does this compare to Falcon 40B? Do we know yet?