A family of pretrained and fine-tuned language models in sizes from 7 to 70 billion parameters.

  • holycrap
    1 year ago

    How does this compare to Falcon 40B? Do we know yet?