- cross-posted to:
- [email protected]
Orca 2 released by Microsoft!
Full weights here:
https://huggingface.co/microsoft/Orca-2-7b
https://huggingface.co/microsoft/Orca-2-13b
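For anyone who wants to try the full weights directly, here's a minimal sketch with Hugging Face transformers. The ChatML-style prompt format is my assumption based on the model card, and the VRAM/dtype settings are just a starting point:

```python
# Minimal sketch: load Orca-2-7b with transformers and run a single prompt.
# Assumes a GPU with enough VRAM for fp16; adjust dtype/device_map as needed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Orca-2-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# ChatML-style prompt (assumption: matches the format on the model card).
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nSummarize what Orca 2 is in one sentence.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(
    output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
))
```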
My own exllamav2 quants here:
https://huggingface.co/bartowski/Orca-2-7b-exl2
https://huggingface.co/bartowski/Orca-2-13b-exl2
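If you grab one of the exl2 quants, here's a rough sketch of loading it with the exllamav2 Python API. The local directory path and sampler settings are placeholders, and the library's API has been moving fast, so check the exllamav2 examples if this doesn't match your version:

```python
# Rough sketch: load an exl2 quant with exllamav2 and generate a response.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "./Orca-2-7b-exl2"  # placeholder: path to the downloaded quant
config.prepare()

model = ExLlamaV2(config)
model.load()
tokenizer = ExLlamaV2Tokenizer(config)
cache = ExLlamaV2Cache(model)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7
settings.top_p = 0.9

prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nWhat is Orca 2?<|im_end|>\n"
    "<|im_start|>assistant\n"
)
print(generator.generate_simple(prompt, settings, 200))
```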
GGUF quants from TheBloke (the repo there also links to GPTQ/AWQ versions):
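Once you have a GGUF file, a quick sketch of running it with llama-cpp-python. The filename is a placeholder for whichever quantization level you download, and the prompt format is the same ChatML-style assumption as above:

```python
# Sketch: run a GGUF quant of Orca 2 with llama-cpp-python.
from llama_cpp import Llama

# Placeholder filename; point this at the quant you actually downloaded.
llm = Llama(model_path="./orca-2-7b.Q4_K_M.gguf", n_ctx=4096, n_gpu_layers=-1)

prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nGive me one interesting fact about orcas.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

out = llm(prompt, max_tokens=128, stop=["<|im_end|>"])
print(out["choices"][0]["text"])
```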
How does it compare to Mistral? That’s the best-performing 7B model, and it’s suspiciously missing from this report.
I’m looking forward to trying it today. Based on the Orca 2 paper, I think it might make a good RAG model, but testing will be needed.
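For anyone curious, the kind of RAG usage I have in mind is just stuffing retrieved passages into the prompt. A sketch of that, where `retrieved_chunks` is a stand-in for whatever your retriever/vector store returns and the ChatML-style format is again an assumption from the model card:

```python
# Sketch of a RAG-style prompt for Orca 2: retrieved passages go into the user turn.
def build_rag_prompt(question: str, retrieved_chunks: list[str]) -> str:
    # Number the chunks so the answer can be checked against its sources.
    context = "\n\n".join(f"[{i + 1}] {chunk}" for i, chunk in enumerate(retrieved_chunks))
    return (
        "<|im_start|>system\n"
        "Answer the question using only the provided context. "
        "If the answer is not in the context, say so.<|im_end|>\n"
        "<|im_start|>user\n"
        f"Context:\n{context}\n\nQuestion: {question}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

# Example usage with dummy chunks:
print(build_rag_prompt(
    "What sizes does Orca 2 come in?",
    ["Orca 2 was released by Microsoft.",
     "It comes in 7B and 13B parameter sizes."],
))
```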