MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs
Jonathan Frankle, Chief Scientist @MosaicML, just announced the latest entry in the MosaicML Foundation Series: MPT-7B.
MPT is here! Check out our shiny new LLMs, open-source w/commercial license. The base MPT-7B model has 7B params, was trained on 1T tokens, and reaches LLaMA-7B quality. We also created Instruct (commercial), Chat,