Author: jasondavies Posted: Wednesday, March 27, 2024

DBRX: A new open LLM

Zumi Article summary

The linked article announces Databricks' new open LLM, DBRX, which aims to give enterprises capabilities previously available only through closed-model APIs. DBRX uses a fine-grained mixture-of-experts (MoE) architecture and is especially capable at programming, surpassing specialized models such as CodeLLaMA-70B, in addition to its strength as a general-purpose LLM. Thanks to its MoE architecture, DBRX advances the state of the art in efficiency among open models, with inference up to 2x faster than LLaMA2-70B. DBRX is available to Databricks customers via APIs starting today, and the weights of both the base model and the finetuned model are available on Hugging Face under an open license.
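For readers unfamiliar with the term, "fine-grained" MoE means the model is split into many small expert feed-forward networks, with a learned router activating only the top-k experts per token, so only a fraction of the total parameters run on each forward pass. The following is a minimal illustrative sketch of that routing idea, not DBRX's actual implementation; all shapes, names, and the toy experts are assumptions for demonstration.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Illustrative sketch of top-k MoE routing (not DBRX's real code).

    x:       (d,) token embedding
    gate_w:  (d, n_experts) router weights
    experts: list of callables, each a small feed-forward "expert"
    Only k of the n experts are evaluated for this token.
    """
    logits = x @ gate_w                       # router score per expert
    topk = np.argsort(logits)[-k:]            # indices of the k best experts
    w = np.exp(logits[topk] - logits[topk].max())
    w /= w.sum()                              # softmax over selected experts only
    # weighted combination of the chosen experts' outputs
    return sum(wi * experts[i](x) for wi, i in zip(w, topk))

# Toy usage: 8 tiny linear "experts", route each token to 2 of them.
rng = np.random.default_rng(0)
d, n = 16, 8
gate_w = rng.normal(size=(d, n))
experts = [(lambda W: (lambda x: x @ W))(rng.normal(size=(d, d)))
           for _ in range(n)]
y = moe_forward(rng.normal(size=d), gate_w, experts, k=2)
print(y.shape)
```

The efficiency claim in the article follows from this structure: per token, compute scales with the k active experts rather than the full parameter count.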
