Overview of Dracarys
Abacus.ai has launched Dracarys, a new family of open large language models (LLMs) designed specifically for coding tasks. The initiative follows the company’s earlier release of the Smaug model and aims to enhance the capabilities of existing open-source coding models. Dracarys applies the company’s “Dracarys recipe,” which combines optimized fine-tuning techniques with a carefully curated training dataset to improve coding performance. The initial release focuses on models in the 70-billion-parameter class, including fine-tunes of Qwen-2 and Llama 3.1.
Key Features and Benefits
- Dracarys enhances the coding abilities of open-source LLMs, with the fine-tuned models outperforming their base versions.
- The recipe raises Llama 3.1 70B’s LiveCodeBench coding score from 32.67 to 35.23.
- It delivers an even larger gain for Qwen-2 72B, lifting its score from 32.38 to 38.95.
- The models are available on Hugging Face and through Abacus.ai’s Enterprise offerings, catering to businesses concerned about data privacy.
Significance of Dracarys in the Market
The launch of Dracarys is timely, as demand for generative AI in coding is growing rapidly. With established players such as GitHub Copilot, Tabnine, and Replit, competition is fierce. Dracarys stands out by focusing on open-source models, making it an attractive option for enterprises wary of relying on closed-source models like Claude 3.5 Sonnet. As Abacus.ai plans to expand the Dracarys family with additional models, it aims to solidify its position in the coding AI market and give developers powerful tools to streamline their work.