Small Specialized Language Model Experts
Our models are small but mighty!
We have developed over 50 small, specialized, CPU-friendly language models, ranging from 1B to 7B parameters, available on Hugging Face.
Our latest innovation is the SLIM (Structured Language Instruction Model) portfolio: the first small language models with function-calling, structured output, designed for use in multi-model, Agent-based workflows.
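SLIM models emit structured, dictionary-style output rather than free text, which is what lets downstream agent code parse the result and branch on it. A minimal sketch of that consumption pattern, assuming a response string in the dictionary form SLIM models typically produce (the response text and routing decision below are hypothetical examples, not an actual model call):

```python
import ast

def parse_slim_output(response_text: str) -> dict:
    """Parse a SLIM-style structured response into a Python dict.

    SLIM models return dictionary-like strings such as
    "{'sentiment': ['positive']}", so agent code can consume the
    result programmatically instead of scraping free text.
    """
    try:
        parsed = ast.literal_eval(response_text.strip())
    except (ValueError, SyntaxError):
        return {}  # fall back to an empty dict on malformed output
    return parsed if isinstance(parsed, dict) else {}

# Hypothetical response from a sentiment-classification SLIM model
raw = "{'sentiment': ['positive']}"
result = parse_slim_output(raw)

# Agent code can now branch on structured keys
if result.get("sentiment") == ["positive"]:
    next_step = "route_to_followup_workflow"  # illustrative routing choice
```

The structured contract is the point: a malformed or free-text response degrades to an empty dict rather than crashing the workflow.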
We custom-train models for specialized use cases in data-sensitive, highly-regulated industries such as financial services, legal and insurance to run in private cloud or on-prem.
Find us on Hugging Face! 🤗
BLING Models
Small, CPU-friendly, RAG-optimized, instruction-following models in the 1B-3B parameter range.
Great for quickly prototyping POCs on a laptop, with fast inference times.
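RAG-optimized instruct models like these are prompted with a source passage followed by a question about it. A minimal sketch of assembling such a context-grounded prompt, assuming a "&lt;human&gt;: … &lt;bot&gt;:" chat wrapper (the exact template is an assumption; verify it against the specific model card):

```python
def build_rag_prompt(context: str, question: str) -> str:
    """Assemble a RAG-style prompt: source passage first, then the question.

    Assumes a "<human>: ... <bot>:" wrapper; check the model card of
    the specific model for its exact prompt template.
    """
    return f"<human>: {context}\n{question}\n<bot>:"

passage = "The lease term is 24 months, commencing on January 1, 2024."
prompt = build_rag_prompt(passage, "What is the lease term?")
# The assembled prompt is then passed to the model's generate call.
```

Grounding the question in a supplied passage, rather than relying on the model's own knowledge, is what keeps small 1B-3B models accurate enough for fact-based POC workflows.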
Industry- and domain-specialized fine-tuned BERT embedding models.
Specialized domains include insurance, SEC documents, contracts, and asset management.
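Embedding models map text to fixed-length vectors; retrieval then ranks passages by cosine similarity to the query vector, which is where domain-tuned embeddings pay off. A minimal sketch of that ranking step, assuming the embeddings have already been computed (the vectors below are toy values, not real model output):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional embeddings standing in for real model output
query_vec = [0.1, 0.3, 0.5, 0.1]
passages = {
    "claims_clause": [0.1, 0.3, 0.5, 0.2],  # close to the query
    "boilerplate":   [0.9, 0.0, 0.1, 0.0],  # far from the query
}

# Rank passage IDs by similarity to the query, best match first
ranked = sorted(
    passages,
    key=lambda pid: cosine_similarity(query_vec, passages[pid]),
    reverse=True,
)
```

In production the same ranking is usually delegated to a vector database, but the similarity metric underneath is this one.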
SLIM GGUF
Quantized GGUF 'tool' implementations of SLIM Models.
Provides 'gguf' and 'tool' versions of many SLIM, DRAGON, and BLING models, optimized for CPU deployment.
GGUF generative model class, with support for Stable-LM-3B, CUDA build options, and finer control over sampling strategies.
Expert custom model training services for your company and your domain.
Full-service custom model fine-tuning, from dataset preparation to training (and beyond).
Services cover small specialized models (7B parameters and under) and embedding models.