Small Specialized Language Model Experts
Our models are small but mighty!
We have developed over 150 small, specialized, CPU- and laptop-friendly language models, ranging from 1B to 22B parameters, available on Hugging Face.
Our latest innovation is the SLIM model portfolio (Structured Language Instruction Models) - the world's first function-calling, structured-output small language models, developed to be used in multi-model, agent-based workflows.
We custom-train and optimize models for specialized use cases in data-sensitive, highly regulated industries such as financial services, legal, and insurance, to run in private cloud or on-prem.
Find us on Hugging Face! 🤗
LLMWare.ai
We are open-source. Star us on GitHub ⭐️
OUR MODELS
Explore our Hugging Face Models
DRAGON Models
Production-grade, RAG-optimized 6-7B parameter models - "Delivering RAG on ...". Fine-tuned for fact-based question answering in RAG workflows: they answer from the context provided, including yes/no and multiple-choice questions, and are trained to reduce hallucinations.
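As a quick illustration, here is a minimal sketch of asking a DRAGON model a fact-based question against a supplied passage using the llmware Python library. The model name and keyword arguments are assumptions based on the published llmware examples, so check the catalog for the exact checkpoint you want.

```python
from llmware.models import ModelCatalog

# Assumed model name - see the llmware Hugging Face page for current DRAGON checkpoints
model = ModelCatalog().load_model("llmware/dragon-yi-6b-v0")

context = ("The services agreement was signed on March 3, 2023, runs for a term of 24 months, "
           "and has a total contract value of $250,000.")

# add_context grounds the answer in the supplied passage (the core RAG pattern)
response = model.inference("What is the total contract value?", add_context=context)

print(response["llm_response"])
```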
SLIM Models
Function-calling, structured-output models for classification and clustering tasks. Designed to be used in an AI agent workflow, either alone or stacked together - 10+ Structured Language Instruction Models (SLIMs) covering almost every common classification task.
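For example, a single SLIM "tool" can be called directly for structured output. A minimal sketch with the llmware library follows; the catalog name and return keys are assumptions, so confirm them against the current model catalog.

```python
from llmware.models import ModelCatalog

# Assumed catalog name - the "-tool" suffix denotes the quantized, CPU-ready SLIM build
sentiment_model = ModelCatalog().load_model("slim-sentiment-tool")

text = ("Quarterly results were disappointing, with revenue down 12% year over year "
        "and margins under continued pressure.")

# function_call() returns structured output (a python dict) rather than free-form text,
# which is what makes SLIMs easy to stack in multi-step agent workflows
response = sentiment_model.function_call(text)

print(response["llm_response"])   # e.g., {"sentiment": ["negative"]}
```

The same pattern stacks naturally: load several SLIM tools (sentiment, topics, entities, and so on) and run them in sequence over the same text inside an agent loop.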
Custom Models
Expert custom model training services for your company and your domain. Full-service custom model fine-tuning from datasets to training (and beyond). Services include small specialized models (7B and under) and embedding models.
Industry BERT Models
BERT embedding models fine-tuned for industry and specialized domains, including insurance, SEC documents, contracts, and asset management.
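A minimal sketch of generating embeddings with one of the industry BERT models via Hugging Face transformers and mean pooling; the repo id is an assumption, so confirm the exact name on the llmware Hugging Face page.

```python
from transformers import AutoTokenizer, AutoModel
import torch

# Assumed repo id - check the llmware Hugging Face page for the exact industry-bert names
model_id = "llmware/industry-bert-insurance-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

sentences = ["The policy excludes flood damage.",
             "Water damage from storms is not covered."]

inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Simple mean pooling over token embeddings to get one vector per sentence
mask = inputs["attention_mask"].unsqueeze(-1).float()
embeddings = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)

# Cosine similarity between the two domain-specific sentence embeddings
similarity = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"cosine similarity: {similarity.item():.3f}")
```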
SLIM GGUF
Quantized GGUF "tool" implementations of the SLIM models. We provide "gguf" and "tool" versions of many SLIM, DRAGON, and BLING models, optimized for CPU deployment. The GGUF generative model class adds support for Stable-LM-3B, CUDA build options, and better control over sampling strategies.
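As a sketch, the GGUF "tool" builds load through the same catalog interface. The method name, model name, and keyword arguments below are assumptions drawn from recent llmware releases, so adjust to your installed version.

```python
from llmware.models import ModelCatalog

# Browse the catalog for locally runnable GGUF / "tool" builds
# (method and key names assumed from recent llmware versions)
for entry in ModelCatalog().list_generative_local_models()[:10]:
    print(entry["model_name"])

# Load a quantized DRAGON gguf build on CPU; temperature / sample kwargs give tighter
# control over the sampling strategy (model name and kwargs are assumptions)
model = ModelCatalog().load_model("dragon-yi-answer-tool", temperature=0.0, sample=False)

response = model.inference("What is the payment deadline?",
                           add_context="Payment is due within 30 days of the invoice date.")
print(response["llm_response"])
```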
BLING Models
Small, RAG-optimized, instruct-following 1B-3B parameter models that run on CPU. Great for quickly prototyping POCs on a laptop, with fast inference times.
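A minimal laptop-POC sketch with the llmware Prompt class and a small BLING model; the model name and prompt_name value are assumptions based on the llmware fast-start examples.

```python
from llmware.prompts import Prompt

# Assumed model name - any of the 1B-3B BLING checkpoints is suitable for a quick CPU POC
prompter = Prompt().load_model("llmware/bling-1b-0.1")

context = ("Total revenue for the quarter was $14.2 million, an increase of 8% over "
           "the prior-year period, driven primarily by growth in subscription services.")

# Ask a fact-based question grounded in the passage (prompt_name value is an assumption)
response = prompter.prompt_main("What was total revenue for the quarter?",
                                context=context,
                                prompt_name="default_with_context")

print(response["llm_response"])
```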
MODELS THAT PROVE IT
Supported Model Families in Model HQ
Qwen 2.5 Instruct 14B
Qwen 2 Based Models
Llama 3 Based Models
Phi-3 Based Models
Google Gemma 2 Based Models
Mistral Small Model 22B
Mistral 7B Based Models
StableLM 3B Based Models
Yi 6B Based Models
Yi 9B Based Models
DRAGON RAG Model
SLIM Function Calling Models
Try MODEL HQ by LLMWare.ai and start using AI models on your Intel AI PCs today
If you need any assistance, feel free to reach out to us!