# smolmodels ✨

[](https://pypi.org/project/smolmodels/) [](https://discord.gg/3czW7BMj)

Build specialized ML models using natural language.

`smolmodels` is a Python library that lets you create machine learning models by describing what you want them to do in plain English. Instead of wrestling with model architectures and hyperparameters, you simply describe your intent, define your inputs and outputs, and let `smolmodels` handle the rest...

## How Does It Work?

`smolmodels` combines graph search with LLMs to generate candidate models that meet the specified intent, then selects the best model based on performance and constraints. The process consists of four main phases:

1. **Intent Analysis**: the problem description is analyzed to understand what kind of model is needed.
2. **Data Generation**: synthetic data can be generated for model building.
3. **Model Building**:
   1. Selects appropriate model architectures
   2. Handles feature engineering
   3. Manages training and validation
4. **Validation & Refinement**: the model is tested against constraints and refined using directives...

## Key Features

### 📝 Natural Language Intent

Models are defined using natural language descriptions and schema specifications...

### 🎲 Data Generation

Built-in synthetic data generation for training and validation...
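The intent-plus-schema pattern described above can be sketched with a small runnable stand-in. Note that `ModelSpec` and its field names here are purely illustrative, not the actual `smolmodels` API:

```python
from dataclasses import dataclass

# Hypothetical stand-in illustrating the "describe intent, define inputs
# and outputs" workflow; it does not import or call smolmodels itself.
@dataclass
class ModelSpec:
    intent: str           # plain-English description of what the model should do
    input_schema: dict    # field name -> Python type for model inputs
    output_schema: dict   # field name -> Python type for model outputs

spec = ModelSpec(
    intent="Predict house prices from size and location",
    input_schema={"square_feet": float, "city": str},
    output_schema={"predicted_price": float},
)
```

The point of the pattern is that everything the library needs is captured declaratively, leaving architecture selection and training to the build process.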
### 🎯 Directives for Fine-Grained Control (Not Yet Implemented - Coming Soon)

Guide the model building process with high-level directives:

```python
from smolmodels import Directive

model.build(directives=[
    Directive("Optimize for inference speed"),
    Directive("Prioritize interpretability")
])
```

### ✅ Optional Constraints (Not Yet Implemented - Coming Soon)

Optional declarative constraints for model validation:

```python
from smolmodels import Constraint

# Ensure predictions are always positive
positive_constraint = Constraint(
    lambda inputs, outputs: outputs["predicted_price"] > 0,
    description="Predictions must be positive"
)

model = Model(
    intent="Predict house prices...",
    constraints=[positive_constraint],
    ...
)
```

### 🌐 Multi-Provider Support

You can use multiple LLM providers as a backend for model generation. You can specify the provider an