Data Intake + Mapping
- Ingest local files, URLs, Hugging Face datasets, and Kaggle sources through one interface.
- Generate canonical schema mappings and reusable profiles.
- Run validation checks before a dataset becomes visible to training.
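The bullets above can be pictured as a small data structure. Below is a minimal sketch of what a reusable mapping profile with per-field validation rules might look like; the field names (`ticket_body`, `agent_reply`) and the `apply_profile` helper are hypothetical illustrations, not BrewSLM APIs.

```python
# Hypothetical sketch of a reusable schema-mapping profile: source columns
# map to canonical fields, and simple validation rules run on each row
# before it becomes visible to training.
profile = {
    "name": "support-tickets-v1",
    "mapping": {"ticket_body": "text", "agent_reply": "response"},
    "rules": {
        "text": lambda v: bool(v and v.strip()),
        "response": lambda v: bool(v and v.strip()),
    },
}

def apply_profile(row, profile):
    """Map a raw row to canonical fields; return None if any rule fails."""
    mapped = {canon: row.get(src) for src, canon in profile["mapping"].items()}
    if all(rule(mapped.get(field)) for field, rule in profile["rules"].items()):
        return mapped
    return None
```

Storing the profile separately from the dataset is what makes the mapping reusable: the same profile can be replayed against a refreshed source, and field decisions stay traceable.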
Capabilities
BrewSLM delivers immediate productivity through autopilot presets while preserving full control over data adapters, runtime preflight, training behavior, and deployment packaging.
Capability Clusters
Control Depth
| Control Area | Autopilot Default | Manual Override | Outcome |
|---|---|---|---|
| Schema Mapping | Suggested canonical mapping | Edit mapping and validation rules | Faster setup with traceable field decisions |
| Model Selection | Benchmark-informed shortlist | Pin exact model and constraints | Balanced quality, cost, and consistency |
| Training Config | Generated train plan | Tune optimizer, schedule, stop policy | Safe baseline with targeted optimization |
| Runtime Placement | Recommended local/cloud placement | Force local or burst by policy | Operational flexibility by workload |
| Release Gating | Default quality checks | Define custom promotion thresholds | Reduced regression risk on deployment |
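The Release Gating row above amounts to a threshold comparison before promotion. The sketch below shows one plausible shape for such a gate; the metric names, the `_rate`-suffix convention for ceiling metrics, and `passes_gate` itself are assumptions for illustration, not BrewSLM's actual gating logic.

```python
# Hypothetical promotion gate: a candidate's eval metrics must clear every
# configured threshold before the release is promoted.
DEFAULT_THRESHOLDS = {"eval_accuracy": 0.85, "toxicity_rate": 0.01}

def passes_gate(metrics, thresholds=DEFAULT_THRESHOLDS):
    """Floor metrics must meet or exceed the limit; any metric whose name
    ends in '_rate' is treated as a ceiling and must stay at or below it.
    A missing metric fails the gate outright."""
    for name, limit in thresholds.items():
        value = metrics.get(name)
        if value is None:
            return False
        if name.endswith("_rate"):
            if value > limit:
                return False
        elif value < limit:
            return False
    return True
```

Failing closed on a missing metric is the conservative choice for a release gate: an eval job that silently dropped a metric should block promotion, not slip through.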
Automation Playbooks
Check GPU/dependency/secret readiness before scheduled automation starts.
$ ./brewslm doctor --project 1
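To make the doctor step concrete, here is a minimal sketch of the kind of checks such a command performs: verifying required executables and secrets exist before automation starts. The `readiness_report` helper and its check names are hypothetical; BrewSLM's actual doctor checks (including GPU probing, which is omitted here) are not specified by this snippet.

```python
import os
import shutil

# Hypothetical readiness probe: confirm required binaries are on PATH and
# required secrets are set in the environment, returning a pass/fail map.
def readiness_report(executables, secret_env_vars):
    """Return {check_name: bool} for required binaries and env secrets."""
    report = {}
    for exe in executables:
        report[f"bin:{exe}"] = shutil.which(exe) is not None
    for var in secret_env_vars:
        report[f"secret:{var}"] = bool(os.environ.get(var))
    return report
```

Running checks like these up front turns a mid-run failure (missing driver, expired token) into a fast, actionable error before the scheduled job consumes any compute.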
Kick off a guided one-click run from intent text with project defaults.
$ ./brewslm train --project 1 --autopilot --one-click --intent "Refresh support model"
Tune for device constraints, then package an export candidate for serving.
$ ./brewslm optimize --project 1 --target mobile_iphone15
$ ./brewslm export --project 1 --format huggingface --target vllm
Capability Test Drive
$ ./brewslm project create --name demo-slm --template general
$ ./brewslm dataset import --project 1 --sample general-chat-v1
$ ./brewslm preflight --project 1 --task causal_lm
$ ./brewslm train --project 1 --autopilot --one-click
$ ./brewslm export --project 1 --format huggingface --target vllm