API Reference
Endpoints for orchestrating jobs, monitoring experiments, and fetching artifacts.
Experiments API
Send `POST /api/v1/experiments` with a model manifest to spin up a new training and validation cycle. Use GET on the experiment resource to fetch its status, or DELETE to cancel it.
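A minimal lifecycle sketch using Python's `requests` library; the host, auth header, per-experiment path, and the `id` response field are assumptions, not part of the documented contract:

```python
import requests

BASE = "https://api.example.com"               # assumed host
HEADERS = {"Authorization": "Bearer <token>"}  # assumed auth scheme

# Create an experiment from a model manifest.
resp = requests.post(
    f"{BASE}/api/v1/experiments",
    json={"model": "autonomous-researcher", "dataset": "demo/vision"},
    headers=HEADERS,
)
resp.raise_for_status()
experiment_id = resp.json()["id"]  # "id" is an assumed response field

# Fetch status, then cancel.
status = requests.get(f"{BASE}/api/v1/experiments/{experiment_id}", headers=HEADERS).json()
requests.delete(f"{BASE}/api/v1/experiments/{experiment_id}", headers=HEADERS)
```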
Datasets API
Upload datasets via `POST /api/v1/datasets` with signed URLs. Use the dataset ID in experiment manifests to trigger automatic preprocessing.
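As a sketch of one plausible signed-URL flow (the two-step shape and the `upload_url` and `id` field names are assumptions):

```python
import requests

BASE = "https://api.example.com"               # assumed host
HEADERS = {"Authorization": "Bearer <token>"}  # assumed auth scheme

# Register the dataset; assume the response carries a signed upload URL.
resp = requests.post(f"{BASE}/api/v1/datasets", json={"name": "demo/vision"}, headers=HEADERS)
resp.raise_for_status()
dataset = resp.json()

# PUT the file bytes to the signed URL ("upload_url" is an assumed field).
with open("train.tar.gz", "rb") as f:
    requests.put(dataset["upload_url"], data=f).raise_for_status()

# Reference dataset["id"] in an experiment manifest to trigger preprocessing.
```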
Artifacts API
List best-performing checkpoints via `GET /api/v1/artifacts?experiment_id=` and download them to redeploy or fine-tune elsewhere.
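A download sketch under the same assumptions as above; the `items` envelope and `download_url` field are illustrative, not documented:

```python
import requests

BASE = "https://api.example.com"               # assumed host
HEADERS = {"Authorization": "Bearer <token>"}  # assumed auth scheme

# List checkpoints for one experiment.
resp = requests.get(
    f"{BASE}/api/v1/artifacts",
    params={"experiment_id": "exp_123"},       # placeholder ID
    headers=HEADERS,
)
resp.raise_for_status()
artifacts = resp.json()["items"]               # "items" is an assumed envelope field

# Stream the top checkpoint to disk ("download_url" is an assumed field).
with requests.get(artifacts[0]["download_url"], stream=True) as dl:
    dl.raise_for_status()
    with open("checkpoint.bin", "wb") as f:
        for chunk in dl.iter_content(chunk_size=1 << 20):
            f.write(chunk)
```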
Notifications API
Subscribe webhooks to `/api/v1/events` so your infrastructure gets notified when runs complete, regress, or enter human-review state.
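A minimal subscription sketch, using the `POST /api/v1/webhooks` route documented below and the event names it lists; the request body shape (`url`, `events`) is an assumption:

```python
import requests

BASE = "https://api.example.com"               # assumed host
HEADERS = {"Authorization": "Bearer <token>"}  # assumed auth scheme

# Register a receiver for completion and governance events.
requests.post(
    f"{BASE}/api/v1/webhooks",
    json={
        "url": "https://hooks.internal.example/experiments",  # your receiver
        "events": ["experiment.completed", "governance.requested"],
    },
    headers=HEADERS,
).raise_for_status()
```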
Rate limits
Default tenant limits allow 200 experiment mutations per minute and 10 concurrent uploads. Contact support to raise limits for burst workloads.
Errors and retries
All endpoints return structured error codes. Retry 5xx errors with exponential backoff, and quote the `error_id` when contacting support for faster resolution.
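One way to wire that up (the host and the JSON error shape beyond `error_id` are assumptions):

```python
import time

import requests

def request_with_backoff(method, url, retries=5, **kwargs):
    """Retry 5xx responses with exponential backoff; 4xx errors are returned as-is."""
    for attempt in range(retries):
        resp = requests.request(method, url, **kwargs)
        if resp.status_code < 500:
            return resp
        time.sleep(2 ** attempt)  # 1s, 2s, 4s, ...
    return resp

resp = request_with_backoff("GET", "https://api.example.com/api/v1/experiments")
if not resp.ok:
    # Quote this ID to support for faster resolution.
    print("failed:", resp.json().get("error_id"))
```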
Pagination
List endpoints use cursor-based pagination. Pass the `next_token` from the previous response to fetch the next page of artifacts or experiments.
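A cursor-following sketch; the `items` envelope field is an assumption, while `next_token` comes from this section:

```python
import requests

def list_all(url, headers):
    """Yield every item across pages by following next_token cursors."""
    token = None
    while True:
        params = {"next_token": token} if token else {}
        page = requests.get(url, params=params, headers=headers).json()
        yield from page.get("items", [])   # "items" is an assumed envelope field
        token = page.get("next_token")
        if not token:                      # no cursor means this was the last page
            return

for artifact in list_all("https://api.example.com/api/v1/artifacts",
                         {"Authorization": "Bearer <token>"}):
    print(artifact)
```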
Webhooks
Create webhooks with `POST /api/v1/webhooks` and subscribe to events like `experiment.completed` or `governance.requested`. Your receiver must respond within five seconds.
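Because the five-second deadline is easy to miss with inline processing, a receiver can acknowledge first and defer work. A sketch using Flask (our choice, not mandated by the API; the payload's `type` field is an assumption):

```python
import queue
import threading

from flask import Flask, request

app = Flask(__name__)
events = queue.Queue()

def worker():
    # Heavy handling happens off the request thread.
    while True:
        event = events.get()
        print("handling", event.get("type"))  # replace with real processing

threading.Thread(target=worker, daemon=True).start()

@app.route("/hooks/experiments", methods=["POST"])
def receive():
    events.put(request.get_json())  # hand off immediately
    return "", 204                  # fast ack beats the five-second deadline
```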
Sample request / response
Use this placeholder payload to explore the API: `{ "model": "autonomous-researcher", "dataset": "demo/vision" }`. The API responds with queued job IDs and estimated completion windows.
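Sent from Python, assuming the payload targets the experiments route (the host, auth scheme, and response field names below are assumptions):

```python
import requests

resp = requests.post(
    "https://api.example.com/api/v1/experiments",  # assumed host and route
    json={"model": "autonomous-researcher", "dataset": "demo/vision"},
    headers={"Authorization": "Bearer <token>"},   # assumed auth scheme
)
print(resp.json())
# Illustrative shape only; the exact field names are assumptions:
# {"job_ids": ["job_123"], "estimated_completion": "2024-05-01T12:00:00Z"}
```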
SDK clients
The TypeScript, Python, and Go SDKs each mirror these REST routes, handle retries automatically, and expose friendly helpers for uploads.
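To give a feel for that helper surface, a hypothetical Python snippet; the package name and every method below are illustrative, not documented:

```python
# Hypothetical SDK usage; none of these names are part of the published API.
from platform_sdk import Client  # assumed package name

client = Client(api_key="<token>")
dataset = client.datasets.upload("train.tar.gz")  # wraps the signed-URL flow
exp = client.experiments.create(
    model="autonomous-researcher",
    dataset=dataset.id,
)
exp.wait()  # polling and retries handled by the client
```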