MLOps

Track Experiments
Ship Better Models

Full reproducibility with metrics, parameters, and artifacts. Compare runs, collaborate with your team, and version models automatically.

Works with:
PyTorch
TensorFlow
Ultralytics
Hugging Face
+more
Logging

Log everything, miss nothing

Capture metrics, parameters, artifacts, and evaluations with a simple API. Full lineage tracking with minimal integration code.

Scalar: single values (loss, accuracy)
Line: time-series metrics
Image: visualizations & samples
Table: structured data
Histogram: distributions
Confusion Matrix: classification results

train.py
from picsellia import Client

client = Client()  # or Client(api_token=...)
project = client.get_project("my-project")
experiment = project.get_experiment("my-experiment")

# Log hyperparameters
experiment.log_parameters({
    "learning_rate": 1e-4,
    "batch_size": 32,
})

# Log metrics during training
for epoch in range(epochs):
    # ... forward/backward pass producing `loss` ...
    experiment.log("train_loss", loss.item())

Artifact Storage

Store checkpoints, configs, and outputs

experiment.store("model.pt")

Dataset Attachment

Link training data for reproducibility

experiment.attach_dataset(dataset_version)

Model Export

Push to model registry with one call

experiment.export_as_model("my-model")
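Taken together, these three calls make a natural end-of-training routine. A minimal sketch: `finalize_experiment` is a hypothetical wrapper of our own, not an SDK function, while the three method calls inside it are exactly the ones shown above.

```python
def finalize_experiment(experiment, checkpoint_path, dataset_version, model_name):
    """Hypothetical end-of-run helper chaining the calls shown above."""
    experiment.store(checkpoint_path)           # upload the checkpoint artifact
    experiment.attach_dataset(dataset_version)  # link training data for reproducibility
    experiment.export_as_model(model_name)      # push to the model registry
```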
Compare

Compare trainings instantly

See the exact training distribution, hyperparameters, and augmentations behind every performance change. Find your best model faster.

Side-by-side metric comparison
Parameter diff highlighting
Dataset version tracking
Collaborative comments
EXPERIMENT COMPARISON

Name     mAP   Loss  LR    Epochs  Status
exp-001  0.87  0.09  1e-4  100     completed
exp-002  0.82  0.12  1e-3  80      completed
exp-003  0.91  0.07  5e-5  150     running
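Parameter diff highlighting boils down to comparing two runs' hyperparameter dicts and keeping only the keys whose values changed. A minimal sketch in plain Python (not the Picsellia implementation), using values from the exp-001 and exp-002 rows above:

```python
def param_diff(a: dict, b: dict) -> dict:
    """Return {key: (value_in_a, value_in_b)} for every key that differs."""
    keys = set(a) | set(b)
    return {k: (a.get(k), b.get(k)) for k in keys if a.get(k) != b.get(k)}

diff = param_diff(
    {"learning_rate": 1e-4, "batch_size": 32, "epochs": 100},  # exp-001
    {"learning_rate": 1e-3, "batch_size": 32, "epochs": 80},   # exp-002
)
# batch_size is unchanged, so only learning_rate and epochs are reported:
# {"learning_rate": (1e-4, 1e-3), "epochs": (100, 80)}
```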
CV Engine

Build training pipelines with ease

Picsellia CV Engine is a modular toolkit for constructing computer vision workflows. Composable steps, framework extensions, and CLI automation.

Training Pipelines

Data → Model → Results. Streamlined training processes with built-in logic.

Ultralytics, PyTorch

Processing Pipelines

Dataset transformation, pre-annotation, and data cleaning operations.

SAM2, CLIP

Framework Extensions

Pluggable architecture supporting multiple training libraries.

Grounding DINO
terminal
$ pip install picsellia-cv-engine
# Initialize a new training pipeline
$ pxl-pipeline init --type training
# Run locally for testing
$ pxl-pipeline test
# Deploy to Picsellia cloud
$ pxl-pipeline deploy --gpus 1
CLI + Python decorators
View Documentation →
MODEL REGISTRY: yolo-defect-detector

Version      Framework    Updated      mAP   Size
v3 (latest)  Ultralytics  2 hours ago  0.91  42MB
v2           Ultralytics  1 day ago    0.87  42MB
v1           Ultralytics  3 days ago   0.82  41MB
Model Registry

Version your models automatically

Export experiments to the model registry with a single call. Track versions, compare performance, and deploy with confidence.

Automatic version incrementation
Framework metadata (TensorFlow, PyTorch, etc.)
Docker configuration for deployment
Lineage to training experiment
Evaluation

COCO metrics built-in

Add predictions, compare against ground truth, and compute standard evaluation metrics automatically.

mAP@50: 0.91 (Mean Average Precision at IoU 0.50)
mAP@50:95: 0.68 (strict mAP, averaged over IoU thresholds 0.50 to 0.95)
Precision: 0.89 (True Positives / Predicted)
Recall: 0.87 (True Positives / Actual)

Supports rectangles, polygons, classifications, and keypoints
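The precision and recall definitions above are easy to sanity-check with plain counts. A minimal sketch (not the Picsellia implementation; single-class counts chosen to roughly reproduce the numbers shown):

```python
def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """Precision = TP / predicted positives; Recall = TP / actual positives."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

p, r = precision_recall(tp=87, fp=11, fn=13)
# p = 87/98, which rounds to 0.89; r = 87/100 = 0.87
```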

Ready to track your experiments?

Start logging metrics, comparing runs, and shipping better models with full reproducibility.