Quick‑Start Tutorial
TL;DR
You have CaliPy installed and want to fit a first simple model to get a feel for the grammar and see signs of life. Copy and paste the short script below; it builds a tiny probabilistic model, runs stochastic variational inference, and plots the training progress while estimating the unknown bias mu.
Prerequisites
CaliPy, Pyro and PyTorch ≥ 2.2 installed (see Installing CaliPy).
Python ≥ 3.8; GPU optional.
A tiny probabilistic model
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
This is a minimal demo of calipy functionality. An unknown parameter mu is
observed noisily and has to be estimated from these observations.
The script is meant solely for educational and illustrative purposes. Written by
Dr. Jemil Avers Butt, Atlas optimization GmbH, www.atlasoptimization.com.
"""
# Imports and definitions
import pyro
import matplotlib.pyplot as plt
from calipy.base import NodeStructure, CalipyProbModel
from calipy.effects import UnknownParameter, NoiseAddition
from calipy.utils import dim_assignment
from calipy.tensor import CalipyTensor
# Simulate data
n_meas = 20
mu_true, sigma_true = 0.0, 0.1
data = pyro.distributions.Normal(mu_true, sigma_true).sample([n_meas])
# Define dimensions
batch_dims = dim_assignment(['batch'], [n_meas])
single_dims = dim_assignment(['single'], [])
# Set up model nodes
mu_ns = NodeStructure(UnknownParameter)
mu_ns.set_dims(batch_dims=batch_dims, param_dims=single_dims)
mu_node = UnknownParameter(mu_ns, name='mu')
noise_ns = NodeStructure(NoiseAddition)
noise_ns.set_dims(batch_dims=batch_dims, event_dims=single_dims)
noise_node = NoiseAddition(noise_ns, name='noise')
# Define probabilistic model
class DemoProbModel(CalipyProbModel):
    def model(self, input_vars=None, observations=None):
        mu = mu_node.forward()
        return noise_node.forward({'mean': mu, 'standard_deviation': sigma_true}, observations)

    def guide(self, input_vars=None, observations=None):
        # Trivial guide: the unknown parameter is fitted as a point estimate
        pass
# Train model
demo_probmodel = DemoProbModel()
data_cp = CalipyTensor(data, dims=batch_dims)
optim_results = demo_probmodel.train(None, data_cp, optim_opts={})
# Plot results
plt.plot(optim_results)
plt.xlabel('Epoch'); plt.ylabel('ELBO loss'); plt.title('Training Progress')
plt.show()
Expected output
Posterior mu: around the true mu = 0.0
(The exact number varies from run to run, but the estimate should be close to mean(data), the sample mean of the simulated observations.)
Where to go next
Core Concepts & Architecture – glossary & architecture overview
Usage – in‑depth guides (models, effects, data, inference)
Example notebooks in examples/engineering_geodesy/ for mean_estimation, level_calibration, totalstation_calibration, …
Troubleshooting
Common hiccups
ImportErrors – verify your environment matches the Installation page (“Bleeding‑edge install” section).
Diverging ELBO – set torch.manual_seed(0) and/or lower the learning rate (e.g. optim.Adam({"lr": 0.005})).