# GPy.inference.mcmc package¶

## GPy.inference.mcmc.hmc module¶

class HMC(model, M=None, stepsize=0.1)[source]

Bases: object

An implementation of Hybrid (Hamiltonian) Monte Carlo (HMC) sampling for GPy models.

Initialize an object for HMC sampling. Note that the state of the model (its parameters) will be modified in place during sampling.

Parameters:

- **model** (GPy.core.Model) – the GPy model that will be sampled
- **M** (numpy.ndarray) – the mass matrix (an identity matrix by default)
- **stepsize** (float) – the step size for HMC sampling
sample(num_samples=1000, hmc_iters=20)[source]

Sample the (unfixed) model parameters.

Parameters:

- **num_samples** (int) – the number of samples to draw (1000 by default)
- **hmc_iters** (int) – the number of leap-frog iterations per sample (20 by default)

Returns: the array of parameter samples, with shape N x P (N – the number of samples, P – the number of parameters sampled)

Return type: numpy.ndarray
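To make the roles of `stepsize`, `M`, and `hmc_iters` concrete, here is a minimal self-contained sketch of the HMC transition (this is an illustration, not GPy's internal implementation): each sample is produced by `hmc_iters` leap-frog steps of size `stepsize` under the mass matrix `M`, followed by a Metropolis accept/reject test. The target here is a toy 2-D standard Gaussian; `hmc_sample`, `logp`, and `grad` are names invented for this sketch.

```python
import numpy as np

def hmc_sample(x0, logp, grad, num_samples=1000, hmc_iters=20,
               stepsize=0.1, M=None, rng=None):
    """Toy HMC sampler (illustrative sketch, not GPy's code)."""
    rng = np.random.default_rng(0) if rng is None else rng
    D = x0.size
    M = np.eye(D) if M is None else M           # mass matrix (identity by default)
    Minv = np.linalg.inv(M)
    samples = np.empty((num_samples, D))        # N x P, as returned by HMC.sample
    x = x0.copy()
    for n in range(num_samples):
        p = rng.multivariate_normal(np.zeros(D), M)   # resample momentum
        x_new, p_new = x.copy(), p.copy()
        # leap-frog: half momentum step, alternating full steps, half momentum step
        p_new += 0.5 * stepsize * grad(x_new)
        for _ in range(hmc_iters - 1):
            x_new += stepsize * (Minv @ p_new)
            p_new += stepsize * grad(x_new)
        x_new += stepsize * (Minv @ p_new)
        p_new += 0.5 * stepsize * grad(x_new)
        # accept/reject on the change in total energy H = -logp + kinetic
        H_old = -logp(x) + 0.5 * p @ Minv @ p
        H_new = -logp(x_new) + 0.5 * p_new @ Minv @ p_new
        if rng.random() < np.exp(min(0.0, H_old - H_new)):
            x = x_new
        samples[n] = x
    return samples

logp = lambda x: -0.5 * x @ x    # standard 2-D Gaussian target
grad = lambda x: -x
S = hmc_sample(np.zeros(2), logp, grad, num_samples=2000)
```

The returned `S` has shape (2000, 2), matching the N x P convention of `HMC.sample`. In GPy itself the gradient comes from the model's log-likelihood, which is why sampling mutates the model's parameters.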
class HMC_shortcut(model, M=None, stepsize_range=[1e-06, 0.1], groupsize=5, Hstd_th=[1e-05, 3.0])[source]

Bases: object

sample(m_iters=1000, hmc_iters=20)[source]

## GPy.inference.mcmc.samplers module¶

class Metropolis_Hastings(model, cov=None)[source]

Bases: object

Metropolis-Hastings sampling, with proposal tuning following Gelman et al.

new_chain(start=None)[source]

predict(function, args)[source]

Make a prediction with function, passing it the additional arguments args.

sample(Ntotal=10000, Nburn=1000, Nthin=10, tune=True, tune_throughout=False, tune_interval=400)[source]
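The `Ntotal`, `Nburn`, and `Nthin` arguments follow the usual MCMC convention: draw `Ntotal` raw samples, discard the first `Nburn` as burn-in, then keep every `Nthin`-th of the remainder. A minimal self-contained sketch of random-walk Metropolis-Hastings illustrating this (an illustration on a toy Gaussian target, not GPy's implementation; `mh_sample` and `logp` are names invented here, and the proposal covariance `cov` plays the role of the constructor's `cov` argument):

```python
import numpy as np

def mh_sample(x0, logp, cov=None, Ntotal=10000, Nburn=1000, Nthin=10,
              rng=None):
    """Toy random-walk Metropolis-Hastings (illustrative sketch)."""
    rng = np.random.default_rng(1) if rng is None else rng
    D = x0.size
    cov = np.eye(D) if cov is None else cov     # Gaussian proposal covariance
    chain = np.empty((Ntotal, D))
    x, lp = x0.copy(), logp(x0)
    for n in range(Ntotal):
        prop = rng.multivariate_normal(x, cov)  # symmetric random-walk proposal
        lp_prop = logp(prop)
        # accept with probability min(1, p(prop)/p(x))
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain[n] = x
    return chain[Nburn::Nthin]                  # drop burn-in, then thin

logp = lambda x: -0.5 * x @ x                   # standard 2-D Gaussian target
S = mh_sample(np.zeros(2), logp, Ntotal=20000, Nburn=2000, Nthin=10)
```

With these settings the sampler returns (20000 - 2000) / 10 = 1800 kept samples. The `tune*` arguments of GPy's `sample` additionally adapt the proposal scale during sampling (in blocks of `tune_interval` iterations) to target the acceptance rates recommended by Gelman et al.; that adaptation is omitted from this sketch.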