From 0535fde990c4a2999019cae867a9d62c2a2cc9f1 Mon Sep 17 00:00:00 2001
From: Michael Clerx
Date: Thu, 5 Feb 2026 13:34:10 +0000
Subject: [PATCH 1/4] Updated list of methods in docs. Closes #1109.

---
 docs/source/index.rst | 185 ++++++++++++++++++++++++------------------
 1 file changed, 104 insertions(+), 81 deletions(-)

diff --git a/docs/source/index.rst b/docs/source/index.rst
index 1a0cddbfe..8d0f3bc62 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -3,6 +3,8 @@
 .. _GitHub: https://github.com/pints-team/pints
 .. _Detailed examples: https://github.com/pints-team/pints/blob/main/examples/README.md
 
+.. module:: pints
+
 Welcome to the pints documentation
 ==================================
 
@@ -16,127 +18,148 @@ Welcome to the pints documentation
 * :ref:`genindex`
 * :ref:`search`
 
-Contents
-========
-
-.. module:: pints
-
-.. toctree::
-
-    abc_samplers/index
-    boundaries
-    core_classes_and_methods
-    diagnostics
-    diagnostic_plots
-    error_measures
-    function_evaluation
-    io
-    log_likelihoods
-    log_pdfs
-    log_priors
-    mcmc_samplers/index
-    nested_samplers/index
-    noise_generators
-    optimisers/index
-    noise_model_diagnostics
-    toy/index
-    toy/stochastic/index
-    transformations
-    utilities
+Defining inference problems in PINTS
+====================================
 
+PINTS provides methods to sample distributions, implemented as a
+:class:`LogPDF`, and to optimise functions, implemented as an
+:class:`ErrorMeasure` or a :class:`LogPDF`.
+
+Users can define LogPDF or ErrorMeasure implementations directly, or they can
+use PINTS' :class:`ForwardModel` and problem classes to set up their problems,
+and then choose one of many predefined PDFs or errors.
+
+PINTS defines :class:`single <SingleOutputProblem>` and
+:class:`multi-output <MultiOutputProblem>` problem classes that wrap around
+a model and data, and over which :class:`error measures <ErrorMeasure>`
+or :class:`log-likelihoods <ProblemLogLikelihood>` can be defined.
+
+To find the appropriate type of Problem to use, see the overview below:
+
+#. Systems with a single observable output
+
+   - Single data set: Use a :class:`SingleOutputProblem` and any of the
+     appropriate error measures or log-likelihoods.
+   - Multiple, independent data sets: Define multiple
+     :class:`SingleOutputProblems <SingleOutputProblem>` and an error measure
+     / log-likelihood on each, and then combine using e.g.
+     :class:`SumOfErrors` or :class:`SumOfIndependentLogPDFs`.
+
+#. Systems with multiple observable outputs
+
+   - Single data set: Use a :class:`MultiOutputProblem` and any of the
+     appropriate error measures or log-likelihoods.
 
-Hierarchy of methods
-====================
-
-Pints contains different types of methods, that can be roughly arranged into a
-hierarchy, as follows.
+Provided methods
+================
+
+PINTS contains different types of methods, which can be roughly arranged into
+the classification shown below.
 
 Sampling
 --------
 
-#. :class:`MCMC without gradients <MCMCSampler>`
+#. :class:`MCMC without gradients <MCMCSampler>`, which work on any
+   :class:`LogPDF`.
 
-   - :class:`MetropolisRandomWalkMCMC`, works on any :class:`LogPDF`.
-   - Metropolis-Hastings
+   - :class:`MetropolisRandomWalkMCMC`
 
    - Adaptive methods
 
-     - :class:`AdaptiveCovarianceMC`, works on any :class:`LogPDF`.
+     - :class:`AdaptiveCovarianceMC`
+     - :class:`DramACMC`
+     - :class:`HaarioACMC`
+     - :class:`HaarioBardenetACMC`
+     - :class:`RaoBlackwellACMC`
 
-   - :class:`PopulationMCMC`, works on any :class:`LogPDF`.
+   - :class:`PopulationMCMC`
 
    - Differential evolution methods
 
-     - :class:`DifferentialEvolutionMCMC`, works on any :class:`LogPDF`.
-     - :class:`DreamMCMC`, works on any :class:`LogPDF`.
-     - :class:`EmceeHammerMCMC`, works on any :class:`LogPDF`.
+     - :class:`DifferentialEvolutionMCMC`
+     - :class:`DreamMCMC`
+     - :class:`EmceeHammerMCMC`
 
-#. :class:`Nested sampling <NestedSampler>`
-
-   - :class:`NestedEllipsoidSampler`, requires a :class:`LogPDF` and a
-     :class:`LogPrior` that can be sampled from.
-   - :class:`NestedRejectionSampler`, requires a :class:`LogPDF` and a
-     :class:`LogPrior` that can be sampled from.
+   - Slice sampling
+
+     - :class:`SliceDoublingMCMC`
+     - :class:`SliceRankShrinkingMCMC`
+     - :class:`SliceStepoutMCMC`
 
-#. Particle based samplers
-
-   - SMC
+#. First order sensitivity MCMC samplers, which require a :class:`LogPDF`
+   that provides first order sensitivities.
 
-#. :class:`ABC sampling <ABCSampler>`
-
-   - :class:`ABCSMC`, requires a :class:`LogPrior` that can be sampled from
-     from and an :class:`ErrorMeasure`.
-   - :class:`RejectionABC`, requires a :class:`LogPrior` that can be sampled
-     from and an :class:`ErrorMeasure`.
+   - :class:`Hamiltonian Monte Carlo <HamiltonianMCMC>`
+   - :class:`Metropolis-Adjusted Langevin Algorithm (MALA) <MALAMCMC>`
+   - :class:`Monomial Gamma HMC <MonomialGammaHamiltonianMCMC>`
+   - :class:`No U-Turn Sampler with dual averaging (NUTS) <NoUTurnMCMC>`
+   - :class:`RelativisticMCMC`
 
-#. 1st order sensitivity MCMC samplers (Need derivatives of :class:`LogPDF`)
-
-   - :class:`Metropolis-Adjusted Langevin Algorithm (MALA) <MALAMCMC>`, works
-     on any :class:`LogPDF` that provides 1st order sensitivities.
-   - :class:`Hamiltonian Monte Carlo <HamiltonianMCMC>`, works on any
-     :class:`LogPDF` that provides 1st order sensitivities.
-   - NUTS
+#. :class:`Nested sampling <NestedSampler>` methods, which require a
+   :class:`LogPDF` and a :class:`LogPrior` that can be sampled from.
+
+   - :class:`NestedEllipsoidSampler`
+   - :class:`NestedRejectionSampler`
 
-#. Differential geometric methods (Need Hessian of :class:`LogPDF`)
-
-   - smMALA
-   - RMHMC
+#. :class:`ABC sampling <ABCSampler>` methods, which require a
+   :class:`LogPrior` that can be sampled from and an :class:`ErrorMeasure`.
+
+   - :class:`ABCSMC`
+   - :class:`RejectionABC`
 
 
 Optimisation
 ------------
 
-All methods shown here are derivative-free methods that work on any
-:class:`ErrorMeasure` or :class:`LogPDF`.
-
-1. Particle-based methods
+1. Particle- or population-based methods, which work on any
+   :class:`ErrorMeasure` or :class:`LogPDF`.
 
-   - Evolution strategies (global/local methods)
+   - Evolution strategies
 
      - :class:`CMAES`
      - :class:`SNES`
     - :class:`XNES`
+     - :class:`BareCMAES`
 
-   - :class:`PSO` (global method)
+   - :class:`PSO`
 
+2. General derivative-free methods
+
+   - :class:`NelderMead`
 
-Problems in Pints
-=================
+3. Gradient-descent methods, which require first order sensitivities
+
+   - :class:`GradientDescent`
+   - :class:`Adam`
 
-Pints defines :class:`single <SingleOutputProblem>` and
-:class:`multi-output <MultiOutputProblem>` problem classes that wrap around
-models and data, and over which :class:`error measures <ErrorMeasure>` or
-:class:`log-likelihoods <ProblemLogLikelihood>` can be defined.
+4. General derivative-using methods
+
+   - :class:`IRPropMin`
 
-To find the appropriate type of Problem to use, see the overview below:
-
-#. Systems with a single observable output
-
-   - Single data set: Use a :class:`SingleOutputProblem` and any of the
-     appropriate error measures or log-likelihoods
-   - Multiple, independent data sets: Define multiple
-     :class:`SingleOutputProblems <SingleOutputProblem>` and an error measure
-     / log-likelihood on each, and then combine using e.g.
-     :class:`SumOfErrors` or :class:`SumOfIndependentLogPDFs`.
-
-#. Systems with multiple observable outputs
-
-   - Single data set: Use a :class:`MultiOutputProblem` and any of the
-     appropriate error measures or log-likelihoods
+
+Contents
+========
 
+.. toctree::
+    :maxdepth: 2
+
+    abc_samplers/index
+    boundaries
+    core_classes_and_methods
+    diagnostics
+    diagnostic_plots
+    error_measures
+    function_evaluation
+    io
+    log_likelihoods
+    log_pdfs
+    log_priors
+    mcmc_samplers/index
+    nested_samplers/index
+    noise_generators
+    optimisers/index
+    noise_model_diagnostics
+    toy/index
+    toy/stochastic/index
+    transformations
+    utilities

From ce0f588c3fb0b8cacfa5d850c5a4577c990f4365 Mon Sep 17 00:00:00 2001
From: Michael Clerx
Date: Thu, 5 Feb 2026 13:36:27 +0000
Subject: [PATCH 2/4] Made __call__ show in docs (and other improvements).
 Closes #1482. Closes #1519.
---
 docs/source/abc_samplers/index.rst       |  3 ++-
 docs/source/boundaries.rst               |  8 --------
 docs/source/conf.py                      |  3 ++-
 docs/source/core_classes_and_methods.rst | 22 +++++++++-------------
 docs/source/diagnostics.rst              |  5 -----
 docs/source/error_measures.rst           | 11 -----------
 docs/source/function_evaluation.rst      |  8 --------
 docs/source/log_likelihoods.rst          | 18 ------------------
 docs/source/log_pdfs.rst                 |  9 ---------
 docs/source/log_priors.rst               | 18 ------------------
 docs/source/mcmc_samplers/index.rst      |  1 +
 docs/source/nested_samplers/index.rst    |  1 +
 docs/source/noise_generators.rst         |  9 ---------
 docs/source/optimisers/index.rst         |  1 +
 docs/source/toy/index.rst                |  1 +
 docs/source/toy/stochastic/index.rst     |  1 +
 docs/source/transformations.rst          | 15 ---------------
 docs/source/utilities.rst                | 10 ----------
 pints/_error_measures.py                 |  8 ++++++--
 pints/_log_pdfs.py                       |  9 +++++++--
 pints/_log_priors.py                     | 21 ++++++++++-----------
 21 files changed, 41 insertions(+), 141 deletions(-)

diff --git a/docs/source/abc_samplers/index.rst b/docs/source/abc_samplers/index.rst
index ea9c88f6d..9e5432189 100644
--- a/docs/source/abc_samplers/index.rst
+++ b/docs/source/abc_samplers/index.rst
@@ -11,7 +11,8 @@ given a :class:`LogPrior` and a :class:`ErrorMeasure`.
 
 .. toctree::
+    :maxdepth: 1
 
     base_classes
     abc_smc
-    rejection_abc
\ No newline at end of file
+    rejection_abc
diff --git a/docs/source/boundaries.rst b/docs/source/boundaries.rst
index f6f2b0b5a..c4b9bc9a3 100644
--- a/docs/source/boundaries.rst
+++ b/docs/source/boundaries.rst
@@ -9,14 +9,6 @@ Simple boundaries for an optimisation can be created using
 More complex types can be made using :class:`LogPDFBoundaries` or a custom
 implementation of the :class:`Boundaries` interface.
 
-Overview:
-
-- :class:`Boundaries`
-- :class:`ComposedBoundaries`
-- :class:`LogPDFBoundaries`
-- :class:`RectangularBoundaries`
-
-
 .. autoclass:: Boundaries
 
 .. autoclass:: ComposedBoundaries
diff --git a/docs/source/conf.py b/docs/source/conf.py
index fad5c0cb3..175ddccac 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -39,6 +39,7 @@
 autodoc_default_options = {
     'members': None,
     'inherited-members': None,
+    'special-members': '__call__',
 }
 
 # Add any paths that contain templates here, relative to this directory.
@@ -55,7 +56,7 @@
 
 # General information about the project.
 project = u'Pints'
-copyright = u'2022, Pints Authors'
+copyright = u'2017-2026, Pints Authors'
 author = u'Pints Authors'
 
 # The version info for the project you're documenting, acts as replacement for
diff --git a/docs/source/core_classes_and_methods.rst b/docs/source/core_classes_and_methods.rst
index b49a62f25..7f40272e7 100644
--- a/docs/source/core_classes_and_methods.rst
+++ b/docs/source/core_classes_and_methods.rst
@@ -9,19 +9,6 @@ Pints provides the :class:`SingleOutputProblem` and
 inverse problems based on time series data and :class:`ForwardModel`.
 
-Overview:
-
-- :class:`ForwardModel`
-- :class:`ForwardModelS1`
-- :class:`MultiOutputProblem`
-- :class:`SingleOutputProblem`
-- :class:`TunableMethod`
-- :func:`version`
-
-.. autofunction:: version
-
-.. autoclass:: TunableMethod
-
 Forward model
 *************
 
@@ -40,3 +27,12 @@ Problems
 ********
 
 .. autoclass:: MultiOutputProblem
+
+Hyperparameters
+***************
+
+.. autoclass:: TunableMethod
+
+PINTS version
+*************
+
+.. autofunction:: version
diff --git a/docs/source/diagnostics.rst b/docs/source/diagnostics.rst
index 9a3676155..0c77b4c44 100644
--- a/docs/source/diagnostics.rst
+++ b/docs/source/diagnostics.rst
@@ -6,11 +6,6 @@ Diagnosing MCMC results
 Pints provides a number of functions to diagnose MCMC progress and
 convergence.
 
-Overview:
-
-- :func:`effective_sample_size`
-- :func:`rhat`
-
 .. autofunction:: rhat
diff --git a/docs/source/error_measures.rst b/docs/source/error_measures.rst
index c8320f894..113ef4851 100644
--- a/docs/source/error_measures.rst
+++ b/docs/source/error_measures.rst
@@ -13,17 +13,6 @@ Example::
     x = [1,2,3]
     fx = error(x)
 
-Overview:
-
-- :class:`ErrorMeasure`
-- :class:`MeanSquaredError`
-- :class:`NormalisedRootMeanSquaredError`
-- :class:`ProbabilityBasedError`
-- :class:`ProblemErrorMeasure`
-- :class:`RootMeanSquaredError`
-- :class:`SumOfErrors`
-- :class:`SumOfSquaresError`
-
 .. autoclass:: ErrorMeasure
diff --git a/docs/source/function_evaluation.rst b/docs/source/function_evaluation.rst
index 0e089b085..c2e3c8acb 100644
--- a/docs/source/function_evaluation.rst
+++ b/docs/source/function_evaluation.rst
@@ -18,14 +18,6 @@ Example::
     ]
     fx = e.evaluate(x)
 
-Overview:
-
-- :func:`evaluate`
-- :class:`Evaluator`
-- :class:`ParallelEvaluator`
-- :class:`SequentialEvaluator`
-- :class:`MultiSequentialEvaluator`
-
 .. autofunction:: evaluate
diff --git a/docs/source/log_likelihoods.rst b/docs/source/log_likelihoods.rst
index 29d1aaef9..cdf4594cc 100644
--- a/docs/source/log_likelihoods.rst
+++ b/docs/source/log_likelihoods.rst
@@ -14,24 +14,6 @@ Example::
     x = [1, 2, 3]
     fx = logpdf(x)
 
-Overview:
-
-- :class:`AR1LogLikelihood`
-- :class:`ARMA11LogLikelihood`
-- :class:`CauchyLogLikelihood`
-- :class:`CensoredGaussianLogLikelihood`
-- :class:`ConstantAndMultiplicativeGaussianLogLikelihood`
-- :class:`GaussianIntegratedLogUniformLogLikelihood`
-- :class:`GaussianIntegratedUniformLogLikelihood`
-- :class:`GaussianKnownSigmaLogLikelihood`
-- :class:`GaussianLogLikelihood`
-- :class:`KnownNoiseLogLikelihood`
-- :class:`LogNormalLogLikelihood`
-- :class:`MultiplicativeGaussianLogLikelihood`
-- :class:`ScaledLogLikelihood`
-- :class:`StudentTLogLikelihood`
-- :class:`UnknownNoiseLogLikelihood`
-
 .. autoclass:: AR1LogLikelihood
diff --git a/docs/source/log_pdfs.rst b/docs/source/log_pdfs.rst
index 7e95a6716..9fbfb3717 100644
--- a/docs/source/log_pdfs.rst
+++ b/docs/source/log_pdfs.rst
@@ -15,15 +15,6 @@ Example::
     p = pints.GaussianLogPrior(mean=0, variance=1)
     x = p(0.1)
 
-Overview:
-
-- :class:`LogPDF`
-- :class:`LogPrior`
-- :class:`LogPosterior`
-- :class:`PooledLogPDF`
-- :class:`ProblemLogLikelihood`
-- :class:`SumOfIndependentLogPDFs`
-
 .. autoclass:: LogPDF
diff --git a/docs/source/log_priors.rst b/docs/source/log_priors.rst
index 65c1bb0c9..4fe066daf 100644
--- a/docs/source/log_priors.rst
+++ b/docs/source/log_priors.rst
@@ -12,24 +12,6 @@ Example::
     p = pints.GaussianLogPrior(mean=0, variance=1)
     x = p(0.1)
 
-Overview:
-
-- :class:`BetaLogPrior`
-- :class:`CauchyLogPrior`
-- :class:`ComposedLogPrior`
-- :class:`ExponentialLogPrior`
-- :class:`GammaLogPrior`
-- :class:`GaussianLogPrior`
-- :class:`HalfCauchyLogPrior`
-- :class:`InverseGammaLogPrior`
-- :class:`LogNormalLogPrior`
-- :class:`LogUniformLogPrior`
-- :class:`MultivariateGaussianLogPrior`
-- :class:`NormalLogPrior`
-- :class:`StudentTLogPrior`
-- :class:`TruncatedGaussianLogPrior`
-- :class:`UniformLogPrior`
-
 .. autoclass:: BetaLogPrior
diff --git a/docs/source/mcmc_samplers/index.rst b/docs/source/mcmc_samplers/index.rst
index 09b89bba8..a69be1e90 100644
--- a/docs/source/mcmc_samplers/index.rst
+++ b/docs/source/mcmc_samplers/index.rst
@@ -10,6 +10,7 @@ interface, that can be used to sample from an unknown
 :class:`PDF <LogPDF>` (e.g. a :class:`Posterior <LogPosterior>`).
 
 .. toctree::
+    :maxdepth: 1
 
     running
     base_classes
diff --git a/docs/source/nested_samplers/index.rst b/docs/source/nested_samplers/index.rst
index f91f927ed..f310b96d4 100644
--- a/docs/source/nested_samplers/index.rst
+++ b/docs/source/nested_samplers/index.rst
@@ -3,6 +3,7 @@ Nested samplers
 ***************
 
 .. toctree::
+    :maxdepth: 1
 
     nested_sampler
     nested_ellipsoid_sampler
diff --git a/docs/source/noise_generators.rst b/docs/source/noise_generators.rst
index 47cef20b0..4dce14552 100644
--- a/docs/source/noise_generators.rst
+++ b/docs/source/noise_generators.rst
@@ -9,15 +9,6 @@ Pints contains a module ``pints.noise`` that contains methods that generate
 This can then be added to simulation output to create "realistic" experimental
 data.
 
- Overview:
-
- - :func:`ar1`
- - :func:`ar1_unity`
- - :func:`arma11`
- - :func:`arma11_unity`
- - :func:`independent`
- - :func:`multiplicative_gaussian`
-
 .. autofunction:: ar1
diff --git a/docs/source/optimisers/index.rst b/docs/source/optimisers/index.rst
index e41e5772d..1efe7ca9a 100644
--- a/docs/source/optimisers/index.rst
+++ b/docs/source/optimisers/index.rst
@@ -12,6 +12,7 @@ The easiest way to run an optimisation is by using the :func:`optimise` method
 or the :class:`OptimisationController` class.
 
 .. toctree::
+    :maxdepth: 1
 
     running
     base_classes
diff --git a/docs/source/toy/index.rst b/docs/source/toy/index.rst
index ae0964919..081e63666 100644
--- a/docs/source/toy/index.rst
+++ b/docs/source/toy/index.rst
@@ -12,6 +12,7 @@ Some toy classes provide extra functionality defined in the
 
 .. toctree::
+    :maxdepth: 1
 
     toy_classes
     annulus_logpdf
diff --git a/docs/source/toy/stochastic/index.rst b/docs/source/toy/stochastic/index.rst
index e25c7a2a7..b548be2fa 100644
--- a/docs/source/toy/stochastic/index.rst
+++ b/docs/source/toy/stochastic/index.rst
@@ -9,6 +9,7 @@ examples.
 
 .. toctree::
+    :maxdepth: 1
 
     markov_jump_model
     stochastic_degradation_model
diff --git a/docs/source/transformations.rst b/docs/source/transformations.rst
index 5f1760c3c..966851bef 100644
--- a/docs/source/transformations.rst
+++ b/docs/source/transformations.rst
@@ -31,21 +31,6 @@ Example::
     transform = pints.LogTransformation(n_parameters)
     mcmc = pints.MCMCController(log_posterior, n_chains, x0, transform=transform)
 
-Overview:
-
-- :class:`ComposedTransformation`
-- :class:`IdentityTransformation`
-- :class:`LogitTransformation`
-- :class:`LogTransformation`
-- :class:`RectangularBoundariesTransformation`
-- :class:`ScalingTransformation`
-- :class:`Transformation`
-- :class:`TransformedBoundaries`
-- :class:`TransformedErrorMeasure`
-- :class:`TransformedLogPDF`
-- :class:`TransformedLogPrior`
-- :class:`UnitCubeTransformation`
-
 Transformation types
 ********************
diff --git a/docs/source/utilities.rst b/docs/source/utilities.rst
index d467b9451..364467f99 100644
--- a/docs/source/utilities.rst
+++ b/docs/source/utilities.rst
@@ -4,16 +4,6 @@ Utilities
 
 .. currentmodule:: pints
 
-Overview:
-
-- :func:`strfloat`
-- :class:`Loggable`
-- :class:`Logger`
-- :class:`Timer`
-- :func:`matrix2d`
-- :func:`vector`
-- :func:`sample_initial_points`
-
 .. autofunction:: strfloat
 
 .. autoclass:: Loggable
diff --git a/pints/_error_measures.py b/pints/_error_measures.py
index 1885ff43d..ac52bcffa 100644
--- a/pints/_error_measures.py
+++ b/pints/_error_measures.py
@@ -16,11 +16,15 @@ class ErrorMeasure(object):
     means a better fit.
 
     ErrorMeasures are callable objects: If ``e`` is an instance of an
-    :class:`ErrorMeasure` class you can calculate the error by calling ``e(p)``
-    where ``p`` is a point in parameter space. In PINTS, all parameters must be
+    :class:`ErrorMeasure` class you can calculate the error by calling ``e(x)``
+    where ``x`` is a point in parameter space. In PINTS, all parameters must be
     continuous and real.
+
+    All subclasses of ``ErrorMeasure`` should provide an implementation of
+    :meth:`__call__` and :meth:`n_parameters`.
     """
     def __call__(self, x):
+        """ Evaluates this error measure for parameters ``x``. """
         raise NotImplementedError
 
     def evaluateS1(self, x):
diff --git a/pints/_log_pdfs.py b/pints/_log_pdfs.py
index c625f76c0..9c11f1e4d 100644
--- a/pints/_log_pdfs.py
+++ b/pints/_log_pdfs.py
@@ -14,12 +14,17 @@ class LogPDF(object):
     probability density function (PDF).
 
     All :class:`LogPDF` types are callable: when called with a vector argument
-    ``p`` they return some value ``log(f(p))`` where ``f(p)`` is an
-    unnormalised PDF. The size of the argument ``p`` is given by
+    ``x`` they return some value ``log(f(x))`` where ``f(x)`` is an
+    unnormalised PDF. The size of the argument ``x`` is given by
     :meth:`n_parameters()`.
 
     In PINTS, all parameters must be continuous and real.
+
+    All subclasses of ``LogPDF`` should provide an implementation of
+    :meth:`__call__` and :meth:`n_parameters`. Providing :meth:`evaluateS1` is
+    optional.
     """
     def __call__(self, x):
+        """ Evaluates this LogPDF for parameters ``x``. """
         raise NotImplementedError
 
     def evaluateS1(self, x):
diff --git a/pints/_log_priors.py b/pints/_log_priors.py
index 77edbac65..1cf55b07e 100644
--- a/pints/_log_priors.py
+++ b/pints/_log_priors.py
@@ -386,15 +386,14 @@ def __init__(self, a, b):
             self._a)
 
     def __call__(self, x):
-        if x[0] < 0.0:
+        if x[0] < 0:
             return -np.inf
-        else:
-            return self._constant + scipy.special.xlogy(self._a - 1.,
-                                                        x[0]) - self._b * x[0]
+        return self._constant + scipy.special.xlogy(
+            self._a - 1, x[0]) - self._b * x[0]
 
     def cdf(self, x):
         """ See :meth:`LogPrior.cdf()`. """
-        return scipy.stats.gamma.cdf(x, a=self._a, loc=0, scale=1.0 / self._b)
+        return scipy.stats.gamma.cdf(x, a=self._a, loc=0, scale=1 / self._b)
 
     def evaluateS1(self, x):
         """ See :meth:`LogPDF.evaluateS1()`. """
@@ -403,18 +402,18 @@ def evaluateS1(self, x):
         _x = x[0]
 
         # Account for pathological edge
-        if _x == 0.0:
-            _x = np.nextafter(0.0, 1.0)
+        if _x == 0:
+            _x = np.nextafter(0, 1)
 
-        if _x < 0.0:
-            return value, np.asarray([0.])
+        if _x < 0:
+            return value, np.asarray([0])
         else:
             # Use np.divide here to better handle possible v small denominators
-            return value, np.asarray([np.divide(self._a - 1., _x) - self._b])
+            return value, np.asarray([np.divide(self._a - 1, _x) - self._b])
 
     def icdf(self, p):
         """ See :meth:`LogPrior.icdf()`. """
-        return scipy.stats.gamma.ppf(p, a=self._a, loc=0, scale=1.0 / self._b)
+        return scipy.stats.gamma.ppf(p, a=self._a, loc=0, scale=1 / self._b)
 
     def mean(self):
         """ See :meth:`LogPrior.mean()`. """

From 9ee3bcd9d53d358dce87777d82d83ad641fd4a72 Mon Sep 17 00:00:00 2001
From: Michael Clerx
Date: Thu, 5 Feb 2026 13:51:13 +0000
Subject: [PATCH 3/4] Moved MCMCSummary to diagnostics in docs. Closes #1281

---
 docs/source/diagnostic_plots.rst                     | 12 ++++++------
 docs/source/index.rst                                |  2 +-
 .../source/{diagnostics.rst => mcmc_diagnostics.rst} |  9 +++++----
 docs/source/mcmc_samplers/index.rst                  |  2 +-
 docs/source/mcmc_samplers/summary_mcmc.rst           |  7 -------
 pints/_diagnostics.py                                |  6 +++---
 pints/_mcmc/_monomial_gamma_hamiltonian.py           |  1 -
 7 files changed, 16 insertions(+), 23 deletions(-)
 rename docs/source/{diagnostics.rst => mcmc_diagnostics.rst} (50%)
 delete mode 100644 docs/source/mcmc_samplers/summary_mcmc.rst

diff --git a/docs/source/diagnostic_plots.rst b/docs/source/diagnostic_plots.rst
index 4a2370868..8c8cc384c 100644
--- a/docs/source/diagnostic_plots.rst
+++ b/docs/source/diagnostic_plots.rst
@@ -7,13 +7,13 @@ Diagnostic plots
 For users who have Matplotlib installed, Pints offers a number of diagnostic
 plots that can be used to quickly check obtained results.
 
-Plotting functions:
+Plots of functions:
 
 - :func:`function`
 - :func:`function_between_points`
 - :func:`surface`
 
-Diagnosing MCMC results:
+MCMC Diagnostic plots:
 
 - :func:`autocorrelation`
 - :func:`histogram`
@@ -21,8 +21,8 @@
 - :func:`series`
 - :func:`trace`
 
-Functions
----------
+Plots of functions
+------------------
 
 .. autofunction:: function
 
@@ -30,8 +30,8 @@
 .. autofunction:: surface
 
-MCMC Diagnostics
-----------------
+MCMC Diagnostic plots
+---------------------
 
 .. autofunction:: autocorrelation
diff --git a/docs/source/index.rst b/docs/source/index.rst
index 8d0f3bc62..173ea9410 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -146,7 +146,6 @@ Contents
     abc_samplers/index
     boundaries
     core_classes_and_methods
-    diagnostics
     diagnostic_plots
     error_measures
     function_evaluation
@@ -155,6 +154,7 @@ Contents
     log_pdfs
     log_priors
     mcmc_samplers/index
+    mcmc_diagnostics
     nested_samplers/index
     noise_generators
     optimisers/index
diff --git a/docs/source/diagnostics.rst b/docs/source/mcmc_diagnostics.rst
similarity index 50%
rename from docs/source/diagnostics.rst
rename to docs/source/mcmc_diagnostics.rst
index 0c77b4c44..b940e4a1b 100644
--- a/docs/source/diagnostics.rst
+++ b/docs/source/mcmc_diagnostics.rst
@@ -1,11 +1,12 @@
-***********************
-Diagnosing MCMC results
-***********************
+****************
+MCMC Diagnostics
+****************
 
 .. currentmodule:: pints
 
-Pints provides a number of functions to diagnose MCMC progress and convergence.
+PINTS provides a number of functions to diagnose MCMC progress and convergence.
 
+.. autoclass:: MCMCSummary
 
 .. autofunction:: rhat
diff --git a/docs/source/mcmc_samplers/index.rst b/docs/source/mcmc_samplers/index.rst
index a69be1e90..b5a7dfce7 100644
--- a/docs/source/mcmc_samplers/index.rst
+++ b/docs/source/mcmc_samplers/index.rst
@@ -33,4 +33,4 @@ interface, that can be used to sample from an unknown
     slice_doubling_mcmc
     slice_rank_shrinking_mcmc
     slice_stepout_mcmc
-    summary_mcmc
+
diff --git a/docs/source/mcmc_samplers/summary_mcmc.rst b/docs/source/mcmc_samplers/summary_mcmc.rst
deleted file mode 100644
index c87377ff4..000000000
--- a/docs/source/mcmc_samplers/summary_mcmc.rst
+++ /dev/null
@@ -1,7 +0,0 @@
-************
-MCMC Summary
-************
-
-.. currentmodule:: pints
-
-.. autoclass:: MCMCSummary
diff --git a/pints/_diagnostics.py b/pints/_diagnostics.py
index eb97e65bb..7332011d1 100644
--- a/pints/_diagnostics.py
+++ b/pints/_diagnostics.py
@@ -133,7 +133,7 @@ def _between(chains):
 def rhat(chains, warm_up=0.0):
     r"""
     Returns the convergence measure :math:`\hat{R}` for the approximate
-    posterior according to [1]_.
+    posterior according to [3]_.
 
     :math:`\hat{R}` diagnoses convergence by checking mixing and stationarity
     of :math:`m` chains (at least two, :math:`m\geq 2`). To diminish the
@@ -180,8 +180,8 @@ def rhat(chains, warm_up=0.0):
 
     References
     ----------
-    .. [1] "Bayesian data analysis", ch. 11.4 'Inference and assessing
-           convergence', 3rd edition, Gelman et al., 2014.
+    .. [3] "Bayesian data analysis", ch. 11.4 'Inference and assessing
+           convergence', 3rd edition, Gelman et al., 2014.
 
     Parameters
     ----------
diff --git a/pints/_mcmc/_monomial_gamma_hamiltonian.py b/pints/_mcmc/_monomial_gamma_hamiltonian.py
index cc0a6a429..a68323000 100644
--- a/pints/_mcmc/_monomial_gamma_hamiltonian.py
+++ b/pints/_mcmc/_monomial_gamma_hamiltonian.py
@@ -72,7 +72,6 @@ class MonomialGammaHamiltonianMCMC(pints.SingleChainMCMC):
            Lawrence Cari. Advances in Neural Information Processing Systems
            (NIPS)
-
    .. [2] MCMC using Hamiltonian dynamics
           Radford M. Neal, Chapter 5 of the Handbook of Markov Chain Monte
           Carlo by Steve Brooks, Andrew Gelman, Galin Jones, and Xiao-Li Meng.

From 14dfab0222da359f2a1991834e27057dc03fda79 Mon Sep 17 00:00:00 2001
From: Michael Clerx
Date: Thu, 5 Feb 2026 13:55:40 +0000
Subject: [PATCH 4/4] Readme now points to methods overview in docs instead of
 graphical one that needs maintenance. Closes #1696.

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 2f8a7861f..7747a0f1f 100644
--- a/README.md
+++ b/README.md
@@ -37,7 +37,7 @@ The full code can be [viewed here](https://github.com/pints-team/pints/blob/main
 Beyond time-series models, PINTS can be used on any error function or
 log-likelihood that takes real-valued, continuous parameters.
 
-A graphical overview of the methods included in PINTS can be [viewed here](https://pints-team.github.io/pints-methods-overview/).
+An overview of the methods provided by PINTS can be [viewed here](https://pints.readthedocs.io/en/stable/#provided-methods).
 
 ### Examples and documentation