1 change: 1 addition & 0 deletions .github/workflows/tests.yml
@@ -139,6 +139,7 @@ jobs:

- |
tests/dims/distributions/test_core.py
+ tests/dims/distributions/test_censored.py
tests/dims/distributions/test_scalar.py
tests/dims/distributions/test_vector.py
tests/dims/test_model.py
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -51,7 +51,7 @@ repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.11.13
hooks:
- - id: ruff
+ - id: ruff-check
args: [--fix, --show-fixes]
- id: ruff-format
- repo: local
2 changes: 1 addition & 1 deletion Makefile
@@ -40,4 +40,4 @@ rtd: clean
@echo "Build finished. The HTML pages are in $(BUILDDIR)."

view:
- python -m webbrowser $(BUILDDIR)/index.html
+ python -m webbrowser file://$(abspath $(BUILDDIR))/index.html
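The `file://$(abspath ...)` change above makes the `view` target independent of the directory `make` was invoked from. A minimal Python sketch of the same idea, using the stdlib `webbrowser` module that the recipe invokes (the `_build/html` path is illustrative, not taken from the Makefile):

```python
import os
import webbrowser  # stdlib module the Makefile runs via `python -m webbrowser`

# Hypothetical BUILDDIR value; the real one is set in the Makefile.
build_dir = "_build/html"

# Mirroring `file://$(abspath $(BUILDDIR))/index.html`: an absolute
# file:// URI resolves to the same page regardless of the working directory,
# whereas a bare relative path may be misinterpreted by some browser backends.
uri = "file://" + os.path.abspath(os.path.join(build_dir, "index.html"))
print(uri)
# webbrowser.open(uri)  # left commented so the sketch has no side effects
```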
2 changes: 1 addition & 1 deletion conda-envs/environment-alternative-backends.yml
@@ -22,7 +22,7 @@ dependencies:
- numpyro>=0.8.0
- pandas>=0.24.0
- pip
- - pytensor>=2.38.0,<2.39
+ - pytensor>=2.38.2,<2.39
- python-graphviz
- networkx
- rich>=13.7.1
2 changes: 1 addition & 1 deletion conda-envs/environment-dev.yml
@@ -12,7 +12,7 @@ dependencies:
- numpy>=1.25.0
- pandas>=0.24.0
- pip
- - pytensor>=2.38.0,<2.39
+ - pytensor>=2.38.2,<2.39
- python-graphviz
- networkx
- scipy>=1.4.1
2 changes: 1 addition & 1 deletion conda-envs/environment-docs.yml
@@ -11,7 +11,7 @@ dependencies:
- numpy>=1.25.0
- pandas>=0.24.0
- pip
- - pytensor>=2.38.0,<2.39
+ - pytensor>=2.38.2,<2.39
- python-graphviz
- rich>=13.7.1
- scipy>=1.4.1
2 changes: 1 addition & 1 deletion conda-envs/environment-test.yml
@@ -14,7 +14,7 @@ dependencies:
- pandas>=0.24.0
- pip
- polyagamma
- - pytensor>=2.38.0,<2.39
+ - pytensor>=2.38.2,<2.39
- python-graphviz
- networkx
- rich>=13.7.1
2 changes: 1 addition & 1 deletion conda-envs/windows-environment-dev.yml
@@ -12,7 +12,7 @@ dependencies:
- numpy>=1.25.0
- pandas>=0.24.0
- pip
- - pytensor>=2.38.0,<2.39
+ - pytensor>=2.38.2,<2.39
- python-graphviz
- networkx
- rich>=13.7.1
2 changes: 1 addition & 1 deletion conda-envs/windows-environment-test.yml
@@ -15,7 +15,7 @@ dependencies:
- pandas>=0.24.0
- pip
- polyagamma
- - pytensor>=2.38.0,<2.39
+ - pytensor>=2.38.2,<2.39
- python-graphviz
- networkx
- rich>=13.7.1
11 changes: 11 additions & 0 deletions docs/source/api/dims/distributions.rst
@@ -39,3 +39,14 @@ Vector distributions
Categorical
MvNormal
ZeroSumNormal


+ Higher-Order distributions
+ ==========================
+
+ .. currentmodule:: pymc.dims
+ .. autosummary::
+    :toctree: generated/
+    :template: distribution.rst
+
+    Censored
2 changes: 1 addition & 1 deletion docs/source/api/dims/transforms.rst
@@ -2,7 +2,7 @@
Distribution Transforms
***********************

- .. currentmodule:: pymc.dims.transforms
+ .. currentmodule:: pymc.dims.distributions.transforms
.. autosummary::
:toctree: generated/

4 changes: 2 additions & 2 deletions docs/source/conf.py
@@ -298,10 +298,10 @@

# intersphinx configuration to ease linking arviz docs
intersphinx_mapping = {
"arviz": ("https://python.arviz.org/en/latest/", None),
"arviz": ("https://python.arviz.org/", "https://python.arviz.org/en/stable/objects.inv"),
"pytensor": ("https://pytensor.readthedocs.io/en/latest/", None),
"home": ("https://www.pymc.io", None),
"pmx": ("https://www.pymc.io/projects/experimental/en/latest", None),
"pmx": ("https://www.pymc.io/projects/extras/en/latest", None),
"numpy": ("https://numpy.org/doc/stable/", None),
"nb": ("https://www.pymc.io/projects/examples/en/latest/", None),
"myst": ("https://myst-parser.readthedocs.io/en/latest", None),
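For context on the `arviz` entry in the `conf.py` hunk above: each `intersphinx_mapping` value is a `(target_base_url, inventory)` pair. With `None`, Sphinx fetches `objects.inv` from the base URL itself; an explicit string points at a separate inventory location, which is what this diff switches to. A minimal sketch of the two forms, using URLs taken from the diff:

```python
# Sketch of the two intersphinx entry forms touched by this diff.
intersphinx_mapping = {
    # Implicit inventory: Sphinx resolves it as the base URL + "objects.inv".
    "pytensor": ("https://pytensor.readthedocs.io/en/latest/", None),
    # Explicit inventory: links render against python.arviz.org while the
    # inventory itself is fetched from the stable build.
    "arviz": (
        "https://python.arviz.org/",
        "https://python.arviz.org/en/stable/objects.inv",
    ),
}
```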
4 changes: 2 additions & 2 deletions docs/source/contributing/developer_guide.md
@@ -14,7 +14,7 @@ A more accessible, user facing deep introduction can be found in [Peadar Coyle's
Probability distributions in PyMC are implemented as classes that inherit from {class}`~pymc.Continuous` or {class}`~pymc.Discrete`.
Either of these inherit {class}`~pymc.Distribution` which defines the high level API.

- For a detailed introduction on how a new distribution should be implemented check out the {ref}`guide on implementing distributions <implementing_distribution>`.
+ For a detailed introduction on how a new distribution should be implemented check out the {ref}`guide on implementing distributions <implementing-a-distribution>`.


## Reflection
@@ -548,7 +548,7 @@ For example, see the [MH sampler](https://github.com/pymc-devs/pymc/blob/89f6fcf
This is of course very different compared to the transition kernel in e.g. TFP, which is a tensor in tensor out function.
Moreover, transition kernels in TFP do not flatten the tensors, see eg docstring of [tensorflow\_probability/python/mcmc/random\_walk\_metropolis.py](https://github.com/tensorflow/probability/blob/main/tensorflow_probability/python/mcmc/random_walk_metropolis.py):

- ```python
+ ```text
new_state_fn: Python callable which takes a list of state parts and a
seed; returns a same-type `list` of `Tensor`s, each being a perturbation
of the input state parts. The perturbation distribution is assumed to be
4 changes: 2 additions & 2 deletions docs/source/contributing/jupyter_style.md
@@ -70,10 +70,10 @@ This guide does not teach nor cover MyST extensively, only gives some opinionate
```
Example source:
```
- {ref}`how to use InferenceData <arviz:working_with_InferenceData>`
+ {ref}`how to use InferenceData <arviz:schema>`
```

- Rendered example: {ref}`how to use InferenceData <arviz:working_with_InferenceData>`
+ Rendered example: {ref}`how to use InferenceData <arviz:schema>`

where `key` in the pattern (`arviz` in the example) is one of the keys defined in
the `intersphinx_mapping` variable of `conf.py` such as `arviz`, `numpy`, `mpl`...
8 changes: 6 additions & 2 deletions docs/source/glossary.md
@@ -53,7 +53,7 @@ Likelihood
- For univariate, continuous scenarios, see the calibr8 paper: Bayesian calibration, process modeling and uncertainty quantification in biotechnology by Laura Marie Helleckes, Michael Osthege, Wolfgang Wiechert, Eric von Lieres, Marco Oldiges

Posterior
- The outcome of Bayesian inference is a posterior distribution, which describes the relative plausibilities of every possible combination of parameter values, given the observed data. We can think of the posterior as the updated {term}`priors` after the model has seen the data.
+ The outcome of Bayesian inference is a posterior distribution, which describes the relative plausibilities of every possible combination of parameter values, given the observed data. We can think of the posterior as the updated {term}`prior` after the model has seen the data.

When the posterior is obtained using numerical methods we generally need to first diagnose the quality of the computed approximation. This is necessary as, for example, methods like {term}`MCMC` has only asymptotic guarantees. In a Bayesian setting predictions can be simulated by sampling from the posterior predictive distribution. When such predictions are used to check the internal consistency of the models by comparing it with the observed data used for inference, the process is known as the posterior predictive checks.

@@ -76,6 +76,10 @@ GLM
[PMF](https://en.wikipedia.org/wiki/Probability_mass_function)
A function that gives the probability that a discrete random variable is exactly equal to some value.

+ [Maximum Likelihood Estimate](https://en.wikipedia.org/wiki/Maximum_likelihood_estimation)
+ [MLE](https://en.wikipedia.org/wiki/Maximum_likelihood_estimation)
+   A point-estimate of an unknown quantity obtained by finding the parameter values that maximize the {term}`likelihood` function.

[Maximum a Posteriori](https://en.wikipedia.org/wiki/Maximum_a_posteriori_estimation)
[MAP](https://en.wikipedia.org/wiki/Maximum_a_posteriori_estimation)
It is a point-estimate of an unknown quantity, that equals the mode of the posterior distribution.
@@ -104,7 +108,7 @@ Hierarchical Ordinary Differential Equation
Individual, group, or other level types calculations of {term}`Ordinary Differential Equation`'s.

[Generalized Poisson Distribution](https://doi.org/10.2307/1267389)
- A generalization of the {term}`Poisson distribution`, with two parameters X1, and X2, is obtained as a limiting form of the generalized negative binomial distribution. The variance of the distribution is greater than, equal to or smaller than the mean according as X2 is positive, zero or negative. For formula and more detail, visit the link in the title.
+ A generalization of the [Poisson distribution](https://en.wikipedia.org/wiki/Poisson_distribution), with two parameters X1, and X2, is obtained as a limiting form of the generalized negative binomial distribution. The variance of the distribution is greater than, equal to or smaller than the mean according as X2 is positive, zero or negative. For formula and more detail, visit the link in the title.

[Bayes' theorem](https://en.wikipedia.org/wiki/Bayes%27_theorem)
Describes the probability of an event, based on prior knowledge of conditions that might be related to the event. For example, if the risk of developing health problems is known to increase with age, Bayes' theorem allows the risk to an individual of a known age to be assessed more accurately (by conditioning it on their age) than simply assuming that the individual is typical of the population as a whole.
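The MLE glossary entry added in this diff has a convenient closed form for simple models. A hedged sketch for Bernoulli data (the data values are purely illustrative): maximizing the likelihood p^k (1-p)^(n-k) over p yields the sample mean.

```python
from statistics import mean

# Illustrative coin-flip data (1 = success); not from any real dataset.
data = [1, 0, 1, 1, 0, 1, 1, 0]

# For Bernoulli(p) the likelihood is p**k * (1 - p)**(n - k) with
# k = sum(data) and n = len(data); setting its derivative to zero
# gives the closed-form MLE p_hat = k / n, i.e. the sample mean.
p_mle = mean(data)
print(p_mle)  # 0.625
```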