spateo.segmentation.vi
======================

.. py:module:: spateo.segmentation.vi

.. autoapi-nested-parse::

   Variational inference implementation of a negative binomial mixture model using Pyro.

Classes
-------

.. autoapisummary::

   spateo.segmentation.vi.NegativeBinomialMixture

Functions
---------

.. autoapisummary::

   spateo.segmentation.vi.conditionals
   spateo.segmentation.vi.run_vi

Module Contents
---------------

.. py:class:: NegativeBinomialMixture(x: numpy.ndarray, n: int = 2, n_init: int = 5, w: Optional[numpy.ndarray] = None, mu: Optional[numpy.ndarray] = None, var: Optional[numpy.ndarray] = None, zero_inflated: bool = False, seed: Optional[int] = None)

   Bases: :py:obj:`pyro.nn.PyroModule`

   Subclass of :class:`torch.nn.Module` whose attributes can be modified by
   Pyro effects. Attributes can be set using the helpers :class:`PyroParam`
   and :class:`PyroSample`, and methods can be decorated with
   :func:`pyro_method`.

   **Parameters**

   To create a Pyro-managed parameter attribute, set that attribute using
   either :class:`torch.nn.Parameter` (for unconstrained parameters) or
   :class:`PyroParam` (for constrained parameters). Reading that attribute
   will then trigger a :func:`pyro.param` statement. For example::

       # Create Pyro-managed parameter attributes.
       my_module = PyroModule()
       my_module.loc = nn.Parameter(torch.tensor(0.))
       my_module.scale = PyroParam(torch.tensor(1.), constraint=constraints.positive)

       # Read the attributes.
       loc = my_module.loc      # Triggers a pyro.param statement.
       scale = my_module.scale  # Triggers another pyro.param statement.

   Note that, unlike normal :class:`torch.nn.Module` s, :class:`PyroModule` s
   should not be registered with :func:`pyro.module` statements.
   :class:`PyroModule` s can contain other :class:`PyroModule` s and normal
   :class:`torch.nn.Module` s. Accessing a normal :class:`torch.nn.Module`
   attribute of a :class:`PyroModule` triggers a :func:`pyro.module`
   statement. If multiple :class:`PyroModule` s appear in a single Pyro model
   or guide, they should be included in a single root :class:`PyroModule` for
   that model.

   :class:`PyroModule` s synchronize data with the param store at each
   ``setattr``, ``getattr``, and ``delattr`` event, based on the nested name
   of an attribute:

   - Setting ``mod.x = x_init`` tries to read ``x`` from the param store. If a
     value is found in the param store, that value is copied into ``mod`` and
     ``x_init`` is ignored; otherwise ``x_init`` is copied into both ``mod``
     and the param store.
   - Reading ``mod.x`` tries to read ``x`` from the param store. If a value is
     found in the param store, that value is copied into ``mod``; otherwise
     ``mod``'s value is copied into the param store. Finally, ``mod`` and the
     param store agree on a single value to return.
   - Deleting ``del mod.x`` removes the value from both ``mod`` and the param
     store.

   Note that two :class:`PyroModule` s of the same name will both synchronize
   with the global param store and thus contain the same data. When creating
   a :class:`PyroModule`, then deleting it, then creating another with the
   same name, the latter will be populated with the former's data from the
   param store. To avoid this persistence, either call
   ``pyro.clear_param_store()`` or call :func:`clear` before deleting a
   :class:`PyroModule`.

   :class:`PyroModule` s can be saved and loaded either directly using
   :func:`torch.save` / :func:`torch.load` or indirectly using the param
   store's :meth:`~pyro.params.param_store.ParamStoreDict.save` /
   :meth:`~pyro.params.param_store.ParamStoreDict.load`. Note that
   :func:`torch.load` will be overridden by any values in the param store, so
   it is safest to call ``pyro.clear_param_store()`` before loading.
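
   As a minimal sketch of the synchronization rules above (the module and
   attribute names here are illustrative, not part of the spateo API)::

       import torch
       import torch.nn as nn
       import pyro
       from pyro.nn import PyroModule

       pyro.clear_param_store()

       # Two root PyroModules with the same attribute name share data
       # through the global param store.
       a = PyroModule()
       a.loc = nn.Parameter(torch.tensor(0.0))  # copied into the param store
       _ = a.loc                                # triggers a pyro.param statement

       b = PyroModule()
       b.loc = nn.Parameter(torch.tensor(5.0))  # the store already has "loc",
       assert float(b.loc) == 0.0               # so 5.0 is ignored and 0.0 wins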
   **Samples**

   To create a Pyro-managed random attribute, set that attribute using the
   :class:`PyroSample` helper, specifying a prior distribution. Reading that
   attribute will then trigger a :func:`pyro.sample` statement. For example::

       # Create Pyro-managed random attributes.
       my_module.x = PyroSample(dist.Normal(0, 1))
       my_module.y = PyroSample(lambda self: dist.Normal(self.loc, self.scale))

       # Sample the attributes.
       x = my_module.x  # Triggers a pyro.sample statement.
       y = my_module.y  # Triggers one pyro.sample + two pyro.param statements.

   Sampling is cached within each invocation of ``.__call__()`` or of a
   method decorated by :func:`pyro_method`. Because sample statements can
   appear only once in a Pyro trace, you should ensure that traced access to
   sample attributes is wrapped in a single invocation of ``.__call__()`` or
   of a method decorated by :func:`pyro_method`.

   To make an existing module probabilistic, you can create a subclass and
   overwrite some parameters with :class:`PyroSample` s::

       class RandomLinear(nn.Linear, PyroModule):  # used as a mixin
           def __init__(self, in_features, out_features):
               super().__init__(in_features, out_features)
               self.weight = PyroSample(
                   lambda self: dist.Normal(0, 1)
                                    .expand([self.out_features, self.in_features])
                                    .to_event(2))

   **Mixin classes**

   :class:`PyroModule` can be used as a mixin class, and supports simple
   syntax for dynamically creating mixins. For example, the following are
   equivalent::

       # Version 1. create a named mixin class
       class PyroLinear(nn.Linear, PyroModule):
           pass

       m.linear = PyroLinear(m, n)

       # Version 2. create a dynamic mixin class
       m.linear = PyroModule[nn.Linear](m, n)

   This notation can be used recursively to create Bayesian modules, e.g.::

       model = PyroModule[nn.Sequential](
           PyroModule[nn.Linear](28 * 28, 100),
           PyroModule[nn.Sigmoid](),
           PyroModule[nn.Linear](100, 100),
           PyroModule[nn.Sigmoid](),
           PyroModule[nn.Linear](100, 10),
       )

       assert isinstance(model, nn.Sequential)
       assert isinstance(model, PyroModule)

       # Now we can be Bayesian about weights in the first layer.
       model[0].weight = PyroSample(
           prior=dist.Normal(0, 1).expand([28 * 28, 100]).to_event(2))
       guide = AutoDiagonalNormal(model)

   Note that ``PyroModule[...]`` does not recursively mix in
   :class:`PyroModule` to submodules of the input ``Module``; hence we needed
   to wrap each submodule of the ``nn.Sequential`` above.

   :param str name: Optional name for a root PyroModule. This is ignored in
       sub-PyroModules of another PyroModule.

   .. py:attribute:: zero_inflated
      :value: False

   .. py:attribute:: x

   .. py:attribute:: n
      :value: 2

   .. py:attribute:: scale

   .. py:attribute:: __optimizer
      :value: None

   .. py:method:: assignment(train=False)

   .. py:method:: dist(assignment, train=False)

   .. py:method:: init_best_params(n_init)

   .. py:method:: init_mean_variance(w, mu, var)

   .. py:method:: optimizer()

   .. py:method:: get_params(train=False, transform=True)

   .. py:method:: forward(x)

   .. py:method:: train(n_epochs: int = 500)

      Fit the mixture model by running ``n_epochs`` epochs of variational
      inference. Note that this overrides :meth:`torch.nn.Module.train`,
      which normally toggles between training and evaluation mode via a
      boolean ``mode`` argument; here the argument is instead the number of
      training epochs.

      :param n_epochs: Number of epochs to train for. Default: ``500``.

   .. py:method:: conditionals(params, x, use_weights=False)
      :staticmethod:
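
   A minimal usage sketch (not from the spateo docs; the synthetic counts
   and parameter values are illustrative)::

       import numpy as np
       from spateo.segmentation.vi import NegativeBinomialMixture

       rng = np.random.default_rng(0)
       # Synthetic per-pixel UMI counts: a low-count background component
       # mixed with a high-count foreground component.
       x = np.concatenate([
           rng.negative_binomial(2, 0.5, size=5000),    # background-like
           rng.negative_binomial(20, 0.05, size=5000),  # foreground-like
       ])

       nbm = NegativeBinomialMixture(x, n=2, seed=0)
       nbm.train(n_epochs=500)               # run variational inference
       params = nbm.get_params(train=False)  # estimated mixture parameters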
.. py:function:: conditionals(X: numpy.ndarray, vi_results: Union[Dict[int, Dict[str, float]], Dict[str, float]], bins: Optional[numpy.ndarray] = None) -> Tuple[numpy.ndarray, numpy.ndarray]

   Compute the conditional probabilities, for each pixel, of observing the
   observed number of UMIs given that the pixel is background/foreground.

   :param X: UMI counts per pixel.
   :param vi_results: Return value of :func:`run_vi`.
   :param bins: Pixel bins, as passed to :func:`run_vi`.

   :returns: Two NumPy arrays, the first containing the background
             conditional probabilities and the second the foreground
             conditional probabilities.

   :raises SegmentationError: If ``vi_results`` contains per-bin results but
       ``bins`` was not provided.

.. py:function:: run_vi(X: numpy.ndarray, downsample: Union[int, float] = 0.01, n_epochs: int = 500, bins: Optional[numpy.ndarray] = None, params: Union[Dict[str, Tuple[float, float]], Dict[int, Dict[str, Tuple[float, float]]]] = dict(w=(0.5, 0.5), mu=(10.0, 300.0), var=(20.0, 400.0)), zero_inflated: bool = False, seed: Optional[int] = None) -> Union[Tuple[Tuple[float, float], Tuple[float, float], Tuple[float, float]], Dict[int, Tuple[Tuple[float, float], Tuple[float, float], Tuple[float, float]]]]

   Run negative binomial mixture variational inference.

   :param X: UMI counts per pixel.
   :param downsample: Number of pixels to sample for training. If a float,
       interpreted as the fraction of all pixels to sample.
   :param n_epochs: Number of epochs to train for.
   :param bins: Pixel bins; when provided, a separate model is fit to each
       bin.
   :param params: Initial ``(background, foreground)`` values for the mixture
       weights ``w``, means ``mu``, and variances ``var``, either as a single
       dictionary or as one dictionary per bin.
   :param zero_inflated: Whether to use a zero-inflated negative binomial
       mixture.
   :param seed: Random seed for reproducibility.

   :returns: Estimated ``w``, ``mu``, ``var`` parameters as
       ``(background, foreground)`` tuples, or a dictionary mapping each bin
       to such a tuple of tuples when ``bins`` is provided.
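
A minimal end-to-end sketch (the synthetic array stands in for real per-pixel
UMI counts; per the docstrings above, :func:`conditionals` takes
:func:`run_vi`'s return value directly)::

    import numpy as np
    from spateo.segmentation.vi import conditionals, run_vi

    rng = np.random.default_rng(0)
    X = rng.poisson(5, size=(256, 256))  # stand-in for a real UMI count image

    # Estimate background/foreground negative binomial parameters.
    vi_results = run_vi(X, downsample=0.01, n_epochs=500, seed=0)

    # Per-pixel likelihoods under each component.
    background_probs, foreground_probs = conditionals(X, vi_results)

    # Pixels more consistent with the foreground component.
    foreground_mask = foreground_probs > background_probs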