spateo.tdr.interpolations.interpolation_deeplearn.deep_interpolation¶
Classes¶
This module loads and retains the data pairs (X, Y) and delivers batches of them to the DeepInterpolation module upon calling.
Functions¶
Module Contents¶
- class spateo.tdr.interpolations.interpolation_deeplearn.deep_interpolation.DeepInterpolation(model: types.ModuleType, data_sampler: object, sirens: bool = False, enforce_positivity: bool = False, loss_function: Callable | None = weighted_mse(), smoothing_factor: float | None = True, stability_factor: float | None = True, load_model_from_buffer: bool = False, buffer_path: str = 'model_buffer/', hidden_features: int = 256, hidden_layers: int = 3, first_omega_0: float = 30.0, hidden_omega_0: float = 30.0, **kwargs)[source]¶
-
- train(max_iter: int, data_batch_size: int, autoencoder_batch_size: int, data_lr: float, autoencoder_lr: float, sample_fraction: float = 1, iter_per_sample_update: int | None = None)[source]¶
The training method for the DeepInterpolation model object.
- Parameters:
- max_iter
The maximum number of iterations for which the network will be trained.
- data_batch_size
The size of the data sample batches to be generated in each iteration.
- autoencoder_batch_size
The size of the auto-encoder training batches to be generated in each iteration. Must be no greater than data_batch_size.
- data_lr
The learning rate for network training.
- autoencoder_lr
The learning rate for training the auto-encoder. Has no effect if network_dim equals data_dim.
- sample_fraction
The fraction of best samples (those with the lowest loss) to select from the velocity samples.
- iter_per_sample_update
How often (in iterations) the subset of best samples is updated. Has no effect if velocity_sample_fraction and time_course_sample_fraction are both set to 1.
- spateo.tdr.interpolations.interpolation_deeplearn.deep_interpolation.subset_best_samples(best_sample_fraction, y_hat, y, loss_func)[source]¶
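A minimal NumPy sketch of what subset_best_samples plausibly computes, assuming it keeps the fraction of samples with the lowest per-sample loss. This reimplementation is an assumption for illustration, not the actual spateo code:

```python
import numpy as np

def subset_best_samples_sketch(best_sample_fraction, y_hat, y, loss_func):
    """Keep the indices of the best-fitting fraction of samples.

    loss_func is assumed to return one loss value per sample; lower is better.
    """
    per_sample_loss = loss_func(y_hat, y)
    n_keep = int(len(y) * best_sample_fraction)
    # argsort ascending: the first n_keep indices have the smallest loss.
    return np.argsort(per_sample_loss)[:n_keep]

# Four samples; per-sample MSE losses are 0.01, 0.0, 9.0, 0.25.
y = np.array([[1.0], [2.0], [3.0], [4.0]])
y_hat = np.array([[1.1], [2.0], [0.0], [4.5]])
mse = lambda a, b: ((a - b) ** 2).mean(axis=1)
best = subset_best_samples_sketch(0.5, y_hat, y, mse)
# Keeping the best half selects samples 1 and 0.
```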
- class spateo.tdr.interpolations.interpolation_deeplearn.deep_interpolation.DataSampler(path_to_data: str | None = None, data: anndata.AnnData | dict | None = None, skey: str = 'spatial', ekey: str = 'M_s', wkey: str | None = None, normalize_data: bool = False, number_of_random_samples: str = 'all', weighted: bool = False)[source]¶
Bases:
object
This module loads and retains the data pairs (X, Y) and delivers batches of them to the DeepInterpolation module upon calling. The module can load the data from a .mat file. The file must contain two 2D matrices, X and Y, with an equal number of rows.
X: The spatial coordinates of each cell / binning / segmentation. Y: The expression values at the corresponding coordinates X.
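The shape contract above can be illustrated with synthetic arrays (the sizes below are arbitrary assumptions): each row of X is one spatial coordinate, the matching row of Y holds its expression values, and the two matrices must have equal row counts.

```python
import numpy as np

n_cells, n_genes = 100, 20

# X: spatial coordinates of each cell / binning / segmentation (here 3D).
X = np.random.rand(n_cells, 3)
# Y: expression values at the corresponding coordinates in X.
Y = np.random.rand(n_cells, n_genes)

# The one hard requirement stated above: equal numbers of rows.
assert X.shape[0] == Y.shape[0]
```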
- generate_batch(batch_size: int, sample_subset_indices: str = 'all')[source]¶
Generate random batches of the given size “batch_size” from the (X, Y) sample pairs.
- Parameters:
- batch_size
The number of samples per batch. If set to “all”, all the samples will be returned.
- sample_subset_indices
Used to further subset the samples (e.g., based on sample quality). If set to “all”, no samples are filtered out.
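The batching behavior described above can be sketched as follows, assuming uniform random sampling without replacement and without weighting. This mirrors, but is not, DataSampler.generate_batch:

```python
import numpy as np

def generate_batch_sketch(X, Y, batch_size, sample_subset_indices="all"):
    """Draw a random batch from the (X, Y) sample pairs.

    batch_size == "all" returns every sample in the (possibly subset) pool;
    sample_subset_indices == "all" means no quality-based filtering.
    """
    if isinstance(sample_subset_indices, str):  # "all": use the full pool
        pool = np.arange(len(X))
    else:
        pool = np.asarray(sample_subset_indices)
    if batch_size == "all":
        idx = pool
    else:
        idx = np.random.choice(pool, size=batch_size, replace=False)
    return X[idx], Y[idx]

X = np.arange(10, dtype=float).reshape(5, 2)
Y = np.arange(15, dtype=float).reshape(5, 3)
bx, by = generate_batch_sketch(X, Y, batch_size=3)          # random batch of 3
ax, ay = generate_batch_sketch(X, Y, batch_size="all")      # all 5 samples
sx, sy = generate_batch_sketch(X, Y, "all", [0, 2])         # filtered pool
```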