dival.reconstructors package
Subpackages
Submodules
- dival.reconstructors.dip_ct_reconstructor module
DeepImagePriorCTReconstructor
DeepImagePriorCTReconstructor.HYPER_PARAMS
DeepImagePriorCTReconstructor.__init__()
DeepImagePriorCTReconstructor.get_activation()
DeepImagePriorCTReconstructor.channels
DeepImagePriorCTReconstructor.gamma
DeepImagePriorCTReconstructor.iterations
DeepImagePriorCTReconstructor.loss_function
DeepImagePriorCTReconstructor.lr
DeepImagePriorCTReconstructor.mu_max
DeepImagePriorCTReconstructor.photons_per_pixel
DeepImagePriorCTReconstructor.scales
DeepImagePriorCTReconstructor.skip_channels
- dival.reconstructors.fbpunet_reconstructor module
FBPUNetReconstructor
FBPUNetReconstructor.HYPER_PARAMS
FBPUNetReconstructor.__init__()
FBPUNetReconstructor.train()
FBPUNetReconstructor.init_model()
FBPUNetReconstructor.init_scheduler()
FBPUNetReconstructor.batch_size
FBPUNetReconstructor.channels
FBPUNetReconstructor.epochs
FBPUNetReconstructor.filter_type
FBPUNetReconstructor.frequency_scaling
FBPUNetReconstructor.init_bias_zero
FBPUNetReconstructor.lr
FBPUNetReconstructor.lr_min
FBPUNetReconstructor.normalize_by_opnorm
FBPUNetReconstructor.scales
FBPUNetReconstructor.scheduler
FBPUNetReconstructor.skip_channels
FBPUNetReconstructor.use_sigmoid
- dival.reconstructors.iradonmap_reconstructor module
IRadonMapReconstructor
IRadonMapReconstructor.HYPER_PARAMS
IRadonMapReconstructor.__init__()
IRadonMapReconstructor.init_model()
IRadonMapReconstructor.batch_size
IRadonMapReconstructor.epochs
IRadonMapReconstructor.fully_learned
IRadonMapReconstructor.lr
IRadonMapReconstructor.normalize_by_opnorm
IRadonMapReconstructor.scales
IRadonMapReconstructor.skip_channels
IRadonMapReconstructor.use_sigmoid
- dival.reconstructors.learnedgd_reconstructor module
LearnedGDReconstructor
LearnedGDReconstructor.HYPER_PARAMS
LearnedGDReconstructor.__init__()
LearnedGDReconstructor.init_model()
LearnedGDReconstructor.batch_norm
LearnedGDReconstructor.batch_size
LearnedGDReconstructor.epochs
LearnedGDReconstructor.init_fbp
LearnedGDReconstructor.init_filter_type
LearnedGDReconstructor.init_frequency_scaling
LearnedGDReconstructor.init_weight_gain
LearnedGDReconstructor.init_weight_xavier_normal
LearnedGDReconstructor.internal_ch
LearnedGDReconstructor.kernel_size
LearnedGDReconstructor.lr
LearnedGDReconstructor.lr_time_decay_rate
LearnedGDReconstructor.lrelu_coeff
LearnedGDReconstructor.niter
LearnedGDReconstructor.nlayer
LearnedGDReconstructor.normalize_by_opnorm
LearnedGDReconstructor.prelu
LearnedGDReconstructor.use_sigmoid
- dival.reconstructors.learnedpd_reconstructor module
LearnedPDReconstructor
LearnedPDReconstructor.HYPER_PARAMS
LearnedPDReconstructor.__init__()
LearnedPDReconstructor.init_model()
LearnedPDReconstructor.init_optimizer()
LearnedPDReconstructor.init_scheduler()
LearnedPDReconstructor.batch_norm
LearnedPDReconstructor.batch_size
LearnedPDReconstructor.epochs
LearnedPDReconstructor.init_fbp
LearnedPDReconstructor.init_filter_type
LearnedPDReconstructor.init_frequency_scaling
LearnedPDReconstructor.internal_ch
LearnedPDReconstructor.kernel_size
LearnedPDReconstructor.lr
LearnedPDReconstructor.lr_min
LearnedPDReconstructor.lrelu_coeff
LearnedPDReconstructor.ndual
LearnedPDReconstructor.niter
LearnedPDReconstructor.nlayer
LearnedPDReconstructor.normalize_by_opnorm
LearnedPDReconstructor.nprimal
LearnedPDReconstructor.prelu
LearnedPDReconstructor.use_sigmoid
- dival.reconstructors.odl_reconstructors module
FBPReconstructor
CGReconstructor
GaussNewtonReconstructor
KaczmarzReconstructor
KaczmarzReconstructor.ops
KaczmarzReconstructor.x0
KaczmarzReconstructor.niter
KaczmarzReconstructor.omega
KaczmarzReconstructor.random
KaczmarzReconstructor.projection
KaczmarzReconstructor.callback
KaczmarzReconstructor.callback_loop
KaczmarzReconstructor.__init__()
KaczmarzReconstructor.iterations
LandweberReconstructor
MLEMReconstructor
ISTAReconstructor
PDHGReconstructor
DouglasRachfordReconstructor
ForwardBackwardReconstructor
ADMMReconstructor
BFGSReconstructor
- dival.reconstructors.reconstructor module
Reconstructor
Reconstructor.reco_space
Reconstructor.observation_space
Reconstructor.name
Reconstructor.hyper_params
Reconstructor.HYPER_PARAMS
Reconstructor.__init__()
Reconstructor.reconstruct()
Reconstructor.save_hyper_params()
Reconstructor.load_hyper_params()
Reconstructor.save_params()
Reconstructor.load_params()
LearnedReconstructor
IterativeReconstructor
StandardIterativeReconstructor
FunctionReconstructor
- dival.reconstructors.regression_reconstructors module
- dival.reconstructors.standard_learned_reconstructor module
StandardLearnedReconstructor
StandardLearnedReconstructor.model
StandardLearnedReconstructor.non_normed_op
StandardLearnedReconstructor.HYPER_PARAMS
StandardLearnedReconstructor.__init__()
StandardLearnedReconstructor.opnorm
StandardLearnedReconstructor.op
StandardLearnedReconstructor.eval()
StandardLearnedReconstructor.train()
StandardLearnedReconstructor.init_transform()
StandardLearnedReconstructor.transform
StandardLearnedReconstructor.init_model()
StandardLearnedReconstructor.init_optimizer()
StandardLearnedReconstructor.optimizer
StandardLearnedReconstructor.init_scheduler()
StandardLearnedReconstructor.scheduler
StandardLearnedReconstructor.batch_size
StandardLearnedReconstructor.epochs
StandardLearnedReconstructor.lr
StandardLearnedReconstructor.normalize_by_opnorm
StandardLearnedReconstructor.save_learned_params()
StandardLearnedReconstructor.load_learned_params()
- dival.reconstructors.tvadam_ct_reconstructor module
TVAdamCTReconstructor
TVAdamCTReconstructor.HYPER_PARAMS
TVAdamCTReconstructor.__init__()
TVAdamCTReconstructor.gamma
TVAdamCTReconstructor.init_filter_type
TVAdamCTReconstructor.init_frequency_scaling
TVAdamCTReconstructor.iterations
TVAdamCTReconstructor.loss_function
TVAdamCTReconstructor.lr
TVAdamCTReconstructor.mu_max
TVAdamCTReconstructor.photons_per_pixel
Module contents
- class dival.reconstructors.Reconstructor(reco_space=None, observation_space=None, name='', hyper_params=None)[source]
Bases:
object
Abstract reconstructor base class.
There are two ways of implementing a Reconstructor subclass:
- Implement reconstruct(). It has to support optional in-place and out-of-place evaluation.
- Implement _reconstruct(). It must have one of the following signatures:
  - _reconstruct(self, observation, out) (in-place)
  - _reconstruct(self, observation) (out-of-place)
  - _reconstruct(self, observation, out=None) (optional in-place)
The class attribute HYPER_PARAMS defines the hyper parameters of the reconstructor class. The current values for a reconstructor instance are given by the attribute hyper_params. Properties wrapping hyper_params are automatically created by the metaclass (for hyper parameter names that are valid identifiers), such that the hyper parameters can be written and read like instance attributes.
- reco_space
Reconstruction space.
- Type:
odl.discr.DiscretizedSpace, optional
- observation_space
Observation space.
- Type:
odl.discr.DiscretizedSpace, optional
- name
Name of the reconstructor.
- Type:
str
- hyper_params
Current hyper parameter values. Initialized automatically using the default values from HYPER_PARAMS (but may be overridden by hyper_params passed to __init__()). It is expected to have the same keys as HYPER_PARAMS. The values for these keys in this dict are wrapped by properties with the key as identifier (if possible), so an assignment to the property changes the value in this dict and vice versa.
- Type:
dict
- HYPER_PARAMS = {}
Specification of hyper parameters.
This class attribute is a dict that lists the hyper parameters of the reconstructor. It should not be hidden by an instance attribute of the same name (i.e. by assigning a value to self.HYPER_PARAMS in an instance of a subtype).
Note: in order to inherit HYPER_PARAMS from a super class, the subclass should create a deep copy of it, i.e. execute HYPER_PARAMS = copy.deepcopy(SuperReconstructorClass.HYPER_PARAMS) in the class body.
The keys of this dict are the names of the hyper parameters, and each value is a dict with the following fields.
Standard fields:
'default'
Default value.
'retrain'
bool, optional
Whether training depends on the parameter. Default: False. Any custom subclass of LearnedReconstructor must set this field to True if training depends on the parameter value.
Hyper parameter search fields:
'range'
(float, float), optional
Interval of valid values. If this field is set, the parameter is taken to be real-valued. Either 'range' or 'choices' has to be set.
'choices'
sequence, optional
Sequence of valid values of any type. If this field is set, 'range' is ignored. Can be used to perform manual grid search. Either 'range' or 'choices' has to be set.
'method'
{'grid_search', 'hyperopt'}, optional
Optimization method for the parameter. Default: 'grid_search'. Options are:
'grid_search'
Grid search over a sequence of fixed values. Can be configured by the dict 'grid_search_options'.
'hyperopt'
Random search using the hyperopt package. Can be configured by the dict 'hyperopt_options'.
'grid_search_options'
dict
Option dict for grid search.
The following fields determine how 'range' is sampled (in case it is specified and no 'choices' are specified):
'num_samples'
int, optional
Number of values. Default: 10.
'type'
{'linear', 'logarithmic'}, optional
Type of grid, i.e. distribution of the values. Default: 'linear'. Options are:
'linear'
Equidistant values in the 'range'.
'logarithmic'
Values in the 'range' that are equidistant in the log scale.
'log_base'
int, optional
Log-base that is used if 'type' is 'logarithmic'. Default: 10.
'hyperopt_options'
dict
Option dict for 'hyperopt' method with the fields:
'space'
hyperopt space, optional
Custom hyperopt search space. If this field is set, 'range' and 'type' are ignored.
'type'
{'uniform'}, optional
Type of the space for sampling. Default: 'uniform'. Options are:
'uniform'
Uniform distribution over the 'range'.
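To make the two implementation routes and the HYPER_PARAMS mechanism concrete, here is a minimal sketch of a subclass; the class name and its 'gamma' hyper parameter are hypothetical and serve only as an illustration.

```python
import copy

import numpy as np

from dival.reconstructors import Reconstructor


class ScalingReconstructor(Reconstructor):
    """Hypothetical reconstructor that merely scales the observation."""

    # inherit the (empty) spec via a deep copy, then add an own parameter
    HYPER_PARAMS = copy.deepcopy(Reconstructor.HYPER_PARAMS)
    HYPER_PARAMS['gamma'] = {
        'default': 1.0,
        'range': (0., 10.),  # real-valued, enables hyper parameter search
    }

    def _reconstruct(self, observation, out=None):
        # optional in-place signature: write into ``out`` when it is given
        reco = self.gamma * np.asarray(observation)
        if out is not None:
            out[:] = reco
            return out
        return reco
```

Since 'gamma' is a key of HYPER_PARAMS, the metaclass exposes it as a property, so r.gamma = 2. and r.hyper_params['gamma'] = 2. are equivalent.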
- reconstruct(observation, out=None)[source]
Reconstruct input data from observation data.
The default implementation calls _reconstruct, automatically choosing in-place or out-of-place evaluation.
- Parameters:
observation (observation_space element-like) – The observation data.
out (reco_space element-like, optional) – Array to which the result is written (in-place evaluation). If None, a new array is created (out-of-place evaluation); in this case the new array is initialized with zero before _reconstruct() is called.
- Returns:
reconstruction – The reconstruction.
- Return type:
reco_space element or out
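A brief usage sketch of the two evaluation modes, assuming a reconstructor instance r with matching spaces and observation data (variable names are illustrative):

```python
# out-of-place: a new reconstruction is created and returned
reco = r.reconstruct(observation)

# in-place: the result is written into a preallocated reco_space element
out = r.reco_space.zero()
r.reconstruct(observation, out=out)  # ``out`` now holds the reconstruction
```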
- save_hyper_params(path)[source]
Save hyper parameters to JSON file. See also load_hyper_params().
- Parameters:
path (str) – Path of the file in which the hyper parameters should be saved. The ending '.json' is automatically appended if not included.
- load_hyper_params(path)[source]
Load hyper parameters from JSON file. See also save_hyper_params().
- Parameters:
path (str) – Path of the file in which the hyper parameters are stored. The ending '.json' is automatically appended if not included.
- save_params(path=None, hyper_params_path=None)[source]
Save all parameters to file. E.g. for learned reconstructors, both hyper parameters and learned parameters should be included. The purpose of this method, together with load_params(), is to define a unified way of saving and loading any kind of reconstructor. The default implementation calls save_hyper_params(). Subclasses must reimplement this method in order to include non-hyper parameters.
Implementations should derive a sensible default for hyper_params_path from path, such that all parameters can be saved and loaded by specifying only path. Recommended patterns are:
- if non-hyper parameters are stored in a single file and path specifies it without file ending: hyper_params_path=path + '_hyper_params.json'
- if non-hyper parameters are stored in a directory: hyper_params_path=os.path.join(path, 'hyper_params.json')
- if there are no non-hyper parameters, this default implementation can be used: hyper_params_path=path + '_hyper_params.json'
- Parameters:
path (str[, optional]) – Path at which all (non-hyper) parameters should be saved. This argument is required if the reconstructor has non-hyper parameters or hyper_params_path is omitted. If the reconstructor has non-hyper parameters, the implementation may interpret it as a file path or as a directory path for multiple files (the dir should be created by this method if it does not exist). If the implementation expects a file path, it should accept it without file ending.
hyper_params_path (str, optional) – Path of the file in which the hyper parameters should be saved. The ending '.json' is automatically appended if not included. If not specified, it should be determined from path (see method description above). The default implementation saves to the file path + '_hyper_params.json'.
- load_params(path=None, hyper_params_path=None)[source]
Load all parameters from file. E.g. for learned reconstructors, both hyper parameters and learned parameters should be included. The purpose of this method, together with save_params(), is to define a unified way of saving and loading any kind of reconstructor. The default implementation calls load_hyper_params(). Subclasses must reimplement this method in order to include non-hyper parameters.
See save_params() for recommended patterns to derive a default hyper_params_path from path.
- Parameters:
path (str[, optional]) – Path at which all (non-hyper) parameters are stored. This argument is required if the reconstructor has non-hyper parameters or hyper_params_path is omitted. If the reconstructor has non-hyper parameters, the implementation may interpret it as a file path or as a directory path for multiple files. If the implementation expects a file path, it should accept it without file ending.
hyper_params_path (str, optional) – Path of the file in which the hyper parameters are stored. The ending '.json' is automatically appended if not included. If not specified, it should be determined from path (see description of save_params()). The default implementation reads from the file path + '_hyper_params.json'.
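As an illustration of the default path convention, a hedged sketch (the file name is an example only):

```python
# writes 'ct_baseline_hyper_params.json' (the default implementation only
# stores hyper parameters)
r.save_params('ct_baseline')

# later: restore the hyper parameters into another instance of the same class
r2 = type(r)(reco_space=r.reco_space, observation_space=r.observation_space)
r2.load_params('ct_baseline')
```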
- class dival.reconstructors.IterativeReconstructor(callback=None, **kwargs)[source]
Bases:
Reconstructor
Iterative reconstructor base class. It is recommended to use StandardIterativeReconstructor as a base class for iterative reconstructors if suitable, which provides some default implementation.
Subclasses must call callback after each iteration in self.reconstruct. This is e.g. required by the evaluation module.
- callback
Callback to be called after each iteration.
- Type:
odl.solvers.util.callback.Callback or None
- HYPER_PARAMS = {'iterations': {'default': 100, 'retrain': False}}
Specification of hyper parameters.
This class attribute follows the same specification format as Reconstructor.HYPER_PARAMS above; see that entry for the meaning of the 'default', 'retrain' and hyper parameter search fields.
- __init__(callback=None, **kwargs)[source]
- Parameters:
callback (odl.solvers.util.callback.Callback, optional) – Callback to be called after each iteration.
- reconstruct(observation, out=None, callback=None)[source]
Reconstruct input data from observation data.
Same as Reconstructor.reconstruct(), but with an additional optional callback parameter.
- Parameters:
observation (observation_space element-like) – The observation data.
out (reco_space element-like, optional) – Array to which the result is written (in-place evaluation). If None, a new array is created (out-of-place evaluation).
callback (odl.solvers.util.callback.Callback, optional) – Additional callback for this reconstruction that is temporarily composed with callback, i.e. also called after each iteration. If None, just callback is called.
- Returns:
reconstruction – The reconstruction.
- Return type:
reco_space element or out
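For example, an ODL callback set on the instance can be combined with a per-call callback (a sketch assuming r is an instance of an IterativeReconstructor subclass and observation is matching data):

```python
import odl

# instance-level callback, called after each iteration of every reconstruction
r.callback = odl.solvers.CallbackPrintIteration()

# additional per-call callback, temporarily composed with ``r.callback``
reco = r.reconstruct(observation, callback=odl.solvers.CallbackPrintTiming())
```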
- property iterations
- class dival.reconstructors.StandardIterativeReconstructor(x0=None, callback=None, **kwargs)[source]
Bases:
IterativeReconstructor
Standard iterative reconstructor base class.
Provides a default implementation that only requires subclasses to implement _compute_iterate() and optionally _setup().
- x0
Default initial value for the iterative reconstruction. Can be overridden by passing a different x0 to reconstruct().
- Type:
reco_space element-like or None
- callback
Callback that is called after each iteration.
- Type:
odl.solvers.util.callback.Callback or None
- __init__(x0=None, callback=None, **kwargs)[source]
- Parameters:
x0 (reco_space element-like, optional) – Default initial value for the iterative reconstruction. Can be overridden by passing a different x0 to reconstruct().
callback (odl.solvers.util.callback.Callback, optional) – Callback that is called after each iteration.
- reconstruct(observation, out=None, x0=None, last_iter=0, callback=None)[source]
Reconstruct input data from observation data.
Same as Reconstructor.reconstruct(), but with additional options for iterative reconstructors.
- Parameters:
observation (observation_space element-like) – The observation data.
out (reco_space element-like, optional) – Array to which the result is written (in-place evaluation). If None, a new array is created (out-of-place evaluation).
x0 (reco_space element-like, optional) – Initial value for the iterative reconstruction. Overrides the attribute x0, which can be set when calling __init__(). If both x0 and this argument are None, the default implementation uses the value of out if called in-place, or zero if called out-of-place.
last_iter (int, optional) – If x0 is the result of an iteration by this method, this can be used to specify the number of iterations so far. The number of iterations for the current call is self.hyper_params['iterations'] - last_iter.
callback (odl.solvers.util.callback.Callback, optional) – Additional callback for this reconstruction that is temporarily composed with callback, i.e. also called after each iteration. If None, just callback is called.
- Returns:
reconstruction – The reconstruction.
- Return type:
reco_space element or out
- property iterations
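The following hedged sketch shows how the two hooks could be filled in for a simple gradient-descent scheme. The class is not part of dival, the forward operator op is assumed to be an ODL operator, and the in-place hook signature _compute_iterate(self, observation, reco_previous, out) is an assumption that should be checked against the dival source.

```python
from dival.reconstructors import StandardIterativeReconstructor


class GradientDescentReconstructor(StandardIterativeReconstructor):
    """Hypothetical reconstructor iterating x <- x - omega * A^*(A x - y)."""

    def __init__(self, op, omega=1e-3, **kwargs):
        self.op = op        # assumed: ODL operator reco_space -> observation_space
        self.omega = omega
        super().__init__(reco_space=op.domain, observation_space=op.range,
                         **kwargs)

    def _setup(self, observation):
        # optional hook: one-time preparations before the iteration loop
        self._y = self.observation_space.element(observation)

    def _compute_iterate(self, observation, reco_previous, out):
        # assumed in-place hook: write the next iterate into ``out``
        out[:] = reco_previous - self.omega * self.op.adjoint(
            self.op(reco_previous) - self._y)
```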
- class dival.reconstructors.LearnedReconstructor(reco_space=None, observation_space=None, name='', hyper_params=None)[source]
Bases:
Reconstructor
- train(dataset)[source]
Train the reconstructor with a dataset by adapting its parameters.
Should only use the training and validation data from dataset.
- Parameters:
dataset (Dataset) – The dataset from which the training data should be used.
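For instance, a learned reconstructor from this package can be trained on a dival standard dataset roughly as follows (a hedged sketch; it assumes the torch and astra dependencies are installed, and the hyper parameter value is illustrative only):

```python
from dival import get_standard_dataset
from dival.reconstructors.fbpunet_reconstructor import FBPUNetReconstructor

# LoDoPaB-CT standard dataset; the data is downloaded on first use
dataset = get_standard_dataset('lodopab', impl='astra_cpu')

reconstructor = FBPUNetReconstructor(dataset.ray_trafo)
reconstructor.epochs = 2           # illustrative; the default trains much longer
reconstructor.train(dataset)       # uses the 'train' and 'validation' parts only

observation, ground_truth = dataset.get_sample(0, part='test')
reco = reconstructor.reconstruct(observation)
```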
- save_params(path, hyper_params_path=None)[source]
Save all parameters to file.
Calls save_hyper_params() and save_learned_params(), where save_learned_params() should be implemented by the subclass.
This implementation assumes that path is interpreted as a single file name, preferably specified without file ending. If path is a directory, the subclass needs to reimplement this method in order to follow the recommended default value pattern: hyper_params_path=os.path.join(path, 'hyper_params.json').
- Parameters:
path (str) – Path at which the learned parameters should be saved. Passed to save_learned_params(). If the implementation interprets it as a file path, it is preferred to exclude the file ending (otherwise the default value of hyper_params_path is suboptimal).
hyper_params_path (str, optional) – Path of the file in which the hyper parameters should be saved. The ending '.json' is automatically appended if not included. If not specified, this implementation saves to the file path + '_hyper_params.json'.
- load_params(path, hyper_params_path=None)[source]
Load all parameters from file.
Calls load_hyper_params() and load_learned_params(), where load_learned_params() should be implemented by the subclass.
This implementation assumes that path is interpreted as a single file name, preferably specified without file ending. If path is a directory, the subclass needs to reimplement this method in order to follow the recommended default value pattern: hyper_params_path=os.path.join(path, 'hyper_params.json').
- Parameters:
path (str) – Path at which the parameters are stored. Passed to load_learned_params(). If the implementation interprets it as a file path, it is preferred to exclude the file ending (otherwise the default value of hyper_params_path is suboptimal).
hyper_params_path (str, optional) – Path of the file in which the hyper parameters are stored. The ending '.json' is automatically appended if not included. If not specified, this implementation reads from the file path + '_hyper_params.json'.
- save_learned_params(path)[source]
Save learned parameters to file.
- Parameters:
path (str) – Path at which the learned parameters should be saved. Implementations may interpret this as a file path or as a directory path for multiple files (which then should be created if it does not exist). If the implementation expects a file path, it should accept it without file ending.
- load_learned_params(path)[source]
Load learned parameters from file.
- Parameters:
path (str) – Path at which the learned parameters are stored. Implementations may interpret this as a file path or as a directory path for multiple files. If the implementation expects a file path, it should accept it without file ending.
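A hedged skeleton of a custom subclass that stores its learned parameters with NumPy; everything about the class (its name, the toy "training" procedure and the single learned scalar) is hypothetical and only illustrates which hooks a subclass fills in.

```python
import numpy as np

from dival.reconstructors import LearnedReconstructor


class GlobalScaleReconstructor(LearnedReconstructor):
    """Hypothetical learned reconstructor with a single learned scalar."""

    def train(self, dataset):
        # toy fit: average norm ratio over a few training samples
        # (illustration only, not a useful reconstruction model)
        ratios = []
        for i in range(min(10, dataset.get_len('train'))):
            obs, gt = dataset.get_sample(i, part='train')
            ratios.append(np.linalg.norm(np.asarray(gt))
                          / np.linalg.norm(np.asarray(obs)))
        self.scale = float(np.mean(ratios))

    def _reconstruct(self, observation):
        return self.scale * np.asarray(observation)

    def save_learned_params(self, path):
        # ``path`` is interpreted as a file path without ending
        np.save(path + '.npy', np.array(self.scale))

    def load_learned_params(self, path):
        self.scale = float(np.load(path + '.npy'))
```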
- class dival.reconstructors.FunctionReconstructor(function, *args, fun_args=None, fun_kwargs=None, **kwargs)[source]
Bases:
Reconstructor
Reconstructor defined by a callable.
- function
Callable that is used in reconstruct.
- Type:
callable
- fun_args
Arguments to be passed to function.
- Type:
list
- fun_kwargs
Keyword arguments to be passed to function.
- Type:
dict
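For example, an ODL filtered back-projection operator (or any other callable) can be wrapped without writing a subclass; the sketch assumes ray_trafo is an ODL ray transform and observation is matching data:

```python
import odl

from dival.reconstructors import FunctionReconstructor

# any callable taking an observation works; extra (keyword) arguments for the
# callable could be supplied via ``fun_args`` / ``fun_kwargs``
fbp_op = odl.tomo.fbp_op(ray_trafo, filter_type='Hann', frequency_scaling=0.8)
fbp_reconstructor = FunctionReconstructor(
    fbp_op, reco_space=ray_trafo.domain, observation_space=ray_trafo.range,
    name='FBP')

reco = fbp_reconstructor.reconstruct(observation)
```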