chmncc.optimizers package
Submodules
chmncc.optimizers.adam module
Adam module
- chmncc.optimizers.adam.get_adam_optimizer(net: Module, lr: float, weight_decay: float) Optimizer [source]
ADAM optimizer
- Parameters:
net [nn.Module] – network architecture
lr [float] – learning rate
weight_decay [float] – weight decay
- Returns:
optimizer [nn.Optimizer]
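A minimal usage sketch of get_adam_optimizer, assuming only that it returns a standard torch.optim optimizer over the network's parameters; the toy network and the lr/weight_decay values below are placeholders, not the project's defaults:

```python
import torch
import torch.nn as nn

from chmncc.optimizers.adam import get_adam_optimizer

# Toy network; any nn.Module works here
net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

# Placeholder hyperparameters, not the project's defaults
optimizer = get_adam_optimizer(net, lr=1e-3, weight_decay=1e-5)

# One dummy training step
x, y = torch.randn(8, 16), torch.randint(0, 4, (8,))
loss = nn.functional.cross_entropy(net(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```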
- chmncc.optimizers.adam.get_adam_optimizer_with_gate(net: Module, gate: DenseGatingFunction, lr: float, weight_decay: float) Optimizer [source]
ADAM optimizer
- Parameters:
net [nn.Module] – network architecture
gate [DenseGatingFunction] – gating function
lr [float] – learning rate
weight_decay [float] – weight decay
- Returns:
optimizer [nn.Optimizer]
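The gated variant also receives a DenseGatingFunction. A plausible re-implementation, assuming the function simply builds one Adam instance over the parameters of both the network and the gate; this is a hypothetical sketch, not the library's actual code:

```python
import itertools

import torch.nn as nn
from torch.optim import Adam, Optimizer


def adam_over_net_and_gate(
    net: nn.Module, gate: nn.Module, lr: float, weight_decay: float
) -> Optimizer:
    """Hypothetical helper: a single Adam instance covering the
    parameters of both the network and the gating function."""
    params = itertools.chain(net.parameters(), gate.parameters())
    return Adam(params, lr=lr, weight_decay=weight_decay)
```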
chmncc.optimizers.exponential module
Exponential scheduler
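This module presumably wraps PyTorch's ExponentialLR; a generic sketch of that scheduler, with an illustrative gamma value:

```python
import torch.nn as nn
from torch.optim import Adam
from torch.optim.lr_scheduler import ExponentialLR

net = nn.Linear(16, 4)
optimizer = Adam(net.parameters(), lr=1e-3)

# Multiply the learning rate by gamma after every epoch
scheduler = ExponentialLR(optimizer, gamma=0.95)

for epoch in range(5):
    optimizer.step()   # stands in for one full training epoch
    scheduler.step()
```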
chmncc.optimizers.reduce_lr_on_plateau module
Reduce LR on plateau scheduler
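Presumably a wrapper around PyTorch's ReduceLROnPlateau, which lowers the learning rate when a monitored metric stops improving; factor and patience below are illustrative values:

```python
import torch.nn as nn
from torch.optim import SGD
from torch.optim.lr_scheduler import ReduceLROnPlateau

net = nn.Linear(16, 4)
optimizer = SGD(net.parameters(), lr=0.1)

# Shrink the learning rate when the validation loss plateaus
scheduler = ReduceLROnPlateau(optimizer, mode="min", factor=0.1, patience=5)

for epoch in range(5):
    val_loss = 1.0 / (epoch + 1)  # placeholder for the real validation loss
    scheduler.step(val_loss)
```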
chmncc.optimizers.sgd module
Stochastic gradient descent optimizer
- chmncc.optimizers.sgd.get_sdg_optimizer_with_gate(net: Module, gate: DenseGatingFunction, lr: float, weight_decay: float) Optimizer [source]
SGD optimizer
- Parameters:
net [nn.Module] – network architecture
gate [DenseGatingFunction] – gating function
lr [float] – learning rate
weight_decay [float] – weight decay
- Returns:
optimizer [nn.Optimizer]
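A usage sketch of get_sdg_optimizer_with_gate, under the assumption that the gate only contributes parameters to the optimizer; the nn.Linear stand-in for DenseGatingFunction and the hyperparameter values are purely illustrative:

```python
import torch.nn as nn

from chmncc.optimizers.sgd import get_sdg_optimizer_with_gate

net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

# Stand-in for DenseGatingFunction: any module whose parameters should
# be optimized jointly with the network (purely illustrative)
gate = nn.Linear(4, 4)

# Placeholder hyperparameters, not the project's defaults
optimizer = get_sdg_optimizer_with_gate(net, gate, lr=0.01, weight_decay=5e-4)
```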
chmncc.optimizers.step_lr module
Step LR scheduler
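Presumably a wrapper around PyTorch's StepLR, which decays the learning rate by a constant factor at fixed epoch intervals; step_size and gamma below are illustrative values:

```python
import torch.nn as nn
from torch.optim import SGD
from torch.optim.lr_scheduler import StepLR

net = nn.Linear(16, 4)
optimizer = SGD(net.parameters(), lr=0.1)

# Multiply the learning rate by gamma every step_size epochs
scheduler = StepLR(optimizer, step_size=10, gamma=0.1)

for epoch in range(3):
    optimizer.step()   # stands in for one full training epoch
    scheduler.step()
```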
Module contents
Optimizers module. It deals with all the optimizers we have employed and the approaches we have experimented with.