flash.core
flash.core.adapter
flash.core.classification
A base class for classification outputs.
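A classification output's job is to turn raw model outputs (logits) into something consumable, such as a label. As a hedged illustration of the idea only (the class and method names below are invented for the sketch and are not Flash's actual API), a labels-style output might look like:

```python
import math

class LabelsOutputSketch:
    """Illustrative stand-in for a classification output: maps logits
    to the label of the argmax class."""

    def __init__(self, labels):
        self.labels = labels

    def transform(self, logits):
        # Softmax does not change the argmax; it is shown for clarity.
        exps = [math.exp(x) for x in logits]
        total = sum(exps)
        probs = [e / total for e in exps]
        best = max(range(len(probs)), key=probs.__getitem__)
        return self.labels[best]

output = LabelsOutputSketch(labels=["cat", "dog", "fish"])
prediction = output.transform([0.2, 3.1, -1.0])
```

Flash's concrete output classes implement variations on this theme (raw logits, probabilities, class indices, FiftyOne format).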
flash.core.finetuning
FlashBaseFinetuning can be used to create a custom Flash finetuning Callback.
Hooks to be used in Task and FlashBaseFinetuning.
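A custom finetuning callback typically freezes the backbone before training and unfreezes it at a chosen epoch. The sketch below shows that freeze/unfreeze pattern with plain-Python stand-ins; the `Backbone`, `Param`, and `FreezeUnfreezeCallback` names are illustrative, and real code would subclass FlashBaseFinetuning and operate on `torch.nn.Module` parameters.

```python
class Param:
    """Stand-in for a torch parameter with a requires_grad flag."""
    def __init__(self):
        self.requires_grad = True

class Backbone:
    """Stand-in for a backbone module holding a few parameters."""
    def __init__(self, n=3):
        self.parameters = [Param() for _ in range(n)]

class FreezeUnfreezeCallback:
    """Freeze the backbone at the start of training, unfreeze it
    once a chosen epoch is reached."""
    def __init__(self, unfreeze_epoch=2):
        self.unfreeze_epoch = unfreeze_epoch

    def freeze_before_training(self, backbone):
        for p in backbone.parameters:
            p.requires_grad = False

    def finetune_function(self, backbone, epoch):
        if epoch >= self.unfreeze_epoch:
            for p in backbone.parameters:
                p.requires_grad = True

backbone = Backbone()
cb = FreezeUnfreezeCallback(unfreeze_epoch=2)
cb.freeze_before_training(backbone)
frozen = [p.requires_grad for p in backbone.parameters]
cb.finetune_function(backbone, epoch=2)
unfrozen = [p.requires_grad for p in backbone.parameters]
```

The two hooks mirror the freeze-before-training / per-epoch-unfreeze split that finetuning callbacks are built around.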
flash.core.integrations.fiftyone
Visualizes predictions from a model with a FiftyOne Output in the FiftyOne App.
flash.core.integrations.icevision
flash.core.integrations.pytorch_forecasting
flash.core.model
Specialized callback, used only during testing, that keeps track of metrics during training.
A general Task.
flash.core.registry
This class is used to register functions or classes in a registry.
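The registry pattern behind FlashRegistry can be sketched in a few lines of plain Python. The `Registry` class, its method names, and the decorator style below are assumptions made for illustration, not Flash's exact API:

```python
class Registry:
    """Minimal name-to-function registry: register under a name,
    look up later by that name."""

    def __init__(self, name):
        self.name = name
        self._store = {}

    def register(self, name=None):
        # Decorator: store the function under the given (or its own) name.
        def wrapper(fn):
            self._store[name or fn.__name__] = fn
            return fn
        return wrapper

    def get(self, name):
        if name not in self._store:
            raise KeyError(f"{name!r} not found in registry {self.name!r}")
        return self._store[name]

BACKBONES = Registry("backbones")

@BACKBONES.register()
def resnet18():
    return "resnet18-backbone"

backbone_fn = BACKBONES.get("resnet18")
```

Registries like this let tasks resolve backbones, heads, and optimizers by string name at configuration time.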
flash.core.optimizers
Extends SGD in PyTorch with LARS scaling. |
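The core of LARS is a per-layer trust coefficient derived from the ratio of the weight norm to the gradient norm, so that every layer takes a step proportionate to its own scale. A minimal single-layer sketch of that scaling rule (plain Python with illustrative hyperparameters; not the actual optimizer implementation):

```python
import math

def lars_update(weights, grads, lr=0.1, eta=0.001, weight_decay=0.0):
    """One LARS step for a single layer: scale the global lr by the
    trust ratio eta * ||w|| / (||g|| + weight_decay * ||w||)."""
    w_norm = math.sqrt(sum(w * w for w in weights))
    g_norm = math.sqrt(sum(g * g for g in grads))
    if w_norm > 0 and g_norm > 0:
        local_lr = eta * w_norm / (g_norm + weight_decay * w_norm)
    else:
        local_lr = 1.0
    return [w - lr * local_lr * (g + weight_decay * w)
            for w, g in zip(weights, grads)]

# ||w|| = 5, ||g|| = 1, so the local lr is eta * 5 = 0.005.
new_w = lars_update([3.0, 4.0], [0.6, 0.8])
```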
Extends Adam in PyTorch to incorporate the LAMB algorithm from the paper Large batch optimization for deep learning: Training BERT in 76 minutes.
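LAMB combines Adam's per-coordinate moment estimates with a LARS-style per-layer trust ratio. The single-layer, pure-Python sketch below shows one step of that combination (hyperparameter values are illustrative; the real optimizer operates on torch tensors):

```python
import math

def lamb_step(w, g, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-6, wd=0.01):
    """One LAMB step for one layer (lists of floats): Adam-style
    bias-corrected moments, then the update is rescaled by the
    per-layer trust ratio ||w|| / ||update||."""
    # Exponential moving averages of the gradient and its square.
    m = [b1 * mi + (1 - b1) * gi for mi, gi in zip(m, g)]
    v = [b2 * vi + (1 - b2) * gi * gi for vi, gi in zip(v, g)]
    # Bias correction for step t.
    m_hat = [mi / (1 - b1 ** t) for mi in m]
    v_hat = [vi / (1 - b2 ** t) for vi in v]
    # Adam direction plus decoupled weight decay.
    update = [mh / (math.sqrt(vh) + eps) + wd * wi
              for mh, vh, wi in zip(m_hat, v_hat, w)]
    # Layer-wise trust ratio.
    w_norm = math.sqrt(sum(x * x for x in w))
    u_norm = math.sqrt(sum(x * x for x in update))
    trust = w_norm / u_norm if w_norm > 0 and u_norm > 0 else 1.0
    w = [wi - lr * trust * ui for wi, ui in zip(w, update)]
    return w, m, v

w1, m1, v1 = lamb_step([1.0, -1.0], [0.5, 0.5], [0.0, 0.0], [0.0, 0.0], t=1)
```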
Sets the learning rate of each parameter group to follow a linear warmup schedule between warmup_start_lr and base_lr, followed by a cosine annealing schedule between base_lr and eta_min.
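The schedule above can be sketched as a pure function of the epoch (an illustrative sketch of the math only; the actual scheduler steps a PyTorch optimizer's parameter groups):

```python
import math

def warmup_cosine_lr(epoch, warmup_epochs, max_epochs,
                     warmup_start_lr, base_lr, eta_min):
    """Learning rate at a given epoch: linear warmup from
    warmup_start_lr to base_lr, then cosine annealing from
    base_lr down to eta_min."""
    if epoch < warmup_epochs:
        # Linear interpolation over the warmup phase.
        return warmup_start_lr + (base_lr - warmup_start_lr) * epoch / warmup_epochs
    # Cosine annealing over the remaining epochs.
    progress = (epoch - warmup_epochs) / (max_epochs - warmup_epochs)
    return eta_min + 0.5 * (base_lr - eta_min) * (1 + math.cos(math.pi * progress))
```

At epoch 0 the rate is warmup_start_lr, at the end of warmup it reaches base_lr exactly (cos(0) = 1), and at max_epochs it has annealed to eta_min (cos(pi) = -1).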
Utilities