
flash.core

flash.core.adapter

Adapter

The Adapter is a lightweight interface that can be used to encapsulate the logic from a particular provider within a Task.

AdapterTask

The AdapterTask is a Task which wraps an Adapter and forwards all of the hooks.
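The wrap-and-forward relationship between the two classes can be sketched in plain Python. This is an illustrative stand-in, not the actual Flash API: the class and hook names below are hypothetical, and the real classes forward many more hooks.

```python
# Hypothetical sketch of the Adapter pattern: an AdapterTask-style wrapper
# holds an Adapter and forwards each hook to it unchanged, so all
# provider-specific logic stays inside the adapter.

class MyAdapter:
    """Encapsulates provider-specific logic behind a fixed hook interface."""

    def training_step(self, batch):
        # Provider-specific training logic would live here.
        return {"loss": sum(batch)}

    def validation_step(self, batch):
        return {"loss": sum(batch) / len(batch)}


class MyAdapterTask:
    """Wraps an adapter and forwards every hook to it."""

    def __init__(self, adapter):
        self.adapter = adapter

    def training_step(self, batch):
        return self.adapter.training_step(batch)

    def validation_step(self, batch):
        return self.adapter.validation_step(batch)


task = MyAdapterTask(MyAdapter())
print(task.training_step([1, 2, 3]))  # forwarded to the adapter
```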

flash.core.classification

Classes

A Serializer which applies an argmax to the model outputs (either logits or probabilities) and converts to a list.

ClassificationSerializer

A base class for classification serializers.

ClassificationTask

FiftyOneLabels

A Serializer which converts the model outputs to FiftyOne classification format.

Labels

A Serializer which converts the model outputs (either logits or probabilities) to the label of the argmax classification.

Logits

A Serializer which simply converts the model outputs (assumed to be logits) to a list.

PredsClassificationSerializer

A ClassificationSerializer which gets the PREDS from the sample.

Probabilities

A Serializer which applies a softmax to the model outputs (assumed to be logits) and converts to a list.
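The serializers above all transform the same raw model outputs in different ways. A stdlib-only sketch of what each one computes from a vector of logits (these plain functions are illustrative, not the Flash classes themselves):

```python
import math

# Illustrative sketch of the classification serializers' transformations,
# applied to a vector of raw model logits.

def logits(outputs):          # Logits: pass the raw outputs through as a list
    return list(outputs)

def probabilities(outputs):   # Probabilities: softmax over the logits
    exps = [math.exp(x) for x in outputs]
    total = sum(exps)
    return [e / total for e in exps]

def classes(outputs):         # Classes: index of the argmax
    return max(range(len(outputs)), key=lambda i: outputs[i])

def labels(outputs, class_names):  # Labels: argmax mapped to its label
    return class_names[classes(outputs)]

raw = [0.1, 2.5, -1.0]
print(classes(raw))                        # index of the largest logit
print(labels(raw, ["cat", "dog", "bird"]))
```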

flash.core.finetuning

FlashBaseFinetuning

FlashBaseFinetuning can be used to create a custom Flash Finetuning Callback.

FreezeUnfreeze

NoFreeze

UnfreezeMilestones
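The strategy behind a callback like FreezeUnfreeze can be shown with plain objects: keep the backbone's parameters frozen for the first N epochs, then make them trainable again. The real callback hooks into the Lightning Trainer and operates on torch parameters; the names and epoch hook below are illustrative assumptions.

```python
# Hypothetical sketch of a FreezeUnfreeze-style finetuning strategy.

class Param:
    """Stand-in for a torch parameter's requires_grad flag."""
    def __init__(self):
        self.requires_grad = True


class SketchFreezeUnfreeze:
    def __init__(self, unfreeze_epoch=10):
        self.unfreeze_epoch = unfreeze_epoch

    def on_train_start(self, backbone_params):
        for p in backbone_params:
            p.requires_grad = False  # freeze the backbone

    def on_epoch_start(self, epoch, backbone_params):
        if epoch >= self.unfreeze_epoch:
            for p in backbone_params:
                p.requires_grad = True  # unfreeze at the milestone


params = [Param(), Param()]
cb = SketchFreezeUnfreeze(unfreeze_epoch=2)
cb.on_train_start(params)
print([p.requires_grad for p in params])  # frozen at the start
cb.on_epoch_start(2, params)
print([p.requires_grad for p in params])  # unfrozen at the milestone
```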

flash.core.integrations.fiftyone

visualize

Visualizes predictions from a model with a FiftyOne Serializer in the FiftyOne App.

flash.core.integrations.icevision

IceVisionTransformAdapter

Parameters: transform (List[Callable])

default_transforms

The default transforms from IceVision.

train_default_transforms

The default augmentations from IceVision.

flash.core.integrations.pytorch_forecasting

convert_predictions

Return type: Tuple[Dict[str, Any], List]

flash.core.model

BenchmarkConvergenceCI

A specialized callback used only during testing; keeps track of metrics during training.

CheckDependenciesMeta

ModuleWrapperBase

The ModuleWrapperBase is a base for classes which wrap a LightningModule or an instance of ModuleWrapperBase.

DatasetProcessor

The DatasetProcessor mixin provides hooks for classes which need custom logic for producing the data loaders for each running stage given the corresponding dataset.

Task

A general Task.

flash.core.registry

FlashRegistry

This class is used to register functions or functools.partial objects in a registry.

ExternalRegistry

The ExternalRegistry is a FlashRegistry that can point to an external provider via a getter function.

ConcatRegistry

The ConcatRegistry can be used to concatenate multiple registries of different types together.
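The core idea shared by these registry classes is registering callables under a name and resolving them later, often via a decorator. A minimal stdlib-only sketch of that pattern (the class and method names below are illustrative, not the FlashRegistry API, which additionally supports providers, metadata, and concatenation):

```python
# Minimal sketch of the registry pattern behind FlashRegistry.

class SketchRegistry:
    def __init__(self, name):
        self.name = name
        self._store = {}

    def register(self, name):
        """Decorator that records a callable under the given name."""
        def wrapper(fn):
            self._store[name] = fn
            return fn
        return wrapper

    def get(self, name):
        return self._store[name]


BACKBONES = SketchRegistry("backbones")

@BACKBONES.register("tiny")
def tiny_backbone():
    return "tiny-model"

print(BACKBONES.get("tiny")())  # resolves and calls the registered callable
```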

flash.core.optimizers

LARS

Extends SGD in PyTorch with LARS scaling from the paper Large batch training of Convolutional Networks.

LAMB

Extends Adam in PyTorch to incorporate the LAMB algorithm from the paper Large batch optimization for deep learning: Training BERT in 76 minutes.
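Both optimizers are built around a layer-wise "trust ratio" that rescales each layer's update by the ratio of its weight norm to its gradient norm. A sketch of that scaling, following the LARS paper's formulation; the function name, default trust coefficient, and values are illustrative assumptions, not the Flash implementation:

```python
import math

# Sketch of the layer-wise scaling at the heart of LARS (and, applied to
# Adam-style updates, LAMB): scale each layer's learning rate by a trust
# ratio ||w|| / (||g|| + weight_decay * ||w||), times a trust coefficient.

def lars_local_lr(weights, grads, trust_coefficient=0.001, weight_decay=0.0):
    w_norm = math.sqrt(sum(w * w for w in weights))
    g_norm = math.sqrt(sum(g * g for g in grads))
    if w_norm == 0 or g_norm == 0:
        return 1.0  # degenerate case: fall back to the global learning rate
    return trust_coefficient * w_norm / (g_norm + weight_decay * w_norm)

# A layer with large weights but a small gradient gets a larger local lr,
# so layers train at comparable relative speeds in large-batch regimes.
print(lars_local_lr([3.0, 4.0], [0.0, 0.1]))
```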

LinearWarmupCosineAnnealingLR

Sets the learning rate of each parameter group to follow a linear warmup schedule between warmup_start_lr and base_lr followed by a cosine annealing schedule between base_lr and eta_min.
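The schedule described above can be written as a closed-form function of the epoch. A stdlib-only sketch under the stated definition (linear warmup, then cosine annealing); the function name and example values are illustrative:

```python
import math

# Sketch of the schedule implemented by LinearWarmupCosineAnnealingLR:
# linear warmup from warmup_start_lr to base_lr over warmup_epochs, then
# cosine annealing from base_lr down to eta_min by max_epochs.

def lr_at(epoch, warmup_epochs, max_epochs,
          warmup_start_lr, base_lr, eta_min=0.0):
    if epoch < warmup_epochs:
        # Linear warmup phase.
        return warmup_start_lr + (base_lr - warmup_start_lr) * epoch / warmup_epochs
    # Cosine annealing phase.
    progress = (epoch - warmup_epochs) / (max_epochs - warmup_epochs)
    return eta_min + 0.5 * (base_lr - eta_min) * (1 + math.cos(math.pi * progress))

# Warm up 0.0 -> 0.1 over 10 epochs, then decay back to 0.0 by epoch 40.
print(lr_at(0, 10, 40, 0.0, 0.1))
print(lr_at(10, 10, 40, 0.0, 0.1))
print(lr_at(40, 10, 40, 0.0, 0.1))
```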

Utilities

from_argparse_args

Modified version of pytorch_lightning.utilities.argparse.from_argparse_args() which populates valid_kwargs from pytorch_lightning.Trainer.
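The underlying idea is to inspect a class's constructor signature and keep only the matching attributes from an argparse Namespace. A simplified stdlib-only sketch (the `TinyTrainer` class and helper name are hypothetical; the Lightning helper handles many more cases):

```python
import argparse
import inspect

# Sketch of the idea behind from_argparse_args: build a class from only
# those Namespace attributes that match its __init__ parameters.

class TinyTrainer:
    def __init__(self, max_epochs=1, gpus=0):
        self.max_epochs = max_epochs
        self.gpus = gpus


def from_argparse_args_sketch(cls, args):
    valid = inspect.signature(cls.__init__).parameters
    kwargs = {k: v for k, v in vars(args).items() if k in valid}
    return cls(**kwargs)


parser = argparse.ArgumentParser()
parser.add_argument("--max_epochs", type=int, default=5)
parser.add_argument("--unrelated", type=str, default="ignored")
args = parser.parse_args(["--max_epochs", "3"])

trainer = from_argparse_args_sketch(TinyTrainer, args)
print(trainer.max_epochs)  # 3; --unrelated is filtered out
```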

get_callable_name

Return type: str

get_callable_dict

Return type: Union[Dict, Mapping]

predict_context

This decorator is used as a context manager to put the model in eval mode before running predict, and to restore train mode afterwards.
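The eval-then-restore pattern can be shown with a plain-Python stand-in (the real decorator works on torch modules and also disables gradients; the names below are illustrative):

```python
import functools

# Sketch of a predict_context-style decorator: switch the model out of
# training mode for the duration of the call, then restore it.

def predict_context(func):
    @functools.wraps(func)
    def wrapper(model, *args, **kwargs):
        was_training = model.training
        model.training = False  # eval mode for the prediction
        try:
            return func(model, *args, **kwargs)
        finally:
            model.training = was_training  # restore the previous mode

    return wrapper


class Model:
    def __init__(self):
        self.training = True

    @predict_context
    def predict(self, x):
        # Inside predict the model is guaranteed to be in eval mode.
        return x * 2


m = Model()
print(m.predict(21))  # 42
print(m.training)     # True: train mode restored after predict
```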
