Shortcuts

flash.core

flash.core.adapter

Adapter

The Adapter is a lightweight interface that can be used to encapsulate the logic from a particular provider within a Task.

AdapterTask

The AdapterTask is a Task which wraps an Adapter and forwards all of the hooks.
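The wrapping relationship can be sketched in plain Python: a task that holds no provider logic itself and simply delegates its hooks to the wrapped adapter. The class and method names below are illustrative, not the actual Flash API.

```python
# A minimal sketch of the Adapter pattern: a task object forwards its
# hooks to a wrapped adapter, which holds the provider-specific logic.
# Names here are illustrative, not the real Flash classes.

class EchoAdapter:
    """Provider-specific logic lives here."""

    def training_step(self, batch):
        return {"loss": sum(batch)}


class SketchAdapterTask:
    """Wraps an adapter and forwards all hooks to it."""

    def __init__(self, adapter):
        self.adapter = adapter

    def training_step(self, batch):
        # The task itself contains no provider logic; it delegates.
        return self.adapter.training_step(batch)


task = SketchAdapterTask(EchoAdapter())
result = task.training_step([1, 2, 3])  # → {"loss": 6}
```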

flash.core.classification

ClassesOutput

An Output which applies an argmax to the model outputs (either logits or probabilities) and converts them to a list.

ClassificationOutput

A base class for classification outputs.

ClassificationTask

FiftyOneLabelsOutput

An Output which converts the model outputs to the FiftyOne classification format.

LabelsOutput

An Output which converts the model outputs (either logits or probabilities) to the label of the argmax classification.

LogitsOutput

An Output which simply converts the model outputs (assumed to be logits) to a list.

PredsClassificationOutput

A ClassificationOutput which gets the PREDS from the sample.

ProbabilitiesOutput

An Output which applies a softmax to the model outputs (assumed to be logits) and converts them to a list.
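The classification outputs above all derive their result from the same logits vector. A pure-Python sketch of what each one computes (function names are illustrative, not the Flash API):

```python
import math

# What the classification outputs compute from a vector of logits.
# These are standalone sketches, not the actual Output classes.

def probabilities(logits):
    """Like ProbabilitiesOutput: softmax over logits, returned as a list."""
    exps = [math.exp(x - max(logits)) for x in logits]  # numerically stable softmax
    total = sum(exps)
    return [e / total for e in exps]

def classes(logits):
    """Like ClassesOutput: the argmax of the model outputs."""
    return max(range(len(logits)), key=lambda i: logits[i])

def labels(logits, label_names):
    """Like LabelsOutput: the label of the argmax classification."""
    return label_names[classes(logits)]

logits = [0.1, 2.5, -1.0]
print(classes(logits))                         # → 1
print(labels(logits, ["cat", "dog", "fish"]))  # → dog
```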

flash.core.finetuning

FlashBaseFinetuning

FlashBaseFinetuning can be used to create a custom Flash Finetuning Callback.

FineTuningHooks

Hooks to be used in Task and FlashBaseFinetuning.

Freeze

FreezeUnfreeze

NoFreeze

UnfreezeMilestones
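These strategies differ mainly in when backbone parameters become trainable. A toy illustration of the FreezeUnfreeze idea, using plain objects with a `requires_grad` flag rather than the actual Flash callbacks or torch tensors:

```python
# Toy sketch of the FreezeUnfreeze strategy: backbone parameters stay
# frozen from the start of fitting until a chosen epoch, then thaw.
# This mimics the concept only; it is not the Flash callback API.

class Param:
    def __init__(self):
        self.requires_grad = True


class FreezeUnfreezeSketch:
    def __init__(self, unfreeze_epoch=10):
        self.unfreeze_epoch = unfreeze_epoch

    def on_fit_start(self, backbone):
        for p in backbone:  # like Freeze: backbone frozen at the start
            p.requires_grad = False

    def on_epoch_start(self, backbone, epoch):
        if epoch >= self.unfreeze_epoch:  # thaw at the milestone epoch
            for p in backbone:
                p.requires_grad = True


backbone = [Param(), Param()]
strategy = FreezeUnfreezeSketch(unfreeze_epoch=2)
strategy.on_fit_start(backbone)
frozen = [p.requires_grad for p in backbone]   # [False, False]
strategy.on_epoch_start(backbone, epoch=2)
thawed = [p.requires_grad for p in backbone]   # [True, True]
```

UnfreezeMilestones generalizes this by thawing different parameter groups at different milestone epochs.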

flash.core.integrations.fiftyone

visualize

Visualizes predictions from a model with a FiftyOne Output in the FiftyOne App.

flash.core.integrations.icevision

IceVisionTransformAdapter

Parameter transform: List[Callable]

flash.core.integrations.pytorch_forecasting

convert_predictions

Return type: Tuple[Dict[str, Any], List]

flash.core.model

BenchmarkConvergenceCI

Specialized callback used only during testing; keeps track of metrics during training.

CheckDependenciesMeta

ModuleWrapperBase

The ModuleWrapperBase is a base for classes which wrap a LightningModule or an instance of ModuleWrapperBase.

DatasetProcessor

The DatasetProcessor mixin provides hooks for classes which need custom logic for producing the data loaders for each running stage given the corresponding dataset.

Task

A general Task.

flash.core.registry

FlashRegistry

This class is used to register functions or functools.partial objects to a registry.

ExternalRegistry

The ExternalRegistry is a FlashRegistry that can point to an external provider via a getter function.

ConcatRegistry

The ConcatRegistry can be used to concatenate multiple registries of different types together.
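The registry concept above boils down to a named mapping from strings to callables, often populated with a decorator. A minimal sketch (the method names are illustrative; the real FlashRegistry API may differ):

```python
# Minimal sketch of a callable registry: register functions under a
# name, look them up later. Illustrative only, not the FlashRegistry API.

class SketchRegistry:
    def __init__(self, name):
        self.name = name
        self._functions = {}

    def register(self, fn=None, *, name=None):
        # Usable bare (@registry.register) or with a custom name
        # (@registry.register(name="alias")).
        if fn is None:
            return lambda f: self.register(f, name=name)
        self._functions[name or fn.__name__] = fn
        return fn

    def get(self, name):
        return self._functions[name]


backbones = SketchRegistry("backbones")

@backbones.register
def resnet18():
    return "resnet18-model"

@backbones.register(name="alias")
def other():
    return "other-model"

print(backbones.get("resnet18")())  # → resnet18-model
```

In these terms, an ExternalRegistry would answer `get` by calling out to a provider's getter function, and a ConcatRegistry would try each of its member registries in turn.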

flash.core.optimizers

LARS

Extends SGD in PyTorch with LARS scaling.

LAMB

Extends Adam in PyTorch to incorporate the LAMB algorithm from the paper Large Batch Optimization for Deep Learning: Training BERT in 76 Minutes.

LinearWarmupCosineAnnealingLR

Sets the learning rate of each parameter group to follow a linear warmup schedule between warmup_start_lr and base_lr followed by a cosine annealing schedule between base_lr and eta_min.
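The description above translates to a simple per-epoch formula. A pure-Python sketch of the schedule's shape (boundary conventions, such as what the warmup divides by, are a simplification and may differ from the actual implementation):

```python
import math

# The warmup + cosine-annealing schedule as a pure function of the epoch.
# A sketch of the shape only; boundary conventions may differ from the
# actual LinearWarmupCosineAnnealingLR implementation.

def lr_at(epoch, warmup_epochs=10, max_epochs=40,
          warmup_start_lr=0.0, base_lr=0.1, eta_min=0.0):
    if epoch < warmup_epochs:
        # Linear warmup from warmup_start_lr up to base_lr.
        return warmup_start_lr + (base_lr - warmup_start_lr) * epoch / warmup_epochs
    # Cosine annealing from base_lr down to eta_min.
    progress = (epoch - warmup_epochs) / (max_epochs - warmup_epochs)
    return eta_min + (base_lr - eta_min) * (1 + math.cos(math.pi * progress)) / 2

print(lr_at(0))   # warmup_start_lr at epoch 0
print(lr_at(10))  # base_lr once warmup completes
print(lr_at(40))  # anneals to eta_min by max_epochs
```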

Utilities

from_argparse_args

Modified version of pytorch_lightning.utilities.argparse.from_argparse_args() which populates valid_kwargs from pytorch_lightning.Trainer.
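The core idea is to filter an argparse namespace down to the keyword arguments the target callable actually accepts. A sketch of that filtering with `inspect.signature` (a simplification, not the pytorch_lightning implementation):

```python
import inspect
from argparse import Namespace

# Sketch of the idea behind from_argparse_args: keep only the namespace
# entries matching the target's signature, then call it. Simplified, not
# the pytorch_lightning implementation.

def from_argparse_args_sketch(cls, args: Namespace, **extra):
    valid = set(inspect.signature(cls).parameters)
    kwargs = {k: v for k, v in vars(args).items() if k in valid}
    kwargs.update(extra)  # explicit kwargs override namespace values
    return cls(**kwargs)


class ToyTrainer:
    def __init__(self, max_epochs=1, gpus=0):
        self.max_epochs = max_epochs
        self.gpus = gpus


args = Namespace(max_epochs=5, gpus=2, unrelated_flag=True)
trainer = from_argparse_args_sketch(ToyTrainer, args)
print(trainer.max_epochs, trainer.gpus)  # → 5 2
```

Note that `unrelated_flag` is silently dropped because it is not a parameter of `ToyTrainer.__init__`.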

get_callable_name

Return type: str

get_callable_dict

Return type: Union[Dict, Mapping]
