flash.core.data

flash.core.data.auto_dataset

AutoDataset

The AutoDataset is a BaseAutoDataset and a Dataset.

BaseAutoDataset

The BaseAutoDataset class wraps the output of a call to load_data() and a DataSource, and provides the _call_load_sample method, which calls load_sample() with the correct CurrentRunningStageFuncContext for the current running_stage.

IterableAutoDataset

The IterableAutoDataset is a BaseAutoDataset and an IterableDataset.

flash.core.data.base_viz

BaseVisualization

This base class is used to build visualization tools on top of the Preprocess hooks.

flash.core.data.batch

default_uncollate

This function is used to uncollate a batch into samples.
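
A minimal sketch of what uncollating looks like in practice; the tensor shape and the expected output are illustrative assumptions, not part of the documented API:

    import torch
    from flash.core.data.batch import default_uncollate

    batch = torch.rand(4, 3)            # a collated batch of 4 samples
    samples = default_uncollate(batch)  # expected: a list of 4 per-sample tensors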

flash.core.data.callback

BaseDataFetcher

This class is used to profile Preprocess hook outputs.

ControlFlow

FlashCallback

FlashCallback is an extension of pytorch_lightning.callbacks.Callback.

flash.core.data.data_module

DataModule

A basic DataModule class for all Flash tasks.
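
A minimal sketch of building a DataModule directly from existing datasets; the from_datasets constructor and its keyword arguments are assumed here and may differ between Flash versions:

    import torch
    from torch.utils.data import TensorDataset
    from flash import DataModule

    # Toy dataset standing in for real task data.
    train = TensorDataset(torch.rand(64, 3), torch.randint(0, 2, (64,)))

    # Assumed constructor: builds dataloaders around the dataset with default processing.
    datamodule = DataModule.from_datasets(train_dataset=train, batch_size=8)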

flash.core.data.data_pipeline

DataPipeline

DataPipeline holds the engineering logic to connect Preprocess and/or Postprocess objects to the DataModule, Flash Task and Trainer.

DataPipelineState

A class to store and share all process states once a DataPipeline has been initialized.

flash.core.data.data_source

DatasetDataSource

The DatasetDataSource implements default behaviours for data sources which expect the input to load_data() to be a torch.utils.data.dataset.Dataset.

DataSource

The DataSource class encapsulates two hooks: load_data and load_sample.
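
A hedged sketch of a custom DataSource implementing the two hooks; the sample-dict keys and the loading logic are placeholders, not requirements of the base class:

    from typing import Any, List, Optional
    from flash.core.data.data_source import DataSource

    class MyDataSource(DataSource):
        # load_data maps the user-facing input (here, files paired with targets)
        # to a sequence of sample dicts.
        def load_data(self, data: Any, dataset: Optional[Any] = None) -> List[dict]:
            files, targets = data
            return [{"input": f, "target": t} for f, t in zip(files, targets)]

        # load_sample turns a single entry from load_data into a loaded sample.
        def load_sample(self, sample: dict, dataset: Optional[Any] = None) -> dict:
            with open(sample["input"], "rb") as f:  # placeholder loading step
                sample["input"] = f.read()
            return sample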

DefaultDataKeys

The DefaultDataKeys enum contains the keys that are used by built-in data sources to refer to inputs and targets.

DefaultDataSources

The DefaultDataSources enum contains the data source names used by all of the default from_* methods in DataModule.

FiftyOneDataSource

The FiftyOneDataSource expects the input to load_data() to be a fiftyone.core.collections.SampleCollection.

ImageLabelsMap

LabelsState

A ProcessState containing labels, a mapping from class index to label.
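
A hedged sketch of sharing labels through the process state; set_state is assumed to be available on DataSource via its Properties mixin, and the input structure is illustrative:

    from typing import Any, Optional
    from flash.core.data.data_source import DataSource, LabelsState

    class LabelledSource(DataSource):
        # Assumes data is a tuple of (samples, class_names).
        def load_data(self, data: Any, dataset: Optional[Any] = None):
            samples, class_names = data
            # Share the class-index-to-label mapping with downstream components
            # (for example, a Serializer could retrieve it via get_state(LabelsState)).
            self.set_state(LabelsState(class_names))
            return [{"input": s} for s in samples]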

MockDataset

The MockDataset catches any metadata that is attached through __setattr__.

NumpyDataSource

The NumpyDataSource is a SequenceDataSource which expects the input to load_data() to be a sequence of np.ndarray objects.

PathsDataSource

The PathsDataSource implements default behaviours for data sources which expect the input to load_data() to be either a directory with a subdirectory for each class or a tuple containing a list of files and a corresponding list of targets.

SequenceDataSource

The SequenceDataSource implements default behaviours for data sources which expect the input to load_data() to be a sequence of (input, target) tuples, where target can be None.

TensorDataSource

The TensorDataSource is a SequenceDataSource which expects the input to load_data() to be a sequence of torch.Tensor objects.

has_file_allowed_extension

Checks if a file has an allowed extension.
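
For example (the file names and extension tuple are illustrative):

    from flash.core.data.data_source import has_file_allowed_extension

    has_file_allowed_extension("photo.png", (".png", ".jpg"))  # expected: True
    has_file_allowed_extension("notes.txt", (".png", ".jpg"))  # expected: False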

has_len

Return type: bool

make_dataset

Generates a list of samples of the form (path_to_sample, class).

flash.core.data.process

BasePreprocess

DefaultPreprocess

DeserializerMapping

Deserializer Mapping.

Deserializer

Deserializer.

Postprocess

The Postprocess encapsulates all the data processing logic that should run after the model.

Preprocess

The Preprocess encapsulates all the data processing logic that should run before the data is passed to the model.
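
A hedged sketch of a Preprocess subclass overriding two of its transform hooks; the hook names (to_tensor_transform, per_batch_transform) are assumed to match the Preprocess API, and the transform logic is a placeholder:

    from typing import Any
    import torch
    from flash.core.data.process import Preprocess

    class MyPreprocess(Preprocess):
        # Runs per sample, converting the raw input into a tensor.
        def to_tensor_transform(self, sample: Any) -> Any:
            sample["input"] = torch.as_tensor(sample["input"])
            return sample

        # Runs per batch on the CPU, after collation and before transfer to device.
        def per_batch_transform(self, batch: Any) -> Any:
            return batch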

SerializerMapping

If the model output is a dictionary, then the SerializerMapping enables each entry in the dictionary to be passed to its own Serializer.

Serializer

A Serializer encapsulates a single serialize method which is used to convert the model output into the desired output format when predicting.
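
A minimal sketch of a custom Serializer; the argmax logic is just an illustrative way to turn a raw model output into a class index:

    from typing import Any
    from flash.core.data.process import Serializer

    class Argmax(Serializer):
        # serialize receives one model output at a time and returns it in the
        # desired output format.
        def serialize(self, sample: Any) -> int:
            return int(sample.argmax())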

flash.core.data.properties

ProcessState

Base class for all process states.

Properties

flash.core.data.splits

SplitDataset

SplitDataset is used to create a subset of a Dataset using indices.
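
For example (the wrapped dataset and indices are illustrative, and the keyword name is assumed):

    from flash.core.data.splits import SplitDataset

    dataset = list(range(10))                          # any indexable dataset
    subset = SplitDataset(dataset, indices=[0, 2, 4])
    # expected: len(subset) == 3 and subset[1] == 2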

flash.core.data.transforms

ApplyToKeys

The ApplyToKeys class is an nn.Sequential which applies the given transforms to the given keys from the input.
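
A minimal sketch of applying a torchvision transform only to the input entry of each sample dict (the specific transform is illustrative):

    from torchvision import transforms as T
    from flash.core.data.data_source import DefaultDataKeys
    from flash.core.data.transforms import ApplyToKeys

    # Resizes sample[DefaultDataKeys.INPUT] and leaves the target untouched.
    transform = ApplyToKeys(DefaultDataKeys.INPUT, T.Resize(196))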

KorniaParallelTransforms

The KorniaParallelTransforms class is an nn.Sequential which will apply the given transforms to each input (to .forward) in parallel, whilst sharing the random state (._params).

merge_transforms

Utility function to merge two transform dictionaries.
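
A hedged sketch of merging two transform dictionaries keyed by Preprocess hook names (the keys and transforms used here are illustrative):

    from torchvision import transforms as T
    from flash.core.data.data_source import DefaultDataKeys
    from flash.core.data.transforms import ApplyToKeys, merge_transforms

    base = {"to_tensor_transform": ApplyToKeys(DefaultDataKeys.INPUT, T.ToTensor())}
    extra = {"post_tensor_transform": ApplyToKeys(DefaultDataKeys.INPUT, T.RandomHorizontalFlip())}
    merged = merge_transforms(base, extra)  # expected: a dict containing both hooks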

kornia_collate

Kornia transforms add a batch dimension which needs to be removed.

flash.core.data.utils

CurrentFuncContext

CurrentRunningStageContext

CurrentRunningStageFuncContext

FuncModule

This class is used to wrap a callable within an nn.Module and apply the wrapped function in __call__.

convert_to_modules

download_data

Download a file with a progress bar.
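
For example (the URL is the illustrative example used throughout the Flash docs):

    from flash.core.data.utils import download_data

    # Downloads the archive with a progress bar and extracts it into ./data.
    download_data("https://pl-flash-data.s3.amazonaws.com/hymenoptera_data.zip", "./data")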
