SummarizationTask

class flash.text.seq2seq.summarization.model.SummarizationTask(backbone='sshleifer/distilbart-xsum-1-1', tokenizer_kwargs=None, max_source_length=128, max_target_length=128, padding='max_length', loss_fn=None, optimizer='Adam', lr_scheduler=None, metrics=None, learning_rate=None, num_beams=4, use_stemmer=True, enable_ort=False)[source]

The SummarizationTask is a Task for Seq2Seq text summarization. For more details, see Summarization.

You can change the backbone to any summarization model from HuggingFace/transformers using the backbone argument.
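Generation for this task is controlled in part by the num_beams argument. As a rough illustration of what that knob does (this is a toy sketch, not Flash's or transformers' actual generation code; step_fn is a made-up stand-in for a model's next-token scorer), beam search keeps only the num_beams highest-scoring partial sequences at every decoding step:

```python
import heapq

def beam_search(step_fn, start, num_beams=4, max_len=5):
    """Toy beam search: keep the `num_beams` highest-scoring
    partial sequences (by cumulative log-prob) at every step."""
    beams = [(0.0, [start])]  # (cumulative log-prob, token sequence)
    for _ in range(max_len):
        candidates = []
        for score, seq in beams:
            for token, logp in step_fn(seq):
                candidates.append((score + logp, seq + [token]))
        if not candidates:
            break
        # Prune to the best `num_beams` hypotheses.
        beams = heapq.nlargest(num_beams, candidates, key=lambda c: c[0])
    return beams

# A toy "model": from any prefix, propose two continuations,
# one likely (log-prob -0.1) and one unlikely (log-prob -0.5).
def step_fn(seq):
    last = seq[-1]
    return [(last + 1, -0.1), (last + 2, -0.5)]

best_score, best_seq = beam_search(step_fn, 0, num_beams=2, max_len=3)[0]
print(best_seq)  # the greedy path survives: [0, 1, 2, 3]
```

A larger num_beams explores more hypotheses per step at the cost of more computation; num_beams=1 reduces to greedy decoding.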

Parameters

backbone – name of the summarization model to load from HuggingFace/transformers (default: 'sshleifer/distilbart-xsum-1-1').

tokenizer_kwargs – additional keyword arguments passed to the tokenizer.

max_source_length – maximum token length of the source text.

max_target_length – maximum token length of the target summary.

padding – padding strategy applied by the tokenizer (default: 'max_length').

loss_fn – loss function to use for training.

optimizer – optimizer to use, given as a key or an instance (default: 'Adam').

lr_scheduler – learning-rate scheduler to use, if any.

metrics – metrics to compute during evaluation.

learning_rate – learning rate for the optimizer.

num_beams – number of beams used during beam-search generation.

use_stemmer – whether to apply stemming when computing ROUGE scores.

enable_ort – whether to accelerate the model with ONNX Runtime.
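The max_source_length and padding='max_length' arguments together mean every tokenized input is brought to a fixed length before being fed to the model. A minimal sketch of that preprocessing idea (pad_or_truncate is a hypothetical helper, not part of Flash's API; pad_id=0 is an assumed pad-token id):

```python
def pad_or_truncate(token_ids, max_length, pad_id=0):
    """Fix a token-id sequence to exactly `max_length` entries:
    truncate long inputs, right-pad short ones with `pad_id`."""
    if len(token_ids) >= max_length:
        return token_ids[:max_length]
    return token_ids + [pad_id] * (max_length - len(token_ids))

print(pad_or_truncate([5, 6, 7], 5))            # [5, 6, 7, 0, 0]
print(pad_or_truncate([1, 2, 3, 4, 5, 6], 5))   # [1, 2, 3, 4, 5]
```

In practice the HuggingFace tokenizer handles this; fixed-length batches trade some wasted computation on padding for simpler, uniformly shaped tensors.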
classmethod available_finetuning_strategies()

Returns a list containing the keys of the available Finetuning Strategies.

Return type

List[str]

classmethod available_lr_schedulers()

Returns a list containing the keys of the available LR schedulers.

Return type

List[str]

classmethod available_optimizers()

Returns a list containing the keys of the available Optimizers.

Return type

List[str]

classmethod available_outputs()

Returns the list of available outputs (that can be used during prediction or serving) for this Task.

Examples

>>> from flash import Task
>>> print(Task.available_outputs())
['preds', 'raw']

Return type

List[str]
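Each of the available_* classmethods above returns the keys under which options (optimizers, LR schedulers, finetuning strategies, outputs) are registered for the task. A minimal sketch of that registry pattern (Registry here is an illustrative stand-in, not Flash's actual registry class):

```python
class Registry:
    """Minimal name -> object registry, roughly how a Task can
    report its available options by string key."""

    def __init__(self):
        self._entries = {}

    def register(self, name, obj):
        self._entries[name] = obj

    def available_keys(self):
        # Sorted so the listing is deterministic.
        return sorted(self._entries)

optimizers = Registry()
optimizers.register("Adam", object())
optimizers.register("SGD", object())
print(optimizers.available_keys())  # ['Adam', 'SGD']
```

Registering options by key is what lets a Task accept string arguments such as optimizer='Adam' and resolve them to concrete objects at construction time.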
