
Task API Reference

class lightning_transformers.task.nlp.language_modeling.LanguageModelingTransformer(*args, downstream_model_type='transformers.AutoModelForCausalLM', **kwargs)

Defines LightningModule for the Language Modeling Task.

Parameters
  • *args – lightning_transformers.core.nlp.HFTransformer arguments.

  • downstream_model_type (str) – Downstream HuggingFace AutoModel to load. (default transformers.AutoModelForCausalLM)

  • **kwargs – lightning_transformers.core.nlp.HFTransformer arguments.
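Every task class in this reference follows the same pattern: a thin subclass that pins a task-appropriate default for downstream_model_type and forwards all other arguments to the base class. A minimal sketch of that pattern with stand-in classes (the HFTransformer placeholder here does not load a real HuggingFace model; it only mirrors the argument-forwarding behaviour):

```python
# Stand-in for lightning_transformers.core.nlp.HFTransformer: records its
# arguments instead of loading a HuggingFace AutoModel.
class HFTransformer:
    def __init__(self, downstream_model_type, backbone=None, **model_data_kwargs):
        self.downstream_model_type = downstream_model_type
        self.backbone = backbone
        self.model_data_kwargs = model_data_kwargs


class LanguageModelingTransformer(HFTransformer):
    def __init__(self, *args,
                 downstream_model_type="transformers.AutoModelForCausalLM",
                 **kwargs):
        # The task class only pins the default AutoModel type;
        # everything else passes through to the base class.
        super().__init__(*args, downstream_model_type=downstream_model_type, **kwargs)


task = LanguageModelingTransformer(backbone="gpt2")
print(task.downstream_model_type)  # transformers.AutoModelForCausalLM
```

The other task classes differ only in the pinned default (AutoModelForMultipleChoice, AutoModelForQuestionAnswering, and so on).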

class lightning_transformers.task.nlp.multiple_choice.MultipleChoiceTransformer(*args, downstream_model_type='transformers.AutoModelForMultipleChoice', **kwargs)

Defines LightningModule for the Multiple Choice Task.

Parameters
  • *args – lightning_transformers.core.nlp.HFTransformer arguments.

  • downstream_model_type (str) – Downstream HuggingFace AutoModel to load. (default transformers.AutoModelForMultipleChoice)

  • **kwargs – lightning_transformers.core.nlp.HFTransformer arguments.

class lightning_transformers.task.nlp.question_answering.QuestionAnsweringTransformer(*args, downstream_model_type='transformers.AutoModelForQuestionAnswering', **kwargs)

Defines LightningModule for the Question Answering Task.

Parameters
  • *args – lightning_transformers.core.nlp.HFTransformer arguments.

  • downstream_model_type (str) – Downstream HuggingFace AutoModel to load. (default transformers.AutoModelForQuestionAnswering)

  • **kwargs – lightning_transformers.core.nlp.HFTransformer arguments.

class lightning_transformers.task.nlp.summarization.SummarizationTransformer(*args, downstream_model_type='transformers.AutoModelForSeq2SeqLM', cfg=SummarizationConfig(val_target_max_length=128, num_beams=1, compute_generate_metrics=True, use_stemmer=True, rouge_newline_sep=True), **kwargs)

Defines LightningModule for the Summarization Task.

Parameters
  • *args – lightning_transformers.core.nlp.seq2seq.Seq2SeqTransformer arguments.

  • downstream_model_type (str) – Downstream HuggingFace AutoModel to load. (default transformers.AutoModelForSeq2SeqLM)

  • **kwargs – lightning_transformers.core.nlp.seq2seq.Seq2SeqTransformer arguments.
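The cfg default in the signature above bundles the generation and ROUGE settings for validation. A stand-in dataclass mirroring the documented defaults (the real SummarizationConfig lives in lightning_transformers.task.nlp.summarization and may carry additional fields; the inline comments are interpretive):

```python
from dataclasses import dataclass


@dataclass
class SummarizationConfig:
    # Defaults as shown in the class signature above.
    val_target_max_length: int = 128        # max length of generated targets at validation
    num_beams: int = 1                      # beam search width (1 = greedy decoding)
    compute_generate_metrics: bool = True   # score generated summaries (ROUGE)
    use_stemmer: bool = True                # apply stemming in ROUGE scoring
    rouge_newline_sep: bool = True          # treat newlines as sentence separators for ROUGE


# Override only what you need; remaining fields keep their defaults.
cfg = SummarizationConfig(num_beams=4)
```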

class lightning_transformers.task.nlp.text_classification.TextClassificationTransformer(*args, downstream_model_type='transformers.AutoModelForSequenceClassification', **kwargs)

Defines LightningModule for the Text Classification Task.

Parameters
  • *args – lightning_transformers.core.nlp.HFTransformer arguments.

  • downstream_model_type (str) – Downstream HuggingFace AutoModel to load. (default transformers.AutoModelForSequenceClassification)

  • **kwargs – lightning_transformers.core.nlp.HFTransformer arguments.

class lightning_transformers.task.nlp.token_classification.TokenClassificationTransformer(*args, labels, downstream_model_type='transformers.AutoModelForTokenClassification', **kwargs)

Defines LightningModule for the Token Classification Task.

Parameters
  • *args – lightning_transformers.core.nlp.HFTransformer arguments.

  • labels – Labels used to configure the token classification model.

  • downstream_model_type (str) – Downstream HuggingFace AutoModel to load. (default transformers.AutoModelForTokenClassification)

  • **kwargs – lightning_transformers.core.nlp.HFTransformer arguments.

class lightning_transformers.task.nlp.translation.TranslationTransformer(*args, downstream_model_type='transformers.AutoModelForSeq2SeqLM', cfg=TranslationConfig(val_target_max_length=128, num_beams=1, compute_generate_metrics=True, n_gram=4, smooth=False), **kwargs)

Defines LightningModule for the Translation Task.

Parameters
  • *args – lightning_transformers.core.nlp.seq2seq.Seq2SeqTransformer arguments.

  • downstream_model_type (str) – Downstream HuggingFace AutoModel to load. (default transformers.AutoModelForSeq2SeqLM)

  • **kwargs – lightning_transformers.core.nlp.seq2seq.Seq2SeqTransformer arguments.
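TranslationConfig differs from SummarizationConfig only in its metric fields, which parameterize BLEU rather than ROUGE. A stand-in dataclass mirroring the documented defaults (the real class lives in lightning_transformers.task.nlp.translation and may carry additional fields; the inline comments are interpretive):

```python
from dataclasses import dataclass


@dataclass
class TranslationConfig:
    # Defaults as shown in the class signature above.
    val_target_max_length: int = 128        # max length of generated targets at validation
    num_beams: int = 1                      # beam search width (1 = greedy decoding)
    compute_generate_metrics: bool = True   # score generated translations (BLEU)
    n_gram: int = 4                         # maximum n-gram order for BLEU
    smooth: bool = False                    # whether to apply BLEU smoothing


cfg = TranslationConfig(num_beams=5, n_gram=2)
```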

class lightning_transformers.core.nlp.HFTransformer(downstream_model_type, backbone, optimizer=OptimizerConfig(lr=0.001, weight_decay=0.0), scheduler=SchedulerConfig(num_training_steps=-1, num_warmup_steps=0.1), instantiator=None, tokenizer=None, pipeline_kwargs=None, **model_data_kwargs)

Base class for task-specific transformers, wrapping pre-trained language models for downstream tasks. The API is built on top of AutoModel and AutoConfig, provided by HuggingFace.

see: https://huggingface.co/transformers/model_doc/auto.html

Parameters
  • downstream_model_type (str) – The AutoModel downstream model type. See https://huggingface.co/transformers/model_doc/auto.html

  • backbone (HFBackboneConfig) – Config containing backbone specific arguments.

  • optimizer (OptimizerConfig) – Config containing optimizer specific arguments.

  • scheduler (SchedulerConfig) – Config containing scheduler specific arguments.

  • instantiator (Optional[Instantiator]) – Used to instantiate objects (when using Hydra). If Hydra is not being used, the instantiator is not required; functions that use instantiation, such as configure_optimizers, have been overridden.

  • tokenizer (Optional[PreTrainedTokenizerBase]) – The pre-trained tokenizer.

  • pipeline_kwargs (Optional[dict]) – Arguments required for the HuggingFace inference pipeline class.

  • **model_data_kwargs – Arguments passed from the data module to the class.
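Note that the SchedulerConfig default num_warmup_steps=0.1 is a fraction, not a step count. Assuming a float below 1 is resolved as a ratio of the total training steps, which matches how lightning-transformers handles warmup (the helper below is an illustrative stand-in, not the library's API):

```python
def resolve_warmup_steps(num_warmup_steps, num_training_steps):
    """Stand-in for resolving SchedulerConfig.num_warmup_steps.

    A float below 1 is treated as a fraction of the total training
    steps; anything else is taken as an absolute step count.
    """
    if isinstance(num_warmup_steps, float) and num_warmup_steps < 1:
        return int(num_warmup_steps * num_training_steps)
    return int(num_warmup_steps)


print(resolve_warmup_steps(0.1, 10_000))  # 1000 warmup steps
print(resolve_warmup_steps(500, 10_000))  # 500 warmup steps
```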
