Lightning Transformers

Lightning Transformers offers a flexible interface for training and fine-tuning SOTA Transformer models using the PyTorch Lightning Trainer.
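
For example, fine-tuning a text-classification model takes only a few lines. The snippet below is a minimal sketch based on the library's text-classification task; exact class names and constructor arguments may differ between versions:

    import pytorch_lightning as pl
    from transformers import AutoTokenizer
    from lightning_transformers.task.nlp.text_classification import (
        TextClassificationDataModule,
        TextClassificationTransformer,
    )

    # Tokenizer and data module for the GLUE SST-2 sentiment dataset.
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    dm = TextClassificationDataModule(
        batch_size=1,
        dataset_name="glue",
        dataset_config_name="sst2",
        max_length=512,
        tokenizer=tokenizer,
    )

    # The task wraps the HuggingFace model; the Lightning Trainer handles the rest.
    model = TextClassificationTransformer(
        pretrained_model_name_or_path="bert-base-uncased",
        num_labels=dm.num_classes,
    )
    trainer = pl.Trainer(accelerator="auto", devices="auto", max_epochs=1)
    trainer.fit(model, dm)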

  • Powered by PyTorch Lightning - Accelerators, custom Callbacks, Loggers, and high-performance scaling with minimal changes.

  • Backed by HuggingFace Transformers models and datasets, spanning multiple modalities and tasks across NLP, Audio, and Vision.

  • Task Abstraction for Rapid Research & Experimentation - Build your own custom transformer tasks across all modalities with little friction (see the sketch after this list).

  • Powerful config composition backed by Hydra - swap out models, optimizers, schedulers, tasks, and many other configuration options without touching the code (see the Hydra example after this list).

  • Seamless Memory and Speed Optimizations - out-of-the-box training optimizations such as DeepSpeed ZeRO or FairScale Sharded Training with no code changes (see the final example after this list).
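
The task abstraction above boils down to LightningModules that wrap HuggingFace models. The following is a generic, hypothetical sketch of a custom task (not the library's own base class) to show how little glue code is involved:

    import pytorch_lightning as pl
    import torch
    from transformers import AutoModelForSequenceClassification

    class MyTextTask(pl.LightningModule):
        """Hypothetical custom task wrapping a HuggingFace classification model."""

        def __init__(self, model_name: str = "bert-base-uncased", lr: float = 2e-5):
            super().__init__()
            self.model = AutoModelForSequenceClassification.from_pretrained(model_name)
            self.lr = lr

        def training_step(self, batch, batch_idx):
            # HuggingFace models return a loss when the batch contains labels.
            outputs = self.model(**batch)
            self.log("train_loss", outputs.loss)
            return outputs.loss

        def configure_optimizers(self):
            return torch.optim.AdamW(self.parameters(), lr=self.lr)

Because this is a plain LightningModule, it inherits accelerators, callbacks, loggers, and scaling from the Trainer for free.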
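
Config composition means swapping a model or optimizer is an override rather than a code change. Here is a sketch using Hydra's compose API; the config path and override names are illustrative stand-ins, not the library's actual config tree:

    from hydra import compose, initialize

    # Hypothetical config tree: conf/config.yaml plus task/ and optimizer/ groups.
    with initialize(config_path="conf"):
        cfg = compose(
            config_name="config",
            overrides=["task=nlp/text_classification", "optimizer=adamw"],
        )
        print(cfg)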
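
Likewise, the memory and speed optimizations are Trainer flags rather than model changes. A sketch assuming a PyTorch Lightning version with built-in strategy aliases (roughly 1.5 and later):

    import pytorch_lightning as pl

    # Pick a distributed strategy at Trainer construction; the task code is untouched.
    trainer = pl.Trainer(
        accelerator="gpu",
        devices=4,
        precision=16,
        strategy="deepspeed_stage_2",  # DeepSpeed ZeRO stage 2
        # strategy="ddp_sharded",      # or FairScale Sharded Training
    )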
