Lightning Transformers offers a flexible interface for training and fine-tuning SOTA Transformer models using the PyTorch Lightning Trainer.
Powered by PyTorch Lightning - Accelerators, custom Callbacks, Loggers, and high-performance scaling with minimal changes.
Backed by HuggingFace Transformers models and datasets, spanning multiple modalities and tasks across NLP, Audio, and Vision.
Task Abstraction for Rapid Research & Experimentation - Build your own custom transformer tasks across all modalities with little friction.
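To illustrate the idea behind the task abstraction, here is a minimal sketch in plain Python. The class names (`TaskTransformer`, `TextClassificationTask`) and the placeholder loss are hypothetical and chosen for illustration only; they are not the library's actual API.

```python
from abc import ABC, abstractmethod

class TaskTransformer(ABC):
    """Illustrative base class: pairs a backbone model with task-specific logic."""

    def __init__(self, backbone: str):
        # e.g. a HuggingFace model identifier such as "bert-base-uncased"
        self.backbone = backbone

    @abstractmethod
    def training_step(self, batch):
        """Compute the task loss for one batch."""

class TextClassificationTask(TaskTransformer):
    """One concrete task: only the task-specific step needs to be written."""

    def training_step(self, batch):
        # A real implementation would run the backbone plus a classification
        # head; this placeholder just shows where the custom logic lives.
        return 0.0

task = TextClassificationTask(backbone="bert-base-uncased")
print(task.backbone)  # → bert-base-uncased
```

The point of the pattern is that new tasks subclass a shared base and override only the task-specific step, while the surrounding training loop stays untouched.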
Powerful config composition backed by Hydra - simply swap out models, optimizers, schedulers, tasks, and many more configurations without touching the code.
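The composition idea can be sketched without Hydra itself: each config group offers interchangeable options, and a command-line-style override picks one. This is a minimal stdlib-only sketch; the group names, options, and defaults below are hypothetical, not the library's actual config tree.

```python
# Hypothetical config groups, each mapping an option name to its settings.
config_groups = {
    "model": {
        "bert": {"name": "bert-base-uncased", "hidden_size": 768},
        "roberta": {"name": "roberta-base", "hidden_size": 768},
    },
    "optimizer": {
        "adamw": {"lr": 5e-5},
        "sgd": {"lr": 1e-2},
    },
}

def compose(overrides=None):
    """Pick one option per config group, applying CLI-style overrides."""
    choices = {"model": "bert", "optimizer": "adamw"}  # hypothetical defaults
    choices.update(overrides or {})
    return {group: config_groups[group][option] for group, option in choices.items()}

# Equivalent in spirit to passing `optimizer=sgd` on the command line:
cfg = compose({"optimizer": "sgd"})
print(cfg["optimizer"]["lr"])  # → 0.01
```

Swapping the optimizer (or model, scheduler, task) is then a one-word override rather than a code change, which is the workflow Hydra's config groups enable.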