```bash
pip install lightning-transformers
```
In Lightning Transformers, we offer the following benefits:
- Powered by PyTorch Lightning - Accelerators, custom Callbacks, Loggers, and high-performance scaling with minimal changes (see the sketch after this list).
- Backed by HuggingFace Transformers models and datasets, spanning multiple modalities and tasks within NLP/Audio and Vision.
- Task Abstraction for Rapid Research & Experimentation - Build your own custom transformer tasks across all modalities with little friction.
- Powerful config composition backed by Hydra - simply swap out models, optimizers, schedulers, tasks, and many other configuration options without touching the code.
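Because every task is a standard LightningModule, it can also be driven directly from Python with the regular PyTorch Lightning Trainer. The sketch below uses the text-classification task and datamodule class names from the Lightning Transformers docs; constructor arguments vary between releases, so treat it as illustrative rather than exact:

```python
import pytorch_lightning as pl
from transformers import AutoTokenizer

from lightning_transformers.task.nlp.text_classification import (
    TextClassificationDataModule,
    TextClassificationTransformer,
)

# Backbone and tokenizer are ordinary HuggingFace components.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

# The datamodule wraps a HuggingFace dataset (here "emotion") for the task.
dm = TextClassificationDataModule(
    batch_size=8,
    dataset_name="emotion",
    max_length=512,
    tokenizer=tokenizer,
)

# The task is a LightningModule wrapping the HF backbone and classification head.
model = TextClassificationTransformer(
    pretrained_model_name_or_path="bert-base-cased",
    num_labels=6,  # the emotion dataset has 6 classes
)

# Any Lightning Trainer feature (accelerators, callbacks, loggers) applies unchanged.
trainer = pl.Trainer(accelerator="auto", devices="auto", max_epochs=1)
trainer.fit(model, dm)
```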
1. Pick a task to train (passed to `train.py` as `task=<TASK>`)
2. Pick a dataset (passed to `train.py` as `dataset=<DATASET>`)
3. Customize the backbone, optimizer, or any other component within the config
```bash
python train.py \
    task=<TASK> \
    dataset=<DATASET>
    backbone.pretrained_model_name_or_path=<BACKBONE>  # Optionally change the HF backbone
    optimizer=<OPTIMIZER>                               # Optionally specify optimizer (Default AdamW)
    trainer.<ANY_TRAINER_FLAGS>                         # Optionally specify Lightning trainer arguments
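```

For example, fine-tuning a text-classification model on the emotion dataset could look like the following; the task and dataset config names mirror those used in the Lightning Transformers repo, while the backbone and trainer overrides are just illustrative:

```bash
python train.py \
    task=nlp/text_classification \
    dataset=nlp/text_classification/emotion \
    backbone.pretrained_model_name_or_path=bert-base-cased \
    trainer.max_epochs=3
```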
To learn more about Lightning Transformers, please refer to the Lightning Transformers documentation.