AdapterHub Documentation


This is the documentation of the legacy adapter-transformers library, which has been replaced by the new Adapters library.

This documentation is kept for archival purposes, and the library will no longer be updated. Please use the new library for all active projects.

AdapterHub is a framework that simplifies the integration, training, and usage of adapters and other efficient fine-tuning methods for Transformer-based language models. For a full list of currently implemented methods, see the table in our repository.

The framework consists of two main components:

  • adapter-transformers, an extension of Hugging Face’s Transformers library that adds adapter components to transformer models

  • The Hub, a central repository collecting pre-trained adapter modules

Currently, we support the PyTorch versions of all models as listed on the Model Overview page.
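To illustrate the core idea behind the adapter method mentioned above, here is a minimal NumPy sketch of a bottleneck adapter layer in the common down-projection → nonlinearity → up-projection design with a residual connection. The function and variable names are illustrative, not part of the library's API; with the up-projection initialized to zeros (a typical near-identity initialization), the adapter initially passes its input through unchanged.

```python
import numpy as np

def adapter_forward(h, W_down, W_up):
    """Bottleneck adapter: down-project, apply ReLU, up-project, add residual."""
    z = np.maximum(0.0, h @ W_down)  # down-projection to bottleneck dim + ReLU
    return z @ W_up + h              # up-projection back + residual connection

rng = np.random.default_rng(0)
d, r = 768, 64                             # hidden size, bottleneck size (d >> r)
h = rng.standard_normal(d)                 # a hidden state from the transformer
W_down = rng.standard_normal((d, r)) * 0.01
W_up = np.zeros((r, d))                    # zero init: adapter starts as identity

out = adapter_forward(h, W_down, W_up)
```

Because only the small matrices `W_down` and `W_up` (plus biases, omitted here) are trained while the pre-trained transformer weights stay frozen, adapters add far fewer trainable parameters than full fine-tuning.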


Citation

@inproceedings{pfeiffer2020adapterhub,
   title={AdapterHub: A Framework for Adapting Transformers},
   author={Jonas Pfeiffer and
           Andreas R\"uckl\'{e} and
           Clifton Poth and
           Aishwarya Kamath and
           Ivan Vuli\'{c} and
           Sebastian Ruder and
           Kyunghyun Cho and
           Iryna Gurevych},
   booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020): Systems Demonstrations},
   year={2020},
   address={Online},
   publisher={Association for Computational Linguistics},
   url={},
   pages={46--54},
}
