Integration with HuggingFace’s Model Hub

Starting with v2.1 of adapter-transformers, you can download adapters from and upload them to HuggingFace’s Model Hub. This document describes how to interact with the Model Hub when working with adapters.

Downloading from the Hub

The HuggingFace Model Hub already provides a few pre-trained adapters for download. To search for available adapters, use the Adapter Transformers library filter on the Model Hub website or follow this link: https://huggingface.co/models?filter=adapter-transformers. Alternatively, all adapters on the HuggingFace Model Hub are also listed on https://adapterhub.ml/explore, together with all adapters uploaded directly to AdapterHub.
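
If you prefer to search programmatically, the same library filter can be queried through the huggingface_hub client. A minimal sketch, assuming huggingface_hub is installed (the modelId attribute name may differ across huggingface_hub versions):

from huggingface_hub import HfApi

# Query the Model Hub for all models tagged with the
# adapter-transformers library filter.
api = HfApi()
for model_info in api.list_models(filter="adapter-transformers"):
    print(model_info.modelId)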

After you have found an adapter you would like to use, loading it into a Transformer model is very similar to loading adapters from AdapterHub. For example, to load and activate the adapter AdapterHub/roberta-base-pf-sick, write:

from transformers import AutoAdapterModel

# Load the base model and attach the adapter downloaded from the Hub.
model = AutoAdapterModel.from_pretrained("roberta-base")
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-sick", source="hf")

# Activate the adapter so it is used in the forward pass.
model.active_adapters = adapter_name

Note that source="hf" is the only change from loading an adapter from AdapterHub.
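
For comparison, a plain AdapterHub download needs no source argument; the identifier sentiment/sst-2@ukp below is only illustrative:

# Loading directly from AdapterHub instead: no source argument needed.
adapter_name = model.load_adapter("sentiment/sst-2@ukp")
model.active_adapters = adapter_name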

Uploading to the Hub

HuggingFace’s Model Hub provides a convenient way for everyone to upload their pre-trained models and share them with the world. Of course, this is now also possible with adapters! In the following, we’ll go through the fastest way of uploading an adapter directly via Python in the adapter-transformers library. For more options and information, e.g. on managing models via the CLI and Git, refer to HuggingFace’s documentation.

  1. Prepare access credentials: Before being able to push to the HuggingFace Model Hub for the first time, we have to store our access token in the cache. This can be done via the transformers-cli by running:

    transformers-cli login
    
  2. Push an adapter: Next, we can proceed to upload our first adapter. Let’s say we have a standard pre-trained Transformers model with an existing adapter named awesome_adapter (e.g. added via model.add_adapter("awesome_adapter") and trained afterwards). We can now push this adapter to the Model Hub using model.push_adapter_to_hub() like this (a combined end-to-end sketch follows after this list):

    model.push_adapter_to_hub(
        "my-awesome-adapter",    # name of the repository to create on the Hub
        "awesome_adapter",       # name of the local adapter to upload
        adapterhub_tag="sentiment/imdb",
        datasets_tag="imdb"
    )
    

    This will create a repository my-awesome-adapter under your username, generate a default adapter card as README.md and upload the adapter named awesome_adapter together with the adapter card to the new repository. adapterhub_tag and datasets_tag provide additional information for categorization.

    Important

    All adapters uploaded to HuggingFace’s Model Hub are automatically also listed on AdapterHub.ml. Thus, for better categorization, either adapterhub_tag or datasets_tag is required when uploading a new adapter to the Model Hub.
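
Putting both steps together, a minimal end-to-end sketch might look as follows. The training loop is omitted, and the adapter, repository, and tag names are the illustrative ones from above:

from transformers import AutoAdapterModel

# Load a pre-trained base model and add a fresh adapter.
model = AutoAdapterModel.from_pretrained("roberta-base")
model.add_adapter("awesome_adapter")

# Freeze the base model weights and activate the adapter for training.
model.train_adapter("awesome_adapter")

# ... training loop omitted ...

# Push the trained adapter to the Model Hub
# (requires a prior `transformers-cli login`).
model.push_adapter_to_hub(
    "my-awesome-adapter",
    "awesome_adapter",
    adapterhub_tag="sentiment/imdb",
    datasets_tag="imdb"
)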

Voilà! Your first adapter is on the HuggingFace Model Hub. Anyone can now run:

model.load_adapter("<your_username>/my-awesome-adapter", source="hf")
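
In context, the complete loading code mirrors the example from the beginning of this document (assuming the adapter was trained on roberta-base):

from transformers import AutoAdapterModel

# The base model must match the one the adapter was trained on.
model = AutoAdapterModel.from_pretrained("roberta-base")
adapter_name = model.load_adapter("<your_username>/my-awesome-adapter", source="hf")
model.active_adapters = adapter_name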

To update your adapter, simply run push_adapter_to_hub() with the same repository name again. This will push a new commit to the existing repository.

You can find the full documentation of push_adapter_to_hub() in the adapter-transformers API documentation.