How to Use Hugging Face API

Last Updated : 27 Mar, 2026

Hugging Face is an AI platform that provides easy access to pre-trained models through its API, enabling tasks like text generation, translation and sentiment analysis without building models from scratch. Its key advantages include:

  • Simplifies working with AI models, making it beginner-friendly.
  • Provides a wide range of ready-to-use models for different tasks.
  • Enables quick integration, saving time and effort in development.

Setting Up the Hugging Face API

To start using the Hugging Face API, we need to create an account, install the required library and authenticate using an API token.

Step 1: Create a Hugging Face Account and Get API Token

Create an account on Hugging Face. After creating an account, go to your account settings and generate a Hugging Face API token under Access Tokens.

Step 2: Install the Hugging Face Hub Library

  • The Hugging Face Hub library provides an easy-to-use interface for interacting with Hugging Face models and making requests to the API. We can install it using pip.
  • Once we have our API token and the library is installed, we can start making requests to the API.

pip install huggingface_hub

Step 3: Import the Necessary Library and Authenticate

  • In our Python environment, we start by importing the InferenceClient class from the Hugging Face Hub library, then authenticate using our API token.
  • Once authenticated, we are ready to use the API.
Python
from huggingface_hub import InferenceClient

client = InferenceClient(token="your_hf_api_token_here")
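The same token also authenticates raw HTTP calls to the Inference API, which the text-generation example below uses. A minimal sketch of building the required header (the token value is a placeholder, not a real token; real tokens begin with `hf_`):

```python
# Build the Authorization header used for direct HTTP requests
# to the Inference API. The token below is a placeholder.
token = "your_hf_api_token_here"
headers = {"Authorization": f"Bearer {token}"}

print(headers["Authorization"])  # Bearer your_hf_api_token_here
```

The API expects the token in the `Bearer <token>` form; sending the bare token will result in an authorization error.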

Using the Hugging Face API for Text Generation

Using GPT-2 for text generation is straightforward with Hugging Face's API. By sending an input prompt, we can generate coherent, engaging text for various applications.

  • Import the required library (e.g., requests) and set up API details with model URL and headers.
  • Define a query function that sends a POST request with a prompt.
  • Receive and process the generated text from the API response.
Python
import requests

API_URL = "https://api-inference.huggingface.co/models/gpt2"
headers = {"Authorization": "Bearer your_hf_api_token_here"}  # replace with your API token

def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

def generate_text(prompt):
    response = query({"inputs": prompt})

    print(response)  # print the raw JSON response for inspection

    # Successful responses are a list of dicts with a "generated_text" key
    if isinstance(response, list) and len(response) > 0:
        return response[0].get('generated_text', "No text generated.")
    else:
        return "Error: Unable to generate text."

prompt = "Once upon a time"
print(generate_text(prompt))

Output:

[{'generated_text': 'Once upon a time it seemed like Lightsaber on Coruscant would be a terrible threat. to Spike they decided . . . .and they continued their'}]
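Because the response shape varies (a list of dicts on success, a dict with an `error` key on failure, for example while a model is still loading), the parsing step can be factored into a small defensive helper. A hedged sketch, assuming the JSON shapes shown above; `parse_generation` is an illustrative name, not part of any library:

```python
def parse_generation(response):
    # Success: a list like [{"generated_text": "..."}]
    if isinstance(response, list) and len(response) > 0:
        return response[0].get("generated_text", "No text generated.")
    # Failure: a dict like {"error": "Model gpt2 is currently loading"}
    if isinstance(response, dict) and "error" in response:
        return f"Error: {response['error']}"
    return "Error: Unable to generate text."

# Local examples with sample payloads (no network call required):
print(parse_generation([{"generated_text": "Once upon a time, a fox"}]))
print(parse_generation({"error": "Model gpt2 is currently loading"}))
```

This keeps the request code and the response handling separate, so the same helper can be reused for other text-generation models.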
