
🌟 Gemini 2.5 Pro – API Proxy Model

This repository hosts a Custom Hugging Face Inference Endpoint that connects directly to Google Gemini through the official google-generativeai SDK.


🧠 Overview

This model is not an open-weight model, but a proxy wrapper around the official Google Gemini API.

It allows developers to deploy a Gemini-powered chatbot on Hugging Face with:

  • Full control over handler.py
  • Secure API key injection via environment variables
  • Compatibility with any frontend (React, Flask, HTML, etc.)
  • Optional CORS support for browser-based chatbots
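Because the endpoint speaks plain JSON over HTTP, any frontend can call it. The sketch below builds such a request in Python; the endpoint URL and token are placeholders, not values from this repository.

```python
import json

# Hypothetical values -- replace with your own deployment's URL and token.
ENDPOINT_URL = "https://<your-endpoint>.endpoints.huggingface.cloud"
headers = {
    "Authorization": "Bearer hf_xxx",  # your Hugging Face access token
    "Content-Type": "application/json",
}
# The payload envelope expected by the handler (see "How It Works" below).
payload = json.dumps({"inputs": "Hello Gemini!"})

# Sending it is then a single call, e.g.:
# requests.post(ENDPOINT_URL, headers=headers, data=payload)
```

The same payload can be sent from a browser with fetch(), which is where the optional CORS support matters.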

βš™οΈ How It Works

When deployed, the endpoint:

  1. Receives a POST request (/) with a JSON body such as:
    {"inputs": "Hello Gemini!"}
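A minimal handler.py for such a proxy might look like the sketch below. It follows the Hugging Face custom-handler convention (an EndpointHandler class with a __call__ method) and the google-generativeai SDK interface; the environment-variable name GEMINI_API_KEY and the model id are assumptions, not taken from this repository.

```python
# handler.py -- minimal sketch of a custom Inference Endpoint handler
# that proxies requests to the Gemini API.
import os


class EndpointHandler:
    def __init__(self, path: str = ""):
        # Imported lazily so the module can be inspected without the SDK installed.
        import google.generativeai as genai

        # GEMINI_API_KEY is an assumed env-var name; set it in the
        # endpoint's Environment Variables, never in the repo.
        genai.configure(api_key=os.environ["GEMINI_API_KEY"])
        self.model = genai.GenerativeModel("gemini-2.5-pro")  # assumed model id

    def __call__(self, data: dict) -> dict:
        prompt = extract_prompt(data)
        response = self.model.generate_content(prompt)
        return {"generated_text": response.text}


def extract_prompt(data) -> str:
    """Accept either a raw string or the {"inputs": ...} envelope."""
    if isinstance(data, str):
        return data
    return data.get("inputs", "")
```

Keeping the key in an environment variable is what makes the proxy safe to expose: the browser only ever sees the Hugging Face endpoint, never the Gemini credentials.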
    