# Gemini 2.5 Pro: API Proxy Model
This repository hosts a Custom Hugging Face Inference Endpoint that connects directly to Google Gemini through the official google-generativeai SDK.
## Overview
This model is not an open-weight model, but a proxy wrapper around the official Google Gemini API.
It allows developers to deploy a Gemini-powered chatbot on Hugging Face with:
- Full `handler.py` control
- Secure API key injection via environment variables
- Compatibility with any frontend (React, Flask, plain HTML, etc.)
- Optional CORS support for browser-based chatbots
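The points above can be sketched as a minimal `handler.py` for a custom Hugging Face Inference Endpoint. This is an illustrative sketch, not the repository's actual handler: the model name `gemini-2.5-pro`, the environment variable name `GOOGLE_API_KEY`, and the response shape are assumptions.

```python
import os


class EndpointHandler:
    """Hypothetical sketch of a Gemini proxy handler for a HF custom endpoint."""

    def __init__(self, path: str = ""):
        # API key is injected via the endpoint's environment variables,
        # never hard-coded in the repository. (GOOGLE_API_KEY is an assumed name.)
        self.api_key = os.environ.get("GOOGLE_API_KEY")
        self.model = None  # created lazily on first request

    def _extract_prompt(self, data: dict) -> str:
        # Accepts the {"inputs": "..."} payload shape used by HF endpoints.
        inputs = data.get("inputs", "")
        return inputs if isinstance(inputs, str) else str(inputs)

    def __call__(self, data: dict) -> dict:
        prompt = self._extract_prompt(data)
        if self.model is None:
            # Imported lazily so the handler can be constructed without the SDK.
            import google.generativeai as genai

            genai.configure(api_key=self.api_key)
            self.model = genai.GenerativeModel("gemini-2.5-pro")  # assumed model id
        response = self.model.generate_content(prompt)
        return {"generated_text": response.text}
```

CORS, if needed for browser-based chatbots, would be handled at the serving layer (response headers), not inside this class.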
## How It Works
When deployed, the endpoint:
- Receives a POST request (`/`) with either: `{"inputs": "Hello Gemini!"}`
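A call to the deployed endpoint might look like the stdlib-only client below. The endpoint URL and token are placeholders you supply after deployment; the `{"inputs": ...}` payload shape comes from the request format above.

```python
import json
import urllib.request


def build_payload(prompt: str) -> bytes:
    # The endpoint accepts a JSON body of the form {"inputs": "..."}.
    return json.dumps({"inputs": prompt}).encode("utf-8")


def query_endpoint(url: str, hf_token: str, prompt: str) -> dict:
    # url and hf_token are placeholders: your deployed endpoint URL
    # and a Hugging Face access token with access to it.
    req = urllib.request.Request(
        url,
        data=build_payload(prompt),
        headers={
            "Authorization": f"Bearer {hf_token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Usage would be `query_endpoint("https://<your-endpoint>.endpoints.huggingface.cloud", token, "Hello Gemini!")`, returning the handler's JSON response.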