# Providers
Prompt2PR supports four LLM providers. Providers are interchangeable — the action behaves the same regardless of which one you choose. Pick based on your preference, existing subscriptions, or model availability.
## Mistral
Mistral AI offers high-quality models with competitive pricing.
Default model: `mistral-large-latest`
### Setup
- Sign up at console.mistral.ai
- Go to API Keys and create a new key
- Add `MISTRAL_API_KEY` as a GitHub Secret in your repository
- Set `provider: mistral` in your workflow
### Workflow
```yaml
- uses: davd-gzl/Prompt2PR@v1
  with:
    prompt: 'Your prompt here'
    provider: mistral
  env:
    MISTRAL_API_KEY: ${{ secrets.MISTRAL_API_KEY }}
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```
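The step above assumes an enclosing job. For context, here is a minimal full-workflow sketch; the `workflow_dispatch` trigger is an arbitrary choice, and the `contents: write` / `pull-requests: write` permissions are carried over from the GitHub Models example further down rather than Mistral-specific requirements.

```yaml
# Minimal sketch of a complete workflow. Assumptions: a manual trigger, and
# the same write permissions shown in the GitHub Models example below.
name: Prompt2PR
on: workflow_dispatch

permissions:
  contents: write
  pull-requests: write

jobs:
  prompt2pr:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: davd-gzl/Prompt2PR@v1
        with:
          prompt: 'Your prompt here'
          provider: mistral
        env:
          MISTRAL_API_KEY: ${{ secrets.MISTRAL_API_KEY }}
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```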
## OpenAI
OpenAI provides GPT models with broad language and code understanding.
Default model: `gpt-4o`
### Setup
- Sign up at platform.openai.com
- Go to API Keys and create a new secret key
- Add `OPENAI_API_KEY` as a GitHub Secret in your repository
- Set `provider: openai` in your workflow
### Workflow
```yaml
- uses: davd-gzl/Prompt2PR@v1
  with:
    prompt: 'Your prompt here'
    provider: openai
  env:
    OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```
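If the default doesn't suit you, the `model` input used in the GitHub Models example below should let you pick a different one. Treat this as an assumption about the action's inputs rather than documented behaviour; `gpt-4o-mini` is only an illustrative model ID.

```yaml
# Sketch: overriding the default model. Assumes the `model` input also applies
# to the openai provider; `gpt-4o-mini` is just an example ID.
- uses: davd-gzl/Prompt2PR@v1
  with:
    prompt: 'Your prompt here'
    provider: openai
    model: gpt-4o-mini
  env:
    OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```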
## Anthropic
Anthropic provides Claude models known for detailed, careful analysis.
Default model: `claude-sonnet-4-20250514`
### Setup
- Sign up at console.anthropic.com
- Go to API Keys and create a new key
- Add `ANTHROPIC_API_KEY` as a GitHub Secret in your repository
- Set `provider: anthropic` in your workflow
### Workflow
```yaml
- uses: davd-gzl/Prompt2PR@v1
  with:
    prompt: 'Your prompt here'
    provider: anthropic
  env:
    ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```
## GitHub Models
GitHub Models lets you use LLMs directly through GitHub’s infrastructure. No external API key required — it uses the built-in `GITHUB_TOKEN`.
Default model: `openai/gpt-4o`
### Prerequisites
- A GitHub Copilot subscription
- Access to GitHub Models
### Setup
- No additional secrets needed
- Add `models: read` to your workflow permissions
- Set `provider: github` in your workflow
- Use models in `publisher/model-name` format
### Available Models
GitHub Models hosts models from multiple publishers. Use the `publisher/model-name` format:

- `openai/gpt-4o`
- `anthropic/claude-sonnet-4.5`
- And more — check the GitHub Models marketplace
### Workflow
```yaml
permissions:
  contents: write
  pull-requests: write
  models: read

jobs:
  prompt2pr:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: davd-gzl/Prompt2PR@v1
        with:
          prompt: 'Your prompt here'
          provider: github
          model: openai/gpt-4o
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```
**Important:** The `models: read` permission is required. Without it, the action will fail with an authentication error.
{: .notice--warning}
## Switching Providers
Since providers are interchangeable, switching is straightforward. Change two things:
- The `provider:` input
- The API key in the `env:` block
```yaml
# Before (Mistral)
provider: mistral
env:
  MISTRAL_API_KEY: ${{ secrets.MISTRAL_API_KEY }}

# After (OpenAI)
provider: openai
env:
  OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
```
The action handles all differences in API format, authentication, and response parsing internally.
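If you switch regularly, one option is to choose the provider at run time. The sketch below is not from the Prompt2PR docs: it assumes a `workflow_dispatch` trigger with a `provider` choice input, and that the action simply ignores API keys for providers it isn't using.

```yaml
# Sketch only: pick the provider when triggering the workflow manually.
# Assumes the action ignores env vars for providers it is not using.
on:
  workflow_dispatch:
    inputs:
      provider:
        description: 'LLM provider to use'
        type: choice
        options: [mistral, openai]
        default: mistral

permissions:
  contents: write
  pull-requests: write

jobs:
  prompt2pr:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: davd-gzl/Prompt2PR@v1
        with:
          prompt: 'Your prompt here'
          provider: ${{ inputs.provider }}
        env:
          MISTRAL_API_KEY: ${{ secrets.MISTRAL_API_KEY }}
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```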
## Custom Endpoints
Use the `base_url` input to point any provider at a custom endpoint:
```yaml
with:
  provider: openai
  base_url: 'https://my-proxy.example.com/v1'
```
This is useful for API proxies, self-hosted models, or testing with mock servers.
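Putting it together, a complete step with a custom endpoint might look like the sketch below; the proxy URL is a placeholder, and the assumption is that the usual provider API key still handles authentication.

```yaml
# Sketch: routing the openai provider through a custom endpoint.
# 'https://my-proxy.example.com/v1' is a placeholder; swap in your own.
- uses: davd-gzl/Prompt2PR@v1
  with:
    prompt: 'Your prompt here'
    provider: openai
    base_url: 'https://my-proxy.example.com/v1'
  env:
    OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```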
Next: Examples