Mirror of https://github.com/xtekky/gpt4free.git (synced 2025-12-06 02:30:41 -08:00)
Update provider capabilities and model support (#2600)
Update provider capabilities and model support (#2600)

* Update provider capabilities and model support

  - Update provider documentation with latest model support
  - Remove deprecated models and update model counts
  - Add new model variants and fix formatting
  - Update provider class labels for better clarity
  - Add support for new models including DeepSeek-R1 and sd-turbo
  - Clean up unused model aliases and improve code organization

  Key changes:
  - Update Blackbox vision capabilities
  - Remove legacy models (midijourney, unity, rtist)
  - Add flux variants and update provider counts
  - Set explicit provider labels
  - Update model aliases and mappings
  - Add new model support in multiple providers

* Update g4f/models.py

* Update docs/providers-and-models.md, g4f/models.py, g4f/Provider/Blackbox.py

---------

Co-authored-by: kqlio67 <>
Parent: ac097eff19
Commit: e4c4e7b5ba

7 changed files with 89 additions and 107 deletions
@@ -5,7 +5,7 @@
 This document provides an overview of various AI providers and models, including text generation, image generation, and vision capabilities. It aims to help users navigate the diverse landscape of AI services and choose the most suitable option for their needs.

-> **Note**: See our [Authentication Guide] (authentication.md) for authentication instructions for the provider.
+> **Note**: See our [Authentication Guide](authentication.md) for authentication instructions for the provider.

 ## Table of Contents
@@ -40,7 +40,7 @@ This document provides an overview of various AI providers and models, including
 |[aichatfree.info](https://aichatfree.info)|No auth required|`g4f.Provider.AIChatFree`|`gemini-1.5-pro` _**(1+)**_|❌|❌|✔||
 |[aiuncensored.info/ai_uncensored](https://www.aiuncensored.info/ai_uncensored)|Optional API key|`g4f.Provider.AIUncensored`|`hermes-3`|❌|❌|✔||
 |[autonomous.ai](https://www.autonomous.ai/anon/)|No auth required|`g4f.Provider.AutonomousAI`|`llama-3.3-70b, qwen-2.5-coder-32b, hermes-3, llama-3.2-90b, llama-3.3-70b, llama-3-2-70b`|✔|❌|✔||
-|[blackbox.ai](https://www.blackbox.ai)|No auth required|`g4f.Provider.Blackbox`|`blackboxai, gpt-4, gpt-4o, gemini-1.5-flash, gemini-1.5-pro, claude-3.5-sonnet, blackboxai-pro, llama-3.1-8b, llama-3.1-70b, llama-3-1-405b, llama-3.3-70b, mixtral-7b, deepseek-chat, dbrx-instruct, qwq-32b, hermes-2-dpo, deepseek-r1` _**(+31)**_|`flux`|`blackboxai, gpt-4o, gemini-1.5-pro, gemini-1.5-flash, llama-3.1-8b, llama-3.1-70b, llama-3.1-405b`|✔||
+|[blackbox.ai](https://www.blackbox.ai)|No auth required|`g4f.Provider.Blackbox`|`blackboxai, gpt-4, gpt-4o, gemini-1.5-flash, gemini-1.5-pro, claude-3.5-sonnet, blackboxai-pro, llama-3.1-8b, llama-3.1-70b, llama-3-1-405b, llama-3.3-70b, mixtral-7b, deepseek-chat, dbrx-instruct, qwq-32b, hermes-2-dpo, deepseek-r1` _**(+31)**_|`flux`|`blackboxai, gpt-4o, gemini-1.5-pro, gemini-1.5-flash, llama-3.1-8b, llama-3.1-70b, llama-3.1-405b, deepseek-r1`|✔||
 |[cablyai.com](https://cablyai.com)|No auth required|`g4f.Provider.CablyAI`|`cably-80b`|❌|❌|✔||
 |[chatglm.cn](https://chatglm.cn)|No auth required|`g4f.Provider.ChatGLM`|`glm-4`|❌|❌|✔||
 |[chatgpt.com](https://chatgpt.com)|No auth required|`g4f.Provider.ChatGpt`|✔ _**(+7)**_|❌|❌|✔||
@@ -50,12 +50,13 @@ This document provides an overview of various AI providers and models, including
 |[copilot.microsoft.com](https://copilot.microsoft.com)|Optional API key|`g4f.Provider.Copilot`|`gpt-4, gpt-4o`|❌|❌|✔||
 |[darkai.foundation](https://darkai.foundation)|No auth required|`g4f.Provider.DarkAI`|`gpt-3.5-turbo, gpt-4o, llama-3.1-70b`|❌|❌|✔||
 |[duckduckgo.com/aichat](https://duckduckgo.com/aichat)|No auth required|`g4f.Provider.DDG`|`gpt-4, gpt-4o-mini, claude-3-haiku, llama-3.1-70b, mixtral-8x7b`|❌|❌|✔||
-|[deepinfra.com/chat](https://deepinfra.com/chat)|No auth required|`g4f.Provider.DeepInfraChat`|`llama-3.1-8b, llama-3.1-70b, qwq-32b, wizardlm-2-8x22b, wizardlm-2-7b, qwen-2-72b, qwen-2.5-coder-32b, nemotron-70b`|❌|❌|✔||
+|[deepinfra.com/chat](https://deepinfra.com/chat)|No auth required|`g4f.Provider.DeepInfraChat`|`llama-3.1-8b, llama-3.1-70b, qwq-32b, wizardlm-2-8x22b, wizardlm-2-7b, qwen-2.5-72b, qwen-2.5-coder-32b, nemotron-70b`|❌|❌|✔||
 |[chat10.free2gpt.xyz](https://chat10.free2gpt.xyz)|No auth required|`g4f.Provider.Free2GPT`|`mistral-7b`|❌|❌|✔||
 |[freegptsnav.aifree.site](https://freegptsnav.aifree.site)|No auth required|`g4f.Provider.FreeGpt`|`gemini-1.5-pro`|❌|❌|✔||
 |[app.giz.ai/assistant](https://app.giz.ai/assistant)|No auth required|`g4f.Provider.GizAI`|`gemini-1.5-flash`|❌|❌|✔||
 |[gprochat.com](https://gprochat.com)|No auth required|`g4f.Provider.GPROChat`|`gemini-1.5-pro`|❌|❌|✔||
-|[editor.imagelabs.net](editor.imagelabs.net)|No auth required|`g4f.Provider.ImageLabs`|❌|✔ _**(1+)**_|❌|✔||
+|[editor.imagelabs.net](https://editor.imagelabs.net)|No auth required|`g4f.Provider.ImageLabs`|`gemini-1.5-pro`|❌|❌|✔||
+|[editor.imagelabs.net](editor.imagelabs.net)|No auth required|`g4f.Provider.ImageLabs`|❌|`sdxl-turbo`|❌|✔||
 |[huggingface.co/spaces](https://huggingface.co/spaces)|Optional API key|`g4f.Provider.HuggingSpace`|`qvq-72b, qwen-2-72b, command-r, command-r-plus, command-r7b`|`flux-dev, flux-schnell, sd-3.5`|❌|✔||
 |[jmuz.me](https://jmuz.me)|Optional API key|`g4f.Provider.Jmuz`|`claude-3-haiku, claude-3-opus, claude-3-haiku, claude-3.5-sonnet, deepseek-r1, deepseek-chat, gemini-exp, gemini-1.5-flash, gemini-1.5-pro, gemini-2.0-flash-thinking, gpt-4, gpt-4o, gpt-4o-mini, llama-3-70b, llama-3-8b, llama-3.1-405b, llama-3.1-70b, llama-3.1-8b, llama-3.2-11b, llama-3.2-90b, llama-3.3-70b, mixtral-8x7b, qwen-2.5-72b, qwen-2.5-coder-32b, qwq-32b, wizardlm-2-8x22b`|❌|❌|✔||
 |[liaobots.work](https://liaobots.work)|[Automatic cookies](https://liaobots.work)|`g4f.Provider.Liaobots`|`grok-2, gpt-4o-mini, gpt-4o, gpt-4, o1-preview, o1-mini, claude-3-opus, claude-3.5-sonnet, claude-3-sonnet, gemini-1.5-flash, gemini-1.5-pro, gemini-2.0-flash, gemini-2.0-flash-thinking`|❌|❌|✔||
@@ -64,7 +65,7 @@ This document provides an overview of various AI providers and models, including
 |[labs.perplexity.ai](https://labs.perplexity.ai)|No auth required|`g4f.Provider.PerplexityLabs`|`sonar-online, sonar-chat, llama-3.3-70b, llama-3.1-8b, llama-3.1-70b, lfm-40b`|❌|❌|✔||
 |[pi.ai/talk](https://pi.ai/talk)|[Manual cookies](https://pi.ai/talk)|`g4f.Provider.Pi`|`pi`|❌|❌|✔||
 |[pizzagpt.it](https://www.pizzagpt.it)|No auth required|`g4f.Provider.Pizzagpt`|`gpt-4o-mini`|❌|❌|✔||
-|[pollinations.ai](https://pollinations.ai)|No auth required|`g4f.Provider.PollinationsAI`|`gpt-4o, mistral-large, mistral-nemo, llama-3.3-70b, gpt-4, qwen-2-72b, qwen-2.5-coder-32b, claude-3.5-sonnet, claude-3.5-haiku, command-r, deepseek-chat, llama-3.1-8b, evil, p1, unity, midijourney, rtist`|`flux, midjourney, dall-e-3, sd-turbo`|❌|✔||
+|[pollinations.ai](https://pollinations.ai)|No auth required|`g4f.Provider.PollinationsAI`|`gpt-4o-mini, gpt-4o, qwen-2.5-72b, qwen-2.5-coder-32b, llama-3.3-70b, mistral-nemo, deepseek-chat, llama-3.1-8b, deepseek-r1` _**(2+)**_|`flux, flux-pro, flux-realism, flux-cablyai, flux-anime, flux-3d, midjourney, dall-e-3, sdxl-turbo`|`gpt-4o, gpt-4o-mini`|✔||
 |[app.prodia.com](https://app.prodia.com)|No auth required|`g4f.Provider.Prodia`|❌|✔ _**(46)**_|❌|❌||
 |[teach-anything.com](https://www.teach-anything.com)|No auth required|`g4f.Provider.TeachAnything`|`llama-3.1-70b`|❌|❌|✔||
 |[you.com](https://you.com)|[Manual cookies](https://you.com)|`g4f.Provider.You`|✔|✔|✔|✔||
@@ -72,7 +73,7 @@ This document provides an overview of various AI providers and models, including
 ---
 ### Providers HuggingSpace
-| Website | API Credentials | Provider | Text Models | Image Models | Vision Models | Stream | Status | Auth |
+| Website | API Credentials | Provider | Text Models | Image Models | Vision (Image Upload) | Stream | Status | Auth |
 |----------|-------------|--------------|---------------|--------|--------|------|------|------|
 |[black-forest-labs-flux-1-dev.hf.space](https://black-forest-labs-flux-1-dev.hf.space)|[Get API key](https://huggingface.co/settings/tokens)|`g4f.Provider.BlackForestLabsFlux1Dev`|❌|`flux-dev`|❌|✔||
 |[black-forest-labs-flux-1-schnell.hf.space](https://black-forest-labs-flux-1-schnell.hf.space)|[Get API key](https://huggingface.co/settings/tokens)|`g4f.Provider.BlackForestLabsFlux1Schnell`|❌|`flux-schnell`|❌|✔||
@@ -83,9 +84,8 @@ This document provides an overview of various AI providers and models, including
 |[voodoohop-flux-1-schnell.hf.space](https://voodoohop-flux-1-schnell.hf.space)|[Get API key](https://huggingface.co/settings/tokens)|`g4f.Provider.VoodoohopFlux1Schnell`|❌|`flux-schnell`|❌|✔||

 ---

 ### Providers Needs Auth
-| Website | API Credentials | Provider | Text Models | Image Models | Vision Models | Stream | Status |
+| Website | API Credentials | Provider | Text Models | Image Models | Vision (Image Upload) | Stream | Status |
 |----------|-------------|--------------|---------------|--------|--------|------|------|
 |[console.anthropic.com](https://console.anthropic.com)|[Get API key](https://console.anthropic.com/settings/keys)|`g4f.Provider.Anthropic`|✔ _**(8+)**_|❌|❌|✔||
 |[bing.com/images/create](https://www.bing.com/images/create)|[Manual cookies](https://www.bing.com)|`g4f.Provider.BingCreateImages`|❌|`dall-e-3`|❌|❌||
@@ -122,8 +122,8 @@ This document provides an overview of various AI providers and models, including
 |-------|---------------|-----------|---------|
 |gpt-3|OpenAI|2+ Providers|[platform.openai.com](https://platform.openai.com/docs/models/gpt-3-5-turbo)|
 |gpt-3.5-turbo|OpenAI|2+ Providers|[platform.openai.com](https://platform.openai.com/docs/models/gpt-3-5-turbo)|
-|gpt-4|OpenAI|11+ Providers|[platform.openai.com](https://platform.openai.com/docs/models/gpt-4-turbo-and-gpt-4)|
+|gpt-4|OpenAI|10+ Providers|[platform.openai.com](https://platform.openai.com/docs/models/gpt-4-turbo-and-gpt-4)|
-|gpt-4o|OpenAI|9+ Providers|[platform.openai.com](https://platform.openai.com/docs/models/gpt-4o)|
+|gpt-4o|OpenAI|10+ Providers|[platform.openai.com](https://platform.openai.com/docs/models/gpt-4o)|
 |gpt-4o-mini|OpenAI|9+ Providers|[platform.openai.com](https://platform.openai.com/docs/models/gpt-4o-mini)|
 |o1|OpenAI|1+ Providers|[platform.openai.com](https://openai.com/index/introducing-openai-o1-preview/)|
 |o1-preview|OpenAI|1+ Providers|[platform.openai.com](https://openai.com/index/introducing-openai-o1-preview/)|
@@ -134,23 +134,23 @@ This document provides an overview of various AI providers and models, including
 |llama-3-8b|Meta Llama|2+ Providers|[ai.meta.com](https://ai.meta.com/blog/meta-llama-3/)|
 |llama-3-70b|Meta Llama|1+ Providers|[huggingface.co](https://huggingface.co/meta-llama/Meta-Llama-3-70B)|
 |llama-3.1-8b|Meta Llama|6+ Providers|[ai.meta.com](https://ai.meta.com/blog/meta-llama-3-1/)|
-|llama-3.1-70b|Meta Llama|9+ Providers|[ai.meta.com](https://ai.meta.com/blog/meta-llama-3-1/)|
+|llama-3.1-70b|Meta Llama|6+ Providers|[ai.meta.com](https://ai.meta.com/blog/meta-llama-3-1/)|
 |llama-3.1-405b|Meta Llama|2+ Providers|[huggingface.co](https://huggingface.co/meta-llama/Llama-3.1-405B)|
 |llama-3.2-1b|Meta Llama|1+ Providers|[huggingface.co](https://huggingface.co/meta-llama/Llama-3.2-1B)|
 |llama-3.2-11b|Meta Llama|3+ Providers|[ai.meta.com](https://ai.meta.com/blog/llama-3-2-connect-2024-vision-edge-mobile-devices/)|
 |llama-3.2-70b|Meta Llama|1+ Providers|[ai.meta.com](https://ai.meta.com/blog/llama-3-2-connect-2024-vision-edge-mobile-devices/)|
 |llama-3.2-90b|Meta Llama|2+ Providers|[huggingface.co](https://huggingface.co/meta-llama/Llama-3.2-90B-Vision)|
-|llama-3.3-70b|Meta Llama|7+ Providers|[llama.com/]()|
+|llama-3.3-70b|Meta Llama|8+ Providers|[]( )|
 |mixtral-7b|Mistral AI|1+ Providers|[mistral.ai](https://mistral.ai/news/mixtral-of-experts/)|
 |mixtral-8x7b|Mistral AI|2+ Providers|[mistral.ai](https://mistral.ai/news/mixtral-of-experts/)|
 |mistral-nemo|Mistral AI|3+ Providers|[huggingface.co](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407)|
-|mistral-large|Mistral AI|1+ Providers|[mistral.ai](https://mistral.ai/news/mistral-large-2407/)|
 |hermes-2-dpo|NousResearch|2+ Providers|[huggingface.co](https://huggingface.co/NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO)|
 |hermes-3|NousResearch|2+ Providers|[nousresearch.com](https://nousresearch.com/hermes3/)|
-|phi-3.5-mini|Microsoft|2+ Providers|[huggingface.co](https://huggingface.co/microsoft/Phi-3.5-mini-instruct)|
+|phi-3.5-mini|Microsoft|1+ Providers|[huggingface.co](https://huggingface.co/microsoft/Phi-3.5-mini-instruct)|
 |wizardlm-2-7b|Microsoft|1+ Providers|[wizardlm.github.io](https://wizardlm.github.io/WizardLM2/)|
 |wizardlm-2-8x22b|Microsoft|2+ Providers|[wizardlm.github.io](https://wizardlm.github.io/WizardLM2/)|
-|gemini|Google DeepMind|2+ Providers|[deepmind.google](http://deepmind.google/technologies/gemini/)|
+|gemini|Google DeepMind|1+ Providers|[deepmind.google](http://deepmind.google/technologies/gemini/)|
+|gemini-exp|Google DeepMind|1+ Providers|[blog.google](https://blog.google/feed/gemini-exp-1206/)|
 |gemini-1.5-flash|Google DeepMind|5+ Providers|[deepmind.google](https://deepmind.google/technologies/gemini/flash/)|
 |gemini-1.5-pro|Google DeepMind|7+ Providers|[deepmind.google](https://deepmind.google/technologies/gemini/pro/)|
 |gemini-2.0-flash|Google DeepMind|2+ Providers|[deepmind.google](https://deepmind.google/technologies/gemini/flash/)|
@@ -158,48 +158,45 @@ This document provides an overview of various AI providers and models, including
 |claude-3-haiku|Anthropic|2+ Providers|[anthropic.com](https://www.anthropic.com/news/claude-3-haiku)|
 |claude-3-sonnet|Anthropic|1+ Providers|[anthropic.com](https://www.anthropic.com/news/claude-3-family)|
 |claude-3-opus|Anthropic|2+ Providers|[anthropic.com](https://www.anthropic.com/news/claude-3-family)|
-|claude-3.5-haiku|Anthropic|1+ Providers|[anthropic.com](https://www.anthropic.com/news/claude-3-5-sonnet)|
-|claude-3.5-sonnet|Anthropic|4+ Providers|[anthropic.com](https://www.anthropic.com/news/claude-3-5-sonnet)|
+|claude-3.5-sonnet|Anthropic|3+ Providers|[anthropic.com](https://www.anthropic.com/news/claude-3-5-sonnet)|
 |reka-core|Reka AI|1+ Providers|[reka.ai](https://www.reka.ai/ourmodels)|
 |blackboxai|Blackbox AI|1+ Providers|[docs.blackbox.chat](https://docs.blackbox.chat/blackbox-ai-1)|
 |blackboxai-pro|Blackbox AI|1+ Providers|[docs.blackbox.chat](https://docs.blackbox.chat/blackbox-ai-1)|
-|command-r|CohereForAI|2+ Providers|[docs.cohere.com](https://docs.cohere.com/docs/command-r-plus)|
+|command-r|CohereForAI|1+ Providers|[docs.cohere.com](https://docs.cohere.com/docs/command-r-plus)|
 |command-r-plus|CohereForAI|2+ Providers|[docs.cohere.com](https://docs.cohere.com/docs/command-r-plus)|
 |command-r7b|CohereForAI|1+ Providers|[huggingface.co](https://huggingface.co/CohereForAI/c4ai-command-r7b-12-2024)|
 |qwen-1.5-7b|Qwen|1+ Providers|[huggingface.co](https://huggingface.co/Qwen/Qwen1.5-7B)|
-|qwen-2-72b|Qwen|2+ Providers|[huggingface.co](https://huggingface.co/Qwen/Qwen2-72B)|
+|qwen-2-72b|Qwen|1+ Providers|[huggingface.co](https://huggingface.co/Qwen/Qwen2-72B)|
-|qwen-2.5-72b|Qwen|2+ Providers|[huggingface.co](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct)|
+|qwen-2.5-72b|Qwen|3+ Providers|[huggingface.co](https://huggingface.co/Qwen/Qwen2.5-72B-Instruct)|
-|qwen-2.5-coder-32b|Qwen|4+ Providers|[huggingface.co](https://huggingface.co/Qwen/Qwen2.5-Coder-32B)|
+|qwen-2.5-coder-32b|Qwen|5+ Providers|[huggingface.co](https://huggingface.co/Qwen/Qwen2.5-Coder-32B)|
 |qwq-32b|Qwen|4+ Providers|[huggingface.co](https://huggingface.co/Qwen/QwQ-32B-Preview)|
 |qvq-72b|Qwen|1+ Providers|[huggingface.co](https://huggingface.co/Qwen/QVQ-72B-Preview)|
 |pi|Inflection|1+ Providers|[inflection.ai](https://inflection.ai/blog/inflection-2-5)|
 |deepseek-chat|DeepSeek|3+ Providers|[huggingface.co](https://huggingface.co/deepseek-ai/deepseek-llm-67b-chat)|
-|deepseek-r1|DeepSeek|1+ Providers|[api-docs.deepseek.com](https://api-docs.deepseek.com/news/news250120)|
+|deepseek-r1|DeepSeek|5+ Providers|[api-docs.deepseek.com](https://api-docs.deepseek.com/news/news250120)|
 |grok-2|x.ai|1+ Providers|[x.ai](https://x.ai/blog/grok-2)|
 |sonar-online|Perplexity AI|1+ Providers|[docs.perplexity.ai](https://docs.perplexity.ai/)|
 |sonar-chat|Perplexity AI|1+ Providers|[docs.perplexity.ai](https://docs.perplexity.ai/)|
-|nemotron-70b|Nvidia|2+ Providers|[build.nvidia.com](https://build.nvidia.com/nvidia/llama-3_1-nemotron-70b-instruct)|
+|nemotron-70b|Nvidia|3+ Providers|[build.nvidia.com](https://build.nvidia.com/nvidia/llama-3_1-nemotron-70b-instruct)|
-|lfm-40b|Liquid|2+ Providers|[liquid.ai](https://www.liquid.ai/liquid-foundation-models)|
+|lfm-40b|Liquid|1+ Providers|[liquid.ai](https://www.liquid.ai/liquid-foundation-models)|
 |dbrx-instruct|Databricks|1+ Providers|[huggingface.co](https://huggingface.co/databricks/dbrx-instruct)|
-|p1|PollinationsAI|1+ Providers|[]( )|
 |cably-80b|CablyAI|1+ Providers|[cablyai.com](https://cablyai.com)|
 |glm-4|THUDM|1+ Providers|[github.com/THUDM](https://github.com/THUDM/GLM-4)|
-|evil|Evil Mode - Experimental|2+ Providers|[]( )|
+|evil|Evil Mode - Experimental|1+ Providers|[]( )|
-|midijourney||1+ Providers|[]( )|
-|unity||1+ Providers|[]( )|
-|rtist||1+ Providers|[]( )|

 ---

 ### Image Models
 | Model | Base Provider | Providers | Website |
 |-------|---------------|-----------|---------|
-|sd-turbo||1+ Providers|[huggingface.co](https://huggingface.co/stabilityai/sd-turbo)|
+|sdxl-turbo|Stability AI|2+ Providers|[huggingface.co](https://huggingface.co/stabilityai/sdxl-turbo)|
 |sd-3.5|Stability AI|1+ Providers|[huggingface.co](https://huggingface.co/stabilityai/stable-diffusion-3.5-large)|
-|flux|Black Forest Labs|4+ Providers|[github.com/black-forest-labs/flux](https://github.com/black-forest-labs/flux)|
+|flux|Black Forest Labs|2+ Providers|[github.com/black-forest-labs/flux](https://github.com/black-forest-labs/flux)|
+|flux-pro|Black Forest Labs|1+ Providers|[huggingface.co](https://huggingface.co/enhanceaiteam/FLUX.1-Pro)|
 |flux-dev|Black Forest Labs|3+ Providers|[huggingface.co](https://huggingface.co/black-forest-labs/FLUX.1-dev)|
-|flux-schnell|Black Forest Labs|2+ Providers|[huggingface.co](https://huggingface.co/black-forest-labs/FLUX.1-schnell)|
+|flux-schnell|Black Forest Labs|3+ Providers|[huggingface.co](https://huggingface.co/black-forest-labs/FLUX.1-schnell)|
-|dall-e-3|OpenAI|6+ Providers|[openai.com](https://openai.com/index/dall-e/)|
+|dall-e-3|OpenAI|5+ Providers|[openai.com](https://openai.com/index/dall-e/)|
-|midjourney|Midjourney|2+ Providers|[docs.midjourney.com](https://docs.midjourney.com/docs/model-versions)|
+|midjourney|Midjourney|1+ Providers|[docs.midjourney.com](https://docs.midjourney.com/docs/model-versions)|

 ## Conclusion and Usage Tips
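The model tables above amount to a lookup from a model name to the providers that serve it, filtered by auth requirement. A minimal sketch of querying such a table — the `PROVIDERS` dict and the `providers_for` helper below are illustrative (a hand-picked subset of the table), not g4f's actual data structures:

```python
# Illustrative subset of the provider tables; not g4f's real data model.
PROVIDERS = {
    "Blackbox":  {"auth": "none",     "text_models": {"blackboxai", "gpt-4o", "deepseek-r1"}},
    "Copilot":   {"auth": "optional", "text_models": {"gpt-4", "gpt-4o"}},
    "Anthropic": {"auth": "api_key",  "text_models": {"claude-3.5-sonnet"}},
}

def providers_for(model: str, require_no_auth: bool = False) -> list[str]:
    """Names of providers that list the given text model."""
    return [
        name
        for name, info in PROVIDERS.items()
        if model in info["text_models"]
        and (not require_no_auth or info["auth"] == "none")
    ]
```

The same pattern extends naturally to the image and vision columns by adding more sets per provider.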
@@ -38,7 +38,7 @@ class Blackbox(AsyncGeneratorProvider, ProviderModelMixin):
     default_vision_model = default_model
     default_image_model = 'ImageGeneration'
     image_models = [default_image_model, "ImageGeneration2"]
-    vision_models = [default_vision_model, 'gpt-4o', 'gemini-pro', 'gemini-1.5-flash', 'llama-3.1-8b', 'llama-3.1-70b', 'llama-3.1-405b']
+    vision_models = [default_vision_model, 'gpt-4o', 'gemini-pro', 'gemini-1.5-flash', 'llama-3.1-8b', 'llama-3.1-70b', 'llama-3.1-405b', 'deepseek-r1']

     userSelectedModel = ['gpt-4o', 'gemini-pro', 'claude-sonnet-3.5', 'blackboxai-pro']
@@ -4,7 +4,7 @@ from ..typing import AsyncResult, Messages
 from .needs_auth import OpenaiAPI

 class CablyAI(OpenaiAPI):
-    label = __name__
+    label = "CablyAI"
     url = "https://cablyai.com"
     login_url = None
     needs_auth = False
@@ -35,4 +35,4 @@ class CablyAI(OpenaiAPI):
             messages=messages,
             headers=headers,
             **kwargs
         )
@@ -4,6 +4,7 @@ from ..typing import AsyncResult, Messages
 from .needs_auth.OpenaiTemplate import OpenaiTemplate

 class DeepInfraChat(OpenaiTemplate):
+    label = "DeepInfraChat"
     url = "https://deepinfra.com/chat"
     api_base = "https://api.deepinfra.com/v1/openai"
     working = True
@@ -29,7 +30,7 @@ class DeepInfraChat(OpenaiTemplate):
         "qwq-32b": "Qwen/QwQ-32B-Preview",
         "wizardlm-2-8x22b": "microsoft/WizardLM-2-8x22B",
         "wizardlm-2-7b": "microsoft/WizardLM-2-7B",
-        "qwen-2-72b": "Qwen/Qwen2.5-72B-Instruct",
+        "qwen-2.5-72b": "Qwen/Qwen2.5-72B-Instruct",
         "qwen-2.5-coder-32b": "Qwen/Qwen2.5-Coder-32B-Instruct",
         "nemotron-70b": "nvidia/Llama-3.1-Nemotron-70B-Instruct",
     }
@ -50,4 +51,4 @@ class DeepInfraChat(OpenaiTemplate):
|
||||||
**headers
|
**headers
|
||||||
}
|
}
|
||||||
async for chunk in super().create_async_generator(model, messages, headers=headers, **kwargs):
|
async for chunk in super().create_async_generator(model, messages, headers=headers, **kwargs):
|
||||||
yield chunk
|
yield chunk
|
||||||
|
|
|
||||||
|
|
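The `model_aliases` tables in these providers map short, user-facing model names to backend model ids. A hedged sketch of the resolution step (the `resolve` helper is illustrative, not g4f's exact code):

```python
# Alias entries taken from the DeepInfraChat diff; unknown names pass
# through unchanged so full backend ids can still be requested directly.
model_aliases = {
    "qwen-2.5-72b": "Qwen/Qwen2.5-72B-Instruct",
    "qwen-2.5-coder-32b": "Qwen/Qwen2.5-Coder-32B-Instruct",
    "nemotron-70b": "nvidia/Llama-3.1-Nemotron-70B-Instruct",
}

def resolve(model: str) -> str:
    """Return the backend id for a known alias, or the name unchanged."""
    return model_aliases.get(model, model)

print(resolve("qwen-2.5-72b"))  # Qwen/Qwen2.5-72B-Instruct
print(resolve("not-listed"))    # not-listed
```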
@@ -22,6 +22,7 @@ class ImageLabs(AsyncGeneratorProvider, ProviderModelMixin):
     default_image_model = default_model
     image_models = [default_image_model]
     models = image_models
+    model_aliases = {"sdxl-turbo": default_model}

     @classmethod
     async def create_async_generator(
@@ -38,24 +38,35 @@ class PollinationsAI(AsyncGeneratorProvider, ProviderModelMixin):
     default_model = "openai"
     default_image_model = "flux"
     default_vision_model = "gpt-4o"
-    extra_image_models = ["midjourney", "dall-e-3", "flux-pro", "flux-realism", "flux-cablyai", "flux-anime", "flux-3d"]
+    extra_image_models = ["midjourney", "dall-e-3", "flux-pro"]
     vision_models = [default_vision_model, "gpt-4o-mini"]
-    extra_text_models = [*vision_models, "claude", "claude-email", "karma", "command-r", "llamalight", "mistral-large", "sur", "sur-mistral", "any-dark"]
+    extra_text_models = ["claude", "claude-email", "deepseek-reasoner"] + vision_models
     model_aliases = {
-        "qwen-2-72b": "qwen",
+        ### Text Models ###
+        "gpt-4o-mini": "openai",
+        "gpt-4": "openai-large",
+        "gpt-4o": "openai-large",
+        "qwen-2.5-72b": "qwen",
         "qwen-2.5-coder-32b": "qwen-coder",
         "llama-3.3-70b": "llama",
         "mistral-nemo": "mistral",
-        #"": "karma",
-        #"": "sur-mistral",
-        "gpt-4": "searchgpt",
-        "claude-3.5-haiku": "claude-hybridspace",
-        "claude-3.5-sonnet": "claude-email",
-        "gpt-4": "claude",
+        #"mistral-nemo": "unity", # bug with image url response
+        #"gpt-4o-mini": "midijourney", # bug with the answer
+        "gpt-4o-mini": "rtist",
+        "gpt-4o": "searchgpt",
+        #"mistral-nemo": "evil",
+        "gpt-4o-mini": "p1",
         "deepseek-chat": "deepseek",
-        "llama-3.1-8b": "llamalight",
+        "deepseek-chat": "claude-hybridspace",
+        "llama-3.1-8b": "llamalight",
+        "gpt-4o-vision": "gpt-4o",
+        "gpt-4o-mini-vision": "gpt-4o-mini",
+        "gpt-4o-mini": "claude",
+        "deepseek-chat": "claude-email",
+        "deepseek-r1": "deepseek-reasoner",

         ### Image Models ###
-        "sd-turbo": "turbo",
+        "sdxl-turbo": "turbo",
     }
     text_models = []
@@ -244,4 +255,4 @@ class PollinationsAI(AsyncGeneratorProvider, ProviderModelMixin):
                     break
             except json.JSONDecodeError:
                 yield decoded_chunk.strip()
                 continue
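One caveat about the PollinationsAI `model_aliases` literal: it repeats keys such as `"gpt-4o-mini"` and `"deepseek-chat"`, and in a Python dict literal the last occurrence of a duplicate key silently wins. A minimal demonstration:

```python
# Duplicate keys in a dict literal do not raise; later entries
# overwrite earlier ones, so only the final mapping survives.
model_aliases = {
    "gpt-4o-mini": "openai",
    "gpt-4o-mini": "rtist",
    "gpt-4o-mini": "claude",  # this is the mapping that survives
}

print(model_aliases)  # {'gpt-4o-mini': 'claude'}
```

So of the several `"gpt-4o-mini"` entries in the diff, only the last one takes effect at runtime.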
@@ -21,6 +21,7 @@ from .Provider import (
     DeepInfraChat,
     HuggingSpace,
     GPROChat,
+    ImageLabs,
     Jmuz,
     Liaobots,
     Mhystical,
@@ -261,12 +262,6 @@ mistral_nemo = Model(
     best_provider = IterListProvider([PollinationsAI, HuggingChat, HuggingFace])
 )

-mistral_large = Model(
-    name = "mistral-large",
-    base_provider = "Mistral",
-    best_provider = PollinationsAI
-)
-
 ### NousResearch ###
 hermes_2_dpo = Model(
     name = "hermes-2-dpo",
@@ -318,18 +313,18 @@ gemini_exp = Model(
 )

 # gemini-1.5
-gemini_1_5_pro = Model(
-    name = 'gemini-1.5-pro',
-    base_provider = 'Google DeepMind',
-    best_provider = IterListProvider([Blackbox, Jmuz, GPROChat, AIChatFree, Gemini, GeminiPro, Liaobots])
-)
-
 gemini_1_5_flash = Model(
     name = 'gemini-1.5-flash',
     base_provider = 'Google DeepMind',
     best_provider = IterListProvider([Blackbox, Jmuz, Gemini, GeminiPro, Liaobots])
 )

+gemini_1_5_pro = Model(
+    name = 'gemini-1.5-pro',
+    base_provider = 'Google DeepMind',
+    best_provider = IterListProvider([Blackbox, Jmuz, GPROChat, AIChatFree, Gemini, GeminiPro, Liaobots])
+)
+
 # gemini-2.0
 gemini_2_0_flash = Model(
     name = 'gemini-2.0-flash',
@@ -366,16 +361,10 @@ claude_3_opus = Model(


 # claude 3.5
-claude_3_5_haiku = Model(
-    name = 'claude-3.5-haiku',
-    base_provider = 'Anthropic',
-    best_provider = PollinationsAI
-)
-
 claude_3_5_sonnet = Model(
     name = 'claude-3.5-sonnet',
     base_provider = 'Anthropic',
-    best_provider = IterListProvider([Blackbox, PollinationsAI, Jmuz, Liaobots])
+    best_provider = IterListProvider([Blackbox, Jmuz, Liaobots])
 )

 ### Reka AI ###
@@ -402,7 +391,7 @@ blackboxai_pro = Model(
 command_r = Model(
     name = 'command-r',
     base_provider = 'CohereForAI',
-    best_provider = IterListProvider([HuggingSpace, PollinationsAI])
+    best_provider = HuggingSpace
 )

 command_r_plus = Model(
@@ -429,14 +418,14 @@ qwen_1_5_7b = Model(
 qwen_2_72b = Model(
     name = 'qwen-2-72b',
     base_provider = 'Qwen',
-    best_provider = IterListProvider([DeepInfraChat, PollinationsAI, HuggingSpace])
+    best_provider = HuggingSpace
 )

 # qwen 2.5
 qwen_2_5_72b = Model(
     name = 'qwen-2.5-72b',
     base_provider = 'Qwen',
-    best_provider = Jmuz
+    best_provider = IterListProvider([DeepInfraChat, PollinationsAI, Jmuz])
 )

 qwen_2_5_coder_32b = Model(
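Several `best_provider` assignments here move between a single provider and an `IterListProvider` list. A hedged sketch of the idea behind such a list: try each provider in order and return the first successful answer (the class and error type below are illustrative stand-ins, not g4f's actual implementations):

```python
class ProviderError(Exception):
    """Raised when a provider cannot serve the request."""

class IterListProvider:
    def __init__(self, providers):
        self.providers = providers

    def create(self, prompt):
        # Try providers in order; fall through to the next on failure.
        for provider in self.providers:
            try:
                return provider(prompt)
            except ProviderError:
                continue
        raise ProviderError("all providers failed")

def flaky(prompt):
    raise ProviderError("unavailable")

def working(prompt):
    return f"answer to {prompt!r}"

chain = IterListProvider([flaky, working])
print(chain.create("hi"))  # served by the second provider after the first fails
```

Listing more providers for a model therefore adds redundancy rather than changing the model itself.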
@@ -475,7 +464,7 @@ deepseek_chat = Model(
 deepseek_r1 = Model(
     name = 'deepseek-r1',
     base_provider = 'DeepSeek',
-    best_provider = IterListProvider([Blackbox, Jmuz, HuggingChat, HuggingFace])
+    best_provider = IterListProvider([Blackbox, Jmuz, PollinationsAI, HuggingChat, HuggingFace])
 )

 ### x.ai ###
@@ -554,34 +543,16 @@ evil = Model(
     best_provider = PollinationsAI
 )

-### Other ###
-midijourney = Model(
-    name = 'midijourney',
-    base_provider = 'Other',
-    best_provider = PollinationsAI
-)
-
-unity = Model(
-    name = 'unity',
-    base_provider = 'Other',
-    best_provider = PollinationsAI
-)
-
-rtist = Model(
-    name = 'rtist',
-    base_provider = 'Other',
-    best_provider = PollinationsAI
-)
-
 #############
 ### Image ###
 #############

 ### Stability AI ###
-sd_turbo = ImageModel(
-    name = 'sd-turbo',
+sdxl_turbo = ImageModel(
+    name = 'sdxl-turbo',
     base_provider = 'Stability AI',
-    best_provider = PollinationsAI
+    best_provider = IterListProvider([PollinationsAI, ImageLabs])
 )

 sd_3_5 = ImageModel(
@@ -590,25 +561,32 @@ sd_3_5 = ImageModel(
     best_provider = HuggingSpace
 )

-### Flux AI ###
+### Black Forest Labs ###
 flux = ImageModel(
     name = 'flux',
-    base_provider = 'Flux AI',
+    base_provider = 'Black Forest Labs',
     best_provider = IterListProvider([Blackbox, PollinationsAI, HuggingSpace])
 )

+flux_pro = ImageModel(
+    name = 'flux-pro',
+    base_provider = 'Black Forest Labs',
+    best_provider = PollinationsAI
+)
+
 flux_dev = ImageModel(
     name = 'flux-dev',
-    base_provider = 'Flux AI',
+    base_provider = 'Black Forest Labs',
     best_provider = IterListProvider([HuggingSpace, HuggingChat, HuggingFace])
 )

 flux_schnell = ImageModel(
     name = 'flux-schnell',
-    base_provider = 'Flux AI',
+    base_provider = 'Black Forest Labs',
     best_provider = IterListProvider([HuggingSpace, HuggingChat, HuggingFace])
 )


 ### OpenAI ###
 dall_e_3 = ImageModel(
     name = 'dall-e-3',
@@ -682,7 +660,6 @@ class ModelUtils:
         mixtral_7b.name: mixtral_7b,
         mixtral_8x7b.name: mixtral_8x7b,
         mistral_nemo.name: mistral_nemo,
-        mistral_large.name: mistral_large,

         ### NousResearch ###
         hermes_2_dpo.name: hermes_2_dpo,
@@ -719,7 +696,6 @@ class ModelUtils:
         claude_3_haiku.name: claude_3_haiku,

         # claude 3.5
-        claude_3_5_haiku.name: claude_3_5_haiku,
         claude_3_5_sonnet.name: claude_3_5_sonnet,

         ### Reka AI ###
@@ -775,21 +751,17 @@ class ModelUtils:
         mini_max.name: mini_max, ## MiniMax
         evil.name: evil, ### Uncensored AI ###

-        ### Other ###
-        midijourney.name: midijourney,
-        unity.name: unity,
-        rtist.name: rtist,
-
         #############
         ### Image ###
         #############

         ### Stability AI ###
-        sd_turbo.name: sd_turbo,
+        sdxl_turbo.name: sdxl_turbo,
         sd_3_5.name: sd_3_5,

         ### Flux AI ###
         flux.name: flux,
+        flux_pro.name: flux_pro,
         flux_dev.name: flux_dev,
         flux_schnell.name: flux_schnell,
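The `ModelUtils` entries follow a registry pattern: each line maps `model.name` to its `Model` object so a plain string can be looked up at call time. A sketch under the assumption that `Model` is a simple record type (the dataclass shape is modeled on the diff, not copied from g4f):

```python
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    base_provider: str

# Two entries mirroring the diff's `<name>.name: <model>` convention,
# including the newly registered flux_pro and renamed sdxl_turbo.
flux_pro = Model(name="flux-pro", base_provider="Black Forest Labs")
sdxl_turbo = Model(name="sdxl-turbo", base_provider="Stability AI")

convert = {
    flux_pro.name: flux_pro,
    sdxl_turbo.name: sdxl_turbo,
}

print(convert["flux-pro"].base_provider)  # Black Forest Labs
```

Keying the dict on `model.name` keeps the registry self-consistent: renaming a model (as with `sd-turbo` to `sdxl-turbo`) automatically updates its lookup key.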