Add MiniMax (M2.7, M2.5, M2.5-highspeed) as a first-class LLM provider
using the OpenAI-compatible API at api.minimax.io/v1.
- New MiniMaxEngine extending OpenAiEngine with think-tag stripping
- Full setup wizard integration (provider selection, model list, API key URL)
- Dynamic model fetching via /v1/models endpoint with 7-day cache
- Error handling with billing URL and model suggestions
- README updated with MiniMax provider config example
- 32 unit tests + 3 integration tests
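The think-tag stripping mentioned above could look roughly like this. This is a minimal sketch, not the actual implementation: the function name, the `postProcess` hook, and the exact regex are assumptions; only the `MiniMaxEngine` / `OpenAiEngine` class names come from the commit description.

```typescript
// Hypothetical sketch: remove MiniMax <think>…</think> reasoning blocks
// from a completion before it is returned to the caller.
function stripThinkTags(text: string): string {
  // Drop every <think>…</think> span (non-greedy, across newlines),
  // then trim leftover surrounding whitespace.
  return text.replace(/<think>[\s\S]*?<\/think>/g, "").trim();
}

// Illustrative engine subclass applying the filter to each response.
class MiniMaxEngine /* extends OpenAiEngine */ {
  postProcess(completion: string): string {
    return stripThinkTags(completion);
  }
}
```

Stripping the tags in the engine keeps the reasoning tokens out of downstream consumers (commit-message output, caches) without any change to the shared OpenAI-compatible request path.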
The unsupported-provider error message gains `'minimax'` in its list of accepted values:

```diff
-`${value} is not supported yet, use 'ollama', 'mlx', 'anthropic', 'azure', 'gemini', 'flowise', 'mistral', 'deepseek', 'aimlapi' or 'openai' (default)`
+`${value} is not supported yet, use 'ollama', 'mlx', 'anthropic', 'azure', 'gemini', 'flowise', 'mistral', 'deepseek', 'aimlapi', 'minimax' or 'openai' (default)`
```
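The dynamic model fetching with a 7-day cache could be sketched as follows. The cache shape, function name, and response handling are assumptions; only the `api.minimax.io/v1` base URL, the `/v1/models` endpoint, and the 7-day lifetime come from the commit description.

```typescript
// Hypothetical sketch of the /v1/models fetch with a 7-day cache.
const SEVEN_DAYS_MS = 7 * 24 * 60 * 60 * 1000;

interface ModelCache {
  fetchedAt: number; // epoch millis of the last successful fetch
  models: string[];  // model ids returned by the endpoint
}

async function listMiniMaxModels(
  apiKey: string,
  cache: ModelCache | undefined,
  now: number = Date.now(),
): Promise<string[]> {
  // Serve from the cache while it is still fresh.
  if (cache && now - cache.fetchedAt < SEVEN_DAYS_MS) {
    return cache.models;
  }
  // Otherwise hit the OpenAI-compatible models endpoint.
  const res = await fetch("https://api.minimax.io/v1/models", {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) {
    throw new Error(`model list failed: HTTP ${res.status}`);
  }
  const body = (await res.json()) as { data: { id: string }[] };
  return body.data.map((m) => m.id);
}
```

A weekly refresh is a reasonable trade-off here: provider model lists change rarely, and the cache keeps the setup wizard responsive and usable offline once a key has been validated.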
'You are an expert at writing concise, meaningful git commit messages. Generate a conventional commit message for the provided code diff. Output only the commit message, nothing else.'