Open Issues Need Help
AI Summary: This issue proposes implementing a GitHub bot in the `any-llm` repository to automatically mark inactive pull requests and issues as "stale." The request is based on a similar bot already successfully deployed in the `any-agent` repository, suggesting a known configuration and implementation path.
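The issue most likely means reusing the GitHub Actions stale-bot configuration already running in `any-agent` rather than writing new code; purely as an illustration of the behavior being requested, the sketch below labels inactive issues and PRs through the GitHub REST API. The repository slug, token variable, and 30-day inactivity window are assumptions, not details from the issue.

```python
# Illustrative sketch only: label issues/PRs with no activity for 30 days as
# "stale" via the GitHub REST API. The actual request likely reuses the
# actions/stale workflow from any-agent; repo slug, token env var, and the
# 30-day cutoff are assumptions.
import os
from datetime import datetime, timedelta, timezone

import requests

REPO = "mozilla-ai/any-llm"  # assumed repository slug
TOKEN = os.environ["GITHUB_TOKEN"]
HEADERS = {
    "Authorization": f"Bearer {TOKEN}",
    "Accept": "application/vnd.github+json",
}

cutoff = (datetime.now(timezone.utc) - timedelta(days=30)).strftime("%Y-%m-%d")

# Find open issues and PRs with no updates since the cutoff date.
search = requests.get(
    "https://api.github.com/search/issues",
    headers=HEADERS,
    params={"q": f"repo:{REPO} is:open updated:<{cutoff}", "per_page": 50},
)
search.raise_for_status()

for item in search.json()["items"]:
    number = item["number"]
    # Apply the "stale" label so maintainers can triage or close later.
    resp = requests.post(
        f"https://api.github.com/repos/{REPO}/issues/{number}/labels",
        headers=HEADERS,
        json={"labels": ["stale"]},
    )
    resp.raise_for_status()
    print(f"Marked #{number} as stale")
```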
AI Summary: The GitHub issue is a feature request to add support for calling a `list_models` function within the xAI integration. This likely involves implementing a new method to retrieve a list of available models from the xAI platform.
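As a hedged sketch of what the requested behavior could look like, the snippet below queries xAI's model list, assuming xAI exposes an OpenAI-compatible `/v1/models` endpoint; the `list_models` function name and environment variable here are placeholders for illustration, not part of any-llm's existing API.

```python
# Sketch of one way a list_models helper for the xAI provider could work,
# assuming an OpenAI-compatible /v1/models endpoint on api.x.ai. The function
# name and env var are placeholders, not any-llm's actual interface.
import os

from openai import OpenAI


def list_models() -> list[str]:
    """Return the model IDs advertised by the xAI platform."""
    client = OpenAI(
        api_key=os.environ["XAI_API_KEY"],
        base_url="https://api.x.ai/v1",
    )
    return [model.id for model in client.models.list()]


if __name__ == "__main__":
    for model_id in list_models():
        print(model_id)
```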
AI Summary: This GitHub issue is a feature request to add a function or method, in the Anthropic integration (or a related library), that lets users programmatically list the models available from Anthropic. The issue body is empty, so the title fully conveys the request.
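A comparable sketch for Anthropic follows, assuming the `anthropic` Python SDK's `models.list()` endpoint; the wrapper function is illustrative only and should be checked against the current SDK before use.

```python
# Sketch of listing Anthropic's available models, assuming the anthropic SDK
# exposes client.models.list(); the wrapper function is illustrative only.
import os

import anthropic


def list_models() -> list[str]:
    """Return the model IDs reported by the Anthropic API."""
    client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
    return [model.id for model in client.models.list()]


if __name__ == "__main__":
    for model_id in list_models():
        print(model_id)
```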