Communicate with an LLM provider using a single interface

950 stars · 70 forks · 950 watchers · Python · Apache License 2.0
Topics: inference, llm, text-completion
35 open issues need help · Last updated: Sep 2, 2025

Open Issues Need Help

AI Summary: This issue proposes adding a GitHub bot to the `any-llm` repository that automatically marks inactive pull requests and issues as "stale." A similar bot is already deployed in the `any-agent` repository, so a known configuration and implementation path exists.

Complexity: 2/5
Labels: good first issue, help wanted, performance
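
Such a stale bot is usually implemented with the `actions/stale` GitHub Action rather than custom code. Below is a minimal sketch of what the workflow could look like; the schedule, thresholds, messages, and exempt labels are illustrative assumptions, not values taken from the `any-agent` configuration:

```yaml
# .github/workflows/stale.yml -- sketch only; all values are placeholders
name: Mark stale issues and pull requests

on:
  schedule:
    - cron: "0 0 * * *"  # run once per day

permissions:
  issues: write
  pull-requests: write

jobs:
  stale:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/stale@v9
        with:
          days-before-stale: 30   # mark after 30 days of inactivity
          days-before-close: 7    # close 7 days after being marked
          stale-issue-message: "This issue has been inactive for 30 days and was marked stale."
          stale-pr-message: "This pull request has been inactive for 30 days and was marked stale."
          stale-issue-label: stale
          stale-pr-label: stale
          exempt-issue-labels: "good first issue,help wanted"
```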

AI Summary: This issue requests `list_models` support in the xAI integration: a new method that retrieves the list of models available on the xAI platform.

Complexity: 2/5
Labels: good first issue, help wanted
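
xAI's HTTP API is OpenAI-compatible, so the models endpoint should mirror OpenAI's `GET /v1/models`. A minimal sketch of what such a method might look like, assuming the `https://api.x.ai/v1` base URL and an `XAI_API_KEY` environment variable (the function name and shape are illustrative, not any-llm's actual API):

```python
# Sketch: list the models available on the xAI platform.
# Assumes an OpenAI-compatible GET /v1/models endpoint.
import os

import httpx

XAI_API_BASE = "https://api.x.ai/v1"  # assumed base URL

def list_models(api_key: str | None = None) -> list[str]:
    """Return the IDs of models available from xAI."""
    key = api_key or os.environ["XAI_API_KEY"]
    response = httpx.get(
        f"{XAI_API_BASE}/models",
        headers={"Authorization": f"Bearer {key}"},
    )
    response.raise_for_status()
    # OpenAI-compatible responses wrap the models in a "data" array.
    return [model["id"] for model in response.json()["data"]]

if __name__ == "__main__":
    print(list_models())
```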

AI Summary: This issue requests the same capability for the Anthropic integration: a function or method that lets users programmatically list the models Anthropic provides. The issue body is empty; the title fully conveys the request.

Complexity: 2/5
Labels: good first issue, help wanted
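
Anthropic's official Python SDK exposes a Models API, so a provider-side implementation could delegate to it. A minimal sketch, assuming a recent `anthropic` SDK release (one that includes `client.models.list()`) and an `ANTHROPIC_API_KEY` environment variable; the function name is illustrative:

```python
# Sketch: list the models available from Anthropic via the official SDK.
import os

from anthropic import Anthropic

def list_models(api_key: str | None = None) -> list[str]:
    """Return the IDs of models available from Anthropic."""
    client = Anthropic(api_key=api_key or os.environ["ANTHROPIC_API_KEY"])
    # models.list() returns a paginated collection; iterating it yields
    # model objects, each with an `id` field.
    return [model.id for model in client.models.list()]

if __name__ == "__main__":
    print(list_models())
```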
