Communicate with an LLM provider using a single interface

#inference #llm #text-completion
39 Open Issues Need Help · Last updated: Oct 17, 2025


AI Summary: This issue requests adding reasoning support to the Sambanova completion API. With no further description, the exact scope is undefined, but the goal is broadly to surface more logical, structured, or explainable output from the API.

Complexity: 5/5
Labels: good first issue, help wanted, completion, reasoning


AI Summary: This GitHub issue proposes adding reasoning support to the Databricks completion API. No further description is provided, but it likely entails surfacing the provider's reasoning output through the unified interface, going beyond a basic text completion.

Complexity: 4/5
Labels: good first issue, help wanted, completion, reasoning
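Neither reasoning issue specifies a response shape, but in OpenAI-compatible clients reasoning output is commonly carried as an optional field alongside the message content. A minimal illustration of that pattern (the class and field names here are assumptions, not the any-llm API):

```python
# Illustrative only: one way a unified completion response could carry
# optional provider reasoning. Names are assumptions, not any-llm's API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CompletionMessage:
    content: str
    reasoning: Optional[str] = None  # populated only when the provider returns it

def render(msg: CompletionMessage) -> str:
    # Callers that ignore reasoning keep working; others can inspect it.
    if msg.reasoning:
        return f"[reasoning] {msg.reasoning}\n{msg.content}"
    return msg.content

msg = CompletionMessage(content="42", reasoning="6 * 7 = 42")
print(render(msg))
```

Making the field optional keeps providers without reasoning support unchanged, which matters for a library whose point is a single interface across providers.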

Add "stale" GitHub bot (opened about 2 months ago)

AI Summary: This issue proposes implementing a GitHub bot in the `any-llm` repository to automatically mark inactive pull requests and issues as "stale." The request is based on a similar bot already successfully deployed in the `any-agent` repository, suggesting a known configuration and implementation path.

Complexity: 2/5
Labels: good first issue, help wanted, performance
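Since the issue points at a bot already deployed in `any-agent`, the usual route on GitHub is the `actions/stale` workflow. A minimal sketch (thresholds and label name are assumptions, not the any-agent configuration):

```yaml
# .github/workflows/stale.yml — mark inactive issues/PRs as stale daily
name: stale
on:
  schedule:
    - cron: "0 0 * * *"
jobs:
  stale:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/stale@v9
        with:
          days-before-stale: 60   # assumed threshold
          days-before-close: 14   # assumed threshold
          stale-issue-label: stale
          stale-pr-label: stale
```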


AI Summary: This issue is a feature request to add a `list_models` function to the xAI integration, implementing a new method that retrieves the models available on the xAI platform.

Complexity: 2/5
Labels: good first issue, help wanted
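xAI exposes an OpenAI-compatible API, so a `list_models` helper would plausibly parse the OpenAI-style `GET /v1/models` listing. A sketch of the parsing step (the `Model` class and function name are illustrative assumptions, not the any-llm API surface):

```python
# Hypothetical sketch: parse an OpenAI-compatible GET /v1/models payload.
# Names here are assumptions for illustration, not any-llm's actual API.
from dataclasses import dataclass

@dataclass
class Model:
    id: str
    owned_by: str

def parse_models_response(payload: dict) -> list[Model]:
    # OpenAI-style listings have the shape {"object": "list", "data": [...]}
    return [
        Model(id=entry["id"], owned_by=entry.get("owned_by", "unknown"))
        for entry in payload.get("data", [])
    ]

# A payload shaped like the OpenAI-compatible models endpoint:
sample = {"object": "list", "data": [{"id": "grok-2", "owned_by": "xai"}]}
print([m.id for m in parse_models_response(sample)])  # ['grok-2']
```

The same parsing would serve any provider that follows the OpenAI models-listing shape, which fits the library's single-interface goal.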


AI Summary: This issue is a feature request for the Anthropic integration to add a method that programmatically lists the models Anthropic provides. The issue body is empty; the title fully conveys the request.

Complexity: 2/5
Labels: good first issue, help wanted
