Metadata-Version: 2.4
Name: arbis-llmwrap
Version: 0.3.1
Summary: Decorator to wrap LLM calls for production use with a simple, answer-only interface.
License: MIT
Keywords: llm,decorator,prompt,logging,cython
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Requires-Python: >=3.9
Description-Content-Type: text/markdown
License-File: LICENSE
Provides-Extra: dev
Requires-Dist: cython>=3.0; extra == "dev"
Requires-Dist: wheel; extra == "dev"
Dynamic: license-file

# llmwrap

**llmwrap** is a small Python library that provides a decorator for wrapping your LLM-calling function for production use. You supply basic identifiers (company, project, agent) and a secret key when applying the decorator; at call time you pass a prompt and get back a plain answer string from your function.

## Install

```bash
pip install arbis-llmwrap
```

## Usage

Decorate your LLM function once, then call it with prompts as usual:

```python
from llmwrap import wrap_llm_call

@wrap_llm_call(
    company_name="My Company",
    project_name="My Project",
    agent_name="My Agent",
    secret_key="vt_live_xxxx",
    max_tries=3,
)
def user_llm(prompt: str) -> str:
    # Call your LLM (e.g. OpenAI, Anthropic, local model) and
    # return the raw string response.
    response = some_client.chat(prompt)
    return response

answer = user_llm("What is 2+2?")
```
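The library's internals aren't shown here, but the general pattern is a decorator factory: the identifier arguments configure a wrapper, and `max_tries` presumably bounds retry attempts on failure. As a rough, self-contained illustration of that pattern (everything below is a hypothetical sketch, not the actual `arbis-llmwrap` implementation), a minimal retry-wrapping decorator could look like:

```python
import functools
import time


def retry_wrap(company_name: str, project_name: str, agent_name: str,
               secret_key: str, max_tries: int = 3):
    """Hypothetical sketch of a retry-wrapping decorator factory.

    Not the real arbis-llmwrap code; it only illustrates the shape of
    the API shown in the usage example above.
    """
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(prompt: str) -> str:
            last_err = None
            for attempt in range(1, max_tries + 1):
                try:
                    return fn(prompt)
                except Exception as err:  # a real library would narrow this
                    last_err = err
                    time.sleep(0.1 * attempt)  # simple linear backoff
            raise RuntimeError(
                f"LLM call failed after {max_tries} tries"
            ) from last_err
        return wrapper
    return decorator


@retry_wrap(company_name="My Company", project_name="My Project",
            agent_name="My Agent", secret_key="vt_live_xxxx", max_tries=3)
def echo_llm(prompt: str) -> str:
    # Stand-in for a real LLM call: just echoes the prompt back.
    return f"answer to: {prompt}"


print(echo_llm("What is 2+2?"))  # prints: answer to: What is 2+2?
```

The factory shape (`decorator(args)(fn)`) is what lets one decorator definition carry both the per-project configuration and the per-call prompt; `functools.wraps` preserves the wrapped function's name and docstring.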

## License

MIT. See [LICENSE](LICENSE).
