Metadata-Version: 2.4
Name: ortpy
Version: 1.23b1
Summary: ONNX runtime python binding
Author: Feng Wang
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: C++
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: Microsoft :: Windows
Classifier: Operating System :: POSIX :: Linux
Classifier: Operating System :: MacOS :: MacOS X
Project-URL: Homepage, https://github.com/chemwolf6922/onnxruntime.py
Requires-Python: >=3.10
Requires-Dist: numpy
Provides-Extra: with-lib
Requires-Dist: ortpy_lib~=1.23.0; extra == "with-lib"
Description-Content-Type: text/markdown

# ortpy

A thin Python binding for the [ONNX Runtime](https://onnxruntime.ai/) library, built on its C API. Unlike the official `onnxruntime` package, which bundles the full runtime implementation in a heavy Python wheel, `ortpy` gives you the flexibility to manage the runtime libraries yourself.

## Installation

```bash
pip install "ortpy[with-lib]"
```

The `with-lib` extra bundles the ONNX Runtime shared libraries. If you manage the runtime libraries yourself, install without the extra:

```bash
pip install ortpy
```

## Quick Start

```python
import ortpy as ort
import numpy as np

# Create a session
session_options = ort.SessionOptions()
session = ort.Session("model.onnx", session_options)

# Inspect inputs and outputs
for name, info in session.get_input_info().items():
    print(f"Input '{name}': shape={info.shape}, dtype={info.dtype}")

for name, info in session.get_output_info().items():
    print(f"Output '{name}': shape={info.shape}, dtype={info.dtype}")

# Run inference; the dict keys must match the model's input names
inputs = {"input": np.random.randn(1, 3, 224, 224).astype(np.float32)}
outputs = session.run(inputs)
```
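Model inputs generally have to match the declared shape and dtype exactly, so a preprocessing step is often needed before calling `session.run`. The sketch below (plain NumPy, independent of ortpy; the shapes match the `1x3x224x224` example above) converts an HWC `uint8` image into the NCHW `float32` batch layout:

```python
import numpy as np

# Illustrative input: a 224x224 RGB image in HWC uint8 layout,
# as typically produced by an image-loading library.
image = np.random.randint(0, 256, size=(224, 224, 3), dtype=np.uint8)

x = image.astype(np.float32) / 255.0  # scale pixel values to [0, 1]
x = np.transpose(x, (2, 0, 1))        # HWC -> CHW
x = np.expand_dims(x, axis=0)         # add batch dimension -> NCHW

assert x.shape == (1, 3, 224, 224)
assert x.dtype == np.float32
```

The resulting array can be passed directly as a value in the `inputs` dict, provided the model actually expects this layout and normalization; check `session.get_input_info()` for the real shape and dtype.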
