Metadata-Version: 2.2
Name: omnigbdt
Version: 0.4.1
Summary: Gradient boosted decision trees for multiple outputs
Author-Email: =?utf-8?q?Francis_Lacl=C3=A9?= <4205424+flacle@users.noreply.github.com>
License: Apache License
         Version 2.0, January 2004
         http://www.apache.org/licenses/
         
         TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
         
         1. Definitions.
         
            "License" shall mean the terms and conditions for use, reproduction,
            and distribution as defined by Sections 1 through 9 of this document.
         
            "Licensor" shall mean the copyright owner or entity authorized by
            the copyright owner that is granting the License.
         
            "Legal Entity" shall mean the union of the acting entity and all
            other entities that control, are controlled by, or are under common
            control with that entity. For the purposes of this definition,
            "control" means (i) the power, direct or indirect, to cause the
            direction or management of such entity, whether by contract or
            otherwise, or (ii) ownership of fifty percent (50%) or more of the
            outstanding shares, or (iii) beneficial ownership of such entity.
         
            "You" (or "Your") shall mean an individual or Legal Entity
            exercising permissions granted by this License.
         
            "Source" form shall mean the preferred form for making modifications,
            including but not limited to software source code, documentation
            source, and configuration files.
         
            "Object" form shall mean any form resulting from mechanical
            transformation or translation of a Source form, including but
            not limited to compiled object code, generated documentation,
            and conversions to other media types.
         
            "Work" shall mean the work of authorship, whether in Source or
            Object form, made available under the License, as indicated by a
            copyright notice that is included in or attached to the work
            (an example is provided in the Appendix below).
         
            "Derivative Works" shall mean any work, whether in Source or Object
            form, that is based on (or derived from) the Work and for which the
            editorial revisions, annotations, elaborations, or other modifications
            represent, as a whole, an original work of authorship. For the purposes
            of this License, Derivative Works shall not include works that remain
            separable from, or merely link (or bind by name) to the interfaces of,
            the Work and Derivative Works thereof.
         
            "Contribution" shall mean any work of authorship, including
            the original version of the Work and any modifications or additions
            to that Work or Derivative Works thereof, that is intentionally
            submitted to Licensor for inclusion in the Work by the copyright owner
            or by an individual or Legal Entity authorized to submit on behalf of
            the copyright owner. For the purposes of this definition, "submitted"
            means any form of electronic, verbal, or written communication sent
            to the Licensor or its representatives, including but not limited to
            communication on electronic mailing lists, source code control systems,
            and issue tracking systems that are managed by, or on behalf of, the
            Licensor for the purpose of discussing and improving the Work, but
            excluding communication that is conspicuously marked or otherwise
            designated in writing by the copyright owner as "Not a Contribution."
         
            "Contributor" shall mean Licensor and any individual or Legal Entity
            on behalf of whom a Contribution has been received by Licensor and
            subsequently incorporated within the Work.
         
         2. Grant of Copyright License. Subject to the terms and conditions of
            this License, each Contributor hereby grants to You a perpetual,
            worldwide, non-exclusive, no-charge, royalty-free, irrevocable
            copyright license to reproduce, prepare Derivative Works of,
            publicly display, publicly perform, sublicense, and distribute the
            Work and such Derivative Works in Source or Object form.
         
         3. Grant of Patent License. Subject to the terms and conditions of
            this License, each Contributor hereby grants to You a perpetual,
            worldwide, non-exclusive, no-charge, royalty-free, irrevocable
            (except as stated in this section) patent license to make, have made,
            use, offer to sell, sell, import, and otherwise transfer the Work,
            where such license applies only to those patent claims licensable
            by such Contributor that are necessarily infringed by their
            Contribution(s) alone or by combination of their Contribution(s)
            with the Work to which such Contribution(s) was submitted. If You
            institute patent litigation against any entity (including a
            cross-claim or counterclaim in a lawsuit) alleging that the Work
            or a Contribution incorporated within the Work constitutes direct
            or contributory patent infringement, then any patent licenses
            granted to You under this License for that Work shall terminate
            as of the date such litigation is filed.
         
         4. Redistribution. You may reproduce and distribute copies of the
            Work or Derivative Works thereof in any medium, with or without
            modifications, and in Source or Object form, provided that You
            meet the following conditions:
         
            (a) You must give any other recipients of the Work or
                Derivative Works a copy of this License; and
         
            (b) You must cause any modified files to carry prominent notices
                stating that You changed the files; and
         
            (c) You must retain, in the Source form of any Derivative Works
                that You distribute, all copyright, patent, trademark, and
                attribution notices from the Source form of the Work,
                excluding those notices that do not pertain to any part of
                the Derivative Works; and
         
            (d) If the Work includes a "NOTICE" text file as part of its
                distribution, then any Derivative Works that You distribute must
                include a readable copy of the attribution notices contained
                within such NOTICE file, excluding those notices that do not
                pertain to any part of the Derivative Works, in at least one
                of the following places: within a NOTICE text file distributed
                as part of the Derivative Works; within the Source form or
                documentation, if provided along with the Derivative Works; or,
                within a display generated by the Derivative Works, if and
                wherever such third-party notices normally appear. The contents
                of the NOTICE file are for informational purposes only and
                do not modify the License. You may add Your own attribution
                notices within Derivative Works that You distribute, alongside
                or as an addendum to the NOTICE text from the Work, provided
                that such additional attribution notices cannot be construed
                as modifying the License.
         
            You may add Your own copyright statement to Your modifications and
            may provide additional or different license terms and conditions
            for use, reproduction, or distribution of Your modifications, or
            for any such Derivative Works as a whole, provided Your use,
            reproduction, and distribution of the Work otherwise complies with
            the conditions stated in this License.
         
         5. Submission of Contributions. Unless You explicitly state otherwise,
            any Contribution intentionally submitted for inclusion in the Work
            by You to the Licensor shall be under the terms and conditions of
            this License, without any additional terms or conditions.
            Notwithstanding the above, nothing herein shall supersede or modify
            the terms of any separate license agreement you may have executed
            with Licensor regarding such Contributions.
         
         6. Trademarks. This License does not grant permission to use the trade
            names, trademarks, service marks, or product names of the Licensor,
            except as required for reasonable and customary use in describing the
            origin of the Work and reproducing the content of the NOTICE file.
         
         7. Disclaimer of Warranty. Unless required by applicable law or
            agreed to in writing, Licensor provides the Work (and each
            Contributor provides its Contributions) on an "AS IS" BASIS,
            WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
            implied, including, without limitation, any warranties or conditions
            of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
            PARTICULAR PURPOSE. You are solely responsible for determining the
            appropriateness of using or redistributing the Work and assume any
            risks associated with Your exercise of permissions under this License.
         
         8. Limitation of Liability. In no event and under no legal theory,
            whether in tort (including negligence), contract, or otherwise,
            unless required by applicable law (such as deliberate and grossly
            negligent acts) or agreed to in writing, shall any Contributor be
            liable to You for damages, including any direct, indirect, special,
            incidental, or consequential damages of any character arising as a
            result of this License or out of the use or inability to use the
            Work (including but not limited to damages for loss of goodwill,
            work stoppage, computer failure or malfunction, or any and all
            other commercial damages or losses), even if such Contributor
            has been advised of the possibility of such damages.
         
         9. Accepting Warranty or Additional Liability. While redistributing
            the Work or Derivative Works thereof, You may choose to offer,
            and charge a fee for, acceptance of support, warranty, indemnity,
            or other liability obligations and/or rights consistent with this
            License. However, in accepting such obligations, You may act only
            on Your own behalf and on Your sole responsibility, not on behalf
            of any other Contributor, and only if You agree to indemnify,
            defend, and hold each Contributor harmless for any liability
            incurred by, or claims asserted against, such Contributor by reason
            of your accepting any such warranty or additional liability.
         
         END OF TERMS AND CONDITIONS
         
         Copyright 2026 University of Aruba
         
         Licensed under the Apache License, Version 2.0 (the "License");
         you may not use this file except in compliance with the License.
         You may obtain a copy of the License at
         
             http://www.apache.org/licenses/LICENSE-2.0
         
         Unless required by applicable law or agreed to in writing, software
         distributed under the License is distributed on an "AS IS" BASIS,
         WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
         See the License for the specific language governing permissions and
         limitations under the License.
         
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: C++
Classifier: Operating System :: Microsoft :: Windows
Classifier: Operating System :: MacOS
Classifier: Operating System :: POSIX :: Linux
Project-URL: Documentation, https://omnigbdt.readthedocs.io
Project-URL: Source, https://github.com/University-of-Aruba/OmniGBDT
Project-URL: Issues, https://github.com/University-of-Aruba/OmniGBDT/issues
Project-URL: Upstream, https://github.com/zzd1992/GBDTMO
Requires-Python: <3.14,>=3.10
Requires-Dist: numpy
Requires-Dist: numba>=0.61
Provides-Extra: plot
Requires-Dist: graphviz; extra == "plot"
Provides-Extra: sklearn
Requires-Dist: scikit-learn>=1.4; extra == "sklearn"
Provides-Extra: test
Requires-Dist: pytest>=8; extra == "test"
Description-Content-Type: text/markdown

# OmniGBDT

OmniGBDT packages the original [GBDT-MO](https://github.com/zzd1992/GBDTMO) algorithm as a regular Python library. The native C++ training core remains in place, while the Python layer adds wheel-based installation, public custom-objective hooks, optional sklearn-compatible wrappers, and accuracy-oriented regression defaults.

The main public classes are `MultiOutputGBDT` and `SingleOutputGBDT`.

## Why OmniGBDT

- Joint multi-output gradient boosting from the original GBDT-MO research codebase
- Standard `pip` and `uv` installation with the native library bundled inside the package
- Public Python callbacks for custom gradients, Hessians, metrics, and early stopping
- Fixed-thread deterministic CPU training through the public `deterministic` parameter
- Optional sklearn-compatible wrappers for tools such as permutation importance
- Accuracy-oriented regression defaults in the current fork: `num_rounds=200`, `lr=0.05`, `max_bins=128`, `early_stop=15`, and automatic mean initialization when `base_score` is unset
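The automatic mean initialization mentioned above can be illustrated in plain NumPy. This is a conceptual sketch, not the library's internal code: when `base_score` is unset, boosting is described as starting from the per-output training mean rather than zero, which centers the residuals that the first trees must fit.

```python
import numpy as np

# Conceptual sketch of automatic mean initialization (not the
# library's internal code): with base_score unset, boosting starts
# from the per-output training mean instead of zero.
rng = np.random.default_rng(0)
Y_train = rng.normal(loc=3.0, scale=1.0, size=(100, 2))

base_score = Y_train.mean(axis=0)   # one offset per output column
residuals = Y_train - base_score    # what the first trees would fit

# Residuals are centered, so early trees model structure, not offset.
print(np.allclose(residuals.mean(axis=0), 0.0))
```

Passing an explicit scalar or per-output `base_score` overrides this behavior, as noted in the differences list below.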

For the original project, benchmark figures, experiment scripts, and upstream research context, please see:

- Original repository: <https://github.com/zzd1992/GBDTMO>
- Experiment and evaluation repository: <https://github.com/zzd1992/GBDTMO-EX>

## Installation

Install the released package:

```bash
pip install omnigbdt
```

or with `uv`:

```bash
uv add omnigbdt
```

Optional extras:

```bash
pip install "omnigbdt[plot]"
pip install "omnigbdt[sklearn]"
```

The current wheel targets are:

- Linux x86_64
- Windows x86_64
- macOS arm64 (Apple Silicon, macOS 14 or later)

## First Model

The example below trains one `MultiOutputGBDT` model on two correlated targets using only NumPy:

```python
import numpy as np

from omnigbdt import MultiOutputGBDT, Verbosity

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 6))
shared_signal = 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 2] * X[:, 3]
Y = np.column_stack(
    [
        shared_signal + 0.3 * X[:, 4],
        shared_signal - 0.2 * X[:, 5],
    ]
)

X_train, Y_train = X[:240], Y[:240]
X_valid, Y_valid = X[240:320], Y[240:320]
X_test = X[320:]

model = MultiOutputGBDT(
    out_dim=Y.shape[1],
    params={
        "loss": b"mse",
        "max_depth": 4,
        "max_bins": 128,
        "lr": 0.05,
        "early_stop": 15,
        "num_threads": 1,
        "verbosity": Verbosity.SILENT,
    },
)
model.set_data((X_train, Y_train), (X_valid, Y_valid))
model.train(200)

preds = model.predict(X_test)
print(preds.shape)
```

`SingleOutputGBDT` can be used to train one model per target column as a simple baseline. A real-world financial benchmark based on the UCI Stock Portfolio Performance dataset, together with custom-objective and sklearn examples, is available in the [hosted examples page](https://omnigbdt.readthedocs.io/en/latest/example.html).
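The one-model-per-column baseline follows a simple loop. To keep the snippet self-contained and runnable without omnigbdt, a trivial column-mean predictor stands in for the real model; with the package installed, you would construct and fit a `SingleOutputGBDT` on each `Y[:, j]` instead (see the API page for its exact signature).

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
Y = np.column_stack([X[:, 0] + 0.1, X[:, 1] - 0.2])

X_train, Y_train = X[:150], Y[:150]
X_test = X[150:]

# One model per target column. ColumnMean is a stand-in so the
# snippet runs anywhere; substitute SingleOutputGBDT fitted on
# (X_train, Y_train[:, j]) for a real baseline.
class ColumnMean:
    def fit(self, X, y):
        self.mean_ = y.mean()
        return self

    def predict(self, X):
        return np.full(len(X), self.mean_)

models = [ColumnMean().fit(X_train, Y_train[:, j]) for j in range(Y.shape[1])]
preds = np.column_stack([m.predict(X_test) for m in models])
print(preds.shape)  # one prediction column per target
```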

## Differences From The Original Package

Compared with the upstream GBDT-MO repository, OmniGBDT currently adds:

- standard Python packaging and bundled native-library loading
- wheel automation for Linux, macOS, and Windows
- public Python callback hooks for custom gradients, Hessians, metrics, and early stopping
- public `deterministic` parameter for fixed-thread CPU repeatability on the same platform
- optional sklearn-compatible wrappers
- automatic regression mean initialization when `base_score` is omitted
- scalar or per-output `base_score` values for `MultiOutputGBDT`
- accuracy-oriented wrapper defaults for regression workflows

Several targeted native-code fixes are also part of the fork, so same-seed runs are not guaranteed to reproduce results from the unpatched upstream code exactly. A fuller summary is available in the [Differences From Upstream page](https://omnigbdt.readthedocs.io/en/latest/differences.html).

## Documentation Guide

- Hosted documentation: <https://omnigbdt.readthedocs.io>
- Installation and source-build notes: [install page](https://omnigbdt.readthedocs.io/en/latest/install.html)
- Worked examples, including the financial benchmark: [examples page](https://omnigbdt.readthedocs.io/en/latest/example.html)
- Python API reference: [API page](https://omnigbdt.readthedocs.io/en/latest/api.html)
- Parameter reference: [parameters page](https://omnigbdt.readthedocs.io/en/latest/parameters.html)
- Fork-specific differences: [differences page](https://omnigbdt.readthedocs.io/en/latest/differences.html)
- Development workflow: [development page](https://omnigbdt.readthedocs.io/en/latest/development.html)

## Project Provenance

This fork builds directly on the original GBDT-MO implementation by Zhendong Zhang and Cheolkon Jung.

OmniGBDT is intended to make the package easier to build, install, and distribute. It is not the canonical source for the paper, benchmark tables, figures, or research documentation.

## Versioning

This fork follows Semantic Versioning independently from the upstream GBDT-MO repository.

## License

This fork is distributed under the Apache License 2.0. The main license text for this fork is in [LICENSE](LICENSE).

Because this repository incorporates and modifies the original GBDT-MO codebase, the original upstream MIT license notice from Zhendong Zhang is preserved in [LICENSE.upstream](LICENSE.upstream). Additional attribution and fork-specific notice text is provided in [NOTICE](NOTICE).

## Citation

If this project is used in research, please credit the original paper by Zhang and Jung:

```bibtex
@article{zhang2020gbdt,
  title={GBDT-MO: Gradient-boosted decision trees for multiple outputs},
  author={Zhang, Zhendong and Jung, Cheolkon},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  volume={32},
  number={7},
  pages={3156--3167},
  year={2020},
  publisher={IEEE}
}
```
