Metadata-Version: 2.4
Name: nao-core
Version: 0.1.4
Summary: nao Core is your analytics context builder with the best chat interface.
Project-URL: Homepage, https://getnao.io
Project-URL: Repository, https://github.com/naolabs/chat
Author: nao Labs
License-Expression: Apache-2.0
License-File: LICENSE
Keywords: ai,analytics,chat
Classifier: Development Status :: 3 - Alpha
Classifier: Environment :: Console
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Operating System :: MacOS
Classifier: Operating System :: Microsoft :: Windows
Classifier: Operating System :: POSIX :: Linux
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Requires-Python: >=3.10
Requires-Dist: apscheduler>=3.10.0
Requires-Dist: cryptography>=46.0.3
Requires-Dist: cyclopts>=4.4.4
Requires-Dist: dotenv>=0.9.9
Requires-Dist: fastapi>=0.128.0
Requires-Dist: ibis-framework>=9.0.0
Requires-Dist: jinja2>=3.1.0
Requires-Dist: numpy>=1.26.0
Requires-Dist: pandas>=2.1.0
Requires-Dist: posthog>=7.8.0
Requires-Dist: pydantic>=2.10.0
Requires-Dist: pyngrok>=7.0.0
Requires-Dist: pytest>=9.0.2
Requires-Dist: python-dateutil>=2.8.0
Requires-Dist: python-dotenv>=1.2.1
Requires-Dist: pyyaml>=6.0.0
Requires-Dist: questionary>=2.1.0
Requires-Dist: rich>=14.0.0
Requires-Dist: sqlglot>=26.0.0
Requires-Dist: uvicorn>=0.40.0
Provides-Extra: all
Requires-Dist: anthropic>=0.76.0; extra == 'all'
Requires-Dist: azure-identity>=1.19.0; extra == 'all'
Requires-Dist: certifi>=2024.0.0; extra == 'all'
Requires-Dist: google-cloud>=0.34.0; extra == 'all'
Requires-Dist: google-genai>=1.61.0; extra == 'all'
Requires-Dist: ibis-framework[athena]>=9.0.0; extra == 'all'
Requires-Dist: ibis-framework[bigquery]>=9.0.0; extra == 'all'
Requires-Dist: ibis-framework[clickhouse]>=9.0.0; extra == 'all'
Requires-Dist: ibis-framework[databricks]>=9.0.0; extra == 'all'
Requires-Dist: ibis-framework[duckdb]>=9.0.0; extra == 'all'
Requires-Dist: ibis-framework[mssql]>=9.0.0; extra == 'all'
Requires-Dist: ibis-framework[mysql]>=9.0.0; extra == 'all'
Requires-Dist: ibis-framework[postgres]>=9.0.0; extra == 'all'
Requires-Dist: ibis-framework[snowflake]>=9.0.0; extra == 'all'
Requires-Dist: ibis-framework[trino]>=9.0.0; extra == 'all'
Requires-Dist: mistralai<2.0.0,>=1.11.1; extra == 'all'
Requires-Dist: notion-client>=2.7.0; extra == 'all'
Requires-Dist: notion2md>=2.9.0; extra == 'all'
Requires-Dist: ollama>=0.4.0; extra == 'all'
Requires-Dist: openai>=1.0.0; extra == 'all'
Requires-Dist: snowflake-connector-python[secure-local-storage]>=4.2.0; extra == 'all'
Requires-Dist: sshtunnel>=0.4.0; extra == 'all'
Provides-Extra: all-databases
Requires-Dist: azure-identity>=1.19.0; extra == 'all-databases'
Requires-Dist: certifi>=2024.0.0; extra == 'all-databases'
Requires-Dist: google-cloud>=0.34.0; extra == 'all-databases'
Requires-Dist: ibis-framework[athena]>=9.0.0; extra == 'all-databases'
Requires-Dist: ibis-framework[bigquery]>=9.0.0; extra == 'all-databases'
Requires-Dist: ibis-framework[clickhouse]>=9.0.0; extra == 'all-databases'
Requires-Dist: ibis-framework[databricks]>=9.0.0; extra == 'all-databases'
Requires-Dist: ibis-framework[duckdb]>=9.0.0; extra == 'all-databases'
Requires-Dist: ibis-framework[mssql]>=9.0.0; extra == 'all-databases'
Requires-Dist: ibis-framework[mysql]>=9.0.0; extra == 'all-databases'
Requires-Dist: ibis-framework[postgres]>=9.0.0; extra == 'all-databases'
Requires-Dist: ibis-framework[snowflake]>=9.0.0; extra == 'all-databases'
Requires-Dist: ibis-framework[trino]>=9.0.0; extra == 'all-databases'
Requires-Dist: snowflake-connector-python[secure-local-storage]>=4.2.0; extra == 'all-databases'
Requires-Dist: sshtunnel>=0.4.0; extra == 'all-databases'
Provides-Extra: all-llms
Requires-Dist: anthropic>=0.76.0; extra == 'all-llms'
Requires-Dist: google-genai>=1.61.0; extra == 'all-llms'
Requires-Dist: mistralai<2.0.0,>=1.11.1; extra == 'all-llms'
Requires-Dist: ollama>=0.4.0; extra == 'all-llms'
Requires-Dist: openai>=1.0.0; extra == 'all-llms'
Provides-Extra: anthropic
Requires-Dist: anthropic>=0.76.0; extra == 'anthropic'
Provides-Extra: athena
Requires-Dist: ibis-framework[athena]>=9.0.0; extra == 'athena'
Provides-Extra: bigquery
Requires-Dist: google-cloud>=0.34.0; extra == 'bigquery'
Requires-Dist: ibis-framework[bigquery]>=9.0.0; extra == 'bigquery'
Provides-Extra: clickhouse
Requires-Dist: ibis-framework[clickhouse]>=9.0.0; extra == 'clickhouse'
Provides-Extra: databricks
Requires-Dist: certifi>=2024.0.0; extra == 'databricks'
Requires-Dist: ibis-framework[databricks]>=9.0.0; extra == 'databricks'
Provides-Extra: dev
Requires-Dist: pytest-cov; extra == 'dev'
Requires-Dist: pytest-timeout>=2.3.0; extra == 'dev'
Provides-Extra: duckdb
Requires-Dist: ibis-framework[duckdb]>=9.0.0; extra == 'duckdb'
Provides-Extra: fabric
Requires-Dist: azure-identity>=1.19.0; extra == 'fabric'
Requires-Dist: ibis-framework[mssql]>=9.0.0; extra == 'fabric'
Provides-Extra: gemini
Requires-Dist: google-genai>=1.61.0; extra == 'gemini'
Provides-Extra: mistral
Requires-Dist: mistralai<2.0.0,>=1.11.1; extra == 'mistral'
Provides-Extra: mssql
Requires-Dist: ibis-framework[mssql]>=9.0.0; extra == 'mssql'
Provides-Extra: mysql
Requires-Dist: ibis-framework[mysql]>=9.0.0; extra == 'mysql'
Provides-Extra: notion
Requires-Dist: notion-client>=2.7.0; extra == 'notion'
Requires-Dist: notion2md>=2.9.0; extra == 'notion'
Provides-Extra: ollama
Requires-Dist: ollama>=0.4.0; extra == 'ollama'
Provides-Extra: openai
Requires-Dist: openai>=1.0.0; extra == 'openai'
Provides-Extra: postgres
Requires-Dist: ibis-framework[postgres]>=9.0.0; extra == 'postgres'
Provides-Extra: redshift
Requires-Dist: ibis-framework[postgres]>=9.0.0; extra == 'redshift'
Requires-Dist: sshtunnel>=0.4.0; extra == 'redshift'
Provides-Extra: snowflake
Requires-Dist: ibis-framework[snowflake]>=9.0.0; extra == 'snowflake'
Requires-Dist: snowflake-connector-python[secure-local-storage]>=4.2.0; extra == 'snowflake'
Provides-Extra: trino
Requires-Dist: ibis-framework[trino]>=9.0.0; extra == 'trino'
Description-Content-Type: text/markdown

# nao CLI

Command-line interface for nao chat.

## Installation

Install the core package (lightweight, no database or LLM dependencies):

```bash
pip install nao-core
```

Then add only the providers you need:

```bash
# Database backends
pip install 'nao-core[postgres]'
pip install 'nao-core[bigquery]'
pip install 'nao-core[snowflake]'
pip install 'nao-core[duckdb]'
pip install 'nao-core[clickhouse]'
pip install 'nao-core[databricks]'
pip install 'nao-core[mysql]'
pip install 'nao-core[mssql]'
pip install 'nao-core[athena]'
pip install 'nao-core[trino]'
pip install 'nao-core[redshift]'
pip install 'nao-core[fabric]'

# LLM providers
pip install 'nao-core[openai]'
pip install 'nao-core[anthropic]'
pip install 'nao-core[mistral]'
pip install 'nao-core[gemini]'
pip install 'nao-core[ollama]'

# Integrations
pip install 'nao-core[notion]'
```

Combine multiple extras in a single install:

```bash
pip install 'nao-core[postgres,openai]'
pip install 'nao-core[snowflake,bigquery,anthropic]'
```

Or install everything at once (equivalent to the previous default):

```bash
pip install 'nao-core[all]'
```

Convenience groups are also available:

```bash
pip install 'nao-core[all-databases]'  # all database backends
pip install 'nao-core[all-llms]'       # all LLM providers
```

## Usage

```bash
nao --help
Usage: nao COMMAND

╭─ Commands ────────────────────────────────────────────────────────────────╮
│ chat         Start the nao chat UI.                                       │
│ debug        Test connectivity to configured resources.                   │
│ init         Initialize a new nao project.                                │
│ sync         Sync resources to local files.                               │
│ test         Run and explore nao tests.                                   │
│ --help (-h)  Display this message and exit.                               │
│ --version    Display application version.                                 │
╰───────────────────────────────────────────────────────────────────────────╯
```

### Initialize a new nao project

```bash
nao init
```

This creates a new nao project in the current directory. It prompts you for a project name and asks you to configure:

- **Database connections** (BigQuery, DuckDB, Databricks, Snowflake, PostgreSQL, Redshift, MSSQL, Trino)
- **Git repositories** to sync
- **LLM provider** (OpenAI, Anthropic, Mistral, Gemini, OpenRouter, Ollama)
- **`ai_summary` template + model** (prompted only when you enable `ai_summary` for databases)
- **Slack integration**
- **Notion integration**

The resulting project structure looks like:

```
<project>/
├── nao_config.yaml
├── .naoignore
├── RULES.md
├── databases/
├── queries/
├── docs/
├── semantics/
├── repos/
├── agent/
│   ├── tools/
│   └── mcps/
└── tests/
```

Options:

- `--force` / `-f`: Force re-initialization even if the project already exists

### Start the nao chat UI

```bash
nao chat
```

Starts the nao chat server and opens the chat interface in your browser at `http://localhost:5005`.

### Test connectivity

```bash
nao debug
```

Tests connectivity to all configured databases and LLM providers. Displays a summary table showing connection status and details for each resource.

### Sync resources

```bash
nao sync
```

Syncs configured resources to local files:

- **Databases** — generates markdown docs (`columns.md`, `preview.md`, `description.md`) for each table into `databases/`
- **Git repositories** — clones or pulls repos into `repos/`
- **Notion pages** — exports pages as markdown into `docs/notion/`

After syncing, any Jinja templates (`*.j2` files) in the project directory are rendered with the nao context.

Optional `ai_summary` generation:

- Add `ai_summary` to a database connection `templates` list to render `ai_summary.md`.
- Use `prompt("...")` inside Jinja templates to generate `ai_summary` content.
- `prompt(...)` requires `llm.provider`, `llm.annotation_model`, and `llm.api_key` (except for Ollama).
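
Putting the above together, a database entry with `ai_summary` enabled might look like the following sketch. Only `templates`, `llm.provider`, `llm.annotation_model`, and `llm.api_key` are named in this README; the surrounding key layout, connection name, and model value are illustrative assumptions, so check your generated `nao_config.yaml` for the exact schema:

```yaml
# Illustrative nao_config.yaml fragment — layout is an assumption except for
# the keys documented above (templates, llm.provider, llm.annotation_model,
# llm.api_key).
databases:
  - name: warehouse        # hypothetical connection name
    type: postgres
    templates:
      - ai_summary         # opt in to rendering ai_summary.md on sync

llm:
  provider: openai
  annotation_model: gpt-4.1
  api_key: ${OPENAI_API_KEY}   # not required for Ollama
```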

### Run tests

```bash
nao test
```

Runs test cases defined as YAML files in `tests/`. Each test has a `name`, `prompt`, and expected `sql`. Results are saved to `tests/outputs/`.
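
As a sketch, a minimal test file could look like this (the file name, prompt, and SQL are hypothetical, and any fields beyond the three named above are not guaranteed by this README):

```yaml
# tests/monthly_revenue.yaml — illustrative example, not a real project file
name: monthly revenue
prompt: "What was total revenue per month in 2024?"
sql: |
  SELECT date_trunc('month', order_date) AS month,
         sum(amount) AS revenue
  FROM orders
  WHERE order_date >= '2024-01-01'
  GROUP BY 1
  ORDER BY 1
```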

Options:

- `--model` / `-m`: Models to test against (default: `openai:gpt-4.1`). Can be specified multiple times.
- `--threads` / `-t`: Number of parallel threads (default: `1`)

Examples:

```bash
nao test -m openai:gpt-4.1
nao test -m openai:gpt-4.1 -m anthropic:claude-sonnet-4-20250514
nao test --threads 4
```

### Explore test results

```bash
nao test server
```

Starts a local web server to explore test results in a browser UI showing pass/fail status, token usage, cost, and detailed data comparisons.

Options:

- `--port` / `-p`: Port to run the server on (default: `8765`)
- `--no-open`: Don't automatically open the browser

### BigQuery service account permissions

When you connect BigQuery during `nao init`, the service account used by `credentials_path`/ADC must be able to list datasets and run read-only queries to generate docs. Grant the account:

- Project: `roles/bigquery.jobUser` (or `roles/bigquery.user`) so the CLI can submit queries
- Each dataset you sync: `roles/bigquery.dataViewer` (or higher) to read tables

The combination above mirrors the typical "BigQuery User" setup and is sufficient for nao's metadata and preview pulls.

### Snowflake authentication

Snowflake supports three authentication methods during `nao init`:

- **SSO**: Browser-based authentication (recommended for organizations with SSO policies)
- **Password**: Traditional username/password
- **Key-pair**: Private key file with optional passphrase
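
For the key-pair option, a PKCS#8 private key can be generated with standard OpenSSL commands (file names are arbitrary; the `-nocrypt` flag produces an unencrypted key, while an encrypted key corresponds to the optional passphrase):

```shell
# Generate a 2048-bit RSA key and convert it to unencrypted PKCS#8 format
openssl genrsa -out rsa_key.pem 2048
openssl pkcs8 -topk8 -inform PEM -in rsa_key.pem -out rsa_key.p8 -nocrypt

# Derive the public key to register with your Snowflake user
# (via ALTER USER ... SET RSA_PUBLIC_KEY in Snowflake)
openssl rsa -in rsa_key.pem -pubout -out rsa_key.pub
```

Point the key-pair configuration at the resulting `rsa_key.p8` file.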

## Development

### Building the package

```bash
cd cli
python build.py --help
Usage: build.py [OPTIONS]

Build and package nao-core CLI.

╭─ Parameters ──────────────────────────────────────────────────────────────────╮
│ --force -f --no-force              Force rebuild the server binary             │
│ --skip-server -s --no-skip-server  Skip server build, only build Python pkg   │
│ --bump                             Bump version (patch, minor, major)          │
╰───────────────────────────────────────────────────────────────────────────────╯
```

This will:
1. Build the frontend with Vite
2. Compile the backend with Bun into a standalone binary
3. Bundle everything into a Python wheel in `dist/`

### Installing for development

```bash
cd cli
pip install -e '.[all]'
```

### Publishing to PyPI

```bash
# Build first
python build.py

# Publish
uv publish dist/*
```

## Architecture

```
nao chat (CLI command)
    ↓ spawns
nao-chat-server (Bun-compiled binary, port 5005)
  + FastAPI server (port 8005)
    ↓ serves
Backend API + Frontend Static Files
    ↓
Browser at http://localhost:5005
```
