Metadata-Version: 2.4
Name: vllm-rs
Version: 0.9.8
Summary: A minimal, high-performance inference engine for large language models (LLMs), implementing vLLM in Rust.
Keywords: llm,inference,candle,qwen,moe,rust,vllm
Home-page: https://github.com/guoqingbao/vllm.rs
License: MIT
Project-URL: Homepage, https://github.com/guoqingbao/vllm.rs
Project-URL: Source Code, https://github.com/guoqingbao/vllm.rs
