hb.LLM is intentionally stateless. Conversation state lives in the messages you pass to chat or stream, which keeps the LLM client simple and easy to serialize. Append each reply to your message list and pass the full history back to chat when you need another turn.
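The stateless multi-turn loop above can be sketched as follows. `FakeLLM` is a stand-in for the real client, which this sketch does not depend on; the point is that the caller owns the message history and passes all of it on every call.

```python
# Stand-in client: holds no conversation state between calls, like hb.LLM.
class FakeLLM:
    def chat(self, messages):
        # A real client would send `messages` to the model here.
        return {"role": "assistant", "content": f"reply to {len(messages)} messages"}

llm = FakeLLM()
messages = [{"role": "user", "content": "Hello"}]

reply = llm.chat(messages)                              # turn 1
messages.append(reply)                                  # caller owns the history
messages.append({"role": "user", "content": "More?"})
reply2 = llm.chat(messages)                             # turn 2: pass everything back
```

Because the client keeps nothing, the `messages` list is the whole conversation: serialize it, store it, or hand it to a different client instance and the next call behaves identically.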
Inspect resolved state
Use spec when you need to see how a client resolved its configuration:
spec.to_dict() omits runtime secrets. spec.to_dict(secrets=True) includes the materialized runtime dictionary and should only be used in trusted debugging contexts.
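A minimal sketch of the redaction behavior described above. The `Spec` class and its fields here are illustrative stand-ins; only the `to_dict()` / `to_dict(secrets=True)` contract comes from the text.

```python
import dataclasses

@dataclasses.dataclass
class Spec:
    model: str
    runtime: dict  # materialized runtime values, including secrets

    def to_dict(self, secrets: bool = False) -> dict:
        d = {"model": self.model}
        if secrets:
            # Trusted debugging contexts only: exposes runtime secrets.
            d["runtime"] = dict(self.runtime)
        return d

spec = Spec(model="example-model", runtime={"api_key": "sk-redacted"})
safe = spec.to_dict()                 # no secrets in the default view
debug = spec.to_dict(secrets=True)    # full runtime dictionary
```

The default view is safe to log or persist; the `secrets=True` view should never leave a trusted debugging session.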
Deterministic spec keys
Resolved specs can produce stable hash keys for deduplication and future response caching. client_key() includes only the gateway client construction fields, so duplicate LLM instances can reuse the same in-memory OpenAI-compatible SDK client.
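The client-reuse idea can be sketched like this. The field names and the hashing scheme are assumptions for illustration; the real client_key() may differ, but the principle is the same: hash only the construction fields, and cache SDK clients by that key.

```python
import hashlib
import json

def client_key(spec: dict) -> str:
    # Only client-construction fields participate in the key; per-request
    # fields (e.g. temperature) are deliberately excluded.
    fields = {k: spec[k] for k in ("base_url", "api_key", "timeout") if k in spec}
    blob = json.dumps(fields, sort_keys=True)  # sort_keys makes the key stable
    return hashlib.sha256(blob.encode()).hexdigest()

_clients: dict[str, object] = {}

def get_client(spec: dict) -> object:
    # Specs with identical construction fields map to one cached client.
    key = client_key(spec)
    if key not in _clients:
        _clients[key] = object()  # stand-in for an OpenAI-compatible SDK client
    return _clients[key]

a = get_client({"base_url": "https://api.example", "api_key": "k", "temperature": 0.2})
b = get_client({"base_url": "https://api.example", "api_key": "k", "temperature": 0.9})
```

Here `a is b`: the differing temperature does not affect the key, so both specs reuse one SDK client, while a different base_url or api_key would produce a new one.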

