This guide reorganizes the SimpleLLMFunc docs by task, subsystem, and learning stage so you can quickly find the next page to read.
If you are new to SimpleLLMFunc, start with Quick Start and then use this page to decide what to read next.

Quick Navigation

Infrastructure

Configuration and Environment

Learn how .env, provider.json, logging, and runtime configuration work.
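
As a rough illustration of the provider-to-model mapping described on that page, a provider.json might look like the following. The field names here are assumptions for illustration only; consult the Configuration and Environment page for the authoritative schema.

```json
{
  "openai": [
    {
      "model_name": "gpt-4o-mini",
      "api_keys": ["sk-..."],
      "base_url": "https://api.openai.com/v1"
    }
  ],
  "local_vllm": [
    {
      "model_name": "qwen2.5-7b-instruct",
      "api_keys": ["EMPTY"],
      "base_url": "http://localhost:8000/v1"
    }
  ]
}
```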

LLM Interface Layer

Understand OpenAICompatible, OpenAIResponsesCompatible, API key pools, and token bucket rate limiting.
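
The rate-limiting idea behind the interface layer is a standard token bucket: requests spend tokens, and tokens refill at a fixed rate up to a burst capacity. A minimal generic sketch (not the framework's actual class) looks like this:

```python
import time

class TokenBucket:
    """Simple token bucket: allows bursts up to `capacity`,
    refilling at `rate` tokens per second."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def try_acquire(self, n: float = 1.0) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= n:
            self.tokens -= n
            return True
        return False

bucket = TokenBucket(rate=2.0, capacity=5.0)
allowed = [bucket.try_acquire() for _ in range(6)]
# The first five back-to-back requests pass; the sixth is throttled
# until the bucket refills.
```

The same idea extends naturally to an API key pool: keep one bucket per key and route each request to a key whose bucket still has tokens.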

Developer Experience

llm_function Decorator

Build stateless LLM-powered functions with structured output, templates, and tool usage.
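
The core pattern can be sketched with a toy stand-in: a decorator that turns a function's docstring and call-time arguments into a prompt, sends it to a model, and parses the reply. Everything below is illustrative only — `toy_llm_function` and `fake_call_model` are made-up names, and the fake model returns a canned answer; see the llm_function Decorator page for the real API.

```python
import asyncio
import json

async def fake_call_model(prompt: str) -> str:
    # Stand-in for a real model call; always "extracts" the same answer.
    return json.dumps({"city": "Paris", "population": 2102650})

def toy_llm_function(func):
    # Build a prompt from the docstring and arguments, call the model,
    # and parse the reply. A real framework would also validate the
    # parsed value against the annotated return type.
    async def wrapper(**kwargs):
        prompt = f"{func.__doc__}\nInputs: {json.dumps(kwargs)}\nReply as JSON."
        raw = await fake_call_model(prompt)
        return json.loads(raw)
    return wrapper

@toy_llm_function
async def extract_city(text: str) -> dict:
    """Extract the city name and its population from the text."""

result = asyncio.run(extract_city(text="Paris has about 2.1 million residents."))
```

Note that the decorated function body is just a docstring: the decorator supplies the behavior, which is what makes such functions stateless and declarative.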

llm_chat Decorator

Build multi-turn chat applications, Agents, and streaming interactions.

Agent Execution

Event Stream System

Observe the ReAct loop and consume model, tool, and runtime events.
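
The consumption pattern is a loop over an async stream of typed events. A minimal self-contained sketch (the event kinds and `fake_react_loop` here are invented for illustration, not the framework's event types):

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Event:
    kind: str      # e.g. "model_chunk", "tool_call", "turn_end"
    payload: str

async def fake_react_loop():
    # Stand-in for an agent turn that yields events as it runs.
    yield Event("model_chunk", "Thinking about the question...")
    yield Event("tool_call", "search(query='weather')")
    yield Event("model_chunk", "It will be sunny.")
    yield Event("turn_end", "")

async def collect():
    seen = []
    async for event in fake_react_loop():
        # A UI or telemetry sink would dispatch on event.kind here.
        seen.append(event.kind)
    return seen

kinds = asyncio.run(collect())
```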

Abort and Cancellation

Use AbortSignal to interrupt a running turn and shut it down cleanly.
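
The underlying idea is cooperative cancellation: the running turn polls a shared flag between steps and stops cleanly when it is set. A minimal asyncio sketch of the concept (`ToySignal` is not the framework's AbortSignal API):

```python
import asyncio

class ToySignal:
    # Cooperative abort flag; workers check it between steps.
    def __init__(self):
        self._event = asyncio.Event()

    def abort(self):
        self._event.set()

    @property
    def aborted(self) -> bool:
        return self._event.is_set()

async def long_turn(signal: ToySignal) -> list:
    steps = []
    for i in range(100):
        if signal.aborted:
            break  # clean shutdown point between steps
        steps.append(i)
        await asyncio.sleep(0)  # yield control so abort() can run
    return steps

async def main():
    signal = ToySignal()
    task = asyncio.create_task(long_turn(signal))
    await asyncio.sleep(0)  # let the turn start
    signal.abort()
    return await task

steps = asyncio.run(main())
```

Because cancellation is cooperative, the turn exits at a well-defined boundary instead of being killed mid-operation.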

Tools and Runtime

Runtime Primitives

Learn how runtime.*, PrimitivePack, and backend lifecycle management work.

Tool System

Define tools, return structured values, and inject tool usage guidance.
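
A tool decorator generally records a name, a description, and a parameter schema derived from the function signature, so the model can be shown what it may call. A toy sketch of that registration idea (`toy_tool` and the schema shape are illustrative, not the framework's `@tool`):

```python
from typing import get_type_hints

TOOLS = {}

def toy_tool(func):
    # Record a description and parameter types the model can be shown.
    # A real framework emits provider-specific JSON schemas and also
    # handles invocation and result validation.
    hints = get_type_hints(func)
    hints.pop("return", None)
    TOOLS[func.__name__] = {
        "description": (func.__doc__ or "").strip(),
        "parameters": {name: t.__name__ for name, t in hints.items()},
    }
    return func

@toy_tool
def get_weather(city: str, unit: str) -> str:
    """Return the current weather for a city."""
    return f"Sunny in {city} (reported in {unit})."

spec = TOOLS["get_weather"]
```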

PyRepl Runtime

Execute Python code in a persistent session and expose runtime primitives.
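
"Persistent session" means every execution shares one namespace, so variables and functions defined in an earlier call stay visible to later calls. A minimal sketch of that idea (not the actual PyRepl implementation):

```python
class ToySession:
    """Executes code snippets in one shared namespace, so state
    defined in an earlier call persists into later calls."""

    def __init__(self):
        self.namespace: dict = {}

    def execute(self, code: str) -> None:
        # Statements mutate the shared namespace.
        exec(code, self.namespace)

    def evaluate(self, expr: str):
        # Expressions are evaluated against the same namespace.
        return eval(expr, self.namespace)

session = ToySession()
session.execute("x = 21")
session.execute("def double(n): return n * 2")
result = session.evaluate("double(x)")  # `x` and `double` persist
```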

UI and Interaction

Terminal TUI

Wrap @llm_chat with a Textual-based terminal interface.

Integrations and Examples

Langfuse Integration

Add observability for model calls, tool calls, and event streaming.

Examples

Browse runnable examples by scenario and recommended learning order.

Contributing

Learn the repository conventions for issues, pull requests, and local development.

Browse by Task

Start with Quick Start, then come back here to choose the next topic.

- Read Configuration and Environment for .env, provider.json, and logging.
- Read llm_function Decorator to learn about signatures, output types, templates, and tools.
- Read llm_chat Decorator to learn about history handling, streaming output, and runtime context.
- Read Event Stream System to consume EventYield and ResponseYield in custom UI or telemetry code.
- Read Tool System. If you also need persistent Python state, pair it with PyRepl Runtime.
- Read Terminal TUI to learn about @tui, interrupts, hotkeys, and custom event hooks.
- Go to Examples and browse by scenario.

Beginner path

  1. Read Quick Start
  2. Read llm_function Decorator
  3. Run a structured-output example from Examples

Intermediate path

  1. Read llm_chat Decorator
  2. Read Tool System
  3. Revisit Configuration and Environment to tune providers and limits

Browse by Capability

| Capability | Documentation | What it covers |
| --- | --- | --- |
| Basic configuration | Configuration and Environment | API keys, environment variables, and provider.json |
| Stateless LLM tasks | llm_function Decorator | Text processing, typed outputs, structured extraction |
| Chat applications | llm_chat Decorator | Multi-turn conversation, history handling, streaming |
| Event streaming | Event Stream System | Realtime observation, tool call telemetry, performance insight |
| Abort control | Abort and Cancellation | Interrupting model output and tool execution |
| Tool integration | Tool System | Tool definitions, invocation, multimodal returns |
| Runtime primitives | Runtime Primitives | CodeAct runtime capabilities and primitive design |
| Interface design | LLM Interface Layer | API abstraction, key pools, and rate limiting |
| Runnable examples | Examples | End-to-end examples for common scenarios |

FAQ Quick Reference

Q: How do I configure API keys and model providers?
A: Read Configuration and Environment. Pay special attention to the provider-to-model-list structure in provider.json.

Q: Can decorated functions be plain synchronous functions?
A: No. @llm_function, @llm_chat, and @tool all require async def functions.

Q: How do I build a streaming, multi-turn chat application?
A: Start with llm_chat Decorator and learn how history, stream=True, and return modes work.

Q: How do I interrupt a turn that is already running?
A: Read Abort and Cancellation and use AbortSignal.

Q: How do I give the model custom tools?
A: Read Tool System to learn @tool, return types, and tool guidance injection.

Q: Which model services can the framework talk to?
A: Read LLM Interface Layer. The framework supports both OpenAI-compatible chat/completions adapters and OpenAI Responses API adapters, and can connect to many compatible services.

Q: What should I do when a call fails?
A: Start with the troubleshooting section in LLM Interface Layer, then use logs and the event stream for deeper inspection.

Other Resources

Project Introduction

Learn the design philosophy, core features, and project layout.

Examples

Run example code directly and compare different patterns.

GitHub Repository

Browse source code, issues, and release history.

Most pages include complete code examples. When something goes wrong, first check the troubleshooting or FAQ section of the relevant page, then compare your code against the examples.