This page organizes the SimpleLLMFunc examples by use case, from basic typed LLM functions to event streams, TUI applications, runtime primitives, and Agent composition.
All decorators in SimpleLLMFunc, such as @llm_function, @llm_chat, and @tool, require functions defined with async def.

Quick Access

Basic Function Example

Start with structured output and basic prompt design.

Event Stream Chat Example

Inspect the full ReAct loop, tool calls, and token statistics.

Terminal TUI Agent

Explore the combination of @tui and @llm_chat.

Provider Config Template

Reuse a provider configuration template with multiple vendors and keys.

Highlighted Examples

File: examples/llm_function_pydantic_example.py
Useful for learning:
  • nested Pydantic models
  • structured parsing
  • typed return value handling
File: examples/dynamic_template_demo.py
Useful for learning:
  • one function serving many scenarios
  • switching prompts by role or style
  • reducing repetitive function definitions
File: examples/response_api_example.py
Useful for learning:
  • OpenAIResponsesCompatible with reasoning={...}
  • system prompt to Responses instructions adaptation
  • runtime.selfref.fork.gather_all(...) result parsing via status, response, and result
  • workspace-scoped TUI agent workflows

Provider Configuration Examples

provider.json Example

See the full provider-to-model configuration structure.

provider_template.json

Reuse a template with multiple providers, keys, and rate limits.


Run Examples Quickly

1. Prepare the environment

  1. Install SimpleLLMFunc: pip install SimpleLLMFunc
  2. Configure your API keys as described in Quick Start
  3. Create or update provider.json
2. Enter the examples directory

cd examples
3. Run a few representative examples

python llm_function_pydantic_example.py
python event_stream_chatbot.py
python parallel_toolcall_example.py
python multi_modality_toolcall.py
python tui_chat_example.py
event_stream_chatbot.py depends on rich, so install it first if needed.

Suggested Learning Path

1. Beginner

  1. Read Quick Start
  2. Run llm_function_pydantic_example.py
  3. Modify the prompt and observe how the structured output changes
2. Intermediate

  1. Read llm_chat Decorator
  2. Run event_stream_chatbot.py
  3. Try parallel_toolcall_example.py
3. Advanced

  1. Read LLM Interface Layer
  2. Explore multi_modality_toolcall.py
  3. Study event_stream_chatbot.py and tui_general_agent_example.py

FAQ

Where does the example code live?
All example code lives in the repository’s examples/ directory. Clone the repository, edit the files under examples/, and run them locally.

Do the examples work with my provider?
Most examples rely on provider.json and therefore work with any OpenAI-compatible provider you configure.

What if an example fails to run?
Re-check your environment, API keys, and provider configuration, then compare against Quick Start.