[[tags: egg]]

== llm

A provider-agnostic LLM chat API client for CHICKEN Scheme with tool calling support.

[[toc:]]

=== Description

This egg provides a high-level interface for interacting with Large Language Model APIs. It supports:

* Multi-turn conversations with history management
* Tool/function calling with automatic execution
* File attachments (images, PDFs, text files)
* Token usage and cost tracking
* Provider abstraction for multiple LLM backends

OpenAI is included as the default provider. The modular architecture allows adding other providers (Anthropic, Mistral, etc.) without changing application code.

=== Author

Rolando Abarca

=== Repository

[[https://forgejo.rolando.cl/cpm/llm-egg]]

=== Requirements

* [[medea]]
* [[base64]]
* [[uri-common]]
* [[http-client]]
* [[intarweb]]
* [[openssl]]
* [[srfi-1]]
* [[srfi-13]]
* [[srfi-4]]
* [[logger]]

=== Documentation

==== Quick Start

<enscript highlight="scheme">
(import llm)

;; Create a conversation
(define conv (llm/chat system: "You are a helpful assistant."))

;; Send a message
(let-values ([(conv ok?) (llm/send conv "What is 2 + 2?")])
  (when ok?
    (print (llm/get-last-response conv))))
</enscript>

==== Environment Variables

; {{OPENAI_API_KEY}} : Required for the OpenAI provider.

==== Module: llm

The main module providing the public API.

<procedure>(llm/chat #!key system tools history model temperature max-tokens on-response-received on-tool-executed provider)</procedure>

Create a new conversation. Returns a conversation state alist.

; {{system}} : Optional system prompt string.
; {{tools}} : Optional list of tool name symbols (kebab-case) to enable.
; {{history}} : Optional existing message history to continue.
; {{model}} : Optional model name string (default: {{gpt-5-nano-2025-08-07}}).
; {{temperature}} : Optional temperature float (default: 1).
; {{max-tokens}} : Optional maximum completion tokens (default: 4000).
; {{on-response-received}} : Optional callback {{(lambda (message) ...)}}.
; {{on-tool-executed}} : Optional callback {{(lambda (name args result) ...)}}.
; {{provider}} : Optional provider record (default: {{openai-provider}}).

<enscript highlight="scheme">
;; Basic conversation
(define conv (llm/chat system: "You are a helpful assistant."))

;; With all options
(define conv
  (llm/chat system: "You can check the weather."
            tools: '(get-weather)
            model: "gpt-4o"
            temperature: 0.7
            max-tokens: 1000
            on-response-received: (lambda (msg) (print "Response: " msg))
            on-tool-executed: (lambda (name args result)
                                (print "Tool called: " name))))
</enscript>

<procedure>(llm/send conversation message #!key file)</procedure>

Send a message and get a response. Returns two values: the updated conversation and a success boolean.

; {{conversation}} : Conversation state from {{llm/chat}}.
; {{message}} : String message to send.
; {{file}} : Optional local file path to attach (image, PDF, or text file).

<enscript highlight="scheme">
;; Text only
(let-values ([(conv ok?) (llm/send conv "What is 2+2?")])
  (if ok?
      (print (llm/get-last-response conv))
      (print "Request failed")))

;; With image attachment
(let-values ([(conv ok?) (llm/send conv "What's in this image?" file: "photo.jpg")])
  (when ok?
    (print (llm/get-last-response conv))))

;; With PDF attachment
(let-values ([(conv ok?) (llm/send conv "Summarize this document" file: "report.pdf")])
  (when ok?
    (print (llm/get-last-response conv))))
</enscript>

<procedure>(llm/get-last-response conversation)</procedure>

Get the text content of the last assistant message from the conversation. Returns {{#f}} if no assistant message exists.

<procedure>(llm/register-tool! name schema implementation)</procedure>

Register a tool in the global registry.

; {{name}} : Symbol identifier in kebab-case (e.g., {{'get-weather}}).
; {{schema}} : Alist defining the function schema (use snake_case in {{function.name}}).
; {{implementation}} : Procedure {{(lambda (params-alist) ...)}} returning a result alist.

<enscript highlight="scheme">
(llm/register-tool! 'get-weather
  '((type . "function")
    (function . ((name . "get_weather")
                 (description . "Get current weather for a location")
                 (parameters . ((type . "object")
                                (properties . ((location . ((type . "string")
                                                            (description . "City name")))))
                                (required . #("location")))))))
  (lambda (params)
    (let ((location (alist-ref 'location params)))
      `((success . #t)
        (temperature . 72)
        (conditions . "sunny")
        (location . ,location)))))
</enscript>

<procedure>(llm/get-registered-tools #!optional tool-names)</procedure>

Get registered tool schemas as a vector. If {{tool-names}} (a list of kebab-case symbols) is provided, only those tools are returned.
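For illustration, assuming the {{get-weather}} tool has been registered as in the {{llm/register-tool!}} example above, a filtered lookup could look like this:

<enscript highlight="scheme">
(import llm)

;; Assumes 'get-weather was registered as shown above.
;; The result is a vector of schema alists; presumably this is what a
;; provider serializes into the request's tools array.
(define weather-tools (llm/get-registered-tools '(get-weather)))
(vector-length weather-tools) ; => 1

;; With no argument, every registered tool schema is returned.
(llm/get-registered-tools)
</enscript>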
<procedure>(llm/get-cost conversation)</procedure>

Get the total cost of the conversation in USD.

<procedure>(llm/get-tokens conversation)</procedure>

Get total token counts as a pair {{(input-tokens . output-tokens)}}.

<enscript highlight="scheme">
(let-values ([(conv ok?) (llm/send conv "Hello!")])
  (when ok?
    (let ((tokens (llm/get-tokens conv))
          (cost (llm/get-cost conv)))
      (print "Input tokens: " (car tokens))
      (print "Output tokens: " (cdr tokens))
      (print "Total cost: $" cost))))
</enscript>

<parameter>llm/use-provider</parameter>

Parameter to get or set the current default provider. Re-export of {{current-provider}} from {{llm-provider}}.

<enscript highlight="scheme">
;; Get current provider
(llm/use-provider) ; => openai-provider

;; Set a different default provider
(llm/use-provider some-other-provider)
</enscript>

==== Module: llm-provider

Defines the provider abstraction layer.

<record><llm-provider></record>

Record type for LLM providers with the following fields:

; {{name}} : Symbol identifying the provider (e.g., {{'openai}}).
; {{prepare-message}} : Procedure {{(message include-file?) -> provider-format-message}}.
; {{build-payload}} : Procedure {{(messages tools model temp max-tokens) -> payload-alist}}.
; {{call-api}} : Procedure {{(endpoint payload) -> response-alist}}.
; {{parse-response}} : Procedure {{(response-data) -> normalized-response}}.
; {{format-tool-result}} : Procedure {{(tool-call-id result) -> tool-message}}.
; {{get-model-pricing}} : Procedure {{(model-name) -> pricing-alist}}.
; {{extract-tool-calls}} : Procedure {{(response-message) -> list-of-tool-calls}}.

<procedure>(make-llm-provider name prepare-message build-payload call-api parse-response format-tool-result get-model-pricing extract-tool-calls)</procedure>

Constructor for provider records.

<procedure>(llm-provider? obj)</procedure>

Predicate to check if {{obj}} is an {{<llm-provider>}} record.

<procedure>(llm-provider-name provider)</procedure>
<procedure>(llm-provider-prepare-message provider)</procedure>
<procedure>(llm-provider-build-payload provider)</procedure>
<procedure>(llm-provider-call-api provider)</procedure>
<procedure>(llm-provider-parse-response provider)</procedure>
<procedure>(llm-provider-format-tool-result provider)</procedure>
<procedure>(llm-provider-get-model-pricing provider)</procedure>
<procedure>(llm-provider-extract-tool-calls provider)</procedure>

Accessor procedures for provider record fields.
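As a quick illustration, the predicate and accessors can be used to inspect a provider record. This sketch assumes the default OpenAI provider's {{name}} field is {{'openai}}, as the field description above suggests ({{openai-provider}} itself is documented below under {{llm-openai}}):

<enscript highlight="scheme">
(import llm-provider llm-openai)

(llm-provider? openai-provider)      ; => #t
(llm-provider-name openai-provider)  ; => openai (assumed)

;; Each remaining field holds a procedure, e.g. the API caller
;; with signature (endpoint payload) -> response-alist.
(procedure? (llm-provider-call-api openai-provider)) ; => #t
</enscript>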
<parameter>current-provider</parameter>

Parameter holding the current default provider. Initially {{#f}} until set by importing {{llm}} (which sets it to {{openai-provider}}).

==== Module: llm-openai

OpenAI provider implementation.

<constant>openai-provider</constant>

The OpenAI provider instance. This is the default provider when using the {{llm}} module.

<parameter>openai-http-client</parameter>

Parameter holding the HTTP client procedure. Can be overridden for testing or custom HTTP handling. The procedure signature is {{(endpoint payload) -> response-alist}}.

<enscript highlight="scheme">
;; Mock the HTTP client for testing
(openai-http-client
 (lambda (endpoint payload)
   '((choices . #(((message . ((content . "Test response")
                               (role . "assistant")))
                   (finish_reason . "stop"))))
     (usage . ((prompt_tokens . 10)
               (completion_tokens . 5))))))
</enscript>

<procedure>(openai-call-api endpoint payload)</procedure>

Make an API request using the current {{openai-http-client}}.

<constant>*openai-default-model*</constant>

Default model string: {{"gpt-5-nano-2025-08-07"}}.

<constant>*openai-default-temperature*</constant>

Default temperature: {{1}}.

<constant>*openai-default-max-tokens*</constant>

Default max completion tokens: {{4000}}.

==== Module: llm-common

Shared utilities used by providers.

<procedure>(detect-mime-type path)</procedure>

Detect MIME type from file extension. Supports JPEG, PNG, GIF, WebP, PDF, and common text files.

<procedure>(image-mime-type? mime)</procedure>

Returns {{#t}} if {{mime}} starts with {{"image/"}}.

<procedure>(pdf-mime-type? mime)</procedure>

Returns {{#t}} if {{mime}} is {{"application/pdf"}}.

<procedure>(vision-mime-type? mime)</procedure>

Returns {{#t}} if {{mime}} is an image or PDF (types supported by vision models).

<procedure>(read-file-base64 path)</procedure>

Read a file and return its contents as a base64-encoded string.

<procedure>(read-text-file path)</procedure>

Read a text file and return its contents as a string.

<procedure>(kebab->snake symbol)</procedure>

Convert a kebab-case symbol to snake_case. Example: {{'get-current-time}} becomes {{'get_current_time}}.
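A short sketch of these helpers in use; the exact MIME strings returned by {{detect-mime-type}} are assumed to be the standard ones (e.g. {{"image/jpeg"}} for {{.jpg}}):

<enscript highlight="scheme">
(import llm-common)

(detect-mime-type "photo.jpg")        ; => "image/jpeg" (assumed)
(image-mime-type? "image/png")        ; => #t
(pdf-mime-type? "application/pdf")    ; => #t
(vision-mime-type? "application/pdf") ; => #t

;; Useful when deriving a schema's function.name from a tool symbol:
(kebab->snake 'get-current-time)      ; => get_current_time
</enscript>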
==== Creating a New Provider

To add support for a new LLM backend, implement the provider interface:

<enscript highlight="scheme">
(import llm medea llm-provider llm-common)

(define (my-prepare-message msg include-file?)
  ;; Convert internal message format to API-specific format
  msg)

(define (my-build-payload messages tools model temp max-tokens)
  ;; Build the API request body
  `((model . ,model)
    (messages . ,(list->vector messages))))

(define (my-call-api endpoint payload)
  ;; Make HTTP request, return response alist
  ...)

(define (my-parse-response response-data)
  ;; Return alist with: success, message, content, tool-calls,
  ;; finish-reason, input-tokens, output-tokens
  `((success . #t)
    (message . ...)
    (content . "response text")
    (tool-calls . #f)
    (finish-reason . "stop")
    (input-tokens . 100)
    (output-tokens . 50)))

(define (my-extract-tool-calls message)
  ;; Return list of ((id . "...") (name . "...") (arguments . "json-string"))
  '())

(define (my-format-tool-result tool-call-id result)
  ;; Return tool result message in API format
  `((role . "tool")
    (tool_call_id . ,tool-call-id)
    (content . ,(json->string result))))

(define (my-get-model-pricing model-name)
  ;; Return ((input-price-per-1m . N) (output-price-per-1m . M))
  '((input-price-per-1m . 1.00)
    (output-price-per-1m . 3.00)))

(define my-provider
  (make-llm-provider 'my-provider
                     my-prepare-message
                     my-build-payload
                     my-call-api
                     my-parse-response
                     my-format-tool-result
                     my-get-model-pricing
                     my-extract-tool-calls))

;; Use it
(llm/use-provider my-provider)

;; Or per-conversation
(llm/chat system: "Hello" provider: my-provider)
</enscript>

=== Examples

==== Simple Chat

<enscript highlight="scheme">
(import llm)

(define conv (llm/chat system: "You are a helpful assistant."))

(let-values ([(conv ok?) (llm/send conv "Tell me a joke.")])
  (when ok?
    (print (llm/get-last-response conv))
    (print "Cost: $" (llm/get-cost conv))))
</enscript>

==== Multi-turn Conversation

<enscript highlight="scheme">
(import llm)

(define conv (llm/chat system: "You are a math tutor."))

(let-values ([(conv ok?) (llm/send conv "What is calculus?")])
  (when ok?
    (print (llm/get-last-response conv)))
  (let-values ([(conv ok?) (llm/send conv "Can you give me a simple example?")])
    (when ok?
      (print (llm/get-last-response conv)))
    (print "Total tokens: " (llm/get-tokens conv))))
</enscript>

==== Tool Calling

<enscript highlight="scheme">
(import llm srfi-19)

(llm/register-tool! 'get-current-time
  '((type . "function")
    (function . ((name . "get_current_time")
                 (description . "Get the current date and time")
                 (parameters . ((type . "object")
                                (properties . ())
                                (required . #()))))))
  (lambda (params)
    `((success . #t)
      (time . ,(date->string (current-date) "~Y-~m-~d ~H:~M:~S")))))

(define conv
  (llm/chat system: "You can tell the user the current time."
            tools: '(get-current-time)))

(let-values ([(conv ok?) (llm/send conv "What time is it?")])
  (when ok?
    (print (llm/get-last-response conv))))
</enscript>

=== License

BSD-3-Clause

=== Version History

; 0.0.1 : Initial release with OpenAI support.