llm
A provider-agnostic LLM chat API client for CHICKEN Scheme with tool calling support.
Description
This egg provides a high-level interface for interacting with Large Language Model APIs. It supports:
- Multi-turn conversations with history management
- Tool/function calling with automatic execution
- File attachments (images, PDFs, text files)
- Token usage and cost tracking
- Provider abstraction for multiple LLM backends
OpenAI is included as the default provider. The modular architecture allows adding other providers (Anthropic, Mistral, etc.) without changing application code.
Author
Rolando Abarca
Repository
https://forgejo.rolando.cl/cpm/llm-egg
Requirements
Documentation
Quick Start
(import llm)

;; Create a conversation
(define conv (llm/chat system: "You are a helpful assistant."))

;; Send a message
(let-values ([(conv ok?) (llm/send conv "What is 2 + 2?")])
  (when ok?
    (print (llm/get-last-response conv))))
Environment Variables
- OPENAI_API_KEY
- Required for the OpenAI provider.
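You would normally export this in your shell before starting your program. For quick experiments it can also be set from Scheme; a minimal sketch, assuming the OpenAI provider reads the variable from the process environment at request time:

(import (chicken process-context))

;; Placeholder key shown; assumes the provider reads OPENAI_API_KEY
;; from the process environment when a request is made.
(set-environment-variable! "OPENAI_API_KEY" "sk-...")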
Module: llm
The main module providing the public API.
[procedure] (llm/chat #!key system tools history model temperature max-tokens on-response-received on-tool-executed provider)

Create a new conversation. Returns a conversation state alist.
- system
- Optional system prompt string.
- tools
- Optional list of tool name symbols (kebab-case) to enable.
- history
- Optional existing message history to continue.
- model
- Optional model name string (default: gpt-5-nano-2025-08-07).
- temperature
- Optional temperature float (default: 1).
- max-tokens
- Optional maximum completion tokens (default: 4000).
- on-response-received
- Optional callback (lambda (message) ...).
- on-tool-executed
- Optional callback (lambda (name args result) ...).
- provider
- Optional provider record (default: openai-provider).
;; Basic conversation
(define conv (llm/chat system: "You are a helpful assistant."))

;; With all options
(define conv
  (llm/chat system: "You can check the weather."
            tools: '(get-weather)
            model: "gpt-4o"
            temperature: 0.7
            max-tokens: 1000
            on-response-received: (lambda (msg) (print "Response: " msg))
            on-tool-executed: (lambda (name args result)
                                (print "Tool called: " name))))

[procedure] (llm/send conversation message #!key file)
Send a message and get a response. Returns two values: the updated conversation and a success boolean.
- conversation
- Conversation state from llm/chat.
- message
- String message to send.
- file
- Optional local file path to attach (image, PDF, or text file).
;; Text only
(let-values ([(conv ok?) (llm/send conv "What is 2+2?")])
  (if ok?
      (print (llm/get-last-response conv))
      (print "Request failed")))

;; With image attachment
(let-values ([(conv ok?) (llm/send conv "What's in this image?" file: "photo.jpg")])
  (when ok?
    (print (llm/get-last-response conv))))

;; With PDF attachment
(let-values ([(conv ok?) (llm/send conv "Summarize this document" file: "report.pdf")])
  (when ok?
    (print (llm/get-last-response conv))))

[procedure] (llm/get-last-response conversation)
Get the text content of the last assistant message from the conversation. Returns #f if no assistant message exists.
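For example, a fresh conversation has no assistant message yet, so the call yields #f; after a successful llm/send it yields the reply text:

(define conv (llm/chat system: "You are terse."))
(llm/get-last-response conv)  ; => #f

(let-values ([(conv ok?) (llm/send conv "Say hi.")])
  (when ok?
    (print (llm/get-last-response conv))))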
[procedure] (llm/register-tool! name schema implementation)

Register a tool in the global registry.
- name
- Symbol identifier in kebab-case (e.g., 'get-weather).
- schema
- Alist defining the function schema (use snake_case in function.name).
- implementation
- Procedure (lambda (params-alist) ...) returning a result alist.
(llm/register-tool!
 'get-weather
 '((type . "function")
   (function . ((name . "get_weather")
                (description . "Get current weather for a location")
                (parameters . ((type . "object")
                               (properties . ((location . ((type . "string")
                                                           (description . "City name")))))
                               (required . #("location")))))))
 (lambda (params)
   (let ((location (alist-ref 'location params)))
     `((success . #t)
       (temperature . 72)
       (conditions . "sunny")
       (location . ,location)))))

[procedure] (llm/get-registered-tools #!optional tool-names)
Get registered tool schemas as a vector. If tool-names (a list of kebab-case symbols) is provided, only those tools are returned.
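For instance, assuming the get-weather tool from the llm/register-tool! example above has been registered:

;; All registered tool schemas, as a vector
(llm/get-registered-tools)

;; Only the named tools (kebab-case symbols)
(llm/get-registered-tools '(get-weather))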
[procedure] (llm/get-cost conversation)

Get the total cost of the conversation in USD.

[procedure] (llm/get-tokens conversation)

Get total token counts as a pair (input-tokens . output-tokens).
(let-values ([(conv ok?) (llm/send conv "Hello!")])
  (when ok?
    (let ((tokens (llm/get-tokens conv))
          (cost (llm/get-cost conv)))
      (print "Input tokens: " (car tokens))
      (print "Output tokens: " (cdr tokens))
      (print "Total cost: $" cost))))

[parameter] llm/use-provider
Parameter to get or set the current default provider. Re-export of current-provider from llm-provider.
;; Get current provider
(llm/use-provider)  ; => openai-provider

;; Set a different default provider
(llm/use-provider some-other-provider)
Module: llm-provider
Defines the provider abstraction layer.
[record] <llm-provider>

Record type for LLM providers with the following fields:
- name
- Symbol identifying the provider (e.g., 'openai).
- prepare-message
- Procedure (message include-file?) -> provider-format-message.
- build-payload
- Procedure (messages tools model temp max-tokens) -> payload-alist.
- call-api
- Procedure (endpoint payload) -> response-alist.
- parse-response
- Procedure (response-data) -> normalized-response.
- format-tool-result
- Procedure (tool-call-id result) -> tool-message.
- get-model-pricing
- Procedure (model-name) -> pricing-alist.
- extract-tool-calls
- Procedure (response-message) -> list-of-tool-calls.
[procedure] (make-llm-provider name prepare-message build-payload call-api parse-response format-tool-result get-model-pricing extract-tool-calls)

Constructor for provider records.
[procedure] (llm-provider? obj)

Predicate to check if obj is an <llm-provider> record.

[procedure] (llm-provider-name provider)
[procedure] (llm-provider-prepare-message provider)
[procedure] (llm-provider-build-payload provider)
[procedure] (llm-provider-call-api provider)
[procedure] (llm-provider-parse-response provider)
[procedure] (llm-provider-format-tool-result provider)
[procedure] (llm-provider-get-model-pricing provider)
[procedure] (llm-provider-extract-tool-calls provider)
Accessor procedures for provider record fields.
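An illustrative sketch of inspecting the bundled OpenAI provider with the predicate and accessors; the pricing result shown assumes the alist shape described under "Creating a New Provider" below:

(import llm-provider llm-openai)

(llm-provider? openai-provider)      ; => #t
(llm-provider-name openai-provider)  ; => openai

;; Accessors return the stored procedures, which can be called directly
((llm-provider-get-model-pricing openai-provider) "gpt-5-nano-2025-08-07")
; => e.g. ((input-price-per-1m . ...) (output-price-per-1m . ...))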
[parameter] current-provider

Parameter holding the current default provider. Initially #f until set by importing llm (which sets it to openai-provider).
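A minimal sketch of reading and setting the parameter directly:

(import llm-provider llm-openai)

(current-provider)                  ; => #f until llm (or your code) sets it
(current-provider openai-provider)  ; importing llm does this for you
(current-provider)                  ; => openai-provider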
Module: llm-openai
OpenAI provider implementation.
[constant] openai-provider

The OpenAI provider instance. This is the default provider when using the llm module.

[parameter] openai-http-client

Parameter holding the HTTP client procedure. Can be overridden for testing or custom HTTP handling. The procedure signature is (endpoint payload) -> response-alist.
;; Mock the HTTP client for testing
(openai-http-client
 (lambda (endpoint payload)
   '((choices . #(((message . ((content . "Test response")
                               (role . "assistant")))
                  (finish_reason . "stop"))))
     (usage . ((prompt_tokens . 10)
               (completion_tokens . 5))))))

[procedure] (openai-call-api endpoint payload)
Make an API request using the current openai-http-client.
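A sketch of a direct call; the endpoint URL and payload shape below are assumptions modeled on the OpenAI chat completions API, not constants exported by this egg:

(define response
  (openai-call-api
   "https://api.openai.com/v1/chat/completions"  ; assumed endpoint
   '((model . "gpt-5-nano-2025-08-07")
     (messages . #(((role . "user") (content . "Hello")))))))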
[constant] *openai-default-model*

Default model string: "gpt-5-nano-2025-08-07".

[constant] *openai-default-temperature*

Default temperature: 1.

[constant] *openai-default-max-tokens*

Default max completion tokens: 4000.
Module: llm-common
Shared utilities used by providers.
[procedure] (detect-mime-type path)

Detect MIME type from file extension. Supports JPEG, PNG, GIF, WebP, PDF, and common text files.
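For example (the exact strings returned are assumptions based on the conventional MIME names; "application/pdf" matches pdf-mime-type? below):

(detect-mime-type "photo.jpg")   ; => "image/jpeg" (assumed)
(detect-mime-type "report.pdf")  ; => "application/pdf"
(detect-mime-type "notes.txt")   ; => a text/* type such as "text/plain"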
[procedure] (image-mime-type? mime)

Returns #t if mime starts with "image/".

[procedure] (pdf-mime-type? mime)

Returns #t if mime is "application/pdf".

[procedure] (vision-mime-type? mime)

Returns #t if mime is an image or PDF (types supported by vision models).
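The predicates compose as described:

(image-mime-type? "image/png")        ; => #t
(pdf-mime-type? "application/pdf")    ; => #t
(vision-mime-type? "image/png")       ; => #t
(vision-mime-type? "application/pdf") ; => #t
(vision-mime-type? "text/plain")      ; => #f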
[procedure] (read-file-base64 path)

Read a file and return its contents as a base64-encoded string.

[procedure] (read-text-file path)

Read a text file and return its contents as a string.
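Typical use when preparing attachment content, assuming the files exist on disk:

;; Base64-encode an image for an API payload
(define image-data (read-file-base64 "photo.jpg"))

;; Read a text attachment verbatim
(define notes (read-text-file "notes.txt"))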
[procedure] (kebab->snake symbol)

Convert a kebab-case symbol to snake_case. Example: 'get-current-time becomes 'get_current_time.
Creating a New Provider
To add support for a new LLM backend, implement the provider interface:
(import llm-provider llm-common)

(define (my-prepare-message msg include-file?)
  ;; Convert internal message format to API-specific format
  msg)

(define (my-build-payload messages tools model temp max-tokens)
  ;; Build the API request body
  `((model . ,model)
    (messages . ,(list->vector messages))))

(define (my-call-api endpoint payload)
  ;; Make HTTP request, return response alist
  ...)

(define (my-parse-response response-data)
  ;; Return alist with: success, message, content, tool-calls,
  ;; finish-reason, input-tokens, output-tokens
  `((success . #t)
    (message . ...)
    (content . "response text")
    (tool-calls . #f)
    (finish-reason . "stop")
    (input-tokens . 100)
    (output-tokens . 50)))

(define (my-extract-tool-calls message)
  ;; Return list of ((id . "...") (name . "...") (arguments . "json-string"))
  '())

(define (my-format-tool-result tool-call-id result)
  ;; Return tool result message in API format
  `((role . "tool")
    (tool_call_id . ,tool-call-id)
    (content . ,(json->string result))))

(define (my-get-model-pricing model-name)
  ;; Return ((input-price-per-1m . N) (output-price-per-1m . M))
  '((input-price-per-1m . 1.00)
    (output-price-per-1m . 3.00)))

(define my-provider
  (make-llm-provider 'my-provider
                     my-prepare-message
                     my-build-payload
                     my-call-api
                     my-parse-response
                     my-format-tool-result
                     my-get-model-pricing
                     my-extract-tool-calls))

;; Use it
(llm/use-provider my-provider)

;; Or per-conversation
(llm/chat system: "Hello" provider: my-provider)
Examples
Simple Chat
(import llm)

(define conv (llm/chat system: "You are a helpful assistant."))

(let-values ([(conv ok?) (llm/send conv "Tell me a joke.")])
  (when ok?
    (print (llm/get-last-response conv))
    (print "Cost: $" (llm/get-cost conv))))
Multi-turn Conversation
(import llm)

(define conv (llm/chat system: "You are a math tutor."))

(let-values ([(conv ok?) (llm/send conv "What is calculus?")])
  (when ok?
    (print (llm/get-last-response conv)))
  (let-values ([(conv ok?) (llm/send conv "Can you give me a simple example?")])
    (when ok?
      (print (llm/get-last-response conv)))
    (print "Total tokens: " (llm/get-tokens conv))))
Tool Calling
(import llm srfi-19)

(llm/register-tool!
 'get-current-time
 '((type . "function")
   (function . ((name . "get_current_time")
                (description . "Get the current date and time")
                (parameters . ((type . "object")
                               (properties . ())
                               (required . #()))))))
 (lambda (params)
   `((success . #t)
     (time . ,(date->string (current-date) "~Y-~m-~d ~H:~M:~S")))))

(define conv
  (llm/chat system: "You can tell the user the current time."
            tools: '(get-current-time)))

(let-values ([(conv ok?) (llm/send conv "What time is it?")])
  (when ok?
    (print (llm/get-last-response conv))))
License
BSD-3-Clause
Version History
- 0.0.1
- Initial release with OpenAI support.