contenox/runtime: An open-source state machine for GenAI workflows in Go
Alexander Ertli
Aug 30, 2025
to golang-nuts
Hi everyone,
I'm excited to share a project I've been heads-down on for the better part of this year: an LLM backend management and orchestration API written in Go.
After a few failed attempts and several months of work, it's reached a point where I'd genuinely value feedback from developers who understand this space.
In a nutshell, it's a system that manages multiple LLM backends (Ollama, vLLM, OpenAI, etc.) and executes complex, conditional workflows ("Task Chains") that can branch on model output, call external hooks, and handle a variety of data types.
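To make that concrete, here is a rough, hypothetical sketch of how such a conditional task chain could be modeled in Go. The type and field names are purely illustrative, not the project's actual schema:

```go
package main

import "fmt"

// Illustrative types only; not the real contenox/runtime schema.
type Task struct {
	ID      string            // step identifier
	Prompt  string            // prompt template sent to the selected backend
	ParseAs string            // how to interpret the response: "number", "score", "range", ...
	Next    map[string]string // branch: condition on parsed result -> next task ID
	Hook    string            // optional external hook to call instead of a model
}

type TaskChain struct {
	ID    string
	Start string
	Tasks []Task
}

func main() {
	// A two-step chain: score a ticket as a number, then branch on the result.
	chain := TaskChain{
		ID:    "triage-ticket",
		Start: "score",
		Tasks: []Task{
			{
				ID:      "score",
				Prompt:  "Rate the urgency of this support ticket from 1 to 10: {{.input}}",
				ParseAs: "number",
				Next:    map[string]string{">= 7": "escalate", "default": "auto-reply"},
			},
			{ID: "escalate", Hook: "pagerduty-webhook"},
			{ID: "auto-reply", Prompt: "Draft a polite acknowledgement for: {{.input}}"},
		},
	}
	fmt.Printf("chain %q starts at task %q with %d tasks\n", chain.ID, chain.Start, len(chain.Tasks))
}
```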
Key Components:
- Unified API: Manage models, backends, and provider configs through a single OpenAPI 3.1 spec.
- Affinity Groups: Control exactly which models are available to which backends for routing and access control.
- Powerful Task Engine: Define workflows with multiple steps that can conditionally branch, parse responses (as numbers, scores, ranges, etc.), and integrate with external systems via hooks.
- OpenAI-Compatible: Includes endpoints that mimic the OpenAI API, making it easier to integrate with existing tools (see the sketch after this list).
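Because the endpoints mimic the OpenAI API, any OpenAI-style client should be able to talk to it. Below is a minimal sketch using only the Go standard library; the base URL, port, and model name are assumptions about a local deployment, not documented defaults:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	// Assumed address of a locally running instance plus the standard
	// OpenAI-style chat completions path; adjust to your deployment.
	url := "http://localhost:8080/v1/chat/completions"

	body, err := json.Marshal(map[string]any{
		"model": "llama3", // whichever model your configured backends expose
		"messages": []map[string]string{
			{"role": "user", "content": "Summarize this in one sentence: Go is fun."},
		},
	})
	if err != nil {
		log.Fatal(err)
	}

	resp, err := http.Post(url, "application/json", bytes.NewReader(body))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	out, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Status)
	fmt.Println(string(out))
}
```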
It's Apache 2.0 licensed and available on GitHub.
I'd be incredibly grateful if you could take a look, star it if it seems interesting, and open an issue with any thoughts, feedback, or questions, no matter how small.