# Brunix Assistance Engine
The Brunix Assistance Engine is a high-performance, gRPC-powered AI orchestration service. It serves as the core intelligence layer for the Brunix ecosystem, integrating advanced RAG (Retrieval-Augmented Generation) capabilities with real-time observability.
This project is a strategic joint development:
- 101OBEX Corp: Infrastructure, System Architecture, and the proprietary AVAP Technology stack.
- MrHouston: Advanced LLM Fine-tuning, Model Training, and Prompt Engineering.
## System Architecture (Hybrid Dev Mode)
The engine runs locally for development but connects to the production-grade infrastructure in the Vultr Cloud (Devaron Cluster) via secure kubectl tunnels.
```mermaid
graph TD
    subgraph Local_Workstation [Developer]
        BE[Brunix Assistance Engine - Docker]
        KT[Kubectl Port-Forward Tunnels]
    end
    subgraph Vultr_K8s_Cluster [Production - Devaron Cluster]
        OL[Ollama Light Service - LLM]
        EDB[(Elasticsearch Vector DB)]
        PG[(Postgres - Langfuse Data)]
        LF[Langfuse UI - Web]
    end
    BE -- localhost:11434 --> KT
    BE -- localhost:9200 --> KT
    BE -- localhost:5432 --> KT
    KT -- Secure Link --> OL
    KT -- Secure Link --> EDB
    KT -- Secure Link --> PG
    Local_Workstation -- Browser --> LF
```
## Project Structure
```text
.
├── Dockerfile            # Container definition for the Engine
├── README.md             # System documentation & dev guide
├── changelog             # Version tracking and release history
├── docker-compose.yaml   # Local orchestration for the dev environment
├── protos/
│   └── brunix.proto      # Protocol Buffers: the source of truth for the API
└── src/
    └── server.py         # Core logic: gRPC server & RAG orchestration
```
## Data Flow & RAG Orchestration
The following diagram illustrates the sequence of a single AskAgent request, detailing the retrieval and generation phases through the secure tunnel.
```mermaid
sequenceDiagram
    participant U as External Client (gRPCurl/App)
    participant E as Brunix Engine (Local Docker)
    participant T as Kubectl Tunnel
    participant V as Vector DB (Vultr)
    participant O as Ollama Light (Vultr)
    U->>E: AskAgent(query, session_id)
    Note over E: Start Langfuse Trace
    E->>T: Search Context (Embeddings)
    T->>V: Query Index [avap_manuals]
    V-->>T: Return Relevant Chunks
    T-->>E: Contextual Data
    E->>T: Generate Completion (Prompt + Context)
    T->>O: Stream Tokens (qwen2.5:1.5b)
    loop Token Streaming
        O-->>T: Token
        T-->>E: Token
        E-->>U: gRPC Stream Response {text, avap_code}
    end
    Note over E: Close Langfuse Trace
```
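The retrieval and generation phases above can be sketched as two pure helpers: one builds the Elasticsearch kNN query body, the other folds the retrieved chunks into the prompt sent to Ollama. The function names, the `embedding`/`content` field names, and the prompt template are illustrative assumptions, not the engine's actual code.

```python
def build_search_body(query_vector, k=4):
    """kNN search body for the avap_manuals index (field names assumed)."""
    return {
        "knn": {
            "field": "embedding",
            "query_vector": query_vector,
            "k": k,
            "num_candidates": 10 * k,  # widen the candidate pool for recall
        },
        "_source": ["content"],  # only the text chunks are needed downstream
    }


def assemble_prompt(question, chunks):
    """Fold retrieved chunks into a single grounded prompt for the LLM."""
    context = "\n---\n".join(chunks)
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
```

Keeping these steps as pure functions makes the retrieval and prompting logic unit-testable without a live tunnel.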
## Development Setup

### 1. Prerequisites

- Docker & Docker Compose
- gRPCurl (`brew install grpcurl`)
- Access credentials: ensure the kubeconfig file `./kubernetes/ivar.yaml` is present (the tunnel commands below reference this path).
### 2. Observability Setup (Langfuse)
The engine utilizes Langfuse for end-to-end tracing and performance monitoring.
- Access the Dashboard: http://45.77.119.180
- Create a project and generate API Keys in Settings.
- Configure your local `.env` file:
```env
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
LANGFUSE_HOST=http://45.77.119.180
```
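If you prefer not to pull in an extra dependency, these variables can be loaded with a few lines of standard-library Python; this loader and its behavior are a sketch of ours, not part of the engine:

```python
import os


def load_env(path=".env"):
    """Load simple KEY=VALUE lines into os.environ.

    Blank lines, comments, and malformed lines are skipped; variables
    already set in the environment are not overwritten.
    """
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

Calling `load_env()` once at startup is enough for clients that read their keys from the environment.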
### 3. Infrastructure Tunnels
Open a terminal and establish the connection to the Devaron Cluster:
```bash
# 1. AI Model Tunnel (Ollama)
kubectl port-forward --address 0.0.0.0 svc/ollama-light-service 11434:11434 -n brunix --kubeconfig ./kubernetes/ivar.yaml &

# 2. Knowledge Base Tunnel (Elasticsearch)
kubectl port-forward --address 0.0.0.0 svc/brunix-vector-db 9200:9200 -n brunix --kubeconfig ./kubernetes/ivar.yaml &

# 3. Observability DB Tunnel (PostgreSQL)
kubectl port-forward --address 0.0.0.0 svc/brunix-postgres 5432:5432 -n brunix --kubeconfig ./kubernetes/ivar.yaml &
```
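Before launching the engine, it is worth verifying that all three forwarded ports actually answer. A small self-contained check (the script and its helper are ours, not part of the repo):

```python
import socket

# Ports forwarded by the kubectl tunnels above.
TUNNELED_PORTS = {
    "Ollama (LLM)": 11434,
    "Elasticsearch (vectors)": 9200,
    "PostgreSQL (Langfuse)": 5432,
}


def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    for name, port in TUNNELED_PORTS.items():
        status = "up" if port_open("localhost", port) else "DOWN"
        print(f"{name:24} localhost:{port} -> {status}")
```

A `DOWN` entry usually means the corresponding `kubectl port-forward` died and needs to be restarted.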
### 4. Launch the Engine

```bash
docker-compose up -d --build
```
## Testing & Debugging
The service is exposed on port 50052 with gRPC Reflection enabled.
### Streaming Query Example

```bash
grpcurl -plaintext \
  -d '{"query": "Hola Brunix, ¿qué es AVAP?", "session_id": "dev-test-123"}' \
  localhost:50052 \
  brunix.AssistanceEngine/AskAgent
```
## API Contract (Protobuf)

To update the communication interface, modify `protos/brunix.proto` and regenerate the stubs:

```bash
python -m grpc_tools.protoc -I./protos --python_out=./src --grpc_python_out=./src ./protos/brunix.proto
```
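Inferring from the `AskAgent` call and the streamed `{text, avap_code}` fields shown in the testing section, the contract presumably resembles the sketch below. This is an illustrative reconstruction, not the actual contents of `protos/brunix.proto`; field numbers and message names are assumptions.

```proto
syntax = "proto3";
package brunix;

// Server-streaming RAG endpoint (shape inferred from the grpcurl example).
service AssistanceEngine {
  rpc AskAgent (AskAgentRequest) returns (stream AskAgentReply);
}

message AskAgentRequest {
  string query = 1;       // user question
  string session_id = 2;  // conversation / trace correlation id
}

message AskAgentReply {
  string text = 1;       // streamed answer fragment
  string avap_code = 2;  // accompanying AVAP code payload, if any
}
```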
## Security & Intellectual Property
- Data Privacy: All LLM processing and vector searches are conducted within a private Kubernetes environment.
- Proprietary Technology: This repository contains the AVAP Technology stack (101OBEX) and specialized training logic (MrHouston). Unauthorized distribution is prohibited.