
OpenAI and Codex SDKs

LeafEnterprise uses OpenAI-facing SDKs behind backend-owned interfaces.

Azure OpenAI Runtime

LeafEnterprise/ai/azure_openai_client.py is the SDK-aware Azure OpenAI client path. Existing deployments keep LEAFENTERPRISE_AI_TRANSPORT=chat_completions as the compatibility default; the responses transport is opt-in and should only be enabled where the Azure/OpenAI runtime supports it.

OpenAI Agents SDK

LeafEnterprise/agents/runner.py is the first OpenAI Agents SDK execution hook for the internal workbench ledger.

Execution rules:

  • run only after task policy allows the lane, provider, source systems, and mutation level;
  • expose backend tools for task context, validation posture, ITR evidence bundles, search documents, and SQL validation;
  • record explicit blocked status if the SDK is missing;
  • never claim a write occurred unless a backend tool confirms it.
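The execution rules above can be sketched as a guard around the runner entry point. The policy shape, status strings, and the `agents` import probe here are hypothetical, not the actual runner.py interface:

```python
import importlib.util
from dataclasses import dataclass

@dataclass
class TaskPolicy:
    lane_allowed: bool
    provider_allowed: bool
    sources_allowed: bool
    mutation_allowed: bool

def run_agent_task(policy: TaskPolicy) -> dict:
    """Gate execution on task policy, and record an explicit blocked
    status when the OpenAI Agents SDK is not importable."""
    if importlib.util.find_spec("agents") is None:
        return {"status": "blocked", "reason": "agents SDK missing"}
    if not (policy.lane_allowed and policy.provider_allowed
            and policy.sources_allowed and policy.mutation_allowed):
        return {"status": "blocked", "reason": "policy denied"}
    # Hand off to the SDK runner here. Writes are reported only
    # after a backend tool confirms them, per the rules above.
    return {"status": "started"}
```

The key design point is that a missing SDK produces an explicit blocked record in the ledger rather than a silent failure.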

Agent Workers

Agent work that becomes long-running, source-touching, or mutation-capable should move through the backend worker model:

  • Agent task ledger: records request, lane, policy posture, source systems, status, traces, and artifacts.
  • SQL guard: validates generated SQL before execution tools are allowed.
  • Service Bus dispatch: moves approved work to worker queues such as agent-tasks.
  • AKS lane worker: runs durable work with lane-scoped identity, source credentials, and artifact output.
  • Trace/artifact ledger: proves what ran, what was blocked, and what output can be trusted.
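A minimal sketch of the SQL-guard step in the model above. The function name and prefix allow-list are illustrative; a production guard would parse the statement rather than prefix-match:

```python
# Statement heads a read-only lane may execute; anything else is
# held back from execution tools unless the lane permits mutation.
_READ_ONLY_PREFIXES = ("select", "with")

def sql_guard(sql: str, mutation_allowed: bool) -> bool:
    """Return True when generated SQL may proceed to execution tools."""
    stripped = sql.strip()
    if not stripped:
        return False
    head = stripped.split(None, 1)[0].lower()
    if head in _READ_ONLY_PREFIXES:
        return True
    # Mutation statements (INSERT/UPDATE/DELETE/DDL) require an
    # explicitly mutation-capable lane.
    return mutation_allowed
```

The guard runs before Service Bus dispatch, so a blocked statement never reaches a worker queue.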

Agent SDKs are support tools. They do not replace deterministic formulas, evidence gates, or source-system ownership.

Codex SDK

The root package.json includes @openai/codex-sdk and exposes npm run codex:job for engineering automations. Codex SDK work should remain backend/operator scoped and should not bypass LeafEnterprise policy, trace, or artifact rules.

Fallback Rule

AI provider failures must not break deterministic business logic. Module copilots and optimization flows must keep a deterministic fallback path and surface provider limitations clearly.
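The fallback rule can be sketched as a wrapper that never lets a provider failure reach the deterministic path; the function names here are illustrative:

```python
from typing import Callable, Optional, Tuple, TypeVar

T = TypeVar("T")

def with_deterministic_fallback(
    provider_call: Callable[[], T],
    deterministic: Callable[[], T],
) -> Tuple[T, Optional[str]]:
    """Run the AI provider call; on any failure, return the
    deterministic result plus a note surfacing the limitation."""
    try:
        return provider_call(), None
    except Exception as exc:  # provider outages must not break business logic
        return deterministic(), f"provider unavailable: {exc}"
```

Callers always get a usable result; the second element is the limitation surfaced to the user, per the rule above.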