
Configure the Deployment

With the cluster ready, use values.yaml to configure Cosine. This file becomes the source of truth for what’s running in your environment.

Set the tag for each service to the Cosine version you want to run so every component stays in lockstep:

```yaml
image:
  repository: registry.cosine.enterprises/azure/api
  pullPolicy: IfNotPresent
  tag: REPLACE_WITH_IMAGE_TAG
```

Repeat for dashboard, api, ccode, and parser.
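As a quick sanity check after editing, a shell one-liner can confirm that every service pins the same tag. This is a sketch that assumes the only `tag:` keys in values.yaml are the per-service image tags:

```shell
# Count the distinct image tags declared in values.yaml.
# (Assumption: "tag:" appears only under each service's image block.)
awk '$1 == "tag:" {print $2}' values.yaml | sort -u | wc -l
# A count of 1 means all services reference the same image tag.
```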

Provide the inference endpoint and the model you want to use. These settings control the default model Cosine will reach for:

```yaml
- name: LITELLM_BASE_URL
  value: "https://RESOURCE_NAME.openai.azure.com/openai/v1"
- name: LITELLM_GENIE_MODEL_OVERRIDE
  value: "gpt-5.2"
- name: LITELLM_API_KEY
  valueFrom:
    secretKeyRef:
      name: llm-auth
      key: token
```
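The `llm-auth` secret referenced above must exist in the cluster before the pods start. One way to create it is with kubectl; the namespace `cosine` here is an assumption, so adjust it to your release namespace:

```shell
# Create the secret that the LITELLM_API_KEY env var pulls from.
# The secret name and key must match the secretKeyRef above
# (name: llm-auth, key: token).
kubectl create secret generic llm-auth \
  --namespace cosine \
  --from-literal=token=REPLACE_WITH_YOUR_API_KEY
```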

Set the dashboard and API URLs, then update the ingress hostnames so the UI and backend agree on their public address:

```yaml
dashboard:
  env:
    - name: DASHBOARD_URL
      value: "https://REPLACE_WITH_YOUR_DOMAIN"
    - name: API_URL
      value: "https://REPLACE_WITH_YOUR_DOMAIN/api"
    - name: NEXT_PUBLIC_API_URL
      value: "https://REPLACE_WITH_YOUR_DOMAIN/api"

ingress:
  dashboardHostname: DOMAIN
  tls:
    - secretName: domain-tls
      hosts:
        - DOMAIN

env:
  - name: API_URL
    value: "https://REPLACE_WITH_YOUR_DOMAIN/api"
  - name: DASHBOARD_URL
    value: "https://REPLACE_WITH_YOUR_DOMAIN"
  - name: CORS_ORIGIN
    value: "https://REPLACE_WITH_YOUR_DOMAIN"
  - name: DOMAIN
    value: "REPLACE_WITH_YOUR_DOMAIN"
```
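Because the same domain appears in several places, a single substitution pass keeps them all consistent. A minimal sketch, assuming your values file is named values.yaml and using a hypothetical domain:

```shell
# Replace every occurrence of the placeholder with your real domain.
DOMAIN=cosine.example.com   # hypothetical; use your own domain
sed -i.bak "s/REPLACE_WITH_YOUR_DOMAIN/${DOMAIN}/g" values.yaml
```

The `-i.bak` form works with both GNU and BSD sed and leaves a backup of the original file.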

Set JWT_SECRET and COOKIE_SECRET to random 64‑character hex strings (32 bytes of entropy each). You can generate them with:

```shell
openssl rand -hex 32
```

Then update values.yaml:

```yaml
- name: JWT_SECRET
  value: "REPLACE_WITH_VALUE"
- name: COOKIE_SECRET
  value: "REPLACE_WITH_VALUE"
```
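To avoid copy-paste mistakes, both secrets can be generated and sanity-checked in one step. A sketch (the shell variable names here are for illustration only):

```shell
# Generate two independent 32-byte (64 hex character) secrets.
JWT_SECRET=$(openssl rand -hex 32)
COOKIE_SECRET=$(openssl rand -hex 32)

# Sanity checks: correct length, and the two values differ.
[ "${#JWT_SECRET}" -eq 64 ] || echo "JWT_SECRET has wrong length"
[ "$JWT_SECRET" != "$COOKIE_SECRET" ] || echo "secrets must differ"
```

Paste each value into the corresponding `value:` field in values.yaml, or wire them through a Kubernetes secret if you prefer not to keep them in the file.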