Commit d581e7ba authored by Jan Reimes

feat(ai): implement TDC_AI_LLM_API_KEY precedence over provider-specific keys

- Add os import to summarize.py
- Read TDC_AI_LLM_API_KEY from environment in LiteLLMClient.complete()
- Pass api_key to litellm.completion() - takes precedence over provider-specific env vars
- Update documentation to clarify API key precedence and usage
- Add troubleshooting section explaining the alternative API key approach
parent 74d4a1bd
+9 −1
@@ -52,7 +52,7 @@ Configure AI processing via environment variables (see `.env.example`):
```bash
# LLM Configuration
TDC_AI_LLM_MODEL=openrouter/openrouter/free    # Default: free tier via OpenRouter
-TDC_AI_LLM_API_KEY=your-api-key                # Required for cloud providers
+TDC_AI_LLM_API_KEY=your-api-key                # Optional: takes precedence over provider-specific keys
TDC_AI_LLM_API_BASE=                           # Optional: custom endpoint

# Embedding Model
@@ -447,6 +447,14 @@ echo $OPENAI_API_KEY

**Solution:** LiteLLM expects the API key in a standard environment variable named `<PROVIDER>_API_KEY`. See the [Model Providers](#model-providers) table for the correct variable name for each provider.

**Alternative:** You can use `TDC_AI_LLM_API_KEY` as a universal API key that takes precedence over provider-specific keys. This is useful when you want to use a single API key across different providers (e.g., via OpenRouter or a proxy service):

```bash
export TDC_AI_LLM_API_KEY=your-key-here
```

If `TDC_AI_LLM_API_KEY` is set, it will be used instead of the provider-specific key (`OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, etc.).

### Embedding Model Issues

**Problem:** `OSError: No sentence-transformers model found`
+5 −0
@@ -6,6 +6,7 @@ import hashlib
import importlib
import json
import logging
import os
import re

from tdoc_crawler.ai.config import AiConfig
@@ -117,6 +118,9 @@ class LiteLLMClient:
            raise RuntimeError(msg)

        try:
            # Check for TDC_AI_LLM_API_KEY - takes precedence over provider-specific keys
            api_key = os.environ.get("TDC_AI_LLM_API_KEY")

            response = self._client.completion(
                model=model or AiConfig().llm_model,
                messages=[
@@ -124,6 +128,7 @@ class LiteLLMClient:
                    {"role": "user", "content": prompt},
                ],
                max_tokens=max_tokens,
                api_key=api_key,  # Pass TDC_AI_LLM_API_KEY if set, otherwise None (LiteLLM uses provider-specific env vars)
            )
            return response.choices[0].message.content
        except Exception as e:
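
The key-passing behavior in the patched `complete()` can be exercised without the real `litellm` dependency. A minimal sketch using a stand-in client (`StubLiteLLM` and `complete_with_precedence` are illustrative names, not part of the codebase):

```python
import os

class StubLiteLLM:
    """Stand-in for litellm: records the api_key it was handed."""
    def __init__(self):
        self.last_api_key = "unset"

    def completion(self, model, messages, max_tokens, api_key=None):
        self.last_api_key = api_key
        return {"ok": True}

def complete_with_precedence(client, prompt):
    # Mirrors the patched LiteLLMClient.complete(): pass the universal
    # key when set, otherwise None so the library falls back to its
    # provider-specific environment variables.
    api_key = os.environ.get("TDC_AI_LLM_API_KEY")
    return client.completion(
        model="openrouter/openrouter/free",
        messages=[{"role": "user", "content": prompt}],
        max_tokens=256,
        api_key=api_key,
    )
```

When `TDC_AI_LLM_API_KEY` is set, the stub sees that value; when it is unset, the stub receives `None`, which is exactly the fallback path the inline comment in the diff describes.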