base: develop
Ai config #358
Conversation
```python
logger.error("openai_api_key not provided")
return jsonify({"message": "openai_api_key not provided"}), 400
openai_api_key: str = data["openai_api_key"]
logger.debug("OpenAI API Key %s", openai_api_key[:5])
```
Check failure: Code scanning / CodeQL — Clear-text logging of sensitive information (High): sensitive data (password).
Copilot Autofix
AI about 23 hours ago
In general, the fix is to avoid logging any portion of sensitive credentials, even with masking or truncation, and instead log non‑sensitive metadata (e.g., that initialization occurred, which provider, maybe a hashed or opaque identifier if truly needed). Here, the simplest and safest change is to remove the API key value from the debug message entirely and replace it with a generic message that does not include the key contents.
Concretely, in services/chatbot/src/chatbot/chat_api.py, within the init() function, change line 80 from logger.debug("OpenAI API Key %s", openai_api_key[:5]) to a message like logger.debug("Received OpenAI API key for session %s", session_id) or just logger.debug("Received OpenAI API key"). This preserves the intent of having a debug trace that the key was provided, without exposing any part of the secret. No new imports or helper methods are required; we only change the log message and its parameters. The rest of the logic (storing and using the key) remains unchanged.
```diff
@@ -77,7 +77,7 @@
     logger.error("openai_api_key not provided")
     return jsonify({"message": "openai_api_key not provided"}), 400
     openai_api_key: str = data["openai_api_key"]
-    logger.debug("OpenAI API Key %s", openai_api_key[:5])
+    logger.debug("Received OpenAI API key for session %s", session_id)
     await store_api_key(session_id, openai_api_key, provider)
     return jsonify({"message": "Initialized"}), 200
 if provider == "anthropic":
```
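The autofix text also mentions logging "a hashed or opaque identifier if truly needed". When logs must distinguish between keys without exposing them, a non-reversible fingerprint is one way to do that. A minimal sketch using only the standard library; `key_fingerprint` is a hypothetical helper, not part of this codebase:

```python
import hashlib
import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger("chat_api")


def key_fingerprint(secret: str) -> str:
    """Return a short, non-reversible identifier for a secret.

    SHA-256 is one-way, so the log line cannot leak the key itself,
    while the same key always yields the same fingerprint, which
    allows correlating log entries for a given key.
    """
    return hashlib.sha256(secret.encode("utf-8")).hexdigest()[:8]


# Instead of logger.debug("OpenAI API Key %s", openai_api_key[:5]):
openai_api_key = "sk-example-not-a-real-key"
logger.debug(
    "OpenAI API key received (fingerprint %s)", key_fingerprint(openai_api_key)
)
```

Unlike `openai_api_key[:5]`, the fingerprint reveals nothing about the key's actual characters, so CodeQL's clear-text-logging query has no sensitive value flowing into the sink.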
```python
logger.error("anthropic_api_key not provided")
return jsonify({"message": "anthropic_api_key not provided"}), 400
anthropic_api_key: str = data["anthropic_api_key"]
logger.debug("Anthropic API Key %s", anthropic_api_key[:5])
```
Check failure: Code scanning / CodeQL — Clear-text logging of sensitive information (High): sensitive data (password).
Copilot Autofix
AI about 23 hours ago
In general, to fix clear-text logging of sensitive information, remove the sensitive value from log messages, or replace it with non-sensitive metadata (e.g., a constant message like “API key provided” or a derived, non-reversible identifier). Do not log any part of API keys, passwords, tokens, or similar secrets, even at debug level.
For this specific code, the minimal fix without changing functionality is to adjust the logger.debug call on line 95 so that it doesn’t include anthropic_api_key[:5] at all. The log can still indicate that an Anthropic API key was received or that initialization is proceeding, but must not print any portion of the key. We do not need new imports, helper methods, or structural changes; just update the logging statement.
Concretely, in services/chatbot/src/chatbot/chat_api.py, change line 95 from `logger.debug("Anthropic API Key %s", anthropic_api_key[:5])` to something like `logger.debug("Anthropic API Key received for session %s", session_id)`. This preserves useful debugging information (which session is being initialized) while eliminating the sensitive data from the sink. No other lines require modification for this specific issue.
```diff
@@ -92,7 +92,7 @@
     logger.error("anthropic_api_key not provided")
     return jsonify({"message": "anthropic_api_key not provided"}), 400
     anthropic_api_key: str = data["anthropic_api_key"]
-    logger.debug("Anthropic API Key %s", anthropic_api_key[:5])
+    logger.debug("Anthropic API Key received for session %s", session_id)
     await store_api_key(session_id, anthropic_api_key, provider)
     return jsonify({"message": "Initialized"}), 200
 error = _validate_provider_env(provider)
```
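Fixing each call site individually still leaves room for the next contributor to reintroduce the bug. As a defense-in-depth complement, a logging filter can redact anything that looks like a provider key before it reaches a handler. A sketch using only the standard library; the `sk-`/`ak-` pattern is an assumption here, since real key formats vary by provider:

```python
import io
import logging
import re

# Hypothetical pattern; adjust to the key formats your providers actually use.
API_KEY_RE = re.compile(r"\b(?:sk|ak)-[A-Za-z0-9_-]{8,}\b")


class RedactSecretsFilter(logging.Filter):
    """Replace anything resembling an API key with a placeholder."""

    def filter(self, record: logging.LogRecord) -> bool:
        # Render the message once, redact it, and drop the original args
        # so the raw key can no longer be formatted back in.
        record.msg = API_KEY_RE.sub("[REDACTED]", record.getMessage())
        record.args = ()
        return True


logger = logging.getLogger("chat_api_demo")
logger.setLevel(logging.DEBUG)
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.addFilter(RedactSecretsFilter())
logger.addHandler(handler)

logger.debug("Anthropic API Key %s", "sk-verysecretvalue123")
print(stream.getvalue().strip())  # Anthropic API Key [REDACTED]
```

Note that CodeQL will likely still flag the call site, since its taint tracking does not model the filter; the fixes in the diffs above remain the primary remediation.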
```diff
 if provider in {"openai", "anthropic"} and provider_api_key:
     logger.debug(
-        "OpenAI API Key for session %s: %s", session_id, openai_api_key[:5]
+        "Provider API key for session %s: %s", session_id, provider_api_key[:5]
```
Check failure: Code scanning / CodeQL — Clear-text logging of sensitive information (High). Five dataflow variants report "This expression logs sensitive data (password)" for this call.
Copilot Autofix
AI about 23 hours ago
General approach: Do not log API keys (or any part of them) at all. Instead, log nonsensitive metadata, such as whether a key is configured/initialized and for which session or provider. Avoid emitting environment-derived secrets into logs anywhere.
Concrete fix for this codebase:

- In `chat_api.py`'s `state()` endpoint, replace the `logger.debug` call that includes `provider_api_key[:5]` with a message that indicates only the presence of a key (e.g., `True`/`False`) and the session ID. This removes the direct flow of secret data into the logger while preserving observability about initialization.
- No changes are needed in `config.py` or `session_service.py` for logging, because they currently only read and return secrets, not log them. CodeQL's variants 2–5 trace the same sensitive sources, but the only actual sink is the logging call in `chat_api.py`. Once that sink no longer contains the key, all variants will be addressed.

Details:

- File: `services/chatbot/src/chatbot/chat_api.py`, region around lines 145–155.
- Change from `logger.debug("Provider API key for session %s: %s", session_id, provider_api_key[:5])` to `logger.debug("Provider API key configured for session %s: %s", session_id, bool(provider_api_key))`.
- This keeps the debug log informative (you can still see whether a key is configured) without ever including any part of the secret value.

No new imports, methods, or definitions are required.
```diff
@@ -150,7 +150,9 @@
     provider_api_key = await get_api_key(session_id)
     if provider in {"openai", "anthropic"} and provider_api_key:
         logger.debug(
-            "Provider API key for session %s: %s", session_id, provider_api_key[:5]
+            "Provider API key configured for session %s: %s",
+            session_id,
+            bool(provider_api_key),
         )
     chat_history = await get_chat_history(session_id)
     # Limit chat history to last 20 messages
```
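The presence-only pattern from the diff above can also be isolated into a small helper, which makes the rule ("log whether a key exists, never its contents") easy to reuse across endpoints. A minimal sketch; `log_key_state` is illustrative and not part of the PR:

```python
import logging
from typing import Optional

logger = logging.getLogger("chat_api")


def log_key_state(session_id: str, provider_api_key: Optional[str]) -> bool:
    """Log whether a key is configured for the session, never its contents."""
    # Mirrors bool(provider_api_key): None and "" both count as not configured.
    configured = provider_api_key is not None and provider_api_key != ""
    logger.debug(
        "Provider API key configured for session %s: %s", session_id, configured
    )
    return configured
```

Treating an empty string as "not configured" is usually the right call for credentials, since an empty key would fail at the provider anyway.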
Test Results: 93 tests, 93 ✅, 2s ⏱️. Results for commit eab5398.
☂️ Python Coverage — Overall Coverage. New files: no new covered files. Modified files: no covered modified files.
Description
Add support for using other AI providers.
Testing
Local
Documentation
Make sure that corresponding changes are documented in this repository.