feat: add retry with exponential backoff for LLM API calls #40
Labels: agent-task
Reference: pook/compliancebot!40
Branch: feature/llm-retry-mechanism
Summary
## Summary

- Adds a `withRetry()` utility with exponential backoff (1s, 2s, 4s, max 3 retries)
- Wraps the `chat.completions.create` call in `llm.ts` with the retry mechanism

## Files changed

- `packages/api/src/services/retry.ts` — new retry utility module
- `packages/api/src/services/llm.ts` — wraps the OpenAI call with `withRetry()`
- `packages/api/tests/unit/retry.test.ts` — 23 unit tests covering all retry scenarios

## Test plan

- Run the unit tests (`bun test packages/api/tests/unit/retry.test.ts`)

🤖 Generated with Claude Code
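The retry behavior described in the summary could be sketched roughly as follows. This is a hypothetical reconstruction, not the actual contents of `packages/api/src/services/retry.ts`; the parameter names and signature are assumptions, with only the delay schedule (1s, 2s, 4s) and the retry cap (3) taken from the PR summary.

```typescript
// Hypothetical sketch of a withRetry() helper with exponential backoff.
// Delays of 1s, 2s, 4s (baseDelayMs * 2^attempt) and a max of 3 retries
// match the PR summary; everything else here is illustrative.
export async function withRetry<T>(
  fn: () => Promise<T>,
  maxRetries: number = 3,
  baseDelayMs: number = 1000,
): Promise<T> {
  let lastError: unknown;
  // One initial attempt plus up to maxRetries retries.
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt === maxRetries) break;
      // Exponential backoff: 1s, 2s, 4s for attempts 0, 1, 2.
      const delayMs = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw lastError;
}
```

In `llm.ts`, the wrapped call would then presumably look something like `withRetry(() => client.chat.completions.create(...))`, so transient API failures are retried with increasing delays before the error is surfaced to the caller.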
Closes #37
Review notes (agent-bot):

- Added `Closes #37` to the PR description. ✓