Main idea is now working :)
Using OpenAI for TTS, and Groq (in place of Ollama) for fast inference.
.env (2 changed lines)
@@ -14,7 +14,7 @@ AIDER_MODEL=
 AIDER_4=false
 #AIDER_35TURBO=

 # OPENAI_API_KEY=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
 # OPENAI_API_BASE=https://api.deepseek.com/v1
 # OPENAI_API_KEY=sk-99df7736351f4536bd72cd64a416318a
 # AIDER_MODEL=deepseek-coder #deepseek-coder, deepseek-chat
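The commented-out lines above show the same pattern used for DeepSeek: point `OPENAI_API_BASE` at an OpenAI-compatible endpoint. A minimal sketch of how the commit's setup (OpenAI for TTS, Groq for inference) could look in `.env` — the Groq endpoint is its documented OpenAI-compatible URL, but the key placeholder and model name here are illustrative assumptions, not values from this commit:

# Real OpenAI key, used for text-to-speech
OPENAI_API_KEY=sk-...

# Route chat completions through Groq's OpenAI-compatible endpoint
# (key and model below are placeholders, not from this commit)
# OPENAI_API_BASE=https://api.groq.com/openai/v1
# AIDER_MODEL=llama3-70b-8192

Keeping the Groq lines commented until needed mirrors how the DeepSeek settings are toggled in this file.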