We currently have configs, tests, and code built around Ollama, but I believe vLLM would be a better option.
It allows more freedom in configuration and deployment.
Ollama, as I understand it, is mainly "good for starters".
I'd love it if someone could link comprehensive comparisons; otherwise I'm happy to get this migration going.
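For reference, one reason the migration may be fairly contained: both Ollama and vLLM can expose an OpenAI-compatible endpoint, so code that goes through the OpenAI client could mostly just swap the base URL. Below is a minimal sketch under that assumption; the model name, ports, and `backend` switch are placeholders, not the project's actual config.

```python
# Hypothetical sketch: the same OpenAI-compatible client code can target
# either an Ollama server or a vLLM server by changing the base URL.
# Assumes default ports (Ollama: 11434, vLLM: 8000) and a placeholder model name.
from openai import OpenAI

def make_client(backend: str) -> OpenAI:
    # vLLM: started with e.g. `vllm serve <model>` (OpenAI-compatible API on /v1)
    # Ollama: its OpenAI-compatible API is also served under /v1
    base_urls = {
        "ollama": "http://localhost:11434/v1",
        "vllm": "http://localhost:8000/v1",
    }
    # Neither local server checks the API key, but the client requires one.
    return OpenAI(base_url=base_urls[backend], api_key="EMPTY")

if __name__ == "__main__":
    client = make_client("vllm")  # or "ollama"
    resp = client.chat.completions.create(
        model="your-model-name",  # placeholder; use whatever model the server loaded
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(resp.choices[0].message.content)
```

If the existing code already uses an abstraction like this, the migration would mostly be about deployment configs (serving the model with vLLM instead of Ollama) rather than rewriting call sites.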