feat: Add LM Studio Support and Reasoning Model Panel #576
base: master
Conversation
How do I use this?
I am getting this error: 2025-01-29 12:18:06 [ERROR]
Set the endpoint in the config.yml [API_ENDPOINTS] section.
@haddadrm How can I use your version? I've tried installing from the "feature-improvement" branch, but I still don't see the LM Studio option, even though I've configured LMSTUDIO="http://host.docker.internal:1234/v1"
LGTM, will merge after testing
Hey, could you please merge the master branch into the feature-improvement branch and resolve any conflicts so that I can merge it?
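Putting the two comments above together, a minimal endpoint configuration might look like the fragment below. This is a sketch assuming the file name (config.yml) and section name ([API_ENDPOINTS]) quoted in this thread; host.docker.internal lets the Dockerized app reach LM Studio running on the host machine.

```toml
[API_ENDPOINTS]
# LM Studio's OpenAI-compatible server listens on port 1234 by default
LMSTUDIO = "http://host.docker.internal:1234/v1"
```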
Force-pushed from 303992c to 802627a
fix timeout error when importing Google font
Force-pushed from 6b72114 to 143d931
LM Studio Integration:
- Added LM Studio provider with OpenAI-compatible API support
- Dynamic model discovery via /v1/models endpoint
- Support for both chat and embeddings models
- Docker-compatible networking configuration

Thinking Model Panel:
- Added collapsible UI panel for the model's chain of thought
- Parses responses with <think> tags to separate reasoning
- Maintains backward compatibility with regular responses
- Styled consistently with the app theme for light/dark modes
- Preserves all existing message functionality (sources, markdown, etc.)

These improvements enhance the app's compatibility with local LLMs and provide better visibility into model reasoning processes while maintaining existing functionality.
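The <think>-tag handling described in the commit message could be sketched as below. This is a hypothetical helper, not the PR's actual code; `splitThinking` is an assumed name, and the only assumption about the model output is that reasoning is wrapped in <think>...</think> tags as stated above.

```typescript
// Hypothetical sketch: separate chain-of-thought from the final answer,
// assuming the model wraps its reasoning in <think>...</think> tags.
function splitThinking(response: string): {
  reasoning: string | null;
  answer: string;
} {
  const match = response.match(/<think>([\s\S]*?)<\/think>/);
  if (!match) {
    // No tags: regular responses pass through unchanged (backward compatibility).
    return { reasoning: null, answer: response };
  }
  return {
    reasoning: match[1].trim(),
    answer: response.replace(match[0], "").trim(),
  };
}
```

The panel would then render `reasoning` inside the collapsible section and `answer` as the normal message body, so sources, markdown, and other existing message features keep working on the answer text.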
Force-pushed from 143d931 to 5220aba
- Replace health check endpoint with proper model listing endpoint
- Remove workaround that returned 200 for unexpected GET /v1/health
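The model-listing call referred to above could be sketched like this. `listLMStudioModels` is a hypothetical name, not the PR's actual function; the `{ data: [{ id }] }` response shape is the OpenAI-compatible convention that LM Studio's /v1/models endpoint follows.

```typescript
// Hypothetical sketch: dynamic model discovery against an
// OpenAI-compatible /v1/models endpoint.
async function listLMStudioModels(baseUrl: string): Promise<string[]> {
  // baseUrl is expected to end in /v1, e.g. http://host.docker.internal:1234/v1
  const res = await fetch(`${baseUrl}/models`);
  if (!res.ok) {
    throw new Error(`LM Studio endpoint returned HTTP ${res.status}`);
  }
  const body = (await res.json()) as { data: { id: string }[] };
  return body.data.map((m) => m.id);
}
```

Using the listing endpoint (instead of a health-check workaround) means the same request both confirms the server is reachable and returns the chat/embeddings models to populate the UI.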
@haddadrm 2025-02-17 15:53:07 error: Cannot read properties of undefined (reading 'OPENAI')
Check the latest branch as of yesterday.
LM Studio Integration:
- LM Studio provider with OpenAI-compatible API support
- Dynamic model discovery via /v1/models endpoint
- Support for both chat and embeddings models
- Docker-compatible networking configuration

Reasoning Model Panel:
- Collapsible UI panel for the model's chain of thought
- Parses responses with <think> tags to separate reasoning
- Maintains backward compatibility with regular responses
- Styled consistently with the app theme for light/dark modes
- Preserves existing message functionality (sources, markdown, etc.)
These improvements enhance the app's compatibility with local LLMs and provide better visibility into model reasoning processes while maintaining existing functionality.