
feat: Add LM Studio Support and Reasoning Model Panel #576

Open · wants to merge 7 commits into master
Conversation


@haddadrm haddadrm commented Jan 26, 2025

LM Studio Integration:

  - Added LM Studio provider with OpenAI-compatible API support
  - Dynamic model discovery via /v1/models endpoint
  - Support for both chat and embeddings models
  - Docker-compatible networking configuration
  - Requires adding the LM Studio API URL to config.toml:
          Docker:
                  [MODELS.LMSTUDIO]
                  API_URL = "http://host.docker.internal:1234"
          Non-Docker:
                  [MODELS.LMSTUDIO]
                  API_URL = "http://localhost:1234"
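The "dynamic model discovery via /v1/models" mentioned above can be sketched roughly as below. This is a hypothetical illustration, not the PR's actual code: `parseModelList` and `discoverModels` are illustrative names, and the payload shape assumed is the standard OpenAI-compatible list-models response.

```typescript
// Hypothetical sketch of dynamic model discovery against an
// OpenAI-compatible /v1/models endpoint (as exposed by LM Studio).
// Names are illustrative, not taken from the PR's implementation.
interface LMStudioModel {
  id: string;
  object: string;
}

interface ModelsResponse {
  data: LMStudioModel[];
}

// Pure helper: map the /v1/models payload to a list of model ids.
function parseModelList(payload: ModelsResponse): string[] {
  return payload.data.map((m) => m.id);
}

// Fetch the list from the configured API_URL (e.g. the value
// set in config.toml above).
async function discoverModels(apiUrl: string): Promise<string[]> {
  const res = await fetch(`${apiUrl}/v1/models`);
  if (!res.ok) throw new Error(`LM Studio returned ${res.status}`);
  return parseModelList((await res.json()) as ModelsResponse);
}

// Sample payload in the shape an OpenAI-compatible server returns:
const sample: ModelsResponse = {
  data: [{ id: 'qwen2.5-7b-instruct', object: 'model' }],
};
console.log(parseModelList(sample)); // logs the discovered model ids
```

Because discovery is driven by whatever `/v1/models` returns, no model names need to be hard-coded; loading a different model in LM Studio makes it appear automatically.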

Reasoning Model Panel:

  - Added collapsible UI panel for model's chain of thought
  - Parses responses with <think> tags to separate reasoning
  - Maintains backward compatibility with regular responses
  - Styled consistently with app theme for light/dark modes
  - Preserves all existing message functionality (sources, markdown, etc.)
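The tag-parsing step above can be sketched as follows. This is a minimal hypothetical illustration assuming the model wraps its chain of thought in `<think>` tags; `splitThinking` is an illustrative name, not the PR's actual code.

```typescript
// Hypothetical sketch: split a model response into reasoning and answer,
// assuming the chain of thought is wrapped in <think>...</think> tags.
// Illustrative only; not the PR's actual implementation.
interface ParsedResponse {
  reasoning: string | null; // content of the <think> block, if present
  answer: string;           // the visible answer text
}

function splitThinking(raw: string): ParsedResponse {
  const match = raw.match(/<think>([\s\S]*?)<\/think>/);
  if (!match) {
    // Backward compatibility: responses without <think> tags pass through.
    return { reasoning: null, answer: raw };
  }
  return {
    reasoning: match[1].trim(),
    answer: raw.replace(match[0], '').trim(),
  };
}
```

Returning `reasoning: null` for plain responses is what keeps regular (non-reasoning) models working unchanged: the collapsible panel only renders when reasoning is present.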

These improvements enhance the app's compatibility with local LLMs and provide better visibility into model reasoning processes while maintaining existing functionality.

[Screenshots attached, dated 2025-02-16]

@tanm-sys

how to use this

@tanm-sys

I am getting this error:

2025-01-29 12:18:06 [ERROR]
Unexpected endpoint or method. (GET /api/tags). Returning 200 anyway

(the same error repeats dozens of times between 12:18:06 and 12:20:12)

@haddadrm
Author

haddadrm commented Feb 1, 2025

how to use this

Set the endpoint in config.toml:

[API_ENDPOINTS]
LMSTUDIO = "http://host.docker.internal:1234/v1"

@User-Clb

@haddadrm How can I use your version? I've tried installing from the "feature-improvement" branch, but I still don't have the LMStudio option, even though I've configured LMSTUDIO="http://host.docker.internal:1234/v1"

@ItzCrazyKns
Owner

LGTM, will merge after testing

@ItzCrazyKns ItzCrazyKns self-requested a review February 13, 2025 18:49
@ItzCrazyKns
Owner

Hey, could you please merge the master branch into the feature-improvement branch and resolve any conflicts so that I can merge it?

Author

@haddadrm haddadrm Feb 15, 2025


fix timeout error when importing Google font

@haddadrm haddadrm force-pushed the feature-improvement branch from 6b72114 to 143d931 Compare February 15, 2025 22:16
LM Studio Integration:
- Added LM Studio provider with OpenAI-compatible API support
- Dynamic model discovery via /v1/models endpoint
- Support for both chat and embeddings models
- Docker-compatible networking configuration

Thinking Model Panel:
- Added collapsible UI panel for model's chain of thought
- Parses responses with <think> tags to separate reasoning
- Maintains backward compatibility with regular responses
- Styled consistently with app theme for light/dark modes
- Preserves all existing message functionality (sources, markdown, etc.)

These improvements enhance the app's compatibility with local LLMs and
provide better visibility into model reasoning processes while maintaining
existing functionality.
@haddadrm haddadrm force-pushed the feature-improvement branch from 143d931 to 5220aba Compare February 15, 2025 23:01
@haddadrm haddadrm changed the title from "feat: Add LM Studio Support and Thinking Model Panel" to "feat: Add LM Studio Support and Reasoning Model Panel" Feb 16, 2025
- Replace health check endpoint with proper model listing endpoint
- Remove workaround that returned 200 for unexpected GET /v1/health
@User-Clb

@haddadrm

2025-02-17 15:53:07 error: Cannot read properties of undefined (reading 'OPENAI')
2025-02-17 15:53:19 error: Error getting config: Cannot read properties of undefined (reading 'OPENAI')

(the same two errors repeat through 15:54:10)

[Screenshots attached]

@haddadrm
Author

@haddadrm 2025-02-17 15:53:07 error: Cannot read properties of undefined (reading 'OPENAI') […]

Check latest branch as of yesterday

@haddadrm haddadrm closed this Feb 17, 2025
@haddadrm haddadrm reopened this Feb 17, 2025
4 participants