
Fix internvl2.5 error after eviction #3122

Merged (2 commits) on Feb 11, 2025

Conversation

grimoire (Collaborator) commented on Feb 8, 2025:

The model can generate the padded image token itself, which could lead to an error after eviction.
Note that a better solution would be to set a pad token that can never be generated.
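As a rough illustration of the issue described above (a minimal sketch with hypothetical names, not lmdeploy's actual API): image placeholders are reserved in the prompt by padding with a token id, and if that id is one the model can also emit during decoding, then after cache eviction the re-prefilled sequence cannot distinguish generated tokens from image placeholders.

```python
# Hypothetical sketch of the padding scheme; names and ids are illustrative only.
SAFE_PAD_ID = 0  # an id the model never generates, so it is unambiguous as padding


def pad_image_tokens(input_ids, num_image_tokens, pad_id):
    """Reserve slots for vision embeddings by appending `num_image_tokens` pad ids."""
    return input_ids + [pad_id] * num_image_tokens


prompt = [1, 2, 3]
padded = pad_image_tokens(prompt, 4, SAFE_PAD_ID)
# After eviction, the prompt is rebuilt from token ids alone. If the pad id could
# also be produced by the model, generated tokens would be indistinguishable from
# image placeholders, and vision embeddings would be scattered to the wrong slots.
assert padded == [1, 2, 3, 0, 0, 0, 0]
```

This is why the fix favors a pad id outside the model's output vocabulary behavior: the placeholder slots stay unambiguous regardless of what the model has generated.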

@grimoire grimoire marked this pull request as ready for review February 10, 2025 05:59
lvhan028 (Collaborator) commented:

image_token_id may need to be changed here:

    dict(pixel_values=pixel_values, image_tokens=image_tokens, image_token_id=0, image_size=image.size))

@lvhan028 lvhan028 self-requested a review February 10, 2025 11:16
grimoire (Collaborator, Author) commented:

@lvhan028 does this have a side effect on turbomind?

lvhan028 (Collaborator) replied:

> @lvhan028 does this have a side effect on turbomind?

No

@lvhan028 lvhan028 merged commit e91ccf0 into InternLM:main Feb 11, 2025
5 checks passed
tastelikefeet added a commit to tastelikefeet/lmdeploy that referenced this pull request Feb 17, 2025
* main: (90 commits)
  Fix cogvlm and phi3vision (InternLM#3137)
  support release pipeline (InternLM#3069)
  [ci] fix some fail in daily testcase (InternLM#3134)
  Fix internvl2.5 error after eviction (InternLM#3122)
  fix UT of deepseek chat template (InternLM#3125)
  Update benchmark script and user guide (InternLM#3110)
  bump version to v0.7.0.post3 (InternLM#3115)
  fix postional argument (InternLM#3086)
  remove logitswarper (InternLM#3109)
  [Fix] fix the URL judgment problem in Windows (InternLM#3103)
  fix user guide about cogvlm deployment (InternLM#3088)
  add option max-concurrent-requests for api_server(InternLM#2961)
  bump version to v0.7.0.post2 (InternLM#3094)
  Fix xcomposer2d5 (InternLM#3087)
  Add system role to deepseek chat template (InternLM#3031)
  Update tokenizer (InternLM#3061)
  Add deepseek-r1 chat template (InternLM#3072)
  bump version to v0.7.0.post1 (InternLM#3076)
  More arguments in api_client, update docstrings (InternLM#3077)
  fix sliding window mgr (InternLM#3068)
  ...

# Conflicts:
#	lmdeploy/turbomind/turbomind.py