
Add GLM-4-9B-Chat #1724

Merged: 19 commits into InternLM:main, Jun 21, 2024
Conversation

@lzhangzz (Collaborator) commented Jun 6, 2024:

I'm not certain about the correctness of the chat template though.
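
For reference, a rough sketch of the conversation layout published upstream for GLM-4-9B-Chat is below. It assumes the special tokens from the THUDM/glm-4-9b-chat repository ([gMASK]<sop>, <|system|>, <|user|>, <|assistant|>) and is an illustration only; the template actually registered in this PR may differ in details such as trailing newlines.

# Hypothetical sketch of a GLM-4-style prompt builder; the token layout is
# assumed from the upstream THUDM/glm-4-9b-chat repo, not copied from this PR.
def build_glm4_prompt(messages, system=None):
    """Concatenate chat messages into a single GLM-4-style prompt string."""
    prompt = '[gMASK]<sop>'
    if system is not None:
        prompt += f'<|system|>\n{system}'
    for msg in messages:
        # each turn starts with its role token, content on the next line
        prompt += f"<|{msg['role']}|>\n{msg['content']}"
    # trailing assistant token cues the model to generate the reply
    return prompt + '<|assistant|>\n'

print(build_glm4_prompt([{'role': 'user', 'content': 'Hello'}],
                        system='You are a helpful assistant.'))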

@lvhan028 self-requested a review, Jun 6, 2024 12:08
@lvhan028 added the enhancement (New feature or request) label, Jun 18, 2024
@lvhan028 (Collaborator) commented: evaluation passed

  • gsm8k (1d7fe4), gen accuracy: 77.56

@lvhan028 requested a review from AllentDan, Jun 21, 2024 07:30
@lvhan028 mentioned this pull request, Jun 21, 2024
    }
except Exception as e:
    # qwen-vl raises 'Unclosed image token' when an image placeholder
    # in the prompt is not closed; that specific error is expected here
    assert str(e) == 'Unclosed image token'

Collaborator:
@irexyc Does this influence VLMs?

@AllentDan (Collaborator) left a comment:

Tested OK with lmdeploy serve api_server
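
For anyone reproducing the check, a minimal sketch of exercising the new model through lmdeploy's Python API is below; it assumes the Hugging Face model id THUDM/glm-4-9b-chat (the test above used the HTTP route, lmdeploy serve api_server, instead).

# Minimal sketch, not the reviewer's exact test: load GLM-4-9B-Chat with the
# lmdeploy pipeline so the newly registered chat template is applied.
from lmdeploy import pipeline

pipe = pipeline('THUDM/glm-4-9b-chat')  # model id is an assumption
responses = pipe(['What is the capital of France?'])
print(responses[0].text)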

@lvhan028 merged commit e95213d into InternLM:main, Jun 21, 2024
9 checks passed
Labels: enhancement (New feature or request)
Projects: None yet
Linked issues that merging may close: None yet
3 participants