AssertionError when using model "google/gemma-2b" with multi-gpus #500

Open · 2 of 4 tasks · tritct opened this issue Jun 6, 2024 · 0 comments

System Info

[System info provided as a screenshot in the original issue: "Screenshot 2024-06-06 114916"]

Information

  • Docker
  • The CLI directly

Tasks

  • An officially supported command
  • My own modifications

Reproduction

I'm trying to run the Docker image on 2 A16 GPUs with model_id "google/gemma-2b", but after the model download step I run into an AssertionError like the following.

[Screenshots of the AssertionError traceback attached in the original issue]
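For reference, the setup described above corresponds to a launch command roughly like the sketch below. The image name, port mapping, and volume path are placeholders, and the --num-shard 2 flag is an assumption about how the two A16 GPUs are being used for sharding; none of it is quoted from the original report.

```shell
# Hypothetical reproduction sketch: <server-image>, the port, and the volume
# path are placeholders; --model-id and --num-shard 2 reflect the 2 x A16
# setup described above and are assumptions, not values copied from the report.
docker run --gpus '"device=0,1"' --shm-size 1g -p 8080:80 \
  -v "$PWD/data:/data" \
  <server-image> \
  --model-id google/gemma-2b \
  --num-shard 2
```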

Expected behavior

When I run with only 1 GPU, it initializes just fine. The issue only happens when I try to use multiple GPUs.
