coca_ViT-B-32: self._encode_text bug #897

Open
haideraltahan opened this issue Jun 14, 2024 · 2 comments
@haideraltahan

When calling the forward method on a CoCa model:

model_out = self.model(images, input_texts)

I get a ValueError: too many values to unpack (expected 2) at line 145 of open_clip/coca_model.py:

text_latent, token_emb = self.text(text)

Here is the full traceback:

[rank1]: Traceback (most recent call last):
[rank1]:   File "/opt/hpcaas/.mounts/fs-0565f60d669b6a2d3/home/haideraltahan/RelationCLIP/main.py", line 82, in <module>
[rank1]:     fire.Fire(main)
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/fire/core.py", line 143, in Fire
[rank1]:     component_trace = _Fire(component, args, parsed_flag_args, context, name)
[rank1]:                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/fire/core.py", line 477, in _Fire
[rank1]:     component, remaining_args = _CallAndUpdateTrace(
[rank1]:                                 ^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/fire/core.py", line 693, in _CallAndUpdateTrace
[rank1]:     component = fn(*varargs, **kwargs)
[rank1]:                 ^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/opt/hpcaas/.mounts/fs-0565f60d669b6a2d3/home/haideraltahan/RelationCLIP/main.py", line 78, in main
[rank1]:     trainer.fit(model=net, datamodule=data, ckpt_path=checkpoint)
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/trainer/trainer.py", line 544, in fit
[rank1]:     call._call_and_handle_interrupt(
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/trainer/call.py", line 43, in _call_and_handle_interrupt
[rank1]:     return trainer.strategy.launcher.launch(trainer_fn, *args, trainer=trainer, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/strategies/launchers/subprocess_script.py", line 105, in launch
[rank1]:     return function(*args, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/trainer/trainer.py", line 580, in _fit_impl
[rank1]:     self._run(model, ckpt_path=ckpt_path)
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/trainer/trainer.py", line 987, in _run
[rank1]:     results = self._run_stage()
[rank1]:               ^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/trainer/trainer.py", line 1033, in _run_stage
[rank1]:     self.fit_loop.run()
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/loops/fit_loop.py", line 205, in run
[rank1]:     self.advance()
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/loops/fit_loop.py", line 363, in advance
[rank1]:     self.epoch_loop.run(self._data_fetcher)
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/loops/training_epoch_loop.py", line 140, in run
[rank1]:     self.advance(data_fetcher)
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/loops/training_epoch_loop.py", line 250, in advance
[rank1]:     batch_output = self.automatic_optimization.run(trainer.optimizers[0], batch_idx, kwargs)
[rank1]:                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/loops/optimization/automatic.py", line 190, in run
[rank1]:     self._optimizer_step(batch_idx, closure)
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/loops/optimization/automatic.py", line 268, in _optimizer_step
[rank1]:     call._call_lightning_module_hook(
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/trainer/call.py", line 157, in _call_lightning_module_hook
[rank1]:     output = fn(*args, **kwargs)
[rank1]:              ^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/core/module.py", line 1303, in optimizer_step
[rank1]:     optimizer.step(closure=optimizer_closure)
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/core/optimizer.py", line 152, in step
[rank1]:     step_output = self._strategy.optimizer_step(self._optimizer, closure, **kwargs)
[rank1]:                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/strategies/ddp.py", line 270, in optimizer_step
[rank1]:     optimizer_output = super().optimizer_step(optimizer, closure, model, **kwargs)
[rank1]:                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/strategies/strategy.py", line 239, in optimizer_step
[rank1]:     return self.precision_plugin.optimizer_step(optimizer, model=model, closure=closure, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/plugins/precision/amp.py", line 77, in optimizer_step
[rank1]:     return super().optimizer_step(optimizer, model=model, closure=closure, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/plugins/precision/precision.py", line 122, in optimizer_step
[rank1]:     return optimizer.step(closure=closure, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/torch/optim/lr_scheduler.py", line 75, in wrapper
[rank1]:     return wrapped(*args, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/torch/optim/optimizer.py", line 391, in wrapper
[rank1]:     out = func(*args, **kwargs)
[rank1]:           ^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/torch/optim/optimizer.py", line 76, in _use_grad
[rank1]:     ret = func(self, *args, **kwargs)
[rank1]:           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/torch/optim/adamw.py", line 165, in step
[rank1]:     loss = closure()
[rank1]:            ^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/plugins/precision/precision.py", line 108, in _wrap_closure
[rank1]:     closure_result = closure()
[rank1]:                      ^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/loops/optimization/automatic.py", line 144, in __call__
[rank1]:     self._result = self.closure(*args, **kwargs)
[rank1]:                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
[rank1]:     return func(*args, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/loops/optimization/automatic.py", line 129, in closure
[rank1]:     step_output = self._step_fn()
[rank1]:                   ^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/loops/optimization/automatic.py", line 318, in _training_step
[rank1]:     training_step_output = call._call_strategy_hook(trainer, "training_step", *kwargs.values())
[rank1]:                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/trainer/call.py", line 309, in _call_strategy_hook
[rank1]:     output = fn(*args, **kwargs)
[rank1]:              ^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/strategies/strategy.py", line 390, in training_step
[rank1]:     return self._forward_redirection(self.model, self.lightning_module, "training_step", *args, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/strategies/strategy.py", line 642, in __call__
[rank1]:     wrapper_output = wrapper_module(*args, **kwargs)
[rank1]:                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
[rank1]:     return self._call_impl(*args, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1541, in _call_impl
[rank1]:     return forward_call(*args, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/torch/nn/parallel/distributed.py", line 1593, in forward
[rank1]:     else self._run_ddp_forward(*inputs, **kwargs)
[rank1]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/torch/nn/parallel/distributed.py", line 1411, in _run_ddp_forward
[rank1]:     return self.module(*inputs, **kwargs)  # type: ignore[index]
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
[rank1]:     return self._call_impl(*args, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1541, in _call_impl
[rank1]:     return forward_call(*args, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/pytorch_lightning/strategies/strategy.py", line 635, in wrapped_forward
[rank1]:     out = method(*_args, **_kwargs)
[rank1]:           ^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/opt/hpcaas/.mounts/fs-0565f60d669b6a2d3/home/haideraltahan/RelationCLIP/modules/clip_module.py", line 49, in training_step
[rank1]:     model_out = self.model(images, input_texts)
[rank1]:                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
[rank1]:     return self._call_impl(*args, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1541, in _call_impl
[rank1]:     return forward_call(*args, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/open_clip/coca_model.py", line 170, in forward
[rank1]:     text_latent, token_embs = self._encode_text(text)
[rank1]:                               ^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/data/home/haideraltahan/anaconda3/envs/RClip/lib/python3.12/site-packages/open_clip/coca_model.py", line 145, in _encode_text
[rank1]:     text_latent, token_emb = self.text(text)
[rank1]:     ^^^^^^^^^^^^^^^^^^^^^^
[rank1]: ValueError: too many values to unpack (expected 2)
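
For context, here is a minimal sketch (not from the thread) of one way this unpack can fail: the line in coca_model.py assumes self.text(text) returns a (text_latent, token_emb) tuple, but if a wrapper or replacement text module returns a single tensor, the tuple unpack iterates over the batch dimension instead.

import torch

# Illustration only: unpacking a plain tensor iterates along dim 0,
# so any batch size above 2 raises exactly this ValueError.
out = torch.randn(8, 512)          # e.g. a batch of 8 pooled text embeddings
try:
    text_latent, token_emb = out   # ValueError: too many values to unpack (expected 2)
except ValueError as e:
    print(e)
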
@rwightman
Collaborator

@haideraltahan I don't think there is a problem here. It looks more likely to be an issue with how the model is being used or wrapped in this particular use case.

I can run model(torch.randn(2, 3, 224, 224), torch.randint(0, 49408, (2, 76))) without issue, call the text tower directly, etc.
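
A self-contained version of that sanity check might look like this (a sketch; a randomly initialized coca_ViT-B-32 from open_clip is enough to verify the shapes and the unpack):

import torch
import open_clip

# Build a stock CoCa model; the transforms are unused in this check.
model, _, _ = open_clip.create_model_and_transforms('coca_ViT-B-32')
model.eval()

images = torch.randn(2, 3, 224, 224)      # dummy image batch
texts = torch.randint(0, 49408, (2, 76))  # dummy token ids, context length 76

with torch.no_grad():
    out = model(images, texts)                          # forward runs cleanly
    text_latent, token_emb = model._encode_text(texts)  # the line-145 unpack also works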

@Srividhya-Sainath

Passing context_length=76 to the tokenizer should help. Something like this:
text_tokens = tokenizer.tokenize(text_descriptions, context_length=76)
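
For example, with an open_clip tokenizer (a sketch; assumes a recent open_clip where the tokenizer callable accepts a context_length argument, and the text strings are illustrative):

import open_clip

tokenizer = open_clip.get_tokenizer('coca_ViT-B-32')
text_descriptions = ["a photo of a cat", "a photo of a dog"]

# CoCa's text tower uses a context length of 76 rather than CLIP's default 77.
text_tokens = tokenizer(text_descriptions, context_length=76)
print(text_tokens.shape)  # torch.Size([2, 76])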
