ValueError: could not broadcast input array from shape (32000,) into shape (0,) #95

Open
ultimatech-cn opened this issue Jun 13, 2024 · 1 comment

@ultimatech-cn

When executing the example workflow, the following error occurs:

[screenshot of the example workflow]

!!! Exception during processing!!! could not broadcast input array from shape (32000,) into shape (0,)
```
Traceback (most recent call last):
File "E:\training\ComfyUI\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\training\ComfyUI\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\training\ComfyUI\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\training\ComfyUI\custom_nodes\ComfyUI_VLM_nodes\nodes\llavaloader.py", line 99, in generate_text
response = llm.create_chat_completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\training\python_embeded\Lib\site-packages\llama_cpp\llama.py", line 1734, in create_chat_completion
return handler(
^^^^^^^^
File "E:\training\python_embeded\Lib\site-packages\llama_cpp\llama_chat_format.py", line 2712, in call
completion_or_chunks = llama.create_completion(
^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\training\python_embeded\Lib\site-packages\llama_cpp\llama.py", line 1570, in create_completion
completion: Completion = next(completion_or_chunks) # type: ignore
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\training\python_embeded\Lib\site-packages\llama_cpp\llama.py", line 1095, in _create_completion
for token in self.generate(
File "E:\training\python_embeded\Lib\site-packages\llama_cpp\llama.py", line 723, in generate
self.eval(tokens)
File "E:\training\python_embeded\Lib\site-packages\llama_cpp\llama.py", line 569, in eval
self.scores[n_past : n_past + n_tokens, :].reshape(-1)[: :] = logits
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^
ValueError: could not broadcast input array from shape (32000,) into shape (0,)
```
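
For context, the failing line writes the model's logits (vocabulary size 32000) into a slice of llama-cpp-python's pre-allocated scores buffer. One way that slice can end up empty, assuming the buffer is sized by the context window, is for `n_past` to land at or beyond the end of the buffer. A minimal sketch of that mechanism (sizes and values are assumed, not taken from the failing run):

```python
import numpy as np

# Illustrative reproduction. llama-cpp-python pre-allocates a scores
# buffer of shape (n_ctx, n_vocab) and writes each evaluation step's
# logits into the rows belonging to the newly processed tokens.
n_vocab = 32000   # vocabulary size matching the (32000,) in the error
n_ctx = 512       # hypothetical context window the buffer was sized for
scores = np.zeros((n_ctx, n_vocab), dtype=np.float32)
logits = np.ones(n_vocab, dtype=np.float32)

# If n_past has already reached the end of the buffer, the row slice is
# empty, so reshape(-1) yields shape (0,) and the assignment fails:
n_past, n_tokens = n_ctx, 1
scores[n_past : n_past + n_tokens, :].reshape(-1)[::] = logits
# ValueError: could not broadcast input array from shape (32000,)
# into shape (0,)
```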

Any workaround for this?

@gokayfem (Owner)

first time seeing this error. are you sure you downloaded the right mmproj for the model? you cannot use another model's mmproj with a different model.
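
For reference, a minimal sketch of loading a LLaVA-style model with its matching mmproj through llama-cpp-python, which is what the llavaloader node wraps. The file paths are placeholders; both `.gguf` files should come from the same model release:

```python
from llama_cpp import Llama
from llama_cpp.llama_chat_format import Llava15ChatHandler

# Placeholder paths: the language model .gguf and the mmproj .gguf must
# come from the same LLaVA release, since the projector's output has to
# match the embedding shape the language model expects.
chat_handler = Llava15ChatHandler(
    clip_model_path="models/llava-v1.5-7b/mmproj-model-f16.gguf"
)
llm = Llama(
    model_path="models/llava-v1.5-7b/ggml-model-q4_k.gguf",
    chat_handler=chat_handler,
    n_ctx=4096,       # large enough for the prompt plus the image embedding
    logits_all=True,  # the LLaVA chat handler needs per-token logits
)
```

A too-small `n_ctx` may also surface as the empty-slice broadcast error above, so raising it could be worth trying alongside re-checking the mmproj download.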
