[Bug]: TypeError: 'str' object is not callable #16083

Open · 1 of 6 tasks
MODOMison opened this issue Jun 24, 2024 · 1 comment
Labels
bug-report: Report of a bug, yet to be confirmed

Comments

@MODOMison

Checklist

  • The issue exists after disabling all extensions
  • The issue exists on a clean installation of webui
  • The issue is caused by an extension, but I believe it is caused by a bug in the webui
  • The issue exists in the current version of the webui
  • The issue has not been reported before recently
  • The issue has been reported before but has not been fixed yet

What happened?

won't run

Steps to reproduce the problem

load prompt, hit go

What should have happened?

run normally

What browsers do you use to access the UI ?

No response

Sysinfo

[], 1, 1, 7, 512, 512, True, 0.7, 2, 'Latent', 100, 0, 0, 'Use same checkpoint', 'Use same sampler', 'Use same scheduler', '', '', [], 0, 100, 'DPM++ 2M', 'Automatic', False, '', 0.8, -1, False, -1, 0, 0, 0, False, False, 'positive', 'comma', 0, False, False, 'start', '', 1, '', [], 0, '', [], 0, '', [], True, False, False, False, False, False, False, 0, False) {}
Traceback (most recent call last):
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\call_queue.py", line 57, in f
res = list(func(*args, **kwargs))
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\call_queue.py", line 36, in f
res = func(*args, **kwargs)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\txt2img.py", line 109, in txt2img
processed = processing.process_images(p)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 845, in process_images
res = process_images_inner(p)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 959, in process_images_inner
p.setup_conds()
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 1495, in setup_conds
super().setup_conds()
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 506, in setup_conds
self.uc = self.get_conds_with_caching(prompt_parser.get_learned_conditioning, negative_prompts, total_steps, [self.cached_uc], self.extra_network_data)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 492, in get_conds_with_caching
cache[1] = function(shared.sd_model, required_prompts, steps, hires_steps, shared.opts.use_old_scheduling)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\prompt_parser.py", line 177, in get_learned_conditioning
prompt_schedules = get_learned_conditioning_prompt_schedules(prompts, steps, hires_steps, use_old_scheduling)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\prompt_parser.py", line 132, in get_learned_conditioning_prompt_schedules
promptdict = {prompt: get_schedule(prompt) for prompt in set(prompts)}
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\prompt_parser.py", line 132, in
promptdict = {prompt: get_schedule(prompt) for prompt in set(prompts)}
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\prompt_parser.py", line 124, in get_schedule
tree = schedule_parser.parse(prompt)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\lark.py", line 625, in parse
return self.parser.parse(text, start=start, on_error=on_error)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parser_frontends.py", line 96, in parse
return self.parser.parse(stream, chosen_start, **kw)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parsers\earley.py", line 266, in parse
to_scan = self._parse(lexer, columns, to_scan, start_symbol)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parsers\xearley.py", line 144, in _parse
self.predict_and_complete(i, to_scan, columns, transitives)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parsers\earley.py", line 122, in predict_and_complete
originators = [originator for originator in columns[item.start] if originator.expect is not None and originator.expect == item.s]
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parsers\earley.py", line 122, in
originators = [originator for originator in columns[item.start] if originator.expect is not None and originator.expect == item.s]
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\grammar.py", line 20, in eq
return self.is_term == other.is_term and self.name == other.name
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\lexer.py", line 192, in eq
if isinstance(other, Token) and self.type != other.type:
TypeError: 'str' object is not callable


Console logs

[], 1, 1, 7, 512, 512, True, 0.7, 2, 'Latent', 100, 0, 0, 'Use same checkpoint', 'Use same sampler', 'Use same scheduler', '', '', [], 0, 100, 'DPM++ 2M', 'Automatic', False, '', 0.8, -1, False, -1, 0, 0, 0, False, False, 'positive', 'comma', 0, False, False, 'start', '', 1, '', [], 0, '', [], 0, '', [], True, False, False, False, False, False, False, 0, False) {}
    Traceback (most recent call last):
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\call_queue.py", line 57, in f
        res = list(func(*args, **kwargs))
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\call_queue.py", line 36, in f
        res = func(*args, **kwargs)
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\txt2img.py", line 109, in txt2img
        processed = processing.process_images(p)
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 845, in process_images
        res = process_images_inner(p)
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 959, in process_images_inner
        p.setup_conds()
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 1495, in setup_conds
        super().setup_conds()
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 506, in setup_conds
        self.uc = self.get_conds_with_caching(prompt_parser.get_learned_conditioning, negative_prompts, total_steps, [self.cached_uc], self.extra_network_data)
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 492, in get_conds_with_caching
        cache[1] = function(shared.sd_model, required_prompts, steps, hires_steps, shared.opts.use_old_scheduling)
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\prompt_parser.py", line 177, in get_learned_conditioning
        prompt_schedules = get_learned_conditioning_prompt_schedules(prompts, steps, hires_steps, use_old_scheduling)
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\prompt_parser.py", line 132, in get_learned_conditioning_prompt_schedules
        promptdict = {prompt: get_schedule(prompt) for prompt in set(prompts)}
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\prompt_parser.py", line 132, in <dictcomp>
        promptdict = {prompt: get_schedule(prompt) for prompt in set(prompts)}
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\prompt_parser.py", line 124, in get_schedule
        tree = schedule_parser.parse(prompt)
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\lark.py", line 625, in parse
        return self.parser.parse(text, start=start, on_error=on_error)
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parser_frontends.py", line 96, in parse
        return self.parser.parse(stream, chosen_start, **kw)
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parsers\earley.py", line 266, in parse
        to_scan = self._parse(lexer, columns, to_scan, start_symbol)
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parsers\xearley.py", line 144, in _parse
        self.predict_and_complete(i, to_scan, columns, transitives)
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parsers\earley.py", line 122, in predict_and_complete
        originators = [originator for originator in columns[item.start] if originator.expect is not None and originator.expect == item.s]
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parsers\earley.py", line 122, in <listcomp>
        originators = [originator for originator in columns[item.start] if originator.expect is not None and originator.expect == item.s]
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\grammar.py", line 20, in __eq__
        return self.is_term == other.is_term and self.name == other.name
      File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\lexer.py", line 192, in __eq__
        if isinstance(other, Token) and self.type != other.type:
    TypeError: 'str' object is not callable
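
For context, and only as an illustration of the error type rather than a diagnosis of this particular install: Python raises TypeError: 'str' object is not callable when a name that is expected to refer to something callable (a function, method, or class) has been rebound to a plain string. A minimal, self-contained sketch of that pattern, with made-up names:

    # Illustrative only: a method gets shadowed by a plain string,
    # and calling it then fails with the same TypeError as above.
    class Example:
        def label(self):
            return "ok"

    obj = Example()
    obj.label = "ok"      # instance attribute shadows the method with a str
    try:
        obj.label()       # Python now tries to call a string
    except TypeError as exc:
        print(exc)        # -> 'str' object is not callable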

---

Additional information

[], 1, 1, 7, 512, 512, True, 0.7, 2, 'Latent', 100, 0, 0, 'Use same checkpoint', 'Use same sampler', 'Use same scheduler', '', '', [], 0, 100, 'DPM++ 2M', 'Automatic', False, '', 0.8, -1, False, -1, 0, 0, 0, False, False, 'positive', 'comma', 0, False, False, 'start', '', 1, '', [], 0, '', [], 0, '', [], True, False, False, False, False, False, False, 0, False) {}
Traceback (most recent call last):
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\call_queue.py", line 57, in f
res = list(func(*args, **kwargs))
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\call_queue.py", line 36, in f
res = func(*args, **kwargs)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\txt2img.py", line 109, in txt2img
processed = processing.process_images(p)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 845, in process_images
res = process_images_inner(p)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 959, in process_images_inner
p.setup_conds()
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 1495, in setup_conds
super().setup_conds()
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 506, in setup_conds
self.uc = self.get_conds_with_caching(prompt_parser.get_learned_conditioning, negative_prompts, total_steps, [self.cached_uc], self.extra_network_data)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\processing.py", line 492, in get_conds_with_caching
cache[1] = function(shared.sd_model, required_prompts, steps, hires_steps, shared.opts.use_old_scheduling)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\prompt_parser.py", line 177, in get_learned_conditioning
prompt_schedules = get_learned_conditioning_prompt_schedules(prompts, steps, hires_steps, use_old_scheduling)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\prompt_parser.py", line 132, in get_learned_conditioning_prompt_schedules
promptdict = {prompt: get_schedule(prompt) for prompt in set(prompts)}
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\prompt_parser.py", line 132, in
promptdict = {prompt: get_schedule(prompt) for prompt in set(prompts)}
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\modules\prompt_parser.py", line 124, in get_schedule
tree = schedule_parser.parse(prompt)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\lark.py", line 625, in parse
return self.parser.parse(text, start=start, on_error=on_error)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parser_frontends.py", line 96, in parse
return self.parser.parse(stream, chosen_start, **kw)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parsers\earley.py", line 266, in parse
to_scan = self._parse(lexer, columns, to_scan, start_symbol)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parsers\xearley.py", line 144, in _parse
self.predict_and_complete(i, to_scan, columns, transitives)
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parsers\earley.py", line 122, in predict_and_complete
originators = [originator for originator in columns[item.start] if originator.expect is not None and originator.expect == item.s]
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\parsers\earley.py", line 122, in
originators = [originator for originator in columns[item.start] if originator.expect is not None and originator.expect == item.s]
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\grammar.py", line 20, in eq
return self.is_term == other.is_term and self.name == other.name
File "D:\Users\matto\Recovered decent stuff\Desktop\stable-diffusion-webui\venv\lib\site-packages\lark\lexer.py", line 192, in eq
if isinstance(other, Token) and self.type != other.type:
TypeError: 'str' object is not callable


MODOMison added the bug-report label on Jun 24, 2024
@w-e-w (Collaborator) commented Jun 25, 2024

  • The issue is caused by an extension, but I believe it is caused by a bug in the webui

If you believe that is the case, then you should tell us which extension we broke by accident.

Copying and pasting the same thing three times is not going to give us more information.
Also provide the full logs, from the very beginning to the end, not just the error.

Provide Sysinfo as specified in the instructions.
That information gives us a "chance" of replicating your system on our machines and thus replicating the issue.
Without it we don't even know what version of webui you are using.

Provide detailed "steps" for exactly what you did to trigger the issue,
steps as in 1. do xxx, 2. type yyy, 3. click ccc.
Include a screenshot or video if necessary.
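
Since the exception surfaces inside lark itself rather than in webui code, one rough way to narrow things down (a sketch, not an official step from the bug template) is to exercise lark's public API directly with the venv's Python (venv\Scripts\python.exe on Windows). If even a trivial grammar fails with the same TypeError, the lark install in the venv is likely damaged or being patched by something else; if it parses fine, the problem is more likely in whatever the webui or an extension does around it:

    # Minimal sanity check of the lark package installed in the webui venv.
    # Uses only documented lark features; the Earley parser is the one the
    # traceback above goes through.
    import lark

    parser = lark.Lark(
        r"""
        start: WORD+
        %import common.WORD
        %import common.WS
        %ignore WS
        """,
        parser="earley",
    )

    print(parser.parse("a simple test").pretty())  # should print a small parse tree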
