[BUG] slow UI / loading with a conversation that has large context #566
Comments
@dogmatic69 are you running in prod or dev mode? If in dev, there are many time penalties (e.g. React strict mode = 2x the time). If on Big-AGI.com then it's a prod build. I have conversations around 20k tokens which can take 1s to load, and I'd expect 4s for 80k (and mostly it's text layout by the browser). Could you try downloading the message and uploading it to big-AGI.com (even in an incognito window) to see whether it's faster there? Finally, I profile with Chrome but not Firefox; I'm not sure if anyone has done profiling there.
Found the function at this point; something to do with markdown formatting?
@dogmatic69 it's likely the markdown renderer (which is react-markdown + remark-gfm, the standard combo). There must be something exponential in their layout algorithms. Are you able to edit the application's localStorage as-is (broken)? Set localStorage > app-ui > renderMarkdown: false. This should disable the markdown renderer, and the app may come back to life.
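The localStorage tweak suggested above could be sketched roughly as follows, run from the browser's devtools console. The key name `app-ui` and the `renderMarkdown` flag come from the comment; the exact JSON nesting is an assumption (state persisters often wrap data in a `{ state, version }` envelope), so inspect the stored value before writing. The in-memory fallback is only there so the sketch also runs outside a browser.

```javascript
// Hedged sketch, not big-AGI's actual code: flip the assumed
// "renderMarkdown" flag inside the "app-ui" localStorage entry.

// Minimal stand-in so this sketch also runs outside a browser:
const store = globalThis.localStorage ?? (() => {
  const m = new Map();
  return {
    getItem: (k) => m.get(k) ?? null,
    setItem: (k, v) => m.set(k, String(v)),
  };
})();

const raw = store.getItem('app-ui');
// Assumed envelope shape { state, version }; verify against the real value.
const data = raw ? JSON.parse(raw) : { state: {}, version: 0 };
data.state = { ...data.state, renderMarkdown: false };
store.setItem('app-ui', JSON.stringify(data));
// Then reload the page so the app picks the setting up.
```

After running it in the console, a full page reload is needed for the app to read the changed value.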
I did get it working by removing something in local storage, though I lost my folders :/ The lack of deep URL links made things worse, as I had no way to navigate away from that broken chat.
I just turned off markdown and revisited that chat, and hit the same problem.
Good idea about the deep links. I'll move to storing the current chat ID in the URL; that would be a good workaround in extreme cases. LMK if you can share a problematic file (Ctrl+S in the app to save it, then upload it here) so I can profile it. Note, however, that I use the Chrome profiler for dev.
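The deep-link idea mentioned above could look roughly like this; the `chat` query parameter and the helper name are illustrative assumptions, not big-AGI's actual URL scheme:

```javascript
// Illustrative sketch: encode the active chat ID in the URL so a user can
// escape a broken chat by editing or sharing the link. The parameter name
// "chat" is an assumption, not big-AGI's actual scheme.
function withChatId(href, chatId) {
  const url = new URL(href);
  url.searchParams.set('chat', chatId);
  return url.toString();
}

console.log(withChatId('https://example.com/', 'abc123'));
// → https://example.com/?chat=abc123
```

In the app itself, this would be applied on chat switch via something like `history.replaceState(null, '', withChatId(location.href, id))`, so the back button and bookmarks keep working.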
Description
I load a bunch of data into context (only 80k tokens), and reloading the page with that conversation takes 10-15 seconds. After that it's pretty normal.
The context is basically spread over 2 large messages and then another 3-4 messages with a couple sentences.
Device and browser
linux / firefox 127
Screenshots and more
No response
Willingness to Contribute