
Option to disable storing pages in temp folder #979

Open
lambdareader opened this issue Jun 4, 2024 · 3 comments

Comments

lambdareader commented Jun 4, 2024

I couldn't find anything in the docs about disabling it, and setting the temp folder size to 0 doesn't work.
My library isn't on an SSD, so the server just creates a copy on my hard drive for no reason.

Most manga/comic readers (komga, kavita, yacreader) just read and unzip single pages of a CBZ in memory through random access. That is much faster than unpacking the whole archive, writing it to disk, and then reading it back.
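To illustrate what "random access" means here (LANraragi itself is written in Perl; this is just a sketch of the technique, and the function name and extension list are my own): a ZIP-based CBZ has a central directory, so a reader can seek straight to one member and decompress only that page, never touching the rest of the archive.

```python
import zipfile

def read_page(cbz_path: str, page_index: int) -> bytes:
    """Return the bytes of a single page from a .cbz (zip) archive
    without extracting the whole file to disk."""
    with zipfile.ZipFile(cbz_path) as zf:
        # Sort image entries so the page order matches a reader's ordering.
        pages = sorted(
            name for name in zf.namelist()
            if name.lower().endswith((".jpg", ".jpeg", ".png", ".webp"))
        )
        # Only the central directory and this one member are read from disk;
        # decompression happens entirely in memory.
        return zf.read(pages[page_index])
```

The same pattern works for any seekable archive format; solid formats (e.g. plain .tar.gz) are the exception, since they force a sequential scan.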

Could this be implemented, please? The current approach creates a lot of extra hardware strain, and the gains are negligible for a single user; even with an SSD it wouldn't be worth it. A hard drive takes a fraction of a second to read a few MB for a page if it's on a NAS that's spinning the whole time.
To be honest, I'm confused why caching is the default. It would only make sense with multiple users, or on a desktop where the drive platters spin down when idle, but aren't most people running this on a NAS?
Sure, unzipping pages in memory would use a bit of RAM, but isn't that exactly the kind of thing RAM is for?

Difegue (Owner) commented Jun 4, 2024

👋 The full extract/caching was designed to avoid the unarchiving job, which used to take a fair amount of time. This is a pretty old part of the design: the server used to rely on unar instead of libarchive, and unar couldn't do random access.

The server does support random access when you hit individual page endpoints nowadays, but it's true that it always copies to the filesystem first. I'm not sure how much work it'd take to make it all happen in-memory.
As a workaround, I think you could just point the temp folder to a RAM disk?
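For anyone wanting to try this workaround on Linux, a tmpfs mount is the usual way to get a RAM disk. This is a sketch only: the mount point and size are examples, not LANraragi defaults, so substitute your actual temp folder path and an appropriate limit (tmpfs only consumes RAM as files are written, up to the `size=` cap).

```shell
# Mount a 512 MB tmpfs over the temp folder (example path -- adjust
# to wherever your LANraragi temp folder actually lives).
sudo mount -t tmpfs -o size=512m tmpfs /path/to/lanraragi/temp

# Or make it persistent across reboots with an /etc/fstab entry:
# tmpfs  /path/to/lanraragi/temp  tmpfs  size=512m  0  0
```

Extracted pages then live in RAM and vanish on unmount or reboot, which is fine for a cache.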

lambdareader (Author) commented

Thanks for the fast reply.
Wow, stupid of me not to think of simply using a RAM disk. I'll do that, though always having to extract all the pages at once is still a bit wasteful.
It would be pretty neat to eventually rewrite that part to work fully in memory.

As another thought: I'm running lanraragi on a very low-powered device. Is there an option to generate page thumbnails only when I open the archive overview, rather than when I open the archive itself (or to disable them entirely)? It makes the server lag because all cores sit at 100% when an archive has a triple-digit page count. I don't really use the overview much, so every time I want to read a big archive it takes 30 seconds before I can turn to the next page.

Difegue (Owner) commented Jun 16, 2024

No specific option, but I believe the changes from #885 will help, as they restrict page thumbnail jobs to run sequentially on a single worker.
