[B4X] huge files qty: File.ListFilesAsync, possible DeleteAsync ?

peacemaker

Expert
Licensed User
Longtime User
Hi, All

Say we have a very, very long file list in a folder — File.ListFilesAsync also takes a very long time to return it.
The task requires this huge quantity of files, storing files created over several weeks.
But how can lots of old files be deleted without pausing the main thread for too long?
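One possible approach (my own sketch, not a built-in B4X method) is to drop down to the Java NIO API that B4J runs on, e.g. via inline Java or JavaObject: stream directory entries lazily instead of materializing the whole list, and run the deletion loop on a background thread so the main thread is never blocked. A minimal standalone Java illustration, where the folder path and the 7-day cutoff are assumptions for the demo:

```java
import java.io.IOException;
import java.nio.file.*;
import java.nio.file.attribute.FileTime;
import java.time.Instant;
import java.time.temporal.ChronoUnit;

public class OldFileCleaner {
    // Deletes regular files older than maxAgeDays without loading
    // the full file list into memory.
    static int deleteOlderThan(Path folder, int maxAgeDays) throws IOException {
        Instant cutoff = Instant.now().minus(maxAgeDays, ChronoUnit.DAYS);
        int deleted = 0;
        // DirectoryStream yields entries lazily, so memory use stays flat
        // even with hundreds of thousands of files in the folder.
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(folder)) {
            for (Path p : stream) {
                if (Files.isRegularFile(p)
                        && Files.getLastModifiedTime(p).toInstant().isBefore(cutoff)) {
                    Files.deleteIfExists(p);
                    deleted++;
                }
            }
        }
        return deleted;
    }

    public static void main(String[] args) throws Exception {
        Path dir = Files.createTempDirectory("cleaner-demo");
        Files.createFile(dir.resolve("old.log"));
        // Backdate one file by 10 days so it falls past the 7-day cutoff.
        Files.setLastModifiedTime(dir.resolve("old.log"),
                FileTime.from(Instant.now().minus(10, ChronoUnit.DAYS)));
        Files.createFile(dir.resolve("new.log"));
        System.out.println("deleted=" + deleteOlderThan(dir, 7));
    }
}
```

In a real B4J app this loop would be wrapped in a background thread (e.g. the Threading library) or chunked with Sleep(0) so the UI stays responsive.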
 

peacemaker

Expert
Licensed User
Longtime User
200...300 thousand files now. The FTP listing consumes 23 MB and takes 12 seconds.
Deleting manually with "sudo rm -fr $folder$" takes 7 seconds...
If files are deleted one by one, selecting each file name from the DB and searching for it in the whole file list, the deletion speed is around 1 file per second... in Release mode.
Maybe deleting by file mask (without scanning the file list) would be better here.
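Deleting by mask can indeed avoid the full listing. A hedged Java sketch of the idea (again assuming the underlying NIO API is reachable from B4J; the file names and mask are made up for the demo): `Files.newDirectoryStream` accepts a glob pattern, so the filter is applied while entries are streamed from the OS rather than after building a 200k-entry list.

```java
import java.io.IOException;
import java.nio.file.*;

public class MaskDelete {
    // Deletes files matching a glob mask without building the full listing first.
    static int deleteByMask(Path folder, String mask) throws IOException {
        int deleted = 0;
        // The glob filter is applied entry-by-entry as the stream advances,
        // so we never hold all names in memory at once.
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(folder, mask)) {
            for (Path p : stream) {
                Files.deleteIfExists(p);
                deleted++;
            }
        }
        return deleted;
    }

    public static void main(String[] args) throws Exception {
        Path dir = Files.createTempDirectory("mask-demo");
        Files.createFile(dir.resolve("photo_2021.jpg"));
        Files.createFile(dir.resolve("photo_2022.jpg"));
        Files.createFile(dir.resolve("keep.txt"));
        System.out.println("deleted=" + deleteByMask(dir, "photo_*.jpg"));
    }
}
```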
 

hatzisn

Expert
Licensed User
Longtime User
I suppose we are talking about a B4J app. Why don't you write the current filenames into a dedicated index file and read that file instead? Then recreate it once a day... It could be a random-access file that keeps 5 bytes at the end of each filename record, storing the position of the next filename, in case you "delete it" (the next one) mid-day... You can also append new files...

PS. Also save the length of the filename in one byte... I would lay out each record as: | length of filename - 1 byte | filename - X bytes | position of next filename - 5 bytes |
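The record layout above works like a singly linked list on disk: a record is "deleted" by pointing the previous record's next-pointer past it, with no data moved. A Java sketch of that idea — the helper names (`append`, `unlink`, `writeUInt40`) and the big-endian 5-byte encoding are my own illustration of the proposal, not an existing API:

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;

public class NameIndex {
    // Record layout: | name length - 1 byte | filename - X bytes | next position - 5 bytes |
    // Appends a record at the end of the file; returns its start offset.
    static long append(RandomAccessFile raf, String name) throws IOException {
        byte[] bytes = name.getBytes(StandardCharsets.UTF_8);
        long pos = raf.length();
        raf.seek(pos);
        raf.writeByte(bytes.length);                   // 1-byte length: names up to 255 bytes
        raf.write(bytes);                              // the filename itself
        writeUInt40(raf, pos + 1 + bytes.length + 5);  // default "next" = the record that follows
        return pos;
    }

    // 5-byte big-endian unsigned integer: enough for index files up to 1 TB.
    static void writeUInt40(RandomAccessFile raf, long v) throws IOException {
        for (int shift = 32; shift >= 0; shift -= 8)
            raf.writeByte((int) (v >>> shift) & 0xFF);
    }

    static long readUInt40(RandomAccessFile raf) throws IOException {
        long v = 0;
        for (int i = 0; i < 5; i++) v = (v << 8) | raf.readUnsignedByte();
        return v;
    }

    static int lengthAt(RandomAccessFile raf, long pos) throws IOException {
        raf.seek(pos);
        return raf.readUnsignedByte();
    }

    // "Deletes" the record at victimPos by making the previous record's
    // next-pointer skip over it; no bytes are moved or rewritten.
    static void unlink(RandomAccessFile raf, long prevPos, long victimPos) throws IOException {
        raf.seek(victimPos + 1 + lengthAt(raf, victimPos));
        long next = readUInt40(raf);                   // where the victim pointed
        raf.seek(prevPos + 1 + lengthAt(raf, prevPos));
        writeUInt40(raf, next);                        // previous record now skips the victim
    }

    public static void main(String[] args) throws Exception {
        java.io.File f = Files.createTempFile("index", ".dat").toFile();
        try (RandomAccessFile raf = new RandomAccessFile(f, "rw")) {
            long a = append(raf, "img_0001.jpg");
            long b = append(raf, "img_0002.jpg");
            long c = append(raf, "img_0003.jpg");
            unlink(raf, a, b);                         // mid-day delete of the middle record
            // Walking from 'a': its next-pointer now jumps straight to 'c'.
            raf.seek(a + 1 + lengthAt(raf, a));
            System.out.println(readUInt40(raf) == c);
        }
    }
}
```

The daily recreation pass would then rewrite the file with only the still-linked records, reclaiming the space of unlinked ones.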
 

peacemaker

Expert
Licensed User
Longtime User
Thanks, Erel. It seems no further discussion is needed for this topic of mine.
But I posted it here because it is a common cross-platform B4X question. Where would be a better place to post such questions?
 

peacemaker

Expert
Licensed User
Longtime User
I had this app earlier on Android and now on Linux. Both times the situation was the same: a huge quantity of small files that must be stored for a long time, as a photo log bank, using as much drive capacity as possible, but without overloading the OS.
So the task does seem to be cross-platform.
 