I'm creating a UI B4J program and find that iterating the filesystem and loading a database (9 fields) works fine. However, when I use the MD5 feature of the Encryption v1.1 library (I am getting a checksum of the files themselves), memory quickly rises out of heap space (>8 GB within the first few hundred iterations) and crashes the program.
With the MD5 code commented out and the string "na" placed in the result variable, the program completes iterating all files in the selected directory (17,187 files), loads them all into a single list, and updates an SQLite table in a single batch without exceeding 1.2 GB of memory.
The crash also occurs if I specify 'SHA-1', except it crashes faster (maybe 150 iterations).
Files greater than about 1.5 GB will crash the program instantly.
I am replacing a media system I wrote years ago that relies on PHP to perform this task (it still runs great). That system uses an HTTP server, a database server, and PHP to deliver all of my home media to my devices and computer web browsers. The new system is intended to be a single application that handles everything and performs the same job.
Is there a better way to get the MD5 or similar using B4J?
Thanks for any comments.
Update:
I found this:
www.b4x.com
and applied inline Java to perform the task. It is much faster and does not generate the out-of-memory error. It actually uses a little less memory.
However, larger files (a GB or more) still crash with an 'out of memory' error.
Update:
MEMORY LEAK?
NO.
I was loading the entire file into a variable, which caused the 'out of memory' error.
The answer is to stream the file in, process it in chunks, then return the result.
(For anyone else with this issue.)
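For anyone hitting the same wall, here is a minimal sketch of the chunked approach in plain Java (the class and method names are mine, for illustration); the same logic drops straight into a B4J inline-Java block. Only one small buffer is ever held in memory, so file size no longer matters:

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class FileHash {
    // Hash a file of any size by feeding MessageDigest one chunk at a
    // time instead of loading the whole file into a byte array.
    public static String md5Hex(Path file) throws Exception {
        MessageDigest md = MessageDigest.getInstance("MD5"); // or "SHA-1"
        byte[] buf = new byte[64 * 1024]; // 64 KB chunk; memory use stays flat
        try (InputStream in = Files.newInputStream(file)) {
            int n;
            while ((n = in.read(buf)) > 0) {
                md.update(buf, 0, n); // digest this chunk, then reuse the buffer
            }
        }
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest()) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // Demo on a small temp file; works the same on multi-GB files.
        Path tmp = Files.createTempFile("md5demo", ".bin");
        Files.write(tmp, "abc".getBytes(StandardCharsets.UTF_8));
        System.out.println(md5Hex(tmp)); // 900150983cd24fb0d6963f7d28e17f72
        Files.delete(tmp);
    }
}
```

Swapping `"MD5"` for `"SHA-1"` (or `"SHA-256"`) is the only change needed for the other algorithms.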