Android Question Out of Memory error - Limitation?

GeoffT660

Active Member
Licensed User
Longtime User
I received this error when I was trying to download and import 150K records. I have no problem with fewer records, 10K to 20K (tested so far). Just wondering if there is a solution for this or if it is just a size limitation.


java.lang.OutOfMemoryError: OutOfMemoryError thrown while trying to throw OutOfMemoryError; no stack trace available
java.lang.OutOfMemoryError: Failed to allocate a 104 byte allocation with 88 free bytes and 88B until OOM, max allowed footprint 536870912, growth limit 536870912
at com.android.org.conscrypt.NativeCrypto.SSL_read(Native Method)
at com.android.org.conscrypt.OpenSSLSocketImpl$SSLInputStream.read(OpenSSLSocketImpl.java:766)
at okio.Okio$2.read(Okio.java:138)
at okio.AsyncTimeout$2.read(AsyncTimeout.java:236)
at okio.RealBufferedSource.request(RealBufferedSource.java:66)
at okio.RealBufferedSource.require(RealBufferedSource.java:59)
at okhttp3.internal.http2.Http2Reader.nextFrame(Http2Reader.java:88)
at okhttp3.internal.http2.Http2Connection$ReaderRunnable.execute(Http2Connection.java:568)
at okhttp3.internal.NamedRunnable.run(NamedRunnable.java:32)
at java.lang.Thread.run(Thread.java:764)
 

Pendrush

Well-Known Member
Licensed User
Longtime User
There is probably no justified reason to load 150K items at once and show them to the user.
What do you actually want to do? Where are you importing the records (into a view, SQLite, or something else), and how?
 
Upvote 0

GeoffT660

Active Member
Licensed User
Longtime User
To be honest, I don't need to download 150K records, but I was testing because I do need 20K to 30K (which works fine and fast), since users don't always have internet access but still need the data. I create a text file and download it from the server (that works fine), but it's when I'm creating the maps to import through DBUtils that it bombs out with that error, before I can call DBUtils.InsertMaps. I'm just wondering if there is another way or a solution in case I did want to import 150K records. The text file is 49 MB.
 
Upvote 0

GeoffT660

Active Member
Licensed User
Longtime User
Thanks for the response. I do have that line, as I needed it for the 30K. What would cause a memory leak, and how would I test for one? I'm just using generic code to read a CSV (which works fine) and create the maps to import, but it's only when I'm creating the maps that it fails. As mentioned, the error occurs while creating the maps, after I load the CSV and before I call DBUtils to import them.
 
Upvote 0

Semen Matusovskiy

Well-Known Member
Licensed User
Without largeHeap the app is limited to 64 MB; with largeHeap, 512 MB.
Android tells you absolutely clearly: "growth limit 536870912" (512 * 1024 * 1024). And largeHeap is not really a normal setting.
Of course, if you rotate photos etc. you need a lot of memory. But keeping hundreds of megabytes of database information in memory instead of in files is obviously a bad idea.
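
For completeness, in B4A the large heap is requested through the manifest editor with the stock SetApplicationAttribute directive; the attribute itself is standard Android:

B4X:
'Manifest editor: request the large heap (raises the growth limit, 512 MB on this device)
SetApplicationAttribute(android:largeHeap, "true")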
 
Upvote 0

GeoffT660

Active Member
Licensed User
Longtime User
Is there a better way to move that many records from a text file to a SQLite db without creating a map? I will look into the B4J solution and keep you posted. Thanks again for your ideas.
 
Upvote 0

mc73

Well-Known Member
Licensed User
Longtime User
One solution would be to split the original txt file into a 'safe' number of files for which the import runs smoothly, something like the sketch below.
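
A rough sketch of that idea, assuming a plain one-record-per-line CSV; the file names and the 10K chunk size are placeholders for illustration, not tested values:

B4X:
'Split data.csv into chunks of 10,000 lines so each import stays small.
Dim tr As TextReader
tr.Initialize(File.OpenInput(File.DirInternal, "data.csv"))
Dim chunk As Int = 0
Dim count As Int = 0
Dim sb As StringBuilder
sb.Initialize
Dim line As String = tr.ReadLine
Do While line <> Null
    sb.Append(line).Append(CRLF)
    count = count + 1
    If count = 10000 Then
        File.WriteString(File.DirInternal, "chunk" & chunk & ".csv", sb.ToString)
        chunk = chunk + 1
        sb.Initialize 'start a fresh buffer
        count = 0
    End If
    line = tr.ReadLine
Loop
If count > 0 Then File.WriteString(File.DirInternal, "chunk" & chunk & ".csv", sb.ToString)
tr.Close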
 
Upvote 0

udg

Expert
Licensed User
Longtime User
Is there a better way to move that many records from a text file to a sqlLite db without creating a map?
Since you know the DB record structure beforehand, you could write an INSERT SQL statement and loop over your data. To get a clue on how to do it, just look at how InsertMaps works (at some point it has to "translate" the data you pass it into an INSERT statement).
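
A minimal sketch of this first option, assuming a hypothetical items table with three columns and a comma-separated source file; the single transaction around the loop is what makes a bulk insert fast:

B4X:
'Stream the CSV line by line and insert directly; table and column names are hypothetical.
Dim tr As TextReader
tr.Initialize(File.OpenInput(File.DirInternal, "data.csv"))
SQL1.BeginTransaction
Try
    Dim line As String = tr.ReadLine
    Do While line <> Null
        Dim cols() As String = Regex.Split(",", line)
        SQL1.ExecNonQuery2("INSERT INTO items (id, name, qty) VALUES (?, ?, ?)", _
            Array As Object(cols(0), cols(1), cols(2)))
        line = tr.ReadLine
    Loop
    SQL1.TransactionSuccessful
Catch
    Log(LastException)
End Try
SQL1.EndTransaction
tr.Close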

If you find it easier to keep the InsertMaps approach, why not feed the function just a smaller block of data, repeating the action as many times as needed to exhaust your original file?
I mean, you read the 150K records but pass 10K of them to InsertMaps, then another batch of 10K, and so on.
If I recall correctly, InsertMaps expects as a parameter a list of maps where each map is a record (key = record field, value = field value); so limiting the list to 10K will limit the RAM needed for each step of your loop, as in the sketch below.
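
That batching idea could look like this (table and field names again hypothetical); only one 10K list of maps is alive at any moment, so the heap stays flat:

B4X:
'Build at most 10,000 maps at a time and hand each batch to DBUtils,
'so the full 150K records never sit in memory at once.
Dim tr As TextReader
tr.Initialize(File.OpenInput(File.DirInternal, "data.csv"))
Dim batch As List
batch.Initialize
Dim line As String = tr.ReadLine
Do While line <> Null
    Dim cols() As String = Regex.Split(",", line)
    Dim rec As Map
    rec.Initialize
    rec.Put("id", cols(0))
    rec.Put("name", cols(1))
    rec.Put("qty", cols(2))
    batch.Add(rec)
    If batch.Size = 10000 Then
        DBUtils.InsertMaps(SQL1, "items", batch)
        batch.Initialize 'start a fresh, empty batch
    End If
    line = tr.ReadLine
Loop
If batch.Size > 0 Then DBUtils.InsertMaps(SQL1, "items", batch)
tr.Close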

Personally, I would go for the first option: with that many records it will speed things up a lot (InsertMaps necessarily performs a few "unnecessary" preparation steps). My 2 cents.
 
Upvote 0