Android Question Bulk insert/update

EddyW

Member
Licensed User
Longtime User
Hi, I have to make an app that receives a big XML file with 10,000+ records.

I have to place them in a SQLite database with the following checks:
- if a record does not exist, insert it
- if a record exists but upload = 0 (in the SQLite DB on the tablet), update it
- if a record exists and upload = 1 (in the SQLite DB on the tablet), ignore it

What is the best/fastest way to insert/update this many records?
Should I use the XML parser and insert/update record by record, or is there a better way to do it?
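The three rules above can be sketched in Python with the built-in `sqlite3` module (the table and column names here are made up for illustration; the same logic would apply in B4A code):

```python
import sqlite3

def import_record(cur, rec):
    """Apply the three rules: insert if new, update only when upload = 0,
    skip when upload = 1 (the row was changed locally and not yet uploaded)."""
    row = cur.execute("SELECT upload FROM items WHERE id = ?",
                      (rec["id"],)).fetchone()
    if row is None:
        # record does not exist -> insert it
        cur.execute("INSERT INTO items (id, name, upload) VALUES (?, ?, 0)",
                    (rec["id"], rec["name"]))
    elif row[0] == 0:
        # record exists and upload = 0 -> update it
        cur.execute("UPDATE items SET name = ? WHERE id = ?",
                    (rec["name"], rec["id"]))
    # record exists and upload = 1 -> ignore it

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT, upload INTEGER)")
conn.execute("INSERT INTO items VALUES (2, 'old', 0), (3, 'locked', 1)")

cur = conn.cursor()
for rec in [{"id": 1, "name": "new"},
            {"id": 2, "name": "updated"},
            {"id": 3, "name": "changed"}]:
    import_record(cur, rec)
conn.commit()
```

Run inside a single transaction (as above, one `commit` at the end), this per-record check stays fast even for thousands of rows.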

regards, Eddy
 

DonManfred

Expert
Licensed User
Longtime User
If it is possible, I would get smaller XMLs, maybe 100-500 entries in each part, and fetch them one after another until I have all the data. All entries in one file could cause a lot of memory usage.

After each part is downloaded, I would use a thread to import it into the DB. If memory is not a problem, you can try to get the whole list at once and then import it in a thread.
When a thread finishes, I would download the next part and start a new thread to import it, and so on.

See here
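The download-one-part / import-in-background pattern can be sketched in Python with `sqlite3`, `threading`, and `queue` (the `fetch_part` function is a hypothetical stand-in for the web-service call; a bounded queue keeps only a couple of parts in memory at a time):

```python
import os
import queue
import sqlite3
import tempfile
import threading

db_path = os.path.join(tempfile.mkdtemp(), "sync.db")
sqlite3.connect(db_path).executescript(
    "CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT, upload INTEGER DEFAULT 0);")

# Hypothetical stand-in for the web service: returns one part of up to
# `size` records per call, and an empty list when everything has been sent.
ALL = [(i, "name%d" % i) for i in range(1200)]
def fetch_part(offset, size=500):
    return ALL[offset:offset + size]

def importer(q):
    conn = sqlite3.connect(db_path)  # the importer thread uses its own connection
    while True:
        part = q.get()
        if part is None:             # sentinel: no more parts
            break
        conn.executemany("INSERT OR IGNORE INTO items (id, name) VALUES (?, ?)", part)
        conn.commit()                # one commit per part, not per row
    conn.close()

q = queue.Queue(maxsize=2)           # bounds memory: at most 2 parts in flight
t = threading.Thread(target=importer, args=(q,))
t.start()

offset = 0
while True:
    part = fetch_part(offset)        # "download" the next part
    if not part:
        break
    q.put(part)                      # import runs in background while we fetch more
    offset += len(part)
q.put(None)
t.join()
```

This is only a sketch of the pipelining idea; in B4A the download would be asynchronous (e.g. HttpJob events) rather than a blocking loop.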
 

EddyW

Member
Licensed User
Longtime User
Thanks DonManfred, we will make the XML in smaller blocks.

We did some tests, and inserting records with DBUtils.InsertMaps is fast (is there a faster way than InsertMaps?),
but we still have to test the best way to get the XML (from a web service) into the list of maps that InsertMaps needs.

We are thinking of using the XML parser and a StringBuilder to add lines (maps) to a global list, and when all lines/maps are added,
insert the list of maps into the SQLite DB. We have to test whether this is fast enough.
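The parse-then-bulk-insert plan can be sketched in Python with the built-in `xml.etree` and `sqlite3` modules. The XML shape and column names are invented for illustration, and `executemany` inside a single transaction plays roughly the role that a list of maps passed to InsertMaps plays in B4A:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical XML shape; the real web-service format may differ.
xml_data = """<records>
  <record><id>1</id><name>alpha</name></record>
  <record><id>2</id><name>beta</name></record>
</records>"""

# Step 1: parse the whole XML into a list of maps (dicts) first.
maps = [{"id": int(r.findtext("id")), "name": r.findtext("name")}
        for r in ET.fromstring(xml_data).iter("record")]

# Step 2: insert the whole list in one transaction, which is far faster
# than committing row by row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT, "
             "upload INTEGER DEFAULT 0)")
with conn:  # the `with` block wraps the batch in a single transaction
    conn.executemany("INSERT INTO items (id, name) VALUES (:id, :name)", maps)
```

The main cost to watch is holding the full list of maps in memory, which is why the smaller blocks suggested above help.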
 

mc73

Well-Known Member
Licensed User
Longtime User
Since we're talking about a large data file, I would consider the option of sending only the rows that need to be updated or inserted. To do this, I would store in the original DB (server side) which rows have already been inserted or updated on the client's side. This would require a unique ID for each device, of course; even better, store the timestamp of insertions/updates.
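The timestamp-based delta idea can be sketched as a server-side query; the table and column names here are illustrative, assuming the server keeps a last-sync timestamp per device:

```python
import sqlite3

# Server-side schema sketch: items carry an updated_at timestamp, and
# device_sync records when each device last synced successfully.
server = sqlite3.connect(":memory:")
server.executescript("""
CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT, updated_at INTEGER);
CREATE TABLE device_sync (device_id TEXT PRIMARY KEY, last_sync INTEGER);
INSERT INTO items VALUES (1, 'old', 100), (2, 'new', 250), (3, 'newer', 300);
INSERT INTO device_sync VALUES ('tablet-42', 200);
""")

def delta_for(device_id):
    """Return only the rows changed since this device last synced."""
    last = server.execute("SELECT last_sync FROM device_sync WHERE device_id = ?",
                          (device_id,)).fetchone()[0]
    return server.execute("SELECT id, name FROM items WHERE updated_at > ? "
                          "ORDER BY id", (last,)).fetchall()

rows = delta_for("tablet-42")  # only the rows newer than the last sync go out
```

After a successful import, the client would report back and the server would advance `last_sync`, so each sync only transfers the delta instead of all 10,000+ records.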
 

sorex

Expert
Licensed User
Longtime User
You could append your queries into one big batch in the check loop.

Executing multiple statements in one go (read: one connection) is way faster than one by one.

I even think there is some sort of query queue made by Erel which probably uses this principle
to improve bulk inserts/updates.
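The batching principle can be shown with Python's `sqlite3`: queueing all statements and committing once avoids the per-statement transaction overhead (in B4A this is probably what the SQL library's batch methods do under the hood, which may be the "query queue" meant here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
rows = [(i, "n%d" % i) for i in range(10000)]

# Slow pattern: one transaction (and, on a real file, one disk sync) per row.
# for r in rows:
#     conn.execute("INSERT INTO items VALUES (?, ?)", r)
#     conn.commit()

# Fast pattern: queue every statement and commit once at the end.
with conn:  # a single transaction around the whole batch
    conn.executemany("INSERT INTO items VALUES (?, ?)", rows)
```

On a file-backed database the difference is dramatic, because each commit normally forces a write to disk; one transaction means one such write for the entire batch.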
 