Hi, I have to make an app that receives a large XML file with 10,000+ records.
I have to place them in a SQLite database with the following checks:
- if the record does not exist, insert it
- if the record exists but upload = 0 (in the SQLite db on the tablet), update it
- if the record exists but upload = 1 (in the SQLite db on the tablet), ignore it
What is the best/fastest way to insert/update this many records?
Should I use the XmlParser and insert/update record by record, or is there a better way to do it?
If possible, I would request smaller XMLs, maybe 100-500 entries per part, and fetch them one after another until you have all the data. All entries in one go could cause a lot of memory usage.
After each part is downloaded, I would use a thread to import it into the db. If memory is not a problem, you can try to get the whole list at once and then import it in a thread.
When the thread finishes, I would download the next part and start a new thread to import it, and so on.
Thanks DonManfred, we will split the XML into smaller blocks.
We did some tests, and inserting records with dbutils.insertMaps is fast (is there a faster way than InsertMaps?),
but we still have to test what the best way is to get the XML (from a web service) into a list of maps for InsertMaps.
We are thinking of using the XmlParser with a StringBuilder to add lines (maps) to a global list and, when all lines/maps have been added,
insert the list of maps into the SQLite db. We have to test whether this is fast enough.
Since we're talking about a large data file, I would consider the option of sending only the rows that need to be updated or inserted. To do this, I would store in the original db (server side) which rows have already been inserted or updated on the client's side. This would require a unique ID for each device, of course, and, even better, store the timestamp of insertions/updates.