Hi,
I'm doing some data transfers from a remote SQL server and things are going great. BUT... I have a question.
The way most examples work is by having b4a make an HTTP request which in turn (on the host server) queries the database and returns a simple array in JSON format which looks like this:
(you can get a better sense of the data by using some kind of online parser like these:
Json Parser Online
Online JSON Viewer)
B4X:
[{"DES":"--ss dp(12)710ML","PRO":"4999"},{"DES":"--ssdp(12-24) 450-500-547-591-695ML","PRO":"4099"},{"DES":"--dp (24)341MLcan","PRO":"3499"},{"DES":"--ss dp bib","PRO":"6299"},{"DES":"--dp(12)341MLcan","PRO":"3998"},{"DES":"COKE(12)1L","PRO":"1001"},{"DES":"SPRITE 1 LITRE","PRO":"1002"},{"DES":"n\/a","PRO":"1050"},{"DES":"n\/a","PRO":"1041"},{"DES":"n\/a","PRO":"1040"},{"DES":"n\/a","PRO":"1042"},{"DES":"SPRITE 1 L.","PRO":"V1002"},{"DES":"SPRITE DIETE 1 L.","PRO":"V1050"},{"DES":"GINGER ALE 1 LITRE","PRO":"V1041"},{"DES":"TONIC WATER 1 LITRE","PRO":"V1040"},{"DES":"CLUB SODA 1 LITRE","PRO":"V1042"},{"DES":"COKE CLASSIQUE 1 L.","PRO":"V1001"},{"DES":"COKE DIET(12)1L","PRO":"1053"},{"DES":"COKE DIETE 1 L","PRO":"V1053"},{"DES":"COKE 24X355 ML. NR.","PRO":"V0501"}]
After that, b4a converts that chunk of data to a list like this:
B4X:
Dim parser_stk As JSONParser
parser_stk.Initialize(Job.GetString) 'raw JSON text returned by the HTTP job
Dim stock As List
stock = parser_stk.NextArray 'a List of Maps, one Map per record
This is VERY powerful because you can just populate a local database by doing this with the list:
B4X:
DBUtils.InsertMaps(SQL, "stk", stock)
NOW... my question...
Having an array transferred directly in JSON format is great but it does use a little bit more bandwidth as each column name is repeated for each record. Consider this JSON format of an SQL query (as opposed to an array):
B4X:
{"COLUMNS":["PRO","DES"],"DATA":[["4999","--ss dp(12)710ML"],["4099","--ssdp(12-24) 450-500-547-591-695ML"],["3499","--dp (24)341MLcan"],["6299","--ss dp bib"],["3998","--dp(12)341MLcan"],["1001","COKE(12)1L"],["1002","SPRITE 1 LITRE"],["1050","n\/a"],["1041","n\/a"],["1040","n\/a"],["1042","n\/a"],["V1002","SPRITE 1 L."],["V1050","SPRITE DIETE 1 L."],["V1041","GINGER ALE 1 LITRE"],["V1040","TONIC WATER 1 LITRE"],["V1042","CLUB SODA 1 LITRE"],["V1001","COKE CLASSIQUE 1 L."],["1053","COKE DIET(12)1L"],["V1053","COKE DIETE 1 L"],["V0501","COKE 24X355 ML. NR."]]}
This is 35% smaller, and that's with 3-letter column names... it could be a lot more when using more descriptive column names.
I haven't been able to transform that into an array using the JSON parser library. So far, I'm not even able to parse it and traverse the structure with b4a.
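What I imagine is needed is something along these lines: read the root object with NextObject instead of NextArray, then rebuild the list of maps from the COLUMNS and DATA lists so it can feed DBUtils.InsertMaps as before. This is just my sketch of the idea, not something I have working:
B4X:
Dim parser_stk As JSONParser
parser_stk.Initialize(Job.GetString)
Dim root As Map
root = parser_stk.NextObject 'top level is an object here, not an array
Dim cols As List = root.Get("COLUMNS") 'e.g. ["PRO","DES"]
Dim rows As List = root.Get("DATA") 'list of row lists
Dim stock As List
stock.Initialize
For Each row As List In rows
    Dim m As Map
    m.Initialize
    For i = 0 To cols.Size - 1
        m.Put(cols.Get(i), row.Get(i)) 'pair each column name with its value
    Next
    stock.Add(m)
Next
DBUtils.InsertMaps(SQL, "stk", stock) 'same call as with the plain array format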
Am I missing something? Have some of you done some workarounds?
Thanks for any info.
JF.