I divide my DateTime by 1000 which I believe converts it to seconds
Correct. Ticks are in milliseconds.
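For example, a minimal sketch (the variable names are just illustrative):

' DateTime.Now returns ticks = milliseconds since 01/01/1970
Dim ms As Long = DateTime.Now
Dim unixSeconds As Long = ms / 1000   ' seconds since 01/01/1970, i.e. Unix time
Log(ms)
Log(unixSeconds)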
Unix time was originally a (signed) 32-bit integer of seconds since 01/01/1970. The maximum value of a signed 32-bit integer is 2^31 - 1, i.e. a tad over 2 billion. Divide by 3600 seconds/hour, 24 hours/day and 365 days/year, and you get a maximum of roughly 68 years. 1970 + 68 = 2038, i.e. the Unix equivalent of Y2K.
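You can see that rollover from B4X itself. A quick sketch (shown in UTC; the format string is just for readability):

' Feed the signed 32-bit maximum, as milliseconds, back into DateTime
DateTime.SetTimeZone(0)                      ' display in UTC
DateTime.DateFormat = "yyyy-MM-dd HH:mm:ss"
Dim maxUnixSeconds As Long = 2147483647      ' 2^31 - 1
Log(DateTime.Date(maxUnixSeconds * 1000))    ' 2038-01-19 03:14:07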
B4X time is a 64-bit integer of milliseconds since 01/01/1970. With similar math, we get a maximum of roughly 292 million years, so we should be OK for a while. Plus it covers the entire period of human history backwards, whereas (signed 32-bit) Unix time only goes back to ~1902 (i.e. 1970 - 68), and thus my grandmother, born in 1901, would be reborn in 2037.
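The back-of-the-envelope version of that math, as a sketch:

' Range of a signed 64-bit millisecond counter, in years
Dim maxMs As Double = Power(2, 63)                   ' upper bound of the tick count
Dim years As Double = maxMs / 1000 / 3600 / 24 / 365
Log(years)                                           ' ~2.9e+8, i.e. roughly 292 million years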
Why does my array start with the first four elements as zeros?
Because dividing by 1000 shifts the number to the right (by 3 decimal digits = approximately 10 binary digits = approximately 1.25 bytes), so the bits (the value) that were in byte array elements 2 and 3 have moved down into elements 4 and 5, and the zero values of elements 0 and 1 have moved down into elements 2 and 3.
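You can watch the shift happen by dumping the eight bytes before and after the division. A sketch, assuming the ByteConverter library is referenced (its conversions write the high-order byte first by default):

Dim bc As ByteConverter
Dim ms As Long = DateTime.Now                                ' currently ~41 significant bits
Dim secs As Long = ms / 1000                                 ' ~31 significant bits
Log(bc.HexFromBytes(bc.LongsToBytes(Array As Long(ms))))     ' two leading 00 bytes
Log(bc.HexFromBytes(bc.LongsToBytes(Array As Long(secs))))   ' four leading 00 bytes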
If you're thinking that, because the first four elements are zero, you can ignore them and just send the lower four bytes as a 32-bit Int rather than all eight bytes as a 64-bit Long, then you're back at the Unix 2038 problem. Although, yes, you could treat those four bytes as an UNSIGNED 32-bit value, which would double your 68-year maximum and push the rollover problem (to around 2106) onto your great-great-grandchildren.
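And if you do send only four bytes, the receiving end has to rebuild them as an unsigned value. A sketch (the 4-byte array b() and its big-endian order are assumptions, not part of the original question):

' Rebuild an unsigned 32-bit seconds value into a Long,
' so timestamps past 19/01/2038 don't come back negative.
Dim seconds As Long = 0
For i = 0 To 3
    seconds = seconds * 256 + Bit.And(b(i), 0xFF)   ' b() holds the received bytes, high byte first
Next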