Android Question Line Chart

Discussion in 'Android Questions' started by MarkusR, Mar 26, 2019.

  1. MarkusR

    MarkusR Well-Known Member Licensed User

    hello,

    is there a simple (bullet-proof) line chart where i can just add data with a timestamp and a name, to monitor the last 24 hours?
    i need it to visualize 4 different sensor values.

    attached my first try and ugly result.

    in the extreme case i collect 86400 x 4 values per day, one reading per second for each of the 4 sensors.

    temperature 10-50 °C
    humidity 0-100%
    movement yes/no
    brightness 0-1023

    i am trying to reproduce a sleep phase analysis project with an Arduino microcontroller board. :D
    IMG_20190326_194836.jpg
     


    Last edited: Mar 26, 2019
  2. thetahsk

    thetahsk Active Member Licensed User

  3. Computersmith64

    Computersmith64 Well-Known Member Licensed User

    MarkusR likes this.
  4. MarkusR

    MarkusR Well-Known Member Licensed User

    ok, i tested some things before; i think i need to collect the data into quarter-hour average columns. then i have only 96 columns, which are easy & fast to draw.
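A minimal sketch of that consolidation idea in Python (the thread is B4A, so treat this as language-neutral pseudologic; the function name and data shape are assumptions, not the poster's actual code) — bucket each (timestamp, value) pair into one of 96 quarter-hour columns and average per column:

```python
from collections import defaultdict

def quarter_hour_averages(samples):
    """Condense (timestamp_seconds, value) pairs into 96 quarter-hour
    column averages for one day. Returns {column_index: average}."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for ts, value in samples:
        col = int(ts // 900) % 96   # 900 s = 15 min -> 96 columns/day
        sums[col] += value
        counts[col] += 1
    return {col: sums[col] / counts[col] for col in sums}
```

With this, redrawing the whole day is just 96 points per sensor instead of 86400.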
     
  5. emexes

    emexes Active Member Licensed User

    Man, that brings back some memories. Back in the Dark Days of DOS, when programming data acquisition systems that would run week-long tests on engines, I had the same issue of displaying long-term graphs without long-term calculation or redrawing. What I ended up doing was storing the graph data at 6 different "zoom levels", I think it was "nice" round numbers like 1000 points at 10 Hz, 1000 points at 2 Hz, 1000 points at 0.5 Hz, 1000 points at 0.1 Hz, 1000 points at 0.02 Hz and 1000 points at 0.005 Hz. Nope, must have been 7 zoom levels, add on 1000 points at 0.001 Hz. So that last zoom level represents 1 million seconds = just over a week, and I could redraw that graph (or any of the other zoom levels) from scratch in about 10 ms.

    You'd think keeping a 600-pixel-wide graph (back in the days when 640x480 VGA was king) and 7 sets of data up to date 10 times per second would be hard, but it is actually not. The first shortcut is that 599 of the 600 graph x-columns don't change when a new reading comes in, thus there is no need to redraw them - all you need to update is the last (most recent) point. The second was to scroll by "unplotting" each previous point (pixels) of the graph just before drawing the corresponding new one - in fact, this had a double benefit for the horizontal portions of the graph (i.e., a lot) because you could just reuse the pixel already there from the previous point, no need to restore it and then write the new pixel in its place.

    The other shortcut is that, yes, you're keeping 7 sets of data up to date, but it's either: (i) you only have to keep the last "bucket" of each set up to date, which is a few integer operations and a few (add and min/max comparison) floating point operations (which even ye olde '386 could handle with ease), or (ii) you're only updating the 10 Hz and 2 Hz sets at full rate; the slower sets only need updating at much slower rates (down to once-every-200 seconds).

    I vaguely remember using indexes that pointed to the most recent point in circular arrays, rather than shuffling data along by one position every time new data came in.
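That circular-array trick can be sketched like this (a Python illustration of the general technique, not the original DOS code; the class name is my own):

```python
class Ring:
    """Fixed-size circular buffer: an index points at the most recent
    entry, so appending overwrites the oldest slot instead of
    shuffling every element along by one position."""
    def __init__(self, size):
        self.data = [None] * size
        self.head = -1           # index of most recent point
        self.count = 0
    def append(self, value):
        self.head = (self.head + 1) % len(self.data)
        self.data[self.head] = value
        self.count = min(self.count + 1, len(self.data))
    def newest_first(self):
        """Return stored values, most recent first."""
        return [self.data[(self.head - i) % len(self.data)]
                for i in range(self.count)]
```

Appending stays O(1) no matter how large the buffer is, which is what makes the 10 Hz update rate cheap.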

    I also remember that 10 Hz was a good sampling rate because it was a whole-number-of-cycles of mains electricity, regardless of whether it was 50 Hz or 60 Hz. And that we had an aluminium fencing manufacturer next door whose induction casting machinery produced phenomenal amounts of electrical noise. One moment the CRO was a beautiful clean sine wave or sensor reading, next moment it would explode into random lines. It did have the benefit that we had to make our product resilient to noise way greater than it ever encountered out in the field (except for the odd case of one brake tester, that turned out to be at the focal point of local TV tower transmissions reflected off a 10 acre car wrecking yard 4 km away...)
     
    Last edited: Mar 27, 2019
    Shelby and MarkusR like this.
  6. emexes

    emexes Active Member Licensed User

    Sorry, the short takeaway from that spiel is: saving them as condensed points (where each point/bucket represents multiple readings) works great.

    Consider saving multiple zoom levels.

    Like, if you want to be able to show from 0.25 second resolution to a full week's worth (because you're looking at the light cycle timing or something) and say your graph is 800 pixels/points wide, then:

    one week scale = 7 x 86400 x 4 = 2419200 readings
    0.25 second resolution scale = 800 readings

    2419200 / 800 = 3024 factor between the two (smallest-to-largest) scales

    3024 ^ (1/(numzoomlevels-1)) = 3024 ^ (1/5) = 4.96 = round up to 5x multiplier step between scales

    So you'd have:

    Zoom level 1 = 1 reading per point = 0.25 seconds = 200 second-wide graph
    Zoom level 2 = 5 readings per point = 1.25 seconds = 16 minute-wide graph
    Zoom level 3 = 25 readings per point = 6.25 seconds = 83 minute-wide graph
    Zoom level 4 = 125 readings per point = 31.25 seconds = 6.9 hour-wide graph
    Zoom level 5 = 625 readings per point = 156.25 seconds = 34.7 hour-wide graph
    Zoom level 6 = 3125 readings per point = 781.25 seconds = 7.23 day-wide graph

    and instead of keeping 2419200 readings at 0.25 second resolution x say 4 bytes/float = 10 MB (per channel),
    you'd have 6 zoom levels x 800 samples x 3 summaries (min, max & average) x 4 bytes/float = 0.058 MB (per channel) = 99% memory saving (and instantaneous graph display, since those readings are already condensed = no need to go through those 2.4M readings)
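The per-bucket consolidation behind those numbers can be sketched as follows (an assumed Python illustration of the min/max/average summaries described above, not code from the original system):

```python
def consolidate(readings, factor):
    """Condense each run of `factor` raw readings into one
    (min, max, average) summary bucket, as used for the slower
    zoom levels. Leftover readings that don't fill a bucket
    are ignored here for simplicity."""
    buckets = []
    for i in range(0, len(readings) - factor + 1, factor):
        chunk = readings[i:i + factor]
        buckets.append((min(chunk), max(chunk), sum(chunk) / factor))
    return buckets
```

Keeping min and max alongside the average is what stops short spikes from vanishing when 3125 readings collapse into one point.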

    On the other hand, if your software specification/customer says one day will do, and is ok with quarter-hour resolution, then... unless there's some benefit to you in exceeding requirements, or you're a bit of a perfectionist, or you just like doing a super-good job, or you enjoy this stuff, then: 96 columns is the go (but I'd track the minimum and maximum too, so that short events don't get lost in the consolidation).

    What could go wrong?!?!

    :)

    (hmm - so much for the "short takeaway"...)
     
    Shelby and MarkusR like this.
  7. MarkusR

    MarkusR Well-Known Member Licensed User

    @emexes
    yes, a zoom level is something i will have too for a week or month overview.
    thank you for your long explanation. i hope you took a screenshot back then :)
     
    Shelby likes this.