Android Question WiFi data acquisition

johnf2013

Member
Licensed User
Longtime User
I am developing an app that receives real-time engineering data over WiFi and displays it in various ways. Android is mostly for display and UI; the heavy number crunching happens at the source. The Android display code is finished, and the emulator has been very convenient for this using simulated data. Now I need to write the WiFi data acquisition code. Although it is stated that WiFi is not emulated, there are comments on the internet that it works with certain emulated devices, though I cannot get it to work the way the authors suggest.
Am I wasting my time trying, and should I just tolerate using a real device for development?
Is there a solution for development that does not use wifi?
Any suggestions would be appreciated.
 

Peter Simpson

Expert
Licensed User
Longtime User
Why are you using an emulator if you have a real device?
  1. B4A Bridge - Through WiFi
  2. USB Debugging - Through a cable
Tolerate???
If possible you should really be using a real device with one of the two solutions I listed above. I for one only use real devices, unless there's a specific reason to emulate a device that I do not own.

Enjoy...
 
Upvote 0

johnf2013

Member
Licensed User
Longtime User
Mmmm. I'm new to this Android stuff, not programming. I guess I got carried away with the performance of the emulator. It runs as fast as the real device, and my development machine has multiple large monitors, so it's very convenient to have the emulator on one of them - big and in your face - better than peering at a tiny six-inch screen that seems to have legs and a mind of its own.
I was unaware how others were doing their development. Thanks for your help.
 
Upvote 0

Peter Simpson

Expert
Licensed User
Longtime User
Multiple monitors - now that makes more sense. In that case, for anything that needs WiFi or the internet, just use a real device; otherwise use the new B4A AVD Manager emulator.

If you are using the Google AVD, it will be almost impossible for the emulator to be faster than a real device. If you are using B4A 7.8+ with its built-in AVD Manager and you have an Intel-based machine, it will be a lot quicker, but still no match for a real device side by side.

For anything WiFi based just use a real device, it will make your life a lot easier.

Using the B4A Bridge is an excellent solution, you should try it on all your devices...
 
Upvote 0

canalrun

Well-Known Member
Licensed User
Longtime User
I have developed several projects that do exactly the same thing – use the phone as a display for real-time data.

Divide and Conquer

The display portion sounds all set. Now it's time to concentrate on the network data acquisition via Wi-Fi.

Look for some AsyncStreams demos that Erel wrote about five or six years ago. I believe his Walkie Talkie demo is one.

These demos have given me 99% of the network communications code I needed.

I agree with the others. Using a real device is the best approach.

I wrote small test apps that merely acquired the data and displayed it on a device as text in a multiline label. After I got that working, it was relatively easy to merge the network half and the display half to create the final app.
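A stripped-down version of one of those test apps looked roughly like this (rewritten from memory, so the IP address, port and label name are just placeholders):

B4X:
Sub Process_Globals
    Private Client As Socket
    Private AStream As AsyncStreams
End Sub

Sub Globals
    Private lblData As Label   'multiline label in the layout
End Sub

Sub Activity_Create(FirstTime As Boolean)
    Activity.LoadLayout("Main")
    Client.Initialize("Client")
    Client.Connect("192.168.1.50", 51000, 10000)   'placeholder IP and port, 10 s timeout
End Sub

Sub Client_Connected (Successful As Boolean)
    If Successful Then
        AStream.Initialize(Client.InputStream, Client.OutputStream, "AStream")
    Else
        Log("Connection failed: " & LastException.Message)
    End If
End Sub

Sub AStream_NewData (Buffer() As Byte)
    'Just dump whatever arrives into the label - enough to prove the link works.
    lblData.Text = lblData.Text & BytesToString(Buffer, 0, Buffer.Length, "UTF8")
End Sub

Sub AStream_Error
    Log("Connection lost")
End Sub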

Barry.
 
Upvote 0

johnf2013

Member
Licensed User
Longtime User
I am at a crossroads at present. One promising approach is UDP broadcast, and I have this working. I started with Erel's Weather Station app and built from there. It works fine. It is simple and ideal for real-time data - it does not really matter if a packet has errors or even whether it reached its destination, because another packet is "hot on its heels". On the display the old data is replaced by the new in 20 ms, so unless there are a lot of errors no one is likely to notice. The only problem I see is if there were multiple servers - I wish! I could sell more product!! - but that is unlikely given the way this product is used. Or I could go the full TCP/IP route, which feels like overkill.
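The receive side is only a few lines (sketched from memory here, and the port number is arbitrary):

B4X:
Sub Process_Globals
    Private UDPSocket1 As UDPSocket
End Sub

Sub Activity_Create(FirstTime As Boolean)
    If FirstTime Then
        'Listen for broadcasts on an arbitrary port with an 8 KB receive buffer.
        UDPSocket1.Initialize("UDP", 51000, 8192)
    End If
End Sub

Sub UDP_PacketArrived (Packet As UDPPacket)
    'Each datagram simply replaces the previous one - no retries, no acknowledgements.
    Dim data(Packet.Length) As Byte
    Bit.ArrayCopy(Packet.Data, Packet.Offset, data, 0, Packet.Length)
    'hand "data" to the display code
End Sub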

Or I could give up and use Bluetooth, which is dead simple but slow, though maybe fast enough for this app. Ideally I need 1440 bytes every 20 ms or so, plus some status bytes. This is my "Gold Standard", but it is not strictly necessary and I feel it is unlikely to be achieved anyway. I can get by with some trickery at the embedded controller end. Using a wired connection to a C++ app on the PC, I am getting good results at 115200 baud. The PC of course is doing most of the maths, which it crunches through at 4.7 GHz, so it's hardly a fair comparison to the "smart" phone.
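If my arithmetic is right, 1440 bytes every 20 ms works out to 72,000 bytes per second (roughly 576 kbit/s of payload), while 115200 baud with 8N1 framing moves only about 11,520 bytes per second, so the Gold Standard would need roughly six times the throughput of the current wired link.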

If BT works I could offer both. Current development is using an ESP8266, which has enough user code space for my data-gathering app, in addition to the two A/D converters I need for the accelerometers and enough I/O for the motor control and rotation sensor. This is a really cheap solution. Going up-market a bit, I could use the ESP32 and get BT as well as WiFi. I don't suppose the cost of the electronics is such a big deal when the hardware, just in manufacturing cost, is several hundred dollars. We don't plan to sell too many of these things, but those who have a need really want them. With the app free from an app store, IP protection is to be built into the embedded controller.

So, a "typical well planned" engineering development project. Software half written and still no hardware definition.....
Projects are living, evolving creatures. Just be on guard for "scope bloat".

I can post some shots later if anyone is interested, but I must get back to my thesis for now or my supervisor is going to have my %^@$# for breakfast :)
 
Upvote 0

johnf2013

Member
Licensed User
Longtime User
Yes Peter, I do use B4A Bridge. "Wirephobia" - is that a real condition? I now have a tablet, so maybe I won't need to peer down a microscope to develop on a real device. Everyone's suggestions have been much appreciated.
 
Upvote 0

canalrun

Well-Known Member
Licensed User
Longtime User
From what you have written it sounds like it might be very similar to projects I have done in the past.

It sounds like you have multiple sensors (you may have called them servers) based on something like the ESP8266. You can have one sensor or multiple sensors that will send data in real time to an Android phone for display.

I didn't see this mentioned, but I'm guessing the display software might have a way to choose one out of the many sensors connected for display.

I handled this using network sockets (TCP/IP) over Wi-Fi.

The phone had a specific IP address. It would listen on a specific port for connections and could accept many connections. Each connection, along with information about the connected sensor (such as the sensor's IP and identification code), would be stored in a list.

When a sensor started up it would try to connect to the phone's IP at that port. Once connected, the sensor would just continually send data at the specified rate.

The display phone accepted each record that came in and displayed the data that corresponded to the sensor chosen for display.

One key here is that each sensor needed to send an identifying code (several bytes or a short string) along with its data, to allow the display phone to identify the sensor.

This worked very well for me. My sensors were actually other phones continuously sending camera snapshots – the data rate was fairly high.

One nice thing about the network sockets is that it's a bidirectional communications link. I defined a packet payload from each sensor that included a header identifying the packet source and whether the data was a response to a command from the phone or whether the packet represented real-time data.

This let me send configuration commands (three-character strings) to each sensor and receive a response if needed.
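The skeleton of the Listen side looked something like this (reconstructed from memory, so the port number, the four-byte identifying code and the bookkeeping details are only illustrative):

B4X:
Sub Process_Globals
    Private Server As ServerSocket
    Private Connections As List   'one Map per connected sensor
End Sub

Sub Service_Create
    Connections.Initialize
    Server.Initialize(51042, "Server")   'arbitrary port
    Server.Listen
End Sub

Sub Server_NewConnection (Successful As Boolean, NewSocket As Socket)
    If Successful Then
        Dim astream As AsyncStreams
        'Prefix mode adds a length header so each Write arrives as one complete packet.
        astream.InitializePrefix(NewSocket.InputStream, False, NewSocket.OutputStream, "Sensor")
        Dim entry As Map
        entry.Initialize
        entry.Put("socket", NewSocket)
        entry.Put("stream", astream)
        Connections.Add(entry)
    End If
    Server.Listen   'keep accepting further sensors
End Sub

Sub Sensor_NewData (Buffer() As Byte)
    'First bytes carry the sensor's identifying code, the rest is the payload.
    Dim id As String = BytesToString(Buffer, 0, 4, "UTF8")
    'route the rest of Buffer to the display if id matches the selected sensor
End Sub

Sub SendCommand (astream As AsyncStreams, cmd As String)
    astream.Write(cmd.GetBytes("UTF8"))   'e.g. a three-character configuration command
End Sub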

Hope this gives some ideas.

Barry.
 
Upvote 0

johnf2013

Member
Licensed User
Longtime User
Yes, thank you canalrun. Great ideas to consider. I have several sensors but the phone only sees them as one. The embedded controller manages all of the sensors, does a bunch of maths on the data, and creates data packets to send. It also has to respond to some simple commands.
The data consists of one 10- or 12-bit sample per degree of rotation from each of two sensors. 720 FIR filters are instantiated and each sample is fed to a filter. Some timing stuff is also computed based on an angle sensor, and a motor/solenoid is also driven. All of this is packaged into a payload, with some security bits added, ready for sending.
This is all working fine using a USB connection and an 8-bit controller, sans the FIR filters. The filters are handled by the i7 on the PC running an app written in BC++. I expect the planned 32-bitter will eat the job, filters and all.
Because there are only 2 sensors and data from both is in the one packet I planned to identify the sensor by the order of the data, say two 1-dim arrays or a single 2-dim array.
The phone app is essentially complete. Data structures are in place to hold the packet data and a simulation mode fills these with simulated data. The app works and shows the data the way it should. All that is needed now is to get the data and stuff it into the data structures, and add a little logic to send commands to the controller.
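The unpacking will probably end up as something like the sketch below (the two-bytes-per-sample, sensor-A-then-sensor-B layout is only a placeholder until the packet format is nailed down):

B4X:
Sub ParsePacket (Buffer() As Byte)
    'Placeholder layout: 720 samples from sensor A followed by 720 from sensor B,
    'each sample sent as two bytes, little-endian, low byte first.
    Dim samples(2, 720) As Int
    For s = 0 To 1
        For i = 0 To 719
            Dim p As Int = (s * 720 + i) * 2
            samples(s, i) = Bit.And(Buffer(p), 0xFF) + Bit.ShiftLeft(Bit.And(Buffer(p + 1), 0xFF), 8)
        Next
    Next
    'copy samples() into the existing display data structures here
End Sub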
The hardest part of the project was getting the real-time response I desired. It seems that the graphics are quite fast but the maths can bog down. I forced integer arithmetic wherever I could and now achieve my desired 50 Hz update rate. There were a couple of tasks I would have liked to give to a different core, but it seemed like too big a "can of worms" for a modest performance gain - if any.
I have been reading this forum about AsyncStreams and it looks very promising for this app. I will have a go this afternoon to get it working.
Your other ideas are good to try. I want this to be transparent, if possible, to the users. I do not want the user to have to be a network engineer to get the app to work. Start the app, let it find the controller, and have it do its stuff without the user being aware of what's under the hood.
For interest, the app is the smarts for a small gas turbine balancer, as used in radio-controlled aircraft. My friend and I have developed the hardware, which is all CNC-machined aluminium construction and is unique in that it accepts the entire engine. There is no need to disassemble the engine to remove the rotor for balancing. The rotor is spun up using the starter motor (electric) or compressed air via a solenoid valve. We plan to supply it as a kit - a box full of machined parts that the user screws together.
The problem we faced was that an integrated solution required tooling for a case, an expensive display, user input hardware and a complex circuit, among other things, and still would have been unlikely to offer all of the features we can offer by using a phone app. The target market is reasonably technically astute, so we assume they have a phone modern enough to run the app.
Techniques learned from this app will be directly applied to my Uni research project. I need to get engineering data off a drone being used for geophysical survey. There are other telemetry links on the drone, but this one is for ad-hoc development data to be sent to a phone - like, what do I need to know now - let's implement it in 5 minutes.
Thanks for your help. This discussion is good because I am working almost completely in a vacuum - my research is new and unique and no one I know has the slightest idea what I am talking about.
 
Upvote 0

canalrun

Well-Known Member
Licensed User
Longtime User
Sounds interesting.

You mentioned telemetry from a drone - I used AsyncStreams to do what might be called telemetry from a drone.

A couple of years ago, when quadcopters first came out, I bought a very inexpensive toy quadcopter to play with. It did not have a camera, so I bought something called an AI Ball camera. This is about half the diameter of a ping-pong ball and very light. It has a camera and a Wi-Fi hotspot built in. I taped it to the bottom of the quadcopter. I could then connect to the hotspot via AsyncStreams and the AI Ball would stream camera frames. I displayed these on an Android tablet.

I guess this could count as telemetry :D.

Barry.
 
Upvote 0

johnf2013

Member
Licensed User
Longtime User
I like that. I must have a play with it. It sounds interesting, and your use of AsyncStreams gives me hope it will be the best solution. It is telemetry.

For the serious telemetry we have a 915MHz link and a Yagi mounted on a tracker to follow the drone. This is good for up to a hundred kBaud or so and carries mainly operational data and commands. There is a 5.2GHz video link too with a quad helical array also mounted on the tracker. This is for the user to see a low resolution preview of the geophysical data, allowing him to alter the flight plan if he sees something of interest.

The WiFi link to the phone is planned for development only. There's a lot of code cutting going on and not all of it works. I see the phone app as a quick way to get feedback - like a debugging tool - over a short range only of course. I would hope to get 100m, which would be enough.

I was going to play with AsyncStreams this afternoon but got bogged down with the preferences screen - grrrr, this stuff makes one's head hurt :p
 
Upvote 0