I'm writing an app to receive data via the serial port and display the byte values. I've stripped the project down for the attachment. In the 'On Com' event you'll see that at first I was using Port.InputString; however, any byte greater than 0x7F was changed to 0x3F ('?'), presumably by Serial2.dll.
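For reference, that '?' substitution is what you typically get when incoming bytes are decoded as ASCII text with a replacement policy, rather than read raw. A minimal Python sketch of the effect (illustrative only; how Serial2.dll actually decodes internally is an assumption):

```python
# Simulate reading serial data as text vs. as raw bytes.
raw = bytes([0x41, 0x7F, 0x80, 0xFF])  # bytes received on the wire

# Text-mode read: an ASCII decode with '?'-style replacement mangles
# every byte above 0x7F, matching the InputString behaviour described.
as_text = raw.decode("ascii", errors="replace").replace("\ufffd", "?")
print(as_text)  # 'A' and 0x7F survive; 0x80 and 0xFF both become '?'

# Raw read: every value 0x00-0xFF comes through intact,
# matching the InputArray behaviour.
print([hex(b) for b in raw])  # ['0x41', '0x7f', '0x80', '0xff']
```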
I changed to Port.InputArray and now receive the full range of bytes from 0x00 to 0xFF, but now, if optimized-compiled, I get the attached error whenever 0x1A is sent. If legacy- or IDE-compiled, a single 0x1A byte fires 'On Com' twice, and the optimized build still gives the error. I've included an app that can be compiled for a device and connected with an RS232 cable, though I appreciate not many will have that facility.
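Possibly relevant to why 0x1A in particular misbehaves: 0x1A is the ASCII SUB control code (Ctrl-Z), which DOS/Windows text handling traditionally treats as an end-of-file marker, so a text-oriented layer in the stack may special-case it. A short Python sketch of the distinction (whether Serial2.dll does anything like this is an assumption):

```python
# 0x1A is ASCII SUB ("substitute"), sent as Ctrl-Z; DOS-era text I/O
# treats it as an end-of-file marker.
SUB = 0x1A

data = bytes([0x10, SUB, 0x30])
# A raw byte read keeps 0x1A like any other value...
print([hex(b) for b in data])       # ['0x10', '0x1a', '0x30']
# ...whereas a naive text layer that stops at SUB would truncate there:
truncated = data.split(bytes([SUB]))[0]
print([hex(b) for b in truncated])  # ['0x10']
```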
What does Port.TimeOut actually do?
Finally, if I use CTSHandshaking=True, do I still have to assert RTSEnable?