Hi,
We retrieve the RSSI value from the Dialog device every 100 ms while connected to a central and log it in our application. We see some unexpected RSSI values when we set up the link to use more than 2 packets per connection event (our app can control the number of packets per connection event). We are using an MTU of 512 bytes (although the MTU size doesn't seem to matter), and we perform write requests and indications over BLE of various sizes up to the MTU size; many of them are more than 20 bytes.
I have attached a spreadsheet showing the sequence of individual RSSI readings we receive from the Dialog device. The test runs the same set of data between the Dialog device and the central. You can see that when we use 3 or 4 packets per connection event we get several readings below -100 dBm. When we reduce the packets per connection event to 2 or 1, all the RSSI readings are as expected.
I have also noticed that if we only send write requests and indications with less than 20 bytes of data (i.e. the data fits in one packet) and run the same set of commands from the test above, the issue also goes away: all the RSSI values are as expected.
Any thoughts on why we are seeing the unexpected RSSI values?
Thanks
Hi rparkinson,
When you try to send more packets per connection event than the central can accept, the peripheral keeps transmitting into the air but the master stops responding, so the master is not seen for the rest of that event. Looking at your Excel sheet, your master drops out on the 3rd packet, which is why you get unexpected RSSI readings from the 3rd packet onwards. Every time we have a sync error, the last reported RSSI value will be -112, since we do not filter out cases where we have not seen the master when updating the RSSI value.
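As a side note, since -112 is the value reported on a sync error, your application could simply screen it out before logging. A minimal sketch in C; the helper function is hypothetical, not part of the SDK:

    #include <stdint.h>
    #include <stdbool.h>

    /* RSSI value the stack reports when the master was not seen
     * during the connection event (sync error), as described above. */
    #define RSSI_SYNC_ERROR_VAL   (-112)

    /* Hypothetical application-side check: discard sync-error samples
     * instead of logging them as real signal-strength readings. */
    static bool rssi_sample_is_valid(int8_t rssi_dbm)
    {
        return (rssi_dbm != RSSI_SYNC_ERROR_VAL);
    }

This only removes the sync-error sentinel itself.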
Thanks, PM_Dialog
Thanks for the information; the sync error resulting in -112 makes sense. My data also has readings between -104 and -109 dBm. What could be the cause of these? Again, if I keep to 2 or 1 packets per connection event, these readings don't occur.
Thanks
Hi rparkinson,
The most likely reason why you get these values is that noise is being added into your signal.
Thanks, PM_Dialog
Thank you, that makes sense. Do you have any guidance on the best dBm cutoff for differentiating a sync-error reading that is just returning the ambient noise level from an RSSI value for an actual received frame?
Hi rparkinson,
I would recommend that you have a look at the gapc_con_rssi_ind_handler() in the proximity monitor application from the SDK Host Apps. I also suggest checking the measure_errors_receiver() function of the SDK to find which flags are triggered when you get an invalid measurement. In order to use this function, please define CFG_BLE_METRICS in the da1458x_config_advanced.h header file.
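If you just need an application-side cutoff in the meantime, here is a minimal sketch in C. The -100 dBm threshold is an assumption based on the readings you reported (valid frames above -100 dBm, noise at roughly -104 to -112 dBm) and should be tuned for your environment or, better, replaced by the CFG_BLE_METRICS flags:

    #include <stdint.h>
    #include <stdbool.h>

    /* Assumed cutoff: anything at or below this is treated as ambient
     * noise from a missed connection event rather than a real frame.
     * -100 dBm is illustrative only; pick it from your own data, or
     * use the receiver error flags instead of a fixed threshold. */
    #define RSSI_NOISE_CUTOFF_DBM   (-100)

    static bool rssi_is_real_frame(int8_t rssi_dbm)
    {
        return (rssi_dbm > RSSI_NOISE_CUTOFF_DBM);
    }

Note that a fixed threshold will misclassify genuinely weak but valid frames near the cutoff, which is why checking the receiver error flags is the more robust approach.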
Thanks, PM_Dialog