Beacons with Accelerometers

When choosing a beacon with an accelerometer, care needs to be taken that it supports the anticipated use. In some cases the accelerometer controls functionality within the beacon itself, while in others it provides raw data that can be read by other Bluetooth devices such as smartphones, gateways and single board computers such as the Raspberry Pi.

The most common use of an accelerometer is motion-triggered broadcast: the beacon only advertises while it is moving, which improves battery life and reduces redundant processing on observing devices. Beacons supporting motion triggering include the M52-SA Plus, F1, K15 and the H1 Wristband.

M52-SA Plus provides motion triggered advertising

A few beacons, such as the iBS01G and iBS03G, interpret movement as starting, stopping or falling and change their Bluetooth advertising accordingly.
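The starting/stopping/falling distinction can be made from the total acceleration magnitude. Here's a minimal sketch of that idea in Python; the threshold values are hypothetical and the actual detection in beacons such as the iBS03G happens in vendor firmware:

```python
import math

# Hypothetical thresholds in g; real beacon firmware uses its own
# vendor-defined values and debouncing.
MOVING_THRESHOLD = 0.15    # deviation from 1 g that counts as movement
FREE_FALL_THRESHOLD = 0.3  # total acceleration near 0 g suggests falling

def classify(x, y, z):
    """Classify one accelerometer sample (in g) as 'falling',
    'moving' or 'still' using the total acceleration magnitude."""
    magnitude = math.sqrt(x * x + y * y + z * z)
    if magnitude < FREE_FALL_THRESHOLD:
        return "falling"  # near 0 g: the beacon is in free fall
    if abs(magnitude - 1.0) > MOVING_THRESHOLD:
        return "moving"   # magnitude deviates from the 1 g of gravity alone
    return "still"
```

A stationary beacon always measures about 1 g (gravity), so it is the deviation from 1 g, not from zero, that signals movement.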

Raw acceleration data is provided by beacons such as the iBS01RG, iBS03RG, e8, K15 and B10.

View Sensor Beacons

Understanding Sensor Beacon Accelerometer Data

In this post we will take a look at data from the INGICS iBS01RG beacon.

The horizontal axis is time. You can see the x, y and z acceleration values, sampled every 100ms. The values are normalised between -1 and 1 for use in our SensorCognition Edge device. The chart covers a period when the beacon was moving, followed by a stationary period. Notice how the orange line continues to show acceleration even though the beacon isn't moving. This is caused by gravity.
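The normalisation itself is straightforward. A sketch, assuming the accelerometer is configured for a ±2 g full-scale range (check your beacon's datasheet, as other ranges are common):

```python
FULL_SCALE_G = 2.0  # assumed +/-2 g range; an assumption, not a fixed spec

def normalise(value_g):
    """Map an acceleration in g onto [-1, 1], clipping at full scale."""
    return max(-1.0, min(1.0, value_g / FULL_SCALE_G))
```

With this scaling, the 1 g gravity offset visible in the chart appears as a constant value of 0.5 on one axis while the beacon is at rest.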

In this chart the beacon has been flipped over and the orange line now shows a constant negative acceleration.

The benefit of a constant gravity offset on one of the x, y, z axes is that it can be used to determine the orientation of the beacon. The downside is that the offset significantly complicates using the x, y, z values to detect types of movement such as human gestures.
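Estimating orientation from the gravity offset is a standard trick: while the beacon is still, the only acceleration measured is gravity, so the direction of the measured vector gives the tilt. A minimal sketch:

```python
import math

def orientation(x, y, z):
    """Estimate pitch and roll in degrees from one stationary sample (in g).
    Valid only while the beacon is still, so the reading is gravity alone."""
    pitch = math.degrees(math.atan2(-x, math.sqrt(y * y + z * z)))
    roll = math.degrees(math.atan2(y, z))
    return pitch, roll
```

For a beacon lying flat, this returns roughly (0, 0); flipping it over, as in the second chart, flips the sign of the z reading and the roll moves to about 180 degrees.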

Such complex data problems are more easily solved with AI machine learning than by writing a traditional algorithm to make sense of the data.
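Whatever model is used, the raw stream is usually first reduced to features over a short window of samples. One common, simple choice (an illustration, not SensorCognition's actual pipeline) is the per-axis mean and standard deviation: the mean captures the gravity offset, and the standard deviation captures how much movement there is along each axis, so a left-right gesture shows up as high variance on one axis.

```python
import statistics

def window_features(samples):
    """Reduce a window of (x, y, z) samples to per-axis mean and
    standard deviation -- a typical input vector for a gesture model."""
    xs, ys, zs = zip(*samples)
    features = []
    for axis in (xs, ys, zs):
        features.append(statistics.fmean(axis))   # captures gravity offset
        features.append(statistics.pstdev(axis))  # captures movement energy
    return features
```

Because the gravity offset lands in the mean features and the movement lands in the deviation features, the model can learn gestures without the offset drowning out the signal.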

Here’s an example of output from a SensorCognition Edge device trained with up and down movement and left and right movement. In this case, the output 227 is showing the beacon is moving left and right.

Read about SensorCognition