Tutorial 3: Orientation 3D (Python)
In this example, the sensor data is read from the Pozyx shield (either the shield mounted on the Arduino, or a remote shield). To visualize all the data, we use Processing. Below is a screenshot of the Processing program showing all the sensor data graphically.
This example requires one or two Pozyx shields.
In order to run the sketch in Processing, a specific version of the program is required. Processing version 3.1.1 can be found on this GitHub page https://github.com/processing/processing/releases/tag/processing-0250-3.1.1. Simply download the archive that corresponds to your system and extract the files. Within the folder there should be an executable called processing.exe that you can double-click to run the program.
The sketch also requires two libraries that can be downloaded here. Download the two folders and place them in the libraries folder linked to Processing. This folder can normally be found under Users > Username > Documents > Processing > libraries. It is important that you copy-paste these two exact libraries instead of installing them with the tool in the Processing application.
As a final step, the examples can be downloaded here. The pozyx_orientation3D folder contains the necessary sketch. Place the folder in the Processing examples folder, which can normally be found under Username > Documents > Processing > examples.
If all tools are installed, open up orientation_3D.py in the Pozyx library's tutorials folder. The path to this file will probably be "Downloads/Pozyx-Python-library/tutorials/orientation_3D.py". You can run the script from the command line or from a text editor that allows running scripts. If you're on Windows and have installed Python, you might be able to simply double-click it. If you get an error, you likely forgot to install pythonosc; you can do this easily with pip install python-osc.
Open up the pozyx_orientation3D.pde sketch in Processing. In Processing, make sure serial is set to false. Unless you need port 8888 for something else, you can leave both the Python script and Processing sketch intact. The data will automatically be sent using OSC on port 8888. You should be able to physically rotate your Pozyx now and directly see it rotate in Processing as well. There are plots on the side and at the top, but what do they all mean?
Understanding the sensor data
Acceleration (g): the acceleration is measured along 3 axes (i.e., in 3 directions), which is shown by three different colors on the plot. The acceleration is expressed in g's (from gravity). 1g is exactly the acceleration from earth's gravitational pull (1g = 9.81 m/s²). Because the acceleration from gravity is always present and equal to 1g, it can be used to determine how the device is tilted or pitched. Try rotating the Pozyx device by 90 degrees (slowly) and you will see the acceleration change from one axis to the other.
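To make this concrete, here is a minimal sketch (not part of the tutorial code) of how a static accelerometer reading reveals tilt, assuming the acceleration is given in g's along the device's x, y and z axes and the device is not otherwise accelerating:

```python
from math import atan2, sqrt, degrees

def tilt_angles(ax, ay, az):
    """Estimate (roll, pitch) in degrees from a static accelerometer reading in g's."""
    roll = degrees(atan2(ay, az))                      # rotation around the x-axis
    pitch = degrees(atan2(-ax, sqrt(ay**2 + az**2)))   # rotation around the y-axis
    return roll, pitch

# Lying flat: gravity sits entirely on the z-axis, so no tilt.
print(tilt_angles(0.0, 0.0, 1.0))
# Rotated 90 degrees around the x-axis: gravity has moved to the y-axis.
print(tilt_angles(0.0, 1.0, 0.0))
```

This only works while the device is (nearly) still; any extra motion adds to the measured acceleration and corrupts the estimate, which is why the full orientation filter also uses the gyroscope.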
Magnetic field strength (µT): the magnetic field strength is also measured along 3 axes and is expressed in µT. It can be used to measure the earth's magnetic field, which varies between 30 and 60 µT over the earth's surface. The measured magnetic field strength can be used to estimate magnetic north (similar to a compass). However, magnetic fields exist wherever electric current flows or magnetic materials are present. Because of this, they will influence the sensor data and it becomes impossible to determine magnetic north exactly. Try holding a magnet or something made of iron close to the Pozyx device and you will see the magnetic field strength fluctuate.
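As a toy illustration of the compass idea (not code from this tutorial), the heading can be estimated from the two horizontal field components when the device is held level; real compasses must additionally compensate for tilt and for the local magnetic declination:

```python
from math import atan2, degrees

def heading_deg(mx, my):
    """Naive compass heading in degrees [0, 360), with 0 = magnetic north.

    Assumes the device is level and mx/my are the horizontal field
    components in µT (illustrative values, not real sensor data)."""
    return degrees(atan2(my, mx)) % 360

print(heading_deg(30.0, 0.0))   # field entirely on x: facing magnetic north
print(heading_deg(0.0, 30.0))   # field entirely on y: rotated a quarter turn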
Angular velocity (deg/s): the angular velocity is measured by the gyroscope and describes how fast the device rotates around its own axes. By integrating the angular velocity it is possible to obtain the angles of the device. However, due to drift this method does not give accurate results over a prolonged time. What is drift, you wonder? Well, if the device is standing still, the angular velocity should be exactly zero. In practice it will be slightly off, and this error accumulates over time when integrating to obtain angles.
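A simple numeric sketch makes the drift problem visible. The bias and sample rate below are illustrative values, not real gyroscope specifications:

```python
# Integrate a gyroscope that reads a small constant bias while the
# device is actually standing perfectly still.
bias = 0.5    # deg/s of bias (illustrative)
dt = 0.01     # 100 Hz sample rate (illustrative)

angle = 0.0
for _ in range(100 * 60):   # one minute of samples
    angle += bias * dt      # naive integration of angular velocity

# After only one minute the estimated angle is already far from zero.
print(round(angle, 2))   # -> 30.0 degrees of drift
```

This is exactly why the gyroscope alone cannot provide a long-term orientation estimate, and why it is fused with the accelerometer and magnetometer.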
3D Orientation: the 3D orientation is shown by the 3D model in the middle. The orientation is computed by fusing the data from all three sensors. By combining the sensors it is possible to overcome the limitations of each sensor separately. The 3D orientation can be expressed in Euler angles (yaw, pitch, and roll) or in quaternions. Quaternions are a mathematical representation using 4 numbers. In many situations, quaternions are preferred because they do not suffer from the singularity problems of Euler angles.
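The two representations can be converted into each other. Below is a sketch using the standard quaternion-to-Euler formulas (this is generic math, not Pozyx-specific code; the (w, x, y, z) component order is an assumption):

```python
from math import atan2, asin, degrees

def quat_to_euler(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to (yaw, pitch, roll) in degrees."""
    yaw = degrees(atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z)))
    # Clamp the asin argument to guard against rounding slightly outside [-1, 1].
    pitch = degrees(asin(max(-1.0, min(1.0, 2 * (w * y - z * x)))))
    roll = degrees(atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y)))
    return yaw, pitch, roll

# The identity quaternion corresponds to no rotation at all.
print(quat_to_euler(1.0, 0.0, 0.0, 0.0))   # (0.0, 0.0, 0.0)
```

Note that the inverse mapping is where the Euler singularity shows up: at a pitch of ±90 degrees, yaw and roll can no longer be distinguished, while the quaternion remains well defined.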
Gravity vector (g): if the Pozyx device is standing still, the acceleration is exactly equal to gravity. The gravity vector is shown by the black line and always points down. Notice that even when moving the device (which introduces an additional acceleration) the gravity vector still points down. This is due to the fusion algorithm, which can separate gravity from an arbitrary acceleration.
Linear acceleration in body coordinates (g): The linear acceleration is the acceleration that remains after the gravity has been removed. When you hold the device horizontal, pointed forward, and shake it from left to right the circle will also move from left to right in the plot. However, if you rotate the device by 90 degrees and shake it again from left to right, the circle will now move in a different direction. This is because the linear acceleration is expressed in body coordinates, i.e., relative to the device. Note that all the above sensor data is expressed in body coordinates.
Linear acceleration in world coordinates (g): Once the orientation of the device is known, it is possible to express the acceleration in world coordinates. By doing this, the rotation of the device no longer affects the linear acceleration in the plot.
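This change of frame is a plain quaternion rotation. The sketch below shows the standard formula for rotating a body-frame vector by the device's orientation quaternion; it illustrates the math, not the Pozyx firmware's actual implementation:

```python
def rotate_by_quaternion(q, v):
    """Rotate vector v = (x, y, z) by unit quaternion q = (w, x, y, z).

    Uses the expanded form of v' = q * v * q_conjugate:
    v' = v + w*t + u x t, with u the quaternion's vector part and t = 2*(u x v).
    """
    w, x, y, z = q
    vx, vy, vz = v
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    return (vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))

# A 90-degree rotation around the z-axis maps the body x-axis onto world y:
q_yaw90 = (0.7071067811865476, 0.0, 0.0, 0.7071067811865476)
print(rotate_by_quaternion(q_yaw90, (1.0, 0.0, 0.0)))   # approximately (0, 1, 0)
```

Applied to the linear acceleration, this is why shaking the device left to right moves the world-frame plot in the same direction regardless of how the device is rotated.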
The code explained
We'll now go over the code that's needed to retrieve all sensor data. Looking at the code's parameters, we can see that we can once again use a remote Pozyx for this. This means that you could for example attach a Pozyx to a ball, and watch the ball's spin directly on your screen as well with the Pozyx.
Imports and setup
We import the pypozyx library and the IMU interrupt flag from its bitmasks, and pythonosc so that we can send the sensor data to Processing over OSC. We also import the standard time library so that we can measure the time between measurements.
We initialize the PozyxSerial and UDP socket over which we'll send the data, and the setup then sets the last measured time right.
from time import time
from pypozyx import *
from pypozyx.definitions.bitmasks import POZYX_INT_MASK_IMU
from pythonosc.osc_message_builder import OscMessageBuilder
from pythonosc.udp_client import SimpleUDPClient
def setup(self):
    """There is no specific setup functionality"""
    self.current_time = time()
The main loop deserves some elaboration. The Pozyx's IMU sensors raise an interrupt flag when new data is available, and in the code we explicitly wait for this flag.
When checkForFlag returns POZYX_SUCCESS, meaning that POZYX_INT_MASK_IMU was raised and new IMU data is available, or when we're retrieving sensor data remotely, all sensor data and the calibration status are read from the (remote) Pozyx and packed into an OSC message, which is then interpreted by the Processing sketch. You can see that there is a SensorData object available for this, which automatically converts the sensor data to its respective standard units.
def loop(self):
    """Gets new IMU sensor data"""
    sensor_data = SensorData()
    calibration_status = SingleRegister()
    if self.remote_id is not None or self.pozyx.checkForFlag(POZYX_INT_MASK_IMU, 0.01) == POZYX_SUCCESS:
        status = self.pozyx.getAllSensorData(sensor_data, self.remote_id)
        status &= self.pozyx.getCalibrationStatus(calibration_status, self.remote_id)
        if status == POZYX_SUCCESS:
            self.publishSensorData(sensor_data, calibration_status)
The calibration status gives information about the quality of the sensor orientation data. When the system is not fully calibrated, the orientation estimate may be inaccurate. The calibration status is an 8-bit variable where every 2 bits represent a different piece of calibration info.
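Those 2-bit fields can be unpacked with simple bit masking. The sketch below assumes the BNO055-style layout commonly used for this status byte (bits 0-1 magnetometer, 2-3 accelerometer, 4-5 gyroscope, 6-7 overall system, each running from 0 = uncalibrated to 3 = fully calibrated); check the Pozyx register documentation to confirm the exact layout:

```python
def decode_calibration(status_byte):
    """Split an 8-bit calibration status into four 2-bit fields (0..3 each).

    Field layout is an assumption based on the BNO055 CALIB_STAT register."""
    return {
        "magnetometer": status_byte & 0x03,
        "accelerometer": (status_byte >> 2) & 0x03,
        "gyroscope": (status_byte >> 4) & 0x03,
        "system": (status_byte >> 6) & 0x03,
    }

print(decode_calibration(0xFF))   # all four fields read 3: fully calibrated
```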
You're at the end of the standard tutorials that introduce you to all of Pozyx's basic functionality: ranging, positioning, and its IMU. If you haven't already done so and have the number of devices necessary, you can read the multitag tutorial. When you start work on your own prototype, don't be afraid to delve into our documentation which should suffice for all your needs.