Introducing the 'mpl_stereo' Library to Make Stereograms and Anaglyphs

When I was a kid growing up in the '90s, I was fascinated by the "Magic Eye" books. These were picture books filled with images like the one below which, when you used a special technique of "unfocusing" your eyes and staring through the page, transformed into 3D images popping off the page. Learning the trick felt like initiation into a secret club, and to this day the results are more magical than any optical illusion I've ever seen. Check out this excellent video for a how-to. It takes some practice to see the stereoscopic effect for the first time, but the effort is well worth it!

Magic Eye – If you know the trick to diverging your eyes, you'll be able to see a shark popping out at you! (from Wikipedia)

I've seen a lot of these pictures made for fun (there's an active community on /r/MagicEye), but none made for practical purposes. This is a huge mistake! Stereoscopic images can significantly enhance the interpretability of 3D data by leveraging human binocular vision. Instead of looking at a flat projection on a page, stereograms give us "3D glasses" for 2D data with just our eyes.

I’ve been on a data visualization kick recently, working to overhaul Matplotlib’s 3D plotting capabilities with bug fixes and new features. But the end result of those is still a flattened image – I wanted to actually see my plots in 3D space. So, I made a Matplotlib extension called mpl_stereo to do just that!

Check it out on GitHub, or install it right now!

pip install mpl_stereo

Stereograms

Some examples of 2D and 3D plots that pop off the page when you use “divergent” viewing to merge the two plots into one.
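The library handles the details for you, but the basic trick is easy to sketch with plain matplotlib: render the same 3D data twice, with the azimuth offset by a few degrees so each eye gets a slightly different viewpoint. This is just an illustrative sketch of the idea, not mpl_stereo's actual API or default settings; swap the two panels if the depth looks inverted to you.

import numpy as np
import matplotlib.pyplot as plt

# Sample 3D data: a helix
t = np.linspace(0, 4 * np.pi, 200)
x, y, z = np.cos(t), np.sin(t), t / (4 * np.pi)

# Render the same data from two slightly different azimuths, one view per eye
fig, (ax_left, ax_right) = plt.subplots(
    1, 2, subplot_kw={'projection': '3d'}, figsize=(8, 4))
for ax, azim in [(ax_left, -62), (ax_right, -58)]:
    ax.plot(x, y, z)
    ax.view_init(elev=30, azim=azim)
plt.show()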

Anaglyphs

If you can't figure out how to do the stereoscopic eye trick, you can still see the effect with regular old red-blue 3D glasses. These sorts of plots, which you can also make with the library, are called "anaglyphs".
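As a rough sketch of the idea (not necessarily the library's exact implementation): for 2D data with an associated depth value, you can draw the data twice, in red and in cyan, shifting each point horizontally in opposite directions by an amount proportional to its depth. Through red-cyan glasses the parallax reads as depth; flip the sign of the shift if the depth looks inverted.

import numpy as np
import matplotlib.pyplot as plt

# Sample 2D data with an associated depth value z in [0, 1]
t = np.linspace(0, 4 * np.pi, 200)
x, y, z = np.cos(t), np.sin(t), t / (4 * np.pi)

# Horizontal shift proportional to depth, opposite directions per color/eye
parallax = 0.05 * (z - z.mean())
fig, ax = plt.subplots()
ax.plot(x + parallax, y, color='red', alpha=0.7)
ax.plot(x - parallax, y, color='cyan', alpha=0.7)
ax.set_aspect('equal')
plt.show()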

Stereoscopic Images

Or if you already have a pair of stereoscopic images, the library makes it easy to plot them as a stereogram or as an anaglyph.
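For reference, the classic red-cyan recipe for an image pair is to take the red channel from the left image and the green and blue channels from the right image. Here's a minimal sketch with placeholder arrays (substitute a real stereo pair); this is the generic recipe, not necessarily what mpl_stereo does internally.

import numpy as np
import matplotlib.pyplot as plt

# Placeholder (H, W, 3) RGB arrays in [0, 1]; substitute a real stereo pair
rng = np.random.default_rng(0)
left_img = rng.random((100, 150, 3))
right_img = rng.random((100, 150, 3))

# Red channel from the left image, green and blue from the right image
anaglyph = np.empty_like(left_img)
anaglyph[..., 0] = left_img[..., 0]
anaglyph[..., 1:] = right_img[..., 1:]

fig, axs = plt.subplots(1, 3, figsize=(9, 3))
for ax, img, title in zip(axs, [left_img, right_img, anaglyph],
                          ['left', 'right', 'anaglyph']):
    ax.imshow(img)
    ax.set_title(title)
    ax.set_axis_off()
plt.show()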

Wiggle Stereograms

Any pair of side-by-side plots can also be turned into a "wiggle stereogram", or "wigglegram". It's not as precise for examining data, but it lets you see the 3D effect without training your eyes or using 3D glasses.
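One way to approximate this with plain matplotlib (again, a sketch of the idea rather than the library's API) is to animate a 3D axes so it flips back and forth between the two stereo viewpoints a few times per second:

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

t = np.linspace(0, 4 * np.pi, 200)
x, y, z = np.cos(t), np.sin(t), t / (4 * np.pi)

fig = plt.figure()
ax = fig.add_subplot(projection='3d')
ax.plot(x, y, z)

# Alternate between the two stereo viewpoints every 200 ms
def wiggle(frame):
    ax.view_init(elev=30, azim=-62 if frame % 2 == 0 else -58)

anim = FuncAnimation(fig, wiggle, frames=100, interval=200)
plt.show()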

Flying around 3D plots with an IMU and quaternions

Cool Video First!

Project Idea

I generate a good number of 3D plots for work and other projects, and for a while I’ve wanted to make a physical device that acts as a “virtual camera” that would let me fly around the 3D data virtually. In a fully featured end state, this would basically be a simple virtual reality setup – there would be a volume of space at my desk that represents the volume of the 3D plot, and I could move the device around in it to zoom in and look at data from any angle.

Hacking matplotlib

Unfortunately, there was immediately a roadblock. My favorite (and probably yours too!) Python plotting library matplotlib only gives you two angles to define your viewpoint in 3D plots: elevation and azimuth. This lets you move the camera location to any position around the plot, but the Z axis is always going to be “up”. In 3D space we need 3 angles to define an arbitrary orientation, and here we are missing a “roll” angle that rotates the camera about the direction it’s pointing. There were basically two options to get around this. One, ditch matplotlib and use another visualization library like PyVista or plotly that handles 3D plots better. Or two, add the functionality I needed to matplotlib. This decision was easy – why learn another library when you could spend ten times as long hacking the one you know? And never let an opportunity to give back to open source go to waste!

Diagram for the matplotlib docs

Once I got the roll angle working, I submitted it as a pull request to the matplotlib official repository. This was my first time contributing to a major open source project, and while it was a decent bit of effort to get code quality and test coverage up to the standards of a project that sees 26 million downloads every month, the chance to share useful features with that many users was incredibly rewarding. Matplotlib is the de facto default for plotting in Python, and Python is the de facto default programming language of education, science, and frankly programming overall. And beyond giving it a third view angle, I found that once I had a development version set up, the ball started rolling and I kept finding things to fix and add.

So I’m happy to share these new features in matplotlib 3.6, which is out now!

There are more features on the way too, hopefully coming in matplotlib 3.7.
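The headline change for this project is the new roll argument: as of matplotlib 3.6, the 3D view is specified by three angles instead of two (the specific values below are just illustrative):

import matplotlib.pyplot as plt

ax = plt.figure().add_subplot(projection='3d')
# elevation and azimuth as before, plus the new roll angle, all in degrees
ax.view_init(elev=30, azim=-60, roll=15)
plt.show()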

Hardware Setup

But back to the original project. To get the orientation for my virtual camera, I needed an inertial measurement unit (IMU). The IMU contains an accelerometer, magnetometer, and gyroscope, which fuse their data to determine the orientation of the device in 3D space. The accelerometer could be used to calculate the change in position as I move it around, but unfortunately all accelerometers will have a small bias in one direction, and this will cause the estimated position to drift over time. A source of absolute position is needed to counteract this drift, which is why an IMU is often paired with GPS, cameras, radar, lidar, or sonar in real-world vehicles and robots. Right now I think that this would add a little too much scope to the project, so I shelved the idea of flying the virtual camera around in 3D space and decided to just have the IMU control the view angle of the plot.

The IMU I chose was an ICM-20948, which is the current upgrade to the popular MPU-9250, and it does all the sensor fusion onboard so I don't have to worry about implementing a Kalman filter. This gets wired up on a breadboard to a Teensy 4.0 microcontroller (essentially a more powerful Arduino), which reads the output of the IMU and sends the orientation quaternions to my laptop.

The Teensy 4.0 (left) wired to the ICM-20948 IMU (right) via an SPI connection, all connected to my laptop over USB

Quaternion Math

Quaternions are the right way to represent orientations and rotations in 3D space; check out this interactive video series by 3blue1brown for a great introduction. However, quaternions are not supported by default in numpy (see the numpy-quaternion package for adding this functionality), so in the interest of maintaining backwards compatibility and not adding dependencies, I did not update matplotlib to use them internally. Instead, we can convert the quaternions from the IMU to matplotlib's Euler angles in a straightforward manner.

The equations to do this can be found in the paper Diebel, James “Representing Attitude: Euler Angles, Unit Quaternions, and Rotation Vectors” (2006) as Sequence (3, 2, 1), with azimuth as φ, elevation as -θ, and roll as ψ. To convert from the camera frame to the axes frame, the rotation matrices are applied in the (right-multiplied) order R = Rz(φ)Ry(θ)Rx(ψ), and to convert from the axes to the camera frame this rotation matrix is inverted.

For anyone who wants to do a similar project, the core of the calculation is below. Feel free to reuse!

import numpy as np
import matplotlib.pyplot as plt  # matplotlib >= 3.6.0
import quaternion  # package numpy-quaternion

# See Diebel, James "Representing Attitude: Euler Angles, Unit Quaternions, and Rotation Vectors" (2006)
# https://www.astro.rug.nl/software/kapteyn-beta/_downloads/attitude.pdf
# Sequence (3, 2, 1)
# Convert a quaternion to matplotlib's view angles (elev, azim, roll), in degrees
def quat_to_elev_azim_roll(q, angle_offsets=(0, 0, 0)):
    q0, q1, q2, q3 = q.w, q.x, q.y, q.z
    phi = np.arctan2(-2*q1*q2 + 2*q0*q3, q1**2 + q0**2 - q3**2 - q2**2)
    theta = np.arcsin(2*q1*q3 + 2*q0*q2)
    psi = np.arctan2(-2*q2*q3 + 2*q0*q1, q3**2 - q2**2 - q1**2 + q0**2)
    azim = np.rad2deg(phi) + angle_offsets[0]
    elev = np.rad2deg(-theta) + angle_offsets[1]
    roll = np.rad2deg(psi) + angle_offsets[2]
    return elev, azim, roll

# Inverse conversion: matplotlib view angles back to a quaternion
def elev_azim_roll_to_quat(elev, azim, roll, angle_offsets=(0, 0, 0)):
    phi = np.deg2rad(azim) - angle_offsets[0]
    theta = np.deg2rad(-elev) - angle_offsets[1]
    psi = np.deg2rad(roll) - angle_offsets[2]
    q0 = np.cos(phi/2)*np.cos(theta/2)*np.cos(psi/2) - np.sin(phi/2)*np.sin(theta/2)*np.sin(psi/2)
    q1 = np.cos(phi/2)*np.cos(theta/2)*np.sin(psi/2) + np.sin(phi/2)*np.sin(theta/2)*np.cos(psi/2)
    q2 = np.cos(phi/2)*np.sin(theta/2)*np.cos(psi/2) - np.sin(phi/2)*np.cos(theta/2)*np.sin(psi/2)
    q3 = np.cos(phi/2)*np.sin(theta/2)*np.sin(psi/2) + np.sin(phi/2)*np.cos(theta/2)*np.cos(psi/2)
    q = np.quaternion(q0, q1, q2, q3)
    return q

# Example: the identity quaternion with no angle offsets
q = np.quaternion(1, 0, 0, 0)
angles_init = (0, 0, 0)
elev, azim, roll = quat_to_elev_azim_roll(q, angles_init)

# Apply the converted angles to a 3D plot
ax = plt.figure().add_subplot(projection='3d')
ax.view_init(elev, azim, roll)
plt.show()

Putting It All Together

All that remained at this point was to get my Windows laptop talking to its WSL Linux installation, write a little Python code to read in the quaternions from the IMU/Teensy setup in a loop, and use that to set the view angle of a 3D matplotlib plot. And voila! You get the video up at the top of the post.
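For anyone curious what that loop looks like, here's a rough sketch. The serial port name, baud rate, and the comma-separated "w,x,y,z" message format are assumptions about my particular setup rather than anything standard, and quat_to_elev_azim_roll is the function defined earlier in this post:

import serial                 # pyserial
import numpy as np
import quaternion             # numpy-quaternion
import matplotlib.pyplot as plt

# Port, baud rate, and message format ("w,x,y,z\n") are assumptions;
# adjust for your own Teensy sketch and machine
ser = serial.Serial('/dev/ttyS4', 115200, timeout=1)

ax = plt.figure().add_subplot(projection='3d')
plt.ion()   # interactive mode so the window updates inside the loop
plt.show()

while plt.fignum_exists(ax.figure.number):
    line = ser.readline().decode().strip()
    if not line:
        continue
    w, x, y, z = (float(v) for v in line.split(','))
    q = np.quaternion(w, x, y, z)
    # quat_to_elev_azim_roll is defined in the snippet above
    elev, azim, roll = quat_to_elev_azim_roll(q)
    ax.view_init(elev, azim, roll)
    plt.pause(0.01)   # let matplotlib redraw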

I'm pretty happy with how it all works, with the caveat that there's a good bit of lag and a pretty low framerate, resulting in jittery movement. The bottleneck is matplotlib's speed in redrawing the plot, so perhaps that will be the next improvement to work on…

Check out this GitHub repo for the full code used to generate that video!
