

Accelerating Postprocessing of Tomographic Atomic Force Microscopy Data with Lidar Toolbox

By Bryan Huey, University of Connecticut


“In addition to simplifying tasks such as performing 3D segmentation, calculating surface normals and curvature, and determining dependencies along various vectors, MATLAB and Lidar Toolbox have reduced the time required to postprocess TAFM data sets from many hours to just minutes.”

Atomic force microscopy (AFM) is a cornerstone technique in nanotechnology, enabling researchers to obtain detailed insights into surface topography with subnanometer resolution. The technique involves scanning a sharp probe across a sample to map surface features with exceptionally high precision—a capability that has made AFM a vital tool in materials science, physics, mechanical engineering, and biology, among other fields.

In traditional AFM, one of the aims is to minimize the force with which the probe makes contact with the material, often reducing this force to mere piconewtons. My research group at the University of Connecticut (UConn) turns this idea on its head by forcing the probe to scrape or dig into the sample so that we can measure piezoresponse, photocurrent, and other material properties at or even below the surface. This novel approach—known as tomographic AFM (TAFM)—enables the reconstruction of a 3D image of the sample, revealing internal structures and subsurface features undetectable via traditional AFM (Figure 1).

Figure 1. A 3D representation of the piezoresponse of a nanocomposite as measured via TAFM, with a 500 × 500 × 25 nm scale bar.

Processing TAFM data comes with its own set of challenges. One major issue is the sparseness of data in the z-direction (depth). Unlike conventional imaging methods that uniformly acquire data, TAFM often generates a limited number of nonlinearly distributed data points, especially along the z-axis. This sparse data distribution complicates the reconstruction process, requiring sophisticated computational methods to accurately interpolate and visualize the missing information from thousands of consecutive images.

My group, the HueyAFM Labs in the Materials Science and Engineering Department at UConn, has recently implemented a new approach for postprocessing TAFM data. Based on MATLAB®, this new approach incorporates an innovative use of Lidar Toolbox™—a product typically used by engineers in automotive and other industries for the design, analysis, and testing of lidar processing systems—to accelerate the visualization and analysis of TAFM data. Lidar Toolbox point cloud capabilities are particularly useful for advanced visualization of raw TAFM data, enabling us to grid the sparse results for exporting to 3D image stacks. In addition to simplifying tasks such as performing 3D segmentation, calculating surface normals and curvature, and determining dependencies along various vectors, MATLAB and Lidar Toolbox have reduced the time required to postprocess TAFM data sets from many hours to just minutes, which has significantly increased the pace and impact of our research.
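To illustrate what exporting gridded results to a 3D image stack can look like in practice, the short sketch below writes each depth slice of a gridded voxel volume as one page of a multipage TIFF. The volume V, its scaling, and the file name are placeholders for illustration rather than our actual export routine.

% Sketch: export a gridded voxel volume V (x-by-y-by-z, one property value per
% voxel) as a 3D image stack, one depth slice per TIFF page. V is a placeholder.
Vs = rescale(V);                                        % map property values to [0, 1]
for k = 1:size(Vs, 3)
    sliceImg = uint16(round(65535 * Vs(:, :, k)));      % one 16-bit depth slice
    if k == 1
        imwrite(sliceImg, 'tafm_stack.tif');                        % first page
    else
        imwrite(sliceImg, 'tafm_stack.tif', 'WriteMode', 'append'); % later pages
    end
end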

Challenges with Traditional Postprocessing

The atomic force microscope we use in our lab (Figure 2) produces roughly 100 million data points for a single tomographic experiment. This includes measurements at more than 10 million different coordinates along the x, y, and z directions, with multiple properties measured at each coordinate, including, for example, force, piezoelectricity, conductivity, photoconductivity, surface potential, stiffness, and/or magnetic fields.

A computer station at HueyAFM Labs that is connected to an Oxford Instruments Asylum Research AFM.

Figure 2. One of four Oxford Instruments Asylum Research AFMs at UConn’s HueyAFM Labs.

In the end, we need our visualizations to show what can be thought of as a structure of colored building blocks, where the color of each block—or voxel—indicates the value of a particular measured material property in that tiny sample volume. The x- and y-dimensions of this building block structure are well-defined, but the z-dimension requires postprocessing to account for the sparseness and nonuniform distribution of the data that results from how we finely dig into the sample during our experiments. Initially, we used 3D interpolation algorithms in MATLAB to postprocess the experimental data for visualization at uniform depths. While this approach worked, it required several hours of processing time due to the size and complexity of the data sets.
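As an illustration of that interpolation-based workflow, a minimal sketch in MATLAB might look like the following; the coordinate vectors, property values, and grid spacings are placeholders, not our production script.

% Sketch of the original interpolation-based postprocessing (illustrative only).
% x, y, z are vectors of measurement coordinates (nm); val is the measured
% property (e.g., piezoresponse) at each coordinate.
F = scatteredInterpolant(x, y, z, val, 'linear', 'none');

% Define a uniform voxel grid; the spacings here are arbitrary placeholders.
[xq, yq, zq] = ndgrid(0:5:500, 0:5:500, 0:1:25);

% Evaluate the interpolant on the grid to obtain a regular 3D volume.
V = F(xq, yq, zq);                          % NaN outside the convex hull of the data

% Visualize one depth slice of the reconstructed volume.
imagesc(squeeze(V(:, :, 10))); axis image; colorbar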

A Novel Use Case for Lidar Toolbox

While looking for a way to shorten postprocessing time, I came upon the idea of using the point cloud capabilities of Lidar Toolbox to analyze TAFM data. I used rmmissing to deal with missing data or data points sometimes masked away for cropping, filtering, or segmentation purposes; pcdownsample to account for the volume of data; and pcshow for visualization.
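A minimal sketch of how these functions can fit together is shown below. The variable names, the 2 nm downsampling step, and the choice to carry the measured property as point cloud intensity are illustrative assumptions rather than our exact code.

% Minimal sketch of the point cloud workflow (variable names are placeholders).
% Each row of data is [x y z property], e.g., piezoresponse at that coordinate.
data = rmmissing(data);                    % drop rows with NaN (masked or cropped points)

% Build a point cloud; the measured property is stored as intensity so that
% pcshow can color each point by its value.
ptCloud = pointCloud(data(:, 1:3), 'Intensity', data(:, 4));

% Thin the cloud to keep rendering responsive; the 2 nm grid step is an assumption.
ptCloudDown = pcdownsample(ptCloud, 'gridAverage', 2);

% Interactive 3D visualization, colored by the measured property.
pcshow(ptCloudDown.Location, ptCloudDown.Intensity)
xlabel('x (nm)'); ylabel('y (nm)'); zlabel('z (nm)'); colorbar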

The principal benefit of using Lidar Toolbox for TAFM postprocessing is speed: It is easily a thousand times faster than our previous approach. Another benefit is improved visualization, which is crucial when we explore the data. Now we can zoom in and out while rotating the sample to better visualize the material and its properties (Figure 3).


Figure 3. A zoomed-in view of a 3D TAFM image comprising more than 520 million data points, where the color reveals variations in piezoresponse up to a maximum depth of 320 nm.

Just as important, our new postprocessing approach with Lidar Toolbox has increased our confidence in our analysis. For example, with Lidar Toolbox we can easily determine how many data points are present in any given voxel. Moreover, we can choose the depth of each voxel, optimizing this depth so that most voxels contain at least one data point. As researchers, we have a much better feel for the accuracy of our measurements. Of course, when we publish our findings, it is crucial to report that we are not simply interpolating between sparse data points, but rather that we have real data for nearly all the possible voxels in the analyzed volume (typically at least 99%).
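As a rough illustration of this voxel occupancy check, the sketch below bins raw point coordinates into a voxel grid with base MATLAB functions and reports the fraction of voxels containing at least one measurement; the grid edges and voxel depth shown are placeholder values.

% Sketch: count raw TAFM data points per voxel to assess coverage.
% xyz is an N-by-3 matrix of point coordinates (nm); edges define the voxel grid.
xEdges = 0:5:500;  yEdges = 0:5:500;  zEdges = 0:2:320;   % placeholder spacings

ix = discretize(xyz(:, 1), xEdges);
iy = discretize(xyz(:, 2), yEdges);
iz = discretize(xyz(:, 3), zEdges);
valid = ~isnan(ix) & ~isnan(iy) & ~isnan(iz);              % drop out-of-range points

% Accumulate point counts into a 3D occupancy volume.
sz = [numel(xEdges) numel(yEdges) numel(zEdges)] - 1;
counts = accumarray([ix(valid) iy(valid) iz(valid)], 1, sz);

% Fraction of voxels that contain at least one measured data point.
occupiedFraction = sum(counts(:) > 0) / numel(counts)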

Next Steps

Incorporating Lidar Toolbox into our postprocessing workflow has opened new opportunities to learn more about the materials we study. For example, we can use the x-, y-, and z-positions to define a surface, and then use Lidar Toolbox functions to analyze and quantify the curvature of that surface (Figure 4).

Two graphs visualizing the minimum and maximum principal curvatures of features such as peaks and valleys on a TAFM data-derived surface.

Figure 4. Visualizing the two principal curvatures (κ1 on the left and κ2 on the right) to better understand features such as peaks, valleys, and saddle points on a surface derived from TAFM data.
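A simplified sketch of this kind of surface analysis is shown below, using pcnormals for surface normals and a local PCA-based surface variation value as a rough stand-in for curvature. The neighborhood size and variable names are assumptions, and this is not necessarily the exact routine behind Figure 4.

% Sketch: estimate surface normals and a simple curvature proxy on a
% TAFM-derived surface point cloud (ptSurf). Neighborhood size is a guess.
normals = pcnormals(ptSurf, 20);           % normals from 20 nearest neighbors

numPts  = ptSurf.Count;
surfVar = zeros(numPts, 1);
for i = 1:numPts
    % Local PCA: the smallest eigenvalue relative to the total is often used
    % as a "surface variation" measure, a rough stand-in for curvature.
    idx  = findNearestNeighbors(ptSurf, ptSurf.Location(i, :), 20);
    nbrs = ptSurf.Location(idx, :);
    ev   = eig(cov(double(nbrs)));
    surfVar(i) = min(ev) / sum(ev);
end

% Color the surface by the curvature proxy to highlight peaks, valleys, and saddles.
pcshow(ptSurf.Location, surfVar); colorbar
title('Surface variation (curvature proxy)')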

Lidar Toolbox has additional capabilities that we have yet to apply, and we plan to explore them going forward. We also see automated analysis of large TAFM data sets as an inevitable next step. The ability to integrate our postprocessing and point cloud activities with other MATLAB capabilities—including machine learning and AI—will be increasingly valuable.

These advances, together with the increased pace of research made possible by MATLAB and Lidar Toolbox, are continuing to deepen our understanding of the properties of materials essential to the performance of a host of vital technologies, such as sonar and ultrasonic imaging, computer memory devices, MEMS sensors, and solar panels. This deeper understanding will help in all phases of engineering. On the front end, equipped with such comprehensive knowledge of material properties down to the nanoscale, engineers can design more efficient and reliable technologies. On the back end, we can selectively assess regions of high or poor performance following in-service use or accelerated degradation. Ultimately, we can thus optimize the functionality and reliability of next-generation, materials-enabled solutions to engineering challenges.

Published 2024
