May 12

Lidar Point Cloud Data Processing


Have you heard about capturing reality in 3D? Well, that’s what Lidars are for!

Lidar is an abbreviation for Light Detection and Ranging. Did you know that although the Lidar-like system was first introduced in 1961 by the Hughes Aircraft Company, the common man was exposed to its accuracy and efficiency only in 1971 via the Apollo 15 mission?

Lidar Data Processing

When Lidar was introduced to the world, it was termed “Colidar”, for “coherent light detecting and ranging”, and was widely applied in satellite tracking. With its help, laser-based imaging and distance calculation were made easy by measuring the time a signal takes to return, using appropriate sensors and data acquisition equipment. It has since been used in meteorology to measure clouds and pollution, and in 1971 the Moon’s surface was mapped using a laser altimeter.

Lidar technology has ever since opened up a plethora of possibilities in this ever-growing world.

Applications of Lidar Technology

From satellites to mining, construction, and autonomous vehicles (AV), Lidar technology is on the rise. Following are a few of the areas where Lidar is gaining extensive popularity:

Autonomous Vehicles

Sensors play a compelling role in all autonomous systems. How, you ask? When we want to know the state of a system, the direct method is to measure its variables using sensors. This data is processed and then used to control the system or modify a specific parameter.

Lidar in Autonomous Vehicles

The primary sensors available for AVs are IMU, RADAR, camera vision, and Lidar. Vision-based cameras work well in high-visibility conditions: they recognize text and color (traffic signs and indicators from other vehicles), provide parking assistance, and identify road markings. But they fall short because they are sensitive to light and weather conditions and cannot measure the distance to objects (it is possible with a complex integration of multiple cameras, at a high computational cost!). RADAR uses radio waves to measure distance and is more efficient than cameras: it performs well in low-visibility conditions and measures the relative speed between vehicles and objects. Lidar operates on infrared waves to provide accurate mapping of its surroundings and is largely unaffected by lighting conditions. But it definitely cannot tell whether a cyclist is about to turn by watching them look over their shoulder. Therefore, combining vision-, RADAR-, and Lidar-based systems has increased the efficiency, accuracy, and precision of object detection and recognition in ADAS (Advanced Driver Assistance Systems).

Survey

Surveying is one of the most crucial areas of civil engineering. Mapping and surveying of landscapes for mining, agriculture, construction, and accounting for resources are all undertaken by drones or UAVs. The tedious task of surveying a site manually by a team of workers is made easy, simple, and effective by Lidar technology embedded in aerial vehicles. This has also made it possible to explore places unreachable by humans.

Many Lidar companies are now forerunners in providing high-quality point cloud data and better post-processing techniques.

3D Mapping

With current and fast-evolving technology, it is possible to obtain a 3D model and map of the surrounding environment of nearly any object you can imagine. This is possible with Lidar point cloud datasets, available as .las, .laz, or .dem files. 3D modeling has brought efficiency to the planning and placement of power transmission lines in remote areas. It also helps explore unknown terrain in autonomous navigation and military applications.

Working of Lidar Sensors

The remote sensing Lidar technique is used to measure elevation and obtain a 360° view of the ground, buildings, and forests. A Lidar system uses laser pulses to measure the distance between the source and surrounding objects. The time elapsed between emitting a laser pulse and receiving its reflection gives the distance the light has travelled: distance = (speed of light × time of flight) / 2. Lidar is adept at emitting and computing data from hundreds of thousands of laser pulses per second, with ranges of roughly 60 to 200 m. These data points form a point cloud used to create 3D models of objects in the environment.
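As a quick sanity check, the range equation above can be sketched in MATLAB. The round-trip time here is a made-up example value, not a real sensor reading:

```matlab
% Sketch of the Lidar range equation, using a hypothetical pulse return time
c = 299792458;           % speed of light in m/s
t = 6.7e-7;              % example round-trip time of one pulse, in seconds
range = c * t / 2;       % divide by 2: the pulse travels to the object and back
fprintf('Estimated range: %.2f m\n', range);   % roughly 100 m for this example
```

Repeating this computation for every returned pulse, hundreds of thousands of times per second, is what produces the point cloud.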

Lidar data can be preprocessed using the Automated Driving, Lidar, or Computer Vision Toolbox, depending on the application. We are extensively using the latter two toolboxes in this blog to explore various methods of importing Lidar data and a basic preprocessing step, i.e., extracting ground and non-ground planes.

What is Point Cloud Data?

Wait, are we talking about rain clouds or cloud data? Yes, it’s the latter. Lidar sensors produce data of scanned objects in the form of 2D and 3D point clouds. Point cloud data is a set of data points in space representing the x, y, and z spatial coordinates of objects. For the organized Ouster OS1 data used later in this blog, each of these three channels is of size 64 × 1024.

Cloud Data

Point clouds are categorized into:

  • Unorganized point cloud: a raw point cloud with format M×C, where M is the number of points in the point cloud and C is the number of channels. A Lidar sensor is a typical source of unorganized point cloud data, though this also depends on the type of Lidar used in the application
  • Organized point cloud: it has size M×N×C, where M is the length of the point cloud, N is the width of the point cloud, and C is the number of channels. Stereo cameras are an example source of organized point cloud data
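The two layouts can be illustrated with a small sketch using MATLAB's pointCloud object. The arrays here are random placeholders standing in for real sensor output, and the 64 × 1024 shape mirrors the organized data described in this blog:

```matlab
% Unorganized layout: M x C, here C = 3 channels (x, y, z)
M = 1000;                        % number of points
xyzUnorg = rand(M, 3);           % placeholder data in place of a real scan
ptUnorg  = pointCloud(xyzUnorg);

% Organized layout: M x N x C, e.g. 64 scan lines of 1024 points each
H = 64; W = 1024;
xyzOrg = rand(H, W, 3);          % placeholder data again
ptOrg  = pointCloud(xyzOrg);     % retains the grid structure of the sensor
```

The organized form keeps neighboring points adjacent in the array, which is what allows the grid-based processing used later in this blog.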

In this blog, we will look at three primary sources of point cloud data and their preprocessing techniques:

  • Ego vehicle
  • Velodyne Sensor
  • Open Source Lidar dataset

Ego Vehicle

For this part, we will consider the data received from a Lidar sensor mounted on top of an ego vehicle, recorded from a highway-driving scenario.

Just a quick note: If you want to know more about Driving Scenario Toolbox, check out our blog Driving Scenario Designer App

The sensor used is the Ouster OS1 Lidar sensor, which is mounted horizontally to the ground plane. This is organized data with 64 horizontal scan lines. This data can be downloaded using the helper function helperDownloadData. This sensor also provides intensity and range measurements of each point.

% Download Lidar data

[ptClouds,pretrainedModel] = helperDownloadData;

% Load point cloud

ptCloudA = ptClouds{1};

The data obtained is pre-processed to extract objects such as cars, cyclists, pedestrians, and buildings. The major steps are: ground plane segmentation, semantic segmentation, oriented bounding box fitting, and tracking of oriented bounding boxes. We are going to look at the most fundamental pre-processing step, which is ground plane segmentation.

We can either use the default limits or define the x-, y-, and z-limits of a region of interest; we have used the latter approach in this article. To visualize the point cloud, use the pcshow function.

% Preprocessing of Organized Point Cloud Data

% Region Of Interest (ROI) Selection Method 1: General Method

% Define ROI for cropping point cloud

xLimit = [-30,30];

yLimit = [-12,12];

zLimit = [-3,15];

player = pcplayer(xLimit,yLimit,zLimit);

roi = [xLimit,yLimit,zLimit];

% Display figure

figure

pcshow(ptCloudA.Location);
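The ROI defined above can also be used to actually crop the cloud. One way, sketched here using the roi and ptCloudA variables defined above, is the findPointsInROI and select functions from the Computer Vision Toolbox:

```matlab
% Sketch: crop the point cloud to the ROI defined above
indices   = findPointsInROI(ptCloudA, roi);  % indices of points inside the ROI
ptCropped = select(ptCloudA, indices);       % keep only those points

figure
pcshow(ptCropped.Location);                  % visualize the cropped cloud
```

Cropping early like this keeps later segmentation steps from wasting effort on distant, sparse returns.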

Each point in the organized point cloud is segmented into ground and non-ground elements using the helperExtractGround function, a helper provided alongside the example data.

% Extract ground plane

[nonGround,ground] = helperExtractGround(ptCloudA,roi);

figure;

pcshowpair(nonGround,ground);

The segmentGroundFromLidarData function separates ground and non-ground point cloud data by extracting their indices. Piece-wise plane fitting is then employed using the pcfitplane function, dividing the estimated ground plane into strips along the direction of the vehicle. Both approaches are combined in the helperExtractGround function, which provides robustness to variations in point cloud data and effective piece-wise fitting of the ground plane. The points representing the ground plane are shown in green and the obstacles in purple.

Ground and Non-Ground Points

Velodyne Sensor

Among the various sensors on the market, the one that has been in the news of late is the Velodyne sensor. The formerly stand-alone Velodyne is now part of Ouster. Its high-end Alpha Prime sensor has a horizontal field of view of 360° and a vertical field of view of 40°, with a high angular resolution of (0.2° × 0.1°).

MATLAB has a Lidar Toolbox Support Package for Velodyne Lidar Sensors for data acquisition and processing, developed for this range of sensors; we have to choose the sensor appropriate for our application. Make sure you have MATLAB R2020b or later installed, with Lidar Toolbox as a must, and, of course, you need a supported Velodyne sensor. Using this support package, we can process recorded data in the PCAP file format.

Importing data from this sensor is a simple step achieved by using the velodyneFileReader function.

% Data Input Method 2: Read Data from Velodyne Sensor

% veloReader = velodyneFileReader('LidarData_ConstructionRoad.pcap','HDL32E');

If you want to experiment on your own, follow the same pre-processing steps of defining an ROI and extracting ground and non-ground points with the helperExtractGround function that we used for the ego vehicle data. Later, you can refer to the Velodyne Lidar Sensors Data Acquisition page from MathWorks for an even more detailed explanation.
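A typical pattern is to stream the recording frame by frame. The sketch below assumes the PCAP file named in the commented-out line above is available locally, and the display limits are arbitrary illustrative values:

```matlab
% Sketch: stream frames from a recorded Velodyne PCAP file
veloReader = velodyneFileReader('LidarData_ConstructionRoad.pcap', 'HDL32E');

player = pcplayer([-60 60], [-60 60], [-20 20]);  % x, y, z display limits (illustrative)

while hasFrame(veloReader) && isOpen(player)
    ptCloudFrame = readFrame(veloReader);  % one full sweep as a pointCloud object
    view(player, ptCloudFrame);            % stream it to the player window
end
```

Each frame returned by readFrame is an organized point cloud, so the same ROI cropping and ground extraction steps from the ego vehicle section apply directly.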

Open Source Lidar dataset    

Commercially available sensors cost a bomb! If you are a student or a researcher, investing in them is hardly feasible, so we go with the only other option: an open-source dataset. Lidar datasets are made available for academic and restricted use by many organizations. Here are a few websites where you can download Lidar datasets for free: USGS Earth Explorer, NOAA Digital Coast, Open Topography, and the National Ecological Observatory Network (NEON).

These web services not only provide the world’s datasets at the click of a button but also come in handy in scenarios such as designing an earthquake-proof model, where relevant post-earthquake data of the required region can be collected for analysis. To make use of these services, all you have to do is a simple sign-up that costs you absolutely nothing.

Open Topography

I found Open Topography straightforward for both beginners and advanced users. Under Find Data Map in the drop-down menu, you can access data from any region directly from maps, or you can filter data points based on the type of dataset, collector, and funder. Data from 2006 through 2022 is available for various topographies, measured over periods of a few days to a few months.

Data Map from Open Topography

After downloading the required Lidar dataset, which usually takes less than a minute, let us move to MATLAB and study how the data is imported and processed.

We are exclusively looking at the processing of .laz point cloud data. When it comes to .las or .laz files, MATLAB has the lasFileReader object, which can be used to read the data and obtain classification information of detected objects. And that is exactly what we are using!

lasFileReader reads the .laz point cloud data file as an object, and a point cloud is created with the help of the readPointCloud function.

% Import Lidar data from laz point cloud

LidarData = fullfile("points.laz");

lasReader = lasFileReader(LidarData);

% Read point cloud

ptCloudB = readPointCloud(lasReader);

Las Point Cloud Data

As this is an unorganized point cloud, we cannot use the same processing techniques as earlier. You can either define a region of interest to crop the data, or use the ClassificationInfo property of the lasFileReader object, which provides information on the different classes present in the input point cloud.

disp(lasReader.ClassificationInfo);

Ground plane extraction from unorganized point cloud data is achieved using the segmentGroundSMRF function, where SMRF stands for Simple Morphological Filter. The algorithm has the following stages: creating a minimum elevation map of the surface, segmenting the surface map into ground and non-ground elements using a morphological opening operation, and segmenting the original point cloud data.

[groundPtsIdx,nonGroundPtCloud,groundPtCloud] = segmentGroundSMRF(ptCloudB);

The final visualization of the 3D map shows ground points in purple and non-ground points in green.
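That visualization can be produced with pcshowpair, reusing the outputs of the segmentGroundSMRF call above; pcshowpair renders the first cloud in magenta/purple and the second in green:

```matlab
% Sketch: visualize the SMRF segmentation result
figure
pcshowpair(groundPtCloud, nonGroundPtCloud);  % ground purple, non-ground green
title('Ground vs. non-ground segmentation');
```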

Segmentation of Las point cloud into ground and non-ground plane

Conclusion

In conclusion, Lidar point cloud data processing has emerged as a fundamental component in various applications such as autonomous vehicles, surveying, and 3D mapping. This blog provided an overview of Lidar technology, its applications, and how to process Lidar data using different sources like ego vehicles, Velodyne sensors, and open-source datasets. We explored the basic preprocessing steps involved, such as ground plane segmentation and point cloud data extraction, along with the tools available in MATLAB for processing Lidar data.

As Lidar technology continues to advance, it will undoubtedly unlock new possibilities and applications that are currently unimaginable. By understanding the fundamentals of Lidar data processing, you can harness the power of this technology to create innovative solutions for various fields and industries. Whether you are a student, researcher, or professional, mastering the techniques and tools discussed in this blog will be invaluable in your journey to explore and contribute to the rapidly evolving world of Lidar technology.


Get instant access to the code, model, or application of the video or article you found helpful! Simply purchase the specific title, if available, and receive the download link right away! #MATLABHelper #CodeMadeEasy

Ready to take your MATLAB skills to the next level? Look no further! At MATLAB Helper, we've got you covered. From free community support to expert help and training, we've got all the resources you need to become a pro in no time. If you have any questions or queries, don't hesitate to reach out to us. Simply post a comment below or send us an email at [email protected].

And don't forget to connect with us on LinkedIn, Facebook, and Subscribe to our YouTube Channel! We're always sharing helpful tips and updates, so you can stay up-to-date on everything related to MATLAB. Plus, if you spot any bugs or errors on our website, just let us know and we'll make sure to fix it ASAP.

Ready to get started? Book your expert help with Research Assistance plan today and get personalized assistance tailored to your needs. Or, if you're looking for more comprehensive training, join one of our training modules and get hands-on experience with the latest techniques and technologies. The choice is yours – start learning and growing with MATLAB Helper today!

Education is our future. MATLAB is our feature. Happy MATLABing!


Tags

3D mapping, autonomous vehicles, data processing, ground plane segmentation, Lidar, point cloud, remote sensing, surveying, Velodyne sensors


About the author 

SANJANA RAMESH TIWARI

MATLAB Developer at MATLAB Helper,
MTech in Sensors and Control Systems,
Cheers to an attempt at combining my love for writing and technology!

