In January of this year, I was convinced to submit an abstract for FOSS4G North America 2016. To my delight, my talk, "Filtering point clouds with PDAL and PCL," was accepted. This is a topic that is near and dear to me as an early adopter of and contributor to both PDAL and PCL.
To this day, I find it quite convenient to leverage PCL's extensive collection of modules when developing new approaches to processing point cloud data. Though the pace of PCL development has slowed (the latest release, workshop, and code sprint all date to 2014), there is still a wealth of algorithms that can aid in point cloud processing and analysis tasks, and the library is easily extensible.
While point clouds can be derived from a number of sources, my focus continues to be on point clouds collected by airborne lidar systems, where LAS, LAZ, and the lesser-known BPF formats are the norm. When it comes to reading and writing these (and other) formats, I'd rather not worry about the details, which is where PDAL shines. The PDAL CLI allows me to effortlessly transcode between formats using the translate subcommand.
$ pdal translate -i input.bpf -o output.laz
Here we have converted a point cloud stored in the BPF format to one compressed as LAZ. But the fun doesn't end there! We can also construct processing pipelines by inserting filters.
$ pdal translate -i input.bpf -o output.laz -f range \
--filters.range.limits="Z(0:]"
This example performs the same format conversion, but uses a range filter to only pass points with Z values that are greater than 0.
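Multiple dimensions can be filtered in a single range filter by separating the limits with commas. As a quick sketch (assuming your data follows the ASPRS convention of class 2 for ground), the following would keep only ground-classified returns with positive elevations:

$ pdal translate -i input.bpf -o output.laz -f range \
--filters.range.limits="Classification[2:2],Z(0:]"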
Pipelines can also be specified as JSON and invoked using the pipeline subcommand.
$ pdal pipeline -i pipeline.json
Here is the earlier transcoding example, specified using PDAL's JSON specification.
{
  "pipeline":[
    "input.bpf",
    "output.laz"
  ]
}
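The reader and writer stages can also be spelled out explicitly when you want to set their options. Here is a sketch of the same transcode with explicit readers.bpf and writers.las stages; note that the laszip value for the writer's compression option is my assumption here, so check the writers.las documentation for your PDAL version:

{
  "pipeline":[
    {
      "type":"readers.bpf",
      "filename":"input.bpf"
    },
    {
      "type":"writers.las",
      "filename":"output.laz",
      "compression":"laszip"
    }
  ]
}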
Similarly, the range filtering example is given by:
{
  "pipeline":[
    "input.bpf",
    {
      "type":"filters.range",
      "limits":"Z(0:]"
    },
    "output.laz"
  ]
}
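Filters chain naturally: each stage in the array feeds the next. As a sketch, appending a filters.crop stage after the range filter would additionally clip the passing points to a bounding box (the bounds below are placeholder coordinates, not taken from the earlier examples):

{
  "pipeline":[
    "input.bpf",
    {
      "type":"filters.range",
      "limits":"Z(0:]"
    },
    {
      "type":"filters.crop",
      "bounds":"([0,1000],[0,1000])"
    },
    "output.laz"
  ]
}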
This barely scratches the surface of what PDAL can do, but I think you get the idea.
While it was at first tempting to either 1) write format drivers for PCL or 2) write the processing algorithms for PDAL, both options overlook a vital aspect of open source software: community. If I were to go with option 1, I'd be on an island -- at least initially -- since the established PCL community already had its own format. The same was true for option 2. PDAL's goal in life is really to focus on formats. Sure, it would be nice to have many of the PCL algorithms living natively within PDAL, but I didn't want to spend the bulk of my time recoding a bunch of algorithms, and there didn't seem to be a body of developers eager to jump in on the task with me.
No. What I really want to do is come up with novel ways of processing the data. Sometimes that will mean writing a new algorithm or implementing an existing one. Other times it's simply a matter of wiring up existing algorithms in a new way. To that end, we've developed methods for incorporating PCL within PDAL, along with new and intuitive ways to interact with PDAL. Over the coming weeks, my hope is to share with you a number of methods we have developed that bridge the PDAL-PCL divide.
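As a preview, one such bridge is the PCL block filter, which hands the points off to a PCL pipeline described in its own JSON file. A minimal sketch, assuming a PDAL build compiled with PCL support so that the filters.pclblock stage and its filename option are available, and a hypothetical sor.json containing the PCL pipeline definition:

{
  "pipeline":[
    "input.bpf",
    {
      "type":"filters.pclblock",
      "filename":"sor.json"
    },
    "output.laz"
  ]
}

More on that, and on the PCL side of these pipelines, in the posts to come.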