LAScatalog formal class

A LAScatalog is a representation in R of a las file or a collection of las files that are not loaded in memory. Indeed, a regular computer cannot load an entire point cloud in R if it covers a broad area, and for very high density datasets it may even fail to load a single file (see also the “LAS formal class” vignette). In lidR, a LAScatalog is used to process datasets that cannot fit in memory.

A LAScatalog object is built by reading a folder of las files:

ctg <- readLAScatalog("path/to/las/files/")
# or
ctg <- readLAScatalog("path/to/las/files/big_file.las")
ctg
#> class       : LAScatalog (v1.2 format 1)
#> extent      : 883166.1, 895250.2, 625793.6, 639938.4 (xmin, xmax, ymin, ymax)
#> coord. ref. : NAD83 / UTM zone 17N 
#> area        : 111.1 km²
#> points      : 0  points
#> density     : 0 points/m²
#> num. files  : 62

Basic structure of a LAScatalog object

A LAScatalog contains a sf object with POLYGON geometries plus some extra slots that store information about how the LAScatalog will be processed.

The slot data of a LAScatalog object contains the sf object with the most important information read from the headers of the .las or .laz files. Reading only the headers provides a very quick overview of the content of the files without actually loading the point cloud. The columns of the table are named after the LAS specification version 1.4.
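
For instance, the header attributes can be inspected like any sf/data.frame; a minimal sketch, assuming the data slot layout of recent lidR versions:

# The data slot is an sf object holding one row per file,
# with the attributes read from the file headers
ctg@data
names(ctg@data)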

The other slots are documented in help("LAScatalog-class") so they are not described in this vignette.

Allowed and non-allowed manipulation of a LAScatalog object

The purpose of a LAScatalog is not to manipulate spatial data in R but to represent a set of existing las/laz files. Thus a LAScatalog cannot be modified, because it must remain consistent with the actual content of the files. The following throws an error:

ctg$Min.Z <- 0
#> Error: LAScatalog data read from standard files cannot be modified

It is obviously always possible to modify an R object by bypassing such simple restrictions, but in this case the user will break something internally and a correct output is no longer guaranteed.

However, it is possible to add or modify attributes using a name that is not reserved. The following is allowed:

ctg$newattribute <- 0

Validation of LAScatalog objects

Users commonly report bugs arising from invalid point clouds. This is why the function las_check() was introduced to inspect LAScatalog objects. It checks whether a LAScatalog object is consistent; it may happen, for example, that a collection mixes files of type 1 with files of type 3, or files with different scale factors.

las_check(ctg)
#> 
#>  Checking headers consistency
#>   - Checking file version consistency... ✓
#>   - Checking scale consistency... ✓
#>   - Checking offset consistency... ✓
#>   - Checking point type consistency... ✓
#>   - Checking VLR consistency... ✓
#>   - Checking CRS consistency... ✓
#>  Checking the headers
#>   - Checking scale factor validity... ✓
#>   - Checking Point Data Format ID validity... ✓
#>  Checking preprocessing already done 
#>   - Checking negative outliers...
#>     ⚠ 2 file(s) with points below 0
#>   - Checking normalization... no
#>  Checking the geometry
#>   - Checking overlapping tiles...
#>     ⚠ Some tiles seem to overlap each other
#>   - Checking point indexation... no

When applied to a LAScatalog, las_check() does not perform a deep inspection of the point cloud, unlike when it is applied to a LAS object. Indeed, the point cloud is not actually read.
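
If a deeper inspection of a collection is really needed, las_check() can read and check each file individually. The sketch below assumes the deep argument available in recent lidR versions; it is much slower because every file is actually loaded:

# Deep inspection: each file is read and its point cloud is checked individually
las_check(ctg, deep = TRUE)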

Display a LAScatalog object

lidR provides a simple plot() function to plot a LAScatalog object:

plot(ctg)

The option mapview = TRUE displays the LAScatalog on an interactive map with pan and zoom and allows a satellite map to be added in the background. It uses the package mapview internally. This is often useful to check whether the CRS of the files is properly registered, since in our experience the EPSG codes recorded in las files are sometimes incorrect.

plot(ctg, mapview = TRUE, map.type = "Esri.WorldImagery")
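
The CRS read from the headers can also be queried directly; a minimal sketch, assuming the sf methods provided by lidR:

# Query the coordinate reference system recorded in the file headers
sf::st_crs(ctg)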

Because a sf object is used to store the attributes of the las files, it is easy to display metadata of the files. In the following we can immediately see that the collection is not normalized and is likely to contain outliers:

plot(ctg["Min.Z"])

Apply lidR functions on a LAScatalog

Most lidR functions are compatible with a LAScatalog and work almost as they do with a single point cloud loaded in memory. In the following example we use the function pixel_metrics() to compute the mean elevation of the points. The output is a continuous wall-to-wall raster. It works exactly as if the input were a LAS object.

hmean <- pixel_metrics(ctg, ~mean(Z), 20)

However, processing a LAScatalog usually requires some tuning of the processing options to get better control of the computation. Indeed, if the collection is huge the output is likely to be huge as well and may not fit in R memory. For example, normalize_height() throws an error if used ‘as is’ without tuning the processing options: used as in the following example, the expected output would be a huge point cloud loaded in memory, and the lidR package forbids such a call:

output <- normalize_height(ctg, tin())
#> Error: This function requires that the LAScatalog provides an output file template.

Instead, one can use the processing option opt_output_files(). Processing options drive how the big files are split into small chunks and how the outputs are either returned into R or written to disk as files.

opt_output_files(ctg) <- "folder/where/to/store/outputs/{ORIGINALFILENAME}_normalized"
output <- normalize_height(ctg, tin())

Here the output is not a point cloud but a LAScatalog pointing to the newly created files. The user can check how the collection will be processed by calling summary():

summary(ctg)
#> class       : LAScatalog (v1.2 format 1)
#> extent      : 883166.1, 895250.2, 625793.6, 639938.4 (xmin, xmax, ymin, ymax)
#> coord. ref. : NAD83 / UTM zone 17N 
#> area        : 111.1 km²
#> points      : 0  points
#> density     : 0 points/m²
#> num. files  : 62 
#> proc. opt.  : buffer: 30 | chunk: 0
#> input opt.  : select: * | filter: 
#> output opt. : in memory | w2w guaranteed | merging enabled
#> drivers     :
#>  - Raster : no parameter
#>  - stars : NA_value = -999999  
#>  - Spatial : no parameter
#>  - SpatRaster : overwrite = FALSE  NAflag = -999999  
#>  - SpatVector : overwrite = FALSE  
#>  - LAS : no parameter
#>  - sf : quiet = TRUE  
#>  - data.frame : no parameter
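
The input options shown in the summary (select and filter) can also be tuned to reduce the amount of data read from the files. The sketch below uses opt_select() and opt_filter(); the filter string is only an illustrative assumption:

# Read only the coordinates of the points (no intensity, return number, etc.)
opt_select(ctg) <- "xyz"
# Apply a streaming filter at read time (illustrative example: drop points
# flagged as withheld)
opt_filter(ctg) <- "-drop_withheld"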

The plot() function can also display the chunk pattern, i.e. how the dataset is split into small chunks that will be sequentially processed:

opt_chunk_size(ctg) <- 0
plot(ctg, chunk = TRUE)

opt_chunk_size(ctg) <- 900
plot(ctg, chunk = TRUE)

Partial processing

It is possible to flag files that will not be processed but that will be used to load a buffer if required. In the following example only the central files will be processed, but the other ones are not removed and will be used to buffer the processed files.

ctg$processed <- FALSE
ctg$processed[c(19:20, 41:44, 49:50)] <- TRUE
plot(ctg)
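
Files can also be flagged using a spatial criterion; a minimal sketch, assuming a hypothetical shapefile region_of_interest.shp that describes the area to process:

# Flag only the files intersecting a region of interest (hypothetical file)
roi <- sf::st_read("region_of_interest.shp")
ctg$processed <- lengths(sf::st_intersects(ctg@data, roi)) > 0
plot(ctg)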

Some practical examples

Example 1 - Raster

Load a collection. Process each file sequentially. Returns a raster into R.

ctg <- readLAScatalog("path/to/las/files/")
hmean <- pixel_metrics(ctg, ~mean(Z), 20)

Example 2 - Raster

Load a collection. Process each file sequentially. For each file, write a raster on disk named after the processed file. Returns a lightweight virtual raster mosaic.

ctg <- readLAScatalog("path/to/las/files/")
opt_output_files(ctg) <- "folder/where/to/store/outputs/dtm_{ORIGINALFILENAME}"
dtm <- rasterize_terrain(ctg, tin())

Example 3 - Raster

Load a single file that is too big to be loaded in memory. Process small chunks of 100 x 100 meters at a time. Returns a raster into R.

ctg <- readLAScatalog("path/to/las/files/bigfile.las")
opt_chunk_size(ctg) <- 100
chm <- rasterize_canopy(ctg, p2r())

Example 4 - Tree detection

Load a collection. Process small chunks of 200 x 200 meters at a time. Each chunk is loaded with an extra 20 m buffer. Returns spatial points into R.

ctg <- readLAScatalog("path/to/las/files/")
opt_chunk_size(ctg) <- 200
opt_chunk_buffer(ctg) <- 20
ttops <- locate_trees(ctg, lmf(5))

Example 5 - Decimate

This is forbidden: the decimated point cloud for the whole collection would be too big to be returned in memory (see example 6 for writing the output to files instead).

ctg <- readLAScatalog("path/to/las/files/")
decimated <- decimate_points(ctg, homogenize(4))

Example 6 - Decimate

Load a collection. Process small chunks of 500 x 500 meters sequentially. For each chunk, write a laz file on disk named after the coordinates of the chunk. Returns a lightweight LAScatalog. Note that the original collection has effectively been retiled.

ctg <- readLAScatalog("path/to/las/files/")
opt_chunk_size(ctg) <- 500
opt_output_files(ctg) <- "folder/where/to/store/outputs/project_{XLEFT}_{YBOTTOM}_decimated"
opt_laz_compression(ctg) <- TRUE
decimated <- decimate_points(ctg, homogenize(4))

Example 7 - Clip

Load a collection. Load a shapefile of plot centers. Extract the plots. Returns a list of extracted point clouds in R.

ctg <- readLAScatalog("path/to/las/files/")
shp <- sf::st_read("plot_center.shp")
plots <- clip_roi(ctg, shp, radius = 11.2)

Example 8 - Clip

Load a collection. Load a shapefile of plot centers. Extract the plots and immediately write them into files named after the coordinates of the plot and an attribute of the shapefile (here PLOTID, if such an attribute exists in the shapefile). Returns a lightweight LAScatalog.

ctg <- readLAScatalog("path/to/las/files/")
shp <- sf::st_read("plot_center.shp")
opt_output_files(ctg) <- "folder/where/to/store/outputs/plot_{XCENTER}_{YCENTER}_{PLOTID}"
plots <- clip_roi(ctg, shp, radius = 11.2)