How can I make LAS data output from an RPAS survey, processed in Agisoft Metashape, work in lidR with catalog_apply?

Geographic Information Systems Asked on July 3, 2021

I am creating a Normalized Canopy Height Model using a ground DTM from lidar and a set of surface-elevation LAS files from an RPAS (drone), processed and exported with Agisoft Metashape (these supply the vegetation surface). I want to filter/decimate the RPAS LAS files (read into a LAScatalog) and then subtract the lidar DTM inside a function called via catalog_apply in lidR.

It works for some chunks (all with the warning WARNING: data type 22 of attribute 1 ('normals') is deprecated), but most chunks come back empty. It finally fails with the error cannot allocate vector of size xxxMB.

I do not know if it's a code error or a data error. I'll provide the code in case someone can spot a mistake. Otherwise, is there a way to "fix" the LAS data? (I already tried las2las to change the LAS format from 1.2 to 1.4.)

library(lidR)

RPAS_LASCatalogue <- readLAScatalog("D:/Projects2020/p2020_1021_RPAS_EFI_test/RGB/RPAS_LAS", select = "xyzc")
LIDAR_LASCatalogue <- readLAScatalog("D:/Projects2020/p2020_1021_RPAS_EFI_test/lidar_LAS/original", select = "xyzc", filter = "-keep_first -drop_z_below 0 -drop_overlap", progress = TRUE)

plot(RPAS_LASCatalogue)
plot(LIDAR_LASCatalogue)

LIDAR_DTMGrid <- grid_terrain(LIDAR_LASCatalogue, 1, knnidw(), use_class = c(2L, 9L))

RPAS_LASFilterFunction = function(chunk)
{
  RPAS_LASChunk = readLAS(chunk)
  if (is.empty(RPAS_LASChunk))
    return(NULL)

  # filter_poi() is the current lidR name for the deprecated lasfilter()
  RPAS_LASChunk <- filter_poi(RPAS_LASChunk, Classification %in% c(3L, 4L, 5L))
  RPAS_LASChunk <- filter_duplicates(RPAS_LASChunk)
  RPAS_LASChunk <- decimate_points(RPAS_LASChunk, homogenize(density = 2, res = 1, use_pulse = FALSE))
  if (is.empty(RPAS_LASChunk))
    return(NULL)

  RPAS_LASFile_Normalized <- RPAS_LASChunk - LIDAR_DTMGrid
  RPAS_LASFile_Normalized <- filter_poi(RPAS_LASFile_Normalized, Z > 1)
  return(RPAS_LASFile_Normalized)
}

opt_chunk_size(RPAS_LASCatalogue) <- 50
opt_output_files(RPAS_LASCatalogue) <- "D:/Projects2020/p2020_1021_RPAS_EFI_test/RGB/RPAS_LAS/RPAS_LASFile_Normalized_{ID}"

catalog_apply(RPAS_LASCatalogue, RPAS_LASFilterFunction)

[Image: chunk pattern of the processed catalog]

2 Answers

I wasn't able to decimate the RPAS LAS files (represented in a catalog here) in chunks through the function in the code above, but I was able to decimate each RPAS LAS file on its own. I then found a more practical approach: filtering the dense point cloud in Agisoft Metashape before exporting and bringing it into lidR. Although decimating the points was no longer necessary, the code above did work on the filtered files. I don't know why it did not work on the first set of very dense LAS files, but I think it is better to filter in Agisoft Metashape first... so (along with the enhancements/corrections to the code from JRR) I will mark this as the answer.

Correct answer by Ray J on July 3, 2021

About the warning

WARNING: data type 22 of attribute 1 ('normals') is deprecated

It means that your LAS files have more than the core attributes defined by the LAS specification. Your files carry extra attributes for each point. Each extra attribute is associated with a data-type code that tells how to read it. For example, the code 3 means the extra attribute is a 2-byte unsigned integer (see Table 24 of the specification). In your case the code is 22, which is deprecated by the LAS specification, so your normals attribute does not conform to the specification.
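If the offending attribute is not needed, one way to silence the warning is to strip it and rewrite the file. A minimal sketch with lidR's extra-bytes helpers; the file names are placeholders and the attribute name "normals" is taken from the warning text:

```r
library(lidR)

# Read one RPAS tile (placeholder path), drop the non-conforming
# extra-bytes attribute, and write a cleaned copy.
las <- readLAS("RPAS_tile.las")
las <- remove_lasattribute(las, "normals")
writeLAS(las, "RPAS_tile_fixed.las")
```

Note that `select = "xyzc"` in readLAS/readLAScatalog already avoids loading extra attributes into memory, so rewriting the files is only worthwhile if the warning itself is a problem downstream.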

About the error

cannot allocate vector of size xxx MB

Without a reproducible example it is hard to say anything. The error is explicit, but we know nothing about the point cloud (size, density) or your computer (memory). Moreover, you obfuscated the xxx value. I don't understand why you make 50 m chunks. Is the point cloud so dense that you cannot load more data? Yet you were able to compute a DTM by loading each file. Below is a modified version of the code that is expected to use memory more carefully.

RPAS_LASFilterFunction = function(chunk, LIDAR_DTMGrid)
{
  RPAS_LASChunk = readLAS(chunk)
  if (is.empty(RPAS_LASChunk))
    return(NULL)

  RPAS_LASChunk <- filter_duplicates(RPAS_LASChunk)
  RPAS_LASChunk <- decimate_points(RPAS_LASChunk, homogenize(density = 2, res = 1, use_pulse = FALSE))

  if (is.empty(RPAS_LASChunk))
    return(NULL)

  RPAS_LASChunk <- RPAS_LASChunk - LIDAR_DTMGrid
  RPAS_LASChunk <- filter_poi(RPAS_LASChunk, Z > 1, buffer == 0) # Do not forget to remove the buffer
  return(RPAS_LASChunk)
}

opt_filter(RPAS_LASCatalogue) <-  "-keep_class 3 4 5"
opt_chunk_size(RPAS_LASCatalogue) <- 50
opt_chunk_buffer(RPAS_LASCatalogue) <- 0 # I don't think you need a buffer here. Or a very small one.
opt_output_files(RPAS_LASCatalogue) <- "D:/Projects2020/p2020_1021_RPAS_EFI_test/RGB/RPAS_LAS/RPAS_LASFile_Normalized_{ID}"

catalog_apply(RPAS_LASCatalogue, RPAS_LASFilterFunction, LIDAR_DTMGrid = LIDAR_DTMGrid)
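Once catalog_apply has written the normalized chunks, the CHM the question is after could be rasterized from them. A hedged sketch; the output folder and the 1 m resolution are assumptions, not part of the answer above:

```r
library(lidR)

# Read the normalized outputs back as a catalog (placeholder path,
# ideally a folder separate from the raw RPAS tiles).
RPAS_Normalized <- readLAScatalog("D:/Projects2020/p2020_1021_RPAS_EFI_test/RGB/RPAS_LAS_normalized")

# Rasterize the canopy at 1 m using the simple point-to-raster algorithm.
CHM <- grid_canopy(RPAS_Normalized, res = 1, p2r())
plot(CHM)
```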

Answered by JRR on July 3, 2021

