Note that when running the "heavy" commands on a remote machine, prefix them with nohup, so the process continues after you disconnect from the machine, and suffix them with &, so the process does not block the terminal. When running the PDAL commands, the PDAL environment has to be activated first.
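For example, the nohup/& pattern looks like this (shown with a harmless placeholder command rather than a real conversion):

```shell
# nohup detaches the job from the terminal's hangup signal, & puts it in the
# background, and redirecting stdout/stderr keeps nohup from writing nohup.out.
# 'sleep 1' stands in for a heavy command such as ./ahn_to_copc.sh.
nohup sleep 1 > heavy.log 2>&1 &
pid=$!
echo "started as PID $pid"
wait "$pid"   # only for this demo; normally you would simply log out
```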
Note that we have machine-specific convenience scripts for running these processes in the machine_scripts/ directory.
Copy the contents of download_ahn/. into [laz/dir]
Edit get_ahn_via_md5.py to refer to the correct AHN dataset URLs.
$ cd [laz/dir]
$ make all
Note that PDAL must be installed first:
$ conda create -n pdal; conda activate pdal; conda install python-pdal
$ conda activate pdal
$ ./ahn_to_copc.sh -p2 -m3 [laz/dir] [copc/dir]
The -p argument determines the number of concurrent processes. The -m argument determines the hierarchy level of AHN LAZ files to merge into one COPC file. For example, for -m3, all LAZ files whose names start with the same C_xxx prefix will be merged into one COPC file.
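The grouping can be illustrated as follows (the tile names below are made up; C_xxx means C_ followed by the first three characters of the tile code):

```shell
# Hypothetical illustration of -m3 grouping: tiles sharing the same five-
# character prefix (C_ plus three characters) would be merged into one COPC.
files="C_25DN1.LAZ C_25DN2.LAZ C_25EZ1.LAZ"
prefixes=$(for f in $files; do echo "$f" | cut -c1-5; done | sort -u)
echo "$prefixes"
```

Here the first two tiles share the prefix C_25D and would end up in one COPC file, while the third (C_25E) goes into another.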
$ ./count_copc.sh [copc/dir]
$ conda activate pdal
$ ./test_copc.sh [copc/dir]
This process can optionally be instructed to only check files created recently, within a fixed number of days:
$ conda activate pdal
$ ./test_copc.sh -w[days] [copc/dir]
Note that any invalid COPC file can be deleted, after which the conversion process can be repeated. The conversion process will not process any data for which the output COPC files already exist.
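The skip-if-exists behaviour can be sketched as follows (file names are hypothetical, and the actual script's logic may differ):

```shell
# Convert only when the output COPC file is missing; anything that already
# exists is left alone, so deleting an invalid output forces a re-conversion.
in=C_25DN1.LAZ
out=C_25DN1.copc.laz
if [ -e "$out" ]; then
  msg="skipping $in: $out already exists"
else
  msg="converting $in -> $out"
fi
echo "$msg"
```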
$ ./list_copc_failures.sh [test_file] [copc/dir]
Note that the main purpose of creating a checksum of the COPC directory is to facilitate safe transfer of the data.
$ ./checksum_copc.sh [copc/dir]
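A common way to checksum a whole directory for transfer verification is to hash every file and then hash the sorted list of per-file hashes. The sketch below uses dummy data and is not necessarily what checksum_copc.sh does:

```shell
# Build a tiny dummy "COPC" directory, then derive one directory-level
# checksum from the sorted per-file sha256 sums.
mkdir -p copc_demo
printf 'dummy point data' > copc_demo/C_25DN1.copc.laz
checksum=$(find copc_demo -type f -print0 | sort -z | xargs -0 sha256sum | sha256sum | cut -d' ' -f1)
echo "$checksum"
```

After transfer, recomputing the same pipeline on the destination and comparing the two checksums confirms the data arrived intact.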