Forest

Forests play a crucial role in our ecosystem and are, therefore, being researched on a large scale. Modern approaches are based on big data (e.g., analyzing satellite imagery or single-tree-based simulations). These approaches require significant computing resources, which we can provide with our HPC systems.

Our general motivation and topics of interest can be found on our community page. This page describes tools, data, and computing resources that we believe can be useful based on our experience. If you work in forest science and have ideas or requests, please do not hesitate to contact us via our ticket system or the responsible person listed on the community page.

Forest workloads often fit the pattern of a typical HPC job. We therefore recommend starting with our general documentation on how to use the cluster.

Below we highlight some of our services and tools that we consider useful for forest research on our HPC systems.

JupyterHub

JupyterHub offers an interactive way of using our HPC resources and is what most researchers use. Besides running Jupyter notebooks, you can start an RStudio server or a virtual desktop. You can also set up your own container with the specific software that fulfills your requirements, which makes it easy to share your setup with project partners. Workloads can run with both CPU and GPU support.

Metashape

Orthomosaic Generation with Metashape is a workflow we developed for the project ForestCare.

SynForest

SynForest is a tool that generates realistic large-scale point clouds of forests. It simulates the lidar scanning process from a stationary or moving platform and captures the shape and structure of realistic tree models.

Data Pools

Data pools are a simple way to share your data on our HPC system with others. Further, data pools offer access to large data sets that are useful for different user groups over a longer period of time.

Workshops

The GWDG Academy offers workshops on various topics relevant to forest science on HPC. Most of them cover technical topics without examples from forest science; the following workshops include forest science workflows.

Deep learning with GPU cores

As various AI methods are applied in forest science, the workshop Deep learning with GPU cores provides a starting point for your AI workloads on our HPC system. The workshop covers tree species classification of segmented point clouds with PointNet. In the section Deep Learning Containers in HPC, a container is set up that provides all the software required for the workflow; it can be used interactively on JupyterHub or via SLURM on our HPC cluster.
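The key idea behind PointNet, which makes it suitable for unordered point clouds such as segmented trees, is a shared per-point network followed by a symmetric pooling function. The following minimal NumPy sketch illustrates that idea only; all layer sizes, weights, and function names are illustrative and do not reflect the trained model used in the workshop.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def pointnet_logits(points, w1, w2, w_cls):
    """Shared per-point MLP, order-invariant max pooling, linear classifier."""
    h = relu(points @ w1)        # (n_points, hidden): same MLP applied to every point
    h = relu(h @ w2)             # (n_points, features)
    global_feat = h.max(axis=0)  # (features,): symmetric function -> permutation invariant
    return global_feat @ w_cls   # (n_species,) class scores

# Illustrative sizes: 1024 points per tree, 5 candidate species
n_points, hidden, features, n_species = 1024, 64, 128, 5
w1 = rng.normal(size=(3, hidden)) * 0.1
w2 = rng.normal(size=(hidden, features)) * 0.1
w_cls = rng.normal(size=(features, n_species)) * 0.1

cloud = rng.normal(size=(n_points, 3))  # one segmented tree as x, y, z points
logits = pointnet_logits(cloud, w1, w2, w_cls)

# Permutation invariance: shuffling the points leaves the prediction unchanged.
shuffled = rng.permutation(cloud, axis=0)
assert np.allclose(logits, pointnet_logits(shuffled, w1, w2, w_cls))
```

Because the pooling step is symmetric, the prediction does not depend on the order in which the lidar returns are stored, which is essential for point cloud data.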

Performance of AI Workloads

Based on the workshop “Deep learning with GPU cores,” we introduce different tools and approaches to optimizing job performance. This can help reduce your waiting time and the energy consumption of your analysis.

Forests in HPC

Forests in HPC introduces different options for using the HPC system for forest applications. Here, we focus on lidar data analysis with R (lidR) and the same deep learning workflow that is used in the workshop Deep learning with GPU cores.
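The workshop itself uses R and the lidR package for the lidar analysis. As a language-neutral sketch of one typical step covered there, rasterizing a canopy height model by keeping the highest return per grid cell (similar in spirit to lidR's canopy rasterization), consider this minimal Python example; the function name and toy data are ours, not part of the workshop material.

```python
import numpy as np

def canopy_height_model(points, cell_size=1.0):
    """Rasterize a point cloud into a grid holding the maximum height per cell."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    ix = ((x - x.min()) / cell_size).astype(int)
    iy = ((y - y.min()) / cell_size).astype(int)
    chm = np.full((ix.max() + 1, iy.max() + 1), np.nan)
    for i, j, h in zip(ix, iy, z):
        # Keep only the highest return that falls into each cell
        if np.isnan(chm[i, j]) or h > chm[i, j]:
            chm[i, j] = h
    return chm

# Toy cloud: two returns land in the same 1 m cell, the higher one wins.
pts = np.array([[0.2, 0.3, 2.0],
                [0.4, 0.1, 7.5],   # same cell as the first point, higher return
                [1.6, 0.2, 4.0]])
chm = canopy_height_model(pts)
print(chm[0, 0])   # 7.5
```

Real workflows operate on millions of returns per tile, which is where running the analysis on the HPC cluster pays off.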