Commit
Merge pull request #18 from quantifyearth/mwd-adjust-combined-result-tiffs

Adjust combined result tiffs
mdales authored Dec 18, 2024
2 parents 94c1db3 + 3c408b9 commit 32db0c7
Showing 2 changed files with 23 additions and 19 deletions.
27 changes: 14 additions & 13 deletions deltap/delta_p_scaled_area.py
@@ -33,32 +33,33 @@ def delta_p_scaled_area(

    area_restore_filter = area_restore.numpy_apply(lambda c: np.where(c < SCALE, float('nan'), c)) / SCALE

    per_taxa_path = os.path.join(dirname, f"{basename}")
    with RasterLayer.empty_raster_layer_like(
        area_restore,
        filename=per_taxa_path,
        nodata=float('nan'),
        bands=len(per_taxa) + 1
    ) as result:
        result._dataset.GetRasterBand(1).SetDescription("all")  # pylint: disable=W0212
        summed_layer = per_taxa[0]
        for layer in per_taxa[1:]:
            summed_layer = summed_layer + layer
        scaled_filtered_layer = summed_layer.numpy_apply(
            lambda il, af: np.where(af != 0, (il / af) * -1.0, float('nan')),
            area_restore_filter
        )
        scaled_filtered_layer.parallel_save(result, band=1)

        for idx, inlayer in enumerate(per_taxa):
            _, name = os.path.split(inlayer.name)
            result._dataset.GetRasterBand(idx + 2).SetDescription(name[:-4])  # pylint: disable=W0212
            scaled_filtered_layer = inlayer.numpy_apply(
                lambda il, af: np.where(af != 0, (il / af) * -1.0, float('nan')),
                area_restore_filter
            )
            scaled_filtered_layer.parallel_save(result, band=idx + 2)

def main() -> None:
parser = argparse.ArgumentParser(description="Scale final .")
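The scaling applied above can be sketched with plain NumPy (the values, shapes, and layer names below are illustrative only, not the project's data):

```python
import numpy as np

SCALE = 1000.0

# Mirrors area_restore_filter: pixels with less restorable area than
# SCALE become NaN, the rest are normalised by SCALE.
area_restore = np.array([[500.0, 2000.0], [4000.0, 0.0]])
af = np.where(area_restore < SCALE, float("nan"), area_restore) / SCALE

# Two hypothetical per-taxa layers; their sum feeds the "all" band.
per_taxa = [
    np.array([[1.0, 2.0], [4.0, 8.0]]),
    np.array([[1.0, 2.0], [2.0, 2.0]]),
]
summed = per_taxa[0] + per_taxa[1]

# Divide by the filtered area and flip sign, as in the lambdas above.
scaled = np.where(af != 0, (summed / af) * -1.0, float("nan"))
# scaled is [[nan, -2.0], [-1.5, nan]]
```

Pixels whose restorable area fell below `SCALE` propagate NaN through the division, so they end up as nodata in the output, matching the layer's `nodata=float('nan')` setting.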
15 changes: 9 additions & 6 deletions method.md
@@ -59,7 +59,7 @@ pip install -r requirements.txt

### The PostGIS container

For querying the IUCN data held in the PostGIS database we use a separate container, based on the standard PostGIS image. This container does not run the database itself; rather, it is a place to run Python scripts that talk to the database.

```shark-build:postgis
((from python:3.12-slim)
...
```

@@ -93,7 +93,7 @@

To calculate the AoH we need various basemaps:
- Habitat maps for four scenarios:
  - Current day, in both L1 and L2 IUCN habitat classification
  - Potential Natural Vegetation (PNV) showing the habitats predicted without human intervention
  - Restore scenario - a map derived from the PNV and current maps showing certain lands restored to their pre-human type
  - Conserve scenario - a map derived from current indicating the impact of placement of arable lands
- The Digital Elevation Map (DEM) which has the height per pixel in meters

@@ -135,7 +135,7 @@ python3 ./prepare-layers/make_current_map.py --jung /data/habitat/jung_l2_raw.ti
-j 16
```

The habitat maps by Jung et al are at 100m per pixel at the equator in WGS84 projection, which we convert to a series of downsampled proportional-coverage layers, one layer per habitat class, at our target result scale. This is done with `habitat_process.py`:

```shark-run:layer-prep
python3 ./aoh-calculator/habitat_process.py --habitat /data/habitat/pnv_raw.tif \
    ...

python3 ./aoh-calculator/habitat_process.py --habitat /data/habitat/current_raw. \
    --output /data/habitat_maps/current/
```

This process of generating a proportional layer per habitat class will be repeated for the additional habitat maps generated below.
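As a rough sketch of what a proportional-coverage downsample computes (toy data and a hypothetical helper, not `habitat_process.py` itself):

```python
import numpy as np

def class_coverage(raster, habitat_class, factor):
    """Fraction of each factor x factor block equal to habitat_class."""
    mask = (raster == habitat_class).astype(float)
    height, width = mask.shape
    blocks = mask.reshape(height // factor, factor, width // factor, factor)
    return blocks.mean(axis=(1, 3))

# A 4x4 categorical habitat raster with classes 1 and 2, downsampled 2x
habitat = np.array([
    [1, 1, 2, 2],
    [1, 2, 2, 2],
    [1, 1, 1, 2],
    [1, 1, 1, 1],
])
coverage_class_1 = class_coverage(habitat, 1, 2)
# coverage_class_1 is [[0.75, 0.0], [1.0, 0.75]]
```

Each habitat class yields one such layer, and for any pixel the per-class coverage fractions sum to 1, which is what lets the AoH calculation weight a coarse pixel by how much suitable habitat it actually contains.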

### Generating additional habitat maps

LIFE calculates the impact on extinction rates under two future scenarios: restoration of habitats to their pre-human state, and the conversion of non-urban terrestrial habitat to arable land.
@@ -184,13 +186,14 @@ python3 ./aoh-calculator/habitat_process.py --habitat /data/habitat/arable.tif \

### Generate area map

For LIFE we need to know the actual area of each pixel, not just the pixel count. For this we generate a raster map that contains the area per pixel in square meters for a given latitude. As a performance optimisation, this map is generated as a raster one pixel wide, since the per-pixel area is the same at all longitudes for a given latitude.

```shark-run:layer-prep
python3 ./prepare-layers/make_area_map.py --scale 0.016666666666667 --output /data/area-per-pixel.tif
```
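A simplified spherical-Earth version of the idea (the real script may use a different geodesic formula; `pixel_area_m2` and the constants here are assumptions for illustration):

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, spherical approximation

def pixel_area_m2(lat_degrees, scale_degrees):
    """Area of a scale x scale degree pixel whose lower edge is at lat_degrees."""
    lat1 = math.radians(lat_degrees)
    lat2 = math.radians(lat_degrees + scale_degrees)
    width = math.radians(scale_degrees)
    # Area of the slice of the spherical band between the two latitudes
    return (EARTH_RADIUS_M ** 2) * width * abs(math.sin(lat2) - math.sin(lat1))

# Area depends only on latitude, which is why a one-pixel-wide raster is
# sufficient: the same column of values applies at every longitude.
equator_px = pixel_area_m2(0.0, 1 / 60)
high_lat_px = pixel_area_m2(60.0, 1 / 60)
# high_lat_px is roughly half of equator_px, as cos(60 degrees) = 0.5
```

Note that `0.016666666666667` in the command above is 1/60 of a degree, i.e. the roughly 1.85km-at-the-equator pixel size the pipeline targets.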

### Difference maps

The algorithm we use needs to account for map projection distortions, so all values in the AoHs are based on the area per pixel. To get the final extinction-risk values we must remove that scaling. To do that, we generate a map of the area difference from current for the given scenario.

```shark-run:layer-prep
python3 ./prepare-layers/make_diff_map.py --current /data/habitat/current_raw.ti \
    ...
```
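As a toy illustration of removing an area-per-pixel scaling (assumed numbers, not the pipeline's data):

```python
import numpy as np

# AoH-style values carry an area-per-pixel factor; dividing by the
# corresponding area layer removes that scaling again.
area_per_pixel = np.array([[3.4e6, 3.4e6], [1.7e6, 1.7e6]])  # m^2
aoh_scaled = np.array([[1.7e6, 0.0], [1.7e6, 8.5e5]])

unscaled = np.where(area_per_pixel != 0, aoh_scaled / area_per_pixel, np.nan)
# unscaled is [[0.5, 0.0], [1.0, 0.5]]
```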

### Fetching the elevation map

To calculate the AoH we need a digital elevation map. We use [this map](https://zenodo.org/records/5719984) compiled by Jeffrey Hanson.

```shark-run:reclaimer
reclaimer zenodo --zenodo_id 5719984 --filename dem-100m-esri54017.tif --output /data/elevation.tif
```

Similarly to the habitat maps, we need to downsample to the target projection; however, rather than taking the mean elevation, we select both the min and max elevation for each pixel, and then check whether the species' elevation range overlaps that range when we calculate the AoH.

```shark-run:gdalonly
gdalwarp -t_srs EPSG:4326 -tr 0.016666666666667 -0.016666666666667 -r min -co COMPRESS=LZW -wo NUM_THREADS=40 /data/elevation.tif /data/elevation-min-1k.tif
...
```
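The min/max idea can be sketched with NumPy (toy elevations and a hypothetical species range; not the project's code):

```python
import numpy as np

# Toy 4x4 elevation raster, downsampled 2x by per-block min and max.
elevation = np.array([
    [100, 120, 900, 950],
    [110, 130, 910, 980],
    [200, 210, 300, 310],
    [205, 215, 305, 315],
])
factor = 2
blocks = elevation.reshape(2, factor, 2, factor)
elev_min = blocks.min(axis=(1, 3))
elev_max = blocks.max(axis=(1, 3))

# A hypothetical species living between 150m and 320m: keep a pixel if
# its [min, max] elevation interval overlaps the species' range.
low, high = 150, 320
suitable = (elev_min <= high) & (elev_max >= low)
# suitable is [[False, False], [True, True]]
```

Keeping both extremes avoids the false negatives a mean-elevation downsample would produce in steep terrain, where a coarse pixel can span far more vertical range than its average suggests.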
