Clip/Mask data below threshold value #638
Comments
So, this is a simple threshold operation: all values below the threshold are masked. As we discussed, there is still no dedicated API functionality for conditional masking like this, but I expect it can nonetheless be done with the current API, without modifying the data. We should explore the use of an `expression`. Alternatively, we could register a custom algorithm.
@siddharth0248, can you please confirm

Ok,
We should make sure we instruct users who load the data from the same source into e.g. Python or QGIS to also apply this masking. Otherwise they will get different values, e.g. when calculating zonal statistics. That is the disadvantage of applying the masking on the fly instead of fixing the files: slightly worse interoperability.
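To illustrate the interoperability concern, here is a minimal sketch (the sample pixel values are hypothetical) of how a zonal statistic diverges when one consumer applies the mask and another does not:

```python
import numpy as np

# Hypothetical pixel values for one zone, with the threshold from this thread.
pixels = np.array([0.1, 0.3, 2.0, 6.0])
threshold = 0.48

# A consumer reading the raw file computes the mean over all pixels...
unmasked_mean = pixels.mean()

# ...while the tiled/masked view effectively excludes sub-threshold pixels.
masked_mean = pixels[pixels >= threshold].mean()

print(unmasked_mean, masked_mean)  # the two means differ
```

Any tool that reads the unmodified files must replicate the same filter to reproduce the masked statistics.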
To configure this, put the expression into veda-config-ghg/datasets/tm54dvar-ch4flux-monthgrid-v1.data.mdx (lines 62 to 67 in 56574cc). The block would become:

```yaml
sourceParams:
  assets: total
  colormap_name: purd
  rescale:
    - 0.48
    - 24
  expression: "where(b1<10, -9999, b1)"
```
Some data providers request clipping data below a specific threshold value so that the colormap starts at this threshold and extends to an upper limit. For example, for TM5 Var we clipped the values below 0.48 and then set the min-max range in the metadata to 0.48 and 12. If we hadn't clipped the data, we would have rendered many more pixels, and we also want to avoid designating them as "no-data."
Currently, my approach is to mask out these values during the data transformation using the provided script. Could this functionality be incorporated into rio-tiler to make this more sustainable?
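The transformation-time approach can be sketched roughly as follows; the rasterio usage, paths, and nodata value are illustrative, not the actual script from the pipeline:

```python
import numpy as np

def clip_below(data, threshold, nodata=-9999.0):
    """Set values below `threshold` to `nodata` (the masking rule
    discussed in this issue; the sentinel value is illustrative)."""
    return np.where(data < threshold, nodata, data)

def rewrite_raster(src_path, dst_path, threshold=0.48, nodata=-9999.0):
    """Sketch: bake the mask into the file at transformation time.
    Requires rasterio; src_path/dst_path are hypothetical."""
    import rasterio  # imported here so clip_below stays usable without it
    with rasterio.open(src_path) as src:
        profile = src.profile.copy()
        profile.update(nodata=nodata)  # record the sentinel as nodata
        data = clip_below(src.read(1), threshold, nodata)
    with rasterio.open(dst_path, "w", **profile) as dst:
        dst.write(data, 1)
```

The trade-off discussed above: baking the mask in keeps all consumers consistent, while the `expression` route avoids touching the files but must be replicated by every client.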