diff --git a/book/chapters/datasource.ipynb b/book/chapters/datasource.ipynb index 6746598..f06d30c 100644 --- a/book/chapters/datasource.ipynb +++ b/book/chapters/datasource.ipynb @@ -467,39 +467,38 @@ "id": "36f1f6ac", "metadata": {}, "source": [ - "# MODIS Dataset\n", + "## MODIS Dataset\n", "\n", - "## Introduction\n", "### What is MODIS?\n", "MODIS, which stands for MODerate Resolution Imaging Spectroradiometer, is an advanced instrument operating aboard both the Terra and Aqua spacecraft. It captures a comprehensive view of Earth's surface, oceans, and atmosphere. The MODIS dataset is a comprehensive collection of Earth observation data captured by the MODIS instruments. Scientists use MODIS data to track changes in things like land cover, weather patterns, ice and snow, and the color of the oceans. MODIS boasts a remarkable viewing swath width of 2,330 km and covers the entire Earth surface every one to two days. With 36 spectral bands ranging from 0.405 to 14.385 µm, it provides detailed data at three spatial resolutions: 250m, 500m, and 1,000m.\n", "\n", - "## Characteristics\n", - "- **Spatial Resolution**: MODIS provides data at moderate spatial resolutions ranging from 250 meters to 1 kilometer, depending on the specific product and spectral band.\n", + "### Characteristics\n", "\n", - "- **Temporal Resolution**: MODIS offers daily global coverage, providing data at a high temporal resolution suitable for monitoring dynamic environmental processes.\n", + "
\n", "\n", - "- **Variables**: MODIS measures various Earth surface parameters including land cover, land surface temperature, vegetation indices, fire occurrence, ocean color, and atmospheric properties.\n", + "**Earth Science Data Type (ESDT)**: MOD10A1\n", "\n", - "- **Coverage**: MODIS provides global coverage, capturing data over land, ocean, and atmosphere, facilitating multi-disciplinary Earth observation studies.\n", + "**Product Level**: L3\n", "\n", - "- **Quality**: MODIS data undergoes extensive calibration and validation processes to ensure accuracy and reliability, with quality flags provided to identify potential data anomalies.\n", + "**Nominal Data Array Dimensions**: 1200km by 1200km\n", "\n", - "

\n", - " \"Sample\n", - "

\n", - "

MODIS from YALE YCEO UHI dataset

\n", + "**Spatial Resolution**: 500m\n", "\n", + "**Temporal Resolution**: day\n", "\n", - "## Data Format\n", + "**Map Projection**: Sinusoidal \n", + "\n", + "
\n", + "\n", + "### Data Format\n", "MODIS data are typically available in Hierarchical Data Format (HDF) or NetCDF formats, which are widely used for storing and distributing Earth observation data. These formats facilitate efficient data access, manipulation, and analysis using various software tools and programming languages commonly employed in the Earth sciences community.\n", "\n", - "## MODIS Direct Broadcast\n", + "### MODIS Direct Broadcast\n", "Users with x-band receiving systems can capture regional data directly from the spacecraft using the MODIS Direct Broadcast signal, enhancing real-time monitoring capabilities.\n", "\n", - "# fSCA\n", + "### fSCA\n", "\n", - "Fractional Snow Covered Area (fSCA) is a metric used in the field of snow science and environmental studies to quantify the proportion of a given area that is covered by snow. It is derived from remote sensing data, particularly from sensors like Landsat, which capture images of the Earth's surface in various spectral bands. By analyzing these images, researchers can differentiate between snow-covered and snow-free areas, allowing them to calculate the percentage of the landscape covered by snow at a particular point in time. fSCA is valuable for understanding snow distribution patterns, monitoring changes in snow cover over time, and aiding in snowmelt and water resource management. It plays a crucial role in snowpack modeling, avalanche forecasting, and climate change research, providing essential data for informing decision-making processes related to snow-dependent ecosystems and human activities.\n", - "\n" + "Fractional Snow Covered Area (fSCA) is a metric used in the field of snow science and environmental studies to quantify the proportion of a given area that is covered by snow. It is derived from remote sensing data, particularly from sensors like Landsat, which capture images of the Earth's surface in various spectral bands. 
By analyzing these images, researchers can differentiate between snow-covered and snow-free areas, allowing them to calculate the percentage of the landscape covered by snow at a particular point in time. fSCA is valuable for understanding snow distribution patterns, monitoring changes in snow cover over time, and aiding in snowmelt and water resource management. It plays a crucial role in snowpack modeling, avalanche forecasting, and climate change research, providing essential data for informing decision-making processes related to snow-dependent ecosystems and human activities." ] }, { @@ -507,23 +506,22 @@ "id": "a0ddb8f3", "metadata": {}, "source": [ - "# SNOTEL dataset\n", + "## SNOTEL dataset\n", "\n", - "## Introduction\n", "### What is SNOTEL?\n", "The SNOwpack TELemetry Network (SNOTEL) is an automated system of snowpack and climate sensors managed by the Natural Resources Conservation Service (NRCS) in the Western United States, offering critical data for water supply forecasting, flood prediction, and climate research. SNOTEL provides real-time data on snow water equivalent, snow depth, precipitation, and temperature from remote mountainous regions, aiding in understanding hydroclimatic conditions. 
SNOTEL offers comprehensive snowpack and climate data from over 900 sites, helping monitor snowpack, precipitation, temperature, and other climatic conditions in the western U.S. The SNOTEL dataset serves as a valuable resource for a wide range of stakeholders, contributing to informed decision-making in various sectors impacted by snowpack and climate conditions.\n", "\n", - "## SNOTEL Network Overview\n", - "### Composition of SNOTEL\n", + "### SNOTEL Network Overview\n", + "#### Composition of SNOTEL\n", "- Comprising over 900 automated sites in remote, high-elevation mountain watersheds.\n", "- Monitors snowpack, precipitation, temperature, and other climatic parameters.\n", "\n", - "### Operations and Data Collection\n", + "#### Operations and Data Collection\n", "- Sites operate unattended and without maintenance for extended periods.\n", "- Standard sensor configuration includes snow pillow, precipitation gauge, and temperature sensors.\n", "\n", - "## Telemetry and Data Transmission\n", - "### Data Collection and Storage\n", + "### Telemetry and Data Transmission\n", + "#### Data Collection and Storage\n", "- Dataloggers installed in equipment shelters collect and store data.\n", "- Various telemetry systems transmit data back to the Water and Climate Information System.\n", "\n", @@ -531,18 +529,25 @@ "- Enhanced sites equipped with soil moisture, soil temperature, solar radiation, wind speed, and relative humidity sensors.\n", "- Tailored configurations based on physical conditions and climate requirements.\n", "\n", - "## Characteristics\n", - "**Spatial Resolution**: SNOTEL provides data at a network of monitoring sites distributed across mountainous regions, typically covering areas with varying spatial resolutions depending on the density of monitoring stations.\n", + "### Characteristics\n", + "<br>
\n", "\n", - "**Temporal Resolution**: SNOTEL data is typically collected at hourly intervals, providing high temporal resolution data for monitoring snowpack conditions and related hydrological variables.\n", + "**Product/Data Type**: SNOTEL Station Daily Data\n", "\n", - "**Variables**: SNOTEL measures snow water equivalent, snow depth, temperature, precipitation, and soil moisture at monitoring sites in mountainous regions.\n", + "**Spatial Resolution**: Point data specific to each SNOTEL station location present in the western USA within bounding box of\n", + "southwest_lon = -125.0\n", + "southwest_lat = 25.0\n", + "northeast_lon = -100.0\n", + "northeast_lat = 49.0\n", "\n", - "**Coverage**: SNOTEL stations are primarily located in the western United States, covering areas with significant snowpack and water resource management importance.\n", + "**Temporal Resolution**: Daily\n", "\n", "**Quality**: SNOTEL data undergoes quality control procedures to ensure accuracy and reliability, including calibration checks and validation against manual measurements.\n", "\n", - "## Data Format\n", + "
\n", + "\n", + "\n", + "### Data Format\n", "The Snow Telemetry (SNOTEL) data format encompasses structured datasets collected from remote automated stations situated in mountainous regions, monitoring snowpack, weather, and hydrological parameters. Key aspects include recorded parameters such as snow water equivalent (SWE), snow depth, air temperature, and precipitation, timestamped to denote observation times and often stored at varying resolutions like hourly or daily intervals. Quality control flags accompany data points to denote reliability, while metadata provides station details and sensor calibration information. SNOTEL data is commonly stored in formats like CSV, TSV, HDF5, or netCDF, accessible through agency websites, data portals, or APIs. This format facilitates applications spanning water resource management, climate research, agriculture, recreation, hydrological modeling, and ecological studies.\n", "\n", "

\n", @@ -556,489 +561,41 @@ "

Snow Water Equivalent Percent NRCS 1991-2020 Median April 6 2024

\n", "\n", "\n", - "## Applications\n", - "\n", - "**Water Resource Management**:\n", - " - *Snowpack Monitoring*: Assessing snowpack depth and SWE helps in forecasting water availability for irrigation, hydropower generation, and municipal water supply.\n", - " - *Runoff Forecasting*: Data from SNOTEL stations aids in predicting spring runoff, facilitating reservoir management and flood control.\n", - "\n", - "**Climate Research**:\n", - " - *Long-term Climate Trends*: Historical data enables researchers to study long-term climate patterns, including changes in snowfall, temperature, and precipitation.\n", - " - *Climate Change Studies*: SNOTEL data is utilized to understand the impacts of climate change on snowpack dynamics, water resources, and ecosystems.\n", - "\n", - "**Agriculture and Forestry**:\n", - " - *Crop Planning*: Farmers use snowpack data to anticipate water availability during the growing season, aiding in crop planning and irrigation scheduling.\n", - " - *Forest Management*: Forestry agencies utilize SNOTEL data for assessing wildfire risk, planning timber harvests, and monitoring forest health.\n", - "\n", - "**Recreation and Tourism**:\n", - " - *Winter Sports Planning*: Ski resorts and recreational outfitters rely on snowpack data for planning activities such as skiing, snowboarding, and snowmobiling.\n", - " - *Summer Recreation*: Understanding snowmelt timing and water availability helps in planning summer recreational activities like hiking, fishing, and camping.\n", - "\n", - "# SNOTEL Dataset Download Instructions\n", - "\n", - "Because snow has a higher albedo than most other land cover types, it can cause the seasonal changes in the albedo of a landscape to be quite dramatic. 
The Soil Climate Analysis Network (SCAN) and the SNOwpack TELemetry (SNOWTEL) network provide snow depth and snow water equivalent (the amount of water contained in a snowpack) data for many sites across the United States.\n", - "\n", - "### Step 1: Navigate to the SCAN/SNOWTEL website\n", - "- Visit the [SCAN/SNOWTEL Website](http://www.wcc.nrcs.usda.gov/nwcc/inventory).\n", - "\n", - "### Step 2: Choose Data Product and Location\n", - "- Select the data product you are interested in (e.g., Snow Depth or Snow Water Equivalent) from the drop-down menu under *Element*.\n", - "- Choose a State/County or Basin using the drop-down menus provided.\n", - "\n", - "### Step 3: View Inventory\n", - "- Click on 'View Inventory' to see available stations in your selected area.\n", - " - If no results are returned, consider widening your search.\n", - "\n", - "### Step 4: Select Station and Data\n", - "- Click 'View' next to the station of interest to access its page.\n", - "- Use the table to select the data you need:\n", - " - Choose the data product (Snow Depth or Snow Water Equivalent).\n", - " - Select ‘Daily’ in the Time Series column.\n", - " - Choose the format ('chart' for visualization or 'csv' for download).\n", - " - View current data by selecting the desired time frame in the yellow column and clicking 'View Current', or view historic data by selecting the year and time in the green column and clicking 'View Historic'.\n", - "\n", - "### Step 5: Download Data\n", - "- Save the downloaded CSV file to your computer for further analysis.\n", - "\n", "**For map visualization of SNOTEL stations, click on ‘SNOTEL data’ under ‘Climate Monitoring’ in the right panel. 
The maps are clickable for station selection.**\n", "\n", "For more information, visit the [NRCS SNOTEL page](https://www.nrcs.usda.gov/wps/portal/wcc/home/aboutUs/monitoringPrograms/automatedSnowMonitoring).\n" ] }, { - "cell_type": "code", - "execution_count": 9, - "id": "e005299a", + "cell_type": "markdown", + "id": "2afc5f94", "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "['/Users/meghana/Documents/swe-workflow-book/book/chapters', '/Users/meghana', '/opt/anaconda3/lib/python311.zip', '/opt/anaconda3/lib/python3.11', '/opt/anaconda3/lib/python3.11/lib-dynload', '', '/Users/meghana/.local/lib/python3.11/site-packages', '/opt/anaconda3/lib/python3.11/site-packages', '/opt/anaconda3/lib/python3.11/site-packages/aeosa']\n", - "https://www.nohrsc.noaa.gov/nearest/index.html?city=40.05352381745094%2C-106.04027196859343&county=&l=5&u=e&y=2022&m=5&d=4\n", - "\n", - "\n", - "\n", - "\t\n", - "\t\n", - "\t\n", - "\t\n", - "\t\n", - "\t\n", - "\t\n", - "\t\n", - "Nearest Observations - NOHRSC - The ultimate source for snow information\n", - "\n", - "\n", - "\n", - "\n", - "\t\n", - "\t\n", - "\t\n", - "\t\n", - "\n", - "\n", - "\t\n", - "\t\t\n", - "\t\t\t\n", - "\t\t\n", - "\t
\n", - "\t\t\t\t\"(content\n", - "\t\t\t\tweather.gov\n", - "\t\t\t\t   \n", - "\t\t\t
\n", - "\t\n", - "\t\t\n", - "\t\t\t\n", - "\t\t\t\n", - "\t\t\t\n", - "\t\t\t\n", - "\t\t\n", - "\t\t\n", - "\t\t\t\n", - "\t\t\n", - "\t
\"NOAA
National Weather Service
 \"NWS
National Operational Hydrologic
Remote Sensing Center
\n", - "\t\n", - "\t\t\n", - "\t\t\t\n", - "\t\t\t\n", - "\t\t\t\n", - "\t\t\t\n", - "\t\t\t\n", - "\t\t\t\n", - "\t\t\t\n", - "\t\t\n", - "\t
\"\"HomeNewsOrganization\n", - "\t\t\t\t
\n", - "\t\t\t\t  \n", - "\t\t\t\t\n", - "\t\t\t\t \n", - "\t\t\t\t\n", - "\t\t\t\t
\n", - "\t\t\t
 \"\"
\n", - "\t\n", - "\t\t\n", - "\t\t\t\n", - "\t\t\t\n", - "\t\t\n", - "\t
\n", - "
\n", - "
Home
\n", - "
\n", - "
\n", - "
Snow Information
\n", - "
National Analyses
\n", - "
Interactive Maps
\n", - "
3D Visualization
\n", - "
Airborne Surveys
\n", - "
Snowfall Analysis
\n", - "
Satellite Products
\n", - "
Forecasts
\n", - "
Data Archive
\n", - "
SHEF Products
\n", - "
\n", - "
\n", - "
\n", - "
Observations near
\n", - "
\n", - "
\n", - "
\n", - "
\n", - "
\n", - "
Science/Technology
\n", - "
NOHRSC
\n", - "
GIS Data Sets
\n", - "
Special Purpose Imagery
\n", - "
\n", - "
\n", - "
About The NOHRSC
\n", - "
Staff
\n", - "
\n", - "
\n", - "
NOAA Links
\n", - "
Snow Climatology
\n", - "
Related Links
\n", - "
\n", - "
\n", - "
Help
\n", - "
Help and FAQ
\n", - "
Site Map
\n", - "
\n", - "
\n", - "
Contact Us
\n", - "
Please Send Us Comments!
\n", - "
\n", - "
\n", - "\"USA.gov\n", - "
\n", - "
\n", - "\t\t\t
\n", - "\t\t\t\t\n", - "
Nearest observations to
\n", - "

40.05°N, -106.04°W

\n", - "Note: these data are unofficial and provisional.
\n", - "
\n", - "
\n", - "Location and Date\n", - "\n", - "\n", - "

\n", - "\n", - "\n", - "  \n", - "\n", - "  \n", - "\n", - "  \n", - "\n", - "  \n", - "  \n", - "  \n", - "
\n", - "
\n", - "\n", - "
Closest 5 observations near 40.05°N, -106.04°W
40.05°N, -106.04°W (Elevation: N/A)
Latest between 2022-05-03 06:00 UTC
and 2022-05-04 06:00 UTC

\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "\n", - " \n", - " \n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
Raw Snowfall Observations
Station IDNameElev.
(ft)
Raw Snowfall
(in)
Duration
(hours)
Date (UTC)Distance
CO-GR-68TABERNASH 2.7 NW, CO88060.00242022-05-03 138.7 mi ESE
WIFC2WILLIAMS FORK DAM77330.00242022-05-03 148.7 mi W
CO-GR-81GRANBY 2.9 NE, CO80410.00242022-05-04 018.8 mi ENE
CO-GR-52PARSHALL 3.0 NNW, CO79040.00242022-05-03 138.9 mi WNW
CO-GR-53TABERNASH 1.9 NW, CO85790.00242022-05-03 069.5 mi ESE
\n", - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "\n", - " \n", - " \n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
Snow Depth Observations
Station IDNameElev.
(ft)
Snow Depth
(in)
Date (UTC)Distance
CO-GR-68TABERNASH 2.7 NW, CO88060.002022-05-03 138.7 mi ESE
WIFC2WILLIAMS FORK DAM77330.002022-05-03 148.7 mi W
CO-GR-52PARSHALL 3.0 NNW, CO79040.002022-05-03 138.9 mi WNW
SCSC2STILLWATER CREEK87930.002022-05-04 0513.7 mi NE
CO-GR-78GRAND LAKE 3.7 SW, CO85370.002022-05-03 1314.4 mi NE
\n", - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "\n", - " \n", - " \n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
Snow Water Equivalent Observations
Station IDNameElev.
(ft)
Snow Water Equivalent
(in)
Date (UTC)Distance
SCSC2STILLWATER CREEK87930.002022-05-04 0513.7 mi NE
FCVC2FOOL CREEK1116820.102022-05-04 0315.5 mi SE
MFKC2MIDDLE FORK CAMP89831.302022-05-04 0417.6 mi S
WLLC2WILLOW CREEK PASS960014.202022-05-04 0520.7 mi N
JNPC2JONES PASS1048210.902022-05-04 0520.9 mi SSE
\n", - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "\n", - " \n", - " \n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
Raw Precipitation Observations
Station IDNameElev.
(ft)
Raw Precipitation
(in)
Duration
(hours)
Date (UTC)Distance
CAWC2COLORADO RVR BLW WINDY GAP78220.03242022-05-04 064.5 mi NNE
GRUC2GROUSE MOUNTAIN100130.0012022-05-04 059.3 mi WNW
CO-GR-86PARSHALL 8.8 SSE, CO84910.20242022-05-03 1510 mi SW
KSEC2KEYSER RIDGE101900.0012022-05-04 0511 mi S
KSEC2KEYSER RIDGE101900.00242022-05-04 0511 mi S
\n", - "
\n", - "Page generated in 4.04889 seconds.
\n", - "\t\t\t\t

\n", - "\t\t\t\t\n", - "\t\t\t\t\t\n", - "\t\t\t\t\t\t\n", - "\t\t\t\t\t\n", - "\t\t\t\t
\n", - "\t\t\t\t\t\t\tNOHRSC
\n", - "\t\t\t\t\t\t\tMission Statement\n", - "\t\t\t\t\t\t\t | \n", - "\t\t\t\t\t\t\tContact\n", - "\t\t\t\t\t\t
\n", - "\t\t\t\t\n", - "\t\t\t\t\t\n", - "\t\t\t\t\t\t\n", - "\t\t\t\t\t\n", - "\t\t\t\t\t \n", - "\t\t\t\t\t\t\n", - "\t\t\t\t\t\t\n", - "\t\t\t\t\t\n", - "\t\t\t\t\t\n", - "\t\t\t\t\t\t\n", - "\t\t\t\t\t\t\n", - "\t\t\t\t\t\n", - "\t\t\t\t

\n", - "\t\t\t\t\t\t\t
\n", - "\t\t\t\t\t\t\tNational Weather Service
\n", - "\t\t\t\t\t\t\tNational Operational Hydrologic Remote Sensing Center
\n", - " Office of Water Prediction
\n", - "\t\t\t\t\t\t\t1735 Lake Drive W.
\n", - "\t\t\t\t\t\t\tChanhassen, MN 55317
\n", - "\t\t\t\t\t\t\t
\n", - "\t\t\t\t\t\t
\n", - "\t\t\t\t\t\t\t\"NOHRSC\n", - "\t\t\t\t\t\t
\n", - "\t\t\t\t\t\t\tContact NOHRSC
\n", - "\t\t\t\t\t\t\tGlossary
\n", - "\t\t\t\t\t\t\tCredits
\n", - "\t\t\t\t\t\t\tInformation Quality
\n", - "Page last modified: Nov 14, 2022 - cloud
\n", - "\t\t\t\t\t\t
\n", - "\t\t\t\t\t\t\tAbout Us
\n", - "\t\t\t\t\t\t\tDisclaimer
\n", - "\t\t\t\t\t\t\tPrivacy Policy
\n", - "\t\t\t\t\t\t\tFOIA
\n", - "\t\t\t\t\t\t\tCareer Opportunities
\n", - "\t\t\t\t\t\t
\n", - "\t\t\t
\n", - "\n", - "\n", - "\n", - "Container div not found\n", - "None\n" - ] - } - ], "source": [ + "## DEM Dataset\n", "\n", - "# First Python script in Geoweaver\n", - "import os\n", - "import urllib.request, urllib.error, urllib.parse\n", - "import sys\n", + "A Digital Elevation Model (DEM) is a digital representation of the topography of a surface, such as the Earth's terrain or the surface of another celestial body. It consists of a grid of elevation values, where each cell in the grid represents the elevation at a specific location. DEMs are widely used in various fields, including geography, geology, hydrology, environmental modeling, urban planning, and 3D visualization.\n", "\n", - "print(sys.path)\n", + "### Characteristics\n", "\n", - "try:\n", - " from BeautifulSoup import BeautifulSoup\n", - "except ImportError:\n", - " from bs4 import BeautifulSoup\n", + "
\n", "\n", - "nohrsc_url_format_string = \"https://www.nohrsc.noaa.gov/nearest/index.html?city={lat}%2C{lon}&county=&l=5&u=e&y={year}&m={month}&d={day}\"\n", + "**Product/Data Type**: SRTM 90m Digital Elevation Model (DEM)\n", "\n", - "test_noaa_query_url = nohrsc_url_format_string.format(lat=40.05352381745094, lon=-106.04027196859343, year=2022, month=5, day=4)\n", + "**Nominal Data Array Dimensions**: 5° x 5° tiles\n", "\n", - "print(test_noaa_query_url)\n", + "**Spatial Resolution**: 90 meters (at the equator)\n", "\n", - "response = urllib.request.urlopen(test_noaa_query_url)\n", - "webContent = response.read().decode('UTF-8')\n", + "**Temporal Resolution**: Single-time snapshot (data captured during the SRTM mission in 2000)\n", "\n", - "print(webContent)\n", + "**Vertical Accuracy**: Less than 16 meters error\n", "\n", - "parsed_html = BeautifulSoup(webContent)\n", - "container_div = parsed_html.body.find('div', attrs={'class':'container'})\n", + "**Data Format**: ArcInfo ASCII and GeoTiff\n", "\n", - "if container_div is not None:\n", - " print(container_div.text)\n", - "else:\n", - " print(\"Container div not found\")\n", + "**Coverage**: Western USA\n", "\n", - "print(container_div)\n" - ] - }, - { - "cell_type": "markdown", - "id": "8ee2035e", - "metadata": {}, - "source": [ - "We automate the process of accessing and parsing NOAA's snow data using BeautifulSoup. By constructing a query URL based on user-defined location and date parameters, we automate the process of fetching web content and parsing HTML to extract pertinent information. The above script enhances efficiency in obtaining snow data, offering a user-friendly approach for various analyses or applications without the need for technical expertise." 
- ] - }, - { - "cell_type": "markdown", - "id": "2afc5f94", - "metadata": {}, - "source": [ - "# DEM Dataset\n", - "\n", - "## Introduction to DEM (Digital Elevation Model):\n", - "A Digital Elevation Model (DEM) is a digital representation of the topography of a surface, such as the Earth's terrain or the surface of another celestial body. It consists of a grid of elevation values, where each cell in the grid represents the elevation at a specific location. DEMs are widely used in various fields, including geography, geology, hydrology, environmental modeling, urban planning, and 3D visualization.\n", + "**Projection**: WGS84 datum, geographic coordinate system\n", "\n", - "## Characteristics\n", - "\n", - "- **Spatial Resolution**: DEMs can vary in spatial resolution, ranging from coarse resolution global datasets to high-resolution local datasets. Higher spatial resolution DEMs provide more detailed information about the terrain.\n", - " \n", - "- **Accuracy**: The accuracy of DEMs depends on the source data and the methods used for their generation. High-quality DEMs are crucial for accurate analysis and decision-making in applications such as flood modeling, terrain navigation, and infrastructure planning.\n", - " \n", - "- **Coverage**: DEMs can cover different geographic extents, from local areas to entire continents or even the entire globe. The coverage of a DEM determines its utility for specific applications.\n", - " \n", - "- **Data Format**: DEM data is typically stored in raster formats such as GeoTIFF, ASCII grid, or Esri GRID. DEM data is typically stored in raster formats such as GeoTIFF, ASCII grid, or Esri GRID.\n", + "
\n", "\n", "DEM data is typically stored in raster formats such as GeoTIFF, ASCII grid, or Esri GRID. DEM data is typically stored in raster formats such as GeoTIFF, ASCII grid, or Esri GRID. Additional information such as coordinate system, spatial resolution, and metadata may also be included in the data file.\n", "\n", @@ -1047,415 +604,7 @@ "\"Aspect\n", "\n", "\"Nothness\n", - "\"Eastness\n", - "\n", - "\n", - "## Data Sources and Acquisition:\n", - "\n", - "#### Satellite Imagery:\n", - "DEMs can be derived from satellite imagery using techniques such as stereo photogrammetry or interferometry.\n", - "\n", - "#### Aerial LiDAR (Light Detection and Ranging):\n", - "LiDAR data collected from aircraft can produce high-resolution DEMs with accurate elevation information.\n", - "\n", - "#### Topographic Surveys:\n", - "Ground-based surveys using total stations or GPS equipment can also be used to generate DEMs for smaller areas with high precision.\n", - "\n", - "## Applications:\n", - "\n", - "- Terrain Analysis for Infrastructure Development\n", - "- Environmental Impact Assessment\n", - "- Geological Mapping and Exploration\n", - "- Disaster Risk Reduction\n", - "- Climate Change Modeling\n", - "- Ecological and Habitat Modeling\n", - "\n", - "# DEM Data Download:" - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "id": "e0e8e426", - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "western_df.head() = Latitude Longitude x y Elevation Slope Aspect Curvature \\\n", - "0 49.0 -125.000 0 0 15.124211 0.470627 272.35254 3624.44560 \n", - "1 49.0 -124.964 1 0 136.762280 0.454659 284.84660 1261.81840 \n", - "2 49.0 -124.928 2 0 258.745100 0.446243 282.81650 -1763.61730 \n", - "3 49.0 -124.892 3 0 387.150480 0.281288 6.37222 -2461.73140 \n", - "4 49.0 -124.856 4 0 213.531710 0.405632 30.24500 -511.48447 \n", - "\n", - " Northness Eastness \n", - "0 0.041025 0.784977 \n", - "1 0.250835 0.768424 \n", - "2 
0.218295 0.772784 \n", - "3 0.782300 -0.110535 \n", - "4 0.712497 -0.466602 \n", - "stations_file_df.head() = stationTriplet stationId stateCode networkCode name \\\n", - "0 ABY:CA:SNOW ABY CA SNOW Abbey \n", - "1 0010:ID:COOP 0010 ID COOP Aberdeen Experimnt Stn \n", - "2 0041:NM:COOP 0041 NM COOP Abiquiu Dam \n", - "3 08108010:NM:BOR 08108010 NM BOR Abiquiu Reservoir \n", - "4 13E19:ID:SNOW 13E19 ID SNOW Above Gilmore \n", - "\n", - " dcoCode countyName huc elevation latitude longitude \\\n", - "0 UN Plumas 1.802012e+11 5650.0 39.95500 -120.53800 \n", - "1 ID Bingham 1.704021e+11 4410.0 42.95000 -112.83333 \n", - "2 UN Rio Arriba 1.302010e+11 6380.0 36.23333 -106.43333 \n", - "3 CO Rio Arriba 1.302010e+11 6180.0 36.23700 -106.42912 \n", - "4 ID Lemhi 1.706020e+11 8289.0 44.45615 -113.30097 \n", - "\n", - " dataTimeZone pedonCode shefId beginDate endDate \n", - "0 NaN NaN NaN 1963-02-01 00:00:00.0 2100-01-01 \n", - "1 NaN NaN ABDI1 1914-01-01 00:00:00.0 2100-01-01 \n", - "2 NaN NaN ABIN5 1957-01-01 00:00:00.0 2100-01-01 \n", - "3 NaN NaN NaN 1964-09-01 00:00:00.0 2100-01-01 \n", - "4 NaN NaN ABGI1 1961-01-01 00:00:00.0 2100-01-01 \n" - ] - } - ], - "source": [ - "# Load dependencies\n", - "import geopandas as gpd\n", - "import json\n", - "import geojson\n", - "from pystac_client import Client\n", - "import planetary_computer\n", - "import xarray\n", - "import rioxarray\n", - "import xrspatial\n", - "import numpy as np\n", - "import matplotlib.pyplot as plt\n", - "import pandas as pd\n", - "from pyproj import Proj, transform\n", - "import os\n", - "import sys, traceback\n", - "import requests\n", - "\n", - "home_dir = os.path.expanduser('~')\n", - "work_dir = f\"{home_dir}/gridmet_test_run\"\n", - "snowcast_github_dir = f\"{home_dir}/Documents/GitHub/SnowCast/\"\n", - "\n", - "#exit() # this process no longer need to execute, we need to make Geoweaver to specify which process doesn't need to run\n", - "\n", - "# user-defined paths for data-access\n", - "data_dir = 
f'{snowcast_github_dir}data/'\n", - "gridcells_file = data_dir+'snowcast_provided/grid_cells_eval.geojson'\n", - "stations_file = f\"{work_dir}/all_snotel_cdec_stations_active_in_westus.csv\"\n", - "gridcells_outfile = data_dir+'terrain/gridcells_terrainData_eval.csv'\n", - "stations_outfile = f\"{work_dir}/training_all_active_snotel_station_list_elevation.csv_terrain_4km_grid_shift.csv\"\n", - "\n", - "\n", - "def get_planetary_client():\n", - " #requests.get('https://planetarycomputer.microsoft.com/api/stac/v1')\n", - "\n", - " # setup client for handshaking and data-access\n", - " print(\"setup planetary computer client\")\n", - " client = Client.open(\"https://planetarycomputer.microsoft.com/api/stac/v1\",ignore_conformance=True)\n", - " \n", - " return client\n", - "\n", - "def prepareGridCellTerrain():\n", - " client = get_planetary_client()\n", - " # Load metadata\n", - " gridcellsGPD = gpd.read_file(gridcells_file)\n", - " gridcells = geojson.load(open(gridcells_file))\n", - " stations = pd.read_csv(stations_file)\n", - "\n", - " # instantiate output panda dataframes\n", - " df_gridcells = df = pd.DataFrame(columns=(\n", - " \"Longitude [deg]\",\"Latitude [deg]\",\n", - " \"Elevation [m]\",\"Aspect [deg]\",\n", - " \"Curvature [ratio]\",\"Slope [deg]\",\n", - " \"Eastness [unitCirc.]\",\"Northness [unitCirc.]\"))\n", - " # instantiate output panda dataframes\n", - " # Calculate gridcell characteristics using Copernicus DEM data\n", - " print(\"Prepare GridCell Terrain data\")\n", - " for idx,cell in enumerate(gridcells['features']):\n", - " print(\"Processing grid \", idx)\n", - " search = client.search(\n", - " collections=[\"cop-dem-glo-30\"],\n", - " intersects={\"type\":\"Polygon\", \"coordinates\":cell['geometry']['coordinates']},\n", - " )\n", - " items = list(search.get_items())\n", - " print(\"==> Searched items: \", len(items))\n", - "\n", - " cropped_data = None\n", - " try:\n", - " signed_asset = 
planetary_computer.sign(items[0].assets[\"data\"])\n", - " data = (\n", - " #xarray.open_rasterio(signed_asset.href)\n", - " xarray.open_rasterio(signed_asset.href)\n", - " .squeeze()\n", - " .drop(\"band\")\n", - " .coarsen({\"y\": 1, \"x\": 1})\n", - " .mean()\n", - " )\n", - " cropped_data = data.rio.clip(gridcellsGPD['geometry'][idx:idx+1])\n", - " except:\n", - " signed_asset = planetary_computer.sign(items[1].assets[\"data\"])\n", - " data = (\n", - " xarray.open_rasterio(signed_asset.href)\n", - " .squeeze()\n", - " .drop(\"band\")\n", - " .coarsen({\"y\": 1, \"x\": 1})\n", - " .mean()\n", - " )\n", - " cropped_data = data.rio.clip(gridcellsGPD['geometry'][idx:idx+1])\n", - "\n", - " # calculate lat/long of center of gridcell\n", - " longitude = np.unique(np.ravel(cell['geometry']['coordinates'])[0::2]).mean()\n", - " latitude = np.unique(np.ravel(cell['geometry']['coordinates'])[1::2]).mean()\n", - "\n", - " print(\"reproject data to EPSG:32612\")\n", - " # reproject the cropped dem data\n", - " cropped_data = cropped_data.rio.reproject(\"EPSG:32612\")\n", - "\n", - " # Mean elevation of gridcell\n", - " mean_elev = cropped_data.mean().values\n", - " print(\"Elevation: \", mean_elev)\n", - "\n", - " # Calculate directional components\n", - " aspect = xrspatial.aspect(cropped_data)\n", - " aspect_xcomp = np.nansum(np.cos(aspect.values*(np.pi/180)))\n", - " aspect_ycomp = np.nansum(np.sin(aspect.values*(np.pi/180)))\n", - " mean_aspect = np.arctan2(aspect_ycomp,aspect_xcomp)*(180/np.pi)\n", - " if mean_aspect < 0:\n", - " mean_aspect = 360 + mean_aspect\n", - " print(\"Aspect: \", mean_aspect)\n", - " mean_eastness = np.cos(mean_aspect*(np.pi/180))\n", - " mean_northness = np.sin(mean_aspect*(np.pi/180))\n", - " print(\"Eastness: \", mean_eastness)\n", - " print(\"Northness: \", mean_northness)\n", - "\n", - " # Positive curvature = upward convex\n", - " curvature = xrspatial.curvature(cropped_data)\n", - " mean_curvature = curvature.mean().values\n", - " 
print(\"Curvature: \", mean_curvature)\n", - "\n", - " # Calculate mean slope\n", - " slope = xrspatial.slope(cropped_data)\n", - " mean_slope = slope.mean().values\n", - " print(\"Slope: \", mean_slope)\n", - "\n", - " # Fill pandas dataframe\n", - " df_gridcells.loc[idx] = [longitude,latitude,\n", - " mean_elev,mean_aspect,\n", - " mean_curvature,mean_slope,\n", - " mean_eastness,mean_northness]\n", - "\n", - " # Save output data into csv format\n", - " df_gridcells.set_index(gridcellsGPD['cell_id'][0:idx+1],inplace=True)\n", - " df_gridcells.to_csv(gridcells_outfile)\n", - "\n", - "def prepareStationTerrain():\n", - " client = get_planetary_client()\n", - " \n", - " df_station = pd.DataFrame(columns=(\"Longitude [deg]\",\"Latitude [deg]\",\n", - " \"Elevation [m]\",\"Elevation_30 [m]\",\"Elevation_1000 [m]\",\n", - " \"Aspect_30 [deg]\",\"Aspect_1000 [deg]\",\n", - " \"Curvature_30 [ratio]\",\"Curvature_1000 [ratio]\",\n", - " \"Slope_30 [deg]\",\"Slope_1000 [deg]\",\n", - " \"Eastness_30 [unitCirc.]\",\"Northness_30 [unitCirc.]\",\n", - " \"Eastness_1000 [unitCirc.]\",\"Northness_1000 [unitCirc.]\"))\n", - " \n", - " stations_df = pd.read_csv(stations_file)\n", - " print(stations_df.head())\n", - " # Calculate terrain characteristics of stations, and surrounding regions using COP 30\n", - " for idx,station in stations_df.iterrows():\n", - " search = client.search(\n", - " collections=[\"cop-dem-glo-30\"],\n", - " intersects={\n", - " \"type\": \"Point\", \n", - " \"coordinates\": [\n", - " stations_df['lon'],\n", - " stations_df['lat']\n", - " ]\n", - " },\n", - " )\n", - " items = list(search.get_items())\n", - " print(f\"Returned {len(items)} items\")\n", - "\n", - " try:\n", - " signed_asset = planetary_computer.sign(items[0].assets[\"data\"])\n", - " data = (\n", - " xarray.open_rasterio(signed_asset.href)\n", - " .squeeze()\n", - " .drop(\"band\")\n", - " .coarsen({\"y\": 1, \"x\": 1})\n", - " .mean()\n", - " )\n", - " xdiff = 
np.abs(data.x-stations_df['lon'])\n", - " ydiff = np.abs(data.y-stations_df['lat'])\n", - " xdiff = np.where(xdiff == xdiff.min())[0][0]\n", - " ydiff = np.where(ydiff == ydiff.min())[0][0]\n", - " data = data[ydiff-33:ydiff+33,xdiff-33:xdiff+33].rio.reproject(\"EPSG:32612\")\n", - " except:\n", - " traceback.print_exc(file=sys.stdout)\n", - " signed_asset = planetary_computer.sign(items[1].assets[\"data\"])\n", - " data = (\n", - " xarray.open_rasterio(signed_asset.href)\n", - " .squeeze()\n", - " .drop(\"band\")\n", - " .coarsen({\"y\": 1, \"x\": 1})\n", - " .mean()\n", - " )\n", - " xdiff = np.abs(data.x-stations_df['lon'])\n", - " ydiff = np.abs(data.y-stations_df['lat'])\n", - " xdiff = np.where(xdiff == xdiff.min())[0][0]\n", - " ydiff = np.where(ydiff == ydiff.min())[0][0]\n", - " data = data[ydiff-33:ydiff+33,xdiff-33:xdiff+33].rio.reproject(\"EPSG:32612\")\n", - "\n", - " # Reproject the station data to better include only 1000m surrounding area\n", - " inProj = Proj(init='epsg:4326')\n", - " outProj = Proj(init='epsg:32612')\n", - " new_x,new_y = transform(inProj,outProj,\n", - " stations_df['lon'],\n", - " stations_df['lat'])\n", - "\n", - " # Calculate elevation of station and surroundings\n", - " mean_elevation = data.mean().values\n", - " elevation = data.sel(x=new_x,y=new_y,method='nearest')\n", - " print(elevation.values)\n", - "\n", - " # Calcuate directional components\n", - " aspect = xrspatial.aspect(data)\n", - " aspect_xcomp = np.nansum(np.cos(aspect.values*(np.pi/180)))\n", - " aspect_ycomp = np.nansum(np.sin(aspect.values*(np.pi/180)))\n", - " mean_aspect = np.arctan2(aspect_ycomp,aspect_xcomp)*(180/np.pi)\n", - " if mean_aspect < 0:\n", - " mean_aspect = 360 + mean_aspect\n", - " aspect = aspect.sel(x=new_x,y=new_y,method='nearest')\n", - " eastness = np.cos(aspect*(np.pi/180))\n", - " northness = np.sin(aspect*(np.pi/180))\n", - " mean_eastness = np.cos(mean_aspect*(np.pi/180))\n", - " mean_northness = np.sin(mean_aspect*(np.pi/180))\n", - 
"\n", - " # Positive curvature = upward convex\n", - " curvature = xrspatial.curvature(data)\n", - " mean_curvature = curvature.mean().values\n", - " curvature = curvature.sel(x=new_x,y=new_y,method='nearest')\n", - " print(curvature.values)\n", - "\n", - " # Calculate slope\n", - " slope = xrspatial.slope(data)\n", - " mean_slope = slope.mean().values\n", - " slope = slope.sel(x=new_x,y=new_y,method='nearest')\n", - " print(slope.values)\n", - "\n", - " # Fill pandas dataframe\n", - " df_station.loc[idx] = [stations_df['lon'],\n", - " stations_df['lat'],\n", - " station['elevation_m'],\n", - " elevation.values,mean_elevation,\n", - " aspect.values,mean_aspect,\n", - " curvature.values,mean_curvature,\n", - " slope.values,mean_slope,\n", - " eastness.values,northness.values,\n", - " mean_eastness,mean_northness]\n", - "\n", - " # Save output data into CSV format\n", - " df_station.set_index(stations_df['station_name'][0:idx+1],inplace=True)\n", - " df_station.to_csv(stations_outfile)\n", - "\n", - "\n", - "def add_more_points_to_the_gridcells():\n", - " # check how many points are in the current grid_cell json\n", - " station_cell_mapping = f\"{work_dir}/station_cell_mapping.csv\"\n", - " current_grid_df = pd.read_csv(station_cell_mapping)\n", - " \n", - " print(current_grid_df.columns)\n", - " print(current_grid_df.shape)\n", - " \n", - " western_us_coords = f'{work_dir}/dem_file.tif.csv'\n", - " dem_df = pd.read_csv(western_us_coords)\n", - " print(dem_df.head())\n", - " print(dem_df.shape)\n", - " filtered_df = dem_df[dem_df['Elevation'] > 20] # choose samples from points higher than 20 meters\n", - "\n", - " # Randomly choose 700 rows from the filtered DataFrame\n", - " random_rows = filtered_df.sample(n=700)\n", - " random_rows = random_rows[[\"Latitude\", \"Longitude\"]]\n", - " random_rows.rename(columns={\n", - " 'Latitude': 'lat', \n", - " 'Longitude': 'lon'\n", - " }, inplace=True)\n", - " previous_cells = current_grid_df[[\"lat\", \"lon\"]]\n", - " 
result_df = previous_cells.append(random_rows, ignore_index=True)\n", - " print(result_df.shape)\n", - " result_df.to_csv(f\"{work_dir}/new_training_points_with_random_dem_locations.csv\")\n", - " print(f\"New training points are saved to {work_dir}/new_training_points_with_random_dem_locations.csv\")\n", - " \n", - " \n", - " \n", - " # find the random points that are on land from the dem.json\n", - " \n", - " # merge the grid_cell.json with the new dem points into a new grid_cell.json\n", - " \n", - "def find_closest_index(target_latitude, target_longitude, lat_grid, lon_grid):\n", - " \"\"\"\n", - " Find the closest grid point indices for a target latitude and longitude.\n", - "\n", - " Parameters:\n", - " target_latitude (float): Target latitude.\n", - " target_longitude (float): Target longitude.\n", - " lat_grid (numpy.ndarray): Array of latitude values.\n", - " lon_grid (numpy.ndarray): Array of longitude values.\n", - "\n", - " Returns:\n", - " int: Latitude index.\n", - " int: Longitude index.\n", - " float: Closest latitude value.\n", - " float: Closest longitude value.\n", - " \"\"\"\n", - " lat_diff = np.float64(np.abs(lat_grid - target_latitude))\n", - " lon_diff = np.float64(np.abs(lon_grid - target_longitude))\n", - " row_idx = np.argmin(lat_diff + lon_diff)\n", - " return row_idx\n", - " \n", - " \n", - "def read_terrain_from_dem_csv():\n", - " western_us_coords = f'{work_dir}/dem_all.csv'\n", - " western_df = pd.read_csv(western_us_coords)\n", - " print(\"western_df.head() = \", western_df.head())\n", - " \n", - " stations_file_df = pd.read_csv(stations_file)\n", - " print(\"stations_file_df.head() = \", stations_file_df.head())\n", - " \n", - " def find_closest_dem_row(row, western_df):\n", - " #print(row)\n", - " row_idx = find_closest_index(\n", - " row[\"latitude\"],\n", - " row[\"longitude\"],\n", - " western_df[\"Latitude\"], \n", - " western_df[\"Longitude\"]\n", - " )\n", - " dem_row = western_df.iloc[row_idx]\n", - " new_row = 
pd.concat([row, dem_row], axis=0)\n", - " return new_row\n", - " \n", - " stations_file_df = stations_file_df.apply(find_closest_dem_row, args=(western_df,), axis=1)\n", - " stations_file_df.to_csv(stations_outfile, index=False)\n", - " \n", - "\n", - "if __name__ == \"__main__\":\n", - " try:\n", - " read_terrain_from_dem_csv()\n", - " except:\n", - " traceback.print_exc(file=sys.stdout)\n" - ] - }, - { - "cell_type": "markdown", - "id": "1bb5bf61", - "metadata": {}, - "source": [ - "In our geospatial data processing workflow, we utilize various libraries to analyze terrain characteristics for the SnowCast project. We calculate attributes like elevation, aspect, curvature, slope, eastness, and northness for grid cells and station locations. Our process involves accessing Copernicus DEM data and leveraging the Planetary Computer service. Through this analysis, we contribute to a broader understanding of the geographic region under study." + "\"Eastness\n" ] } ], diff --git a/book/chapters/dem.ipynb b/book/chapters/dem.ipynb index 74777cc..0ff64a8 100644 --- a/book/chapters/dem.ipynb +++ b/book/chapters/dem.ipynb @@ -5,7 +5,7 @@ "id": "29dcdd816afc6ba4", "metadata": {}, "source": [ - "# 3.5 Digital Elevation Model" + "# 3.6 Digital Elevation Model" ] }, { @@ -17,7 +17,29 @@ "\n", "Used to analyze topography, such as `slope`, `aspect`, and `curvature`, which are essential for understanding landforms and landscape features.\n", "\n", - "This chapter covers how to create, process, and analyze DEMs using Python and shell scripts. We will walk through the process of creating a GeoTIFF file for a specific region, reprojection and resampling of DEMs, and extracting various features from DEMs" + "This chapter covers how to create, process, and analyze DEMs using Python and shell scripts. 
We will walk through creating a GeoTIFF file for a specific region, reprojecting and resampling DEMs, and extracting various terrain features from them.\n",
+    "\n",
+    "## 3.6.1 Characteristics\n",
+    "\n",
+    "
\n",
+    "\n",
+    "**Product/Data Type**: SRTM 90m Digital Elevation Model (DEM)\n",
+    "\n",
+    "**Nominal Data Array Dimensions**: 5° x 5° tiles\n",
+    "\n",
+    "**Spatial Resolution**: 90 meters (at the equator)\n",
+    "\n",
+    "**Temporal Resolution**: Single-time snapshot (data captured during the SRTM mission in 2000)\n",
+    "\n",
+    "**Vertical Accuracy**: Less than 16 meters of absolute error\n",
+    "\n",
+    "**Data Format**: ArcInfo ASCII and GeoTIFF\n",
+    "\n",
+    "**Coverage**: Western USA\n",
+    "\n",
+    "**Projection**: WGS84 datum, geographic coordinate system\n",
+    "\n",
+    "
" ] }, { @@ -25,7 +47,7 @@ "id": "acd0a803", "metadata": {}, "source": [ - "## 3.5.1 Creating a GeoTIFF Template for the Western U.S." + "## 3.6.2 Creating a GeoTIFF Template for the Western U.S." ] }, { @@ -36,18 +58,6 @@ "Our goal is to create a GeoTIFF file that serves as a template for the western U.S. This GeoTIFF will have a specified spatial extent and resolution, and will initially contain an empty 2D array. This template can be used as a starting point for adding real elevation data later." ] }, - { - "cell_type": "markdown", - "id": "edb3cec0", - "metadata": {}, - "source": [ - "**Spatial Extent**: Defines the geographic area covered by the DEM. For this example, we will focus on the western U.S. with specific latitude and longitude boundaries.\n", - "\n", - "**Resolution**: Determines the level of detail in the DEM. A resolution of 0.036 degrees is chosen here, which corresponds to a spatial resolution of approximately 4 kilometers.\n", - "\n", - "**GeoTIFF**: A file format for storing raster graphics, including DEMs. It includes metadata about the spatial reference and other attributes." 
- ] - }, { "cell_type": "code", "execution_count": 1, @@ -127,7 +137,7 @@ "id": "d15cf514", "metadata": {}, "source": [ - "## 3.5.2 Reprojecting and Resampling DEMs" + "## 3.6.3 Reprojecting and Resampling DEMs" ] }, { @@ -233,7 +243,7 @@ "id": "c9479759", "metadata": {}, "source": [ - "## 3.5.3 Calculating DEM Features" + "## 3.6.4 Calculating DEM Features" ] }, { @@ -286,7 +296,7 @@ "id": "5b782375", "metadata": {}, "source": [ - "### 3.5.3.1 How to calculate slope and aspect from a given dem file\n", + "### 3.6.4.1 How to calculate slope and aspect from a given dem file\n", "\n", "**Slope**: This tells us how steep the terrain is.\n", "\n", @@ -354,7 +364,7 @@ "id": "224863a4", "metadata": {}, "source": [ - "### 3.5.3.2 How to calculate Curvature from a dem file\n", + "### 3.6.4.2 How to calculate Curvature from a dem file\n", "\n", "What is Curvature?\n", "\n", @@ -416,7 +426,7 @@ "id": "461d2214", "metadata": {}, "source": [ - "### 3.5.3.3 How to calculate gradients\n", + "### 3.6.4.3 How to calculate gradients\n", "\n", "Why Northness and Eastness?\n", "\n", @@ -476,7 +486,7 @@ "id": "672522c4", "metadata": {}, "source": [ - "## 3.5.4 GeoTIFF to CSV Conversion " + "## 3.6.5 GeoTIFF to CSV Conversion " ] }, { @@ -514,7 +524,7 @@ "id": "9c01cfab", "metadata": {}, "source": [ - "## 3.5.5 How to save GeoTIFF with Meta Data" + "## 3.6.6 How to save GeoTIFF with Meta Data" ] }, { @@ -583,7 +593,7 @@ "id": "9848d5de", "metadata": {}, "source": [ - "## 3.5.6 Unleashing Terrain Insights: From DEM to CSV\n", + "## 3.6.7 Unleashing Terrain Insights: From DEM to CSV\n", "Now lets utilise all the functions we have created to convert the dem files to csv files and merge them into a single csv file consisting slope, aspect, curvature, northness, and eastness" ] }, @@ -758,12 +768,6 @@ "merged_df.to_csv(result_dem_feature_csv_path, index=False)\n", "print(f\"New dem features are updated in {result_dem_feature_csv_path}\")" ] - }, - { - "cell_type": "markdown", - "id": 
"db1056fb", - "metadata": {}, - "source": [] } ], "metadata": { diff --git a/book/chapters/fsCA.ipynb b/book/chapters/fsCA.ipynb index 14a2911..a8e2d54 100644 --- a/book/chapters/fsCA.ipynb +++ b/book/chapters/fsCA.ipynb @@ -7,7 +7,7 @@ "collapsed": false }, "source": [ - "# 3.4 MODIS for fsCA" + "# 3.5 MODIS for fsCA" ] }, { @@ -15,6 +15,31 @@ "id": "d4476bab", "metadata": {}, "source": [ + "## 3.5.1 Characteristics of MODIS\n", + "\n", + "
\n",
+    "\n",
+    "**Earth Science Data Type (ESDT)**: MOD10A1\n",
+    "\n",
+    "**Product Level**: L3\n",
+    "\n",
+    "**Nominal Data Array Dimensions**: 1200km by 1200km\n",
+    "\n",
+    "**Spatial Resolution**: 500m\n",
+    "\n",
+    "**Temporal Resolution**: Daily\n",
+    "\n",
+    "**Map Projection**: Sinusoidal\n",
+    "\n",
+    "
\n" + ] + }, + { + "cell_type": "markdown", + "id": "33259edd", + "metadata": {}, + "source": [ + "## 3.5.2 Procedure\n", "This script is designed to download MODIS snow cover data from NASA servers, convert the downloaded HDF files to GeoTIFF format, and then merge these GeoTIFF tiles into a single file for each day within a specified date range." ] }, @@ -36,7 +61,7 @@ }, { "cell_type": "markdown", - "id": "6683a825", + "id": "774749e3", "metadata": {}, "source": [ "**os, subprocess, threading**: Libraries for file operations, running shell commands, and multithreading.\n", @@ -52,75 +77,171 @@ "create an account in urs.earthdata.nasa.gov for earth access" ] }, + { + "cell_type": "code", + "execution_count": null, + "id": "5add18ec", + "metadata": {}, + "outputs": [], + "source": [ + "start_date = datetime(2023, 1, 1)\n", + "end_date = datetime(2023, 1, 31)\n", + "tile_list = [\"h08v04\", \"h08v05\", \"h09v04\", \"h09v05\", \"h10v04\", \"h10v05\", \"h11v04\", \"h11v05\", \"h12v04\", \"h12v05\", \"h13v04\", \"h13v05\", \"h15v04\", \"h16v03\", \"h16v04\"]\n", + "input_folder = f\"{work_dir}/temp/\"\n", + "output_folder = f\"{work_dir}/output_folder/\"\n", + "modis_day_wise = f\"{work_dir}/final_output/\"\n", + "os.makedirs(output_folder, exist_ok=True)\n", + "os.makedirs(modis_day_wise, exist_ok=True)" + ] + }, { "cell_type": "markdown", - "id": "846abe3a", + "id": "b8c4f93a", "metadata": {}, "source": [ - "## 3.4.1 Converting date to Julian format " + "Defines the date range and a list of MODIS tiles.\n", + "\n", + "Creates input, output, and final output directories if they don't exist.\n" ] }, { "cell_type": "code", - "execution_count": 2, - "id": "377eb1f1", + "execution_count": null, + "id": "a70e2450", "metadata": {}, - "outputs": [], + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "The file ../data/fsca/final_output//2023-01-01__snow_cover.tif does not exist.\n", + "start to download files from NASA server to local\n" + ] + }, 
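The cell above fixes the date range and the MODIS tiles; the download loop then needs one granule per day/tile pair, addressed by the MODIS year-plus-day-of-year (Julian) date. A sketch of that enumeration follows; the `MOD10A1.A<YYYYDDD>.<tile>` stem is illustrative, since real granule names also carry a collection version and a production timestamp:

```python
from datetime import datetime, timedelta

start_date = datetime(2023, 1, 1)
end_date = datetime(2023, 1, 31)
tile_list = ["h08v04", "h08v05", "h09v04", "h09v05", "h10v04", "h10v05",
             "h11v04", "h11v05", "h12v04", "h12v05", "h13v04", "h13v05",
             "h15v04", "h16v03", "h16v04"]

def modis_file_stems(start, end, tiles):
    """Yield one 'MOD10A1.A<YYYYDDD>.<tile>' stem per day/tile pair (hypothetical pattern)."""
    day = start
    while day <= end:
        julian = day.strftime("%Y%j")  # e.g. 2023-01-01 -> '2023001'
        for tile in tiles:
            yield f"MOD10A1.A{julian}.{tile}"
        day += timedelta(days=1)

stems = list(modis_file_stems(start_date, end_date, tile_list))
print(len(stems), stems[0])  # 465 stems; first is MOD10A1.A2023001.h08v04
```

Each stem identifies one HDF granule to fetch, convert to GeoTIFF, and later merge with the other tiles of the same day.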
+ { + "data": { + "application/vnd.jupyter.widget-view+json": { + "model_id": "f783fe916b404146807897d33b326446", + "version_major": 2, + "version_minor": 0 + }, + "text/plain": [ + "QUEUEING TASKS | : 0%| | 0/17 [00:00\n", + "\n", + "**Product/Data Type**: SNOTEL Station Daily Data\n", + "\n", + "**Spatial Resolution**: Point data specific to each SNOTEL station location present in the western USA within bounding box of\n", + "southwest_lon = -125.0\n", + "\n", + "southwest_lat = 25.0\n", + "\n", + "northeast_lon = -100.0\n", + "\n", + "northeast_lat = 49.0\n", + "\n", + "**Temporal Resolution**: Daily\n", + "\n", + "**Data Format**: CSV\n", + "\n", + "\n", + "\n", "\n", "This script automates the collection of Snow Water Equivalent (SWE) data from SNOTEL stations, filters it based on geographic criteria, and saves it into CSV files. By the end of this process, we'll have a valuable dataset, ready to provide insights into SWE, snow depth, and temperature trends in the Western United States." ]
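The geographic filtering step described above reduces to a point-in-box test against the bounding box listed in the characteristics. A minimal sketch, with invented station records purely for illustration:

```python
# Western-U.S. bounding box from the characteristics table.
southwest_lon, southwest_lat = -125.0, 25.0
northeast_lon, northeast_lat = -100.0, 49.0

def in_western_us(lon, lat):
    """True when a station's coordinates fall inside the bounding box."""
    return (southwest_lon <= lon <= northeast_lon
            and southwest_lat <= lat <= northeast_lat)

# Hypothetical station records (name, lon, lat) for illustration only.
stations = [
    ("Paradise", -121.74, 46.78),        # Washington -> inside the box
    ("Mount Mansfield", -72.81, 44.53),  # Vermont -> outside the box
]
kept = [name for name, lon, lat in stations if in_western_us(lon, lat)]
print(kept)  # ['Paradise']
```

Applied to the full SNOTEL station list, the same predicate selects the subset of stations whose daily SWE, snow depth, and temperature records are written to CSV.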