An explorer of Unemployment Insurance data, which can be downloaded here: https://oui.doleta.gov/unemploy
This data concerns the administration of the UI program in each state.
Specifically, it focuses on data about payment and decision timelapses, which show whether states are holding up their end of the bargain by promptly paying claimants or, at a minimum, promptly adjudicating their cases.
This product uses the FRED® API but is not endorsed or certified by the Federal Reserve Bank of St. Louis. See the FRED API's terms of use: https://research.stlouisfed.org/docs/api/terms_of_use.html
This project was originally created by Michael Hollander and Community Legal Services in 2017. It has been updated and refreshed through a collaboration between The Century Foundation, Community Legal Services, and Michael Hollander.
You can explore the unemployment insurance data we've compiled through an interactive website with an array of charts and maps.
You can view the app here: https://tcf-ui-data.shinyapps.io/ui-data-explorer/
You can download the data for your own analysis as well.
The data is released as a parquet file for use with R, Python, and other programming languages: https://github.com/tcf-ui-data/ui-data-explorer/releases/tag/uiExplorerData
You can also download the data as a collection of CSV tables, which you can open in your favorite statistical package as well as Excel, LibreOffice Calc, or another spreadsheet app: https://github.com/tcf-ui-data/ui-data-explorer/releases/tag/uiExplorerCSV
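If you prefer the command line, a minimal sketch of fetching a release with the GitHub CLI (gh), assuming gh is installed and authenticated; the download line is left commented so you can adapt it first, and the asset file names are whatever each release contains:

```shell
# Release tags from the links above.
tag="uiExplorerData"                  # parquet release; use uiExplorerCSV for the csv tables
repo="tcf-ui-data/ui-data-explorer"

# gh release download "$tag" --repo "$repo"   # uncomment to download the release assets
echo "would download release $tag from $repo"
```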
This project is open source, and we value contributions.
The code in this repository produces two different things: a package of processed unemployment compensation data and a Shiny webapp for visualizing and interacting with the data.
You can download and process the data on your own computer. Download this repository with git clone, then run Rscript unemploymentDataProcessor.R to download and process the data into a variety of useful tables.
You can also use docker-compose to download and process the data with docker-compose run --rm datadownload.
We can publish the data in two ways.
Locally
- Locally clone the repository and run Rscript unemploymentDataProcessor.R.
- Load a Github token into your shell's environment. A .env file is helpful here.
- Run . ./updateRelease.sh to update the released data on Github. The script accepts a few command line arguments; see the script for the details.
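The token-loading step above can be sketched as a short shell session. GITHUB_TOKEN is a hypothetical variable name chosen for illustration; check updateRelease.sh for the name the script actually reads.

```shell
# Sketch: load a Github token from a .env file before publishing.
# GITHUB_TOKEN is a hypothetical name; check updateRelease.sh for the real one.
cat > .env <<'EOF'
GITHUB_TOKEN=ghp_replace_with_your_token
EOF

set -a        # export every variable assigned while sourcing
. ./.env
set +a

# With the token in the environment, publish the processed data:
# . ./updateRelease.sh
echo "token loaded: ${GITHUB_TOKEN:+yes}"   # prints "token loaded: yes"
```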
Github Actions
The Github workflow described in the file .github/workflows/releasedata will automatically process and publish the data.
Once you've downloaded the data, you can use RStudio to run the app. The app is a Shiny application, so RStudio can help you install the necessary packages and get the app running on your computer.
We can publish the app in three ways.
Locally in RStudio
Once you have the app running locally, use the rsconnect library to publish to Shinyapps.io. Set environment variables for SHINYAPPS_ACCOUNT, SHINYAPPS_TOKEN, and SHINYAPPS_SECRET; you can get these values from your shinyapps.io account. Then run Rscript deployShinyApps.R.
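For example, you might export the three variables before running the deploy script. The values below are placeholders; the real ones come from your shinyapps.io account.

```shell
# Placeholder credentials; substitute the values from your shinyapps.io account.
export SHINYAPPS_ACCOUNT="my-account"
export SHINYAPPS_TOKEN="0123456789ABCDEF"
export SHINYAPPS_SECRET="not-a-real-secret"

# Then deploy (requires R with the rsconnect package installed):
# Rscript deployShinyApps.R
```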
Locally with Docker-Compose
Running docker-compose run --rm shinyappdeploy will start a docker service that downloads and processes the data and publishes the app to Shinyapps. Make sure your SHINYAPPS_x environment variables are set up. Docker-compose will automatically load a .env file if there is one.
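A .env file for this route might look like the fragment below (placeholder values; the variable names follow the SHINYAPPS_ settings described above, and docker-compose picks the file up automatically):

```shell
# .env — read automatically by docker-compose; values are placeholders
SHINYAPPS_ACCOUNT=my-account
SHINYAPPS_TOKEN=0123456789ABCDEF
SHINYAPPS_SECRET=not-a-real-secret
```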
Github Actions
The Github workflow .github/workflows/deployshinyio processes the data and publishes to Shinyapps. The workflow .github/workflows/deployshinyiofromrelease uses the released parquet data to publish to Shinyapps.
To use these workflows, you'll need to add your shinyapps token, account, and secret as Github Secrets.
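One way to add them is with the GitHub CLI. This is a hedged sketch: the secret names are assumed to match the SHINYAPPS_ environment variables above, so check the workflow files for the names they actually read, and the gh line is commented out so you can fill in real values first.

```shell
# Sketch: add shinyapps credentials as Github Secrets with the gh CLI.
# Assumes gh is installed and authenticated against your fork of the repo.
secrets="SHINYAPPS_ACCOUNT SHINYAPPS_TOKEN SHINYAPPS_SECRET"
for name in $secrets; do
  # gh secret set "$name" --body "value-for-$name"   # uncomment to set for real
  echo "would set secret: $name"
done
```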