Data Mining: EPA Superfund Site Records Downloader

Daily Independent Journal is proud to announce the release of our EPA Superfund Toxic Waste Site Records Downloader.  

The Superfund is a US EPA program to clean up the worst of the worst toxic waste sites in the United States. There are currently over 1,300 toxic waste sites on the Superfund's National Priorities List (NPL).

Activists and journalists who want to research these sites are often left with the arduous task of locating and downloading all of the relevant records. Our Superfund downloader streamlines this process, allowing a researcher to quickly locate and acquire all electronically available records for any given toxic waste site.

Each Superfund site on the NPL has an associated identifying number, called a site ID. Once the site ID is located, it can be entered into the downloader, and the software will scrape the US EPA website, download all relevant records, and store them as PDF files.
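The scraping step can be sketched roughly as follows. This is an illustrative outline only, not the downloader's actual code: the EPA page structure and any URL pattern keyed to a site ID are assumptions made for the example, and the sketch simply collects PDF links from a fetched records page.

```python
# Hedged sketch of the record-scraping step. The page layout assumed here
# (records listed as <a href="...pdf"> links) is an illustration, not the
# EPA site's confirmed structure or the downloader's actual implementation.
from html.parser import HTMLParser
from urllib.parse import urljoin


class PdfLinkParser(HTMLParser):
    """Collect absolute URLs for every link that points at a PDF file."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.pdf_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            # Resolve relative links against the page URL so the
            # downloader can fetch each document directly.
            if name == "href" and value and value.lower().endswith(".pdf"):
                self.pdf_links.append(urljoin(self.base_url, value))


def extract_pdf_links(html, base_url):
    """Return the absolute URLs of all PDF links found in an HTML page."""
    parser = PdfLinkParser(base_url)
    parser.feed(html)
    return parser.pdf_links
```

In a full downloader, the records page for a given site ID would first be fetched (for example with `urllib.request`), the extracted PDF URLs would then be downloaded one by one, and each response saved to disk.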

It's estimated that there are well over a half billion records associated with the Superfund, but that only 3 to 4% of these records are available on the US EPA's website.  The rest are stored, in hard copy form, in federally required document repositories around the country. 

The software was conceptualized by our publisher, Matthew Berdyck, and coded by Alex Hwang, a Silicon Valley software engineer who has worked for Reuters, Google, and Facebook.

Hwang's most recent project was working on the Facebook team that identified and notified users who had been targeted by Russian social media propaganda during the 2016 election.

Along with being the publisher of Daily Independent Journal, Berdyck is also the founder of SuperfundResearch.org and a former software tester for FedEx. 

The application is currently in beta and is expected to be developed into a web app in January 2019.

The software can be downloaded via Dropbox here: Superfund Documents Downloader