Every year, fracking generates hundreds of billions of gallons of wastewater laced with corrosive salts, radioactive materials and other chemicals. Because some of that wastewater winds up in rivers after it’s treated to remove dangerous contaminants, regulators across the U.S. have begun developing testing regimens to gauge how contaminated fracking wastewater is and how effectively treatment plants remove that contamination.
A newly published scientific study, however, shows that testing methods sometimes used and recommended by state regulators in the Marcellus region can dramatically underestimate the amount of radioactive radium in fracking wastewater.
These test methods can understate radium levels by as much as 99 percent, according to a scientific paper published earlier this month in Environmental Science and Technology Letters. The two tests, both recommended by the Environmental Protection Agency for measuring radium in drinking water, can be thrown off by the mix of other contaminants in salty, chemical-laden fracking brine, the researchers found.
Not all radium tests from the Marcellus region dramatically understate radioactivity. Many researchers, both public and private, have used a method called gamma spectroscopy that has proved far more reliable than the EPA drinking water method. But the findings serve as a warning to regulators across the U.S. as they decide how to monitor radioactivity in fracking waste.
“People have to know that this EPA method is not updated” for use with fracking wastewater or other highly saline solutions, said Avner Vengosh, a geochemist at Duke University.
The team of scientists from the University of Iowa tested “flowback water,” the water that flows back out of a shale well after fracking, using several different test methods. The EPA drinking water method detected less than one percent of the radium-226 present in the samples. Radium-226 is the most common radioactive isotope in Marcellus wastewater.