RCRA Overview

The Resource Conservation and Recovery Act (RCRA) was passed by the United States Congress in 1976 to address problems caused by municipal and industrial waste. RCRA focuses on active and future waste facilities and covers the generation, transportation, treatment, and disposal of hazardous wastes. Abandoned and historical sites are managed under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), which is also known as Superfund (paraphrased from information at the EPA website).

RCRA waste characterization parameters include:

  • ignitability (flashpoint);
  • corrosivity (pH);
  • reactivity (reactive cyanide & reactive sulfide);
  • toxicity characteristic leaching procedure (TCLP) pesticides;
  • TCLP herbicides;
  • TCLP volatile organic compounds (VOCs);
  • TCLP semivolatile organic compounds (TCLP SVOCs);
  • polychlorinated biphenyls (PCBs);
  • and the “RCRA 8” metals: silver, arsenic, barium, cadmium, chromium, mercury, lead, and selenium.

Other analyses, such as chloride, paint filter, and total petroleum hydrocarbon (TPH) suites, may also be needed depending upon site history, the waste generator, or the contract. TCLP metals analysis is also occasionally required or useful.

Analytical notes:

  • “TCLP” is not one test but several. Although some of the sample preparation efforts may be combined, this is not possible for every method, and none of the analyses can be combined; each requires a separate method. It is important to account for this in cost estimates.
  • As shown above, reactivity is two methods, both of which are wet chemistry methods.
  • Additionally, although a large carbon range may be analyzed under a single TPH method, the information needed for specific sites requiring TPH analysis usually requires one to three separate TPH methods: TPH-gasoline range organics (GRO), TPH-diesel range organics (DRO), and TPH-oil range organics (ORO). Site history can help determine which carbon ranges are needed.
  • If analysis for polynuclear aromatic hydrocarbons (PAHs) is required, it usually cannot be grouped with SVOCs but rather requires a selected ion monitoring (SIM) analysis. Both analyses may be listed as 8270 methods.
  • “Oil & grease” is generally no longer required.
  • There is no such method as “oil & gas”, and anyone requesting such a method usually needs TPH analyses.
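Because each TCLP fraction, reactivity test, and characteristic test is a separate method, a defensible cost estimate sums them individually rather than pricing “TCLP” or “RCRA suite” as one line item. A minimal sketch of that tally; all method names and prices below are hypothetical placeholders, so substitute your laboratory's actual fee schedule:

```python
# Tally a waste-characterization estimate method by method.
# Every price here is a hypothetical placeholder, not a real lab fee.
suite = {
    "Ignitability (flashpoint)": 40.00,
    "Corrosivity (pH)": 25.00,
    "Reactive cyanide": 45.00,
    "Reactive sulfide": 45.00,
    "TCLP metals": 150.00,
    "TCLP VOCs": 175.00,
    "TCLP SVOCs": 225.00,
    "TCLP pesticides": 160.00,
    "TCLP herbicides": 160.00,
    "PCBs": 120.00,
}

samples = 12  # number of waste samples planned
per_sample = sum(suite.values())
print(f"Per-sample analytical cost: ${per_sample:,.2f}")
print(f"Estimate for {samples} samples: ${per_sample * samples:,.2f}")
```

Itemizing this way also makes it easy to show a client exactly what is saved when a suite is dropped based on documented site history.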

RCRA suites are used on environmental projects to determine if waste is hazardous and what type of disposal is required. Some suites may be dropped if site history is well-known and documented; however, any accepting landfill must approve such measures. If such an approach seems logical, consult with the accepting landfill prior to proceeding.

CERCLA suites vary by site and are based upon site history and investigation data. CERCLA suites are formalized in a Record of Decision (ROD), which is based upon the Remedial Investigation/Feasibility Study (RI/FS).

If you need assistance determining the appropriate analytical suites for your site, please click on the contact page and send us an email for an estimate.

What Are Your Data Telling You?

A Budget and Schedule Streamlining Review & Summary of Anaerobic Reductive Dechlorination of Chlorinated Hydrocarbons

Several previous articles discuss simplifying and streamlining environmental data acquisition and evaluation.

Mindful consideration of the information covered in the aforementioned articles can increase project margins and provide tools for avoiding some common pitfalls, such as:

  • Misinterpretation of nondetect data as an indication that a site is not contaminated when samples have been diluted such that quantitation limits (QLs) are too high to support the conclusion. This can occur because:
    • Matrices are not amenable to the analyses.
    • Contamination concentrations require large dilutions.
    • Analysts are reluctant to analyze samples at lesser dilutions due to inexperience, a lack of understanding of project purposes, a desire to avoid instrument maintenance, or a desire to protect expensive and delicate instrumentation.
  • Failure to plan for an approach to achieve site closure when it is not possible to achieve detection limits (DLs) less than regulatory levels
    • Because regulatory levels are usually established using toxicological studies, not instrumental analyses, current methodologies may not be able to achieve these values.
    • Identifying the approach for addressing this issue during the project planning phase, and securing client and regulatory approval of the approach when the plan is finalized, is more efficient than securing approval after samples have been analyzed.
    • Additionally, planning for this issue allows for identification of more sensitive methodologies if they exist and performance of a cost/benefit analysis for the use of such methods with cooperation of clients and regulators.
    • Equally as important, it provides a tool to avoid misinterpretation of such data as an indication that the site is contaminated.
  • Attempting to remediate to levels less than native background.
    • Sometimes Federal regulatory levels are less than native levels (ex: arsenic in the West) and it is crucial to perform site assessments against background data (or perform background studies) in such cases.
  • Long-term monitoring of common contaminants that have been misidentified as contaminants of concern (COC). In such cases, assessment of historical data is necessary and several sampling rounds may be required to identify and gain approval for eliminating these parameters.

These pitfalls are easily avoided with proper assessment of site data and data requirements. However, site data can provide far more information. This may require analysis of parameters that are not COCs so careful planning is important, because analysis for parameters that do not provide relevant information wastes time and money and does not support the protection of human health and the environment.

Natural attenuation is one example of a remediation process where analysis for non-COCs can provide valuable information. Natural attenuation occurs when naturally occurring processes reduce contamination in soil and groundwater. These processes occur in situ and include dilution, dispersion, volatilization and other natural processes. Monitored natural attenuation (MNA) is an approved remedy at some sites and involves collection of data to assess and document the efficacy of the attenuation process.

When MNA is in place, non-COC data can be used to confirm that conditions are amenable to attenuation and the presence of breakdown products. Aerobic or anaerobic conditions (or one and then the other), specific pH values, and the presence of specific metals and/or microbes may be needed. Natural attenuation may be enhanced through forcing a site to anaerobic conditions, introducing bacteria and/or feeding native bacteria, and/or temporarily altering the pH. In each case, data must be collected to determine if the desired conditions have been achieved, and subsequent data must be collected to determine if attenuation was enhanced.

Contaminants that may undergo natural attenuation include chlorinated solvents, certain metals, radionuclides, and oil & gas-related aromatic hydrocarbons such as benzene, toluene, ethylbenzene, and xylenes (BTEX). For this article, the MNA process to be further considered is anaerobic reductive dechlorination (ARD) of chlorinated hydrocarbons, specifically ARD of the volatile organic compounds (VOCs) tetrachloroethene (PERC or PCE) and trichloroethene (TCE).

The primary or initial contaminant for this process may be either PCE or TCE. These contaminants are present in the environment due to past use as degreasers and dry-cleaning agents and through use in manufacturing processes. The ARD process breaks down PCE to TCE, and TCE subsequently breaks down to cis-1,2-dichloroethene (DCE) and trans-1,2-DCE. 1,1-DCE may also be produced. The DCE isomers break down to vinyl chloride, and vinyl chloride breaks down to ethene. If all PCE and/or TCE breaks down to ethene, remediation is complete, because ethene is not an environmental risk.
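The sequential breakdown described above can be represented as a simple parent-to-daughter chain. A sketch, with the branching to the three DCE isomers collapsed to the cis-1,2-DCE pathway (typically the dominant daughter product) for simplicity:

```python
# Anaerobic reductive dechlorination sequence for PCE/TCE.
# Branching to trans-1,2-DCE and 1,1-DCE is omitted for clarity;
# cis-1,2-DCE is typically the dominant daughter product.
ARD_CHAIN = {
    "PCE": "TCE",
    "TCE": "cis-1,2-DCE",
    "cis-1,2-DCE": "vinyl chloride",
    "vinyl chloride": "ethene",  # terminal, non-hazardous end product
}

def degradation_path(parent):
    """Return the ordered list of daughter products from a parent compound."""
    path = [parent]
    while path[-1] in ARD_CHAIN:
        path.append(ARD_CHAIN[path[-1]])
    return path

print(degradation_path("PCE"))
# ['PCE', 'TCE', 'cis-1,2-DCE', 'vinyl chloride', 'ethene']
```

A chain like this is also a convenient checklist for the analyte list: every daughter product, including ethene, should appear in the analytical data set if the goal is documenting complete dechlorination.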

To assess whether ARD is occurring, breakdown products are included in the analytical data set. If TCE (and/or PCE) concentrations are decreasing and vinyl chloride concentrations are increasing, the process is progressing successfully. Regulatory criteria for vinyl chloride are stringent, and its concentrations may exceed the action limit; however, additional actions are not needed to address the exceedances unless the process stalls at this stage or vinyl chloride is also present because manufacturing at the site included polyvinyl chloride (PVC) production.
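The qualitative check just described, parent concentrations falling while vinyl chloride rises, can be sketched against time-ordered monitoring results. A minimal illustration with hypothetical quarterly data; a real assessment would also weigh hydrogeology, geochemistry, and quantitation limits:

```python
def is_decreasing(series):
    """True if each value is strictly lower than the one before it."""
    return all(b < a for a, b in zip(series, series[1:]))

def is_increasing(series):
    """True if each value is strictly higher than the one before it."""
    return all(b > a for a, b in zip(series, series[1:]))

def ard_progressing(parent_conc, vc_conc):
    """Crude indicator that reductive dechlorination is progressing:
    parent (PCE/TCE) concentrations falling while vinyl chloride rises."""
    return is_decreasing(parent_conc) and is_increasing(vc_conc)

# Hypothetical quarterly groundwater results, micrograms per liter.
tce = [480, 390, 310, 220]
vinyl_chloride = [2, 8, 19, 34]
print(ard_progressing(tce, vinyl_chloride))  # True
```

If the same check later shows vinyl chloride rising while ethene stays flat, that is the “stalled at vinyl chloride” case noted above, which does warrant additional action.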

Because MNA can be unacceptably time-consuming, often taking decades to progress, various enhancement processes, as previously noted, have been implemented. The ARD process may be accelerated through bioaugmentation with Dehalococcoides microorganisms (abbreviated DHC or DHE), which can be naturally occurring or inoculated into a site. Even when naturally occurring, the native population may not be large enough to speed ARD to the desired extent. In such cases, the microorganisms may be fed with vegetable oil or molasses. The table below shows some of the analyses that may be used at an ARD site.

ARD Table

Additional information about ARD and various analytical parameters can be found at the EPA Clu-In website. That resource provides more detailed and extensive information as well as links to additional resources. However, in the experience of the authors, if results show that ARD is occurring, assessing other parameters at the frequency indicated at the Clu-In site is not necessary. Some of the tabulated analyses are high-cost specialty analyses that may not add value for your specific project. If you are unsure of the frequency and type of analyses needed at your site, we invite you to contact Oak Services, LLC for a consultation.

References & Resources

2007, Agency for Toxic Substances & Disease Registry (ATSDR), https://www.atsdr.cdc.gov/csem/csem.asp?csem=15&po=5, November.

2014, ATSDR, https://www.atsdr.cdc.gov/phs/phs.asp?id=263&tid=48, October.

1999, United States Environmental Protection Agency (USEPA), https://www.epa.gov/sites/production/files/2014-02/documents/d9200.4-17.pdf, April.

2012, USEPA, https://clu-in.org/techfocus/default.focus/sec/bioremediation/cat/Anaerobic_Bioremediation_(Direct).

2001, United States Geological Survey (USGS), Natural Attenuation Strategy for Groundwater Cleanup Focuses on Demonstrating Cause and Effect, January.

Environmental Analyses and Your Project Budget, Part Two

As we discussed in Part One of this two-part series of articles, managing the budget is an important aspect of any project, and with Performance Based Contracts (PBCs), it is critical. In Part One, we discussed streamlining analytical parameters and what to consider when determining when and how to do so. In a previous article, we discussed what your laboratory needs to know to provide you with the most accurate pricing for your project. In this article, we focus on questions about how your analyses will be used and how the answers can further help manage your budget and schedule.

What decisions hinge upon your analytical results? Will you have dig-sites that must be left open as you await the results? Will discharge operations stall while you wait?

In any project where stand-by time is accrued prior to receipt of analytical results, compare stand-by costs to mark-ups for expedited turn-around-times (TATs) for analytical results. Ideally, this comparison is performed while generating your estimate; however, it can also occur during project planning. In almost all cases, the costs for stand-by labor-hours exceed the mark-ups for expedited TATs for results, even at 100% and 200% mark-ups for the expedited TATs. It is important to confirm that your laboratory can meet your TAT needs and to identify any methodological limitations to meeting those TATs (some methods can be performed within 24 hours while some cannot). Sometimes, clients will permit action with preliminary or partial results after an initial wave of full results if results can reasonably be anticipated to be similar for each event. Our article on establishing a relationship with your laboratory and Part One of this article provide some guidance for communicating with laboratories and clients.
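The stand-by versus expedite comparison is simple arithmetic once the inputs are gathered. A sketch with entirely hypothetical rates and lab fees; substitute your own burdened labor rates and your laboratory's actual mark-up schedule:

```python
# Compare the cost of crew stand-by time against the laboratory's
# mark-up for an expedited turn-around-time (TAT).
# All figures below are hypothetical placeholders.
crew_size = 4
hourly_rate = 85.00          # fully burdened labor rate, $/hour
standby_days = 3             # days the crew waits on standard-TAT results
hours_per_day = 8

standard_lab_cost = 6000.00  # analytical cost at standard TAT
expedite_markup = 1.00       # 100% mark-up for expedited TAT

standby_cost = crew_size * hourly_rate * standby_days * hours_per_day
expedite_cost = standard_lab_cost * expedite_markup

print(f"Stand-by labor cost:   ${standby_cost:,.2f}")
print(f"Expedite mark-up cost: ${expedite_cost:,.2f}")
if expedite_cost < standby_cost:
    print("Expedited TAT is the cheaper option.")
```

Even at this modest crew size and a full 100% mark-up, the expedited TAT comes out ahead, which is consistent with the pattern noted above for most stand-by scenarios.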

Does your laboratory offer sample pick-up? Can you deliver samples? Does the laboratory offer on-site packing services?

If you subcontract with a laboratory that offers pick-up at your location, compare costs of this service to costs of shipping. Alternately, a local laboratory may make it possible for your field personnel to drop off samples at the end of the day, which can also save shipping costs. Some laboratories also offer sample packing at field sites. It may be worth considering the costs of this service vs. the possibility of burn-out for field staff if you have limited personnel who will otherwise be packing samples after a long day in the field. The potential for mistakes in sample labelling, packing, and Chain-of-Custody procedures increases at the end of a long work day, and such errors can lead to the need to reanalyze or even re-collect samples, thus increasing costs. Personnel turn-over resulting from burn-out can also be costly. While these concerns do not apply to every project, a forthright, proactive assessment of whether they apply to yours can mitigate the need to solve problems that could have been prevented.

What level of data validation is needed? Is it imperative to wait for final validation or can actions move forward based on preliminary results?

Depending upon the nature and sensitivity of your project, your client may permit action based on preliminary data verification prior to validation, may approve limited validation, or may – if your laboratory and data validation group remain consistent – approve actions based on preliminary data and data verification following one or more rounds of full validation.

It is important to consult with your client and ensure all actions are ethical and support achievement of project objectives prior to taking these actions. In our article about data validation, we discuss how environmental data validation can mitigate risk, including budgetary risk, for projects. We also provide insight on when it is appropriate to perform reduced validation and when validation may not be required at all. Once you have ensured you are performing the proper level of validation, streamlining your analytical parameters as discussed in Part One will result in streamlined validation, further supporting budget and schedule management.

Actions that reduce stand-by time or streamline work by limiting the focus to relevant details will have a positive impact on schedule and budget. Assessing project types for determining when it is ethical and technically sound to approach your client with these questions is beyond the scope of this article. If you are unsure about this for your own work, we invite you to contact us to determine if contracting Oak Services, LLC for a brief consultation regarding these concerns or for other types of Chemistry Program support is right for you.


Environmental Analyses and Your Project Budget, Part One

Managing the budget is an important aspect of any project, and with Performance Based Contracts (PBCs), it is critical. PBCs offer greater opportunities for technical innovation and efficiency but can also pose financial risks. In a previous article, we discussed what your laboratory needs to know to provide you with the most accurate pricing for your project. In this article, we focus briefly on questions you should be asking of your team and actions you can take to help manage your budget, streamline your work, and increase client confidence.

At what stage in the Superfund process is your project? Have Contaminants of Potential Concern (COPCs) or Contaminants of Concern (COCs) been identified? Once a Record of Decision (ROD) has been issued, site characterization has been performed, and your contaminants have been determined.

Ideally, you identified your intention to evaluate only COCs during the proposal stage of your project. Your contract award and approval of your project plans subsequently indicate client and possibly regulatory approval of this approach. Alternately, client approval can be confirmed during scoping sessions.

If analyses for parameters other than COCs are planned on a new or on-going project, why?

It is prudent to discuss changing to COC-only analysis and reporting with clients and regulators, but push-back on ROD-compliant actions is rare. Unless there is a compelling reason to perform or continue performing analyses for non-COCs, there is no ethical or legal reason to do so.

Clients are usually receptive to simplifying approaches. If you are managing an on-going project or taking over an existing project, an historical data review may be prudent. Common laboratory contaminants sometimes show up as COCs. Contaminants that have been remediated may be listed. A thorough review of historical data can identify such parameters and determine if removal from the COC-list is appropriate. This requires formal approval from the client and regulators, but it is not usually an onerous task when taken on by personnel with experience evaluating and interpreting analytical data. It is even possible that parameters that are naturally occurring but exceed Federal action levels are listed (example: arsenic in the Western United States). In such cases, identifying background studies for your client or proposing to your client that one be designed and executed is prudent.

If you are conducting site characterization, your work is key to identifying appropriate COCs. Is site history known? Are you narrowing your focus on contaminants that are reasonably expected to be present or are you taking a “let’s do everything” approach? If the latter, is there a compelling reason to do so? Are there data gaps or suspected historical activities? If not, why do what is not needed? Do background studies exist for the site or region? Should you propose one?

Meetings with clients and regulators may be needed to streamline your analytical parameters, and formal approval may be needed, but these discussions show that you are attending to the details of the project rather than going through the motions. Asking your laboratory to reduce your analyte list only to those parameters that are meaningful to your project is a scientifically sound, ethical way to reduce not only analytical costs, but also reduce internal costs by streamlining data review, interpretation, and reporting.

Environmental Data Validation – What It Is and Why We Do It

Costs of inaccurate or inadequate data can be steep. Problems with data quality can result in tangible and intangible damage ranging from loss of customer/user confidence to loss of life and mission. – Department of Defense (DoD) Guidelines on Data Quality Management

…more than a decade’s experience has demonstrated that integrity is not a safe assumption. – United States Environmental Protection Agency (EPA), 2002 Guidance on Environmental Data Verification and Data Validation

Data validation is absolutely essential at key decision points, such as determining the boundaries of groundwater contamination. – EPA, Region 9

We must ensure project objectives are met through adequate, accurate data. We must also take steps to ensure project decisions are based upon legally defensible data.

In the terminology of the United States Environmental Protection Agency (EPA) and the Department of Defense (DoD), measurement performance criteria (MPC*) or data quality indicators (DQIs) are the criteria for evaluation of project data. Your MPC support your data quality objectives (DQOs), which are objectives your data must satisfy to be “good enough” to support your project decisions. This means MPC support your DQOs, which in turn support the project objectives. However, your MPC do not have to be perfectly met for your DQOs to be achieved.

How do you determine your MPC and DQOs? Your MPC and DQOs are set during project planning. The optimized Uniform Federal Policy – Quality Assurance Project Plan (UFP-QAPP) lays out steps and some guidelines to assist you in determining what they should be. The prompts in the optimized UFP-QAPP format can be useful for project planning even if this challenging document is not required for your project and you do not go on to use the template. Referencing the prompts can help you meet EPA criteria for project plans and determine when to engage your engineers, geologists, and chemists during planning. Though this may seem time-consuming, this approach to planning will save you valuable project execution time once your project is in progress.

Your chemists can help develop and/or identify your MPC. It may be appropriate to default to the current DoD Quality Systems Manual (QSM) criteria, method criteria, or historical laboratory criteria. Because your planning time is not unlimited and because you cannot predict every field and laboratory condition that may affect your data, using established criteria as your MPC is usually a good approach. Your MPC are intended to be a guide for evaluating how well you met your DQOs and not meeting MPC is an indication that a closer review of the data may be needed, but MPC are not a prescriptive measure of your DQOs.**

So, how do you know if your MPC have been met well enough for you to make sound project decisions? How do you know your project decisions are based upon legally defensible data? Data validation is how you do that.

During project execution, your validators evaluate your data to determine if any of your data are not usable for project purposes. Ideally, this evaluation is not simply an “in is in, out is out” approach to evaluating quality control (QC) outliers, but rather takes your overall project goal into account. With use of Automated Data Review (ADR) and similar tools, 60-80% of what a validator does can be automated. However, not all of the required assessments can be automated, and experience and the ability to look at the big picture have value in minimizing risks.

Although laboratories perform several levels of data review, it is illegal for them to “validate” their own data in most cases; this is considered a conflict of interest. Under deadline and holding-time pressure, mistakes can occur.

Scenario 1:

Validation is required by your client for your project, which involves a simple dig-and-haul remediation of lead in soil with no migration to groundwater. Easy, right? You get your preliminary results, which are all nondetects, place your clean fill, and move on. Your final data arrives a few weeks later, and your validators determine that the calibration for lead was improperly performed or calculated by the laboratory and that your nondetect values for lead, reported at a 2 milligram per kilogram (mg/kg) quantitation limit (QL), should have been reported at 20 mg/kg.

If an “in is in, out is out, get it done as quickly as possible” approach is used, this could lead the validators to reject these nondetect values, and you may have to remobilize, re-excavate, and collect and analyze additional samples from your site. Your project has just cost twice as much as anticipated.

However, if your validators have access to your project goals and are taking a “whole project” approach, they will have access to your project action limits (PALs) or decision criteria, which is likely to be between 100 mg/kg and 400 mg/kg for lead, in which case nondetect values with QLs raised to 20 mg/kg are clearly usable for determining that you have met your criteria.
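The “whole project” judgment in this scenario reduces to comparing the raised quantitation limit against the project action limit. A minimal sketch of that comparison, using the values from Scenario 1:

```python
def nondetect_usable(raised_ql, project_action_limit):
    """A nondetect result remains usable for demonstrating compliance
    when the (possibly raised) QL is still below the action limit."""
    return raised_ql < project_action_limit

# Scenario 1: QL raised from 2 to 20 mg/kg; lead PALs or decision
# criteria commonly fall between 100 and 400 mg/kg.
for pal in (100, 400):
    usable = nondetect_usable(20, pal)
    print(f"QL 20 mg/kg vs PAL {pal} mg/kg -> usable: {usable}")
```

An “in is in, out is out” review never performs this comparison, which is exactly why it can reject data that a project-aware validator would document as usable.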

If any of your project documents, inclusive of your Request for Proposal (RFP), Statement of Objectives (SOO), Statement of Work (SOW), or Performance Work Statement (PWS) require data review per the DoD QSM, the UFP-QAPP, or reference ADR, your client is requiring data validation. Clients expect you to understand their requirements and to know the content of their guidance documents.

Data validation is ideally performed and often required for confirming remedial action is complete, for monitoring and operations assessments, and for determining that materials are suitable to be put into or back into the ground (ex: using clean site soils as backfill). Data validation may also be needed for site characterization, depending upon purpose of characterization. Data validation provides assurance that data are adequate for the intended use. Data that are adequate for the intended use lead to sound project decisions. Data validation may also save you a day in court.

Scenario 2:

The new owner of the property in Scenario 1 plans to sue the previous owner, your client. You are called upon to defend your assertion that you completed remediation to the satisfaction of the regulatory requirements of the time; however, ten years have passed since you completed this project and you don’t recall the details of this small project all that well. You are concerned when your data are called into question. However, because your validators clearly and thoroughly documented the calibration issue, raised the reporting limits, and showed your soil samples were clean to 20 mg/kg, the case is dismissed.

Data validation is, however, not always needed. It is rarely needed for waste characterization, and data indicating additional actions are needed (such as excavating wider or deeper) do not require validation. In these cases, responsibility is being assumed by another party (i.e. the waste acceptance facility or landfill) or your team will be taking additional actions before you make the final project decisions. In some cases, the level of validation (such as those commonly referred to as Levels 2, 3, and 4) may be minimized unless issues arise, depending on project objectives. Data validation performed in a manner that is tailored to your project helps ensure project objectives are met and risk is minimized.

If you have questions or comments, please leave them in the comments section below. If you’d like to find out if Oak Services is the right company to write or review your project plan, perform your environmental data validation or assess your validation needs, please contact us and let us know what you’re looking for.



*The parameters evaluated with MPC were historically referred to as “PARCC”, which is an acronym for precision, accuracy, representativeness, comparability, and completeness. PARCC is generally considered to be an obsolete term now, and MPC includes an evaluation of data precision, accuracy, bias, sensitivity, and completeness.

** Although it is possible to tailor your MPC very specifically to your project, it is not possible to anticipate every possible field and laboratory condition that may be encountered during project execution. It is reasonable to use default MPC with room for professional judgment. This approach can be formalized in your QAPP, along with a requirement that professional judgment calls will be briefly explained in your data validation summaries or reports.

About the author: Dianne McNeill is a Proposal Manager and Senior Scientist with Oak Services. She has 23 years of experience in the environmental sciences, with 15 years of environmental data validation experience, including 10 years of experience training new validators. She also has 7 years of environmental laboratory experience, with a focus on GC/MS analysis of VOCs and SVOCs and LC/MS and LC/MS/MS analysis of explosives, dyes, and specialty parameters.