Fehling Group

Ecological Risk Assessment

TFG and its partner firm Neptune have a team of ecologists, ecological modelers, ecotoxicologists, and statisticians that supports ecological risk assessment with:

  • Calculation of media-specific risk-based values to support site screening, backed by extensive experience in performing screening-level and baseline risk assessments;
  • A coordinated approach to developing assessment endpoints and associated measures that incorporates the current state of ecological knowledge as well as public and stakeholder involvement to ensure that the ecological risk assessment supports decision making;
  • Risk assessments for aquatic, marine, and terrestrial ecosystems based on the appropriate regulatory drivers and stakeholder input;
  • Design and interpretation of field and laboratory ecological studies;
  • Experience in application of EPA’s ecological risk assessment guidance;
  • Development and application of methods for evaluating ecological risks in the unique setting of arid environments;
  • Ecological risk assessment across broad spatial scales; and
  • Development of ecological risk assessment training.

We have established ourselves as leaders in the risk assessment community by developing novel approaches to assessing unique environments and large spatial scales. Members of our team have developed ecological risk assessment (ERA) training under EPA’s Ecological Risk Assessment Guidance for Superfund (ERAGS) tailored to arid environments. We have implemented ERAs at an array of large Federal facilities, including the Los Alamos National Laboratory, the Hanford Site, and multiple DoD facilities across the U.S., as well as commercial facilities ranging from electrical power plants to former petroleum facilities. We have extensive experience designing and conducting ecological risk assessments in aquatic and marine systems, including a number of Naval facilities and port areas. Our team members authored the ecological screening methods for the Los Alamos National Laboratory and developed and implemented the baseline ERA methods for investigations of semiarid fluvial canyons. Our technical approach for large DoD facilities such as Quantico Marine Corps Base has been to prepare defensible data based on clear sampling and decision objectives. Our work at the Hanford Site has included waste site and non-waste areas along over 50 miles of shoreline in the Hanford Reach of the Columbia River and Hanford’s terrestrial environment, which encompasses almost 600 square miles in eastern Washington State. The experience gained from these high-profile projects has furthered our expertise as innovators in the field of ERAs.

TFG and Neptune have the personnel and experience to assess ecological risks in any type of environment and at sites of any size. We have developed database and data analysis software to meet the needs of a diverse clientele. Our focus on planning and data quality allows clients to make defensible environmental decisions in their ecological risk assessments.

STATISTICAL SURVEY DESIGN

We utilize a number of statistical survey design tools and standard statistical software packages to develop sampling and analysis plans that provide data of the right type and quality to support decision making, including:

  • Probabilistic survey designs utilizing random, systematic, stratified and composite sampling approaches to achieve project objectives at the lowest cost.
  • Statistical evaluation of preliminary information to design more efficient sampling plans.

We work with field teams during sampling campaigns to assist with implementation of phased, iterative and sequential sampling approaches.
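
For illustration, the R sketch below (all stratum names, unit counts, and standard deviations are hypothetical) shows the kind of calculation behind one such design: a stratified random sample with Neyman allocation, which assigns more samples to larger and more variable strata to minimize the variance of the estimated site mean for a fixed budget.

    # Hypothetical example: allocate a budget of n = 40 samples across three
    # site strata in proportion to stratum size and variability (Neyman
    # allocation), which minimizes the variance of the stratified mean.
    strata <- data.frame(
      name = c("floodplain", "upland", "outfall"),
      N    = c(120, 300, 30),   # potential sampling units per stratum
      sd   = c(2.0, 0.8, 5.0)   # assumed stratum standard deviations (mg/kg)
    )
    n_total  <- 40
    weights  <- strata$N * strata$sd
    strata$n <- round(n_total * weights / sum(weights))

    # Variance of the stratified mean under this allocation
    W <- strata$N / sum(strata$N)
    var_strat <- sum(W^2 * strata$sd^2 / strata$n)
    print(strata)
    cat("Variance of stratified mean:", round(var_strat, 4), "\n")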

STATISTICAL DATA ANALYSIS

We routinely analyze complex environmental data sets through:

  • Data preparation, exploratory data analysis, and graphical data presentation.
  • Application of classical hypothesis tests and associated power analysis.
  • Monte Carlo simulations and geostatistical analyses.
  • Data quality assessment including evaluation of statistical assumptions.
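
As an example of the power analyses noted above, a prospective sample size calculation for a two-sample t-test can be run directly in base R; the effect size, standard deviation, and error rates below are illustrative assumptions rather than values from any particular project.

    # How many samples per group are needed to detect a 1.5 mg/kg difference
    # between site and background means, assuming a common standard deviation
    # of 2.0 mg/kg, a 5% false positive rate, and 80% power?
    power.t.test(delta = 1.5, sd = 2.0, sig.level = 0.05, power = 0.80,
                 type = "two.sample", alternative = "two.sided")

    # Conversely, the power achieved by a fixed design of n = 15 per group:
    power.t.test(n = 15, delta = 1.5, sd = 2.0, sig.level = 0.05,
                 type = "two.sample")$power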

BAYESIAN STATISTICAL ANALYSIS

For some environmental problems, a frequentist (classical) statistical approach results in collecting unnecessarily large amounts of data with no benefit to public health or safety. For example, when looking for a “needle in a haystack,” frequentist statistics require large sample sizes. Alternatively, Bayesian statistics consider the risk of not finding the needle and allow expert and historical site knowledge to be incorporated into the statistical analyses through initial (prior) distributions. We employ experts in Bayesian statistics and use these methods in many projects.
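
A minimal sketch of this idea in R, assuming a simple beta-binomial model (the prior parameters, sample count, and action level are hypothetical, chosen only to show the mechanics of updating a prior):

    # Expert/historical knowledge that contamination is unlikely is encoded
    # in a beta prior on p, the fraction of contaminated sampling units.
    a <- 1; b <- 19        # prior mean p = 0.05, reflecting site history
    n <- 30; x <- 0        # n samples collected, x found contaminated

    # Conjugate update: posterior for p is beta(a + x, b + n - x)
    post_a <- a + x
    post_b <- b + n - x

    # Posterior probability that p exceeds an action level of 0.10
    p_exceed <- pbeta(0.10, post_a, post_b, lower.tail = FALSE)
    cat("Posterior P(p > 0.10):", round(p_exceed, 4), "\n")

A frequentist design that ignores this prior information would generally require a larger n to bound p below the action level with comparable confidence.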

TFG and our partner firm Neptune and Company have written guidance for EPA on the DQO process and on the subsequent data quality assessment (DQA) performed after the data are collected. We have also written guidance for EPA and the Uniform Federal Policy (an effort to create consistency across EPA, DOE, and DoD) on QAPPs, and have written guidance for many specific applications on development of SAPs, QAPPs, and other documents. Neptune currently reviews all sampling plans that are produced by EPA’s Office of Research and Development. Our team has written or reviewed thousands of SAPs and QAPPs over the more than 30 years that we have been involved in environmental consulting.


We have been involved in developing guidance for various Federal and State agencies. For NDEP, we have developed guidance on a variety of technical areas that are important for environmental problem solving, including data quality objectives, data validation, data usability, detection limits, field duplicates, background comparisons, secular equilibrium, environmental statistics, asbestos and radionuclide risk assessment, ecological risk assessment, and vapor intrusion. We have also developed guidance for DOE and DoD, and have provided extensive support to the development of EPA’s Quality System. Neptune’s founders developed the DQO process at EPA in the 1980s and founded Neptune with the intention of applying the DQO process to facilitate better planning, design, and analysis of environmental data. Since that time, Neptune has supported various revisions of EPA’s Quality System guidance, including the QAPP, DQOs, data quality assessment (DQA), measurement quality objectives (MQOs), data quality indicators (DQIs), associated EPA software (DEFT, DataQuest), and development of Quality Management Plans.

We have also provided direct support to EPA’s Quality Staff in OEIP with guidance development, training, and document review. Recent efforts include the creation of revised guidance documents on Quality Management Plans (QMPs) and QA (Quality Assurance) Project Plans (QAPPs). We created a series of electronic QMPs (eQMPs) for a wide variety of EPA organizations, including National Program Offices, Regions, and Office of Research and Development (ORD) Laboratories. Our team has had a long history of supporting EPA’s Quality Staff in writing or updating key QA guidance documents, most recently including: EPA QA/G-5A (Guidance on QA Project Plans) and EPA QA/G-2A (Guidance on Quality Management Plans), and historically including: EPA QA/G-8 (Guidance on Environmental Data Verification and Validation), EPA QA/G-9R and G-9S (Data Quality Assessment: A Reviewer’s Guide and Statistical Tools for Practitioners), EPA QA/G-11 (Guidance on Quality Assurance for Environmental Technology Design, Construction, and Operation), as well as several supplemental guidance documents on planning using performance and assessment criteria, data quality indicators, measurement quality objectives, a case study (EPA QA/CS-1, A Case Study for Hazardous Waste Site Investigations), and other topics relevant to this Performance Work Statement (PWS).

Our statistics and decision analysis support team has played a leadership role within EPA and for numerous federal and state agencies engaged in environmental characterization, monitoring, risk assessment, and remediation. We have developed web-based interactive decision support tools, including DASEES (Decision Analysis for a Sustainable Environment, Economy, and Society) and GiSdT© (Guided Interactive Statistical Decision Tools), based on open-source software that uses R as the statistical engine. We can customize a website to a user’s requirements, including ready access to large, complex databases and the ability to perform data query, data analysis, and data presentation. We have successfully applied Bayesian methods to solve environmental problems and have worked with EPA to develop a Bayesian approach to data quality objectives (DQOs).


Our partner firm has been involved in the development of the statistical environmental software ProUCL since 1999, and has developed and maintained it over that period. The latest upgrade, ProUCL version 5.2 (2022), is a user-friendly, comprehensive package equipped with statistical methods and graphical tools to address environmental sampling and statistical data analysis needs. The graphical methods in ProUCL include histograms, multiple quantile-quantile (Q-Q) plots, and side-by-side box plots. The Sample Sizes module is useful for developing DQO-based sampling designs and performing power evaluations. The Dixon and Rosner outlier tests and goodness-of-fit (GOF) tests for normal, lognormal, and gamma distributions can be used on data sets with and without nondetects (NDs). For uncensored and left-censored data sets with NDs, ProUCL 5.2 computes several upper limits, including upper confidence limits (UCLs), upper prediction limits (UPLs), and upper tolerance limits (UTLs), taking data distribution, data set size, and skewness into consideration. UCLs are used to estimate exposure point concentration (EPC) terms, while UPLs and UTLs are used to estimate background threshold values (BTVs). UPLs are also used to perform intra-well comparisons, and these upper limits are used in groundwater detection and compliance monitoring projects. The ANOVA module of ProUCL is useful for performing inter-well comparisons. Single-sample hypothesis tests in ProUCL include the t-, Sign, Wilcoxon Signed Rank, and proportion tests; two-sample tests include the t-, Wilcoxon Rank Sum (WRS), Gehan, and Tarone-Ware tests. These tests are used to verify the attainment of cleanup standards and/or to perform site-versus-background comparisons. ProUCL also has regression and trend modules (Mann-Kendall and Theil-Sen tests), which are used on time-series data sets to identify potential trends in environmental data collected over time.
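
The base-R sketch below is not ProUCL code; it simply illustrates, with simulated skewed data, two of the 95% UCLs of the mean that ProUCL reports:

    # Simulated skewed concentration data (lognormal), as is common for
    # environmental measurements
    set.seed(42)
    x <- rlnorm(20, meanlog = 1, sdlog = 0.8)
    n <- length(x)

    # Student's-t 95% UCL (appropriate for approximately normal data)
    ucl_t <- mean(x) + qt(0.95, df = n - 1) * sd(x) / sqrt(n)

    # Percentile-bootstrap 95% UCL (more robust for skewed data)
    boot_means <- replicate(5000, mean(sample(x, n, replace = TRUE)))
    ucl_boot <- quantile(boot_means, 0.95)

    cat("t-UCL:", round(ucl_t, 2), " bootstrap UCL:", round(ucl_boot, 2), "\n")

ProUCL automates the choice among such limits based on the fitted distribution, sample size, and skewness.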

Our team has a strong track record of providing statistical support to EPA and other clients for the application of specific methods to unique site-specific circumstances. Specifically, we have developed and implemented customized solutions for the data analysis and subsequent statistical analysis needed for rapid and accurate waste site assessment. This type of customized work to develop statistical approaches that address site- and program-specific needs necessarily includes computer programming in languages such as R, Python, JavaScript, Objective-C, C#, and PHP, along with supporting frameworks and data formats such as Django, JSON, and XML.

Our team has worked with EPA scientists to develop statistical reports covering many different site- and program-specific focus areas. Our work to both develop and review statistical reports for the EPA has included the following topics: sampling and analysis plans and data, calculation of background contaminant concentrations, comparison of site concentration data to background concentration data, calculation of UCLs, UTLs, and EPCs, analysis of variance (both univariate and multivariate), geostatistical and geospatial analysis, analysis of temporal data, and statistical analysis of survey/questionnaire results. Analyses performed have also included principal component analysis, factor analysis, discriminant analysis, cluster analysis, and other pattern recognition techniques.
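
As a simple illustration of the site-versus-background comparisons listed above, the following base-R sketch applies a one-sided Wilcoxon rank sum test to simulated data (all concentrations are hypothetical):

    # Simulated background and site concentration data
    set.seed(7)
    background <- rlnorm(25, meanlog = 0.5, sdlog = 0.6)
    site       <- rlnorm(25, meanlog = 0.9, sdlog = 0.6)

    # One-sided test: are site concentrations shifted above background?
    wilcox.test(site, background, alternative = "greater")

    # A simple point estimate of the background 95th percentile; formal
    # BTVs add a confidence level to this kind of upper bound.
    quantile(background, 0.95)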

Specific examples of work performed in this capacity include:

  • Under work performed for the ADEQ, we developed an open-source software tool for performing background comparisons. This tool was developed using our GiSdT© technology (based on the statistical programming language R) and was built as a web-based tool available to clients in the Voluntary Remediation Program (VRP). VRP clients use it to support their background characterization efforts and to distinguish background concentrations from those due to anthropogenic sources.
  • We are working on a project to develop methods for statistical analysis of low-count asbestos data. The methods use Bayesian statistics that take advantage of prior information about a site; the approach is similar in spirit to DQOs, but it yields lower confidence bounds when contamination is not expected (a simplified sketch follows this list). This project also entails statistical analysis of existing asbestos data associated with the RAFs instrument for measuring asbestos in near-ground-surface air samples, and involves data validation for the same data. The ultimate objective is to produce a decision analysis framework program to solve asbestos contamination problems.
  • Provided statistical modeling support for a range of experimental and observational studies, including qPCR measurements, biosolids remediation of steroids, and microbial remediation of chlorides. The modeling efforts have ranged from relatively straightforward analysis of variance models to multivariate analysis of variance, linear regression models with complex interaction structures, and linear mixed effects models.
  • Provided experimental design expertise to EPA on a variety of topics. These efforts have included traditional sample size calculations as well as high-level guidance for framing questions in a statistical hypothesis framework. For example, we have provided guidance to the Bio-Response Operational Testing and Evaluation Project, which included sample size review for pre- vs. post-decontamination concentrations, spatial considerations for sampling, and sampling technology comparisons and encouraged designs to support risk-based rather than dose-based reporting of results.
  • Provided expert technical guidance for the application of statistical methods and models. As an example, we developed system-trace statistical models reflecting comprehensive information, verifying that the Consensus, Hierarchical Clustering, Single Model, Group Contribution, FDA, and Nearest Neighbor methods have clearly defined calls within the application. We also developed methods to ensure that “Endpoint Descriptors” (i.e., toxicity endpoints) are statistically correct and easily identified.
  • We have extensive experience with the development of site-specific sampling designs. As an example, we provided key design input into the statistical monitoring design for a number of critical observing platforms for the NSF-funded National Ecological Observatory Network (NEON).
  • Under work performed for the NDEP, we developed a stand-alone open-source software tool for performing background comparisons and estimating upper confidence limits for a mean concentration. This work also included the development and application of Bayesian approaches to establish background concentrations and BTVs. This tool was also developed using our GiSdT© technology (based on the statistical programming language R), but was built as a stand-alone rather than web-based tool. The responsible parties at the BMI site in Henderson, Nevada, use it to support their risk characterization efforts.
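
As a companion to the low-count asbestos item above, the sketch below uses a conjugate gamma-Poisson model with hypothetical prior parameters and counts; the actual project methods are more involved, but the mechanics of incorporating prior information are the same.

    # Counts of asbestos structures per filter are modeled as Poisson; prior
    # site knowledge that contamination is unlikely enters through a gamma
    # prior on the mean count rate.
    a0 <- 0.5; b0 <- 2           # gamma prior (prior mean rate = 0.25)
    counts <- c(0, 0, 1, 0, 0)   # hypothetical structure counts on 5 filters

    # Conjugate update: posterior is gamma(a0 + sum(counts), b0 + n)
    a_post <- a0 + sum(counts)
    b_post <- b0 + length(counts)

    # 95% upper credible bound on the mean count rate
    ucb <- qgamma(0.95, shape = a_post, rate = b_post)
    cat("Posterior mean rate:", round(a_post / b_post, 3),
        " 95% upper bound:", round(ucb, 3), "\n")

With counts this low, the comparable frequentist upper bound, which ignores the prior, is noticeably larger, which is the behavior described in the bullet above.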

We routinely review quality planning documents (e.g., QMPs), activity-specific quality planning documents (sampling and analysis plans [SAPs], standard operating procedures [SOPs], and QAPPs), and research products for adherence to EPA quality specifications, statistical experimental design, and data analysis requirements. The specifications include EPA requirements and/or guidance found at http://www.epa.gov/quality/qa_docs.html; ANSI/ASQ E4-2004, Quality Systems for Environmental Data and Technology Programs: Requirements with Guidance for Use; International Organization for Standardization (ISO) Standard 17025; approved QMP and QAPP requirements; or other guidance listed in the specific technical directive. These reviews typically have a very short turnaround (two weeks) and require personnel from a variety of scientific disciplines. Reviews have been performed for a wide range of topics in support of laboratories throughout NRMRL and other ORD offices.

We are also highly effective at implementing different levels of review as specified by the client to target specific issues of need. For example, we are familiar with the QA categories identified by EPA as discussed in ORD organizational QMPs. The level of effort and the focus of each review are determined by the client, depending on whether the requested review is specified to be initial, follow-up, or statistical-only. If an initial review is requested, this corresponds to the first review of the original document. If a follow-up review is requested, we evaluate revisions to the original document made by the author in response to issues identified during the initial review. If a review is requested to focus solely on the statistical components of a document, our review restricts comments to issues that relate to statistical aspects of the document (e.g., experimental design, decision quality).