As we discussed in our previous posting, “Money Talks: Utilizing the Power of Statistics for Remediation Savings” (https://bit.ly/3q7btUM), this article focuses on the use of statistical analysis in Sampling Plan preparation to generate even more savings.
Subsurface investigations are an integral part of determining the need for, and the development of, appropriate remedial action plans. A successful investigation should be designed to obtain sufficient information within a minimal number of mobilizations and provide a road map to completion. A deficient investigation can lead your project down the wrong path from the beginning and generally winds up costing more money and time down the line. Like falling dominoes, poor planning can lead to cost overruns, delays, back-and-forth negotiations with regulators, more sampling, demurrage costs, design changes, and unforeseen contamination, often in unexpected locations. Strictly following regulatory guidance on site investigation is no guarantee that these pitfalls will be avoided.
Industry standard practice typically adheres to each state's regulatory guidance for sampling and investigation techniques, on the belief that stringently following the guidance will result in a sound site investigation plan. Regulatory guidance typically focuses on sample collection in areas of staining, product, odors, elevated instrument measurements, or other “evidence” of contamination. Strictly following this sort of guidance generally produces an approvable investigation plan, but it is likely to yield a flawed approach: an incomplete, systematically biased dataset that overestimates the extent of contamination and is therefore statistically invalid. Such results are unlikely to reflect what the investigation was intended to show and create uncertainty when initial assumptions prove untrue. These flaws and uncertainties can force additional mobilizations and sampling to resolve the identified gaps and biases, adding unpredictable costs and time to the project.
To make matters worse, many falsely believe that because their data have been “validated,” they are sound. This is misleading: the laboratory may have done its job in the analysis, but there is little value in a dataset whose laboratory analysis is valid while the dataset itself is statistically biased and invalid. In short, precisely following regulatory guidance is unlikely to produce a sound, unbiased dataset, which translates into poor decisions and remediation plans built on compromised data. That is a recipe for trouble, regardless of whether the data are laboratory validated. Statistically valid data are the best protection to ensure your decisions rest on sound evidence. A sound sampling plan collects data with an element of randomness so that it yields representative results.
At FLS, we design investigation plans that follow regulatory guidance closely enough to win agency approval while also producing a statistically valid dataset. This is accomplished by building an element of randomness into the plan. Random sampling, statisticians agree, is the only way to obtain representative results, and a smaller random dataset yields better information than a larger biased one. Randomization lets real effects stand out by averaging out hidden or overlapping factors and guarding against selection bias. Simply stated, you must collect samples with randomness to obtain reliable data. Sometimes this means initially adding more samples to achieve regulatory approval, but these additional samples are better viewed as an investment than a cost. A sound sampling plan often finds that contamination is less widespread than initially believed, leading to a smaller soil volume and less reporting, which translates into lower costs and reduced cleanup time. It also yields more certainty about the location and extent of contamination. These savings can vastly outweigh the initial sampling costs.
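To make the bias point concrete, here is a minimal Python sketch, using an entirely hypothetical site (the grid size, concentrations, and sample counts are illustrative assumptions, not data from any real project). It compares a “judgmental” plan that targets visibly stained areas against a smaller simple random plan:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical site: 1,000 grid cells, of which the first 100 (10%) are
# contaminated (concentration > 100) and show visible staining.
site = [random.uniform(150, 500) if i < 100 else random.uniform(0, 50)
        for i in range(1000)]

threshold = 100
true_fraction = sum(c > threshold for c in site) / len(site)  # exactly 0.10

# "Judgmental" plan: 40 samples, 30 of them deliberately drawn from the
# stained (first 100) cells, 10 drawn from anywhere on the site.
judgmental = [site[random.randrange(100)] for _ in range(30)] + \
             [site[random.randrange(1000)] for _ in range(10)]
judgmental_fraction = sum(c > threshold for c in judgmental) / len(judgmental)

# Simple random plan: only 25 samples, drawn uniformly across the site.
random_plan = random.sample(site, 25)
random_fraction = sum(c > threshold for c in random_plan) / len(random_plan)

print(f"true contaminated fraction:       {true_fraction:.2f}")
print(f"judgmental estimate (40 samples): {judgmental_fraction:.2f}")
print(f"random estimate (25 samples):     {random_fraction:.2f}")
```

By construction, the judgmental estimate is pushed far above the true 10% because most of its samples come from stained cells, while the smaller random sample produces an unbiased estimate of the sitewide fraction. Larger sample size does not rescue a biased design.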
A reduced scope that uses randomization and statistical analysis can lead to better investigation plans, which in turn produce more reliable datasets for informed decisions. Frequently, the scope reduction is dramatic and comes with a greater degree of confidence in the results. Lower initial investigation costs may catch the eye, but they can lead to poor decisions and increased remediation costs down the line. The most important questions to ask are whether the data obtained constitute a statistically valid dataset and whether sufficient samples were collected to accurately characterize the identified impacts and develop a remediation strategy that addresses them. Generally, our customers find that the FLS approach reduces future remediation costs and shortens the time to achieve regulatory compliance.
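The “were sufficient samples collected?” question can be put on a rough quantitative footing. The sketch below is a back-of-the-envelope calculation only (the function name, confidence level, and inputs are illustrative assumptions; actual plans should follow the applicable regulatory guidance). It uses the standard normal approximation for estimating a proportion to a given margin of error:

```python
import math

def samples_needed(p: float, e: float, z: float = 1.96) -> int:
    """Minimum simple-random samples to estimate a proportion p
    within +/- e at ~95% confidence (z = 1.96), normal approximation."""
    return math.ceil(z * z * p * (1 - p) / (e * e))

# E.g., if roughly 10% of a site is expected to be impacted and we want
# the estimate within +/- 5 percentage points:
print(samples_needed(0.10, 0.05))  # -> 139
```

Note how the required count depends on the expected proportion and the acceptable error, not on the size of the site itself; that is one reason a modest, well-randomized plan can outperform a larger biased one.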
Give us a call and see if our outside-the-box thinking and unique solutions can reduce your costs! Let us use the power of statistics to develop your road map to success. Visit our website at www.flemingleeshue.com or call 212-675-3225.