In addition to daily human-in-the-loop quality control (QC) tests that are conducted as data streams are collected, data products are run through multiple automated QC algorithms. If a user identifies an issue with an OOI data product or has a question about QC procedures, please contact the Data Team through the HelpDesk. We will do our best to sort out the issue and correct any identified deficiencies.

Quality Control Goals

OOI instrument deployment and data quality control procedures were designed with the goal of meeting the U.S. Integrated Ocean Observing System (IOOS) Quality Assurance of Real-Time Oceanographic Data (QARTOD) quality control standards:

  • Every real-time observation must be accompanied by a quality descriptor
  • All observations should be subject to automated real-time quality tests
  • Quality flags and test descriptions must be described in the metadata
  • Observers should verify / calibrate sensors before deployment
  • Observers should describe methods / calibration in real-time metadata
  • Observers should quantify level of calibration accuracy and expected error
  • Manual checks on automated procedures, real-time data collected, and status of observing system must be provided on an appropriate timescale
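
As a purely illustrative sketch (not an OOI data format), an observation record meeting these requirements would carry its value, a quality descriptor, and QC metadata together, for example:

```python
# Illustrative only: a single real-time observation carrying a quality
# descriptor plus the metadata that describes the flags and calibration.
observation = {
    "time": "2017-03-01T00:00:00Z",
    "parameter": "sea_water_temperature",
    "value": 12.31,                          # degrees C
    "qc_flag": 1,                            # quality descriptor for this value
    "qc_metadata": {
        "flag_convention": "1 = pass, 0 = flagged",  # assumed convention
        "tests_applied": ["global_range", "spike", "stuck_value"],
        "calibration": {
            "calibrated_on": "2017-01-15",   # pre-deployment verification
            "expected_accuracy": 0.002,      # degrees C, quantified by observer
        },
    },
}
print(observation["value"], observation["qc_flag"])
```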

How We Work

The OOI’s Data Team is led by the Program Management Office at Woods Hole Oceanographic Institution (WHOI), with members from WHOI, University of Washington, Oregon State University, and Rutgers, The State University of New Jersey. The team is responsible for ensuring that the data and metadata delivered by the OOI meet community data quality standards.

The Data Team meets regularly and conducts focus groups to elicit community feedback on the data being served; members stay in constant communication via tools such as Redmine and Slack to troubleshoot and resolve any data issues that arise. Members of the team also work directly with the user community and marine engineers to identify, diagnose, and resolve data availability and data quality issues. To help users better access and use the data, the team offers training to the community on data access, availability, processing routines, and quality control.

The Data Team’s primary goals are:

  • To monitor the operational status of data flowing through the OOI Data system end-to-end
  • To ensure the availability of OOI datasets in the system (raw, processed, derived, and cruise)
  • To verify that data delivered by the system meets quality guidelines
  • To identify data availability and quality issues and make sure they are resolved
  • To communicate known data issues with end users
  • To report operational statistics on data availability and quality, and issue resolution

Manual QC Tests

The data manager oversees QC tests, which are conducted by the team of four data evaluators. These include Quick Look tests (a first pass by evaluators using automated tools) and Deep Dives (a closer look at data flagged as suspect, drawing in subject matter experts). The Data Team clearly annotates any data stream that triggers QC-related alerts, as well as any data flagged as suspect during manual inspection.

The Standard Operating Procedure (as of March 2017) for the data evaluation team can be found here:

QC SOP

The “as-designed” details of QA/QC methods for OOI data, data products, and physical samples can be found in the Protocols and Procedures for OOI Data Products document, which also includes calibration and field verification procedures.

QA/QC PROTOCOLS

Automated QC Algorithms

Data products are run through six automated QC algorithms, and QC reports are created on a biweekly or monthly basis. The automated QC algorithms were coded to specifications created by OOI Program Scientists and informed by the experience of other observatories. The six algorithms currently implemented are described below.

For each test, we list the OOI description, the equivalent QARTOD test, the QARTOD recommendation from the manuals, and notes on its current implementation status.

Global Range Test (pdf)
  • OOI Description: Data are flagged unless they fall within valid world ocean ranges or instrument limits (whichever is more restrictive).
  • QARTOD Equivalent: Gross Range
  • QARTOD Recommendation (from manuals): Only considers manufacturer-defined sensor and calibration limits.
  • Notes: Different tests, different names. The OOI test is currently operational.
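
As a rough illustration only (not the operational OOI code), a minimal global range check might look like the sketch below, assuming a 1 = pass / 0 = flagged convention and limits supplied from a lookup table:

```python
import numpy as np

def global_range_test(values, valid_min, valid_max):
    """Return 1 (pass) where values fall inside [valid_min, valid_max], else 0.

    valid_min/valid_max would come from whichever is more restrictive:
    the world-ocean range for the parameter or the instrument limits.
    """
    values = np.asarray(values, dtype=float)
    flags = np.ones(values.shape, dtype=int)
    flags[(values < valid_min) | (values > valid_max) | np.isnan(values)] = 0
    return flags

# Example: sea-surface temperature (deg C) with one impossible value
print(global_range_test([12.1, 12.3, 45.0, 11.9], -5.0, 35.0))  # [1 1 0 1]
```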

Local Range Test (pdf)
  • OOI Description: Data are flagged unless they fall within locally valid site-specific or depth ranges; thresholds are interpolated between depth and season intervals.
  • QARTOD Equivalent: Local Range
  • QARTOD Recommendation (from manuals): Starts with constant limits for each depth/season interval.
  • Notes: Roughly identical, same nomenclature. OOI local ranges are still being established and entered based on the first year of operations.
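
A simplified sketch of the depth-interpolation idea is shown below; the lookup-table values and the linear interpolation over depth only (seasons omitted) are illustrative assumptions:

```python
import numpy as np

def local_range_test(values, depths, table_depths, table_min, table_max):
    """Flag values that fall outside depth-dependent local limits.

    The limits are linearly interpolated from a lookup table of thresholds at
    discrete depths; the operational test also interpolates across seasons.
    Returns 1 (pass) or 0 (flagged) per point.
    """
    values = np.asarray(values, dtype=float)
    depths = np.asarray(depths, dtype=float)
    lo = np.interp(depths, table_depths, table_min)
    hi = np.interp(depths, table_depths, table_max)
    flags = np.ones(values.shape, dtype=int)
    flags[(values < lo) | (values > hi) | np.isnan(values)] = 0
    return flags

# Hypothetical temperature limits (deg C) that tighten with depth (m)
print(local_range_test(
    values=[14.0, 9.5, 30.0],
    depths=[5.0, 100.0, 1000.0],
    table_depths=[0.0, 200.0, 2000.0],
    table_min=[5.0, 4.0, 1.0],
    table_max=[30.0, 15.0, 5.0],
))  # [1 1 0]
```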

Spike Test (pdf)
  • OOI Description: Each point's deviation from the mean of its 2*N neighboring points is compared to a threshold.
  • QARTOD Equivalent: Spike
  • QARTOD Recommendation (from manuals): N=1, with the default threshold based on the rate-of-change distribution from previous data sets.
  • Notes: Roughly identical, same nomenclature. The OOI test is currently operational.
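
A simplified neighbor-window version of such a spike check is sketched below; the window size N and the threshold are illustrative values, not OOI's operational defaults:

```python
import numpy as np

def spike_test(values, N=1, threshold=10.0):
    """Flag points whose deviation from the mean of their 2*N neighbors
    (N on each side, excluding the point itself) exceeds `threshold`,
    expressed in the same units as the data. Endpoints without a full
    window are left as pass in this sketch. Returns 1 (pass) / 0 (flagged).
    """
    values = np.asarray(values, dtype=float)
    flags = np.ones(values.shape, dtype=int)
    for i in range(N, len(values) - N):
        neighbors = np.concatenate([values[i - N:i], values[i + 1:i + N + 1]])
        if abs(values[i] - neighbors.mean()) > threshold:
            flags[i] = 0
    return flags

# One obvious spike in an otherwise smooth temperature record (deg C)
print(spike_test([10.1, 10.2, 25.0, 10.3, 10.2], N=1, threshold=10.0))
# [1 1 0 1 1]
```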

Trend Test (pdf)
  • OOI Description: Data are flagged as having a trend if the standard deviation of the residuals to a polynomial fit, multiplied by some factor, is less than the standard deviation of the original data. Designed to test for sensor drift.
  • QARTOD Equivalent: N/A
  • QARTOD Recommendation (from manuals): No QARTOD equivalent.
  • Notes: OOI only. This test is not currently operational and is being reviewed for efficacy.
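
The general logic might be sketched as follows; the polynomial order, the factor, and the whole-record pass/fail return are illustrative assumptions rather than the OOI implementation:

```python
import numpy as np

def trend_test(times, values, order=1, factor=3.0):
    """Return 0 (trend detected) if fitting a polynomial of the given order
    reduces the residual standard deviation to less than 1/factor of the
    standard deviation of the raw data; otherwise return 1 (pass).
    """
    times = np.asarray(times, dtype=float)
    values = np.asarray(values, dtype=float)
    coeffs = np.polyfit(times, values, order)
    residuals = values - np.polyval(coeffs, times)
    if np.std(residuals) * factor < np.std(values):
        return 0  # the fitted trend explains most of the variance: likely drift
    return 1

# A slowly drifting sensor: a steady ramp plus small noise
t = np.arange(100.0)
drifting = 0.05 * t + np.random.default_rng(0).normal(0.0, 0.1, 100)
print(trend_test(t, drifting))  # 0 (flagged as a trend)
```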

Stuck Value Test (pdf)
  • OOI Description: If two neighboring values differ by less than the resolution of the sensor for more than N repetitions, the data are flagged.
  • QARTOD Equivalent: Stuck Sensor
  • QARTOD Recommendation (from manuals): The manual suggests 3 consecutive points for a stuck-sensor suspect flag and 5 for a fail flag.
  • Notes: The QARTOD manual suggestion may be too low for well-mixed portions of the water column. We are evaluating the results from the OOI lookup values.
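
A minimal run-length sketch of this check is shown below; the repetition count and sensor resolution are illustrative, and in practice both would come from lookup tables:

```python
import numpy as np

def stuck_value_test(values, resolution, num_repeats):
    """Flag runs where consecutive values differ by less than the sensor
    resolution for at least `num_repeats` points in a row.
    Returns 1 (pass) / 0 (flagged) per point.
    """
    values = np.asarray(values, dtype=float)
    flags = np.ones(values.shape, dtype=int)
    run_start = 0
    for i in range(1, len(values) + 1):
        # A run ends when the next step exceeds the resolution, or at the end.
        if i == len(values) or abs(values[i] - values[i - 1]) >= resolution:
            if i - run_start >= num_repeats:
                flags[run_start:i] = 0
            run_start = i
    return flags

# A conductivity record (S/m) stuck at 3.20 for five consecutive samples
print(stuck_value_test([3.18, 3.20, 3.20, 3.20, 3.20, 3.20, 3.25],
                       resolution=0.005, num_repeats=5))
# [1 0 0 0 0 0 1]
```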

Gradient Test (pdf)
  • OOI Description: If d(data)/d(t) between two points is greater than a set threshold, all following points fail until one falls within an absolute limit (TOLDAT). The first data point is assumed good unless a “good” starting data point (STARTDAT) is defined.
  • QARTOD Equivalent: Rate of Change
  • QARTOD Recommendation (from manuals): QARTOD recommends two neighboring points and does not incorporate TOLDAT or STARTDAT values.
  • Notes: Different tests, different names. The OOI Gradient test is under review and not currently operational.
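
A simplified sketch of the behavior described above is given below; the STARTDAT option is omitted for brevity, and the threshold and tolerance values are illustrative:

```python
import numpy as np

def gradient_test(times, values, grad_max, toldat):
    """Flag points whose rate of change relative to the last good point
    exceeds `grad_max`; once a point fails, subsequent points remain flagged
    until one falls within `toldat` of the last good value. The first point
    is assumed good (the STARTDAT option is omitted in this sketch).
    Returns 1 (pass) / 0 (flagged) per point.
    """
    times = np.asarray(times, dtype=float)
    values = np.asarray(values, dtype=float)
    flags = np.ones(values.shape, dtype=int)
    last_good_val, last_good_t = values[0], times[0]
    in_bad_run = False
    for i in range(1, len(values)):
        if in_bad_run:
            # Recover only when the value returns to within TOLDAT of the
            # last good value.
            if abs(values[i] - last_good_val) <= toldat:
                in_bad_run = False
                last_good_val, last_good_t = values[i], times[i]
            else:
                flags[i] = 0
        else:
            rate = abs(values[i] - last_good_val) / (times[i] - last_good_t)
            if rate > grad_max:
                flags[i] = 0
                in_bad_run = True
            else:
                last_good_val, last_good_t = values[i], times[i]
    return flags

# A two-point excursion in an otherwise smooth record
print(gradient_test([0, 1, 2, 3, 4, 5], [10.0, 10.1, 14.0, 13.9, 10.2, 10.3],
                    grad_max=1.0, toldat=0.5))
# [1 1 0 0 1 1]
```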

Links in order of appearance above:

  • Global Range Test: DATA PRODUCT SPECIFICATION FOR GLOBAL RANGE TEST
  • Local Range Test: DATA PRODUCT SPECIFICATION FOR LOCAL RANGE TEST
  • Spike Test: DATA PRODUCT SPECIFICATION FOR SPIKE TEST
  • Trend Test: DATA PRODUCT SPECIFICATION FOR TREND TEST
  • Stuck Value Test: DATA PRODUCT SPECIFICATION FOR STUCK VALUE TEST
  • Gradient Test: DATA PRODUCT SPECIFICATION FOR GRADIENT TEST

QC algorithms are only run on science data products, not on auxiliary or engineering parameters. The automated QC algorithms do not screen out or delete any data, or prevent it from being downloaded. The algorithms only flag “suspect” data points in the plotting tools and deliver those flags as additional attributes in downloaded data.
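
For example, a user working with a downloaded NetCDF file could inspect these companion QC variables with xarray. The file name, variable names, and the 0 = flagged convention in the sketch below are hypothetical; actual names and conventions depend on the instrument, stream, and delivery method:

```python
import xarray as xr

# Hypothetical file and variable names, for illustration only.
ds = xr.open_dataset("deployment0001_ctd_example.nc")

# The science variable itself is untouched; QC results ride along as
# additional variables/attributes rather than removing any data.
temperature = ds["sea_water_temperature"]
qc_vars = [name for name in ds.data_vars if "qc" in name.lower()]
print("QC variables delivered alongside the data:", qc_vars)

# Count points marked suspect, assuming 0 = flagged in this hypothetical flag
flags = ds["sea_water_temperature_qc_results"]
print("Suspect points:", int((flags == 0).sum()), "of", temperature.size)
```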

Other Useful Documentation

OOI Algorithms

The source code for the QC algorithms implemented in the OOI Cyberinfrastructure system

QC Lookup Tables

Lookup tables that define the limits used by the QC algorithms

How to Interpret Results

A guide to interpreting QC results in downloaded data files