T-122-10
Making a List and Checking It Twice: The Importance of Data Standards and QA/QC

Julie M. Defilippi, Atlantic Coastal Cooperative Statistics Program, Arlington, VA
The utility of a data set is directly proportional to how well the data are collected, processed, stored, compiled, maintained, and shared. A good QA/QC plan establishes data standards and checks to be implemented throughout the data lifecycle. Standards and protocols for the collection and sharing of data should be established at the outset of a project. Data collection should include methods that prevent entry errors, such as double data entry, dropdown lists of values, and validation rules. Data processing checks and auditing improve the quality of the data set during the compilation and storage stages of the data lifecycle. Adherence to the QA/QC plan results in a data set that is easily manipulated, queried, and shared. These concepts will be illustrated with examples from the Atlantic Coastal Cooperative Statistics Program (ACCSP), a state-federal partnership of 23 partners established to address deficiencies in the data that constrained marine fisheries management along the Atlantic coast. The ACCSP motto, “Good Data, Good Decisions,” speaks directly to the principle that the product of a data analysis can only ever be as good as its inputs. High-quality, holistic spatial and temporal data sets are an invaluable asset to the regulatory, academic, and public sectors.
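
As a minimal illustration of the entry-error checks described above, the Python sketch below validates a single record against a controlled list of species codes and a plausible length range. The field names, codes, and bounds are hypothetical assumptions for illustration only, not ACCSP standards or implementation.

# Hypothetical sketch of record-level QA/QC validation; field names,
# code list, and length bounds are illustrative assumptions, not ACCSP standards.

VALID_SPECIES_CODES = {"FLK", "SCP", "BSB"}  # assumed controlled vocabulary (dropdown list of values)
LENGTH_RANGE_CM = {"FLK": (20, 90), "SCP": (10, 50), "BSB": (15, 65)}  # assumed plausible ranges


def validate_record(record: dict) -> list[str]:
    """Return a list of QA/QC error messages for one data-entry record."""
    errors = []

    code = record.get("species_code")
    if code not in VALID_SPECIES_CODES:
        errors.append(f"unknown species code: {code!r}")
        return errors  # cannot range-check the length without a valid code

    length = record.get("length_cm")
    low, high = LENGTH_RANGE_CM[code]
    if not isinstance(length, (int, float)) or not (low <= length <= high):
        errors.append(f"length {length!r} outside plausible range {low}-{high} cm for {code}")

    return errors


if __name__ == "__main__":
    sample = [
        {"species_code": "FLK", "length_cm": 45},   # passes all checks
        {"species_code": "XYZ", "length_cm": 30},   # unknown code
        {"species_code": "SCP", "length_cm": 300},  # implausible length
    ]
    for rec in sample:
        problems = validate_record(rec)
        print(rec, "->", problems or "OK")

Checks of this kind can run at entry time (rejecting the record before it is stored) or as a batch audit during compilation; either way they catch errors early, before the data are queried or shared.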