40 CFR PART 58—AMBIENT AIR QUALITY SURVEILLANCE

(b) Reporting is required by all Metropolitan Statistical Areas with a population exceeding 350,000.

(c) The population of a Metropolitan Statistical Area for purposes of index reporting is the most recent decennial U.S. census population.

[64 FR 42547, Aug. 4, 1999]

Subpart G—Federal Monitoring
Source: 44 FR 27571, May 10, 1979, unless otherwise noted. Redesignated at 58 FR 8467, Feb. 12, 1993.

§ 58.60 Federal monitoring.
The Administrator may locate and operate an ambient air monitoring station if the State fails to locate, or schedule to be located, during the initial network design process or as a result of the annual review required by §58.20(d):

(a) A SLAMS at a site which is necessary in the judgment of the Regional Administrator to meet the objectives defined in appendix D to this part, or

(b) A NAMS at a site which is necessary in the judgment of the Administrator for meeting EPA national data needs.

§ 58.61 Monitoring other pollutants.
The Administrator may promulgate criteria similar to that referenced in subpart B of this part for monitoring a pollutant for which a National Ambient Air Quality Standard does not exist. Such an action would be taken whenever the Administrator determines that a nationwide monitoring program is necessary to monitor such a pollutant.

Appendix A to Part 58—Quality Assurance Requirements for State and Local Air Monitoring Stations (SLAMS)
1. General Information.

1.1 This appendix specifies the minimum quality assurance/quality control (QA/QC) requirements applicable to SLAMS air monitoring data submitted to EPA. State and local agencies are encouraged to develop and maintain quality assurance programs more extensive than the required minimum.

1.2 To assure the quality of data from air monitoring measurements, two distinct and important interrelated functions must be performed. One function is the control of the measurement process through broad quality assurance activities, such as establishing policies and procedures, developing data quality objectives, assigning roles and responsibilities, conducting oversight and reviews, and implementing corrective actions. The other function is the control of the measurement process through the implementation of specific quality control procedures, such as audits, calibrations, checks, replicates, routine self-assessments, etc. In general, the greater the control of a given monitoring system, the better will be the resulting quality of the monitoring data. The results of quality assurance reviews and assessments indicate whether the control efforts are adequate or need to be improved.

1.3 Documentation of all quality assurance and quality control efforts implemented during the data collection, analysis, and reporting phases is important to data users, who can then consider the impact of these control efforts on the data quality (see reference 1 of this appendix). Both qualitative and quantitative assessments of the effectiveness of these control efforts should identify those areas most likely to impact the data quality and to what extent.

1.4 Periodic assessments of SLAMS data quality are required to be reported to EPA. To provide national uniformity in this assessment and reporting of data quality for all SLAMS networks, specific assessment and reporting procedures are prescribed in detail in sections 3, 4, and 5 of this appendix. On the other hand, the selection and extent of the QA and QC activities used by a monitoring agency depend on a number of local factors such as the field and laboratory conditions, the objectives for monitoring, the level of the data quality needed, the expertise of assigned personnel, the cost of control procedures, pollutant concentration levels, etc. Therefore, the quality system requirements, in section 2 of this appendix, are specified in general terms to allow each State to develop a quality assurance program that is most efficient and effective for its own circumstances while achieving the Ambient Air Quality Program's data quality objectives.

2. Quality System Requirements.

2.1 Each State and local agency must develop a quality system (reference 2 of this appendix) to ensure that the monitoring results:

(a) Meet a well-defined need, use, or purpose.

(b) Satisfy customers' expectations.

(c) Comply with applicable standards and specifications.

(d) Comply with statutory (and other) requirements of society.

(e) Reflect consideration of cost and economics.

(f) Implement a quality assurance program consisting of policies, procedures, specifications, standards, and documentation necessary to:

(1) Provide data of adequate quality to meet monitoring objectives, and

(2) Minimize loss of air quality data due to malfunctions or out-of-control conditions. This quality assurance program must be described in detail, suitably documented in accordance with Agency requirements (reference 4 of this appendix), and approved by the appropriate Regional Administrator, or the Regional Administrator's designee. The Quality Assurance Program will be reviewed during the systems audits described in section 2.5 of this appendix.

2.2 Primary requirements and guidance documents for developing the quality assurance program are contained in references 2 through 7 of this appendix, which also contain many suggested and required procedures, checks, and control specifications. Reference 7 of this appendix describes specific guidance for the development of a QA Program for SLAMS. Many specific quality control checks and specifications for methods are included in the respective reference methods described in part 50 of this chapter or in the respective equivalent method descriptions available from EPA (reference 8 of this appendix). Similarly, quality control procedures related to specifically designated reference and equivalent method analyzers are contained in the respective operation or instruction manuals associated with those analyzers. Quality assurance guidance for meteorological systems at PAMS is contained in reference 9 of this appendix. Quality assurance procedures for VOC, NOX (including NO and NO2), O3, and carbonyl measurements at PAMS must be consistent with reference 15 of this appendix. Reference 4 of this appendix includes requirements for the development of quality assurance project plans, and quality assurance and control programs, and systems audits demonstrating attainment of the requirements.

2.3 Pollutant Concentration and Flow Rate Standards.

2.3.1 Gaseous pollutant concentration standards (permeation devices or cylinders of compressed gas) used to obtain test concentrations for CO, SO2, NO, and NO2 must be traceable to either a National Institute of Standards and Technology (NIST) NIST-Traceable Reference Material (NTRM) or a NIST-certified Gas Manufacturer's Internal Standard (GMIS), certified in accordance with one of the procedures given in reference 10 of this appendix.

2.3.2 Test concentrations for O3 must be obtained in accordance with the UV photometric calibration procedure specified in 40 CFR part 50, appendix D, or by means of a certified ozone transfer standard. Consult references 11 and 12 of this appendix for guidance on primary and transfer standards for O3.

2.3.3 Flow rate measurements must be made by a flow measuring instrument that is traceable to an authoritative volume or other applicable standard. Guidance for certifying some types of flowmeters is provided in reference 7 of this appendix.

2.4 National Performance Audit Program (NPAP). Agencies operating SLAMS are required to participate in EPA's NPAP. These audits are described in reference 7 of this appendix. For further instructions, agencies should contact either the appropriate EPA Regional QA Coordinator at the appropriate EPA Regional Office location, or the NPAP Coordinator, Emissions Monitoring and Analysis Division (MD–14), U.S. Environmental Protection Agency, Research Triangle Park, NC 27711.

2.5 Systems Audit Programs. Systems audits of the ambient air monitoring programs of agencies operating SLAMS shall be conducted at least every 3 years by the appropriate EPA Regional Office. Systems audit programs are described in reference 7 of this appendix. For further instructions, agencies should contact either the appropriate EPA Regional QA Coordinator or the Systems Audit QA Coordinator, Office of Air Quality Planning and Standards, Emissions Monitoring and Analysis Division (MD-14), U.S. Environmental Protection Agency, Research Triangle Park, NC 27711.

3. Data Quality Assessment Requirements.

3.0.1 All ambient monitoring methods or analyzers used in SLAMS shall be tested periodically, as described in this section, to quantitatively assess the quality of the SLAMS data. Measurement uncertainty is estimated for both automated and manual methods. Terminology associated with measurement uncertainty is found within this appendix and includes:

(a) Precision. A measurement of mutual agreement among individual measurements of the same property usually under prescribed similar conditions, expressed generally in terms of the standard deviation;

(b) Accuracy. The degree of agreement between an observed value and an accepted reference value. Accuracy includes a combination of random error (precision) and systematic error (bias) components which are due to sampling and analytical operations;

(c) Bias. The systematic or persistent distortion of a measurement process which causes errors in one direction. The individual results of these tests for each method or analyzer shall be reported to EPA as specified in section 4 of this appendix. EPA will then calculate quarterly assessments of measurement uncertainty applicable to the SLAMS data as described in section 5 of this appendix. Data assessment results should be reported to EPA only for methods and analyzers approved for use in SLAMS monitoring under appendix C of this part.

3.0.2 Estimates of the data quality will be calculated on the basis of single monitors and reporting organizations and may also be calculated for each region and for the entire Nation. A reporting organization is defined as a State, subordinate organization within a State, or other organization that is responsible for a set of stations that monitors the same pollutant and for which data quality assessments can be pooled. States must define one or more reporting organizations for each pollutant such that each monitoring station in the State SLAMS network is included in one, and only one, reporting organization.

3.0.3 Each reporting organization shall be defined such that measurement uncertainty among all stations in the organization can be expected to be reasonably homogeneous, as a result of common factors.

(a) Common factors that should be considered by States in defining reporting organizations include:

(1) Operation by a common team of field operators.

(2) Common calibration facilities.

(3) Oversight by a common quality assurance organization.

(4) Support by a common laboratory or headquarters.

(b) Where there is uncertainty in defining the reporting organizations or in assigning specific sites to reporting organizations, States shall consult with the appropriate EPA Regional Office. All definitions of reporting organizations shall be subject to final approval by the appropriate EPA Regional Office.

3.0.4 Assessment results shall be reported as specified in section 4 of this appendix. Table A–1 of this appendix provides a summary of the minimum data quality assessment requirements, which are described in more detail in the following sections.

3.1 Precision of Automated Methods Excluding PM2.5.

3.1.1 Methods for SO2, NO2, O3 and CO. A one-point precision check must be performed at least once every 2 weeks on each automated analyzer used to measure SO2, NO2, O3 and CO. The precision check is made by challenging the analyzer with a precision check gas of known concentration (effective concentration for open path analyzers) between 0.08 and 0.10 ppm for SO2, NO2, and O3 analyzers, and between 8 and 10 ppm for CO analyzers. To check the precision of SLAMS analyzers operating on ranges higher than 0 to 1.0 ppm SO2, NO2, and O3, or 0 to 100 ppm for CO, use precision check gases of appropriately higher concentration as approved by the appropriate Regional Administrator or their designee. However, the results of precision checks at concentration levels other than those specified above need not be reported to EPA. The standards from which precision check test concentrations are obtained must meet the specifications of section 2.3 of this appendix.

3.1.1.1 Except for certain CO analyzers described below, point analyzers must operate in their normal sampling mode during the precision check, and the test atmosphere must pass through all filters, scrubbers, conditioners and other components used during normal ambient sampling and as much of the ambient air inlet system as is practicable. If permitted by the associated operation or instruction manual, a CO point analyzer may be temporarily modified during the precision check to reduce vent or purge flows, or the test atmosphere may enter the analyzer at a point other than the normal sample inlet, provided that the analyzer's response is not likely to be altered by these deviations from the normal operational mode. If a precision check is made in conjunction with a zero or span adjustment, it must be made prior to such zero or span adjustments. Randomization of the precision check with respect to time of day, day of week, and routine service and adjustments is encouraged where possible.

3.1.1.2 Open path analyzers are tested by inserting a test cell containing a precision check gas concentration into the optical measurement beam of the instrument. If possible, the normally used transmitter, receiver, and as appropriate, reflecting devices should be used during the test, and the normal monitoring configuration of the instrument should be altered as little as possible to accommodate the test cell for the test. However, if permitted by the associated operation or instruction manual, an alternate local light source or an alternate optical path that does not include the normal atmospheric monitoring path may be used. The actual concentration of the precision check gas in the test cell must be selected to produce an effective concentration in the range specified in section 3.1.1. Generally, the precision test concentration measurement will be the sum of the atmospheric pollutant concentration and the precision test concentration. If so, the result must be corrected to remove the atmospheric concentration contribution. The corrected concentration is obtained by subtracting the average of the atmospheric concentrations measured by the open path instrument under test immediately before and immediately after the precision check test from the precision test concentration measurement. If the difference between these before and after measurements is greater than 20 percent of the effective concentration of the test gas, discard the test result and repeat the test. If possible, open path analyzers should be tested during periods when the atmospheric pollutant concentrations are relatively low and steady.
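As an illustration of the correction described above (an editorial sketch, not part of the rule; the function and variable names are hypothetical), the corrected concentration and the 20 percent validity screen can be computed as follows:

    def corrected_precision_check(measured_total, ambient_before, ambient_after,
                                  effective_test_conc):
        """Correct an open path precision-check reading for the ambient contribution.

        measured_total       -- reading with the test cell in the measurement beam
        ambient_before/after -- ambient readings immediately before and after the check
        effective_test_conc  -- effective concentration of the precision check gas

        Returns the corrected concentration, or None when the before and after ambient
        readings differ by more than 20 percent of the effective test concentration,
        in which case the test result is discarded and the test repeated.
        """
        if abs(ambient_before - ambient_after) > 0.20 * effective_test_conc:
            return None  # discard the test result and repeat the test
        ambient_average = (ambient_before + ambient_after) / 2.0
        return measured_total - ambient_average

    # Example: 0.09 ppm effective test gas with ambient O3 near 0.03 ppm
    print(corrected_precision_check(0.118, 0.031, 0.029, 0.09))  # about 0.088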

3.1.1.3 Report the actual concentration (effective concentration for open path analyzers) of the precision check gas and the corresponding concentration measurement (corrected concentration, if applicable, for open path analyzers) indicated by the analyzer. The percent differences between these concentrations are used to assess the precision of the monitoring data as described in section 5.1 of this appendix.

3.1.2 Methods for Particulate Matter Excluding PM2.5. A one-point precision check must be performed at least once every 2 weeks on each automated analyzer used to measure PM10. The precision check is made by checking the operational flow rate of the analyzer. If a precision flow rate check is made in conjunction with a flow rate adjustment, it must be made prior to such flow rate adjustment. Randomization of the precision check with respect to time of day, day of week, and routine service and adjustments is encouraged where possible.

3.1.2.1 Standard procedure: Use a flow rate transfer standard certified in accordance with section 2.3.3 of this appendix to check the analyzer's normal flow rate. Care should be used in selecting and using the flow rate measurement device such that it does not alter the normal operating flow rate of the analyzer. Report the actual analyzer flow rate measured by the transfer standard and the corresponding flow rate measured, indicated, or assumed by the analyzer.

3.1.2.2 Alternative procedure:

3.1.2.2.1 It is permissible to obtain the precision check flow rate data from the analyzer's internal flow meter without the use of an external flow rate transfer standard, provided that:

3.1.2.2.1.1 The flow meter is audited with an external flow rate transfer standard at least every 6 months.

3.1.2.2.1.2 Records of at least the three most recent flow audits of the instrument's internal flow meter over at least several weeks confirm that the flow meter is stable, verifiable and accurate to ±4%.

3.1.2.2.1.3 The instrument and flow meter give no indication of improper operation.

3.1.2.2.2 With suitable communication capability, the precision check may thus be carried out remotely. For this procedure, report the set-point flow rate as the actual flow rate along with the flow rate measured or indicated by the analyzer flow meter.

3.1.2.2.3 For either procedure, the percent differences between the actual and indicated flow rates are used to assess the precision of the monitoring data as described in section 5.1 of this appendix (using flow rates in lieu of concentrations).
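For illustration only (the helper names below are hypothetical, not part of the appendix), the flow-based percent difference and the 4 percent screen for the analyzer's internal flow meter could be computed as:

    def percent_difference(indicated, actual):
        """Percent difference used for precision assessment (section 5.1),
        with flow rates used in lieu of concentrations."""
        return (indicated - actual) / actual * 100.0

    def internal_meter_acceptable(audit_pairs):
        """Screen for the alternative procedure: at least the three most recent
        flow audits must show the internal flow meter accurate to within 4 percent.
        Each element of audit_pairs is (meter_reading, transfer_standard_reading)."""
        if len(audit_pairs) < 3:
            return False
        return all(abs(percent_difference(m, s)) <= 4.0 for m, s in audit_pairs)

    # Example: indicated 16.9 L/min versus 16.67 L/min from the transfer standard
    print(round(percent_difference(16.9, 16.67), 2))                              # 1.38
    print(internal_meter_acceptable([(16.7, 16.6), (16.8, 16.7), (16.6, 16.7)]))  # True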

3.2 Accuracy of Automated Methods Excluding PM2.5.

3.2.1 Methods for SO2, NO2, O3, or CO.

3.2.1.1 Each calendar quarter (during which analyzers are operated), audit at least 25 percent of the SLAMS analyzers that monitor for SO2, NO2, O3, or CO such that each analyzer is audited at least once per year. If there are fewer than four analyzers for a pollutant within a reporting organization, randomly reaudit one or more analyzers so that at least one analyzer for that pollutant is audited each calendar quarter. Where possible, EPA strongly encourages more frequent auditing, up to an audit frequency of once per quarter for each SLAMS analyzer.

3.2.1.2 (a) The audit is made by challenging the analyzer with at least one audit gas of known concentration (effective concentration for open path analyzers) from each of the following ranges applicable to the analyzer being audited:



Audit Level     SO2, O3 (ppm)     NO2 (ppm)       CO (ppm)
1               0.03-0.08         0.03-0.08       3-8
2               0.15-0.20         0.15-0.20       15-20
3               0.35-0.45         0.35-0.45       35-45
4               0.80-0.90         (none)          80-90


(b) NO2 audit gas for chemiluminescence-type NO2 analyzers must also contain at least 0.08 ppm NO.
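The audit levels in the table lend themselves to a simple lookup. The sketch below is illustrative only (the names are hypothetical); it returns the level into which a given audit gas concentration falls.

    # Audit concentration ranges in ppm, taken from the table above.
    AUDIT_LEVELS = {
        "SO2": [(0.03, 0.08), (0.15, 0.20), (0.35, 0.45), (0.80, 0.90)],
        "O3":  [(0.03, 0.08), (0.15, 0.20), (0.35, 0.45), (0.80, 0.90)],
        "NO2": [(0.03, 0.08), (0.15, 0.20), (0.35, 0.45)],
        "CO":  [(3, 8), (15, 20), (35, 45), (80, 90)],
    }

    def audit_level(pollutant, concentration_ppm):
        """Return the audit level (1-4) whose range contains the concentration,
        or None if the concentration lies outside every listed range."""
        for level, (low, high) in enumerate(AUDIT_LEVELS[pollutant], start=1):
            if low <= concentration_ppm <= high:
                return level
        return None

    print(audit_level("O3", 0.18))  # 2
    print(audit_level("CO", 42))    # 3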

3.2.1.3 NO concentrations substantially higher than 0.08 ppm, as may occur when using some gas phase titration (GPT) techniques, may lead to audit errors in chemiluminescence analyzers due to inevitable minor NO-NOX channel imbalance. Such errors may be atypical of routine monitoring errors to the extent that such NO concentrations exceed typical ambient NO concentrations at the site. These errors may be minimized by modifying the GPT technique to lower the NO concentrations remaining in the NO2 audit gas to levels closer to typical ambient NO concentrations at the site.

3.2.1.4 To audit SLAMS analyzers operating on ranges higher than 0 to 1.0 ppm for SO2, NO2, and O3 or 0 to 100 ppm for CO, use audit gases of appropriately higher concentration as approved by the appropriate Regional Administrator or the Administrator's designee. The results of audits at concentration levels other than those shown in the above table need not be reported to EPA.

3.2.1.5 The standards from which audit gas test concentrations are obtained must meet the specifications of section 2.3 of this appendix. The gas standards and equipment used for auditing must not be the same as the standards and equipment used for calibration or calibration span adjustments. The auditor should not be the operator or analyst who conducts the routine monitoring, calibration, and analysis.

3.2.1.6 For point analyzers, the audit shall be carried out by allowing the analyzer to analyze the audit test atmosphere in its normal sampling mode such that the test atmosphere passes through all filters, scrubbers, conditioners, and other sample inlet components used during normal ambient sampling and as much of the ambient air inlet system as is practicable. The exception provided in section 3.1 of this appendix for certain CO analyzers does not apply for audits.

3.2.1.7 Open path analyzers are audited by inserting a test cell containing the various audit gas concentrations into the optical measurement beam of the instrument. If possible, the normally used transmitter, receiver, and, as appropriate, reflecting devices should be used during the audit, and the normal monitoring configuration of the instrument should be modified as little as possible to accommodate the test cell for the audit. However, if permitted by the associated operation or instruction manual, an alternate local light source or an alternate optical path that does not include the normal atmospheric monitoring path may be used. The actual concentrations of the audit gas in the test cell must be selected to produce effective concentrations in the ranges specified in this section 3.2 of this appendix. Generally, each audit concentration measurement result will be the sum of the atmospheric pollutant concentration and the audit test concentration. If so, the result must be corrected to remove the atmospheric concentration contribution. The corrected concentration is obtained by subtracting the average of the atmospheric concentrations measured by the open path instrument under test immediately before and immediately after the audit test (or preferably before and after each audit concentration level) from the audit concentration measurement. If the difference between the before and after measurements is greater than 20 percent of the effective concentration of the test gas standard, discard the test result for that concentration level and repeat the test for that level. If possible, open path analyzers should be audited during periods when the atmospheric pollutant concentrations are relatively low and steady. Also, the monitoring path length must be reverified to within ±3 percent to validate the audit, since the monitoring path length is critical to the determination of the effective concentration.

3.2.1.8 Report both the actual concentrations (effective concentrations for open path analyzers) of the audit gases and the corresponding concentration measurements (corrected concentrations, if applicable, for open path analyzers) indicated or produced by the analyzer being tested. The percent differences between these concentrations are used to assess the accuracy of the monitoring data as described in section 5.2 of this appendix.

3.2.2 Methods for Particulate Matter Excluding PM2.5.

3.2.2.1 Each calendar quarter, audit the flow rate of at least 25 percent of the SLAMS PM10 analyzers such that each PM10 analyzer is audited at least once per year. If there are fewer than four PM10 analyzers within a reporting organization, randomly re-audit one or more analyzers so that at least one analyzer is audited each calendar quarter. Where possible, EPA strongly encourages more frequent auditing, up to an audit frequency of once per quarter for each SLAMS analyzer.

3.2.2.2 The audit is made by measuring the analyzer's normal operating flow rate, using a flow rate transfer standard certified in accordance with section 2.3.3 of this appendix. The flow rate standard used for auditing must not be the same flow rate standard used to calibrate the analyzer. However, both the calibration standard and the audit standard may be referenced to the same primary flow rate or volume standard. Great care must be used in auditing the flow rate to be certain that the flow measurement device does not alter the normal operating flow rate of the analyzer. Report the audit (actual) flow rate and the corresponding flow rate indicated or assumed by the sampler. The percent differences between these flow rates are used to calculate accuracy (PM10) as described in section 5.2 of this appendix.

3.3 Precision of Manual Methods Excluding PM2.5.

3.3.1 For each network of manual methods other than for PM2.5, select one or more monitoring sites within the reporting organization for duplicate, collocated sampling as follows: for 1 to 5 sites, select 1 site; for 6 to 20 sites, select 2 sites; and for over 20 sites, select 3 sites. Where possible, additional collocated sampling is encouraged. For purposes of precision assessment, networks for measuring TSP and PM10 shall be considered separately from one another. PM10 and TSP sites having annual mean particulate matter concentrations among the highest 25 percent of the annual mean concentrations for all the sites in the network must be selected or, if such sites are impractical, alternative sites approved by the Regional Administrator may be selected.
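The site-count rule above can be stated compactly. The following is an editorial sketch (function name hypothetical), not regulatory text:

    def collocated_sites_required(sites_in_network):
        """Minimum number of collocated sites for a manual-method network other
        than PM2.5: 1 site for 1 to 5 sites, 2 for 6 to 20 sites, 3 for over 20."""
        if sites_in_network <= 0:
            return 0
        if sites_in_network <= 5:
            return 1
        if sites_in_network <= 20:
            return 2
        return 3

    for n in (3, 12, 37):
        print(n, "sites ->", collocated_sites_required(n), "collocated site(s)")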

3.3.2 In determining the number of collocated sites required for PM10, monitoring networks for lead should be treated independently from networks for particulate matter, even though the separate networks may share one or more common samplers. However, a single pair of samplers collocated at a common-sampler monitoring site that meets the requirements for both a collocated lead site and a collocated particulate matter site may serve as a collocated site for both networks.

3.3.3 The two collocated samplers must be within 4 meters of each other, and particulate matter samplers must be at least 2 meters apart to preclude airflow interference. Calibration, sampling, and analysis must be the same for both collocated samplers and the same as for all other samplers in the network.

3.3.4 For each pair of collocated samplers, designate one sampler as the primary sampler whose samples will be used to report air quality for the site, and designate the other as the duplicate sampler. Each duplicate sampler must be operated concurrently with its associated routine sampler at least once per week. The operation schedule should be selected so that the sampling days are distributed evenly over the year and over the seven days of the week. A six-day sampling schedule is required. Report the measurements from both samplers at each collocated sampling site. The calculations for evaluating precision between the two collocated samplers are described in section 5.3 of this appendix.

3.4 Accuracy of Manual Methods Excluding PM2.5. The accuracy of manual sampling methods is assessed by auditing a portion of the measurement process.

3.4.1 Procedures for PM10 and TSP.

3.4.1.1 Procedures for flow rate audits for PM10. Each calendar quarter, audit the flow rate of at least 25 percent of the PM10 samplers such that each PM10 sampler is audited at least once per year. If there are fewer than four PM10 samplers within a reporting organization, randomly reaudit one or more samplers so that one sampler is audited each calendar quarter. Audit each sampler at its normal operating flow rate, using a flow rate transfer standard certified in accordance with section 2.3.3 of this appendix. The flow rate standard used for auditing must not be the same flow rate standard used to calibrate the sampler. However, both the calibration standard and the audit standard may be referenced to the same primary flow rate standard. The flow audit should be scheduled so as to avoid interference with a scheduled sampling period. Report the audit (actual) flow rate and the corresponding flow rate indicated by the sampler's normally used flow indicator. The percent differences between these flow rates are used to calculate accuracy and bias as described in section 5.4.1 of this appendix.

3.4.1.2 Great care must be used in auditing high-volume particulate matter samplers having flow regulators because the introduction of resistance plates in the audit flow standard device can cause abnormal flow patterns at the point of flow sensing. For this reason, the flow audit standard should be used with a normal filter in place and without resistance plates in auditing flow-regulated high-volume samplers, or other steps should be taken to assure that flow patterns are not perturbed at the point of flow sensing.

3.4.2 SO2 Methods.

3.4.2.1 Prepare audit solutions from a working sulfite-tetrachloromercurate (TCM) solution as described in section 10.2 of the SO2 Reference Method (40 CFR part 50, appendix A). These audit samples must be prepared independently from the standardized sulfite solutions used in the routine calibration procedure. Sulfite-TCM audit samples must be stored between 0 and 5 °C and expire 30 days after preparation.

3.4.2.2 Prepare audit samples in each of the concentration ranges of 0.2-0.3, 0.5-0.6, and 0.8-0.9 µg SO2/ml. Analyze an audit sample in each of the three ranges at least once each day that samples are analyzed and at least twice per calendar quarter. Report the audit concentrations (in µg SO2/ml) and the corresponding indicated concentrations (in µg SO2/ml). The percent differences between these concentrations are used to calculate accuracy as described in section 5.4.2 of this appendix.

3.4.3 NO2 Methods. Prepare audit solutions from a working sodium nitrite solution as described in the appropriate equivalent method (see reference 8 of this appendix). These audit samples must be prepared independently from the standardized nitrite solutions used in the routine calibration procedure. Sodium nitrite audit samples expire in 3 months after preparation. Prepare audit samples in each of the concentration ranges of 0.2-0.3, 0.5-0.6, and 0.8-0.9 µg NO2/ml. Analyze an audit sample in each of the three ranges at least once each day that samples are analyzed and at least twice per calendar quarter. Report the audit concentrations (in µg NO2/ml) and the corresponding indicated concentrations (in µg NO2/ml). The percent differences between these concentrations are used to calculate accuracy as described in section 5.4.2 of this appendix.

3.4.4 Pb Methods.

3.4.4.1 For the Pb Reference Method (40 CFR part 50, appendix G), the flow rates of the high-volume Pb samplers shall be audited as part of the TSP network using the same procedures described in section 3.4.1 of this appendix. For agencies operating both TSP and Pb networks, 25 percent of the total number of high-volume samplers are to be audited each quarter.

3.4.4.2 Each calendar quarter, audit the Pb Reference Method analytical procedure using glass fiber filter strips containing a known quantity of Pb. These audit sample strips are prepared by depositing a Pb solution on unexposed glass fiber filter strips of dimensions 1.9 cm by 20.3 cm (3/4 inch by 8 inch) and allowing them to dry thoroughly. The audit samples must be prepared using batches of reagents different from those used to calibrate the Pb analytical equipment being audited. Prepare audit samples in the following concentration ranges:



Range     Pb Concentration (µg/strip)     Equivalent Ambient Pb Concentration (µg/m3) \1\
1         100-300                         0.5-1.5
2         600-1000                        3.0-5.0

\1\ Equivalent ambient Pb concentration in µg/m3 is based on sampling at 1.7 m3/min for 24 hours on a 20.3 cm x 25.4 cm (8 inch x 10 inch) glass fiber filter.
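The footnote's conversion can be checked roughly as follows. This is an editorial sketch: the factor of 12 strips per exposed filter is an assumption chosen because it reproduces the tabulated equivalent concentrations, not a value stated in the appendix.

    # Rough check of the equivalent ambient Pb concentrations in the table above.
    FLOW_M3_PER_MIN = 1.7         # nominal high-volume sampler flow rate
    MINUTES_PER_DAY = 24 * 60
    STRIPS_PER_FILTER = 12        # assumed ratio of exposed filter area to strip area

    def equivalent_ambient_pb(ug_per_strip):
        """Approximate equivalent ambient Pb concentration (ug/m3) for a given
        audit strip loading, under the footnote's 24-hour sampling conditions."""
        air_volume_m3 = FLOW_M3_PER_MIN * MINUTES_PER_DAY  # about 2448 m3 per day
        return ug_per_strip * STRIPS_PER_FILTER / air_volume_m3

    print(round(equivalent_ambient_pb(100), 2))   # about 0.49 ug/m3, low end of Range 1
    print(round(equivalent_ambient_pb(1000), 2))  # about 4.9 ug/m3, high end of Range 2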


3.4.4.3 Audit samples must be extracted using the same extraction procedure used for exposed filters.

3.4.4.4 Analyze three audit samples in each of the two ranges each quarter samples are analyzed. The audit sample analyses shall be distributed as much as possible over the entire calendar quarter. Report the audit concentrations (in µg Pb/strip) and the corresponding measured concentrations (in µg Pb/strip) using unit code 77. The percent differences between the concentrations are used to calculate analytical accuracy as described in section 5.4.2 of this appendix.

3.4.4.5 The accuracy of an equivalent Pb method is assessed in the same manner as for the reference method. The flow auditing device and Pb analysis audit samples must be compatible with the specific requirements of the equivalent method.

3.5 Measurement Uncertainty for Automated and Manual PM2.5 Methods. The goal for acceptable measurement uncertainty has been defined as 10 percent coefficient of variation (CV) for total precision and ±10 percent for total bias (reference 14 of this appendix).

3.5.1 Flow Rate Audits.

3.5.1.1 Automated methods for PM2.5. A one-point precision check must be performed at least once every 2 weeks on each automated analyzer used to measure PM2.5. The precision check is made by checking the operational flow rate of the analyzer. If a precision flow rate check is made in conjunction with a flow rate adjustment, it must be made prior to such flow rate adjustment. Randomization of the precision check with respect to time of day, day of week, and routine service and adjustments is encouraged where possible.

3.5.1.1.1 Standard procedure: Use a flow rate transfer standard certified in accordance with section 2.3.3 of this appendix to check the analyzer's normal flow rate. Care should be used in selecting and using the flow rate measurement device such that it does not alter the normal operating flow rate of the analyzer. Report the actual analyzer flow rate measured by the transfer standard and the corresponding flow rate measured, indicated, or assumed by the analyzer.

3.5.1.1.2 Alternative procedure: It is permissible to obtain the precision check flow rate data from the analyzer's internal flow meter without the use of an external flow rate transfer standard, provided that the flow meter is audited with an external flow rate transfer standard at least every 6 months; records of at least the three most recent flow audits of the instrument's internal flow meter over at least several weeks confirm that the flow meter is stable, verifiable and accurate to ±4%; and the instrument and flow meter give no indication of improper operation. With suitable communication capability, the precision check may thus be carried out remotely. For this procedure, report the set-point flow rate as the actual flow rate along with the flow rate measured or indicated by the analyzer flow meter.

3.5.1.1.3 For either procedure, the differences between the actual and indicated flow rates are used to assess the precision of the monitoring data as described in section 5.5 of this appendix.

3.5.1.2 Manual methods for PM2.5. Each calendar quarter, audit the flow rate of each SLAMS PM2.5 analyzer. The audit is made by measuring the analyzer's normal operating flow rate, using a flow rate transfer standard certified in accordance with section 2.3.3 of this appendix. The flow rate standard used for auditing must not be the same flow rate standard used to calibrate the analyzer. However, both the calibration standard and the audit standard may be referenced to the same primary flow rate or volume standard. Great care must be used in auditing the flow rate to be certain that the flow measurement device does not alter the normal operating flow rate of the analyzer. Report the audit (actual) flow rate and the corresponding flow rate indicated or assumed by the sampler. The procedures used to calculate measurement uncertainty PM2.5 are described in section 5.5 of this appendix.

3.5.2 Measurement of Precision using Collocated Procedures for Automated and Manual Methods of PM2.5.

(a) For PM2.5 sites within a reporting organization each EPA designated Federal reference method (FRM) or Federal equivalent method (FEM) must:

(1) Have 15 percent of the monitors collocated (values of .5 and greater round up).

(2) Have at least 1 collocated monitor (if the total number of monitors is less than 4). The first collocated monitor must be a designated FRM monitor.

(b) In addition, monitors selected must also meet the following requirements:

(1) A monitor designated as an EPA FRM shall be collocated with a monitor having the same EPA FRM designation.

(2) For each monitor designated as an EPA FEM, 50 percent of the designated monitors shall be collocated with a monitor having the same method designation and 50 percent of the monitors shall be collocated with an FRM monitor. If there are an odd number of collocated monitors required, the additional monitor shall be an FRM. An example of this procedure is found in table A–2 of this appendix.

(c) For PM2.5 sites during the initial deployment of the SLAMS network, special emphasis should be placed on those sites in areas likely to be in violation of the NAAQS. Once areas are initially determined to be in violation, the collocated monitors should be deployed according to the following protocol:

(1) Eighty percent of the collocated monitors should be deployed at sites with concentrations ≥ ninety percent of the annual PM2.5 NAAQS (or 24-hour NAAQS if that is affecting the area); one hundred percent if all sites have concentrations above either NAAQS, and each area determined to be in violation should be represented by at least one collocated monitor.

(2) The remaining 20 percent of the collocated monitors should be deployed at sites with concentrations < ninety percent of the annual PM2.5 NAAQS (or 24-hour NAAQS if that is affecting the area).

(3) If an organization has no sites at concentration ranges ≥ ninety percent of the annual PM2.5 NAAQS (or 24-hour NAAQS if that is affecting the area), 60 percent of the collocated monitors should be deployed at those sites with the annual mean PM2.5 concentrations (or 24-hour NAAQS if that is affecting the area) among the highest 25 percent for all PM2.5 sites in the network.
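The counting rules in paragraphs (a) and (b) of this section 3.5.2 can be illustrated with a short sketch (editorial only, names hypothetical; see table A-2 of this appendix for the official example):

    import math

    def collocated_required(monitors_of_designation):
        """Paragraph (a): collocate 15 percent of the monitors of a method
        designation (values of .5 and greater round up), with at least one
        collocated monitor when the designation has fewer than four monitors."""
        if monitors_of_designation == 0:
            return 0
        count = math.floor(0.15 * monitors_of_designation + 0.5)  # .5 rounds up
        return max(count, 1)

    def fem_collocation_split(n_collocated):
        """Paragraph (b)(2): half of an FEM designation's collocated monitors are
        paired with the same designation and half with an FRM; an odd remainder
        goes to the FRM side."""
        same_designation = n_collocated // 2
        with_frm = n_collocated - same_designation
        return same_designation, with_frm

    print(collocated_required(20))    # 3 collocated monitors for 20 of one designation
    print(collocated_required(3))     # 1 (fewer than four monitors)
    print(fem_collocation_split(3))   # (1, 2): one with the same FEM, two with an FRM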

3.5.2.1 In determining the number of collocated sites required for PM2.5, monitoring networks for visibility should not be treated independently from networks for particulate matter, as the separate networks may share one or more common samplers. However, for class I visibility areas, EPA will accept visibility aerosol mass measurement instead of a PM2.5 measurement if the latter measurement is unavailable. Any PM2.5 monitoring site which does not have a monitor which is an EPA federal reference or equivalent method is not required to be included in the number of sites which are used to determine the number of collocated monitors.

3.5.2.2 The two collocated samplers must be within 4 meters of each other, and particulate matter samplers must be at least 2 meters apart (1 meter apart for samplers having flow rates less than 200 liters/min.) to preclude airflow interference. Calibration, sampling, and analysis must be the same for both collocated samplers and the same as for all other samplers in the network.

3.5.2.3 For each pair of collocated samplers, designate one sampler as the primary sampler whose samples will be used to report air quality for the site, and designate the other as the duplicate sampler. Each duplicate sampler must be operated concurrently with its associated primary sampler. The operation schedule should be selected so that the sampling days are distributed evenly over the year and over the 7 days of the week; therefore, a 6-day sampling schedule is required. Report the measurements from both samplers at each collocated sampling site. The calculations for evaluating precision between the two collocated samplers are described in section 5.5 of this appendix.

3.5.3 Measurement of Bias using the FRM Audit Procedures for Automated and Manual Methods of PM2.5.

3.5.3.1 The FRM audit is an independent assessment of the total measurement system bias. These audits will be performed under the National Performance Audit Program (section 2.4 of this appendix) or a comparable program. Twenty-five percent of the SLAMS monitors within each reporting organization will be assessed with an FRM audit each year. Additionally, every designated FRM or FEM within a reporting organization must:

(a) Have at least 25 percent of each method designation audited, including collocated sites (even those collocated with FRM instruments); values of .5 and greater round up.

(b) Have at least one monitor audited.

(c) Be audited at a frequency of four audits per year.

(d) Have all FRM or FEM samplers subject to an FRM audit at least once every 4 years. Table A–2 illustrates the procedure mentioned above.
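As with collocation, the audit-count rule in paragraphs (a) and (b) above can be sketched briefly (editorial only; paragraphs (c) and (d) impose the additional frequency and 4-year coverage requirements):

    import math

    def frm_audits_per_year(monitors_of_designation):
        """Paragraphs (a) and (b): audit at least 25 percent of each method
        designation each year (values of .5 and greater round up), and audit
        at least one monitor of the designation."""
        if monitors_of_designation == 0:
            return 0
        return max(math.floor(0.25 * monitors_of_designation + 0.5), 1)

    print(frm_audits_per_year(10))  # 3 (2.5 rounds up)
    print(frm_audits_per_year(2))   # 1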

3.5.3.2 For PM2.5 sites during the initial deployment of the SLAMS network, special emphasis should be placed on those sites in areas likely to be in violation of the NAAQS. Once areas are initially determined to be in violation, the FRM audit program should be implemented according to the following protocol:

(a) Eighty percent of the FRM audits should be deployed at sites with concentrations ≥ ninety percent of the annual PM2.5 NAAQS (or 24-hour NAAQS if that is affecting the area); one hundred percent if all sites have concentrations above either NAAQS, and each area determined to be in violation should implement an FRM audit at a minimum of one monitor within that area.

(b) The remaining 20 percent of the FRM audits should be implemented at sites with concentrations < ninety percent of the annual PM2.5 NAAQS (or 24-hour NAAQS if that is affecting the area).

(c) If an organization has no sites at concentration ranges ≥ ninety percent of the annual PM2.5 NAAQS (or 24-hour NAAQS if that is affecting the area), 60 percent of the FRM audits should be implemented at those sites with the annual mean PM2.5 concentrations (or 24-hour NAAQS if that is affecting the area) among the highest 25 percent for all PM2.5 sites in the network. Additional information concerning the FRM audit program is contained in reference 7 of this appendix. The calculations for evaluating bias between the primary monitor and the FRM audit are described in section 5.5.

4. Reporting Requirements.

(a) For each pollutant, prepare a list of all monitoring sites and their AIRS site identification codes in each reporting organization and submit the list to the appropriate EPA Regional Office, with a copy to AIRS-AQS. Whenever there is a change in this list of monitoring sites in a reporting organization, report this change to the Regional Office and to AIRS-AQS.

4.1 Quarterly Reports. For each quarter, each reporting organization shall report to AIRS-AQS directly (or via the appropriate EPA Regional Office for organizations not direct users of AIRS) the results of all valid precision, bias and accuracy tests it has carried out during the quarter. The quarterly reports of precision, bias and accuracy data must be submitted consistent with the data reporting requirements specified for air quality data as set forth in §58.35(c). EPA strongly encourages early submittal of the QA data in order to assist the State and Local agencies in controlling and evaluating the quality of the ambient air SLAMS data. Each organization shall report all QA/QC measurements. Report results from invalid tests, from tests carried out during a time period for which ambient data immediately prior or subsequent to the tests were invalidated for appropriate reasons, and from tests of methods or analyzers not approved for use in SLAMS monitoring networks under appendix C of this part. Such data should be flagged so that it will not be utilized for quantitative assessment of precision, bias and accuracy.

4.2 Annual Reports.

4.2.1 When precision, bias and accuracy estimates for a reporting organization have been calculated for all four quarters of the calendar year, EPA will calculate and report the measurement uncertainty for the entire calendar year. These limits will then be associated with the data submitted in the annual SLAMS report required by §58.26.

4.2.2 Each reporting organization shall submit, along with its annual SLAMS report, a listing by pollutant of all monitoring sites in the reporting organization.

5. Calculations for Data Quality Assessment.

(a) Calculations of measurement uncertainty are carried out by EPA according to the following procedures. Reporting organizations should report the data for individual precision, bias and accuracy tests as specified in sections 3 and 4 of this appendix even though they may elect to perform some or all of the calculations in this section on their own.

5.1 Precision of Automated Methods Excluding PM2.5. Estimates of the precision of automated methods are calculated from the results of biweekly precision checks as specified in section 3.1 of this appendix. At the end of each calendar quarter, an integrated precision probability interval for all SLAMS analyzers in the organization is calculated for each pollutant.

5.1.1 Single Analyzer Precision.

5.1.1.1 The percent difference (di) for each precision check is calculated using equation 1, where Yi is the concentration indicated by the analyzer for the i-th precision check and Xi is the known concentration for the i-th precision check, as follows:

Equation 1:  di = [(Yi - Xi) / Xi] × 100
5.1.1.2 For each analyzer, the quarterly average (dj) is calculated with equation 2, and the standard deviation (Sj) with equation 3, where n is the number of precision checks on the instrument made during the calendar quarter. For example, n should be 6 or 7 if precision checks are made biweekly during a quarter. Equations 2 and 3 follow:

Equation 2:  dj = (1/n) Σ di
Equation 3:  Sj = sqrt[ Σ (di - dj)² / (n - 1) ]
5.1.2 Precision for Reporting Organization.

5.1.2.1 For each pollutant, the average of averages (D) and the pooled standard deviation (Sa) are calculated for all analyzers audited for the pollutant during the quarter, using either equations 4 and 5 or 4a and 5a, where k is the number of analyzers audited within the reporting organization for a single pollutant, as follows:

Equation 4:   D = (1/k) Σ dj
Equation 4a:  D = (Σ nj dj) / (Σ nj)
Equation 5:   Sa = sqrt[ (Σ Sj²) / k ]
Equation 5a:  Sa = sqrt[ Σ (nj - 1) Sj² / Σ (nj - 1) ]

(In equations 4a and 5a, nj is the number of precision checks made on the j-th analyzer; the sums run over the k analyzers.)
5.1.2.2 Equations 4 and 5 are used when the same number of precision checks are made for each analyzer. Equations 4a and 5a are used to obtain a weighted average and a weighted standard deviation when different numbers of precision checks are made for the analyzers.

5.1.2.3 For each pollutant, the 95 Percent Probability Limits for the precision of a reporting organization are calculated using equations 6 and 7, as follows:

Equation 6:  Upper 95 Percent Probability Limit = D + 1.96 Sa
Equation 7:  Lower 95 Percent Probability Limit = D - 1.96 Sa
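To make the sequence from equation 1 through equations 6 and 7 concrete, the following sketch (editorial only, assuming an equal number of checks per analyzer so that equations 4 and 5 apply) carries biweekly precision-check results through to a reporting organization's 95 percent probability limits:

    import math

    def percent_difference(indicated, known):
        """Equation 1: percent difference for a single precision check."""
        return (indicated - known) / known * 100.0

    def analyzer_quarter_stats(diffs):
        """Equations 2 and 3: quarterly average and standard deviation of the
        percent differences for one analyzer."""
        n = len(diffs)
        avg = sum(diffs) / n
        sd = math.sqrt(sum((d - avg) ** 2 for d in diffs) / (n - 1))
        return avg, sd

    def organization_probability_limits(per_analyzer_stats):
        """Equations 4 through 7 for equal numbers of checks: average of the
        analyzer averages, pooled standard deviation, and the lower and upper
        95 percent probability limits for the reporting organization."""
        k = len(per_analyzer_stats)
        d_avg = sum(avg for avg, _ in per_analyzer_stats) / k              # Equation 4
        s_a = math.sqrt(sum(sd ** 2 for _, sd in per_analyzer_stats) / k)  # Equation 5
        return d_avg - 1.96 * s_a, d_avg + 1.96 * s_a                      # Equations 7 and 6

    # Two analyzers, six biweekly checks each: (indicated ppm, known ppm)
    checks = [
        [(0.091, 0.090), (0.088, 0.090), (0.093, 0.090),
         (0.089, 0.090), (0.092, 0.090), (0.090, 0.090)],
        [(0.085, 0.090), (0.087, 0.090), (0.089, 0.090),
         (0.086, 0.090), (0.088, 0.090), (0.090, 0.090)],
    ]
    stats = [analyzer_quarter_stats([percent_difference(y, x) for y, x in a])
             for a in checks]
    print(organization_probability_limits(stats))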
5.2 Accuracy of Automated Methods Excluding PM2.5. Estimates of the accuracy of automated methods are calculated from the results of independent audits as described in section 3.2 of this appendix. At the end of each calendar quarter, an integrated accuracy probability interval for all SLAMS analyzers audited in the reporting organization is calculated for each pollutant. Separate probability limits are calculated for each audit concentration level in section 3.2 of this appendix.

5.2.1 Single Analyzer Accuracy. The percentage difference (di) for each audit concentration is calculated using equation 1, where Yi is the analyzer's indicated concentration measurement from the i-th audit check and Xi is the actual concentration of the audit gas used for the i-th audit check.

5.2.2 Accuracy for Reporting Organization.

5.2.2.1 For each audit concentration level of a particular pollutant, the average (D) of the individual percentage differences (di) for all n analyzers audited during the quarter is calculated using equation 8, as follows:

Equation 8:  D = (1/n) Σ di
5.2.2.2 For each concentration level of a particular pollutant, the standard deviation (Sa) of all the individual percentage differences for all n analyzers audited during the quarter is calculated, using equation 9, as follows:

Equation 9:  Sa = sqrt[ Σ (di - D)² / (n - 1) ]
5.2.2.3 For reporting organizations having four or fewer analyzers for a particular pollutant, only one audit is required each quarter. For such reporting organizations, the audit results of two consecutive quarters are required to calculate an average and a standard deviation, using equations 8 and 9. Therefore, the reporting of probability limits shall be on a semiannual (instead of a quarterly) basis.

5.2.2.4 For each pollutant, the 95 Percent Probability Limits for the accuracy of a reporting organization are calculated at each audit concentration level using equations 6 and 7.
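Similarly, the level-specific accuracy statistics of equations 8 and 9, combined with the probability limits of equations 6 and 7, can be sketched briefly (editorial only, not regulatory text):

    import math

    def audit_level_limits(percent_diffs):
        """Equations 8 and 9 applied to the equation-1 percent differences from the
        n analyzers audited at one audit concentration level, followed by the
        95 percent probability limits of equations 6 and 7."""
        n = len(percent_diffs)
        d_avg = sum(percent_diffs) / n                                           # Equation 8
        s_a = math.sqrt(sum((d - d_avg) ** 2 for d in percent_diffs) / (n - 1))  # Equation 9
        return d_avg - 1.96 * s_a, d_avg + 1.96 * s_a

    print(audit_level_limits([1.1, -2.4, 0.7, 3.0]))  # limits for four audited analyzers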

5.3 Precision of Manual Methods Excluding PM2.5. Estimates of precision of manual methods are calculated from the results obtained from collocated samplers as described in section 3.3 of this appendix. At the end of each calendar quarter, an integrated precision probability interval for all collocated samplers operating in the reporting organization is calculated for each manual method network.