For the paired measurements obtained as described in sections 3.3.1 and 3.3.2, calculate the percent difference (di) using equation 1a, where Yi is the concentration of pollutant measured by the duplicate sampler, and Xi is the concentration measured by the sampler reporting air quality for the site. Calculate the quarterly average percent difference (dj), equation 2; standard deviation (Sj), equation 3; and upper and lower 95 percent probability limits for precision, equations 6 and 7.
di = [(Yi - Xi) / ((Yi + Xi)/2)] × 100    (1a)

Upper 95 percent probability limit = dj + 1.96 Sj/√2    (6)

Lower 95 percent probability limit = dj - 1.96 Sj/√2    (7)
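A minimal Python sketch of these precision calculations follows, assuming a quarter's worth of collocated (Xi, Yi) pairs; the sample data, function names, and the usual n-1 sample form assumed for the standard deviation of equation 3 (not shown in this excerpt) are illustrative assumptions, not part of the appendix.

```python
import math

def percent_difference(x, y):
    # Equation 1a: x (Xi) is the sampler reporting air quality for the
    # site; y (Yi) is the duplicate (collocated) sampler.
    return (y - x) / ((y + x) / 2.0) * 100.0

def quarterly_precision(pairs):
    # Quarterly average percent difference (equation 2), standard deviation
    # (equation 3, assumed here to be the n-1 sample form), and the 95
    # percent probability limits (equations 6 and 7).
    d = [percent_difference(x, y) for x, y in pairs]
    n = len(d)
    d_bar = sum(d) / n
    s = math.sqrt(sum((v - d_bar) ** 2 for v in d) / (n - 1))
    upper = d_bar + 1.96 * s / math.sqrt(2)
    lower = d_bar - 1.96 * s / math.sqrt(2)
    return d_bar, s, lower, upper

# Hypothetical quarter of collocated 24-hour measurements (ug/m3).
pairs = [(42.0, 44.1), (55.3, 54.0), (61.2, 63.5), (38.8, 39.9)]
d_bar, s, lo, hi = quarterly_precision(pairs)
print(f"mean={d_bar:.2f}%  s={s:.2f}  95% limits=({lo:.2f}%, {hi:.2f}%)")
```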
5.2 Single Instrument Accuracy for TSP and PM10. Each organization, at the end of each sampling quarter, shall calculate and report the percentage difference for each high-volume or PM10 sampler audited during the quarter. Directions for calculation are given below and directions for reporting are given in section 6.
For the flow rate audit described in section 3.4, let Xi represent the known flow rate and Yi represent the indicated flow rate. Calculate the percentage difference (di) using equation 1.
5.3 Single Instrument Accuracy for Pb. Each organization, at the end of each sampling quarter, shall calculate and report the percentage difference for each high-volume lead sampler audited during the quarter. Directions for calculation are given in 5.2 and directions for reporting are given in section 6.
5.4 Single-Analysis-Day Accuracy for Pb. Each organization, at the end of each sampling quarter, shall calculate and report the percentage difference for each Pb analysis audit during the quarter. Directions for calculations are given below and directions for reporting are given in section 6.
For each analysis audit for Pb described in section 3.4.2, let Xi represent the known value of the audit sample and Yi the indicated value of Pb. Calculate the percentage difference (di) for each audit at each concentration level using equation 1.
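Equation 1, referenced throughout sections 5.2 through 5.4, is the relative difference of the indicated value against the known audit value; a short sketch with hypothetical numbers follows.

```python
def audit_percent_difference(known, indicated):
    # Equation 1: di = (Yi - Xi) / Xi * 100, where Xi is the known audit
    # value and Yi is the value indicated by the sampler or analysis.
    return (indicated - known) / known * 100.0

# Hypothetical flow-rate audit: known 1.18 m3/min, sampler indicates 1.12.
print(f"{audit_percent_difference(1.18, 1.12):.1f} percent")  # -5.1 percent
```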
6. Organization Reporting Requirements.
At the end of each sampling quarter, the organization must report the following data assessment information:
(1) For automated analyzers—precision probability limits from section 4.1 and percentage differences from section 4.2, and
(2) For manual methods—precision probability limits from section 5.1 and percentage differences from sections 5.2 and 5.3. The precision and accuracy information for the entire sampling quarter must be submitted with the air monitoring data. All data used to calculate the reported estimates of precision and accuracy, including span checks, collocated sampler results, and audit results, must be made available to the permit granting authority upon request.
Table B-1—Minimum PSD Data Assessment Requirements

----------------------------------------------------------------------------------------------------------------
Method                        Assessment method           Coverage              Frequency            Parameters reported
----------------------------------------------------------------------------------------------------------------
Precision:
  Automated methods for       Response check at a         Each analyzer.......  Once per 2 weeks...  Actual concentration \2\ and
   SO2, NO2, O3, and CO.       concentration between                                                  measured concentration.\3\
                               0.08 and 0.10 ppm (8 and
                               10 ppm for CO).\2\
  TSP, PM10, Lead...........  Collocated samplers.......  Highest               Once per week, or    Two concentration
                                                           concentration site    every 3rd day for    measurements.
                                                           in monitoring         continuous
                                                           network.              sampling.
Accuracy:
  Automated methods for       Response check at:          Each analyzer.......  Once per sampling    Actual concentration \2\ and
   SO2, NO2, O3, and CO.       0.03-0.08 ppm; \1\ \2\                            quarter.             measured (indicated)
                               0.15-0.20 ppm; \1\ \2\                                                 concentration \3\ for each
                               0.35-0.45 ppm; \1\ \2\                                                 level.
                               0.80-0.90 ppm \1\ \2\
                               (if applicable).
  TSP, PM10.................  Sampler flow check........  Each sampler........  Once per sampling    Actual flow rate and flow
                                                                                 quarter.             rate indicated by the
                                                                                                      sampler.
  Lead......................  1. Sampler flow rate        1. Each sampler....   1. Once per          1. Same as for TSP.
                               check.                     2. Analytical          quarter.            2. Actual concentration and
                              2. Check analytical          system.              2. Each quarter Pb    measured concentration of
                               system with Pb audit                              samples are          audit samples (µg Pb/strip).
                               strips.                                           analyzed.
----------------------------------------------------------------------------------------------------------------
\1\ Concentration shown times 100 for CO.
\2\ Effective concentration for open path analyzers.
\3\ Corrected concentration, if applicable, for open path analyzers.
[44 FR 27571, May 10, 1979; 44 FR 65070, Nov. 9, 1979; 44 FR 72592, Dec. 14, 1979, as amended at 46 FR 44168, Sept. 3, 1981; 48 FR 2530, Jan. 20, 1983; 51 FR 9596, Mar. 19, 1986; 52 FR 24741, July 1, 1987; 59 FR 41628, 41629, Aug. 12, 1994; 60 FR 52321, Oct. 6, 1995]
Appendix C to Part 58—Ambient Air Quality Monitoring Methodology
1.0 Purpose
This appendix specifies the monitoring methods (manual methods or automated analyzers) which must be used in State ambient air quality monitoring stations.
2.0 State and Local Air Monitoring Stations (SLAMS)
2.1 Except as otherwise provided in this appendix, a monitoring method used in a SLAMS must be a reference or equivalent method as defined in §50.1 of this chapter.
2.2 Substitute PM10 samplers.
2.2.1 For purposes of showing compliance with the NAAQS for particulate matter, a high volume TSP sampler described in 40 CFR part 50, appendix B, may be used in a SLAMS in lieu of a PM10 monitor as long as the ambient concentrations of particles measured by the TSP sampler are below the PM10 NAAQS. If the TSP sampler measures a single value that is higher than the PM10 24-hour standard, or if the annual average of its measurements is greater than the PM10 annual standard, the TSP sampler operating as a substitute PM10 sampler must be replaced with a PM10 monitor. For a TSP measurement above the 24-hour standard, the TSP sampler should be replaced with a PM10 monitor before the end of the calendar quarter following the quarter in which the high concentration occurred. For a TSP annual average above the annual standard, the PM10 monitor should be operating by June 30 of the year following the exceedance.
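As an illustrative reading of section 2.2.1 (a sketch, not regulatory text), the replacement logic can be encoded as below; the standard levels of 150 µg/m3 (24-hour) and 50 µg/m3 (annual) are the PM10 NAAQS levels in effect at the time and, like all names here, are assumptions of this sketch.

```python
from datetime import date

PM10_24HR_STD = 150.0    # ug/m3; assumed 24-hour PM10 NAAQS level
PM10_ANNUAL_STD = 50.0   # ug/m3; assumed annual PM10 NAAQS level

def tsp_substitute_status(daily_values, year):
    """Hypothetical helper: decide whether a TSP sampler serving as a
    substitute PM10 sampler must be replaced with a PM10 monitor."""
    if any(v > PM10_24HR_STD for v in daily_values):
        # A single value above the 24-hour standard: replace before the end
        # of the calendar quarter following the quarter of the exceedance.
        return "replace by end of the following calendar quarter"
    if sum(daily_values) / len(daily_values) > PM10_ANNUAL_STD:
        # Annual average above the annual standard: a PM10 monitor should
        # be operating by June 30 of the following year.
        return f"PM10 monitor operating by {date(year + 1, 6, 30).isoformat()}"
    return "TSP sampler may continue as a substitute PM10 sampler"

print(tsp_substitute_status([40.0, 155.0, 60.0], 1995))
```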
2.2.2 In order to maintain historical continuity of ambient particulate matter trends and patterns for PM10 NAMS that were previously TSP NAMS, the TSP high volume sampler must be operated concurrently with the PM10 monitor for a one-year period beginning with the PM10 NAMS start-up date. The operating schedule for the TSP sampler must be at least once every 6 days regardless of the PM10 sampling frequency.
2.3 Any manual method or analyzer purchased prior to cancellation of its reference or equivalent method designation under §53.11 or §53.16 of this chapter may be used in a SLAMS following cancellation for a reasonable period of time to be determined by the Administrator.
2.4 Approval of non-designated PM2.5 methods operated at specific individual sites. A method for PM2.5 that has not been designated as a reference or equivalent method as defined in §50.1 of this chapter may be approved for use for purposes of section 2.1 of this appendix at a particular SLAMS under the following stipulations.
2.4.1 The method must be demonstrated to meet the comparability requirements (except as provided in this section 2.4.1) set forth in §53.34 of this chapter in each of the four seasons at the site at which it is intended to be used. For purposes of this section 2.4.1, the requirements of §53.34 of this chapter shall apply except as follows:
2.4.1.1 The method shall be tested at the site at which it is intended to be used, and there shall be no requirement for tests at any other test site.
2.4.1.2 For purposes of this section 2.4, the seasons shall be defined as follows: Spring shall be the months of March, April, and May; summer shall be the months of June, July, and August; fall shall be the months of September, October, and November; and winter shall be the months of December, January, and February, unless alternate seasons are approved by the Administrator.
2.4.1.3 No PM10 samplers shall be required for the test, as determination of the PM2.5/PM10 ratio at the test site shall not be required.
2.4.1.4 The specifications given in table C–4 of part 53 of this chapter for Class I methods shall apply, except that there shall be no requirement for any minimum number of sample sets with Rj greater than 40 µg/m3 for 24-hour samples or greater than 15 µg/m3 average concentration collected over a 48-hour period.
2.4.2 The monitoring agency wishing to use the method must develop and implement appropriate quality assurance procedures for the method.
2.4.3 The monitoring agency wishing to use the method must develop and implement appropriate procedures for assessing and reporting the precision and accuracy of the method comparable to the procedures set forth in appendix A of this part for designated reference and equivalent methods.
2.4.4 The assessment of network operating precision using collocated measurements with reference method “audit” samplers required under section 3 of appendix A of this part shall be carried out semi-annually rather than annually (i.e., monthly audits with assessment determinations each 6 months).
2.4.5 Requests for approval under this section 2.4 must meet the general submittal requirements of sections 2.7.1 and 2.7.2.1 of this appendix and must include the requirements in sections 2.4.5.1 through 2.4.5.7 of this appendix.
2.4.5.1 A clear and unique description of the site at which the method or sampler will be used and tested, and a description of the nature or character of the site and the particulate matter that is expected to occur there.
2.4.5.2 A detailed description of the method and the nature of the sampler or analyzer upon which it is based.
2.4.5.3 A brief statement of the reason or rationale for requesting the approval.
2.4.5.4 A detailed description of the quality assurance procedures that have been developed and that will be implemented for the method.
2.4.5.5 A detailed description of the procedures for assessing the precision and accuracy of the method that will be implemented for reporting to AIRS.
2.4.5.6 Test results from the comparability tests as required in section 2.4.1 through 2.4.1.4 of this appendix.
2.4.5.7 Such further supplemental information as may be necessary or helpful to support the required statements and test results.
2.4.6 Within 120 days after receiving a request for approval of the use of a method at a particular site under this section 2.4 and such further information as may be requested for purposes of the decision, the Administrator will approve or disapprove the method by letter to the person or agency requesting such approval.
2.5 Approval of non-designated methods under §58.13(f). An automated (continuous) method for PM2.5 that is not designated as either a reference or equivalent method as defined in §50.1 of this chapter may be approved under §58.13(f) for use at a SLAMS for the limited purposes of §58.13(f). Such an analyzer approved for use at a SLAMS under §58.13(f), identified as a correlated acceptable continuous (CAC) monitor, shall not be considered a reference or equivalent method as defined in §50.1 of this chapter by virtue of its approval for use under §58.13(f), and the PM2.5 monitoring data obtained from such a monitor shall not be otherwise used for purposes of part 50 of this chapter.
2.6 Use of Methods With Higher, Nonconforming Ranges in Certain Geographical Areas.
2.6.1 [Reserved]
2.6.2 Nonconforming Ranges. An analyzer may be used (indefinitely) on a range which extends to concentrations higher than two times the upper limit specified in table B–1 of part 53 of this chapter if:
2.6.2.1 The analyzer has more than one selectable range and has been designated as a reference or equivalent method on at least one of its ranges, or has been approved for use under section 2.5 (which applies to analyzers purchased before February 18, 1975);
2.6.2.2 The pollutant intended to be measured with the analyzer is likely to occur in concentrations more than two times the upper range limit specified in table B–1 of part 53 of this chapter in the geographical area in which use of the analyzer is proposed; and
2.6.2.3 The Administrator determines that the resolution of the range or ranges for which approval is sought is adequate for its intended use. For purposes of this section (2.6), “resolution” means the ability of the analyzer to detect small changes in concentration.
2.6.3 Requests for approval under section 2.6.2 must meet the submittal requirements of section 2.7. Except as provided in subsection 2.7.3, each request must contain the information specified in subsection 2.7.2 in addition to the following:
2.6.3.1 The range or ranges proposed to be used;
2.6.3.2 Test data, records, calculations, and test results as specified in subsection 2.7.2.2 for each range proposed to be used;
2.6.3.3 An identification and description of the geographical area in which use of the analyzer is proposed;
2.6.3.4 Data or other information demonstrating that the pollutant intended to be measured with the analyzer is likely to occur in concentrations more than two times the upper range limit specified in table B–1 of part 53 of this chapter in the geographical area in which use of the analyzer is proposed; and
2.6.3.5 Test data or other information demonstrating the resolution of each proposed range that is broader than that permitted by section 2.5.
2.6.4 Any person who has obtained approval of a request under this section (2.6.2) shall assure that the analyzer for which approval was obtained is used only in the geographical area identified in the request and only while operated in the range or ranges specified in the request.
2.7 Requests for Approval; Withdrawal of Approval.
2.7.1 Requests for approval under sections 2.4, 2.6.2, or 2.8 of this appendix must be submitted to: Director, National Exposure Assessment Laboratory, Department E, (MD-77B), U.S. Environmental Protection Agency, Research Triangle Park, North Carolina 27711.
2.7.2 Except as provided in section 2.7.3, each request must contain:
2.7.2.1 A statement identifying the analyzer (e.g., by serial number) and the method of which the analyzer is representative (e.g., by manufacturer and model number); and
2.7.2.2 Test data, records, calculations, and test results for the analyzer (or the method of which the analyzer is representative) as specified in subpart B, subpart C, or both (as applicable) of part 53 of this chapter.
2.7.3 A request may concern more than one analyzer or geographical area and may incorporate by reference any data or other information known to EPA from one or more of the following:
2.7.3.1 An application for a reference or equivalent method determination submitted to EPA for the method of which the analyzer is representative, or testing conducted by the applicant or by EPA in connection with such an application;
2.7.3.2 Testing of the method of which the analyzer is representative at the initiative of the Administrator under §53.7 of this chapter; or
2.7.3.3 A previous or concurrent request for approval submitted to EPA under this section (2.7).
2.7.4 To the extent that such incorporation by reference provides data or information required by this section (2.7) or by sections 2.4, 2.5, or 2.6, independent data or duplicative information need not be submitted.
2.7.5 After receiving a request under this section (2.7), the Administrator may request such additional testing or information or conduct such tests as may be necessary in his judgment for a decision on the request.
2.7.6 If the Administrator determines, on the basis of any information available to him, that any of the determinations or statements on which approval of a request under this section (2.7) was based are invalid or no longer valid, or that the requirements of section 2.4, 2.5, or 2.6, as applicable, have not been met, he may withdraw the approval after affording the person who obtained the approval an opportunity to submit information and arguments opposing such action.
2.8 Modifications of Methods by Users.
2.8.1 Except as otherwise provided in this section (2.8), no reference method, equivalent method, or alternative method may be used in a SLAMS if it has been modified in a manner that will, or might, significantly alter the performance characteristics of the method without prior approval by the Administrator. For purposes of this section (2.8), “alternative method” means an analyzer the use of which has been approved under section 2.4, 2.5, or 2.6 of this appendix or some combination thereof.
2.8.2 Requests for approval under this section (2.8) must meet the submittal requirements of sections 2.7.1 and 2.7.2.1 of this appendix.
2.8.3 Each request submitted under this section (2.8) must include:
2.8.3.1 A description, in such detail as may be appropriate, of the desired modification;
2.8.3.2 A brief statement of the purpose(s) of the modification, including any reasons for considering it necessary or advantageous;
2.8.3.3 A brief statement of belief concerning the extent to which the modification will or may affect the performance characteristics of the method; and
2.8.3.4 Such further information as may be necessary to explain and support the statements required by sections 2.8.3.2 and 2.8.3.3.
2.8.4 Within 75 days after receiving a request for approval under this section (2.8) and such further information as he may request for purposes of his decision, the Administrator will approve or disapprove the modification in question by letter to the person or agency requesting such approval.
2.8.5 A temporary modification that will or might alter the performance characteristics of a reference, equivalent, or alternative method may be made without prior approval under this section (2.8) if the method is not functioning or is malfunctioning, provided that parts necessary for repair in accordance with the applicable operation manual cannot be obtained within 45 days. Unless such temporary modification is later approved under section 2.8.4, the temporarily modified method shall be repaired in accordance with the applicable operation manual as quickly as practicable but in no event later than 4 months after the temporary modification was made, unless an extension of time is granted by the Administrator. Unless and until the temporary modification is approved, air quality data obtained with the method as temporarily modified must be clearly identified as such when submitted in accordance with §58.28 or §58.35 of this chapter and must be accompanied by a report containing the information specified in section 2.8.3. A request that the Administrator approve a temporary modification may be submitted in accordance with sections 2.8.1 through 2.8.4. In such cases the request will be considered as if a request for prior approval had been made.
2.9 Use of IMPROVE Samplers at a SLAMS. “IMPROVE” samplers may be used in SLAMS for monitoring of regional background and regional transport concentrations of fine particulate matter. The IMPROVE samplers were developed for use in the Interagency Monitoring of Protected Visual Environments (IMPROVE) network to characterize all of the major components and many trace constituents of the particulate matter that impair visibility in Federal Class I Areas. These samplers are routinely operated at about 70 locations in the United States. IMPROVE samplers consist of four sampling modules that are used to collect twice weekly 24-hour duration simultaneous samples. Modules A, B, and C collect PM2.5 on three different filter substrates that are compatible with a variety of analytical techniques, and module D collects a PM10 sample. PM2.5 mass and elemental concentrations are determined by analysis of the 25 mm diameter stretched Teflon filters from module A. More complete descriptions of the IMPROVE samplers and the data they collect are available elsewhere (references 4, 5, and 6 of this appendix).
3.0 National Air Monitoring Stations (NAMS)
3.1 Methods used in those SLAMS which are also designated as NAMS to measure SO2, CO, NO2, or O3 must be automated reference or equivalent methods (continuous analyzers).
4.0 Photochemical Assessment Monitoring Stations (PAMS)
4.1 Methods used for O3 monitoring at PAMS must be automated reference or equivalent methods as defined in §50.1 of this chapter.
4.2 Methods used for NO, NO2 and NOX monitoring at PAMS should be automated reference or equivalent methods as defined for NO2 in §50.1 of this chapter. If alternative NO, NO2 or NOX monitoring methodologies are proposed, such techniques must be detailed in the network description required by §58.40 and subsequently approved by the Administrator.
4.3 Methods for meteorological measurements and speciated VOC monitoring are included in the guidance provided in references 2 and 3. If alternative VOC monitoring methodology (including the use of new or innovative technologies), which is not included in the guidance, is proposed, it must be detailed in the network description required by §58.40 and subsequently approved by the Administrator.
5.0 Particulate Matter Episode Monitoring
5.1 For short-term measurements of PM10 during air pollution episodes (see §51.152 of this chapter) the measurement method must be:
5.1.1 Either the “Staggered PM10” method or the “PM10 Sampling Over Short Sampling Times” method, both of which are based on the reference method for PM10 and are described in reference 1; or
5.1.2 Any other method for measuring PM10:
5.1.2.1 Which has a measurement range or ranges appropriate to accurately measure air pollution episode concentrations of PM10,
5.1.2.2 Which has a sample period appropriate for short-term PM10 measurements, and
5.1.2.3 For which a quantitative relationship to a reference or equivalent method for PM10 has been established at the use site. Procedures for establishing a quantitative site-specific relationship are contained in reference 1.
5.2 Quality Assurance. PM10 methods other than the reference method are not covered under the quality assessment requirements of appendix A. Therefore, States must develop and implement their own quality assessment procedures for those methods allowed under this section 5. These quality assessment procedures should be similar or analogous to those described in section 3 of appendix A for the PM10 reference method.
6.0 References
1. Pelton, D. J. Guideline for Particulate Episode Monitoring Methods, GEOMET Technologies, Inc., Rockville, MD. Prepared for U.S. Environmental Protection Agency, Research Triangle Park, NC. EPA Contract No. 68–02–3584. EPA 450/4–83–005. February 1983.
2. Technical Assistance Document For Sampling and Analysis of Ozone Precursors. Atmospheric Research and Exposure Assessment Laboratory, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711. EPA 600/8–91–215. October 1991.
3. Quality Assurance Handbook for Air Pollution Measurement Systems: Volume IV. Meteorological Measurements. Atmospheric Research and Exposure Assessment Laboratory, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711. EPA 600/4–90–0003. August 1989.
4. Eldred, R.A., Cahill, T.A., Wilkenson, L.K., et al. Measurements of Fine Particles and Their Chemical Components in the IMPROVE/NPS Networks. In: Transactions of the International Specialty Conference on Visibility and Fine Particles. Air and Waste Management Association: Pittsburgh, PA, 1990; pp. 187–196.
5. Sisler, J.F., Huffman, D., and Latimer, D.A. Spatial and Temporal Patterns and the Chemical Composition of the Haze in the United States: An Analysis of Data from the IMPROVE Network, 1988–1991. ISSN No. 0737-5253-26. National Park Service, Ft. Collins, CO, 1993.
6. Eldred, R.A., Cahill, T.A., Pitchford, M., and Malm, W.C. IMPROVE—A New Remote Area Particulate Monitoring System for Visibility Studies. Proceedings of the 81st Annual Meeting of the Air Pollution Control Association, Dallas, Paper 88-54.3, 1988.
[44 FR 27571, May 10, 1979, as amended at 44 FR 37918, June 29, 1979; 44 FR 65070, Nov. 9, 1979; 51 FR 9597, Mar. 19, 1986; 52 FR 24741, 24742, July 1, 1987; 58 FR 8469, Feb. 12, 1993; 59 FR 41628, Aug. 12, 1994; 62 FR 38843, July 18, 1997]
Appendix D to Part 58—Network Design for State and Local Air Monitoring Stations (SLAMS), National Air Monitoring Stations (NAMS), and Photochemical Assessment Monitoring Stations (PAMS)
1. SLAMS Monitoring Objectives and Spatial Scales
2. SLAMS Network Design Procedures
2.1 Background Information for Establishing SLAMS
2.2 Substantive Changes in SLAMS/NAMS Network Design Elements
2.3 Sulfur Dioxide (SO2) Design Criteria for SLAMS
2.4 Carbon Monoxide (CO) Design Criteria for SLAMS
2.5 Ozone (O3) Design Criteria for SLAMS
2.6 Nitrogen Dioxide (NO2) Design Criteria for SLAMS
2.7 Lead (Pb) Design Criteria for SLAMS
2.8 Particulate Matter Design Criteria for SLAMS
3. Network Design for National Air Monitoring Stations (NAMS)
3.1 [Reserved]
3.2 Sulfur Dioxide (SO2) Design Criteria for NAMS
3.3 Carbon Monoxide (CO) Design Criteria for NAMS
3.4 Ozone (O3) Design Criteria for NAMS
3.5 Nitrogen Dioxide (NO2) Design Criteria for NAMS
3.6 Lead (Pb) Design Criteria for NAMS
3.7 Particulate Matter Design Criteria for NAMS
4. Network Design for Photochemical Assessment Monitoring Stations (PAMS)
5. Summary
6. References
1. SLAMS Monitoring Objectives and Spatial Scales
The purpose of this appendix is to describe monitoring objectives and general criteria to be applied in establishing the State and Local Air Monitoring Stations (SLAMS) networks and for choosing general locations for new monitoring stations. It also describes criteria for determining the number and location of National Air Monitoring Stations (NAMS), Photochemical Assessment Monitoring Stations (PAMS), and core stations for PM2.5. These criteria will also be used by EPA in evaluating the adequacy of the SLAMS/NAMS/PAMS and core PM2.5 networks.
The network of stations that comprise SLAMS should be designed to meet a minimum of six basic monitoring objectives. These basic monitoring objectives are:
(1) To determine highest concentrations expected to occur in the area covered by the network.
(2) To determine representative concentrations in areas of high population density.
(3) To determine the impact on ambient pollution levels of significant sources or source categories.
(4) To determine general background concentration levels.
(5) To determine the extent of Regional pollutant transport among populated areas; and in support of secondary standards.
(6) To determine the welfare-related impacts in more rural and remote areas (such as visibility impairment and effects on vegetation).
It should be noted that this appendix contains no criteria for determining the total number of stations in SLAMS networks, except in areas where Pb concentrations currently exceed or have exceeded the Pb NAAQS during any one quarter of the most recent eight quarters. The optimum size of a particular SLAMS network involves trade-offs among data needs and available resources that EPA believes can best be resolved during the network design process.
This appendix focuses on the relationship between monitoring objectives and the geographical location of monitoring stations. Included are a rationale and set of general criteria for identifying candidate station locations in terms of physical characteristics which most closely match a specific monitoring objective. The criteria for more specifically siting the monitoring station, including spacing from roadways and vertical and horizontal probe and path placement, are described in appendix E of this part.
To clarify the nature of the link between general monitoring objectives and the physical location of a particular monitoring station, the concept of spatial scale of representativeness of a monitoring station is defined. The goal in siting stations is to correctly match the spatial scale represented by the sample of monitored air with the spatial scale most appropriate for the monitoring objective of the station.
Thus, spatial scale of representativeness is described in terms of the physical dimensions of the air parcel nearest to a monitoring station throughout which actual pollutant concentrations are reasonably similar. The scales of representativeness of most interest for the monitoring objectives defined above are as follows:
Microscale—defines the concentrations in air volumes associated with area dimensions ranging from several meters up to about 100 meters.
Middle Scale—defines the concentration typical of areas up to several city blocks in size with dimensions ranging from about 100 meters to 0.5 kilometer.
Neighborhood Scale—defines concentrations within some extended area of the city that has relatively uniform land use with dimensions in the 0.5 to 4.0 kilometers range.
Urban Scale—defines the overall, citywide conditions with dimensions on the order of 4 to 50 kilometers. This scale would usually require more than one site for definition.
Regional Scale—usually defines a rural area of reasonably homogeneous geography and extends from tens to hundreds of kilometers.
National and Global Scales—these measurement scales represent concentrations characterizing the nation and the globe as a whole.
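As an illustrative aid (not part of this appendix), the scale definitions above amount to a lookup from characteristic dimension to scale; the Python sketch below encodes them, treating the boundaries as the nominal values given in the text.

```python
def spatial_scale(dimension_km):
    """Map a characteristic dimension (km) to the spatial scale of
    representativeness defined above. Boundaries are nominal ("about
    100 meters", "on the order of 4 to 50 kilometers"), so results
    near a cutoff are indicative only."""
    if dimension_km < 0.1:
        return "microscale"
    if dimension_km < 0.5:
        return "middle scale"
    if dimension_km <= 4.0:
        return "neighborhood scale"
    if dimension_km <= 50.0:
        return "urban scale"
    return "regional scale (or national/global)"

print(spatial_scale(0.3))   # middle scale
print(spatial_scale(10.0))  # urban scale
```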
Proper siting of a monitoring station requires precise specification of the monitoring objective which usually includes a desired spatial scale of representativeness. For example, consider the case where the objective is to determine maximum CO concentrations in areas where pedestrians may reasonably be exposed. Such areas would most likely be located within major street canyons of large urban areas and near traffic corridors. Stations located in these areas are most likely to have a microscale of representativeness since CO concentrations typically peak nearest roadways and decrease rapidly as the monitor is moved from the roadway. In this example, physical location was determined by consideration of CO emission patterns, pedestrian activity, and physical characteristics affecting pollutant dispersion. Thus, spatial scale of representativeness was not used in the selection process but was a result of station location.
In some cases, the physical location of a station is determined from joint consideration of both the basic monitoring objective, and a desired spatial scale of representativeness. For example, to determine CO concentrations which are typical over a reasonably broad geographic area having relatively high CO concentrations, a neighborhood scale station is more appropriate. Such a station would likely be located in a residential or commercial area having a high overall CO emission density but not in the immediate vicinity of any single roadway. Note that in this example, the desired scale of representativeness was an important factor in determining the physical location of the monitoring station.
In either case, classification of the station by its intended objective and spatial scale of representativeness is necessary and will aid in interpretation of the monitoring data.
Table 1 illustrates the relationship between the six basic monitoring objectives and the scales of representativeness that are generally most appropriate for each objective.
Table 1—Relationship Among Monitoring Objectives and Scale of Representativeness

------------------------------------------------------------------------
Monitoring objective                     Appropriate siting scales
------------------------------------------------------------------------
Highest concentration.................   Micro, middle, neighborhood
                                          (sometimes urban \1\)
Population............................   Neighborhood, urban
Source impact.........................   Micro, middle, neighborhood
General/background....................   Neighborhood, urban, regional
Regional transport....................   Urban/regional
Welfare-related impacts...............   Urban/regional
------------------------------------------------------------------------
\1\ Urban denotes a geographic scale applicable to both cities and rural areas.
Open path analyzers can often be used effectively and advantageously to provide better monitoring representation for population exposure monitoring and general or background monitoring in urban and neighborhood scales of representation. Such analyzers may also be able to provide better area coverage or operational advantages in high concentration and source-impact monitoring in middle scale and possibly microscale areas. However, siting of open path analyzers for the latter applications must be carried out with proper regard for the specific monitoring objectives and for the path-averaging nature of these analyzers. Monitoring path lengths need to be commensurate with the intended scale of representativeness and located carefully with respect to local sources or potential obstructions. For short-term/high-concentration or source-oriented monitoring, the monitoring path may need to be further restricted in length and be oriented approximately radially with respect to the source in the downwind direction, to provide adequate peak concentration sensitivity. Alternatively, multiple (e.g., orthogonal) paths may be used advantageously to obtain both wider area coverage and peak concentration sensitivity. Further discussion on this topic is included in section 2.2 of this appendix.
Subsequent sections of this appendix describe in greater detail the most appropriate scales of representativeness and general monitoring locations for each pollutant.
2. SLAMS Network Design Procedures
The preceding section of this appendix has stressed the importance of defining the objectives for monitoring a particular pollutant. Since monitoring data are collected to “represent” the conditions in a section or subregion of a geographical area, the previous section included a discussion of the scale of representativeness of a monitoring station. The use of this physical basis for locating stations allows for an objective approach to network design.
The discussion of scales in sections 2.3 through 2.8 of this appendix does not include all of the possible scales for each pollutant. The scales that are discussed are those that are felt to be most pertinent for SLAMS network design.
In order to evaluate a monitoring network and to determine the adequacy of particular monitoring stations, it is necessary to examine each pollutant monitoring station individually by stating its monitoring objective and determining its spatial scale of representativeness. This will do more than insure compatibility among stations of the same type. It will also provide a physical basis for the interpretation and application of the data. This will help to prevent mismatches between what the data actually represent and what the data are interpreted to represent. It is important to note that SLAMS are not necessarily sufficient for completely describing air quality. In many situations, diffusion models must be applied to complement ambient monitoring, e.g., determining the impact of point sources or defining boundaries of nonattainment areas.
Information such as emissions density, housing density, climatological data, geographic information, traffic counts, and the results of modeling will be useful in designing regulatory networks. Air pollution control agencies have shown the value of screening studies, such as intensive studies conducted with portable samplers, in designing networks. In many cases, in selecting sites for core PM2.5 or carbon monoxide SLAMS, and for defining the boundaries of PM2.5 optional community monitoring zones, air pollution control agencies will benefit from using such studies to evaluate the spatial distribution of pollutants.
2.1 Background Information for Establishing SLAMS. Background information that must be considered in the process of selecting SLAMS from the existing network and in establishing new SLAMS includes emission inventories, climatological summaries, and local geographical characteristics. Such information is to be used as a basis for the judgmental decisions that are required during the station selection process. For new stations, the background information should be used to decide on the actual location considering the monitoring objective and spatial scale while following the detailed procedures in References 1 through 4.
Emission inventories are generally the most important type of background information needed to design the SLAMS network. The emission data provide valuable information concerning the size and distribution of large point sources. Area source emissions are usually available for counties but should be subdivided into smaller areas or grids where possible, especially if diffusion modeling is to be used as a basis for determining where stations should be located. Sometimes this must be done rather crudely, for example, on the basis of population or housing units. In general, the grids should be smaller in areas of dense population than in less densely populated regions.
Emission inventory information for point sources should be generally available for any area of the country for annual and seasonal averaging times. Specific information characterizing the emissions from large point sources for the shorter averaging times (diurnal variations, load curves, etc.) can often be obtained from the source. Area source emission data by season, although not available from the EPA, can be generated by apportioning annual totals according to degree days.
Detailed area source data are also valuable in evaluating the adequacy of an existing station in terms of whether the station has been located in the desired spatial scale of representativeness. For example, it may be the desire of an agency to have an existing CO station measuring in the neighborhood scale.
By examining the traffic data for the area and examining the physical location of the station with respect to the roadways, a determination can be made as to whether or not the station is indeed measuring the air quality on the desired scale.
The climatological summaries of greatest use are the frequency distributions of wind speed and direction. The wind rose is an easily interpreted graphical presentation of the directional frequencies. Other types of useful climatological data are also available, but generally are not as directly applicable to the site selection process as are the wind statistics.
In many cases, the meteorological data originating from the most appropriate (not necessarily the nearest) national weather service (NWS) airport station in the vicinity of the prospective siting area will adequately reflect conditions over the area of interest, at least for annual and seasonal averaging times. In developing data in complex meteorological and terrain situations, diffusion meteorologists should be consulted. NWS stations can usually provide most of the relevant weather information in support of network design activities anywhere in the country. Such information includes joint frequency distributions of winds and atmospheric stability (stability-wind roses).
The geographical material is used to determine the distribution of natural features, such as forests, rivers, lakes, and manmade features. Useful sources of such information may include road and topographical maps, aerial photographs, and even satellite photographs. This information may include the terrain and land-use setting of the prospective monitor siting area, the proximity of larger water bodies, the distribution of pollutant sources in the area, the location of NWS airport stations from which weather data may be obtained, etc. Land use and topographical characteristics of specific areas of interest can be determined from U.S. Geological Survey (USGS) maps and land use maps. Detailed information on urban physiography (building/street dimensions, etc.) can be obtained by visual observations, aerial photography, and also surveys to supplement the information available from those sources. Such information could be used in determining the location of local pollutant sources in and around the prospective station locations.
2.2 Substantive Changes in SLAMS/NAMS Network Design Elements. Two important purposes of the SLAMS monitoring data are to examine and evaluate overall air quality within a certain region, and to assess the trends in air pollutant levels over several years. The EPA believes that one of the primary tools for providing these characterizations is an ambient air monitoring program which implements technically representative networks. The design of these networks must be carefully evaluated not only at their outset, but at relatively frequent intervals thereafter, using an appropriate combination of other important technical tools, including: dispersion and receptor modeling, saturation studies, point and area source emissions analyses, and meteorological assessments. The impetus for these subsequent reexaminations of monitoring network adequacy stems not only from the need to evaluate the effect that changes in the environment may pose, but also from the recognition that new and/or refined tools and techniques for use in impact assessments are continually emerging and available for application.
Substantive changes to an ambient air monitoring network are both inevitable and necessary; however, any changes in any substantive aspect of an existing SLAMS network or monitoring site that might affect the continuity or comparability of pollutant measurements over time must be carefully and thoroughly considered. Such substantive changes would include cessation of monitoring at an existing site, relocation of an existing site, a change in the type of monitoring method used, any change in the probe or path height or orientation that might affect pollutant measurements, any significant changes in calibration procedures or standards, any significant change in operational or quality assurance procedures, any significant change in the sources or the character of the area in the vicinity of a monitoring site, or any other change that could potentially affect the continuity or comparability of monitoring data obtained before and after the change.
In general, these types of changes should be made cautiously with due consideration given to the impact of such changes on the network/site's ability to meet its intended goals. Some of these changes will be inevitable (such as when a monitoring site will no longer be available and the monitor must be relocated, for example). Other changes may be deemed necessary and advantageous, after due consideration of their impact, even though they may have a deleterious effect on the long-term comparability of the monitoring data. In these cases, an effort should be made to quantify, if possible, or at least characterize, the nature or extent of the effects of the change on the monitoring data. In all cases, the changes and all information pertinent to the effect of the change should be properly and completely documented for evaluation by trends analysts.
The introduction of open path methods to the SLAMS monitoring network may seem relatively straightforward, given the kinds of technical analyses required in this appendix. However, given the uncertainties attendant to these analyses and the critical nature and far-reaching regulatory implications of some sites in the current SLAMS network composed of point monitors, there is a need to ‘bridge’ between databases generated by these different candidate methods to evaluate and promote continuity in understanding of the historical representativeness of the database.
Concurrent, nominally collocated monitoring must be conducted in all instances where an open path analyzer is effectively intended to replace a criteria pollutant point monitor which meets either of the following:
1. Data collected at the site represents the maximum concentration for a particular nonattainment area; or
2. Data collected at the site is currently used to characterize the development of a nonattainment area State implementation plan.
The Regional Administrator, the Administrator, or their appropriate designee may also require collocated monitoring at other sites which are, based on historical technical data, significant in assessing air quality in a particular area. The term of this requirement is determined by the Regional Administrator (for SLAMS), Administrator (for NAMS), or their appropriate designee. The recommended minimum term consists of one year (or one season of maximum pollutant concentration) with a maximum term indexed to the subject pollutant NAAQS compliance interval (e.g., three calendar years for ozone). The requirement involves concurrent monitoring with both the open path analyzer and the existing point monitor during this term. Concurrent monitoring with more than one point analyzer with an open path analyzer using one or more measurement paths may also be advantageous to confirm adequate peak concentration sensitivity or to optimize the location and length of the monitoring path or paths.
All or some portion of the above requirement may be waived by the Regional Administrator (for SLAMS), the Administrator (for NAMS), or their designee in response to a request, based on accompanying technical information and analyses, or in certain unavoidable instances caused by logistical circumstances.
These requirements for concurrent monitoring also generally apply to situations where the relocation of any SLAMS site, using either a point monitor or an open path analyzer, within an area is being contemplated.