5.1.2 Methods Developed for CDB Determinations
A variety of methods have been used for preparing and analyzing CDB samples; most of these methods rely on one of the analytical techniques described above. Among the earliest reports, Princi (1947) and Smith et al. (1955) employed a colorimetric procedure to analyze for CDB and CDU. Samples were dried and digested through several cycles with concentrated mineral acids (HNO3 and H2SO4) and hydrogen peroxide (H2O2). The digest was neutralized, and the cadmium was complexed with diphenylthiocarbazone and extracted with chloroform. The dithizone-cadmium complex then was quantified using a spectrophotometer.
Colorimetric procedures for cadmium analyses were replaced by methods based on atomic absorption spectroscopy (AAS) in the early 1960s, but many of the complex sample preparation procedures were retained. Kjellstrom (1979) reports that in Japanese, American and Swedish laboratories during the early 1970s, blood samples were wet ashed with mineral acids or ashed at high temperature and wetted with nitric acid. The cadmium in the digest was complexed with metal chelators including diethyl dithiocarbamate (DDTC), ammonium pyrrolidine dithiocarbamate (APDC) or diphenylthiocarbazone (dithizone) in ammonia-citrate buffer and extracted with methyl isobutyl ketone (MIBK). The resulting solution then was analyzed by flame AAS or graphite-furnace AAS for cadmium determinations using deuterium-lamp background correction.
In the late 1970s, researchers began developing simpler preparation procedures. Roels et al. (1978) and Roberts and Clark (1986) developed simplified digestion procedures. Using the Roberts and Clark method, a 0.5 ml aliquot of blood is collected and transferred to a digestion tube containing 1 ml concentrated HNO3. The blood is then digested at 110 °C for 4 hours. The sample is reduced in volume by continued heating, and 0.5 ml 30% H2O2 is added as the sample dries. The residue is dissolved in 5 ml dilute (1%) HNO3, and 20 µl of sample is then analyzed by graphite-furnace AAS with deuterium-background correction.
The current trend in the preparation of blood samples is to dilute the sample and add matrix modifiers to reduce background interference, rather than digesting the sample to reduce organic content. The method of Stoeppler and Brandt (1980), and the abbreviated procedure published in the American Public Health Association's (APHA) Methods for Biological Monitoring (1988), are straightforward and are nearly identical. For the APHA method, a small aliquot (50-300 µl) of whole blood that has been stabilized with ethylenediaminetetraacetate (EDTA) is added to 1.0 ml 1 M HNO3, vigorously shaken and centrifuged. Aliquots (10-25 µl) of the supernatant then are analyzed by graphite-furnace AAS with appropriate background correction.
Using the method of Stoeppler and Brandt (1980), aliquots (50-200 µl) of whole blood that have been stabilized with EDTA are pipetted into clean polystyrene tubes and mixed with 150-600 µl of 1 M HNO3. After vigorous shaking, the solution is centrifuged and a 10-25 µl aliquot of the supernatant then is analyzed by graphite-furnace AAS with appropriate background correction.
Claeys-Thoreau (1982) and DeBenzo et al. (1990) diluted blood samples at a ratio of 1:10 with a matrix modifier (0.2% Triton X-100, a wetting agent) for direct determinations of CDB. DeBenzo et al. also demonstrated that aqueous standards of cadmium, instead of spiked, whole-blood samples, could be used to establish calibration curves if standards and samples are treated with additional small volumes of matrix modifiers (i.e., 1% HNO3, 0.2% ammonium hydrogenphosphate and 1 mg/ml magnesium salts).
These direct dilution procedures for CDB analysis are simple and rapid. Laboratories can process more than 100 samples a day using a dedicated graphite-furnace AAS, an auto-sampler, and either a Zeeman- or a deuterium-background correction system. Several authors emphasize using optimum settings for graphite-furnace temperatures during the drying, charring, and atomization processes associated with the flameless AAS method, and the need to run frequent QC samples when performing automated analysis.
5.1.3 Sample Collection and Handling
Sample collection procedures are addressed primarily to identify ways to minimize the variability that may be introduced by sample collection during medical monitoring. The extent to which collection procedures contribute to variability among CDB samples is unclear at this point. Sources of variation that may result from sampling procedures include time-of-day effects and the introduction of external contamination during the collection process. To minimize these sources, strict adherence to a sample collection protocol is recommended. Such a protocol must include provisions for thorough cleaning of the site from which blood will be extracted; also, every effort should be made to collect samples near the same time of day. It is also important to recognize that under the recent OSHA blood-borne pathogens standard (29 CFR 1910.1030), blood samples and certain body fluids must be handled and treated as if they are infectious.
5.1.4 Best Achievable Performance
The best achievable performance using a particular method for CDB determinations is assumed to be equivalent to the performance reported by research laboratories in which the method was developed.
For their method, Roberts and Clark (1986) demonstrated a limit of detection of 0.4 µg Cd/l in whole blood, with a linear response curve from 0.4 to 16.0 µg Cd/l. They report a coefficient of variation (CV) of 6.7% at 8.0 µg/l.
The APHA (1988) reports a range of 1.0-25 µg/l, with a CV of 7.3% (concentration not stated). Insufficient documentation was available to critique this method.
Stoeppler and Brandt (1980) achieved a detection limit of 0.2 µg Cd/l whole blood, with a linear range of 0.4-12.0 µg Cd/l, and a CV of 15-30% for samples at 1.0 µg/l. Improved precision (CV of 3.8%) was reported for CDB concentrations at 9.3 µg/l.
5.1.5 General Method Performance
For any particular method, the performance expected from commercial laboratories may be somewhat lower than that reported by the research laboratory in which the method was developed. With participation in appropriate proficiency programs and use of a proper in-house QA/QC program incorporating provisions for regular corrective actions, the performance of commercial laboratories is expected to approach that reported by research laboratories. Also, the results reported for existing proficiency programs serve as a gauge of the likely level of performance that currently can be expected from commercial laboratories offering these analyses.
Weber (1988) reports on the results of the proficiency program run by the Centre de Toxicologie du Quebec (CTQ). As indicated previously, participants in that program receive 18 blood samples per year having cadmium concentrations ranging from 0.2-20 µg/l. Currently, 76 laboratories are participating in this program. The program is established for several analytes in addition to cadmium, and not all of these laboratories participate in the cadmium proficiency-testing program.
Under the CTQ program, cadmium results from individual laboratories are compared against the consensus mean derived for each sample. Results indicate that after receiving 60 samples (i.e., after participation for approximately 3 years), 60% of the laboratories in the program are able to report results that fall within ±1 µg/l or 15% of the mean, whichever is greater. (For this procedure, the 15% criterion was applied to concentrations exceeding 7 µg/l.) On any single sample of the last 20 samples, the percentage of laboratories falling within the specified range is between 55 and 80%.
The CTQ also evaluates the performance of participating laboratories against a less severe standard: ±2 µg/l or 15% of the mean, whichever is greater (Weber 1988); 90% of participating laboratories are able to satisfy this standard after approximately 3 years in the program. (The 15% criterion is used for concentrations in excess of 13 µg/l.) On any single sample of the last 15 samples, the percentage of laboratories falling within the specified range is between 80 and 95% (except for a single test for which only 60% of the laboratories achieved the desired performance).
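These acceptance criteria combine an absolute tolerance with a relative one, whichever band is wider. The following minimal sketch shows how a reported result would be judged against such a criterion; the function name and example values are illustrative only and are not part of the CTQ program:

```python
def within_band(reported, consensus_mean, abs_tol, rel_tol=0.15):
    """True if a result falls within +/- abs_tol (ug/l) or within
    rel_tol * consensus_mean, whichever band is wider."""
    allowed = max(abs_tol, rel_tol * consensus_mean)
    return abs(reported - consensus_mean) <= allowed

# A 7.8 ug/l result against a 7.0 ug/l consensus mean, judged by the
# stricter criterion of +/- 1 ug/l or 15%:
print(within_band(7.8, 7.0, abs_tol=1.0))  # True; the band is +/- 1.05 ug/l
```

Taking the larger of the two tolerances reproduces the stated crossover points automatically: 15% of the mean exceeds 1 µg/l only for means above about 6.7 µg/l, and exceeds 2 µg/l only above about 13.3 µg/l.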
Based on the data presented in Weber (1988), the CV for analysis of CDB is nearly constant at 20% for cadmium concentrations exceeding 5 µg/l, and increases for cadmium concentrations below 5 µg/l. At 2 µg/l, the reported CV rises to approximately 40%. At 1 µg/l, the reported CV is approximately 60%.
Participating laboratories also tend to overestimate concentrations for samples exhibiting concentrations less than 2 µg/l (see Figure 11 of Weber 1988). This problem is due in part to the proficiency evaluation criterion that allows reporting a minimum ±2.0 µg/l for evaluated CDB samples. There is currently little economic or regulatory incentive for laboratories participating in the CTQ program to achieve greater accuracy for CDB samples containing cadmium at concentrations less than 2.0 µg/l, even if the laboratory has the experience and competency to distinguish among lower concentrations in the samples obtained from the CTQ.
The collective experience of international agencies and investigators demonstrates the need for a vigorous QC program to ensure that CDB values reported by participating laboratories are indeed reasonably accurate. As Friberg (1988) stated:
"Information about the quality of published data has often been lacking. This is of concern as assessment of metals in trace concentrations in biological media are fraught with difficulties from the collection, handling, and storage of samples to the chemical analyses. This has been proven over and over again from the results of interlaboratory testing and quality control exercises. Large variations in results were reported even from "experienced" laboratories."
The UNEP/WHO global study of cadmium biological monitoring set a limit for CDB accuracy using the maximum allowable deviation method at Y = X ± (0.1X + 1) for a targeted concentration of 10 µg Cd/l (Friberg and Vahter 1983). The performance of participating laboratories over a concentration range of 1.5-12 µg/l was reported by Lind et al. (1987). Of the 3 QC runs conducted during 1982 and 1983, 1 or 2 of the 6 laboratories failed each run. For the years 1983 and 1985, between zero and 2 laboratories failed each of the consecutive QC runs.
In another study (Vahter and Friberg 1988), QC samples consisting of both external (unknown) and internal (stated) concentrations were distributed to laboratories participating in the epidemiology research. In this study, the maximum acceptable deviation between the regression analysis of reported results and reference values was set at Y = X ± (0.05X + 0.2) for a concentration range of 0.3-5.0 µg Cd/l. It is reported that only 2 of 5 laboratories had acceptable data after the first QC set, and only 1 of 5 laboratories had acceptable data after the second QC set. By the fourth QC set, however, all 5 laboratories were judged proficient.
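In both studies the maximum allowable deviation is a linear function of the reference value X. A short sketch of that acceptance band follows; the function name is hypothetical, and only the parameter values quoted above come from the cited studies:

```python
def allowable_band(x, slope, const):
    """Acceptance band Y = X +/- (slope * X + const), the maximum allowable
    deviation form used in the UNEP/WHO criteria described above."""
    dev = slope * x + const
    return x - dev, x + dev

# For the targeted concentration of 10 ug Cd/l under Y = X +/- (0.1X + 1):
print(allowable_band(10.0, 0.1, 1.0))   # (8.0, 12.0)
# For a 5.0 ug Cd/l reference under Y = X +/- (0.05X + 0.2):
print(allowable_band(5.0, 0.05, 0.2))   # (4.55, 5.45)
```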
The need for high quality CDB monitoring is apparent when the toxicological and biological characteristics of this metal are considered; an increase in CDB from 2 to 4 µg/l could cause a doubling of the cadmium accumulation in the kidney, a critical target tissue for selective cadmium accumulation (Nordberg and Nordberg 1988).
Historically, the CDC's internal QC program for CDB monitoring has found achievable accuracy to be ±10% of the true value at CDB concentrations >5.0 µg/l (Paschal 1990). Data on the performance of laboratories participating in this program currently are not available.
5.1.6 Observed CDB Concentrations
As stated in Section 4.3, CDB concentrations are representative of ongoing levels of exposure to cadmium. Among those who have been exposed chronically to cadmium for extended periods, however, CDB may contain a component attributable to the general cadmium body burden.
5.1.6.1 CDB concentrations among unexposed samples
Numerous studies have been conducted examining CDB concentrations in the general population, and in control groups used for comparison with cadmium-exposed workers. A number of reports have been published that present erroneously high values of CDB (Nordberg and Nordberg 1988). This problem was due to contamination of samples during sampling and analysis, and to errors in analysis. Early AAS methods were not sufficiently sensitive to accurately estimate CDB concentrations.
Table 4 presents results of recent studies reporting CDB levels for the general U.S. population not exposed occupationally to cadmium. Other surveys of tissue cadmium using U.S. samples, conducted as part of a cooperative effort among Japan, Sweden and the U.S., did not collect CDB data because standard analytical methodologies were unavailable and because of analytic problems (Kjellstrom 1979; SWRI 1978).
Arithmetic and/or geometric means and standard deviations are provided in Table 4 for measurements among the populations defined in each study listed. The range of reported measurements and/or the upper and lower 95% confidence intervals for the means are presented when this information was reported in a study. For studies reporting either an arithmetic or geometric standard deviation along with a mean, the lower and upper 95th percentiles for the distribution also were derived and reported in the table.
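For studies reporting a geometric mean and geometric standard deviation, the percentile bounds follow from a lognormal assumption. A minimal sketch of that derivation appears below; the function name is hypothetical, and the example values are the Kowal and Zirkes figures quoted in Section 5.2.7.1:

```python
def lognormal_bounds(geo_mean, geo_sd, z=1.645):
    """Lower and upper 95th-percentile bounds of a lognormal distribution,
    computed as GM * GSD**(-z) and GM * GSD**(+z), with z = 1.645 the
    one-sided 95th-percentile z-score."""
    return geo_mean * geo_sd ** (-z), geo_mean * geo_sd ** z

# With GM = 0.66 ug/g CRTU and GSD = 2.7, the upper bound is about
# 3.4 ug/g CRTU, in line with the ~3 ug/g CRTU figure cited later.
print(lognormal_bounds(0.66, 2.7))
```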
The data provided in Table 4 from Kowal et al. (1979) are from studies conducted between 1974 and 1976 evaluating CDB levels for the general population in Chicago, and are considered to be representative of the U.S. population. These studies indicate that the average CDB concentration among those not occupationally exposed to cadmium is approximately 1 µg/l.
In several other studies presented in Table 4, measurements are reported separately for males and females, and for smokers and nonsmokers. The data in this table indicate that similar CDB levels are observed among males and females in the general population, but that smokers tend to exhibit higher CDB levels than nonsmokers. Based on the Kowal et al. (1979) study, smokers not occupationally exposed to cadmium exhibit an average CDB level of 1.4 µg/l.
In general, nonsmokers tend to exhibit levels ranging to 2 µg/l, while levels observed among smokers range to 5 µg/l. Based on the data presented in Table 4, 95% of those not occupationally exposed to cadmium exhibit CDB levels less than 5 µg/l.
5.1.6.2 CDB concentrations among exposed workers
Table 5 is a summary of results from studies reporting CDB levels among workers exposed to cadmium in the work place. As in Table 4, arithmetic and/or geometric means and standard deviations are provided if reported in the listed studies. The absolute range, or the 95% confidence interval around the mean, of the data in each study are provided when reported. In addition, the lower and upper 95th percentiles of the distribution are presented for each study in which a mean and corresponding standard deviation were reported. Table 5 also provides estimates of the duration, and level, of exposure to cadmium in the work place if these data were reported in the listed studies. The data presented in Table 5 suggest that CDB levels are dose related. Sukuri et al. (1983) show that higher CDB levels are observed among workers experiencing higher work place exposure. This trend appears to hold for every study listed in the table.
CDB levels reported in Table 5 are higher among those showing signs of cadmium-related kidney damage than those showing no such damage. Lauwerys et al. (1976) report CDB levels among workers with kidney lesions that generally are above the levels reported for workers without kidney lesions. Ellis et al. (1983) report a similar observation comparing workers with and without renal dysfunction, although they found more overlap between the 2 groups than Lauwerys et al.
The data in Table 5 also indicate that CDB levels are higher among those experiencing current occupational exposure than those who have been removed from such exposure. Roels et al. (1982) indicate that CDB levels observed among workers experiencing ongoing exposure in the work place are almost entirely above levels observed among workers removed from such exposure. This finding suggests that CDB levels decrease once cadmium exposure has ceased.
A comparison of the data presented in Tables 4 and 5 indicates that CDB levels observed among cadmium-exposed workers are significantly higher than levels observed among the unexposed groups. With the exception of 2 studies presented in Table 5 (1 of which includes former workers in the sample group tested), the lower 95th percentiles for CDB levels among exposed workers are greater than 5 µg/l, which is the value of the upper 95th percentile for CDB levels observed among those who are not occupationally exposed. Therefore, a CDB level of 5 µg/l represents a threshold above which significant work place exposure to cadmium may be occurring.
5.1.7 Conclusions and Recommendations for CDB
Based on the above evaluation, the following recommendations are made for a CDB proficiency program.
5.1.7.1 Recommended method
The method of Stoeppler and Brandt (1980) should be adopted for analyzing CDB. This method was selected over other methods for its straightforward sample-preparation procedures, and because limitations of the method were described adequately. It also is the method used by a plurality of laboratories currently participating in the CTQ proficiency program. In a recent CTQ interlaboratory comparison report (CTQ 1991), analysis of the methods used by laboratories to measure CDB indicates that 46% (11 of 24) of the participating laboratories used the Stoeppler and Brandt methodology (HNO3 deproteinization of blood followed by analysis of the supernatant by GF-AAS). Other CDB methods employed by participating laboratories identified in the CTQ report include dilution of blood (29%), acid digestion (12%) and miscellaneous methods (12%).
Laboratories may adopt alternate methods, but it is the responsibility of the laboratory to demonstrate that the alternate methods meet the data quality objectives defined for the Stoeppler and Brandt method (see Section 5.1.7.2 below).
5.1.7.2 Data quality objectives
Based on the above evaluation, the following data quality objectives (DQOs) should facilitate interpretation of analytical results.
Limit of Detection. 0.5 µg/l should be achievable using the Stoeppler and Brandt method. Stoeppler and Brandt (1980) report a limit of detection equivalent to <0.2 µg/l in whole blood using 2 µl aliquots of deproteinized, diluted blood samples.
Accuracy. Initially, some of the laboratories performing CDB measurements may be expected to satisfy criteria similar to the less severe criteria specified by the CTQ program, i.e., measurements within ±2 µg/l or 15% (whichever is greater) of the target value. About 60% of the laboratories enrolled in the CTQ program could meet this criterion on the first proficiency test (Weber 1988).
Currently, approximately 12 laboratories in the CTQ program are achieving an accuracy for CDB analysis within the more severe constraints of ±1 µg/l or 15% (whichever is greater). Later, as laboratories gain experience, they should achieve the level of accuracy exhibited by these 12 laboratories. The experience in the CTQ program has shown that, even without incentives, laboratories benefit from the feedback of the program; after they have analyzed 40-50 control samples from the program, performance improves to the point where about 60% of the laboratories can meet the stricter criterion of ±1 µg/l or 15% (Weber 1988). Thus, this stricter target accuracy is a reasonable DQO.
Precision. Although Stoeppler and Brandt (1980) suggest that a coefficient of variation (CV) near 1.3% (for a 10 µg/l concentration) is achievable for within-run reproducibility, it is recognized that other factors affecting within-run and between-run comparability will increase the achievable CV. Stoeppler and Brandt (1980) observed CVs that were as high as 30% for low concentrations (0.4 µg/l), and CVs of less than 5% for higher concentrations.
For internal QC samples (see Section 3.3.1), laboratories should attain an overall precision near 25%. For CDB samples with concentrations less than 2 µg/l, a target precision of 40% is reasonable, while precisions of 20% should be achievable for concentrations greater than 2 µg/l. Although these values are more strict than values observed in the CTQ interlaboratory program reported by Weber (1988), they are within the achievable limits reported by Stoeppler and Brandt (1980).
5.1.7.3 Quality assurance/quality control
Commercial laboratories providing measurement of CDB should adopt an internal QA/QC program that incorporates the following components: Strict adherence to the selected method, including all calibration requirements; regular incorporation of QC samples during actual runs; a protocol for corrective actions, and documentation of these actions; and, participation in an interlaboratory proficiency program. Note that the nonmandatory QA/QC program presented in Attachment 3 is based on the Stoeppler and Brandt method for CDB analysis. Should an alternate method be adopted, the laboratory should develop a QA/QC program satisfying the provisions of Section 3.3.1.
5.2 Measuring Cadmium in Urine (CDU)
As in the case of CDB measurement, proper determination of CDU requires strict analytical discipline regarding collection and handling of samples. Because cadmium is both ubiquitous in the environment and employed widely in coloring agents for industrial products that may be used during sample collection, preparation and analysis, care should be exercised to ensure that samples are not contaminated during the sampling procedure.
Methods for CDU determination share many of the same features as those employed for the determination of CDB. Thus, changes and improvements to methods for measuring CDU over the past 40 years parallel those used to monitor CDB. The direction of development has largely been toward the simplification of sample preparation techniques made possible because of improvements in analytic techniques.
5.2.1 Units of CDU Measurement
Procedures adopted for reporting CDU concentrations are not uniform. In fact, the situation for reporting CDU is more complicated than for CDB, where concentrations are normalized against a unit volume of whole blood.
Concentrations of solutes in urine vary with several biological factors (including the time since last voiding and the volume of liquid consumed over the last few hours); as a result, solute concentrations should be normalized against another characteristic of urine that represents changes in solute concentrations. The 2 most common techniques are either to standardize solute concentrations against the concentration of creatinine, or to standardize solute concentrations against the specific gravity of the urine. Thus, CDU concentrations have been reported in the literature as "uncorrected" concentrations of cadmium per volume of urine (i.e., µg Cd/l urine), "corrected" concentrations of cadmium per volume of urine at a standard specific gravity (i.e., µg Cd/l urine at a specific gravity of 1.020), or "corrected" mass concentration per unit mass of creatinine (i.e., µg Cd/g creatinine). (CDU concentrations [whether uncorrected or corrected for specific gravity, or normalized to creatinine] occasionally are reported in nanomoles [i.e., nmoles] of cadmium per unit mass or volume. In this protocol, these values are converted to µg of cadmium per unit mass or volume using 89 nmoles of cadmium = 10 µg.)
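The nmole-to-µg conversion is simple arithmetic; a minimal sketch using the equivalence stated above (the function name is hypothetical):

```python
def nmol_to_ug_cd(nmol):
    """Convert cadmium from nmoles to ug using the protocol's stated
    equivalence of 89 nmoles Cd = 10 ug."""
    return nmol * 10.0 / 89.0

print(nmol_to_ug_cd(89.0))   # 10.0 ug
print(nmol_to_ug_cd(8.9))    # 1.0 ug
```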
While it is agreed generally that urine values of analytes should be normalized for reporting purposes, some debate exists over which correction method should be used. The medical community has long favored normalization based on creatinine concentration, a common urinary constituent. Creatinine is a normal product of tissue catabolism, is excreted at a uniform rate, and the total amount excreted per day is constant on a day-to-day basis (NIOSH 1984b). While this correction method is accepted widely in Europe, and within some occupational health circles, Kowal (1983) argues that the use of specific gravity (i.e., total solids per unit volume) is more straightforward and practical than creatinine in adjusting CDU values for populations that vary by age or gender.
Kowal (1983) found that urinary creatinine (CRTU) is lower in females than males, and also varies with age. Creatinine excretion is highest in younger males (20-30 years old), decreases at middle age (50-60 years), and may rise slightly in later years. Thus, cadmium concentrations may be underestimated for some workers with high CRTU levels.
Within a single void urine collection, urine concentration of any analyte will be affected by recent consumption of large volumes of liquids, and by heavy physical labor in hot environments. The absolute amount of analyte excreted may be identical, but concentrations will vary widely so that urine must be corrected for specific gravity (i.e., to normalize concentrations to the quantity of total solute) using a fixed value (e.g., 1.020 or 1.024). However, since heavy-metal exposure may increase urinary protein excretion, there is a tendency to underestimate cadmium concentrations in samples with high specific gravities when specific-gravity corrections are applied.
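The specific-gravity correction described above scales a measured concentration to what it would be at the fixed reference gravity. A minimal sketch of the conventional calculation follows; the function name and example values are illustrative assumptions, not part of the protocol:

```python
def sg_corrected(conc, sg_measured, sg_reference=1.020):
    """Scale a urinary analyte concentration to a fixed reference specific
    gravity using the conventional factor (SG_ref - 1)/(SG_measured - 1)."""
    return conc * (sg_reference - 1.0) / (sg_measured - 1.0)

# A dilute urine (SG 1.010) reading 0.8 ug Cd/l normalizes upward:
print(sg_corrected(0.8, 1.010))  # 1.6 ug/l at SG 1.020
```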
Despite some shortcomings, reporting solute concentrations as a function of creatinine concentration is accepted generally; OSHA therefore recommends that CDU levels be reported as the mass of cadmium per unit mass of creatinine (µg/g CRTU).
Reporting CDU as µg/g CRTU requires an additional analytical process beyond the analysis of cadmium: Samples must be analyzed independently for creatinine so that results may be reported as the ratio of cadmium to creatinine concentrations found in the urine sample. Consequently, the overall quality of the analysis depends on the combined performance by a laboratory on these 2 determinations. The analysis used for CDU determinations is addressed below in terms of µg Cd/l, with analysis of creatinine addressed separately. Techniques for assessing creatinine are discussed in Section 5.4.
Techniques for deriving cadmium as a ratio of CRTU, and the confidence limits for independent measurements of cadmium and CRTU, are provided in Section 3.3.3.
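The creatinine-normalized result is simply the ratio of the 2 independent determinations. A minimal sketch is shown below (the function name and example values are hypothetical; Section 3.3.3 covers the combined confidence limits):

```python
def cdu_per_g_crtu(cdu_ug_per_l, crtu_g_per_l):
    """Report CDU as ug of cadmium per g of creatinine: the ratio of the
    cadmium and creatinine concentrations measured on the same sample."""
    return cdu_ug_per_l / crtu_g_per_l

# Example: 2.4 ug Cd/l in urine containing 1.2 g creatinine/l:
print(cdu_per_g_crtu(2.4, 1.2))  # 2.0 ug/g CRTU
```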
5.2.2 Analytical Techniques Used to Monitor CDU
Analytical techniques used for CDU determinations are similar to those employed for CDB determinations; these techniques are summarized in Table 3. As with CDB monitoring, the technique most suitable for CDU determinations is atomic absorption spectroscopy (AAS). AAS methods used for CDU determinations typically employ a graphite furnace, with background correction made using either the deuterium-lamp or Zeeman techniques; Section 5.1.1 provides a detailed description of AAS methods.
5.2.3 Methods Developed for CDU Determinations
Princi (1947), Smith et al. (1955), Smith and Kench (1957), and Tsuchiya (1967) used colorimetric procedures similar to those described in the CDB section above to estimate CDU concentrations. In these methods, urine (50 ml) is reduced to dryness by heating in a sand bath and digested (wet ashed) with mineral acids. Cadmium then is complexed with dithizone, extracted with chloroform and quantified by spectrophotometry. These early studies typically report reagent blank values equivalent to 0.3 µg Cd/l, and CDU concentrations among nonexposed control groups at maximum levels of 10 µg Cd/l, erroneously high values when compared to more recent surveys of cadmium concentrations in the general population.
By the mid-1970s, most analytical procedures for CDU analysis used either wet ashing (mineral acid) or high temperatures (>400 °C) to digest the organic matrix of urine, followed by cadmium chelation with APDC or DDTC solutions and extraction with MIBK. The resulting aliquots were analyzed by flame or graphite-furnace AAS (Kjellstrom 1979).
Improvements in control over temperature parameters with electrothermal heating devices used in conjunction with flameless AAS techniques, and optimization of temperature programs for controlling the drying, charring, and atomization processes in sample analyses, led to improved analytical detection of diluted urine samples without the need for sample digestion or ashing. Roels et al. (1978) successfully used a simple sample preparation, dilution of 1.0 ml aliquots of urine with 0.1 N HNO3, to achieve accurate low-level determinations of CDU.
In the method described by Pruszkowska et al. (1983), which has become the preferred method for CDU analysis, urine samples were diluted at a ratio of 1:5 with water; diammonium hydrogenphosphate in dilute HNO3 was used as a matrix modifier. The matrix modifier allows for a higher charring temperature without loss of cadmium through volatilization during pre-atomization. This procedure also employs a stabilized temperature platform in a graphite furnace, while nonspecific background absorption is corrected using the Zeeman technique. This method allows for an absolute detection limit of approximately 0.04 µg Cd/l urine.
5.2.4 Sample Collection and Handling
Sample collection procedures for CDU may contribute to variability observed among CDU measurements. Sources of variation attendant to sampling include time-of-day, the interval since ingestion of liquids, and the introduction of external contamination during the collection process. Therefore, to minimize contributions from these variables, strict adherence to a sample-collection protocol is recommended. This protocol should include provisions for normalizing the conditions under which urine is collected. Every effort also should be made to collect samples during the same time of day.
Collection of urine samples from an industrial work force for biological monitoring purposes usually is performed using "spot" (i.e., single-void) urine with the pH of the sample determined immediately. Logistic and sample-integrity problems arise when efforts are made to collect urine over long periods (e.g., 24 hrs). Unless single-void urines are used, there are numerous opportunities for measurement error because of poor control over sample collection, storage and environmental contamination.
To minimize the interval during which sample urine resides in the bladder, the following adaptation to the "spot" collection procedure is recommended: The bladder should first be emptied, and then a large glass of water should be consumed; the sample may be collected within an hour after the water is consumed.
5.2.5 Best Achievable Performance
Performance using a particular method for CDU determinations is assumed to be equivalent to the performance reported by the research laboratories in which the method was developed. Pruszkowska et al. (1983) report a detection limit of 0.04 µg/l CDU, with a CV of <4% between 0-5 µg/l. The CDC reports a minimum CDU detection limit of 0.07 µg/l using a modified method based on Pruszkowska et al. (1983). No CV is stated in this protocol; the protocol contains only rejection criteria for internal QC parameters used during accuracy determinations with known standards (Attachment 8 of exhibit 106 of OSHA docket H057A). Stoeppler and Brandt (1980) report a CDU detection limit of 0.2 µg/l for their methodology.
5.2.6 General Method Performance
For any particular method, the expected initial performance from commercial laboratories may be somewhat lower than that reported by the research laboratory in which the method was developed. With participation in appropriate proficiency programs, and use of a proper in-house QA/QC program incorporating provisions for regular corrective actions, the performance of commercial laboratories may be expected to improve and approach that reported by research laboratories. The results reported for existing proficiency programs serve to specify the initial level of performance that likely can be expected from commercial laboratories offering analysis using a particular method.
Weber (1988) reports on the results of the CTQ proficiency program, which includes CDU results for laboratories participating in the program. Results indicate that after receiving 60 samples (i.e., after participating in the program for approximately 3 years), approximately 80% of the participating laboratories report CDU results ranging between ±2 µg/l or 15% of the consensus mean, whichever is greater. On any single sample of the last 15 samples, the proportion of laboratories falling within the specified range is between 75 and 95%, except for a single test for which only 60% of the laboratories reported acceptable results. For each of the last 15 samples, approximately 60% of the laboratories reported results within ±1 µg/l or 15% of the mean, whichever is greater. The range of concentrations included in this set of samples was not reported.
Another report from the CTQ (1991) summarizes preliminary CDU results from their 1991 interlaboratory program. According to the report, for 3 CDU samples with values of 9.0, 16.8, and 31.5 µg/l, acceptable results (target ±2 µg/l) were achieved by only 44-52% of the 34 laboratories participating in the CDU program. The overall CVs for these 3 CDU samples among the 34 participating laboratories were 31%, 25%, and 49%, respectively. The reason for this poor performance has not been determined.
A more recent report from the CTQ (Weber, private communication) indicates that 36% of the laboratories in the program have been able to achieve the target of ±1 µg/l or 15% for more than 75% of the samples analyzed over the last 5 years, while 45% of participating laboratories achieved a target of ±2 µg/l or 15% for more than 75% of the samples analyzed over the same period.
Note that results reported in the interlaboratory programs are in terms of µg Cd/l of urine, unadjusted for creatinine. The performance indicated, therefore, is a measure of the performance of the cadmium portion of the analyses, and does not include variation that may be introduced during the analysis of CRTU.
5.2.7 Observed CDU Concentrations
Prior to the onset of renal dysfunction, CDU concentrations provide a general indication of the exposure history (i.e., body burden) (see Section 4.3). Once renal dysfunction occurs, CDU levels appear to increase and are no longer indicative solely of cadmium body burden (Friberg and Elinder 1988).
5.2.7.1 Range of CDU concentrations observed among unexposed samples
Surveys of CDU concentrations in the general population were first reported from cooperative studies among industrial countries (i.e., Japan, U.S. and Sweden) conducted in the mid-1970s. In summarizing these data, Kjellstrom (1979) reported that CDU concentrations among Dallas, Texas men (age range: 9-59 years; smokers and nonsmokers) varied from 0.11-1.12 µg/l (uncorrected for creatinine or specific gravity). These CDU concentrations are intermediate between population values found in Sweden (range: 0.11-0.80 µg/l) and Japan (range: 0.14-2.32 µg/l).
Kowal and Zirkes (1983) reported CDU concentrations for almost 1,000 samples collected during 1978-79 from the general U.S. adult population (i.e., nine states; both genders; ages 20-74 years). They report that CDU concentrations are lognormally distributed; low levels predominated, but a small proportion of the population exhibited high levels. These investigators transformed the CDU concentration values, and reported the same data 3 different ways: µg/l urine (unadjusted), µg/l (specific gravity adjusted to 1.020), and µg/g CRTU. These data are summarized in Tables 6 and 7.
Based on further statistical examination of these data, including the lifestyle characteristics of this group, Kowal (1988) suggested increased cadmium absorption (i.e., body burden) was correlated with low dietary intakes of calcium and iron, as well as cigarette smoking.
CDU levels presented in Table 6 are adjusted for age and gender. Results suggest that CDU levels may be slightly different among men and women (i.e., higher among men when values are unadjusted, but lower among men when the values are adjusted for specific gravity or CRTU). Mean differences among men and women are small compared to the standard deviations, and therefore may not be significant. Levels of CDU also appear to increase with age. The data in Table 6 suggest as well that reporting CDU levels adjusted for specific gravity or as a function of CRTU results in reduced variability.
The data in Table 6 indicate the geometric mean of CDU levels observed among the general population is 0.52 µg Cd/l urine (unadjusted), with a geometric standard deviation of 3.0. Normalized for creatinine, the geometric mean for the population is 0.66 µg/g CRTU, with a geometric standard deviation of 2.7. Table 7 provides the distributions of CDU concentrations for the general population studied by Kowal and Zirkes. The data in this table indicate that 95% of the CDU levels observed among those not occupationally exposed to cadmium are below 3 µg/g CRTU.
5.2.7.2 Range of CDU concentrations observed among exposed workers
Table 8 is a summary of results from available studies of CDU concentrations observed among cadmium-exposed workers. In this table, arithmetic and/or geometric means and standard deviations are provided if reported in these studies. The absolute range for the data in each study, or the 95% confidence interval around the mean of each study, also are provided when reported. The lower and upper 95th percentiles of the distribution are presented for each study in which a mean and corresponding standard deviation were reported. Table 8 also provides estimates of the years of exposure, and the levels of exposure, to cadmium in the work place if reported in these studies. Concentrations reported in this table are in µg/g CRTU, unless otherwise stated.
Data in Table 8 from Lauwerys et al. (1976) and Ellis et al. (1983) indicate that CDU concentrations are higher among those exhibiting kidney lesions or dysfunction than among those lacking these symptoms. Data from the study by Roels et al. (1982) indicate that CDU levels decrease among workers removed from occupational exposure to cadmium in comparison to workers experiencing ongoing exposure. In both cases, however, the distinction between the 2 groups is not as clear as with CDB; there is more overlap in CDU levels observed among each of the paired populations than is true for corresponding CDB levels. As with CDB levels, the data in Table 8 suggest increased CDU concentrations among workers who experienced increased overall exposure.
Although a few occupationally-exposed workers in the studies presented in Table 8 exhibit CDU levels below 3 µg/g CRTU, most of those workers exposed to cadmium levels in excess of the PEL defined in the final cadmium rule exhibit CDU levels above 3 µg/g CRTU; this level represents the upper 95th percentile of the CDU distribution observed among those who are not occupationally exposed to cadmium (Table 7).
With 2 exceptions, the mean CDU levels reported in Table 8 among the occupationally-exposed groups studied exceed 3 µg/g CRTU. Correspondingly, the levels of exposure reported in these studies (with 1 exception) are significantly higher than what workers will experience under the final cadmium rule. The 2 exceptions are from the studies by Mueller et al. (1989) and Kawada et al. (1990); these studies indicate that workers exposed to cadmium during pigment manufacture do not exhibit CDU levels as high as those observed among workers exposed to cadmium in other occupations. Exposure levels, however, were lower in the pigment manufacturing plants studied. Significantly, workers removed from occupational cadmium exposure for an average of 4 years still exhibited CDU levels in excess of 3 µg/g CRTU (Roels et al. 1982). In the single-exception study with a reported level of cadmium exposure lower than levels proposed in the final rule (i.e., the study of a pigment manufacturing plant by Kawada et al. 1990), most of the workers exhibited CDU levels less than 3 µg/g CRTU (i.e., the mean value was only 1.3 µg/g CRTU). CDU levels among workers with such limited cadmium exposure are expected to be significantly lower than levels reported in Table 8.
Based on the above data, a CDU level of 3 µg/g CRTU appears to represent a threshold above which significant work place exposure to cadmium occurs over the work span of those being monitored. Note that this threshold is not as distinct as the corresponding threshold described for CDB. In general, the variability associated with CDU measurements among exposed workers appears to be higher than the variability associated with CDB measurements among similar workers.
5.2.8 Conclusions and Recommendations for CDU
The above evaluation supports the following recommendations for a CDU proficiency program. These recommendations address only sampling and analysis procedures for CDU determinations specifically, which are to be reported as unadjusted µg Cd/l urine. Normalizing this result to creatinine requires a second analysis for CRTU so that the ratio of the 2 measurements can be obtained. Creatinine analysis is addressed in Section 5.4. Formal procedures for combining the 2 measurements to derive a value and a confidence limit for CDU in µg/g CRTU are provided in Section 3.3.3.
5.2.8.1 Recommended method
The method of Pruszkowska et al. (1983) should be adopted for CDU analysis. This method is recommended because it is simple, straightforward and reliable (i.e., small variations in experimental conditions do not affect the analytical results).
A synopsis of the methods used by laboratories to determine CDU under the interlaboratory program administered by the CTQ (1991) indicates that more than 78% (24 of 31) of the participating laboratories use a dilution method to prepare urine samples for CDU analysis. Laboratories may adopt alternate methods, but it is the responsibility of the laboratory to demonstrate that the alternate methods provide results of comparable quality to the Pruszkowska method.
5.2.8.2 Data quality objectives
The following data quality objectives should facilitate interpretation of analytical results, and are achievable based on the above evaluation.
Limit of Detection. A level of 0.5 µg/l (i.e., corresponding to a detection limit of 0.5 µg/g CRTU, assuming 1 g creatinine/l urine) should be achievable. Pruszkowska et al. (1983) achieved a limit of detection of 0.04 µg/l for CDU based on the slope of the curve for their working standards (0.35 pg Cd/0.0044 A signal = 1% absorbance, using GF-AAS).
The CDC reports a minimum detection limit for CDU of 0.07 µg/l using a modified Pruszkowska method. This limit of detection was defined as 3 times the standard deviation calculated from 10 repeated measurements of a "low level" CDU test sample (Attachment 8 of exhibit 106 of OSHA docket H057A).
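That definition of the detection limit is straightforward to compute. The minimal sketch below mirrors it; the readings are hypothetical and are not CDC data:

```python
import statistics

def detection_limit(replicates):
    """Limit of detection defined as 3 times the standard deviation of
    repeated measurements of a low-level test sample."""
    return 3.0 * statistics.stdev(replicates)

# Ten hypothetical low-level CDU readings (ug/l):
readings = [0.05, 0.07, 0.04, 0.06, 0.08, 0.05, 0.07, 0.06, 0.05, 0.07]
print(round(detection_limit(readings), 3))  # about 0.037 ug/l
```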
Stoeppler and Brandt (1980) report a limit of detection for CDU of 0.2 µg/l using an aqueous dilution (1:2) of the urine samples.
Accuracy. A recent report from the CTQ (Weber, private communication) indicates that 36% of the laboratories in the program achieve the target of ±1 µg/l or 15% for more than 75% of the samples analyzed over the last 5 years, while 45% of participating laboratories achieve a target of ±2 µg/l or 15% for more than 75% of the samples analyzed over the same period. With time and a strong incentive for improvement, it is expected that the proportion of laboratories successfully achieving the stricter level of accuracy should increase. It should be noted, however, that these indices of performance do not include variations resulting from the ancillary measurement of CRTU (which is recommended for the proper recording of results). The low cadmium levels expected to be measured indicate that the analysis of creatinine will contribute relatively little to the overall variability observed among creatinine-normalized CDU levels (see Section 5.4). The initial target value for reporting CDU under this program, therefore, is set at ±1 µg/g CRTU or 15% (whichever is greater).
Precision. For internal QC samples (which are recommended as part of an internal QA/QC program, Section 3.3.1), laboratories should attain an overall precision of 25%. For CDU samples with concentrations less than 2 µg/l, a target precision of 40% is reasonable, while precisions of 20% should be achievable for CDU concentrations greater than 2 µg/l. Although these values are more stringent than those observed in the CTQ interlaboratory program reported by Weber (1988), they are well within limits expected to be achievable for the method as reported by Stoeppler and Brandt (1980).
5.2.8.3 Quality assurance/quality control
Commercial laboratories providing CDU determinations should adopt an internal QA/QC program that incorporates the following components: Strict adherence to the selected method, including calibration requirements; regular incorporation of QC samples during actual runs; a protocol for corrective actions, and documentation of such actions; and, participation in an interlaboratory proficiency program. Note that the nonmandatory program presented in Attachment 1 as an example of an acceptable QA/QC program is based on using the Pruszkowska method for CDU analysis. Should an alternate method be adopted by a laboratory, the laboratory should develop a QA/QC program equivalent to the nonmandatory program, and which satisfies the provisions of Section 3.3.1.
5.3 Monitoring β2-Microglobulin in Urine (B2MU)
As indicated in Section 4.3, B2MU appears to be the best of several small proteins that may be monitored as early indicators of cadmium-induced renal damage. Several analytic techniques are available for measuring B2M.
5.3.1 Units of B2MU Measurement
Procedures adopted for reporting B2MU levels are not uniform. In these guidelines, OSHA recommends that B2MU levels be reported as µg/g CRTU, similar to reporting CDU concentrations. Reporting B2MU normalized to the concentration of CRTU requires an additional analytical process beyond the analysis of B2M: Independent analysis for creatinine so that results may be reported as a ratio of the B2M and creatinine concentrations found in the urine sample. Consequently, the overall quality of the analysis depends on the combined performance on these 2 analyses. The analysis used for B2MU determinations is described in terms of µg B2M/l urine, with analysis of creatinine addressed separately. Techniques used to measure creatinine are provided in Section 5.4. Note that Section 3.3.3 provides techniques for deriving the value of B2M as a function of CRTU, and the confidence limits for independent measurements of B2M and CRTU.
5.3.2 Analytical Techniques Used to Monitor B2MU
One of the earliest tests used to measure B2MU was the radial immunodiffusion technique. This technique is a simple and specific method for identification and quantitation of a number of proteins found in human serum and other body fluids when the protein is not readily differentiated by standard electrophoretic procedures. A quantitative relationship exists between the concentration of a protein deposited in a well that is cut into a thin agarose layer containing the corresponding monospecific antiserum, and the distance that the resultant complex diffuses. The wells are filled with an unknown serum and the standard (or control), and incubated in a moist environment at room temperature. After the optimal point of diffusion has been reached, the diameters of the resulting precipitin rings are measured. The diameter of a ring is related to the concentration of the constituent substance. For B2MU determinations required in the medical monitoring program, this method requires a concentration process that may be insufficient to concentrate the protein to the levels required for detection.
Radioimmunoassay (RIA) techniques are used widely in immunologic assays to measure the concentration of antigen or antibody in body-fluid samples. RIA procedures are based on competitive-binding techniques. If antigen concentration is being measured, the principle underlying the procedure is that radioactive-labeled antigen competes with the sample's unlabeled antigen for binding sites on a known amount of immobile antibody. When these 3 components are present in the system, an equilibrium exists. This equilibrium is followed by a separation of the free and bound forms of the antigen. Either free or bound radioactive-labeled antigen can be assessed to determine the amount of antigen in the sample. The analysis is performed by measuring the level of radiation emitted either by the bound complex following removal of the solution containing the free antigen, or by the isolated solution containing the residual free antigen. The main advantage of the RIA method is the extreme sensitivity of detection for emitted radiation and the corresponding ability to detect trace amounts of antigen. Additionally, large numbers of tests can be performed rapidly.
The enzyme-linked immunosorbent assay (ELISA) techniques are similar to RIA techniques except that nonradioactive labels are employed. This technique is safe, specific and rapid, and is nearly as sensitive as RIA techniques. An enzyme-labeled antigen is used in the immunologic assay; the labeled antigen detects the presence and quantity of unlabeled antigen in the sample. In a representative ELISA test, a plastic plate is coated with antibody (e.g., antibody to B2M). The antibody reacts with antigen (B2M) in the urine and forms an antigen-antibody complex on the plate. A second anti-B2M antibody (i.e., labeled with an enzyme) is added to the mixture and forms an antibody-antigen-antibody complex. Enzyme activity is measured spectrophotometrically after the addition of a specific chromogenic substrate which is activated by the bound enzyme. The results of a typical test are calculated by comparing the spectrophotometric reading of a serum sample to that of a control or reference serum. In general, these procedures are faster and require less laboratory work than other methods.