Standards should be kept fresh; as samples age, they should be compared with new standards and replaced if necessary.
Internal Quality Control Analyses. Internal QC samples should be analyzed interspersed with the compliance samples. At a minimum, these samples should be run at a rate of 5% of the compliance samples or 2 samples per analytic run, whichever is greater. If only 2 samples are run, they should contain different levels of cadmium.
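The minimum-frequency rule above can be expressed as a short calculation; the function name below is illustrative and not part of the protocol:

```python
import math

def min_qc_samples(n_compliance: int) -> int:
    # Internal QC samples per analytic run: 5% of the compliance
    # samples or 2 samples, whichever is greater.
    return max(math.ceil(0.05 * n_compliance), 2)

# A run of 60 compliance samples needs 3 QC samples (5% of 60);
# a run of 20 needs the floor of 2.
print(min_qc_samples(60), min_qc_samples(20))
```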
Internal QC samples may be obtained as commercially available reference materials and/or they may be internally prepared. Internally prepared samples should be well characterized and traced or compared to a reference material for which a consensus value is available.
Levels of cadmium contained in QC samples should not be known to the analyst prior to reporting the results of the analysis.
Internal QC results should be plotted or charted in a manner which describes sample recovery and laboratory control limits.
Internal Control Limits. The laboratory protocol for evaluating internal QC analyses against control limits should be clearly defined. Limits may be based on statistical methods (e.g., 2S from the laboratory mean recovery), or on proficiency testing limits (e.g., ±2 µg or 15% of the mean, whichever is greater).
Statistical limits that exceed ±40% should be reevaluated to determine the source of error in the analysis.
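As a sketch, a single recovery could be screened against the proficiency-style limit quoted above (±2 µg or 15% of the mean, whichever is greater); the function and the µg/l units are illustrative assumptions:

```python
def within_limits(measured: float, target_mean: float) -> bool:
    # Proficiency-style limit: +/-2 ug/l or 15% of the target mean,
    # whichever is greater (units assumed to be ug/l).
    limit = max(2.0, 0.15 * target_mean)
    return abs(measured - target_mean) <= limit

# For a 10 ug/l QC sample the limit is max(2.0, 1.5) = 2.0 ug/l.
print(within_limits(11.5, 10.0), within_limits(12.5, 10.0))
```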
When laboratory limits are exceeded, analytic work should terminate until the source of error is determined and corrected; compliance samples affected by the error should be reanalyzed. In addition, the laboratory protocol should address any unusual trends that develop which may be biasing the results. Numerous, consecutive results above or below laboratory mean recoveries, or outside laboratory statistical limits, indicate that problems may have developed.
Corrective Actions. The QA/QC plan should document in detail specific actions taken if control limits are exceeded or unusual trends develop. Corrective actions should be noted on an appropriate form, accompanied by supporting documentation.
In addition to these actions, laboratories should include whatever additional actions are necessary to assure that accurate data are reported to the responsible physicians.
Reference Materials. The following reference materials may be available:
Cadmium in Blood (CDB)
1. Centre de Toxicologie du Quebec, Le Centre Hospitalier de l'Universite Laval, 2705 boul. Laurier, Quebec, Que., Canada G1V 4G2. (Prepared 6 times per year at 1-15 µg Cd/l.)
2. H. Marchandise, Community Bureau of Reference-BCR, Directorate General XII, Commission of the European Communities, 200, rue de la Loi, B-1049, Brussels, Belgium. (Prepared as Bl CBM-1 at 5.37 µg Cd/l, and Bl CBM-2 at 12.38 µg Cd/l.)
3. Kaulson Laboratories Inc., 691 Bloomfield Ave., Caldwell, NJ 07006; tel: (201) 226-9494, FAX (201) 226-3244. (Prepared as #0141 [As, Cd, Hg, Pb] at 2 levels.)
Cadmium in Urine (CDU)
1. Centre de Toxicologie du Quebec, Le Centre Hospitalier de l'Universite Laval, 2705 boul. Laurier, Quebec, Que., Canada G1V 4G2. (Prepared 6 times per year.)
2. National Institute of Standards and Technology (NIST), Dept. of Commerce, Gaithersburg, MD; tel: (301) 975-6776. (Prepared as SRM 2670 freeze-dried urine [metals]; set includes normal and elevated levels of metals; cadmium is certified for the elevated level of 88.0 µg/l in reconstituted urine.)
3. Kaulson Laboratories Inc., 691 Bloomfield Ave., Caldwell, NJ 07006; tel: (201) 226-9494, FAX (201) 226-3244. (Prepared as #0140 [As, Cd, Hg, Pb] at 2 levels.)
3.3.1.2 QA/QC procedures for establishing control of B2MU
A written, detailed QA/QC plan for B2MU analysis should be developed. The QA/QC plan should contain a protocol similar to those protocols developed for the CDB/CDU analyses. Differences in analyses may warrant some differences in the QA/QC protocol, but procedures to ensure analytical integrity should be developed and followed.
Examples of performance summaries that can be provided include measurements of accuracy (i.e., the means of measured values versus target values for the control samples) and precision (i.e., based on duplicate analyses). It is recommended that the accuracy and precision measurements be compared to those reported as achievable by the Pharmacia Delphia kit (Pharmacia 1990) to determine if and when unsatisfactory analyses have arisen. If the measurement error of 1 or more of the control samples is more than 15%, the run exceeds control limits. Similarly, this decision is warranted when the average CV for duplicate samples is greater than 5%.
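These two acceptance criteria (no control off by more than 15%, and an average duplicate CV of at most 5%) can be sketched as follows; the function name and the data layout are illustrative:

```python
import statistics

def run_in_control(controls, duplicates):
    # controls: (measured, target) pairs; the run fails if any
    # control errs from its target by more than 15%.
    for measured, target in controls:
        if abs(measured - target) / target > 0.15:
            return False
    # duplicates: (a, b) pairs; the run also fails if the average
    # coefficient of variation (CV) across the pairs exceeds 5%.
    cvs = [statistics.stdev(pair) / statistics.mean(pair)
           for pair in duplicates]
    return statistics.mean(cvs) <= 0.05
```

For example, a run with one control at 110 against a target of 100 (10% error) and one tight duplicate pair passes; raising the control error past 15%, or widening the duplicates, fails the run.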
3.3.2 Procedures for Record Keeping
To satisfy reporting requirements for commercial analyses of CDB, CDU and/or B2MU performed for the medical monitoring program mandated under the cadmium rule, participating laboratories should maintain the following documentation for each analyte:
1. For each analytic instrument on which analyte determinations are made, records relating to the most recent calibration and QC sample analyses;
2. for these instruments, a tabulated record for each analyte of those determinations found to be within and outside of control limits over the past 2 years;
3. results for the previous 2 years of the QC sample analyses conducted under the internal QA/QC program (this information should be provided for each analyte for which determinations are made and for each analytic instrument used for this purpose, should be sufficient to demonstrate that internal QA/QC programs are being executed properly, and should be consistent with the data sent to responsible physicians);
4. duplicate copies of monitoring results for each analyte sent to clients during the previous 5 years, as well as associated information; supporting material such as chain-of-custody forms also should be retained; and,
5. proficiency test results and related materials received while participating in the CTQ interlaboratory program over the past 2 years; results also should be tabulated to provide a serial record of relative error (derived per Section 3.3.3 below).
3.3.3 Reporting Procedures
Participating laboratories should maintain these documents: QA/QC program plans; QA/QC status reports; CTQ proficiency program reports; and, analytical data reports. The information that should be included in these reports is summarized in Table 2; a copy of each report should be sent to the responsible physician.
TABLE 2
REPORTING PROCEDURES FOR LABORATORIES PARTICIPATING IN THE CADMIUM MEDICAL
MONITORING PROGRAM
Report Frequency (Time Frame) Contents
-------------------------------------------------------------------------------
1 QA/QC Once (init- A detailed description of the QA/QC
Program ially) protocol to be established by the
Plan laboratory to maintain control of
analyte determinations.
-------------------------------------------------------------------------------
2 QA/QC Every 2 Results of the QC samples incorporated
Status months into regular runs for each instrument
Report (over the period since the last
report).
-------------------------------------------------------------------------------
3 Proficiency Attached to Results from the last full year of
Report every proficiency samples submitted to the
data CTQ program.
report
Results of the 100 most recent QC samples
incorporated into regular runs for each
instrument.
-------------------------------------------------------------------------------
4 Analytical For all Date the sample was received.
Data Report reports
of data
results
Date the sample was analyzed.
Appropriate chain-of-custody
information.
Types of analyses performed.
Results of the requested analyses.
Copy of the most current proficiency
report.
-------------------------------------------------------------------------------
As noted in Section 3.3.1, a QA/QC program plan should be developed that documents internal QA/QC procedures (defined under Section 3.3.1) to be implemented by the participating laboratory for each analyte; this plan should provide a list identifying each instrument used in making analyte determinations.
A QA/QC status report should be written bimonthly for each analyte. In this report, the results of the QC program during the reporting period should be reported for each analyte in the following manner: The number (N) of QC samples analyzed during the period; a table of the target levels defined for each sample and the corresponding measured values; the mean F/T value (as defined below) for the set of QC samples run during the period; and, X ± 2S (as defined below) for the set of QC samples run during the period as a measure of precision.
As noted in Section 2, an F/T value for a QC sample is the ratio of the measured concentration of analyte to the established (i.e., reference) concentration of analyte for that QC sample. The mean of the F/T values, X, is derived as:

X = (1/N) Σ (F/T)i

The standard deviation, S, for these measurements is derived using the following equation (note that 2S is twice this value):

S = [Σ ((F/T)i - X)^2 / (N - 1)]^(1/2)
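The period summary described above (the mean F/T value and the X ± 2S precision measure) reduces to a few lines of arithmetic; the function name is illustrative:

```python
import math

def ft_summary(measured, targets):
    # F/T for each QC sample: measured value over the established
    # (reference) target value.
    ft = [m / t for m, t in zip(measured, targets)]
    n = len(ft)
    mean = sum(ft) / n                      # X, the mean F/T value
    # Sample standard deviation of the F/T values.
    s = math.sqrt(sum((x - mean) ** 2 for x in ft) / (n - 1))
    return mean, s                          # report precision as X +/- 2S

mean, s = ft_summary([9.8, 10.4, 10.1], [10.0, 10.0, 10.0])
print(round(mean, 2), round(2 * s, 2))  # 1.01 0.06
```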
The nonmandatory QA/QC protocol (see Attachment 3) indicates that QC samples should be divided into several discrete pools, and a separate estimate of precision should then be derived for each pool. Several precision estimates should be provided for concentrations that differ in average value. These precision measures may be used to document improvements in performance with regard to the combined pool.
Participating laboratories should use the CTQ proficiency program for each analyte. Results of this program will be sent by CTQ directly to physicians designated by the participating laboratories. Proficiency results from the CTQ program are used to establish the accuracy of results from each participating laboratory, and should be provided to responsible physicians for use in trend analysis. A proficiency report consisting of these proficiency results should accompany data reports as an attachment.
For each analyte, the proficiency report should include the results from the 6 previous proficiency rounds in the following format:
1. Number (N) of samples analyzed;
2. mean of the target levels, (1/N) Σ Ti, with Ti being a consensus mean for sample i;
3. mean of the measurements, (1/N) Σ Mi, with Mi being the measurement for sample i;
4. a measure of error defined by:
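A short routine can assemble items 1-3 for a reporting round; because the defining equation for item 4 is not reproduced in the text, the relative-error form used below (difference of means over the mean target) is an assumed common definition, not necessarily the rule's:

```python
def proficiency_summary(measurements, targets):
    # Items 1-3: N, mean target level, mean measurement.
    n = len(measurements)
    mean_t = sum(targets) / n
    mean_m = sum(measurements) / n
    # Item 4 (ASSUMED form): relative error of the mean measurement
    # against the mean target, as a stand-in for the rule's measure.
    rel_error = (mean_m - mean_t) / mean_t
    return {"N": n, "mean_target": mean_t,
            "mean_measured": mean_m, "rel_error": rel_error}
```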
Analytical data reports should be submitted to responsible physicians directly. For each sample, report the following information: The date the sample was received; the date the sample was analyzed; appropriate chain-of-custody information; the type(s) of analyses performed; and, the results of the analyses. This information should be reported on an appropriate form. The most recent proficiency program report should accompany the analytical data reports (as an attachment).
Confidence intervals for the analytical results should be reported as X ± 2S, with X being the measured value and S the standard deviation calculated as described above.
For CDU or B2MU results, which are combined with CRTU measurements for proper reporting, the 95% confidence limits are derived from the limits for CDU or B2MU, p, and the limits for CRTU, q. Assuming independent errors, standard propagation of error for a quotient gives:

(X/Y) ± (X/Y)[(p/X)^2 + (q/Y)^2]^(1/2)

For these calculations, X ± p is the measurement and confidence limits for CDU or B2MU, and Y ± q is the measurement and confidence limits for CRTU.
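A minimal sketch of this combination, assuming independent errors and the standard quotient propagation formula (relative half-widths added in quadrature); the illustrative values are not from the source:

```python
import math

def ratio_with_limits(x, p, y, q):
    # X +/- p (e.g., CDU in ug/l) divided by Y +/- q (e.g., CRTU in g/l),
    # with the half-width propagated in quadrature (assumed formula).
    z = x / y
    half_width = z * math.sqrt((p / x) ** 2 + (q / y) ** 2)
    return z, half_width

# Illustrative values only: CDU 4.0 +/- 0.4 ug/l, CRTU 2.0 +/- 0.1 g/l.
z, hw = ratio_with_limits(4.0, 0.4, 2.0, 0.1)
print(round(z, 2), round(hw, 3))  # 2.0 0.224
```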
Participating laboratories should notify responsible physicians as soon as they receive information indicating a change in their accreditation status with the CTQ or the CAP. These physicians should not be expected to wait until formal notice of a status change has been received from the CTQ or the CAP.
3.4 Instructions to Physicians
Physicians responsible for the medical monitoring of cadmium-exposed workers must collect the biological samples from workers; they then should select laboratories to perform the required analyses, and should interpret the analytic results.
3.4.1 Sample Collection and Holding Procedures
Blood Samples. The following procedures are recommended for the collection, shipment and storage of blood samples for CDB analysis to reduce analytical variability; these recommendations were obtained primarily through personal communications with J.P. Weber of the CTQ (1991), and from reports by the Centers for Disease Control (CDC, 1986) and Stoeppler and Brandt (1980).
To the extent possible, blood samples should be collected from workers at the same time of day. Workers should shower or thoroughly wash their hands and arms before blood samples are drawn. The following materials are needed for blood sample collection: Alcohol wipes; sterile gauze sponges; band-aids; 20-gauge, 1.5-in. stainless steel needles (sterile); preprinted labels; tourniquets; vacutainer holders; 3-ml "metal free" vacutainer tubes (i.e., dark-blue caps), with EDTA as an anticoagulant; and, styrofoam vacutainer shipping containers.
Whole blood samples are taken by venipuncture. Each blue-capped tube should be labeled or coded for the worker and company before the sample is drawn. (Blue-capped tubes are recommended instead of red-capped tubes because the latter may contain a red coloring pigment containing cadmium, which could contaminate the samples.) Immediately after sampling, the vacutainer tubes must be thoroughly mixed by inverting the tubes at least 10 times manually or mechanically using a Vortex device (for 15 sec). Samples should be refrigerated immediately or stored on ice until they can be packed for shipment to the participating laboratory for analysis.
The CDC recommends that blood samples be shipped with a "cool pak" to keep the samples cold during shipment. However, the CTQ routinely ships and receives blood samples for cadmium analysis that have not been kept cool during shipment. The CTQ has found no deterioration of cadmium in biological fluids that were shipped via parcel post without a cooling agent, even though these deliveries often take 2 weeks to reach their destination.
Urine Samples. The following are recommended procedures for the collection, shipment and storage of urine for CDU and B2MU analyses, and were obtained primarily through personal communications with J.P. Weber of the CTQ (1991), and from reports by the CDC (1986) and Stoeppler and Brandt (1980).
Single "spot" samples are recommended. As B2M can degrade in the bladder, workers should first empty their bladder and then drink a large glass of water at the start of the visit. Urine samples then should be collected within 1 hour. Separate samples should be collected for CDU and B2MU using the following materials: Sterile urine collection cups (250 ml); small sealable plastic bags; preprinted labels; 15-ml polypropylene or polyethylene screw-cap tubes; lab gloves ( "metal free"); and, preservatives (as indicated).
The sealed collection cup should be kept in the plastic bag until collection time. The workers should wash their hands with soap and water before receiving the collection cup. The collection cup should not be opened until just before voiding and the cup should be sealed immediately after filling. It is important that the inside of the container and cap are not touched by, or come into contact with, the body, clothing or other surfaces.
For CDU analyses, the cup is swirled gently to resuspend any solids, and the 15-ml tube is filled with 10-12 ml of urine. The CDC recommends the addition of 100 µl of concentrated HNO3 as a preservative before sealing the tube and then freezing the sample. The CTQ recommends minimal handling; it does not acidify its interlaboratory urine reference materials prior to shipment, nor does it freeze samples for shipment. At the CTQ, if a urine sample contains much sediment, the sample is acidified in the lab to free any cadmium in the precipitate.
For B2M, the urine sample should be collected directly into a polyethylene bottle previously washed with dilute nitric acid. The pH of the urine should be measured and adjusted to 8.0 with 0.1 N NaOH immediately following collection. Samples should be frozen and stored at -20 °C until testing is performed. The B2M in the samples should be stable for 2 days when stored at 2-8 °C, and for at least 2 months at -20 °C. Repeated freezing and thawing should be avoided to prevent denaturing the B2M (Pharmacia 1990).
3.4.2 Recommendations for Evaluating Laboratories
Using standard error data and the results of proficiency testing obtained from CTQ, responsible physicians can make an informed choice of which laboratory to select to analyze biological samples. In general, laboratories with small standard errors and little disparity between target and measured values tend to make precise and accurate sample determinations. Estimates of precision provided to the physicians with each set of monitoring results can be compared to previously-reported proficiency and precision estimates. The latest precision estimates should be at least as small as the standard error reported previously by the laboratory. Moreover, there should be no indication that precision is deteriorating (i.e., increasing values for the precision estimates). If precision is deteriorating, physicians may decide to use another laboratory for these analyses. QA/QC information provided by the participating laboratories to physicians can, therefore, assist physicians in evaluating laboratory performance.
3.4.3 Use and Interpretation of Results
When the responsible physician has received the CDB, CDU and/or B2MU results, these results must be compared to the action levels discussed in the final rule for cadmium. The comparison of the sample results to action levels is straightforward. The measured value reported from the laboratory can be compared directly to the action levels; if the reported value exceeds an action level, the required actions must be initiated.
4.0 BACKGROUND
Cadmium is a naturally-occurring environmental contaminant to which humans are continually exposed in food, water, and air. The average daily intake of cadmium by the U.S. population is estimated to be 10-20 µg/day. Most of this intake is via ingestion, for which absorption is estimated at 4-7% (Kowal et al. 1979). An additional nonoccupational source of cadmium is smoking tobacco; smoking a pack of cigarettes a day adds an additional 2-4 µg of cadmium to the daily intake, assuming absorption via inhalation of 25-35% (Nordberg and Nordberg 1988; Friberg and Elinder 1988; Travis and Haddock 1980).
Exposure to cadmium fumes and dusts in an occupational setting where air concentrations are 20-50 µg/m³ results in an additional daily intake of several hundred micrograms (Friberg and Elinder 1988, p. 563). In such a setting, occupational exposure to cadmium occurs primarily via inhalation, although additional exposure may occur through the ingestion of material via contaminated hands if workers eat or smoke without first washing. Some of the particles that are inhaled initially may be ingested when the material is deposited in the upper respiratory tract, where it may be cleared by mucociliary transport and subsequently swallowed.
Cadmium introduced into the body through inhalation or ingestion is transported by the albumin fraction of the blood plasma to the liver, where it accumulates and is stored principally as a bound form complexed with the protein metallothionein. Metallothionein-bound cadmium is the main form of cadmium subsequently transported to the kidney; it is these 2 organs, the liver and kidney, in which the majority of the cadmium body burden accumulates. As much as one half of the total body burden of cadmium may be found in the kidneys (Nordberg and Nordberg 1988).
Once cadmium has entered the body, elimination is slow; about 0.02% of the body burden is excreted per day via urinary/fecal elimination. The whole-body half-life of cadmium is 10-35 years, decreasing slightly with increasing age (Travis and Haddock 1980).
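As an illustrative consistency check on these figures (arithmetic only, not from the source): a constant loss of 0.02% of the body burden per day corresponds, under first-order kinetics, to a half-life of roughly 9.5 years, in line with the low end of the 10-35 year range.

```python
import math

# First-order elimination: half-life = ln(2) / rate.
rate_per_day = 0.0002          # 0.02% of body burden per day
half_life_days = math.log(2) / rate_per_day
print(round(half_life_days / 365.25, 1))  # about 9.5 years
```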
The continual accumulation of cadmium is the basis for its chronic noncarcinogenic toxicity. This accumulation makes the kidney the target organ in which cadmium toxicity usually is first observed (Piscator 1964). Renal damage may occur when cadmium levels in the kidney cortex approach 200 µg/g wet tissue weight (Travis and Haddock 1980).
The kinetics and internal distribution of cadmium in the body are complex, and depend on whether occupational exposure to cadmium is ongoing or has terminated. In general, cadmium in blood is related principally to recent cadmium exposure, while cadmium in urine reflects cumulative exposure (i.e., total body burden)(Lauwerys et al. 1976; Friberg and Elinder 1988).
4.1 Health Effects
Studies of workers in a variety of industries indicate that chronic exposure to cadmium may be linked to several adverse health effects including kidney dysfunction, reduced pulmonary function, chronic lung disease and cancer (Federal Register 1990). The primary sites for cadmium-associated cancer appear to be the lung and the prostate.
Cancer. Evidence for an association between cancer and cadmium exposure comes from both epidemiological studies and animal experiments. Pott (1965) found a statistically significant elevation in the incidence of prostate cancer among a cohort of cadmium workers. Other epidemiology studies also report an elevated incidence of prostate cancer; however, the increases observed in these other studies were not statistically significant (Meridian Research, Inc. 1989).
One study (Thun et al. 1985) contains sufficiently quantitative estimates of cadmium exposure to allow evaluation of dose-response relationships between cadmium exposure and lung cancer. A statistically significant excess of lung cancer attributed to cadmium exposure was found in this study, even after accounting for confounding variables such as coexposure to arsenic and smoking habits (Meridian Research, Inc. 1989).
Evidence for quantifying a link between lung cancer and cadmium exposure comes from a single study (Takenaka et al. 1983). In this study, dose-response relationships developed from animal data were extrapolated to humans using a variety of models. OSHA chose the multistage risk model for estimating the risk of cancer for humans using these animal data. Animal injection studies also suggest an association between cadmium exposure and cancer, particularly observations of an increased incidence of tumors at sites remote from the point of injection. The International Agency for Research on Cancer (IARC) (Supplement 7, 1987) indicates that this, and related, evidence is sufficient to classify cadmium as an animal carcinogen. However, the results of these injection studies cannot be used to quantify risks attendant to human occupational exposures due to differences in routes of exposure (Meridian Research, Inc. 1989).
Based on the above-cited studies, the U.S. Environmental Protection Agency (EPA) classifies cadmium as "B1," a probable human carcinogen (USEPA 1985). IARC in 1987 recommended that cadmium be listed as a probable human carcinogen.
Kidney Dysfunction. The most prevalent nonmalignant effect observed among workers chronically exposed to cadmium is kidney dysfunction. Initially, such dysfunction is manifested by proteinuria (Meridian Research, Inc. 1989; Roth Associates, Inc. 1989). Proteinuria associated with cadmium exposure is most commonly characterized by excretion of low-molecular weight proteins (15,000-40,000 MW), accompanied by loss of electrolytes, uric acid, calcium, amino acids, and phosphate. Proteins commonly excreted include β-2-microglobulin (B2M), retinol-binding protein (RBP), immunoglobulin light chains, and lysozyme. Excretion of low molecular weight proteins is characteristic of damage to the proximal tubules of the kidney (Iwao et al. 1980).
Exposure to cadmium also may lead to urinary excretion of high-molecular weight proteins such as albumin, immunoglobulin G, and glycoproteins (Meridian Research, Inc. 1989; Roth Associates, Inc. 1989). Excretion of high-molecular weight proteins is indicative of damage to the glomeruli of the kidney. Bernard et al. (1979) suggest that cadmium-associated damage to the glomeruli and damage to the proximal tubules of the kidney develop independently of each other, but may occur in the same individual.
Several studies indicate that the onset of low-molecular weight proteinuria is a sign of irreversible kidney damage (Friberg et al. 1974; Roels et al. 1982; Piscator 1984; Elinder et al. 1985; Smith et al. 1986). For many workers, once sufficiently elevated levels of B2M are observed in association with cadmium exposure, such levels do not appear to return to normal even when cadmium exposure is eliminated by removal of the worker from the cadmium-contaminated work environment (Friberg, exhibit 29, 1990).
Some studies indicate that cadmium-induced proteinuria may be progressive; levels of B2MU increase even after cadmium exposure has ceased (Elinder et al. 1985). Other researchers have reached similar conclusions (Friberg testimony, OSHA docket exhibit 29; Elinder testimony, OSHA docket exhibit 55; and OSHA docket exhibits 8-86B). Such observations are not universal, however (Smith et al. 1986; Tsuchiya 1976). Studies in which proteinuria has not been observed may have initiated the reassessment too early (Meridian Research, Inc. 1989; Roth Associates, Inc. 1989; Roels 1989).
A quantitative assessment of the risks of developing kidney dysfunction as a result of cadmium exposure was performed using the data from Ellis et al. (1984) and Falck et al. (1983). Meridian Research, Inc. (1989) and Roth Associates, Inc. (1989) employed several mathematical models to evaluate the data from the 2 studies, and the results indicate that cumulative cadmium exposure levels between 5 and 100 µg-years/m³ correspond with a one-in-a-thousand probability of developing kidney dysfunction.
When cadmium exposure continues past the onset of early kidney damage (manifested as proteinuria), chronic nephrotoxicity may occur (Meridian Research, Inc. 1989; Roth Associates, Inc. 1989). Uremia, which is the loss of the glomerulus' ability to adequately filter blood, may result. This condition leads to severe disturbance of electrolyte concentrations, which may result in various clinical complications including atherosclerosis, hypertension, pericarditis, anemia, hemorrhagic tendencies, deficient cellular immunity, bone changes, and other problems. Progression of the disease may require dialysis or a kidney transplant.
Studies in which animals are chronically exposed to cadmium confirm the renal effects observed in humans (Friberg et al. 1986). Animal studies also confirm cadmium-related problems with calcium metabolism and associated skeletal effects, which also have been observed among humans. Other effects commonly reported in chronic animal studies include anemia, changes in liver morphology, immunosuppression and hypertension. Some of these effects may be associated with cofactors; hypertension, for example, appears to be associated with diet, as well as with cadmium exposure. Animals injected with cadmium also have shown testicular necrosis.
4.2 Objectives for Medical Monitoring
In keeping with the observation that renal disease tends to be the earliest clinical manifestation of cadmium toxicity, the final cadmium standard mandates that eligible workers must be medically monitored to prevent this condition (as well as cadmium-induced cancer). The objectives of medical-monitoring, therefore, are to: Identify workers at significant risk of adverse health effects from excess, chronic exposure to cadmium; prevent future cases of cadmium-induced disease; detect and minimize existing cadmium-induced disease; and, identify workers most in need of medical intervention.
The overall goal of the medical monitoring program is to protect workers who may be exposed continuously to cadmium over a 45-year occupational lifespan. Consistent with this goal, the medical monitoring program should assure that:
1. Current exposure levels remain sufficiently low to prevent the accumulation of cadmium body burdens sufficient to cause disease in the future by monitoring CDB as an indicator of recent cadmium exposure;
2. cumulative body burdens, especially among workers with undefined historical exposures, remain below levels potentially capable of leading to damage and disease by assessing CDU as an indicator of cumulative exposure to cadmium; and,
3. health effects are not occurring among exposed workers by determining B2MU as an early indicator of the onset of cadmium- induced kidney disease.
4.3 Indicators of Cadmium Exposure and Disease
Cadmium is present in whole blood bound to albumin, in erythrocytes, and as a metallothionein-cadmium complex. The metallothionein-cadmium complex represents the primary transport mechanism for cadmium delivery to the kidney. CDB concentrations in the general, nonexposed population average 1 µg Cd/l whole blood, with smokers exhibiting higher levels (see Section 5.1.6). Data presented in Section 5.1.6 show that 95% of the general population not occupationally exposed to cadmium have CDB levels less than 5 µg Cd/l.
If total body burdens of cadmium remain low, CDB concentrations indicate recent exposure (i.e., daily intake). This conclusion is based on data showing that cigarette smokers exhibit CDB concentrations of 2-7 µg/l depending on the number of cigarettes smoked per day (Nordberg and Nordberg 1988), while CDB levels for those who quit smoking return to general population values (approximately 1 µg/l) within several weeks (Lauwerys et al. 1976). Based on these observations, Lauwerys et al. (1976) concluded that CDB has a biological half-life of a few weeks to less than 3 months. As indicated in Section 5.1.6, the upper 95th percentile for CDB levels observed among those who are not occupationally exposed to cadmium is 5 µg/l, which suggests that the absolute upper limit to the range reported for smokers by Nordberg and Nordberg may have been affected by an extreme value (i.e., beyond 2S above the mean).
Among occupationally-exposed workers, the occupational history of exposure to cadmium must be evaluated to interpret CDB levels. New workers, or workers with low exposures to cadmium, exhibit CDB levels that are representative of recent exposures, similar to the general population. However, for workers with a history of chronic exposure to cadmium, who have accumulated significant stores of cadmium in the kidneys/liver, part of the CDB concentrations appear to indicate body burden. If such workers are removed from cadmium exposure, their CDB levels remain elevated, possibly for years, reflecting prior long-term accumulation of cadmium in body tissues. This condition tends to occur, however, only beyond some threshold exposure value, and possibly indicates the capacity of body tissues to accumulate cadmium which cannot be excreted readily (Friberg and Elinder 1988; Nordberg and Nordberg 1988).
CDU is widely used as an indicator of cadmium body burdens (Nordberg and Nordberg 1988). CDU is the major route of elimination and, when CDU is measured, it is commonly expressed either as µg Cd/l urine (unadjusted), µg Cd/l urine (adjusted for specific gravity), or µg Cd/g CRTU (see Section 5.2.1). The metabolic model for CDU is less complicated than that for CDB, since CDU is dependent in large part on the body (i.e., kidney) burden of cadmium. However, a small proportion of CDU may still be attributed to recent cadmium exposure, particularly if exposure to high airborne concentrations of cadmium occurred. Note that CDU is subject to larger interindividual and day-to-day variations than CDB, so repeated measurements are recommended for CDU evaluations.
CDU is bound principally to metallothionein, regardless of whether the cadmium originates from metallothionein in plasma or from the cadmium pool accumulated in the renal tubules. Therefore, measurement of metallothionein in urine may provide information similar to CDU, while avoiding the contamination problems that may occur during collection and handling urine for cadmium analysis (Nordberg and Nordberg 1988). However, a commercial method for the determination of metallothionein at the sensitivity levels required under the final cadmium rule is not currently available; therefore, analysis of CDU is recommended.
Among the general population not occupationally exposed to cadmium, CDU levels average less than 1 μg/l (see Section 5.2.7). Normalized for creatinine (CRTU), the average CDU concentration of the general population is less than 1 μg/g CRTU. As cadmium accumulates over the lifespan, CDU increases with age. Also, because cigarette smokers may eventually accumulate twice the cadmium body burden of nonsmokers, CDU is slightly higher in smokers than in nonsmokers, even several years after smoking cessation (Nordberg and Nordberg 1988). Despite variations due to age and smoking habits, 95% of those not occupationally exposed to cadmium exhibit levels of CDU less than 3 μg/g CRTU (based on the data presented in Section 5.2.7).
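The creatinine normalization described above is a simple ratio of two measured quantities. A minimal sketch, using illustrative values that are not taken from the protocol:

```python
def cdu_per_g_crtu(cdu_ug_per_l, crtu_g_per_l):
    """Normalize a urinary cadmium result (ug Cd/l urine)
    to grams of creatinine (ug Cd/g CRTU)."""
    return cdu_ug_per_l / crtu_g_per_l

# Illustrative values only: 1.2 ug Cd/l urine, 1.5 g creatinine/l urine
result = cdu_per_g_crtu(1.2, 1.5)  # 0.8 ug Cd/g CRTU
```

Normalizing to creatinine compensates for variations in urine dilution between spot samples, which is why the protocol expresses CDU action levels in μg/g CRTU.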
About 0.02% of the cadmium body burden is excreted daily in urine. When the critical cadmium concentration (about 200 ppm) in the kidney is reached, or if there is sufficient cadmium-induced kidney dysfunction, dramatic increases in CDU are observed (Nordberg and Nordberg 1988). Above 200 ppm, therefore, CDU concentrations cease to be an indicator of cadmium body burden, and are instead an index of kidney failure.
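The 0.02%-per-day figure permits a rough back-of-the-envelope estimate of daily urinary excretion. The body-burden value in the sketch below is hypothetical, chosen only to illustrate the arithmetic:

```python
def daily_cdu_excretion_ug(body_burden_mg, fraction_per_day=0.0002):
    """Estimate daily urinary cadmium excretion (ug/day), assuming
    about 0.02% of the body burden is excreted daily (per the text)."""
    return body_burden_mg * 1000 * fraction_per_day  # mg -> ug, then fraction

# Hypothetical body burden of 15 mg cadmium -> about 3 ug excreted per day
estimate = daily_cdu_excretion_ug(15)
```

This proportionality holds only below the critical kidney concentration; once tubular dysfunction develops, CDU rises sharply and no longer tracks body burden.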
Proteinuria is an index of kidney dysfunction, and is defined by OSHA to be a material impairment. Several small proteins may be monitored as markers for proteinuria. Below levels indicative of proteinuria, these small proteins may be early indicators of increased risk of cadmium-induced renal tubular disease. Analytes useful for monitoring cadmium-induced renal tubular damage include:
1. β-2-Microglobulin (B2M), currently the most widely used assay for detecting kidney dysfunction, is the best characterized analyte available (Iwao et al. 1980; Chia et al. 1989);
2. Retinol Binding Protein (RBP) is more stable than B2M in acidic urine (i.e., B2M breakdown occurs if urinary pH is less than 5.5; such breakdown may result in false [i.e., low] B2M values [Bernard and Lauwerys, 1990]);
3. N-Acetyl-B-Glucosaminidase (NAG) is the analyte of an assay that is simple, inexpensive, reliable, and correlates with cadmium levels under 10 μg/g CRTU, but the assay is less sensitive than RBP or B2M (Kawada et al. 1989);
4. Metallothionein (MT) correlates with cadmium and B2M levels, and may be a better predictor of cadmium exposure than CDU and B2M (Kawada et al. 1989);
5. Tamm-Horsfall Glycoprotein (THG) increases slightly with elevated cadmium levels, but this elevation is small compared to increases in urinary albumin, RBP, or B2M (Bernard and Lauwerys 1990);
6. Albumin (ALB), determined by the biuret method, is not sufficiently sensitive to serve as an early indicator of the onset of renal disease (Piscator 1962);
7. Albumin (ALB), determined by the Amido Black method, is sensitive and reproducible, but involves a time-consuming procedure (Piscator 1962);
8. Glycosaminoglycan (GAG) increases among cadmium workers, but the significance of this effect is unknown because no relationship has been found between elevated GAG and other indices of tubular damage (Bernard and Lauwerys 1990);
9. Trehalase seems to increase earlier than B2M during cadmium exposure, but the procedure for analysis is complicated and unreliable (Iwata et al. 1988); and,
10. Kallikrein is observed at lower concentrations among cadmium-exposed workers than among normal controls (Roels et al. 1990).
Of the above analytes, B2M appears to be the most widely used and best characterized analyte to evaluate the presence/absence, as well as the extent of, cadmium-induced renal tubular damage (Kawada, Koyama, and Suzuki 1989; Shaikh and Smith 1984; Nogawa 1984). However, it is important that samples be collected and handled so as to minimize B2M degradation under acidic urine conditions.
The threshold value of B2MU commonly used to indicate the presence of kidney damage is 300 μg/g CRTU (Kjellstrom et al. 1977a; Buchet et al. 1980; Kowal and Zirkes 1983). This value represents the upper 95th or 97.5th percentile level of urinary excretion observed among those without tubular dysfunction (Elinder, exbt L-140-45, OSHA docket H057A). In agreement with these conclusions, the data presented in Section 5.3.7 of this protocol generally indicate that the level of 300 μg/g CRTU appears to define the boundary for kidney dysfunction. It is not clear, however, that this level represents the upper 95th percentile of values observed among those who fail to demonstrate proteinuria effects.
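A percentile-based cutoff of the kind described above can be computed directly from reference data. A minimal sketch using the nearest-rank convention (the protocol does not specify a particular percentile estimator, and the choice of estimator affects the result for small samples):

```python
import math

def upper_percentile(values, pct=95):
    """Empirical upper percentile of a data set, nearest-rank convention:
    the smallest value such that at least pct% of observations lie at
    or below it."""
    ordered = sorted(values)
    rank = math.ceil(pct / 100 * len(ordered))  # 1-based rank
    return ordered[rank - 1]
```

For example, applied to B2MU values from a reference population without tubular dysfunction, `upper_percentile(data, 95)` would yield the kind of 95th-percentile boundary cited in the text.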
Although elevated B2MU levels appear to be a fairly specific indicator of disease associated with cadmium exposure, other conditions that may lead to elevated B2MU levels include high fevers from influenza, extensive physical exercise, renal disease unrelated to cadmium exposure, lymphomas, and AIDS (Iwao et al. 1980; Schardun and van Epps 1987). Elevated B2M levels observed in association with high fevers from influenza or from extensive physical exercise are transient, and will return to normal levels once the fever has abated or metabolic rates return to baseline values following exercise. The other conditions linked to elevated B2M levels can be diagnosed as part of a properly-designed medical examination. Consequently, monitoring B2M, when accompanied by regular medical examinations and CDB and CDU determinations (as indicators of present and past cadmium exposure), may serve as a specific, early indicator of cadmium-induced kidney damage.
4.4 Criteria for Medical Monitoring of Cadmium Workers
Medical monitoring mandated by the final cadmium rule includes a combination of regular medical examinations and periodic monitoring of 3 analytes: CDB, CDU and B2MU. As indicated above, CDB is monitored as an indicator of current cadmium exposure, while CDU serves as an indicator of the cadmium body burden; B2MU is assessed as an early marker of irreversible kidney damage and disease.
The final cadmium rule defines a series of action levels that have been developed for each of the 3 analytes to be monitored. These action levels serve to guide the responsible physician through a decision-making process. For each action level that is exceeded, a specific response is mandated. The sequence of action levels, and the attendant actions, are described in detail in the final cadmium rule.
Other criteria used in the medical decision-making process relate to tests performed during the medical examination (including a determination of the ability of a worker to wear a respirator). These criteria, however, are not affected by the results of the analyte determinations addressed in the above paragraphs and, consequently, will not be considered further in these guidelines.
4.5 Defining the Quality and Proficiency of the Analyte Determinations
As noted above in Sections 2 and 3, the quality of a measurement should be defined along with its value to properly interpret the results. Generally, it is necessary to know the accuracy and the precision of a measurement before it can be properly evaluated. The precision of the data from a specific laboratory indicates the extent to which the repeated measurements of the same sample vary within that laboratory. The accuracy of the data provides an indication of the extent to which these results deviate from average results determined from many laboratories performing the same measurement (i.e., in the absence of an independent determination of the true value of a measurement). Note that terms are defined operationally relative to the manner in which they will be used in this protocol. Formal definitions for the terms in italics used in this section can be found in the list of definitions (Section 2).
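These operational definitions reduce to simple statistics. A minimal sketch, assuming precision is reported as a coefficient of variation of within-laboratory replicates and accuracy as percent deviation from an interlaboratory consensus mean (the protocol's formal definitions appear in Section 2):

```python
from statistics import mean, stdev

def precision_cv_pct(replicates):
    """Within-lab precision: coefficient of variation (%) of
    repeated measurements of the same sample."""
    return 100 * stdev(replicates) / mean(replicates)

def accuracy_bias_pct(lab_mean, consensus_mean):
    """Accuracy: percent deviation of the laboratory mean from the
    consensus mean determined by many laboratories."""
    return 100 * (lab_mean - consensus_mean) / consensus_mean
```

For instance, replicates of 9.0, 10.0 and 11.0 μg/l give a 10% coefficient of variation, and a laboratory mean of 10.5 μg/l against a 10.0 μg/l consensus value gives a +5% bias.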
Another data quality criterion required to properly evaluate measurement results is the limit of detection of that measurement. For measurements to be useful, the range of the measurement which is of interest for biological monitoring purposes must lie entirely above the limit of detection defined for that measurement.
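One common convention for estimating a limit of detection, not mandated by this protocol but widely used in analytical chemistry, is the mean blank signal plus three standard deviations of the blanks:

```python
from statistics import mean, stdev

def limit_of_detection(blank_readings, k=3):
    """Estimate a limit of detection from replicate blank measurements.
    Convention (an assumption here, not from the protocol):
    LOD = mean(blanks) + k * stdev(blanks), with k = 3."""
    return mean(blank_readings) + k * stdev(blank_readings)
```

Under this convention, the biologically relevant measurement range (e.g., CDB levels near the action levels) must lie entirely above the computed LOD for the method to be acceptable.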
The overall quality of a laboratory's results is termed the performance of that laboratory. The degree to which a laboratory satisfies a minimum performance level is referred to as the proficiency of the laboratory. A successful medical monitoring program, therefore, should include procedures developed for monitoring and recording laboratory performance; these procedures can be used to identify the most proficient laboratories.
5.0 Overview of Medical Monitoring Tests for CDB, CDU, B2MU and CRTU
To evaluate whether available methods for assessing CDB, CDU, B2MU and CRTU are adequate for determining the parameters defined by the proposed action levels, it is necessary to review procedures available for sample collection, preparation and analysis. A variety of techniques for these purposes have been used historically for the determination of cadmium in biological matrices (including CDB and CDU), and for the determination of specific proteins in biological matrices (including B2MU). However, only the most recent techniques are capable of satisfying the required accuracy, precision and sensitivity (i.e., limit of detection) for monitoring at the levels mandated in the final cadmium rule, while still facilitating automated analysis and rapid processing.
5.1 Measuring Cadmium in Blood (CDB)
Analysis of biological samples for cadmium requires strict analytical discipline regarding collection and handling of samples. In addition to occupational settings, where cadmium contamination would be apparent, cadmium is a ubiquitous environmental contaminant, and much care should be exercised to ensure that samples are not contaminated during collection, preparation or analysis. Many common chemical reagents are contaminated with cadmium at concentrations that will interfere with cadmium analysis; because of the widespread use of cadmium compounds as colored pigments in plastics and coatings, the analyst should continually monitor each manufacturer's chemical reagents and collection containers to prevent contamination of samples.
Guarding against cadmium contamination of biological samples is particularly important when analyzing blood samples because cadmium concentrations in blood samples from nonexposed populations are generally less than 2 μg/l (2 ng/ml), while occupationally-exposed workers can be at medical risk of cadmium toxicity if blood concentrations exceed 5 μg/l (ACGIH 1991 and 1992). This narrow margin between exposed and unexposed samples requires that exceptional care be used in performing analytic determinations for biological monitoring for occupational cadmium exposure.
Methods for quantifying cadmium in blood have improved over the last 40 years primarily because of improvements in analytical instrumentation. Also, due to improvements in analytical techniques, there is less need to perform extensive multi-step sample preparations prior to analysis. Complex sample preparation was previously required to enhance method sensitivity (for cadmium), and to reduce interference by other metals or components of the sample.
5.1.1 Analytical Techniques Used to Monitor Cadmium in Biological Matrices
A number of analytical techniques have been used for determining cadmium concentrations in biological materials. A summary of the characteristics of the most widely employed techniques is presented in Table 3. The technique most suitable for medical monitoring for cadmium is atomic absorption spectroscopy (AAS).
To obtain a measurement using AAS, a light source (i.e., a hollow cathode or electrodeless discharge lamp) containing the element of interest as the cathode is energized, and the lamp emits a spectrum that is unique for that element. This light source is focused through a sample cell, and a selected wavelength is monitored by a monochromator and photodetector cell. Any ground state atoms in the sample that match those of the lamp element and are in the path of the emitted light may absorb some of the light and decrease the amount of light that reaches the photodetector cell. The amount of light absorbed at each characteristic wavelength is proportional to the number of ground state atoms of the corresponding element that are in the pathway of the light between the source and detector.
To determine the amount of a specific metallic element in a sample using AAS, the sample is dissolved in a solvent and aspirated into a high-temperature flame as an aerosol. At high temperatures, the solvent is rapidly evaporated or decomposed and the solute is initially solidified; the majority of the sample elements then are transformed into an atomic vapor. Next, a light beam is focused above the flame and the amount of metal in the sample can be determined by measuring the degree of absorbance of the atoms of the target element released by the flame at a characteristic wavelength.
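The absorbance measurement described above can be sketched numerically. The logarithmic form and the linear calibration assumed below are standard simplifications (the Beer-Lambert relationship), and the function names and values are illustrative, not part of the protocol:

```python
import math

def absorbance(incident_intensity, transmitted_intensity):
    """Absorbance A = log10(I0 / I): light removed from the beam by
    ground-state atoms of the target element in the flame or furnace."""
    return math.log10(incident_intensity / transmitted_intensity)

def concentration_from_calibration(a_sample, slope, intercept=0.0):
    """Invert a linear calibration A = slope * C + intercept, established
    beforehand with standards of known concentration (an assumption;
    real calibrations may be nonlinear at high absorbance)."""
    return (a_sample - intercept) / slope

# If 90% of the light is absorbed, A = log10(100/10) = 1.0
a = absorbance(100.0, 10.0)
```

In practice the instrument performs this conversion internally; the sketch only makes explicit why absorbance, rather than raw intensity, scales with the amount of analyte.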
A more refined atomic absorption technique, flameless AAS, substitutes an electrothermal, graphite furnace for the flame. An aliquot (10-100 m l) of the sample is pipetted into the cold furnace, which is then heated rapidly to generate an atomic vapor of the element.
AAS is a sensitive and specific method for the elemental analysis of metals; its main drawback is nonspecific background absorption and scattering of the light beam by particles of the sample as it decomposes at high temperatures; nonspecific absorbance reduces the sensitivity of the analytical method. The problem of nonspecific absorbance and scattering can be reduced by extensive sample pretreatment, such as ashing and/or acid digestion of the sample to reduce its organic content.
Current AAS instruments employ background correction devices to adjust electronically for background absorption and scattering. A common method to correct for background effects is to use a deuterium arc lamp as a second light source. A continuum light source, such as the deuterium lamp, emits a broad spectrum of wavelengths instead of specific wavelengths characteristic of a particular element, as with the hollow cathode tube. With this system, light from the primary source and the continuum source are passed alternately through the sample cell. The target element effectively absorbs light only from the primary source (which is much brighter than the continuum source at the characteristic wavelengths), while the background matrix absorbs and scatters light from both sources equally. Therefore, when the ratio of the two beams is measured electronically, the effect of nonspecific background absorption and scattering is eliminated. A less common, but more sophisticated, background correction system is based on the Zeeman effect, which uses a magnetically-activated light polarizer to compensate electronically for nonspecific absorption and scattering.
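Measuring the ratio of the two beams is equivalent to subtracting the continuum-source (background-only) absorbance from the primary-source absorbance. A minimal sketch of that arithmetic, with illustrative intensity values:

```python
import math

def corrected_absorbance(i0_primary, i_primary, i0_continuum, i_continuum):
    """Deuterium-arc background correction. The primary beam is attenuated
    by the target element plus the background matrix; the continuum beam
    sees only the background. Subtracting absorbances (i.e., taking the
    ratio of transmitted intensities) leaves the element-specific signal."""
    a_total = math.log10(i0_primary / i_primary)          # element + background
    a_background = math.log10(i0_continuum / i_continuum)  # background only
    return a_total - a_background
```

If the background alone halves the continuum beam while the primary beam is attenuated tenfold, the corrected absorbance is 1.0 - log10(2), isolating the cadmium signal from the matrix.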
Atomic emission spectroscopy with inductively-coupled argon plasma (AES-ICAP) is widely used to analyze for metals. With this instrument, the sample is aspirated into an extremely hot argon plasma flame, which excites the metal atoms; emission spectra specific for the sample element then are generated. The quanta of emitted light passing through a monochromator are amplified by photomultiplier tubes and measured by a photodetector to determine the amount of metal in the sample. An advantage of AES-ICAP over AAS is that multi-elemental analyses of a sample can be performed by simultaneously measuring specific elemental emission energies. However, AES-ICAP lacks the sensitivity of AAS, exhibiting a limit of detection which is higher than the limit of detection for graphite-furnace AAS (Table 3).
Neutron activation (NA) analysis and isotope dilution mass spectrometry (IDMS) are 2 additional, but highly specialized, methods that have been used for cadmium determinations. These methods are expensive because they require elaborate and sophisticated instrumentation.
NA analysis has the distinct advantage over other analytical methods of being able to determine cadmium body burdens in specific organs (e.g., liver, kidney) in vivo (Ellis et al. 1983). Neutron bombardment of the target transforms cadmium-113 to cadmium-114, which promptly decays (<10^-14 sec) to its ground state, emitting gamma rays that are measured using large gamma detectors; appropriate shielding and instrumentation are required when using this method.
IDMS analysis, a definitive but laborious method, is based on the change in the ratio of 2 isotopes of cadmium (cadmium-111 and cadmium-112) that occurs when a known amount of the element, with an artificially altered ratio of the same isotopes (i.e., a cadmium-111 "spike"), is added to a weighed aliquot of the sample (Michiels and De Bievre 1986). (continued)
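The IDMS determination reduces to a two-isotope mass balance: the measured ratio of the spiked blend lies between the natural sample ratio and the spike ratio, and solving the mixing equation recovers the sample contribution. The sketch below is a simplified form (it ignores mass-bias and blank corrections that a real IDMS measurement would apply), with hypothetical ratio values:

```python
def idms_moles_reference_isotope(n_spike_ref, r_sample, r_spike, r_blend):
    """Simplified two-isotope IDMS mass balance for cadmium.
    Ratios are 111Cd/112Cd; n_* counts moles of the reference
    isotope (112Cd). The blend ratio satisfies
        R_blend = (R_sample*n_sample + R_spike*n_spike) / (n_sample + n_spike),
    which is solved here for n_sample."""
    return n_spike_ref * (r_spike - r_blend) / (r_blend - r_sample)

# Hypothetical ratios: sample R = 0.5, spike R = 5.0; adding 1 unit of
# spike (112Cd basis) shifts the measured blend ratio to 2.0, implying
# the sample contributed 2 units of 112Cd.
n_sample = idms_moles_reference_isotope(1.0, 0.5, 5.0, 2.0)
```

Because only isotope ratios (not absolute intensities) enter the calculation, IDMS is largely immune to partial analyte loss after spike equilibration, which is why it is regarded as a definitive method despite its cost.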