This section includes a selected list of conditions that seem especially important in drug detection, overdose, or poisoning. Treatment of drug overdose by dialysis or other means can often be assisted with the objective information derived from drug levels. In some cases, drug screening of urine and serum may reveal additional drugs or substances, such as alcohol, which affect management or clinical response.

Lead. Lead exposure in adults is most often due to occupational hazard (e.g., exposure to lead in manufacture or use of gasoline additives and in smelting) or to homemade “moonshine” whiskey distilled in lead-containing equipment. When children are severely affected, it is usually from eating old lead-containing paint chips. One group found some indications of chronic lead exposure in about one half of those persons examined who had lived for more than 5 years near a busy automobile expressway in a major city. Fertilization of crops with city sewage sludge is reported to increase the lead content of the crops. Several studies report that parental cigarette smoking is a risk factor for increased blood lead values in children. Living in houses built before 1960 is another risk factor because lead-based paint was used before it was banned. Renovating these houses may spread fragments or powder from the lead-containing paint. Living near factories manufacturing lead batteries is another risk factor.

Symptoms. Acute lead poisoning is uncommon. Symptoms may include “lead colic” (crampy abdominal pain, constipation, occasional bloody diarrhea) and, in 50% of patients, hypertensive encephalopathy. Chronic poisoning is more common. Its varying symptoms may include lead colic, constipation with anorexia (85% of patients), and peripheral neuritis (wrist drop) in adults and lead encephalopathy (headache, convulsions) in children. A “lead line” is frequently present just below the epiphyses (in approximately 70% of patients with clinical symptoms and 20%-40% of persons with abnormal exposure but no symptoms).

Hematologic findings. Most patients develop slight to moderate anemia, usually hypochromic but sometimes normochromic. Basophilic stippling of RBCs is the most characteristic peripheral blood finding. Some authors claim stippling is invariably present; others report that stippling is present in only 20%-30% of cases. Normal persons may have as many as 500 stippled cells/1 million RBCs. The reticulocyte count is usually greater than 4%.

δ-Aminolevulinic acid dehydrase. Body intake of lead produces biochemical effects on heme synthesis (see Fig. 34-1). The level of δ-aminolevulinic acid dehydrase (ALA-D), which converts ALA to porphobilinogen, is decreased as early as the fourth day after exposure begins. Once the ALA-D level is reduced, persistence of abnormality correlates with the amount of lead in body tissues (body burden), so that the ALA-D level remains reduced as long as significant quantities of lead remain. Therefore, after chronic lead exposure, low ALA-D values may persist for years even though exposure has ceased. The level of ALA-D is also a very sensitive indicator of lead toxicity and is usually reduced to 50% or less of normal activity when blood lead values are in the 30-50 µg/100 ml (1.4-2.4 µmol/L) range. Unfortunately, the ALA-D level reaches a plateau when marked reduction takes place, so it cannot be used to quantitate degree of exposure. In addition, this enzyme must be assayed within 24 hours after the blood specimen is secured. Relatively few laboratories perform the test, although it has only a moderate degree of technical difficulty.

Blood lead assay. Intake of lead ordinarily results in rapid urinary lead excretion. If excessive lead exposure continues, lead is stored in bone. If bone storage capacity is exceeded, lead accumulates in soft tissues. Blood lead levels depend on the relationship between intake, storage, and excretion. The blood lead level is primarily an indication of acute (current) exposure but is also influenced by previous storage. According to 1991 Centers for Disease Control (CDC) guidelines, whole blood lead values over 10 µg/100 ml (0.48 µmol/L) are considered abnormal in children less than 6 years old. Values higher than 25 µg/100 ml (1.21 µmol/L) are considered abnormal in children over age 6 years and in adolescents. Values more than 40 µg/100 ml (1.93 µmol/L) are generally considered abnormal in adults, although the cutoff point for children may also be valid for adults. Symptoms of lead poisoning are associated with levels higher than 80 µg/100 ml (3.86 µmol/L), although mild symptoms may occur at 50 µg/100 ml (2.41 µmol/L) in children. Blood lead assay takes considerable experience and dedication to perform accurately. Contamination is a major headache—in drawing the specimen, in sample tubes, in laboratory glassware, and in the assay procedure itself. Special Vacutainer-type tubes for trace metal determination are commercially available and are strongly recommended.
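The age-dependent cutoffs and the µg/100 ml to µmol/L conversion quoted above can be sketched as follows. This is purely illustrative (the function names and the age-18 boundary for "adolescents" are assumptions, not from the guidelines), and of course not a substitute for clinical judgment.

```python
# Illustrative sketch of the 1991 CDC blood lead cutoffs described above.
# Function names and the adolescent/adult age boundary are assumptions.

PB_ATOMIC_WEIGHT = 207.2  # g/mol; used to convert ug/100 ml to umol/L


def ug_per_dl_to_umol_per_l(value_ug_dl):
    """Convert a whole blood lead value from ug/100 ml to umol/L."""
    # ug/100 ml -> ug/L (x10) -> umol/L (divide by atomic weight)
    return value_ug_dl * 10.0 / PB_ATOMIC_WEIGHT


def blood_lead_abnormal(value_ug_dl, age_years):
    """Apply the age-dependent cutoffs quoted in the text: 10 ug/100 ml
    under age 6, 25 ug/100 ml for older children and adolescents,
    40 ug/100 ml for adults."""
    if age_years < 6:
        cutoff = 10
    elif age_years < 18:  # assumed boundary for "adolescents"
        cutoff = 25
    else:
        cutoff = 40
    return value_ug_dl > cutoff


print(round(ug_per_dl_to_umol_per_l(10), 2))  # 0.48, matching the text
print(blood_lead_abnormal(15, age_years=4))   # True (over the pediatric cutoff)
print(blood_lead_abnormal(15, age_years=30))  # False (below the adult cutoff)
```

Note that the same 15 µg/100 ml value is abnormal in a young child but unremarkable in an adult, which is the point of the age-stratified guidelines.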

Urine δ-aminolevulinic acid (ALA) assay. Another procedure frequently used is urine ALA assay. Blood and urine ALA levels increase when the blood ALA-D level is considerably reduced. Therefore, ALA also becomes an indicator of body lead burden, and urine ALA begins to increase when blood lead values are higher than 40 µg/100 ml (1.93 µmol/L). Disadvantages of urine ALA assay are difficulties with 24-hour urine collection or, if random specimens are used, the effects of urine concentration or dilution on apparent ALA concentration. In addition, at least one investigator found that the urine ALA level was normal in a significant number of cases when the blood lead level was in the 40-80 µg/100 ml (1.93-3.86 µmol/L) (mildly to moderately abnormal) range. Light, room temperature, and alkaline pH all decrease ALA levels. If ALA determination is not done immediately, the specimen must be refrigerated and kept in the dark (the collection bottle wrapped in paper or foil) with the specimen acidified, using glacial acetic or tartaric acid.

Detecting lead exposure. If a patient is subjected to continuous lead exposure of sufficient magnitude, blood lead level, urine lead excretion, ALA-D level, and urine ALA level all correlate well. If the exposure ceases before laboratory tests are made, blood lead level (and sometimes even urine lead level) may decrease relative to ALA-D or urine ALA. Assay of ALA-D is the most sensitive of these tests. In fact, certain patients whose urine ALA and blood lead levels are within normal limits may display a mild to moderate decrease in ALA-D levels. It remains to be determined whether this reflects previous toxicity in all cases or simply means that ALA-D levels between 50% and 100% of normal are too easily produced to mean truly abnormal lead exposure.

Urine lead excretion has also been employed as an index of exposure, since blood lead values change more rapidly than urine lead excretion. However, excretion values depend on 24-hour urine specimens, with the usual difficulty in complete collection. A further problem is that excretion values may be normal in borderline cases or in cases of previous exposure. Urine lead has been measured after administration of a chelating agent such as ethylenediamine tetraacetic acid (EDTA), which mobilizes body stores of lead. This is a more satisfactory technique than ordinary urine excretion for determining body burden (i.e., previous exposure). Abnormal exposure is suggested when the 24-hour urine lead excretion is greater than 1 µg for each milligram of calcium-EDTA administered. Disadvantages are those of incomplete urine collection, difficulty in accurate lead measurement, and occasional cases of EDTA toxicity.

Erythrocyte protoporphyrin (zinc protoporphyrin, or ZPP) is still another indicator of lead exposure. Lead inhibits ferrochelatase (heme synthetase), an enzyme that incorporates iron into protoporphyrin IX (erythrocyte protoporphyrin) to form heme. Decreased conversion of protoporphyrin to heme leads to increased erythrocyte protoporphyrin levels. The standard assay for erythrocyte protoporphyrin involved extraction of a mixture of porphyrins, including protoporphyrin IX, from blood, and measurement of protoporphyrin using fluorescent wavelengths. In normal persons, protoporphyrin IX is not complexed to metal ions. Under the conditions of measurement it was thought that the protoporphyrin being measured was metal free, since iron-complexed protoporphyrin did not fluoresce, so that what was being measured was called “free erythrocyte protoporphyrin.” However, in lead poisoning, protoporphyrin IX becomes complexed to zinc; hence, the term ZPP. The protoporphyrin-zinc complex will fluoresce although the protoporphyrin-iron complex will not. Therefore, most laboratory analytic techniques for ZPP involve fluorescent methods. In fact, some have used visual RBC fluorescence (in a heparinized wet preparation using a microscope equipped with ultraviolet light) as a rapid screening test for lead poisoning. Zinc protoporphyrin levels are elevated in about 50%-75% of those who have a subclinical increase in blood lead levels (40-60 µg/100 ml) and are almost always elevated in symptomatic lead poisoning. However, the method is not sensitive enough for childhood lead screening (10 µg/100 ml or 0.48 µmol/L). An instrument called the “hematofluorometer” is available from several manufacturers and can analyze a single drop of whole blood for ZPP. The reading is affected by the number of RBCs present and must be corrected for hematocrit level. The ZPP test results are abnormal in chronic iron deficiency and hemolytic anemia as well as in lead poisoning.
The ZPP level is also elevated in erythropoietic protoporphyria (a rare congenital porphyria variant) and in chronic febrile illness. An increased serum bilirubin level falsely increases ZPP readings, and fluorescing drugs or other substances in plasma may interfere.

Urinary coproporphyrin III excretion is usually, although not invariably, increased in clinically evident lead poisoning. Since this compound fluoresces under Wood’s light, simple screening tests based on fluorescence of coproporphyrin III in urine specimens under ultraviolet light have been devised.

Diagnosis of lead poisoning. The question arises as to which test should be used to detect or diagnose lead poisoning. The ALA-D assay is the most sensitive current test, and ALA-D levels may be abnormal (decreased) when all other test results are still normal. Disadvantages are the long-term persistence of abnormality once it is established (which may represent past instead of recent exposure), the instability of the specimen, and the fact that few laboratories perform the test. Zinc protoporphyrin is sensitive for lead poisoning and detects 50%-70% of cases of subclinical lead exposure in adults but is not sensitive enough to detect mandated levels of subclinical exposure in young children. There would also be a problem in differentiating acute from chronic exposure because of the irreversible change induced in the RBCs, which remains throughout the 120-day life span of the RBCs. Thus, ZPP represents biologic effects of lead averaged over 3-4 months’ time. Also, the test is not specific for lead exposure. Blood lead assay is considered the best diagnostic test for actual lead poisoning. Blood lead indicates either acute or current exposure; levels in single short exposures rise and fall fairly quickly. However, small elevations (in the 40-60 µg/100 ml range), especially in single determinations, may be difficult to interpret because of laboratory variation in the assay. Some investigators recommend assay of blood lead together with ZPP, since elevation of ZPP values would suggest that exposure to lead must have been of more than a few days’ duration.

Heavy metals. Mercury, arsenic, bismuth, and antimony are included. Urine samples are preferred to blood samples. Hair and nails are useful for detection or documentation of long-term exposure to arsenic or mercury.

Organic phosphates (cholinesterase inhibitors). Certain insecticides such as parathion and the less powerful malathion are inhibitors of the enzyme acetylcholinesterase. Acetylcholinesterase inactivates excess acetylcholine at nerve endings. Inhibition or inactivation of acetylcholinesterase permits accumulation of acetylcholine at nerve-muscle junctions. Symptoms include muscle twitching, cramps, and weakness; parasympathetic effects such as pinpoint pupils, nausea, sweating, diarrhea, and salivation; and various CNS aberrations. Organic phosphate poisons inactivate not only acetylcholinesterase (which is found in RBCs as well as at nerve endings) but also pseudocholinesterase, which is found in plasma. Therefore, laboratory diagnosis of organophosphate poisoning is based on finding decreased acetylcholinesterase levels in RBCs or decreased pseudocholinesterase levels in serum (these two cholinesterase types are frequently referred to simply as “cholinesterase”). Levels in RBCs reflect chronic poisoning more accurately than serum values, since RBC levels take longer to decrease than serum pseudocholinesterase and take longer to return to normal after exposure. Also, serum levels are reduced by many conditions and drugs. However, plasma measurement is much easier, so screening tests are generally based on plasma measurement. In acute poisoning, RBC or serum cholinesterase activity is less than 50% of normal. In most cases, a normal result rules out severe acute anticholinesterase toxicity. However, the population reference range is fairly wide, so a person with a preexposure value in the upper end of the population range might have his or her value decreased 50% and still be within the population reference range. Therefore, low-normal values do not exclude the possibility of organophosphate toxicity.
It is strongly recommended that persons who may be occupationally exposed to the organophosphates should have their baseline serum cholinesterase (pseudocholinesterase) value established. Once this is done, periodic monitoring could be done to detect subclinical toxicity. It may take up to 6 weeks for serum pseudocholinesterase to return to normal after the end of exposure. Severe acute or chronic liver disease or pregnancy can decrease cholinesterase levels.

Barbiturates and glutethimide. Barbiturates and glutethimide (Doriden) are the most common vehicles of drug-overdose suicide. In testing, either anticoagulated (not heparinized) whole blood or urine can be used; blood is preferred. Thin-layer chromatography (TLC) is used both for screening and to identify the individual substance involved. Chemical screening tests are also available. It is preferable to secure both blood and urine specimens plus gastric contents, if available. Many of the larger laboratories can perform quantitative assay of serum phenobarbital.

Phenothiazine tranquilizers. Urine can be screened with the dipstick Phenistix or the ferric chloride procedure. TLC, gas chromatography (GC), and other techniques are available for detection and quantitation.

Acetaminophen. Acetaminophen (Paracetamol; many different brand names) has been replacing aspirin for headache and minor pain because of the gastric irritant and anticoagulant side effects of aspirin. With greater use of acetaminophen has come occasional cases of overdose. Acetaminophen is rapidly absorbed from the small intestine. Peak serum concentration is reached in 0.5-2 hours, and the serum half-life is 1-4 hours. About 15%-50% is bound to serum albumin. Acetaminophen is 80%-90% metabolized by the liver microsome pathway, 4%-14% is excreted unchanged by the kidneys, and a small amount is degraded by other mechanisms.

Liver toxicity. The usual adult dose is 0.5 gm every 3-4 hours. In adults, liver toxicity is unlikely to occur if the ingested dose is less than 10 gm at one time, and death is unlikely if less than 15 gm is ingested. However, 10 gm or more at one time may produce liver damage, and 25 gm can be fatal. Children under age 5 years are less likely to develop liver injury. The toxic symptoms of overdose usually subside in 24 hours after the overdose, even in persons who subsequently develop liver injury. Liver function test results are typical of acute hepatocellular injury, with AST (SGOT) levels similar to those of acute viral hepatitis. The peak of AST elevation most often occurs 4-6 days after onset. The liver completely recovers in about 3 months if the patient survives.

Laboratory evaluation. Serum acetaminophen levels are helpful to estimate the likelihood of hepatic damage. These levels are used as a guide to continue or discontinue therapy. Peak acetaminophen levels provide the best correlation with toxicity. Current recommendations are that the assay specimen should be drawn 4 hours after ingestion of the dose—not earlier—to be certain that the peak has been reached. A serum level greater than 200 µg/ml (1,324 µmol/L) at 4 hours is considered potentially toxic, and a level less than 150 µg/ml (993 µmol/L) is considered nontoxic. The assay should be repeated 12 hours after ingestion. A value greater than 50 µg/ml (331 µmol/L) is considered toxic, and a value less than 35 µg/ml (232 µmol/L) is considered nontoxic. Colorimetric assay methods are available in kit form that are technically simple and reasonably accurate. However, inexperienced persons can obtain misleading results, so the amount of drug ingested and other factors should also be considered before therapy is terminated. Salicylates, ketones, and ascorbic acid (vitamin C) in high concentration interfere in some assay methods. Either AST or alanine aminotransferase levels should be determined daily for at least 4 days as a further check on liver function.
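The two-timepoint interpretation just described can be summarized in a short sketch. The function name and the three-way result ("toxic" / "nontoxic" / "indeterminate" for values between the two cutoffs) are illustrative assumptions; the text itself gives only the paired cutoffs.

```python
# Illustrative check of the serum acetaminophen cutoffs quoted above
# (4-hour and 12-hour post-ingestion values, in ug/ml). The function
# name and the "indeterminate" middle band are assumptions.

def interpret_acetaminophen(level_ug_ml, hours_post_ingestion):
    """Classify a serum acetaminophen level at the two sampling
    times discussed in the text."""
    if hours_post_ingestion == 4:
        toxic, nontoxic = 200, 150
    elif hours_post_ingestion == 12:
        toxic, nontoxic = 50, 35
    else:
        raise ValueError("text quotes cutoffs only at 4 and 12 hours")
    if level_ug_ml > toxic:
        return "toxic"
    if level_ug_ml < nontoxic:
        return "nontoxic"
    return "indeterminate"  # falls between the paired cutoffs


print(interpret_acetaminophen(250, 4))   # toxic
print(interpret_acetaminophen(40, 12))   # indeterminate
```

Values falling between the paired cutoffs are left uninterpreted here, which parallels the text's caution that other factors (amount ingested, assay interference) should be weighed before stopping therapy.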

Acetylsalicylic acid (aspirin). Absorption of aspirin takes place in the stomach and small intestine. Absorption is influenced by rate of tablet dissolution and by pH (acid pH assists absorption and alkaline pH retards it). Peak plasma concentration from ordinary aspirin doses is reached in about 2 hours (1-4 hours), with the peak from enteric-coated aspirin about 4-6 hours later. In some cases of overdose serum values from ordinary aspirin may take several hours to reach their maximum level due to pylorospasm. Aspirin is rapidly metabolized to salicylic acid in GI tract mucosa and liver; it is further metabolized by the liver to metabolically inactive salicyluric acid. Of the original dose, 10%-80% is excreted by the kidneys as salicylate, 5%-15% as salicylic acid, and 15%-40% as salicyluric acid. The half-life of salicylate or its active metabolites in serum at usual drug doses is 2-4.5 hours. The half-life is dose dependent, since the degradation pathways can be saturated. At high doses the half-life may be 15-30 hours. Also, steady-state serum concentration is not linear with respect to dose; relatively small increments in dose can produce disproportionately large increases in serum concentration.

Laboratory tests. Mild toxicity (tinnitus, visual disturbances, GI tract disturbances) correlates with serum salicylate levels more than 30 mg/100 ml (300 µg/ml), and severe toxicity (CNS symptoms) is associated with levels more than 50 mg/100 ml (500 µg/ml). In younger children, severe toxicity is often associated with ketosis and metabolic acidosis, whereas in older children and adults, respiratory alkalosis or mixed acidosis-alkalosis is more frequent. Peak serum salicylate values correlate best with toxicity. It is recommended that these be drawn at least 6 hours after the overdose to avoid serum values falsely below peak levels due to delayed absorption. Enteric-coated aspirin delays absorption an additional 4-6 hours. Screening tests for salicylates include urine testing with a ferric chloride reagent or Phenistix (both of these tests are also used in the diagnosis of phenylketonuria). The most commonly used quantitative test is a colorimetric procedure based on ferric chloride. Ketone bodies and phenothiazine tranquilizers can interfere.
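The text quotes salicylate levels in both mg/100 ml and µg/ml, which differ by a simple factor of 10; the sketch below makes that conversion explicit and maps levels to the toxicity bands quoted above. The band labels and function names are illustrative assumptions.

```python
# Illustrative sketch of the serum salicylate units and toxicity bands
# discussed above. Function names and band labels are assumptions.

def mg_per_dl_to_ug_per_ml(mg_per_dl):
    """1 mg/100 ml = 10 mg/L = 10 ug/ml."""
    return mg_per_dl * 10.0


def salicylate_toxicity(level_mg_dl):
    """Map a serum salicylate level (mg/100 ml) to the bands quoted
    in the text: >30 mild toxicity, >50 severe toxicity."""
    if level_mg_dl > 50:
        return "severe (CNS symptoms)"
    if level_mg_dl > 30:
        return "mild (tinnitus, visual and GI disturbances)"
    return "below toxic range"


print(mg_per_dl_to_ug_per_ml(30))  # 300.0, matching the text's pairing
print(salicylate_toxicity(55))     # severe (CNS symptoms)
```

As with acetaminophen, the timing caveats matter more than the arithmetic: a specimen drawn before the (possibly delayed) peak can classify falsely low.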

Carbon monoxide. Carbon monoxide combines with hemoglobin to form carboxyhemoglobin. In doing so it occupies oxygen-binding sites and also alters the hemoglobin molecule so that the remaining oxygen is bound more tightly, leaving less available for tissue cell respiration. Headache, fatigue, and lightheadedness are the most frequent symptoms.

Laboratory diagnosis. Carbon monoxide poisoning is detected by hemoglobin analysis for carboxyhemoglobin. This is most readily done on an instrument called a CO-Oximeter. A 30%-40% carboxyhemoglobin content is associated with severe symptoms, and more than 50% is associated with coma. Cigarette smoking may produce levels as high as 10%-15%. Carboxyhemoglobin is stable for more than 1 week at room temperature in EDTA anticoagulant. The specimen should be drawn as soon as possible after exposure, since carbon monoxide is rapidly cleared from hemoglobin by breathing normal air.
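The carboxyhemoglobin percentages quoted above can be arranged into rough interpretive bands, sketched below. The band boundaries between the quoted figures (e.g., 15%-30%) are my interpolation, not from the text, and real interpretation must account for time elapsed since exposure.

```python
# Rough mapping of carboxyhemoglobin percentage to the clinical
# correlates quoted above. Band boundaries between the quoted
# figures are assumptions for illustration.

def cohb_interpretation(cohb_percent):
    if cohb_percent > 50:
        return "associated with coma"
    if cohb_percent >= 30:
        return "severe symptoms"
    if cohb_percent >= 10:
        return "consistent with heavy smoking (10%-15%) or CO exposure"
    return "within range attributable to ambient exposure"


print(cohb_interpretation(45))  # severe symptoms
print(cohb_interpretation(12))  # consistent with heavy smoking (10%-15%) or CO exposure
```

Because carbon monoxide clears rapidly on room air, a low measured value drawn late does not exclude a much higher value at the time of exposure.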

Carbon monoxide poisoning can be suspected from an arterial blood gas specimen when a measured oxygen saturation (percent O2 saturation of hemoglobin) value is found to be significantly below what would be expected if oxygen saturation were calculated from the PO2 and pH values. For this screening procedure to be valid, the O2 saturation must be measured directly, not calculated. Some blood gas machines measure O2 saturation, but the majority calculate it from the PO2 and pH values.

Ethyl alcohol (ethanol). Ethanol is absorbed from the small intestine and, to a lesser extent, from the stomach. Factors that influence absorption are (1) whether food is also ingested, since food delays absorption, and if so, the amount and kind of food; (2) the rate of gastric emptying; and (3) the type of alcoholic beverage ingested. Without food, the absorptive phase (time period during which alcohol is being absorbed until the peak blood value is reached) may be as short as 15 minutes or as long as 2 hours. In one study, peak values occurred at 30 minutes after ingestion in nearly 50% of experimental subjects and in about 75% by 1 hour, but 6% peaked as late as 2 hours. With food, absorption is delayed to varying degrees. Once absorbed, ethanol rapidly equilibrates throughout most body tissues. The liver metabolizes about 75% of absorbed ethanol. The predominant liver cell metabolic pathway of ethanol is the alcohol dehydrogenase enzyme system, whose product is acetaldehyde. Acetaldehyde, in turn, is metabolized by the hepatic microsome system. About 10%-15% of absorbed ethanol is excreted unchanged through the kidneys and through the lungs.

Ethanol measurement. There are several methods for patient alcohol measurement. The legal system generally recognizes whole blood as the gold standard specimen. Arterial blood ethanol is somewhat higher than venous blood levels, especially in the active absorption phase. Capillary blood (fingerstick or ear lobe blood) is about 70%-85% of the arterial concentration. The major problems with whole blood are that values are influenced by the hematocrit, and most current chemistry analyzers must use serum. A serum value is about 18%-20% higher than a whole blood value obtained on the same specimen, whereas blood levels correlated by law to degrees of physical and mental impairment are defined by whole blood assay. Serum values theoretically can be converted to equivalent whole blood values by means of a serum/whole blood (S/WB) conversion ratio. Most laboratories apparently use a S/WB conversion ratio of 1.20. Unfortunately, there is significant disagreement in the literature on which ratio to use; different investigators report S/WB ratios varying between 1.03 and 1.35. Based on the work of Rainey (1993), the median S/WB conversion ratio is 1.15 (rather than 1.20), the range of ratios included in 95% certainty is 0.95-1.40, and the range of ratios included in 99% certainty is 0.90-1.49. Whole blood values can be obtained directly by using serum analytic methods on a protein-free filtrate from a whole blood specimen.
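The serum-to-whole-blood conversion described above is a single division, but the choice of ratio matters legally; the sketch below applies the median ratio of 1.15 and the 95% certainty range of 0.95-1.40 from the Rainey data quoted in the text. Function names are illustrative.

```python
# Converting a serum ethanol value to an estimated whole blood value
# using the serum/whole blood (S/WB) ratios quoted above.
# Function names are assumptions for illustration.

def serum_to_whole_blood(serum_mg_dl, ratio=1.15):
    """Estimate whole blood ethanol from serum ethanol using a
    single S/WB conversion ratio (median 1.15 per the text)."""
    return serum_mg_dl / ratio


def whole_blood_range_95(serum_mg_dl):
    """Range of whole blood estimates across the 95% certainty ratio
    interval (0.95-1.40). The larger ratio yields the smaller
    (more conservative) whole blood estimate."""
    return (serum_mg_dl / 1.40, serum_mg_dl / 0.95)


print(round(serum_to_whole_blood(115), 1))  # 100.0
low, high = whole_blood_range_95(140)
print(round(low), round(high))              # 100 147
```

Note how the second call reproduces the text's later point: a serum value of 140 mg/100 ml is the minimum that guarantees, with 95% certainty, a whole blood level of at least 100 mg/100 ml.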

Enzymatic methods using alcohol dehydrogenase or alcohol oxidase are replacing the classic potassium dichromate methods. There is some dispute in the literature as to whether alcohol dehydrogenase methods are affected by isopropanol (commonly used for venipuncture skin cleansing). In experiments performed in my laboratory, no cross-reaction was found at concentrations much higher than those that should be encountered from skin cleansing. Nevertheless, because of legal considerations, specimens for ethanol should be drawn without using any type of alcohol as a skin-cleansing agent. Increased blood ketones, as found in diabetic ketoacidosis, can falsely elevate either blood or breath alcohol test results.

Urine is not recommended for analysis to estimate degree of alcohol effect because the blood/urine ratio is highly variable and there may be stasis of the specimen in the bladder. However, urine can be used to screen for the presence of alcohol. Breath analyzers are the assay method most commonly used for police work since the measurement can be done wherever or whenever it is desirable. Breath analyzers measure the ethanol content at the end of expiration following a deep inspiration. The measurement is then correlated to whole blood by multiplying the measured breath ethanol level by the factor 2,100. On the average, breath alcohol concentration correlates reasonably well with whole blood alcohol concentration using this factor. However, there is significant variation between correlation factors reported in different individuals and average factors in different groups, so that use of any single “universal” factor will underestimate the blood ethanol concentration in some persons and overestimate it in others. Also, correlation with blood ethanol levels is better when breath ethanol is measured in the postabsorptive state than in the absorptive state. When breath analyzers are used, it is important that there be a period of at least 15 minutes before testing during which no alcohol ingestion, smoking, food or drink consumption, or vomiting has taken place to avoid contamination of the breath specimen by alcohol in the mouth. Some alcohol-containing mouthwashes may produce legally significant breath alcohol levels at 2 minutes after applying the mouthwash, but not at 10 minutes after use. Ketone bodies in patients with diabetic acidosis may interfere with breath ethanol measurement. One further advantage of breath testing in the field is the usefulness of a negative test for ethanol in a person whose behavior suggests effects of alcohol; this result could mean a serious acute medical problem that needs immediate attention.
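The 2,100:1 breath-to-blood conversion described above is a single multiplication, sketched below with an example reading. The function name and the particular example value are illustrative; as the text stresses, the "universal" factor over- or underestimates blood ethanol in individual persons.

```python
# Applying the single 2,100:1 blood/breath partition factor discussed
# above to a measured breath ethanol concentration. The function name
# and example reading are assumptions for illustration.

def breath_to_blood_ethanol(breath_g_per_100ml):
    """Estimate whole blood ethanol (g/100 ml) from breath ethanol
    (g/100 ml of breath) using the conventional 2,100 factor."""
    return breath_g_per_100ml * 2100


# A breath reading of 0.0000476 g/100 ml corresponds to roughly
# 0.10 g/100 ml (0.10%) whole blood under the universal factor:
print(round(breath_to_blood_ethanol(0.0000476), 4))
```

Because the true individual partition factor varies, two subjects with identical breath readings can have meaningfully different blood concentrations, which is the core objection to any single factor.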

Legal use of blood alcohol assay. Most courts of law follow the recommendations of the National Safety Council on alcohol and drugs (the following ethanol values are whole blood values):

  • Below 0.05% (50 mg/100 ml): No influence by alcohol within the meaning of the law.
  • Between 0.05% and 0.10% (50-100 mg/100 ml): A liberal, wide zone in which alcohol influence usually is present, but courts of law are advised to consider the person’s behavior and circumstances leading to the arrest in making their decision.
  • Above 0.10% (100 mg/100 ml): Definite evidence of being “under the influence,” since most persons with this concentration will have lost, to a measurable extent, some of the clearness of intellect and self-control they would normally possess.

Based on the work of Rainey, the minimal serum alcohol level that would correspond to a whole blood alcohol level of 0.10% (100 mg/100 ml, w/v) with 95% certainty is 140 mg/100 ml (30.4 mmol/L) and at 99% certainty is 149 mg/100 ml (32.3 mmol/L).

Some organizations, including the American Medical Association (AMA) Council on Scientific Affairs (1986), suggest adopting 0.05% blood alcohol content as per se evidence of alcohol-impaired driving.

Estimating previous blood alcohol levels. In certain situations it would be desirable to estimate the blood alcohol level at some previous time from the results of a subsequent alcohol level. The usual method for this is the Widmark equation: P = A + (F × T), where P is the concentration of blood alcohol (in milligrams per liter) at the previous time, A is the concentration of blood alcohol (in milligrams per liter) when it was measured, F is a factor (or constant) whose value is 130 (in milligrams per liter per hour), and T is the time (in hours) elapsed between the time the blood alcohol was measured and the previous time for which the blood alcohol value must be estimated.

There is considerable controversy regarding the usefulness of the Widmark equation. The equation is valid for a person only in the postabsorptive state (i.e., after the peak blood alcohol level is reached). Time to peak is most often considered to be 0.5-2.0 hours, so that the blood specimen must be drawn no earlier than 2 hours after the beginning of alcohol intake. The Widmark equation is based on kinetics of alcohol taken during fasting. Food increases alcohol elimination, so that food would cause the Widmark equation to overestimate the previous alcohol level. The factor (constant) of 130 is not necessarily applicable to any individual person, since a range of experimentally measured individual values from 100-340 has been reported.
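The back-calculation and its main caveat (the wide range of individual elimination factors) can be sketched directly. The function name is illustrative; the equation and the values 130 and 100-340 are those given in the text.

```python
# Sketch of the Widmark back-calculation discussed above. Valid only
# in the postabsorptive state; the elimination factor F varies widely
# between individuals (reported range roughly 100-340 mg/L/hr),
# so results with the conventional F = 130 are only estimates.

def widmark_previous_level(measured_mg_l, hours_elapsed, factor=130.0):
    """P = A + (F x T): estimated earlier blood alcohol (mg/L)."""
    return measured_mg_l + factor * hours_elapsed


# Measured 800 mg/L (0.08%) three hours after the time of interest:
print(widmark_previous_level(800, 3))         # 1190.0 with conventional F
print(widmark_previous_level(800, 3, 100.0))  # 1100.0 at the low end of F
```

The spread between the two results for the same measurement illustrates why the equation's output is contested in court: the choice of F alone moves the estimate across legally significant boundaries.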

Clinical and laboratory effects of alcohol. Alcohol has a considerable number of metabolic and toxic effects that may directly or indirectly involve the clinical laboratory. Liver manifestations include gamma-glutamyltransferase (GGT; formerly gamma-glutamyl transpeptidase) elevation, fatty liver, acute alcoholic hepatitis (“active cirrhosis”), or Laennec’s cirrhosis, and may lead indirectly to bleeding from esophageal varices or to cytopenia either from hypersplenism or through other mechanisms. RBC macrocytosis is frequently associated with chronic alcoholism, and nutritional anemia such as that due to folic acid deficiency may be present. Other frequently associated conditions include acute pancreatitis, hypertriglyceridemia, alcoholic gastritis, alcoholic hypoglycemia, various neurologic abnormalities, and subdural hematoma. The chronic alcoholic is more susceptible to infection. Finally, alcohol interacts with a variety of medications. It potentiates many of the CNS depressants, such as various sedatives, narcotics, hypnotics, and tranquilizers (especially chlordiazepoxide and diazepam). Alcohol is a factor in many cases of overdose, even when the patient has no history of alcohol intake or denies intake. The presence of alcohol should be suspected when toxicity symptoms from barbiturates or other medications are associated with blood levels that normally would be considered safe. Alcohol may antagonize the action of various other medications, such as coumarin and phenytoin. Alcohol intake in pregnancy has been reported to produce increased rates of stillbirth and infant growth deficiency as well as a specific “fetal alcohol syndrome.” Fetal alcohol syndrome includes a particular pattern of facial appearance, postnatal growth deficiency with normal bone age, various skeletal and organ malformations, and various neurologic abnormalities (including average IQ below normal).

Ethanol is one of a number of substances (other alcohols, lactic acid, etc.) that elevate serum osmolality measured by freezing point depression methods. This creates a gap between measured osmolality and calculated osmolality (the osmolal gap). Osmolality measured with vapor pressure instruments is not affected by ethanol.
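The gap between measured and calculated osmolality mentioned above can be sketched numerically. The calculation formula used here (2 × Na + glucose/18 + BUN/2.8, with ethanol contributing roughly its mg/dl value divided by 4.6) is one commonly used form and is an assumption on my part, not taken from this text.

```python
# Illustration of the gap between measured and calculated serum
# osmolality. The estimating formula below is one commonly used
# version and is an assumption, not from the text.

def calculated_osmolality(na_meq_l, glucose_mg_dl, bun_mg_dl):
    """A commonly used estimate: 2 x Na + glucose/18 + BUN/2.8 (mOsm/kg)."""
    return 2 * na_meq_l + glucose_mg_dl / 18.0 + bun_mg_dl / 2.8


def osmolal_gap(measured, na_meq_l, glucose_mg_dl, bun_mg_dl):
    """Measured (freezing point) minus calculated osmolality."""
    return measured - calculated_osmolality(na_meq_l, glucose_mg_dl, bun_mg_dl)


# With Na 140, glucose 90, BUN 14, the calculated value is about 290;
# a measured value of 320 leaves a gap of about 30 mOsm/kg, which
# unmeasured osmoles such as ethanol could explain:
print(round(calculated_osmolality(140, 90, 14)))
print(round(osmolal_gap(320, 140, 90, 14)))
```

A substantial gap in the right clinical setting therefore serves as indirect evidence of ethanol or another unmeasured low-molecular-weight substance.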

Laboratory screening for alcoholism. Various tests have been used to screen for chronic alcoholism. Of these, the most commonly advocated is the GGT. This test and the AST reflect the effect of alcohol on the liver. A third possibility is mean corpuscular volume (MCV), the average size of the patient RBC. This reflects macrocytosis induced by liver disease and possibly also by folic acid deficiency in alcoholics. In heavy drinkers or alcoholics, GGT has a sensitivity of about 70% (literature range, 63%-81%), MCV detects about 60% (26%-90%), and the AST value is elevated in about 50% (27%-77%). Most (but not all) reports indicate some correlation in likelihood and degree of GGT elevation with the amount and frequency of alcohol consumption. Thus, heavy drinkers are more likely to have GGT elevations, and these elevations are (on the average) higher than those of less heavy drinkers. However, there are many exceptions. There is some disagreement as to whether so-called social drinkers have a significant incidence of elevated GGT levels. The majority of investigators seem to believe that they do not.

Other biochemical abnormalities associated with alcoholism (but found in <40% of cases) include hypophosphatemia, hypomagnesemia, hyponatremia, hypertriglyceridemia, and hyperuricemia.

An isoform of transferrin that contains fewer sialic acid molecules than normal transferrin, and thus is called carbohydrate-deficient transferrin, has been advocated by some researchers as a marker of chronic alcohol abuse. A few studies claim that in alcoholism its levels become elevated more often than those of GGT and that it is a more specific indicator of alcohol abuse. One study claimed 55% sensitivity in detecting moderate alcohol intake and nearly 100% sensitivity in heavy chronic drinkers. A commercial assay kit is available. However, at present the test would most likely have to be performed in a large reference laboratory.

Tests for Tobacco Use. In some instances, such as in smoking-cessation clinics, life insurance company examinations, and tests to determine degree of passive exposure to tobacco smoke, it is desirable to detect and quantitate tobacco exposure. The modalities that have been investigated are carboxyhemoglobin (based on effect of carbon monoxide generated by tobacco combustion), thiocyanate (a metabolite of cyanide derived from tobacco tar), and cotinine (a metabolite of nicotine). Most current tests are based on thiocyanate or cotinine. Thiocyanate is absorbed in the lungs and has a biological half-life of about 14 days. Ingestion of certain vegetables can falsely elevate serum thiocyanate levels. Cotinine is specific for nicotine, is not affected by diet, has a serum within-day variation of about 15%-20%, and has a biological half-life of about 19 hours. Thus, cotinine tests become negative after tobacco abstinence of a week or less, whereas thiocyanate requires considerably longer before becoming nondetectable. Also, thiocyanate can be assayed chemically and less expensively than cotinine, which is done by immunoassay. Nevertheless, because cotinine is specific for nicotine and is affected only by active or passive exposure to tobacco, cotinine seems to be favored by investigators. Cotinine can be assayed in serum, saliva, or urine; the levels are higher in urine.