Articles on Medical Diseases and Conditions

Toxicology

This section includes a selected list of conditions that seem especially important in drug detection, overdose, or poisoning. Treatment of drug overdose by dialysis or other means can often be assisted with the objective information derived from drug levels. In some cases, drug screening of urine and serum may reveal additional drugs or substances, such as alcohol, which affect management or clinical response.

Lead. Lead exposure in adults is most often due to occupational hazard (e.g., exposure to lead in manufacture or use of gasoline additives and in smelting) or to homemade “moonshine” whiskey distilled in lead-containing equipment. When children are severely affected, it is usually from eating old lead-containing paint chips. One group found some indications of chronic lead exposure in about one half of those persons examined who had lived for more than 5 years near a busy automobile expressway in a major city. Fertilization of crops with city sewage sludge is reported to increase the lead content of the crops. Several studies report that parental cigarette smoking is a risk factor for increased blood lead values in children. Living in houses built before 1960 is another risk factor because lead-based paint was used before it was banned. Renovating these houses may spread fragments or powder from the lead-containing paint. Living near factories manufacturing lead batteries is another risk factor.

Symptoms. Acute lead poisoning is uncommon. Symptoms may include “lead colic” (crampy abdominal pain, constipation, occasional bloody diarrhea) and, in 50% of patients, hypertensive encephalopathy. Chronic poisoning is more common. Its varying symptoms may include lead colic, constipation with anorexia (85% of patients), and peripheral neuritis (wrist drop) in adults and lead encephalopathy (headache, convulsions) in children. A “lead line” is frequently present just below the epiphyses (in approximately 70% of patients with clinical symptoms and 20%-40% of persons with abnormal exposure but no symptoms).

Hematologic findings. Most patients develop slight to moderate anemia, usually hypochromic but sometimes normochromic. Basophilic stippling of RBCs is the most characteristic peripheral blood finding. Some authors claim stippling is invariably present; others report that stippling is present in only 20%-30% of cases. Normal persons may have as many as 500 stippled cells/1 million RBCs. The reticulocyte count is usually greater than 4%.

Delta-aminolevulinic acid dehydrase. Body intake of lead produces biochemical effects on heme synthesis (see Fig. 34-1). The level of delta-aminolevulinic acid dehydrase (ALA-D), which converts ALA to porphobilinogen, is decreased as early as the fourth day after exposure begins. Once the ALA-D level is reduced, persistence of abnormality correlates with the amount of lead in body tissues (body burden), so that the ALA-D level remains reduced as long as significant quantities of lead remain. Therefore, after chronic lead exposure, low ALA-D values may persist for years even though exposure has ceased. The level of ALA-D is also a very sensitive indicator of lead toxicity and is usually reduced to 50% or less of normal activity when blood lead values are in the 30-50 µg/100 ml (1.4-2.4 µmol/L) range. Unfortunately, the ALA-D level reaches a plateau when marked reduction takes place, so it cannot be used to quantitate degree of exposure. In addition, this enzyme must be assayed within 24 hours after the blood specimen is secured. Relatively few laboratories perform the test, although it has only a moderate degree of technical difficulty.

Blood lead assay. Intake of lead ordinarily results in rapid urinary lead excretion. If excessive lead exposure continues, lead is stored in bone. If bone storage capacity is exceeded, lead accumulates in soft tissues. Blood lead levels depend on the relationship between intake, storage, and excretion. The blood lead level is primarily an indication of acute (current) exposure but is also influenced by previous storage. According to 1991 Centers for Disease Control (CDC) guidelines, whole blood lead values over 10 µg/100 ml (0.48 µmol/L) are considered abnormal in children less than 6 years old. Values higher than 25 µg/100 ml (1.21 µmol/L) are considered abnormal in children over age 6 years and in adolescents. Values more than 40 µg/100 ml (1.93 µmol/L) are generally considered abnormal in adults, although the cutoff point for children may also be valid for adults. Symptoms of lead poisoning are associated with levels higher than 80 µg/100 ml (3.86 µmol/L), although mild symptoms may occur at 50 µg/100 ml (2.41 µmol/L) in children. Blood lead assay takes considerable experience and dedication to perform accurately. Contamination is a major headache—in drawing the specimen, in sample tubes, in laboratory glassware, and in the assay procedure itself. Special Vacutainer-type tubes for trace metal determination are commercially available and are strongly recommended.
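
For orientation, the age-dependent CDC cutoffs quoted above can be expressed as a short calculation. The following Python fragment is an illustrative sketch only; the thresholds and age groupings are those quoted in the text, the conversion factor of 0.0483 µmol/L per µg/100 ml is the standard one for lead, and the function itself is hypothetical rather than part of any guideline.

# Illustrative sketch only: classifies whole blood lead values against the
# 1991 CDC cutoffs quoted in the text. The function is hypothetical.

LEAD_UG_DL_TO_UMOL_L = 0.0483  # standard conventional-to-SI factor for lead

def classify_blood_lead(lead_ug_per_100ml, age_years):
    """Return (SI value, cutoff used, interpretation) for a whole blood lead result."""
    si_value = lead_ug_per_100ml * LEAD_UG_DL_TO_UMOL_L
    if age_years < 6:
        cutoff = 10   # children under 6 years
    elif age_years < 18:
        cutoff = 25   # older children and adolescents
    else:
        cutoff = 40   # adults (the text notes the pediatric cutoff may also be applied)
    interpretation = "abnormal" if lead_ug_per_100ml > cutoff else "within guidelines"
    return si_value, cutoff, interpretation

si, cutoff, result = classify_blood_lead(45, age_years=30)
print(f"45 ug/100 ml = {si:.2f} umol/L; cutoff {cutoff} ug/100 ml -> {result}")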

Urine delta-aminolevulinic acid (ALA) assay. Another procedure frequently used is urine ALA assay. Blood and urine ALA levels increase when the blood ALA-D level is considerably reduced. Therefore, ALA also becomes an indicator of body lead burden, and urine ALA begins to increase when blood lead values are higher than 40 µg/100 ml (1.93 µmol/L). Disadvantages of urine ALA assay are difficulties with 24-hour urine collection or, if random specimens are used, the effects of urine concentration or dilution on apparent ALA concentration. In addition, at least one investigator found that the urine ALA level was normal in a significant number of cases when the blood lead level was in the 40-80 µg/100 ml (1.93-3.86 µmol/L) (mildly to moderately abnormal) range. Light, room temperature, and alkaline pH all decrease ALA levels. If ALA determination is not done immediately, the specimen must be refrigerated and kept in the dark (the collection bottle wrapped in paper or foil) with the specimen acidified, using glacial acetic or tartaric acid.

Detecting lead exposure. If a patient is subjected to continuous lead exposure of sufficient magnitude, blood lead level, urine lead excretion, ALA-D level, and urine ALA level all correlate well. If the exposure ceases before laboratory tests are made, blood lead level (and sometimes even urine lead level) may decrease relative to ALA-D or urine ALA. Assay of ALA-D is the most sensitive of these tests. In fact, certain patients whose urine ALA and blood lead levels are within normal limits may display a mild to moderate decrease in ALA-D levels. It remains to be determined whether this reflects previous toxicity in all cases or simply means that ALA-D levels between 50% and 100% of normal are too easily produced to mean truly abnormal lead exposure.

Urine lead excretion has also been employed as an index of exposure, since blood lead values change more rapidly than urine lead excretion. However, excretion values depend on 24-hour urine specimens, with the usual difficulty in complete collection. A further problem is that excretion values may be normal in borderline cases or in cases of previous exposure. Urine lead has been measured after administration of a chelating agent such as ethylenediamine tetraacetic acid (EDTA), which mobilizes body stores of lead. This is a more satisfactory technique than ordinary urine excretion for determining body burden (i.e., previous exposure). Abnormal exposure is suggested when the 24-hour urine lead excretion is greater than 1 µg for each milligram of calcium-EDTA administered. Disadvantages are those of incomplete urine collection, difficulty in accurate lead measurement, and occasional cases of EDTA toxicity.
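
The mobilization criterion just described (more than 1 µg of urine lead excreted per milligram of calcium-EDTA given) amounts to a simple ratio. The sketch below merely encodes that ratio; the function and the example numbers are illustrative assumptions, not part of a standard protocol.

# Minimal sketch of the EDTA lead-mobilization criterion described above:
# abnormal body lead burden is suggested when 24-hour urine lead excretion
# exceeds 1 microgram per milligram of calcium-EDTA administered.

def edta_mobilization_ratio(urine_lead_ug_24hr, ca_edta_dose_mg):
    """Return micrograms of lead excreted per milligram of CaEDTA given."""
    return urine_lead_ug_24hr / ca_edta_dose_mg

ratio = edta_mobilization_ratio(urine_lead_ug_24hr=1200.0, ca_edta_dose_mg=1000.0)
print(f"{ratio:.2f} ug lead/mg CaEDTA ->",
      "suggests abnormal lead exposure" if ratio > 1.0 else "within expected range")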

Erythrocyte protoporphyrin (zinc protoporphyrin, or ZPP) is still another indicator of lead exposure. Lead inhibits ferrochelatase (heme synthetase), an enzyme that incorporates iron into protoporphyrin IX (erythrocyte protoporphyrin) to form heme. Decreased erythrocyte protoporphyrin conversion leads to increased erythrocyte protoporphyrin levels. The standard assay for erythrocyte protoporphyrin involved extraction of a mixture of porphyrins, including protoporphyrin IX, from blood, and measurement of protoporphyrin using fluorescent wavelengths. In normal persons, protoporphyrin IX is not complexed to metal ions. Under the conditions of measurement it was thought that the protoporphyrin being measured was metal free, since iron-complexed protoporphyrin did not fluoresce, so that what was being measured was called “free erythrocyte protoporphyrin.” However, in lead poisoning, protoporphyrin IX becomes complexed to zinc; hence, the term ZPP. The protoporphyrin-zinc complex will fluoresce although the protoporphyrin-iron complex will not. Therefore, most laboratory analytic techniques for ZPP involve fluorescent methods. In fact, some have used visual RBC fluorescence (in a heparinized wet preparation using a microscope equipped with ultraviolet light) as a rapid screening test for lead poisoning. Zinc protoporphyrin levels are elevated in about 50%-75% of those who have a subclinical increase in blood lead levels (40-60 µg/100 ml) and are almost always elevated in symptomatic lead poisoning. However, the method is not sensitive enough for childhood lead screening (10 µg/100 ml or 0.48 µmol/L). An instrument called the “hematofluorometer” is available from several manufacturers and can analyze a single drop of whole blood for ZPP. The reading is affected by the number of RBCs present and must be corrected for hematocrit level. The ZPP test results are abnormal in chronic iron deficiency and hemolytic anemia as well as in lead poisoning. The ZPP level is also elevated in erythropoietic protoporphyria (a rare congenital porphyria variant) and in chronic febrile illness. An increased serum bilirubin level falsely increases ZPP readings, and fluorescing drugs or other substances in plasma may interfere.

Urinary coproporphyrin III excretion is usually, although not invariably, increased in clinically evident lead poisoning. Since this compound fluoresces under Wood’s light, simple screening tests based on fluorescence of coproporphyrin III in urine specimens under ultraviolet light have been devised.

Diagnosis of lead poisoning. The question arises as to which test should be used to detect or diagnose lead poisoning. The ALA-D assay is the most sensitive current test, and ALA-D levels may be abnormal (decreased) when all other test results are still normal. Disadvantages are the long-term persistence of abnormality once it is established, which may represent past instead of recent exposure. The specimen is unstable, and few laboratories perform the test. Zinc protoporphyrin is sensitive for lead poisoning and detects 50%-70% of cases of subclinical lead exposures in adults but is not sensitive enough to detect mandated levels of subclinical exposure in young children. There would be a problem in differentiating acute from chronic exposure because of the irreversible change induced in the RBCs, which remains throughout the 120-day life span of the RBCs. Thus, ZPP represents biologic effects of lead averaged over 3-4 months’ time. Also, the test is not specific for lead exposure. Blood lead assay is considered the best diagnostic test for actual lead poisoning. Blood lead indicates either acute or current exposure; levels in single short exposures rise and fall fairly quickly. However, small elevations (in the 40-60 µg/100 ml range), especially in single determinations, may be difficult to interpret because of laboratory variation in the assay. Some investigators recommend assay of blood lead together with ZPP, since elevation of ZPP values would suggest that exposure to lead must have been more than a few days’ duration.

Heavy metals. Mercury, arsenic, bismuth, and antimony are included. Urine samples are preferred to blood samples. Hair and nails are useful for detection or documentation of long-term exposure to arsenic or mercury.

Organic phosphates (cholinesterase inhibitors). Certain insecticides such as parathion and the less powerful malathion are inhibitors of the enzyme acetylcholinesterase. Acetylcholinesterase inactivates excess acetylcholine at nerve endings. Inhibition or inactivation of acetylcholinesterase permits accumulation of excess acetylcholine at nerve-muscle junctions. Symptoms include muscle twitching, cramps, and weakness; parasympathetic effects such as pinpoint pupils, nausea, sweating, diarrhea, and salivation; and various CNS aberrations. Organic phosphate poisons inactivate not only acetylcholinesterase (which is found in RBCs as well as at nerve endings) but also pseudocholinesterase, which is found in plasma. Therefore, laboratory diagnosis of organophosphate poisoning is based on finding decreased acetylcholinesterase levels in RBCs or decreased pseudocholinesterase levels in serum (these two cholinesterase types are frequently referred to simply as “cholinesterase”). Levels in RBCs reflect chronic poisoning more accurately than serum values since RBC levels take longer to decrease than serum pseudocholinesterase and take longer to return to normal after exposure. Also, serum levels are reduced by many conditions and drugs. However, plasma measurement is much easier, so that screening tests are generally based on plasma measurement. In acute poisoning, RBC or serum cholinesterase activity is less than 50% of normal. In most cases, a normal result rules out severe acute anticholinesterase toxicity. However, the population reference range is fairly wide, so that a person with a preexposure value in the upper end of the population range might have his or her value decreased 50% and still be within the population reference range. Therefore, low-normal values do not exclude the possibility of organophosphate toxicity. It is strongly recommended that persons who may be occupationally exposed to the organophosphates should have their baseline serum cholinesterase (pseudocholinesterase) value established. Once this is done, periodic monitoring could be done to detect subclinical toxicity. It may take up to 6 weeks for serum pseudocholinesterase to return to normal after the end of exposure. Severe acute or chronic liver disease or pregnancy can decrease cholinesterase levels.
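
Because a value that has fallen 50% from an individual's own baseline can still sit within the wide population reference range, the useful calculation in occupational monitoring is percent of personal baseline rather than comparison with population limits. The short sketch below illustrates that comparison; the function and the example values are hypothetical.

# Hypothetical sketch: compares a current serum pseudocholinesterase result
# with the same worker's preexposure baseline, since a value that has fallen
# 50% may still lie inside the wide population reference range.

def percent_of_baseline(current_value, baseline_value):
    """Return current cholinesterase activity as a percentage of the personal baseline."""
    return 100.0 * current_value / baseline_value

baseline = 12.0   # hypothetical preexposure activity, arbitrary units
current = 5.5     # hypothetical postexposure activity, same units

pct = percent_of_baseline(current, baseline)
print(f"Current activity is {pct:.0f}% of baseline")
if pct < 50:
    print("Degree of depression consistent with significant organophosphate effect (see text)")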

Barbiturates and glutethimide. Barbiturates and glutethimide (Doriden) are the most common vehicles of drug-overdose suicide. In testing, either anticoagulated (not heparinized) whole blood or urine can be used; blood is preferred. TLC is used both for screening and to identify the individual substance involved. Chemical screening tests are also available. It is preferable to secure both blood and urine specimens plus gastric contents, if available. Many of the larger laboratories can perform quantitative assay of serum phenobarbital.

Phenothiazine tranquilizers. Urine can be screened with the dipstick Phenistix or the ferric chloride procedure. TLC, GC, and other techniques are available for detection and quantitation.

Acetaminophen. Acetaminophen (Paracetamol; many different brand names) has been replacing aspirin for headache and minor pain because of the gastric irritant and anticoagulant side effects of aspirin. With greater use of acetaminophen has come occasional cases of overdose. Acetaminophen is rapidly absorbed from the small intestine. Peak serum concentration is reached in 0.5-2 hours, and the serum half-life is 1-4 hours. About 15%-50% is bound to serum albumin. Acetaminophen is 80%-90% metabolized by the liver microsome pathway, 4%-14% is excreted unchanged by the kidneys, and a small amount is degraded by other mechanisms.

Liver toxicity. The usual adult dose is 0.5 gm every 3-4 hours. In adults, liver toxicity is unlikely to occur if the ingested dose is less than 10 gm at one time, and death is unlikely if less than 15 gm is ingested. However, 10 gm or more at one time may produce liver damage, and 25 gm can be fatal. Children under age 5 years are less likely to develop liver injury. The toxic symptoms of overdose usually subside in 24 hours after the overdose, even in persons who subsequently develop liver injury. Liver function test results are typical of acute hepatocellular injury, with AST (SGOT) levels similar to those of acute viral hepatitis. The peak of AST elevation most often occurs 4-6 days after onset. The liver completely recovers in about 3 months if the patient survives.

Laboratory evaluation. Serum acetaminophen levels are helpful to estimate the likelihood of hepatic damage. These levels are used as a guide to continue or discontinue therapy. Peak acetaminophen levels provide the best correlation with toxicity. Current recommendations are that the assay specimen should be drawn no earlier than 4 hours after ingestion of the dose to be certain that the peak has been reached. A serum level greater than 200 µg/ml (1,320 µmol/L) at 4 hours is considered potentially toxic, and a level less than 150 µg/ml (990 µmol/L) is considered nontoxic. The assay should be repeated 12 hours after ingestion. A value greater than 50 µg/ml (330 µmol/L) is considered toxic, and a value less than 35 µg/ml (230 µmol/L) is considered nontoxic. Colorimetric assay methods are available in kit form that are technically simple and reasonably accurate. However, inexperienced persons can obtain misleading results, so the amount of drug ingested and other factors should also be considered before therapy is terminated. Salicylates, ketones, and ascorbic acid (vitamin C) in high concentration interfere in some assay methods. Either AST or alanine aminotransferase levels should be determined daily for at least 4 days as a further check on liver function.
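
The 4-hour and 12-hour cutoffs quoted above lend themselves to a simple decision sketch. The Python fragment below is illustrative only; it uses the figures from the text, treats the 4-hour cutoffs as applying until the 12-hour repeat assay (a simplification), and is not a substitute for a published treatment nomogram.

# Illustrative sketch only, using the serum acetaminophen cutoffs quoted in
# the text: 200/150 ug/ml at 4 hours and 50/35 ug/ml at 12 hours. Treating
# the 4-hour cutoffs as valid until the 12-hour repeat assay is a
# simplification; this is not a published treatment nomogram.

def interpret_acetaminophen(level_ug_ml, hours_post_ingestion):
    if hours_post_ingestion < 4:
        return "specimen drawn too early; peak may not yet have been reached"
    if hours_post_ingestion < 12:
        toxic, nontoxic = 200, 150   # 4-hour cutoffs from the text
    else:
        toxic, nontoxic = 50, 35     # 12-hour cutoffs from the text
    if level_ug_ml > toxic:
        return "potentially toxic"
    if level_ug_ml < nontoxic:
        return "probably nontoxic"
    return "indeterminate; repeat assay and weigh the amount ingested"

print(interpret_acetaminophen(180, hours_post_ingestion=4))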

Acetylsalicylic acid (aspirin). Absorption of aspirin takes place in the stomach and small intestine. Absorption is influenced by rate of tablet dissolution and by pH (acid pH assists absorption and alkaline pH retards it). Peak plasma concentration from ordinary aspirin doses is reached in about 2 hours (1-4 hours), with the peak from enteric-coated aspirin about 4-6 hours later. In some cases of overdose serum values from ordinary aspirin may take several hours to reach their maximum level due to pylorospasm. Aspirin is rapidly metabolized to salicylic acid in GI tract mucosa and liver; it is further metabolized by the liver to metabolically inactive salicyluric acid. Of the original dose, 10%-80% is excreted by the kidneys as salicylate, 5%-15% as salicylic acid, and 15%-40% as salicyluric acid. The half-life of salicylate or its active metabolites in serum at usual drug doses is 2-4.5 hours. The half-life is dose dependent, since the degradation pathways can be saturated. At high doses the half-life may be 15-30 hours. Also, steady-state serum concentration is not linear with respect to dose; relatively small increments in dose can produce disproportionately large increases in serum levels.

Laboratory tests. Mild toxicity (tinnitus, visual disturbances, GI tract disturbances) correlates with serum salicylate levels more than 30 mg/100 ml (300 µg/ml), and severe toxicity (CNS symptoms) is associated with levels more than 50 mg/100 ml (500 µg/ml). In younger children, severe toxicity is often associated with ketosis and metabolic acidosis, whereas in older children and adults, respiratory alkalosis or mixed acidosis-alkalosis is more frequent. Peak serum salicylate values correlate best with toxicity. It is recommended that these be drawn at least 6 hours after the overdose to avoid serum values falsely below peak levels due to delayed absorption. Enteric-coated aspirin delays absorption an additional 4-6 hours. Screening tests for salicylates include urine testing with a ferric chloride reagent or Phenistix (both of these tests are also used in the diagnosis of phenylketonuria). The most commonly used quantitative test is a colorimetric procedure based on ferric chloride. Ketone bodies and phenothiazine tranquilizers can interfere.
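
The correlation of serum salicylate level with severity described above can be illustrated with a brief sketch. The fragment below simply encodes the 30 and 50 mg/100 ml figures from the text, assumes the specimen was drawn at or after the peak, and is not intended as a clinical tool.

# Sketch only: encodes the serum salicylate severity figures from the text
# (mild toxicity above about 30 mg/100 ml, severe toxicity above about
# 50 mg/100 ml), assuming the specimen was drawn at or after the peak.

def interpret_salicylate(level_mg_per_100ml):
    if level_mg_per_100ml > 50:
        return "consistent with severe toxicity (CNS symptoms)"
    if level_mg_per_100ml > 30:
        return "consistent with mild toxicity (tinnitus, visual and GI disturbances)"
    return "below the usual toxic range"

for level in (25, 38, 62):   # mg/100 ml; 1 mg/100 ml = 10 ug/ml
    print(level, "mg/100 ml ->", interpret_salicylate(level))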

Carbon monoxide. Carbon monoxide combines with hemoglobin to form carboxyhemoglobin. In doing so it occupies oxygen-binding sites and also changes the hemoglobin molecule so that the remaining oxygen is bound more tightly, leaving less available for tissue cell respiration. Headache, fatigue, and lightheadedness are the most frequent symptoms.

Laboratory diagnosis. Carbon monoxide poisoning is detected by hemoglobin analysis for carboxyhemoglobin. This is most readily done on an instrument called a CO-Oximeter. A 30%-40% carboxyhemoglobin content is associated with severe symptoms, and more than 50% is associated with coma. Cigarette smoking may produce levels as high as 10%-15%. Carboxyhemoglobin is stable for more than 1 week at room temperature in EDTA anticoagulant. The specimen should be drawn as soon as possible after exposure, since carbon monoxide is rapidly cleared from hemoglobin by breathing normal air.

Carbon monoxide poisoning can be suspected from an arterial blood gas specimen when a measured oxygen saturation (percent O2 saturation of hemoglobin) value is found to be significantly below what would be expected if oxygen saturation were calculated from the PO2 and pH values. For this screening procedure to be valid, the O2 saturation must be measured directly, not calculated. Some blood gas machines measure O2 saturation, but the majority calculate it from the PO2 and pH values.
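
The screening principle just described reduces to comparing a measured oxygen saturation with the value calculated from the PO2 and pH. The sketch below assumes both numbers are already available from the blood gas report; the 5% gap used as a flag is an arbitrary illustrative threshold, not an established cutoff.

# Sketch of the screening principle described above: a measured O2 saturation
# substantially lower than the saturation calculated from PO2 and pH raises
# the question of carboxyhemoglobin (or another abnormal hemoglobin). The 5%
# flagging threshold here is arbitrary and purely illustrative.

def saturation_gap(measured_sat_pct, calculated_sat_pct, flag_threshold=5.0):
    gap = calculated_sat_pct - measured_sat_pct
    return gap, gap > flag_threshold

gap, suspicious = saturation_gap(measured_sat_pct=80.0, calculated_sat_pct=95.0)
message = "consider CO-oximetry for carboxyhemoglobin" if suspicious else "no gap flagged"
print(f"Saturation gap: {gap:.1f}% -> {message}")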

Ethyl alcohol (ethanol). Ethanol is absorbed from the small intestine and, to a lesser extent, from the stomach. Factors that influence absorption are (1) whether food is also ingested, since food delays absorption, and if so, the amount and kind of food; (2) the rate of gastric emptying; and (3) the type of alcoholic beverage ingested. Without food, the absorptive phase (time period during which alcohol is being absorbed until the peak blood value is reached) may be as short as 15 minutes or as long as 2 hours. In one study, peak values occurred at 30 minutes after ingestion in nearly 50% of experimental subjects and in about 75% by 1 hour, but 6% peaked as late as 2 hours. With food, absorption is delayed to varying degrees. Once absorbed, ethanol rapidly equilibrates throughout most body tissues. The liver metabolizes about 75% of absorbed ethanol. The predominant liver cell metabolic pathway of ethanol is the alcohol dehydrogenase enzyme system, whose product is acetaldehyde. Acetaldehyde, in turn, is metabolized by the hepatic microsome system. About 10%-15% of absorbed ethanol is excreted unchanged through the kidneys and through the lungs.

Ethanol measurement. There are several methods for patient alcohol measurement. The legal system generally recognizes whole blood as the gold standard specimen. Arterial blood ethanol is somewhat higher than venous blood levels, especially in the active absorption phase. Capillary blood (fingerstick or ear lobe blood) is about 70%-85% of the arterial concentration. The major problems with whole blood are that values are influenced by the hematocrit, and most current chemistry analyzers must use serum. A serum value is about 18%-20% higher than a whole blood value obtained on the same specimen, whereas the blood levels that the law correlates with degrees of physical and mental impairment are defined in terms of whole blood assay. Serum values theoretically can be converted to equivalent whole blood values by means of a serum/whole blood (S/WB) conversion ratio. Most laboratories apparently use an S/WB conversion ratio of 1.20. Unfortunately, there is significant disagreement in the literature on which ratio to use; different investigators report S/WB ratios varying between 1.03 and 1.35. Based on the work of Rainey (1993), the median S/WB conversion ratio is 1.15 (rather than 1.20), the range of ratios included in 95% certainty is 0.95-1.40, and the range of ratios included in 99% certainty is 0.90-1.49. Whole blood values can be obtained directly by using serum analytic methods on a protein-free filtrate from a whole blood specimen.
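
The serum-to-whole-blood arithmetic described above is a division by whichever S/WB ratio is adopted. The sketch below uses the ratios quoted in the text (1.20 in common laboratory use, 1.15 as the Rainey median, and 0.95-1.40 as the 95% range) and is illustrative only.

# Illustrative conversion of a serum ethanol value to an equivalent whole
# blood value using a serum/whole blood (S/WB) ratio. Ratios quoted in the
# text: 1.20 (commonly used), 1.15 (Rainey median), 0.95-1.40 (95% range).

def serum_to_whole_blood(serum_mg_per_100ml, s_wb_ratio=1.15):
    """Divide the serum value by the S/WB ratio to estimate whole blood ethanol."""
    return serum_mg_per_100ml / s_wb_ratio

serum_value = 140.0   # mg/100 ml
for ratio in (1.20, 1.15, 0.95, 1.40):
    wb = serum_to_whole_blood(serum_value, ratio)
    print(f"S/WB ratio {ratio:.2f}: estimated whole blood ethanol {wb:.0f} mg/100 ml")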

Enzymatic methods using alcohol dehydrogenase or alcohol oxidase are replacing the classic potassium dichromate methods. There is some dispute in the literature whether alcohol dehydrogenase methods are affected by isopropanol (commonly used for venipuncture skin cleansing). In experiments performed in my laboratory, no cross-reaction was found at concentrations much stronger than those that should be encountered from skin cleansing. Nevertheless, because of legal considerations, specimens for ethanol should be drawn without using any type of alcohol as a skin-cleansing agent. Increased blood ketones, as found in diabetic ketoacidosis, can falsely elevate either blood or breath alcohol test results.

Urine is not recommended for analysis to estimate degree of alcohol effect because the blood/urine ratio is highly variable and there may be stasis of the specimen in the bladder. However, urine can be used to screen for the presence of alcohol. Breath analyzers are the assay method most commonly used for police work since the measurement can be done wherever or whenever it is desirable. Breath analyzers measure the ethanol content at the end of expiration following a deep inspiration. The measurement is then correlated to whole blood by multiplying the measured breath ethanol level by the factor 2,100. On the average, breath alcohol concentration correlates reasonably well with whole blood alcohol concentration using this factor. However, there is significant variation between correlation factors reported in different individuals and average factors in different groups, so that use of any single “universal” factor will underestimate the blood ethanol concentration in some persons and overestimate it in others. Also, correlation with blood ethanol levels is better when breath ethanol is measured in the postabsorptive state than in the absorptive state. When breath analyzers are used, it is important that there be a period of at least 15 minutes before testing during which no alcohol ingestion, smoking, food or drink consumption, or vomiting has taken place to avoid contamination of the breath specimen by alcohol in the mouth. Some alcohol-containing mouthwashes may produce legally significant breath alcohol levels at 2 minutes after applying the mouthwash, but not at 10 minutes after use. Ketone bodies in patients with diabetic acidosis may interfere with breath ethanol measurement. One further advantage of breath testing in the field is the usefulness of a negative test for ethanol in a person whose behavior suggests effects of alcohol; this result could mean a serious acute medical problem that needs immediate attention.
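
The breath-to-blood calculation described above is a single multiplication by the conventional 2,100:1 factor. The sketch below shows only that arithmetic; the breath value used is a hypothetical example, and, as noted in the text, individual partition ratios vary.

# Sketch of the conventional breath-to-blood conversion described above:
# whole blood ethanol is estimated by multiplying the measured breath ethanol
# concentration by 2,100. As noted in the text, individual breath/blood
# ratios vary, so any single factor misestimates some subjects.

BREATH_TO_BLOOD_FACTOR = 2100

def breath_to_blood(breath_ethanol_mg_per_L):
    """Estimate whole blood ethanol (mg/L) from end-expiratory breath ethanol (mg/L)."""
    return breath_ethanol_mg_per_L * BREATH_TO_BLOOD_FACTOR

breath_value = 0.476   # mg ethanol per liter of breath (hypothetical reading)
blood_mg_per_L = breath_to_blood(breath_value)
print(f"Estimated whole blood ethanol: {blood_mg_per_L / 10:.0f} mg/100 ml")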

Legal use of blood alcohol assay. Most courts of law follow the recommendations of the National Safety Council on alcohol and drugs (the following ethanol values are whole blood values):

  • Below 0.05% (50 mg/100 ml): No influence by alcohol within the meaning of the law.
  • Between 0.05% and 0.10% (50-100 mg/100 ml): A liberal, wide zone in which alcohol influence usually is present, but courts of law are advised to consider the person’s behavior and circumstances leading to the arrest in making their decision.
  • Above 0.10% (100 mg/100 ml): Definite evidence of being “under the influence,” since most persons with this concentration will have lost, to a measurable extent, some of the clearness of intellect and self-control they would normally possess.

Based on the work of Rainey, the minimal serum alcohol level that would correspond to a whole blood alcohol level of 0.10% (100 mg/100 ml, w/v) with 95% certainty is 140 mg/100 ml (30.4 mmol/L) and at 99% certainty is 149 mg/100 ml (32.3 mmol/L).

Some organizations, including the American Medical Association (AMA) Council on Scientific Affairs (1986), suggest adopting 0.05% blood alcohol content as per se evidence of alcohol-impaired driving.

Estimating previous blood alcohol levels. In certain situations it would be desirable to estimate the blood alcohol level at some previous time from the results of a subsequent alcohol level. The usual method for this is the Widmark equation: P = A + (F × T), where P is the concentration of blood alcohol (in milligrams per liter) at the previous time, A is the concentration of blood alcohol (in milligrams per liter) when it was measured, F is a factor (or constant) whose value is 130 (in milligrams per liter per hour), and T is the time (in hours) elapsed between the previous time in question and the time the blood alcohol was actually measured.

There is considerable controversy regarding the usefulness of the Widmark equation. The equation is valid for a person only in the postabsorptive state (i.e., after the peak blood alcohol level is reached). Time to peak is most often considered to be 0.5-2.0 hours, so that the blood specimen must be drawn no earlier than 2 hours after the beginning of alcohol intake. The Widmark equation is based on kinetics of alcohol taken during fasting. Food increases alcohol elimination, so that food would cause the Widmark equation to overestimate the previous alcohol level. The factor (constant) of 130 is not necessarily applicable to any individual person, since a range of experimentally measured individual values from 100-340 has been reported.
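
A minimal sketch of the back-extrapolation follows, using the Widmark form P = A + (F × T) with the factor of 130 mg/L per hour quoted in the text. As the text emphasizes, it applies only in the postabsorptive state, and the factor varies widely between individuals.

# Minimal sketch of the Widmark back-extrapolation described in the text:
# P = A + (F * T), where A is the measured blood alcohol (mg/L), F is the
# elimination factor (130 mg/L per hour in the text; individual values of
# about 100-340 have been reported), and T is the number of hours between
# the earlier time of interest and the time of measurement. Valid only in
# the postabsorptive state (after the blood alcohol peak).

def widmark_previous_level(measured_mg_per_L, hours_elapsed, factor=130.0):
    return measured_mg_per_L + factor * hours_elapsed

measured = 800.0   # mg/L (80 mg/100 ml) measured now
for hours in (1, 2, 3):
    estimate = widmark_previous_level(measured, hours)
    print(f"{hours} h earlier: about {estimate / 10:.0f} mg/100 ml")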

Clinical and laboratory effects of alcohol. Alcohol has a considerable number of metabolic and toxic effects that may directly or indirectly involve the clinical laboratory. Liver manifestations include gamma-glutamyltransferase (GGT; formerly gamma-glutamyl transpeptidase) elevation, fatty liver, acute alcoholic hepatitis (“active cirrhosis”), or Laennec’s cirrhosis, and may lead indirectly to bleeding from esophageal varices or to cytopenia either from hypersplenism or through other mechanisms. RBC macrocytosis is frequently associated with chronic alcoholism, and nutritional anemia such as that due to folic acid deficiency may be present. Other frequently associated conditions include acute pancreatitis, hypertriglyceridemia, alcoholic gastritis, alcoholic hypoglycemia, various neurologic abnormalities, and subdural hematoma. The chronic alcoholic is more susceptible to infection. Finally, alcohol interacts with a variety of medications. It potentiates many of the CNS depressants, such as various sedatives, narcotics, hypnotics, and tranquilizers (especially chlordiazepoxide and diazepam). Alcohol is a factor in many cases of overdose, even when the patient has no history of alcohol intake or denies intake. The presence of alcohol should be suspected when toxicity symptoms from barbiturates or other medications are associated with blood levels that normally would be considered safe. Alcohol may antagonize the action of various other medications, such as coumarin and phenytoin. Alcohol intake in pregnancy has been reported to produce increased rates of stillbirth and infant growth deficiency as well as a specific “fetal alcohol syndrome.” Fetal alcohol syndrome includes a particular pattern of facial appearance, postnatal growth deficiency with normal bone age, various skeletal and organ malformations, and various neurologic abnormalities (including average IQ below normal).

Ethanol is one of a number of substances (other alcohols, lactic acid, etc.) that elevate serum osmolality using freezing point depression methods. This creates a gap between measured osmolality and calculated osmolality. Osmolality using vapor pressure instruments is not affected by ethanol.
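
The gap mentioned above is simply measured minus calculated osmolality. The sketch below uses one commonly quoted formula for calculated osmolality (2 × sodium + glucose/18 + BUN/2.8 in conventional units) and an approximate ethanol contribution of 1 mOsm/kg per 4.6 mg/100 ml; these figures are not taken from the text and are assumptions for illustration.

# Illustrative osmolal gap sketch. The calculated-osmolality formula
# (2 x sodium + glucose/18 + BUN/2.8, conventional units) and the approximate
# ethanol contribution of 1 mOsm/kg per 4.6 mg/100 ml are commonly quoted
# figures assumed here for illustration; they are not taken from the text.

def calculated_osmolality(na_meq_L, glucose_mg_dl, bun_mg_dl):
    return 2 * na_meq_L + glucose_mg_dl / 18.0 + bun_mg_dl / 2.8

measured = 340.0   # mOsm/kg by freezing point depression
calc = calculated_osmolality(na_meq_L=140, glucose_mg_dl=90, bun_mg_dl=14)
gap = measured - calc
print(f"Osmolal gap: {gap:.0f} mOsm/kg")
print(f"Ethanol that could account for the gap: about {gap * 4.6:.0f} mg/100 ml")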

Laboratory screening for alcoholism. Various tests have been used to screen for chronic alcoholism. Of these, the most commonly advocated is the GGT. This test and the AST reflect the effect of alcohol on the liver. A third possibility is mean corpuscular volume (MCV), the average size of the patient RBC. This reflects macrocytosis induced by liver disease and possibly also by folic acid deficiency in alcoholics. In heavy drinkers or alcoholics, GGT has a sensitivity of about 70% (literature range, 63%-81%), MCV detects about 60% (26%-90%), and the AST value is elevated in about 50% (27%-77%). Most (but not all) reports indicate some correlation in likelihood and degree of GGT elevation with the amount and frequency of alcohol consumption. Thus, heavy drinkers are more likely to have GGT elevations, and these elevations are (on the average) higher than those of less heavy drinkers. However, there are many exceptions. There is some disagreement as to whether so-called social drinkers have a significant incidence of elevated GGT levels. The majority of investigators seem to believe that they do not.

Other biochemical abnormalities associated with alcoholism (but found in <40% of cases) include hypophosphatemia, hypomagnesemia, hyponatremia, hypertriglyceridemia, and hyperuricemia.

An isoform of transferrin that contains fewer sialic acid molecules than normal transferrin and thus is called carbohydrate-deficient transferrin has been advocated by some researchers as a screening test for alcohol abuse. A few studies claim that in alcoholism its levels become elevated more often than those of GGT and that therefore it is a much more specific indicator of alcohol abuse. One study claimed 55% sensitivity in detecting moderate alcohol intake and nearly 100% sensitivity in heavy chronic drinkers. A commercial assay kit is available. However, at present the test would most likely have to be performed in a large reference laboratory.

Tests for Tobacco Use. In some instances, such as in smoking-cessation clinics, life insurance company examinations, and tests to determine degree of passive exposure to tobacco smoke, it is desirable to detect and quantitate tobacco exposure. The modalities that have been investigated are carboxyhemoglobin (based on effect of carbon monoxide generated by tobacco combustion), thiocyanate (a metabolite of cyanide derived from tobacco tar), and cotinine (a metabolite of nicotine). Most current tests are based on thiocyanate or cotinine. Thiocyanate is absorbed in the lungs and has a biological half-life of about 14 days. Ingestion of certain vegetables can falsely elevate serum thiocyanate levels. Cotinine is specific for nicotine, is not affected by diet, has a serum within-day variation of about 15%-20%, and has a biological half-life of about 19 hours. Thus, cotinine tests become negative after tobacco abstinence of a week or less, whereas thiocyanate requires considerably longer before becoming nondetectable. Also, thiocyanate can be assayed chemically and less expensively than cotinine, which is done by immunoassay. Nevertheless, because cotinine is specific for nicotine and is affected only by active or passive exposure to tobacco, cotinine seems to be favored by investigators. Cotinine can be assayed in serum, saliva, or urine; the levels are higher in urine.

Drugs of Abuse

Testing for drugs of abuse usually occurs in two circumstances: possible or known overdose or testing of clinically well persons to detect drug use. Overdose will be discussed in the section on toxicology. Drug screening has its own unique problems. For example, it is necessary to provide legal chain of custody protection to specimens so that each time a specimen changes hands the person receiving it documents this fact and thereby becomes the theoretical protector of the specimen. Another difficulty is attempts by some patients to invalidate the tests if the tests are performed on urine. This may involve diluting the urine specimen, adding substances that might interfere with the test, or substituting someone else’s specimen. Possible dilution can be suspected or detected by specimen appearance (appearance suggesting water), very low specific gravity, or specimen temperature less than or more than body temperature. One investigator has found normal urine temperature immediately after voiding to be 97°-100°F (36°-38°C); the National Institute of Drug Abuse (NIDA) current guidelines are 90.5°-99.8°F (32.5°-37.7°C). Addition of foreign substances may be detected by unusual color or other appearance, low specimen temperature, or by unusually low or high specimen pH (normal urine pH is generally considered to be 4.5-8.0). Sometimes there may be an unusual smell. Specimen substitution by the patient may be suspected by specimen temperature lower than body temperature. A fluid without creatinine is probably not urine. Patient identity should be verified, by photograph if possible, to prevent a substitute from providing the specimen.
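
The specimen-validity observations listed above (temperature, pH, appearance, creatinine) can be gathered into a simple checklist. The sketch below uses the NIDA temperature window and the usual urine pH range quoted in the text; the creatinine cutoff of 20 mg/dl is an illustrative assumption rather than a quoted figure.

# Sketch of a urine specimen validity checklist based on the observations in
# the text: temperature within the NIDA window (90.5-99.8 F), pH within the
# usual urine range (4.5-8.0), and creatinine present (a fluid without
# creatinine is probably not urine). The 20 mg/dl creatinine cutoff is an
# illustrative assumption, not a quoted figure.

def urine_validity_flags(temp_f, ph, creatinine_mg_dl):
    flags = []
    if not 90.5 <= temp_f <= 99.8:
        flags.append("temperature outside 90.5-99.8 F (possible substitution or dilution)")
    if not 4.5 <= ph <= 8.0:
        flags.append("pH outside 4.5-8.0 (possible adulteration)")
    if creatinine_mg_dl < 20:
        flags.append("low creatinine (possible dilution or a fluid that is not urine)")
    return flags or ["no validity flags raised"]

for flag in urine_validity_flags(temp_f=85.0, ph=9.2, creatinine_mg_dl=5):
    print(flag)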

A variety of methods can be used for initial screening. Currently, the two most popular are thin-layer chromatography (TLC) and some form of immunoassay. The Syva Company EMIT immunoassay was one of the first to be introduced and remains the most popular. Due to the possibility of cross-reacting substances and the implications of a positive test result to the patient, as well as legal considerations, positive screening test results should be confirmed by a method that uses a different detection principle. Currently, the method of choice is gas chromatography followed by mass spectrometry (GC/MS). Instruments are available that combine both components. Gas chromatography separates the various substances in the mixture, and the mass spectrometer bombards each substance from the chromatographic separation with electrons to ionize the constituents. The resulting ions are separated on the basis of mass/charge ratio, and a mass spectrum is generated for each substance by plotting the number of ions detected at each mass/charge value. The spectrum is a fingerprint that identifies the compound. Therefore, the gas chromatography element separates the constituents, and the mass spectrometry component identifies them.

Marijuana (cannabis). The most important active component of marijuana is delta-9-tetrahydrocannabinol (delta-9-THC, usually, although technically incorrect, abbreviated as THC). After inhalation of marijuana, THC can be detected in blood in about 1-2 minutes and reaches a peak in about 7 minutes. The sensation attributed to THC, however, does not appear until about 20-30 minutes after the serum peak, at a time when the serum level of THC is considerably lower. It is fat soluble and is quickly deposited into many tissues, including the brain. At the same time, the THC that reaches the liver is metabolized to a compound with psychogenic properties called “11-hydroxy-THC,” which then itself is rapidly metabolized to various compounds, the principal metabolite being a nonpsychogenic water-soluble compound conjugated to glucuronide molecules called “carboxy-THC.” About 30 minutes after absorption into tissues, THC is slowly released back into the blood, where liver metabolism continually reduces its body availability. If more marijuana is smoked before the previous amount has been eliminated, more THC will be deposited in tissues (up to a saturation point), and total elimination takes longer. Shortly after reaching the serum peak, the serum level of THC begins to fall due to tissue absorption and liver metabolism even if smoking continues, reaching only 10% of the peak levels in 1-2 hours. The serum half-life of THC after inhalation is about 0.5-1.5 hours. Carboxy-THC reaches a serum peak at about 20-30 minutes, at which time it begins to exceed THC. At 1 hour after inhalation, about 15% of plasma cannabinoids is THC and about 40% is carboxy-THC. Both THC and carboxy-THC are nearly all bound to plasma proteins (predominantly lipoproteins), and their concentration in plasma is about twice that of whole blood. About two thirds of the cannabinoid metabolites are excreted in feces and about one third in urine. The body elimination half-life of THC is about 24 hours (range, 18-30 hours), and the elimination half-life of carboxy-THC, the principal metabolite, is 3-6 days. Since the elimination half-life of THC is about 1 day and since steady state is reached after five half-lives, if the individual smokes roughly the same number of marijuana cigarettes each day, there will be equilibrium between intake and elimination of THC in about 5 days. Carboxy-THC has a longer elimination half-life, so that constant or heavy use of marijuana greatly prolongs the time that carboxy-THC will be detectable in the urine. Marijuana can be eaten as well as smoked. Absorption from the GI tract is slower and less predictable than through the lungs. Onset of the psychogenic effect occurs about 1-3 hours after ingestion of marijuana. Serum levels of 11-hydroxy-THC are considerably higher after oral intake of cannabis than levels after smoking.
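
The half-life reasoning above (steady state or near-complete elimination after roughly five half-lives) can be illustrated numerically. The sketch below applies simple first-order decay using the half-lives quoted in the text; it is not a pharmacokinetic model of any individual patient.

# Numerical illustration of the half-life reasoning in the text: with an
# elimination half-life of about 24 hours for THC, roughly five half-lives
# (about 5 days) are needed to approach steady state or to eliminate most of
# a given amount. Simple first-order decay; not an individual PK model.

def fraction_remaining(hours_elapsed, half_life_hours):
    return 0.5 ** (hours_elapsed / half_life_hours)

THC_HALF_LIFE_H = 24                 # elimination half-life quoted in the text
CARBOXY_THC_HALF_LIFE_H = 4 * 24     # mid-range of the 3-6 day figure in the text

for days in (1, 3, 5):
    thc = fraction_remaining(days * 24, THC_HALF_LIFE_H)
    cthc = fraction_remaining(days * 24, CARBOXY_THC_HALF_LIFE_H)
    print(f"Day {days}: about {thc:.0%} of THC and {cthc:.0%} of carboxy-THC remain")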

Urine assay. Carboxy-THC is the major metabolite of THC and is the one usually assayed in urine. Length of detectable urinary excretion varies with the amount of marijuana used per day, which, in turn, depends on the type of material (e.g., ordinary marijuana, hashish, or other forms) and the number of times per day of administration. There is also some effect from the route of use (smoking or ingestion) and individual tolerance or variation in the rate of metabolism. There are also assay technical factors. Most investigation of urine carboxy-THC detection has used detection cutoff levels of either 20 ng/ml or 100 ng/ml. The 100 ng/ml cutoff point was used in order to prevent claims that inhaling smoke from someone else’s marijuana cigarette might produce a positive urine test result. Actually, several studies have tested persons exposed to prolonged inhalation from cigarettes of other persons in small, confined areas (severe passive inhalation), and found that only a few persons had positive urine tests at the 20 ng/ml cutoff level. The longest time interval for a positive test was 3 days. Under ordinary experimental conditions of passive exposure, only a few individuals had detectable urine levels at the 20 ng/ml cutoff; detectability usually disappeared in less than 24 hours and almost always by 48 hours. Urine specimens should be frozen if testing is delayed, to preserve carboxy-THC values.

Saliva assay. It has been reported that THC remains in saliva up to 5 hours after cannabis inhalation. Therefore, detection of THC in saliva theoretically might indicate recent use of marijuana. To date, saliva assay has not been widely used.

Time period after use that marijuana presence can be detected. After a single cigarette containing usual amounts of THC is smoked, urine levels become detectable after about 1 hour and remain detectable at the 100 ng/ml cutoff level for 1-3 days and at the 20 ng/ml cutoff level for 2-7 days (therefore, the total detectable time period at the 20 ng/ml level is about 5-7 days, with a range of 2-10 days). For example, in one study those patients tested after smoking a single cigarette showed urine results more than 100 ng/ml for up to 3 days and results more than 20 ng/ml for an additional 5-8 days. Smoking more than one cigarette on the same day for 1 day only extends the detectability time about 2 days. In chronic heavy users, after smoking is stopped, urine results can remain positive at the 20 ng/ml level in some individuals up to 30-40 days. In one report, chronic marijuana users with recent heavy intake had urine assays more than 100 ng/ml for 7-14 days, followed by assays greater than 20 ng/ml for an additional 7-14 days. However, in another study of chronic marijuana smokers, results of about 25% of those who had smoked within 2 days of testing were negative at the 100 ng/ml level.

Interpretation of test results. Carboxy-THC is not psychotropically active, and because of the variability of excretion due to the different factors just noted, detection of this substance in urine (if confirmed) indicates only that the patient has used marijuana in the recent past without providing evidence that correlates with physical or mental effects of marijuana. A serum THC level greater than 2 ng/ml is thought to indicate a probability that an individual will have some undesirable effects. In some circumstances, such as patient actions that may have been influenced by marijuana, it might be useful to obtain a THC serum level immediately as an indicator of current status and to compare with the urine carboxy-THC level. If the question arises whether marijuana use is ongoing, monitoring the urine periodically (e.g., every 4-5 days) should demonstrate a progressive downward trend in the values if smoking has indeed stopped, although there may be some fluctuations during this time. Initially positive test results with any screening procedure must be verified by a confirmatory procedure (such as GC/MS) if the positive results will lead to some significant action. The different sensitivity levels of different tests must also be kept in mind, as well as the effect of urine concentration or dilution.

Cocaine. Cocaine can be self-administered intranasally, by smoking, or intravenously. It may also be taken orally, but this is not common since gastric juice inactivates most of the drug. Intranasal administration produces peak blood levels in 30-40 minutes (range, 15-60 minutes). About 80% of the dose reaches the bloodstream. Intravenous administration produces peak levels in 3-5 minutes. Smoking pharmacokinetics are similar to those of IV use, with peak levels reached in about 5 minutes, although only an average of about 45% of the dose reaches the bloodstream. The most common form used for smoking is known as “free-base,” which is derived from the active ingredient cocaine hydrochloride by separating the cocaine base from the hydrochloride ions, usually by extracting the cocaine base in a solvent. Already-processed cocaine base is often called “crack.” This is the most potent form of cocaine. Cocaine is very lipophilic and is rapidly taken up by tissues containing lipid, such as the brain. The half-life of cocaine in the body after the serum peak is about 1.5 hours (range, 0.5-2 hours) for all methods of drug intake. About 25%-40% of the dose that reaches the bloodstream is converted to the major metabolite benzoylecgonine by hydrolysis in fluids and peripheral tissues and excreted in the urine. Benzoylecgonine has a body half-life of 7-9 hours, which is about 6 times as long as that of cocaine. About 20%-25% of serum cocaine is converted to other metabolites, with roughly equal contribution by the liver and by serum cholinesterase. About 1% is excreted unchanged in the urine.

Detection of cocaine. Cocaine or its metabolites can be measured in serum or in urine. The serum half-life of cocaine is short, and cocaine from a single dose is usually nondetectable in 6-10 hours, although it may be detectable longer with very sensitive methodology. Multiple doses may prolong serum detectability. Cocaine in urine is detectable for only about 8-12 hours after a single dose. Cocaine is usually investigated in urine through detection of its metabolite benzoylecgonine. This is detectable in urine beginning 1-4 hours after a cocaine dose. How long it will remain detectable depends on the quantity ingested, whether dosage is single or multiple, individual patient variation, and the sensitivity of the detection method. RIA is the most sensitive of the screening methods (5 µg/L) and may detect cocaine metabolites as long as 7 days after a large dose. Enzyme immunoassay (EIA, an EMIT variant) is less sensitive (300 µg/L) and would detect the presence of the same dose for about 2-3 days. Since some false positive results can be obtained with any of the screening tests, a positive result must be verified by a confirmatory method such as GC/MS. The screening methods are designed to detect benzoylecgonine, which remains detectable considerably longer than cocaine, so that a positive urine screening test result does not mean the patient was under the influence of cocaine at the time he or she produced the urine specimen, and the result usually will not predict (except as an estimate involving considerable time variation) when the last dose was taken. Proof of use at a specific time requires detection of cocaine itself in serum or other body tissue. This is usually done by GC/MS. Specimens should be placed in ice and the serum frozen to prevent hydrolysis of cocaine to its metabolites.

Phencyclidine. Phencyclidine (PCP) effects are frequently not recognized; in one study, only 29% of patients were correctly diagnosed on admission. PCP is a water-soluble powder that is administered by smoking water-dissolved drug applied to some smoking material or by ingestion. About 70% of the dose reaches the bloodstream by either route. Peak serum levels are reached 5-15 minutes after smoking. Peak levels after oral intake are reached after about 2 hours. The body half-life of PCP after serum peak levels are reached varies considerably, averaging about 18 hours with a range of 8-55 hours, and is somewhat dose dependent. About 10% of the dose is excreted in the urine unchanged, and the remainder as various metabolites, none of which greatly predominates. PCP or its metabolites are often detected in urine for about 1 week. In some cases it may be detected for several days to several weeks, again depending on the quantity administered, whether administration was acute or chronic, and the sensitivity of the detection method. Drug excretion can be increased by urine acidification. Serum or urine levels do not correlate well with severity of symptoms. PCP or some of its metabolites can be detected by RIA, EIA, TLC, and other techniques. These methods differ in sensitivity and each method has some substances that may cross-react. GC/MS is the best confirmatory method.

Amphetamines. Methamphetamine is used more frequently than the other amphetamines. Amphetamines can be administered orally, intravenously, or by smoking. Tolerance frequently develops, necessitating larger doses to achieve desired effects. Other drugs are frequently used at the same time. Absorption from the GI tract is fairly rapid. Body half-life is 4-24 hours. About half the dose is metabolized in the liver. About 45% of the methamphetamine dose is excreted in urine unchanged, about 5% as amphetamine, and the remainder as other metabolites. Amphetamines are usually detectable in urine by 3 hours after administration of a single dose, and screening test results can be positive for 24-48 hours (dependent to some extent on the size of the dose and the sensitivity of the method). A positive result for amphetamines in urine generally means use in the last 24-48 hours. Screening methods include RIA, EIA, TLC, and other techniques. A substantial number of over-the-counter medications for colds or for weight reduction contain amphetamines or amphetamine analogs that may cross-react in one or more screening tests. Other medications may also interfere. GC/MS is the best confirmatory method.

Morphine and related alkaloids. Morphine and codeine are made from seed pods of the opium poppy. Heroin is made from morphine. Morphine and heroin are usually injected intravenously. About 10% (range 2%-12%) of a morphine dose is excreted unchanged in the urine, and about 60%-80% of the dose is excreted in urine as conjugated glucuronides. The body half-life is 1.7-4.5 hours. Heroin is rapidly metabolized to morphine, with about 7% of the dose excreted as morphine and 50%-60% excreted as conjugated morphine glucuronides. Codeine is excreted primarily as conjugated codeine glucuronides in the urine, but a small amount (<10%) is metabolized to morphine and morphine conjugated glucuronides, which appear in the urine. Poppy seeds are used as a filling for baked goods and also are used unprocessed; they are sold legally even though they contain some natural morphine and codeine. The amount of opiate alkaloids in poppy seeds is not sufficient to produce any symptoms or noticeable sensation, but consumption of a moderate amount of this material can result in detectable concentrations of morphine in the urine that can last as long as 36-60 hours.

Screening tests for morphine and other opiates are similar to those for other drugs of abuse: RIA, EIA (EMIT and others), TLC, and in addition a hemagglutination inhibition assay. Most of these methods cannot differentiate between codeine and morphine. Also, since codeine metabolism results in a small but measurable amount of morphine conjugates, prescription medications containing codeine for pain relief or cough suppressive effects may produce positive test results for morphine. In general, if the concentration of codeine greatly exceeds that of morphine, the parent drug is probably codeine. In general, excluding prescription drugs, the presence of morphine in the urine indicates nonlegal use of morphine, heroin, or codeine in the past 1-2 days. Detection of these compounds should be confirmed, and the compound identified, using GC/MS. In addition, GC/MS can differentiate between poppy seed ingestion and heroin intake by detecting and measuring 6-monoacetylmorphine, a metabolite of heroin that is not present in poppy seeds or in the urine of persons who ingest poppy seeds.

Cyclosporine

Cyclosporine (previously called “cyclosporin A”) is a compound derived from a soil fungus that has strong immunosuppressive activity and is widely used to prevent transplant rejection. Cyclosporine is thought to inhibit helper T-cell function with minimal effect on B-cell function. Cyclosporine can be administered orally or intravenously. If given orally, it is absorbed through the lymphatics of the distal ileum, with considerable variability in time and degree of absorption. During the immediate postabsorptive period, 8%-60% of the dose is absorbed, although later absorption may improve somewhat. After an oral dose, peak serum levels are reached in about 2.5 hours, and subsequent body elimination half-life is about 4 hours. There is wide variation of these two parameters between individual patients (e.g., elimination time variation of 4.3-53 hours in renal transplant patients). About 50%-60% of the absorbed dose is bound to RBCs, 25%-35% is in plasma, and about 10%-15% is bound to leukocytes. The plasma component is about 60% bound to high-density lipoproteins, about 25% to low-density lipoproteins, and about 10% to other plasma proteins, leaving about 5% unbound. Almost all of the drug is metabolized by the liver microsome system into various derivatives that are excreted in bile and feces, with only 1%-6% of the metabolites excreted in urine. There are several serious side effects. About 25% of transplant patients show some degree of renal toxicity. Lesser numbers develop hypertension or liver toxicity.

Cyclosporine assay. The blood concentration of cyclosporine cannot be predicted from an oral dose. In addition, there is a narrow balance between insufficient immunosuppression with too little drug and inducement of toxicity with too much. Therefore, TDM is considered essential. However, there is considerable controversy in the literature regarding the technical details of cyclosporine TDM. Either whole blood or plasma can be analyzed. Distribution of the drug between plasma and RBCs is temperature dependent, with decrease in serum concentration as temperature decreases. Therefore, to obtain plasma, one must equilibrate the blood at a fixed temperature, and this temperature will influence the assay value. On the other hand, whole blood assay results are affected by the patient hematocrit. Whole blood assay is recommended by the AACC Task Force on Cyclosporine Monitoring (1987). The two widely used assay methods are HPLC and RIA. RIA produces higher values than HPLC and includes some cross-reacting metabolites with the cyclosporine measurement. The HPLC assay is more specific since it does not include metabolites. However, there are many published HPLC procedures that vary in one or more technical details. At present, there is no consensus on a single analytic protocol, and since different methods and technical variations produce different results, an exact therapeutic range has not been established. Average values from the literature are 250-1,000 µg/L using whole blood by RIA, 50-200 µg/L using plasma by RIA, and 100-500 µg/L using whole blood by HPLC. Trough levels are usually obtained. Certain medications affect cyclosporine assay, such as phenytoin, which activates the liver microsome system.

FK-506 (tacrolimus). This is a recent bacteria-derived macrolide immunosuppressive agent that selectively suppresses both helper/inducer and cytotoxic T-lymphocyte activity, similar to the action of cyclosporine. It appears to have immunosuppressive activity equal to or greater than that of cyclosporine (especially in liver transplants) with substantially less toxicity. However, nephrotoxicity may occur. Use of medications inhibiting liver microsomal activity (e.g., cimetidine, erythromycin, ketoconazole) increases FK-506 plasma concentration. Assay for FK-506 is possible using monoclonal antibody enzyme immunoassay methods, although these are “first generation” and need to be improved. The therapeutic range is also not standardized and is probably method dependent.

Antibiotics

Gentamicin. Methods of estimating antibiotic therapeutic effectiveness have been discussed elsewhere (chapter 14). Several antibiotics possess therapeutic ranges whose upper limits border on toxicity. Serum assays for several of these have been developed, most commonly using some type of immunoassay. One example will be used to illustrate general principles. Gentamicin (Garamycin) is one of the aminoglycoside antibiotics that is active against gram-negative organisms, including Pseudomonas aeruginosa. Unfortunately, side effects include ototoxicity and nephrotoxicity. Drug excretion is mainly through renal glomerular filtration. Serum peak levels and residual (trough) levels both provide valuable information. Residual levels are measured just before the next dose. Values at this time correlate best with nephrotoxicity, especially when serum levels are greater than 2 µg/ml. Specimens for peak level determination are obtained approximately 30 minutes after the end of IV infusion and 1 hour after intramuscular injection. Peak levels correlate best with therapeutic effectiveness (i.e., whether adequate serum levels are present) and possibly with ototoxicity. Normal peak values are usually considered 4-8 µg/ml. Values less than 4 µg/ml may be ineffective, whereas those greater than 10 µg/ml predispose to toxicity. Gentamicin assay is desirable because serum levels differ considerably among patients receiving the same dose, and serum gentamicin half-life is equally variable. Standard doses or nomograms based on serum creatinine level fail to predict blood concentration accurately for peak or residual levels in a substantial number of patients even with adequate renal function. When renal function is impaired or when nephrotoxic antibiotics have previously been administered, serum assay becomes essential. It should be mentioned that peak or residual levels within accepted reference limits do not guarantee safety, since some studies have shown onset of renal function decrease in the presence of acceptable serum values.
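
As a purely illustrative aid, the interpretive limits just quoted can be expressed as a minimal sketch in Python; the function name and the threshold constants are taken from the values cited above and are not part of any laboratory system or manufacturer's software.

# Minimal sketch: interpret gentamicin peak and residual (trough) levels
# using the limits quoted in the text (all values in ug/ml).

def interpret_gentamicin(peak_ug_ml, trough_ug_ml):
    """Return interpretive comments based on the thresholds cited above."""
    comments = []
    if peak_ug_ml < 4:
        comments.append("Peak below 4 ug/ml: may be therapeutically ineffective.")
    elif peak_ug_ml > 10:
        comments.append("Peak above 10 ug/ml: predisposes to toxicity.")
    else:
        comments.append("Peak within the usual 4-8 (up to 10) ug/ml range.")
    if trough_ug_ml > 2:
        comments.append("Residual above 2 ug/ml: correlates with nephrotoxicity risk.")
    else:
        comments.append("Residual 2 ug/ml or less.")
    return comments

print(interpret_gentamicin(6.5, 1.4))
print(interpret_gentamicin(11.2, 2.6))

Acceptable assay values, as the text notes, do not by themselves guarantee safety; a sketch of this kind only restates the published cutoffs.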

Vancomycin. Only a small amount of oral vancomycin is absorbed, so that the oral form is used to kill GI tract bacteria such as Clostridium difficile. Intravenous medication is used for other infections. Intravenous vancomycin is about 50%-60% bound to serum albumin, and 80%-90% is excreted unchanged in the urine. The serum half-life is 2-3 hours in children and 4-8 hours in adults with normal renal function. In renal failure, the serum half-life becomes 7-8 days (range, 4-9 days), and instead of the usual adult IV dose of 500 mg every 6 hours, only 500-1,000 mg once per week is sufficient. Peak and residual (trough) levels are usually recommended. Residual levels are usually obtained just before a dose is given; the reference values are 5-10 µg/ml. Unfortunately, different investigators do not agree on when to draw specimens after the end of IV infusion for peak values, with times suggested including immediately, 15 minutes, 30 minutes, and 2 hours after the infusion. Vancomycin serum levels apparently fall rapidly for a time after the end of IV infusion and then more slowly. At 15 minutes after the end of infusion, serum values of 25-30 µg/ml are equivalent to 30-40 µg/ml levels (the most commonly accepted peak value range) at the end of infusion.

Theophylline (Aminophylline)

Theophylline is used primarily as a bronchodilating agent for therapy of asthma. Over the therapeutic range there is a reasonably linear correlation between dosage and therapeutic effect. The drug is administered intravenously and orally. Oral medication is available in regular (noncoated or liquid) and slow-release (SR) forms. SR forms are available in twice-daily, and even once-daily, dosages. For most (but not all) regular oral preparations, absorption takes place predominantly in the small intestine, absorption is essentially complete, and food usually does not interfere significantly. Absorption rates are more apt to vary, and time to serum peak is less predictable, among the different SR preparations. In addition, food (especially a high-fat meal) is more likely to interfere with absorption of some SR preparations. One investigator recommends dose intake 1 hour before or 2 hours after meals when using SR preparations influenced by food. For the regular oral medication, time to peak (for adults) is about 2-3 hours, half-life is 4-6 hours (range, 3-8 hours), and time to steady state is about 15-20 hours. For children, half-life is more variable (1-8 hours) and time to steady state is also more variable (5-40 hours). Time to peak for the oral SR preparation is about 5 hours. About 50%-60% (range, 40%-65%) of theophylline is bound to serum albumin. Binding is less in neonates and at lower pH (acidosis). About 90% is metabolized in the liver, and most of the metabolites, plus about 10%-15% of unchanged theophylline, are excreted by the kidneys. Therefore, except in the first several months of life, renal function is not a major factor in theophylline serum concentration. Adults who smoke tobacco or marijuana and children excrete theophylline somewhat more rapidly (decreased serum half-life) than nonsmoking adults. Factors that reduce theophylline clearance (increased serum half-life) include young infant age (0-8 months), congestive heart failure, cor pulmonale, severe liver dysfunction, sustained fever, pneumonia, obesity, cessation of smoking, cimetidine, ciprofloxacin, and erythromycin-family antibiotics. Some theophylline assay methods may show partial interference (some false increase in values) from substances present in uremia. Children show more individual differences in theophylline clearance and as a group eliminate theophylline more rapidly than adults. In addition, one report indicated that in children a high-protein diet increased theophylline elimination and a high-carbohydrate diet decreased it.

Besides the factors just mentioned, therapy is complicated by the many theophylline preparations available, many of which vary significantly in theophylline content and the rate it is absorbed. Noncompliance is a constant problem in therapy and in the interpretation of theophylline blood levels, because low levels due to noncompliance may be misinterpreted as due to rapid metabolism or excretion. The reverse mistake can also be made. Another difficulty is the asthmatic who may already have taken one or more extra doses of theophylline before being seen by the physician.

There is a relatively narrow zone between the therapeutic range (10-20 µg/ml; 55-110 µmol/L) and values associated with toxicity. The degree of elevation over the reference range is not a reliable predictor of toxicity risk except in a very general way, since severe toxicity can develop in some patients at less than twice the upper limit of the reference range. Mild toxic symptoms may occur, but severe toxicity can also develop without warning. If there is a question about previous drug intake, a specimen for theophylline assay should be obtained before therapy is begun. Therapy can then be started and the dosage modified when assay results become available. Thereafter, when steady state is achieved, serum peak concentration should be measured (30 minutes after the dose for IV theophylline, 2 hours after the dose for regular theophylline, and about 5 hours [range, 3-7 hours, depending on the particular medication] after the dose for sustained-release forms). Theophylline is thus an exception to the general rule that the residual (trough) level is better than the peak level to monitor therapy.
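
The classification of a steady-state peak against this range can be illustrated with a minimal Python sketch. The conversion factor of about 5.55 µmol/L per µg/ml is the one implied by the quoted range; the function name and the example values are illustrative only.

# Minimal sketch: classify a steady-state theophylline peak against the
# 10-20 ug/ml (55-110 umol/L) range quoted above.

UG_ML_TO_UMOL_L = 5.55  # factor implied by the quoted therapeutic range

def classify_theophylline(peak_ug_ml):
    umol_l = peak_ug_ml * UG_ML_TO_UMOL_L
    if peak_ug_ml < 10:
        label = "below therapeutic range"
    elif peak_ug_ml <= 20:
        label = "within therapeutic range"
    else:
        label = "above therapeutic range; toxicity possible"
    return f"{peak_ug_ml:.1f} ug/ml ({umol_l:.0f} umol/L): {label}"

for level in (8.0, 15.0, 24.0):
    print(classify_theophylline(level))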

Antiarrhythmic Drugs

There is a large and ever-growing list of these medications, too many to include here. TDM data for some members of this group are summarized in Table 37-25. Several have been selected for more detailed discussion here.
Procainamide. Procainamide is used to control certain ventricular arrhythmias and can be given orally or intravenously. Only about 10% is bound to serum protein. Maintenance is usually achieved by oral medication. About 85% of the oral dose is absorbed, mostly in the small intestine. About 50% of the drug is excreted unchanged by the kidneys. About 50% is metabolized, predominantly by the liver. The major metabolite of procainamide is N-acetylprocainamide (NAPA), which constitutes about 25%-30% of the original dose (7%-40%). NAPA is produced in the liver by a process known as N-acetylation. It has antiarrhythmic properties about equal to that of its parent compound. About 10% of NAPA is bound to serum protein, and about 85% is excreted unchanged by the kidneys. It has a serum half-life about twice that of procainamide. Therefore, NAPA levels continue to rise for a time after procainamide levels have stabilized. There is approximately a 1:1 ratio of procainamide to NAPA after both have equilibrated. Poor liver function may decrease NAPA formation and produce a high ratio (> 1.0) of procainamide to NAPA (i.e., less NAPA relative to the amount of procainamide). Even though procainamide degradation may be decreased, only about 25%-30% of the procainamide dose is converted to NAPA in the liver, so procainamide levels are not affected as much as NAPA levels. On the other hand, poor renal function decreases NAPA excretion and decreases the procainamide/NAPA ratio to less than 1.0 (i.e., more NAPA relative to procainamide). Even though procainamide excretion may also be decreased, the amount of NAPA excreted through the kidneys is much higher than the amount of procainamide, so that poor renal function affects NAPA proportionally more than procainamide. Another factor is the acetylating process of the liver, which is an inherited characteristic. Isoniazid and hydralazine are also metabolized by this system. About one half of the population are slow acetylators and about one half are fast acetylators. Fast acetylation produces more NAPA (tending to produce a procainamide/NAPA ratio < 1.0), and slow acetylation produces less NAPA (procainamide/NAPA ratio > 1.0). Assessment of acetylation status is dependent on adequate renal function, since poor renal function can affect the procainamide/NAPA ratio. About 50% of patients on long-term procainamide therapy develop antinuclear antibodies, and up to 30% may develop a syndrome very similar to systemic lupus erythematosus. Slow acetylators are more likely to develop these conditions than fast acetylators.

Since both procainamide and NAPA have antiarrhythmic action and since several factors influence their levels and their relationship to each other, most authorities recommend that both be assayed and that therapeutic decisions be based on the sum of both rather than on either one alone. Therapeutic range for the combination of procainamide and NAPA is 10-30 µg/ml (42.50-127.47 µmol/L). Specimens for TDM are usually obtained just before the next scheduled dose to evaluate adequacy of dosage. Peak levels or specimens drawn during symptoms are needed to investigate toxic symptoms.
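
The recommendation to interpret the two results together can be shown in a minimal Python sketch. The combined 10-30 µg/ml range and the ratio interpretation come from the paragraphs above; the function name is illustrative, and the ratio comment assumes adequate renal function as noted in the text.

# Minimal sketch: combine procainamide and NAPA results and report their ratio.

def procainamide_summary(procainamide_ug_ml, napa_ug_ml):
    total = procainamide_ug_ml + napa_ug_ml
    ratio = procainamide_ug_ml / napa_ug_ml
    return {
        "combined_ug_ml": total,
        "combined_in_10_to_30_range": 10 <= total <= 30,
        # Ratio > 1.0 suggests relatively less NAPA (slow acetylation or poor
        # hepatic NAPA formation); ratio < 1.0 suggests relatively more NAPA
        # (fast acetylation or reduced renal excretion of NAPA).
        "procainamide_to_napa_ratio": round(ratio, 2),
    }

print(procainamide_summary(8.0, 7.5))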

There are two types of procainamide oral preparations, standard (relatively short acting) and sustained release (SR). For the standard type, peak absorption levels are usually reached in about 1.5 hours (range, 1-2 hours) after an oral dose. However, some persons absorb procainamide relatively slowly, and the peak may be delayed up to 4 hours after the dose, close to the time one would expect a trough level. In one study, this occurred about one third of the time. Therefore, some investigators recommend both peak and trough for initial evaluation. Patients with acute myocardial infarction or cardiac failure are more likely to have delayed absorption. Serum half-life is about 3 hours (2-4 hours). Time to steady state is about 18 hours (11-20 hours). Therefore, the half-life is considered a short one, and there is a greater fluctuation in serum values compared with an agent with a long half-life. The peak level after oral SR procainamide occurs about 2 hours after the dose (range, 1-3 hours) but may not occur until later in patients with slow absorption. Time to steady state is about 24-30 hours.

Lidocaine. Lidocaine (Xylocaine) hydrochloride is a local anesthetic that has antiarrhythmic properties. Used as an antiarrhythmic, it is generally given intravenously to patients who are seriously ill. Lidocaine is lipid soluble and distributes rapidly to many tissues. When it is given as a single bolus, plasma levels fall rapidly, with perhaps as much as a 50% decrease in about 20 minutes. On the other hand, drug given by IV infusion reaches a plateau rather slowly because so much of the drug is distributed to peripheral tissues. Therefore, administration is usually done with one or more bolus loading dose injections followed by IV infusion. The half-life of lidocaine is 1-2 hours, and time to steady state is 5-10 hours (5-12 hours). About 70% is protein bound; of the total that is protein bound, about 30% is bound to albumin and 70% to alpha-1 acid glycoprotein. Lidocaine is about 90% metabolized in the liver, with 5%-10% excreted unchanged by the kidneys. The major hepatic metabolites of lidocaine also have some antiarrhythmic effect. The primary metabolites are themselves further metabolized in the liver, with less than 10% of the primary metabolites being excreted unchanged in urine.

Conditions that produce an increase in plasma lidocaine levels are severe chronic liver disease (decreased drug inactivation), chronic renal disease (decreased excretion), and congestive heart failure (reduced volume of distribution). In acute myocardial infarction, there is increase in the binding protein alpha-1 acid glycoprotein and a subsequent increase in plasma total lidocaine values; however, bound drug is pharmacologically inactive, and the nonbound active fraction often does not increase. Propranolol has been reported to decrease lidocaine clearance, producing higher plasma values.

Complications related to lidocaine therapy have been reported in 6%-20% of cases. Therapeutic drug monitoring of lidocaine requires an assay method that produces results quickly. HPLC and EMIT are the two most frequently used methods. Colorimetric methods are also available. It has been recommended that lidocaine specimens be drawn 12 hours after beginning therapy and then daily. In seriously ill patients, in those whose arrhythmias persist in spite of lidocaine, and when lidocaine toxicity is suspected, assay every 12 hours could be helpful. The therapeutic range is 1.5-5 µg/ml.

Tocainide. Tocainide (Tonocard) is an analog of lidocaine that also is used to treat ventricular arrhythmias. Tocainide has some advantages over lidocaine since tocainide can be given orally and has a longer half-life (about 15 hours; range, 12-18 hours) due to much less first-pass hepatic metabolism. The half-life may be increased with severe liver disease or chronic renal failure. About 10% is bound to serum protein. The metabolites of tocainide are excreted in the urine and do not have antiarrhythmic activity. Peak serum levels are reached 1.5-2.0 hours after an oral dose. Steady state is reached in 3 days. Therapeutic range is 4-10 µg/ml. Assay is usually done by HPLC.

Quinidine. Quinidine has been used for treating both atrial and ventricular arrhythmias. There are two forms of quinidine: the sulfate and the gluconate. Both are available for oral administration in both regular and long-acting (SR) preparations. The gluconate form can be given intravenously. Oral regular quinidine sulfate has a time to peak value of about 2 hours (range, 1-3 hours), a serum half-life of about 6 hours (range, 5-8 hours), and a time to steady state of about 24 hours. Regular oral quinidine gluconate has a time to peak value of about 4 hours. SR quinidine sulfate (Quinidex) has a time to peak value of about 2 hours, a serum half-life of about 20 hours, and a time to steady state of about 4 days. SR quinidine gluconate (Quiniglute, Duraquin) has a time to peak value of about 4 hours and a half-life of about 10 hours. However, when the SR preparations are used, there is relatively little fall in serum levels after the initial dose before subsequent doses. About 80% of quinidine (literature range, 60%-90%) is bound to serum proteins. Quinidine is metabolized by the liver, with about 10%-20% excreted unchanged in urine by glomerular filtration. Urine excretion is influenced by urine pH.

Factors that may decrease quinidine levels include hypoalbuminemia, drugs that compete for albumin binding, and drugs that activate hepatic enzyme activity, such as phenytoin and phenobarbital. Factors that tend to increase quinidine levels include congestive heart failure, poor renal function (prerenal or intrinsic renal disease), and possibly severe liver disease. Renal excretion is increased by acidification of the urine and decreased by urine alkalinization.

Several methods are available for quinidine assay. The most commonly used are fluorometric procedures, with or without preliminary extraction steps. These measurements include quinidine and several of its metabolites. Certain other fluorescing compounds may interfere. Extraction eliminates some but not all of the metabolites. More specific methods include HPLC and EMIT. Values for the direct (nonextracted) fluorescent methods are about 50% higher than those using HPLC or EMIT (i.e., the therapeutic range with the nonextracted fluorometric method is about 3-8 µg/ml [9.25-24.66 µmol/L], whereas the range using the double-extracted fluorometric method or HPLC is 2.3-5 µg/ml [7.09-15.41 µmol/L]). The specimen for TDM should be drawn just before the next dose is to be given (residual or trough level).

Reasons for TDM of quinidine include the following:

1. Various quinidine commercial products differ considerably in absorption.
2. Toxic levels of quinidine can produce certain arrhythmias that could otherwise be attributed to the patient's underlying disease (either from inadequate control or from noncompliance).
3. There is a possibility of drug interaction, because patients taking quinidine are likely to be taking several drugs or to receive additional drugs in the future.
4. Patient disease may modify quinidine metabolism or excretion (old age frequently is associated with reduced renal function, which modifies renal excretion of quinidine).

Flecainide. Flecainide (Tambocor) is another drug used for ventricular arrhythmias, including premature ventricular contractions and ventricular tachycardia or fibrillation. About 95% is absorbed. Food or antacids do not affect absorption. After absorption, roughly 40% is bound to serum proteins. About 30% (range, 10%-50%) is excreted unchanged in the urine. The major metabolites have no antiarrhythmic activity. Peak plasma levels after oral dosage are reached in about 3 hours (range, 1-6 hours). Serum half-life averages 20 hours (range, 7-27 hours) and may be longer in patients with severe renal disease or congestive failure. Steady state is reached in 3-5 days. Propranolol increases flecainide serum levels by approximately 20%. Hypokalemia or hyperkalemia may affect the therapeutic action of flecainide. Flecainide paradoxically aggravates ventricular arrhythmias in about 7% of patients, especially in the presence of congestive heart failure.

Digoxin. Digoxin could be included in the section on toxicology, since most serum assay requests are for the purpose of investigating possible digoxin toxicity. However, an increasing number of studies have demonstrated unsuspected overdosage or underdosage (30% toxicity and 11% underdigitalization in one study), and requests for baseline levels are becoming more frequent. The volume of requests and the relative ease of performance (by immunoassay) make this assay readily available, even in smaller laboratories. The widespread use of digoxin, the narrow borderline between therapeutic range and toxicity, and the nonspecific nature of mild or moderate toxic signs and symptoms that mimic a variety of common disorders (diarrhea, nausea, arrhythmias, and ECG changes) contribute to the need for serum assay.

Digoxin therapeutic drug monitoring data

About 20%-30% of digoxin is bound to serum albumin. About 80% (range, 60%-90%) is excreted unchanged by the kidneys. About 20% is metabolized in the liver, with most of this being excreted as digoxin metabolites. About 10% of the adult population metabolizes a greater percentage of digoxin (which may be as high as 55%). After an oral dose is given, serum levels rise to a peak at 30-90 minutes and then slowly decline until a plateau is reached about 6-8 hours after administration. Digoxin assay specimens must be drawn at least 6 hours (preferably at least 8 hours) after the last dose in either oral or IV administration, to avoid blood levels that are significantly higher than would be the case when tissue levels have equilibrated. The 6- to 8-hour time span mentioned is minimum elapsed time; specimens may be drawn later. In many cases more information is obtained from a sample drawn shortly before the next scheduled dose. Serum half-life is approximately 36-38 hours. Normal therapeutic range is 0.5-2.0 ng/ml (0.6-2.56 nmol/L).
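
A minimal Python sketch can combine the two checks described above: whether the specimen was drawn long enough after the last dose, and where the result falls relative to the quoted therapeutic range. Function name and example values are illustrative only, and the comments restate the text rather than any assay vendor's rules.

# Minimal sketch: screen a digoxin result against the sampling-time and
# therapeutic-range statements quoted above (level in ng/ml).

def digoxin_check(hours_since_last_dose, level_ng_ml):
    notes = []
    if hours_since_last_dose < 6:
        notes.append("Drawn <6 hours after dose: not interpretable; "
                     "tissue equilibration incomplete.")
    elif hours_since_last_dose < 8:
        notes.append("Drawn 6-8 hours after dose: acceptable; 8 or more hours preferred.")
    if level_ng_ml < 0.5:
        notes.append("Below 0.5 ng/ml: possible underdigitalization.")
    elif level_ng_ml > 2.0:
        notes.append("Above 2.0 ng/ml: toxicity more likely; correlate clinically.")
    else:
        notes.append("Within the usual 0.5-2.0 ng/ml therapeutic range.")
    return notes

print(digoxin_check(4, 2.4))
print(digoxin_check(10, 1.1))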

Various metabolic disorders and medications may alter body concentration or serum levels of digoxin or may affect myocardial response to usual dosage. The kidney is the major route of excretion, and a decrease in renal function sufficient to raise serum creatinine levels will elevate serum digoxin levels as well. In renal failure, digoxin half-life may be extended to as long as 5 days. Hypothyroidism also increases digoxin serum values. On the other hand, certain conditions affect patient response to digitalis without affecting blood levels. Myocardial sensitivity to digoxin, regardless of dose, is increased by acute myocardial damage, hypokalemia, hypercalcemia, hypermagnesemia or hypomagnesemia, alkalosis, tissue anoxia, and glucagon. Drugs that produce hypokalemia (including various diuretics, amphotericin B, corticosteroids, or glucose infusion) thus predispose to toxicity. Other medications, such as phenylbutazone, phenytoin, and barbiturates (which activate hepatic degradation mechanisms), or kaolin (Kaopectate), antacids, cholestyramine, and certain oral antibiotics such as neomycin (which interfere with absorption) tend to be antagonistic to the effect of digitalis. Quinidine elevates digoxin levels in about 90% of patients by 50%-100% (range, 30%-330%). The effect on digoxin levels begins within 24 hours, with peak effect in 4-5 days. Certain other medications can increase serum digoxin levels to some extent.

Interfering substances. Digoxin can be measured by a variety of immunoassay methods. Digoxin-like cross-reacting substances have been reported in many patients (not all) in the third trimester of pregnancy, infants up to 6 months of age (the effect peaking at 1 week of age), patients with renal failure, and patients with severe liver disease. Different kits are affected to different extents. Some investigators report that the cross-reacting substances bind to serum proteins. In most cases the cross-reaction increases apparent serum digoxin by less than 1.0 ng/ml, but sometimes the effect may be greater.

Antidigoxin antibody therapy. Another analytical problem occurs when digitalis toxicity is treated with fragments of antidigoxin antibodies (Fab, “antigen-binding fragments”). These fragments are prepared by first producing antidigoxin IgG class antibody in animals, then enzymatically splitting off the antigen-binding variable regions (Fab portion) of the IgG molecule. This eliminates the “constant” region of the IgG molecule, which is the most antigenic portion of the molecule. The antidigoxin antibody Fab fragments bind to plasma and extracellular fluid digoxin. This creates a disturbance in equilibrium between free (unbound) digoxin within cells and within the extracellular compartments, so that some intracellular digoxin moves out of body cells to restore the equilibrium. The digoxin-Fab bound complexes are excreted in the urine by glomerular filtration. Their elimination half-life with normal renal function is about 15-20 hours (range, 14-25 hours).

Laboratory digoxin assay is involved for two reasons. First, a pretherapy baseline is required to help establish the diagnosis of digoxin toxicity and to help estimate the dose of Fab fragments needed. Second, after injection of the Fab dose, another assay is helpful to determine if adequate therapy was given, either because pretreatment digoxin tissue levels were higher than estimated or too much of the Fab fragment dose was lost in urine before sufficient digoxin had diffused out of the body cells. It is necessary to wait at least 6-8 hours after therapy for a postdose assay, to allow for equilibration time between cells and extracellular fluid. An assay system specific for free digoxin is necessary (usually done by a technique such as microfiltration, which separates unbound from Fab-bound digoxin), because the Fab-digoxin bound complexes are included with unbound (free) digoxin in total digoxin assays. Soon after therapy begins there is greatly increased Fab-digoxin bound complex formation in plasma (and, therefore, elevated total digoxin levels, sometimes as high as 20 times pretreatment levels), whereas free digoxin levels are low. Later, 12-20 hours after the initial therapeutic dose, plasma free digoxin reequilibrates, and may reach toxic levels again if sufficient intracellular digoxin has not been captured. It may take several days to excrete all the Fab-digoxin bound complexes, and the serum total digoxin level may remain elevated more than 1 week if there is poor renal function.

Digoxin assay clinical correlation. In various studies, there is a certain amount of overlap in the area that statistically separates normally digitalized patients from those with toxicity. This overlap exists because it is difficult to recognize mild degrees of toxicity, because patient sensitivity to digitalis varies, and because the assay technique itself, no matter how well done, like all laboratory tests displays a certain amount of variation when repeated determinations are performed on the same specimen. Regardless of these problems, if the clinical picture does not agree with the level of toxicity predicted by digoxin assay values, and laboratory quality control is adequate, the physician should not dismiss or ignore the assay results but should investigate the possibility of interference by improper specimen collection time interval, drug interaction, or metabolic alterations. However, the assay should be repeated first, to verify that a problem exists.

Digitoxin. Digitoxin is more than 95% bound to serum albumin. Serum half-life is about 8 days (2.5-16.5 days). Digitoxin is about 90% metabolized in the liver. About 5%-10% is excreted unchanged through the kidneys. Drugs that activate hepatic enzyme systems, such as phenytoin and barbiturates, increase metabolism of digitoxin and decrease serum levels. Hypoalbuminemia and drugs that compete for binding sites on albumin also tend to decrease digitoxin serum levels. The long half-life of the drug means that toxicity is difficult to overcome, so digoxin has mostly replaced digitoxin in the United States. The therapeutic range of digitoxin is 15-25 ng/ml.

Psychiatric Medications

Lithium carbonate. Lithium is used for control of the manic phase of manic-depressive psychiatric illness. Peak levels are reached in 1-3 hours, and plasma half-life (in young adults) is about 24 hours (range, 8-35 hours). Time to steady state is about 5 days (range, 2-7 days). Most excretion is through the kidneys, where there is both excretion and reabsorption. Excretion is decreased (tending to increase half-life and blood levels) with poor renal function and also with sodium deficiency. Methyldopa also tends to delay lithium excretion. More rapid excretion occurs with salt loading or sodium retention. Interesting side effects are reversible hypothyroidism (about 5% of cases, with some thyroid-stimulating hormone elevation in up to 30% of cases) and neutrophilic leukocytosis. TDM assays are usually performed 12 hours after the last dose (before administration of the next dose). The usual laboratory method is flame photometry, although other methods are becoming available. The therapeutic range is somewhat narrow (approximately 0.5-1.5 mEq/L). Values higher than 2.0 mEq/L are usually considered to be in the toxic range. Maintenance therapy is customarily monitored once a month. Some interest has been shown in red blood cell (RBC) lithium analysis, especially when lack of patient compliance is suspected. RBC lithium levels are more stable over periods of time than serum lithium levels due to the relatively short half-life of serum lithium. Low RBC lithium levels in the presence of normal or elevated serum lithium levels suggest that the patient is noncompliant but took a lithium dose shortly before coming to have the specimen drawn.

Tricyclic antidepressants. The group name of these medications refers to their three-ring structure. They are widely used to treat unipolar psychiatric depression (i.e., depression without a manic phase). About 70% of these patients show some improvement. The tricyclics are thought to act through blocking one of the deactivation pathways of norepinephrine and serotonin at the brain nerve endings, thereby increasing the availability of these neurotransmitter agents in the synapse area. The different drugs differ in their effect on norepinephrine, serotonin, or both. Currently, the most commonly used tricyclics are imipramine (Tofranil), amitriptyline (Elavil), protriptyline (Vivactil), and doxepin (Sinequan). Of these, imipramine is metabolized to desipramine, and amitriptyline is metabolized to nortriptyline; in both cases the metabolites have pharmacologic activity and are actually marketed themselves under different trade names. Doxepin is also metabolized to the active compound desmethyldoxepin. If these parent compounds are assayed, their major metabolites must be assayed as well. Other tricyclics are available, and still others are being introduced.

Oral doses are fairly completely absorbed from the GI tract. Once absorbed, there is 70%-96% binding to plasma proteins and considerable first-pass metabolism in the liver. By 6-8 days, 60%-85% of the dose is excreted in the urine in the form of metabolites. Peak serum levels are generally attained 2-6 hours (range, 2-8 hours) after an oral dose. There is variation in peak level depending on the drug formulation. There is considerable variation in metabolism between individuals, with fivefold to tenfold variation in steady-state levels being common and differences as great as thirtyfold sometimes reported. The half-life averages 20-30 hours (range, 15-93 hours), and steady state is reached on the average in about 7-10 days (range, 2-19 days). Imipramine has a somewhat shorter half-life (6-24 hours) and time to steady state (about 2-5 days) than the other tricyclics. However, there is variation between the various drugs and between individuals taking the same drug. It is reported that 30% or more of patients have serum assay values outside the standard therapeutic range. African Americans may reach higher steady-state serum levels than Europeans.

Currently, high-performance liquid chromatography (HPLC) is considered the best assay method. Immunoassay (EMIT method) is also used but is not as specific. For example, thioridazine (Mellaril) and possibly other phenothiazines may produce a reaction in the EMIT tricyclic test. When tricyclics are given once daily, TDM specimens are usually drawn 10-14 hours after the last dose (if the dose is given at bedtime, the specimen is drawn in the morning about 12 hours later). If the patient is on divided doses, the specimen should be drawn 4-6 hours after the last dose (this usually means that the specimen is drawn just before the next dose). The literature warns that some collection tube stoppers contain interfering substances and that certain serum separation devices using gels or filtration also might interfere. It is obviously necessary to select a collection and processing method that is known to be safe. Serum should be refrigerated rather than frozen. Quality control studies have shown variation within laboratories and between laboratories that is greater than the level of variation for routine chemistry tests.

Selected Drugs and Drug Groups

Anticonvulsants

Most epileptics can be controlled with phenytoin (Dilantin), primidone (Mysoline), phenobarbital, or other agents. Frequently drug combinations are required. Therapy is usually a long-term project. When toxicity develops, many of these therapeutic agents produce symptoms that could also be caused by central nervous system (CNS) disease, such as confusion, somnolence, and various changes in mental behavior. Some drugs, such as primidone, must be carefully brought to a therapeutic level by stages rather than in a single dose. Most antiepileptic drugs are administered to control seizures; but if seizures are infrequent, it is difficult to be certain that the medication is sufficient to prevent future episodes. When drug combinations are used, levels for all the agents should be obtained so that if only one drug is involved in toxicity or therapeutic failure, it can be identified.

When specimens are sent to the laboratory for drug assay, the physician should list all drugs being administered. Some are metabolized to substances that themselves have antiepileptic activity (e.g., primidone is partially metabolized to phenobarbital), and the laboratory then must assay both the parent drug and its metabolite. Without a complete list of medications, there is a good chance that one or more drugs will be overlooked. Once drug blood levels have been obtained, the physician should remember that they are often not linear in relation to dose, so that a percentage change in dose may not result in the same percentage change in blood level. Repeated assays may be needed to guide dosage to achieve desired blood levels. Finally, published therapeutic ranges may not predict the individual response of some patients to the drug. Clinical judgments as well as laboratory values must be used.

Phenytoin. Phenytoin is about 90% bound to serum proteins. About 70% is metabolized in the liver, and only about 5% or less is excreted unchanged through the kidneys. Peak phenytoin levels are reached 4-8 hours after an oral dose and within 15 minutes after IV administration. Serum half-life is about 18-30 hours (literature range, 10-95 hours), with an average of about 24 hours. This variation occurs in part because higher doses saturate the liver metabolic pathway and thus prolong the half-life as nonmetabolized drug accumulates. The serum dose-response curve is not linear, so that relatively small increases in dose may generate relatively large changes in serum levels. Time to reach steady state is usually 4-6 days but may take as long as 5 weeks. Administration by intramuscular injection rather than oral intake is said to reduce blood levels by about 50%. The therapeutic range is 10-20 µg/ml. Specimens for TDM are usually drawn just before the next scheduled dose to evaluate adequacy of dosage. Specimens drawn during symptoms or peak levels are needed to investigate toxic symptoms.
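
The nonlinear dose-response behavior described above is commonly modeled with Michaelis-Menten (saturable) elimination; that model, and the Vmax and Km numbers below, are illustrative assumptions rather than statements from this text or data from any patient. The sketch simply shows why a modest dose increase can more than double the steady-state level once the hepatic pathway approaches saturation.

# Minimal sketch: steady-state phenytoin level under assumed Michaelis-Menten
# elimination. Vmax and Km are purely illustrative values.

VMAX_MG_PER_DAY = 500.0   # assumed maximum metabolic capacity
KM_UG_PER_ML = 4.0        # assumed concentration at half-maximal metabolism

def steady_state_level(dose_mg_per_day):
    """Steady-state concentration (ug/ml), assuming complete absorption."""
    if dose_mg_per_day >= VMAX_MG_PER_DAY:
        raise ValueError("Dose meets or exceeds metabolic capacity; no steady state.")
    return KM_UG_PER_ML * dose_mg_per_day / (VMAX_MG_PER_DAY - dose_mg_per_day)

for dose in (300, 400, 450):
    print(dose, "mg/day ->", round(steady_state_level(dose), 1), "ug/ml")
# Under these assumptions, a 33% dose increase (300 -> 400 mg/day)
# raises the level from about 6 to about 16 ug/ml.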

Certain drugs or diseases may affect phenytoin blood levels. Severe chronic liver disease, hepatic immaturity in premature infants, or disulfiram (Antabuse) therapy often increase phenytoin levels. Certain other drugs, such as coumarin anticoagulants, chloramphenicol (Chloromycetin), methylphenidate (Ritalin), and certain benzodiazepine tranquilizers such as diazepam (Valium) and chlordiazepoxide (Librium), have caused significant elevations in a minority of patients. Acute alcohol intake may also elevate plasma levels. On the other hand, pregnancy, acute hepatitis, low doses of phenobarbital, carbamazepine (Tegretol), and chronic alcoholism may decrease phenytoin plasma levels, and they may also be decreased in full-term infants up to age 12 weeks and in some patients with renal disease. As noted previously, there may be disproportionate changes in either bound or free phenytoin in certain circumstances. About 10% of total phenytoin is theoretically free, but in one study only about 30% of patients who had free phenytoin measured conformed to this level, with the remainder showing considerable variation. Certain clinical conditions or acidic, highly protein-bound drugs may displace some phenytoin from albumin, causing the unbound (free) fraction of serum phenytoin to rise. Initially, total serum concentration may be decreased somewhat if the liver metabolizes the newly released free drug. However, the hepatic metabolic pathway may become saturated, with resulting persistent increase in the unbound fraction and return of the total phenytoin level into the reference range. At this time the usual phenytoin assay (total drug) could be normal while the free drug level is increased. Drugs that can displace phenytoin from albumin include valproic acid (Depakene), salicylates, oxacillin, cefazolin, cefotetan, and phenylbutazone. Large quantities of urea or bilirubin have a similar effect. Infants aged 0-12 weeks have reduced phenytoin protein binding. On the other hand, hypoalbuminemia means less binding protein is available and may result in increased free phenytoin levels coincident with decreased total phenytoin levels.

Phenytoin has some interesting side effects in a minority of patients, among which are megaloblastic anemia and a type of benign lymphoid hyperplasia that clinically can suggest malignant lymphoma. Occasional patients develop gum hypertrophy or hirsutism. Phenytoin also can decrease blood levels of cortisol-type drugs, thyroxine (T4), digitoxin, and primidone, and can increase the effect of Coumadin and the serum levels of the enzymes gamma-glutamyltransferase and alkaline phosphatase. Phenytoin produces its effects on drugs by competing for binding sites on protein or by stimulating liver microsome activity. Phenytoin alters the serum enzymes by its effect on the liver microsome system.

Primidone. Primidone is not significantly bound to serum proteins and is about 50% metabolized in the liver. About 50% is excreted unchanged by the kidneys. Its major metabolites are phenobarbital (about 20%) and phenylethylmalonamide (about 20%), both of which have anticonvulsant activity of their own and both of which accumulate with long-term primidone administration. Phenobarbital is usually not detectable for 5-7 days after beginning primidone therapy. The ratio of phenobarbital to primidone has been variously reported as 1.0-3.0 after steady state of both drugs has been reached (unless phenobarbital is administered in addition to primidone). If phenytoin is given in addition to primidone, primidone conversion to phenobarbital is increased and the phenobarbital/primidone ratio is therefore increased. Peak serum concentration of primidone occurs in 1-3 hours, although this is somewhat variable. Serum half-life in adults is about 6-12 hours (literature range, 3.3-18 hours). Steady state is reached in about 50 hours (range, 16-60 hours). The therapeutic range is 5-12 µg/ml. It is usually recommended that both primidone and phenobarbital levels be assayed when primidone is used, rather than primidone levels only. If this is done, one must wait until steady state for phenobarbital is reached, which takes a much longer time (8-15 days for children, 10-25 days for adults) than steady state for primidone. Specimens for TDM are usually drawn just before the next scheduled dose to evaluate adequacy of dosage. Specimens drawn during symptoms or peak levels are needed to investigate toxic symptoms.
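
The phenobarbital/primidone relationship described above lends itself to a minimal sketch. The 1.0-3.0 steady-state ratio and the effect of phenytoin co-therapy are taken from the paragraph; the function name and example numbers are illustrative only, and the sketch assumes both drugs have reached steady state.

# Minimal sketch: phenobarbital/primidone ratio at steady state.

def phenobarb_primidone_ratio(phenobarbital_ug_ml, primidone_ug_ml,
                              on_phenytoin=False):
    ratio = phenobarbital_ug_ml / primidone_ug_ml
    note = "expected roughly 1.0-3.0 at steady state"
    if on_phenytoin:
        note += "; phenytoin co-therapy tends to raise this ratio"
    return round(ratio, 2), note

print(phenobarb_primidone_ratio(18.0, 9.0))
print(phenobarb_primidone_ratio(30.0, 8.0, on_phenytoin=True))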

Phenobarbital. Phenobarbital is about 50% bound to serum protein. It has a very long half-life of 2-5 days (50-120 hours) and takes 2-3 weeks (8-15 days in children, 10-25 days in adults) to reach steady state. About 70%-80% is metabolized by the liver, and about 10%-30% is excreted unchanged by the kidneys. Phenobarbital, as well as phenytoin, carbamazepine, and phenylbutazone, has the interesting ability to activate hepatic microsome activity. Thus, phenobarbital increases the activation of the phenytoin liver metabolic pathway and also competes with phenytoin for that pathway. Phenobarbital incidentally increases degradation of other drugs that are metabolized by hepatic microsome activity, such as coumarin anticoagulants, adrenocorticosteroids, quinidine, tetracycline, and tricyclic antidepressants. Acute alcoholism increases patient response to phenobarbital, and chronic alcoholism is said to decrease response. Specimens for TDM are usually drawn just before the next scheduled dose to evaluate adequacy of dosage. Specimens drawn during symptoms or peak levels are needed to investigate toxic symptoms.

Valproic Acid. Valproic acid has been used to treat petit mal “absence” seizures and, in some cases, tonic-clonic generalized seizures and myoclonic disorders. About 90% is bound to plasma proteins. There is a relatively small volume of distribution, because most of the drug remains in the vascular system. More than 90% is metabolized in the liver, with 5% or less excreted unchanged by the kidneys. Time to peak after oral dose is 1-3 hours. Food intake may delay the peak. Serum half-life is relatively short (about 12 hours; range, 8-15 hours), and steady state (oral dose) is reached in 2-3 days (range, 30-85 hours in adults; 20-70 hours in children). Liver disease may prolong the interval before steady state. Interestingly, therapeutic effect usually does not appear until several weeks have elapsed. There is some fluctuation in serum values (said to be 20%-50%) even at steady state. Hepatic enzyme-inducing drugs such as phenytoin, phenobarbital, carbamazepine, and primidone increase the rate of valproic acid degradation and thus its rate of excretion, and therefore tend to decrease the serum levels. Hypoalbuminemia or displacement of valproic acid from albumin by acidic, strongly protein-bound drugs such as salicylates decreases total valproic acid blood levels. Valproic acid can affect phenytoin and primidone levels, but the effect is variable. Phenobarbital levels are increased due to interference with liver metabolism. One report indicates that ethosuximide levels may also be increased. Specimens for TDM are usually drawn just before the next scheduled dose to evaluate adequacy of dosage. Specimens drawn during symptoms or peak levels are needed to investigate toxic symptoms.

Rarely, valproic acid may produce liver failure. Two types have been described. The more common type appears after months of therapy, with gradual and potentially reversible progression signaled by rising aspartate aminotransferase (AST) levels. Periodic AST measurement has been advocated to prevent this complication. The other type is sudden, is nonreversible, and appears soon after therapy is started.

Carbamazepine. Carbamazepine is used for treatment of grand mal and psychomotor epilepsy. About 70% (range, 65%-85%) is protein bound, not enough to make binding a frequent problem. Carbamazepine is metabolized by the liver. It speeds its own metabolism by activation of the liver microsome system. Only 1% is excreted unchanged in the urine. The major metabolites are the epoxide form, which is metabolically active, and the dihydroxide form, which is derived from the epoxide form. The metabolites are excreted in urine. Carbamazepine absorption after oral dose in tablet form is slow, incomplete (70%-80%), and variable. Pharmacologic data in the literature are likewise quite variable. Dosage with tablets results in a peak level that is reached in about 6-8 hours (range, 2-24 hours). Dosage as a suspension or solution or ingestion of tablets with food results in peak levels at about 3 hours. Serum half-life is about 10-30 hours (range, 8-35 hours) when therapy is begun. But after several days the liver microsome system becomes fully activated, and when this occurs the half-life for a dose change may be reduced to about 12 hours (range, 5-27 hours). Phenytoin, phenobarbital, or primidone also activate the liver microsome system, thereby increasing carbamazepine metabolism and reducing its half-life. The time to steady state is about 2 weeks (range, 2-4 weeks) during initial therapy. Later on, time to steady state for dose changes is about 3-4 days (range, 2-6 days). Transient leukopenia has been reported in about 10% of patients (range, 2%-60%) and persistent leukopenia in about 2% (range, 0%-8%). Thrombocytopenia has been reported in about 2%. Aplastic anemia may occur, but it has been rare.

Therapeutic Drug Monitoring (TDM)

Various studies have shown that therapy guided by drug blood levels (therapeutic drug monitoring, TDM) has a considerably better chance of achieving therapeutic effect and preventing toxicity than therapy using empiric drug dosage. TDM can be helpful in a variety of circumstances, as can be seen in the following discussion.
Why obtain therapeutic drug blood levels?

1. To be certain that adequate blood concentrations are reached. This is especially important when therapeutic effect must be achieved immediately but therapeutic results are not evident immediately, as might happen when aminoglycoside antibiotics are used.
2. When effective blood levels are close to toxic levels (“narrow therapeutic window”). It is useful to know what margin of safety is permitted by the current medication dose. If blood levels are close to toxic values, a decrease in the dose might be attempted.
3. If expected therapeutic effect is not achieved with standard dosage. It is important to know whether the fault is due to insufficient blood levels or is attributable to some other factor (e.g., patient tolerance to the medication effect or interference with the therapeutic effect by other drugs).
4. If symptoms of toxicity appear with standard dosage. The problem might be one of excessive blood levels, enhancement of effect by other medications, an increase in free as opposed to total drug blood levels, or symptoms that are not due to toxicity from the drug in question.
5. If a disease is present that is known to affect drug absorption, protein binding, metabolism, or excretion.
6. Possible drug interaction. It is safer to know in advance whether other medications have altered the expected blood levels of a drug before symptoms appear of toxicity or of insufficient therapeutic effect.
7. Combination drug therapy. If multiple drugs are used simultaneously for the same purpose (e.g., control of convulsions), knowledge of baseline blood levels for each drug would be helpful should problems develop and the question arise as to which drug is responsible.
8. Possible patient noncompliance. Patients may decrease the dosage or cease taking medication altogether if symptoms improve or may simply forget to take doses.
9. Possible medicolegal considerations. An example is the aminoglycoside antibiotic group, whose use is known to be associated with renal failure in a certain percentage of cases. If a patient develops renal failure while taking one of these antibiotics, the renal failure could be due either to drug toxicity or to the underlying disease. If previous and current antibiotic blood levels are within an established range that is not associated with toxicity, the presumptive cause of renal failure is shifted toward the disease rather than the therapy.
10. Change in dosage or patient status, to establish a new baseline for future reference.

What factors influence therapeutic drug blood levels?

A great many factors influence TDM blood levels. Discussion of some of the more important ones follows.

Route of administration. Intravenous (IV) administration places medication into the blood faster than intramuscular injection, which, in turn, is usually faster than oral intake. If IV medication is administered in a few minutes, this may shorten the serum half-life of some medications such as antibiotics compared to methods of administration that take longer. Oral medication may be influenced by malabsorption.

Drug absorption. This may be altered by gastrointestinal (GI) tract motility variations, changes of intestinal acidity, malabsorption disorders, and in some cases interference from food or laxatives.

Drug transport. Many drugs have a substantial fraction that is bound to plasma proteins. Acidic drugs bind predominantly to albumin, and basic drugs bind predominantly to alpha-1 acid glycoprotein. Protein-bound drug molecules are not metabolically active. Therapeutic drug monitoring using total drug concentration is based on the assumption that the ratio between bound and unbound (“free”) drug remains constant, and therefore alterations in the total drug level mirror alterations in the free drug level. In most cases this is true. However, when 80% or more of a drug is protein bound, there may be circumstances in which alterations in the ratio of bound to free drug may occur. These alterations may consist of either a free drug concentration within toxic range coupled with a total drug concentration within therapeutic range or a free drug concentration within therapeutic range coincident with a total drug concentration within toxic range. This may happen when the quantity of binding protein is reduced (e.g., in hypoalbuminemia) and the dose rate is not changed from that used with normal protein levels. Problems may also arise when the quantity of binding protein is normal but the degree of binding is reduced (as in neonatal life and in uremia); when competition from other drugs displaces some of the bound fraction (e.g., interaction between acidic drugs with a high percentage of protein binding, such as valproic acid and phenytoin); when metabolism of free drug decreases (severe liver disease); or when excretion of free drug decreases (renal failure). Although an increase in free drug quantity may explain toxic symptoms, it is helpful also to know the total drug concentration to deduce what has happened. In routine TDM, total drug concentration is usually sufficient. If toxicity occurs with total drug levels within the therapeutic range, free drug levels may provide an explanation and a better guideline for therapy. Free drug assays currently are done only by large reference laboratories. The introduction of relatively simple techniques to separate bound from free drug (e.g., membrane filtration) may permit wider availability of free drug assay.
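
A minimal Python sketch illustrates the bound-versus-free arithmetic behind these statements. The concentrations and binding fractions are illustrative only, loosely patterned on a highly bound drug such as phenytoin (about 90% bound in the usual case).

# Minimal sketch: free (unbound) drug level from total level and bound fraction.

def free_level(total_ug_ml, fraction_bound):
    return total_ug_ml * (1.0 - fraction_bound)

# Usual case: total 15 ug/ml with 90% binding -> free about 1.5 ug/ml.
print(round(free_level(15.0, 0.90), 2))

# Hypoalbuminemia or displacement by another drug: binding falls to 80%.
# Even with a lower total level, the free (active) level is higher.
print(round(free_level(12.0, 0.80), 2))

This is why a total drug level within the therapeutic range can coexist with a free drug level high enough to produce toxic symptoms.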

Drug uptake by target tissues. Drug molecules must reach target tissue and penetrate into tissue cells. Conditions such as congestive heart failure can decrease tissue perfusion and thereby delay tissue uptake of the drug.

Extent of drug distribution (volume of distribution). Lipid-soluble drugs penetrate tissues easily and have a much greater diffusion or dispersal throughout the body than non-lipid-soluble drugs. Dispersal away from the blood or target organ decreases blood levels or target tissue levels. The tendency to diffuse throughout the body is measured by dividing the administered drug dose by the plasma concentration of the drug (at equilibrium). This results in the theoretical volume of body fluid within which the drug is diffused to produce the measured serum concentration, which, in turn, indicates the extent of extravascular distribution of the drug.
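
The calculation just described is simple enough to show directly; the numbers below are illustrative only.

# Minimal sketch: apparent volume of distribution = dose / plasma concentration
# at equilibrium, as described above.

def volume_of_distribution_l(dose_mg, plasma_conc_mg_per_l):
    return dose_mg / plasma_conc_mg_per_l

# A 500-mg dose producing a 10 mg/L plasma level implies distribution in
# roughly 50 L of body fluid, i.e., well beyond the vascular compartment.
print(volume_of_distribution_l(500, 10), "L")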

Drug tissue utilization. Various conditions may alter this parameter, such as disease of the target organ, electrolyte or metabolic derangements, and effect of other medications.

Drug metabolism. Most drugs for which TDM is employed are wholly or partially inactivated (“detoxified”) within the liver. Liver function becomes a critical factor when severe liver damage occurs. Also, some persons metabolize a drug faster than average (“fast metabolizer”), and some metabolize drugs slower (“slow metabolizer”). Certain drugs such as digoxin and lithium carbonate are not metabolized in the liver. The rate of drug metabolism plus the rate of excretion are major determinants of two important TDM parameters. Half-life (biologic half-life) refers to the time required to decrease drug blood concentration by 50%. It is usually measured after absorption has been completed. Steady state refers to drug blood level equilibrium between drug intake and elimination. Before steady state is achieved, drug blood values typically are lower than the level that they eventually attain. As a general rule it takes five half-lives before steady state is reached. Loading doses can decrease this time span considerably. A few investigators use three half-lives as the basis for steady-state measurements.
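
The five half-lives rule of thumb follows from simple first-order arithmetic, shown here as a minimal sketch with illustrative numbers only.

# Minimal sketch: fraction of the eventual steady-state level reached after
# n half-lives is 1 - 0.5**n, which is why about five half-lives is the
# usual rule of thumb.

def fraction_of_steady_state(n_half_lives):
    return 1.0 - 0.5 ** n_half_lives

for n in (1, 2, 3, 5):
    print(n, "half-lives:", f"{fraction_of_steady_state(n):.1%}")

def time_to_steady_state_hours(half_life_hours, n_half_lives=5):
    return half_life_hours * n_half_lives

# Example: a drug with an 8-hour half-life needs roughly 40 hours.
print(time_to_steady_state_hours(8), "hours")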

Drug excretion. Nearly all TDM drugs are excreted predominantly through the kidneys (the major exception is theophylline). Markedly decreased renal function obviously leads to drug retention. The creatinine clearance rate is commonly used to estimate the degree of residual kidney function. When the serum creatinine is more than twice reference upper limits, creatinine clearance is usually less than 25% of normal and measurement is less accurate. In addition, creatinine clearance is somewhat reduced in the elderly, and some maintain that clearance reference ranges should be adjusted for old age.

Dosage. Size and frequency of dose obviously affect drug blood levels.

Age. Infants in general receive the same dose per unit weight as adults; children receive twice the dose, and the elderly receive less. A very troublesome period is the transition between childhood and puberty (approximately ages 10-13 years) since dosage requirements may change considerably and without warning within a few months.

Weight. Dosage based on weight yields desirable drug blood levels more frequently than arbitrary, fixed-dose schedules. One assumes that a larger person has a larger total blood volume and extracellular fluid space within which the drug is distributed and a larger liver to metabolize the drug.

Interference from other medications. Such interference may become manifest at any point in drug intake, metabolism, tissue therapeutic effect, and excretion, as well as lead to possible artifact in technical aspects of drug assay.

Effect of disease on any previously mentioned factors. This most frequently involves considerable loss of renal or hepatic function.

Assay of peak or residual level. In general, peak levels correlate with toxicity, whereas residual (trough) levels are more an indication of proper therapeutic range (i.e., whether the blood level remains within the therapeutic range). Of course, if the residual level is in the toxic range this is an even stronger indication of toxicity. An exception to the general rule is the aminoglycoside antibiotic group, in which the peak level is used to indicate whether therapeutic levels are being reached and the residual level is considered (some disagreement exists on this point) to correlate best with nephrotoxicity. For most drugs, the residual level should be kept within the therapeutic range and the peak level should be kept out of the toxic range. To avoid large fluctuations, some have recommended that the dose interval be one half of the drug half-life; in other words, the drug should be administered at least once during each half-life.
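
The recommendation that the dose interval be about half of the drug half-life can be illustrated with a minimal sketch. It assumes a simple one-compartment model with first-order elimination between doses, which is an added simplifying assumption rather than a claim from the text; the numbers are illustrative only.

# Minimal sketch: trough predicted from peak under first-order decline,
# trough = peak * 0.5**(dose_interval / half_life).

def trough_from_peak(peak, dose_interval_hours, half_life_hours):
    return peak * 0.5 ** (dose_interval_hours / half_life_hours)

half_life = 8.0
for interval in (4.0, 8.0, 16.0):   # half of, equal to, and twice the half-life
    trough = trough_from_peak(10.0, interval, half_life)
    print(f"interval {interval} h: trough {trough:.1f} (peak 10.0)")
# With the interval at half the half-life, the level falls only about 29%
# between doses; with the interval at twice the half-life it falls 75%.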

One of the most important laboratory problems of drug level monitoring is the proper time in relationship to dose administration at which to obtain the specimen. There are two guidelines. First, the drug blood level should have reached steady state or equilibrium, which as a rule of thumb takes five drug half-lives. Second, the drug blood level should be at a true peak or residual level. Peak levels are usually reached about 1-2 hours after oral intake, about 1 hour after intramuscular administration, or about 30 minutes after IV medication. Residual levels are usually reached shortly (0-15 minutes) before the next scheduled dose. The greatest problem is being certain when the drug was actually given. I have had best results by first learning when the drug is supposed to be given. If a residual level is needed, the nursing service is then instructed to withhold the dose. The blood specimen is drawn approximately 15 minutes before the scheduled dose time, and the nursing service is then told to administer the dose. If a peak level is needed, the laboratory technologist should make arrangements to have the nursing service record the exact minute that the dose is given and telephone the laboratory. Unless the exact time the specimen was obtained and the exact time the drug dose was given are both known with certainty, drug blood level results cannot be properly interpreted and may be greatly misleading.

Laboratory technical factors. These include the inherent technical variability of any drug assay method (expressed as a coefficient of variation) as well as the other sources of error discussed in Chapter 1. Therapeutic drug monitoring assays in general have shown greater differences between laboratories than are found with simple, well-established tests such as blood urea nitrogen or serum glucose levels.
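
For readers unfamiliar with the term, the coefficient of variation is simply the standard deviation expressed as a percentage of the mean; the sketch below computes it for a set of hypothetical replicate assay results.

```python
import statistics

# Sketch: the coefficient of variation (CV) expresses assay reproducibility as
# the standard deviation relative to the mean. The replicate values below are
# hypothetical results on the same specimen, in ug/ml.

replicates = [10.2, 9.8, 10.5, 9.9, 10.1, 10.4]
mean = statistics.mean(replicates)
sd = statistics.stdev(replicates)
print(f"mean {mean:.2f}, SD {sd:.2f}, CV {100 * sd / mean:.1f}%")
```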

Patient compliance. Various studies have shown astonishingly high rates of patient noncompliance with dose instructions, including failure to take any medication at all. An estimated 20%-80% of all patients may be involved. Noncompliance results in subtherapeutic medication blood levels. Some believe that noncompliance is the most frequent cause of problems in patients on long-term therapy.

Therapeutic and toxic ranges

Therapeutic ranges are drug blood levels that have been empirically observed to correlate with desired therapeutic effects in most patients being treated for an uncomplicated disease. The same relationship is true for toxicity and toxic ranges. However, these ranges are not absolute and do not cover the response to a drug in all individual patients or the response when some unexpected factor (e.g., other diseases or other drugs) is superimposed. The primary guide to therapy is a good therapeutic response without evidence of toxicity. Most of the time this will correspond with a drug blood level within the therapeutic range, so the therapeutic range can be used as a general guideline for therapy. In some cases a good response does not correlate with the therapeutic range. In such cases the assay should be repeated on a new specimen to exclude technical error or specimens drawn at the wrong time in relation to dose. If the redrawn result is unchanged, clinical judgment should prevail. Some attempt should be made, however, to see if there is some factor that is superimposed on the disease being treated that could explain the discrepancy. Removal or increase of such a factor could affect the result of therapy at a later date. The same general statements are true for toxicity and toxic ranges. Some patients may develop toxicity at blood levels below the statistically defined toxic range and some may be asymptomatic at blood levels within the toxic range. However, the further the values enter into the toxic range, the more likely it is that toxicity will develop. Thus, patient response and drug level data are both important, and both are often necessary to interpret the total picture.

Some Conditions That Produce Unexpected Therapeutic Drug Monitoring Results
High plasma concentration on normal or low prescribed dose
Patient accidental overdose
Slow metabolizer
Drug interaction that blocks original drug metabolism in liver or injures the liver
Poor liver function (severe damage)
Drug excretion block
Increased binding proteins
Residual level determined on sample drawn after dose was administered instead of before
Laboratory technical factors
Low plasma concentration on normal or high prescribed dose
Poor drug absorption (oral dose)
Interference by another drug
Patient noncompliance
Fast metabolizer
Decreased binding proteins
Peak level determined on sample drawn at incorrect time
Laboratory technical factors
Toxic symptoms with blood levels in therapeutic range
Drug released from proteins (free drug increased)
Drug effect enhanced at tissue level by some other drug or condition
Blood level obtained at incorrect time
Laboratory technical factors
Symptoms may not be due to toxicity of that drug

When to obtain specimens for therapeutic drug monitoring

If a patient develops symptoms that might be caused by a drug, the best time to obtain a specimen for TDM is during the period when the patient has the symptoms (if this is not possible, within a short time afterward). One possible exception, however, is digoxin, whose blood level does not equilibrate with tissue levels until at least 6-8 hours after the dose is given. Therefore, specimens for digoxin TDM should not be drawn less than 6 hours after administration of the previous dose, even if toxic symptoms occur earlier. It should be ascertained how much time elapsed between the onset of toxic symptoms and the time of the last previous medication dose. This information is necessary to determine if there is a relationship of the symptoms to the peak blood level of the drug. If the specimen cannot be drawn during symptoms, the next best alternative is to deliberately obtain a specimen at the peak of the drug blood level. This will indicate if the peak level is within the toxic range. In some instances it may be useful to obtain a blood specimen for TDM at a drug peak level even without toxic symptoms, to be certain that the drug dosage is not too high.
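
As a concrete illustration of the digoxin timing rule, the following sketch checks whether a specimen was drawn at least 6 hours after the previous dose (the 6-hour threshold comes from the discussion above; the helper function and the times used are hypothetical).

```python
from datetime import datetime, timedelta

# Sketch of the timing rule described above: a digoxin specimen drawn less
# than 6 hours after the previous dose will not reflect tissue equilibration.
# The 6-hour constant mirrors the text; the function and times are illustrative.

MIN_DIGOXIN_DELAY = timedelta(hours=6)

def digoxin_specimen_ok(last_dose_time, draw_time):
    """True if the specimen was drawn at least 6 hours after the last dose."""
    return (draw_time - last_dose_time) >= MIN_DIGOXIN_DELAY

last_dose = datetime(2024, 1, 1, 8, 0)
print(digoxin_specimen_ok(last_dose, datetime(2024, 1, 1, 12, 0)))  # False (only 4 h)
print(digoxin_specimen_ok(last_dose, datetime(2024, 1, 1, 15, 0)))  # True (7 h)
```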

In some cases the question is not drug toxicity but whether dosage is adequate to achieve the desired therapeutic effect. In that case, the best specimen for TDM is one drawn at the residual (valley or trough) drug level, shortly before the next medication dose is given. The major exception to this rule is theophylline, for which a peak level is more helpful than a residual level.

For most drugs, both peak and residual levels should be within the therapeutic range. The peak value should not enter the toxic range and the residual value should not fall to therapeutically inadequate levels.

Information on some of the medications for which TDM is currently being used is given in Table 37-25. The box lists some conditions that produce unexpected TDM results.

Summary

Therapeutic drug monitoring can be extremely helpful in establishing drug levels that are both therapeutically adequate and nontoxic. To interpret TDM results, the clinician should know the pharmacodynamics of the medication, ascertain that steady-state levels have been achieved before ordering TDM assays, try to ensure that specimens are drawn at the correct time in relation to dose administration, be aware of effects from other medication, and view TDM results as one component in the overall clinical picture rather than the sole basis for deciding whether drug dosages are correct. Drug monitoring is carried out in two basic situations: (1) in an isolated attempt to find the reason for therapeutic failure (either toxic symptoms or nonresponse to therapy) and (2) to obtain a baseline value after sufficient time has elapsed for stabilization. Baseline values are needed for comparison with future values if trouble develops and to establish the relationship of a patient’s drug blood level to accepted therapeutic range. This information can be invaluable in future emergencies.

Comments on therapeutic drug monitoring assay

To receive adequate service, the physician must provide the laboratory with certain information as well as the patient specimen. This information includes the exact drug or drugs to be assayed, the patient's age, the time elapsed from the last dose until the specimen was obtained, the drug dose, and the route of administration. All of these factors affect normal values. It is also desirable to state the reason for the assay (i.e., what question the clinician wants answered) and to provide a list of the medications the patient is receiving.
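
The required information can be thought of as a small structured record accompanying the specimen; the sketch below is purely illustrative and does not represent any standard requisition format.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Sketch of the information the text says should accompany a TDM specimen.
# The class and field names are illustrative, not a standard requisition format.

@dataclass
class TDMRequest:
    drug_to_assay: str
    patient_age_years: int
    hours_since_last_dose: float
    dose: str                               # e.g., "300 mg"
    route: str                              # e.g., oral, IM, IV
    reason_for_assay: Optional[str] = None  # the question to be answered
    other_medications: List[str] = field(default_factory=list)

request = TDMRequest("phenytoin", 54, 11.5, "300 mg", "oral",
                     reason_for_assay="possible toxicity",
                     other_medications=["cimetidine"])
print(request)
```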

Methods used in drug assay include (among others) gas-liquid chromatography (technically difficult but especially useful when several drugs are being administered simultaneously, as frequently occurs in epilepsy), thin-layer chromatography (TLC, more often used for the hypnotic drugs), radioimmunoassay (RIA), fluorescence polarization immunoassay, and enzyme-multiplied immunoassay technique (EMIT).

One of the major reasons why TDM has not achieved wider acceptance is that reliable results are frequently not obtainable. Even when they are, the time needed to obtain a report may be several days rather than several hours. It is essential that the physician be certain that the laboratory performing the assays, whether local or a reference laboratory, is providing reliable results. Reliability can be investigated in several ways: by splitting patient samples between the laboratory being evaluated and a reference laboratory whose work is known to be good (although if isolated values are discrepant, a question may arise as to whose result is correct), by splitting samples and sending one portion one week and the remainder the next week to check reproducibility, or by obtaining standards from commercial companies and submitting them as unknowns. Most good reference laboratories will do a reasonable amount of such testing without charge if requested to do so beforehand.

In some situations, assay results may be misleading without additional information. For certain drugs, such as phenytoin (Dilantin), digitoxin, and quinidine, a high percentage is bound to serum albumin, and only the nonbound fraction is metabolically active. This is similar to thyroid hormone protein binding. The free (nonbound) fraction may be increased in hypoalbuminemia or in conditions that change protein binding, such as uremia or administration of drugs that block binding or compete for binding sites. Drug level assays measure total drug and do not reflect changes in protein binding. In addition, some drugs, diseases, or metabolic states may potentiate or inhibit the action of certain therapeutic agents without altering blood levels or protein binding. An example is digoxin toxicity induced by hypokalemia.
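
As an illustration of how such binding changes are sometimes handled (the specific formula is not part of this text), a commonly used adjustment for total phenytoin in hypoalbuminemia is the Sheiner-Tozer (Winter-Tozer) correction, which estimates what the measured level would correspond to at a normal albumin concentration so that it can be compared with the usual therapeutic range; the classic constants shown assume normal renal function.

```python
# Illustrative sketch (not from this text): because assays report total
# phenytoin, a low serum albumin is often corrected for with the classic
# Sheiner-Tozer (Winter-Tozer) adjustment before comparing the result with
# the usual therapeutic range. Assumes normal renal function.

def adjusted_phenytoin(total_ug_ml, albumin_g_dl):
    """Albumin-adjusted total phenytoin concentration (ug/ml)."""
    return total_ug_ml / (0.2 * albumin_g_dl + 0.1)

# A measured level of 8 ug/ml with an albumin of 2.0 g/dl behaves more like
# a level of about 16 ug/ml at normal albumin:
print(round(adjusted_phenytoin(8.0, 2.0), 1))  # 16.0
print(round(adjusted_phenytoin(8.0, 4.0), 1))  # 8.9
```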

Other Congenital Diseases

There are a large number of congenital and genetic disorders, too many to include all in this book. If such a condition is suspected, in general the best procedure is to refer the patient or family to a university center that has an active genetics diagnosis program. If the state government health department has a genetic disease detection program, it can provide useful information and help in finding or making referral arrangements.

Some Genetic Disorders Diagnosable with DNA Probes

Huntington’s chorea
Adult polycystic disease
Alpha and beta thalassemia
Congenital adrenal hyperplasia
Duchenne’s and Becker’s muscular dystrophy
Fragile X syndrome
Hemophilia A and B
Myotonic dystrophy
Osteogenesis imperfecta
Alpha-1 antitrypsin deficiency
Cystic fibrosis
Sickle cell hemoglobinopathy
Retinoblastoma
Familial hypertrophic cardiomyopathy