Category: Clinical Laboratory Medicine

The clinical laboratory has a major role in modern medicine. A bewildering array of laboratory procedures is available, each of which has its special usefulness and its intrinsic problems, its advantages and its drawbacks. Advances in biochemistry and radioisotopes, to name only two conspicuous examples, are continually adding new tests or modifying older methods toward new usefulness. It seems strange, therefore, that medical education has too often failed to grant laboratory medicine the same prominence and concern that are allotted to other subjects.

  • Drugs of Abuse

    Testing for drugs of abuse usually occurs in two circumstances: possible or known overdose or testing of clinically well persons to detect drug use. Overdose will be discussed in the section on toxicology. Drug screening has its own unique problems. For example, it is necessary to provide legal chain of custody protection to specimens so that each time a specimen changes hands the person receiving it documents this fact and thereby becomes the theoretical protector of the specimen. Another difficulty is the attempt by some patients to invalidate tests performed on urine. This may involve diluting the urine specimen, adding substances that might interfere with the test, or substituting someone else’s specimen. Possible dilution can be suspected or detected by specimen appearance (an appearance suggesting water), very low specific gravity, or specimen temperature less than or more than body temperature. One investigator has found normal urine temperature immediately after voiding to be 97°-100°F (36°-38°C); the current guidelines of the National Institute on Drug Abuse (NIDA) are 90.5°-99.8°F (32.5°-37.7°C). Addition of foreign substances may be detected by unusual color or other appearance, low specimen temperature, or by unusually low or high specimen pH (normal urine pH is generally considered to be 4.5-8.0). Sometimes there may be an unusual smell. Specimen substitution by the patient may be suspected by specimen temperature lower than body temperature. A fluid without creatinine is probably not urine. Patient identity should be verified, by photograph if possible, to prevent a substitute from providing the specimen.
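
    The checks just described translate naturally into a simple rule-based screen. The following Python sketch encodes the NIDA temperature window, the normal urine pH range, and the creatinine check from the text; the specific gravity cutoff of 1.003 is a hypothetical illustration, since the text specifies only "very low" specific gravity.

    def specimen_validity_flags(temp_f, ph, specific_gravity, creatinine_mg_dl):
        """Flag a urine specimen for possible dilution, adulteration, or
        substitution, using the criteria described in the text."""
        flags = []
        if not (90.5 <= temp_f <= 99.8):  # NIDA guideline temperature window
            flags.append("temperature outside 90.5-99.8 F: possible dilution, "
                         "adulteration, or substitution")
        if not (4.5 <= ph <= 8.0):  # normal urine pH range
            flags.append("pH outside 4.5-8.0: possible added foreign substance")
        if specific_gravity < 1.003:  # hypothetical cutoff for 'very low'
            flags.append("very low specific gravity: possible dilution")
        if creatinine_mg_dl <= 0:  # a fluid without creatinine is probably not urine
            flags.append("no creatinine detected: fluid is probably not urine")
        return flags

    # Example: a suspiciously cool, watery specimen
    print(specimen_validity_flags(temp_f=88.0, ph=7.0,
                                  specific_gravity=1.001, creatinine_mg_dl=45.0))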

    A variety of methods can be used for initial screening. Currently, the two most popular are thin-layer chromatography (TLC) and some form of immunoassay. The Syva Company EMIT immunoassay was one of the first to be introduced and remains the most popular. Because of the possibility of cross-reacting substances and the implications of a positive test result for the patient, as well as legal considerations, positive screening test results should be confirmed by a method that uses a different detection principle. Currently, the method of choice is gas chromatography followed by mass spectrometry (GC/MS). Instruments are available that combine both components. Gas chromatography separates the various substances in the mixture, and the mass spectrometer bombards each substance from the chromatographic separation with electrons to ionize the constituents. The resulting ions are separated on the basis of mass/charge ratio, and a mass spectrum is generated for each substance by plotting the number of ions present at each mass/charge value. The spectrum is a fingerprint that identifies the compound. Therefore, the gas chromatography element separates the constituents, and the mass spectrometry component identifies them.

    Marijuana (cannabis). The most important active component of marijuana is delta-9-tetrahydrocannabinol (delta-9-THC; usually, although technically incorrect, abbreviated as THC). After inhalation of marijuana, THC can be detected in blood in about 1-2 minutes and reaches a peak in about 7 minutes. The sensation attributed to THC, however, does not appear until about 20-30 minutes after the serum peak, at a time when the serum level of THC is considerably lower. It is fat soluble and is quickly deposited into many tissues, including the brain. At the same time, the THC that reaches the liver is metabolized to a compound with psychogenic properties called “11-hydroxy-THC,” which then itself is rapidly metabolized to various compounds, the principal metabolite being a nonpsychogenic water-soluble compound conjugated to glucuronide molecules called “carboxy-THC.” About 30 minutes after absorption into tissues, THC is slowly released back into the blood, where liver metabolism continually reduces its body availability. If more marijuana is smoked before the previous amount has been eliminated, more THC will be deposited in tissues (up to a saturation point), and total elimination takes longer. Shortly after reaching the serum peak, the serum level of THC begins to fall due to tissue absorption and liver metabolism even if smoking continues, reaching only 10% of the peak levels in 1-2 hours. The serum half-life of THC after inhalation is about 0.5-1.5 hours. Carboxy-THC reaches a serum peak at about 20-30 minutes, at which time it begins to exceed THC. At 1 hour after inhalation, about 15% of plasma cannabinoids is THC and about 40% is carboxy-THC. Both THC and carboxy-THC are nearly all bound to plasma proteins (predominantly lipoproteins), and their concentration in plasma is about twice that of whole blood. About two thirds of the cannabinoid metabolites are excreted in feces and about one third in urine. The body elimination half-life of THC is about 24 hours (range, 18-30 hours), and the elimination half-life of carboxy-THC, the principal metabolite, is 3-6 days. Since the elimination half-life of THC is about 1 day and since steady state is reached after five half-lives, if the individual smokes roughly the same number of marijuana cigarettes each day, there will be equilibrium between intake and elimination of THC in about 5 days. Carboxy-THC has a longer elimination half-life, so that constant or heavy use of marijuana greatly prolongs the time that carboxy-THC will be detectable in the urine. Marijuana can be eaten as well as smoked. Absorption from the GI tract is slower and less predictable than through the lungs. Onset of the psychogenic effect occurs about 1-3 hours after ingestion of marijuana. Serum levels of 11-hydroxy-THC are considerably higher after oral intake of cannabis than levels after smoking.
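
    The five half-life rule cited above follows directly from first-order elimination: after n half-lives, (1/2)^n of a dose remains. A minimal sketch of the arithmetic, using the 24-hour THC elimination half-life from the text:

    def fraction_remaining(hours_elapsed, half_life_hours):
        """Fraction of drug remaining under first-order elimination."""
        return 0.5 ** (hours_elapsed / half_life_hours)

    THC_ELIMINATION_HALF_LIFE_HOURS = 24.0  # range 18-30 hours, per the text

    # After five half-lives only about 3% of a dose remains, which is why
    # daily intake and elimination reach approximate equilibrium in ~5 days.
    for day in range(1, 6):
        pct = 100 * fraction_remaining(day * 24, THC_ELIMINATION_HALF_LIFE_HOURS)
        print(f"day {day}: {pct:.1f}% of a single dose remains")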

    Urine assay. Carboxy-THC is the major metabolite of THC and is the one usually assayed in urine. Length of detectable urinary excretion varies with the amount of marijuana used per day, which, in turn, depends on the type of material (e.g., ordinary marijuana, hashish, or other forms) and the number of times per day of administration. There is also some effect from the route of use (smoking or ingestion) and individual tolerance or variation in the rate of metabolism. There are also assay technical factors. Most investigation of urine carboxy-THC detection has used detection cutoff levels of either 20 ng/ml or 100 ng/ml. The 100 ng/ml cutoff point was used in order to prevent claims that inhaling smoke from someone else’s marijuana cigarette might produce a positive urine test result. Actually, several studies have tested persons exposed to prolonged inhalation from cigarettes of other persons in small, confined areas (severe passive inhalation), and found that only a few persons had positive urine tests at the 20 ng/ml cutoff level. The longest time interval for a positive test was 3 days. Under ordinary experimental conditions of passive exposure, only a few individuals had detectable urine levels at the 20 ng/ml cutoff; detectability usually disappeared in less than 24 hours and almost always by 48 hours. Urine specimens should be frozen if testing is delayed, to preserve carboxy-THC values.

    Saliva assay. It has been reported that THC remains in saliva up to 5 hours after cannabis inhalation. Therefore, detection of THC in saliva theoretically might indicate recent use of marijuana. To date, saliva assay has not been widely used.

    Time period after use that marijuana presence can be detected. After a single cigarette containing usual amounts of THC is smoked, urine levels become detectable after about 1 hour and remain detectable at the 100 ng/ml cutoff level for 1-3 days and at the 20 ng/ml cutoff level for 2-7 days (therefore, the total detectable time period at the 20 ng/ml level is about 5-7 days, with a range of 2-10 days). For example, in one study those patients tested after smoking a single cigarette showed urine results more than 100 ng/ml for up to 3 days and results more than 20 ng/ml for an additional 5-8 days. Smoking more than one cigarette on a single day extends the detectability time by only about 2 days. In chronic heavy users, after smoking is stopped, urine results can remain positive at the 20 ng/ml level in some individuals up to 30-40 days. In one report, chronic marijuana users with recent heavy intake had urine assays more than 100 ng/ml for 7-14 days, followed by assays greater than 20 ng/ml for an additional 7-14 days. However, in another study of chronic marijuana smokers, results of about 25% of those who had smoked within 2 days of testing were negative at the 100 ng/ml level.

    Interpretation of test results. Carboxy-THC is not psychotropically active, and because of the variability of excretion due to the different factors just noted, detection of this substance in urine (if confirmed) indicates only that the patient has used marijuana in the recent past without providing evidence that correlates with physical or mental effects of marijuana. Serum levels of THC greater than 2 ng/ml are thought to indicate a probability that an individual will have some undesirable effects. In some circumstances, such as patient actions that may have been influenced by marijuana, it might be useful to obtain a THC serum level immediately as an indicator of current status and to compare with the urine carboxy-THC level. If the question arises whether marijuana use is ongoing, monitoring the urine periodically (e.g., every 4-5 days) should demonstrate a progressive downward trend in the values if smoking has indeed stopped, although there may be some fluctuations during this time. Initially positive test results with any screening procedure must be verified by a confirmatory procedure (such as GC/MS) if the positive results will lead to some significant action. The different sensitivity levels of different tests must also be kept in mind, as well as the effect of urine concentration or dilution.

    Cocaine. Cocaine can be self-administered intranasally, by smoking, or intravenously. It may also be taken orally, but this is not common since gastric juice inactivates most of the drug. Intranasal administration produces peak blood levels in 30-40 minutes (range, 15-60 minutes). About 80% of the dose reaches the bloodstream. Intravenous administration produces peak levels in 3-5 minutes. Smoking pharmacokinetics are similar to those of IV use, with peak levels reached in about 5 minutes, although only an average of about 45% of the dose reaches the bloodstream. The most common form used for smoking is known as “free-base,” which is derived from the active ingredient cocaine hydrochloride by separating the cocaine base from the hydrochloride ions, usually by extracting the cocaine base in a solvent. Already-processed cocaine base is often called “crack.” This is the most potent form of cocaine. Cocaine is very lipophilic and is rapidly taken up by tissues containing lipid, such as the brain. The half-life of cocaine in the body after the serum peak is about 1.5 hours (range, 0.5-2 hours) for all methods of drug intake. About 25%-40% of the dose that reaches the bloodstream is converted to the major metabolite benzoylecgonine by hydrolysis in fluids and peripheral tissues and excreted in the urine. Benzoylecgonine has a body half-life of 7-9 hours, which is about 6 times as long as that of cocaine. About 20%-25% of serum cocaine is converted to other metabolites, with roughly equal contribution by the liver and by serum cholinesterase. About 1% is excreted unchanged in the urine.

    Detection of cocaine. Cocaine or its metabolites can be measured in serum or in urine. The serum half-life of cocaine is short, and cocaine from a single dose is usually nondetectable in 6-10 hours, although it may be detectable longer with very sensitive methodology. Multiple doses may prolong serum detectability. Cocaine in urine is detectable for only about 8-12 hours after a single dose. Cocaine is usually investigated in urine through detection of its metabolite benzoylecgonine. This is detectable in urine beginning 1-4 hours after a cocaine dose. How long it will remain detectable depends on the quantity taken, whether dosage is single or multiple, individual patient variation, and the sensitivity of the detection method. RIA is the most sensitive of the screening methods (5 µg/L) and may detect cocaine metabolites as long as 7 days after a large dose. Enzyme immunoassay (EIA, an EMIT variant) is less sensitive (300 µg/L) and would detect the presence of the same dose for about 2-3 days. Since some false positive results can be obtained with any of the screening tests, a positive result must be verified by a confirmatory method such as GC/MS. The screening methods are designed to detect benzoylecgonine, which remains detectable considerably longer than cocaine, so that a positive urine screening test result does not mean the patient was under the influence of cocaine at the time he or she produced the urine specimen, and the result usually will not predict (except as an estimate involving considerable time variation) when the last dose was taken. Proof of use at a specific time requires detection of cocaine itself in serum or other body tissue. This is usually done by GC/MS. Specimens should be placed in ice and the serum frozen to prevent hydrolysis of cocaine to its metabolites.
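
    The gap between RIA (5 µg/L) and EIA (300 µg/L) detection windows can be reasoned about with the same first-order model used above: the time for a level to fall below an assay cutoff is half_life × log2(initial/cutoff). The benzoylecgonine half-life and cutoffs below are from the text; the initial urine concentration is a hypothetical value for illustration only.

    import math

    def hours_detectable(initial_ug_l, cutoff_ug_l, half_life_hours):
        """Hours until a first-order-eliminated analyte falls below an
        assay cutoff; returns 0 if it is already below the cutoff."""
        if initial_ug_l <= cutoff_ug_l:
            return 0.0
        return half_life_hours * math.log2(initial_ug_l / cutoff_ug_l)

    BENZOYLECGONINE_HALF_LIFE_HOURS = 8.0  # 7-9 hours, per the text
    initial = 20000.0  # µg/L; hypothetical post-dose urine level

    for method, cutoff in (("RIA", 5.0), ("EIA/EMIT", 300.0)):
        hours = hours_detectable(initial, cutoff, BENZOYLECGONINE_HALF_LIFE_HOURS)
        print(f"{method} (cutoff {cutoff} µg/L): detectable ~{hours / 24:.1f} days")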

    Phencyclidine. Phencyclidine (PCP) effects are frequently not recognized; in one study, only 29% of patients were correctly diagnosed on admission. PCP is a water-soluble powder that is administered by smoking water-dissolved drug applied to some smoking material or by ingestion. About 70% of the dose reaches the bloodstream by either route. Peak serum levels are reached 5-15 minutes after smoking. Peak levels after oral intake are reached after about 2 hours. The body half-life of PCP after serum peak levels are reached varies considerably, averaging about 18 hours with a range of 8-55 hours, and is somewhat dependent on the dose. About 10% of the dose is excreted in the urine unchanged, and the remainder as various metabolites, with no single one greatly predominant. PCP or its metabolites are often detected in urine for about 1 week. In some cases it may be detected for several days to several weeks, again depending on the quantity administered, whether administration was acute or chronic, and the sensitivity of the detection method. Drug excretion can be increased by urine acidification. Serum or urine levels do not correlate well with severity of symptoms. PCP or some of its metabolites can be detected by RIA, EIA, TLC, and other techniques. These methods differ in sensitivity and each method has some substances that may cross-react. GC/MS is the best confirmatory method.

    Amphetamines. Methamphetamine is used more frequently than the other amphetamines. Amphetamines can be administered orally, intravenously, or by smoking. Tolerance frequently develops, necessitating larger doses to achieve desired effects. Other drugs are frequently used at the same time. Absorption from the GI tract is fairly rapid. Body half-life is 4-24 hours. About half the dose is metabolized in the liver. About 45% of the methamphetamine dose is excreted in urine unchanged, about 5% as amphetamine, and the remainder as other metabolites. Amphetamines are usually detectable in urine by 3 hours after administration of a single dose, and screening test results can be positive for 24-48 hours (dependent to some extent on the size of the dose and the sensitivity of the method). A positive result for amphetamines in urine generally means use in the last 24-48 hours. Screening methods include RIA, EIA, TLC, and other techniques. A substantial number of over-the-counter medications for colds or for weight reduction contain amphetamines or amphetamine analogs that may cross-react in one or more screening tests. Other medications may also interfere. GC/MS is the best confirmatory method.

    Morphine and related alkaloids. Morphine and codeine are derived from the seed pods of the opium poppy. Heroin is made from morphine. Morphine and heroin are usually injected intravenously. About 10% (range, 2%-12%) of a morphine dose is excreted unchanged in the urine, and about 60%-80% of the dose is excreted in urine as conjugated glucuronides. The body half-life is 1.7-4.5 hours. Heroin is rapidly metabolized to morphine, with about 7% of the dose excreted as morphine and 50%-60% excreted as conjugated morphine glucuronides. Codeine is excreted primarily as conjugated codeine glucuronides in the urine, but a small amount (<10%) is metabolized to morphine and morphine conjugated glucuronides, which appear in the urine. Poppy seeds are used as a filling for baked goods and also are used unprocessed; they are sold legally even though they contain some natural morphine and codeine. The amount of opiate alkaloids in poppy seeds is not sufficient to produce any symptoms or noticeable sensation, but consumption of a moderate amount of this material can result in detectable concentrations of morphine in the urine that can last as long as 36-60 hours.

    Screening tests for morphine and other opiates are similar to those for other drugs of abuse: RIA, EIA (EMIT and others), TLC, and in addition a hemagglutination inhibition assay. Most of these methods cannot differentiate between codeine and morphine. Also, since codeine metabolism results in a small but measurable amount of morphine conjugates, prescription medications containing codeine for pain relief or cough suppressive effects may produce positive test results for morphine. In general, if the concentration of codeine greatly exceeds that of morphine, the parent drug is probably codeine. Excluding prescription medications, the presence of morphine in the urine indicates illicit use of morphine, heroin, or codeine in the past 1-2 days. Detection of these compounds should be confirmed, and the compound identified, using GC/MS. In addition, GC/MS can differentiate between poppy seed ingestion and heroin intake by detecting and measuring 6-monoacetylmorphine, a metabolite of heroin that is not present in poppy seeds or in the urine of persons who ingest poppy seeds.

  • Cyclosporine

    Cyclosporine (previously called “cyclosporin A”) is a compound derived from a soil fungus; it has strong immunosuppressive activity and is widely used to prevent transplant rejection. Cyclosporine is thought to inhibit helper T-cell function with minimal effect on B-cell function. Cyclosporine can be administered orally or intravenously. If given orally, it is absorbed through the lymphatics of the distal ileum, with considerable variability in time and degree of absorption. During the immediate postabsorptive period, 8%-60% of the dose is absorbed, although later absorption may improve somewhat. After an oral dose, peak serum levels are reached in about 2.5 hours, and subsequent body elimination half-life is about 4 hours. There is wide variation of these two parameters between individual patients (e.g., elimination time variation of 4.3-53 hours in renal transplant patients). About 50%-60% of the absorbed dose is bound to RBCs, 25%-35% is in plasma, and about 10%-15% is bound to leukocytes. The plasma component is about 60% bound to high-density lipoproteins, about 25% to low-density lipoproteins, and about 10% to other plasma proteins, leaving about 5% unbound. Almost all of the drug is metabolized by the liver microsome system into various derivatives that are excreted in bile and feces, with only 1%-6% of the metabolites excreted in urine. There are several serious side effects. About 25% of transplant patients show some degree of renal toxicity. Lesser numbers develop hypertension or liver toxicity.

    Cyclosporine assay. The blood concentration of cyclosporine cannot be predicted from an oral dose. In addition, there is a narrow balance between insufficient immunosuppression with too little drug and inducement of toxicity with too much. Therefore, TDM is considered essential. However, there is considerable controversy in the literature regarding the technical details of cyclosporine TDM. Either whole blood or plasma can be analyzed. Distribution of the drug between plasma and RBCs is temperature dependent, with plasma concentration decreasing as temperature decreases. Therefore, to obtain plasma, one must equilibrate the blood at a fixed temperature, and this temperature will influence the assay value. On the other hand, whole blood assay results are affected by the patient hematocrit. Whole blood assay is recommended by the AACC Task Force on Cyclosporine Monitoring (1987). The two widely used assay methods are HPLC and RIA. RIA produces higher values than HPLC and includes some cross-reacting metabolites with the cyclosporine measurement. The HPLC assay is more specific since it does not include metabolites. However, there are many published HPLC procedures that vary in one or more technical details. At present, there is no consensus on a single analytic protocol, and since different methods and technical variations produce different results, an exact therapeutic range has not been established. Average values from the literature are 250-1,000 µg/L using whole blood by RIA, 50-200 µg/L using plasma by RIA, and 100-500 µg/L using whole blood by HPLC. Trough levels are usually obtained. Certain medications, such as phenytoin, which activates the liver microsome system, affect cyclosporine blood levels.
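
    Because the therapeutic range depends on both the specimen matrix and the assay method, any interpretive logic must carry the method along with the value. A minimal sketch using the literature-average ranges quoted above (averages, not consensus values):

    # Literature-average therapeutic ranges in µg/L, keyed by (matrix, method).
    CYCLOSPORINE_RANGES_UG_L = {
        ("whole blood", "RIA"): (250, 1000),
        ("plasma", "RIA"): (50, 200),
        ("whole blood", "HPLC"): (100, 500),
    }

    def interpret_cyclosporine(value_ug_l, matrix, method):
        """Compare a cyclosporine level with the literature-average range
        for the matrix/method combination used."""
        key = (matrix.lower(), method.upper())
        if key not in CYCLOSPORINE_RANGES_UG_L:
            raise ValueError("no average range quoted for this combination")
        low, high = CYCLOSPORINE_RANGES_UG_L[key]
        if value_ug_l < low:
            return "below average range: possibly insufficient immunosuppression"
        if value_ug_l > high:
            return "above average range: possible toxicity"
        return "within the average literature range"

    print(interpret_cyclosporine(320, "whole blood", "HPLC"))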

    FK-506 (tacrolimus). This is a recently introduced macrolide immunosuppressive agent, derived from a bacterium, that selectively suppresses both helper/inducer and cytotoxic T-lymphocyte activity, similar to the action of cyclosporine. It appears to have immunosuppressive activity equal to or greater than that of cyclosporine (especially in liver transplants) with substantially less toxicity. However, nephrotoxicity may occur. Use of medications inhibiting liver microsomal activity (e.g., cimetidine, erythromycin, ketoconazole) increases FK-506 plasma concentration. Assay for FK-506 is possible using monoclonal antibody enzyme immunoassay methods, although these are “first generation” and need to be improved. The therapeutic range is also not standardized and is probably method dependent.

  • Antibiotics

    Gentamicin. Methods of estimating antibiotic therapeutic effectiveness have been discussed elsewhere (chapter 14). Several antibiotics possess therapeutic ranges whose upper limits border on toxicity. Serum assays for several of these have been developed, most commonly using some type of immunoassay. One example will be used to illustrate general principles. Gentamicin (Garamycin) is one of the aminoglycoside antibiotics that is active against gram-negative organisms, including Pseudomonas aeruginosa. Unfortunately, side effects include ototoxicity and nephrotoxicity. Drug excretion is mainly through renal glomerular filtration. Serum peak levels and residual (trough) levels both provide valuable information. Residual levels are measured just before the next dose. Values at this time correlate best with nephrotoxicity, especially when serum levels are greater than 2 µg/ml. Specimens for peak level determination are obtained approximately 30 minutes after the end of IV infusion and 1 hour after intramuscular injection. Peak levels correlate best with therapeutic effectiveness (i.e., whether adequate serum levels are present) and possibly with ototoxicity. Normal peak values are usually considered 4-8 µg/ml. Values less than 4 µg/ml may be ineffective, whereas those greater than 10 µg/ml predispose to toxicity. Gentamicin assay is desirable because serum levels differ considerably among patients receiving the same dose, and serum gentamicin half-life is equally variable. Standard doses or nomograms based on serum creatinine level fail to predict blood concentration accurately for peak or residual levels in a substantial number of patients even with adequate renal function. When renal function is impaired or when nephrotoxic antibiotics have previously been administered, serum assay becomes essential. It should be mentioned that peak or residual levels within accepted reference limits do not guarantee safety, since some studies have shown onset of renal function decrease in the presence of acceptable serum values.
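
    The peak and residual criteria above amount to a small set of interpretive rules. A sketch of that logic, using only the cutoffs stated in the text (and remembering the caveat that levels within accepted limits do not guarantee safety):

    def interpret_gentamicin(peak_ug_ml, trough_ug_ml):
        """Apply the gentamicin peak/trough cutoffs described in the text."""
        notes = []
        if peak_ug_ml < 4:
            notes.append("peak < 4 µg/ml: may be therapeutically ineffective")
        elif peak_ug_ml <= 8:
            notes.append("peak within the usual 4-8 µg/ml range")
        elif peak_ug_ml <= 10:
            notes.append("peak above the usual range but not above 10 µg/ml")
        else:
            notes.append("peak > 10 µg/ml: predisposes to toxicity")
        if trough_ug_ml > 2:
            notes.append("residual (trough) > 2 µg/ml: correlates with nephrotoxicity")
        return notes

    print(interpret_gentamicin(peak_ug_ml=9.0, trough_ug_ml=2.4))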

    Vancomycin. Only a small amount of oral vancomycin is absorbed, so that the oral form is used to kill GI tract bacteria such as Clostridium difficile. Intravenous medication is used for other infections. Intravenous vancomycin is about 50%-60% bound to serum albumin, and 80%-90% is excreted unchanged in the urine. The serum half-life is 2-3 hours in children and 4-8 hours in adults with normal renal function. In renal failure, the serum half-life becomes 7-8 days (range, 4-9 days), and instead of the usual adult IV dose of 500 mg every 6 hours, only 500-1,000 mg once per week is sufficient. Peak and residual (trough) levels are usually recommended. Residual levels are usually obtained just before a dose is given; the reference values are 5-10 µg/ml. Unfortunately, different investigators do not agree on when to draw specimens for peak values after the end of IV infusion, with suggested times including immediately, 15 minutes, 30 minutes, and 2 hours after the infusion. Vancomycin serum levels apparently fall rapidly for a time after the end of IV infusion and then more slowly. At 15 minutes after the end of infusion, serum values of 25-30 µg/ml are equivalent to levels of 30-40 µg/ml (the most commonly accepted peak value range) at the end of infusion.

  • Theophylline (Aminophylline)

    Theophylline is used primarily as a bronchodilating agent for therapy of asthma. Over the therapeutic range there is a reasonably linear correlation between dosage and therapeutic effect. The drug is administered intravenously and orally. Oral medication is available in regular (noncoated or liquid) and slow-release (SR) forms. SR forms are available in twice-daily, and even once-daily, dosages. For most (but not all) regular oral preparations, absorption takes place predominantly in the small intestine, absorption is essentially complete, and food usually does not interfere significantly. Absorption rates are more apt to vary, and time to serum peak is less predictable, among the different SR preparations. In addition, food (especially a high-fat meal) is more likely to interfere with absorption of some SR preparations. One investigator recommends dose intake 1 hour before or 2 hours after meals when using SR preparations influenced by food. For the regular oral medication, time to peak (for adults) is about 2-3 hours, half-life is 4-6 hours (range, 3-8 hours), and time to steady state is about 15-20 hours. For children, half-life is more variable (1-8 hours) and time to steady state is also more variable (5-40 hours). Time to peak for the oral SR preparation is about 5 hours. About 50%-60% (range, 40%-65%) of theophylline is bound to serum albumin. Binding is less in neonates and at lower pH (acidosis). About 90% is metabolized in the liver, and most of the metabolites, plus about 10%-15% of unchanged theophylline, are excreted by the kidneys. Therefore, except in the first several months of life, renal function is not a major factor in theophylline serum concentration. Adults who smoke tobacco or marijuana and children excrete theophylline somewhat more rapidly (decreased serum half-life) than nonsmoking adults. Factors that reduce theophylline clearance (increased serum half-life) include young age (0-8 months), congestive heart failure, cor pulmonale, severe liver dysfunction, sustained fever, pneumonia, obesity, cessation of smoking, cimetidine, ciprofloxacin, and erythromycin-family antibiotics. Some theophylline assay methods may show partial interference (some false increase in values) from substances present in uremia. Children show more individual differences in theophylline clearance and as a group eliminate theophylline more rapidly than adults. In addition, one report indicated that in children a high-protein diet increased theophylline elimination and a high-carbohydrate diet decreased it.

    Besides the factors just mentioned, therapy is complicated by the many theophylline preparations available, many of which vary significantly in theophylline content and the rate at which it is absorbed. Noncompliance is a constant problem in therapy and in the interpretation of theophylline blood levels, because low levels due to noncompliance may be misinterpreted as due to rapid metabolism or excretion. The reverse mistake can also be made. Another difficulty is the asthmatic patient who may already have taken one or more extra doses of theophylline before being seen by the physician.

    There is a relatively narrow zone between the therapeutic range (10-20 µg/ml; 55-110 µmol/L) and values associated with toxicity. The degree of elevation over the reference range is not a reliable predictor of toxicity risk except in a very general way, since severe toxicity can develop in some patients at less than twice the upper limit of the reference range. Although mild toxic symptoms may appear first, severe toxicity can develop without warning. If there is a question about previous drug intake, a specimen for theophylline assay should be obtained before therapy is begun. Therapy can then be started and the dosage modified when assay results become available. Thereafter, when steady state is achieved, serum peak concentration should be measured (30 minutes after the dose for IV theophylline, 2 hours after the dose for regular theophylline, and about 5 hours [range, 3-7 hours, depending on the particular medication] after the dose for sustained-release forms). Theophylline is thus an exception to the general rule that the residual (trough) level is better than the peak level to monitor therapy.
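
    Because theophylline is monitored at the peak rather than the trough, the correct sampling time depends on route and formulation. A small helper encoding the times given above; the SR value is an approximation, since the text notes it varies by product (range, 3-7 hours):

    THEOPHYLLINE_PEAK_SAMPLING_HOURS = {
        "iv": 0.5,  # 30 minutes after the dose
        "regular oral": 2.0,  # 2 hours after the dose
        "sustained release": 5.0,  # about 5 hours; 3-7 depending on product
    }

    def peak_sampling_time_hours(formulation):
        """Hours after the dose (at steady state) at which to draw a peak
        theophylline specimen, per the recommendations in the text."""
        return THEOPHYLLINE_PEAK_SAMPLING_HOURS[formulation.lower()]

    print(peak_sampling_time_hours("sustained release"))  # -> 5.0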

  • Antiarrhythmic Drugs

    There is a large and ever-growing list of these medications, too many to include here. TDM data for some members of this group are summarized in Table 37-25. Several have been selected for more detailed discussion here.
    Procainamide. Procainamide is used to control certain ventricular arrhythmias and can be given orally or intravenously. Only about 10% is bound to serum protein. Maintenance is usually achieved by oral medication. About 85% of the oral dose is absorbed, mostly in the small intestine. About 50% of the drug is excreted unchanged by the kidneys. About 50% is metabolized, predominantly by the liver. The major metabolite of procainamide is N-acetylprocainamide (NAPA), which constitutes about 25%-30% of the original dose (7%-40%). NAPA is produced in the liver by a process known as N-acetylation. It has antiarrhythmic properties about equal to those of its parent compound. About 10% of NAPA is bound to serum protein, and about 85% is excreted unchanged by the kidneys. NAPA has a serum half-life about twice that of procainamide. Therefore, NAPA levels continue to rise for a time after procainamide levels have stabilized. There is approximately a 1:1 ratio of procainamide to NAPA after both have equilibrated. Poor liver function may decrease NAPA formation and produce a high ratio (> 1.0) of procainamide to NAPA (i.e., less NAPA relative to the amount of procainamide). Even though procainamide degradation may be decreased, it is only 25%-30% metabolized in the liver, so that it is not affected as much as NAPA. On the other hand, poor renal function decreases NAPA excretion and decreases the procainamide/NAPA ratio to less than 1.0 (i.e., more NAPA relative to procainamide). Even though procainamide excretion may also be decreased, the amount of NAPA excreted through the kidneys is much higher than the amount of procainamide, so that poor renal function affects NAPA proportionally more than procainamide. Another factor is the acetylating process of the liver, which is an inherited characteristic. Isoniazid and hydralazine are also metabolized by this system. About one half of the population are slow acetylators and about one half are fast acetylators. Fast acetylation produces more NAPA (tending to produce a procainamide/NAPA ratio < 1.0), and slow acetylation produces less NAPA (procainamide/NAPA ratio > 1.0). Assessment of acetylation status is dependent on adequate renal function, since poor renal function can affect the procainamide/NAPA ratio. About 50% of patients on long-term procainamide therapy develop antinuclear antibodies, and up to 30% may develop a syndrome very similar to systemic lupus erythematosus. Slow acetylators are more likely to develop these conditions than fast acetylators.

    Since both procainamide and NAPA have antiarrhythmic action and since several factors influence their levels and their relationship to each other, most authorities recommend that both be assayed and that therapeutic decisions be based on the sum of both rather than on either one alone. Therapeutic range for the combination of procainamide and NAPA is 10-30 µg/ml (42.50-127.47 µmol/L). Specimens for TDM are usually obtained just before the next scheduled dose to evaluate adequacy of dosage. Peak levels or specimens drawn during symptoms are needed to investigate toxic symptoms.
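
    The interpretive logic above (the combined level against the 10-30 µg/ml range, plus the procainamide/NAPA ratio as a clue to acetylator status and organ function) can be sketched as follows. As the text notes, ratio interpretation assumes adequate renal function:

    def interpret_procainamide(procainamide_ug_ml, napa_ug_ml,
                               renal_function_adequate=True):
        """Evaluate the combined procainamide + NAPA level and the
        procainamide/NAPA ratio, per the rules in the text."""
        notes = []
        total = procainamide_ug_ml + napa_ug_ml
        if total < 10:
            notes.append("combined level < 10 µg/ml: possibly subtherapeutic")
        elif total > 30:
            notes.append("combined level > 30 µg/ml: possible toxicity")
        else:
            notes.append("combined level within the 10-30 µg/ml range")
        if napa_ug_ml <= 0:
            notes.append("NAPA not detected: ratio not calculable")
        elif not renal_function_adequate:
            notes.append("poor renal function: ratio not interpretable")
        elif procainamide_ug_ml / napa_ug_ml > 1.0:
            notes.append("ratio > 1.0: suggests slow acetylator or poor liver function")
        else:
            notes.append("ratio <= 1.0: suggests fast acetylator")
        return notes

    print(interpret_procainamide(procainamide_ug_ml=8.0, napa_ug_ml=10.0))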

    There are two types of procainamide oral preparations, standard (relatively short acting) and sustained release (SR). For the standard type, peak absorption levels are usually reached in about 1.5 hours (range, 1-2 hours) after an oral dose. However, some persons absorb procainamide relatively slowly, and the peak may be delayed up to 4 hours after the dose, close to the time one would expect a trough level. In one study, this occurred about one third of the time. Therefore, some investigators recommend both peak and trough for initial evaluation. Patients with acute myocardial infarction or cardiac failure are more likely to have delayed absorption. Serum half-life is about 3 hours (range, 2-4 hours). Time to steady state is about 18 hours (range, 11-20 hours). Therefore, the half-life is considered a short one, and there is a greater fluctuation in serum values compared with an agent with a long half-life. The peak level after oral SR procainamide occurs about 2 hours after the dose (range, 1-3 hours) but may not occur until later in patients with slow absorption. Time to steady state is about 24-30 hours.

    Lidocaine. Lidocaine (Xylocaine) hydrochloride is a local anesthetic that has antiarrhythmic properties. Used as an antiarrhythmic, it is generally given intravenously to patients who are seriously ill. Lidocaine is lipid soluble and distributes rapidly to many tissues. When it is given as a single bolus, plasma levels fall rapidly, with perhaps as much as a 50% decrease in about 20 minutes. On the other hand, drug given by IV infusion reaches a plateau rather slowly because so much of the drug is distributed to peripheral tissues. Therefore, administration is usually done with one or more bolus loading dose injections followed by IV infusion. The half-life of lidocaine is 1-2 hours, and time to steady state is 5-10 hours (range, 5-12 hours). About 70% is protein bound; of the total that is protein bound, about 30% is bound to albumin and 70% to alpha-1 acid glycoprotein. Lidocaine is about 90% metabolized in the liver, with 5%-10% excreted unchanged by the kidneys. The major hepatic metabolites of lidocaine also have some antiarrhythmic effect. The primary metabolites are themselves further metabolized in the liver, with less than 10% of the primary metabolites being excreted unchanged in urine.

    Conditions that produce an increase in plasma lidocaine levels are severe chronic liver disease (decreased drug inactivation), chronic renal disease (decreased excretion), and congestive heart failure (reduced volume of distribution). In acute myocardial infarction, there is increase in the binding protein alpha-1 acid glycoprotein and a subsequent increase in plasma total lidocaine values; however, bound drug is pharmacologically inactive, and the nonbound active fraction often does not increase. Propranolol has been reported to decrease lidocaine clearance, producing higher plasma values.

    Complications related to lidocaine therapy have been reported in 6%-20% of cases. Therapeutic drug monitoring therefore requires a fast assay method, so that results are available without much delay. HPLC and EMIT are the two most frequently used methods. Colorimetric methods are also available. It has been recommended that lidocaine specimens be drawn 12 hours after beginning therapy and then daily. In seriously ill patients, in those whose arrhythmias persist in spite of lidocaine, and when lidocaine toxicity is suspected, assay every 12 hours could be helpful. The therapeutic range is 1.5-5 µg/ml.

    Tocainide. Tocainide (Tonocard) is an analog of lidocaine that also is used to treat ventricular arrhythmias. Tocainide has some advantages over lidocaine since tocainide can be given orally and has a longer half-life (about 15 hours; range, 12-18 hours) due to much less first-pass hepatic metabolism. The half-life may be increased with severe liver disease or chronic renal failure. About 10% is bound to serum protein. The metabolites of tocainide are excreted in the urine and do not have antiarrhythmic activity. Peak serum levels are reached 1.5-2.0 hours after an oral dose. Steady state is reached in 3 days. Therapeutic range is 4-10 µg/ml. Assay is usually done by HPLC.

    Quinidine. Quinidine has been used for treating both atrial and ventricular arrhythmias. There are two forms of quinidine: the sulfate and the gluconate. Both are available for oral administration in both regular and long-acting (SR) preparations. The gluconate form can be given intravenously. Oral regular quinidine sulfate has a time to peak value of about 2 hours (range, 1-3 hours), a serum half-life of about 6 hours (range, 5-8 hours), and a time to steady state of about 24 hours. Regular oral quinidine gluconate has a time to peak value of about 4 hours. SR quinidine sulfate (Quinidex) has a time to peak value of about 2 hours, a serum half-life of about 20 hours, and a time to steady state of about 4 days. SR quinidine gluconate (Quiniglute, Duraquin) has a time to peak value of about 4 hours and a half-life of about 10 hours. However, when the SR preparations are used, there is relatively little fall in serum levels after the initial dose before subsequent doses. About 80% of quinidine (literature range, 60%-90%) is bound to serum proteins. Quinidine is metabolized by the liver, with about 10%-20% excreted unchanged in urine by glomerular filtration. Urine excretion is influenced by urine pH.

    Factors that may decrease quinidine levels include hypoalbuminemia, drugs that compete for albumin binding, and drugs that activate hepatic enzyme activity, such as phenytoin and phenobarbital. Factors that tend to increase quinidine levels include congestive heart failure, poor renal function (prerenal or intrinsic renal disease), and possibly severe liver disease. Renal excretion is increased by acidification of the urine and decreased by urine alkalinization.

    Several methods are available for quinidine assay. The most commonly used are fluorometric procedures, with or without preliminary extraction steps. These measurements include quinidine and several of its metabolites. Certain other fluorescing compounds may interfere. Extraction eliminates some but not all of the metabolites. More specific methods include HPLC and EMIT. Values for the direct (nonextracted) fluorescent methods are about 50% higher than those using HPLC or EMIT (i.e., the therapeutic range with the nonextracted fluorometric method is about 3-8 µg/ml [9.25-24.66 µmol/L], whereas the range using the double-extracted fluorometric method or HPLC is 2.3-5 µg/ml [7.09-15.41 µmol/L]). The specimen for TDM should be drawn just before the next dose is to be given (residual or trough level).

    Reasons for TDM of quinidine include the following:

    1. Various quinidine commercial products differ considerably in absorption.
    2. Toxic levels of quinidine can produce certain arrhythmias that could also be due to the patient's underlying disease (whether from noncontrol or from noncompliance).
    3. There is a possibility of drug interaction, because patients taking quinidine are likely to be taking several drugs or to receive additional drugs in the future.
    4. Patient disease may modify quinidine metabolism or excretion (old age frequently is associated with reduced renal function, which modifies renal excretion of quinidine).

    Flecainide. Flecainide (Tambocor) is another drug used for ventricular arrhythmias, including premature ventricular contractions and ventricular tachycardia or fibrillation. About 95% is absorbed. Food or antacids do not affect absorption. After absorption, roughly 40% is bound to serum proteins. About 30% (range, 10%-50%) is excreted unchanged in the urine. The major metabolites have no antiarrhythmic activity. Peak plasma levels after oral dosage are reached in about 3 hours (range, 1-6 hours). Serum half-life averages 20 hours (range, 7-27 hours) and may be longer in patients with severe renal disease or congestive heart failure. Steady state is reached in 3-5 days. Propranolol increases flecainide serum levels approximately 20%. Hypokalemia or hyperkalemia may affect the therapeutic action of flecainide. Flecainide paradoxically aggravates ventricular arrhythmias in about 7% of patients, especially in the presence of congestive heart failure.

    Digoxin. Digoxin could be included in the section on toxicology, since most serum assay requests are for the purpose of investigating possible digoxin toxicity. However, an increasing number of studies have demonstrated unsuspected overdosage or underdosage (30% toxicity and 11% underdigitalization in one study), and requests for baseline levels are becoming more frequent. The volume of requests and the relative ease of performance (by immunoassay) make this assay readily available, even in smaller laboratories. The widespread use of digoxin, the narrow borderline between therapeutic range and toxicity, and the nonspecific nature of mild or moderate toxic signs and symptoms that mimic a variety of common disorders (diarrhea, nausea, arrhythmias, and ECG changes) contribute to the need for serum assay.

    Digoxin therapeutic drug monitoring data

    About 20%-30% of digoxin is bound to serum albumin. About 80% (range, 60%-90%) is excreted unchanged by the kidneys. About 20% is metabolized in the liver, with most of this being excreted as digoxin metabolites. About 10% of the adult population metabolizes a greater percentage of digoxin (which may be as high as 55%). After an oral dose is given, serum levels rise to a peak at 30-90 minutes and then slowly decline until a plateau is reached about 6-8 hours after administration. Digoxin assay specimens must be drawn at least 6 hours (preferably at least 8 hours) after the last dose with either oral or IV administration, to avoid blood levels that are significantly higher than would be the case when tissue levels have equilibrated. The 6- to 8-hour time span mentioned is minimum elapsed time; specimens may be drawn later. In many cases more information is obtained from a sample drawn shortly before the next scheduled dose. Serum half-life is approximately 36-38 hours. Normal therapeutic range is 0.5-2.0 ng/ml (0.64-2.56 nmol/L).
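
    Both the sampling-time rule (at least 6, preferably 8, hours after the last dose) and the therapeutic range lend themselves to a mechanical check. A minimal sketch using only the numbers in the text:

    def interpret_digoxin(level_ng_ml, hours_since_last_dose):
        """Check digoxin sampling time, then classify the level against
        the 0.5-2.0 ng/ml therapeutic range given in the text."""
        if hours_since_last_dose < 6:
            return ["drawn < 6 hours after dose: level not yet equilibrated "
                    "with tissue; redraw later"]
        notes = []
        if hours_since_last_dose < 8:
            notes.append("drawn 6-8 hours post-dose: acceptable, 8+ preferred")
        if level_ng_ml < 0.5:
            notes.append("below 0.5 ng/ml: possible underdigitalization")
        elif level_ng_ml > 2.0:
            notes.append("above 2.0 ng/ml: possible toxicity; correlate clinically")
        else:
            notes.append("within the 0.5-2.0 ng/ml therapeutic range")
        return notes

    print(interpret_digoxin(level_ng_ml=2.4, hours_since_last_dose=7))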

    Various metabolic disorders and medications may alter body concentration or serum levels of digoxin or may affect myocardial response to usual dosage. The kidney is the major route of excretion, and a decrease in renal function sufficient to raise serum creatinine levels will elevate serum digoxin levels as well. In renal failure, digoxin half-life may be extended to as long as 5 days. Hypothyroidism also increases digoxin serum values. On the other hand, certain conditions affect patient response to digitalis without affecting blood levels. Myocardial sensitivity to digoxin, regardless of dose, is increased by acute myocardial damage, hypokalemia, hypercalcemia, hypermagnesemia or hypomagnesemia, alkalosis, tissue anoxia, and glucagon. Drugs that produce hypokalemia (including various diuretics, amphotericin B, corticosteroids, or glucose infusion) thus predispose to toxicity. Other medications, such as phenylbutazone, phenytoin, and barbiturates (which activate hepatic degradation mechanisms), or kaolin (Kaopectate), antacids, cholestyramine, and certain oral antibiotics such as neomycin (which interfere with absorption) tend to be antagonistic to the effect of digitalis. Quinidine elevates digoxin levels in about 90% of patients by 50%-100% (range, 30%-330%). The effect on digoxin levels begins within 24 hours, with peak effect in 4-5 days. Certain other medications can increase serum digoxin levels to some extent.

    Interfering substances. Digoxin can be measured by a variety of immunoassay methods. Digoxin-like cross-reacting substances have been reported in many patients (not all) in the third trimester of pregnancy, infants up to 6 months of age (the effect peaking at 1 week of age), patients with renal failure, and patients with severe liver disease. Different kits are affected to different extents. Some investigators report that the cross-reacting substances bind to serum proteins. In most cases the cross-reaction increases serum digoxin less than 1.0 ng/ml, but sometimes the effect may be greater.

    Antidigoxin antibody therapy. Another analytical problem occurs when digitalis toxicity is treated with fragments of antidigoxin antibodies (Fab, “antigen-binding fragments”). These fragments are prepared by first producing antidigoxin IgG class antibody in animals, then enzymatically splitting off the antigen-binding variable regions (Fab portion) of the IgG molecule. This eliminates the “constant” region of the IgG molecule, which is the most antigenic portion of the molecule. The antidigoxin antibody Fab fragments bind to plasma and extracellular fluid digoxin. This creates a disturbance in equilibrium between free (unbound) digoxin within cells and within the extracellular compartments, so that some intracellular digoxin moves out of body cells to restore the equilibrium. The digoxin-Fab bound complexes are excreted in the urine by glomerular filtration. Their elimination half-life with normal renal function is about 15-20 hours (range, 14-25 hours).

    Laboratory digoxin assay is involved for two reasons. First, a pretherapy baseline is required to help establish the diagnosis of digoxin toxicity and to help estimate the dose of Fab fragments needed. Second, after injection of the Fab dose, another assay is helpful to determine if adequate therapy was given, either because pretreatment digoxin tissue levels were higher than estimated or too much of the Fab fragment dose was lost in urine before sufficient digoxin had diffused out of the body cells. It is necessary to wait at least 6-8 hours after therapy for a postdose assay, to allow for equilibration time between cells and extracellular fluid. An assay system specific for free digoxin is necessary (usually done by a technique such as microfiltration, which separates unbound from Fab-bound digoxin), because the Fab-digoxin bound complexes are included with unbound (free) digoxin in total digoxin assays. Soon after therapy begins there is greatly increased Fab-digoxin bound complex formation in plasma (and, therefore, elevated total digoxin levels, sometimes as high as 20 times pretreatment levels), whereas free digoxin levels are low. Later, 12-20 hours after the initial therapeutic dose, plasma free digoxin reequilibrates, and may reach toxic levels again if sufficient intracellular digoxin has not been captured. It may take several days to excrete all the Fab-digoxin bound complexes, and the serum total digoxin level may remain elevated more than 1 week if there is poor renal function.

    Digoxin assay clinical correlation. In various studies, there is a certain amount of overlap in the area that statistically separates normally digitalized patients from those with toxicity. This overlap exists because it is difficult to recognize mild degrees of toxicity, because patient sensitivity to digitalis varies, and because the assay technique itself, no matter how well done, displays a certain amount of variation when repeated determinations are performed on the same specimen, as do all laboratory tests. Regardless of these problems, if the clinical picture does not agree with the level of toxicity predicted by digoxin assay values, and laboratory quality control is adequate, the physician should not dismiss or ignore the assay results but should investigate the possibility of interference by improper specimen collection time interval, drug interaction, or metabolic alterations. However, the assay should be repeated first, to verify that a problem exists.

    Digitoxin. Digitoxin is more than 95% bound to serum albumin. Serum half-life is about 8 days (2.5-16.5 days). Digitoxin is about 90% metabolized in the liver. About 5%-10% is excreted unchanged through the kidneys. Drugs that activate hepatic enzyme systems, such as phenytoin and barbiturates, increase metabolism of digitoxin and decrease serum levels. Hypoalbuminemia and drugs that compete for binding sites on albumin also tend to decrease digitoxin serum levels. The long half-life of the drug means that toxicity is difficult to overcome, so digoxin has mostly replaced digitoxin in the United States. The therapeutic range of digitoxin is 15-25 ng/ml.

  • Psychiatric Medications

    Lithium carbonate. Lithium is used for control of the manic phase of manic-depressive psychiatric illness. Peak levels are reached in 1-3 hours, and plasma half-life (in young adults) is about 24 hours (range, 8-35 hours). Time to steady state is about 5 days (range, 2-7 days). Most excretion is through the kidneys, where there is both excretion and reabsorption. Excretion is decreased (tending to increase half-life and blood levels) with poor renal function and also with sodium deficiency. Methyldopa also tends to delay lithium excretion. More rapid excretion occurs with salt loading or sodium retention. Interesting side effects are reversible hypothyroidism (about 5% of cases, with some thyroid-stimulating hormone elevation in up to 30% of cases) and neutrophilic leukocytosis. TDM assays are usually performed 12 hours after the last dose (before administration of the next dose). The usual laboratory method is flame photometry, although other methods are becoming available. The therapeutic range is somewhat narrow (approximately 0.5-1.5 mEq/L). Values higher than 2.0 mEq/L are usually considered to be in the toxic range. Maintenance therapy is customarily monitored once a month. Some interest has been shown in red blood cell (RBC) lithium analysis, especially when lack of patient compliance is suspected. RBC lithium levels are more stable over time than serum lithium levels because of the relatively short half-life of serum lithium. Low RBC lithium levels in the presence of normal or elevated serum lithium levels suggest that the patient is noncompliant but took a lithium dose shortly before coming to have the specimen drawn.
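
    The RBC-versus-serum pattern described above can be expressed as a simple heuristic. The text gives no numeric cutoff, so the 0.3 RBC/serum ratio below is purely a hypothetical illustration:

    def lithium_compliance_clue(serum_meq_l, rbc_meq_l, ratio_cutoff=0.3):
        """Heuristic from the pattern in the text: low RBC lithium despite a
        normal or elevated serum level suggests a noncompliant patient who
        took a dose shortly before the draw. The cutoff is hypothetical."""
        if serum_meq_l >= 0.5 and (rbc_meq_l / serum_meq_l) < ratio_cutoff:
            return "pattern suggests noncompliance with a recent pre-draw dose"
        return "no noncompliance pattern by this RBC/serum criterion"

    print(lithium_compliance_clue(serum_meq_l=0.9, rbc_meq_l=0.1))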

    Tricyclic antidepressants. The group name of these medications refers to their three-ring structure. They are widely used to treat unipolar psychiatric depression (i.e., depression without a manic phase). About 70% of these patients show some improvement. The tricyclics are thought to act through blocking one of the deactivation pathways of norepinephrine and serotonin at the brain nerve endings, thereby increasing the availability of these neurotransmitter agents in the synapse area. The different drugs differ in their effect on norepinephrine, serotonin, or both. Currently, the most commonly used tricyclics are imipramine (Tofranil), amitriptyline (Elavil), protriptyline (Vivactil), and doxepin (Sinequan). Of these, imipramine is metabolized to desipramine, and amitriptyline is metabolized to nortriptyline; in both cases the metabolites have pharmacologic activity and are actually marketed themselves under different trade names. Doxepin is also metabolized to the active compound desmethyldoxepin. If these parent compounds are assayed, their major metabolites must be assayed as well. Other tricyclics are available, and still others are being introduced.

    Oral doses are fairly completely absorbed from the GI tract. Once absorbed, there is 70%-96% binding to plasma proteins and considerable first-pass metabolism in the liver. By 6-8 days, 60%-85% of the dose is excreted in the urine in the form of metabolites. Peak serum levels are generally attained 2-6 hours (range, 2-8 hours) after an oral dose. There is variation in peak level depending on the drug formula. There is considerable variation in metabolism between individuals, with fivefold to tenfold variation in steady-state levels being common and differences as great as thirtyfold sometimes reported. The half-life averages 20-30 hours (range, 15-93 hours), and steady state is reached on the average in about 7-10 days (range, 2-19 days). Imipramine has a somewhat shorter half-life (6-24 hours) and time to steady state (about 2-5 days) than the other tricyclics. However, there is variation between the various drugs and between individuals taking the same drug. It is reported that 30% or more of patients have serum assay values outside the standard therapeutic range. African Americans may reach higher steady-state serum levels than Europeans.

    Currently, high-performance liquid chromatography (HPLC) is considered the best assay method. Immunoassay (EMIT method) is also used but is not as specific. For example, thioridazine (Mellaril) and possibly other phenothiazines may produce a reaction in the EMIT tricyclic test. When tricyclics are given once daily, TDM specimens are usually drawn 10-14 hours after the last dose (if the dose is given at bedtime, the specimen is drawn in the morning about 12 hours later). If the patient is on divided doses, the specimen should be drawn 4-6 hours after the last dose (this usually means that the specimen is drawn just before the next dose). The literature warns that some collection tube stoppers contain interfering substances and that certain serum separation devices using gels or filtration also might interfere. It is obviously necessary to select a collection and processing method that is known to be safe. Serum should be refrigerated rather than frozen. Quality control studies have shown variation within laboratories and between laboratories that is greater than the level of variation for routine chemistry tests.

  • Selected Drugs and Drug Groups: Anticonvulsants

    Most epileptics can be controlled with phenytoin (Dilantin), primidone (Mysoline), phenobarbital, or other agents. Frequently drug combinations are required. Therapy is usually a long-term project. When toxicity develops, many of these therapeutic agents produce symptoms that could also be caused by central nervous system (CNS) disease, such as confusion, somnolence, and various changes in mental behavior. Some drugs, such as primidone, must be carefully brought to a therapeutic level by stages rather than in a single dose. Most antiepileptic drugs are administered to control seizures; but if seizures are infrequent, it is difficult to be certain that the medication is sufficient to prevent future episodes. When drug combinations are used, levels for all the agents should be obtained so that if only one drug is involved in toxicity or therapeutic failure, it can be identified.

    When specimens are sent to the laboratory for drug assay, the physician should list all drugs being administered. Some are metabolized to substances that themselves have antiepileptic activity (e.g., primidone is partially metabolized to phenobarbital), and the laboratory then must assay both the parent drug and its metabolite. Without a complete list of medications, there is a good chance that one or more drugs will be overlooked. Once drug blood levels have been obtained, the physician should remember that they are often not linear in relation to dose, so that a percentage change in dose may not result in the same percentage change in blood level. Repeated assays may be needed to guide dosage to achieve desired blood levels. Finally, published therapeutic ranges may not predict the individual response of some patients to the drug. Clinical judgments as well as laboratory values must be used.

    Phenytoin. Phenytoin is about 90% bound to serum proteins. About 70% is metabolized in the liver, and only 5% or less is excreted unchanged through the kidneys. Peak phenytoin levels are reached 4-8 hours after an oral dose and within 15 minutes after IV administration. Serum half-life is about 18-30 hours (literature range, 10-95 hours), with an average of about 24 hours. This variation occurs in part because higher doses saturate the liver metabolic pathway and thus prolong the half-life as nonmetabolized drug accumulates. The serum dose-response curve is not linear, so that relatively small increases in dose may generate relatively large changes in serum levels. Time to reach steady state is usually 4-6 days but may take as long as 5 weeks. Administration by intramuscular injection rather than oral intake is said to reduce blood levels about 50%. The therapeutic range is 10-20 µg/ml. Specimens for TDM are usually drawn just before the next scheduled dose to evaluate adequacy of dosage. Specimens drawn during symptoms or at peak levels are needed to investigate toxic symptoms.
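    The practical consequence of this saturation can be made concrete with a little arithmetic. The following is a minimal sketch of saturable (Michaelis-Menten) elimination at steady state; the Vmax and Km values are hypothetical round numbers chosen for illustration, not dosing guidance.

    ```python
    # Sketch: steady-state level under saturable (Michaelis-Menten) elimination.
    # At steady state, dose rate R = Vmax * Css / (Km + Css), which rearranges
    # to Css = Km * R / (Vmax - R). Vmax and Km below are hypothetical.

    def steady_state_conc(dose_rate_mg_day, vmax_mg_day=500.0, km_mg_l=4.0):
        """Predicted steady-state concentration (mg/L, i.e., ug/ml)."""
        if dose_rate_mg_day >= vmax_mg_day:
            raise ValueError("dose rate at or above Vmax: no steady state exists")
        return km_mg_l * dose_rate_mg_day / (vmax_mg_day - dose_rate_mg_day)

    # A 25% dose increase doubles the predicted level:
    print(steady_state_conc(300))  # 300 mg/day -> 6.0 ug/ml
    print(steady_state_conc(375))  # 375 mg/day -> 12.0 ug/ml
    ```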

    Certain drugs or diseases may affect phenytoin blood levels. Severe chronic liver disease, hepatic immaturity in premature infants, or disulfiram (Antabuse) therapy often increase phenytoin levels. Certain other drugs, such as coumarin anticoagulants, chloramphenicol (Chloromycetin), methylphenidate (Ritalin), and certain benzodiazepine tranquilizers such as diazepam (Valium) and chlordiazepoxide (Librium), have caused significant elevations in a minority of patients. Acute alcohol intake may also elevate plasma levels. On the other hand, pregnancy, acute hepatitis, low doses of phenobarbital, carbamazepine (Tegretol), and chronic alcoholism may decrease phenytoin plasma levels, and they may also be decreased in full-term infants up to age 12 weeks and in some patients with renal disease. As noted previously, there may be disproportionate changes in either bound or free phenytoin in certain circumstances. About 10% of total phenytoin is theoretically free, but in one study only about 30% of patients who had free phenytoin measured conformed to this level, with the remainder showing considerable variation. Certain clinical conditions or acidic, highly protein-bound drugs may displace some phenytoin from albumin, causing the unbound (free) fraction of serum phenytoin to rise. Initially, total serum concentration may be decreased somewhat if the liver metabolizes the newly released free drug. However, the hepatic metabolic pathway may become saturated, with resulting persistent increase in the unbound fraction and return of the total phenytoin level into the reference range. At this time the usual phenytoin assay (total drug) could be normal while the free drug level is increased. Drugs that can displace phenytoin from albumin include valproic acid (Depakene), salicylates, oxacillin, cefazolin, cefotetan, and phenylbutazone. Large quantities of urea or bilirubin have a similar effect. Infants aged 0-12 weeks have reduced phenytoin protein binding. On the other hand, hypoalbuminemia means less binding protein is available and may result in increased free phenytoin levels coincident with decreased total phenytoin levels.
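    The interplay of albumin level and free fraction can be illustrated numerically. The sketch below uses the commonly cited Sheiner-Tozer adjustment, which is not mentioned in this text and is offered only as an illustration; it estimates the total phenytoin level that would be observed at normal albumin, and it is not a substitute for a measured free drug level.

    ```python
    # Sheiner-Tozer estimate (illustrative): normalizes a measured total
    # phenytoin level to a normal-albumin equivalent.
    # C_adjusted = C_measured / (0.2 * albumin + 0.1), albumin in g/dl.

    def phenytoin_albumin_adjusted(total_ug_ml, albumin_g_dl):
        return total_ug_ml / (0.2 * albumin_g_dl + 0.1)

    # A total of 12 ug/ml looks mid-therapeutic (10-20 ug/ml), but with an
    # albumin of 2.0 g/dl the normal-albumin equivalent is 24 ug/ml (toxic).
    print(round(phenytoin_albumin_adjusted(12, 4.4), 1))  # ~12.2
    print(round(phenytoin_albumin_adjusted(12, 2.0), 1))  # 24.0
    ```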

    Phenytoin has some interesting side effects in a minority of patients, among which are megaloblastic anemia and a type of benign lymphoid hyperplasia that clinically can suggest malignant lymphoma. Occasional patients develop gum hypertrophy or hirsutism. Phenytoin also can decrease blood levels of cortisol-type drugs, thyroxine (T4), digitoxin, and primidone, and can increase the effect of Coumadin and the serum levels of the enzymes gamma-glutamyltransferase and alkaline phosphatase. Phenytoin produces its effects on drugs by competing for binding sites on protein or by stimulating liver microsome activity. Phenytoin alters the serum enzymes by its effect on the liver microsome system.

    Primidone. Primidone is not significantly bound to serum proteins and is about 50% metabolized in the liver. About 50% is excreted unchanged by the kidneys. Its major metabolites are phenobarbital (about 20%) and phenylethylmalonamide (about 20%), both of which have anticonvulsant activity of their own and both of which accumulate with long-term primidone administration. Phenobarbital is usually not detectable for 5-7 days after beginning primidone therapy. The ratio of phenobarbital to primidone has been variously reported as 1.0-3.0 after steady state of both drugs has been reached (unless phenobarbital is administered in addition to primidone). If phenytoin is given in addition to primidone, primidone conversion to phenobarbital is increased and the phenobarbital/primidone ratio is therefore increased. Peak serum concentration of primidone occurs in 1-3 hours, although this is somewhat variable. Serum half-life in adults is about 6-12 hours (literature range, 3.3-18 hours). Steady state is reached in about 50 hours (range, 16-60 hours). The therapeutic range is 5-12 µg/ml. It is usually recommended that both primidone and phenobarbital levels be assayed when primidone is used, rather than primidone levels only. If this is done, one must wait until steady state for phenobarbital is reached, which takes a much longer time (8-15 days for children, 10-25 days for adults) than steady state for primidone. Specimens for TDM are usually drawn just before the next scheduled dose to evaluate adequacy of dosage. Specimens drawn during symptoms or at peak levels are needed to investigate toxic symptoms.

    Phenobarbital. Phenobarbital is about 50% bound to serum protein. It has a very long half-life of 2-5 days (50-120 hours) and takes 2-3 weeks (8-15 days in children, 10-25 days in adults) to reach steady state. About 70%-80% is metabolized by the liver and about 10%-30% is excreted unchanged by the kidneys. Phenobarbital, as well as phenytoin, carbamazepine, and phenylbutazone, has the interesting ability to induce hepatic microsomal enzyme activity. Thus, phenobarbital increases the activation of the phenytoin liver metabolic pathway and also competes with phenytoin for that pathway. Phenobarbital incidentally increases degradation of other drugs that are metabolized by hepatic microsome activity, such as coumarin anticoagulants, adrenocorticosteroids, quinidine, tetracycline, and tricyclic antidepressants. Acute alcoholism increases patient response to phenobarbital, and chronic alcoholism is said to decrease response. Specimens for TDM are usually drawn just before the next scheduled dose to evaluate adequacy of dosage. Specimens drawn during symptoms or at peak levels are needed to investigate toxic symptoms.

    Valproic Acid. Valproic acid has been used to treat petit mal “absence” seizures and, in some cases, tonic-clonic generalized seizures and myoclonic disorders. About 90% is bound to plasma proteins. There is a relatively small volume of distribution, because most of the drug remains in the vascular system. More than 90% is metabolized in the liver, with 5% or less excreted unchanged by the kidneys. Time to peak after oral dose is 1-3 hours. Food intake may delay the peak. Serum half-life is relatively short (about 12 hours; range, 8-15 hours), and steady state (oral dose) is reached in 2-3 days (range, 30-85 hours in adults; 20-70 hours in children). Liver disease may prolong the interval before steady state. Interestingly, therapeutic effect usually does not appear until several weeks have elapsed. There is some fluctuation in serum values (said to be 20%-50%) even at steady state. Hepatic enzyme-inducing drugs such as phenytoin, phenobarbital, carbamazepine, and primidone increase the rate of valproic acid degradation and thus its rate of excretion, and therefore tend to decrease the serum levels. Hypoalbuminemia or displacement of valproic acid from albumin by acidic, strongly protein-bound drugs such as salicylates decreases total valproic acid blood levels. Valproic acid can affect phenytoin and primidone levels, but the effect is variable. Phenobarbital levels are increased due to interference with liver metabolism. One report indicates that ethosuximide levels may also be increased. Specimens for TDM are usually drawn just before the next scheduled dose to evaluate adequacy of dosage. Specimens drawn during symptoms or at peak levels are needed to investigate toxic symptoms.

    Rarely, valproic acid may produce liver failure. Two types have been described. The more common type appears after months of therapy, with gradual and potentially reversible progression signaled by rising aspartate aminotransferase (AST) levels. Periodic AST measurement has been advocated to prevent this complication. The other type is sudden and irreversible and appears soon after therapy is started.

    Carbamazepine. Carbamazepine is used for treatment of grand mal and psychomotor epilepsy. About 70% (range, 65%-85%) is protein bound, not enough to make binding a frequent problem. Carbamazepine is metabolized by the liver. It speeds its own metabolism by activation of the liver microsome system. Only 1% is excreted unchanged in the urine. The major metabolites are the epoxide form, which is metabolically active, and the dihydroxide form, which is derived from the epoxide form. The metabolites are excreted in urine. Carbamazepine absorption after oral dose in tablet form is slow, incomplete (70%-80%), and variable. Pharmacologic data in the literature are likewise quite variable. Dosage with tablets results in a peak level that is reached in about 6-8 hours (range, 2-24 hours). Dosage as a suspension or solution or ingestion of tablets with food results in peak levels at about 3 hours. Serum half-life is about 10-30 hours (range, 8-35 hours) when therapy is begun, but after several days the liver microsome system becomes fully activated; when this occurs, the half-life following a dose change may be reduced to about 12 hours (range, 5-27 hours). Phenytoin, phenobarbital, or primidone also activate the liver microsome system, thereby increasing carbamazepine metabolism and reducing its half-life. The time to steady state is about 2 weeks (range, 2-4 weeks) during initial therapy. Later on, time to steady state for dose changes is about 3-4 days (range, 2-6 days). Transient leukopenia has been reported in about 10% of patients (range, 2%-60%) and persistent leukopenia in about 2% (range, 0%-8%). Thrombocytopenia has been reported in about 2%. Aplastic anemia may occur, but it has been rare.

  • Therapeutic Drug Monitoring (TDM)

    Various studies have shown that therapy guided by drug blood levels (therapeutic drug monitoring, TDM) has a considerably better chance of achieving therapeutic effect and preventing toxicity than therapy using empiric drug dosage. TDM can be helpful in a variety of circumstances, as can be seen in the following discussion.
    Why obtain therapeutic drug blood levels?

    1. To be certain that adequate blood concentrations are reached. This is especially important when therapeutic effect must be achieved immediately but therapeutic results are not evident immediately, as might happen when aminoglycoside antibiotics are used.
    2. When effective blood levels are close to toxic levels (“narrow therapeutic window”). It is useful to know what margin of safety is permitted by the current medication dose. If blood levels are close to toxic values, a decrease in the dose might be attempted.
    3. If expected therapeutic effect is not achieved with standard dosage. It is important to know whether the fault is due to insufficient blood levels or is attributable to some other factor (e.g., patient tolerance to the medication effect or interference with the therapeutic effect by other drugs).
    4. If symptoms of toxicity appear with standard dosage. The problem might be one of excessive blood levels, enhancement of effect by other medications, an increase in free as opposed to total drug blood levels, or symptoms that are not due to toxicity from the drug in question.
    5. If a disease is present that is known to affect drug absorption, protein binding, metabolism, or excretion.
    6. Possible drug interaction. It is safer to know in advance whether other medications have altered the expected blood levels of a drug before symptoms appear of toxicity or of insufficient therapeutic effect.
    7. Combination drug therapy. If multiple drugs are used simultaneously for the same purpose (e.g., control of convulsions), knowledge of baseline blood levels for each drug would be helpful should problems develop and the question arise as to which drug is responsible.
    8. Possible patient noncompliance. Patients may decrease the dosage or cease taking medication altogether if symptoms improve or may simply forget to take doses.
    9. Possible medicolegal considerations. An example is the aminoglycoside antibiotic group, whose use is known to be associated with renal failure in a certain percentage of cases. If a patient develops renal failure while taking one of these antibiotics, the renal failure could be due either to drug toxicity or to the underlying disease. If previous and current antibiotic blood levels are within an established range that is not associated with toxicity, the presumptive cause of renal failure is shifted toward the disease rather than the therapy.
    10. Change in dosage or patient status, to establish a new baseline for future reference.

    What factors influence therapeutic drug blood levels?

    A great many factors influence TDM blood levels. Discussion of some of the more important follows.

    Route of administration. Intravenous (IV) administration places medication into the blood faster than intramuscular injection, which, in turn, is usually faster than oral intake. If IV medication is administered over a few minutes, this may shorten the apparent serum half-life of some medications such as antibiotics compared with methods of administration that take longer. Oral medication may be influenced by malabsorption.

    Drug absorption. This may be altered by gastrointestinal (GI) tract motility variations, changes of intestinal acidity, malabsorption disorders, and in some cases interference from food or laxatives.

    Drug transport. Many drugs have a substantial fraction that is bound to plasma proteins. Acidic drugs bind predominantly to albumin, and basic drugs bind predominantly to alpha-1 glycoproteins. Protein-bound drug molecules are not metabolically active. Therapeutic drug monitoring using total drug concentration is based on the assumption that the ratio between bound and unbound (“free”) drug remains constant, and therefore alterations in the total drug level mirror alterations in the free drug level. In most cases this is true. However, when 80% or more of a drug is protein bound, there may be circumstances in which alterations in the ratio of bound to free drug may occur. These alterations may consist of either a free drug concentration within the toxic range coupled with a total drug concentration within the therapeutic range, or a free drug concentration within the therapeutic range coincident with a total drug concentration within the toxic range. This may happen when the quantity of binding protein is reduced (e.g., in hypoalbuminemia) and the dose rate is not changed from that used with normal protein levels. Problems may also arise when the quantity of binding protein is normal but the degree of binding is reduced (as in neonatal life and in uremia); when competition from other drugs displaces some of the bound fraction, as in interactions between acidic drugs with a high percentage of protein binding (e.g., valproic acid and phenytoin); when metabolism of free drug decreases (severe liver disease); or when excretion of free drug decreases (renal failure). Although an increase in free drug quantity may explain toxic symptoms, it is helpful also to know the total drug concentration to deduce what has happened. In routine TDM, total drug concentration is usually sufficient. If toxicity occurs with total drug levels within the therapeutic range, free drug levels may provide an explanation and a better guideline for therapy. Free drug assays currently are done only by large reference laboratories. The introduction of relatively simple techniques to separate bound from free drug (e.g., membrane filtration) may permit wider availability of free drug assay.
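    The arithmetic behind this caveat is easy to sketch. Assuming a hypothetical drug that is normally 90% protein bound, a modest fall in binding doubles the free (active) concentration even though the measured total level is unchanged:

    ```python
    # Free (unbound) drug for a given total level and bound fraction.
    # All numbers are illustrative.

    def free_level(total_conc_ug_ml, fraction_bound):
        return total_conc_ug_ml * (1.0 - fraction_bound)

    total = 15.0  # ug/ml, within a hypothetical therapeutic range
    print(free_level(total, 0.90))  # 1.5 ug/ml free -- the expected amount
    print(free_level(total, 0.80))  # 3.0 ug/ml free -- doubled, same total
    ```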

    Drug uptake by target tissues. Drug molecules must reach target tissue and penetrate into tissue cells. Conditions such as congestive heart failure can decrease tissue perfusion and thereby delay tissue uptake of the drug.

    Extent of drug distribution (volume of distribution). Lipid-soluble drugs penetrate tissues easily and have a much greater diffusion or dispersal throughout the body than non-lipid-soluble drugs. Dispersal away from the blood or target organ decreases blood levels or target tissue levels. The tendency to diffuse throughout the body is measured by dividing the administered drug dose by the plasma concentration of the drug (at equilibrium). This results in the theoretical volume of body fluid within which the drug is diffused to produce the measured serum concentration, which, in turn, indicates the extent of extravascular distribution of the drug.
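    Expressed as a calculation (all values hypothetical), the volume of distribution described above is simply:

    ```python
    # Apparent volume of distribution: Vd = dose / plasma concentration
    # at equilibrium. Values are hypothetical.

    def volume_of_distribution(dose_mg, plasma_conc_mg_l):
        return dose_mg / plasma_conc_mg_l  # liters

    # Lipid-soluble drug: 500 mg dose, 1.25 mg/L -> Vd = 400 L, far larger
    # than total body water, implying extensive tissue distribution.
    print(volume_of_distribution(500, 1.25))
    # Drug largely confined to plasma: 500 mg, 100 mg/L -> Vd = 5 L.
    print(volume_of_distribution(500, 100))
    ```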

    Drug tissue utilization. Various conditions may alter this parameter, such as disease of the target organ, electrolyte or metabolic derangements, and effect of other medications.

    Drug metabolism. Most drugs for which TDM is employed are wholly or partially inactivated (“detoxified”) within the liver. Liver function becomes a critical factor when severe liver damage occurs. Also, some persons metabolize a drug faster than average (“fast metabolizers”), and some metabolize drugs more slowly (“slow metabolizers”). Certain drugs such as digoxin and lithium carbonate are not metabolized in the liver. The rate of drug metabolism plus the rate of excretion are major determinants of two important TDM parameters. Half-life (biologic half-life) refers to the time required to decrease drug blood concentration by 50%. It is usually measured after absorption has been completed. Steady state refers to drug blood level equilibrium between drug intake and elimination. Before steady state is achieved, drug blood values typically are lower than the level that they eventually attain. As a general rule, it takes five half-lives before steady state is reached. Loading doses can decrease this time span considerably. A few investigators use three half-lives as the basis for steady-state measurements.
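    The five half-life rule follows directly from first-order accumulation; a short sketch makes the numbers explicit:

    ```python
    # Fraction of the eventual steady-state level reached after a given
    # time of constant dosing: 1 - 2**(-t / half_life).

    def fraction_of_steady_state(elapsed_hours, half_life_hours):
        return 1.0 - 2.0 ** (-elapsed_hours / half_life_hours)

    t_half = 24.0  # hours, e.g., an average phenytoin half-life
    for n in (1, 3, 5):
        f = fraction_of_steady_state(n * t_half, t_half)
        print(f"{n} x t1/2 -> {f:.1%} of steady state")
    # 1 x t1/2 -> 50.0%; 3 x t1/2 -> 87.5%; 5 x t1/2 -> 96.9%
    ```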

    Drug excretion. Nearly all TDM drugs are excreted predominantly through the kidneys (the major exception is theophylline). Markedly decreased renal function obviously leads to drug retention. The creatinine clearance rate is commonly used to estimate the degree of residual kidney function. When the serum creatinine is more than twice reference upper limits, creatinine clearance is usually less than 25% of normal and measurement is less accurate. In addition, creatinine clearance is somewhat reduced in the elderly, and some maintain that clearance reference ranges should be adjusted for old age.
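    As a hedged illustration of how creatinine clearance is commonly estimated from serum creatinine, the sketch below uses the Cockcroft-Gault formula (not named in this text). It also shows the point made above: doubling the serum creatinine roughly halves the estimated clearance, and advanced age lowers it.

    ```python
    # Cockcroft-Gault estimate of creatinine clearance (ml/min); shown only
    # as an illustration of the serum creatinine / clearance relationship.
    # CrCl = (140 - age) * weight / (72 * Scr), multiplied by 0.85 for women.

    def cockcroft_gault(age_years, weight_kg, scr_mg_dl, female=False):
        crcl = (140 - age_years) * weight_kg / (72.0 * scr_mg_dl)
        return crcl * 0.85 if female else crcl

    print(round(cockcroft_gault(70, 70, 1.0)))  # ~68 ml/min
    print(round(cockcroft_gault(70, 70, 2.0)))  # ~34 ml/min at twice the Scr
    print(round(cockcroft_gault(30, 70, 1.0)))  # ~107 ml/min at age 30
    ```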

    Dosage. Size and frequency of dose obviously affect drug blood levels.

    Age. Infants in general receive the same dose per unit weight as adults; children receive twice the dose, and the elderly receive less. A very troublesome period is the transition between childhood and puberty (approximately ages 10-13 years) since dosage requirements may change considerably and without warning within a few months.

    Weight. Dosage based on weight yields desirable drug blood levels more frequently than arbitrary, fixed-dose schedules. One assumes that a larger person has a larger total blood volume and extracellular fluid space within which the drug is distributed and a larger liver to metabolize the drug.

    Interference from other medications. Such interference may become manifest at any point in drug intake, metabolism, tissue therapeutic effect, and excretion, as well as lead to possible artifact in technical aspects of drug assay.

    Effect of disease on any previously mentioned factors. This most frequently involves considerable loss of renal or hepatic function.

    Assay of peak or residual level. In general, peak levels correlate with toxicity, whereas residual (trough) levels are more an indication of proper therapeutic range (i.e., whether the blood level remains within the therapeutic range). Of course, if the residual level is in the toxic range this is an even stronger indication of toxicity. An exception to the general rule is the aminoglycoside antibiotic group, in which the peak level is used to indicate whether therapeutic levels are being reached and the residual level is considered (some disagreement exists on this point) to correlate best with nephrotoxicity. For most drugs, the residual level should be kept within the therapeutic range and the peak level should be kept out of the toxic range. To avoid large fluctuations, some have recommended that the dose interval be one half of the drug half-life; in other words, the drug should be administered twice during each half-life.
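    The rationale for this rule of thumb can be sketched with first-order elimination arithmetic (values hypothetical): the trough/peak ratio over a dosing interval depends only on the interval expressed in half-lives.

    ```python
    # Trough/peak ratio for first-order elimination over dosing interval tau:
    # ratio = 2**(-tau / half_life). Ignores absorption time; illustrative only.

    def trough_over_peak(tau_hours, half_life_hours):
        return 2.0 ** (-tau_hours / half_life_hours)

    t_half = 12.0
    print(trough_over_peak(6.0, t_half))   # tau = t1/2 / 2 -> ~0.71 (29% swing)
    print(trough_over_peak(24.0, t_half))  # tau = 2 * t1/2 -> 0.25 (75% swing)
    ```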

    One of the most important laboratory problems of drug level monitoring is the proper time in relationship to dose administration at which to obtain the specimen. There are two guidelines. First, the drug blood level should have reached steady state or equilibrium, which as a rule of thumb takes five drug half-lives. Second, the drug blood level should be at a true peak or residual level. Peak levels are usually reached about 1-2 hours after oral intake, about 1 hour after intramuscular administration, or about 30 minutes after IV medication. Residual levels are usually reached shortly (0-15 minutes) before the next scheduled dose. The greatest problem is being certain when the drug was actually given. I have had best results by first learning when the drug is supposed to be given. If a residual level is needed, the nursing service is then instructed to withhold the dose. The blood specimen is drawn approximately 15 minutes before the scheduled dose time, and the nursing service is then told to administer the dose. If a peak level is needed, the laboratory technologist should make arrangements to have the nursing service record the exact minute that the dose is given and telephone the laboratory. Unless the exact time the specimen was obtained and the exact time the drug dose was given are both known with certainty, drug blood level results cannot be properly interpreted and may be greatly misleading.

    Laboratory technical factors. These include the inherent technical variability of any drug assay method (expressed as a coefficient of variation) as well as the other sources of error discussed in Chapter 1. Therapeutic drug monitoring assays in general have shown greater differences between laboratories than found with simple well-established tests such as blood urea nitrogen or serum glucose levels.

    Patient compliance. Various studies have shown astonishingly high rates of patient noncompliance with dose instructions, including failure to take any medication at all. Possibly 20%-80% of all patients may be involved. Noncompliance results in subtherapeutic medication blood levels. Some believe that noncompliance is the most frequent cause of problems in patients on long-term therapy.

    Therapeutic and toxic ranges

    Therapeutic ranges are drug blood levels that have been empirically observed to correlate with desired therapeutic effects in most patients being treated for an uncomplicated disease. The same relationship is true for toxicity and toxic ranges. However, these ranges are not absolute and do not cover the response to a drug in all individual patients or the response when some unexpected factor (e.g., other diseases or other drugs) is superimposed. The primary guide to therapy is a good therapeutic response without evidence of toxicity. Most of the time this will correspond with a drug blood level within the therapeutic range, so the therapeutic range can be used as a general guideline for therapy. In some cases a good response does not correlate with the therapeutic range. In such cases the assay should be repeated on a new specimen to exclude technical error or specimens drawn at the wrong time in relation to dose. If the redrawn result is unchanged, clinical judgment should prevail. Some attempt should be made, however, to see if there is some factor that is superimposed on the disease being treated that could explain the discrepancy. Removal or increase of such a factor could affect the result of therapy at a later date. The same general statements are true for toxicity and toxic ranges. Some patients may develop toxicity at blood levels below the statistically defined toxic range and some may be asymptomatic at blood levels within the toxic range. However, the further the values enter into the toxic range, the more likely it is that toxicity will develop. Thus, patient response and drug level data are both important, and both are often necessary to interpret the total picture.

    Some Conditions That Produce Unexpected Therapeutic Drug Monitoring Results

    High plasma concentration on normal or low prescribed dose:
    - Patient accidental overdose
    - Slow metabolizer
    - Drug interaction that blocks original drug metabolism in liver or injures the liver
    - Poor liver function (severe damage)
    - Drug excretion block
    - Increased binding proteins
    - Residual level determined on sample drawn after dose was administered instead of before
    - Laboratory technical factors

    Low plasma concentration on normal or high prescribed dose:
    - Poor drug absorption (oral dose)
    - Interference by another drug
    - Patient noncompliance
    - Fast metabolizer
    - Decreased binding proteins
    - Peak level determined on sample drawn at incorrect time
    - Laboratory technical factors

    Toxic symptoms with blood levels in therapeutic range:
    - Drug released from proteins (free drug increased)
    - Drug effect enhanced at tissue level by some other drug or condition
    - Blood level obtained at incorrect time
    - Laboratory technical factors
    - Symptoms may not be due to toxicity of that drug

    When to obtain specimens for therapeutic drug monitoring

    If a patient develops symptoms that might be caused by a drug, the best time to obtain a specimen for TDM is during the period when the patient has the symptoms (if this is not possible, within a short time afterward). One possible exception, however, is digoxin, whose blood level does not equilibrate with tissue levels until at least 6-8 hours after the dose is given. Therefore, specimens for digoxin TDM should not be drawn less than 6 hours after administration of the previous dose, even if toxic symptoms occur earlier. It should be ascertained how much time elapsed between the onset of toxic symptoms and the time of the last previous medication dose. This information is necessary to determine if there is a relationship of the symptoms to the peak blood level of the drug. If the specimen cannot be drawn during symptoms, the next best alternative is to deliberately obtain a specimen at the peak of the drug blood level. This will indicate if the peak level is within the toxic range. In some instances it may be useful to obtain a blood specimen for TDM at a drug peak level even without toxic symptoms, to be certain that the drug dosage is not too high.

    In some cases the question is not drug toxicity but whether dosage is adequate to achieve the desired therapeutic effect. In that case, the best specimen for TDM is one drawn at the residual (valley or trough) drug level, shortly before the next medication dose is given. The major exception to this rule is theophylline, for which a peak level is more helpful than a residual level.

    For most drugs, both peak and residual levels should be within the therapeutic range. The peak value should not enter the toxic range and the residual value should not fall to therapeutically inadequate levels.

    Information on some of the medications for which TDM is currently being used is given in Table 37-25. The box lists some conditions that produce unexpected TDM results.

    Summary

    Therapeutic drug monitoring can be extremely helpful in establishing drug levels that are both therapeutically adequate and nontoxic. To interpret TDM results, the clinician should know the pharmacodynamics of the medication, ascertain that steady-state levels have been achieved before ordering TDM assays, try to ensure that specimens are drawn at the correct time in relation to dose administration, be aware of effects from other medication, and view TDM results as one component in the overall clinical picture rather than the sole basis for deciding whether drug dosages are correct. Drug monitoring is carried out in two basic situations: (1) in an isolated attempt to find the reason for therapeutic failure (either toxic symptoms or nonresponse to therapy) and (2) to obtain a baseline value after sufficient time has elapsed for stabilization. Baseline values are needed for comparison with future values if trouble develops and to establish the relationship of a patient’s drug blood level to accepted therapeutic range. This information can be invaluable in future emergencies.

    Comments on therapeutic drug monitoring assay

    To receive adequate service, the physician must provide the laboratory with certain information as well as the patient specimen. This information includes the exact drug or drugs to be assayed, patient age, time elapsed from the last dose until the specimen was obtained, drug dose, and route of administration. All of these factors affect normal values. It is also desirable to state the reason for the assay (i.e., what is the question that the clinician wants answered) and provide a list of medications the patient is receiving.

    Some (not all) of the methods used in drug assay include gas-liquid chromatography (technically difficult but especially useful when several drugs are being administered simultaneously, as frequently occurs in epileptics), thin-layer chromatography (TLC; more frequently used for the hypnotic drugs), radioimmunoassay (RIA), fluorescence-polarization immunoassay, and enzyme-multiplied immunoassay (EMIT).

    One of the major reasons why TDM has not achieved wider acceptance is that reliable results are frequently not obtainable. Even when they are, the time needed to obtain a report may be several days rather than several hours. It is essential that the physician be certain that the reference laboratory, whether local or not, is providing reliable results. Reliability can be investigated in several ways: by splitting patient samples to be evaluated between the laboratory and a reference laboratory whose work is known to be good (but if isolated values are discrepant, a question may arise as to whose is correct), by splitting samples and sending one portion 1 week and the remainder the next week, or by obtaining standards from commercial companies and submitting these as unknowns. Most good reference laboratories will do a reasonable amount of such testing without charge if requested to do so beforehand.

    In some situations, assay results may be misleading without additional information. In certain drugs, such as phenytoin (Dilantin), digitoxin, and quinidine, a high percentage is bound to serum albumin and only the nonbound fraction is metabolically active. This is similar to thyroid hormone protein binding. The nonbound (free) fraction may be increased in hypoalbuminemia or in conditions that change protein binding, such as uremia or administration of drugs that block binding or compete for binding sites. Drug level assays measure total drug and do not reflect changes in protein binding. In addition, some drugs, diseases, or metabolic states may potentiate or inhibit the action of certain therapeutic agents without altering blood levels or protein binding. An example is digoxin toxicity induced by hypokalemia.

  • Other Congenital Diseases

    There are a large number of congenital and genetic disorders, too many to include all in this book. If such a condition is suspected, in general the best procedure is to refer the patient or family to a university center that has an active genetics diagnosis program. If the state government health department has a genetic disease detection program, it can provide useful information and help in finding or making referral arrangements.

    Some Genetic Disorders Diagnosable with DNA Probes

    Huntington’s chorea
    Adult polycystic disease
    Alpha and beta thalassemia
    Congenital adrenal hyperplasia
    Duchenne’s and Becker’s muscular dystrophy
    Fragile X syndrome
    Hemophilia A and B
    Myotonic dystrophy
    Osteogenesis imperfecta
    Alpha-1 antitrypsin deficiency
    Cystic fibrosis
    Sickle cell hemoglobinopathy
    Retinoblastoma
    Familial hypertrophic cardiomyopathy

  • Porphyrias

    In porphyric diseases, the main similarity is the abnormal excretion of substances that are precursors of the porphyrin compound heme (of hemoglobin). The known pathways of porphyrin synthesis begin with glycine and succinate, which are combined to eventually form a compound known as d-aminolevulinic acid (ALA). This goes on to produce a substance known as “porphobilinogen,” composed of a single pyrrole ring. Four of these rings are joined to form the tetrapyrrole compound protoporphyrinogen; this is the precursor of protoporphyrin, which, in turn, is the precursor of heme. The tetrapyrrole compounds exist in eight isomers, depending on where certain side groups are located. The only isomeric forms that are clinically important are I and III. Normally, very small amounts of porphyrin degradation products appear in the feces or in the urine; these are called “coproporphyrins” or “uroporphyrins” (their names refer to where they were first discovered, but both may appear in either urine or feces).
    The porphyrias have been classified in several ways, none of which is entirely satisfactory. The most common system includes erythropoietic porphyria (EP), hepatic porphyria, mixed porphyria, porphyria cutanea tarda (PCT), and acquired (toxic) porphyria. EP is a small group of rare congenital diseases characterized clinically by skin photosensitivity without vesicle formation, pink discoloration of the teeth that fluoresces under ultraviolet light, and sometimes mild hemolytic anemia. If erythropoietic porphyria is suspected, the best diagnostic test is measurement of erythrocyte porphyrin.

    Hereditary hepatic porphyria may be subdivided into three types: acute intermittent porphyria (AIP; Swedish genetic porphyria), variegate porphyria (VP; South African genetic porphyria), and hereditary coproporphyria (HC). All three are inherited as autosomal dominants, and all three may be associated with episodes of acute porphyric attacks, although such attacks are more widely publicized in association with AIP. All three subdivisions manifest increases in the enzyme ALA-synthetase, which catalyzes formation of ALA from its precursors. AIP is characterized by a decrease of 50% or more in the enzyme uroporphyrinogen-I-synthetase (abbreviated URO-I-S and also known as “porphobilinogen deaminase”), which catalyzes the formation of uroporphyrinogen I from porphobilinogen. Levels of URO-I-S are said to be normal in VP and HC. Acute intermittent porphyria is not associated with photosensitivity, whereas skin lesions due to photosensitivity are common in VP and also occur in HC. Parenthetically, these skin lesions resemble those of PCT, and some of these patients were probably included in the PCT group in some early classifications. In VP and HC, increased amounts of protoporphyrin are excreted in the feces, whereas this does not happen in AIP. Although AIP, VP, and HC have increased amounts of coproporphyrin in the feces, HC patients excrete much larger amounts of fecal coproporphyrin III than do patients with AIP or VP.

    The porphyrias can also be classified usefully according to clinical symptoms:

    1. Neurologic only: AIP
    2. Cutaneous only: PCT, EP, EPP
    3. Both neurologic and cutaneous: VP, HC

    Acute intermittent porphyria. URO-I-S is said to be decreased in all patients with AIP. However, about 5%-10% of AIP patients have values within the reference range, so that some overlap occurs. URO-I-S is also said to be decreased in relatives of patients with AIP, again with some overlap at the borderline areas of the reference range. At least one kindred with a condition closely resembling AIP has been reported with normal URO-I-S levels, but the significance of this is not clear. There may be some laboratory variation in results, and equivocal results may have to be repeated. Blood samples should be stored frozen and kept frozen during transit to the laboratory to avoid artifactual decrease in enzyme activity. Therefore, falsely low URO-I-S values may be obtained through improper specimen handling. Hemolytic anemia or reticulocytosis greater than 5% may produce an increase in URO-I-S activity. Assay for URO-I-S is available mostly in university medical centers and large reference laboratories.

    The acute porphyric attacks consist of colicky abdominal pain, vomiting, and constipation (about 80% of patients) and mental symptoms (10%-30% of patients) such as confusion, psychotic behavior, and occasionally even convulsions. About one half of the patients display hypertension and some type of muscle motor weakness. The attacks are frequently accompanied by leukocytosis. These attacks may be precipitated by certain medications (especially barbiturates), by estrogens, and by carbohydrate deprivation (dieting or starvation). The attacks usually do not occur until adolescence or adulthood. Porphobilinogen is nearly always present in the urine during the clinical attacks and is an almost pathognomonic finding, but the duration of excretion is highly variable. It may occasionally disappear if not searched for early in the attack. Between attacks, some patients excrete detectable porphobilinogen and others do not. Urine ALA levels are usually increased during acute attacks but not as markedly as porphobilinogen. During remission, ALA levels also may become normal. Patients with AIP may also have hyponatremia and sometimes have falsely elevated thyroxine (T4) results due to elevated thyroxine-binding protein levels.

    Porphobilinogen is usually detected by color reaction with Ehrlich’s reagent and confirmed by demonstrating that the color is not removed by chloroform (Watson-Schwartz test). Since false positive results may occur, it is essential to confirm a positive test by butanol (butyl alcohol) extraction. Porphobilinogen will not be extracted by butanol, whereas butanol will remove most of the other Ehrlich-positive, chloroform-negative substances. Therefore, porphobilinogen is not removed by either chloroform or butanol. A positive result on the porphobilinogen test is the key to diagnosis of symptomatic acute porphyria; some investigators believe that analysis and quantitation of urinary porphyrins or ALA are useful only if the Watson-Schwartz test results are equivocal. However, the majority believe that a positive qualitative test result for porphobilinogen should be confirmed by quantitative chemical techniques (available in reference laboratories) due to experience with false positive Watson-Schwartz test results in various laboratories. They also advise quantitative analysis of porphyrins in urine and feces to differentiate the various types of porphyria. Glucose administration may considerably decrease porphobilinogen excretion.
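    For readers who find the double-extraction logic easier to follow in schematic form, the snippet below encodes the interpretation rules just described. It is a reading aid, not a laboratory protocol.

    ```python
    # Schematic Watson-Schwartz interpretation: porphobilinogen gives an
    # Ehrlich color that is removed by neither chloroform nor butanol.

    def watson_schwartz(ehrlich_positive, removed_by_chloroform, removed_by_butanol):
        if not ehrlich_positive:
            return "negative for porphobilinogen"
        if removed_by_chloroform:
            return "other Ehrlich-reactive substance (e.g., urobilinogen)"
        if removed_by_butanol:
            return "chloroform-negative interfering substance"
        return "consistent with porphobilinogen; confirm quantitatively"

    print(watson_schwartz(True, False, False))
    ```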

    Some investigators prefer the Hoesch test to the modified Watson-Schwartz procedure. The Hoesch test also uses Ehrlich’s reagent but is less complicated and does not react with urobilinogen. The possibility of drug-induced false reactions has not been adequately investigated. Neither test has optimal sensitivity. In one study the Watson-Schwartz test could detect porphobilinogen only about 50% of the time when the concentration was 5 times normal. Quantitative biochemical methods available in reference laboratories are more sensitive than these screening tests.

    Porphyria cutanea tarda is a chronic type of porphyria. There usually is some degree of photosensitivity, but it does not develop until after puberty. There often is some degree of liver disease. No porphobilinogen is excreted and acute porphyric attacks do not occur.

    Toxic porphyria may be produced by a variety of chemicals, but the most common is lead. Lead poisoning produces abnormal excretion of coproporphyrin III but not of uroporphyrin III. ALA excretion is also increased.

    Familial dysautonomia (Riley-Day syndrome). Riley-Day syndrome is a familial disorder characterized by a variety of signs and symptoms, including defective lacrimation, relative indifference to pain, postural hypotension, excessive sweating, emotional lability, and absence of the fungiform papillae on the anterior portion of the tongue. Most of those affected are Jewish. Helpful laboratory tests include an increased urine homovanillic acid value and a decreased serum dopamine-beta-hydroxylase (DBH) value; DBH is an enzyme that helps convert dopamine to norepinephrine. Besides Riley-Day syndrome, the DBH value may also be decreased in Down’s syndrome (mongolism) and Parkinson’s disease. It has been reported to be elevated in about 50% of patients with neuroblastoma, in stress, and in certain congenital disorders (results in the congenital disorders have not been adequately confirmed). There is disagreement as to values in patients with hypertension.