Month: December 2009

  • Theophylline (Aminophylline)

    Theophylline is used primarily as a bronchodilating agent for therapy of asthma. Over the therapeutic range there is a reasonably linear correlation between dosage and therapeutic effect. The drug is administered intravenously and orally. Oral medication is available in regular (noncoated or liquid) and slow-release (SR) forms. SR forms are available in twice-daily, and even once-daily, dosages. For most (but not all) regular oral preparations, absorption takes place predominantly in the small intestine, absorption is essentially
    complete, and food usually does not interfere significantly. Absorption rates are more apt to vary, and time to serum peak is less predictable, among the different SR preparations. In addition, food (especially a high-fat meal) is more likely to interfere with absorption of some SR preparations. One investigator recommends taking the dose 1 hour before or 2 hours after meals when using SR preparations influenced by food. For the regular oral medication, time to peak (for adults) is about 2-3 hours, half-life is 4-6 hours (range, 3-8 hours), and time to steady state is about 15-20 hours. For children, half-life is more variable (1-8 hours) and time to steady state is also more variable (5-40 hours). Time to peak for the oral SR preparations is about 5 hours. About 50%-60% (range, 40%-65%) of theophylline is bound to serum albumin. Binding is less in neonates and at lower pH (acidosis). About 90% is metabolized in the liver, and most of the metabolites, plus about 10%-15% of unchanged theophylline, are excreted by the kidneys. Therefore, except in the first several months of life, renal function is not a major factor in theophylline serum concentration. Adults who smoke tobacco or marijuana, as well as children, excrete theophylline somewhat more rapidly (decreased serum half-life) than nonsmoking adults. Factors that reduce theophylline clearance (increasing serum half-life) include early infancy (ages 0-8 months), congestive heart failure, cor pulmonale, severe liver dysfunction, sustained fever, pneumonia, obesity, cessation of smoking, cimetidine, ciprofloxacin, and erythromycin family antibiotics. Some theophylline assay methods may show partial interference (some false increase in values) from substances present in uremia. Children show more individual differences in theophylline clearance and as a group eliminate theophylline more rapidly than adults. In addition, one report indicated that in children a high-protein diet increased theophylline elimination and a high-carbohydrate diet decreased it.
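
    The half-life figures above imply ordinary first-order (exponential) elimination. As a minimal illustration only, assuming a simple one-compartment model with the adult theophylline half-life quoted above, the fraction of a dose remaining at any time follows directly from the half-life:

    ```python
    import math

    def elimination_constant(half_life_h: float) -> float:
        """First-order elimination rate constant: k = ln(2) / half-life."""
        return math.log(2) / half_life_h

    def fraction_remaining(hours: float, half_life_h: float) -> float:
        """Fraction of a dose still present after `hours` (one-compartment model)."""
        return math.exp(-elimination_constant(half_life_h) * hours)

    # Adult theophylline half-life from the text: about 4-6 hours.
    for t_half in (4.0, 6.0):
        print(f"t1/2 = {t_half:.0f} h: {fraction_remaining(12, t_half):.0%} of a dose remains after 12 h")
    ```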

    Besides the factors just mentioned, therapy is complicated by the many theophylline preparations available, many of which vary significantly in theophylline content and the rate it is absorbed. Noncompliance is a constant problem in therapy and in the interpretation of theophylline blood levels, because low levels due to noncompliance may be misinterpreted as due to rapid metabolism or excretion. The reverse mistake can also be made. Another difficulty is the asthmatic who may already have taken one or more extra doses of theophylline before being seen by the physician.

    There is a relatively narrow zone between the therapeutic range (10-20 µg/ml; 55-110 µmol/L) and values associated with toxicity. The degree of elevation over the reference range is not a reliable predictor of toxicity risk except in a very general way, since severe toxicity can develop in some patients at less than twice the upper limit of the reference range. Although mild toxic symptoms often appear first, severe toxicity may develop without warning. If there is a question about previous drug intake, a specimen for theophylline assay should be obtained before therapy is begun. Therapy can then be started and the dosage modified when assay results become available. Thereafter, when steady state is achieved, serum peak concentration should be measured (30 minutes after the dose for IV theophylline, 2 hours after the dose for regular theophylline, and about 5 hours [range, 3-7 hours, depending on the particular medication] after the dose for sustained-release forms). Theophylline is thus an exception to the general rule that the residual (trough) level is better than the peak level for monitoring therapy.
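
    The mass-to-molar conversions quoted above (10-20 µg/ml corresponding to 55-110 µmol/L) can be checked from the molecular weight of theophylline, about 180.2 g/mol. A small sketch of the arithmetic:

    ```python
    THEOPHYLLINE_MW = 180.2  # g/mol (approximate)

    def ug_ml_to_umol_l(conc_ug_ml: float, mw_g_mol: float) -> float:
        """1 ug/ml = 1 mg/L; dividing mg/L by MW (g/mol = mg/mmol) gives mmol/L,
        so multiply by 1000 to express the result in umol/L."""
        return conc_ug_ml / mw_g_mol * 1000.0

    for limit in (10.0, 20.0):  # therapeutic range limits in ug/ml
        print(f"{limit:.0f} ug/ml = {ug_ml_to_umol_l(limit, THEOPHYLLINE_MW):.0f} umol/L")
    # prints approximately 55 and 111 umol/L, matching the quoted range
    ```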

  • Antiarrhythmic Drugs

    There is a large and ever-growing list of these medications, too many to include here. TDM data for some members of this group are summarized in Table 37-25. Several have been selected for more detailed discussion here.
    Procainamide. Procainamide is used to control certain ventricular arrhythmias and can be given orally or intravenously. Only about 10% is bound to serum protein. Maintenance is usually achieved by oral medication. About 85% of the oral dose is absorbed, mostly in the small intestine. About 50% of the drug is excreted unchanged by the kidneys. About 50% is metabolized, predominantly by the liver. The major metabolite of procainamide is N-acetylprocainamide (NAPA), which constitutes about 25%-30% of the original dose (7%-40%). NAPA is produced in the liver by a process known as N-acetylation. It has antiarrhythmic properties about equal to those of its parent compound. About 10% is bound to serum protein, and about 85% is excreted unchanged by the kidneys. It has a serum half-life about twice that of procainamide. Therefore, NAPA levels continue to rise for a time after procainamide levels have stabilized. There is approximately a 1:1 ratio of procainamide to NAPA after both have equilibrated. Poor liver function may decrease NAPA formation and produce a high ratio (> 1.0) of procainamide to NAPA (i.e., less NAPA relative to the amount of procainamide). Even though procainamide degradation may also be decreased, only about 25%-30% of the dose is metabolized to NAPA in the liver, so procainamide is not affected as much as NAPA. On the other hand, poor renal function decreases NAPA excretion and decreases the procainamide/NAPA ratio to less than 1.0 (i.e., more NAPA relative to procainamide). Even though procainamide excretion may also be decreased, the amount of NAPA excreted through the kidneys is much higher than the amount of procainamide, so poor renal function affects NAPA proportionally more than procainamide. Another factor is the acetylating capacity of the liver, which is an inherited characteristic. Isoniazid and hydralazine are also metabolized by this system. About one half of the population are slow acetylators and about one half are fast acetylators. Fast acetylation produces more NAPA (tending to produce a procainamide/NAPA ratio < 1.0), and slow acetylation produces less NAPA (procainamide/NAPA ratio > 1.0). Assessment of acetylation status depends on adequate renal function, since poor renal function can affect the procainamide/NAPA ratio. About 50% of patients on long-term procainamide therapy develop antinuclear antibodies, and up to 30% may develop a syndrome very similar to systemic lupus erythematosus. Slow acetylators are more likely to develop these conditions than fast acetylators.
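
    The ratio logic in the preceding paragraph can be condensed into a short sketch. The function below merely encodes the qualitative rules stated above (ratio above 1.0 suggesting slow acetylation or decreased hepatic NAPA formation, ratio below 1.0 suggesting fast acetylation, and poor renal function confounding the ratio); the sharp 1.0 cutoff and the function itself are illustrative simplifications, not a validated clinical rule:

    ```python
    def interpret_pa_napa_ratio(procainamide_ug_ml: float, napa_ug_ml: float,
                                renal_function_adequate: bool) -> str:
        """Qualitative reading of the steady-state procainamide/NAPA ratio,
        following the rules given in the text (illustrative only)."""
        ratio = procainamide_ug_ml / napa_ug_ml
        if not renal_function_adequate:
            # Poor renal function retains NAPA disproportionately and drives the
            # ratio below 1.0, so acetylator status cannot be inferred from it.
            return f"ratio {ratio:.2f}: not interpretable (renal impairment lowers it)"
        if ratio > 1.0:
            return f"ratio {ratio:.2f}: suggests slow acetylation or decreased NAPA formation"
        return f"ratio {ratio:.2f}: suggests fast acetylation"

    print(interpret_pa_napa_ratio(6.0, 8.0, renal_function_adequate=True))
    # ratio 0.75: suggests fast acetylation
    ```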

    Since both procainamide and NAPA have antiarrhythmic action and since several factors influence their levels and their relationship to each other, most authorities recommend that both be assayed and that therapeutic decisions be based on the sum of both rather than on either one alone. Therapeutic range for the combination of procainamide and NAPA is 10-30 µg/ml (42.50-127.47 µmol/L). Specimens for TDM are usually obtained just before the next scheduled dose to evaluate adequacy of dosage. Peak levels or specimens drawn during symptoms are needed to investigate toxic symptoms.

    There are two types of procainamide oral preparations, standard (relatively short acting) and sustained release (SR). For the standard type, peak absorption levels are usually reached in about 1.5 hours (range, 1-2 hours) after an oral dose. However, some persons absorb procainamide relatively slowly, and the peak may be delayed up to 4 hours after the dose, close to the time one would expect a trough level. In one study, this occurred about one third of the time. Therefore, some investigators recommend both peak and trough for initial evaluation. Patients with acute myocardial infarction or cardiac failure are more likely to have delayed absorption. Serum half-life is about 3 hours (2-4 hours). Time to steady state is about 18 hours (11-20 hours). Therefore, the half-life is considered a short one, and there is a greater fluctuation in serum values compared with an agent with a long half-life. The peak level after oral SR procainamide occurs about 2 hours after the dose (range, 1-3 hours) but may not occur until later in patients with slow absorption. Time to steady state is about 24-30 hours.

    Lidocaine. Lidocaine (Xylocaine) hydrochloride is a local anesthetic that has antiarrhythmic properties. Used as an antiarrhythmic, it is generally given intravenously to patients who are seriously ill. Lidocaine is lipid soluble and distributes rapidly to many tissues. When it is given as a single bolus, plasma levels fall rapidly, with perhaps as much as a 50% decrease in about 20 minutes. On the other hand, drug given by IV infusion reaches a plateau rather slowly because so much of the drug is distributed to peripheral tissues. Therefore, administration is usually done with one or more bolus loading dose injections followed by IV infusion. The half-life of lidocaine is 1-2 hours, and time to steady state is 5-10 hours (5-12 hours). About 70% is protein bound; of the total that is protein bound, about 30% is bound to albumin and 70% to alpha-1 acid glycoprotein. Lidocaine is about 90% metabolized in the liver, with 5%-10% excreted unchanged by the kidneys. The major hepatic metabolites of lidocaine also have some antiarrhythmic effect. The primary metabolites are themselves further metabolized in the liver, with less than 10% of the primary metabolites being excreted unchanged in urine.

    Conditions that produce an increase in plasma lidocaine levels are severe chronic liver disease (decreased drug inactivation), chronic renal disease (decreased excretion), and congestive heart failure (reduced volume of distribution). In acute myocardial infarction, there is increase in the binding protein alpha-1 acid glycoprotein and a subsequent increase in plasma total lidocaine values; however, bound drug is pharmacologically inactive, and the nonbound active fraction often does not increase. Propranolol has been reported to decrease lidocaine clearance, producing higher plasma values.

    Complications related to lidocaine therapy have been reported in 6%-20% of cases. Therapeutic drug monitoring requires an assay method fast enough that results are available without significant delay. HPLC and EMIT are the two most frequently used methods. Colorimetric methods are also available. It has been recommended that lidocaine specimens be drawn 12 hours after beginning therapy and then daily. In seriously ill patients, in those whose arrhythmias persist in spite of lidocaine, and when lidocaine toxicity is suspected, assay every 12 hours could be helpful. The therapeutic range is 1.5-5 µg/ml.

    Tocainide. Tocainide (Tonocard) is an analog of lidocaine that is also used to treat ventricular arrhythmias. Tocainide has some advantages over lidocaine, since tocainide can be given orally and has a longer half-life (about 15 hours; range, 12-18 hours) due to much less first-pass hepatic metabolism. The half-life may be increased with severe liver disease or chronic renal failure. About 10% is bound to serum protein. The metabolites of tocainide are excreted in the urine and do not have antiarrhythmic activity. Peak serum levels are reached 1.5-2.0 hours after an oral dose. Steady state is reached in 3 days. Therapeutic range is 4-10 µg/ml. Assay is usually done by HPLC.

    Quinidine. Quinidine has been used for treating both atrial and ventricular arrhythmias. There are two forms of quinidine: the sulfate and the gluconate. Both are available for oral administration in both regular and long-acting (SR) preparations. The gluconate form can be given intravenously. Oral regular quinidine sulfate has a time to peak value of about 2 hours (range, 1-3 hours), a serum half-life of about 6 hours (range, 5-8 hours), and a time to steady state of about 24 hours. Regular oral quinidine gluconate has a time to peak value of about 4 hours. SR quinidine sulfate (Quinidex) has a time to peak value of about 2 hours, a serum half-life of about 20 hours, and a time to steady state of about 4 days. SR quinidine gluconate (Quiniglute, Duraquin) has a time to peak value of about 4 hours and a half-life of about 10 hours. However, when the SR preparations are used, there is relatively little fall in serum levels after the initial dose before subsequent doses. About 80% of quinidine (literature range, 60%-90%) is bound to serum proteins. Quinidine is metabolized by the liver, with about 10%-20% excreted unchanged in urine by glomerular filtration. Urine excretion is influenced by urine pH.

    Factors that may decrease quinidine levels include hypoalbuminemia, drugs that compete for albumin binding, and drugs that activate hepatic enzyme activity, such as phenytoin and phenobarbital. Factors that tend to increase quinidine levels include congestive heart failure, poor renal function (prerenal or intrinsic renal disease), and possibly severe liver disease. Renal excretion is increased by acidification of the urine and decreased by urine alkalinization.

    Several methods are available for quinidine assay. The most commonly used are fluorometric procedures, with or without preliminary extraction steps. These measurements include quinidine and several of its metabolites. Certain other fluorescing compounds may interfere. Extraction eliminates some but not all of the metabolites. More specific methods include HPLC and EMIT. Values for the direct (nonextracted) fluorescent methods are about 50% higher than those using HPLC or EMIT (i.e., the therapeutic range with the nonextracted fluorometric method is about 3-8 µg/ml [9.25-24.66 µmol/L], whereas the range using the double-extracted fluorometric method or HPLC is 2.3-5 µg/ml [7.09-15.41 µmol/L]). The specimen for TDM should be drawn just before the next dose is to be given (residual or trough level).

    Reasons for TDM of quinidine include the following:

    1. Various quinidine commercial products differ considerably in absorption.
    2. Toxic levels of quinidine can produce certain arrhythmias that could be mistaken for arrhythmias caused by the patient's underlying disease (whether from inadequate control or from noncompliance).
    3. There is a possibility of drug interaction, because patients taking quinidine are likely to be taking several drugs or to receive additional drugs in the future.
    4. Patient disease may modify quinidine metabolism or excretion (old age frequently is associated with reduced renal function, which modifies renal excretion of quinidine).

    Flecainide. Flecainide (Tambocor) is another drug used for ventricular arrhythmias, including premature ventricular contractions and ventricular tachycardia or fibrillation. About 95% is absorbed. Food or antacids do not affect absorption. After absorption, roughly 40% is bound to serum proteins. About 30% (range, 10%-50%) is excreted unchanged in the urine. The major metabolites have no antiarrhythmic activity. Peak plasma levels after oral dosage are reached in about 3 hours (range, 1-6 hours). Serum half-life averages 20 hours (range, 7-27 hours) and may be longer in patients with severe renal disease or congestive heart failure. Steady state is reached in 3-5 days. Propranolol increases flecainide serum levels approximately 20%. Hypokalemia or hyperkalemia may affect the therapeutic action of flecainide. Flecainide paradoxically aggravates ventricular arrhythmias in about 7% of patients, especially in the presence of congestive heart failure.

    Digoxin. Digoxin could be included in the section on toxicology, since most serum assay requests are for the purpose of investigating possible digoxin toxicity. However, an increasing number of studies have demonstrated unsuspected overdosage or underdosage (30% toxicity and 11% underdigitalization in one study), and requests for baseline levels are becoming more frequent. The volume of requests and the relative ease of performance (by immunoassay) make this assay readily available, even in smaller laboratories. The widespread use of digoxin, the narrow borderline between therapeutic range and toxicity, and the nonspecific nature of mild or moderate toxic signs and symptoms that mimic a variety of common disorders (diarrhea, nausea, arrhythmias, and ECG changes) contribute to the need for serum assay.

    Digoxin therapeutic drug monitoring data

    About 20%-30% of digoxin is bound to serum albumin. About 80% (range, 60%-90%) is excreted unchanged by the kidneys. About 20% is metabolized in the liver, with most of this being excreted as digoxin metabolites. About 10% of the adult population metabolizes a greater percentage of digoxin (which may be as high as 55%). After an oral dose is given, serum levels rise to a peak at 30-90 minutes and then slowly decline until a plateau is reached about 6-8 hours after administration. Digoxin assay specimens must be drawn at least 6 hours (preferably at least 8 hours) after the last dose, whether administration is oral or IV, to avoid blood levels that are significantly higher than they would be after tissue levels have equilibrated. The 6- to 8-hour time span mentioned is a minimum elapsed time; specimens may be drawn later. In many cases more information is obtained from a sample drawn shortly before the next scheduled dose. Serum half-life is approximately 36-38 hours. Normal therapeutic range is 0.5-2.0 ng/ml (0.6-2.56 nmol/L).
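
    Because early specimens overestimate the equilibrated level, the 6- to 8-hour rule lends itself to a simple timing check. The helper below is a hypothetical sketch (the names and thresholds come from the rule just described, not from any actual laboratory system):

    ```python
    from datetime import datetime, timedelta

    MINIMUM_WAIT = timedelta(hours=6)    # absolute minimum after the last dose
    PREFERRED_WAIT = timedelta(hours=8)  # preferred interval

    def digoxin_draw_check(last_dose: datetime, draw_time: datetime) -> str:
        """Check a digoxin specimen draw against the 6- to 8-hour equilibration rule."""
        elapsed = draw_time - last_dose
        if elapsed < MINIMUM_WAIT:
            return "too early: result will overestimate the equilibrated tissue level"
        if elapsed < PREFERRED_WAIT:
            return "acceptable, but 8 or more hours after the dose is preferred"
        return "timing ok"

    print(digoxin_draw_check(datetime(2009, 12, 1, 8, 0),
                             datetime(2009, 12, 1, 13, 0)))  # only 5 h elapsed: too early
    ```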

    Various metabolic disorders and medications may alter body concentration or serum levels of digoxin or may affect myocardial response to usual dosage. The kidney is the major route of excretion, and a decrease in renal function sufficient to raise serum creatinine levels will elevate serum digoxin levels as well. In renal failure, digoxin half-life may be extended to as long as 5 days. Hypothyroidism also increases digoxin serum values. On the other hand, certain conditions affect patient response to digitalis without affecting blood levels. Myocardial sensitivity to digoxin, regardless of dose, is increased by acute myocardial damage, hypokalemia, hypercalcemia, hypermagnesemia or hypomagnesemia, alkalosis, tissue anoxia, and glucagon. Drugs that produce hypokalemia (including various diuretics, amphotericin B, corticosteroids, or glucose infusion) thus predispose to toxicity. Other medications, such as phenylbutazone, phenytoin, and barbiturates (which activate hepatic degradation mechanisms), or kaolin (Kaopectate), antacids, cholestyramine, and certain oral antibiotics such as neomycin (which interfere with absorption) tend to be antagonistic to the effect of digitalis. Quinidine elevates digoxin levels in about 90% of patients by 50%-100% (range, 30%-330%). The effect on digoxin levels begins within 24 hours, with peak effect in 4-5 days. Certain other medications can increase serum digoxin levels to some extent.

    Interfering substances. Digoxin can be measured by a variety of immunoassay methods. Digoxin-like cross-reacting substances have been reported in many (but not all) patients in the third trimester of pregnancy, in infants up to 6 months of age (the effect peaking at 1 week of age), in patients with renal failure, and in patients with severe liver disease. Different kits are affected to different extents. Some investigators report that the cross-reacting substances bind to serum proteins. In most cases the cross-reaction increases serum digoxin less than 1.0 ng/ml, but sometimes the effect may be greater.

    Antidigoxin antibody therapy. Another analytical problem occurs when digitalis toxicity is treated with fragments of antidigoxin antibodies (Fab, “antigen-binding fragments”). These fragments are prepared by first producing antidigoxin IgG class antibody in animals, then enzymatically splitting off the antigen-binding variable regions (Fab portion) of the IgG molecule. This eliminates the “constant” region of the IgG molecule, which is the most antigenic portion of the molecule. The antidigoxin antibody Fab fragments bind to plasma and extracellular fluid digoxin. This creates a disturbance in equilibrium between free (unbound) digoxin within cells and within the extracellular compartments, so that some intracellular digoxin moves out of body cells to restore the equilibrium. The digoxin-Fab bound complexes are excreted in the urine by glomerular filtration. Their elimination half-life with normal renal function is about 15-20 hours (range, 14-25 hours).

    Laboratory digoxin assay is involved for two reasons. First, a pretherapy baseline is required to help establish the diagnosis of digoxin toxicity and to help estimate the dose of Fab fragments needed. Second, after injection of the Fab dose, another assay is helpful to determine if adequate therapy was given, either because pretreatment digoxin tissue levels were higher than estimated or too much of the Fab fragment dose was lost in urine before sufficient digoxin had diffused out of the body cells. It is necessary to wait at least 6-8 hours after therapy for a postdose assay, to allow for equilibration time between cells and extracellular fluid. An assay system specific for free digoxin is necessary (usually done by a technique such as microfiltration, which separates unbound from Fab-bound digoxin), because the Fab-digoxin bound complexes are included with unbound (free) digoxin in total digoxin assays. Soon after therapy begins there is greatly increased Fab-digoxin bound complex formation in plasma (and, therefore, elevated total digoxin levels, sometimes as high as 20 times pretreatment levels), whereas free digoxin levels are low. Later, 12-20 hours after the initial therapeutic dose, plasma free digoxin reequilibrates, and may reach toxic levels again if sufficient intracellular digoxin has not been captured. It may take several days to excrete all the Fab-digoxin bound complexes, and the serum total digoxin level may remain elevated more than 1 week if there is poor renal function.

    Digoxin assay clinical correlation. In various studies, there is a certain amount of overlap in the area that statistically separates normally digitalized patients from those with toxicity. This overlap exists because it is difficult to recognize mild degrees of toxicity, because patient sensitivity to digitalis varies, and because the assay technique itself, like all laboratory tests, displays a certain amount of variation when repeated determinations are performed on the same specimen, no matter how well the assay is done. Regardless of these problems, if the clinical picture does not agree with the level of toxicity predicted by digoxin assay values, and laboratory quality control is adequate, the physician should not dismiss or ignore the assay results but should investigate the possibility of interference from an improper specimen collection time interval, drug interaction, or metabolic alterations. However, the assay should be repeated first, to verify that a problem exists.

    Digitoxin. Digitoxin is more than 95% bound to serum albumin. Serum half-life is about 8 days (2.5-16.5 days). Digitoxin is about 90% metabolized in the liver. About 5%-10% is excreted unchanged through the kidneys. Drugs that activate hepatic enzyme systems, such as phenytoin and barbiturates, increase metabolism of digitoxin and decrease serum levels. Hypoalbuminemia and drugs that compete for binding sites on albumin also tend to decrease digitoxin serum levels. The long half-life of the drug means that toxicity is difficult to overcome, so digoxin has mostly replaced digitoxin in the United States. The therapeutic range of digitoxin is 15-25 ng/ml.

  • Psychiatric Medications

    Lithium carbonate. Lithium is used for control of the manic phase of manic-depressive psychiatric illness. Peak levels are reached in 1-3 hours, and plasma half-life (in young adults) is about 24 hours (range, 8-35 hours). Time to steady state is about 5 days (range, 2-7 days). Most excretion is through the kidneys, where there is both excretion and reabsorption. Excretion is decreased (tending to increase half-life and blood levels) with poor renal function and also with sodium deficiency. Methyldopa also tends to delay lithium excretion. More rapid excretion occurs with salt loading or sodium retention. Interesting side effects are reversible hypothyroidism (about 5% of cases, with some thyroid-stimulating hormone elevation in up to 30% of cases) and neutrophilic leukocytosis. TDM assays are usually performed 12 hours after the last dose (before administration of the next dose). The usual laboratory method is flame photometry, although other methods are becoming available. The therapeutic range is somewhat narrow (approximately 0.5-1.5 mEq/L). Values higher than 2.0 mEq/L are usually considered to be in the toxic range. Maintenance therapy is customarily monitored once a month. Some interest has been shown in red blood cell (RBC) lithium analysis, especially when lack of patient compliance is suspected. RBC lithium levels are more stable over time than serum lithium levels, which fluctuate because of the relatively short half-life of lithium in serum. Low RBC lithium levels in the presence of normal or elevated serum lithium levels suggest that the patient is noncompliant but took a lithium dose shortly before the specimen was drawn.

    Tricyclic antidepressants. The group name of these medications refers to their three-ring structure. They are widely used to treat unipolar psychiatric depression (i.e., depression without a manic phase). About 70% of these patients show some improvement. The tricyclics are thought to act by blocking one of the deactivation pathways of norepinephrine and serotonin at the brain nerve endings, thereby increasing the availability of these neurotransmitters in the synapse area. The different drugs differ in their effect on norepinephrine, serotonin, or both. Currently, the most commonly used tricyclics are imipramine (Tofranil), amitriptyline (Elavil), protriptyline (Vivactil), and doxepin (Sinequan). Of these, imipramine is metabolized to desipramine, and amitriptyline is metabolized to nortriptyline; in both cases the metabolites have pharmacologic activity and are actually marketed themselves under different trade names. Doxepin is also metabolized to the active compound desmethyldoxepin. If these parent compounds are assayed, their major metabolites must be assayed as well. Other tricyclics are available, and still others are being introduced.

    Oral doses are fairly completely absorbed from the GI tract. Once absorbed, there is 70%-96% binding to plasma proteins and considerable first-pass metabolism in the liver. By 6-8 days, 60%-85% of the dose is excreted in the urine in the form of metabolites. Peak serum levels are generally attained 2-6 hours (range, 2-8 hours) after an oral dose. There is variation in peak level depending on the drug formula. There is considerable variation in metabolism between individuals, with fivefold to tenfold variation in steady-state levels being common and differences as great as thirtyfold sometimes reported. The half-life averages 20-30 hours (range, 15-93 hours), and steady state is reached on the average in about 7-10 days (range, 2-19 days). Imipramine has a somewhat shorter half-life (6-24 hours) and time to steady state (about 2-5 days) than the other tricyclics. However, there is variation between the various drugs and between individuals taking the same drug. It is reported that 30% or more of patients have serum assay values outside the standard therapeutic range. African Americans may reach higher steady-state serum levels than Europeans.

    Currently, high-performance liquid chromatography (HPLC) is considered the best assay method. Immunoassay (EMIT method) is also used but is not as specific. For example, thioridazine (Mellaril) and possibly other phenothiazines may produce a reaction in the EMIT tricyclic test. When tricyclics are given once daily, TDM specimens are usually drawn 10-14 hours after the last dose (if the dose is given at bedtime, the specimen is drawn in the morning about 12 hours later). If the patient is on divided doses, the specimen should be drawn 4-6 hours after the last dose (this usually means that the specimen is drawn just before the next dose). The literature warns that some collection tube stoppers contain interfering substances and that certain serum separation devices using gels or filtration also might interfere. It is obviously necessary to select a collection and processing method that is known to be safe. Serum should be refrigerated rather than frozen. Quality control studies have shown variation within laboratories and between laboratories that is greater than the level of variation for routine chemistry tests.

  • Selected Drugs and Drug Groups: Anticonvulsants

    Most epileptics can be controlled with phenytoin (Dilantin), primidone (Mysoline), phenobarbital, or other agents. Frequently drug combinations are required. Therapy is usually a long-term project. When toxicity develops, many of these therapeutic agents produce symptoms that could also be caused by central nervous system (CNS) disease, such as confusion, somnolence, and various changes in mental behavior. Some drugs, such as primidone, must be carefully brought to a therapeutic level by stages rather than in a single dose. Most antiepileptic drugs are administered to control seizures; but if seizures are infrequent, it is difficult to be certain that the medication is sufficient to prevent future episodes. When drug combinations are used, levels for all the agents should be obtained so that if only one drug is involved in toxicity or therapeutic failure, it can be identified.

    When specimens are sent to the laboratory for drug assay, the physician should list all drugs being administered. Some are metabolized to substances that themselves have antiepileptic activity (e.g., primidone is partially metabolized to phenobarbital), and the laboratory then must assay both the parent drug and its metabolite. Without a complete list of medications, there is a good chance that one or more drugs will be overlooked. Once drug blood levels have been obtained, the physician should remember that they are often not linear in relation to dose, so that a percentage change in dose may not result in the same percentage change in blood level. Repeated assays may be needed to guide dosage to achieve desired blood levels. Finally, published therapeutic ranges may not predict the individual response of some patients to the drug. Clinical judgments as well as laboratory values must be used.

    Phenytoin. Phenytoin is about 90% bound to serum proteins. About 70% is metabolized in the liver, and only 5% or less is excreted unchanged through the kidneys. Peak phenytoin levels are reached 4-8 hours after an oral dose and within 15 minutes after IV administration. Serum half-life is about 18-30 hours (literature range, 10-95 hours), with an average of about 24 hours. This variation occurs in part because higher doses saturate the hepatic metabolic pathway, prolonging the half-life as nonmetabolized drug accumulates. The serum dose-response curve is therefore not linear, and relatively small increases in dose may generate relatively large changes in serum levels. Time to reach steady state is usually 4-6 days but may take as long as 5 weeks. Administration by intramuscular injection rather than oral intake is said to reduce blood levels about 50%. The therapeutic range is 10-20 µg/ml. Specimens for TDM are usually drawn just before the next scheduled dose to evaluate adequacy of dosage. Specimens drawn during symptoms or at peak levels are needed to investigate toxic symptoms.
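
    The saturation effect described above is commonly modeled with Michaelis-Menten elimination kinetics. The sketch below uses that standard model with illustrative parameter values (the Vmax and Km figures are assumptions chosen for demonstration, not population estimates) to show why a modest dose increase can produce a disproportionately large rise in the steady-state level:

    ```python
    def phenytoin_css_ug_ml(dose_mg_day: float,
                            vmax_mg_day: float = 500.0,      # illustrative assumption
                            km_ug_ml: float = 4.0) -> float:  # illustrative assumption
        """Steady-state concentration under Michaelis-Menten elimination.
        At steady state, dose rate = Vmax * Css / (Km + Css); solving for Css
        gives Css = Km * dose / (Vmax - dose)."""
        if dose_mg_day >= vmax_mg_day:
            raise ValueError("dose rate at or above Vmax: no steady state is reached")
        return km_ug_ml * dose_mg_day / (vmax_mg_day - dose_mg_day)

    for dose in (300, 360, 400):
        print(f"{dose} mg/day -> Css about {phenytoin_css_ug_ml(dose):.1f} ug/ml")
    # 300 -> 6.0, 360 -> 10.3, 400 -> 16.0 ug/ml: a 33% dose increase
    # nearly triples the steady-state level.
    ```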

    Certain drugs or diseases may affect phenytoin blood levels. Severe chronic liver disease, hepatic immaturity in premature infants, or disulfiram (Antabuse) therapy often increases phenytoin levels. Certain other drugs, such as coumarin anticoagulants, chloramphenicol (Chloromycetin), methylphenidate (Ritalin), and certain benzodiazepine tranquilizers such as diazepam (Valium) and chlordiazepoxide (Librium), have caused significant elevations in a minority of patients. Acute alcohol intake may also elevate plasma levels. On the other hand, pregnancy, acute hepatitis, low doses of phenobarbital, carbamazepine (Tegretol), and chronic alcoholism may decrease phenytoin plasma levels, and levels may also be decreased in full-term infants up to age 12 weeks and in some patients with renal disease. As noted previously, there may be disproportionate changes in either bound or free phenytoin in certain circumstances. About 10% of total phenytoin is theoretically free, but in one study only about 30% of patients who had free phenytoin measured conformed to this level, with the remainder showing considerable variation. Certain clinical conditions or acidic, highly protein-bound drugs may displace some phenytoin from albumin, causing the unbound (free) fraction of serum phenytoin to rise. Initially, total serum concentration may decrease somewhat as the liver metabolizes the newly released free drug. However, the hepatic metabolic pathway may become saturated, with a resulting persistent increase in the unbound fraction and return of the total phenytoin level into the reference range. At this point the usual phenytoin assay (total drug) could be normal while the free drug level is increased. Drugs that can displace phenytoin from albumin include valproic acid (Depakene), salicylates, oxacillin, cefazolin, cefotetan, and phenylbutazone. Large quantities of urea or bilirubin have a similar effect. Infants aged 0-12 weeks have reduced phenytoin protein binding. On the other hand, hypoalbuminemia means less binding protein is available and may result in increased free phenytoin levels coincident with decreased total phenytoin levels.

    Phenytoin has some interesting side effects in a minority of patients, among which are megaloblastic anemia and a type of benign lymphoid hyperplasia that clinically can suggest malignant lymphoma. Occasional patients develop gum hypertrophy or hirsutism. Phenytoin also can decrease blood levels of cortisol-type drugs, thyroxine (T4), digitoxin, and primidone, and can increase the effect of Coumadin (warfarin) and the serum levels of the enzymes gamma-glutamyltransferase and alkaline phosphatase. Phenytoin produces its effects on other drugs by competing for binding sites on protein or by stimulating liver microsome activity. Phenytoin alters the serum enzymes through its effect on the liver microsome system.

    Primidone. Primidone is not significantly bound to serum proteins and is about 50% metabolized in the liver. About 50% is excreted unchanged by the kidneys. Its major metabolites are phenobarbital (about 20%) and phenylethylmalonamide (about 20%), both of which have anticonvulsant activity of their own and both of which accumulate with long-term primidone administration. Phenobarbital is usually not detectable for 5-7 days after beginning primidone therapy. The ratio of phenobarbital to primidone has been variously reported as 1.0-3.0 after steady state of both drugs has been reached (unless phenobarbital is administered in addition to primidone). If phenytoin is given in addition to primidone, primidone conversion to phenobarbital is increased and the phenobarbital/primidone ratio is therefore increased. Peak serum concentration of primidone occurs in 1-3 hours, although this is somewhat variable. Serum half-life in adults is about 6-12 hours (literature range, 3.3-18 hours). Steady state is reached in about 50 hours (range, 16-60 hours). The therapeutic range is 5-12 µg/ml. It is usually recommended that both primidone and phenobarbital levels be assayed when primidone is used, rather than primidone levels only. If this is done, one must wait until steady state for phenobarbital is reached, which takes a much longer time (8-15 days for children, 10-25 days for adults) than steady state for primidone. Specimens for TDM are usually drawn just before the next scheduled dose to evaluate adequacy of dosage. Specimens drawn during symptoms or peak levels are needed to investigate toxic symptoms.

    Phenobarbital. Phenobarbital is about 50% bound to serum protein. It has a very long half-life of 2-5 days (50-120 hours) and takes 2-3 weeks (8-15 days in children, 10-25 days in adults) to reach steady state. About 70%-80% is metabolized by the liver, and about 10%-30% is excreted unchanged by the kidneys. Phenobarbital, as well as phenytoin, carbamazepine, and phenylbutazone, has the interesting ability to activate hepatic microsome activity. Thus, phenobarbital increases the activation of the phenytoin liver metabolic pathway and also competes with phenytoin for that pathway. Phenobarbital incidentally increases degradation of other drugs that are metabolized by hepatic microsome activity, such as coumarin anticoagulants, adrenocorticosteroids, quinidine, tetracycline, and tricyclic antidepressants. Acute alcoholism increases patient response to phenobarbital, and chronic alcoholism is said to decrease response. Specimens for TDM are usually drawn just before the next scheduled dose to evaluate adequacy of dosage. Specimens drawn during symptoms or at peak levels are needed to investigate toxic symptoms.

    Valproic Acid. Valproic acid has been used to treat petit mal (“absence”) seizures and, in some cases, tonic-clonic generalized seizures and myoclonic disorders. About 90% is bound to plasma proteins. There is a relatively small volume of distribution, because most of the drug remains in the vascular system. More than 90% is metabolized in the liver, with 5% or less excreted unchanged by the kidneys. Time to peak after an oral dose is 1-3 hours. Food intake may delay the peak. Serum half-life is relatively short (about 12 hours; range, 8-15 hours), and steady state (oral dose) is reached in 2-3 days (range, 30-85 hours in adults; 20-70 hours in children). Liver disease may prolong the interval before steady state. Interestingly, therapeutic effect usually does not appear until several weeks have elapsed. There is some fluctuation in serum values (said to be 20%-50%) even at steady state. Hepatic enzyme-inducing drugs such as phenytoin, phenobarbital, carbamazepine, and primidone increase the rate of valproic acid degradation and thus its rate of excretion, and therefore tend to decrease serum levels. Hypoalbuminemia or displacement of valproic acid from albumin by acidic, strongly protein-bound drugs such as salicylates decreases total valproic acid blood levels. Valproic acid can affect phenytoin and primidone levels, but the effect is variable. Phenobarbital levels are increased because of interference with liver metabolism. One report indicates that ethosuximide levels may also be increased. Specimens for TDM are usually drawn just before the next scheduled dose to evaluate adequacy of dosage. Specimens drawn during symptoms or at peak levels are needed to investigate toxic symptoms.

    Rarely, valproic acid may produce liver failure. Two types have been described. The more common type appears after months of therapy, with gradual and potentially reversible progression signaled by rising aspartate aminotransferase (AST) levels. Periodic AST measurement has been advocated to prevent this complication. The other type is sudden, is nonreversible, and appears soon after therapy is started.

    Carbamazepine. Carbamazepine is used for treatment of grand mal and psychomotor epilepsy. About 70% (range, 65%-85%) is protein bound, not enough to make binding a frequent problem. Carbamazepine is metabolized by the liver; it speeds its own metabolism by activation of the liver microsome system. Only 1% is excreted unchanged in the urine. The major metabolites are the epoxide form, which is metabolically active, and the dihydroxide form, which is derived from the epoxide form. The metabolites are excreted in urine. Carbamazepine absorption after an oral dose in tablet form is slow, incomplete (70%-80%), and variable. Pharmacologic data in the literature are likewise quite variable. Dosage with tablets results in a peak level that is reached in about 6-8 hours (range, 2-24 hours). Dosage as a suspension or solution, or ingestion of tablets with food, results in peak levels at about 3 hours. Serum half-life is about 10-30 hours (range, 8-35 hours) when therapy is begun, but after several days the liver microsome system becomes fully activated, and the half-life following a dose change may be reduced to about 12 hours (range, 5-27 hours). Phenytoin, phenobarbital, or primidone also activate the liver microsome system, thereby increasing carbamazepine metabolism and reducing its half-life. The time to steady state is about 2 weeks (range, 2-4 weeks) during initial therapy. Later on, time to steady state after dose changes is about 3-4 days (range, 2-6 days). Transient leukopenia has been reported in about 10% of patients (range, 2%-60%) and persistent leukopenia in about 2% (range, 0%-8%). Thrombocytopenia has been reported in about 2%. Aplastic anemia may occur, but it has been rare.

  • Therapeutic Drug Monitoring (TDM)

    Various studies have shown that therapy guided by drug blood levels (therapeutic drug monitoring, TDM) has a considerably better chance of achieving therapeutic effect and preventing toxicity than therapy using empiric drug dosage. TDM can be helpful in a variety of circumstances, as can be seen in the following discussion.

    Why obtain therapeutic drug blood levels?

    1. To be certain that adequate blood concentrations are reached. This is especially important when therapeutic effect must be achieved immediately but therapeutic results are not evident immediately, as might happen when aminoglycoside antibiotics are used.
    2. When effective blood levels are close to toxic levels (“narrow therapeutic window”). It is useful to know what margin of safety is permitted by the current medication dose. If blood levels are close to toxic values, a decrease in the dose might be attempted.
    3. If expected therapeutic effect is not achieved with standard dosage. It is important to know whether the fault is due to insufficient blood levels or is attributable to some other factor (e.g., patient tolerance to the medication effect or interference with the therapeutic effect by other drugs).
    4. If symptoms of toxicity appear with standard dosage. The problem might be one of excessive blood levels, enhancement of effect by other medications, an increase in free as opposed to total drug blood levels, or symptoms that are not due to toxicity from the drug in question.
    5. If a disease is present that is known to affect drug absorption, protein binding, metabolism, or excretion.
    6. Possible drug interaction. It is safer to know in advance whether other medications have altered the expected blood levels of a drug before symptoms appear of toxicity or of insufficient therapeutic effect.
    7. Combination drug therapy. If multiple drugs are used simultaneously for the same purpose (e.g., control of convulsions), knowledge of baseline blood levels for each drug would be helpful should problems develop and the question arise as to which drug is responsible.
    8. Possible patient noncompliance. Patients may decrease the dosage or cease taking medication altogether if symptoms improve or may simply forget to take doses.
    9. Possible medicolegal considerations. An example is the aminoglycoside antibiotic group, whose use is known to be associated with renal failure in a certain percentage of cases. If a patient develops renal failure while taking one of these antibiotics, the renal failure could be due either to drug toxicity or to the underlying disease. If previous and current antibiotic blood levels are within an established range that is not associated with toxicity, the presumptive cause of renal failure is shifted toward the disease rather than the therapy.
    10. Change in dosage or patient status, to establish a new baseline for future reference.

    What factors influence therapeutic drug blood levels?

    A great many factors influence TDM blood levels. Discussion of some of the more important follows.

    Route of administration. Intravenous (IV) administration places medication into the blood faster than intramuscular injection, which, in turn, is usually faster than oral intake. If IV medication is administered over a few minutes, the apparent serum half-life of some medications, such as antibiotics, may be shorter than with slower methods of administration. Oral medication may be influenced by malabsorption.

    Drug absorption. This may be altered by gastrointestinal (GI) tract motility variations, changes of intestinal acidity, malabsorption disorders, and in some cases interference from food or laxatives.

    Drug transport. Many drugs have a substantial fraction that is bound to plasma proteins. Acidic drugs bind predominantly to albumin, and basic drugs bind predominantly to alpha-1 acid glycoprotein. Protein-bound drug molecules are not metabolically active. Therapeutic drug monitoring using total drug concentration is based on the assumption that the ratio between bound and unbound (“free”) drug remains constant, so that alterations in the total drug level mirror alterations in the free drug level. In most cases this is true. However, when 80% or more of a drug is protein bound, there may be circumstances in which the ratio of bound to free drug is altered. These alterations may consist of either a free drug concentration within the toxic range coupled with a total drug concentration within the therapeutic range, or a free drug concentration within the therapeutic range coincident with a total drug concentration within the toxic range. This may happen when the quantity of binding protein is reduced (e.g., in hypoalbuminemia) and the dose rate is not changed from that used with normal protein levels. Problems may also arise when the quantity of binding protein is normal but the degree of binding is reduced (as in neonatal life and in uremia); when competition from other drugs displaces some of the bound fraction (e.g., interaction between acidic drugs with a high percentage of protein binding, such as valproic acid and phenytoin); when metabolism of free drug decreases (severe liver disease); or when excretion of free drug decreases (renal failure). Although an increase in free drug quantity may explain toxic symptoms, it is helpful also to know the total drug concentration to deduce what has happened. In routine TDM, total drug concentration is usually sufficient. If toxicity occurs with total drug levels within the therapeutic range, free drug levels may provide an explanation and a better guideline for therapy. Free drug assays currently are done only by large reference laboratories. The introduction of relatively simple techniques to separate bound from free drug (e.g., membrane filtration) may permit wider availability of free drug assay.
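
    A minimal sketch of the bound/free arithmetic underlying this discussion: if the quoted bound fraction actually holds, the free (active) level follows directly from the total level, and a fall in binding raises the free level even though the total level is unchanged. The 90%-bound figure is the phenytoin value quoted elsewhere in this section; the numbers are illustrative:

    ```python
    def free_level(total_ug_ml: float, bound_fraction: float) -> float:
        """Free (pharmacologically active) drug level implied by a total level,
        assuming the stated bound fraction holds."""
        return total_ug_ml * (1.0 - bound_fraction)

    total = 15.0  # ug/ml: a mid-therapeutic total phenytoin level
    print(free_level(total, 0.90))  # 1.5 ug/ml free with normal ~90% binding
    print(free_level(total, 0.80))  # 3.0 ug/ml: binding falls to 80% and the free
                                    # level doubles while the total is unchanged
    ```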

    Drug uptake by target tissues. Drug molecules must reach target tissue and penetrate into tissue cells. Conditions such as congestive heart failure can decrease tissue perfusion and thereby delay tissue uptake of the drug.

    Extent of drug distribution (volume of distribution). Lipid-soluble drugs penetrate tissues easily and have a much greater diffusion or dispersal throughout the body than non-lipid-soluble drugs. Dispersal away from the blood or target organ decreases blood levels or target tissue levels. The tendency to diffuse throughout the body is measured by dividing the administered drug dose by the plasma concentration of the drug (at equilibrium). This results in the theoretical volume of body fluid within which the drug is diffused to produce the measured serum concentration, which, in turn, indicates the extent of extravascular distribution of the drug.
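
    A worked example of the calculation just described, with made-up numbers: dividing the dose by the equilibrium plasma concentration gives the apparent volume of distribution, and a result much larger than the plasma volume implies extensive extravascular distribution:

    ```python
    def volume_of_distribution_l(dose_mg: float, plasma_mg_l: float) -> float:
        """Apparent volume of distribution: Vd = dose / plasma concentration
        (measured at equilibrium)."""
        return dose_mg / plasma_mg_l

    # A 500-mg dose producing 10 mg/L implies about 50 L, far beyond plasma
    # volume, so the drug must distribute widely into tissue.
    print(volume_of_distribution_l(500, 10))   # 50 L: extensive distribution
    print(volume_of_distribution_l(500, 100))  # 5 L: largely confined to blood
    ```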

    Drug tissue utilization. Various conditions may alter this parameter, such as disease of the target organ, electrolyte or metabolic derangements, and effect of other medications.

    Drug metabolism. Most drugs for which TDM is employed are wholly or partially inactivated (“detoxified”) within the liver. Liver function becomes a critical factor when severe liver damage occurs. Also, some persons metabolize a drug faster than average (“fast metabolizer”), and some metabolize drugs slower (“slow metabolizer”). Certain drugs such as digoxin and lithium carbonate are not metabolized in the liver. The rate of drug metabolism plus the rate of excretion are major determinants of two important TDM parameters. Half-life (biologic half-life) refers to the time required to decrease drug blood concentration by 50%. It is usually measured after absorption has been completed. Steady state refers to drug blood level equilibrium between drug intake and elimination. Before steady state is achieved, drug blood values typically are lower than the level that they eventually attain. As a general rule it takes five half-lives before steady state is reached. Loading doses can decrease this time span considerably. A few investigators use three half-lives as the basis for steady-state measurements.
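
    The five-half-life rule follows directly from first-order accumulation: with constant dosing, the blood level approaches its eventual steady state as 1 - 2^(-n) after n half-lives. A short sketch:

    ```python
    def fraction_of_steady_state(half_lives_elapsed: float) -> float:
        """Fraction of the eventual steady-state level reached after a given
        number of half-lives of constant dosing: 1 - 2**(-n)."""
        return 1.0 - 2.0 ** (-half_lives_elapsed)

    for n in (1, 2, 3, 5):
        print(f"after {n} half-lives: {fraction_of_steady_state(n):.1%}")
    # 50.0%, 75.0%, 87.5%, 96.9% -- three half-lives (the basis used by a few
    # investigators) reaches about 88%; five half-lives reaches about 97%.
    ```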

    Drug excretion. Nearly all TDM drugs are excreted predominantly through the kidneys (the major exception is theophylline). Markedly decreased renal function obviously leads to drug retention. The creatinine clearance rate is commonly used to estimate the degree of residual kidney function. When the serum creatinine level is more than twice the upper reference limit, creatinine clearance is usually less than 25% of normal, and the measurement is less accurate. In addition, creatinine clearance is somewhat reduced in the elderly, and some maintain that clearance reference ranges should be adjusted for old age.
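
    The text does not name a particular estimating method; one widely used bedside formula is the Cockcroft-Gault equation, shown here purely as an example (serum creatinine in mg/dl, result in ml/min):

    ```python
    def cockcroft_gault_ml_min(age_years: float, weight_kg: float,
                               serum_creatinine_mg_dl: float,
                               female: bool) -> float:
        """Cockcroft-Gault estimate of creatinine clearance (ml/min)."""
        clearance = (140.0 - age_years) * weight_kg / (72.0 * serum_creatinine_mg_dl)
        return clearance * 0.85 if female else clearance

    # Doubling the serum creatinine halves the estimate, and advanced age lowers
    # it further, consistent with the points made above.
    print(f"{cockcroft_gault_ml_min(40, 70, 1.0, female=False):.0f} ml/min")  # ~97
    print(f"{cockcroft_gault_ml_min(80, 70, 2.0, female=False):.0f} ml/min")  # ~29
    ```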

    Dosage. Size and frequency of dose obviously affect drug blood levels.

    Age. Infants in general receive the same dose per unit weight as adults; children receive twice the dose, and the elderly receive less. A very troublesome period is the transition between childhood and puberty (approximately ages 10-13 years) since dosage requirements may change considerably and without warning within a few months.

    Weight. Dosage based on weight yields desirable drug blood levels more frequently than arbitrary, fixed-dose schedules. One assumes that a larger person has a larger total blood volume and extracellular fluid space within which the drug is distributed and a larger liver to metabolize the drug.

    Interference from other medications. Such interference may become manifest at any point in drug intake, metabolism, tissue therapeutic effect, and excretion, as well as lead to possible artifact in technical aspects of drug assay.

    Effect of disease on any previously mentioned factors. This most frequently involves considerable loss of renal or hepatic function.

    Assay of peak or residual level. In general, peak levels correlate with toxicity, whereas residual (trough) levels are more an indication of proper therapeutic range (i.e., whether the blood level remains within the therapeutic range). Of course, if the residual level is in the toxic range this is an even stronger indication of toxicity. An exception to the general rule is the aminoglycoside antibiotic group, in which the peak level is used to indicate whether therapeutic levels are being reached and the residual level is considered (some disagreement exists on this point) to correlate best with nephrotoxicity. For most drugs, the residual level should be kept within the therapeutic range and the peak level should be kept out of the toxic range. To avoid large fluctuations, some have recommended that the dose interval be one half of the drug half-life; in other words, the drug should be administered at least once during each half-life.
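
    The recommendation to dose at intervals of half the half-life can be quantified: with first-order elimination (and absorption time ignored), the level falls by a factor of 2^(-interval/half-life) between doses. A minimal sketch:

    ```python
    def trough_to_peak_ratio(dose_interval_h: float, half_life_h: float) -> float:
        """Fraction of the peak level remaining at the trough, assuming simple
        first-order decay between doses: 2**(-interval / half-life)."""
        return 2.0 ** (-dose_interval_h / half_life_h)

    print(f"{trough_to_peak_ratio(3, 6):.0%}")   # 71%: interval = half the half-life
    print(f"{trough_to_peak_ratio(12, 6):.0%}")  # 25%: interval = two half-lives
    ```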

    One of the most important laboratory problems of drug level monitoring is the proper time in relationship to dose administration at which to obtain the specimen. There are two guidelines. First, the drug blood level should have reached steady state or equilibrium, which as a rule of thumb takes five drug half-lives. Second, the drug blood level should be at a true peak or residual level. Peak levels are usually reached about 1-2 hours after oral intake, about 1 hour after intramuscular administration, or about 30 minutes after IV medication. Residual levels are usually reached shortly (0-15 minutes) before the next scheduled dose. The greatest problem is being certain when the drug was actually given. I have had best results by first learning when the drug is supposed to be given. If a residual level is needed, the nursing service is then instructed to withhold the dose. The blood specimen is drawn approximately 15 minutes before the scheduled dose time, and the nursing service is then told to administer the dose. If a peak level is needed, the laboratory technologist should make arrangements to have the nursing service record the exact minute that the dose is given and telephone the laboratory. Unless the exact time the specimen was obtained and the exact time the drug dose was given are both known with certainty, drug blood level results cannot be properly interpreted and may be greatly misleading.

    Laboratory technical factors. These include the inherent technical variability of any drug assay method (expressed as a coefficient of variation) as well as the other sources of error discussed in Chapter 1. Therapeutic drug monitoring assays in general have shown greater differences between laboratories than found with simple well-established tests such as blood urea nitrogen or serum glucose levels.

    Patient compliance. Various studies have shown astonishingly high rates of patient noncompliance with dose instructions, including failure to take any medication at all. Possibly 20%-80% of all patients may be involved. Noncompliance results in subtherapeutic medication blood levels. Some believe that noncompliance is the most frequent cause of problems in patients on long-term therapy.

    Therapeutic and toxic ranges

    Therapeutic ranges are drug blood levels that have been empirically observed to correlate with desired therapeutic effects in most patients being treated for an uncomplicated disease. The same relationship is true for toxicity and toxic ranges. However, these ranges are not absolute and do not cover the response to a drug in all individual patients or the response when some unexpected factor (e.g., other diseases or other drugs) is superimposed. The primary guide to therapy is a good therapeutic response without evidence of toxicity. Most of the time this will correspond with a drug blood level within the therapeutic range, so the therapeutic range can be used as a general guideline for therapy. In some cases a good response does not correlate with the therapeutic range. In such cases the assay should be repeated on a new specimen to exclude technical error or specimens drawn at the wrong time in relation to dose. If the redrawn result is unchanged, clinical judgment should prevail. Some attempt should be made, however, to see if there is some factor that is superimposed on the disease being treated that could explain the discrepancy. Removal or increase of such a factor could affect the result of therapy at a later date. The same general statements are true for toxicity and toxic ranges. Some patients may develop toxicity at blood levels below the statistically defined toxic range and some may be asymptomatic at blood levels within the toxic range. However, the further the values enter into the toxic range, the more likely it is that toxicity will develop. Thus, patient response and drug level data are both important, and both are often necessary to interpret the total picture.

    Some Conditions That Produce Unexpected Therapeutic Drug Monitoring Results
    High plasma concentration on normal or low prescribed dose:
    Patient accidental overdose
    Slow metabolizer
    Drug interaction that blocks original drug metabolism in liver or injures the liver
    Poor liver function (severe damage)
    Drug excretion block
    Increased binding proteins
    Residual level determined on sample drawn after dose was administered instead of before
    Laboratory technical factors
    Low plasma concentration on normal or high prescribed dose
    Poor drug absorption (oral dose)
    Interference by another drug
    Patient noncompliance
    Fast metabolizer
    Decreased binding proteins
    Peak level determined on sample drawn at incorrect time
    Laboratory technical factors
    Toxic symptoms with blood levels in therapeutic range
    Drug released from proteins (free drug increased)
    Drug effect enhanced at tissue level by some other drug or condition
    Blood level obtained at incorrect time
    Laboratory technical factors
    Symptoms may not be due to toxicity of that drug

    When to obtain specimens for therapeutic drug monitoring

    If a patient develops symptoms that might be caused by a drug, the best time to obtain a specimen for TDM is during the period when the patient has the symptoms (if this is not possible, within a short time afterward). One possible exception, however, is digoxin, whose blood level does not equilibrate with tissue levels until at least 6-8 hours after the dose is given. Therefore, specimens for digoxin TDM should not be drawn less than 6 hours after administration of the previous dose, even if toxic symptoms occur earlier. It should be ascertained how much time elapsed between the onset of toxic symptoms and the time of the last previous medication dose. This information is necessary to determine if there is a relationship of the symptoms to the peak blood level of the drug. If the specimen cannot be drawn during symptoms, the next best alternative is to deliberately obtain a specimen at the peak of the drug blood level. This will indicate if the peak level is within the toxic range. In some instances it may be useful to obtain a blood specimen for TDM at a drug peak level even without toxic symptoms, to be certain that the drug dosage is not too high.

    In some cases the question is not drug toxicity but whether dosage is adequate to achieve the desired therapeutic effect. In that case, the best specimen for TDM is one drawn at the residual (valley or trough) drug level, shortly before the next medication dose is given. The major exception to this rule is theophylline, for which a peak level is more helpful than a residual level.

    For most drugs, both peak and residual levels should be within the therapeutic range. The peak value should not enter the toxic range and the residual value should not fall to therapeutically inadequate levels.
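
    The rule in the preceding paragraph can be stated as a pair of comparisons. A minimal sketch follows, with an assumed illustrative therapeutic range (the function name and example values are not from the text):

        # Sketch of the peak/residual rule above: the peak should not enter the
        # toxic range and the residual should not fall to an inadequate level.
        def check_levels(peak, residual, therapeutic_low, therapeutic_high):
            problems = []
            if peak > therapeutic_high:
                problems.append("peak enters the toxic range")
            if residual < therapeutic_low:
                problems.append("residual falls to an inadequate level")
            return problems or ["both levels within the therapeutic range"]

        # Example with an assumed 10-20 µg/ml therapeutic range:
        print(check_levels(peak=18.0, residual=11.0,
                           therapeutic_low=10.0, therapeutic_high=20.0))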

    Information on some of the medications for which TDM is currently being used is given in Table 37-25. The box lists some conditions that produce unexpected TDM results.

    Summary

    Therapeutic drug monitoring can be extremely helpful in establishing drug levels that are both therapeutically adequate and nontoxic. To interpret TDM results, the clinician should know the pharmacodynamics of the medication, ascertain that steady-state levels have been achieved before ordering TDM assays, try to ensure that specimens are drawn at the correct time in relation to dose administration, be aware of effects from other medication, and view TDM results as one component in the overall clinical picture rather than the sole basis for deciding whether drug dosages are correct. Drug monitoring is carried out in two basic situations: (1) in an isolated attempt to find the reason for therapeutic failure (either toxic symptoms or nonresponse to therapy) and (2) to obtain a baseline value after sufficient time has elapsed for stabilization. Baseline values are needed for comparison with future values if trouble develops and to establish the relationship of a patient’s drug blood level to accepted therapeutic range. This information can be invaluable in future emergencies.

    Comments on therapeutic drug monitoring assay

    To receive adequate service, the physician must provide the laboratory with certain information as well as the patient specimen. This information includes the exact drug or drugs to be assayed, patient age, time elapsed from the last dose until the specimen was obtained, drug dose, and route of administration. All of these factors affect normal values. It is also desirable to state the reason for the assay (i.e., what is the question that the clinician wants answered) and provide a list of medications the patient is receiving.

    Some (not all) of the methods used in drug assay include gas-liquid chromatography (technically difficult but especially useful when several drugs are being administered simultaneously, as frequently occurs in epileptics), thin-layer chromatography (TLC; more frequently used for the hypnotic drugs), radioimmunoassay (RIA), fluorescence-polarization immunoassay, and enzyme-multiplied immunoassay (EMIT).

    One of the major reasons why TDM has not achieved wider acceptance is that reliable results are frequently not obtainable. Even when they are, the time needed to obtain a report may be several days rather than several hours. It is essential that the physician be certain that the reference laboratory, whether local or not, is providing reliable results. Reliability can be investigated in several ways: by splitting patient samples between the laboratory being evaluated and a reference laboratory whose work is known to be good (although if isolated values are discrepant, a question may arise as to which result is correct); by splitting samples and sending one portion one week and the remainder the next week; or by obtaining standards from commercial companies and submitting them as unknowns. Most good reference laboratories will do a reasonable amount of such testing without charge if requested to do so beforehand.

    In some situations, assay results may be misleading without additional information. With certain drugs, such as phenytoin (Dilantin), digitoxin, and quinidine, a high percentage is bound to serum albumin and only the nonbound fraction is metabolically active. This is similar to thyroid hormone protein binding. The free (nonbound) fraction may be increased in hypoalbuminemia or in conditions that change protein binding, such as uremia or administration of drugs that block binding or compete for binding sites. Drug level assays measure total drug and do not reflect changes in protein binding. In addition, some drugs, diseases, or metabolic states may potentiate or inhibit the action of certain therapeutic agents without altering blood levels or protein binding. An example is digoxin toxicity induced by hypokalemia.
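
    Since assays report total drug, a shift in protein binding changes the active free concentration without changing the reported number. A worked illustration follows; all numbers are invented for demonstration (the free fractions assumed here are not patient data from the text):

        # Why total-drug assays can mislead when protein binding changes.
        total = 15.0                      # reported total drug level, µg/ml

        free_fraction_normal = 0.10       # ~10% free with normal albumin (assumed)
        free_fraction_low_albumin = 0.20  # free fraction roughly doubled (assumed)

        print(total * free_fraction_normal)       # 1.5 µg/ml active free drug
        print(total * free_fraction_low_albumin)  # 3.0 µg/ml free drug -- doubled,
                                                  # yet the reported total is unchanged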

  • Other Congenital Diseases

    There are a large number of congenital and genetic disorders, too many to include all in this book. If such a condition is suspected, in general the best procedure is to refer the patient or family to a university center that has an active genetics diagnosis program. If the state government health department has a genetic disease detection program, it can provide useful information and help in finding or making referral arrangements.

    Some Genetic Disorders Diagnosable with DNA Probes

    Huntington’s chorea
    Adult polycystic disease
    Alpha and beta thalassemia
    Congenital adrenal hyperplasia
    Duchenne’s and Becker’s muscular dystrophy
    Fragile X syndrome
    Hemophilia A and B
    Myotonic dystrophy
    Osteogenesis imperfecta
    Alpha-1 antitrypsin deficiency
    Cystic fibrosis
    Sickle cell hemoglobinopathy
    Retinoblastoma
    Familial hypertrophic cardiomyopathy

  • Porphyrias

    In porphyric diseases, the main similarity is the abnormal excretion of substances that are precursors of the porphyrin compound heme (of hemoglobin). The known pathways of porphyrin synthesis begin with glycine and succinate, which are combined to eventually form delta-aminolevulinic acid (ALA). This goes on to produce a substance known as “porphobilinogen,” composed of a single pyrrole ring. Four of these rings are joined to form the tetrapyrrole compound protoporphyrinogen; this is the precursor of protoporphyrin, which, in turn, is the precursor of heme. The tetrapyrrole compounds exist in eight isomers, depending on where certain side groups are located. The only isomeric forms that are clinically important are I and III. Normally, very small amounts of porphyrin degradation products appear in the feces or in the urine; these are called “coproporphyrins” or “uroporphyrins” (their names refer to where they were first discovered, but both may appear in either urine or feces).
    The porphyrias have been classified in several ways, none of which is entirely satisfactory. The most common system includes erythropoietic porphyria (EP), hepatic porphyria, mixed porphyria, porphyria cutanea tarda (PCT), and acquired (toxic) porphyria. EP is a small group of rare congenital diseases characterized clinically by skin photosensitivity without vesicle formation, pink discoloration of the teeth that fluoresces under ultraviolet light, and sometimes mild hemolytic anemia. If erythropoietic porphyria is suspected, the best diagnostic test is measurement of erythrocyte porphyrin.

    Hereditary hepatic porphyria may be subdivided into three types: acute intermittent porphyria (AIP; Swedish genetic porphyria), variegate porphyria (VP; South African genetic porphyria), and hereditary coproporphyria (HC). All three are inherited as autosomal dominants, and all three may be associated with episodes of acute porphyric attacks, although such attacks are more widely publicized in association with AIP. All three subdivisions manifest increases in the enzyme ALA-synthetase, which catalyzes formation of ALA from its precursors. AIP is characterized by a decrease of 50% or more in the enzyme uroporphyrinogen-I-synthetase (abbreviated URO-I-S and also known as “porphobilinogen deaminase”), which catalyzes the formation of uroporphyrinogen I from porphobilinogen. Levels of URO-I-S are said to be normal in VP and HC. Acute intermittent porphyria is not associated with photosensitivity, whereas skin lesions due to photosensitivity are common in VP and also occur in HC. Parenthetically, these skin lesions resemble those of PCT, and some of these patients were probably included in the PCT group in some early classifications. In VP and HC, increased amounts of protoporphyrin are excreted in the feces, whereas this does not happen in AIP. Although AIP, VP, and HC all show increased amounts of coproporphyrin in the feces, HC patients excrete much larger amounts of fecal coproporphyrin III than do patients with AIP or VP.

    The porphyrias can also be classified usefully according to clinical symptoms:

    1. Neurologic only: AIP
    2. Cutaneous only: PCT, EP, EPP (erythropoietic protoporphyria)
    3. Both neurologic and cutaneous: VP, HC

    Acute intermittent porphyria. URO-I-S is said to be decreased in all patients with AIP. However, about 5%-10% of AIP patients have values within the reference range, so that some overlap occurs. URO-I-S is also said to be decreased in relatives of patients with AIP, again with some overlap at the borderline areas of the reference range. At least one kindred with a condition closely resembling AIP has been reported with normal URO-I-S levels, but the significance of this is not clear. There may be some laboratory variation in results, and equivocal results may have to be repeated. Blood samples should be stored frozen and kept frozen during transit to the laboratory to avoid artifactual decrease in enzyme activity. Therefore, falsely low URO-I-S values may be obtained through improper specimen handling. Hemolytic anemia or reticulocytosis greater than 5% may produce an increase in URO-I-S activity. Assay for URO-I-S is available mostly in university medical centers and large reference laboratories.

    The acute porphyric attacks consist of colicky abdominal pain, vomiting, and constipation (about 80% of patients) and mental symptoms (10%-30% of patients) such as confusion, psychotic behavior, and occasionally even convulsions. About one half of the patients display hypertension and some type of muscle motor weakness. The attacks are frequently accompanied by leukocytosis. Attacks may be precipitated by certain medications (especially barbiturates), by estrogens, and by carbohydrate deprivation (dieting or starvation), and they usually do not occur until adolescence or adulthood. Porphobilinogen is nearly always present in the urine during the clinical attacks and is an almost pathognomonic finding, but the duration of excretion is highly variable, and porphobilinogen may occasionally have disappeared if it is not searched for early. Between attacks, some patients excrete detectable porphobilinogen and others do not. Urine ALA levels are usually increased during acute attacks but not as markedly as porphobilinogen; during remission, ALA levels also may become normal. Patients with AIP may also have hyponatremia and sometimes have falsely elevated thyroxine (T4) results due to elevated thyroxine-binding protein levels.

    Porphobilinogen is usually detected by a color reaction with Ehrlich’s reagent, confirmed by demonstrating that the color is not removed by chloroform (Watson-Schwartz test). Since false positive results may occur, it is essential to confirm a positive test by butanol (butyl alcohol) extraction: butanol removes most of the other Ehrlich-positive, chloroform-negative substances, whereas porphobilinogen is extracted by neither chloroform nor butanol. A positive result on the porphobilinogen test is the key to diagnosis of symptomatic acute porphyria; some investigators believe that analysis and quantitation of urinary porphyrins or ALA are useful only if Watson-Schwartz results are equivocal. However, the majority believe that a positive qualitative test result for porphobilinogen should be confirmed by quantitative chemical techniques (available in reference laboratories), given the experience with false positive Watson-Schwartz results in various laboratories. They also advise quantitative analysis of porphyrins in urine and feces to differentiate the various types of porphyria. Glucose administration may considerably decrease porphobilinogen excretion.
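
    The extraction logic of the Watson-Schwartz procedure as described above amounts to a short decision tree. The sketch below is an interpretive aid only (the function name is an assumption; inputs represent bench observations of whether the Ehrlich color was removed by each solvent):

        # Decision logic of the Watson-Schwartz test as described above;
        # not a substitute for quantitative confirmation.
        def watson_schwartz(ehrlich_positive: bool,
                            color_removed_by_chloroform: bool,
                            color_removed_by_butanol: bool) -> str:
            if not ehrlich_positive:
                return "negative for Ehrlich-reactive substances"
            if color_removed_by_chloroform:
                return "not porphobilinogen (e.g., urobilinogen)"
            if color_removed_by_butanol:
                return "not porphobilinogen (other Ehrlich-positive substance)"
            return "consistent with porphobilinogen -- confirm quantitatively"

        print(watson_schwartz(True, False, False))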

    Some investigators prefer the Hoesch test to the modified Watson-Schwartz procedure. The Hoesch test also uses Ehrlich’s reagent but is less complicated and does not react with urobilinogen. The possibility of drug-induced false reactions has not been adequately investigated. Neither test has optimal sensitivity. In one study the Watson-Schwartz test could detect porphobilinogen only about 50% of the time when the concentration was 5 times normal. Quantitative biochemical methods available in reference laboratories are more sensitive than these screening tests.

    Porphyria cutanea tarda is a chronic type of porphyria. There usually is some degree of photosensitivity, but it does not develop until after puberty. There often is some degree of liver disease. No porphobilinogen is excreted and acute porphyric attacks do not occur.

    Toxic porphyria may be produced by a variety of chemicals, but the most common is lead. Lead poisoning produces abnormal excretion of coproporphyrin III but not of uroporphyrin III. ALA excretion is also increased.

    Familial dysautonomia (Riley-Day syndrome). Riley-Day syndrome is a familial disorder characterized by a variety of signs and symptoms, including defective lacrimation, relative indifference to pain, postural hypotension, excessive sweating, emotional lability, and absence of the fungiform papillae on the anterior portion of the tongue. Most of those affected are Jewish. Helpful laboratory tests include an increased urine homovanillic acid value and a decreased serum level of dopamine-beta-hydroxylase (DBH), an enzyme that helps convert dopamine to norepinephrine. Besides Riley-Day syndrome, DBH may also be decreased in Down’s syndrome (mongolism) and Parkinson’s disease. It has been reported to be elevated in about 50% of patients with neuroblastoma, in stress, and in certain congenital disorders (results in the congenital disorders have not been adequately confirmed). There is disagreement as to values in patients with hypertension.

  • Enzyme Deficiency Diseases

    Congenital cholinesterase deficiency (succinylcholine apnea). Cholinesterase is an enzyme best known for its role in regulation of nerve impulse transmission via breakdown of acetylcholine at the nerve synapse and neuromuscular junction. There are two categories of cholinesterase: acetylcholinesterase (“true cholinesterase”), found in RBCs and nerve tissue; and serum cholinesterase (“pseudocholinesterase”). Cholinesterase deficiency became important when it was noted that such patients were predisposed to prolonged periods of apnea after administration of succinylcholine, a competitor to acetylcholine. Serum cholinesterase inactivates succinylcholine, but acetylcholinesterase does not. Serum cholinesterase deficiency may be congenital or acquired; the congenital type is uncommon but is responsible for most of the cases of prolonged apnea. The patient with congenital deficiency seems to have an abnormal (“atypical”) cholinesterase, of which several genetic variants have been reported.

    Laboratory diagnosis. Serum cholinesterase assay is currently the best screening test for cholinesterase deficiency. If abnormally low values are found, it is necessary to perform inhibition procedures with dibucaine and fluoride to distinguish congenital deficiency (atypical cholinesterase) from acquired deficiency of the normal enzyme. Deficiency of normal enzyme may cause prolonged succinylcholine apnea but not predictably and usually only in very severe deficiency. Acute or chronic liver disease is the most frequent etiology for acquired deficiency. Hypoalbuminemia is frequently associated in hepatic or nonhepatic etiologies. A considerable number of drugs lower serum cholinesterase levels and thus might potentiate the action of succinylcholine.

    Cholinesterase levels are also decreased in organic phosphate poisoning; this affects both RBC and plasma enzyme levels. Screening tests have been devised using “dip-and-read” paper strips. These are probably satisfactory for ruling out phosphate insecticide poisoning but are not accurate in diagnosis of potential for succinylcholine apnea.

    Alpha-1 antitrypsin deficiency. Alpha-1 antitrypsin (AAT) is a serine protease inhibitor that inactivates trypsin, but its primary importance is inactivation of neutrophil elastase, which breaks down elastic fibers and collagen. AAT is produced by the liver and makes up about 90% of the globulins that migrate in the alpha-1 region on electrophoresis. AAT deficiency has been associated with two different diseases: pulmonary emphysema in adults (relatively common) and cirrhosis in children (rare). This type of emphysema is characteristically, although not invariably, more severe in the lower lobes. A substantial number of those with homozygous antitrypsin deficiency are affected; reports differ on whether heterozygotes have an increased predisposition to emphysema or to pulmonary disease.

    Laboratory diagnosis. The most useful screening test at present is serum protein electrophoresis; the alpha-1 globulin peak is absent or nearly absent in homozygotes. More definitive diagnosis, as well as separation of severe from intermediate degrees of deficiency, may be accomplished by quantitation of AAT using immunoassay methods such as immunonephelometry or immunodiffusion. Estrogen therapy (birth control pills) may elevate AAT levels. Since this protein is one of the acute-phase reactants of the alpha-1 and alpha-2 globulin group on electrophoresis, values are frequently elevated in acute or severe chronic infections, sarcoidosis, inflammation, active rheumatoid-collagen disease, steroid therapy, tissue destruction, and some cases of malignancy. In some cases, measurement of other acute-phase reactants, such as C-reactive protein or serum haptoglobin, may help decide whether AAT is elevated for this reason. Conditions besides congenital deficiency that reduce AAT activity include severe protein loss, severe renal disease, malabsorption, and thyroiditis.

    The gene for AAT is located on the long arm of chromosome 14 (14q). There are a considerable number of allelic variants of the AAT gene (often called the protease inhibitor, or Pi, gene). Most normal persons have an MM phenotype; most carriers are MZ; and most symptomatic deficiency patients are ZZ. Definitive diagnosis can be made in most cases by DNA probe, either by direct analysis with M and Z probes or by restriction fragment length polymorphism (RFLP) methods.

    Biotinidase deficiency. Biotin is a water-soluble vitamin that is present in most common foods and, in addition, can be synthesized by GI tract bacteria. Biotin is a cofactor for several carboxylase enzymes involved in the carboxylic acid cycle, leucine metabolism, and propionic acid metabolism. Biotinidase converts the precursor substance biocytin to biotin. Biotinidase deficiency therefore prevents conversion of dietary biocytin to biotin and forces dependence on biotin produced by GI tract bacteria. Suppression of these bacteria, or inactivation of biotin by certain substances such as the glycoprotein avidin in egg white (most commonly from eating large quantities of raw eggs), can precipitate biotin deficiency. Other possible causes of biotin deficiency include chronic hemodialysis, long-term total parenteral nutrition without biotin supplementation, and occasionally long-term anticonvulsant therapy. Symptoms include retarded growth, weakness, ataxia, hair loss, skin rash, metabolic acidosis, and sometimes convulsions.

    Laboratory diagnosis. Neonatal screening for biotinidase deficiency can be done on heelstick blood spotted on filter paper using a variety of assay methods. The same methods can be used on venous blood.

  • Abnormalities of Glandular Secretion

    Cystic fibrosis. Cystic fibrosis (mucoviscidosis, or fibrocystic disease of the pancreas) is the most common eventually lethal autosomal recessive inherited disorder in Europeans (estimated incidence of 1 in 2,000 live births). Incidence in African Americans is 2% of that in Europeans; the disease is rare in Asians. About 90% of homozygotes have symptoms resulting predominantly from damage to mucus-producing exocrine glands, although non-mucus-producing exocrine glands can also be affected. The mucus glands produce abnormally viscid secretions that may inspissate, plug the gland ducts, and generate obstructive complications. In the lungs, this may lead to recurrent bronchopneumonia, the most frequent and most dangerous complication of cystic fibrosis. Pseudomonas aeruginosa and Staphylococcus aureus are the most frequent pathogens. The next most common abnormality is complete or partial destruction of the exocrine portions of the pancreas, leading to various degrees of malabsorption, steatorrhea, digestive disturbances, and malnutrition. This manifestation varies in severity; about 15% of patients have only a minimal disorder or even normal pancreatic exocrine function. Less common findings are biliary cirrhosis, most often focal, due to obstruction of bile ductules, and intestinal obstruction by inspissated meconium (meconium ileus), found in 10%-15% of newborns with cystic fibrosis. Gamma-glutamyltransferase (GGT) is elevated in about one third of patients, predominantly those with some degree of active bile duct injury or cirrhosis.

    Sweat test for screening. Non-mucus-producing exocrine glands such as the sweat glands do not ordinarily cause symptoms. However, they are also affected, because the sodium and chloride concentration in sweat is higher than normal in patients with cystic fibrosis, even though the volume of sweat is not abnormally increased. Therefore, unusually high quantities of sodium and chloride are lost in sweat, and this fact is utilized for diagnosis. Screening tests (silver nitrate or Schwachman test) that depend on the incorporation of silver nitrate into agar plates or special paper have been devised. The patient’s hand is carefully washed and dried, since previously dried sweat will leave a concentrated chloride residue on the skin and give a false positive result. After an extended period or after exercise to increase secretions, the palm or fingers are placed on the silver nitrate surface. Excess chlorides will combine with the silver nitrate to form visible silver chloride. However, this method is not accurate in patients less than 2 months old.

    Sweat test for diagnosis. For definitive diagnosis, sweat is collected by plastic bag or by a technique known as iontophoresis. Iontophoresis using the Gibson-Cooke method is the current standard procedure. Sweat is induced by sweat stimulants such as pilocarpine. The iontophoresis apparatus consists of two small electrodes that create a tiny electric current to transport the stimulating drug into the sweat glands of the skin. The sweat is collected in small gauze pads. The procedure is painless. According to a report from a committee sponsored by the Cystic Fibrosis Foundation in 1983, the standard Gibson-Cooke method is difficult to perform in neonates, and it is better to wait until age 4 weeks if possible. Modifications of the equipment and collection system have been devised and are commercially available.

    In children, a sweat chloride content greater than 60 mEq/L (60 mmol/L) or a sweat sodium content greater than 70 mEq/L (70 mmol/L) is considered definitely abnormal. Sodium and chloride may normally be higher (75-80 mEq/L) during the first 3 days of life, decreasing to childhood values by the fourth day. In children there is an equivocal zone (50-80 mEq/L chloride, possibly even up to 90 mEq/L) in which the diagnosis should be considered unproved; repeated determinations using good technique and an adequate quantity of sweat are needed. Sweat volumes weighing less than 50 mg are not considered reliable for analysis. For diagnostic sweat collection it is recommended that the hand not be used, because the concentration of electrolytes in the palm is significantly greater than elsewhere. A further caution pertains to reference values in persons over age 15 years. In one study, only 5% of children with cystic fibrosis had sweat chloride values less than 50 mEq/L and only 3% of controls had values of 60-70 mEq/L; by contrast, 34% of a group of normal adults had sweat sodium concentrations greater than 60 mEq/L, and in 4% values were more than 90 mEq/L. Another report did not verify these data; therefore, the diagnosis may be more difficult in adults.
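
    The pediatric decision points quoted above can be summarized in a small classifier. Note that the text’s equivocal zone (50-80, possibly 90 mEq/L) overlaps its “definitely abnormal” cutoff (>60 mEq/L); the sketch below (function name assumed) takes the simpler reading and leaves the overlap to the comments. Repeat testing is required in any case:

        # Sketch of the pediatric sweat chloride decision points quoted above.
        # The equivocal zone in the text may extend above 60 mEq/L; this sketch
        # uses the simpler reading for illustration only.
        def classify_sweat_chloride_child(chloride_meq_l: float) -> str:
            if chloride_meq_l > 60:
                return "abnormal (consistent with cystic fibrosis; repeat to confirm)"
            if chloride_meq_l >= 50:
                return "equivocal (diagnosis unproved; repeat with good technique)"
            return "not elevated"

        for value in (40, 55, 75):
            print(value, classify_sweat_chloride_child(value))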

    One might gather from this discussion that sweat electrolyte analysis is no more difficult than standard laboratory tests such as blood gas analysis or serum protein electrophoresis. Unfortunately, surveys have shown that the majority of laboratories have a relatively low level of accuracy for sweat testing. The newer commercially available iontophoresis modifications have not been shown to consistently achieve the accuracy of a carefully performed Gibson-Cooke analysis, although some evaluations have been very favorable. Authorities in cystic fibrosis strongly recommend that the diagnosis of cystic fibrosis not be made or excluded with certainty on the basis of a single sweat analysis; a minimum of two tests showing unequivocal results, with adequate control results, is necessary. It is preferable to refer a patient with symptoms suggestive of cystic fibrosis to a specialized cystic fibrosis center experienced in the Gibson-Cooke technique to obtain a definitive diagnosis.

    Clinically normal heterozygotes and relatives of patients with cystic fibrosis have been reported to have abnormal sweat electrolytes in 5%-20% of instances, although some investigators dispute these findings.

    Some investigators feel that sweat chloride provides better separation of normal persons from persons with cystic fibrosis than sweat sodium.

    DNA linkage analysis. The gene causing cystic fibrosis is located on the long arm of chromosome 7 (7q). The most common variant of cystic fibrosis (70% of cases) results from deletion of the three-nucleotide sequence coding for a phenylalanine at amino acid position (codon) 508 of the cystic fibrosis gene (often called delta F508). When combined with the four next most common gene abnormalities, current DNA probe techniques using the indirect linkage analysis method reportedly have a sensitivity of 85%. Some laboratories use probes for more than five genetic defects and claim a sensitivity of 90% or more. This technique can be applied to prenatal diagnosis in the first trimester using a chorionic villus biopsy specimen.

    Other screening tests. Other tests have been suggested to screen for cystic fibrosis. Trypsin is an enzyme produced only in the pancreas. Serum trypsin is elevated in the first months or years of the disease (as the pancreatic cells are destroyed) but eventually decreases below the reference range. However, this leads to considerable overlap with normal persons (in the transition stage between elevated values and decreased values), so that only unequivocally high or low values are significant. The precursor of trypsin is trypsinogen, and levels of this proenzyme are elevated in most neonates with cystic fibrosis and can be measured by immunoassay (immunoreactive trypsinogen, IRT). IRT assay can be performed on filter paper dried blood spots in a similar manner to other neonatal congenital disease screening. Several screening projects have been reported with good results. However, the Cystic Fibrosis Committee noted that various immunoassays have not been standardized, cutoff detection levels are not uniform, and there is a possibility that some of the 10% of infants with cystic fibrosis who have normal pancreatic function could be missed. IRT usually declines in infants with cystic fibrosis after a few weeks, so that repeat testing results would be difficult to interpret. Also, IRT may be normal in some infants with meconium ileus. A 1991 state screening program using IRT detected 95% of infants with cystic fibrosis and used a lower cutoff point on repeat testing to compensate for expected decrease in IRT.

    Other methods involve testing meconium in the first stools produced by newborns. One technique uses a paper dipstick to test for increased protein levels (which are predominantly albumin). Since 15%-25% of infants with cystic fibrosis have normal degrees of pancreatic enzyme activity, the test yields at least that proportion of false negative results. In addition, it yields a considerable number of false positive results. The greatest number of false positives (about 50% of all positive specimens in one study) comes from low-birth-weight infants. Other causes of false positive results include contamination by blood, protein from infant formula, and protein in baby cream. Another approach involves a test for glucose in the meconium stools, which is supposed to reflect the presence or absence of lactase activity. In one study, about one third of cystic fibrosis cases were missed by both the albumin and the glucose (lactase activity) tests.

  • Diseases of Mineral Metabolism

    Wilson’s disease (hepatolenticular degeneration). Wilson’s disease is a familial disorder of copper metabolism transmitted as an autosomal recessive trait. It most often becomes manifest between ages 8 and 30 years; symptoms usually do not develop before age 6 years. About 30%-50% of patients initially develop hepatic symptoms, about 30%-40% begin with neurologic symptoms, and about 20%-30% initially are said to have psychiatric abnormalities such as schizophrenia. A few patients develop a Coombs’-negative hemolytic anemia. Children are more likely to be first seen with hepatic symptoms, although symptoms may occur at any age. In children, these most commonly take the form of chronic hepatitis, although in some patients the test results may resemble those of acute viral hepatitis. A macronodular type of cirrhosis develops later and is usually present in patients with late-stage Wilson’s disease, whether or not there were symptoms of active liver disease. Some patients present with minimally active or nonactive cirrhosis. Neurologic symptoms typically originate in the basal ganglia area (lentiform nucleus) of the brain and consist of varying degrees of incoordination, tremor, spasticity, rigidity, and dysarthria. There may also be a peculiar flapping tremor. Some young or middle-aged adults develop premature osteoarthritis, especially in the knees.

    Wilson’s disease is characterized by inability of the liver to manufacture normal quantities of ceruloplasmin, an alpha-2 globulin that transports copper. For reasons not entirely understood, excessive copper is deposited in various tissues, eventually producing damage to the basal ganglia of the brain and to the liver. The kidney is also affected, leading to aminoaciduria, and copper is deposited in the cornea, producing a zone of discoloration called the Kayser-Fleischer ring.

    Clinical diagnosis. The triad of typical basal ganglia symptoms, Kayser-Fleischer ring, and hepatic cirrhosis is virtually diagnostic. However, many patients do not have the textbook picture, especially in the early stages. The Kayser-Fleischer ring is often grossly visible but in many cases can be seen only by slit lamp examination. All patients with neurologic symptoms are said to have the Kayser-Fleischer ring as well as about 50% (range, 27%-93%) of those with hepatic symptoms. The Kayser-Fleischer ring is present in only about 20% (range, 0%-37%) of asymptomatic patients detected during family study investigation or at the beginning of symptoms from hepatic disease without neurologic findings. Overall, about 25% of patients (range, 22%-33%) do not have a demonstrable Kayser-Fleischer ring at the time of diagnosis. Patients with primary biliary cirrhosis or, occasionally, other types of chronic cholestatic liver disease may develop a corneal abnormality identical to the Kayser-Fleischer ring.

    Plasma ceruloplasmin assay. Laboratory studies may be of value in diagnosis, especially in the preclinical or early stages. Normally, about 90%-95% of serum copper is bound to ceruloplasmin, one of the alpha-2 globulins; the primary excretion pathway for serum copper is through bile. The serum ceruloplasmin level is low from birth in 95% (range, 90%-96%) of homozygous patients and is considered the best screening test for Wilson’s disease. About 10% (range, 6%-20%) of Wilson’s disease heterozygotes have decreased serum ceruloplasmin. However, normal newborn infants usually have decreased ceruloplasmin levels, and the test is not considered reliable until 3-6 months of age. Although a normal ceruloplasmin level (over 20 mg/100 ml; 200 mg/L) is usually interpreted as excluding Wilson’s disease, about 5% (range, 4%-10%) of homozygous Wilson’s disease patients have values greater than 20 mg/100 ml; this is more likely in younger children and in those with hepatic disease. Estrogen therapy, pregnancy, active liver disease of various etiologies, malignant lymphoma, and occasionally various acute inflammatory conditions (ceruloplasmin is one of the “acute reaction” proteins) can raise ceruloplasmin levels in variable numbers of cases. Smoking is reported to raise ceruloplasmin levels about 15%-30%. Although a decreased ceruloplasmin level is usually considered suggestive of Wilson’s disease, about 5% of normal persons have values less than 20 mg/100 ml (200 mg/L), and values may be decreased in hereditary tyrosinemia, Menke’s kinky hair syndrome, the nephrotic syndrome, malabsorption syndromes such as sprue, and various liver diseases (about 20% of cases in one study); however, it is possible that some patients with liver disease and decreased ceruloplasmin levels actually have Wilson’s disease.
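
    The screening cutoff and its caveats can be expressed as a simple check. A minimal sketch follows (the function name is an assumption; the cutoff and caveats are those quoted above):

        # Ceruloplasmin screening cutoff discussed above.
        # Units: 20 mg/100 ml = 20 mg/dl = 200 mg/L.
        CUTOFF_MG_DL = 20.0

        def ceruloplasmin_screen(value_mg_dl: float) -> str:
            # Caveats from the text: ~5% of homozygotes exceed the cutoff and
            # ~5% of normal persons fall below it, so this is suggestive only.
            if value_mg_dl < CUTOFF_MG_DL:
                return "decreased -- suggestive of Wilson's disease (not diagnostic)"
            return "normal -- Wilson's disease unlikely but not excluded"

        print(ceruloplasmin_screen(12.0))
        print(ceruloplasmin_screen(25.0))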

    Liver biopsy has also been used for diagnosis. The microscopic findings are not specific and most often consist of either macronodular cirrhosis (often with some fatty change and occasionally with Mallory bodies) or chronic active hepatitis (10%-15% of patients with Wilson’s disease). The most typical finding is increased hepatic copper content demonstrated by special stains (or tissue analysis, if available). For histologic staining of copper, fixation of the biopsy specimen in alcohol rather than in the routine fixatives is recommended. Here again, it is advisable to wait 6-12 weeks after birth. Increased hepatic copper content is not specific for Wilson’s disease, since some degree of copper increase has been reported in some patients with postnecrotic cirrhosis due to viral hepatitis, in patients with primary biliary cirrhosis, and occasionally in patients with other chronic cholestatic syndromes. Also, increased hepatic copper content is not present in all patients with Wilson’s disease, especially in small-needle biopsy specimens.

    Serum and urine copper. Total serum copper levels are decreased in 85%-90% of Wilson’s disease patients. However, serum copper not bound to serum ceruloplasmin is usually normal or increased. Twenty-four-hour urine copper excretion in symptomatic Wilson’s disease is increased in 90% of patients. However, 24-hour copper excretion is often normal in presymptomatic patients. Increased urine copper excretion is not specific for Wilson’s disease and may be found in various types of cirrhosis, especially those with some degree of cholestasis and in 10%-30% of chronic active hepatitis patients. However, these conditions usually have normal or elevated serum ceruloplasmin levels.

    DNA probes. The gene affected in Wilson’s disease has been found on the long arm of chromosome 13, close to the gene responsible for retinoblastoma. DNA linkage probes for Wilson’s disease have been reported. In some cases, the retinoblastoma probe has been used.

    Other laboratory abnormalities. Besides abnormalities in copper metabolism, over 50% of patients (78% in one study) have a low serum uric acid level, a finding that could arouse suspicion of Wilson’s disease if supporting evidence is present. Other laboratory findings that may be encountered in some patients are low serum phosphorus levels, thrombocytopenia (about 50%; range, 22%-82%, due to cirrhosis with secondary hypersplenism), aminoaciduria, glucosuria, and uricosuria. A Coombs’-negative hemolytic anemia occurs in a few patients.

    Hemochromatosis. Hemochromatosis is an uncommon disease produced by idiopathic excess iron absorption from the GI tract, which leads to excess deposition of iron in various tissues, especially the liver. There still is dispute as to which iron storage diseases should be included within the term hemochromatosis. In this discussion, hemochromatosis refers to the hereditary iron storage disorder and hemosiderosis to nonhereditary (secondary) forms. Hemochromatosis is transmitted as an autosomal recessive trait with the gene being located on the short arm of chromosome 6 close to the class I histocompatibility antigen (HLA) locus. Males are affected more often than females (3:2 in one series), and males seem overall to have more severe disease than females. HLA-A3 antigen is present in 70%-80% of patients (vs. 20%-30% in the normal population).

    Clinical onset of the disease is usually between ages 40 and 60 years. Signs, symptoms, and laboratory abnormalities depend on the stage of disease and (probably) whether there is also a significant degree of alcohol intake. Cirrhosis, diabetes mellitus, and bronze skin pigmentation form a classic triad diagnostic of hemochromatosis. However, this triad is a late manifestation, and in one study including more early cases it was present in less than 10% of the patients. The most frequent symptom is joint pain (47%-57% of patients; 50%-75% in patients with severe disease), which can be confused with rheumatoid arthritis. Hepatomegaly is present in 54%-93% of patients, cirrhosis on liver biopsy in 57%-94%, heart failure in 0%-35%, hypogonadism (in males) in 18%-61%, skin pigmentation in 51%-85% (not really noticeable in many patients), and clinically evident diabetes in 6%-72%. Alcoholism (15%-50%) or poor nutrition was frequent in some series. Hepatoma has been reported to develop in 15%-30% of patients.

    Laboratory findings include the expected blood glucose abnormalities of diabetes (Chapter 28) in those patients with overt diabetes, and decreased glucose tolerance in some of those without clinical diabetes. AST levels are elevated in 46%-54% of cases, reflecting active liver cell involvement. In one series, AST, alkaline phosphatase (ALP), and gamma-glutamyltransferase were normal or only mildly elevated unless the patient was alcoholic.

    Laboratory iron studies. The body iron abnormality is manifested by an actual or relative increase in serum iron levels and a decrease in total iron-binding capacity (TIBC), producing increased saturation (% saturation) of the TIBC. In addition, hemosiderin can very often be demonstrated in the urine sediment by iron stains. The most sensitive laboratory test for hemochromatosis is percent saturation of TIBC (or of transferrin). Saturation greater than 60% (reference range, 16%-50%) is found in over 90% of male homozygotes and in the 60% of female homozygotes who have iron loading, but this cutoff misses the 40% of females who do not have iron loading. A transferrin saturation cutoff of 50% detects most males and females, with or without iron loading. Therefore, it has been proposed that the screening cutoff point be 60% for males and 50% for females. The serum iron level is increased in more than 80% of patients and the serum ferritin level in more than 72%; both of these tests are usually abnormal in affected males but are much more variable in females. However, in one report about one third of patients with chronic hepatitis B or C also had elevated serum iron, ferritin, and percent saturation, and serum ferritin is often increased by various acute inflammatory conditions. Liver biopsy demonstrates marked deposition of iron in parenchymal cells and frequently reveals cirrhosis.

    The most widely used screening test is serum iron. Elevated values raise the question of hemochromatosis. About 2.4% of normal persons are reported to have elevated serum iron values that spontaneously return to the reference range within 1-2 days; diurnal and day-to-day variation in serum iron must also be considered. Serum iron levels can also be increased in chronic hepatitis B or C infection (46% of cases in one study) and in hemosiderosis (nonhereditary iron overload) due to blood transfusion, chronic severe hemolytic anemias, sideroblastic anemias, alcoholic cirrhosis, parenteral iron therapy, and considerably increased iron intake. Several other conditions that may be associated with increased serum iron levels are listed in Table 37-2. Various conditions can lower the serum iron level (especially chronic iron deficiency and moderate or severe chronic disease without iron deficiency), and if one of these conditions is superimposed on hemochromatosis, the serum iron level might decrease enough to fall within the reference range.

    As noted earlier, the best screening procedure is percent saturation of transferrin, calculated by dividing the serum iron value by the TIBC value and multiplying by 100. However, like serum iron, an increase in percent transferrin saturation is not specific for hemochromatosis, since other conditions can also increase percent saturation, especially alcohol-related active cirrhosis. One study found that drawing specimens after an overnight fast considerably decreased false elevation of percent saturation. In addition, there is considerable variation in the literature as to the percent saturation cutoff point that should be used (50%-80%, with the majority using either 50% or 62%). The lower cutoffs increase sensitivity in detecting hemochromatosis; the higher cutoffs eliminate many patients who do not have hemochromatosis.
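
    The calculation just described, with the sex-specific screening cutoffs proposed earlier, amounts to the following. A minimal sketch (function names are assumptions; the example values are invented):

        # Percent transferrin (TIBC) saturation: serum iron / TIBC x 100.
        # Screening cutoffs of 60% (males) and 50% (females) are the ones
        # proposed in the text; cutoffs in the literature vary (50%-80%).
        def transferrin_saturation(serum_iron_ug_dl: float, tibc_ug_dl: float) -> float:
            return serum_iron_ug_dl / tibc_ug_dl * 100.0

        def hemochromatosis_screen(serum_iron, tibc, sex: str) -> bool:
            cutoff = 60.0 if sex == "male" else 50.0
            return transferrin_saturation(serum_iron, tibc) >= cutoff

        # Example: serum iron 180 µg/dl with TIBC 250 µg/dl -> 72% saturation.
        print(transferrin_saturation(180, 250))          # 72.0
        print(hemochromatosis_screen(180, 250, "male"))  # True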

    Definitive diagnosis is made by liver biopsy and measurement of hepatic iron content. Even liver biopsy iron may not differentiate hemochromatosis from hemosiderosis in some cases, and the liver cells of patients with cirrhosis but without demonstrable abnormality of iron metabolism may display some degree of increased iron deposition.

    Family member screening. Hemochromatosis rarely becomes clinically evident before age 30, so that screening family members of patients has been advocated to detect unrecognized homozygotes to begin therapy before clinical symptoms develop. One study found that percent transferrin saturation detected about 90% of occult homozygotes, whereas assay of serum iron levels detected about 85% and assay of serum ferritin levels detected about 50%.