ISSN: 1938-7172
Issue 1.3

Editor:
Michael A. Fiedler, PhD, CRNA

Contributing Editor:
Chuck Biddle, PhD, CRNA

Guest Editors:
Terri M. Cahoon, MSN, CRNA
Mary A. Golinski, PhD, CRNA
Alfred E. Lupien, PhD, CRNA

Assistant Editor:
Jessica Floyd, BS

A Publication of Lifelong Learning, LLC © Copyright 2007

New health information becomes available constantly. While we strive to provide accurate information, factual and typographical errors may occur. The authors, editors, publisher, and Lifelong Learning, LLC are not responsible for any errors or omissions in the information presented. We endeavor to provide accurate information helpful in your clinical practice. Remember, though, that there is a great deal of information available and we present only some of it here. Also, the comments of contributors represent their personal views, shaped by their knowledge, understanding, experience, and judgment, which may differ from yours. Their comments are written without knowledge of the details of the clinical situation in which you may apply the information. In the end, your clinical decisions should be based upon your best judgment for each specific patient situation. We do not accept responsibility for clinical decisions or outcomes.

Table of Contents

The development of Anesthesia Abstracts continues. This month we have begun working on Continuing Education for Anesthesia Abstracts. While it will take several months for CE to be developed and approved, I expect CE to be available by the end of the year.

After making his debut in last month's issue, Dr. Joe Burkard will be unavailable for a few months. Commander Burkard has been deployed aboard a US Navy vessel.

This month we welcome the contributions of Dr. Mary A. Golinski, Dr. Alfred E. Lupien, and Mrs. Terri M. Cahoon. Each of these CRNAs has extensive experience and expertise and we will profile them in a later issue.

We have added more links in the "Resource Links" area. Be sure not to miss this value-added content.

As the process of developing Anesthesia Abstracts continues, your comments and suggestions are always welcome in "Blog with the Editors" and through the "Contact Us" tab. Your suggestions for content areas you would like to see included in future issues are especially important to us. Contributors will make an effort to cover the areas of most interest to you.

Michael A. Fiedler, PhD, CRNA


Gandhi G, Nuttall G, Abel M, Mullany C, Schaff H, O’Brien P, Johnson M, Williams A, Cutshall S, Mundy L, Rizza R, McMahon M


Intensive intraoperative insulin therapy versus conventional glucose management during cardiac surgery

Ann Intern Med 2007;146:233-43

Gandhi G, Nuttall G, Abel M, Mullany C, Schaff H, O’Brien P, Johnson M, Williams A, Cutshall S, Mundy L, Rizza R, McMahon M




Purpose            The intensive insulin therapy studies conducted to date in cardiac surgical patients have focused on determining whether tight glycemic control (maintaining blood glucose values within the range of ~80-110 mg/dL) in the post-operative period reduces overall morbidity and mortality. It is well known that hyperglycemia is commonplace during cardiac surgery, whether or not the patient has an established diagnosis of diabetes mellitus. Very limited data exist on the optimal management of intraoperative hyperglycemia. The researchers conducted this study to compare outcomes of patients who had intensive insulin therapy during cardiac surgery with those who had conventional intraoperative glucose management. The primary outcome was a composite of death, sternal wound infections, prolonged mechanical ventilation, cardiac dysrhythmias, stroke, and renal failure within 30 days of the open heart procedure. Secondary outcome measures, also compared across the two groups, included length of stay in the intensive care unit and the hospital. The limitations were twofold: the study was conducted at only one tertiary care center (single center study), and the use of a composite end point prevented examining whether outcomes differed by the diabetes status of the patient.

Background            Much research has been conducted on critically ill hospitalized patients regarding tight glycemic control and reduction in overall morbidity and mortality. Many professional organizations now promote rigorous glycemic control, and it has become routine practice, especially for those recovering in intensive care units from surgery and surgical procedures. Intensive glycemic control in this study was defined as maintaining glucose levels between 80 mg/dL and 100 mg/dL; conventional therapy was defined as treating blood glucose values that exceeded 200 mg/dL with intravenous insulin. These same researchers previously conducted an observational study of 409 cardiac surgical patients and noted that intraoperative hyperglycemia was an independent risk factor for peri-operative complications. They reasoned that if hyperglycemia can adversely alter inherent immunity, wound healing, and vascular function, then maintaining normoglycemia during surgery and anesthesia might yield favorable outcomes.

Methodology            This study was a randomized, open-label, controlled trial with blinded observation. Elective cardiac surgery patients were randomly assigned to one of two groups: an intensive insulin therapy group or a conventional treatment group. Those in the intensive insulin treatment group received an insulin infusion when their blood glucose value was >100 mg/dL, the goal being to maintain glucose levels between 80 mg/dL and 100 mg/dL. Patients randomized to the conventional treatment group received an intravenous (IV) insulin bolus when blood glucose values exceeded 200 mg/dL, repeated every hour until the glucose concentration was <200 mg/dL. If glucose values were >250 mg/dL in the conventional group, patients were placed on an infusion until values were less than 150 mg/dL. Glucose values were obtained every 30 minutes during surgery and anesthesia. All patients were maintained on insulin infusions post-operatively. Because the researchers did not know how many patients would qualify for randomization (some might never become hyperglycemic and therefore would not be randomized to either group), they enrolled 400 patients, 200 in each group. The calculated number of subjects needed in each group to detect significant differences was 177; the additional enrollment allowed for patients who did not participate due to normal glucose values. Baseline patient characteristics, i.e., demographic data, and outcome variables were compared across groups.

Result            Clinical and demographic data of the assigned patients did not differ significantly. Approximately 20% of the patients in each group had a pre-existing diagnosis of diabetes mellitus. The two groups did not differ in surgical time or in the use of inotropic agents for more than 48 hours during the peri-operative period. The two groups had similar baseline glucose values (111 mg/dL). After conclusion of bypass, glucose concentrations were lower in the treatment (infusion) group, although not significantly so. At 24 hours in the intensive care unit, glucose values were similar in both groups; however, both groups were receiving insulin via infusion in the ICU. The two groups did not demonstrate statistically significant differences in the composite end point. The researchers did not detect benefits of intensive insulin therapy for the individual components of the composite end point, and, surprisingly, death occurred more frequently in the intensive treatment group (n=4) than in the conventional treatment group (n=0), as did stroke (n=8 versus n=1). The groups did not differ in mean length of stay in either the intensive care unit or the hospital. Patients with a pre-existing diagnosis of diabetes in the intensive treatment group did not appear to have improved outcomes versus those with diabetes in the conventional group.

Conclusion            This original study, testing the effectiveness of intensive intraoperative glycemic control in cardiac surgical patients, a known high risk group, combined with intensive management post-operatively, did not demonstrate a reduction in death, morbidity, or length of hospital and/or ICU stay when compared with conventional glycemic control. The study did have several limitations. It was not blinded, although this was partly controlled for by using objective definitions to assess outcomes, and those who cared for the patients intraoperatively were unaware of the study hypothesis. A composite outcome was chosen as the primary study end point because individual end points, such as death, would have required a sample size nearly impossible to achieve over a limited period of time. The occurrences of death and stroke in the intensive treatment group, while small in number, do not permit speculation about their cause, but they do warrant consideration in future research.



We in the anesthesia community frequently care for patients with known diabetes, as well as acutely ill patients with hyperglycemia, in the operating room. A fairly large amount of recent research has focused on improving postoperative outcomes, including prevention of surgical wound infections, by adhering to intensive or tight glycemic control during the post-surgical period. A substantial piece of information, though, is missing for those of us administering anesthesia, especially for non-cardiac patients: what is the optimal blood glucose level during the anesthetic for patients undergoing surgical procedures, in order to promote positive outcomes? This research was conducted quite soundly, while not without its limitations, and gives us a very solid starting point for addressing the question. The call for future research in this area is strong and the questions are pivotal: to what extent do we control the glucose levels of patients during an anesthetic for a surgical procedure? Does it vary from surgical procedure to surgical procedure? Is there a difference in patient health outcomes when blood glucose values are maintained in a very strict range while under anesthesia? We are in another era of change, and it is exhilarating to maintain a focus on the improvement of post-anesthesia outcomes of care.

Mary A. Golinski, PhD, CRNA





© Copyright 2007 Anesthesia Abstracts · Volume 1 Number 3, June 30, 2007

Equipment & Technology

Reicher J, Reicher D, Reicher M


Use of radio frequency identification (RFID) tags in bedside monitoring of endotracheal tube position

J Clin Monit Comput 2007;21:155-158

Reicher J, Reicher D, Reicher M


Purpose            The purpose of this pilot project was to evaluate the feasibility of assessing endotracheal tube (ETT) position by means of a radio frequency identification (RFID) tag embedded near the cuff of the ETT.

Background            Early detection of incorrect ETT placement and appropriate repositioning of the ETT is essential for safe and effective use. Several methods of determining the correct depth of ETT insertion are used clinically. Compressing the pilot balloon and palpating the bulging of the ETT cuff in the suprasternal notch may be uncomfortable for awake patients and does not rule out right mainstem intubation. The gold standard for determining ETT position is a chest radiograph but x-rays are not feasible for rapid, repeated, on-demand verification of ETT position. As a result, x-rays may not be sufficient to detect ETT migration in a patient who is intubated for a prolonged period.

RFID tags are small electronic devices. They emit no electromagnetic energy until they are stimulated by an RFID reader. When stimulated by a reader device the RFID tag alters and reemits a radio frequency signal detectable by the reader. RFID readers work properly when within 5 cm of the RFID tag. RFID tags are approved for implantation in humans. Assessing the position of ETTs with RFID technology might reduce the need for chest radiographs.

Methodology            An RFID tag was attached to a 6.0 mm cuffed ETT 1 cm above the cuff. The ETT was then properly placed in an Ambu® intubation training mannequin. The mannequin was covered with an opaque cloth and the position of the RFID tag was determined with a handheld RFID reader. The reader had an LED display that showed the strength of the radio frequency signal returning from the RFID tag. That signal is strongest at each end of the tag.

Result            The position of the RFID tag within the mannequin was correctly determined by the RFID reader. When the position of the ETT was changed, the RFID reader repeatedly identified the location of the RFID tag correctly.

Conclusion            This demonstrated the potential value of RFID tags for bedside monitoring of ETT position. Endotracheal tubes with attached RFID tags would allow rapid, real time, repeated, on-demand verification of ETT position. Human trials should be conducted to evaluate their clinical usefulness.



This is either a technology in search of an application or an incredibly useful idea. It may be that an RFID reader can determine the location of an RFID chip very accurately. (The authors claim to within a few millimeters.) The problem is, we cannot see a patient’s internal anatomy. Knowing where the ETT is (by way of the RFID chip) isn’t enough. We need to know where the ETT is in relation to structures such as the vocal cords and the carina. The authors suggest that the correct position of the ETT be determined with a chest x-ray. Once correct ETT position is established, the location of the RFID tag is marked on the patient’s skin with a permanent marker. (This crucial first step would need to be performed quite carefully or the rest of the process would be worthless.) After that, the position of the ETT can be checked against this visible skin surface mark and the ETT repositioned to match its original position as needed.

It is difficult to imagine that this method would be superior to the abilities of an experienced anesthesia provider for relatively short-term use in operative patients. It may have utility outside the OR in patients who are intubated long term and not regularly assessed by anesthesia personnel. The technology is widely used to track packages during shipping and warehousing, and even on store shelves. It is relatively inexpensive, apparently safe, and easy to use. There would seem to be little to lose by giving it a try.


Michael Fiedler, PhD, CRNA



Karacalar S, Ture H, Baris S, Karakaya D, Sarihasan B


Ulnar artery versus radial artery approach for arterial cannulation: a prospective, comparative study

J Clin Anesth 2007;19:209-213

Karacalar S, Ture H, Baris S, Karakaya D, Sarihasan B


Purpose            The purpose of this study was to compare ulnar and radial artery cannulation at the wrist for feasibility, success rate, and complication rate.

Background            The overall success rate of radial arterial cannulation has been reported to be 95%. Many consider the radial artery the preferred site for arterial line insertion. The radial artery is the dominant artery supplying circulation to the hand. Ulnar artery cannulation is possible, but less is known about efficacy and complications following ulnar artery cannulation.

The deep palmar arch, supplied by the radial artery, is complete more often than the superficial palmar arch, supplied by the ulnar artery. When collateral circulation from the palmar arches is incomplete, collateral circulation may be available via interosseous arteries. As a result, ulnar artery cannulation is unlikely to compromise circulation to the hand.

The ulnar artery lies deeper than the radial artery. And, unlike the radial nerve and artery, the ulnar nerve lies just medial to the ulnar artery where it may be vulnerable to trauma during attempted arterial cannulation. Nevertheless, clinical reports have not demonstrated a risk of ulnar nerve trauma during ulnar artery cannulation.

Methodology            This prospective, randomized study included 100 adult ASA physical status I, II, and III patients who required an arterial line during general anesthesia. Prior to arterial line insertion, all patients underwent a modified Allen test and an inverse Allen test (ulnar artery). A positive (normal) result was defined as palmar blushing completed within 7 seconds. Pulses were rated as strong, weak, or absent. The number of attempts required to cannulate the artery was recorded. Patients were monitored for complications of arterial cannulation for 5 days.

Result            Ulnar artery cannulation was successful on the first attempt in 78% of patients (n=32). Radial artery cannulation was successful on the first attempt in 64% of patients (n=29) (P=not significant). The overall success rate was 82% for the ulnar artery and 90% for the radial artery (P=not significant). When the ulnar pulse was rated as “strong,” cannulating the ulnar artery was successful 100% of the time, compared to 91% of the time for a “strong” radial pulse. There was no apparent association between obesity and the success rate of ulnar or radial cannulation.

One patient had distal blanching following a radial arterial cannulation despite a normal Allen test. The arterial line was removed without sequela. No nerve injury was detected as a result of arterial line placement.

Conclusion            In patients with a strong ulnar pulse, ulnar artery cannulation has a high success rate and is safe. When the ulnar arterial pulse is strong it may be the first choice for arterial cannulation.



When an arterial line is needed, the radial artery is usually chosen by default because, among other factors, it is usually easier to cannulate. This study confirms the greater overall success rate when cannulating the radial artery (90% for radial, 82% for ulnar). But it also reminds us that the radial artery is not always easier to cannulate. And when the ulnar pulse was strong, the success rate for inserting an ulnar arterial line in this study was higher both overall and on the first attempt. This, taken with the fact that the radial circulation is dominant in the hand and that ulnar nerve injury has not been clearly identified as a complication of ulnar artery cannulation, suggests that perhaps we should consider the ulnar artery when its pulse is easily palpable.


Michael Fiedler, PhD, CRNA


Aponte H, Acosta S, Rigamonti D, Sylvia B, Austin P, Samolitis T

The use of ultrasound for placement of intravenous catheters

AANA J 2007;75:212-216

Aponte H, Acosta S, Rigamonti D, Sylvia B, Austin P, Samolitis T


Purpose            The purpose of this study was to compare the success rate and time required to place a peripheral IV catheter in adults with potentially difficult venous access using either the traditional method or ultrasound visualization.

Background            Peripheral intravenous (IV) insertion is a common procedure which can sometimes be quite challenging. Patients may lack visible or palpable peripheral veins of sufficient size for IV cannulation due to medical conditions, body fat, dehydration, or medications. Peripheral vasodilators (topical drugs, heat), higher tourniquet pressure, and ultrasound visualization have each been used to facilitate difficult IV placement.

When peripheral IV placement is difficult, central venous cannulation, a surgical cutdown, or an intraosseous infusion may be considered but each is associated with greater risk / discomfort than peripheral IV insertion. Ultrasound has been associated with a 91% success rate facilitating the cannulation of deep brachial or basilic veins when other attempts at peripheral IV cannulation had failed. Using ultrasound for central line insertion reportedly results in a lower incidence of arterial puncture and pneumothorax. The Agency for Healthcare Research and Quality has recommended ultrasound be used for all central line placements. While ultrasound has not gained widespread use for peripheral IV placement, its use is well documented for deep extremity veins and central veins.

Methodology            This prospective, randomized study included 35 adults who either reported a history of difficult IV insertion or were judged as potentially difficult by one of the two investigators. Patients had their IV started using either the traditional method or with visualization aided by a Site-Rite 3 (Bard Access Systems, Salt Lake City, UT) ultrasound device. Both investigators had been trained in the use of ultrasound for peripheral IV insertion and had successfully performed at least five cannulations using ultrasound. The IV catheters used ranged from 18 gauge to 22 gauge, as judged appropriate by the investigator. When initial attempts at IV placement failed, the insertion method (traditional or ultrasound) remained the same but an alternate site was chosen. The start time for the traditional method was when the nurse anesthetist began looking for a peripheral vein. The start time for the ultrasound method began when a vein was identified on the ultrasound monitor. Data collected included patient demographics, IV catheter size, time required for IV placement, and the number of attempts required for IV placement.

Result            Of 35 subjects enrolled, 16 were assigned to the traditional group and 19 to the ultrasound group. The mean time to successful IV placement was 172 seconds in the traditional group and 303 seconds in the ultrasound group. IVs were placed in one attempt in 81% of patients in the traditional group and 74% of ultrasound patients. The average number of attempts required was 1.3 and 1.4, respectively. None of the differences between groups was statistically significant.

Conclusion            Ultrasound aided peripheral IV placement was neither faster nor more successful than the traditional IV placement technique in this study. Nevertheless, when peripheral IV placement is difficult ultrasound may help reduce the discomfort and / or morbidity associated with alternate venous access strategies.


In this study, ultrasound aided peripheral IV placement offered no improvement in speed or success rate compared to a traditional IV placement technique. This may have been due partly, as the authors correctly pointed out, to the fact that the potential for difficult IV access was based on patient history or the CRNA’s assessment rather than an actual inability to place a peripheral IV.

Ultrasound has been around for many years, and now, all of a sudden, it is emerging in multiple areas of interest to anesthesia. Ultrasound is being used for peripheral IV insertion, central line insertion, and regional anesthesia. But is ultrasound useful often enough for peripheral IV placement to outweigh the additional expense, training, and time required? On the one hand, most anesthetists rarely encounter a patient in whom they can’t place some sort of peripheral IV. Ultrasound machines are quite costly and require additional training and experience to become facile in their use. On the other hand, if a clinically feasible non-invasive device can help prevent the discomfort and anxiety of multiple attempts at IV placement and the complication rate of a central line it may be worth having, even if used only occasionally. What we need to know, but don’t know right now, is 1) how often CRNAs simply can’t get a good peripheral IV and 2) how many of those times a peripheral IV could be started with ultrasound visualization.

Often, less interest is shown in negative results, and professional journals infrequently publish them. But this study is a first step. It tells us there is probably no advantage to ultrasound with regard to the success rate or time required for IV placement. Now we need to know whether or not ultrasound will allow us to place a peripheral IV when nothing else will.

Michael Fiedler, PhD, CRNA

Agency for Healthcare Research and Quality web site:


Joshi GP


Intraoperative fluid restriction improves outcome after major elective gastrointestinal surgery

Anesth Analg 2005;101:601-605

Joshi GP



Purpose            Intraoperative fluid management (IFM) is a controversial aspect of patient care with many different formulations and approaches practiced. Recent studies question the rather algorithmic approach taken to IFM and suggest that restricting or moderating crystalloid replacement may be associated with improved outcome.

Background            Since the publication of Shires’ 1963 proposal of “third space loss,” anesthesia providers have uncritically embraced a rather algorithmic approach to fluid management in open abdominal surgery, an approach that often results in aggressive, large volume fluid resuscitation. Excessive perioperative fluid therapy is associated with a large number of adverse consequences, including decrements in cardiovascular, pulmonary, renal, and coagulation function as well as higher rates of edema, infection, ileus, and impaired tissue oxygenation.

Methodology            The author performed a critical analysis of recent studies that compared “restricted” versus “liberal” IFM, examining patient outcome.

Result            Intraoperative crystalloid restriction appears to be associated with a reduction in postoperative morbidity and mortality compared to liberal, high volume replacement based on long used algorithms.

Conclusion            While debate regarding IFM continues, there are many reports and studies warning against excessive perioperative fluid infusion. In the case of gastrointestinal surgery, there is compelling evidence that crystalloid restriction (by avoiding preloading and decreasing replacement of “third space loss”) leads to improved patient outcome. Furthermore, the use of a balanced approach using both colloid and crystalloid solutions seems prudent.



I like papers that challenge “conventional wisdom” or those that become part of a paradigm shift. Consider our rather dogmatic, uncritical adherence to the avoidance of cuffed endotracheal tubes in young children, a view that has been challenged by clinicians and researchers alike. It is now well appreciated that modern low-pressure cuffed tubes can be used quite safely (and advantageously) in infants and children undergoing mechanical ventilation.(1) Likewise, we may be in the midst of just such a paradigm shift with respect to IFM.

Joshi describes a litany of IFM-related complications that all of us have experienced in our practices. Derangements in coagulation immediately come to mind. We now know from related work that antithrombin III is diluted by overly aggressive crystalloid therapy, with a resultant hypercoagulable state increasing the odds of a deep vein thrombosis. Pulmonary and cardiovascular complications can occur that lead to significant morbidity and mortality. Many of these and other complications (e.g., electrolyte imbalances, acidosis, abdominal compartment syndrome, poor wound healing, sepsis, and ileus, among others) occur as a direct result of overly zealous fluid replacement.

Reading this caused me to dig a bit deeper. In my search of the professional websites of the major anesthesia and surgical organizations, I found no expert consensus committee or published protocols for fluid administration. A look at a couple of other trials of IFM (2-3) revealed that things are indeed changing, with “goal directed” IFM being the trend. This approach guides IFM by careful attention to the calculated fluid deficit, heart rate, urine output, blood pressure, capillary refill, intravascular pressures, measured blood loss, and similar indices, and employs a “balanced” approach (crystalloids and colloids).

It is clear that our views and our attendant interventions regarding IFM are evolving. Although consensus is likely to remain elusive on the topic, it is now clear that objective endpoints for IFM should be employed rather than simple algorithms. It may be time for all of us to “turn down the volume” a bit.


Chuck Biddle, PhD, CRNA


1. Newth CL et al. The use of cuffed vs uncuffed endotracheal tubes in pediatric intensive care. J Pediatr. 2004;144:333-77.

2. Gan TJ et al. Goal-directed intraoperative fluid administration reduces length of hospital stay after major surgery. Anesthesiology. 2002;97:820-826.

3. Boldt J. New light on intravascular volume replacement regimens. Anesth Analg. 2003;97:1595-604.




American Society of PeriAnesthesia Nurses PONV/PDNV Strategic Work Team

ASPAN’s evidence-based clinical practice guideline for the prevention and/or management of PONV/PDNV

J Perianesth Nurs 2006;21:230-50

American Society of PeriAnesthesia Nurses PONV/PDNV Strategic Work Team


Purpose          Development of a comprehensive evidence-based guideline for the prevention and treatment of post-operative (PO) and post-discharge nausea and vomiting (PDNV) in adult surgical patients.

Background   Postoperative and/or postdischarge nausea or vomiting affects as many as 80% of surgical patients. Fear of nausea or vomiting is the most commonly expressed concern of elective surgery patients and is rated by patients as more debilitating than postoperative pain. Consequences of nausea and/or vomiting are diverse and include wound dehiscence, unanticipated hospitalization, and postponement of a patient’s ability to resume normal activities of daily living. Despite its clinical, social, and economic consequences, multidisciplinary consensus has not been achieved regarding an evidence-based, multifaceted approach toward attenuating the incidence and consequences of PO/PDNV.

Methodology  The American Society of PeriAnesthesia Nurses (ASPAN) convened a Strategic Work Team consisting of nurse anesthetists, anesthesiologists, perianesthesia nurses, a pharmacist, and an expert in practice guideline development. The team reviewed existing evidence related to the prediction, prevention, and treatment of PO/PDNV in adult surgical patients; developed evidence-based recommendations including both traditional and complementary therapeutics; and identified areas needing additional research.

Result Strong independent risk factors for the development of PO/PDNV were a history of motion sickness or prior PONV, female gender, non-smoking status, the intraoperative use of nitrous oxide or volatile anesthetics, and postoperative administration of narcotics. Weaker evidence supported age and duration of surgery as risk factors, and there was contradictory evidence regarding type of surgical procedure as an independent risk factor. Recommendations to prevent nausea and vomiting included the use of regional anesthesia, total intravenous anesthesia, non-steroidal anti-inflammatory agents or dexamethasone, 5-HT3 or H1 receptor antagonists, the scopolamine patch, and droperidol (with consideration of the FDA “Black Box” warning). Other therapeutic interventions included adequate hydration, multimodal pain management (including non-steroidal anti-inflammatory agents and regional analgesia), and P6 acupoint stimulation.

The work team recommended using a formal risk assessment instrument to guide treatment options. For patients at low risk for PONV, no prophylactic interventions are warranted. The addition of any one of the interventions listed above is recommended for patients at moderate risk. Two prophylactic interventions should be incorporated into the anesthesia plan for patients at severe risk, and three or more interventions are suggested for patients at very severe risk. The panel also recommended increasing a patient’s risk designation by one level in circumstances where vomiting would increase the risk of surgical morbidity, such as some intracranial, maxillomandibular or plastic procedures.

Postoperatively, routine assessment for nausea and vomiting was recommended, with the use of a verbal descriptor or visual analogue scale to quantify nausea, if present. Rescue treatments included verification of adequate hydration, administration of 5-HT3 or H1 receptor antagonists, droperidol (with consideration of the FDA “Black Box” warning), metoclopramide, prochlorperazine, low-dose promethazine, and NK1 antagonists (based on preliminary evidence). Aromatherapy should be considered.

The paucity of research regarding PDNV precluded the panel from developing evidence-based guidelines; however, the work team suggested that clinicians should: a) consider the administration of dexamethasone for high-risk patients (if not administered previously), scopolamine patch, and P6 acupoint stimulation; b) include instruction on managing nausea and vomiting when educating outpatients prior to discharge; c) assess patients for PDNV during follow-up contact; and d) consider ondansetron dissolving tablets, promethazine (tablets or suppository), or scopolamine patch as potential rescue treatments.

Conclusion     In addition to their practice recommendations, the work team identified areas for additional research including the effects of prolonged fasting and oxygen administration, the use of complementary modalities to attenuate PONV, evaluation of instruments used to measure PONV, determination of PDNV predictors, and effective treatments for PDNV.


Kudos to ASPAN!  They recognized nausea and vomiting as the most prevalent clinical malady experienced by their patients and addressed the issue head-on by convening an expert panel to develop evidence-based practice guidelines. The work team conducted a systematic, unbiased review of published research, carefully considered the findings of each investigation, and developed a comprehensive set of recommendations without being prescriptive (pardon the pun). From pre-op to post-discharge, the work team provides us with treatment options for each phase of a patient’s journey, bolstered not only by the primary references for the studies considered to reach their conclusions, but also by a standardized metric for ranking the strength of their recommendations. Although the manuscript is long (19 pages), I would encourage all practitioners to take a few minutes to read the original article, which can be accessed on-line without fee. The treatment algorithms can be a handy addition to each clinician’s “notebook” or posted in the post-anesthesia care unit for quick reference. The article is also particularly helpful for individuals interested in learning more about the taxonomies used to rank clinical evidence and recommendations, or for those searching for research ideas related to nausea and vomiting.

It is worth mentioning that a consensus panel convened by the Society of Ambulatory Anesthesia (SAMBA) reached conclusions similar to ASPAN’s work team, strengthening the validity of the recommendations of both panels. Although I do not believe the final version of SAMBA’s recommendations has been distributed, based on an abstract presented at the 2006 annual meeting of the American Society of Anesthesiologists, the only discrepant findings appear to be the SAMBA panel’s conclusions that nausea and vomiting are not affected by oxygen therapy and that evidence regarding patient hydration is inconclusive.1


Alfred E. Lupien, Ph.D., CRNA


1. Gan TJ, Meyer TA, PONV Consensus Panel Members. Revised PONV consensus panel guidelines. Anesthesiology. 2006;105:A566. Available at: Accessed June 15, 2007.

Editor’s Note: Dr. Lupien is an unpaid member of the Editorial Advisory Board for the Journal of Perianesthesia Nursing, where these guidelines were published.

© Copyright 2007 Anesthesia Abstracts · Volume 1 Number 3, June 30, 2007

Obstetric Anesthesia

Palo R, Ahonen J, Salo H, Salmenperä M, Krusius T, Mäki T


Transfusion of red blood cells: no impact on length of hospital stay in moderately anaemic parturients

Acta Anaesthesiol Scand 2007;51:565-569

Palo R, Ahonen J, Salo H, Salmenperä M, Krusius T, Mäki T


Purpose            The purpose of this study was to evaluate the impact of red blood cell (RBC) transfusion on the length of hospital stay in women with three different levels of anemia following spontaneous vaginal delivery.

Background            Spontaneous vaginal delivery is normally associated with less than 500 mL blood loss. This blood loss usually results in only a small change in hematocrit due to the increased blood volume during pregnancy. Within a week postpartum the hematocrit is usually higher than it was during pregnancy. Transfusion decisions can be difficult in this population, involving significant clinical judgment. RBCs are transfused to provide adequate oxygen carrying capacity. Many postpartum transfusions may be unnecessary, perhaps as many as 30% in the obstetric population. Unneeded transfusions provide little if any benefit while exposing women to adverse effects such as alloimmunization and infection.

Methodology            This retrospective, observational study included 1954 women who had a hemoglobin (Hb) of 10 gm/dL or less following spontaneous vaginal delivery in selected Finnish hospitals during 2002 and 2003. Parturients who underwent surgery for delivery or complications of delivery were excluded. Women with pre-eclampsia, eclampsia, acute infection connected with the delivery, or significant cardiovascular or pulmonary disease were also excluded.

Women were divided into three groups by their lowest postpartum Hb value: 7-7.9, 8-8.9, and 9-10 gm/dL. The duration of hospital stay for those in each group who received 1-2 units of RBCs was compared to those who were not transfused.

Result            The lowest Hb value was measured most commonly on the third postpartum day. One or two units of RBCs were transfused to 259 women (13.3%), more commonly in those with lower Hb values. The mean duration of hospital stay for all study patients was 5.2 days. (The average hospital stay in Finland for childbirth during the study period was 3.5 days.) The length of hospital stay did not differ between women who were transfused and those who were not in any of the three groups. Women with a lower postpartum Hb value were more likely to have had an assisted vaginal delivery. In women over age 30 who had a transfusion, the length of hospital stay correlated with age.

Conclusion            Transfusion of 1-2 units of RBCs did not correlate with length of hospital stay in postpartum women. This suggests that some or all of the transfusions were unnecessary. The authors suggest that developing transfusion guidelines might improve the quality of care.



A study that truly answered the question, “does transfusion influence the length of hospital stay?” would require randomizing women to a transfusion group and a no-transfusion group without considering the impact of that randomization on their health. This would be a clear violation of western medical ethics, so the authors did the best they could: they looked, after the fact, at women who did or did not have a transfusion. Based on their analysis, they were able to suggest that some of the transfusions may not have been needed. While this may not be as strong a statement as clinicians would hope for, if their goal was to push for a critical evaluation of postpartum transfusion practices, they may have started the ball rolling.

We should not assume from this study that transfusing women who have experienced a drop in Hb after a spontaneous vaginal delivery is unnecessary. There are many other possible reasons why no difference in the length of hospital stay was seen between those who were transfused and those who were not. It may have been, for example, that women who needed RBCs were transfused and their hospital stay would have been even longer had they not been transfused. Since all the women in this study remained in the hospital longer than average, some other factor may have influenced the length of hospital stay for all the women, obscuring any effect transfusion had on the length of stay. In the end, this study should make us think critically about transfusion practices, but it is insufficient evidence upon which to base practice.


Michael Fiedler, PhD, CRNA

© Copyright 2007 Anesthesia Abstracts · Volume 1 Number 3, June 30, 2007

Sullivan SA, Smith T, Chang E, et al.



Administration of cefazolin prior to skin incision is superior to cefazolin at cord clamping in preventing postcesarean infectious morbidity:  a randomized controlled trial

Am J Obstet Gynecol 2007;196:455.e1-455.e5

Sullivan SA, Smith T, Chang E, et al.




Purpose          Cesarean section is one of the most commonly performed surgical procedures in the United States, performed at a rate exceeding one million cases per year for several years running. For what may be considered obvious reasons, it is the only surgical procedure in which antibiotic prophylaxis is intentionally delayed from the standard and customary ‘before-skin-incision’ timing until after umbilical cord clamping. Infection rates after cesarean delivery, estimated at 17%, fall within the range typical of other surgical procedures. The most common infections observed after cesarean section are endomyometritis and localized wound infections. Obstetricians share significant concerns about before-skin-incision antibiotic administration, most notably the unknown effects on the neonate and the potential interference with a neonatal sepsis workup. The purpose of this research was to compare post-cesarean infection rates when a first generation cephalosporin was given prior to skin incision versus after umbilical cord clamping. Additionally, the effect of maternal antibiotic timing on any neonatal sepsis workup would be addressed.

Background   Surgical site infections are the second most common cause of nosocomial infections overall, and for those undergoing cesarean delivery the incidence ranges from 7-20% depending on patient- and environment-specific variables. The first few hours after bacterial contamination are the critical window for the establishment of infection; animal studies demonstrate that the greatest protection against infection is achieved when appropriate tissue levels are established prior to any bacterial contamination. The suggested timing for first generation cephalosporins, in order to obtain these tissue levels, is 15-60 minutes prior to skin incision. Administering the prophylactic antibiotic after cord clamping appears to be the safest for the neonate; however, scientific data demonstrating this are severely limited. And while presumed “safest” for the neonate, delayed administration has been associated with an increased maternal risk of infection at the surgical site. Previous research demonstrated that an adequately powered study was needed to address the issue and obtain data to support a change in practice.

Methodology  The study was conducted as a true experiment: a randomized, double-blinded, placebo-controlled trial. Power analysis demonstrated that a sample of 174 subjects in two groups was needed to detect a 50% decrease in the historical incidence of post-cesarean infection. Patients in the study group received cefazolin 15-60 minutes prior to skin incision, and patients in the control group received cefazolin after umbilical cord clamping. All subjects were followed through a 6 week postpartum period, and the incidence of infectious morbidity was documented using clearly established clinical criteria. For the diagnosis of endomyometritis to be confirmed, patients had to exhibit a fever greater than 100.4°F on 2 separate occasions, accompanied by uterine fundal tenderness, tachycardia, or an elevated WBC count. A post-operative skin infection had to demonstrate purulent discharge, erythema, and induration of the incision site. Pyelonephritis was diagnosed as usual: maternal fever, flank pain, and a positive urine culture of a gram-negative uropathogen. Neonatal sepsis was confirmed via positive blood culture. The offending organism, any antibiotic-resistant strains, and the neonate’s clinical course were recorded. A neonatologist blinded to group assignment determined infant length of stay and admission status, as well as whether a sepsis workup was necessary.

Result Analysis of the demographic data did not show any differences between the two groups, attributable to the randomization scheme. Statistically significant differences were seen between the two groups (determined via relative risk with a 95% confidence interval) in the incidence of endomyometritis: control group n = 10 versus experimental group n = 2. Additionally, statistically significant differences were seen in total infectious morbidity: control group n = 21 versus experimental group n = 8. Significant differences were not observed in maternal wound infection rates, neonatal sepsis, total neonatal length of stay, or sepsis workups. There were, however, significantly fewer neonatal intensive care unit admissions in the study group than in the control group.

Conclusion     Post-cesarean infections are distressing to patients, can delay maternal-infant bonding, are costly, and can lead to life-threatening conditions. The incidence of these post-operative infections is slightly higher than in comparable procedures, with a possible explanation being the delay in antibiotic timing. This study demonstrated that establishing antibiotic tissue levels prior to bacterial contamination decreased the incidence of some post-operative infections compared to antibiotic administration at cord clamping. Additionally, this study did not demonstrate evidence of neonatal harm from pre-incision antibiotic timing.



We are entrenched in an era in surgery and anesthesia in which post-operative wound infections, even at low rates, are not tolerated. Clinical trials have repeatedly demonstrated that appropriate antibiotics, given with specific timing, reduce the incidence of post-operative infections. In other words, post-operative infections can easily be prevented, and now, in order to maximize reimbursement, prevention is non-negotiable. Compliance, however, still varies across practice environments. This is a timely research study, grounded in science, yet conducted on a distinctly high-risk sample of the population. The authors were forthcoming about the study’s weaknesses and limitations: obesity, diabetes, pre-term delivery, multiple gestation, and surgical residents performing the procedures (which increased surgical time) all contribute to the incidence of post-operative infections. Even so, administering preventative antibiotics prior to skin incision led to a decreased incidence of post-operative infections. My only remaining concern is the data supporting what is termed a lack of evidence for neonatal harm, and the fact that prospective trials confirming it are limited. Ethical considerations oftentimes prohibit human clinical trials when potential harm to the fetus can ensue. How do we weigh the risk-benefit ratio of potential neonatal harm against potential maternal morbidity and mortality from post-operative infections?  More appropriately conducted animal studies are needed to confirm that neonatal harm from placental transfer of antibiotics is minimal.


Mary A. Golinski, PhD, CRNA

© Copyright 2007 Anesthesia Abstracts · Volume 1 Number 3, June 30, 2007


Reuben SS, Ekman EF, Charron D

Evaluating the analgesic efficacy of administering celecoxib as a component of multimodal analgesia for outpatient anterior cruciate ligament reconstruction surgery

Anesth Analg 2007;105:222-227

Reuben SS, Ekman EF, Charron D


Purpose            The purpose of this study was to assess the effectiveness of celecoxib as a component of a multimodal postoperative pain management strategy for patients undergoing outpatient arthroscopic anterior cruciate ligament (ACL) repair.

Background            Patients who have undergone outpatient ACL repair often experience significant pain. Pain is associated with delayed discharge, delayed rehabilitation, poorer outcome, and increased resource consumption. Beginning analgesic treatment preoperatively may control pain more effectively than responding to reports of pain postoperatively. The most effective pain relief with the fewest side effects involves multiple modes of analgesia. Multimodal analgesia described for ACL reconstruction includes nonsteroidal anti-inflammatory drugs (NSAIDs), intraarticular local anesthetics, ketamine, opioids, regional anesthesia, and local application of cooling.

Preoperative administration of nonspecific NSAIDs contributes to postoperative analgesia but their use is controversial due to the possibility of platelet mediated increases in bleeding. The investigators observed increased bleeding associated with ketorolac and ibuprofen in ACL reconstruction patients. The cyclooxygenase-2 (COX-2) specific NSAID celecoxib has no effect on platelet aggregation or bleeding time.

While NSAIDs have been linked to cardiovascular morbidity under some circumstances (high doses, long periods of administration, cardiac surgery patients) these complications have not been observed in orthopedic patients taking therapeutic doses for two weeks or less.

Methodology            This prospective, randomized, double-blind, placebo-controlled study included 200 patients scheduled for outpatient ACL repair. Patients were divided into two groups. The celecoxib group received celecoxib 400 mg 1-2 hours preoperatively while the control group received a placebo identical in appearance.

All patients received acetaminophen 1000 mg 1-2 hours preoperatively. General anesthesia was induced with propofol 2 mg/kg, fentanyl 2 µg/kg, and ketamine 30 mg. Anesthesia was maintained with nitrous oxide in oxygen and 1% to 2% sevoflurane. All patients received ondansetron 4 mg for PONV prophylaxis. Before incision, 20 mL of 0.25% bupivacaine with 50 µg clonidine was injected into the knee of all patients. Likewise, patients received 20 mL of intraarticular 0.25% bupivacaine with 50 µg clonidine, and 5 mg morphine, before emergence from anesthesia. A cooling pad was applied to the knee before patients were moved to the PACU.

After discharge, all patients took acetaminophen 1000 mg every six hours for 14 days. Along with the acetaminophen, patients took either celecoxib 200 mg or placebo. Oxycodone was used as a rescue analgesic.

All patients participated in an accelerated rehabilitation program that emphasized full weight bearing and knee extension on the first postoperative day and a return to normal activities within six months.

Result            In the PACU, celecoxib patients reported less pain (P<0.01), needed less opioid analgesia (P<0.001), and had less nausea and vomiting (P<0.05). Fentanyl was the first-line rescue analgesic in the PACU; 78% of celecoxib patients received no fentanyl compared to 36% of control patients. Celecoxib patients were discharged home after an average of 119 (±19) minutes compared to 166 (±32) minutes for control patients (P<0.05). At home, celecoxib patients reported less pain at rest (P<0.05) and with movement (P<0.01) and took less oxycodone for rescue analgesia (P<0.01).

Conclusion            Including celecoxib in a multimodal approach to the management of post-ACL repair pain resulted in less pain at rest and with movement, less need for rescue analgesia, less postoperative nausea and vomiting, and faster discharge home.



Pain is a complex phenomenon made up of many parts. While no one would suggest that currently available NSAIDs are sufficient by themselves to eliminate pain after an orthopedic procedure, it should be clear by now that NSAIDs have a place as part of a multimodal treatment of pain. This study shows how adding an NSAID as another little piece of a multimodal approach to pain relief yields important benefits that go beyond analgesia. Adding celecoxib resulted in less PONV, faster discharge, and less limitation of activity by pain, and, perhaps, fewer complications resulting in admission or readmission (these differences were not statistically significant).

Reducing pain not only makes patients feel better; it can also improve surgical outcomes. This is yet another study to show that reducing pain facilitates orthopedic rehabilitation (greater activity, sooner, with less discomfort).


Michael Fiedler, PhD, CRNA

© Copyright 2007 Anesthesia Abstracts · Volume 1 Number 3, June 30, 2007


Wallenborn J, Rudolph C, Gelbrich G, Goerlich TM, Helm J, Olthoff D


The impact of isoflurane, desflurane, or sevoflurane on the frequency and severity of postoperative nausea and vomiting after lumbar disc surgery

J Clin Anesth 2007;19:180-185

Wallenborn J, Rudolph C, Gelbrich G, Goerlich TM, Helm J, Olthoff D



Purpose            The purpose of this study was to compare the incidence and severity of postoperative nausea and vomiting (PONV) over the 24 hour period following an isoflurane, desflurane, or sevoflurane general anesthetic.

Background            General anesthesia maintained with propofol results in a lower incidence of PONV than anesthesia with a potent inhalation agent. Sevoflurane is reportedly associated with a lower incidence of post discharge nausea and vomiting than isoflurane. A comparison of sevoflurane and desflurane showed no difference in the incidence of PONV. Potent inhalation agents have been shown to produce their greatest emetic effects during the first two hours after emergence from general anesthesia. The incidence and severity of PONV is associated with female gender, nonsmoking status, a history of PONV, the type of surgery, and postoperative opioid use. While opioid use has been widely reported to be a PONV risk factor, most studies have not examined the temporal relationship between opioid administration and PONV. Thus, it is possible that PONV has been attributed to opioid use in patients who received opioids only after having experienced PONV.

Methodology            This prospective, sequential, observational study included 625 ASA physical status I, II, and III patients undergoing lumbar disc surgery over a three year period. General anesthesia techniques and study personnel did not change over the study period. The inhalation agent used was standardized for each year of the study. Initially all anesthetics were performed with 0.7% - 1.2% isoflurane, then 3.5% - 5.5% desflurane, and lastly 1.2% - 1.9% sevoflurane. Each patient was premedicated with 0.1 mg/kg midazolam orally. Fentanyl was used during induction and maintenance of anesthesia. Anesthesia was induced with either 4 to 5 mg/kg thiopental sodium or 0.2 to 0.3 mg/kg etomidate. Rocuronium 0.6 mg/kg was used as the muscle relaxant. The fresh gas flow included 30% to 40% oxygen in air. Thirty minutes before the end of surgery patients received 10 mg metoclopramide and 8 mg dexamethasone. Gastric tubes were not used. Postoperatively, patients were observed continuously for two hours and then assessed at hourly intervals. All patients were interviewed at 24 hours after emergence from general anesthesia.

In addition to analysis of the incidence and severity of PONV, a logistic regression was performed to identify PONV risk factors.

Result            There was no difference in the demographic data, predicted PONV rate, or known PONV risk factors between the isoflurane, desflurane, or sevoflurane groups. There was no statistically significant difference in the incidence or severity of nausea or vomiting or the need for rescue antiemetics between the three groups. On the whole, PONV occurred most frequently in the early postoperative period.

About 9% of desflurane and sevoflurane patients and 5% of isoflurane patients experienced PONV for the first time within 6 hours after anesthesia. After 6 hours, the rate at which desflurane and sevoflurane patients experienced their first episode of PONV decreased as time passed. Almost no desflurane or sevoflurane patients experienced PONV for the first time more than 12 hours postoperatively. Patients who received isoflurane were different. The isoflurane group experienced a peak in the first episode of PONV within 6 hours after anesthesia and a second peak in first time PONV between 12 and 24 hours postoperatively. Approximately 4% of isoflurane patients experienced PONV for the first time more than 12 hours after anesthesia.

As demonstrated in other studies, female gender, a history of PONV or motion sickness, and not smoking cigarettes were all shown to be risk factors for PONV. Unlike previous studies, the duration of general anesthesia in excess of the duration of surgery was also shown to be a risk factor for PONV.

Conclusion            There was no difference in the incidence or severity of PONV following an isoflurane, desflurane, or sevoflurane anesthetic. The secondary peak in PONV >12 hours following an isoflurane anesthetic may require additional prophylactic measures.



PONV is currently one of the most important quality improvement problems in anesthesia. This study adds some important pieces to our understanding of PONV. I also appreciate the study’s focus on PONV over a full 24 hours postoperatively. Too often, we tend to focus our preventative efforts on the first 6 hours (the duration of action of serotonin antagonists like ondansetron and granisetron).

This is the first controlled study I’ve seen that directly compared the rate of PONV between isoflurane, desflurane, and sevoflurane. It was instructive for me to see that the rate of PONV was virtually identical for desflurane and sevoflurane. It was even more interesting to see that unlike desflurane and sevoflurane patients, a number of isoflurane patients experienced PONV for the first time more than 12 hours postoperatively.

Contrary to a number of previous studies, these investigators found absolutely no association between opioid analgesia and PONV. They rightly point out that previous studies did not examine the temporal relationship between postoperative opioids and PONV, and they asked a vital question: were the opioids given before or after patients experienced PONV? This is an important question, for while opioid use and PONV may be associated with each other, if the PONV occurred before the opioids were given, the opioids certainly didn’t cause the PONV.

There is one aspect of this study that deserves serious criticism. The investigators’ claim of discovering a new PONV risk factor lacks credibility. They demonstrated a strong association between the duration of anesthesia in excess of the duration of surgery and the risk of PONV. Their study was not, however, designed to discern PONV risk factors. The study did not control for a number of known PONV risk factors that could account for their finding. Etomidate increases the risk of PONV; some patients received etomidate for induction of anesthesia, but we don’t know which ones. Neostigmine in doses greater than 2.5 mg increases the risk of PONV, but we don’t know which patients received neostigmine for antagonism of neuromuscular block or what dose they received. Hypotension between induction of anesthesia and the start of surgery increases the risk of PONV, but we don’t know which patients were hypotensive or for how long. While the investigators raise an important question here, it is premature to consider the duration of anesthesia in excess of the duration of surgery an independent risk factor for PONV.


Michael Fiedler, PhD, CRNA




© Copyright 2007 Anesthesia Abstracts · Volume 1 Number 3, June 30, 2007

Molina AL, de Boer HD, Klimek M, Heeringa M, Klein J

Reversal of rocuronium-induced (1.2 mg/kg) profound neuromuscular block by accidental high dose of sugammadex (40 mg/kg)

Br J Anaesth 2007;98:624-627

Molina AL, de Boer HD, Klimek M, Heeringa M, Klein J


Purpose            The purpose of this report was to describe a case in which a patient enrolled in a sugammadex dose finding study accidentally received a dose of sugammadex 10 times greater than called for by the study protocol and 2.5 times greater than the largest dose previously used in a human trial.

Background            Sugammadex was designed to selectively bind, and form a stable complex with, the nondepolarizing neuromuscular blocking drug rocuronium. It is currently in clinical trials to investigate the safety, efficacy, and optimum dose for reversal of rocuronium neuromuscular blockade. Human studies have used doses up to 16 mg/kg. Sugammadex binds and encapsulates rocuronium. Once encapsulated, the rocuronium is unavailable to bind nicotinic receptors on skeletal muscle. Reversal of neuromuscular block occurs in 1-3 minutes.

Current practice using anticholinesterase drugs to antagonize neuromuscular block may result in bradycardia, bronchoconstriction, excessive salivation, abdominal cramping, nausea, and vomiting. To date, studies of sugammadex have shown rapid and effective reversal of neuromuscular block without side effects sometimes seen with the use of anticholinesterase drugs.

Methodology            A 36-year-old, 94 kg, ASA class I male scheduled for nasal septum surgery agreed to participate in a sugammadex dose finding study. He was premedicated with midazolam 7.5 mg and acetaminophen 1000 mg by mouth. General anesthesia was induced with propofol 180 mg. Anesthesia was maintained with an infusion of propofol at 8 to 12 mg/kg/hour and a total of 0.75 µg/kg sufentanil. Neuromuscular function was monitored with an acceleromyograph. After administration of rocuronium 1.2 mg/kg (112 mg) the patient was intubated and ventilated with oxygen and air. Five minutes after administration of the rocuronium, the patient inadvertently received 40 mg/kg of sugammadex (4 mg/kg was the intended dose). Clinically complete recovery of neuromuscular block was achieved in 79 seconds (1 minute 19 seconds). Following the completion of the surgical procedure (total anesthesia time 150 minutes) the patient was monitored in recovery for two hours. Thereafter, the patient was evaluated by an independent safety assessor over a seven day period.

Result            No adverse events were observed in this patient. No clinically relevant changes were seen in hematology, biochemistry, renal function, liver function, vital signs, or ECG. There were no signs of residual neuromuscular block or recurarization.

Conclusion            Administration of 40 mg/kg of sugammadex did not speed the recovery of neuromuscular function compared to the intended dose. Neither did it result in adverse events or harmful side effects in this single healthy patient.



Sugammadex is being developed to reverse nondepolarizing neuromuscular blockade by a new mechanism; it binds rocuronium making it unavailable to occupy receptors in the neuromuscular junction. In effect, sugammadex makes the body act as if there is no rocuronium present to cause paralysis. Early indications sound almost too good to be true. Sugammadex works fast and it works despite profound neuromuscular block that would be impossible to antagonize with an anticholinesterase.

It is important to learn from our mistakes, but in anesthesia we seldom have an opportunity to learn so much from a drug mistake that results in absolutely no patient harm. This mistake was incredibly good fortune. So what did we learn? We learned that, in this one healthy patient, a gross overdose still didn’t result in undesirable effects. This suggests that the drug may have a very wide margin of safety. We also learned that giving a really big dose doesn’t reverse neuromuscular block any faster than the recommended dose.

Widespread clinical use can reveal problems with a drug that didn’t show up during development. (Remember rapacuronium?) Right now we can only hope that sugammadex will be as good as it looks. We can also hope that drugs similar to sugammadex can be developed for other muscle relaxants.


Michael Fiedler, PhD, CRNA

© Copyright 2007 Anesthesia Abstracts · Volume 1 Number 3, June 30, 2007

Fukada T, Ozaki M



Microbial growth in propofol formulations with disodium edetate and the influence of venous access system dead space

Anaesthesia 2007;62:575-580





Purpose            The purpose of this study was to assess bacterial growth in the dead space of intravenous (IV) tubing after injection of propofol with and without preservative.

Background            Contamination of IV tubing has been associated with catheter-related infections. Propofol has been shown to support the growth of bacteria in vitro. Soon after its introduction into clinical practice, propofol-related infections were reported. Up to 6% of propofol syringes used in the operating room and intensive care unit have been reported to be contaminated. All propofol formulations sold in the USA contain an antimicrobial agent to retard the growth of pathogens.

Methodology            This laboratory study compared the growth of methicillin-resistant Staphylococcus aureus (MRSA) in the dead space of IV tubing with the growth of bacteria in a section of IV tubing without dead space. The dead space was at the injection port of two different types of 3-way stopcocks. Each section of IV tubing was inoculated with a known concentration of MRSA suspended in a) propofol with the preservative EDTA, b) propofol without preservative, or c) 0.9% saline. After inoculation, Ringer’s solution was allowed to flow through the IV tubing at 50 mL/hour for 24 hours. At 1, 6, 12, and 24 hours two sections of tubing were removed from the system and cultured.

A related experiment measured the growth of six different types of bacteria incubated at room temperature in a) propofol with EDTA, b) propofol without EDTA, or c) 0.9% saline.

Result            No MRSA was recovered from the section of IV tubing without dead space after 12 hours. Propofol with and without EDTA remained in the IV tubing dead space despite 24 hours of Ringer’s solution flowing through the tubing. Furthermore, the propofol was grossly contaminated with Staphylococcus.

The growth of most bacteria incubated in propofol with EDTA for 48 hours was suppressed. Pseudomonas aeruginosa and Serratia marcescens grew even in propofol with EDTA, though more slowly than in propofol without EDTA.

Conclusion            Methicillin-resistant Staphylococcus aureus remained in the dead space of IV tubing and continued to grow despite the inclusion of EDTA in the propofol. The addition of EDTA to propofol does not prevent the growth of all types of bacteria.



This is useful information and it will make me even more careful about how I handle and use propofol. We all understand that unused propofol should be discarded six hours after we draw it up into a syringe. But I’ve never given any thought to the propofol I injected into the IV line. After injecting propofol I have often noticed some residual propofol in the dead space of the IV line that wasn’t washed downstream by the flow of IV fluid through the tubing. I didn’t give it much thought; after all, the propofol contained an antimicrobial! Now I’ll examine the IV tubing dead space for residual propofol and try to minimize the propofol that remains there. This may be even more important if the tubing is connected to a central line.

This study looked at the dead space in a stopcock, and I don’t use stopcocks very much. I was curious to see how much propofol remained in the dead space of the IV sets where I work. I was careful to insert the needle into the IV set just far enough to be able to inject the propofol, so that the maximum amount of propofol would fill the dead space. Despite this, very little propofol remained in the IV tubing I use after just a minute or two of IV fluid flowing through the set.


EDTA is not the only preservative used in propofol. Benzyl alcohol, benzyl alcohol with sodium benzoate, and sodium metabisulfite are each used in generic propofol preparations available in the USA. This study may not apply to propofol preparations with these other preservatives. It would be interesting to know how other preservatives affect bacterial growth. Including this information when deciding whose propofol to buy might help reduce the rate of infections associated with propofol administration even further.


Michael Fiedler, PhD, CRNA


Many pediatric oncology patients have totally implantable venous access systems (portacaths) inserted to allow frequent vascular access while minimizing discomfort and the potential for infection. The reservoir, with an internal volume of 0.3 to 0.5 mL, can harbor bacteria-supporting propofol just as the injection port did in this study. The greater concern is that fluid retained in the reservoir is not visible the way it is in the injection port of intravenous tubing. With their compromised immune status, these patients are susceptible to the acquired infections that this study associates with residual propofol in the line. At a pediatric clinical facility, I chose to avoid that possibility by utilizing a different agent, such as thiopental, or a different induction technique. Given the results of this study, my inclination is to avoid the use of propofol in immune-compromised patients or to make an extraordinary effort to evacuate any residual propofol from the dead space of tubing or lines.


Terri M. Cahoon, MSN, CRNA






Quality Improvement

Roth S, Tung A, Ksiazek S


Visual loss in a prone-positioned spine surgery patient with the head on a foam headrest and goggles covering the eyes: an old complication with a new mechanism

Anesth Analg 2007;104:1185-1187




Purpose            The purpose of this report was to describe a case of postoperative blindness associated with eye goggles, a foam headrest, and the prone position in a patient who underwent spine surgery.

Background            The most common perioperative eye injury is corneal abrasion; the most serious is blindness. Blindness is a reported complication following surgery in the prone position. Most reports of central retinal artery occlusion are unilateral and are generally attributed to mechanical pressure applied to the eye during surgery. Some advocate protective goggles to prevent eye injury.

The FDA’s MedWatch database (voluntary reporting since the 1990s) contains several reports of eye injuries that have occurred under circumstances similar to this one. These injuries ranged in severity from a keloid scar on the nose to neurapraxia of the supraorbital nerve. The Opti-Guard was used in three of the cases reported to the MedWatch database.

Methodology            This case report involved a 53 year old, 80 kg man who underwent a lumbar interbody fusion in the prone position. His history included diet-controlled diabetes, hypertension, and previous uneventful anesthesia. His preoperative BP was 110/70, HR 72, blood glucose 80-110 by Accu-Chek, and Hct 45%. He took his usual dose of metoprolol the morning of surgery. His vision was normal preoperatively.

After induction of general anesthesia the eyes were taped shut and a Dupaco Opti-Guard Eye Protector (Dupaco, Oceanside, CA) applied. Next, the patient was turned prone onto an OSI Gentle-touch foam headrest (Orthopedic Systems, Union City, CA). During the case, the minimum arterial BP was 98/58. Blood loss was estimated at 600 mL and was replaced with 2500 mL of crystalloid and 225 mL of salvaged blood. Hematocrit the next morning was 35%.

During the case, the eyes were reportedly checked by palpation every 15 minutes. The Opti-Guard was still properly positioned at the end of the case.

Result            Six hours postoperatively, the patient was blind in his left eye. The left pupil was not reactive to light, there was a large corneal abrasion, edema was present around the eye, the retina was pale, and there was an abrasion on the eyelid. A diagnosis of left central retinal artery occlusion was made. The right eye was unaffected. This presentation supports a compression injury of the eye.

Conclusion            Compression injury of the eye, and subsequent blindness, can occur despite application of Opti-Guard goggles. The Opti-Guard provides no added safety advantage for patients in the prone position.



Because we don’t know everything and can’t foresee every eventuality, significant complications can still occur even when we go out of our way to protect our patients. I’m sure this anesthesia provider was trying to be “doubly safe” by using both a foam headrest with cutouts for the eyes and goggles. Complications can be difficult and personally embarrassing for healthcare providers to talk about. But precisely because this type of complication is both unexpected and uncommon, it is important to share the account so all can learn from it. I applaud the authors for doing so.

The authors attribute the injury to mechanical compression of the eye. Their physical findings strongly support this mechanism of injury. Exactly what caused that compression is not known. The authors speculate that the injury may have been caused by, “… compression of the eye by the plastic lens of the Opti-Guard … or loosening of the glue attaching the flange of the Opti-Guard to the face, resulting in displacement of the lens into the eye when the anesthesia provider blindly palpated the edge of the protector.” This is too speculative for comfort. The authors report that the goggles were properly positioned when the patient was turned supine at the end of the case. They do not report any deformity of the goggles at the end of the case. If the lens of the goggles had been pushed into the eye it seems as though they would show some signs of damage.

Something got to the eye. Since the goggles were in place, whatever it was either went through the goggles or around them. Whatever caused the compression would most likely have caused it whether or not Opti-Guard goggles had been present. The goggles may even have reduced the damage (admittedly, an equally speculative conclusion).

The authors go on to assert that the Opti-Guard “…inadvertently became a hazard to the patient…” and to associate the blindness with the combination of the Opti-Guard and a foam headrest. This may be the case and we should be alert to the possibility, but it is far from a certainty. The authors present evidence only of an association, not cause and effect.

Two helpful findings seem fairly certain: 1) compression injury can cause blindness during prone cases despite multiple interventions intended to prevent it, and 2) palpation to verify that nothing is pressing on the eyes did not prevent blindness in this case. In the past, I’ve usually felt the patient’s face to make sure nothing was pressing on the eyes. This case report is the strongest argument I’ve seen yet that visual inspection of the eyes is necessary when patients are in the prone position.


Michael Fiedler, PhD, CRNA




© Copyright 2007 Anesthesia Abstracts · Volume 1 Number 3, June 30, 2007