62.10 Donor Plasma Effects On Platelet Function

A. G. Grand1, J. C. Cardenas1, L. Baer1, N. Matijevic1, B. A. Cotton1, J. B. Holcomb1, C. E. Wade1  1University Of Texas Health Science Center At Houston, Department Of Surgery, Houston, TX, USA

Introduction: We and others have demonstrated a significant decrease in platelet count and function within 2 to 3 hours of admission in severely injured trauma patients following transfusion. The decrease in platelet function is in part due to the reduction in platelet count; however, the decrease in function does not always parallel the decrease in count observed in trauma patients. We investigated whether transfusion of plasma is another factor contributing to platelet hypofunction following trauma.

Methods: Whole blood samples were taken from healthy volunteers and baseline platelet function was assessed by impedance aggregometry in response to ADP, collagen, thrombin receptor-activating peptide (TRAP), arachidonic acid (AA), and ristocetin using a Multiplate analyzer. Blood samples were then diluted by 30% using the volunteer’s autologous plasma, autologous plasma that was snap-frozen and thawed (autologous FFP), or donor fresh frozen plasma (FFP) from Gulf Coast Regional Blood Center. The percent change in platelet function compared to whole blood or autologous FFP was calculated. FFP from five different donors was used to obtain the average change in function. A Student’s t-test with significance set at p<0.05 was used to determine whether dilution, freezing, and the use of donor FFP had an effect on platelet function.
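As a rough illustration of the percent-change and paired t-test calculation described in the Methods, the sketch below uses hypothetical aggregation values, not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical aggregation values (arbitrary units) for five volunteers:
# whole-blood baseline vs. blood diluted 30% with donor FFP.
baseline = np.array([85.0, 92.0, 78.0, 88.0, 95.0])   # whole blood
diluted = np.array([55.0, 60.0, 49.0, 58.0, 61.0])    # 30% donor FFP

# Percent change in platelet function relative to baseline
pct_change = 100.0 * (diluted - baseline) / baseline

# Paired Student's t-test, significance set at p < 0.05
t_stat, p_value = stats.ttest_rel(baseline, diluted)
print(pct_change.round(1).tolist(), p_value < 0.05)
```

The paired form of the test is used here because each volunteer serves as their own baseline.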

Results: Dilution of whole blood with autologous plasma by 30% produced, as expected, a significant 30% decrease in platelet function in response to all agonists, with the exception of TRAP. Autologous FFP had no additional effect on platelet function, with the exception of a reduction in the TRAP response (p=0.007). Single-donor plasma demonstrated further reductions in the ADP (p=0.02), collagen (p=0.006), and AA (p=0.007) responses compared to autologous FFP. Finally, comparison across multiple donors (n=5) demonstrated a trend toward reduced platelet function in response to all agonists, although not significant, with the exception of ristocetin, which remained unchanged.

Conclusion: Dilution is the major contributor to the decrease in platelet function; however, donor plasma may have additional negative effects on platelet function following transfusion.

 

62.11 Current and Future Approaches to the Proteomic Analysis of Traumatic Coagulopathy

C. C. McCoy1, E. Benrashid1, M. L. Shapiro1, S. N. Vaslef1, J. H. Lawson1  1Duke University Medical Center, Department Of Surgery, Durham, NC, USA

Introduction:
Current translational research in traumatic coagulopathy is limited by the heterogeneity of individual injuries and the acute, unpredictable nature of trauma incidence.  As a result, coagulation studies of plasma from trauma patients lack baseline comparisons and are biased by numerous clinical factors including multisystem trauma, pre-hospital resuscitation and blood component administration.  Developing both animal and human models of trauma will permit a more rigorous characterization of coagulation changes resulting from specific organ injury.

Methods:
A mouse model of blunt traumatic brain injury (TBI) was utilized in a pilot study of TBI-specific coagulation changes.  Three mice received piston-based, blunt TBI (6.8 m/s, 3 mm deflection) following anesthesia and scalp retraction, while three mice received anesthesia and scalp retraction alone.  Thirty minutes following intervention, blood was obtained and a proteomic analysis of coagulation was performed.  Two-dimensional difference gel electrophoresis (DIGE) was coupled with matrix-assisted laser desorption/ionization-time of flight (MALDI-TOF) mass spectrometry to identify proteins with altered concentrations in an unbiased fashion.

Results:
Multiple coagulation-related proteins were detected by DIGE, and three known coagulation proteins demonstrated concentration alterations (greater than 5% concentration change) following trauma.  Five novel proteins of unknown function also demonstrated alterations in concentration.  Although previously uncharacterized, these proteins could serve as targets for future investigation of injury-induced changes in the plasma proteome.

Conclusion:
Detecting concentration changes of known coagulation factors and novel proteins during TBI reinforces the value of organ-specific models of trauma as a mechanism to study the plasma proteome during injury.  To validate animal model findings, proteomic analyses will be performed on human plasma collected before, during and after elective surgery to develop human, organ-specific injury models.  Data from such research will enhance our knowledge of coagulation changes during tissue trauma, the influence of inflammatory pathways, and their relationship to organ-specific injury.

61.16 Oncostatin M Receptor Deficiency Protects Against Sepsis In Older Mice

S. Y. Salim1, N. Al-Malki1, T. A. Churchill1, R. G. Khadaroo1  1University Of Alberta (CA), Division Of General Surgery, Department Of Surgery, Faculty Of Medicine & Dentistry, Edmonton, Alberta, Canada

Introduction:   Sepsis with concomitant multiple organ failure continues to be a clinical challenge, especially in the elderly, who have increased morbidity and mortality. Oncostatin M (OSM) is part of the IL-6 cytokine family and has pleiotropic functions in hematopoietic, immunologic, and inflammatory networks. Levels of OSM have previously been shown to increase in patients with septic shock, though the role and mechanism of OSM in sepsis remain elusive. Here we hypothesize that inhibition of OSM via OSM receptor (OSMR) deficiency provides a survival benefit by modulating the inflammatory process in sepsis.

Methods:   Wild-type (WT) littermates and OSMR knockout (OSMR-/-) mice older than 50 weeks received an intraperitoneal injection of cecal slurry (CS). CS obtained from healthy WT C57BL/6 mice was prepared at an LD30 dose of 1.3 mg of cecal contents per gram of body weight. Mice were observed for 48 hours prior to peritoneal lavage and organ retrieval. Cytokine levels in tissue homogenates were measured using ELISA, while tissues were analyzed via myeloperoxidase (MPO) activity and histology.

Results:  Mortality was significantly higher in WT mice (100%) compared to OSMR-/- mice (see figure). Clinical assessment showed differences between the strains emerging 14 hours after CS injection. Cytokine analysis of tissue homogenates from both strains showed significant increases in IL-6, IL-10, and KC in lung and peritoneal lavage with CS compared to vehicle controls. Liver homogenates had significant increases in IL-6 and KC in both strains treated with CS compared to vehicle controls. Lung homogenates of CS-treated OSMR-/- mice had significantly lower levels of IL-6 than those of WT mice. However, lung MPO activity was significantly higher in OSMR-/- mice than in WT mice treated with CS.

Conclusion:  This study shows that OSMR deficiency protects against septic shock in older mice. These mice had lower levels of lung IL-6 despite greater neutrophil recruitment, indicating that OSMR deficiency results in a decreased proinflammatory response. We speculate that OSM may play a deleterious role in distant organ failure in sepsis. Understanding the immunomodulatory activity of OSM in sepsis may provide novel therapeutic avenues in the future.

61.17 The Impact of Hemorrhagic Shock on Pituitary Function

A. A. Haider1, P. Rhee1, V. Pandit1, N. Kulvatunyou1, B. Zangbar1, M. Mino1, A. Tang1, T. O’Keeffe1, R. Latifi1, R. S. Friese1, B. Joseph1  1University Of Arizona, Trauma/Surgery/Medicine, Tucson, AZ, USA

Introduction:
Hypopituitarism after hypovolemic shock is well established in certain patient cohorts, and the hormonal variations in trauma patients with hemorrhagic shock (HS) have been gaining attention. However, the effects of HS on pituitary function in trauma patients remain unknown. The aim of this study was to assess pituitary hormonal variations in trauma patients with HS.

Methods:
Patients with acute traumatic HS presenting at our Level 1 trauma center were prospectively enrolled. HS was defined as systolic blood pressure (SBP) ≤ 90 mm Hg and a requirement of ≥ 2 units of packed red blood cells. Serum cortisol and serum pituitary hormones [vasopressin (ADH), adrenocorticotropic hormone (ACTH), thyroid-stimulating hormone (TSH), follicle-stimulating hormone (FSH), and luteinizing hormone (LH)] were measured in each patient on admission and at 24, 48, 72, and 96 hours after admission. The outcome measure was variation in pituitary hormone levels.

Results:

A total of 42 patients were prospectively enrolled, with mean age 36.7±12.4 years, mean SBP 88±64.5 mm Hg, and median injury severity score 26 [18-38]. The mean admission ADH level was 20.4±9.6 pg/ml, mean admission ACTH level was 42±38.4 pg/ml, and mean admission cortisol level was 19.4±7.4 µg/dl. The pattern of pituitary hormones was variable over the 5 days (Figure 1). Four patients died within 24 hours. Patients who died had higher mean admission ADH levels (p=0.01), higher mean admission ACTH levels (p=0.02), and lower mean admission cortisol levels (p=0.01) compared to patients who survived. On sub-analysis of patients who died after 24 hours (n=6), there was a significant decrease in the levels of ADH (p=0.041) and cortisol (p=0.036).

Conclusion:
Acute hypopituitarism does not occur in trauma patients with acute hemorrhagic shock; the physiological response of the pituitary-adrenal axis in HS was appropriate. Pituitary hormone levels were variable over the 5 days. In patients who died, there was a decreasing trend in ADH and cortisol levels.

61.18 Inhibition of a Hyperactive Glucocorticoid Receptor Fragment hGR-S1(-349A) by RU486

D. G. Greenhalgh1,2, K. Cho1,2, T. Green1, S. Leventhal1, D. Lim1, D. Greenhalgh1,2  1Shriners Hospitals For Children Northern California, Burns, Sacramento, CA, USA 2University Of California – Davis, Burns, Sacramento, CA, USA

Introduction: We have previously reported that a 118-amino acid fragment of the N-terminus of GR [hGR-S1(-349A)], which lacks the ligand- and DNA-binding domains, has a markedly enhanced response to steroids compared to the full-length GR (hGRa) alone.  How hGR-S1(-349A) enhances glucocorticoid activity is not known.  We sought to determine whether blocking the classical glucocorticoid receptor pathway with RU486 would alter the enhanced activity of hGR-S1(-349A). Since the fragment lacks both DNA- and ligand-binding domains, we hypothesized that RU486 would not alter its activity.

Methods: Both the hGR and hGR-S1(-349A) were cloned into tsA201 cells and tested for glucocorticoid activity at various doses of hydrocortisone using our standard luciferase assay.

Results: hGR-S1(-349A) had the typical augmentation in response to increasing doses of hydrocortisone.  To our surprise, RU486 inhibited the activity of hGR-S1(-349A) suggesting a role for the hGR in its activity (figure).

Conclusion: hGR-S1(-349A) hyperactivity is inhibited by RU486 despite lacking the DNA- and ligand-binding domains.  These findings suggest that hGR-S1(-349A) acts as a cofactor in conjunction with hGR to augment the glucocorticoid stress response.

 

62.03 Green Plasma Has a Superior Hemostatic Profile Compared with Standard Color Plasma

B. A. Cotton1, J. C. Cardenas1, E. Hartwell1, C. E. Wade1, J. B. Holcomb1, N. Matijevic1  1University Of Texas Health Science Center At Houston, Acute Care Surgery/Surgery, Houston, TX, USA

Introduction:  While the transfusion of plasma has increased over the last decade, the availability of this product has seen dramatic changes that continue to threaten the current supply. The conversion to male-predominant plasma has further limited potential donors of emergency release products used for initial resuscitation in trauma. During this same period, investigators have demonstrated a sexual dimorphism in response to sepsis and injury, with less multiple organ failure and improved survival observed in premenopausal females. Plasma from female donors who are pregnant or taking oral contraceptives often has a green appearance. This green discoloration (due to increased ceruloplasmin levels) frequently results in these units being discarded or removed from the donor pool for commercial use, purely based on appearance. The purpose of this pilot study was to evaluate the hemostatic potential and capacity of green plasma compared to standard color plasma.

Methods:  We obtained plasma from 12 blood group-matched female donors from our local blood center. Six of these units had a normal-appearing hue (STANDARD) and six were grossly green-appearing (GREEN). These units were then evaluated by thrombelastography (TEG), calibrated automated thrombogram (CAT), and factor level measurements. Univariate analysis was performed using the Wilcoxon rank-sum test, and values are expressed as medians with 25th-75th percentile interquartile range.
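A minimal sketch of the univariate comparison described in the Methods, using hypothetical TEG angle values for six GREEN and six STANDARD units (not the study's measurements):

```python
import numpy as np
from scipy import stats

# Hypothetical TEG angle values (degrees) for six units per group
green = np.array([66.0, 69.0, 71.0, 68.0, 70.0, 72.0])
standard = np.array([40.0, 42.0, 45.0, 38.0, 44.0, 41.0])

# Median with 25th/75th percentiles (IQR), as reported in the abstract
med, q25, q75 = np.percentile(green, [50, 25, 75])
print(f"GREEN angle: {med} [{q25}-{q75}]")

# Wilcoxon rank-sum test (equivalently, the Mann-Whitney U test)
u_stat, p_value = stats.mannwhitneyu(green, standard, alternative="two-sided")
print(f"p = {p_value:.4f}")
```

For samples this small with no ties, scipy computes an exact p-value rather than a normal approximation.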

Results: GREEN plasma had a more hypercoagulable TEG profile for all values (r-value: 1.6 vs. 3.1 min, p=0.004; k-time: 2.2 vs. 4.8 min, p=0.088; angle: 69 vs. 42 degrees, p=0.004; MA: 38 vs. 25 mm, p=0.054) when compared to STANDARD plasma. Differences were also observed in coagulation factor levels, with GREEN plasma having higher levels than STANDARD (factor II: 107 vs. 96%, p=0.004; factor VII: 124 vs. 106%, p=0.149; factor IX: 145 vs. 114%, p=0.077; factor X: 125 vs. 102%, p=0.006; factor XI: 121 vs. 101%, p=0.025). Using CAT, GREEN plasma had a longer lag time (4.1 vs. 3.6 min, p=0.037) and increased endogenous thrombin potential (1669 vs. 1280 nM/min, p=0.114).

Conclusion: This pilot study demonstrates that green plasma from female donors has a superior hemostatic profile compared with standard color plasma. Current AABB recommendations for male-predominant plasma have further reduced the availability of emergency release plasma for life-threatening bleeding. GREEN plasma should be further investigated for its safety profile and hemostatic potential. Should it prove to be a safe and functionally non-inferior (and potentially superior) product, GREEN plasma should be actively re-introduced into the medical community for transfusion of critically injured and bleeding patients.

59.17 Inhibition of Heterotopic Ossification by Cox-2 Inhibitors Is Independent of BMP Receptor Signaling

S. Agarwal1, J. Peterson1, S. Loder1, O. Eboda1, C. Brownley1, K. Ranganathan1, D. Fine1, K. Stettnichs1, A. Mohedas2, P. Yu2, S. Wang1, S. Buchman1, P. Cederna1, B. Levi1  1University Of Michigan, Surgery, Ann Arbor, MI, USA 2Brigham And Women’s Hospital, Boston, MA, USA

Introduction:  Treatment options for heterotopic ossification (HO) including surgical excision and radiation cause tissue damage and result in recurrence. Prophylactic treatment with non-steroidal anti-inflammatory drugs (NSAIDs) including aspirin and celecoxib has been reported to decrease ectopic bone formation in patients after orthopedic procedures although the pathway remains unexplored. Here we demonstrate that administration of celecoxib, a Cox-2 specific inhibitor, decreases HO formation following trauma independent of bone morphogenetic protein receptor (BMPR) function.

Methods:  For our HO model, male C57BL/6 mice underwent Achilles tenotomy of the left hindlimb with a 30% total body surface area partial-thickness burn over the dorsum. Mice were administered intraperitoneal celecoxib or carrier daily.  HO was quantified by micro-CT imaging at 2-week intervals up to 9 weeks after trauma (threshold: 1250 Hounsfield units). Wound healing was assessed by daily imaging.  To analyze BMP signaling, we used the BRE-luc reporter in C2C12 cells in vitro. C2C12 cells were administered indomethacin (a Cox-1/Cox-2 inhibitor) in the presence of BMP2, BMP4, BMP6, or BMP9, followed by quantification of luciferase activity.
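The micro-CT quantification step amounts to counting voxels above the stated 1250-HU threshold and converting the count to a volume. A minimal sketch, using a synthetic scan and an assumed voxel size (the abstract does not state one):

```python
import numpy as np

# Synthetic HU volume standing in for a micro-CT scan; real data
# would be loaded from the scanner's reconstruction.
rng = np.random.default_rng(1)
scan_hu = rng.normal(300, 400, size=(100, 100, 100))

voxel_mm3 = 0.012 ** 3  # assumed isotropic 12-um voxels (hypothetical)
ho_voxels = np.count_nonzero(scan_hu > 1250)  # threshold from the abstract
ho_volume_mm3 = ho_voxels * voxel_mm3
print(ho_volume_mm3)
```

The same thresholding applied at each 2-week time point yields the HO volumes compared between treatment groups.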

Results: Administration of celecoxib resulted in an 80 percent decrease in HO formation at 7 weeks (5.06 mm3 vs. 1.30 mm3, p<0.05) and a 77 percent decrease at 9 weeks (5.61 mm3 vs. 1.55 mm3) after trauma. In contrast to the carrier-treated group, HO formation was undetectable for the first 3 weeks in the celecoxib-treated group. All Achilles tenotomy incision and dorsal burn sites showed grossly normal healing. Mesenchymal stem cells from burned mice treated with 1 μM celecoxib in vitro demonstrated 60% less alkaline phosphatase staining and 75% less alizarin red staining than untreated cells. Finally, in vitro administration of indomethacin to C2C12 cells with the BRE-luc reporter resulted in no significant decrease in luciferase activity, suggesting that Cox-2 inhibition does not inhibit BMP receptor function.

Conclusion: We demonstrate that Cox-2 inhibition decreases HO volume in a burn/trauma model. Decreased mineral deposition also occurs in the in vitro setting with mesenchymal stem cells, suggesting a direct effect on the cells responsible for bone formation. Furthermore, the BRE-luc reporter assay demonstrates that Cox-2 inhibition likely does not impart its effect through the BMP pathway.  Our findings suggest that therapeutic targets for HO need not be limited to the BMP pathway, and that the Cox-2 enzyme deserves further attention.  Patients at risk for HO following trauma may benefit from early celecoxib treatment with minimal impact on wound healing. 

59.18 Hesperidin Accelerates Closure of Splinted Cutaneous Excisional Wounds in Mice

A. A. Wick1, T. Lecy1, T. W. King1  1University Of Wisconsin, Plastic Surgery, Madison, WI, USA

Introduction:
Every year in the United States, more than 6.5 million patients suffer from wound-related complications, and treatment is estimated to cost over $25 billion. After an injury, keratinocytes from the wounded edge must proliferate, migrate across the wound bed, and differentiate in order to restore normal barrier function. We are interested in discovering new methods by which to enhance proliferation, migration, and differentiation in order to improve the wound healing process. Hesperidin, a natural flavonoid found in citrus fruits and honey, has been shown to improve epidermal barrier function. In this study, we investigated the effects of hesperidin on the rate of wound healing in murine splinted cutaneous excisional wounds, a model shown to simulate healing in human tissue. 

Methods:
Six-week-old male mice (n=14) were anesthetized and shaved, and two full-thickness 6 mm wounds were created on their backs under aseptic conditions. A 12 mm silicone stent was secured around each wound using cyanoacrylate glue and interrupted 5-0 nylon sutures in order to prevent healing by contraction and to promote healing by formation of granulation tissue. A sterile non-adherent dressing and a transparent occlusive dressing were placed over the wound. Twenty-four hours after injury, hesperidin (10 μM) or vehicle (0.01% DMSO) was applied topically to each wound; treatment was repeated daily. Digital photographs were taken of the wounds every day at each dressing change and treatment application. Wound closure was defined by gross visualization of resurfacing epithelia and calculated as a percent area of the original wound size. Wounds were quantified using ImageJ software (NIH) and expressed as a ratio of wound area to stent area, with scaling normalized to the inner diameter of the splint.
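The normalization described above (wound area divided by stent area, expressed as a percent of the day-0 wound) can be sketched as follows; all pixel values are hypothetical, standing in for ImageJ measurements:

```python
# Percent of the original wound still open, normalizing each day's
# wound area by its stent area to correct for camera scale.
def percent_open(wound_px: float, stent_px: float,
                 day0_wound_px: float, day0_stent_px: float) -> float:
    ratio_now = wound_px / stent_px
    ratio_day0 = day0_wound_px / day0_stent_px
    return 100.0 * ratio_now / ratio_day0

# e.g., a wound whose normalized area is 44% of its day-0 value
print(round(percent_open(1100, 10000, 2500, 10000), 1))  # → 44.0
```

Because both areas come from the same photograph, the ratio cancels out differences in camera distance between imaging sessions.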

Results:
The wound sizes were similar in both groups at the beginning of treatment.  Addition of hesperidin (10 μM) significantly accelerated the rate of wound closure compared to DMSO control on day 4 (% open, 44±3 vs 53±2 respectively, p<0.05) and on day 5 (% open, 23±3 vs 33±3 respectively, p<0.05).  Wound closure for both groups was complete by day 8.  No negative side effects were noted in the mice.

Conclusion:
Hesperidin accelerates cutaneous wound closure in our in vivo model. Based on this novel finding, further studies should evaluate the mechanisms by which hesperidin accelerates the wound healing process, possibly leading to the development of new and effective therapeutics for wound healing in patients.
 

59.19 Raman Spectroscopy Provides a Rapid, Non-invasive Method for Identifying Calciphylaxis

S. Agarwal1, B. Lloyd1, S. Nigwekar2, S. Loder1, K. Ranganathan1, P. Cederna1, S. Fagan2, J. Goverman2, M. Morris1, B. Levi1  1University Of Michigan, Surgery, Ann Arbor, MI, USA 2Massachusetts General Hospital, Boston, MA, USA

Introduction:  Calciphylaxis is a painful and debilitating condition which creates large open wounds most frequently in patients with renal failure. Calciphylactic lesions are characterized by the precipitation of calcium deposits in the skin and soft tissues, resulting in vessel thrombosis and tissue necrosis. Diagnosis of calciphylaxis traditionally occurs via histologic evaluation of biopsy specimens. However, incisional biopsy of the affected site may result in further local inflammation leading to a cycle of further calcium deposition. We set out to develop a non-invasive diagnostic method which can identify calciphylaxis lesions and avoids creating local inflammatory trauma. 

Methods:  Two histology-confirmed human calciphylaxis biopsy specimens and normal surrounding tissue were examined using either a Raman microscope (interrogating a tissue area < 1 mm) or a hand-held Raman spectroscopy probe (interrogating a tissue volume < 1 mm3). Characteristic spectra for each specimen, including the normal surrounding tissue and the known calciphylaxis regions, were collected and compared to identify common peaks contributed by apatite.  Spectra were pre-processed to remove cosmic spikes and to correct spectrograph/detector alignment and grating-induced anamorphic magnification (curvature).  Spectra were corrected for the fluorescence background by fitting the background to a low-order polynomial (polynomial order = 5). Band heights and areas were measured. Concurrently, nano-CT scans were performed to confirm the regions of calcification.
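The fluorescence-background step described above can be sketched on a synthetic spectrum: fit a 5th-order polynomial to the raw trace and subtract it, leaving the band of interest. (This is a simplified illustration; a production pipeline would fit the baseline iteratively, excluding peak regions from the fit.)

```python
import numpy as np

# Synthetic spectrum: slow fluorescence drift plus an apatite-like
# band at 960 cm-1 and a little detector noise.
rng = np.random.default_rng(0)
wavenumber = np.linspace(600, 1800, 1200)                # cm-1
fluorescence = 1e-14 * (wavenumber - 600) ** 5 + 50.0    # smooth background
apatite_band = 200.0 * np.exp(-(((wavenumber - 960) / 8) ** 2))
spectrum = fluorescence + apatite_band + rng.normal(0, 1, wavenumber.size)

# Polynomial.fit rescales x internally, keeping the fit well-conditioned.
baseline_fit = np.polynomial.Polynomial.fit(wavenumber, spectrum, deg=5)
corrected = spectrum - baseline_fit(wavenumber)

# Band height at the 960 cm-1 apatite peak after correction
peak_height = float(corrected[np.abs(wavenumber - 960).argmin()])
print(round(peak_height))
```

Because the band is narrow relative to the scan range, the low-order polynomial absorbs the background while leaving most of the peak intact.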

Results: Using nano-CT imaging, we demonstrated large areas of calcification, including within the vasculature. Both calciphylaxis specimens exhibited a strong peak at 960 cm-1, consistent with the Raman spectrum attributed to apatite and apatite-like tissue components. Here, this strong peak is attributed to small calcium phosphate precipitates within the tissue. Normal tissue examined from these patients showed no Raman signature of calcium phosphate. Our results suggest that Raman spectroscopy can differentiate the calcium phosphate deposition of calciphylaxis from normal tissue.

Conclusion: Here we differentiate calciphylaxis from normal surrounding tissue based on the physical characteristics of the tissue using Raman spectroscopy. Although we have employed this technique on previously excised biopsy specimens, a hand-held, fiber-optic probe can be developed to analyze surface tissue in humans prior to biopsy. In the future, Raman spectroscopy may provide a rapid and non-invasive method for diagnosing calciphylaxis. By avoiding an incisional biopsy, we can avoid exacerbating the cycle of inflammation that precipitates calcium phosphate deposition in these patients.

59.20 Negative Pressure Wound Therapy in Severe Open Fractures: Complications Related to Length of Therapy

K. A. Rezzadeh1, L. A. Segovia1, A. Hokugo1, R. Jarrahy1  1University Of California, Los Angeles, Surgery-Plastic, Los Angeles, CA, USA

Introduction: Negative pressure wound therapy (NPWT) is a widely accepted method of temporary wound coverage prior to soft tissue flap reconstruction of severe lower extremity injuries. However, an examination of the effects of NPWT on complications related to limb salvage surgery has yet to be performed. The precise role of NPWT in the perioperative management of patients with complicated lower extremity injuries remains unclear. In this study we examine the impact of NPWT on the outcomes of traumatic lower extremity wounds treated with flap coverage for the purpose of limb salvage. Specifically, we elucidate the effect of NPWT on flap complications and overall outcomes based upon the timing of soft tissue reconstruction relative to initial injury.

Methods: An institutional case series was performed consisting of thirty-two consecutive patients receiving lower extremity reconstruction following Gustilo Class IIIB and IIIC open tibial fractures from 1996 to 2001. Demographic, operative, and clinical data were collected retrospectively.  Outcomes of interest included length of hospitalization, number of surgical procedures, extremity amputation, and non-union.

Results: Among patients reconstructed within all study time periods (i.e., acute, subacute, and chronic), the incidence of overall complications was lower for the group treated with NPWT than for those patients who underwent conventional wet-to-dry dressing changes. Patients operated on in the chronic period who received conventional dressing changes had the highest complication rate (83.3%), while those reconstructed in the acute period with perioperative NPWT had the lowest (0%).

Conclusion: We have shown that the use of NPWT in the perioperative management of patients with open distal lower extremity fractures reduces complication rates associated with limb salvage surgery. Based on our results, we conclude that NPWT can be used as a temporizing measure to optimize patients prior to flap surgery, effectively lengthening the window of opportunity for reconstructive surgeons to manage these challenging patients.

 

56.08 Ultrasound-Guided Central Venous Catheter Insertion Is Safer, But in Whose Hands? A Meta-Analysis

C. J. Lee1, R. S. Chamberlain1,2,3  1Saint Barnabas Medical Center, Surgery, Livingston, NJ, USA 2New Jersey Medical School, Surgery, Newark, NJ, USA 3St. George’s University School Of Medicine, St. George’s, Grenada

Introduction:  Real-time ultrasound guidance for the placement of central venous catheters (CVCs) is purported to increase placement success and reduce complications, but in whose hands? This meta-analysis assesses all available evidence comparing landmark-guided (LG) with ultrasound-guided (UG) CVC insertion with regard to success, safety/complications, and the experience of the operator placing the CVC.

Methods:  A comprehensive literature search of Medline, PubMed, and the Cochrane Central Register of Controlled Trials was performed. Seventeen prospective, randomized controlled trials comparing UG with LG techniques of CVC placement and specifying operator experience were identified. Data were extracted on study design, study size, operator experience, rate of successful catheter placement, number of attempts to success, and rate of accidental arterial puncture. A meta-analysis was then performed on these data.

Results: 17 trials with a total of 3,686 subjects were included: 1,684 CVCs were placed by LG and 1,822 by UG. Among the UG group, 910 CVCs were placed by junior operators (<5 years of experience) and 912 by senior operators; in the LG group, 754 were placed by junior operators and 930 by senior operators. UG CVC insertion was associated with a 14% increase in the likelihood of successful CVC placement by junior operators (risk ratio (RR), 1.14; 95% CI, 1.08-1.21) and an 8% increase for senior operators (RR 1.08; 95% CI, 1.02-1.14). The mean number of needle attempts until CVC placement was lower in the UG group (standard difference in means, -0.85; 95% CI, -1.18 to -0.51); however, no significant difference was seen between junior and senior operators. UG was associated with a 49% decrease in the likelihood of accidental arterial puncture among junior operators (RR 0.51; 95% CI, 0.28-0.95) and an 86% decrease among senior operators (RR 0.14; 95% CI, 0.07-0.26). A statistically significant difference between junior and senior operators was observed only for accidental arterial puncture (p=0.004).
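The per-study building block of these pooled estimates is a risk ratio with a 95% confidence interval computed on the log scale. A minimal sketch, using hypothetical counts rather than the pooled values from this study:

```python
import math

# Risk ratio with a 95% CI via the log method
def risk_ratio_ci(events_a: int, n_a: int, events_b: int, n_b: int,
                  z: float = 1.96) -> tuple[float, float, float]:
    rr = (events_a / n_a) / (events_b / n_b)
    # standard error of log(RR)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# e.g., successful placement in 870/910 UG vs. 600/754 LG insertions
# (hypothetical counts for illustration)
rr, lo, hi = risk_ratio_ci(870, 910, 600, 754)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A meta-analysis would then combine such per-trial log(RR) values, weighted by the inverse of their variances.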

Conclusion: UG CVC placement in adult patients is effective in improving the overall success rate and decreasing both the number of attempts until cannulation and the arterial puncture rate, regardless of operator experience level.  Accidental arterial punctures during UG CVC placement were lower in both groups, but a significantly greater benefit was observed among senior operators compared to junior operators.  UG CVC placement improves success rate, efficiency, and safety, even among those most comfortable or confident with traditional landmark-guided techniques.

54.09 Neighborhood Socioeconomic Status Predicts Violent Injury Recidivism

V. E. Chong1, W. S. Lee1, G. P. Victorino1  1UCSF-East Bay, Surgery, Oakland, CA, USA

Introduction:  Measures of individual socioeconomic status, such as income, educational attainment, employment level, and insurance status, are known to correlate with violent injury recidivism. Accordingly, tertiary violence prevention programs at many trauma centers target these areas to help violently injured patients avoid recurrent violent victimization. A person’s environment may also influence their risk of involvement in violence, and as such, neighborhood socioeconomic status may impact the likelihood of recurrent injury. As this potential link has yet to be fully studied, we conducted a review of victims of interpersonal violence treated at our trauma center, hypothesizing that the median income of a patient's neighborhood of residence predicts recurrent violent victimization.

Methods:  We conducted a retrospective analysis of our urban trauma center’s registry, identifying patients who were victims of interpersonal violence from 2005-2010. We included patients ages 12-24, as this is the age of eligibility for our hospital’s violence intervention program. We focused on this age group because we currently have the resources to further address their needs. Patients who died from their trauma were excluded. Recurrent episodes of violent injury were identified, with follow-up through 2012. Median income for the patient’s zip code of residence was derived from US census estimates and divided into quartiles. Multivariate logistic regression was conducted to evaluate predictors of violent injury recidivism.
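The quartile coding described above (census-derived median income split at the sample quartiles to form a categorical predictor for the logistic regression) can be sketched as follows; the income values are hypothetical:

```python
import numpy as np

# Hypothetical neighborhood median incomes by patient zip code
incomes = np.array([26000, 31000, 37000, 39500, 41000, 48000,
                    62000, 75000, 90000, 110000, 125000, 137000])
q1, q2, q3 = np.percentile(incomes, [25, 50, 75])

# Quartile label 1 (lowest income) .. 4 (highest income) per patient;
# these labels become the categorical covariate in the regression.
quartile = 1 + (incomes > q1).astype(int) \
             + (incomes > q2).astype(int) \
             + (incomes > q3).astype(int)
print(quartile.tolist())
```

In the multivariate model, the lowest quartiles serve as indicator variables whose odds ratios are estimated against the highest-income reference group.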

 

Results: During the study period, 1,888 patients presented to our trauma center as a result of interpersonal violence. Mechanism of injury included blunt assault (n=451; 24%), stabbing (n=266; 14%), and gunshot wound (n=1,171; 62%). We identified 162 recidivist events (8.6%). Neighborhood median income ranged from $25,818 to $137,770. Univariate analysis was performed, and multivariate logistic regression confirmed the following factors as independent predictors of violent injury recidivism: male gender (OR=2.54 [1.33, 4.86]; p=0.005), black race (OR=2.14 [1.16, 3.93]; p=0.01), and the two lowest neighborhood median income quartiles, < $37,609 (OR=1.7 [1.15, 2.51]; p=0.008) and $37,609 to $40,062 (OR=1.85 [1.13, 3.02]; p=0.01).

 

Conclusion: For young patients who present with violent injury, the median income of their neighborhood of residence is independently correlated with their risk of recidivism. Low neighborhood socioeconomic status may be associated with a patient’s disrupted sense of safety after violent injury, and may represent a shortage of resources necessary to help patients avoid future violence. As such, tertiary violence prevention programs aimed at reducing violent injury recidivism should include services at both the individual level, examples of which include job training and educational support, and the neighborhood level, including advocacy efforts focused on community safety and social justice.

 

55.10 Inappropriate Warfarin Use in Trauma: Time for a Safety Initiative

H. H. Hon1, A. Elmously1, C. D. Stehly1,2, J. Stoltzfus3, S. P. Stawicki1,2, B. A. Hoey1  1St. Luke’s University Health Network,Regional Level I Trauma Center,Bethlehem, PA, USA 2St. Luke’s University Health Network,Department Of Research & Innovation,Bethlehem, PA, USA 3St. Luke’s University Health Network,The Research Institute,Bethlehem, PA, USA

Introduction: Warfarin continues to be widely prescribed in the United States for a variety of conditions. Several studies have demonstrated that pre-injury warfarin may worsen outcomes in trauma patients. We hypothesized that a substantial proportion of our trauma population was receiving pre-injury warfarin for inappropriate indications and that a significant number of such patients had subtherapeutic or supratherapeutic international normalized ratios (INR). Our secondary aim was to determine if pre-injury warfarin is associated with increased mortality.

Methods: A 10-year retrospective review of registry data from a Level I trauma center was conducted between 2004 and 2013. Abstracted variables included patient age, Injury Severity Score (ISS), Abbreviated Injury Score (AIS) for head, mortality, hospital length of stay (HLOS), indication(s) for anticoagulant therapy, admission Glasgow Coma Scale, and admission INR determinations. Warfarin indication(s) and suitability were verified using the most recent American College of Chest Physicians (ACCP) Guidelines. Inappropriate warfarin administration was defined as use inconsistent with ACCP guidelines. For outcome comparisons, a random sample of trauma patients who were not taking warfarin was designated as the "control group". Additionally, severe traumatic brain injury (sTBI) was defined as AIS head ≥4. Statistical analyses were conducted using the chi-square and the Mann-Whitney rank sum tests, as appropriate.

Results: A total of 21,136 blunt trauma patients were evaluated by the trauma service during the study period. Of those, 1,481 (7%) were receiving warfarin prior to injury, with an additional 1,947 (~10% of the non-warfarin sample) designated as "non-warfarin controls". According to the ACCP Guidelines, 264/1,481 (17.8%) patients in the warfarin group were receiving anticoagulation inappropriately. Moreover, >63% of the patients were outside of the intended therapeutic window with regard to their INR (41.1% subtherapeutic, 22.2% supratherapeutic). Overall, median HLOS was greater in patients taking pre-injury warfarin (4 days vs. 2 days, p<0.01). Mortality was higher in the warfarin group (6.1% or 91/1,481) vs. the control group (2.6% or 50/1,947, p<0.01). Patients with sTBI (AIS head ≥4) had significantly greater mortality in the warfarin group (26.4% or 56/212) vs. the control group (14.9% or 22/148, p<0.01).

Conclusion: A significant number of trauma patients admitted to our institution were noted to take warfarin for inappropriate indications. Moreover, a significant proportion of patients taking warfarin had either subtherapeutic or supratherapeutic INRs. Pre-injury warfarin was associated with increased mortality and HLOS in this study, especially in the subset of patients with sTBI. National safety initiatives directed at appropriate initiation and management of warfarin are necessary.

54.06 Overtriage Rates Continue to Burden a Mature Trauma Center

M. Soult1, J. N. Collins1, T. J. Novosel1, L. D. Britt1, L. J. Weireter1  1Eastern Virginia Medical School,Norfolk, VA, USA

Introduction:  As the number of emergency department (ED) visits continues to increase, trauma systems can become a resource for busy EDs, allowing for the rapid evaluation of patients.  Over time at a mature trauma center, the rate of trauma alerts being discharged home from the trauma bay continued to rise.  The objective of this study was to investigate the cause of the increased overtriage rate and determine whether it could be reduced by adherence to activation protocols.

Methods:  A retrospective chart review was performed on all patients who presented as a trauma alert to a single level I trauma center from January 2000 to February 2014.  Full activation was designated as an alpha alert, while partial activation was a bravo alert.  Using the Cribari method for overtriage and undertriage, yearly rates were calculated.  Additionally, discharge rates as a function of activation level were calculated.
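In the Cribari method, overtriage is the proportion of full (alpha) activations whose injuries turn out to be minor (conventionally ISS ≤ 15), and undertriage is the proportion of limited activations with major injury (ISS > 15). A minimal sketch with hypothetical patients — the ISS cutoff and the toy data are assumptions, not this study's registry:

```python
def cribari_rates(patients):
    """patients: list of (activation_level, iss) tuples.
    activation_level: 'full' (alpha alert) or 'partial' (bravo alert)."""
    full = [iss for level, iss in patients if level == "full"]
    partial = [iss for level, iss in patients if level == "partial"]
    overtriage = sum(1 for iss in full if iss <= 15) / len(full)
    undertriage = sum(1 for iss in partial if iss > 15) / len(partial)
    return overtriage, undertriage

# Hypothetical cohort of eight alerts
pts = [("full", 4), ("full", 25), ("full", 9), ("full", 17),
       ("partial", 29), ("partial", 5), ("partial", 8), ("partial", 3)]
over, under = cribari_rates(pts)
print(f"overtriage {over:.0%}, undertriage {under:.0%}")  # overtriage 50%, undertriage 25%
```

The study applied the same logic year by year to its alert registry to produce the annual rates reported below.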

Results: During the study period trauma activation was required for 25,348 patients.  From 2000-2006 there was an average of 1710 activations and the overtriage rate was 63%.  The emergency department (ED) underwent an expansion that was completed near the end of 2006.  From 2007-2012 there was an average of 1804 activations and an overtriage rate of 74% (p<0.05).  Discharge rates mirrored the overtriage rate: 14% of alpha and 41% of bravo alerts were discharged from the trauma bay from 2000-2006, versus 22% of alpha and 50% of bravo alerts from 2007-2012 (p<0.05).  To assess whether defined activation criteria were being adhered to, activations were audited from May 2013-February 2014.  There were 2005 activations, and the overtriage rate decreased significantly to 69%.  However, 24% of alpha and 52% of bravo alerts were discharged from the trauma bay.  The audit revealed that when criteria from the protocol were met and documented, the overtriage rate further declined to 65%.  However, even when activation criteria were complied with, discharge rates remained unchanged, with 22% of alphas and 56% of bravos being discharged.

Conclusion: Overutilization of resources remains an issue in this trauma system.  As this level I trauma center matured, comfort with trauma activations increased and there was a departure from identifying alert activation criteria and documenting the reason for activation.  After identification of this departure, increased adherence to defined protocols decreased overtriage rates.  While current criteria do not bring triage rates to national benchmarks, they provide significant improvement, and further modification of activation criteria is being evaluated.


54.07 Falls In The Elderly: A Cause Of Death Analysis At A Level 1 Trauma Center

C. J. Allen1, W. M. Hannay1, C. R. Murray1, R. J. Straker1, J. P. Meizoso1, J. J. Ray1, M. Hanna1, C. I. Schulman1, A. S. Livingstone1, N. Namias1, K. G. Proctor1  1University Of Miami,Trauma And Critical Care,Miami, FL, USA

Introduction:  Fall-related mortality is projected to exceed both MVC and firearm mortality rates.  One in three elderly adults falls each year. Although tremendous resources are focused on this growing public health concern, with many arguing that more devoted trauma centers are required to address this issue, exactly how these patients die is unknown. To fill this gap, we analyzed all falls at a level I trauma center. We hypothesize that a large portion of elderly fatalities do not result directly from the fall injury itself.

Methods:   From 01/2002 to 12/2012, 7,293 fall admissions were reviewed. Demographics, fall height, injury patterns, procedures and outcomes were obtained. Each elderly (≥65y) fatality was analyzed for root cause of death and categorized as either: a direct result of the fall, a complication of prolonged hospital stay (indirectly related to the fall injury), or a fatal event preceding the fall. Physician notes were used to define cause of death, and medical examiner and autopsy reports were reviewed in ambiguous cases. Parametric data are presented as mean ± standard deviation and non-parametric data as median (interquartile range). Student’s t-test and Fisher’s Exact test were used as appropriate.

Results: In 2002-2007, 25% of all falls were elderly, but in 2008-2012 this proportion increased to 30% (p<0.001). When comparing non-elderly (n=5,187) to elderly (n=2,076), age was 38±17y vs 78±8y, 76% vs 50% M (p<0.001), ISS 10±10 vs 12±10 (p=NS), LOS 8±6d vs 12±6d (p<0.001), mortality 3.4% vs 13.7% (p<0.001). Non-elderly fatalities (n=180) were 82% M, and ISS 32±15, while elderly fatalities (n=285) were 58% M (p<0.001), 91% ground-level height, ISS 23±11 (p<0.001). Elderly cause of death is detailed in the figure below.

Conclusion: The fall population is becoming proportionately more elderly.  Despite similar injury severity, the elderly had a longer LOS and >4x the mortality rate of the non-elderly. As the vast majority of elderly fatalities fell from ground-level height, they had significantly lower injury severity than non-elderly fatalities. Over one-third of elderly deaths were due to a fatal event preceding the fall or to complications of a prolonged hospital stay. Since fall mortality rates continue to rise despite public health concern, it is notable that a large portion of these deaths are not directly related to the fall injury. As focus on elderly fall mortality increases, and as many believe more devoted trauma centers are required to combat this growing health concern, defining the true causes of death may be necessary to focus resources on the actual issues.


52.08 Geographic Disparities in Mortality Following Head Trauma

M. P. Jarman2, R. C. Castillo2, A. R. Carlini2, A. H. Haider1  1Johns Hopkins University School Of Medicine,Department Of Surgery,Baltimore, MD, USA 2Johns Hopkins Bloomberg School Of Public Health,Department Of Health Policy And Management,Baltimore, MD, USA

Introduction:  Treatment at a designated trauma center is proven to reduce mortality from traumatic brain injury, but the majority of US residents in rural areas do not have timely access to Level I or II trauma centers. Rural residents also face elevated risk of traumatic brain injury compared to non-rural residents, and may experience more severe injuries than their urban and suburban counterparts.

Methods:  We performed a retrospective analysis of 2006-2011 National Emergency Department Sample data to determine if mortality following traumatic brain injury differs across urban/rural classifications. Emergency department (ED) visits with ICD-9-CM codes for intracranial injury (ICD-9-CM 850-854) as the primary diagnosis were included in these analyses (N = 180,499). Odds of death in the ED were calculated using multiple logistic regression analyses with patient residential urban/rural status, Injury Severity Score, comorbidities, trauma center designation, patient age, and patient gender as covariates. All analyses were performed using Stata 12.1.

Results: Residents of rural communities were 21% (p = 0.001) more likely to die of traumatic brain injury than non-rural residents, when controlling for severity, comorbidities, trauma center designation, age, and gender. Rural residents treated at Level I trauma centers were 18% (p = 0.010) more likely to die of their injuries, compared to non-rural residents. There was no statistically significant difference in mortality between rural and non-rural residents with head injury treated at Level II or Level III centers (p = 0.092 and p = 0.465, respectively). Rural residents treated for head injury at Level IV centers were 76% more likely to die compared to non-rural residents (p < 0.001).

Conclusion: People living in rural communities are significantly more likely than non-rural residents to die following traumatic brain injury. This disparity is present at Level I and Level IV trauma centers, the latter of which typically serve as safety net providers in rural communities. The disparity is not present at Level II and Level III trauma centers. Distance and travel time to treatment likely play a significant role in brain injury outcomes for rural residents, but measures of distance and time were not available for these analyses. Future analyses should explore the interaction between time to treatment, level of care, and outcomes for rural residents.

52.09 Undertriage of Older Adult Trauma Patients: Is this a National Phenomenon?

L. M. Kodadek1, S. Selvarajah1, C. G. Velopulos1, A. H. Haider1  1Johns Hopkins University School Of Medicine,Department Of Surgery,Baltimore, MD, USA

Introduction:
Older age is associated with higher morbidity and mortality after injury.  National guidelines recommend that patients age ≥55 years be considered for triage to trauma centers (TCs) to ensure optimal care.  However, recent statewide studies suggest that significantly injured patients ≥55 years are actually more likely to be undertriaged to nontrauma centers (NTCs).  Our objective was to determine if older adult undertriage is a national phenomenon and to determine associated patient and injury factors. 

Methods:
The 2011 Nationwide Emergency Department Sample was used to perform a national analysis.  All adults age ≥55 years presenting to an emergency department (ED) with an injury diagnosis were identified using ICD-9 diagnosis codes (800.*-959.*).  Patients transferred to another short-term facility and patients with ICD-9 codes indicating superficial injury, foreign body injury and late effects were excluded.  To determine injury severity, Stata program for Injury Classification (ICDPIC) was used to assign a new injury severity score (NISS).  We performed weighted descriptive analysis comparing characteristics of patients by trauma center status.  As patients in rural areas may be transported to a NTC because of proximity, a subset analysis was performed examining patients in urban/suburban areas only where this would occur less commonly.  Logistic regression was performed to determine association between patient demographics and injury characteristics, controlling for hospital characteristics, injury severity and Charlson Comorbidity Index.

Results:
Of 4,135,782 ED visits meeting inclusion criteria, 74.0% were treated at NTCs, and 70.4% of all ED visits were in urban/suburban areas.  Although the majority of ED visits were associated with NISS<9 (85.1%), proportionally more ED visits at TCs had NISS≥9 compared to NTCs (22.2% vs. 12.3%, P<.001).  We found that 58.1% of patients with NISS≥9 were managed at a NTC.  On multivariate analysis, patients at NTCs were more likely to be ≥75 years, female, insured by the government, and injured by falls.  Patients with moderate injuries (NISS 9 to 15) were more commonly treated at NTCs (62.7%), and patients with the most severe injuries (NISS > 25) were seen at similar rates at TCs and NTCs (49.1% vs. 50.9%).

Conclusion:
There is substantial undertriage of patients age ≥55 years at the national level.  Nearly half of significantly injured older patients are being treated at NTCs.  The impact of undertriage needs to be determined and interventions are needed to ensure that older patients receive trauma care at the optimal site.  

50.05 Efficacy of Gastrografin Challenge in Comparison to Standard Management of Small Bowel Obstruction

Y. M. Baghdadi1, M. Amr1, M. A. Khasawneh1, S. Polites1, M. Zielinski1  1Mayo Clinic,Rochester, MN, USA

Introduction: The Gastrografin® Challenge is a diagnostic and therapeutic tool for treating patients with small bowel obstruction (SBO); however, long-term data on SBO recurrence after the Gastrografin® Challenge are limited. We hypothesized that patients treated with Gastrografin® would have the same long-term recurrence as those treated prior to the implementation of the Gastrografin® Challenge protocol.

Methods: Institutional review board approval was obtained to review medical records of patients 18 years or older admitted for acute SBO between 7/2009-9/2011. Patients with contraindications to the Gastrografin® Challenge (i.e., signs of strangulation) were excluded. The Kaplan-Meier method was used to describe time-dependent outcomes. Data are presented as mean ± standard deviation or percentage, as appropriate.
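The Kaplan-Meier product-limit estimator multiplies, at each event time, the fraction of at-risk patients who remain event-free; censored patients simply leave the risk set without forcing the curve down. A minimal sketch with made-up follow-up times, not the study's data:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up time per patient; events: 1 = event (recurrence), 0 = censored."""
    data = sorted(zip(times, events))
    survival = 1.0
    curve = []  # (time, S(t)) recorded at each event time
    for t in sorted({tt for tt, e in data if e == 1}):
        n_at_risk = sum(1 for tt, _ in data if tt >= t)   # still under follow-up at t
        n_events = sum(1 for tt, e in data if tt == t and e == 1)
        survival *= 1 - n_events / n_at_risk
        curve.append((t, survival))
    return curve

# Hypothetical follow-up: recurrences at months 3 and 6; censoring at months 5 and 12
curve = kaplan_meier([3, 5, 6, 12], [1, 0, 1, 0])
```

Here S(3) = 1 − 1/4 = 0.75 and S(6) = 0.75 × (1 − 1/2) = 0.375; the patient censored at month 5 shrinks the month-6 denominator without counting as a recurrence.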

Results: A total of 191 patients were identified, of whom 52 (27%) received the Gastrografin® Challenge. The mean age was 65 ± 17 years, and 103 (54%) were women. Operative exploration and complications during the initial hospitalization were less common in patients who underwent the Gastrografin® Challenge (27% vs 42%, p=0.045, and 10% vs 39%, p<0.0001, respectively). The duration of the index hospitalization was comparable (9 vs 9 days, p=0.87). Overall survival free of recurrence was 91% (95% CI: 76.5%-96.7%) over 12 months (Figure 1). There were no recurrences at one-year follow-up among patients who received the Gastrografin® Challenge (survival rates: 0% vs 86.9%, p=0.04).

Conclusion: The Gastrografin® Challenge is a safe and promising tool for managing patients with SBO, with fewer explorations during the index admission and lower recurrence rates during follow-up.


50.06 Effect Of Alcohol And Illicit Drug Use In Pediatric Trauma Patients: An Analysis Of The NTDB

H. Aziz1, P. Rhee1, V. Pandit1, M. Khalil1, R. S. Friese1, B. Joseph1  1University Of Arizona,Trauma/Surgery/Medicine,Tucson, AZ, USA

Introduction:

Alcohol misuse is an important source of preventable injuries in the adolescent population. While alcohol screening and brief interventions are required at American College of Surgeons-accredited trauma centers, there is no standard screening method. The aim of our study was to assess the effect of alcohol and illicit drugs on pediatric trauma outcomes.

Methods:

We performed a retrospective analysis of the NTDB. Pediatric patients (age ≤20 years) who were tested for alcohol and/or illicit drugs were included in the analysis. Outcome measures were differences in complications and mortality between the two groups.

Results:

A total of 31,923 pediatric trauma patients were tested for alcohol, 21% of whom tested positive. Of the 17,053 patients tested for illicit drugs, 14% tested positive. Propensity-matched analysis revealed that pediatric patients with alcohol/illicit drug use were more likely to have in-hospital complications (21% vs. 16%; p=0.01) and a higher mortality rate (7% vs. 3%; p=0.001) compared to their counterparts.

Conclusion:

Alcohol and illicit drug use and abuse are associated with significant consequences in the pediatric population. Our study highlights the increasing use of alcohol and illicit drugs in pediatric trauma patients. Strict screening criteria in pediatric trauma patients are warranted.


50.07 Injury Severity Score (ISS) as a Predictor of Perioperative Complications in Open Humerus Fractures

N. N. Branch1,2, A. Obirieze2, R. H. Wilson1,2  1Howard University College Of Medicine,Washington, DC, USA 2Howard University Hospital,Surgery,Washington, DC, USA

Introduction: Patients with open humerus fractures are often subjected to high-velocity forces.  Consequently, they may present with an isolated injury or polytrauma, requiring prioritization of fracture management against the need to stabilize more immediately life-threatening injuries.  The extent of these injuries may intuitively correlate with outcomes; however, not all patients will undergo definitive fixation during their hospitalization.  As such, we sought to determine the relationship between Injury Severity Score (ISS) and perioperative complications after open reduction and internal fixation (ORIF) of traumatic open humerus fractures.

Methods: A retrospective analysis of the National Trauma Data Bank from 2007-2010 utilizing ICD-9 codes was conducted.  Patients >18 years old who underwent open reduction and internal fixation (ORIF) of the humerus at a level I or level II trauma center were included.  Multivariate analyses controlled for covariates including, but not limited to, obesity, congestive heart failure, diabetes, bleeding disorders, age, steroid use, and smoking status.  Univariate and bivariate analyses were also performed.  ISS was stratified into four groups (1: <16, 2: 16-24, 3: 25-75, and 4: unknown), with ISS <16 serving as the reference group.

Results: A total of 5,663 patients met the inclusion criteria.  The majority were white (65%) males (71%) ages 25-44 (39%) with private insurance (26%) whose fracture resulted from blunt trauma secondary to a motor vehicle collision (26%).  The average hospital length of stay was 11.8 days, with a mean of 3.5 days in the intensive care unit.  59% of patients had an ISS of <16.  On multivariate analysis, increasing ISS (groups 2 and 3) increased the odds of developing an organ/space surgical site infection (SSI), sepsis, or any infection (Table 1).  Further, the odds of any perioperative complication, as well as death within 30 days of admission, increased with increasing ISS (Table 1).  Superficial SSI was increased almost five-fold for ISS group 3 (OR: 4.73, CI: 2.3-9.2, p<0.001), and deep SSI was increased more than seven-fold for ISS group 2 (OR: 7.68, CI: 1.31-45.14, p=0.024) compared to an ISS of <16.

Conclusion: Injury Severity Score is an accurate predictor of perioperative complications associated with ORIF of traumatic open fractures of the humerus.  This is particularly true of infectious complications and mortality.  In general, with increasing ISS there is an increase in the odds of developing a perioperative complication.