72.03 Does Community Consultation Reach Patients Likely to be Enrolled in EFIC Studies?

W. C. Beck1, B. A. Cotton1, C. E. Wade1, J. M. Podbielski1, L. Vincent1, D. J. Del Junco1, J. B. Holcomb1, J. A. Harvin1  1University Of Texas Health Science Center At Houston,Houston, TX, USA

Introduction:  Exception from informed consent (EFIC) allows subjects to be randomized prior to obtaining individual or representative consent. EFIC requires extensive community notification through processes such as in-person Community Consultation (CC), random-digit dialing, and announcements in radio, television and print media. The objective of this study was to assess whether a recently conducted trial performed under EFIC reached relevant and affected communities through the CC process alone.

Methods:  A randomized transfusion trial at our center was conducted under EFIC. We held CC meetings with local organizations at 15 sites across the catchment area, which encompasses over 200 zip codes. Zip codes including and adjacent to the CC meeting locations were defined as CC ZIPCODE. We then identified the zip codes in which patients were injured and the zip codes of their stated residence.

Results: There were 1695 patients screened and 107 patients randomized. Among the randomized group, 18% were injured in a CC ZIPCODE (12% among those screened), and 24% had a home zip code in a CC ZIPCODE area (19% among those screened). When CC ZIPCODE status was defined by either injury location or home address, 25% of randomized patients (22% of screened patients) came from CC ZIPCODE areas. CC ZIPCODE patients were more likely than their counterparts to be non-white (56 vs. 40%, p=0.129), to have a penetrating mechanism (48 vs. 26%, p=0.034), and to be transported by ground (49 vs. 27%, p=0.036). There were no differences in study group allocation, transfusion volumes, or mortality.
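
The categorical comparisons above can be reproduced from simple 2x2 tables. The abstract does not report raw counts or name the test used, so the sketch below assumes a chi-square test on illustrative counts for the penetrating-mechanism comparison:

from scipy.stats import chi2_contingency

# Hypothetical 2x2 table (counts are illustrative, not study data):
# rows = CC ZIPCODE vs. non-CC ZIPCODE randomized patients,
# columns = penetrating vs. blunt mechanism.
table = [[13, 14],    # CC ZIPCODE: ~48% penetrating
         [21, 59]]    # non-CC ZIPCODE: ~26% penetrating
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")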

Conclusion: In this EFIC trial, 25% of patients were injured or lived in areas where CC was performed. While CC cannot reach all potential patients in emergency research settings, we have demonstrated that high-risk areas can be identified and targeted to cover patients likely to be screened and enrolled in such studies. Our CC process was able to over-sample and engage communities at risk for health and healthcare disparities.

70.16 Variations in Metastatic Pattern Among Female Breast Cancer Patients: A US Population Based Study

R. Riccardi1,3, S. Patil1, R. S. Chamberlain1,2,3  1Saint Barnabas Medical Center,General Surgery,Livingston, NJ, USA 2University Of Medicine And Dentistry Of New Jersey,Newark, NJ, USA 3Saint George’s University,Grenada, Grenada, Grenada

Introduction: Worldwide statistics reveal that approximately 30% to 70% of female primary breast cancer patients will ultimately develop recurrence and succumb to metastases. The site of recurrence is intimately linked to survival: patients with visceral metastases generally succumb to disease early, while patients with bone metastases can live for years if properly treated. Advances in chemotherapy drugs and regimens, including the addition of targeted agents and immunotherapy, have yielded dramatic results in prolonging survival for metastases to certain sites. This study aimed to characterize patterns of metastatic spread among different age and ethnic groups and to more precisely determine metastatic site-specific survival in metastatic breast cancer using a recent large nationwide cohort.

Methods: Data on all patients with breast cancer were abstracted from the Nationwide Inpatient Sample (NIS) database (2002-2009). Patients with metastatic breast cancer (MBC) were identified using ICD-9 codes 196.0-198.89 (Table 1). Four ethnic groups (Caucasian, African American, Hispanic, and Others), two age groups (<50 years and >50 years), and two outcome groups (Alive and Dead) were compared for differences in metastatic patterns. Standard statistical methods were used for data analysis.

Results: 708,423 patients were analyzed for the incidence of metastatic breast cancer. Overall, metastases to axillary lymph nodes occurred in 24.3%, bone in 4.6%, and the respiratory tract in 2.8%. Both lymph node and extralymphatic metastases were highest among African Americans compared with other ethnic groups. Similarly, Hispanics had higher rates of lymph node and extralymphatic metastases than Caucasians, except for skin, peritoneum, and adrenal metastases. Extralymphatic metastases to both solid and hollow viscera were more common in patients <50 years, except for liver, brain, and ovary metastases. Mortality occurred in 32.9% of patients with metastases to the large intestine, 23.8% with metastases to the respiratory tract, 23.9% to the kidney, and 23.7% to the GIT NEC. On multivariate analysis, metastases to the inguinal lymph nodes, large intestine, liver, respiratory tract, brain, bone, peritoneum, skin, gastrointestinal tract NEC, and thoracic lymph nodes were independently associated with increased mortality.

Conclusion: Metastatic breast cancer, especially extralymphatic metastasis, is more common in the younger population. Metastases to atypical sites such as the inguinal lymph nodes and large intestine are associated with the highest mortality. Increased knowledge of metastatic site-specific survival should be useful for improving breast cancer staging (e.g., perhaps with the addition of M1bone, M1viscera, M1brain) and for better stratifying clinical trial enrollment.

 

70.17 Robotic-Assisted Esophagectomy in Obese Patients

A. Salem1, M. Thau2, K. Meredith1  1University Of Wisconsin,Section Of Surgical Oncology – Division Of General Surgery – The University Of Wisconsin School Of Medicine And Public Health,Madison, WI, USA 2University Of South Florida College Of Medicine,Tampa, FL, USA

Introduction: The impact of body weight on outcomes after robotic-assisted esophageal surgery for cancer has not been studied. We thus examined short-term operative outcomes in patients according to their body mass index (BMI) following robotic-assisted Ivor-Lewis esophagogastrostomy (RAIL) at a high-volume tertiary-care referral cancer center and evaluated the safety of robotic surgery in patients with an elevated BMI.

Methods: A retrospective review of all patients who underwent RAIL for pathologically confirmed distal esophageal cancer was conducted. Patient demographics, clinicopathologic data, and operative outcomes were collected. Current guidelines from the US Centers for Disease Control and Prevention and the World Health Organization define the underweight range as a BMI below 18.5 kg/m², the normal range as a BMI of 18.5 to 24.9 kg/m², the overweight range as a BMI of 25.0 to 29.9 kg/m², and obesity as a BMI of 30 kg/m² and above. Statistics were calculated using the Wilcoxon rank-sum test and the Spearman correlation coefficient, with p ≤ 0.05 considered significant.
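
As a rough illustration of the analysis described above, the sketch below applies the stated BMI cut points and computes a Spearman coefficient and a Wilcoxon rank-sum comparison on simulated data; all values, sample sizes, and variable names are placeholders rather than study data:

import numpy as np
from scipy.stats import spearmanr, ranksums

def bmi_category(bmi):
    # CDC/WHO cut points cited in the abstract
    if bmi < 18.5:
        return "underweight"
    if bmi < 25.0:
        return "normal"
    if bmi < 30.0:
        return "overweight"
    return "obese"

rng = np.random.default_rng(0)
bmi = rng.normal(28, 5, 134)                       # simulated BMI values
or_time = 400 + 3 * bmi + rng.normal(0, 40, 134)   # simulated OR times (minutes)

rho, p = spearmanr(bmi, or_time)                   # BMI vs. OR time
stat, p_rank = ranksums(or_time[bmi >= 30], or_time[bmi < 30])
print(f"Spearman rho = {rho:.2f} (p = {p:.3f}); rank-sum p = {p_rank:.3f}")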

Results: 134 patients (106 men, 28 women) with an average age of 66 years were included. The majority of patients (76%, N=102) received neoadjuvant therapy. When stratified by BMI, five patients were underweight, twenty-nine were normal weight, fifty-eight were overweight, and forty-two were obese. All patients had an R0 resection. Median operating room (OR) time was 421 minutes and median estimated blood loss (EBL) was 160 cc. BMI significantly correlated with longer OR time (coefficient = 0.23; p=0.01) and higher EBL (coefficient = 0.17; p=0.05).

Conclusion: This retrospective study shows that patients with distal esophageal cancer and an elevated BMI who undergo RAIL have increased operative times and EBL during the procedure.
 

70.18 Should We Operate for an Intra-abdominal Emergency in the Setting of Disseminated Cancer?

C. L. Scaife1, K. C. Hewitt1, X. Sheng2, K. W. Russell1, M. C. Mone1  1University Of Utah,General Surgery / Surgery,Salt Lake City, UT, USA 2University Of Utah,Biostatistics / Pediatrics,Salt Lake City, UTAH, USA

Introduction:
Patients with an advanced cancer diagnosis who develop an intra-abdominal surgical emergency pose a medical decision-making dilemma. Because patient survival is already limited by the advanced malignancy, a surgical intervention may be futile and costly. Patients, families, and medical providers need more accurate data to assist with these difficult treatment decisions. This study was designed to establish morbidity and mortality rates and to identify preoperative risk factors that may predict a poorer outcome.

Methods:

The national NSQIP database was queried for patients with disseminated cancer undergoing emergent abdominal surgery (2005-2012). Preoperative NSQIP variables were used to build prediction models for 30-day major morbidity and mortality; example variables included functional status, ASA class, weight loss, renal function, age, albumin, sepsis, hematocrit (HCT), and pre-operative cardiopulmonary comorbidities. A tree model and logistic regression were employed to identify factors associated with these outcomes. The analysis was carried out on a training dataset, and model performance (misclassification rate) was then evaluated on a validation dataset.
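
A minimal sketch of this modeling strategy (a classification tree and a logistic regression fit on a training set and scored by misclassification rate on a validation set), using scikit-learn; the file name and column names are placeholders, not the authors' actual variables or code:

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

# Hypothetical extract: one row per patient, numeric preoperative variables
# plus a 30-day mortality indicator.
df = pd.read_csv("disseminated_cancer_emergencies.csv")
X = df[["asa_class", "functional_status", "albumin", "hct", "sepsis", "age"]]
y = df["mortality_30d"]

X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

tree = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
logit = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Misclassification rate on the validation set, as described in the abstract
for name, model in [("tree", tree), ("logistic regression", logit)]:
    err = 1 - model.score(X_valid, y_valid)
    print(f"{name}: validation misclassification rate = {err:.2f}")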

Results:

Among patients with an abdominal surgical emergency and disseminated or Stage IV cancer, the overall major surgical morbidity rate was 47% and the surgical mortality rate was 26%. The tree model for morbidity showed that sepsis or a hematocrit <29.6 was predictive of major morbidity (error rate 36%). The tree model for mortality showed that an ASA score of 4 or 5 combined with a totally dependent functional status was predictive of mortality (error rate 24%).

Conclusion:

The decision to operate for an intra-abdominal emergency in the setting of disseminated cancer is difficult when considering the morbidity to the patient, the expected cancer-specific and post-operative survival, and cost. Our study confirms that the risk of surgical morbidity and peri-operative death in this population is very high. Preoperative patient factors, including sepsis, ASA class, anemia, and functional dependence, all strongly predict poor patient outcomes. We have also developed further logistic regression models, derived from this database, to provide detail to help with decisions related to care.

70.19 Sarcopenia as a Predictor of Outcomes of Palliative Surgery

A. M. Blakely1, S. Brown2, D. J. Grand2, T. J. Miner1  2Brown University School Of Medicine,Department of Radiology,Providence, RI, USA 1Brown University School Of Medicine,Department Of Surgery,Providence, RI, USA

Introduction:

Sarcopenia, defined as the degenerative loss of muscle quality with age, has been shown to be associated with worse outcomes following resection of various cancer types. Sarcopenia is usually assessed by evaluating psoas muscle density and/or cross-sectional area at the level of the third or fourth lumbar vertebra. To date, sarcopenia has not been evaluated as a predictor of outcomes following palliative surgery.

 

Methods:

A retrospective analysis of a prospectively maintained database of all patients who underwent a palliative operation from January 2004 to December 2012 was performed. Patients with a pre-operative abdominopelvic computed tomography (CT) scan and follow-up of at least 6 months were selected for analysis. CT scans were evaluated by a resident and an attending radiologist for mean total psoas muscle cross-sectional area and density. Time from CT to operation, primary tumor type, and post-operative complications were recorded.

 

Results:

From January 2004 to December 2012, 57 patients were identified as having undergone a palliative procedure with a prior abdominopelvic CT scan available for evaluation. Of the 57 patients, 27 (47.4%) were male, and the median age was 63 years. Median time from CT scan to operation was 10 days. Regarding primary tumor type, 15 (26.3%) were pancreas, 9 (15.8%) colon, 7 (12.3%) stomach, 4 (7.0%) lung, 3 (5.3%) ovary, 3 (5.3%) cholangiocarcinoma, and 16 (28.1%) other. Thirty-day post-operative morbidity was 31.6% and mortality was 5.3%. Mean total psoas area was 1622.8 mm² (range 785.3 to 3641.3 mm², standard deviation (SD) 586.6), and mean psoas density was 49.8 Hounsfield units (HU) (range 27.1 to 69.8 HU, SD 9.9). Increasing age was associated with decreased total psoas area (p=0.0027) and decreased psoas density (p=0.0005). However, neither total psoas area nor density was associated with the development of complications (p=0.88 and p=0.48, respectively).

 

Conclusions:

Sarcopenia, assessed by either psoas muscle area or density, was not associated with complications following palliative surgery. Patient selection for palliative surgery continues to be challenging. Sarcopenia may retain some value in the pre-operative assessment of advanced cancer patients as a component of a frailty index.

70.20 Secular Trends in Morbidity and Mortality Among Surgical Patients with Disseminated Malignancy.

S. Bateni1, F. J. Meyers3, R. J. Bold2, R. J. Canter2  1University Of California – Davis,General Surgery,Sacramento, CA, USA 2University Of California – Davis,Surgical Oncology,Sacramento, CA, USA 3University Of California – Davis,School Of Medicine,Sacramento, CA, USA

Introduction: There is substantial risk of acute morbidity and mortality following surgical intervention for patients with disseminated malignancy. However, concerns also exist regarding the risks of untreated surgical disease among these patients. We sought to characterize temporal trends in morbidity and mortality among disseminated malignancy patients, hypothesizing that surgical intervention would remain a prevalent modality among these patients.

Methods: We queried the American College of Surgeons National Surgical Quality Improvement Program (NSQIP) from 2006 to 2010 for patients with disseminated malignancy. After excluding patients undergoing a primary hepatic operation, we identified 21,755 patients. Parametric and non-parametric statistics were used to evaluate the association of patient characteristics and surgical interventions with 30-day morbidity and mortality outcomes. Logistic regression analysis was performed to identify independent predictors of 30-day morbidity and mortality.

Results: The prevalence of surgical intervention for disseminated malignancy was stable at 1.9-2.2% of all NSQIP procedures per year. Among disseminated malignancy patients, the most frequent operations performed were bowel resections (24.7%), varied gastrointestinal procedures (22.0%), and multivisceral resections (13.7%). The rates of bowel resection, celiotomy/lysis of adhesions, and appendectomy/cholecystectomy showed small but statistically significant decreases over time (26.1 vs 22.6%, 8.4 vs 5.8%, and 6.6 vs 2.9%, respectively, p<0.001). The rate of emergency operations also decreased over the study period (17.4 vs 15.0%, p<0.005). In contrast, the rate of preoperative independent functional status rose (82.3 vs 86.1%, p<0.001), while the rates of preoperative weight loss (14.4 vs 12.8%) and sepsis (20.6 vs 15.7%) decreased (p<0.005). Rates of 30-day morbidity (30.1 vs 23.5%), serious morbidity (16.1 vs 10.6%), and mortality (10.4 vs 9.3%) all decreased over the study period (p<0.05). On multivariate analysis, male sex, BMI, age, impaired functional status, weight loss, pre-operative sepsis, leukocytosis, elevated creatinine, anemia, and hypoalbuminemia all predicted worse 30-day morbidity and mortality. The absence of a DNR order was associated with greater morbidity, while the presence of a DNR order was associated with higher mortality.

Conclusion: Although 30-day morbidity, serious morbidity, and mortality have decreased for patients with disseminated malignancy undergoing surgical intervention, they remain elevated, and surgical intervention remains prevalent in patients with incurable malignancy. These data highlight the importance of careful patient selection and an evaluation of the goals of therapy among this patient population.

71.01 Comparing African-American (AA) and Non-AA Perceptions on the Benefits of LDKT in South Carolina

D. Tagge1, V. Phan1, A. Wilton1, J. Rodrigue2, D. Taber1, K. Chavin1, P. Baliga1  1Medical University Of South Carolina,Division Of Transplant Surgery,Charleston, SC, USA 2Beth Israel Deaconess Medical Center,The Transplant Institute,Boston, MA, USA

Introduction:  Live-donor kidney transplantation (LDKT) is the best treatment option for end-stage renal disease (ESRD). Thus, educational practices concerning LDKT should be evaluated in order to optimize the message that health care providers deliver to patients and their caregivers.

Methods:  Anonymous questionnaire data were collected from recent post-transplant recipients. These patients had previously received formal education about LDKT during their initial transplant evaluation. Their knowledge of the possible benefits of LDKT compared with deceased donor kidney transplantation (DDKT), and whether they wanted additional information on these benefits, was collected and analyzed.

Results: 45 participants were included in the analysis; both groups (AA and non-AA) were similar in baseline demographics (Table 1). Both AA and non-AA patients reported a similar frequency of talking to transplant nephrologists about LDKT; however, AAs reported talking to their dialysis doctors about LDKT significantly more often. AA and non-AA patients similarly recalled the potential benefits of LDKT compared to DDKT. However, further questioning revealed that AA patients desired more information on a variety of the benefits of LDKT compared to DDKT (Table 1).

Conclusion: These survey results suggest there are ethnic differences in how ESRD patients are educated about the benefits of LDKT, with AA patients having an increased need for relevant information pertaining to LDKT benefits.

 

71.02 The Role of Complement Fixing Donor Specific HLA Antibodies in Liver Transplantation

D. W. Forner1, R. G. Sperry2, R. S. Liwski2,3, I. P. Alwayn1,2,3  1Dalhousie University,Department Of Surgery,Halifax, Nova Scotia, Canada 2Dalhousie University,Department Of Pathology,Halifax, Nova Scotia, Canada 3Dalhousie University,Department Of Microbiology & Immunology,Halifax, Nova Scotia, Canada

Introduction:
For many forms of end-stage liver disease, transplantation is the only available therapy. While current outcomes represent a vast improvement over initial liver transplants, there is room for improvement; rates of acute rejection remain around 20%, with rates of chronic rejection only slightly lower. In particular, the usefulness of donor specific HLA antibody (DSA) as a predictor of liver rejection is controversial. Therefore, this study aimed to evaluate the utility of both standard single antigen bead (SAB) assay and a modified C1q assay in the prediction of liver transplant outcomes.

Methods:
This study is a retrospective chart review of liver transplant patients coupled with assaying of patient sera by both standard SAB assay and modified C1q assay. Patient records were reviewed from January 1st 2009 to December 31st 2013.

Results:

In total, 100 patients who underwent liver transplantation met the inclusion criteria and had pre-transplant sera tested for DSAs by SAB assay. Twenty-eight patients with confirmed or suspected DSA by SAB assay underwent screening by the modified C1q assay.

Twenty-two patients (22%) experienced biopsy-proven acute cellular rejection (ACR). One patient experienced biopsy-confirmed chronic rejection (CR; 1%). An additional 13 individuals experienced suspected rejection (13%).

Twenty-five patients were DSA positive by SAB assay (DSASAB; 25%). These patients were more likely to experience ACR than those without DSASAB (36% vs 17%, p < 0.05). The positive predictive value was 41% and the negative predictive value was 80%.

Examination by the modified C1q assay found that thirteen patients possessed complement-activating DSAs (DSAC1q; 13%). Patients with DSAC1q were not more likely to experience ACR than those without (8% vs 24%, p > 0.05).

Conclusion:
Patients with DSA detected using the standard SAB assay were more likely to experience ACR. However, the predictive value of this assay was limited, with nearly two thirds of patients possessing DSASAB experiencing no rejection. Despite the hypothesized advantage of the C1q assay in predicting rejection, patients with complement-fixing DSA were not more likely to experience rejection. This finding is in agreement with a recent study by Kubal et al. (2013) examining the clinical utility of the C1q assay. In conclusion, this study offers further evidence that DSA may play a role in liver transplantation outcomes, and larger studies should be carried out to assess the role of complement-fixing antibodies in liver transplantation.

 

71.03 Transversus Abdominis Plane Block vs. Local Wound Infiltration in Laparoscopic Live Donor Nephrectomy

E. W. Kabil1, P. Baliga1, D. Taber1, E. S. Clemmons1, K. Chavin1  1Medical University Of South Carolina,Transplant Surgery,Charleston, SC, USA

Introduction:  Prior research demonstrates that patients with a transversus abdominis plane (TAP) block may require fewer combined oral and intravenous narcotics after laparoscopic donor nephrectomy (LDN) than patients without a TAP block, but these studies are limited by size, scope, and follow-up.

Methods:  This was a retrospective analysis comparing the narcotic analgesic requirements and clinical outcomes of patients who underwent a TAP block procedure with those of patients who received usual care, consisting of local wound infiltration with Marcaine 0.05%. Adult donors undergoing surgery between 2010 and 2013 were included. The primary endpoint was the amount of narcotic analgesics (measured in morphine equivalents) required by patients postoperatively until discharge home. Secondary endpoints included post-operative day 1 serum creatinine, hospital length of stay, and complications.

Results:  60 patients, 15 (25%) of whom received a TAP block, were included in the analysis. Baseline demographics were similar between groups. There was a trend toward reduced in-hospital morphine equivalent (ME) use in the TAP group (median of 31.30 mg versus 40.00 mg in the usual care group, p = 0.164). Hospital length of stay was similar for both groups (median of 3 days, p = 0.722). Serum creatinine on post-operative day 1 was slightly lower in the TAP group (median of 1.2 mg/dL versus 1.3 mg/dL in the usual care group, p = 0.064). Overall complication rates were significantly lower in the TAP group, with zero complications versus 13 in the usual care group (p = 0.0187).

Conclusion:  ME use was lower in the TAP group, but the difference was not statistically significant. However, the power of this study (calculated to be approximately 20%) was limited by the low number of patients in the TAP group. The clinical significance of these findings warrants further investigation, as evidence of an improvement in pain control for live donor patients would help to alleviate a significant disincentive to donation.
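
The roughly 20% power cited above can be approximated with a standard two-sample power calculation. The sketch below uses statsmodels with an assumed standardized effect size (the abstract does not report one), so the result is only a ballpark check:

from statsmodels.stats.power import TTestIndPower

effect_size = 0.35   # assumed Cohen's d; not reported in the abstract
power = TTestIndPower().solve_power(effect_size=effect_size, nobs1=15,
                                    ratio=45 / 15, alpha=0.05, power=None)
print(f"approximate power with 15 vs. 45 patients = {power:.2f}")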

 

71.04 Readmission after Surgery for Carotid Artery Stenosis in Pennsylvania.

E. Kenning1, C. Hollenbeak1,2, D. Han1,3  1Penn State Hershey Medical Center,Department Of Surgery,Hershey, PA, USA 2Penn State Hershey Medical Center,Division Of Outcomes Research And Quality,Hershey, PA, USA 3Penn State Hershey Medical Center,Heart And Vascular Institute, Division Of Vascular Surgery,Hershey, PA, USA

Introduction: Carotid artery stenosis can precipitate devastating neurologic outcomes without intervention.  Carotid endarterectomy (CEA) and carotid artery stenting (CAS) are two proven means of intervening on this disease process.  In a health care milieu in which 30-day unplanned readmissions portend a negative financial impact for hospitals, it is necessary to understand the rates of and risk factors for readmission following these procedures.  We hypothesized that no difference in 30-day readmission rates would exist between these two procedures, in spite of differences that might exist between the two patient populations.

Methods: Using the Pennsylvania Health Care Cost Containment Council (PHC4) database, we identified 4,319 people who underwent CEA (N=3,640) or CAS (N=679) in Pennsylvania in 2011.  Univariate analyses were performed to compare patient characteristics and outcomes between patients who underwent CEA and those who underwent CAS.  Logistic regression was used to estimate the effect of intervention on 30-day readmission, after controlling for potential confounders.  Time to readmission was analyzed using the Kaplan-Meier method.
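
A sketch of the time-to-readmission analysis using the lifelines package; the data file and column names are hypothetical, and a log-rank test is assumed for the group comparison since the abstract does not name the test behind its p-value:

import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("phc4_carotid_2011.csv")          # hypothetical extract
cea = df[df["procedure"] == "CEA"]
cas = df[df["procedure"] == "CAS"]

kmf = KaplanMeierFitter()
for name, grp in [("CEA", cea), ("CAS", cas)]:
    kmf.fit(grp["days_to_readmission"], event_observed=grp["readmitted_30d"],
            label=name)
    print(name, "30-day readmission-free probability:", kmf.predict(30))

result = logrank_test(cea["days_to_readmission"], cas["days_to_readmission"],
                      event_observed_A=cea["readmitted_30d"],
                      event_observed_B=cas["readmitted_30d"])
print(f"log-rank p = {result.p_value:.2f}")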

Results: Patients who underwent CEA and CAS were similar with a few exceptions, including age, race, admission type, and comorbid conditions such as CHF, hemiplegia and paraplegia, and renal disease.  The unadjusted rate of 30-day readmission was 9.37% for CEA and 10.75% for CAS (p=0.26).  After controlling for patient and procedure characteristics, differences between 30-day readmission rates were still not statistically significant (odds ratio=1.13; p=0.39).  Finally, time to readmission was similar for those who underwent CEA and those who underwent CAS (p=0.19) (Figure).

Conclusion: Readmission rates following CEA and CAS for carotid artery stenosis are approximately 10%.  There are few differences between patients with carotid stenosis who are selected for endarterectomy and stenting, and the choice of procedure does not appear to be associated with different readmission rates or time to readmission, even after controlling for patient characteristics.

 

71.05 Ruptured Abdominal Aortic Aneurysms and Type of Endovascular Repair

A. Cha1, V. Dombrovskiy1, N. Nassiri1, R. Shafritz1, S. Rahimi1  1Robert Wood Johnson – UMDNJ,Vascular Surgery,New Brunswick, NJ, USA

Introduction: Endovascular repair of ruptured abdominal aortic aneurysms has shown improved outcomes compared to open repair. Endovascular repair can be accomplished via a bifurcated device or an aorto-uni-iliac device with a femoral-femoral crossover graft. The purpose of this study was to evaluate these two endovascular procedures in the treatment of ruptured abdominal aortic aneurysms in the Medicare population.

Methods: Data were queried from the Medicare MedPAR and Carrier files (2005-2009) using appropriate ICD-9-CM and CPT-4 codes. Rates of major postoperative complications, hospital and 30-day mortality, hospital length of stay and cost, and 1-year survival were analyzed and compared. Chi-square and Wilcoxon rank-sum tests, Kaplan-Meier survival curves, and Cox proportional hazards modelling were used for statistical analysis.
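
The Cox proportional hazards modelling mentioned here can be sketched with the lifelines package as below; the file name and covariates are placeholders chosen for illustration rather than the study's actual model:

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical extract: one row per patient with follow-up time in days,
# a death indicator, a 0/1 flag for aorto-uni-iliac (vs. bifurcated) device,
# and age as an example covariate.
df = pd.read_csv("medpar_raaa_evar.csv")
cph = CoxPHFitter()
cph.fit(df[["survival_days", "died", "aui_device", "age"]],
        duration_col="survival_days", event_col="died")
cph.print_summary()   # hazard ratios with 95% confidence intervals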

Results: We identified 914 patients with a bifurcated device and 77 with an aorto-uni-iliac device. Patients with an aorto-uni-iliac device had greater rates of postoperative complications than those with a bifurcated device (overall, 74.0% vs 62.8%; P=0.049), including cardiac (22.1% vs 12.9%; P=0.024) and renal (36.4% vs 25.1%; P=0.03) complications and ischemic colitis (6.5% vs 0.6%; P<0.0001). They also tended toward greater rates of infectious complications (29.9% vs 20.5%; P=0.052), including sepsis (14.3% vs 8.1%; P=0.06), and of embolism and thrombosis of lower extremity arteries (5.2% vs 1.8%; P=0.06). Patients with an aorto-uni-iliac device had greater hospital (37.7% vs 23.3%; P=0.005) and 30-day mortality rates (41.6% vs 25.4%; P=0.002) and poorer 1-year survival (P=0.01). They had a longer hospital length of stay (median 7 days vs 5 days; P=0.04) and tended toward greater cost (median $117,483 vs $101,074; P=0.07).

Conclusion: Use of an aorto-uni-iliac device rather than a bifurcated device for endovascular repair of ruptured abdominal aortic aneurysms significantly increases complication rates, mortality, and hospital resource utilization. This is most likely due to more unstable health conditions and more challenging anatomy in patients undergoing repair with an aorto-uni-iliac device. Endovascular repair with a bifurcated endoprosthesis should be performed for ruptured abdominal aortic aneurysms when feasible.

 

71.06 Gender and Frailty Independently Predict Morbidity and Mortality in Infrainguinal Vascular Procedures

R. Brahmbhatt1, L. Brewster1,2, S. Shafii1,3, R. Rajani1,3, R. Veeraswamy1, A. Salam1,2, T. Dodson1, S. Arya1,2  1Emory University School Of Medicine,Atlanta, GA, USA 2Atlanta VAMC,Decatur, GA, USA 3Grady Memorial Hospital,Atlanta, GA, USA

Introduction:  Women are a high-risk group for postoperative complications and mortality after infrainguinal vascular interventions. Frailty has been shown in recent studies to be an independent risk factor for postoperative morbidity and mortality. This study examines the interplay of gender and frailty as well as their effect on outcomes following infrainguinal vascular procedures.

Methods:  The NSQIP database was used to identify all patients who underwent infrainguinal vascular procedures from 2005-2012 (endovascular procedures recorded after 2010). Preoperative frailty was measured using the modified frailty index (mFI; derived from the Canadian Study of Health and Aging). Univariate and multivariate analyses were performed to investigate the association of preoperative frailty and gender with postoperative complications and death.
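
The modified frailty index is typically computed as the number of deficits present divided by the 11 items mapped from the Canadian Study of Health and Aging. The abstract does not list the specific NSQIP variables, so the sketch below uses placeholder deficit names:

# Placeholder deficit flags for a single patient (1 = present, 0 = absent);
# the 11 real mFI items are NSQIP comorbidity and functional-status variables.
deficits = {
    "deficit_01": 1, "deficit_02": 0, "deficit_03": 1, "deficit_04": 0,
    "deficit_05": 0, "deficit_06": 1, "deficit_07": 0, "deficit_08": 0,
    "deficit_09": 0, "deficit_10": 0, "deficit_11": 0,
}

mfi = sum(deficits.values()) / len(deficits)   # e.g., 3/11 ≈ 0.27
frail = mfi > 0.2                              # threshold used in the abstract
print(f"mFI = {mfi:.3f}, frail = {frail}")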

Results: Of 24,645 patients who underwent infrainguinal vascular procedures (92% open, 8% endovascular), there were 533 deaths (2.2%) and 6198 (25.1%) major complications within 30 days postoperatively. The population included 8,868 (36%) women, with a mean mFI [range 0-11] of 0.269 compared with 0.259 in males (p<0.001). Women as well as frail patients (mFI>0.2) were more likely to have a major morbidity (p<0.001) or mortality (p<0.001) in univariate analysis. Frail women had the highest risk of death and major complications [Figure 1]. On multivariate logistic regression analysis, female gender [odds ratio (OR) 1.23; 95% confidence interval (CI) (1.1, 1.6)], age [OR 1.05, 95% CI (1.04-1.06)], ASA class [OR 2.0, 95% CI (1.6-2.5)], and increasing mFI [OR 1.25, 95% CI (1.15-1.4)] were significantly associated with mortality (p<0.05) after adjusting for other covariates. Major complications were also significantly associated (p<0.05) with female gender [OR 1.4, 95% CI (1.3-1.5)], ASA class [OR 1.3, 95% CI (1.17-1.4)], and increasing mFI [OR 1.12, 95% CI (1.1-1.16)] after adjusting for other covariates. In multivariate analysis stratified by gender, the presence of frailty conferred slightly higher odds of death in women than in men (OR 1.75 vs. OR 1.6), whereas the increased odds of complications with frailty were similar in both genders (OR 1.3).

Conclusion: Women undergoing infrainguinal procedures are more frail than men. Female gender and frailty are both associated with increased risk of complications and death following infrainguinal vascular procedures after controlling for other covariates in multivariate analysis. Further studies are needed to explore the interaction of gender and frailty and its effect on long term outcomes for peripheral vascular disease.

 

71.07 Carbon Dioxide Angiography Protects Renal Function Without Losing Efficacy in Tibial Interventions

K. S. Lavingia1, C. Chipman1, S. S. Ahanchi1, J. M. Panneton1  1Eastern Virginia Medical School,Division Of Vascular Surgery,Norfolk, VA, USA

Introduction:

Our aim was to analyze the practical impact of CO2 enhanced angiography compared to conventional contrast angiography on interventional outcomes and renal function.

Methods:

A retrospective review of tibial interventions performed between 2008 and 2010 in patients with lower-limb ischemia was completed. Patients were divided into a CO2 enhanced group and an Iodixanol group. Outcomes were compared using Student's t-test, chi-squared analysis, and Kaplan-Meier curves.

Results:

We reviewed 463 tibial interventions (87% critical limb ischemia and 13% claudication). All patients were subdivided by stage of chronic kidney disease (CKD): CKD1=86, CKD2=131, CKD3=110, CKD4=32, CKD5=103, and 1 with incomplete laboratory data. The interventions were then divided into 56 with CO2 enhanced imaging and 407 with traditional Iodixanol contrast imaging. After exclusion of CKD5 patients, mean contrast volume was 31 mL ± 31 for the CO2 enhanced group versus 104 mL ± 65 for the Iodixanol group, p<.001.

The mean age of the entire cohort was 71.6 ± 12 years, and 58% were male. There was a statistically higher proportion of renal insufficiency (RI) in the CO2 enhanced group (48.2%) than in the Iodixanol group (9.2%), p=.02. The remaining demographics and risk factors were not statistically different between the two groups.

CKD5 patients were also excluded from the renal function analysis. The CO2 enhanced group had a higher mean preoperative creatinine (pre-Cr) than the Iodixanol group (1.7 versus 1.1, p<.001) and a higher mean postoperative creatinine (post-Cr) (1.7 versus 1.2, p=.04). However, when comparing the change in creatinine (ΔCr), calculated as post-Cr minus pre-Cr for each patient, the mean ΔCr was not significantly different; the entire cohort had a mean decrease in creatinine of 0.02.

Average follow-up for the entire cohort was 18 months ± 15. Kaplan-Meier curves at 12 and 36 months showed no significant difference between the Iodixanol and CO2 enhanced groups for primary patency (66%, 55% versus 75%, 69%; p=.212), primary assisted patency (85%, 76% versus 92%, 88%; p=.277), secondary patency (95%, 93% versus 97%, 97%; p=.501), limb salvage (85%, 78% versus 97%, 85%; p=.094), or reintervention rate (28% versus 21%, p=.34).

Conclusion:

Our results dispute the common perception that CO2 enhanced imaging protects renal function at the expense of image quality and ultimately primary outcomes in the tibial territory.  CO2 enhanced angiography reduced the total contrast load and renal insult, without compromising outcomes, supporting the broader application of CO2 enhanced imaging during endovascular tibial interventions. 

71.08 Is Routine Patching Necessary Following Carotid Endarterectomy?

C. Rivera1, N. J. Gargiulo1  1North Shore University And Long Island Jewish Medical Center,Manhasset, NY, USA

Introduction:   Routine patching and periodic postoperative duplex surveillance have been widely advocated to achieve optimal results after carotid endarterectomy (CEA). The present 21-year single-surgeon experience evaluates the long-term outcome of CEA with selective patching and without routine postoperative duplex evaluation.
 

Methods:   An IRB-approved retrospective review of all CEAs performed by a single surgeon over a 21-year period (1984-2005) was conducted. Preoperative imaging studies, operative reports, physical findings, and co-morbid conditions, as well as pre- and postoperative medications, were evaluated. Patients who underwent follow-up duplex examinations form the basis of this review. Restenosis was defined by angiographic criteria suggesting an 80% or greater diameter reduction requiring re-intervention.
 

Results: Over the 21-year period, 384 CEAs were performed using a selective patch technique depending on gender, internal carotid artery diameter, cardiovascular risk factors, and the preoperative arteriogram. Eighty (20.8%) patients had duplexes performed at this institution as part of their regular follow-up. The remaining 304 patients had clinical follow-up on a yearly basis. Ten of eighty (12.5%) had prosthetic or vein patch closure and seventy of eighty (87.5%) underwent primary closure. The mean follow-up was 49.5 months, with a range of 1 to 237 months. Restenosis in the patch group was zero of ten (0%). Sixty-six of seventy (94.2%) patients in the primary closure group did not show any evidence of restenosis. Four patients (5.8%) had arteriographically proven greater than 90% stenosis and required repeat CEA. The remaining 304 patients without routine postoperative duplex remained neurologically asymptomatic (mean follow-up 10.3 years, range 2.5 to 17 years).
 

Conclusion:  In this experience, there was no statistically significant difference in restenosis between the primary closure and selective patch groups following CEA. Additionally, the absence of routine postoperative duplex did not change the clinical outcome in the majority of patients. Although this data set is a small, single-center, single-surgeon retrospective review, it does not support the generally well-accepted practice of routine patching following CEA.

 

 

70.02 Use of Bicaval Dual-Lumen Cannula Improves Survival to Decannulation in Venovenous ECMO

S. C. Bennett1, R. Tripathi3, A. Kilic2, A. Flores3, T. Papadimos3, D. Hayes4, R. S. Higgins1,2, B. A. Whitson2  1Ohio State University,General Surgery,Columbus, OH, USA 2Ohio State University,Cardiothoracic Surgery,Columbus, OH, USA 3Ohio State University,Anesthesiology,Columbus, OH, USA 4Nationwide Children’s Hospital,Pulmonary Medicine,Columbus, OH, USA

Introduction:

Severe acute respiratory failure has a mortality rate of 40 to 60%. Venovenous extracorporeal membrane oxygenation (VV-ECMO) has emerged as an increasingly popular treatment modality, showing a significant survival benefit in acute respiratory failure compared with traditional ventilatory methods. Cannulation for VV-ECMO has traditionally consisted of two separate cannulas. We sought to evaluate the effect of the new-generation bicaval dual-lumen cannula (BCDLC) on survival and length of stay compared with single-lumen cannulation.

Methods:

A prospectively maintained, institutional database of all patients undergoing VV-ECMO from January, 2011 through May, 2014 was retrospectively reviewed.  Those patients who had respiratory failure associated with cardiac surgery or those converted from venoarterial (VA) ECMO to VV-ECMO were excluded.  The technique for bicaval dual-lumen cannulation was as follows.  A guidewire was inserted into the inferior vena cava (IVC) via the right internal jugular vein using sterile Seldinger technique and fluoroscopic guidance.  The BCDLC was then placed over the wire using serial dilations, and its final position was confirmed by transesophageal echocardiography and fluoroscopy. Femoral cannulation was performed with standard Seldinger technique without initial image guidance. 

Results:

During the time period reviewed, 36 patients underwent VV-ECMO. Exclusion criteria were met by 4 patients. Dual-lumen cannulation was used in 12/32 (37.5%) patients. There were no significant differences in age or gender between the two groups. The BCDLC cohort had a lower mean number of days on ECMO (12.33 vs 13.3 in the single-lumen group, p=0.442). Mean length of stay was 26.33 days for the BCDLC cohort compared with 25.35 days in the single-lumen group (p=0.561). Survival to decannulation was 11/12 (91.67%) in the BCDLC cohort compared with 11/20 (55%) in the single-lumen group (p=0.03). Survival to discharge was 9/12 (75%) for the BCDLC cohort and 9/20 (45%) for the single-lumen group (p=0.098). Early or concomitant tracheostomy was performed in 7/12 (58.33%) of the BCDLC cohort compared with 2/20 (10%) of the single-lumen group. During the time period reviewed there was only one complication from dual-lumen cannula insertion (right atrial perforation); our incidence of right heart cavity perforation, 1/12 (8.3%), was therefore consistent with the 4-15% incidence reported in the literature.

Conclusion:

BCDLC use showed better outcomes in terms of ECMO duration, length of stay, and survival to decannulation. This warrants further comparison of these two groups as well as further data collection pertaining to differences in sedation and mobility. Further efforts need to be made to achieve early mobilization in our adult population, and additional outcomes data will need to be assessed as ambulatory ECMO technology evolves.

70.03 Off-Pump Right Atrial Surgery – Adult Vena Caval Inflow Occlusion in Right-Sided Cardiac Lesions

A. Torabi1, N. Dobrilovic2, J. Raman2  1Indiana University School Of Medicine,Indianapolis, IN, USA 2Rush University Medical Center,Chicago, IL, USA

Introduction:  Vena caval inflow occlusion (VCIO) is an old technique that has been used with success in the pediatric population. Few reports exist of its use in adults. We report on the use of inflow occlusion in removing infected material and complex masses from the right side of the heart.

Methods:  Between January 1999 and July 2014, thirty-five patients at three hospitals in Australia and North America presented either with right-sided endocarditis, worsening respiratory status, and systemic sepsis in spite of maximal medical therapy, or with organized sterile masses with embolic potential in the right side of the heart. Twenty of them were immunosuppressed due to concomitant medical conditions or malignancy. Fifteen of these patients were on hemodialysis with organized thrombi related to long-standing access catheters. Twelve patients had heparin-induced thrombocytopenia. Under VCIO, tricuspid vegetectomy was performed in 23 patients, closure of a patent foramen ovale (PFO) in four, and tricuspid valve repair in three. Removal of infected pacing leads was performed in seven and removal of a migrated IVC filter in one. Eighteen patients had a single 2-minute period of VCIO, while the others had additional periods of VCIO after 5 minutes of reperfusion. The first 23 procedures were performed through a sternotomy with bilateral decortication of the lungs. Nine patients underwent right mini-thoracotomy, two of them in the setting of redo cardiac surgery. One patient had a repeat VCIO procedure through a sternotomy after a previous thoracotomy.

Results: There were no deaths.  All patients had resolution of sepsis.  Three patients had moderate tricuspid regurgitation (TR), while the others had trivial to mild TR.  One patient had a transient neurological deficit post-operatively and one patient required a late decortication of an empyema.

Conclusion: Tricuspid and right atrial procedures can be performed safely off-pump using vena caval inflow occlusion in high-risk patients.

 

70.04 Treating Ascending Aortic Dissections: Comprehensive Care from the Emergency Department to Surgery

J. B. Grau1,2, C. E. Kuschner1, G. Ferrari1,2, R. E. Shaw1, J. Romeo1, M. E. Brizzio1, J. Yallowitz3, A. Zapolanski1  1The Valley Columbia Heart Center, Columbia University College Of Physicians And Surgeons,Ridgewood, NJ, USA 2University Of Pennsylvania School Of Medicine,Philadelphia, PA, USA 3The Valley Columbia Hospital,Emergency Department,Ridgewood, NJ, USA

Introduction:

Patients with Ascending Aortic Dissections (AAD) are at high risk of mortality unless they are diagnosed early and critical treatment is initiated quickly. Achieving optimal clinical outcomes in these patients requires the collaboration of many practitioners and an enhanced awareness of AAD. We present the surgical results of 11 years of AAD treatment after the initiation of a program that incorporates the emergency, radiology, and cardiothoracic surgery departments, with the goal of early identification leading to efficient and timely treatment of AAD.

Methods:

From January 2002 to December 2013, 55 patients with the diagnosis of AAD were treated at our institution. There were 38 males and 17 females with a mean age of 62.3 years. Early diagnosis was accomplished in the Emergency Department (ED) through staff education, implementation of standard protocols, and knowledge of presenting signs and symptoms, which created a heightened level of awareness for the diagnosis of AAD during routine triage. The Department of Radiology was prepared to accept these cases immediately and perform emergency computed tomographic scans. The cardiothoracic surgery team was mobilized early in this process and became involved when the patient was en route to the ED and the diagnosis was suspected. The surgical approaches performed were ascending and hemiarch replacement with or without aortic valve repair or replacement (AVR), with or without coronary artery bypass (CABG). Cerebral protection was achieved using antegrade perfusion via axillary cannulation with moderate hypothermic circulatory arrest.

Results:

A majority of patients were male (69%), and 20% had undergone prior cardiac operations. Cardiogenic shock was present in 24.4%. A small proportion of patients experienced post-operative stroke (7.3%) or renal failure (9.1%). Although the occurrence of cardiogenic shock was high in this group, the overall mortality was only 9.1%. Mortality in reoperative cases was 27.3%, and the highest mortality was associated with patients who required the most extensive procedures (AAD+AVR+CABG). Table 1 presents the demographics and surgical outcomes for the 3 different surgical approaches.

Conclusion:

The implementation of a multidisciplinary aortic program at our institution has allowed the development of protocols in the ED that expedite the management of patients with AAD. The data presented here support the concept that community-based hospitals, if well-organized, can deliver excellent care to patients with acute aortic emergencies, including those who present in cardiogenic shock.

70.05 Socioeconomic Realities of Elevated Pre-Operative Hemoglobin A1c and Risk of Cardiothoracic Surgery

H. Wu1, S. Eells2, S. Vangala3, N. Satou1, R. Shemin1, P. Benharash1, J. A. McKinnell2  1David Geffen School Of Medicine At UCLA,Cardiac Surgery,Los Angeles, CA, USA 2Los Angeles Biomedical Research Institute At Harbor–UCLA Medical Center,Infectious Diseases,Torrance, CA, USA 3David Geffen School Of Medicine At UCLA,General Internal Medicine And Health Services Research,Los Angeles, CA, USA

Introduction:  Socioeconomic factors, such as access to care, insurance, and medical literacy, are known to play an important role in adequate diabetes control. Patients with poorly controlled diabetes mellitus (DM) (HgbA1c≥7) have higher post-operative mortality and are more likely to suffer complications, including myocardial infarction (MI) and infection following cardiothoracic (CT) surgery. We hypothesize that poor diabetic control among patients undergoing CT surgery is associated with socioeconomic factors.

Methods:  We conducted a retrospective cohort investigation of diabetic patients undergoing CT surgery (n=751). Utilizing the UCLA Society of Thoracic Surgeons Adult Cardiac Database, we included patients with DM and a recorded HgbA1c value. Bivariate and multivariate logistic regression were used to identify predictors of poorly controlled diabetes (pre-operative HgbA1c≥7).
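
A sketch of how odds ratios and confidence intervals of the kind reported below can be obtained from a multivariate logistic regression with statsmodels; the data file and column names are placeholders, not the actual STS database fields:

import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("sts_diabetic_ct_surgery.csv")    # hypothetical extract
y = df["hba1c_ge_7"]                               # 1 = pre-op HgbA1c >= 7
X = sm.add_constant(df[["uninsured", "on_dm_therapy", "prior_mi",
                        "age", "female", "nonwhite"]])

res = sm.Logit(y, X).fit(disp=0)
odds_ratios = np.exp(res.params)     # OR = exp(coefficient)
conf_int = np.exp(res.conf_int())    # 95% CI on the OR scale
print(pd.concat([odds_ratios, conf_int], axis=1))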

Results: Factors associated with pre-operative HgbA1c≥7 in bivariate analysis included lack of insurance (OR 4.75, 95% CI 1.82-12.4, P<0.01), receipt of diabetes therapy (OR 2.01, 95% CI 1.08-3.72, P=0.03), and previous MI (OR 1.71, 95% CI 1.24-2.35, P<0.01). In multivariate analysis controlling for age, gender, and race/ethnicity, lack of insurance (OR 4.96, 95% CI 1.6-15.4, P<0.01), diabetes therapy (OR 2.27, 95% CI 1.12-4.61, P=0.02), and previous myocardial infarction (OR 1.64, 95% CI 1.15-2.35, P<0.01) were associated with poorly controlled diabetes.

Conclusion: A socioeconomic factor, lack of medical insurance, was independently associated with poorly controlled diabetes prior to CT surgery. The only comorbid medical condition in this investigation independently associated with pre-operative HgbA1c≥7 was prior MI. In an era of highly effective medical therapy for diabetes, the social determinants of health surpass traditional medical or biologic predictors of effective diabetes control. As poorly controlled diabetes prior to surgery is associated with poor outcomes afterwards, future research is required to understand the interplay of social determinants of health, diabetic control, and outcomes after cardiac surgery.

 

70.06 Hypertension Risk among Cancer Patients Treated with Sunitinib: a Systematic Review and Meta-analysis

S. Lew1,3, R. S. Chamberlain1,2,3  1Saint Barnabas Medical Center,Department Of Surgery,Livingston, NJ, USA 2New Jersey Medical School, Rutgers University,Department Of Surgery,Newark, NJ, USA 3Saint George’s University,School Of Medicine,True Blue, Grenada

Introduction:  Sunitinib is a multi-targeted tyrosine kinase inhibitor widely used in cancer therapy that has been linked to varying degrees of treatment-related hypertension (HTN). Current publications report wide variation in the incidence and severity of sunitinib-related HTN. In addition, the HTN risk uniquely associated with the two current mainstay sunitinib regimens, 37.5 mg continuous dosing and 50 mg dosed 4 weeks on, 2 weeks off, has not been well established. This study sought to determine the incidence of sunitinib-associated severe HTN, to stratify HTN risk by the type of malignancy treated, and to investigate the dosage-related risk of severe HTN.

Methods:  A comprehensive literature search of PubMed, Google Scholar and the Cochrane Central Registry of Controlled Trials was completed. Keywords searched were ‘sunitinib’, ‘sutent’, ‘SU11248’, ‘hypertension’, and ‘clinical trial’. All clinical trials were analyzed for patient recruitment, intervention, and outcomes. Incidence and risk ratio (RR) were calculated with 95% confidence intervals.
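
The pooled risk ratios reported below are built from per-trial event counts. A minimal sketch of a single-trial risk ratio with a 95% confidence interval, using illustrative counts rather than data from the identified trials:

import math

# Hypothetical counts for one trial; events = grade 3-4 hypertension.
events_tx, n_tx = 25, 500        # sunitinib arm
events_ctrl, n_ctrl = 10, 500    # control arm

rr = (events_tx / n_tx) / (events_ctrl / n_ctrl)
se_log_rr = math.sqrt(1/events_tx - 1/n_tx + 1/events_ctrl - 1/n_ctrl)
lower = math.exp(math.log(rr) - 1.96 * se_log_rr)
upper = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f} (95% CI {lower:.2f}-{upper:.2f})")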

Results: 61 single- or double-arm, phase II/III clinical trials involving 6,813 patients treated with sunitinib were identified. The incidence of sunitinib-associated severe (grade 3-4) HTN was 6.5% (95% CI [4.9-8.6]; p<0.0001) among 4,311 patients in 50 clinical trials in which sunitinib monotherapy was used in a single-arm trial or RCT. Among 4,433 sunitinib-treated patients in 18 RCTs, the risk of severe HTN in the treatment group was 5% compared with 2% in the control group (RR 2.82, 95% CI [1.63-4.88]; p=0.0002). The risk of sunitinib-associated severe HTN in renal cell carcinoma (RCC) patients was not higher than in controls (RR 1.38, 95% CI [0.64-2.96]; p=0.41), but it was significantly different from that in all other malignancies (p=0.05). The incidence and risk of severe sunitinib-associated HTN were not significantly different between breast cancer (BC) and non-BC patients, between gastrointestinal stromal tumor (GIST) and non-GIST patients, between the 37.5 mg continuous and 50 mg intermittent dosage regimens, or when monotherapy and/or concomitant chemotherapy were used. Significant heterogeneity was found among the identified trials with regard to underlying malignancy, treatment dosage, and treatment duration.

Conclusions: Sunitinib treatment is associated with a significantly increased risk of all-grade and high-grade HTN. Differences in dosing regimens or combination chemotherapy with sunitinib did not yield a significant reduction in hypertension risk. Although adequately powered large studies are needed to further investigate contributing HTN risk factors and ideal management or prevention, blood pressure should be carefully monitored in patients treated with sunitinib to prevent cardiovascular complications.

70.07 Selective neck dissection for breast cancer with isolated supraclavicular lymph node recurrence

Y. CHO1, Y. Jang1, S. Kim1  1Inha University Hospital,Surgery/Inha University School Of Medicine,Incheon, , South Korea

Introduction:   Isolated supraclavicular lymph node metastasis (iSLNM) in breast cancer can be managed with surgery, systemic therapy, and/or radiotherapy. We performed this study to analyze the survival and outcomes of selective neck dissection in breast cancer patients with iSLNM.

Methods: A total of 1,602 consecutive women with primary breast cancer who received surgical treatment at a single institution from 2004 to 2013 were included in this study. iSLNM was defined as isolated supraclavicular nodal recurrence without any local/regional or systemic recurrence. All iSLNM were confirmed by tissue diagnosis or imaging studies before surgery. Selective neck dissection was defined as removal, with curative intent, of all nodes and soft tissue in neck level IV and parts of levels III and V. Clinical and biological features, overall survival, and disease-free survival were analyzed for patients undergoing selective neck dissection.

Results: Of the 1,602 patients, five (0.3%) developed iSLNM during the study period. All iSLNM patients had pathologic proof of iSLNM without evidence of any other regional or distant spread on imaging studies at the time of the event. All iSLNM patients underwent selective neck dissection directed at the identified lymph node metastases. Three iSLNM occurred in the ipsilateral neck and two in the contralateral neck. The mean interval from primary breast cancer surgery to iSLNM was 44.4 months. All iSLNM patients remain alive at 48, 32, 57, 14, and 10 months after selective neck dissection; two patients developed liver metastases and one developed lung metastases during this period.

Conclusion: The development of iSLNM may be a harbinger of distant metastasis in breast cancer patients, but aggressive surgical treatment should still be considered for iSLNM.