Association between admission temperature and mortality and major morbidity in preterm infants born at fewer than 33 weeks' gestation. Lyu Yanyu,Shah Prakesh S,Ye Xiang Y,Warre Ruth,Piedboeuf Bruno,Deshpandey Akhil,Dunn Michael,Lee Shoo K, JAMA pediatrics IMPORTANCE:Neonatal hypothermia has been associated with higher mortality and morbidity; therefore, thermal control following delivery is an essential part of neonatal care. Identifying the ideal body temperature in preterm neonates in the first few hours of life may be helpful to reduce the risk for adverse outcomes. OBJECTIVES:To examine the association between admission temperature and neonatal outcomes and estimate the admission temperature associated with lowest rates of adverse outcomes in preterm infants born at fewer than 33 weeks' gestation. DESIGN, SETTING, AND PARTICIPANTS:Retrospective observational study at 29 neonatal intensive care units in the Canadian Neonatal Network. Participants included 9833 inborn infants born at fewer than 33 weeks' gestation who were admitted between January 1, 2010, and December 31, 2012. EXPOSURE:Axillary or rectal body temperature recorded at admission. MAIN OUTCOMES AND MEASURES:The primary outcome was a composite adverse outcome defined as mortality or any of the following: severe neurological injury, severe retinopathy of prematurity, necrotizing enterocolitis, bronchopulmonary dysplasia, or nosocomial infection. The relationships between admission temperature and the composite outcome as well as between admission temperature and the components of the composite outcome were evaluated using multivariable analyses. RESULTS:Admission temperatures of the 9833 neonates were distributed as follows: lower than 34.5°C (1%); 34.5°C to 34.9°C (1%); 35.0°C to 35.4°C (3%); 35.5°C to 35.9°C (7%); 36.0°C to 36.4°C (24%); 36.5°C to 36.9°C (38%); 37.0°C to 37.4°C (19%); 37.5°C to 37.9°C (5%); and 38.0°C or higher (2%). 
After adjustment for maternal and infant characteristics, the rates of the composite outcome, severe neurological injury, severe retinopathy of prematurity, necrotizing enterocolitis, bronchopulmonary dysplasia, and nosocomial infection had a U-shaped relationship with admission temperature (α > 0 [P < .05]). The admission temperature at which the rate of the composite outcome was lowest was 36.8°C (95% CI, 36.7°C-37.0°C). Rates of severe neurological injury, severe retinopathy of prematurity, necrotizing enterocolitis (95% CI, 36.3°C-36.7°C), bronchopulmonary dysplasia, and nosocomial infection (95% CI, 36.9°C-37.3°C) were lowest at admission temperatures ranging from 36.5°C to 37.2°C. CONCLUSIONS AND RELEVANCE:The relationship between admission temperature and adverse neonatal outcomes was U-shaped. The lowest rates of adverse outcomes were associated with admission temperatures between 36.5°C and 37.2°C. 10.1001/jamapediatrics.2015.0277
Elevated cord serum manganese level is associated with a neonatal high ponderal index. Yu XiaoDan,Cao LuLu,Yu XiaoGang Environmental research BACKGROUND:The effects of low-level prenatal manganese (Mn) exposure on neonatal growth remain unclear. The level of fetal Mn that may be considered "safe" has never been examined. METHODS:A multicenter study including 1377 mother-infant pairs was conducted from 2008 through 2009 in Shanghai. Mn concentrations were measured in both cord and maternal serum, and neonatal birth weight and birth length were recorded. The ponderal index (PI) was calculated as (birth weight g/birth length cm³)×100, and a ponderal index ≥3.17 was defined as a high ponderal index (HPI). RESULTS:The median serum Mn concentration was 4.0μg/L in the cord blood, and was 2.8μg/L in maternal blood. Of 1377 infants, 135 (9.8%) had an HPI. After adjusting for potential confounders, cord serum Mn was not associated with birth weight. However, there was a linear relationship between the cord serum Mn and the birth length (adjusted β=-0.5, 95% CI=-0.7 to -0.2, p<0.0001). Additionally, a nonlinear relationship was observed between the cord serum Mn and the ponderal index, and between the cord serum Mn and HPI. The ponderal index and the prevalence of HPI increased with Mn levels above 5.0μg/L (Log Mn ≥0.7). A high level of Mn in the cord (≥5.0μg/L) was associated with a higher ponderal index (adjusted β=0.2, 95% CI=0.1 to 0.2, p<0.001) and a high risk of HPI (adjusted OR=3.3, 95% CI=1.8-6.0, p<0.001). CONCLUSIONS:Higher prenatal Mn exposure, even at a low level, is associated with a higher prevalence of HPI in a nonlinear pattern. Cord serum Mn levels less than 5.0μg/L may be considered safe with respect to neonatal ponderal index assessment. 10.1016/j.envres.2012.11.002
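The ponderal index and HPI definitions in the abstract above can be sketched in code; the function names and the example birth measurements are illustrative, not taken from the study data.

```python
def ponderal_index(birth_weight_g: float, birth_length_cm: float) -> float:
    """Ponderal index as defined in the abstract: (weight g / length cm^3) x 100."""
    return birth_weight_g / birth_length_cm ** 3 * 100

def is_high_ponderal_index(pi: float, threshold: float = 3.17) -> bool:
    """HPI as defined in the abstract: a ponderal index of 3.17 or greater."""
    return pi >= threshold

# Illustrative example: a 3500 g, 50 cm newborn -> 3500 / 125000 * 100 = 2.8
pi = ponderal_index(3500, 50)
print(pi, is_high_ponderal_index(pi))
```

Since weight scales roughly with the cube of length, the index is a size-independent measure of neonatal "rotundity", which is why the study relates it to adiposity rather than to weight alone.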
Association between hyperglycaemia and adverse perinatal outcomes in south Asian and white British women: analysis of data from the Born in Bradford cohort. Farrar Diane,Fairley Lesley,Santorelli Gillian,Tuffnell Derek,Sheldon Trevor A,Wright John,van Overveld Lydia,Lawlor Debbie A The lancet. Diabetes & endocrinology BACKGROUND:Diagnosis of gestational diabetes predicts risk of infants who are large for gestational age (LGA) and have high adiposity, which in turn predict a future risk of obesity in the offspring. South Asian women have higher risk of gestational diabetes, lower risk of LGA, and on average give birth to infants with greater adiposity than do white European women. Whether the same diagnostic criteria for gestational diabetes should apply to both groups of women is unclear. We aimed to assess the association between maternal glucose and adverse perinatal outcomes to ascertain whether thresholds used to diagnose gestational diabetes should differ between south Asian and white British women. We also aimed to assess whether ethnic origin affected prevalence of gestational diabetes irrespective of criteria used. METHODS:We used data (including results of a 26-28 week gestation oral glucose tolerance test) of women from the Born in Bradford study, a prospective study that recruited women attending the antenatal clinic at the Bradford Royal Infirmary, UK, between 2007 and 2011 and who intended to give birth to their infant in that hospital. We studied the association between fasting and 2 h post-load glucose and three primary outcomes (LGA [defined as birthweight >90th percentile for gestational age], high infant adiposity [sum of skinfolds >90th percentile for gestational age], and caesarean section). We calculated adjusted odds ratios (ORs) and their 95% confidence intervals (CIs) for a 1 SD increase in fasting and post-load glucose. 
We established fasting and post-load glucose thresholds that equated to an OR of 1·75 for LGA and high infant adiposity in each group of women to identify ethnic-specific criteria for diagnosis of gestational diabetes. FINDINGS:Of 13,773 pregnancies, 3420 were excluded from analyses. Of 10,353 eligible pregnancies, 4088 women were white British, 5408 were south Asian, and 857 were of other ethnic origin. The adjusted ORs of LGA per 1 SD fasting glucose were 1·22 (95% CI 1·08-1·38) in white British women and 1·43 (1·23-1·67) in south Asian women (p for interaction with ethnicity=0·39). Results for high infant adiposity were 1·35 (1·23-1·49) and 1·35 (1·18-1·54; p for interaction with ethnicity=0·98), and for caesarean section they were 1·06 (0·97-1·16) and 1·11 (1·02-1·20; p for interaction with ethnicity=0·47). Associations between post-load glucose and the three primary outcomes were weaker than for fasting glucose. A fasting glucose concentration of 5·4 mmol/L or a 2 h post-load level of 7·5 mmol/L identified white British women with 75% or higher relative risk of LGA or high infant adiposity; in south Asian women, the cutoffs were 5·2 mmol/L or 7·2 mmol/L; in the whole cohort, the cutoffs were 5·3 mmol/L or 7·5 mmol/L. The prevalence of gestational diabetes in our cohort ranged from 1·2% to 8·7% in white British women and 4% to 24% in south Asian women using six different criteria. Compared with the application of our whole-cohort criteria, use of our ethnic-specific criteria increased the prevalence of gestational diabetes in south Asian women from 17·4% (95% CI 16·4-18·4) to 24·2% (23·1-25·3). INTERPRETATION:Our data support the use of lower fasting and post-load glucose thresholds to diagnose gestational diabetes in south Asian than white British women. 
They also suggest that diagnostic criteria for gestational diabetes recommended by UK NICE might underestimate the prevalence of gestational diabetes compared with our criteria or those recommended by the International Association of Diabetes and Pregnancy Study Groups and WHO, especially in south Asian women. FUNDING:The National Institute for Health Research. 10.1016/S2213-8587(15)00255-7
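The ethnic-specific cutoffs derived in the Born in Bradford analysis above can be sketched as a simple lookup; the function name, the dictionary, and the either-value-meets-its-cutoff rule are illustrative assumptions, not the paper's exact diagnostic algorithm.

```python
# Cutoffs from the abstract: (fasting, 2 h post-load) glucose in mmol/L.
GDM_CUTOFFS = {
    "white British": (5.4, 7.5),
    "south Asian": (5.2, 7.2),
    "whole cohort": (5.3, 7.5),
}

def meets_gdm_criteria(group: str, fasting: float, post_load: float) -> bool:
    """Flag gestational diabetes if either glucose value reaches its cutoff."""
    fasting_cut, post_load_cut = GDM_CUTOFFS[group]
    return fasting >= fasting_cut or post_load >= post_load_cut

# A fasting value of 5.3 mmol/L meets the south Asian cutoff (5.2)
# but falls below the white British cutoff (5.4).
print(meets_gdm_criteria("south Asian", 5.3, 7.0))
print(meets_gdm_criteria("white British", 5.3, 7.0))
```

The example illustrates the paper's interpretation: with ethnic-specific thresholds, the same glucose values can cross the diagnostic line for south Asian women while remaining below it for white British women.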
Association of Coffee Consumption With Total and Cause-Specific Mortality Among Nonwhite Populations. Park Song-Yi,Freedman Neal D,Haiman Christopher A,Le Marchand Loïc,Wilkens Lynne R,Setiawan Veronica Wendy Annals of internal medicine BACKGROUND:Coffee consumption has been associated with reduced risk for death in prospective cohort studies; however, data in nonwhites are sparse. OBJECTIVE:To examine the association of coffee consumption with risk for total and cause-specific death. DESIGN:The MEC (Multiethnic Cohort), a prospective population-based cohort study established between 1993 and 1996. SETTING:Hawaii and Los Angeles, California. PARTICIPANTS:185 855 African Americans, Native Hawaiians, Japanese Americans, Latinos, and whites aged 45 to 75 years at recruitment. MEASUREMENTS:Outcomes were total and cause-specific mortality between 1993 and 2012. Coffee intake was assessed at baseline by means of a validated food-frequency questionnaire. RESULTS:58 397 participants died during 3 195 484 person-years of follow-up (average follow-up, 16.2 years). Compared with drinking no coffee, coffee consumption was associated with lower total mortality after adjustment for smoking and other potential confounders (1 cup per day: hazard ratio [HR], 0.88 [95% CI, 0.85 to 0.91]; 2 to 3 cups per day: HR, 0.82 [CI, 0.79 to 0.86]; ≥4 cups per day: HR, 0.82 [CI, 0.78 to 0.87]; P for trend < 0.001). Trends were similar between caffeinated and decaffeinated coffee. Significant inverse associations were observed in 4 ethnic groups; the association in Native Hawaiians did not reach statistical significance. Inverse associations were also seen in never-smokers, younger participants (<55 years), and those who had not previously reported a chronic disease. Among examined end points, inverse associations were observed for deaths due to heart disease, cancer, respiratory disease, stroke, diabetes, and kidney disease. 
LIMITATION:Unmeasured confounding and measurement error, although sensitivity analysis suggested that neither was likely to affect results. CONCLUSION:Higher consumption of coffee was associated with lower risk for death in African Americans, Japanese Americans, Latinos, and whites. PRIMARY FUNDING SOURCE:National Cancer Institute. 10.7326/M16-2472
Association of peripheral differential leukocyte counts with dyslipidemia risk in Chinese patients with hypertension: insight from the China Stroke Primary Prevention Trial. Liu Yanhong,Kong Xiangyi,Wang Wen,Fan Fangfang,Zhang Yan,Zhao Min,Wang Yi,Wang Yupeng,Wang Yu,Qin Xianhui,Tang Genfu,Wang Binyan,Xu Xiping,Hou Fan Fan,Gao Wei,Sun Ningling,Li Jianping,Venners Scott A,Jiang Shanqun,Huo Yong Journal of lipid research The aim of the present study was to examine the association between peripheral differential leukocyte counts and dyslipidemia in a Chinese hypertensive population. A total of 10,866 patients with hypertension were enrolled for a comprehensive assessment of cardiovascular risk factors using data from the China Stroke Primary Prevention Trial. Plasma lipid levels and total leukocyte, neutrophil, and lymphocyte counts were determined according to standard methods. Peripheral differential leukocyte counts were consistently and positively associated with serum total cholesterol (TC), LDL cholesterol (LDL-C), and triglyceride (TG) levels (all P < 0.001 for trend), while inversely associated with HDL cholesterol levels (P < 0.05 for trend). In subsequent analyses where serum lipids were dichotomized (dyslipidemia/normolipidemia), we found that patients in the highest quartile of total leukocyte count (≥7.6 × 10⁹ cells/L) had 1.64 times the risk of high TG [95% confidence interval (CI): 1.46, 1.85], 1.34 times the risk of high TC (95% CI: 1.20, 1.50), and 1.24 times the risk of high LDL-C (95% CI: 1.12, 1.39) compared with their counterparts in the lowest quartile of total leukocyte count. Similar patterns were also observed with neutrophils and lymphocytes. In summary, these findings indicate that elevated differential leukocyte counts are directly associated with serum lipid levels and increased odds of dyslipidemia. 10.1194/jlr.P067686
CDX2 as a Prognostic Biomarker in Stage II and Stage III Colon Cancer. Dalerba Piero,Sahoo Debashis,Paik Soonmyung,Guo Xiangqian,Yothers Greg,Song Nan,Wilcox-Fogel Nate,Forgó Erna,Rajendran Pradeep S,Miranda Stephen P,Hisamori Shigeo,Hutchison Jacqueline,Kalisky Tomer,Qian Dalong,Wolmark Norman,Fisher George A,van de Rijn Matt,Clarke Michael F The New England journal of medicine Background The identification of high-risk stage II colon cancers is key to the selection of patients who require adjuvant treatment after surgery. Microarray-based multigene-expression signatures derived from stem cells and progenitor cells hold promise, but they are difficult to use in clinical practice. Methods We used a new bioinformatics approach to search for biomarkers of colon epithelial differentiation across gene-expression arrays and then ranked candidate genes according to the availability of clinical-grade diagnostic assays. With the use of subgroup analysis involving independent and retrospective cohorts of patients with stage II or stage III colon cancer, the top candidate gene was tested for its association with disease-free survival and a benefit from adjuvant chemotherapy. Results The transcription factor CDX2 ranked first in our screening test. A group of 87 of 2115 tumor samples (4.1%) lacked CDX2 expression. In the discovery data set, which included 466 patients, the rate of 5-year disease-free survival was lower among the 32 patients (6.9%) with CDX2-negative colon cancers than among the 434 (93.1%) with CDX2-positive colon cancers (hazard ratio for disease recurrence, 3.44; 95% confidence interval [CI], 1.60 to 7.38; P=0.002). In the validation data set, which included 314 patients, the rate of 5-year disease-free survival was lower among the 38 patients (12.1%) with CDX2 protein-negative colon cancers than among the 276 (87.9%) with CDX2 protein-positive colon cancers (hazard ratio, 2.42; 95% CI, 1.36 to 4.29; P=0.003). 
In both these groups, these findings were independent of the patient's age, sex, and tumor stage and grade. Among patients with stage II cancer, the difference in 5-year disease-free survival was significant both in the discovery data set (49% among 15 patients with CDX2-negative tumors vs. 87% among 191 patients with CDX2-positive tumors, P=0.003) and in the validation data set (51% among 15 patients with CDX2-negative tumors vs. 80% among 106 patients with CDX2-positive tumors, P=0.004). In a pooled database of all patient cohorts, the rate of 5-year disease-free survival was higher among 23 patients with stage II CDX2-negative tumors who were treated with adjuvant chemotherapy than among 25 who were not treated with adjuvant chemotherapy (91% vs. 56%, P=0.006). Conclusions Lack of CDX2 expression identified a subgroup of patients with high-risk stage II colon cancer who appeared to benefit from adjuvant chemotherapy. (Funded by the National Comprehensive Cancer Network, the National Institutes of Health, and others.). 10.1056/NEJMoa1506597
Consistency of blood pressure control after ischemic stroke: prevalence and prognosis. Towfighi Amytis,Markovic Daniela,Ovbiagele Bruce Stroke BACKGROUND AND PURPOSE:Blood pressure (BP) reduction lowers vascular risk after stroke; however, little is known about the relationship between consistency of BP control and risk of subsequent vascular events. METHODS:In this post hoc analysis of the Vitamin Intervention for Stroke Prevention trial (n=3680), individuals with recent (<120 days) stroke, followed up for 2 years, were divided according to proportion of visits in which BP was controlled (<140/90 mm Hg): <25%, 25% to 49%, 50% to 74%, and ≥75%. Multivariable models adjusting for demographic and clinical variables determined the association between consistency of BP control versus primary (stroke) and secondary (stroke, myocardial infarction, or vascular death) outcomes. RESULTS:Only 30% of participants had BP controlled ≥75% of the time. Consistency of BP control affected outcomes in individuals with baseline systolic BP>132 mm Hg. Among individuals with baseline systolic BP>75th percentile (>153 mm Hg), risks of primary and secondary outcomes were lower in those with BP controlled ≥75% versus <25% of visits (adjusted hazard ratio, 0.46; 95% confidence interval, 0.26-0.84 and adjusted hazard ratio, 0.51; 95% confidence interval, 0.32-0.82). Individuals with mean follow-up BP<140/90 mm Hg had lower risk of primary and secondary outcomes than those with BP≥140/90 mm Hg (adjusted hazard ratio, 0.76; 95% confidence interval, 0.59-0.98 and adjusted hazard ratio, 0.76; 95% confidence interval, 0.62-0.92). CONCLUSIONS:In this rigorous clinical trial, fewer than one third of patients with stroke had BP controlled ≥75% of the time for 2 years. Furthermore, consistency of BP control among those with elevated baseline systolic BP was linked to reduction in risk of recurrent stroke and stroke, myocardial infarction, and vascular death. 10.1161/STROKEAHA.113.001900
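The exposure definition in the abstract above (proportion of follow-up visits with BP controlled to <140/90 mm Hg, binned into four groups) can be sketched as follows; the function names and the example visit readings are illustrative, not taken from the trial data.

```python
def bp_controlled(systolic: float, diastolic: float) -> bool:
    """Control at a single visit, per the abstract: BP < 140/90 mm Hg."""
    return systolic < 140 and diastolic < 90

def control_category(visits: list[tuple[float, float]]) -> str:
    """Bin the proportion of controlled visits into the trial's four groups."""
    controlled = sum(bp_controlled(s, d) for s, d in visits)
    pct = 100 * controlled / len(visits)
    if pct < 25:
        return "<25%"
    elif pct < 50:
        return "25-49%"
    elif pct < 75:
        return "50-74%"
    return ">=75%"

# Illustrative example: 3 of 4 visits controlled -> 75% -> ">=75%" group
print(control_category([(130, 80), (138, 88), (150, 95), (135, 85)]))
```

Note that control requires both components to be below threshold, so a visit at 150/85 mm Hg counts as uncontrolled even though the diastolic value is acceptable.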
Fractional flow reserve-guided PCI versus medical therapy in stable coronary disease. De Bruyne Bernard,Pijls Nico H J,Kalesan Bindu,Barbato Emanuele,Tonino Pim A L,Piroth Zsolt,Jagic Nikola,Möbius-Winkler Sven,Mobius-Winckler Sven,Rioufol Gilles,Witt Nils,Kala Petr,MacCarthy Philip,Engström Thomas,Oldroyd Keith G,Mavromatis Kreton,Manoharan Ganesh,Verlee Peter,Frobert Ole,Curzen Nick,Johnson Jane B,Jüni Peter,Fearon William F, The New England journal of medicine BACKGROUND:The preferred initial treatment for patients with stable coronary artery disease is the best available medical therapy. We hypothesized that in patients with functionally significant stenoses, as determined by measurement of fractional flow reserve (FFR), percutaneous coronary intervention (PCI) plus the best available medical therapy would be superior to the best available medical therapy alone. METHODS:In patients with stable coronary artery disease for whom PCI was being considered, we assessed all stenoses by measuring FFR. Patients in whom at least one stenosis was functionally significant (FFR, ≤0.80) were randomly assigned to FFR-guided PCI plus the best available medical therapy (PCI group) or the best available medical therapy alone (medical-therapy group). Patients in whom all stenoses had an FFR of more than 0.80 were entered into a registry and received the best available medical therapy. The primary end point was a composite of death, myocardial infarction, or urgent revascularization. RESULTS:Recruitment was halted prematurely after enrollment of 1220 patients (888 who underwent randomization and 332 enrolled in the registry) because of a significant between-group difference in the percentage of patients who had a primary end-point event: 4.3% in the PCI group and 12.7% in the medical-therapy group (hazard ratio with PCI, 0.32; 95% confidence interval [CI], 0.19 to 0.53; P<0.001). 
The difference was driven by a lower rate of urgent revascularization in the PCI group than in the medical-therapy group (1.6% vs. 11.1%; hazard ratio, 0.13; 95% CI, 0.06 to 0.30; P<0.001); in particular, in the PCI group, fewer urgent revascularizations were triggered by a myocardial infarction or evidence of ischemia on electrocardiography (hazard ratio, 0.13; 95% CI, 0.04 to 0.43; P<0.001). Among patients in the registry, 3.0% had a primary end-point event. CONCLUSIONS:In patients with stable coronary artery disease and functionally significant stenoses, FFR-guided PCI plus the best available medical therapy, as compared with the best available medical therapy alone, decreased the need for urgent revascularization. In patients without ischemia, the outcome appeared to be favorable with the best available medical therapy alone. (Funded by St. Jude Medical; ClinicalTrials.gov number, NCT01132495.). 10.1056/NEJMoa1205361
Afatinib versus erlotinib as second-line treatment of patients with advanced squamous cell carcinoma of the lung (LUX-Lung 8): an open-label randomised controlled phase 3 trial. Soria Jean-Charles,Felip Enriqueta,Cobo Manuel,Lu Shun,Syrigos Konstantinos,Lee Ki Hyeong,Göker Erdem,Georgoulias Vassilis,Li Wei,Isla Dolores,Guclu Salih Z,Morabito Alessandro,Min Young J,Ardizzoni Andrea,Gadgeel Shirish M,Wang Bushi,Chand Vikram K,Goss Glenwood D, The Lancet. Oncology BACKGROUND:There is a major unmet need for effective treatments in patients with squamous cell carcinoma of the lung. LUX-Lung 8 compared afatinib (an irreversible ErbB family blocker) with erlotinib (a reversible EGFR tyrosine kinase inhibitor), as second-line treatment for patients with advanced squamous cell carcinoma of the lung. METHODS:We did this open-label, phase 3 randomised controlled trial at 183 cancer centres in 23 countries worldwide. We enrolled adults with stage IIIB or IV squamous cell carcinoma of the lung who had progressed after at least four cycles of platinum-based chemotherapy. Participants were randomly assigned (1:1) to receive afatinib (40 mg per day) or erlotinib (150 mg per day) until disease progression. The randomisation was done centrally with an interactive voice or web-based response system and stratified by ethnic origin (eastern Asian vs non-eastern Asian). Clinicians and patients were not masked to treatment allocation. The primary endpoint was progression-free survival assessed by independent central review (intention-to-treat population). The key secondary endpoint was overall survival. This trial is registered with ClinicalTrials.gov, NCT01523587. FINDINGS:795 eligible patients were randomly assigned (398 to afatinib, 397 to erlotinib). Median follow-up at the time of the primary analysis of progression-free survival was 6·7 months (IQR 3·1-10·2), at which point enrolment was not complete. 
Progression-free survival at the primary analysis was significantly longer with afatinib than with erlotinib (median 2·4 months [95% CI 1·9-2·9] vs 1·9 months [1·9-2·2]; hazard ratio [HR] 0·82 [95% CI 0·68-1·00], p=0·0427). At the time of the primary analysis of overall survival (median follow-up 18·4 months [IQR 13·8-22·4]), overall survival was significantly greater in the afatinib group than in the erlotinib group (median 7·9 months [95% CI 7·2-8·7] vs 6·8 months [5·9-7·8]; HR 0·81 [95% CI 0·69-0·95], p=0·0077), as were progression-free survival (median 2·6 months [95% CI 2·0-2·9] vs 1·9 months [1·9-2·1]; HR 0·81 [95% CI 0·69-0·96], p=0·0103) and disease control (201 [51%] of 398 patients vs 157 [40%] of 397; p=0·0020). The proportion of patients with an objective response did not differ significantly between groups (22 [6%] vs 11 [3%]; p=0·0551). Tumour shrinkage occurred in 103 (26%) of 398 patients versus 90 (23%) of 397 patients. Adverse event profiles were similar in each group: 224 (57%) of 392 patients in the afatinib group versus 227 (57%) of 395 in the erlotinib group had grade 3 or higher adverse events. We recorded higher incidences of treatment-related grade 3 diarrhoea with afatinib (39 [10%] vs nine [2%]), of grade 3 stomatitis with afatinib (16 [4%] vs none), and of grade 3 rash or acne with erlotinib (23 [6%] vs 41 [10%]). INTERPRETATION:The significant improvements in progression-free survival and overall survival with afatinib compared with erlotinib, along with a manageable safety profile and the convenience of oral administration, suggest that afatinib could be an additional option for the treatment of patients with squamous cell carcinoma of the lung. FUNDING:Boehringer Ingelheim. 10.1016/S1470-2045(15)00006-6
Longitudinal association between fasting blood glucose concentrations and first stroke in hypertensive adults in China: effect of folic acid intervention. The American journal of clinical nutrition Diabetes is a known risk factor for stroke, but data on its prospective association with first stroke are limited. Folic acid supplementation has been shown to protect against first stroke, but its role in preventing first stroke in diabetes is unknown. This post hoc analysis of the China Stroke Primary Prevention Trial tested the hypotheses that the fasting blood glucose (FBG) concentration is positively associated with first stroke risk and that folic acid treatment can reduce stroke risk associated with elevated fasting glucose concentrations. This analysis included 20,327 hypertensive adults without a history of stroke or myocardial infarction, who were randomly assigned to a double-blind daily treatment with 10 mg enalapril and 0.8 mg folic acid (n = 10,160) or 10 mg enalapril alone (n = 10,167). Kaplan-Meier survival analysis and Cox proportional hazards models were used to test the hypotheses with adjustment for pertinent covariables. During a median treatment duration of 4.5 y, 616 participants developed a first stroke (497 ischemic strokes). A high FBG concentration (≥7.0 mmol/L) or diabetes, compared with a low FBG concentration (<5.0 mmol/L), was associated with an increased risk of first stroke (6.0% compared with 2.6%, respectively; HR: 1.9; 95% CI: 1.3, 2.8; P < 0.001). Folic acid treatment reduced the risk of stroke across a wide range of FBG concentrations ≥5.0 mmol/L, but risk reduction was greatest in subjects with FBG concentrations ≥7.0 mmol/L or with diabetes (HR: 0.66; 95% CI: 0.46, 0.97; P < 0.05). There was a significant interactive effect of FBG and folic acid treatment on first stroke (P for interaction = 0.01). 
In Chinese hypertensive adults, an FBG concentration ≥7.0 mmol/L or diabetes is associated with an increased risk of first stroke; this increased risk is reduced by 34% with folic acid treatment. These findings warrant additional investigation. This trial was registered at clinicaltrials.gov as NCT00794885. 10.3945/ajcn.116.145656
Comparative effectiveness of clopidogrel in medically managed patients with unstable angina and non-ST-segment elevation myocardial infarction. Solomon Matthew D,Go Alan S,Shilane David,Boothroyd Derek B,Leong Thomas K,Kazi Dhruv S,Chang Tara I,Hlatky Mark A Journal of the American College of Cardiology OBJECTIVES:This study sought to examine the effectiveness of clopidogrel in real-world, medically managed patients with unstable angina (UA) or non-ST-segment elevation myocardial infarction (NSTEMI). BACKGROUND:Although clinical trials have demonstrated the efficacy of clopidogrel to reduce cardiovascular (CV) morbidity and mortality in medically managed patients with UA or NSTEMI, the effectiveness of clopidogrel in actual clinical practice is less certain. METHODS:A retrospective cohort study was conducted of Kaiser Permanente Northern California members without known coronary artery disease or prior clopidogrel use who presented with UA or NSTEMI between 2003 and 2008 and were medically managed (i.e., no percutaneous coronary intervention or coronary artery bypass grafting during the index hospitalization or within 7 days post-discharge). Over 2 years of follow-up, we measured the association between clopidogrel use and all-cause mortality, hospital stay for MI, and a composite endpoint of death or MI using propensity-matched multivariable Cox analyses. RESULTS:We identified 16,365 patients with incident UA (35%) or NSTEMI (65%); 36% of these patients were prescribed clopidogrel within 7 days of discharge. In 8,562 propensity score-matched patients, clopidogrel users had lower rates of all-cause mortality (8.3% vs. 13.0%; p < 0.01; adjusted hazard ratio [HR]: 0.63; 95% confidence interval [CI]: 0.54 to 0.72) and the composite of death or MI (13.5% vs. 17.4%; p < 0.01; HR: 0.74, CI: 0.66 to 0.84), but not MI alone (6.7% vs. 7.2%; p = 0.30; HR: 0.93, CI: 0.78 to 1.11), compared with nonusers of clopidogrel. 
The association between clopidogrel use and the composite of death or MI was significant only among patients presenting with NSTEMI (HR: 0.67; CI: 0.59 to 0.76; p for interaction < 0.01), not among those presenting with UA (HR: 1.25; CI: 0.94 to 1.67). CONCLUSIONS:In a large, community-based cohort of patients who were medically managed after UA/NSTEMI, clopidogrel use was associated with a lower risk of death and MI, particularly among patients with NSTEMI. 10.1016/j.jacc.2014.02.586
Tofu consumption and blood lead levels in young Chinese adults. American journal of epidemiology Tofu is a commonly consumed food in China. Tofu may interfere with lead absorption and retention because of its high calcium content. In this observational study, the authors examined whether dietary tofu intake was associated with blood lead levels among young adults in Shenyang, China. The analyses included 605 men and 550 women who completed baseline questionnaires and had blood lead measurements taken in 1996-1998 as part of a prospective cohort study on reproductive health. Mean blood lead levels were 13.2 microg/dl in men and 10.1 microg/dl in women. Blood lead levels were negatively associated with tofu intake in both genders. A linear trend test showed a 3.7% (0.5-microg/dl) decrease in blood lead level with each higher category of tofu intake (p = 0.003). The highest tofu intake group (> or =750 g/week) had blood lead levels 11.3% lower (95% confidence interval: 4.1, 18.0) than those of the lowest tofu intake group (<250 g/week). In all regression models, data were adjusted for gender, age, height, body mass index, district, cigarette smoking, alcohol drinking, education, occupation, use of vitamin supplements, season, and dietary intake of meat, fish, vegetables, eggs, and milk. In conclusion, the authors found a significant inverse dose-response relation between tofu consumption and blood lead levels in this Chinese population. 10.1093/aje/153.12.1206
Antidepressant use in pregnancy and the risk of cardiac defects. Huybrechts Krista F,Palmsten Kristin,Avorn Jerry,Cohen Lee S,Holmes Lewis B,Franklin Jessica M,Mogun Helen,Levin Raisa,Kowal Mary,Setoguchi Soko,Hernández-Díaz Sonia The New England journal of medicine BACKGROUND:Whether the use of selective serotonin-reuptake inhibitors (SSRIs) and other antidepressants during pregnancy is associated with an increased risk of congenital cardiac defects is uncertain. In particular, there are concerns about a possible association between paroxetine use and right ventricular outflow tract obstruction and between sertraline use and ventricular septal defects. METHODS:We performed a cohort study nested in the nationwide Medicaid Analytic eXtract for the period 2000 through 2007. The study included 949,504 pregnant women who were enrolled in Medicaid during the period from 3 months before the last menstrual period through 1 month after delivery and their liveborn infants. We compared the risk of major cardiac defects among infants born to women who took antidepressants during the first trimester with the risk among infants born to women who did not use antidepressants, with an unadjusted analysis and analyses that restricted the cohort to women with depression and that used propensity-score adjustment to control for depression severity and other potential confounders. RESULTS:A total of 64,389 women (6.8%) used antidepressants during the first trimester. Overall, 6403 infants who were not exposed to antidepressants were born with a cardiac defect (72.3 infants with a cardiac defect per 10,000 infants), as compared with 580 infants with exposure (90.1 per 10,000 infants). Associations between antidepressant use and cardiac defects were attenuated with increasing levels of adjustment for confounding. 
The relative risks of any cardiac defect with the use of SSRIs were 1.25 (95% confidence interval [CI], 1.13 to 1.38) in the unadjusted analysis, 1.12 (95% CI, 1.00 to 1.26) in the analysis restricted to women with depression, and 1.06 (95% CI, 0.93 to 1.22) in the fully adjusted analysis restricted to women with depression. We found no significant association between the use of paroxetine and right ventricular outflow tract obstruction (relative risk, 1.07; 95% CI, 0.59 to 1.93) or between the use of sertraline and ventricular septal defects (relative risk, 1.04; 95% CI, 0.76 to 1.41). CONCLUSIONS:The results of this large, population-based cohort study suggested no substantial increase in the risk of cardiac malformations attributable to antidepressant use during the first trimester. (Funded by the Agency for Healthcare Research and Quality and the National Institutes of Health.). 10.1056/NEJMoa1312828
Efficacy of folic acid therapy in primary prevention of stroke among adults with hypertension in China: the CSPPT randomized clinical trial. Huo Yong,Li Jianping,Qin Xianhui,Huang Yining,Wang Xiaobin,Gottesman Rebecca F,Tang Genfu,Wang Binyan,Chen Dafang,He Mingli,Fu Jia,Cai Yefeng,Shi Xiuli,Zhang Yan,Cui Yimin,Sun Ningling,Li Xiaoying,Cheng Xiaoshu,Wang Jian'an,Yang Xinchun,Yang Tianlun,Xiao Chuanshi,Zhao Gang,Dong Qiang,Zhu Dingliang,Wang Xian,Ge Junbo,Zhao Lianyou,Hu Dayi,Liu Lisheng,Hou Fan Fan, JAMA IMPORTANCE:Uncertainty remains about the efficacy of folic acid therapy for the primary prevention of stroke because of limited and inconsistent data. OBJECTIVE:To test the primary hypothesis that therapy with enalapril and folic acid is more effective in reducing first stroke than enalapril alone among Chinese adults with hypertension. DESIGN, SETTING, AND PARTICIPANTS:The China Stroke Primary Prevention Trial, a randomized, double-blind clinical trial conducted from May 19, 2008, to August 24, 2013, in 32 communities in Jiangsu and Anhui provinces in China. A total of 20,702 adults with hypertension without history of stroke or myocardial infarction (MI) participated in the study. INTERVENTIONS:Eligible participants, stratified by MTHFR C677T genotypes (CC, CT, and TT), were randomly assigned to receive double-blind daily treatment with a single-pill combination containing enalapril, 10 mg, and folic acid, 0.8 mg (n = 10,348) or a tablet containing enalapril, 10 mg, alone (n = 10,354). MAIN OUTCOMES AND MEASURES:The primary outcome was first stroke. Secondary outcomes included first ischemic stroke; first hemorrhagic stroke; MI; a composite of cardiovascular events consisting of cardiovascular death, MI, and stroke; and all-cause death. 
RESULTS:During a median treatment duration of 4.5 years, compared with the enalapril alone group, the enalapril-folic acid group had a significant risk reduction in first stroke (2.7% of participants in the enalapril-folic acid group vs 3.4% in the enalapril alone group; hazard ratio [HR], 0.79; 95% CI, 0.68-0.93), first ischemic stroke (2.2% with enalapril-folic acid vs 2.8% with enalapril alone; HR, 0.76; 95% CI, 0.64-0.91), and composite cardiovascular events consisting of cardiovascular death, MI, and stroke (3.1% with enalapril-folic acid vs 3.9% with enalapril alone; HR, 0.80; 95% CI, 0.69-0.92). The risks of hemorrhagic stroke (HR, 0.93; 95% CI, 0.65-1.34), MI (HR, 1.04; 95% CI, 0.60-1.82), and all-cause deaths (HR, 0.94; 95% CI, 0.81-1.10) did not differ significantly between the 2 treatment groups. There were no significant differences between the 2 treatment groups in the frequencies of adverse events. CONCLUSIONS AND RELEVANCE:Among adults with hypertension in China without a history of stroke or MI, the combined use of enalapril and folic acid, compared with enalapril alone, significantly reduced the risk of first stroke. These findings are consistent with benefits from folate use among adults with hypertension and low baseline folate levels. TRIAL REGISTRATION:clinicaltrials.gov Identifier: NCT00794885. 10.1001/jama.2015.2274
First trimester fetal growth restriction and cardiovascular risk factors in school age children: population based cohort study. Jaddoe Vincent W V,de Jonge Layla L,Hofman Albert,Franco Oscar H,Steegers Eric A P,Gaillard Romy BMJ (Clinical research ed.) OBJECTIVE:To examine whether first trimester fetal growth restriction correlates with cardiovascular outcomes in childhood. DESIGN:Population based prospective cohort study. SETTING:City of Rotterdam, the Netherlands. PARTICIPANTS:1184 children with first trimester fetal crown to rump length measurements, whose mothers had a reliable first day of their last menstrual period and a regular menstrual cycle. MAIN OUTCOME MEASURES:Body mass index, total and abdominal fat distribution, blood pressure, and blood concentrations of cholesterol, triglycerides, insulin, and C peptide at the median age of 6.0 (90% range 5.7-6.8) years. Clustering of cardiovascular risk factors was defined as having three or more of: high android fat mass; high systolic or diastolic blood pressure; low high density lipoprotein cholesterol or high triglycerides concentrations; and high insulin concentrations. RESULTS:One standard deviation score greater first trimester fetal crown to rump length was associated with a lower total fat mass (-0.30%, 95% confidence interval -0.57% to -0.03%), android fat mass (-0.07%, -0.12% to -0.02%), android/gynoid fat mass ratio (-0.53, -0.89 to -0.17), diastolic blood pressure (-0.43, -0.84 to -0.01, mm Hg), total cholesterol (-0.05, -0.10 to 0, mmol/L), low density lipoprotein cholesterol (-0.04, -0.09 to 0, mmol/L), and risk of clustering of cardiovascular risk factors (relative risk 0.81, 0.66 to 1.00) in childhood. Additional adjustment for gestational age and weight at birth changed these effect estimates only slightly. Childhood body mass index fully explained the associations of first trimester fetal crown to rump length with childhood total fat mass.
First trimester fetal growth was not associated with other cardiovascular outcomes. Longitudinal growth analyses showed that compared with school age children without clustering of cardiovascular risk factors, those with clustering had a smaller first trimester fetal crown to rump length and lower second and third trimester estimated fetal weight but higher weight growth from the age of 6 months onwards. CONCLUSIONS:Impaired first trimester fetal growth is associated with an adverse cardiovascular risk profile in school age children. Early fetal life might be a critical period for cardiovascular health in later life. 10.1136/bmj.g14
Association of HDL cholesterol efflux capacity with incident coronary heart disease events: a prospective case-control study. Saleheen Danish,Scott Robert,Javad Sundas,Zhao Wei,Rodrigues Amrith,Picataggi Antonino,Lukmanova Daniya,Mucksavage Megan L,Luben Robert,Billheimer Jeffery,Kastelein John J P,Boekholdt S Matthijs,Khaw Kay-Tee,Wareham Nick,Rader Daniel J The lancet. Diabetes & endocrinology BACKGROUND:Although HDL cholesterol concentrations are strongly and inversely associated with risk of coronary heart disease, interventions that raise HDL cholesterol do not reduce risk of coronary heart disease. HDL cholesterol efflux capacity-a prototypical measure of HDL function-has been associated with coronary heart disease after adjusting for HDL cholesterol, but its effect on incident coronary heart disease risk is uncertain. METHODS:We measured cholesterol efflux capacity and assessed its relation with vascular risk factors and incident coronary heart disease events in a nested case-control sample from the prospective EPIC-Norfolk study of 25 639 individuals aged 40-79 years, assessed in 1993-97 and followed up to 2009. We quantified cholesterol efflux capacity in 1745 patients with incident coronary heart disease and 1749 control participants free of any cardiovascular disorders by use of a validated ex-vivo radiotracer assay that involved incubation of cholesterol-labelled J774 macrophages with apoB-depleted serum from study participants. FINDINGS:Cholesterol efflux capacity was positively correlated with HDL cholesterol concentration (r=0·40; p<0·0001) and apoA-I concentration (r=0·22; p<0·0001). It was also inversely correlated with type 2 diabetes (r=-0·18; p<0·0001) and positively correlated with alcohol consumption (r=0·12; p<0·0001). 
In analyses comparing the top and bottom tertiles, cholesterol efflux capacity was significantly and inversely associated with incident coronary heart disease events, independent of age, sex, diabetes, hypertension, smoking and alcohol use, waist:hip ratio, BMI, LDL cholesterol concentration, log-triglycerides, and HDL cholesterol or apoA-I concentrations (odds ratio 0·64, 95% CI 0·51-0·80). After a similar multivariable adjustment, the odds ratio for incident coronary heart disease was 0·80 (95% CI 0·70-0·90) per SD increase in cholesterol efflux capacity. INTERPRETATION:HDL cholesterol efflux capacity might provide an alternative mechanism for therapeutic modulation of the HDL pathway beyond HDL cholesterol concentration to help reduce risk of coronary heart disease. FUNDING:US National Institutes of Health, UK Medical Research Council, Cancer Research UK. 10.1016/S2213-8587(15)00126-6
Association of DDT with spontaneous abortion: a case-control study. Korrick S A,Chen C,Damokosh A I,Ni J,Liu X,Cho S I,Altshul L,Ryan L,Xu X Annals of epidemiology PURPOSE:Spontaneous abortion (SAB), the most common adverse pregnancy outcome, affects approximately 15% of clinically recognized pregnancies. Except for advanced maternal age and smoking, there are no well-established risk factors for SAB. Animal models associate increased fetal resorption or abortion with exposure to the pesticide dichlorodiphenyl trichloroethane (DDT), but epidemiologic investigations of DDT and SAB are inconsistent. We undertook a pilot investigation of the hypothesized association of DDT with SAB. METHODS:Participants in this case-control study were selected from a longitudinal study of reproductive effects of rotating shifts among female Chinese textile workers who were married, ages 22-34, nulliparous without history of SAB or infertility, and planning pregnancy. From 412 pregnancies, 42 of which ended in SAB, 15 SAB cases and 15 full-term controls were randomly selected and phlebotomized. Serum was analyzed for p,p'-DDT, o,p'-DDT, their metabolites (DDE and DDD), and other organochlorines including polychlorinated biphenyls. RESULTS:Cases and controls were nonsmokers and did not differ in age (mean 25 years), body mass index (BMI), passive smoke exposure, or workplace exposures. Cases had significantly (p < 0.05) higher serum levels of p,p'-DDE (22 vs. 12 ng/g) and o,p'-DDE (0.09 vs. 0.05 ng/g) than controls. After adjustment for age and BMI, each ng/g serum increase in p,p'-DDE was associated with a 1.13-fold (95% CI, 1.02-1.26) increase in the odds of SAB. With adjustment of serum DDE levels for excretion via breastfeeding, the DDE-associated increased odds of SAB remained significant with up to 7% declines in maternal serum DDE levels for each month of breastfeeding. CONCLUSIONS:A potential increased risk of SAB is associated with higher maternal serum DDE levels. 10.1016/s1047-2797(01)00239-3
The effects of intraoperative cryoprecipitate transfusion on acute renal failure following orthotopic liver transplantation. Liu Shuang,Wang Xiaoliang,Lu Yuanshan,Li Tao,Gong Zijun,Sheng Tao,Hu Bin,Peng Zhihai,Sun Xing Hepatology international PURPOSE:The definition of risk factors associated with acute renal failure (ARF) following orthotopic liver transplantation (OLT) is still controversial. Cryoprecipitate, which can supply fibrinogen and other coagulation factors, is widely used in OLT. However, the effects of intraoperative cryoprecipitate transfusion on ARF following OLT remain unclear. METHODS:In a series of 389 adult patients who received grafts from deceased donors and underwent their first OLT, the clinical correlation between intraoperative cryoprecipitate transfusion and ARF following OLT was retrospectively studied after adjusting for potential confounders. The distribution of ARF and the causes of death within the first year after OLT were also compared separately in patients with and without cryoprecipitate transfusion. RESULTS:The incidence of ARF in patients with cryoprecipitate transfusion was significantly higher than in patients without cryoprecipitate transfusion (15.9% vs. 7.8%, p = 0.012). A nonlinear relationship between intraoperative cryoprecipitate transfusion and ARF following OLT was observed. The risk of ARF increased with the cryoprecipitate transfusion level up to the turning point (16 U) (adjusted OR 1.1, 95% CI 1.1-1.2; p < 0.001). When the cryoprecipitate level exceeded 16 U, the level of cryoprecipitate transfusion was not associated with the risk of ARF (OR 0.95, 95% CI 0.85-1.1; p = 0.319). Deaths within the first year after the operation occurred more frequently in cases with cryoprecipitate transfusion (22.9% vs. 14.2%, p = 0.029). CONCLUSIONS:These findings suggested that intraoperative cryoprecipitate transfusion is associated with ARF following OLT.
Cryoprecipitate transfusion during OLT should be performed carefully until more convincing evidence has been found. 10.1007/s12072-013-9457-9
Fluoroquinolone use and risk of aortic aneurysm and dissection: nationwide cohort study. Pasternak Björn,Inghammar Malin,Svanström Henrik BMJ (Clinical research ed.) OBJECTIVE:To investigate whether oral fluoroquinolone use is associated with an increased risk of aortic aneurysm or dissection. DESIGN:Nationwide historical cohort study using linked register data on patient characteristics, filled prescriptions, and cases of aortic aneurysm or dissection. SETTING:Sweden, July 2006 to December 2013. PARTICIPANTS:360 088 treatment episodes of fluoroquinolone use (78% ciprofloxacin) and propensity score matched comparator episodes of amoxicillin use (n=360 088). MAIN OUTCOME MEASURES:Cox regression was used to estimate hazard ratios for a first diagnosis of aortic aneurysm or dissection, defined as admission to hospital or emergency department for, or death due to, aortic aneurysm or dissection, within 60 days from start of treatment. RESULTS:Within the 60 day risk period, the rate of aortic aneurysm or dissection was 1.2 cases per 1000 person years among fluoroquinolone users and 0.7 cases per 1000 person years among amoxicillin users. Fluoroquinolone use was associated with an increased risk of aortic aneurysm or dissection (hazard ratio 1.66 (95% confidence interval 1.12 to 2.46)), with an estimated absolute difference of 82 (95% confidence interval 15 to 181) cases of aortic aneurysm or dissection by 60 days per 1 million treatment episodes. In a secondary analysis, the hazard ratio for the association with fluoroquinolone use was 1.90 (1.22 to 2.96) for aortic aneurysm and 0.93 (0.38 to 2.29) for aortic dissection. CONCLUSIONS:In a propensity score matched cohort, fluoroquinolone use was associated with an increased risk of aortic aneurysm or dissection. This association appeared to be largely driven by aortic aneurysm. 10.1136/bmj.k678
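As a rough consistency check, the absolute difference reported in the fluoroquinolone abstract above can be reproduced from the quoted event rates, assuming an approximately constant event rate over the 60-day window (a simplification; the study itself used Cox regression):

```python
# Rough consistency check of the reported absolute risk difference, assuming a
# constant event rate over the 60-day risk window (a simplification of the
# study's Cox model).

def excess_cases_per_million(rate_exposed, rate_comparator, days):
    """Rates are cases per 1000 person-years; returns excess cases per million episodes."""
    window_years = days / 365.25
    rate_diff = (rate_exposed - rate_comparator) / 1000  # per person-year
    return rate_diff * window_years * 1_000_000

print(round(excess_cases_per_million(1.2, 0.7, 60)))  # close to the reported 82
```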
Comparison of different devices to measure the intraocular pressure in thyroid-associated orbitopathy. Kuebler Aylin Garip,Wiecha Caroline,Reznicek Lukas,Klingenstein Annemarie,Halfter Kathrin,Priglinger Siegfried,Hintschich Christoph Graefe's archive for clinical and experimental ophthalmology = Albrecht von Graefes Archiv fur klinische und experimentelle Ophthalmologie PURPOSE:To evaluate the correlation of intraocular pressure (IOP) measurements taken with the non-contact tonometer Corvis Scheimpflug technology (Corvis ST), Goldmann applanation tonometry (GAT), the ocular response analyzer (ORA), and the iCARE rebound tonometer in patients with thyroid-associated orbitopathy (TAO) and eye-healthy subjects (control group). METHODS:Twenty-nine consecutive patients with TAO (79% female) and 30 eye-healthy subjects (60% female) were included in this prospective, age- and sex-matched study. IOP measurements with Corvis, ORA, GAT, and iCARE, and central corneal thickness (CCT) with Corvis, were obtained from all study participants. RESULTS:The mean age was 51 ± 10 years in patients with TAO and 56 ± 13 years in the control group. The mean IOP measurements with GAT, Corvis, ORA, and iCARE were 15.93 ± 4.42 mmHg, 18.10 ± 7.54 mmHg, 18.40 ± 7.93 mmHg, and 16.61 ± 7.96 mmHg in patients with TAO and 14.52 ± 3.02 mmHg, 14.48 ± 3.38 mmHg, 15.29 ± 4.64 mmHg, and 14.13 ± 3.85 mmHg in the control group (P = 0.157, P = 0.004, P = 0.017, and P = 0.176, respectively). The mean CCT was 547.5 ± 39.2 μm in patients with TAO and 560.8 ± 49.8 μm in the control group (P = 0.261). CONCLUSIONS:The data collected show agreement between the iCARE and GAT IOP measurements in both patients with TAO and eye-healthy subjects. However, the mean IOP measurements with Corvis and ORA were significantly higher in patients with TAO in comparison with the control group (P = 0.044 and P = 0.029, respectively). 10.1007/s00417-019-04367-2
Decreased total iron binding capacity upon intensive care unit admission predicts red blood cell transfusion in critically ill patients. Imaeda Taro,Nakada Taka-Aki,Abe Ryuzo,Oda Shigeto PloS one INTRODUCTION:Red blood cell (RBC) transfusion is associated with poor clinical outcome in critically ill patients. We investigated the predictive value of biomarkers on intensive care unit (ICU) admission for RBC transfusion within 28 days. METHODS:Critically ill patients (n = 175) who were admitted to the ICU with organ dysfunction and an expected stay of ≥ 48 hours, without hemorrhage, were prospectively studied (derivation cohort, n = 121; validation cohort, n = 54). Serum levels of 12 biomarkers (hemoglobin, creatinine, albumin, interleukin-6 [IL-6], erythropoietin, Fe, total iron binding capacity [TIBC], transferrin, ferritin, transferrin saturation, folate, and vitamin B12) were measured upon ICU admission and on days 7, 14, 21, and 28. RESULTS:Among the 12 biomarkers measured upon ICU admission, levels of hemoglobin, albumin, IL-6, TIBC, transferrin, and ferritin were statistically different between the transfusion and non-transfusion groups. Of these 6 biomarkers, TIBC upon ICU admission had the highest area under the curve value (0.835; 95% confidence interval, 0.765-0.906) for predicting RBC transfusion (cut-off value = 234.5 μg/dL; sensitivity = 0.906, specificity = 0.632). This result was confirmed in the validation cohort, in which sensitivity and specificity were 0.888 and 0.694, respectively. Measurement of these biomarkers every seven days revealed that albumin, TIBC, and transferrin were statistically different between groups throughout hospitalization until 28 days. In the validation cohort, patients in the transfusion group had significantly higher serum hepcidin levels than those in the non-transfusion group (P = 0.004). In addition, joint analysis across the derivation and validation cohorts revealed that serum IL-6 levels were higher in the transfusion group (P = 0.0014).
CONCLUSION:Decreased TIBC upon ICU admission has high predictive value for RBC transfusion unrelated to hemorrhage within 28 days. 10.1371/journal.pone.0210067
Smartwatch Algorithm for Automated Detection of Atrial Fibrillation. Bumgarner Joseph M,Lambert Cameron T,Hussein Ayman A,Cantillon Daniel J,Baranowski Bryan,Wolski Kathy,Lindsay Bruce D,Wazni Oussama M,Tarakji Khaldoun G Journal of the American College of Cardiology BACKGROUND:The Kardia Band (KB) is a novel technology that enables patients to record a rhythm strip using an Apple Watch (Apple, Cupertino, California). The band is paired with an app providing automated detection of atrial fibrillation (AF). OBJECTIVES:The purpose of this study was to examine whether the KB could accurately differentiate sinus rhythm (SR) from AF compared with physician-interpreted 12-lead electrocardiograms (ECGs) and KB recordings. METHODS:Consecutive patients with AF presenting for cardioversion (CV) were enrolled. Patients underwent pre-CV ECG along with a KB recording. If CV was performed, a post-CV ECG was obtained along with a KB recording. The KB interpretations were compared to physician-reviewed ECGs. The KB recordings were reviewed by blinded electrophysiologists and compared to ECG interpretations. Sensitivity, specificity, and K coefficient were measured. RESULTS:A total of 100 patients were enrolled (age 68 ± 11 years). Eight patients did not undergo CV as they were found to be in SR. There were 169 simultaneous ECG and KB recordings. Fifty-seven were noninterpretable by the KB. Compared with ECG, the KB interpreted AF with 93% sensitivity, 84% specificity, and a K coefficient of 0.77. Physician interpretation of KB recordings demonstrated 99% sensitivity, 83% specificity, and a K coefficient of 0.83. Of the 57 noninterpretable KB recordings, interpreting electrophysiologists diagnosed AF with 100% sensitivity, 80% specificity, and a K coefficient of 0.74. Among 113 cases where KB and physician readings of the same recording were interpretable, agreement was excellent (K coefficient = 0.88). 
CONCLUSIONS:The KB algorithm for AF detection supported by physician review can accurately differentiate AF from SR. This technology can help screen patients prior to elective CV and avoid unnecessary procedures. 10.1016/j.jacc.2018.03.003
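For readers unfamiliar with the metrics quoted in the Kardia Band abstract above, the following is a minimal sketch of how sensitivity, specificity, and the kappa coefficient are derived from a 2x2 confusion matrix; the counts below are illustrative, not the study's actual data:

```python
# How the reported diagnostic metrics are derived from a 2x2 confusion matrix.
# The counts used below are illustrative only, not the study's actual data.

def diagnostic_metrics(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    observed = (tp + tn) / n              # observed agreement
    # chance agreement expected from the marginal totals (Cohen's kappa)
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (observed - expected) / (1 - expected)
    return sensitivity, specificity, kappa

sens, spec, kappa = diagnostic_metrics(tp=50, fp=8, fn=4, tn=42)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} kappa={kappa:.2f}")
```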
Angiopoietin-like protein 8 in early pregnancy improves the prediction of gestational diabetes. Huang Yun,Chen Xin,Chen Xiaohong,Feng Yu,Guo Heming,Li Sicheng,Dai Ting,Jiang Rong,Zhang Xiaoyan,Fang Chen,Hu Ji Diabetologia AIMS/HYPOTHESIS:Screening high-risk individuals for gestational diabetes mellitus (GDM) in early pregnancy conventionally relies on established maternal risk factors; however, the sensitivity and specificity of these factors are not satisfactory. The present study aimed to determine whether the concentration of angiopoietin-like protein 8 (ANGPTL8), either alone or combined with other risk factors in early pregnancy, could be used to predict subsequent GDM. METHODS:From August 2015 to January 2016, 474 women receiving prenatal care at around 12-16 weeks of gestation were recruited into the study. ANGPTL8 levels were measured at the first prenatal visit. All the participants received a 75 g OGTT during weeks 24-28 of gestation. RESULTS:ANGPTL8 levels in early pregnancy were considerably higher in women who developed GDM than those who maintained normal glucose tolerance (2822 ± 938 vs 2120 ± 1118 pg/ml, respectively; p < 0.0001). Multivariable logistic regression revealed that ANGPTL8 levels were significantly associated with risk of GDM independent of conventional risk factors. In addition, women in the highest quartile of ANGPTL8 concentration had an 8.75-fold higher risk of developing GDM compared with women in the lowest quartile (OR 8.75, 95% CI 2.43, 31.58). More importantly, incorporating ANGPTL8 into the conventional prediction model significantly increased the AUC for prediction of GDM (0.772 vs 0.725; p = 0.019). CONCLUSIONS:Our study suggests that ANGPTL8 levels in early pregnancy are significantly and independently associated with risk of GDM at 24-28 weeks of gestation. Combining ANGPTL8 levels with conventional risk factors could thus improve the prediction of GDM. 10.1007/s00125-017-4505-y
Association between Insulin-Like Growth Factor-1 and Uric Acid in Chinese Children and Adolescents with Idiopathic Short Stature: A Cross-Sectional Study. Wang Panpan,Ji Baolan,Shao Qian,Zhang Mei,Ban Bo BioMed research international OBJECTIVE:The aim of this study was to examine the relationship between insulin-like growth factor-1 (IGF-1) and serum uric acid (UA) in Chinese children and adolescents with idiopathic short stature (ISS). METHODS:A cross-sectional study of 91 Chinese children and adolescents with ISS was performed. Anthropometric measurements and biochemical parameters were tested. The standard deviation score of IGF-1 (IGF-1 SDS) was calculated. RESULTS:A univariate analysis displayed a significant positive correlation between IGF-1 SDS and UA (p = 0.004). In multivariate piecewise linear regression, the levels of IGF-1 SDS increased with the elevation of UA when UA was between 168 μmol/L and 301 μmol/L (β = 0.010, 95% CI 0.004-0.017; p = 0.002). The levels of IGF-1 SDS decreased with the elevation of UA when UA was either less than 168 μmol/L (β = -0.055, 95% CI -0.081 to -0.028; p < 0.001) or more than 301 μmol/L (β = -0.005, 95% CI -0.013 to 0.002; p = 0.174). CONCLUSIONS:This study demonstrated a nonlinear relationship between IGF-1 and UA levels in Chinese children and adolescents with ISS. This finding suggests that either high or low levels of UA may have an adverse effect on IGF-1, whereas appropriate UA levels have a beneficial effect. 10.1155/2018/4259098
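The piecewise-linear model reported in the abstract above can be sketched as follows, using the segment slopes (betas) it reports; the intercept and the choice to join segments continuously at the breakpoints are illustrative assumptions, not taken from the paper:

```python
# Sketch of the reported piecewise-linear relationship between uric acid
# (UA, umol/L) and IGF-1 SDS. Segment slopes are the betas from the abstract;
# the intercept is an arbitrary illustrative value, and the segments are
# joined continuously at the breakpoints.

def igf1_sds(ua, intercept=0.0):
    b_low, b_mid, b_high = -0.055, 0.010, -0.005  # slopes below/between/above breakpoints
    k1, k2 = 168.0, 301.0                         # breakpoints (umol/L)
    if ua < k1:
        return intercept + b_low * (ua - k1)
    if ua < k2:
        return intercept + b_mid * (ua - k1)
    return intercept + b_mid * (k2 - k1) + b_high * (ua - k2)

# Predicted IGF-1 SDS rises between the breakpoints and falls above them.
assert igf1_sds(168) < igf1_sds(235) < igf1_sds(301) > igf1_sds(400)
```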
The combined use of salivary biomarkers and clinical parameters to predict the outcome of scaling and root planing: A cohort study. Liu Yiying,Duan Dingyu,Ma Rui,Ding Yi,Xu Yi,Zhou Xuedong,Zhao Lei,Xu Xin Journal of clinical periodontology AIM:To explore the application of the combined use of baseline salivary biomarkers and clinical parameters in predicting the outcome of scaling and root planing (SRP). MATERIALS AND METHODS:Forty patients with advanced periodontitis were included. Baseline saliva samples were analysed for interleukin-1β (IL-1β), matrix metalloproteinase-8 and the loads of Porphyromonas gingivalis, Prevotella intermedia, Aggregatibacter actinomycetemcomitans and Tannerella forsythia. After SRP, pocket closure and further attachment loss at 6 months post-treatment were chosen as outcome variables. Models to predict the outcomes were established by generalized estimating equations. RESULTS:The combined use of baseline clinical attachment level (CAL), site location and IL-1β (area under the curve [AUC] = 0.764) better predicted pocket closure than probing depth (AUC = 0.672), CAL (AUC = 0.679), site location (AUC = 0.654) or IL-1β (AUC = 0.579) alone. The combination of site location, tooth loss, percentage of deep pockets, detection of A. actinomycetemcomitans and T. forsythia load (AUC = 0.842) better predicted further clinical attachment loss than site location (AUC = 0.715), tooth loss (AUC = 0.530), percentage of deep pockets (AUC = 0.659) or T. forsythia load (AUC = 0.647) alone. CONCLUSION:The combination of baseline salivary biomarkers and clinical parameters better predicted SRP outcomes than each alone. The current study indicates the possible usefulness of salivary biomarkers in addition to tooth-related parameters in predicting SRP outcomes. 10.1111/jcpe.13367
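The AUC values compared in the periodontal abstract above can be computed nonparametrically as the probability that a randomly chosen positive case outscores a randomly chosen negative one; a minimal sketch with hypothetical scores:

```python
# Nonparametric AUC: the probability that a randomly chosen positive case
# receives a higher predicted score than a randomly chosen negative case
# (ties count as half). Scores and labels below are hypothetical.

def auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2]  # model outputs (hypothetical)
labels = [1, 1, 0, 1, 0, 0, 0]                # 1 = pocket closed, 0 = not closed
print(round(auc(scores, labels), 3))  # 0.917
```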
Development and Validation of a Clinical Risk Score to Predict the Occurrence of Critical Illness in Hospitalized Patients With COVID-19. Liang Wenhua,Liang Hengrui,Ou Limin,Chen Binfeng,Chen Ailan,Li Caichen,Li Yimin,Guan Weijie,Sang Ling,Lu Jiatao,Xu Yuanda,Chen Guoqiang,Guo Haiyan,Guo Jun,Chen Zisheng,Zhao Yi,Li Shiyue,Zhang Nuofu,Zhong Nanshan,He Jianxing, JAMA internal medicine Importance:Early identification of patients with novel coronavirus disease 2019 (COVID-19) who may develop critical illness is of great importance and may aid in delivering proper treatment and optimizing use of resources. Objective:To develop and validate a clinical score at hospital admission for predicting which patients with COVID-19 will develop critical illness based on a nationwide cohort in China. Design, Setting, and Participants:Collaborating with the National Health Commission of China, we established a retrospective cohort of patients with COVID-19 from 575 hospitals in 31 provincial administrative regions as of January 31, 2020. Epidemiological, clinical, laboratory, and imaging variables ascertained at hospital admission were screened using Least Absolute Shrinkage and Selection Operator (LASSO) and logistic regression to construct a predictive risk score (COVID-GRAM). The score provides an estimate of the risk that a hospitalized patient with COVID-19 will develop critical illness. Accuracy of the score was measured by the area under the receiver operating characteristic curve (AUC). Data from 4 additional cohorts of patients hospitalized with COVID-19 in China were used to validate the score. Data were analyzed between February 20, 2020 and March 17, 2020. Main Outcomes and Measures:Among patients with COVID-19 admitted to the hospital, critical illness was defined as the composite measure of admission to the intensive care unit, invasive ventilation, or death. Results:The development cohort included 1590 patients.
The mean (SD) age of patients in the cohort was 48.9 (15.7) years; 904 (57.3%) were men. The validation cohort included 710 patients with a mean (SD) age of 48.2 (15.2) years, and 382 (53.8%) were men and 172 (24.2%). From 72 potential predictors, 10 variables were independent predictive factors and were included in the risk score: chest radiographic abnormality (OR, 3.39; 95% CI, 2.14-5.38), age (OR, 1.03; 95% CI, 1.01-1.05), hemoptysis (OR, 4.53; 95% CI, 1.36-15.15), dyspnea (OR, 1.88; 95% CI, 1.18-3.01), unconsciousness (OR, 4.71; 95% CI, 1.39-15.98), number of comorbidities (OR, 1.60; 95% CI, 1.27-2.00), cancer history (OR, 4.07; 95% CI, 1.23-13.43), neutrophil-to-lymphocyte ratio (OR, 1.06; 95% CI, 1.02-1.10), lactate dehydrogenase (OR, 1.002; 95% CI, 1.001-1.004), and direct bilirubin (OR, 1.15; 95% CI, 1.06-1.24). The mean AUC in the development cohort was 0.88 (95% CI, 0.85-0.91) and the AUC in the validation cohort was 0.88 (95% CI, 0.84-0.93). The score has been translated into an online risk calculator that is freely available to the public (http://118.126.104.170/). Conclusions and Relevance:In this study, a risk score based on characteristics of patients with COVID-19 at the time of hospital admission was developed; it may help predict a patient's risk of developing critical illness. 10.1001/jamainternmed.2020.2033
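A minimal sketch of how a logistic-regression risk score of this kind combines predictors: each covariate contributes ln(OR) per unit of its variable, and the summed logit maps to a probability. The intercept and the subset of predictors below are hypothetical illustrations; the published online calculator should be used for real estimates.

```python
# Sketch of a logistic-regression risk score: each covariate contributes
# ln(OR) per unit, and the summed logit maps to a probability via the
# logistic function. Intercept and predictor subset are hypothetical.
import math

LN_OR = {
    "xray_abnormality": math.log(3.39),   # chest radiographic abnormality (0/1)
    "age_years": math.log(1.03),          # per year of age
    "dyspnea": math.log(1.88),            # 0/1
    "n_comorbidities": math.log(1.60),    # per comorbidity
}

def critical_illness_risk(features, intercept=-8.0):  # intercept is made up
    logit = intercept + sum(LN_OR[k] * v for k, v in features.items())
    return 1 / (1 + math.exp(-logit))

risk = critical_illness_risk(
    {"xray_abnormality": 1, "age_years": 70, "dyspnea": 1, "n_comorbidities": 2})
print(f"{risk:.3f}")
```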
Acupuncture for Chronic Severe Functional Constipation: A Randomized Trial. Liu Zhishun,Yan Shiyan,Wu Jiani,He Liyun,Li Ning,Dong Guirong,Fang Jianqiao,Fu Wenbin,Fu Lixin,Sun Jianhua,Wang Linpeng,Wang Shun,Yang Jun,Zhang Hongxing,Zhang Jianbin,Zhao Jiping,Zhou Wei,Zhou Zhongyu,Ai Yanke,Zhou Kehua,Liu Jia,Xu Huanfang,Cai Yuying,Liu Baoyan Annals of internal medicine BACKGROUND:Acupuncture has been used for chronic constipation, but evidence for its effectiveness remains scarce. OBJECTIVE:To determine the efficacy of electroacupuncture (EA) for chronic severe functional constipation (CSFC). DESIGN:Randomized, parallel, sham-controlled trial. (ClinicalTrials.gov: NCT01726504). SETTING:15 hospitals in China. PARTICIPANTS:Patients with CSFC and no serious underlying pathologic cause for constipation. INTERVENTION:28 sessions of EA at traditional acupoints or sham EA (SA) at nonacupoints over 8 weeks. MEASUREMENTS:The primary outcome was the change from baseline in mean weekly complete spontaneous bowel movements (CSBMs) during weeks 1 to 8. Participants were followed until week 20. RESULTS:1075 patients (536 and 539 in the EA and SA groups, respectively) were enrolled. The increase from baseline in mean weekly CSBMs during weeks 1 to 8 was 1.76 (95% CI, 1.61 to 1.89) in the EA group and 0.87 (CI, 0.73 to 0.97) in the SA group (between-group difference, 0.90 [CI, 0.74 to 1.10]; P < 0.001). The change from baseline in mean weekly CSBMs during weeks 9 to 20 was 1.96 (CI, 1.78 to 2.11) in the EA group and 0.89 (CI, 0.69 to 0.95) in the SA group (between-group difference, 1.09 [CI, 0.94 to 1.31]; P < 0.001). The proportion of patients having 3 or more mean weekly CSBMs in the EA group was 31.3% and 37.7% over the treatment and follow-up periods, respectively, compared with 12.1% and 14.1% in the SA group (P < 0.001). Acupuncture-related adverse events during treatment were infrequent in both groups, and all were mild or transient. 
LIMITATIONS:Longer-term follow-up was not assessed. Acupuncturists could not be blinded. CONCLUSION:Eight weeks of EA increases CSBMs and is safe for the treatment of CSFC. Additional study is warranted to evaluate a longer-term treatment and follow-up. PRIMARY FUNDING SOURCE:Ministry of Science and Technology of the People's Republic of China through the Twelfth Five-Year National Science and Technology Pillar Program. 10.7326/M15-3118
Mediterranean diet intervention in overweight and obese subjects lowers plasma cholesterol and causes changes in the gut microbiome and metabolome independently of energy intake. Meslier Victoria,Laiola Manolo,Roager Henrik Munch,De Filippis Francesca,Roume Hugo,Quinquis Benoit,Giacco Rosalba,Mennella Ilario,Ferracane Rosalia,Pons Nicolas,Pasolli Edoardo,Rivellese Angela,Dragsted Lars Ove,Vitaglione Paola,Ehrlich Stanislav Dusko,Ercolini Danilo, Gut OBJECTIVES:This study aimed to explore the effects of an isocaloric Mediterranean diet (MD) intervention on metabolic health, gut microbiome and systemic metabolome in subjects with lifestyle risk factors for metabolic disease. DESIGN:Eighty-two healthy overweight and obese subjects with a habitually low intake of fruit and vegetables and a sedentary lifestyle participated in a parallel 8-week randomised controlled trial. Forty-three participants consumed an MD tailored to their habitual energy intakes (MedD), and 39 maintained their regular diets (ConD). Dietary adherence, metabolic parameters, gut microbiome and systemic metabolome were monitored over the study period. RESULTS:Increased MD adherence in the MedD group successfully reprogrammed subjects' intake of fibre and animal proteins. Compliance was confirmed by lowered levels of carnitine in plasma and urine. Significant reductions in plasma cholesterol (primary outcome) and faecal bile acids occurred in the MedD compared with the ConD group. Shotgun metagenomics showed gut microbiome changes that reflected individual MD adherence and an increase in gene richness in participants who reduced systemic inflammation over the intervention. The MD intervention led to increased levels of fibre-degrading bacteria and of genes for microbial carbohydrate degradation linked to butyrate metabolism. The dietary changes in the MedD group led to increased urinary urolithins, faecal bile acid degradation and insulin sensitivity that co-varied with specific microbial taxa. 
CONCLUSION:Switching subjects to an MD while maintaining their energy intake reduced their blood cholesterol and caused multiple changes in their microbiome and metabolome that are relevant in future strategies for the improvement of metabolic health. 10.1136/gutjnl-2019-320438
Long-term night shift work is associated with the risk of atrial fibrillation and coronary heart disease. European heart journal AIMS:The aim of this study was to test whether current and past night shift work was associated with incident atrial fibrillation (AF) and whether this association was modified by genetic vulnerability. Its associations with coronary heart disease (CHD), stroke, and heart failure (HF) were measured as a secondary aim. METHODS AND RESULTS:This cohort study included 283 657 participants in paid employment or self-employed without AF and 276 009 participants free of CHD, stroke, and HF at baseline in the UK Biobank. Current and lifetime night shift work information was obtained. Cox proportional hazard models were used. A weighted genetic risk score for AF was calculated. During a median follow-up of 10.4 years, 5777 incident AF cases were documented. From 'day workers', 'shift but never/rarely night shifts', and 'some night shifts' to 'usual/permanent night shifts', there was a significant increasing trend in the risk of incident AF (P for trend 0.013). Usual or permanent night shifts were associated with the highest risk [hazard ratio (HR) 1.16, 95% confidence interval (CI) 1.02-1.32]. Considering a person's lifetime work schedule and compared with shift workers never working nights, participants with a duration over 10 years and an average frequency of 3-8 nights/month of night shift work exposure had a higher AF risk (HR 1.18, 95% CI 0.99-1.40 and HR 1.22, 95% CI 1.02-1.45, respectively). These associations between current and lifetime night shifts and AF were not modified by genetic predisposition to AF. Usual/permanent current night shifts, ≥10 years and 3-8 nights/month of lifetime night shifts were significantly associated with a higher risk of incident CHD (HR 1.22, 95% CI 1.11-1.35, HR 1.37, 95% CI 1.20-1.58 and HR 1.35, 95% CI 1.18-1.55, respectively). The associations with stroke and HF were not significant. 
CONCLUSION:Both current and lifetime night shift exposures were associated with increased AF risk, regardless of genetic AF risk. Night shift exposure also increased the risk of CHD but not stroke or HF. Whether decreasing night shift work frequency and duration might represent another avenue to improve heart health during working life and beyond warrants further study. 10.1093/eurheartj/ehab505