The use of genetic programming to develop a predictor of swash excursion on sandy beaches
NASA Astrophysics Data System (ADS)
Passarella, Marinella; Goldstein, Evan B.; De Muro, Sandro; Coco, Giovanni
2018-02-01
We use genetic programming (GP), a type of machine learning (ML) approach, to predict total and infragravity swash excursion using previously published data sets that have been used extensively in swash prediction studies. Data from three previously published works, covering a range of new conditions, are added to this data set to extend the range of measured swash conditions. Using this newly compiled data set, we demonstrate that a ML approach can reduce prediction errors compared to well-established parameterizations and may therefore improve coastal hazard assessment (e.g. coastal inundation). Predictors obtained using GP can also be physically sound and replicate the functionality and dependencies of previously published formulas. Overall, we show that ML techniques are capable of both improving predictability (compared to classical regression approaches) and providing physical insight into coastal processes.
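As an illustration of the core idea behind GP-based predictor discovery, the loop below evolves symbolic expression trees against synthetic data. It is a minimal sketch only: the data, operator set, and elitist selection scheme are assumptions for demonstration, not the study's actual configuration.

```python
import math
import random

random.seed(0)

OPS = {"+": lambda a, b: a + b,
       "-": lambda a, b: a - b,
       "*": lambda a, b: a * b}

def random_tree(depth=3):
    # Grow a random expression tree over one input variable and constants.
    if depth == 0 or random.random() < 0.3:
        return "x" if random.random() < 0.5 else random.uniform(-2, 2)
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def rmse(tree, xs, ys):
    total = sum((evaluate(tree, x) - y) ** 2 for x, y in zip(xs, ys))
    return math.sqrt(total / len(xs)) if math.isfinite(total) else float("inf")

def mutate(tree):
    # Replace a random subtree with a freshly grown one.
    if not isinstance(tree, tuple) or random.random() < 0.3:
        return random_tree(depth=2)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left), right)
    return (op, left, mutate(right))

def evolve(xs, ys, pop_size=60, generations=40):
    pop = [random_tree() for _ in range(pop_size)]
    history = []
    for _ in range(generations):
        pop.sort(key=lambda t: rmse(t, xs, ys))
        history.append(rmse(pop[0], xs, ys))
        elite = pop[:pop_size // 3]            # elitist truncation selection
        pop = elite + [mutate(random.choice(elite))
                       for _ in range(pop_size - len(elite))]
    pop.sort(key=lambda t: rmse(t, xs, ys))
    return pop[0], history

# Synthetic stand-in for swash observations (a made-up linear relation,
# not the paper's data set).
xs = [i / 10 for i in range(1, 21)]
ys = [1.5 * x + 0.4 for x in xs]
best, history = evolve(xs, ys)
```

Because the elite survive unchanged each generation, the best training error is non-increasing, mirroring how GP gradually refines candidate parameterizations.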
A neural network based reputation bootstrapping approach for service selection
NASA Astrophysics Data System (ADS)
Wu, Quanwang; Zhu, Qingsheng; Li, Peng
2015-10-01
With the concept of service-oriented computing becoming widely accepted in enterprise application integration, more and more computing resources are encapsulated as services and published online. Reputation mechanisms have been studied as a way to establish trust in previously unknown services. One limitation of current reputation mechanisms is that they cannot assess the reputation of newly deployed services, as no record of their previous behaviour exists. Most current bootstrapping approaches merely assign default reputation values to newcomers; however, such methods inevitably favour either newcomers or existing services. In this paper, we present a novel reputation bootstrapping approach in which correlations between the features and performance of existing services are learned through an artificial neural network (ANN) and then generalised to establish a tentative reputation when evaluating new and unknown services. Reputations of services published previously by the same provider, where available, are also incorporated into reputation bootstrapping. The proposed approach is seamlessly embedded into an existing reputation model and implemented in the extended service-oriented architecture. Finally, empirical studies of the proposed approach are presented.
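The feature-to-performance learning step can be sketched with a tiny feed-forward network trained by stochastic gradient descent. Everything below (the two service features, the hidden ground-truth performance function, the network size) is a hypothetical stand-in for demonstration, not the paper's model.

```python
import math
import random

random.seed(1)

# Toy "service" data: each service has two features (e.g. provider tenure,
# price tier -- invented stand-ins), with an observed performance score in
# (0, 1) for already-deployed services.
def performance(f1, f2):
    return 1 / (1 + math.exp(-(1.5 * f1 - 0.8 * f2)))  # hidden ground truth

train = [(random.random(), random.random()) for _ in range(200)]
targets = [performance(f1, f2) for f1, f2 in train]

H = 4  # hidden units
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    hidden = [math.tanh(sum(w * xi for w, xi in zip(ws, x)) + b)
              for ws, b in zip(w1, b1)]
    return sum(w * h for w, h in zip(w2, hidden)) + b2, hidden

def mse():
    return sum((forward(x)[0] - t) ** 2
               for x, t in zip(train, targets)) / len(train)

lr = 0.05
loss_before = mse()
for _ in range(200):
    for x, t in zip(train, targets):
        out, hidden = forward(x)
        err = out - t
        for j in range(H):                    # output-layer gradients
            w2[j] -= lr * err * hidden[j]
        b2 -= lr * err
        for j in range(H):                    # backprop through tanh
            g = err * w2[j] * (1 - hidden[j] ** 2)
            for i in range(2):
                w1[j][i] -= lr * g * x[i]
            b1[j] -= lr * g
loss_after = mse()
```

After training, `forward(features)` yields a tentative performance prediction for a newcomer service, which would seed its bootstrap reputation in place of a fixed default value.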
Using Whole Language Materials in the Adult ESOL Classroom.
ERIC Educational Resources Information Center
Schiffer, Edward W.
A practicum explored the use of instructional materials based on the whole language approach to second language learning in adult English-as-a-Second-Language (ESL) instruction. The approach was implemented in a beginning ESL classroom at an adult education center that had previously used publisher textbooks, which were not thought to provide…
ERIC Educational Resources Information Center
Linnenbrink-Garcia, Lisa; Middleton, Michael J.; Ciani, Keith D.; Easter, Matthew A.; O'Keefe, Paul A.; Zusho, Akane
2012-01-01
In current research on achievement goal theory, most researchers differentiate between performance-approach and performance-avoidance goal orientations. Evidence from prior research and from several previously published data sets is used to highlight that the correlation is often rather large, with a number of studies reporting correlations above…
Frequency domain phase noise analysis of dual injection-locked optoelectronic oscillators.
Jahanbakht, Sajad
2016-10-01
Dual injection-locked optoelectronic oscillators (DIL-OEOs) have been introduced as a means to achieve very low-noise microwave oscillations while avoiding the large spurious peaks that occur in the phase noise of conventional single-loop OEOs. In these systems, two OEOs are inter-injection locked to each other: the OEO with the longer optical fiber delay line is called the master OEO, and the other the slave OEO. Here, a frequency domain approach based on the conversion matrix technique is presented for simulating the phase noise spectrum of each OEO in a DIL-OEO system. The validity of the new approach is verified by comparing its results with previously published data. In the new approach, the power spectral densities (PSDs) of two white and 1/f noise sources in each OEO are first optimized so that the simulated free-running phase noise of the master or slave OEO matches its measured phase noise. The proposed approach is then able to simulate the phase noise PSD of both OEOs in the injection-locked state. Because of its short run time, especially compared to previously proposed time domain approaches, the new approach is suitable for optimizing the power injection ratios (PIRs), and potentially other circuit parameters, to achieve good phase noise performance in each OEO. Through various numerical simulations, the optimum PIRs for achieving good phase noise performance are presented and discussed; they agree with previously published results, further verifying the applicability of the new approach. Some additional results regarding spur levels are also presented.
A Strategic Culture Assessment of the Transatlantic Divide
2008-03-01
security divide through the strategic culture lens, taking a comparative case study approach. It analyzes the emergent EU strategic culture by looking…utilize the strategic culture approach in the ensuing case study comparisons. B. WHY THE USE OF STRATEGIC CULTURE? In a study published in 2004…analysis use a comparative cultural approach when a previous comparison of U.S. and EU behavior found these actors’ behavior most aligned with realism’s
pH-dependent surface charging and points of zero charge. IV. Update and new approach.
Kosmulski, Marek
2009-09-15
The recently published points of zero charge (PZC) and isoelectric points (IEPs) of various materials are compiled to update the previous compilation [M. Kosmulski, Surface Charging and Points of Zero Charge, CRC Press, Boca Raton, FL, 2009]. Unlike in previous compilations by the same author [Chemical Properties of Material Surfaces, Dekker, New York, 2001; J. Colloid Interface Sci. 253 (2002) 77; J. Colloid Interface Sci. 275 (2004) 214; J. Colloid Interface Sci. 298 (2006) 730], the materials are sorted not only by chemical formula, but also by specific product, that is, by brand name (commercially available materials) and by recipe (home-synthesized materials). This new approach indicated that the relatively consistent PZC/IEP values reported in the literature for materials having the same chemical formula are due to a biased choice of specimens to be studied. Specimens whose PZC/IEP is close to the "recommended" value are selected more often than other specimens (PZC/IEP not reported before, or PZC/IEP reported but different from the "recommended" value). Thus, the previously published PZC/IEP act as a self-fulfilling prophecy.
ERIC Educational Resources Information Center
Deschesnes, Marthe; Drouin, Nathalie; Tessier, Caroline; Couturier, Yves
2014-01-01
Purpose: The purpose of this paper is to understand how a Canadian intervention based on a professional development (PD) model did or did not influence schools' capacities to absorb a Healthy School (HS) approach into their operations. This study is the second part of a research project: previously published results regarding this research…
Combined PEST and Trial-Error approach to improve APEX calibration
USDA-ARS?s Scientific Manuscript database
The Agricultural Policy Environmental eXtender (APEX), a physically based hydrologic model that simulates management impacts on the environment for small watersheds, requires a better understanding of its input parameters for improved simulations. However, most previously published studies used the ...
New, national bottom-up estimate for tree-based biological ...
Nitrogen is a limiting nutrient in many ecosystems, but is also a chief pollutant from human activity. Quantifying human impacts on the nitrogen cycle and investigating natural ecosystem nitrogen cycling both require an understanding of the magnitude of nitrogen inputs from biological nitrogen fixation (BNF). A bottom-up approach to estimating BNF—scaling rates up from measurements to broader scales—is attractive because it is rooted in actual BNF measurements. However, bottom-up approaches have been hindered by scaling difficulties, and a recent top-down approach suggested that the previous bottom-up estimate was much too large. Here, we used a bottom-up approach for tree-based BNF, overcoming scaling difficulties with the systematic, immense (>70,000 N-fixing trees) Forest Inventory and Analysis (FIA) database. We employed two approaches to estimate species-specific BNF rates: published ecosystem-scale rates (kg N ha-1 yr-1) and published estimates of the percent of N derived from the atmosphere (%Ndfa) combined with FIA-derived growth rates. Species-specific rates can vary for a variety of reasons, so for each approach we examined how different assumptions influenced our results. Specifically, we allowed BNF rates to vary with stand age, N-fixer density, and canopy position (since N-fixation is known to require substantial light). Our estimates from this bottom-up technique are several orders of magnitude lower than previous estimates, indicating…
Identifying Careless Responses in Survey Data
ERIC Educational Resources Information Center
Meade, Adam W.; Craig, S. Bartholomew
2012-01-01
When data are collected via anonymous Internet surveys, particularly under conditions of obligatory participation (such as with student samples), data quality can be a concern. However, little guidance exists in the published literature regarding techniques for detecting careless responses. Previously, several potential approaches have been…
Loh, Tze Ping; Sethi, Sunil Kumar; Metz, Michael Patrick
2015-08-01
To describe the reference intervals and biological variation data for thyrotropin (TSH) and free thyroxine (FT4) in a mixed Asian population using an indirect sampling approach, and to compare them with published reports. TSH and FT4 values of children measured once or twice over a 7-year period (2008-2014) at primary-care and tertiary-care settings were extracted from the laboratory information system. After excluding outliers, age-related reference intervals were derived using the Lambda-Mu-Sigma (LMS) approach, while age-partitioned biological variation data were obtained according to recommendations by Fraser and Harris. Both TSH and FT4 were very high at birth and declined with age. Similarly, within-individual and between-individual biological variations were higher for both TSH and FT4 at birth and also declined with age. Our data were broadly similar to previous studies. Significant heterogeneity in study populations and methods prohibited direct numerical comparison between this and previously published studies. This study fills two important gaps in our knowledge of paediatric thyroid function by reporting the centile trends (and reference values) in a mixed Asian population, as well as providing age-partitioned biological variation data. The variation in published reference intervals highlights the difficulty in harmonising paediatric thyroid reference intervals or recommending universal clinical cut-offs. Published by the BMJ Publishing Group Limited.
Method and Excel VBA Algorithm for Modeling Master Recession Curve Using Trigonometry Approach.
Posavec, Kristijan; Giacopetti, Marco; Materazzi, Marco; Birk, Steffen
2017-11-01
A new method was developed and implemented in an Excel Visual Basic for Applications (VBA) algorithm utilizing trigonometry laws in an innovative way to overlap recession segments of time series and create master recession curves (MRCs). Based on a trigonometry approach, the algorithm horizontally translates succeeding recession segments of a time series, placing their vertex, that is, the highest recorded value of each recession segment, directly onto the appropriate connection line defined by measurement points of the preceding recession segment. The new method and algorithm continue the development of methods and algorithms for the generation of MRCs, where the first published method was based on a multiple linear/nonlinear regression model approach (Posavec et al. 2006). The newly developed trigonometry-based method was tested on real case study examples and compared with the previously published multiple linear/nonlinear regression model-based method. The results show that in some cases, that is, for some time series, the trigonometry-based method creates narrower overlaps of the recession segments, resulting in higher coefficients of determination R², while in other cases the multiple linear/nonlinear regression model-based method remains superior. The Excel VBA algorithm for modeling MRCs using the trigonometry approach is implemented in a spreadsheet tool (MRCTools v3.0, written by and available from Kristijan Posavec, Zagreb, Croatia) containing the previously published VBA algorithms for MRC generation and separation. All algorithms within MRCTools v3.0 are open access and available free of charge, supporting the idea of running science on available, open, and free software. © 2017, National Ground Water Association.
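The horizontal-translation step can be sketched as follows, assuming linear interpolation between measurement points of the preceding segment; the segments are invented water levels, and this is not the MRCTools implementation.

```python
def shift_to_master(master, segment):
    """Horizontally translate `segment` so its vertex (its first, highest
    value) lands on the line connecting the bracketing measurement points
    of `master`. Both are lists of (t, h) pairs with strictly decreasing h."""
    vertex_t, vertex_h = segment[0]
    # Find the pair of master points whose values bracket the vertex.
    for (t0, h0), (t1, h1) in zip(master, master[1:]):
        if h0 >= vertex_h >= h1:
            # Linear interpolation gives the target time on the master curve.
            target_t = t0 + (h0 - vertex_h) / (h0 - h1) * (t1 - t0)
            break
    else:
        target_t = master[-1][0]  # vertex below master range: append at end
    dt = target_t - vertex_t
    return [(t + dt, h) for t, h in segment]

# Two illustrative recession segments (synthetic levels, not real data).
seg_a = [(0, 10.0), (1, 8.0), (2, 6.5), (3, 5.5)]
seg_b = [(0, 7.0), (1, 6.0), (2, 5.2)]

master = list(seg_a)
shifted = shift_to_master(master, seg_b)
master = sorted(master + shifted, key=lambda p: p[0])
```

Here the vertex of `seg_b` (value 7.0) falls between the master points at values 8.0 and 6.5, so the whole segment is shifted to t ≈ 1.67 and merged into the growing master curve.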
Wahman, David G; Speitel, Gerald E; Katz, Lynn E
2017-11-21
Chloramine chemistry is complex, with a variety of reactions occurring in series and parallel and many that are acid or base catalyzed, resulting in numerous rate constants. Bromide presence increases system complexity even further with possible bromamine and bromochloramine formation. Therefore, techniques for parameter estimation must address this complexity through thoughtful experimental design and robust data analysis approaches. The current research outlines a rational basis for constrained data fitting using Brønsted theory, application of the microscopic reversibility principle to reversible acid or base catalyzed reactions, and characterization of the relative significance of parallel reactions using fictive product tracking. This holistic approach was used on a comprehensive and well-documented data set for bromamine decomposition, allowing new interpretations of existing data by revealing that a previously published reaction scheme was not robust; it was not able to describe monobromamine or dibromamine decay outside of the conditions for which it was calibrated. The current research's simplified model (3 reactions, 17 constants) represented the experimental data better than the previously published model (4 reactions, 28 constants). A final model evaluation was conducted based on representative drinking water conditions to determine a minimal model (3 reactions, 8 constants) applicable for drinking water conditions.
NASA Astrophysics Data System (ADS)
Hidaka, Masako
Copyright policies and terms directly affect how journal editors, authors and readers deal with articles and copyrighted materials. However, as previous studies have pointed out, Japanese academic society publishers face difficulties in the licensing process for copyrighted materials. In 2011 we conducted a survey on the “terms and conditions of use” of electronic journals and the licensing practices associated with electronic scholarly materials. The survey showed that commercial publishers provide readers with sufficient guidance on the reuse of copyrighted materials, whereas the guidance offered by Japanese academic societies, which publish journals both in Japanese and in English, tends to be insufficient. Subsequently, English and Japanese templates of “terms and conditions of use” for Japanese academic society publishers were proposed. The templates were developed based on an understanding of the International Association of Scientific, Technical and Medical Publishers' “STM Permissions Guidelines,” which were designed to establish a standard and reasonable approach to granting republication permission across all signatory publishers. The survey also showed that Japanese academic society publishers and commercial publishers face the same issues regarding acceptable use of electronic supplemental materials for journal articles. This issue remains to be solved.
Using Elephant's Toothpaste as an Engaging and Flexible Curriculum Alignment Project
ERIC Educational Resources Information Center
Eldridge, Daniel S.
2015-01-01
There is an increasing focus across all educational sectors to ensure that learning objectives are aligned with learning activities and assessments. An attractive approach previously published is that of curriculum alignment projects. This paper discusses the use of the fun and famous "Elephant's Toothpaste" experiment as a customizable…
Decision-Making Accuracy of CBM Progress-Monitoring Data
ERIC Educational Resources Information Center
Hintze, John M.; Wells, Craig S.; Marcotte, Amanda M.; Solomon, Benjamin G.
2018-01-01
This study examined the diagnostic accuracy associated with decision making as is typically conducted with curriculum-based measurement (CBM) approaches to progress monitoring. Using previously published estimates of the standard errors of estimate associated with CBM, 20,000 progress-monitoring data sets were simulated to model student reading…
Jonsdottir, Helga
2013-03-01
To synthesise findings from previously published studies on the effectiveness of self-management programmes for people with chronic obstructive pulmonary disease. Self-management is a widely valued concept for addressing contemporary issues of chronic health problems, yet findings on self-management programmes for people with chronic obstructive pulmonary disease are inconclusive. The study comprised (1) a review of previously published systematic reviews and (2) an integrative literature review: findings were synthesised from previously published systematic reviews (n = 4) of the effectiveness of self-management programmes for people with chronic obstructive pulmonary disease, and an integrative review was performed on papers published between January 2007 and June 2012 (n = 9). Findings demonstrate that there are few studies on the effectiveness of self-management programmes for people with chronic obstructive pulmonary disease despite more than a decade of research activity. Outcomes of the studies reveal some increase in health-related quality of life and reduction in use of healthcare resources. The methodological approaches vary, and sample sizes are generally small. Families are not acknowledged. Features of patient-centredness exist in self-management programmes, particularly in the more recent articles. The effectiveness of self-management programmes for people with chronic obstructive pulmonary disease remains inconclusive. A reconceptualisation of self-management programmes is called for, with attention to family-centred, holistic and relational care focusing on living with, and minimising the handicapping consequences of, the health problems in their entirety. © 2013 Blackwell Publishing Ltd.
ERIC Educational Resources Information Center
Brown, Steven D.; Tramayne, Selena; Hoxha, Denada; Telander, Kyle; Fan, Xiaoyan; Lent, Robert W.
2008-01-01
This study tested Social Cognitive Career Theory's (SCCT) academic performance model using a two-stage approach that combined meta-analytic and structural equation modeling methodologies. Unbiased correlations obtained from a previously published meta-analysis [Robbins, S. B., Lauver, K., Le, H., Davis, D., & Langley, R. (2004). Do psychosocial…
ERIC Educational Resources Information Center
Tillery, Denise
2001-01-01
Argues that the philosophical hermeneutics of Hans-Georg Gadamer provides a useful theoretical framework from which to discuss ethical issues in the technical communication classroom. Analyzes a previously published case study to demonstrate how hermeneutics can shed light on the ways that writers can be unconscious of ethical problems in their…
ERIC Educational Resources Information Center
Hopfer, S.; Davis, D.; Kam, J. A.; Shin, Y.; Elek, E.; Hecht, M. L.
2010-01-01
This article takes a systematic approach to reviewing substance use prevention programs introduced in elementary school (K-6th grade). Previous studies evaluating such programs among elementary school students showed mixed effects on subsequent substance use and related psychosocial factors. Thirty published evaluation studies of 24 elementary…
Occupational safety and health management in the construction industry: a review.
Jaafar, Mohd Hafiidz; Arifin, Kadir; Aiyub, Kadaruddin; Razman, Muhammad Rizal; Ishak, Muhammad Izzuddin Syakir; Samsurijan, Mohamad Shaharudin
2017-09-11
The construction industry plays a significant role in contributing to the economy and development globally. During the construction process, various hazards, coupled with the unique nature of the industry, contribute to high fatality rates. This review draws on previously published studies and related Malaysian legislation documents. Four main elements that cause occupational accidents and illnesses were identified: human, worksite, management and external. External and management elements are the underlying causes affecting occupational safety and health (OSH), while human and worksite elements are more apparent causes of occupational accidents and illnesses. An effective OSH management approach is required to contain all hazards at construction sites. An approach to OSH management built on the elements of policy, process, personnel and incentive, developed in previous work, is explored. Changes to the sub-elements according to previous studies and the related Malaysian legislation are also covered in this review.
Innovative Approaches to Fuel-Air Mixing and Combustion in Airbreathing Hypersonic Engines
NASA Astrophysics Data System (ADS)
MacLeod, C.
This paper describes some innovative methods for achieving enhanced fuel-air mixing and combustion in Scramjet-like spaceplane engines. A multimodal approach to the problem is discussed; this involves using several concurrent methods of forced mixing. The paper concentrates on Electromagnetic Activation (EMA) and Electrostatic Attraction as suitable techniques for this purpose - although several other potential methods are also discussed. Previously published empirical data is used to draw conclusions about the likely effectiveness of the system and possible engine topologies are outlined.
Assessment of Parkinson’s disease risk loci in Greece
Kara, Eleanna; Xiromerisiou, Georgia; Spanaki, Cleanthe; Bozi, Maria; Koutsis, Georgios; Panas, Marios; Dardiotis, Efthimios; Ralli, Styliani; Bras, Jose; Letson, Christopher; Edsall, Connor; Pliner, Hannah; Arepali, Sampath; Kalinderi, Kallirhoe; Fidani, Liana; Bostanjopoulou, Sevasti; Keller, Margaux F; Wood, Nicholas W; Hardy, John; Houlden, Henry; Stefanis, Leonidas; Plaitakis, Andreas; Hernandez, Dena; Hadjigeorgiou, Georgios M; Nalls, Mike A; Singleton, Andrew B
2013-01-01
Genome wide association studies (GWAS) have been shown to be a powerful approach to identify risk loci for neurodegenerative diseases. Recent GWAS in Parkinson’s disease (PD) have been successful in identifying numerous risk variants pointing to novel pathways potentially implicated in the pathogenesis of PD. Contributing to these GWAS efforts, we performed genotyping of previously identified risk alleles in PD patients and controls from Greece. We showed that previously published risk profiles for Northern European and American populations are also applicable to the Greek population. In addition, while we were largely underpowered to detect individual associations we replicated 5 of 32 previously published risk variants with nominal p-values <0.05. Genome-wide complex trait analysis (GCTA) revealed that known risk loci explain disease risk in 1.27% of Greek PD patients. Collectively, these results indicate that there is likely a substantial genetic component to PD in Greece similarly to other worldwide populations that remains to be discovered. PMID:24080174
A rapid and rational approach to generating isomorphous heavy-atom phasing derivatives.
Lu, Jinghua; Sun, Peter D
2014-09-01
In attempts to replace the conventional trial-and-error heavy-atom derivative search method with a rational approach, we previously defined heavy metal compound reactivity against peptide ligands. Here, we assembled a composite pH- and buffer-dependent peptide reactivity profile for each heavy metal compound to guide rational heavy-atom derivative search. When knowledge of the best-reacting heavy-atom compound is combined with mass spectrometry assisted derivatization, and with a quick-soak method to optimize phasing, it is likely that the traditional heavy-atom compounds could meet the demand of modern high-throughput X-ray crystallography. As an example, we applied this rational heavy-atom phasing approach to determine a previously unknown mouse serum amyloid A2 crystal structure. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
Bayesian estimation of the discrete coefficient of determination.
Chen, Ting; Braga-Neto, Ulisses M
2016-12-01
The discrete coefficient of determination (CoD) measures the nonlinear interaction between discrete predictor and target variables and has had far-reaching applications in Genomic Signal Processing. Previous work has addressed the inference of the discrete CoD using classical parametric and nonparametric approaches. In this paper, we introduce a Bayesian framework for the inference of the discrete CoD. We derive analytically the optimal minimum mean-square error (MMSE) CoD estimator, as well as a CoD estimator based on the Optimal Bayesian Predictor (OBP). For the latter estimator, exact expressions for its bias, variance, and root-mean-square (RMS) are given. The accuracy of both Bayesian CoD estimators with non-informative and informative priors, under fixed or random parameters, is studied via analytical and numerical approaches. We also demonstrate the application of the proposed Bayesian approach in the inference of gene regulatory networks, using gene-expression data from a previously published study on metastatic melanoma.
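The flavor of the Bayesian inference can be sketched for a binary predictor/target pair: place a flat Dirichlet prior on the joint distribution, sample from the posterior, and average the CoD over the samples as a Monte Carlo stand-in for the MMSE estimator (the paper derives this analytically; the counts below are invented illustrations).

```python
import random

random.seed(2)

# Joint counts n[x][y] for a binary predictor X and target Y
# (synthetic observations standing in for discretized expression data).
counts = [[30, 5],   # X = 0
          [8, 25]]   # X = 1

def dirichlet(alphas):
    # Sample a probability vector from a Dirichlet via gamma draws.
    draws = [random.gammavariate(a, 1.0) for a in alphas]
    s = sum(draws)
    return [d / s for d in draws]

def cod(p00, p01, p10, p11):
    # CoD = 1 - (optimal error with predictor) / (optimal error without it).
    py0, py1 = p00 + p10, p01 + p11
    eps0 = min(py0, py1)                      # best error ignoring X
    eps_x = min(p00, p01) + min(p10, p11)     # best error using X
    return 1 - eps_x / eps0 if eps0 > 0 else 0.0

# Posterior is Dirichlet(counts + 1) under a flat prior; the MMSE estimate
# is the posterior mean of the CoD, approximated here by Monte Carlo.
alphas = [counts[0][0] + 1, counts[0][1] + 1,
          counts[1][0] + 1, counts[1][1] + 1]
samples = [cod(*dirichlet(alphas)) for _ in range(4000)]
cod_mmse = sum(samples) / len(samples)
```

With these counts the plug-in CoD is roughly 0.55, and the posterior-mean estimate lands nearby; with sparse counts the two can diverge, which is where the Bayesian treatment earns its keep.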
NASA Astrophysics Data System (ADS)
Stephan, E.; Sivaraman, C.
2016-12-01
The Web brought together science communities creating collaborative opportunities that were previously unimaginable. This was due to the novel ways technology enabled users to share information that would otherwise not be available. This means that data and software that previously could not be discovered without direct contact with data or software creators can now be downloaded with the click of a mouse button, and the same products can now outlive the lifespan of their research projects. While in many ways these technological advancements provide benefit to collaborating scientists, a critical producer-consumer knowledge gap is created when collaborating scientists rely solely on web sites, web browsers, or similar technology to exchange services, software, and data. Without some best practices and common approaches from Web publishers, collaborating scientific consumers have no inherent way to trust the results or other products being shared, producers have no way to convey their scientific credibility, and publishers risk obscurity where data is hidden in the deep Web. By leveraging recommendations from the W3C Data Activity, scientific communities can adopt best practices for data publication enabling consumers to explore, reuse, reproduce, and contribute their knowledge about the data. This talk will discuss the application of W3C Data on the Web Best Practices in support of published earth science data and feature the Data Usage Vocabulary.
Full Inclusion and the REI: A Reply to Thousand and Villa.
ERIC Educational Resources Information Center
Jenkins, Joseph R.; Pious, Constance G.
1991-01-01
This reply to a commentary (EC 600 858) on a previously published paper (EC 230 267) dealing with the regular education initiative (REI) argues that a critical element in managing mainstream classrooms is the use of a team approach, that existing visions of the future are tenuous, and that integrated student placement is a preferred condition but not…
Characterizing historical and modern fire regimes in Michigan (USA): A landscape ecosystem approach
David T. Cleland; Thomas R. Crow; Sari C. Saunders; Donald I. Dickmann; Ann L. Maclean; James K. Jordan; Richard L. Watson; Alyssa M. Sloan; Kimberely D. Brosofske
2004-01-01
We studied the relationships of landscape ecosystems to historical and contemporary fire regimes across 4.3 million hectares in northern lower Michigan (USA). Changes in fire regimes were documented by comparing historical fire rotations in different landscape ecosystems to those occurring between 1985 and 2000. Previously published data and a synthesis of the...
Researcher Creations? The Positioning of Policy Texts in Higher Education Research
ERIC Educational Resources Information Center
Ashwin, Paul; Smith, Karen
2015-01-01
In this article we explore the way in which policy texts are positioned in a selection of higher education journal articles. Previous research has suggested that policy implementation studies have taken an uncritical approach to researching policies. Based on an analysis of articles published in higher education and policy journals in 2011, we…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-21
... previously by the FAA in a Flight Data Center (FDC) Notice to Airmen (NOTAM) as an emergency action of immediate flight safety relating directly to published aeronautical charts. The circumstances which created... Locks, CT, Bradley Intl, RNAV (RNP) Z RWY 15, Orig-A Orlando, FL, Kissimmee Gateway, ILS OR LOC RWY 15...
Lemoine, E; Merceron, D; Sallantin, J; Nguifo, E M
1999-01-01
This paper describes a new approach to problem solving that splits up the component parts of a problem between software and hardware. Our main idea arises from the combination of two previously published works. The first proposed a conceptual environment for concept modelling in which the machine and the human expert interact. The second reported an algorithm based on a reconfigurable hardware system that outperforms any previously published genetic database scanning hardware or algorithms. Here we show how efficient the interaction between the machine and the expert is when the concept modelling is based on a reconfigurable hardware system. Their cooperation is thus achieved at real-time interaction speed. The designed system has been partially applied to the recognition of primate splice junction sites in genetic sequences.
Data-driven mapping of hypoxia-related tumor heterogeneity using DCE-MRI and OE-MRI.
Featherstone, Adam K; O'Connor, James P B; Little, Ross A; Watson, Yvonne; Cheung, Sue; Babur, Muhammad; Williams, Kaye J; Matthews, Julian C; Parker, Geoff J M
2018-04-01
Previous work has shown that combining dynamic contrast-enhanced (DCE)-MRI and oxygen-enhanced (OE)-MRI binary enhancement maps can identify tumor hypoxia. The current work proposes a novel, data-driven method for mapping tissue oxygenation and perfusion heterogeneity, based on clustering DCE/OE-MRI data. DCE-MRI and OE-MRI were performed on nine U87 (glioblastoma) and seven Calu6 (non-small cell lung cancer) murine xenograft tumors. Area under the curve and principal component analysis features were calculated and clustered separately using Gaussian mixture modelling. Evaluation metrics were calculated to determine the optimum feature set and cluster number. Outputs were quantitatively compared with a previous non-data-driven approach. The optimum method located six robustly identifiable clusters in the data, yielding tumor region maps with spatially contiguous regions in a rim-core structure, suggesting a biological basis. Mean within-cluster enhancement curves showed physiologically distinct, intuitive enhancement kinetics. Regions of DCE/OE-MRI enhancement mismatch were located, and voxel categorization agreed well with the previous non-data-driven approach (Cohen's kappa = 0.61, proportional agreement = 0.75). The proposed method locates similar regions to the previously published method of binarizing DCE/OE-MRI enhancement, but renders a finer segmentation of intra-tumoral oxygenation and perfusion. This could aid in understanding the tumor microenvironment and its heterogeneity. Magn Reson Med 79:2236-2245, 2018. © 2017 The Authors Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
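The clustering pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' code: the enhancement curves are synthetic, and the feature set (AUC plus three principal components) and the BIC-based choice of cluster number are assumptions standing in for the paper's evaluation metrics.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
curves = rng.normal(size=(500, 40))      # 500 voxels x 40 time points (synthetic)

# Features per voxel: area under the enhancement curve + principal components.
auc = curves.sum(axis=1, keepdims=True)
pcs = PCA(n_components=3).fit_transform(curves)
features = np.hstack([auc, pcs])

# Fit Gaussian mixtures over a range of cluster numbers and pick one
# by BIC (standing in for the paper's evaluation metrics).
models = {k: GaussianMixture(n_components=k, random_state=0).fit(features)
          for k in range(2, 8)}
best_k = min(models, key=lambda k: models[k].bic(features))
labels = models[best_k].predict(features)   # per-voxel cluster assignments
```

Mapping `labels` back onto the voxel grid would give the tumor region maps described in the abstract.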
Klein, Hans-Ulrich; Ruckert, Christian; Kohlmann, Alexander; Bullinger, Lars; Thiede, Christian; Haferlach, Torsten; Dugas, Martin
2009-12-15
Multiple gene expression signatures derived from microarray experiments have been published in the field of leukemia research. A comparison of these signatures with results from new experiments is useful for verification as well as for interpretation of the results obtained. Currently, the percentage of overlapping genes is frequently used to compare published gene signatures against a signature derived from a new experiment. However, it has been shown that the percentage of overlapping genes is of limited use for comparing two experiments due to the variability of gene signatures caused by different array platforms or assay-specific influencing parameters. Here, we present a robust approach for a systematic and quantitative comparison of published gene expression signatures with an exemplary query dataset. A database storing 138 leukemia-related published gene signatures was designed. Each gene signature was manually annotated with terms according to a leukemia-specific taxonomy. Two analysis steps are implemented to compare a new microarray dataset with the results from previous experiments stored and curated in the database. First, the global test method is applied to assess gene signatures and to constitute a ranking among them. In a subsequent analysis step, the focus is shifted from single gene signatures to chromosomal aberrations or molecular mutations as modeled in the taxonomy. Potentially interesting disease characteristics are detected based on the ranking of gene signatures associated with these aberrations stored in the database. Two example analyses are presented. An implementation of the approach is freely available as a web-based application. The presented approach helps researchers to systematically integrate the knowledge derived from numerous microarray experiments into the analysis of a new dataset. Using example leukemia datasets, we demonstrate that this approach detects related experiments as well as related molecular mutations and may help to interpret new microarray data.
Effects of sediment supply on surface textures of gravel-bed rivers
John M. Buffington; David R. Montgomery
1999-01-01
Using previously published data from flume studies, we test a new approach for quantifying the effects of sediment supply (i.e., bed material supply) on surface grain size of equilibrium gravel channels. Textural response to sediment supply is evaluated relative to a theoretical prediction of competent median grain size (D*50). We find that surface median grain size (...
A Systematic Review of Research on Teaching English Language Skills for Saudi EFL Students
ERIC Educational Resources Information Center
Alsowat, Hamad H.
2017-01-01
This systematic review study sought to examine the teaching of English language skills in Saudi Arabia by systematically analyzing previous studies on language skills published within the past ten years, and to identify the research areas to be bridged in the future. The study employed the systematic review approach. The search strategy…
Bouchard, M
2001-01-01
In recent years, a few articles describing the use of neural networks for nonlinear active control of sound and vibration were published. Using a control structure with two multilayer feedforward neural networks (one as a nonlinear controller and one as a nonlinear plant model), steepest descent algorithms based on two distinct gradient approaches were introduced for the training of the controller network. The two gradient approaches were sometimes called the filtered-x approach and the adjoint approach. Some recursive-least-squares algorithms were also introduced, using the adjoint approach. In this paper, a heuristic procedure is introduced for the development of recursive-least-squares algorithms based on the filtered-x and the adjoint gradient approaches. This leads to the development of new recursive-least-squares algorithms for the training of the controller neural network in the two-network structure. These new algorithms produce a better convergence performance than previously published algorithms. Differences in the performance of algorithms using the filtered-x and the adjoint gradient approaches are discussed in the paper. The computational load of the algorithms discussed in the paper is evaluated for multichannel systems of nonlinear active control. Simulation results are presented to compare the convergence performance of the algorithms, showing the convergence gain provided by the new algorithms.
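The neural-network algorithms discussed above generalize the classical linear filtered-x update, in which the reference signal is first filtered through a model of the secondary plant before being used to adapt the controller. A minimal linear filtered-x LMS sketch (not the paper's recursive-least-squares or neural algorithms; all signals and path models are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
n, taps = 4000, 8
x = rng.normal(size=n)                       # reference signal
primary = np.array([0.5, -0.4, 0.2])         # invented primary path
secondary = np.array([0.6, 0.3, 0.1])        # invented secondary-path model
d = np.convolve(x, primary, mode="full")[:n]       # disturbance at the error sensor
xf = np.convolve(x, secondary, mode="full")[:n]    # filtered-x reference

w = np.zeros(taps)                           # FIR controller weights
y = np.zeros(n)                              # controller output
e = np.zeros(n)                              # residual error
mu = 0.01
for i in range(taps, n):
    y[i] = w @ x[i - taps + 1:i + 1][::-1]
    # error = disturbance + secondary path acting on recent controller output
    e[i] = d[i] + secondary @ y[i - 2:i + 1][::-1]
    # filtered-x steepest-descent update
    w -= mu * e[i] * xf[i - taps + 1:i + 1][::-1]
```

The residual error power shrinks as the controller converges; the paper's contribution is replacing this steepest-descent step with recursive-least-squares updates, and the linear filters with neural networks.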
A Review of the Match Technique as Applied to AASE-2/EASOE and SOLVE/THESEO 2000
NASA Technical Reports Server (NTRS)
Morris, Gary A.; Bojkov, Bojan R.; Lait, Leslie R.; Schoeberl, Mark R.; Rex, Markus
2004-01-01
We apply the GSFC trajectory model with a series of ozonesondes to derive ozone loss rates in the lower stratosphere for the AASE-2/EASOE mission (January - March 1992) and for the SOLVE/THESEO 2000 mission (January - March 2000) in an approach similar to Match. Ozone loss rates are computed by comparing the ozone concentrations provided by ozonesondes launched at the beginning and end of the trajectories connecting the launches. We investigate the sensitivity of the Match results to the various parameters used to reject potential matches in the original Match technique and conclude that only a filter based on potential vorticity changes along the calculated back trajectory seems necessary. Our study also demonstrates that calculated ozone loss rates can vary by up to a factor of two depending upon the precise path calculated for each trajectory. As a result, an additional systematic error might need to be added to the statistical uncertainties published with previous Match results. The sensitivity to the trajectory path is particularly pronounced in the month of January, the month during which the largest ozone loss rate discrepancies between photochemical models and Match are found. For most of the two study periods, our ozone loss rates agree with those previously published. Notable exceptions are found for January 1992 at 475 K and late February/early March 2000 at 450 K, both periods during which we find less loss than the previous studies. Integrated ozone loss rates in both years compare well with those found in numerous other studies and in a potential vorticity/potential temperature approach shown previously and in this paper.
Finally, we suggest an alternate approach to Match using trajectory mapping that appears to more accurately reflect the true uncertainties associated with Match and reduces the dependence upon filters that may bias the results of Match through the rejection of greater than or equal to 80% of the matched sonde pairs and >99% of matched observations.
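The core Match computation described above reduces to differencing ozone readings linked by a trajectory and filtering out pairs whose potential vorticity changed too much along the back trajectory. A toy sketch, with invented numbers and an assumed PV tolerance:

```python
# Each "match" pairs two ozonesonde readings connected by a trajectory.
# Fields: (o3_start ppmv, o3_end ppmv, elapsed days, fractional |dPV| along path)
matches = [
    (2.8, 2.6, 5.0, 0.05),
    (3.0, 2.7, 6.0, 0.08),
    (2.9, 2.9, 4.0, 0.40),   # rejected: PV changed too much along the trajectory
]

PV_TOLERANCE = 0.10          # hypothetical rejection threshold

# Loss rate per match = ozone change over elapsed time (negative = loss),
# keeping only matches that pass the potential-vorticity filter.
rates = [(end - start) / days
         for start, end, days, dpv in matches if dpv <= PV_TOLERANCE]
mean_rate = sum(rates) / len(rates)
```

In the real technique the rate is further referenced to sunlit time and aggregated statistically; this sketch only shows the pairing, differencing, and PV filtering steps.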
NASA Technical Reports Server (NTRS)
Tuey, Richard C.; Lane, Robert; Hart, Susan V.
1995-01-01
The NASA Scientific and Technical Information Office was assigned the responsibility to continue with the expansion of the NASA-wide networked electronic duplicating effort by including the Goddard Space Flight Center (GSFC) as an additional node to the existing configuration of networked electronic duplicating systems within NASA. The subject of this report is the evaluation of a networked electronic duplicating system which meets the duplicating requirements and expands electronic publishing capabilities without increasing current operating costs. This report continues the evaluation reported in 'NASA Electronic Publishing System - Electronic Printing and Duplicating Evaluation Report' (NASA TM-106242) and 'NASA Electronic Publishing System - Stage 1 Evaluation Report' (NASA TM-106510). This report differs from the previous reports through the inclusion of an external networked desktop editing, archival, and publishing functionality which did not exist with the previous networked electronic duplicating system. Additionally, a two-phase approach to the evaluation was undertaken: the first phase was a paper study justifying a 90-day, on-site evaluation, and the second phase validated, during the 90-day evaluation, the cost benefits and productivity increases that could be achieved in an operational mode. A benchmark of the functionality of the networked electronic publishing system and external networked desktop editing, archival, and publishing system was performed under a simulated daily production environment. This report can be used to guide others in determining the most cost-effective duplicating/publishing alternative through the use of cost/benefit analysis and return on investment techniques. A treatise on the use of these techniques can be found in 'NASA Electronic Publishing System - Cost/Benefit Methodology' (NASA TM-106662).
2015-01-01
High-density lipoprotein (HDL) retards atherosclerosis by accepting cholesterol from the artery wall. However, the structure of the proposed acceptor, monomeric apolipoprotein A-I (apoA-I), the major protein of HDL, is poorly understood. Two published models for monomeric apoA-I used cross-linking distance constraints to derive best fit conformations. This approach has limitations. (i) Cross-linked peptides provide no information about secondary structure. (ii) A protein chain can be folded in multiple ways to create a best fit. (iii) Ad hoc folding of a secondary structure is unlikely to produce a stable orientation of hydrophobic and hydrophilic residues. To address these limitations, we used a different approach. We first noted that the dimeric apoA-I crystal structure, (Δ185–243)apoA-I, is topologically identical to a monomer in which helix 5 forms a helical hairpin, a monomer with a hydrophobic cleft running the length of the molecule. We then realized that a second crystal structure, (Δ1–43)apoA-I, contains a C-terminal structure that fits snugly via aromatic and hydrophobic interactions into the hydrophobic cleft. Consequently, we combined these crystal structures into an initial model that was subjected to molecular dynamics simulations. We tested the initial and simulated models and the two previously published models in three ways: against two published data sets (domains predicted to be helical by H/D exchange and six spin-coupled residues) and against our own experimentally determined cross-linking distance constraints. We note that the best fit simulation model, superior by all tests to previously published models, has dynamic features of a molten globule with interesting implications for the functions of apoA-I. PMID:25423138
Huntley, Alyson L; King, Anna J L; Moore, Theresa H M; Paterson, Charlotte; Persad, Raj; Sharp, Debbie; Evans, Maggie
2017-01-01
To present a methodological exemplar of integrating findings from a quantitative and a qualitative review on the same topic, providing insight into components of care that contribute to supportive care that is acceptable to men with prostate cancer. Men with prostate cancer are likely to live a long time with the disease, experience side effects from treatment and therefore have ongoing supportive care needs. Quantitative and qualitative reviews have been published but the findings have yet to be integrated. Integration of quantitative and qualitative synthesized evidence. Two previously published systematic reviews. Synthesized evidence on supportive care for men with prostate cancer was integrated from two previously published systematic reviews: a narrative quantitative review and a qualitative review with thematic synthesis. These two streams of synthesized evidence were synthesized using concurrent narrative summary. Data from both reviews were used to develop a set of propositions, from which components of care likely to contribute to supportive care acceptable to men with prostate cancer were identified. Nine propositions were developed which covered men's supportive care, focusing on the role of health professionals. These propositions were used to compose nine components of care likely to lead to supportive care that is acceptable to men with prostate cancer. Some of these components are no/low cost, such as developing a more empathic personalized approach, but more specific approaches need further investigation in randomized controlled trials, for example, online support. This methodological exemplar demonstrates the integration of quantitative and qualitative synthesized data to determine components of care likely to lead to provision of supportive care acceptable to men with prostate cancer. © 2016 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.
Spine lesion analysis in 3D CT data - Reporting on research progress
NASA Astrophysics Data System (ADS)
Jan, Jiri; Chmelik, Jiri; Jakubicek, Roman; Ourednicek, Petr; Amadori, Elena; Gavelli, Giampaolo
2018-04-01
The contribution describes progress in the long-term project concerning automatic diagnosis of spine bone lesions. There are two difficult problems: reliably segmenting possibly severely deformed vertebrae in the spine, and then detecting, segmenting and classifying the lesions, which are often hardly visible, making even medical experts' decisions highly uncertain, with large inter-expert variability. New approaches are described that solve both problems with a success rate acceptable for clinical testing, while substantially speeding up the process compared to the previous stage. The results are compared with previously published achievements.
Measuring fluorescence polarization with a dichrometer.
Sutherland, John C
2017-09-01
A method for obtaining fluorescence polarization data from an instrument designed to measure circular and linear dichroism is compared with a previously reported approach. The new method places a polarizer between the sample and a detector mounted perpendicular to the direction of the incident beam and results in determination of the fluorescence polarization ratio, whereas the previous method does not use a polarizer and yields the fluorescence anisotropy. A similar analysis with the detector located axially with the excitation beam demonstrates that there is no frequency modulated signal due to fluorescence polarization in the absence of a polarizer. Copyright © 2017. Published by Elsevier Inc.
ERIC Educational Resources Information Center
Holtz, Barry W.
2008-01-01
This article responds to three articles in the most recent issue of "The Journal of Jewish Education" (74:1) in which a variety of researchers examined Bible teaching that employed an approach to Bible pedagogy that had been characterized by the present author as "the Contextual orientation" in his previously published book, "Textual Knowledge:…
Spontaneous Intrahepatic Portal Venous Shunt: Presentation and Endovascular Treatment.
Sheth, Nakul; Sabbah, Nathanael; Contractor, Sohail
2016-07-01
Spontaneous intrahepatic portal venous shunts are rare, with only a few case reports published. Treatments using various endovascular techniques have been described, although no single technique has been shown to be preferred. We present a patient who was referred for treatment of a spontaneous portal venous shunt, describe our treatment approach, and present a review of previously reported cases. © The Author(s) 2016.
GRACE time-variable gravity field recovery using an improved energy balance approach
NASA Astrophysics Data System (ADS)
Shang, Kun; Guo, Junyi; Shum, C. K.; Dai, Chunli; Luo, Jia
2015-12-01
A new approach based on the energy conservation principle for satellite gravimetry missions has been developed; it yields more accurate estimation of in situ geopotential difference observables using K-band ranging (KBR) measurements from the Gravity Recovery and Climate Experiment (GRACE) twin-satellite mission. This new approach preserves more gravity information sensed by KBR range-rate measurements and reduces orbit error compared to previous energy balance methods. Results from analysis of 11 yr of GRACE data indicated that the resulting geopotential difference estimates agree well with predicted values from official Level 2 solutions, with a much higher correlation of 0.9, compared to the 0.5-0.8 reported by previously published energy balance studies. We demonstrate that our approach produced a time-variable gravity solution comparable with the Level 2 solutions. The regional GRACE temporal gravity solutions over Greenland reveal that a substantially higher temporal resolution is achievable at 10-d sampling compared to the official monthly solutions, without compromising spatial resolution and without the need for regularization or post-processing.
Paracoccygeal corkscrew approach to ganglion impar injections for tailbone pain.
Foye, Patrick M; Patel, Shounuck I
2009-01-01
A new technique for performing nerve blocks of the ganglion impar (ganglion Walther) is presented. These injections have been reported to relieve coccydynia (tailbone pain), as well as other malignant and nonmalignant pelvic pain syndromes. A variety of techniques have been previously described for blocking this sympathetic nerve ganglion, which is located in the retrorectal space just anterior to the upper coccygeal segments. Prior techniques have included approaches through the anococcygeal ligament, through the sacrococcygeal joint, and through intracoccygeal joint spaces. This article presents a new, paracoccygeal approach whereby the needle is inserted alongside the coccyx and the needle is guided through three discrete steps with a rotating or corkscrew trajectory. Compared with some of the previously published techniques, this paracoccygeal corkscrew approach has multiple potential benefits, including ease of fluoroscopic guidance using the lateral view, ability to easily use a stylet for the spinal needle, and use of a shorter, thinner needle. While no single technique works best for all patients and each technique has potential advantages and disadvantages, this new technique adds to the available options.
Citation of previous meta-analyses on the same topic: a clue to perpetuation of incorrect methods?
Li, Tianjing; Dickersin, Kay
2013-06-01
Systematic reviews and meta-analyses serve as a basis for decision-making and clinical practice guidelines and should be carried out using appropriate methodology to avoid incorrect inferences. We describe the characteristics, statistical methods used for meta-analyses, and citation patterns of all 21 glaucoma systematic reviews we identified pertaining to the effectiveness of prostaglandin analog eye drops in treating primary open-angle glaucoma, published between December 2000 and February 2012. We abstracted data, assessed whether appropriate statistical methods were applied in meta-analyses, and examined citation patterns of included reviews. We identified two forms of problematic statistical analyses in 9 of the 21 systematic reviews examined. Except in 1 case, none of the 9 reviews that used incorrect statistical methods cited a previously published review that used appropriate methods. Reviews that used incorrect methods were cited 2.6 times more often than reviews that used appropriate statistical methods. We speculate that by emulating the statistical methodology of previous systematic reviews, systematic review authors may have perpetuated incorrect approaches to meta-analysis. The use of incorrect statistical methods, perhaps through emulating methods described in previous research, calls conclusions of systematic reviews into question and may lead to inappropriate patient care. We urge systematic review authors and journal editors to seek the advice of experienced statisticians before undertaking or accepting for publication a systematic review and meta-analysis. The author(s) have no proprietary or commercial interest in any materials discussed in this article. Copyright © 2013 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
The cardiac muscle duplex as a method to study myocardial heterogeneity
Solovyova, O.; Katsnelson, L.B.; Konovalov, P.V.; Kursanov, A.G.; Vikulova, N.A.; Kohl, P.; Markhasin, V.S.
2014-01-01
This paper reviews the development and application of paired muscle preparations, called duplex, for the investigation of mechanisms and consequences of intra-myocardial electro-mechanical heterogeneity. We illustrate the utility of the underlying combined experimental and computational approach for conceptual development and integration of basic science insight with clinically relevant settings, using previously published and new data. Directions for further study are identified. PMID:25106702
The Military Applications of Cloud Computing Technologies
2013-05-23
tactical networks will potentially cause some unique issues when implementing the JIE. Tactical networks are temporary in nature, and are utilized...connected ABCS clients will receive software updates and security patches as they are published over the network, rather than catching up after an extended...approach from the previous JNN network model, in that it introduces a limited, wireless capability to a unit's LAN that will enable limited, on-the
Stepping Stones Triple P: the importance of putting the findings into context.
Tellegen, Cassandra L; Sofronoff, Kate
2015-02-04
The Stepping Stones Triple P (SSTP) parenting program is an evidence-based program for parents of children with a disability. A trial of SSTP was recently published in BMC Medicine, which reported results of a randomized controlled trial comparing SSTP to care-as-usual. Although the paper described what should be an important replication trial of SSTP, there are significant shortcomings to the scientific approach of the reporting that need to be addressed. The paper initially cites only a few published SSTP studies and describes evidence for the efficacy of the program as "very scarce". A meta-analysis of studies evaluating SSTP published prior to submission of this paper was not cited. The results are inconsistent with previous evidence for SSTP, yet the authors provide scant interpretation for this inconsistency. Similarly, the unusually high dropout rate of 49% was not adequately explained. The claims that previous research has only been conducted by the developers, has not included children with intellectual disability, and has not used care-as-usual comparison groups, are inaccurate. This commentary explores these issues further in order to place the findings from the recent trial into context.
A method of extracting speed-dependent vector correlations from 2 + 1 REMPI ion images.
Wei, Wei; Wallace, Colin J; Grubb, Michael P; North, Simon W
2017-07-07
We present analytical expressions for extracting Dixon's bipolar moments in the semi-classical limit from experimental anisotropy parameters of sliced or reconstructed non-sliced images. The current method focuses on images generated by 2 + 1 REMPI (Resonance Enhanced Multi-photon Ionization) and is a necessary extension of our previously published 1 + 1 REMPI equations. Two approaches for applying the new equations, direct inversion and forward convolution, are presented. As a demonstration of the new method, bipolar moments were extracted from images of carbonyl sulfide (OCS) photodissociation at 230 nm and NO2 photodissociation at 355 nm, and the results are consistent with previous publications.
McCoy, A B; Wright, A; Krousel-Wood, M; Thomas, E J; McCoy, J A; Sittig, D F
2015-01-01
Clinical knowledge bases of problem-medication pairs are necessary for many informatics solutions that improve patient safety, such as clinical summarization. However, developing these knowledge bases can be challenging. We sought to validate a previously developed crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large, non-university health care system with a widely used, commercially available electronic health record. We first retrieved medications and problems entered in the electronic health record by clinicians during routine care over a six-month study period. Following the previously published approach, we calculated the link frequency and link ratio for each pair, then identified a threshold cutoff for estimated problem-medication pair appropriateness through clinician review; problem-medication pairs meeting the threshold were included in the resulting knowledge base. We selected 50 medications and their gold standard indications to compare the resulting knowledge base to the pilot knowledge base developed previously and determine its recall and precision. The resulting knowledge base contained 26,912 pairs, had a recall of 62.3% and a precision of 87.5%, and outperformed the pilot knowledge base containing 11,167 pairs from the previous study, which had a recall of 46.9% and a precision of 83.3%. We validated the crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large non-university health care system with a widely used, commercially available electronic health record, indicating that the approach may be generalizable across healthcare settings and clinical systems. Further research is necessary to better evaluate the knowledge, to compare crowdsourcing with other approaches, and to evaluate if incorporating the knowledge into electronic health records improves patient outcomes.
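A sketch of the link statistics described above, using toy records. The precise definitions of link frequency and link ratio follow the previously published approach; the definitions below (a co-occurrence count, and that count normalized by the medication's total count) and the threshold value are assumptions for illustration only.

```python
from collections import Counter
from itertools import product

# Toy EHR extracts: (patient_id, problems on the problem list, active medications)
records = [
    (1, {"hypertension"}, {"lisinopril"}),
    (2, {"hypertension", "diabetes"}, {"lisinopril", "metformin"}),
    (3, {"diabetes"}, {"metformin"}),
    (4, {"hypertension"}, {"lisinopril", "metformin"}),
]

pair_counts = Counter()   # "link frequency": patients with both problem and med
med_counts = Counter()    # patients on each medication
for _, problems, meds in records:
    for med in meds:
        med_counts[med] += 1
    for prob, med in product(problems, meds):
        pair_counts[(prob, med)] += 1

THRESHOLD = 0.5           # hypothetical cutoff set by clinician review
knowledge_base = {
    pair: pair_counts[pair] / med_counts[pair[1]]   # "link ratio"
    for pair in pair_counts
    if pair_counts[pair] / med_counts[pair[1]] >= THRESHOLD
}
```

Pairs that frequently co-occur relative to the medication's overall use survive the threshold and enter the knowledge base; rare co-occurrences are filtered out.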
van der Ham, Joris L
2016-05-19
Forensic entomologists can use carrion communities' ecological succession data to estimate the postmortem interval (PMI). Permutation tests of hierarchical cluster analyses of these data provide a conceptual method to estimate part of the PMI, the post-colonization interval (post-CI). This multivariate approach produces a baseline of statistically distinct clusters that reflect changes in the carrion community composition during the decomposition process. Carrion community samples of unknown post-CIs are compared with these baseline clusters to estimate the post-CI. In this short communication, I use data from previously published studies to demonstrate the conceptual feasibility of this multivariate approach. Analyses of these data produce series of significantly distinct clusters, which represent carrion communities during 1- to 20-day periods of the decomposition process. For 33 carrion community samples, collected over an 11-day period, this approach correctly estimated the post-CI within an average range of 3.1 days. © The Authors 2016. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
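The clustering idea above can be sketched as follows. The community data, the cluster count, and the nearest-sample assignment are invented for illustration; the published approach additionally uses permutation tests to confirm that the clusters are statistically distinct.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(2)
days = np.arange(1, 21)
# Synthetic carrion-community composition drifting smoothly over 20 days
# of decomposition (3 toy taxa per sample).
communities = np.array([[d, 20 - d, (d - 10) ** 2 / 10] for d in days]) \
    + rng.normal(scale=0.3, size=(20, 3))

# Hierarchical clustering of the baseline samples into succession stages.
clusters = fcluster(linkage(communities, method="ward"), t=4, criterion="maxclust")

# A sample of unknown post-CI is matched to the most similar baseline sample,
# and its cluster's day range brackets the post-colonization interval.
unknown = np.array([5.2, 14.8, 2.1])
nearest = days[np.argmin(np.linalg.norm(communities - unknown, axis=1))]
interval = days[clusters == clusters[nearest - 1]]   # estimated post-CI range
```

The width of `interval` reflects how finely the succession data resolve the decomposition process, mirroring the 1- to 20-day cluster periods reported in the abstract.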
Heiferman, Daniel M; Billingsley, Joshua T; Kasliwal, Manish K; Johnson, Andrew K; Keigher, Kiffon M; Frudit, Michel E; Moftakhar, Roham; Lopes, Demetrius K
2016-07-01
Flow-diverting stents, including the Pipeline embolization device (PED) and Silk, have been beneficial in the treatment of aneurysms previously unable to be approached via endovascular techniques. Recurrent aneurysms for which stent-assisted embolization has failed are a therapeutic challenge, given the existing intraluminal construct with continued blood flow into the aneurysm. We report our experience using flow-diverting stents in the repair of 25 aneurysms for which stent-assisted embolization had failed. Nineteen (76%) of these aneurysms at the 12-month follow-up showed improved Raymond class occlusion, with 38% being completely occluded, and all aneurysms demonstrated decreased filling. One patient developed a moderate permanent neurologic deficit. Appropriate stent sizing, proximal and distal construct coverage, and preventing flow diverter deployment between the previously deployed stent struts are important considerations to ensure wall apposition and prevention of endoleak. Flow diverters are shown to be a reasonable option for treating previously stented recurrent cerebral aneurysms. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Manthous, Constantine A; Jackson, William L
2007-03-01
The successful management of mass casualties arising from detonation of a nuclear device (NDD) would require significant preparation at all levels of the healthcare system. This article briefly outlines previously published models of destruction and casualties, details approaches to on-site triage and medical evacuation, and offers pathophysiology-based suggestions for treatment of the critically injured. Documentation from previous bomb blasts and nuclear accidents is reviewed to assist in forecasting needs of both systems and patients in the event of an NDD in a major metropolitan area. This review extracts data from previously published models of destruction and casualties projected from an NDD, the primary literature detailing observations of patients' pathophysiology following NDDs in Japan and relevant nuclear accidents, and available contemporary resources for first responders and healthcare providers. The blast and radiation exposures that accompany an NDD will significantly affect local and regional public resources. Morbidity and mortality likely to arise in the setting of dose-dependent organ dysfunction may be minimized by rigorous a priori planning/training for field triage decisions, coordination of medical and civil responses to effect rapid responses and medical evacuation routes, radiation-specific interventions, and modern intensive care. Although the responses of emergency and healthcare systems following NDD will vary depending on the exact mechanism, magnitude, and location of the event, dose exposures and individual pathophysiology evolution are reasonably predictable. Triage decisions, resource requirements, and bedside therapeutic plans can be evidence-based and can be developed rapidly with appropriate preparation and planning.
Data Warehouse Design from HL7 Clinical Document Architecture Schema.
Pecoraro, Fabrizio; Luzi, Daniela; Ricci, Fabrizio L
2015-01-01
This paper proposes a semi-automatic approach to extract clinical information structured in an HL7 Clinical Document Architecture (CDA) document and transform it into a data warehouse dimensional model schema. It is based on a conceptual framework, published in a previous work, that maps the dimensional model primitives to CDA elements. Its feasibility is demonstrated through a case study based on the analysis of vital signs gathered during laboratory tests.
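The mapping the abstract describes (CDA elements onto dimensional-model primitives) can be sketched in a few lines. The XML fragment, the choice of elements, and the fact-row field names below are illustrative assumptions, not the paper's actual framework; only the HL7 v3 namespace is standard:

```python
# Minimal sketch (hypothetical fragment, not a full CDA document): extract a
# vital-sign <observation> from HL7 CDA-style XML into a star-schema fact row.
import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}  # HL7 v3 namespace used by CDA documents

cda_fragment = """
<observation xmlns="urn:hl7-org:v3">
  <code code="8867-4" codeSystem="2.16.840.1.113883.6.1" displayName="Heart rate"/>
  <effectiveTime value="20150101"/>
  <value value="72" unit="/min"/>
</observation>
"""

def observation_to_fact(xml_text):
    """Map CDA observation elements onto dimensional-model primitives:
    the code becomes a dimension key, effectiveTime feeds the time
    dimension, and value/unit become the numeric measure of the fact."""
    obs = ET.fromstring(xml_text)
    return {
        "measure_code": obs.find("hl7:code", NS).get("code"),
        "measure_name": obs.find("hl7:code", NS).get("displayName"),
        "time_key": obs.find("hl7:effectiveTime", NS).get("value"),
        "value": float(obs.find("hl7:value", NS).get("value")),
        "unit": obs.find("hl7:value", NS).get("unit"),
    }

fact = observation_to_fact(cda_fragment)
print(fact)
```

A real pipeline would resolve the LOINC code against a dimension table rather than carrying the display name in the fact row.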
2011-09-30
capability to emulate the dive and movement behavior of marine mammals provides a significant advantage to modeling environmental impact than do historic...approaches used in Navy environmental assessments (EA) and impact statements (EIS). Many previous methods have been statistical or pseudo-statistical...Siderius. 2011. Comparison of methods used for computing the impact of sound on the marine environment, Marine Environmental Research, 71:342-350. [published
Reynolds, Sheila M; Bilmes, Jeff A; Noble, William Stafford
2010-07-08
DNA in eukaryotes is packaged into a chromatin complex, the most basic element of which is the nucleosome. The precise positioning of the nucleosome cores allows for selective access to the DNA, and the mechanisms that control this positioning are important pieces of the gene expression puzzle. We describe a large-scale nucleosome pattern that jointly characterizes the nucleosome core and the adjacent linkers and is predominantly characterized by long-range oscillations in the mono-, di- and tri-nucleotide content of the DNA sequence, and we show that this pattern can be used to predict nucleosome positions in both Homo sapiens and Saccharomyces cerevisiae more accurately than previously published methods. Surprisingly, in both H. sapiens and S. cerevisiae, the most informative individual features are the mono-nucleotide patterns, although the inclusion of di- and tri-nucleotide features results in improved performance. Our approach combines a much longer pattern than has been previously used to predict nucleosome positioning from sequence (301 base pairs, centered at the position to be scored) with a novel discriminative classification approach that selectively weights the contributions from each of the input features. The resulting scores are relatively insensitive to local AT-content and can be used to accurately discriminate putative dyad positions from adjacent linker regions without requiring an additional dynamic programming step and without the attendant edge effects and assumptions about linker length modeling and overall nucleosome density. Our approach produces the best dyad-linker classification results published to date in H. sapiens, and outperforms two recently published models on a large set of S. cerevisiae nucleosome positions. Our results suggest that in both genomes, a comparable and relatively small fraction of nucleosomes are well-positioned and that these positions are predictable based on sequence alone.
We believe that the bulk of the remaining nucleosomes follow a statistical positioning model.
Reid, David W; Doell, Faye K; Dalton, E Jane; Ahmad, Saunia
2008-12-01
The systemic-constructivist approach to studying and benefiting couples was derived from qualitative and quantitative research on distressed couples over the past 10 years. Systemic-constructivist couple therapy (SCCT) is the clinical intervention that accompanies the approach. SCCT guides the therapist to work with both the intrapersonal and the interpersonal aspects of marriage while also integrating the social-environmental context of the couple. The theory that underlies SCCT is explained, including concepts such as we-ness and interpersonal processing. The primary components of the therapy are described. Findings described previously in an inaugural monograph containing extensive research demonstrating the long-term utility of SCCT are reviewed. (PsycINFO Database Record (c) 2010 APA, all rights reserved).
Shpynov, S; Pozdnichenko, N; Gumenuk, A
2015-01-01
Genome sequences of 36 Rickettsia and Orientia were analyzed using Formal Order Analysis (FOA). This approach takes into account the arrangement of nucleotides in each sequence. A numerical characteristic, the average distance (remoteness) "g", was used to compare genomes. Our results corroborated the previous separation of three groups within the genus Rickettsia, including the typhus group, the classic spotted fever group, and the ancestral group, and of Orientia as a separate genus. Rickettsia felis URRWXCal2 and R. akari Hartford were not in the same group based on FOA; therefore, the designation of a so-called transitional Rickettsia group could not be confirmed with this approach. Copyright © 2015 Institut Pasteur. Published by Elsevier Masson SAS. All rights reserved.
Skouteris, Helen; Bailey, Cate; Nagle, Cate; Hauck, Yvonne; Bruce, Lauren; Morris, Heather
2017-12-01
Worldwide, women seldom reach the recommended target of exclusive breastfeeding up to 6 months postpartum. The aim of the current study was to update a previously published review that presented a conceptual and methodological synthesis of interventions designed to promote exclusive breastfeeding up to 6 months in high-income countries. A systematic search of leading databases was conducted for scholarly, peer-reviewed, randomized controlled trials published from June 2013 to December 2016. Twelve new articles were identified as relevant; all were published in English and assessed exclusive breastfeeding with a follow-up period extending beyond 4 months postpartum. Articles were analyzed for overall quality of evidence in regard to duration of exclusive breastfeeding, using the Grading and Recommendations Assessment, Development, and Evaluation approach. A significant increase in the duration of exclusive breastfeeding was found in 4 of the 12 studies. All four successful interventions had long-duration postpartum programs, implemented by telephone, text message, or through a website. Some of the successful interventions also included prenatal education or in-hospital breastfeeding support. Results from this review update correspond closely with previous findings, in that all of the successful interventions had lengthy postnatal support or an education component. More studies assessed intervention fidelity than in the previous review; however, there was little discussion of maternal body-mass index. While a pattern of successful interventions is beginning to emerge, further research is needed to provide a robust evidence base to inform future interventions, particularly with overweight and obese women.
Fourier Magnitude-Based Privacy-Preserving Clustering on Time-Series Data
NASA Astrophysics Data System (ADS)
Kim, Hea-Suk; Moon, Yang-Sae
Privacy-preserving clustering (PPC) is important in publishing sensitive time-series data. Previous PPC solutions, however, either fail to preserve distance orders or incur privacy breaches. To solve this problem, we propose a new PPC approach that exploits the Fourier magnitudes of time-series. Our magnitude-based method does not cause a privacy breach even if its techniques or related parameters are publicly revealed. Using magnitudes only, however, incurs the distance order problem, and we thus present magnitude selection strategies that preserve as many Euclidean distance orders as possible. Through extensive experiments, we show the superiority of our magnitude-based approach.
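The core idea, discarding Fourier phase and publishing only selected magnitudes, can be sketched as follows. The naive DFT, the keep-the-lowest-k selection rule, and the toy series are illustrative assumptions; the paper's actual selection strategies are more elaborate:

```python
import cmath, math

def dft_magnitudes(x):
    """Naive O(n^2) DFT; return the magnitudes |X_f|.  The phase is
    discarded, which is what prevents reconstruction of the series."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * f * t / n)
                    for t in range(n)))
            for f in range(n)]

def select_magnitudes(mags, k):
    """Hypothetical selection strategy: keep the k lowest frequencies,
    where most of the energy of smooth series concentrates."""
    return mags[:k]

def dist(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

q  = [1.0, 2.0, 3.0, 4.0, 3.0, 2.0, 1.0, 0.0]
s1 = [1.1, 2.1, 3.0, 3.9, 3.1, 2.0, 1.0, 0.1]   # close to q
s2 = [4.0, 0.0, 4.0, 0.0, 4.0, 0.0, 4.0, 0.0]   # far from q

k = 4
fq, f1, f2 = (select_magnitudes(dft_magnitudes(x), k) for x in (q, s1, s2))
# The distance order q~s1 < q~s2 survives in the published magnitude space:
print(dist(fq, f1) < dist(fq, f2))
```

Clustering then runs on the published magnitude vectors, so distance orders, not exact distances, are what the selection strategy must protect.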
Measuring target detection performance in paradigms with high event rates.
Bendixen, Alexandra; Andersen, Søren K
2013-05-01
Combining behavioral and neurophysiological measurements inevitably implies mutual constraints, such as when the neurophysiological measurement requires fast-paced stimulus presentation and hence the attribution of a behavioral response to a particular preceding stimulus becomes ambiguous. We develop and test a method for validly assessing behavioral detection performance in spite of this ambiguity. We examine four approaches taken in the literature to treat such situations. We analytically derive a new variant of computing the classical parameters of signal detection theory, hit and false alarm rates, adapted to fast-paced paradigms. Each of the previous approaches shows specific shortcomings (susceptibility towards response window choice, biased estimates of behavioral detection performance). Superior performance of our new approach is demonstrated for both simulated and empirical behavioral data. Further evidence is provided by reliable correspondence between behavioral performance and the N2b component as an electrophysiological indicator of target detection. The appropriateness of our approach is substantiated by both theoretical and empirical arguments. We demonstrate an easy-to-implement solution for measuring target detection performance independent of the rate of event presentation. Thus overcoming the measurement bias of previous approaches, our method will help to clarify the behavioral relevance of different measures of cortical activation. Copyright © 2012 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
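For context, the classical signal-detection quantities being adapted are the hit rate, the false alarm rate, and the sensitivity index d' = z(hit rate) - z(false alarm rate). The sketch below is the standard textbook computation with one commonly assumed correction for extreme proportions, not the paper's fast-paced variant:

```python
from statistics import NormalDist

def detection_sensitivity(hits, misses, false_alarms, correct_rejections):
    """Return (hit_rate, fa_rate, d_prime).  A log-linear correction
    (add 0.5 to each cell) keeps the z-transform finite even for
    perfect scores -- one common convention, assumed here."""
    h = (hits + 0.5) / (hits + misses + 1.0)
    fa = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf
    return h, fa, z(h) - z(fa)

# 50 targets (45 detected) and 100 non-targets (8 false alarms):
h, fa, d = detection_sensitivity(hits=45, misses=5,
                                 false_alarms=8, correct_rejections=92)
print(round(h, 3), round(fa, 3), round(d, 2))
```

The ambiguity the paper addresses is upstream of this formula: with fast-paced stimuli, deciding which cell (hit vs false alarm) a button press belongs to is itself uncertain.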
Bayesian approach for counting experiment statistics applied to a neutrino point source analysis
NASA Astrophysics Data System (ADS)
Bose, D.; Brayeur, L.; Casier, M.; de Vries, K. D.; Golup, G.; van Eijndhoven, N.
2013-12-01
In this paper we present a model independent analysis method following Bayesian statistics to analyse data from a generic counting experiment and apply it to the search for neutrinos from point sources. We discuss a test statistic defined following a Bayesian framework that will be used in the search for a signal. In case no signal is found, we derive an upper limit without the introduction of approximations. The Bayesian approach allows us to obtain the full probability density function for both the background and the signal rate. As such, we have direct access to any signal upper limit. The upper limit derivation compares directly with a frequentist approach and is robust in the case of low-counting observations. Furthermore, it also allows accounting for previous upper limits obtained by other analyses via the concept of prior information, without the ad hoc application of trial factors. To investigate the validity of the presented Bayesian approach, we have applied this method to the public IceCube 40-string configuration data for 10 nearby blazars and we have obtained a flux upper limit, which is in agreement with the upper limits determined via a frequentist approach. Furthermore, the upper limit obtained compares well with the previously published result of IceCube, using the same data set.
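A minimal sketch of this kind of Bayesian counting analysis: observed counts n, known expected background b, a flat prior on the signal rate s >= 0, and a Poisson likelihood, with the credible upper limit read off the normalized posterior. The grid integration, the flat prior, and the example numbers are illustrative assumptions, not the IceCube analysis itself:

```python
import math

def signal_upper_limit(n, b, cl=0.90, s_max=50.0, steps=20000):
    """Upper limit on the signal rate s at credibility level cl, given
    n observed events over a known expected background b."""
    ds = s_max / steps
    grid = [i * ds for i in range(steps + 1)]
    # Unnormalized posterior p(s | n) proportional to Poisson(n | s + b)
    # under a flat prior on s >= 0.
    post = [math.exp(-(s + b)) * (s + b) ** n for s in grid]
    total = sum(post) * ds
    acc = 0.0
    for s, p in zip(grid, post):
        acc += p * ds
        if acc >= cl * total:
            return s
    return s_max

# With 5 observed events over an expected background of 3.0:
print(round(signal_upper_limit(n=5, b=3.0), 2))
```

Prior information from earlier analyses would enter by replacing the flat prior with the posterior of the previous measurement.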
Hypothesis testing for differentially correlated features.
Sheng, Elisa; Witten, Daniela; Zhou, Xiao-Hua
2016-10-01
In a multivariate setting, we consider the task of identifying features whose correlations with the other features differ across conditions. Such correlation shifts may occur independently of mean shifts, or differences in the means of the individual features across conditions. Previous approaches for detecting correlation shifts consider features simultaneously, by computing a correlation-based test statistic for each feature. However, since correlations involve two features, such approaches do not lend themselves to identifying which feature is the culprit. In this article, we instead consider a serial testing approach, by comparing columns of the sample correlation matrix across two conditions, and removing one feature at a time. Our method provides a novel perspective and favorable empirical results compared with competing approaches. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
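The column-comparison idea can be illustrated with a deliberately simplified score: the average absolute change in a feature's correlation-matrix column across the two conditions. The paper's actual method wraps this idea in formal hypothesis tests with serial removal of one feature at a time; the simulated data and the score definition here are assumptions for illustration:

```python
import math, random

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def corr_matrix(features):  # features: list of equal-length columns
    p = len(features)
    return [[corr(features[i], features[j]) for j in range(p)] for i in range(p)]

def correlation_shift_scores(cond_a, cond_b):
    """Per-feature score: mean absolute change in that feature's column
    of the sample correlation matrix between the two conditions."""
    ra, rb = corr_matrix(cond_a), corr_matrix(cond_b)
    p = len(ra)
    return [sum(abs(ra[i][j] - rb[i][j]) for j in range(p) if j != i) / (p - 1)
            for i in range(p)]

random.seed(0)
n = 400
z = [random.gauss(0, 1) for _ in range(n)]          # shared latent factor
noisy = lambda: [zi + random.gauss(0, 1) for zi in z]
noise = lambda: [random.gauss(0, 1) for _ in range(n)]
cond_a = [noisy(), noisy(), noisy(), noisy()]       # all four share factor z
cond_b = [noise(), noisy(), noisy(), noisy()]       # feature 0 decouples in B
scores = correlation_shift_scores(cond_a, cond_b)
print(scores.index(max(scores)))   # feature 0 is flagged as the culprit
```

Averaging over the whole column is what lets the score single out the culprit: the decoupled feature shifts against every other feature, while each innocent feature shifts against the culprit alone.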
Shared decision making in Italy: An updated revision of the current situation.
Bottacini, Alessandro; Scalia, Peter; Goss, Claudia
2017-06-01
The aim of this paper is to update the previous review on the state of patient and public participation in healthcare in Italy. Policymakers consider patient involvement an important aspect in health care decisions and encourage patients to actively participate in the clinical interaction. Nevertheless, the term shared decision making (SDM) is still not clearly defined. Patient associations promote patient participation in health care decisions. Several experts attended the latest consensus conference about patient engagement to reach a consensus on the definition of SDM. Research regarding SDM in Italy continues to increase with 17 articles published between 2012 and 2017. Researchers have assessed the variables associated with patient involvement and explored the use of the SDM approach in different medical settings. Despite the dedicated SDM initiative, researchers in Italy recognize room for improvement. Work is needed to reach a common language regarding SDM and its mechanisms to implement this approach at the clinical level. Copyright © 2017. Published by Elsevier GmbH.
Wirihana, Lisa; Welch, Anthony; Williamson, Moira; Christensen, Martin; Bakon, Shannon; Craft, Judy
2018-03-16
Phenomenology is a useful methodological approach in qualitative nursing research. It enables researchers to put aside their perceptions of a phenomenon and give meaning to a participant's experiences. Exploring the experiences of others enables previously unavailable insights to be discovered. To delineate the implementation of Colaizzi's (1978) method of data analysis in descriptive phenomenological nursing research. The use of Colaizzi's method of data analysis enabled new knowledge to be revealed and provided insights into the experiences of nurse academics teaching on satellite campuses. Local adaptation of the nursing curriculum and additional unnoticed responsibilities had not been identified previously and warrant further research. Colaizzi's (1978) method of data analysis is rigorous and robust, and therefore a qualitative method that ensures the credibility and reliability of its results. It allows researchers to reveal emergent themes and their interwoven relationships. Researchers using a descriptive phenomenological approach should consider using this method as a clear and logical process through which the fundamental structure of an experience can be explored. Colaizzi's phenomenological methodology can be used reliably to understand people's experiences. This may prove beneficial in the development of therapeutic policy and the provision of patient-centred care. ©2018 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.
Kozinszky, Zoltan; Töreki, Annamária; Hompoth, Emőke A; Dudas, Robert B; Németh, Gábor
2017-04-01
We endeavoured to analyze the factor structure of the Edinburgh Postnatal Depression Scale (EPDS) during a screening programme in Hungary, using exploratory (EFA) and confirmatory factor analysis (CFA), testing both previously published models and newly developed theory-driven ones, after a critical analysis of the literature. Between April 2011 and January 2015, a sample of 2967 pregnant women (between the 12th and 30th weeks of gestation) and 714 women 6 weeks after delivery completed the Hungarian version of the EPDS in South-East Hungary. EFAs suggested unidimensionality in both samples. Thirty-three of the 42 previously published models showed good fit and 6 acceptable fit with our antepartum data in CFAs, whilst 10 showed good fit and 28 acceptable fit in our postpartum sample. Using multiple fit indices, our theory-driven anhedonia (items 1,2) - anxiety (items 4,5) - low mood (items 8,9) model provided the best fit in the antepartum sample. In the postpartum sample, our theory-driven models were again among the best performing models, including an anhedonia and an anxiety factor together with either a low mood or a suicidal risk factor (items 3,6,10). The EPDS showed moderate within- and between-culture invariability, although this would also need to be re-examined with a theory-driven approach. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
Coco, Laura; Colina, Sonia; Atcherson, Samuel R.
2017-01-01
Purpose The purpose of this study was to examine the readability level of the Spanish versions of several audiology- and otolaryngology-related patient-reported outcome measures (PROMs) and include a readability analysis of 2 translation approaches when available—the published version and a “functionalist” version—using a team-based collaborative approach including community members. Method Readability levels were calculated using the Fry Graph adapted for Spanish, as well as the Fernandez-Huerta and the Spaulding formulae for several commonly used audiology- and otolaryngology-related PROMs. Results Readability calculations agreed with previous studies analyzing audiology-related PROMs in English and demonstrated many Spanish-language PROMs were beyond the 5th grade reading level suggested for health-related materials written for the average population. In addition, the functionalist versions of the PROMs yielded lower grade-level (improved) readability levels than the published versions. Conclusion Our results suggest many of the Spanish-language PROMs evaluated here are beyond the recommended readability levels and may be influenced by the approach to translation. Moreover, improved readability may be possible using a functionalist approach to translation. Future analysis of the suitability of outcome measures and the quality of their translations should move beyond readability and include an evaluation of the individual's comprehension of the written text. PMID:28892821
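Of the readability measures mentioned, the Fernández-Huerta index is a simple closed-form formula. Below is a sketch of one commonly cited form; exact coefficients vary slightly across sources, and real use requires a Spanish syllable counter, which is omitted here:

```python
def fernandez_huerta(total_syllables, total_words, total_sentences):
    """Fernandez-Huerta readability for Spanish text (one commonly cited
    form; treat the coefficients as an assumption):
        L = 206.84 - 0.60*P - 1.02*F
    where P = syllables per 100 words and F = sentences per 100 words.
    Higher L means easier text; low scores indicate difficult text."""
    p = 100.0 * total_syllables / total_words
    f = 100.0 * total_sentences / total_words
    return 206.84 - 0.60 * p - 1.02 * f

# 100 words, 210 syllables, 8 sentences:
# L = 206.84 - 0.60*210 - 1.02*8 = 72.68
print(round(fernandez_huerta(210, 100, 8), 2))
```

The Fry graph and Spaulding formula used in the study likewise work from word, syllable, and sentence counts, so the counting step dominates the implementation effort.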
Fostering interpersonal trust on social media: physicians' perspectives and experiences.
Panahi, Sirous; Watson, Jason; Partridge, Helen
2016-02-01
The problem of developing and sustaining mutual trust is one of the main barriers to knowledge sharing on social media platforms such as blogs, wikis, micro-blogs and social networking websites. While many studies argue that mutual trust is necessary for online communication and knowledge sharing, few have actually explored and demonstrated how physicians can establish and sustain trusted relationships on social media. To identify approaches through which physicians establish interpersonal trust on social media. Twenty-four physicians, who were active users of social media, were interviewed using a semi-structured approach between 2013 and 2014. Snowball sampling was employed for participant recruitment. The data were analysed using a thematic analysis approach. Physicians trust their peers on social media in a slightly different way than in face-to-face communication. The study found that the majority of participants established trust on social media mainly through previous personal interaction, authenticity and relevancy of voice, professional standing, consistency of communication, peer recommendation, and non-anonymous and moderated sites. Healthcare professionals need to approach social media carefully when using it for knowledge sharing, networking and developing trusted relations with like-minded peers. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Apitz, Sabine E; Agius, Suzanne
2017-11-01
The effects of possible changes to the Canadian 2-tiered assessment framework for dredged material based on outcomes of the 2006 Contaminated Dredged Material Management Decisions Workshop (CDMMD) are evaluated. Expanding on the "data mining" approach described in a previous paper, which focused solely on chemical lines of evidence, the efficacy of Tier 1 approaches (increases to the number of chemical analytes, use of mean hazard quotients, and the use of a screening bioassay) in predicting toxicity is evaluated. Results suggest value in additional work to evaluate the following areas: 1) further expanding minimum chemical requirements, 2) using more advanced approaches for chemical interpretation, and 3) using a screening-level bioassay (e.g., Canadian solid-phase photoluminescent bacteria test) to determine whether it would complement Tier 1 chemistry as well as or better than the solvent-based Microtox™ test method evaluated in the present study. Integr Environ Assess Manag 2017;13:1072-1085. © 2017 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
Lapidus, Nathanael; Chevret, Sylvie; Resche-Rigon, Matthieu
2014-12-30
Agreement between two assays is usually based on the concordance correlation coefficient (CCC), estimated from the means, standard deviations, and correlation coefficient of these assays. However, such data will often suffer from left-censoring because of the lower limits of detection of these assays. To handle such data, we propose to extend a multiple imputation approach by chained equations (MICE) developed in the closely related setting of a single left-censored assay. The performance of this two-step approach is compared with that of a previously published maximum likelihood estimation through a simulation study. Results show close estimates of the CCC by both methods, although the coverage is improved by our MICE proposal. An application to cytomegalovirus quantification data is provided. Copyright © 2014 John Wiley & Sons, Ltd.
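For reference, the complete-case CCC itself is a closed-form function of the two assays' means, variances, and covariance; the paper's contribution is handling left-censored values (e.g. via MICE) before this computation. A sketch with made-up assay readings:

```python
def concordance_cc(x, y):
    """Lin's concordance correlation coefficient:
        CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2)
    Complete-case form only: left-censored values must be imputed (or
    handled by maximum likelihood) before calling this."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((v - mx) ** 2 for v in x) / n
    vy = sum((v - my) ** 2 for v in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

x = [10.0, 12.0, 14.0, 16.0, 18.0]          # assay A (made-up readings)
y = [10.5, 12.4, 13.8, 16.2, 17.9]          # assay B, nearly identical
print(round(concordance_cc(x, y), 3))       # close to 1: strong agreement
```

Unlike the Pearson correlation, the CCC penalizes both location and scale shifts between assays, which is why agreement studies prefer it.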
Burghelea, C; Ghervan, L; Bărbos, A; Lucan, V C; Elec, F; Moga, S; Bologa, F; Constantin, L; Iacob, G; Partiu, A
2008-01-01
Nowadays, the standard treatment for upper tract transitional cell carcinoma is open nephroureterectomy, via a combined lumbar and iliac approach, with perimeatal bladder cuff excision. Since the first laparoscopic nephroureterectomy was performed, several surgical teams have taken an interest in this approach for the treatment of upper tract transitional cell carcinoma. To plead for retro-peritoneoscopic nephroureterectomy and to assess the surgical indications. The results of recently published series on nephroureterectomy for upper urinary tract transitional cell carcinoma were analyzed. Studies on conventional, laparoscopic and retro-peritoneoscopic nephroureterectomy with at least 10 cases, published after 2000, were included. The advantages of retro-peritoneoscopic nephroureterectomy are minimal blood loss, reduced analgesic intake, a shorter hospital stay, a faster return to previous activities, and a lower rate of intra- or postoperative complications compared with trans-peritoneal laparoscopy or conventional surgery. With proper case selection, the oncologic safety of retro-peritoneoscopy is equivalent to that of open surgery. In the short term, the retro-peritoneoscopic approach shows oncological outcomes similar to those of other techniques. Retroperitoneal laparoscopic nephroureterectomy is a viable alternative to the conventional or trans-peritoneoscopic procedure, with clear-cut benefits for the patient. Retro-peritoneoscopy is associated with low morbidity.
Hrvatin, Sinisa; Hochbaum, Daniel R; Nagy, M Aurel; Cicconet, Marcelo; Robertson, Keiramarie; Cheadle, Lucas; Zilionis, Rapolas; Ratner, Alex; Borges-Monroy, Rebeca; Klein, Allon M; Sabatini, Bernardo L; Greenberg, Michael E
2018-05-11
In the version of this article initially published, the x-axis labels in Fig. 3c read Vglut, Gad1/2, Aldh1l1 and Pecam1; they should have read Vglut+, Gad1/2+, Aldh1l1+ and Pecam1+. In Fig. 4, the range values were missing from the color scales; they are, from left to right, 4-15, 0-15, 4-15 and 0-15 in Fig. 4a and 4-15, 4-15 and 4-8 in Fig. 4h. In the third paragraph of the main text, the phrase reading "Previous approaches have analyzed a limited number of inhibitory cell types, thus masking the full diversity of excitatory populations" should have read "Previous approaches have analyzed a limited number of inhibitory cell types and masked the full diversity of excitatory populations." In the second paragraph of Results section "Diversity of experience-regulated ERGs," the phrase reading "thus suggesting considerable divergence within the gene expression program responding to early stimuli" should have read "thus suggesting considerable divergence within the early stimulus-responsive gene expression program." In the fourth paragraph of Results section "Excitatory neuronal LRGs," the sentence reading "The anatomical organization of these cell types into sublayers, coupled with divergent transcriptional responses to a sensory stimulus, suggested previously unappreciated functional subdivisions located within the laminae of the mouse visual cortex and resembling the cytoarchitecture in higher mammals" should have read "The anatomical organization of these cell types into sublayers, coupled with divergent transcriptional responses to a sensory stimulus, suggests previously unappreciated functional subdivisions located within the laminae of the mouse visual cortex, resembling the cytoarchitecture in higher mammals." In the last sentence of the Results, "sensory-responsive genes" should have read "sensory-stimulus-responsive genes." The errors have been corrected in the HTML and PDF versions of the article.
Flegal, Katherine M; Ioannidis, John P A
2017-08-01
Meta-analyses of individual participant data (MIPDs) offer many advantages and are considered the highest level of evidence. However, MIPDs can be seriously compromised when they are not solidly founded upon a systematic review. These data-intensive collaborative projects may be led by experts who already have deep knowledge of the literature in the field and of the results of published studies and how these results vary based on different analytical approaches. If investigators tailor the searches, eligibility criteria, and analysis plan of the MIPD, they run the risk of reaching foregone conclusions. We exemplify this potential bias in an MIPD on the association of body mass index with mortality conducted by a collaboration of outstanding and extremely knowledgeable investigators. Contrary to a previous meta-analysis of group data that used a systematic review approach, the MIPD did not seem to use a formal search: it considered 239 studies, of which the senior author was previously aware of at least 238, and it violated its own listed eligibility criteria to include those studies and exclude other studies. It also preferred an analysis plan that was known to give a specific direction of effects in already published results of most of the included evidence. MIPDs where the results of constituent studies are already largely known need safeguards to protect their validity. These may include careful systematic searches, adherence to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses of individual participant data guidelines, and exploration of the robustness of results with different analyses. They should also avoid selective emphasis on foregone conclusions based on previously known results with specific analytical choices. Copyright © 2017 Elsevier Inc. All rights reserved.
Belleudi, Valeria; Trotta, Francesco; Vecchi, Simona; Amato, Laura; Addis, Antonio; Davoli, Marina
2018-05-16
Several drugs share the same therapeutic indication, including those undergoing patent expiration. Concerns about interchangeability are frequent in clinical practice, which makes evaluating switchability through observational research challenging. To conduct a scoping review of observational studies on drug switchability to identify methodological strategies adopted to deal with bias and confounding. We searched PubMed, EMBASE, and Web of Science (updated 1/31/2017) to identify studies evaluating switchability in terms of effectiveness/safety outcomes or compliance. Three reviewers independently screened studies, extracting all characteristics. Strategies to address confounding, particularly previous drug use and switching reasons, were considered. All findings were summarized in descriptive analyses. Thirty-two studies, published in the last 10 years, met the inclusion criteria. Epilepsy, cardiovascular and rheumatology were the most frequently represented clinical areas. Seventy-five percent of the studies reported data on effectiveness/safety outcomes. The most frequent study design was the cohort (65.6%), followed by case-control (21.9%) and self-controlled (12.5%) designs. Case-control and case-crossover studies showed homogeneous methodological strategies to deal with bias and confounding. Among cohort studies, the confounding associated with previous drug use was addressed by introducing variables into the multivariate model (47.3%) or by selecting only adherent patients (14.3%). Around 30% of cohort studies did not report reasons for switching. In the remaining 70%, clinical parameters or the previous occurrence of outcomes were measured to identify switching connected with lack of effectiveness or adverse events. This study represents a starting point for researchers and administrators who are approaching the investigation and assessment of issues related to the interchangeability of drugs. Copyright © 2018. Published by Elsevier Inc.
Patient acceptance of awake craniotomy.
Wrede, Karsten H; Stieglitz, Lennart H; Fiferna, Antje; Karst, Matthias; Gerganov, Venelin M; Samii, Madjid; von Gösseln, Hans-Henning; Lüdemann, Wolf O
2011-12-01
The aim of this study was to objectively assess patients' acceptance of awake craniotomy in a group of neurosurgical patients who underwent this procedure for removal of lesions in or close to eloquent brain areas. Patients' acceptance of awake craniotomy under local anesthesia and conscious sedation was assessed by a formal questionnaire (PPP33), initially developed for general surgery patients. The results are compared with those of a group of patients who had brain surgery under general anesthesia and with previously published data. The awake craniotomy (AC) group consisted of 37 male and 9 female patients (48 craniotomies) with ages ranging from 18 to 71 years. The general anesthesia (GA) group consisted of 26 male and 15 female patients (43 craniotomies) with ages ranging from 26 to 83 years. All patients in the study were included in the questionnaire analysis. In comparison to GA, the overall PPP33 score for AC was higher (p=0.07), suggesting better overall acceptance of AC. The subscale scores for AC were also significantly better than those for GA on the two subscales "postoperative pain" (p=0.02) and "physical disorders" (p=0.01), and equal on the other 6 subscales. The results for the overall mean score and the subscale scores of the PPP33 questionnaire confirm good patient acceptance of AC. Previous studies have shown good patient acceptance of awake craniotomy, but only rarely using formal approaches. By utilizing a formal questionnaire we could verify good patient acceptance of awake craniotomy for the treatment of brain tumors in or close to eloquent areas. This is a novel approach that substantiates previously published experiences. Copyright © 2011 Elsevier B.V. All rights reserved.
Mefford, Heather C; Cooper, Gregory M; Zerr, Troy; Smith, Joshua D; Baker, Carl; Shafer, Neil; Thorland, Erik C; Skinner, Cindy; Schwartz, Charles E; Nickerson, Deborah A; Eichler, Evan E
2009-09-01
Copy-number variants (CNVs) are substantial contributors to human disease. A central challenge in CNV-disease association studies is to characterize the pathogenicity of rare and possibly incompletely penetrant events, which requires the accurate detection of rare CNVs in large numbers of individuals. Cost and throughput issues limit our ability to perform these studies. We have adapted the Illumina BeadXpress SNP genotyping assay and developed an algorithm, SNP-Conditional OUTlier detection (SCOUT), to rapidly and accurately detect both rare and common CNVs in large cohorts. This approach is customizable, cost effective, highly parallelized, and largely automated. We applied this method to screen 69 loci in 1105 children with unexplained intellectual disability, identifying pathogenic variants in 3.1% of these individuals and potentially pathogenic variants in an additional 2.3%. We identified seven individuals (0.7%) with a deletion of 16p11.2, which has previously been associated with autism. Our results widen the phenotypic spectrum of these deletions to include intellectual disability without autism. We also detected 1.65-3.4 Mbp duplications at 16p13.11 in 1.1% of affected individuals and 350 kbp deletions at 15q11.2, near the Prader-Willi/Angelman syndrome critical region, in 0.8% of affected individuals. Compared with published CNVs in controls, these variants are significantly enriched in the affected children (P = 4.7 × 10^-5 and P = 0.003, respectively), supporting previously published hypotheses that they are neurocognitive disease risk factors. More generally, this approach offers a previously unavailable balance between customization, cost, and throughput for analysis of CNVs and should prove valuable for targeted CNV detection in both research and diagnostic settings.
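The abstract does not describe SCOUT's internals; as a rough illustration of the underlying idea (flagging samples whose locus-level intensity departs strongly from the cohort), here is a minimal robust-z-score outlier sketch. The function name, the MAD-based scale estimate and the threshold are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def flag_cnv_outliers(log_ratios, z_thresh=4.0):
    """Flag samples whose mean log2 intensity ratio over a locus
    deviates strongly from the cohort median (robust z-score).

    log_ratios: (n_samples, n_probes) array of log2 ratios for one locus.
    Returns (indices of putative deletion/duplication carriers, z-scores).
    """
    per_sample = log_ratios.mean(axis=1)                 # summarize locus per sample
    med = np.median(per_sample)
    mad = np.median(np.abs(per_sample - med)) * 1.4826   # robust sigma estimate
    z = (per_sample - med) / mad
    return np.where(np.abs(z) > z_thresh)[0], z
```

A sample carrying a one-copy deletion shifts its mean log ratio across all probes in the locus, so its robust z-score stands far outside the cohort noise.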
System steganalysis with automatic fingerprint extraction
Sloan, Tom; Hernandez-Castro, Julio; Isasi, Pedro
2018-01-01
This paper tackles the modern challenge of practical steganalysis over large data sets by presenting a novel approach that aims to perform with perfect accuracy and in a completely automatic manner. The objective is to detect changes introduced by the steganographic process in data objects, including signatures related to the tools being used. Our approach achieves this by first extracting reliable regularities from pairs of modified and unmodified data objects, and then combining these findings into general patterns present in the training data. Finally, we construct a Naive Bayes model that performs classification, operating on attributes extracted using the aforementioned patterns. This technique has been applied to different steganographic tools that operate on media files of several types. We are able to replicate or improve on a number of previously published results but, more importantly, we also present new steganalytic findings for a number of popular tools that had no previously known attacks. PMID:29694366
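As a minimal sketch of the classification stage described above (a Naive Bayes model over extracted pattern attributes), here is a hand-rolled Bernoulli Naive Bayes on binary "signature pattern present" features. The feature encoding and all names are illustrative assumptions, not the authors' feature set.

```python
import numpy as np

class BernoulliNB:
    """Minimal Naive Bayes over binary attributes such as
    'signature pattern i present in file' (illustrative sketch)."""

    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        self.classes = np.unique(y)
        self.prior = np.array([(y == c).mean() for c in self.classes])
        # Laplace smoothing avoids zero probabilities for unseen patterns
        self.p = np.array([(X[y == c].sum(0) + 1) / ((y == c).sum() + 2)
                           for c in self.classes])
        return self

    def predict(self, X):
        X = np.asarray(X, float)
        ll = (np.log(self.prior)
              + X @ np.log(self.p).T
              + (1 - X) @ np.log(1 - self.p).T)
        return self.classes[np.argmax(ll, axis=1)]
```

Trained on labelled clean/stego examples, the model classifies a new file by comparing class log-likelihoods of its pattern attributes.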
Parameter identification for nonlinear aerodynamic systems
NASA Technical Reports Server (NTRS)
Pearson, Allan E.
1993-01-01
This final technical report covers the three-and-one-half-year period preceding February 28, 1993, during which support was provided under NASA Grant NAG-1-1065. Following a general description of the system identification problem and a brief survey of methods to attack it, the basic ideas behind the approach taken in this research effort are presented. The results obtained are described with reference to the published work, including the five semiannual progress reports previously submitted and two interim technical reports.
Methods for Reachability-based Hybrid Controller Design
2012-05-10
approaches for airport runways (Teo and Tomlin, 2003). The results of the reachability calculations were validated in extensive simulations as well as ... UAV flight experiments (Jang and Tomlin, 2005; Teo, 2005). While the focus of these previous applications lies largely in safety verification, the work ... B([15, 0], a0) × [−π, π]) \ V, ∀ qi ∈ Q, where a0 = 30 m is the protected radius (chosen based upon published data on the wingspan of a Boeing KC-135
Parasympathetic Stimulation Elicits Cerebral Vasodilatation in Rat
Talman, William T.; Corr, Julie; Dragon, Deidre Nitschke; Wang, DeQiang
2010-01-01
Forebrain arteries receive nitroxidergic input from parasympathetic ganglionic fibers that arise from the pterygopalatine ganglia. Previous studies have shown that ganglionic stimulation in some species led to cerebral vasodilatation, while interruption of those fibers interfered with the vasodilatation seen during acute hypertension. Because the ganglionic fibers are quite delicate and are easily damaged when the ganglia are approached with published techniques, we sought to develop a method that allows clear exposure of the ganglia and permits demonstration of cerebral vasodilatation with electrical stimulation of the ganglia in the rat. We had found that an orbital approach, during which the eye was retracted for visualization of the ganglion, precluded eliciting vasodilatation with ganglionic stimulation. In the current study, approaching the ganglion through an incision over the zygomatic arch provided clear exposure of the ganglion, and stimulation of the ganglion with that approach led to vasodilatation. PMID:17275420
Approaches to modelling uranium (VI) adsorption on natural mineral assemblages
Waite, T.D.; Davis, J.A.; Fenton, B.R.; Payne, T.E.
2000-01-01
Component additivity (CA) and generalised composite (GC) approaches to deriving a suitable surface complexation model for describing U(VI) adsorption to natural mineral assemblages are pursued in this paper with good success. In the CA approach, a single, ferrihydrite-like component, combined with previously published information on the nature of surface complexes, the acid-base properties of surface sites and electrostatic effects, is found to reasonably describe uranyl uptake by a number of kaolinitic, iron-rich natural substrates at pH > 4. The GC approach, in which little prior knowledge about generic surface sites is assumed, gives even better fits and appears to be a method of particular strength for applications such as performance assessment, provided the model is developed in a careful, stepwise manner with simplicity and goodness of fit as the major criteria for acceptance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nicholas, E-mail: nibrown@cantab.net; Olayos, Elizabeth; Elmer, Sandra
Management of intractable haematuria and obstructive urosepsis from upper tract urothelial carcinoma can be problematic in patients not suitable for surgery, chemotherapy or radiotherapy. Interventional radiology techniques provide alternative approaches in this setting, such as complete kidney embolization to cease urine output, percutaneous nephrostomy, antegrade injection of sclerotherapy agents and sterilisation of the upper collecting system. Related approaches have been successfully employed to sclerose renal cysts, lymphoceles, chyluria and intractable lower tract haemorrhage. No reports of percutaneous, antegrade sclerotherapy in the upper urinary tract have previously been published. We present a case of recurrent haematuria and obstructive urosepsis caused by invasive upper tract urothelial carcinoma in a non-operative patient, which was treated with renal embolisation and percutaneous upper tract urothelial sclerotherapy.
Jenkins, Walter; Urch, Scott E.; Shelbourne, K. Donald
2010-01-01
Rehabilitation following lateral side knee ligament repair or reconstruction has traditionally utilized a conservative approach. An article outlining a new concept in rehabilitation following ACL reconstruction, called the Knee Symmetry Model, was recently published.13 The Knee Symmetry Model can also be applied to rehabilitation of other knee pathologies, including a knee dislocation with a lateral side injury. This clinical commentary describes the rehabilitation procedures used with patients who underwent surgery to repair lateral side ligaments, based upon the Knee Symmetry Model. These procedures were used previously to rehabilitate a group of patients with lateral side ligament repair, as reported by Shelbourne et al.10 Outcome data and subjective knee scores for these patients were recorded via the International Knee Documentation Committee (IKDC) guidelines and modified Noyes survey scores and are summarized in this paper, as previously published. Rehabilitation following lateral side knee ligament repair using guidelines based upon the Knee Symmetry Model appears to provide patients with excellent long-term stability, normal ROM and strength, and a high level of function. PMID:21589671
A Computational Model of the Rainbow Trout Hypothalamus-Pituitary-Ovary-Liver Axis
Gillies, Kendall; Krone, Stephen M.; Nagler, James J.; Schultz, Irvin R.
2016-01-01
Reproduction in fishes and other vertebrates represents the timely coordination of many endocrine factors that culminate in the production of mature, viable gametes. In recent years there has been rapid growth in understanding fish reproductive biology, which has been motivated in part by recognition of the potential effects that climate change, habitat destruction and contaminant exposure can have on natural and cultured fish populations. New approaches to understanding the impacts of these stressors are being developed that require a systems biology approach with more biologically accurate and detailed mathematical models. We have developed a multi-scale mathematical model of the female rainbow trout hypothalamus-pituitary-ovary-liver axis to use as a tool to help understand the functioning of the system and for extrapolation of laboratory findings of stressor impacts on specific components of the axis. The model describes the essential endocrine components of the female rainbow trout reproductive axis. The model also describes the stage specific growth of maturing oocytes within the ovary and permits the presence of sub-populations of oocytes at different stages of development. Model formulation and parametrization was largely based on previously published in vivo and in vitro data in rainbow trout and new data on the synthesis of gonadotropins in the pituitary. Model predictions were validated against several previously published data sets for annual changes in gonadotropins and estradiol in rainbow trout. Estimates of select model parameters can be obtained from in vitro assays using either quantitative (direct estimation of rate constants) or qualitative (relative change from control values) approaches. This is an important aspect of mathematical models as in vitro, cell-based assays are expected to provide the bulk of experimental data for future risk assessments and will require quantitative physiological models to extrapolate across biological scales. 
PMID:27096735
NASA Astrophysics Data System (ADS)
Demezhko, Dmitry; Gornostaeva, Anastasia; Majorowicz, Jacek; Šafanda, Jan
2018-01-01
Using a previously published temperature log of the 2363-m-deep Hunt well (Alberta, Canada) and the results of its previous interpretation, new reconstructions of ground surface temperature and surface heat flux histories for the last 30 ka have been obtained. Two ways to adjust the timescale of geothermal reconstructions are discussed: the traditional method, based on a priori data on the thermal diffusivity value, and an alternative that includes orbital tuning of the surface heat flux to the Earth's insolation changes. It is shown that the second approach provides better agreement between the geothermal reconstructions and proxy evidence of the deglaciation chronology in the studied region.
The MIXED framework: A novel approach to evaluating mixed-methods rigor.
Eckhardt, Ann L; DeVon, Holli A
2017-10-01
Evaluation of rigor in mixed-methods (MM) research is a persistent challenge due to the combination of inconsistent philosophical paradigms, the use of multiple research methods that require different skill sets, and the need to combine research at different points in the research process. Researchers have proposed a variety of ways to thoroughly evaluate MM research, but each method fails to provide a framework that is useful for the consumer of research. In contrast, the MIXED framework is meant to bridge the gap between an academic exercise and practical assessment of a published work. The MIXED framework (methods, inference, expertise, evaluation, and design) borrows from previously published frameworks to create a useful tool for the evaluation of a published study. The MIXED framework uses an experimental eight-item scale that allows for comprehensive, integrated assessment of MM rigor in published manuscripts. Mixed methods are becoming increasingly prevalent in nursing and healthcare research, requiring researchers and consumers to address issues unique to MM, such as the evaluation of rigor. © 2017 John Wiley & Sons Ltd.
Asfahani, J; Ahmad, Z; Ghani, B Abdul
2018-07-01
An approach based on self-organizing map (SOM) artificial neural networks is proposed here for interpreting nuclear and electrical well logging data. The well logging measurements of the Kodana well in southern Syria have been interpreted by applying the proposed approach. A lithological cross-section model of the basaltic environment has been derived, and four different kinds of basalt have consequently been distinguished: hard massive basalt, hard basalt, pyroclastic basalt and altered basalt products (clay). The results obtained by the SOM artificial neural networks are in good agreement with previously published results obtained by other techniques. The SOM approach was applied successfully in the case study of the Kodana well logging data and can therefore be recommended as a suitable and effective approach for handling large well logging data sets with the higher number of variables required for lithological discrimination purposes. Copyright © 2018 Elsevier Ltd. All rights reserved.
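The SOM idea can be sketched with a minimal NumPy implementation that maps multivariate log-response vectors onto a small grid, so that similar lithologies land on nearby units. This is an illustrative sketch, not the authors' software; the grid size, learning-rate and neighborhood schedules are assumed values.

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal self-organizing map. data: (n, d) well-log feature vectors."""
    rng = np.random.default_rng(seed)
    n, d = data.shape
    w = rng.random((grid[0], grid[1], d))  # unit weight vectors
    coords = np.stack(np.meshgrid(np.arange(grid[0]), np.arange(grid[1]),
                                  indexing="ij"), axis=-1)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                 # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5     # shrinking neighborhood
        for x in data[rng.permutation(n)]:
            dist = ((w - x) ** 2).sum(-1)
            bmu = np.unravel_index(np.argmin(dist), grid)  # best-matching unit
            h = np.exp(-((coords - bmu) ** 2).sum(-1) / (2 * sigma ** 2))
            w += lr * h[..., None] * (x - w)
    return w

def bmu_of(w, x):
    """Grid position of the unit closest to feature vector x."""
    return np.unravel_index(np.argmin(((w - x) ** 2).sum(-1)), w.shape[:2])
```

After training, samples from distinct lithologies should map to distinct regions of the grid, which is the basis for the discrimination described in the abstract.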
Modelling multiple sources of dissemination bias in meta-analysis.
Bowden, Jack; Jackson, Dan; Thompson, Simon G
2010-03-30
Asymmetry in the funnel plot for a meta-analysis suggests the presence of dissemination bias. This may be caused by publication bias through the decisions of journal editors, by selective reporting of research results by authors or by a combination of both. Typically, study results that are statistically significant or have larger estimated effect sizes are more likely to appear in the published literature, hence giving a biased picture of the evidence-base. Previous statistical approaches for addressing dissemination bias have assumed only a single selection mechanism. Here we consider a more realistic scenario in which multiple dissemination processes, involving both the publishing authors and journals, are operating. In practical applications, the methods can be used to provide sensitivity analyses for the potential effects of multiple dissemination biases operating in meta-analysis.
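The core mechanism, significant results being preferentially disseminated and thereby biasing the pooled estimate, can be illustrated with a small simulation. This is a didactic sketch under assumed parameters, not the multi-process sensitivity-analysis model proposed in the paper, and it uses only one selection mechanism (statistical significance).

```python
import numpy as np

def simulate_pooled(n_studies=2000, true_effect=0.2, n_per_arm=50, seed=1):
    """Compare the pooled mean of all simulated studies with the pooled
    mean of only the 'published' (one-sided p < 0.025) studies."""
    rng = np.random.default_rng(seed)
    se = np.sqrt(2.0 / n_per_arm)                 # SE of a two-arm mean difference (unit variance)
    est = rng.normal(true_effect, se, n_studies)  # study-level effect estimates
    z = est / se
    published = est[z > 1.96]                     # selection on significance
    return est.mean(), published.mean()
```

The published subset over-represents large estimates, so its pooled mean sits well above the truth, which is exactly the asymmetry a funnel plot reveals.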
A morphometric system to distinguish sheep and goat postcranial bones.
Salvagno, Lenny; Albarella, Umberto
2017-01-01
Distinguishing between the bones of sheep and goat is a notorious challenge in zooarchaeology. Several methodological contributions have been published at different times and by various people to facilitate this task, largely relying on a macro-morphological approach. This is now routinely adopted by zooarchaeologists but, although it certainly has its value, has also been shown to have limitations. Morphological discriminant criteria can vary in different populations, and correct identification is highly dependent upon a researcher's experience, the availability of appropriate reference collections, and many other factors that are difficult to quantify. There is therefore a need to establish a more objective system, susceptible to scrutiny. In order to fulfil such a requirement, this paper offers a comprehensive morphometric method for the identification of sheep and goat postcranial bones, using a sample of more than 150 modern skeletons as a basis and building on previous pioneering work. The proposed method is based on measurements (some newly created, others previously published) and its use is recommended in combination with the more traditional morphological approach. Measurement ratios, used to translate morphological traits into biometrical attributes, are demonstrated to have substantial diagnostic potential, with the vast majority of specimens correctly assigned to species. The efficacy of the new method is also tested with Discriminant Analysis, which provides a successful verification of the biometrical indices, a statistical means to select the most promising measurements, and an additional line of analysis to be used in conjunction with the others.
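A two-group linear discriminant on biometrical indices, of the kind used here to verify the measurement ratios, can be sketched as follows. The data, index values and group labels are illustrative assumptions, not the paper's measurement set.

```python
import numpy as np

def fit_lda(X, y):
    """Two-group Fisher linear discriminant on measurement ratios.
    Returns projection weights and the midpoint threshold."""
    X, y = np.asarray(X, float), np.asarray(y)
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(0), X1.mean(0)
    # pooled within-group covariance
    S = (np.cov(X0.T) * (len(X0) - 1)
         + np.cov(X1.T) * (len(X1) - 1)) / (len(X) - 2)
    w = np.linalg.solve(np.atleast_2d(S), m1 - m0)
    thresh = w @ (m0 + m1) / 2
    return w, thresh

def classify(X, w, thresh):
    """Assign group 1 when the projected index exceeds the midpoint."""
    return (np.asarray(X, float) @ w > thresh).astype(int)
```

Each specimen's ratios are projected onto a single discriminant axis; the midpoint between group means then serves as the species cutoff.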
Deng, Chenhui; Plan, Elodie L; Karlsson, Mats O
2016-06-01
Parameter variation in pharmacometric analysis studies can be characterized as within-subject parameter variability (WSV) in pharmacometric models. WSV has previously been modeled successfully using inter-occasion variability (IOV), but also stochastic differential equations (SDEs). In this study, two approaches, dynamic inter-occasion variability (dIOV) and adapted stochastic differential equations, were proposed to investigate WSV in pharmacometric count data analysis. These approaches were applied to published count models for seizure counts and Likert pain scores. Both approaches improved the model fits significantly. In addition, stochastic simulation and estimation were used to further explore the capability of the two approaches to diagnose and improve models where existing WSV is not recognized. The results of the simulations confirmed the gain from introducing WSV as dIOV and SDEs when parameters vary randomly over time. Further, the approaches were also informative as diagnostics of model misspecification when parameters changed systematically over time but this was not recognized in the structural model. The proposed approaches in this study offer strategies to characterize WSV and are not restricted to count data.
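To make the notion of within-subject (occasion-to-occasion) variability in count data concrete, here is a sketch that simulates Poisson counts whose rate carries both a subject-level and an occasion-level random effect. All parameter values are assumed for illustration; this is not the published dIOV or SDE model.

```python
import numpy as np

def simulate_counts(n_subj=100, n_occ=6, base_lam=2.0,
                    omega_iiv=0.3, omega_iov=0.4, seed=2):
    """Simulate seizure-like counts: the Poisson rate varies between
    subjects (IIV, eta) and between occasions within a subject (WSV
    as IOV, kappa)."""
    rng = np.random.default_rng(seed)
    eta = rng.normal(0, omega_iiv, n_subj)               # subject-level effect
    kappa = rng.normal(0, omega_iov, (n_subj, n_occ))    # occasion-level effect
    lam = base_lam * np.exp(eta[:, None] + kappa)
    return rng.poisson(lam)
```

With a nonzero occasion-level effect, counts within a subject are overdispersed relative to a pure Poisson process, which is the signal that motivates modelling WSV explicitly.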
Faggion, Clovis M; Huda, Fahd; Wasiak, Jason
2014-06-01
To evaluate the methodological approaches used to assess the quality of studies included in systematic reviews (SRs) in periodontology and implant dentistry. Two electronic databases (PubMed and Cochrane Database of Systematic Reviews) were searched independently to identify SRs examining interventions published through 2 September 2013. The reference lists of included SRs and records of 10 specialty dental journals were searched manually. Methodological approaches were assessed using seven criteria based on the Cochrane Handbook for Systematic Reviews of Interventions. Temporal trends in methodological quality were also explored. Of the 159 SRs with meta-analyses included in the analysis, 44 (28%) reported the use of domain-based tools, 15 (9%) reported the use of checklists and 7 (4%) reported the use of scales. Forty-two (26%) SRs reported use of more than one tool. Criteria were met heterogeneously; authors of 15 (9%) publications incorporated the quality of evidence of primary studies into SRs, whereas 69% of SRs reported methodological approaches in the Materials/Methods section. Reporting of four criteria was significantly better in recent (2010-2013) than in previous publications. The analysis identified several methodological limitations of approaches used to assess evidence in studies included in SRs in periodontology and implant dentistry. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
A Novel Approach to Improving Utilization of Laboratory Testing.
Zhou, Yaolin; Procop, Gary W; Riley, Jacquelyn D
2018-02-01
The incorporation of best practice guidelines into one's institution is a challenging goal of utilization management, and the successful adoption of such guidelines depends on institutional context. Laboratorians who have access to key clinical data are well positioned to understand existing local practices and promote more appropriate laboratory testing. We aimed to apply a novel approach to utilization management by reviewing international clinical guidelines and current institutional practices to create a reliable mechanism to improve detection and reduce unnecessary tests in our patient population. We targeted a frequently ordered genetic test for HFE-related hereditary hemochromatosis, a disorder of low penetrance. After reviewing international practice guidelines, we evaluated 918 HFE tests and found that all patients with new diagnoses had transferrin saturation levels that were significantly higher than those of patients with nonrisk genotypes (72% versus 42%; P < .001). Our "one-button" order that restricts HFE genetic tests to patients with transferrin saturation greater than 45% is consistent with published practice guidelines and detected 100% of new patients with HFE-related hereditary hemochromatosis. Our proposed algorithm differs from previously published approaches in that it incorporates both clinical practice guidelines and local physician practices, yet requires no additional hands-on effort from pathologists or clinicians. This novel approach to utilization management embraces the role of pathologists as leaders in promoting high-quality patient care in local health care systems.
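The reflex rule described, releasing the genetic order only above the transferrin-saturation cutoff, is simple enough to state directly in code; the function name below is hypothetical.

```python
def approve_hfe_test(transferrin_sat_pct):
    """Reflex-ordering rule sketched from the abstract: release the HFE
    genotyping order only when transferrin saturation exceeds 45%."""
    return transferrin_sat_pct > 45.0
```

Encoding the gate in the order-entry system is what makes the restriction hands-off for pathologists and clinicians.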
Should Cesarean Scar Defect Be Treated Laparoscopically? A Case Report and Review of the Literature.
Api, Murat; Boza, Aysen; Gorgen, Husnu; Api, Olus
2015-01-01
Several obstetric complications due to inappropriately healed cesarean scars, such as placenta accreta, scar dehiscence, and ectopic scar pregnancy, are increasingly reported along with rising cesarean rates. Furthermore, many gynecologic conditions, including abnormal uterine bleeding, pelvic pain and infertility, are attributed to deficient cesarean scar healing. Hysteroscopy is the most commonly reported approach for the revision of cesarean scar defects (CSDs). Nevertheless, existing evidence is inadequate to conclude that either hysteroscopy or laparoscopy is effective or superior to the other. Although several management options have been suggested recently, the laparoscopic approach has not been thoroughly scrutinized. We present a case and review the data related to the laparoscopic repair of CSDs, comparing the hysteroscopic and laparoscopic management options based on data from previously published articles. As a result of our analyses, the laparoscopic approach increases uterine wall thickness when compared with the hysteroscopic approach, and both surgical techniques seem to be effective for the resolution of gynecologic symptoms. Hysteroscopic treatment most likely corrects the scar defect but does not strengthen the uterine wall; thus, the potential risk of dehiscence or rupture in subsequent pregnancies does not seem to be improved. Because large uterine defects are known risk factors for scar dehiscence, repair of the defect to reinforce myometrial endurance seems to be an appropriate method of treatment. Copyright © 2015 AAGL. Published by Elsevier Inc. All rights reserved.
Improving accuracy for identifying related PubMed queries by an integrated approach.
Lu, Zhiyong; Wilbur, W John
2009-10-01
PubMed is the most widely used tool for searching biomedical literature online. As with many other online search tools, a user often types a series of multiple related queries before retrieving satisfactory results to fulfill a single information need. Meanwhile, it is also a common phenomenon to see a user type queries on unrelated topics in a single session. In order to study PubMed users' search strategies, it is necessary to be able to automatically separate unrelated queries and group together related queries. Here, we report a novel approach combining both lexical and contextual analyses for segmenting PubMed query sessions and identifying related queries and compare its performance with the previous approach based solely on concept mapping. We experimented with our integrated approach on sample data consisting of 1539 pairs of consecutive user queries in 351 user sessions. The prediction results of 1396 pairs agreed with the gold-standard annotations, achieving an overall accuracy of 90.7%. This demonstrates that our approach is significantly better than the previously published method. By applying this approach to a one day query log of PubMed, we found that a significant proportion of information needs involved more than one PubMed query, and that most of the consecutive queries for the same information need are lexically related. Finally, the proposed PubMed distance is shown to be an accurate and meaningful measure for determining the contextual similarity between biological terms. The integrated approach can play a critical role in handling real-world PubMed query log data as is demonstrated in our experiments.
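A rough sketch of the lexical half of such an approach: group consecutive queries in a session by token overlap, starting a new group when a query is unrelated to its predecessor. The Jaccard measure and threshold here are illustrative assumptions; the paper combines lexical and contextual evidence.

```python
def lexically_related(q1, q2, threshold=0.2):
    """Token Jaccard overlap between two consecutive queries."""
    t1, t2 = set(q1.lower().split()), set(q2.lower().split())
    if not t1 or not t2:
        return False
    return len(t1 & t2) / len(t1 | t2) >= threshold

def segment_session(queries, threshold=0.2):
    """Split a session into groups of related queries: start a new
    group whenever a query is unrelated to the previous one."""
    groups = [[queries[0]]]
    for prev, cur in zip(queries, queries[1:]):
        if lexically_related(prev, cur, threshold):
            groups[-1].append(cur)
        else:
            groups.append([cur])
    return groups
```

This captures the common case the abstract reports, in which most consecutive queries for the same information need share terms, while topic switches share none.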
Is workplace health promotion research in the Nordic countries really on the right track?
Torp, Steffen; Vinje, Hege Forbech
2014-11-01
The aims of this scoping review of research on workplace health promotion interventions in the Nordic countries were to investigate: how the studies defined health; whether the studies intended to change the workplace itself (the settings approach); and whether the research focus regarding definitions of health and use of settings approaches has changed in the most recent five-year period compared with earlier times. Using scientific literature databases, we searched for intervention studies labelled as "health promotion" in an occupational setting in the Nordic countries (Denmark, Finland, Iceland, Norway and Sweden) published from 1986 to 2014. We identified 63 publications and qualitatively analysed their content regarding health outcomes and their use of settings approaches. The reviewed studies focused primarily on preventing disease rather than promoting positive measures of health. In addition, most studies did not try to change the workplace but rather used the workplace as a convenient setting for reaching people to change their behaviour related to lifestyles and disease prevention. Participatory and non-participatory settings approaches to promote well-being and other positive health measures have been used only to a minor degree. The recent studies' definitions of health and use of settings approaches did not differ much from the studies published earlier. Workplace health promotion in the Nordic countries should more often include positive health measures and settings approaches. In intervention research, it is important to anchor workplace health promotion among important stakeholders, such as unions and employers, by arguing that sustainable production depends on workers' health. © 2014 the Nordic Societies of Public Health.
Immunotherapy Approaches for Malignant Glioma From 2007 to 2009
Sampson, John H.
2012-01-01
Malignant glioma is a deadly disease for which there have been few therapeutic advances over the past century. Although previous treatments were largely unsuccessful, glioma may be an ideal target for immune-based therapy. Recently, translational research led to several clinical trials based on tumor immunotherapy to treat patients with malignant glioma. Here we review 17 recent glioma immunotherapy clinical trials, published over the past 3 years. Various approaches were used, including passive transfer of naked and radiolabeled antibodies, tumor antigen-specific peptide immunization, and the use of patient tumor cells with or without dendritic cells as vaccines. We compare and discuss the current state of the art of clinical immunotherapy treatment, as well as its limited successes, pitfalls, and future potential. PMID:20424975
Finite difference time domain calculation of transients in antennas with nonlinear loads
NASA Technical Reports Server (NTRS)
Luebbers, Raymond J.; Beggs, John H.; Kunz, Karl S.; Chamberlin, Kent
1991-01-01
In this paper, transient fields for antennas with more general geometries are calculated directly using Finite Difference Time Domain (FDTD) methods. In each FDTD cell which contains a nonlinear load, a nonlinear equation is solved at each time step. As a test case, the transient current in a long dipole antenna with a nonlinear load excited by a pulsed plane wave is computed using this approach. The results agree well with previously published calculated and measured results. The approach given here extends the applicability of the FDTD method to problems involving scattering from targets including nonlinear loads and materials, and to coupling between antennas containing nonlinear loads. It may also be extended to propagation through nonlinear materials.
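The per-cell nonlinear solve described above can be illustrated with a minimal sketch: a 1D free-space FDTD grid with a single diode-like load cell, where the field update at that cell is implicit and solved by Newton iteration every time step. This is not the paper's thin-wire antenna model; the grid, source, and diode parameters below are invented for illustration.

```python
import numpy as np

def fdtd_nonlinear_load(steps=400, n=200, load=100):
    """1D FDTD (Ez, Hy) with a diode-like nonlinear load at one cell.

    Illustrative sketch only: a free-space 1D grid, not a thin-wire
    antenna model; the diode parameters are hypothetical."""
    c, eps0, mu0 = 3e8, 8.854e-12, 4e-7 * np.pi
    dz = 1e-2
    dt = 0.5 * dz / c                      # Courant-stable time step
    Is, Vt, area = 1e-9, 0.026, dz * dz    # saturation current, thermal voltage
    Ez, Hy = np.zeros(n), np.zeros(n - 1)
    hist = []
    for t in range(steps):
        # Standard linear Yee updates for all cells
        Hy += dt / (mu0 * dz) * (Ez[1:] - Ez[:-1])
        Ez[1:-1] += dt / (eps0 * dz) * (Hy[1:] - Hy[:-1])
        Ez[10] += np.exp(-((t - 60) / 20.0) ** 2)   # additive Gaussian pulse source
        # The load cell update is implicit in the new field value E:
        #   E = E_linear - dt/(eps0*area) * Is*(exp(E*dz/Vt) - 1)
        # and is solved by Newton iteration at every time step.
        E, E_lin, k = Ez[load], Ez[load], dt / (eps0 * area)
        for _ in range(20):
            expo = np.exp(np.clip(E * dz / Vt, -50.0, 50.0))
            f = E - E_lin + k * Is * (expo - 1.0)
            df = 1.0 + k * Is * dz / Vt * expo
            step = f / df
            E -= step
            if abs(step) < 1e-12:
                break
        Ez[load] = E
        hist.append(E)
    return np.array(hist)
```

Because only the load cell requires the iterative solve, the cost added to the otherwise explicit scheme is small, which mirrors the appeal of the approach for antennas with isolated nonlinear elements.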
Ethics and law in research with human biological samples: a new approach.
Petrini, Carlo
2014-01-01
During the last century a large number of documents (regulations, ethical codes, treatises, declarations, conventions) were published on the subject of ethics and clinical trials, many of them focusing on the protection of research participants. More recently various proposals have been put forward to relax some of the constraints imposed on research by these documents and regulations. It is important to distinguish between risks deriving from direct interventions on human subjects and other types of risk. In Italy the Data Protection Authority has acted in the question of research using previously collected health data and biological samples to simplify the procedures regarding informed consent. The new approach may be of help to other researchers working outside Italy.
Gale, Nicola K; Heath, Gemma; Cameron, Elaine; Rashid, Sabina; Redwood, Sabi
2013-09-18
The Framework Method is becoming an increasingly popular approach to the management and analysis of qualitative data in health research. However, there is confusion about its potential application and limitations. The article discusses when it is appropriate to adopt the Framework Method and explains the procedure for using it in multi-disciplinary health research teams, or those that involve clinicians, patients and lay people. The stages of the method are illustrated using examples from a published study. Used effectively, with the leadership of an experienced qualitative researcher, the Framework Method is a systematic and flexible approach to analysing qualitative data and is appropriate for use in research teams even where not all members have previous experience of conducting qualitative research.
Improved determination of particulate absorption from combined filter pad and PSICAM measurements.
Lefering, Ina; Röttgers, Rüdiger; Weeks, Rebecca; Connor, Derek; Utschig, Christian; Heymann, Kerstin; McKee, David
2016-10-31
Filter pad light absorption measurements are subject to two major sources of experimental uncertainty: the so-called pathlength amplification factor, β, and scattering offsets, o, for which previous null-correction approaches are limited by recent observations of non-zero absorption in the near infrared (NIR). A new filter pad absorption correction method is presented here which uses linear regression against point-source integrating cavity absorption meter (PSICAM) absorption data to simultaneously resolve both β and the scattering offset. The PSICAM has previously been shown to provide accurate absorption data, even in highly scattering waters. Comparisons of PSICAM and filter pad particulate absorption data reveal linear relationships that vary on a sample-by-sample basis. This regression approach provides significantly better agreement with PSICAM data (3.2% RMS%E) than previously published filter pad absorption corrections. Results show that direct transmittance (T-method) filter pad absorption measurements perform effectively at the same level as more complex geometrical configurations based on integrating cavity measurements (IS-method and QFT-ICAM), because the linear regression correction compensates for the sensitivity to scattering errors in the T-method. This approach produces accurate filter pad particulate absorption data for wavelengths in the blue/UV and in the NIR, where sensitivity issues with PSICAM measurements limit performance. The combination of filter pad absorption and PSICAM measurements is therefore recommended for generating full-spectral, best-quality particulate absorption data, as it enables correction of multiple error sources across both measurements.
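The core of the correction is an ordinary linear regression of filter pad absorption against PSICAM absorption, whose slope estimates β and whose intercept estimates the scattering offset. A short sketch with synthetic spectra (the spectrum shape, β, offset, and noise level below are all invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "true" particulate absorption spectrum, standing in for PSICAM data
wavelengths = np.arange(400, 751, 5)
a_psicam = 0.5 * np.exp(-0.01 * (wavelengths - 400)) + 0.02

# Simulated filter pad data: pathlength amplification (beta) scales the
# spectrum, a scattering offset shifts it, plus measurement noise.
beta_true, offset_true = 2.4, 0.015   # hypothetical values
a_filter = beta_true * a_psicam + offset_true + rng.normal(0, 0.002, a_psicam.size)

# Regressing filter pad against PSICAM absorption resolves both
# beta (slope) and the scattering offset (intercept) simultaneously.
beta_fit, offset_fit = np.polyfit(a_psicam, a_filter, 1)

# Corrected filter pad spectrum
a_corrected = (a_filter - offset_fit) / beta_fit
```

Because the fit is done per sample, sample-to-sample variation in β and the offset (as reported in the abstract) is absorbed automatically rather than assumed constant.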
Oesterlund, Anna H; Lander, Flemming; Lauritsen, Jens
2016-10-01
The occupational injury incident rate remains relatively high in the European Union. The case-crossover study gives a unique opportunity to study transient risk factors that would normally be very difficult to approach. Studies of this kind have been carried out in both America and Asia, but so far no relevant research has been conducted in Europe. Case-crossover studies of occupational injuries were collected from PubMed and Embase and reviewed. Previous experiences concerning method, exposure and outcome, time-related measurements and construction of the questionnaire were taken into account in the preparation of a pilot study. Consequently, experiences from the pilot study were used to design the study protocol. Approximately 2000 patients with an occupational injury will be recruited from the emergency departments in Herning and Odense, Denmark. A standardised questionnaire will be used to collect basic demographic data and information on eight transient risk factors. Based on previous studies and knowledge of occupational injuries, the transient risk factors we chose to examine were: time pressure, performing a task with a different method/using an unaccustomed technique, change in working surroundings, using a phone, disagreement, feeling ill, being distracted and using malfunctioning machinery/tools or work material. Exposure time 'just before the injury' will be compared with two control periods, 'the previous day at the same time as the injury' (pair match) and the previous work week (usual frequency). This study protocol describes a unique opportunity to calculate the effect of transient risk factors on occupational injuries in a European setting. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Lu, Ruipeng; Mucaki, Eliseos J; Rogan, Peter K
2017-03-17
Data from ChIP-seq experiments can be used to derive the genome-wide binding specificities of transcription factors (TFs) and other regulatory proteins. We analyzed 765 ENCODE ChIP-seq peak datasets of 207 human TFs with a novel motif discovery pipeline based on recursive, thresholded entropy minimization. This approach, while obviating the need to compensate for skewed nucleotide composition, distinguishes true binding motifs from noise, quantifies the strengths of individual binding sites based on computed affinity and detects adjacent cofactor binding sites that coordinate with the targets of primary, immunoprecipitated TFs. We obtained contiguous and bipartite information theory-based position weight matrices (iPWMs) for 93 sequence-specific TFs, discovered 23 cofactor motifs for 127 TFs and revealed six high-confidence novel motifs. The reliability and accuracy of these iPWMs were determined via four independent validation methods, including the detection of experimentally proven binding sites, explanation of the effects of characterized SNPs, comparison with previously published motifs and statistical analyses. We also predict previously unreported TF coregulatory interactions (e.g. TF complexes). These iPWMs constitute a powerful tool for predicting the effects of sequence variants in known binding sites, performing mutation analysis on regulatory SNPs and predicting previously unrecognized binding sites and target genes. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
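The information-theoretic scoring that underlies such position weight matrices can be sketched on a toy alignment. The binding sites and pseudocount scheme below are invented, and the paper's recursive entropy-minimization pipeline is far more elaborate; this only shows how per-position information content and a log-odds site score are computed.

```python
import numpy as np

sites = ["CACGTG", "CACGTG", "CATGTG", "CACGTT", "CACGTG"]  # toy aligned binding sites
bases = "ACGT"

# Per-position base counts across the aligned sites
counts = np.zeros((len(sites[0]), 4))
for s in sites:
    for i, b in enumerate(s):
        counts[i, bases.index(b)] += 1

# Frequencies with a small pseudocount to avoid log(0)
p = (counts + 0.25) / (counts.sum(axis=1, keepdims=True) + 1)
H = -(p * np.log2(p)).sum(axis=1)   # per-position entropy (bits)
info = 2.0 - H                      # information content per position
pwm = np.log2(p / 0.25)             # log-odds weights vs a uniform background

def site_strength(seq):
    """Sum of position weights: a simple proxy for binding-site affinity."""
    return sum(pwm[i, bases.index(b)] for i, b in enumerate(seq))
```

Minimizing column entropy is equivalent to maximizing this information content, which is why low-entropy columns correspond to strongly specified positions in the motif.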
Edger, Patrick P; VanBuren, Robert; Colle, Marivi; Poorten, Thomas J; Wai, Ching Man; Niederhuth, Chad E; Alger, Elizabeth I; Ou, Shujun; Acharya, Charlotte B; Wang, Jie; Callow, Pete; McKain, Michael R; Shi, Jinghua; Collier, Chad; Xiong, Zhiyong; Mower, Jeffrey P; Slovin, Janet P; Hytönen, Timo; Jiang, Ning; Childs, Kevin L; Knapp, Steven J
2018-02-01
Although draft genomes are available for most agronomically important plant species, the majority are incomplete, highly fragmented, and often riddled with assembly and scaffolding errors. These assembly issues hinder advances in tool development for functional genomics and systems biology. Here we utilized a robust, cost-effective approach to produce high-quality reference genomes. We report a near-complete genome of diploid woodland strawberry (Fragaria vesca) using single-molecule real-time sequencing from Pacific Biosciences (PacBio). This assembly has a contig N50 length of ∼7.9 million base pairs (Mb), representing a ∼300-fold improvement of the previous version. The vast majority (>99.8%) of the assembly was anchored to 7 pseudomolecules using 2 sets of optical maps from Bionano Genomics. We obtained ∼24.96 Mb of sequence not present in the previous version of the F. vesca genome and produced an improved annotation that includes 1496 new genes. Comparative syntenic analyses uncovered numerous, large-scale scaffolding errors present in each chromosome in the previously published version of the F. vesca genome. Our results highlight the need to improve existing short-read based reference genomes. Furthermore, we demonstrate how genome quality impacts commonly used analyses for addressing both fundamental and applied biological questions. © The Authors 2017. Published by Oxford University Press.
Resolution of a life-threatening complication after lung radiofrequency ablation.
Andreetti, Claudio; Maurizi, Giulio; Cassiano, Francesco; Rendina, Erino Angelo
2014-10-01
Lung radiofrequency ablation (RFA) is an option for the treatment of unresectable lung cancer. Clinical investigators have previously warned against severe complications associated with this procedure. We report a case of life-threatening complication after lung RFA for non-operable non-small-cell lung cancer consisting of pulmonary abscess evolving into a bronchopleural fistula, severe pneumothorax and septic pleuritis, which was successfully treated with a multimodal conservative approach. © The Author 2014. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
Comprehensive European dietary exposure model (CEDEM) for food additives.
Tennant, David R
2016-05-01
European methods for assessing dietary exposures to nutrients, additives and other substances in food are limited by the availability of detailed food consumption data for all member states. A proposed comprehensive European dietary exposure model (CEDEM) applies summary data published by the European Food Safety Authority (EFSA) in a deterministic model based on an algorithm from the EFSA intake method for food additives. The proposed approach can predict estimates of food additive exposure provided in previous EFSA scientific opinions that were based on the full European food consumption database.
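A deterministic additive-intake calculation of this general kind reduces to a few lines. The food categories, consumption figures, use levels, and body weight below are purely hypothetical and do not reproduce EFSA's actual algorithm or data:

```python
# Hypothetical per-capita food consumption (g/day) and additive use levels (mg/kg food)
consumption = {"soft drinks": 250.0, "confectionery": 30.0, "sauces": 20.0}
use_level   = {"soft drinks": 300.0, "confectionery": 1000.0, "sauces": 500.0}
body_weight_kg = 70.0

# Deterministic intake: sum over categories of consumption x concentration,
# expressed per kilogram of body weight (mg/kg bw/day)
intake = sum(consumption[f] / 1000.0 * use_level[f] for f in consumption) / body_weight_kg
```

Summary consumption statistics per food category are what make such a model feasible without access to the full individual-level European food consumption database.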
Su, Xu; Wu, Guili; Li, Lili; Liu, Jianquan
2015-07-01
Accurate identification of species is essential for the majority of biological studies. However, defining species objectively and consistently remains a challenge, especially for plants distributed in remote regions where there is often a lack of sufficient previous specimens. In this study, multiple approaches and lines of evidence were used to determine species boundaries for plants occurring in the Qinghai-Tibet Plateau, using the genus Orinus (Poaceae) as a model system for an integrative approach to delimiting species. A total of 786 individuals from 102 populations of six previously recognized species were collected for niche, morphological and genetic analyses. Three plastid DNA regions (matK, rbcL and trnH-psbA) and one nuclear DNA region [internal transcribed spacer (ITS)] were sequenced. Whereas six species had been previously recognized, statistical analyses based on character variation, molecular data and niche differentiation identified only two well-delimited clusters, together with a third possibly originating from relatively recent hybridization between, or historical introgression from, the other two. Based on a principle of integrative species delimitation to reconcile different sources of data, the results provide compelling evidence that the six previously recognized species of the genus Orinus that were examined should be reduced to two, with new circumscriptions, and a third, identified in this study, should be described as a new species. This empirical study highlights the value of applying genetic differentiation, morphometric statistics and ecological niche modelling in an integrative approach to re-circumscribing species boundaries. The results produce relatively objective, operational and unbiased taxonomic classifications of plants occurring in remote regions. © The Author 2015. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Innovative approaches for improving maternal and newborn health--A landscape analysis.
Lunze, Karsten; Higgins-Steele, Ariel; Simen-Kapeu, Aline; Vesel, Linda; Kim, Julia; Dickson, Kim
2015-12-17
Essential interventions can improve maternal and newborn health (MNH) outcomes in low- and middle-income countries, but their implementation has been challenging. Innovative MNH approaches have the potential to accelerate progress and to lead to better health outcomes for women and newborns, but their added value to health systems remains incompletely understood. This study's aim was to analyze the landscape of innovative MNH approaches and related published evidence. Systematic literature review and descriptive analysis based on the MNH continuum of care framework and the World Health Organization health system building blocks, analyzing the range and nature of currently published MNH approaches that are considered innovative. We used 11 databases (MedLine, Web of Science, CINAHL, Cochrane, Popline, BLDS, ELDIS, 3ie, CAB direct, WHO Global Health Library and WHOLIS) as data source and extracted data according to our study protocol. Most innovative approaches in MNH are iterations of existing interventions, modified for contexts in which they had not been applied previously. Many aim at the direct organization and delivery of maternal and newborn health services or are primarily health workforce interventions. Innovative approaches also include health technologies, interventions based on community ownership and participation, and novel models of financing and policy making. Rigorous randomized trials to assess innovative MNH approaches are rare; most evaluations are smaller pilot studies. Few studies assessed intervention effects on health outcomes or focused on equity in health care delivery. Future implementation and evaluation efforts need to assess innovations' effects on health outcomes and provide evidence on potential for scale-up, considering cost, feasibility, appropriateness, and acceptability. Measuring equity is an important aspect to identify and target population groups at risk of service inequity. 
Innovative MNH interventions will need innovative implementation, evaluation and scale-up strategies for their sustainable integration into health systems.
Reference Accuracy among Research Articles Published in "Research on Social Work Practice"
ERIC Educational Resources Information Center
Wilks, Scott E.; Geiger, Jennifer R.; Bates, Samantha M.; Wright, Amy L.
2017-01-01
Objective: The objective was to examine reference errors in research articles published in Research on Social Work Practice. High rates of reference errors in other top social work journals have been noted in previous studies. Methods: Via a sampling frame of 22,177 total references among 464 research articles published in the previous decade, a…
Koga, S; Sairyo, K; Shibuya, I; Kanamori, Y; Kosugi, T; Matsumoto, H; Kitagawa, Y; Sumita, T; Dezawa, A
2012-02-01
In this report, we introduce two cases of recurrent herniated nucleus pulposus (HNP) at L5-S1 that were successfully removed using the small incised microendoscopic discectomy (sMED) technique, proposed by Dezawa and Sairyo in 2011. sMED was performed via the interlaminar approach with a percutaneous endoscope. The patients had previously undergone microendoscopic discectomy for HNP. For the recurrent HNP, the sMED interlaminar approach was selected because the HNP occurred at the level of L5-S1; the percutaneous endoscopic transforaminal approach was not possible for anatomical reasons. To perform sMED via the interlaminar approach, we employed new, specially made devices to enable us to use this technique. In conclusion, sMED is the most minimally invasive approach available for HNP, and its limitations have been gradually eliminated with the introduction of specially made devices. In the near future, percutaneous endoscopic surgery could be the gold standard for minimally invasive disc surgery. © 2012 Japan Society for Endoscopic Surgery, Asia Endosurgery Task Force and Blackwell Publishing Asia Pty Ltd.
[Published books on pain and its treatment in Spain. Analysis with the ISBN database].
Guardiola, E; Baños, J E
1995-04-01
Although analyses have been done on the publishing of scientific articles on pain in Spanish, book publications in the field have not been studied. This article fills that gap. A bibliography of books with pain approached from a medical standpoint was compiled from ISBN CD-ROM database (updated for 1993). Books going into more than one edition were considered single titles. Multi-volume collections were considered single books. We analyzed type of book, subject, ISBN classification, year, language (of publication and original), publisher and place of publication. Two hundred books were studied. Over 60% had been published within the previous 10 years. The year that showed the most books published was 1990 (19) followed by 1989 (16) and 1988 (16). Output has been rising steadily. One hundred ninety-eight books were published in Spanish and 2 in Catalan. The original language was Spanish in 114 cases, English in 51 cases, French in 21 and German in 7. By ISBN classification, most (146) covered pathology, disease and medical/therapeutic clinical practice. By topic, 51 books were general, 41 treated lumbalgia, sciatica or back pain and 35 covered headaches in general or migraine. Most of the books were issued by trade publishers. The cities most often involved were Barcelona and Madrid. An increased number of books about pain are being published in Spain, coinciding with a rise in the publication of scientific articles on the subject.
A modular approach to adaptive structures.
Pagitz, Markus; Pagitz, Manuel; Hühne, Christian
2014-10-07
A remarkable property of nastic, shape-changing plants is their complete fusion between actuators and structure. This is achieved by combining a large number of cells whose geometry, internal pressures and material properties are optimized for a given set of target shapes and stiffness requirements. An advantage of such a fusion is that cell walls are prestressed by cell pressures, which increases the overall structural stiffness while decreasing the weight. Inspired by the nastic movement of plants, Pagitz et al (2012 Bioinspir. Biomim. 7) published a novel concept for pressure-actuated cellular structures. This article extends previous work by introducing a modular approach to adaptive structures. An algorithm that breaks down any continuous target shape into a small number of standardized modules is presented. Furthermore, it is shown how cytoskeletons within each cell enhance the properties of adaptive modules. An adaptive passenger seat and an aircraft's leading and trailing edges are used to demonstrate the potential of a modular approach.
Tomlinson, Alan; Hair, Mario; McFadyen, Angus
2013-10-01
Dry eye is a multifactorial disease which requires a broad spectrum of test measures for its diagnosis and for monitoring its treatment. However, studies have typically reported improvements in individual measures with treatment. Alternative approaches involve multiple, combined outcomes being assessed by different statistical analyses. In order to assess the effect of various statistical approaches to the use of single and combined test measures in dry eye, this review reanalyzed measures from two previous studies (osmolarity, evaporation, tear turnover rate, and lipid film quality). These analyses assessed the measures as single variables within groups, pre- and post-intervention with a lubricant supplement, by creating combinations of these variables, and by validating these combinations with the combined sample of data from all groups of dry eye subjects. The effectiveness of single measures and combinations in the diagnosis of dry eye was also considered. Copyright © 2013. Published by Elsevier Inc.
Gillard, Jonathan
2015-12-01
This article re-examines parametric methods for the calculation of time-specific reference intervals where there is measurement error present in the time covariate. Previously published work has commonly been based on the standard ordinary least squares approach, weighted where appropriate. In fact, this is an incorrect method when measurement errors are present, and in this article we show that the use of this approach may, in certain cases, lead to referral patterns that vary with different values of the covariate. Thus, it would not be the case that all patients are treated equally; some subjects would be more likely to be referred than others, hence violating the principle of equal treatment required by the International Federation for Clinical Chemistry. We show, by using measurement error models, that reference intervals are produced that satisfy the requirement of equal treatment for all subjects. © The Author(s) 2011.
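The core problem can be illustrated with simulated data: ordinary least squares attenuates the slope toward zero when the covariate is observed with error, which tilts the reference-interval centerline, whereas a measurement error model (here Deming regression with a known error-variance ratio, one standard choice, not necessarily the article's exact model) recovers the true slope. All parameters below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x_true = rng.uniform(0, 10, n)            # true covariate (e.g. time)
y = 2.0 + 0.8 * x_true + rng.normal(0, 0.5, n)
x_obs = x_true + rng.normal(0, 1.0, n)    # covariate observed with error

# OLS on the error-prone covariate: the slope is attenuated toward zero
b_ols = np.polyfit(x_obs, y, 1)[0]

def deming_slope(x, y, lam):
    """Deming regression slope with known error-variance ratio
    lam = var(y errors) / var(x errors)."""
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    d = syy - lam * sxx
    return (d + np.sqrt(d * d + 4 * lam * sxy * sxy)) / (2 * sxy)

b_dem = deming_slope(x_obs, y, lam=0.5**2 / 1.0**2)
```

With an attenuated slope, subjects at the extremes of the covariate would be flagged at different rates than those in the middle, which is exactly the unequal-treatment problem the article raises.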
Metrication study for large space telescope
NASA Technical Reports Server (NTRS)
Creswick, F. A.; Weller, A. E.
1973-01-01
Various approaches which could be taken in developing a metric-system design for the Large Space Telescope (LST) were investigated, considering potential penalties on development cost and time, commonality with other satellite programs, and contribution to national goals for conversion to the metric system of units. Information on the problems, potential approaches, and impacts of metrication was collected from published reports on previous aerospace-industry metrication-impact studies and through numerous telephone interviews. The recommended approach to LST metrication formulated in this study calls for new components and subsystems to be designed in metric-module dimensions, but U.S. customary practice is allowed where U.S. metric standards and metric components are not available or would be unsuitable. Electrical/electronic-system design, which is presently largely metric, is considered exempt from further metrication. An important guideline is that metric design and fabrication should in no way compromise the effectiveness of the LST equipment.
Neville, David C A; Coquard, Virginie; Priestman, David A; te Vruchte, Danielle J M; Sillence, Daniel J; Dwek, Raymond A; Platt, Frances M; Butters, Terry D
2004-08-15
Interest in cellular glycosphingolipid (GSL) function has necessitated the development of a rapid and sensitive method to both analyze and characterize the full complement of structures present in various cells and tissues. An optimized method to characterize oligosaccharides released from glycosphingolipids following ceramide glycanase digestion has been developed. The procedure uses the fluorescent compound anthranilic acid (2-aminobenzoic acid; 2-AA) to label oligosaccharides prior to analysis using normal-phase high-performance liquid chromatography. The labeling procedure is rapid, selective, and easy to perform and is based on the published method of Anumula and Dhume [Glycobiology 8 (1998) 685], originally used to analyze N-linked oligosaccharides. It is less time consuming than a previously published 2-aminobenzamide labeling method [Anal. Biochem. 298 (2001) 207] for analyzing GSL-derived oligosaccharides, as the fluorescent labeling is performed on the enzyme reaction mixture. The purification of 2-AA-labeled products has been improved to ensure recovery of oligosaccharides containing one to four monosaccharide units, which was not previously possible using the Anumula and Dhume post-derivatization purification procedure. This new approach may also be used to analyze both N- and O-linked oligosaccharides.
A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON
King, James G.; Hines, Michael; Hill, Sean; Goodman, Philip H.; Markram, Henry; Schürmann, Felix
2008-01-01
As neuronal simulations approach larger scales with increasing levels of detail, the neurosimulator software represents only one part of a chain of tools ranging from setup and simulation to interaction with virtual environments, analysis, and visualization. Previously published approaches to abstracting simulator engines have not received widespread acceptance, which in part may be due to the fact that they tried to address the challenge of solving the model specification problem. Here, we present an approach that uses a neurosimulator, in this case NEURON, to describe and instantiate the network model in the simulator's native model language, but then replaces the main integration loop with its own. Existing parallel network models are easily adapted to run in the presented framework. The presented approach is thus an extension to NEURON, but uses a component-based architecture to allow for replaceable spike exchange components and pluggable components for monitoring, analysis, or control that can run in this framework alongside the simulation. PMID:19430597
Dommert, M; Reginatto, M; Zboril, M; Fiedler, F; Helmbrecht, S; Enghardt, W; Lutz, B
2017-11-28
Bonner sphere measurements are typically analyzed using unfolding codes. It is well known that it is difficult to obtain reliable estimates of uncertainties from standard unfolding procedures. An alternative approach is to analyze the data using Bayesian parameter estimation. This method provides reliable estimates of the uncertainties of neutron spectra, leading to rigorous estimates of uncertainties of the dose. We extend previous Bayesian approaches and apply the method to stray neutrons in proton therapy environments by introducing a new parameterized model which describes the main features of the expected neutron spectra. The parameterization is based on information that is available from measurements and detailed Monte Carlo simulations. This approach was validated with the results of an experiment using Bonner spheres carried out at the experimental hall of the OncoRay proton therapy facility in Dresden. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelly-Gorham, Molly Rose K.; DeVetter, Brent M.; Brauer, Carolyn S.
We have re-investigated the optical constants n and k for the homologous series of inorganic salts barium fluoride (BaF2) and calcium fluoride (CaF2) using a single-angle near-normal incidence reflectance device in combination with a calibrated Fourier transform infrared (FTIR) spectrometer. Our results are in good qualitative agreement with most previous works. However, certain features of the previously published data near the reststrahlen band exhibit distinct differences in spectral characteristics. Notably, our measurements of BaF2 do not include a spectral feature in the ~250 cm-1 reststrahlen band that was previously published. Additionally, CaF2 exhibits a distinct wavelength shift relative to the model derived from previously published data. We confirmed our results against recently published works that use significantly more modern instrumentation and data reduction techniques.
Segmentation of cortical bone using fast level sets
NASA Astrophysics Data System (ADS)
Chowdhury, Manish; Jörgens, Daniel; Wang, Chunliang; Smedby, Årjan; Moreno, Rodrigo
2017-02-01
Cortical bone plays a major role in the mechanical competence of bone, and its analysis requires accurate segmentation methods. Level set methods are usually state-of-the-art for segmenting medical images. However, traditional implementations of this method are computationally expensive. This drawback was recently tackled through the so-called coherent propagation extension of the classical algorithm, which has decreased computation times dramatically. In this study, we assess the potential of this technique for segmenting cortical bone in interactive time in 3D images acquired through High Resolution peripheral Quantitative Computed Tomography (HR-pQCT). The obtained segmentations are used to estimate the cortical thickness and cortical porosity of the investigated images, computed using sphere fitting and mathematical morphological operations, respectively. Qualitative comparison between the segmentations of our proposed algorithm and a previously published approach on six image volumes reveals superior smoothness properties of the level set approach. While the proposed method yields similar results to previous approaches in regions where the boundary between trabecular and cortical bone is well defined, it yields more stable segmentations in challenging regions, resulting in more stable estimation of cortical bone parameters. The proposed technique takes a few seconds to compute, which makes it suitable for clinical settings.
Chevance, Aurélie; Schuster, Tibor; Steele, Russell; Ternès, Nils; Platt, Robert W
2015-10-01
Robustness of an existing meta-analysis can justify decisions on whether to conduct an additional study addressing the same research question. We illustrate the graphical assessment of the potential impact of an additional study on an existing meta-analysis using published data on statin use and the risk of acute kidney injury. A previously proposed graphical augmentation approach is used to assess the sensitivity of the current test and heterogeneity statistics extracted from existing meta-analysis data. In addition, we extended the graphical augmentation approach to assess potential changes in the pooled effect estimate after updating a current meta-analysis, and applied the three graphical contour definitions to data from meta-analyses on statin use and acute kidney injury risk. In the considered example data, the pooled effect estimates and heterogeneity indices proved to be considerably robust to the addition of a future study. Notably, for some previously inconclusive meta-analyses, a study update might yield a statistically significant kidney injury risk increase associated with higher statin exposure. The illustrated contour approach should become a standard tool for the assessment of the robustness of meta-analyses. It can guide decisions on whether to conduct additional studies addressing a relevant research question. Copyright © 2015 Elsevier Inc. All rights reserved.
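The idea of augmenting a meta-analysis with a hypothetical additional study can be sketched numerically. Below, a fixed-effect inverse-variance pooled estimate is recomputed over a grid of possible new-study results, which is the computation that underlies a contour plot of the updated estimate; the effect sizes and standard errors are invented, and the paper's approach also tracks heterogeneity statistics, which this sketch omits.

```python
import numpy as np

# Hypothetical existing meta-analysis: study effect estimates (e.g. log RR) and SEs
yi = np.array([0.10, 0.25, -0.05, 0.18])
sei = np.array([0.12, 0.20, 0.15, 0.10])

wi = 1 / sei**2                               # inverse-variance weights
pooled = np.sum(wi * yi) / np.sum(wi)         # fixed-effect pooled estimate
se_pooled = np.sqrt(1 / np.sum(wi))

def updated_pooled(y_new, se_new):
    """Fixed-effect pooled estimate after adding one hypothetical study."""
    w = np.append(wi, 1 / se_new**2)
    y = np.append(yi, y_new)
    est = np.sum(w * y) / np.sum(w)
    se = np.sqrt(1 / np.sum(w))
    return est, est / se                      # estimate and its z statistic

# Scan a grid of plausible new-study results, as in an augmentation contour plot
grid = [(y, s) for y in np.linspace(-0.5, 0.5, 11) for s in (0.05, 0.1, 0.2)]
signif = [(y, s) for (y, s) in grid if abs(updated_pooled(y, s)[1]) >= 1.96]
```

In this toy example the existing pooled result is just short of significance, and the grid identifies which hypothetical new studies would tip the updated meta-analysis over the threshold, mirroring the "study update might yield significance" scenario in the abstract.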
Mahieu, Nathaniel G; Patti, Gary J
2017-10-03
When using liquid chromatography/mass spectrometry (LC/MS) to perform untargeted metabolomics, it is now routine to detect tens of thousands of features from biological samples. Poor understanding of the data, however, has complicated interpretation and masked the number of unique metabolites actually being measured in an experiment. Here we place an upper bound on the number of unique metabolites detected in Escherichia coli samples analyzed with one untargeted metabolomics method. We first group multiple features arising from the same analyte, which we call "degenerate features", using a context-driven annotation approach. Surprisingly, this analysis revealed thousands of previously unreported degeneracies that reduced the number of unique analytes to ∼2961. We then applied an orthogonal approach to remove nonbiological features from the data using the 13C-based credentialing technology. This further reduced the number of unique analytes to less than 1000. Our 90% reduction in data is 5-fold greater than previously published studies. On the basis of the results, we propose an alternative approach to untargeted metabolomics that relies on thoroughly annotated reference data sets. To this end, we introduce the creDBle database (http://creDBle.wustl.edu), which contains accurate mass, retention time, and MS/MS fragmentation data as well as annotations of all credentialed features.
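Grouping degenerate features by retention-time proximity and known mass relationships (isotopes, adducts) can be sketched as follows. The feature values, tolerances, and greedy strategy are illustrative only, not the paper's context-driven annotation pipeline.

```python
# Hypothetical feature table: (m/z, retention time in s, intensity)
features = [
    (180.0634, 312.4, 1e6),    # e.g. a protonated metabolite, [M+H]+
    (181.0667, 312.5, 1.1e5),  # 13C isotope peak: +1.00336
    (202.0453, 312.6, 3e5),    # sodium adduct: +21.98194
    (255.2330, 480.1, 8e5),    # unrelated analyte at a different RT
]

ISOTOPE, NA_ADDUCT = 1.00336, 21.98194   # known mass differences (Da)
MZ_TOL, RT_TOL = 0.005, 5.0              # assumed instrument tolerances

def group_degenerate(feats):
    """Greedy grouping: features within an RT window whose m/z differences
    match known relationships are assigned to the same analyte group."""
    groups = []
    for mz, rt, inten in sorted(feats, key=lambda f: -f[2]):
        for g in groups:
            mz0, rt0 = g[0][0], g[0][1]
            dmz = mz - mz0
            related = any(abs(dmz - d) < MZ_TOL for d in (0.0, ISOTOPE, NA_ADDUCT))
            if abs(rt - rt0) < RT_TOL and related:
                g.append((mz, rt, inten))
                break
        else:
            groups.append([(mz, rt, inten)])
    return groups

groups = group_degenerate(features)
# → 2 analyte groups: one with three degenerate features, one singleton
```

Counting groups rather than raw features is what collapses tens of thousands of detected features toward the much smaller number of unique analytes reported in the abstract.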
Ellner, Stephen P; Geber, Monica A; Hairston, Nelson G
2011-06-01
Rapid contemporary evolution due to natural selection is common in the wild, but it remains uncertain whether its effects are an essential component of community and ecosystem structure and function. Previously we showed how to partition change in a population, community or ecosystem property into contributions from environmental and trait change, when trait change is entirely caused by evolution (Hairston et al. 2005). However, when substantial non-heritable trait change occurs (e.g. due to phenotypic plasticity or change in population structure) that approach can mis-estimate both contributions. Here, we demonstrate how to disentangle ecological impacts of evolution vs. non-heritable trait change by combining our previous approach with the Price Equation. This yields a three-way partitioning into effects of evolution, non-heritable phenotypic change and environment. We extend the approach to cases where ecological consequences of trait change are mediated through interspecific interactions. We analyse empirical examples involving fish, birds and zooplankton, finding that the proportional contribution of rapid evolution varies widely (even among different ecological properties affected by the same trait), and that rapid evolution can be important when it acts to oppose and mitigate phenotypic effects of environmental change. Paradoxically, rapid evolution may be most important when it is least evident. © 2011 Blackwell Publishing Ltd/CNRS.
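The three-way partitioning described above can be sketched schematically as follows; the notation (property X, mean trait z̄, environment k, fitness w) is assumed for illustration and is not necessarily the authors' own:

```latex
% Step 1: split the change in an ecological property X(\bar{z}, k) into
% trait-mediated and environment-mediated parts:
\Delta X \;\approx\;
  \underbrace{X(\bar{z}', k) - X(\bar{z}, k)}_{\text{trait change}}
\;+\;
  \underbrace{X(\bar{z}, k') - X(\bar{z}, k)}_{\text{environmental change}}

% Step 2: the Price equation splits the trait change itself into a
% selection (evolution) term and a transmission (non-heritable) term:
\Delta \bar{z} \;=\;
  \underbrace{\mathrm{cov}\!\left(\frac{w_i}{\bar{w}},\, z_i\right)}_{\text{evolution}}
\;+\;
  \underbrace{\mathrm{E}\!\left[\frac{w_i}{\bar{w}}\,\Delta z_i\right]}_{\text{non-heritable change}}
```

Substituting the Price decomposition into the trait-change term yields the three-way partition into evolution, non-heritable phenotypic change and environment.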
Habit strength is predicted by activity dynamics in goal-directed brain systems during training.
Zwosta, Katharina; Ruge, Hannes; Goschke, Thomas; Wolfensteller, Uta
2018-01-15
Previous neuroscientific research has revealed insights into the brain networks supporting goal-directed and habitual behavior, respectively. However, it remains unclear how these networks contribute to inter-individual differences in habit strength, which is relevant for understanding not only normal behavior but also more severe dysregulation of the balance between these types of action control, as in addiction. In the present fMRI study, we trained subjects on approach and avoidance behavior for an extended period of time before testing the habit strength of the acquired stimulus-response associations. We found that stronger habits were associated with a stronger decrease in inferior parietal lobule activity for approach and avoidance behavior, and with weaker vmPFC activity at the end of training for avoidance behavior; these areas are associated with the anticipation of outcome identity and value. The vmPFC in particular showed markedly different activity dynamics during the training of approach versus avoidance behavior. Furthermore, while ongoing training was accompanied by increasing functional connectivity between the posterior putamen and premotor cortex, consistent with previous assumptions about the neural basis of increasing habitualization, this was not predictive of later habit strength. Together, our findings suggest that inter-individual differences in habitual behavior are driven by differences in the persistent involvement of brain areas supporting goal-directed behavior during training. Copyright © 2017. Published by Elsevier Inc.
Occupational and Environmental Contributions to Chronic Cough in Adults: Chest Expert Panel Report.
Tarlo, Susan M; Altman, Kenneth W; Oppenheimer, John; Lim, Kaiser; Vertigan, Anne; Prezant, David; Irwin, Richard S
2016-10-01
In response to occupational and environmental exposures, cough can be an isolated symptom reflecting exposure to an irritant with little physiological consequence, or it can be a manifestation of more significant disease. This document reviews occupational and environmental contributions to chronic cough in adults, focusing on aspects not previously covered in the 2006 ACCP Cough Guideline or our more recent systematic review, and suggests an approach to investigating these factors when they are suspected. MEDLINE and TOXLINE literature searches were supplemented by articles identified by the cough panel occupational and environmental subgroup members to identify occupational and environmental aspects of chronic cough not previously covered in the 2006 ACCP Cough Guideline. Based on the literature reviews and the Delphi methodology, the cough panel occupational and environmental subgroup developed guideline suggestions that were approved after review and voting by the full cough panel. The literature review identified relevant articles regarding mechanisms; allergic environmental causes; chronic cough and the recreational and involuntary inhalation of tobacco and marijuana smoke; nonallergic environmental triggers; laryngeal syndromes; and occupational diseases and exposures. Consensus-based statements were developed for the diagnostic approach because of a lack of strong evidence in the published literature. Despite increased understanding of cough related to occupational and environmental triggers, there remains a gap between the recommended assessment of occupational and environmental causes of cough and the reported systematic assessment of these factors, and there is a need for further documentation of occupational and environmental causes of cough in the future. Copyright © 2016 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
Book review: The Wilderness Debate Rages On: Continuing the Great New Wilderness Debate
Peter Landres
2009-01-01
The Wilderness Debate Rages On is a collection of mostly previously published papers about the meaning, value, and role of wilderness and continues the discussion that was propelled by the editors' previous book The Great New Wilderness Debate (also a collection of papers) published in 1998. The editors state that this sequel to their previous book is mandated...
New Methodology for Estimating Fuel Economy by Vehicle Class
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, Shih-Miao; Dabbs, Kathryn; Hwang, Ho-Ling
2011-01-01
This effort was undertaken for the Office of Highway Policy Information to develop a new methodology for generating annual estimates of average fuel efficiency and the number of motor vehicles registered by vehicle class for Table VM-1 of the Highway Statistics annual publication. This paper describes the new methodology developed under this effort and compares the results of the existing manual method and the new systematic approach. The methodology takes a two-step approach. First, preliminary fuel efficiency rates are estimated from vehicle stock models for different classes of vehicles. Then, a reconciliation model is used to adjust the initial fuel consumption rates from the vehicle stock models so that they match the VMT information for each vehicle class and the reported total fuel consumption. This reconciliation model uses a systematic approach that produces documentable and reproducible results. The basic framework uses a mathematical programming formulation to minimize the deviations between the fuel economy estimates published in the previous year's Highway Statistics and the results from the vehicle stock models, subject to the constraint that fuel consumption for the different vehicle classes must sum to the total fuel consumption estimate published in Table MF-21 of the current year's Highway Statistics. The results generated by this new approach provide a smoother time series for fuel economies by vehicle class, and the approach draws on the most up-to-date and best available data, with sound econometric models, to generate MPG estimates by vehicle class.
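The reconciliation step can be illustrated with a deliberately simplified sketch: a single proportional adjustment factor closes the gap between model-implied fuel use and the reported total, standing in for the full mathematical-programming formulation. All numbers and names below are hypothetical.

```python
# Simplified sketch of the reconciliation step: a single proportional
# factor replaces the full mathematical-programming formulation, and all
# inputs are hypothetical (billion miles, MPG, billion gallons).

def reconcile_mpg(vmt, mpg_prelim, total_fuel):
    """Adjust preliminary MPG values so that class-level fuel use
    sums exactly to the reported total fuel consumption."""
    implied_fuel = [v / m for v, m in zip(vmt, mpg_prelim)]
    scale = total_fuel / sum(implied_fuel)  # common adjustment factor
    # Fuel use scales up by `scale`, so MPG scales down by the same factor.
    return [m / scale for m in mpg_prelim]

vmt = [1500.0, 280.0]        # vehicle miles traveled per class
mpg_prelim = [23.0, 7.0]     # vehicle-stock-model estimates
total_fuel = 110.0           # reported total (Table MF-21 analogue)

mpg_adj = reconcile_mpg(vmt, mpg_prelim, total_fuel)
print([round(m, 2) for m in mpg_adj])
```

In the published methodology the adjustment additionally minimizes per-class deviations from the previous year's estimates, which turns this one-parameter scaling into a constrained optimization problem.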
Bayoumi, Ahmed B; Laviv, Yosef; Yokus, Burhan; Efe, Ibrahim E; Toktas, Zafer Orkun; Kilic, Turker; Demir, Mustafa K; Konya, Deniz; Kasper, Ekkehard M
2017-11-01
The aims of this study were: (1) to provide neurosurgeons and radiologists with a new quantitative and anatomical method for describing spinal meningiomas (SM) consistently; (2) to provide a guide to the surgical approach and the amount of bony resection required, based on the proposed classification; (3) to report the distribution of our 58 cases of SM over the different stages and subtypes in correlation with the surgical treatment each case required; and (4) to briefly review the literature on the rare non-conventional surgical corridors used to resect SM. We reviewed the literature to report on previously published cohorts and classifications used to describe the location of the tumor inside the spinal canal, and we reviewed previously published cases of non-conventional surgical approaches to resect spinal meningiomas. We propose a classification system composed of Staging based on the maximal cross-sectional surface area of the tumor inside the canal, Typing based on the number of quadrants occupied by the tumor, and Subtyping based on the location of the tumor bulk relative to the spinal cord. Extradural and extra-spinal growth are also covered by the classification. We then applied it retrospectively to our 58 cases. Twelve published articles used overlapping terms to describe spinal meningiomas, and another seven articles reported 23 cases of anteriorly located spinal meningiomas treated with approaches other than laminectomies/laminoplasties. Four Types, nine Subtypes and four Stages are described in our classification system. In our series of 58 patients, no midline anterior type was represented; therefore, all our cases were treated by laminectomies or laminoplasties (with or without facetectomies), except one case with a paraspinal component in which a costotransversectomy was needed. Spinal meningiomas can be radiologically described in a precise fashion, and selection of the surgical corridor depends mainly on the location of the tumor bulk inside the canal. Copyright © 2017 Elsevier B.V. All rights reserved.
Y-Chromosome Haplogroups in the Bosnian-Herzegovinian Population Based on 23 Y-STR Loci.
Doğan, Serkan; Ašić, Adna; Doğan, Gulsen; Besic, Larisa; Marjanovic, Damir
2016-07-01
In a study of the Bosnian-Herzegovinian (B&H) population, Y-chromosome marker frequencies for 100 individuals, generated using the PowerPlex Y23 kit, were used to perform Y-chromosome haplogroup assignment via Whit Athey's Haplogroup Predictor. This algorithm determines Y-chromosome haplogroups from Y-chromosome short tandem repeat (Y-STR) data using a Bayesian probability-based approach. The most frequent haplogroup was I2a, with a prevalence of 49%, followed by R1a and E1b1b, each accounting for 17% of all haplogroups within the population. The remaining haplogroups were J2a (5%), I1 (4%), R1b (4%), J2b (2%), G2a (1%), and N (1%). These results confirm preliminary B&H population data published over 10 years ago, especially the conclusion that the B&H population is part of the Western Balkan area, which served as the Last Glacial Maximum refuge for the Paleolithic human European population. Furthermore, the results corroborate the hypothesis that this area was a significant stopping point on the "Middle East-Europe highway" during the Neolithic farmer migrations. Finally, since these results are almost completely in accordance with previously published data on B&H and neighboring populations generated by Y-chromosome single nucleotide polymorphism analysis, it can be concluded that in silico analysis of Y-STRs is a reliable method for approximating the Y-chromosome haplogroup diversity of an examined population.
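The Bayesian assignment idea behind Whit Athey's Haplogroup Predictor can be sketched as a naive posterior calculation over per-haplogroup allele frequencies. The loci, frequencies and priors below are toy placeholders, not real population data, and the code is an illustration of the principle rather than the predictor's actual implementation.

```python
import math

# Toy Bayesian haplogroup assignment from Y-STR alleles. FREQS and PRIOR
# are hypothetical placeholders; a real predictor uses empirical allele
# frequency tables for many loci and haplogroups.
FREQS = {
    "I2a":   {"DYS393": {13: 0.80, 14: 0.15}, "DYS390": {24: 0.70, 23: 0.20}},
    "R1a":   {"DYS393": {13: 0.60, 14: 0.30}, "DYS390": {25: 0.65, 24: 0.25}},
    "E1b1b": {"DYS393": {13: 0.55, 12: 0.30}, "DYS390": {24: 0.60, 23: 0.30}},
}
PRIOR = {"I2a": 0.49, "R1a": 0.17, "E1b1b": 0.17}
EPS = 1e-4  # floor probability for alleles unseen in the reference table

def predict(profile):
    """Return (most probable haplogroup, posterior dict) for a Y-STR profile."""
    logpost = {}
    for hg, loci in FREQS.items():
        lp = math.log(PRIOR[hg])
        for locus, allele in profile.items():
            lp += math.log(loci.get(locus, {}).get(allele, EPS))
        logpost[hg] = lp
    # Convert log-posteriors to normalized probabilities.
    mx = max(logpost.values())
    weights = {hg: math.exp(lp - mx) for hg, lp in logpost.items()}
    z = sum(weights.values())
    post = {hg: w / z for hg, w in weights.items()}
    return max(post, key=post.get), post

best, post = predict({"DYS393": 13, "DYS390": 24})
print(best, round(post[best], 3))
```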
Effects of linking a soil-water-balance model with a groundwater-flow model
Stanton, Jennifer S.; Ryter, Derek W.; Peterson, Steven M.
2013-01-01
A previously published regional groundwater-flow model in north-central Nebraska was sequentially linked with the recently developed soil-water-balance (SWB) model to analyze effects on groundwater-flow model parameters and calibration results. The linked models provided a more detailed spatial and temporal distribution of simulated recharge based on hydrologic processes, improvement of simulated groundwater-level changes and base flows at specific sites in agricultural areas, and a physically based assessment of the relative magnitude of recharge for grassland, nonirrigated cropland, and irrigated cropland areas. Root-mean-squared (RMS) differences between the simulated and estimated or measured target values for the previously published model and the linked models were relatively similar and did not improve for all types of calibration targets. However, without any adjustment to the SWB-generated recharge, the RMS difference between simulated and estimated base-flow target values for the groundwater-flow model was slightly smaller than for the previously published model, possibly indicating that the volume of recharge simulated by the SWB code was closer to actual hydrogeologic conditions than that of the previously published model. Groundwater-level and base-flow hydrographs showed that temporal patterns of simulated groundwater levels and base flows were more accurate for the linked models than for the previously published model at several sites, particularly in agricultural areas.
Graphic facilitation as a novel approach to practice development.
Leonard, Angela; Bonaconsa, Candice; Ssenyonga, Lydia; Coetzee, Minette
2017-10-10
The active participation of staff from the outset of any health service or practice improvement process ensures they are more likely to become engaged in the implementation phases that follow initial service analyses. Graphic facilitation is a way of getting participants to develop an understanding of complex systems and articulate solutions from within them. This article describes how a graphic facilitation process enabled the members of a multidisciplinary team at a specialist paediatric neurosurgery hospital in Uganda to understand how their system worked. The large graphic representation the team created helped each team member to visualise their day-to-day practice, understand each person's contribution, celebrate their triumphs and highlight opportunities for service improvement. The process highlighted three features of their practice: an authentic interdisciplinary team approach to care, admission of the primary carer with the child, and previously unrecognised delays in patient flow through the outpatients department. The team's active participation and ownership of the process resulted in sustainable improvements to clinical practice. ©2012 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.
An overview of concept mapping in Dutch mental health care.
Nabitz, Udo; van Randeraad-van der Zee, Carlijn; Kok, Ineke; van Bon-Martens, Marja; Serverens, Peter
2017-02-01
About 25 years ago, concept mapping was introduced in the Netherlands and applied in different fields. A collection of concept mapping projects conducted in the Netherlands was identified, in part in the archive of the Netherlands Institute of Mental Health and Addiction (Trimbos Institute). Some of the 90 identified projects are internationally published. The 90 concept mapping projects reflect the changes in mental health care and can be grouped into 5-year periods and into five typologies. The studies range from conceptualizing the problems of the homeless to the specification of quality indicators for treatment programs for patients with cystic fibrosis. The number of concept mapping projects has varied over time. Growth has been considerable in the last 5 years compared to the previous 5 years. Three case studies are described in detail with 12 characteristics and graphical representations. Concept mapping aligns well with the typical Dutch approach of the "Poldermodel." A broad introduction of concept mapping in European countries in cooperation with other countries, such as the United States and Canada, would strengthen the empirical basis for applying this approach in health care policy, quality, and clinical work. Copyright © 2016. Published by Elsevier Ltd.
Gene discovery by chemical mutagenesis and whole-genome sequencing in Dictyostelium.
Li, Cheng-Lin Frank; Santhanam, Balaji; Webb, Amanda Nicole; Zupan, Blaž; Shaulsky, Gad
2016-09-01
Whole-genome sequencing is a useful approach for identification of chemical-induced lesions, but previous applications involved tedious genetic mapping to pinpoint the causative mutations. We propose that saturation mutagenesis under low mutagenic loads, followed by whole-genome sequencing, should allow direct implication of genes by identifying multiple independent alleles of each relevant gene. We tested the hypothesis by performing three genetic screens with chemical mutagenesis in the social soil amoeba Dictyostelium discoideum. Through genome sequencing, we successfully identified mutant genes with multiple alleles in near-saturation screens, including resistance to intense illumination and strong suppressors of defects in an allorecognition pathway. We tested the causality of the mutations by comparison to published data and by direct complementation tests, finding both dominant and recessive causative mutations. Therefore, our strategy provides a cost- and time-efficient approach to gene discovery by integrating chemical mutagenesis and whole-genome sequencing. The method should be applicable to many microbial systems, and it is expected to revolutionize the field of functional genomics in Dictyostelium by greatly expanding the mutation spectrum relative to other common mutagenesis methods. © 2016 Li et al.; Published by Cold Spring Harbor Laboratory Press.
Trebble, Timothy M; Paul, Maureen; Hockey, Peter M; Heyworth, Nicola; Humphrey, Rachael; Powell, Timothy; Clarke, Nicholas
2015-03-01
Improving the quality and activity of clinicians' practice improves patient care. Performance-related human resource management (HRM) is an established approach to improving individual practice but has seen limited use among clinicians. A framework for performance-related HRM was developed from successful practice in non-healthcare organisations, centred on distributive leadership and locally provided, validated and interpreted performance measurement. This study evaluated the response of medical and non-clinical managers to its implementation in a large secondary healthcare organisation. A semistructured qualitative questionnaire was developed from themes identified during framework implementation and covered attitudes to previous approaches to measuring doctors' performance, and the structure of and response to implementation of the performance-related HRM framework. Responses were analysed through a process of data summarising and categorising. A total of 29 medical and non-clinical managers, from an invited cohort of 31 and ranging from departmental to executive level, were interviewed. Three themes were identified: (1) previous systems of managing clinical performance were considered to be ineffective due to insufficient empowerment of medical managers and the poor quality of available performance data; (2) the implemented framework was considered to address these needs and was positively received by medical and non-clinical managers; (3) introduction of performance-related HRM required the involvement of the whole organisation to executive level and inclusion within organisational strategy, structure and training. This study suggests that a performance-related HRM framework may facilitate the management of clinical performance in secondary healthcare, but is dependent on the design and methods of application used. Such approaches contrast with those currently proposed for clinicians in secondary healthcare in the UK and suggest that alternative strategies should be considered.
Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
A techno-economic approach to plasma gasification
NASA Astrophysics Data System (ADS)
Ramos, Ana; Rouboa, Abel
2018-05-01
Among waste-to-energy technologies, plasma gasification is comparatively recent and therefore not yet widely commercialized. Thus, it is necessary to conduct a viability study to support the thorough understanding and implementation of this thermal treatment. This paper aims to assess technical, environmental and economic aspects of plasma gasification, paving the way for a more sustainable waste management system while taking advantage of the commodity assets granted by the technique. To this end, results from previously published studies were updated and highlighted as a preliminary starting point that may evolve into a complete and systematic assessment.
Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Y.; Keller, J.; Wallen, R.
2015-02-01
Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.
The New Sepsis Definitions: Implications for the Basic and Translational Research Communities.
Coopersmith, Craig M; Deutschman, Clifford S
2017-03-01
New definitions of sepsis and septic shock were published in early 2016, updating old definitions that have not been revisited since 2001. These new definitions should profoundly affect sepsis research. In addition, these papers present clinical criteria for identifying infected patients who are highly likely to have or to develop sepsis or septic shock. In contrast to previous approaches, these new clinical criteria are evidence based. In this review, two of the authors of the new definitions detail the content of the papers and explore the implications for shock and sepsis researchers.
How blockchain-timestamped protocols could improve the trustworthiness of medical science
Irving, Greg; Holden, John
2017-01-01
Trust in scientific research is diminished by evidence that data are being manipulated. Outcome switching, data dredging and selective publication are some of the problems that undermine the integrity of published research. Methods for using blockchain to provide proof of pre-specified endpoints in clinical trial protocols were first reported by Carlisle. We wished to empirically test such an approach using a clinical trial protocol where outcome switching has previously been reported. Here we confirm the use of blockchain as a low cost, independently verifiable method to audit and confirm the reliability of scientific studies. PMID:27239273
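The timestamping idea can be sketched in a few lines: hash the pre-specified protocol, record the digest (in practice, in a blockchain transaction), and later re-hash the published protocol to detect outcome switching. The protocol text below is made up, and no actual blockchain interaction is shown.

```python
import hashlib

# Minimal sketch of the hash-then-timestamp idea. A real deployment would
# embed the digest in a blockchain transaction so its existence at a given
# time is independently verifiable; only the hashing step is shown here.

def protocol_digest(text: str) -> str:
    """SHA-256 digest of a trial protocol; this is what gets timestamped."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def outcomes_match(published_text: str, timestamped_digest: str) -> bool:
    """Re-hash the published protocol and compare with the recorded digest.
    Any change (e.g. a switched primary outcome) yields a different hash."""
    return protocol_digest(published_text) == timestamped_digest

protocol = "Primary outcome: all-cause mortality at 12 months."
digest = protocol_digest(protocol)  # recorded on-chain before the trial
switched = "Primary outcome: hospital readmission at 12 months."

print(outcomes_match(protocol, digest))   # True
print(outcomes_match(switched, digest))   # False
```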
The coupled three-dimensional wave packet approach to reactive scattering
NASA Astrophysics Data System (ADS)
Marković, Nikola; Billing, Gert D.
1994-01-01
A recently developed scheme for time-dependent reactive scattering calculations using three-dimensional wave packets is applied to the D+H2 system. The present method is an extension of a previously published semiclassical formulation of the scattering problem and is based on the use of hyperspherical coordinates. The convergence requirements are investigated by detailed calculations for total angular momentum J equal to zero and the general applicability of the method is demonstrated by solving the J=1 problem. The inclusion of the geometric phase is also discussed and its effect on the reaction probability is demonstrated.
Jouneau, S; Dres, M; Guerder, A; Bele, N; Bellocq, A; Bernady, A; Berne, G; Bourdin, A; Brinchault, G; Burgel, P R; Carlier, N; Chabot, F; Chavaillon, J M; Cittee, J; Claessens, Y E; Delclaux, B; Deslée, G; Ferré, A; Gacouin, A; Girault, C; Ghasarossian, C; Gouilly, P; Gut-Gobert, C; Gonzalez-Bermejo, J; Jebrak, G; Le Guillou, F; Léveiller, G; Lorenzo, A; Mal, H; Molinari, N; Morel, H; Morel, V; Noel, F; Pégliasco, H; Perotin, J M; Piquet, J; Pontier, S; Rabbat, A; Revest, M; Reychler, G; Stelianides, S; Surpas, P; Tattevin, P; Roche, N
2017-04-01
Chronic obstructive pulmonary disease (COPD) is the chronic respiratory disease with the greatest burden on public health in terms of morbidity, mortality and health costs. For patients, COPD is a major source of disability because of dyspnea, restriction in daily activities, exacerbation, risk of chronic respiratory failure and extra-respiratory systemic organ disorders. The previous French Language Respiratory Society (SPLF) guidelines on COPD exacerbations were published in 2003. Using the GRADE methodology, the present document reviews current knowledge on COPD exacerbation under four specific headings: (1) epidemiology, (2) clinical evaluation, (3) therapeutic management and (4) prevention. Specific aspects of outpatient and inpatient care are discussed, especially regarding assessment of exacerbation severity and the pharmacological approach. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.
Mediating consolation with suicidal patients.
Gilje, Fredricka; Talseth, Anne-Grethe
2007-07-01
Psychiatric nurses frequently encounter suicidal patients. Caring for such patients often raises ethical questions and dilemmas. The research question for this study was: 'What understandings are revealed in texts about consolation and psychiatric nurses' responses to suicidal patients?' A Gadamerian approach guided re-interpretation of published texts. Through synthesizing four interpretive phases, a comprehensive interpretation emerged. This revealed being 'at home' with self, or an ethical way of being, as a hermeneutic understanding of a way to become ready to mediate consolation with suicidal patients. Trustworthiness was addressed by means of the qualities of auditability, credibility and confirmability. This re-interpretation adds to nursing knowledge, enhances understanding of previous research findings, provides pre-understanding for further research and reveals the value of hermeneutic inquiry in nursing. It also deepens understanding of a published model of consolation. These understandings may help to guide nurses who are struggling with suicidal patients.
Reading Level and Comprehension of Research Consent Forms: An Integrative Review.
Foe, Gabriella; Larson, Elaine L
2016-02-01
Consent forms continue to be written at a higher reading level than the recommended sixth- to eighth-grade level, making it difficult for participants to comprehend information before enrolling in research. To assess and address the extent of the problem regarding the literacy level of consent forms and to update previously published reports, we conducted an integrative literature review of English-language research published between January 1, 2000, and December 31, 2013; 35 descriptive and eight intervention studies met the inclusion criteria. Results confirmed that developing forms at an eighth-grade level is attainable but not commonly practiced. The risks-of-participation section was found to be the most poorly understood, and there was a lack of consensus regarding the most effective method of increasing comprehension. Further research using standardized tools is needed to determine the best approach to improving consent forms and processes. © The Author(s) 2016.
PISA: Federated Search in P2P Networks with Uncooperative Peers
NASA Astrophysics Data System (ADS)
Ren, Zujie; Shou, Lidan; Chen, Gang; Chen, Chun; Bei, Yijun
Recently, federated search in P2P networks has received much attention. Most of the previous work assumed a cooperative environment where each peer can actively participate in information publishing and distributed document indexing. However, little work has addressed the problem of incorporating uncooperative peers, which do not publish their own corpus statistics, into a network. This paper presents a P2P-based federated search framework called PISA which incorporates uncooperative peers as well as the normal ones. In order to address the indexing needs for uncooperative peers, we propose a novel heuristic query-based sampling approach which can obtain high-quality resource descriptions from uncooperative peers at relatively low communication cost. We also propose an effective method called RISE to merge the results returned by uncooperative peers. Our experimental results indicate that PISA can provide quality search results, while utilizing the uncooperative peers at a low cost.
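Query-based sampling of an uncooperative peer can be sketched as follows; the corpus, function names and stopping rule are illustrative assumptions, not the PISA algorithm itself. The peer exposes only a search interface, so a resource description is built by repeatedly querying it and mining the returned documents for new query terms.

```python
import random
from collections import Counter

# Toy sketch of query-based sampling from an uncooperative peer that
# exposes only a search interface (simulated here by `peer_search` over a
# made-up corpus). Names and parameters are illustrative assumptions.

CORPUS = [
    "distributed search over peer to peer networks",
    "query routing and result merging in federated search",
    "sampling resource descriptions from hidden text databases",
    "index structures for distributed document retrieval",
]

def peer_search(term, k=2):
    """Simulated peer: returns up to k documents containing the term."""
    return [d for d in CORPUS if term in d.split()][:k]

def sample_description(seed_term, n_queries=10, seed=0):
    """Build an approximate resource description (document frequencies)
    by repeatedly querying with terms sampled from retrieved documents."""
    rng = random.Random(seed)
    df, seen, pool = Counter(), set(), [seed_term]
    for _ in range(n_queries):
        for doc in peer_search(rng.choice(pool)):
            if doc not in seen:
                seen.add(doc)
                terms = set(doc.split())
                df.update(terms)    # count each document once per term
                pool.extend(terms)  # candidate terms for later queries
    return df

df = sample_description("search")
print(df.most_common(3))
```

A heuristic version like PISA's would additionally bias term selection toward informative queries and stop once the description stabilizes, keeping communication cost low.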
Qualitative research publication rates in top-ranked nursing journals: 2002-2011.
Gagliardi, Anna R; Umoquit, Muriah; Webster, Fiona; Dobrow, Mark
2014-01-01
Journal publication is the traditional means of disseminating research. Few top-ranked general medical and health services and policy research journals publish qualitative research. This study examined qualitative research publication rates in top-ranked nursing journals with varying characteristics (general vs. specialty focus, number of issues per year) and compared publication rates with those previously reported for journals in related fields. A bibliometric approach was used to identify and quantify qualitative articles published in 10 top-ranked nursing journals from 2002 to 2011. The percentage of qualitative empirical studies varied within and across nursing journals with no apparent association with journal characteristics. Although variable, qualitative research appears more common in high-ranking nursing journals than in general medical and health services and policy research journals. Examining factors that contribute to inconsistent rates may identify strategies to optimize qualitative research reporting and publication.
Estimating the cost of a smoking employee.
Berman, Micah; Crane, Rob; Seiber, Eric; Munur, Mehmet
2014-09-01
We attempted to estimate the excess annual costs that a US private employer may attribute to employing an individual who smokes tobacco, as compared with a non-smoking employee. Reviewing and synthesising previous literature estimating certain discrete costs associated with smoking employees, we developed a cost estimation approach that approximates the total of such costs for US employers. We examined absenteeism, presenteeism, smoking breaks, healthcare costs and pension benefits for smokers. Our best estimate of the annual excess cost of employing a smoker is $5816. This estimate should be taken as a general indicator of the extent of excess costs, not as a predictive point value. Employees who smoke impose significant excess costs on private employers, and the results of this study may help inform employer decisions about tobacco-related policies.
Solitaire salvage: a stent retriever-assisted catheter reduction technical report.
Parry, Phillip Vaughan; Morales, Alejandro; Jankowitz, Brian Thomas
2016-07-01
The endovascular management of giant aneurysms often proves difficult with standard techniques. Obtaining distal access to allow catheter reduction is often key to approaching these aneurysms, but several anatomic challenges can make this task unsafe or infeasible. Obtaining distal anchor points and performing catheter reduction maneuvers using adjunctive devices is not a novel concept; however, using the Solitaire for this purpose may have some distinct advantages compared with previously described methods. Here we describe our novel Solitaire salvage technique, which allowed successful reduction of a looped catheter within an aneurysm in three cases. While this technique is expensive and therefore best performed after standard maneuvers have failed, in our experience it was effective, safe, and more efficient than other methods.
15 CFR 10.10 - Review of published standards.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 15 Commerce and Foreign Trade 1 2010-01-01 2010-01-01 false Review of published standards. 10.10... DEVELOPMENT OF VOLUNTARY PRODUCT STANDARDS § 10.10 Review of published standards. (a) Each standard published... considered until a replacement standard is published. (b) Each standard published under these or previous...
Characteristics of Academic Detailing: Results of a Literature Review
Van Hoof, Thomas J.; Harrison, Lisa G.; Miller, Nicole E.; Pappas, Maryanne S.; Fischer, Michael A.
2015-01-01
Background: Academic detailing is an evidence-based strategy to improve patient care. Efforts to understand the intervention and to use it strategically require an understanding of its important characteristics. A recent systematic review and a subsequent reporting framework call for more accurate and complete reporting of continuing medical education interventions. Objectives: Building on a previously published systematic review of 69 studies, we sought to determine how an expanded set of 106 academic detailing studies, including many recently published articles, fared with respect to reporting of important data about this intervention. Methods: We conducted a search of MEDLINE, the Cumulative Index to Nursing and Allied Health Literature (clinical) database, and Scopus, from which we identified 38 additional randomized controlled trials published from August 2007 through March 2013. Including the original 69 studies, we abstracted 106 available English-language studies and quantitatively analyzed information about 4 important characteristics of academic detailing: content of visits, clinicians being visited, communication process underlying visits, and outreach workers making visits. Results: We found considerable variation (36.5%-100%) in the extent of reporting intervention characteristics, especially about the communication process underlying visits and the outreach workers making visits. The best overall documentation of intervention characteristics of any single study was 68%. Results also demonstrate wide variation in the approach to academic detailing. Conclusions: This study demonstrates the need for a standardized approach to collecting and reporting data about academic detailing interventions. Our findings also highlight opportunities for using academic detailing more effectively in research and quality-improvement efforts. PMID: 26702333
Characteristics of Academic Detailing: Results of a Literature Review.
Van Hoof, Thomas J; Harrison, Lisa G; Miller, Nicole E; Pappas, Maryanne S; Fischer, Michael A
2015-11-01
Academic detailing is an evidence-based strategy to improve patient care. Efforts to understand the intervention and to use it strategically require an understanding of its important characteristics. A recent systematic review and a subsequent reporting framework call for more accurate and complete reporting of continuing medical education interventions. Building on a previously published systematic review of 69 studies, we sought to determine how an expanded set of 106 academic detailing studies, including many recently published articles, fared with respect to reporting of important data about this intervention. We conducted a search of MEDLINE, the Cumulative Index to Nursing and Allied Health Literature (clinical) database, and Scopus, from which we identified 38 additional randomized controlled trials published from August 2007 through March 2013. Including the original 69 studies, we abstracted 106 available English-language studies and quantitatively analyzed information about 4 important characteristics of academic detailing: content of visits, clinicians being visited, communication process underlying visits, and outreach workers making visits. We found considerable variation (36.5%-100%) in the extent of reporting intervention characteristics, especially about the communication process underlying visits and the outreach workers making visits. The best overall documentation of intervention characteristics of any single study was 68%. Results also demonstrate wide variation in the approach to academic detailing. This study demonstrates the need for a standardized approach to collecting and reporting data about academic detailing interventions. Our findings also highlight opportunities for using academic detailing more effectively in research and quality-improvement efforts.
Active and passive vibration suppression for space structures
NASA Technical Reports Server (NTRS)
Hyland, David C.
1991-01-01
The relative benefits of passive and active vibration suppression for large space structures (LSS) are discussed. The intent is to sketch the true ranges of applicability of these approaches using previously published technical results. It was found that the distinction between active and passive vibration suppression approaches is not as sharp as might be thought at first. The relative simplicity, reliability, and cost effectiveness touted for passive measures are vitiated by 'hidden costs' bound up with detailed engineering implementation issues and inherent performance limitations. At the same time, reliability and robustness issues are often cited against active control. It is argued that a continuum of vibration suppression measures offering mutually supporting capabilities is needed. The challenge is to properly orchestrate a spectrum of methods to reap the synergistic benefits of combined advanced materials, passive damping, and active control.
Method of measuring blood oxygenation based on spectroscopy of diffusely scattered light
NASA Astrophysics Data System (ADS)
Kleshnin, M. S.; Orlova, A. G.; Kirillin, M. Yu.; Golubyatnikov, G. Yu.; Turchin, I. V.
2017-05-01
A new approach to the measurement of blood oxygenation is developed and implemented, based on an original two-step algorithm reconstructing the relative concentration of biological chromophores (haemoglobin, water, lipids) from the measured spectra of diffusely scattered light at different distances from the radiation source. Numerical experiments and experimental validation of the proposed approach using a biological phantom have shown the high accuracy of the reconstruction of optical properties of the object in question, as well as the possibility of correct calculation of the haemoglobin oxygenation in the presence of additive noises without calibration of the measuring device. The results of the experimental studies in animals agree with previously published results obtained by other research groups and demonstrate the possibility of applying the developed method to the monitoring of blood oxygenation in tumour tissues.
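The final step of such a two-step reconstruction, converting recovered chromophore concentrations into an oxygenation value, reduces to a simple ratio of oxyhaemoglobin to total haemoglobin. A minimal sketch (the concentration values are invented for illustration, not measured data):

```python
# Tissue oxygen saturation (StO2) from reconstructed chromophore
# concentrations: the oxyhaemoglobin fraction of total haemoglobin.
def oxygen_saturation(c_hbo2, c_hb):
    """StO2 = [HbO2] / ([HbO2] + [Hb]); inputs in any common unit."""
    return c_hbo2 / (c_hbo2 + c_hb)

# Hypothetical reconstructed concentrations (arbitrary units)
print(f"StO2 = {oxygen_saturation(60.0, 40.0):.0%}")
```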
Continuous high-solids corn liquefaction and fermentation with stripping of ethanol.
Taylor, Frank; Marquez, Marco A; Johnston, David B; Goldberg, Neil M; Hicks, Kevin B
2010-06-01
Removal of ethanol from the fermentor during fermentation can increase productivity and reduce the costs for dewatering the product and coproduct. One approach is to recycle the fermentor contents through a stripping column, where a non-condensable gas removes ethanol to a condenser. Previous research showed that this approach is feasible. Savings of $0.03 per gallon were predicted at 34% corn dry solids. Greater savings were predicted at higher concentrations. Now the feasibility has been demonstrated at over 40% corn dry solids, using a continuous corn liquefaction system. A pilot plant that continuously fed corn meal at more than one bushel (25 kg) per day was operated for 60 consecutive days, continuously converting 95% of starch and producing 88% of the maximum theoretical yield of ethanol. A computer simulation was used to analyze the results. The fermentation and stripping systems were not significantly affected when the CO2 stripping gas was partially replaced by nitrogen or air, potentially lowering costs associated with the gas recycle loop. It was concluded that previous estimates of potential cost savings are still valid.
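The "88% of the maximum theoretical yield" figure can be sanity-checked with the standard stoichiometric factors for starch hydrolysis (about 1.11 g glucose per g starch) and fermentation (about 0.511 g ethanol per g glucose). A back-of-envelope sketch with an illustrative input mass, not pilot-plant data:

```python
# Theoretical and reported ethanol yield from a hypothetical starch input.
starch_kg = 100.0                          # illustrative input mass
glucose_kg = 1.11 * starch_kg              # starch + water -> glucose (~1.11x)
max_ethanol_kg = 0.511 * glucose_kg        # Gay-Lussac fermentation limit
actual_ethanol_kg = 0.88 * max_ethanol_kg  # 88% of theoretical, as reported

print(f"{actual_ethanol_kg:.1f} kg ethanol from {starch_kg:.0f} kg starch")
```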
Chamberlain, Daniel B; Chamberlain, James M
2017-01-01
We demonstrate the application of a Bayesian approach to a recent negative clinical trial result. A Bayesian analysis of such a trial can provide a more useful interpretation of results and can incorporate previous evidence. This was a secondary analysis of the efficacy and safety results of the Pediatric Seizure Study, a randomized clinical trial of lorazepam versus diazepam for pediatric status epilepticus. We included the published results from the only prospective pediatric study of status epilepticus in a Bayesian hierarchical model, and we performed sensitivity analyses on the amount of pooling between studies. We evaluated 3 summary analyses for the results: superiority, noninferiority (margin <-10%), and practical equivalence (within ±10%). Consistent with the original study's classic analysis of study results, we did not demonstrate superiority of lorazepam over diazepam. There is a 95% probability that the true efficacy of lorazepam is in the range of 66% to 80%. For both the efficacy and safety outcomes, there was greater than 95% probability that lorazepam is noninferior to diazepam, and there was greater than 90% probability that the 2 medications are practically equivalent. The results were largely driven by the current study because of the sample sizes of our study (n=273) and the previous pediatric study (n=61). Because Bayesian analysis estimates the probability of one or more hypotheses, such an approach can provide more useful information about the meaning of the results of a negative trial outcome. In the case of pediatric status epilepticus, it is highly likely that lorazepam is noninferior and practically equivalent to diazepam.
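The noninferiority and practical-equivalence probabilities described above can be illustrated with a simple beta-binomial model: each arm's efficacy gets a Beta posterior, and the probabilities are read off Monte Carlo draws of the difference. The success counts below are hypothetical, not the published trial data; only the ±10% margins follow the abstract:

```python
# Hedged sketch of a beta-binomial noninferiority/equivalence analysis.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical arm sizes and success counts (illustration only)
n_lor, k_lor = 133, 97   # lorazepam arm
n_dia, k_dia = 140, 101  # diazepam arm

# Uniform Beta(1, 1) priors give Beta posteriors for each arm's efficacy
p_lor = rng.beta(1 + k_lor, 1 + n_lor - k_lor, size=100_000)
p_dia = rng.beta(1 + k_dia, 1 + n_dia - k_dia, size=100_000)

diff = p_lor - p_dia
prob_noninferior = np.mean(diff > -0.10)        # noninferiority margin -10%
prob_equivalent = np.mean(np.abs(diff) < 0.10)  # practical equivalence ±10%

print(f"P(noninferior) = {prob_noninferior:.3f}")
print(f"P(equivalent)  = {prob_equivalent:.3f}")
```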
Walsh, Kyle M; Anderson, Erik; Hansen, Helen M; Decker, Paul A; Kosel, Matt L; Kollmeyer, Thomas; Rice, Terri; Zheng, Shichun; Xiao, Yuanyuan; Chang, Jeffrey S; McCoy, Lucie S; Bracci, Paige M; Wiemels, Joe L; Pico, Alexander R; Smirnov, Ivan; Lachance, Daniel H; Sicotte, Hugues; Eckel-Passow, Jeanette E; Wiencke, John K; Jenkins, Robert B; Wrensch, Margaret R
2013-02-01
Genomewide association studies (GWAS) and candidate-gene studies have implicated single-nucleotide polymorphisms (SNPs) in at least 45 different genes as putative glioma risk factors. Attempts to validate these associations have yielded variable results and few genetic risk factors have been consistently replicated. We conducted a case-control study of Caucasian glioma cases and controls from the University of California San Francisco (810 cases, 512 controls) and the Mayo Clinic (852 cases, 789 controls) in an attempt to replicate previously reported genetic risk factors for glioma. Sixty SNPs selected from the literature (eight from GWAS and 52 from candidate-gene studies) were successfully genotyped on an Illumina custom genotyping panel. Eight SNPs in/near seven different genes (TERT, EGFR, CCDC26, CDKN2A, PHLDB1, RTEL1, TP53) were significantly associated with glioma risk in the combined dataset (P < 0.05), with all associations in the same direction as in previous reports. Several SNP associations showed considerable differences across histologic subtype. All eight successfully replicated associations were first identified by GWAS, although none of the putative risk SNPs from candidate-gene studies was associated in the full case-control sample (all P values > 0.05). Although several confirmed associations are located near genes long known to be involved in gliomagenesis (e.g., EGFR, CDKN2A, TP53), these associations were first discovered by the GWAS approach and are in noncoding regions. These results highlight that the deficiencies of the candidate-gene approach lay in selecting both appropriate genes and relevant SNPs within these genes.
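A replication analysis of this kind typically reduces, per SNP, to an allelic odds ratio with a Wald test against the null of no association. A minimal sketch with invented allele counts (not data from this study), using only the standard library:

```python
# Single-SNP allelic association test: odds ratio, Woolf standard error,
# and a two-sided Wald p-value. Allele counts are hypothetical.
from math import erf, log, sqrt

# Hypothetical minor/major allele counts (2 alleles per subject)
cases_minor, cases_major = 520, 1100
ctrls_minor, ctrls_major = 270, 754

# Allelic odds ratio and the standard error of its log (Woolf method)
odds_ratio = (cases_minor * ctrls_major) / (cases_major * ctrls_minor)
se_log_or = sqrt(1 / cases_minor + 1 / cases_major
                 + 1 / ctrls_minor + 1 / ctrls_major)

# Two-sided p-value from the Wald z statistic (normal approximation)
z = log(odds_ratio) / se_log_or
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
```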
Comparative genomics of Toll-like receptor signalling in five species
Jann, Oliver C; King, Annemarie; Corrales, Nestor Lopez; Anderson, Susan I; Jensen, Kirsty; Ait-ali, Tahar; Tang, Haizhou; Wu, Chunhua; Cockett, Noelle E; Archibald, Alan L; Glass, Elizabeth J
2009-01-01
Background: Over the last decade, several studies have identified quantitative trait loci (QTL) affecting variation of immune-related traits in mammals. Recent studies in humans and mice suggest that part of this variation may be caused by polymorphisms in genes involved in Toll-like receptor (TLR) signalling. In this project, we used a comparative approach to investigate the importance of TLR-related genes in comparison with other immunologically relevant genes for resistance traits in five species by associating their genomic location with previously published immune-related QTL regions. Results: We report the genomic localisation of TLR1-10 and ten associated signalling molecules in sheep and pig using in-silico and/or radiation hybrid (RH) mapping techniques and compare their positions with their annotated homologues in the human, cattle and mouse whole genome sequences. We also report medium-density RH maps for porcine chromosomes 8 and 13. A comparative analysis of the positions of previously published relevant QTLs allowed the identification of homologous regions that are associated with similar health traits in several species and which contain TLR-related and other immunologically relevant genes. Additional evidence was gathered by examining relevant gene expression and association studies. Conclusion: This comparative genomic approach identified eight genes as potentially causative genes for variations of health-related traits. These include susceptibility to clinical mastitis in dairy cattle, general disease resistance in sheep, cattle, humans and mice, and tolerance to protozoan infection in cattle and mice. Four TLR-related genes (TLR1, 6, MyD88, IRF3) appear to be the most likely candidate genes underlying QTL regions which control the resistance to the same or similar pathogens in several species. Further studies are required to investigate the potential role of polymorphisms within these genes. PMID: 19432955
A systematic review of financial incentives for dietary behavior change.
Purnell, Jason Q; Gernes, Rebecca; Stein, Rick; Sherraden, Margaret S; Knoblock-Hahn, Amy
2014-07-01
In light of the obesity epidemic, there is growing interest in the use of financial incentives for dietary behavior change. Previous reviews of the literature have focused on randomized controlled trials and found mixed results. The purpose of this systematic review is to update and expand on previous reviews by considering a broader range of study designs, including randomized controlled trials, quasi-experimental, observational, and simulation studies testing the use of financial incentives to change dietary behavior and to inform both dietetic practice and research. The review was guided by theoretical consideration of the type of incentive used based on the principles of operant conditioning. There was further examination of whether studies were carried out with an institutional focus. Studies published between 2006 and 2012 were selected for review, and data were extracted regarding study population, intervention design, outcome measures, study duration and follow-up, and key findings. Twelve studies meeting selection criteria were reviewed, with 11 finding a positive association between incentives and dietary behavior change in the short term. All studies pointed to more specific information on the type, timing, and magnitude of incentives needed to motivate individuals to change behavior, the types of incentives and disincentives most likely to affect the behavior of various socioeconomic groups, and promising approaches for potential policy and practice innovations. Limitations of the studies are noted, including the lack of theoretical guidance in the selection of incentive structures and the absence of basic experimental data. Future research should consider these factors, even as policy makers and practitioners continue to experiment with this potentially useful approach to addressing obesity.
Measuring and managing health system performance: An update from New Zealand.
Chalmers, Linda Maree; Ashton, Toni; Tenbensel, Tim
2017-08-01
In July 2016, New Zealand introduced a new approach to measuring and monitoring health system performance. This 'Systems Level Measure Framework' (SLMF) has evolved from the Integrated Performance and Incentive Framework (IPIF) previously reported in this journal. The SLMF is designed to stimulate a 'whole of system' approach that requires inter-organisational collaboration. Local 'Alliances' between government and non-government health sector organisations are responsible for planning and achieving improved health system outcomes such as reducing ambulatory sensitive hospitalisation for young children, and reducing acute hospital bed days. It marks a shift from the previous regime of output and process targets, and from a pay-for-performance approach to primary care. Some elements of the earlier IPIF proposal, such as general practice quality measures, and tiered levels of performance, were not included in the SLM framework. The focus on health system outcomes demonstrates policy commitment to effective integration of health services. However, there remain considerable challenges to successful implementation. An outcomes framework makes it challenging to attribute changes in outcomes to organisational and collaborative strategies. At the local level, the strength and functioning of collaborative relationships between organisations vary considerably. The extent and pace of change may also be constrained by existing funding arrangements in the health system.
A New Paradigm for Evaluating Avoidance/Escape Motivation.
Tsutsui-Kimura, Iku; Bouchekioua, Youcef; Mimura, Masaru; Tanaka, Kenji F
2017-07-01
Organisms have evolved to approach pleasurable opportunities and to avoid or escape from aversive experiences. These 2 distinct motivations are referred to as approach and avoidance/escape motivations and are both considered vital for survival. Despite several recent advances in understanding the neurobiology of motivation, most studies have addressed approach but not avoidance/escape motivation. Here we develop a new experimental paradigm to quantify avoidance/escape motivation and examine its pharmacological validity. We set up an avoidance variable ratio 5 task in which mice were required to press a lever for variable times to avoid an upcoming aversive stimulus (foot shock) or to escape the ongoing aversive event if they failed to avoid it. We injected ketamine (0, 1, or 5 mg/kg) or buspirone (0, 5, or 10 mg/kg) i.p. 20 or 30 minutes before the behavioral task to test whether ketamine enhanced avoidance/escape behavior and buspirone diminished it, as previously reported. We found that performance on the avoidance variable ratio 5 task was sensitive to the intensity of the aversive stimulus. Treatment with ketamine increased, while that with buspirone decreased, the probability of avoidance of an aversive stimulus in the variable ratio 5 task, consistent with previous reports. Our new paradigm will prove useful for quantifying avoidance/escape motivation and will contribute to a more comprehensive understanding of motivation.
Protothecosis in hematopoietic stem cell transplantation: case report and review of previous cases.
Macesic, N; Fleming, S; Kidd, S; Madigan, V; Chean, R; Ritchie, D; Slavin, M
2014-06-01
Prototheca species are achlorophyllous algae. Prototheca wickerhamii and Prototheca zopfii cause human disease. In immunocompetent individuals, they cause soft tissue infections and olecranon bursitis, but in transplant recipients, these organisms can cause disseminated disease. We report a fatal case of disseminated P. zopfii infection in a hematopoietic stem cell transplant (HSCT) recipient with bloodstream infection and involvement of multiple soft tissue sites. We review all previous cases of protothecosis in HSCT reported in the literature. Protothecosis is uncommon after HSCT, but has a disseminated presentation that is frequently fatal. It is commonly misidentified as a yeast. Tumor necrosis factor-alpha inhibitors and contamination of central venous catheters may contribute to development of protothecosis. Optimal treatment approaches are yet to be defined. New agents such as miltefosine may be possible future therapies.
Translation of Genotype to Phenotype by a Hierarchy of Cell Subsystems.
Yu, Michael Ku; Kramer, Michael; Dutkowski, Janusz; Srivas, Rohith; Licon, Katherine; Kreisberg, Jason; Ng, Cherie T; Krogan, Nevan; Sharan, Roded; Ideker, Trey
2016-02-24
Accurately translating genotype to phenotype requires accounting for the functional impact of genetic variation at many biological scales. Here we present a strategy for genotype-phenotype reasoning based on existing knowledge of cellular subsystems. These subsystems and their hierarchical organization are defined by the Gene Ontology or a complementary ontology inferred directly from previously published datasets. Guided by the ontology's hierarchical structure, we organize genotype data into an "ontotype," that is, a hierarchy of perturbations representing the effects of genetic variation at multiple cellular scales. The ontotype is then interpreted using logical rules generated by machine learning to predict phenotype. This approach substantially outperforms previous, non-hierarchical methods for translating yeast genotype to cell growth phenotype, and it accurately predicts the growth outcomes of two new screens of 2,503 double gene knockouts impacting DNA repair or nuclear lumen. Ontotypes also generalize to larger knockout combinations, setting the stage for interpreting the complex genetics of disease.
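The core "ontotype" construction, propagating gene-level perturbations up a subsystem hierarchy so each subsystem carries a count of its perturbed members, can be sketched in a few lines. The subsystem names, parent links, and gene memberships below are invented for illustration, not taken from the Gene Ontology or the paper's data:

```python
# Toy ontotype: map a set of knocked-out genes onto a subsystem hierarchy.
from collections import defaultdict

# Each leaf subsystem lists its member genes; parents aggregate children.
subsystem_genes = {
    "DNA repair": {"RAD51", "RAD52", "MRE11"},
    "Nuclear lumen": {"NUP1", "NUP60"},
}
parents = {"DNA repair": "Nucleus", "Nuclear lumen": "Nucleus"}

def ontotype(knockouts):
    """Return per-subsystem perturbation counts, propagated to ancestors."""
    counts = defaultdict(int)
    for name, genes in subsystem_genes.items():
        hit = len(genes & knockouts)  # perturbed members of this subsystem
        counts[name] += hit
        node = name
        while node in parents:        # propagate the count up the hierarchy
            node = parents[node]
            counts[node] += hit
    return dict(counts)

print(ontotype({"RAD51", "NUP60"}))
```

A downstream classifier (the paper uses machine-learned logical rules) would then take this hierarchical feature vector as input rather than the raw genotype.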
Litfin, Thomas; Zhou, Yaoqi; Yang, Yuedong
2017-04-15
The high cost of drug discovery motivates the development of accurate virtual screening tools. Binding-homology, which takes advantage of known protein-ligand binding pairs, has emerged as a powerful discrimination technique. In order to exploit all available binding data, modelled structures of ligand-binding sequences may be used to create an expanded structural binding template library. SPOT-Ligand 2 has demonstrated significantly improved screening performance over its previous version by expanding the template library 15 times over the previous one. It also performed better than or similar to other binding-homology approaches on the DUD and DUD-E benchmarks. The server is available online at http://sparks-lab.org. Contact: yaoqi.zhou@griffith.edu.au or yuedong.yang@griffith.edu.au. Supplementary data are available at Bioinformatics online.
Som, Nicholas A.; Goodman, Damon H.; Perry, Russell W.; Hardy, Thomas B.
2016-01-01
Previous methods for constructing univariate habitat suitability criteria (HSC) curves have ranged from professional judgement to kernel-smoothed density functions or combinations thereof. We present a new method of generating HSC curves that applies probability density functions as the mathematical representation of the curves. Compared with previous approaches, benefits of our method include (1) estimation of probability density function parameters directly from raw data, (2) quantitative methods for selecting among several candidate probability density functions, and (3) concise methods for expressing estimation uncertainty in the HSC curves. We demonstrate our method with a thorough example using data collected on the depth of water used by juvenile Chinook salmon (Oncorhynchus tshawytscha) in the Klamath River of northern California and southern Oregon. All R code needed to implement our example is provided in the appendix. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
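The first two ingredients of this approach, fitting candidate probability density functions by maximum likelihood and selecting among them quantitatively, can be sketched with an AIC comparison of a normal and a lognormal fit. The depth data here are simulated, not the Klamath River measurements, and the study's own analysis is in R:

```python
# Fit two candidate densities to (simulated) depth data and pick the one
# with the lower Akaike information criterion (AIC = 2k - 2*logL, k = 2).
import numpy as np

rng = np.random.default_rng(42)
depths = rng.lognormal(mean=0.0, sigma=0.6, size=200)  # simulated depths (m)

def aic_normal(x):
    mu, sd = x.mean(), x.std()
    ll = np.sum(-0.5 * np.log(2 * np.pi * sd**2) - (x - mu)**2 / (2 * sd**2))
    return 2 * 2 - 2 * ll

def aic_lognormal(x):
    lx = np.log(x)
    mu, sd = lx.mean(), lx.std()
    ll = np.sum(-np.log(x) - 0.5 * np.log(2 * np.pi * sd**2)
                - (lx - mu)**2 / (2 * sd**2))
    return 2 * 2 - 2 * ll

scores = {"normal": aic_normal(depths), "lognormal": aic_lognormal(depths)}
best = min(scores, key=scores.get)
print(best, scores)
```

The selected density, normalized to a 0-1 suitability scale, would then serve as the HSC curve; estimation uncertainty follows from the sampling distribution of the fitted parameters.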
Sipahi, Sevgi; Sasaki, Kirsten; Miller, Charles E
2017-08-01
The purpose of this review is to understand the minimally invasive approach to the excision and repair of an isthmocele. Previous small trials and case reports have shown that the minimally invasive approach by hysteroscopy and/or laparoscopy can cure symptoms of a uterine isthmocele, including abnormal bleeding, pelvic pain and secondary infertility. A recent larger prospective study has been published that evaluates outcomes of minimally invasive isthmocele repair. Smaller studies and individual case reports echo the positive results of this larger trial. The cesarean section scar defect, also known as an isthmocele, has become an important diagnosis for women who present with abnormal uterine bleeding, pelvic pain and secondary infertility. It is important for providers to be aware of the effective surgical treatment options for the symptomatic isthmocele. A minimally invasive approach, whether it be laparoscopic or hysteroscopic, has proven to be a safe and effective option in reducing symptoms and improving fertility. VIDEO ABSTRACT: http://links.lww.com/COOG/A37.
A KPI-based process monitoring and fault detection framework for large-scale processes.
Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang
2017-05-01
Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches have not been developed within a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrumental variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods.
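The static, least-squares branch of such a KPI monitoring scheme can be sketched as a regression of the KPI on process variables over normal operating data, followed by a residual test on new samples. The data, the model, and the 3-sigma threshold below are illustrative assumptions, not the paper's exact algorithm:

```python
# Static KPI fault detection: least-squares model on normal data,
# then flag samples whose prediction residual exceeds a threshold.
import numpy as np

rng = np.random.default_rng(1)

# Training data from normal operation: 500 samples, 4 process vars, 1 KPI
X = rng.normal(size=(500, 4))
theta_true = np.array([0.8, -0.5, 0.3, 0.1])  # hypothetical true relation
y = X @ theta_true + 0.05 * rng.normal(size=500)

theta, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares KPI model
resid = y - X @ theta
threshold = 3 * resid.std()                    # simple 3-sigma control limit

def is_faulty(x_new, y_new):
    """Flag a sample whose KPI deviates too far from the LS prediction."""
    return abs(y_new - x_new @ theta) > threshold

x = np.array([0.5, -0.2, 0.1, 0.0])
print(is_faulty(x, x @ theta_true))        # a normal sample
print(is_faulty(x, x @ theta_true + 1.0))  # a sample with a KPI fault
```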
Various drug delivery approaches to the central nervous system.
Pasha, Santosh; Gupta, Kshitij
2010-01-01
The presence of the blood-brain barrier (BBB), an insurmountable obstacle in particular, and of other barriers in the brain and periphery hinders the successful diagnosis and treatment of a myriad of central nervous system pathologies. This review discusses several strategies adopted to define a rational drug delivery approach to the CNS, along with a short description of the strategies implemented by the authors' group to enhance the analgesic activity, a CNS property, of the chimeric peptide of Met-enkephalin and FMRFa (YGGFMKKKFMRFa-YFa). Various approaches for drug delivery to the CNS are covered, with their beneficial and non-beneficial aspects, supported by an extensive survey of the literature published recently, up to August 2009. The reader will gain an understanding of previous as well as recent approaches to breaching the CNS barriers. Among the various strategies discussed, the potential for efficacious CNS drug targeting in the future lies either with non-invasively administered multifunctional nanosystems, or with nanosystems that lack characteristics such as long systemic circulation and evasion of the body's reticuloendothelial scavenging system, endogenous transporters, and efflux inhibitors, administered instead by convection-enhanced delivery.
Linearized self-consistent quasiparticle GW method: Application to semiconductors and simple metals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kutepov, A. L.; Oudovenko, V. S.; Kotliar, G.
We present a code implementing the linearized self-consistent quasiparticle GW method (QSGW) in the LAPW basis. Our approach is based on the linearization of the self-energy around zero frequency, which distinguishes it from the existing implementations of the QSGW method. The linearization allows us to use Matsubara frequencies instead of working on the real axis. This results in efficiency gains by switching to the imaginary time representation in the same way as in the space time method. The all-electron LAPW basis set eliminates the need for pseudopotentials. We discuss the advantages of our approach, such as its N³ scaling with the system size N, as well as its shortcomings. We apply our approach to study the electronic properties of selected semiconductors, insulators, and simple metals and show that our code produces results very close to the previously published QSGW data. Our implementation is a good platform for further many-body diagrammatic resummations such as the vertex-corrected GW approach and the GW+DMFT method.
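The linearization described above is, schematically, a first-order expansion of the self-energy about zero frequency, with the associated quasiparticle renormalization factor (generic notation, not taken verbatim from the implementation):

```latex
% First-order expansion of the self-energy about \omega = 0,
% with the quasiparticle renormalization factor Z:
\Sigma(\omega) \;\approx\; \Sigma(0)
  + \omega \left.\frac{\partial \Sigma}{\partial \omega}\right|_{\omega = 0},
\qquad
Z \;=\; \left[\, 1
  - \left.\frac{\partial \Sigma}{\partial \omega}\right|_{\omega = 0} \right]^{-1}
```

Evaluating only $\Sigma(0)$ and its first frequency derivative is what permits working entirely on the Matsubara axis rather than the real frequency axis.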
Linearized self-consistent quasiparticle GW method: Application to semiconductors and simple metals
Kutepov, A. L.; Oudovenko, V. S.; Kotliar, G.
2017-06-23
We present a code implementing the linearized self-consistent quasiparticle GW method (QSGW) in the LAPW basis. Our approach is based on the linearization of the self-energy around zero frequency, which distinguishes it from the existing implementations of the QSGW method. The linearization allows us to use Matsubara frequencies instead of working on the real axis. This results in efficiency gains by switching to the imaginary time representation in the same way as in the space time method. The all-electron LAPW basis set eliminates the need for pseudopotentials. We discuss the advantages of our approach, such as its N³ scaling with the system size N, as well as its shortcomings. We apply our approach to study the electronic properties of selected semiconductors, insulators, and simple metals and show that our code produces results very close to the previously published QSGW data. Our implementation is a good platform for further many-body diagrammatic resummations such as the vertex-corrected GW approach and the GW+DMFT method.
Identification of Protein–Excipient Interaction Hotspots Using Computational Approaches
Barata, Teresa S.; Zhang, Cheng; Dalby, Paul A.; Brocchini, Steve; Zloh, Mire
2016-01-01
Protein formulation development relies on the selection of excipients that inhibit protein–protein interactions, preventing aggregation. Empirical strategies involve screening many excipient and buffer combinations using forced degradation studies. Such methods do not readily provide information on the intermolecular interactions responsible for the protective effects of excipients. This study describes a molecular docking approach to screen and rank interactions, allowing the identification of protein–excipient hotspots to aid in the selection of excipients to be experimentally screened. Previously published work with Drosophila Su(dx) was used to develop and validate the computational methodology, which was then used to determine the formulation hotspots for Fab A33. Commonly used excipients were examined and compared to the regions in Fab A33 prone to protein–protein interactions that could lead to aggregation. This approach could provide molecular-level information on the protective interactions of excipients in protein formulations to aid the more rational development of future formulations. PMID:27258262
An innovative and integrated approach based on DNA walking to identify unauthorised GMOs.
Fraiture, Marie-Alice; Herman, Philippe; Taverniers, Isabel; De Loose, Marc; Deforce, Dieter; Roosens, Nancy H
2014-03-15
In the coming years, the frequency of unauthorised genetically modified organisms (GMOs) being present in the European food and feed chain will increase significantly. Therefore, we have developed a strategy to identify unauthorised GMOs containing a pCAMBIA family vector, frequently present in transgenic plants. This integrated approach is performed in two successive steps on Bt rice grains. First, the potential presence of unauthorised GMOs is assessed by the qPCR SYBR®Green technology targeting the terminator 35S pCAMBIA element. Second, its presence is confirmed via the characterisation of the junction between the transgenic cassette and the rice genome. To this end, a DNA walking strategy is applied using a first reverse primer, followed by two semi-nested PCR rounds using primers that are each nested within the previous reverse primer. This approach allows rapid identification of the transgene flanking region and can easily be implemented by enforcement laboratories. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
How many novel eukaryotic 'kingdoms'? Pitfalls and limitations of environmental DNA surveys
Berney, Cédric; Fahrni, José; Pawlowski, Jan
2004-01-01
Background Over the past few years, the use of molecular techniques to detect cultivation-independent, eukaryotic diversity has proven to be a powerful approach. Based on small-subunit ribosomal RNA (SSU rRNA) gene analyses, these studies have revealed the existence of an unexpected variety of new phylotypes. Some of them represent novel diversity in known eukaryotic groups, mainly stramenopiles and alveolates. Others do not seem to be related to any molecularly described lineage, and have been proposed to represent novel eukaryotic kingdoms. In order to review the evolutionary importance of this novel high-level eukaryotic diversity critically, and to test the potential technical and analytical pitfalls and limitations of eukaryotic environmental DNA surveys (EES), we analysed 484 environmental SSU rRNA gene sequences, including 81 new sequences from sediments of a small river, the Seymaz (Geneva, Switzerland). Results Based on a detailed screening of an exhaustive alignment of eukaryotic SSU rRNA gene sequences and the phylogenetic re-analysis of previously published environmental sequences using Bayesian methods, our results suggest that the number of novel higher-level taxa revealed by previously published EES was overestimated. Three main sources of errors are responsible for this situation: (1) the presence of undetected chimeric sequences; (2) the misplacement of several fast-evolving sequences; and (3) the incomplete sampling of described but as yet unsequenced eukaryotes. Additionally, EES give a biased view of the diversity present in a given biotope because of the difficult amplification of SSU rRNA genes in some taxonomic groups. Conclusions Environmental DNA surveys undoubtedly help to reveal many novel eukaryotic lineages, but there is no clear evidence for a spectacular increase of the diversity at the kingdom level.
After re-analysis of previously published data, we found only five candidate lineages of possible novel high-level eukaryotic taxa, two of which comprise several phylotypes that were found independently in different studies. To ascertain their taxonomic status, however, the organisms themselves have now to be identified. PMID:15176975
Jentes, Emily S; Lash, R Ryan; Johansson, Michael A; Sharp, Tyler M; Henry, Ronnie; Brady, Oliver J; Sotir, Mark J; Hay, Simon I; Margolis, Harold S; Brunette, Gary W
2016-06-01
International travel can expose travellers to pathogens not commonly found in their countries of residence, like dengue virus. Travellers and the clinicians who advise and treat them have unique needs for understanding the geographic extent of risk for dengue. Specifically, they should assess the need for prevention measures before travel and ensure appropriate treatment of illness post-travel. Previous dengue-risk maps published in the Centers for Disease Control and Prevention's Yellow Book lacked specificity, as there was a binary (risk, no risk) classification. We developed a process to compile evidence, evaluate it and apply more informative risk classifications. We collected more than 839 observations from official reports, ProMED reports and published scientific research for the period 2005-2014. We classified each location as frequent/continuous risk if there was evidence of more than 10 dengue cases in at least three of the previous 10 years. For locations that did not fit this criterion, we classified locations as sporadic/uncertain risk if the location had evidence of at least one locally acquired dengue case during the last 10 years. We used expert opinion in limited instances to augment available data in areas where data were sparse. Initial categorizations classified 134 areas as frequent/continuous and 140 areas as sporadic/uncertain. CDC subject matter experts reviewed all initial frequent/continuous and sporadic/uncertain categorizations and the previously uncategorized areas. From this review, most categorizations stayed the same; however, 11 categorizations changed from the initial determinations. These new risk classifications enable detailed consideration of dengue risk, with clearer meaning and a direct link to the evidence that supports the specific classification. 
Since many infectious diseases have dynamic risk, strong geographical heterogeneities and varying data quality and availability, using this approach for other diseases can improve the accuracy, clarity and transparency of risk communication. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.
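The two-tier classification rule described above is simple enough to sketch directly; the function and parameter names below are illustrative, not from the paper:

```python
def classify_dengue_risk(annual_cases, any_local_case_last_10y):
    """Apply the two-tier dengue-risk rule described in the abstract.

    annual_cases: list of reported case counts for the previous 10 years.
    any_local_case_last_10y: True if at least one locally acquired case
    was reported in the last 10 years.
    """
    # Frequent/continuous: more than 10 cases in at least 3 of the last 10 years.
    years_over_10 = sum(1 for c in annual_cases if c > 10)
    if years_over_10 >= 3:
        return "frequent/continuous"
    # Sporadic/uncertain: at least one locally acquired case in the last 10 years.
    if any_local_case_last_10y:
        return "sporadic/uncertain"
    return "uncategorized"
```

In the study itself these initial categorizations were then reviewed by CDC subject matter experts, who changed 11 of them.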
You, Je Sung; Kim, Hoon; Park, Jung Soo; Baek, Kyung Min; Jang, Mun Sun; Lee, Hye Sun; Chung, Sung Phil; Kim, SeungWhan
2015-03-01
The major components affecting high quality cardiopulmonary resuscitation (CPR) have been defined as the ability of the rescuer, hand position, position of the rescuer and victim, depth and rate of chest compressions, and fatigue. Until now, there have been no studies on dominant versus non-dominant hand position and the rescuer's side of approach. This study was designed to evaluate the effectiveness of hand position and approach side on the quality of CPR between right-handed (RH) and left-handed (LH) novice rescuers. 44 health science university students with no previous experience of basic life support (BLS) volunteered for the study. We divided volunteers into two groups by handedness. Adult BLS was performed on a manikin for 2 min in each session. The sequences were randomly performed on the manikin's left side of approach (Lap) with the rescuer's left hand in contact with the sternum (Lst), Lap/Rst, Rap/Lst and Rap/Rst. We compared the quality of chest compressions between the RH and LH groups according to predetermined positions. A significant decrease in mean compression depth between the two groups was only observed when rescuers performed in the Rap/Lst scenario, regardless of hand dominance. The frequency of correct hand placement also significantly decreased in the Lap/Rst position for the LH group. The performance of novice rescuers during chest compressions is influenced by the position of the dominant hand and the rescuer's side of approach. In CPR training and real world situations, a novice rescuer, regardless of handedness, should consider hand positions for contacting the sternum identical to the side of approach after approaching from the nearest and most accessible side, for optimal CPR performance. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Child Malnutrition in Pakistan: Evidence from Literature
Asim, Muhammad; Nawaz, Yasir
2018-01-01
Pakistan has one of the highest prevalences of child malnutrition as compared to other developing countries. This narrative review was conducted to examine the published empirical literature on children’s nutritional status in Pakistan. The objectives of this review were to know about the methodological approaches used in previous studies, to assess the overall situation of childhood malnutrition, and to identify the areas that have not yet been studied. This study was carried out to collect and synthesize the relevant data from previously published papers through different scholarly database search engines. The most relevant papers published between 2000–2016 were included in this study. The research papers that contain data related to child malnutrition in Pakistan were assessed. A total of 28 articles were reviewed, and broadly similar methodologies were used in all of them. Most of the researchers conducted cross-sectional quantitative and descriptive studies, using structured interviews to identify the causes of child malnutrition. Only one study used a mixed-methods technique for acquiring data from the respondents. For the assessment of malnutrition among children, out of 28 papers, 20 used the World Health Organization (WHO) weight-for-age, height-for-age, and weight-for-height Z-score method. Early marriages, large family size, high fertility rates with a lack of birth spacing, low income, the lack of breast feeding, and exclusive breastfeeding were found to be the themes that repeatedly emerged in the reviewed literature. There is a dire need for qualitative and mixed-methods research to understand and gain insight into the underlying factors of child malnutrition in Pakistan. PMID:29734703
An Ensemble Approach for Drug Side Effect Prediction
Jahid, Md Jamiul; Ruan, Jianhua
2014-01-01
In silico prediction of drug side-effects in the early stages of drug development is becoming more popular nowadays, which not only reduces the time for drug design but also reduces drug development costs. In this article we propose an ensemble approach to predict side-effects of drug molecules based on their chemical structure. Our idea originates from the observation that similar drugs have similar side-effects. Based on this observation we design an ensemble approach that combines the results from different classification models, where each model is generated by a different set of similar drugs. We applied our approach to 1385 side-effects in the SIDER database for 888 drugs. Results show that our approach outperformed previously published approaches and standard classifiers. Furthermore, we applied our method to a number of uncharacterized drug molecules in the DrugBank database and predicted their side-effect profiles for future use. Results from various sources confirm that our method is able to predict the side-effects of uncharacterized drugs and, more importantly, is able to predict rare side-effects which are often ignored by other approaches. The method described in this article can be useful for predicting side-effects at an early stage of drug design to reduce experimental cost and time. PMID:25327524
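As a rough illustration of the "similar drugs have similar side-effects" idea, the sketch below scores one side-effect for a query drug by similarity-weighted voting over its most structurally similar neighbours (Tanimoto similarity on fingerprint bit-sets). It is a simplified stand-in for the paper's ensemble of per-neighbourhood classifiers, and all names are ours:

```python
def tanimoto(a, b):
    """Tanimoto (Jaccard) similarity between two fingerprint bit-sets."""
    union = len(a | b)
    return len(a & b) / union if union else 0.0

def predict_side_effect(query_fp, train, k=3):
    """Similarity-weighted vote over the k most similar training drugs.

    train: list of (fingerprint_set, label) pairs, label in {0, 1}.
    Returns a score in [0, 1]; higher means the side-effect is more likely.
    """
    ranked = sorted(train, key=lambda d: tanimoto(query_fp, d[0]), reverse=True)[:k]
    weights = [tanimoto(query_fp, fp) for fp, _ in ranked]
    if sum(weights) == 0:
        return 0.0  # no structural overlap with any training drug
    return sum(w * y for w, (_, y) in zip(weights, ranked)) / sum(weights)
```

In the paper's setting this would be repeated for each of the 1385 SIDER side-effects, with real classifiers in place of the weighted vote.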
Dickson, Hugh; Kavanagh, David J; MacLeod, Colin
2016-04-01
Previous research has shown that action tendencies to approach alcohol may be modified using a computerized Approach-Avoidance Task (AAT), and that this impacts subsequent consumption. A recent paper in this journal (Becker, Jostman, Wiers, & Holland, 2015) failed to show significant training effects for food in three studies, nor did it find effects on subsequent consumption. However, avoidance training to high-calorie foods was tested against a control rather than against approach training. The present study used a paradigm more comparable to the alcohol studies. It randomly assigned 90 participants to 'approach' or 'avoid' chocolate images on the AAT, and then asked them to taste and rate chocolates. A significant interaction of condition and time showed that training to avoid chocolate resulted in faster avoidance responses to chocolate images, compared with training to approach it. Consistent with Becker et al.'s Study 3, no effect was found on the amount of chocolate consumed, although a newly published study in this journal (Schumacher, Kemps, & Tiggemann, 2016) did find such an effect. The collective evidence does not yet provide a solid basis for applying AAT training to the reduction of problematic food consumption, although clinical trials have yet to be conducted. Copyright © 2016 Elsevier Ltd. All rights reserved.
Xylinas, Evanguelos; Colin, Pierre; Audenet, François; Phe, Véronique; Cormier, Luc; Cussenot, Olivier; Houlgatte, Alain; Karsenty, Gilles; Bruyère, Franck; Polguer, Thomas; Ruffion, Alain; Valeri, Antoine; Rozet, François; Long, Jean-Alexandre; Zerbib, Marc; Rouprêt, Morgan
2013-02-01
To identify predictive factors and assess the impact on oncological outcomes of intravesical recurrence after radical nephroureterectomy (RNU) in upper tract urothelial carcinoma (UTUC). Using a national multicentric retrospective dataset, we identified all patients with UTUC who underwent a RNU between 1995 and 2010 (n = 482). Intravesical recurrence was tested as a prognostic factor for survival through univariable and multivariable Cox regression analysis. Overall, intravesical recurrence occurred in 169 patients (35 %) with a median age of 69.2 years (IQR: 60-76) and after a median follow-up of 39.5 months (IQR: 25-60). Actuarial intravesical recurrence-free survival estimates at 2 and 5 years after RNU were 72 and 45 %, respectively. On univariable analyses, previous history of bladder tumor, tumor multifocality, laparoscopic approach, pathological T-stage, presence of concomitant CIS and lymphovascular invasion were all associated with intravesical recurrence. On multivariable analysis, previous history of bladder cancer, tumor multifocality and laparoscopic approach remained independent predictors of intravesical recurrence. Existence of intravesical recurrence was not correlated with worse oncological outcomes in terms of disease recurrence (p = 0.075) and cancer-specific mortality (p = 0.06). In the current study, intravesical recurrence occurred in 35 % of patients with UTUC after RNU. Previous history of bladder cancer, tumor multifocality, concomitant CIS and laparoscopic approach were independent predictors of intravesical recurrence. These findings are in line with recently published data and should be considered carefully to provide a definitive surveillance protocol for the management of urothelial carcinomas regardless of their location in the urinary tract.
Mass and Volume Optimization of Space Flight Medical Kits
NASA Technical Reports Server (NTRS)
Keenan, A. B.; Foy, Millennia Hope; Myers, Jerry
2014-01-01
Resource allocation is a critical aspect of space mission planning. All resources, including medical resources, are subject to a number of mission constraints such as maximum mass and volume. However, unlike many resources, there is often limited understanding of how to optimize medical resources for a mission. The Integrated Medical Model (IMM) is a probabilistic model that estimates medical event occurrences and mission outcomes for different mission profiles. IMM simulates outcomes and describes the impact of medical events in terms of lost crew time, medical resource usage, and the potential for medically required evacuation. Previously published work describes an approach that uses the IMM to generate optimized medical kits that maximize benefit to the crew subject to mass and volume constraints. We improve upon the results obtained previously and extend our approach to minimize mass and volume while meeting some benefit threshold. METHODS: We frame the medical kit optimization problem as a modified knapsack problem and implement an algorithm utilizing dynamic programming. Using this algorithm, optimized medical kits were generated for 3 mission scenarios with the goal of minimizing the medical kit mass and volume for a specified likelihood of evacuation or Crew Health Index (CHI) threshold. The algorithm was expanded to generate medical kits that maximize likelihood of evacuation or CHI subject to mass and volume constraints. RESULTS AND CONCLUSIONS: In maximizing benefit to crew health subject to certain constraints, our algorithm generates medical kits that more closely resemble the unlimited-resource scenario than previous approaches which leverage medical risk information generated by the IMM.
Our work here demonstrates that this algorithm provides an efficient and effective means to objectively allocate medical resources for spaceflight missions and provides an effective means of addressing tradeoffs in medical resource allocations and crew mission success parameters.
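The modified knapsack framing can be sketched as a standard 0/1 dynamic program. This single-constraint (mass-only) toy version omits the volume dimension and the IMM-derived benefit scores, and all names are illustrative:

```python
def optimize_kit(items, mass_budget):
    """0/1 knapsack via dynamic programming: choose medical items maximizing
    total benefit subject to an integer mass budget.

    items: list of (name, mass, benefit) tuples with integer masses.
    Returns (best_benefit, chosen_item_names).
    """
    best = [0.0] * (mass_budget + 1)          # best[m]: max benefit using mass <= m
    choice = [[] for _ in range(mass_budget + 1)]
    for name, mass, benefit in items:
        # Iterate masses downward so each item is used at most once.
        for m in range(mass_budget, mass - 1, -1):
            if best[m - mass] + benefit > best[m]:
                best[m] = best[m - mass] + benefit
                choice[m] = choice[m - mass] + [name]
    return best[mass_budget], choice[mass_budget]
```

The inverse problem in the abstract (minimize mass while meeting a benefit threshold) can be solved with the same table by scanning for the smallest budget whose best benefit clears the threshold.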
Celli, A; Sanchez, S; Behne, M; Hazlett, T; Gratton, E; Mauro, T
2010-03-03
Ionic gradients are found across a variety of tissues and organs. In this report, we apply the phasor representation of fluorescence lifetime imaging data to the quantitative study of ionic concentrations in tissues, overcoming technical problems of tissue thickness, concentration artifacts of ion-sensitive dyes, and calibration across inhomogeneous tissue. We used epidermis as a model system, as Ca(2+) gradients in this organ have been shown previously to control essential biologic processes of differentiation and formation of the epidermal permeability barrier. The approach described here allowed much better localization of Ca(2+) stores than those used in previous studies, and revealed that the bulk of free Ca(2+) measured in the epidermis comes from intracellular Ca(2+) stores such as the Golgi and the endoplasmic reticulum, with extracellular Ca(2+) making a relatively small contribution to the epidermal Ca(2+) gradient. Due to the high spatial resolution of two-photon microscopy, we were able to measure a marked heterogeneity in average calcium concentrations from cell to cell in the basal keratinocytes. This finding, not reported in previous studies, calls into question the long-held hypothesis that keratinocytes increase intracellular Ca(2+), cease proliferation, and differentiate passively in response to changes in extracellular Ca(2+). The experimental results obtained using this approach illustrate the power of the experimental and analytical techniques outlined in this report. Our approach can be used in mechanistic studies to address the formation, maintenance, and function of the epidermal Ca(2+) gradient, and it should be broadly applicable to the study of other tissues with ionic gradients. Copyright © 2010 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Trans-cervical resection of a separate substernal goitre.
To, Henry; Karmakar, Antara; Farrell, Stephen; Manolas, Steve
2017-01-01
A separate substernal goitre that is not continuous with the main cervical thyroid poses a unique challenge for resection. A trans-cervical approach is preferred but may be hazardous due to the possibility of ectopic thyroid tissue with an alternate blood supply. A 72-year-old female who had undergone a previous left hemithyroidectomy presented with a symptomatic central substernal thyroid mass. Following radiological work-up, the separate goitre was carefully removed via a trans-cervical approach, avoiding sternotomy. She had a rapid recovery without complication. The anatomy and embryology of substernal masses need to be carefully considered, particularly if the mass is ectopic thyroid tissue. Careful pre-operative assessment may determine its nature and anatomical features. Intra-operative dissection requires consideration of blood supply and surrounding structures, but resection can often be, and is best, completed via a cervical approach to minimise morbidity. Review of the literature affirms the preference for a trans-cervical approach and offers criteria for successful resection via this method. Confirming the nature and anatomy of a separate substernal goitre enables successful removal of the mass via a trans-cervical approach with minimal morbidity. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi
2016-01-01
Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405
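PAEA builds on principal angles between subspaces. The sketch below shows the standard construction (QR-orthonormalize each basis, then take the singular values of the cross-product as cosines of the angles); it does not reproduce the paper's enrichment statistic:

```python
import numpy as np

def principal_angles(A, B):
    """Principal angles between the column spaces of A and B.

    Orthonormalize each basis with QR; the singular values of Qa^T Qb are
    the cosines of the principal angles. Returns angles in radians,
    smallest first.
    """
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))  # clip guards rounding past [-1, 1]
```

In a PAEA-like use, one subspace would come from a low-dimensional representation of the differential-expression data and the other from a gene set's indicator vectors; small principal angles then indicate enrichment.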
Salgarelli, A C; Anesi, A; Bellini, P; Pollastri, G; Tanza, D; Barberini, S; Chiarini, L
2013-04-01
Fractures of the mandibular condyle are common and account for 25-35% of all mandibular fractures reported in the literature. Even with the development of a consensus on the preference for open reduction and internal fixation of these fractures, the clinician is still faced with a dilemma concerning the optimal approach to the ramus-condyle unit. Limited access and injury to the facial nerve are the most common problems. The most commonly used extraoral approaches are the submandibular, retromandibular and preauricular methods. In this study, we propose a modified cosmetic preauricular incision with a short end in the neck, to improve the transmasseteric anteroparotid (TMAP) approach previously described by Wilson et al. in 2005. We retrospectively analysed 13 patients treated in our department for mandibular condylar fractures. Post-operative complications, occlusal status, interincisal opening and joint tenderness were evaluated at 3 months after surgery. The wider skin incision described here provides a convenient approach for open reduction and rigid internal fixation, and good results were obtained. The follow-up ranged from 6 to 40 months. Copyright © 2013 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Haak, Danielle M.; Chaine, Noelle M.; Stephen, Bruce J.; Wong, Alec; Allen, Craig R.
2013-01-01
The Chinese mystery snail (Bellamya chinensis) is an aquatic invasive species found throughout the USA. Little is known about this species’ life history or ecology, and only one population estimate has been published, for Wild Plum Lake in southeast Nebraska. A recent die-off event occurred at this same reservoir and we present a mortality estimate for this B. chinensis population using a quadrat approach. Assuming uniform distribution throughout the newly-exposed lake bed (20,900 m2), we estimate 42,845 individuals died during this event, amounting to approximately 17% of the previously-estimated population size of 253,570. Assuming uniform distribution throughout all previously-reported available habitat (48,525 m2), we estimate 99,476 individuals died, comprising 39% of the previously-reported adult population. The die-off occurred during an extreme drought event, which was coincident with abnormally hot weather. However, the exact cause of the die-off is still unclear. More monitoring of the population dynamics of B. chinensis is necessary to further our understanding of this species’ ecology.
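The scaling arithmetic behind a quadrat-based estimate is straightforward; in this sketch the per-square-metre density is back-calculated from the abstract's figures for illustration and is not itself reported in the abstract:

```python
def quadrat_mortality_estimate(dead_per_quadrat, quadrat_area_m2,
                               habitat_area_m2, population_size):
    """Scale a mean per-quadrat count of dead snails to a whole area,
    assuming a uniform distribution (the paper's stated assumption).

    Returns (estimated_deaths, fraction_of_prior_population_estimate).
    """
    density = dead_per_quadrat / quadrat_area_m2   # dead individuals per m^2
    deaths = density * habitat_area_m2             # scale density to full area
    return deaths, deaths / population_size
```

With an illustrative density of about 2.05 dead snails per m², the 20,900 m² exposed lake bed yields roughly the 42,845 deaths (about 17% of 253,570) reported above.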
Ruano-Ravina, Alberto; Álvarez-Dardet, Carlos; Domínguez-Berjón, M Felicitas; Fernández, Esteve; García, Ana M; Borrell, Carme
2016-01-01
The purpose of the study was to analyze determinants of citations, such as publication year, article type, article topic, selection for a press release, number of articles previously published by the corresponding author, and publication language, in a Spanish journal of public health. Observational study including all articles published in Gaceta Sanitaria during 2007-2011. We retrieved the number of citations from the ISI Web of Knowledge database in June 2013, as well as information on other variables such as the number of articles published by the corresponding author in the previous 5 years (searched through PubMed), selection for a press release, publication language, article type and topic, and others. We included 542 articles. Of these, 62.5% were cited in the period considered. We observed increased odds of citation for articles selected for a press release, and also with the number of articles previously published by the corresponding author. Publication in English did not seem to increase citations. Certain externalities, such as the number of articles published by the corresponding author and selection for a press release, seem to influence the number of citations in national journals. Copyright © 2016 Elsevier Inc. All rights reserved.
Park, Eun Sug; Hopke, Philip K; Oh, Man-Suk; Symanski, Elaine; Han, Daikwon; Spiegelman, Clifford H
2014-07-01
There has been increasing interest in assessing health effects associated with multiple air pollutants emitted by specific sources. A major difficulty with achieving this goal is that the pollution source profiles are unknown and source-specific exposures cannot be measured directly; rather, they need to be estimated by decomposing ambient measurements of multiple air pollutants. This estimation process, called multivariate receptor modeling, is challenging because of the unknown number of sources and unknown identifiability conditions (model uncertainty). The uncertainty in source-specific exposures (source contributions) as well as uncertainty in the number of major pollution sources and identifiability conditions have been largely ignored in previous studies. A multipollutant approach that can deal with model uncertainty in multivariate receptor models while simultaneously accounting for parameter uncertainty in estimated source-specific exposures in assessment of source-specific health effects is presented in this paper. The methods are applied to daily ambient air measurements of the chemical composition of fine particulate matter (PM2.5), weather data, and counts of cardiovascular deaths from 1995 to 1997 for Phoenix, AZ, USA. Our approach for evaluating source-specific health effects yields not only estimates of source contributions along with their uncertainties and associated health effects estimates but also estimates of model uncertainty (posterior model probabilities) that have been ignored in previous studies. The results from our methods agreed in general with those from the previously conducted workshop/studies on the source apportionment of PM health effects in terms of number of major contributing sources, estimated source profiles, and contributions.
However, some of the adverse source-specific health effects identified in the previous studies were not statistically significant in our analysis, which probably resulted because we incorporated parameter uncertainty in estimated source contributions that has been ignored in the previous studies into the estimation of health effects parameters. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
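Multivariate receptor modeling decomposes a days-by-species measurement matrix into nonnegative source contributions and source profiles, X ≈ GF. The sketch below uses plain Lee-Seung multiplicative-update NMF as a stand-in for that decomposition; the paper's Bayesian treatment of model uncertainty (number of sources, identifiability conditions) is not reproduced, and all names are ours:

```python
import numpy as np

def receptor_model_nmf(X, n_sources, n_iter=500, seed=0):
    """Decompose ambient measurements X (days x species) into nonnegative
    source contributions G (days x sources) and source profiles F
    (sources x species) such that X ~ G @ F.

    Uses Lee-Seung multiplicative updates, which preserve nonnegativity.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    G = rng.random((n, n_sources)) + 1e-3   # small offset avoids zero-locking
    F = rng.random((n_sources, p)) + 1e-3
    for _ in range(n_iter):
        F *= (G.T @ X) / (G.T @ G @ F + 1e-12)   # update profiles
        G *= (X @ F.T) / (G @ F @ F.T + 1e-12)   # update contributions
    return G, F
```

In a health-effects analysis like the one above, the columns of G (daily source contributions) would then enter a regression against the mortality counts, which is where the parameter uncertainty the authors emphasize becomes important.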
Bishop, Felicity L
2015-02-01
To outline some of the challenges of mixed methods research and illustrate how they can be addressed in health psychology research. This study critically reflects on the author's previously published mixed methods research and discusses the philosophical and technical challenges of mixed methods, grounding the discussion in a brief review of the methodological literature. Mixed methods research is characterized as having philosophical and technical challenges; the former can be addressed by drawing on pragmatism, the latter by considering formal mixed methods research designs proposed in a number of design typologies. There are important differences among the design typologies, which provide diverse examples of designs that health psychologists can adapt for their own mixed methods research. There are also similarities; in particular, many typologies explicitly orient to the technical challenges of deciding on the respective timing of qualitative and quantitative methods and the relative emphasis placed on each method. Characteristics, strengths, and limitations of different sequential and concurrent designs are identified by reviewing five mixed methods projects, each conducted for a different purpose. Adapting formal mixed methods designs can help health psychologists address the technical challenges of mixed methods research and identify the approach that best fits the research questions and purpose. This does not obviate the need to address the philosophical challenges of mixing qualitative and quantitative methods. Statement of contribution What is already known on this subject? Mixed methods research poses philosophical and technical challenges. Pragmatism is a popular approach to the philosophical challenges, while diverse typologies of mixed methods designs can help address the technical challenges. Examples of mixed methods research can be hard to locate when component studies from mixed methods projects are published separately. What does this study add?
Critical reflections on the author's previously published mixed methods research illustrate how a range of different mixed methods designs can be adapted and applied to address health psychology research questions. The philosophical and technical challenges of mixed methods research should be considered together and in relation to the broader purpose of the research. © 2014 The British Psychological Society.
Wilkinson, Claire; Livingston, Michael; Room, Robin
2016-09-30
Legislative limits on trading hours for licensed premises have a long history in Australia as a key policy approach to managing alcohol-related problems. In recent years, following substantial extensions to permitted hours of sale, there has been renewed attention to policies aimed at reducing late-night trading hours. Restrictions on on-premise alcohol sales have been implemented in Australia after 3.30 am in Newcastle, and after 3 am in Kings Cross and the Sydney central business district in New South Wales. In July 2016, similar restrictions were introduced state-wide after 2 am, or 3 am in 'safe night precincts', in Queensland. Similar policy changes have occurred internationally (e.g. in the UK and the Nordic countries) and there is a growing body of research examining the impacts of trading hour policies on alcohol-related harm. Although there has been a series of reviews of the research in this area, the most recent is now 5 years old and limited to studies published before March 2008. Objective and importance of study: To examine recent (2005-2015) research about the impact of changing the hours of sale of alcohol on alcohol-related harms. The ongoing public discussion about trading hours policy in Australia can benefit from an up-to-date and comprehensive review of the research. Systematic review of the literature that considered the impact of policies that extended or restricted trading hours. MEDLINE, Core Collection, PsycINFO and EMBASE databases were searched from January 2005 to December 2015. Articles were summarised descriptively, focusing on studies conducted in Australia and published since the previous reviews. The search identified 21 studies, including seven from Australia. There were 14 studies published since previous reviews. A series of robust, well-designed Australian studies demonstrates that reducing the hours during which on-premise alcohol outlets can sell alcohol late at night can substantially reduce rates of violence.
The Australian studies are supported by a growing body of international research. The evidence of effectiveness is strong enough to consider restrictions on late trading hours for bars and hotels as a key approach to reducing late-night violence in Australia.
Schmeisser, Falko; Jing, Xianghong; Joshi, Manju; Vasudevan, Anupama; Soto, Jackeline; Li, Xing; Choudhary, Anil; Baichoo, Noel; Resnick, Josephine; Ye, Zhiping; McCormick, William; Weir, Jerry P
2016-03-01
The potency of inactivated influenza vaccines is determined using a single-radial immunodiffusion (SRID) assay and requires standardized reagents consisting of a Reference Antigen and an influenza strain-specific antiserum. Timely availability of reagents is a critical step in influenza vaccine production, and the need for backup approaches for reagent preparation is an important component of pandemic preparedness. When novel H7N9 viruses emerged in China in 2013, candidate inactivated H7N9 influenza vaccines were developed for evaluation in clinical trials, and reagents were needed to measure vaccine potency. We previously described an alternative approach for generating strain-specific potency antisera, utilizing modified vaccinia virus Ankara vectors to produce influenza hemagglutinin (HA)-containing virus-like particles (VLPs) for immunization. Vector-produced HA antigen is not dependent upon the success of the traditional bromelain-digestion and HA purification. Antiserum for H7N9 vaccines, produced after immunization of sheep with preparations of bromelain-HA (br-HA), was not optimal for the SRID assay, and the supply of antiserum was limited. However, antiserum obtained from sheep boosted with VLPs containing H7 HA greatly improved the ring quality in the SRID assay. Importantly, this antiserum worked well with both egg- and cell-derived antigen and was distributed to vaccine manufacturers. Utilizing a previously developed approach for preparing vaccine potency antiserum, we have addressed a major bottleneck encountered in preparation of H7N9 vaccine reagents. The combination of br-HA and mammalian VLPs for sequential immunization represents the first use of an alternative approach for producing an influenza vaccine potency antiserum. © 2015 The Authors. Influenza and Other Respiratory Viruses Published by John Wiley & Sons Ltd.
Potential Vaccines and Post-Exposure Treatments for Filovirus Infections
Friedrich, Brian M.; Trefry, John C.; Biggins, Julia E.; Hensley, Lisa E.; Honko, Anna N.; Smith, Darci R.; Olinger, Gene G.
2012-01-01
Viruses of the family Filoviridae represent significant health risks as emerging infectious diseases as well as potentially engineered biothreats. While many research efforts have been published offering possibilities toward the mitigation of filoviral infection, there remain no sanctioned therapeutic or vaccine strategies. Current progress in the development of filovirus therapeutics and vaccines is outlined herein with respect to their current level of testing, evaluation, and proximity toward human implementation, specifically with regard to human clinical trials, nonhuman primate studies, small animal studies, and in vitro development. Contemporary methods of supportive care and previous treatment approaches for human patients are also discussed. PMID:23170176
Interval-based reconstruction for uncertainty quantification in PET
NASA Astrophysics Data System (ADS)
Kucharczak, Florentin; Loquin, Kevin; Buvat, Irène; Strauss, Olivier; Mariano-Goulart, Denis
2018-02-01
A new directed interval-based tomographic reconstruction algorithm, called non-additive interval based expectation maximization (NIBEM), is presented. It uses non-additive modeling of the forward operator that provides intervals instead of single-valued projections. The detailed approach is an extension of the maximum likelihood expectation maximization (MLEM) algorithm based on intervals. The main motivation for this extension is that the resulting intervals have appealing properties for estimating the statistical uncertainty associated with the reconstructed activity values. After reviewing previously published theoretical concepts related to interval-based projectors, this paper describes the NIBEM algorithm and gives examples that highlight the properties and advantages of this interval-valued reconstruction.
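The mechanics of an interval-valued reconstruction can be sketched as follows. This is an illustrative simplification, not the NIBEM algorithm itself: NIBEM derives its intervals from a non-additive model of the projector, whereas the toy sketch below simply runs a standard MLEM update on the lower and upper bounds of interval-valued projection data. All names and the toy system matrix are hypothetical.

```python
import numpy as np

def mlem(A, y, n_iter=200):
    """Standard MLEM: x <- x * A^T(y / Ax) / A^T 1 (all quantities >= 0)."""
    x = np.ones(A.shape[1])
    sensitivity = A.T @ np.ones(A.shape[0])
    for _ in range(n_iter):
        x *= (A.T @ (y / (A @ x))) / sensitivity
    return x

def interval_mlem(A, y_lo, y_hi, n_iter=200):
    """Bracket the reconstructed activity by reconstructing both bounds
    of interval-valued projection data [y_lo, y_hi]."""
    return mlem(A, y_lo, n_iter), mlem(A, y_hi, n_iter)

# Toy 2-detector / 2-pixel system with +/-5% uncertainty on the projections
A = np.array([[1.0, 0.2],
              [0.3, 1.0]])
y = A @ np.array([4.0, 2.0])
x_lo, x_hi = interval_mlem(A, 0.95 * y, 1.05 * y)
```

The per-voxel width of [x_lo, x_hi] then acts as an uncertainty indicator, which is the property that motivates carrying intervals through the reconstruction.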
Teen magazines as educational texts on dating violence: the $2.99 approach.
Kettrey, Heather Hensman; Emery, Beth C
2010-11-01
This study analyzed the portrayal of dating violence in teen magazines published in the United States. Such an investigation is important because previous research indicates that dating violence is a serious problem facing adolescents, teen magazines overemphasize the importance of romantic relationships, and teens who read this genre frequently or for education/advice are especially susceptible to its messages. Results indicated that although teen magazines do frame dating violence as a cultural problem, they are much more likely to utilize an individual frame that emphasizes the victim. Results were discussed as they apply to the responsibilities of professionals working with adolescents.
The Impact of Guided Notes on Post-Secondary Student Achievement: A Meta-Analysis
ERIC Educational Resources Information Center
Larwin, Karen H.; Larwin, David A.
2013-01-01
The common practice of using of guided notes in the post-secondary classroom is not fully appreciated or understood. In an effort to add to the existing research about this phenomenon, the current investigation expands on previously published research and one previously published meta-analysis that examined the impact of guided notes on…
Netchacovitch, L; Thiry, J; De Bleye, C; Dumont, E; Cailletaud, J; Sacré, P-Y; Evrard, B; Hubert, Ph; Ziemons, E
2017-08-15
Since the Food and Drug Administration (FDA) published a guidance based on the Process Analytical Technology (PAT) approach, real-time analyses during manufacturing processes have expanded rapidly. In this study, in-line Raman spectroscopic analyses were performed during a Hot-Melt Extrusion (HME) process to determine the Active Pharmaceutical Ingredient (API) content in real-time. The method was validated based on a univariate and a multivariate approach, and the analytical performances of the obtained models were compared. Moreover, on one hand, in-line data were correlated with the real API concentration present in the sample, quantified by a previously validated off-line confocal Raman microspectroscopic method. On the other hand, in-line data were also treated as a function of the concentration based on the weighed amounts of the components in the prepared mixture. The importance of developing quantitative methods based on the use of a reference method was thus highlighted. The method was validated according to the total error approach, fixing the acceptance limits at ±15% and the α risk at 5%. This method meets the requirements of the European Pharmacopoeia for the uniformity of content of single-dose preparations. The validation proves that future results will fall within the acceptance limits with a previously defined probability. Finally, the in-line validated method was compared with the off-line one to demonstrate its suitability for routine analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
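The total-error validation logic can be illustrated with a simplified accuracy profile: at each concentration level, a β-expectation tolerance band for the relative error is computed and compared to the ±15% acceptance limits. This is a generic sketch of the idea, not the validated method from the study; the data, levels, and function names are invented.

```python
import numpy as np
from scipy import stats

def accuracy_profile(nominal, measured, limit=0.15, alpha=0.05):
    """For each nominal concentration level, compute the mean relative
    bias and a beta-expectation tolerance band (mean +/- t * s * sqrt(1+1/n));
    the level is accepted when the whole band lies within +/-limit."""
    profile = {}
    for c in np.unique(nominal):
        rel_err = measured[nominal == c] / c - 1.0   # relative errors
        n, mean, sd = rel_err.size, rel_err.mean(), rel_err.std(ddof=1)
        k = stats.t.ppf(1.0 - alpha / 2.0, n - 1) * np.sqrt(1.0 + 1.0 / n)
        lo, hi = mean - k * sd, mean + k * sd
        profile[float(c)] = {"bias": mean, "band": (lo, hi),
                             "accepted": lo >= -limit and hi <= limit}
    return profile

# Illustrative data: three API levels, small random error around the truth
rng = np.random.default_rng(1)
nominal = np.repeat([5.0, 10.0, 15.0], 6)           # e.g. % w/w
measured = nominal * (1.0 + rng.normal(0.0, 0.01, nominal.size))
result = accuracy_profile(nominal, measured)
```

A method with a 30% systematic bias would fail every level under the same criterion, which is what the acceptance test is designed to catch.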
Diffusion of palliative care in nursing homes: lessons from the culture change movement.
Tyler, Denise A; Shield, Renée R; Miller, Susan C
2015-05-01
Studies have found that nursing homes (NHs) that rely heavily on Medicaid funding are less likely to implement innovative approaches to care, such as palliative care (PC) or resident-centered approaches commonly referred to as "culture change" (CC). However, a nationally representative survey we previously conducted found that some high Medicaid facilities have implemented these innovative approaches. The purpose of this study was to identify the factors that enable some high Medicaid NHs to implement innovative approaches to care. We conducted telephone interviews with 16 NH administrators in four categories of facilities: 1) low PC and low CC, 2) low PC and high CC, 3) high PC and low CC, and 4) high PC and high CC. Interviews explored strategies used to overcome barriers to implementation and the resources needed for implementation. We had expected to find differences between low and high NHs but instead found differences in NHs' experiences with CC and PC. Since the time of our national survey in 2009-2010, most previously low CC NHs had implemented at least some CC practices; however, we did not find similar changes around PC. Administrators reported numerous ways in which they had received information and assistance from outside entities for implementing CC. This was not the case for PC where administrators reported relying exclusively and heavily on hospices for both their residents' PC needs and information related to PC. PC advocates could learn much from the CC model in which advocates have used multipronged efforts to institute reform. Published by Elsevier Inc.
Applications of next-generation sequencing to blood and marrow transplantation.
Chapman, Michael; Warren, Edus H; Wu, Catherine J
2012-01-01
Since the advent of next-generation sequencing (NGS) in 2005, there has been an explosion of published studies employing the technology to tackle previously intractable questions in many disparate biological fields. This has been coupled with technology development that has occurred at a remarkable pace. This review discusses the potential impact of this new technology on the field of blood and marrow stem cell transplantation. Hematologic malignancies have been among the forefront of those cancers whose genomes have been the subject of NGS. Hence, these studies have opened novel areas of biology that can be exploited for prognostic, diagnostic, and therapeutic means. Because of the unprecedented depth, resolution and accuracy achievable by NGS, this technology is well-suited for providing detailed information on the diversity of receptors that govern antigen recognition; this approach has the potential to contribute important insights into understanding the biologic effects of transplantation. Finally, the ability to perform comprehensive tumor sequencing provides a systematic approach to the discovery of genetic alterations that can encode peptides with restricted tumor expression, and hence serve as potential target antigens of graft-versus-leukemia responses. Altogether, this increasingly affordable technology will undoubtedly impact the future practice and care of patients with hematologic malignancies. Copyright © 2012 American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.
Meerpohl, Joerg J; Schell, Lisa K; Bassler, Dirk; Gallus, Silvano; Kleijnen, Jos; Kulig, Michael; La Vecchia, Carlo; Marušić, Ana; Ravaud, Philippe; Reis, Andreas; Schmucker, Christine; Strech, Daniel; Urrútia, Gerard; Wager, Elizabeth; Antes, Gerd
2015-05-05
Dissemination bias in clinical research severely impedes informed decision-making not only for healthcare professionals and patients, but also for funders, research ethics committees, regulatory bodies and other stakeholder groups that make health-related decisions. Decisions based on incomplete and biased evidence cannot only harm people, but may also have huge financial implications by wasting resources on ineffective or harmful diagnostic and therapeutic measures, and unnecessary research. Owing to involvement of multiple stakeholders, it remains easy for any single group to assign responsibility for resolving the problem to others. To develop evidence-informed general and targeted recommendations addressing the various stakeholders involved in knowledge generation and dissemination to help overcome the problem of dissemination bias on the basis of previously collated evidence. Based on findings from systematic reviews, document analyses and surveys, we developed general and targeted draft recommendations. During a 2-day workshop in summer 2013, these draft recommendations were discussed with external experts and key stakeholders, and refined following a rigorous and transparent methodological approach. Four general, overarching recommendations applicable to all or most stakeholder groups were formulated, addressing (1) awareness raising, (2) implementation of targeted recommendations, (3) trial registration and results posting, and (4) systematic approaches to evidence synthesis. These general recommendations are complemented and specified by 47 targeted recommendations tailored towards funding agencies, pharmaceutical and device companies, research institutions, researchers (systematic reviewers and trialists), research ethics committees, trial registries, journal editors and publishers, regulatory agencies, benefit (health technology) assessment institutions and legislators. 
Despite various recent examples of dissemination bias and several initiatives to reduce it, the problem of dissemination bias has not been resolved. Tailored recommendations based on a comprehensive approach will hopefully help increase transparency in biomedical research by overcoming the failure to disseminate negative findings. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Accurate nonlinear mapping between MNI volumetric and FreeSurfer surface coordinate systems.
Wu, Jianxiao; Ngo, Gia H; Greve, Douglas; Li, Jingwei; He, Tong; Fischl, Bruce; Eickhoff, Simon B; Yeo, B T Thomas
2018-05-16
The results of most neuroimaging studies are reported in volumetric (e.g., MNI152) or surface (e.g., fsaverage) coordinate systems. Accurate mappings between volumetric and surface coordinate systems can facilitate many applications, such as projecting fMRI group analyses from MNI152/Colin27 to fsaverage for visualization or projecting resting-state fMRI parcellations from fsaverage to MNI152/Colin27 for volumetric analysis of new data. However, there has been surprisingly little research on this topic. Here, we evaluated three approaches for mapping data between MNI152/Colin27 and fsaverage coordinate systems by simulating the above applications: projection of group-average data from MNI152/Colin27 to fsaverage and projection of fsaverage parcellations to MNI152/Colin27. Two of the approaches are currently widely used. A third approach (registration fusion) was previously proposed, but not widely adopted. Two implementations of the registration fusion (RF) approach were considered, with one implementation utilizing the Advanced Normalization Tools (ANTs). We found that RF-ANTs performed the best for mapping between fsaverage and MNI152/Colin27, even for new subjects registered to MNI152/Colin27 using a different software tool (FSL FNIRT). This suggests that RF-ANTs would be useful even for researchers not using ANTs. Finally, it is worth emphasizing that the most optimal approach for mapping data to a coordinate system (e.g., fsaverage) is to register individual subjects directly to the coordinate system, rather than via another coordinate system. Only in scenarios where the optimal approach is not possible (e.g., mapping previously published results from MNI152 to fsaverage), should the approaches evaluated in this manuscript be considered. In these scenarios, we recommend RF-ANTs (https://github.com/ThomasYeoLab/CBIG/tree/master/stable_projects/registration/Wu2017_RegistrationFusion). © 2018 Wiley Periodicals, Inc.
A novel approach for assessments of erythrocyte sedimentation rate.
Pribush, A; Hatskelzon, L; Meyerstein, N
2011-06-01
Previous studies have shown that the dispersed phase of sedimenting blood undergoes dramatic structural changes: Discrete red blood cell (RBC) aggregates formed shortly after a settling tube is filled with blood are combined into a continuous network followed by its collapse via the formation of plasma channels, and finally, the collapsed network is dispersed into individual fragments. Based on this scheme of structural transformation, a novel approach for assessments of erythrocyte sedimentation is suggested. Information about erythrocyte sedimentation is extracted from time records of the blood conductivity measured after a dispersion of RBC network into individual fragments. It was found that the sedimentation velocity of RBC network fragments correlates positively with the intensity of attractive intercellular interactions, whereas no effect of hematocrit (Hct) was observed. Thus, unlike Westergren erythrocyte sedimentation rate, sedimentation data obtained by the proposed method do not require correction for Hct. © 2010 Blackwell Publishing Ltd.
Liu, Hai-Yang; Wang, Yan-Kun; Zhi, Cong-Cong; Xiao, Jin-Hua; Huang, Da-Wei
2014-06-01
Wolbachia are widespread in insects and can manipulate host reproduction. Nasonia vitripennis is a widely studied organism with a very high prevalence of Wolbachia infection. To study the effect of Wolbachia infection in Nasonia spp., it is important to obtain noninfected individuals by artificial methods. Current methods that employ sugar water containing antibiotics can successfully eliminate Wolbachia from the parasitic wasps; however, treatment of at least three generations is required. Here, we describe a novel, feasible, and effective approach to eliminate Wolbachia from N. vitripennis by rearing the wasps on antibiotic-treated fly pupae, thereby continuously delivering antibiotics to Nasonia populations and shortening the time needed to eliminate the bacteria to two generations. Additionally, the Wolbachia Uni and CauB strains show markedly different rifampicin-resistance abilities, a previously unknown phenomenon. © 2014 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.
[Private health insurance in Brazil: approaches to public/private patterns in healthcare].
Sestelo, José Antonio de Freitas; Souza, Luis Eugenio Portela Fernandes de; Bahia, Lígia
2013-05-01
This article draws on a previous review of 270 articles on private health plans published from 2000 to 2010 and selects 17 that specifically address the issue of the relationship between the public and private healthcare sectors. Content analysis considered the studies' concepts and terms, related theoretical elements, and predominant lines of argument. A reading of the argumentative strategies detected the existence of a critical view of the modus operandi in the public/private relationship based on Social Medicine and the theoretical tenets of the Brazilian Health Reform Movement. The study also identified contributions based on neoliberal business approaches that focus strictly on economic issues to discuss private health insurance. Understanding the public/private link in healthcare obviously requires the development of a solid empirical base, analyzed with adequate theoretical assumptions due to the inherent degree of complexity in the public/private healthcare interface.
A composite computational model of liver glucose homeostasis. I. Building the composite model.
Hetherington, J; Sumner, T; Seymour, R M; Li, L; Rey, M Varela; Yamaji, S; Saffrey, P; Margoninski, O; Bogle, I D L; Finkelstein, A; Warner, A
2012-04-07
A computational model of the glucagon/insulin-driven liver glucohomeostasis function, focusing on the buffering of glucose into glycogen, has been developed. The model exemplifies an 'engineering' approach to modelling in systems biology, and was produced by linking together seven component models of separate aspects of the physiology. The component models use a variety of modelling paradigms and degrees of simplification. Model parameters were determined by an iterative hybrid of fitting to high-scale physiological data, and determination from small-scale in vitro experiments or molecular biological techniques. The component models were not originally designed for inclusion within such a composite model, but were integrated, with modification, using our published modelling software and computational frameworks. This approach facilitates the development of large and complex composite models, although, inevitably, some compromises must be made when composing the individual models. Composite models of this form have not previously been demonstrated.
Body ownership: When feeling and knowing diverge.
Romano, Daniele; Sedda, Anna; Brugger, Peter; Bottini, Gabriella
2015-07-01
Individuals with the peculiar disturbance of 'overcompleteness' experience an intense desire to amputate one of their healthy limbs, describing a sense of disownership for it (Body Integrity Identity Disorder - BIID). This condition is similar to somatoparaphrenia, the acquired delusion that one's own limb belongs to someone else. In ten individuals with BIID, we measured the skin conductance response (SCR) to noxious stimuli delivered to the accepted and non-accepted limb, either touching the body part or simulating the contact (stimuli approach the body without contacting it), hypothesizing that these individuals have responses like somatoparaphrenic patients, who previously showed reduced pain anticipation when the threat was directed to the disowned limb. We found a reduced anticipatory response to stimuli approaching, but not contacting, the unwanted limb. Conversely, stimuli contacting the non-accepted body part induced stronger SCRs than those contacting the healthy parts, suggesting that the feeling of ownership is critically related to proper processing of incoming threats. Copyright © 2015. Published by Elsevier Inc.
Maulidiani; Rudiyanto; Abas, Faridah; Ismail, Intan Safinar; Lajis, Nordin H
2018-06-01
Optimization is an important aspect of natural product extraction. Herein, an alternative approach is proposed for optimization in extraction, namely, Generalized Likelihood Uncertainty Estimation (GLUE). The approach combines Latin hypercube sampling, the feasible range of independent variables, Monte Carlo simulation, and threshold criteria on the response variables. The GLUE method is tested on three different techniques, including ultrasound-, microwave-, and supercritical CO2-assisted extractions, utilizing data from previously published reports. The study found that this method can: provide more information on the combined effects of the independent variables on the response variables in the dotty plots; deal with an unlimited number of independent and response variables; consider combined multiple threshold criteria, which are subjective and depend on the target of the investigation; and provide a range of values with their distribution for the optimization. Copyright © 2018 Elsevier Ltd. All rights reserved.
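The GLUE screening loop described above can be sketched as follows: sample the feasible parameter ranges with a Latin hypercube, evaluate the response for each sample, and keep the "behavioural" parameter sets that meet the threshold criterion. The yield surface, bounds, and threshold below are made-up placeholders; the actual studies used published extraction data, not an analytic response function.

```python
import numpy as np

def latin_hypercube(n, bounds, rng):
    """n stratified samples over feasible ranges given as (lo, hi) pairs:
    one random permutation of strata per dimension, jittered within strata."""
    d = len(bounds)
    u = (np.argsort(rng.random((n, d)), axis=0) + rng.random((n, d))) / n
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

def glue_screen(model, bounds, threshold, n=5000, seed=0):
    """Keep the 'behavioural' parameter sets whose response meets the
    threshold; their spread gives the optimization range and distribution."""
    rng = np.random.default_rng(seed)
    X = latin_hypercube(n, bounds, rng)
    y = np.array([model(x) for x in X])
    keep = y >= threshold
    return X[keep], y[keep]

# Hypothetical yield surface over temperature (degC) and time (min)
def extraction_yield(x):
    T, t = x
    return 80.0 - 0.02 * (T - 55.0) ** 2 - 0.05 * (t - 30.0) ** 2

X_ok, y_ok = glue_screen(extraction_yield, [(40.0, 70.0), (10.0, 50.0)],
                         threshold=75.0)
```

Plotting each column of `X_ok` against `y_ok` reproduces the "dotty plots" the abstract mentions: the spread of behavioural samples along each axis shows how tightly (or loosely) each variable is constrained by the threshold.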
Korinsak, Siripar; Tangphatsornruang, Sithichoke; Pootakham, Wirulda; Wanchana, Samart; Plabpla, Anucha; Jantasuriyarat, Chatchawan; Patarapuwadol, Sujin; Vanavichit, Apichart; Toojinda, Theerayut
2018-05-15
Magnaporthe oryzae is a fungal pathogen causing blast disease in many plant species. In this study, seventy-three isolates of M. oryzae collected from rice (Oryza sativa) in 1996-2014 were genotyped using a genotyping-by-sequencing approach to detect genetic variation. An association study was performed to identify single nucleotide polymorphisms (SNPs) associated with virulence genes, using 831 selected SNPs and infection phenotypes on local and improved rice varieties. Population structure analysis revealed eight subpopulations. The division into eight groups was not related to the degree of virulence. Association mapping showed five SNPs associated with fungal virulence on chromosomes 1, 2, 3, 4 and 7. The SNP on chromosome 1 was associated with virulence against RD6-Pi7 and IRBL7-M, which might be linked to the previously reported AvrPi7. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
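The core of such an association scan can be sketched with a per-SNP test of allele versus infection phenotype under multiple-testing correction. This is a toy stand-in for the study's pipeline, using Fisher's exact test with a Bonferroni threshold; the isolate counts, SNP data, and the planted marker are all invented for illustration.

```python
import numpy as np
from scipy.stats import fisher_exact

def snp_virulence_scan(genotypes, virulent, alpha=0.05):
    """Fisher's exact test of each biallelic SNP (coded 0/1) against a
    binary virulence phenotype, keeping SNPs that pass a
    Bonferroni-corrected significance threshold."""
    n_snps = genotypes.shape[1]
    hits = []
    for j in range(n_snps):
        g = genotypes[:, j].astype(bool)
        table = [[int(np.sum(g & virulent)), int(np.sum(g & ~virulent))],
                 [int(np.sum(~g & virulent)), int(np.sum(~g & ~virulent))]]
        _, p = fisher_exact(table)
        if p < alpha / n_snps:          # Bonferroni correction
            hits.append((j, p))
    return hits

# 40 hypothetical isolates, 50 random SNPs, plus one SNP (index 25)
# planted to track virulence on a given variety perfectly
rng = np.random.default_rng(7)
virulent = np.arange(40) < 20
genotypes = rng.integers(0, 2, size=(40, 50))
genotypes[:, 25] = virulent.astype(int)
hits = snp_virulence_scan(genotypes, virulent)
```

In the real study the retained SNPs would then be inspected for linkage to known avirulence genes such as AvrPi7.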
The trajectory of scientific discovery: concept co-occurrence and converging semantic distance.
Cohen, Trevor; Schvaneveldt, Roger W
2010-01-01
The paradigm of literature-based knowledge discovery originated by Swanson involves finding meaningful associations between terms or concepts that have not occurred together in any previously published document. While several automated approaches have been applied to this problem, these generally evaluate the literature at a point in time, and do not evaluate the role of change over time in distributional statistics as an indicator of meaningful implicit associations. To address this issue, we develop and evaluate Symmetric Random Indexing (SRI), a novel variant of the Random Indexing (RI) approach that is able to measure implicit association over time. SRI is found to compare favorably to existing RI variants in the prediction of future direct co-occurrence. Summary statistics over several experiments suggest a trend of converging semantic distance prior to the co-occurrence of key terms for two seminal historical literature-based discoveries.
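A minimal random-indexing sketch shows how two terms that never co-occur directly can still acquire similar vectors through a shared neighbour, which is the implicit association the paradigm targets. This is a generic illustration, not the authors' SRI implementation; the symmetric twist here is simply that each co-occurrence updates both terms' context vectors, and the vocabulary (echoing Swanson's fish-oil/Raynaud's example) is invented.

```python
import numpy as np

def ternary_index_vector(dim, nnz, rng):
    """Sparse random 'index' vector: nnz/2 entries +1, nnz/2 entries -1."""
    v = np.zeros(dim)
    pos = rng.choice(dim, size=nnz, replace=False)
    v[pos[: nnz // 2]] = 1.0
    v[pos[nnz // 2:]] = -1.0
    return v

class SymmetricRandomIndexing:
    """Each observed co-occurrence adds each term's index vector to the
    other term's context vector, so terms sharing neighbours converge
    even if they never co-occur directly."""
    def __init__(self, vocab, dim=512, nnz=10, seed=42):
        rng = np.random.default_rng(seed)
        self.index = {w: ternary_index_vector(dim, nnz, rng) for w in vocab}
        self.context = {w: np.zeros(dim) for w in vocab}

    def observe(self, w1, w2):
        self.context[w1] += self.index[w2]
        self.context[w2] += self.index[w1]

    def similarity(self, w1, w2):
        a, b = self.context[w1], self.context[w2]
        na, nb = np.linalg.norm(a), np.linalg.norm(b)
        return 0.0 if na == 0.0 or nb == 0.0 else float(a @ b / (na * nb))

# Two terms linked only through a shared neighbour, never directly
sri = SymmetricRandomIndexing(["fish_oil", "blood_viscosity",
                               "raynauds", "unrelated", "other"])
sri.observe("fish_oil", "blood_viscosity")
sri.observe("raynauds", "blood_viscosity")
sri.observe("unrelated", "other")
```

Tracking `similarity("fish_oil", "raynauds")` as documents accumulate over time is the kind of converging-distance signal the paper evaluates as a precursor of future direct co-occurrence.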
NASA Astrophysics Data System (ADS)
Lenderink, Geert; Attema, Jisk
2015-08-01
Scenarios of future changes in small-scale precipitation extremes for the Netherlands are presented. These scenarios are based on a new approach whereby changes in precipitation extremes are set proportional to the change in water vapor amount near the surface, as measured by the 2 m dew point temperature. This simple scaling framework allows the integration of information derived from: (i) observations, (ii) a new, unprecedentedly large 16-member ensemble of simulations with the regional climate model RACMO2 driven by EC-Earth, and (iii) short-term integrations with the non-hydrostatic model Harmonie. Scaling constants are based on subjective weighting (expert judgement) of the three different information sources, taking into account previously published work. In all scenarios local precipitation extremes increase with warming, yet with broad uncertainty ranges expressing incomplete knowledge of how convective clouds and the atmospheric mesoscale circulation will react to climate change.
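The scaling framework amounts to a one-line transformation of a present-day extreme. The sketch below uses illustrative rate constants (a ~7 %/K Clausius-Clapeyron-like default, and a ~14 %/K "super-CC" variant sometimes reported for hourly extremes); the actual scenario constants in the paper came from the expert weighting of the three information sources.

```python
def scale_precip_extreme(p_now_mm, d_dewpoint_K, rate_per_K=0.07):
    """Scale a present-day precipitation extreme by a fixed fractional
    change per kelvin of 2 m dew point rise:
        P_future = P_now * (1 + rate)**dTd
    The rates here are illustrative, not the paper's scenario values."""
    return p_now_mm * (1.0 + rate_per_K) ** d_dewpoint_K

# e.g. a 20 mm/h extreme under 2 K of dew point warming
future = scale_precip_extreme(20.0, 2.0)               # CC-like rate
future_super = scale_precip_extreme(20.0, 2.0, 0.14)   # super-CC rate
```

The spread between the two rates for the same warming is one simple way to express the uncertainty range the scenarios carry.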
Integrative strategies to identify candidate genes in rodent models of human alcoholism.
Treadwell, Julie A
2006-01-01
The search for genes underlying alcohol-related behaviours in rodent models of human alcoholism has been ongoing for many years with only limited success. Recently, new strategies that integrate several of the traditional approaches have provided new insights into the molecular mechanisms underlying ethanol's actions in the brain. We have used alcohol-preferring C57BL/6J (B6) and alcohol-avoiding DBA/2J (D2) genetic strains of mice in an integrative strategy combining high-throughput gene expression screening, genetic segregation analysis, and mapping to previously published quantitative trait loci to uncover candidate genes for the ethanol-preference phenotype. In our study, 2 genes, retinaldehyde binding protein 1 (Rlbp1) and syntaxin 12 (Stx12), were found to be strong candidates for ethanol preference. Such experimental approaches have the power and the potential to greatly speed up the laborious process of identifying candidate genes for the animal models of human alcoholism.
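The intersection step of such an integrative strategy can be sketched as filtering differentially expressed genes by whether they fall inside previously published QTL support intervals. The logic below is generic; the gene coordinates and QTL intervals are invented for illustration (only the gene names Rlbp1 and Stx12 come from the abstract).

```python
def qtl_candidates(de_genes, qtl_intervals):
    """Intersect differentially expressed genes with behavioural QTL:
    keep genes whose genomic position falls inside any QTL interval.
    Coordinates below are hypothetical, for illustration only."""
    hits = []
    for gene, (chrom, pos_mb) in de_genes.items():
        if any(c == chrom and start <= pos_mb <= end
               for c, start, end in qtl_intervals):
            hits.append(gene)
    return hits

# Strain-divergent (B6 vs D2) expression hits: gene -> (chromosome, Mb)
de_genes = {"Rlbp1": ("7", 79.3), "Stx12": ("4", 132.8),
            "GeneX": ("11", 20.1)}
# Hypothetical ethanol-preference QTL intervals: (chromosome, start_Mb, end_Mb)
qtl_intervals = [("4", 120.0, 150.0), ("7", 70.0, 90.0)]
candidates = qtl_candidates(de_genes, qtl_intervals)
```

Requiring a gene to be both differentially expressed between strains and positioned under a behavioural QTL is what narrows hundreds of expression hits down to a handful of candidates.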
Chan, Edmond S; Cummings, Carl; Atkinson, Adelle; Chad, Zave; Francoeur, Marie-Josée; Kirste, Linda; Mack, Douglas; Primeau, Marie-Noël; Vander Leek, Timothy K; Watson, Wade Ta
2014-01-01
Allergic conditions in children are a prevalent health concern in Canada. The burden of disease and the societal costs of proper diagnosis and management are considerable, making the primary prevention of allergic conditions a desirable health care objective. This position statement reviews current evidence on dietary exposures and allergy prevention in infants at high risk of developing allergic conditions. It revisits previous dietary recommendations for pregnancy, breastfeeding and formula-feeding, and provides an approach for introducing solid foods to high-risk infants. While there is no evidence that delaying the introduction of any specific food beyond six months of age helps to prevent allergy, the protective effect of early introduction of potentially allergenic foods (at four to six months) remains under investigation. Recent research appears to suggest that regularly ingesting a new, potentially allergenic food may be as important as when that food is first introduced. This article has already been published (Paediatr Child Health. 2013 Dec;18(10):545-54), and is being re-published with permission from the original publisher, the Canadian Paediatric Society.
Cai, Xinjie; Yang, Fang; Yan, Xiangzhen; Yang, Wanxun; Yu, Na; Oortgiesen, Daniel A W; Wang, Yining; Jansen, John A; Walboomers, X Frank
2015-04-01
The implantation of bone marrow-derived mesenchymal stem cells (MSCs) has previously been shown successful to achieve periodontal regeneration. However, the preferred pre-implantation differentiation strategy (e.g. maintenance of stemness, osteogenic or chondrogenic induction) to obtain optimal periodontal regeneration is still unknown. This in vivo study explored which differentiation approach is most suitable for periodontal regeneration. Mesenchymal stem cells were obtained from Fischer rats and seeded onto poly(lactic-co-glycolic acid)/poly(ɛ-caprolactone) electrospun scaffolds, and then pre-cultured under different in vitro conditions: (i) retention of multilineage differentiation potential; (ii) osteogenic differentiation approach; and (iii) chondrogenic differentiation approach. Subsequently, the cell-scaffold constructs were implanted into experimental periodontal defects of Fischer rats, with empty scaffolds as controls. After 6 weeks of implantation, histomorphometrical analyses were applied to evaluate the regenerated periodontal tissues. The chondrogenic differentiation approach showed regeneration of alveolar bone and ligament tissues. The retention of multilineage differentiation potential supported only ligament regeneration, while the osteogenic differentiation approach boosted alveolar bone regeneration. Chondrogenic differentiation of MSCs before implantation is a useful strategy for regeneration of alveolar bone and periodontal ligament, in the currently used rat model. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Optimized formulas for the gravitational field of a tesseroid
NASA Astrophysics Data System (ADS)
Grombein, Thomas; Seitz, Kurt; Heck, Bernhard
2013-07-01
Various tasks in geodesy, geophysics, and related geosciences require precise information on the impact of mass distributions on gravity field-related quantities, such as the gravitational potential and its partial derivatives. Using forward modeling based on Newton's integral, mass distributions are generally decomposed into regular elementary bodies. In classical approaches, prisms or point mass approximations are mostly utilized. Considering the effect of the sphericity of the Earth, alternative mass modeling methods based on tesseroid bodies (spherical prisms) should be taken into account, particularly in regional and global applications. Expressions for the gravitational field of a point mass are relatively simple when formulated in Cartesian coordinates. In the case of integrating over a tesseroid volume bounded by geocentric spherical coordinates, it will be shown that it is also beneficial to represent the integral kernel in terms of Cartesian coordinates. This considerably simplifies the determination of the tesseroid's potential derivatives in comparison with previously published methodologies that make use of integral kernels expressed in spherical coordinates. Based on this idea, optimized formulas for the gravitational potential of a homogeneous tesseroid and its derivatives up to second-order are elaborated in this paper. These new formulas do not suffer from the polar singularity of the spherical coordinate system and can, therefore, be evaluated for any position on the globe. Since integrals over tesseroid volumes cannot be solved analytically, the numerical evaluation is achieved by means of expanding the integral kernel in a Taylor series with fourth-order error in the spatial coordinates of the integration point. As the structure of the Cartesian integral kernel is substantially simplified, Taylor coefficients can be represented in a compact and computationally attractive form. 
Thus, the use of the optimized tesseroid formulas reduces computation time significantly, by about 45% compared with previously used algorithms. In order to show the computational efficiency and to validate the mathematical derivations, the new tesseroid formulas are applied to two realistic numerical experiments and are compared to previously published tesseroid methods and the conventional prism approach.
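The core idea of the abstract — evaluate the Newton kernel in Cartesian coordinates while integrating over the tesseroid's spherical bounds — can be illustrated with a brute-force quadrature. The sketch below is a hypothetical Python illustration, not the paper's optimized Taylor-series formulas: it uses a simple midpoint rule, and the density, grid resolution, and test geometry are all assumptions.

```python
import math

G = 6.674e-11  # gravitational constant [m^3 kg^-1 s^-2]

def sph_to_cart(r, lat, lon):
    """Geocentric spherical coordinates -> Cartesian coordinates."""
    return (r * math.cos(lat) * math.cos(lon),
            r * math.cos(lat) * math.sin(lon),
            r * math.sin(lat))

def tesseroid_potential(P, r1, r2, lat1, lat2, lon1, lon2,
                        rho=2670.0, n=8):
    """Midpoint-rule quadrature of Newton's integral over a tesseroid.

    P is the computation point in Cartesian coordinates. The kernel 1/l
    is evaluated in Cartesian coordinates (as advocated in the abstract),
    while the volume element r^2 cos(lat) dr dlat dlon keeps the
    spherical integration bounds of the tesseroid. rho and n are
    illustrative assumptions, not values from the paper.
    """
    dr = (r2 - r1) / n
    dlat = (lat2 - lat1) / n
    dlon = (lon2 - lon1) / n
    V = 0.0
    for i in range(n):
        r = r1 + (i + 0.5) * dr
        for j in range(n):
            lat = lat1 + (j + 0.5) * dlat
            for k in range(n):
                lon = lon1 + (k + 0.5) * dlon
                Q = sph_to_cart(r, lat, lon)
                l = math.dist(P, Q)  # Euclidean distance, Cartesian kernel
                V += r * r * math.cos(lat) / l * dr * dlat * dlon
    return G * rho * V

# Potential 10 km above the centre of a 1 deg x 1 deg, 20 km thick tesseroid.
R = 6371e3
P = sph_to_cart(R + 10e3, 0.0, 0.0)
V = tesseroid_potential(P, R - 10e3, R,
                        math.radians(-0.5), math.radians(0.5),
                        math.radians(-0.5), math.radians(0.5))
```

Because the Cartesian kernel has no latitude-dependent trigonometric factors beyond the volume element, it is also free of the polar singularity mentioned in the abstract; the optimized formulas replace this costly triple loop with compact Taylor coefficients.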
Systematic review of non-surgical therapies for osteoarthritis of the hand: an update.
Lue, S; Koppikar, S; Shaikh, K; Mahendira, D; Towheed, T E
2017-09-01
To update our earlier systematic reviews which evaluated all published randomized controlled trials (RCTs) evaluating pharmacological and non-pharmacological therapies in patients with hand osteoarthritis (OA). Surgical therapies were not evaluated. RCTs published between March 2008 and December 2015 were added to the previous systematic reviews. A total of 95 RCTs evaluating various pharmacological and non-pharmacological therapies in hand OA were analyzed in this update. Generally, the methodological quality of these RCTs has improved since the last update, with more studies describing their methods for randomization, blinding, and allocation concealment. However, RCTs continue to be weakened by a lack of consistent case definition and a lack of standardized outcome assessments specific to hand OA. The number and location of evaluated hand joints continue to be underreported, and only 25% of RCTs adequately described the method used to ensure allocation concealment. These remain major weaknesses of published RCTs. A meta-analysis could not be performed because of marked study heterogeneity, insufficient statistical data available in the published RCTs, and a small number of identical comparators. Hand OA is a complex area in which to study the efficacy of therapies. There has been an improvement in the overall design and conduct of RCTs; however, additional large RCTs with a more robust methodological approach specific to hand OA are needed in order to make clinically relevant conclusions about the efficacy of the diverse treatment options available. Copyright © 2017 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
Oxidation Mechanisms of Toluene and Benzene
NASA Technical Reports Server (NTRS)
Bittker, David A.
1995-01-01
An expanded and improved version of a previously published benzene oxidation mechanism is presented and shown to model published experimental data fairly successfully. This benzene submodel is coupled to a modified version of a toluene oxidation submodel from the recent literature. This complete mechanism is shown to successfully model published experimental toluene oxidation data for a highly mixed flow reactor and for higher temperature ignition delay times in a shock tube. A comprehensive sensitivity analysis showing the most important reactions is presented for both the benzene and toluene reacting systems. The NASA Lewis toluene mechanism's modeling capability is found to be equivalent to that of the previously published mechanism which contains a somewhat different benzene submodel.
Meyerson, Paul; Tryon, Warren W
2003-11-01
This study evaluated the psychometric equivalency of Web-based research. The Sexual Boredom Scale was presented via the World-Wide Web along with five additional scales used to validate it. A subset of 533 participants that matched a previously published sample (Watt & Ewing, 1996) on age, gender, and race was identified. An 8 x 8 correlation matrix from the matched Internet sample was compared via structural equation modeling with a similar 8 x 8 correlation matrix from the previously published study. The Internet and previously published samples were psychometrically equivalent. Coefficient alpha values calculated on the matched Internet sample yielded reliability coefficients almost identical to those for the previously published sample. Factors such as computer administration and uncontrollable administration settings did not appear to affect the results. Demographic data indicated an overrepresentation of males by about 6% and Caucasians by about 13% relative to the U.S. Census (2000). A total of 2,230 participants were obtained in about 8 months without remuneration. These results suggest that data collection on the Web is (1) reliable, (2) valid, (3) reasonably representative, (4) cost effective, and (5) efficient.
Validation of cryo-EM structure of IP₃R1 channel.
Murray, Stephen C; Flanagan, John; Popova, Olga B; Chiu, Wah; Ludtke, Steven J; Serysheva, Irina I
2013-06-04
About a decade ago, three electron cryomicroscopy (cryo-EM) single-particle reconstructions of IP3R1 were reported at low resolution. It was disturbing that these structures bore little similarity to one another, even at the level of quaternary structure. Recently, we published an improved structure of IP3R1 at ∼1 nm resolution. However, this structure did not bear any resemblance to any of the three previously published structures, leading to the question of why the structure should be considered more reliable than the original three. Here, we apply several methods, including class-average/map comparisons, tilt-pair validation, and use of multiple refinement software packages, to give strong evidence for the reliability of our recent structure. The map resolution and feature resolvability are assessed with the gold standard criterion. This approach is generally applicable to assessing the validity of cryo-EM maps of other molecular machines. Copyright © 2013 Elsevier Ltd. All rights reserved.
A novel approach to quantify cybersecurity for electric power systems
NASA Astrophysics Data System (ADS)
Kaster, Paul R., Jr.
Electric Power grid cybersecurity is a topic gaining increased attention in academia, industry, and government circles, yet a method of quantifying and evaluating a system's security is not yet commonly accepted. In order to be useful, a quantification scheme must be able to accurately reflect the degree to which a system is secure, simply determine the level of security in a system using real-world values, model a wide variety of attacker capabilities, be useful for planning and evaluation, allow a system owner to publish information without compromising the security of the system, and compare relative levels of security between systems. Published attempts at quantifying cybersecurity fail at one or more of these criteria. This document proposes a new method of quantifying cybersecurity that meets those objectives. This dissertation evaluates the current state of cybersecurity research, discusses the criteria mentioned previously, proposes a new quantification scheme, presents an innovative method of modeling cyber attacks, demonstrates that the proposed quantification methodology meets the evaluation criteria, and proposes a line of research for future efforts.
Takata, Atsushi; Miyake, Noriko; Tsurusaki, Yoshinori; Fukai, Ryoko; Miyatake, Satoko; Koshimizu, Eriko; Kushima, Itaru; Okada, Takashi; Morikawa, Mako; Uno, Yota; Ishizuka, Kanako; Nakamura, Kazuhiko; Tsujii, Masatsugu; Yoshikawa, Takeo; Toyota, Tomoko; Okamoto, Nobuhiko; Hiraki, Yoko; Hashimoto, Ryota; Yasuda, Yuka; Saitoh, Shinji; Ohashi, Kei; Sakai, Yasunari; Ohga, Shouichi; Hara, Toshiro; Kato, Mitsuhiro; Nakamura, Kazuyuki; Ito, Aiko; Seiwa, Chizuru; Shirahata, Emi; Osaka, Hitoshi; Matsumoto, Ayumi; Takeshita, Saoko; Tohyama, Jun; Saikusa, Tomoko; Matsuishi, Toyojiro; Nakamura, Takumi; Tsuboi, Takashi; Kato, Tadafumi; Suzuki, Toshifumi; Saitsu, Hirotomo; Nakashima, Mitsuko; Mizuguchi, Takeshi; Tanaka, Fumiaki; Mori, Norio; Ozaki, Norio; Matsumoto, Naomichi
2018-01-16
Recent studies have established important roles of de novo mutations (DNMs) in autism spectrum disorders (ASDs). Here, we analyze DNMs in 262 ASD probands of Japanese origin and confirm the "de novo paradigm" of ASDs across ethnicities. Based on this consistency, we combine the lists of damaging DNMs in our and published ASD cohorts (total number of trios, 4,244) and perform integrative bioinformatics analyses. Besides replicating the findings of previous studies, our analyses highlight ATP-binding genes and fetal cerebellar/striatal circuits. Analysis of individual genes identified 61 genes enriched for damaging DNMs, including ten genes for which our dataset now contributes to statistical significance. Screening of compounds altering the expression of genes hit by damaging DNMs reveals a global downregulating effect of valproic acid, a known risk factor for ASDs, whereas cardiac glycosides upregulate these genes. Collectively, our integrative approach provides deeper biological and potential medical insights into ASDs. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
On-column trypsinization allows for re-use of matrix in modified multiplexed inhibitor beads assay.
Petrovic, Voin; Olaisen, Camilla; Sharma, Animesh; Nepal, Anala; Bugge, Steffen; Sundby, Eirik; Hoff, Bård Helge; Slupphaug, Geir; Otterlei, Marit
2017-04-15
The Multiplexed Inhibitor Bead (MIB) assay is a previously published quantitative proteomic MS-based approach to study cellular kinomes. A rather extensive procedure, the need for multiple custom-made kinase inhibitors, and the inability to re-use the MIB-columns have limited its applicability. Here we present a modified MIB assay in which elution of bound proteins is facilitated by on-column trypsinization. We tested the modified MIB assay by analyzing extracts from three human cancer cell lines treated with the cytotoxic drugs cisplatin or docetaxel. Using only three immobilized kinase inhibitors, we were able to detect about 6000 proteins, including ∼40% of the kinome, as well as other signaling, metabolic and structural proteins. The method is reproducible and the MIB-columns are re-usable without loss of performance. This makes the MIB assay a simple, affordable, and rapid assay for monitoring changes in cellular signaling. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
Clark, Maria T; Clark, Richard J; Toohey, Shane; Bradbury-Jones, Caroline
2017-03-01
Acupuncture shows promise as a treatment for plantar heel pain (PHP) or plantar fasciitis (PF), but data heterogeneity has undermined demonstration of efficacy. Recognising that acupuncture is a diverse field of practice, the aim of this study was to gain a broader, global perspective on the different approaches and rationales used in the application of acupuncture in PHP. We built upon an earlier systematic review (which was limited by the necessity of a methodological focus on efficacy) using the critical interpretive synthesis (CIS) method to draw upon a wider international sample of 25 clinical sources, including case reports and case series. Multiple tracks of analysis led to an emergent synthesis. Findings are presented at three levels: primary (summarised data); secondary (patterns observed); and tertiary (emergent synthesis). Multiple treatments and rationales were documented but no single approach dominated. Notable contradictions emerged such as the application of moxibustion by some authors and ice by others. Synthesis of findings revealed a 'patchwork' of factors influencing the approaches taken. The complexity of the field of acupuncture was illustrated through the 'lens' of PHP. The 'patchwork' metaphor provides a unifying framework for a previously divergent community of practice and research. Several directions for future research were identified, such as: importance of prior duration; existence of diagnostic subgroups; and how practitioners make clinical decisions and report their findings. CIS was found to provide visibility for multiple viewpoints in developing theory and modelling the processes of 'real world' practice by acupuncturists addressing the problem of PHP. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Blepharoptosis surgery in patients with myasthenia gravis.
Litwin, Andre S; Patel, Bhupendra; McNab, Alan A; McCann, John D; Leatherbarrow, Brian; Malhotra, Raman
2015-07-01
To review our approach of cautious surgical correction of blepharoptosis in patients with myasthenia gravis (MG) to minimise risk of exposure complications. Retrospective case note review of 30 patients with symptomatic eyelid concerns despite appropriate medical treatment, who underwent eyelid surgery. The mean age at diagnosis was 47 years. 13/30 patients had systemic MG, 14/30 ocular MG and 3/30 congenital MG. The main outcome measures were improvement in eyelid height and/or position, duration of a successful postoperative result, need for further surgical intervention, and intraoperative or postoperative complications. 38 blepharoptosis procedures were performed on 23 patients. Mean age at time of surgery was 62 years, with an average follow-up of 29 months. 10 patients (16 eyelids) underwent anterior approach levator advancement, 4 patients (5 eyelids) posterior approach surgery and 8 patients (15 eyelids) brow suspension. One patient (2 eyelids) had tarsal switch surgery. An average improvement in eyelid height of 1.9 mm was achieved. Postoperative symptoms or signs of exposure keratopathy occurred in 17% of patients. This necessitated lid lowering in one eyelid of one patient. During follow-up, 37% of eyelids required further surgical intervention to improve the upper eyelid height, after an average of 19 months (range 0.5-49 months). Over a third of patients in our series required repeat surgery, which would be expected when the initial aim was to under-correct this group. In contrast to previous commentaries, the amount of eyelid excursion was not the main factor used to guide the surgical approach. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Dietrich, Stefan; Floegel, Anna; Troll, Martina; Kühn, Tilman; Rathmann, Wolfgang; Peters, Anette; Sookthai, Disorn; von Bergen, Martin; Kaaks, Rudolf; Adamski, Jerzy; Prehn, Cornelia; Boeing, Heiner; Schulze, Matthias B; Illig, Thomas; Pischon, Tobias; Knüppel, Sven; Wang-Sattler, Rui; Drogan, Dagmar
2016-10-01
The application of metabolomics in prospective cohort studies is statistically challenging. Given the importance of appropriate statistical methods for selection of disease-associated metabolites in highly correlated complex data, we combined random survival forest (RSF) with an automated backward elimination procedure that addresses such issues. Our RSF approach was illustrated with data from the European Prospective Investigation into Cancer and Nutrition (EPIC)-Potsdam study, with concentrations of 127 serum metabolites as exposure variables and time to development of type 2 diabetes mellitus (T2D) as outcome variable. An analysis of this data set using Cox regression with a stepwise selection method was recently published. The methodological comparison (RSF versus Cox regression) was replicated in two independent cohorts. Finally, the R-code for implementing the metabolite selection procedure into the RSF-syntax is provided. The application of the RSF approach in EPIC-Potsdam resulted in the identification of 16 incident T2D-associated metabolites which slightly improved prediction of T2D when used in addition to traditional T2D risk factors and also when used together with classical biomarkers. The identified metabolites partly agreed with previous findings using Cox regression, though RSF selected a higher number of highly correlated metabolites. The RSF method appeared to be a promising approach for identification of disease-associated variables in complex data with time to event as outcome. The demonstrated RSF approach provides comparable findings to the generally used Cox regression, but also addresses the problem of multicollinearity and is suitable for high-dimensional data. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
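The authors provide R code for their selection procedure; as a language-neutral illustration, the Python sketch below shows the general shape of such an automated backward elimination loop. The importance function here is a toy stand-in for refitting a random survival forest and reading off its variable importances at every step; the drop fraction, variable names, and synthetic "signal" values are all assumptions for the demonstration.

```python
import random

def backward_eliminate(features, importance_fn, drop_frac=0.2, min_keep=2):
    """Iteratively drop the least important fraction of variables.

    importance_fn(features) must return {name: importance}; in the
    published workflow this role is played by the variable importance
    of a random survival forest refit on the remaining variables.
    """
    current = list(features)
    while len(current) > min_keep:
        imp = importance_fn(current)
        n_drop = max(1, int(len(current) * drop_frac))
        ranked = sorted(current, key=lambda f: imp[f])  # ascending importance
        candidates = ranked[n_drop:]                    # keep the rest
        if len(candidates) < min_keep:
            break
        current = candidates
    return current

# Toy stand-in for RSF importance: metabolites m1..m3 carry signal,
# the remaining variables only contribute noise.
random.seed(0)
signal = {"m1": 3.0, "m2": 2.5, "m3": 2.0}

def toy_importance(feats):
    return {f: signal.get(f, 0.0) + random.gauss(0, 0.1) for f in feats}

selected = backward_eliminate([f"m{i}" for i in range(1, 11)],
                              toy_importance, min_keep=3)
```

With correlated predictors, the refit-at-every-step structure is what lets the forest redistribute importance among surviving variables, which is the multicollinearity advantage the abstract highlights over stepwise Cox selection.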
NASA Astrophysics Data System (ADS)
Sauppe, Sebastian; Hahn, Andreas; Brehm, Marcus; Paysan, Pascal; Seghers, Dieter; Kachelrieß, Marc
2016-03-01
We propose an adapted method of our previously published five-dimensional (5D) motion compensation (MoCo) algorithm, developed for micro-CT imaging of small animals, to provide for the first time motion artifact-free 5D cone-beam CT (CBCT) images from a conventional flat detector-based CBCT scan of clinical patients. Image quality of retrospectively respiratory- and cardiac-gated volumes from flat detector CBCT scans is deteriorated by severe sparse-projection artifacts. These artifacts further complicate motion estimation, as it is required for MoCo image reconstruction. For high-quality 5D CBCT images at the same x-ray dose and the same number of projections as today's 3D CBCT, we developed a double MoCo approach based on motion vector fields (MVFs) for respiratory and cardiac motion. In a first step, our already published four-dimensional (4D) artifact-specific cyclic motion-compensation (acMoCo) approach is applied to compensate for the respiratory patient motion. With this information, a cyclic phase-gated deformable heart registration algorithm is applied to the respiratory motion-compensated 4D CBCT data, thus resulting in cardiac MVFs. We apply these MVFs to double-gated images, and thereby respiratory and cardiac motion-compensated 5D CBCT images are obtained. Our 5D MoCo approach was applied to patient data acquired with the TrueBeam 4D CBCT system (Varian Medical Systems). Our double MoCo approach turned out to be very efficient and removed nearly all streak artifacts, as it makes use of 100% of the projection data for each reconstructed frame. The 5D MoCo patient data show fine details and no motion blurring, even in regions close to the heart where motion is fastest.
Third-order elastic constants of diamond determined from experimental data
Winey, J. M.; Hmiel, A.; Gupta, Y. M.
2016-06-01
The pressure derivatives of the second-order elastic constants (SOECs) of diamond were determined by analyzing previous sound velocity measurements under hydrostatic stress [McSkimin and Andreatch, J. Appl. Phys. 43, 294 (1972)]. Our analysis also corrects an error in the previously reported results. Using the corrected pressure derivatives, together with published data for the nonlinear elastic response of shock-compressed diamond [Lang and Gupta, Phys. Rev. Lett. 106, 125502 (2011)], we present a complete and corrected set of third-order elastic constants (TOECs), which differs significantly from previously published TOECs.
Trans-catheter aortic valve implantation after previous aortic homograft surgery.
Drews, Thorsten; Pasic, Miralem; Buz, Semih; Unbehaun, Axel
2011-12-01
In patients with previous heart surgery, the operative risk is elevated during conventional aortic valve re-operations. Trans-catheter aortic valve implantation is a new method for the treatment of high-risk patients. Nevertheless, this new procedure carries potential risks in patients with previous homograft implantation in aortic position. Between April 2008 and February 2011, 345 consecutive patients (mean EuroSCORE (European System for Cardiac Operative Risk Evaluation): 38 ± 20%; mean Society of Thoracic Surgeons (STS) Mortality Score: 19 ± 16%; mean age: 80 ± 8 years; 111 men and 234 women) underwent trans-apical aortic valve implantation. In three patients, previous aortic homograft implantation had been performed. Homograft degeneration causing combined valve stenosis and incompetence made re-operation necessary. In all three patients, the aortic valve could be implanted using the trans-apical approach, and the procedure was successful. In two patients, there was slight paravalvular leakage of the aortic prosthesis and the other patient had slight central leakage. Neither ostium obstruction nor mitral valve damage was observed. Trans-catheter valve implantation can be performed successfully after previous homograft implantation. Particular care should be taken to achieve optimal valve positioning, not to obstruct the ostium of the coronary vessels due to the changed anatomic situation and not to cause annulus rupture. Copyright © 2011 European Association for Cardio-Thoracic Surgery. Published by Elsevier B.V. All rights reserved.
Brandon, Catherine; Jamadar, David; Girish, Gandikota; Dong, Qian; Morag, Yoav; Mullan, Patricia
2015-04-01
Publishing is critical for academic medicine career advancement. Rejection of manuscripts can be demoralizing. Obstacles faced by clinical faculty may include lack of time, confidence, and optimal writing practices. This study describes the development and evaluation of a peer-writing group, informed by theory and research on faculty development and writing. Five clinical-track radiology faculty members formed a "Writers' Circle" to promote scholarly productivity and reflection on writing practices. Members decided to work with previously rejected manuscripts. After members' initial meeting, interactions were informal, face to face during clinical work, and online. After the first 6 months, an anonymous survey asked members about the status of articles and evaluations of the writing group. Ten previously rejected articles, at least one from each member, were submitted to the Circle. In 6 months, four manuscripts were accepted for publication, five were in active revision, and one was withdrawn. All participants (100%) characterized the program as worth their time, increasing their motivation to write, their opportunities to support scholarly productivity of colleagues, and their confidence in generating scholarship. Peer-support writing groups can facilitate the pooling of expertise and the exchange of recommended writing practices. Our peer-support group increased scholarly productivity and provided a collegial approach to academic writing. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
Spontaneous bilateral fracture of patella.
Moretti, Biagio; Speciale, Domenico; Garofalo, Raffaele; Moretti, Lorenzo; Patella, Silvio; Patella, Vittorio
2008-03-01
Bilateral patellae fractures represent a rare entity, accounting for approximately 2.9% of all fractures involving this anatomical region. In most published cases, they are described as stress fractures or as complications of chronic diseases such as osteoporosis, renal failure and secondary hyperparathyroidism. Although many pathogenetic mechanisms have been proposed, none has been proven conclusively. Insufficiency fractures of the patellae are rare events, and no data have been published on their incidence. We present a case of bilateral fracture of the patellae due to an indirect trauma occurring in an 85-year-old patient affected by Parkinson's disease, osteoporosis and diffuse degenerative osteoarthritis. X-ray of the knees (anteroposterior and lateral) and magnetic resonance imaging evaluation confirmed the fractures. The patient was treated conservatively. She had a good result, returning to her previous autonomous ambulation. This case is unusual because of its bilaterality and the absence of direct trauma to the knees, but it confirms previous observations about insufficiency fractures of the patellae in the presence of comorbidity. Insufficiency fractures of the patellae can be an insidious condition in elderly people. Prepatellar pain, a common symptom in the relapse phase of degenerative arthritis of the knee, should not be underestimated, particularly in patients with diseases influencing bone metabolism and with an elevated risk of falls. Periodic clinical and instrumental follow-up should be performed in these patients. Moreover, we underline the necessity of a multidisciplinary approach.
Creating peer groups for assessing and comparing nursing home performance.
Byrne, Margaret M; Daw, Christina; Pietz, Ken; Reis, Brian; Petersen, Laura A
2013-11-01
Publicly reported performance data for hospitals and nursing homes are becoming ubiquitous. For such comparisons to be fair, facilities must be compared with their peers. To adapt a previously published methodology for developing hospital peer groupings so that it is applicable to nursing homes and to explore the characteristics of "nearest-neighbor" peer groupings. Analysis of Department of Veterans Affairs administrative databases and nursing home facility characteristics. The nearest-neighbor methodology for developing peer groupings involves calculating the Euclidean distance between facilities based on facility characteristics. We describe our steps in selection of facility characteristics, describe the characteristics of nearest-neighbor peer groups, and compare them with peer groups derived through classical cluster analysis. The facility characteristics most pertinent to nursing home groupings were found to be different from those that were most relevant for hospitals. Unlike classical cluster groups, nearest neighbor groups are not mutually exclusive, and the nearest-neighbor methodology resulted in nursing home peer groupings that were substantially less diffuse than nursing home peer groups created using traditional cluster analysis. It is essential that healthcare policy makers and administrators have a means of fairly grouping facilities for the purposes of quality, cost, or efficiency comparisons. In this research, we show that a previously published methodology can be successfully applied to a nursing home setting. The same approach could be applied in other clinical settings such as primary care.
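The nearest-neighbor grouping described above can be sketched in a few lines, assuming facility characteristics are pre-standardized so that Euclidean distance weights each one equally. The facility names, characteristics, and group size below are invented for illustration; the study's actual characteristic selection differs between hospitals and nursing homes, as noted.

```python
import math

def nearest_neighbor_peers(facilities, k=2):
    """Peer group of each facility: its k nearest neighbors by Euclidean distance.

    facilities: {name: tuple of standardized characteristics}.
    Unlike classical cluster analysis, the resulting groups are not
    mutually exclusive: a facility can appear in several peer groups,
    and each facility gets a group tailored to itself.
    """
    peers = {}
    for name, vec in facilities.items():
        others = [(math.dist(vec, v), other)
                  for other, v in facilities.items() if other != name]
        others.sort()  # closest first
        peers[name] = [other for _, other in others[:k]]
    return peers

# Toy example: two standardized characteristics (e.g. size, case mix).
homes = {"A": (0.1, 0.2), "B": (0.2, 0.1),
         "C": (1.5, 1.4), "D": (1.4, 1.6)}
groups = nearest_neighbor_peers(homes, k=1)
```

Because every facility is compared only against its own nearest neighbors, the groups are less diffuse than mutually exclusive clusters, which is the property the abstract reports for nursing homes.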
Trends in treatment and outcomes of pediatric craniopharyngioma, 1975-2011.
Cohen, Michal; Bartels, Ute; Branson, Helen; Kulkarni, Abhaya V; Hamilton, Jill
2013-06-01
Craniopharyngioma tumors and their treatment can lead to significant long-term morbidity due to their proximity to vital structures. The optimal treatment has been debated for many years. We aimed to review the long-term outcomes of children treated for craniopharyngioma in our institution over the past decade and describe trends in treatment and outcomes over the past 3 decades. Charts of children with craniopharyngioma treated and followed at The Hospital for Sick Children between 2001 and 2011 were reviewed. Data regarding findings at diagnosis, treatment, and long-term outcomes were analyzed. Comparison was made with previously published data from our institution. Data from 33 patients are included; mean age at treatment, 10.7 ± 4.8 years. In 18 children (55%), the initial surgical approach was tumor cyst decompression with or without adjuvant therapy, compared with only 0-2% in the preceding decades (P < .01). Diabetes insipidus occurred in 55% of children and panhypopituitarism in 58% compared with 88% (P < .01) and 86% (P < .01), respectively, in the previous 10 years. Overall, there was a 36% reduction in the number of children who developed severe obesity compared with the preceding decade. Body mass index at follow-up was associated with body mass index at diagnosis (P = .004) and tumor resection as an initial treatment approach (P = .028). A shift in surgical treatment approach away from gross total resection has led to improved endocrine outcomes. This may have beneficial implications for quality of life in survivors.
NASA Astrophysics Data System (ADS)
Al-Shudeifat, Mohammad A.; Butcher, Eric A.
2011-01-01
The actual breathing mechanism of the transverse breathing crack in the cracked rotor system that appears due to the shaft weight is addressed here. As a result, the correct time-varying area moments of inertia for the cracked element cross-section during shaft rotation are also determined. Hence, two new breathing functions are identified to represent the actual breathing effect on the cracked element stiffness matrix. The new breathing functions are used in formulating the time-varying finite element stiffness matrix of the cracked element. The finite element equations of motion are then formulated for the cracked rotor system and solved via the harmonic balance method for the response, whirl orbits, and the shift in the critical and subcritical speeds. The analytical results of this approach are compared with some previously published results obtained using approximate formulas for the breathing mechanism. The comparison shows that the previously used breathing function is a weak model for the breathing mechanism in the cracked rotor even for small crack depths. The new breathing functions give more accurate results for the dynamic behavior of the cracked rotor system for a wide range of the crack depths. The current approach is found to be efficient for crack detection since the critical and subcritical shaft speeds, the unique vibration signature in the neighborhood of the subcritical speeds and the sensitivity to the unbalance force direction all together can be utilized to detect the breathing crack before further damage occurs.
Scheduling, revenue management, and fairness in an academic-hospital radiology division.
Baum, Richard; Bertsimas, Dimitris; Kallus, Nathan
2014-10-01
Physician staff of academic hospitals today practice in several geographic locations including their main hospital. This is referred to as the extended campus. With extended campuses expanding, the growing complexity of a single division's schedule means that a naive approach to scheduling compromises revenue. Moreover, it may provide an unfair allocation of individual revenue, desirable or burdensome assignments, and the extent to which the preferences of each individual are met. This has adverse consequences on incentivization and employee satisfaction and is simply against business policy. We identify the daily scheduling of physicians in this context as an operational problem that incorporates scheduling, revenue management, and fairness. Noting previous success of operations research and optimization in each of these disciplines, we propose a simple unified optimization formulation of this scheduling problem using mixed-integer optimization. Through a study of implementing the approach at the Division of Angiography and Interventional Radiology at the Brigham and Women's Hospital, which is directed by one of the authors, we exemplify the flexibility of the model to adapt to specific applications, the tractability of solving the model in practical settings, and the significant impact of the approach, most notably in increasing revenue by 8.2% over previous operating revenue while adhering strictly to codified fairness and objectivity. We found that the investment in implementing such a system is far outweighed by the large potential revenue increase and the other benefits outlined. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
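The revenue-versus-fairness trade-off at the heart of such a formulation can be shown on a toy instance. The sketch below enumerates assignments exhaustively rather than calling a mixed-integer solver (the paper uses mixed-integer optimization); the physicians, locations, revenue figures, and the particular fairness term (penalizing the spread of individual revenues) are invented for illustration.

```python
from itertools import permutations

def schedule(physicians, locations, revenue, fairness_weight=1.0):
    """Exhaustive-search stand-in for a mixed-integer scheduling model.

    revenue[(p, loc)] is the expected revenue when physician p covers
    location loc for the day. The objective trades total revenue against
    the spread of individual revenues, a simple proxy for fairness.
    """
    best, best_obj = None, float("-inf")
    for assignment in permutations(locations, len(physicians)):
        individual = [revenue[(p, loc)]
                      for p, loc in zip(physicians, assignment)]
        obj = (sum(individual)
               - fairness_weight * (max(individual) - min(individual)))
        if obj > best_obj:
            best_obj, best = obj, dict(zip(physicians, assignment))
    return best, best_obj

# Hypothetical two-physician, two-location day.
rev = {("P1", "main"): 9, ("P1", "satellite"): 4,
       ("P2", "main"): 8, ("P2", "satellite"): 7}
assignment, obj = schedule(["P1", "P2"], ["main", "satellite"], rev)
```

Enumeration is only viable for tiny instances; a real extended-campus schedule has far too many assignment combinations, which is why the authors encode the same objective and constraints as a mixed-integer program for an off-the-shelf solver.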
Sarker, Abeed; Gonzalez, Graciela
2015-02-01
Automatic detection of adverse drug reaction (ADR) mentions from text has recently received significant interest in pharmacovigilance research. Current research focuses on various sources of text-based information, including social media, where enormous amounts of user-posted data are available and have the potential for use in pharmacovigilance if collected and filtered accurately. The aims of this study are: (i) to explore natural language processing (NLP) approaches for generating useful features from text and utilizing them in optimized machine learning algorithms for automatic classification of ADR assertive text segments; (ii) to present two data sets that we prepared for the task of ADR detection from user-posted internet data; and (iii) to investigate whether combining training data from distinct corpora can improve automatic classification accuracies. One of our three data sets contains annotated sentences from clinical reports, and the other two data sets, built in-house, consist of annotated posts from social media. Our text classification approach relies on generating a large set of features, representing semantic properties (e.g., sentiment, polarity, and topic), from short text nuggets. Importantly, using our expanded feature sets, we combine training data from different corpora in attempts to boost classification accuracies. Our feature-rich classification approach performs significantly better than previously published approaches, with ADR class F-scores of 0.812 (previously reported best: 0.770), 0.538, and 0.678 for the three data sets. Combining training data from multiple compatible corpora further improves the ADR F-scores for the in-house data sets to 0.597 (an improvement of 5.9 units) and 0.704 (an improvement of 2.6 units), respectively. Our research results indicate that using advanced NLP techniques to generate information-rich features from text can significantly improve classification accuracies over existing benchmarks.
Our experiments illustrate the benefits of incorporating various semantic features such as topics, concepts, sentiments, and polarities. Finally, we show that integration of information from compatible corpora can significantly improve classification performance. This form of multi-corpus training may be particularly useful in cases where data sets are heavily imbalanced (e.g., social media data), and may reduce the time and costs associated with the annotation of data in the future. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
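For reference, the ADR class F-scores quoted above follow the standard F-measure definition (harmonic mean of precision and recall); a minimal self-contained sketch, where the example counts are made up for illustration rather than taken from the study:

```python
def f_score(tp, fp, fn, beta=1.0):
    """F-measure from true positives, false positives, false negatives.
    beta weights recall relative to precision (beta=1 gives F1)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Hypothetical counts: 8 correct ADR detections, 2 false alarms, 2 misses.
f1 = f_score(8, 2, 2)  # precision = recall = 0.8, so F1 = 0.8
```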
Pussegoda, Kusala; Turner, Lucy; Garritty, Chantelle; Mayhew, Alain; Skidmore, Becky; Stevens, Adrienne; Boutron, Isabelle; Sarkis-Onofre, Rafael; Bjerre, Lise M; Hróbjartsson, Asbjørn; Altman, Douglas G; Moher, David
2017-06-19
The methodological quality and completeness of reporting of systematic reviews (SRs) is fundamental to optimal implementation of evidence-based health care and the reduction of research waste. Methods exist to appraise SRs, yet little is known about how they are used in SRs or where there are potential gaps in research best-practice guidance materials. The aims of this study are to identify reports assessing the methodological quality (MQ) and/or reporting quality (RQ) of a cohort of SRs and to assess their number, general characteristics, and approaches to 'quality' assessment over time. The Cochrane Library, MEDLINE®, and EMBASE® were searched from January 1990 to October 16, 2014, for reports assessing MQ and/or RQ of SRs. Title, abstract, and full-text screening of all reports were conducted independently by two reviewers. Reports assessing the MQ and/or RQ of a cohort of ten or more SRs of interventions were included. All results are reported as frequencies and percentages of reports. Of 20,765 unique records retrieved, 1189 were assessed at full-text review, of which 76 reports were included. Eight previously published approaches to assessing MQ, or reporting guidelines used as a proxy to assess RQ, were used in 80% (61/76) of identified reports. These included two reporting guidelines (PRISMA and QUOROM), five quality assessment tools (AMSTAR, R-AMSTAR, OQAQ, Mulrow, Sacks), and the GRADE criteria. The remaining 24% (18/76) of reports developed their own criteria. PRISMA, OQAQ, and AMSTAR were the most commonly used published tools to assess MQ or RQ. In conjunction with other approaches, published tools were used in 29% (22/76) of reports, with 36% (8/22) assessing adherence to both PRISMA and AMSTAR criteria and 26% (6/22) using QUOROM and OQAQ. The methods used to assess the quality of SRs are diverse, and none has become universally accepted. The most commonly used quality assessment tools are AMSTAR, OQAQ, and PRISMA.
As new tools and guidelines are developed to improve both the MQ and RQ of SRs, authors of methodological studies are encouraged to put thoughtful consideration into the use of appropriate tools to assess quality and reporting.
Nunes, Rita G; Hajnal, Joseph V
2018-06-01
Point spread function (PSF) mapping enables estimation of the displacement fields required for distortion correction of echo planar images. Recently, a highly accelerated approach was introduced for estimating displacements from the phase slope of under-sampled PSF mapping data. Sampling schemes with varying spacing were proposed, requiring stepwise phase unwrapping. To avoid unwrapping errors, an alternative approach applying the concept of finite rate of innovation to PSF mapping (FRIP) is introduced, using a pattern search strategy to locate the PSF peak, and the two methods are compared. Fully sampled PSF data were acquired in six subjects at 3.0 T, and distortion maps were estimated after retrospective under-sampling. The two methods were compared for both previously published and newly optimized sampling patterns. Prospectively under-sampled data were also acquired. Shift maps were estimated, and deviations relative to the fully sampled reference map were calculated. The best performance was achieved when using FRIP with a previously proposed sampling scheme. The two methods were comparable for the remaining schemes. The displacement field errors tended to decrease as the number of samples or their spacing increased. A robust method for estimating the position of the PSF peak has been introduced.
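The abstract does not specify the pattern search used to locate the PSF peak; a generic 1-D derivative-free pattern search of the kind it alludes to can be sketched as follows (the objective, start point, and step-halving schedule are assumptions for illustration, not the paper's algorithm):

```python
def pattern_search_peak(f, x0, step=1.0, tol=1e-6):
    """Derivative-free 1-D pattern search: probe x - step and x + step,
    move to any strict improvement, otherwise halve the step, and stop
    once the step drops below tol. Returns the located maximizer."""
    x, fx = x0, f(x0)
    while step > tol:
        moved = False
        for cand in (x - step, x + step):
            fc = f(cand)
            if fc > fx:
                x, fx, moved = cand, fc, True
        if not moved:
            step *= 0.5
    return x
```

For a smooth unimodal profile such as a PSF magnitude, this converges to the peak without needing derivatives or phase unwrapping.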
Guthrie, Susan; Pollitt, Alexandra; Hanney, Stephen; Grant, Jonathan
2014-01-01
In 2012, RAND Europe and the Health Economics Research Group (Brunel University) were commissioned by the Wellcome Trust, Cancer Research UK, the National Institute for Health Research and the Academy of Medical Science to conduct a study of the returns to the public/charitable investment in cancer-related research. This study built on previous work published in the 2008 "What's it worth?" report that estimated the economic returns to medical research in terms of spillover benefits and health gain. The 2008 study was extensively quoted and cited as a clear justification for the economic importance of medical research and appears to have played a role in achieving the protection of the medical science budget in the recent public expenditure cuts. This cancer study used a similar approach to that used in the previous study, but with some methodological developments. One of the methodological developments was the inclusion of case studies to examine the validity and variability of the estimates on elapsed time between funding and health gains, and the amount of health gains that can be attributed to UK research. This study provides the full text of the five case studies conducted as well as some discussion of observations emerging across the case study set.
Correlation, evaluation, and extension of linearized theories for tire motion and wheel shimmy
NASA Technical Reports Server (NTRS)
Smiley, Robert F
1957-01-01
An evaluation is made of the existing theories of linearized tire motion and wheel shimmy. It is demonstrated that most of the previously published theories represent varying degrees of approximation to a summary theory developed in this report, which is a minor modification of the basic theory of Von Schlippe and Dietrich. In most cases where strong differences exist between the previously published theories and the summary theory, the previously published theories are shown to possess certain deficiencies. A series of systematic approximations to the summary theory is developed for the treatment of problems too simple to merit the use of the complete summary theory, and procedures are discussed for applying the summary theory and its systematic approximations to the shimmy of more complex landing-gear structures than have previously been considered. Comparisons of the existing experimental data with the predictions of the summary theory and the systematic approximations provide a fair substantiation of the more detailed approximate theories.
NASA Astrophysics Data System (ADS)
Davis, Joshua R.; Giorgis, Scott
2014-11-01
We describe a three-part approach for modeling shape preferred orientation (SPO) data of spheroidal clasts. The first part consists of criteria to determine whether a given SPO and clast shape are compatible. The second part is an algorithm for randomly generating spheroid populations that match a prescribed SPO and clast shape. In the third part, numerical optimization software is used to infer deformation from spheroid populations, by finding the deformation that returns a set of post-deformation spheroids to a minimally anisotropic initial configuration. Two numerical experiments explore the strengths and weaknesses of this approach, while giving information about the sensitivity of the model to noise in data. In monoclinic transpression of oblate rigid spheroids, the model is found to constrain the shortening component but not the simple shear component. This modeling approach is applied to previously published SPO data from the western Idaho shear zone, a monoclinic transpressional zone that deformed a feldspar megacrystic gneiss. Results suggest at most 5 km of shortening, as well as pre-deformation SPO fabric. The shortening estimate is corroborated by a second model that assumes no pre-deformation fabric.
Real-time motion compensation for EM bronchoscope tracking with smooth output - ex-vivo validation
NASA Astrophysics Data System (ADS)
Reichl, Tobias; Gergel, Ingmar; Menzel, Manuela; Hautmann, Hubert; Wegner, Ingmar; Meinzer, Hans-Peter; Navab, Nassir
2012-02-01
Navigated bronchoscopy provides benefits for endoscopists and patients, but accurate tracking information is needed. We present a novel real-time approach for bronchoscope tracking combining electromagnetic (EM) tracking, airway segmentation, and a continuous model of output. We augment a previously published approach by including segmentation information in the tracking optimization instead of image similarity. Thus, the new approach is feasible in real-time. Since the true bronchoscope trajectory is continuous, the output is modeled using splines and the control points are optimized with respect to displacement from EM tracking measurements and spatial relation to segmented airways. Accuracy of the proposed method and its components is evaluated on a ventilated porcine ex-vivo lung with respect to ground truth data acquired from a human expert. We demonstrate the robustness of the output of the proposed method against added artificial noise in the input data. Smoothness in terms of inter-frame distance is shown to remain below 2 mm, even when up to 5 mm of Gaussian noise are added to the input. The approach is shown to be easily extensible to include other measures like image similarity.
2014-01-01
Background Integrating and analyzing heterogeneous genome-scale data is a huge algorithmic challenge for modern systems biology. Bipartite graphs can be useful for representing relationships across pairs of disparate data types, with the interpretation of these relationships accomplished through an enumeration of maximal bicliques. Most previously known techniques are generally ill-suited to this foundational task, because they are relatively inefficient and do not scale effectively. In this paper, a powerful new algorithm is described that produces all maximal bicliques in a bipartite graph. Unlike most previous approaches, the new method neither places undue restrictions on its input nor inflates the problem size. Efficiency is achieved through an innovative exploitation of bipartite graph structure, and through computational reductions that rapidly eliminate non-maximal candidates from the search space. An iterative selection of vertices for consideration based on non-decreasing common neighborhood sizes boosts efficiency and leads to more balanced recursion trees. Results The new technique is implemented and compared to previously published approaches from graph theory and data mining. Formal time and space bounds are derived. Experiments are performed on both random graphs and graphs constructed from functional genomics data. It is shown that the new method substantially outperforms the best previous alternatives. Conclusions The new method is streamlined, efficient, and particularly well-suited to the study of huge and diverse biological data. A robust implementation has been incorporated into GeneWeaver, an online tool for integrating and analyzing functional genomics experiments, available at http://geneweaver.org. The enormous increase in scalability it provides empowers users to study complex and previously unassailable gene-set associations between genes and their biological functions in a hierarchical fashion and on a genome-wide scale. 
This practical computational resource is adaptable to almost any applications environment in which bipartite graphs can be used to model relationships between pairs of heterogeneous entities. PMID:24731198
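For orientation, the task the algorithm solves, enumerating all maximal bicliques of a bipartite graph, can be stated as a brute-force baseline. The sketch below is exponential in the left vertex set and is only a conceptual reference point, not the efficient method described above; the edge-list representation is an assumption:

```python
from itertools import chain, combinations

def maximal_bicliques(edges):
    """Enumerate maximal bicliques of a bipartite graph given as (left,
    right) edge pairs. For every non-empty subset L of left vertices,
    take the common neighborhood R, then expand L to every left vertex
    adjacent to all of R, making (L, R) maximal on both sides."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
    left = sorted(adj)
    subsets = chain.from_iterable(
        combinations(left, k) for k in range(1, len(left) + 1))
    found = set()
    for L in subsets:
        R = set.intersection(*(adj[u] for u in L))
        if not R:
            continue
        Lmax = frozenset(u for u in left if R <= adj[u])
        found.add((Lmax, frozenset(R)))
    return found
```

The efficient algorithm reported above replaces this subset enumeration with structural reductions and neighborhood-ordered recursion.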
Irani, Mohamad; Robles, Alex; Gunnala, Vinay; Spandorfer, Steven D
To determine whether different treatment approaches for ectopic pregnancy (EP), particularly unilateral salpingectomy and methotrexate, affect its recurrence rate in patients undergoing in vitro fertilization (IVF). A retrospective cohort study (Canadian Task Force classification II-2). An academic medical center. Patients with a history of a previous EP who achieved pregnancy after IVF cycles between January 2004 and August 2015 were included. The recurrence rate of EP was compared between patients who underwent different treatment approaches for a previous EP. IVF. A total of 594 patients were included. Seventeen patients had a recurrence of EP (2.9%). A history of ≥2 EPs was associated with a significantly higher recurrence rate of EP than 1 previous EP (8.5% vs. 1.8%; p = .01; odds ratio [OR] = 2.2; 95% confidence interval [CI], 1.2-4.4). Patients who underwent unilateral salpingectomy (n = 245) had a recurrence rate of EP after IVF comparable with that of patients who received methotrexate (n = 283) (3.6% vs. 2.8%; p = .5; OR = 1.3; 95% CI, 0.4-3.4). This OR remained unchanged after adjusting for patient age, number of previous EPs, number of transferred embryos, and peak estradiol level during stimulation (adjusted OR = 1.4; 95% CI, 0.5-3.8). None of the patients who underwent bilateral salpingectomy (n = 45) or salpingostomy (n = 21) had a recurrence of EP after IVF. The recurrence rate of EP significantly correlates with the number of previous EPs. Treatment of EP with methotrexate yields a recurrence rate of EP after IVF comparable with that of unilateral salpingectomy. Therefore, the risk of recurrence should not be a reason to favor salpingectomy over methotrexate in this population. Copyright © 2017 AAGL. Published by Elsevier Inc. All rights reserved.
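The unadjusted odds ratios and confidence intervals quoted above follow the standard 2x2-table computation with a Wald interval; a minimal sketch, where the example counts are hypothetical rather than the study's data:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a, b = events / non-events in group 1; c, d = same in group 2.
    Assumes all four cells are non-zero."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi
```

The adjusted OR reported in the study additionally requires a regression model (e.g. logistic regression), which this two-group formula does not capture.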
The floral morphospace – a modern comparative approach to study angiosperm evolution
Chartier, Marion; Jabbour, Florian; Gerber, Sylvain; Mitteroecker, Philipp; Sauquet, Hervé; von Balthazar, Maria; Staedler, Yannick; Crane, Peter R.; Schönenberger, Jürg
2017-01-01
Summary Morphospaces are mathematical representations used for studying the evolution of morphological diversity and for the evaluation of evolved shapes among theoretically possible ones. Although widely used in zoology, they have, with few exceptions, been disregarded in plant science, in particular in the study of broad-scale patterns of floral structure and evolution. Here we provide basic information on the morphospace approach; we review earlier morphospace applications in plant science; and, as a practical example, we construct and analyze a floral morphospace. Morphospaces are usually visualized with the help of ordination methods such as principal component analysis (PCA) or nonmetric multidimensional scaling (NMDS). The results of these analyses are then coupled with disparity indices that describe the spread of taxa in the space. We discuss these methods and apply modern statistical tools to the first and only angiosperm-wide floral morphospace, published by Stebbins in 1951. Despite the incompleteness of Stebbins' original dataset, our analyses highlight major, angiosperm-wide trends in the diversity of flower morphology and thereby demonstrate the power of this previously neglected approach in plant science. PMID:25539005
NASA Astrophysics Data System (ADS)
Tulet, Pierre; Crassier, Vincent; Cousin, Frederic; Suhre, Karsten; Rosset, Robert
2005-09-01
Classical aerosol schemes use either a sectional (bin) or a lognormal approach. Both approaches have particular capabilities and interests: the sectional approach is able to describe any kind of distribution, whereas the lognormal one assumes the form of the distribution and uses fewer explicit variables. For this reason we developed a three-moment lognormal aerosol scheme named ORILAM, to be coupled to three-dimensional mesoscale or CTM models. This paper presents the concept and hypotheses of a range of aerosol processes such as nucleation, coagulation, condensation, sedimentation, and dry deposition. One particular interest of ORILAM is that it keeps the aerosol composition and distribution explicit (the mass of each constituent, the mean radius, and the standard deviation of the distribution are explicit) using the prediction of three moments (m0, m3, and m6). The new model was evaluated by comparing simulations to measurements from the Escompte campaign and to a previously published aerosol model. The numerical cost of the lognormal mode is lower than that of two bins of the sectional approach.
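A three-moment lognormal scheme of this kind keeps (m0, m3, m6) prognostic and recovers the distribution parameters from them via the standard lognormal moment relation m_k = m0 · rg^k · exp(k² ln²σ / 2). A minimal sketch of that inversion follows; this is generic lognormal algebra under the stated relation, not ORILAM's actual code:

```python
from math import exp, log, sqrt

def moment(k, n, rg, sigma):
    """k-th radius moment of a lognormal mode with number concentration n,
    median radius rg, and geometric standard deviation sigma."""
    return n * rg**k * exp(k * k * log(sigma) ** 2 / 2.0)

def lognormal_params(m0, m3, m6):
    """Invert (m0, m3, m6) back to (n, rg, sigma):
    ln(m_k/m0) = k*ln(rg) + (k^2/2)*ln(sigma)^2, so
    ln(m6/m0) - 2*ln(m3/m0) = 9*ln(sigma)^2."""
    l3, l6 = log(m3 / m0), log(m6 / m0)
    ln2sig = (l6 - 2 * l3) / 9.0          # ln(sigma)^2
    rg = exp((l3 - 4.5 * ln2sig) / 3.0)
    return m0, rg, exp(sqrt(ln2sig))
```

The round trip moment → parameters → moment is exact, which is what lets the scheme carry only three prognostic variables per mode while keeping rg and sigma diagnosable at every step.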
Glossary of reference terms for alternative test methods and their validation.
Ferrario, Daniele; Brustio, Roberta; Hartung, Thomas
2014-01-01
This glossary was developed to provide technical references to support work in the field of alternatives to animal testing. It was compiled from various existing reference documents from different sources and is meant to be a point of reference on alternatives to animal testing. Given the ever-increasing number of alternative test methods and approaches developed over the last decades, a combination, revision, and harmonization of earlier published collections of terms used in the validation of such methods is required. The need to update previous glossary efforts came from the acknowledgement that new words have emerged with the development of new approaches, while others have become obsolete, and the meaning of some terms has partially changed over time. With this glossary we intend to provide guidance on issues related to the validation of new or updated testing methods consistent with current approaches. Moreover, because of new developments and technologies, a glossary needs to be a living, constantly updated document. An Internet-based version of this compilation may be found at http://altweb.jhsph.edu/, allowing the addition of new material.
Systematic approach to characterisation of NORM in Thailand.
Chanyotha, S; Kranrod, C; Pengvanich, P
2015-11-01
The aim of this article is to provide information on the systematic approach that has been developed for the measurement of natural radiation exposure and the characterisation of naturally occurring radioactive materials (NORM), in terms of occurrence and distribution, in various industrial processes, including the waste produced by the mineral industries in Thailand. The approach can be adapted for various types of study areas. The importance of collaboration among research institutions is discussed. Developments include 25 documents; the redesign of field equipment, such as the gamma survey meter, for convenient measurement in various study areas; a method to collect and analyse radon gas from a natural gas pipeline; and the use of manganese dioxide fibre to adsorb radium on site for laboratory analysis. The NORM project in Thailand has been carried out for more than 10 y to support the development of NORM regulation in Thailand. In both previous and current studies, international standards for action levels have been adopted for safety purposes. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Allergenicity assessment strategy for novel food proteins and protein sources.
Verhoeckx, Kitty; Broekman, Henrike; Knulst, André; Houben, Geert
2016-08-01
To solve the future food insecurity problem, alternative and sustainable protein sources (e.g. insects, rapeseed, fava bean, and algae) are now being explored for the production of food and feed. To approve these novel protein sources for future food, a comprehensive risk assessment is needed according to European food legislation. Allergenicity risk assessment may pose major difficulties, since detailed guidance on how to assess the allergenic potential of novel foods is not available. At present, the approach relies mostly on the guidance for allergenicity assessment of genetically modified (GM) plant foods, the most recent of which, the "weight-of-evidence approach", was proposed by EFSA (2010 and 2011). However, this guidance is difficult to interpret and not completely applicable or validated for novel foods, and it therefore needs some adjustments. In this paper we propose a conceptual strategy based on the "weight-of-evidence approach" for food derived from GM plants and on other strategies previously published in the literature. This strategy will give more guidance on how to assess the allergenicity of novel food proteins and protein sources. Copyright © 2016 Elsevier Inc. All rights reserved.
A sequential bioequivalence design with a potential ethical advantage.
Fuglsang, Anders
2014-07-01
This paper introduces a two-stage approach for evaluation of bioequivalence, where, in contrast to the designs of Diane Potvin and co-workers, two stages are mandatory regardless of the data obtained at stage 1. The approach is derived from Potvin's method C. It is shown that under circumstances with relatively high variability and relatively low initial sample size, this method has an advantage over Potvin's approaches in terms of sample sizes while controlling type I error rates at or below 5% with a minute occasional trade-off in power. Ethically and economically, the method may thus be an attractive alternative to the Potvin designs. It is also shown that when using the method introduced here, average total sample sizes are rather independent of initial sample size. Finally, it is shown that when a futility rule in terms of sample size for stage 2 is incorporated into this method, i.e., when a second stage can be abolished due to sample size considerations, there is often an advantage in terms of power or sample size as compared to the previously published methods.
Thomas, David L; Lythgoe, Mark F; Gadian, David G; Ordidge, Roger J
2006-04-01
A novel method for measuring the longitudinal relaxation time of arterial blood (T1a) is presented. Knowledge of T1a is essential for accurately quantifying cerebral perfusion using arterial spin labeling (ASL) techniques. The method is based on the flow-sensitive alternating inversion recovery (FAIR) pulsed ASL (PASL) approach. We modified the standard FAIR acquisition scheme by incorporating a global saturation pulse at the beginning of the recovery period. With this approach the FAIR tissue signal difference has a simple monoexponential dependence on the recovery time, with T1a as the time constant. Therefore, FAIR measurements performed over a range of recovery times can be fitted to a monoexponential recovery curve and T1a can be calculated directly. This eliminates many of the difficulties associated with the measurement of T1a. Experiments performed in vivo in the mouse at 2.35T produced a mean value of 1.51 s for T1a, consistent with previously published values. (c) 2006 Wiley-Liss, Inc.
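Fitting the FAIR difference signal to a monoexponential with T1a as the time constant amounts, in its simplest form, to ordinary least squares on (t, ln S); the sketch below illustrates that generic log-linear fit, with a synthetic noiseless signal model and sample times chosen as assumptions, not the paper's acquisition protocol:

```python
from math import exp, log

def fit_t1a(times, signals):
    """Least-squares fit of S(t) = A * exp(-t / T1a) via the log-linear
    form ln S = ln A - t / T1a. Returns the fitted T1a (same time units
    as `times`). Assumes strictly positive signals."""
    n = len(times)
    ys = [log(s) for s in signals]
    tbar = sum(times) / n
    ybar = sum(ys) / n
    slope = sum((t - tbar) * (y - ybar) for t, y in zip(times, ys)) / \
            sum((t - tbar) ** 2 for t in times)
    return -1.0 / slope
```

With real, noisy data a nonlinear fit directly on S(t) is usually preferred, since the log transform distorts the noise distribution.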
Granacher, Urs; Muehlbauer, Thomas; Gollhofer, Albert; Kressig, Reto W; Zahner, Lukas
2011-01-01
The risk of sustaining a fall and fall-related injuries is particularly high in children and seniors, which is why there is a need to develop fall-preventive intervention programs. An intergenerational approach to balance and strength promotion appears to have great potential because it is specifically tailored to the physical, social, and behavioural needs of children and seniors. Burtscher and Kopp [Gerontology, DOI: 10.1159/000322930] raised the question of whether our previously published mini-review is evidence-based or evidence-inspired. These authors postulate that we did not follow a 4-stage conceptual model for the development of injury- and/or fall-preventive intervention programs. In response to this criticism, we present information from the mini-review that complies with the 4-stage model, incorporating evidence-based and evidence-inspired components. We additionally provide information on how to implement an intergenerational balance and resistance training approach in a school setting, based on a study that is currently being conducted. Copyright © 2010 S. Karger AG, Basel.
Top 10 metrics for life science software good practices.
Artaza, Haydee; Chue Hong, Neil; Corpas, Manuel; Corpuz, Angel; Hooft, Rob; Jimenez, Rafael C; Leskošek, Brane; Olivier, Brett G; Stourac, Jan; Svobodová Vařeková, Radka; Van Parys, Thomas; Vaughan, Daniel
2016-01-01
Metrics for assessing adoption of good development practices are a useful way to ensure that software is sustainable, reusable and functional. Sustainability means that the software used today will be available - and continue to be improved and supported - in the future. We report here an initial set of metrics that measure good practices in software development. This initiative differs from previously developed efforts in being a community-driven grassroots approach where experts from different organisations propose good software practices that have reasonable potential to be adopted by the communities they represent. We not only focus our efforts on understanding and prioritising good practices, we assess their feasibility for implementation and publish them here.
Scholten, Paul M; Massimi, Stephen; Dahmen, Nick; Diamond, Joanne; Wyss, James
2015-01-01
Athletic pubalgia is a syndrome of persistent groin pain due to chronic repetitive trauma or stress involving the pelvic joints and many musculotendinous structures that cross the anterior pelvis. As a result, the differential diagnosis can be complex, but insertional tendinopathies are the most common. This case report describes a novel approach to the treatment of distal rectus abdominis tendinopathies with ultrasound-guided needle tenotomy and platelet-rich plasma (PRP) injection. After injection, the patient returned to pain-free play at his previous level of intensity. This suggests that PRP may be a useful treatment for this diagnosis. Copyright © 2015. Published by Elsevier Inc.
NASA Technical Reports Server (NTRS)
Connelly, Joseph; Blake, Peter; Jones, Joycelyn
2008-01-01
The authors report operational upgrades and streamlined data analysis of a commissioned electronic speckle interferometer (ESPI) in a permanent in-house facility at NASA's Goddard Space Flight Center. Our ESPI was commercially purchased for use by the James Webb Space Telescope (JWST) development team. We have quantified and reduced systematic error sources, improved the software operability with a user-friendly graphic interface, developed an instrument simulator, streamlined data analysis for long-duration testing, and implemented a turn-key approach to speckle interferometry. We also summarize results from a test of the JWST support structure (previously published), and present new results from several pieces of test hardware at various environmental conditions.
Kormes, Diego J; Cortón, Eduardo
2009-01-01
Whereas biosensors have usually been proposed as analytical tools, used to investigate the surrounding media in pursuit of an analytical answer, we have used a biosensor-like device to characterize the microbial cells immobilized on it. We have studied the kinetics of transport and degradation of glucose at different concentrations and temperatures. When glucose concentrations of 15 and 1.5 mM were assayed, the calculated activation energies were 25.2 and 18.4 kcal mol(-1), respectively, in good agreement with previously published data. The opportunity and convenience of using Arrhenius plots to estimate the activation energy of metabolism-related processes is also discussed.
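An Arrhenius estimate of activation energy, as used above, is the slope of ln k versus 1/T multiplied by -R; a minimal two-point sketch in units of kcal mol(-1) follows, where the rate and temperature values are illustrative assumptions, not the paper's measurements:

```python
from math import exp, log

R_KCAL = 1.987e-3  # gas constant in kcal mol^-1 K^-1

def activation_energy(t1_k, k1, t2_k, k2):
    """Two-point Arrhenius estimate: ln(k2/k1) = -(Ea/R)(1/T2 - 1/T1),
    so Ea = R * ln(k2/k1) / (1/T1 - 1/T2). Temperatures in kelvin."""
    return R_KCAL * log(k2 / k1) / (1.0 / t1_k - 1.0 / t2_k)
```

In practice one fits the slope over several temperatures (a full Arrhenius plot) rather than two points, which is what makes the method robust to measurement noise.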
Time-resolved gamma spectroscopy of single events
NASA Astrophysics Data System (ADS)
Wolszczak, W.; Dorenbos, P.
2018-04-01
In this article we present a method of characterizing scintillating materials by digitization of each individual scintillation pulse followed by digital signal processing. With this technique it is possible to measure the pulse shape and the energy of an absorbed gamma photon on an event-by-event basis. In contrast to the time-correlated single-photon counting technique, the digital approach provides a faster measurement and active noise suppression, and it enables characterization of scintillation pulses simultaneously in two domains: time and energy. We applied this method to study how the pulse shape of a CsI(Tl) scintillator changes with the energy of gamma excitation. We confirmed previously published results and revealed new details of the phenomenon.
Rómoli, Santiago; Serrano, Mario Emanuel; Ortiz, Oscar Alberto; Vega, Jorge Rubén; Eduardo Scaglia, Gustavo Juan
2015-07-01
Based on a linear algebra approach, this paper aims at developing a novel control law able to track reference profiles previously determined in the literature. A main advantage of the proposed strategy is that the control actions are obtained by solving a system of linear equations. The optimal controller parameters are selected through a Monte Carlo randomized algorithm in order to minimize a proposed cost index. The controller performance is evaluated through several tests and compared with another controller reported in the literature. Finally, a Monte Carlo randomized algorithm is conducted to assess the performance of the proposed controller. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
Online-Based Approaches to Identify Real Journals and Publishers from Hijacked Ones.
Asadi, Amin; Rahbar, Nader; Asadi, Meisam; Asadi, Fahime; Khalili Paji, Kokab
2017-02-01
The aim of the present paper was to introduce some online-based approaches to evaluate scientific journals and publishers and to differentiate them from hijacked ones, regardless of their disciplines. With the advent of open-access journals, many hijacked journals and publishers have deceitfully assumed the mantle of authenticity in order to take advantage of researchers and students. Although these hijacked journals and publishers can be identified by checking their advertisement techniques and their websites, these methods do not always result in their identification. There exist certain online-based approaches, such as using the Master Journal List provided by Thomson Reuters, the Scopus database, and the DOI of a paper, to verify the authenticity of a journal or publisher. It is indispensable that inexperienced students and researchers know these methods so that they can identify hijacked journals and publishers with a higher probability.
Contributions to muscle force and EMG by combined neural excitation and electrical stimulation
Crago, Patrick E; Makowski, Nathaniel S; Cole, Natalie M
2014-01-01
Objective Stimulation of muscle for research or clinical interventions is often superimposed on ongoing physiological activity, without a quantitative understanding of the impact of the stimulation on the net muscle activity and the physiological response. Experimental studies show that total force during stimulation is less than the sum of the isolated voluntary and stimulated forces, but the occlusion mechanism is not understood. Approach We develop a model of efferent motor activity elicited by superimposing stimulation during a physiologically activated contraction. The model combines action potential interactions due to collision block, source resetting, and refractory periods with previously published models of physiological motor unit recruitment, rate modulation, force production, and EMG generation in human first dorsal interosseous muscle to investigate the mechanisms and effectiveness of stimulation on the net muscle force and EMG. Main Results Stimulation during a physiological contraction demonstrates partial occlusion of force and the neural component of the EMG, due to action potential interactions in motor units activated by both sources. Depending on neural and stimulation firing rates as well as on force-frequency properties, individual motor unit forces can be greater, smaller, or unchanged by the stimulation. In contrast, voluntary motor unit EMG potentials in simultaneously stimulated motor units show progressive occlusion with increasing stimulus rate. The simulations predict that occlusion would be decreased by a reverse stimulation recruitment order. Significance The results are consistent with and provide a mechanistic interpretation of previously published experimental evidence of force occlusion. The models also predict two effects that have not been reported previously - voluntary EMG occlusion and the advantages of a proximal stimulation site. 
This study provides a basis for the rational design of both future experiments and clinical neuroprosthetic interventions involving either motor or sensory stimulation. PMID:25242203
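The occlusion mechanism modeled above, collision block and refractory periods removing voluntary spikes that follow a stimulus too closely, can be illustrated with a toy spike-train merge. The firing rates and the refractory period below are arbitrary illustrative values, not parameters from the model.

```python
# Toy sketch of occlusion by action-potential interactions: a motor unit
# cannot fire again within its refractory period, so voluntary spikes that
# arrive too soon after a stimulus pulse (or vice versa) are lost.

def merge_with_refractory(voluntary, stimulated, refractory=0.005):
    """Merge two spike-time lists (seconds); drop any spike closer than
    `refractory` to the previously accepted spike (collision block)."""
    fired, last = [], -float("inf")
    for t in sorted(voluntary + stimulated):
        if t - last >= refractory:
            fired.append(t)
            last = t
    return fired

voluntary = [i * 0.050 for i in range(20)]           # 20 Hz voluntary train
stimulated = [0.002 + i * 0.025 for i in range(40)]  # 40 Hz stimulus train

net = merge_with_refractory(voluntary, stimulated)
# Occlusion: the unit fires fewer spikes than the two trains contain in total,
# so the net output is less than the sum of the isolated contributions.
assert len(net) < len(voluntary) + len(stimulated)
```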
Hydrodynamic and Chemical Factors in Clogging by Montmorillonite in Porous Media
Mays, David C.; Hunt, James R.
2008-01-01
Clogging by colloid deposits is important in water treatment filters, groundwater aquifers, and petroleum reservoirs. The complexity of colloid deposition and deposit morphology preclude models based on first principles, so this study extends an empirical approach to quantify clogging using a simple, one-parameter model. Experiments were conducted with destabilized suspensions of sodium- and calcium-montmorillonite to quantify the hydrodynamic and chemical factors important in clogging. Greater clogging is observed at slower fluid velocity, consistent with previous investigations. However, calcium-montmorillonite causes one order of magnitude less clogging per mass of deposited particles compared to sodium-montmorillonite or a previously published summary of clogging in model granular media. Steady state conditions, in which the permeability and the quantity of deposited material are both constant, were not observed, even though the experimental conditions were optimized for that purpose. These results indicate that hydrodynamic aspects of clogging by these natural materials are consistent with those of simplified model systems, and they demonstrate significant chemical effects on clogging for fully destabilized montmorillonite clay. PMID:17874771
Predicting adolescent's cyberbullying behavior: A longitudinal risk analysis.
Barlett, Christopher P
2015-06-01
The current study used the risk factor approach to test the unique and combined influence of several possible risk factors for cyberbullying attitudes and behavior using a four-wave longitudinal design with an adolescent US sample. Participants (N = 96; average age = 15.50 years) completed measures of cyberbullying attitudes, perceptions of anonymity, cyberbullying behavior, and demographics four times throughout the academic school year. Several logistic regression equations were used to test the contribution of these possible risk factors. Results showed that (a) cyberbullying attitudes and previous cyberbullying behavior were important unique risk factors for later cyberbullying behavior, (b) anonymity and previous cyberbullying behavior were valid risk factors for later cyberbullying attitudes, and (c) the likelihood of engaging in later cyberbullying behavior increased with the addition of risk factors. Overall, results show the unique and combined influence of such risk factors for predicting later cyberbullying behavior. Results are discussed in terms of theory. Copyright © 2015 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.
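A risk-factor analysis of this kind reduces to logistic regression of later behavior on candidate predictors. The sketch below fits such a model by gradient descent; the variable names echo the study's risk factors, but the data are fabricated toy values, not the study's sample.

```python
import math

# Hedged sketch of a logistic-regression risk-factor analysis.
# The data below are invented toy values for illustration only.

def fit_logistic(xs, ys, lr=0.5, steps=2000):
    """Fit y ~ sigmoid(w0 + w1*x1 + w2*x2) by batch gradient descent."""
    w = [0.0, 0.0, 0.0]
    for _ in range(steps):
        grad = [0.0, 0.0, 0.0]
        for (x1, x2), y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w[0] + w[1] * x1 + w[2] * x2)))
            for j, xj in enumerate((1.0, x1, x2)):
                grad[j] += (p - y) * xj
        w = [wj - lr * g / len(xs) for wj, g in zip(w, grad)]
    return w

# Predictors: (positive cyberbullying attitudes, prior cyberbullying);
# outcome: later cyberbullying behavior (toy pattern where the risk
# factors together make the outcome more likely).
X = [(0, 0), (0, 1), (1, 0), (1, 1), (0, 0), (1, 1), (1, 0), (0, 1)]
y = [0, 0, 0, 1, 0, 1, 1, 0]

w0, w_att, w_prior = fit_logistic(X, y)
assert w_att > 0  # the attitude risk factor raises the predicted odds
```

A positive fitted weight plays the role of a significant risk factor here; the study's actual analysis used several logistic regression equations across the four waves.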
Resting-state functional connectivity assessed with two diffuse optical tomographic systems.
Niu, Haijing; Khadka, Sabin; Tian, Fenghua; Lin, Zi-Jing; Lu, Chunming; Zhu, Chaozhe; Liu, Hanli
2011-04-01
Functional near-infrared spectroscopy (fNIRS) has recently been utilized as a new approach to assess resting-state functional connectivity (RSFC) in the human brain. For any new technique or methodology, it is necessary to replicate similar experiments using different instruments in order to establish its reliability and reproducibility. We applied two different diffuse optical tomographic (DOT) systems (i.e., DYNOT and CW5) with various probe arrangements to evaluate RSFC in the sensorimotor cortex, utilizing a previously published experimental protocol and seed-based correlation analysis. Our results exhibit similar spatial patterns and strengths in RSFC between the bilateral motor cortices. Consistent observations are obtained from both the DYNOT and CW5 systems, and are in good agreement with the previous fNIRS study. Overall, we demonstrate that fNIRS-based RSFC is reproducible by various DOT imaging systems among different research groups, enhancing the confidence of neuroscience researchers and clinicians in utilizing fNIRS for future applications.
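Seed-based correlation analysis, as used above, reduces to the Pearson correlation of a seed channel's time series against every other channel. A minimal sketch on synthetic signals follows; real RSFC pipelines additionally band-pass filter the data and regress out nuisance signals first.

```python
import math

# Minimal sketch of seed-based correlation analysis on synthetic signals.

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length series."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

t = [i * 0.1 for i in range(100)]
seed = [math.sin(x) for x in t]                               # seed channel
homologue = [math.sin(x) + 0.1 * math.cos(7 * x) for x in t]  # coupled channel
unrelated = [math.cos(3 * x) for x in t]                      # background channel

# Strong correlation marks the functionally connected (homologous) region;
# weak correlation marks unrelated tissue.
assert pearson(seed, homologue) > 0.9
assert abs(pearson(seed, unrelated)) < 0.5
```

Thresholding these correlation coefficients across all channels yields the spatial RSFC map whose pattern and strength the two DOT systems are compared on.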
The newest international trend about regulation of indoor radon.
Bochicchio, Francesco
2011-07-01
On the basis of recent epidemiological findings, many international and national organisations have revised their recommendations and regulations on radon exposure in dwellings and workplaces, or are in the process of doing so. In particular, new recommendations and regulations were recently published (or are about to be) by the World Health Organization, the Nordic countries, the International Commission on Radiological Protection, the International Atomic Energy Agency (and the other international organisations sponsoring the International Basic Safety Standards), and the European Commission. Although with some differences, these new documents recommend lower radon concentrations in indoor air, especially in dwellings, compared with previous ones. Moreover, preventive measures in all new buildings are increasingly considered one of the most cost-effective ways to reduce radon-related lung cancers, compared with the previous approach of restricting preventive measures to radon-prone areas only. A comprehensive national action plan, involving several national and local authorities, is generally considered a necessary tool to deal with the many complex actions needed to reduce the risk from radon exposure in an effective way.
NASA Technical Reports Server (NTRS)
Newton, G. P.
1973-01-01
Previous solutions of the problem of the distribution of vibrationally excited molecular nitrogen in the thermosphere have either assumed a Boltzmann distribution and considered diffusion as one of the loss processes, or solved for the energy level populations and neglected diffusion. Both of the previous approaches are combined here by solving the time-dependent continuity equations, including the diffusion process, for the first six energy levels of molecular nitrogen under thermospheric conditions corresponding to a stable auroral red arc. The primary source of molecular nitrogen excitation was inelastic collisions between thermal electrons and molecular nitrogen. The reaction rates for this process were calculated from published cross section calculations. The loss processes for vibrational energy were electron and atomic oxygen quenching and vibrational energy exchange. The coupled sets of nonlinear partial differential equations were solved numerically by employing finite difference equations.
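The numerical scheme described, time-dependent continuity equations with diffusion solved by finite differences, can be illustrated for a single level with an explicit 1-D step. The grid spacing, diffusion coefficient, production, and loss rate below are toy values, not thermospheric parameters.

```python
# Illustrative explicit finite-difference step for a 1-D continuity equation
# with diffusion, dn/dt = D*d2n/dz2 + P - L*n, loosely in the spirit of the
# coupled level-population equations described above.  All values are toys.

def step(n, D, P, L, dz, dt):
    """Advance the density profile one time step (fixed zero boundaries)."""
    new = n[:]
    for i in range(1, len(n) - 1):
        d2n = (n[i + 1] - 2 * n[i] + n[i - 1]) / dz ** 2  # central difference
        new[i] = n[i] + dt * (D * d2n + P - L * n[i])
    return new

n = [0.0] * 21
n[10] = 1.0                 # initial localized population of one level
for _ in range(200):        # dt*D/dz**2 = 0.2 keeps the scheme stable
    n = step(n, D=1.0, P=0.0, L=0.1, dz=1.0, dt=0.2)
assert n[10] < 1.0 and n[5] > 0.0   # the peak diffuses outward and decays
```

The full problem couples six such equations (one per vibrational level) through the excitation, quenching, and vibrational-exchange terms, but each is advanced with the same kind of finite-difference update.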
Wijeakumar, Sobanawartiny; Ambrose, Joseph P.; Spencer, John P.; Curtu, Rodica
2017-01-01
A fundamental challenge in cognitive neuroscience is to develop theoretical frameworks that effectively span the gap between brain and behavior, between neuroscience and psychology. Here, we attempt to bridge this divide by formalizing an integrative cognitive neuroscience approach using dynamic field theory (DFT). We begin by providing an overview of how DFT seeks to understand the neural population dynamics that underlie cognitive processes through previous applications and comparisons to other modeling approaches. We then use previously published behavioral and neural data from a response selection Go/Nogo task as a case study for model simulations. Results from this study served as the ‘standard’ for comparisons with a model-based fMRI approach using dynamic neural fields (DNF). The tutorial explains the rationale and hypotheses involved in the process of creating the DNF architecture and fitting model parameters. Two DNF models, with similar structure and parameter sets, are then compared. Both models effectively simulated reaction times from the task as we varied the number of stimulus-response mappings and the proportion of Go trials. Next, we directly simulated hemodynamic predictions from the neural activation patterns from each model. These predictions were tested using general linear models (GLMs). Results showed that the DNF model created by tuning parameters to simultaneously capture trends in neural activation and behavioral data quantitatively outperformed a standard GLM analysis of the same dataset. Further, by using the GLM results to assign functional roles to particular clusters in the brain, we illustrate how DNF models shed new light on the dynamics of neural populations within particular brain regions. Thus, the present study illustrates how an interactive cognitive neuroscience model can be used in practice to bridge the gap between brain and behavior. PMID:29118459
NASA Astrophysics Data System (ADS)
Inman, Matthew Clay
A novel, open-cathode direct methanol fuel cell (DMFC) has been designed and built by researchers at the University of North Florida and University of Florida. Foremost among the advances of this system over previous DMFC architectures is a passive water recovery system which allows product water to replenish that consumed at the anode. This is enabled by a specially-designed water pathway combined with a liquid barrier layer (LBL). The LBL membrane is positioned between the cathode catalyst layer and the cathode gas diffusion layer, and must exhibit high permeability and low diffusive resistance to both oxygen and water vapor, bulk hydrophobicity to hold back the product liquid water, and must remain electrically conductive. Maintaining water balance at optimum operating temperatures is problematic with the current LBL design, forcing the system to run at lower temperatures, which decreases the overall system efficiency. This research presents a novel approach to nanoporous membrane design whereby the flux of a given species is determined based upon the molecular properties of said species and those of the diffusing medium, the pore geometry, and the membrane thickness. A molecular dynamics (MD) model is developed for tracking Knudsen regime flows of a Lennard-Jones (LJ) fluid through an atomistic pore structure, hundreds of thousands of wall collision simulations are performed on the University of Florida HiPerGator supercomputer, and the generated trajectory information is used to develop number density and axial velocity profiles for use in a rigorous approach to total flux calculation absent in previously attempted MD models. Results are compared to other published approaches and diffusion data available in the literature. The impact of this study on various applications of membrane design is discussed and additional simulations and model improvements are outlined for future consideration.
Korn, Christoph W; Vunder, Johanna; Miró, Júlia; Fuentemilla, Lluís; Hurlemann, Rene; Bach, Dominik R
2017-10-01
Rodent approach-avoidance conflict tests are common preclinical models of human anxiety disorder. Their translational validity mainly rests on the observation that anxiolytic drugs reduce rodent anxiety-like behavior. Here, we capitalized on a recently developed approach-avoidance conflict computer game to investigate the impact of benzodiazepines and of amygdala lesions on putative human anxiety-like behavior. In successive epochs of this game, participants collect monetary tokens on a spatial grid while under threat of virtual predation. In a preregistered, randomized, double-blind, placebo-controlled trial, we tested the effect of a single dose (1 mg) of lorazepam (n = 59). We then compared 2 patients with bilateral amygdala lesions due to Urbach-Wiethe syndrome with age- and gender-matched control participants (n = 17). Based on a previous report, the primary outcome measure was the effect of intra-epoch time (i.e., an adaptation to increasing potential loss) on presence in the safe quadrant of the spatial grid. We hypothesized reduced loss adaptation in this measure under lorazepam and in patients with amygdala lesions. Lorazepam and amygdala lesions reduced loss adaptation in the primary outcome measure. We found similar results in several secondary outcome measures. The relative reduction of anxiety-like behavior in patients with amygdala lesions was qualitatively and quantitatively indistinguishable from an impact of anterior hippocampus lesions found in a previous report. Our results establish the translational validity of human approach-avoidance conflict tests in terms of anxiolytic drug action. We identified the amygdala, in addition to the hippocampus, as a critical structure in human anxiety-like behavior. Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
Loades, Maria E; Sheils, Elizabeth A; Crawley, Esther
2016-10-11
At least 30% of young people with chronic fatigue syndrome/myalgic encephalomyelitis (CFS/ME) also have symptoms of depression. This systematic review aimed to establish which treatment approaches for depression are effective and whether comorbid depression mediates outcome. A systematic review was undertaken. The search terms were entered into MEDLINE, EMBASE, PsycInfo and the Cochrane library. Inclusion and exclusion criteria were applied to identify relevant papers. Inclusion criteria were children aged <18 with CFS/ME, defined using CDC, NICE or Oxford criteria, who had completed a valid assessment for depression. Nine studies were identified that met the inclusion criteria, but none specifically tested treatments for paediatric CFS/ME with depression and none stratified outcome for those who were depressed compared with those who were not. There is no consistent treatment approach for children with CFS/ME and comorbid depression, although cognitive-behavioural therapy for CFS/ME and a multicomponent inpatient programme for CFS/ME have shown some promise in reducing depressive symptoms. An antiviral medication in a small-scale, retrospective, uncontrolled study suggested possible benefit. It is not possible to determine which treatment approaches are effective for depression in paediatric CFS/ME, nor to determine the impact of depression on the outcome of CFS/ME treatment. Young people with significant depression tend to have been excluded from previous treatment studies. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Preliminary Multivariable Cost Model for Space Telescopes
NASA Technical Reports Server (NTRS)
Stahl, H. Philip
2010-01-01
Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. Previously, the authors published two single-variable cost models based on 19 flight missions. The current paper presents the development of a multivariable space telescope cost model. The validity of previously published models is tested, cost estimating relationships which are and are not significant cost drivers are identified, and interrelationships between variables are explored.
Improved parameters of seven Kepler giant companions characterized with SOPHIE and HARPS-N
NASA Astrophysics Data System (ADS)
Bonomo, A. S.; Sozzetti, A.; Santerne, A.; Deleuil, M.; Almenara, J.-M.; Bruno, G.; Díaz, R. F.; Hébrard, G.; Moutou, C.
2015-03-01
Radial-velocity observations of Kepler candidates obtained with the SOPHIE and HARPS-N spectrographs have permitted unveiling the nature of the five giant planets Kepler-41b, Kepler-43b, Kepler-44b, Kepler-74b, and Kepler-75b, the massive companion Kepler-39b, and the brown dwarf KOI-205b. These companions were previously characterized with long-cadence (LC) Kepler data. Here we aim at refining the parameters of these transiting systems by i) modelling the published radial velocities and Kepler short-cadence (SC) data that provide a much better sampling of the transits; ii) performing new spectral analyses of the SOPHIE and ESPaDOnS spectra, after improving our procedure for selecting and co-adding the SOPHIE spectra of faint stars (Kp ≳ 14); and iii) improving stellar rotation periods hence stellar age estimates through gyrochronology, when possible, by using all the available LC data up to quarter Q17. Posterior distributions of the system parameters were derived with a differential evolution Markov chain Monte Carlo approach. Our main results are as follows: a) Kepler-41b is significantly larger and less dense than previously found because a lower orbital inclination is favoured by SC data. This also affects the determination of the geometric albedo that is lower than previously derived: Ag< 0.135; b) Kepler-44b is moderately smaller and denser than reported in the discovery paper, as a consequence of the slightly shorter transit duration found with SC data; c) good agreement was achieved with published Kepler-43, Kepler-75, and KOI-205 system parameters, although the host stars Kepler-75 and KOI-205 were found to be slightly richer in metals and hotter, respectively; d) the previously reported non-zero eccentricities of Kepler-39b and Kepler-74b might be spurious. If their orbits were circular, the two companions would be smaller and denser than in the eccentric case. The radius of Kepler-39b is still larger than predicted by theoretical isochrones. 
Its parent star is hotter and richer in metals than previously determined. Tables 2-8 are available in electronic form at http://www.aanda.org
Writing and Publishing Handbook.
ERIC Educational Resources Information Center
Hansen, William F., Ed.
Intended to provide guidance in academic publishing to faculty members, especially younger faculty members, this handbook is a compilation of four previously published essays by different authors. Following a preface and an introduction, the four essays and their authors are as follows: (1) "One Writer's Secrets" (Donald M. Murray); (2)…
Utility-preserving anonymization for health data publishing.
Lee, Hyukki; Kim, Soohyung; Kim, Jong Wook; Chung, Yon Dohn
2017-07-11
Publishing raw electronic health records (EHRs) may be considered a breach of the privacy of individuals because they usually contain sensitive information. A common practice for privacy-preserving data publishing is to anonymize the data before publishing so as to satisfy privacy models such as k-anonymity. Among various anonymization techniques, generalization is the most commonly used in medical/health data processing. Generalization inevitably causes information loss, and thus various methods have been proposed to reduce it. However, existing generalization-based data anonymization methods cannot avoid excessive information loss and therefore fail to preserve data utility. We propose a utility-preserving anonymization for privacy-preserving data publishing (PPDP). To preserve data utility, the proposed method comprises three parts: (1) a utility-preserving model, (2) counterfeit record insertion, and (3) a catalog of the counterfeit records. We also propose an anonymization algorithm using the proposed method, which applies a full-domain generalization algorithm. We evaluate our method against an existing method on two aspects: information loss, measured through various quality metrics, and the error rate of analysis results. Across all the quality metrics, our proposed method shows lower information loss than the existing method. In the real-world EHR analysis, the results show only a small error between the data anonymized by the proposed method and the original data. Through experiments on various datasets, we show that the utility of EHRs anonymized by the proposed method is significantly better than that of those anonymized by previous approaches.
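Full-domain generalization, the algorithm family named above, applies one hierarchy level uniformly to every value of a quasi-identifier and releases the lowest level at which k-anonymity holds. A minimal sketch follows; the age hierarchy and records are illustrative, and the paper's actual method additionally inserts counterfeit records tracked in a catalog to reduce information loss.

```python
from collections import Counter

# Sketch of full-domain generalization for k-anonymity on a single
# quasi-identifier (age).  Hierarchy and records are illustrative.

HIERARCHY = [
    lambda age: str(age),                                   # level 0: exact
    lambda age: f"{age // 10 * 10}-{age // 10 * 10 + 9}",   # level 1: decade
    lambda age: "*",                                        # level 2: suppressed
]

def generalize(ages, level):
    return [HIERARCHY[level](a) for a in ages]

def is_k_anonymous(values, k):
    """Every equivalence class (identical value) holds at least k records."""
    return min(Counter(values).values()) >= k

def anonymize(ages, k):
    """Full-domain: use the lowest hierarchy level satisfying k-anonymity."""
    for level in range(len(HIERARCHY)):
        out = generalize(ages, level)
        if is_k_anonymous(out, k):
            return level, out
    raise ValueError("no level satisfies k-anonymity (need len(ages) >= k)")

ages = [23, 27, 31, 36, 38, 24]
level, released = anonymize(ages, k=2)
assert level == 1  # exact ages are all unique, so decades are released
```

Each extra hierarchy level costs utility, which is exactly the information loss the paper's counterfeit-record mechanism is designed to limit.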
Davis, Katherine; Gorst, Sarah L; Harman, Nicola; Smith, Valerie; Gargon, Elizabeth; Altman, Douglas G; Blazeby, Jane M; Clarke, Mike; Tunis, Sean; Williamson, Paula R
2018-01-01
Core outcome sets (COS) comprise a minimum set of outcomes that should be measured and reported in all trials for a specific health condition. The COMET (Core Outcome Measures in Effectiveness Trials) Initiative maintains an up-to-date, publicly accessible online database of published and ongoing COS. An annual systematic review update is an important part of this process. This review employed the same multifaceted approach that was used in the original review and the previous two updates, identifying studies that sought to determine which outcomes/domains to measure in clinical trials of a specific condition. This update includes an analysis of the inclusion of participants from low- and middle-income countries (LMICs), as identified by the OECD, in these COS. Eighteen publications, relating to 15 new studies describing the development of 15 COS, were eligible for inclusion in the review. Results show an increase in the use of mixed methods, including Delphi surveys. Clinical experts remain the most common stakeholder group involved. Overall, only 16% of the 259 COS studies published up to the end of 2016 have included participants from LMICs. This review highlights opportunities for greater public participation in COS development and the involvement of stakeholders from a wider range of geographical settings, in particular LMICs.
Development and evaluation of consensus-based sediment quality guidelines for freshwater ecosystems
MacDonald, D.D.; Ingersoll, C.G.; Berger, T.A.
2000-01-01
Numerical sediment quality guidelines (SQGs) for freshwater ecosystems have previously been developed using a variety of approaches. Each approach has certain advantages and limitations which influence its application in the sediment quality assessment process. In an effort to focus on the agreement among these various published SQGs, consensus-based SQGs were developed for 28 chemicals of concern in freshwater sediments (i.e., metals, polycyclic aromatic hydrocarbons, polychlorinated biphenyls, and pesticides). For each contaminant of concern, two SQGs were developed from the published SQGs: a threshold effect concentration (TEC) and a probable effect concentration (PEC). The resultant SQGs for each chemical were evaluated for reliability using matching sediment chemistry and toxicity data from field studies conducted throughout the United States. The results of this evaluation indicated that most of the TECs (i.e., 21 of 28) provide an accurate basis for predicting the absence of sediment toxicity. Similarly, most of the PECs (i.e., 16 of 28) provide an accurate basis for predicting sediment toxicity. Mean PEC quotients were calculated to evaluate the combined effects of multiple contaminants in sediment. Results of the evaluation indicate that the incidence of toxicity is highly correlated with the mean PEC quotient (R2 = 0.98 for 347 samples). It was concluded that the consensus-based SQGs provide a reliable basis for assessing sediment quality conditions in freshwater ecosystems.
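The mean PEC quotient used in the evaluation above is simply the average, over the chemicals measured in a sample, of concentration divided by that chemical's PEC. A sketch with placeholder guideline values follows; the PECs and sample concentrations are illustrative, not the paper's consensus-based numbers.

```python
# Sketch of the mean-PEC-quotient screening computation.  The PEC values
# and the measured concentrations are illustrative placeholders, not the
# consensus-based guideline numbers from the paper.

PEC = {"cadmium": 5.0, "lead": 130.0, "total_PAH": 22.8}  # mg/kg dry wt, toy

def mean_pec_quotient(sample):
    """Average of concentration/PEC over the chemicals measured in a
    sample; larger values indicate a higher expected incidence of toxicity."""
    quotients = [sample[c] / PEC[c] for c in sample if c in PEC]
    return sum(quotients) / len(quotients)

clean = {"cadmium": 0.5, "lead": 20.0, "total_PAH": 2.0}
contaminated = {"cadmium": 12.0, "lead": 400.0, "total_PAH": 60.0}

assert mean_pec_quotient(clean) < 0.5 < mean_pec_quotient(contaminated)
```

Averaging the quotients, rather than flagging single exceedances, is what lets the index capture the combined effect of multiple contaminants in one number.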
NASA Astrophysics Data System (ADS)
Blaña, M.; Fellhauer, M.; Smith, R.; Candlish, G. N.; Cohen, R.; Farias, J. P.
2015-01-01
Hercules is a dwarf spheroidal satellite of the Milky Way, found at a distance of ≈138 kpc, and showing evidence of tidal disruption. It is very elongated and exhibits a velocity gradient of 16 ± 3 km s⁻¹ kpc⁻¹. Using these data, a possible orbit of Hercules has previously been deduced in the literature. In this study, we make use of a novel approach to find a best-fitting model that follows the published orbit. Instead of using trial and error, we use a systematic approach in order to find a model that fits multiple observables simultaneously. As such, we investigate a much wider parameter range of initial conditions and ensure we have found the best match possible. Using a dark matter free progenitor that undergoes tidal disruption, our best-fitting model can simultaneously match the observed luminosity, central surface brightness, effective radius, velocity dispersion, and velocity gradient of Hercules. However, we find it is impossible to reproduce the observed elongation and the position angle of Hercules at the same time in our models. This failure persists even when we vary the duration of the simulation significantly, and consider a more cuspy density distribution for the progenitor. We discuss how this suggests that the published orbit of Hercules is very likely to be incorrect.
Phenotype-driven molecular autopsy for sudden cardiac death.
Cann, F; Corbett, M; O'Sullivan, D; Tennant, S; Hailey, H; Grieve, J H K; Broadhurst, P; Rankin, R; Dean, J C S
2017-01-01
A phenotype-driven approach to molecular autopsy based in a multidisciplinary team comprising clinical and laboratory genetics, forensic medicine and cardiology is described. Over a 13-year period, molecular autopsy was undertaken in 96 sudden cardiac death cases. A total of 46 cases aged 1-40 years had normal hearts and suspected arrhythmic death. Seven (15%) had likely pathogenic variants in ion channelopathy genes [KCNQ1 (1), KCNH2 (4), SCN5A (1), RyR2 (1)]. Fifty cases aged between 2 and 67 years had a cardiomyopathy. Twenty-five had arrhythmogenic right ventricular cardiomyopathy (ARVC), 10 had dilated cardiomyopathy (DCM) and 15 had hypertrophic cardiomyopathy (HCM). Likely pathogenic variants were found in three ARVC cases (12%) in PKP2, DSC2 or DSP, two DCM cases (20%) in MYH7, and four HCM cases (27%) in MYBPC3 (3) or MYH7 (1). Uptake of cascade screening in relatives was higher when a molecular diagnosis was made at autopsy. In three families, variants previously published as pathogenic were detected, but clinical investigation revealed no abnormalities in carrier relatives. With a conservative approach to defining the pathogenicity of sequence variants, incorporating family phenotype information and population genomic data, a molecular diagnosis was made in 15% of sudden arrhythmic deaths and 18% of cardiomyopathy deaths. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Röhm, Martina; Carle, Stefan; Maigler, Frank; Flamm, Johannes; Kramer, Viktoria; Mavoungou, Chrystelle; Schmid, Otmar; Schindowski, Katharina
2017-10-30
Aerosolized administration of biopharmaceuticals to the airways is a promising route for nasal and pulmonary drug delivery, but - in contrast to small molecules - little is known about the effects of aerosolization on the safety and efficacy of biopharmaceuticals. Proteins are sensitive to aerosolization-associated shear stress. Tailored formulations can shield proteins and enhance permeation, but formulation development requires extensive screening approaches. Thus, the aim of this study was to develop a cell-based in vitro technology platform that includes screening of protein quality after aerosolization and transepithelial permeation. For efficient screening, a previously published aerosolization-surrogate assay was used in a design-of-experiments approach to screen suitable formulations for an IgG and its antigen-binding fragment (Fab) as exemplary biopharmaceuticals. Efficient, dose-controlled aerosol-cell delivery was performed with the ALICE-CLOUD system containing RPMI 2650 epithelial cells at the air-liquid interface. We could demonstrate that our technology platform allows rapid and efficient screening of formulations consisting of different excipients (here: arginine, cyclodextrin, polysorbate, sorbitol, and trehalose) to minimize aerosolization-induced protein aggregation and maximize permeation through an in vitro epithelial cell barrier. Formulations reduced aggregation of native Fab and IgG relative to vehicle by up to 50% and enhanced the transepithelial permeation rate up to 2.8-fold. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
Developing a framework for assessment of the environmental determinants of walking and cycling.
Pikora, Terri; Giles-Corti, Billie; Bull, Fiona; Jamrozik, Konrad; Donovan, Rob
2003-04-01
The focus for interventions and research on physical activity has moved away from vigorous activity to moderate-intensity activities, such as walking. In addition, a social ecological approach to physical activity research and practice is recommended. This approach considers the influence of the environment and policies on physical activity. Although there is limited empirical published evidence related to the features of the physical environment that influence physical activity, urban planning and transport agencies have developed policies and strategies that have the potential to influence whether people walk or cycle in their neighbourhood. This paper presents the development of a framework of the potential environmental influences on walking and cycling based on published evidence and policy literature, interviews with experts and a Delphi study. The framework includes four features: functional, safety, aesthetic and destination; as well as the hypothesised factors that contribute to each of these features of the environment. In addition, the Delphi experts determined the perceived relative importance of these factors. Based on these factors, a data collection tool will be developed and the frameworks will be tested through the collection of environmental information on neighbourhoods, where data on the walking and cycling patterns have been collected previously. Identifying the environmental factors that influence walking and cycling will allow the inclusion of a public health perspective as well as those of urban planning and transport in the design of built environments.
Dorafshar, Amir H; Januszyk, Michael; Song, David H
2010-08-01
Techniques for autologous breast reconstruction have evolved to minimize donor-site morbidity and reduce flap-specific complications. When available, the superficial inferior epigastric artery (SIEA) flap represents the optimal method to achieve the former. However, many microsurgeons have been reluctant to adopt this procedure due to technical challenges inherent to the surgery, as well as concerns with the intrinsic capacity of the superficial vessel system to adequately support this flap. This article sets forth a simple approach to SIEA flap harvest and demonstrates that favorable results may be achieved even for small-caliber vessels. A total of 46 patients underwent 53 SIEA breast reconstructions over a 6-year period using a modified approach to pedicle dissection, with arterial inclusion based solely on the presence of a palpable pulse. Average pedicle length harvested for all SIEA flaps was 6.07 cm, and the mean arterial (0.96 mm) and venous (2.27 mm) diameters represent the lowest published values. Three flaps (5.7%) demonstrated fat necrosis or partial flap necrosis, with one (1.9%) complete flap loss. These results compare favorably with those of previous SIEA series employing diameter-based selection criteria, suggesting that the presence of a palpable arterial pulse may be sufficient to permit successful utilization of this flap. (c) Thieme Medical Publishers.
Recommended approaches in the application of ...
ABSTRACT: Only a fraction of chemicals in commerce have been fully assessed for their potential hazards to human health due to the difficulties involved in conventional regulatory tests. It has recently been proposed that quantitative transcriptomic data can be used to determine a benchmark dose (BMD) and estimate a point of departure (POD). Several studies have shown that transcriptional PODs correlate with PODs derived from analysis of pathological changes, but there is no consensus on how the genes used to derive a transcriptional POD should be selected. Because of the very large number of unrelated genes in gene expression data, selecting subsets of informative genes is a major challenge. We used published microarray data from studies on rats exposed orally to multiple doses of six chemicals for 5, 14, 28, and 90 days. We evaluated eight different approaches to selecting genes for POD derivation and compared them to three previously proposed approaches. The transcriptional BMDs derived using these 11 approaches were compared with PODs derived from apical data that might be used in a human health risk assessment. We found that transcriptional benchmark dose values for all 11 approaches were remarkably aligned with different apical PODs, while a subset of between 3 and 8 of the approaches met standard statistical criteria across the 5-, 14-, 28-, and 90-day time points and thus qualify as effective estimates of apical PODs. Our r
Mol, Ben W; Bossuyt, Patrick M; Sunkara, Sesh K; Garcia Velasco, Juan A; Venetis, Christos; Sakkas, Denny; Lundin, Kersti; Simón, Carlos; Taylor, Hugh S; Wan, Robert; Longobardi, Salvatore; Cottell, Evelyn; D'Hooghe, Thomas
2018-06-01
Although most medical treatments are designed for the average patient with a one-size-fits-all approach, they may not benefit all. Better understanding of the function of genes, proteins, and metabolites, and of personal and environmental factors, has led to a call for personalized medicine. Personalized reproductive medicine is still in its infancy, without clear guidance on treatment aspects that could be personalized and on trial design to evaluate personalized treatment effect and benefit-harm balance. While the rationale for a personalized approach often relies on retrospective analyses of large observational studies or real-world data, solid evidence of the superiority of a personalized approach will come from randomized trials comparing outcomes and safety between a personalized and a one-size-fits-all strategy. A more efficient, targeted randomized trial design may recruit only patients or couples for whom the personalized approach would differ from the previous, standard approach. Multiple monocenter studies using the same study protocol (allowing future meta-analysis) might reduce the major center effect associated with multicenter studies. In certain cases, single-arm observational studies can generate the necessary evidence for a personalized approach. This review describes each of the main segments of patient care in assisted reproductive technologies treatment, addressing which aspects could be personalized, emphasizing current evidence and relevant study design. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
The NIEHS Predictive-Toxicology Evaluation Project.
Bristol, D W; Wachsman, J T; Greenwell, A
1996-01-01
The Predictive-Toxicology Evaluation (PTE) project conducts collaborative experiments that subject the performance of predictive-toxicology (PT) methods to rigorous, objective evaluation in a uniquely informative manner. Sponsored by the National Institute of Environmental Health Sciences, it takes advantage of the ongoing testing conducted by the U.S. National Toxicology Program (NTP) to estimate the true error of models that have been applied to make prospective predictions on previously untested, noncongeneric chemical substances. The PTE project first identifies a group of standardized NTP chemical bioassays that are either scheduled to be conducted or already ongoing but not yet complete. The project then announces and advertises the evaluation experiment, disseminates information about the chemical bioassays, and encourages researchers from a wide variety of disciplines to publish their predictions in peer-reviewed journals, using whatever approaches and methods they feel are best. A collection of such papers is published in this Environmental Health Perspectives Supplement, providing readers the opportunity to compare and contrast PT approaches and models, within the context of their prospective application to an actual-use situation. This introduction to this collection of papers on predictive toxicology summarizes the predictions made and the final results obtained for the 44 chemical carcinogenesis bioassays of the first PTE experiment (PTE-1) and presents information that identifies the 30 chemical carcinogenesis bioassays of PTE-2, along with a table of prediction sets that have been published to date. It also provides background about the origin and goals of the PTE project, outlines the special challenge associated with estimating the true error of models that aspire to predict open-system behavior, and summarizes what has been learned to date. PMID:8933048
Hill, Sarah; Amos, Amanda; Clifford, David; Platt, Stephen
2014-11-01
We updated and expanded a previous systematic literature review examining the impact of tobacco control interventions on socioeconomic inequalities in smoking. We searched the academic literature for reviews and primary research articles published between January 2006 and November 2010 that examined the socioeconomic impact of six tobacco control interventions in adults: price increases, smoke-free policies, advertising bans, mass media campaigns, warning labels, smoking cessation support and community-based programmes combining several interventions. We included English-language articles from countries at an advanced stage of the tobacco epidemic that examined the differential impact of tobacco control interventions by socioeconomic status or the effectiveness of interventions among disadvantaged socioeconomic groups. All articles were appraised by two authors and details recorded using a standardised approach. Data from 77 primary studies and seven reviews were synthesised via narrative review. We found strong evidence that increases in tobacco price have a pro-equity effect on socioeconomic disparities in smoking. Evidence on the equity impact of other interventions is inconclusive, with the exception of non-targeted smoking cessation programmes, which have a negative equity impact due to higher quit rates among more advantaged smokers. Increased tobacco price via tax is the intervention with the greatest potential to reduce socioeconomic inequalities in smoking. Other measures studied appear unlikely to reduce inequalities in smoking without specific efforts to reach disadvantaged smokers. There is a need for more research evaluating the equity impact of tobacco control measures, and development of more effective approaches for reducing tobacco use in disadvantaged groups and communities. Published by the BMJ Publishing Group Limited.
O'Leary, F
2003-07-01
To determine whether it is possible to contact authors of previously published papers via email. A cross-sectional study of the Emergency Medicine Journal for 2001. 118 articles were included in the study. The response rate from those with valid email addresses was 73%. There was no statistical difference between the type of email address used and the address being invalid (p=0.392) or between the type of article and the likelihood of a reply (p=0.197). More responses were obtained from work addresses when compared with Hotmail addresses (86% v 57%, p=0.02). Email is a valid means of contacting authors of previously published articles, particularly within the emergency medicine specialty. A work-based email address may be a more valid means of contact than a Hotmail address.
Evaluating revised biomass equations: are some forest types more equivalent than others?
Coeli M. Hoover; James E. Smith
2016-01-01
Background: In 2014, Chojnacky et al. published a revised set of biomass equations for trees of temperate US forests, expanding on an existing equation set (published in 2003 by Jenkins et al.), both of which were developed from published equations using a meta-analytical approach. Given the similarities in the approach to developing the equations, an examination of...
Frank, Till D; Kiyatkin, Anatoly; Cheong, Alex; Kholodenko, Boris N
2017-06-01
Signal integration determines cell fate on the cellular level, affects cognitive processes and affective responses on the behavioural level, and is likely to be involved in psychoneurobiological processes underlying mood disorders. Interactions between stimuli may be subject to time effects. Time-dependencies of interactions between stimuli typically lead to complex cell responses and complex responses on the behavioural level. We show that both three-factor models and time series models can be used to uncover such time-dependencies. However, we argue that for short longitudinal data the three-factor modelling approach is more suitable. In order to illustrate both approaches, we re-analysed previously published short longitudinal data sets. We found that in human embryonic kidney 293 (HEK293) cells the interaction effect in the regulation of extracellular signal-regulated kinase (ERK) 1 signalling activation by insulin and epidermal growth factor is subject to a time effect and dramatically decays at peak values of ERK activation. In contrast, we found that the interaction effect induced by hypoxia and tumour necrosis factor-alpha for the transcriptional activity of the human cyclo-oxygenase-2 promoter in HEK293 cells is time invariant, at least in the first 12-h time window after stimulation. Furthermore, we applied the three-factor model to previously reported animal studies. In these studies, memory storage was found to be subject to an interaction effect of the beta-adrenoceptor agonist clenbuterol and certain antagonists acting on the alpha-1-adrenoceptor / glucocorticoid-receptor system. Our model-based analysis suggests that the interaction effect is relevant only if the antagonist drug is administered within a critical time window. © The authors 2016. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.
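The interaction effect in a 2×2 factorial design at a fixed time point can be sketched as a double difference of mean responses; evaluated at successive time points, a decaying value indicates the kind of time-dependent interaction described above. A minimal illustration (the numbers are hypothetical, not data from the study):

```python
def interaction_effect(resp_ab, resp_a, resp_b, resp_ctrl):
    """Double-difference interaction estimate for a 2x2 factorial design:
    (AB - A) - (B - control). A nonzero value means the two stimuli
    do not combine additively."""
    return resp_ab - resp_a - resp_b + resp_ctrl

# Mean responses (AB, A alone, B alone, control) at two time points;
# the interaction decays from 2.0 to 0.0 between them.
effects = [interaction_effect(ab, a, b, c)
           for ab, a, b, c in [(10.0, 6.0, 5.0, 3.0), (7.0, 6.0, 5.0, 4.0)]]
```

In a three-factor model, time enters as the third factor, so this double difference becomes one slice of a three-way interaction term.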
Huang, R; Agranovski, I; Pyankov, O; Grinshpun, S
2008-04-01
Continuous emission of unipolar ions has been shown to improve the performance of respirators and stationary filters challenged with non-biological particles. In this study, we investigated the ion-induced enhancement effect while challenging a low-efficiency heating, ventilation and air-conditioning (HVAC) filter with viable bacterial cells, bacterial and fungal spores, and viruses. The aerosol concentration was measured in real time. Samples were also collected with a bioaerosol sampler for viable microbial analysis. The removal efficiency of the filter was determined, respectively, with and without an ion emitter. The ionization was found to significantly enhance the filter efficiency in removing viable biological particles from the airflow. For example, when challenged with viable bacteria, the filter efficiency increased as much as four- to fivefold. For viable fungal spores, the ion-induced enhancement improved the efficiency by a factor of approximately 2. When testing with virus-carrying liquid droplets, the original removal efficiency provided by the filter was rather low: 9.09 +/- 4.84%. While the ion emission increased collection about fourfold, the efficiency did not reach the 75-100% observed with bacteria and fungi. These findings, together with our previously published results for non-biological particles, demonstrate the feasibility of a new approach for reducing aerosol particles in HVAC systems used for indoor air quality control. Recirculated air in HVAC systems used for indoor air quality control in buildings often contains a considerable number of viable bioaerosol particles because of the limited efficiency of the filters installed in these systems. In the present study, we investigated - using aerosolized bacterial cells, bacterial and fungal spores, and virus-carrying particles - a novel idea of enhancing the performance of a low-efficiency HVAC filter utilizing continuous emission of unipolar ions in the filter vicinity.
Seaton, Cherisse L; Holm, Nikolai; Bottorff, Joan L; Jones-Bricker, Margaret; Errey, Sally; Caperchione, Cristina M; Lamont, Sonia; Johnson, Steven T; Healy, Theresa
2018-05-01
To explore published empirical literature in order to identify factors that facilitate or inhibit collaborative approaches for health promotion using a scoping review methodology. A comprehensive search of MEDLINE, CINAHL, ScienceDirect, PsycINFO, and Academic Search Complete for articles published between January 2001 and October 2015 was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. To be included, studies had to: be an original research article, published in English, involve at least 2 organizations in a health promotion partnership, and identify factors contributing to or constraining the success of an established (or prior) partnership. Studies were excluded if they focused on primary care collaboration or organizations jointly lobbying for a cause. Data extraction was completed by 2 members of the author team using a summary chart to extract information relevant to the factors that facilitated or constrained collaboration success. NVivo 10 was used to code article content into the thematic categories identified in the data extraction. Twenty-five studies across 8 countries were identified. Several key factors contributed to collaborative effectiveness, including a shared vision, leadership, member characteristics, organizational commitment, available resources, clear roles/responsibilities, trust/clear communication, and engagement of the target population. In general, the findings were consistent with previous reviews; however, additional novel themes did emerge.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henegariu, O; Artan, S; Greally, J M
2003-08-19
Experimental data published in recent years showed that up to 10% of all cases with mild to severe idiopathic mental retardation may result from small rearrangements of the subtelomeric regions of human chromosomes. To detect such cryptic translocations, we developed a "telomeric" multiplex FISH assay, using a set of previously published and commercially available subtelomeric probes. This set of probes includes 41 cosmid/PAC/P1 clones located from less than 100 kb to about 1 Mb from the end of the chromosomes. Similarly, a published mouse probe set, comprised of BACs hybridizing to the closest known marker toward the centromere and telomere of each mouse chromosome, was used to develop a mouse-specific "telomeric" M-FISH. Three different combinatorial labeling strategies were used to simultaneously detect all human subtelomeric regions on one slide. The simplest approach uses only three fluors, and can be performed in laboratories lacking sophisticated imaging equipment or personnel highly trained in cytogenetics. A standard fluorescence microscope equipped with only three filters is sufficient. Fluor-dUTPs and labeled probes can be custom-made, thus dramatically reducing costs. Images can be prepared using generic imaging software (Adobe Photoshop), and analysis performed by simple visual inspection.
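In combinatorial labeling generally, n fluors can distinguish 2^n − 1 probe classes, since each probe carries a non-empty subset of the fluors. A quick sketch of that combinatorics (the fluor names are illustrative, not taken from the study):

```python
from itertools import combinations

def label_combinations(fluors):
    """Enumerate all non-empty subsets of fluors usable as combinatorial labels."""
    combos = []
    for r in range(1, len(fluors) + 1):
        combos.extend(combinations(fluors, r))
    return combos

# Three fluors, as in the simplest strategy described above,
# yield 2**3 - 1 = 7 distinguishable label combinations.
labels = label_combinations(["FITC", "Cy3", "Cy5"])
```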
Eleven loci with new reproducible genetic associations with allergic disease risk.
Ferreira, Manuel A R; Vonk, Judith M; Baurecht, Hansjörg; Marenholz, Ingo; Tian, Chao; Hoffman, Joshua D; Helmer, Quinta; Tillander, Annika; Ullemar, Vilhelmina; Lu, Yi; Rüschendorf, Franz; Hinds, David A; Hübner, Norbert; Weidinger, Stephan; Magnusson, Patrik K E; Jorgenson, Eric; Lee, Young-Ae; Boomsma, Dorret I; Karlsson, Robert; Almqvist, Catarina; Koppelman, Gerard H; Paternoster, Lavinia
2018-04-19
A recent genome-wide association study (GWAS) identified 99 loci that contain genetic risk variants shared between asthma, hay fever, and eczema. Many more risk loci shared between these common allergic diseases remain to be discovered, which could point to new therapeutic opportunities. We sought to identify novel risk loci shared between asthma, hay fever, and eczema by applying a gene-based test of association to results from a published GWAS that included data from 360,838 subjects. We used approximate conditional analysis to adjust the results from the published GWAS for the effects of the top risk variants identified in that study. We then analyzed the adjusted GWAS results with the EUGENE gene-based approach, which combines evidence for association with disease risk across regulatory variants identified in different tissues. Novel gene-based associations were followed up in an independent sample of 233,898 subjects from the UK Biobank study. Of the 19,432 genes tested, 30 had a significant gene-based association at a Bonferroni-corrected P value of 2.5 × 10⁻⁶. Of these, 20 were also significantly associated (P < .05/30 = .0016) with disease risk in the replication sample, including 19 that were located in 11 loci not reported to contain allergy risk variants in previous GWASs. Among these were 9 genes with a known function that is directly relevant to allergic disease: FOSL2, VPRBP, IPCEF1, PRR5L, NCF4, APOBR, IL27, ATXN2L, and LAT. For 4 genes (eg, ATXN2L), a genetically determined decrease in gene expression was associated with decreased allergy risk, and therefore drugs that inhibit gene expression or function are predicted to ameliorate disease symptoms. The opposite directional effect was observed for 14 genes, including IL27, a cytokine known to suppress TH2 responses. Using a gene-based approach, we identified 11 risk loci for allergic disease that were not reported in previous GWASs.
Functional studies that investigate the contribution of the 19 associated genes to the pathophysiology of allergic disease and assess their therapeutic potential are warranted. Copyright © 2018 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
D:L-AMINO Acids and the Turnover of Microbial Biomass
NASA Astrophysics Data System (ADS)
Lomstein, B. A.; Braun, S.; Mhatre, S. S.; Jørgensen, B. B.
2015-12-01
Decades of ocean drilling have demonstrated widespread microbial life in deep sub-seafloor sediment, and surprisingly high microbial cell numbers. Despite the ubiquity of life in the deep biosphere, the large community sizes and the low energy fluxes in this vast buried ecosystem are still poorly understood. It is not known whether organisms of the deep biosphere are specifically adapted to extremely low energy fluxes or whether most of the observed cells are in a maintenance state. Recently we developed and applied a new culture-independent approach - the D:L-amino acid model - to quantify the turnover times of living microbial biomass, microbial necromass and mean metabolic rates. This approach is based on the built-in molecular clock in amino acids, which very slowly undergo chemical racemization until they reach an even mixture of L- and D-forms, unless microorganisms spend energy to keep them in the L-form that dominates in living organisms. The approach combines sensitive analyses of amino acids and the unique bacterial endospore marker (dipicolinic acid) with the racemization dynamics of stereo-isomeric amino acids. Based on a heating experiment, we recently reported kinetic parameters for the racemization of aspartic acid, glutamic acid, serine and alanine in bulk sediment from Aarhus Bay, Denmark. The obtained racemization rate constants were faster than those of free amino acids, which we had previously applied in Holocene sediment from Aarhus Bay and in up to 10-million-year-old sediment from ODP Leg 201. Another important input parameter for the D:L-amino acid model is the cellular carbon content, which has recently been suggested to be lower than previously thought. In recognition of these new findings, previously published data based on the D:L-amino acid model were recalculated and will be presented together with new data from an Arctic Holocene setting with constant sub-zero temperatures.
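For reversible first-order racemization with equal forward and back rate constant k (equilibrium D:L = 1), the ratio follows ln((1 + D/L) / (1 − D/L)) = 2kt, i.e. D/L = tanh(kt). A minimal sketch of the clock idea (the rate constant below is a placeholder, not a measured value from the study):

```python
import math

def dl_ratio(t_years, k):
    """D:L ratio after t years of racemization, from
    ln((1 + D/L) / (1 - D/L)) = 2*k*t  =>  D/L = tanh(k*t)."""
    return math.tanh(k * t_years)

def racemization_age(dl, k):
    """Invert the clock: apparent age implied by a measured D:L ratio."""
    return math.atanh(dl) / k

k_asp = 1e-5  # hypothetical in-sediment rate constant for aspartic acid, yr^-1
```

In the full D:L-amino acid model, measured D:L ratios are combined with rates of biosynthesis, which resets amino acids to the L-form, to infer turnover times of biomass and necromass.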
Closing the loop: from paper to protein annotation using supervised Gene Ontology classification.
Gobeill, Julien; Pasche, Emilie; Vishnyakova, Dina; Ruch, Patrick
2014-01-01
Gene function curation of the literature with Gene Ontology (GO) concepts is a particularly time-consuming task in genomics, and help from bioinformatics is needed to keep up with the flow of publications. In 2004, the first BioCreative challenge already designed a task of automatic GO concept assignment from a full text. At that time, results were judged far from reaching the performance required by real curation workflows. In particular, supervised approaches produced the most disappointing results because of a lack of training data. Ten years later, the available curation data have massively grown. In 2013, the BioCreative IV GO task revisited the automatic GO assignment task. For this challenge, we investigated the power of our supervised classifier, GOCat. GOCat computes similarities between an input text and already curated instances contained in a knowledge base to infer GO concepts. Subtask A consisted of selecting GO evidence sentences for a relevant gene in a full text. For this, we designed a state-of-the-art supervised statistical approach, using a naïve Bayes classifier and the official training set, and obtained fair results. Subtask B consisted of predicting GO concepts from the previous output. For this, we applied GOCat and reached leading results, up to 65% for hierarchical recall in the top 20 outputted concepts. Contrary to previous competitions, machine learning this time outperformed standard dictionary-based approaches. Thanks to BioCreative IV, we were able to design a complete workflow for curation: given a gene name and a full text, this system is able to select evidence sentences for curation and to deliver highly relevant GO concepts. Observed performances are sufficient for use in a real semiautomatic curation workflow. GOCat is available at http://eagl.unige.ch/GOCat/. http://eagl.unige.ch/GOCat4FT/.
© The Author(s) 2014. Published by Oxford University Press.
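The similarity-to-curated-instances idea behind a classifier like GOCat can be sketched as nearest-neighbour retrieval over a knowledge base; this is a toy illustration, not GOCat's actual implementation, and the GO identifiers and texts are invented:

```python
import math
from collections import Counter

def cosine(text_a, text_b):
    """Cosine similarity between bag-of-words term-frequency vectors."""
    ca, cb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    norm = (math.sqrt(sum(v * v for v in ca.values()))
            * math.sqrt(sum(v * v for v in cb.values())))
    return dot / norm if norm else 0.0

def rank_go_terms(query, knowledge_base, k=3):
    """Score GO terms by summed similarity over the k most similar
    curated instances; knowledge_base is a list of (text, [GO terms])."""
    ranked = sorted(knowledge_base,
                    key=lambda item: cosine(query, item[0]), reverse=True)
    totals = {}
    for text, terms in ranked[:k]:
        for term in terms:
            totals[term] = totals.get(term, 0.0) + cosine(query, text)
    return sorted(totals, key=totals.get, reverse=True)

kb = [("kinase mediated phosphorylation signalling", ["GO:0016301"]),
      ("transmembrane transport of solutes", ["GO:0055085"])]
top = rank_go_terms("kinase signalling activity", kb, k=1)
```

Because predictions inherit GO concepts from curated neighbours, such a classifier improves automatically as the curation knowledge base grows, which matches the abstract's point about training data.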
Trends in treatment and outcomes of pediatric craniopharyngioma, 1975–2011
Cohen, Michal; Bartels, Ute; Branson, Helen; Kulkarni, Abhaya V.; Hamilton, Jill
2013-01-01
Background Craniopharyngioma tumors and their treatment can lead to significant long-term morbidity due to their proximity to vital structures. The optimal treatment has been debated for many years. We aimed to review the long-term outcomes of children treated for craniopharyngioma in our institution over the past decade and describe trends in treatment and outcomes over the past 3 decades. Methods Charts of children with craniopharyngioma treated and followed at The Hospital for Sick Children between 2001 and 2011 were reviewed. Data regarding findings at diagnosis, treatment, and long-term outcomes were analyzed. Comparison was made with previously published data from our institution. Results Data from 33 patients are included; mean age at treatment, 10.7 ± 4.8 years. In 18 children (55%), the initial surgical approach was tumor cyst decompression with or without adjuvant therapy, compared with only 0–2% in the preceding decades (P < .01). Diabetes insipidus occurred in 55% of children and panhypopituitarism in 58% compared with 88% (P < .01) and 86% (P < .01), respectively, in the previous 10 years. Overall, there was a 36% reduction in the number of children who developed severe obesity compared with the preceding decade. Body mass index at follow-up was associated with body mass index at diagnosis (P = .004) and tumor resection as an initial treatment approach (P = .028). Conclusions A shift in surgical treatment approach away from gross total resection has led to improved endocrine outcomes. This may have beneficial implications for quality of life in survivors. PMID:23486689
Lorne, Emmanuel; Diouf, Momar; de Wilde, Robert B P; Fischer, Marc-Olivier
2018-02-01
The Bland-Altman (BA) and percentage error (PE) methods have been previously described to assess the agreement between 2 methods of medical or laboratory measurements. This type of approach raises several problems: the BA methodology constitutes a subjective approach to interchangeability, whereas the PE approach does not take into account the distribution of values over a range. We describe a new methodology that defines an interchangeability rate between 2 methods of measurement and cutoff values that determine the range of interchangeable values. We used simulated data and a previously published data set to demonstrate the concept of the method. The interchangeability rate of 5 different cardiac output (CO) pulse contour techniques (Wesseling method, LiDCO, PiCCO, Hemac method, and Modelflow) was calculated, in comparison with the reference pulmonary artery thermodilution CO, using our new method. In our example, Modelflow, with a good interchangeability rate of 93% and a cutoff value of 4.8 L/min, was found to be interchangeable with the thermodilution method for >95% of measurements. Modelflow had a higher interchangeability rate compared to Hemac (93% vs 86%; P = .022) or other monitors (Wesseling cZ = 76%, LiDCO = 73%, and PiCCO = 62%; P < .0001). Simulated data and reanalysis of a data set comparing 5 CO monitors against thermodilution CO showed that, depending on the repeatability of the reference method, the interchangeability rate combined with a cutoff value could be used to define the range of values over which interchangeability remains acceptable.
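For context, the two classical agreement approaches the authors contrast with can be sketched as follows; this is illustrative only (the paper's own interchangeability-rate method is not reproduced here, and the paired readings are made up):

```python
import statistics

def bland_altman(reference, test):
    """Bland-Altman bias and 95% limits of agreement, plus the
    percentage error (1.96*SD of differences / mean reference value)."""
    diffs = [t - r for r, t in zip(reference, test)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    limits = (bias - 1.96 * sd, bias + 1.96 * sd)
    percentage_error = 100.0 * 1.96 * sd / statistics.mean(reference)
    return bias, limits, percentage_error

# Hypothetical paired cardiac output readings (L/min):
# thermodilution reference vs a pulse contour monitor.
ref = [4.5, 5.0, 5.5, 6.0]
pc = [4.7, 4.9, 5.8, 6.1]
bias, loa, pe = bland_altman(ref, pc)
```

Both outputs are single summary numbers over the whole range, which is precisely the limitation the interchangeability-rate approach described above is meant to address.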
A new approach using coagulation rate constant for evaluation of turbidity removal
NASA Astrophysics Data System (ADS)
Al-Sameraiy, Mukheled
2017-06-01
Coagulation-flocculation-sedimentation treatment of three levels of bentonite synthetic turbid water using date seed (DS) and alum (A) coagulants was investigated in previous research. In the current research, the same experimental results were used to develop a new approach that takes the coagulation rate constant as the parameter for identifying the optimum doses of these coagulants. Moreover, the ability of these coagulants to meet the WHO turbidity standard was assessed by introducing a new evaluation criterion in terms of a critical coagulation rate constant (kc). Coagulation rate constants (k2) were calculated from the second-order form of the coagulation process for each coagulant. The doses corresponding to the maximum k2 values were taken as the optimum doses. The proposed criterion for assessing coagulation performance, obtained by expressing the WHO turbidity guideline in the second-order form of the coagulation process, states that k2 for each coagulant should be ≥ kc for each level of synthetic turbid water. For all tested turbid waters, the DS coagulant could not satisfy this criterion, while the A coagulant could. The results obtained in the present research agree with the previously published results in terms of the optimum doses found for each coagulant and the assessment of their performance. Overall, the coagulation rate constant is recommended as a new indicator for identifying optimum doses, and the critical coagulation rate constant as a new criterion for assessing coagulant performance.
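A second-order coagulation model implies 1/N_t = 1/N_0 + k2·t for the particle (turbidity) concentration N, so k2 is the slope of a linear fit of 1/N against time. A minimal sketch with synthetic data (the values are illustrative, not the paper's measurements):

```python
def second_order_rate_constant(times, concentrations):
    """Least-squares slope of 1/N versus t, i.e. k2 in 1/N = 1/N0 + k2*t."""
    inv = [1.0 / n for n in concentrations]
    m = len(times)
    t_mean = sum(times) / m
    y_mean = sum(inv) / m
    num = sum((t - t_mean) * (y - y_mean) for t, y in zip(times, inv))
    den = sum((t - t_mean) ** 2 for t in times)
    return num / den

# Synthetic turbidity decay generated with k2 = 0.05 and N0 = 100 NTU.
times = [0.0, 1.0, 2.0, 3.0, 4.0]
turbidity = [1.0 / (1.0 / 100.0 + 0.05 * t) for t in times]
k2 = second_order_rate_constant(times, turbidity)
```

Under the criterion described above, the fitted k2 for each coagulant dose would then be compared against the critical value kc derived from the WHO turbidity limit.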
Rivas-Santiago, Bruno; Cervantes-Villagrana, Alberto; Sada, Eduardo; Hernández-Pando, Rogelio
2012-05-01
Defensins are low-molecular-weight antimicrobial and immunomodulatory peptides. Their participation against Mycobacterium tuberculosis (MTb) infection has been scarcely studied. We describe the kinetics of murine β-defensin 2 (mBD-2) expression by quantitative real-time PCR and its cellular location by immunohistochemistry in murine models of progressive pulmonary tuberculosis and latent infection. During progressive disease, mBD-2 gene expression peaked at 14 days postinfection, whereas in latent infection it peaked at 90 days. In both models, mBD-2 immunostaining was essentially located in cells with dendritic morphology near mediastinal lymph nodes, which correlated with the previously reported peak of cell-mediated protective immunity in both models. These results suggest that mBD-2 may play a role in the control of bacilli growth by contributing to establishing a Th1 response, acting as a link between innate and adaptive immunity. These data may be used for the development of new vaccine approaches. Copyright © 2012 IMSS. Published by Elsevier Inc. All rights reserved.
Interactive object modelling based on piecewise planar surface patches.
Prankl, Johann; Zillich, Michael; Vincze, Markus
2013-06-01
Detecting elements such as planes in 3D is essential to describe objects for applications such as robotics and augmented reality. While plane estimation is well studied, table-top scenes exhibit a large number of planes and methods often lock onto a dominant plane or do not estimate 3D object structure but only homographies of individual planes. In this paper we introduce MDL to the problem of incrementally detecting multiple planar patches in a scene using tracked interest points in image sequences. Planar patches are reconstructed and stored in a keyframe-based graph structure. In case different motions occur, separate object hypotheses are modelled from currently visible patches and patches seen in previous frames. We evaluate our approach on a standard data set published by the Visual Geometry Group at the University of Oxford [24] and on our own data set containing table-top scenes. Results indicate that our approach significantly improves over the state-of-the-art algorithms.
Towards the estimation of effect measures in studies using respondent-driven sampling.
Rotondi, Michael A
2014-06-01
Respondent-driven sampling (RDS) is an increasingly common sampling technique to recruit hidden populations. Statistical methods for RDS are not straightforward due to the correlation between individual outcomes and subject weighting; thus, analyses are typically limited to estimation of population proportions. This manuscript applies the method of variance estimates recovery (MOVER) to construct confidence intervals for effect measures such as risk difference (difference of proportions) or relative risk in studies using RDS. To illustrate the approach, MOVER is used to construct confidence intervals for differences in the prevalence of demographic characteristics between an RDS study and convenience study of injection drug users. MOVER is then applied to obtain a confidence interval for the relative risk between education levels and HIV seropositivity and current infection with syphilis, respectively. This approach provides a simple method to construct confidence intervals for effect measures in RDS studies. Since it only relies on a proportion and appropriate confidence limits, it can also be applied to previously published manuscripts.
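The MOVER construction for a difference of proportions can be sketched directly from its closed form. The code below is an illustration with hypothetical counts, not the cited study's data; a real RDS analysis would first apply RDS weights to the proportions and their confidence limits before recovering the variance estimates:

```python
import math

# Sketch of the MOVER (method of variance estimates recovery) interval for a
# difference of two proportions, in the general Newcombe/Zou square-and-add
# form. Counts below are hypothetical, not from the cited RDS study.

def wilson_ci(x, n, z=1.959964):
    """Wilson score interval for a single proportion x/n."""
    p = x / n
    centre = (p + z * z / (2 * n)) / (1 + z * z / n)
    half = (z / (1 + z * z / n)) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

def mover_diff(x1, n1, x2, n2, z=1.959964):
    """MOVER confidence interval for p1 - p2, built from the two Wilson CIs."""
    p1, p2 = x1 / n1, x2 / n2
    l1, u1 = wilson_ci(x1, n1, z)
    l2, u2 = wilson_ci(x2, n2, z)
    lower = (p1 - p2) - math.sqrt((p1 - l1) ** 2 + (u2 - p2) ** 2)
    upper = (p1 - p2) + math.sqrt((u1 - p1) ** 2 + (p2 - l2) ** 2)
    return lower, upper

# Hypothetical prevalences of 0.80 vs 0.60 in two recruited samples:
lo, hi = mover_diff(56, 70, 48, 80)
```

Because the interval is assembled from each proportion's own confidence limits, the same recipe applies to previously published results that report only a proportion and its CI, as the abstract notes.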
Feasibility study for clinical application of caspase-3 inhibitors in Pemphigus vulgaris.
Hariton, William V J; Galichet, Arnaud; Vanden Berghe, Tom; Overmiller, Andrew M; Mahoney, My G; Declercq, Wim; Müller, Eliane J
2017-12-01
The potentially severe side effects of systemic corticosteroids and immunosuppressants used in Pemphigus vulgaris (PV) call for novel therapeutic approaches. In this context, pharmacological inhibition of major pathogenic signalling effectors represents a promising alternative. However, we have also shown that overinhibition of effectors required for epidermal homeostasis can exacerbate PV pathophysiology implicating transepidermal keratinocyte fragility. A feedforward target validation therefore preferentially includes studies on knockout mouse models. We previously reported on successful amelioration of PV blisters following inhibition of non-apoptotic, low-level caspase-3. Here, we use conditional, keratinocyte-specific caspase-3-deficient mice (casp3EKO) to demonstrate (i) absence of keratinocyte fragility upon injection of the potent Dsg3-specific antibody AK23 and (ii) amelioration of blistering on the background of known signalling effectors. Our results provide the experimental proof of concept justifying translation of the caspase-3 inhibitor approach into PV clinical trials. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Veselkov, Alexei N.; Evstigneev, Maxim P.; Veselkov, Dennis A.; Davies, David B.
2001-08-01
A general nuclear magnetic resonance analysis of a statistical-thermodynamical model of hetero-association of aromatic molecules in solution has been developed to take "edge effects" into consideration, i.e., the dependence of proton chemical shifts on the position of the molecule situated inside or at the edge of the aggregate. This generalized approach is compared with a previously published model, where an average contribution to proton shielding is considered irrespective of the position of the molecule in the stack. Association parameters have been determined from experimental concentration and temperature dependences of 500 MHz proton chemical shifts of the hetero-association of the acridine dye, proflavine, and the phenanthridinium dye, ethidium bromide, in aqueous solution. Differences in the parameters in the range 10%-30% calculated using the basic and generalized approaches have been found to depend substantially on the magnitude of the equilibrium hetero-association constant Khet—the larger the value of Khet, the higher the discrepancy between the two methods.
Aarons, Gregory A; Fettes, Danielle L; Sommerfeld, David H; Palinkas, Lawrence A
2012-02-01
Many public sector service systems and provider organizations are in some phase of learning about or implementing evidence-based interventions. Child welfare service systems represent a context where implementation spans system, management, and organizational concerns. Research utilizing mixed methods that combine qualitative and quantitative design, data collection, and analytic approaches are particularly well suited to understanding both the process and outcomes of dissemination and implementation efforts in child welfare systems. This article describes the process of using mixed methods in implementation research and provides an applied example of an examination of factors impacting staff retention during an evidence-based intervention implementation in a statewide child welfare system. The authors integrate qualitative data with previously published quantitative analyses of job autonomy and staff turnover during this statewide implementation project in order to illustrate the utility of mixed method approaches in providing a more comprehensive understanding of opportunities and challenges in implementation research.
Fulcher, Ben D; Jones, Nick S
2017-11-22
Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
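The feature-based idea behind hctsa can be illustrated with a toy sketch. hctsa itself computes over 7,700 features; the three hand-rolled statistics below are hypothetical stand-ins for that library, shown only to make the "time series → interpretable feature vector" step concrete:

```python
import math

# Minimal, hypothetical sketch of feature-based time-series analysis:
# map each series to a vector of interpretable summary statistics, then
# look for features that separate labelled groups. These three features
# are illustrative stand-ins, not hctsa code.

def features(x):
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    # Lag-1 autocorrelation: near +1 for slowly varying series,
    # near -1 for series that flip sign every sample.
    ac1 = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1)) / (n * var)
    return {"mean": mean, "std": math.sqrt(var), "ac1": ac1}

smooth = [math.sin(0.1 * i) for i in range(200)]  # slowly varying dynamics
alternating = [(-1) ** i for i in range(200)]     # rapidly varying dynamics

f_smooth = features(smooth)
f_alt = features(alternating)
# The lag-1 autocorrelation alone already separates the two regimes.
```

Scaling this idea from three features to thousands, and then automatically ranking which features discriminate between experimental groups, is what the hctsa tool described above provides.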
Albariño, César G; Guerrero, Lisa Wiggleton; Chakrabarti, Ayan K; Kainulainen, Markus H; Whitmer, Shannon L M; Welch, Stephen R; Nichol, Stuart T
2016-09-01
During the large outbreak of Ebola virus disease that occurred in Western Africa from late 2013 to early 2016, several hundred Ebola virus (EBOV) genomes were sequenced and the viral genetic drift analyzed. In a previous report, we described an efficient reverse genetics system designed to generate recombinant EBOV based on a Makona variant isolate obtained in 2014. Using this system, we characterized the replication and fitness of 2 isolates of the Makona variant. These virus isolates are nearly identical at the genetic level but have single amino acid differences in the VP30 and L proteins. The potential effects of these differences were tested using minigenomes and recombinant viruses. The results obtained with this approach are consistent with the role of VP30 and L as components of the EBOV RNA replication machinery. Moreover, the 2 isolates exhibited clear fitness differences in competitive growth assays. Published by Elsevier Inc.
Floden, Evan W; Tommaso, Paolo D; Chatzou, Maria; Magis, Cedrik; Notredame, Cedric; Chang, Jia-Ming
2016-07-08
The PSI/TM-Coffee web server performs multiple sequence alignment (MSA) of proteins by combining homology extension with a consistency based alignment approach. Homology extension is performed with Position Specific Iterative (PSI) BLAST searches against a choice of redundant and non-redundant databases. The main novelty of this server is to allow databases of reduced complexity to rapidly perform homology extension. This server also gives the possibility to use transmembrane proteins (TMPs) reference databases to allow even faster homology extension on this important category of proteins. Aside from an MSA, the server also outputs topological prediction of TMPs using the HMMTOP algorithm. Previous benchmarking of the method has shown this approach outperforms the most accurate alignment methods such as MSAProbs, Kalign, PROMALS, MAFFT, ProbCons and PRALINE™. The web server is available at http://tcoffee.crg.cat/tmcoffee. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Foch, Eric; Milner, Clare E
2014-01-03
Iliotibial band syndrome (ITBS) is a common knee overuse injury among female runners. Atypical discrete trunk and lower extremity biomechanics during running may be associated with the etiology of ITBS. Examining discrete data points limits the interpretation of a waveform to a single value. Characterizing entire kinematic and kinetic waveforms may provide additional insight into biomechanical factors associated with ITBS. Therefore, the purpose of this cross-sectional investigation was to determine whether female runners with previous ITBS exhibited differences in kinematics and kinetics compared to controls using a principal components analysis (PCA) approach. Forty participants comprised two groups: previous ITBS and controls. Principal component scores were retained for the first three principal components and were analyzed using independent t-tests. The retained principal components accounted for 93-99% of the total variance within each waveform. Runners with previous ITBS exhibited low principal component one scores for frontal plane hip angle. Principal component one accounted for the overall magnitude in hip adduction which indicated that runners with previous ITBS assumed less hip adduction throughout stance. No differences in the remaining retained principal component scores for the waveforms were detected among groups. A smaller hip adduction angle throughout the stance phase of running may be a compensatory strategy to limit iliotibial band strain. This running strategy may have persisted after ITBS symptoms subsided. © 2013 Published by Elsevier Ltd.
Goekoop, Rutger; Goekoop, Jaap G
2014-01-01
The vast number of psychopathological syndromes that can be observed in clinical practice can be described in terms of a limited number of elementary syndromes that are differentially expressed. Previous attempts to identify elementary syndromes have shown limitations that have slowed progress in the taxonomy of psychiatric disorders. To examine the ability of network community detection (NCD) to identify elementary syndromes of psychopathology and move beyond the limitations of current classification methods in psychiatry. 192 patients with unselected mental disorders were tested on the Comprehensive Psychopathological Rating Scale (CPRS). Principal component analysis (PCA) was performed on the bootstrapped correlation matrix of symptom scores to extract the principal component structure (PCS). An undirected and weighted network graph was constructed from the same matrix. Network community structure (NCS) was optimized using a previously published technique. In the optimal network structure, network clusters showed a 89% match with principal components of psychopathology. Some 6 network clusters were found, including "Depression", "Mania", "Anxiety", "Psychosis", "Retardation", and "Behavioral Disorganization". Network metrics were used to quantify the continuities between the elementary syndromes. We present the first comprehensive network graph of psychopathology that is free from the biases of previous classifications: a 'Psychopathology Web'. Clusters within this network represent elementary syndromes that are connected via a limited number of bridge symptoms. Many problems of previous classifications can be overcome by using a network approach to psychopathology.
Belgiu, Mariana; Dr Guţ, Lucian
2014-10-01
Although multiresolution segmentation (MRS) is a powerful technique for dealing with very high resolution imagery, some of the image objects that it generates do not match the geometries of the target objects, which reduces the classification accuracy. MRS can, however, be guided to produce results that approach the desired object geometry using either supervised or unsupervised approaches. Although some studies have suggested that a supervised approach is preferable, there has been no comparative evaluation of these two approaches. Therefore, in this study, we have compared supervised and unsupervised approaches to MRS. One supervised and two unsupervised segmentation methods were tested on three areas using QuickBird and WorldView-2 satellite imagery. The results were assessed using both segmentation evaluation methods and an accuracy assessment of the resulting building classifications. Thus, differences in the geometries of the image objects and in the potential to achieve satisfactory thematic accuracies were evaluated. The two approaches yielded remarkably similar classification results, with overall accuracies ranging from 82% to 86%. The performance of one of the unsupervised methods was unexpectedly similar to that of the supervised method; they identified almost identical scale parameters as being optimal for segmenting buildings, resulting in very similar geometries for the resulting image objects. The second unsupervised method produced very different image objects from the supervised method, but their classification accuracies were still very similar. The latter result was unexpected because, contrary to previously published findings, it suggests a high degree of independence between the segmentation results and classification accuracy. The results of this study have two important implications. 
The first is that object-based image analysis can be automated without sacrificing classification accuracy, and the second is that the previously accepted idea that classification is dependent on segmentation is challenged by our unexpected results, casting doubt on the value of pursuing 'optimal segmentation'. Our results rather suggest that as long as under-segmentation remains at acceptable levels, imperfections in segmentation can be ruled out, so that a high level of classification accuracy can still be achieved.
Retracted articles in surgery journals. What are surgeons doing wrong?
Cassão, Bruna Dell'Acqua; Herbella, Fernando A M; Schlottmann, Francisco; Patti, Marco G
2018-06-01
Retraction of previously published scientific articles is an important mechanism to preserve the integrity of scientific work. This study analyzed retractions of previously published articles from surgery journals. We searched for retracted articles in the 100 surgery journals with the highest SJR2 indicator grades. We found 130 retracted articles in 49 journals (49%). Five or more retracted articles were published in 8 journals (8%). The mean time between publication and retraction was 26 months (range 1 to 158 months). The United States, China, Germany, Japan, and the United Kingdom accounted for more than 3 out of 4 of the retracted articles. The greatest number of retractions came from manuscripts about orthopedics and traumatology, general surgery, anesthesiology, cardiothoracic surgery, and plastic surgery. Nonsurgeons were responsible for 16% of retractions in these surgery journals. The main reasons for retraction were duplicate publication (42%), plagiarism (16%), absence of proven integrity of the study (14%), incorrect data (13%), data published without authorization (12%), violation of research ethics (11%), documented fraud (11%), request of an author(s) (5%), and unknown (3%). In 25% of the retracted articles, other publications by the same authors also had been retracted. Retraction of published articles does not occur frequently in surgery journals. Some form of scientific misconduct was present in the majority of retractions, especially duplication of publication and plagiarism. Retractions of previously published articles were most frequent from countries with the greatest number of publications; some authors showed recidivism. Copyright © 2018 Elsevier Inc. All rights reserved.
Chern, Alexander; Hunter, Jacob B; Bennett, Marc L
2017-01-01
To determine if cranioplasty techniques following translabyrinthine approaches to the cerebellopontine angle are cost-effective. Retrospective case series. One hundred eighty patients with available financial data who underwent translabyrinthine approaches at a single academic referral center between 2005 and 2015. Intervention: cranioplasty with a dural substitute, layered fat graft, and a resorbable mesh plate secured with screws. Main outcome measures: billing data were obtained for each patient's hospital course for translabyrinthine approaches and postoperative cerebrospinal fluid (CSF) leaks. One hundred nineteen patients underwent translabyrinthine approaches with an abdominal fat graft closure, with a median cost of $25,759.89 (range, $15,885.65-$136,433.07). Sixty-one patients underwent translabyrinthine approaches with a dural substitute, abdominal fat graft, and a resorbable mesh for closure, with a median cost of $29,314.97 (range, $17,674.28-$111,404.55). The median cost of a CSF leak was $50,401.25 (range, $0-$384,761.71). The additional cost of a CSF leak, when shared by all patients who underwent translabyrinthine approaches, is $6,048.15. The addition of a dural substitute and a resorbable mesh plate after translabyrinthine approaches reduced the CSF leak rate from 12% to 1.9%, an 84.2% reduction, and yielded a median savings per patient of $2,932.23. Applying our cohort's billing data to previously published cranioplasty techniques, costs, and leak rate improvements after translabyrinthine approaches, all techniques were found to be cost-effective. Resorbable mesh cranioplasty is cost-effective at reducing CSF leaks after translabyrinthine approaches. Per our billing data and achieving the same CSF leak rate, cranioplasty costs exceeding $5,090.53 are not cost-effective.
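The cost-effectiveness threshold quoted in this abstract follows from simple expected-value arithmetic on the figures stated above, reproduced here as a worked example:

```python
# Worked arithmetic from the figures quoted in the abstract: the expected
# CSF-leak cost shared across all patients, and the break-even cranioplasty
# cost implied by the observed leak-rate reduction.
median_leak_cost = 50401.25   # median cost of managing one CSF leak ($)
rate_without = 0.12           # leak rate with fat graft closure alone
rate_with = 0.019             # leak rate with dural substitute + resorbable mesh

# Expected leak cost per patient at the baseline rate (the $6,048.15 figure):
shared_cost = rate_without * median_leak_cost

# Expected per-patient savings from the lower leak rate: any cranioplasty
# technique costing less than this is cost-effective by this criterion
# (the abstract's $5,090.53 threshold).
break_even = (rate_without - rate_with) * median_leak_cost
```

This makes explicit why the threshold is sensitive to both the leak-rate reduction achieved and the local cost of managing a leak.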
37 CFR 381.10 - Cost of living adjustment.
Code of Federal Regulations, 2011 CFR
2011-07-01
... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2006, to the most recent Index published prior to December 1, 2007... the cost of living during the period from the most recent index published prior to the previous notice...
37 CFR 253.10 - Cost of living adjustment.
Code of Federal Regulations, 2013 CFR
2013-07-01
... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2002, to the most recent Index published prior to December 1, 2003... cost of living during the period from the most recent index published prior to the previous notice, to...
37 CFR 381.10 - Cost of living adjustment.
Code of Federal Regulations, 2013 CFR
2013-07-01
... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2012, to the most recent Index published prior to December 1, 2013... change in the cost of living during the period from the most recent index published prior to the previous...
37 CFR 381.10 - Cost of living adjustment.
Code of Federal Regulations, 2010 CFR
2010-07-01
... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2006, to the most recent Index published prior to December 1, 2007... the cost of living during the period from the most recent index published prior to the previous notice...
37 CFR 381.10 - Cost of living adjustment.
Code of Federal Regulations, 2014 CFR
2014-07-01
... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2012, to the most recent Index published prior to December 1, 2013... change in the cost of living during the period from the most recent index published prior to the previous...
37 CFR 253.10 - Cost of living adjustment.
Code of Federal Regulations, 2014 CFR
2014-07-01
... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2002, to the most recent Index published prior to December 1, 2003... cost of living during the period from the most recent index published prior to the previous notice, to...
37 CFR 253.10 - Cost of living adjustment.
Code of Federal Regulations, 2011 CFR
2011-07-01
... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2002, to the most recent Index published prior to December 1, 2003... cost of living during the period from the most recent index published prior to the previous notice, to...
37 CFR 253.10 - Cost of living adjustment.
Code of Federal Regulations, 2012 CFR
2012-07-01
... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2002, to the most recent Index published prior to December 1, 2003... cost of living during the period from the most recent index published prior to the previous notice, to...
37 CFR 381.10 - Cost of living adjustment.
Code of Federal Regulations, 2012 CFR
2012-07-01
... determined by the Consumer Price Index (all consumers, all items) during the period from the most recent Index published prior to December 1, 2006, to the most recent Index published prior to December 1, 2007... the cost of living during the period from the most recent index published prior to the previous notice...
[Atherogenic dyslipidemia and residual risk. State of the art in 2014].
Millán Núñez-Cortés, Jesús; Pedro-Botet Montoya, Juan; Pintó Sala, Xavier
2014-01-01
The pandemics of metabolic syndrome, obesity, and type 2 diabetes are a major challenge for the coming years and underlie the great burden of cardiovascular disease. The R3i (Residual Risk Reduction Initiative) has previously highlighted atherogenic dyslipidaemia as an important and modifiable contributor to the lipid-related residual cardiovascular risk. Atherogenic dyslipidaemia is defined as an imbalance between proatherogenic triglyceride-rich apoB-containing lipoproteins and antiatherogenic apoAI-containing lipoproteins. Improving the clinical management of atherogenic dyslipidaemia requires, in addition to lifestyle intervention, a pharmacological approach; fibrates are the main option for combination with a statin to further reduce non-HDL cholesterol. Copyright © 2014 Sociedad Española de Arteriosclerosis. Published by Elsevier España. All rights reserved.
Wave propagation in embedded inhomogeneous nanoscale plates incorporating thermal effects
NASA Astrophysics Data System (ADS)
Ebrahimi, Farzad; Barati, Mohammad Reza; Dabbagh, Ali
2018-04-01
In this article, an analytical approach is developed to study the effects of thermal loading on the wave propagation characteristics of an embedded functionally graded (FG) nanoplate based on refined four-variable plate theory. The heat conduction equation is solved to derive the nonlinear temperature distribution across the thickness. Temperature-dependent material properties of nanoplate are graded using Mori-Tanaka model. The nonlocal elasticity theory of Eringen is introduced to consider small-scale effects. The governing equations are derived by the means of Hamilton's principle. Obtained frequencies are validated with those of previously published works. Effects of different parameters such as temperature distribution, foundation parameters, nonlocal parameter, and gradient index on the wave propagation response of size-dependent FG nanoplates have been investigated.
Multiple sclerosis lesion segmentation using an automatic multimodal graph cuts.
García-Lorenzo, Daniel; Lecoeur, Jeremy; Arnold, Douglas L; Collins, D Louis; Barillot, Christian
2009-01-01
Graph Cuts have been shown to be a powerful interactive segmentation technique in several medical domains. We propose to automate the Graph Cuts in order to automatically segment Multiple Sclerosis (MS) lesions in MRI. We replace the manual interaction with a robust EM-based approach in order to discriminate between MS lesions and the Normal Appearing Brain Tissues (NABT). Evaluation is performed on synthetic and real images, showing good agreement between the automatic segmentation and the target segmentation. We compare our algorithm with state-of-the-art techniques and with several manual segmentations. An advantage of our algorithm over previously published ones is the possibility of semi-automatically improving the segmentation thanks to the interactive feature of Graph Cuts.
Purser, Gemma; Rochelle, Christopher A; Wallis, Humphrey C; Rosenqvist, Jörgen; Kilpatrick, Andrew D; Yardley, Bruce W D
2014-08-01
A novel titanium reaction cell has been constructed for the study of water-rock-CO2 reactions. The reaction cell has been used within a direct-sampling rocking autoclave and offers certain advantages over traditional "flexible gold/titanium cell" approaches. The main advantage is robustness, as flexible cells are prone to rupture on depressurisation during gas-rich experiments. The reaction cell was tested in experiments during an inter-laboratory comparison study, in which mineral kinetic data were determined. The cell performed well during experiments up to 130 °C and 300 bars pressure. The data obtained were similar to those of other laboratories participating in the study, and also to previously published data.
Whitmore, Henschke, and Hilaris: The reorientation of prostate brachytherapy (1970-1987).
Aronowitz, Jesse N
2012-01-01
Urologists had performed prostate brachytherapy for decades before New York's Memorial Hospital retropubic program. This paper explores the contribution of Willet Whitmore, Ulrich Henschke, Basil Hilaris, and Memorial's physicists to the evolution of the procedure. Literature review and interviews with program participants. More than 1000 retropubic implants were performed at Memorial between 1970 and 1987. Unlike previous efforts, Memorial's program benefited from the participation of three disciplines in its conception and execution. Memorial's retropubic program was a collaboration of urologists, radiation therapists, and physicists. Their approach focused greater attention on dosimetry and radiation safety, and served as a template for subsequent prostate brachytherapy programs. Copyright © 2012 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
Kormes, Diego J.; Cortón, Eduardo
2009-01-01
Whereas biosensors have been usually proposed as analytical tools, used to investigate the surrounding media pursuing an analytical answer, we have used a biosensor-like device to characterize the microbial cells immobilized on it. We have studied the kinetics of transport and degradation of glucose at different concentrations and temperatures. When glucose concentrations of 15 and 1.5 mM were assayed, calculated activation energies were 25.2 and 18.4 kcal mol−1, respectively, in good agreement with previously published data. The opportunity and convenience of using Arrhenius plots to estimate the activation energy in metabolic-related processes is also discussed. PMID:22573975
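The activation-energy estimate referred to in this abstract follows the Arrhenius relation. Below is a minimal sketch using hypothetical rate constants chosen to be consistent with the reported 25.2 kcal/mol value; the study itself derived its values from full Arrhenius plots (ln k versus 1/T) rather than a two-point estimate:

```python
import math

# Two-point Arrhenius sketch: Ea = R * ln(k2/k1) / (1/T1 - 1/T2).
# The rate constants below are hypothetical, constructed to match an assumed
# Ea; the paper reports Ea of 25.2 and 18.4 kcal/mol for 15 and 1.5 mM
# glucose from full Arrhenius plots.
R_KCAL = 1.987204e-3  # gas constant, kcal mol^-1 K^-1

def activation_energy(k1, T1, k2, T2):
    """Two-point Arrhenius activation energy in kcal/mol (T in kelvin)."""
    return R_KCAL * math.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)

# Hypothetical rates at 20 °C and 30 °C consistent with Ea = 25.2 kcal/mol:
Ea_target = 25.2
T1, T2 = 293.15, 303.15
k1 = 1.0
k2 = k1 * math.exp(-Ea_target / R_KCAL * (1.0 / T2 - 1.0 / T1))

Ea_est = activation_energy(k1, T1, k2, T2)
```

Fitting ln k against 1/T over many temperatures, as in an Arrhenius plot, gives the same slope (-Ea/R) but with an error estimate, which is why the abstract discusses the convenience of that approach.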
Endoscopic management of peripancreatic fluid collections.
Goyal, Jatinder; Ramesh, Jayapal
2015-07-01
Peripancreatic fluid collections are a well-known complication of pancreatitis and can vary from fluid-filled collections to entirely necrotic collections. Although most of the fluid-filled pseudocysts tend to resolve spontaneously with conservative management, intervention is necessary in symptomatic patients. Open surgery has been the traditional treatment modality of choice though endoscopic, laparoscopic and transcutaneous techniques offer alternative drainage approaches. During the last decade, improvement in endoscopic ultrasound technology has enabled real-time access and drainage of fluid collections that were previously not amenable to blind transmural drainage. This has initiated a trend towards use of this modality for treatment of pseudocysts. In this review, we have summarised the existing evidence for endoscopic drainage of peripancreatic fluid collections from published studies.
Light curve variations of the eclipsing binary V367 Cygni
NASA Astrophysics Data System (ADS)
Akan, M. C.
1987-07-01
The long-period eclipsing binary star V367 Cygni has been observed photoelectrically in two colours, B and V, in 1984, 1985, and 1986. These new light curves of the system are discussed and compared, with respect to light variability, with the earlier ones presented by Heiser (1962). Using some of the previously published photoelectric light curves together with the present ones, several primary minimum times have been derived to calculate the light elements. Any attempt to obtain a photometric solution of the binary is complicated by the peculiar nature of the light curve, caused by the presence of circumstellar matter in the system. Despite this difficulty, several approaches to solving the light curves are being carried out and are briefly discussed.
Scattering from randomly oriented circular discs with application to vegetation
NASA Technical Reports Server (NTRS)
Karam, M. A.; Fung, A. K.
1984-01-01
A vegetation layer is modeled by a collection of randomly oriented circular discs over a half space. The backscattering coefficient from such a half space is computed using the radiative transfer theory. It is shown that significantly different results are obtained from this theory as compared with some earlier investigations using the same modeling approach but with restricted disc orientations. In particular, the backscattered cross polarized returns cannot have a fast increasing angular trend which is inconsistent with measurements. By setting the appropriate angle of orientation to zero the theory reduces to previously published results. Comparisons are shown with measurements taken from milo, corn and wheat and good agreements are obtained for both polarized and cross polarized returns.
NASA Astrophysics Data System (ADS)
Kleshnin, Mikhail; Orlova, Anna; Kirillin, Mikhail; Golubiatnikov, German; Turchin, Ilya
2017-07-01
A new approach to the optical measurement of blood oxygen saturation was developed and implemented. This technique is based on an original three-stage algorithm for reconstructing the relative concentration of biological chromophores (hemoglobin, water, lipids) from the measured spectra of diffusely scattered light at different distances from the probing radiation source. Numerical experiments and testing of the proposed technique on a biological phantom have shown high reconstruction accuracy and the possibility of correct calculation of hemoglobin oxygenation in the presence of additive noise and calibration errors. The results of animal studies agree with previously published results of other research groups and demonstrate the possibility of applying the developed technique to monitoring oxygen saturation in tumor tissue.
Design solutions for the solar cell interconnect fatigue fracture problem
NASA Technical Reports Server (NTRS)
Mon, G. R.; Ross, R. G., Jr.
1982-01-01
Mechanical fatigue of solar cell interconnects is a major failure mechanism in photovoltaic arrays. A comprehensive approach to the reliability design of interconnects, together with extensive design data for the fatigue properties of copper interconnects, has been published. This paper extends the previous work, developing failure prediction (fatigue) data for additional interconnect material choices, including aluminum and a variety of copper-Invar and copper-steel claddings. An improved global fatigue function is used to model the probability-of-failure statistics of each material as a function of level and number of cycles of applied strain. Life-cycle economic analyses are used to evaluate the relative merits of each material choice. The copper-Invar clad composites demonstrate superior performance over pure copper. Aluminum results are disappointing.
Moran, James; Alexander, Thomas; Aalseth, Craig; Back, Henning; Mace, Emily; Overman, Cory; Seifert, Allen; Freeburg, Wilcox
2017-08-01
Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water, and demonstrate the accuracy of these methods on a suite of standardized water samples. We identify a current quantification limit of 92.2 TU which, combined with our small sample sizes, correlates to as little as 0.00133 Bq of total T activity. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of both natural and artificial T behavior in the environment. Copyright © 2017. Published by Elsevier Ltd.
An update on 'dose calibrator' settings for nuclides used in nuclear medicine.
Bergeron, Denis E; Cessna, Jeffrey T
2018-06-01
Most clinical measurements of radioactivity, whether for therapeutic or imaging nuclides, rely on commercial re-entrant ionization chambers ('dose calibrators'). The National Institute of Standards and Technology (NIST) maintains a battery of representative calibrators and works to link calibration settings ('dial settings') to primary radioactivity standards. Here, we provide a summary of NIST-determined dial settings for 22 radionuclides. We collected previously published dial settings and determined some new ones using either the calibration curve method or the dialing-in approach. The dial settings with their uncertainties are collected in a comprehensive table. In general, current manufacturer-provided calibration settings give activities that agree with National Institute of Standards and Technology standards to within a few percent.
Levy, Barry S; Nassetta, William J
2011-01-01
In April 2010, an explosion on an oil rig in the Gulf of Mexico killed 11 workers, injured 17 workers, and spilled an estimated 185 million gallons of crude oil into the Gulf. Adverse effects on the health of cleanup workers, fishermen, and others as well as on the ecosystem are being studied. This paper reviews published studies of the adverse health effects due to previous oil spills. Acute effects have included: respiratory, eye, and skin symptoms; headache; nausea; dizziness; and tiredness or fatigue. Chronic effects have included: psychological disorders, respiratory disorders, genotoxic effects, and endocrine abnormalities. We also present a systematic approach to evaluating individuals exposed to oil spills.
Hurlock-Chorostecki, Christina; Forchuk, Cheryl; Orchard, Carole; van Soeren, Mary; Reeves, Scott
2014-05-01
Nurse practitioners (NP) are employed within hospital interprofessional (IP) teams in several countries worldwide. There have been some efforts to describe the nature of the NP role within IP teams largely focussing on how the role may augment care processes. Here, using a constructivist grounded theory approach, the perceptions of NPs about their role were compared and integrated into a previously published team perspective as the second phase of a larger study. Seventeen hospital-based (HB) NPs across Ontario, Canada, participated in group and individual interviews. The NP perspective substantiated and expanded the previously reported team perspective, resulting in an IP perspective. The three practice foci illustrating role value meaning of this perspective became: evolve NP role and advance the specialty, focus on team working, and hold patient care together. The IP perspective, juxtaposed with an existing contingency approach, revealed that NPs were promoting IP work, predominantly at the collaboration and teamwork levels, and aiding IP team transitions to appropriate forms of IP work. The practice "focus on team working" was strongly related to promoting IP work. The findings were consistent with HB NPs enacting a role in building IP team cohesiveness rather than merely acting as a labour saver. This is the first study to align NP and team understanding of HB NP role value using an IP framework.
Limb-Enhancer Genie: An accessible resource of accurate enhancer predictions in the developing limb
Monti, Remo; Barozzi, Iros; Osterwalder, Marco; ...
2017-08-21
Epigenomic mapping of enhancer-associated chromatin modifications facilitates the genome-wide discovery of tissue-specific enhancers in vivo. However, reliance on single chromatin marks leads to high rates of false-positive predictions. More sophisticated, integrative methods have been described, but commonly suffer from limited accessibility to the resulting predictions and reduced biological interpretability. Here we present the Limb-Enhancer Genie (LEG), a collection of highly accurate, genome-wide predictions of enhancers in the developing limb, available through a user-friendly online interface. We predict limb enhancers using a combination of > 50 published limb-specific datasets and clusters of evolutionarily conserved transcription factor binding sites, taking advantage of the patterns observed at previously in vivo validated elements. By combining different statistical models, our approach outperforms current state-of-the-art methods and provides interpretable measures of feature importance. Our results indicate that including a previously unappreciated score that quantifies tissue-specific nuclease accessibility significantly improves prediction performance. We demonstrate the utility of our approach through in vivo validation of newly predicted elements. Moreover, we describe general features that can guide the type of datasets to include when predicting tissue-specific enhancers genome-wide, while providing an accessible resource to the general biological community and facilitating the functional interpretation of genetic studies of limb malformations.
Srinivasan, Sujatha; Munch, Matthew M; Sizova, Maria V; Fiedler, Tina L; Kohler, Christina M; Hoffman, Noah G; Liu, Congzhou; Agnew, Kathy J; Marrazzo, Jeanne M; Epstein, Slava S; Fredricks, David N
2016-08-15
Women with bacterial vaginosis (BV) have complex communities of anaerobic bacteria. There are no cultivated isolates of several bacteria identified using molecular methods and associated with BV. It is unclear whether this is due to the inability to adequately propagate these bacteria or to correctly identify them in culture. Vaginal fluid from 15 women was plated on 6 different media using classical cultivation approaches. Individual isolates were identified by 16S ribosomal RNA (rRNA) gene sequencing and compared with validly described species. Bacterial community profiles in vaginal samples were determined using broad-range 16S rRNA gene polymerase chain reaction and pyrosequencing. We isolated and identified 101 distinct bacterial strains spanning 6 phyla including (1) novel strains with <98% 16S rRNA sequence identity to validly described species, (2) closely related species within a genus, (3) bacteria previously isolated from body sites other than the vagina, and (4) known bacteria formerly isolated from the vagina. Pyrosequencing showed that novel strains Peptoniphilaceae DNF01163 and Prevotellaceae DNF00733 were prevalent in women with BV. We isolated a diverse set of novel and clinically significant anaerobes from the human vagina using conventional approaches with systematic molecular identification. Several previously "uncultivated" bacteria are amenable to conventional cultivation. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.
Review of Methods and Approaches for Deriving Numeric ...
EPA will propose numeric criteria for nitrogen/phosphorus pollution to protect estuaries, coastal areas and South Florida inland flowing waters that have been designated Class I, II and III, as well as downstream protective values (DPVs) to protect estuarine and marine waters. In accordance with the formal determination and pursuant to a subsequent consent decree, these numeric criteria are being developed to translate and implement Florida's existing narrative nutrient criterion, to protect the designated use that Florida has previously set for these waters, at Rule 62-302.530(47)(b), F.A.C., which provides that "In no case shall nutrient concentrations of a body of water be altered so as to cause an imbalance in natural populations of aquatic flora or fauna." Under the Clean Water Act and EPA's implementing regulations, these numeric criteria must be based on sound scientific rationale and reflect the best available scientific knowledge. EPA has previously published a series of peer-reviewed technical guidance documents to develop numeric criteria to address nitrogen/phosphorus pollution in different water body types. EPA recognizes that available and reliable data sources for use in numeric criteria development vary across estuarine and coastal waters in Florida and flowing waters in South Florida. In addition, scientifically defensible approaches for numeric criteria development have different requirements that must be taken into consideration.
Average crystal structure(s) of the embedded metastable η′-phase in the Al-Mg-Zn system
NASA Astrophysics Data System (ADS)
Bøvik Larsen, Helge; Thorkildsen, Gunnar; Natland, Sølvi; Pattison, Philip
2014-05-01
Metastable embedded nano-sized η′-particles within a single grain extracted from an alloy having the nominal composition ? have been examined with X-ray diffraction. By applying the orientational and metric relationships that exist between the hexagonal unit cell of the η′-particles and the cubic unit cell of the Al-matrix, it has proven possible to directly collect diffracted intensity data from the η′-particle ensemble. This has been done using synchrotron radiation and a ?-diffractometer having a scintillator point detector setup. The approach has resulted in improved data quality compared to previous experiments. The interpretation of the data set, based on a combination of Patterson syntheses, direct methods and geometrical restraints, yielded two possible average structural representations: one Al-rich with the approximate stoichiometric composition ? and one Al-depleted with approximate stoichiometric composition ?. Both structures are realized in the same space group, ?, and are most probably superimposed in the crystalline system examined. The geometries are discussed within the atomic environment approach, where icosahedral or near-icosahedral configurations are encountered. Comparison with previously published models and the equilibrium structure reveals a main difference related to the distribution of the Zn-sites in the unit cell. A possible transformation path is also suggested. Various aspects and challenges regarding data collection, data reduction and data quality are specifically addressed.
California State Waters Map Series Data Catalog
Golden, Nadine E.
2013-01-01
In 2007, the California Ocean Protection Council initiated the California Seafloor Mapping Program (CSMP), designed to create a comprehensive seafloor map of high-resolution bathymetry, marine benthic habitats, and geology within the 3-nautical-mile limit of California's State Waters. The CSMP approach is to create highly detailed seafloor maps and associated data layers through the collection, integration, interpretation, and visualization of swath sonar data, acoustic backscatter, seafloor video, seafloor photography, high-resolution seismic-reflection profiles, and bottom-sediment sampling data. CSMP has divided coastal California into 110 map blocks (fig. 1), each to be published individually as USGS Scientific Investigations Maps (SIMs) at a scale of 1:24,000. The map products display seafloor morphology and character, identify potential marine benthic habitats, and illustrate both the seafloor geology and shallow (to about 100 m) subsurface geology. This CSMP data catalog contains much of the data used to prepare the SIMs in the California State Waters Map Series. Other data that were used to prepare the maps were compiled from previously published sources (for example, onshore geology) and, thus, are not included herein.
Oud, Emerentiana Veronica; de Vrieze, Nynke Hesselina Neeltje; de Meij, Arjan; de Vries, Henry John C
2014-06-01
Current lymphogranuloma venereum (LGV) guidelines mainly focus on anorectal infections. Inguinal LGV infections have been rare in the current epidemic among men who have sex with men (MSM), but might require a different approach not yet recommended in current guidelines for the treatment and diagnosis of LGV. We describe 4 inguinal LGV cases. Three MSM developed inguinal LGV infection several weeks after a previous consultation, of which two had received azithromycin after being notified for LGV. Three failed the recommended 21 days doxycycline treatment. These inguinal LGV cases highlight 3 pitfalls in the current standard management of LGV: (1) Urethral chlamydia infections in MSM can be caused by LGV biovars that in contrast to non-LGV biovars require prolonged antibiotic therapy. (2) The recommended one gram azithromycin contact treatment seems insufficient to prevent established infections. (3) Inguinal LGV may require prolonged courses of doxycycline, exceeding the currently advised 21 days regimen. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Asano, Nadja Maria Jorge; Coriolano, Maria das Graças Wanderley de Sales; Asano, Breno Jorge; Lins, Otávio Gomes
2013-01-01
To analyze the frequency of psychiatric comorbidities in patients with systemic lupus erythematosus (SLE) using the systematic review method. A systematic literature search was performed between April and July 2011 in the following databases: BIREME, PubMed and the CAPES thesis database. This search prioritized studies published over the last ten years (2001-2011) involving the presence of psychiatric comorbidities in patients with SLE. Out of the previously identified articles, 314 published in scientific journals (PubMed) and 29 (BIREME), 13 articles on psychiatric disorders and SLE were selected for the systematic review methodological approach. The articles indicated a high frequency of psychiatric comorbidities, especially mood and anxiety disorders. There is no consensus on the relationship between disease activity and psychiatric disorders. Patients with active SLE showed a higher risk of developing mood disorders than patients with inactive SLE. Patients with SLE had a higher suicide risk than the general population. More thorough studies are needed to evaluate the psychological and genetic roles, and the specific and non-specific autoimmune inflammatory mechanisms, in mood and anxiety disorders.
Management Strategies for Posttransplant Diabetes Mellitus after Heart Transplantation: A Review
Cehic, Matthew G.; Nundall, Nishant; Greenfield, Jerry R.
2018-01-01
Posttransplant diabetes mellitus (PTDM) is a well-recognized complication of heart transplantation and is associated with increased morbidity and mortality. Previous studies have yielded wide ranging estimates in the incidence of PTDM due in part to variable definitions applied. In addition, there is a limited published data on the management of PTDM after heart transplantation and a paucity of studies examining the effects of newer classes of hypoglycaemic drug therapies. In this review, we discuss the role of established glucose-lowering therapies and the rationale and emerging clinical evidence that supports the role of incretin-based therapies (glucagon like peptide- (GLP-) 1 agonists and dipeptidyl peptidase- (DPP-) 4 inhibitors) and sodium-glucose cotransporter 2 (SGLT2) inhibitors in the management of PTDM after heart transplantation. Recently published Consensus Guidelines for the diagnosis of PTDM will hopefully lead to more consistent approaches to the diagnosis of PTDM and provide a platform for the larger-scale multicentre trials that will be needed to determine the role of these newer therapies in the management of PTDM. PMID:29623219
van Schayck, C P
2002-02-23
Two standards on COPD have recently been published: the revised national standard from the Dutch College of General Practitioners and the first international standard published by the World Health Organization and the US National Heart, Lung and Blood Institute. The reduced emphasis on the role of spirometry in the monitoring and evaluation of treatment is an important change in these new standards compared to previous ones. Cessation of smoking is considered to be central to the prevention and treatment of COPD. Doctors should strongly support this approach and, more than before, are urged to view COPD as a disease caused by addiction. Bronchodilators are the cornerstone of symptomatic treatment of COPD, particularly the long-acting ones due to their ease of administration and effective treatment of morning dyspnoea. Inhalation corticosteroids should only be administered as a trial treatment and only under certain conditions. Continuation of treatment with these agents is only justified if there is a demonstrated improvement in lung function, exacerbations or symptoms, although the precise area of indication is not yet clear.
Sanchez-Lucas, Rosa; Mehta, Angela; Valledor, Luis; Cabello-Hurtado, Francisco; Romero-Rodrıguez, M Cristina; Simova-Stoilova, Lyudmila; Demir, Sekvan; Rodriguez-de-Francisco, Luis E; Maldonado-Alconada, Ana M; Jorrin-Prieto, Ana L; Jorrín-Novo, Jesus V
2016-03-01
The present review is an update of the previous one published in Proteomics 2015 Reviews special issue [Jorrin-Novo, J. V. et al., Proteomics 2015, 15, 1089-1112] covering the July 2014-2015 period. It has been written on the bases of the publications that appeared in Proteomics journal during that period and the most relevant ones that have been published in other high-impact journals. Methodological advances and the contribution of the field to the knowledge of plant biology processes and its translation to agroforestry and environmental sectors will be discussed. This review has been organized in four blocks, with a starting general introduction (literature survey) followed by sections focusing on the methodology (in vitro, in vivo, wet, and dry), proteomics integration with other approaches (systems biology and proteogenomics), biological information, and knowledge (cell communication, receptors, and signaling), ending with a brief mention of some other biological and translational topics to which proteomics has made some contribution. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Donahue, Timothy R; Reber, Howard A
2013-09-01
To summarize published research on pancreatic surgery over the past year. A number of studies aiming to reduce the costs associated with pancreatic surgery were reported. Retrospective analyses confirmed previous findings that neither the routine use of pancreatic duct stents decreases the rate of fistula formation nor does placement of a drain at the time of surgery change the morbidity in patients who develop one. Minimally invasive approaches, both laparoscopic and robot-assisted, are being performed more frequently to remove pancreatic cancers. A randomized trial confirmed that reinforcement of stapled closure during distal pancreatectomy reduces the rate of fistula formation. Controversy remains over whether small pancreatic neuroendocrine tumors need to be surgically resected or can be treated nonoperatively. Patients with chronic pancreatitis should be screened thoroughly before being offered surgical treatment; two studies reported preoperative factors that can be used to identify those most likely to experience pain relief. Studies published on pancreatic surgery last year focused on a wide range of topics. The morbidity and mortality of patients undergoing pancreatic surgery continues to improve, and we anticipate that incorporation of these new findings will lead to even better outcomes.
Bapst, D W; Wright, A M; Matzke, N J; Lloyd, G T
2016-07-01
Dated phylogenies of fossil taxa allow palaeobiologists to estimate the timing of major divergences and placement of extinct lineages, and to test macroevolutionary hypotheses. Recently developed Bayesian 'tip-dating' methods simultaneously infer and date the branching relationships among fossil taxa, and infer putative ancestral relationships. Using a previously published dataset for extinct theropod dinosaurs, we contrast the dated relationships inferred by several tip-dating approaches and evaluate potential downstream effects on phylogenetic comparative methods. We also compare tip-dating analyses to maximum-parsimony trees time-scaled via alternative a posteriori approaches including via the probabilistic cal3 method. Among tip-dating analyses, we find opposing but strongly supported relationships, despite similarity in inferred ancestors. Overall, tip-dating methods infer divergence dates often millions (or tens of millions) of years older than the earliest stratigraphic appearance of that clade. Model-comparison analyses of the pattern of body-size evolution found that the support for evolutionary mode can vary across and between tree samples from cal3 and tip-dating approaches. These differences suggest that model and software choice in dating analyses can have a substantial impact on the dated phylogenies obtained and broader evolutionary inferences. © 2016 The Author(s).
QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.
Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter
2015-07-01
Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to efficiently use these tools to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable to process even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.
Callon, Wynne; Beach, Mary Catherine; Links, Anne R; Wasserman, Carly; Boss, Emily F
2018-03-11
We aimed to develop a comprehensive, descriptive framework to measure shared decision making (SDM) in clinical encounters. We combined a top-down (theoretical) approach with a bottom-up approach based on audio-recorded dialogue to identify all communication processes related to decision making. We coded 55 pediatric otolaryngology visits using the framework and report interrater reliability. We identified 14 clinician behaviors and 5 patient behaviors that have not been previously described, and developed a new SDM framework that is descriptive (what does happen) rather than normative (what should happen). Through the bottom-up approach we identified three broad domains not present in other SDM frameworks: socioemotional support, understandability of clinician dialogue, and recommendation-giving. We also specify the ways in which decision-making roles are assumed implicitly rather than discussed explicitly. Interrater reliability was >75% for 92% of the coded behaviors. This SDM framework allows for a more expansive understanding and analysis of how decision making takes place in clinical encounters, including new domains and behaviors not present in existing measures. We hope that this new framework will bring attention to a broader conception of SDM and allow researchers to further explore the new domains and behaviors identified. Copyright © 2018. Published by Elsevier B.V.
REMARK checklist elaborated to improve tumor prognostication
Experts have elaborated on a previously published checklist of 20 items -- including descriptions of design, methods, and analysis -- that researchers should address when publishing studies of prognostic markers. These markers are indicators that enable d
Hancock, Ange-Marie; Hancock, Charles R
2010-04-01
Prior research has established diversity as a topic of empirical analysis in the vascular surgery literature. Building on the work of previously published articles on diversity in the Journal of Vascular Surgery and elsewhere, this article engages in a broad discussion of diversity in two interrelated arenas: educational/workplace diversity and culturally competent care. Interdisciplinary review of the literature indicates that diversity is often thought of as an end-state to be accomplished. A more fruitful way to encompass the changing aspects of diversity work is to think of diversity as a set of processes that can be adjusted based on a set of interrelated goals that matter differently to different groups. In considering diversity as a process, an approach to diversity emerges that considers both independent effects of gender and race/ethnicity as well as interactive effects between the two variables to address future trends in medical education. Such trends are diagnosed and multiple courses of intervention are offered as reasonable options for future efforts. A comprehensive definition of diversity will be established in order to encompass two different arenas in which diversity concerns arise: educational diversity and culturally competent patient care. Second, a discussion of the rationales for attention to diversity among vascular surgeons will provide different avenues into a conversation about diversity in the profession. In so doing, three successful efforts will be briefly discussed: the Ohio State University's MED-Path program, the Keck School of Medicine's chair-centered approach to diversity in residency training, and the American Association of Orthopedic Surgeons' (AAOS) approach to culturally competent care. Copyright 2010. Published by Mosby, Inc.
Nemčeková, Katarína; Labuda, Ján; Milata, Viktor; Blaškovičová, Jana; Sochr, Jozef
2018-05-03
The understanding of DNA-drug interaction mechanisms is among the important aspects of biological studies for drug design, discovery and pharmaceutical development processes. Previously published, rather detailed FTIR and UV-visible spectroscopic studies on the interactions of theophylline, theobromine and caffeine with calf thymus DNA have shown effective binding of these methylxanthine derivatives to DNA and RNA involving H-bonds. However, to our knowledge, there is no such investigation using an electrochemical approach. As a novelty of the study, in this paper the bioelectrochemical approach has been chosen for the investigation of an interaction of low-molecular salmon sperm dsDNA, ssDNA and mononucleotides with theophylline (TP) in aqueous phosphate-buffered medium using DNA-based electrochemical biosensors and biosensing in solution phase. Exploitation of the electrochemical approach via changes in square wave voltammetric responses of deoxyguanosine (dGuo) and deoxyadenosine (dAdo) provided a new indication of preferential association of TP with dGuo in the case of the double helical dsDNA structure, which was not reported previously. Moreover, an attachment of TP molecules outside DNA was found in the presence of a high concentration of 3.3 × 10⁻⁴ M TP in solution, which diminishes the electron transfer and leads to difficulties in quantitative evaluation of the TP and dGuo voltammetric responses. The changes in UV-vis and FTIR spectra obtained in the same medium confirmed the association interaction of TP with both nucleobases. Utilizing the model and the published energies of hydrogen bonding stabilization, the formation of a DNA-TP complex was predicted through intermolecular H-bonds between TP and the NH-CO moiety of guanine and the N-NH2 moiety of adenine. Copyright © 2018 Elsevier B.V. All rights reserved.
Moimas, Silvia; Manasseri, Benedetto; Cuccia, Giuseppe; Stagno d'Alcontres, Francesco; Geuna, Stefano; Pattarini, Lucia; Zentilin, Lorena; Giacca, Mauro; Colonna, Michele R
2015-01-01
In regenerative medicine, new approaches are required for the creation of tissue substitutes, and the interplay between different research areas, such as tissue engineering, microsurgery and gene therapy, is mandatory. In this article, we report a modification of a published model of tissue engineering, based on an arterio-venous loop enveloped in a cross-linked collagen-glycosaminoglycan template, which acts as an isolated chamber for angiogenesis and new tissue formation. In order to foster tissue formation within the chamber, which depends on the development of new vessels, we asked whether we might combine tissue engineering with a gene therapy approach. Based on the well-described tropism of adeno-associated viral (AAV) vectors for post-mitotic tissues, a muscular flap was harvested from the pectineus muscle, inserted into the chamber and transduced by either an AAV vector encoding human VEGF165 or an AAV vector expressing the reporter gene β-galactosidase, as a control. Histological analysis of the specimens showed that muscle transduction by the AAV vector encoding human VEGF165 resulted in enhanced tissue formation, with a significant increase in the number of arterioles within the chamber in comparison with the previously published model. The pectineus muscular flap, transduced by AAV vectors, acted as a source of the proangiogenic factor vascular endothelial growth factor, thus inducing a consistent enhancement of vessel growth into the newly formed tissue within the chamber. In conclusion, our present findings combine three research fields, microsurgery, tissue engineering and gene therapy, demonstrating the feasibility of a mixed approach for regenerative medicine.
Genome-Wide Search Identifies 1.9 Mb from the Polar Bear Y Chromosome for Evolutionary Analyses.
Bidon, Tobias; Schreck, Nancy; Hailer, Frank; Nilsson, Maria A; Janke, Axel
2015-05-27
The male-inherited Y chromosome is the major haploid fraction of the mammalian genome, rendering Y-linked sequences an indispensable resource for evolutionary research. However, despite recent large-scale genome sequencing approaches, only a handful of Y chromosome sequences have been characterized to date, mainly in model organisms. Using polar bear (Ursus maritimus) genomes, we compare two different in silico approaches to identify Y-linked sequences: 1) similarity to known Y-linked genes and 2) difference in the average read depth of autosomal versus sex chromosomal scaffolds. Specifically, we mapped available genomic sequencing short reads from a male and a female polar bear against the reference genome and identified 112 Y-chromosomal scaffolds with a combined length of 1.9 Mb. We verified the in silico findings for the longer polar bear scaffolds by male-specific in vitro amplification, demonstrating the reliability of the average read depth approach. The obtained Y chromosome sequences contain protein-coding sequences, single nucleotide polymorphisms, microsatellites, and transposable elements that are useful for evolutionary studies. A high-resolution phylogeny of the polar bear patriline shows two highly divergent Y chromosome lineages, obtained from analysis of the identified Y scaffolds in 12 previously published male polar bear genomes. Moreover, we find evidence of gene conversion between ZFX and ZFY sequences in the giant panda lineage and in the ancestor of ursine and tremarctine bears. Thus, the identification of Y-linked scaffold sequences from unordered genome sequences yields valuable data to infer phylogenomic and population-genomic patterns in bears. © The Author(s) 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
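The read-depth criterion lends itself to a minimal sketch (the thresholds and function below are illustrative assumptions, not the authors' pipeline): a Y-linked scaffold should show roughly half the autosomal read depth in the male and near-zero depth in the female.

```python
def y_linked_candidates(male_depth, female_depth, male_autosomal_depth):
    """Flag scaffolds whose coverage pattern matches Y linkage:
    ~0.5x the autosomal depth in the male, near-zero depth in the female."""
    candidates = []
    for scaffold, m in male_depth.items():
        f = female_depth.get(scaffold, 0.0)
        haploid_like = 0.3 * male_autosomal_depth <= m <= 0.7 * male_autosomal_depth
        absent_in_female = f < 0.1 * male_autosomal_depth
        if haploid_like and absent_in_female:
            candidates.append(scaffold)
    return candidates
```

X-linked scaffolds are excluded by the female check (they carry roughly full coverage in the female), and autosomal scaffolds by the haploid-depth check.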
A Novel Quantitative Approach to Concept Analysis: The Internomological Network
Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli
2012-01-01
Background: When a construct such as patients' transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives: To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method: Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis that calculates the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying the meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results: Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management and suggested related concepts not found by the previous review. Discussion: The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387
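As a rough illustration of the kind of latent semantic computation the INN relies on (the count matrix and the two-dimensional truncation below are toy assumptions, not the study's data), terms are embedded from their context-of-use counts and compared by cosine similarity:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two latent vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# toy term-by-context count matrix (rows: terms, columns: published contexts)
terms = ["transition", "self-management", "adaptation", "coping"]
X = np.array([[4, 3, 0, 1],
              [3, 4, 1, 0],
              [2, 2, 3, 1],
              [0, 1, 4, 3]], dtype=float)

# project terms into a 2-D latent semantic space via truncated SVD
U, s, Vt = np.linalg.svd(X, full_matrices=False)
latent = U[:, :2] * s[:2]            # each row is a term's latent vector

# semantic similarity of "transition" and "self-management"
sim = cosine(latent[0], latent[1])
```

Terms used in similar contexts ("transition", "self-management") end up with a higher cosine than terms used in different contexts ("transition" vs. "coping"), which is the basis for identifying synonyms quantitatively.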
Serological approaches for the diagnosis of schistosomiasis - A review.
Hinz, Rebecca; Schwarz, Norbert G; Hahn, Andreas; Frickmann, Hagen
2017-02-01
Schistosomiasis is a common disease in endemic areas of Sub-Saharan Africa, South America and Asia. It is rare in Europe, occurring mainly as infections imported from endemic countries through travel or human migration. Available methods for the diagnosis of schistosomiasis comprise microscopic, molecular and serological approaches, with the latter detecting antigens or antibodies associated with Schistosoma spp. infection. The serological approach is a valuable screening tool in low-endemicity settings and for travel medicine, though the interpretation of any diagnostic results requires knowledge of test characteristics and the patient's history. Specific antibody detection by most currently used assays is only possible in a relatively late stage of infection and does not allow for the differentiation of acute from previous infections for therapeutic control, or the discrimination between persisting infection and re-infection. Throughout the last decades, new target antigens have been identified, and assays with improved performance and suitability for use in the field have been developed. For numerous assays, large-scale studies are still required to reliably characterise their performance, both alone and in combination with other available methods for the diagnosis of schistosomiasis. Apart from S. mansoni, S. haematobium and S. japonicum, for which most available tests were developed, other species of Schistosoma that occur less frequently need to be taken into account. This narrative review describes and critically discusses the results of published studies on the evaluation of serological assays that detect antibodies against different Schistosoma species of humans. It provides insights into diagnostic performance and an overview of available assays and their suitability for large-scale use or individual diagnosis, and thus sets the scene for the serological diagnosis of schistosomiasis and the interpretation of results. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Social and content aware One-Class recommendation of papers in scientific social networks.
Wang, Gang; He, XiRan; Ishuga, Carolyne Isigi
2017-01-01
With the rapid development of information technology, scientific social networks (SSNs) have become the fastest and most convenient way for researchers to communicate with each other. Many published papers are shared via SSNs every day, resulting in the problem of information overload. Recommending personalized, high-value papers to researchers is therefore an increasingly urgent problem. However, when recommending papers in SSNs, only a small number of positive instances are available, leaving a vast amount of unlabelled data in which negative instances and potential unseen positive instances are mixed together, which is naturally a One-Class Collaborative Filtering (OCCF) problem. Therefore, considering the extreme data imbalance and data sparsity of this OCCF problem, a hybrid approach of Social and Content aware One-class Recommendation of Papers in SSNs, termed SCORP, is proposed in this study. Unlike previous approaches to the OCCF problem, social information, which has been shown to play a significant role in recommendation in many domains, is applied in both the profiling of content-based filtering and the collaborative filtering to achieve superior recommendations. To verify the effectiveness of the proposed SCORP approach, a real-life dataset from CiteULike was employed. The experimental results demonstrate that the proposed approach is superior to all of the compared approaches, thus providing a more effective method for recommending papers in SSNs.
Quantitative metrics for evaluating the phased roll-out of clinical information systems.
Wong, David; Wu, Nicolas; Watkinson, Peter
2017-09-01
We introduce a novel quantitative approach for evaluating the order of roll-out during phased introduction of clinical information systems. Such roll-outs are associated with unavoidable risk due to patients transferring between clinical areas using both the old and new systems. We propose a simple graphical model of patient flow through a hospital. Using a simple instance of the model, we show how a roll-out order can be generated by minimising the flow of patients from the new system to the old system. The model was applied to admission and discharge data acquired from 37,080 patient journeys at the Churchill Hospital, Oxford between April 2013 and April 2014. The resulting order was evaluated empirically and produced acceptable orders. The development of data-driven approaches to clinical information system roll-out provides insights that may not necessarily be ascertained through clinical judgment alone. Such methods could make a significant contribution to the smooth running of an organisation during the roll-out of a potentially disruptive technology. Unlike previous approaches, which are based on clinical opinion, the approach described here quantitatively assesses the appropriateness of competing roll-out strategies. The data-driven approach was shown to produce strategies that matched clinical intuition and provides a flexible framework that may be used to plan and monitor clinical information system roll-out. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
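A minimal sketch of such an ordering heuristic (greedy selection and the toy flow data are illustrative assumptions; the paper's exact optimisation may differ): at each step, switch the ward whose conversion adds the least patient flow from new-system wards back into wards still on the old system.

```python
def rollout_order(wards, flow):
    """Greedy roll-out ordering: switch next the ward that minimises patient
    flow from new-system wards into wards still running the old system.
    flow[(a, b)] = number of patient transfers from ward a to ward b."""
    order = []
    switched, remaining = set(), set(wards)
    while remaining:
        def backflow(w):
            new = switched | {w}          # wards on the new system if w switches
            old = remaining - {w}         # wards still on the old system
            return sum(flow.get((a, b), 0) for a in new for b in old)
        nxt = min(sorted(remaining), key=backflow)   # sorted() makes ties deterministic
        order.append(nxt)
        switched.add(nxt)
        remaining.remove(nxt)
    return order
```

On a toy emergency-department → ward → discharge-ward flow, the heuristic switches the most downstream area first, matching the clinical intuition that patients should move from old-system areas into new-system areas rather than the reverse.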
Functional MRI Preprocessing in Lesioned Brains: Manual Versus Automated Region of Interest Analysis
Garrison, Kathleen A.; Rogalsky, Corianne; Sheng, Tong; Liu, Brent; Damasio, Hanna; Winstein, Carolee J.; Aziz-Zadeh, Lisa S.
2015-01-01
Functional magnetic resonance imaging (fMRI) has significant potential in the study and treatment of neurological disorders and stroke. Region of interest (ROI) analysis in such studies allows for testing of strong a priori clinical hypotheses with improved statistical power. A commonly used automated approach to ROI analysis is to spatially normalize each participant’s structural brain image to a template brain image and define ROIs using an atlas. However, in studies of individuals with structural brain lesions, such as stroke, the gold standard approach may be to manually hand-draw ROIs on each participant’s non-normalized structural brain image. Automated approaches to ROI analysis are faster and more standardized, yet are susceptible to preprocessing error (e.g., normalization error) that can be greater in lesioned brains. The manual approach to ROI analysis has high demand for time and expertise, but may provide a more accurate estimate of brain response. In this study, commonly used automated and manual approaches to ROI analysis were directly compared by reanalyzing data from a previously published hypothesis-driven cognitive fMRI study, involving individuals with stroke. The ROI evaluated is the pars opercularis of the inferior frontal gyrus. Significant differences were identified in task-related effect size and percent-activated voxels in this ROI between the automated and manual approaches to ROI analysis. Task interactions, however, were consistent across ROI analysis approaches. These findings support the use of automated approaches to ROI analysis in studies of lesioned brains, provided they employ a task interaction design. PMID:26441816
Australia's TERN: Advancing Ecosystem Data Management in Australia
NASA Astrophysics Data System (ADS)
Phinn, S. R.; Christensen, R.; Guru, S.
2013-12-01
Globally, there is a consistent movement towards more open, collaborative and transparent science, where the publication and citation of data is considered standard practice. Australia's Terrestrial Ecosystem Research Network (TERN) is a national research infrastructure investment designed to support the ecosystem science community through all stages of the data lifecycle. TERN has developed and implemented a comprehensive network of 'hard' and 'soft' infrastructure that enables Australia's ecosystem scientists to collect, publish, store, share, discover and re-use data in ways not previously possible. The aim of this poster is to demonstrate how TERN has successfully delivered infrastructure that is enabling a significant cultural and practical shift in Australia's ecosystem science community towards consistent approaches for data collection, metadata, data licensing, and data publishing. TERN enables multiple disciplines within the ecosystem sciences to more effectively and efficiently collect, store and publish their data. A critical part of TERN's approach has been to build on existing data collection activities, networks and skilled people to enable further coordination and collaboration in building each data collection facility and coordinating data publishing. Data collection in TERN is through discipline-based facilities, covering long-term collection of: (1) systematic plot-based measurements of vegetation structure, composition and faunal biodiversity; (2) instrumented towers making systematic measurements of solar, water and gas fluxes; and (3) satellite and airborne maps of biophysical properties of vegetation, soils and the atmosphere. Several other facilities collect and integrate environmental data to produce national products for fauna and vegetation surveys, soils and coastal data, as well as integrated or synthesised products for modelling applications.
Data management, publishing and sharing in TERN are implemented through a tailored data licensing framework suitable for ecosystem data, national standards for metadata, a DOI-minting service, and context-appropriate data repositories and portals. The TERN data infrastructure is based on a loosely coupled 'network of networks.' The data formats used across the TERN facilities range from NetCDF to comma-separated values and descriptive documents. Metadata standards include ISO 19115, Ecological Metadata Language and rich, semantically enabled contextual information. Data services include Web Mapping Service, Web Feature Service, OPeNDAP, file servers and KNB Metacat. These approaches enable each data collection facility to maintain its discipline-based data collection and storage protocols. TERN facility metadata are harvested regularly for the central TERN Data Discovery Portal and converted to a national standard format. This approach enables centralised discovery, access, and re-use of data simply and effectively, while maintaining disciplinary diversity. Effort is still required to support the cultural shift towards acceptance of effective data management, publication, sharing and re-use as standard practice. To this end, TERN's future activities will be directed to supporting this transformation and undertaking 'education' to enable ecosystem scientists to take full advantage of TERN's infrastructure, providing training and guidance for best-practice data management.
Fast and accurate imputation of summary statistics enhances evidence of functional enrichment.
Pasaniuc, Bogdan; Zaitlen, Noah; Shi, Huwenbo; Bhatia, Gaurav; Gusev, Alexander; Pickrell, Joseph; Hirschhorn, Joel; Strachan, David P; Patterson, Nick; Price, Alkes L
2014-10-15
Imputation using external reference panels (e.g. 1000 Genomes) is a widely used approach for increasing power in genome-wide association studies and meta-analysis. Existing hidden Markov model (HMM)-based imputation approaches require individual-level genotypes. Here, we develop a new method for Gaussian imputation from summary association statistics, a type of data that is becoming widely available. In simulations using 1000 Genomes (1000G) data, this method recovers 84% (54%) of the effective sample size for common (>5%) and low-frequency (1-5%) variants [increasing to 87% (60%) when summary linkage disequilibrium information is available from target samples] versus the gold standard of 89% (67%) for HMM-based imputation, which cannot be applied to summary statistics. Our approach accounts for the limited sample size of the reference panel, a crucial step to eliminate false-positive associations, and it is computationally very fast. As an empirical demonstration, we apply our method to seven case-control phenotypes from the Wellcome Trust Case Control Consortium (WTCCC) data and a study of height in the British 1958 birth cohort (1958BC). Gaussian imputation from summary statistics recovers 95% (105%) of the effective sample size (as quantified by the ratio of [Formula: see text] association statistics) compared with HMM-based imputation from individual-level genotypes at the 227 (176) published single nucleotide polymorphisms (SNPs) in the WTCCC (1958BC height) data. In addition, for publicly available summary statistics from large meta-analyses of four lipid traits, we publicly release imputed summary statistics at 1000G SNPs, which could not have been obtained using previously published methods, and demonstrate their accuracy by masking subsets of the data.
We show that 1000G imputation using our approach increases the magnitude and statistical evidence of enrichment at genic versus non-genic loci for these traits, as compared with an analysis without 1000G imputation. Thus, imputation of summary statistics will be a valuable tool in future functional enrichment analyses. A publicly available software package is available at http://bogdan.bioinformatics.ucla.edu/software/. Contact: bpasaniuc@mednet.ucla.edu or aprice@hsph.harvard.edu. Supplementary materials are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
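The conditional-mean calculation behind Gaussian imputation from summary statistics can be sketched as follows (the LD values and the ridge term `lam`, standing in for the finite-panel correction, are toy assumptions): the z-score of an untyped SNP is estimated from the z-scores of typed SNPs and the reference-panel LD among them.

```python
import numpy as np

# LD (correlation) matrix from a reference panel over three SNPs;
# SNPs 0 and 2 are typed, SNP 1 is untyped (toy values)
Sigma = np.array([[1.0, 0.8, 0.3],
                  [0.8, 1.0, 0.4],
                  [0.3, 0.4, 1.0]])
typed, untyped = [0, 2], [1]
z_typed = np.array([3.2, 1.1])       # observed association z-scores

lam = 0.1                            # ridge term: finite reference-panel correction
S_tt = Sigma[np.ix_(typed, typed)] + lam * np.eye(len(typed))
S_ut = Sigma[np.ix_(untyped, typed)]

z_imp = S_ut @ np.linalg.solve(S_tt, z_typed)        # conditional mean
r2 = np.diag(S_ut @ np.linalg.solve(S_tt, S_ut.T))   # imputation quality per SNP
```

The ridge regularisation of `S_tt` plays the role the abstract describes: it guards against the limited reference-panel sample size inflating imputed statistics and producing false positives.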
Strategies for informed sample size reduction in adaptive controlled clinical trials
NASA Astrophysics Data System (ADS)
Arandjelović, Ognjen
2017-12-01
Clinical trial adaptation refers to any adjustment of the trial protocol after the onset of the trial. The main goal is to make the process of introducing new medical interventions to patients more efficient. The principal challenge, which remains an open research problem, is how adaptation should be performed so as to minimize the chance of distorting the outcome of the trial. In this paper, we propose a novel method for achieving this. Unlike most previously published work, our approach focuses on trial adaptation by sample size adjustment, i.e. by reducing the number of trial participants in a statistically informed manner. Our key idea is to select the sample subset for removal in a manner which minimizes the associated loss of information. We formalize this notion and describe three algorithms which approach the problem in different ways, respectively using (i) repeated random draws, (ii) a genetic algorithm, and (iii) what we term pair-wise sample compatibilities. Experiments on simulated data demonstrate the effectiveness of all three approaches, with consistently superior performance exhibited by the pair-wise sample compatibilities-based method.
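The first of the three strategies, repeated random draws, can be sketched for a single covariate (the mean-preservation loss and all names here are illustrative assumptions, not the paper's information measure): many candidate removal subsets are drawn, and the one whose retained cohort least distorts the full-cohort statistics is kept.

```python
import random
import statistics

def best_removal(samples, k, draws=200, seed=0):
    """Repeated random draws: propose many candidate removal subsets of size
    k and keep the one whose retained cohort best preserves the full-cohort
    covariate mean (a stand-in for 'loss of information')."""
    rng = random.Random(seed)
    full_mean = statistics.mean(samples)
    best, best_loss = None, float("inf")
    for _ in range(draws):
        drop = set(rng.sample(range(len(samples)), k))
        kept = [x for i, x in enumerate(samples) if i not in drop]
        loss = abs(statistics.mean(kept) - full_mean)
        if loss < best_loss:
            best, best_loss = drop, loss
    return best, best_loss
```

The genetic-algorithm and pair-wise-compatibility variants would replace the blind random proposals with guided search over the same loss, which is why they can outperform this baseline.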
NASA Astrophysics Data System (ADS)
Atzberger, C.; Richter, K.
2009-09-01
The robust and accurate retrieval of vegetation biophysical variables using radiative transfer models (RTM) is seriously hampered by the ill-posedness of the inverse problem. With this research we further develop our previously published (object-based) inversion approach [Atzberger (2004)]. The object-based RTM inversion takes advantage of the geostatistical fact that the biophysical characteristics of nearby pixels are generally more similar than those at a larger distance. A two-step inversion based on PROSPECT+SAIL generated look-up tables is presented that can be easily implemented and adapted to other radiative transfer models. The approach takes into account the spectral signatures of neighboring pixels and optimizes a common value of the average leaf angle (ALA) for all pixels of a given image object, such as an agricultural field. Using a large set of leaf area index (LAI) measurements (n = 58) acquired over six different crops at the Barrax test site (Spain), we demonstrate that the proposed geostatistical regularization in most cases yields more accurate and spatially consistent results compared to the traditional (pixel-based) inversion. Pros and cons of the approach are discussed and possible future extensions presented.
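A minimal sketch of a two-step, object-based look-up-table inversion (the spectra, parameter grid, and distance-based cost below are toy assumptions, not the published implementation): first a common ALA is chosen for the whole object, then each pixel is inverted against only the LUT entries with that ALA.

```python
import numpy as np

def object_based_inversion(obj_spectra, lut_spectra, lut_params):
    """Two-step LUT inversion sketch.
    Step 1: pick the average leaf angle (ALA) whose LUT entries jointly fit
            all pixels of the object best.
    Step 2: retrieve per-pixel LAI from the ALA-restricted LUT.
    lut_params: array of (lai, ala) rows matching lut_spectra row-for-row."""
    alas = np.unique(lut_params[:, 1])

    def object_cost(ala):
        # summed best-match spectral distance over all object pixels
        sub = lut_spectra[lut_params[:, 1] == ala]
        d = np.linalg.norm(obj_spectra[:, None, :] - sub[None, :, :], axis=2)
        return d.min(axis=1).sum()

    best_ala = min(alas, key=object_cost)          # common ALA for the object
    mask = lut_params[:, 1] == best_ala
    sub_s, sub_p = lut_spectra[mask], lut_params[mask]
    d = np.linalg.norm(obj_spectra[:, None, :] - sub_s[None, :, :], axis=2)
    lai = sub_p[d.argmin(axis=1), 0]               # per-pixel LAI
    return best_ala, lai
```

Fixing one ALA per object is the regularisation step: it removes one degree of freedom from the ill-posed per-pixel inversion while still allowing LAI to vary within the field.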
Developing a comprehensive resident education evaluation system in the era of milestone assessment.
Gardner, Aimee K; Scott, Daniel J; Choti, Michael A; Mansour, John C
2015-01-01
In an effort to move training programs toward competency-based education, the Accreditation Council for Graduate Medical Education (ACGME) introduced the Next Accreditation System (NAS), which organizes specific milestones regarding resident skills, knowledge, and abilities along a continuum. In order to foster innovation and creativity, the ACGME has provided programs with minimal guidelines regarding the optimal way to approach these milestones. The education team at UT Southwestern embraced the milestones and developed a process in which performance assessment methods were critically evaluated and mapped onto an extrapolated performance list corresponding to the areas required by the ACGME milestones, and gaps in the previous system were filled by modifying evaluation tools and creating new program components. Although the authors are early in the evolution of applying the new milestones system, this approach has thus far allowed them to comprehensively evaluate the residents and the program in an efficient and effective fashion, with notable improvements compared to the prior approach. The authors hope that these experiences can inform others embarking upon similar journeys with the milestones. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Du, Chao; van Wezel, Gilles P
2018-04-30
Natural products (NPs) are a major source of compounds for medical, agricultural, and biotechnological industries. Many of these compounds are of microbial origin, and, in particular, from Actinobacteria or filamentous fungi. To successfully identify novel compounds that correlate to a bioactivity of interest, or discover new enzymes with desired functions, systematic multiomics approaches have been developed over the years. Bioinformatics tools harness the rapidly expanding wealth of genome sequence information, revealing previously unsuspected biosynthetic diversity. Varying growth conditions or application of elicitors are applied to activate cryptic biosynthetic gene clusters, and metabolomics provide detailed insights into the NPs they specify. Combining these technologies with proteomics-based approaches to profile the biosynthetic enzymes provides scientists with insights into the full biosynthetic potential of microorganisms. The proteomics approaches include enrichment strategies such as employing activity-based probes designed by chemical biology, as well as unbiased (quantitative) proteomics methods. In this review, the opportunities and challenges in microbial NP research are discussed, and, in particular, the application of proteomics to link biosynthetic enzymes to the molecules they produce, and vice versa. © 2018 The Authors. Proteomics Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Effects of sediment supply on surface textures of gravel‐bed rivers
Buffington, John M.; Montgomery, David R.
1999-01-01
Using previously published data from flume studies, we test a new approach for quantifying the effects of sediment supply (i.e., bed material supply) on surface grain size of equilibrium gravel channels. Textural response to sediment supply is evaluated relative to a theoretical prediction of competent median grain size (D50′). We find that surface median grain size (D50) varies inversely with sediment supply rate and systematically approaches the competent value (D50′) at low equilibrium transport rates. Furthermore, equilibrium transport rate is a power function of the difference between applied and critical shear stresses and is therefore a power function of the difference between competent and observed median grain sizes (D50′ and D50). Consequently, we propose that the difference between predicted and observed median grain sizes can be used to determine sediment supply rate in equilibrium channels. Our analysis framework collapses data from different studies toward a single relationship between sediment supply rate and surface grain size. While the approach appears promising, we caution that it has been tested only on a limited set of laboratory data and a narrow range of channel conditions.
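Schematically, the argument chains two proportionalities (the coefficients $k$, $k'$ and exponents $n$, $m$ are unspecified fitting constants introduced here for illustration; $\tau_0$ is the applied and $\tau_c$ the critical shear stress):

```latex
q_s = k\,(\tau_0 - \tau_c)^{n},
\qquad
\tau_0 \propto D'_{50},\quad \tau_c \propto D_{50}
\;\Longrightarrow\;
q_s = k'\,(D'_{50} - D_{50})^{m}
```

Because the competent grain size $D'_{50}$ is set by the applied stress and the critical stress scales with the observed $D_{50}$, the stress excess maps onto the grain-size difference, which is what lets the difference between predicted and observed median grain sizes serve as a proxy for sediment supply rate.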
A BIBLIOGRAPHY OF BIOLOGICAL APPLICATIONS OF AUTORADIOGRAPHY, 1958 THROUGH 1959
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, M.E.
1959-08-01
This bibliography of 281 reports and published literature references on biological applications of autoradiography is a supplement to the one published July 1958 as UCRL-8400. References previously omitted are included. (J.E. D.)
A second chance for authors of hijacked journals to publish in legitimate journals.
Jalalian, Mehrdad
2015-01-01
This article proposes the republication of articles that have previously been published in counterfeit websites of hijacked journals. The paper also discusses the technical and ethical aspects of republishing such articles.
NASA Astrophysics Data System (ADS)
Iyer, Gokul; Ledna, Catherine; Clarke, Leon; Edmonds, James; McJeon, Haewon; Kyle, Page; Williams, James H.
2018-03-01
In the version of this Article previously published, technical problems led to the wrong summary appearing on the homepage, and an incorrect Supplementary Information file being uploaded. Both errors have now been corrected.
Gorst, Sarah L; Gargon, Elizabeth; Clarke, Mike; Smith, Valerie; Williamson, Paula R
2016-01-01
The COMET (Core Outcome Measures in Effectiveness Trials) Initiative promotes the development and application of core outcome sets (COS), including relevant studies in an online database. In order to keep the database current, an annual search of the literature is undertaken. This study aimed to update a previous systematic review in order to identify any further studies in which a COS has been developed. Furthermore, because no prioritization for COS development had previously been undertaken, this study also aimed to identify COS relevant to the world's most prevalent health conditions. The methods used in this updated review followed the same approach as the original review and the previous update. A survey was also sent to the corresponding authors of COS identified for inclusion in this review, to ascertain what lessons they had learnt from developing their COS. Additionally, the COMET database was searched to identify COS that might be relevant to the conditions with the highest global prevalence. Twenty-five reports relating to 22 new studies were eligible for inclusion in the review. Further improvements were identified in relation to the description of the scope of the COS, use of the Delphi technique, and the inclusion of patient participants within the development process. Additionally, 33 published and ongoing COS were identified for 13 of the world's most prevalent conditions. The development of a reporting guideline and minimum standards should contribute towards future improvements in the development and reporting of COS. This study has also described a first approach to identifying gaps in existing COS and to priority setting in this area. Important gaps have been identified on the basis of global burden of disease, and the development and application of COS in these areas should be considered a priority.
Accurate Filtering of Privacy-Sensitive Information in Raw Genomic Data.
Decouchant, Jérémie; Fernandes, Maria; Völp, Marcus; Couto, Francisco M; Esteves-Veríssimo, Paulo
2018-04-13
Sequencing thousands of human genomes has enabled breakthroughs in many areas, among them precision medicine, the study of rare diseases, and forensics. However, mass collection of such sensitive data entails enormous risks if not protected to the highest standards. In this article, we take the position that post-alignment privacy protection is not enough and argue that data should be automatically protected as early as possible in the genomics workflow, ideally immediately after the data is produced. We show that a previous approach for filtering short reads cannot extend to long reads and present a novel filtering approach that classifies raw genomic data (i.e., reads whose genomic location and content are not yet determined) into privacy-sensitive (i.e., more affected by a successful privacy attack) and non-privacy-sensitive information. Such a classification allows the fine-grained and automated adjustment of protective measures to mitigate the possible consequences of exposure, in particular when relying on public clouds. We present the first filter that can be applied to reads of any length, making it usable with any recent or future sequencing technology. The filter is accurate, in the sense that it detects all known sensitive nucleotides except those located in highly variable regions (fewer than 10 nucleotides remain undetected per genome, instead of 100,000 in previous works). It has far fewer false positives than previously known methods (10% instead of 60%) and can detect sensitive nucleotides despite sequencing errors (86% detected instead of 56%, with 2% of mutations). Finally, practical experiments demonstrate high performance, both in terms of throughput and memory consumption. Copyright © 2018. Published by Elsevier Inc.
Genetic and environmental control of host-gut microbiota interactions.
Org, Elin; Parks, Brian W; Joo, Jong Wha J; Emert, Benjamin; Schwartzman, William; Kang, Eun Yong; Mehrabian, Margarete; Pan, Calvin; Knight, Rob; Gunsalus, Robert; Drake, Thomas A; Eskin, Eleazar; Lusis, Aldons J
2015-10-01
Genetics provides a potentially powerful approach to dissect host-gut microbiota interactions. Toward this end, we profiled gut microbiota using 16s rRNA gene sequencing in a panel of 110 diverse inbred strains of mice. This panel has previously been studied for a wide range of metabolic traits and can be used for high-resolution association mapping. Using a SNP-based approach with a linear mixed model, we estimated the heritability of microbiota composition. We conclude that, in a controlled environment, the genetic background accounts for a substantial fraction of abundance of most common microbiota. The mice were previously studied for response to a high-fat, high-sucrose diet, and we hypothesized that the dietary response was determined in part by gut microbiota composition. We tested this using a cross-fostering strategy in which a strain showing a modest response, SWR, was seeded with microbiota from a strain showing a strong response, A×B19. Consistent with a role of microbiota in dietary response, the cross-fostered SWR pups exhibited a significantly increased response in weight gain. To examine specific microbiota contributing to the response, we identified various genera whose abundance correlated with dietary response. Among these, we chose Akkermansia muciniphila, a common anaerobe previously associated with metabolic effects. When administered to strain A×B19 by gavage, the dietary response was significantly blunted for obesity, plasma lipids, and insulin resistance. In an effort to further understand host-microbiota interactions, we mapped loci controlling microbiota composition and prioritized candidate genes. Our publicly available data provide a resource for future studies. © 2015 Org et al.; Published by Cold Spring Harbor Laboratory Press.
Phillips, R M; Burger, A M; Loadman, P M; Jarrett, C M; Swaine, D J; Fiebig, H H
2000-11-15
Mitomycin C (MMC) is a clinically used anticancer drug that is reduced to cytotoxic metabolites by cellular reductases via a process known as bioreductive drug activation. The identification of key enzymes responsible for drug activation has been investigated extensively with the ultimate aim of tailoring drug administration to patients whose tumors possess the biochemical machinery required for drug activation. In the case of MMC, considerable interest has been centered upon the enzyme DT-diaphorase (DTD), although conflicting reports of good and poor correlations between enzyme activity and response in vitro and in vivo have been published. The principal aim of this study was to provide a definitive answer to the question of whether tumor response to MMC could be predicted on the basis of DTD activity in a large panel of human tumor xenografts. DTD levels were measured in 45 human tumor xenografts that had been characterized previously in terms of their sensitivity to MMC in vitro and in vivo (the in vivo response profile to MMC was taken from work published previously). A poor correlation between DTD activity and antitumor activity in vitro as well as in vivo was obtained. This study also assessed the predictive value of an alternative approach based upon the ability of tumor homogenates to metabolize MMC. This approach is based on the premise that the overall rate of MMC metabolism may provide a better indicator of response than single enzyme measurements. MMC metabolism was evaluated in tumor homogenates (clarified by centrifugation at 1000 x g for 1 min) by measuring the disappearance of the parent compound by HPLC. In responsive [T/C <10% (T/C defined as the relative size of treated and control tumors)] and resistant (T/C >50%) tumors, the mean half-life of MMC was 75+/-48.3 and 280+/-129.6 min, respectively. The difference between the two groups was statistically significant (P < 0.005).
In conclusion, these results unequivocally demonstrate that response to MMC in vivo cannot be predicted on the basis of DTD activity. Measurement of MMC metabolism by tumor homogenates on the other hand may provide a better indicator of tumor response, and further studies are required to determine whether this approach has real clinical potential in terms of individualizing patient chemotherapy.
NeuroSeg: automated cell detection and segmentation for in vivo two-photon Ca2+ imaging data.
Guan, Jiangheng; Li, Jingcheng; Liang, Shanshan; Li, Ruijie; Li, Xingyi; Shi, Xiaozhe; Huang, Ciyu; Zhang, Jianxiong; Pan, Junxia; Jia, Hongbo; Zhang, Le; Chen, Xiaowei; Liao, Xiang
2018-01-01
Two-photon Ca2+ imaging has become a popular approach for monitoring neuronal population activity with cellular or subcellular resolution in vivo. This approach allows for the recording of hundreds to thousands of neurons per animal and thus leads to a large amount of data to be processed. In particular, manually drawing regions of interest is the most time-consuming aspect of data analysis. However, the development of automated image analysis pipelines, which will be essential for dealing with the likely future deluge of imaging data, remains a major challenge. To address this issue, we developed NeuroSeg, an open-source MATLAB program that can facilitate the accurate and efficient segmentation of neurons in two-photon Ca2+ imaging data. We proposed an approach using a generalized Laplacian of Gaussian filter to detect cells and weighting-based segmentation to separate individual cells from the background. We tested this approach on an in vivo two-photon Ca2+ imaging dataset obtained from mouse cortical neurons with fields of view of different sizes. We show that this approach exhibits superior performance for cell detection and segmentation compared with existing published tools. In addition, we integrated the previously reported, activity-based segmentation into our approach and found that this combined method was even more promising. The NeuroSeg software, including source code and graphical user interface, is freely available and will be a useful tool for in vivo brain activity mapping.
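The detection step can be illustrated with a toy Python analogue. NeuroSeg itself is a MATLAB tool using a *generalized* LoG plus weighting-based segmentation; the plain LoG filter, the 5-pixel peak window, and the threshold below are illustrative assumptions only.

```python
import numpy as np
from scipy import ndimage

def detect_cells(img, sigma, thresh):
    """Peaks of the scale-normalized Laplacian-of-Gaussian response."""
    # Bright blobs give negative LoG values, so negate the response.
    resp = -(sigma ** 2) * ndimage.gaussian_laplace(img.astype(float), sigma)
    # A detection is a local maximum of the response above the threshold.
    peaks = (resp == ndimage.maximum_filter(resp, size=5)) & (resp > thresh)
    return [(int(r), int(c)) for r, c in zip(*np.nonzero(peaks))]

# Synthetic "frame" with one Gaussian-shaped cell centred at (20, 30).
yy, xx = np.mgrid[0:64, 0:64]
frame = np.exp(-((yy - 20) ** 2 + (xx - 30) ** 2) / (2 * 3.0 ** 2))
print(detect_cells(frame, sigma=3.0, thresh=0.1))  # [(20, 30)]
```

Matching the filter scale sigma to the expected cell radius maximizes the response at cell centres, which is what makes a simple peak search a workable detector.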
Merz, Clayton; Catchen, Julian M; Hanson-Smith, Victor; Emerson, Kevin J; Bradshaw, William E; Holzapfel, Christina M
2013-01-01
Herein we tested the repeatability of phylogenetic inference based on high throughput sequencing by increased taxon sampling using our previously published techniques in the pitcher-plant mosquito, Wyeomyia smithii, in North America. We sampled 25 natural populations drawn from localities near 21 previous collection localities and used these new data to construct a second, independent phylogeny, expressly to test the reproducibility of phylogenetic patterns. Comparison of trees between the two data sets based on both maximum parsimony and maximum likelihood with Bayesian posterior probabilities showed close correspondence in the grouping of the most southern populations into clear clades. However, discrepancies emerged, particularly in the middle of W. smithii's current range near the previous maximum extent of the Laurentide Ice Sheet, especially concerning the most recent common ancestor of mountain and northern populations. Combining all 46 populations from both studies into a single maximum parsimony tree and taking into account the post-glacial historical biogeography of associated flora provided an improved picture of W. smithii's range expansion in North America. In a more general sense, we propose that extensive taxon sampling, especially in areas of known geological disruption, is key to a comprehensive approach to phylogenetics that leads to biologically meaningful phylogenetic inference.
Prediction and analysis of beta-turns in proteins by support vector machine.
Pham, Tho Hoan; Satou, Kenji; Ho, Tu Bao
2003-01-01
The tight turn has long been recognized as the third important structural feature of proteins, after the alpha-helix and the beta-sheet. Tight turns play an important role in globular proteins from both the structural and functional points of view. More than 90% of tight turns are beta-turns. Analysis and prediction of beta-turns in particular, and tight turns in general, are very useful for the design of new molecules such as drugs, pesticides, and antigens. In this paper, we introduce a support vector machine (SVM) approach to the prediction and analysis of beta-turns. We have investigated two aspects of applying SVM to the prediction and analysis of beta-turns. First, we developed a new SVM method, called BTSVM, which predicts the beta-turns of a protein from its sequence. The prediction results on a dataset of 426 non-homologous protein chains, using a sevenfold cross-validation technique, showed that our method is superior to previous methods. Second, we analyzed how amino acid positions support (or prevent) the formation of beta-turns based on the "multivariable" classification model of a linear SVM. This model is more general than those of previous statistical methods. Our analysis results are more comprehensive and easier to use than previously published analysis results.
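A minimal sketch of this kind of approach with scikit-learn: the windows and labels here are invented toy data, not the 426-chain dataset, and BTSVM's actual features and kernel may differ.

```python
import numpy as np
from sklearn.svm import LinearSVC

AA = "ACDEFGHIKLMNPQRSTVWY"

def encode_window(window):
    """One-hot encode a residue window: 20 features per position."""
    x = np.zeros(len(window) * 20)
    for i, aa in enumerate(window):
        x[i * 20 + AA.index(aa)] = 1.0
    return x

# Toy training windows centred on a residue; label 1 = beta-turn.
windows = ["AGPNA", "VGPGL", "LLAVL", "IVLIK", "KGPDV", "AVILA"]
labels = [1, 1, 0, 0, 1, 0]
X = np.array([encode_window(w) for w in windows])
clf = LinearSVC().fit(X, labels)
print(clf.predict([encode_window("TGPNS")]))
```

With a linear SVM, the sign of each learned weight indicates whether a given amino acid at a given window position supports (+) or prevents (-) turn formation, which is the kind of "multivariable" analysis the abstract describes.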
In vivo and in silico determination of essential genes of Campylobacter jejuni.
Metris, Aline; Reuter, Mark; Gaskin, Duncan J H; Baranyi, Jozsef; van Vliet, Arnoud H M
2011-11-01
In the United Kingdom, the thermophilic Campylobacter species C. jejuni and C. coli are the most frequent causes of food-borne gastroenteritis in humans. While campylobacteriosis is usually a relatively mild infection, it has a significant public health and economic impact, and possible complications include reactive arthritis and the autoimmune disease Guillain-Barré syndrome. The rapid developments in "omics" technologies have resulted in the availability of diverse datasets allowing predictions of metabolism and physiology of pathogenic micro-organisms. When combined, these datasets may allow for the identification of potential weaknesses that can be used for development of new antimicrobials to reduce or eliminate C. jejuni and C. coli from the food chain. A metabolic model of C. jejuni was constructed using the annotation of the NCTC 11168 genome sequence, a published model of the related bacterium Helicobacter pylori, and extensive literature mining. Using this model, we have used in silico Flux Balance Analysis (FBA) to determine key metabolic routes that are essential for generating energy and biomass, thus creating a list of genes potentially essential for growth under laboratory conditions. To complement this in silico approach, candidate essential genes have been determined using a whole genome transposon mutagenesis method. FBA and transposon mutagenesis (both this study and a published study) predict a similar number of essential genes (around 200). The analysis of the intersection between the three approaches highlights the shikimate pathway, where genes are predicted to be essential by one or more methods, tend to be network hubs based on a previously published Campylobacter protein-protein interaction network, and could therefore be targets for novel antimicrobial therapy. We have constructed the first curated metabolic model for the food-borne pathogen Campylobacter jejuni and have presented the resulting metabolic insights.
We have shown that the combination of in silico and in vivo approaches could point to non-redundant, indispensable genes associated with the well characterised shikimate pathway, and also genes of unknown function specific to C. jejuni, which are all potential novel Campylobacter intervention targets.
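FBA itself reduces to a linear program: maximize a biomass flux subject to steady-state mass balance S·v = 0 and flux bounds, and a reaction is predicted essential if constraining its flux to zero abolishes growth. A three-reaction toy sketch (not the curated C. jejuni model):

```python
import numpy as np
from scipy.optimize import linprog

# Reactions: v0 = uptake -> A, v1 = A -> B, v2 = B -> biomass.
S = np.array([[1, -1, 0],    # metabolite A balance
              [0, 1, -1]])   # metabolite B balance
bounds = [(0, 10), (0, 100), (0, 100)]  # uptake capped at 10 units
c = [0, 0, -1]                          # linprog minimizes, so -biomass

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)  # optimal fluxes: all 10, set by the uptake limit

# A reaction is "essential" if forcing its flux to zero kills growth.
essential = []
for j in range(3):
    b = list(bounds)
    b[j] = (0, 0)
    r = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=b)
    if -r.fun < 1e-9:
        essential.append(j)
print(essential)  # [0, 1, 2]: every step of the single path is essential
```

In a genome-scale model the same knockout loop, run over gene-to-reaction mappings instead of single reactions, produces the in silico essentiality list that the transposon screen is compared against.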
Dodsworth, Jeremy A; McDonald, Austin I; Hedlund, Brian P
2012-08-01
To inform hypotheses regarding the relative importance of chemolithotrophic metabolisms in geothermal environments, we calculated free energy yields of 26 chemical reactions potentially supporting chemolithotrophy in two US Great Basin hot springs, taking into account the effects of changing reactant and product activities on the Gibbs free energy as each reaction progressed. Results ranged from 1.2 × 10(-5) to 3.6 J kg(-1) spring water, or 3.7 × 10(-5) to 11.5 J s(-1) based on measured flow rates, with aerobic oxidation of CH(4) or NH4(+) giving the highest average yields. Energy yields calculated without constraining pH were similar to those at constant pH except for reactions where H(+) was consumed, which often had significantly lower yields when pH was unconstrained. In contrast to the commonly used normalization of reaction chemical affinities per mole of electrons transferred, reaction energy yields for a given oxidant varied by several orders of magnitude and were more sensitive to differences in the activities of products and reactants. The high energy yield of aerobic ammonia oxidation is consistent with previous observations of significant ammonia oxidation rates and abundant ammonia-oxidizing archaea in sediments of these springs. This approach offers an additional lens through which to view the thermodynamic landscape of geothermal springs. © 2012 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.
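The reaction-progress bookkeeping rests on the standard relation ΔG = ΔG° + RT ln Q, with the reaction quotient Q recomputed from activities as reactants are consumed. A sketch for aerobic ammonia oxidation, using placeholder ΔG° and activity values rather than the paper's measured ones:

```python
import math

R = 8.314    # J mol^-1 K^-1
T = 353.15   # an 80 degC spring, illustrative

def delta_g(dg0, activities, stoich):
    """dG = dG0 + R*T*ln(Q), with Q = prod(a_i**nu_i), nu < 0 for reactants."""
    ln_q = sum(nu * math.log(activities[s]) for s, nu in stoich.items())
    return dg0 + R * T * ln_q

# NH4+ + 1.5 O2 -> NO2- + H2O + 2 H+  (water activity taken as 1)
stoich = {"NH4+": -1, "O2": -1.5, "NO2-": 1, "H+": 2}
act = {"NH4+": 1e-5, "O2": 1e-4, "NO2-": 1e-6, "H+": 10 ** -7.5}
print(delta_g(-275e3, act, stoich) / 1000, "kJ per mol NH4+")  # about -343
```

Fixing the H+ activity (constant pH) versus letting it evolve with reaction progress changes ln Q directly, which is why the H+-consuming reactions yielded noticeably less energy when pH was unconstrained.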
Improving the content of feedback.
McKinley, Robert K; Williams, Valerie; Stephenson, Catherine
2010-09-01
Feedback, although an important element of skills teaching, is not well regarded by students. This lack of regard may be perpetuated by the differing expectations of tutors and learners, by the weakness of the process and by the apparent irrelevance of its content to learners. We contend that the content of feedback is critical, and has previously been neglected. We describe a concept for a tutor support tool (a glossary of strategies for improvement) that any group responsible for skills development within an institution can develop in-house and disseminate to improve the content of the feedback given to its learners. All institutions have skills assessment criteria that represent what students are expected to achieve. Conversely, they can also identify the likely range of deficiencies in students' skills, which can therefore be used as a template for identifying a core set of strategies for improvement. The strategies can be quickly developed by a group of experienced tutors, and then shared with all tutors and students. By monitoring the feedback provided to learners, potential new strategies or revisions of existing strategies can be identified. If these new strategies are considered useful they can be included in updates. In this way the collective wisdom of the school's tutors can be captured and shared. We suggest that this approach has the potential to increase congruence between the taught and assessed curriculum. If it is shared with students it may reduce the gap between the hidden and published curriculum. We encourage others to experiment with this approach. © Blackwell Publishing Ltd 2010.
Rotational excitation of HCN by para- and ortho-H₂.
Vera, Mario Hernández; Kalugina, Yulia; Denis-Alpizar, Otoniel; Stoecklin, Thierry; Lique, François
2014-06-14
Rotational excitation of the hydrogen cyanide (HCN) molecule by collisions with para-H2(j = 0, 2) and ortho-H2(j = 1) is investigated at low temperatures using a quantum time-independent approach. Both molecules are treated as rigid rotors. The scattering calculations are based on a recently published, highly correlated ab initio 4-dimensional (4D) potential energy surface. Rotationally inelastic cross sections among the 13 first rotational levels of HCN were obtained using a pure quantum close coupling approach for total energies up to 1200 cm(-1). The corresponding thermal rate coefficients were computed for temperatures ranging from 5 to 100 K. The HCN rate coefficients are strongly dependent on the rotational level of the H2 molecule. In particular, the rate coefficients for collisions with para-H2(j = 0) are significantly lower than those for collisions with ortho-H2(j = 1) and para-H2(j = 2). Propensity rules in favor of even Δj transitions were found for HCN in collisions with para-H2(j = 0), whereas propensity rules in favor of odd Δj transitions were found for HCN in collisions with H2(j ⩾ 1). The new rate coefficients were compared with previously published HCN-para-H2(j = 0) rate coefficients. Significant differences were found due to the inclusion of the H2 rotational structure in the scattering calculations. These new rate coefficients will be crucial to improve the estimation of the HCN abundance in the interstellar medium.
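Thermal rate coefficients follow from computed cross sections by a Boltzmann average over collision energies, k(T) = sqrt(8/(pi*mu*(kB*T)^3)) * integral of sigma(E) E exp(-E/kB T) dE. A numeric sketch with a constant toy cross section, for which the result must reduce to sigma times the mean relative speed (the reduced mass and grid are illustrative, not the HCN-H2 values):

```python
import numpy as np

KB = 1.380649e-23  # J/K

def rate_coefficient(energies, sigmas, mu, T):
    """Boltzmann-average sigma(E) into k(T) (SI units, m^3 s^-1)."""
    w = sigmas * energies * np.exp(-energies / (KB * T))
    dE = np.diff(energies)
    integral = np.sum(0.5 * (w[1:] + w[:-1]) * dE)  # trapezoid rule
    return np.sqrt(8.0 / (np.pi * mu * (KB * T) ** 3)) * integral

mu = 1.6e-27                          # illustrative reduced mass, kg
E = np.linspace(1e-25, 5e-20, 20000)  # grid extending far above kB*T at 50 K
sigma = np.full_like(E, 1e-20)        # constant 1 A^2 cross section
k50 = rate_coefficient(E, sigma, mu, 50.0)
print(k50)  # ~1e-20 * sqrt(8*kB*50/(pi*mu)), i.e. sigma times mean speed
```

Because the average weights sigma(E) by E exp(-E/kB T), low-energy resonances dominate k(T) at the 5-100 K temperatures relevant here, which is why the underlying cross sections must be converged at very low collision energies.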
A strategy to establish Food Safety Model Repositories.
Plaza-Rodríguez, C; Thoens, C; Falenski, A; Weiser, A A; Appel, B; Kaesbohrer, A; Filter, M
2015-07-02
Transferring the knowledge of predictive microbiology into real world food manufacturing applications is still a major challenge for the whole food safety modelling community. To facilitate this process, a strategy for creating open, community driven and web-based predictive microbial model repositories is proposed. These collaborative model resources could significantly improve the transfer of knowledge from research into commercial and governmental applications and also increase efficiency, transparency and usability of predictive models. To demonstrate the feasibility, predictive models of Salmonella in beef previously published in the scientific literature were re-implemented using an open source software tool called PMM-Lab. The models were made publicly available in a Food Safety Model Repository within the OpenML for Predictive Modelling in Food community project. Three different approaches were used to create new models in the model repositories: (1) all information relevant for model re-implementation is available in a scientific publication, (2) model parameters can be imported from tabular parameter collections and (3) models have to be generated from experimental data or primary model parameters. All three approaches were demonstrated in the paper. The sample Food Safety Model Repository is available via: http://sourceforge.net/projects/microbialmodelingexchange/files/models and the PMM-Lab software can be downloaded from http://sourceforge.net/projects/pmmlab/. This work also illustrates that a standardized information exchange format for predictive microbial models, as the key component of this strategy, could be established by adoption of resources from the Systems Biology domain. Copyright © 2015. Published by Elsevier B.V.
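A repository entry of approach (1) can be as small as a secondary growth model re-implemented from its published equation. The sketch below uses the classic Ratkowsky square-root form with illustrative placeholder parameters, not the actual Salmonella-in-beef values:

```python
# Ratkowsky square-root model: sqrt(mu_max) = b * (T - T_min).
# b and T_min below are illustrative placeholders, not fitted values.

def ratkowsky_mu_max(T, b=0.023, T_min=5.0):
    """Maximum specific growth rate (1/h); zero at or below T_min."""
    if T <= T_min:
        return 0.0
    return (b * (T - T_min)) ** 2

for T in (10, 25, 37):
    print(T, round(ratkowsky_mu_max(T), 4))
```

Capturing a model this way, parameters plus equation in executable form, is exactly what makes re-implementation from a publication (approach 1) or from a parameter table (approach 2) feasible for a community repository.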
77 FR 7609 - Policy Letter 11-01, Performance of Inherently Governmental and Critical Functions
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-13
... Regulation. The corrections below should be used in place of text previously published in the September 12, 2011 notice. All other information from the published Final Policy remains unchanged. The full text of...
Device SEE Susceptibility Update: 1996-1998
NASA Technical Reports Server (NTRS)
Coss, J. R.; Miyahira, T. F.; Swift, G. M.
1998-01-01
This eighth Compendium continues the previous work of Nichols et al. on single event effects (SEE) first published in 1985. Because the compendium has grown so voluminous, this update only presents data not published in previous compendia.
NASA Astrophysics Data System (ADS)
Robertson, A.; Edie, R.; Soltis, J.; Field, R. A.; Murphy, S. M.
2017-12-01
Recent airborne and mobile lab-based studies by our group and others have demonstrated that production-normalized emission rates of methane can vary dramatically between different Western basins. Three oil and gas basins that are geographically near one another and have relatively similar production characteristics (all three basins produce a mix of natural gas and condensate) have starkly different production-normalized methane emission rates at both the facility and basin-wide levels. This presentation will review previously published data on methane emissions from these basins (Denver Julesburg, Uintah, and Upper Green River) and present new measurement work supporting and expanding upon previous estimates. Beyond this, we use facility-level emissions data combined with information about the date of last upgrade to determine what impact regulations have had on methane emission rates from facilities within the basins. We also investigate what impact different approaches to production may have; in particular, the role of processing many individual wells at a central, high-throughput facility is analyzed in terms of its impact on methane emissions.
NASA Astrophysics Data System (ADS)
Regayre, L. A.; Johnson, J. S.; Yoshioka, M.; Pringle, K.; Sexton, D.; Booth, B.; Mann, G.; Lee, L.; Bellouin, N.; Lister, G. M. S.; Johnson, C.; Johnson, B. T.; Mollard, J.; Carslaw, K. S.
2016-12-01
Olasveengen, Theresa M; de Caen, Allan R; Mancini, Mary E; Maconochie, Ian K; Aickin, Richard; Atkins, Dianne L; Berg, Robert A; Bingham, Robert M; Brooks, Steven C; Castrén, Maaret; Chung, Sung Phil; Considine, Julie; Couto, Thomaz Bittencourt; Escalante, Raffo; Gazmuri, Raúl J; Guerguerian, Anne-Marie; Hatanaka, Tetsuo; Koster, Rudolph W; Kudenchuk, Peter J; Lang, Eddy; Lim, Swee Han; Løfgren, Bo; Meaney, Peter A; Montgomery, William H; Morley, Peter T; Morrison, Laurie J; Nation, Kevin J; Ng, Kee-Chong; Nadkarni, Vinay M; Nishiyama, Chika; Nuthall, Gabrielle; Ong, Gene Yong-Kwang; Perkins, Gavin D; Reis, Amelia G; Ristagno, Giuseppe; Sakamoto, Tetsuya; Sayre, Michael R; Schexnayder, Stephen M; Sierra, Alfredo F; Singletary, Eunice M; Shimizu, Naoki; Smyth, Michael A; Stanton, David; Tijssen, Janice A; Travers, Andrew; Vaillancourt, Christian; Van de Voorde, Patrick; Hazinski, Mary Fran; Nolan, Jerry P
2017-12-01
The International Liaison Committee on Resuscitation has initiated a near-continuous review of cardiopulmonary resuscitation science that replaces the previous 5-year cyclic batch-and-queue process. This is the first of an annual series of International Consensus on Cardiopulmonary Resuscitation and Emergency Cardiovascular Care Science With Treatment Recommendations summary articles that will include the cardiopulmonary resuscitation science reviewed by the International Liaison Committee on Resuscitation in the previous year. The review this year includes 5 basic life support and 1 paediatric Consensuses on Cardiopulmonary Resuscitation and Emergency Cardiovascular Care Science With Treatment Recommendations. Each of these includes a summary of the science and its quality based on Grading of Recommendations, Assessment, Development, and Evaluation criteria and treatment recommendations. Insights into the deliberations of the International Liaison Committee on Resuscitation task force members are provided in Values and Preferences sections. Finally, the task force members have prioritised and listed the top 3 knowledge gaps for each population, intervention, comparator, and outcome question. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Programming biological models in Python using PySB.
Lopez, Carlos F; Muhlich, Jeremy L; Bachman, John A; Sorger, Peter K
2013-01-01
Mathematical equations are fundamental to modeling biological networks, but as networks get large and revisions frequent, it becomes difficult to manage equations directly or to combine previously developed models. Multiple simultaneous efforts to create graphical standards, rule-based languages, and integrated software workbenches aim to simplify biological modeling but none fully meets the need for transparent, extensible, and reusable models. In this paper we describe PySB, an approach in which models are not only created using programs, they are programs. PySB draws on programmatic modeling concepts from little b and ProMot, the rule-based languages BioNetGen and Kappa, and the growing library of Python numerical tools. Central to PySB is a library of macros encoding familiar biochemical actions such as binding, catalysis, and polymerization, making it possible to use a high-level, action-oriented vocabulary to construct detailed models. As Python programs, PySB models leverage tools and practices from the open-source software community, substantially advancing our ability to distribute and manage the work of testing biochemical hypotheses. We illustrate these ideas using new and previously published models of apoptosis.
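The macro idea behind "models are programs" can be conveyed without PySB's actual API: a high-level action expands programmatically into reaction rules, and macros compose. The sketch below is a schematic plain-Python analogue; the rule strings, species names, and rates are illustrative, not PySB syntax.

```python
def bind(a, b, kf, kr):
    """Macro: expand a binding action into forward and reverse rules."""
    cplx = f"{a}:{b}"
    return [(f"{a} + {b} -> {cplx}", kf), (f"{cplx} -> {a} + {b}", kr)]

def catalyze(enz, sub, prod, kf, kr, kc):
    """Macro built from another macro: bind, then convert the substrate."""
    rules = bind(enz, sub, kf, kr)
    rules.append((f"{enz}:{sub} -> {enz} + {prod}", kc))
    return rules

# Caspase-8 cleaving Bid to tBid, a step familiar from apoptosis models.
for rule, rate in catalyze("C8", "Bid", "tBid", 1e-7, 1e-3, 1.0):
    print(rule, rate)
```

In PySB proper, analogous macro calls emit real rule objects that compile down to reaction networks and ODEs; because the model is an ordinary Python program, it can be versioned, diffed, and unit-tested like any other code.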
Analytical Study of 90Sr Betavoltaic Nuclear Battery Performance Based on p-n Junction Silicon
NASA Astrophysics Data System (ADS)
Rahastama, Swastya; Waris, Abdul
2016-08-01
Previously, an analytical calculation of a 63Ni p-n junction betavoltaic battery has been published. As a basic approach, we reproduced the analytical simulation of the 63Ni betavoltaic battery and compared it to previous results using the same battery design. Furthermore, we calculated its maximum power output and radiation-electricity conversion efficiency using a semiconductor analysis method. The same method was then applied to calculate and analyse the performance of a 90Sr betavoltaic battery. The aim of this project is to compare the analytical performance of the 90Sr betavoltaic battery to that of the 63Ni betavoltaic battery, and to assess the influence of source activity on performance. Since it has a higher power density, the 90Sr betavoltaic battery yields more power than the 63Ni betavoltaic battery, but a lower radiation-electricity conversion efficiency. However, beta particles emitted from a 90Sr source travel further inside the silicon (corresponding to the stopping range of the beta particles), so a 90Sr betavoltaic battery can be designed thicker than a 63Ni betavoltaic battery to achieve a higher conversion efficiency.
Bas-relief generation using adaptive histogram equalization.
Sun, Xianfang; Rosin, Paul L; Martin, Ralph R; Langbein, Frank C
2009-01-01
An algorithm is presented to automatically generate bas-reliefs based on adaptive histogram equalization (AHE), starting from an input height field. A mesh model may alternatively be provided, in which case a height field is first created via orthogonal or perspective projection. The height field is regularly gridded and treated as an image, enabling a modified AHE method to be used to generate a bas-relief with a user-chosen height range. We modify the original image-contrast-enhancement AHE method to use gradient weights also to enhance the shape features of the bas-relief. To effectively compress the height field, we limit the height-dependent scaling factors used to compute relative height variations in the output from height variations in the input; this prevents any height differences from having too great effect. Results of AHE over different neighborhood sizes are averaged to preserve information at different scales in the resulting bas-relief. Compared to previous approaches, the proposed algorithm is simple and yet largely preserves original shape features. Experiments show that our results are, in general, comparable to and in some cases better than the best previously published methods.
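The compression step can be illustrated with plain (global) histogram equalization of a height field. The paper's method is adaptive (per-neighborhood), adds gradient weighting, and averages over neighborhood sizes, so this sketch shows only the core idea:

```python
import numpy as np

def equalize_heights(h, out_max=1.0, bins=256):
    """Map heights through their empirical CDF onto [0, out_max]."""
    hist, edges = np.histogram(h, bins=bins)
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]
    # Bin index of each height, then look up its cumulative fraction.
    idx = np.clip(np.digitize(h, edges[1:-1]), 0, bins - 1)
    return out_max * cdf[idx]

rng = np.random.default_rng(0)
h = rng.exponential(10.0, size=(64, 64))  # skewed heights with a long tail
relief = equalize_heights(h)
print(relief.min(), relief.max())  # squeezed into the chosen output range
```

Equalization preserves the ordering of heights while spending the limited output range where most of the detail lies, which is why large isolated height differences stop dominating the result.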
Arctic lead detection using a waveform mixture algorithm from CryoSat-2 data
NASA Astrophysics Data System (ADS)
Lee, Sanggyun; Kim, Hyun-cheol; Im, Jungho
2018-05-01
We propose a waveform mixture algorithm to detect leads from CryoSat-2 data, which is novel and different from the existing threshold-based lead detection methods. The waveform mixture algorithm adopts the concept of spectral mixture analysis, which is widely used in the field of hyperspectral image analysis. This lead detection method was evaluated with high-resolution (250 m) MODIS images and showed comparable and promising performance in detecting leads when compared to the previous methods. The robustness of the proposed approach also lies in the fact that it does not require the rescaling of parameters (i.e., stack standard deviation, stack skewness, stack kurtosis, pulse peakiness, and backscatter σ0), as it directly uses L1B waveform data, unlike the existing threshold-based methods. Monthly lead fraction maps were produced by the waveform mixture algorithm, which shows interannual variability of recent sea ice cover during 2011-2016, excluding the summer season (i.e., June to September). We also compared the lead fraction maps to other lead fraction maps generated from previously published data sets, resulting in similar spatiotemporal patterns.
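Spectral mixture analysis models an observation as a nonnegative combination of endmember signatures. A sketch of that unmixing step on a synthetic waveform (the two endmember shapes below are invented; real endmembers would come from representative lead and sea-ice waveforms):

```python
import numpy as np
from scipy.optimize import nnls

t = np.arange(64, dtype=float)
lead = np.exp(-0.5 * ((t - 30) / 1.5) ** 2)  # narrow, peaky specular return
ice = np.exp(-0.5 * ((t - 30) / 8.0) ** 2)   # broad, diffuse return
A = np.column_stack([lead, ice])             # endmember matrix

observed = 0.7 * lead + 0.3 * ice            # synthetic mixed waveform
abundances, _ = nnls(A, observed)            # nonnegative least squares
frac = abundances / abundances.sum()
print(frac)  # recovers the mixing fractions, approximately [0.7, 0.3]
```

Because the unmixing operates on the raw L1B waveform shape rather than on derived statistics, no per-parameter thresholds (or their rescaling) are needed, which echoes the robustness argument above.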
The Cryoelectron Microscopy Structure of the Type 1 Chaperone-Usher Pilus Rod.
Hospenthal, Manuela K; Zyla, Dawid; Costa, Tiago R D; Redzej, Adam; Giese, Christoph; Lillington, James; Glockshuber, Rudi; Waksman, Gabriel
2017-12-05
Adhesive chaperone-usher pili are long, supramolecular protein fibers displayed on the surface of many bacterial pathogens. The type 1 and P pili of uropathogenic Escherichia coli (UPEC) play important roles during urinary tract colonization, mediating attachment to the bladder and kidney, respectively. The biomechanical properties of the helical pilus rods allow them to reversibly uncoil in response to flow-induced forces, allowing UPEC to retain a foothold in the unique and hostile environment of the urinary tract. Here we provide the 4.2-Å resolution cryo-EM structure of the type 1 pilus rod, which together with the previous P pilus rod structure rationalizes the remarkable "spring-like" properties of chaperone-usher pili. The cryo-EM structure of the type 1 pilus rod differs in its helical parameters from the structure determined previously by a hybrid approach. We provide evidence that these structural differences originate from different quaternary structures of pili assembled in vivo and in vitro. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
A voxel-based approach to gray matter asymmetries.
Luders, E; Gaser, C; Jancke, L; Schlaug, G
2004-06-01
Voxel-based morphometry (VBM) was used to analyze gray matter (GM) asymmetries in a large sample (n = 60) of male and female professional musicians with and without absolute pitch (AP). We chose to examine these particular groups because previous studies using traditional region-of-interest (ROI) analyses have shown differences in hemispheric asymmetry related to AP and gender. Voxel-based methods may have advantages over traditional ROI-based methods since the analysis can be performed across the whole brain with minimal user bias. After determining that the VBM method was sufficiently sensitive for the detection of differences in GM asymmetries between groups, we found that male AP musicians were more leftward lateralized in the anterior region of the planum temporale (PT) than male non-AP musicians. This confirmed the results of previous studies using ROI-based methods that showed an association between PT asymmetry and the AP phenotype. We further observed that male non-AP musicians revealed an increased leftward GM asymmetry in the postcentral gyrus compared to female non-AP musicians, again corroborating results of a previously published study using ROI-based methods. By analyzing hemispheric GM differences across our entire sample, we were able to partially confirm findings of previous studies using traditional morphometric techniques, as well as more recent, voxel-based analyses. In addition, we found some unusually pronounced GM asymmetries in our musician sample not previously detected in subjects unselected for musical training. Since we were able to validate gender- and AP-related brain asymmetries previously described using traditional ROI-based morphometric techniques, the results of our analyses support the use of VBM for examinations of GM asymmetries.
Reviewing the methodology of an integrative review.
Hopia, Hanna; Latvala, Eila; Liimatainen, Leena
2016-12-01
Whittemore and Knafl's updated description of the methodological approach for integrative reviews was published in 2005. Since then, the five stages of the approach have been regularly used as the basic conceptual structure of integrative reviews conducted by nursing researchers. However, this methodological approach is seldom examined from the perspective of how systematically and rigorously the stages are implemented in published integrative reviews. The aim of this study was to appraise selected integrative reviews against the five-stage methodological approach published by Whittemore and Knafl in 2005. A literature review was used in this study. CINAHL (Cumulative Index to Nursing and Allied Health), PubMed, OVID (Journals@Ovid) and the Cochrane Library databases were searched for integrative reviews published between 2002 and 2014. Papers were included if they used the methodological approach described by Whittemore and Knafl, were published in English and were focused on nursing education or nursing expertise. A total of 259 integrative review publications were identified for potential inclusion. Ten integrative reviews fulfilled the inclusion criteria. Findings from the studies were extracted and critically examined according to the five methodological stages. The reviews assessed followed the guidelines of the stated methodological approach to different extents. The stages of literature search, data evaluation and data analysis were fairly poorly formulated and only partially implemented in the studies included in the sample. The other two stages, problem identification and presentation, followed the methodological approach quite well. Increasing use of research in clinical practice is inevitable, and therefore, integrative reviews can play a greater role in developing evidence-based nursing practices.
Nurse researchers should therefore pay more attention to conducting methodologically sound integrative reviews, systematising the review process and making it more rigorous. © 2016 Nordic College of Caring Science.
Fusion gene addiction: can tumours be forced to give up the habit?
Selfe, Joanna L; Shipley, Janet
2017-07-01
Fusion of genes in tumours can have oncogenic roles, reprogramming cells through overexpression of oncogenes or the production of novel fusion proteins. A fundamental question in cancer biology is which genetic events are critical for initiation and whether these are also required for cancer progression. In recent work published in The Journal of Pathology, dependency on a fusion protein was addressed using a model of alveolar rhabdomyosarcoma - a sarcoma subtype with frequent fusion of the PAX3 and FOXO1 genes that is associated with poor outcome. PAX3-FOXO1 encodes a potent transcription factor that, together with MYCN, alters the transcriptional landscape of cells. Building on previous work, an inducible model in human myoblast cells was used to show that PAX3-FOXO1 and MYCN can initiate rhabdomyosarcoma development but, contrary to current thinking, tumour recurrences occasionally arose independently of the fusion protein. Further work is needed to identify the molecular nature of this independence and to assess its relevance in human tumours. Such functional approaches are required together with computational modelling of molecular data to unravel spatial and temporal dependencies on specific genetic events. This may underpin molecular prognostic markers and therapeutic targets. Copyright © 2017 Pathological Society of Great Britain and Ireland. Published by John Wiley & Sons, Ltd.
Ung, Timothy H; Madsen, Helen J; Hellwinkel, Justin E; Lencioni, Alex M; Graner, Michael W
2014-11-01
Exosomes are virus-sized, membrane-enclosed vesicles that originate in the cellular endosomal system but are released extracellularly. As a population, these tiny vesicles carry relatively enormous amounts of information in their protein, lipid and nucleic acid content, and they can have profound impacts on recipient cells. This review employs publicly available data combined with gene ontology applications to propose a novel concept: that exosomes transport transcriptional and translational machinery that may have direct impacts on gene expression in recipient cells. Here, we examine the previously published proteomic contents of medulloblastoma-derived exosomes, focusing on transcriptional regulators; we found numerous proteins with potential roles in transcriptional and translational regulation and putative influence on downstream, cancer-related pathways. We expanded this search to all of the proteins in the Vesiclepedia database; using gene ontology approaches, we see that these regulatory factors are implicated in many of the processes involved in cancer initiation and progression. This information suggests that some of the effects of exosomes on recipient cells may be due to the delivery of protein factors that can directly and fundamentally change the transcriptional landscape of the cells. Within a tumor environment, this has the potential to tilt the advantage towards the cancer. © 2014 The Authors. Cancer Science published by Wiley Publishing Asia Pty Ltd on behalf of Japanese Cancer Association.
Aagaard, Lise; Hansen, Ebba Holme
2009-01-01
Background Despite surveillance efforts, unexpected and serious adverse drug reactions (ADRs) repeatedly occur after marketing. The aim of this article is to analyse ADRs reported by available ADR signal detection approaches and to explore what information about new and unexpected ADRs these approaches have detected. Methods We selected three therapeutic cases for the review: antibiotics for systemic use, non-steroidal anti-inflammatory medicines (NSAIDs) and selective serotonin re-uptake inhibitors (SSRIs). These groups are widely used and represent different therapeutic classes of medicines. The ADR studies were identified through a literature search in Medline and Embase, conducted in July 2007. For each therapeutic case, we analysed the time of publication, the strength of the evidence of safety in the different approaches, the reported ADRs and whether the studies produced new information about ADRs compared to the information available at the time of marketing. Results 79 studies were eligible for inclusion in the analysis: 23 antibiotics studies, 35 NSAID studies and 20 SSRI studies. Studies were mainly published from the end of the 1990s onwards. Although the drugs were launched in different decades, both analytical and observational approaches to ADR studies were similar for all three therapeutic cases. The studies primarily dealt with analyses of type A and B ADRs and, to a lesser extent, type C and D ADRs (cf. Rawlins' classification system). The therapeutic cases provided similar results with regard to detecting information about new ADRs, despite the different time periods and organ systems involved. Approaches ranking higher in the evidence hierarchy provided information about risks of already known or expected ADRs, while information about new and previously unknown ADRs was only detected by case reports, the lowest-ranking approach in the evidence hierarchy. 
Conclusion Although the medicines were launched in different decades, approaches to the ADR studies were similar for all three therapeutic cases: antibiotics, NSAIDs and SSRIs. Both descriptive and analytical designs were applied. Despite the fact that analytical studies rank higher in the evidence hierarchy, only the lower ranking descriptive case reports/spontaneous reports provided information about new and previously undetected ADRs. This review underscores the importance of systems for spontaneous reporting of ADRs. Therefore, spontaneous reporting should be encouraged further and the information in ADR databases should continuously be subjected to systematic analysis. PMID:19254390
Aagaard, Lise; Hansen, Ebba Holme
2009-03-03
Despite surveillance efforts, unexpected and serious adverse drug reactions (ADRs) repeatedly occur after marketing. The aim of this article is to analyse ADRs reported by available ADR signal detection approaches and to explore what information about new and unexpected ADRs these approaches have detected. We selected three therapeutic cases for the review: antibiotics for systemic use, non-steroidal anti-inflammatory medicines (NSAIDs) and selective serotonin re-uptake inhibitors (SSRIs). These groups are widely used and represent different therapeutic classes of medicines. The ADR studies were identified through a literature search in Medline and Embase, conducted in July 2007. For each therapeutic case, we analysed the time of publication, the strength of the evidence of safety in the different approaches, the reported ADRs and whether the studies produced new information about ADRs compared to the information available at the time of marketing. 79 studies were eligible for inclusion in the analysis: 23 antibiotics studies, 35 NSAID studies and 20 SSRI studies. Studies were mainly published from the end of the 1990s onwards. Although the drugs were launched in different decades, both analytical and observational approaches to ADR studies were similar for all three therapeutic cases. The studies primarily dealt with analyses of type A and B ADRs and, to a lesser extent, type C and D ADRs (cf. Rawlins' classification system). The therapeutic cases provided similar results with regard to detecting information about new ADRs, despite the different time periods and organ systems involved. Approaches ranking higher in the evidence hierarchy provided information about risks of already known or expected ADRs, while information about new and previously unknown ADRs was only detected by case reports, the lowest-ranking approach in the evidence hierarchy. 
Although the medicines were launched in different decades, approaches to the ADR studies were similar for all three therapeutic cases: antibiotics, NSAIDs and SSRIs. Both descriptive and analytical designs were applied. Despite the fact that analytical studies rank higher in the evidence hierarchy, only the lower ranking descriptive case reports/spontaneous reports provided information about new and previously undetected ADRs. This review underscores the importance of systems for spontaneous reporting of ADRs. Therefore, spontaneous reporting should be encouraged further and the information in ADR databases should continuously be subjected to systematic analysis.
Yin, Yuzhi; Bai, Yun; Olivera, Ana; Desai, Avanti; Metcalfe, Dean D
2017-09-01
The culture of mast cells from human tissues such as cord blood, peripheral blood or bone marrow aspirates has advanced our understanding of human mast cell (huMC) degranulation, mediator production and response to pharmacologic agents. However, existing methods for huMC culture tend to be laborious and expensive. Combining technical approaches from several of these protocols, we designed a simplified and more cost-effective approach to the culture of mast cells from human cell populations including peripheral blood and cryopreserved cells from lymphocytapheresis. On average, we reduced by 30-50 fold the amount of culture media compared to our previously reported method, while the total MC number generated by this method (2.46±0.63×10⁶ vs. 2.4±0.28×10⁶, respectively, from 1.0×10⁸ lymphocytapheresis or peripheral blood mononuclear cells [PBMCs]) was similar to that of our previous method (2.36±0.70×10⁶), resulting in significant budgetary savings. In addition, we compared the yield of huMCs with or without IL-3 added to early cultures in the presence of stem cell factor (SCF) and interleukin-6 (IL-6) and found that, while the total MC number generated was higher with IL-3 in the culture, the difference did not reach statistical significance, suggesting that IL-3, often recommended in the culture of huMCs, is not absolutely required. We then performed a functional analysis by flow cytometry using standard methods that maximized the data we could obtain from cultured cells. We believe these approaches will allow more laboratories to culture and examine huMC behavior going forward. Published by Elsevier B.V.
Biglioli, Federico; Chiapasco, Matteo
2014-12-01
To present the authors' experience with the removal of dental implants displaced into the maxillary sinus via an intraoral approach consisting of the creation of a bony window pedicled to the maxillary sinus membrane. Thirty-six systemically healthy patients, presenting with oral implants displaced into the maxillary sinus but with no signs of acute or chronic sinusitis, were consecutively treated between 2002 and 2012 via an intraoral approach using the bony window technique. Removal of the oral implants from the maxillary sinus was achieved in all patients, and postoperative recovery was uneventful in all of them. Computed tomography performed after surgery showed no signs of residual sinus infection in any patient and complete ossification of the bony window margins. Twelve of the 36 treated patients underwent a sinus grafting procedure 12-18 months later in the same areas previously treated with the bony window technique. Seventeen implants were placed in the grafted areas 6-9 months later and, after a further waiting period needed for osseointegration, the treated patients were rehabilitated with implant-supported prostheses. The survival rate of the implants was 100%, and no complications related to the sinuses or implants were recorded. Results from this study suggest that the bony window technique is a safe and easy way to remove oral implants from the maxillary sinus under local anesthesia. The surgical access is hardly visible 6-12 months after surgery, and the maxillary sinuses appeared free from residual pathology in all treated patients. Finally, this procedure allows a second-stage sinus grafting procedure via a lateral approach, as in a previously untreated maxillary sinus, thus allowing an implant-supported prosthetic restoration. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Florindo, Joao B; Bruno, Odemir M; Landini, Gabriel
2017-02-01
The odontogenic keratocyst (OKC) is a cystic lesion of the jaws with high growth and recurrence rates compared to other jaw cysts (for instance, the radicular cyst, which is the most common type); for this reason OKCs are considered by some to be benign neoplasms. There are two sub-types of OKC (sporadic and syndromic), and the ability to discriminate between these sub-types, as well as from other jaw cysts, is an important task for disease diagnosis and prognosis. With the development of digital pathology, computational algorithms have become central to addressing this type of problem. Considering that only basic feature-based methods have been investigated for this problem before, we propose a different approach (Bouligand-Minkowski descriptors) and assess the success rates achieved in the classification of a database of histological images of the epithelial lining of these cysts. This approach does not require the level of abstraction necessary to extract histologically relevant features and therefore has the potential to be more robust than previous approaches. The descriptors were obtained by mapping pixel intensities into a three-dimensional cloud of points in discrete space and applying morphological dilations with spheres of increasing radii. The descriptors were computed from the volume of the dilated set and submitted to a machine learning algorithm to classify the samples into diagnostic groups. This approach discriminated between OKCs and radicular cysts in 98% of images (100% of cases) and between the two sub-types of OKC in 68% of images (71% of cases). These results improve on previously reported classification rates and suggest that Bouligand-Minkowski descriptors are useful features for histopathological images of these cysts. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
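The descriptor construction described above (pixels mapped to 3-D points, dilated by spheres of growing radius, dilation volumes recorded) can be sketched directly. This brute-force version illustrates the concept on a tiny image and is not the authors' implementation, which would use efficient distance transforms:

```python
import numpy as np

def bouligand_minkowski(image, radii):
    """Bouligand-Minkowski descriptors of a grayscale image (a sketch).

    Each pixel (x, y) with intensity z is mapped to the 3-D point (x, y, z);
    the descriptor is the volume (lattice-voxel count) of that point cloud
    dilated by spheres of increasing radius. Brute force: small images only.
    """
    h, w = image.shape
    pts = np.array([(x, y, image[y, x]) for y in range(h) for x in range(w)], float)
    rmax = max(radii)
    # Bounding lattice around the point cloud, padded by the largest radius.
    lo = np.floor(pts.min(axis=0) - rmax).astype(int)
    hi = np.ceil(pts.max(axis=0) + rmax).astype(int)
    grid = np.stack(np.meshgrid(*[np.arange(a, b + 1) for a, b in zip(lo, hi)],
                                indexing="ij"), axis=-1).reshape(-1, 3)
    # Squared distance from every lattice voxel to its nearest mapped point.
    d2 = ((grid[:, None, :] - pts[None, :, :]) ** 2).sum(axis=2).min(axis=1)
    return np.array([(d2 <= r * r).sum() for r in radii])

img = np.array([[0, 3], [5, 1]])  # toy 2x2 "image"
print(bouligand_minkowski(img, radii=[1, 2, 3]))  # volumes grow with radius
```

The resulting volume-versus-radius curve is the feature vector that would be handed to a classifier.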
Crown, Scott B; Kelleher, Joanne K; Rouf, Rosanne; Muoio, Deborah M; Antoniewicz, Maciek R
2016-10-01
In many forms of cardiomyopathy, alterations in energy substrate metabolism play a key role in disease pathogenesis. Stable isotope tracing in rodent heart perfusion systems can be used to determine cardiac metabolic fluxes, namely those relative fluxes that contribute to pyruvate, the acetyl-CoA pool, and pyruvate anaplerosis, which are critical to cardiac homeostasis. Methods have previously been developed to interrogate these relative fluxes using isotopomer enrichments of measured metabolites and algebraic equations to determine a predefined metabolic flux model. However, this approach is exquisitely sensitive to measurement error, thus precluding accurate relative flux parameter determination. In this study, we applied a novel mathematical approach to determine relative cardiac metabolic fluxes using 13C-metabolic flux analysis (13C-MFA) aided by multiple tracer experiments and integrated data analysis. Using 13C-MFA, we validated a metabolic network model to explain myocardial energy substrate metabolism. Four different 13C-labeled substrates were queried (i.e., glucose, lactate, pyruvate, and oleate) based on a previously published study. We integrated the analysis of the complete set of isotopomer data gathered from these mouse heart perfusion experiments into a single comprehensive network model that delineates substrate contributions to both pyruvate and acetyl-CoA pools at a greater resolution than that offered by traditional methods using algebraic equations. To our knowledge, this is the first rigorous application of 13C-MFA to interrogate data from multiple tracer experiments in the perfused heart. We anticipate that this approach can be used widely to study energy substrate metabolism in this and other similar biological systems. Copyright © 2016 the American Physiological Society.
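The contrast drawn above, between solving algebraic equations experiment-by-experiment and fitting one model to all tracer experiments at once, can be illustrated with a toy linear system: an overdetermined least-squares fit pools the information and damps measurement noise. All numbers below are hypothetical; real 13C-MFA fits full isotopomer balances with specialised software:

```python
import numpy as np

# Toy model: acetyl-CoA is produced from three substrates with unknown
# relative contributions f = (f_glc, f_lac, f_ole), f >= 0, sum(f) = 1.
# Each row of A is the labeled fraction each substrate would transmit to
# acetyl-CoA in one tracer experiment (illustrative numbers); b holds the
# corresponding "measured" acetyl-CoA enrichments.
A = np.array([[0.99, 0.00, 0.00],   # [U-13C]glucose experiment
              [0.00, 0.98, 0.00],   # [U-13C]lactate experiment
              [0.00, 0.00, 0.97],   # [U-13C]oleate experiment
              [0.50, 0.49, 0.00]])  # mixed-tracer experiment
true_f = np.array([0.2, 0.3, 0.5])
rng = np.random.default_rng(0)
b = A @ true_f + rng.normal(0.0, 0.005, size=4)  # noisy measurements

# Integrated least-squares fit over all experiments at once, with the
# sum-to-one constraint appended as a strongly weighted extra equation.
w = 100.0
A_aug = np.vstack([A, w * np.ones((1, 3))])
b_aug = np.append(b, w * 1.0)
f_hat, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
print(np.round(f_hat, 3))  # close to the true contributions (0.2, 0.3, 0.5)
```

Fitting every experiment jointly is what makes the estimate robust: an error in any single measurement is averaged against the others instead of propagating through an exactly determined algebraic solution.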
Kelleher, Joanne K.; Rouf, Rosanne; Muoio, Deborah M.; Antoniewicz, Maciek R.
2016-01-01
In many forms of cardiomyopathy, alterations in energy substrate metabolism play a key role in disease pathogenesis. Stable isotope tracing in rodent heart perfusion systems can be used to determine cardiac metabolic fluxes, namely those relative fluxes that contribute to pyruvate, the acetyl-CoA pool, and pyruvate anaplerosis, which are critical to cardiac homeostasis. Methods have previously been developed to interrogate these relative fluxes using isotopomer enrichments of measured metabolites and algebraic equations to determine a predefined metabolic flux model. However, this approach is exquisitely sensitive to measurement error, thus precluding accurate relative flux parameter determination. In this study, we applied a novel mathematical approach to determine relative cardiac metabolic fluxes using 13C-metabolic flux analysis (13C-MFA) aided by multiple tracer experiments and integrated data analysis. Using 13C-MFA, we validated a metabolic network model to explain myocardial energy substrate metabolism. Four different 13C-labeled substrates were queried (i.e., glucose, lactate, pyruvate, and oleate) based on a previously published study. We integrated the analysis of the complete set of isotopomer data gathered from these mouse heart perfusion experiments into a single comprehensive network model that delineates substrate contributions to both pyruvate and acetyl-CoA pools at a greater resolution than that offered by traditional methods using algebraic equations. To our knowledge, this is the first rigorous application of 13C-MFA to interrogate data from multiple tracer experiments in the perfused heart. We anticipate that this approach can be used widely to study energy substrate metabolism in this and other similar biological systems. PMID:27496880
Clerc-Blain, Jessica L E; Starr, Julian R; Bull, Roger D; Saarela, Jeffery M
2010-01-01
Previous research on barcoding sedges (Carex) suggested that basic searches within a global barcoding database would probably not resolve more than 60% of the world's some 2000 species. In this study, we take an alternative approach and explore the performance of plant DNA barcoding in the Carex lineage from an explicitly regional perspective. We characterize the utility of a subset of the proposed protein-coding and noncoding plastid barcoding regions (matK, rpoB, rpoC1, rbcL, atpF-atpH, psbK-psbI) for distinguishing species of Carex and Kobresia in the Canadian Arctic Archipelago, a clearly defined eco-geographical region representing 1% of the Earth's landmass. Our results show that matK resolves the greatest number of species of any single locus (95%), and when combined in a two-locus barcode, it provides 100% species resolution in all but one combination (matK + atpF-atpH) in unweighted pair-group method with arithmetic mean (UPGMA) analyses. Noncoding regions were equally or more variable than matK, but as single markers they resolve substantially fewer taxa than matK alone. When difficulties with sequencing and alignment due to microstructural variation in noncoding regions are also considered, our results support other studies in suggesting that protein-coding regions are more practical as barcoding markers. Plastid DNA barcodes are an effective identification tool for species of Carex and Kobresia in the Canadian Arctic Archipelago, a region where the number of co-existing closely related species is limited. We suggest that if a regional approach to plant DNA barcoding were applied on a global scale, it could provide a solution to the generally poor species resolution seen in previous barcoding studies. © 2009 Blackwell Publishing Ltd.
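A UPGMA analysis of the kind used here amounts to computing pairwise sequence distances and then repeatedly merging the closest clusters with size-weighted average linkage. A self-contained sketch on hypothetical toy sequences (real analyses use aligned plastid loci and dedicated phylogenetics software):

```python
def p_distance(a, b):
    """Proportion of differing sites between two aligned sequences."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def upgma(dist, labels):
    """Minimal UPGMA: merge the closest pair of clusters, averaging
    distances weighted by cluster size. Returns a nested-tuple tree."""
    clusters = {i: (labels[i], 1) for i in range(len(labels))}  # id -> (tree, size)
    d = {(i, j): dist[i][j] for i in clusters for j in clusters if i < j}
    nxt = len(labels)
    while len(clusters) > 1:
        i, j = min(d, key=d.get)                 # closest pair
        ti, ni = clusters.pop(i)
        tj, nj = clusters.pop(j)
        for k in list(clusters):
            dik = d.pop((min(i, k), max(i, k)))
            djk = d.pop((min(j, k), max(j, k)))
            d[(min(nxt, k), max(nxt, k))] = (ni * dik + nj * djk) / (ni + nj)
        del d[(i, j)]
        clusters[nxt] = ((ti, tj), ni + nj)
        nxt += 1
    return next(iter(clusters.values()))[0]

# Hypothetical short aligned plastid fragments for four taxa.
seqs = {"Carex_A": "ATGGCTA", "Carex_B": "ATGGCTT",
        "Carex_C": "ATGACTA", "Kobresia": "TTGACCA"}
names = list(seqs)
m = [[p_distance(seqs[a], seqs[b]) for b in names] for a in names]
print(upgma(m, names))  # the three Carex taxa cluster before Kobresia joins
```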
Varatharajah, Yogatheesan; Berry, Brent; Cimbalnik, Jan; Kremen, Vaclav; Van Gompel, Jamie; Stead, Matt; Brinkmann, Benjamin; Iyer, Ravishankar; Worrell, Gregory
2018-08-01
An ability to map seizure-generating brain tissue, i.e. the seizure onset zone (SOZ), without recording actual seizures could reduce the duration of invasive EEG monitoring for patients with drug-resistant epilepsy. A widely adopted practice in the literature is to compare the incidence (events/time) of putative pathological electrophysiological biomarkers associated with epileptic brain tissue against the SOZ determined from spontaneous seizures recorded with intracranial EEG, primarily using a single biomarker. Clinical translation of previous efforts has been hampered by their inability to generalize across patients because of (a) inter-patient variability and (b) temporal variability in epileptogenic activity. Here, we report an artificial intelligence-based approach that combines multiple interictal electrophysiological biomarkers and their temporal characteristics to account for these barriers, and we show that it can reliably identify seizure onset zones in a study cohort of 82 patients who underwent evaluation for drug-resistant epilepsy. Our investigation provides evidence that utilizing the complementary information provided by multiple electrophysiological biomarkers and their temporal characteristics can significantly improve localization compared to previously published single-biomarker incidence-based approaches, resulting in an average area under the ROC curve (AUC) of 0.73. Our results also suggest that recording durations between 90 min and 2 h are sufficient to localize SOZs with accuracies that may prove clinically relevant. The successful validation of our approach on a large cohort of 82 patients warrants future investigation of the feasibility of utilizing intra-operative EEG monitoring and artificial intelligence to localize epileptogenic brain tissue. 
Broadly, our study demonstrates the use of artificial intelligence coupled with careful feature engineering in augmenting clinical decision making.
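The core idea, combining the incidence rates of several interictal biomarkers into one electrode-level score and evaluating it against the clinician-defined SOZ with an ROC curve, can be sketched on synthetic data. The features, effect sizes, and model below are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic per-electrode data: incidence rates of three hypothetical
# interictal biomarkers (e.g. HFOs, spikes, phase-amplitude coupling).
# Electrodes inside the SOZ (y = 1) tend to show higher rates.
n = 200
y = rng.integers(0, 2, n)
X = rng.normal(0.0, 1.0, (n, 3)) + y[:, None] * np.array([1.0, 0.6, 0.3])

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression; returns fitted scores."""
    Xb = np.column_stack([np.ones(len(y)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return 1.0 / (1.0 + np.exp(-Xb @ w))

def auc(scores, y):
    """Area under the ROC curve via the Mann-Whitney rank identity."""
    order = np.argsort(scores)
    ranks = np.empty(len(y))
    ranks[order] = np.arange(1, len(y) + 1)
    n1 = y.sum()
    return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * (len(y) - n1))

combined = auc(fit_logistic(X, y), y)
single = auc(X[:, 0], y)
print(f"combined AUC {combined:.2f} vs single-biomarker AUC {single:.2f}")
```

With informative but individually weak biomarkers, the combined score typically outperforms any single-biomarker incidence ranking, which is the pattern the study reports at much larger scale.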
LeadMine: a grammar and dictionary driven approach to entity recognition.
Lowe, Daniel M; Sayle, Roger A
2015-01-01
Chemical entity recognition has traditionally been performed by machine learning approaches. Here we describe an approach using grammars and dictionaries. This approach has the advantage that the entities found can be directly related to a given grammar or dictionary, which allows the type of an entity to be known and, if an entity is misannotated, indicates which resource should be corrected. As recognition is driven by what is expected, spelling errors can be detected and corrected. Correcting such errors is highly useful when attempting to look up an entity in a database or, in the case of chemical names, when converting them to structures. Our system uses a mixture of expertly curated grammars and dictionaries, as well as dictionaries automatically derived from public resources. We show that the heuristics developed to filter our dictionary of trivial chemical names (from PubChem) yield a better-performing dictionary than the previously published Jochem dictionary. Our final system performs post-processing steps to modify the boundaries of entities and to detect abbreviations. These steps are shown to significantly improve performance (by 2.6% and 4.0% F1-score, respectively). Our complete system, with incremental post-BioCreative workshop improvements, achieves 89.9% precision and 85.4% recall (87.6% F1-score) on the CHEMDNER test set. Grammar and dictionary approaches can produce results at least as good as the current state of the art in machine learning approaches. While machine learning approaches are commonly thought of as "black box" systems, our approach directly links the output entities to the input dictionaries and grammars. It also allows correction of errors in detected entities, which can assist with entity resolution.
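A stripped-down illustration of the dictionary half of this design: exact lookup against a tiny, illustrative chemical dictionary, with a one-edit fallback standing in for the paper's spelling correction. LeadMine itself additionally uses grammars and far richer curated resources:

```python
import re

# Illustrative dictionary only; a real system would hold millions of names.
DICTIONARY = {"acetone", "benzene", "toluene", "acetic acid"}

def edit_distance_leq1(a, b):
    """True if a and b differ by at most one insertion/deletion/substitution."""
    if abs(len(a) - len(b)) > 1:
        return False
    if a == b:
        return True
    for i in range(min(len(a), len(b))):
        if a[i] != b[i]:
            # Try substitution, deletion from a, or deletion from b.
            return a[i+1:] == b[i+1:] or a[i+1:] == b[i:] or a[i:] == b[i+1:]
    return True  # one string is a prefix of the other, lengths differ by 1

def find_entities(text):
    """Return (surface form, dictionary entry) pairs found in the text."""
    entities = []
    for tok in re.findall(r"[a-z]+(?: acid)?", text.lower()):
        if tok in DICTIONARY:
            entities.append((tok, tok))            # exact dictionary hit
        else:
            for entry in DICTIONARY:
                if edit_distance_leq1(tok, entry):
                    entities.append((tok, entry))  # spelling-corrected hit
                    break
    return entities

print(find_entities("Dissolve the sample in benzene, then wash with acetoone."))
# → [('benzene', 'benzene'), ('acetoone', 'acetone')]
```

Because each hit is tied to a specific dictionary entry, a misannotation points directly at the resource to fix, which is the traceability advantage the abstract emphasises.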
Baxter, Ruth; Taylor, Natalie; Kellar, Ian; Lawton, Rebecca
2016-01-01
Background The positive deviance approach focuses on those who demonstrate exceptional performance, despite facing the same constraints as others. ‘Positive deviants’ are identified and hypotheses about how they succeed are generated. These hypotheses are tested and then disseminated within the wider community. The positive deviance approach is being increasingly applied within healthcare organisations, although limited guidance exists and different methods, of varying quality, are used. This paper systematically reviews healthcare applications of the positive deviance approach to explore how positive deviance is defined, the quality of existing applications and the methods used within them, including the extent to which staff and patients are involved. Methods Peer-reviewed articles, published prior to September 2014, reporting empirical research on the use of the positive deviance approach within healthcare, were identified from seven electronic databases. A previously defined four-stage process for positive deviance in healthcare was used as the basis for data extraction. Quality assessments were conducted using a validated tool, and a narrative synthesis approach was followed. Results 37 of 818 articles met the inclusion criteria. The positive deviance approach was most frequently applied within North America, in secondary care, and to address healthcare-associated infections. Research predominantly identified positive deviants and generated hypotheses about how they succeeded. The approach and processes followed were poorly defined. Research quality was low, articles lacked detail and comparison groups were rarely included. Applications of positive deviance typically lacked staff and/or patient involvement, and the methods used often required extensive resources. Conclusion Further research is required to develop high quality yet practical methods which involve staff and patients in all stages of the positive deviance approach. 
The efficacy and efficiency of positive deviance must be assessed and compared with other quality improvement approaches. PROSPERO registration number CRD42014009365. PMID:26590198
LeadMine: a grammar and dictionary driven approach to entity recognition
2015-01-01
Background Chemical entity recognition has traditionally been performed by machine learning approaches. Here we describe an approach using grammars and dictionaries. This approach has the advantage that the entities found can be directly related to a given grammar or dictionary, which allows the type of an entity to be known and, if an entity is misannotated, indicates which resource should be corrected. As recognition is driven by what is expected, spelling errors can be detected and corrected. Correcting such errors is highly useful when attempting to look up an entity in a database or, in the case of chemical names, when converting them to structures. Results Our system uses a mixture of expertly curated grammars and dictionaries, as well as dictionaries automatically derived from public resources. We show that the heuristics developed to filter our dictionary of trivial chemical names (from PubChem) yield a better-performing dictionary than the previously published Jochem dictionary. Our final system performs post-processing steps to modify the boundaries of entities and to detect abbreviations. These steps are shown to significantly improve performance (by 2.6% and 4.0% F1-score, respectively). Our complete system, with incremental post-BioCreative workshop improvements, achieves 89.9% precision and 85.4% recall (87.6% F1-score) on the CHEMDNER test set. Conclusions Grammar and dictionary approaches can produce results at least as good as the current state of the art in machine learning approaches. While machine learning approaches are commonly thought of as "black box" systems, our approach directly links the output entities to the input dictionaries and grammars. It also allows correction of errors in detected entities, which can assist with entity resolution. PMID:25810776
Orbital Plotting of WDS 04545-0314 and WDS 04478+5318
NASA Astrophysics Data System (ADS)
Smith, Nick; Foster, Chris; Myers, Blake; Sepulveda, Barbel; Genet, Russell
2016-01-01
Students at Lincoln High School used the PlateSolve 3 program to obtain the position angle and separation of two double stars, WDS 04545-0314 and WDS 04478+5318. Both stars were observed at Kitt Peak on October 20, 2013. A java-based program developed by the team was used to plot the new data on the previously published orbital paths. It was determined that WDS 04545-0314 is maintaining the previously published orbital solution but that the orbit of WDS 04478+5318 may need to be revised.
Bratt, Ewa-Lena; Moons, Philip
2015-09-15
The first study on quality of life (QoL) in patients with congenital heart disease was published 40 years ago. Since then, the number of QoL articles on these patients has grown exponentially. We conducted a systematic literature review of all empirical studies on QoL in patients with congenital heart disease published since 1974, with the aim of determining the range of conceptual and methodological rigor of studies and identifying temporal trends in these parameters. PubMed, Embase, and Cinahl were searched for empirical studies addressing QoL in children, adolescents, or adults with congenital heart disease, published between January 1, 1974, and December 31, 2014. We applied 10 review criteria that were previously developed by Gill and Feinstein in 1994 and further refined by Moons et al. in 2004. Overall, 234 articles were reviewed. We found slight but non-significant temporal improvements in conceptual and methodological rigor and in use of assessment methods. This indicates a trend toward a more professional and exacting approach in QoL assessments. However, the majority of articles still had substantial conceptual and methodological deficits. Furthermore, we observed that citation of the publications of Gill and Feinstein and Moons et al. in published QoL research is associated with higher quality scores, suggesting that these articles have a positive impact on conceptual and methodological caliber. Despite 40 years of QoL research in this field, this review shows that major weaknesses in methodological rigor remain highly prevalent, which may make QoL studies inconclusive. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Giuliano Vanghetti and the innovation of "cineplastic operations".
Tropea, Peppino; Mazzoni, Alberto; Micera, Silvestro; Corbo, Massimo
2017-10-10
Developing functional artificial limbs for amputees has been a centuries-old challenge in medicine. We review the mechanical and neurologic principles of "cineplastic operations" and "plastic motors" used to restore movements in prostheses, with special attention to the work of Giuliano Vanghetti. We evaluated original publications describing cineplastic operations, biographic information, writings, drawings, and unpublished letters from the Vanghetti library, preserved in Empoli, Italy, and performed a bibliographic search and comparison for similar procedures in the literature. Vanghetti's method for cineplastic operations differs from similar previous methods, being the first aimed at exploiting natural movements of the remnant muscles to activate the mechanical prosthesis, and the first to do so by directly connecting the prosthesis to the residual muscles and tendons. This represented a frame-changing innovation for that time and paved the way for current neuroprosthetic approaches. The first description of the method was published in 1898 and human studies started in 1900. The results of these studies were presented in 1905 and published in 1906 in Plastic and Kinematic Prosthesis . A German surgeon, Ferdinand Sauerbruch, often acknowledged as the inventor of the method, published his first results in 1915. Vanghetti was the first to accurately perform and describe cineplastic operations for patients following an upper arm amputation. He considered the neurologic implications of the problem and, perhaps in an effort to provide more appropriate proprioceptive feedback, he intuitively applied the prostheses so that they were functionally activated by the muscles of the proximal stump. Copyright © 2017 The Author(s). Published by Wolters Kluwer Health, Inc. on behalf of the American Academy of Neurology.
Increasing algal photosynthetic productivity by integrating ecophysiology with systems biology.
Peers, Graham
2014-11-01
Oxygenic photosynthesis is the process by which plants, algae, and cyanobacteria convert sunlight and CO2 into chemical energy and biomass. Previously published estimates suggest that algal photosynthesis is, at best, able to convert approximately 5-7% of incident light energy to biomass and there is opportunity for improvement. Recent analyses of in situ photophysiology in mass cultures of algae and cyanobacteria show that cultivation methods can have detrimental effects on a cell's photophysiology - reinforcing the need to understand the complex responses of cell biology to a highly variable environment. A systems-based approach to understanding the stresses and efficiencies associated with light-energy harvesting, CO2 fixation, and carbon partitioning will be necessary to make major headway toward improving photosynthetic yields. Copyright © 2014 Elsevier Ltd. All rights reserved.
Dynamical effects in x-ray absorption spectra of graphene and monolayered h -BN on Ni(111)
NASA Astrophysics Data System (ADS)
Rusz, J.; Preobrajenski, A. B.; Ng, May Ling; Vinogradov, N. A.; Mårtensson, N.; Wessely, O.; Sanyal, B.; Eriksson, O.
2010-02-01
We present first-principles calculations of x-ray absorption spectra of graphene and hexagonal BN monolayer on the Ni(111) substrate. Including dynamical core-hole screening effects according to the theory of Mahan-Nozières-De Dominicis (MND) results in an overall good agreement with previously published experimental data and our new observations. This approach provides a unified first-principles description of the electronic structure and core excitations in the sp2-bonded materials on metal surfaces and a better insight into the dynamics of screening effects. We demonstrate in particular that the observed spectral features of graphene and hexagonal BN can be well reproduced with the MND theory, and that they are determined by a delicate balance between initial and final-state effects.
Ecologies, outreach, and the evolution of medical libraries.
Shen, Bern
2005-10-01
What are some of the forces shaping the evolution of medical libraries, and where might they lead? Published literature in the fields of library and information sciences, technology, health services research, and business was consulted. Medical libraries currently have a modest footprint in most consumers' personal health ecologies, the network of resources and activities they use to improve their health. They also occupy a relatively small space in the health care, information, and business ecologies of which they are a part. Several trends in knowledge discovery, technology, and social organizations point to ways in which the roles of medical libraries might grow and become more complex. As medical libraries evolve and reach out to previously underserved communities, an ecological approach can serve as a useful organizing framework for the forces shaping this evolution.
Cleft Rhinoplasty: Strategies for the Multiply Operated Nose.
Hsieh, Tsung-Yen; Dedhia, Raj; Tollefson, Travis T
2018-06-01
Rhinoplasty, as a surgical procedure to improve the appearance of the nose while preserving or improving function, is complicated and difficult to master. Revision cleft rhinoplasty offers another tier of challenge. The symmetry, proportions, and definition of the nose are affected by the native cleft deformity but also previous surgical scars, cartilage grafts, and skin excisions. Our preferred approach is to use structural cartilage grafting to establish septal and lower lateral cartilage resiliency. Internal lining deficiency is addressed with skin or lining transfer, while excess nasal tip thickness is contoured to improve definition. Of the utmost importance, the cleft nasal deformity cannot be considered in isolation, but rather a combined amalgamation of the lip muscle and scar, dentofacial occlusion, and skeletal maxillary deficiency. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.
Beyond "utilitarianism": maximizing the clinical impact of moral judgment research.
Rosas, Alejandro; Koenigs, Michael
2014-01-01
The use of hypothetical moral dilemmas--which pit utilitarian considerations of welfare maximization against emotionally aversive "personal" harms--has become a widespread approach for studying the neuropsychological correlates of moral judgment in healthy subjects, as well as in clinical populations with social, cognitive, and affective deficits. In this article, we propose that a refinement of the standard stimulus set could provide an opportunity to more precisely identify the psychological factors underlying performance on this task, and thereby enhance the utility of this paradigm for clinical research. To test this proposal, we performed a re-analysis of previously published moral judgment data from two clinical populations: neurological patients with prefrontal brain damage and psychopathic criminals. The results provide intriguing preliminary support for further development of this assessment paradigm.
NASA Technical Reports Server (NTRS)
Asenov, Asen; Slavcheva, G.; Brown, A. R.; Davies, J. H.; Saini, Subhash
1999-01-01
A detailed study of the influence of quantum effects in the inversion layer on the random dopant induced threshold voltage fluctuations and lowering in sub 0.1 micron MOSFETs has been performed. This has been achieved using a full 3D implementation of the density gradient (DG) formalism incorporated in our previously published 3D 'atomistic' simulation approach. This results in a consistent, fully 3D, quantum mechanical picture which implies not only the vertical inversion layer quantisation but also the lateral confinement effects manifested by current filamentation in the 'valleys' of the random potential fluctuations. We have shown that the net result of including quantum mechanical effects, while considering statistical fluctuations, is an increase in both threshold voltage fluctuations and lowering.
Generalized Effective Medium Theory for Particulate Nanocomposite Materials
Siddiqui, Muhammad Usama; Arif, Abul Fazal M.
2016-01-01
The thermal conductivity of particulate nanocomposites is strongly dependent on the size, shape, orientation and dispersion uniformity of the inclusions. To correctly estimate the effective thermal conductivity of the nanocomposite, all these factors should be included in the prediction model. In this paper, the formulation of a generalized effective medium theory for the determination of the effective thermal conductivity of particulate nanocomposites with multiple inclusions is presented. The formulated methodology takes into account all the factors mentioned above and can be used to model nanocomposites with multiple inclusions that are randomly oriented or aligned in a particular direction. The effect of inclusion dispersion non-uniformity is modeled using a two-scale approach. The applications of the formulated effective medium theory are demonstrated using previously published experimental and numerical results for several particulate nanocomposites. PMID:28773817
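The generalized theory above accounts for inclusion size, shape, orientation, and dispersion uniformity; the simplest member of this family of models, the classical Maxwell-Garnett result for dilute spherical inclusions, can be evaluated directly. The conductivity values below are illustrative assumptions, not taken from the paper.

```python
def maxwell_garnett(k_matrix, k_particle, phi):
    """Classical Maxwell-Garnett effective thermal conductivity for
    dilute spherical inclusions at volume fraction phi."""
    num = k_particle + 2 * k_matrix + 2 * phi * (k_particle - k_matrix)
    den = k_particle + 2 * k_matrix - phi * (k_particle - k_matrix)
    return k_matrix * num / den

# Alumina-like particles (k ~ 30 W/m-K) in an epoxy-like matrix (k ~ 0.2 W/m-K)
print(round(maxwell_garnett(0.2, 30.0, 0.1), 3))
```

At phi = 0 the expression reduces to the matrix conductivity, a useful sanity check; the paper's generalized formulation adds shape and orientation tensors on top of this baseline.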
NASA Astrophysics Data System (ADS)
Rao, J. Anand; Raju, R. Srinivasa; Bucchaiah, C. D.
2018-05-01
In this work, the effects of magnetohydrodynamic natural (free) convection of an incompressible, viscous, and electrically conducting non-Newtonian Jeffrey fluid over a semi-infinite, vertically inclined, permeable moving plate embedded in a porous medium are studied in the presence of heat absorption and heat and mass transfer. Using non-dimensional quantities, the governing non-linear partial differential equations are transformed into linear partial differential equations, and these equations, together with the associated boundary conditions, are solved numerically by a versatile, extensively validated, variational finite element method. The influence of the key parameters on the hydrodynamic, thermal, and concentration boundary layers is examined in detail and the results are shown graphically. Finally, the results are compared with previously published work and found to be in excellent agreement.
Manning's roughness coefficient for Illinois streams
Soong, David T.; Prater, Crystal D.; Halfar, Teresa M.; Wobig, Loren A.
2012-01-01
Manning's roughness coefficients for 43 natural and constructed streams in Illinois are reported and displayed on a U.S. Geological Survey Web site. At a majority of the sites, discharge and stage were measured, and corresponding Manning's coefficients—the n-values—were determined at more than one river discharge. The n-values discussed in this report are computed from data representing the stream reach studied and, therefore, are reachwise values. Presentation of the resulting n-values takes a visual-comparison approach similar to the previously published Barnes report (1967), in which photographs of channel conditions, description of the site, and the resulting n-values are organized for each site. The Web site where the data can be accessed and are displayed is at URL http://il.water.usgs.gov/proj/nvalues/.
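Back-calculating an n-value from field measurements follows directly from Manning's equation, Q = (1/n) A R^(2/3) S^(1/2) in SI units (USGS practice commonly uses English units, where a factor of 1.486 enters). The reach measurements below are hypothetical, not from the Illinois data set.

```python
def mannings_n(Q, A, R, S):
    """Back-calculate Manning's n from a measured discharge Q (m^3/s),
    flow area A (m^2), hydraulic radius R (m), and energy slope S (m/m),
    using Manning's equation Q = (1/n) * A * R**(2/3) * sqrt(S)."""
    return A * R ** (2.0 / 3.0) * S ** 0.5 / Q

# Hypothetical reach: Q = 55 m^3/s, A = 40 m^2, R = 1.8 m, S = 0.0005
print(round(mannings_n(55.0, 40.0, 1.8, 0.0005), 3))
```

Repeating this computation at several measured discharges is what yields the multiple reachwise n-values per site described in the report.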
NASA Technical Reports Server (NTRS)
Currit, P. A.
1983-01-01
The Cleanroom software development methodology is designed to take the gamble out of product releases for both suppliers and receivers of the software. The ingredients of this procedure are a life cycle of executable product increments, representative statistical testing, and a standard estimate of the MTTF (Mean Time To Failure) of the product at the time of its release. A statistical approach to software product testing using randomly selected samples of test cases is considered. A statistical model is defined for the certification process which uses the timing data recorded during test. A reasonableness argument for this model is provided that uses previously published data on software product execution. Also included is a derivation of the certification model estimators and a comparison of the proposed least squares technique with the more commonly used maximum likelihood estimators.
Shape optimization of road tunnel cross-section by simulated annealing
NASA Astrophysics Data System (ADS)
Sobótka, Maciej; Pachnicz, Michał
2016-06-01
The paper concerns shape optimization of a tunnel excavation cross-section. The study employs the simulated annealing (SA) optimization procedure. The form of the cost function derives from the energetic optimality condition formulated in the authors' previous papers. The algorithm takes advantage of the optimization procedure already published by the authors. Unlike other approaches presented in the literature, the one introduced in this paper takes into consideration the practical requirement of preserving a fixed clearance gauge. Itasca FLAC software is utilized in the numerical examples. Optimal excavation shapes are determined for five different in situ stress ratios. This factor significantly affects the optimal topology of the excavation. The resulting shapes are elongated in the direction of the greater principal stress. Moreover, the obtained optimal shapes have smooth contours circumscribing the gauge.
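A generic simulated-annealing loop of the kind referenced above can be sketched on a toy one-dimensional cost function. The cooling schedule, step size, and seed below are illustrative assumptions; the actual study couples SA to FLAC stress analyses and a clearance-gauge constraint.

```python
import math
import random

def simulated_annealing(cost, x0, step, t0=1.0, cooling=0.995, iters=5000, seed=1):
    """Generic simulated-annealing loop: propose a random perturbation,
    always accept improvements, and accept worse moves with probability
    exp(-delta / T), where the temperature T decays geometrically."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best_x, best_c = x, c
    T = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        cc = cost(cand)
        if cc < c or rng.random() < math.exp(-(cc - c) / T):
            x, c = cand, cc
            if c < best_c:
                best_x, best_c = x, c
        T *= cooling
    return best_x, best_c

# Toy one-dimensional cost with a global minimum at x = 2
best_x, best_c = simulated_annealing(lambda x: (x - 2.0) ** 2, x0=10.0, step=0.5)
print(round(best_x, 2))
```

In the shape-optimization setting, `x` would be a parameterized excavation contour and `cost` a FLAC evaluation of the energetic condition, with proposals rejected outright if they violate the clearance gauge.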
δ-dependency for privacy-preserving XML data publishing.
Landberg, Anders H; Nguyen, Kinh; Pardede, Eric; Rahayu, J Wenny
2014-08-01
An ever increasing amount of medical data, such as electronic health records, is being collected, stored, shared and managed in large online health information systems and electronic medical record systems (EMR) (Williams et al., 2001; Virtanen, 2009; Huang and Liou, 2007) [1-3]. From such rich collections, data is often published in the form of census and statistical data sets for the purpose of knowledge sharing and enabling medical research. This brings with it an increasing need to protect the privacy of individuals, which becomes an issue of great importance especially when information about patients is exposed to the public. While the concept of data privacy has been comprehensively studied for relational data, models and algorithms addressing the distinct differences and complex structure of XML data are yet to be explored. Currently, the common compromise is to convert private XML data into relational data for publication. This ad hoc approach results in significant loss of the useful semantic information previously carried in the private XML data. Health data often has a very complex structure, which is best expressed in XML. In fact, XML is the standard format for exchanging (e.g. HL7 version 3(1)) and publishing health information. The lack of means to deal directly with data in XML format is inevitably a serious drawback. In this paper we propose a novel privacy protection model for XML, and an algorithm for implementing this model. We provide general rules, both for transforming a private XML schema into a published XML schema, and for mapping private XML data to the new privacy-protected published XML data. In addition, we propose a new privacy property, δ-dependency, which can be applied to both relational and XML data, and that takes into consideration the hierarchical nature of sensitive data (as opposed to "quasi-identifiers").
Lastly, we provide an implementation of our model, algorithm and privacy property, and perform an experimental analysis, to demonstrate the proposed privacy scheme in practical application. Copyright © 2014. Published by Elsevier Inc.
An update on the clinical evidence that supports biosimilar approvals in Europe.
Mielke, Johanna; Jilma, Bernd; Jones, Byron; Koenig, Franz
2018-03-25
Sponsors and regulators have more than 10 years of experience with the development of biosimilars in Europe. However, the regulatory pathway is still evolving. The present article provides an update on biosimilar development in practice by reviewing the clinical development programmes of recently approved biosimilars in Europe. We used the European public assessment reports (EPARs) which are published by the European Medicines Agency (EMA) for a comparison of the clinical development programmes of the 37 approved biosimilars in Europe. Here, we present novel strategies in the development of biosimilars by focusing specifically on the 17 biosimilars that have gained approval in the last year, but we also compare additional key characteristics for all approved biosimilars. The high variability of the clinical development strategies that we found previously was confirmed in the present analysis. Compared with earlier biosimilar applications, more nonstandard development strategies have been used recently. This includes, for example, applications without any studies in patients, and more complex study designs. During this study, we found that the EPARs for biosimilars seem to be improving; however, we identified important details which were still often missing. We provide a proposal for a checklist of the minimum information that should be included in biosimilar EPARs for giving the general public insights into the rationale for the approval of biosimilars. European regulators still seem to be open to consider approaches that differ from the guidelines or previous applications, as long as justification is provided. © 2018 The Authors. British Journal of Clinical Pharmacology published by John Wiley & Sons Ltd on behalf of British Pharmacological Society.
Swedish women's perceptions of and conformity to feminine norms.
Kling, Johanna; Holmqvist Gattario, Kristina; Frisén, Ann
2017-06-01
The relatively high gender equality in the Swedish society is likely to exert an influence on gender role construction. Hence, the present research aimed to investigate Swedish women's perceptions of and conformity to feminine norms. A mixed methods approach with two studies was used. In Study 1, young Swedish women's gender role conformity, as measured by the Conformity to Feminine Norms Inventory 45 (CFNI-45), was compared to the results from previously published studies in Canada, the United States, and Slovakia. Overall, Swedish women displayed less conformity than their foreign counterparts, with the largest difference on the subscale Sexual fidelity. In Study 2, focus group interviews with young Swedish women added a more complex picture of feminine norms in the Swedish society. For instance the results indicated that Swedish women, while living in a society with a strong gender equality discourse, are torn between the perceived need to invest in their appearances and the risk of being viewed as non-equal when doing so. In sum, despite the fact that traditional gender roles are less pronounced in Sweden, gender role conformity is still a pressing issue. Since attending to the potential roles of feminine norms in women's lives previously has been proposed to be useful in counseling and therapeutic work, the present research also offers valuable information for both researchers and practitioners. [Correction added on 5 May 2017, after first online publication in April 2017: An incorrect Abstract was inadvertently captured in the published article and has been corrected in this current version.]. © 2017 Scandinavian Psychological Associations and John Wiley & Sons Ltd.
Vollert, Jan; Magerl, Walter; Baron, Ralf; Binder, Andreas; Enax-Krumova, Elena K; Geisslinger, Gerd; Gierthmühlen, Janne; Henrich, Florian; Hüllemann, Philipp; Klein, Thomas; Lötsch, Jörn; Maier, Christoph; Oertel, Bruno; Schuh-Hofer, Sigrid; Tölle, Thomas R; Treede, Rolf-Detlef
2018-06-01
As an indirect approach to relate previously identified sensory phenotypes of patients suffering from peripheral neuropathic pain to underlying mechanisms, we used a published sorting algorithm to estimate the prevalence of denervation, peripheral and central sensitization in 657 healthy subjects undergoing experimental models of nerve block (NB) (compression block and topical lidocaine), primary hyperalgesia (PH) (sunburn and topical capsaicin), or secondary hyperalgesia (intradermal capsaicin and electrical high-frequency stimulation), and in 902 patients suffering from neuropathic pain. Some of the data have been previously published. Randomized split-half analysis verified a good concordance with a priori mechanistic sensory profile assignment in the training (79%, Cohen κ = 0.54, n = 265) and the test set (81%, Cohen κ = 0.56, n = 279). Nerve blocks were characterized by pronounced thermal and mechanical sensory loss, but also mild pinprick hyperalgesia and paradoxical heat sensations. Primary hyperalgesia was characterized by pronounced gain for heat, pressure and pinprick pain, and mild thermal sensory loss. Secondary hyperalgesia was characterized by pronounced pinprick hyperalgesia and mild thermal sensory loss. Topical lidocaine plus topical capsaicin induced a combined phenotype of NB plus PH. Topical menthol was the only model with significant cold hyperalgesia. Sorting of the 902 patients into these mechanistic phenotypes led to a similar distribution as the original heuristic clustering (65% identity, Cohen κ = 0.44), but the denervation phenotype was more frequent than in heuristic clustering. These data suggest that sorting according to human surrogate models may be useful for mechanism-based stratification of neuropathic pain patients for future clinical trials, as encouraged by the European Medicines Agency.
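The concordance figures quoted above are chance-corrected with Cohen's kappa, which can be computed with the standard library alone. The phenotype labels below are toy data, not the study's.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: chance-corrected agreement between two labelings
    (here, a priori phenotype assignment vs. algorithmic sorting)."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    ca, cb = Counter(labels_a), Counter(labels_b)
    expected = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / (n * n)
    return (observed - expected) / (1 - expected)

# Toy example: two sorters agree on 8 of 10 phenotype labels
a = ["NB", "NB", "PH", "PH", "SH", "SH", "NB", "PH", "SH", "NB"]
b = ["NB", "NB", "PH", "PH", "SH", "SH", "NB", "SH", "PH", "NB"]
print(round(cohens_kappa(a, b), 3))
```

Note that raw percent agreement (80% here) always exceeds kappa, since kappa discounts the agreement expected by chance from the marginal label frequencies.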
Goekoop, Rutger; Goekoop, Jaap G.
2014-01-01
Introduction The vast number of psychopathological syndromes that can be observed in clinical practice can be described in terms of a limited number of elementary syndromes that are differentially expressed. Previous attempts to identify elementary syndromes have shown limitations that have slowed progress in the taxonomy of psychiatric disorders. Aim To examine the ability of network community detection (NCD) to identify elementary syndromes of psychopathology and move beyond the limitations of current classification methods in psychiatry. Methods 192 patients with unselected mental disorders were tested on the Comprehensive Psychopathological Rating Scale (CPRS). Principal component analysis (PCA) was performed on the bootstrapped correlation matrix of symptom scores to extract the principal component structure (PCS). An undirected and weighted network graph was constructed from the same matrix. Network community structure (NCS) was optimized using a previously published technique. Results In the optimal network structure, network clusters showed an 89% match with the principal components of psychopathology. Six network clusters were found: "DEPRESSION", "MANIA", "ANXIETY", "PSYCHOSIS", "RETARDATION", and "BEHAVIORAL DISORGANIZATION". Network metrics were used to quantify the continuities between the elementary syndromes. Conclusion We present the first comprehensive network graph of psychopathology that is free from the biases of previous classifications: a 'Psychopathology Web'. Clusters within this network represent elementary syndromes that are connected via a limited number of bridge symptoms. Many problems of previous classifications can be overcome by using a network approach to psychopathology. PMID:25427156
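Network community detection of the kind used above typically optimizes Newman modularity. A minimal sketch that evaluates the modularity Q of a given partition is below; the six-node toy graph and partition are illustrative, not the CPRS symptom network.

```python
def modularity(adj, communities):
    """Newman modularity Q of a partition: fraction of edge weight inside
    communities minus the expectation under random degree-preserving rewiring,
    Q = (1/2m) * sum_ij [A_ij - k_i*k_j/2m] for i, j in the same community."""
    m2 = sum(sum(row) for row in adj)           # 2m for an undirected graph
    degree = [sum(row) for row in adj]
    comm = {}
    for c, members in enumerate(communities):
        for node in members:
            comm[node] = c
    q = 0.0
    n = len(adj)
    for i in range(n):
        for j in range(n):
            if comm[i] == comm[j]:
                q += adj[i][j] - degree[i] * degree[j] / m2
    return q / m2

# Two triangles joined by a single bridge edge (nodes 0-2 and 3-5)
adj = [
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
]
print(round(modularity(adj, [[0, 1, 2], [3, 4, 5]]), 3))
```

Community-detection algorithms search over partitions to maximize Q; the bridge edge between the two triangles here plays the same role as the "bridge symptoms" connecting elementary syndromes in the abstract.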
Lash, R. Ryan; Johansson, Michael A.; Sharp, Tyler M.; Henry, Ronnie; Brady, Oliver J.; Sotir, Mark J.; Hay, Simon I.; Margolis, Harold S.; Brunette, Gary W.
2016-01-01
Background: International travel can expose travellers to pathogens not commonly found in their countries of residence, like dengue virus. Travellers and the clinicians who advise and treat them have unique needs for understanding the geographic extent of risk for dengue. Specifically, they should assess the need for prevention measures before travel and ensure appropriate treatment of illness post-travel. Previous dengue-risk maps published in the Centers for Disease Control and Prevention’s Yellow Book lacked specificity, as there was a binary (risk, no risk) classification. We developed a process to compile evidence, evaluate it and apply more informative risk classifications. Methods: We collected more than 839 observations from official reports, ProMED reports and published scientific research for the period 2005–2014. We classified each location as frequent/continuous risk if there was evidence of more than 10 dengue cases in at least three of the previous 10 years. For locations that did not fit this criterion, we classified locations as sporadic/uncertain risk if the location had evidence of at least one locally acquired dengue case during the last 10 years. We used expert opinion in limited instances to augment available data in areas where data were sparse. Results: Initial categorizations classified 134 areas as frequent/continuous and 140 areas as sporadic/uncertain. CDC subject matter experts reviewed all initial frequent/continuous and sporadic/uncertain categorizations and the previously uncategorized areas. From this review, most categorizations stayed the same; however, 11 categorizations changed from the initial determinations. Conclusions: These new risk classifications enable detailed consideration of dengue risk, with clearer meaning and a direct link to the evidence that supports the specific classification. 
Since many infectious diseases have dynamic risk, strong geographical heterogeneities and varying data quality and availability, using this approach for other diseases can improve the accuracy, clarity and transparency of risk communication. PMID:27625400
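The two classification rules stated in the Methods above are simple enough to encode directly. The sketch below does so; the 'no evidence' fallback label is an assumption of this illustration, not a category from the paper.

```python
def classify_dengue_risk(annual_cases, locally_acquired_any):
    """Apply the two risk rules described above to a location's last
    10 years of case counts: 'frequent/continuous' if more than 10 cases
    occurred in at least 3 of the 10 years; otherwise 'sporadic/uncertain'
    if at least one locally acquired case occurred; otherwise unclassified."""
    years_over_10 = sum(1 for c in annual_cases if c > 10)
    if years_over_10 >= 3:
        return "frequent/continuous"
    if locally_acquired_any:
        return "sporadic/uncertain"
    return "no evidence"

# Three years (25, 40, 12 cases) exceed the 10-case threshold
print(classify_dengue_risk([0, 25, 40, 3, 12, 0, 0, 5, 0, 0], True))
```

Making the rules explicit like this is what gives the published classifications their direct, auditable link back to the underlying evidence.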
Gene genealogies for genetic association mapping, with application to Crohn's disease
Burkett, Kelly M.; Greenwood, Celia M. T.; McNeney, Brad; Graham, Jinko
2013-01-01
A gene genealogy describes relationships among haplotypes sampled from a population. Knowledge of the gene genealogy for a set of haplotypes is useful for estimation of population genetic parameters and it also has potential application in finding disease-predisposing genetic variants. As the true gene genealogy is unknown, Markov chain Monte Carlo (MCMC) approaches have been used to sample genealogies conditional on data at multiple genetic markers. We previously implemented an MCMC algorithm to sample from an approximation to the distribution of the gene genealogy conditional on haplotype data. Our approach samples ancestral trees, recombination and mutation rates at a genomic focal point. In this work, we describe how our sampler can be used to find disease-predisposing genetic variants in samples of cases and controls. We use a tree-based association statistic that quantifies the degree to which case haplotypes are more closely related to each other around the focal point than control haplotypes, without relying on a disease model. As the ancestral tree is a latent variable, so is the tree-based association statistic. We show how the sampler can be used to estimate the posterior distribution of the latent test statistic and corresponding latent p-values, which together comprise a fuzzy p-value. We illustrate the approach on a publicly available dataset from a study of Crohn's disease that consists of genotypes at multiple SNP markers in a small genomic region. We estimate the posterior distribution of the tree-based association statistic and the recombination rate at multiple focal points in the region. Reassuringly, the posterior mean recombination rates estimated at the different focal points are consistent with previously published estimates. The tree-based association approach finds multiple sub-regions where the case haplotypes are more genetically related than the control haplotypes, suggesting that there may be one or multiple disease-predisposing loci. 
PMID:24348515
Smith, Eric G.
2015-01-01
Background: Nonrandomized studies typically cannot account for confounding from unmeasured factors. Method: A method is presented that exploits the recently-identified phenomenon of “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors. Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure. Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results: Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met. Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations: Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions: To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward. 
The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output. PMID:25580226
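The core arithmetic the abstract describes, dividing the between-model change in the treatment effect estimate by the estimated confounding amplification after adjusting for the introduced variable's association with outcome, can be sketched as follows. The function name and all numbers are illustrative assumptions, not values from the paper:

```python
# Hedged sketch of the estimator described in the abstract. The function
# name and all numeric values are illustrative, not the paper's.

def residual_confounding(effect_base, effect_amplified, amplification,
                         outcome_adjustment=0.0):
    """Divide the change in the treatment effect estimate between the two
    nested propensity score models by the estimated degree of confounding
    amplification, after adjusting for any direct association of the
    introduced variable(s) with outcome."""
    delta = (effect_amplified - effect_base) - outcome_adjustment
    return delta / amplification

# Made-up example: the effect estimate moves from 1.30 to 1.45 after adding
# a strong predictor of exposure, with amplification estimated at 0.50.
print(residual_confounding(1.30, 1.45, 0.50))  # approximately 0.30
```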
Making the Leap to Desktop Publishing.
ERIC Educational Resources Information Center
Schleifer, Neal
1986-01-01
Describes one teacher's approach to desktop publishing. Explains how the Macintosh and LaserWriter were used in the publication of a school newspaper. Guidelines are offered to teachers for the establishment of a desktop publishing lab. (ML)
Aprotinin; friend or foe? A review of recent medical literature.
Royston, D; van Haaften, N; De Vooght, P
2007-01-01
Recent articles published in peer-reviewed journals have questioned the safety of using aprotinin in patients undergoing heart surgery. Evidence has also been published suggesting an increase in renal events in patients given aprotinin compared with those given tranexamic acid. The present review focuses principally on the first of these articles in relation to previously published data and experience.
Mammographic texture synthesis using genetic programming and clustered lumpy background
NASA Astrophysics Data System (ADS)
Castella, Cyril; Kinkel, Karen; Descombes, François; Eckstein, Miguel P.; Sottas, Pierre-Edouard; Verdun, Francis R.; Bochud, François O.
2006-03-01
In this work we investigated the digital synthesis of images that mimic the real textures observed in mammograms. Such images could be produced in unlimited numbers with tunable statistical properties in order to study human and model observer performance in perception experiments. We used the previously developed clustered lumpy background (CLB) technique and optimized its parameters with a genetic algorithm (GA). In order to maximize the realism of the textures, we combined the GA objective approach with psychophysical experiments involving the judgments of radiologists. Thirty-six statistical features were computed and averaged over 1000 regions of interest from real mammograms. The same features were measured for the synthetic textures, and the Mahalanobis distance was used to quantify the similarity of the features between the real and synthetic textures. The similarity, as measured by the Mahalanobis distance, was used as the GA fitness function for evolving the free CLB parameters. In the psychophysical approach, experienced radiologists were asked to judge the realism of synthetic images by considering typical structures that are expected to be found on real mammograms: glandular and fatty areas, and fiber crossings. Results show that CLB images found via optimization with the GA are significantly closer to real mammograms than previously published images. Moreover, the psychophysical experiments confirm that all of the above-mentioned structures are reproduced well in the generated images. This means that we can generate an arbitrarily large database of textures mimicking mammograms with traceable statistical properties.
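The fitness function described above can be illustrated generically: the Mahalanobis distance between the averaged feature vectors of real and synthetic textures, to be minimized by the GA. This is a sketch with random placeholder features, not the authors' code or data:

```python
import numpy as np

# Generic sketch (not the authors' code): Mahalanobis distance between the
# averaged feature vectors of real and synthetic textures, usable as a GA
# fitness to minimize. Feature values here are random placeholders.

rng = np.random.default_rng(0)
real_features = rng.normal(size=(1000, 36))  # 36 features over 1000 real ROIs

mu_real = real_features.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(real_features, rowvar=False))

def fitness(synthetic_features):
    """Smaller distance = synthetic feature statistics closer to real ones."""
    d = synthetic_features.mean(axis=0) - mu_real
    return float(np.sqrt(d @ cov_inv @ d))

print(fitness(real_features))        # 0.0: identical statistics
print(fitness(real_features + 1.0))  # > 0: shifted features are penalized
```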
Hannigan, Geoffrey D.; Duhaime, Melissa B.; Koutra, Danai
2018-01-01
Viruses and bacteria are critical components of the human microbiome and play important roles in health and disease. Most previous work has relied on studying bacteria and viruses independently, thereby reducing them to two separate communities. Such approaches are unable to capture how these microbial communities interact, such as through processes that maintain community robustness or allow phage-host populations to co-evolve. We implemented a network-based analytical approach to describe phage-bacteria network diversity throughout the human body. We built these community networks using a machine learning algorithm to predict which phages could infect which bacteria in a given microbiome. Our algorithm was applied to paired viral and bacterial metagenomic sequence sets from three previously published human cohorts. We organized the predicted interactions into networks that allowed us to evaluate phage-bacteria connectedness across the human body. We observed evidence that gut and skin network structures were person-specific and not conserved among cohabitating family members. High-fat diets appeared to be associated with less connected networks. Network structure differed between skin sites, with those exposed to the external environment being less connected and likely more susceptible to network degradation by microbial extinction events. This study quantified and contrasted the diversity of virome-microbiome networks across the human body and illustrated how environmental factors may influence phage-bacteria interactive dynamics. This work provides a baseline for future studies to better understand system perturbations, such as disease states, through ecological networks. PMID:29668682
Żurek-Biesiada, Dominika; Szczurek, Aleksander T; Prakash, Kirti; Mohana, Giriram K; Lee, Hyun-Keun; Roignant, Jean-Yves; Birk, Udo J; Dobrucki, Jurek W; Cremer, Christoph
2016-05-01
Higher order chromatin structure is not only required to compact and spatially arrange long chromatids within a nucleus, but also has important functional roles, including control of gene expression and DNA processing. However, studies of chromatin nanostructures cannot be performed using conventional widefield and confocal microscopy because of the limited optical resolution. Various methods of superresolution microscopy, such as structured illumination and single molecule localization microscopy, have been described to overcome this difficulty. We report here that the standard DNA dye Vybrant(®) DyeCycle™ Violet can be used to provide single molecule localization microscopy (SMLM) images of DNA in nuclei of fixed mammalian cells. This SMLM method enabled optical isolation and localization of large numbers of DNA-bound molecules, usually in excess of 10^6 signals in one cell nucleus. The technique yielded high-quality images of nuclear DNA density, revealing subdiffraction chromatin structures on the order of 100 nm in size; the interchromatin compartment was visualized at unprecedented optical resolution. The approach offers several advantages over previously described high resolution DNA imaging methods, including high specificity, the ability to record images using a single excitation wavelength, and a higher density of single molecule signals than reported in previous SMLM studies. The method is compatible with DNA/multicolor SMLM imaging, which employs simple staining methods suited also for conventional optical microscopy. Copyright © 2016. Published by Elsevier Inc.
Genetic susceptibility factors for alcohol-induced chronic pancreatitis.
Aghdassi, Ali A; Weiss, F Ulrich; Mayerle, Julia; Lerch, Markus M; Simon, Peter
2015-07-01
Chronic pancreatitis is a progressive inflammatory disease of the pancreas, frequently associated with immoderate alcohol consumption. Since only a small proportion of alcoholics eventually develop chronic pancreatitis, genetic susceptibility factors have long been suspected to contribute to the pathogenesis of the disease. Smaller studies in ethnically defined populations have found that not only polymorphisms in proteins involved in the metabolism of ethanol, such as alcohol dehydrogenase and aldehyde dehydrogenase, can confer a risk for developing chronic pancreatitis, but also mutations that had previously been reported in association with idiopathic pancreatitis, such as SPINK1 mutations. In a much broader approach employing genome-wide search strategies, the NAPS study found that polymorphisms in the trypsin locus (PRSS1 rs10273639) and the claudin 2 locus (CLDN2-RIPPLY1-MORC4 locus, rs7057398 and rs12688220) confer an increased risk of developing alcohol-induced pancreatitis. These results from North America have now been confirmed by a European consortium. In another genome-wide approach, polymorphisms in the gene encoding fucosyltransferase 2 (FUT2), non-secretor status, and blood group B were found not only in association with higher serum lipase levels in healthy volunteers but also to more than double the risk for developing alcohol-associated chronic pancreatitis. These novel genetic associations will allow the pathophysiological and biochemical basis of alcohol-induced chronic pancreatitis to be investigated on a cellular level and in much more detail than previously possible. Copyright © 2015 IAP and EPC. Published by Elsevier B.V. All rights reserved.
A morphometric system to distinguish sheep and goat postcranial bones
2017-01-01
Distinguishing between the bones of sheep and goat is a notorious challenge in zooarchaeology. Several methodological contributions have been published at different times and by various people to facilitate this task, largely relying on a macro-morphological approach. This is now routinely adopted by zooarchaeologists but, although it certainly has its value, has also been shown to have limitations. Morphological discriminant criteria can vary in different populations and correct identification is highly dependent upon a researcher’s experience, availability of appropriate reference collections, and many other factors that are difficult to quantify. There is therefore a need to establish a more objective system, susceptible to scrutiny. In order to fulfil such a requirement, this paper offers a comprehensive morphometric method for the identification of sheep and goat postcranial bones, using a sample of more than 150 modern skeletons as a basis, and building on previous pioneering work. The proposed method is based on measurements—some newly created, others previously published—and its use is recommended in combination with the more traditional morphological approach. Measurement ratios, used to translate morphological traits into biometrical attributes, are demonstrated to have substantial diagnostic potential, with the vast majority of specimens correctly assigned to species. The efficacy of the new method is also tested with Discriminant Analysis, which provides a successful verification of the biometrical indices, a statistical means to select the most promising measurements, and an additional line of analysis to be used in conjunction with the others. PMID:28594831
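The measurement-ratio idea can be shown with a toy example. The measurement names, values, and cut-off below are invented for illustration; they are not the indices published in the paper:

```python
# Toy illustration of a measurement ratio ("biometrical index") with a
# species cut-off. Measurements, values, and threshold are invented, not
# the paper's published indices.

def biometrical_index(distal_breadth, greatest_length):
    # Ratios remove overall size and keep shape information.
    return distal_breadth / greatest_length

def classify(index, threshold=0.22):
    # Hypothetical rule: broader-for-length bones are called sheep here.
    return "sheep" if index > threshold else "goat"

print(classify(biometrical_index(6.1, 25.0)))  # "sheep" (index = 0.244)
```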
Sensory dissociation in chronic low back pain: Two case reports.
Adamczyk, Wacław M; Luedtke, Kerstin; Saulicz, Oskar; Saulicz, Edward
2018-08-01
Patients with chronic low back pain often report that they do not perceive their painful back accurately. Previous studies confirmed that sensory dissociation and/or discrepancy between perceived body image and actual size is one of the specific traits of patients with chronic pain. Current approaches for measuring sensory dissociation are limited to two-point discrimination or rely on pain drawings that do not allow quantitative analysis. This case study reports the sensory dissociation of two cases with chronic low back pain using a recently published test (the point-to-point test, PTP) and a newly developed test (two-point estimation, TPE). Both patients mislocalized tactile stimuli delivered to the painful location compared to non-painful locations (PTP test). In addition, both patients perceived their painful lumbar region differently from non-painful sites above, below, and contralateral to the painful site. TPE data showed two distinct clinical patterns of sensory dissociation: one patient perceived the two-point distance in the painful area as expanded, while the other perceived it as shrunk. The latter pattern of sensory dissociation (i.e., shrunk) is likely to respond to sensory training. Whether enlarged patterns of sensory dissociation are more resistant to treatment remains unknown, but this would explain the low effectiveness of previous studies using sensory training in chronic low back pain populations. Subgrouping patients according to their sensory discrimination pattern could contribute to the choice and effectiveness of the treatment approach.
Omer-Salim, Amal; Suri, Shobha; Dadhich, Jai Prakash; Faridi, Mohammad Moonis Akbar; Olsson, Pia
2014-12-01
Women's agency, or intentional actions, in combining breastfeeding and employment is significant for health and labour productivity. Previous research in India showed that mothers use various collaborative strategies to ensure a "good enough" combination of breastfeeding and employment. Bandura's theoretical agency constructs previously applied in various realms could facilitate the exploration of agency in an Indian context. To explore manifestations of agency in combining breastfeeding and employment amongst Indian health workers using Bandura's theoretical constructs of agency and women's experiences. Qualitative semi-structured interviews were conducted with ten women employees within the governmental health sector in New Delhi, India. Both deductive and inductive qualitative content analyses were used. Bandura's features and modes of agency revealed that intentionality is underpinned by knowledge, forethought means being prepared, self-reactiveness includes collaboration and that self-reflectiveness gives perspective. Women's interviews revealed four approaches to agency, entitled: 'All within my stride, or the knowledgeable navigator'; 'Much harder than expected, but ok overall'; 'This is a very lonely job'; and 'Out of my control'. Agency features and their elements are complex, dynamic and involve family members. Bandura's theoretical agency constructs are partially useful in this context, but additional social practice constructs of family structure and relationship quality are needed for better correspondence with women's experiences of agency. The variation in individual approaches to agency has implications for supportive health and workplace services. Copyright © 2014 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.
Thompson, Alexander E; Meredig, Bryce; Wolverton, C
2014-03-12
We have created an improved xenon interatomic potential for use with existing UO2 potentials. This potential was fit to density functional theory calculations with the Hubbard U correction (DFT + U) using a genetic algorithm approach called iterative potential refinement (IPR). We examine the defect energetics of the IPR-fitted xenon interatomic potential as well as other, previously published xenon potentials. We compare these potentials to DFT + U derived energetics for a series of xenon defects in a variety of incorporation sites (large, intermediate, and small vacant sites). We find the existing xenon potentials overestimate the energy needed to add a xenon atom to a wide set of defect sites representing a range of incorporation sites, including failing to correctly rank the energetics of the small incorporation site defects (xenon in an interstitial and xenon in a uranium site neighboring uranium in an interstitial). These failures are due to problematic descriptions of Xe-O and/or Xe-U interactions of the previous xenon potentials. These failures are corrected by our newly created xenon potential: our IPR-generated potential gives good agreement with DFT + U calculations to which it was not fitted, such as xenon in an interstitial (small incorporation site) and xenon in a double Schottky defect cluster (large incorporation site). Finally, we note that IPR is very flexible and can be applied to a wide variety of potential forms and materials systems, including metals and EAM potentials.
Masterminding a Masterpiece: A Guide to Publishing Your Institution's History.
ERIC Educational Resources Information Center
Curtis, Melinda Burdette
1984-01-01
Suggestions for undertaking and succeeding at publishing an institutional history address these issues: expectations, funding, choosing an author, stirring interest among alumni, involving alumni and older faculty, determining a writing approach, layout, and publishing. (MSE)
Staley, Dennis M.; Negri, Jacquelyn; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.
2017-01-01
Early warning of post-fire debris-flow occurrence during intense rainfall has traditionally relied upon a library of regionally specific empirical rainfall intensity–duration thresholds. Development of this library and the calculation of rainfall intensity-duration thresholds often require several years of monitoring local rainfall and hydrologic response to rainstorms, a time-consuming approach where results are often only applicable to the specific region where data were collected. Here, we present a new, fully predictive approach that utilizes rainfall, hydrologic response, and readily available geospatial data to predict rainfall intensity–duration thresholds for debris-flow generation in recently burned locations in the western United States. Unlike the traditional approach to defining regional thresholds from historical data, the proposed methodology permits the direct calculation of rainfall intensity–duration thresholds for areas where no such data exist. The thresholds calculated by this method are demonstrated to provide predictions that are of similar accuracy, and in some cases outperform, previously published regional intensity–duration thresholds. The method also provides improved predictions of debris-flow likelihood, which can be incorporated into existing approaches for post-fire debris-flow hazard assessment. Our results also provide guidance for the operational expansion of post-fire debris-flow early warning systems in areas where empirically defined regional rainfall intensity–duration thresholds do not currently exist.
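Thresholds of this kind are often expressed in a power-law intensity-duration form, I = a * D^b. A hedged sketch of applying such a threshold might look like the following; the coefficients are invented, not values from this study:

```python
# Sketch of applying a rainfall intensity-duration threshold in the common
# power-law form I = a * D**b. Coefficients are invented, not this study's:
# rainfall at or above the threshold intensity for its duration would
# trigger a debris-flow warning.

def threshold_intensity(duration_h, a=12.0, b=-0.7):
    # Short bursts must be intense; long storms can be gentler.
    return a * duration_h ** b

def exceeds(intensity_mm_h, duration_h):
    return intensity_mm_h >= threshold_intensity(duration_h)

print(exceeds(20.0, 0.5))  # True: a short, intense burst
print(exceeds(3.0, 6.0))   # False: long but gentle rain
```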
Dolci, Ricardo Landini Lutaif; Todeschini, Alexandre Bossi; Santos, Américo Rubens Leite Dos; Lazarini, Paulo Roberto
2018-04-19
One of the main concerns in endoscopic endonasal approaches to the skull base has been the high incidence and morbidity associated with cerebrospinal fluid leaks. The introduction and routine use of vascularized flaps allowed a marked decrease in this complication, followed by a great expansion in the indications and techniques used in endoscopic endonasal approaches, extending to defects from huge tumours and previously inaccessible areas of the skull base. We describe the technique of performing endoscopic double flap multi-layered reconstruction of the anterior skull base without craniotomy. We provide a step-by-step description of the endoscopic double flap technique (nasoseptal and pericranial vascularized flaps and fascia lata free graft) as used and illustrated in two patients with an olfactory groove meningioma who underwent an endoscopic approach. Both patients achieved a gross total resection; subsequent reconstruction of the anterior skull base was performed with the nasoseptal and pericranial flaps onlay and a fascia lata free graft inlay. Both patients showed an excellent recovery, with no signs of cerebrospinal fluid leak, meningitis, flap necrosis, chronic meningeal or sinonasal inflammation, or cerebral herniation. The endoscopic double flap technique described here is a viable, versatile and safe option for anterior skull base reconstructions, decreasing the incidence of complications in endoscopic endonasal approaches. Copyright © 2018 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.
DATA ASSIMILATION APPROACH FOR FORECAST OF SOLAR ACTIVITY CYCLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kitiashvili, Irina N., E-mail: irina.n.kitiashvili@nasa.gov
Numerous attempts to predict future solar cycles are mostly based on empirical relations derived from observations of previous cycles, and they yield a wide range of predicted strengths and durations of the cycles. Results obtained with current dynamo models also deviate strongly from each other, thus raising questions about criteria to quantify the reliability of such predictions. The primary difficulties in modeling future solar activity are shortcomings of both the dynamo models and observations that do not allow us to determine the current and past states of the global solar magnetic structure and its dynamics. Data assimilation is a relatively new approach to develop physics-based predictions and estimate their uncertainties in situations where the physical properties of a system are not well-known. This paper presents an application of the ensemble Kalman filter method for modeling and prediction of solar cycles through use of a low-order nonlinear dynamo model that includes the essential physics and can describe general properties of the sunspot cycles. Despite the simplicity of this model, the data assimilation approach provides reasonable estimates for the strengths of future solar cycles. In particular, the prediction of Cycle 24 calculated and published in 2008 is so far holding up quite well. In this paper, I will present my first attempt to predict Cycle 25 using the data assimilation approach, and discuss the uncertainties of that prediction.
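The ensemble Kalman filter at the heart of this approach can be sketched generically. The following is a minimal analysis (update) step for a toy two-component state, not the paper's dynamo model; all values are placeholders:

```python
import numpy as np

# Minimal ensemble Kalman filter analysis (update) step for a toy
# two-component state. A generic sketch of the assimilation idea, not the
# paper's low-order dynamo model.

rng = np.random.default_rng(1)

def enkf_update(ensemble, obs, obs_var, H):
    """ensemble: (n_members, n_state); obs: scalar observation;
    H: (n_state,) linear observation operator."""
    n = len(ensemble)
    X = ensemble - ensemble.mean(axis=0)
    P = X.T @ X / (n - 1)              # sample state covariance
    S = H @ P @ H + obs_var            # innovation variance
    K = P @ H / S                      # Kalman gain
    perturbed_obs = obs + rng.normal(0.0, np.sqrt(obs_var), size=n)
    innovations = perturbed_obs - ensemble @ H
    return ensemble + np.outer(innovations, K)

ensemble = rng.normal(0.0, 2.0, size=(100, 2))   # wide prior ensemble
updated = enkf_update(ensemble, obs=1.0, obs_var=0.1, H=np.array([1.0, 0.0]))
# After the update, the observed component clusters near the observation.
```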
Privacy preserving protocol for detecting genetic relatives using rare variants.
Hormozdiari, Farhad; Joo, Jong Wha J; Wadia, Akshay; Guan, Feng; Ostrosky, Rafail; Sahai, Amit; Eskin, Eleazar
2014-06-15
High-throughput sequencing technologies have impacted many areas of genetic research. One such area is the identification of relatives from genetic data. The standard approach for the identification of genetic relatives collects the genomic data of all individuals and stores it in a database. Then, each pair of individuals is compared to detect the set of genetic relatives, and the matched individuals are informed. The main drawback of this approach is the requirement to share your genetic data with a trusted third party to perform the relatedness test. In this work, we propose a secure protocol to detect genetic relatives from sequencing data without exposing any information about their genomes. We assume that individuals have access to their genome sequences but do not want to share their genomes with anyone else. Unlike previous approaches, our approach uses both common and rare variants, which provides the ability to detect much more distant relationships securely. We use simulated data generated from the 1000 Genomes data and illustrate that we can easily detect up to fifth-degree cousins, which was not possible using existing methods. We also show, in 1000 Genomes data with cryptic relationships, that our method can detect these individuals. The software is freely available for download at http://genetics.cs.ucla.edu/crypto/. © The Author 2014. Published by Oxford University Press.
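The intuition for why rare variants are informative can be shown in plain, non-secure form; the paper's contribution is performing a comparison like this under a cryptographic protocol. Variant names and frequencies below are invented:

```python
# Toy, non-private illustration of why rare variants help relatedness
# detection: unrelated people rarely share them. Variant names and
# population frequencies are invented placeholders.

def shared_rare_variants(genome_a, genome_b, freqs, rare_cutoff=0.01):
    rare = {v for v, f in freqs.items() if f < rare_cutoff}
    return len(genome_a & genome_b & rare)

freqs = {"v1": 0.30, "v2": 0.005, "v3": 0.001, "v4": 0.25}
alice = {"v1", "v2", "v3"}
bob = {"v2", "v3", "v4"}
print(shared_rare_variants(alice, bob, freqs))  # 2 rare variants in common
```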
Chuang, Sheuwen; Howley, Peter P; Lin, Shih-Hua
2015-05-01
Root cause analysis (RCA) is often adopted to complement epidemiologic investigation for outbreaks and infection-related adverse events in hospitals; however, RCA has been argued to have limited effectiveness in preventing such events. We describe how an innovative systems analysis approach halted repeated scabies outbreaks, and highlight the importance of systems thinking for outbreaks analysis and sustaining effective infection prevention and control. Following RCA for a third successive outbreak of scabies over a 17-month period in a 60-bed respiratory care ward of a Taiwan hospital, a systems-oriented event analysis (SOEA) model was used to reanalyze the outbreak. Both approaches and the recommendations were compared. No nosocomial scabies have been reported for more than 1975 days since implementation of the SOEA. Previous intervals between seeming eradication and repeat outbreaks following RCA were 270 days and 180 days. Achieving a sustainable positive resolution relied on applying systems thinking and the holistic analysis of the system, not merely looking for root causes of events. To improve the effectiveness of outbreaks analysis and infection control, an emphasis on systems thinking is critical, along with a practical approach to ensure its effective implementation. The SOEA model provides the necessary framework and is a viable complementary approach, or alternative, to RCA. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Modeling startle eyeblink electromyogram to assess fear learning.
Khemka, Saurabh; Tzovara, Athina; Gerster, Samuel; Quednow, Boris B; Bach, Dominik R
2017-02-01
Pavlovian fear conditioning is widely used as a laboratory model of associative learning in human and nonhuman species. In this model, an organism is trained to predict an aversive unconditioned stimulus from initially neutral events (conditioned stimuli, CS). In humans, fear memory is typically measured via conditioned autonomic responses or fear-potentiated startle. For the latter, various analysis approaches have been developed, but a systematic comparison of competing methodologies is lacking. Here, we investigate the suitability of a model-based approach to startle eyeblink analysis for assessment of fear memory, and compare this to extant analysis strategies. First, we build a psychophysiological model (PsPM) on a generic startle response. Then, we optimize and validate this PsPM on three independent fear-conditioning data sets. We demonstrate that our model can robustly distinguish aversive (CS+) from nonaversive stimuli (CS-, i.e., has high predictive validity). Importantly, our model-based approach captures fear-potentiated startle during fear retention as well as fear acquisition. Our results establish a PsPM-based approach to assessment of fear-potentiated startle, and qualify previous peak-scoring methods. Our proposed model represents a generic startle response and can potentially be used beyond fear conditioning, for example, to quantify affective startle modulation or prepulse inhibition of the acoustic startle response. © 2016 The Authors. Psychophysiology published by Wiley Periodicals, Inc. on behalf of Society for Psychophysiological Research.
Schwessinger, Ron; Suciu, Maria C; McGowan, Simon J; Telenius, Jelena; Taylor, Stephen; Higgs, Doug R; Hughes, Jim R
2017-10-01
In the era of genome-wide association studies (GWAS) and personalized medicine, predicting the impact of single nucleotide polymorphisms (SNPs) in regulatory elements is an important goal. Current approaches to determine the potential of regulatory SNPs depend on inadequate knowledge of cell-specific DNA binding motifs. Here, we present Sasquatch, a new computational approach that uses DNase footprint data to estimate and visualize the effects of noncoding variants on transcription factor binding. Sasquatch performs a comprehensive k-mer-based analysis of DNase footprints to determine any k-mer's potential for protein binding in a specific cell type and how this may be changed by sequence variants. Therefore, Sasquatch uses an unbiased approach, independent of known transcription factor binding sites and motifs. Sasquatch only requires a single DNase-seq data set per cell type, from any genotype, and produces consistent predictions from data generated by different experimental procedures and at different sequence depths. Here we demonstrate the effectiveness of Sasquatch using previously validated functional SNPs and benchmark its performance against existing approaches. Sasquatch is available as a versatile webtool incorporating publicly available data, including the human ENCODE collection. Thus, Sasquatch provides a powerful tool and repository for prioritizing likely regulatory SNPs in the noncoding genome. © 2017 Schwessinger et al.; Published by Cold Spring Harbor Laboratory Press.
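The k-mer idea can be caricatured as follows: assign each k-mer a footprint-derived binding potential, then score a variant by the change in total potential over the k-mers it overlaps. This is a conceptual toy with invented numbers, not Sasquatch's actual scoring:

```python
# Conceptual toy of a k-mer footprint score (invented numbers, not
# Sasquatch's actual model): each k-mer carries a DNase-derived binding
# potential, and a variant's effect is the change in total potential over
# the k-mers it overlaps.

K = 3
potential = {"GAT": 2.0, "ATA": 0.5, "TAA": 0.1,   # hypothetical potentials
             "GAC": 0.3, "ACA": 0.2, "CAA": 0.1}

def score(seq):
    return sum(potential.get(seq[i:i + K], 0.0)
               for i in range(len(seq) - K + 1))

def snp_effect(ref_seq, alt_seq):
    # Negative values suggest the variant weakens predicted binding.
    return score(alt_seq) - score(ref_seq)

print(snp_effect("GATAA", "GACAA"))  # negative: the T>C change weakens binding
```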
Identifying well-formed biomedical phrases in MEDLINE® text.
Kim, Won; Yeganova, Lana; Comeau, Donald C; Wilbur, W John
2012-12-01
In the modern world people frequently interact with retrieval systems to satisfy their information needs. Humanly understandable, well-formed phrases represent a crucial interface between humans and the web, and the ability to index and search with such phrases is beneficial for human-web interactions. In this paper we consider the problem of identifying humanly understandable, well-formed, high-quality biomedical phrases in MEDLINE documents. The main approaches used previously for detecting such phrases are syntactic, statistical, and a hybrid approach combining the two. In this paper we propose a supervised learning approach for identifying high-quality phrases. First we obtain a set of known well-formed, useful phrases from an existing source and label these phrases as positive. We then extract from MEDLINE a large set of multiword strings that do not contain stop words or punctuation. We believe this unlabeled set contains many well-formed phrases, and our goal is to identify these additional high-quality phrases. We examine various feature combinations and several machine learning strategies designed to solve this problem. A proper choice of machine learning methods and features identifies strings in the large collection that are likely to be high-quality phrases. We evaluate our approach by making human judgments on multiword strings extracted from MEDLINE using our methods. We find that over 85% of the extracted phrase candidates are humanly judged to be of high quality. Published by Elsevier Inc.
Sacks, G; Swinburn, B; Kraak, V; Downs, S; Walker, C; Barquera, S; Friel, S; Hawkes, C; Kelly, B; Kumanyika, S; L'Abbé, M; Lee, A; Lobstein, T; Ma, J; Macmullan, J; Mohan, S; Monteiro, C; Neal, B; Rayner, M; Sanders, D; Snowdon, W; Vandevijvere, S
2013-10-01
Private-sector organizations play a critical role in shaping the food environments of individuals and populations. However, there is currently very limited independent monitoring of private-sector actions related to food environments. This paper reviews previous efforts to monitor the private sector in this area, and outlines a proposed approach to monitor private-sector policies and practices related to food environments, and their influence on obesity and non-communicable disease (NCD) prevention. A step-wise approach to data collection is recommended, in which the first ('minimal') step is the collation of publicly available food and nutrition-related policies of selected private-sector organizations. The second ('expanded') step assesses the nutritional composition of each organization's products, their promotions to children, their labelling practices, and the accessibility, availability and affordability of their products. The third ('optimal') step includes data on other commercial activities that may influence food environments, such as political lobbying and corporate philanthropy. The proposed approach will be further developed and piloted in countries of varying size and income levels. There is potential for this approach to enable national and international benchmarking of private-sector policies and practices, and to inform efforts to hold the private sector to account for their role in obesity and NCD prevention. © 2013 The Authors. Obesity Reviews published by John Wiley & Sons Ltd on behalf of the International Association for the Study of Obesity.
Review of Static Approaches to Surgical Correction of Presbyopia
Zare Mehrjerdi, Mohammad Ali; Mohebbi, Masomeh; Zandian, Mehdi
2017-01-01
Presbyopia is the primary cause of reduction in the quality of life of people in their 40s, due to dependence on spectacles. Therefore, presbyopia correction has become an evolving and rapidly progressive field in refractive surgery. There are two primary options for presbyopia correction: the dynamic approach uses the residual accommodative capacity of the eye, and the static approach attempts to enhance the depth of focus of the optical system. The dynamic approach attempts to reverse suspected pathophysiologic changes. Dynamic approaches such as accommodative intraocular lenses (IOLs), scleral expansion techniques, refilling, and photodisruption of the crystalline lens have attracted less clinical interest due to inconsistent results and the complexity of the techniques. We have reviewed the most popular static techniques in presbyopia surgery, including multifocal IOLs, PresbyLASIK, and corneal inlays, but we should emphasize that these techniques are very different from the physiologic status of an untouched eye. A systematic PubMed search for the keywords “presbylasik”, “multifocal IOL”, and “presbyopic corneal inlay” revealed 634 articles; 124 were controlled clinical trials, 95 were published in the previous 10 years, and 78 were in English with full text available. We reviewed the abstracts and rejected the unrelated articles; other references were included as needed. This narrative review compares different treatments according to available information on the optical basis of each treatment modality, including clinical outcomes such as near, intermediate, and far visual acuity, spectacle independence, quality of vision, and dysphotopic phenomena. PMID:29090052
Mahmoudi, Zeinab; Johansen, Mette Dencker; Christiansen, Jens Sandahl
2014-01-01
Background: The purpose of this study was to investigate the effect of using a 1-point calibration approach instead of a 2-point calibration approach on the accuracy of a continuous glucose monitoring (CGM) algorithm. Method: A previously published real-time CGM algorithm was compared with its updated version, which used a 1-point calibration instead of a 2-point calibration. In addition, the contribution of the corrective intercept (CI) to the calibration performance was assessed. Finally, the sensor background current was estimated in real time and retrospectively. The study was performed on 132 type 1 diabetes patients. Results: Replacing the 2-point calibration with the 1-point calibration improved the CGM accuracy, with the greatest improvement achieved in hypoglycemia (18.4% median absolute relative difference [MARD] in hypoglycemia for the 2-point calibration, and 12.1% MARD in hypoglycemia for the 1-point calibration). Using the 1-point calibration increased the percentage of sensor readings in zone A+B of the Clarke error grid analysis (EGA) over the full glycemic range, and also enhanced hypoglycemia sensitivity. Exclusion of the CI from calibration reduced hypoglycemia accuracy, while slightly increasing euglycemia accuracy. Both real-time and retrospective estimation of the sensor background current suggest that the background current can be considered zero in the calibration of the SCGM1 sensor. Conclusions: The sensor readings calibrated with the 1-point calibration approach showed higher accuracy than those calibrated with the 2-point calibration approach. PMID:24876420
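The accuracy metric quoted above, the median absolute relative difference, is straightforward to compute from paired sensor and reference readings. A minimal sketch (glucose values here are invented, not from the study):

```python
from statistics import median

def mard(cgm, reference):
    """Median absolute relative difference (%) between paired CGM
    readings and reference glucose values."""
    if len(cgm) != len(reference) or not cgm:
        raise ValueError("need equal-length, non-empty series")
    # Each term is |sensor - reference| / reference; report the median as %.
    return 100.0 * median(abs(c - r) / r for c, r in zip(cgm, reference))
```

Computing `mard` separately over the hypoglycemic subset of pairs would reproduce the kind of range-stratified comparison (18.4% vs 12.1%) reported in the abstract.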
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-16
... SECURITIES AND EXCHANGE COMMISSION Agency Meeting Federal Register Citation of Previous Announcement: [To be published] Status: Open Meeting. Place: 100 F. Street, NE., Washington, DC. Date and Time of Previously Announced Meeting: September 15, 2010. Change In the Meeting: Room Change. The Joint...
RNA motif search with data-driven element ordering.
Rampášek, Ladislav; Jimenez, Randi M; Lupták, Andrej; Vinař, Tomáš; Brejová, Broňa
2016-05-18
In this paper, we study the problem of RNA motif search in long genomic sequences. This approach uses a combination of sequence and structure constraints to uncover new distant homologs of known functional RNAs. The problem is NP-hard and is traditionally solved by backtracking algorithms. We have designed a new algorithm for RNA motif search and implemented it in a new motif search tool, RNArobo. The tool enhances the RNAbob descriptor language, allowing insertions in helices, which enables better characterization of ribozymes and aptamers. A typical RNA motif consists of multiple elements, and the running time of the algorithm is highly dependent on their ordering. By approaching the element ordering problem in a principled way, we demonstrate a more than 100-fold speedup of the search for complex motifs compared to previously published tools. We have developed a new method for RNA motif search that allows a significant speedup of the search for complex motifs that include pseudoknots. Such speed improvements are crucial at a time when the rate of DNA sequencing outpaces growth in computing power. RNArobo is available at http://compbio.fmph.uniba.sk/rnarobo.
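The intuition behind data-driven element ordering can be shown with a toy model: in a backtracking search that aborts on the first failed element, testing the most selective element first prunes candidate windows fastest. This is a simplified sketch, not RNArobo's algorithm; element names and match probabilities are hypothetical and assume independent elements with unit evaluation cost.

```python
def expected_cost(order, match_prob):
    """Expected number of element evaluations per candidate window when
    elements are tested in `order` and the search aborts on first mismatch."""
    cost, survive = 0.0, 1.0
    for elem in order:
        cost += survive              # evaluated only for surviving windows
        survive *= match_prob[elem]  # fraction of windows still matching
    return cost

def best_order(match_prob):
    """Greedy rule: test the lowest-probability (most selective) elements
    first; for unit costs this minimizes the expected work."""
    return sorted(match_prob, key=lambda e: match_prob[e])
```

Even in this toy, putting a rare element like a constrained loop first cuts the expected per-window cost well below the worst ordering; real genomic scan statistics replace the invented probabilities.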
CE-MS for metabolomics: Developments and applications in the period 2014-2016.
Ramautar, Rawi; Somsen, Govert W; de Jong, Gerhardus J
2017-01-01
CE-MS can be considered a useful analytical technique for the global profiling of (highly) polar and charged metabolites in various samples. Over the past few years, significant advancements have been made in CE-MS approaches for metabolomics studies. In this paper, which is a follow-up of a previous review paper covering the years 2012-2014 (Electrophoresis 2015, 36, 212-224), recent CE-MS strategies developed for metabolomics covering the literature from July 2014 to June 2016 are outlined. Attention will be paid to new CE-MS approaches for the profiling of anionic metabolites and the potential of SPE coupled to CE-MS is also demonstrated. Representative examples illustrate the applicability of CE-MS in the fields of biomedical, clinical, microbial, plant, and food metabolomics. A complete overview of recent CE-MS-based metabolomics studies is given in a table, which provides information on sample type and pretreatment, capillary coatings, and MS detection mode. Finally, general conclusions and perspectives are given. © 2016 The Authors ELECTROPHORESIS Published by Wiley-VCH Verlag GmbH & Co. KGaA.
Enhancing photocurrent transient spectroscopy by electromagnetic modeling.
Diesinger, H; Panahandeh-Fard, M; Wang, Z; Baillargeat, D; Soci, C
2012-05-01
The shape and duration of photocurrent transients generated by a photoconductive switch depend on both the intrinsic response of the active material and the geometry of the transmission line structure. The electromagnetic model presented here decouples these two shape-forming contributions. In contrast to previously published work, it accounts for the particular operating mode of transient spectroscopy. The objective is to increase the time resolution in two ways: by optimizing the structural response and by deconvolving it from experimental data. The switch structure is represented by an effective transimpedance onto which the active material acts as a current generator. As proof of concept, the response of a standard microstrip switch is modeled and deconvolved from experimental data acquired in GaAs, yielding a single-exponential material response and hence supporting the validity of the approach. Beyond compensating for the response deterioration caused by the structure, switch architectures can be optimized a priori with respect to frequency response. As an example, it is shown that a microstrip gap, which can be deposited on materials incompatible with standard lithography, reduces pulse broadening by an order of magnitude if it is provided with transitions to coplanar access lines.
Xu, Chet C; Chan, Roger W; Sun, Han; Zhan, Xiaowei
2017-11-01
A mixed-effects model approach was introduced in this study for the statistical analysis of rheological data of vocal fold tissues, in order to account for the data correlation caused by multiple measurements of each tissue sample across the test frequency range. Such data correlation has often been overlooked in previous studies. The viscoelastic shear properties of the vocal fold lamina propria of two commonly used laryngeal research animal species (i.e., rabbit and porcine) were measured by a linear, controlled-strain simple-shear rheometer. Along with published canine and human rheological data, the vocal fold viscoelastic shear moduli of these animal species were compared to those of humans over a frequency range of 1-250 Hz using the mixed-effects models. Our results indicated that tissues of the rabbit, canine and porcine vocal fold lamina propria were significantly stiffer and more viscous than those of humans. Mixed-effects models were shown to be able to more accurately analyze rheological data generated from repeated measurements. Copyright © 2017 Elsevier Ltd. All rights reserved.
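The correlation problem the abstract raises can be made concrete with a toy example: each tissue sample yields many readings across frequencies, so those readings are not independent observations. Averaging within samples first is a crude fix that exposes the issue; a mixed-effects model handles it properly by fitting per-sample random effects. All data values below are invented.

```python
from statistics import mean

# Hypothetical shear-modulus readings: sample id -> readings at
# several test frequencies for that one tissue sample.
measurements = {
    "rabbit_1": [10.1, 10.3, 10.2],
    "rabbit_2": [12.0, 12.2, 12.1],
    "rabbit_3": [11.0, 11.1, 10.9],
}

# Naive pooling treats all 9 readings as independent data points,
# understating the uncertainty of any between-species comparison.
n_naive = sum(len(v) for v in measurements.values())

# Collapsing to one summary value per sample gives the honest count
# of independent experimental units.
sample_means = [mean(v) for v in measurements.values()]
n_effective = len(sample_means)
```

In practice one would fit a model with a random intercept per sample (e.g. via a mixed-models library) rather than pre-averaging, but the inflation of the apparent sample size shown here is exactly what the authors argue earlier studies overlooked.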
Mechanistic modelling of drug release from a polymer matrix using magnetic resonance microimaging.
Kaunisto, Erik; Tajarobi, Farhad; Abrahmsen-Alami, Susanna; Larsson, Anette; Nilsson, Bernt; Axelsson, Anders
2013-03-12
In this paper a new model describing drug release from a polymer matrix tablet is presented. The utilization of the model is described as a two step process where, initially, polymer parameters are obtained from a previously published pure polymer dissolution model. The results are then combined with drug parameters obtained from literature data in the new model to predict solvent and drug concentration profiles and polymer and drug release profiles. The modelling approach was applied to the case of a HPMC matrix highly loaded with mannitol (model drug). The results showed that the drug release rate can be successfully predicted, using the suggested modelling approach. However, the model was not able to accurately predict the polymer release profile, possibly due to the sparse amount of usable pure polymer dissolution data. In addition to the case study, a sensitivity analysis of model parameters relevant to drug release was performed. The analysis revealed important information that can be useful in the drug formulation process. Copyright © 2013 Elsevier B.V. All rights reserved.
Inductively Coupled Plasma Mass Spectrometry (ICP-MS) Applications in Quantitative Proteomics.
Chahrour, Osama; Malone, John
2017-01-01
Recent advances in inductively coupled plasma mass spectrometry (ICP-MS) hyphenated to different separation techniques have promoted it as a valuable tool in protein/peptide quantification. These emerging ICP-MS applications allow absolute quantification by measuring specific elemental responses. One approach quantifies elements already present in the structure of the target peptide (e.g. phosphorus and sulphur) as natural tags. Quantification of these natural tags allows the elucidation of the degree of protein phosphorylation in addition to absolute protein quantification. A separate approach is based on utilising bi-functional labelling substances (those containing ICP-MS detectable elements), that form a covalent chemical bond with the protein thus creating analogs which are detectable by ICP-MS. Based on the previously established stoichiometries of the labelling reagents, quantification can be achieved. This technique is very useful for the design of precise multiplexed quantitation schemes to address the challenges of biomarker screening and discovery. This review discusses the capabilities and different strategies to implement ICP-MS in the field of quantitative proteomics. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
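The natural-tag quantification logic described above reduces to simple stoichiometric arithmetic: a measured elemental amount divided by the known number of atoms of that element per peptide gives the peptide amount, and the phosphorus-to-peptide ratio gives the phosphorylation degree. A back-of-envelope sketch with invented numbers, not a description of any specific published workflow:

```python
def peptide_amount(sulfur_pmol, s_atoms_per_peptide):
    """Peptide amount (pmol) inferred from measured sulfur, using the
    known count of S atoms (Cys/Met residues) per peptide as the tag."""
    return sulfur_pmol / s_atoms_per_peptide

def phosphorylation_degree(p_pmol, peptide_pmol, sites):
    """Fraction of available phosphosites occupied, from measured
    phosphorus relative to the peptide amount."""
    return p_pmol / (peptide_pmol * sites)
```

The same division-by-stoichiometry step applies to labelled approaches, with the element count per label taking the place of the natural S or P count.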
Progress in Defining Disease: Improved Approaches and Increased Impact.
Schwartz, Peter H
2017-08-01
In a series of recent papers, I have made three arguments about how to define "disease" and evaluate and apply possible definitions. First, I have argued that definitions should not be seen as traditional conceptual analyses, but instead as proposals about how to define and use the term "disease" in the future. Second, I have pointed out and attempted to address a challenge for dysfunction-requiring accounts of disease that I call the "line-drawing" problem: distinguishing between low-normal functioning and dysfunctioning. Finally, I have used a dysfunction-requiring approach to argue that some extremely prevalent conditions, such as high blood pressure, high cholesterol, and ductal carcinoma in situ, are not diseases, but instead are risk factors. Four of the papers in this issue directly engage my previous work. In this commentary, I applaud the advances these authors make, address points of disagreement, and make suggestions about where the discussion should go next. © The Author 2017. Published by Oxford University Press, on behalf of the Journal of Medicine and Philosophy Inc. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
The impact of capillary backpressure on spontaneous counter-current imbibition in porous media
NASA Astrophysics Data System (ADS)
Foley, Amir Y.; Nooruddin, Hasan A.; Blunt, Martin J.
2017-09-01
We investigate the impact of capillary backpressure on spontaneous counter-current imbibition. For such displacements in strongly water-wet systems, the non-wetting phase is forced out through the inlet boundary as the wetting phase imbibes into the rock, creating a finite capillary backpressure. Under the assumption that capillary backpressure depends on the water saturation applied at the inlet boundary of the porous medium, its impact is determined using the continuum modelling approach by varying the imposed inlet saturation in the analytical solution. We present analytical solutions for the one-dimensional incompressible horizontal displacement of a non-wetting phase by a wetting phase in a porous medium. There exists an inlet saturation value above which any change in capillary backpressure has a negligible impact on the solutions. Above this threshold value, imbibition rates and front positions are largely invariant. A method for identifying this inlet saturation is proposed using an analytical procedure and we explore how varying multiphase flow properties affects the analytical solutions and this threshold saturation. We show the value of this analytical approach through the analysis of previously published experimental data.
Cameron, Andrew; Lui, Dorothy; Boroomand, Ameneh; Glaister, Jeffrey; Wong, Alexander; Bizheva, Kostadinka
2013-01-01
Optical coherence tomography (OCT) allows for non-invasive 3D visualization of biological tissue at cellular level resolution. Often hindered by speckle noise, the visualization of important biological tissue details in OCT that can aid disease diagnosis can be improved by speckle noise compensation. A challenge with handling speckle noise is its inherent non-stationary nature, where the underlying noise characteristics vary with the spatial location. In this study, an innovative speckle noise compensation method is presented for handling the non-stationary traits of speckle noise in OCT imagery. The proposed approach centers on a non-stationary spline-based speckle noise modeling strategy to characterize the speckle noise. The novel method was applied to ultra high-resolution OCT (UHROCT) images of the human retina and corneo-scleral limbus acquired in-vivo that vary in tissue structure and optical properties. Test results showed improved performance of the proposed novel algorithm compared to a number of previously published speckle noise compensation approaches in terms of higher signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR) and better overall visual assessment.
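The evaluation metrics named above, SNR and CNR, are computed from region statistics of the denoised images. A minimal sketch with region intensity lists standing in for image patches (the exact SNR/CNR definitions vary across the OCT literature; these are common forms, not necessarily the ones used in this study):

```python
import math
from statistics import mean, stdev

def snr_db(signal_region, noise_region):
    """SNR in dB: mean signal amplitude over the spread of a
    background (noise-only) region."""
    return 20 * math.log10(mean(signal_region) / stdev(noise_region))

def cnr(region_a, region_b):
    """Contrast-to-noise ratio between two tissue regions: mean
    difference normalized by the pooled standard deviation."""
    return abs(mean(region_a) - mean(region_b)) / math.sqrt(
        (stdev(region_a) ** 2 + stdev(region_b) ** 2) / 2)
```

Speckle compensation aims to raise both numbers: suppressing the noise spread boosts `snr_db`, while preserving mean tissue intensities keeps the contrast term in `cnr` intact.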
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harding, James, E-mail: drjames.harding@btinternet.com; Mortimer, Alex; Kelly, Michael
Percutaneous cholecystostomy is a minimally invasive procedure for providing gallbladder decompression, often in critically ill patients. It can be used in malignant biliary obstruction following failed endoscopic retrograde cholangiopancreatography when the intrahepatic ducts are not dilated or when stent insertion is not possible via the bile ducts. In properly selected patients, percutaneous cholecystostomy in obstructive jaundice is a simple, safe, and rapid option for biliary decompression, thus avoiding the morbidity and mortality involved with percutaneous transhepatic biliary stenting. Subsequent use of a percutaneous cholecystostomy for definitive biliary stent placement is an attractive concept and leaves patients with no external drain. To the best of our knowledge, it has only been described on three previous occasions in the published literature, on each occasion forced by surgical or technical considerations. Traditionally, anatomic/technical considerations and the risk of bile leak have precluded such an approach, but improvements in catheter design and manufacture may now make it more feasible. We report a case of successful interval metal stent placement via percutaneous cholecystostomy which was preplanned and achieved excellent palliation for the patient. The pros and cons of the procedure and approach are discussed.
Submucosal surgery: novel interventions in the third space.
Teitelbaum, Ezra N; Swanstrom, Lee L
2018-02-01
Traditional surgeries involve accessing body cavities, such as the abdomen and thorax, via incisions that divide skin and muscle. These operations result in postoperative pain and convalescence, and a risk of complications such as wound infection and hernia. The development of flexible endoscopy allowed diseases as varied as gastrointestinal bleeding and colon adenomas to be treated without incisions, but this technique is restricted by its endoluminal nature. A novel category of surgical endoscopic procedures has recently been developed that uses flexible endoscopic techniques to enter and access the submucosa of the gastrointestinal tract. Through this approach, the advantages of incisionless endoscopy can be applied to areas of the body that previously could only be reached with surgery. This Review introduces this new class of interventions by describing two examples of such submucosal surgeries for the treatment of benign gastrointestinal disease: per-oral endoscopic myotomy and per-oral pyloromyotomy. The approach to pre-procedure patient evaluation, operative technique, and the published outcomes are discussed, as well as potential future applications of similar techniques and procedures in this so-called third space. Copyright © 2018 Elsevier Ltd. All rights reserved.
The Cost of Crime to Society: New Crime-Specific Estimates for Policy and Program Evaluation
French, Michael T.; Fang, Hai
2010-01-01
Estimating the cost to society of individual crimes is essential to the economic evaluation of many social programs, such as substance abuse treatment and community policing. A review of the crime-costing literature reveals multiple sources, including published articles and government reports, which collectively represent the alternative approaches for estimating the economic losses associated with criminal activity. Many of these sources are based upon data that are more than ten years old, indicating a need for updated figures. This study presents a comprehensive methodology for calculating the cost to society of various criminal acts. Tangible and intangible losses are estimated using the most current data available. The selected approach, which incorporates both the cost-of-illness and the jury compensation methods, yields cost estimates for more than a dozen major crime categories, including several categories not found in previous studies. Updated crime cost estimates can help government agencies and other organizations execute more prudent policy evaluations, particularly benefit-cost analyses of substance abuse treatment or other interventions that reduce crime. PMID:20071107
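The costing arithmetic described above combines two components per offense: tangible losses (medical, productivity, property) in the cost-of-illness style, and an intangible loss drawn from jury-compensation data. A hypothetical illustration; every figure below is invented and does not come from the study:

```python
def crime_cost(tangible, intangible):
    """Per-offense cost to society: sum of itemized tangible losses
    plus a single intangible (pain-and-suffering) estimate."""
    return sum(tangible.values()) + intangible

# Invented example figures for one offense category.
example_cost = crime_cost(
    tangible={"medical": 4000, "lost_productivity": 9000, "property": 500},
    intangible=85000,  # hypothetical jury-compensation-based estimate
)
```

A benefit-cost analysis would then multiply such per-offense costs by the number of offenses a program averts.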
The influence of pressure relaxation on the structure of an axial vortex
NASA Astrophysics Data System (ADS)
Ash, Robert L.; Zardadkhan, Irfan; Zuckerwar, Allan J.
2011-07-01
Governing equations including the effects of pressure relaxation have been utilized to study an incompressible, steady-state viscous axial vortex with specified far-field circulation. When sound generation is attributed to a velocity gradient tensor-pressure gradient product, the modified conservation of momentum equations that result yield an exact solution for a steady, incompressible axial vortex. The vortex velocity profile has been shown to closely approximate experimental vortex measurements in air and water over a wide range of circulation-based Reynolds numbers. The influence of temperature and humidity on the pressure relaxation coefficient in air has been examined using theoretical and empirical approaches, and published axial vortex experiments have been employed to estimate the pressure relaxation coefficient in water. Non-equilibrium pressure gradient forces have been shown to balance the viscous stresses in the vortex core region, and the predicted pressure deficits that result from this non-equilibrium balance can be substantially larger than the pressure deficits predicted using a Bernoulli equation approach. Previously reported pressure deficit distributions for dust devils and tornados have been employed to validate the non-equilibrium pressure deficit predictions.
The cost of crime to society: new crime-specific estimates for policy and program evaluation.
McCollister, Kathryn E; French, Michael T; Fang, Hai
2010-04-01
Estimating the cost to society of individual crimes is essential to the economic evaluation of many social programs, such as substance abuse treatment and community policing. A review of the crime-costing literature reveals multiple sources, including published articles and government reports, which collectively represent the alternative approaches for estimating the economic losses associated with criminal activity. Many of these sources are based upon data that are more than 10 years old, indicating a need for updated figures. This study presents a comprehensive methodology for calculating the cost to society of various criminal acts. Tangible and intangible losses are estimated using the most current data available. The selected approach, which incorporates both the cost-of-illness and the jury compensation methods, yields cost estimates for more than a dozen major crime categories, including several categories not found in previous studies. Updated crime cost estimates can help government agencies and other organizations execute more prudent policy evaluations, particularly benefit-cost analyses of substance abuse treatment or other interventions that reduce crime. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
Jacquemin, Denis; Moore, Barry; Planchat, Aurélien; Adamo, Carlo; Autschbach, Jochen
2014-04-08
Using a set of 40 conjugated molecules, we assess the performance of an "optimally tuned" range-separated hybrid functional in reproducing the experimental 0-0 energies. The selected protocol accounts for the impact of solvation using a corrected linear-response continuum approach and vibrational corrections through calculations of the zero-point energies of both ground and excited states, and provides basis-set-converged data thanks to the systematic use of diffuse-containing atomic basis sets at all computational steps. It turns out that an optimally tuned long-range corrected hybrid form of the Perdew-Burke-Ernzerhof functional, LC-PBE*, delivers both the smallest mean absolute error (0.20 eV) and standard deviation (0.15 eV) of all tested approaches, while the obtained correlation (0.93) is large but remains slightly smaller than its M06-2X counterpart (0.95). In addition, the efficiency of two other recently developed exchange-correlation functionals, namely SOGGA11-X and ωB97X-D, has been determined in order to allow more complete comparisons with previously published data.
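The benchmark statistics quoted above (mean absolute error, standard deviation, correlation) are standard summaries of computed-versus-experimental energies. A minimal sketch of how such numbers are obtained from paired values (the energies below are invented, not the paper's data):

```python
import math
from statistics import mean, stdev

def error_stats(computed, experimental):
    """MAE, error standard deviation, and Pearson correlation between
    computed and experimental 0-0 energies (both in the same units, e.g. eV)."""
    errors = [c - e for c, e in zip(computed, experimental)]
    mae = mean(abs(d) for d in errors)
    sd = stdev(errors)
    mx, my = mean(computed), mean(experimental)
    r = (sum((c - mx) * (e - my) for c, e in zip(computed, experimental))
         / math.sqrt(sum((c - mx) ** 2 for c in computed)
                     * sum((e - my) ** 2 for e in experimental)))
    return mae, sd, r
```

A functional can score well on MAE while lagging on correlation (or vice versa), which is why the abstract reports all three numbers side by side.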
[Community acquired pneumonia in children: Outpatient treatment and prevention].
Moreno-Pérez, D; Andrés Martín, A; Tagarro García, A; Escribano Montaner, A; Figuerola Mulet, J; García García, J J; Moreno-Galdó, A; Rodrigo Gonzalo de Lliria, C; Ruiz Contreras, J; Saavedra Lozano, J
2015-12-01
There have been significant changes in community acquired pneumonia (CAP) in children in the last decade. These changes relate to epidemiology and clinical presentation. Resistance to antibiotics is also a changing issue. These all have to be considered when treating CAP. In this document, two of the main Spanish pediatric societies involved in the treatment of CAP in children propose a consensus concerning the therapeutic approach. These societies are the Spanish Society of Paediatric Infectious Diseases and the Spanish Society of Paediatric Chest Diseases. The Advisory Committee on Vaccines of the Spanish Association of Paediatrics (CAV-AEP) has also been involved in the prevention of CAP. An attempt is made to provide up-to-date guidelines to all paediatricians. This first part of the statement presents the approach to previously healthy children treated as outpatients, and reviews prevention with currently available vaccines. In a second part, special situations and complicated forms will be addressed. Copyright © 2014 Asociación Española de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.
Lyons, Karl M; Darby, Ivan
2017-06-01
Periodontics cannot be practiced in isolation, as many patients have multiple dental needs or medical health issues requiring management. In addition, pathology may manifest in the periodontal tissues, and the onset and progression of periodontitis can be affected by systemic conditions, such as diabetes, and vice versa. The focus of this volume of Periodontology 2000 is interdisciplinary periodontics, and the articles included discuss the interactions and the interrelationship between periodontal tissues/periodontal diseases and endodontics, fixed prosthodontics, implant dentistry, esthetics, gerodontology, radiology, orthodontics, pediatric dentistry, oral and maxillofacial surgery, oral pathology, special needs dentistry and general medicine. Previous volumes of Periodontology 2000 have covered some of the interactions between periodontal diseases and other dental disciplines, especially implant dentistry, and the interaction between periodontal disease and systemic disease, but there has not been a volume on interdisciplinary periodontics. The intention therefore is to show how and why periodontics should be interdisciplinary, as well as the benefits of an interdisciplinary approach; in addition, the potential consequences of using a discipline in isolation are discussed. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Neurocognitive performance as an endophenotype for bipolar disorder.
Raust, Aurelie; Daban, Claire; Cochet, Barbara; Henry, Chantal; Bellivier, Frank; Scott, Jan
2014-01-01
Identification of the underlying liability to develop bipolar disorders (BD) is hindered by the genetic complexity and phenotypic heterogeneity of the disease. The use of endophenotypes has been acknowledged as a promising approach that may detect the hidden manifestations of a genetic liability for an illness. One of the most commonly proposed endophenotypes in BD is neurocognitive performance. We identified and examined previously published review articles containing data pertaining to endophenotypes in BD and combined this with an extensive review of studies of cognitive deficits in BD from 2000 onwards. Using criteria for a valid endophenotype, we identified the domains of executive functioning and verbal memory as the most promising candidate endophenotypes for BD. However, they do not meet the criteria for specificity, as similar deficits are present in schizophrenia and/or severe or psychotic major depression. Further research is needed, as the findings regarding endophenotypes show between-study heterogeneity. In the future, examination of quantitative traits may offer a more promising approach to the study of endophenotypes than solely focusing on diagnostic categories.
Rosenberry, Ryan; Chung, Susie; Nelson, Michael D
2018-02-20
Exercise represents a major hemodynamic stress that demands a highly coordinated neurovascular response in order to match oxygen delivery to metabolic demand. Reactive hyperemia (in response to a brief period of tissue ischemia) is an independent predictor of cardiovascular events and provides important insight into vascular health and vasodilatory capacity. Skeletal muscle oxidative capacity is equally important in health and disease, as it determines the energy supply for myocellular processes. Here, we describe a simple, non-invasive approach using near-infrared spectroscopy to assess each of these major clinical endpoints (reactive hyperemia, neurovascular coupling, and muscle oxidative capacity) during a single clinic or laboratory visit. Unlike Doppler ultrasound, magnetic resonance imaging/spectroscopy, or invasive catheter-based flow measurements or muscle biopsies, our approach is less operator-dependent, low-cost, and completely non-invasive. Representative data from our lab, taken together with summary data from previously published literature, illustrate the utility of each of these endpoints. Once this technique is mastered, application to clinical populations will provide important mechanistic insight into exercise intolerance and cardiovascular dysfunction.
In-situ sequential laser transfer and laser reduction of graphene oxide films
NASA Astrophysics Data System (ADS)
Papazoglou, S.; Petridis, C.; Kymakis, E.; Kennou, S.; Raptis, Y. S.; Chatzandroulis, S.; Zergioti, I.
2018-04-01
Achieving high-quality transfer of graphene onto selected substrates is a priority in device fabrication, especially where drop-on-demand applications are involved. In this work, we report an in-situ, fast, simple, one-step process that resulted in the reduction, transfer, and fabrication of reduced graphene oxide-based humidity sensors, using picosecond laser pulses. By tuning the laser illumination parameters, we managed to implement the sequential printing and reduction of graphene oxide flakes. The overall process lasted only a few seconds, compared with the few hours required by the process our group previously published. DC current measurements, X-Ray Photoelectron Spectroscopy, X-Ray Diffraction, and Raman Spectroscopy were employed in order to assess the efficiency of our approach. To demonstrate the applicability and the potential of the technique, laser-printed reduced graphene oxide humidity sensors with a limit of detection of 1700 ppm are presented. The results demonstrated in this work provide a selective, rapid, and low-cost approach for sequential transfer and photochemical reduction of graphene oxide micro-patterns onto various substrates for flexible electronics and sensor applications.
Nickerson, Beverly; Harrington, Brent; Li, Fasheng; Guo, Michele Xuemei
2017-11-30
Drug product assay is one of several tests required for new drug products to ensure the quality of the product at release and throughout the life cycle of the product. Drug product assay testing is typically performed by preparing a composite sample of multiple dosage units to obtain an assay value representative of the batch. In some cases replicate composite samples may be prepared and the reportable assay value is the average value of all the replicates. In previously published work by Harrington et al. (2014) [5], a sample preparation composite and replicate strategy for assay was developed to provide a systematic approach which accounts for variability due to the analytical method and dosage form with a standard error of the potency assay criteria based on compendia and regulatory requirements. In this work, this sample preparation composite and replicate strategy for assay is applied to several case studies to demonstrate the utility of this approach and its application at various stages of pharmaceutical drug product development. Copyright © 2017 Elsevier B.V. All rights reserved.
Müller, Dirk K; Pampel, André; Möller, Harald E
2013-05-01
Quantification of magnetization-transfer (MT) experiments is typically based on the assumption of the binary spin-bath model. This model allows for the extraction of up to six parameters (relative pool sizes, relaxation times, and exchange rate constants) characterizing macromolecules that are coupled via exchange processes to the water in tissues. Here, an approach is presented for estimating MT parameters from data acquired with arbitrary saturation schemes and imaging pulse sequences. It uses matrix algebra to solve the Bloch-McConnell equations without unwarranted simplifications, such as assuming steady-state conditions for pulsed saturation schemes or neglecting imaging pulses. The algorithm achieves sufficient efficiency for voxel-by-voxel MT parameter estimations by using a polynomial interpolation technique. Simulations, as well as experiments in agar gels with continuous-wave and pulsed MT preparation, were performed for validation and for assessing approximations in previous modeling approaches. In vivo experiments in the normal human brain yielded results that were consistent with published data. Copyright © 2013 Elsevier Inc. All rights reserved.
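The matrix-algebra solution of the Bloch-McConnell equations mentioned in this abstract can be sketched for the simplest case: longitudinal relaxation and exchange in a two-pool (free/bound) system without RF saturation. This is an illustrative sketch, not the authors' implementation; the pool sizes, relaxation rates, and exchange rate below are assumed example values.

```python
import numpy as np
from scipy.linalg import expm

def two_pool_z(t, M0=(1.0, 0.1), R1=(0.55, 1.0), k_fb=2.0, M_init=(0.0, 0.0)):
    """Longitudinal magnetization of a two-pool system after time t,
    from the Bloch-McConnell equations without RF saturation.

    The coupled ODEs dM/dt = A*M + b are made homogeneous by augmenting
    the state vector with a constant 1, so M(t) = expm(A_aug*t) @ state.
    """
    M0f, M0b = M0
    R1f, R1b = R1
    k_bf = k_fb * M0f / M0b  # reverse rate fixed by detailed balance
    A_aug = np.array([
        [-R1f - k_fb,  k_bf,        R1f * M0f],
        [ k_fb,       -R1b - k_bf,  R1b * M0b],
        [ 0.0,         0.0,         0.0      ],
    ])
    state = np.array([M_init[0], M_init[1], 1.0])
    Mf, Mb, _ = expm(A_aug * t) @ state
    return Mf, Mb

# Starting from fully saturated pools, the magnetization recovers toward
# thermal equilibrium (M0f, M0b) at long times.
Mf, Mb = two_pool_z(100.0)
```

In the full quantification problem, saturation and imaging pulses are incorporated by multiplying the propagators of successive sequence intervals, each again an exponential of the corresponding system matrix.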
Chlamydia trachomatis screening in young women.
Baraitser, Paula; Alexander, Sarah; Sheringham, Jessica
2011-10-01
As the number of chlamydia screening programmes implemented worldwide increases, we summarize current understanding of the epidemiology, natural history, and management of chlamydia, focusing on screening in young women. Chlamydia diagnoses continue to rise, with young women at high risk. Recently published trials show that the risk of serious reproductive health outcomes is lower than previously thought. They illustrate that significant barriers - both practical and cultural - remain to engaging young people and health professionals in routine testing for sexually transmitted infections. Chlamydia control efforts have driven innovative approaches to testing including new approaches to engaging young people in discussions of sexual health and screening accessed via the Internet. Chlamydia is highly prevalent among young women and may cause serious reproductive sequelae. Gaps in our knowledge of the epidemiology, natural history and immunology of this organism continue to hamper efforts to control it. Sexual health promotion and screening of young people remain the mainstay of population control, although there is as yet no strong evidence of health screening benefits. Control efforts will require new strategies to engage young people and health professionals to normalize sexual health testing. (C) 2011 Lippincott Williams & Wilkins, Inc.
Ross, Douglas S; Burch, Henry B; Cooper, David S; Greenlee, M Carol; Laurberg, Peter; Maia, Ana Luiza; Rivkees, Scott A; Samuels, Mary; Sosa, Julie Ann; Stan, Marius N; Walter, Martin A
2016-10-01
Thyrotoxicosis has multiple etiologies, manifestations, and potential therapies. Appropriate treatment requires an accurate diagnosis and is influenced by coexisting medical conditions and patient preference. This document describes evidence-based clinical guidelines for the management of thyrotoxicosis that would be useful to generalist and subspecialty physicians and others providing care for patients with this condition. The American Thyroid Association (ATA) previously cosponsored guidelines for the management of thyrotoxicosis that were published in 2011. Considerable new literature has been published since then, and the ATA felt updated evidence-based guidelines were needed. The association assembled a task force of expert clinicians who authored this report. They examined relevant literature using a systematic PubMed search supplemented with additional published materials. An evidence-based medicine approach that incorporated the knowledge and experience of the panel was used to update the 2011 text and recommendations. The strength of the recommendations and the quality of evidence supporting them were rated according to the approach recommended by the Grading of Recommendations, Assessment, Development, and Evaluation Group. Clinical topics addressed include the initial evaluation and management of thyrotoxicosis; management of Graves' hyperthyroidism using radioactive iodine, antithyroid drugs, or surgery; management of toxic multinodular goiter or toxic adenoma using radioactive iodine or surgery; Graves' disease in children, adolescents, or pregnant patients; subclinical hyperthyroidism; hyperthyroidism in patients with Graves' orbitopathy; and management of other miscellaneous causes of thyrotoxicosis. 
New paradigms since publication of the 2011 guidelines are presented for the evaluation of the etiology of thyrotoxicosis, the management of Graves' hyperthyroidism with antithyroid drugs, the management of pregnant hyperthyroid patients, and the preparation of patients for thyroid surgery. The sections on less common causes of thyrotoxicosis have been expanded. One hundred twenty-four evidence-based recommendations were developed to aid in the care of patients with thyrotoxicosis and to share what the task force believes is current, rational, and optimal medical practice.
Zhang, Harrison G; Ying, Gui-Shuang
2018-02-09
The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for the statistical approaches used to analyse the primary ocular measure. We compared our findings with the results of a previous paper that reviewed BJO papers from 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of the observation, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at the ocular level, one study (1%) analysed the overall summary of ocular findings per individual and three (3%) studies used paired comparisons. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at the ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available, and the practice of statistical analysis has not improved over the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Psychological treatment of attention deficit hyperactivity disorder in adults: a systematic review.
Vidal-Estrada, Raquel; Bosch-Munso, Rosa; Nogueira-Morais, Mariana; Casas-Brugue, Miquel; Ramos-Quiroga, Josep A
2012-01-01
Attention-deficit/hyperactivity disorder (ADHD) is a neurodevelopmental disorder of childhood onset. The disorder persists into adulthood in most cases, significantly affecting patient function. Although the first-line choice of treatment for ADHD is pharmacological, drug treatments are not always sufficient. All the published studies on the psychological treatment of ADHD were systematically reviewed for the present article. The MEDLINE and PsycINFO electronic databases were searched using the terms psychological treatment OR psychotherapy OR psychosocial treatment AND ADHD. Patient age was restricted to adults (all adult: 19+ years). Eighteen published studies met the inclusion criteria for the review: fifteen efficacy studies of psychological treatment (cognitive behavioral therapy, metacognitive therapy, dialectical behavior therapy, coaching, cognitive remediation) and three previous reviews. The results indicate that cognitive behavioral therapy is the most effective psychological treatment for ADHD symptoms in adults and for the comorbid symptoms of anxiety and depression, which have an important functional impact on the daily life of patients. However, more research is needed to determine the differential effects of each psychological approach on the improvement of ADHD symptoms in adults. Finally, future directions for the psychosocial treatment of ADHD in adults are suggested.
Lindahl, Berit
2011-01-01
The research presented in this work represents reflections in the light of Julia Kristeva's philosophy concerning empirical data drawn from research describing the everyday life of people dependent on ventilators. It also presents a qualitative and narrative methodological approach from a person-centred perspective. Most research on home ventilator treatment is biomedical. There are a few published studies describing the situation of people living at home on a ventilator but no previous publications have used the thoughts in Kristeva's philosophy applied to this topic from a caring science perspective. The paper also addresses what a life at home on a ventilator may be like and will hopefully add some new aspects to the discussion of philosophical issues in nursing and the very essence of care. Kristeva's philosophy embraces phenomena such as language, abjection, body, and love, allowing her writings to make a fruitful contribution to nursing philosophy in that they strengthen, expand, and deepen a caring perspective. Moreover, her writings about revolt having the power to create hope add an interesting aspect to the work of earlier philosophers and nursing theorists. © 2010 Blackwell Publishing Ltd.
Applicability of canisters for sample storage in the determination of hazardous air pollutants
NASA Astrophysics Data System (ADS)
Kelly, Thomas J.; Holdren, Michael W.
This paper evaluates the applicability of canisters for storage of air samples containing volatile organic compounds listed among the 189 hazardous air pollutants (HAPs) in the 1990 U.S. Clean Air Act Amendments. Nearly 100 HAPs have sufficient vapor pressure to be considered volatile compounds. Of those volatile organic HAPs, 52 have been tested previously for stability during storage in canisters. The published HAP stability studies are reviewed, illustrating that for most of the 52 HAPs tested, canisters are an effective sample storage approach. However, the published stability studies used a variety of canister types and test procedures, and generally considered only a few compounds in a very small set of canisters. A comparison of chemical and physical properties of the HAPs has also been conducted, to evaluate the applicability of canister sampling for other HAPs, for which canister stability testing has never been conducted. Of 45 volatile HAPs never tested in canisters, this comparison identifies nine for which canisters should be effective, and 17 for which canisters are not likely to be effective. For the other 19 HAPs, no clear decision can be reached on the likely applicability of air sample storage in canisters.
A New Class of Macrocyclic Chiral Selectors for Stereochemical Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1999-03-11
This report summarizes the work accomplished in the authors' laboratories over the previous three years. During the funding period they have had 23 monographs published or in press and 1 book chapter published, had 1 patent issued, and have delivered 28 invited seminars or plenary lectures on DOE-sponsored research. This report covers the work that has been published (or accepted). The most notable aspect of this work involves the successful development and understanding of a new class of fused macrocyclic compounds as pseudophases and selectors in high-performance separations (including high-performance liquid chromatography, HPLC; capillary electrophoresis, CE; and thin-layer chromatography, TLC). They have considerably extended their chiral biomarker work from amber to crude oil and coal, developing several novel separation approaches in the process. They finished their work on the new GSC-PLOT column, which is now being used by researchers worldwide for the analysis of gases, light hydrocarbons, and halocarbons. Finally, they completed basic studies on immobilizing a cyclodextrin/oligosiloxane hybrid on the wall of fused silica, as well as a basic study on the separation behavior of buckminsterfullerene and higher fullerenes.
Sun, Jian; Zhang, Lei; Cui, Jing; Li, Shanshan; Lu, Hongting; Zhang, Yong; Li, Haiming; Sun, Jianping; Baloch, Zulqarnain
2018-05-10
Previous studies have shown beneficial effects of dietary approaches for iron deficiency anemia (IDA) control. This study was designed to investigate the effect of dietary intervention on children with iron deficiency anemia. We performed a systematic review, with meta-analysis, of published dietary interventions for IDA treatment. The CBM, CNKI, Wanfang, EMBASE, VIP, PubMed and Web of Science databases were searched to identify studies published between January 1980 and December 2016. Statistical analysis was performed with RevMan 5.2 software. Initially, we retrieved 373 studies; 6 studies with a total of 676 individuals were then included in the meta-analysis according to the inclusion and exclusion criteria. The overall pooled estimate of the odds ratio [(OR), 95% confidence interval (95% CI)] for dietary intervention in children with iron deficiency anemia was 6.54 (95% CI: 3.48-12.31, Z = 5.82, p<0.001), and the funnel plot was symmetric. Our meta-analysis suggests that dietary interventions are effective in improving iron deficiency in children with IDA and should be considered in the overall strategy of IDA management.
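The pooled odds ratio with its confidence interval reported above is the standard meta-analytic summary for dichotomous outcomes. A minimal fixed-effect (inverse-variance) pooling of 2x2 tables can be sketched as follows; the counts in the example are invented for illustration and are not the study's data.

```python
import math

def pooled_or(tables):
    """Fixed-effect (inverse-variance) pooled odds ratio with 95% CI.

    Each table is (events_treated, n_treated, events_control, n_control).
    Each study's log odds ratio is weighted by the inverse of its
    variance 1/a + 1/b + 1/c + 1/d (Woolf's method).
    """
    num = den = 0.0
    for a, n1, c, n2 in tables:
        b, d = n1 - a, n2 - c          # non-events in each arm
        log_or = math.log((a * d) / (b * c))
        weight = 1.0 / (1/a + 1/b + 1/c + 1/d)
        num += weight * log_or
        den += weight
    log_pooled = num / den
    se = math.sqrt(1.0 / den)          # standard error of pooled log OR
    return (math.exp(log_pooled),
            math.exp(log_pooled - 1.96 * se),
            math.exp(log_pooled + 1.96 * se))

# Single hypothetical trial: 40/50 improved with intervention vs 20/50
# without, giving OR = (40*30)/(10*20) = 6.0.
or_est, ci_lo, ci_hi = pooled_or([(40, 50, 20, 50)])
```

RevMan's default pooling for dichotomous outcomes is the Mantel-Haenszel method; for studies of the sizes included here, the inverse-variance estimate shown is a close approximation.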
An Overview to Research on Education Technology Based on Constructivist Learning Approach
ERIC Educational Resources Information Center
Asiksoy, Gulsum; Ozdamli, Fezile
2017-01-01
The aim of this research is to determine the trends in educational technology research based on the Constructivist Learning Approach published in the ScienceDirect database between 2010 and 2016. It also aims to guide researchers who will conduct studies in this field. After scanning the database, 81 articles published in ScienceDirect's database…
Then and Now: Approaches to Understanding Children's Literature in Two Volumes
ERIC Educational Resources Information Center
Bani-Khair, Baker M.; Khawaldeh, Imad M.
2016-01-01
This research paper investigates two main volumes taken from "Children's Literature Association Quarterly": the earlier one, Vol. 11, published in 1986, and the more recent one, Vol. 32, published in 2007, to understand the differences and similarities in the approaches used in the articles to understand Children's…
Bartsch, Sarah M; Umscheid, Craig A; Nachamkin, Irving; Hamilton, Keith; Lee, Bruce Y
2015-01-01
Accurate diagnosis of Clostridium difficile infection (CDI) is essential to effectively managing patients and preventing transmission. Despite the availability of several diagnostic tests, the optimal strategy is debatable and their economic values are unknown. We modified our previously existing C. difficile simulation model to determine the economic value of different CDI diagnostic approaches from the hospital perspective. We evaluated four diagnostic methods for a patient suspected of having CDI: 1) toxin A/B enzyme immunoassay, 2) glutamate dehydrogenase (GDH) antigen/toxin A/B combined in one test, 3) nucleic acid amplification test (NAAT), and 4) GDH antigen/toxin A/B combination test with NAAT confirmation of indeterminate results. Sensitivity analysis varied the proportion of those tested with clinically significant diarrhoea, the probability of CDI, NAAT cost, CDI treatment delay resulting from a false-negative test, length of stay, and diagnostic sensitivity and specificity. The GDH/toxin A/B plus NAAT approach leads to the timeliest treatment with the fewest unnecessary treatments given, results in the best bed management and generates the lowest cost. The NAAT-alone approach also leads to timely treatment. The GDH/toxin A/B diagnostic approach (without NAAT confirmation) results in a large number of delayed treatments but the fewest secondary colonisations. Results were robust to the sensitivity analysis. Choosing the right diagnostic approach is a matter of cost and test accuracy. GDH/toxin A/B plus NAAT diagnosis led to the timeliest treatment and was the least costly. Copyright © 2014 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
Ladegaard, Michael; Jensen, Frants Havmand; Beedholm, Kristian; da Silva, Vera Maria Ferreira; Madsen, Peter Teglberg
2017-07-15
Toothed whales have evolved to live in extremely different habitats and yet they all rely strongly on echolocation for finding and catching prey. Such biosonar-based foraging involves distinct phases of searching for, approaching and capturing prey, where echolocating animals gradually adjust sonar output to actively shape the flow of sensory information. Measuring those outputs in absolute levels requires hydrophone arrays centred on the biosonar beam axis, but this has never been done for wild toothed whales approaching and capturing prey. Rather, field studies make the assumption that toothed whales will adjust their biosonar in the same manner to arrays as they will when approaching prey. To test this assumption, we recorded wild botos (Inia geoffrensis) as they approached and captured dead fish tethered to a hydrophone in front of a star-shaped seven-hydrophone array. We demonstrate that botos gradually decrease interclick intervals and output levels during prey approaches, using stronger adjustment magnitudes than predicted from previous boto array data. Prey interceptions are characterised by high click rates, but although botos buzz during prey capture, they do so at lower click rates than marine toothed whales, resulting in a much more gradual transition from approach phase to buzzing. We also demonstrate for the first time that wild toothed whales broaden biosonar beamwidth when closing in on prey, as is also seen in captive toothed whales and bats, thus resulting in a larger ensonified volume around the prey, probably aiding prey tracking by decreasing the risk of prey evading ensonification. © 2017. Published by The Company of Biologists Ltd.
A New Analysis of Fireball Data from the Meteorite Observation and Recovery Project (MORP)
NASA Astrophysics Data System (ADS)
Campbell-Brown, M. D.; Hildebrand, A.
2004-12-01
Sixty fireball cameras operated in Western Canada from 1971 to 1985. Over one thousand (1016) fireballs were recorded at more than one station, but only 367 were reduced, of which 285 have been published, including that of the Innisfree meteorite. Digitization of all the data is underway, and procedures are being developed which will allow the automatic reduction of events not previously examined. The results of the analysis of 80 fireballs reduced but not previously published are presented. When the new analysis is complete, the MORP archive will be a valuable source of information on meteoroid orbits.
Cost Modeling for Space Optical Telescope Assemblies
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Henrichs, Todd; Luedtke, Alexander; West, Miranda
2011-01-01
Parametric cost models are used to plan missions, compare concepts and justify technology investments. This paper reviews an on-going effort to develop cost models for space telescopes. It summarizes the methodology used to develop the cost models and documents how changes to the database have changed previously published preliminary cost models. While the cost models are evolving, the previously published findings remain valid: it costs less per square meter of collecting aperture to build a large telescope than a small telescope; technology development as a function of time reduces cost; and lower areal density telescopes cost more than more massive telescopes.
Leading by Example? ALA Division Publications, Open Access, and Sustainability
ERIC Educational Resources Information Center
Hall, Nathan; Arnold-Garza, Sara; Gong, Regina; Shorish, Yasmeen
2016-01-01
This investigation explores scholarly communication business models in American Library Association (ALA) division peer-reviewed academic journals. Previous studies reveal the numerous issues organizations and publishers face in the academic publishing environment. Through an analysis of documented procedures, policies, and finances of five ALA…
Thomas Jefferson, Page Design, and Desktop Publishing.
ERIC Educational Resources Information Center
Hartley, James
1991-01-01
Discussion of page design for desktop publishing focuses on the importance of functional issues as opposed to aesthetic issues, and criticizes a previous article that stressed aesthetic issues. Topics discussed include balance, consistency in text structure, and how differences in layout affect the clarity of "The Declaration of…