Sample records for introducing comparative analysis

  1. A cost analysis of introducing an infectious disease specialist-guided antimicrobial stewardship in an area with relatively low prevalence of antimicrobial resistance.

    PubMed

    Lanbeck, Peter; Ragnarson Tennvall, Gunnel; Resman, Fredrik

    2016-07-27

    Antimicrobial stewardship programs have been widely introduced in hospitals as a response to increasing antimicrobial resistance. Although such programs are commonly used, the long-term effects on antimicrobial resistance as well as societal economics are uncertain. We performed a cost analysis of an antimicrobial stewardship program introduced in Malmö, Sweden, during 20 weeks of 2013, compared with a corresponding control period in 2012. All direct costs and opportunity costs related to the stewardship intervention were calculated for both periods. Costs during the stewardship period were directly compared to costs in the control period and extrapolated to a yearly cost. Two main analyses were performed, one including only comparable direct costs (analysis one) and one including comparable direct and opportunity costs (analysis two). An extra analysis including all comparable direct costs, including costs related to length of hospital stay (analysis three), was performed but deemed unrepresentative. According to analysis one, the cost per year was SEK 161 990; in analysis two the cost per year was SEK 5 113. Since the two cohorts were skewed in terms of size and infection severity as a consequence of the program, and since short-term patient outcomes have been demonstrated to be unchanged by the intervention, the costs pertaining to patient outcomes were not included in the analysis, and we suggest that analysis two provides the most accurate cost calculation. In this analysis, the main cost drivers were physician time and nursing time. A sensitivity analysis of analysis two suggested relatively modest variation under changing assumptions. The total yearly cost of introducing an infectious disease specialist-guided, audit-based antimicrobial stewardship program in a department of internal medicine, including direct costs and opportunity costs, was calculated to be as low as SEK 5 113.
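
    To make the extrapolation arithmetic concrete, here is a minimal sketch; the per-period cost figures are placeholders, with only the 20-week period length taken from the record above.

    ```python
    # Hedged sketch of the cost-extrapolation arithmetic described above. The
    # period costs below are placeholders, not values from the study.

    WEEKS_PER_YEAR = 52
    PERIOD_WEEKS = 20  # length of both the stewardship and control periods

    def yearly_incremental_cost(stewardship_cost, control_cost,
                                period_weeks=PERIOD_WEEKS):
        """Extrapolate the incremental cost of a study period to a full year."""
        incremental = stewardship_cost - control_cost
        return incremental * WEEKS_PER_YEAR / period_weeks

    # Example with hypothetical period costs (SEK):
    print(yearly_incremental_cost(stewardship_cost=64_000, control_cost=1_700))
    ```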

  2. Comparative study of standard space and real space analysis of quantitative MR brain data.

    PubMed

    Aribisala, Benjamin S; He, Jiabao; Blamire, Andrew M

    2011-06-01

    To compare the robustness of region of interest (ROI) analysis of magnetic resonance imaging (MRI) brain data in real space with analysis in standard space and to test the hypothesis that standard space image analysis introduces more partial volume effect errors compared to analysis of the same dataset in real space. Twenty healthy adults with no history or evidence of neurological diseases were recruited; high-resolution T1-weighted, quantitative T1, and B0 field-map measurements were collected. Algorithms were implemented to perform analysis in real and standard space and used to apply a simple standard ROI template to quantitative T1 datasets. Regional relaxation values and histograms for both gray and white matter tissue classes were then extracted and compared. Regional mean T1 values for both gray and white matter were significantly lower using real space compared to standard space analysis. Additionally, regional T1 histograms were more compact in real space, with smaller right-sided tails indicating lower partial volume errors compared to standard space analysis. Standard space analysis of quantitative MRI brain data introduces more partial volume effect errors, biasing the analysis of quantitative data compared to analysis of the same dataset in real space. Copyright © 2011 Wiley-Liss, Inc.

  3. [Comparative analysis of Andean and Caribbean region health systems].

    PubMed

    Gómez-Camelo, Diana

    2005-01-01

    This study carried out a comparative analysis of health systems in Bolivia, Colombia, Ecuador, Peru, Venezuela, the Dominican Republic and Cuba between 1990 and 2004, contributing to a general panorama of health care system experience in the Andean and Caribbean region. Documentary information from secondary sources was used. Reforms and changes during the aforementioned period were compared, as were the systems' current configurations. Typologies described in the literature were used for studying the health systems. Different organisational designs were found: a national health system (NHS), segmented systems, and systems based on mandatory insurance. The trend of the reforms introduced in the 1990s, and of current proposals in almost all systems, is towards adopting mandatory insurance via a basic package of services and strengthening competition in service provision through a public and private mix. The organisation and structure of most systems studied have introduced, and continue to introduce, changes in line with international guidelines. Efforts must still be made to adopt designs that strengthen these systems as instruments for improving populations' quality of life. Comparative analysis is a tool for studying health systems and producing information that can inform the debate regarding current sector reform. This work took shape as a first approach to a comparative study of Andean region and Caribbean health systems.

  4. Water Balance Covers For Waste Containment: Principles and Practice

    EPA Science Inventory

    Water Balance Covers for Waste Containment: Principles and Practice introduces water balance covers and compares them with conventional approaches to waste containment. The authors provide a detailed analysis of the fundamentals of soil physics and design issues, introduce appl...

  5. Religious Education in Russia: A Comparative and Critical Analysis

    ERIC Educational Resources Information Center

    Blinkova, Alexandra; Vermeer, Paul

    2018-01-01

    RE in Russia has recently been introduced as a compulsory regular school subject during the last year of elementary school. The present study offers a critical analysis of the current practice of Russian RE by comparing it with RE in Sweden, Denmark and Britain. This analysis shows that Russian RE is ambivalent. Although it is based on a…

  6. The Prospect of Responsive Spacecraft Using Aeroassisted, Trans-Atmospheric Maneuvers

    DTIC Science & Technology

    2014-06-19

    skip entry aeroassisted maneuvers. By overflying a geographically diverse set of sample ground targets, comparative analysis indicates a significant... analysis. Depending on the chosen re-circularization altitude, the coupled optimal design can achieve an inclination change of 19.91 deg with 50-85... third phase also introduces the "Maneuver Performance Number" as a dimensionless means of comparative effectiveness analysis for both exo- and trans-atmospheric...

  7. Comparative Analysis of Western and Domestic Practice of Interactive Method Application in Teaching Social and Political Disciplines at the Universities

    ERIC Educational Resources Information Center

    Hladka, Halyna

    2014-01-01

    The comparative analysis of Western and domestic practice of introducing active and interactive methods of study into the process of teaching social science disciplines has been carried out. Features, realities, prospects and limitations in the application of interactive teaching methods in the process of implementing social-political science…

  8. Methods for Mediation Analysis with Missing Data

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

    Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis, including listwise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…

  9. A comparative analysis of terrestrial arthropod assemblages from a relict forest unveils historical extinctions and colonization differences between two oceanic islands

    PubMed Central

    Matthews, Thomas J.; Rego, Carla; Crespo, Luis; Aguiar, Carlos A. S.; Cardoso, Pedro; Rigal, François; Silva, Isamberto; Pereira, Fernando; Borges, Paulo A. V.; Serrano, Artur R. M.

    2018-01-01

    During the last few centuries, oceanic island biodiversity has been drastically modified by human-mediated activities. These changes have led to the increased homogenization of island biota and to a high number of extinctions, lending support to the recognition of oceanic islands as major threatspots worldwide. Here, we investigate the impact of habitat changes on the spider and ground beetle assemblages of the native forests of Madeira (Madeira archipelago) and Terceira (Azores archipelago) and evaluate its effects on the relative contribution of rare endemics and introduced species to island biodiversity patterns. We found that the native laurel forest of Madeira supported higher species richness of spiders and ground beetles compared with Terceira, including a much larger proportion of indigenous species, particularly endemics. In Terceira, introduced species are well-represented in both terrestrial arthropod taxa and seem to thrive in native forests, as shown by the analysis of species abundance distributions (SAD) and occupancy frequency distributions (OFD). Low-abundance, range-restricted species in Terceira are mostly introduced species dispersing from neighbouring man-made habitats, while in Madeira a large number of true rare endemic species can still be found in the native laurel forest. Further, our comparative analysis shows striking differences in species richness and composition that are due to the geographical and geological particularities of the two islands, but also seem to reflect the differences in the severity of human-mediated impacts between them. The high proportion of introduced species, the virtual absence of rare native species, and the finding that the SADs and OFDs of introduced species match the pattern of native species in Terceira suggest the role of man as an important driver of species diversity in oceanic islands and add evidence for an extensive and severe human-induced species loss in the native forests of Terceira. PMID:29694360

  10. A comparative analysis of terrestrial arthropod assemblages from a relict forest unveils historical extinctions and colonization differences between two oceanic islands.

    PubMed

    Boieiro, Mário; Matthews, Thomas J; Rego, Carla; Crespo, Luis; Aguiar, Carlos A S; Cardoso, Pedro; Rigal, François; Silva, Isamberto; Pereira, Fernando; Borges, Paulo A V; Serrano, Artur R M

    2018-01-01

    During the last few centuries, oceanic island biodiversity has been drastically modified by human-mediated activities. These changes have led to the increased homogenization of island biota and to a high number of extinctions, lending support to the recognition of oceanic islands as major threatspots worldwide. Here, we investigate the impact of habitat changes on the spider and ground beetle assemblages of the native forests of Madeira (Madeira archipelago) and Terceira (Azores archipelago) and evaluate its effects on the relative contribution of rare endemics and introduced species to island biodiversity patterns. We found that the native laurel forest of Madeira supported higher species richness of spiders and ground beetles compared with Terceira, including a much larger proportion of indigenous species, particularly endemics. In Terceira, introduced species are well-represented in both terrestrial arthropod taxa and seem to thrive in native forests, as shown by the analysis of species abundance distributions (SAD) and occupancy frequency distributions (OFD). Low-abundance, range-restricted species in Terceira are mostly introduced species dispersing from neighbouring man-made habitats, while in Madeira a large number of true rare endemic species can still be found in the native laurel forest. Further, our comparative analysis shows striking differences in species richness and composition that are due to the geographical and geological particularities of the two islands, but also seem to reflect the differences in the severity of human-mediated impacts between them. The high proportion of introduced species, the virtual absence of rare native species, and the finding that the SADs and OFDs of introduced species match the pattern of native species in Terceira suggest the role of man as an important driver of species diversity in oceanic islands and add evidence for an extensive and severe human-induced species loss in the native forests of Terceira.

  11. Automatic Error Analysis Using Intervals

    ERIC Educational Resources Information Center

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
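
    A rough Python illustration of the idea (a minimal sketch, not INTLAB): carrying [lo, hi] bounds through a formula yields a guaranteed error enclosure without manual propagation formulas. The resistor/current example and its tolerances are invented.

    ```python
    # Minimal interval-arithmetic sketch: every operation returns bounds that
    # enclose all values consistent with the operands' measurement errors.

    from dataclasses import dataclass

    @dataclass
    class Interval:
        lo: float
        hi: float

        def __add__(self, other):
            return Interval(self.lo + other.lo, self.hi + other.hi)

        def __sub__(self, other):
            return Interval(self.lo - other.hi, self.hi - other.lo)

        def __mul__(self, other):
            p = [self.lo * other.lo, self.lo * other.hi,
                 self.hi * other.lo, self.hi * other.hi]
            return Interval(min(p), max(p))

    R = Interval(99.0, 101.0)   # resistance 100 ± 1 ohm
    I = Interval(1.95, 2.05)    # current 2 ± 0.05 A
    print(I * I * R)            # encloses the power P = I^2 * R
    ```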

  12. Lemonade's the Name, Simulation's the Game.

    ERIC Educational Resources Information Center

    Friel, Susan

    1983-01-01

    Provides a detailed description of Lemonade, a business game designed to introduce elementary and secondary students to the basics of business; i.e., problem solving strategies, hypothesis formulation and testing, trend analysis, prediction, comparative analysis, and effects of such factors as advertising and climatic conditions on sales and…

  13. Chiral Compounds and Green Chemistry in Undergraduate Organic Laboratories: Reduction of a Ketone by Sodium Borohydride and Baker's Yeast

    NASA Astrophysics Data System (ADS)

    Pohl, Nicola; Clague, Allen; Schwarz, Kimberly

    2002-06-01

    We describe an integrated set of experiments for the undergraduate organic laboratory that allows students to compare and contrast biological and chemical means of introducing chirality into a molecule. The racemic reduction of ethyl acetoacetate with sodium borohydride and the same reduction in the presence of a tartaric acid ligand are described, and a capillary gas chromatography column packed with a chiral material for product analysis is introduced. The results of these two hydride reactions are compared with the results of a common undergraduate experiment, the baker's yeast reduction of ethyl acetoacetate.

  14. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method

    PubMed Central

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-01-01

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is demonstrated on two scans sampling a masonry laboratory building before and after seismic testing that resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis. PMID:28029121
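
    A minimal sketch of the baseline idea under assumed, labelled feature points: distances between point pairs are compared within each scan, so no registration between the scans is needed. The point names and coordinates are illustrative only.

    ```python
    # Compare "baselines" (distances between feature-point pairs) across two
    # scans of the same scene, without aligning the scans to a common frame.

    import itertools
    import numpy as np

    def baseline_lengths(points):
        """Distances between all pairs of labelled feature points in one scan."""
        return {(a, b): np.linalg.norm(points[a] - points[b])
                for a, b in itertools.combinations(sorted(points), 2)}

    scan_before = {"T1": np.array([0.0, 0.0, 0.0]),
                   "T2": np.array([4.0, 0.0, 0.0]),
                   "T3": np.array([0.0, 3.0, 0.0])}
    scan_after = {"T1": np.array([0.0, 0.0, 0.0]),
                  "T2": np.array([4.02, 0.0, 0.0]),  # ~2 cm apparent movement
                  "T3": np.array([0.0, 3.0, 0.0])}

    before = baseline_lengths(scan_before)
    after = baseline_lengths(scan_after)
    for pair in before:
        print(pair, after[pair] - before[pair])  # change per baseline
    ```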

  15. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.

    PubMed

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-12-24

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is demonstrated on two scans sampling a masonry laboratory building before and after seismic testing that resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.

  16. Dimeric spectra analysis in Microsoft Excel: a comparative study.

    PubMed

    Gilani, A Ghanadzadeh; Moghadam, M; Zakerhamidi, M S

    2011-11-01

    The purpose of this work is to introduce the reader to an Add-in implementation, Decom. This implementation provides all the processing requirements for the analysis of dimeric spectra. General linear and nonlinear decomposition algorithms were integrated as an Excel Add-in for easy installation and usage. In this work, the results of several sample investigations were compared to those obtained by Datan. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  17. A comparative cellular and molecular biology of longevity database.

    PubMed

    Stuart, Jeffrey A; Liang, Ping; Luo, Xuemei; Page, Melissa M; Gallagher, Emily J; Christoff, Casey A; Robb, Ellen L

    2013-10-01

    Discovering key cellular and molecular traits that promote longevity is a major goal of aging and longevity research. One experimental strategy is to determine which traits have been selected during the evolution of longevity in naturally long-lived animal species. This comparative approach has been applied to lifespan research for nearly four decades, yielding hundreds of datasets describing aspects of cell and molecular biology hypothesized to relate to animal longevity. Here, we introduce a Comparative Cellular and Molecular Biology of Longevity Database, available at http://genomics.brocku.ca/ccmbl/, as a compendium of comparative cell and molecular data presented in the context of longevity. This open access database will facilitate the meta-analysis of amalgamated datasets using standardized maximum lifespan (MLSP) data (from AnAge). The first edition contains over 800 data records describing experimental measurements of cellular stress resistance, reactive oxygen species metabolism, membrane composition, protein homeostasis, and genome homeostasis as they relate to vertebrate species MLSP. The purpose of this review is to introduce the database and briefly demonstrate its use in the meta-analysis of combined datasets.

  18. Evaluating the effect of the new incentive system for high-risk pressure ulcer patients on wound healing and cost-effectiveness: a cohort study.

    PubMed

    Sanada, Hiromi; Nakagami, Gojiro; Mizokami, Yuko; Minami, Yukiko; Yamamoto, Aya; Oe, Makoto; Kaitani, Toshiko; Iizaka, Shinji

    2010-03-01

    To evaluate the effectiveness and cost-effectiveness of a new incentive system for pressure ulcer management, which focused on skilled nurse staffing, in terms of rate of healing and medical costs. A prospective cohort study included two groups: 39 institutions that introduced the new incentive system and 20 that did not (control). Sixty-seven patients suffering from severe pressure ulcers in the introduced group and 38 patients in the non-introduced group were included. Wound healing and medical costs were monitored weekly for three weeks by the skilled nurses in charge. Healing status and related medical costs. The introduced group showed a significantly higher rate of healing compared with the control group at each weekly assessment. Multiple regression analysis revealed that the introduction of the new incentive system was independently associated with a faster healing rate (beta=3.44, P<.001). The budget impact analysis demonstrated that introducing this system could reduce the cost of treating severe pressure ulcers by 1.776 billion yen per year. The new incentive system for the management of pressure ulcers, which focused on staffing with skilled nurses, can improve the healing rate at reduced medical cost. Copyright 2009 Elsevier Ltd. All rights reserved.

  19. Accuracy of AFM force distance curves via direct solution of the Euler-Bernoulli equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eppell, Steven J.; Liu, Yehe; Zypman, Fredy R.

    2016-03-15

    In an effort to improve the accuracy of force-separation curves obtained from atomic force microscope data, we compare force-separation curves computed using two methods to solve the Euler-Bernoulli equation. A recently introduced method using a direct sequential forward solution, Causal Time-Domain Analysis, is compared against a previously introduced Tikhonov Regularization method. Using the direct solution as a benchmark, it is found that the regularization technique is unable to reproduce accurate curve shapes. Using L-curve analysis and adjusting the regularization parameter, λ, to match either the depth or the full width at half maximum of the force curves, the two techniques are contrasted. Matched depths result in full widths at half maximum that are off by an average of 27%, and matched full widths at half maximum produce depths that are off by an average of 109%.
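
    For readers unfamiliar with the regularization side of this comparison, here is a generic Tikhonov-regularization sketch (not the authors' AFM-specific pipeline); the toy system merely stands in for an ill-conditioned inversion, and the output's sensitivity to lam mirrors the λ-dependence discussed above.

    ```python
    # Generic Tikhonov regularization: solve min ||A x - b||^2 + lam ||x||^2
    # via the normal equations. The recovered solution depends strongly on lam.

    import numpy as np

    def tikhonov_solve(A, b, lam):
        n = A.shape[1]
        return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

    rng = np.random.default_rng(0)
    A = np.vander(np.linspace(0, 1, 40), 8, increasing=True)  # ill-conditioned
    x_true = rng.normal(size=8)
    b = A @ x_true + 1e-3 * rng.normal(size=40)

    for lam in (1e-8, 1e-4, 1e-1):
        x = tikhonov_solve(A, b, lam)
        print(lam, np.linalg.norm(x - x_true))  # reconstruction error vs lam
    ```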

  20. Extremal entanglement and mixedness in continuous variable systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adesso, Gerardo; Serafini, Alessio; Illuminati, Fabrizio

    2004-08-01

    We investigate the relationship between mixedness and entanglement for Gaussian states of continuous variable systems. We introduce generalized entropies based on Schatten p-norms to quantify the mixedness of a state and derive their explicit expressions in terms of symplectic spectra. We compare the hierarchies of mixedness provided by such measures with the one provided by the purity (defined as Tr ρ² for the state ρ) for generic n-mode states. We then review the analysis proving the existence of both maximally and minimally entangled states at given global and marginal purities, with the entanglement quantified by the logarithmic negativity. Based on these results, we extend such an analysis to generalized entropies, introducing and fully characterizing maximally and minimally entangled states for given global and local generalized entropies. We compare the different roles played by the purity and by the generalized p-entropies in quantifying the entanglement and the mixedness of continuous variable systems. We introduce the concept of average logarithmic negativity, showing that it allows a reliable quantitative estimate of continuous variable entanglement by direct measurements of global and marginal generalized p-entropies.
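
    For reference, a sketch of the standard definitions consistent with the abstract (the paper's full symplectic-spectrum expressions are not reproduced); the single-mode formula follows from the thermal form of a Gaussian state with symplectic eigenvalue ν ≥ 1.

    ```latex
    \[
      \mu(\rho) = \operatorname{Tr}\rho^{2}, \qquad
      S_{p}(\rho) = \frac{1 - \operatorname{Tr}\rho^{p}}{p - 1}, \quad p > 1.
    \]
    % For a single-mode Gaussian state with symplectic eigenvalue \nu:
    \[
      \operatorname{Tr}\rho^{p} = \frac{2^{p}}{(\nu + 1)^{p} - (\nu - 1)^{p}},
      \qquad \text{so that } \mu = \frac{1}{\nu}.
    \]
    ```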

  21. The Development of a Monitoring System of Higher Education Quality in Ukraine and Germany: Comparative Component

    ERIC Educational Resources Information Center

    Chorna, Olga

    2015-01-01

    The article reveals specific features of the functioning of higher education quality monitoring systems at the present stage, taking into account national traditions, historical experience and the mentality of the population. The article introduces a comparative analysis of monitoring actors at national, regional and local levels in the two countries. The…

  22. Discourse Analysis and Development of English Listening for Non-English Majors in China

    ERIC Educational Resources Information Center

    Ji, Yinxiu

    2015-01-01

    The traditional approach to teaching listening mainly focuses on the sentence level and treats the listening process in a passive and static way. To compensate for this deficiency, a new listening approach, a discourse-oriented approach, has been introduced into the listening classroom. Although discourse analysis is a comparatively new field…

  23. Children's drawings as facilitators of communication: a meta-analysis.

    PubMed

    Driessnack, Martha

    2005-12-01

    In an attempt to explore new methods for accessing children's voices, this meta-analysis explores the facilitative effects of offering children the opportunity to draw as an interview strategy as compared with a traditional directed interview. Based on this analysis, introducing the opportunity to draw appears to be a relatively robust interview strategy with a large overall effect size (d = .95). Both research and clinical implications are discussed.
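
    The reported effect size is a standardized mean difference; below is a minimal sketch of Cohen's d with illustrative group statistics (not study data) chosen to land near d = .95.

    ```python
    # Cohen's d: difference of group means divided by the pooled standard
    # deviation. Group statistics here are invented for illustration.

    import math

    def cohens_d(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
        pooled_var = ((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2) / (n_a + n_b - 2)
        return (mean_a - mean_b) / math.sqrt(pooled_var)

    # e.g., informative utterances in draw-and-tell vs directed interviews
    print(cohens_d(mean_a=14.0, sd_a=4.0, n_a=30, mean_b=10.2, sd_b=4.0, n_b=30))
    ```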

  24. Introducing conjoint analysis method into delayed lotteries studies: its validity and time stability are higher than in adjusting.

    PubMed

    Białek, Michał; Markiewicz, Łukasz; Sawicki, Przemysław

    2015-01-01

    Delayed lotteries are much more common in everyday life than pure lotteries. Usually, we need to wait to find out the outcome of a risky decision (e.g., investing in a stock market, engaging in a relationship). However, most research has studied time discounting and probability discounting in isolation, using methodologies designed specifically to track changes in one parameter. The most commonly used method is adjusting, but its reported validity and time stability in research on discounting are suboptimal. The goal of this study was to introduce a novel method for analyzing delayed lotteries, conjoint analysis, which hypothetically is more suitable for analyzing individual preferences in this area. A set of two studies compared conjoint analysis with adjusting. The results suggest that individual parameters of discounting strength estimated with conjoint analysis have higher predictive value (Studies 1 and 2) and are more stable over time (Study 2) compared to adjusting. Despite the exploratory character of the reported studies, we discuss these findings by suggesting that future research on delayed lotteries should be cross-validated using both methods.

  25. Introducing Students to Protein Analysis Techniques: Separation and Comparative Analysis of Gluten Proteins in Various Wheat Strains

    ERIC Educational Resources Information Center

    Pirinelli, Alyssa L.; Trinidad, Jonathan C.; Pohl, Nicola L. B.

    2016-01-01

    Polyacrylamide gel electrophoresis (PAGE) is commonly taught in undergraduate laboratory classes as a traditional method to analyze proteins. An experiment has been developed to teach these basic protein gel skills in the context of gluten protein isolation from various types of wheat flour. A further goal is to relate this technique to current…

  26. Analysis and trade-off studies of large lightweight mirror structures. [large space telescope]

    NASA Technical Reports Server (NTRS)

    Soosaar, K.; Grin, R.; Ayer, F.

    1975-01-01

    A candidate mirror, hexagonally lightweighted, is analyzed under various loadings using as complete a procedure as possible. Successive simplifications are introduced and compared to an original analysis. A model which is a reasonable compromise between accuracy and cost is found and is used for making trade-off studies of the various structural parameters of the lightweighted mirror.

  27. Analysis and quality control of carbohydrates in therapeutic proteins with fluorescence HPLC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Kun; Huang, Jian

    Conbercept is an Fc fusion protein with a very complicated carbohydrate profile which must be carefully monitored throughout the manufacturing process. Here, we introduce an optimized fluorescence-derivatization high-performance liquid chromatographic method for glycan mapping in conbercept. Compared with the conventional glycan analysis method, this method has much better resolution and higher reproducibility, making it excellent for product quality control.

  28. Comparing Eye Tracking with Electrooculography for Measuring Individual Sentence Comprehension Duration

    PubMed Central

    Müller, Jana Annina; Wendt, Dorothea; Kollmeier, Birger; Brand, Thomas

    2016-01-01

    The aim of this study was to validate a procedure for performing the audio-visual paradigm introduced by Wendt et al. (2015) with reduced practical challenges. The original paradigm records eye fixations using an eye tracker and calculates the duration of sentence comprehension based on a bootstrap procedure. In order to reduce practical challenges, we first reduced the measurement time by evaluating a smaller measurement set with fewer trials. The results of 16 listeners showed effects comparable to those obtained when testing the original full measurement set on a different group of listeners. Secondly, we introduced electrooculography as an alternative technique for recording eye movements. The correlation between the results of the two recording techniques (eye tracker and electrooculography) was r = 0.97, indicating that both methods are suitable for estimating the processing duration of individual participants. Similar changes in processing duration arising from sentence complexity were found using the eye tracker and the electrooculography procedure. Thirdly, the time course of eye fixations was estimated with an alternative procedure, growth curve analysis, which is more commonly used in recent studies analyzing eye tracking data. The results of the growth curve analysis were compared with those of the bootstrap procedure. Both analysis methods show similar processing durations. PMID:27764125
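
    As a simplified illustration of the bootstrap idea mentioned above (the paradigm's actual procedure over fixation curves is more involved), this sketch bootstraps a confidence interval for one listener's mean processing duration from synthetic trial data.

    ```python
    # Bootstrap a 95% confidence interval for a mean processing duration by
    # resampling trials with replacement. All data here are synthetic.

    import numpy as np

    rng = np.random.default_rng(1)
    durations_ms = rng.normal(650, 120, size=40)  # per-trial durations

    boot_means = [rng.choice(durations_ms, size=durations_ms.size,
                             replace=True).mean() for _ in range(2000)]
    lo, hi = np.percentile(boot_means, [2.5, 97.5])
    print(f"mean = {durations_ms.mean():.0f} ms, 95% CI = [{lo:.0f}, {hi:.0f}] ms")
    ```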

  29. Ortholog Identification and Comparative Analysis of Microbial Genomes Using MBGD and RECOG.

    PubMed

    Uchiyama, Ikuo

    2017-01-01

    Comparative genomics is becoming an essential approach for identification of genes associated with a specific function or phenotype. Here, we introduce the microbial genome database for comparative analysis (MBGD), which is a comprehensive ortholog database among the microbial genomes available so far. MBGD contains several precomputed ortholog tables including the standard ortholog table covering the entire taxonomic range and taxon-specific ortholog tables for various major taxa. In addition, MBGD allows the users to create an ortholog table within any specified set of genomes through dynamic calculations. In particular, MBGD has a "My MBGD" mode where users can upload their original genome sequences and incorporate them into orthology analysis. The created ortholog table can serve as the basis for various comparative analyses. Here, we describe the use of MBGD and briefly explain how to utilize the orthology information during comparative genome analysis in combination with the stand-alone comparative genomics software RECOG, focusing on the application to comparison of closely related microbial genomes.

  30. A Comparative Study of Random Patterns for Digital Image Correlation

    NASA Astrophysics Data System (ADS)

    Stoilov, G.; Kavardzhikov, V.; Pashkouleva, D.

    2012-06-01

    Digital Image Correlation (DIC) is a computer-based image analysis technique utilizing random patterns, which finds applications in the experimental mechanics of solids and structures. In this paper, a comparative study of three simulated random patterns is presented. One of them is generated according to a new algorithm introduced by the authors. A criterion for quantitative evaluation of random patterns, based on their autocorrelation functions, is introduced. The patterns' deformations are simulated numerically and realized experimentally. The displacements are measured using the DIC method. Tensile tests are performed after printing the generated random patterns on the surfaces of standard iron sheet specimens. It is found that the newly designed random pattern retains relatively good quality up to 20% deformation.
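
    A sketch in the spirit of such an autocorrelation-based criterion (the authors' exact metric is not reproduced): a sharper autocorrelation peak generally indicates a speckle pattern that is easier to match in DIC.

    ```python
    # Score a random pattern by its normalized autocorrelation: a low mean
    # off-peak level indicates a sharp, distinctive correlation peak.

    import numpy as np

    def autocorr_offpeak_level(pattern):
        p = pattern - pattern.mean()
        spectrum = np.fft.fft2(p)
        ac = np.fft.ifft2(spectrum * np.conj(spectrum)).real
        ac /= ac.flat[0]              # zero-lag peak normalized to 1
        return np.abs(ac)[1:, 1:].mean()

    rng = np.random.default_rng(0)
    speckle = (rng.random((128, 128)) > 0.5).astype(float)  # binary pattern
    print(autocorr_offpeak_level(speckle))
    ```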

  31. An interactive method based on the live wire for segmentation of the breast in mammography images.

    PubMed

    Zewei, Zhang; Tianyue, Wang; Li, Guo; Tingting, Wang; Lu, Xu

    2014-01-01

    In order to improve the accuracy of computer-aided diagnosis of breast lumps, the authors introduce an improved interactive segmentation method based on Live Wire. Gabor filters and the fuzzy c-means (FCM) clustering algorithm are incorporated into the Live Wire cost function definition. FCM analysis of the image is used for edge enhancement, eliminating the interference of weak edges; the improved Live Wire is applied to two cases of breast segmentation data and yields clear segmentation of breast lumps. Compared with traditional image segmentation methods, experimental results show that the method achieves more accurate segmentation of breast lumps and provides a more accurate objective basis for quantitative and qualitative analysis of breast lumps.

  32. Study on China’s Earthquake Prediction by Mathematical Analysis and its Application in Catastrophe Insurance

    NASA Astrophysics Data System (ADS)

    Jianjun, X.; Bingjie, Y.; Rongji, W.

    2018-03-01

    The purpose of this paper was to improve the level of catastrophe insurance. Firstly, earthquake predictions were carried out using mathematical analysis methods. Secondly, foreign catastrophe insurance policies and models were compared. Thirdly, suggestions on catastrophe insurance for China were discussed. Further study should pay more attention to earthquake prediction by introducing big data.

  33. Significant genetic differentiation between native and introduced silver carp (Hypophthalmichthys molitrix) inferred from mtDNA analysis

    USGS Publications Warehouse

    Li, S.-F.; Xu, J.-W.; Yang, Q.-L.; Wang, C.H.; Chapman, D.C.; Lu, G.

    2011-01-01

    Silver carp Hypophthalmichthys molitrix (Cyprinidae) is native to China and has been introduced to over 80 countries. The extent of genetic diversity in introduced silver carp and the genetic divergence between introduced and native populations remain largely unknown. In this study, 241 silver carp sampled from three major native rivers and two non-native rivers (Mississippi River and Danube River) were analyzed using nucleotide sequences of mitochondrial COI gene and D-loop region. A total of 73 haplotypes were observed, with no haplotype found common to all the five populations and eight haplotypes shared by two to four populations. As compared with introduced populations, all native populations possess both higher haplotype diversity and higher nucleotide diversity, presumably a result of the founder effect. Significant genetic differentiation was revealed between native and introduced populations as well as among five sampled populations, suggesting strong selection pressures might have occurred in introduced populations. Collectively, this study not only provides baseline information for sustainable use of silver carp in their native country (i.e., China), but also offers first-hand genetic data for the control of silver carp in countries (e.g., the United States) where they are considered invasive.
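
    For context, comparisons of this kind typically report Nei's haplotype (gene) diversity; a minimal sketch follows, with made-up haplotype counts standing in for the two kinds of populations.

    ```python
    # Nei's haplotype diversity: h = n/(n-1) * (1 - sum(p_i^2)) for haplotype
    # counts in one population. Counts below are invented for illustration.

    def haplotype_diversity(counts):
        n = sum(counts)
        return n / (n - 1) * (1 - sum((c / n) ** 2 for c in counts))

    native = [10, 8, 5, 4, 3, 2, 2, 1, 1]   # many haplotypes, fairly even
    introduced = [30, 4, 2]                  # a few haplotypes dominate
    print(haplotype_diversity(native), haplotype_diversity(introduced))
    ```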

  34. Decorrelation correction for nanoparticle tracking analysis of dilute polydisperse suspensions in bulk flow

    NASA Astrophysics Data System (ADS)

    Hartman, John; Kirby, Brian

    2017-03-01

    Nanoparticle tracking analysis, a multiprobe single particle tracking technique, is a widely used method to quickly determine the concentration and size distribution of colloidal particle suspensions. Many popular tools remove non-Brownian components of particle motion by subtracting the ensemble-average displacement at each time step, which is termed dedrifting. Though critical for accurate size measurements, dedrifting is shown here to introduce significant biasing error and can fundamentally limit the dynamic range of particle size that can be measured for dilute heterogeneous suspensions such as biological extracellular vesicles. We report a more accurate estimate of particle mean-square displacement, which we call decorrelation analysis, that accounts for correlations between individual and ensemble particle motion, which are spuriously introduced by dedrifting. Particle tracking simulation and experimental results show that this approach more accurately determines particle diameters for low-concentration polydisperse suspensions when compared with standard dedrifting techniques.
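
    A sketch of the dedrifting step described above and the MSD estimate built on it (the paper's decorrelation correction itself is not reproduced): with few tracked particles, subtracting the ensemble mean removes part of each particle's own Brownian motion, which is the bias the paper addresses.

    ```python
    # Dedrifting: subtract the ensemble-average displacement at each time step,
    # then estimate the mean-square displacement (MSD). Synthetic 1-D data.

    import numpy as np

    rng = np.random.default_rng(2)
    n_particles, n_steps, D, dt = 50, 200, 0.5, 0.1
    steps = rng.normal(0, np.sqrt(2 * D * dt), size=(n_particles, n_steps))
    steps += 0.3 * dt  # common drift from bulk flow

    dedrifted = steps - steps.mean(axis=0)  # remove ensemble-mean displacement
    tracks = np.cumsum(dedrifted, axis=1)

    lag = 10
    msd = np.mean((tracks[:, lag:] - tracks[:, :-lag]) ** 2)
    print(msd, 2 * D * dt * lag)  # estimate vs truth; bias grows as N shrinks
    ```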

  35. Remarks on residual stress measurement by hole-drilling and electronic speckle pattern interferometry.

    PubMed

    Barile, Claudia; Casavola, Caterina; Pappalettera, Giovanni; Pappalettere, Carmine

    2014-01-01

    Hole drilling is the most widespread method for measuring residual stress. It is based on the principle that drilling a hole in the material causes a local stress relaxation; the initial residual stress can be calculated by measuring strain in correspondence with each drill depth. Recently, optical techniques were introduced to measure strain; in this case, the accuracy of the final results depends, among other factors, on the proper choice of the area of analysis. Deformations are in fact analyzed within an annulus determined by two parameters: the internal and the external radius. In this paper, the influence of the choice of the area of analysis was investigated. A known stress field was introduced on a Ti grade 5 sample and then the stress was measured in correspondence with different values of the internal and the external radius of analysis; results were finally compared with the expected theoretical value.

  36. Remarks on Residual Stress Measurement by Hole-Drilling and Electronic Speckle Pattern Interferometry

    PubMed Central

    2014-01-01

    Hole drilling is the most widespread method for measuring residual stress. It is based on the principle that drilling a hole in the material causes a local stress relaxation; the initial residual stress can be calculated by measuring strain in correspondence with each drill depth. Recently, optical techniques were introduced to measure strain; in this case, the accuracy of the final results depends, among other factors, on the proper choice of the area of analysis. Deformations are in fact analyzed within an annulus determined by two parameters: the internal and the external radius. In this paper, the influence of the choice of the area of analysis was investigated. A known stress field was introduced on a Ti grade 5 sample and then the stress was measured in correspondence with different values of the internal and the external radius of analysis; results were finally compared with the expected theoretical value. PMID:25276850

  37. An Analysis of Unemployment and Other Labor Market Indicators in 10 Countries.

    ERIC Educational Resources Information Center

    Moy, Joyanna

    1988-01-01

    Compares unemployment, employment, and related labor market statistics in the United States, Canada, Australia, Japan, France, Germany, Italy, the Netherlands, Sweden, and the United Kingdom. Introduces employment-to-population ratios by sex and discusses unemployment rates published by the Organization for Economic Cooperation and Development and…

  38. Manual for the Comparative Politics Laboratory: Conditions for Effective Democracy.

    ERIC Educational Resources Information Center

    Fogelman, Edwin; Zingale, Nancy

    This manual introduces undergraduate students in political science to major types of data and methods for cross-national quantitative analysis. The manual's topic, Conditions for Effective Democracy, was chosen because it incorporates several different kinds of data and illustrates various methodological problems. The data are cross-sectional…

  39. The Rhetoric of Chinese Layoff Memos

    ERIC Educational Resources Information Center

    Sisco, Lisa A.; Yu, Na

    2010-01-01

    In this analysis the authors introduce three memos announcing layoffs in Chinese companies. The three memos, translated from Chinese, are from: (1) Hewlett Packard China, an American company doing business in China; (2) UT Starcom, founded in China; and (3) Rizhao Steel, one of China's largest steel manufacturers. Comparing the Chinese and…

  40. Policy Expansion of School Choice in the American States

    ERIC Educational Resources Information Center

    Wong, Kenneth K.; Langevin, Warren E.

    2007-01-01

    This research study explores the policy expansion of school choice within the methodological approach of event history analysis. The first section provides a comparative overview of state adoption of public school choice laws. After creating a statistical portrait of the contemporary landscape for school choice, the authors introduce event history…

  41. Exploratory Bi-factor Analysis: The Oblique Case.

    PubMed

    Jennrich, Robert I; Bentler, Peter M

    2012-07-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford (Psychometrika 2:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler (Psychometrika 76:537-549, 2011) introduced an exploratory form of bi-factor analysis that does not require one to provide an explicit bi-factor structure a priori. They use exploratory factor analysis and a bi-factor rotation criterion designed to produce a rotated loading matrix that has an approximate bi-factor structure. Among other things, this can be used as an aid in finding an explicit bi-factor structure for use in a confirmatory bi-factor analysis. They considered only orthogonal rotation. The purpose of this paper is to consider oblique rotation and to compare it to orthogonal rotation. Because there are many more oblique rotations of an initial loading matrix than orthogonal rotations, one expects the oblique results to approximate a bi-factor structure better than orthogonal rotations, and this is indeed the case. A surprising result arises when oblique bi-factor rotation methods are applied to ideal data.

  42. An analysis of science content and representations in introductory college physics textbooks and multimodal learning resources

    NASA Astrophysics Data System (ADS)

    Donnelly, Suzanne M.

    This study features a comparative descriptive analysis of the physics content and representations surrounding the first law of thermodynamics as presented in four widely used introductory college physics textbooks, one from each of four categories (calculus-based, algebra/trigonometry-based, conceptual, and technical/applied). Introducing and employing a newly developed theoretical framework, multimodal generative learning theory (MGLT), the study analyzes the multimodal characteristics of textbook and multimedia representations of physics principles. The modal affordances of textbook representations were identified, characterized, and compared across the four physics textbook categories in the context of their support of problem-solving. Keywords: college science, science textbooks, multimodal learning theory, thermodynamics, representations

  43. Performance of Renormalization Group Algebraic Turbulence Model on Boundary Layer Transition Simulation

    NASA Technical Reports Server (NTRS)

    Ahn, Kyung H.

    1994-01-01

    The RNG-based algebraic turbulence model, with a new method of solving the cubic equation and applying new length scales, is introduced. An analysis is made of the RNG length scale which was previously reported and the resulting eddy viscosity is compared with those from other algebraic turbulence models. Subsequently, a new length scale is introduced which actually uses the two previous RNG length scales in a systematic way to improve the model performance. The performance of the present RNG model is demonstrated by simulating the boundary layer flow over a flat plate and the flow over an airfoil.

  44. Application of wavelet and Fourier transforms as powerful alternatives for derivative spectrophotometry in analysis of binary mixtures: A comparative study

    NASA Astrophysics Data System (ADS)

    Hassan, Said A.; Abdel-Gawad, Sherif A.

    2018-02-01

    Two signal processing methods, Continuous Wavelet Transform (CWT) and Discrete Fourier Transform (DFT), were introduced as alternatives to classical Derivative Spectrophotometry (DS) in the analysis of binary mixtures. To show the advantages of these methods, a comparative study was performed on a binary mixture of Naltrexone (NTX) and Bupropion (BUP). The methods were compared by analyzing laboratory-prepared mixtures of the two drugs. By comparing the performance of the three methods, it was shown that the CWT and DFT methods are more efficient and advantageous in the analysis of mixtures with overlapped spectra than DS. The three signal processing methods were adopted for the quantification of NTX and BUP in pure and tablet forms. The adopted methods were validated according to the ICH guidelines, where accuracy, precision and specificity were found to be within appropriate limits.
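
    A sketch of the two transform routes on a synthetic two-band "spectrum"; the band positions, wavelet choice, and scales are assumptions, not the paper's settings.

    ```python
    # CWT route: convolve the spectrum with Ricker wavelets at several scales.
    # DFT route: transform, suppress low-frequency background, invert.

    import numpy as np

    def ricker(points, a):
        """Ricker (Mexican-hat) wavelet, a common CWT choice for spectra."""
        x = np.arange(points) - (points - 1) / 2.0
        amp = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
        return amp * (1 - (x / a) ** 2) * np.exp(-x ** 2 / (2 * a ** 2))

    def cwt(data, widths):
        return np.array([np.convolve(data, ricker(min(10 * w, data.size), w),
                                     mode="same") for w in widths])

    wavelength = np.linspace(200, 400, 512)
    spectrum = (np.exp(-((wavelength - 270) / 12) ** 2)           # band 1
                + 0.6 * np.exp(-((wavelength - 285) / 15) ** 2))  # overlapping band

    cwt_coeffs = cwt(spectrum, widths=[5, 10, 20, 40])

    fourier = np.fft.rfft(spectrum)
    fourier[:3] = 0                                # crude background removal
    filtered = np.fft.irfft(fourier, n=spectrum.size)
    print(cwt_coeffs.shape, filtered.shape)
    ```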

  45. Status of nuclear PDFs after the first LHC p-Pb run

    NASA Astrophysics Data System (ADS)

    Paukkunen, Hannu

    2017-11-01

    In this talk, I overview the recent progress on the global analysis of nuclear parton distribution functions (nuclear PDFs). After first introducing the contemporary fits, the analysis procedures are quickly recalled and the ambiguities in the use of experimental data outlined. Various nuclear-PDF parametrizations are compared and the main differences explained. The effects of nuclear PDFs in the LHC p-Pb hard-process observables are discussed and some future prospects sketched.

  46. Coagulation dynamics of a blood sample by multiple scattering analysis

    NASA Astrophysics Data System (ADS)

    Faivre, Magalie; Peltié, Philippe; Planat-Chrétien, Anne; Cosnier, Marie-Line; Cubizolles, Myriam; Nougier, Christophe; Négrier, Claude; Pouteau, Patrick

    2011-05-01

    We report a new technique to measure coagulation dynamics in whole-blood samples. The method relies on the analysis of the speckle pattern produced when a whole-blood sample, mixed with coagulation reagent and introduced into a thin chamber, is illuminated with coherent light. A dynamic study of the speckle reveals a typical behavior due to coagulation. We compare our measured coagulation times to those of a reference method obtained in a medical laboratory.
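
    A generic dynamic-speckle sketch (not the authors' exact analysis): the frame-to-frame correlation of speckle images rises as scatterer motion slows, which is the kind of signature coagulation produces.

    ```python
    # Pearson correlation between successive speckle frames: mobile scatterers
    # decorrelate frames quickly; a coagulated sample keeps them correlated.

    import numpy as np

    def frame_correlation(f1, f2):
        a, b = f1 - f1.mean(), f2 - f2.mean()
        return (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())

    rng = np.random.default_rng(3)
    base = rng.random((64, 64))                 # synthetic speckle frame
    for mobility in (0.9, 0.5, 0.1):            # high -> low scatterer motion
        nxt = mobility * rng.random((64, 64)) + (1 - mobility) * base
        print(mobility, frame_correlation(base, nxt))
    ```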

  47. Comparative Analyses of Plastid Sequences between Native and Introduced Populations of Aquatic Weeds Elodea canadensis and E. nuttallii

    PubMed Central

    Huotari, Tea; Korpelainen, Helena

    2013-01-01

    Non-indigenous species (NIS) are species living outside their historic or native range. Invasive NIS often cause severe environmental impacts, and may have large economical and social consequences. Elodea (Hydrocharitaceae) is a New World genus with at least five submerged aquatic angiosperm species living in fresh water environments. Our aim was to survey the geographical distribution of cpDNA haplotypes within the native and introduced ranges of invasive aquatic weeds Elodea canadensis and E. nuttallii and to reconstruct the spreading histories of these invasive species. In order to reveal informative chloroplast (cp) genome regions for phylogeographic analyses, we compared the plastid sequences of native and introduced individuals of E. canadensis. In total, we found 235 variable sites (186 SNPs, 47 indels and two inversions) between the two plastid sequences consisting of 112,193 bp and developed primers flanking the most variable genomic areas. These 29 primer pairs were used to compare the level and pattern of intraspecific variation within E. canadensis to interspecific variation between E. canadensis and E. nuttallii. Nine potentially informative primer pairs were used to analyze the phylogeographic structure of both Elodea species, based on 70 E. canadensis and 25 E. nuttallii individuals covering native and introduced distributions. On the whole, the level of variation between the two Elodea species was 53% higher than that within E. canadensis. In our phylogeographic analysis, only a single haplotype was found in the introduced range in both species. These haplotypes H1 (E. canadensis) and A (E. nuttallii) were also widespread in the native range, covering the majority of native populations analyzed. Therefore, we were not able to identify either the geographic origin of the introduced populations or test the hypothesis of single versus multiple introductions. The divergence between E. canadensis haplotypes was surprisingly high, and future research may clarify mechanisms that structure native E. canadensis populations. PMID:23620722

  48. Evaluation of grid generation technologies from an applied perspective

    NASA Technical Reports Server (NTRS)

    Hufford, Gary S.; Harrand, Vincent J.; Patel, Bhavin C.; Mitchell, Curtis R.

    1995-01-01

    An analysis of the grid generation process from the point of view of an applied CFD engineer is given. Issues addressed include geometric modeling, structured grid generation, unstructured grid generation, hybrid grid generation and use of virtual parts libraries in large parametric analysis projects. The analysis is geared towards comparing the effective turn-around time for specific grid generation and CFD projects. It was concluded that a single grid generation methodology is not universally suited for all CFD applications, due to limitations in both grid generation and flow solver technology. A new geometric modeling and grid generation tool, CFD-GEOM, is introduced to effectively integrate the geometric modeling process with the various grid generation methodologies, including structured, unstructured, and hybrid procedures. The full integration of geometric modeling and grid generation allows implementation of extremely efficient updating procedures, a necessary requirement for large parametric analysis projects. The concept of using virtual parts libraries in conjunction with hybrid grids for large parametric analysis projects is also introduced to improve the efficiency of the applied CFD engineer.

  49. Improving Critical Thinking "via" Authenticity: The CASPiE Research Experience in a Military Academy Chemistry Course

    ERIC Educational Resources Information Center

    Chase, A. M.; Clancy, H. A.; Lachance, R. P.; Mathison, B. M.; Chiu, M. M.; Weaver, G. C.

    2017-01-01

    Course-based undergraduate research experiences (CUREs) can introduce many students to authentic research activities in a cost-effective manner. Past studies have shown that students who participated in CUREs report greater interest in chemistry, better data collection and analysis skills, and enhanced scientific reasoning compared to traditional…

  50. Tale of Two Tales: Locally Produced Accounts and Memberships during Research Interviews with a Multilingual Speaker

    ERIC Educational Resources Information Center

    Mori, Junko

    2012-01-01

    A growing number of studies have examined qualitative research interviews in terms of how researchers' own identities and agendas are implicated in the construction of interviewees' responses. Adopting the constructionist conception of research interviews, the current study introduces a comparative analysis of 2 interviews with a multilingual…

  51. Comparing Effects of School Inspections in Sweden and Austria

    ERIC Educational Resources Information Center

    Kemethofer, David; Gustafsson, Jan-Eric; Altrichter, Herbert

    2017-01-01

    In recent years, school inspections have been newly introduced or adapted to the evidence-based governance logic in many European countries. So far, empirical research on the impact of school inspections has produced inconclusive results. Methodologically, it has mainly focussed on analysis of a national inspection model and used cross-sectional…

  52. Project-Based Learning in Post-WWII Japanese School Curriculum: An Analysis via Curriculum Orientations

    ERIC Educational Resources Information Center

    Nomura, Kazuyuki

    2017-01-01

    In the 2000s, the new national curriculum, dubbed the "yutori curriculum," introduced a new subject for project-based learning, "Integrated Study," as its prominent feature. Comparing curriculum orientations in project-based learning in three historical periods after WWII, including Integrated Study, this paper aims to…

  53. Japan's Higher Education Incorporation Policy: A Comparative Analysis of Three Stages of National University Governance

    ERIC Educational Resources Information Center

    Hanada, Shingo

    2013-01-01

    A number of countries with public higher education systems have implemented privatisation policies. In Japan, the national government introduced the National University Corporation Act (NUCA) in 2004 and changed the legal status of national universities from that of government-owned public institutions to independent administrative agencies. Its…

  54. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  55. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. Copyright © 2016 Elsevier Inc. All rights reserved.

  56. Hot-compress: A new postdeposition treatment for ZnO-based flexible dye-sensitized solar cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haque Choudhury, Mohammad Shamimul; Kishi, Naoki

    2016-08-15

    Highlights: • A new postdeposition treatment named hot-compress is introduced. • Hot-compression gives a homogeneous, compact-layer ZnO photoanode. • I-V and EIS analysis data confirm the efficacy of this method. • Charge transport resistance was reduced by the application of hot-compression. - Abstract: This article introduces a new postdeposition treatment named hot-compress for flexible zinc oxide-based dye-sensitized solar cells. This postdeposition treatment applies compression pressure at an elevated temperature. The optimum compression pressure of 130 MPa at an optimum compression temperature of 70 °C gives better photovoltaic performance compared to conventional cells. The aptness of this method was confirmed by scanning electron microscopy imaging, X-ray diffraction, current-voltage, and electrochemical impedance spectroscopy analysis of the prepared cells. Proper heating during compression lowers the charge transport resistance and lengthens the electron lifetime of the device. As a result, the overall power conversion efficiency of the device was improved by about 45% compared to the conventional room-temperature-compressed cell.

  17. Optimal Sharpening of Compensated Comb Decimation Filters: Analysis and Design

    PubMed Central

    Troncoso Romero, David Ernesto

    2014-01-01

    Comb filters are a class of low-complexity filters especially useful for multistage decimation processes. However, the magnitude response of comb filters presents a droop in the passband region and low stopband attenuation, which is undesirable in many applications. In this work, it is shown that, for stringent magnitude specifications, sharpening compensated comb filters requires a lower-degree sharpening polynomial compared to sharpening comb filters without compensation, resulting in a solution with lower computational complexity. Using a simple three-addition compensator and an optimization-based derivation of sharpening polynomials, we introduce an effective low-complexity filtering scheme. Design examples are presented in order to show the performance improvement in terms of passband distortion and selectivity compared to other methods based on the traditional Kaiser-Hamming sharpening and the Chebyshev sharpening techniques recently introduced in the literature. PMID:24578674
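
    As background for the comparison the paper draws, a minimal numerical sketch of the traditional (uncompensated) Kaiser-Hamming sharpening of a comb filter, using the classic polynomial H²(3 − 2H); the parameters are illustrative, and the paper's three-addition compensator and optimized sharpening polynomials are not shown.

    ```python
    import numpy as np

    M = 16                                  # decimation factor
    w = np.linspace(1e-6, np.pi, 4096)      # avoid the 0/0 at w = 0

    # Amplitude response of a length-M comb (running-sum) filter.
    H = np.abs(np.sin(M * w / 2) / (M * np.sin(w / 2)))

    # Classic Kaiser-Hamming sharpening: Hs = H^2 * (3 - 2H). It flattens
    # the passband droop and deepens the stopband, at the cost of roughly
    # tripling the computational load of the filter.
    Hs = H**2 * (3 - 2 * H)

    print(f"droop near passband edge, plain comb: {20*np.log10(H[100]):.2f} dB")
    print(f"droop near passband edge, sharpened:  {20*np.log10(Hs[100]):.2f} dB")
    ```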

  18. Optimal sharpening of compensated comb decimation filters: analysis and design.

    PubMed

    Troncoso Romero, David Ernesto; Laddomada, Massimiliano; Jovanovic Dolecek, Gordana

    2014-01-01

    Comb filters are a class of low-complexity filters especially useful for multistage decimation processes. However, the magnitude response of comb filters presents a droop in the passband region and low stopband attenuation, which is undesirable in many applications. In this work, it is shown that, for stringent magnitude specifications, sharpening compensated comb filters requires a lower-degree sharpening polynomial compared to sharpening comb filters without compensation, resulting in a solution with lower computational complexity. Using a simple three-addition compensator and an optimization-based derivation of sharpening polynomials, we introduce an effective low-complexity filtering scheme. Design examples are presented in order to show the performance improvement in terms of passband distortion and selectivity compared to other methods based on the traditional Kaiser-Hamming sharpening and the Chebyshev sharpening techniques recently introduced in the literature.

  19. A Simple Deep Learning Method for Neuronal Spike Sorting

    NASA Astrophysics Data System (ADS)

    Yang, Kai; Wu, Haifeng; Zeng, Yu

    2017-10-01

    Spike sorting is one of the key techniques for understanding brain activity. With the development of modern electrophysiology, recent multi-electrode technologies can record the activity of thousands of neuronal spikes simultaneously, which increases the computational complexity of conventional sorting algorithms. In this paper, we focus on reducing this complexity and introduce a deep learning algorithm, the principal component analysis network (PCANet), for spike sorting. The introduced method starts from a conventional model and establishes a Toeplitz matrix. From the column vectors of this matrix, we train a PCANet that extracts eigenvector features of the spikes. Finally, a support vector machine (SVM) is used to sort the spikes. In experiments, we choose two groups of simulated data from publicly available databases and compare the introduced method with conventional methods. The results indicate that the introduced method has lower complexity while achieving the same sorting errors as the conventional methods.
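
    Setting the PCANet cascade aside, the final stages described here (low-dimensional spike features followed by SVM classification) can be sketched with plain PCA on synthetic waveforms; all data and parameters below are invented for illustration.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    # Hypothetical spike matrix: one aligned waveform per row (64 samples
    # around each detected event) with known unit labels for training.
    rng = np.random.default_rng(0)
    templates = rng.normal(size=(3, 64))                # three fake units
    labels = rng.integers(0, 3, size=600)
    spikes = templates[labels] + 0.3 * rng.normal(size=(600, 64))

    # Reduce each waveform to a few principal-component scores, then sort
    # with a support vector machine, as in the method's final stage.
    X = PCA(n_components=5).fit_transform(spikes)
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
    clf = SVC().fit(X_tr, y_tr)
    print(f"sorting accuracy: {clf.score(X_te, y_te):.3f}")
    ```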

  20. On an Additive Semigraphoid Model for Statistical Networks With Application to Pathway Analysis.

    PubMed

    Li, Bing; Chun, Hyonho; Zhao, Hongyu

    2014-09-01

    We introduce a nonparametric method for estimating non-Gaussian graphical models based on a new statistical relation called additive conditional independence, which is a three-way relation among random vectors that resembles the logical structure of conditional independence. Additive conditional independence allows us to use a one-dimensional kernel regardless of the dimension of the graph, which not only avoids the curse of dimensionality but also simplifies computation. It also gives rise to a parallel structure to the Gaussian graphical model that replaces the precision matrix by an additive precision operator. The estimators derived from additive conditional independence cover the recently introduced nonparanormal graphical model as a special case, but outperform it when the Gaussian copula assumption is violated. We compare the new method with existing ones by simulations and in genetic pathway analysis.

  1. Revisiting an old concept: the coupled oscillator model for VCD. Part 1: the generalised coupled oscillator mechanism and its intrinsic connection to the strength of VCD signals.

    PubMed

    Nicu, Valentin Paul

    2016-08-03

    Motivated by the renewed interest in the coupled oscillator (CO) model for VCD, in this work a generalised coupled oscillator (GCO) expression is derived by introducing the concept of a coupled oscillator origin. Unlike the standard CO expression, the GCO expression is exact within the harmonic approximation. Using two illustrative example molecules, the theoretical concepts introduced here are demonstrated by performing a GCO decomposition of the rotational strengths computed using DFT. This analysis shows that: (1) the contributions to the rotational strengths that are normally neglected in the standard CO model can be comparable to or larger than the CO contribution, and (2) the GCO mechanism introduced here can affect the VCD intensities of all types of modes in symmetric and asymmetric molecules.

  2. Introduction Analysis of Refrigerating and Air-Conditioning Technologies in Micro Grid Type Food Industrial Park

    NASA Astrophysics Data System (ADS)

    Shimazaki, Yoichi

    The aim of this study was to evaluate refrigerating and air-conditioning technologies when introducing both a cogeneration system and an energy network in a food industrial park. Energy data from 14 factories were classified, through interviews, into steam, hot water, heating, cooling, refrigerating, freezing, and electric power. The author developed a micro grid model based on linear programming that minimizes the total system cost. The industrial park was divided into a 2,500 square meter mesh in order to take steam transport into consideration. Four cases were investigated. It was found that the electric-power-driven freezer was preferred over the ammonia absorption freezer; the ammonia absorption freezer was introduced only in factories that have little steam demand and large freezing demand at the same time.
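
    A loudly simplified sketch of the kind of linear program described here: choosing how to meet a freezing demand between an electric freezer and a steam-limited ammonia absorption freezer. All coefficients are invented, not the paper's 14-factory data.

    ```python
    from scipy.optimize import linprog

    # Hypothetical per-unit operating costs: x0 = freezing met electrically,
    # x1 = freezing met by an ammonia absorption freezer fed surplus steam.
    c = [1.2, 0.8]                 # cost per unit of freezing demand met
    A_ub = [[0.0, 1.0]]            # absorption freezer limited by steam supply
    b_ub = [30.0]                  # available surplus steam (units)
    A_eq = [[1.0, 1.0]]            # total freezing demand must be met
    b_eq = [100.0]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * 2)
    print(res.x, res.fun)          # optimal split and minimum total cost
    ```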

  3. Treatments of Missing Values in Large National Data Affect Conclusions: The Impact of Multiple Imputation on Arthroplasty Research.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Su, Edwin P; Grauer, Jonathan N

    2018-03-01

    Despite the advantages of large, national datasets, one continuing concern is missing data values. Complete case analysis, where only cases with complete data are analyzed, is commonly used rather than more statistically rigorous approaches such as multiple imputation. This study characterizes the potential selection bias introduced using complete case analysis and compares the results of common regressions using both techniques following unicompartmental knee arthroplasty. Patients undergoing unicompartmental knee arthroplasty were extracted from the 2005 to 2015 National Surgical Quality Improvement Program. As examples, the demographics of patients with and without missing preoperative albumin and hematocrit values were compared. Missing data were then treated with both complete case analysis and multiple imputation (an approach that reproduces the variation and associations that would have been present in a full dataset) and the conclusions of common regressions for adverse outcomes were compared. A total of 6117 patients were included, of which 56.7% were missing at least one value. Younger, female, and healthier patients were more likely to have missing preoperative albumin and hematocrit values. The use of complete case analysis removed 3467 patients from the study in comparison with multiple imputation which included all 6117 patients. The 2 methods of handling missing values led to differing associations of low preoperative laboratory values with commonly studied adverse outcomes. The use of complete case analysis can introduce selection bias and may lead to different conclusions in comparison with the statistically rigorous multiple imputation approach. Joint surgeons should consider the methods of handling missing values when interpreting arthroplasty research. Copyright © 2017 Elsevier Inc. All rights reserved.
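
    The contrast between the two treatments of missing values can be sketched as follows; the data are synthetic, and scikit-learn's IterativeImputer stands in for a full multiple-imputation procedure (which would pool estimates over several imputed datasets).

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    # Hypothetical cohort where albumin is missing more often for younger
    # patients, mimicking the selection problem the study describes.
    rng = np.random.default_rng(1)
    n = 2000
    age = rng.normal(65, 10, n)
    albumin = 4.2 - 0.01 * (age - 65) + rng.normal(0, 0.3, n)
    missing = rng.random(n) < np.clip(0.9 - 0.01 * (age - 40), 0, 1)
    df = pd.DataFrame({"age": age,
                       "albumin": np.where(missing, np.nan, albumin)})

    # Complete case analysis silently drops the (younger) missing rows.
    print("complete cases:", df.dropna().shape[0], "of", n)

    # Chained-equations-style imputation keeps every patient instead.
    imputed = IterativeImputer(random_state=0).fit_transform(df)
    print("mean albumin, complete case vs imputed:",
          df["albumin"].mean().round(3), imputed[:, 1].mean().round(3))
    ```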

  4. dc analysis and design of zero-voltage-switched multi-resonant converters

    NASA Astrophysics Data System (ADS)

    Tabisz, Wojciech A.; Lee, Fred C.

    Recently introduced multiresonant converters (MRCs) provide zero-voltage switching (ZVS) of both active and passive switches and offer a substantial reduction of transistor voltage stress and an increase of load range, compared to their quasi-resonant converter counterparts. Using the resonant switch concept, a simple, generalized analysis of ZVS MRCs is presented. The conversion ratio and voltage stress characteristics are derived for basic ZVS MRCs, including buck, boost, and buck/boost converters. Based on the analysis, a design procedure that optimizes the selection of resonant elements for maximum conversion efficiency is proposed.

  5. Insight into the collagen assembly in the presence of lysine and glutamic acid: An in vitro study.

    PubMed

    Liu, Xinhua; Dan, Nianhua; Dan, Weihua

    2017-01-01

    The aim of this study is to evaluate the effects of two differently charged amino acids in collagen chains, lysine and glutamic acid, on the fibrillogenesis of collagen molecules. The turbidity, zeta potential, and fiber diameter analyses suggest that introducing positively charged lysine into collagen might significantly improve the sizes or amounts of the self-assembled collagen fibrils. Conversely, negatively charged glutamic acid might restrict the self-assembly of collagen building blocks into a higher-order structure. The optimal fibrillogenesis condition is achieved when the concentration of lysine reaches 1 mM. Both scanning electron microscopy (SEM) and atomic force microscopy (AFM) analyses indicate that, compared to pure collagen fibrils, the reconstructed collagen-lysine co-fibrils exhibit a higher degree of inter-fiber entanglement, with straighter and longer fibrils. Notably, the specific D-period patterns of the reconstructed collagen fibrils are clearly discernible, and the width of the D-banding increases steadily after introducing lysine. In addition, the kinetic and thermodynamic analysis of collagen self-assembly confirms that the rate constants of both the first and second assembly phases decrease after introducing lysine, and that lysine promotes collagen fibrillogenesis in accordance with thermodynamic laws. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Systematic comparison of the behaviors produced by computational models of epileptic neocortex.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warlaumont, A. S.; Lee, H. C.; Benayoun, M.

    2010-12-01

    Two existing models of brain dynamics in epilepsy, one detailed (i.e., realistic) and one abstract (i.e., simplified), are compared in terms of behavioral range and match to in vitro mouse recordings. A new method is introduced for comparing across computational models that may have very different forms. First, high-level metrics were extracted from model and in vitro output time series. A principal components analysis was then performed over these metrics to obtain a reduced set of derived features. These features define a low-dimensional behavior space in which quantitative measures of behavioral range and degree of match to real data can be obtained. The detailed and abstract models and the mouse recordings overlapped considerably in behavior space. Both the range of behaviors and similarity to mouse data were similar between the detailed and abstract models. When no high-level metrics were used and principal components analysis was computed over raw time series, the models overlapped minimally with the mouse recordings. The method introduced here is suitable for comparing across different kinds of model data and across real brain recordings. It appears that, despite differences in form and computational expense, detailed and abstract models do not necessarily differ in their behaviors.
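
    The pipeline described—high-level metrics per time series, then principal components analysis to obtain a low-dimensional behavior space—can be sketched as below; the metrics and data are stand-ins, not the authors' choices.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def metrics(ts):
        """A few illustrative high-level metrics of a 1-D time series."""
        return [ts.mean(), ts.std(), np.abs(np.diff(ts)).mean(),
                ((ts[:-1] - ts.mean()) * (ts[1:] - ts.mean())).mean() / ts.var()]

    rng = np.random.default_rng(2)
    # Stand-ins for detailed-model, abstract-model, and in vitro recordings.
    runs = [rng.normal(size=1000).cumsum() for _ in range(30)]

    # Project the metric vectors into a low-dimensional "behavior space";
    # overlap of point clouds from different sources indicates similar behavior.
    features = np.array([metrics(r) for r in runs])
    space = PCA(n_components=2).fit_transform(features)
    print(space.shape)  # (30, 2)
    ```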

  7. Analysis of Streamline Separation at Infinity Using Time-Discrete Markov Chains.

    PubMed

    Reich, W; Scheuermann, G

    2012-12-01

    Existing methods for analyzing the separation of streamlines are often restricted to a finite time or a local area. In our paper we introduce a new method that complements them by allowing an infinite-time evaluation of steady planar vector fields. Our algorithm unifies combinatorial and probabilistic methods and introduces the concept of separation in time-discrete Markov chains. We compute particle distributions instead of the streamlines of single particles. We encode the flow into a map and then into a transition matrix for each time direction. Finally, we compare the results of our grid-independent algorithm to the popular finite-time Lyapunov exponents and discuss the discrepancies.
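
    A minimal one-dimensional analogue of the approach (the paper treats steady planar fields): encode a flow map into a transition matrix and evolve a particle distribution toward its infinite-time limit. The velocity field and discretization below are invented for illustration.

    ```python
    import numpy as np

    # Discretize a 1-D domain and encode a steady flow map x -> x + v(x)*dt
    # into a row-stochastic transition matrix, then evolve a particle
    # distribution instead of tracing individual streamlines.
    n = 100
    x = np.linspace(0, 1, n, endpoint=False) + 0.5 / n   # cell centres
    v = np.sin(2 * np.pi * x)                            # toy velocity field
    dest = np.clip(((x + 0.2 * v) * n).astype(int), 0, n - 1)

    P = np.zeros((n, n))
    P[np.arange(n), dest] = 1.0        # one step of the flow map

    dist = np.full(n, 1.0 / n)         # uniform initial distribution
    for _ in range(500):               # approach the infinite-time limit
        dist = dist @ P
    print(dist.argmax())               # mass accumulates near x = 0.5 (cell ~50)
    ```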

  8. An improved optimization algorithm and Bayes factor termination criterion for sequential projection pursuit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webb-Robertson, Bobbie-Jo M.; Jarman, Kristin H.; Harvey, Scott D.

    2005-05-28

    A fundamental problem in the analysis of highly multivariate spectral or chromatographic data is reduction of dimensionality. Principal components analysis (PCA), concerned with explaining the variance-covariance structure of the data, is a commonly used approach to dimension reduction. Recently an attractive alternative to PCA, sequential projection pursuit (SPP), has been introduced. Designed to elicit clustering tendencies in the data, SPP may be more appropriate when performing clustering or classification analysis. However, the existing genetic algorithm (GA) implementation of SPP has two shortcomings: computation time and inability to determine the number of factors necessary to explain the majority of the structure in the data. We address both these shortcomings. First, we introduce a new SPP algorithm, a random scan sampling algorithm (RSSA), that significantly reduces computation time. We compare the computational burden of the RSSA and GA implementations of SPP on a dataset containing Raman spectra of twelve organic compounds. Second, we propose a Bayes factor criterion, BFC, as an effective measure for selecting the number of factors needed to explain the majority of the structure in the data. We compare SPP to PCA on two datasets varying in type, size, and difficulty; in both cases SPP achieves a higher accuracy with a lower number of latent variables.
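
    A crude sketch of projection pursuit by random scanning over directions, in the spirit of an RSSA; the projection index used here (excess-kurtosis magnitude as a non-Gaussianity measure) is a stand-in for the clustering index the authors optimize.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    # Hypothetical data: two clusters hidden in 50 dimensions.
    X = np.vstack([rng.normal(0, 1, (100, 50)), rng.normal(3, 1, (100, 50))])
    X = X - X.mean(axis=0)

    def non_gaussianity(z):
        """Toy projection index: cluster structure along a projection makes
        it strongly non-Gaussian (here measured by excess kurtosis)."""
        z = (z - z.mean()) / z.std()
        return abs((z**4).mean() - 3)

    # Random-scan search over unit projection directions.
    best_w, best_val = None, -np.inf
    for _ in range(2000):
        w = rng.normal(size=X.shape[1])
        w /= np.linalg.norm(w)
        val = non_gaussianity(X @ w)
        if val > best_val:
            best_w, best_val = w, val
    print(round(best_val, 2))          # index of the best projection found
    ```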

  9. Interactive visual optimization and analysis for RFID benchmarking.

    PubMed

    Wu, Yingcai; Chung, Ka-Kei; Qu, Huamin; Yuan, Xiaoru; Cheung, S C

    2009-01-01

    Radio frequency identification (RFID) is a powerful automatic remote identification technique that has wide applications. To facilitate RFID deployment, an RFID benchmarking instrument called aGate has been invented to identify the strengths and weaknesses of different RFID technologies in various environments. However, the data acquired by aGate are usually complex, time-varying, multidimensional 3D volumetric data, which are extremely challenging for engineers to analyze. In this paper, we introduce a set of visualization techniques, namely, parallel coordinate plots, orientation plots, a visual history mechanism, and a 3D spatial viewer, to help RFID engineers analyze benchmark data visually and intuitively. With these techniques, we further introduce two workflow procedures (a visual optimization procedure for finding the optimum reader antenna configuration and a visual analysis procedure for comparing the performance and identifying the flaws of RFID devices) for RFID benchmarking, with a focus on the performance analysis of the aGate system. The usefulness and usability of the system are demonstrated in the user evaluation.

  10. Sentence Similarity Analysis with Applications in Automatic Short Answer Grading

    ERIC Educational Resources Information Center

    Mohler, Michael A. G.

    2012-01-01

    In this dissertation, I explore unsupervised techniques for the task of automatic short answer grading. I compare a number of knowledge-based and corpus-based measures of text similarity, evaluate the effect of domain and size on the corpus-based measures, and also introduce a novel technique to improve the performance of the system by integrating…

  11. A Geometric Approach to Fair Division

    ERIC Educational Resources Information Center

    Barbanel, Julius

    2010-01-01

    We wish to divide a cake among some collection of people (who may have very different notions of the comparative value of pieces of cake) in a way that is both "fair" and "efficient." We explore the meaning of these terms, introduce two geometric tools to aid our analysis, and present a proof (due to Dietrich Weller) that establishes the existence…

  12. English Learner "Curricular Streams" in Four Middle Schools: Triage in the Trenches

    ERIC Educational Resources Information Center

    Estrada, Peggy

    2014-01-01

    Little is known about the curricular experiences schools provide English learner students (ELs) to meet the dual goals of attaining English language proficiency (ELP) and grade-level achievement. I introduce the concept of "Curricular Streams" to provide a more nuanced comparative analysis of four urban middle schools, focusing on: (a)…

  13. What Makes the Difference? An Analysis of a Reading Intervention Programme Implemented in Rural Schools in Cambodia

    ERIC Educational Resources Information Center

    Courtney, Jane; Gravelle, Maggie

    2014-01-01

    This article compares the existing single-strategy approach towards the teaching of early literacy in schools in rural Cambodia with a multiple-strategy approach introduced as part of a reading intervention programme. Classroom observations, questionnaires and in-depth interviews with teachers were used to explore teachers' practices and…

  14. A computational model to compare different investment scenarios for mini-stereotactic frame approach to deep brain stimulation surgery.

    PubMed

    Lanotte, M; Cavallo, M; Franzini, A; Grifi, M; Marchese, E; Pantaleoni, M; Piacentino, M; Servello, D

    2010-09-01

    Deep brain stimulation (DBS) alleviates symptoms of many neurological disorders by applying electrical impulses to the brain by means of implanted electrodes, generally put in place using a conventional stereotactic frame. A new image guided disposable mini-stereotactic system has been designed to help shorten and simplify DBS procedures when compared to standard stereotaxy. A small number of studies have been conducted which demonstrate localization accuracies of the system similar to those achievable by the conventional frame. However no data are available to date on the economic impact of this new frame. The aim of this paper was to develop a computational model to evaluate the investment required to introduce the image guided mini-stereotactic technology for stereotactic DBS neurosurgery. A standard DBS patient care pathway was developed and related costs were analyzed. A differential analysis was conducted to capture the impact of introducing the image guided system on the procedure workflow. The analysis was carried out in five Italian neurosurgical centers. A computational model was developed to estimate upfront investments and surgery costs leading to a definition of the best financial option to introduce the new frame. Investments may vary from Euro 1.900 (purchasing of Image Guided [IG] mini-stereotactic frame only) to Euro 158.000.000. Moreover the model demonstrates how the introduction of the IG mini-stereotactic frame doesn't substantially affect the DBS procedure costs.

  15. Bayesian data analysis in observational comparative effectiveness research: rationale and examples.

    PubMed

    Olson, William H; Crivera, Concetta; Ma, Yi-Wen; Panish, Jessica; Mao, Lian; Lynch, Scott M

    2013-11-01

    Many comparative effectiveness research and patient-centered outcomes research studies will need to be observational for one or both of two reasons: first, randomized trials are expensive and time-consuming; and second, only observational studies can answer some research questions. It is generally recognized that there is a need to increase the scientific validity and efficiency of observational studies. Bayesian methods for the design and analysis of observational studies are scientifically valid and offer many advantages over frequentist methods, including, importantly, the ability to conduct comparative effectiveness research/patient-centered outcomes research more efficiently. Bayesian data analysis is being introduced into outcomes studies that we are conducting. Our purpose here is to describe our view of some of the advantages of Bayesian methods for observational studies and to illustrate both realized and potential advantages by describing studies we are conducting in which various Bayesian methods have been or could be implemented.

  16. Flexural torsional buckling of uniformly compressed beam-like structures

    NASA Astrophysics Data System (ADS)

    Ferretti, M.

    2018-02-01

    A Timoshenko beam model embedded in 3D space is introduced for buckling analysis of multi-storey buildings made of rigid floors connected by elastic columns. The beam model is developed via a direct approach, and the constitutive law, accounting for prestress forces, is deduced via a suitable homogenization procedure. The bifurcation analysis for the case of uniformly compressed buildings is then addressed, and numerical results for the Timoshenko model are compared with 3D finite element analyses. Finally, some conclusions and perspectives are drawn.

  17. Power Spectral Density Error Analysis of Spectral Subtraction Type of Speech Enhancement Methods

    NASA Astrophysics Data System (ADS)

    Händel, Peter

    2006-12-01

    A theoretical framework for analysis of speech enhancement algorithms is introduced for performance assessment of spectral subtraction type of methods. The quality of the enhanced speech is related to physical quantities of the speech and noise (such as stationarity time and spectral flatness), as well as to design variables of the noise suppressor. The derived theoretical results are compared with the outcome of subjective listening tests as well as successful design strategies, performed by independent research groups.
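
    For reference, a minimal implementation of the class of methods analyzed—power spectral subtraction with a spectral floor and overlap-add resynthesis; the frame length, floor, and signals are illustrative.

    ```python
    import numpy as np

    def spectral_subtraction(noisy, noise_power, n_fft=256, hop=128, beta=0.01):
        """Minimal power spectral subtraction: subtract an average noise power
        spectrum frame by frame, keep the noisy phase, overlap-add the result."""
        win = np.hanning(n_fft)
        out = np.zeros(len(noisy))
        for s in range(0, len(noisy) - n_fft, hop):
            spec = np.fft.rfft(noisy[s:s + n_fft] * win)
            power = np.abs(spec) ** 2
            # Flooring the subtracted spectrum (beta) limits "musical noise".
            clean_pow = np.maximum(power - noise_power, beta * power)
            frame = np.fft.irfft(np.sqrt(clean_pow) * np.exp(1j * np.angle(spec)))
            out[s:s + n_fft] += frame
        return out

    # Noise power spectrum estimated from a leading noise-only segment.
    rng = np.random.default_rng(4)
    noise = 0.5 * rng.normal(size=16000)
    speechy = np.sin(2 * np.pi * np.linspace(0, 300, 16000)) + noise
    win = np.hanning(256)
    frames = [noise[s:s + 256] * win for s in range(0, 4000, 128)]
    noise_power = np.mean([np.abs(np.fft.rfft(f)) ** 2 for f in frames], axis=0)
    enhanced = spectral_subtraction(speechy, noise_power)
    ```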

  18. Virtual Tool Mark Generation for Efficient Striation Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor

    2014-02-16

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5–10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners.

  19. SICR rumor spreading model in complex networks: Counterattack and self-resistance

    NASA Astrophysics Data System (ADS)

    Zan, Yongli; Wu, Jianliang; Li, Ping; Yu, Qinglin

    2014-07-01

    Rumor is an important form of social interaction, but the spreading of harmful rumors can have a significant negative impact on the well-being of society. In this paper, considering the counterattack mechanism of rumor spreading, we introduce two new models: the Susceptible-Infective-Counterattack-Refractory (SICR) model and the adjusted-SICR model. We then derive mean-field equations to describe their dynamics in homogeneous networks and conduct a steady-state analysis. We also introduce a self-resistance parameter τ and study its influence on rumor spreading. Numerical simulations are performed to compare the SICR model with the SIR model and the adjusted-SICR model, respectively, and we investigate the spreading peak and the final size of the rumor under various parameters. The simulation results agree well with the theoretical analysis. The experiments reveal some interesting patterns of rumor spreading involving the counterattack force.
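
    The SICR equations are not reproduced in this record, so as a hedged illustration, here is the classical mean-field SIR baseline the new models are compared against, integrated with scipy; the rates are illustrative values.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    lam, delta = 0.8, 0.3   # spreading and stifling rates (illustrative)

    def sir(t, y):
        """Mean-field SIR rumor dynamics in a homogeneous network;
        the counterattack compartment of SICR is not modeled here."""
        s, i, r = y
        return [-lam * s * i,
                lam * s * i - delta * i,
                delta * i]

    sol = solve_ivp(sir, (0, 50), [0.99, 0.01, 0.0], dense_output=True)
    t = np.linspace(0, 50, 200)
    s, i, r = sol.sol(t)
    print(f"spreading peak: {i.max():.3f}, final rumor size: {r[-1]:.3f}")
    ```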

  20. Thermal Analysis of Reinforced Concrete Tank for Conditioning Wood by FEM Method

    NASA Astrophysics Data System (ADS)

    Błaszczyński, Tomasz; Babiak, Michał; Wielentejczyk, Przemysław

    2017-10-01

    The article introduces the analysis of a RC tank for conditioning wood carried out using the FEM (Finite Element Method). A temperature gradient distribution increase resulting from the influence of hot liquid filling the tank was defined. Values of gradients in border sections of the tank walls and the bottom were defined on the basis of the isotherm method. The obtained results were compared with empirical formulas from literature. Strength analyses were also carried out. Additionally, the problematic aspects of elongated monolithic tanks for liquids were introduced, especially regarding large temperature gradients and the means of necessary technical solutions. The use of the FEM method for designing engineering objects is, nowadays, an irreplaceable solution. In the case of the discussed tank, a spatial model of the construction mapping its actual performance was constructed in order to correctly estimate the necessary dimensions of wall and bottom sections, as well as reinforcement.

  1. Method and apparatus for continuously referenced analysis of reactive components in solution

    DOEpatents

    Bostick, W.D.; Denton, M.S.; Dinsmore, S.R.

    1979-07-31

    A continuously referenced apparatus for measuring the concentration of a reactive chemical species in solution comprises in combination conduit means for introducing a sample solution, means for introducing one or more reactants into a sample solution, and a stream separator disposed within the conduit means for separating the sample solution into a first sample stream and a second sample stream. A reactor is disposed in fluid communication with the first sample stream. A reaction takes place between the reactants introduced and the reactive chemical species of interest, causing the consumption or production of an indicator species in the first sample stream. Measurement means such as a photometric system are disposed in communication with the first and second sample streams, and the outputs of the measurement means are compared to provide a blanked measurement of the concentration of indicator species. The apparatus is particularly suitable for measurement of isoenzymes in body tissues or fluids.

  2. Rocketdyne automated dynamics data analysis and management system

    NASA Technical Reports Server (NTRS)

    Tarn, Robert B.

    1988-01-01

    An automated dynamics data analysis and management system implemented on a DEC VAX minicomputer cluster is described. Multichannel acquisition, Fast Fourier Transform analysis, and an online database have significantly improved the analysis of wideband transducer responses from Space Shuttle Main Engine testing. Leakage error correction to recover sinusoid amplitudes and to correct for frequency slewing is described. The phase errors caused by FM recorder/playback head misalignment are automatically measured and used to correct the data. Data compression methods are described and compared. The system hardware is described. Applications using the database are introduced, including software for power spectral density, instantaneous time history, amplitude histogram, fatigue analysis, and rotordynamics expert system analysis.
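
    One ingredient mentioned here, recovering sinusoid amplitudes from windowed FFTs, can be sketched as follows; the correction shown (dividing by the window's coherent gain) is the standard one and is not claimed to be the system's exact procedure.

    ```python
    import numpy as np

    # Hann-windowed FFT amplitude recovery: dividing by the window's coherent
    # gain undoes the amplitude loss the window introduces. A bin-centred tone
    # is used for clarity; off-bin tones additionally suffer scalloping loss.
    fs, n = 4096.0, 4096
    t = np.arange(n) / fs
    x = 2.5 * np.sin(2 * np.pi * 1000.0 * t)        # 2.5-unit tone at 1 kHz

    win = np.hanning(n)
    spec = np.fft.rfft(x * win)
    cg = win.sum() / n                              # coherent gain (~0.5 for Hann)
    amp = 2 * np.abs(spec) / (n * cg)               # single-sided amplitude spectrum
    print(f"recovered amplitude: {amp.max():.3f}")  # ~2.5
    ```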

  3. Cost and Time Effectiveness Analysis of a Telemedicine Service in Bangladesh.

    PubMed

    Sorwar, Golam; Rahamn, Md Mustafizur; Uddin, Ramiz; Hoque, Md Rakibul

    2016-01-01

    Telemedicine has great potential to overcome geographical barriers to providing access to equal health care services, particularly for people living in remote and rural areas in developing countries like Bangladesh. A number of telemedicine systems have been implemented in Bangladesh. However, no significant studies have been conducted to determine either their cost effectiveness or efficiency in reducing travel time required by patients. In addition, very few studies have analyzed the attitude and level of satisfaction of telemedicine service recipients in Bangladesh. The aim of this study was to analyze the cost and time effectiveness of a telemedicine service, implemented through locally developed PC based diagnostic equipment and software in Bangladesh, compared to conventional means of providing those services. The study revealed that the introduced telemedicine service reduced cost and travel time on average by 56% and 94% respectively compared to its counterpart conventional approach. The study also revealed that majority of users were highly satisfied with the newly introduced telemedicine service. Therefore, the introduced telemedicine service can be considered as a low cost and time efficient health service solution to improve health care facilities in the remote rural areas in Bangladesh.

  4. The digital storytelling process: A comparative analysis from various experts

    NASA Astrophysics Data System (ADS)

    Hussain, Hashiroh; Shiratuddin, Norshuhada

    2016-08-01

    Digital Storytelling (DST) is a method of delivering information to an audience. It combines narrative and digital media content infused with multimedia elements. To help educators (i.e., the designers) create a compelling digital story, experts have introduced sets of processes; however, the processes they suggest vary, and some are redundant. The main aim of this study is to propose a single guide process for the creation of DST. A comparative analysis is employed in which ten DST models from various experts are analysed. The process can also be applied to other multimedia materials that use the concept of DST.

  5. Effects of the change in cutoff values for human epidermal growth factor receptor 2 status by immunohistochemistry and fluorescence in situ hybridization: a study comparing conventional brightfield microscopy, image analysis-assisted microscopy, and interobserver variation.

    PubMed

    Atkinson, Roscoe; Mollerup, Jens; Laenkholm, Anne-Vibeke; Verardo, Mark; Hawes, Debra; Commins, Deborah; Engvad, Birte; Correa, Adrian; Ehlers, Charlotte Cort; Nielsen, Kirsten Vang

    2011-08-01

    New guidelines for HER2 testing have been introduced. To evaluate the difference in HER2 assessment after introduction of new cutoff levels for both immunohistochemistry (IHC) and fluorescence in situ hybridization (FISH) and to compare interobserver agreement and time to score between image analysis and conventional microscopy. Samples from 150 patients with breast cancer were scored by 7 pathologists using conventional microscopy, with a cutoff of both 10% and 30% IHC-stained cells, and using automated microscopy with image analysis. The IHC results were compared individually and to HER2 status as determined by FISH, using both the approved cutoff of 2.0 and the recently introduced cutoff of 2.2. High concordance was found in IHC scoring among the 7 pathologists. The 30% cutoff led to slightly fewer positive IHC observations. Introduction of a FISH equivocal zone affected 4% of the FISH scores. If cutoff for FISH is kept at 2.0, no difference in patient selection is found between the 10% and the 30% IHC cutoff. Among the 150 breast cancer samples, the new 30% IHC and 2.2 FISH cutoff levels resulted in one case without a firm diagnosis because both IHC and FISH were equivocal. Automated microscopy and image analysis-assisted IHC led to significantly better interobserver agreement among the 7 pathologists, with an increase in mean scoring time of only about 30 seconds per slide. The change in cutoff levels led to a higher concordance between IHC and FISH, but fewer samples were classified as HER2 positive.

  6. Critical Factors Analysis for Offshore Software Development Success by Structural Equation Modeling

    NASA Astrophysics Data System (ADS)

    Wada, Yoshihisa; Tsuji, Hiroshi

    In order to analyze the success/failure factors in offshore software development services by structural equation modeling, this paper proposes to follow two approaches together: domain-knowledge-based heuristic analysis and factor-analysis-based rational analysis. The former serves to generate and verify hypotheses about factors and causalities. The latter serves to verify factors introduced by theory, building the model without heuristics. Applying the proposed combined approach to questionnaire responses from skilled project managers, this paper finds that vendor properties have high causality for success compared to software properties and project properties.

  7. A Conceptual Analysis of State Support for Higher Education: Appropriations versus Need-Based Financial Aid

    ERIC Educational Resources Information Center

    Toutkoushian, Robert K.; Shafiq, M. Najeeb

    2010-01-01

    In this paper, we use economic concepts to examine the choice that states make between giving appropriations to public colleges or need-based financial aid to students. We begin by reviewing the economic justification for state support for higher education. Next, we introduce a simple economic model for comparing and contrasting appropriations and…

  8. Research studies of aging changes of hyaline cartilage surface by using Raman-scattering spectroscopy

    NASA Astrophysics Data System (ADS)

    Timchenko, E. V.; Timchenko, P. E.; Dolgushkin, D. A.; Volova, L. T.; Lazarev, V. A.; Tyumchenkova, A. S.; Markova, M. D.

    2017-08-01

    The paper presents the results of a comparative analysis by the method of Raman spectroscopy of the joint hyaline cartilage of adults and children. Differences in the spectral characteristics of the surface of articular cartilage are shown. New optical coefficients have been introduced, which make it possible to evaluate the age-related changes in cartilaginous tissue.

  9. Continuous Training and Wages: An Empirical Analysis Using a Comparison-Group Approach

    ERIC Educational Resources Information Center

    Gorlitz, Katja

    2011-01-01

    Using German linked employer-employee data, this paper investigates the short-term impact of on-the-job training on wages. The applied estimation approach was first introduced by Leuven and Oosterbeek (2008). Wages of employees who intended to participate in training but did not do so because of a random event are compared to wages of training…

  10. Methodological Challenges in Researching Threshold Concepts: A Comparative Analysis of Three Projects

    ERIC Educational Resources Information Center

    Quinlan, K. M.; Male, S.; Baillie, C.; Stamboulis, A.; Fill, J.; Jaffer, Z.

    2013-01-01

    Threshold concepts were introduced nearly 10 years ago by Ray Land and Jan Meyer. This work has spawned four international conferences and hundreds of papers. Although the idea has clearly gained traction in higher education, this sub-field does not yet have a fully fledged research methodology or a strong critical discourse about methodology.…

  11. Helminth species richness of introduced and native grey mullets (Teleostei: Mugilidae).

    PubMed

    Sarabeev, Volodimir

    2015-08-01

    Quantitative complex analyses of parasite communities of invaders across different native and introduced populations are largely lacking. The present study provides a comparative analysis of species richness of helminth parasites in native and invasive populations of grey mullets. The local species richness differed between regions and host species, but did not differ when compared with invasive and native hosts. The size of parasite assemblages of endohelminths was higher in the Mediterranean and Azov-Black Seas, while monogeneans were the most diverse in the Sea of Japan. The helminth diversity was apparently higher in the introduced population of Liza haematocheilus than that in their native habitat, but this trend could not be confirmed when the size of geographic range and sampling efforts were controlled for. The parasite species richness at the infracommunity level of the invasive host population is significantly lower compared with that of the native host populations that lends support to the enemy release hypothesis. A distribution pattern of the infracommunity richness of acquired parasites by the invasive host can be characterized as aggregated and it is random in native host populations. Heterogeneity in the host susceptibility and vulnerability to acquired helminth species was assumed to be a reason of the aggregation of species numbers in the population of the invasive host. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  12. Introducing conjoint analysis method into delayed lotteries studies: its validity and time stability are higher than in adjusting

    PubMed Central

    Białek, Michał; Markiewicz, Łukasz; Sawicki, Przemysław

    2015-01-01

    Delayed lotteries are much more common in everyday life than pure lotteries. Usually, we need to wait to find out the outcome of a risky decision (e.g., investing in a stock market, engaging in a relationship). However, most research has studied time discounting and probability discounting in isolation, using methodologies designed specifically to track changes in one parameter. The most commonly used method is adjusting, but its reported validity and time stability in research on discounting are suboptimal. The goal of this study was to introduce a novel method for analyzing delayed lotteries—conjoint analysis—which hypothetically is more suitable for analyzing individual preferences in this area. A set of two studies compared conjoint analysis with adjusting. The results suggest that individual parameters of discounting strength estimated with conjoint analysis have higher predictive value (Studies 1 and 2) and are more stable over time (Study 2) compared to adjusting. Despite the exploratory character of the reported studies, we discuss these findings and suggest that future research on delayed lotteries should be cross-validated using both methods. PMID:25674069
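
    For readers unfamiliar with conjoint analysis, a minimal sketch of part-worth estimation by regressing ratings on dummy-coded attribute levels; the attributes, levels, and utilities below are invented and far simpler than the authors' design.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical conjoint task: lottery profiles described by two
    # attributes (delay in days, win probability) receive ratings.
    rng = np.random.default_rng(5)
    delays = np.array([0, 30, 180])
    probs = np.array([0.25, 0.5, 0.9])
    profiles = pd.DataFrame([(d, p) for d in delays for p in probs],
                            columns=["delay", "prob"])
    true_utility = -0.01 * profiles["delay"] + 5 * profiles["prob"]
    ratings = true_utility + rng.normal(0, 0.2, len(profiles))

    # Part-worth utilities recovered by OLS on dummy-coded levels.
    X = pd.get_dummies(profiles.astype(str), drop_first=True).astype(float)
    fit = sm.OLS(ratings, sm.add_constant(X)).fit()
    print(fit.params.round(2))       # estimated part-worth per attribute level
    ```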

  13. High School Class for Gifted Pupils in Physics and Sciences and Pupils' Skills Measured by Standard and Pisa Test

    NASA Astrophysics Data System (ADS)

    Djordjevic, G. S.; Pavlovic-Babic, D.

    2010-01-01

    The "High school class for students with special abilities in physics" was founded in Nis, Serbia (www.pmf.ni.ac.yu/f_odeljenje) in 2003. The basic aim of this project has been introducing a broadened curriculum of physics, mathematics, computer science, as well as chemistry and biology. Now, six years after establishing of this specialized class, and 3 years after the previous report, we present analyses of the pupils' skills in solving rather problem oriented test, as PISA test, and compare their results with the results of pupils who study under standard curricula. More precisely results are compared to the progress results of the pupils in a standard Grammar School and the corresponding classes of the Mathematical Gymnasiums in Nis. Analysis of achievement data should clarify what are benefits of introducing in school system track for gifted students. Additionally, item analysis helps in understanding and improvement of learning strategies' efficacy. We make some conclusions and remarks that may be useful for the future work that aims to increase pupils' intrinsic and instrumental motivation for physics and sciences, as well as to increase the efficacy of teaching physics and science.

  14. GENOME-WIDE COMPARATIVE ANALYSIS OF PHYLOGENETIC TREES: THE PROKARYOTIC FOREST OF LIFE

    PubMed Central

    Puigbò, Pere; Wolf, Yuri I.; Koonin, Eugene V.

    2013-01-01

    Genome-wide comparison of phylogenetic trees is becoming an increasingly common approach in evolutionary genomics, and a variety of approaches for such comparison have been developed. In this article we present several methods for comparative analysis of large numbers of phylogenetic trees. To compare phylogenetic trees taking into account the bootstrap support for each internal branch, the Boot-Split Distance (BSD) method is introduced as an extension of the previously developed Split Distance (SD) method for tree comparison. The BSD method implements the straightforward idea that comparison of phylogenetic trees can be made more robust by treating tree splits differentially depending on the bootstrap support. Approaches are also introduced for detecting tree-like and net-like evolutionary trends in the phylogenetic Forest of Life (FOL), i.e., the entirety of the phylogenetic trees for conserved genes of prokaryotes. The principal method employed for this purpose includes mapping quartets of species onto trees to calculate the support of each quartet topology and so to quantify the tree and net contributions to the distances between species. We describe the application of these methods to analyze the FOL and the results obtained with these methods. These results support the concept of the Tree of Life (TOL) as a central evolutionary trend in the FOL as opposed to the traditional view of the TOL as a ‘species tree’. PMID:22399455
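
    The abstract does not give the BSD formula, but the idea—weighting shared and unshared tree splits by their bootstrap support—can be sketched as a hypothetical support-weighted split distance (not the authors' exact definition):

    ```python
    # Each tree is represented as a map from internal-branch splits (one side
    # of the bipartition, as a frozenset of taxa) to its bootstrap support.
    def weighted_split_distance(tree_a, tree_b):
        support_sum = sum(tree_a.values()) + sum(tree_b.values())
        shared = set(tree_a) & set(tree_b)
        shared_support = sum(tree_a[s] + tree_b[s] for s in shared)
        # 0 when all well-supported splits are shared, 1 when none are.
        return 1 - shared_support / support_sum

    t1 = {frozenset({"A", "B"}): 0.95, frozenset({"A", "B", "C"}): 0.60}
    t2 = {frozenset({"A", "B"}): 0.90, frozenset({"B", "C"}): 0.30}
    print(round(weighted_split_distance(t1, t2), 3))   # 0.327
    ```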

  15. Genome-wide comparative analysis of phylogenetic trees: the prokaryotic forest of life.

    PubMed

    Puigbò, Pere; Wolf, Yuri I; Koonin, Eugene V

    2012-01-01

    Genome-wide comparison of phylogenetic trees is becoming an increasingly common approach in evolutionary genomics, and a variety of approaches for such comparison have been developed. In this article, we present several methods for comparative analysis of large numbers of phylogenetic trees. To compare phylogenetic trees taking into account the bootstrap support for each internal branch, the Boot-Split Distance (BSD) method is introduced as an extension of the previously developed Split Distance method for tree comparison. The BSD method implements the straightforward idea that comparison of phylogenetic trees can be made more robust by treating tree splits differentially depending on the bootstrap support. Approaches are also introduced for detecting tree-like and net-like evolutionary trends in the phylogenetic Forest of Life (FOL), i.e., the entirety of the phylogenetic trees for conserved genes of prokaryotes. The principal method employed for this purpose includes mapping quartets of species onto trees to calculate the support of each quartet topology and so to quantify the tree and net contributions to the distances between species. We describe the application of these methods to analyze the FOL and the results obtained with these methods. These results support the concept of the Tree of Life (TOL) as a central evolutionary trend in the FOL as opposed to the traditional view of the TOL as a "species tree."

  16. Ultrastructural analysis of different-made staplers' staples.

    PubMed

    Gentilli, S; Portigliotti, L; Aronici, M; Ferrante, D; Surico, D; Milanesio, M; Gianotti, V; Gatti, G; Addante, A; Garavoglia, M

    2012-10-01

    Recently, Chinese-made mechanical staplers with a lower price than American-made ones have been introduced into clinical practice. In the literature, small case series compare the clinical outcomes of different staplers, concluding that the new stapler devices perform as well as the American ones. The aim of this study is to compare, by ultrastructural analysis, the staples of different staplers in order to verify the existence of differences that might explain the significant price disparity and affect clinical outcomes. Each stapler was subjected to morphological analysis, energy-dispersive X-ray spectroscopy, and metal release assessment followed by inductively coupled plasma mass spectroscopy. P-values were considered statistically significant when <0.05. Autosuture staples have a square section, whereas the other American-made and the Chinese-made staples have round sections. The roughness index and presence of chips before and after ageing tests were comparable for all samples except the Ethicon Endo-Surgery stapler. Energy-dispersive X-ray spectroscopy showed that all staples are made of pure titanium except the Ethicon Endo-Surgery staples, which are made of an alloy. Metal release analysis revealed statistically significant differences between samples in simulated body fluid at 20 days (P=0.002) and in aqua regia at 14 days. Stapling devices have become routinely used in gastrointestinal surgery, mainly because they reduce operative time. In the literature, some studies compare the clinical outcomes of American-made and Chinese-made staplers in small groups of patients, but no work has considered structural differences between the traditional and new devices. In our study, for the first time, we propose a comparison between two American-made staplers and three Chinese-made staplers, evaluating morphology, metal composition, and chemical release from the staples. Our study suggests that there are some ultrastructural differences between commercially available staplers, with no correlation to price disparity. More studies are needed to confirm our results and to verify whether our findings could affect clinical outcomes.

  17. Network meta-analysis: an introduction for pharmacists.

    PubMed

    Xu, Yina; Amiche, Mohamed Amine; Tadrous, Mina

    2018-05-21

    Network meta-analysis is a new tool used to summarize and compare studies for multiple interventions, irrespective of whether these interventions have been directly evaluated against each other. Network meta-analysis is quickly becoming the standard in conducting therapeutic reviews and clinical guideline development. However, little guidance is available to help pharmacists review network meta-analysis studies in their practice. Major institutions such as the Cochrane Collaboration, Agency for Healthcare Research and Quality, Canadian Agency for Drugs and Technologies in Health, and National Institute for Health and Care Excellence Decision Support Unit have endorsed utilizing network meta-analysis to establish therapeutic evidence and inform decision making. Our objective is to introduce this novel technique to pharmacy practitioners, and highlight key assumptions behind network meta-analysis studies.

  18. Sequence data - Magnitude and implications of some ambiguities.

    NASA Technical Reports Server (NTRS)

    Holmquist, R.; Jukes, T. H.

    1972-01-01

    A stochastic model is applied to the divergence of the horse-pig lineage from a common ancestor in terms of the alpha and beta chains of hemoglobin and the fibrinopeptides. The results are compared with those based on the minimum mutation distance model of Fitch (1972). Buckwheat and cauliflower cytochrome c sequences are analyzed to demonstrate their ambiguities. A comparative analysis of evolutionary rates for various proteins of horses and pigs shows that errors of considerable magnitude are introduced by Glx and Asx ambiguities into evolutionary conclusions drawn from sequences of incompletely analyzed proteins.

  19. Effects of land use pattern on soil water in revegetation watersheds in semi-arid Chinese Loess Plateau

    NASA Astrophysics Data System (ADS)

    Yang, Lei; Chen, Liding; Wei, Wei

    2017-04-01

    Soil water stored below the rainfall infiltration depth is a reliable water resource for plant growth in arid and semi-arid regions. To reduce serious soil erosion, large-scale vegetation restoration with introduced species was initiated on the Chinese Loess Plateau in the late 1990s. However, these activities may result in excessive water consumption and soil water deficit if no appropriate scientific guidance is offered, which in turn affects regional ecological restoration and the sustainable management of water resources. In this study, soil water content data for the 0-5 m depth were obtained by long-term field observation and geostatistical methods in six small watersheds with different land use patterns. Profile characteristics and spatial-temporal patterns of soil water were compared between different land use types, hillslopes, and watersheds. The results showed that: (1) Introduced vegetation consumed an excessive amount of water compared with native grassland and farmland, and induced temporally stable soil desiccation in the 0-5 m depth. The introduced vegetation decreased soil water content in all soil layers to levels lower than the reference value representing no human impact. (2) The analysis of differences in soil water at hillslope and watershed scales indicated that land use determines the spatial and temporal variability of soil water. Soil water at the watershed scale increased with increasing area of farmland and decreased with increasing percentage of introduced vegetation. Land use structure determined the soil water condition, and land use pattern determined the spatial-temporal variability of soil water at the watershed scale. (3) Large-scale revegetation with introduced vegetation diminished the spatial heterogeneity of soil water at different scales. Land use pattern adjustment could be used to improve water resources management and maintain the sustainability of vegetation restoration.

  20. Lobbying Reform: Background and Legislative Proposals, 109th Congress

    DTIC Science & Technology

    2006-03-23

    activities have also been linked to campaign finance practices, congressional procedures regarding the acceptance of gifts from lobbyists, and the inclusion...Introduced in the 109th Congress: A Comparative Analysis, by R. Eric Petersen; and CRS Report RL33237, Congressional Gifts and Travel, Legislative...linked to other activities carried out by lobbyists. These include campaign finance practices, congressional rules regarding the acceptance of gifts

  1. Music in Higher Education after the Bologna Treaty: Or, in Search of a New Educational Culture

    ERIC Educational Resources Information Center

    Mota, Graca

    2012-01-01

    This paper aims to introduce a critical reflection on the field of music education in higher education, using the Bologna Declaration and the European context as a backdrop. However, the author would like to clarify that she does not intend to develop a thorough comparative analysis of music education in European countries. In fact, this is being…

  2. Bidirectional reflectance distribution function measurements and analysis of retroreflective materials.

    PubMed

    Belcour, Laurent; Pacanowski, Romain; Delahaie, Marion; Laville-Geay, Aude; Eupherte, Laure

    2014-12-01

    We compare the performance of various analytical retroreflecting bidirectional reflectance distribution function (BRDF) models to assess how they reproduce accurately measured data of retroreflecting materials. We introduce a new parametrization, the back vector parametrization, to analyze retroreflecting data, and we show that this parametrization better preserves the isotropy of data. Furthermore, we update existing BRDF models to improve the representation of retroreflective data.

  3. Analysis of Train Suspension System Using MR dampers

    NASA Astrophysics Data System (ADS)

    RamaSastry, DVA; Ramana, K. V.; Mohan Rao, N.; Siva Kumar, SVR; Priyanka, T. G. L.

    2016-09-01

    This paper deals with introducing MR dampers into a train suspension system to improve passenger ride comfort. This is a semi-active suspension system, which exploits the properties of MR fluid to damp vibrations. In high-speed trains, the coach body is subjected to vibrations due to vertical displacement, yaw, and pitch movements. When the body receives these disturbances from the ground, the transmission of vibrations to the passenger increases, which affects ride comfort. In this work, the equations of motion of the suspension system are developed for both a conventional passive system and a semi-active system, modelled in Matlab/Simulink, and analyzed. The passive suspension analysis shows that it takes longer to damp the vibrations and, at the same time, the transmissibility of vibrations is higher. Introducing MR dampers, the vertical and angular displacements of the body are computed and compared. The results show that introducing MR dampers into the train suspension system improves ride comfort.
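
    A quarter-car sketch of the semi-active idea, with an on-off skyhook rule standing in for an MR damper controller; the masses, rates, and road input below are illustrative, not rail-vehicle values from the paper.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    m, k = 500.0, 2.0e4                   # sprung mass (kg), spring rate (N/m)
    c_lo, c_hi = 500.0, 4000.0            # MR damper limits (N·s/m), illustrative

    def road(t):                          # a single bump as the ground input
        return 0.05 * np.exp(-((t - 1.0) / 0.1) ** 2)

    def road_vel(t):                      # analytic derivative of the bump
        return -200.0 * (t - 1.0) * road(t)

    def dyn(t, y):
        x, v = y                          # body displacement and velocity
        rel_v = v - road_vel(t)           # velocity across the damper
        # Skyhook switching: high damping only when it dissipates body motion.
        c = c_hi if v * rel_v > 0 else c_lo
        return [v, (-k * (x - road(t)) - c * rel_v) / m]

    sol = solve_ivp(dyn, (0, 5), [0.0, 0.0], max_step=0.005)
    print(f"peak body displacement: {np.abs(sol.y[0]).max() * 1000:.1f} mm")
    ```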

  4. Recent advances in methods for the analysis of protein o-glycosylation at proteome level.

    PubMed

    You, Xin; Qin, Hongqiang; Ye, Mingliang

    2018-01-01

    O-Glycosylation, which refers to the glycosylation of the hydroxyl group of the side chains of serine/threonine/tyrosine residues, is one of the most common post-translational modifications. Compared with N-linked glycosylation, O-glycosylation is less explored because of its complex structure and relatively low abundance. Recently, O-glycosylation has drawn more and more attention for its various functions in many sophisticated biological processes. To obtain a deep understanding of O-glycosylation, many efforts have been devoted to developing effective strategies to analyze the two most abundant types of O-glycosylation, i.e., O-N-acetylgalactosamine and O-N-acetylglucosamine glycosylation. In this review, we summarize the proteomics workflows used to analyze these two types of O-glycosylation. For the large-scale analysis of mucin-type glycosylation, glycan simplification strategies, including the "SimpleCell" technology, are introduced. A variety of enrichment methods, including lectin affinity chromatography, hydrophilic interaction chromatography, hydrazide chemistry, and chemoenzymatic methods, are introduced for the proteomics analysis of O-N-acetylgalactosamine and O-N-acetylglucosamine glycosylation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. AHRQ series paper 1: comparing medical interventions: AHRQ and the effective health-care program.

    PubMed

    Slutsky, Jean; Atkins, David; Chang, Stephanie; Sharp, Beth A Collins

    2010-05-01

    In 2005, the Agency for Healthcare Research and Quality established the Effective Health Care (EHC) Program. The EHC Program aims to provide understandable and actionable information for patients, clinicians, and policy makers. The Evidence-based Practice Centers (EPCs) are one of the cornerstones of the EHC Program. Three key elements guide the EHC Program and thus the conduct of Comparative Effectiveness Reviews by the EPCs. Comparative Effectiveness Reviews introduce several specific challenges in addition to the familiar issues raised in a systematic review or meta-analysis of a single intervention. The articles in this series together form the current Methods Guide for Comparative Effectiveness Reviews of the EHC Program.

  6. Electroosmotic pumps for microflow analysis

    PubMed Central

    Wang, Xiayan; Wang, Shili; Gendhar, Brina; Cheng, Chang; Byun, Chang Kyu; Li, Guanbin; Zhao, Meiping; Liu, Shaorong

    2009-01-01

    With rapid development in microflow analysis, electroosmotic pumps are receiving increasing attention. Compared to other micropumps, electroosmotic pumps have several unique features. For example, they are bi-directional, can generate constant and pulse-free flows with flow rates well suited to microanalytical systems, and can be readily integrated with lab-on-chip devices. The magnitude and the direction of flow of an electroosmotic pump can be changed instantly. In addition, electroosmotic pumps have no moving parts. In this article, we discuss common features, introduce fabrication technologies and highlight applications of electroosmotic pumps. PMID:20047021

  7. Note: Quasi-real-time analysis of dynamic near field scattering data using a graphics processing unit

    NASA Astrophysics Data System (ADS)

    Cerchiari, G.; Croccolo, F.; Cardinaux, F.; Scheffold, F.

    2012-10-01

    We present an implementation of the analysis of dynamic near field scattering (NFS) data using a graphics processing unit. We introduce an optimized data management scheme thereby limiting the number of operations required. Overall, we reduce the processing time from hours to minutes, for typical experimental conditions. Previously the limiting step in such experiments, the processing time is now comparable to the data acquisition time. Our approach is applicable to various dynamic NFS methods, including shadowgraph, Schlieren and differential dynamic microscopy.
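
    As an illustration of the underlying computation, the sketch below evaluates the differential structure function at a few lags with plain NumPy on the CPU; the paper's contribution is an optimized GPU data-management scheme for this step, which is not reproduced here.

    ```python
    import numpy as np

    def ddm_structure_function(frames, lags):
        """Structure function D(q, dt) = < |FFT2(I(t+dt) - I(t))|^2 >_t,
        the core quantity in differential dynamic microscopy / NFS analysis.
        frames: (T, H, W) image stack; lags: iterable of frame lags."""
        T = frames.shape[0]
        out = {}
        for dt in lags:
            diffs = frames[dt:] - frames[:T - dt]      # all frame pairs at lag dt
            spectra = np.abs(np.fft.fft2(diffs)) ** 2  # |FFT of each difference|^2
            out[dt] = spectra.mean(axis=0)             # average over time origins
        return out

    # Example: synthetic noise stack standing in for measured shadowgraph data
    frames = np.random.rand(64, 128, 128).astype(np.float32)
    D = ddm_structure_function(frames, lags=[1, 2, 4, 8])
    print({dt: float(D[dt].mean()) for dt in D})
    ```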

  8. The effects of videotape modeling on staff acquisition of functional analysis methodology.

    PubMed

    Moore, James W; Fisher, Wayne W

    2007-01-01

    Lectures and two types of video modeling were compared to determine their relative effectiveness in training 3 staff members to conduct functional analysis sessions. Video modeling that contained a larger number of therapist exemplars resulted in mastery-level performance eight of the nine times it was introduced, whereas neither lectures nor partial video modeling produced significant improvements in performance. Results demonstrated that video modeling provided an effective training strategy but only when a wide range of exemplars of potential therapist behaviors were depicted in the videotape.

  9. The Effects of Videotape Modeling on Staff Acquisition of Functional Analysis Methodology

    PubMed Central

    Moore, James W; Fisher, Wayne W

    2007-01-01

    Lectures and two types of video modeling were compared to determine their relative effectiveness in training 3 staff members to conduct functional analysis sessions. Video modeling that contained a larger number of therapist exemplars resulted in mastery-level performance eight of the nine times it was introduced, whereas neither lectures nor partial video modeling produced significant improvements in performance. Results demonstrated that video modeling provided an effective training strategy but only when a wide range of exemplars of potential therapist behaviors were depicted in the videotape. PMID:17471805

  10. Methods for comparative metagenomics

    PubMed Central

    Huson, Daniel H; Richter, Daniel C; Mitra, Suparna; Auch, Alexander F; Schuster, Stephan C

    2009-01-01

    Background Metagenomics is a rapidly growing field of research that aims at studying uncultured organisms to understand the true diversity of microbes, their functions, cooperation and evolution, in environments such as soil, water, ancient remains of animals, or the digestive system of animals and humans. The recent development of ultra-high throughput sequencing technologies, which do not require cloning or PCR amplification, and can produce huge numbers of DNA reads at an affordable cost, has boosted the number and scope of metagenomic sequencing projects. Increasingly, there is a need for new ways of comparing multiple metagenomics datasets, and for fast and user-friendly implementations of such approaches. Results This paper introduces a number of new methods for interactively exploring, analyzing and comparing multiple metagenomic datasets, which will be made freely available in a new, comparative version 2.0 of the stand-alone metagenome analysis tool MEGAN. Conclusion There is a great need for powerful and user-friendly tools for comparative analysis of metagenomic data and MEGAN 2.0 will help to fill this gap. PMID:19208111

  11. Design and Static Analysis of Airless Tyre to Reduce Deformation

    NASA Astrophysics Data System (ADS)

    Mathew, Nibin Jacob; Sahoo, Dillip Kumar; Mithun Chakravarthy, E.

    2017-05-01

    In this work, a model of an airless tyre is introduced in which natural rubber replaces synthetic rubber in the tread and polyester replaces nylon in the carcass. The construction and materials of various types of airless tyre are studied by comparison with the pneumatic tyre. A brief structural study has been carried out on the spokes of the airless tyre and analyzed with ANSYS software. The analysis covered several structures, namely honeycomb, spoke, triangular and diamond, under an applied load of 1200 N. A comparison among the structures with different materials shows that the tyre with the diamond structure and synthetic materials gives less deformation than the other structures.

  12. Temporal correlations in the Vicsek model with vectorial noise

    NASA Astrophysics Data System (ADS)

    Gulich, Damián; Baglietto, Gabriel; Rozenfeld, Alejandro F.

    2018-07-01

    We study the temporal correlations in the evolution of the order parameter ϕ(t) for the Vicsek model with vectorial noise by estimating its Hurst exponent H with detrended fluctuation analysis (DFA). We present results on this parameter as a function of the noise amplitude η introduced in the simulations, and compare them with the well-known order-disorder phase transition over the same noise range. We find that - regardless of detrending degree - H spikes at the known coexistence noise of the phase transition, and that this is due to nonstationarities introduced by the transit of the system between two well defined states with lower exponents. We statistically support this claim by successfully synthesizing equivalent cases derived from a transformed fractional Brownian motion (TfBm).
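
    A minimal DFA sketch of the kind used to estimate H is given below; it follows the standard profile-integration and polynomial-detrending recipe and is not the authors' code.

    ```python
    import numpy as np

    def dfa_hurst(x, scales=None, order=1):
        """Estimate the Hurst exponent of series x by detrended fluctuation
        analysis, with polynomial detrending of the given order."""
        x = np.asarray(x, dtype=float)
        y = np.cumsum(x - x.mean())                  # integrated profile
        if scales is None:
            scales = np.unique(
                np.logspace(1, np.log10(len(x) // 4), 20).astype(int))
        flucts = []
        for s in scales:
            n_seg = len(y) // s
            segs = y[:n_seg * s].reshape(n_seg, s)
            t = np.arange(s)
            f2 = []
            for seg in segs:
                coef = np.polyfit(t, seg, order)     # local polynomial trend
                f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
            flucts.append(np.sqrt(np.mean(f2)))
        H, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
        return H

    print(dfa_hurst(np.random.randn(10000)))  # white noise: H close to 0.5
    ```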

  13. Optimization of location routing inventory problem with transshipment

    NASA Astrophysics Data System (ADS)

    Ghani, Nor Edayu Abd; Shariff, S. Sarifah Radiah; Zahari, Siti Meriam

    2015-05-01

    Location Routing Inventory Problem (LRIP) integrates three components of the supply chain: location-allocation, vehicle routing and inventory management. The aim of the study is to minimize the total system cost in the supply chain. Transshipment is introduced in order to allow products to be shipped to a customer who experiences a shortage, either directly from the supplier or from another customer. In this study, LRIP is extended with transshipment (LRIPT), and customers act as the transshipment points. We select the transshipment points using the p-center approach and present the results for two divisions of cases. Based on the analysis, LRIPT performed well compared to LRIP.

  14. Power scaling limits in high power fiber amplifiers due to transverse mode instability, thermal lensing, and fiber mechanical reliability

    NASA Astrophysics Data System (ADS)

    Zervas, Michalis N.

    2018-02-01

    We introduced a simple formula providing the mode-field diameter shrinkage due to heat load in fiber amplifiers, and used it to compare the traditional thermal-lensing power limit (PTL) to a newly developed transverse-mode instability (TMI) power limit (PTMI), giving a fixed ratio of PTMI/PTL ≈ 0.6, in very good agreement with experiment. Using a failure-in-time analysis, we also introduced a new power limiting factor due to the mechanical reliability of bent fibers. For diode (tandem) pumping, power limits of 28 kW (52 kW) are predicted. Setting a practical limit on the maximum core diameter of 35 μm, the limits reduce to 15 kW (25 kW).

  15. Introducing MASC: a movie for the assessment of social cognition.

    PubMed

    Dziobek, Isabel; Fleck, Stefan; Kalbe, Elke; Rogers, Kimberley; Hassenstab, Jason; Brand, Matthias; Kessler, Josef; Woike, Jan K; Wolf, Oliver T; Convit, Antonio

    2006-07-01

    In the present study we introduce a sensitive video-based test for the evaluation of subtle mindreading difficulties: the Movie for the Assessment of Social Cognition (MASC). This new mindreading tool involves watching a short film and answering questions referring to the actors' mental states. A group of adults with Asperger syndrome (n = 19) and well-matched control subjects (n = 20) were administered the MASC and three other mindreading tools as part of a broader neuropsychological testing session. Compared to control subjects, Asperger individuals exhibited marked and selective difficulties in social cognition. A Receiver Operating Characteristic (ROC) analysis for the mindreading tests identified the MASC as discriminating the diagnostic groups most accurately. Issues pertaining to the multidimensionality of the social cognition construct are discussed.

  16. [Immobilization of introduced bacteria and degradation of pyrene and benzo(alpha) pyrene in soil by immobilized bacteria].

    PubMed

    Wang, Xin; Li, Peijun; Song, Shouzhi; Zhong, Yong; Zhang, Hui; Verkhozina, E V

    2006-11-01

    In this study, introduced bacteria were applied to the bioremediation of pyrene and benzo(alpha)pyrene in organic pollutant-contaminated soils, in order to test whether introducing bacteria is feasible in environmental engineering. Three introduced bacteria were immobilized separately or together to degrade pyrene and benzo(alpha)pyrene in soil, with dissociated bacteria as the control and three indigenous bacteria for comparison. The results showed that immobilized introduced bacteria, either single or mixed, had higher degradation efficiency than dissociated bacteria. Compared with indigenous bacteria, some introduced bacteria showed an advantage to some degree. The introduced bacteria mixture had better degradation efficiency after immobilization. The degradation rates of pyrene and benzo(alpha)pyrene after treatment with the immobilized bacteria mixture (B61-B67) for 96 hours were 43.49% and 38.55%, respectively.

  17. Lattice Independent Component Analysis for Mobile Robot Localization

    NASA Astrophysics Data System (ADS)

    Villaverde, Ivan; Fernandez-Gauna, Borja; Zulueta, Ekaitz

    This paper introduces an approach to appearance-based mobile robot localization using Lattice Independent Component Analysis (LICA). The Endmember Induction Heuristic Algorithm (EIHA) is used to select a set of Strong Lattice Independent (SLI) vectors, which can be assumed to be affine independent and are therefore candidates to be the endmembers of the data. Selected endmembers are used to compute the linear unmixing of the robot's acquired images. The resulting mixing coefficients are used as feature vectors for view recognition through classification. We show on a sample path experiment that our approach can recognise the location of the robot, and we compare the results with Independent Component Analysis (ICA).

  18. Complex network analysis of conventional and Islamic stock market in Indonesia

    NASA Astrophysics Data System (ADS)

    Rahmadhani, Andri; Purqon, Acep; Kim, Sehyun; Kim, Soo Yong

    2015-09-01

    The rising popularity of Islamic financial products in Indonesia has recently become an interesting new topic of analysis. We introduce a complex network analysis to compare the conventional and Islamic stock markets in Indonesia, with Random Matrix Theory (RMT) added as a complementary reference to expand the analysis of the results. Both are based on the cross-correlation matrix of logarithmic price returns. Closing price data taken from June 2011 to July 2012 are used to construct the logarithmic price returns. We also introduce a threshold value, using a winner-take-all approach, to obtain the scale-free property of the network: nodes whose cross-correlation coefficient falls below the threshold are not connected by an edge. As a result, we obtain 0.5 as the threshold value for both stock markets. From the RMT analysis, we find only a market-wide effect in both stock markets, with no clustering effect found yet. From the network analysis, both stock market networks are dominated by the mining sector. The closing price time series should be extended to obtain more reliable results and possibly reveal different behaviors of the system.
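
    The network construction described above can be sketched as follows: compute the cross-correlation matrix of log returns, then keep only edges whose coefficient reaches the threshold (0.5 here, matching the value reported). The toy price data and the use of networkx are illustrative assumptions.

    ```python
    import numpy as np
    import networkx as nx

    def correlation_network(prices, threshold=0.5):
        """Winner-take-all correlation network from closing prices.
        prices: (T, N) array, one column per stock."""
        returns = np.diff(np.log(prices), axis=0)     # logarithmic price returns
        corr = np.corrcoef(returns, rowvar=False)     # N x N cross-correlation
        adj = (corr >= threshold) & ~np.eye(corr.shape[0], dtype=bool)
        return nx.from_numpy_array(adj.astype(int))

    # Toy data standing in for 30 stocks over 250 trading days
    prices = np.cumprod(1 + 0.01 * np.random.randn(250, 30), axis=0)
    G = correlation_network(prices, threshold=0.5)
    print(G.number_of_nodes(), "nodes,", G.number_of_edges(), "edges")
    ```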

  19. Recent development of electrochemiluminescence sensors for food analysis.

    PubMed

    Hao, Nan; Wang, Kun

    2016-10-01

    Food quality and safety are closely related to human health. In the face of unceasing food safety incidents, various analytical techniques, such as mass spectrometry, chromatography, spectroscopy, and electrochemistry, have been applied in food analysis. High sensitivity usually requires expensive instruments and complicated procedures. Although these modern analytical techniques are sensitive enough to ensure food safety, sometimes their applications are limited because of the cost, usability, and speed of analysis. Electrochemiluminescence (ECL) is a powerful analytical technique that is attracting more and more attention because of its outstanding performance. In this review, the mechanisms of ECL and common ECL luminophores are briefly introduced. Then an overall review of the principles and applications of ECL sensors for food analysis is provided. ECL can be flexibly combined with various separation techniques. Novel materials (e.g., various nanomaterials) and strategies (e.g., immunoassay, aptasensors, and microfluidics) have been progressively introduced into the design of ECL sensors. By illustrating some selected representative works, we summarize the state of the art in the development of ECL sensors for toxins, heavy metals, pesticides, residual drugs, illegal additives, viruses, and bacterias. Compared with other methods, ECL can provide rapid, low-cost, and sensitive detection for various food contaminants in complex matrixes. However, there are also some limitations and challenges. Improvements suited to the characteristics of food analysis are still necessary.

  20. Citation Analysis and Discourse Analysis Revisited

    ERIC Educational Resources Information Center

    White, Howard D.

    2004-01-01

    John Swales's 1986 article "Citation analysis and discourse analysis" was written by a discourse analyst to introduce citation research from other fields, mainly sociology of science, to his own discipline. Here, I introduce applied linguists and discourse analysts to citation studies from information science, a complementary tradition not…

  1. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
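
    As a toy illustration of such a reduced representation, the sketch below maps each time series to a short feature vector and compares series by distance in feature space; the actual library operates with thousands of features and methods.

    ```python
    import numpy as np

    def feature_vector(x):
        """Toy reduced representation of a time series: a handful of simple
        properties (the paper's collection uses thousands of operations)."""
        x = np.asarray(x, float)
        dx = np.diff(x)
        return np.array([
            x.mean(), x.std(),
            np.corrcoef(x[:-1], x[1:])[0, 1],   # lag-1 autocorrelation
            ((dx[:-1] * dx[1:]) < 0).mean(),    # turning-point rate
        ])

    rng = np.random.default_rng(0)
    noise = rng.standard_normal(1000)
    walk = np.cumsum(rng.standard_normal(1000))
    f_noise, f_walk = feature_vector(noise), feature_vector(walk)
    # Series (or methods) can then be organized by distances between vectors
    print(np.linalg.norm(f_noise - f_walk))
    ```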

  2. Highly comparative time-series analysis: the empirical structure of time series and their methods

    PubMed Central

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344

  3. Lessons from innovation in drug-device combination products.

    PubMed

    Couto, Daniela S; Perez-Breva, Luis; Saraiva, Pedro; Cooney, Charles L

    2012-01-01

    Drug-device combination products introduced a new dynamic on medical product development, regulatory approval, and corporate interaction that provide valuable lessons for the development of new generations of combination products. This paper examines the case studies of drug-eluting stents and transdermal patches to facilitate a detailed understanding of the challenges and opportunities introduced by combination products when compared to previous generations of traditional medical or drug delivery devices. Our analysis indicates that the largest barrier to introduce a new kind of combination products is the determination of the regulatory center that is to oversee its approval. The first product of a new class of combination products offers a learning opportunity for the regulator and the sponsor. Once that first product is approved, the leading regulatory center is determined, and the uncertainty about the entire class of combination products is drastically reduced. The sponsor pioneering a new class of combination products assumes a central role in reducing this uncertainty by advising the decision on the primary function of the combination product. Our analysis also suggests that this decision influences the nature (pharmaceutical, biotechnology, or medical devices) of the companies that will lead the introduction of these products into the market, and guide the structure of corporate interaction thereon. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. [Hospitality workers' exposure to environmental tobacco smoke before and after implementation of smoking ban in public places: a review of epidemiological studies].

    PubMed

    Polańska, Kinga; Hanke, Wojciech; Konieczko, Katarzyna

    2011-01-01

    Environmental tobacco smoke (ETS) exposure induces serious negative health consequences, of which the increased risk of cardiovascular diseases, cancer, respiratory symptoms and poor pregnancy outcomes appear to be most important. Taking into account those health consequences of ETS exposure most countries have introduced legislation to ban or restrict smoking in public places. In this paper the effectiveness of the introduced legislation was analyzed with regard to the protection of hospitality workers from ETS exposure in the workplace. The analysis of 12 papers published after 2000 covered the year of publication, type of legislation, study population, hospitality venue (pub, bar, restaurant, disco) and type of markers or self-reported perception of exposure to ETS. The analysis indicates that the legislation to ban smoking in hospitality venues protects workers from ETS exposure when the venues are 100% tobacco smoke free. The reduction of the cotinine level in biological samples after the implementation of smoke free law was 57-89%, comparing to the biomarker level in the samples taken before the new law was introduced. About 90% of reduction in nicotine and PM levels was also noted. In addition, the positive self perception reported by workers proved the effectiveness of new legislation protecting them from ETS exposure.

  5. Application of the Allan Variance to Time Series Analysis in Astrometry and Geodesy: A Review.

    PubMed

    Malkin, Zinovy

    2016-04-01

    The Allan variance (AVAR) was introduced 50 years ago as a statistical tool for assessing the frequency deviations of frequency standards. Over the past decades, AVAR has increasingly been used in geodesy and astrometry to assess the noise characteristics of geodetic and astrometric time series. A specific feature of astrometric and geodetic measurements, as compared with clock measurements, is that they are generally associated with uncertainties; thus, an appropriate weighting should be applied during data analysis. In addition, some physically connected scalar time series naturally form series of multidimensional vectors. For example, the three station coordinate time series X, Y, and Z can be combined to analyze 3-D station position variations. The classical AVAR is not intended for processing unevenly weighted and/or multidimensional data. Therefore, AVAR modifications, namely weighted AVAR (WAVAR), multidimensional AVAR (MAVAR), and weighted multidimensional AVAR (WMAVAR), were introduced to overcome these deficiencies. In this paper, a brief review is given of the experience of using AVAR and its modifications in processing astrogeodetic time series.
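
    A minimal sketch of the classical AVAR, together with one plausible weighted variant, is given below; the exact WAVAR/MAVAR definitions used in astrometry and geodesy follow the review, and the particular weighting shown here is an assumption for illustration only.

    ```python
    import numpy as np

    def avar(y):
        """Classical Allan variance of an evenly sampled series y:
        AVAR = 1/(2(N-1)) * sum (y_{i+1} - y_i)^2."""
        d = np.diff(y)
        return 0.5 * np.mean(d ** 2)

    def wavar(y, sigma):
        """Weighted Allan variance: each first difference is weighted by the
        inverse of its propagated measurement variance (one plausible choice;
        see the review for the definitions actually used in the field)."""
        d = np.diff(y)
        w = 1.0 / (sigma[1:] ** 2 + sigma[:-1] ** 2)
        return 0.5 * np.sum(w * d ** 2) / np.sum(w)

    y = np.random.randn(1000)          # toy coordinate residuals
    s = np.full_like(y, 0.5)           # per-epoch uncertainties
    print(avar(y), wavar(y, s))
    ```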

  6. Navigating the complexities of qualitative comparative analysis: case numbers, necessity relations, and model ambiguities.

    PubMed

    Thiem, Alrik

    2014-12-01

    In recent years, the method of Qualitative Comparative Analysis (QCA) has been enjoying increasing levels of popularity in evaluation and directly neighboring fields. Its holistic approach to causal data analysis resonates with researchers whose theories posit complex conjunctions of conditions and events. However, due to QCA's relative immaturity, some of its technicalities and objectives have not yet been well understood. In this article, I seek to raise awareness of six pitfalls of employing QCA with regard to the following three central aspects: case numbers, necessity relations, and model ambiguities. Most importantly, I argue that case numbers are irrelevant to the methodological choice of QCA or any of its variants, that necessity is not as simple a concept as it has been suggested by many methodologists, and that doubt must be cast on the determinacy of virtually all results presented in past QCA research. By means of empirical examples from published articles, I explain the background of these pitfalls and introduce appropriate procedures, partly with reference to current software, that help avoid them. QCA carries great potential for scholars in evaluation and directly neighboring areas interested in the analysis of complex dependencies in configurational data. If users beware of the pitfalls introduced in this article, and if they avoid mechanistic adherence to doubtful "standards of good practice" at this stage of development, then research with QCA will gain in quality, as a result of which a more solid foundation for cumulative knowledge generation and well-informed policy decisions will also be created. © The Author(s) 2014.

  7. Analysis of transient fission gas behaviour in oxide fuel using BISON and TRANSURANUS

    NASA Astrophysics Data System (ADS)

    Barani, T.; Bruschi, E.; Pizzocri, D.; Pastore, G.; Van Uffelen, P.; Williamson, R. L.; Luzzi, L.

    2017-04-01

    The modelling of fission gas behaviour is a crucial aspect of nuclear fuel performance analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. In particular, experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of the burst release process in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of conventional diffusion-based models to introduce the burst release effect. The concept and governing equations of the model are presented, and the sensitivity of results to the newly introduced parameters is evaluated through an analytic sensitivity analysis. The model is assessed for application to integral fuel rod analysis by implementation in two structurally different fuel performance codes: BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D code). Model assessment is based on the analysis of 19 light water reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the quantitative predictions of integral fuel rod FGR and the qualitative representation of the FGR kinetics with the transient model relative to the canonical, purely diffusion-based models of the codes. The overall quantitative improvement of the integral FGR predictions in the two codes is comparable. Moreover, calculated radial profiles of xenon concentration after irradiation are investigated and compared to experimental data, illustrating the underlying representation of the physical mechanisms of burst release.

  8. Multiscale analysis of neural spike trains.

    PubMed

    Ramezan, Reza; Marriott, Paul; Chenouri, Shojaeddin

    2014-01-30

    This paper studies the multiscale analysis of neural spike trains, through both graphical and Poisson process approaches. We introduce the interspike interval plot, which simultaneously visualizes characteristics of neural spiking activity at different time scales. Using an inhomogeneous Poisson process framework, we discuss multiscale estimates of the intensity functions of spike trains. We also introduce the windowing effect for two multiscale methods. Using quasi-likelihood, we develop bootstrap confidence intervals for the multiscale intensity function. We provide a cross-validation scheme, to choose the tuning parameters, and study its unbiasedness. Studying the relationship between the spike rate and the stimulus signal, we observe that adjusting for the first spike latency is important in cross-validation. We show, through examples, that the correlation between spike trains and spike count variability can be multiscale phenomena. Furthermore, we address the modeling of the periodicity of the spike trains caused by a stimulus signal or by brain rhythms. Within the multiscale framework, we introduce intensity functions for spike trains with multiplicative and additive periodic components. Analyzing a dataset from the retinogeniculate synapse, we compare the fit of these models with the Bayesian adaptive regression splines method and discuss the limitations of the methodology. Computational efficiency, which is usually a challenge in the analysis of spike trains, is one of the highlights of these new models. In an example, we show that the reconstruction quality of a complex intensity function demonstrates the ability of the multiscale methodology to crack the neural code. Copyright © 2013 John Wiley & Sons, Ltd.

  9. Dynamic clustering detection through multi-valued descriptors of dermoscopic images.

    PubMed

    Cozza, Valentina; Guarracino, Maria Rosario; Maddalena, Lucia; Baroni, Adone

    2011-09-10

    This paper introduces a dynamic clustering methodology based on multi-valued descriptors of dermoscopic images. The main idea is to support medical diagnosis in deciding whether pigmented skin lesions belonging to an uncertain set are nearer to malignant melanoma or to benign nevi. Melanoma is the most deadly skin cancer, and early diagnosis is a current challenge for clinicians. Most data analysis algorithms for skin lesion discrimination focus on segmentation and extraction of features of categorical or numerical type. As an alternative approach, this paper introduces two new concepts: first, it considers multi-valued data, in which variables are described not only by scalars but also by intervals or histograms; second, it introduces a dynamic clustering method based on the Wasserstein distance to compare multi-valued data. The overall strategy of analysis can be summarized in the following steps: first, segmentation of the dermoscopic images identifies a set of multi-valued descriptors; second, a discriminant analysis on a set of images with an a priori classification detects which features discriminate benign from malignant lesions; and third, the proposed dynamic clustering method is applied to the uncertain cases, which need to be assigned to one of the two groups. Results based on clinical data show that grading specific descriptors associated with dermoscopic characteristics provides a novel way to characterize uncertain lesions that can support the dermatologist's diagnosis. Copyright © 2011 John Wiley & Sons, Ltd.
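
    For histogram-valued descriptors, the comparison step reduces to a Wasserstein distance between one-dimensional distributions, which SciPy computes directly from bin centres and weights. The two histograms below are invented stand-ins for image descriptors.

    ```python
    import numpy as np
    from scipy.stats import wasserstein_distance

    # Two histogram-valued descriptors of the same image feature
    # (bin centres plus counts); values are illustrative only.
    bins = np.linspace(0, 1, 11)
    centres = 0.5 * (bins[:-1] + bins[1:])
    hist_a = np.array([1, 2, 5, 9, 12, 9, 5, 2, 1, 0], dtype=float)
    hist_b = np.array([0, 1, 2, 4, 8, 12, 9, 6, 3, 1], dtype=float)

    # 1D Wasserstein (earth mover's) distance between the two histograms,
    # the dissimilarity underlying the dynamic clustering of lesions
    d = wasserstein_distance(centres, centres,
                             u_weights=hist_a, v_weights=hist_b)
    print(f"Wasserstein distance: {d:.4f}")
    ```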

  10. The Indigenous and Exogenous Aspects of Moral Education: A Comparative Analysis of the U.S. Military Occupation in Japan and Germany after World War II.

    ERIC Educational Resources Information Center

    Shibata, Masako

    During the U.S. military occupation of Japan after World War II, few sectors of Japanese society were left untouched. Reforms during the occupation included education, religion, moral values, and gender relations. By contrast, in Germany, except in the Soviet-controlled zone, no radical changes were introduced in the education system during the…

  11. Simulation based optimization on automated fibre placement process

    NASA Astrophysics Data System (ADS)

    Lei, Shi

    2018-02-01

    In this paper, a software-simulation-based method (Autodesk TruPlan & TruFiber) is proposed to optimize the automated fibre placement (AFP) process. Different types of manufacturability analysis are introduced to predict potential defects. Advanced fibre path generation algorithms are compared with respect to geometrically different parts. Major manufacturing data have been taken into consideration prior to tool path generation to achieve a high manufacturing success rate.

  12. Coherence-Based Modeling of Cultural Change and Political Violence

    DTIC Science & Technology

    2010-08-31

    the classic sociologist Emile Durkheim. The grid/group concept was introduced to the risk analysis community in 1982 by a book Douglas wrote with...has been compared at times to the pioneering concepts of integration and regulation by the 19th-century sociologist Emile Durkheim (1997 [1897]), for... Durkheim, Emile, Suicide (New York: Free Press, 1897; reissue edition, 1997). Dufwenberg, Martin, and Georg Kirchsteiger, 1998. A Theory of...

  13. Improved mapping of the travelling salesman problem for quantum annealing

    NASA Astrophysics Data System (ADS)

    Troyer, Matthias; Heim, Bettina; Brown, Ethan; Wecker, David

    2015-03-01

    We consider the quantum adiabatic algorithm as applied to the travelling salesman problem (TSP). We introduce a novel mapping of TSP to an Ising spin glass Hamiltonian and compare it to previously known mappings. Through direct perturbative analysis, unitary evolution, and simulated quantum annealing, we show this new mapping to be significantly superior. We discuss how this advantage can translate to actual physical implementations of TSP on quantum annealers.
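
    The abstract does not spell out the new mapping, but the baseline it improves on is the conventional permutation-matrix QUBO encoding, sketched below: n^2 binary variables x[c, t], penalty terms enforcing one city per step and one step per city, plus the tour-length objective. The penalty weight choice is a common heuristic, not the paper's.

    ```python
    import numpy as np

    def tsp_qubo(dist, A=None):
        """Conventional permutation-matrix QUBO for an n-city TSP:
        x[c, t] = 1 if city c is visited at step t. Penalty weight A
        enforces the row/column constraints (this is the baseline the
        paper's improved mapping is compared against)."""
        n = dist.shape[0]
        if A is None:
            A = 2.0 * float(dist.max())
        idx = lambda c, t: c * n + t
        Q = np.zeros((n * n, n * n))
        for c in range(n):                      # each city visited exactly once
            for t in range(n):
                Q[idx(c, t), idx(c, t)] -= 2 * A
                for t2 in range(n):
                    Q[idx(c, t), idx(c, t2)] += A
        for t in range(n):                      # each step used exactly once
            for c in range(n):
                Q[idx(c, t), idx(c, t)] -= 2 * A
                for c2 in range(n):
                    Q[idx(c, t), idx(c2, t)] += A
        for t in range(n):                      # tour-length objective
            for c in range(n):
                for c2 in range(n):
                    if c != c2:
                        Q[idx(c, t), idx(c2, (t + 1) % n)] += dist[c, c2]
        return Q

    dist = np.array([[0, 2, 9], [2, 0, 6], [9, 6, 0]], dtype=float)
    print(tsp_qubo(dist).shape)  # (9, 9) QUBO matrix for 3 cities
    ```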

  14. Improving finite element results in modeling heart valve mechanics.

    PubMed

    Earl, Emily; Mohammadi, Hadi

    2018-06-01

    Finite element analysis is a well-established computational tool for the analysis of soft tissue mechanics. Due to the structural complexity of heart valve leaflet tissue, currently available finite element models do not represent the leaflet tissue adequately. One way of addressing this issue is to use computationally expensive finite element models characterized by precise constitutive models and high-order, high-density mesh techniques. In this study, we introduce a novel numerical technique that enhances the results obtained from coarse-mesh finite element models to provide accuracy comparable to that of fine-mesh models while maintaining a relatively low computational cost. The method reduces the computational expense required to solve the linear and nonlinear constitutive models commonly used in heart valve mechanics simulations while continuing to account for both large and infinitesimal deformations. This continuum model is developed from a least-squares algorithm coupled with the finite difference method, under the assumption that the components of the strain tensor are available at all nodes of the finite element mesh. The suggested numerical technique is easy to implement, practically efficient, and requires less computational time than currently available commercial finite element packages such as ANSYS and/or ABAQUS.

  15. Fuzzy decision-making framework for treatment selection based on the combined QUALIFLEX-TODIM method

    NASA Astrophysics Data System (ADS)

    Ji, Pu; Zhang, Hong-yu; Wang, Jian-qiang

    2017-10-01

    Treatment selection is a multi-criteria decision-making problem of significant concern in the medical field. In this study, a fuzzy decision-making framework is established for treatment selection. The framework mitigates information loss by introducing single-valued trapezoidal neutrosophic numbers to denote evaluation information. Treatment selection involves multiple criteria that considerably outnumber the alternatives. In consideration of this characteristic, the framework utilises the idea of the qualitative flexible multiple criteria (QUALIFLEX) method. Furthermore, it accounts for the risk-averse behaviour of the decision maker by employing a concordance index based on the TODIM method (an acronym in Portuguese for interactive and multi-criteria decision-making). A sensitivity analysis is performed to illustrate the robustness of the framework. Finally, a comparative analysis is conducted to compare the framework with several extant methods. Results indicate the advantages of the framework and its better performance compared with the extant methods.

  16. Novel methodologies for spectral classification of exon and intron sequences

    NASA Astrophysics Data System (ADS)

    Kwan, Hon Keung; Kwan, Benjamin Y. M.; Kwan, Jennifer Y. Y.

    2012-12-01

    Digital processing of a nucleotide sequence requires it to be mapped to a numerical sequence, in which the choice of nucleotide-to-numeric mapping affects how well its biological properties are preserved and reflected from the nucleotide domain to the numerical domain. Digital spectral analysis of nucleotide sequences reveals a period-3 power spectral value that is more prominent in exon sequences than in intron sequences. The success of period-3-based exon and intron classification depends on the choice of a threshold value. The main purposes of this article are to introduce novel codes for 1-sequence numerical representations for spectral analysis, comparing them to existing codes to determine an appropriate representation, and to introduce novel thresholding methods for more accurate period-3-based exon and intron classification of an unknown sequence. The main findings of this study are summarized as follows. Among sixteen 1-sequence numerical representations, the K-Quaternary Code I offers an attractive performance. A windowed 1-sequence numerical representation (with window lengths of 9, 15, and 24 bases) offers a possible speed gain over the non-windowed 4-sequence Voss representation, a gain which increases as sequence length increases. A winner threshold value (chosen as the best among two defined threshold values and one other threshold value) offers top precision for classifying an unknown sequence of specified fixed length. An interpolated winner threshold value applicable to an unknown sequence of arbitrary length can be estimated from the winner threshold values of fixed-length sequences with comparable performance. In general, precision increases as sequence length increases. The study contributes an effective spectral analysis of nucleotide sequences that better reveals embedded properties, and has potential applications in improved genome annotation.
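
    The period-3 measure itself is straightforward to compute: map the sequence to numerical form, take the DFT, and read the power at bin N/3. The sketch below uses the standard 4-sequence Voss representation rather than the paper's 1-sequence codes, whose numeric values are not given in the abstract; classification then compares this power against a chosen threshold.

    ```python
    import numpy as np

    def period3_power(seq):
        """Period-3 spectral measure of a DNA string using the standard Voss
        representation (four binary indicator sequences); the paper's
        1-sequence K-Quaternary Code I is a more compact alternative."""
        seq = seq.upper()
        N = len(seq)
        k = N // 3                       # period-3 bin of the length-N DFT
        total = 0.0
        for base in "ACGT":
            u = np.fromiter((1.0 if ch == base else 0.0 for ch in seq), float, N)
            U = np.fft.fft(u)
            total += np.abs(U[k]) ** 2
        return total / N                 # normalized period-3 power

    exon_like = "ATG" * 100              # strong codon periodicity
    random_like = "".join(np.random.choice(list("ACGT"), 300))
    print(period3_power(exon_like), period3_power(random_like))
    ```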

  17. Low gas flow inductively coupled plasma optical emission spectrometry for the analysis of food samples after microwave digestion.

    PubMed

    Nowak, Sascha; Gesell, Monika; Holtkamp, Michael; Scheffer, Andy; Sperling, Michael; Karst, Uwe; Buscher, Wolfgang

    2014-11-01

    In this work, the recently introduced low flow inductively coupled plasma optical emission spectrometry (ICP-OES) with a total argon consumption below 0.7 L/min is applied for the first time to the field of food analysis. One goal is the investigation of the performance of this low flow plasma compared to a conventional ICP-OES system when non-aqueous samples with a certain matrix are introduced into the system. For this purpose, arsenic is determined in three different kinds of fish samples. In addition several nutrients (K, Na, Mg, Ca) and trace metals (Co, Cu, Mn, Cd, Pb, Zn, Fe, and Ni) are determined in honey samples (acacia) after microwave digestion. The precision of the measurements is characterized by relative standard deviations (RSD) and compared to the corresponding precision values achieved using the conventional Fassel-type torch of the ICP. To prove the accuracy of the low flow ICP-OES method, the obtained data from honey samples are validated by a conventional ICP-OES. For the measurements concerning arsenic in fish, the low flow ICP-OES values are validated by conventional Fassel-type ICP-OES. Furthermore, a certified reference material was investigated with the low gas flow setup. Limits of detection (LOD), according to the 3σ criterion, were determined to be in the low microgram per liter range for all analytes. Recovery rates in the range of 96-106% were observed for the determined trace metal elements. It was proven that the low gas flow ICP-OES leads to results that are comparable with those obtained with the Fassel-type torch for the analysis of food samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faranda, Davide, E-mail: davide.faranda@cea.fr; Dubrulle, Bérengère; Daviaud, François

    We introduce a novel way to extract information from turbulent datasets by applying an Auto Regressive Moving Average (ARMA) statistical analysis. Such analysis goes well beyond the analysis of the mean flow and of the fluctuations and links the behavior of the recorded time series to a discrete version of a stochastic differential equation which is able to describe the correlation structure in the dataset. We introduce a new index Υ that measures the difference between the resulting analysis and the Obukhov model of turbulence, the simplest stochastic model reproducing both Richardson law and the Kolmogorov spectrum. We test the method on datasets measured in a von Kármán swirling flow experiment. We found that the ARMA analysis is well correlated with spatial structures of the flow, and can discriminate between two different flows with comparable mean velocities, obtained by changing the forcing. Moreover, we show that Υ is highest in regions where shear layer vortices are present, thereby establishing a link between deviations from the Kolmogorov model and coherent structures. These deviations are consistent with the ones observed by computing the Hurst exponents for the same time series. We show that some salient features of the analysis are preserved when considering global instead of local observables. Finally, we analyze flow configurations with multistability features where the ARMA technique is efficient in discriminating different stability branches of the system.
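
    The first step of such an analysis, fitting a low-order ARMA model to a recorded time series, can be sketched with statsmodels as below; the Υ index comparing the fit to the Obukhov model is the paper's own construction and is not reproduced here.

    ```python
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    # Toy stand-in for a measured velocity time series: smoothed white noise
    rng = np.random.default_rng(1)
    series = np.convolve(rng.standard_normal(5000), np.ones(5) / 5, mode="valid")

    # ARMA(p, q) is ARIMA with d = 0; the fitted coefficients define the
    # discrete stochastic model describing the series' correlation structure
    res = ARIMA(series, order=(1, 0, 1)).fit()
    print(res.params)   # AR and MA coefficients plus innovation variance
    print(res.aic)      # comparing AIC across (p, q) guides order selection
    ```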

  19. Quantitative EEG analysis using error reduction ratio-causality test; validation on simulated and real EEG data.

    PubMed

    Sarrigiannis, Ptolemaios G; Zhao, Yifan; Wei, Hua-Liang; Billings, Stephen A; Fotheringham, Jayne; Hadjivassiliou, Marios

    2014-01-01

    We introduce a new method of quantitative EEG analysis in the time domain, the error reduction ratio (ERR)-causality test, and compare its performance against cross-correlation and coherence with phase measures. A simulation example was used as a gold standard to assess the performance of ERR-causality against cross-correlation and coherence; the methods were then applied to real EEG data. Analysis of both simulated and real EEG data demonstrates that ERR-causality successfully detects dynamically evolving changes between two signals, with very high time resolution, dependent on the sampling rate of the data. Our method can properly detect both linear and non-linear effects encountered during analysis of focal and generalised seizures. This new quantitative EEG method detects real-time levels of synchronisation in the linear and non-linear domains and computes the directionality of information flow with corresponding time lags. This novel dynamic real-time EEG signal analysis unveils hidden neural network interactions with a very high time resolution. These interactions cannot be adequately resolved by the traditional methods of coherence and cross-correlation, which provide limited results in the presence of non-linear effects and lack fidelity for changes appearing over small periods of time. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  20. Semi-automation of Doppler Spectrum Image Analysis for Grading Aortic Valve Stenosis Severity.

    PubMed

    Niakšu, O; Balčiunaitė, G; Kizlaitis, R J; Treigys, P

    2016-01-01

    Doppler echocardiography analysis has become a gold standard in the modern diagnosis of heart diseases. In this paper, we propose a set of techniques for semi-automated parameter extraction for aortic valve stenosis severity grading. The main objective of the study is to create echocardiography image processing techniques that minimize clinicians' manual image processing work and reduce human error rates. Aortic valve and left ventricular outflow tract spectrogram images were processed and analyzed. A novel method was developed to trace systoles and to extract diagnostically relevant features. The results of the introduced method were compared with the findings of the participating cardiologists. The experimental results showed that the accuracy of the proposed method is comparable to manual measurement performed by medical professionals. Linear regression analysis of the calculated parameters against the measurements manually obtained by the cardiologists yielded strongly correlated values: peak systolic velocity and mean pressure gradient R2 both equal to 0.99, with means' differences of 0.02 m/s and 4.09 mmHg, respectively, and an aortic valve area R2 of 0.89 with a means' difference of 0.19 mm between the two methods. The introduced Doppler echocardiography image processing method can be used as computer-aided assistance in aortic valve stenosis diagnostics. In future work, we intend to improve the precision of left ventricular outflow tract spectrogram measurements and apply data mining methods to propose a clinical decision support system for diagnosing aortic valve stenosis.

  1. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
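
    The standard-addition idea can be sketched in a few lines: regress the response on the spiked concentration and extrapolate to zero response, the x-intercept magnitude giving the endogenous level; comparing the in-matrix slope with a surrogate-matrix calibration slope is one simple parallelism check. All numbers below are invented for illustration.

    ```python
    import numpy as np

    # Standard addition: spike known amounts of analyte into the biological
    # matrix and extrapolate to zero response. Values are illustrative.
    added = np.array([0.0, 5.0, 10.0, 20.0, 40.0])        # spiked conc. (uM)
    response = np.array([12.1, 17.9, 24.2, 36.0, 60.3])   # peak area ratio

    slope, intercept = np.polyfit(added, response, 1)
    endogenous = intercept / slope     # |x-intercept| = endogenous concentration
    print(f"estimated endogenous concentration: {endogenous:.2f} uM")

    # Parallelism check: the in-matrix slope should agree with the slope of a
    # calibration line in surrogate matrix (ratio near 1 supports parallelism)
    neat_slope = 1.19                  # hypothetical surrogate-matrix slope
    print(f"slope ratio: {slope / neat_slope:.2f}")
    ```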

  2. Design and Analysis of Cognitive Interviews for Comparative Multinational Testing

    PubMed Central

    Fitzgerald, Rory; Padilla, José-Luis; Willson, Stephanie; Widdop, Sally; Caspar, Rachel; Dimov, Martin; Gray, Michelle; Nunes, Cátia; Prüfer, Peter; Schöbi, Nicole; Schoua-Glusberg, Alisú

    2011-01-01

    This article summarizes the work of the Comparative Cognitive Testing Workgroup, an international coalition of survey methodologists interested in developing an evidence-based methodology for examining the comparability of survey questions within cross-cultural or multinational contexts. To meet this objective, it was necessary to ensure that the cognitive interviewing (CI) method itself did not introduce method bias. Therefore, the workgroup first identified specific characteristics inherent in CI methodology that could undermine the comparability of CI evidence. The group then developed and implemented a protocol addressing those issues. In total, 135 cognitive interviews were conducted by participating countries. Through the process, the group identified various interpretive patterns resulting from sociocultural and language-related differences among countries as well as other patterns of error that would impede comparability of survey data. PMID:29081719

  3. The Effect of Laminar Flow on Rotor Hover Performance

    NASA Technical Reports Server (NTRS)

    Overmeyer, Austin D.; Martin, Preston B.

    2017-01-01

    The topic of laminar flow effects on hover performance is introduced with respect to some historical efforts where laminar flow was either measured or attempted. An analysis method is outlined using combined blade element, momentum method coupled to an airfoil analysis method, which includes the full e(sup N) transition model. The analysis results compared well with the measured hover performance including the measured location of transition on both the upper and lower blade surfaces. The analysis method is then used to understand the upper limits of hover efficiency as a function of disk loading. The impact of laminar flow is higher at low disk loading, but significant improvement in terms of power loading appears possible even up to high disk loading approaching 20 ps f. A optimum planform design equation is derived for cases of zero profile drag and finite drag levels. These results are intended to be a guide for design studies and as a benchmark to compare higher fidelity analysis results. The details of the analysis method are given to enable other researchers to use the same approach for comparison to other approaches.

  4. Comparison of dynamic treatment regimes via inverse probability weighting.

    PubMed

    Hernán, Miguel A; Lanoy, Emilie; Costagliola, Dominique; Robins, James M

    2006-03-01

    Appropriate analysis of observational data is our best chance to obtain answers to many questions that involve dynamic treatment regimes. This paper describes a simple method to compare dynamic treatment regimes by artificially censoring subjects and then using inverse probability weighting (IPW) to adjust for any selection bias introduced by the artificial censoring. The basic strategy can be summarized in four steps: 1) define two regimes of interest, 2) artificially censor individuals when they stop following one of the regimes of interest, 3) estimate inverse probability weights to adjust for the potential selection bias introduced by censoring in the previous step, 4) compare the survival of the uncensored individuals under each regime of interest by fitting an inverse probability weighted Cox proportional hazards model with the dichotomous regime indicator and the baseline confounders as covariates. In the absence of model misspecification, the method is valid provided data are available on all time-varying and baseline joint predictors of survival and regime discontinuation. We present an application of the method to compare the AIDS-free survival under two dynamic treatment regimes in a large prospective study of HIV-infected patients. The paper concludes by discussing the relative advantages and disadvantages of censoring/IPW versus g-estimation of nested structural models to compare dynamic regimes.
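
    A minimal sketch of steps 3 and 4, under simplifying assumptions (a single censoring interval per subject, hypothetical column and file names, and the lifelines/scikit-learn APIs), might look like this:

    ```python
    import pandas as pd
    from lifelines import CoxPHFitter
    from sklearn.linear_model import LogisticRegression

    # Hypothetical columns: time, event, regime, censored_art (1 when the
    # subject was artificially censored for deviating from the assigned
    # regime), plus baseline confounders such as age and cd4_baseline.
    df = pd.read_csv("cohort.csv")                  # hypothetical file

    # Step 3: model the probability of remaining uncensored given the
    # confounders, then weight each subject by its inverse
    X = df[["age", "cd4_baseline"]]
    p_uncens = LogisticRegression().fit(
        X, 1 - df["censored_art"]).predict_proba(X)[:, 1]
    df["ipw"] = 1.0 / p_uncens

    # Step 4: weighted Cox model among the uncensored, comparing the regimes
    cph = CoxPHFitter()
    cph.fit(df[df["censored_art"] == 0],
            duration_col="time", event_col="event",
            weights_col="ipw", robust=True,
            formula="regime + age + cd4_baseline")
    print(cph.summary[["coef", "exp(coef)"]])
    ```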

  5. MCCE analysis of the pKas of introduced buried acids and bases in staphylococcal nuclease.

    PubMed

    Gunner, M R; Zhu, Xuyu; Klein, Max C

    2011-12-01

    The pK(a)s of 96 acids and bases introduced into buried sites in the staphylococcal nuclease protein (SNase) were calculated using the multiconformation continuum electrostatics (MCCE) program, and the results were compared with experimental values. The pK(a)s are obtained by Monte Carlo sampling of coupled side chain protonation and position as a function of pH. The dependence of the results on the protein dielectric constant (ε(prot)) in the continuum electrostatics analysis and on the Lennard-Jones non-electrostatic parameters was evaluated. The pK(a)s of the introduced residues have a clear dependence on ε(prot), whereas native ionizable residues do not. The native residues have electrostatic interactions with other residues in the protein favoring ionization, which are larger than the desolvation penalty favoring the neutral state. Increasing ε(prot) scales both terms, which for these residues leads to small changes in pK(a). The introduced residues have a larger desolvation penalty and negligible interactions with residues in the protein. For these residues, changing ε(prot) has a large influence on the calculated pK(a). An ε(prot) of 8-10 and a Lennard-Jones scaling of 0.25 work best here. The X-ray crystal structures of the mutated proteins are found to provide somewhat better results than calculations carried out on mutations made in silico. Initial relaxation of the in silico mutations by Gromacs and extensive side chain rotamer sampling within MCCE can significantly improve the match with experiment. Copyright © 2011 Wiley-Liss, Inc.

  6. [Bayesian statistics in medicine -- part II: main applications and inference].

    PubMed

    Montomoli, C; Nichelatti, M

    2008-01-01

    Bayesian statistics is not only used when dealing with 2-way tables; it can also be used for inferential purposes. Using the basic concepts presented in the first part, this paper aims to give a simple overview of Bayesian methods by introducing their foundation (Bayes' theorem) and then applying this rule to a very simple practical example; whenever possible, the elementary steps of the analysis are compared to those of frequentist (classical) statistical analysis. Bayesian reasoning is naturally connected to medical activity, since it appears to be quite similar to a diagnostic process.
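
    The diagnostic-process analogy comes down to a one-line application of Bayes' theorem: the post-test probability of disease is the prior (prevalence) updated by the test's sensitivity and specificity. The numbers below are illustrative.

    ```python
    # Bayes' theorem in the diagnostic setting the paper alludes to:
    # P(disease | positive) =
    #     sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    def posterior_positive(prevalence, sensitivity, specificity):
        p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
        return sensitivity * prevalence / p_pos

    # Assumed values: 2% prevalence, 90% sensitivity, 95% specificity
    print(f"{posterior_positive(0.02, 0.90, 0.95):.3f}")  # about 0.269
    ```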

  7. Numerical Investigation on Detection of Prestress Losses in a Prestressed Concrete Slab by Modal Analysis

    NASA Astrophysics Data System (ADS)

    Kovalovs, A.; Rucevskis, S.; Akishin, P.; Kolupajevs, J.

    2017-10-01

    The paper presents numerical results on the loss of prestress in reinforced prestressed precast hollow core slabs obtained by modal analysis. Loss of prestress is investigated by the 3D finite element method, using ANSYS software. In the numerical examples, variable initial stresses were introduced into the seven-wire stress-relieved strands of the concrete slabs. The effects of span and the material properties of concrete on the modal frequencies of the structure under initial stress were studied. Modal parameters computed from the finite element models were compared. The applicability and effectiveness of the proposed method were investigated.

  8. Kinematic Analysis and Performance Evaluation of Novel PRS Parallel Mechanism

    NASA Astrophysics Data System (ADS)

    Balaji, K.; Khan, B. Shahul Hamid

    2018-02-01

    In this paper, a novel 3-DoF (degree of freedom) PRS (prismatic-revolute-spherical) parallel mechanism has been designed and presented. The combination of straight and arc-type linkages for a 3-DoF parallel mechanism is introduced for the first time. The performances of the mechanisms are evaluated based on indices such as the Minimum Singular Value (MSV), Condition Number (CN), Local Conditioning Index (LCI), Kinematic Configuration Index (KCI) and Global Conditioning Index (GCI). The overall reachable workspace of all mechanisms is presented. The kinematic measures, dexterity measures and workspace analysis for all the mechanisms have been evaluated and compared.

  9. Discriminative structural approaches for enzyme active-site prediction.

    PubMed

    Kato, Tsuyoshi; Nagano, Nozomi

    2011-02-15

    Predicting enzyme active sites in proteins is an important issue not only for protein sciences but also for a variety of practical applications such as drug design. Because enzyme reaction mechanisms are based on the local structures of enzyme active sites, various template-based methods that compare local structures in proteins have been developed to date. In comparing such local sites, a simple measurement, RMSD, has been used so far. This paper introduces new machine learning algorithms that refine the similarity/deviation for comparison of local structures. The similarity/deviation is applied to two types of applications, single template analysis and multiple template analysis. In the single template analysis, a single template is used as a query to search proteins for active sites, whereas in the multiple template analysis a protein structure is examined as a query to discover possible active sites using a set of templates. This paper experimentally illustrates that the machine learning algorithms effectively improve the similarity/deviation measurements for both analyses.
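
    The baseline being refined here, RMSD between local structures, is the minimal root-mean-square deviation after optimal rigid-body superposition. A minimal NumPy version of that baseline (the Kabsch algorithm, not the paper's learned measure) is sketched below.

    ```python
    import numpy as np

    def kabsch_rmsd(P: np.ndarray, Q: np.ndarray) -> float:
        """Minimal RMSD between two Nx3 coordinate sets after optimal
        rigid-body superposition (Kabsch algorithm)."""
        P = P - P.mean(axis=0)                   # remove translation
        Q = Q - Q.mean(axis=0)
        U, _, Vt = np.linalg.svd(P.T @ Q)
        d = np.sign(np.linalg.det(U @ Vt))       # guard against reflection
        R = U @ np.diag([1.0, 1.0, d]) @ Vt      # optimal rotation
        diff = P @ R - Q
        return float(np.sqrt((diff ** 2).sum() / len(P)))

    # Toy check: a 4-atom "site" and a rotated copy should give RMSD ~ 0.
    rng = np.random.default_rng(1)
    site = rng.standard_normal((4, 3))
    t = 0.7
    rot = np.array([[np.cos(t), -np.sin(t), 0.0],
                    [np.sin(t),  np.cos(t), 0.0],
                    [0.0,        0.0,       1.0]])
    print(round(kabsch_rmsd(site, site @ rot.T), 6))
    ```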

  10. Differential analysis between somatic mutation and germline variation profiles reveals cancer-related genes.

    PubMed

    Przytycki, Pawel F; Singh, Mona

    2017-08-25

    A major aim of cancer genomics is to pinpoint which somatically mutated genes are involved in tumor initiation and progression. We introduce a new framework for uncovering cancer genes, differential mutation analysis, which compares the mutational profiles of genes across cancer genomes with their natural germline variation across healthy individuals. We present DiffMut, a fast and simple approach for differential mutational analysis, and demonstrate that it is more effective in discovering cancer genes than considerably more sophisticated approaches. We conclude that germline variation across healthy human genomes provides a powerful means for characterizing somatic mutation frequency and identifying cancer driver genes. DiffMut is available at https://github.com/Singh-Lab/Differential-Mutation-Analysis.

  11. Evaluating acoustic speaker normalization algorithms: evidence from longitudinal child data.

    PubMed

    Kohn, Mary Elizabeth; Farrington, Charlie

    2012-03-01

    Speaker vowel formant normalization, a technique that controls for variation introduced by physical differences between speakers, is necessary in variationist studies to compare speakers of different ages, genders, and physiological makeup in order to understand non-physiological variation patterns within populations. Many algorithms have been established to reduce variation introduced into vocalic data from physiological sources. The lack of real-time studies tracking the effectiveness of these normalization algorithms from childhood through adolescence inhibits exploration of child participation in vowel shifts. This analysis compares normalization techniques applied to data collected from ten African American children across five time points. Linear regressions compare the reduction in variation attributable to age and gender for each speaker for the vowels BEET, BAT, BOT, BUT, and BOAR. A normalization technique is successful if it maintains variation attributable to a reference sociolinguistic variable, while reducing variation attributable to age. Results indicate that normalization techniques which rely on both a measure of central tendency and range of the vowel space perform best at reducing variation attributable to age, although some variation attributable to age persists after normalization for some sections of the vowel space. © 2012 Acoustical Society of America
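
    One widely used technique of the kind evaluated here, which relies on a speaker's own central tendency and spread, is Lobanov z-score normalization. The sketch below assumes a simple `(speaker, vowel, F1, F2)` token format; the field layout and the sample Hz values are hypothetical, with the "speakers" being the same child at two ages to echo the longitudinal design.

    ```python
    from collections import defaultdict
    from statistics import mean, stdev

    def lobanov_normalize(tokens):
        """Rescale each formant by the speaker's own mean and standard
        deviation, removing physiologically driven vowel-space differences."""
        by_speaker = defaultdict(lambda: ([], []))
        for spk, _, f1, f2 in tokens:
            by_speaker[spk][0].append(f1)
            by_speaker[spk][1].append(f2)
        stats = {spk: ((mean(a), stdev(a)), (mean(b), stdev(b)))
                 for spk, (a, b) in by_speaker.items()}
        out = []
        for spk, vowel, f1, f2 in tokens:
            (m1, s1), (m2, s2) = stats[spk]
            out.append((spk, vowel, (f1 - m1) / s1, (f2 - m2) / s2))
        return out

    # Raw Hz values shrink as the vocal tract grows, but the normalized
    # positions of BEET vs. BOT remain comparable across the two ages.
    tokens = [("age8", "BEET", 450, 3100), ("age8", "BOT", 950, 1500),
              ("age8", "BAT", 850, 2300), ("age13", "BEET", 350, 2600),
              ("age13", "BOT", 780, 1200), ("age13", "BAT", 700, 1900)]
    for row in lobanov_normalize(tokens):
        print(row)
    ```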

  12. Reasoning about variables in 11 to 18 year olds: informal, schooled and formal expression in learning about functions

    NASA Astrophysics Data System (ADS)

    Ayalon, Michal; Watson, Anne; Lerman, Steve

    2016-09-01

    This study examines expressions of reasoning by some higher achieving 11 to 18 year-old English students responding to a survey consisting of function tasks developed in collaboration with their teachers. We report on 70 students, 10 from each of English years 7-13. Iterative and comparative analysis identified capabilities and difficulties of students and suggested conjectures concerning links between the affordances of the tasks, the curriculum, and students' responses. The paper focuses on five of the survey tasks and highlights connections between informal and formal expressions of reasoning about variables in learning. We introduce the notion of `schooled' expressions of reasoning, neither formal nor informal, to emphasise the role of the formatting tools introduced in school that shape future understanding and reasoning.

  13. Development of a variable structure-based fault detection and diagnosis strategy applied to an electromechanical system

    NASA Astrophysics Data System (ADS)

    Gadsden, S. Andrew; Kirubarajan, T.

    2017-05-01

    Signal processing techniques are prevalent in a wide range of fields: control, target tracking, telecommunications, robotics, fault detection and diagnosis, and even stock market analysis, to name a few. Although first introduced in the 1950s, the most popular method used for signal processing and state estimation remains the Kalman filter (KF). The KF offers an optimal solution to the estimation problem under strict assumptions. Since then, a number of other estimation strategies and filters have been introduced to overcome robustness issues, such as the smooth variable structure filter (SVSF). In this paper, properties of the SVSF are explored in an effort to detect and diagnose faults in an electromechanical system. The results are compared with the KF method, and future work is discussed.

  14. Analytical criteria for performance characteristics of IgE binding methods for evaluating safety of biotech food products.

    PubMed

    Holzhauser, Thomas; Ree, Ronald van; Poulsen, Lars K; Bannon, Gary A

    2008-10-01

    There is detailed guidance on how to perform bioinformatic analyses and enzymatic degradation studies for genetically modified crops under consideration for approval by regulatory agencies; however, there is no consensus in the scientific community on the details of how to perform IgE serum studies. IgE serum studies are an important safety component in the acceptance of genetically modified crops when the introduced protein is novel, the introduced protein is similar to known allergens, or the crop is allergenic. In this manuscript, we describe the characteristics of the reagents, validation of assay performance, and data analysis necessary to optimize the information obtained from serum testing of novel proteins and genetically modified (GM) crops and to make results more accurate and comparable between different investigations.

  15. Battlespace Awareness: Heterogeneous Sensor Maps of Large Scale, Complex Environments

    DTIC Science & Technology

    2017-06-13

    reference frames enable a system designer to describe the position of any sensor or platform at any point of time. This section introduces the...analysis to evaluate the quality of reconstructions created by our algorithms. CloudCompare is an open-source tool designed for this purpose [65]. In...structure of the data. The data term seeks to keep the proposed solution (u) similar to the originally observed values ( f ). A systems designer must

  16. [Variable selection methods combined with local linear embedding theory used for optimization of near infrared spectral quantitative models].

    PubMed

    Hao, Yong; Sun, Xu-Dong; Yang, Qiang

    2012-12-01

    A variable selection strategy combined with local linear embedding (LLE) was introduced for the analysis of complex samples by near infrared spectroscopy (NIRS). Three methods, Monte Carlo uninformative variable elimination (MCUVE), the successive projections algorithm (SPA), and MCUVE combined with SPA, were used for eliminating redundant spectral variables. Partial least squares regression (PLSR) and LLE-PLSR were used for modeling the complex samples. The results showed that MCUVE can both extract effective informative variables and improve the precision of the models. Compared with PLSR models, LLE-PLSR models achieve more accurate analysis results. MCUVE combined with LLE-PLSR is an effective modeling method for NIRS quantitative analysis.
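
    A minimal sketch of the MCUVE idea follows, using scikit-learn's PLS regression: coefficients are re-estimated over random calibration subsets, each wavelength is scored by the stability |mean/std| of its coefficient, and only the most stable variables are kept. The subset fraction, run count, and retention cut-off are illustrative choices, not the paper's settings.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def mcuve(X, y, n_components=2, n_runs=200, subset_frac=0.8, keep=20, seed=0):
        """Rank variables by the stability of their PLS coefficients over
        random calibration subsets; return indices of the `keep` best."""
        rng = np.random.default_rng(seed)
        n, p = X.shape
        coefs = np.empty((n_runs, p))
        for i in range(n_runs):
            idx = rng.choice(n, size=int(subset_frac * n), replace=False)
            pls = PLSRegression(n_components=n_components).fit(X[idx], y[idx])
            coefs[i] = pls.coef_.ravel()
        stability = np.abs(coefs.mean(axis=0) / coefs.std(axis=0))
        return np.argsort(stability)[::-1][:keep]

    # Synthetic "spectra": 50 samples x 100 wavelengths, 5 informative.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((50, 100))
    y = X[:, :5] @ np.array([3.0, -2.0, 1.5, 2.5, -1.0]) + 0.1 * rng.standard_normal(50)
    print(sorted(mcuve(X, y)[:10]))   # the informative indices 0-4 should rank highly
    ```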

  17. Description of textures by a structural analysis.

    PubMed

    Tomita, F; Shirai, Y; Tsuji, S

    1982-02-01

    A structural analysis system for describing natural textures is introduced. The analyzer automatically extracts the texture elements in an input image, measures their properties, classifies them into some distinctive classes (one "ground" class and some "figure" classes), and computes the distributions of the gray level, the shape, and the placement of the texture elements in each class. These descriptions are used for classification of texture images. An analysis-by-synthesis method for evaluating texture analyzers is also presented. We propose a synthesizer which generates a texture image based on the descriptions. By comparing the reconstructed image with the original one, we can see what information is preserved and what is lost in the descriptions.

  18. [Clinical value evaluation of Chinese herbal formula in context of multi-omics network].

    PubMed

    Li, Bing; Han, Fei; Wang, Zhong; Wang, Yong-Yan

    2017-03-01

    Clinical value evaluation is the key to solving problems such as high repetition rates, fuzzy clinical positioning, broad indications and unclear clinical values of Chinese herbal formulas (Chinese patent medicines). By analyzing the challenges and opportunities of Chinese herbal formula in clinical value evaluation, this paper introduced a strategy of multi-omic network analysis. Through comparative analysis of three stroke treatment formulas, we suggested their different characteristic advantages for variant symptoms or phenotypes of stroke, which may provide reference for rational clinical choice. Such a multi-omic network analysis strategy may open a unique angle of view for clinical evaluation and comparison of Chinese herbal formulas. Copyright© by the Chinese Pharmaceutical Association.

  19. Use of Multiscale Entropy to Facilitate Artifact Detection in Electroencephalographic Signals

    PubMed Central

    Mariani, Sara; Borges, Ana F. T.; Henriques, Teresa; Goldberger, Ary L.; Costa, Madalena D.

    2016-01-01

    Electroencephalographic (EEG) signals present a myriad of challenges to analysis, beginning with the detection of artifacts. Prior approaches to noise detection have utilized multiple techniques, including visual methods, independent component analysis and wavelets. However, no single method is broadly accepted, inviting alternative ways to address this problem. Here, we introduce a novel approach based on a statistical physics method, multiscale entropy (MSE) analysis, which quantifies the complexity of a signal. We postulate that noise-corrupted EEG signals have lower information content, and, therefore, reduced complexity compared with their noise-free counterparts. We test the new method on an open-access database of EEG signals with and without added artifacts due to electrode motion. PMID:26738116
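
    A compact version of the MSE computation is sketched below: the signal is coarse-grained by non-overlapping averaging at each scale and the sample entropy of each coarse-grained series is computed, with the tolerance r fixed from the original series' standard deviation. The parameters m = 2 and r = 0.15 SD are conventional defaults, not values reported in this paper.

    ```python
    import numpy as np

    def sample_entropy(x, m, r):
        """SampEn(m, r): -ln of the conditional probability that sequences
        matching for m points (Chebyshev distance <= r) also match for m+1."""
        x = np.asarray(x, dtype=float)
        def pairs(mm):
            emb = np.lib.stride_tricks.sliding_window_view(x, mm)[: len(x) - m]
            d = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=2)
            return ((d <= r).sum() - len(emb)) / 2      # self-matches excluded
        b, a = pairs(m), pairs(m + 1)
        return np.inf if a == 0 or b == 0 else -np.log(a / b)

    def multiscale_entropy(x, max_scale=5, m=2, r_frac=0.15):
        """Coarse-grain by non-overlapping averaging at each scale, then
        compute SampEn; r stays fixed from the original series' SD."""
        x = np.asarray(x, dtype=float)
        r = r_frac * x.std()
        curve = []
        for tau in range(1, max_scale + 1):
            n = len(x) // tau
            cg = x[: n * tau].reshape(n, tau).mean(axis=1)
            curve.append(sample_entropy(cg, m, r))
        return curve

    # White noise loses entropy at coarser scales, whereas correlated signals
    # retain it -- the property exploited to flag noise-corrupted epochs.
    rng = np.random.default_rng(0)
    print([round(v, 2) for v in multiscale_entropy(rng.standard_normal(1200))])
    ```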

  20. A comparative study of multivariable robustness analysis methods as applied to integrated flight and propulsion control

    NASA Technical Reports Server (NTRS)

    Schierman, John D.; Lovell, T. A.; Schmidt, David K.

    1993-01-01

    Three multivariable robustness analysis methods are compared and contrasted. The focus of the analysis is on system stability and performance robustness to uncertainty in the coupling dynamics between two interacting subsystems. Of particular interest are interacting airframe and engine subsystems, and an example airframe/engine vehicle configuration is utilized in the demonstration of these approaches. The singular value (SV) and structured singular value (SSV) analysis methods are compared to a method especially well suited for analysis of robustness to uncertainties in subsystem interactions. This approach is referred to here as the interacting subsystem (IS) analysis method. This method has been used previously to analyze airframe/engine systems, emphasizing the study of stability robustness. However, performance robustness is also investigated here, and a new measure of allowable uncertainty for acceptable performance robustness is introduced. The IS methodology does not require plant uncertainty models to measure the robustness of the system, and is shown to yield valuable information regarding the effects of subsystem interactions. In contrast, the SV and SSV methods allow for the evaluation of the robustness of the system to particular models of uncertainty, and do not directly indicate how the airframe (engine) subsystem interacts with the engine (airframe) subsystem.

  1. Optimizing methods and dodging pitfalls in microbiome research.

    PubMed

    Kim, Dorothy; Hofstaedter, Casey E; Zhao, Chunyu; Mattei, Lisa; Tanes, Ceylan; Clarke, Erik; Lauder, Abigail; Sherrill-Mix, Scott; Chehoud, Christel; Kelsen, Judith; Conrad, Máire; Collman, Ronald G; Baldassano, Robert; Bushman, Frederic D; Bittinger, Kyle

    2017-05-05

    Research on the human microbiome has yielded numerous insights into health and disease, but also has resulted in a wealth of experimental artifacts. Here, we present suggestions for optimizing experimental design and avoiding known pitfalls, organized in the typical order in which studies are carried out. We first review best practices in experimental design and introduce common confounders such as age, diet, antibiotic use, pet ownership, longitudinal instability, and microbial sharing during cohousing in animal studies. Typically, samples will need to be stored, so we provide data on best practices for several sample types. We then discuss design and analysis of positive and negative controls, which should always be run with experimental samples. We introduce a convenient set of non-biological DNA sequences that can be useful as positive controls for high-volume analysis. Careful analysis of negative and positive controls is particularly important in studies of samples with low microbial biomass, where contamination can comprise most or all of a sample. Lastly, we summarize approaches to enhancing experimental robustness by careful control of multiple comparisons and to comparing discovery and validation cohorts. We hope the experimental tactics summarized here will help researchers in this exciting field advance their studies efficiently while avoiding errors.

  2. Low Predictability of Colour Polymorphism in Introduced Guppy (Poecilia reticulata) Populations in Panama

    PubMed Central

    Martínez, Celestino; Chavarría, Carmen; Sharpe, Diana M. T.; De León, Luis Fernando

    2016-01-01

    Colour polymorphism is a recurrent feature of natural populations, and its maintenance has been studied in a range of taxa in their native ranges. However, less is known about whether (and how) colour polymorphism is maintained when populations are removed from their native environments, as in the case of introduced species. We here address this issue by analyzing variation in colour patterns in recently-discovered introduced populations of the guppy (Poecilia reticulata) in Panama. Specifically, we use classic colour analysis to estimate variation in the number and the relative area of different colour spots across low predation sites in the introduced Panamanian range of the species. We then compare this variation to that found in the native range of the species under low- and high predation regimes. We found aspects of the colour pattern that were both consistent and inconsistent with the classical paradigm of colour evolution in guppies. On one hand, the same colours that dominated in native populations (orange, iridescent and black) were also the most dominant in the introduced populations in Panama. On the other, there were no clear differences between either introduced-low and native low- and high predation populations. Our results are therefore only partially consistent with the traditional role of female preference in the absence of predators, and suggest that additional factors could influence colour patterns when populations are removed from their native environments. Future research on the interaction between female preference and environmental variability (e.g. multifarious selection), could help understand adaptive variation in this widely-introduced species, and the contexts under which variation in adaptive traits parallels (or not) variation in the native range. PMID:26863538

  3. Correlating the EMC analysis and testing methods for space systems in MIL-STD-1541A

    NASA Technical Reports Server (NTRS)

    Perez, Reinaldo J.

    1990-01-01

    A study was conducted to improve the correlation between the electromagnetic compatibility (EMC) analysis models stated in MIL-STD-1541A and the suggested testing methods used for space systems. The test and analysis methods outlined in MIL-STD-1541A are described, and a comparative assessment of testing and analysis techniques as they relate to several EMC areas is presented. Suggestions on present analysis and test methods are introduced to harmonize and bring the analysis and testing tools in MIL-STD-1541A into closer agreement. It is suggested that test procedures in MIL-STD-1541A must be improved by providing alternatives to the present use of shielded enclosures as the primary site for such tests. In addition, the alternate use of anechoic chambers and open field test sites must be considered.

  4. Hierarchical analysis of spatial pattern and processes of Douglas-fir forests. Ph.D. Thesis, 10 Sep. 1991 Abstract Only

    NASA Technical Reports Server (NTRS)

    Bradshaw, G. A.

    1995-01-01

    There has been increased interest in the quantification of pattern in ecological systems in recent years. This interest is motivated by the desire to construct valid models which extend across many scales. Spatial methods must quantify pattern, discriminate types of pattern, and relate hierarchical phenomena across scales. Wavelet analysis is introduced as a method to identify spatial structure in ecological transect data. The main advantage of the wavelet transform over other methods is its ability to preserve and display hierarchical information while allowing for pattern decomposition. Two applications of wavelet analysis are illustrated, as a means to: (1) quantify known spatial patterns in Douglas-fir forests at several scales, and (2) construct spatially-explicit hypotheses regarding pattern generating mechanisms. Application of the wavelet variance, derived from the wavelet transform, is developed for forest ecosystem analysis to obtain additional insight into spatially-explicit data. Specifically, the resolution capabilities of the wavelet variance are compared to those of the semi-variogram and Fourier power spectra for the description of spatial data, using a set of one-dimensional stationary and non-stationary processes. The wavelet cross-covariance function is derived from the wavelet transform and introduced as an alternative method for the analysis of multivariate spatial data of understory vegetation and canopy in Douglas-fir forests of the western Cascades of Oregon.

  5. Calibration and assessment of channel-specific biases in microarray data with extended dynamical range.

    PubMed

    Bengtsson, Henrik; Jönsson, Göran; Vallon-Christersson, Johan

    2004-11-12

    Non-linearities in observed log-ratios of gene expressions, also known as intensity dependent log-ratios, can often be accounted for by global biases in the two channels being compared. Any step in a microarray process may introduce such offsets and in this article we study the biases introduced by the microarray scanner and the image analysis software. By scanning the same spotted oligonucleotide microarray at different photomultiplier tube (PMT) gains, we have identified a channel-specific bias present in two-channel microarray data. For the scanners analyzed it was in the range of 15-25 (out of 65,535). The observed bias was very stable between subsequent scans of the same array although the PMT gain was greatly adjusted. This indicates that the bias does not originate from a step preceding the scanner detector parts. The bias varies slightly between arrays. When comparing estimates based on data from the same array, but from different scanners, we have found that different scanners introduce different amounts of bias. So do various image analysis methods. We propose a scanning protocol and a constrained affine model that allows us to identify and estimate the bias in each channel. Backward transformation removes the bias and brings the channels to the same scale. The result is that systematic effects such as intensity dependent log-ratios are removed, but also that signal densities become much more similar. The average scan, which has a larger dynamical range and greater signal-to-noise ratio than individual scans, can then be obtained. The study shows that microarray scanners may introduce a significant bias in each channel. Such biases have to be calibrated for, otherwise systematic effects such as intensity dependent log-ratios will be observed. The proposed scanning protocol and calibration method is simple to use and is useful for evaluating scanner biases or for obtaining calibrated measurements with extended dynamical range and better precision. The cross-platform R package aroma, which implements all described methods, is available for free from http://www.maths.lth.se/bioinformatics/.
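
    The logic of the calibration can be sketched numerically. If the same additive offset c sits in a channel at every gain, two scans obey scan_hi = b·scan_lo + a with b the gain ratio and a = c(1 − b), so a line fit between scans exposes the offset as c = a/(1 − b). The simulation below uses hypothetical values (offset 20 out of 65,535, gain ratio 2.5) and plain least squares rather than the paper's constrained affine estimator.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    true_signal = 10 ** rng.uniform(1, 4, size=2000)       # spot intensities
    offset, gain_ratio = 20.0, 2.5                         # hypothetical values
    scan_lo = true_signal + offset + rng.normal(0, 2, 2000)
    scan_hi = gain_ratio * true_signal + offset + rng.normal(0, 2, 2000)

    # Fit scan_hi = b * scan_lo + a, then recover the channel offset:
    b, a = np.polyfit(scan_lo, scan_hi, 1)
    est_offset = a / (1.0 - b)
    print(f"gain ratio ~ {b:.2f}, estimated offset ~ {est_offset:.1f} (true {offset})")
    ```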

  6. Investigation of advanced phase-shifting projected fringe profilometry techniques

    NASA Astrophysics Data System (ADS)

    Liu, Hongyu

    1999-11-01

    The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurement of rough engineering surfaces. Compared with other competing techniques, this technique is notable for its full-field measurement capacity, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle, with some new approaches, three important problems which severely limit the capability and the accuracy of the PSPFP technique. Chapter 1 briefly introduces background information on the PSPFP technique, including its measurement principles, basic features, and related techniques. The objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment of absolute PSPFP measurement. The mathematical formulations and basic requirements of absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the previous theoretical analysis. Various fundamental experiments performed for concept verification and accuracy evaluation are introduced together with some brief comments. Chapter 4 presents a theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and the system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, some suggestions on the system design are given to improve measurement accuracy. Chapter 5 discusses a new technique combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process. The techniques coping with two major effects of surface reflectivity variations are then introduced. Some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and provides some suggestions for future research.

  7. Strategies for Introducing Outpatient Specialty Palliative Care in Gynecologic Oncology.

    PubMed

    Hay, Casey M; Lefkowits, Carolyn; Crowley-Matoka, Megan; Bakitas, Marie A; Clark, Leslie H; Duska, Linda R; Urban, Renata R; Creasy, Stephanie L; Schenker, Yael

    2017-09-01

    Concern that patients will react negatively to the idea of palliative care is cited as a barrier to timely referral. Strategies to successfully introduce specialty palliative care to patients have not been well described. We sought to understand how gynecologic oncologists introduce outpatient specialty palliative care. We conducted a national qualitative interview study at six geographically diverse academic cancer centers with well-established palliative care clinics between September 2015 and March 2016. Thirty-four gynecologic oncologists participated in semistructured telephone interviews focusing on attitudes, experiences, and practices related to outpatient palliative care. A multidisciplinary team analyzed interview transcripts using constant comparative methods to inductively develop and refine a coding framework. This analysis focuses on practices for introducing palliative care. Mean participant age was 47 years (standard deviation, 10 years). Mean interview length was 25 minutes (standard deviation, 7 minutes). Gynecologic oncologists described the following three main strategies for introducing outpatient specialty palliative care: focus initial palliative care referral on symptom management to dissociate palliative care from end-of-life care and facilitate early relationship building with palliative care clinicians; use a strong physician-patient relationship and patient trust to increase acceptance of referral; and explain and normalize palliative care referral to address negative associations and decrease patient fear of abandonment. These strategies aim to decrease negative patient associations and encourage acceptance of early referral to palliative care specialists. Gynecologic oncologists have developed strategies for introducing palliative care services to alleviate patient concerns. These strategies provide groundwork for developing system-wide best practice approaches to the presentation of palliative care referral.

  8. Considerations for analysis of time-to-event outcomes measured with error: Bias and correction with SIMEX.

    PubMed

    Oh, Eric J; Shepherd, Bryan E; Lumley, Thomas; Shaw, Pamela A

    2018-04-15

    For time-to-event outcomes, a rich literature exists on the bias introduced by covariate measurement error in regression models, such as the Cox model, and methods of analysis to address this bias. By comparison, less attention has been given to understanding the impact or addressing errors in the failure time outcome. For many diseases, the timing of an event of interest (such as progression-free survival or time to AIDS progression) can be difficult to assess or reliant on self-report and therefore prone to measurement error. For linear models, it is well known that random errors in the outcome variable do not bias regression estimates. With nonlinear models, however, even random error or misclassification can introduce bias into estimated parameters. We compare the performance of 2 common regression models, the Cox and Weibull models, in the setting of measurement error in the failure time outcome. We introduce an extension of the SIMEX method to correct for bias in hazard ratio estimates from the Cox model and discuss other analysis options to address measurement error in the response. A formula to estimate the bias induced into the hazard ratio by classical measurement error in the event time for a log-linear survival model is presented. Detailed numerical studies are presented to examine the performance of the proposed SIMEX method under varying levels and parametric forms of the error in the outcome. We further illustrate the method with observational data on HIV outcomes from the Vanderbilt Comprehensive Care Clinic. Copyright © 2017 John Wiley & Sons, Ltd.
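
    The generic SIMEX recipe referred to here can be shown on a toy problem (this is the general machinery, not the paper's Cox-model extension): deliberately re-add measurement error at increasing multiples of the known error variance, model the trend of the estimate as a function of the inflation factor lambda, and extrapolate back to lambda = -1, the point of zero error. The nonlinear summary below, a second moment, is chosen so the bias and its removal are plainly visible.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: true event times T observed with additive error of known SD.
    n, sigma = 5000, 0.8
    T = rng.lognormal(mean=1.0, sigma=0.4, size=n)
    W = T + rng.normal(0, sigma, n)

    estimator = lambda x: np.mean(x ** 2)   # biased: E[W^2] = E[T^2] + (1 + lam) * sigma^2

    # Step 1: simulate estimates at inflated error levels lambda.
    lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    B = 100
    est = [np.mean([estimator(W + np.sqrt(lam) * rng.normal(0, sigma, n))
                    for _ in range(B)]) for lam in lambdas]

    # Step 2: fit the trend in lambda (quadratic is the common default)
    # and extrapolate to lambda = -1, i.e. zero measurement error.
    simex = np.polyval(np.polyfit(lambdas, est, 2), -1.0)
    print(f"naive {est[0]:.2f}   SIMEX {simex:.2f}   truth {estimator(T):.2f}")
    ```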

  9. Design and simulation of MEMS-actuated adjustable optical wedge for laser beam scanners

    NASA Astrophysics Data System (ADS)

    Bahgat, Ahmed S.; Zaki, Ahmed H.; Abdo Mohamed, Mohamed; El Sherif, Ashraf Fathy

    2018-01-01

    This paper introduces both the optical and mechanical design and simulation of a large static deflection MOEMS actuator. The designed device is in the form of an adjustable optical wedge (AOW) laser scanner. The AOW is formed of a 1.5-mm-diameter plano-convex lens separated by an air gap from a fixed plano-concave lens. The convex lens is actuated by a staggered vertical comb drive and suspended by a rectangular cross-section torsion beam. An optical analysis and simulation of the air-separated AOW, as well as a detailed design, analysis, and static simulation of the comb drive, are introduced. The dynamic step response of the full system is also introduced. The analytical solution showed good agreement with the simulation results. A general global minimum optimization algorithm is applied to the comb-drive design to minimize the driving voltage. A maximum comb-drive mechanical deflection angle of 12 deg in each direction was obtained under a DC actuation voltage of 32 V with a settling time of 90 ms, leading to 1-mm one-dimensional (1-D) steering of the laser beam with a continuous optical scan angle of 5 deg in each direction. This optimization process provided a design with larger deflection and smaller driving voltage compared with other conventional devices. This enhancement could lead to better performance of MOEMS-based laser beam scanners for imaging and low-speed applications.

  10. A Spiking Neural Network Methodology and System for Learning and Comparative Analysis of EEG Data From Healthy Versus Addiction Treated Versus Addiction Not Treated Subjects.

    PubMed

    Doborjeh, Maryam Gholami; Wang, Grace Y; Kasabov, Nikola K; Kydd, Robert; Russell, Bruce

    2016-09-01

    This paper introduces a method utilizing spiking neural networks (SNN) for learning, classification, and comparative analysis of brain data. As a case study, the method was applied to electroencephalography (EEG) data collected during a GO/NOGO cognitive task performed by untreated opiate addicts, those undergoing methadone maintenance treatment (MMT) for opiate dependence, and a healthy control group. The method is based on an SNN architecture called NeuCube, trained on spatiotemporal EEG data. NeuCube was used to classify EEG data across subject groups and across GO versus NOGO trials, but also facilitated a deeper comparative analysis of the dynamic brain processes. This analysis results in a better understanding of human brain functioning across subject groups when performing a cognitive task. In terms of the EEG data classification, a NeuCube model obtained better results (the maximum obtained accuracy: 90.91%) when compared with traditional statistical and artificial intelligence methods (the maximum obtained accuracy: 50.55%). More importantly, new information about the effects of MMT on cognitive brain functions is revealed through the analysis of the SNN model connectivity and its dynamics. This paper presents a new method for EEG data modeling and reveals new knowledge on brain functions associated with mental activity which is different from the brain activity observed in a resting state of the same subjects.

  11. COGNAT: a web server for comparative analysis of genomic neighborhoods.

    PubMed

    Klimchuk, Olesya I; Konovalov, Kirill A; Perekhvatov, Vadim V; Skulachev, Konstantin V; Dibrova, Daria V; Mulkidjanian, Armen Y

    2017-11-22

    In prokaryotic genomes, functionally coupled genes can be organized in conserved gene clusters enabling their coordinated regulation. Such clusters could contain one or several operons, which are groups of co-transcribed genes. Those genes that evolved from a common ancestral gene by speciation (i.e. orthologs) are expected to have similar genomic neighborhoods in different organisms, whereas those copies of the gene that are responsible for dissimilar functions (i.e. paralogs) could be found in dissimilar genomic contexts. Comparative analysis of genomic neighborhoods facilitates the prediction of co-regulated genes and helps to discern different functions in large protein families. Building on the attribution of gene sequences to clusters of orthologous groups of proteins (COGs), we intended to provide a method, and a respective web server, for visualization and comparative analysis of the genomic neighborhoods of evolutionarily related genes. Here we introduce the COmparative Gene Neighborhoods Analysis Tool (COGNAT), a web server for comparative analysis of genomic neighborhoods. The tool is based on the COG database, as well as the Pfam protein families database. As an example, we show the utility of COGNAT in identifying a new type of membrane protein complex that is formed by paralog(s) of one of the membrane subunits of the NADH:quinone oxidoreductase of type 1 (COG1009) and a cytoplasmic protein of unknown function (COG3002). This article was reviewed by Drs. Igor Zhulin, Uri Gophna and Igor Rogozin.

  12. Exploratory Bi-Factor Analysis: The Oblique Case

    ERIC Educational Resources Information Center

    Jennrich, Robert I.; Bentler, Peter M.

    2012-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford ("Psychometrika" 47:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler ("Psychometrika" 76:537-549, 2011) introduced an exploratory form of bi-factor…

  13. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    PubMed Central

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
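
    The tool's default interval, the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator, is compact enough to state in code. The sketch below implements the textbook formulas with NumPy/SciPy; the five effect sizes and variances are hypothetical.

    ```python
    import numpy as np
    from scipy import stats

    def random_effects_kh(y, v, level=0.95):
        """Random-effects pooled effect with DerSimonian-Laird tau^2 and a
        Knapp-Hartung adjusted, t-based confidence interval.

        y: per-study effect sizes; v: their within-study variances.
        """
        y, v = np.asarray(y, float), np.asarray(v, float)
        k, w = len(y), 1.0 / np.asarray(v, float)
        mu_fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - mu_fixed) ** 2)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - (k - 1)) / c)           # DerSimonian-Laird
        ws = 1.0 / (v + tau2)
        mu = np.sum(ws * y) / np.sum(ws)
        q = np.sum(ws * (y - mu) ** 2) / (k - 1)     # Knapp-Hartung scale
        se = np.sqrt(q / np.sum(ws))
        half = stats.t.ppf(0.5 + level / 2, k - 1) * se
        return mu, (mu - half, mu + half), tau2

    # Hypothetical study effects (e.g. Hedges' g) and variances:
    mu, ci, tau2 = random_effects_kh([0.42, 0.31, 0.55, 0.12, 0.60],
                                     [0.04, 0.02, 0.06, 0.03, 0.05])
    print(f"pooled = {mu:.3f}, 95% CI [{ci[0]:.3f}, {ci[1]:.3f}], tau^2 = {tau2:.3f}")
    ```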

  14. Coded excitation for infrared non-destructive testing of carbon fiber reinforced plastics.

    PubMed

    Mulaveesala, Ravibabu; Venkata Ghali, Subbarao

    2011-05-01

    This paper proposes Barker-coded excitation for defect detection using infrared non-destructive testing. The capability of the proposed excitation scheme is highlighted with a recently introduced correlation-based post-processing approach and compared with the existing phase-based analysis, taking the signal-to-noise ratio into consideration. The applicability of the proposed scheme has been experimentally validated on a carbon fiber reinforced plastic specimen containing flat bottom holes located at different depths.

  15. Application of finite element method in mechanical design of automotive parts

    NASA Astrophysics Data System (ADS)

    Gu, Suohai

    2017-09-01

    As an effective numerical analysis method, the finite element method (FEM) has been widely used in mechanical design and other fields. In this paper, the development of FEM is introduced first; the specific steps of FEM applications are then illustrated and the difficulties of FEM are summarized in detail. Finally, applications of FEM to automotive components such as wheels, leaf springs, body frames, and shafts are summarized and compared with related experimental research.

  16. A review and critique of some models used in competing risk analysis.

    PubMed

    Gail, M

    1975-03-01

    We have introduced a notation which allows one to define competing risk models easily and to examine underlying assumptions. We have treated the actuarial model for competing risk in detail, comparing it with other models and giving useful variance formulae both for the case when times of death are available and for the case when they are not. The generality of these methods is illustrated by an example treating two dependent competing risks.

  17. Introducing Co-Activation Pattern Metrics to Quantify Spontaneous Brain Network Dynamics

    PubMed Central

    Chen, Jingyuan E.; Chang, Catie; Greicius, Michael D.; Glover, Gary H.

    2015-01-01

    Recently, fMRI researchers have begun to realize that the brain's intrinsic network patterns may undergo substantial changes during a single resting state (RS) scan. However, despite the growing interest in brain dynamics, metrics that can quantify the variability of network patterns are still quite limited. Here, we first introduce various quantification metrics based on the extension of co-activation pattern (CAP) analysis, a recently proposed point-process analysis that tracks state alternations at each individual time frame and relies on very few assumptions; then apply these proposed metrics to quantify changes of brain dynamics during a sustained 2-back working memory (WM) task compared to rest. We focus on the functional connectivity of two prominent RS networks, the default-mode network (DMN) and executive control network (ECN). We first demonstrate less variability of global Pearson correlations with respect to the two chosen networks using a sliding-window approach during WM task compared to rest; then we show that the macroscopic decrease in variations in correlations during a WM task is also well characterized by the combined effect of a reduced number of dominant CAPs, increased spatial consistency across CAPs, and increased fractional contributions of a few dominant CAPs. These CAP metrics may provide alternative and more straightforward quantitative means of characterizing brain network dynamics than time-windowed correlation analyses. PMID:25662866

  18. Semi-automatic motion compensation of contrast-enhanced ultrasound images from abdominal organs for perfusion analysis.

    PubMed

    Schäfer, Sebastian; Nylund, Kim; Sævik, Fredrik; Engjom, Trond; Mézl, Martin; Jiřík, Radovan; Dimcevski, Georg; Gilja, Odd Helge; Tönnies, Klaus

    2015-08-01

    This paper presents a system for correcting motion influences in time-dependent 2D contrast-enhanced ultrasound (CEUS) images to assess tissue perfusion characteristics. The system consists of a semi-automatic frame selection method to find images with out-of-plane motion as well as a method for automatic motion compensation. Translational and non-rigid motion compensation is applied by introducing a temporal continuity assumption. A study consisting of 40 clinical datasets was conducted to compare the perfusion with simulated perfusion using pharmacokinetic modeling. Overall, the proposed approach decreased the mean average difference between the measured perfusion and the pharmacokinetic model estimation. It was non-inferior for three out of four patient cohorts to a manual approach and reduced the analysis time by 41% compared to manual processing. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Interplay of heritage and habitat in the distribution of bacterial signal transduction systems.

    PubMed

    Galperin, Michael Y; Higdon, Roger; Kolker, Eugene

    2010-04-01

    Comparative analysis of the complete genome sequences from a variety of poorly studied organisms aims at predicting ecological and behavioral properties of these organisms and helping in characterizing their habitats. This task requires finding appropriate descriptors that could be correlated with the core traits of each system and would allow meaningful comparisons. Using relatively simple bacterial models, first attempts have been made to introduce suitable metrics to describe the complexity of an organism's signaling machinery, including the "bacterial IQ" score. Here, we use an updated census of prokaryotic signal transduction systems to improve this parameter and evaluate its consistency within selected bacterial phyla. We also introduce a more elaborate descriptor, a set of profiles of the relative abundance of members of each family of signal transduction proteins encoded in each genome. We show that these family profiles are well conserved within each genus and are often consistent within families of bacteria. Thus, they reflect evolutionary relationships between organisms as well as individual adaptations of each organism to its specific ecological niche.

  1. Objective and subjective quality assessment of geometry compression of reconstructed 3D humans in a 3D virtual room

    NASA Astrophysics Data System (ADS)

    Mekuria, Rufael; Cesar, Pablo; Doumanis, Ioannis; Frisiello, Antonella

    2015-09-01

    Compression of 3D object based video is relevant for 3D immersive applications. Nevertheless, the perceptual aspects of the degradation introduced by codecs for meshes and point clouds are not well understood. In this paper we evaluate the subjective and objective degradations introduced by such codecs in a state-of-the-art 3D immersive virtual room. In the 3D immersive virtual room, users are captured with multiple cameras, and their surfaces are reconstructed as photorealistic colored/textured 3D meshes or point clouds. To test the perceptual effect of compression and transmission, we render degraded versions with different frame rates in different contexts (near/far) in the scene. A quantitative subjective study with 16 users shows that negligible distortion of decoded surfaces compared to the original reconstructions can be achieved in the 3D virtual room. In addition, a qualitative task-based analysis in a full prototype field trial shows increased presence, emotion, and user and state recognition for the reconstructed 3D human representation compared to animated computer avatars.

  2. Quartz Crystal Microbalance Electronic Interfacing Systems: A Review.

    PubMed

    Alassi, Abdulrahman; Benammar, Mohieddine; Brett, Dan

    2017-12-05

    Quartz Crystal Microbalance (QCM) sensors are actively being implemented in various fields due to their compatibility with different operating conditions in gaseous/liquid mediums for a wide range of measurements. This trend has been matched by the parallel advancement in tailored electronic interfacing systems for QCM sensors. That is, selecting the appropriate electronic circuit is vital for accurate sensor measurements. Many techniques were developed over time to cover the expanding measurement requirements (e.g., accommodating highly-damping environments). This paper presents a comprehensive review of the various existing QCM electronic interfacing systems. Namely, impedance-based analysis, oscillators (conventional and lock-in based techniques), exponential decay methods and the emerging phase-mass based characterization. The aforementioned methods are discussed in detail and qualitatively compared in terms of their performance for various applications. In addition, some theoretical improvements and recommendations are introduced for adequate systems implementation. Finally, specific design considerations of high-temperature microbalance systems (e.g., GaPO₄ crystals (GCM) and Langasite crystals (LCM)) are introduced, while assessing their overall system performance, stability and quality compared to conventional low-temperature applications.

  3. Beyond conventional dose-response curves: Sensorgram comparison in SPR allows single concentration activity and similarity assessment.

    PubMed

    Gassner, C; Karlsson, R; Lipsmeier, F; Moelleken, J

    2018-05-30

    Previously we have introduced two SPR-based assay principles (dual-binding assay and bridging assay), which allow the determination of two out of three possible interaction parameters for bispecific molecules within one assay setup: two individual interactions to both targets, and/or one simultaneous/overall interaction, which potentially reflects the inter-dependency of both individual binding events. However, activity and similarity are determined by comparing report points over a concentration range, which also mirrors the way data is generated by conventional ELISA-based methods. So far, binding kinetics have not been specifically considered in generic approaches for activity assessment. Here, we introduce an improved slope-ratio model which, together with a sensorgram comparison based similarity assessment, allows the development of a detailed, USP-conformal ligand binding assay using only a single sample concentration. We compare this novel analysis method to the usual concentration-range approach for both SPR-based assay principles and discuss its impact on data quality and increased sample throughput. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Secure multiparty computation of a comparison problem.

    PubMed

    Liu, Xin; Li, Shundong; Liu, Jian; Chen, Xiubo; Xu, Gang

    2016-01-01

    Private comparison is fundamental to secure multiparty computation. In this study, we propose novel protocols to privately determine [Formula: see text], or [Formula: see text] in one execution. First, a 0-1-vector encoding method is introduced to encode a number into a vector, and the Goldwasser-Micali encryption scheme is used to compare integers privately. Then, we propose a protocol using a geometric method to compare rational numbers privately, and the protocol is information-theoretically secure. Using the simulation paradigm, we prove the privacy-preserving property of our protocols in the semi-honest model. The complexity analysis shows that our protocols are more efficient than previous solutions.
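
    The 0-1-vector encoding can be illustrated in the clear (in the actual protocol each bit is encrypted with the Goldwasser-Micali scheme, so the vector itself reveals nothing). Encoding x over the domain 1..n as a vector whose i-th entry is 1 exactly when i < x reduces the comparison x > y to reading a single position:

    ```python
    def encode(x: int, n: int) -> list:
        """0-1 vector encoding of x over 1..n: entry i is 1 iff i < x.
        (The protocol would encrypt each bit; this sketch stays in the clear.)"""
        return [1 if i < x else 0 for i in range(1, n + 1)]

    def greater_than(x: int, y: int, n: int) -> bool:
        """x > y iff encode(x) has a 1 at position y."""
        return encode(x, n)[y - 1] == 1

    # Exhaustive check on a small domain:
    n = 8
    assert all(greater_than(x, y, n) == (x > y)
               for x in range(1, n + 1) for y in range(1, n + 1))
    print("0-1 vector comparison agrees with > on 1..8")
    ```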

  5. A targeted change-detection procedure by combining change vector analysis and post-classification approach

    NASA Astrophysics Data System (ADS)

    Ye, Su; Chen, Dongmei; Yu, Jie

    2016-04-01

    In remote sensing, conventional supervised change-detection methods usually require effective training data for multiple change types. This paper introduces a more flexible and efficient procedure that seeks to identify only the changes that users are interested in, hereafter referred to as "targeted change detection". Based on a one-class classifier, Support Vector Domain Description (SVDD), a novel algorithm named "Three-layer SVDD Fusion (TLSF)" is developed specially for targeted change detection. The proposed algorithm combines one-class classification generated from change vector maps, as well as before- and after-change images, in order to get a more reliable detection result. In addition, this paper introduces a detailed workflow for implementing this algorithm. This workflow has been applied to two case studies with different practical monitoring objectives: urban expansion and forest fire assessment. The experimental results of these two case studies show that the overall accuracy of our proposed algorithm is superior (Kappa statistics are 86.3% and 87.8% for Cases 1 and 2, respectively) compared to applying SVDD to change vector analysis and post-classification comparison.

  6. An air brake model for longitudinal train dynamics studies

    NASA Astrophysics Data System (ADS)

    Wei, Wei; Hu, Yang; Wu, Qing; Zhao, Xubao; Zhang, Jun; Zhang, Yuan

    2017-04-01

    Experience with heavy haul train operation shows that fatigue fracture of couplers and related components, and even accidents, are caused by excessive coupler forces. Simulation is the most economical and effective way to study longitudinal train impulses with a view to reducing coupler forces. The characteristics of the train air brake system are an important excitation source in the study of longitudinal impulses. Because braking characteristics are very difficult to obtain by testing, modelling the train air brake system is a better way to obtain the excitation-source input parameters for longitudinal train dynamics. In this paper, the air brake model of an integrated air brake and longitudinal dynamics system is introduced, focusing on the locomotive automatic brake valve and the vehicle distribution valve models, and a comparative analysis of simulated and measured braking system results is given. It is shown that the model can predict the characteristics of the train braking system. This method provides a good solution for the excitation source of a longitudinal dynamics analysis system.

  7. A new constitutive model for simulation of softening, plateau, and densification phenomena for trabecular bone under compression.

    PubMed

    Lee, Chi-Seung; Lee, Jae-Myung; Youn, BuHyun; Kim, Hyung-Sik; Shin, Jong Ki; Goh, Tae Sik; Lee, Jung Sub

    2017-01-01

    A new type of constitutive model and its computational implementation procedure for the simulation of a trabecular bone are proposed in the present study. A yield surface-independent Frank-Brockman elasto-viscoplastic model is introduced to express the nonlinear material behavior such as softening beyond yield point, plateau, and densification under compressive loads. In particular, the hardening- and softening-dominant material functions are introduced and adopted in the plastic multiplier to describe each nonlinear material behavior separately. In addition, the elasto-viscoplastic model is transformed into an implicit type discrete model, and is programmed as a user-defined material subroutine in commercial finite element analysis code. In particular, the consistent tangent modulus method is proposed to improve the computational convergence and to save computational time during finite element analysis. Through the developed material library, the nonlinear stress-strain relationship is analyzed qualitatively and quantitatively, and the simulation results are compared with the results of compression test on the trabecular bone to validate the proposed constitutive model, computational method, and material library. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Multi-agent fare optimization model of two modes problem and its analysis based on edge of chaos

    NASA Astrophysics Data System (ADS)

    Li, Xue-yan; Li, Xue-mei; Li, Xue-wei; Qiu, He-ting

    2017-03-01

    This paper proposes a new fare optimization and game model framework for studying the competition between two travel modes (high-speed railway and civil aviation) in which passengers' group behavior is taken into consideration. A small-world network is introduced to construct the multi-agent model of passengers' travel mode choice. Cumulative prospect theory is adopted to depict passengers' bounded rationality, and the heterogeneity of passengers' reference points is captured using the idea of group emotion computing. The concepts of the "Langton parameter" and "evolution entropy" from the theory of the "edge of chaos" are used to define passengers' "decision coefficient" and "evolution entropy of travel mode choice", which quantify passengers' group behavior. Numerical simulation and analysis of passengers' behavior show that (1) the new model inherits the features of the traditional model well, and the idea of self-organizing traffic flow evolution fully embodies passengers' bounded rationality; and (2) compared with the traditional (logit) model, when passengers are in the "edge of chaos" state the total profit of the transportation system is higher.

  9. Effectiveness of a financial incentive to physicians for timely follow-up after hospital discharge: a population-based time series analysis.

    PubMed

    Lapointe-Shaw, Lauren; Mamdani, Muhammad; Luo, Jin; Austin, Peter C; Ivers, Noah M; Redelmeier, Donald A; Bell, Chaim M

    2017-10-02

    Timely follow-up after hospital discharge may decrease readmission to hospital. Financial incentives to improve follow-up have been introduced in the United States and Canada, but it is unknown whether they are effective. Our objective was to evaluate the impact of an incentive program on timely physician follow-up after hospital discharge. We conducted an interventional time series analysis of all medical and surgical patients who were discharged home from hospital between Apr. 1, 2002, and Jan. 30, 2015, in Ontario, Canada. The intervention was a supplemental billing code for physician follow-up within 14 days of discharge from hospital, introduced in 2006. The primary outcome was an outpatient visit within 14 days of discharge. Secondary outcomes were 7-day follow-up and a composite of emergency department visits, nonelective hospital readmission and death within 14 days. We included 8 008 934 patient discharge records. The incentive code was claimed in 31% of eligible visits by 51% of eligible physicians, and cost $17.5 million over the study period. There was no change in the average monthly rate of outcomes in the year before the incentive was introduced compared with the year following introduction: 14-day follow-up (66.5% v. 67.0%, overall p = 0.5), 7-day follow-up (44.9% v. 44.9%, overall p = 0.5) and composite outcome (16.7% v. 16.9%, overall p = 0.2). Despite uptake by physicians, a financial incentive did not alter follow-up after hospital discharge. This lack of effect may be explained by features of the incentive or by extra-physician barriers to follow-up. These barriers should be considered by policymakers before introducing similar initiatives. © 2017 Canadian Medical Association or its licensors.

  11. How market structure drives commodity prices

    NASA Astrophysics Data System (ADS)

    Li, Bin; Wong, K. Y. Michael; Chan, Amos H. M.; So, Tsz Yan; Heimonen, Hermanni; Wei, Junyi; Saad, David

    2017-11-01

    We introduce an agent-based model, in which agents set their prices to maximize profit. At steady state the market self-organizes into three groups: excess producers, consumers and balanced agents, with prices determined by their own resource level and a couple of macroscopic parameters that emerge naturally from the analysis, akin to mean-field parameters in statistical mechanics. When resources are scarce prices rise sharply below a turning point that marks the disappearance of excess producers. To compare the model with real empirical data, we study the relationship between commodity prices and stock-to-use ratios in a range of commodities such as agricultural products and metals. By introducing an elasticity parameter to mitigate noise and long-term changes in commodities data, we confirm the trend of rising prices, provide evidence for turning points, and indicate yield points for less essential commodities.

  12. Tripartite community structure in social bookmarking data

    NASA Astrophysics Data System (ADS)

    Neubauer, Nicolas; Obermayer, Klaus

    2011-12-01

    Community detection is a branch of network analysis concerned with identifying strongly connected subnetworks. Social bookmarking sites aggregate datasets of often hundreds of millions of triples (document, user, and tag), which, when interpreted as edges of a graph, give rise to special networks called 3-partite, 3-uniform hypergraphs. We identify challenges and opportunities of generalizing community detection and in particular modularity optimization to these structures. Two methods for community detection are introduced that preserve the hypergraph's special structure to different degrees. Their performance is compared on synthetic datasets, showing the benefits of structure preservation. Furthermore, a tool for interactive exploration of the community detection results is introduced and applied to examples from real datasets. We find additional evidence for the importance of structure preservation and, more generally, demonstrate how tripartite community detection can help understand the structure of social bookmarking data.

  13. On the relation between phase-field crack approximation and gradient damage modelling

    NASA Astrophysics Data System (ADS)

    Steinke, Christian; Zreid, Imadeddin; Kaliske, Michael

    2017-05-01

    The finite element implementation of a gradient-enhanced microplane damage model is compared to a phase-field model for brittle fracture. Phase-field models and implicit gradient damage models share many similarities despite being conceived from very different standpoints. In both approaches, an additional differential equation and a length scale are introduced. However, while the phase-field method is formulated starting from the description of a crack in fracture mechanics, the gradient method starts from a continuum mechanics point of view. First, the scope of application of both models is discussed to identify areas of overlap. Then, the mathematical methods employed are analyzed and rigorously compared. Finally, numerical examples illustrate the findings of the comparison, which are summarized in a conclusion at the end of the paper.

  14. Finite element analysis of fretting contact for nonhomogenous materials

    NASA Astrophysics Data System (ADS)

    Korkmaz, Y. M.; Coker, D.

    2018-01-01

    The fretting problem arises when there is relatively small sliding motion between contacting surfaces. The fatigue life of components in contact with each other, especially in rotorcraft, may be significantly reduced by fretting. The purpose of this study is to investigate the effect of material inhomogeneity near the contact region on the fretting problem in a cylinder-on-flat contact configuration. A finite element (FE) model was constructed using the commercial finite element package ABAQUS™ to study partial sliding and stress concentrations. To investigate the effect of material inhomogeneity, the fretting contact is analyzed by introducing voids near the contact region; both single voids of varying size and arrays of voids are introduced into the substrate. The results are compared in terms of pressure, shear traction, tangential stress magnitudes, and relative slip between the contacting materials.

  15. Conclusions about Niche Expansion in Introduced Impatiens walleriana Populations Depend on Method of Analysis

    PubMed Central

    Mandle, Lisa; Warren, Dan L.; Hoffmann, Matthias H.; Peterson, A. Townsend; Schmitt, Johanna; von Wettberg, Eric J.

    2010-01-01

    Determining the degree to which climate niches are conserved across plant species' native and introduced ranges is valuable to developing successful strategies to limit the introduction and spread of invasive plants, and also has important ecological and evolutionary implications. Here, we test whether climate niches differ between native and introduced populations of Impatiens walleriana, globally one of the most popular horticultural species. We use approaches based on both raw climate data associated with occurrence points and ecological niche models (ENMs) developed with Maxent. We include comparisons of climate niche breadth in both geographic and environmental spaces, taking into account differences in available habitats between the distributional areas. We find significant differences in climate envelopes between native and introduced populations when comparing raw climate variables, with introduced populations appearing to expand into wetter and cooler climates. However, analyses controlling for differences in available habitat in each region do not indicate expansion of climate niches. We therefore cannot reject the hypothesis that observed differences in climate envelopes reflect only the limited environments available within the species' native range in East Africa. Our results suggest that models built from only native range occurrence data will not provide an accurate prediction of the potential for invasiveness if applied to areas containing a greater range of environmental combinations, and that tests of niche expansion may overestimate shifts in climate niches if they do not control carefully for environmental differences between distributional areas. PMID:21206912

  16. PET kinetic analysis --pitfalls and a solution for the Logan plot.

    PubMed

    Kimura, Yuichi; Naganawa, Mika; Shidahara, Miho; Ikoma, Yoko; Watabe, Hiroshi

    2007-01-01

    The Logan plot is a widely used algorithm for the quantitative analysis of neuroreceptors using PET because it is easy to use and simple to implement. The Logan plot is also suitable for receptor imaging because its algorithm is fast. However, use of the Logan plot and interpretation of the resulting receptor images should be regarded with caution, because noise in PET data biases the Logan plot estimates. In this paper, we describe the basic concept of the Logan plot in detail and introduce three algorithms for it. By comparing these algorithms, we demonstrate the pitfalls of the Logan plot and discuss a solution.
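
    For readers unfamiliar with the method, the following is a minimal sketch of the reference-tissue Logan graphical analysis: running integrals of the target and reference time-activity curves are divided by the target activity, and the late-time slope estimates the distribution volume ratio (DVR). The synthetic curves and the linearity-onset time t* are illustrative, not from the paper; note how the noisy target activity enters both axes, which is the route by which noise biases the slope.

```python
import numpy as np

def logan_dvr(t, ct, cref, t_star):
    """Reference-tissue Logan plot: returns the DVR estimate (slope).

    t      : frame mid-times (min)
    ct     : target-region time-activity curve
    cref   : reference-region time-activity curve
    t_star : time after which the plot is assumed linear
    """
    # Running integrals by the trapezoidal rule.
    dt = np.diff(t)
    int_ct = np.concatenate(([0.0], np.cumsum(dt * (ct[1:] + ct[:-1]) / 2)))
    int_cref = np.concatenate(([0.0], np.cumsum(dt * (cref[1:] + cref[:-1]) / 2)))

    # Logan coordinates: the noisy ct divides BOTH axes, which is what
    # makes the ordinary least-squares slope biased at low SNR.
    late = t >= t_star
    x = int_cref[late] / ct[late]
    y = int_ct[late] / ct[late]
    slope, _ = np.polyfit(x, y, 1)
    return slope

# Illustrative curves: a reference region and a target with slower washout.
t = np.linspace(0.5, 90, 40)
cref = np.exp(-t / 20) - np.exp(-t / 2)
ct = 1.6 * (np.exp(-t / 30) - np.exp(-t / 2))
noisy_ct = ct + 0.01 * np.random.default_rng(0).standard_normal(t.size)
print("DVR (noise-free):", logan_dvr(t, ct, cref, t_star=30))
print("DVR (noisy ct):  ", logan_dvr(t, noisy_ct, cref, t_star=30))
```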

  17. How weeds emerge: a taxonomic and trait-based examination using United States data

    PubMed Central

    Kuester, Adam; Conner, Jeffrey K; Culley, Theresa; Baucom, Regina S

    2014-01-01

    Weeds can cause great economic and ecological harm to ecosystems. Despite their importance, comparisons of the taxonomy and traits of successful weeds often focus on a few specific comparisons – for example, introduced versus native weeds. We used publicly available inventories of US plant species to make comprehensive comparisons of the factors that underlie weediness. We quantitatively examined taxonomy to determine if certain genera are overrepresented by introduced, weedy or herbicide-resistant species, and we compared phenotypic traits of weeds to those of nonweeds, whether introduced or native. We uncovered genera that have more weeds and introduced species than expected by chance and plant families that have more herbicide-resistant species than expected by chance. Certain traits, generally related to fast reproduction, were more likely to be associated with weedy plants regardless of species’ origins. We also found stress tolerance traits associated with either native or introduced weeds compared with native or introduced nonweeds. Weeds and introduced species have significantly smaller genomes than nonweeds and native species. These results support trends for weedy plants reported from other floras, suggest that native and introduced weeds have different stress adaptations, and provide a comprehensive survey of trends across weeds within the USA. PMID:24494694

  18. Modeling of human operator dynamics in simple manual control utilizing time series analysis. [tracking (position)]

    NASA Technical Reports Server (NTRS)

    Agarwal, G. C.; Osafo-Charles, F.; Oneill, W. D.; Gottlieb, G. L.

    1982-01-01

    Time series analysis is applied to model human operator dynamics in pursuit and compensatory tracking modes. The normalized residual criterion is used as a one-step analytical tool to encompass the processes of identification, estimation, and diagnostic checking. A parameter constraining technique is introduced to develop more reliable models of human operator dynamics. The human operator is adequately modeled by a second order dynamic system both in pursuit and compensatory tracking modes. In comparing the data sampling rates, 100 msec between samples is adequate and is shown to provide better results than 200 msec sampling. The residual power spectrum and eigenvalue analysis show that the human operator is not a generator of periodic characteristics.

  19. Multi-trait analysis of genome-wide association summary statistics using MTAG.

    PubMed

    Turley, Patrick; Walters, Raymond K; Maghzian, Omeed; Okbay, Aysu; Lee, James J; Fontana, Mark Alan; Nguyen-Viet, Tuan Anh; Wedow, Robbee; Zacher, Meghan; Furlotte, Nicholas A; Magnusson, Patrik; Oskarsson, Sven; Johannesson, Magnus; Visscher, Peter M; Laibson, David; Cesarini, David; Neale, Benjamin M; Benjamin, Daniel J

    2018-02-01

    We introduce multi-trait analysis of GWAS (MTAG), a method for joint analysis of summary statistics from genome-wide association studies (GWAS) of different traits, possibly from overlapping samples. We apply MTAG to summary statistics for depressive symptoms (N_eff = 354,862), neuroticism (N = 168,105), and subjective well-being (N = 388,538). As compared to the 32, 9, and 13 genome-wide significant loci identified in the single-trait GWAS (most of which are themselves novel), MTAG increases the number of associated loci to 64, 37, and 49, respectively. Moreover, association statistics from MTAG yield more informative bioinformatics analyses and increase the variance explained by polygenic scores by approximately 25%, matching theoretical expectations.

  20. Influence of ECG sampling rate in fetal heart rate variability analysis.

    PubMed

    De Jonckheere, J; Garabedian, C; Charlier, P; Champion, C; Servan-Schreiber, E; Storme, L; Debarge, V; Jeanne, M; Logier, R

    2017-07-01

    Fetal hypoxia results in fetal blood acidosis (pH < 7.10). In such a situation, the fetus develops several adaptation mechanisms regulated by the autonomic nervous system. Many studies have demonstrated significant changes in heart rate variability in hypoxic fetuses, so fetal heart rate variability analysis could be valuable for fetal hypoxia prediction. Commonly used fetal heart rate variability analysis methods have been shown to be sensitive to the ECG signal sampling rate: a low sampling rate introduces variability into heartbeat detection, which in turn alters the heart rate variability estimate. In this paper, we introduce an original fetal heart rate variability analysis method that we hypothesize is less sensitive to changes in ECG sampling frequency than common methods. We then compared the results of this new method at two different sampling frequencies (250 Hz and 1000 Hz).
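
    To see why sampling rate matters, the sketch below quantizes a synthetic fetal beat train to two sampling grids and compares a common variability index. The beat statistics are invented, and the code illustrates only the quantization argument from the abstract, not the authors' new analysis method.

```python
import numpy as np

rng = np.random.default_rng(0)

def rr_intervals_ms(beat_times, fs):
    """Quantize beat times to the ECG sampling grid, then return RR intervals (ms).
    The grid spacing 1/fs bounds the timing resolution of beat detection."""
    quantized = np.round(beat_times * fs) / fs
    return np.diff(quantized) * 1000.0

# Synthetic fetal beat train: ~140 bpm with small beat-to-beat variability.
true_rr = 0.43 + 0.005 * rng.standard_normal(2000)        # seconds
beat_times = np.cumsum(true_rr)

for fs in (250, 1000):
    rr = rr_intervals_ms(beat_times, fs)
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))            # a common HRV index
    print(f"fs = {fs:4d} Hz -> RMSSD = {rmssd:.2f} ms")
```

    At 250 Hz each detected beat time can be off by up to 2 ms, and that jitter inflates short-term variability indices such as RMSSD relative to the 1000 Hz estimate.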

  1. hctsa: A Computational Framework for Automated Time-Series Phenotyping Using Massive Feature Extraction.

    PubMed

    Fulcher, Ben D; Jones, Nick S

    2017-11-22

    Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Meta-analysis of laparoscopic versus open repair of perforated peptic ulcer.

    PubMed

    Antoniou, Stavros A; Antoniou, George A; Koch, Oliver O; Pointner, Rudolph; Granderath, Frank A

    2013-01-01

    Laparoscopic treatment of perforated peptic ulcer (PPU) has been introduced as an alternative procedure to open surgery. It has been postulated that the minimally invasive approach involves less operative stress and results in decreased morbidity and mortality. We conducted a meta-analysis of randomized trials to test this hypothesis. Medline, EMBASE, and the Cochrane Central Register of Randomized Trials databases were searched, with no date or language restrictions. Our literature search identified 4 randomized trials, with a cumulative number of 289 patients, that compared the laparoscopic approach with open sutured repair of perforated ulcer. Analysis of outcomes did not favor either approach in terms of morbidity, mortality, and reoperation rate, although odds ratios seemed to consistently support the laparoscopic approach. Results did not determine the comparative efficiency and safety of laparoscopic or open approach for PPU. In view of an increased interest in the laparoscopic approach, further randomized trials are considered essential to determine the relative effectiveness of laparoscopic and open repair of PPU.

  3. Meta-analysis of Laparoscopic Versus Open Repair of Perforated Peptic Ulcer

    PubMed Central

    Antoniou, George A.; Koch, Oliver O.; Pointner, Rudolph; Granderath, Frank A.

    2013-01-01

    Background and Objectives: Laparoscopic treatment of perforated peptic ulcer (PPU) has been introduced as an alternative procedure to open surgery. It has been postulated that the minimally invasive approach involves less operative stress and results in decreased morbidity and mortality. Methods: We conducted a meta-analysis of randomized trials to test this hypothesis. Medline, EMBASE, and the Cochrane Central Register of Randomized Trials databases were searched, with no date or language restrictions. Results: Our literature search identified 4 randomized trials, with a cumulative number of 289 patients, that compared the laparoscopic approach with open sutured repair of perforated ulcer. Analysis of outcomes did not favor either approach in terms of morbidity, mortality, and reoperation rate, although odds ratios seemed to consistently support the laparoscopic approach. Results did not determine the comparative efficiency and safety of laparoscopic or open approach for PPU. Conclusion: In view of an increased interest in the laparoscopic approach, further randomized trials are considered essential to determine the relative effectiveness of laparoscopic and open repair of PPU. PMID:23743368

  4. Iterative inversion of deformation vector fields with feedback control.

    PubMed

    Dubey, Abhishek; Iliopoulos, Alexandros-Stavros; Sun, Xiaobai; Yin, Fang-Fang; Ren, Lei

    2018-05-14

    Often, the inverse deformation vector field (DVF) is needed together with the corresponding forward DVF in four-dimensional (4D) reconstruction and dose calculation, adaptive radiation therapy, and simultaneous deformable registration. This study aims at improving both accuracy and efficiency of iterative algorithms for DVF inversion, and advancing our understanding of divergence and latency conditions. We introduce a framework of fixed-point iteration algorithms with active feedback control for DVF inversion. Based on rigorous convergence analysis, we design control mechanisms for modulating the inverse consistency (IC) residual of the current iterate, to be used as feedback into the next iterate. The control is designed adaptively to the input DVF with the objective to enlarge the convergence area and expedite convergence. Three particular settings of feedback control are introduced: constant value over the domain throughout the iteration; alternating values between iteration steps; and spatially variant values. We also introduce three spectral measures of the displacement Jacobian for characterizing a DVF. These measures reveal the critical role of what we term the nontranslational displacement component (NTDC) of the DVF. We carry out inversion experiments with an analytical DVF pair, and with DVFs associated with thoracic CT images of six patients at end of expiration and end of inspiration. The NTDC-adaptive iterations are shown to attain a larger convergence region at a faster pace compared to previous nonadaptive DVF inversion iteration algorithms. In our numerical experiments, alternating control yields smaller IC residuals and inversion errors than constant control. Spatially variant control renders smaller residuals and errors by at least an order of magnitude, compared to other schemes, in no more than 10 steps. Inversion results also show remarkable quantitative agreement with analysis-based predictions. Our analysis captures properties of DVF data associated with clinical CT images, and provides new understanding of iterative DVF inversion algorithms with a simple residual feedback control. Adaptive control is necessary and highly effective in the presence of nonsmall NTDCs. The adaptive iterations or the spectral measures, or both, may potentially be incorporated into deformable image registration methods. © 2018 American Association of Physicists in Medicine.
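
    A one-dimensional sketch of the fixed-point iteration with constant residual feedback is given below, under an assumed smooth forward displacement; mu = 1 recovers the classic iteration, and the inverse-consistency residual printed at the end is the quantity the authors' control mechanisms modulate. This is a minimal illustration of the iteration scheme, not the paper's adaptive controllers.

```python
import numpy as np

def invert_dvf_1d(u, x, mu=0.5, n_iter=50):
    """Fixed-point inversion of a 1-D DVF with constant residual feedback.

    u  : callable, forward displacement u(x)
    x  : grid points at which the inverse displacement v is sought
    mu : feedback gain; mu = 1 recovers the classic fixed-point iteration
    """
    v = np.zeros_like(x)
    for _ in range(n_iter):
        # Inverse-consistency residual: zero iff x + v(x) maps back to x.
        r = v + u(x + v)
        v = v - mu * r
    return v

# Example: forward displacement u(x) = 0.4 * sin(x), a contraction.
x = np.linspace(0, 2 * np.pi, 200)
u = lambda s: 0.4 * np.sin(s)
v = invert_dvf_1d(u, x)
print("max |IC residual|:", np.abs(v + u(x + v)).max())
```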

  5. Parasites and marine invasions

    USGS Publications Warehouse

    Torchin, M.E.; Lafferty, K.D.; Kuris, A.M.

    2002-01-01

    Introduced marine species are a major environmental and economic problem. The rate of these biological invasions has substantially increased in recent years due to the globalization of the world's economies. The damage caused by invasive species is often a result of the higher densities and larger sizes they attain compared to where they are native. A prominent hypothesis explaining the success of introduced species is that they are relatively free of the effects of natural enemies. Most notably, they may encounter fewer parasites in their introduced range compared to their native range. Parasites are ubiquitous and pervasive in marine systems, yet their role in marine invasions is relatively unexplored. Although data on parasites of marine organisms exist, the extent to which parasites can mediate marine invasions, or the extent to which invasive parasites and pathogens are responsible for infecting or potentially decimating native marine species have not been examined. In this review, we present a theoretical framework to model invasion success and examine the evidence for a relationship between parasite presence and the success of introduced marine species. For this, we compare the prevalence and species richness of parasites in several introduced populations of marine species with populations where they are native. We also discuss the potential impacts of introduced marine parasites on native ecosystems.

  6. A cellular automata model of Ebola virus dynamics

    NASA Astrophysics Data System (ADS)

    Burkhead, Emily; Hawkins, Jane

    2015-11-01

    We construct a stochastic cellular automaton (SCA) model for the spread of the Ebola virus (EBOV). We make substantial modifications to an existing SCA model used for HIV, introduced by others and studied by the authors. We give a rigorous analysis of the similarities between models due to the spread of virus and the typical immune response to it, and the differences which reflect the drastically different timing of the course of EBOV. We demonstrate output from the model and compare it with clinical data.

  7. An internet graph model based on trade-off optimization

    NASA Astrophysics Data System (ADS)

    Alvarez-Hamelin, J. I.; Schabanel, N.

    2004-03-01

    This paper presents a new model for the Internet graph (AS graph) based on the concept of heuristic trade-off optimization, introduced by Fabrikant, Koutsoupias and Papadimitriou in [CITE] to grow a random tree with a heavily tailed degree distribution. We propose here a generalization of this approach to generate a general graph, as a candidate for modeling the Internet. We present the results of our simulations and an analysis of the standard parameters measured in our model, compared with measurements from the physical Internet graph.

  8. Ground coupled solar heat pumps: analysis of four options

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, J.W.

    Heat pump systems which utilize both solar energy and energy withdrawn from the ground are analyzed using a simplified procedure which optimizes the solar storage temperature on a monthly basis. Four ways of introducing collected solar energy to the system are optimized and compared. These include use of actively collected thermal input to the heat pump; use of collected solar energy to heat the load directly (two different ways); and use of a passive option to reduce the effective heating load.

  9. Introduction: health of the health care system in Korea.

    PubMed

    Kim, Dong Soo

    2010-03-01

    This study is a comprehensive evaluation of Korea's health care system as developed thus far. It reviews the historical context in which the system was developed and the political setting and motivation for that development. It highlights unique features of the system and offers some comparative analysis with other developed nations. It then introduces selected specific areas and aspects of the health care system, service delivery, and practices, and suggests implications for future directions.

  10. Compositional analysis of genetically modified corn events (NK603, MON88017×MON810 and MON89034×MON88017) compared to conventional corn.

    PubMed

    Rayan, Ahmed M; Abbott, Louise C

    2015-06-01

    Compositional analysis of genetically modified (GM) crops continues to be an important part of the overall evaluation in the safety assessment for these materials. The present study was designed to detect the genetic modifications and investigate the compositional analysis of GM corn containing traits of multiple genes (NK603, MON88017×MON810 and MON89034×MON88017) compared with non-GM corn. Values for most biochemical components assessed for the GM corn samples were similar to those of the non-GM control or were within the literature range. Significant increases were observed in protein, fat, fiber and fatty acids of the GM corn samples. The observed increases may be due to the synergistic effect of new traits introduced into corn varieties. Furthermore, SDS-PAGE analysis showed high similarity among the protein fractions of the investigated corn samples. These data indicate that GM corn samples were compositionally equivalent to, and as nutritious as, non-GM corn. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Percent area coverage through image analysis

    NASA Astrophysics Data System (ADS)

    Wong, Chung M.; Hong, Sung M.; Liu, De-Ling

    2016-09-01

    The notion of percent area coverage (PAC) has been used to characterize surface cleanliness levels in the spacecraft contamination control community. Due to the lack of detailed particle data, PAC has conventionally been calculated by multiplying the particle surface density in predetermined particle size bins by a set of coefficients per MIL-STD-1246C. In deriving the set of coefficients, the surface particle size distribution is assumed to follow a log-normal relation between particle density and particle size, while the cross-sectional area function is given as a combination of regular geometric shapes. For particles with irregular shapes, the cross-sectional area function cannot describe the true particle area and, therefore, may introduce error into the PAC calculation. Other errors may also be introduced by using the log-normal surface particle size distribution function, which depends strongly on the environmental cleanliness and the cleaning process. In this paper, we present PAC measurements from silicon witness wafers that collected fallout from a fabric material after vibration testing. PAC was calculated through analysis of microscope images and compared to values derived through the MIL-STD-1246C method. Our results showed that the MIL-STD-1246C method does provide a reasonable upper bound to the PAC values determined through image analysis, in particular for PAC values below 0.1.
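
    The two calculations being compared can be sketched as follows. The bin counts, area coefficients, and particle areas below are placeholders, not values from MIL-STD-1246C or from the paper's data; the point is only the contrast between a binned, shape-assuming estimate and a direct pixel-area sum.

```python
import numpy as np

# Standard-style estimate: counts per particle-size bin times an assumed
# cross-sectional-area coefficient per bin. All numbers are placeholders.
bin_counts = np.array([1200.0, 300.0, 60.0, 8.0])      # particles per bin
area_coeff = np.array([2e-9, 1.5e-8, 9e-8, 6e-7])      # fractional area each
pac_standard = 100.0 * np.sum(bin_counts * area_coeff)

# Image-analysis estimate: sum the measured cross-sectional areas of the
# segmented particles directly, so irregular shapes need no assumption.
particle_areas_um2 = np.array([40.0, 12.5, 210.0, 5.8, 33.1])  # from image
field_area_um2 = 1.0e8                                          # imaged area
pac_image = 100.0 * particle_areas_um2.sum() / field_area_um2

print(f"standard-based PAC: {pac_standard:.5f} %")
print(f"image-based PAC:    {pac_image:.6f} %")
```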

  12. A Procedure for Modeling Structural Component/Attachment Failure Using Transient Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Jegley, Dawn C. (Technical Monitor)

    2007-01-01

    Structures often comprise smaller substructures that are connected to each other or attached to the ground by a set of finite connections. Under static loading one or more of these connections may exceed allowable limits and be deemed to fail. Of particular interest is the structural response when a connection is severed (failed) while the structure is under static load. A transient failure analysis procedure was developed by which it is possible to examine the dynamic effects that result from introducing a discrete failure while a structure is under static load. The failure is introduced by replacing a connection load history by a time-dependent load set that removes the connection load at the time of failure. The subsequent transient response is examined to determine the importance of the dynamic effects by comparing the structural response with the appropriate allowables. Additionally, this procedure utilizes a standard finite element transient analysis that is readily available in most commercial software, permitting the study of dynamic failures without the need to purchase software specifically for this purpose. The procedure is developed and explained, demonstrated on a simple cantilever box example, and finally demonstrated on a real-world example, the American Airlines Flight 587 (AA587) vertical tail plane (VTP).

  13. Morphofunctional diversity of equine of varied genetic compositions raised in the Pantanal biome of Brazil.

    PubMed

    de Rezende, Marcos Paulo Gonçalves; de Souza, Julio Cesar; Carneiro, Paulo Luiz Souza; Bozzi, Riccardo; Jardim, Rodrigo Jose Delgado; Malhado, Carlos Henrique Mendes

    2018-06-01

    Evaluating phenotypic diversity makes it possible to identify discrepancies in aptitudes among animals of different genetic bases, which is an indicator of adaptive or selective differences between populations. The objective of this work was to evaluate the morphofunctional diversity of 452 male and female adult equines (Arabian, Quarter Mile, Pantaneiro, and Criollo breeds, and undefined crossbreeds of horses and mules) raised in the Pantanal biome (Brazil). Linear measurements were performed to estimate conformation indexes. Initially, a discriminant analysis was performed, regardless of the animal's size, followed by factor analysis. The factors were characterized and used as new variables. The diversity among equines and their relationship with the factors were evaluated using multivariate analysis. The factors were classified according to their decreasing importance: balance, rusticity, and robustness for the measurement factors; and load, ability, conformation, and equilibrium for the index factors. The genetic groups of equines have well-defined morphofunctional characteristics. The main differences are based on the rusticity and ability typologies in relation to those based on performance. Equines introduced to the Pantanal biome presented a more robust and compact body with good conformation. As a result, these horses may have superior athletic performance during equestrian activities when compared to the Pantaneiro local breed. However, this biotype may represent less rusticity (less adaptive capacity). Therefore, the regional breed can be equal or better in equestrian activities than breeds introduced to the Pantanal biome. Thus, breeders may cross horses from local breeds as an alternative to those introduced. Undefined crossbred male equines presented a different profile from the Pantaneiro breed, which may indicate little use of crossbreeds in breeding.

  14. Stochastic modelling of shifts in allele frequencies reveals a strongly polygynous mating system in the re-introduced Asiatic wild ass.

    PubMed

    Renan, Sharon; Greenbaum, Gili; Shahar, Naama; Templeton, Alan R; Bouskila, Amos; Bar-David, Shirli

    2015-04-01

    Small populations are prone to loss of genetic variation and hence to a reduction in their evolutionary potential. Therefore, studying the mating system of small populations and its potential effects on genetic drift and genetic diversity is of high importance for their viability assessments. The traditional method for studying genetic mating systems is paternity analysis. Yet, as small populations are often rare and elusive, the genetic data required for paternity analysis are frequently unavailable. The endangered Asiatic wild ass (Equus hemionus), like all equids, displays a behaviourally polygynous mating system; however, the level of polygyny has never been measured genetically in wild equids. Combining noninvasive genetic data with stochastic modelling of shifts in allele frequencies, we developed an alternative approach to paternity analysis for studying the genetic mating system of the re-introduced Asiatic wild ass in the Negev Desert, Israel. We compared the shifts in allele frequencies (as a measure of genetic drift) that have occurred in the wild ass population since re-introduction onset to simulated scenarios under different proportions of mating males. We revealed a strongly polygynous mating system in which less than 25% of all males participate in the mating process each generation. This strongly polygynous mating system and its potential effect on the re-introduced population's genetic diversity could have significant consequences for the long-term persistence of the population in the Negev. The stochastic modelling approach and the use of allele-frequency shifts can be further applied to systems that are affected by genetic drift and for which genetic data are limited. © 2015 John Wiley & Sons Ltd.
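
    The modeling logic can be illustrated with a small Wright-Fisher-style sketch, assuming a biallelic locus and the standard unequal-sex-ratio formula for effective population size; the population sizes and parameter values are invented, and this is not the authors' exact simulator. Stronger polygyny (fewer breeding males) shrinks the effective size and inflates the expected allele-frequency shift, which is the signature the study compares against the observed shifts.

```python
import numpy as np

rng = np.random.default_rng(42)

def mean_shift(p0, n_f, n_m, mating_fraction, generations, n_rep=1000):
    """Mean absolute allele-frequency shift when only a fraction of males mate.
    Fewer breeding males -> smaller effective size -> stronger drift."""
    shifts = np.empty(n_rep)
    for r in range(n_rep):
        p = p0
        for _ in range(generations):
            breeders = max(1, int(mating_fraction * n_m))
            ne = int(4 * breeders * n_f / (breeders + n_f))  # unequal sex ratio
            p = rng.binomial(2 * ne, p) / (2 * ne)           # binomial sampling
        shifts[r] = abs(p - p0)
    return shifts.mean()

for frac in (1.0, 0.5, 0.25, 0.1):
    print(f"{frac:4.2f} of males mating -> mean shift "
          f"{mean_shift(0.3, 40, 40, frac, generations=5):.3f}")
```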

  15. Stability analysis for a delay differential equations model of a hydraulic turbine speed governor

    NASA Astrophysics Data System (ADS)

    Halanay, Andrei; Safta, Carmen A.; Dragoi, Constantin; Piraianu, Vlad F.

    2017-01-01

    The paper aims to study the dynamic behavior of a speed governor for a hydraulic turbine using a mathematical model. The proposed nonlinear mathematical model consists of a system of delay differential equations (DDE), to be compared with already established mathematical models based on ordinary differential equations (ODE). A new kind of nonlinearity is introduced in the form of a time delay. The delays can characterize different running conditions of the speed governor; for example, the spool displacement of the hydraulic amplifier might be blocked by oil impurities in the supply system, so that the amplifier responds with a time delay relative to the control signal. Numerical simulations are presented in a comparative manner, and a stability analysis of the hydraulic control system is performed. The conclusions about the dynamic behavior drawn from the DDE model of a hydraulic turbine speed governor are useful in modeling and controlling hydropower plants.

  16. Dominant factor analysis of B-flow twinkling sign with phantom and simulation data.

    PubMed

    Lu, Weijia; Haider, Bruno

    2017-01-01

    The twinkling sign in B-flow imaging (BFI-TS) has been reported in the literature to increase both specificity and sensitivity compared to traditional gray-scale imaging. Unfortunately, there has been no conclusive study on the mechanism of this effect. In the study presented here, a comparative test on phantoms is introduced, where the variance of a phase estimator is used to quantify the motion amplitude. Statistical inference is then employed to find the dominant factor behind the twinkling sign, which is confirmed by computer simulation. The analysis confirms that tissue viscoelasticity is closely coupled with the twinkling sign. Moreover, the acoustic radiation force caused by tissue attenuation is found to be the trigger of the twinkling sign. Based on these findings, the BFI-TS is interpreted as tissue movement triggering vibration of microcalcification particles.

  17. A GIS-based approach for comparative analysis of potential fire risk assessment

    NASA Astrophysics Data System (ADS)

    Sun, Ying; Hu, Lieqiu; Liu, Huiping

    2007-06-01

    Urban fires are one of the most important sources of property loss and human casualties, so it is necessary to assess potential fire risk with urban community safety in mind. Two evaluation models are proposed, both integrated with GIS. One is a single-factor model concerning the accessibility of fire passages; the other is a grey clustering approach based on a multifactor system. In the latter model, fourteen factors are introduced and divided into four categories: security management, evacuation facilities, construction resistance, and fire-fighting capability. A case study on the campus of Beijing Normal University is presented to illustrate the potential risk assessment models in detail. A comparative analysis of the two models is carried out to validate their accuracy, and the results are approximately consistent with each other. Moreover, modeling with GIS improves the efficiency of the potential risk assessment.

  18. A new approach on the upgrade of energetic system based on green energy. A complex comparative analysis of the EEDI and EEOI

    NASA Astrophysics Data System (ADS)

    Faitar, C.; Novac, I.

    2016-08-01

    In recent years, many environmental organizations have been interested in optimizing energy consumption, which has become one of the main concerns worldwide. From this point of view, the maritime industry has striven to optimize ship fuel consumption through the development of engines and propulsion systems, improved hull design, and the use of alternative energies, thereby reducing the amount of CO2 released into the atmosphere. The main idea of this paper is to carry out a detailed comparative analysis of the Energy Efficiency Design Index and the Energy Efficiency Operational Indicator, calculated in two cases: first, in a classical approach for a crude oil supertanker, and second, after the energy performance of this ship has been improved by introducing alternative energy sources on board.

  19. A comparative uncertainty study of the calibration of macrolide antibiotic reference standards using quantitative nuclear magnetic resonance and mass balance methods.

    PubMed

    Liu, Shu-Yu; Hu, Chang-Qin

    2007-10-17

    This study introduces the general method of quantitative nuclear magnetic resonance (qNMR) for the calibration of reference standards of macrolide antibiotics. Several qNMR experimental conditions were optimized, including the delay, an important parameter for quantification. Three kinds of macrolide antibiotics were used to validate the accuracy of the qNMR method by comparison with results obtained by high-performance liquid chromatography (HPLC). The purities of five common reference standards of macrolide antibiotics were measured by the 1H qNMR method and the mass balance method, and the results of the two methods were compared. qNMR is quick and simple to use, and in new drug research and development it provides a new and reliable method for purity analysis of reference standards.

  20. [Multilevel Analysis in Health Services Research in Healthcare Organizations: Benefits, Requirements and Implementation].

    PubMed

    Ansmann, L; Kuhr, K; Kowalski, C

    2017-03-01

    Multilevel analysis (MLA) is still rarely used in Health Services Research in Germany, even though hierarchical data, e.g., from patients clustered in hospitals, are often present. MLA provides the valuable opportunity to study the health care context in health care organizations and the associations between context and health care outcomes. This article aims to introduce this particular method of data analysis, to discuss its benefits and its applicability, particularly for Health Services Research focusing on organizational characteristics, and to provide a concise guideline for performing the analysis. First, the benefits of and the necessity for MLA, compared with ordinary correlation analyses in the case of hierarchical data, are discussed. Furthermore, the statistical requirements and key decisions for performing MLA are illustrated. © Georg Thieme Verlag KG Stuttgart · New York.
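
    As a sketch of how such a model is specified in practice, the snippet below fits a random-intercept model with the statsmodels package on simulated patients-in-hospitals data. The variable names and effect sizes are invented for illustration, and the intraclass correlation computed at the end quantifies the clustering that motivates MLA over ordinary regression.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Simulated hierarchical data: patients (level 1) nested in hospitals (level 2).
n_hosp, n_per = 20, 30
hospital = np.repeat(np.arange(n_hosp), n_per)
hosp_effect = rng.normal(0, 0.5, n_hosp)[hospital]   # hospital-level context effect
x = rng.normal(size=n_hosp * n_per)                  # patient-level predictor
y = 2.0 + 0.8 * x + hosp_effect + rng.normal(size=n_hosp * n_per)

df = pd.DataFrame({"y": y, "x": x, "hospital": hospital})

# Random-intercept model: accounts for the clustering that an ordinary
# regression would ignore, giving appropriate standard errors.
result = smf.mixedlm("y ~ x", df, groups=df["hospital"]).fit()
print(result.summary())

# Intraclass correlation: share of variance attributable to hospitals.
var_hosp = result.cov_re.iloc[0, 0]
icc = var_hosp / (var_hosp + result.scale)
print(f"ICC = {icc:.2f}")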

  1. [Progress in the application of laser ablation ICP-MS to surface microanalysis in material science].

    PubMed

    Zhang, Yong; Jia, Yun-hai; Chen, Ji-wen; Shen, Xue-jing; Liu, Ying; Zhao, Leiz; Li, Dong-ling; Hang, Peng-cheng; Zhao, Zhen; Fan, Wan-lun; Wang, Hai-zhou

    2014-08-01

    In the present paper, the apparatus and theory of surface analysis are introduced, and progress in the application of laser ablation ICP-MS to microanalysis in the ferrous, nonferrous, and semiconductor fields is reviewed in detail. Compared with traditional surface analytical tools such as SEM/EDS (scanning electron microscopy/energy-dispersive spectroscopy), EPMA (electron probe microanalysis), and AES (Auger electron spectroscopy), its advantages are little or no sample preparation, spatial resolution adjustable to the analytical demand, multi-element analysis, and high sensitivity. It is now a powerful complementary method to traditional surface analytical tools. As LA-ICP-MS technology matures, more and more analysts will use this powerful tool, and LA-ICP-MS is poised to become a prominent technique in the elemental analysis field, much like LIBS (laser-induced breakdown spectroscopy).

  2. Structural Performance’s Optimally Analysing and Implementing Based on ANSYS Technology

    NASA Astrophysics Data System (ADS)

    Han, Na; Wang, Xuquan; Yue, Haifang; Sun, Jiandong; Wu, Yongchun

    2017-06-01

    Computer-aided engineering (CAE) is an active topic both in academia and in modern engineering practice. The ANSYS simulation software has, owing to its excellent performance, become an outstanding member of the CAE family; it is committed to innovation in engineering simulation, helping users shorten the design process and improve product innovation and performance. Aiming to explore a structural performance optimization analysis model for engineering enterprises, this paper introduces CAE and its development, discusses the need for structural optimization analysis and the framework for such analysis based on ANSYS technology, and uses ANSYS to carry out an optimization analysis of the structural performance of a reinforced concrete slab, displaying the displacement vector and stress intensity charts. Finally, the paper compares the ANSYS simulation results with measured results, showing that ANSYS is an indispensable engineering calculation tool.

  3. [Electroencephalogram Feature Selection Based on Correlation Coefficient Analysis].

    PubMed

    Zhou, Jinzhi; Tang, Xiaofang

    2015-08-01

    In order to improve classification accuracy with small amounts of motor imagery training data in the development of brain-computer interface (BCI) systems, we proposed an analysis method that automatically selects characteristic parameters based on correlation coefficient analysis. Using the five data samples of dataset IVa from the 2005 BCI Competition, we applied the short-time Fourier transform (STFT) and correlation coefficient calculation to reduce the dimensionality of the raw electroencephalogram data, then performed feature extraction based on common spatial patterns (CSP) and classified the features with linear discriminant analysis (LDA). Simulation results showed that the average classification accuracy was higher with the correlation coefficient feature selection method than without it. Compared with a support vector machine (SVM) feature optimization algorithm, correlation coefficient analysis leads to better parameter selection and improves classification accuracy.
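
    A simplified stand-in for this pipeline (band power in place of the STFT features, and omitting the CSP stage to stay self-contained) can be sketched as follows; the synthetic data, channel count, and band limits are assumptions, not the paper's setup.

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)

# Synthetic motor-imagery-like data: trials x channels x samples,
# with class-dependent power on a few "informative" channels.
n_trials, n_ch, n_samp, fs = 100, 16, 512, 256
y = rng.integers(0, 2, n_trials)
X = rng.standard_normal((n_trials, n_ch, n_samp))
X[y == 1, :3] *= 1.5   # class 1 has higher power on channels 0-2

# Feature: mu/beta band power (8-30 Hz) per channel.
f, pxx = welch(X, fs=fs, nperseg=256, axis=-1)
band = (f >= 8) & (f <= 30)
power = np.log(pxx[..., band].mean(axis=-1))          # trials x channels

# Correlation-based selection: keep the channels whose feature correlates
# most strongly with the class label. (For an unbiased accuracy estimate,
# this selection should be nested inside the cross-validation loop.)
corr = np.array([abs(np.corrcoef(power[:, c], y)[0, 1]) for c in range(n_ch)])
selected = np.argsort(corr)[::-1][:4]

scores = cross_val_score(LinearDiscriminantAnalysis(), power[:, selected], y, cv=5)
print("CV accuracy:", scores.mean())
```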

  4. Change in practice: a qualitative exploration of midwives' and doctors' views about the introduction of STan monitoring in an Australian hospital.

    PubMed

    Mayes, M E; Wilkinson, C; Kuah, S; Matthews, G; Turnbull, D

    2018-02-17

    The present study examines the introduction of an innovation in intrapartum foetal monitoring practice in Australia. ST-Analysis (STan) is a technology that adds information to conventional fetal monitoring (cardiotocography) during labour, with the aim of reducing unnecessary obstetric intervention. Adoption of this technology has been controversial amongst obstetricians and midwives, particularly as its use necessitates a more invasive means of monitoring (a scalp clip), compared to external monitoring from cardiotocography alone. If adoption of this technology is going to be successful, then understanding staff opinions about the implementation of STan in an Australian setting is an important issue for maternity care providers and policy makers. Using a maximum variation purposive sampling method, 18 interviews were conducted with 10 midwives and 8 doctors from the Women's and Children's Hospital, South Australia to explore views about the introduction of the new technology. The data were analysed using Framework Analysis. Midwives and doctors indicated four important areas of consideration when introducing STan: 1) philosophy of care; 2) the implementation process including training and education; 3) the existence of research evidence; and 4) attitudes towards the new technology. Views were expressed about the management of change process, the fit of the new technology within the current models of care, the need for ongoing training and the importance of having local evidence. These findings, coupled with the general literature about introducing innovation and change, can be used by other centres looking to introduce STan technology.

  5. Joint Sequence Analysis: Association and Clustering

    ERIC Educational Resources Information Center

    Piccarreta, Raffaella

    2017-01-01

    In its standard formulation, sequence analysis aims at finding typical patterns in a set of life courses represented as sequences. Recently, some proposals have been introduced to jointly analyze sequences defined on different domains (e.g., work career, partnership, and parental histories). We introduce measures to evaluate whether a set of…

  6. Unbalance Response Analysis and Experimental Validation of an Ultra High Speed Motor-Generator for Microturbine Generators Considering Balancing

    PubMed Central

    Hong, Do-Kwan; Joo, Dae-Suk; Woo, Byung-Chul; Koo, Dae-Hyun; Ahn, Chan-Woo

    2014-01-01

    The objective of the present study was to deal with the rotordynamics of the rotor of an ultra-high speed PM type synchronous motor-generator for a 500 W rated micro gas turbine generator. This paper presents dynamic analysis and experiments on the motor-generator. The focus is placed on an analytical approach to the mechanical dynamics problems; dealing with dynamic stability is essential at ultra-high speeds. Unbalance response analysis is performed by calculating the unbalance with and without balancing using a balancing machine. Critical speed analysis is performed to determine the operating speed with sufficient separation margin. The unbalance response analysis is compared with the experimental results considering the balancing grade (ISO 1940-1) and predicted vibration displacement with and without balancing. Based on these results, a high-speed motor-generator was successfully developed. PMID:25177804

  7. Three dimensional, numerical analysis of an elasto hydrodynamic lubrication using fluid structure interaction (FSI) approach

    NASA Astrophysics Data System (ADS)

    Hanoca, P.; Ramakrishna, H. V.

    2018-03-01

    This work develops a methodology to model and simulate TEHD lubrication using the sequential application of CFD and CSD. The FSI analyses are carried out using ANSYS Workbench. In this analysis, the steady-state, 3D Navier-Stokes equations are solved together with the energy equation. Liquid properties are introduced in which viscosity and density are functions of pressure and temperature, and the cavitation phenomenon is included in the analysis. Numerical analyses were carried out at different speeds and surface temperatures. It was found that as speed increases, hydrodynamic pressures also increase. The pressure profile obtained from the Roelands equation is more sensitive to temperature than that from the Barus equation. The stress distributions identify the critical locations in the bearing structure. The developed method is capable of giving new insight into the physics of elastohydrodynamic lubrication.

  8. How weeds emerge: a taxonomic and trait-based examination using United States data.

    PubMed

    Kuester, Adam; Conner, Jeffrey K; Culley, Theresa; Baucom, Regina S

    2014-05-01

    Weeds can cause great economic and ecological harm to ecosystems. Despite their importance, comparisons of the taxonomy and traits of successful weeds often focus on a few specific comparisons - for example, introduced versus native weeds. We used publicly available inventories of US plant species to make comprehensive comparisons of the factors that underlie weediness. We quantitatively examined taxonomy to determine if certain genera are overrepresented by introduced, weedy or herbicide-resistant species, and we compared phenotypic traits of weeds to those of nonweeds, whether introduced or native. We uncovered genera that have more weeds and introduced species than expected by chance and plant families that have more herbicide-resistant species than expected by chance. Certain traits, generally related to fast reproduction, were more likely to be associated with weedy plants regardless of species' origins. We also found stress tolerance traits associated with either native or introduced weeds compared with native or introduced nonweeds. Weeds and introduced species have significantly smaller genomes than nonweeds and native species. These results support trends for weedy plants reported from other floras, suggest that native and introduced weeds have different stress adaptations, and provide a comprehensive survey of trends across weeds within the USA. © 2014 The Authors. New Phytologist © 2014 New Phytologist Trust.

  9. The correction of aberrations computed in the aperture plane of multifrequency microwave radiometer antennas

    NASA Technical Reports Server (NTRS)

    Schmidt, R. F.

    1984-01-01

    An analytical/numerical approach to identifying and correcting the aberrations introduced by a general displacement of the feed from the focal point of a single offset paraboloid antenna used in deployable radiometer systems is developed. A 15 meter reflector with 18 meter focal length is assumed for the analysis, which considers far field radiation pattern quality, focal region fields, and aberrations appearing in the aperture plane. The latter are obtained by ray tracing in the transmit mode and are expressed in terms of optical notation. Attention is given to the physical restraints imposed on corrective elements by real microwave systems and to the intermediate near field aspects of the problem in three dimensions. The subject of wave fronts and caustics in the receive mode is introduced for comparative purposes. Several specific examples are given for aberration reduction at eight beamwidths of scan at a frequency of 1.414 GHz.

  10. Immersed boundary lattice Boltzmann model based on multiple relaxation times

    NASA Astrophysics Data System (ADS)

    Lu, Jianhua; Han, Haifeng; Shi, Baochang; Guo, Zhaoli

    2012-01-01

    As an alternative version of the lattice Boltzmann models, the multiple relaxation time (MRT) lattice Boltzmann model introduces much less numerical boundary slip than the single relaxation time (SRT) lattice Boltzmann model if a special relationship between the relaxation time parameters is chosen. On the other hand, most current versions of the immersed boundary lattice Boltzmann method, first introduced by Feng and improved by many other authors, suffer from numerical boundary slip, as investigated by Le and Zhang. To reduce this numerical boundary slip, an immersed boundary lattice Boltzmann model based on multiple relaxation times is proposed in this paper. A special formula relating two of the model's relaxation time parameters is given. Rigorous analysis and numerical experiments show that numerical boundary slip is reduced dramatically by the present model compared to the single-relaxation-time-based model.

  11. The Guderley problem revisited

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsey, Scott D; Kamm, James R; Bolstad, John H

    2009-01-01

    The self-similar converging-diverging shock wave problem introduced by Guderley in 1942 has been the source of numerous investigations since its publication. In this paper, we review the simplifications and group invariance properties that lead to a self-similar formulation of this problem from the compressible flow equations for a polytropic gas. The complete solution to the self-similar problem reduces to two coupled nonlinear eigenvalue problems: the eigenvalue of the first is the so-called similarity exponent for the converging flow, and that of the second is a trajectory multiplier for the diverging regime. We provide a clear exposition concerning the reflected shock configuration. Additionally, we introduce a new approximation for the similarity exponent, which we compare with other estimates and numerically computed values. Lastly, we use the Guderley problem as the basis of a quantitative verification analysis of a cell-centered, finite volume, Eulerian compressible flow algorithm.

  12. Solid wastes integrated management in Rio de Janeiro: input-output analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pimenteira, C.A.P.; Carpio, L.G.T.; Rosa, L.P.

    2005-07-01

    This paper analyzes the socioeconomic aspects of solid waste management in Rio de Janeiro. An input-output methodology was used to examine how the secondary products resulting from recycling are re-introduced into the productive process. A comparative profile was developed from the state of recycling and the various other aspects of solid waste management, both from the perspective of economic feasibility and from the social aspects involved. This was done by analyzing greenhouse gas emissions and decreased energy consumption. The effects of re-introducing recycled raw materials into the matrix and the ensuing reduction in the demand for virgin raw materials were assessed using the input-output matrix for the State of Rio de Janeiro. The paper also analyzes the energy savings obtained from recycling and measures the avoided emissions of greenhouse gases.
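
    The input-output logic can be sketched in a few lines. This is a minimal, hypothetical three-sector illustration of the Leontief quantity model x = (I - A)^{-1} f, not the actual Rio de Janeiro matrix; the coefficients are invented purely to show how shifting an input requirement from the virgin-material sector to a recycling sector propagates through total output.

```python
import numpy as np

# Illustrative 3-sector technical-coefficient matrix A (entry A[i, j] is
# the input from sector i needed per unit output of sector j).
# Sectors: 0 = virgin raw materials, 1 = recycling, 2 = manufacturing.
A = np.array([
    [0.05, 0.00, 0.30],
    [0.00, 0.02, 0.05],
    [0.10, 0.15, 0.20],
])
final_demand = np.array([10.0, 5.0, 100.0])

# Leontief solution: total output x satisfies x = A x + f.
x = np.linalg.solve(np.eye(3) - A, final_demand)
print("total output:", x)

# Re-introducing recyclables: shift part of manufacturing's raw-material
# requirement from the virgin sector to the recycling sector.
A_rec = A.copy()
A_rec[0, 2] -= 0.10
A_rec[1, 2] += 0.10
x_rec = np.linalg.solve(np.eye(3) - A_rec, final_demand)
print("output with recycling:", x_rec)
print("drop in virgin raw-material output:", x[0] - x_rec[0])
```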

  13. One-way ANOVA based on interval information

    NASA Astrophysics Data System (ADS)

    Hesamian, Gholamreza

    2016-08-01

    This paper extends one-way analysis of variance (ANOVA) to the case where the observed data are represented by closed intervals rather than real numbers. In this approach, a notion of interval random variable is first introduced. In particular, a normal distribution with interval parameters is introduced to investigate hypotheses about the equality of interval means or to test the homogeneity-of-interval-variances assumption. Moreover, the least significant difference (LSD) method for multiple comparisons of interval means is developed for when the null hypothesis of equal means is rejected. Then, at a given interval significance level, an index is applied to compare the interval test statistic with the related interval critical value as a criterion for accepting or rejecting the null interval hypothesis of interest. The decision-making method thus yields degrees of acceptance or rejection of the interval hypotheses. An applied example is used to show the performance of the method.
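
    To give the flavor of interval-valued ANOVA, the crude Monte Carlo sketch below samples point realizations inside each interval observation and records the range of the classical F statistic over those realizations. This is an illustrative stand-in, not the paper's formal interval test statistic or index; the data are invented.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(3)

# Interval observations per group: each row is [lower, upper].
groups = [
    np.array([[4.1, 4.9], [5.0, 5.6], [4.4, 5.2], [4.8, 5.5]]),
    np.array([[5.2, 6.0], [5.5, 6.3], [4.9, 5.8], [5.6, 6.4]]),
    np.array([[4.0, 4.7], [4.3, 5.1], [3.9, 4.6], [4.5, 5.3]]),
]

# Monte Carlo: draw point realizations inside each interval and collect
# the classical F statistic, yielding an interval of plausible F values.
f_vals = []
for _ in range(2000):
    samples = [g[:, 0] + rng.random(len(g)) * (g[:, 1] - g[:, 0]) for g in groups]
    f_vals.append(f_oneway(*samples).statistic)

print(f"F statistic ranges over [{min(f_vals):.2f}, {max(f_vals):.2f}]")
```

    If the whole F range falls on one side of the critical value, the interval data support a clear accept/reject decision; when the range straddles it, one obtains only a partial degree of acceptance or rejection, mirroring the graded decisions described in the abstract.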

  14. Dibenzo[a,f]perylene bisimide: effects of introducing two fused rings.

    PubMed

    Chaolumen; Enno, Hiroki; Murata, Michihisa; Wakamiya, Atsushi; Murata, Yasujiro

    2014-11-01

    Perylene bisimides (PBIs) are fascinating dyes with various potential applications. To study the effects of introducing a dibenzo-fused structure to the perylene moiety, π-extended PBI derivatives with a dibenzo-fused structure at both of the a and f bonds were synthesized. The twisted structure was characterized by X-ray crystal structure analysis. In the cyclic voltammograms, the dibenzo[a,f]-fused PBI showed a reversible oxidation wave at much less positive potential, relative to a dibenzo[a,o]-fused PBI derivative. These data indicated that two ring fusions at both sides of a naphthalene moiety, which construct a tetracene core, effectively raise the HOMO level compared to fusion of one ring at each naphthalene moiety (two anthracene cores). The dibenzo[a,f]-fused PBI derivative showed an absorption band at 735 nm with a shoulder band reaching 900 nm. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Mobile free-space optical communications: a feasibility study of various battlefield scenarios

    NASA Astrophysics Data System (ADS)

    Harris, Alan; Al-Akkoumi, Mouhammad K.; Sluss, James J., Jr.

    2012-06-01

    Free Space Optics (FSO) technology was originally envisioned to be a viable solution for the provision of high bandwidth optical connectivity in the last mile of today's telecommunications infrastructure. Due to atmospheric limitations inherent to FSO technology, FSO is now widely envisioned as a solution for the provision of high bandwidth, temporary mobile communications links. The need for FSO communications links will increase as mobility is introduced to this technology. In this paper, a theoretical solution for adding mobility to FSO communication links is introduced. Three-dimensional power estimation studies are presented to represent mobile FSO transmission under various weather conditions. Three wavelengths, 0.85, 1.55, and 10 μm, are tested and compared to illustrate the pros and cons of each source wavelength used for transmission, depending on prevalent weather conditions and atmospheric turbulence conditions. A simulation analysis of the transmission properties of the source wavelengths used in the study is shown.

  16. Theoretical model for thin ferroelectric films and the multilayer structures based on them

    NASA Astrophysics Data System (ADS)

    Starkov, A. S.; Pakhomov, O. V.; Starkov, I. A.

    2013-06-01

    A modified Weiss mean-field theory is used to study the dependence of the properties of a thin ferroelectric film on its thickness. The possibility of introducing gradient terms into the thermodynamic potential is analyzed using the calculus of variations. An integral equation is introduced to generalize the well-known Langevin equation to the case of the boundaries of a ferroelectric. Analysis of this equation establishes the existence of a transition layer at the interface between two ferroelectrics or between a ferroelectric and a dielectric. The permittivity of this layer is shown to depend on the electric field direction even if the ferroelectrics in contact are homogeneous. The results obtained in terms of the Weiss model are compared with the results of models based on the correlation effect and on the presence of a dielectric layer at the boundary of a ferroelectric, and with experimental data.

  17. How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray computed tomography

    PubMed Central

    Jørgensen, J. S.; Sidky, E. Y.

    2015-01-01

    We introduce phase-diagram analysis, a standard tool in compressed sensing (CS), to the X-ray computed tomography (CT) community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In CS, a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling. First, we demonstrate that there are cases where X-ray CT empirically performs comparably with a near-optimal CS strategy, namely taking measurements with Gaussian sensing matrices. Second, we show that, in contrast to what might have been anticipated, taking randomized CT measurements does not lead to improved performance compared with standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means of total-variation regularization. PMID:25939620

  18. How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray computed tomography.

    PubMed

    Jørgensen, J S; Sidky, E Y

    2015-06-13

    We introduce phase-diagram analysis, a standard tool in compressed sensing (CS), to the X-ray computed tomography (CT) community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In CS, a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling. First, we demonstrate that there are cases where X-ray CT empirically performs comparably with a near-optimal CS strategy, namely taking measurements with Gaussian sensing matrices. Second, we show that, in contrast to what might have been anticipated, taking randomized CT measurements does not lead to improved performance compared with standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means of total-variation regularization.
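
    The generic CS experiment being imported can be sketched directly: sweep a grid of sampling and sparsity fractions, draw a Gaussian sensing matrix, recover by basis pursuit (an equivalent linear program), and record the empirical success rate. This is a minimal sketch of the standard tool, not of the authors' X-ray CT adaptation; grid values and tolerances are arbitrary.

    ```python
    # Empirical CS phase-diagram sweep with Gaussian sensing matrices.
    import numpy as np
    from scipy.optimize import linprog

    def basis_pursuit(A, b):
        """min ||x||_1 s.t. Ax = b, via the standard LP split x = u - v."""
        m, n = A.shape
        c = np.ones(2 * n)
        A_eq = np.hstack([A, -A])
        res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
        return res.x[:n] - res.x[n:]

    rng = np.random.default_rng(0)
    n, trials = 60, 5
    for delta in (0.3, 0.5, 0.7):          # sampling fraction m / n
        for rho in (0.1, 0.3, 0.5):        # sparsity fraction k / m
            m = int(delta * n); k = max(1, int(rho * m))
            ok = 0
            for _ in range(trials):
                x0 = np.zeros(n)
                x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
                A = rng.standard_normal((m, n)) / np.sqrt(m)
                xh = basis_pursuit(A, A @ x0)
                ok += np.linalg.norm(xh - x0) < 1e-4
            print(f"delta={delta:.1f} rho={rho:.1f} success={ok}/{trials}")
    ```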

  19. Genome Information Broker (GIB): data retrieval and comparative analysis system for completed microbial genomes and more

    PubMed Central

    Fumoto, Masaki; Miyazaki, Satoru; Sugawara, Hideaki

    2002-01-01

    Genome Information Broker (GIB) is a powerful tool for the study of comparative genomics. GIB allows users to retrieve and display partial and/or whole genome sequences together with the relevant biological annotation. GIB has accumulated all the completed microbial genomes and has recently been expanded to include Arabidopsis thaliana genome data from DDBJ/EMBL/GenBank. In the near future, hundreds of genome sequences will be determined. In order to handle such huge volumes of data, we have enhanced the GIB architecture by using XML, CORBA and distributed RDBs. We introduce the new GIB here. GIB is freely accessible at http://gib.genes.nig.ac.jp/. PMID:11752256

  20. Comparative Psychology as an Effective Supplement to Undergraduate Core Psychology Courses

    ERIC Educational Resources Information Center

    Thomas, Nathaniel R.

    2009-01-01

    This article describes the design and implementation of a 1-credit-hour seminar in comparative psychology as a supplement to an introductory biopsychology course. The purpose of the course was to introduce students to the ecological and evolutionary aspects of animal behavior by building on topics that are introduced in many biopsychology courses.…

  1. A Comparative Study of Sequence of Instruction When Introducing Golf Skills to Beginners.

    ERIC Educational Resources Information Center

    Kraft, Robert E.

    Three instructional methods of club sequence for introducing golf skills to beginning golfers were compared: (1) full swing; (2) putter and short approach; and (3) freedom of choice. Sixty-eight male and female college students participated in golf lessons twice weekly for 12 weeks, receiving small group and individual instruction. Two forms of…

  2. Super-delta: a new differential gene expression analysis procedure with robust data normalization.

    PubMed

    Liu, Yuhang; Zhang, Jinfeng; Qiu, Xing

    2017-12-21

    Normalization is an important data preparation step in gene expression analyses, designed to remove various systematic noise. Sample variance is greatly reduced after normalization, hence the power of subsequent statistical analyses is likely to increase. On the other hand, variance reduction is made possible by borrowing information across all genes, including differentially expressed genes (DEGs) and outliers, which inevitably introduces some bias. This bias typically inflates type I error and can reduce statistical power in certain situations. In this study we propose a new differential expression analysis pipeline, dubbed super-delta, that consists of a multivariate extension of global normalization and a modified t-test. A robust procedure is designed to minimize the bias introduced by DEGs in the normalization step. The modified t-test is derived from asymptotic theory for hypothesis testing that suitably pairs with the proposed robust normalization. We first compared super-delta with four commonly used normalization methods: global, median-IQR, quantile, and cyclic loess normalization, in simulation studies. Super-delta was shown to have better statistical power with tighter control of the type I error rate than its competitors. In many cases, the performance of super-delta is close to that of an oracle test in which datasets without technical noise were used. We then applied all methods to a collection of gene expression datasets on breast cancer patients who received neoadjuvant chemotherapy. While there is a substantial overlap among the DEGs identified by all of them, super-delta was able to identify comparatively more DEGs than its competitors. Downstream gene set enrichment analysis confirmed that all these methods selected largely consistent pathways. Detailed investigation of the relatively small differences showed that the pathways identified by super-delta have better connections to breast cancer than those from the other methods. As a new pipeline, super-delta provides new insights into the area of differential gene expression analysis. A solid theoretical foundation supports its asymptotic unbiasedness and technical noise-free properties. Implementation on real and simulated datasets demonstrates its decent performance compared with state-of-the-art procedures. It also has the potential to be extended to other data types and/or more general between-group comparison problems.
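
    A minimal sketch of the underlying idea, assuming nothing beyond NumPy/SciPy (a simplified stand-in for illustration, not the published super-delta algorithm or its modified t-test): normalize each sample with a trimmed mean so that DEGs and outliers bias the normalizer less, then run per-gene tests.

    ```python
    # Robust global normalization followed by per-gene t-tests (toy data).
    import numpy as np
    from scipy import stats

    def trimmed_global_normalize(X, trim=0.1):
        """X: genes x samples. Subtract a per-sample trimmed mean so that
        extreme genes (DEGs, outliers) contribute less to the shift."""
        shifts = stats.trim_mean(X, proportiontocut=trim, axis=0)
        return X - shifts

    rng = np.random.default_rng(1)
    genes, n = 500, 10
    X = rng.normal(0, 1, (genes, 2 * n))
    X[:25, n:] += 2.0                     # 25 up-regulated DEGs in group 2
    X += rng.normal(0, 0.5, (1, 2 * n))   # per-sample technical shifts

    Xn = trimmed_global_normalize(X)
    t, p = stats.ttest_ind(Xn[:, :n], Xn[:, n:], axis=1)
    print((p[:25] < 0.01).sum(), "of 25 DEGs detected at p < 0.01")
    ```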

  3. Evaluation of a 3D local multiresolution algorithm for the correction of partial volume effects in positron emission tomography.

    PubMed

    Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris

    2011-09-01

    Partial volume effects (PVEs) are consequences of the limited spatial resolution in emission tomography leading to underestimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multiresolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low-resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model, which may introduce artifacts in regions where no significant correlation exists between anatomical and functional details. A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present, the new model outperformed the 2D global approach, avoiding artifacts and significantly improving quality of the corrected images and their quantitative accuracy. A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multiresolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information.

  4. Evaluation of a 3D local multiresolution algorithm for the correction of partial volume effects in positron emission tomography

    PubMed Central

    Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E.; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris

    2011-01-01

    Purpose Partial volume effects (PVE) are consequences of the limited spatial resolution in emission tomography leading to under-estimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multi-resolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model which may introduce artefacts in regions where no significant correlation exists between anatomical and functional details. Methods A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Results Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present the new model outperformed the 2D global approach, avoiding artefacts and significantly improving quality of the corrected images and their quantitative accuracy. Conclusions A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multi-resolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information. PMID:21978037
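
    The 2-D global correlation model that both records describe as the starting point can be sketched as a wavelet detail transfer, assuming the PyWavelets package; the papers' contribution, a 3-D local analysis, would replace the single global coefficient per subband with locally estimated ones.

    ```python
    # Simplified 2-D global wavelet detail transfer (toy illustration).
    import numpy as np
    import pywt

    def mma_global_2d(pet, anat, wavelet="db2", level=2):
        cp = pywt.wavedec2(pet, wavelet, level=level)
        ca = pywt.wavedec2(anat, wavelet, level=level)
        out = [cp[0]]                        # keep the PET approximation
        for dp, da in zip(cp[1:], ca[1:]):   # per-level detail subbands
            bands = []
            for p_band, a_band in zip(dp, da):
                denom = (a_band * a_band).sum()
                # single least-squares coefficient = global correlation model
                alpha = (p_band * a_band).sum() / denom if denom else 0.0
                bands.append(alpha * a_band)
            out.append(tuple(bands))
        return pywt.waverec2(out, wavelet)

    pet = np.random.rand(64, 64)
    anat = pet + 0.3 * np.random.rand(64, 64)  # correlated anatomy (toy)
    print(mma_global_2d(pet, anat).shape)
    ```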

  5. Anisotropy of anomalous diffusion improves the accuracy of differentiating low- and high-grade cerebral gliomas.

    PubMed

    Xu, Boyan; Su, Lu; Wang, Zhenxiong; Fan, Yang; Gong, Gaolang; Zhu, Wenzhen; Gao, Peiyi; Gao, Jia-Hong

    2018-04-17

    The anomalous diffusion model has been introduced and shown to be beneficial in clinical applications. However, only the directionally averaged values of anomalous diffusion parameters have been investigated, and the anisotropy of anomalous diffusion remains unexplored. The aim of this study was to demonstrate the feasibility of using the anisotropy of anomalous diffusion for differentiating low- and high-grade cerebral gliomas. Diffusion MRI images were acquired from brain tumor patients and analyzed using the fractional motion (FM) model. Twenty-two patients with histopathologically confirmed gliomas were selected. An anisotropy metric for the FM-related parameters, including the Noah exponent (α) and the Hurst exponent (H), was introduced, and the values were statistically compared between the low- and high-grade gliomas. Additionally, multivariate logistic regression analysis was performed to assess the combination of the anisotropy metric and the directionally averaged value for each parameter. The diagnostic performances for grading gliomas were evaluated using receiver operating characteristic (ROC) analysis. The Hurst exponent H was more anisotropic in high-grade than in low-grade gliomas (P = 0.015), while no significant difference was observed for the anisotropy of α. The ROC analysis revealed larger areas under the ROC curve for the combined α (AUC = 1) and the combined H (AUC = 0.813) than for the directionally averaged α (AUC = 0.979) and H (AUC = 0.594), indicating improved performance for tumor differentiation. The anisotropy of anomalous diffusion can provide distinctive information and benefit the differentiation of low- and high-grade gliomas. The use of anisotropic anomalous diffusion may also improve the investigation of pathological changes in tissues. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Analysis of Slug Tests in Formations of High Hydraulic Conductivity

    USGS Publications Warehouse

    Butler, J.J.; Garnett, E.J.; Healey, J.M.

    2003-01-01

    A new procedure is presented for the analysis of slug tests performed in partially penetrating wells in formations of high hydraulic conductivity. This approach is a simple, spreadsheet-based implementation of existing models that can be used for analysis of tests from confined or unconfined aquifers. Field examples of tests exhibiting oscillatory and nonoscillatory behavior are used to illustrate the procedure and to compare results with estimates obtained using alternative approaches. The procedure is considerably simpler than recently proposed methods for this hydrogeologic setting. Although the simplifications required by the approach can introduce error into hydraulic-conductivity estimates, this additional error becomes negligible when appropriate measures are taken in the field. These measures are summarized in a set of practical field guidelines for slug tests in highly permeable aquifers.

  7. A time-frequency approach for the analysis of normal and arrhythmia cardiac signals.

    PubMed

    Mahmoud, Seedahmed S; Fang, Qiang; Davidović, Dragomir M; Cosic, Irena

    2006-01-01

    Previously, electrocardiogram (ECG) signals have been analyzed in either a time-indexed or a spectral form. In reality, the ECG and all other biological signals belong to the family of multicomponent nonstationary signals, for which the use of time-frequency analysis can be unavoidable. The Husimi and Wigner distributions are normally used in quantum mechanics for phase-space representations of the wavefunction. In this paper, we introduce the Husimi distribution (HD) to analyze normal and abnormal ECG signals in the time-frequency domain. The abnormal cardiac signal was taken from a patient with supraventricular arrhythmia. Simulation results show that the HD performs well in the analysis of ECG signals compared with the Wigner-Ville distribution (WVD).
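
    For a 1-D signal, the Husimi distribution coincides, up to normalization, with the squared magnitude of a Gaussian-windowed short-time Fourier transform, which is why it is nonnegative while the Wigner-Ville distribution is not. A hedged sketch on a synthetic two-tone stand-in for an ECG trace (all parameters arbitrary):

    ```python
    # Husimi-style time-frequency map as |Gaussian-window STFT|^2.
    import numpy as np
    from scipy.signal import stft

    fs = 360.0                               # Hz, a common ECG sampling rate
    t = np.arange(0, 10, 1 / fs)
    x = np.sin(2 * np.pi * 1.2 * t) + 0.4 * np.sin(2 * np.pi * 15 * t)

    f, tt, Z = stft(x, fs=fs, window=("gaussian", 32),
                    nperseg=256, noverlap=192)
    husimi = np.abs(Z) ** 2                  # nonnegative by construction
    print(husimi.shape, f[husimi.mean(axis=1).argmax()], "Hz dominant")
    ```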

  8. Meta-analyzing dependent correlations: an SPSS macro and an R script.

    PubMed

    Cheung, Shu Fai; Chan, Darius K-S

    2014-06-01

    The presence of dependent correlation is a common problem in meta-analysis. Cheung and Chan (2004, 2008) have shown that samplewise-adjusted procedures perform better than the more commonly adopted simple within-sample mean procedures. However, samplewise-adjusted procedures have rarely been applied in meta-analytic reviews, probably due to the lack of suitable ready-to-use programs. In this article, we compare the samplewise-adjusted procedures with existing procedures to handle dependent effect sizes, and present the samplewise-adjusted procedures in a way that will make them more accessible to researchers conducting meta-analysis. We also introduce two tools, an SPSS macro and an R script, that researchers can apply to their meta-analyses; these tools are compatible with existing meta-analysis software packages.

  9. Analysis of slug tests in formations of high hydraulic conductivity.

    PubMed

    Butler, James J; Garnett, Elizabeth J; Healey, John M

    2003-01-01

    A new procedure is presented for the analysis of slug tests performed in partially penetrating wells in formations of high hydraulic conductivity. This approach is a simple, spreadsheet-based implementation of existing models that can be used for analysis of tests from confined or unconfined aquifers. Field examples of tests exhibiting oscillatory and nonoscillatory behavior are used to illustrate the procedure and to compare results with estimates obtained using alternative approaches. The procedure is considerably simpler than recently proposed methods for this hydrogeologic setting. Although the simplifications required by the approach can introduce error into hydraulic-conductivity estimates, this additional error becomes negligible when appropriate measures are taken in the field. These measures are summarized in a set of practical field guidelines for slug tests in highly permeable aquifers.

  10. Comprehensive study of numerical anisotropy and dispersion in 3-D TLM meshes

    NASA Astrophysics Data System (ADS)

    Berini, Pierre; Wu, Ke

    1995-05-01

    This paper presents a comprehensive analysis of the numerical anisotropy and dispersion of 3-D TLM meshes constructed using several generalized symmetrical condensed TLM nodes. The dispersion analysis is performed in isotropic lossless, isotropic lossy and anisotropic lossless media and yields a comparison of the simulation accuracy for the different TLM nodes. The effect of mesh grading on the numerical dispersion is also determined. The results compare meshes constructed with Johns' symmetrical condensed node (SCN), two hybrid symmetrical condensed nodes (HSCN) and two frequency domain symmetrical condensed nodes (FDSCN). It has been found that under certain circumstances, the time domain nodes may introduce numerical anisotropy when modelling isotropic media.

  11. Study on Web-Based Tool for Regional Agriculture Industry Structure Optimization Using Ajax

    NASA Astrophysics Data System (ADS)

    Huang, Xiaodong; Zhu, Yeping

    In view of the state of research on regional agriculture industry structure adjustment information systems and current developments in information technology, this paper takes a web-based regional agriculture industry structure optimization tool as its research target. The paper introduces Ajax technology and related application frameworks to build an auxiliary toolkit of a decision support system for agricultural policy makers and economy researchers. The toolkit includes a "one page" style component for regional agriculture industry structure optimization, which provides an agile argument-setting method enabling sensitivity analysis and the use of data and comparative advantage analysis results, and a component that can solve the linear programming model and its dual problem by the simplex method.

  12. Observations of geographically correlated orbit errors for TOPEX/Poseidon using the global positioning system

    NASA Technical Reports Server (NTRS)

    Christensen, E. J.; Haines, B. J.; Mccoll, K. C.; Nerem, R. S.

    1994-01-01

    We have compared Global Positioning System (GPS)-based dynamic and reduced-dynamic TOPEX/Poseidon orbits over three 10-day repeat cycles of the ground-track. The results suggest that the prelaunch joint gravity model (JGM-1) introduces geographically correlated errors (GCEs) which have a strong meridional dependence. The global distribution and magnitude of these GCEs are consistent with a prelaunch covariance analysis, with estimated and predicted global rms error statistics of 2.3 and 2.4 cm rms, respectively. Repeating the analysis with the post-launch joint gravity model (JGM-2) suggests that a portion of the meridional dependence observed in JGM-1 still remains, with global rms error of 1.2 cm.

  13. A method for reduction of cogging torque in brushless DC motor considering the distribution of magnetization by 3DEMCN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hur, J.; Hyun, D.S.; Hong, J.P.

    1998-09-01

    A method of reducing cogging torque and improving average torque has been studied by changing the dead zone angle of the trapezoidal magnetization distribution of the ring-type rotor magnet in a brushless DC motor (BLDCM). Because a BLDCM has a 3-D overhang structure, 3-D analysis should be used for exact computation of its magnetic field. The 3-D equivalent magnetic circuit network method (3-D EMCN), which can analyze an accurate 3-D magnetic field, has been introduced. The cogging torque results obtained using 3-D EMCN are compared with those of the 3-D finite element method (3-D FEM) and with experimental data.

  14. Local and Global Gestalt Laws: A Neurally Based Spectral Approach.

    PubMed

    Favali, Marta; Citti, Giovanna; Sarti, Alessandro

    2017-02-01

    This letter presents a mathematical model of figure-ground articulation that takes into account both local and global gestalt laws and is compatible with the functional architecture of the primary visual cortex (V1). The local gestalt law of good continuation is described by means of suitable connectivity kernels that are derived from Lie group theory and quantitatively compared with long-range connectivity in V1. Global gestalt constraints are then introduced in terms of spectral analysis of a connectivity matrix derived from these kernels. This analysis performs grouping of local features and individuates perceptual units with the highest salience. Numerical simulations are performed, and results are obtained by applying the technique to a number of stimuli.

  15. Analysis of S.1844, the Clear Skies Act of 2003; S. 843, the Clean Air Planning Act of 2003; and S. 366, the Clean Power Act of 2003

    EIA Publications

    2004-01-01

    Senator James M. Inhofe requested that the Energy Information Administration (EIA) undertake analysis of S.843, the Clean Air Planning Act of 2003, introduced by Senator Thomas Carper; S.366, the Clean Power Act of 2003, introduced by Senator James Jeffords; and S.1844, the Clear Skies Act of 2003, introduced by Senator James M. Inhofe. The EIA received this request on March 19, 2004. This Service Report responds to his request.

  16. Cost-effectiveness of 13-valent pneumococcal conjugate vaccination in Mongolia.

    PubMed

    Sundaram, Neisha; Chen, Cynthia; Yoong, Joanne; Luvsan, Munkh-Erdene; Fox, Kimberley; Sarankhuu, Amarzaya; La Vincente, Sophie; Jit, Mark

    2017-02-15

    The Ministry of Health (MOH), Mongolia, is considering introducing 13-valent pneumococcal conjugate vaccine (PCV13) in its national immunization programme to prevent the burden of disease caused by Streptococcus pneumoniae. This study evaluates the cost-effectiveness and budget impact of introducing PCV13 compared to no PCV vaccination in Mongolia. The incremental cost-effectiveness ratio (ICER) of introducing PCV13 compared to no PCV vaccination was assessed using an age-stratified static multiple cohort model. The risk of various clinical presentations of pneumococcal disease (meningitis, pneumonia, non-meningitis non-pneumonia invasive pneumococcal disease and acute otitis media) at all ages for thirty birth cohorts was assessed. The analysis considered both health system and societal perspectives. A 3+0 vaccine schedule and a price of US$3.30 per dose were assumed for the baseline scenario, based on Gavi, the Vaccine Alliance's advance market commitment tail price. The ICER of PCV13 introduction is estimated at US$52 per disability-adjusted life year (DALY) averted (health system perspective), and cost-saving (societal perspective). Although indirect effects of PCV have been well-documented, a conservative scenario that does not consider indirect effects estimated PCV13 introduction to cost US$79 per DALY averted (health system perspective), and US$19 per DALY averted (societal perspective). Vaccination with PCV13 is expected to cost around US$920,000 in 2016, and thereafter US$820,000 every year. The programme is likely to reduce direct disease-related costs to MOH by US$440,000 in the first year, increasing to US$510,000 by 2025. Introducing PCV13 as part of Mongolia's national programme appears to be highly cost-effective when compared to no vaccination, and cost-saving from a societal perspective, at vaccine purchase prices offered through Gavi. Notwithstanding uncertainties around some parameters, cost-effectiveness of PCV introduction for Mongolia remains robust over a range of conservative scenarios. Availability of high-quality national data would improve future economic analyses for vaccine introduction. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
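
    The headline metric reduces to simple arithmetic: ICER = incremental cost / DALYs averted. The numbers below are illustrative placeholders, not the study's actual inputs.

    ```python
    # Worked ICER arithmetic with hypothetical inputs (health system view).
    cost_vacc = 920_000.0      # US$, yearly programme cost
    cost_offsets = 440_000.0   # US$, disease-treatment costs avoided
    dalys_averted = 9_000.0    # hypothetical

    incremental_cost = cost_vacc - cost_offsets   # vs. no-vaccination baseline
    icer = incremental_cost / dalys_averted
    print(f"ICER = US${icer:.0f} per DALY averted")  # ~US$53 with these inputs
    # A negative incremental cost with positive DALYs averted would make
    # the programme cost-saving, as reported for the societal perspective.
    ```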

  17. Towards scar-free surgery: An analysis of the increasing complexity from laparoscopic surgery to NOTES

    PubMed Central

    Chellali, Amine; Schwaitzberg, Steven D.; Jones, Daniel B.; Romanelli, John; Miller, Amie; Rattner, David; Roberts, Kurt E.; Cao, Caroline G.L.

    2014-01-01

    Background NOTES is an emerging technique for performing surgical procedures, such as cholecystectomy. Debate about its real benefit over the traditional laparoscopic technique is on-going. There have been several clinical studies comparing NOTES to conventional laparoscopic surgery. However, no work has been done to compare these techniques from a Human Factors perspective. This study presents a systematic analysis describing and comparing different existing NOTES methods to laparoscopic cholecystectomy. Methods Videos of endoscopic/laparoscopic views from fifteen live cholecystectomies were analyzed to conduct a detailed task analysis of the NOTES technique. A hierarchical task analysis of laparoscopic cholecystectomy and several hybrid transvaginal NOTES cholecystectomies was performed and validated by expert surgeons. To identify similarities and differences between these techniques, their hierarchical decomposition trees were compared. Finally, a timeline analysis was conducted to compare the steps and substeps. Results At least three variations of the NOTES technique were used for cholecystectomy. Differences between the observed techniques at the substep level of hierarchy and on the instruments being used were found. The timeline analysis showed an increase in time to perform some surgical steps and substeps in NOTES compared to laparoscopic cholecystectomy. Conclusion As pure NOTES is extremely difficult given the current state of development in instrumentation design, most surgeons utilize different hybrid methods – combination of endoscopic and laparoscopic instruments/optics. Results of our hierarchical task analysis yielded an identification of three different hybrid methods to perform cholecystectomy with significant variability amongst them. The varying degrees to which laparoscopic instruments are utilized to assist in NOTES methods appear to introduce different technical issues and additional tasks leading to an increase in the surgical time. The NOTES continuum of invasiveness is proposed here as a classification scheme for these methods, which was used to construct a clear roadmap for training and technology development. PMID:24902811

  18. Headspace-mass spectrometry determination of benzene, toluene and the mixture of ethylbenzene and xylene isomers in soil samples using chemometrics.

    PubMed

    Esteve-Turrillas, F A; Armenta, S; Garrigues, S; Pastor, A; de la Guardia, M

    2007-03-21

    A simple and fast method has been developed for the determination of benzene, toluene and the mixture of ethylbenzene and xylene isomers (BTEX) in soils. Samples were introduced into 10 mL standard glass vials of a headspace (HS) autosampler together with 150 µL of 2,6,10,14-tetramethylpentadecane, heated at 90 °C for 10 min and introduced into the mass spectrometer by using a transfer line heated at 250 °C as the interface. The volatile fraction of the samples was directly introduced into the source of the mass spectrometer, which was scanned from m/z 75 to 110. A partial least squares (PLS) multivariate calibration approach based on a classical 3^3 calibration design was built with mixtures of benzene, toluene and o-xylene in 2,6,10,14-tetramethylpentadecane for BTEX determination. Results obtained for BTEX analysis by HS-MS in different types of soil samples were comparable to those obtained by the reference HS-GC-MS procedure. Thus, the developed procedure allows fast identification and prediction of the BTEX present in samples without a prior chromatographic separation.
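
    The chemometric step can be sketched as a PLS model trained on a three-level, three-factor (3^3) mixture design; the spectra below are simulated stand-ins for the m/z 75-110 scans, and scikit-learn is assumed available.

    ```python
    # PLS calibration on a simulated 3^3 mixture design (toy spectra).
    from itertools import product
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(2)
    levels = [0.0, 0.5, 1.0]                       # relative concentrations
    Y = np.array(list(product(levels, repeat=3)))  # 27 calibration mixtures
    S = rng.random((3, 36))                        # 3 pure-component "spectra"
    X = Y @ S + 0.01 * rng.standard_normal((27, 36))  # mixture spectra + noise

    pls = PLSRegression(n_components=3).fit(X, Y)
    x_new = np.array([0.2, 0.8, 0.4]) @ S          # unseen mixture spectrum
    print(pls.predict(x_new[None, :]).round(2))    # ~[0.2, 0.8, 0.4]
    ```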

  19. A Ritz approach for the static analysis of planar pantographic structures modeled with nonlinear Euler-Bernoulli beams

    NASA Astrophysics Data System (ADS)

    Andreaus, Ugo; Spagnuolo, Mario; Lekszycki, Tomasz; Eugster, Simon R.

    2018-04-01

    We present a finite element discrete model for pantographic lattices, based on a continuous Euler-Bernoulli beam for modeling the fibers composing the pantographic sheet. This model takes into account large displacements, rotations and deformations; the Euler-Bernoulli beam is described by using nonlinear interpolation functions, a Green-Lagrange strain for elongation and a curvature depending on elongation. On the basis of the introduced discrete model of a pantographic lattice, we perform some numerical simulations. We then compare the obtained results to an experimental BIAS extension test on a pantograph printed with polyamide PA2200. The pantographic structures involved in the numerical as well as the experimental investigations are not proper fabrics: they are composed of just a few fibers, which theoretically allows the use of the Euler-Bernoulli beam theory in the description of the fibers. We compare the experiments to numerical simulations in which the fibers are allowed to slide elastically with respect to one another at the interconnecting pivots. The result is very good agreement between the numerical simulations, based on the introduced model, and the experimental measurements.

  20. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    PubMed

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
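
    The estimator named in the abstract is compact enough to sketch in full: DerSimonian-Laird tau-squared followed by the Knapp-Hartung adjustment, which bases the confidence interval of the pooled effect on a t distribution with k-1 degrees of freedom. The effect sizes and variances below are invented for illustration.

    ```python
    # Random-effects meta-analysis: DerSimonian-Laird + Knapp-Hartung CI.
    import numpy as np
    from scipy import stats

    y = np.array([0.30, 0.12, 0.45, 0.22, 0.38])    # study effect sizes
    v = np.array([0.01, 0.02, 0.015, 0.025, 0.01])  # within-study variances
    k = len(y)

    w = 1.0 / v
    mu_fe = np.sum(w * y) / w.sum()                  # fixed-effect mean
    q = np.sum(w * (y - mu_fe) ** 2)                 # Cochran's Q
    c = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (q - (k - 1)) / c)               # DerSimonian-Laird

    ws = 1.0 / (v + tau2)
    mu = np.sum(ws * y) / ws.sum()                   # pooled effect
    # Knapp-Hartung variance estimator and t-based interval (k-1 df)
    var_kh = np.sum(ws * (y - mu) ** 2) / ((k - 1) * ws.sum())
    half = stats.t.ppf(0.975, k - 1) * np.sqrt(var_kh)
    print(f"pooled = {mu:.3f}, 95% CI = [{mu - half:.3f}, {mu + half:.3f}]")
    ```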

  1. Quantitative methods for compensation of matrix effects and self-absorption in Laser Induced Breakdown Spectroscopy signals of solids

    NASA Astrophysics Data System (ADS)

    Takahashi, Tomoko; Thornton, Blair

    2017-12-01

    This paper reviews methods to compensate for matrix effects and self-absorption during quantitative analysis of the compositions of solids measured using Laser Induced Breakdown Spectroscopy (LIBS), and their applications to in-situ analysis. Methods to reduce matrix and self-absorption effects on calibration curves are first introduced. The conditions under which calibration curves are applicable to quantification of the compositions of solid samples, and their limitations, are discussed. While calibration-free LIBS (CF-LIBS), which corrects matrix effects theoretically based on the Boltzmann distribution law and the Saha equation, has been applied in a number of studies, several requirements need to be satisfied for the calculation of chemical compositions to be valid. Also, peaks of all elements contained in the target need to be detected, which is a bottleneck for in-situ analysis of unknown materials. Multivariate analysis techniques are gaining momentum in LIBS analysis. Among the available techniques, principal component regression (PCR) analysis and partial least squares (PLS) regression analysis, which can extract composition-related information from all spectral data, are widely established methods and have been applied to various fields, including in-situ applications in air and planetary exploration. Artificial neural networks (ANNs), which can model non-linear effects, have also been investigated as a quantitative method, and their applications are introduced. The ability to make quantitative estimates based on LIBS signals is seen as a key element for the technique to gain wider acceptance as an analytical method, especially in in-situ applications. In order to accelerate this process, it is recommended that accuracy be described using common figures of merit which express the overall normalised accuracy, such as the normalised root mean square error (NRMSE), when comparing the accuracy obtained from different setups and analytical methods.
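
    The recommended figure of merit is simple to state in code; here the RMSE is normalised by the range of the reference values, which is one of several normalisation conventions in use.

    ```python
    # Range-normalised root mean square error (NRMSE).
    import numpy as np

    def nrmse(y_true, y_pred):
        y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
        rmse = np.sqrt(np.mean((y_pred - y_true) ** 2))
        return rmse / (y_true.max() - y_true.min())

    print(nrmse([1.0, 2.0, 4.0, 8.0], [1.1, 1.9, 4.3, 7.6]))  # ~0.037
    ```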

  2. Comparative methods for the analysis of gene-expression evolution: an example using yeast functional genomic data.

    PubMed

    Oakley, Todd H; Gu, Zhenglong; Abouheif, Ehab; Patel, Nipam H; Li, Wen-Hsiung

    2005-01-01

    Understanding the evolution of gene function is a primary challenge of modern evolutionary biology. Despite an expanding database from genomic and developmental studies, we are lacking quantitative methods for analyzing the evolution of some important measures of gene function, such as gene-expression patterns. Here, we introduce phylogenetic comparative methods to compare different models of gene-expression evolution in a maximum-likelihood framework. We find that expression of duplicated genes has evolved according to a nonphylogenetic model, where closely related genes are no more likely than more distantly related genes to share common expression patterns. These results are consistent with previous studies that found rapid evolution of gene expression during the history of yeast. The comparative methods presented here are general enough to test a wide range of evolutionary hypotheses using genomic-scale data from any organism.

  3. Geomorphic classification of Icelandic and Martian volcanoes: Limitations of comparative planetology research from LANDSAT and Viking orbiter images

    NASA Technical Reports Server (NTRS)

    Williams, R. S., Jr.

    1985-01-01

    Some limitations in using orbital images of planetary surfaces for comparative landform analyses are discussed. The principal orbital images used were LANDSAT MSS images of Earth and nominal Viking Orbiter images of Mars. Both are roughly comparable in having a pixel size which corresponds to about 100 m on the planetary surface. A volcanic landform on either planet must have a horizontal dimension of at least 200 m to be discernible on orbital images. A twofold bias is directly introduced into any comparative analysis of volcanic landforms on Mars versus those in Iceland because of this scale limitation. First, the 200-m cutoff may exclude more types of volcanic landforms on Earth than on Mars, or vice versa. Second, volcanic landforms in Iceland that are too small to be resolved on orbital images may be represented by larger counterparts on Mars, or vice versa.

  4. A comparative analysis of auditory perception in humans and songbirds: a modular approach.

    PubMed

    Weisman, Ronald; Hoeschele, Marisa; Sturdy, Christopher B

    2014-05-01

    We propose that a relatively small number of perceptual skills underlie human perception of music and speech. Humans and songbirds share a number of features in the development of their auditory communication systems. These similarities invite comparisons between species in their auditory perceptual skills. Here, we summarized our experimental comparisons between humans (and other mammals) and songbirds (and other birds) in their use of pitch height and pitch chroma perception, and discussed similarities and differences in other auditory perceptual abilities of these species. Specifically, we introduced a functional modular view, using pitch chroma and pitch height perception as examples, as a theoretical framework for the comparative study of auditory perception and perhaps all of the study of comparative cognition. We also contrasted phylogeny and adaptation as causal mechanisms in comparative cognition, using examples from auditory perception. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. A method of alignment masking for refining the phylogenetic signal of multiple sequence alignments.

    PubMed

    Rajan, Vaibhav

    2013-03-01

    Inaccurate inference of positional homologies in multiple sequence alignments and systematic errors introduced by alignment heuristics obfuscate phylogenetic inference. Alignment masking, the elimination of phylogenetically uninformative or misleading sites from an alignment before phylogenetic analysis, is a common practice in phylogenetic analysis. Although masking is often done manually, automated methods are necessary to handle the much larger data sets being prepared today. In this study, we introduce the concept of subsplits and demonstrate their use in extracting phylogenetic signal from alignments. We design a clustering approach for alignment masking where each cluster contains similar columns, with similarity defined on the basis of compatible subsplits; our approach then identifies noisy clusters and eliminates them. Trees inferred from the columns in the retained clusters are found to be topologically closer to the reference trees. We test our method on numerous standard benchmarks (both synthetic and biological data sets) and compare its performance with other methods of alignment masking. We find that our method can eliminate sites more accurately than other methods, particularly on divergent data, and can improve the topologies of the inferred trees in likelihood-based analyses. Software available upon request from the author.

  6. Design of An Energy Efficient Hydraulic Regenerative circuit

    NASA Astrophysics Data System (ADS)

    Ramesh, S.; Ashok, S. Denis; Nagaraj, Shanmukha; Adithyakumar, C. R.; Reddy, M. Lohith Kumar; Naulakha, Niranjan Kumar

    2018-02-01

    Increasing costs and power demands lead to the evaluation of new methods to improve productivity and help meet power demands. Many researchers have sought to increase the efficiency of hydraulic power packs, and one of the promising methods is the concept of regeneration. The objective of this research work is to increase the efficiency of a hydraulic circuit by introducing a regenerative circuit. A regenerative circuit is a system used to speed up the extension stroke of a double-acting, single-rod hydraulic cylinder: the cylinder output is connected back to the input at the directional control valve. This increases the velocity of the piston and decreases the cycle time. For this research, a basic hydraulic circuit and a regenerative circuit were designed and compared, based on the time taken for extension and retraction of the piston. From a detailed analysis of both hydraulic circuits, it is found that introducing the hydraulic regenerative circuit increased efficiency by 5.3%. The results lead to the conclusion that implementing a hydraulic regenerative circuit in a hydraulic power pack decreases power consumption, reduces cycle time and increases productivity in the longer run.
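
    The speed-up has a simple hydraulic explanation: routing the rod-end return flow back into the cap end during extension makes the effective driving area the rod cross-section alone, so the extension velocity rises from Q/A_cap to Q/A_rod. A worked example with hypothetical cylinder dimensions:

    ```python
    # Extension velocity with and without regeneration (hypothetical sizes).
    import math

    Q = 20e-3 / 60                  # pump flow: 20 L/min in m^3/s
    d_bore, d_rod = 0.050, 0.028    # cylinder bore and rod diameters, m
    A_cap = math.pi * d_bore ** 2 / 4
    A_rod = math.pi * d_rod ** 2 / 4

    v_normal = Q / A_cap            # conventional extension
    v_regen = Q / A_rod             # rod-end flow regenerated to the cap end
    print(f"normal: {v_normal:.3f} m/s, regenerative: {v_regen:.3f} m/s")
    ```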

  7. Nanobodies: site-specific labeling for super-resolution imaging, rapid epitope-mapping and native protein complex isolation

    PubMed Central

    Pleiner, Tino; Bates, Mark; Trakhanov, Sergei; Lee, Chung-Tien; Schliep, Jan Erik; Chug, Hema; Böhning, Marc; Stark, Holger; Urlaub, Henning; Görlich, Dirk

    2015-01-01

    Nanobodies are single-domain antibodies of camelid origin. We generated nanobodies against the vertebrate nuclear pore complex (NPC) and used them in STORM imaging to locate individual NPC proteins with <2 nm epitope-label displacement. For this, we introduced cysteines at specific positions in the nanobody sequence and labeled the resulting proteins with fluorophore-maleimides. As nanobodies are normally stabilized by disulfide-bonded cysteines, this appears counterintuitive. Yet, our analysis showed that this caused no folding problems. Compared to traditional NHS ester-labeling of lysines, the cysteine-maleimide strategy resulted in far less background in fluorescence imaging, it better preserved epitope recognition and it is site-specific. We also devised a rapid epitope-mapping strategy, which relies on crosslinking mass spectrometry and the introduced ectopic cysteines. Finally, we used different anti-nucleoporin nanobodies to purify the major NPC building blocks – each in a single step, with native elution and, as demonstrated, in excellent quality for structural analysis by electron microscopy. The presented strategies are applicable to any nanobody and nanobody-target. DOI: http://dx.doi.org/10.7554/eLife.11349.001 PMID:26633879

  8. Comparison of reverse transcription-quantitative polymerase chain reaction methods and platforms for single cell gene expression analysis.

    PubMed

    Fox, Bridget C; Devonshire, Alison S; Baradez, Marc-Olivier; Marshall, Damian; Foy, Carole A

    2012-08-15

    Single cell gene expression analysis can provide insights into development and disease progression by profiling individual cellular responses as opposed to reporting the global average of a population. Reverse transcription-quantitative polymerase chain reaction (RT-qPCR) is the "gold standard" for the quantification of gene expression levels; however, the technical performance of kits and platforms aimed at single cell analysis has not been fully defined in terms of sensitivity and assay comparability. We compared three kits using purification columns (PicoPure) or direct lysis (CellsDirect and Cells-to-CT) combined with a one- or two-step RT-qPCR approach using dilutions of cells and RNA standards to the single cell level. Single cell-level messenger RNA (mRNA) analysis was possible using all three methods, although the precision, linearity, and effect of lysis buffer and cell background differed depending on the approach used. The impact of using a microfluidic qPCR platform versus a standard instrument was investigated for potential variability introduced by preamplification of template or scaling down of the qPCR to nanoliter volumes using laser-dissected single cell samples. The two approaches were found to be comparable. These studies show that accurate gene expression analysis is achievable at the single cell level and highlight the importance of well-validated experimental procedures for low-level mRNA analysis. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. Automated Gait Analysis Through Hues and Areas (AGATHA): a method to characterize the spatiotemporal pattern of rat gait

    PubMed Central

    Kloefkorn, Heidi E.; Pettengill, Travis R.; Turner, Sara M. F.; Streeter, Kristi A.; Gonzalez-Rothi, Elisa J.; Fuller, David D.; Allen, Kyle D.

    2016-01-01

    While rodent gait analysis can quantify the behavioral consequences of disease, significant methodological differences exist between analysis platforms and little validation has been performed to understand or mitigate these sources of variance. By providing the algorithms used to quantify gait, open-source gait analysis software can be validated and used to explore methodological differences. Our group is introducing, for the first time, a fully-automated, open-source method for the characterization of rodent spatiotemporal gait patterns, termed Automated Gait Analysis Through Hues and Areas (AGATHA). This study describes how AGATHA identifies gait events, validates AGATHA relative to manual digitization methods, and utilizes AGATHA to detect gait compensations in orthopaedic and spinal cord injury models. To validate AGATHA against manual digitization, results from videos of rodent gait, recorded at 1000 frames per second (fps), were compared. To assess one common source of variance (the effects of video frame rate), these 1000 fps videos were re-sampled to mimic several lower fps and compared again. While spatial variables were indistinguishable between AGATHA and manual digitization, low video frame rates resulted in temporal errors for both methods. At frame rates over 125 fps, AGATHA achieved a comparable accuracy and precision to manual digitization for all gait variables. Moreover, AGATHA detected unique gait changes in each injury model. These data demonstrate AGATHA is an accurate and precise platform for the analysis of rodent spatiotemporal gait patterns. PMID:27554674

  10. Automated Gait Analysis Through Hues and Areas (AGATHA): A Method to Characterize the Spatiotemporal Pattern of Rat Gait.

    PubMed

    Kloefkorn, Heidi E; Pettengill, Travis R; Turner, Sara M F; Streeter, Kristi A; Gonzalez-Rothi, Elisa J; Fuller, David D; Allen, Kyle D

    2017-03-01

    While rodent gait analysis can quantify the behavioral consequences of disease, significant methodological differences exist between analysis platforms and little validation has been performed to understand or mitigate these sources of variance. By providing the algorithms used to quantify gait, open-source gait analysis software can be validated and used to explore methodological differences. Our group is introducing, for the first time, a fully-automated, open-source method for the characterization of rodent spatiotemporal gait patterns, termed Automated Gait Analysis Through Hues and Areas (AGATHA). This study describes how AGATHA identifies gait events, validates AGATHA relative to manual digitization methods, and utilizes AGATHA to detect gait compensations in orthopaedic and spinal cord injury models. To validate AGATHA against manual digitization, results from videos of rodent gait, recorded at 1000 frames per second (fps), were compared. To assess one common source of variance (the effects of video frame rate), these 1000 fps videos were re-sampled to mimic several lower fps and compared again. While spatial variables were indistinguishable between AGATHA and manual digitization, low video frame rates resulted in temporal errors for both methods. At frame rates over 125 fps, AGATHA achieved a comparable accuracy and precision to manual digitization for all gait variables. Moreover, AGATHA detected unique gait changes in each injury model. These data demonstrate AGATHA is an accurate and precise platform for the analysis of rodent spatiotemporal gait patterns.
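
    The frame-rate sensitivity reported in both records has a simple bound: a gait event falling between frames can be mislocated by up to one frame period, as the short calculation below illustrates.

    ```python
    # Frame period as a bound on event-timing error at several frame rates.
    for fps in (1000, 250, 125, 30):
        print(f"{fps:4d} fps -> frame period {1000 / fps:.1f} ms")
    ```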

  11. Seeking unique and common biological themes in multiple gene lists or datasets: pathway pattern extraction pipeline for pathway-level comparative analysis.

    PubMed

    Yi, Ming; Mudunuri, Uma; Che, Anney; Stephens, Robert M

    2009-06-29

    One of the challenges in the analysis of microarray data is to integrate and compare the selected (e.g., differential) gene lists from multiple experiments for common or unique underlying biological themes. A common way to approach this problem is to extract common genes from these gene lists and then subject these genes to enrichment analysis to reveal the underlying biology. However, the capacity of this approach is largely restricted by the limited number of common genes shared by datasets from multiple experiments, which could be caused by the complexity of the biological system itself. We now introduce a new Pathway Pattern Extraction Pipeline (PPEP), which extends the existing WPS application by providing a new pathway-level comparative analysis scheme. To facilitate comparing and correlating results from different studies and sources, PPEP contains new interfaces that allow evaluation of the pathway-level enrichment patterns across multiple gene lists. As an exploratory tool, this analysis pipeline may help reveal the underlying biological themes at both the pathway and gene levels. The analysis scheme provided by PPEP begins with multiple gene lists, which may be derived from different studies in terms of the biological contexts, applied technologies, or methodologies. These lists are then subjected to pathway-level comparative analysis for extraction of pathway-level patterns. This analysis pipeline helps to explore the commonality or uniqueness of these lists at the level of pathways or biological processes from different but relevant biological systems using a combination of statistical enrichment measurements, pathway-level pattern extraction, and graphical display of the relationships of genes and their associated pathways as Gene-Term Association Networks (GTANs) within the WPS platform. As a proof of concept, we have used the new method to analyze many datasets from our collaborators as well as some public microarray datasets. This tool provides a new pathway-level analysis scheme for integrative and comparative analysis of data derived from different but relevant systems. The tool is freely available as a Pathway Pattern Extraction Pipeline implemented in our existing software package WPS, which can be obtained at http://www.abcc.ncifcrf.gov/wps/wps_index.php.
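
    The per-pathway enrichment measurement at the core of such pipelines is commonly a hypergeometric test of the overlap between a gene list and a pathway's annotated genes, repeated per pathway and per list to form the pathway-level pattern. A sketch with hypothetical counts (the paper does not prescribe these exact numbers):

    ```python
    # Hypergeometric enrichment of a gene list against one pathway.
    from scipy.stats import hypergeom

    N = 20000   # genes in the background
    K = 150     # genes annotated to the pathway
    n = 400     # genes in the selected list
    k = 12      # overlap between list and pathway

    # P(X >= k) under sampling without replacement; expected overlap is
    # n * K / N = 3, so an overlap of 12 is strongly enriched.
    p = hypergeom.sf(k - 1, N, K, n)
    print(f"enrichment p = {p:.2e}")
    ```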

  12. Exploring the Intersection of Education Policy and Discourse Analysis: An Introduction

    ERIC Educational Resources Information Center

    Lester, Jessica Nina; Lochmiller, Chad R.; Gabriel, Rachael

    2017-01-01

    In this article, we introduce the special issue focused on diverse perspectives to discourse analysis for education policy. This article lays the foundation for the special issue by introducing the notion of a third generation of policy research--a strand of policy research we argue is produced at the intersection of education policy and discourse…

  13. Analysis of Introducing Active Learning Methodologies in a Basic Computer Architecture Course

    ERIC Educational Resources Information Center

    Arbelaitz, Olatz; Martín, José I.; Muguerza, Javier

    2015-01-01

    This paper presents an analysis of introducing active methodologies in the Computer Architecture course taught in the second year of the Computer Engineering Bachelor's degree program at the University of the Basque Country (UPV/EHU), Spain. The paper reports the experience from three academic years, 2011-2012, 2012-2013, and 2013-2014, in which…

  14. Using Linear Algebra to Introduce Computer Algebra, Numerical Analysis, Data Structures and Algorithms (and To Teach Linear Algebra, Too).

    ERIC Educational Resources Information Center

    Gonzalez-Vega, Laureano

    1999-01-01

    Using a Computer Algebra System (CAS) to help with the teaching of an elementary course in linear algebra can be one way to introduce computer algebra, numerical analysis, data structures, and algorithms. Highlights the advantages and disadvantages of this approach to the teaching of linear algebra. (Author/MM)

  15. 'TIME': A Web Application for Obtaining Insights into Microbial Ecology Using Longitudinal Microbiome Data.

    PubMed

    Baksi, Krishanu D; Kuntal, Bhusan K; Mande, Sharmila S

    2018-01-01

    Realization of the importance of microbiome studies, coupled with the decreasing sequencing cost, has led to the exponential growth of microbiome data. A number of these microbiome studies have focused on understanding changes in the microbial community over time. Such longitudinal microbiome studies have the potential to offer unique insights pertaining to microbial social networks as well as their responses to perturbations. In this communication, we introduce a web-based framework called 'TIME' (Temporal Insights into Microbial Ecology), developed specifically to obtain meaningful insights from microbiome time series data. The TIME web-server is designed to accept a wide range of popular formats as input, with options to preprocess and filter the data. Multiple samples, defined by a series of longitudinal time points along with their metadata information, can be compared in order to interactively visualize the temporal variations. In addition to standard microbiome data analytics, the web server implements popular time series analysis methods like dynamic time warping, Granger causality and the Dickey-Fuller test to generate interactive layouts for facilitating easy biological inferences. Apart from this, a new metric for comparing metagenomic time series data has been introduced to effectively visualize the similarities/differences in the trends of the resident microbial groups. Augmenting the visualizations with stationarity information pertaining to the microbial groups is utilized to predict microbial competition as well as community structure. Additionally, the 'causality graph analysis' module incorporated in TIME allows predicting taxa that might have a higher influence on community structure in different conditions. TIME also allows users to easily identify potential taxonomic markers from a longitudinal microbiome analysis. We illustrate the utility of the web-server features on a few published time series microbiome datasets and demonstrate the ease with which they can be used to perform complex analysis.
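
    Two of the named tests can be sketched on toy abundance series using statsmodels (assumed available); the series and lag choices below are arbitrary.

    ```python
    # Dickey-Fuller stationarity test and Granger causality on toy taxa.
    import numpy as np
    from statsmodels.tsa.stattools import adfuller, grangercausalitytests

    rng = np.random.default_rng(3)
    n = 200
    taxon_a = np.sin(np.linspace(0, 20, n)) + rng.normal(0, 0.2, n)
    taxon_b = np.roll(taxon_a, 2) + rng.normal(0, 0.2, n)  # lags taxon A

    # Augmented Dickey-Fuller: a small p-value suggests stationarity.
    adf_stat, p_value = adfuller(taxon_a)[:2]
    print(f"ADF statistic = {adf_stat:.2f}, p = {p_value:.3f}")

    # Does taxon A help predict taxon B? Column order: [effect, cause].
    data = np.column_stack([taxon_b, taxon_a])
    results = grangercausalitytests(data, maxlag=3)  # prints stats per lag
    ```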

  16. Non-contact FBG sensing based steam turbine rotor dynamic balance vibration detection system

    NASA Astrophysics Data System (ADS)

    Li, Tianliang; Tan, Yuegang; Cai, Lin

    2015-10-01

    This paper proposes a non-contact vibration sensor based on fiber Bragg grating (FBG) sensing and applies it to detect vibration on a steam turbine rotor dynamic balance experimental platform. The principle of the sensor is introduced, along with an experimental analysis in which the performance of the non-contact FBG vibration sensor is evaluated. In addition, a turbine rotor dynamic balance vibration detection system based on both an eddy current displacement sensor and the non-contact FBG vibration sensor was built, and the resulting signals were compared in both the time domain and the frequency domain. The comparison of experimental data shows that the vibration signal analysis from the non-contact FBG vibration sensor is essentially the same as that from the eddy current displacement sensor, verifying that the sensor can be used for non-contact measurement of steam turbine rotor dynamic balance vibration.

  17. Comparisons of synthesized and individual reinforcement contingencies during functional analysis.

    PubMed

    Fisher, Wayne W; Greer, Brian D; Romani, Patrick W; Zangrillo, Amanda N; Owen, Todd M

    2016-09-01

    Researchers typically modify individual functional analysis (FA) conditions after results are inconclusive (Hanley, Iwata, & McCord, 2003). Hanley, Jin, Vanselow, and Hanratty (2014) introduced a marked departure from this practice, using an interview-informed synthesized contingency analysis (IISCA). In the test condition, they delivered multiple contingencies simultaneously (e.g., attention and escape) after each occurrence of problem behavior; in the control condition, they delivered those same reinforcers noncontingently and continuously. In the current investigation, we compared the results of the IISCA with a more traditional FA in which we evaluated each putative reinforcer individually. Four of 5 participants displayed destructive behavior that was sensitive to the individual contingencies evaluated in the traditional FA. By contrast, none of the participants showed a response pattern consistent with the assumption of the IISCA. We discuss the implications of these findings for the development of accurate and efficient functional analyses. © 2016 Society for the Experimental Analysis of Behavior.

  18. Automatic Identification of Character Types from Film Dialogs

    PubMed Central

    Skowron, Marcin; Trapp, Martin; Payr, Sabine; Trappl, Robert

    2016-01-01

    We study the detection of character types from fictional dialog texts such as screenplays. As approaches based on the analysis of utterances' linguistic properties are not sufficient to identify all fictional character types, we develop an integrative approach that complements linguistic analysis with interactive and communication characteristics, and show that it can improve identification performance. The interactive characteristics of fictional characters are captured by descriptive analysis of semantic graphs weighted by linguistic markers of expressivity and social role. For this approach, we introduce a new data set of action movie character types with their corresponding sequences of dialogs. The evaluation results demonstrate that the integrated approach outperforms baseline approaches on the presented data set. A comparative in-depth analysis of a single screenplay leads to a discussion of possible limitations of this approach and directions for future research. PMID:29118463

  19. Coupling detrended fluctuation analysis for analyzing coupled nonstationary signals.

    PubMed

    Hedayatifar, L; Vahabi, M; Jafari, G R

    2011-08-01

    When many variables are coupled to each other, studying a single one in isolation cannot give thorough and precise information. When these time series are stationary, different methods of random matrix analysis and complex networks can be used; in nonstationary cases, however, the multifractal detrended cross-correlation analysis (MF-DXA) method was introduced for just two coupled time series. In this article, we extend MF-DXA to the method of coupling detrended fluctuation analysis (CDFA) for the case when more than two series are correlated with each other. We calculate the multifractal properties of the coupled time series, and by comparing the CDFA results of the original series with those of the shuffled and surrogate series, we can estimate the source of the multifractality and the extent to which the series are coupled to each other. We illustrate the method with selected examples from air pollution and foreign exchange rates.
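
    To make the fluctuation-analysis machinery concrete, here is a minimal sketch in the spirit of CDFA: profiles of several series are detrended window by window and coupled through the product of their residuals (an illustrative reading of the method with arbitrary scales and q, not the authors' code):

      import numpy as np

      def coupled_fluctuation(series, scales, q=2, order=1):
          """Illustrative CDFA-style fluctuation function for k coupled series."""
          profiles = [np.cumsum(x - x.mean()) for x in series]
          k, n = len(profiles), len(profiles[0])
          F = []
          for s in scales:
              covs = []
              t = np.arange(s)
              for v in range(n // s):
                  seg = slice(v * s, (v + 1) * s)
                  # Detrend each profile segment with a polynomial fit.
                  resid = [p[seg] - np.polyval(np.polyfit(t, p[seg], order), t)
                           for p in profiles]
                  # Couple the series via the mean product of their residuals.
                  covs.append(np.mean(np.prod(resid, axis=0)))
              covs = np.abs(np.asarray(covs))
              F.append(np.mean(covs ** (q / k)) ** (1 / q))
          return np.asarray(F)

      # The log-log slope of F versus s characterizes the (multi)fractal scaling;
      # comparing original, shuffled, and surrogate series hints at its source.
      x, y = np.random.randn(4096), np.random.randn(4096)
      scales = np.unique(np.logspace(1.2, 3, 12).astype(int))
      print(coupled_fluctuation([x, y], scales))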

  20. Coupling detrended fluctuation analysis for analyzing coupled nonstationary signals

    NASA Astrophysics Data System (ADS)

    Hedayatifar, L.; Vahabi, M.; Jafari, G. R.

    2011-08-01

    When many variables are coupled to each other, studying a single one in isolation cannot give thorough and precise information. When these time series are stationary, different methods of random matrix analysis and complex networks can be used; in nonstationary cases, however, the multifractal detrended cross-correlation analysis (MF-DXA) method was introduced for just two coupled time series. In this article, we extend MF-DXA to the method of coupling detrended fluctuation analysis (CDFA) for the case when more than two series are correlated with each other. We calculate the multifractal properties of the coupled time series, and by comparing the CDFA results of the original series with those of the shuffled and surrogate series, we can estimate the source of the multifractality and the extent to which the series are coupled to each other. We illustrate the method with selected examples from air pollution and foreign exchange rates.

  1. Which Triple Aim related measures are being used to evaluate population management initiatives? An international comparative analysis.

    PubMed

    Hendrikx, Roy J P; Drewes, Hanneke W; Spreeuwenberg, Marieke; Ruwaard, Dirk; Struijs, Jeroen N; Baan, Caroline A

    2016-05-01

    Population management (PM) initiatives are introduced in order to create sustainable health care systems. These initiatives should focus on the continuum of health and well-being of a population by introducing interventions that integrate various services. To be successful they should pursue the Triple Aim, i.e. simultaneously improve population health and quality of care while reducing costs per capita. This study explores how PM initiatives measure the Triple Aim in practice. An exploratory search was combined with expert consultations to identify relevant PM initiatives. These were analyzed based on general characteristics, utilized measures and related selection criteria. In total, 865 measures were used by 20 PM initiatives. All quality of care domains were included by at least 11 PM initiatives, while most domains of population health and costs were included by fewer than 7 PM initiatives. Although their goals showed substantial overlap, the measures applied showed few similarities between PM initiatives and were predominantly selected based on local priority areas and data availability. Most PM initiatives do not measure the full scope of the Triple Aim. Additionally, variety between measures limits comparability between PM initiatives. Consensus on the coverage of Triple Aim domains and a set of standardized measures could further both the inclusion of the various domains and the comparability between PM initiatives. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  2. Folding DNA into a Lipid-Conjugated Nanobarrel for Controlled Reconstitution of Membrane Proteins.

    PubMed

    Dong, Yuanchen; Chen, Shuobing; Zhang, Shijian; Sodroski, Joseph; Yang, Zhongqiang; Liu, Dongsheng; Mao, Youdong

    2018-02-19

    Building upon DNA origami technology, we introduce a method to reconstitute a single membrane protein into a self-assembled DNA nanobarrel that scaffolds a nanodisc-like lipid environment. Compared with the membrane-scaffolding-protein nanodisc technique, our approach gives rise to defined stoichiometry, controlled sizes, as well as enhanced stability and homogeneity in membrane protein reconstitution. We further demonstrate potential applications of the DNA nanobarrels in the structural analysis of membrane proteins. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Tri-state oriented parallel processing system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tenenbaum, J.; Wallach, Y.

    1982-08-01

    An alternating sequential/parallel system, MOPPS, was introduced a few years ago; although it solved a number of real-time problems satisfactorily, it has since been modified. The new system, TOPPS, is described and compared to MOPPS, and two applications are chosen to show it to be superior. The advantage of having a third basic mode, the ring mode, is illustrated when solving sets of linear equations with band matrices. The advantage of having independent I/O for the slaves is illustrated for biomedical signal analysis. 11 references.

  4. Studying aerosol light scattering based on aspect ratio distribution observed by fluorescence microscope.

    PubMed

    Li, Li; Zheng, Xu; Li, Zhengqiang; Li, Zhanhua; Dubovik, Oleg; Chen, Xingfeng; Wendisch, Manfred

    2017-08-07

    Particle shape is crucial to the properties of light scattered by atmospheric aerosol particles. A method of direct observation by fluorescence microscopy was introduced to determine the aspect ratio distribution of aerosol particles; the result is comparable with that of electron microscopic analysis. The measured aspect ratio distribution has been successfully applied in modeling light scattering and further in simulating polarization measurements of the sun/sky radiometer. These efforts are expected to improve shape retrieval from skylight polarization by using a directly measured aspect ratio distribution.

  5. Fatigue Lifespan of Engine Box Influenced by Fan Blade Out

    NASA Astrophysics Data System (ADS)

    Qiu, Ju; Shi, Jingwei; Su, Huaizhong; Zhang, Jinling; Feng, Juan; Shi, Qian; Tian, Xiaoyu

    2017-11-01

    This paper introduces the analysis process for a fan-blade-out (FBO) event and considers the effect of the windmill load on the fatigue lifespan of the engine case. In accordance with the Extended Operations (ETOPS) provisions of the airworthiness regulations, fatigue cracking of the case is analyzed under the unbalanced rotor load that follows an FBO event. By comparison with the lifespan under normal engine operation, this research provides valuable design experience and reliable reference data for future case design.

  6. Assessment Methods of Groundwater Overdraft Area and Its Application

    NASA Astrophysics Data System (ADS)

    Dong, Yanan; Xing, Liting; Zhang, Xinhui; Cao, Qianqian; Lan, Xiaoxun

    2018-05-01

    Groundwater is an important source of water, and long-term heavy demand has led to its over-exploitation, which causes many environmental and geological problems. This paper explores the concept of the over-exploitation area, summarizes its natural and social attributes, and expounds its evaluation methods, including single-factor evaluation, multi-factor system analysis, and numerical methods. The different methods are compared and analyzed. Taking northern Weifang as an example, the paper then demonstrates the practicality of the assessment methods.

  7. Application of Arrester Simulation Device in Training

    NASA Astrophysics Data System (ADS)

    Baoquan, Zhang; Ziqi, Chai; Genghua, Liu; Wei, Gao; Kaiyue, Wu

    2017-12-01

    Drawing on an arrester simulation device that has been successfully put into use, this paper introduces the application of arrester tests to insulation resistance measurement, the counter test, the leakage current test at the DC 1 mA reference voltage, and the leakage current test at 0.75 U1mA. By comparison with the existing training, the paper summarizes the simulation device's outstanding advantages, including real-time monitoring, multi-type fault data analysis, and acousto-optic simulation. The device effectively resolves the conflict between authenticity and safety in the existing test training and provides a reference for further training.

  8. Analysis of the infrared detection system operating range based on polarization degree

    NASA Astrophysics Data System (ADS)

    Jiang, Kai; Liu, Wen; Liu, Kai; Duan, Jing; Yan, Pei-pei; Shan, Qiu-sha

    2018-02-01

    Infrared polarization detection technology has unique advantages in the field of target detection and identification because it exploits the polarization information of radiation. The mechanism of infrared polarization is introduced. Extending the traditional infrared detection range model, an operating-range and signal-to-noise ratio (SNR) model is built that accounts for the polarization degree and noise. The influence of the polarization degree on the SNR of the infrared system is analyzed. Finally, the basic condition under which polarization detection yields a better SNR than traditional infrared detection is obtained.

  9. One Step Quantum Key Distribution Based on EPR Entanglement.

    PubMed

    Li, Jian; Li, Na; Li, Lei-Lei; Wang, Tao

    2016-06-30

    A novel quantum key distribution protocol is presented, based on entanglement and dense coding and allowing asymptotically secure key distribution. Considering the storage time limit of quantum bits, a grouping quantum key distribution protocol is proposed, which overcomes the vulnerability of the first protocol and improves maneuverability. Moreover, a security analysis is given: a simple type of eavesdropping attack would introduce an error rate of at least 46.875%. Compared with the "Ping-pong" protocol involving two steps, the proposed protocol does not need to store the qubit and involves only one step.

  10. Assessment of autonomic response by broad-band respiration

    NASA Technical Reports Server (NTRS)

    Berger, R. D.; Saul, J. P.; Cohen, R. J.

    1989-01-01

    We present a technique for introducing broad-band respiratory perturbations so that the response characteristics of the autonomic nervous system can be determined noninvasively over a wide range of physiologically relevant frequencies. A subject's respiratory bandwidth was broadened by breathing on cue to a sequence of audible tones spaced by Poisson intervals. The transfer function between the respiratory input and the resulting instantaneous heart rate was then computed using spectral analysis techniques. Results using this method are comparable to those found using traditional techniques, but are obtained with an economy of data collection.

  11. The time series approach to short term load forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagan, M.T.; Behr, S.M.

    The application of time series analysis methods to load forecasting is reviewed. It is shown that Box and Jenkins time series models, in particular, are well suited to this application. The logical and organized procedures for model development using the autocorrelation function make these models particularly attractive. One of the drawbacks of these models is the inability to accurately represent the nonlinear relationship between load and temperature. A simple procedure for overcoming this difficulty is introduced, and several Box and Jenkins models are compared with a forecasting procedure currently used by a utility company.
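
    To illustrate the Box and Jenkins workflow the abstract reviews (a sketch on simulated data: the hourly load, the temperature regressor, and the (2, 0, 1) order are placeholders, and the paper's procedure for the nonlinear load-temperature relationship is not reproduced):

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(1)
      hours = 24 * 28
      temp = 20 + 8 * np.sin(2 * np.pi * np.arange(hours) / 24)   # assumed temperature
      load = 500 + 5 * temp + rng.normal(0, 10, hours)            # assumed load, MW

      # Box-Jenkins style model with temperature as an exogenous regressor;
      # in practice the order is chosen by inspecting the autocorrelation function.
      result = ARIMA(load, exog=temp, order=(2, 0, 1)).fit()
      next_day_temp = temp[:24].reshape(-1, 1)                    # assumed forecast temps
      print(result.forecast(steps=24, exog=next_day_temp)[:4])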

  12. Comparing the Efficiency of Two Different Extraction Techniques in Removal of Maxillary Third Molars: A Randomized Controlled Trial.

    PubMed

    Edward, Joseph; Aziz, Mubarak A; Madhu Usha, Arjun; Narayanan, Jyothi K

    2017-12-01

    Extractions are routine procedures in dental surgery. Traditional extraction techniques use a combination of severing the periodontal attachment, luxation with an elevator, and removal with forceps. A new technique for extraction of the maxillary third molar, the Joedds technique, is introduced in this study and compared with the conventional technique. One hundred people were included in the study and divided into two groups by simple random sampling. In one group the conventional technique of maxillary third molar extraction was used, and in the second the Joedds technique was used. Statistical analysis was carried out with Student's t-test. Analysis of the 100 patients showed that the novel Joedds technique caused minimal trauma to surrounding tissues and fewer tuberosity and root fractures, and the time taken for extraction was <2 min compared with the other group. Provided cases are properly selected and the technique is applied correctly, this novel technique proved better than the conventional third molar extraction technique, with minimal complications.

  13. Novel index for micromixing characterization and comparative analysis

    PubMed Central

    Jain, Mranal; Nandakumar, K.

    2010-01-01

    The most basic micromixer is a T- or Y-mixer, where two confluent streams mix due to transverse diffusion. To enhance micromixing, various modifications of T-mixers are reported such as heterogeneously charged walls, grooves on the channel base, geometric variations by introducing physical constrictions, etc. The performance of these reported designs is evaluated against the T-mixer in terms of the deviation from perfectly mixed state and mixing length (device length required to achieve perfect mixing). Although many studies have noticed the reduced flow rates for improved mixer designs, the residence time is not taken into consideration for micromixing performance evaluation. In this work, we propose a novel index, based on residence time, for micromixing characterization and comparative analysis. For any given mixer, the proposed index identifies the nondiffusive mixing enhancement with respect to the T-mixer. Various micromixers are evaluated using the proposed index to demonstrate the usefulness of the index. It is also shown that physical constriction mixer types are equivalent to T-mixers. The proposed index is found to be insightful and could be used as a benchmark for comparing different mixing strategies. PMID:20689773

  14. Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff

    NASA Astrophysics Data System (ADS)

    Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.

    2016-03-01

    Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis were performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.
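
    As a generic illustration of how uncertainty from the individual steps can be combined (a sketch with invented percentages, using a standard root-sum-square rule for independent random components; this is not the authors' compiled dataset or exact procedure):

      import numpy as np

      # Hypothetical random uncertainty (+/- %) contributed by each step.
      sources = {"sample collection": 20.0,
                 "preservation/storage": 8.0,
                 "laboratory analysis": 15.0}

      # Independent random components are commonly combined in quadrature;
      # systematic biases, by contrast, do not average out and are often
      # tracked separately.
      total_random = np.sqrt(sum(u ** 2 for u in sources.values()))
      print(f"combined random uncertainty: +/- {total_random:.1f}%")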

  15. An ocean data assimilation system and reanalysis of the World Ocean hydrophysical fields

    NASA Astrophysics Data System (ADS)

    Zelenko, A. A.; Vil'fand, R. M.; Resnyanskii, Yu. D.; Strukov, B. S.; Tsyrulnikov, M. D.; Svirenko, P. I.

    2016-07-01

    A new version of the ocean data assimilation system (ODAS) developed at the Hydrometcentre of Russia is presented. The assimilation is performed following the sequential scheme analysis-forecast-analysis. The main components of the ODAS are procedures for operational observation data processing, a variational analysis scheme, and an ocean general circulation model used to estimate the first guess fields involved in the analysis. In situ observations of temperature and salinity in the upper 1400-m ocean layer obtained from various observational platforms are used as input data. In the new ODAS version, the horizontal resolution of the assimilating model and of the output products is increased, the previous 2D-Var analysis scheme is replaced by a more general 3D-Var scheme, and a more flexible incremental analysis updating procedure is introduced to correct the model calculations. A reanalysis of the main World Ocean hydrophysical fields over the 2005-2015 period has been performed using the updated ODAS. The reanalysis results are compared with data from independent sources.

  16. Unfolding dimension and the search for functional markers in the human electroencephalogram

    NASA Astrophysics Data System (ADS)

    Dünki, Rudolf M.; Schmid, Gary Bruno

    1998-02-01

    A biparametric approach to dimensional analysis in terms of a so-called ``unfolding dimension'' is introduced to explore the extent to which the human EEG can be described by stable features characteristic of an individual despite the well-known problems of intraindividual variability. Our analysis comprises an EEG data set recorded from healthy individuals over a time span of 5 years. The outcome is shown to be comparable to advanced linear methods of spectral analysis with regard to intraindividual specificity and stability over time. Such linear methods have not yet proven to be specific to the EEG of different brain states. Thus we have also investigated the specificity of our biparametric approach by comparing the mental states schizophrenic psychosis and remission, i.e., illness versus full recovery. A difference between EEG in psychosis and remission became apparent within recordings taken at rest with eyes closed and no stimulated or requested mental activity. Hence our approach distinguishes these functional brain states even in the absence of an active or intentional stimulus. This sheds a different light upon theories of schizophrenia as an information-processing disturbance of the brain.

  17. Computer-based analysis of holography using ray tracing.

    PubMed

    Latta, J N

    1971-12-01

    The application of a ray-tracing methodology to holography is presented. Emphasis is placed on establishing a very general foundation from which to build a general computer-based implementation. As few restrictions as possible are placed on the recording and reconstruction geometry. The necessary equations are established from the construction and reconstruction parameters of the hologram. The aberrations are defined following H. H. Hopkins, and these aberration specification techniques are compared with those used previously to analyze holography. Representative of the flexibility of the ray-tracing approach, two examples are considered. The first compares the answers between wavefront matching and the ray-tracing analysis in the case of aberration balancing to compensate for chromatic aberrations; the results are very close and establish the basic utility of aberration balancing. Further indicating the power of ray tracing, a thick-media analysis is included in the computer programs. This section is then used to study the effects of hologram emulsion shrinkage and methods for its compensation. Compensating such holograms introduces aberrations, and these are considered for both reflection and transmission holograms.

  18. Improvement of Quench Factor Analysis in Phase and Hardness Prediction of a Quenched Steel

    NASA Astrophysics Data System (ADS)

    Kianezhad, M.; Sajjadi, S. A.

    2013-05-01

    The accurate prediction of alloy properties produced by heat treatment has been considered by many researchers. The advantages of such predictions include a reduction in test trials and material consumption as well as savings in time and energy. One of the most important methods to predict hardness in quenched steel parts is Quench Factor Analysis (QFA). Classical QFA is based on the Johnson-Mehl-Avrami-Kolmogorov (JMAK) equation. In this study, a modified form of the QFA based on the work by Rometsch et al. is compared with the classical QFA, and both are applied to the prediction of the hardness of steels. For this purpose, samples of CK60 steel were utilized as raw material. They were austenitized at 1103 K (830 °C). After quenching in different environments, they were cut and their hardness was determined. In addition, the hardness values of the samples were fitted using the classical and modified equations for the quench factor analysis and the results were compared. Results showed a significant improvement in the fitted hardness values and proved the higher efficiency of the new method.
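
    For orientation, classical QFA accumulates a quench factor Q = sum_i dt_i / C_T(T_i) along the measured cooling curve and maps it to a property through a JMAK-type expression. A minimal sketch of that bookkeeping (the C_T curve, cooling data, hardness bounds, and the Evancho-Staley-style mapping are all illustrative assumptions; the modified Rometsch-type formulation is not reproduced):

      import numpy as np

      def c_t(temp_c):
          """Hypothetical critical-time curve C_T(T) in seconds (fastest near the nose)."""
          return 2.0 + 0.02 * (temp_c - 550.0) ** 2 / 100.0

      # Invented cooling curve: temperature sampled every 0.1 s from 830 to 200 C.
      dt = 0.1
      temps = np.linspace(830.0, 200.0, 700)

      # Classical quench factor: accumulate incremental time over critical time.
      Q = np.sum(dt / c_t(temps))

      # JMAK-type mapping from Q to the attained hardness (assumed HRC bounds).
      k1 = np.log(0.995)
      h_min, h_max = 30.0, 65.0
      hardness = h_min + (h_max - h_min) * np.exp(k1 * Q)
      print(f"Q = {Q:.2f}, predicted hardness ~ {hardness:.1f} HRC")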

  19. Introducing anisotropic Minkowski functionals and quantitative anisotropy measures for local structure analysis in biomedical imaging

    NASA Astrophysics Data System (ADS)

    Wismüller, Axel; De, Titas; Lochmüller, Eva; Eckstein, Felix; Nagarajan, Mahesh B.

    2013-03-01

    The ability of Minkowski Functionals to characterize local structure in different biological tissue types has been demonstrated in a variety of medical image processing tasks. We introduce anisotropic Minkowski Functionals (AMFs) as a novel variant that captures the inherent anisotropy of the underlying gray-level structures. To quantify the anisotropy characterized by our approach, we further introduce a method to compute a quantitative measure motivated by a technique utilized in MR diffusion tensor imaging, namely fractional anisotropy. We showcase the applicability of our method in the research context of characterizing the local structure properties of trabecular bone micro-architecture in the proximal femur as visualized on multi-detector CT. To this end, AMFs were computed locally for each pixel of ROIs extracted from the head, neck and trochanter regions. Fractional anisotropy was then used to quantify the local anisotropy of the trabecular structures found in these ROIs and to compare its distribution in different anatomical regions. Our results suggest a significantly greater concentration of anisotropic trabecular structures in the head and neck regions when compared to the trochanter region (p < 10^-4). We also evaluated the ability of such AMFs to predict bone strength in the femoral head of proximal femur specimens obtained from 50 donors. Our results suggest that such AMFs, when used in conjunction with multi-regression models, can outperform more conventional features such as BMD in predicting failure load. We conclude that such anisotropic Minkowski Functionals can capture valuable information regarding directional attributes of local structure, which may be useful in a wide scope of biomedical imaging applications.

  20. Introducing Anisotropic Minkowski Functionals and Quantitative Anisotropy Measures for Local Structure Analysis in Biomedical Imaging

    PubMed Central

    Wismüller, Axel; De, Titas; Lochmüller, Eva; Eckstein, Felix; Nagarajan, Mahesh B.

    2017-01-01

    The ability of Minkowski Functionals to characterize local structure in different biological tissue types has been demonstrated in a variety of medical image processing tasks. We introduce anisotropic Minkowski Functionals (AMFs) as a novel variant that captures the inherent anisotropy of the underlying gray-level structures. To quantify the anisotropy characterized by our approach, we further introduce a method to compute a quantitative measure motivated by a technique utilized in MR diffusion tensor imaging, namely fractional anisotropy. We showcase the applicability of our method in the research context of characterizing the local structure properties of trabecular bone micro-architecture in the proximal femur as visualized on multi-detector CT. To this end, AMFs were computed locally for each pixel of ROIs extracted from the head, neck and trochanter regions. Fractional anisotropy was then used to quantify the local anisotropy of the trabecular structures found in these ROIs and to compare its distribution in different anatomical regions. Our results suggest a significantly greater concentration of anisotropic trabecular structures in the head and neck regions when compared to the trochanter region (p < 10^-4). We also evaluated the ability of such AMFs to predict bone strength in the femoral head of proximal femur specimens obtained from 50 donors. Our results suggest that such AMFs, when used in conjunction with multi-regression models, can outperform more conventional features such as BMD in predicting failure load. We conclude that such anisotropic Minkowski Functionals can capture valuable information regarding directional attributes of local structure, which may be useful in a wide scope of biomedical imaging applications. PMID:29170580

  1. A Longitudinal Study of Early Reading Development in Two Languages: Comparing Literacy Outcomes in Irish Immersion, English Medium and Gaeltacht Schools

    ERIC Educational Resources Information Center

    Parsons, Christine E.; Lyddy, Fiona

    2016-01-01

    Schools in Ireland vary in how they introduce reading in the two official languages, Irish and English. There is particular variability within immersion (Irish medium) schools. Some introduce Irish reading first (IRF) and others English reading first (ERF). This study compared the development of Irish and English skills in children attending…

  2. A matching framework to improve causal inference in interrupted time-series analysis.

    PubMed

    Linden, Ariel

    2018-04-01

    Interrupted time-series analysis (ITSA) is a popular evaluation methodology in which a single treatment unit's outcome is studied over time and the intervention is expected to "interrupt" the level and/or trend of the outcome, subsequent to its introduction. When ITSA is implemented without a comparison group, the internal validity may be quite poor. Therefore, adding a comparable control group to serve as the counterfactual is always preferred. This paper introduces a novel matching framework, ITSAMATCH, to create a comparable control group by matching directly on covariates and then use these matches in the outcomes model. We evaluate the effect of California's Proposition 99 (passed in 1988) for reducing cigarette sales, by comparing California to other states not exposed to smoking reduction initiatives. We compare ITSAMATCH results to 2 commonly used matching approaches, synthetic controls (SYNTH), and regression adjustment; SYNTH reweights nontreated units to make them comparable to the treated unit, and regression adjusts covariates directly. Methods are compared by assessing covariate balance and treatment effects. Both ITSAMATCH and SYNTH achieved covariate balance and estimated similar treatment effects. The regression model found no treatment effect and produced inconsistent covariate adjustment. While the matching framework achieved results comparable to SYNTH, it has the advantage of being technically less complicated, while producing statistical estimates that are straightforward to interpret. Conversely, regression adjustment may "adjust away" a treatment effect. Given its advantages, ITSAMATCH should be considered as a primary approach for evaluating treatment effects in multiple-group time-series analysis. © 2017 John Wiley & Sons, Ltd.
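
    For reference, a single-group ITSA is often written as the segmented regression Y_t = b0 + b1*t + b2*D_t + b3*(t - t0)*D_t, where D_t flags post-intervention periods and b2 and b3 capture the level and trend "interruption". A minimal sketch on simulated data (not the Proposition 99 series, and the ITSAMATCH matching step itself is not reproduced here):

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      t = np.arange(40)                        # e.g., 40 periods of observation
      t0 = 20                                  # intervention at period 20
      D = (t >= t0).astype(float)
      y = 100 - 0.5 * t - 8 * D - 1.2 * (t - t0) * D + rng.normal(0, 2, t.size)

      X = sm.add_constant(np.column_stack([t, D, (t - t0) * D]))
      fit = sm.OLS(y, X).fit()
      # Coefficients: [intercept, pre-trend, level change, trend change].
      print(fit.params)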

  3. Direct magnetic field estimation based on echo planar raw data.

    PubMed

    Testud, Frederik; Splitthoff, Daniel Nicolas; Speck, Oliver; Hennig, Jürgen; Zaitsev, Maxim

    2010-07-01

    Gradient recalled echo echo planar imaging is widely used in functional magnetic resonance imaging. The fast data acquisition is, however, very sensitive to field inhomogeneities which manifest themselves as artifacts in the images. Typically used correction methods have the common deficit that the data for the correction are acquired only once at the beginning of the experiment, assuming the field inhomogeneity distribution B(0) does not change over the course of the experiment. In this paper, methods to extract the magnetic field distribution from the acquired k-space data or from the reconstructed phase image of a gradient echo planar sequence are compared and extended. A common derivation for the presented approaches provides a solid theoretical basis, enables a fair comparison and demonstrates the equivalence of the k-space and the image phase based approaches. The image phase analysis is extended here to calculate the local gradient in the readout direction and improvements are introduced to the echo shift analysis, referred to here as "k-space filtering analysis." The described methods are compared to experimentally acquired B(0) maps in phantoms and in vivo. The k-space filtering analysis presented in this work demonstrated to be the most sensitive method to detect field inhomogeneities.

  4. Detection of Impaired Cerebral Autoregulation Using Selected Correlation Analysis: A Validation Study

    PubMed Central

    Brawanski, Alexander

    2017-01-01

    Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data. PMID:28255331

  5. Detection of Impaired Cerebral Autoregulation Using Selected Correlation Analysis: A Validation Study.

    PubMed

    Proescholdt, Martin A; Faltermeier, Rupert; Bele, Sylvia; Brawanski, Alexander

    2017-01-01

    Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data.

  6. Network Connectivity for Permanent, Transient, Independent, and Correlated Faults

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sicher, Courtney; Henry, Courtney

    2012-01-01

    This paper develops a method for the quantitative analysis of network connectivity in the presence of both permanent and transient faults. Even though transient noise is considered a common occurrence in networks, a survey of the literature reveals an emphasis on permanent faults. Transient faults introduce a time element into the analysis of network reliability. With permanent faults it is sufficient to consider the faults that have accumulated by the end of the operating period; with transient faults the arrival and recovery times must be included, and the number and location of faults in the system become dynamic variables. Transient faults also introduce system recovery into the analysis. The goal is the quantitative assessment of network connectivity in the presence of both permanent and transient faults. The approach is to construct a global model that includes all classes of faults: permanent, transient, independent, and correlated. A theorem is derived about this model that gives distributions for (1) the number of fault occurrences, (2) the type of each fault, (3) the time of each occurrence, and (4) its location. These results are applied to compare and contrast the connectivity of different network architectures in the presence of permanent, transient, independent, and correlated faults. The examples below use a Monte Carlo simulation, but the theorem mentioned above could be used to guide fault injections in a laboratory.
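
    The flavor of such a Monte Carlo connectivity assessment can be sketched as follows (an illustrative toy with an 8-node ring topology and invented per-step fault rates, rather than the paper's architectures or its correlated-fault model):

      import random

      def connected(alive, edges):
          """Depth-first search over the surviving nodes only."""
          alive = set(alive)
          if not alive:
              return False
          start = next(iter(alive))
          seen, stack = {start}, [start]
          while stack:
              u = stack.pop()
              for a, b in edges:
                  v = b if a == u else a if b == u else None
                  if v is not None and v in alive and v not in seen:
                      seen.add(v)
                      stack.append(v)
          return seen == alive

      N = 8
      ring = [(i, (i + 1) % N) for i in range(N)]
      p_perm, p_trans = 0.01, 0.05        # assumed per-step fault probabilities
      trials, steps, failures = 2000, 100, 0
      for _ in range(trials):
          dead, lost = set(), False
          for _ in range(steps):
              # Transient faults recover by the next step; permanent ones accumulate.
              transient = {i for i in range(N) if random.random() < p_trans}
              dead |= {i for i in range(N) if random.random() < p_perm}
              if not connected(set(range(N)) - dead - transient, ring):
                  lost = True
                  break
          failures += lost
      print("estimated P(connectivity loss) =", failures / trials)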

  7. Fire rehabilitation using native and introduced species: A landscape trial

    Treesearch

    Tyler W. Thompson; Bruce A. Roundy; E. Durant McArthur; Brad D. Jessop; Blair Waldron; James N. Davis

    2006-01-01

    rehabilitation study comparing a predominately introduced species seed mix used by the US Department of Interior-Bureau of Land Management (BLM), a mix of native and introduced species provided by the US Department of Agriculture-Agricultural Research Service (ARS), and 2 native seed mixes (high and low diversity). Mixes were seeded with a rangeland drill on the big...

  8. Metal artifact reduction in CT, a phantom study: subjective and objective evaluation of four commercial metal artifact reduction algorithms when used on three different orthopedic metal implants.

    PubMed

    Bolstad, Kirsten; Flatabø, Silje; Aadnevik, Daniel; Dalehaug, Ingvild; Vetti, Nils

    2018-01-01

    Background: Metal implants may introduce severe artifacts in computed tomography (CT) images. Over the last few years dedicated algorithms have been developed in order to reduce metal artifacts in CT images. Purpose: To investigate and compare metal artifact reduction algorithms (MARs) from four different CT vendors when imaging three different orthopedic metal implants. Material and Methods: Three clinical metal implants were attached to the leg of an anthropomorphic phantom: cobalt-chrome, stainless steel, and titanium. Four commercial MARs were investigated: SmartMAR (GE), O-MAR (Philips), iMAR (Siemens), and SEMAR (Toshiba). The images were evaluated subjectively by three observers and analyzed objectively by calculating the fraction of pixels with CT number above 500 HU in a region of interest around the metal. The average CT number and image noise were also measured. Results: Both subjective evaluation and objective analysis showed that MARs reduced metal artifacts and improved the image quality for CT images containing metal implants of steel and cobalt-chrome. When used on titanium, all MARs introduced new visible artifacts. Conclusion: The effect of MARs varied between CT vendors and different metal implants used in orthopedic surgery. In both subjective evaluation and objective analysis the effect of applying MARs was most obvious on steel and cobalt-chrome implants when using SEMAR from Toshiba, followed by SmartMAR from GE. However, MARs may also introduce new image artifacts, especially when used on titanium implants. Therefore, it is important to reconstruct all CT images containing metal both with and without MARs.

  9. Reduced risk of peanut sensitization following exposure through breast-feeding and early peanut introduction.

    PubMed

    Pitt, Tracy J; Becker, Allan B; Chan-Yeung, Moira; Chan, Edmond S; Watson, Wade T A; Chooniedass, Rishma; Azad, Meghan B

    2018-02-01

    Recent trials have shown that avoiding peanuts during infancy increases the risk of peanut allergy; however, these studies did not address maternal peanut consumption. We sought to investigate the relationship between maternal peanut consumption while breast-feeding, timing of direct peanut introduction, and peanut sensitization at age 7 years. Secondary analysis of a nested cohort within the 1995 Canadian Asthma Primary Prevention Study intervention study was performed. Breast-feeding and maternal and infant peanut consumption were captured by repeated questionnaires during infancy. Skin prick testing for peanut sensitization was performed at age 7 years. Overall, 58.2% of mothers consumed peanuts while breast-feeding and 22.5% directly introduced peanuts to their infant by 12 months. At 7 years, 9.4% of children were sensitized to peanuts. The lowest incidence (1.7%) was observed among children whose mothers consumed peanuts while breast-feeding and directly introduced peanuts before 12 months. Incidence was significantly higher (P < .05) if mothers consumed peanuts while breast-feeding but delayed introducing peanuts to their infant beyond 12 months (15.1%), or if mothers avoided peanuts themselves but directly introduced peanuts by 12 months (17.6%). Interaction analyses controlling for study group and maternal atopy confirmed that maternal peanut consumption while breast-feeding and infant peanut consumption by 12 months were protective in combination, whereas either exposure in isolation was associated with an increased risk of sensitization (P interaction = .003). In this secondary analysis, maternal peanut consumption while breast-feeding paired with direct introduction of peanuts in the first year of life was associated with the lowest risk of peanut sensitization, compared with all other combinations of maternal and infant peanut consumption. Copyright © 2017 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.

  10. Adhesive blood microsampling systems for steroid measurement via LC-MS/MS in the rat.

    PubMed

    Heussner, Kirsten; Rauh, Manfred; Cordasic, Nada; Menendez-Castro, Carlos; Huebner, Hanna; Ruebner, Matthias; Schmidt, Marius; Hartner, Andrea; Rascher, Wolfgang; Fahlbusch, Fabian B

    2017-04-01

    Liquid Chromatography Tandem Mass Spectrometry (LC-MS/MS) allows for the direct analysis of multiple hormones in a single probe with minimal sample volume. Rodent-based animal studies strongly rely on microsampling, such as the dry blood spot (DBS) method. However, DBS suffers the drawback of hematocrit dependence (it is non-volumetric). Hence, novel volumetric microsampling techniques were introduced recently, allowing sampling of fixed, accurate volumes. We compared these methods for steroid analysis in the rat to improve inter-system comparability. We analyzed steroid levels in blood using the absorptive microsampling devices Whatman® 903 Protein Saver Cards, Noviplex™ Plasma Prep Cards and the Mitra™ Microsampling device and compared the obtained results to the respective EDTA plasma levels. Quantitative steroid analysis was performed via LC-MS/MS. To determine the plasma volume factor for each steroid, steroid levels in pooled blood samples from human adults and from rats (18 weeks old) were compared, and the transferability of these factors was evaluated in a new set of juvenile (21 days) and adult (18 weeks) rats. Hematocrit was determined concomitantly. Using these approaches, we were unable to apply a single volume factor for each steroid. Instead, plasma volume factors had to be adjusted for the recovery rate of each steroid and device individually. The tested microsampling systems did not allow the use of a single volume factor for adult and juvenile rats, owing to an unexpectedly strong hematocrit dependency and other steroid-specific (pre-analytic) factors. Our study provides correction factors for LC-MS/MS steroid analysis of volumetric and non-volumetric microsampling systems in comparison to plasma. It argues for thorough analysis of chromatographic effects before the use of novel volumetric systems for steroid analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. A Comparative Analysis of the ADOS-G and ADOS-2 Algorithms: Preliminary Findings.

    PubMed

    Dorlack, Taylor P; Myers, Orrin B; Kodituwakku, Piyadasa W

    2018-06-01

    The Autism Diagnostic Observation Schedule (ADOS) is a widely utilized observational assessment tool for diagnosis of autism spectrum disorders. The original ADOS was succeeded by the ADOS-G with noted improvements. More recently, the ADOS-2 was introduced to further increase its diagnostic accuracy. Studies examining the validity of the ADOS have produced mixed findings, and pooled relationship trends between the algorithm versions are yet to be analyzed. The current review seeks to compare the relative merits of the ADOS-G and ADOS-2 algorithms, Modules 1-3. Eight studies met inclusion criteria for the review, and six were selected for paired comparisons of the sensitivity and specificity of the ADOS. Results indicate several contradictory findings, underscoring the importance of further study.

  12. Computer modeling the fatigue crack growth rate behavior of metals in corrosive environments

    NASA Technical Reports Server (NTRS)

    Richey, Edward, III; Wilson, Allen W.; Pope, Jonathan M.; Gangloff, Richard P.

    1994-01-01

    The objective of this task was to develop a method to digitize FCP (fatigue crack propagation) kinetics data, generally presented in terms of extensive da/dN-ΔK pairs, to produce a file for subsequent linear superposition or curve-fitting analysis. The method that was developed is specific to the Numonics 2400 Digitablet and is comparable to commercially available software products such as Digimatic™. Experiments demonstrated that the errors introduced by the photocopying of literature data, and by digitization, are small compared to those inherent in laboratory methods to characterize FCP in benign and aggressive environments. The digitizing procedure was employed to obtain fifteen crack growth rate data sets for several aerospace alloys in aggressive environments.

  13. Mission analysis for the Martian Moons Explorer (MMX) mission

    NASA Astrophysics Data System (ADS)

    Campagnola, Stefano; Yam, Chit Hong; Tsuda, Yuichi; Ogawa, Naoko; Kawakatsu, Yasuhiro

    2018-05-01

    Mars Moon eXplorer (MMX) is JAXA's next candidate flagship mission to be launched in the early 2020s. MMX will explore the Martian moons and return a sample from Phobos. This paper presents the mission analysis work, focusing on the transfer legs and comparing several architectures, such as hybrid options with chemical and electric propulsion modules. The selected baseline is a chemical-propulsion Phobos sample return, which is discussed in detail with the launch- and return-window analysis. The trajectories are optimized with the jTOP software, using planetary ephemerides for Mars and the Earth; Earth re-entry constraints are modeled with simple analytical equations. Finally, we introduce an analytical approximation of the three-burn capture strategy used in the Mars system. The approximation can be used together with a Lambert solver to quickly determine the transfer Δv costs.
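
    The cost of capture at Mars can be roughed out with the vis-viva equation: a periapsis burn turns the hyperbolic approach speed sqrt(v_inf^2 + 2*mu/r_p) into the speed of the desired capture orbit. A back-of-envelope sketch (all numbers are invented, and this single-burn estimate is not the paper's three-burn formulation):

      import math

      MU_MARS = 4.2828e4            # Mars gravitational parameter, km^3/s^2
      r_p = 3396.0 + 300.0          # assumed periapsis radius (300 km altitude), km
      v_inf = 2.6                   # assumed hyperbolic excess speed, km/s
      a_capture = 20000.0           # assumed capture-orbit semi-major axis, km

      v_hyp = math.sqrt(v_inf**2 + 2 * MU_MARS / r_p)          # arrival speed at periapsis
      v_cap = math.sqrt(MU_MARS * (2 / r_p - 1 / a_capture))   # vis-viva on capture orbit
      print(f"single-burn capture dv ~ {v_hyp - v_cap:.3f} km/s")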

  14. Close-packed structure dynamics with finite-range interaction: computational mechanics with individual layer interaction.

    PubMed

    Rodriguez-Horta, Edwin; Estevez-Rams, Ernesto; Lora-Serrano, Raimundo; Neder, Reinhard

    2017-09-01

    This is the second contribution in a series of papers dealing with dynamical models in equilibrium theories of polytypism. A Hamiltonian introduced by Ahmad & Khan [Phys. Status Solidi B (2000), 218, 425-430] avoids the unphysical assignment of interaction terms to fictitious entities given by spins in the Hägg coding of the stacking arrangement. In this paper an analysis of polytype generation and disorder in close-packed structures is made for such a Hamiltonian. Results are compared with a previous analysis using the Ising model. Computational mechanics is the framework under which the analysis is performed. The competing effects of disorder and structure, as given by entropy density and excess entropy, respectively, are discussed. It is argued that the Ahmad & Khan model is simpler and predicts a larger set of polytypes than previous treatments.

  15. A parallelization scheme of the periodic signals tracking algorithm for isochronous mass spectrometry on GPUs

    NASA Astrophysics Data System (ADS)

    Chen, R. J.; Wang, M.; Yan, X. L.; Yang, Q.; Lam, Y. H.; Yang, L.; Zhang, Y. H.

    2017-12-01

    The periodic signals tracking algorithm has been used to determine the revolution times of ions stored in storage rings in isochronous mass spectrometry (IMS) experiments. It has been a challenge to perform real-time data analysis using the periodic signals tracking algorithm in IMS experiments. In this paper, a parallelization scheme for the periodic signals tracking algorithm is introduced and a new program is developed. Compared to the old program on a Xeon E5540 CPU, the computing time of the data analysis is reduced by factors of ∼71 and ∼346 with the new program on Tesla C1060 and Tesla K20c GPUs, respectively. We succeeded in performing real-time data analysis for IMS experiments by using the new program on the Tesla K20c GPU.

  16. Fast Steerable Principal Component Analysis

    PubMed Central

    Zhao, Zhizhen; Shkolnisky, Yoel; Singer, Amit

    2016-01-01

    Cryo-electron microscopy nowadays often requires the analysis of hundreds of thousands of 2-D images as large as a few hundred pixels in each direction. Here, we introduce an algorithm that efficiently and accurately performs principal component analysis (PCA) for a large set of 2-D images, and, for each image, the set of its uniform rotations in the plane and their reflections. For a dataset consisting of n images of size L × L pixels, the computational complexity of our algorithm is O(nL^3 + L^4), while existing algorithms take O(nL^4). The new algorithm computes the expansion coefficients of the images in a Fourier-Bessel basis efficiently using the nonuniform fast Fourier transform. We compare the accuracy and efficiency of the new algorithm with traditional PCA and existing algorithms for steerable PCA. PMID:27570801
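
    For scale, the conventional baseline is plain PCA over the flattened images (and, for steerable PCA, over all their in-plane rotations and reflections, which is what drives the O(nL^4) cost). A minimal non-steerable PCA sketch via the SVD (illustrative only; it omits the rotations and the Fourier-Bessel expansion that give the new algorithm its O(nL^3 + L^4) complexity):

      import numpy as np

      n, L = 1000, 32                          # assumed number and size of images
      images = np.random.randn(n, L, L)        # stand-in for cryo-EM particle images

      X = images.reshape(n, L * L)
      X = X - X.mean(axis=0)                   # center each pixel across the dataset
      # Right singular vectors of the centered data matrix are the principal components.
      _, s, Vt = np.linalg.svd(X, full_matrices=False)
      eigenimages = Vt[:10].reshape(10, L, L)
      explained = (s[:10] ** 2) / np.sum(s ** 2)
      print(explained)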

  17. Evaluation of novel derivatisation reagents for the analysis of oxysterols

    PubMed Central

    Crick, Peter J.; Aponte, Jennifer; Bentley, T. William; Matthews, Ian; Wang, Yuqin; Griffiths, William J.

    2014-01-01

    Oxysterols are oxidised forms of cholesterol that are intermediates in the synthesis of bile acids and steroid hormones. They are also ligands to nuclear and G protein-coupled receptors. Analysis of oxysterols in biological systems is challenging due to their low abundance coupled with their lack of a strong chromophore and poor ionisation characteristics in mass spectrometry (MS). We have previously used enzyme-assisted derivatisation for sterol analysis (EADSA) to identify and quantitate oxysterols in biological samples. This technique relies on tagging sterols with the Girard P reagent to introduce a charged quaternary ammonium group. Here, we have compared several modified Girard-like reagents and show that the permanent charge is vital for efficient MSn fragmentation. However, we find that the reagent can be extended to include sites for potential stable isotope labels without a loss of performance. PMID:24525124

  18. Deriving quantitative dynamics information for proteins and RNAs using ROTDIF with a graphical user interface.

    PubMed

    Berlin, Konstantin; Longhini, Andrew; Dayie, T Kwaku; Fushman, David

    2013-12-01

    To facilitate rigorous analysis of molecular motions in proteins, DNA, and RNA, we present a new version of ROTDIF, a program for determining the overall rotational diffusion tensor from single- or multiple-field nuclear magnetic resonance relaxation data. We introduce four major features that expand the program's versatility and usability. The first feature is the ability to analyze, separately or together, (13)C and/or (15)N relaxation data collected at a single or multiple fields. A significant improvement in the accuracy compared to direct analysis of R2/R1 ratios, especially critical for analysis of (13)C relaxation data, is achieved by subtracting high-frequency contributions to relaxation rates. The second new feature is an improved method for computing the rotational diffusion tensor in the presence of biased errors, such as large conformational exchange contributions, that significantly enhances the accuracy of the computation. The third new feature is the integration of the domain alignment and docking module for relaxation-based structure determination of multi-domain systems. Finally, to improve accessibility to all the program features, we introduced a graphical user interface that simplifies and speeds up the analysis of the data. Written in Java, the new ROTDIF can run on virtually any computer platform. In addition, the new ROTDIF achieves an order of magnitude speedup over the previous version by implementing a more efficient deterministic minimization algorithm. We not only demonstrate the improvement in accuracy and speed of the new algorithm for synthetic and experimental (13)C and (15)N relaxation data for several proteins and nucleic acids, but also show that careful analysis required especially for characterizing RNA dynamics allowed us to uncover subtle conformational changes in RNA as a function of temperature that were opaque to previous analysis.

  19. Stylizing Genderlect Online for Social Action: A Corpus Analysis of "BIC Cristal for Her" Reviews

    ERIC Educational Resources Information Center

    Ray, Brian

    2016-01-01

    This article introduces the concept of stylization and illustrates its usefulness for studying online discourse by examining how writers have employed it in order to parody sexist products such as BIC Cristal for Her, using genderlect in order to introduce dissonance into and reframe patriarchal discourse. A corpus analysis of 671 reviews, written…

  20. Introducing and Integrating Gifted Education into an Existing Independent School: An Analysis of Practice

    ERIC Educational Resources Information Center

    McKibben, Stephen

    2013-01-01

    In this analysis of practice, I conduct a combination formative and summative program evaluation of an initiative introduced to serve gifted learners at The Ocean School (TOS), an independent, Pre-K-grade 8 day school located in a rural area of the West Coast. Using the best practices as articulated by the National Association of Gifted Children…

  1. A preliminary survey of Chlamydia psittaci genotypes from native and introduced birds in New Zealand.

    PubMed

    Gedye, K R; Fremaux, M; Garcia-Ramirez, J C; Gartrell, B D

    2018-05-01

    To describe the Chlamydia psittaci genotypes in samples from native and introduced birds from New Zealand by analysis of the sequence variation of the ompA gene. DNA was extracted from samples collected from a non-random sample of birds; either swabs from live asymptomatic birds or birds with clinical signs, or formalin-fixed, paraffin-embedded (FFPE) samples from historical post-mortem cases. The presence of C. psittaci in all samples had been confirmed using a quantitative PCR assay. The C. psittaci ompA gene was amplified and sequenced from samples from 26 native and introduced infected birds comprising 12 different species. These sequences were compared to published available C. psittaci genotypes. Genotypes A and C of C. psittaci were identified in the samples. Genotype A was identified in samples from nine birds, including various native and introduced species. Genotype C was identified in samples from 16 different waterfowl species, and a mixed infection of both genotypes was found in a kaka (Nestor meridionalis). In native birds, C. psittaci infection was confirmed in seven new host species. Two genotypes (A and C) of C. psittaci were found in samples from a wider range of both native and introduced species of birds in New Zealand than previously reported. Both genotypes have been globally associated with significant disease in birds and humans. These initial results suggest the host range of C. psittaci in New Zealand birds is under-reported. However, the prevalence of C. psittaci infection in New Zealand, and the associated impact on avian and public health, remains to be determined. There are biosecurity implications associated with the importation of birds to New Zealand if there is a limited diversity of C. psittaci genotypes present.

  2. Recovering Intrinsic Fragmental Vibrations Using the Generalized Subsystem Vibrational Analysis.

    PubMed

    Tao, Yunwen; Tian, Chuan; Verma, Niraj; Zou, Wenli; Wang, Chao; Cremer, Dieter; Kraka, Elfi

    2018-05-08

    Normal vibrational modes are generally delocalized over the molecular system, which makes it difficult to assign certain vibrations to specific fragments or functional groups. We introduce a new approach, the Generalized Subsystem Vibrational Analysis (GSVA), to extract the intrinsic fragmental vibrations of any fragment/subsystem from the whole system via the evaluation of the corresponding effective Hessian matrix. The retention of the curvature information with regard to the potential energy surface for the effective Hessian matrix endows our approach with a concrete physical basis and enables the normal vibrational modes of different molecular systems to be legitimately comparable. Furthermore, the intrinsic fragmental vibrations act as a new link between the Konkoli-Cremer local vibrational modes and the normal vibrational modes.
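
    One standard way to build an effective Hessian for a fragment, with the environment degrees of freedom folded in, is the Schur complement; whether GSVA's definition coincides exactly with this construction is not stated in the abstract, so the sketch below is only an assumption-laden illustration.

    ```python
    import numpy as np

    def effective_hessian(H, frag_idx):
        """Effective Hessian for a fragment, with the environment relaxed.

        H is the full (mass-weighted) Hessian; frag_idx indexes the
        fragment's Cartesian coordinates. The environment block is folded
        in via the Schur complement: H_eff = H_ff - H_fe @ inv(H_ee) @ H_ef.
        """
        n = H.shape[0]
        f = np.asarray(frag_idx)
        e = np.setdiff1d(np.arange(n), f)
        H_ff = H[np.ix_(f, f)]
        H_fe = H[np.ix_(f, e)]
        H_ee = H[np.ix_(e, e)]
        return H_ff - H_fe @ np.linalg.solve(H_ee, H_fe.T)

    # Fragment frequencies then follow from diagonalizing the result:
    # eigvals, eigvecs = np.linalg.eigh(effective_hessian(H, frag_idx))
    ```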

  3. Application of meta-analysis methods for identifying proteomic expression level differences.

    PubMed

    Amess, Bob; Kluge, Wolfgang; Schwarz, Emanuel; Haenisch, Frieder; Alsaif, Murtada; Yolken, Robert H; Leweke, F Markus; Guest, Paul C; Bahn, Sabine

    2013-07-01

    We present new statistical approaches for identification of proteins with expression levels that are significantly changed when applying meta-analysis to two or more independent experiments. We showed that the Euclidean distance measure has reduced risk of false positives compared to the rank product method. Our Ψ-ranking method has advantages over the traditional fold-change approach by incorporating both the fold-change direction as well as the p-value. In addition, the second novel method, Π-ranking, considers the ratio of the fold-change and thus integrates all three parameters. We further improved the latter by introducing our third technique, Σ-ranking, which combines all three parameters in a balanced nonparametric approach. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
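
    The exact Ψ/Π/Σ formulas are not given above; as a minimal illustration of the general idea of combining fold-change direction with a p-value, a signed -log10(p) score can rank features, as sketched below with hypothetical values.

    ```python
    import numpy as np

    def signed_rank_score(log_fc, p_values):
        """Combine fold-change direction and p-value into one ranking score.

        A simple signed score: sign(log fold-change) * -log10(p). This
        mirrors the general idea of direction-plus-significance ranking,
        though the authors' exact Psi formula is not given here.
        """
        return np.sign(log_fc) * -np.log10(p_values)

    log_fc = np.array([1.2, -0.8, 0.1])
    p = np.array([0.001, 0.04, 0.60])
    order = np.argsort(-signed_rank_score(log_fc, p))  # strongest up-regulation first
    print(order)
    ```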

  4. Nuclear reactor transient analysis via a quasi-static kinetics Monte Carlo method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jo, YuGwon; Cho, Bumhee; Cho, Nam Zin, E-mail: nzcho@kaist.ac.kr

    2015-12-31

    The predictor-corrector quasi-static (PCQS) method is applied to the Monte Carlo (MC) calculation for reactor transient analysis. To solve the transient fixed-source problem of the PCQS method, fission source iteration is used, and a linear approximation of fission source distributions during a macro-time step is introduced to provide the delayed neutron source. The conventional particle-tracking procedure is modified to solve the transient fixed-source problem via MC calculation. The PCQS method with MC calculation is compared with the direct time-dependent method of characteristics (MOC) on a TWIGL two-group problem for verification of the computer code. Then, the results on a continuous-energy problem are presented.
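
    The quasi-static family of methods separates a fast amplitude equation from a slowly varying flux shape. The sketch below shows only the amplitude part, a one-delayed-group point-kinetics integrator with illustrative parameters; it is not the PCQS/MC scheme itself.

    ```python
    import numpy as np

    def point_kinetics(rho, beta=0.0065, lam=0.08, Lambda=1e-4,
                       t_end=1.0, dt=1e-5):
        """One-delayed-group point-kinetics amplitude equations.

        dn/dt = ((rho - beta)/Lambda) * n + lam * C
        dC/dt = (beta/Lambda) * n - lam * C

        Quasi-static schemes such as PCQS evolve an amplitude like n(t) on
        fine time steps while the flux shape is updated only on macro
        steps; this sketch integrates just the amplitude with explicit
        Euler and illustrative kinetics parameters.
        """
        n, C = 1.0, beta / (Lambda * lam)  # start at precursor equilibrium
        for _ in range(int(t_end / dt)):
            dn = ((rho - beta) / Lambda) * n + lam * C
            dC = (beta / Lambda) * n - lam * C
            n, C = n + dt * dn, C + dt * dC
        return n

    print(point_kinetics(rho=0.001))  # amplitude after a +100 pcm step
    ```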

  5. Breaking free from chemical spreadsheets.

    PubMed

    Segall, Matthew; Champness, Ed; Leeding, Chris; Chisholm, James; Hunt, Peter; Elliott, Alex; Garcia-Martinez, Hector; Foster, Nick; Dowling, Samuel

    2015-09-01

    Drug discovery scientists often consider compounds and data in terms of groups, such as chemical series, and relationships, representing similarity or structural transformations, to aid compound optimisation. This is often supported by chemoinformatics algorithms, for example clustering and matched molecular pair analysis. However, chemistry software packages commonly present these data as spreadsheets or form views that make it hard to find relevant patterns or compare related compounds conveniently. Here, we review common data visualisation and analysis methods used to extract information from chemistry data. We introduce a new framework that enables scientists to work flexibly with drug discovery data to reflect their thought processes and interact with the output of algorithms to identify key structure-activity relationships and guide further optimisation intuitively. Copyright © 2015 Elsevier Ltd. All rights reserved.
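
    As a hedged sketch of one of the chemoinformatics algorithms mentioned (clustering compounds into series), the example below clusters Morgan fingerprints with RDKit's Butina algorithm; the SMILES and distance threshold are arbitrary, and this is not the reviewed framework itself.

    ```python
    from rdkit import Chem
    from rdkit.Chem import AllChem, DataStructs
    from rdkit.ML.Cluster import Butina

    smiles = ["CCO", "CCN", "c1ccccc1", "c1ccccc1O", "CCCC"]
    mols = [Chem.MolFromSmiles(s) for s in smiles]
    fps = [AllChem.GetMorganFingerprintAsBitVect(m, 2, nBits=2048) for m in mols]

    # Butina clustering takes the lower-triangular distance matrix flattened.
    dists = []
    for i in range(1, len(fps)):
        sims = DataStructs.BulkTanimotoSimilarity(fps[i], fps[:i])
        dists.extend(1.0 - s for s in sims)

    clusters = Butina.ClusterData(dists, len(fps), distThresh=0.4, isDistData=True)
    print(clusters)  # tuples of molecule indices, one per cluster
    ```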

  6. High resolution melting analysis to genotype the most common variants in the HFE gene.

    PubMed

    Marotta, Roberta V; Turri, Olivia; Morandi, Antonella; Murano, Manuela; d'Eril, Gianlodovico Melzi; Biondi, Maria Luisa

    2011-09-01

    High resolution melting (HRM) analysis of PCR amplicons was recently introduced as a closed-tube, rapid, and inexpensive method of genotyping. This study evaluated this system as an option for detecting the three most common mutations in the HFE gene (C282Y, H63D, S65C), accounting for the main form of hereditary haemochromatosis. Ninety samples, previously screened by direct sequencing, and 27 controls were used. The analyses were performed on the Rotor Gene Q, using the commercial HRM mix containing the Eva Green dye (Qiagen). Specific primers allowed the amplification of the regions of interest in the HFE gene. Following amplification, a HRM analysis was conducted to detect DNA variants. The thermal denaturation profiles of the samples were compared with those of the controls. One hundred percent of heterozygous and homozygous samples were readily identified. Heterozygotes were easily identified because heteroduplexes altered the shape of the melting curves, but significant differences were also present in the melting curves of the homozygous carriers compared with those of the wild-type subjects. HRM analysis is an appealing technology for HFE gene screening. It is a robust technique that can be widely adopted in diagnostic laboratories to facilitate gene mutation screening.

  7. Exploratory Causal Analysis in Bivariate Time Series Data

    NASA Astrophysics Data System (ADS)

    McCracken, James M.

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools, including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between ''voltage drives current'' and ''current drives voltage'' as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data sets, but little research exists on how these tools compare to each other in practice. This work introduces and defines exploratory causal analysis (ECA) to address this issue, along with the concept of data causality in the taxonomy of causal studies introduced in this work. The motivation is to provide a framework for exploring potential causal structures in time series data sets. ECA is used on several synthetic and empirical data sets, and it is found that all of the tested time series causality tools agree with each other (and intuitive notions of causality) for many simple systems but can provide conflicting causal inferences for more complicated systems. It is proposed that such disagreements between different time series causality tools during ECA might provide deeper insight into the data than could be found otherwise.
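
    As a minimal, hedged example of one standard tool surveyed above, the snippet below runs a Granger-causality test with statsmodels on synthetic data in which x drives y with a one-step lag.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(0)
    x = rng.standard_normal(500)
    y = np.zeros(500)
    for t in range(1, 500):
        y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

    # Column order matters: the test asks whether the 2nd column
    # Granger-causes the 1st.
    data = pd.DataFrame({"y": y, "x": x})
    res = grangercausalitytests(data[["y", "x"]], maxlag=3, verbose=False)
    print(res[1][0]["ssr_ftest"])  # (F statistic, p-value, df_denom, df_num)
    ```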

  8. Dicke’s Superradiance in Astrophysics. I. The 21 cm Line

    NASA Astrophysics Data System (ADS)

    Rajabi, Fereshteh; Houde, Martin

    2016-08-01

    We have applied the concept of superradiance introduced by Dicke in 1954 to astrophysics by extending the corresponding analysis to the magnetic dipole interaction characterizing the atomic hydrogen 21 cm line. Superradiance is unlikely to take place in thermally relaxed regions, and the lack of observational evidence of masers for this transition reduces the probability of detecting it; nevertheless, in situations where the conditions necessary for superradiance are met (close atomic spacing, high velocity coherence, population inversion, and long dephasing timescales compared to those related to coherent behavior), our results suggest that relatively low levels of population inversion over short astronomical length-scales (e.g., as compared to those required for maser amplification) can lead to the cooperative behavior required for superradiance in the interstellar medium. Given the results of our analysis, we expect the observational properties of 21 cm superradiance to be characterized by the emission of high-intensity, spatially compact, burst-like features potentially taking place over short periods ranging from minutes to days.

  9. Comparative analysis of public opinion research in the U.S. and Canada

    NASA Astrophysics Data System (ADS)

    Setlakwe, Linda; DiNunzio, Lisa A.

    2004-06-01

    Bank note producers are working to thwart the threat of counterfeit notes created using high resolution, digital image processing software and color output devices such as inkjet printers, color copiers, and scanners. Genuine notes must incorporate better overt and machine-readable security features that will reduce the chance of counterfeit notes being passed. Recently, Canada and the United States introduced newly designed bank notes that are intended to enable the general public to more easily distinguish genuine notes from counterfeits. The Bank of Canada (BoC) and the U.S. Department of the Treasury's Bureau of Engraving and Printing (BEP) have conducted similar market research projects to explore target audiences' perceptions and attitudes towards currency design and security features. This paper will present a comparative analysis of the two research projects, both of which were conducted using similar methodology. The results of these research studies assist in the selection of security features for future generations of bank notes.

  10. An index-based approach for the sustainability assessment of irrigation practice based on the water-energy-food nexus framework

    NASA Astrophysics Data System (ADS)

    de Vito, Rossella; Portoghese, Ivan; Pagano, Alessandro; Fratino, Umberto; Vurro, Michele

    2017-12-01

    Increasing pressure affects water resources, especially in the agricultural sector, with cascading impacts on energy consumption. This is particularly relevant in the Mediterranean area, showing significant water scarcity problems, further exacerbated by the crucial economic role of agricultural production. Assessing the sustainability of water resource use is thus essential to preserving ecosystems and maintaining high levels of agricultural productivity. This paper proposes an integrated methodology based on the Water-Energy-Food Nexus to evaluate the multi-dimensional implications of irrigation practices. Three different indices are introduced, based on an analysis of the most influential factors. The methodology is then implemented in a catchment located in Puglia (Italy) and a comparative analysis of the three indices is presented. The results mainly highlight that economic land productivity is a key driver of irrigated agriculture, and that groundwater is highly affordable compared to surface water, thus being often dangerously perceived as freely available.
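
    The three nexus indices are not specified in detail above; the sketch below shows only a generic composite-index construction (min-max normalization plus weighted aggregation) with hypothetical district scores.

    ```python
    import numpy as np

    def composite_index(values, weights=None, higher_is_better=True):
        """Min-max normalize indicator values across units and combine them.

        A generic sketch of index construction of the kind described
        above; the paper's three nexus indices are not reproduced here.
        """
        v = np.asarray(values, dtype=float)     # shape: (units, indicators)
        lo, hi = v.min(axis=0), v.max(axis=0)
        norm = (v - lo) / (hi - lo)
        if not higher_is_better:
            norm = 1.0 - norm
        w = np.ones(v.shape[1]) if weights is None else np.asarray(weights)
        return norm @ (w / w.sum())

    # Three hypothetical districts scored on water, energy, land indicators
    scores = composite_index([[0.6, 0.3, 0.9], [0.8, 0.5, 0.4], [0.2, 0.9, 0.7]])
    print(scores)
    ```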

  11. Measuring the spatial resolution of an optical system in an undergraduate optics laboratory

    NASA Astrophysics Data System (ADS)

    Leung, Calvin; Donnelly, T. D.

    2017-06-01

    Two methods of quantifying the spatial resolution of a camera are described, performed, and compared, with the objective of designing an imaging-system experiment for students in an undergraduate optics laboratory. With the goal of characterizing the resolution of a typical digital single-lens reflex (DSLR) camera, we motivate, introduce, and show agreement between traditional test-target contrast measurements and the technique of using Fourier analysis to obtain the modulation transfer function (MTF). The advantages and drawbacks of each method are compared. Finally, we explore the rich optical physics at work in the camera system by calculating the MTF as a function of wavelength and f-number. For example, we find that the Canon 40D demonstrates better spatial resolution at short wavelengths, in accordance with scalar diffraction theory, but is not diffraction-limited, being significantly affected by spherical aberration. The experiment and data analysis routines described here can be built and written in an undergraduate optics lab setting.
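
    A hedged sketch of the Fourier route to the MTF: differentiate a measured edge-spread function to get the line-spread function, then take the normalized magnitude of its Fourier transform. The synthetic edge below stands in for real camera data.

    ```python
    import numpy as np

    def mtf_from_edge(edge_profile, dx=1.0):
        """Estimate the MTF from a measured edge-spread function (ESF).

        Differentiating the ESF gives the line-spread function (LSF); the
        normalized magnitude of its Fourier transform is the MTF. A clean,
        well-sampled ESF is assumed in this sketch.
        """
        lsf = np.gradient(np.asarray(edge_profile, dtype=float), dx)
        spectrum = np.abs(np.fft.rfft(lsf))
        freqs = np.fft.rfftfreq(len(lsf), d=dx)
        return freqs, spectrum / spectrum[0]

    # Synthetic blurred edge profile
    x = np.linspace(-20, 20, 256)
    esf = 0.5 * (1 + np.tanh(x / 2.0))
    freqs, mtf = mtf_from_edge(esf, dx=x[1] - x[0])
    print(mtf[:5])
    ```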

  12. The potential of latent semantic analysis for machine grading of clinical case summaries.

    PubMed

    Kintsch, Walter

    2002-02-01

    This paper introduces latent semantic analysis (LSA), a machine learning method for representing the meaning of words, sentences, and texts. LSA induces a high-dimensional semantic space from reading a very large amount of texts. The meaning of words and texts can be represented as vectors in this space and hence can be compared automatically and objectively. A generative theory of the mental lexicon based on LSA is described. The word vectors LSA constructs are context free, and each word, irrespective of how many meanings or senses it has, is represented by a single vector. However, when a word is used in different contexts, context appropriate word senses emerge. Several applications of LSA to educational software are described, involving the ability of LSA to quickly compare the content of texts, such as an essay written by a student and a target essay. An LSA-based software tool is sketched for machine grading of clinical case summaries written by medical students.
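
    A toy-scale sketch of the LSA pipeline described above: induce a low-dimensional semantic space by truncated SVD of a term-document matrix, then score a summary against a target by cosine similarity. Corpus size and dimensionality here are far below what real LSA requires.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    corpus = [
        "the patient presented with chest pain and shortness of breath",
        "an electrocardiogram showed ST elevation in the anterior leads",
        "treatment with aspirin and heparin was started immediately",
        "the student summary described chest pain and ST elevation",
    ]
    X = TfidfVectorizer().fit_transform(corpus)          # term-document matrix
    Z = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)

    target, summary = Z[0:1], Z[3:4]
    print(cosine_similarity(target, summary))            # semantic overlap score
    ```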

  13. Application of Nexus copy number software for CNV detection and analysis.

    PubMed

    Darvishi, Katayoon

    2010-04-01

    Among human structural genomic variation, copy number variants (CNVs) are the most frequently known component, comprising gains or losses of DNA segments that are generally 1 kb in length or longer. Array-based comparative genomic hybridization (aCGH) has emerged as a powerful tool for detecting genomic CNVs. With the rapid increase in the density of array technology and with the adaptation of new high-throughput technology, a reliable and computationally scalable method for accurate mapping of recurring DNA copy number aberrations has become a main focus in research. Here we introduce Nexus Copy Number software, a platform-independent tool, to analyze the output files of all types of commercial and custom-made comparative genomic hybridization (CGH) and single-nucleotide polymorphism (SNP) arrays, such as those manufactured by Affymetrix, Agilent Technologies, Illumina, and Roche NimbleGen. It also supports data generated by various array image-analysis software tools such as GenePix, ImaGene, and BlueFuse. (c) 2010 by John Wiley & Sons, Inc.

  14. Generalization of Clustering Coefficients to Signed Correlation Networks

    PubMed Central

    Costantini, Giulio; Perugini, Marco

    2014-01-01

    The recent interest in network analysis applications in personality psychology and psychopathology has put forward new methodological challenges. Personality and psychopathology networks are typically based on correlation matrices and therefore include both positive and negative edge signs. However, some applications of network analysis disregard negative edges, such as computing clustering coefficients. In this contribution, we illustrate the importance of the distinction between positive and negative edges in networks based on correlation matrices. The clustering coefficient is generalized to signed correlation networks: three new indices are introduced that take edge signs into account, each derived from an existing and widely used formula. The performances of the new indices are illustrated and compared with the performances of the unsigned indices, both on a signed simulated network and on a signed network based on actual personality psychology data. The results show that the new indices are more resistant to sample variations in correlation networks and therefore have higher convergence compared with the unsigned indices both in simulated networks and with real data. PMID:24586367
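
    One widely used signed generalization follows the Zhang-Horvath form: signed triangle products over the magnitudes of all possible triangles. The sketch below implements that form, which may differ in detail from the paper's three indices.

    ```python
    import numpy as np

    def signed_clustering(W):
        """Zhang-Horvath-style clustering coefficient for signed weighted
        networks: signed closed-triple products divided by the total
        magnitude of possible triples. Negative values indicate clustering
        dominated by sign-inconsistent triangles.
        """
        W = np.array(W, dtype=float)
        np.fill_diagonal(W, 0.0)
        A = np.abs(W)
        num = np.diag(W @ W @ W)                    # signed closed triples
        denom = A.sum(axis=1) ** 2 - (A ** 2).sum(axis=1)
        return np.divide(num, denom, out=np.zeros_like(num), where=denom > 0)

    # Toy signed correlation network
    W = np.array([[0.0, 0.5, -0.4],
                  [0.5, 0.0, 0.6],
                  [-0.4, 0.6, 0.0]])
    print(signed_clustering(W))
    ```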

  15. The use of a cognitive task analysis-based multimedia program to teach surgical decision making in flexor tendon repair.

    PubMed

    Luker, Kali R; Sullivan, Maura E; Peyre, Sarah E; Sherman, Randy; Grunwald, Tiffany

    2008-01-01

    The aim of this study was to compare the surgical knowledge of residents before and after receiving a cognitive task analysis-based multimedia teaching module. Ten plastic surgery residents were evaluated performing flexor tendon repair on 3 occasions. Traditional learning occurred between the first and second trial and served as the control. A teaching module was introduced as an intervention between the second and third trial using cognitive task analysis to illustrate decision-making skills. All residents showed improvement in their decision-making ability when performing flexor tendon repair after each surgical procedure. The group improved through traditional methods as well as exposure to our talk-aloud protocol (P > .01). After being trained using the cognitive task analysis curriculum the group displayed a statistically significant knowledge expansion (P < .01). Residents receiving cognitive task analysis-based multimedia surgical curriculum instruction achieved greater command of problem solving and are better equipped to make correct decisions in flexor tendon repair.

  16. Strategic analysis for health care organizations: the suitability of the SWOT-analysis.

    PubMed

    van Wijngaarden, Jeroen D H; Scholten, Gerard R M; van Wijk, Kees P

    2012-01-01

    Because of the introduction of (regulated) market competition and self-regulation, strategy is becoming an important management field for health care organizations in many European countries. That is why health managers are introducing more and more strategic principles and tools. Especially the SWOT (strengths, weaknesses, opportunities, threats)-analysis seems to be popular. However, hardly any empirical research has been done on the use and suitability of this instrument for the health care sector. In this paper four case studies are presented on the use of the SWOT-analysis in different parts of the health care sector in the Netherlands. By comparing these results with the premises of the SWOT and academic critique, it will be argued that the SWOT in its current form is not suitable as a tool for strategic analysis in health care in many European countries. Based on these findings an alternative SWOT-model is presented, in which stakeholder expectations and learning are incorporated. Copyright © 2010 John Wiley & Sons, Ltd.

  17. Linearized spectrum correlation analysis for line emission measurements

    NASA Astrophysics Data System (ADS)

    Nishizawa, T.; Nornberg, M. D.; Den Hartog, D. J.; Sarff, J. S.

    2017-08-01

    A new spectral analysis method, Linearized Spectrum Correlation Analysis (LSCA), for charge exchange and passive ion Doppler spectroscopy is introduced to provide a means of measuring fast spectral line shape changes associated with ion-scale micro-instabilities. This analysis method is designed to resolve the fluctuations in the emission line shape from a stationary ion-scale wave. The method linearizes the fluctuations around a time-averaged line shape (e.g., Gaussian) and subdivides the spectral output channels into two sets to reduce contributions from uncorrelated fluctuations without averaging over the fast time dynamics. In principle, small fluctuations in the parameters used for a line shape model can be measured by evaluating the cross spectrum between different channel groupings to isolate a particular fluctuating quantity. High-frequency ion velocity measurements (100-200 kHz) were made by using this method. We also conducted simulations to compare LSCA with a moment analysis technique under a low photon count condition. Both experimental and synthetic measurements demonstrate the effectiveness of LSCA.
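
    A hedged sketch of the core noise-rejection step: treat two channel-group estimates of the same fluctuating quantity as separate signals and take their cross spectrum, so uncorrelated photon noise averages out while the coherent fluctuation survives. The grouping and linearization details are illustrative.

    ```python
    import numpy as np
    from scipy.signal import csd

    rng = np.random.default_rng(1)
    fs = 1e6                                   # sampling rate, Hz
    t = np.arange(100_000) / fs
    common = np.sin(2 * np.pi * 150e3 * t)     # shared 150 kHz fluctuation

    # Two channel-group estimates of the same line-shape parameter, each
    # carrying independent noise
    v1 = common + 2.0 * rng.standard_normal(t.size)
    v2 = common + 2.0 * rng.standard_normal(t.size)

    f, Pxy = csd(v1, v2, fs=fs, nperseg=4096)
    print(f[np.argmax(np.abs(Pxy))])  # recovers the coherent 150 kHz component
    ```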

  18. Combining synthetic controls and interrupted time series analysis to improve causal inference in program evaluation.

    PubMed

    Linden, Ariel

    2018-04-01

    Interrupted time series analysis (ITSA) is an evaluation methodology in which a single treatment unit's outcome is studied over time and the intervention is expected to "interrupt" the level and/or trend of the outcome. The internal validity is strengthened considerably when the treated unit is contrasted with a comparable control group. In this paper, we introduce a robust evaluation framework that combines the synthetic controls method (SYNTH) to generate a comparable control group and ITSA regression to assess covariate balance and estimate treatment effects. We evaluate the effect of California's Proposition 99 for reducing cigarette sales, by comparing California to other states not exposed to smoking reduction initiatives. SYNTH is used to reweight nontreated units to make them comparable to the treated unit. These weights are then used in ITSA regression models to assess covariate balance and estimate treatment effects. Covariate balance was achieved for all but one covariate. While California experienced a significant decrease in the annual trend of cigarette sales after Proposition 99, there was no statistically significant treatment effect when compared to synthetic controls. The advantage of using this framework over regression alone is that it ensures that a comparable control group is generated. Additionally, it offers a common set of statistical measures familiar to investigators, the capability for assessing covariate balance, and enhancement of the evaluation with a comprehensive set of postestimation measures. Therefore, this robust framework should be considered as a primary approach for evaluating treatment effects in multiple group time series analysis. © 2018 John Wiley & Sons, Ltd.
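
    A compact, assumption-laden sketch of the two-step framework: solve for nonnegative donor weights summing to one that track the treated unit pre-intervention, then fit an ITSA-style segmented regression to the treated-minus-synthetic gap; the data are synthetic.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    T, T0, J = 30, 20, 8                       # years, intervention year, donors
    donors = rng.normal(100, 10, (T, J)).cumsum(axis=0) / 10
    treated = donors[:, :3].mean(axis=1) + rng.normal(0, 0.5, T)
    treated[T0:] -= 5.0                        # true post-intervention drop

    def pre_fit_loss(w):                       # pre-period tracking error
        return np.sum((treated[:T0] - donors[:T0] @ w) ** 2)

    cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
    res = minimize(pre_fit_loss, np.full(J, 1 / J), bounds=[(0, 1)] * J,
                   constraints=cons)
    synthetic = donors @ res.x

    # Segmented (ITSA) regression on the gap: level and trend change at T0
    gap = treated - synthetic
    t = np.arange(T)
    X = sm.add_constant(np.column_stack([t, (t >= T0).astype(float),
                                         np.maximum(t - T0, 0)]))
    print(sm.OLS(gap, X).fit().params)  # const, trend, level shift, trend shift
    ```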

  19. An electronic trigger tool to optimise intravenous to oral antibiotic switch: a controlled, interrupted time series study.

    PubMed

    Berrevoets, Marvin A H; Pot, Johannes Hans L W; Houterman, Anne E; Dofferhoff, Anton Ton S M; Nabuurs-Franssen, Marrigje H; Fleuren, Hanneke W H A; Kullberg, Bart-Jan; Schouten, Jeroen A; Sprong, Tom

    2017-01-01

    Timely switch from intravenous (iv) antibiotics to oral therapy is a key component of antimicrobial stewardship programs in order to improve patient safety, promote early discharge and reduce costs. We have introduced a time-efficient and easily implementable intervention that relies on a computerized trigger tool, which identifies patients who are candidates for an iv to oral antibiotic switch. The intervention was introduced on all internal medicine wards in a teaching hospital. Patients were automatically identified by an electronic trigger tool when parenteral antibiotics were used for >48 h and clinical or pharmacological data did not preclude switch therapy. A weekly educational session was introduced to alert the physicians on the intervention wards. The intervention wards were compared with control wards, which included all other hospital wards. An interrupted time-series analysis was performed to compare the pre-intervention period with the post-intervention period using '% of iv prescriptions >72 h' and 'median duration of iv therapy per prescription' as outcomes. We performed a detailed prospective evaluation on a subset of 244 prescriptions to evaluate the efficacy and appropriateness of the intervention. The number of intravenous prescriptions longer than 72 h was reduced by 19% in the intervention group (n = 1519; p < 0.01), and the median duration of iv antibiotics was reduced by 0.8 days (p < 0.05). Compared to the control group (n = 4366), the intervention was responsible for an additional decrease of 13% (p < 0.05) in prolonged prescriptions. The detailed prospective evaluation of a subgroup of patients showed that adherence to the electronic reminder was 72%. An electronic trigger tool combined with a weekly educational session was effective in reducing the duration of intravenous antimicrobial therapy.
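
    A hypothetical sketch of the trigger logic described above, with invented column names and thresholds; the hospital's actual rule set is not specified in the abstract.

    ```python
    import pandas as pd

    def iv_to_oral_candidates(df: pd.DataFrame) -> pd.DataFrame:
        """Flag patients on iv antibiotics > 48 h whose (hypothetical)
        clinical data do not preclude an oral switch."""
        return df[
            (df["iv_antibiotic_hours"] > 48)
            & (df["temperature_c"] < 38.0)        # clinically stable
            & (df["oral_intake_ok"])              # able to take oral drugs
            & (~df["deep_seated_infection"])      # e.g. endocarditis excluded
        ]

    patients = pd.DataFrame({
        "iv_antibiotic_hours": [72, 24, 96],
        "temperature_c": [37.2, 38.5, 37.0],
        "oral_intake_ok": [True, True, False],
        "deep_seated_infection": [False, False, False],
    })
    print(iv_to_oral_candidates(patients))
    ```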

  20. Robotic and open radical prostatectomy in the public health sector: cost comparison.

    PubMed

    Hall, Rohan Matthew; Linklater, Nicholas; Coughlin, Geoff

    2014-06-01

    During 2008, the Royal Brisbane and Women's Hospital became the first public hospital in Australia to have a da Vinci Surgical Robot purchased by government funding. The cost of performing robotic surgery in the public sector is a contentious issue. This study is a single centre, cost analysis comparing open radical prostatectomy (RRP) and robotic-assisted radical prostatectomy (RALP) based on the newly introduced pure case-mix funding model. A retrospective chart review was performed for the first 100 RALPs and the previous 100 RRPs. Estimates of tangible costing and funding were generated for each admission and readmission, using the Royal Brisbane Hospital Transition II database, based on pure case-mix funding. The average cost for admission for RRP was A$13 605, compared to A$17 582 for the RALP. The average funding received for a RRP was A$11 781 compared to A$5496 for a RALP based on the newly introduced case-mix model. The average length of stay for RRP was 4.4 days (2-14) and for RALP, 1.2 days (1-4). The total cost of readmissions for RRP patients was A$70 487, compared to that of the RALP patients, A$7160. These were funded at A$55 639 and A$7624, respectively. RALP has shown a significant advantage with respect to length of stay and readmission rate. Based on the case-mix funding model RALP is poorly funded compared to its open equivalent. Queensland Health needs to plan on how robotic surgery is implemented and assess whether this technology is truly affordable in the public sector. © 2013 The Authors. ANZ Journal of Surgery © 2013 Royal Australasian College of Surgeons.

  1. Breastfeeding, Infant Formula, and Introduction to Complementary Foods-Comparing Data Obtained by Questionnaires and Health Visitors' Reports to Weekly Short Message Service Text Messages.

    PubMed

    Bruun, Signe; Buhl, Susanne; Husby, Steffen; Jacobsen, Lotte Neergaard; Michaelsen, Kim F; Sørensen, Jan; Zachariassen, Gitte

    2017-11-01

    Studies on prevalence and effects of breastfeeding call for reliable and precise data collection to optimize infant nutrition, growth, and health. Data on breastfeeding and infant nutrition are at risk of, for example, recall bias or social desirability bias. The aim of the present analysis was to compare data on infant nutrition, that is, breastfeeding, use of infant formula, and introduction to complementary foods, obtained by four different methods. We assumed that weekly short message service (SMS) questions were the most reliable method, to which the other methods were compared. The study population was part of the Odense Child Cohort. The four methods used were: (a) self-administered questionnaire 3 months postpartum, (b) self-administered questionnaire 18 months postpartum, (c) registrations from health visitors visiting the families several times within the first year of life, and (d) weekly SMS questions introduced shortly after birth. In total, 639 singleton mothers with data from all four methods were included. The proportion of mothers initiating breastfeeding varied from 86% to 97%, the mean duration of exclusive breastfeeding from 12 to 19 weeks, and the mean age when introduced to complementary foods from 19 to 21 weeks. The mean duration of any breastfeeding was 33 weeks across methods. Compared with the weekly SMS questions, the self-administered questionnaires and the health visitors' reports resulted in a greater proportion of mothers with an unknown breastfeeding status, a longer duration of exclusive breastfeeding and later introduction to complementary foods, while the duration of any breastfeeding did not differ.

  2. Ultrasonic dissection versus electrocautery in mastectomy for breast cancer - a meta-analysis.

    PubMed

    Currie, A; Chong, K; Davies, G L; Cummins, R S

    2012-10-01

    Electrocautery has advanced the practice of mastectomy but significant morbidity, such as seroma and blood loss, remains a concern. This has led to newer forms of dissection being introduced, including ultrasonic dissection devices, which are thought to reduce tissue damage. The aim of this systematic review was to compare the outcomes after mastectomy using novel ultrasonic dissection or standard electrocautery in published trials. Medline, Embase, trial registries, conference proceedings and reference lists were searched for comparative trials of ultrasonic dissection versus electrocautery for mastectomy. The primary outcomes were total postoperative drainage, seroma development and intra-operative blood loss. Secondary outcomes were operative time and wound complications. Odds ratios were calculated for categorical outcomes and standardised mean differences for continuous outcomes. Six trials of 287 mastectomies were included in the analysis. There was no effect on total postoperative drainage (pooled weighted mean difference: -0.21 (95% CI: -0.70 to 0.29); p = 0.41) or seroma development (pooled odds ratio: 0.77 (95% CI: 0.43-1.37); p = 0.37). Intra-operative blood loss was slightly lower for ultrasonic dissection compared to standard electrocautery (pooled weighted mean difference: -1.04 (95% CI: -2.00 to -0.08); p = 0.03). Ultrasonic dissection and standard electrocautery had similar outcomes with regard to operative time and wound complications. Ultrasonic dissection and standard electrocautery appear to deliver similar results in the mastectomy setting. Further cost-effectiveness analysis may guide surgeon selection in the use of new technologies for mastectomy. Copyright © 2012 Elsevier Ltd. All rights reserved.
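
    As a generic illustration of the pooling step behind results like the seroma odds ratio above (the review's exact model may differ, e.g. random effects), the sketch below performs fixed-effect inverse-variance pooling of hypothetical per-trial log odds ratios.

    ```python
    import numpy as np

    def pooled_log_or(log_or, se):
        """Fixed-effect inverse-variance pooling of per-trial log odds
        ratios; returns the pooled OR and its 95% confidence interval."""
        w = 1.0 / np.asarray(se) ** 2
        pooled = np.sum(w * np.asarray(log_or)) / np.sum(w)
        pooled_se = 1.0 / np.sqrt(np.sum(w))
        ci = pooled + 1.96 * pooled_se * np.array([-1.0, 1.0])
        return np.exp(pooled), np.exp(ci)

    # Hypothetical seroma log-ORs and standard errors from three trials
    or_hat, ci = pooled_log_or(np.log([0.7, 0.9, 0.65]), [0.35, 0.40, 0.30])
    print(or_hat, ci)
    ```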

  3. Nonlinear spatio-temporal filtering of dynamic PET data using a four-dimensional Gaussian filter and expectation-maximization deconvolution

    NASA Astrophysics Data System (ADS)

    Floberg, J. M.; Holden, J. E.

    2013-02-01

    We introduce a method for denoising dynamic PET data, spatio-temporal expectation-maximization (STEM) filtering, that combines four-dimensional Gaussian filtering with EM deconvolution. The initial Gaussian filter suppresses noise at a broad range of spatial and temporal frequencies and EM deconvolution quickly restores the frequencies most important to the signal. We aim to demonstrate that STEM filtering can improve variance in both individual time frames and in parametric images without introducing significant bias. We evaluate STEM filtering with a dynamic phantom study, and with simulated and human dynamic PET studies of a tracer with reversible binding behaviour, [C-11]raclopride, and a tracer with irreversible binding behaviour, [F-18]FDOPA. STEM filtering is compared to a number of established three and four-dimensional denoising methods. STEM filtering provides substantial improvements in variance in both individual time frames and in parametric images generated with a number of kinetic analysis techniques while introducing little bias. STEM filtering does bias early frames, but this does not affect quantitative parameter estimates. STEM filtering is shown to be superior to the other simple denoising methods studied. STEM filtering is a simple and effective denoising method that could be valuable for a wide range of dynamic PET applications.
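
    A minimal sketch of the STEM idea under stated assumptions: Gaussian-smooth a 4D (x, y, z, t) array, then apply Richardson-Lucy (EM) deconvolution with the same Gaussian as the kernel; sigmas and iteration count are illustrative.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def stem_filter(img4d, sigmas=(1.0, 1.0, 1.0, 2.0), n_iter=10, eps=1e-8):
        """4D Gaussian smoothing followed by Richardson-Lucy (EM) updates.

        The Gaussian kernel is symmetric, so the same blur serves as both
        forward and adjoint operator in the multiplicative update.
        """
        blur = lambda x: gaussian_filter(x, sigmas)
        smoothed = blur(img4d)
        est = np.maximum(smoothed, eps)
        for _ in range(n_iter):
            ratio = smoothed / np.maximum(blur(est), eps)
            est = est * blur(ratio)
        return est

    frames = np.random.default_rng(0).poisson(50, (16, 16, 8, 20)).astype(float)
    print(stem_filter(frames).shape)
    ```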

  4. The importance of scaling for detecting community patterns: success and failure in assemblages of introduced species

    USGS Publications Warehouse

    Allen, Craig R.; Angeler, David G.; Moulton, Michael P.; Holling, Crawford S.

    2015-01-01

    Community saturation can help to explain why biological invasions fail. However, previous research has documented inconsistent relationships between failed invasions (i.e., an invasive species colonizes but goes extinct) and the number of species present in the invaded community. We use data from bird communities of the Hawaiian island of Oahu, which supports a community of 38 successfully established introduced birds and where 37 species were introduced but went extinct (failed invasions). We develop a modified approach to evaluate the effects of community saturation on invasion failure. Our method accounts for (1) the number of species present (NSP) when a species goes extinct, rather than at its introduction; and (2) scaling patterns in bird body mass distributions, reflecting the hierarchical organization of ecosystems and the fact that interaction strength among species varies with scale. We found that when using NSP at the time of extinction, NSP was higher for failed introductions as compared to successful introductions, supporting the idea that increasing species richness and putative community saturation mediate invasion resistance. Accounting for scale-specific patterns in body size distributions further improved the relationship between NSP and introduction failure. Results show that a better understanding of invasion outcomes can be obtained when scale-specific community structure is accounted for in the analysis.

  5. Design study of a re-bunching RFQ for the SPES project

    NASA Astrophysics Data System (ADS)

    Shin, Seung Wook; Palmieri, A.; Comunian, M.; Grespan, F.; Chai, Jong Seo

    2014-05-01

    An upgrade to the 2nd generation of the Selective Production of Exotic Species (SPES) project to produce a radioactive ion beam (RIB) has been studied at the Istituto Nazionale di Fisica Nucleare - Laboratori Nazionali di Legnaro (INFN-LNL). Due to the long distance between the isotope separator online (ISOL) facility and the superconducting quarter-wave resonator (QWR) accelerator ALPI (Acceleratore Lineare Per Ioni), a new re-buncher cavity must be introduced to maintain the high beam quality during beam transport. A particular radio frequency quadrupole (RFQ) structure has been suggested to meet the requirements of this project. A window-type RFQ, which has high mode separation, lower power dissipation and a compact size compared to the conventional 4-vane RFQ, has been introduced. The RF design has been studied considering the requirements of the re-bunching machine for high figures of merit such as a proper operating frequency, a high shunt impedance, a high quality factor, a low power dissipation, etc. A sensitivity analysis of the fabrication and misalignment errors has been conducted. A micro-movement slug tuner has been introduced to compensate for the frequency variations that may occur due to beam loading, thermal instability, the microphonic effect, etc.

  6. Buckling Behavior of Compression-Loaded Quasi-Isotropic Curved Panels with a Circular Cutout

    NASA Technical Reports Server (NTRS)

    Hilburger, Mark W.; Britt, Vicki O.; Nemeth, Michael P.

    1999-01-01

    Results from a numerical and experimental study of the response of compression-loaded quasi-isotropic curved panels with a centrally located circular cutout are presented. The numerical results were obtained by using a geometrically nonlinear finite element analysis code. The effects of cutout size, panel curvature and initial geometric imperfections on the overall response of compression-loaded panels are described. In addition, results are presented from a numerical parametric study that indicate the effects of elastic circumferential edge restraints on the prebuckling and buckling response of a selected panel, and these numerical results are compared to experimentally measured results. These restraints are used to identify the effects of circumferential edge restraints that are introduced by the test fixture used in the present study. It is shown that circumferential edge restraints can introduce substantial nonlinear prebuckling deformations into shallow compression-loaded curved panels that can result in a significant increase in buckling load.

  7. Analysis on accuracy improvement of rotor-stator rubbing localization based on acoustic emission beamforming method.

    PubMed

    He, Tian; Xiao, Denghong; Pan, Qiang; Liu, Xiandong; Shan, Yingchun

    2014-01-01

    This paper introduces an improved acoustic emission (AE) beamforming method to localize rotor-stator rubbing faults in rotating machinery. To investigate the propagation characteristics of acoustic emission signals in the casing shell plate of rotating machinery, plate wave theory is applied to a thin plate. A simulation is conducted and its result shows that the localization accuracy of beamforming depends on multiple modes, dispersion, velocity and array dimension. In order to reduce the effect of propagation characteristics on source localization, an AE signal pre-processing method is introduced by combining plate wave theory and the wavelet packet transform. A revised localization velocity to reduce the effect of array size is also presented. The accuracy of rubbing localization based on beamforming and on the improved method of the present paper are compared in a rubbing test carried out on a rotating machinery test table. The results indicate that the improved method can localize the rub fault effectively. Copyright © 2013 Elsevier B.V. All rights reserved.
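
    As a hedged baseline (conventional delay-and-sum, assuming a single non-dispersive wave speed, precisely the limitation the paper's improvement targets), source localization can be sketched as below.

    ```python
    import numpy as np

    def delay_and_sum(signals, fs, sensors, grid, c):
        """Conventional delay-and-sum beamforming for source localization.

        signals: (n_sensors, n_samples) array; sensors and grid are (n, 2)
        coordinate arrays; c is an assumed single wave speed. Each sensor
        signal is shifted by the travel time from a candidate grid point
        and summed; the true source maximizes the summed energy. np.roll
        wraps samples, which is acceptable for this sketch only.
        """
        power = np.zeros(len(grid))
        for g, pt in enumerate(grid):
            delays = np.linalg.norm(sensors - pt, axis=1) / c   # seconds
            shifts = np.round((delays - delays.min()) * fs).astype(int)
            stacked = sum(np.roll(sig, -s) for sig, s in zip(signals, shifts))
            power[g] = np.sum(stacked ** 2)
        return grid[np.argmax(power)]
    ```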

  8. Nonlinear image registration with bidirectional metric and reciprocal regularization

    PubMed Central

    Ying, Shihui; Li, Dan; Xiao, Bin; Peng, Yaxin; Du, Shaoyi; Xu, Meifeng

    2017-01-01

    Nonlinear registration is an important technique to align two different images and widely applied in medical image analysis. In this paper, we develop a novel nonlinear registration framework based on the diffeomorphic demons, where a reciprocal regularizer is introduced to assume that the deformation between two images is an exact diffeomorphism. In detail, first, we adopt a bidirectional metric to improve the symmetry of the energy functional, whose variables are two reciprocal deformations. Secondly, we slack these two deformations into two independent variables and introduce a reciprocal regularizer to assure the deformations being the exact diffeomorphism. Then, we utilize an alternating iterative strategy to decouple the model into two minimizing subproblems, where a new closed form for the approximate velocity of deformation is calculated. Finally, we compare our proposed algorithm on two data sets of real brain MR images with two relative and conventional methods. The results validate that our proposed method improves accuracy and robustness of registration, as well as the gained bidirectional deformations are actually reciprocal. PMID:28231342

  9. Calculating the Optimum Angle of Filament-Wound Pipes in Natural Gas Transmission Pipelines Using Approximation Methods.

    PubMed

    Reza Khoshravan Azar, Mohammad; Emami Satellou, Ali Akbar; Shishesaz, Mohammad; Salavati, Bahram

    2013-04-01

    Given the increasing use of composite materials in various industries, the oil and gas industry also requires that more attention be paid to these materials. Furthermore, given the variety of material choices, candidate materials are analyzed with respect to mechanical strength, resistance in critical situations such as fire, cost, and other priorities, and the most suitable options for achieving specific goals are introduced. In this study, we introduce an appropriate choice for use in natural gas transmission composite pipelines. A 4-layered filament-wound (FW) composite pipe under internal pressure is then considered and analyzed. The results are calculated for winding-angle combinations of 15 deg, 30 deg, 45 deg, 55 deg, 60 deg, 75 deg, and 80 deg. Finally, the calculated values are compared and the optimal angle is obtained using the approximation methods. The layering is assumed to be symmetrical.

  10. A Comprehensive Review of Spirit Drink Safety Standards and Regulations from an International Perspective.

    PubMed

    Pang, Xiao-Na; Li, Zhao-Jie; Chen, Jing-Yu; Gao, Li-Juan; Han, Bei-Zhong

    2017-03-01

    Standards and regulations related to spirit drinks have been established by different countries and international organizations to ensure the safety and quality of spirits. Here, we introduce the principles of food safety and quality standards for alcoholic beverages and then compare the key indicators used in the distinct standards of the Codex Alimentarius Commission, the European Union, the People's Republic of China, the United States, Canada, and Australia. We also discuss in detail the "maximum level" of the following main contaminants of spirit drinks: methanol, higher alcohols, ethyl carbamate, hydrocyanic acid, heavy metals, mycotoxins, phthalates, and aldehydes. Furthermore, the control measures used for potential hazards are introduced. Harmonization of the current requirements based on comprehensive scope analysis and the risk assessment approach will enhance both the trade and quality of distilled spirits. This review article provides valuable information that will enable producers, traders, governments, and researchers to increase their knowledge of spirit drink safety requirements, control measures, and research trends.

  11. Boundary-to-Marker Evidence-Controlled Segmentation and MDL-Based Contour Inference for Overlapping Nuclei.

    PubMed

    Song, Jie; Xiao, Liang; Lian, Zhichao

    2017-03-01

    This paper presents a novel method for automated morphology delineation and analysis of cell nuclei in histopathology images. Combining the initial segmentation information and concavity measurement, the proposed method first segments clusters of nuclei into individual pieces, avoiding segmentation errors introduced by the scale-constrained Laplacian-of-Gaussian filtering. After that a nuclear boundary-to-marker evidence computing is introduced to delineate individual objects after the refined segmentation process. The obtained evidence set is then modeled by the periodic B-splines with the minimum description length principle, which achieves a practical compromise between the complexity of the nuclear structure and its coverage of the fluorescence signal to avoid the underfitting and overfitting results. The algorithm is computationally efficient and has been tested on the synthetic database as well as 45 real histopathology images. By comparing the proposed method with several state-of-the-art methods, experimental results show the superior recognition performance of our method and indicate the potential applications of analyzing the intrinsic features of nuclei morphology.

  12. Artifacts introduced by ion milling in Al-Li-Cu alloys.

    PubMed

    Singh, A K; Imam, M A; Sadananda, K

    1988-04-01

    Ion milling is commonly used to prepare specimens for observation under transmission electron microscope (TEM). This technique sometimes introduces artifacts in specimens contributing to misleading interpretation of TEM results as observed in the present investigation of Al-Li-Cu alloys. This type of alloy, in general, contains several kinds of precipitates, namely delta', T1, and theta'. It is found that ion milling even for a short time produces drastic changes in the precipitate characteristics as compared to standard electropolishing methods of specimen preparation for TEM. Careful analysis of selected area diffraction patterns and micrographs shows that after ion milling delta' precipitates are very irregular, whereas other precipitates coarsen and they are surrounded by misfit dislocations. In situ hot-stage TEM experiments were performed to relate the microstructure to that observed in the ion-milled specimen. Results and causes of ion milling effects on the microstructure are discussed in relation to standard electropolishing techniques and in situ hot-stage experiment.

  13. Soil-ecological conditions of Korean pine growth in its natural area and upon introduction in the European part of Russia

    NASA Astrophysics Data System (ADS)

    Voityuk, M. M.

    2015-05-01

    Socioeconomic expediency and the soil-ecological potential of introducing Korean pine (Pinus koraiensis) in the forest zone of the European part of Russia are discussed. The specific soil-ecological conditions and technologies applied for growing Korean pine in some tree farms in the Far East region and in the European part of Russia are compared. The main soil-ecological factors and optimum soil parameters for the successful development of Korean pine in its natural and introduction areas are determined. It is shown that the development of Korean pine seedlings on well-drained soils depends on the contents of potassium, humus, and physical clay in the soils. The seedlings reach maximum size when grown on soddy-podzolic soils (Retisols). The analysis of mineral nutrition of pine seedlings of different ages, soil conditions, and seasonal growth phases shows that the contents of potassium and some microelements play the leading role in the successful growth of introduced Korean pine.

  14. Prospects for reduced energy transports: A preliminary analysis

    NASA Technical Reports Server (NTRS)

    Ardema, M. D.; Harper, M.; Smith, C. L.; Waters, M. H.; Williams, L. J.

    1974-01-01

    The recent energy crisis and subsequent substantial increase in fuel prices have provided increased incentive to reduce the fuel consumption of civil transport aircraft. At the present time many changes in operational procedures have been introduced to decrease fuel consumption of the existing fleet. In the future, however, it may become desirable or even necessary to introduce new fuel-conservative aircraft designs. This paper reports the results of a preliminary study of new near-term fuel conservative aircraft. A parametric study was made to determine the effects of cruise Mach number and fuel cost on the optimum configuration characteristics and on economic performance. For each design, the wing geometry was optimized to give maximum return on investment at a particular fuel cost. Based on the results of the parametric study, a nominal reduced energy configuration was selected. Compared with existing transport designs, the reduced energy design has a higher aspect ratio wing with lower sweep, and cruises at a lower Mach number. It has about 30% less fuel consumption on a seat-mile basis.

  15. The national e-medication approaches in Germany, Switzerland and Austria: A structured comparison.

    PubMed

    Gall, Walter; Aly, Amin-Farid; Sojer, Reinhold; Spahni, Stéphane; Ammenwerth, Elske

    2016-09-01

    Recent studies show that many patients are harmed due to missing or erroneous information on prescribed and taken medication. Many countries are thus introducing eHealth solutions to improve the availability of this medication information on a national scale (often called "e-medication"). The objective of this study is to analyse and compare the national e-medication solutions currently being introduced in Germany, Switzerland and Austria. Information on the situation in the three countries was collected within an expert group and complemented by an analysis of recent literature and legislation in each country. All three countries formulate comparable goals for their national eHealth solutions, focusing on improving medication safety. None of the three countries has a national e-prescription system. In all three countries, the implementation process was slower than expected and e-medication is not yet fully available. The three countries differ in their chosen architectures, standards, offered functionalities, and degree of voluntariness of participation. Nationwide e-medication systems and cross-border harmonization are acknowledged as important goals towards medication safety, but they develop slowly, mainly due to privacy and security requirements, the need for law amendments and, last but not least, political interests. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  16. Retina Image Vessel Segmentation Using a Hybrid CGLI Level Set Method

    PubMed Central

    Chen, Meizhu; Li, Jichun; Zhang, Encai

    2017-01-01

    As a nonintrusive method, retina imaging provides us with a better way to diagnose ophthalmologic diseases. Extracting the vessel profile automatically from the retina image is an important step in analyzing retina images. A novel hybrid active contour model is proposed in this paper to segment the fundus image automatically. It combines the signed pressure force function introduced by the Selective Binary and Gaussian Filtering Regularized Level Set (SBGFRLS) model with the local intensity property introduced by the Local Binary Fitting (LBF) model to overcome the difficulty of low contrast in the segmentation process. It is more robust to the initial condition than traditional methods and is easily implemented compared to supervised vessel extraction methods. The proposed segmentation method was evaluated on two public datasets, DRIVE (Digital Retinal Images for Vessel Extraction) and STARE (Structured Analysis of the Retina), achieving an average accuracy of 0.9390 with 0.7358 sensitivity and 0.9680 specificity on DRIVE, and an average accuracy of 0.9409 with 0.7449 sensitivity and 0.9690 specificity on STARE. The experimental results show that our method is effective and is also robust to some kinds of pathology images compared with traditional level set methods. PMID:28840122

  17. An Electromyographic-driven Musculoskeletal Torque Model using Neuro-Fuzzy System Identification: A Case Study

    PubMed Central

    Jafari, Zohreh; Edrisi, Mehdi; Marateb, Hamid Reza

    2014-01-01

    The purpose of this study was to estimate the torque from high-density surface electromyography signals of the biceps brachii, brachioradialis, and the medial and lateral heads of the triceps brachii muscles during moderate-to-high isometric elbow flexion-extension. The elbow torque was estimated in two steps: first, surface electromyography (EMG) amplitudes were estimated using principal component analysis, and then a fuzzy model was proposed to describe the relationship between the EMG amplitudes and the measured torque signal. A neuro-fuzzy method, with which the optimum number of rules could be estimated, was used to identify a model of suitable complexity. The proposed neuro-fuzzy model introduces clinical interpretability, in contrast to previous linear and nonlinear black-box system identification models. It also reduced the estimation error compared with that of the most recent and accurate nonlinear dynamic model introduced in the literature. The optimum number of rules for all trials was 4 ± 1, which might be related to motor control strategies, and the percent variance accounted for was 96.40 ± 3.38, a considerable improvement over previous methods. The proposed method is thus a promising new tool for EMG-torque modeling in clinical applications. PMID:25426427
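
    A hedged sketch of the amplitude-estimation stage: collapse the high-density channels to one amplitude per muscle with PCA, then map amplitudes to torque. A plain linear map stands in for the paper's neuro-fuzzy model, and the signals are synthetic.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n_samples, n_channels = 2000, 64
    activation = np.abs(np.sin(np.linspace(0, 6 * np.pi, n_samples)))
    hd_emg = activation[:, None] * rng.standard_normal((n_samples, n_channels))

    amp = PCA(n_components=1).fit_transform(np.abs(hd_emg))  # one amplitude
    torque = 12.0 * activation + rng.normal(0, 0.3, n_samples)

    model = LinearRegression().fit(amp, torque)  # linear stand-in model
    print(model.score(amp, torque))              # variance accounted for (R^2)
    ```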

  18. The influence of hydraulic conditions on coagulation process effectiveness

    NASA Astrophysics Data System (ADS)

    Sambor, Aleksandra; Ferenc, Zbigniew

    2017-11-01

    This paper presents the impact that small changes in the hydraulic installation between the flocculation chamber and the sedimentation tanks have on coagulation process effectiveness. This study has shown significant improvements in the parameters of the treated water. The research was conducted in two treatment systems, reference and test, in order to compare the changes that were introduced in the period between January and May 2016. The hydraulic conditions between the flocculation chamber and the sedimentation tank were changed in the test system, leaving the reference system unchanged for comparative purposes. The height-wise positioning of the sedimentation tank relative to the flocculation chamber resulted in the formation of a cascade at the flocculation chamber drain at a height of 0.60 m. Air was therefore introduced into the water, forming an air-water mixture, which disturbed the flow between the devices. It was found that floc transported by the pipeline was broken down, which hampered sedimentation in the sedimentation tank. This was confirmed by the analysis of selected parameters of the treated water. After changes in the hydraulic system, changes in water turbidity were noticed, indicating an increase in post-coagulation suspension separation effectiveness. Consequently, an increase in organic carbon removal was found relative to the reference system. This change influenced changes in UV254 absorbance to a much lesser extent.

  19. Examining the delivery modes of metacognitive awareness and active reading lessons in a college nonmajors introductory biology course.

    PubMed

    Hill, Kendra M; Brözel, Volker S; Heiberger, Greg A

    2014-05-01

    Current research supports the role of metacognitive strategies to enhance reading comprehension. This study measured the effectiveness of online versus face-to-face metacognitive and active reading skills lessons introduced by Biology faculty to college students in a nonmajors introductory biology course. These lessons were delivered in two lectures either online (Group 1: N = 154) or face to face (Group 2: N = 152). Previously validated pre- and post- surveys were used to collect and compare data by paired and independent t-test analysis (α = 0.05). Pre- and post- survey data showed a statistically significant improvement in both groups in metacognitive awareness (p = 0.001, p = 0.003, respectively) and reading comprehension (p < 0.001 for both groups). When comparing the delivery mode of these lessons, no difference was detected between the online and face-to-face instruction for metacognitive awareness (pre- p = 0.619, post- p = 0.885). For reading comprehension, no difference in gains was demonstrated between online and face-to-face (p = 0.381); however, differences in pre- and post- test scores were measured (pre- p = 0.005, post- p = 0.038). This study suggests that biology instructors can easily introduce effective metacognitive awareness and active reading lessons into their course, either through online or face-to-face instruction.

  20. Assessing the biophysical naturalness of grassland in eastern North Dakota with hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Zhou, Qiang

    Over the past two decades, non-native species within grassland communities have spread quickly due to human migration and commerce. Invasive species like smooth brome grass (Bromus inermis) and Kentucky bluegrass (Poa pratensis) seriously threaten conservation of native grasslands. This study aims to discriminate between native grasslands and planted hayfields and conservation areas dominated by introduced grasses using hyperspectral imagery. Hyperspectral imagery from the Hyperion sensor on EO-1 was acquired in late spring and late summer of 2009 and 2010. Field spectra for widely distributed species as well as smooth brome grass and Kentucky bluegrass were collected from the study sites throughout the growing season. Imagery was processed with an unmixing algorithm to estimate fractional cover of green and dry vegetation and bare soil. Because spectra differ significantly through the growing season, spectral libraries for the most common species were built for both the early and the late growing season. After testing multiple methods, the Adaptive Coherence Estimator (ACE) was used for spectral matching analysis between the imagery and the spectral libraries. Due in part to spectral similarity among key species, the results of the spectral matching analysis were not definitive. Additional indexes, "Level of Dominance" and "Band Variance", were calculated to measure the predominance of spectral signatures in any area. A texture co-occurrence analysis was also performed on both indexes to extract spatial characteristics. The results suggest that, compared with disturbed areas, native prairie tends to have generally lower "Level of Dominance" and "Band Variance" as well as lower spatial dissimilarity. A final decision tree model was created to predict the presence of native or introduced grassland. The model was more effective for identification of mixed native grassland than for grassland dominated by a single species. The discrimination of native and introduced grassland was limited by the similarity of spectral signatures between forb-dominated native grasslands and brome-grass stands. However, saline native grasslands were distinguishable from brome grass.
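
    A minimal sketch of linear spectral unmixing of the kind described above, via nonnegative least squares with hypothetical endmember spectra; the thesis's actual unmixing algorithm may differ.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def unmix(pixel, endmembers):
        """Fractional cover by nonnegative least squares, a common linear
        spectral unmixing approach; fractions are renormalized to sum to
        one for interpretability."""
        E = np.asarray(endmembers).T            # (bands, endmembers)
        frac, _ = nnls(E, np.asarray(pixel))
        return frac / frac.sum()

    # Hypothetical 5-band endmember spectra: green veg, dry veg, bare soil
    endmembers = np.array([[0.05, 0.08, 0.04, 0.45, 0.25],
                           [0.15, 0.18, 0.20, 0.30, 0.35],
                           [0.20, 0.25, 0.28, 0.32, 0.38]])
    pixel = 0.6 * endmembers[0] + 0.3 * endmembers[1] + 0.1 * endmembers[2]
    print(unmix(pixel, endmembers))  # ~[0.6, 0.3, 0.1]
    ```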

  1. An Analysis of Periodic Components in BL Lac Object S5 0716 +714 with MUSIC Method

    NASA Astrophysics Data System (ADS)

    Tang, J.

    2012-01-01

    The multiple signal classification (MUSIC) algorithm is introduced for estimating the variation period of BL Lac objects. The principle of the MUSIC spectral analysis method and a theoretical analysis of its frequency resolution, using analog signals, are included. From the literature, we collected effective observation data of the BL Lac object S5 0716+714 in the V, R and I bands from 1994 to 2008. The light variation periods of S5 0716+714 were obtained by means of the MUSIC spectral analysis method and the periodogram spectral analysis method. Two major periods exist for all bands: (3.33±0.08) years and (1.24±0.01) years. The period estimate based on the MUSIC spectral analysis method is compared with that based on the periodogram spectral analysis method. MUSIC is a super-resolution algorithm that requires only a short data length and can detect the variation periods of weak signals.
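
    For readers unfamiliar with MUSIC, the hedged Python sketch below computes a MUSIC pseudospectrum for an evenly sampled light curve. The snapshot length m and the assumed number of signal components k are illustrative choices, not values from the paper, and real photometric data are unevenly sampled and would need resampling first.

        # Hedged MUSIC sketch for an evenly sampled series x with time step dt.
        import numpy as np

        def music_spectrum(x, freqs, dt, m=30, k=4):
            x = np.asarray(x, dtype=float) - np.mean(x)
            # Correlation matrix from overlapping length-m snapshots
            snaps = np.array([x[i:i + m] for i in range(len(x) - m)])
            R = snaps.T @ snaps / len(snaps)
            _, V = np.linalg.eigh(R)                 # eigenvalues in ascending order
            En = V[:, : m - k]                       # noise subspace (smallest m - k)
            n = np.arange(m)
            pseudo = []
            for f in freqs:
                a = np.exp(2j * np.pi * f * dt * n)  # steering vector at frequency f
                pseudo.append(1.0 / np.linalg.norm(En.T @ a) ** 2)
            return np.array(pseudo)                  # peaks mark candidate periods 1/f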

  2. Comparative ergonomic workflow and user experience analysis of MRI versus fluoroscopy-guided vascular interventions: an iliac angioplasty exemplar case study.

    PubMed

    Fernández-Gutiérrez, Fabiola; Martínez, Santiago; Rube, Martin A; Cox, Benjamin F; Fatahi, Mahsa; Scott-Brown, Kenneth C; Houston, J Graeme; McLeod, Helen; White, Richard D; French, Karen; Gueorguieva, Mariana; Immel, Erwin; Melzer, Andreas

    2015-10-01

    A methodological framework is introduced to assess and compare a conventional fluoroscopy protocol for peripheral angioplasty with a new magnetic resonance imaging (MRI)-guided protocol. Different scenarios were considered during interventions on a perfused arterial phantom with regard to time-based and cognitive task analysis, user experience and ergonomics. Three clinicians with different expertise performed a total of 43 simulated common iliac angioplasties (9 fluoroscopic, 34 MRI-guided) in two blocks of sessions. Six different configurations for MRI guidance were tested in the first block. Four of them were evaluated in the second block and compared to the fluoroscopy protocol. The durations of the relevant stages were collected, and interventions were audio-visually recorded from different perspectives. A cued retrospective protocol analysis (CRPA) was undertaken, including personal interviews. In addition, ergonomic constraints in the MRI suite were evaluated. Significant differences were found when comparing the performance of the MRI configurations versus fluoroscopy. Two configurations [with times of 8.56 (0.64) and 9.48 (1.13) min] led to reduced procedure times for MRI guidance, comparable to fluoroscopy [8.49 (0.75) min]. The CRPA pointed out the main factors influencing clinical procedure performance. The ergonomic analysis quantified musculoskeletal risks for interventional radiologists when utilising MRI. Several alternatives were suggested to prevent potential low-back injuries. This work presents a step towards the implementation of efficient operational protocols for MRI-guided procedures based on an integral and multidisciplinary framework, applicable to the assessment of current vascular protocols. The use of a first-user perspective raises the possibility of establishing new forms of clinical training and education.

  3. A fully Bayesian method for jointly fitting instrumental calibration and X-ray spectral models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Jin; Yu, Yaming; Van Dyk, David A.

    2014-10-20

    Owing to a lack of robust principled methods, systematic instrumental uncertainties have generally been ignored in astrophysical data analysis despite wide recognition of the importance of including them. Ignoring calibration uncertainty can cause bias in the estimation of source model parameters and can lead to underestimation of the variance of these estimates. We previously introduced a pragmatic Bayesian method to address this problem. The method is 'pragmatic' in that it introduced an ad hoc technique that simplified computation by neglecting the potential information in the data for narrowing the uncertainty for the calibration product. Following that work, we use a principal component analysis to efficiently represent the uncertainty of the effective area of an X-ray (or γ-ray) telescope. Here, however, we leverage this representation to enable a principled, fully Bayesian method that coherently accounts for the calibration uncertainty in high-energy spectral analysis. In this setting, the method is compared with standard analysis techniques and the pragmatic Bayesian method. The advantage of the fully Bayesian method is that it allows the data to provide information not only for estimation of the source parameters but also for the calibration product, here the effective area, conditional on the adopted spectral model. In this way, it can yield more accurate and efficient estimates of the source parameters along with valid estimates of their uncertainty. Provided that the source spectrum can be accurately described by a parameterized model, this method allows rigorous inference about the effective area by quantifying which possible curves are most consistent with the data.
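
    A minimal sketch of the principal-component representation of effective-area uncertainty, assuming one already has an ensemble of simulated calibration curves (synthetic here); the energy grid, ensemble and component count are all illustrative, not the paper's calibration products.

        # Represent calibration-curve uncertainty with a truncated PCA/SVD.
        import numpy as np

        rng = np.random.default_rng(2)
        energies = np.linspace(0.3, 10.0, 400)              # keV grid (assumed)
        nominal = 500 * np.exp(-0.5 * ((energies - 1.5) / 2.0) ** 2)
        ensemble = nominal * (1 + 0.05 * rng.standard_normal((1000, 1))
                              * np.sin(energies / 2))       # toy calibration draws

        dev = ensemble - ensemble.mean(axis=0)
        U, s, Vt = np.linalg.svd(dev, full_matrices=False)
        n_comp = 8                                          # keep leading components
        # A plausible effective-area draw: mean + random mix of leading components
        draw = ensemble.mean(axis=0) + (rng.standard_normal(n_comp)
                * s[:n_comp] / np.sqrt(len(ensemble))) @ Vt[:n_comp]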

  4. Sample introducing apparatus and sample modules for mass spectrometer

    DOEpatents

    Thompson, Cyril V.; Wise, Marcus B.

    1993-01-01

    An apparatus for introducing gaseous samples from a wide range of environmental matrices into a mass spectrometer for analysis is described. Several sample-preparing modules, including a real-time air monitoring module, a soil/liquid purge module, and a thermal desorption module, are individually and rapidly attachable to the sample-introducing apparatus for supplying gaseous samples to the mass spectrometer. The sample-introducing apparatus uses a capillary column to convey the gaseous samples into the mass spectrometer and is provided with an open/split interface in communication with the capillary, together with a sample archiving port through which at least about 90 percent of the gaseous sample-inert gas mixture introduced into the apparatus is separated from the minor portion of the mixture entering the capillary and is discharged from the apparatus.

  5. Profiling and relative quantification of phosphatidylethanolamine based on acetone stable isotope derivatization.

    PubMed

    Wang, Xiang; Wei, Fang; Xu, Ji-Qu; Lv, Xin; Dong, Xu-Yan; Han, Xianlin; Quek, Siew-Young; Huang, Feng-Hong; Chen, Hong

    2016-01-01

    Phosphatidylethanolamine (PE) is considered to be one of the pivotal lipids for normal cellular function as well as disease initiation and progression. In this study, a simple, efficient, reliable, and inexpensive method for the qualitative analysis and relative quantification of PE, based on acetone stable isotope derivatization combined with double neutral loss scan-shotgun electrospray ionization tandem-quadrupole mass spectrometry analysis (ASID-DNLS-Shotgun ESI-MS/MS), was developed. The ASID method led to alkylation of the primary amino groups of PE with an isopropyl moiety. The use of acetone (d0-acetone) and deuterium-labeled acetone (d6-acetone) introduced a 6 Da mass shift that was ideally suited for relative quantitative analysis and enhanced sensitivity for mass analysis. The DNLS mode was introduced to simultaneously analyze the differentially derivatized PEs by shotgun ESI-MS/MS with high selectivity and accuracy. The reaction specificity, labeling efficiency, and linearity of the ASID method were thoroughly evaluated in this study. Its excellent applicability was validated by qualitative and relative quantitative analysis of the PE species present in liver samples from rats fed different diets. Using the ASID-DNLS-Shotgun ESI-MS/MS method, 45 PE species from rat livers were identified and quantified in an efficient manner. The level of total PEs tended to decrease in the livers of rats on high fat diets compared with controls. The levels of PE 32:1, 34:3, 34:2, 36:3, 36:2, 42:10, plasmalogen PE 36:1 and lyso PE 22:6 were significantly reduced, while levels of PE 36:1 and lyso PE 16:0 increased. Copyright © 2015 Elsevier B.V. All rights reserved.
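
    The 6 Da light/heavy pairing lends itself to a simple illustration. The sketch below matches hypothetical d0/d6 peak pairs and reports their intensity ratios; all m/z values are invented and singly charged ions are assumed.

        # Pair light (d0) and heavy (d6) peaks separated by the 6 Da shift.
        peaks = {700.52: 1.8e5, 706.52: 2.3e5, 718.54: 9.1e4, 724.54: 8.7e4}
        SHIFT, TOL = 6.0, 0.01                  # Da shift; matching tolerance
        for mz, light in sorted(peaks.items()):
            heavy = next((i for m, i in peaks.items()
                          if abs(m - (mz + SHIFT)) < TOL), None)
            if heavy is not None:               # relative quantification ratio
                print(f"pair {mz:.2f}/{mz + SHIFT:.2f}  d0/d6 = {light / heavy:.2f}")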

  6. Fractal analysis of the ischemic transition region in chronic ischemic heart disease using magnetic resonance imaging.

    PubMed

    Michallek, Florian; Dewey, Marc

    2017-04-01

    To introduce a novel hypothesis and method to characterise the pathomechanisms underlying myocardial ischemia in chronic ischemic heart disease by local fractal analysis (FA) of the ischemic myocardial transition region in perfusion imaging. Vascular mechanisms to compensate ischemia are regulated at various vascular scales, with their superimposed perfusion pattern being hypothetically self-similar. Dedicated FA software ("FraktalWandler") has been developed. Fractal dimensions during first-pass (FD_first-pass) and recirculation (FD_recirculation) are hypothesised to indicate the predominating pathomechanism and ischemic severity, respectively. Twenty-six patients with evidence of myocardial ischemia in 108 ischemic myocardial segments on magnetic resonance imaging (MRI) were analysed. The 40th and 60th percentiles of FD_first-pass were used for pathomechanical classification, assigning lesions with FD_first-pass ≤ 2.335 to predominating coronary microvascular dysfunction (CMD) and ≥ 2.387 to predominating coronary artery disease (CAD). The optimal classification point in ROC analysis was FD_first-pass = 2.358. FD_recirculation correlated moderately with per cent diameter stenosis in invasive coronary angiography in lesions classified as CAD (r = 0.472, p = 0.001) but not CMD (r = 0.082, p = 0.600). The ischemic transition region may provide information on the pathomechanical composition and severity of myocardial ischemia. FA of this region is feasible and may improve diagnosis compared to traditional noninvasive myocardial perfusion analysis. • A novel hypothesis and method is introduced to pathophysiologically characterise myocardial ischemia. • The ischemic transition region appears a meaningful diagnostic target in perfusion imaging. • Fractal analysis may characterise the pathomechanical composition and severity of myocardial ischemia.
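
    As background, a generic box-counting estimate of fractal dimension is sketched below for a 2D binary mask. This is not the authors' "FraktalWandler" software; their FD values near 2.3 come from treating the perfusion map as a grey-level intensity surface, for which the dimension lies between 2 and 3, whereas a flat binary mask yields dimensions up to 2.

        # Generic box-counting sketch on a 2D binary mask (assumed non-empty).
        import numpy as np

        def box_counting_dimension(mask):
            n = 2 ** int(np.floor(np.log2(min(mask.shape))))
            mask = mask[:n, :n]                  # crop to a power-of-two square
            sizes, counts = [], []
            k = n
            while k >= 2:
                boxes = mask.reshape(n // k, k, n // k, k).any(axis=(1, 3))
                sizes.append(k)
                counts.append(boxes.sum())       # occupied boxes of side k
                k //= 2
            # FD is the slope of log N(k) against log(1/k)
            slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
            return slope

        rng = np.random.default_rng(3)
        fd = box_counting_dimension(rng.random((128, 128)) < 0.3)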

  7. Technical and Economical Feasibility of SSTO and TSTO Launch Vehicles

    NASA Astrophysics Data System (ADS)

    Lerch, Jens

    This paper discusses whether it is more cost-effective to launch to low Earth orbit in one or two stages, assuming current or near-future technologies. First the paper provides an overview of the current state of the launch market and the hurdles to introducing new launch vehicles capable of significantly lowering the cost of access to space, and discusses possible routes to solve those problems. It is assumed that reducing the complexity of launchers by reducing the number of stages and engines, and introducing reusability, will result in lower launch costs. A number of operational and historic launch vehicle stages capable of near single-stage-to-orbit (SSTO) performance are presented, and the steps necessary to modify them into an expendable SSTO launcher and an optimized two-stage-to-orbit (TSTO) launcher are shown through parametric analysis. Then a ballistic reentry and recovery system is added to show that reusable SSTO and TSTO vehicles are also within the current state of the art. The development and recurring costs of the SSTO and TSTO systems are estimated and compared. This analysis shows whether it is more economical to develop and operate expendable or reusable SSTO or TSTO systems under different assumptions for launch rate and initial investment.
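
    The staging trade-off at the heart of this analysis can be illustrated with the Tsiolkovsky rocket equation. The delta-v budget, specific impulse and structural fractions below are generic textbook assumptions, not the paper's parametric inputs.

        # Toy staging comparison via the rocket equation (illustrative numbers).
        import math

        DV_LEO = 9300.0    # m/s, typical total delta-v to low Earth orbit
        ISP = 350.0        # s, assumed average specific impulse
        G0 = 9.80665       # m/s^2
        STRUCT = 0.08      # assumed structural mass fraction per stage

        def payload_fraction(n_stages):
            dv_stage = DV_LEO / n_stages            # split delta-v evenly
            frac = 1.0
            for _ in range(n_stages):
                mass_ratio = math.exp(dv_stage / (ISP * G0))
                stage_frac = 1.0 / mass_ratio - STRUCT
                if stage_frac <= 0:                 # stage cannot close
                    return 0.0
                frac *= stage_frac
            return frac

        print(f"SSTO payload fraction: {payload_fraction(1):.3%}")
        print(f"TSTO payload fraction: {payload_fraction(2):.3%}")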

  8. Evolution and dynamics of a matter creation model

    NASA Astrophysics Data System (ADS)

    Pan, S.; de Haro, J.; Paliathanasis, A.; Slagter, R. J.

    2016-08-01

    In a flat Friedmann-Lemaître-Robertson-Walker (FLRW) geometry, we consider the expansion of the universe powered by gravitationally induced 'adiabatic' matter creation. To demonstrate how matter creation works with the expanding universe, we consider a general creation rate and analyse it in the framework of dynamical analysis. The dynamical analysis hints at the presence of a non-singular universe (without the big bang singularity) with two successive accelerated phases, one at the very early stage of the universe (i.e. inflation) and the other describing the current accelerating universe, where the early and late accelerated phases are associated with an unstable fixed point (i.e. repeller) and a stable fixed point (attractor), respectively. We describe these phenomena by analytic solutions of the Hubble function and the scale factor of the FLRW universe. Using the Jacobi last multiplier method, we find a Lagrangian for this matter creation rate describing this scenario of the universe. To match our early-universe results, we introduce an equivalent dynamics driven by a single scalar field, discuss the associated observable parameters and compare them with the latest Planck data sets. Finally, introducing teleparallel modified gravity, we establish an equivalent gravitational theory in the framework of matter creation.

  9. The influence and analysis of natural crosswind on cooling characteristics of the high level water collecting natural draft wet cooling tower

    NASA Astrophysics Data System (ADS)

    Ma, Libin; Ren, Jianxing

    2018-01-01

    Large-capacity and super-large-capacity thermal power units are becoming the main force of the energy and power industry in our country. The performance of a cooling tower determines the temperature of the circulating water, which has an important influence on the efficiency of a power plant. The natural draft counter-flow wet cooling tower is the most widely used tower type at present, and the high level water collecting cooling tower (high cooling tower) is a new type based on it. In this paper, the application background of the high cooling tower is briefly explained, the structural principles of the conventional cooling tower and the high cooling tower are introduced, and the differences between them are compared. The influence of crosswind on the cooling performance of the high cooling tower at different wind speeds is then examined in detail. The analysis indicates that at low wind speeds crosswind has little impact on the performance of the high cooling tower; at moderate wind speeds crosswind disrupts the air flow inside and outside the tower, reducing its cooling performance; and at high wind speeds the cooling performance recovers somewhat but remains below its level at low wind speed.

  10. Comparison of allele-specific PCR, created restriction-site PCR, and PCR with primer-introduced restriction analysis methods used for screening complex vertebral malformation carriers in Holstein cattle

    PubMed Central

    Altınel, Ahmet

    2017-01-01

    Complex vertebral malformation (CVM) is an inherited, autosomal recessive disorder of Holstein cattle. The aim of this study was to compare sensitivity, specificity, positive and negative predictive values, accuracy, and rapidity of allele-specific polymerase chain reaction (AS-PCR), created restriction-site PCR (CRS-PCR), and PCR with primer-introduced restriction analysis (PCR-PIRA), three methods used in identification of CVM carriers in a Holstein cattle population. In order to screen for the G>T mutation in the solute carrier family 35 member A3 (SLC35A3) gene, DNA sequencing as the gold standard method was used. The prevalence of carriers and the mutant allele frequency were 3.2% and 0.016, respectively, among Holstein cattle in the Thrace region of Turkey. Among the three methods, the fastest but least accurate was AS-PCR. Although the rapidity of CRS-PCR and PCR-PIRA were nearly equal, the accuracy of PCR-PIRA was higher than that of CRS-PCR. Therefore, among the three methods, PCR-PIRA appears to be the most efficacious for screening of mutant alleles when identifying CVM carriers in a Holstein cattle population. PMID:28927256
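
    The comparison metrics listed above are standard functions of a 2x2 confusion matrix against the gold standard. A small Python helper is sketched below with placeholder counts, not the study's actual tallies.

        # Screening-test metrics from a confusion matrix (placeholder counts).
        def screening_metrics(tp, fp, tn, fn):
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),                      # positive predictive value
                "npv": tn / (tn + fn),                      # negative predictive value
                "accuracy": (tp + tn) / (tp + fp + tn + fn),
            }

        print(screening_metrics(tp=9, fp=1, tn=290, fn=0))  # illustrative only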

  11. Cross-Cultural Validation of the Modified Practice Attitudes Scale: Initial Factor Analysis and a New Factor Model.

    PubMed

    Park, Heehoon; Ebesutani, Chad K; Chung, Kyong-Mee; Stanick, Cameo

    2018-01-01

    The objective of this study was to create the Korean version of the Modified Practice Attitudes Scale (K-MPAS) to measure clinicians' attitudes toward evidence-based treatments (EBTs) in the Korean mental health system. Using 189 U.S. therapists and 283 members from the Korean mental health system, we examined the reliability and validity of the MPAS scores. We also conducted the first exploratory and confirmatory factor analysis on the MPAS and compared EBT attitudes across U.S. and Korean therapists. Results revealed that the inclusion of both "reversed-worded" and "non-reversed-worded" items introduced significant method effects that compromised the integrity of the one-factor MPAS model. Problems with the one-factor structure were resolved by eliminating the "non-reversed-worded" items. Reliability and validity were adequate among both Korean and U.S. therapists. Korean therapists also reported significantly more negative attitudes toward EBTs on the MPAS than U.S. therapists. The K-MPAS is the first questionnaire designed to measure Korean service providers' attitudes toward EBTs to help advance the dissemination of EBTs in Korea. The current study also demonstrated the negative impacts that can be introduced by incorporating oppositely worded items into a scale, particularly with respect to factor structure and detecting significant group differences.

  12. Overexpression of the rice carotenoid cleavage dioxygenase 1 gene in Golden Rice endosperm suggests apocarotenoids as substrates in planta.

    PubMed

    Ilg, Andrea; Yu, Qiuju; Schaub, Patrick; Beyer, Peter; Al-Babili, Salim

    2010-08-01

    Carotenoids are converted by carotenoid cleavage dioxygenases that catalyze oxidative cleavage reactions leading to apocarotenoids. However, apocarotenoids can also be further truncated by some members of this enzyme family. The plant carotenoid cleavage dioxygenase 1 (CCD1) subfamily is known to degrade both carotenoids and apocarotenoids in vitro, leading to different volatile compounds. In this study, we investigated the impact of the rice CCD1 (OsCCD1) on the pigmentation of Golden Rice 2 (GR2), a genetically modified rice variety accumulating carotenoids in the endosperm. For this purpose, the corresponding cDNA was introduced into the rice genome under the control of an endosperm-specific promoter in sense and anti-sense orientations. Despite high expression levels of OsCCD1 in sense plants, pigment analysis revealed carotenoid levels and patterns comparable to those of GR2, arguing against carotenoids as substrates in rice endosperm. In support of this, similar carotenoid contents were determined in anti-sense plants. To check whether OsCCD1 overexpressed in GR2 endosperm is active, in vitro assays were performed with apocarotenoid substrates. HPLC analysis confirmed the cleavage activity of the introduced OsCCD1. Our data indicate that apocarotenoids rather than carotenoids are the substrates of OsCCD1 in planta.

  13. The SSME HPFTP interstage seals: Analysis and experiments for leakage and reaction-force coefficients

    NASA Technical Reports Server (NTRS)

    Childs, D. W.

    1983-01-01

    An improved theory for the prediction of the rotordynamic coefficients of turbulent annular seals was developed. Predictions from the theory are compared to experimental results, and an approach for the direct calculation of empirical turbulent coefficients from test data is introduced. An improved short seal solution is shown to do a better job of calculating effective stiffness and damping coefficients than either the original short seal solution or a finite-length solution. However, the original short seal solution does a much better job of predicting the equivalent added mass coefficient.

  14. Simulation study on the maximum continuous working condition of a power plant boiler

    NASA Astrophysics Data System (ADS)

    Wang, Ning; Han, Jiting; Sun, Haitian; Cheng, Jiwei; Jing, Ying'ai; Li, Wenbo

    2018-05-01

    First, the boiler is briefly introduced and the mathematical model and boundary conditions are determined; a numerical simulation of the boiler under the boiler maximum continuous rating (BMCR) condition is then carried out, followed by an analysis of the temperature field under BMCR operation. The simulation results are verified against the boiler's actual test results and the output tests under hot BMCR conditions. The main conclusions are as follows: the position and size of the inscribed circle in the furnace and the furnace temperature distributions at different elevations agree with the test results, confirming the accuracy of the numerical simulation.

  15. A comparison of the poverty impact of transfers, taxes and market income across five OECD countries.

    PubMed

    Bibi, Sami; Duclos, Jean-Yves

    2010-01-01

    This paper compares the poverty reduction impact of income sources, taxes and transfers across five OECD countries. Since the estimation of that impact can depend on the order in which the various income sources are introduced into the analysis, it is done by using the Shapley value. Estimates of the poverty reduction impact are presented in a normalized and unnormalized fashion, in order to take into account the total as well as the per dollar impacts. The methodology is applied to data from the Luxembourg Income Study database.
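
    A minimal sketch of the Shapley decomposition idea: average each income source's marginal effect on a poverty measure over all orders of introduction. The income distributions and poverty line below are invented, and a simple headcount ratio stands in for whatever poverty index is used in the paper.

        # Shapley decomposition of poverty reduction across income sources.
        from itertools import permutations
        import numpy as np

        rng = np.random.default_rng(4)
        sources = {"market": rng.gamma(2, 800, 2000),
                   "transfers": rng.gamma(1.5, 150, 2000),
                   "taxes": -rng.gamma(1.2, 120, 2000)}   # taxes reduce income
        POVERTY_LINE = 1000.0

        def headcount(active):
            total = sum(sources[s] for s in active) if active else np.zeros(2000)
            return (total < POVERTY_LINE).mean()

        names = list(sources)
        shapley = dict.fromkeys(names, 0.0)
        perms = list(permutations(names))
        for order in perms:                     # average marginal contributions
            seen = []
            for s in order:
                before = headcount(seen)
                seen.append(s)
                shapley[s] += (headcount(seen) - before) / len(perms)

        print(shapley)   # contributions sum to headcount(all) - headcount(none)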

  16. The use of smoke acid as an alternative coagulating agent for natural rubber sheets' production.

    PubMed

    Ferreira, Vanda S; Rêgo, Ione N C; Pastore, Floriano; Mandai, Mariana M; Mendes, Leonardo S; Santos, Karin A M; Rubim, Joel C; Suarez, Paulo A Z

    2005-03-01

    A comparative study of rubber sheets obtained using formic acid, acetic acid, and smoke acid as coagulants is presented for latex obtained from native Amazonian trees and also from commercially cultivated trees. The coagulation processes were evaluated by spectroscopic and physical-chemical analysis, which showed no differences in the rubber sheets obtained. This new method of rubber sheet preparation was introduced into Amazonian rainforest rubber tapper communities, which are now producing on a large scale. The physical-mechanical properties were similar among sheets made by different rubber tapper communities using this new method.

  17. Modified Gaussian influence function of deformable mirror actuators.

    PubMed

    Huang, Linhai; Rao, Changhui; Jiang, Wenhan

    2008-01-07

    A new deformable mirror influence function based on a Gaussian function is introduced to analyze the fitting capability of a deformable mirror. Modified expressions for both the azimuthal and radial directions are presented, based on an analysis of the residual error between a measured influence function and a Gaussian influence function. Using a simplex search method, we further compare the ability of the proposed influence function and of a plain Gaussian influence function to fit data produced by a Zygo interferometer. The result indicates that the modified Gaussian influence function provides much better performance in data fitting.
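
    The simplex search mentioned above corresponds to the Nelder-Mead method. The sketch below fits a modified Gaussian with a free exponent to a synthetic 1D deflection profile, a simplified stand-in for the paper's separate azimuthal and radial modifications.

        # Nelder-Mead fit of a modified Gaussian to synthetic actuator data.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(5)
        r = np.linspace(0.0, 2.0, 200)                       # normalized radius
        measured = np.exp(-(r / 0.8) ** 1.8) + 0.01 * rng.standard_normal(200)

        def sse(params):
            amp, width, alpha = params
            model = amp * np.exp(-np.abs(r / width) ** alpha)  # alpha=2: plain Gaussian
            return np.sum((measured - model) ** 2)

        fit = minimize(sse, x0=[1.0, 1.0, 2.0], method="Nelder-Mead")
        amp, width, alpha = fit.x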

  18. The double-layer of penetrable ions: an alternative route to charge reversal.

    PubMed

    Frydel, Derek; Levin, Yan

    2013-05-07

    We investigate a double-layer of penetrable ions near a charged wall. We find a new mechanism for charge reversal that occurs in the weak-coupling regime and, accordingly, the system is suitable for the mean-field analysis. The penetrability is achieved by smearing-out the ionic charge inside a sphere, so there is no need to introduce non-electrostatic forces and the system in the low coupling limit can be described by a modified version of the Poisson-Boltzmann equation. The predictions of the theory are compared with the Monte Carlo simulations.

  19. May quasicrystals be good thermoelectric materials?

    NASA Astrophysics Data System (ADS)

    Maciá, Enrique

    2000-11-01

    We present a theoretical analysis of quasicrystals (QCs) as potential thermoelectric materials. We consider a self-similar density of states model and extend the framework introduced in [G. D. Mahan and J. O. Sofo, Proc. Natl. Acad. Sci. U.S.A. 93, 7436 (1996)] to systems exhibiting correlated features in their electronic structure. We show that relatively high values of the thermoelectric figure of merit, ranging from 0.01 up to 1.6 at room temperature, may be expected for these systems. We compare our results with available experimental data on transport properties of QCs and suggest some potential candidates for thermoelectric applications.
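
    For reference, the figure of merit quoted above is ZT = S^2 * sigma * T / kappa. The snippet below evaluates it for illustrative room-temperature transport values; the numbers are assumptions, not taken from the paper.

        # Dimensionless thermoelectric figure of merit (illustrative inputs).
        S = 120e-6      # Seebeck coefficient, V/K (assumed)
        sigma = 5.0e4   # electrical conductivity, S/m (assumed)
        kappa = 1.5     # thermal conductivity, W/(m K) (assumed)
        T = 300.0       # temperature, K

        ZT = S**2 * sigma * T / kappa
        print(f"ZT = {ZT:.2f}")   # ~0.14 with these inputs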

  20. Volume of the steady-state space of financial flows in a monetary stock-flow-consistent model

    NASA Astrophysics Data System (ADS)

    Hazan, Aurélien

    2017-05-01

    We show that a steady-state stock-flow consistent macro-economic model can be represented as a Constraint Satisfaction Problem (CSP). The set of solutions is a polytope, whose volume depends on the constraints applied and reveals the potential fragility of the economic circuit, with no need to study the dynamics. Several methods to compute the volume, both exact and approximate, are compared, inspired by operations research methods and the analysis of metabolic networks. We also introduce a random transaction matrix and study the particular case of linear flows with respect to money stocks.
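
    A toy version of the volume computation: rejection-sample a bounding box and count points satisfying the flow constraints, here one equality constraint relaxed to a thin slab. The constraint matrix is invented, and this naive approach only scales to low dimensions, unlike the exact and approximate methods compared in the paper.

        # Naive Monte Carlo volume estimate for a constrained flow polytope.
        import numpy as np

        rng = np.random.default_rng(6)
        A = np.array([[1.0, 1.0, 1.0]])       # one accounting constraint (assumed)
        b = np.array([1.0])
        tol, n_samples, box = 0.02, 200_000, 1.0

        x = rng.uniform(0, box, size=(n_samples, 3))          # x >= 0 by construction
        inside = np.all(np.abs(x @ A.T - b) <= tol, axis=1)   # |Ax - b| <= tol
        volume = inside.mean() * box ** 3
        print(f"estimated volume ≈ {volume:.4f}")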

  1. Cannibalism in non-native brown trout Salmo trutta and rainbow trout Oncorhynchus mykiss stream-dwelling populations.

    PubMed

    Musseau, C; Vincenzi, S; Jesenšek, D; Crivelli, A J

    2017-12-01

    Introduced and allopatric populations of brown trout Salmo trutta and rainbow trout Oncorhynchus mykiss were sampled in Slovenia for stable isotope analysis to assess dietary niche shifts through ontogeny and estimate the propensity for cannibalism. Both S. trutta and O. mykiss are cannibals, with a higher average relative contribution of conspecific assimilated energy for S. trutta (27.9%) compared with O. mykiss (7.7%). The smallest cannibal was 166 mm in the S. trutta population and 247 mm in the O. mykiss population. © 2017 The Fisheries Society of the British Isles.

  2. Mesh Denoising based on Normal Voting Tensor and Binary Optimization.

    PubMed

    Yadav, Sunil Kumar; Reitebuch, Ulrich; Polthier, Konrad

    2017-08-17

    This paper presents a two-stage mesh denoising algorithm. Unlike other traditional averaging approaches, our approach uses an element-based normal voting tensor to compute smooth surfaces. By introducing a binary optimization on the proposed tensor together with a local binary neighborhood concept, our algorithm better retains sharp features and produces smoother umbilical regions than previous approaches. On top of that, we provide a stochastic analysis on the different kinds of noise based on the average edge length. The quantitative results demonstrate that the performance of our method is better compared to state-of-the-art smoothing approaches.

  3. An empirical assessment of taxic paleobiology.

    PubMed

    Adrain, J M; Westrop, S R

    2000-07-07

    The analysis of major changes in faunal diversity through time is a central theme of analytical paleobiology. The most important sources of data are literature-based compilations of stratigraphic ranges of fossil taxa. The levels of error in these compilations and the possible effects of such error have often been discussed but never directly assessed. We compared our comprehensive database of trilobites to the equivalent portion of J. J. Sepkoski Jr.'s widely used global genus database. More than 70% of entries in the global database are inaccurate; however, as predicted, the error is randomly distributed and does not introduce bias.

  4. One Step Quantum Key Distribution Based on EPR Entanglement

    PubMed Central

    Li, Jian; Li, Na; Li, Lei-Lei; Wang, Tao

    2016-01-01

    A novel quantum key distribution protocol is presented, based on entanglement and dense coding and allowing asymptotically secure key distribution. Considering the storage time limit of quantum bits, a grouping quantum key distribution protocol is proposed, which overcomes the vulnerability of the first protocol and improves maneuverability. Moreover, a security analysis is given: a simple type of eavesdropping attack would introduce an error rate of at least 46.875%. Compared with the "Ping-pong" protocol, which involves two steps, the proposed protocol does not need to store the qubit and involves only one step. PMID:27357865

  5. Electrospray ionization time-of-flight mass spectrum analysis method of polyaluminum chloride flocculants.

    PubMed

    Feng, Chenghong; Bi, Zhe; Tang, Hongxiao

    2015-01-06

    Electrospray mass spectrometry has been reported as a novel technique for Al species identification, but to date the working mechanism is not clear and no generally accepted method exists for spectrum analysis of traditional Al salt flocculants, let alone for analysis of polyaluminum chloride (PAC) flocculants. This paper therefore introduces a novel theoretical calculation method to identify Al species from a mass spectrum, based on deducing changes in m/z (mass-to-charge ratio) and the molecular formulas of oligomers in five typical PAC flocculants. The use of reference chemical species is specifically proposed in the method to guarantee the uniqueness of the assigned species. The charge and mass reduction of the Al cluster was found to proceed by hydrolysis, gasification, and change of hydroxyl on the oxy bridge. The novel method was validated both qualitatively and quantitatively by comparing its results to those obtained with (27)Al NMR spectrometry.

  6. Taking the First Steps towards a Standard for Reporting on Phylogenies: Minimal Information about a Phylogenetic Analysis (MIAPA)

    PubMed Central

    LEEBENS-MACK, JIM; VISION, TODD; BRENNER, ERIC; BOWERS, JOHN E.; CANNON, STEVEN; CLEMENT, MARK J.; CUNNINGHAM, CLIFFORD W.; dePAMPHILIS, CLAUDE; deSALLE, ROB; DOYLE, JEFF J.; EISEN, JONATHAN A.; GU, XUN; HARSHMAN, JOHN; JANSEN, ROBERT K.; KELLOGG, ELIZABETH A.; KOONIN, EUGENE V.; MISHLER, BRENT D.; PHILIPPE, HERVÉ; PIRES, J. CHRIS; QIU, YIN-LONG; RHEE, SEUNG Y.; SJÖLANDER, KIMMEN; SOLTIS, DOUGLAS E.; SOLTIS, PAMELA S.; STEVENSON, DENNIS W.; WALL, KERR; WARNOW, TANDY; ZMASEK, CHRISTIAN

    2011-01-01

    In the eight years since phylogenomics was introduced as the intersection of genomics and phylogenetics, the field has provided fundamental insights into gene function, genome history and organismal relationships. The utility of phylogenomics is growing with the increase in the number and diversity of taxa for which whole genome and large transcriptome sequence sets are being generated. We assert that the synergy between genomic and phylogenetic perspectives in comparative biology would be enhanced by the development and refinement of minimal reporting standards for phylogenetic analyses. Encouraged by the development of the Minimum Information About a Microarray Experiment (MIAME) standard, we propose a similar roadmap for the development of a Minimal Information About a Phylogenetic Analysis (MIAPA) standard. Key in the successful development and implementation of such a standard will be broad participation by developers of phylogenetic analysis software, phylogenetic database developers, practitioners of phylogenomics, and journal editors. PMID:16901231

  7. Evaluating the effect of aging on interference resolution with time-varying complex networks analysis

    PubMed Central

    Ariza, Pedro; Solesio-Jofre, Elena; Martínez, Johann H.; Pineda-Pardo, José A.; Niso, Guiomar; Maestú, Fernando; Buldú, Javier M.

    2015-01-01

    In this study we used graph theory analysis to investigate age-related reorganization of functional networks during the active maintenance of information that is interrupted by external interference. Additionally, we sought to investigate network differences before and after averaging network parameters between both maintenance and interference windows. We compared young and older adults by measuring their magnetoencephalographic recordings during an interference-based working memory task restricted to successful recognitions. Data analysis focused on the topology/temporal evolution of functional networks during both the maintenance and interference windows. We observed that: (a) Older adults require higher synchronization between cortical brain sites in order to achieve a successful recognition, (b) The main differences between age groups arise during the interference window, (c) Older adults show reduced ability to reorganize network topology when interference is introduced, and (d) Averaging network parameters leads to a loss of sensitivity to detect age differences. PMID:26029079

  8. IonGAP: integrative bacterial genome analysis for Ion Torrent sequence data.

    PubMed

    Baez-Ortega, Adrian; Lorenzo-Diaz, Fabian; Hernandez, Mariano; Gonzalez-Vila, Carlos Ignacio; Roda-Garcia, Jose Luis; Colebrook, Marcos; Flores, Carlos

    2015-09-01

    We introduce IonGAP, a publicly available Web platform designed for the analysis of whole bacterial genomes using Ion Torrent sequence data. Besides assembly, it integrates a variety of comparative genomics, annotation and bacterial classification routines, based on the widely used FASTQ, BAM and SRA file formats. Benchmarking with different datasets evidenced that IonGAP is a fast, powerful and simple-to-use bioinformatics tool. By releasing this platform, we aim to translate low-cost bacterial genome analysis for microbiological prevention and control in healthcare, agroalimentary and pharmaceutical industry applications. IonGAP is hosted by the ITER's Teide-HPC supercomputer and is freely available on the Web for non-commercial use at http://iongap.hpc.iter.es. mcolesan@ull.edu.es or cflores@ull.edu.es Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  9. Automated analysis of connected speech reveals early biomarkers of Parkinson's disease in patients with rapid eye movement sleep behaviour disorder.

    PubMed

    Hlavnička, Jan; Čmejla, Roman; Tykalová, Tereza; Šonka, Karel; Růžička, Evžen; Rusz, Jan

    2017-02-02

    For generations, the evaluation of speech abnormalities in neurodegenerative disorders such as Parkinson's disease (PD) has been limited to perceptual tests or user-controlled laboratory analysis based upon rather small samples of human vocalizations. Our study introduces a fully automated method that yields significant features related to respiratory deficits, dysphonia, imprecise articulation and dysrhythmia from acoustic microphone data of natural connected speech for predicting early and distinctive patterns of neurodegeneration. We compared speech recordings of 50 subjects with rapid eye movement sleep behaviour disorder (RBD), 30 newly diagnosed, untreated PD patients and 50 healthy controls, and showed that subliminal parkinsonian speech deficits can be reliably captured even in RBD patients, who are at high risk of developing PD or other synucleinopathies. Thus, automated vocal analysis should soon be able to contribute to screening and diagnostic procedures for prodromal parkinsonian neurodegeneration in natural environments.

  10. Chromatographic background drift correction coupled with parallel factor analysis to resolve coelution problems in three-dimensional chromatographic data: quantification of eleven antibiotics in tap water samples by high-performance liquid chromatography coupled with a diode array detector.

    PubMed

    Yu, Yong-Jie; Wu, Hai-Long; Fu, Hai-Yan; Zhao, Juan; Li, Yuan-Na; Li, Shu-Fang; Kang, Chao; Yu, Ru-Qin

    2013-08-09

    Chromatographic background drift correction has been an important field of research in chromatographic analysis. In the present work, orthogonal spectral space projection for background drift correction of three-dimensional chromatographic data was described in detail and combined with parallel factor analysis (PARAFAC) to resolve overlapped chromatographic peaks and obtain the second-order advantage. This strategy was verified by simulated chromatographic data and afforded significant improvement in quantitative results. Finally, this strategy was successfully utilized to quantify eleven antibiotics in tap water samples. Compared with the traditional methodology of introducing excessive factors for the PARAFAC model to eliminate the effect of background drift, clear improvement in the quantitative performance of PARAFAC was observed after background drift correction by orthogonal spectral space projection. Copyright © 2013 Elsevier B.V. All rights reserved.
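
    The core of orthogonal spectral space projection can be sketched in a few lines: build an orthonormal basis for the drift subspace from background-only spectra and project every data spectrum onto its orthogonal complement before the PARAFAC decomposition. All arrays below are placeholders, not the paper's data.

        # Orthogonal projection of spectra onto the complement of a drift subspace.
        import numpy as np

        rng = np.random.default_rng(7)
        n_wl = 120
        drift_spectra = rng.random((5, n_wl))   # background-only spectra (assumed)
        data = rng.random((300, n_wl))          # elution-time x wavelength slab

        Q, _ = np.linalg.qr(drift_spectra.T)    # orthonormal basis, shape (n_wl, 5)
        P_perp = np.eye(n_wl) - Q @ Q.T         # projector onto the complement
        corrected = data @ P_perp               # drift component removed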

  11. Sparse non-negative matrix factorizations via alternating non-negativity-constrained least squares for microarray data analysis.

    PubMed

    Kim, Hyunsoo; Park, Haesun

    2007-06-15

    Many practical pattern recognition problems require non-negativity constraints. For example, pixels in digital images and chemical concentrations in bioinformatics are non-negative. Sparse non-negative matrix factorizations (NMFs) are useful when the degree of sparseness in the non-negative basis matrix or the non-negative coefficient matrix in an NMF needs to be controlled in approximating high-dimensional data in a lower dimensional space. In this article, we introduce a novel formulation of sparse NMF and show how the new formulation leads to a convergent sparse NMF algorithm via alternating non-negativity-constrained least squares. We apply our sparse NMF algorithm to cancer-class discovery and gene expression data analysis and offer biological analysis of the results obtained. Our experimental results illustrate that the proposed sparse NMF algorithm often achieves better clustering performance with shorter computing time compared to other existing NMF algorithms. The software is available as supplementary material.
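
    A compact sketch in the spirit of the formulation described, with sparsity imposed on the coefficient matrix H and each alternating non-negativity-constrained least squares subproblem solved column-wise via scipy's NNLS on augmented systems; the regularization weights eta and beta are assumed values.

        # Sparse NMF via alternating non-negativity-constrained least squares.
        import numpy as np
        from scipy.optimize import nnls

        def snmf(A, k, eta=0.1, beta=0.1, n_iter=50, seed=0):
            rng = np.random.default_rng(seed)
            m, n = A.shape
            W = rng.random((m, k))
            H = rng.random((k, n))
            for _ in range(n_iter):
                # Update H: a sqrt(beta) row of ones penalizes ||H[:, j]||_1^2
                Wa = np.vstack([W, np.sqrt(beta) * np.ones((1, k))])
                Aa = np.vstack([A, np.zeros((1, n))])
                H = np.column_stack([nnls(Wa, Aa[:, j])[0] for j in range(n)])
                # Update W: a sqrt(eta) identity block keeps ||W||_F small
                Ha = np.vstack([H.T, np.sqrt(eta) * np.eye(k)])
                At = np.vstack([A.T, np.zeros((k, m))])
                W = np.column_stack([nnls(Ha, At[:, i])[0] for i in range(m)]).T
            return W, H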

  12. Gap-metric-based robustness analysis of nonlinear systems with full and partial feedback linearisation

    NASA Astrophysics Data System (ADS)

    Al-Gburi, A.; Freeman, C. T.; French, M. C.

    2018-06-01

    This paper uses gap metric analysis to derive robustness and performance margins for feedback linearising controllers. Distinct from previous robustness analysis, it incorporates the case of output unstructured uncertainties, and is shown to yield general stability conditions which can be applied to both stable and unstable plants. It then expands on existing feedback linearising control schemes by introducing a more general robust feedback linearising control design which classifies the system nonlinearity into stable and unstable components and cancels only the unstable plant nonlinearities. This is done in order to preserve the stabilising action of the inherently stabilising nonlinearities. Robustness and performance margins are derived for this control scheme, and are expressed in terms of bounds on the plant nonlinearities and the accuracy of the cancellation of the unstable plant nonlinearity by the controller. Case studies then confirm reduced conservatism compared with standard methods.

  13. Comparing cosmic web classifiers using information theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leclercq, Florent; Lavaux, Guilhem; Wandelt, Benjamin

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.

  14. Stability analysis of internally damped rotating composite shafts using a finite element formulation

    NASA Astrophysics Data System (ADS)

    Ben Arab, Safa; Rodrigues, José Dias; Bouaziz, Slim; Haddar, Mohamed

    2018-04-01

    This paper deals with the stability analysis of internally damped rotating composite shafts. An Euler-Bernoulli shaft finite element formulation based on Equivalent Single Layer Theory (ESLT), including the hysteretic internal damping of the composite material and transverse shear effects, is introduced and then used to evaluate the influence of various parameters (stacking sequences, fiber orientations and bearing properties) on natural frequencies, critical speeds, and instability thresholds. The obtained results are compared with those available in the literature using different theories. The agreement in the obtained results shows that the developed Euler-Bernoulli finite element based on ESLT, including hysteretic internal damping and transverse shear effects, can be effectively used for the stability analysis of internally damped rotating composite shafts. Furthermore, the results reveal that rotor stability is sensitive to the laminate parameters and to the properties of the bearings.

  15. Species distribution models may misdirect assisted migration: insights from the introduction of Douglas-fir to Europe.

    PubMed

    Boiffin, Juliette; Badeau, Vincent; Bréda, Nathalie

    2017-03-01

    Species distribution models (SDMs), which statistically relate species occurrence to climatic variables, are widely used to identify areas suitable for species growth under future climates and to plan for assisted migration. When SDMs are projected across times or spaces, it is assumed that species climatic requirements remain constant. However, empirical evidence supporting this assumption is rare, and SDM predictions could be biased. Historical human-aided movements of tree species can shed light on the reliability of SDM predictions in planning for assisted migration. We used Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco), a North American conifer introduced into Europe during the mid-19th century, as a case-study to test niche conservatism. We combined transcontinental data sets of Douglas-fir occurrence and climatic predictors to compare the realized niches between native and introduced ranges. We calibrated a SDM in the native range and compared areas predicted to be climatically suitable with observed presences. The realized niches in the native and introduced ranges showed very limited overlap. The SDM calibrated in North America had very high predictive power in the native range, but failed to predict climatic suitability in Europe where Douglas-fir grows in climates that have no analogue in the native range. We review the ecological mechanisms and silvicultural practices that can trigger such shifts in realized niches. Retrospective analysis of tree species introduction revealed that the assumption of niche conservatism is erroneous. As a result, distributions predicted by SDM are importantly biased. There is a high risk that assisted migration programs may be misdirected and target inadequate species or introduction zones. © 2016 by the Ecological Society of America.

  16. Refining comparative proteomics by spectral counting to account for shared peptides and multiple search engines

    PubMed Central

    Chen, Yao-Yi; Dasari, Surendra; Ma, Ze-Qiang; Vega-Montoto, Lorenzo J.; Li, Ming

    2013-01-01

    Spectral counting has become a widely used approach for measuring and comparing protein abundance in label-free shotgun proteomics. However, when analyzing complex samples, the ambiguity of matching between peptides and proteins greatly affects the assessment of peptide and protein inventories, differentiation, and quantification. Meanwhile, the configuration of database searching algorithms that assign peptides to MS/MS spectra may produce different results in comparative proteomic analysis. Here, we present three strategies to improve comparative proteomics through spectral counting. We show that comparing spectral counts for peptide groups rather than for protein groups forestalls problems introduced by shared peptides. We demonstrate the advantage and flexibility of this new method in two datasets. We present four models to combine four popular search engines that lead to significant gains in spectral counting differentiation. Among these models, we demonstrate a powerful vote counting model that scales well for multiple search engines. We also show that semi-tryptic searching outperforms tryptic searching for comparative proteomics. Overall, these techniques considerably improve protein differentiation on the basis of spectral count tables. PMID:22552787

  17. Refining comparative proteomics by spectral counting to account for shared peptides and multiple search engines.

    PubMed

    Chen, Yao-Yi; Dasari, Surendra; Ma, Ze-Qiang; Vega-Montoto, Lorenzo J; Li, Ming; Tabb, David L

    2012-09-01

    Spectral counting has become a widely used approach for measuring and comparing protein abundance in label-free shotgun proteomics. However, when analyzing complex samples, the ambiguity of matching between peptides and proteins greatly affects the assessment of peptide and protein inventories, differentiation, and quantification. Meanwhile, the configuration of database searching algorithms that assign peptides to MS/MS spectra may produce different results in comparative proteomic analysis. Here, we present three strategies to improve comparative proteomics through spectral counting. We show that comparing spectral counts for peptide groups rather than for protein groups forestalls problems introduced by shared peptides. We demonstrate the advantage and flexibility of this new method in two datasets. We present four models to combine four popular search engines that lead to significant gains in spectral counting differentiation. Among these models, we demonstrate a powerful vote counting model that scales well for multiple search engines. We also show that semi-tryptic searching outperforms tryptic searching for comparative proteomics. Overall, these techniques considerably improve protein differentiation on the basis of spectral count tables.

  18. Geometry of thin liquid sheet flows

    NASA Technical Reports Server (NTRS)

    Chubb, Donald L.; Calfo, Frederick D.; Mcconley, Marc W.; Mcmaster, Matthew S.; Afjeh, Abdollah A.

    1994-01-01

    Incompressible, thin sheet flows have been of research interest for many years. Those studies were mainly concerned with the stability of the flow in a surrounding gas. Squire was the first to carry out a linear, inviscid stability analysis of sheet flow in air and compare the results with experiment. Dombrowski and Fraser did an experimental study of the disintegration of sheet flows using several viscous liquids; they also detected the formation of holes in their sheet flows. Hagerty and Shea carried out an inviscid stability analysis and compared their calculated growth rates with experimental values. Taylor studied extensively the stability of thin liquid sheets both theoretically and experimentally; he showed that thin sheets in a vacuum are stable. Brown experimentally investigated thin liquid sheet flows as a method of application of thin films. Clark and Dombrowski carried out a second-order stability analysis for inviscid sheet flows. Lin introduced viscosity into the linear stability analysis of thin sheet flows in a vacuum. Mansour and Chigier conducted an experimental study of the breakup of a sheet flow surrounded by high-speed air. Lin et al. did a linear stability analysis that included viscosity and a surrounding gas. Rangel and Sirignano carried out both linear and nonlinear inviscid stability analyses that apply for any density ratio between the sheet liquid and the surrounding gas. There is now renewed interest in sheet flows because of their possible application as low-mass radiating surfaces. The objective of this study is to investigate the fluid dynamics of sheet flows that are of interest for a space radiator system. Analytical expressions that govern the sheet geometry are compared with experimental results. Since a space radiator will operate in a vacuum, the analysis does not include any drag force on the sheet flow.

  19. Can statistical linkage of missing variables reduce bias in treatment effect estimates in comparative effectiveness research studies?

    PubMed

    Crown, William; Chang, Jessica; Olson, Melvin; Kahler, Kristijan; Swindle, Jason; Buzinec, Paul; Shah, Nilay; Borah, Bijan

    2015-09-01

    Missing data, particularly missing variables, can create serious analytic challenges in observational comparative effectiveness research studies. Statistical linkage of datasets is a potential method for incorporating missing variables. Prior studies have focused upon the bias introduced by imperfect linkage. This analysis uses a case study of hepatitis C patients to estimate the net effect of statistical linkage on bias, also accounting for the potential reduction in missing variable bias. The results show that statistical linkage can reduce bias while also enabling parameter estimates to be obtained for the formerly missing variables. The usefulness of statistical linkage will vary depending upon the strength of the correlations of the missing variables with the treatment variable, as well as the outcome variable of interest.

  20. Comparing an analytical spacetime metric for a merging binary to a fully nonlinear numerical evolution using curvature scalars

    NASA Astrophysics Data System (ADS)

    Sadiq, Jam; Zlochower, Yosef; Nakano, Hiroyuki

    2018-04-01

    We introduce a new geometrically invariant prescription for comparing two different spacetimes based on geodesic deviation. We use this method to compare a family of recently introduced analytical spacetimes representing inspiraling black-hole binaries to fully nonlinear numerical solutions of the Einstein equations. Our method can be used to improve analytical spacetime models by providing a local measure of the effects that violations of the Einstein equations will have on timelike geodesics and, indirectly, on gas dynamics. We also discuss the advantages and limitations of this method.

  1. Introducing HEP to schools through educational scenaria

    NASA Astrophysics Data System (ADS)

    Kourkoumelis, C.; Vourakis, S.

    2015-05-01

    Recent activities towards the goal of introducing High Energy Physics into the school classroom are reviewed. The most efficient method is a half- or full-day workshop where the students are introduced to one of the large LHC experiments, follow a "virtual visit" to the experiment's Control Room and perform an interactive analysis of real data. Science cafes and visits to the CERN expositions are also very helpful, provided that the tours/discussions are led by an active scientist and/or a trained teacher. Several EU outreach projects provide databases rich in educational scenaria and data analysis tools ready to be used by teachers in order to bridge the gap between modern research and technology and school education.

  2. Quaternion normalization in additive EKF for spacecraft attitude determination. [Extended Kalman Filters

    NASA Technical Reports Server (NTRS)

    Bar-Itzhack, I. Y.; Deutschmann, J.; Markley, F. L.

    1991-01-01

    This work introduces, examines and compares several quaternion normalization algorithms, which are shown to be an effective stage in the application of the additive extended Kalman filter to spacecraft attitude determination based on vector measurements. Three new normalization schemes are introduced. They are compared with one another and with the known brute-force normalization scheme, and their efficiency is examined. Simulated satellite data are used to demonstrate the performance of all four schemes.
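
    As a point of reference, the brute-force normalization scheme amounts to dividing the filtered quaternion by its Euclidean norm after each update, as sketched below; the paper's three new schemes differ and are not reproduced here.

        # Minimal sketch of brute-force quaternion normalization.
        import numpy as np

        def normalize_quaternion(q, eps=1e-12):
            """q: 4-component quaternion estimate; returns a unit quaternion."""
            q = np.asarray(q, dtype=float)
            n = np.linalg.norm(q)
            # Guard against a degenerate (near-zero) estimate
            return q / n if n > eps else np.array([1.0, 0.0, 0.0, 0.0])

        q_unit = normalize_quaternion([0.9, 0.1, -0.2, 0.4])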

  3. Mosaic Graphs and Comparative Genomics in Phage Communities

    PubMed Central

    Belcaid, Mahdi; Bergeron, Anne

    2010-01-01

    Comparing the genomes of two closely related viruses often produces mosaics where nearly identical sequences alternate with sequences that are unique to each genome. When several closely related genomes are compared, the unique sequences are likely to be shared with third genomes, leading to virus mosaic communities. Here we present a comparative analysis of sets of Staphylococcus aureus phages that share large identical sequences with up to three other genomes, and with different partners along their genomes. We introduce mosaic graphs to represent these complex recombination events, and use them to illustrate the breadth and depth of sequence sharing: some genomes are almost completely made up of shared sequences, while genomes that share very large identical sequences can adopt alternate functional modules. Mosaic graphs also allow us to identify breakpoints that could eventually be used for the construction of recombination networks. These findings have several implications for phage metagenomics assembly, for the horizontal gene transfer paradigm, and more generally for the understanding of the composition and evolutionary dynamics of virus communities. PMID:20874413

  4. Get to Understand More from Single-Cells: Current Studies of Microfluidic-Based Techniques for Single-Cell Analysis.

    PubMed

    Lo, Shih-Jie; Yao, Da-Jeng

    2015-07-23

    This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidics (e.g., the small sample amount required and high-throughput performance) make this tool suitable for answering and solving biological questions of interest about a single cell. This review aims to introduce microfluidic-related techniques for the isolation, trapping and manipulation of a single cell. The major approaches for detection in single-cell analysis are introduced, and the applications of single-cell analysis are then summarized. The review concludes with discussions of the future directions and opportunities of microfluidic systems applied in the analysis of a single cell.

  5. Get to Understand More from Single-Cells: Current Studies of Microfluidic-Based Techniques for Single-Cell Analysis

    PubMed Central

    Lo, Shih-Jie; Yao, Da-Jeng

    2015-01-01

    This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidics (e.g., the small sample amount required and high-throughput performance) make this tool suitable for answering and solving biological questions of interest about a single cell. This review aims to introduce microfluidic-related techniques for the isolation, trapping and manipulation of a single cell. The major approaches for detection in single-cell analysis are introduced, and the applications of single-cell analysis are then summarized. The review concludes with discussions of the future directions and opportunities of microfluidic systems applied in the analysis of a single cell. PMID:26213918

  6. Introducing Aliphatic Substitution with a Discovery Experiment Using Competing Electrophiles

    ERIC Educational Resources Information Center

    Curran, Timothy P.; Mostovoy, Amelia J.; Curran, Margaret E.; Berger, Clara

    2016-01-01

    A facile, discovery-based experiment is described that introduces aliphatic substitution in an introductory undergraduate organic chemistry curriculum. Unlike other discovery-based experiments that examine substitution using two competing nucleophiles with a single electrophile, this experiment compares two isomeric, competing electrophiles…

  7. Retrospective economic analysis of the transfer of services from hospitals to the community: an application to an enhanced eye care service.

    PubMed

    Mason, Thomas; Jones, Cheryl; Sutton, Matt; Konstantakopoulou, Evgenia; Edgar, David F; Harper, Robert A; Birch, Stephen; Lawrenson, John G

    2017-07-10

    This research aims to evaluate the wider health system effects of the introduction of an intermediate-tier service for eye care. The study examines the Minor Eye Conditions Scheme (MECS), an intermediate-tier eye care service introduced in two London boroughs, Lewisham and Lambeth, in April 2013. The design was a retrospective difference-in-differences analysis comparing changes over time in service use and costs between April 2011 and October 2014 in the two commissioning areas that introduced the intermediate-tier service with changes in a neighbouring area that did not introduce the programme. Data sources were MECS audit data, unit costs for MECS visits, volumes of first and follow-up outpatient attendances to hospital ophthalmology, and the national schedule of reference costs; the main outcome measures were volumes and costs of patients treated. In one intervention area (Lewisham), general practitioner (GP) referrals to hospital ophthalmology decreased differentially by 75.2% (95% CI -0.918 to -0.587) for first attendances, and by 40.3% (95% CI -0.489 to -0.316) for follow-ups. GP referrals to hospital ophthalmology decreased differentially by 30.2% (95% CI -0.468 to -0.137) for first attendances in the other intervention area (Lambeth). Costs increased by 3.1% in the comparison area between 2011/2012 and 2013/2014. Over the same period, costs increased by less (2.5%) in one intervention area and fell by 13.8% in the other intervention area. Intermediate-tier services based in the community could potentially reduce volumes of patients referred to hospitals by GPs and provide replacement services at lower unit costs. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
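
    The difference-in-differences logic used above can be illustrated in a few lines. The sketch below uses invented referral counts, not the study's data, to show how the differential change is computed from pre/post means in the intervention and comparison areas:

```python
# Hypothetical numbers for illustration only; not the study's data.
intervention_pre, intervention_post = 1000.0, 248.0   # referrals, intervention area
comparison_pre, comparison_post = 950.0, 930.0        # referrals, comparison area

# DiD: change in the intervention area minus change in the comparison area
did = (intervention_post - intervention_pre) - (comparison_post - comparison_pre)
print(f"differential change: {did:.0f} referrals "
      f"({did / intervention_pre:.1%} of the intervention-area baseline)")
```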

  8. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE.

    PubMed

    Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M; Grün, Sonja

    2017-01-01

    Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis.
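
    To make the detection problem concrete, the sketch below counts repeating binary spike-pattern windows and compares them against time-shuffled surrogates. It is a deliberately simplified stand-in for SPADE (which uses frequent itemset mining and a principled significance test), with all data synthetic:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
n_neurons, n_bins, win = 20, 2000, 5
spikes = rng.random((n_neurons, n_bins)) < 0.02        # binned binary spike trains

def window_counts(mat):
    """Count how often each (n_neurons x win) binary window pattern occurs."""
    counts = Counter()
    for t in range(mat.shape[1] - win):
        w = mat[:, t:t + win]
        if w.any():                                    # skip empty windows
            counts[w.tobytes()] += 1
    return counts

observed = window_counts(spikes)
# surrogate: circularly shift each neuron independently, destroying correlations
surrogate = window_counts(
    np.stack([np.roll(row, rng.integers(n_bins)) for row in spikes]))

threshold = max(surrogate.values())
candidates = {p: c for p, c in observed.items() if c > threshold}
print(f"{len(candidates)} window patterns exceed the surrogate maximum count")
```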

  9. MindEdit: A P300-based text editor for mobile devices.

    PubMed

    Elsawy, Amr S; Eldawlatly, Seif; Taher, Mohamed; Aly, Gamal M

    2017-01-01

    Practical application of Brain-Computer Interfaces (BCIs) requires that the whole BCI system be portable. The mobility of BCI systems involves two aspects: making the electroencephalography (EEG) recording devices portable, and developing software applications with low computational complexity that can run on low computational-power devices such as tablets and smartphones. This paper addresses the development of MindEdit, a P300-based text editor for Android-based devices. Given the limited resources of mobile devices and their limited computational power, a novel ensemble classifier is utilized that uses Principal Component Analysis (PCA) features to identify P300 evoked potentials from EEG recordings. PCA computations in the proposed method are channel-based, as opposed to concatenating all channels as in traditional feature extraction methods; thus, this method has less computational complexity than traditional P300 detection methods. The performance of the method is demonstrated on data recorded from MindEdit on an Android tablet using the Emotiv wireless neuroheadset. Results demonstrate the capability of the introduced PCA ensemble classifier to classify P300 data with a maximum average accuracy of 78.37±16.09% for cross-validation data and 77.5±19.69% for online test data using only 10 trials per symbol and a 33-character training dataset. Our analysis indicates that the introduced method outperforms traditional feature extraction methods. For faster operation of MindEdit, a variable-number-of-trials scheme is introduced that resulted in an online average accuracy of 64.17±19.6% and a maximum bitrate of 6.25 bit/min. These results demonstrate the efficacy of using the developed BCI application with mobile devices. Copyright © 2016 Elsevier Ltd. All rights reserved.
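
    The channel-based PCA idea can be sketched generically: fit one PCA per channel and stack the component scores, instead of concatenating all channels before a single PCA. The snippet below uses random stand-in epochs and a logistic-regression classifier rather than MindEdit's actual ensemble, so all shapes and names are assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

def channel_pca_features(epochs, n_components=3):
    """Fit one PCA per channel and stack the scores: (n_trials, n_channels*k)."""
    n_trials, n_channels, _ = epochs.shape
    feats = [PCA(n_components=n_components).fit_transform(epochs[:, ch, :])
             for ch in range(n_channels)]
    return np.hstack(feats)

rng = np.random.default_rng(1)
epochs = rng.standard_normal((200, 14, 128))   # trials x channels x samples
labels = rng.integers(0, 2, 200)               # target vs non-target (random)

X = channel_pca_features(epochs)
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print(f"training accuracy on random stand-in data: {clf.score(X, labels):.2f}")
```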

  10. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE

    PubMed Central

    Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M.; Grün, Sonja

    2017-01-01

    Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis. PMID:28596729

  11. Correcting for Blood Arrival Time in Global Mean Regression Enhances Functional Connectivity Analysis of Resting State fMRI-BOLD Signals.

    PubMed

    Erdoğan, Sinem B; Tong, Yunjie; Hocke, Lia M; Lindsey, Kimberly P; deB Frederick, Blaise

    2016-01-01

    Resting state functional connectivity analysis is a widely used method for mapping intrinsic functional organization of the brain. Global signal regression (GSR) is commonly employed for removing systemic global variance from resting state BOLD-fMRI data; however, recent studies have demonstrated that GSR may introduce spurious negative correlations within and between functional networks, calling into question the meaning of anticorrelations reported between some networks. In the present study, we propose that global signal from resting state fMRI is composed primarily of systemic low frequency oscillations (sLFOs) that propagate with cerebral blood circulation throughout the brain. We introduce a novel systemic noise removal strategy for resting state fMRI data, "dynamic global signal regression" (dGSR), which applies a voxel-specific optimal time delay to the global signal prior to regression from voxel-wise time series. We test our hypothesis on two functional systems that are suggested to be intrinsically organized into anticorrelated networks: the default mode network (DMN) and task positive network (TPN). We evaluate the efficacy of dGSR and compare its performance with the conventional "static" global regression (sGSR) method in terms of (i) explaining systemic variance in the data and (ii) enhancing specificity and sensitivity of functional connectivity measures. dGSR increases the amount of BOLD signal variance being modeled and removed relative to sGSR while reducing spurious negative correlations introduced in reference regions by sGSR, and attenuating inflated positive connectivity measures. We conclude that incorporating time delay information for sLFOs into global noise removal strategies is of crucial importance for optimal noise removal from resting state functional connectivity maps.
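
    A minimal sketch of the "dynamic" idea, under the simplifying assumptions of a single voxel and a circular shift: estimate the voxel's optimal lag of the global signal by cross-correlation, then regress out the lag-shifted signal rather than the raw one:

```python
import numpy as np

def dynamic_gsr(voxel_ts, global_ts, max_lag=10):
    """Regress out the optimally lagged global signal from one voxel."""
    g = (global_ts - global_ts.mean()) / global_ts.std()
    v = voxel_ts - voxel_ts.mean()
    lags = np.arange(-max_lag, max_lag + 1)
    corrs = [np.corrcoef(v, np.roll(g, k))[0, 1] for k in lags]
    best = int(lags[np.argmax(np.abs(corrs))])
    g_best = np.roll(g, best)                 # circular shift: sketch only
    beta = v @ g_best / (g_best @ g_best)     # least-squares regression weight
    return v - beta * g_best, best

rng = np.random.default_rng(2)
g = rng.standard_normal(300)                  # stand-in "global" sLFO signal
voxel = 0.8 * np.roll(g, 4) + 0.3 * rng.standard_normal(300)  # delayed copy
resid, lag = dynamic_gsr(voxel, g)
print(f"estimated lag: {lag} samples, residual variance: {resid.var():.3f}")
```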

  12. Recce imagery compression options

    NASA Astrophysics Data System (ADS)

    Healy, Donald J.

    1995-09-01

    The errors introduced into reconstructed RECCE imagery by ATARS DPCM compression are compared to those introduced by the more modern DCT-based JPEG compression algorithm. For storage applications in which uncompressed sensor data is available, JPEG provides better mean-square-error performance while also providing more flexibility in the selection of compressed data rates. When ATARS DPCM compression has already been performed, lossless encoding techniques may be applied to the DPCM deltas to achieve further compression without introducing additional errors. The abilities of several lossless compression algorithms, including Huffman, Lempel-Ziv, Lempel-Ziv-Welch, and Rice encoding, to provide this additional compression of ATARS DPCM deltas are compared. It is shown that the amount of noise in the original imagery significantly affects these comparisons.
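
    The reason DPCM deltas lend themselves to further lossless coding is that differencing concentrates the symbol distribution. The toy sketch below (a synthetic scanline, not RECCE data) compares the zeroth-order entropy of raw pixels with that of their DPCM deltas, a per-symbol bound on what Huffman-style coders can achieve:

```python
import numpy as np

def entropy_bits(values):
    """Zeroth-order (per-symbol) entropy in bits."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(3)
row = np.cumsum(rng.integers(-3, 4, 4096)).astype(np.int16)  # smooth "scanline"
deltas = np.diff(row, prepend=row[0])                        # DPCM deltas

print(f"raw pixel entropy:  {entropy_bits(row):.2f} bits/pixel")
print(f"DPCM delta entropy: {entropy_bits(deltas):.2f} bits/pixel")
```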

  13. Combining Flux Balance and Energy Balance Analysis for Large-Scale Metabolic Network: Biochemical Circuit Theory for Analysis of Large-Scale Metabolic Networks

    NASA Technical Reports Server (NTRS)

    Beard, Daniel A.; Liang, Shou-Dan; Qian, Hong; Biegel, Bryan (Technical Monitor)

    2001-01-01

    Predicting the behavior of large-scale biochemical metabolic networks represents one of the greatest challenges of bioinformatics and computational biology. Approaches, such as flux balance analysis (FBA), that account for the known stoichiometry of the reaction network while avoiding implementation of detailed reaction kinetics are perhaps the most promising tools for the analysis of large complex networks. As a step towards building a complete theory of biochemical circuit analysis, we introduce energy balance analysis (EBA), which complements the FBA approach by introducing fundamental constraints based on the first and second laws of thermodynamics. Fluxes obtained with EBA are thermodynamically feasible and provide valuable insight into the activation and suppression of biochemical pathways.
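
    FBA itself reduces to a linear program: maximize an objective flux subject to steady-state stoichiometry S v = 0 and flux bounds. The sketch below solves a three-reaction toy network (invented for illustration); EBA would add thermodynamic feasibility constraints on top of this:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network, metabolites x reactions: uptake -> A, A -> B, B -> biomass
S = np.array([[1, -1,  0],     # metabolite A
              [0,  1, -1]])    # metabolite B
bounds = [(0, 10), (0, None), (0, None)]   # uptake flux capped at 10 units
c = np.array([0, 0, -1])                   # maximize biomass (linprog minimizes)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal flux distribution:", res.x)  # expected: [10, 10, 10]
```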

  14. Correlation analysis of the Korean stock market: Revisited to consider the influence of foreign exchange rate

    NASA Astrophysics Data System (ADS)

    Jo, Sang Kyun; Kim, Min Jae; Lim, Kyuseong; Kim, Soo Yong

    2018-02-01

    We investigated the effect of the foreign exchange rate in a correlation analysis of the Korean stock market using both random matrix theory and the minimum spanning tree. We collected data sets comprising two types of stock price: the original stock price in Korean Won, and the price converted into US dollars at contemporary foreign exchange rates. Comparing the random matrix theory results based on the two different price series, a few particular sectors exhibited substantial differences while other sectors changed little. The particular sectors were closely related to economic circumstances and the influence of foreign financial markets during that period. The method introduced in this paper offers a way to pinpoint the effect of the exchange rate on an emerging stock market.
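
    A generic version of the comparison can be sketched as follows: compute the correlation-matrix eigenvalue spectrum of returns in local currency and of the same prices converted by a common exchange rate, then compare both against the Marchenko-Pastur bound for purely random matrices. All price series below are synthetic:

```python
import numpy as np

def eigen_spectrum(prices):
    """Eigenvalues of the equal-time correlation matrix of log-returns."""
    r = np.diff(np.log(prices), axis=0)
    r = (r - r.mean(0)) / r.std(0)
    return np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))

rng = np.random.default_rng(4)
T, N = 1000, 50
won = np.exp(np.cumsum(0.01 * rng.standard_normal((T, N)), axis=0))  # KRW prices
fx = np.exp(np.cumsum(0.005 * rng.standard_normal(T)))               # KRW per USD
usd = won / fx[:, None]                # conversion injects a common factor

lam_max = (1 + 1 / np.sqrt((T - 1) / N)) ** 2   # Marchenko-Pastur upper edge
for name, p in (("KRW", won), ("USD", usd)):
    print(f"{name}: largest eigenvalue {eigen_spectrum(p)[-1]:.2f} "
          f"(random-matrix bound {lam_max:.2f})")
```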

  15. Polarizing Grids, their Assemblies and Beams of Radiation

    NASA Technical Reports Server (NTRS)

    Houde, Martin; Akeson, Rachel L.; Carlstrom, John E.; Lamb, James W.; Schleuning, David A.; Woody, David P.

    2001-01-01

    This article gives an analysis of the behavior of polarizing grids and reflecting polarizers by solving Maxwell's equations, for arbitrary angles of incidence and grid rotation, for cases where the excitation is provided by an incident plane wave or a beam of radiation. The scattering and impedance matrix representations are derived and used to solve more complicated configurations of grid assemblies. The results are also compared with data obtained in the calibration of reflecting polarizers at the Owens Valley Radio Observatory (OVRO). From these analyses, we propose a method for choosing the optimum grid parameters (wire radius and spacing). We also provide a study of the effects of two types of errors (in wire separation and radius size) that can be introduced in the fabrication of a grid.

  16. Analysis of the whole mitochondrial genome: translation of the Ion Torrent Personal Genome Machine system to the diagnostic bench?

    PubMed

    Seneca, Sara; Vancampenhout, Kim; Van Coster, Rudy; Smet, Joél; Lissens, Willy; Vanlander, Arnaud; De Paepe, Boel; Jonckheere, An; Stouffs, Katrien; De Meirleir, Linda

    2015-01-01

    Next-generation sequencing (NGS), an innovative sequencing technology that enables the successful analysis of numerous gene sequences in a massively parallel sequencing approach, has revolutionized the field of molecular biology. Although NGS was introduced only relatively recently, the technology has already demonstrated its potential and effectiveness in many research projects, and is now on the verge of being introduced into the diagnostic setting of routine laboratories to delineate the molecular basis of genetic disease in undiagnosed patient samples. We tested a benchtop device on retrospective genomic DNA (gDNA) samples from controls and from patients with a clinical suspicion of a mitochondrial DNA disorder. This Ion Torrent Personal Genome Machine platform is a high-throughput sequencer with a fast turnaround time and reasonable running costs. We challenged the chemistry and technology with the analysis and processing of a mutational spectrum composed of samples with single-nucleotide substitutions, indels (insertions and deletions), and large single or multiple deletions, occasionally in heteroplasmy. The output data were compared with previously obtained conventional dideoxy sequencing results and the mitochondrial revised Cambridge Reference Sequence (rCRS). We were able to identify the majority of all nucleotide alterations, but three false-negative results were also encountered in the data set. At the same time, the poor performance of the PGM instrument in regions associated with homopolymeric stretches generated many false-positive miscalls demanding additional manual curation of the data.

  17. Structural Control of Metabolic Flux

    PubMed Central

    Sajitz-Hermstein, Max; Nikoloski, Zoran

    2013-01-01

    Organisms have to continuously adapt to changing environmental conditions or undergo developmental transitions. To meet the accompanying change in metabolic demands, the molecular mechanisms of adaptation involve concerted interactions which ultimately induce a modification of the metabolic state, which is characterized by reaction fluxes and metabolite concentrations. These state transitions are the effect of simultaneously manipulating fluxes through several reactions. While metabolic control analysis has provided a powerful framework for elucidating the principles governing this orchestrated action to understand metabolic control, its applications are restricted by the limited availability of kinetic information. Here, we introduce structural metabolic control as a framework to examine individual reactions' potential to control metabolic functions, such as biomass production, based on structural modeling. The capability to carry out a metabolic function is determined using flux balance analysis (FBA). We examine structural metabolic control using the example of the central carbon metabolism of Escherichia coli and the recently introduced framework of functional centrality (FC). This framework is based on the Shapley value from cooperative game theory together with FBA, and we demonstrate its superior ability to assign “share of control” to individual reactions with respect to metabolic functions and environmental conditions. A comparative analysis of various scenarios illustrates the usefulness of FC and its relations to other structural approaches pertaining to metabolic control. We propose a Monte Carlo algorithm to estimate FCs for large networks, based on the enumeration of elementary flux modes. We further give detailed biological interpretations of FCs for the production of lactate and ATP under various respiratory conditions. PMID:24367246
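
    The Shapley value behind FC is typically estimated by sampling random permutations and averaging marginal contributions. The sketch below does this for a toy characteristic function v(S); in the paper's setting, v(S) would instead be an FBA-derived objective evaluated on the reaction subset S:

```python
import numpy as np

def shapley_mc(players, v, n_perm=5000, seed=5):
    """Monte Carlo Shapley estimate: average marginal contribution of each
    player over random orderings."""
    rng = np.random.default_rng(seed)
    phi = {p: 0.0 for p in players}
    for _ in range(n_perm):
        coalition, base = set(), v(set())
        for p in rng.permutation(players):
            coalition.add(int(p))
            new = v(coalition)
            phi[int(p)] += new - base
            base = new
    return {p: s / n_perm for p, s in phi.items()}

# toy characteristic function: "biomass" needs both reaction 0 and reaction 1
v = lambda S: 1.0 if {0, 1} <= S else 0.0
print(shapley_mc(list(range(4)), v))   # ~0.5 each for 0 and 1; ~0 for 2 and 3
```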

  18. Improved protein hydrogen/deuterium exchange mass spectrometry platform with fully automated data processing.

    PubMed

    Zhang, Zhongqi; Zhang, Aming; Xiao, Gang

    2012-06-05

    Protein hydrogen/deuterium exchange (HDX) followed by protease digestion and mass spectrometric (MS) analysis is accepted as a standard method for studying protein conformation and conformational dynamics. In this article, an improved HDX MS platform with fully automated data processing is described. The platform significantly reduces systematic and random errors in the measurement by introducing two types of corrections in HDX data analysis. First, a mixture of short peptides with fast HDX rates is introduced as internal standards to adjust for run-to-run variations in the extent of back exchange. Second, a designed unique peptide (PPPI) with a slow intrinsic HDX rate is employed as another internal standard to reflect possible differences in protein intrinsic HDX rates when protein conformations at different solution conditions are compared. HDX data processing is achieved with a comprehensive HDX model that simulates the deuterium labeling and back exchange process. The HDX model is implemented in the in-house developed software MassAnalyzer and enables fully unattended analysis of the entire protein HDX MS data set, from ion detection and peptide identification to the final processed HDX output, typically within 1 day. The final output of the automated data processing is a set (or the average) of the most probable protection factors for each backbone amide hydrogen. The utility of the HDX MS platform is demonstrated by exploring the conformational transition of a monoclonal antibody induced by increasing concentrations of guanidine.

  19. The Efficacy and Safety of Mainstream Medications for Patients With cDMARD-Naïve Rheumatoid Arthritis: A Network Meta-Analysis.

    PubMed

    Cai, Weiyan; Gu, Youyi; Cui, Huanqin; Cao, Yinyin; Wang, Xiaoliang; Yao, Yi; Wang, Mingyu

    2018-01-01

    Background: The mainstream medications for rheumatoid arthritis (RA) include conventional disease-modifying antirheumatic drugs (cDMARDs), most commonly methotrexate (MTX), and biologic agents such as adalimumab (ADA), certolizumab (CZP), etanercept (ETN), golimumab (GOL), infliximab (IFX), and tocilizumab (TCZ). This network meta-analysis was aimed at evaluating the efficacy and safety of the medications above, and of interventions combining cDMARDs and biologic agents, for patients with RA. Methods: PubMed, EMBASE, Cochrane Library, and ClinicalTrials.gov were searched systematically for eligible randomized controlled trials (RCTs). Outcomes concerning efficacy and safety were evaluated utilizing odds ratios (ORs) and 95% credible intervals (CrI). Efficacy was evaluated through remission and American College of Rheumatology (ACR) scores. The surface under the cumulative ranking curve (SUCRA) was calculated to rank each treatment on each index. Results: A total of 20 RCTs with 9,047 patients were included, and the efficacy and safety of the interventions of interest for RA were evaluated. Compared with cDMARDs alone, TCZ+MTX, ETN+MTX, IFX+MTX, TCZ, and ADA+MTX showed a statistically significant advantage on ACR20, ACR50, and ACR70. In addition, for remission, TCZ+MTX, IFX+MTX, TCZ, and CZP+MTX performed better than cDMARDs alone. The SUCRA ranking also indicated that TCZ+MTX was the best-ranked intervention across all four efficacy indexes, followed by ETN+MTX and IFX+MTX. However, there was no obvious difference among these medications compared with cDMARDs in terms of safety, which needs more specific study. Conclusion: TCZ+MTX was potentially the most recommended combination of medications for RA due to its good performance in all outcomes of efficacy. ETN+MTX and IFX+MTX, which also performed well, could be introduced as alternative treatments. However, considering adverse events, the treatments concerned should be introduced with caution.
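
    SUCRA itself is easy to compute once the rank probabilities are available from the network meta-analysis posterior: it is the normalized area under the cumulative ranking curve. The sketch below uses invented probabilities, not the study's results:

```python
import numpy as np

def sucra(rank_probs):
    """rank_probs: (n_treatments, n_ranks); rows sum to 1, rank 1 is best.
    SUCRA_j = mean of the cumulative rank probabilities over ranks 1..a-1."""
    cum = np.cumsum(rank_probs, axis=1)[:, :-1]
    return cum.mean(axis=1)

p = np.array([[0.6, 0.3, 0.1],    # treatment likely to rank first
              [0.3, 0.4, 0.3],
              [0.1, 0.3, 0.6]])   # treatment likely to rank last
print(sucra(p))                   # in [0, 1]; higher means better ranking
```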

  20. Are Australians concerned about nanoparticles? A comparative analysis with established and emerging environmental health issues.

    PubMed

    Capon, Adam; Rolfe, Margaret; Gillespie, James; Smith, Wayne

    2015-02-01

    Introducing new technologies into society raises considerable public concern. We determine the public concern about nanoparticles, and compare this concern to that over other environmental health issues such as wind farms and coal seam gas production. A repeat cross-sectional survey examining views on environmental health issues, risk, chemicals and trust was undertaken with more than 1,300 Australian residents in 2000 and 2013. Logistic regression and principal component analysis were used to investigate predictors of nanoparticle concern and to identify a component structure for environmental health issues that could explain a trend of future nanoparticle concern. Australians have a relatively low level of concern about the risks of nanoparticles to health when compared to their concerns about other environmental health issues. Items associated with concern included gender, a general wish to avoid chemicals, and possibly trust in politicians. Concern over nanoparticles clustered with similar views on technological risks. Current public concern over the risks of nanoparticles is low. However, a reframing of the issue towards 'chemicals' is likely to have a negative effect on risk perceptions. This paper raises questions about appropriate channels for the effective communication of risk. © 2015 Public Health Association of Australia.

  1. Shape memory alloy smart knee spacer to enhance knee functionality: model design and finite element analysis.

    PubMed

    Gautam, Arvind; Rani, A Bhargavi; Callejas, Miguel A; Acharyya, Swati Ghosh; Acharyya, Amit; Biswas, Dwaipayan; Bhandari, Vasundhra; Sharma, Paresh; Naik, Ganesh R

    2016-08-01

    In this paper we introduce Shape Memory Alloy (SMA) for designing the tibial part of Total Knee Arthroplasty (TKA), exploiting the shape-memory and pseudo-elasticity properties of SMAs (e.g. NiTi). This would eliminate the drawbacks of state-of-the-art PMMA-based knee spacers, including fracture, poor sustainability, dislocation, tilting, translation and subluxation, in tackling osteoarthritis, especially in people aged 45 and over and in athletes. We present a Computer Aided Design (CAD) model of the knee spacer, built in SolidWorks from the proposed SMA and adopting the industry-standard geometry used in PMMA-based spacer design. Subsequently, an Ansys-based finite element analysis is carried out to measure and compare the performance of the proposed SMA-based model against the state-of-the-art PMMA one. The PMMA-based spacer exhibits 81% more bending than the proposed SMA, which would eventually cause fracture and tilting or translation of the spacer. Under the same load, permanent shape deformation of approximately 58.75% is observed in the PMMA-based spacer, compared with a recoverable 11% deformation in the SMA.

  2. The ecological risks of genetically engineered organisms

    NASA Astrophysics Data System (ADS)

    Wolfenbarger, Lareesa

    2001-03-01

    Highly publicized studies have suggested environmental risks of releasing genetically engineered organisms (GEOs) and have renewed concerns over the evaluation and regulation of these products in domestic and international arenas. I present an overview of the risks of GEOs and the available evidence addressing these risks, and discuss the challenges for risk assessment. Main categories of risk include non-target effects of GEOs, the emergence of new viral diseases, and the spread of invasive (weedy) characteristics. Studies have detected non-target effects in some cases but not in all; however, much less information exists on the other risks, in part due to a lack of conceptual knowledge. For example, general models for predicting invasiveness are not well developed for any introduced organism. The risks of GEOs appear comparable to those of any introduced species or organism, but the magnitude of the risk, or the pathway of exposure to it, can differ among introduced organisms. Therefore, assessing the risks requires a case-by-case analysis so that any differences can be identified. Challenges to assessing risks to valued ecosystems include variability in effects and ecosystem complexity. Ecosystems are dynamic and complex networks of biological and physical interactions. Introducing a new biological entity, such as a GEO, may potentially alter any of these interactions, but evaluating all of them is unrealistic. Effects on a valued ecosystem could vary greatly depending on the geographical location of the experimental site, the GEO used, the plot size of the experiment (scaling effects), and the biological and physical parameters used in the experiment. Experiments that address these sources of variability will provide the most useful information for risk assessments.

  3. Predicting potential global distributions of two Miscanthus grasses: implications for horticulture, biofuel production, and biological invasions.

    PubMed

    Hager, Heather A; Sinasac, Sarah E; Gedalof, Ze'ev; Newman, Jonathan A

    2014-01-01

    In many regions, large proportions of the naturalized and invasive non-native floras were originally introduced deliberately by humans. Pest risk assessments are now used in many jurisdictions to regulate the importation of species and usually include an estimation of the potential distribution in the import area. Two species of Asian grass (Miscanthus sacchariflorus and M. sinensis) that were originally introduced to North America as ornamental plants have since escaped cultivation. These species and their hybrid offspring are now receiving attention for large-scale production as biofuel crops in North America and elsewhere. We evaluated their potential global climate suitability for cultivation and potential invasion using the niche model CLIMEX and evaluated the models' sensitivity to the parameter values. We then compared the sensitivity of projections of future climatically suitable area under two climate models and two emissions scenarios. The models indicate that the species have been introduced to most of the potential global climatically suitable areas in the northern but not the southern hemisphere. The more narrowly distributed species (M. sacchariflorus) is more sensitive to changes in model parameters, which could have implications for modelling species of conservation concern. Climate projections indicate likely contractions in potential range in the south, but expansions in the north, particularly in introduced areas where biomass production trials are under way. Climate sensitivity analysis shows that projections differ more between the selected climate change models than between the selected emissions scenarios. Local-scale assessments are required to overlay suitable habitat with climate projections to estimate areas of cultivation potential and invasion risk.

  4. Calorie Changes in Large Chain Restaurants

    PubMed Central

    Bleich, Sara N.; Wolfson, Julia A.; Jarlenski, Marian P.

    2015-01-01

    Introduction Large chain restaurants reduced the number of calories in newly introduced menu items in 2013 by about 60 calories (or 12%) relative to 2012. This paper describes trends in calories available in large U.S. chain restaurants to understand whether previously documented patterns persist. Methods Data (a census of items for the included restaurants) were obtained from the MenuStat project. This analysis included 66 of the 100 largest U.S. restaurants that are available in all 3 years of the data (2012–2014; N=23,066 items). Generalized linear models were used to examine: (1) per-item calorie changes from 2012 to 2014 among items on the menu in all years; and (2) mean calories in new items in 2013 and 2014 compared with items on the menu in 2012 only. Data were analyzed in 2014. Results Overall, calories in newly introduced menu items declined by 71 (or 15%) from 2012 to 2013 (p=0.001) and by 69 (or 14%) from 2012 to 2014 (p=0.03). These declines were concentrated mainly in new main course items (85 fewer calories in 2013 and 55 fewer calories in 2014; p=0.01). Although average calories in newly introduced menu items are declining, they remain higher than in items common to the menu in all 3 years. No differences in mean calories among items on menus in 2012, 2013, or 2014 were found. Conclusions The previously observed declines in newly introduced menu items among large restaurant chains have been maintained, which suggests the beginning of a trend toward reducing calories. PMID:26163168

  5. MetaComp: comprehensive analysis software for comparative meta-omics including comparative metagenomics.

    PubMed

    Zhai, Peng; Yang, Longshu; Guo, Xiao; Wang, Zhe; Guo, Jiangtao; Wang, Xiaoqi; Zhu, Huaiqiu

    2017-10-02

    During the past decade, the development of high-throughput nucleic acid sequencing and mass spectrometry analysis techniques has enabled the characterization of microbial communities through metagenomics, metatranscriptomics, metaproteomics and metabolomics data. To reveal the diversity of microbial communities and the interactions between living conditions and microbes, it is necessary to introduce comparative analysis based upon integration of all four types of data mentioned above. Comparative meta-omics, especially comparative metagenomics, has been established as a routine process to highlight the significant differences in taxon composition and functional gene abundance among microbiota samples. Meanwhile, biologists are increasingly concerned about the correlations between meta-omics features and environmental factors, which may further decipher the adaptation strategy of a microbial community. We developed a graphical comprehensive analysis software package named MetaComp, comprising a series of statistical analysis approaches with visualized results for metagenomics and other meta-omics data comparison. This software is capable of reading files generated by a variety of upstream programs. After data loading, analyses such as multivariate statistics, hypothesis testing for two-sample, multi-sample and two-group designs, and a novel function, regression analysis of environmental factors, are offered. Here, regression analysis regards meta-omic features as independent variables and environmental factors as dependent variables. Moreover, MetaComp is capable of automatically choosing an appropriate two-group sample test based upon the traits of the input abundance profiles. We further evaluate the performance of this choice, and exhibit applications for metagenomics, metaproteomics and metabolomics samples. MetaComp, an integrative software package applicable to all meta-omics data, originally distills the influence of the living environment on a microbial community by regression analysis. Moreover, since the automatically chosen two-group sample test is verified to perform well, MetaComp is friendly to users without adequate statistical training. These improvements aim to overcome the new challenges that the big data era poses for all meta-omics data. MetaComp is available at: http://cqb.pku.edu.cn/ZhuLab/MetaComp/ and https://github.com/pzhaipku/MetaComp/ .

  6. [Hazard function and life table: an introduction to the failure time analysis].

    PubMed

    Matsushita, K; Inaba, H

    1987-04-01

    Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables, as well as a special case of event history analysis and multistate demography. The idea of the hazard function and failure time analysis, however, has not been properly introduced to, nor commonly discussed by, demographers in Japan. The concept of the hazard function is briefly described in comparison with life tables, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of the exponential distribution, the normal distribution, and proportional hazards models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
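
    For readers new to the terminology, the hazard function referred to above can be written as follows (standard notation, not taken from the paper):

```latex
% Standard definitions, included for orientation only.
h(t) = \lim_{\Delta t \to 0}
       \frac{\Pr(t \le T < t + \Delta t \mid T \ge t)}{\Delta t}
     = \frac{f(t)}{S(t)},
\qquad
S(t) = \exp\!\left(-\int_0^t h(u)\, du\right).
% Exponential case: f(t) = \lambda e^{-\lambda t} gives a constant hazard
% h(t) = \lambda, i.e. an age-independent force of mortality.
```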

  7. Maximum Entropy Method applied to Real-time Time-Dependent Density Functional Theory

    NASA Astrophysics Data System (ADS)

    Zempo, Yasunari; Toogoshi, Mitsuki; Kano, Satoru S.

    Maximum Entropy Method (MEM) is widely used for the analysis of time-series data, such as earthquake records, that have fairly long periodicities but short observation windows. We have examined how MEM can be applied to the optical analysis of time-series data from real-time TDDFT. In such analysis, the Fourier Transform (FT) is usually used, and attention must be paid to the lower-energy part of the spectrum, such as the band gap, which requires long time evolution; the computational cost naturally becomes quite expensive. Since MEM is based on the autocorrelation of the signal, in which periodicity is described through differences of time-lags, its value in the lower-energy region naturally becomes small compared to that in the higher-energy region. To overcome this difficulty, our MEM has two features: the raw data are repeated many times and concatenated, which improves the resolution in the lower-energy region; and, together with the repeated data, an appropriate phase for the target frequency is introduced to reduce the side effects of the artificial periodicity. We have compared our improved MEM and FT spectra using small-to-medium size molecules. The MEM spectrum is clearly resolved compared to that of FT, and our technique provides higher resolution in fewer time steps. This work was partially supported by JSPS Grants-in-Aid for Scientific Research (C) Grant number 16K05047, Sumitomo Chemical, Co. Ltd., and Simulatio Corp.
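
    A generic MEM spectral estimate (Burg's method) can be contrasted with the FT periodogram as below; this is plain textbook MEM, not the authors' repeated-and-concatenated variant with phase correction:

```python
import numpy as np

def burg(x, order):
    """Burg recursion: AR coefficients a (a[0] = 1) and residual power E."""
    f = np.asarray(x, float).copy()
    b = f.copy()
    a = np.array([1.0])
    E = np.mean(f ** 2)
    for _ in range(order):
        ff, bb = f[1:], b[:-1]                     # forward/backward errors
        k = -2 * np.dot(ff, bb) / (np.dot(ff, ff) + np.dot(bb, bb))
        a = np.concatenate([a, [0.0]])
        a = a + k * a[::-1]                        # Levinson-style update
        f, b = ff + k * bb, bb + k * ff
        E *= 1 - k ** 2
    return a, E

rng = np.random.default_rng(6)
t = np.arange(256)
x = np.sin(2 * np.pi * 0.05 * t) + 0.5 * rng.standard_normal(256)

a, E = burg(x, order=12)
freqs = np.linspace(0, 0.5, 512)
denom = np.abs(np.exp(-2j * np.pi * np.outer(freqs, np.arange(len(a)))) @ a)
psd = E / denom ** 2                               # MEM power spectral density
print(f"MEM peak at f = {freqs[np.argmax(psd)]:.3f} (true value 0.050)")
```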

  8. Clinical Laboratory Automation: A Case Study

    PubMed Central

    Archetti, Claudia; Montanelli, Alessandro; Finazzi, Dario; Caimi, Luigi; Garrafa, Emirena

    2017-01-01

    Background This paper presents a case study of an automated clinical laboratory in a large urban academic teaching hospital in the north of Italy, the Spedali Civili in Brescia, where four laboratories were merged into a single laboratory through the introduction of laboratory automation. Materials and Methods The analysis compares the pre-automation situation and the new setting from a cost perspective, by considering direct and indirect costs. It also presents an analysis of the turnaround time (TAT). The study considers equipment, staff and indirect costs. Results The introduction of automation led to a slight increase in equipment costs, which is more than compensated by a remarkable decrease in staff costs. Consequently, total costs decreased by 12.55%. The analysis of the TAT shows an improvement for non-emergency exams, while emergency exams are still validated within the maximum time imposed by the hospital. Conclusions The strategy adopted by the management, which was based on re-using the available equipment and staff when merging the pre-existing laboratories, reached its goal: introducing automation while minimizing the costs. Significance for public health Automation is an emerging trend in modern clinical laboratories, with a positive impact on service level to patients and on staff safety, as shown by different studies. It allows process standardization which, in turn, decreases the frequency of outliers and errors, and it induces faster processing times, thus improving the service level. In addition, automation decreases staff exposure to accidents, strongly improving staff safety. In this study, we analyse a further potential benefit of automation, namely economic convenience. We study the case of the automated laboratory of one of the biggest hospitals in Italy and compare the costs before and after automation. Introducing automation led to a cost decrease without affecting the service level to patients. This was a key goal of the hospital which, like public health entities in general, is constantly struggling with budget constraints. PMID:28660178

  9. Sample introducing apparatus and sample modules for mass spectrometer

    DOEpatents

    Thompson, C.V.; Wise, M.B.

    1993-12-21

    An apparatus for introducing gaseous samples from a wide range of environmental matrices into a mass spectrometer for analysis is described. Several sample-preparing modules, including a real-time air monitoring module, a soil/liquid purge module, and a thermal desorption module, are individually and rapidly attachable to the sample-introducing apparatus for supplying gaseous samples to the mass spectrometer. The sample-introducing apparatus uses a capillary column for conveying the gaseous samples into the mass spectrometer and is provided with an open/split interface in communication with the capillary, and with a sample archiving port through which at least about 90 percent of the gaseous sample (in a mixture with an inert gas introduced into the apparatus) is separated from the minor portion of the mixture entering the capillary and is discharged from the apparatus. 5 figures.

  10. Spatial and spectral analysis of corneal epithelium injury using hyperspectral images

    NASA Astrophysics Data System (ADS)

    Md Noor, Siti Salwa; Michael, Kaleena; Marshall, Stephen; Ren, Jinchang

    2017-12-01

    Eye assessment is essential in preventing blindness. Currently, existing methods to assess corneal epithelium injury are complex and require expert knowledge. Hence, we have introduced a non-invasive technique for assessing corneal epithelium injury using hyperspectral imaging (HSI) and an image analysis algorithm. Three groups of images were compared and analyzed: healthy eyes, injured eyes, and injured eyes with stain. Dimensionality reduction using principal component analysis (PCA) was applied to reduce the massive data volume and redundancy, and the first 10 principal components (PCs) were selected for further processing. The mean vectors of the 10 PCs, over all 45 pairwise combinations, were computed and sent to two classifiers. A quadratic Bayes normal classifier (QDC) and a support vector classifier (SVC) were used in this study to classify the eleven eyes into the three groups. The combined QDC and SVC classifier showed optimal performance with 2D PCA features (2DPCA-QDSVC) and was utilized to classify normal and abnormal tissues using color image segmentation. The result was compared with human segmentation, and the outcome showed that the proposed algorithm produced extremely promising results to assist the clinician in quantifying a cornea injury.
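
    The processing chain described above (PCA to 10 components, then QDC and SVC) can be sketched generically with scikit-learn; the hyperspectral data loading and the exact 2DPCA-QDSVC combination are not reproduced, and the arrays below are random stand-ins:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
X = rng.standard_normal((90, 200))   # 90 spectra x 200 bands (random stand-in)
y = rng.integers(0, 3, 90)           # healthy / injured / injured with stain

for name, clf in (("QDC", QuadraticDiscriminantAnalysis()),
                  ("SVC", SVC(kernel="rbf"))):
    pipe = make_pipeline(PCA(n_components=10), clf)   # reduce, then classify
    print(f"{name}: mean CV accuracy {cross_val_score(pipe, X, y, cv=5).mean():.2f}")
```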

  11. Comparative Sensitivity Analysis of Muscle Activation Dynamics

    PubMed Central

    Günther, Michael; Götz, Thomas

    2015-01-01

    We mathematically compared two models of mammalian striated muscle activation dynamics proposed by Hatze and Zajac. Both models are representative of a broad variety of biomechanical models formulated as ordinary differential equations (ODEs). These models incorporate parameters that directly represent known physiological properties; other parameters have been introduced to reproduce empirical observations. We used sensitivity analysis to investigate the influence of model parameters on the ODE solutions. In addition, we expanded an existing approach to treat initial conditions as parameters and to calculate second-order sensitivities, and we used a global sensitivity analysis approach to include finite ranges of parameter values. Hence, a theoretician striving for model reduction could use the method to identify particularly low sensitivities and detect superfluous parameters, while an experimenter could use it to identify particularly high sensitivities and improve parameter estimation. Hatze's nonlinear model incorporates some parameters to which activation dynamics is clearly more sensitive than to any parameter in Zajac's linear model. Unlike Zajac's model, however, Hatze's model can reproduce measured shifts in optimal muscle length with varied muscle activity. Accordingly, we extracted a specific parameter set for Hatze's model that combines best with a particular muscle force-length relation. PMID:26417379
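
    First-order sensitivities of an ODE solution to a parameter can be approximated by central finite differences, as sketched below on a generic linear first-order activation model (the actual Hatze and Zajac equations and parameters are not reproduced):

```python
import numpy as np
from scipy.integrate import solve_ivp

T_EVAL = np.linspace(0.0, 1.0, 50)

def activation(tau, u=1.0):
    """Solve da/dt = (u - a) / tau, a(0) = 0: a generic linear activation ODE."""
    sol = solve_ivp(lambda t, a: (u - a) / tau, (0.0, 1.0), [0.0], t_eval=T_EVAL)
    return sol.y[0]

def sensitivity(tau, eps=1e-4):
    """Central finite-difference estimate of d a(t) / d tau along the trajectory."""
    return (activation(tau + eps) - activation(tau - eps)) / (2 * eps)

s = sensitivity(tau=0.05)
print(f"max |d a / d tau| over the trajectory: {np.max(np.abs(s)):.1f}")
```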

  12. Molecular Eigensolution Symmetry Analysis and Fine Structure

    PubMed Central

    Harter, William G.; Mitchell, Justin C.

    2013-01-01

    Spectra of high-symmetry molecules contain fine and superfine level cluster structure related to J-tunneling between hills and valleys on rovibronic energy surfaces (RES). Such graphic visualizations help disentangle multi-level dynamics, selection rules, and state mixing effects including widespread violation of nuclear spin symmetry species. A review of RES analysis compares it to that of potential energy surfaces (PES) used in Born–Oppenheimer approximations. Both take advantage of adiabatic coupling in order to visualize Hamiltonian eigensolutions. RES of symmetric and D2 asymmetric top rank-2-tensor Hamiltonians are compared with Oh spherical top rank-4-tensor fine-structure clusters of 6-fold and 8-fold tunneling multiplets. Then extreme 12-fold and 24-fold multiplets are analyzed by RES plots of higher rank tensor Hamiltonians. Such extreme clustering is rare in fundamental bands but prevalent in hot bands, and analysis of its superfine structure requires more efficient labeling and a more powerful group theory. This is introduced using elementary examples involving two groups of order-6 (C6 and D3~C3v), then applied to families of Oh clusters in SF6 spectra and to extreme clusters. PMID:23344041

  13. The Model Experiments and Finite Element Analysis on Deformation and Failure by Excavation of Grounds in Foregoing-roof Method

    NASA Astrophysics Data System (ADS)

    Sotokoba, Yasumasa; Okajima, Kenji; Iida, Toshiaki; Tanaka, Tadatsugu

    We propose the trenchless box culvert construction method to construct box culverts beneath shallow covering soil layers while keeping roads or tracks open. When using this construction method, it is necessary to clarify the deformation and shear failure of the ground caused by excavation. In order to investigate the soil behavior, model experiments and elasto-plastic finite element analysis were performed. The model experiments showed that shear failure developed from the end of the roof to the toe of the boundary surface. In the finite element analysis, a shear band effect was introduced. Comparing the observed shear bands in the model experiments with the computed maximum shear strain contours, it was found that the observed direction of the shear band could be simulated reasonably by the finite element analysis. We may therefore say that the finite element method used in this study is a useful tool for this construction method.

  14. Point-by-point compositional analysis for atom probe tomography.

    PubMed

    Stephenson, Leigh T; Ceguerra, Anna V; Li, Tong; Rojhirunsakool, Tanaporn; Nag, Soumya; Banerjee, Rajarshi; Cairney, Julie M; Ringer, Simon P

    2014-01-01

    This new alternative approach to data processing, for analyses that traditionally employed grid-based counting methods, is necessary because it removes a user-imposed coordinate system that not only limits an analysis but may also introduce errors. We have modified the widely used "binomial" analysis for APT data by replacing grid-based counting with coordinate-independent nearest neighbour identification, improving the measurements and the statistics obtained, allowing quantitative analysis of smaller datasets and of datasets from non-dilute solid solutions, and allowing better visualisation of compositional fluctuations in the data. Our modifications include: using spherical k-atom blocks identified by each detected atom's first k nearest neighbours; 3D data visualisation of block composition and nearest neighbour anisotropy; and using z-statistics to directly compare experimental and expected composition curves. Similar modifications may be made to other grid-based counting analyses (contingency table, Langer-Bar-on-Miller, sinusoidal model) and could be instrumental in developing novel data visualisation options.
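
    The coordinate-independent block construction can be sketched with a k-d tree: every atom defines a block of its k nearest neighbours, whose solute fraction is then compared with the binomial expectation for a random solid solution. Positions and labels below are synthetic:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(8)
pos = 50.0 * rng.random((20000, 3))          # synthetic atom positions (nm)
is_solute = rng.random(20000) < 0.10         # 10 at.% random solute labels

k = 50
tree = cKDTree(pos)
_, idx = tree.query(pos, k=k + 1)            # first neighbour is the atom itself
block_conc = is_solute[idx[:, 1:]].mean(axis=1)   # solute fraction per block

expected_std = np.sqrt(0.1 * 0.9 / k)        # binomial std for a random solution
print(f"block concentration: mean {block_conc.mean():.3f}, "
      f"std {block_conc.std():.3f} (binomial expectation {expected_std:.3f})")
```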

  15. Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements

    NASA Astrophysics Data System (ADS)

    Papa, A. R.; Akel, A. F.

    2009-05-01

    Following previous works by our group (Papa et al., JASTP, 2006), in which we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil during October 2000, we introduce a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared to traditional methods (Fourier, for example), while allowing an almost continuous tracking of both the amplitude and frequency of signals as time goes by. This advantage opens possibilities for potentially useful forecasting methods of the type also advanced by our group in previous works (see, for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case, amplitude and frequency) is a challenging matter, and it is in this sense that we have found what we consider our main goal. Some possible directions for future work are also advanced.
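
    A continuous wavelet transform of the kind described can be sketched with PyWavelets (assumed available); the record below is synthetic, with a mid-series frequency step that the scale-resolved coefficients pick up:

```python
import numpy as np
import pywt

fs = 1.0                                   # one sample per second (assumed)
t = np.arange(3600)
freq = 0.010 + 0.005 * (t > 1800)          # frequency steps halfway through
rng = np.random.default_rng(10)
x = np.sin(2 * np.pi * np.cumsum(freq) / fs) + 0.2 * rng.standard_normal(t.size)

scales = np.arange(20, 200)
coef, freqs = pywt.cwt(x, scales, "morl", sampling_period=1 / fs)
dominant = freqs[np.abs(coef).argmax(axis=0)]   # dominant frequency per time step
print(f"median dominant frequency, first half:  {np.median(dominant[:1800]):.4f} Hz")
print(f"median dominant frequency, second half: {np.median(dominant[1800:]):.4f} Hz")
```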

  16. Analysis of human tissues by total reflection X-ray fluorescence. Application of chemometrics for diagnostic cancer recognition

    NASA Astrophysics Data System (ADS)

    Benninghoff, L.; von Czarnowski, D.; Denkhaus, E.; Lemke, K.

    1997-07-01

    For the determination of the distributions of more than 20 trace elements in malignant and normal tissues of the human colon, tissue samples (approx. 400 mg wet weight) were digested with 3 ml of nitric acid (sub-boiled quality) using an autoclave system. The accuracy of the measurements was investigated using certified materials. The analytical results were evaluated using a spreadsheet program to give an overview of the element distribution in cancerous and normal colon tissues. As a further application, cluster analysis of the analytical results was introduced to demonstrate the possibility of classification for cancer diagnosis. To confirm the results of the cluster analysis, multivariate three-way principal component analysis was performed. Additionally, microtome frozen sections (10 μm) were prepared from the same tissue samples to compare the analytical results, i.e. the mass fractions of elements, across preparation methods and to exclude systematic errors arising from tissue inhomogeneity.

  17. Ray propagation path analysis of acousto-ultrasonic signals in composites

    NASA Technical Reports Server (NTRS)

    Kautz, Harold E.

    1987-01-01

    The most important result was the demonstration that acousto-ultrasonic (AU) energy introduced into a graphite/resin laminate propagates through the structure by two modes. The first mode, along the graphite fibers, is the faster. The second mode, through the resin matrix, besides being slower, is also more strongly attenuated at the higher frequencies. This was demonstrated by analyzing the time and frequency domains of the composite AU signal and comparing them with those of a neat resin specimen of the same chemistry and geometry as the composite matrix. Analysis of the fine structure of AU spectra was accomplished by various geometrical strategies. It was shown that the multitude of narrow peaks associated with AU spectra is the effect of the many pulse arrivals in the signal. The shape and distribution of the peaks are mainly determined by the condition of non-normal reflections of ray paths. A cepstrum analysis, which can be useful in detecting characteristic times, was also employed. Analysis of propagation modes can be accomplished while ignoring the fine structure.
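
    The cepstrum mentioned above turns a signal's echo spacing into a peak at the corresponding quefrency. A minimal sketch on a synthetic pulse-plus-echo waveform (the sampling rate and pulse parameters are assumptions, not values from the report):

```python
import numpy as np

def real_cepstrum(x):
    """Inverse FFT of the log magnitude spectrum."""
    return np.fft.irfft(np.log(np.abs(np.fft.rfft(x)) + 1e-12))

fs = 1_000_000                                     # 1 MHz sampling (assumed)
t = np.arange(4096) / fs
pulse = np.exp(-((t - 50e-6) / 5e-6) ** 2) * np.sin(2 * np.pi * 2e5 * t)
signal = pulse + 0.5 * np.roll(pulse, 300)         # echo arriving 300 samples later

ceps = real_cepstrum(signal)
lag = np.argmax(ceps[100:1000]) + 100              # search away from the origin
print(f"dominant quefrency: {lag} samples = {lag / fs * 1e6:.0f} microseconds")
```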

  18. Fast and sensitive trace analysis of malachite green using a surface-enhanced Raman microfluidic sensor.

    PubMed

    Lee, Sangyeop; Choi, Junghyun; Chen, Lingxin; Park, Byungchoon; Kyong, Jin Burm; Seong, Gi Hun; Choo, Jaebum; Lee, Yeonjung; Shin, Kyung-Hoon; Lee, Eun Kyu; Joo, Sang-Woo; Lee, Kyeong-Hee

    2007-05-08

    A rapid and highly sensitive trace analysis technique for determining malachite green (MG) in a polydimethylsiloxane (PDMS) microfluidic sensor was investigated using surface-enhanced Raman spectroscopy (SERS). A zigzag-shaped PDMS microfluidic channel was fabricated for efficient mixing between MG analytes and aggregated silver colloids. Under the optimal flow velocity, MG molecules were effectively adsorbed onto silver nanoparticles while flowing along the upper and lower zigzag-shaped PDMS channel. A quantitative analysis of MG was performed based on the measured peak height at 1615 cm(-1) in its SERS spectrum. The limit of detection using the SERS microfluidic sensor was found to be below the 1-2 ppb level, and this low detection limit is comparable to the result of the LC-MS detection method. In the present study, we introduce a new conceptual detection technology, using a SERS microfluidic sensor, for highly sensitive trace analysis of MG in water.

  19. Biology, host instar suitability and susceptibility, and interspecific competition of three introduced parasitoids of Paracoccus marginatus (Hemiptera: Pseudococcidae)

    USDA-ARS?s Scientific Manuscript database

    Biology, host stage suitability and susceptibility, and interspecific competition of three previously introduced parasitoids (Acerophagus papayae, Anagyrus loecki, and Pseudleptomastix mexicana) (Hymenoptera: Encyrtidae) of Paracoccus marginatus were studied in the laboratory. Compared to P. mexica...

  20. Differential escape from parasites by two competing introduced crabs

    USGS Publications Warehouse

    Blakeslee, April M.; Keogh, Carolyn L.; Byers, James E.; Kuris, Armand M.; Lafferty, Kevin D.; Torchin, Mark E.

    2009-01-01

    Although introduced species often interact with one another in their novel communities, the role of parasites in these interactions remains less clear. We examined parasite richness and prevalence in 2 shorecrab species with different invasion histories and residency times in an introduced region where their distributions overlap broadly. On the northeastern coast of the USA, the Asian shorecrab Hemigrapsus sanguineus was discovered 20 yr ago, while the European green crab Carcinus maenas has been established for over 200 yr. We used literature and field surveys to evaluate parasitism in both crabs in their native and introduced ranges. We found only 1 parasite species infecting H. sanguineus on the US East Coast compared to 6 species in its native range, while C. maenas was host to 3 parasite species on the East Coast compared to 10 in its native range. The prevalence of parasite infection was also lower for both crabs in the introduced range compared to their native ranges; however, the difference was almost twice as much for H. sanguineus as for C. maenas. There are several explanations that could contribute to C. maenas' greater parasite diversity than that of H. sanguineus on the US East Coast, including differences in susceptibility, time since introduction, manner of introduction (vector), distance from native range, taxonomic isolation, and the potential for parasite identification bias. Our study underscores not just that non-native species lose parasites upon introduction, but that they may do so differentially, with ramifications for their direct interactions and with potential community-level influences.

  1. Double-Stage Delay Multiply and Sum Beamforming Algorithm Applied to Ultrasound Medical Imaging.

    PubMed

    Mozaffarzadeh, Moein; Sadeghi, Masume; Mahloojifar, Ali; Orooji, Mahdi

    2018-03-01

    In ultrasound (US) imaging, delay and sum (DAS) is the most common beamformer, but it leads to low-quality images. Delay multiply and sum (DMAS) was introduced to address this problem; however, images reconstructed using DMAS still suffer from high side-lobe levels and low noise suppression. Here, a novel beamforming algorithm is introduced based on expanding the DMAS formula. We found that there is a DAS algebra inside the expansion, and we propose using DMAS in place of that inner DAS. The introduced method, double-stage DMAS (DS-DMAS), is evaluated numerically and experimentally. The quantitative results indicate that DS-DMAS yields an approximately 25% lower side-lobe level compared with DMAS. Moreover, the introduced method leads to 23%, 22% and 43% improvements in signal-to-noise ratio, full width at half-maximum and contrast ratio, respectively, compared with the DMAS beamformer. Copyright © 2018. Published by Elsevier Inc.
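
    The core DAS/DMAS contrast can be sketched on already delay-aligned channel data (the delay computation is omitted): DMAS combines signed square roots of all channel-pair products, which suppresses uncorrelated noise relative to plain summation. In DS-DMAS, the DAS-like sums inside the expanded DMAS formula would themselves be replaced by DMAS again:

```python
import numpy as np
from itertools import combinations

def das(aligned):
    """Delay-and-sum on already delay-aligned data (n_channels x n_samples)."""
    return aligned.sum(axis=0)

def dmas(aligned):
    """Delay-multiply-and-sum: signed square roots of all channel-pair products."""
    y = np.zeros(aligned.shape[1])
    for i, j in combinations(range(aligned.shape[0]), 2):
        prod = aligned[i] * aligned[j]
        y += np.sign(prod) * np.sqrt(np.abs(prod))
    return y

rng = np.random.default_rng(9)
tone = np.sin(2 * np.pi * 5e6 * np.arange(200) / 40e6)   # 5 MHz tone at 40 MHz
aligned = tone + 0.5 * rng.standard_normal((16, 200))    # 16 noisy channels
print(f"DAS  peak output: {np.abs(das(aligned)).max():.1f}")
print(f"DMAS peak output: {np.abs(dmas(aligned)).max():.1f}")
```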

  2. Outcomes and cost comparisons after introducing a robotics program for endometrial cancer surgery.

    PubMed

    Lau, Susie; Vaknin, Zvi; Ramana-Kumar, Agnihotram V; Halliday, Darron; Franco, Eduardo L; Gotlieb, Walter H

    2012-04-01

    To evaluate the effect of introducing a robotics program on cost and patient outcome. This was a prospective evaluation of clinical outcome and cost after introducing a robotics program for the treatment of endometrial cancer, with a retrospective comparison to the entire historical cohort. Consecutive patients with endometrial cancer who underwent robotic surgery (n=143) were compared with all consecutive patients who underwent surgery (n=160) before robotics. The rate of minimally invasive surgery increased from 17% (performed by laparoscopy) to 98% (performed by robotics) within 2 years. The patient characteristics were comparable in both eras, except for a higher body mass index in the robotics era (median 29.8 compared with 27.6; P<.005). Patients undergoing robotics had longer operating times (233 compared with 206 minutes), but fewer adverse events (13% compared with 42%; P<.001), lower estimated median blood loss (50 compared with 200 mL; P<.001), and shorter median hospital stay (1 compared with 5 days; P<.001). The overall hospital costs were significantly lower for robotics compared with the historical group (Can$7,644 compared with Can$10,368 [Canadian dollars]; P<.001), even when acquisition and maintenance costs were included (Can$8,370 compared with Can$10,368; P=.001). Within 2 years after surgery, the short-term recurrence rate appeared lower in the robotics group compared with the historic cohort (11 recurrences compared with 19 recurrences; P<.001). Introduction of robotics for endometrial cancer surgery increased the proportion of patients benefitting from minimally invasive surgery, improved short-term outcomes, and resulted in lower hospital costs. Level of evidence: II.

  3. The positive effects of a peer-led intervention system for individuals with a risk of metabolic syndrome.

    PubMed

    Sanee, Aree; Somrongthong, Ratana; Plianbangchang, Samlee

    2017-01-01

    Metabolic syndrome (MetS) is a major health risk in Thailand. Although it is reported that females have a higher rate of MetS than males, very few peer-led intervention studies have been conducted on specific groups, such as seamstresses, at risk of MetS. This study aimed to evaluate the effect of a peer-led intervention program on reducing MetS risk factors in individuals working in Thai Uniform Sewing Military Factories. A quasi-experimental program was introduced using a pre- and posttest design applied to female sewing factory workers selected for this research. All participants had at least one of the key MetS symptoms. The experimental group (N=50 participants) received 12 weekly peer-led individual support discussion sessions that included both dietary and physical activity (PA) advice, and the control group (N=50 participants) followed their usual daily routines. Student's t-test and Pearson's chi-squared test were used to compare baseline data, and analysis of variance was used to analyze the data after intervention. The results showed that after 3 months of participation, compared to the control group, the experimental group had significantly improved systolic blood pressure (BP) (P=0.04), diastolic BP (P<0.001), PA (P=0.05), knowledge scores of MetS, perception of MetS and risk factors (P<0.001), and stress assessment (P=0.002). Waist circumference, body mass index, and Food Frequency Questionnaire score were not significantly different but still improved. Findings from this study suggest that a peer-led support program can be introduced as an effective means of improving the behaviors of mostly sedentary factory workers at risk of MetS caused by working habits that are detrimental to health.

  4. Muscle-targeted hydrodynamic gene introduction of insulin-like growth factor-1 using polyplex nanomicelle to treat peripheral nerve injury.

    PubMed

    Nagata, Kazuya; Itaka, Keiji; Baba, Miyuki; Uchida, Satoshi; Ishii, Takehiko; Kataoka, Kazunori

    2014-06-10

    The recovery of neurologic function after peripheral nerve injury often remains incomplete because of the prolonged reinnervation process, which leads to skeletal muscle atrophy and articular contracture from disuse over time. To rescue the skeletal muscle and promote functional recovery, insulin-like growth factor-1 (IGF-1), a potent myogenic factor, was introduced into the muscle by hydrodynamic injection of IGF-1-expressing plasmid DNA using a biocompatible nonviral gene carrier, a polyplex nanomicelle. In a mouse model of sciatic nerve injury, the introduction of IGF-1 into the skeletal muscle of the paralyzed limb effectively attenuated the loss of muscle weight relative to untreated control mice. Histologic analysis of the muscle revealed a myogenic effect of the IGF-1-expressing plasmid DNA (pDNA), which induced muscle hypertrophy with upregulation of the myogenic regulatory factors myogenin and MyoD. Evaluation of motor function by walking track analysis revealed that the group receiving the hydrodynamic injection of IGF-1-expressing pDNA with the polyplex nanomicelle recovered motor function significantly earlier than groups receiving negative control pDNA and untreated controls. Early recovery of sensation in the distal area of the sciatic nerve injury was also induced by the introduction of IGF-1-expressing pDNA, presumably because secreted IGF-1 protein acting in the vicinity of the injured sciatic nerve exerted a synergistic effect with muscle hypertrophy, producing a more favorable prognosis. This approach of introducing IGF-1 into skeletal muscle is promising for the treatment of peripheral nerve injury by promoting early motor function recovery. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Toward Genetics-Based Virus Taxonomy: Comparative Analysis of a Genetics-Based Classification and the Taxonomy of Picornaviruses

    PubMed Central

    Lauber, Chris

    2012-01-01

    Virus taxonomy has received little attention from the research community despite its broad relevance. In an accompanying paper (C. Lauber and A. E. Gorbalenya, J. Virol. 86:3890–3904, 2012), we have introduced a quantitative approach to hierarchically classify viruses of a family using pairwise evolutionary distances (PEDs) as a measure of genetic divergence. When applied to the six most conserved proteins of the Picornaviridae, it clustered 1,234 genome sequences in groups at three hierarchical levels (to which we refer as the “GENETIC classification”). In this study, we compare the GENETIC classification with the expert-based picornavirus taxonomy and outline differences in the underlying frameworks regarding the relation of virus groups and genetic diversity that represent, respectively, the structure and content of a classification. To facilitate the analysis, we introduce two novel diagrams. The first connects the genetic diversity of taxa to both the PED distribution and the phylogeny of picornaviruses. The second depicts a classification and the accommodated genetic diversity in a standardized manner. Generally, we found striking agreement between the two classifications on species and genus taxa. A few disagreements concern the species Human rhinovirus A and Human rhinovirus C and the genus Aphthovirus, which were split in the GENETIC classification. Furthermore, we propose a new supergenus level and universal, level-specific PED thresholds, not reached yet by many taxa. Since the species threshold is approached mostly by taxa with large sampling sizes and those infecting multiple hosts, it may represent an upper limit on divergence, beyond which homologous recombination in the six most conserved genes between two picornaviruses might not give viable progeny. PMID:22278238
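
    The hierarchy-plus-thresholds idea lends itself to a small sketch: pairwise distances feed an average-linkage tree that is cut at level-specific cutoffs. All numbers below are hypothetical; the actual PED computation and thresholds are given in the cited papers.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Hypothetical pairwise evolutionary distances (PEDs) among 6 genomes,
    # in the condensed upper-triangular order scipy expects.
    ped = np.array([0.05, 0.40, 0.42, 0.41, 0.43,
                          0.39, 0.41, 0.40, 0.44,
                                0.08, 0.09, 0.45,
                                      0.07, 0.46,
                                            0.47])

    tree = linkage(ped, method="average")  # UPGMA-style hierarchy

    # Illustrative level-specific cutoffs (species < genus < supergenus).
    for level, cut in [("species", 0.2), ("genus", 0.43), ("supergenus", 0.6)]:
        print(level, fcluster(tree, t=cut, criterion="distance"))
    ```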

  6. Evaluation of human papilloma virus (HPV) vaccination strategies and vaccination coverage in adolescent girls worldwide.

    PubMed

    Owsianka, Barbara; Gańczak, Maria

    2015-01-01

    An analysis of HPV vaccination strategies and vaccination coverage in adolescent girls worldwide over the last eight years, with regard to potential improvement of vaccination coverage rates in Poland. A literature search covering the period 2006-2014 was performed using Medline. A comparative analysis of HPV vaccination strategies and coverage between Poland and other countries worldwide was conducted. In the last eight years, a number of countries introduced HPV vaccination for adolescent girls into their national immunization programmes. Vaccination strategies differ, consequently affecting vaccination coverage, which ranges from several percent to more than 90%. Disparities also typically exist at the national level. The highest HPV vaccination coverage rates are observed in countries where vaccines are administered in school settings and funded from the national budget. Poland is one of the eight EU countries where HPV vaccination has not been introduced into the mandatory immunization programme and where paid vaccination is only provided in primary health care settings. HPV vaccination coverage in adolescent girls is estimated at 7.5-10%. Disparities in HPV vaccination coverage rates in adolescent girls worldwide may be due to different vaccination implementation strategies between countries. Compared with other countries, the low HPV vaccination coverage in Polish adolescent girls may result from the lack of funding at the national level and the fact that vaccines are administered in a primary health care setting. A multidimensional approach, involving the engagement of primary health care and school personnel, financial support from government at national and local levels, and the implementation of media campaigns, particularly in regions with a high incidence of cervical cancer, could result in an increase of HPV vaccination coverage rates in Poland.

  7. Hybrid Aorta Constructs via In Situ Crosslinking of Poly(glycerol-sebacate) Elastomer Within a Decellularized Matrix.

    PubMed

    Guler, Selcan; Hosseinian, Pezhman; Aydin, Halil Murat

    2017-01-01

    Decellularization of tissues and organs can preserve the unique conformation and composition of the native tissue structure but may weaken its mechanical strength. In this study, poly(glycerol-sebacate) (PGS) elastomers were combined with decellularized aorta fragments to investigate the changes in mechanical properties. PGS prepolymer was synthesized via microwave irradiation and then in situ crosslinked within the decellularized aorta extracellular matrix (ECM). Tensile strength (σ) values were comparable, at 0.44 ± 0.10 MPa and 0.57 ± 0.18 MPa for native and hybrid aorta samples, respectively, while elongation at break (ɛ) values were 261% ± 17%, 7.5% ± 0.57%, and 22.18% ± 2.48% for wet control (native), decellularized dried aortae, and hybrid matrices, showing the elastic contribution of the polymer. Young's modulus data indicate a threefold decrease in stiffness compared to decellularized samples once PGS is introduced into the ECM structure. Scanning electron microscopy (SEM) analysis of hybrid grafts revealed that the construct preserves porosity in the medial layer of the vessel. Biocompatibility analyses showed no cytotoxic effects on human abdominal aorta smooth muscle cells. Cell studies showed 98% activity in hybrid graft extracts. As a control, collagen coating of the hybrid grafts was performed in the recellularization stage. SEM analysis of recellularized hybrid grafts revealed that cells attached to the surface of the hybrid graft and proliferated during the 14 days of culture in both groups. This study shows that introducing an elastomer into the native ECM structure following the decellularization process can be a useful approach for the preparation of mechanically enhanced composites for soft tissues.

  8. Toward genetics-based virus taxonomy: comparative analysis of a genetics-based classification and the taxonomy of picornaviruses.

    PubMed

    Lauber, Chris; Gorbalenya, Alexander E

    2012-04-01

    Virus taxonomy has received little attention from the research community despite its broad relevance. In an accompanying paper (C. Lauber and A. E. Gorbalenya, J. Virol. 86:3890-3904, 2012), we have introduced a quantitative approach to hierarchically classify viruses of a family using pairwise evolutionary distances (PEDs) as a measure of genetic divergence. When applied to the six most conserved proteins of the Picornaviridae, it clustered 1,234 genome sequences in groups at three hierarchical levels (to which we refer as the "GENETIC classification"). In this study, we compare the GENETIC classification with the expert-based picornavirus taxonomy and outline differences in the underlying frameworks regarding the relation of virus groups and genetic diversity that represent, respectively, the structure and content of a classification. To facilitate the analysis, we introduce two novel diagrams. The first connects the genetic diversity of taxa to both the PED distribution and the phylogeny of picornaviruses. The second depicts a classification and the accommodated genetic diversity in a standardized manner. Generally, we found striking agreement between the two classifications on species and genus taxa. A few disagreements concern the species Human rhinovirus A and Human rhinovirus C and the genus Aphthovirus, which were split in the GENETIC classification. Furthermore, we propose a new supergenus level and universal, level-specific PED thresholds, not reached yet by many taxa. Since the species threshold is approached mostly by taxa with large sampling sizes and those infecting multiple hosts, it may represent an upper limit on divergence, beyond which homologous recombination in the six most conserved genes between two picornaviruses might not give viable progeny.

  9. Real-World Use of Apixaban for Stroke Prevention in Atrial Fibrillation: A Systematic Review and Meta-Analysis.

    PubMed

    Proietti, Marco; Romanazzi, Imma; Romiti, Giulio Francesco; Farcomeni, Alessio; Lip, Gregory Y H

    2018-01-01

    The use of oral anticoagulant therapy for stroke prevention in atrial fibrillation has been transformed by the availability of the nonvitamin K antagonist oral anticoagulants. Real-world studies on the use of nonvitamin K antagonist oral anticoagulants help elucidate their effectiveness and safety in daily clinical practice. Apixaban was the third nonvitamin K antagonist oral anticoagulant introduced to clinical practice, and a growing number of real-world studies has been published. Our aim was to summarize current evidence from real-world studies on apixaban for stroke prevention in atrial fibrillation. We performed a systematic review and meta-analysis of all observational real-world studies comparing apixaban with other available oral anticoagulant drugs. From the original 9680 results retrieved, 16 studies were included in the final meta-analysis. Compared with warfarin, apixaban at the regular dose was more effective in reducing any thromboembolic event (odds ratio: 0.77; 95% confidence interval: 0.64-0.93), but no significant difference was found for stroke risk. Apixaban was as effective as dabigatran and rivaroxaban in reducing thromboembolic events and stroke. The risk of major bleeding was significantly lower for apixaban compared with warfarin, dabigatran, and rivaroxaban (relative risk reduction, 38%, 35%, and 46%, respectively). Similarly, the risk of intracranial hemorrhage was significantly lower for apixaban than warfarin and rivaroxaban (46% and 54%, respectively) but not dabigatran. The risk of gastrointestinal bleeding was lower with apixaban when compared with all oral anticoagulant agents (P<0.00001 for all comparisons). Use of apixaban in real life is associated with an overall similar effectiveness in reducing stroke and any thromboembolic events when compared with warfarin. A better safety profile was found with apixaban compared with warfarin, dabigatran, and rivaroxaban. © 2017 American Heart Association, Inc.

  10. Biosimilar medicines and cost-effectiveness

    PubMed Central

    Simoens, Steven

    2011-01-01

    Given that biosimilars are agents that are similar but not identical to the reference biopharmaceutical, this study aims to introduce and describe specific issues related to the economic evaluation of biosimilars by focusing on the relative costs, relative effectiveness, and cost-effectiveness of biosimilars. Economic evaluation assesses the cost-effectiveness of a medicine by comparing the costs and outcomes of a medicine with those of a relevant comparator. The assessment of cost-effectiveness of a biosimilar is complicated by the fact that evidence needed to obtain marketing authorization from a registration authority does not always correspond to the data requirements of a reimbursement authority. In particular, this relates to the availability of adequately powered equivalence or noninferiority studies, the need for comparative data about the effectiveness in a real-world setting rather than the efficacy in a structured setting, and the use of health outcome measures instead of surrogate endpoints. As a biosimilar is likely to be less expensive than the comparator (eg, the reference biopharmaceutical), the assessment of the cost-effectiveness of a biosimilar depends on the relative effectiveness. If appropriately designed and powered clinical studies demonstrate equivalent effectiveness between a biosimilar and the comparator, then a cost-minimization analysis identifies the least expensive medicine. If there are differences in the effectiveness of a biosimilar and the comparator, other techniques of economic evaluation need to be employed, such as cost-effectiveness analysis or cost-utility analysis. Given that there may be uncertainty surrounding the long-term safety (ie, risk of immunogenicity and rare adverse events) and effectiveness of a biosimilar, the cost-effectiveness of a biosimilar needs to be calculated at multiple time points throughout the life cycle of the product. PMID:21935330

  11. Tuning the Voices of a Choir: Detecting Ecological Gradients in Time-Series Populations.

    PubMed

    Buras, Allan; van der Maaten-Theunissen, Marieke; van der Maaten, Ernst; Ahlgrimm, Svenja; Hermann, Philipp; Simard, Sonia; Heinrich, Ingo; Helle, Gerd; Unterseher, Martin; Schnittler, Martin; Eusemann, Pascal; Wilmking, Martin

    2016-01-01

    This paper introduces a new approach, Principal Component Gradient Analysis (PCGA), to detect ecological gradients in time-series populations, i.e. several time series originating from different individuals of a population. Detection of ecological gradients is of particular importance when dealing with time series from heterogeneous populations which express differing trends. PCGA makes use of the polar coordinates of loadings from the first two axes obtained by principal component analysis (PCA) to define groups of similar trends. Based on the mean inter-series correlation (rbar), the gain in common underlying signal obtained by PCGA grouping is quantified using Monte Carlo simulations. For validation, PCGA is compared to three other existing approaches. Focusing on dendrochronological examples, PCGA is shown to correctly determine population gradients and, in particular cases, to be advantageous over the other methods considered. Furthermore, in each example the PCGA groups enhanced the strength of a common underlying signal, performing comparably well to hierarchical cluster analysis. Our results indicate that PCGA potentially allows for a better understanding of the mechanisms causing time-series population gradients, as well as for objectively enhancing the performance of climate transfer functions in dendroclimatology. While our examples highlight the relevance of PCGA to the field of dendrochronology, we believe that other disciplines working with data of comparable structure may also benefit from PCGA.
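
    A minimal sketch of the PCGA grouping step, assuming standardized series; whether the eigenvector entries are additionally scaled by the singular values changes the spread of the angles but not the idea of grouping by polar coordinates.

    ```python
    import numpy as np

    def pcga_angles(X):
        """Principal Component Gradient Analysis, reduced to its core:
        X is a (T, n) array holding n time series of length T; the return
        value is each series' polar angle in the PC1/PC2 loading plane."""
        Xc = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize each series
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # PCA via SVD
        return np.arctan2(Vt[1], Vt[0])                    # angle per series

    # Series are then binned by angle, and the mean inter-series correlation
    # (rbar) within each bin is compared against the ungrouped population.
    ```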

  12. Tuning the Voices of a Choir: Detecting Ecological Gradients in Time-Series Populations

    PubMed Central

    Buras, Allan; van der Maaten-Theunissen, Marieke; van der Maaten, Ernst; Ahlgrimm, Svenja; Hermann, Philipp; Simard, Sonia; Heinrich, Ingo; Helle, Gerd; Unterseher, Martin; Schnittler, Martin; Eusemann, Pascal; Wilmking, Martin

    2016-01-01

    This paper introduces a new approach, Principal Component Gradient Analysis (PCGA), to detect ecological gradients in time-series populations, i.e. several time series originating from different individuals of a population. Detection of ecological gradients is of particular importance when dealing with time series from heterogeneous populations which express differing trends. PCGA makes use of the polar coordinates of loadings from the first two axes obtained by principal component analysis (PCA) to define groups of similar trends. Based on the mean inter-series correlation (rbar), the gain in common underlying signal obtained by PCGA grouping is quantified using Monte Carlo simulations. For validation, PCGA is compared to three other existing approaches. Focusing on dendrochronological examples, PCGA is shown to correctly determine population gradients and, in particular cases, to be advantageous over the other methods considered. Furthermore, in each example the PCGA groups enhanced the strength of a common underlying signal, performing comparably well to hierarchical cluster analysis. Our results indicate that PCGA potentially allows for a better understanding of the mechanisms causing time-series population gradients, as well as for objectively enhancing the performance of climate transfer functions in dendroclimatology. While our examples highlight the relevance of PCGA to the field of dendrochronology, we believe that other disciplines working with data of comparable structure may also benefit from PCGA. PMID:27467508

  13. Comparative study on gene set and pathway topology-based enrichment methods.

    PubMed

    Bayerlová, Michaela; Jung, Klaus; Kramer, Frank; Klemm, Florian; Bleckmann, Annalen; Beißbarth, Tim

    2015-10-22

    Enrichment analysis is a popular approach to identify pathways or sets of genes which are significantly enriched in the context of differentially expressed genes. The traditional gene set enrichment approach considers a pathway as a simple gene list, disregarding any knowledge of gene or protein interactions. In contrast, the newer group of so-called pathway topology-based methods integrates the topological structure of a pathway into the analysis. We comparatively investigated gene set and pathway topology-based enrichment approaches, considering three gene set and four topological methods. These methods were compared in two extensive simulation studies and on a benchmark of 36 real datasets, providing the same pathway input data for all methods. In the benchmark data analysis, both types of methods showed a comparable ability to detect enriched pathways. The first simulation study was conducted with KEGG pathways, which showed considerable gene overlaps with each other. In this study with original KEGG pathways, none of the topology-based methods outperformed the gene set approach. Therefore, a second simulation study was performed on non-overlapping pathways created from unique gene IDs. Here, methods accounting for pathway topology reached higher accuracy than the gene set methods; however, their sensitivity was lower. We conducted one of the first comprehensive comparative works evaluating gene set against pathway topology-based enrichment methods. The topological methods showed better performance in the simulation scenarios with non-overlapping pathways, but they were not conclusively better in the other scenarios. This suggests that a simple gene set approach might be sufficient to detect an enriched pathway under realistic circumstances. Nevertheless, more extensive studies and further benchmark data are needed to systematically evaluate these methods and to assess what gains and costs pathway topology information introduces into enrichment analysis. Both types of methods for enrichment analysis require further improvements in order to deal with the problem of pathway overlaps.
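
    For concreteness, the simplest member of the gene set family reduces to the over-representation test sketched below (illustrative counts; the paper benchmarks richer gene set and topology-aware methods, not this exact test):

    ```python
    from scipy.stats import hypergeom

    def ora_pvalue(n_universe, n_pathway, n_de, n_overlap):
        """P(at least n_overlap of the n_de differentially expressed genes
        fall in a pathway of size n_pathway, given n_universe genes)."""
        return hypergeom.sf(n_overlap - 1, n_universe, n_pathway, n_de)

    # Hypothetical numbers: 20,000 genes, a 150-gene pathway, 800 DE genes,
    # 18 of which land in the pathway.
    print(ora_pvalue(20_000, 150, 800, 18))
    ```

    Topology-based methods differ precisely in that they replace this flat gene list with scores propagated over the pathway graph.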

  14. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2014-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio and number of control surfaces. A doublet lattice approach is taken to compute generalized forces, and a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer, although all parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification and validation. This process is carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool. Therefore, the flutter speed and frequency for a clamped plate are computed using V-g and V-f analysis. The computational results are compared to a previously published computational analysis and wind tunnel results for the same structure. Finally, a case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to V-g and V-f analysis. This also includes the analysis of the model in response to a 1-cos gust.

  15. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2015-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. These processes are carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.
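
    The Roger's-approximation step admits a compact worked sketch: once lag roots are fixed, fitting the rational form to tabulated generalized forces is an ordinary linear least-squares problem per matrix entry. This is a generic illustration under assumed lag roots, not the tool's own code.

    ```python
    import numpy as np

    def roger_fit(k, Q, lags):
        """Fit Q(ik) ~ A0 + A1*(ik) + A2*(ik)**2 + sum_j Aj*(ik)/(ik + b_j)
        to tabulated complex force coefficients Q at reduced frequencies k,
        with fixed positive lag roots b_j; returns the real coefficients."""
        ik = 1j * np.asarray(k)
        cols = [np.ones_like(ik), ik, ik**2] + [ik / (ik + b) for b in lags]
        A = np.column_stack(cols)
        # Stack real and imaginary parts so the unknowns stay real-valued.
        M = np.vstack([A.real, A.imag])
        rhs = np.concatenate([np.asarray(Q).real, np.asarray(Q).imag])
        coef, *_ = np.linalg.lstsq(M, rhs, rcond=None)
        return coef

    k = np.linspace(0.01, 1.0, 20)
    Qtab = 1.0 + 0.8j * k - 0.3 * k**2          # toy tabulated entry
    print(roger_fit(k, Qtab, lags=[0.2, 0.6]))  # illustrative lag roots
    ```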

  16. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.

  17. Bayesian analysis of non-homogeneous Markov chains: application to mental health data.

    PubMed

    Sung, Minje; Soyer, Refik; Nhan, Nguyen

    2007-07-10

    In this paper we present a formal treatment of non-homogeneous Markov chains by introducing a hierarchical Bayesian framework. Our work is motivated by the analysis of correlated categorical data which arise in assessment of psychiatric treatment programs. In our development, we introduce a Markovian structure to describe the non-homogeneity of transition patterns. In doing so, we introduce a logistic regression set-up for Markov chains and incorporate covariates in our model. We present a Bayesian model using Markov chain Monte Carlo methods and develop inference procedures to address issues encountered in the analyses of data from psychiatric treatment programs. Our model and inference procedures are applied to real data from a psychiatric treatment study. Copyright 2006 John Wiley & Sons, Ltd.
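
    The non-homogeneous structure can be sketched briefly: a logistic (softmax) link maps time-varying covariates to each row of the transition matrix. The paper's full treatment places priors on the coefficients and fits them by MCMC; the simulation below illustrates only the likelihood side, with made-up dimensions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def transition_matrix(z, beta):
        """Row-stochastic transition matrix from covariates z (shape (d,))
        via coefficients beta (shape (S, S, d)) and a softmax link."""
        logits = beta @ z                                    # (S, S)
        e = np.exp(logits - logits.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    # Simulate a 3-state chain whose transition pattern drifts over time.
    S, d, T = 3, 2, 50
    beta = rng.normal(scale=0.5, size=(S, S, d))
    state, path = 0, []
    for t in range(T):
        z = np.array([1.0, t / T])           # intercept plus scaled time
        state = rng.choice(S, p=transition_matrix(z, beta)[state])
        path.append(state)
    ```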

  18. Logistics Enterprise Evaluation Model Based On Fuzzy Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Fu, Pei-hua; Yin, Hong-bo

    In this thesis, we introduce an evaluation model for logistics enterprises based on a fuzzy clustering algorithm. First, we present the evaluation index system, which covers basic information, management level, technical strength, transport capacity, informatization level, market competition and customer service. We determined the index weights according to their grades and evaluated the integrated capability of the logistics enterprises using the fuzzy cluster analysis method. We describe the system evaluation module and the cluster analysis module in detail, explain how the two modules were implemented, and finally give the results produced by the system.
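
    The clustering core is standard fuzzy c-means; a compact sketch follows, assuming each enterprise is represented by its weighted index scores (the cluster count, fuzzifier and iteration budget are illustrative defaults, not the thesis's settings):

    ```python
    import numpy as np

    def fuzzy_cmeans(X, c=3, m=2.0, iters=100, seed=0):
        """Minimal fuzzy c-means. X: (n, d) index scores per enterprise.
        Returns (centers, U) with U[i, k] = membership of enterprise i
        in cluster k; each row of U sums to 1."""
        rng = np.random.default_rng(seed)
        U = rng.dirichlet(np.ones(c), size=len(X))          # random start
        for _ in range(iters):
            W = U ** m
            centers = (W.T @ X) / W.sum(axis=0)[:, None]    # weighted means
            dist = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
            inv = dist ** (-2.0 / (m - 1))
            U = inv / inv.sum(axis=1, keepdims=True)        # membership update
        return centers, U
    ```

    Each enterprise is then graded by its highest-membership cluster, the membership value itself indicating how clear-cut the grade is.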

  19. Evaluation of Event-Based Algorithms for Optical Flow with Ground-Truth from Inertial Measurement Sensor

    PubMed Central

    Rueckauer, Bodo; Delbruck, Tobi

    2016-01-01

    In this study we compare nine optical flow algorithms that locally measure the flow normal to edges, in terms of accuracy and computation cost. In contrast to conventional, frame-based motion flow algorithms, our open-source implementations compute optical flow based on address-events from a neuromorphic Dynamic Vision Sensor (DVS). For this benchmarking we created a dataset of two synthesized and three real samples recorded from a 240 × 180 pixel Dynamic and Active-pixel Vision Sensor (DAVIS). This dataset contains events from the DVS as well as conventional frames to support testing state-of-the-art frame-based methods. We introduce a new source for the ground truth: in the special case that the perceived motion stems solely from a rotation of the vision sensor around its three camera axes, the true optical flow can be estimated using gyro data from the inertial measurement unit integrated with the DAVIS camera. This provides a ground truth against which we can compare algorithms that measure optical flow by means of motion cues. An analysis of error sources led to the use of a refractory period, more accurate numerical derivatives and a Savitzky-Golay filter to achieve significant improvements in accuracy. Our pure Java implementations of two recently published algorithms reduce computational cost by up to 29% compared to the original implementations. Two of the algorithms introduced in this paper further speed up processing by a factor of 10 compared with the original implementations, at equal or better accuracy. On a desktop PC, they run in real-time on dense natural input recorded by a DAVIS camera. PMID:27199639
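
    The rotation-only ground truth rests on a standard pinhole-camera relation: for pure rotation, the image-plane flow is a closed-form function of the gyro rates. The focal length and the sign/axis conventions below are assumptions to be replaced by the actual camera calibration.

    ```python
    import numpy as np
    from scipy.signal import savgol_filter

    def rotational_flow(x, y, w, f=115.0):
        """Flow (u, v) at pixel offsets (x, y) from the principal point for
        pure camera rotation w = (wx, wy, wz) in rad/s; f in pixels."""
        wx, wy, wz = w
        u = (x * y / f) * wx - (f + x**2 / f) * wy + y * wz
        v = (f + y**2 / f) * wx - (x * y / f) * wy - x * wz
        return u, v

    # One of the abstract's accuracy fixes is smoothing; e.g. Savitzky-Golay
    # filtering of the sampled gyro rates before evaluating the flow field:
    rates = np.random.default_rng(1).normal(size=(300, 3))
    smooth = savgol_filter(rates, window_length=21, polyorder=3, axis=0)
    ```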

  20. Evaluation of an educational "toolbox" for improving nursing staff competence and psychosocial work environment in elderly care: results of a prospective, non-randomized controlled intervention.

    PubMed

    Arnetz, J E; Hasson, H

    2007-07-01

    Lack of professional development opportunities among nursing staff is a major concern in elderly care and has been associated with work dissatisfaction and staff turnover. There is a lack of prospective, controlled studies evaluating the effects of educational interventions on nursing competence and work satisfaction. The aim of this study was to evaluate the possible effects of an educational "toolbox" intervention on nursing staff ratings of their competence, psychosocial work environment and overall work satisfaction. The study was a prospective, non-randomized, controlled intervention. Nursing staff in two municipal elderly care organizations in western Sweden participated. In an initial questionnaire survey, nursing staff in the intervention municipality described several areas in which they felt a need for competence development. Measurement instruments and educational materials for improving staff knowledge and work practices were then collated by researchers and managers in a "toolbox." Nursing staff ratings of their competence and work were measured pre- and post-intervention by questionnaire. Staff ratings in the intervention municipality were compared to staff ratings in the reference municipality, where no toolbox was introduced. Nursing staff ratings of their competence and psychosocial work environment, including overall work satisfaction, improved significantly over time in the intervention municipality compared to the reference group. Both competence and work environment ratings were largely unchanged among reference municipality staff. Multivariate analysis revealed a significant interaction effect between municipalities over time for nursing staff ratings of participation, leadership, performance feedback and skills development. Staff ratings for these four scales improved significantly in the intervention municipality as compared to the reference municipality. Compared to a reference municipality, nursing staff ratings of their competence and the psychosocial work environment improved in the municipality where the toolbox was introduced.

  1. Cost-effectiveness of pneumococcal conjugate vaccination in Georgia.

    PubMed

    Komakhidze, T; Hoestlandt, C; Dolakidze, T; Shakhnazarova, M; Chlikadze, R; Kopaleishvili, N; Goginashvili, K; Kherkheulidze, M; Clark, A D; Blau, J

    2015-05-07

    Financial support from the Global Alliance for Vaccines and Immunization (GAVI) to introduce the 10-valent pneumococcal conjugate vaccine (PCV10) into the routine childhood immunization schedule in Georgia is ending in 2015. As a result, the Interagency Coordination Committee (ICC) decided to carry out a cost-effectiveness analysis to gather additional evidence to advocate for an appropriate evidence-based decision after GAVI support is over. The study also aimed to strengthen national capacity to conduct cost-effectiveness studies, and to introduce economic evaluations into Georgia's decision-making process. A multidisciplinary team of national experts led by a member of the ICC carried out the analysis that compared two scenarios: introducing PCV10 vs no vaccination. The TRIVAC model was used to evaluate 10 cohorts of children over the period 2014-2023. National data was used to inform demographics, disease burden, vaccine coverage, health service utilization, and costs. Evidence from clinical trials and the scientific literature was used to estimate the impact of the vaccine. A 3+0 schedule and a vaccine price increasing to US$ 3.50 per dose was assumed for the base-case scenario. Alternative univariate and multivariate scenarios were evaluated. Over the 10-year period, PCV10 was estimated to prevent 7170 (8288 undiscounted) outpatient visits due to all-cause acute otitis media, 5325 (6154 undiscounted) admissions due to all-cause pneumonia, 87 (100 undiscounted) admissions due to pneumococcal meningitis, and 508 (588 undiscounted) admissions due to pneumococcal non-pneumonia and non-meningitis (NPNM). In addition, the vaccine was estimated to prevent 41 (48 undiscounted) deaths. This is equivalent to approximately 5 deaths and 700 admissions prevented each year in Georgia. Over the 10-year period, PCV10 would cost the government approximately US$ 4.4 million ($440,000 per year). However, about half of this would be offset by the treatment costs prevented. The discounted cost-effectiveness ratio was estimated to be US$ 1599 per DALY averted with scenarios ranging from US$ 286 to US$ 7787. This study led to better multi-sectoral collaboration and improved national capacity to perform economic evaluations. Routine infant vaccination against Streptococcus pneumoniae would be highly cost-effective in Georgia. The decision to introduce PCV10 was already made some time before the study was initiated but it provided important economic evidence in support of that decision. There are several uncertainties around many of the parameters used, but a multivariate scenario analysis with several conservative assumptions (including no herd effect in older individuals) shows that this recommendation is robust. This study supports the decision to introduce PCV10 in Georgia. Copyright © 2015 Elsevier Ltd. All rights reserved.
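
    The headline ratio follows the usual incremental cost-effectiveness arithmetic. As a consistency sketch with the abstract's rounded, discounted figures (treating "about half" as exactly half, so the implied DALY count is only illustrative):

    $$\text{ICER} = \frac{C_{\text{programme}} - C_{\text{treatment averted}}}{\text{DALYs averted}}, \qquad \text{DALYs averted} \approx \frac{\$4.4\text{M} - \$2.2\text{M}}{\$1599/\text{DALY}} \approx 1400.$$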

  2. Hilbert-Huang Transform: A Spectral Analysis Tool Applied to Sunspot Number and Total Solar Irradiance Variations, as well as Near-Surface Atmospheric Variables

    NASA Astrophysics Data System (ADS)

    Barnhart, B. L.; Eichinger, W. E.; Prueger, J. H.

    2010-12-01

    Hilbert-Huang transform (HHT) is a relatively new data analysis tool which is used to analyze nonstationary and nonlinear time series data. It consists of an algorithm, called empirical mode decomposition (EMD), which extracts the cyclic components embedded within time series data, as well as Hilbert spectral analysis (HSA) which displays the time and frequency dependent energy contributions from each component in the form of a spectrogram. The method can be considered a generalized form of Fourier analysis which can describe the intrinsic cycles of data with basis functions whose amplitudes and phases may vary with time. The HHT will be introduced and compared to current spectral analysis tools such as Fourier analysis, short-time Fourier analysis, wavelet analysis and Wigner-Ville distributions. A number of applications are also presented which demonstrate the strengths and limitations of the tool, including analyzing sunspot number variability and total solar irradiance proxies as well as global averaged temperature and carbon dioxide concentration. Also, near-surface atmospheric quantities such as temperature and wind velocity are analyzed to demonstrate the nonstationarity of the atmosphere.
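
    A bare-bones sketch of the EMD sifting loop and the Hilbert step; it uses a fixed sifting count instead of a proper stopping criterion and ignores boundary handling, so it is illustrative only.

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline
    from scipy.signal import argrelextrema, hilbert

    def sift(x, t, n_sift=10):
        """Extract one intrinsic mode function (IMF) by repeatedly removing
        the mean of the cubic-spline envelopes through local extrema."""
        h = x.copy()
        for _ in range(n_sift):
            mx = argrelextrema(h, np.greater)[0]
            mn = argrelextrema(h, np.less)[0]
            if len(mx) < 2 or len(mn) < 2:
                break
            upper = CubicSpline(t[mx], h[mx])(t)
            lower = CubicSpline(t[mn], h[mn])(t)
            h = h - (upper + lower) / 2.0
        return h

    t = np.linspace(0.0, 1.0, 2000)
    x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
    imf = sift(x, t)                                  # fastest mode first
    phase = np.unwrap(np.angle(hilbert(imf)))         # Hilbert spectral step:
    inst_freq = np.gradient(phase, t) / (2 * np.pi)   # instantaneous frequency
    ```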

  3. Traditional cattle vs. introduced deer management in Chaco Serrano woodlands (Argentina): Analysis of environmental sustainability at increasing densities.

    PubMed

    Charro, José Luis; López-Sánchez, Aida; Perea, Ramón

    2018-01-15

    Wild ungulate populations have increased and expanded considerably in many regions, including austral woodlands and forests where deer (Cervus elaphus) have been introduced as an alternative to traditional cattle grazing. In this study, we compared traditional cattle management with introduced deer management at increasing deer densities in the "Chaco Serrano" woodlands of Argentina to assess their ecological sustainability. We used three ecological indicators (abundance of tree regeneration, woody plant diversity and browsing damage) as proxies for environmental sustainability in woody systems. Our results indicate that traditional cattle management, at stocking rates of ∼10 ind km⁻², was the most ecologically sustainable management since it allowed greater tree regeneration abundance, higher richness of woody species and lower browsing damage. Importantly, cattle management and deer management at low densities (10 ind km⁻²) showed no significant differences in species richness and abundance of seedlings, although deer caused greater browsing damage on saplings and juveniles. However, management regimes involving high deer densities (∼35 deer km⁻²) were highly unsustainable in comparison to low (∼10 deer km⁻²) and medium (∼20 deer km⁻²) densities, with 40% probability of unsustainable browsing as opposed to less than 5% probability at low and medium densities. In addition, high deer densities caused a strong reduction in tree regeneration, with a 19-30% reduction in the abundance of seedlings and young trees when compared to low deer densities. These results showed that the effect of increasing deer densities on woody plant conservation was not linear, with high deer densities causing a disproportionately deleterious effect on tree regeneration and sustainable browsing. Our results suggest that traditional management at low densities or the use of introduced ungulates (deer breeding areas) at low-medium densities (<20 deer km⁻²) are compatible with woody vegetation conservation. However, further research is needed on plant palatability, animal habitat use (spatial heterogeneity) and species turnover and extinction (comparison to areas of low-null historical browsing) to better estimate the environmental sustainability of Neotropical ungulate-dominated woodlands. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. BEHAVIORAL RESPONSES OF ANURAN LARVAE TO CHEMICAL CUES OF NATIVE AND INTRODUCED PREDATORS IN THE PACIFIC NORTHWESTERN UNITED STATES

    EPA Science Inventory

    We compared behavioral responses of larvae of three Pacific Northwest anurans from different hydroperiods to water borne cues of native and introduced predators. Two native anurans (Pacific Treefrog, Pseudacris regilla, and Northern Red-Legged Frog, Rana aurora aurora) and intro...

  5. Education, Markets and the Pedagogy of Personalisation

    ERIC Educational Resources Information Center

    Hartley, David

    2008-01-01

    The marketisation of education in England began in the 1980s. It was facilitated by national testing (which gave objective and comparable information to parents), and by the New Public Management (which introduced a posteriori funding and competition among providers). Now a new complementary phase of marketisation is being introduced:…

  6. Specificity of extrafloral nectar induction by herbivores differs among native and invasive populations of tallow tree

    USDA-ARS?s Scientific Manuscript database

    Invasive plants are released from specialist herbivores and encounter novel generalists in their introduced ranges so their defenses may vary between native and introduced ranges. But few studies have examined how constitutive and induced indirect defenses change during plant invasion. We compared t...

  7. User-oriented summary extraction for soccer video based on multimodal analysis

    NASA Astrophysics Data System (ADS)

    Liu, Huayong; Jiang, Shanshan; He, Tingting

    2011-11-01

    An advanced user-oriented summary extraction method for soccer video is proposed in this work. The algorithm integrates multimodal analysis, extracting and analyzing stadium features, moving-object features, audio features and text features. From these features the semantics of the soccer video and the highlight mode are obtained; the highlight positions are then located and assembled by highlight degree to produce the video summary. The experimental results for sports video of World Cup soccer games indicate that multimodal analysis is effective for soccer video browsing and retrieval.

  8. Latitudinal variation in cold hardiness in introduced Tamarix and native Populus

    USGS Publications Warehouse

    Friedman, Jonathan M.; Roelle, James E.; Gaskin, John F.; Pepper, Alan E.; Manhart, James R.

    2008-01-01

    To investigate the evolution of clinal variation in an invasive plant, we compared cold hardiness in the introduced saltcedar (Tamarix ramosissima, Tamarix chinensis, and hybrids) and the native plains cottonwood (Populus deltoides subsp. monilifera). In a shadehouse in Colorado (41°N), we grew plants collected along a latitudinal gradient in the central United States (29–48°N). On 17 occasions between September 2005 and June 2006, we determined killing temperatures using freeze-induced electrolyte leakage and direct observation. In midwinter, cottonwood survived cooling to −70°C, while saltcedar was killed at −33 to −47°C. Frost sensitivity, therefore, may limit northward expansion of saltcedar in North America. Both species demonstrated inherited latitudinal variation in cold hardiness. For example, from September through January killing temperatures for saltcedar from 29.18°N were 5–21°C higher than those for saltcedar from 47.60°N, and on September 26 and October 11, killing temperatures for cottonwood from 33.06°N were >43°C higher than those for cottonwood from 47.60°N. Analysis of nine microsatellite loci showed that southern saltcedars are more closely related to T. chinensis while northern plants are more closely related to T. ramosissima. Hybridization may have introduced the genetic variability necessary for rapid evolution of the cline in saltcedar cold hardiness.

  9. Assessment of the Potential Impact and Cost-effectiveness of Self-Testing for HIV in Low-Income Countries

    PubMed Central

    Cambiano, Valentina; Ford, Deborah; Mabugu, Travor; Napierala Mavedzenge, Sue; Miners, Alec; Mugurungi, Owen; Nakagawa, Fumiyo; Revill, Paul; Phillips, Andrew

    2015-01-01

    Background Studies have demonstrated that self-testing for human immunodeficiency virus (HIV) is highly acceptable among individuals and could allow cost savings, compared with provider-delivered HIV testing and counseling (PHTC), although the longer-term population-level effects are uncertain. We evaluated the cost-effectiveness of introducing self-testing in 2015 over a 20-year time frame in a country such as Zimbabwe. Methods The HIV synthesis model was used. Two scenarios were considered. In the reference scenario, self-testing is not available, and the rate of first-time and repeat PHTC is assumed to increase from 2015 onward, in line with past trends. In the intervention scenario, self-testing is introduced at a unit cost of $3. Results We predict that the introduction of self-testing would lead to modest savings in healthcare costs of $75 million, while averting around 7000 disability-adjusted life-years over 20 years. Findings were robust to most variations in assumptions; however, higher cost of self-testing, lower linkage to care for people whose diagnosis is a consequence of a positive self-test result, and lower threshold for antiretroviral therapy eligibility criteria could lead to situations in which self-testing is not cost-effective. Conclusions This analysis suggests that introducing self-testing offers some health benefits and may well save costs. PMID:25767214

  10. Clinical high-resolution mapping of the proteoglycan-bound water fraction in articular cartilage of the human knee joint.

    PubMed

    Bouhrara, Mustapha; Reiter, David A; Sexton, Kyle W; Bergeron, Christopher M; Zukley, Linda M; Spencer, Richard G

    2017-11-01

    We applied our recently introduced Bayesian analytic method to achieve clinically feasible in-vivo mapping of the proteoglycan water fraction (PgWF) of human knee cartilage with improved spatial resolution and stability as compared to existing methods. Multicomponent driven equilibrium single-pulse observation of T1 and T2 (mcDESPOT) datasets were acquired from the knees of two healthy young subjects and one older subject with a previous knee injury. Each dataset was processed using Bayesian Monte Carlo (BMC) analysis incorporating a two-component tissue model. We assessed the performance and reproducibility of BMC and of the conventional analysis of stochastic region contraction (SRC) in the estimation of PgWF. Stability of the BMC analysis of PgWF was tested by comparing independent high-resolution (HR) datasets from each of the two young subjects. Unlike SRC, the BMC-derived maps from the two HR datasets were essentially identical. Furthermore, SRC maps showed substantial random variation in estimated PgWF, and mean values that differed from those obtained using BMC. In addition, PgWF maps derived from conventional low-resolution (LR) datasets exhibited partial volume and magnetic susceptibility effects. These artifacts were absent in HR PgWF images. Finally, our analysis showed regional variation in PgWF estimates, and substantially higher values in the younger subjects as compared to the older subject. BMC-mcDESPOT permits HR in-vivo mapping of PgWF in human knee cartilage in a clinically feasible acquisition time. HR mapping reduces the impact of partial volume and magnetic susceptibility artifacts compared to LR mapping. Finally, BMC-mcDESPOT demonstrated excellent reproducibility in the determination of PgWF. Published by Elsevier Inc.

  11. Computer-Aided Recognition of Facial Attributes for Fetal Alcohol Spectrum Disorders.

    PubMed

    Valentine, Matthew; Bihm, Dustin C J; Wolf, Lior; Hoyme, H Eugene; May, Philip A; Buckley, David; Kalberg, Wendy; Abdul-Rahman, Omar A

    2017-12-01

    To compare the detection of facial attributes in 2-D images by computer-based facial recognition software against standard manual examination in fetal alcohol spectrum disorders (FASD). Participants were gathered from the Fetal Alcohol Syndrome Epidemiology Research database. Standard frontal and oblique photographs of children were obtained during a manual, in-person dysmorphology assessment. Images were submitted for facial analysis conducted by the facial dysmorphology novel analysis technology (an automated system), which assesses ratios of measurements between various facial landmarks to determine the presence of dysmorphic features. Manual blinded dysmorphology assessments were compared with those obtained via the computer-aided system. Area under the curve values for individual receiver-operating characteristic curves revealed the computer-aided system (0.88 ± 0.02) to be comparable to the manual method (0.86 ± 0.03) in detecting patients with FASD. Interestingly, cases of alcohol-related neurodevelopmental disorder (ARND) were identified more efficiently by the computer-aided system (0.84 ± 0.07) than by the manual method (0.74 ± 0.04). A facial gestalt analysis of patients with ARND also identified more generalized facial findings compared to the cardinal facial features seen in more severe forms of FASD. We found increased diagnostic accuracy for ARND with our computer-aided method. As this category has historically been difficult to diagnose, we believe our experiment demonstrates that facial dysmorphology novel analysis technology can potentially improve ARND diagnosis by introducing a standardized metric for recognizing FASD-associated facial anomalies. Earlier recognition of these patients will lead to earlier intervention with improved patient outcomes. Copyright © 2017 by the American Academy of Pediatrics.

  12. GenGIS 2: Geospatial Analysis of Traditional and Genetic Biodiversity, with New Gradient Algorithms and an Extensible Plugin Framework

    PubMed Central

    Parks, Donovan H.; Mankowski, Timothy; Zangooei, Somayyeh; Porter, Michael S.; Armanini, David G.; Baird, Donald J.; Langille, Morgan G. I.; Beiko, Robert G.

    2013-01-01

    GenGIS is free and open source software designed to integrate biodiversity data with a digital map and information about geography and habitat. While originally developed with microbial community analyses and phylogeography in mind, GenGIS has been applied to a wide range of datasets. A key feature of GenGIS is the ability to test geographic axes that can correspond to routes of migration or gradients that influence community similarity. Here we introduce GenGIS version 2, which extends the linear gradient tests introduced in the first version to allow comprehensive testing of all possible linear geographic axes. GenGIS v2 also includes a new plugin framework that supports the development and use of graphically driven analysis packages: initial plugins include implementations of linear regression and the Mantel test, calculations of alpha-diversity (e.g., Shannon Index) for all samples, and geographic visualizations of dissimilarity matrices. We have also implemented a recently published method for biomonitoring reference condition analysis (RCA), which compares observed species richness and diversity to predicted values to determine whether a given site has been impacted. The newest version of GenGIS supports vector data in addition to raster files. We demonstrate the new features of GenGIS by performing a full gradient analysis of an Australian kangaroo apple data set, by using plugins and embedded statistical commands to analyze human microbiome sample data, and by applying RCA to a set of samples from Atlantic Canada. GenGIS release versions, tutorials and documentation are freely available at http://kiwi.cs.dal.ca/GenGIS, and source code is available at https://github.com/beiko-lab/gengis. PMID:23922841

  13. Mutual dilution of infection by an introduced parasite in native and invasive stream fishes across Hawaii.

    PubMed

    Gagne, Roderick B; Heins, David C; McIntyre, Peter B; Gilliam, James F; Blum, Michael J

    2016-10-01

    The presence of introduced hosts can increase or decrease infections of co-introduced parasites in native species of conservation concern. In this study, we compared the abundance, intensity, and prevalence of infection by a co-introduced nematode parasite (Camallanus cotti) between the native Awaous stamineus and introduced poeciliid fishes in 42 watersheds across the Hawaiian Islands. We found that parasite abundance, intensity and prevalence were greater in native than in introduced hosts. Parasite abundance, intensity and prevalence within A. stamineus varied between years, largely reflecting a transient spike in infection in three remote watersheds on Molokai. At each site we measured host factors (length, density of native host, density of introduced host) and environmental factors (per cent agricultural and urban land use, water chemistry, watershed area and precipitation) hypothesized to influence C. cotti abundance, intensity and prevalence. Factors associated with parasitism differed between native and introduced hosts. Notably, parasitism of native hosts was higher in streams with lower water quality, whereas parasitism of introduced hosts was lower in such streams. We also found that parasite burdens were lower in both native and introduced hosts where they co-occurred. Evidence of a mutual dilution effect indicates that introduced hosts can ameliorate parasitism of native fishes by co-introduced parasites, which raises questions about the value of remediation actions, such as the removal of introduced hosts, in stemming the rise of infectious disease in species of conservation concern.

  14. Comparison of methods applied in photoinduced transient spectroscopy to determining the defect center parameters: The correlation procedure and the signal analysis based on inverse Laplace transformation

    NASA Astrophysics Data System (ADS)

    Suproniuk, M.; Pawłowski, M.; Wierzbowski, M.; Majda-Zdancewicz, E.; Pawłowski, Ma.

    2018-04-01

    The procedure for determining trap parameters by photo-induced transient spectroscopy is based on the Arrhenius plot, which illustrates the thermal dependence of the emission rate. In this paper, we show that the Arrhenius plot obtained by the correlation method is shifted toward lower temperatures as compared to the one obtained with the inverse Laplace transformation. This shift is caused by the model adequacy error of the correlation method and introduces errors into the calculation of defect center parameters. The effect is exemplified by comparing trap parameters determined with both methods from photocurrent transients for defect centers observed in tin-doped neutron-irradiated silicon crystals and in gallium arsenide grown with the Vertical Gradient Freeze method.
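
    For reference, the Arrhenius construction both methods feed into is the standard thermal-emission relation (textbook form, not reproduced from the paper):

    $$e_T(T) = \sigma_a\,\gamma\,T^2 \exp\!\left(-\frac{E_a}{k_B T}\right) \quad\Longrightarrow\quad \ln\frac{T^2}{e_T} = \frac{E_a}{k_B}\cdot\frac{1}{T} - \ln(\sigma_a\gamma),$$

    so the slope of $\ln(T^2/e_T)$ versus $1/T$ gives the activation energy $E_a$ and the intercept the capture cross section $\sigma_a$; a horizontal shift of the plot, as reported here for the correlation method, therefore biases both parameters.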

  15. The comparative effectiveness of conventional and digital image libraries.

    PubMed

    McColl, R I; Johnson, A

    2001-03-01

    Before introducing a hospital-wide image database to improve access, navigation and retrieval speed, a comparative study between a conventional slide library and a matching image database was undertaken to assess its relative benefits. Paired time trials and personal questionnaires revealed faster retrieval rates, higher image quality, and easier viewing for the pilot digital image database. Analysis of confidentiality, copyright and data protection exposed similar issues for both systems, thus concluding that the digital image database is a more effective library system. The authors suggest that in the future, medical images will be stored on large, professionally administered, centrally located file servers, allowing specialist image libraries to be tailored locally for individual users. The further integration of the database with web technology will enable cheap and efficient remote access for a wide range of users.

  16. Statistical primer: how to deal with missing data in scientific research?

    PubMed

    Papageorgiou, Grigorios; Grant, Stuart W; Takkenberg, Johanna J M; Mokhles, Mostafa M

    2018-05-10

    Missing data are a common challenge encountered in research which can compromise the results of statistical inference when not handled appropriately. This paper aims to introduce basic concepts of missing data to a non-statistical audience, list and compare some of the most popular approaches for handling missing data in practice, and provide guidelines and recommendations for dealing with and reporting missing data in scientific research. Complete case analysis and single imputation are simple approaches for handling missing data and are popular in practice; however, in most cases they are not guaranteed to provide valid inferences. Multiple imputation is a robust and general alternative which is appropriate for data missing at random and overcomes the disadvantages of the simpler approaches, but it should always be conducted with care. The aforementioned approaches are illustrated and compared in an example application using Cox regression.
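
    As a concrete illustration of multiple imputation, the sketch below generates M completed datasets and pools a point estimate across them; in the paper's application each completed dataset would instead be fed to a Cox model and the coefficients pooled via Rubin's rules. The synthetic data, the choice of scikit-learn's IterativeImputer, and M = 5 are assumptions for illustration only.

      import numpy as np
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 3))
      X[rng.random(X.shape) < 0.2] = np.nan  # ~20% of values missing at random

      estimates = []
      for m in range(5):  # M = 5 imputed datasets
          imputed = IterativeImputer(sample_posterior=True, random_state=m).fit_transform(X)
          estimates.append(imputed[:, 0].mean())  # stand-in for a model coefficient

      # Rubin's rules: pool the point estimates (between/within variance pooling omitted)
      print(np.mean(estimates))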

  17. Microsatellite genetic diversity and differentiation of native and introduced grass carp populations in three continents

    USGS Publications Warehouse

    Chapman, Duane C.; Chen, Qin; Wang, Chenghui; Zhao, Jinlian; Lu, Guoqing; Zsigmond, Jeney; Li, Si-Fa

    2012-01-01

    Grass carp (Ctenopharyngodon idella), a freshwater species native to China, has been introduced to about 100 countries/regions and poses both biological and environmental challenges to the receiving ecosystems. In this study, we analyzed genetic variation in grass carp from three introduced river systems (Mississippi River Basin in the US, Danube River in Hungary, and Tone River in Japan) as well as its native ranges (Yangtze, Pearl, and Amur Rivers) in China using 21 novel microsatellite loci. The allelic richness, observed heterozygosity, and within-population gene diversity were found to be lower in the introduced populations than in the native populations, presumably due to the small founder population size of the former. Significant genetic differentiation was found between all pairwise populations from different rivers. Both principal component analysis and Bayesian clustering analysis revealed obvious genetic distinction between the native and introduced populations. Interestingly, genetic bottlenecks were detected in the Hungarian and Japanese grass carp populations, but not in the North American population, suggesting that the Mississippi River Basin grass carp has experienced rapid population expansion with potential genetic diversification during the half-century since its introduction. Consequently, the combined forces of the founder effect, introduction history, and rapid population expansion help explain the observed patterns of genetic diversity within and among both native and introduced populations of the grass carp.

  18. The introduction of the new dental contract in England - a baseline qualitative assessment.

    PubMed

    Milsom, K M; Threlfall, A; Pine, K; Tickle, M; Blinkhorn, A S; Kearney-Mitchell, P

    2008-01-26

    To record, immediately prior to its inception, the views of key stakeholders about the new dental contract introduced in April 2006. Nineteen participants (11 dental practice principals and eight primary care trust dental leads) were interviewed using a semi-structured approach to find out their views and opinions about dental practice, the reasons for introducing the new dental contract, and its implementation and content. An analysis based upon the constant comparative method was used to identify the common themes about these topics. Practice principals expressed satisfaction with working under pilot Personal Dental Services schemes but there was a concern among dental leads about a fall in dental activity among some dentists. All participants believed the new contract was introduced for political, financial and management reasons, and that it was introduced to limit and control the dental budget. Participants felt that implementation of the contract was rushed and there was insufficient negotiation. There were also concerns that the contract had not been tested. Dental practitioners were concerned about the calculation and future administration of the unit of dental activity system, the fixing of the budget and the fairness of the new dental charge scheme. Dental leads were concerned about patient access and about the retention and recruitment of dentists under the new contract. The study found a number of reasons for unease about the new dental contract: it was not perceived as being necessary, it was implemented at speed with insufficient negotiation, and it was seen as being untested. Numerous and varied problems were foreseen, the most important being the retention of dentists within the NHS. Participants felt the contract was introduced for financial, political and managerial reasons rather than to improve patient care. The initial high uptake of the new dental contract should not be viewed as indicating a high level of approval of its content.

  19. Sociology and comparative education: A comparative educationist's perspective

    NASA Astrophysics Data System (ADS)

    Holmes, Brian

    1981-12-01

    The thesis examined in this paper is that comparative educationists, far from following intellectual trends, have introduced new dimensions to methods of enquiry developed first by historians, then by social scientists and, finally and perhaps marginally, by conceptual philosophers. In so far as comparisons can be made, it is asserted that, in the twentieth century, paradigms in comparative education reflect revolutions in the natural sciences, and that to the extent that these preceded shifts in social science paradigms after about 1900, comparative educationists debated and rejected positivism before sociologists did, in the Anglo-Saxon world at least. A second assumption which is examined is that comparative educationists have anticipated issues which subsequently became important in the growth of national and parochial research in sociology, and for that matter in political science. A disclaimer is necessary: little attempt has been made to examine the causal influences in either field or between comparative educationists and sociologists. The history of such interaction at a personal and institutional level is very recent, the participants in the interchanges are, in many cases, still alive, and a substantive recent history of comparative education has yet to be written, although to some extent E. Shils' analysis of the history of sociological enquiry provides a schema against which the growth of comparative education can be compared.

  20. DICKE’S SUPERRADIANCE IN ASTROPHYSICS. I. THE 21 cm LINE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajabi, Fereshteh; Houde, Martin

    We have applied the concept of superradiance introduced by Dicke in 1954 to astrophysics by extending the corresponding analysis to the magnetic dipole interaction characterizing the atomic hydrogen 21 cm line. Although it is unlikely that superradiance could take place in thermally relaxed regions, and the lack of observational evidence of masers for this transition reduces the probability of detecting superradiance, in situations where the conditions necessary for superradiance are met (close atomic spacing, high velocity coherence, population inversion, and long dephasing timescales compared to those related to coherent behavior), our results suggest that relatively low levels of population inversion over short astronomical length-scales (e.g., as compared to those required for maser amplification) can lead to the cooperative behavior required for superradiance in the interstellar medium. Given the results of our analysis, we expect the observational properties of 21 cm superradiance to be characterized by the emission of high-intensity, spatially compact, burst-like features potentially taking place over short periods ranging from minutes to days.

  1. Statistical study on the variations of OH and O2 rotational temperatures observed by SATI at King Sejong Station (62.22S, 58.78W), Antarctica

    NASA Astrophysics Data System (ADS)

    Kim, J.; Kim, J. H.; Jee, G.; Lee, C.; Kim, Y.

    2017-12-01

    The Spectral Airglow Temperature Imager (SATI) installed at King Sejong Station (62.22S, 58.78W), Antarctica, has continuously measured the airglow emissions from the OH (6-2) Meinel and O2 (0-1) atmospheric bands since 2002, in order to investigate the dynamics of the polar MLT region. The measurements allow us to derive the rotational temperature at the peak emission heights, about 87 km and 94 km for the OH and O2 airglows, respectively. In this study, we briefly introduce an improved analysis technique that modifies the original analysis code. The major change relative to the original program is an improved function for finding the exact center position in the observed image. In addition to this brief introduction of the improved technique, we also present the results of a statistical investigation of the periodic variations in the temperatures of the two layers from 2002 through 2011 and compare our results with temperatures measured by satellite.

  2. Vulnerability analysis and critical areas identification of the power systems under terrorist attacks

    NASA Astrophysics Data System (ADS)

    Wang, Shuliang; Zhang, Jianhua; Zhao, Mingwei; Min, Xu

    2017-05-01

    This paper takes the central China power grid (CCPG) as an example and analyzes the vulnerability of power systems under terrorist attacks. To simulate the intelligence of terrorist attacks, a method of identifying critical attack areas according to community structures is introduced. Meanwhile, three types of vulnerability models and the corresponding vulnerability metrics are given for comparative analysis. On this basis, the influence of terrorist attacks on different critical areas is studied and the vulnerability of each critical area is identified. At the same time, vulnerabilities of critical areas under different tolerance parameters and different vulnerability models are acquired and compared. Results show that the disruption of only a small number of vertices may cause some critical areas to collapse completely, generating great performance losses for the whole system. Furthermore, the variation of vulnerability values across scenarios is very large. Critical areas that can cause greater damage under terrorist attacks should be given priority for protection to reduce vulnerability. The proposed method can be applied to analyze the vulnerability of other infrastructure systems and can help decision makers find mitigation actions and optimal protection strategies.
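
    A minimal sketch of the community-based identification step, under the assumption that "critical areas" correspond to communities of the grid graph and that an attack removes an entire community at once. A synthetic Barabási-Albert graph stands in for the CCPG data, which are not public, and residual largest-component size stands in for the paper's three vulnerability models.

      import networkx as nx
      from networkx.algorithms.community import greedy_modularity_communities

      G = nx.barabasi_albert_graph(200, 2, seed=1)  # stand-in for a power grid

      communities = greedy_modularity_communities(G)  # candidate critical areas
      for i, area in enumerate(communities):
          H = G.copy()
          H.remove_nodes_from(area)  # simulate a coordinated attack on one area
          lcc = max(len(c) for c in nx.connected_components(H))
          print(f"area {i}: size={len(area)}, residual LCC fraction={lcc / G.number_of_nodes():.2f}")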

  3. Analysis and measurement of electromagnetic scattering by pyramidal and wedge absorbers

    NASA Technical Reports Server (NTRS)

    Dewitt, B. T.; Burnside, Walter D.

    1986-01-01

    By modifying the reflection coefficients in the Uniform Geometrical Theory of Diffraction, a solution that approximates the scattering from a dielectric wedge is found. This solution agrees closely with the exact solution of Rawlins, which is valid only for a few special cases. The modification is then applied to the corner diffraction coefficient and combined with equivalent current and geometrical optics solutions to model scattering from pyramid and wedge absorbers. Measured results for 12 inch pyramid absorbers from 2 to 18 GHz are compared to calculations assuming the returns add incoherently and assuming they add coherently; the measured results tend to fall between the two curves. Measured results for the 8 inch wedge absorber are also compared to calculations in which the return is dominated by wedge diffraction. The procedures for measuring and specifying absorber performance are discussed, and calibration equations are derived to calculate a reflection coefficient or a reflectivity using a reference sphere. Shaping changes to the present absorber designs are introduced to improve performance, based on both high- and low-frequency analysis. Some prototypes were built and tested.

  4. Deriving Quantitative Dynamics Information for Proteins and RNAs using ROTDIF with a Graphical User Interface

    PubMed Central

    Berlin, Konstantin; Longhini, Andrew; Dayie, T. Kwaku; Fushman, David

    2013-01-01

    To facilitate rigorous analysis of molecular motions in proteins, DNA, and RNA, we present a new version of ROTDIF, a program for determining the overall rotational diffusion tensor from single- or multiple-field Nuclear Magnetic Resonance (NMR) relaxation data. We introduce four major features that expand the program’s versatility and usability. The first feature is the ability to analyze, separately or together, 13C and/or 15N relaxation data collected at a single or multiple fields. A significant improvement in the accuracy compared to direct analysis of R2/R1 ratios, especially critical for analysis of 13C relaxation data, is achieved by subtracting high-frequency contributions to relaxation rates. The second new feature is an improved method for computing the rotational diffusion tensor in the presence of biased errors, such as large conformational exchange contributions, that significantly enhances the accuracy of the computation. The third new feature is the integration of the domain alignment and docking module for relaxation-based structure determination of multi-domain systems. Finally, to improve accessibility to all the program features, we introduced a graphical user interface (GUI) that simplifies and speeds up the analysis of the data. Written in Java, the new ROTDIF can run on virtually any computer platform. In addition, the new ROTDIF achieves an order of magnitude speedup over the previous version by implementing a more efficient deterministic minimization algorithm. We not only demonstrate the improvement in accuracy and speed of the new algorithm for synthetic and experimental 13C and 15N relaxation data for several proteins and nucleic acids, but also show that careful analysis required especially for characterizing RNA dynamics allowed us to uncover subtle conformational changes in RNA as a function of temperature that were opaque to previous analysis. PMID:24170368

  5. Numerical methods for the inverse problem of density functional theory

    DOE PAGES

    Jensen, Daniel S.; Wasserman, Adam

    2017-07-17

    Here, the inverse problem of Kohn–Sham density functional theory (DFT) is often solved in an effort to benchmark and design approximate exchange-correlation potentials. The forward and inverse problems of DFT rely on the same equations but the numerical methods for solving each problem are substantially different. We examine both problems in this tutorial with a special emphasis on the algorithms and error analysis needed for solving the inverse problem. Two inversion methods based on partial differential equation constrained optimization and constrained variational ideas are introduced. We compare and contrast several different inversion methods applied to one-dimensional finite and periodic model systems.

  6. Epidemic spreading and global stability of an SIS model with an infective vector on complex networks

    NASA Astrophysics Data System (ADS)

    Kang, Huiyan; Fu, Xinchu

    2015-10-01

    In this paper, we present a new SIS model with delay on scale-free networks. The model is suitable for describing epidemics that are not only transmitted by a vector but also spread between individuals by direct contact. In view of biological relevance and the real spreading process, we introduce a delay to represent the average incubation period of the disease in a vector. By mathematical analysis, we obtain the epidemic threshold and prove the global stability of the equilibria. Simulations show that the delay affects the epidemic spreading. Finally, we investigate and compare two major immunization strategies, uniform immunization and targeted immunization.
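
    For context, the classic heterogeneous mean-field SIS threshold on an uncorrelated network is lambda_c = <k>/<k^2>, which vanishes as the degree distribution broadens; the delayed vector-borne model of this paper derives its own threshold, so the sketch below illustrates only this standard scale-free ingredient on a synthetic network.

      import networkx as nx
      import numpy as np

      G = nx.barabasi_albert_graph(10_000, 3, seed=0)  # scale-free contact network
      k = np.array([d for _, d in G.degree()], dtype=float)

      # Heterogeneous mean-field SIS threshold: lambda_c = <k> / <k^2>
      print(f"<k> = {k.mean():.2f}, <k^2> = {(k**2).mean():.1f}, "
            f"lambda_c = {k.mean() / (k**2).mean():.4f}")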

  7. Monitoring by forward scatter radar techniques: an improved second-order analytical model

    NASA Astrophysics Data System (ADS)

    Falconi, Marta Tecla; Comite, Davide; Galli, Alessandro; Marzano, Frank S.; Pastina, Debora; Lombardo, Pierfrancesco

    2017-10-01

    In this work, a second-order phase approximation is introduced to provide an improved analytical model of the signal received in forward scatter radar systems. A typical configuration with a rectangular metallic object illuminated while crossing the baseline, in far- or near-field conditions, is considered. An improved second-order model is compared with a simplified one already proposed by the authors and based on a paraxial approximation. A phase error analysis is carried out to investigate benefits and limitations of the second-order modeling. The results are validated by developing full-wave numerical simulations implementing the relevant scattering problem on a commercial tool.

  8. Real-time radar signal processing using GPGPU (general-purpose graphic processing unit)

    NASA Astrophysics Data System (ADS)

    Kong, Fanxing; Zhang, Yan Rockee; Cai, Jingxiao; Palmer, Robert D.

    2016-05-01

    This study introduces a practical approach to developing a real-time signal processing chain for general phased array radar on NVIDIA GPUs (graphical processing units) using CUDA (Compute Unified Device Architecture) libraries such as cuBLAS and cuFFT, which are adopted from open source libraries and optimized for NVIDIA GPUs. The processed results are rigorously verified against those from the CPUs. Performance, benchmarked as computation time for various input data cube sizes, is compared across GPUs and CPUs. Through this analysis, it is demonstrated that GPGPU (general-purpose GPU) real-time processing of array radar data is possible with relatively low-cost commercial GPUs.
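
    A minimal sketch of one such processing stage, FFT-based pulse compression of a batch of range lines. CuPy is used here as a convenient stand-in for direct calls to the CUDA libraries named above (its FFT dispatches to cuFFT); the data shapes, chirp pulse, and CPU cross-check are illustrative assumptions, not the authors' processing chain.

      import numpy as np
      import cupy as cp  # cp.fft dispatches to cuFFT on the GPU

      def pulse_compress(echoes, pulse):
          # Matched filtering in the frequency domain, batched over range lines
          n = echoes.shape[-1] + pulse.size - 1
          E = cp.fft.fft(cp.asarray(echoes), n=n, axis=-1)
          P = cp.fft.fft(cp.asarray(pulse), n=n)
          return cp.asnumpy(cp.fft.ifft(E * cp.conj(P), axis=-1))

      rng = np.random.default_rng(0)
      echoes = (rng.standard_normal((64, 4096)) + 1j * rng.standard_normal((64, 4096))).astype(np.complex64)
      pulse = np.exp(1j * np.pi * np.linspace(-1, 1, 128) ** 2).astype(np.complex64)  # linear chirp

      gpu = pulse_compress(echoes, pulse)
      # CPU cross-check, analogous to the paper's verification against CPU results
      cpu = np.fft.ifft(np.fft.fft(echoes, n=4223, axis=-1) * np.conj(np.fft.fft(pulse, n=4223)), axis=-1)
      print(np.allclose(gpu, cpu, atol=1e-3))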

  9. AEROFROSH: a shock condition calculator for multi-component fuel aerosol-laden flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, Matthew Frederick; Haylett, D. R.; Davidson, D. F.

    Here, this paper introduces an algorithm that determines the thermodynamic conditions behind incident and reflected shocks in aerosol-laden flows. Importantly, the algorithm accounts for the effects of droplet evaporation on post-shock properties. Additionally, this article describes an algorithm for resolving the effects of multiple-component-fuel droplets. This article presents the solution methodology and compares the results to those of another similar shock calculator. It also provides examples to show the impact of droplets on post-shock properties and the impact that multi-component fuel droplets have on shock experimental parameters. Finally, this paper presents a detailed uncertainty analysis of this algorithm’s calculations given typical experimental uncertainties.
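
    The droplet-evaporation corrections are AEROFROSH's contribution; the ideal-gas backbone beneath them is the standard normal-shock jump relations, sketched below for a frozen, droplet-free gas. The function and values are illustrative only and are not taken from the paper.

      def normal_shock(M1, gamma=1.4):
          # Ideal-gas normal-shock jumps (frozen chemistry, no droplets)
          p_ratio = 1.0 + 2.0 * gamma / (gamma + 1.0) * (M1**2 - 1.0)
          rho_ratio = (gamma + 1.0) * M1**2 / ((gamma - 1.0) * M1**2 + 2.0)
          return p_ratio, rho_ratio, p_ratio / rho_ratio  # p2/p1, rho2/rho1, T2/T1

      print(normal_shock(2.5))  # incident shock at Mach 2.5 in air: ~(7.13, 3.33, 2.14)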

  10. Analysis of Size Correlations for Microdroplets Produced by Ultrasonic Atomization

    PubMed Central

    Barba, Anna Angela; d'Amore, Matteo

    2013-01-01

    Microencapsulation techniques are widely applied in the field of pharmaceutical production to control drug release in time and in physiological environments. Ultrasonic-assisted atomization is a new technique for producing microencapsulated systems by a mechanical approach. Interest in this technique is due to the advantages it offers (low mechanical stress on materials, reduced energy requirements, smaller apparatus size) compared to more conventional techniques. In this paper, the groundwork of atomization is introduced, the role of the relevant parameters in the ultrasonic atomization mechanism is discussed, and correlations to predict droplet size from process parameters and material properties are presented and tested. PMID:24501580
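
    One classic correlation of the kind tested in such work is Lang's (1962) relation between median droplet diameter and the capillary wavelength, d = 0.34 * (8*pi*sigma/(rho*f^2))^(1/3). The sketch below evaluates it for water at an assumed 25 kHz drive; the values are chosen purely for illustration.

      import math

      def lang_droplet_diameter(sigma, rho, freq):
          # Lang (1962): median droplet diameter from the capillary wavelength
          return 0.34 * (8.0 * math.pi * sigma / (rho * freq**2)) ** (1.0 / 3.0)

      # Water: sigma = 0.072 N/m, rho = 1000 kg/m^3, driven at 25 kHz
      d = lang_droplet_diameter(0.072, 1000.0, 25e3)
      print(f"predicted median droplet diameter: {d * 1e6:.0f} um")  # ~48 um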

  11. An anthropometric survey using digital photogrammetry: a case study in Recife, Pernambuco, Brazil.

    PubMed

    Barros, Bruno; Soares, Marcelo

    2012-01-01

    This study was carried out in a partnership between the Federal University of Pernambuco and the Faculty of Human Motricity of the Technical University of Lisbon (Portugal). The aim of the study was to measure human body segments through digital photogrammetry, to compare and analyse the data within the Recife sample, and to validate the digital system as an anthropometric survey tool. The analysis produced data for the whole sample, by age, by sex, by ethnicity, and by region of birth, together with individual differences from the population, and served to demonstrate the efficiency of the software.

  12. Validation of thermal effects of LED package by using Elmer finite element simulation method

    NASA Astrophysics Data System (ADS)

    Leng, Lai Siang; Retnasamy, Vithyacharan; Mohamad Shahimin, Mukhzeer; Sauli, Zaliman; Taniselass, Steven; Bin Ab Aziz, Muhamad Hafiz; Vairavan, Rajendaran; Kirtsaeng, Supap

    2017-02-01

    The overall performance of a light-emitting diode (LED) package is critically affected by heat. In this study, the open source software Elmer FEM was utilized for thermal analysis of the LED package. To perform a complete simulation study, Salome and ParaView were used as the pre- and post-processor, respectively. The thermal behavior of the LED package was evaluated with this software, and the result was validated against commercially licensed software based on previous work. The percentage difference between the two simulation results is less than 5%, which is tolerable and comparable.

  13. Analysis of turbojet-engine controls for afterburning starting

    NASA Technical Reports Server (NTRS)

    Phillips, W. E., Jr.

    1956-01-01

    A simulation procedure is developed for studying the effects of an afterburner start on a controlled turbojet engine. The afterburner start is represented by introducing a step decrease in the effective exhaust-nozzle area, after which the control returns the controlled engine variables to their initial values. The degree and speed with which the control acts are a measure of the effectiveness of the particular control system. Data are presented from five systems investigated using an electronic analog computer and the developed simulation procedure. These systems are compared with respect to steady-state errors, speed of response, and transient deviations of the system variables.

  14. Plate falling in a fluid: Regular and chaotic dynamics of finite-dimensional models

    NASA Astrophysics Data System (ADS)

    Kuznetsov, Sergey P.

    2015-05-01

    Results are reviewed concerning the planar problem of a plate falling in a resisting medium studied with models based on ordinary differential equations for a small number of dynamical variables. A unified model is introduced to conduct a comparative analysis of the dynamical behaviors of models of Kozlov, Tanabe-Kaneko, Belmonte-Eisenberg-Moses and Andersen-Pesavento-Wang using common dimensionless variables and parameters. It is shown that the overall structure of the parameter spaces for the different models manifests certain similarities caused by the same inherent symmetry and by the universal nature of the phenomena involved in nonlinear dynamics (fixed points, limit cycles, attractors, and bifurcations).

  15. Evaluation of Market Design Agents: The Mertacor Perspective

    NASA Astrophysics Data System (ADS)

    Stavrogiannis, Lampros C.; Mitkas, Pericles A.

    The annual Trading Agent Competition for Market Design, CAT, provides a testbed to study the mechanisms that modern stock exchanges use in their effort to attract potential traders while maximizing their profit. This paper presents an evaluation of the agents that participated in the 2008 competition. The evaluation is based on the analysis of the CAT finals as well as on the results obtained from post-tournament experiments. We present Mertacor, our entrant for 2008, and compare it with the other available agents. In addition, we introduce a simple yet effective way of computing the global competitive equilibrium that Mertacor utilizes and discuss its importance for the game.

  16. AEROFROSH: a shock condition calculator for multi-component fuel aerosol-laden flows

    DOE PAGES

    Campbell, Matthew Frederick; Haylett, D. R.; Davidson, D. F.; ...

    2015-08-18

    Here, this paper introduces an algorithm that determines the thermodynamic conditions behind incident and reflected shocks in aerosol-laden flows. Importantly, the algorithm accounts for the effects of droplet evaporation on post-shock properties. Additionally, this article describes an algorithm for resolving the effects of multiple-component-fuel droplets. This article presents the solution methodology and compares the results to those of another similar shock calculator. It also provides examples to show the impact of droplets on post-shock properties and the impact that multi-component fuel droplets have on shock experimental parameters. Finally, this paper presents a detailed uncertainty analysis of this algorithm’s calculations given typical experimental uncertainties.

  17. Muzzle flash issues related to the Waco FLIR analysis

    NASA Astrophysics Data System (ADS)

    Grant, Barbara G.; Hardy, David T.

    2001-09-01

    The controversy surrounding the origin of flashes on the Mt. Carmel FLIR videotape acquired on April 19, 1993, is introduced. The characteristics of muzzle flash are reviewed. A comparative weapons description is offered. The temporal, spatial, and radiance characteristics of thermal infrared muzzle flash are addressed. Data acquired from a field experiment are presented. The authors conclude that the spatial characteristics of muzzle flash enable its detection by equipment such as the FLIR in use at Mt. Carmel on April 19, 1993; that while flashes obtained in the field appear highly radiant, measurements are necessary to quantify their values; and that the temporal behavior of muzzle flash deserves further study.

  18. The case for introducing pre-registered confirmatory pharmacological pre-clinical studies.

    PubMed

    Kiwanuka, Olivia; Bellander, Bo-Michael; Hånell, Anders

    2018-05-01

    When evaluating the design of pre-clinical studies in the field of traumatic brain injury, we found substantial differences compared to phase III clinical trials, which in part may explain the difficulties in translating promising experimental drugs into approved treatments. By using network analysis, we also found cases where a large proportion of the studies evaluating a pre-clinical treatment was performed by inter-related researchers, which is potentially problematic. Subjecting all pre-clinical trials to the rigor of a phase III clinical trial is, however, likely not practically achievable. Instead, we repeat the call for a distinction to be made between exploratory and confirmatory pre-clinical studies.

  19. Numerical methods for the inverse problem of density functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, Daniel S.; Wasserman, Adam

    Here, the inverse problem of Kohn–Sham density functional theory (DFT) is often solved in an effort to benchmark and design approximate exchange-correlation potentials. The forward and inverse problems of DFT rely on the same equations but the numerical methods for solving each problem are substantially different. We examine both problems in this tutorial with a special emphasis on the algorithms and error analysis needed for solving the inverse problem. Two inversion methods based on partial differential equation constrained optimization and constrained variational ideas are introduced. We compare and contrast several different inversion methods applied to one-dimensional finite and periodic model systems.

  20. An experimental investigation of Iosipescu specimen for composite materials

    NASA Technical Reports Server (NTRS)

    Ho, H.; Tsai, M. Y.; Morton, J.; Farley, G. L.

    1991-01-01

    A detailed experimental evaluation of the Iosipescu specimen tested in the modified Wyoming fixture is presented. Moire interferometry is employed to determine the deformation of unidirectional and cross-ply graphite-epoxy specimens. The results of the moire experiments are compared to those from the traditional strain-gage method. Strain-gage readings from one surface of a specimen, together with corresponding moire interferometry data from the opposite face, document an extreme sensitivity of some fiber orientations to twisting. A localized hybrid analysis is introduced to perform efficient reduction of the moire data, producing whole-field strain distributions in the specimen test sections.
