Sample records for single methodological approach

  1. Gender and Leadership Styles in Single-Sex Academic Institutions

    ERIC Educational Resources Information Center

    Taleb, Hanan M.

    2010-01-01

    Purpose: This paper aims to investigate the relationship between gender and female leadership styles in a single-sex academic institution in Saudi Arabia. Design/methodology/approach: Essentially, a qualitative research approach that utilised a single case-study methodology was adopted. As part of this research, seven in-depth semi-structured…

  2. MBO: A Rational Approach and a Comparative Frameworks Approach

    ERIC Educational Resources Information Center

    Harries, T. W.

    1974-01-01

    Considering an organizational phenomenon from more than one theoretical perspective may be more fruitful than using a single rational approach. There is a danger that the restriction of information generation caused by the single approach may produce a false certainty engendered in part through the methodology itself. (Author/WM)

  3. An omnibus test for family-based association studies with multiple SNPs and multiple phenotypes.

    PubMed

    Lasky-Su, Jessica; Murphy, Amy; McQueen, Matthew B; Weiss, Scott; Lange, Christoph

    2010-06-01

    We propose an omnibus family-based association test (MFBAT) that can be applied to multiple markers and multiple phenotypes and that has only one degree of freedom. The proposed test statistic extends current FBAT methodology to incorporate multiple markers as well as multiple phenotypes. Using simulation studies, power estimates for the proposed methodology are compared with the standard methodologies. On the basis of these simulations, we find that MFBAT substantially outperforms other methods, including haplotypic approaches and doing multiple tests with single single-nucleotide polymorphisms (SNPs) and single phenotypes. The practical relevance of the approach is illustrated by an application to asthma in which SNP/phenotype combinations are identified and reach overall significance that would not have been identified using other approaches. This methodology is directly applicable to cases in which there are multiple SNPs, such as candidate gene studies, cases in which there are multiple phenotypes, such as expression data, and cases in which there are multiple phenotypes and genotypes, such as genome-wide association studies that incorporate expression profiles as phenotypes. This program is available in the PBAT analysis package.
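
    For orientation only, a sketch of the standard single-marker, single-phenotype FBAT statistic that MFBAT extends; conventional FBAT notation is assumed here, and the multi-marker, multi-phenotype statistic with its single degree of freedom is specific to the paper and not reproduced:

```latex
% Offspring i has coded genotype X_i and coded phenotype T_i; the expectation is taken
% under the null hypothesis, conditional on parental genotypes (or on the sufficient
% statistics of the mating type when parental genotypes are missing).
S = \sum_i T_i \bigl( X_i - \mathbb{E}[X_i \mid \text{parents}] \bigr), \qquad
Z = \frac{S}{\sqrt{\operatorname{Var}(S)}} \sim \mathcal{N}(0,1) \ \text{under } H_0 .
```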

  4. Single-case research design in pediatric psychology: considerations regarding data analysis.

    PubMed

    Cohen, Lindsey L; Feinstein, Amanda; Masuda, Akihiko; Vowles, Kevin E

    2014-03-01

    Single-case research allows for an examination of behavior and can demonstrate the functional relation between intervention and outcome in pediatric psychology. This review highlights key assumptions, methodological and design considerations, and options for data analysis. Single-case methodology and guidelines are reviewed with an in-depth focus on visual and statistical analyses. Guidelines allow for the careful evaluation of design quality and visual analysis. A number of statistical techniques have been introduced to supplement visual analysis, but to date, there is no consensus on their recommended use in single-case research design. Single-case methodology is invaluable for advancing pediatric psychology science and practice, and guidelines have been introduced to enhance the consistency, validity, and reliability of these studies. Experts generally agree that visual inspection is the optimal method of analysis in single-case design; however, statistical approaches are becoming increasingly evaluated and used to augment data interpretation.

  5. Evaluating Change in Behavioral Preferences: Multidimensional Scaling Single-Ideal Point Model

    ERIC Educational Resources Information Center

    Ding, Cody

    2016-01-01

    The purpose of the article is to propose a multidimensional scaling single-ideal point model as a method to evaluate changes in individuals' preferences under the explicit methodological framework of behavioral preference assessment. One example is used to illustrate the approach and to give a clear idea of what it can accomplish.

  6. Developing a methodology to assess the impact of research grant funding: a mixed methods approach.

    PubMed

    Bloch, Carter; Sørensen, Mads P; Graversen, Ebbe K; Schneider, Jesper W; Schmidt, Evanthia Kalpazidou; Aagaard, Kaare; Mejlgaard, Niels

    2014-04-01

    This paper discusses the development of a mixed methods approach to analyse research funding. Research policy has taken on an increasingly prominent role in the broader political scene, where research is seen as a critical factor in maintaining and improving growth, welfare and international competitiveness. This has motivated growing emphasis on the impacts of science funding, and how funding can best be designed to promote socio-economic progress. Meeting these demands for impact assessment involves a number of complex issues that are difficult to fully address in a single study or in the design of a single methodology. However, they point to some general principles that can be explored in methodological design. We draw on a recent evaluation of the impacts of research grant funding, discussing both key issues in developing a methodology for the analysis and subsequent results. The case of research grant funding, involving a complex mix of direct and intermediate effects that contribute to the overall impact of funding on research performance, illustrates the value of a mixed methods approach to provide a more robust and complete analysis of policy impacts. Reflections on the strengths and weaknesses of the methodology are used to examine refinements for future work. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Exploring Academics' Approaches to Managing Team Assessment

    ERIC Educational Resources Information Center

    Augar, Naomi; Woodley, Carolyn J.; Whitefield, Despina; Winchester, Maxwell

    2016-01-01

    Purpose: The purpose of this paper is to develop an understanding of academics' approaches to managing team assessment at an Australian University with a view to informing policy development and assessment design. Design/methodology/approach: The research was conducted using a single exploratory case study approach focussing on the team assessment…

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, Dirk C.; Deceglie, Michael G.; Kurtz, Sarah R.

    What is the best method to determine long-term PV system performance and degradation rates? Ideally, one universally applicable methodology would be desirable so that a single number could be derived. However, data sets vary in their attributes, and evidence is presented that defining two methodologies may be preferable. Monte Carlo simulations of artificial performance data allowed investigation of different methodologies and their respective confidence intervals. Tradeoffs between different approaches were delineated, elucidating why two separate approaches may need to be included in a standard. Regression approaches tend to be preferable when data sets are less contaminated by seasonality, noise, and outliers, although robust regression can significantly improve accuracy when outliers are present. In the presence of outliers, marked seasonality, or strong soiling events, year-on-year approaches tend to outperform regression approaches.
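
    As a purely illustrative contrast between the two families of approaches named above (a sketch, not the authors' code; the series and variable names are hypothetical), the following Python computes a degradation rate from a monthly performance-index series both by ordinary least-squares regression and by the year-on-year method:

```python
import numpy as np
import pandas as pd

def degradation_rates(perf_index: pd.Series) -> tuple:
    """Return (%/year) estimates from a regression and a year-on-year approach.

    perf_index: monthly performance index (e.g. normalized PV energy yield),
    indexed by a regular monthly DatetimeIndex.
    """
    years = (perf_index.index - perf_index.index[0]).days / 365.25
    values = perf_index.to_numpy()

    # Regression approach: slope of an ordinary least-squares straight-line fit.
    slope, intercept = np.polyfit(years, values, 1)
    rd_regression = 100.0 * slope / intercept  # percent per year, relative to the fit at t = 0

    # Year-on-year approach: median of all 12-month point-to-point changes,
    # which is less sensitive to seasonality and isolated outliers.
    rd_yoy = float(np.median(100.0 * (values[12:] - values[:-12]) / values[:-12]))

    return rd_regression, rd_yoy

# Synthetic example: roughly a 0.8 %/year decline with seasonality and noise.
idx = pd.date_range("2015-01-01", periods=72, freq="MS")
t = np.arange(72) / 12.0
rng = np.random.default_rng(1)
series = pd.Series(1.0 - 0.008 * t + 0.02 * np.sin(2 * np.pi * t)
                   + rng.normal(0, 0.005, 72), index=idx)
print(degradation_rates(series))
```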

  9. Effective Rating Scale Development for Speaking Tests: Performance Decision Trees

    ERIC Educational Resources Information Center

    Fulcher, Glenn; Davidson, Fred; Kemp, Jenny

    2011-01-01

    Rating scale design and development for testing speaking is generally conducted using one of two approaches: the measurement-driven approach or the performance data-driven approach. The measurement-driven approach prioritizes the ordering of descriptors onto a single scale. Meaning is derived from the scaling methodology and the agreement of…

  10. Advances in the indirect, descriptive, and experimental approaches to the functional analysis of problem behavior.

    PubMed

    Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier

    2014-05-01

    Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.

  11. The future is now: single-cell genomics of bacteria and archaea

    PubMed Central

    Blainey, Paul C.

    2013-01-01

    Interest in the expanding catalog of uncultivated microorganisms, increasing recognition of heterogeneity among seemingly similar cells, and technological advances in whole-genome amplification and single-cell manipulation are driving considerable progress in single-cell genomics. Here, the spectrum of applications for single-cell genomics, key advances in the development of the field, and emerging methodology for single-cell genome sequencing are reviewed by example with attention to the diversity of approaches and their unique characteristics. Experimental strategies transcending specific methodologies are identified and organized as a road map for future studies in single-cell genomics of environmental microorganisms. Over the next decade, increasingly powerful tools for single-cell genome sequencing and analysis will play key roles in accessing the genomes of uncultivated organisms, determining the basis of microbial community functions, and fundamental aspects of microbial population biology. PMID:23298390

  12. A methodology for least-squares local quasi-geoid modelling using a noisy satellite-only gravity field model

    NASA Astrophysics Data System (ADS)

    Klees, R.; Slobbe, D. C.; Farahani, H. H.

    2018-04-01

    The paper is about a methodology to combine a noisy satellite-only global gravity field model (GGM) with other noisy datasets to estimate a local quasi-geoid model using weighted least-squares techniques. In this way, we attempt to improve the quality of the estimated quasi-geoid model and to complement it with a full noise covariance matrix for quality control and further data processing. The methodology goes beyond the classical remove-compute-restore approach, which does not account for the noise in the satellite-only GGM. We suggest and analyse three different approaches of data combination. Two of them are based on a local single-scale spherical radial basis function (SRBF) model of the disturbing potential, and one is based on a two-scale SRBF model. Using numerical experiments, we show that a single-scale SRBF model does not fully exploit the information in the satellite-only GGM. We explain this by a lack of flexibility of a single-scale SRBF model to deal with datasets of significantly different bandwidths. The two-scale SRBF model performs well in this respect, provided that the model coefficients representing the two scales are estimated separately. The corresponding methodology is developed in this paper. Using the statistics of the least-squares residuals and the statistics of the errors in the estimated two-scale quasi-geoid model, we demonstrate that the developed methodology provides a two-scale quasi-geoid model, which exploits the information in all datasets.
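
    A hedged sketch of the generic weighted least-squares combination on which such a methodology rests (notation assumed here, not taken from the paper): with datasets y_i = A_i x + e_i and noise covariance matrices Q_i, the combined estimate of the model coefficients and its covariance are

```latex
\hat{x} = \Bigl(\sum_i A_i^{\mathsf T} Q_i^{-1} A_i\Bigr)^{-1} \sum_i A_i^{\mathsf T} Q_i^{-1} y_i ,
\qquad
Q_{\hat{x}} = \Bigl(\sum_i A_i^{\mathsf T} Q_i^{-1} A_i\Bigr)^{-1} ,
```

    which is the sense in which the estimated quasi-geoid model can be complemented with a full noise covariance matrix.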

  13. Implementing a Social-Ecological Model of Health in Wales

    ERIC Educational Resources Information Center

    Rothwell, Heather; Shepherd, Michael; Murphy, Simon; Burgess, Stephen; Townsend, Nick; Pimm, Claire

    2010-01-01

    Purpose: The purpose of this paper is to assess the implementation of the Welsh Network of Healthy School Schemes (WNHSS) at national, local and school levels, using a systems approach drawing on the Ottawa Charter. Design/methodology/approach: The approach takes the form of a single-case study using data from a documentary analysis, interviews…

  14. Applying Generalizability Theory To Evaluate Treatment Effect in Single-Subject Research.

    ERIC Educational Resources Information Center

    Lefebvre, Daniel J.; Suen, Hoi K.

    An empirical investigation of methodological issues associated with evaluating treatment effect in single-subject research (SSR) designs is presented. This investigation: (1) conducted a generalizability (G) study to identify the sources of systematic and random measurement error (SRME); (2) used an analytic approach based on G theory to integrate…

  15. Novel method for enumeration of viable Lactobacillus plantarum WCFS1 cells after single-droplet drying.

    PubMed

    Perdana, Jimmy; Bereschenko, Ludmila; Roghair, Mark; Fox, Martijn B; Boom, Remko M; Kleerebezem, Michiel; Schutyser, Maarten A I

    2012-11-01

    Survival of probiotic bacteria during drying is not trivial. Survival percentages are very specific for each probiotic strain and can be improved by careful selection of drying conditions and proper drying carrier formulation. An experimental approach is presented, comprising a single-droplet drying method and a subsequent novel screening methodology, to assess the microbial viability within single particles. The drying method involves the drying of a single droplet deposited on a flat, hydrophobic surface under well-defined drying conditions and carrier formulations. Semidried or dried particles were subjected to rehydration, fluorescence staining, and live/dead enumeration using fluorescence microscopy. The novel screening methodology provided accurate survival percentages in line with conventional plating enumeration and was evaluated in single-droplet drying experiments with Lactobacillus plantarum WCFS1 as a model probiotic strain. Parameters such as bulk air temperatures and the carrier matrices (glucose, trehalose, and maltodextrin DE 6) were varied. Following the experimental approach, the influence on the viability as a function of the drying history could be monitored. Finally, the applicability of the novel viability assessment was demonstrated for samples obtained from drying experiments at a larger scale.

  16. Novel Method for Enumeration of Viable Lactobacillus plantarum WCFS1 Cells after Single-Droplet Drying

    PubMed Central

    Perdana, Jimmy; Bereschenko, Ludmila; Roghair, Mark; Fox, Martijn B.; Boom, Remko M.; Kleerebezem, Michiel

    2012-01-01

    Survival of probiotic bacteria during drying is not trivial. Survival percentages are very specific for each probiotic strain and can be improved by careful selection of drying conditions and proper drying carrier formulation. An experimental approach is presented, comprising a single-droplet drying method and a subsequent novel screening methodology, to assess the microbial viability within single particles. The drying method involves the drying of a single droplet deposited on a flat, hydrophobic surface under well-defined drying conditions and carrier formulations. Semidried or dried particles were subjected to rehydration, fluorescence staining, and live/dead enumeration using fluorescence microscopy. The novel screening methodology provided accurate survival percentages in line with conventional plating enumeration and was evaluated in single-droplet drying experiments with Lactobacillus plantarum WCFS1 as a model probiotic strain. Parameters such as bulk air temperatures and the carrier matrices (glucose, trehalose, and maltodextrin DE 6) were varied. Following the experimental approach, the influence on the viability as a function of the drying history could be monitored. Finally, the applicability of the novel viability assessment was demonstrated for samples obtained from drying experiments at a larger scale. PMID:22983965

  17. A Descent Rate Control Approach to Developing an Autonomous Descent Vehicle

    NASA Astrophysics Data System (ADS)

    Fields, Travis D.

    Circular parachutes have been used for aerial payload/personnel deliveries for over 100 years. In the past two decades, significant work has been done to improve the landing accuracies of cargo deliveries for humanitarian and military applications. This dissertation discusses the approach developed in which a circular parachute is used in conjunction with an electro-mechanical reefing system to manipulate the landing location. Rather than attempt to steer the autonomous descent vehicle directly, control of the landing location is accomplished by modifying the amount of time spent in a particular wind layer. Descent rate control is performed by reversibly reefing the parachute canopy. The first stage of the research investigated the use of a single actuation during descent (with periodic updates), in conjunction with a curvilinear target. Simulation results using real-world wind data are presented, illustrating the utility of the methodology developed. Additionally, hardware development and flight-testing of the single actuation autonomous descent vehicle are presented. The next phase of the research focuses on expanding the single actuation descent rate control methodology to incorporate a multi-actuation path-planning system. By modifying the parachute size throughout the descent, the controllability of the system greatly increases. The trajectory planning methodology developed provides a robust approach to accurately manipulate the landing location of the vehicle. The primary benefits of this system are the inherent robustness to release location errors and the ability to overcome vehicle uncertainties (mass, parachute size, etc.). A separate application of the path-planning methodology is also presented. An in-flight path-prediction system was developed for use in high-altitude ballooning by utilizing the path-planning methodology developed for descent vehicles. The developed onboard system improves landing location predictions in-flight using collected flight information during the ascent and descent. Simulation and real-world flight tests (using the developed low-cost hardware) demonstrate the significance of the improvements achievable when flying the developed system.

  18. On multi-site damage identification using single-site training data

    NASA Astrophysics Data System (ADS)

    Barthorpe, R. J.; Manson, G.; Worden, K.

    2017-11-01

    This paper proposes a methodology for developing multi-site damage location systems for engineering structures that can be trained using single-site damaged state data only. The methodology involves training a sequence of binary classifiers based upon single-site damage data and combining the developed classifiers into a robust multi-class damage locator. In this way, the multi-site damage identification problem may be decomposed into a sequence of binary decisions. In this paper Support Vector Classifiers are adopted as the means of making these binary decisions. The proposed methodology represents an advancement on state-of-the-art approaches to multi-site damage identification, which require either: (1) full damaged-state data from single- and multi-site damage cases or (2) the development of a physics-based model to make multi-site model predictions. The potential benefit of the proposed methodology is that a significantly reduced number of recorded damage states may be required in order to train a multi-site damage locator without recourse to physics-based model predictions. In this paper it is first demonstrated that Support Vector Classification represents an appropriate approach to the multi-site damage location problem, with methods for combining binary classifiers discussed. Next, the proposed methodology is demonstrated and evaluated through application to a real engineering structure - a Piper Tomahawk trainer aircraft wing - with its performance compared to classifiers trained using the full damaged-state dataset.
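
    A minimal sketch (synthetic stand-in data, not the authors' pipeline) of combining binary Support Vector Classifiers trained on single-site damage data into a multi-class locator via a one-vs-rest scheme:

```python
import numpy as np
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Hypothetical stand-in data: 3 damage sites, 50 single-site feature vectors each.
X_single = np.vstack([rng.normal(loc=k, size=(50, 8)) for k in range(3)])
y_single = np.repeat([0, 1, 2], 50)

# Train one binary SVC per site (one-vs-rest) using single-site data only.
locator = make_pipeline(
    StandardScaler(),
    OneVsRestClassifier(SVC(kernel="rbf", probability=True)),
)
locator.fit(X_single, y_single)

# At prediction time the per-site probabilities can be thresholded, so that
# more than one site exceeding the threshold flags multi-site damage.
site_probabilities = locator.predict_proba(X_single[:5])
print(site_probabilities.round(2))
```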

  19. Proportional-delayed controllers design for LTI-systems: a geometric approach

    NASA Astrophysics Data System (ADS)

    Hernández-Díez, J.-E.; Méndez-Barrios, C.-F.; Mondié, S.; Niculescu, S.-I.; González-Galván, E. J.

    2018-04-01

    This paper focuses on the design of P-δ controllers for single-input-single-output linear time-invariant systems. The basis of this work is a geometric approach that allows the parameter space to be partitioned into regions with a constant number of unstable roots. This methodology defines the hyper-planes separating the aforementioned regions and characterises the way in which the number of unstable roots changes when crossing such a hyper-plane. The main contribution of the paper is that it provides an explicit tool to find P-δ gains ensuring the stability of the closed-loop system. In addition, the proposed methodology allows the design of a non-fragile controller with a desired exponential decay rate σ. Several numerical examples illustrate the results, and a haptic experimental set-up shows the effectiveness of P-δ controllers.
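
    A hedged sketch of the setting (sign conventions and notation assumed, not taken from the paper): for a plant G(s) = N(s)/D(s) under a proportional-delayed control law u(t) = k_p e(t) + k_δ e(t - h), the closed-loop characteristic quasi-polynomial is

```latex
\Delta(s;\, k_p, k_\delta) \;=\; D(s) + \bigl(k_p + k_\delta\, e^{-h s}\bigr) N(s) ,
```

    and the boundaries that partition the (k_p, k_δ) plane into regions with a constant number of unstable roots follow from imposing Δ(jω; k_p, k_δ) = 0 for ω ≥ 0.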

  20. Curve of Factors Model: A Latent Growth Modeling Approach for Educational Research

    ERIC Educational Resources Information Center

    Isiordia, Marilu; Ferrer, Emilio

    2018-01-01

    A first-order latent growth model assesses change in an unobserved construct from a single score and is commonly used across different domains of educational research. However, examining change using a set of multiple response scores (e.g., scale items) affords researchers several methodological benefits not possible when using a single score. A…

  1. Redundancy and Novelty Mining in the Business Blogosphere

    ERIC Educational Resources Information Center

    Tsai, Flora S.; Chan, Kap Luk

    2010-01-01

    Purpose: The paper aims to explore the performance of redundancy and novelty mining in the business blogosphere, which has not been studied before. Design/methodology/approach: Novelty mining techniques are implemented to single out novel information out of a massive set of text documents. This paper adopted the mixed metric approach which…

  2. Reading Out Single-Molecule Digital RNA and DNA Isothermal Amplification in Nanoliter Volumes with Unmodified Camera Phones

    PubMed Central

    2016-01-01

    Digital single-molecule technologies are expanding diagnostic capabilities, enabling the ultrasensitive quantification of targets, such as viral load in HIV and hepatitis C infections, by directly counting single molecules. Replacing fluorescent readout with a robust visual readout that can be captured by any unmodified cell phone camera will facilitate the global distribution of diagnostic tests, including in limited-resource settings where the need is greatest. This paper describes a methodology for developing a visual readout system for digital single-molecule amplification of RNA and DNA by (i) selecting colorimetric amplification-indicator dyes that are compatible with the spectral sensitivity of standard mobile phones, and (ii) identifying an optimal ratiometric image-process for a selected dye to achieve a readout that is robust to lighting conditions and camera hardware and provides unambiguous quantitative results, even for colorblind users. We also include an analysis of the limitations of this methodology, and provide a microfluidic approach that can be applied to expand dynamic range and improve reaction performance, allowing ultrasensitive, quantitative measurements at volumes as low as 5 nL. We validate this methodology using SlipChip-based digital single-molecule isothermal amplification with λDNA as a model and hepatitis C viral RNA as a clinically relevant target. The innovative combination of isothermal amplification chemistry in the presence of a judiciously chosen indicator dye and ratiometric image processing with SlipChip technology allowed the sequence-specific visual readout of single nucleic acid molecules in nanoliter volumes with an unmodified cell phone camera. When paired with devices that integrate sample preparation and nucleic acid amplification, this hardware-agnostic approach will increase the affordability and the distribution of quantitative diagnostic and environmental tests. PMID:26900709
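
    To illustrate the general idea of a ratiometric readout only (the dye, color channels, and threshold used in the paper are not reproduced here; the function below is a hypothetical sketch), taking the ratio of two camera color channels averaged over a reaction well cancels a common multiplicative change in illumination:

```python
import numpy as np

def ratiometric_readout(rgb_patch: np.ndarray) -> float:
    """Illustrative ratiometric readout for one reaction well.

    rgb_patch: H x W x 3 array of pixel values covering the well. Taking the
    ratio of two color-channel means cancels a common multiplicative change
    in illumination, which is the basic idea behind a lighting-robust readout.
    """
    red = rgb_patch[..., 0].astype(float).mean()
    green = rgb_patch[..., 1].astype(float).mean()
    return green / red  # a well is then classified by thresholding this ratio
```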

  3. Single Cell Multi-Omics Technology: Methodology and Application.

    PubMed

    Hu, Youjin; An, Qin; Sheu, Katherine; Trejo, Brandon; Fan, Shuxin; Guo, Ying

    2018-01-01

    In the era of precision medicine, multi-omics approaches enable the integration of data from diverse omics platforms, providing multi-faceted insight into the interrelation of these omics layers on disease processes. Single cell sequencing technology can dissect the genotypic and phenotypic heterogeneity of bulk tissue and promises to deepen our understanding of the underlying mechanisms governing both health and disease. Through modification and combination of single cell assays available for transcriptome, genome, epigenome, and proteome profiling, single cell multi-omics approaches have been developed to simultaneously and comprehensively study not only the unique genotypic and phenotypic characteristics of single cells, but also the combined regulatory mechanisms evident only at single cell resolution. In this review, we summarize the state-of-the-art single cell multi-omics methods and discuss their applications, challenges, and future directions.

  4. Single Cell Multi-Omics Technology: Methodology and Application

    PubMed Central

    Hu, Youjin; An, Qin; Sheu, Katherine; Trejo, Brandon; Fan, Shuxin; Guo, Ying

    2018-01-01

    In the era of precision medicine, multi-omics approaches enable the integration of data from diverse omics platforms, providing multi-faceted insight into the interrelation of these omics layers on disease processes. Single cell sequencing technology can dissect the genotypic and phenotypic heterogeneity of bulk tissue and promises to deepen our understanding of the underlying mechanisms governing both health and disease. Through modification and combination of single cell assays available for transcriptome, genome, epigenome, and proteome profiling, single cell multi-omics approaches have been developed to simultaneously and comprehensively study not only the unique genotypic and phenotypic characteristics of single cells, but also the combined regulatory mechanisms evident only at single cell resolution. In this review, we summarize the state-of-the-art single cell multi-omics methods and discuss their applications, challenges, and future directions. PMID:29732369

  5. 76 FR 36092 - Antidumping Methodologies in Proceedings Involving Non-Market Economies: Valuing the Factor of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-21

    ...This notice addresses the methodology used by the Department of Commerce (``the Department'') to value the cost of labor in non- market economy (``NME'') countries. After reviewing all comments received on the Department's interim, industry-specific wage calculation methodology that is currently applied in NME antidumping proceedings, the Department has determined that the single surrogate- country approach is best. In addition, the Department has decided to use International Labor Organization (``ILO'') Yearbook Chapter 6A as its primary source of labor cost data in NME antidumping proceedings.

  6. Applications of mixed-methods methodology in clinical pharmacy research.

    PubMed

    Hadi, Muhammad Abdul; Closs, S José

    2016-06-01

    Introduction: Mixed-methods methodology, as the name suggests, refers to the mixing of elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology: A number of mixed-methods designs are available in the literature, and the four most commonly used designs in healthcare research are: the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own unique advantages, challenges and procedures, and selection of a particular design should be guided by the research question. Guidance on designing, conducting and reporting mixed-methods research is available in the literature, so it is advisable to adhere to this to ensure methodological rigour. When to use: It is best suited when the research questions require: triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another method; informing the design of one method based on the findings of another method; development of a scale/questionnaire; and answering different research questions within a single study. Two case studies have been presented to illustrate possible applications of mixed-methods methodology. Limitations: Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs are often time-consuming, being in two (or more) phases, whereas concurrent study designs may require more than one data collector to collect both qualitative and quantitative data at the same time.

  7. Plant Systems Biology at the Single-Cell Level.

    PubMed

    Libault, Marc; Pingault, Lise; Zogli, Prince; Schiefelbein, John

    2017-11-01

    Our understanding of plant biology is increasingly being built upon studies using 'omics and systems biology approaches performed at the level of the entire plant, organ, or tissue. Although these approaches open new avenues to better understand plant biology, they suffer from the cellular complexity of the analyzed sample. Recent methodological advances now allow plant scientists to overcome this limitation and enable biological analyses of single cells or single cell types. Coupled with the development of bioinformatics and functional genomics resources, these studies provide opportunities for high-resolution systems analyses of plant phenomena. In this review, we describe the recent advances, current challenges, and future directions in exploring the biology of single cells and single cell types to enhance our understanding of plant biology as a system. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Mixed methods research.

    PubMed

    Halcomb, Elizabeth; Hickman, Louise

    2015-04-08

    Mixed methods research involves the use of qualitative and quantitative data in a single research project. It represents an alternative methodological approach, combining qualitative and quantitative research approaches, which enables nurse researchers to explore complex phenomena in detail. This article provides a practical overview of mixed methods research and its application in nursing, to guide the novice researcher considering a mixed methods research project.

  9. The potential of neuroscience for health sciences education: towards convergence of evidence and resisting seductive allure.

    PubMed

    de Bruin, Anique B H

    2016-12-01

    Since the emergence of the field of 'Educational Neuroscience' (EN) in the late nineties of the previous century, a debate has arisen about the potential this field holds to influence teaching and learning in the classroom. By now, most agree that the original claims promising direct translations to teaching and learning were too strong. I argue here that research questions in (health professions) education require multi-methodological approaches, including neuroscience, while carefully weighing which (combination of) approaches are most suitable given a research question. Only through a multi-methodological approach will convergence of evidence emerge, which is so desperately needed for improving teaching and learning in the classroom. However, both researchers and teachers should become aware of the so-called 'seductive allure' of EN; that is, the demonstrable physical location and apparent objectivity of the measurements can be interpreted as yielding more powerful evidence and warranting stronger conclusions than, e.g., behavioral experiments, where in fact oftentimes the reverse is the case. I conclude that our tendency as researchers to commit ourselves to one methodological approach and to address educational research questions from a single methodological perspective is limiting progress in educational science and in translation to education.

  10. Methodological standards in single-case experimental design: Raising the bar.

    PubMed

    Ganz, Jennifer B; Ayres, Kevin M

    2018-04-12

    Single-case experimental designs (SCEDs), or small-n experimental research, are frequently implemented to assess approaches to improving outcomes for people with disabilities, particularly those with low-incidence disabilities, such as some developmental disabilities. SCED has become increasingly accepted as a research design. As this literature base is needed to determine which interventions are evidence-based practices, the acceptance of SCED has resulted in increased critiques with regard to methodological quality. Recent trends include recommendations from a number of expert scholars and institutions. The purpose of this article is to summarize the recent history of methodological quality considerations, synthesize the recommendations found in the SCED literature, and provide recommendations to researchers designing SCEDs with regard to essential and aspirational standards for methodological quality. Conclusions include imploring SCED researchers to increase the quality of their experiments, with particular consideration given to the applied nature of SCED research to be published in Research in Developmental Disabilities and beyond. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Design of feedback control systems for stable plants with saturating actuators

    NASA Technical Reports Server (NTRS)

    Kapasouris, Petros; Athans, Michael; Stein, Gunter

    1988-01-01

    A systematic control design methodology is introduced for multi-input/multi-output stable open loop plants with multiple saturations. This new methodology is a substantial improvement over previous heuristic single-input/single-output approaches. The idea is to introduce a supervisor loop so that when the references and/or disturbances are sufficiently small, the control system operates linearly as designed. For signals large enough to cause saturations, the control law is modified in such a way as to ensure stability and to preserve, to the extent possible, the behavior of the linear control design. Key benefits of the methodology are: the modified compensator never produces saturating control signals, integrators and/or slow dynamics in the compensator never windup, the directional properties of the controls are maintained, and the closed loop system has certain guaranteed stability properties. The advantages of the new design methodology are illustrated in the simulation of an academic example and the simulation of the multivariable longitudinal control of a modified model of the F-8 aircraft.

  12. A methodology for cloud masking uncalibrated lidar signals

    NASA Astrophysics Data System (ADS)

    Binietoglou, Ioannis; D'Amico, Giuseppe; Baars, Holger; Belegante, Livio; Marinou, Eleni

    2018-04-01

    Most lidar processing algorithms, such as those included in EARLINET's Single Calculus Chain, can be applied only to cloud-free atmospheric scenes. In this paper, we present a methodology for masking clouds in uncalibrated lidar signals. First, we construct a reference dataset based on manual inspection and then train a classifier to separate clouds and cloud-free regions. Here we present details of this approach together with example cloud masks from an EARLINET station.

  13. Automatic lesion detection in capsule endoscopy based on color saliency: closer to an essential adjunct for reviewing software.

    PubMed

    Iakovidis, Dimitris K; Koulaouzidis, Anastasios

    2014-11-01

    The advent of wireless capsule endoscopy (WCE) has revolutionized the diagnostic approach to small-bowel disease. However, the task of reviewing WCE video sequences is laborious and time-consuming; software tools offering automated video analysis would enable a timelier and potentially a more accurate diagnosis. The aim was to assess the validity of innovative, automatic lesion-detection software in WCE. A color feature-based pattern recognition methodology was devised and applied to a set of 137 deidentified WCE single images, 77 showing pathology and 60 normal. The study was performed at the Royal Infirmary of Edinburgh, United Kingdom, and the Technological Educational Institute of Central Greece, Lamia, Greece. The proposed methodology, unlike state-of-the-art approaches, is capable of detecting several different types of lesions. The average performance, in terms of the area under the receiver-operating characteristic curve, reached 89.2 ± 0.9%. The best average performance was obtained for angiectasias (97.5 ± 2.4%) and nodular lymphangiectasias (96.3 ± 3.6%). Limitations include a single expert for annotation of pathologies, a single type of WCE model, and the use of single images instead of entire WCE videos. A simple, yet effective, approach allowing automatic detection of all types of abnormalities in capsule endoscopy is presented. Based on color pattern recognition, it outperforms previous state-of-the-art approaches. Moreover, it is robust in the presence of luminal contents and is capable of detecting even very small lesions. Crown Copyright © 2014. Published by Elsevier Inc. All rights reserved.

  14. Application of the HARDMAN methodology to the single channel ground-airborne radio system (SINCGARS)

    NASA Astrophysics Data System (ADS)

    Balcom, J.; Park, J.; Toomer, L.; Feng, T.

    1984-12-01

    The HARDMAN methodology is designed to assess the human resource requirements early in the weapon system acquisition process. In this case, the methodology was applied to the family of radios known as SINCGARS (Single Channel Ground-Airborne Radio System). At the time of the study, SINCGARS was approaching the Full-Scale Development phase, with 2 contractors in competition. Their proposed systems were compared with a composite baseline comparison (reference) system. The systems' manpower, personnel and training requirements were compared. Based on RAM data, the contractors' MPT figures showed a significant reduction from the figures derived for the baseline comparison system. Differences between the two contractors were relatively small. Impact and some tradeoff analyses were hindered by data access problems. Tactical radios, manpower and personnel requirements analysis, impact and tradeoff analysis, human resource sensitivity, training requirements analysis, human resources in LCSMM, and logistics analyses are discussed.

  15. Single-molecule pull-down (SiMPull) for new-age biochemistry: methodology and biochemical applications of single-molecule pull-down (SiMPull) for probing biomolecular interactions in crude cell extracts.

    PubMed

    Aggarwal, Vasudha; Ha, Taekjip

    2014-11-01

    Macromolecular interactions play a central role in many biological processes. Protein-protein interactions have mostly been studied by co-immunoprecipitation, which cannot provide quantitative information on all possible molecular connections present in the complex. We will review a new approach that allows cellular proteins and biomolecular complexes to be studied in real-time at the single-molecule level. This technique is called single-molecule pull-down (SiMPull), because it integrates principles of conventional immunoprecipitation with the powerful single-molecule fluorescence microscopy. SiMPull is used to count how many of each protein is present in the physiological complexes found in cytosol and membranes. Concurrently, it serves as a single-molecule biochemical tool to perform functional studies on the pulled-down proteins. In this review, we will focus on the detailed methodology of SiMPull, its salient features and a wide range of biological applications in comparison with other biosensing tools. © 2014 WILEY Periodicals, Inc.

  16. Design for performance enhancement in feedback control systems with multiple saturating nonlinearities. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Kapasouris, Petros

    1988-01-01

    A systematic control design methodology is introduced for multi-input/multi-output systems with multiple saturations. The methodology can be applied to stable and unstable open loop plants with magnitude and/or rate control saturations and to systems in which state limitations are desired. This new methodology is a substantial improvement over previous heuristic single-input/single-output approaches. The idea is to introduce a supervisor loop so that when the references and/or disturbances are sufficiently small, the control system operates linearly as designed. For signals large enough to cause saturations, the control law is modified in such a way to ensure stability and to preserve, to the extent possible, the behavior of the linear control design. Key benefits of this methodology are: the modified compensator never produces saturating control signals, integrators and/or slow dynamics in the compensator never windup, the directional properties of the controls are maintained, and the closed loop system has certain guaranteed stability properties. The advantages of the new design methodology are illustrated by numerous simulations, including the multivariable longitudinal control of modified models of the F-8 (stable) and F-16 (unstable) aircraft.

  17. Towards lexicographic multi-objective linear programming using grossone methodology

    NASA Astrophysics Data System (ADS)

    Cococcioni, Marco; Pappalardo, Massimo; Sergeyev, Yaroslav D.

    2016-10-01

    Lexicographic Multi-Objective Linear Programming (LMOLP) problems can be solved in two ways: preemptive and nonpreemptive. The preemptive approach requires the solution of a series of LP problems, with changing constraints (each time the next objective is added, a new constraint appears). The nonpreemptive approach is based on a scalarization of the multiple objectives into a single-objective linear function by a weighted combination of the given objectives. It requires the specification of a set of weights, which is not straightforward and can be time consuming. In this work we present both mathematical and software ingredients necessary to solve LMOLP problems using a recently introduced computational methodology (allowing one to work numerically with infinities and infinitesimals) based on the concept of grossone. The ultimate goal of such an attempt is an implementation of a simplex-like algorithm, able to solve the original LMOLP problem by solving only one single-objective problem and without the need to specify finite weights. The expected advantages are therefore obvious.
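
    As a hedged sketch of the two formulations described (standard LMOLP notation assumed), the lexicographic problem and its nonpreemptive weighted scalarization can be written as

```latex
\operatorname{lex\,min}_{x \in X} \bigl( c_1^{\mathsf T} x,\; c_2^{\mathsf T} x,\; \dots,\; c_m^{\mathsf T} x \bigr),
\qquad X = \{\, x \ge 0 : A x \le b \,\},
\qquad\text{versus}\qquad
\min_{x \in X} \sum_{k=1}^{m} w_k\, c_k^{\mathsf T} x , \quad w_1 \gg w_2 \gg \dots \gg w_m > 0 .
```

    The grossone-based formulation can be read as replacing the finite weights w_k with infinitesimally separated ones (e.g., powers of the inverse of grossone), so that a single single-objective problem preserves the lexicographic priorities without the user having to specify finite weights.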

  18. Social Neuroscience and Hyperscanning Techniques: Past, Present and Future

    PubMed Central

    Babiloni, Fabio; Astolfi, Laura

    2012-01-01

    This paper reviews the published literature on hyperscanning methodologies using hemodynamic or neuro-electric modalities. In particular, we describe how different brain recording devices have been employed in different experimental paradigms to gain information about the subtle nature of human interactions. The review also includes papers based on single-subject recordings in which a correlation was found between the activities of different (non-simultaneously recorded) participants in the experiment. The descriptions begin with methodological issues related to simultaneous measurements; descriptions of the results generated by such approaches follow. Finally, a discussion of possible future uses of these new approaches to explore human social interactions is presented. PMID:22917915

  19. Designing for Annual Spacelift Performance

    NASA Technical Reports Server (NTRS)

    McCleskey, Carey M.; Zapata, Edgar

    2017-01-01

    This paper presents a methodology for approaching space launch system design from a total architectural point of view. This different approach to conceptual design is contrasted with traditional approaches that focus on a single set of metrics for flight system performance, i.e., payload lift per flight, vehicle mass, specific impulse, etc. The approach presented works with a larger set of metrics, including annual system lift, or "spacelift" performance. Spacelift performance is more inclusive of the flight production capability of the total architecture, i.e., the flight and ground systems working together as a whole to produce flights on a repeated basis. In the proposed methodology, spacelift performance becomes an important design-for-support parameter for flight system concepts and truly advanced spaceport architectures of the future. The paper covers examples of existing system spacelift performance as benchmarks, points out specific attributes of space transportation systems that must be greatly improved over these existing designs, and outlines current activity in this area.

  20. Practical Strategies for Collaboration across Discipline-Based Education Research and the Learning Sciences

    PubMed Central

    Peffer, Melanie; Renken, Maggie

    2016-01-01

    Rather than pursue questions related to learning in biology from separate camps, recent calls highlight the necessity of interdisciplinary research agendas. Interdisciplinary collaborations allow for a complicated and expanded approach to questions about learning within specific science domains, such as biology. Despite its benefits, interdisciplinary work inevitably involves challenges. Some such challenges originate from differences in theoretical and methodological approaches across lines of work. Thus, aims at developing successful interdisciplinary research programs raise important considerations regarding methodologies for studying biology learning, strategies for approaching collaborations, and training of early-career scientists. Our goal here is to describe two fields important to understanding learning in biology, discipline-based education research and the learning sciences. We discuss differences between each discipline’s approach to biology education research and the benefits and challenges associated with incorporating these perspectives in a single research program. We then propose strategies for building productive interdisciplinary collaboration. PMID:27881446

  1. Soft Systems Methodology

    NASA Astrophysics Data System (ADS)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach, covering SSM's specific techniques, the learning-cycle process of the methodology, and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques, and core tenets, described through a wide range of settings.

  2. Skyscape Archaeology: an emerging interdiscipline for archaeoastronomers and archaeologists

    NASA Astrophysics Data System (ADS)

    Henty, Liz

    2016-02-01

    For historical reasons archaeoastronomy and archaeology differ in their approach to prehistoric monuments, and this has created a divide between the disciplines, which adopt seemingly incompatible methodologies. The reasons behind the impasse will be explored to show how these different approaches gave rise to their respective methods. Archaeological investigations tend to concentrate on single-site analysis, whereas archaeoastronomical surveys tend to be data-driven, drawing on the examination of a large number of similar sets. A comparison will be made between traditional archaeoastronomical data gathering and an emerging methodology which looks at sites on a small scale and combines archaeology and astronomy. Silva's recent research in Portugal and this author's survey in Scotland have explored this methodology and termed it skyscape archaeology. This paper argues that this type of phenomenological skyscape archaeology offers an alternative to large-scale statistical studies which analyse astronomical data obtained from a large number of superficially similar archaeological sites.

  3. Positive lists of cosmetic ingredients: Analytical methodology for regulatory and safety controls - A review.

    PubMed

    Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen

    2016-04-07

    Cosmetic products placed on the market, and their ingredients, must be safe under reasonable conditions of use, in accordance with the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and the analytical determination of the ingredients included in the positive lists of the European Regulation of Cosmetic Products (EC 1223/2009), comprising colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant analytical methods along with the possibilities of fulfilment of the current regulatory issues. The cosmetic legislation is frequently being updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in relation to sample pretreatment and extraction and to the different instrumental approaches developed to solve this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. In recent years, research covering this aspect has tended towards the use of green extraction and microextraction techniques. Analytical methods were generally based on liquid chromatography with UV detection, and gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; but some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are the multianalyte approaches. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
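
    For illustration only (the report's actual calibration and aggregation scheme is not reproduced here), a common way to aggregate several experts' judgments into a single probability distribution is a weighted linear opinion pool,

```latex
p(x) \;=\; \sum_{e=1}^{E} w_e\, p_e(x), \qquad w_e \ge 0, \quad \sum_{e=1}^{E} w_e = 1 ,
```

    where p_e is expert e's elicited distribution for the uncertain input and the weights w_e can be tied to each expert's calibration performance.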

  5. Adult Learning Theories: Implications for Online Instruction

    ERIC Educational Resources Information Center

    Arghode, Vishal; Brieger, Earl W.; McLean, Gary N.

    2017-01-01

    Purpose: This paper analyzes critically four selected learning theories and their role in online instruction for adults. Design/methodology/approach: A literature review was conducted to analyze the theories. Findings: The theory comparison revealed that no single theory encompasses the entirety of online instruction for adult learning; each…

  6. Using Multilevel Modeling in Language Assessment Research: A Conceptual Introduction

    ERIC Educational Resources Information Center

    Barkaoui, Khaled

    2013-01-01

    This article critiques traditional single-level statistical approaches (e.g., multiple regression analysis) to examining relationships between language test scores and variables in the assessment setting. It highlights the conceptual, methodological, and statistical problems associated with these techniques in dealing with multilevel or nested…

  7. A machine learning approach for the identification of key markers involved in brain development from single-cell transcriptomic data.

    PubMed

    Hu, Yongli; Hase, Takeshi; Li, Hui Peng; Prabhakar, Shyam; Kitano, Hiroaki; Ng, See Kiong; Ghosh, Samik; Wee, Lawrence Jin Kiat

    2016-12-22

    The ability to sequence the transcriptomes of single cells using single-cell RNA-seq technologies represents a shift in the scientific paradigm, where scientists are now able to investigate the complex biology of a heterogeneous population of cells one at a time. However, to date, there has been no suitable computational methodology for the analysis of such an intricate deluge of data, in particular techniques that aid the identification of the unique transcriptomic profile differences between the different cellular subtypes. In this paper, we describe a novel methodology for the analysis of single-cell RNA-seq data, obtained from neocortical cells and neural progenitor cells, using machine learning algorithms (Support Vector Machine (SVM) and Random Forest (RF)). Thirty-eight key transcripts were identified, using the SVM-based recursive feature elimination (SVM-RFE) method of feature selection, to best differentiate developing neocortical cells from neural progenitor cells in the SVM and RF classifiers built. These genes also possessed a higher discriminative power (enhanced prediction accuracy) as compared with commonly used statistical techniques or geneset-based approaches. Further downstream network reconstruction analysis was carried out to unravel hidden general regulatory networks, where novel interactions could be further validated in wet-lab experimentation and be useful candidates to be targeted for the treatment of neuronal developmental diseases. The novel approach reported here is able to identify transcripts, with reported neuronal involvement, that optimally differentiate neocortical cells and neural progenitor cells. It is believed to be extensible and applicable to other single-cell RNA-seq expression profiles, such as the study of cancer progression and treatment within a highly heterogeneous tumour.
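
    A minimal, hypothetical sketch of SVM-based recursive feature elimination on a cells-by-genes expression matrix (synthetic stand-in data; the 38-feature target merely mirrors the number of transcripts reported above):

```python
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic stand-in for a cells x genes expression matrix with two cell types.
X = rng.lognormal(size=(120, 500))
y = np.repeat([0, 1], 60)  # e.g. 0 = neural progenitor, 1 = neocortical (illustrative labels)

# SVM-RFE: repeatedly drop the genes with the smallest linear-SVM weights
# until the requested number of features remains.
selector = RFE(SVC(kernel="linear"), n_features_to_select=38, step=0.1)
selector.fit(X, y)
key_gene_indices = np.where(selector.support_)[0]
print(len(key_gene_indices), "genes retained")
```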

  8. Restoration of a single superresolution image from several blurred, noisy, and undersampled measured images.

    PubMed

    Elad, M; Feuer, A

    1997-01-01

    The three main tools in the single image restoration theory are the maximum likelihood (ML) estimator, the maximum a posteriori probability (MAP) estimator, and the set theoretic approach using projection onto convex sets (POCS). This paper utilizes the above known tools to propose a unified methodology toward the more complicated problem of superresolution restoration. In the superresolution restoration problem, an improved resolution image is restored from several geometrically warped, blurred, noisy and downsampled measured images. The superresolution restoration problem is modeled and analyzed from the ML, the MAP, and POCS points of view, yielding a generalization of the known superresolution restoration methods. The proposed restoration approach is general but assumes explicit knowledge of the linear space- and time-variant blur, the (additive Gaussian) noise, the different measured resolutions, and the (smooth) motion characteristics. A hybrid method combining the simplicity of the ML and the incorporation of nonellipsoid constraints is presented, giving improved restoration performance, compared with the ML and the POCS approaches. The hybrid method is shown to converge to the unique optimal solution of a new definition of the optimization problem. Superresolution restoration from motionless measurements is also discussed. Simulations demonstrate the power of the proposed methodology.
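
    For orientation, the maximum-likelihood and MAP formulations that underlie this kind of superresolution restoration can be written as below; the notation (warp F_k, blur H_k, decimation D_k) is a standard rendering and may differ in detail from the paper's.

        \hat{\mathbf{x}}_{\mathrm{ML}}  = \arg\min_{\mathbf{x}} \sum_{k=1}^{N} \left\| \mathbf{y}_k - D_k H_k F_k \mathbf{x} \right\|_2^2 ,
        \qquad
        \hat{\mathbf{x}}_{\mathrm{MAP}} = \arg\min_{\mathbf{x}} \sum_{k=1}^{N} \left\| \mathbf{y}_k - D_k H_k F_k \mathbf{x} \right\|_2^2 + \lambda\, \mathcal{R}(\mathbf{x}),

    where each measured low-resolution frame y_k is modeled as a geometrically warped (F_k), blurred (H_k), decimated (D_k) and noise-corrupted version of the high-resolution image x, and R(x) is a smoothness prior; POCS instead projects onto constraint sets encoding the same observation model.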

  9. Calculating complete and exact Pareto front for multiobjective optimization: a new deterministic approach for discrete problems.

    PubMed

    Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel

    2013-06-01

    Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.
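
    As a generic illustration of the dominance filtering that any exact Pareto-front construction ultimately relies on (not the ripple-spreading algorithm itself), the sketch below extracts the non-dominated subset from a finite candidate set, e.g. the union of the k best solutions of each single-objective problem.

        # Illustrative sketch: exact Pareto front (minimization) of a finite candidate set.
        def pareto_front(candidates):
            """candidates: list of (solution, objective_tuple); returns non-dominated items."""
            def dominates(a, b):
                return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
            return [(sol, obj) for sol, obj in candidates
                    if not any(dominates(other, obj) for _, other in candidates if other != obj)]

        # Hypothetical bi-objective routes: (route id, (travel time, risk)).
        routes = [("r1", (10, 5)), ("r2", (8, 7)), ("r3", (12, 4)), ("r4", (9, 6)), ("r5", (11, 6))]
        print(pareto_front(routes))   # r5 is dominated by r1 and r4 and is removed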

  10. A modular inverse elastostatics approach to resolve the pressure-induced stress state for in vivo imaging based cardiovascular modeling.

    PubMed

    Peirlinck, Mathias; De Beule, Matthieu; Segers, Patrick; Rebelo, Nuno

    2018-05-28

    Patient-specific biomechanical modeling of the cardiovascular system is complicated by the presence of a physiological pressure load, given that the imaged tissue is in a pre-stressed and pre-strained state. Neglecting this prestressed state in solid tissue mechanics models leads to erroneous metrics (e.g. wall deformation, peak stress, wall shear stress), which in turn are used for device design choices, risk assessment (e.g. procedure, rupture) and surgery planning. It is thus of utmost importance to incorporate this deformed and loaded tissue state into the computational models, which implies solving an inverse problem (calculating an undeformed geometry given the load and the deformed geometry). Methodologies to solve this inverse problem can be categorized into iterative and direct methodologies, each having inherent advantages and disadvantages. Direct methodologies are typically based on the inverse elastostatics (IE) approach and offer a computationally efficient, single-shot methodology to compute the in vivo stress state. However, cumbersome and problem-specific derivations of the formulations, and non-trivial access to the finite element analysis (FEA) code, especially for commercial products, hinder broad implementation of these methodologies. For that reason, we developed a novel, modular IE approach and implemented this methodology in a commercial FEA solver with minor user-subroutine interventions. The accuracy of this methodology was demonstrated in an arterial tube and porcine biventricular myocardium model. The computational power and efficiency of the methodology were shown by computing the in vivo stress and strain state, and the corresponding unloaded geometry, for two models containing multiple interacting incompressible, anisotropic (fiber-embedded) and hyperelastic material behaviors: a patient-specific abdominal aortic aneurysm and a full 4-chamber heart model. Copyright © 2018 Elsevier Ltd. All rights reserved.
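
    For contrast with the direct, single-shot IE approach developed in the paper, the sketch below outlines the iterative (fixed-point) class of unloaded-geometry recovery methods that the abstract mentions; the forward solver is a placeholder standing in for a finite-element analysis, and the update rule is a generic illustration.

        # Illustrative sketch of an iterative ("backward displacement") recovery of the
        # unloaded geometry, the alternative class of methods contrasted with direct
        # inverse elastostatics. `forward_solve` is a placeholder for an FEA run under pressure p.
        import numpy as np

        def recover_unloaded_geometry(x_imaged, forward_solve, p, tol=1e-6, max_iter=50):
            X = x_imaged.copy()                    # initial guess: unloaded = imaged geometry
            for _ in range(max_iter):
                x_loaded = forward_solve(X, p)     # deform the guess under the in vivo load
                residual = x_loaded - x_imaged     # mismatch with the imaged configuration
                if np.linalg.norm(residual) < tol:
                    break
                X = X - residual                   # fixed-point correction of the unloaded nodes
            return X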

  11. Mixed-Methods Design in Biology Education Research: Approach and Uses

    ERIC Educational Resources Information Center

    Warfa, Abdi-Rizak M.

    2016-01-01

    Educational research often requires mixing different research methodologies to strengthen findings, better contextualize or explain results, or minimize the weaknesses of a single method. This article provides practical guidelines on how to conduct such research in biology education, with a focus on mixed-methods research (MMR) that uses both…

  12. Evaluating a Tacit Knowledge Sharing Initiative: A Case Study

    ERIC Educational Resources Information Center

    Gubbins, Claire; Corrigan, Siobhan; Garavan, Thomas N.; O'Connor, Christy; Leahy, Damien; Long, David; Murphy, Eamonn

    2012-01-01

    Purpose: This paper aims to present a case study illustrating the issues involved in the tacit knowledge conversion process and to determine whether such conversion delivers value to the organisation in terms of business value and return on investment (ROI). Design/methodology/approach: A single-case multiple baseline participants experimental…

  13. Organizational Commitment, Knowledge Management Interventions, and Learning Organization Capacity

    ERIC Educational Resources Information Center

    Massingham, Peter; Diment, Kieren

    2009-01-01

    Purpose: The purpose of this paper is to examine the relationship between organizational commitment and knowledge management initiatives in developing learning organization capacity (LOC). Design/methodology/approach: This is an empirical study based on a single case study, using partial least squares (PLS) analysis. Findings: The strategic…

  14. Elementary Teachers' Perceptions of Elementary Principals' Effectiveness

    ERIC Educational Resources Information Center

    Fridenvalds, Kriss R.

    2012-01-01

    This dissertation examined the beliefs of elementary teachers to determine if their perceptions of effective principal leadership align to transformational leadership theory vis-a-vis the Educational Leadership Policy Standards (ELPS). A phenomenological, single-case study approach was utilized by means of a mixed-methodological, Web-based survey,…

  15. Definition and Demonstration of a Methodology for Validating Aircraft Trajectory Predictors

    NASA Technical Reports Server (NTRS)

    Vivona, Robert A.; Paglione, Mike M.; Cate, Karen T.; Enea, Gabriele

    2010-01-01

    This paper presents a new methodology for validating an aircraft trajectory predictor, inspired by the lessons learned from a number of field trials, flight tests and simulation experiments for the development of trajectory-predictor-based automation. The methodology introduces new techniques and a new multi-staged approach to reduce the effort in identifying and resolving validation failures, avoiding the potentially large costs associated with failures during a single-stage, pass/fail approach. As a case study, the validation effort performed by the Federal Aviation Administration for its En Route Automation Modernization (ERAM) system is analyzed to illustrate the real-world applicability of this methodology. During this validation effort, ERAM initially failed to achieve six of its eight requirements associated with trajectory prediction and conflict probe. The ERAM validation issues have since been addressed, but to illustrate how the methodology could have benefited the FAA effort, additional techniques are presented that could have been used to resolve some of these issues. Using data from the ERAM validation effort, it is demonstrated that these new techniques could have identified trajectory prediction error sources that contributed to several of the unmet ERAM requirements.

  16. A methodology to compile food metrics related to diet sustainability into a single food database: Application to the French case.

    PubMed

    Gazan, Rozenn; Barré, Tangui; Perignon, Marlène; Maillot, Matthieu; Darmon, Nicole; Vieux, Florent

    2018-01-01

    The holistic approach required to assess diet sustainability is hindered by the lack of comprehensive databases compiling relevant food metrics. Those metrics are generally scattered across different data sources with various levels of aggregation, hampering their matching. The objective was to develop a general methodology to compile food metrics describing diet sustainability dimensions into a single database and to apply it to the French context. Each step of the methodology is detailed: identification and selection of indicators and food metrics, food list definition, food matching and value assignment. For the French case, nutrient and contaminant content, bioavailability factors, distribution of dietary intakes, portion sizes, food prices, and greenhouse gas emission, acidification and marine eutrophication estimates were allocated to 212 commonly consumed generic foods. This generic database compiling 279 metrics will allow the simultaneous evaluation of the four dimensions of diet sustainability, namely the health, economic, social and environmental dimensions. Copyright © 2016 Elsevier Ltd. All rights reserved.
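
    The food-matching and value-assignment steps amount to joining metric tables from separate sources onto one generic-food list; the toy sketch below shows such a join in pandas, with entirely hypothetical column names and values.

        # Toy sketch of the compilation step: attach metrics from separate sources to a
        # single list of generic foods. All names and numbers are hypothetical.
        import pandas as pd

        foods       = pd.DataFrame({"food_id": [1, 2], "name": ["baguette", "camembert"]})
        nutrients   = pd.DataFrame({"food_id": [1, 2], "protein_g_100g": [8.0, 20.0]})
        environment = pd.DataFrame({"food_id": [1, 2], "ghge_kgCO2eq_kg": [1.1, 5.5]})
        prices      = pd.DataFrame({"food_id": [1, 2], "price_eur_kg": [4.2, 12.9]})

        db = foods
        for metrics in (nutrients, environment, prices):
            db = db.merge(metrics, on="food_id", how="left")  # left join keeps the food list
        print(db)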

  17. A single sensor and single actuator approach to performance tailoring over a prescribed frequency band.

    PubMed

    Wang, Jiqiang

    2016-03-01

    Restricted sensing and actuation control represents an important area of research that has been overlooked in most design methodologies. In many practical control engineering problems, the design must be implemented through a single sensor and a single actuator for multivariate performance variables. In this paper, a novel approach is proposed for solving the single sensor and single actuator control problem in which performance over any prescribed frequency band can also be tailored. The results are obtained for the broad band control design based on the formulation for discrete frequency control. It is shown that the single sensor and single actuator control problem over a frequency band can be cast into a Nevanlinna-Pick interpolation problem. An optimal controller can then be obtained via convex optimization over LMIs. Remarkably, robustness issues can also be tackled in this framework. A numerical example is provided for the broad band attenuation of rotor blade vibration to illustrate the proposed design procedure. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
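
    For reference, the classical Nevanlinna-Pick solvability condition that such band-restricted designs reduce to can be stated as follows (a textbook form; the paper's exact interpolation data and scaling will differ). Given points z_1, ..., z_n in the open unit disc and target values w_1, ..., w_n with |w_i| < 1, an analytic interpolant f with ||f||_inf <= 1 and f(z_i) = w_i exists if and only if the Pick matrix is positive semidefinite:

        P = \left[ \frac{1 - w_i \overline{w_j}}{1 - z_i \overline{z_j}} \right]_{i,j=1}^{n} \succeq 0 .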

  18. Interest and limits of the six sigma methodology in medical laboratory.

    PubMed

    Scherrer, Florian; Bouilloux, Jean-Pierre; Calendini, Ors'Anton; Chamard, Didier; Cornu, François

    2017-02-01

    The mandatory accreditation of clinical laboratories in France provides an incentive to develop real tools to measure performance management methods and to optimize the management of internal quality controls. Six sigma methodology is an approach commonly applied to software quality management and discussed in numerous publications. This paper discusses the primary factors that influence the sigma index (the choice of the total allowable error, the approach used to address bias) and compares the performance of different analyzers on the basis of the sigma index. The six sigma strategy can be applied to the policy management of internal quality control in a laboratory and demonstrates, through a comparison of four analyzers, that there is no single superior analyzer in clinical chemistry. Similar sigma results are obtained using approaches to bias based on the EQAS or the IQC. The main difficulty in using the six sigma methodology lies in the absence of official guidelines for the definition of the acceptable total error. Despite this drawback, our comparison study suggests that difficulties with defined analytes do not vary with the analyzer used.
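
    The sigma index discussed here is conventionally computed from the total allowable error, the bias and the imprecision of the assay; in the usual form (all quantities in percent):

        \text{Sigma} = \frac{TE_a - \lvert \text{Bias} \rvert}{CV}

    where TE_a is the total allowable error, Bias is estimated against EQAS or IQC targets, and CV is the analytical coefficient of variation.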

  19. Developmental psycholinguistics teaches us that we need multi-method, not single-method, approaches to the study of linguistic representation.

    PubMed

    Rowland, Caroline F; Monaghan, Padraic

    2017-01-01

    In developmental psycholinguistics, we have, for many years, been generating and testing theories that propose both descriptions of adult representations and explanations of how those representations develop. We have learnt that restricting ourselves to any one methodology yields only incomplete data about the nature of linguistic representations. We argue that we need a multi-method approach to the study of representation.

  20. Proceedings of the Third International Workshop on Multistrategy Learning, May 23-25 Harpers Ferry, WV.

    DTIC Science & Technology

    1996-09-16

    The approaches are: adaptive filtering; single exponential smoothing (Brown, 1963); the Box-Jenkins methodology, i.e. ARIMA modeling (Box and Jenkins, 1976); linear exponential smoothing, i.e. Holt's two-parameter approach (Holt et al., 1960); and Winters' three-parameter method (Winters, 1960). However, there are two very crucial disadvantages: the most important point in ARIMA modeling is model identification. As shown in…

  1. On the intersection of phonetic detail and the organization of interaction: clinical connections.

    PubMed

    Walker, Gareth; Local, John

    2013-01-01

    The analysis of language use in real-world contexts poses particular methodological challenges. We codify responses to these challenges as a series of methodological imperatives. To demonstrate the relevance of these imperatives to clinical investigation, we present analyses of single episodes of interaction where one participant has a speech and/or language impairment: atypical prosody, echolalia and dysarthria. We demonstrate there is considerable heuristic and analytic value in taking this approach to analysing the organization of interaction involving individuals with a speech and/or language impairment.

  2. Solar energy program evaluation: an introduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    deLeon, P.

    The Program Evaluation Methodology provides an overview of the practice and methodology of program evaluation and defines more precisely the evaluation techniques and methodologies that would be most appropriate to government organizations which are actively involved in the research, development, and commercialization of solar energy systems. Formal evaluation cannot be treated as a single methodological approach for assessing a program. There are four basic types of evaluation designs - the pre-experimental design; the quasi-experimental design based on time series; the quasi-experimental design based on comparison groups; and the true experimental design. This report is organized to first introduce the role and issues of evaluation. This is to provide a set of issues to organize the subsequent sections detailing the national solar energy programs. Then, these two themes are integrated by examining the evaluation strategies and methodologies tailored to fit the particular needs of the various individual solar energy programs. (MCW)

  3. The Drosophila genome nexus: a population genomic resource of 623 Drosophila melanogaster genomes, including 197 from a single ancestral range population.

    PubMed

    Lack, Justin B; Cardeno, Charis M; Crepeau, Marc W; Taylor, William; Corbett-Detig, Russell B; Stevens, Kristian A; Langley, Charles H; Pool, John E

    2015-04-01

    Hundreds of wild-derived Drosophila melanogaster genomes have been published, but rigorous comparisons across data sets are precluded by differences in alignment methodology. The most common approach to reference-based genome assembly is a single round of alignment followed by quality filtering and variant detection. We evaluated variations and extensions of this approach and settled on an assembly strategy that utilizes two alignment programs and incorporates both substitutions and short indels to construct an updated reference for a second round of mapping prior to final variant detection. Utilizing this approach, we reassembled published D. melanogaster population genomic data sets and added unpublished genomes from several sub-Saharan populations. Most notably, we present aligned data from phase 3 of the Drosophila Population Genomics Project (DPGP3), which provides 197 genomes from a single ancestral range population of D. melanogaster (from Zambia). The large sample size, high genetic diversity, and potentially simpler demographic history of the DPGP3 sample will make this a highly valuable resource for fundamental population genetic research. The complete set of assemblies described here, termed the Drosophila Genome Nexus, presently comprises 623 consistently aligned genomes and is publicly available in multiple formats with supporting documentation and bioinformatic tools. This resource will greatly facilitate population genomic analysis in this model species by reducing the methodological differences between data sets. Copyright © 2015 by the Genetics Society of America.

  4. Mixed methods research design for pragmatic psychoanalytic studies.

    PubMed

    Tillman, Jane G; Clemence, A Jill; Stevens, Jennifer L

    2011-10-01

    Calls for more rigorous psychoanalytic studies have increased over the past decade. The field has been divided by those who assert that psychoanalysis is properly a hermeneutic endeavor and those who see it as a science. A comparable debate is found in research methodology, where qualitative and quantitative methods have often been seen as occupying orthogonal positions. Recently, Mixed Methods Research (MMR) has emerged as a viable "third community" of research, pursuing a pragmatic approach to research endeavors through integrating qualitative and quantitative procedures in a single study design. Mixed Methods Research designs and the terminology associated with this emerging approach are explained, after which the methodology is explored as a potential integrative approach to a psychoanalytic human science. Both qualitative and quantitative research methods are reviewed, as well as how they may be used in Mixed Methods Research to study complex human phenomena.

  5. Application of mixed-methods design in community-engaged research: Lessons learned from an evidence-based intervention for Latinos with chronic illness and minor depression.

    PubMed

    Aguado Loi, Claudia X; Alfonso, Moya L; Chan, Isabella; Anderson, Kelsey; Tyson, Dinorah Dina Martinez; Gonzales, Junius; Corvin, Jaime

    2017-08-01

    The purpose of this paper is to share lessons learned from a collaborative, community-informed mixed-methods approach to adapting an evidence-based intervention to meet the needs of Latinos with chronic disease and minor depression and their family members. Mixed methods informed by community-based participatory research (CBPR) were employed to triangulate multiple stakeholders' perceptions of facilitators and barriers of implementing the adapted intervention in community settings. Community partners provided an insider perspective to overcome methodological challenges. The study's community-informed mixed-methods research approach offered advantages over a single research methodology by expanding or confirming research findings and engaging multiple stakeholders in data collection. This approach also allowed community partners to collaborate with academic partners in key research decisions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Immunity to Transformational Learning and Change

    ERIC Educational Resources Information Center

    Bochman, David J.; Kroth, Michael

    2010-01-01

    Purpose: The purpose of this paper is to examine and synthesize Argyris and Schon's Theory of Action and Kegan and Lahey's theory of Immunity to Change in order to produce an integrated model. Design/methodology/approach: Literature discussing Argyris and Schon's Theory of Action (Model I and Model II), single and double-loop learning, espoused…

  7. Comparative Toxicity of Eight Oil Dispersants, Louisiana Sweet Crude Oil (LSC) and Chemically Dispersed LSC to Two Aquatic Test Species

    EPA Science Inventory

    This study describes the acute toxicity of eight commercial oil dispersants, Louisiana sweet crude oil (LSC), and chemically dispersed LSC. The approach utilized consistent test methodologies within a single laboratory in assessing the relative acute toxicity of the eight dispers...

  8. Creating a Sustainable University and Community through a Common Experience

    ERIC Educational Resources Information Center

    Lopez, Omar S.

    2013-01-01

    Purpose: This article aims to provide an overview of Texas State University's Common Experience, an innovative initiative that engaged tens of thousands of people in shared consideration of sustainability as a single topic during academic year 2010-2011. Design/methodology/approach: The discourse begins with an overview of the Common Experience…

  9. Evaluation of Scale Reliability with Binary Measures Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Dimitrov, Dimiter M.; Asparouhov, Tihomir

    2010-01-01

    A method for interval estimation of scale reliability with discrete data is outlined. The approach is applicable with multi-item instruments consisting of binary measures, and is developed within the latent variable modeling methodology. The procedure is useful for evaluation of consistency of single measures and of sum scores from item sets…

  10. Strategically Focused Training in Six Sigma Way: A Case Study

    ERIC Educational Resources Information Center

    Pandey, Ashish

    2007-01-01

    Purpose: The purpose of the current study is to examine the utility of Six Sigma interventions as a performance measure and explore its applicability for making the training design and delivery operationally efficient and strategically effective. Design/methodology/approach: This is a single revelatory case study. Data were collected from multiple…

  11. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
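
    In broad strokes, the nested (double-loop) RBDO problem that the unilevel and decoupled methodologies reformulate can be written as follows; the notation is generic rather than the dissertation's own:

        \min_{\mathbf{d}} \; f(\mathbf{d})
        \quad \text{subject to} \quad
        P\big[\, g_i(\mathbf{d}, \mathbf{X}) \le 0 \,\big] \le P_{f,i}^{\text{target}}, \qquad i = 1, \dots, m,

    where d collects the design variables and X the random variables; each probabilistic constraint is usually evaluated in an inner reliability loop (e.g. FORM, enforcing a reliability index beta_i >= beta_i^target), and a unilevel formulation replaces that inner loop by its first-order optimality conditions so that a single optimization problem remains.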

  12. A methodology to determine the elastic moduli of crystals by matching experimental and simulated lattice strain pole figures using discrete harmonics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wielewski, Euan; Boyce, Donald E.; Park, Jun-Sang

    Determining reliable single crystal material parameters for complex polycrystalline materials is a significant challenge for the materials community. In this work, a novel methodology for determining those parameters is outlined and successfully applied to the titanium alloy, Ti-6Al-4V. Utilizing the results from a lattice strain pole figure experiment conducted at the Cornell High Energy Synchrotron Source, an iterative approach is used to optimize the single crystal elastic moduli by comparing experimental and simulated lattice strain pole figures at discrete load steps during a uniaxial tensile test. Due to the large number of unique measurements taken during the experiments, comparisons were made by using the discrete spherical harmonic modes of both the experimental and simulated lattice strain pole figures, allowing the complete pole figures to be used to determine the single crystal elastic moduli. (C) 2016 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  13. Ordered macro-microporous metal-organic framework single crystals

    NASA Astrophysics Data System (ADS)

    Shen, Kui; Zhang, Lei; Chen, Xiaodong; Liu, Lingmei; Zhang, Daliang; Han, Yu; Chen, Junying; Long, Jilan; Luque, Rafael; Li, Yingwei; Chen, Banglin

    2018-01-01

    We constructed highly oriented and ordered macropores within metal-organic framework (MOF) single crystals, opening up the area of three-dimensional–ordered macro-microporous materials (that is, materials containing both macro- and micropores) in single-crystalline form. Our methodology relies on the strong shaping effects of a polystyrene nanosphere monolith template and a double-solvent–induced heterogeneous nucleation approach. This process synergistically enabled the in situ growth of MOFs within ordered voids, rendering a single crystal with oriented and ordered macro-microporous structure. The improved mass diffusion properties of such hierarchical frameworks, together with their robust single-crystalline nature, endow them with superior catalytic activity and recyclability for bulky-molecule reactions, as compared with conventional, polycrystalline hollow, and disordered macroporous ZIF-8.

  14. Interdisciplinary mixed methods research with structurally vulnerable populations: case studies of injection drug users in San Francisco.

    PubMed

    Lopez, Andrea M; Bourgois, Philippe; Wenger, Lynn D; Lorvick, Jennifer; Martinez, Alexis N; Kral, Alex H

    2013-03-01

    Research with injection drug users (IDUs) benefits from interdisciplinary theoretical and methodological innovation because drug use is illegal, socially sanctioned and often hidden. Despite the increasing visibility of interdisciplinary, mixed methods research projects with IDUs, qualitative components are often subordinated to quantitative approaches and page restrictions in top addiction journals limit detailed reports of complex data collection and analysis logistics, thus minimizing the fuller scientific potential of genuine mixed methods. We present the methodological logistics and conceptual approaches of four mixed-methods research projects that our interdisciplinary team conducted in San Francisco with IDUs over the past two decades. These projects include combinations of participant-observation ethnography, in-depth qualitative interviewing, epidemiological surveys, photo-documentation, and geographic mapping. We adapted Greene et al.'s framework for combining methods in a single research project through: data triangulation, methodological complementarity, methodological initiation, and methodological expansion. We argue that: (1) flexible and self-reflexive methodological procedures allowed us to seize strategic opportunities to document unexpected and sometimes contradictory findings as they emerged to generate new research questions, (2) iteratively mixing methods increased the scope, reliability, and generalizability of our data, and (3) interdisciplinary collaboration contributed to a scientific "value added" that allowed for more robust theoretical and practical findings about drug use and risk-taking. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Interdisciplinary Mixed Methods Research with Structurally Vulnerable Populations: Case Studies of Injection Drug Users in San Francisco

    PubMed Central

    Lopez, Andrea; Bourgois, Philippe; Wenger, Lynn; Lorvick, Jennifer; Martinez, Alexis; Kral, Alex H.

    2013-01-01

    Research with injection drug users (IDUs) benefits from interdisciplinary theoretical and methodological innovation because drug use is illegal, socially sanctioned and often hidden. Despite the increasing visibility of interdisciplinary, mixed methods research projects with IDUs, qualitative components are often subordinated to quantitative approaches and page restrictions in top addiction journals limit detailed reports of complex data collection and analysis logistics, thus minimizing the fuller scientific potential of genuine mixed methods. We present the methodological logistics and conceptual approaches of four mixed-methods research projects that our interdisciplinary team conducted in San Francisco with IDUs over the past two decades. These projects include combinations of participant-observation ethnography, in-depth qualitative interviewing, epidemiological surveys, photo-documentation, and geographic mapping. We adapted Greene et al.’s framework for combining methods in a single research project through: data triangulation, methodological complementarity, methodological initiation, and methodological expansion. We argue that: (1) flexible and self-reflexive methodological procedures allowed us to seize strategic opportunities to document unexpected and sometimes contradictory findings as they emerged to generate new research questions, (2) iteratively mixing methods increased the scope, reliability, and generalizability of our data, and (3) interdisciplinary collaboration contributed to a scientific “value added” that allowed for more robust theoretical and practical findings about drug use and risk-taking. PMID:23312109

  16. Interrater reliability levels of multiple clinical examiners in the evaluation of a schizophrenic patient: quality of life, level of functioning, and neuropsychological symptomatology.

    PubMed

    Cicchetti, D V; Rosenheck, R; Showalter, D; Charney, D; Cramer, J

    1999-05-01

    Sir Ronald Fisher used a single-subject design to derive the concepts of appropriate research design, randomization, sensitivity, and tests of statistical significance. The seminal work of Broca demonstrated that valid and generalizable findings can and have emerged from studies of a single patient in neuropsychology. In order to assess the reliability and/or validity of any clinical phenomena that derive from single subject research, it becomes necessary to apply appropriate biostatistical methodology. The authors develop just such an approach and apply it successfully to the evaluation of the functioning, quality of life, and neuropsychological symptomatology of a single schizophrenic patient.

  17. Symposium on single cell analysis and genomic approaches, Experimental Biology 2017 Chicago, Illinois, April 23, 2017.

    PubMed

    Coller, Hilary A

    2017-09-01

    Emerging technologies for the analysis of genome-wide information in single cells have the potential to transform many fields of biology, including our understanding of cell states, the response of cells to external stimuli, mosaicism, and intratumor heterogeneity. At Experimental Biology 2017 in Chicago, Physiological Genomics hosted a symposium in which five leaders in the field of single cell genomics presented their recent research. The speakers discussed emerging methodologies in single cell analysis and critical issues for the analysis of single cell data. Also discussed were applications of single cell genomics to understanding the different types of cells within an organism or tissue and the basis for cell-to-cell variability in response to stimuli. Copyright © 2017 the American Physiological Society.

  18. Using mixed methods in health research

    PubMed Central

    Woodman, Jenny

    2013-01-01

    Summary Mixed methods research is the use of quantitative and qualitative methods in a single study or series of studies. It is an emergent methodology which is increasingly used by health researchers, especially within health services research. There is a growing literature on the theory, design and critical appraisal of mixed methods research. However, there are few papers that summarize this methodological approach for health practitioners who wish to conduct or critically engage with mixed methods studies. The objective of this paper is to provide an accessible introduction to mixed methods for clinicians and researchers unfamiliar with this approach. We present a synthesis of key methodological literature on mixed methods research, with examples from our own work and that of others, to illustrate the practical applications of this approach within health research. We summarize definitions of mixed methods research, the value of this approach, key aspects of study design and analysis, and discuss the potential challenges of combining quantitative and qualitative methods and data. One of the key challenges within mixed methods research is the successful integration of quantitative and qualitative data during analysis and interpretation. However, the integration of different types of data can generate insights into a research question, resulting in enriched understanding of complex health research problems. PMID:23885291

  19. Learning and Growing: Trust, Leadership, and Response to Crisis

    ERIC Educational Resources Information Center

    Sutherland, Ian E.

    2017-01-01

    Purpose: The purpose of this paper is to explore the nature of trust in a school community related to the leadership response to crisis. Design/Methodology/Approach: This study was a multiple-source qualitative study of a single case of a PreK-12 international school called The Learning School. Findings: The findings revealed the nature of how…

  20. Integrated design of the CSI evolutionary structure: A verification of the design methodology

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Joshi, S. M.; Elliott, Kenny B.; Walz, J. E.

    1993-01-01

    One of the main objectives of the Controls-Structures Interaction (CSI) program is to develop and evaluate integrated controls-structures design methodology for flexible space structures. Thus far, integrated design methodologies for a class of flexible spacecraft, which require fine attitude pointing and vibration suppression with no payload articulation, have been extensively investigated. Various integrated design optimization approaches, such as single-objective optimization, and multi-objective optimization, have been implemented with an array of different objectives and constraints involving performance and cost measures such as total mass, actuator mass, steady-state pointing performance, transient performance, control power, and many more. These studies have been performed using an integrated design software tool (CSI-DESIGN CODE) which is under development by the CSI-ADM team at the NASA Langley Research Center. To date, all of these studies, irrespective of the type of integrated optimization posed or objectives and constraints used, have indicated that integrated controls-structures design results in an overall spacecraft design which is considerably superior to designs obtained through a conventional sequential approach. Consequently, it is believed that validation of some of these results through fabrication and testing of a structure which is designed through an integrated design approach is warranted. The objective of this paper is to present and discuss the efforts that have been taken thus far for the validation of the integrated design methodology.

  1. Single-particle mineralogy of Chinese soil particles by the combined use of low-Z particle electron probe X-ray microanalysis and attenuated total reflectance-FT-IR imaging techniques.

    PubMed

    Malek, Md Abdul; Kim, Bowha; Jung, Hae-Jin; Song, Young-Chul; Ro, Chul-Un

    2011-10-15

    Our previous work on the speciation of individual mineral particles of micrometer size by the combined use of attenuated total reflectance FT-IR (ATR-FT-IR) imaging and a quantitative energy-dispersive electron probe X-ray microanalysis technique (EPMA), low-Z particle EPMA, demonstrated that the combined use of these two techniques is a powerful approach for looking at the single-particle mineralogy of externally heterogeneous minerals. In this work, this analytical methodology was applied to characterize six soil samples collected at arid areas in China, in order to identify mineral types present in the samples. The six soil samples were collected from two types of soil, i.e., loess and desert soils, for which overall 665 particles were analyzed on a single particle basis. The six soil samples have different mineralogical characteristics, which were clearly differentiated in this work. As this analytical methodology provides complementary information, the ATR-FT-IR imaging on mineral types, and low-Z particle EPMA on the morphology and elemental concentrations, on the same individual particles, more detailed information can be obtained using this approach than when either low-Z particle EPMA or ATR-FT-IR imaging techniques are used alone, which has a great potential for the characterization of Asian dust and mineral dust particles. © 2011 American Chemical Society

  2. Multi-electrode array technologies for neuroscience and cardiology

    NASA Astrophysics Data System (ADS)

    Spira, Micha E.; Hai, Aviad

    2013-02-01

    At present, the prime methodology for studying neuronal circuit-connectivity, physiology and pathology under in vitro or in vivo conditions is by using substrate-integrated microelectrode arrays. Although this methodology permits simultaneous, cell-non-invasive, long-term recordings of extracellular field potentials generated by action potentials, it is 'blind' to subthreshold synaptic potentials generated by single cells. On the other hand, intracellular recordings of the full electrophysiological repertoire (subthreshold synaptic potentials, membrane oscillations and action potentials) are, at present, obtained only by sharp or patch microelectrodes. These, however, are limited to single cells at a time and for short durations. Recently a number of laboratories began to merge the advantages of extracellular microelectrode arrays and intracellular microelectrodes. This Review describes the novel approaches, identifying their strengths and limitations from the point of view of the end users -- with the intention to help steer the bioengineering efforts towards the needs of brain-circuit research.

  3. Multi-electrode array technologies for neuroscience and cardiology.

    PubMed

    Spira, Micha E; Hai, Aviad

    2013-02-01

    At present, the prime methodology for studying neuronal circuit-connectivity, physiology and pathology under in vitro or in vivo conditions is by using substrate-integrated microelectrode arrays. Although this methodology permits simultaneous, cell-non-invasive, long-term recordings of extracellular field potentials generated by action potentials, it is 'blind' to subthreshold synaptic potentials generated by single cells. On the other hand, intracellular recordings of the full electrophysiological repertoire (subthreshold synaptic potentials, membrane oscillations and action potentials) are, at present, obtained only by sharp or patch microelectrodes. These, however, are limited to single cells at a time and for short durations. Recently a number of laboratories began to merge the advantages of extracellular microelectrode arrays and intracellular microelectrodes. This Review describes the novel approaches, identifying their strengths and limitations from the point of view of the end users--with the intention to help steer the bioengineering efforts towards the needs of brain-circuit research.

  4. Methodological challenges of optical tweezers-based X-ray fluorescence imaging of biological model organisms at synchrotron facilities.

    PubMed

    Vergucht, Eva; Brans, Toon; Beunis, Filip; Garrevoet, Jan; Bauters, Stephen; De Rijcke, Maarten; Deruytter, David; Janssen, Colin; Riekel, Christian; Burghammer, Manfred; Vincze, Laszlo

    2015-07-01

    Recently, a radically new synchrotron radiation-based elemental imaging approach for the analysis of biological model organisms and single cells in their natural in vivo state was introduced. The methodology combines optical tweezers (OT) technology for non-contact laser-based sample manipulation with synchrotron radiation confocal X-ray fluorescence (XRF) microimaging for the first time at ESRF-ID13. The optical manipulation possibilities and limitations of biological model organisms, the OT setup developments for XRF imaging and the confocal XRF-related challenges are reported. In general, the applicability of the OT-based setup is extended with the aim of introducing the OT XRF methodology in all research fields where highly sensitive in vivo multi-elemental analysis is of relevance at the (sub)micrometre spatial resolution level.

  5. Organisational Learning and HRD: How Appropriate Are They for Small Firms?

    ERIC Educational Resources Information Center

    Saru, Essi

    2007-01-01

    Purpose: The purpose of this paper is to study human resource development (HRD) and organisational learning issues in a small expert organisation. Design/methodology/approach: This is a qualitative single case study conducted in one Finnish SME. It is part of an ongoing study. It is descriptive in nature and the aim is to find out whether the…

  6. "Let's Talk about Drugs": Pilot Study of a Community-Level Drug Prevention Intervention Based on Motivational Interviewing Principles

    ERIC Educational Resources Information Center

    Newbery, Natasha; McCambridge, Jim; Strang, John

    2007-01-01

    Purpose: The feasibility of a community-level drug prevention intervention based upon the principles of motivational interviewing within a further education college was investigated in a pilot study. Design/methodology/approach: The implementation over the course of a single term of "Let's Talk about Drugs" was studied with both action…

  7. The Use of AJAX in Searching a Bibliographic Database: A Case Study of the Italian Biblioteche Oggi Database

    ERIC Educational Resources Information Center

    Cavaleri, Piero

    2008-01-01

    Purpose: The purpose of this paper is to describe the use of AJAX for searching the Biblioteche Oggi database of bibliographic records. Design/methodology/approach: The paper is a demonstration of how bibliographic database single page interfaces allow the implementation of more user-friendly features for social and collaborative tasks. Findings:…

  8. CT radiation profile width measurement using CR imaging plate raw data

    PubMed Central

    Yang, Chang‐Ying Joseph

    2015-01-01

    This technical note demonstrates computed tomography (CT) radiation profile measurement using computed radiography (CR) imaging plate raw data, showing that it is possible to perform the CT collimation width measurement using a single scan without saturating the imaging plate. Previously described methods require careful adjustments to the CR reader settings in order to avoid signal clipping in the CR processed image. CT radiation profile measurements were taken as part of routine quality control on 14 CT scanners from four vendors. CR cassettes were placed on the CT scanner bed, raised to isocenter, and leveled. Axial scans were taken at all available collimations, advancing the cassette for each scan. The CR plates were processed and raw CR data were analyzed using MATLAB scripts to measure collimation widths. The raw data approach was compared with previously established methodology. The quality control analysis scripts are released as open source using creative commons licensing. A log‐linear relationship was found between raw pixel value and air kerma, and raw data collimation width measurements were in agreement with CR‐processed, bit‐reduced data, using previously described methodology. The raw data approach, with intrinsically wider dynamic range, allows improved measurement flexibility and precision. As a result, we demonstrate a methodology for CT collimation width measurements using a single CT scan and without the need for CR scanning parameter adjustments, which is more convenient for routine quality control work. PACS numbers: 87.57.Q‐, 87.59.bd, 87.57.uq PMID:26699559
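
    A minimal sketch of the analysis described, converting raw CR pixel values to relative air kerma through a log-linear response and then measuring the profile width, is given below; the function, calibration constants and synthetic data are illustrative, not the released MATLAB scripts.

        # Illustrative sketch: raw CR pixel values -> relative air kerma via an assumed
        # log-linear response, then CT radiation profile width as a full width at half maximum.
        import numpy as np

        def profile_width_mm(raw_profile, pixel_pitch_mm, slope, intercept):
            kerma = 10.0 ** (slope * raw_profile + intercept)   # assumed log-linear response
            kerma -= np.median(kerma[:50])                      # crude background subtraction
            above = np.flatnonzero(kerma >= kerma.max() / 2.0)
            return (above[-1] - above[0] + 1) * pixel_pitch_mm  # FWHM in millimetres

        # Fabricated bell-shaped radiation profile sampled at 0.1 mm pitch.
        x = np.arange(0.0, 60.0, 0.1)
        raw = 2.0 + 1.5 * np.exp(-(((x - 30.0) / 6.0) ** 4))
        print(round(profile_width_mm(raw, 0.1, slope=1.0, intercept=0.0), 1), "mm")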

  9. Single-Cell Mass Spectrometry Reveals Changes in Lipid and Metabolite Expression in RAW 264.7 Cells upon Lipopolysaccharide Stimulation

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Patterson, Nathan Heath; Tsui, Tina; Caprioli, Richard M.; Norris, Jeremy L.

    2018-05-01

    It has been widely recognized that individual cells that exist within a large population of cells, even if they are genetically identical, can have divergent molecular makeups resulting from a variety of factors, including local environmental factors and stochastic processes within each cell. Presently, numerous approaches have been described that permit the resolution of these single-cell expression differences for RNA and protein; however, relatively few techniques exist for the study of lipids and metabolites in this manner. This study presents a methodology for the analysis of metabolite and lipid expression at the level of a single cell through the use of imaging mass spectrometry on a high-performance Fourier transform ion cyclotron resonance mass spectrometer. This report provides a detailed description of the overall experimental approach, including sample preparation as well as the data acquisition and analysis strategy for single cells. Applying this approach to the study of cultured RAW264.7 cells, we demonstrate that this method can be used to study the variation in molecular expression within cell populations and is sensitive to alterations in that expression that occur upon lipopolysaccharide stimulation.

  10. Single-Cell Mass Spectrometry Reveals Changes in Lipid and Metabolite Expression in RAW 264.7 Cells upon Lipopolysaccharide Stimulation

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Patterson, Nathan Heath; Tsui, Tina; Caprioli, Richard M.; Norris, Jeremy L.

    2018-03-01

    It has been widely recognized that individual cells that exist within a large population of cells, even if they are genetically identical, can have divergent molecular makeups resulting from a variety of factors, including local environmental factors and stochastic processes within each cell. Presently, numerous approaches have been described that permit the resolution of these single-cell expression differences for RNA and protein; however, relatively few techniques exist for the study of lipids and metabolites in this manner. This study presents a methodology for the analysis of metabolite and lipid expression at the level of a single cell through the use of imaging mass spectrometry on a high-performance Fourier transform ion cyclotron resonance mass spectrometer. This report provides a detailed description of the overall experimental approach, including sample preparation as well as the data acquisition and analysis strategy for single cells. Applying this approach to the study of cultured RAW264.7 cells, we demonstrate that this method can be used to study the variation in molecular expression within cell populations and is sensitive to alterations in that expression that occur upon lipopolysaccharide stimulation.

  11. Cumulative risk and developmental health: an argument for the importance of a family-wide science.

    PubMed

    Browne, Dillon T; Plamondon, Andre; Prime, Heather; Puente-Duran, Sofia; Wade, Mark

    2015-01-01

    A substantial body of research links social disadvantage and developmental health via a cascade running from poverty, to cumulative psychosocial risk, to disrupted family dynamics, to child biological regulatory systems and neurocognitive processing, and finally to morbidity across the lifespan. Most research in this area employs single-dyad or between-family methodology. While informative, there are limitations to this approach. Specifically, it is impossible to determine how risk alters psychosocial environments that are similar for all persons within a household, versus processes that are unique to particular children. This is important in light of literature citing the primacy of child-specific environments in driving developmental health. Methodologically speaking, there are both benefits and challenges to family-wide approaches that differentiate between- and within-family environments. This review describes literature linking cumulative risk and developmental health via family process, while articulating the importance of family-wide approaches. Areas of shortcoming and recommendations for a family-wide science are provided. © 2015 John Wiley & Sons, Ltd.

  12. Validation of high-throughput single cell analysis methodology.

    PubMed

    Devonshire, Alison S; Baradez, Marc-Olivier; Morley, Gary; Marshall, Damian; Foy, Carole A

    2014-05-01

    High-throughput quantitative polymerase chain reaction (qPCR) approaches enable profiling of multiple genes in single cells, bringing new insights to complex biological processes and offering opportunities for single cell-based monitoring of cancer cells and stem cell-based therapies. However, workflows with well-defined sources of variation are required for clinical diagnostics and testing of tissue-engineered products. In a study of neural stem cell lines, we investigated the performance of lysis, reverse transcription (RT), preamplification (PA), and nanofluidic qPCR steps at the single cell level in terms of efficiency, precision, and limit of detection. We compared protocols using a separate lysis buffer with cell capture directly in RT-PA reagent. The two methods were found to have similar lysis efficiencies, whereas the direct RT-PA approach showed improved precision. Digital PCR was used to relate preamplified template copy numbers to Cq values and reveal where low-quality signals may affect the analysis. We investigated the impact of calibration and data normalization strategies as a means of minimizing the impact of inter-experimental variation on gene expression values and found that both approaches can improve data comparability. This study provides validation and guidance for the application of high-throughput qPCR workflows for gene expression profiling of single cells. Copyright © 2014 Elsevier Inc. All rights reserved.
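
    One standard way to relate Cq values to template copy number, in the spirit of the digital-PCR calibration described above, is a log-linear standard curve; the sketch below, with made-up data, estimates the slope, amplification efficiency and copy number at a given Cq.

        # Illustrative sketch: log-linear standard curve relating Cq to known copy numbers
        # (e.g. digital-PCR-derived), with efficiency from the slope. Data are made up.
        import numpy as np

        copies = np.array([10.0, 100.0, 1_000.0, 10_000.0, 100_000.0])
        cq     = np.array([33.2, 29.9, 26.5, 23.1, 19.8])

        slope, intercept = np.polyfit(np.log10(copies), cq, 1)
        efficiency = 10.0 ** (-1.0 / slope) - 1.0      # 1.0 corresponds to 100% efficiency

        def copies_from_cq(c):
            return 10.0 ** ((c - intercept) / slope)

        print(f"slope={slope:.2f}  efficiency={efficiency:.0%}  copies at Cq=25: {copies_from_cq(25):.0f}")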

  13. Human genetic mapping studies using single sperm typing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hubert, R.S.

    1993-01-01

    Sperm typing is a powerful technique that uses the polymerase chain reaction (PCR) to analyze DNA sequences within single sperm cells in order to construct genetic maps. This methodology was used to estimate the recombination fraction between D3S2 and D3S2, which was found to be 0.28 (95% CI = 0.20-0.36). Pedigree analysis was unable to determine the genetic distance between these two markers due to their low informativeness. We also showed that dinucleotide and tetranucleotide repeat polymorphisms can be analyzed in single cells without using radioactivity or denaturing gels. This provides a rich new source of DNA polymorphisms for genetic mapping by sperm typing. In addition, an approach that uses the sperm typing methodology is described that can define the physical boundaries of meiotic recombination hotspots. The hotspot at 4p16.3 near the Huntington disease gene was localized to an interval between D4S10 and D4S126. These studies demonstrated the usefulness of sperm typing as a tool for the study of human genetics.
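
    In single-sperm typing, the recombination fraction and its confidence interval are obtained by counting recombinant sperm among the informative sperm typed; the sketch below uses hypothetical counts, not the study's data.

        # Illustrative sketch: recombination fraction from single-sperm typing counts with a
        # normal-approximation 95% confidence interval. Counts are hypothetical.
        import math

        def recombination_fraction(n_recombinant, n_informative, z=1.96):
            theta = n_recombinant / n_informative
            se = math.sqrt(theta * (1.0 - theta) / n_informative)
            return theta, (theta - z * se, theta + z * se)

        theta, ci = recombination_fraction(34, 120)
        print(f"theta = {theta:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")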

  14. Determining radiated sound power of building structures by means of laser Doppler vibrometry

    NASA Astrophysics Data System (ADS)

    Roozen, N. B.; Labelle, L.; Rychtáriková, M.; Glorieux, C.

    2015-06-01

    This paper introduces a methodology that makes use of laser Doppler vibrometry to assess the acoustic insulation performance of a building element. The sound power radiated by the surface of the element is numerically determined from the vibrational pattern, offering an alternative for classical microphone measurements. Compared to the latter the proposed analysis is not sensitive to room acoustical effects. This allows the proposed methodology to be used at low frequencies, where the standardized microphone based approach suffers from a high uncertainty due to a low acoustic modal density. Standardized measurements as well as laser Doppler vibrometry measurements and computations have been performed on two test panels, a light-weight wall and a gypsum block wall and are compared and discussed in this paper. The proposed methodology offers an adequate solution for the assessment of the acoustic insulation of building elements at low frequencies. This is crucial in the framework of recent proposals of acoustic standards for measurement approaches and single number sound insulation performance ratings to take into account frequencies down to 50 Hz.
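
    One common numerical route from the measured vibration pattern to radiated sound power, consistent with the kind of computation described here though not necessarily the authors' exact discretization, is the discrete Rayleigh integral over elementary radiators on a baffled surface:

        p(\mathbf{r}) \approx \frac{j \omega \rho_0}{2\pi} \sum_{n=1}^{N} v_n \, \frac{e^{-j k R_n}}{R_n} \, \Delta S_n ,
        \qquad
        W \approx \tfrac{1}{2} \, \operatorname{Re} \sum_{n=1}^{N} p(\mathbf{r}_n) \, v_n^{*} \, \Delta S_n ,

    where v_n are the complex normal surface velocities measured by the laser Doppler vibrometer on surface elements of area Delta S_n, R_n is the distance from element n to the field point, k the acoustic wavenumber and rho_0 the density of air.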

  15. Mixed-Methods Design in Biology Education Research: Approach and Uses

    PubMed Central

    Warfa, Abdi-Rizak M.

    2016-01-01

    Educational research often requires mixing different research methodologies to strengthen findings, better contextualize or explain results, or minimize the weaknesses of a single method. This article provides practical guidelines on how to conduct such research in biology education, with a focus on mixed-methods research (MMR) that uses both quantitative and qualitative inquiries. Specifically, the paper provides an overview of mixed-methods design typologies most relevant in biology education research. It also discusses common methodological issues that may arise in mixed-methods studies and ways to address them. The paper concludes with recommendations on how to report and write about MMR. PMID:27856556

  16. Use of Invariant Manifolds for Transfers Between Three-Body Systems

    NASA Technical Reports Server (NTRS)

    Beckman, Mark; Howell, Kathleen

    2003-01-01

    The Lunar L1 and L2 libration points have been proposed as gateways granting inexpensive access to interplanetary space. To date, only individual solutions to the transfer between three-body systems have been found. The methodology to solve the problem for arbitrary three-body systems and entire families of orbits does not exist. This paper presents the initial approaches to solve the general problem for single and multiple impulse transfers. Two different methods of representing and storing 7-dimensional invariant manifold data are presented. Some particular solutions are presented for the transfer problem, though the emphasis is on developing methodology for solving the general problem.

  17. Representations of Invariant Manifolds for Applications in Three-Body Systems

    NASA Technical Reports Server (NTRS)

    Howell, K.; Beckman, M.; Patterson, C.; Folta, D.

    2004-01-01

    The Lunar L1 and L2 libration points have been proposed as gateways granting inexpensive access to interplanetary space. To date, only individual solutions to the transfer between three-body systems have been found. The methodology to solve the problem for arbitrary three-body systems and entire families of orbits is currently being studied. This paper presents an initial approach to solve the general problem for single and multiple impulse transfers. Two different methods of representing and storing the invariant manifold data are presented. Some particular solutions are presented for two types of transfer problems, though the emphasis is on developing the methodology for solving the general problem.

  18. DESIGN METHODOLOGIES AND TOOLS FOR SINGLE-FLUX QUANTUM LOGIC CIRCUITS

    DTIC Science & Technology

    2017-10-01

    DESIGN METHODOLOGIES AND TOOLS FOR SINGLE-FLUX QUANTUM LOGIC CIRCUITS, University of Southern California, October 2017, final technical report (contract FA8750-15-C-0203). The goal of this project was to investigate the state-of-the-art in design and optimization of single-flux quantum (SFQ) logic circuits, e.g., RSFQ and ERSFQ

  19. Single-molecule Force Spectroscopy Approach to Enzyme Catalysis*

    PubMed Central

    Alegre-Cebollada, Jorge; Perez-Jimenez, Raul; Kosuri, Pallav; Fernandez, Julio M.

    2010-01-01

    Enzyme catalysis has been traditionally studied using a diverse set of techniques such as bulk biochemistry, x-ray crystallography, and NMR. Recently, single-molecule force spectroscopy by atomic force microscopy has been used as a new tool to study the catalytic properties of an enzyme. In this approach, a mechanical force ranging up to hundreds of piconewtons is applied to the substrate of an enzymatic reaction, altering the conformational energy of the substrate-enzyme interactions during catalysis. From these measurements, the force dependence of an enzymatic reaction can be determined. The force dependence provides valuable new information about the dynamics of enzyme catalysis with sub-angstrom resolution, a feat unmatched by any other current technique. To date, single-molecule force spectroscopy has been applied to gain insight into the reduction of disulfide bonds by different enzymes of the thioredoxin family. This minireview aims to present a perspective on this new approach to study enzyme catalysis and to summarize the results that have already been obtained from it. Finally, the specific requirements that must be fulfilled to apply this new methodology to any other enzyme will be discussed. PMID:20382731
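    The review abstract does not commit to a specific functional form, but force-dependent rates of this kind are commonly summarized with a Bell-type (Arrhenius-like) expression, in which the applied force F tilts the energy landscape along the pulling coordinate by a distance Δx to the transition state (a standard model in the field, stated here as context rather than as a claim of the paper):

$$ r(F) = r_0 \exp\!\left(\frac{F\,\Delta x}{k_B T}\right) $$

    Fitting measured rates against force then yields r_0 and Δx, the quantities that carry the sub-angstrom information on the transition state referred to above.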

  20. Single-molecule force spectroscopy approach to enzyme catalysis.

    PubMed

    Alegre-Cebollada, Jorge; Perez-Jimenez, Raul; Kosuri, Pallav; Fernandez, Julio M

    2010-06-18

    Enzyme catalysis has been traditionally studied using a diverse set of techniques such as bulk biochemistry, x-ray crystallography, and NMR. Recently, single-molecule force spectroscopy by atomic force microscopy has been used as a new tool to study the catalytic properties of an enzyme. In this approach, a mechanical force ranging up to hundreds of piconewtons is applied to the substrate of an enzymatic reaction, altering the conformational energy of the substrate-enzyme interactions during catalysis. From these measurements, the force dependence of an enzymatic reaction can be determined. The force dependence provides valuable new information about the dynamics of enzyme catalysis with sub-angstrom resolution, a feat unmatched by any other current technique. To date, single-molecule force spectroscopy has been applied to gain insight into the reduction of disulfide bonds by different enzymes of the thioredoxin family. This minireview aims to present a perspective on this new approach to study enzyme catalysis and to summarize the results that have already been obtained from it. Finally, the specific requirements that must be fulfilled to apply this new methodology to any other enzyme will be discussed.

  1. Single crystal diamond membranes for nanoelectronics.

    PubMed

    Bray, Kerem; Kato, Hiromitsu; Previdi, Rodolfo; Sandstrom, Russell; Ganesan, Kumaravelu; Ogura, Masahiko; Makino, Toshiharu; Yamasaki, Satoshi; Magyar, Andrew P; Toth, Milos; Aharonovich, Igor

    2018-02-22

    Single crystal, nanoscale diamond membranes are highly sought after for a variety of applications including nanophotonics, nanoelectronics and quantum information science. However, so far, the availability of conductive diamond membranes has remained an unreachable goal. In this work we present a complete nanofabrication methodology for engineering high aspect ratio, electrically active single crystal diamond membranes. The membranes have large lateral dimensions, exceeding ∼500 × 500 μm², and are only several hundreds of nanometers thick. We further realize vertical single crystal p-n junctions made from the diamond membranes that exhibit onset voltages of ∼10 V and a current of several mA. Moreover, we deterministically introduce optically active color centers into the membranes, and demonstrate for the first time a single crystal nanoscale diamond LED. The robust and scalable approach to engineering the electrically active single crystal diamond membranes offers new pathways for advanced nanophotonic, nanoelectronic and optomechanical devices employing diamond.

  2. The Mayo Clinic Value Creation System.

    PubMed

    Swensen, Stephen J; Dilling, James A; Harper, C Michel; Noseworthy, John H

    2012-01-01

    The authors present Mayo Clinic's Value Creation System, a coherent systems engineering approach to delivering a single high-value practice. There are 4 tightly linked, interdependent phases of the system: alignment, discovery, managed diffusion, and measurement. The methodology is described and examples of the results to date are presented. The Value Creation System has been demonstrated to improve the quality of patient care while reducing costs and increasing productivity.

  3. ENVIRONMENTAL PROTECTION: Overcoming Obstacles to Innovative State Regulatory Programs

    DTIC Science & Technology

    2002-01-01

    proposals. For example: • The New Hampshire Department of Environmental Services sought flexibility under federal regulations for a single pulp and paper ... Virginia. The facility receives industrial wastewater from a variety of manufacturers, including makers of pulp and paper, organic chemicals, and ...

  4. Evolving African Attitudes to European Education: Resistance, Pervert Effects of the Single System Paradox, and the "Ubuntu" Framework for Renewal

    ERIC Educational Resources Information Center

    Assié-Lumumba, N'Dri Thérèse

    2016-01-01

    This paper is a reflection that critically examines the dynamics of education and the struggle by African people for freedom, control of the mind, self-definition and the right to determine their own destiny from the start of colonial rule to the present. The primary methodological approach is historical structuralism, which stipulates that social…

  5. CNN based approach for activity recognition using a wrist-worn accelerometer.

    PubMed

    Panwar, Madhuri; Dyuthi, S Ram; Chandra Prakash, K; Biswas, Dwaipayan; Acharyya, Amit; Maharatna, Koushik; Gautam, Arvind; Naik, Ganesh R

    2017-07-01

    In recent years, significant advancements have taken place in human activity recognition using various machine learning approaches. However, conventional methods have been dominated by feature engineering, involving the difficult process of optimal feature selection. This problem has been mitigated by using a novel methodology based on a deep learning framework which automatically extracts the useful features and reduces the computational cost. As a proof of concept, we have attempted to design a generalized model for recognition of three fundamental movements of the human forearm performed in daily life, where data are collected from four different subjects using a single wrist-worn accelerometer sensor. The validation of the proposed model is done under different pre-processing and noisy data conditions, which are evaluated using three possible methods. The results show that our proposed methodology achieves an average recognition rate of 99.8% as opposed to conventional methods based on K-means clustering, linear discriminant analysis and support vector machine.
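    As a rough illustration of the kind of model described (the abstract does not disclose the architecture, so every layer size below is an assumption), a small 1-D convolutional network over windows of tri-axial accelerometer samples could look like this:

```python
import torch
import torch.nn as nn

class AccelCNN(nn.Module):
    """1-D CNN mapping a window of tri-axial accelerometer samples
    (batch, 3 axes, window length) to one of three forearm movements."""
    def __init__(self, n_classes=3, window=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(3, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.classifier = nn.Linear(32 * (window // 4), n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = AccelCNN()
dummy = torch.randn(8, 3, 128)   # batch of 8 windows, 128 samples each
print(model(dummy).shape)        # torch.Size([8, 3])
```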

  6. A computer-aided approach to nonlinear control synthesis

    NASA Technical Reports Server (NTRS)

    Wie, Bong; Anthony, Tobin

    1988-01-01

    The major objective of this project is to develop a computer-aided approach to nonlinear stability analysis and nonlinear control system design. This goal is to be obtained by refining the describing function method as a synthesis tool for nonlinear control design. The interim report outlines the approach by this study to meet these goals including an introduction to the INteractive Controls Analysis (INCA) program which was instrumental in meeting these study objectives. A single-input describing function (SIDF) design methodology was developed in this study; coupled with the software constructed in this study, the results of this project provide a comprehensive tool for design and integration of nonlinear control systems.

  7. Client Perceptions of Helpfulness in Therapy: a Novel Video-Rating Methodology for Examining Process Variables at Brief Intervals During a Single Session.

    PubMed

    Cocklin, Alexandra A; Mansell, Warren; Emsley, Richard; McEvoy, Phil; Preston, Chloe; Comiskey, Jody; Tai, Sara

    2017-11-01

    The value of clients' reports of their experiences in therapy is widely recognized, yet quantitative methodology has rarely been used to measure clients' self-reported perceptions of what is helpful over a single session. A video-rating method was developed to gather data at brief intervals using process measures of client perceived experience and standardized measures of working alliance (Session Rating Scale; SRS). Data were collected over the course of a single video-recorded session of cognitive therapy (Method of Levels Therapy; Carey, 2006; Mansell et al., 2012). We examined the acceptability and feasibility of the methodology and tested the concurrent validity of the measure by utilizing theory-led constructs. Eighteen therapy sessions were video-recorded and clients each rated a 20-minute session of therapy at two-minute intervals using repeated measures. A multi-level analysis was used to test for correlations between perceived levels of helpfulness and client process variables. The design proved to be feasible. Concurrent validity was borne out through high correlations between constructs. A multi-level regression examined the independent contributions of client process variables to client perceived helpfulness. Client perceived control (b = 0.39, 95% CI 0.05 to 0.73), the ability to talk freely (b = 0.30, SE = 0.11, 95% CI 0.09 to 0.51) and therapist approach (b = 0.31, SE = 0.14, 95% CI 0.04 to 0.57) predicted client-rated helpfulness. We identify a feasible and acceptable method for studying continuous measures of helpfulness and their psychological correlates during a single therapy session.
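    A multi-level regression of this shape (repeated within-session ratings nested within clients) can be sketched with statsmodels; the data below are synthetic stand-ins rather than the study's data, and the random-intercept specification is only one plausible reading of the analysis:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data in the shape described: 18 clients, ratings every 2 minutes
rng = np.random.default_rng(1)
n_clients, n_times = 18, 10
df = pd.DataFrame({
    "client": np.repeat(np.arange(n_clients), n_times),
    "control": rng.uniform(0, 10, n_clients * n_times),
    "talk_freely": rng.uniform(0, 10, n_clients * n_times),
    "approach": rng.uniform(0, 10, n_clients * n_times),
})
df["helpfulness"] = (0.39 * df.control + 0.30 * df.talk_freely
                     + 0.31 * df.approach + rng.normal(0, 1, len(df)))

# Multi-level model: ratings nested within clients (random intercept per client)
model = smf.mixedlm("helpfulness ~ control + talk_freely + approach",
                    df, groups=df["client"]).fit()
print(model.params)
```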

  8. Distributed computing methodology for training neural networks in an image-guided diagnostic application.

    PubMed

    Plagianakos, V P; Magoulas, G D; Vrahatis, M N

    2006-03-01

    Distributed computing is a process through which a set of computers connected by a network is used collectively to solve a single problem. In this paper, we propose a distributed computing methodology for training neural networks for the detection of lesions in colonoscopy. Our approach is based on partitioning the training set across multiple processors using a parallel virtual machine. In this way, interconnected computers of varied architectures can be used for the distributed evaluation of the error function and gradient values, and thus for training neural networks with various learning methods. The proposed methodology has large granularity and low synchronization, and has been implemented and tested. Our results indicate that the parallel virtual machine implementation of the training algorithms developed leads to considerable speedup, especially when large network architectures and training sets are used.
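    The core idea, partitioning the training set and summing partial error and gradient contributions, can be illustrated without a parallel virtual machine using Python's multiprocessing; the least-squares model below is a toy stand-in for the neural network, and the partitioning scheme only mirrors the general approach:

```python
import numpy as np
from multiprocessing import Pool

def partial_error_and_grad(args):
    """Local error and gradient for one partition of the training set."""
    X, y, w = args
    r = X @ w - y
    return 0.5 * np.sum(r ** 2), X.T @ r

def distributed_step(X, y, w, n_workers=4, lr=1e-3):
    # partition the training set across workers, then sum the partial results
    chunks = [(Xc, yc, w) for Xc, yc in zip(np.array_split(X, n_workers),
                                            np.array_split(y, n_workers))]
    with Pool(n_workers) as pool:
        parts = pool.map(partial_error_and_grad, chunks)
    error = sum(e for e, _ in parts)
    grad = np.sum([g for _, g in parts], axis=0)
    return error, w - lr * grad              # one gradient-descent update

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, y = rng.normal(size=(1000, 5)), rng.normal(size=1000)
    err, w = distributed_step(X, y, np.zeros(5))
    print(err)
```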

  9. Methodological improvements in quantifying cognitive change in clinical trials: an example with single-dose administration of donepezil.

    PubMed

    Pietrzak, R H; Maruff, P; Snyder, P J

    2009-03-01

    Change in cognitive function in response to a pharmacologic challenge can be observed with greater sensitivity by employing cognitive tests with optimal psychometric properties and a statistical approach that more accurately accounts for individual variability in performance. To demonstrate this approach we examined the cognitive effects of a single acute dose administration of an acetylcholinesterase inhibitor, donepezil, in healthy older adults and in older adults with mild Alzheimer's disease (AD). Placebo-controlled crossover study with three separate testing days: baseline, placebo, and donepezil, with assessments at baseline, and 1-, 2-, 3-, 6-, and 8-hrs post-dosing on each day. Early phase I clinical trial. 15 healthy older adults; 14 older adults with mild Alzheimer's disease. Single acute dose of 5 mg donepezil. Performance on the Groton Maze Learning Test (GMLT), a computerized neuropsychological measure of spatial working memory and error monitoring. A single acute dose of donepezil improved GMLT performance in healthy older adults (effect size: 0.83 at 6 hrs post-dosing) and older adults with mild AD (effect size: 0.58 at 3 hrs post-dosing). The GMLT detected cognitive improvement following a single, acute dose administration of donepezil in healthy older adults and older adults with mild AD. The choice of cognitive tests designed for repeated administration, as well as an analytic approach that emphasizes individual-level change in cognitive function, provides a sensitive approach to detecting central nervous system drug penetration and activity of cognitive-enhancing agents.

  10. Variometric approach for real-time GNSS navigation: First demonstration of Kin-VADASE capabilities

    NASA Astrophysics Data System (ADS)

    Branzanti, Mara; Colosimo, Gabriele; Mazzoni, Augusto

    2017-06-01

    The use of Global Navigation Satellite Systems (GNSS) kinematic positioning for navigational applications has dramatically increased over the last decade. Real-time high-performance navigation (positioning accuracy from one to a few centimeters) can be achieved with established techniques such as Real Time Kinematic (RTK) and Precise Point Positioning (PPP). Despite their potential, the application of these techniques is limited mainly by their high cost. This work proposes the Kinematic implementation of the Variometric Approach for Displacement Analysis Standalone Engine (Kin-VADASE) and gives a demonstration of its performance in the field of GNSS navigation. VADASE is a methodology for the real-time detection of the displacements of a standalone GNSS receiver. It was originally designed for seismology and monitoring applications, where the receiver is supposed to move for a few minutes, in the range of a few meters, around a predefined position. Kin-VADASE overcomes the aforementioned limitations and aims to be a complete methodology with fully kinematic capabilities. Here, for the first time, we present its application to two test cases in order to estimate high-rate (i.e., 10 Hz) kinematic parameters of moving vehicles. In this demonstration, data are collected and processed in the office, but the same results can be obtained in real time through the implementation of Kin-VADASE in the firmware of a GNSS receiver. All the Kin-VADASE processing was carried out using double- and single-frequency observations in order to investigate the potential of the software with geodetic-class and low-cost single-frequency receivers. Root Mean Square Errors in 3D with respect to differential positioning are at the level of 50 cm for dual-frequency data and better than 1 meter for single-frequency data. This reveals how Kin-VADASE retains the main advantages of the standalone approach and single-frequency capability and, although with slightly lower accuracy than the established techniques, can be a valid alternative for estimating the kinematic parameters of vehicles in motion.

  11. Single Subject Research: Applications to Special Education

    ERIC Educational Resources Information Center

    Cakiroglu, Orhan

    2012-01-01

    Single subject research is a scientific research methodology that is increasingly used in the field of special education. Therefore, understanding the unique characteristics of single subject research methodology is critical both for educators and practitioners. Certain characteristics make single subject research one of the most preferred…

  12. MinePath: Mining for Phenotype Differential Sub-paths in Molecular Pathways

    PubMed Central

    Koumakis, Lefteris; Kartsaki, Evgenia; Chatzimina, Maria; Zervakis, Michalis; Vassou, Despoina; Marias, Kostas; Moustakis, Vassilis; Potamias, George

    2016-01-01

    Pathway analysis methodologies couple traditional gene expression analysis with knowledge encoded in established molecular pathway networks, offering a promising approach towards the biological interpretation of phenotype differentiating genes. Early pathway analysis methodologies, known as gene set analysis (GSA), view pathways just as plain lists of genes without taking into account either the underlying pathway network topology or the involved gene regulatory relations. These approaches, even if they achieve computational efficiency and simplicity, consider pathways that involve the same genes as equivalent in terms of their gene enrichment characteristics. Most recent pathway analysis approaches take into account the underlying gene regulatory relations by examining their consistency with gene expression profiles and computing a score for each profile. Even with this approach, assessing and scoring single-relations limits the ability to reveal key gene regulation mechanisms hidden in longer pathway sub-paths. We introduce MinePath, a pathway analysis methodology that addresses and overcomes the aforementioned problems. MinePath facilitates the decomposition of pathways into their constituent sub-paths. Decomposition leads to the transformation of single-relations to complex regulation sub-paths. Regulation sub-paths are then matched with gene expression sample profiles in order to evaluate their functional status and to assess phenotype differential power. Assessment of differential power supports the identification of the most discriminant profiles. In addition, MinePath assesses the significance of the pathways as a whole, ranking them by their p-values. Comparison results with state-of-the-art pathway analysis systems are indicative of the soundness and reliability of the MinePath approach. In contrast with many pathway analysis tools, MinePath is a web-based system (www.minepath.org) offering dynamic and rich pathway visualization functionality, with the unique characteristic to color regulatory relations between genes and reveal their phenotype inclination. This unique characteristic makes MinePath a valuable tool for in silico molecular biology experimentation as it serves the biomedical researchers’ exploratory needs to reveal and interpret the regulatory mechanisms that underlie and putatively govern the expression of target phenotypes. PMID:27832067
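    A heavily simplified sketch of the sub-path idea (decompose a regulatory network into source-to-target sub-paths and check each against a binarized expression profile) is given below; the graph, profile and consistency rule are illustrative toys and do not reproduce MinePath's actual scoring:

```python
import networkx as nx

# Toy regulatory network: edges carry the relation type (activation/inhibition)
G = nx.DiGraph()
G.add_edge("A", "B", relation="activates")
G.add_edge("B", "C", relation="inhibits")
G.add_edge("A", "D", relation="activates")
G.add_edge("D", "C", relation="activates")

# Decompose into constituent sub-paths between a source and a target gene
subpaths = list(nx.all_simple_paths(G, "A", "C"))

# Binarized expression profile for one sample (1 = expressed, 0 = not expressed)
profile = {"A": 1, "B": 1, "C": 0, "D": 0}

def consistent(path):
    """Toy rule: an expressed regulator should up- (activation) or
    down-regulate (inhibition) its target in agreement with the profile."""
    for u, v in zip(path, path[1:]):
        rel = G.edges[u, v]["relation"]
        expected = profile[u] if rel == "activates" else 1 - profile[u]
        if profile[u] == 1 and profile[v] != expected:
            return False
    return True

print([(p, consistent(p)) for p in subpaths])
```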

  13. MinePath: Mining for Phenotype Differential Sub-paths in Molecular Pathways.

    PubMed

    Koumakis, Lefteris; Kanterakis, Alexandros; Kartsaki, Evgenia; Chatzimina, Maria; Zervakis, Michalis; Tsiknakis, Manolis; Vassou, Despoina; Kafetzopoulos, Dimitris; Marias, Kostas; Moustakis, Vassilis; Potamias, George

    2016-11-01

    Pathway analysis methodologies couple traditional gene expression analysis with knowledge encoded in established molecular pathway networks, offering a promising approach towards the biological interpretation of phenotype differentiating genes. Early pathway analysis methodologies, known as gene set analysis (GSA), view pathways just as plain lists of genes without taking into account either the underlying pathway network topology or the involved gene regulatory relations. These approaches, even if they achieve computational efficiency and simplicity, consider pathways that involve the same genes as equivalent in terms of their gene enrichment characteristics. Most recent pathway analysis approaches take into account the underlying gene regulatory relations by examining their consistency with gene expression profiles and computing a score for each profile. Even with this approach, assessing and scoring single-relations limits the ability to reveal key gene regulation mechanisms hidden in longer pathway sub-paths. We introduce MinePath, a pathway analysis methodology that addresses and overcomes the aforementioned problems. MinePath facilitates the decomposition of pathways into their constituent sub-paths. Decomposition leads to the transformation of single-relations to complex regulation sub-paths. Regulation sub-paths are then matched with gene expression sample profiles in order to evaluate their functional status and to assess phenotype differential power. Assessment of differential power supports the identification of the most discriminant profiles. In addition, MinePath assesses the significance of the pathways as a whole, ranking them by their p-values. Comparison results with state-of-the-art pathway analysis systems are indicative of the soundness and reliability of the MinePath approach. In contrast with many pathway analysis tools, MinePath is a web-based system (www.minepath.org) offering dynamic and rich pathway visualization functionality, with the unique characteristic to color regulatory relations between genes and reveal their phenotype inclination. This unique characteristic makes MinePath a valuable tool for in silico molecular biology experimentation as it serves the biomedical researchers' exploratory needs to reveal and interpret the regulatory mechanisms that underlie and putatively govern the expression of target phenotypes.

  14. Bayesian seismic inversion based on rock-physics prior modeling for the joint estimation of acoustic impedance, porosity and lithofacies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Passos de Figueiredo, Leandro, E-mail: leandrop.fgr@gmail.com; Grana, Dario; Santos, Marcio

    We propose a Bayesian approach for seismic inversion to estimate acoustic impedance, porosity and lithofacies within the reservoir conditioned to post-stack seismic and well data. The link between elastic and petrophysical properties is given by a joint prior distribution for the logarithm of impedance and porosity, based on a rock-physics model. The well conditioning is performed through a background model obtained by well log interpolation. Two different approaches are presented: in the first approach, the prior is defined by a single Gaussian distribution, whereas in the second approach it is defined by a Gaussian mixture to represent the well data multimodal distribution and link the Gaussian components to different geological lithofacies. The forward model is based on a linearized convolutional model. For the single Gaussian case, we obtain an analytical expression for the posterior distribution, resulting in a fast algorithm to compute the solution of the inverse problem, i.e. the posterior distribution of acoustic impedance and porosity as well as the facies probability given the observed data. For the Gaussian mixture prior, it is not possible to obtain the distributions analytically, hence we propose a Gibbs algorithm to perform the posterior sampling and obtain several reservoir model realizations, allowing an uncertainty analysis of the estimated properties and lithofacies. Both methodologies are applied to a real seismic dataset with three wells to obtain 3D models of acoustic impedance, porosity and lithofacies. The methodologies are validated through a blind well test and compared to a standard Bayesian inversion approach. Using the probability of the reservoir lithofacies, we also compute a 3D isosurface probability model of the main oil reservoir in the studied field.
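    For the single-Gaussian branch, the analytical posterior is the standard linear-Gaussian update; a compact numpy version (with a made-up two-parameter toy model in place of the actual convolutional forward operator) looks like this:

```python
import numpy as np

def gaussian_linear_posterior(G, d, mu0, Sigma0, Sigmad):
    """Analytic posterior for a linear forward model d = G m + noise with a
    single Gaussian prior N(mu0, Sigma0) and Gaussian noise N(0, Sigmad).
    The Gaussian-mixture prior has no closed form and requires sampling."""
    K = Sigma0 @ G.T @ np.linalg.inv(G @ Sigma0 @ G.T + Sigmad)  # gain matrix
    mu_post = mu0 + K @ (d - G @ mu0)
    Sigma_post = Sigma0 - K @ G @ Sigma0
    return mu_post, Sigma_post

# toy usage with an invented 2x2 forward operator
G = np.array([[1.0, 0.5], [0.0, 1.0]])
mu, Sigma = gaussian_linear_posterior(G, d=np.array([1.2, 0.3]),
                                      mu0=np.zeros(2), Sigma0=np.eye(2),
                                      Sigmad=0.1 * np.eye(2))
print(mu)
```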

  15. Cascade Optimization Strategy with Neural Network and Regression Approximations Demonstrated on a Preliminary Aircraft Engine Design

    NASA Technical Reports Server (NTRS)

    Hopkins, Dale A.; Patnaik, Surya N.

    2000-01-01

    A preliminary aircraft engine design methodology is being developed that utilizes a cascade optimization strategy together with neural network and regression approximation methods. The cascade strategy employs different optimization algorithms in a specified sequence. The neural network and regression methods are used to approximate solutions obtained from the NASA Engine Performance Program (NEPP), which implements engine thermodynamic cycle and performance analysis models. The new methodology is proving to be more robust and computationally efficient than the conventional optimization approach of using a single optimization algorithm with direct reanalysis. The methodology has been demonstrated on a preliminary design problem for a novel subsonic turbofan engine concept that incorporates a wave rotor as a cycle-topping device. Computations of maximum thrust were obtained for a specific design point in the engine mission profile. The results (depicted in the figure) show a significant improvement in the maximum thrust obtained using the new methodology in comparison to benchmark solutions obtained using NEPP in a manual design mode.

  16. On process optimization considering LCA methodology.

    PubMed

    Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío

    2012-04-15

    The goal of this work is to research the state-of-the-art in process optimization techniques and tools based on LCA, focused on the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled to optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept to define the system boundaries is the most used approach in practice, instead of the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is linearly expressed by the characterization factors; thus, synergistic effects of the contaminants are neglected. Among the LCIA methods, the eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most analyzed cases. SimaPro is the most used software for LCA applications in the literature analyzed. Multi-objective optimization is the most used approach for dealing with this kind of problem, where the ε-constraint method for generating the Pareto set is the most applied technique. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and progress made in assessing costs of environmental externalities. Finally, a trend to deal with multi-period scenarios in integrated LCA-optimization frameworks can be distinguished, providing more accurate results upon data availability. Copyright © 2011 Elsevier Ltd. All rights reserved.
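    The ε-constraint idea mentioned above (minimize one objective while bounding the other, then sweep the bound to trace the Pareto set) can be sketched in a few lines with scipy; the two quadratic objectives below are placeholders for an economic and an environmental objective function:

```python
import numpy as np
from scipy.optimize import minimize

# Toy two-objective problem standing in for cost vs. environmental impact
cost = lambda x: (x[0] - 1.0) ** 2
impact = lambda x: (x[0] + 1.0) ** 2

pareto = []
for eps in np.linspace(0.5, 4.0, 8):   # sweep the epsilon bound on the impact
    res = minimize(cost, x0=[0.0],
                   constraints=[{"type": "ineq",
                                 "fun": lambda x, e=eps: e - impact(x)}])
    pareto.append((res.fun, float(impact(res.x))))

print(pareto)   # (cost, impact) pairs approximating the Pareto front
```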

  17. A system and methodology for high-content visual screening of individual intact living cells in suspension

    NASA Astrophysics Data System (ADS)

    Renaud, Olivier; Heintzmann, Rainer; Sáez-Cirión, Asier; Schnelle, Thomas; Mueller, Torsten; Shorte, Spencer

    2007-02-01

    Three dimensional imaging provides high-content information from living intact biology, and can serve as a visual screening cue. In the case of single cell imaging the current state of the art uses so-called "axial through-stacking". However, three-dimensional axial through-stacking requires that the object (i.e. a living cell) be adherently stabilized on an optically transparent surface, usually glass; evidently precluding use of cells in suspension. Aiming to overcome this limitation we present here the utility of dielectric field trapping of single cells in three-dimensional electrode cages. Our approach allows gentle and precise spatial orientation and vectored rotation of living, non-adherent cells in fluid suspension. Using various modes of widefield, and confocal microscope imaging we show how so-called "microrotation" can provide a unique and powerful method for multiple point-of-view (three-dimensional) interrogation of intact living biological micro-objects (e.g. single-cells, cell aggregates, and embryos). Further, we show how visual screening by micro-rotation imaging can be combined with micro-fluidic sorting, allowing selection of rare phenotype targets from small populations of cells in suspension, and subsequent one-step single cell cloning (with high-viability). Our methodology combining high-content 3D visual screening with one-step single cell cloning, will impact diverse paradigms, for example cytological and cytogenetic analysis on haematopoietic stem cells, blood cells including lymphocytes, and cancer cells.

  18. General Methodology for Designing Spacecraft Trajectories

    NASA Technical Reports Server (NTRS)

    Condon, Gerald; Ocampo, Cesar; Mathur, Ravishankar; Morcos, Fady; Senent, Juan; Williams, Jacob; Davis, Elizabeth C.

    2012-01-01

    A methodology for designing spacecraft trajectories in any gravitational environment within the solar system has been developed. The methodology facilitates modeling and optimization for problems ranging from that of a single spacecraft orbiting a single celestial body to that of a mission involving multiple spacecraft and multiple propulsion systems operating in gravitational fields of multiple celestial bodies. The methodology consolidates almost all spacecraft trajectory design and optimization problems into a single conceptual framework requiring solution of either a system of nonlinear equations or a parameter-optimization problem with equality and/or inequality constraints.

  19. [Theoretical and methodological uses of research in Social and Human Sciences in Health].

    PubMed

    Deslandes, Suely Ferreira; Iriart, Jorge Alberto Bernstein

    2012-12-01

    The current article aims to map and critically reflect on the current theoretical and methodological uses of research in the subfield of social and human sciences in health. A convenience sample was used to select three Brazilian public health journals. Based on a reading of 1,128 abstracts published from 2009 to 2010, 266 articles were selected that presented the empirical base of research stemming from social and human sciences in health. The sample was classified thematically as "theoretical/ methodological reference", "study type/ methodological design", "analytical categories", "data production techniques", and "analytical procedures". We analyze the sample's emic categories, drawing on the authors' literal statements. All the classifications and respective variables were tabulated in Excel. Most of the articles were self-described as qualitative and used more than one data production technique. There was a wide variety of theoretical references, in contrast with the almost total predominance of a single type of data analysis (content analysis). In several cases, important gaps were identified in expounding the study methodology and instrumental use of the qualitative research techniques and methods. However, the review did highlight some new objects of study and innovations in theoretical and methodological approaches.

  20. Numerical Modelling of Tsunami Generated by Deformable Submarine Slides: Parameterisation of Slide Dynamics for Coupling to Tsunami Propagation Model

    NASA Astrophysics Data System (ADS)

    Smith, R. C.; Collins, G. S.; Hill, J.; Piggott, M. D.; Mouradian, S. L.

    2015-12-01

    Numerical modelling informs risk assessment of tsunami generated by submarine slides; however, for large-scale slides modelling can be complex and computationally challenging. Many previous numerical studies have approximated slides as rigid blocks that moved according to prescribed motion. However, wave characteristics are strongly dependent on the motion of the slide and previous work has recommended that more accurate representation of slide dynamics is needed. We have used the finite-element, adaptive-mesh CFD model Fluidity, to perform multi-material simulations of deformable submarine slide-generated waves at real world scales for a 2D scenario in the Gulf of Mexico. Our high-resolution approach represents slide dynamics with good accuracy, compared to other numerical simulations of this scenario, but precludes tracking of wave propagation over large distances. To enable efficient modelling of further propagation of the waves, we investigate an approach to extract information about the slide evolution from our multi-material simulations in order to drive a single-layer wave propagation model, also using Fluidity, which is much less computationally expensive. The extracted submarine slide geometry and position as a function of time are parameterised using simple polynomial functions. The polynomial functions are used to inform a prescribed velocity boundary condition in a single-layer simulation, mimicking the effect the submarine slide motion has on the water column. The approach is verified by successful comparison of wave generation in the single-layer model with that recorded in the multi-material, multi-layer simulations. We then extend this approach to 3D for further validation of this methodology (using the Gulf of Mexico scenario proposed by Horrillo et al., 2013) and to consider the effect of lateral spreading. This methodology is then used to simulate a series of hypothetical submarine slide events in the Arctic Ocean (based on evidence of historic slides) and examine the hazard posed to the UK coast.
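    The parameterisation step described above amounts to fitting simple polynomials to the extracted slide geometry and position as functions of time and differentiating them to drive the prescribed-velocity boundary condition; a minimal sketch with synthetic slide-front positions (not data from the study) is:

```python
import numpy as np

# Hypothetical slide-front positions (m) extracted from the multi-material run
t = np.linspace(0.0, 120.0, 13)                    # time, s
x_front = 0.5 * 1.2 * t**2 / (1.0 + 0.01 * t)      # synthetic, decelerating slide

coeffs = np.polyfit(t, x_front, deg=3)             # simple polynomial parameterisation
position = np.poly1d(coeffs)
velocity = position.deriv()                        # feeds the prescribed-velocity BC

print(velocity(60.0))                              # slide speed at t = 60 s
```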

  1. Timing and Mode of Landscape Response to Glacial-Interglacial Climate Forcing From Fluvial Fill Terrace Sediments: Humahuaca Basin, E Cordillera, NW Argentina

    NASA Astrophysics Data System (ADS)

    Schildgen, T. F.; Robinson, R. A. J.; Savi, S.; Bookhagen, B.; Tofelde, S.; Strecker, M. R.

    2014-12-01


  2. Energy management and vehicle synthesis

    NASA Astrophysics Data System (ADS)

    Czysz, P.; Murthy, S. N. B.

    The major drivers in the development of launch vehicles for the twenty-first century are reduction in cost of vehicles and operations, continuous reusability, mission abort capability with vehicle recovery, and readiness. One approach to the design of such vehicles is to emphasize energy management and propulsion as being the principal means of improvements given the available industrial capability and the required freedom in selecting configuration concept geometries. A methodology has been developed for the rational synthesis of vehicles based on the setting up and utilization of available data and projections, and a reference vehicle. The application of the methodology is illustrated for a single stage to orbit (SSTO) with various limits for the use of airbreathing propulsion.

  3. Energy management and vehicle synthesis

    NASA Technical Reports Server (NTRS)

    Czysz, P.; Murthy, S. N. B.

    1995-01-01

    The major drivers in the development of launch vehicles for the twenty-first century are reduction in cost of vehicles and operations, continuous reusability, mission abort capability with vehicle recovery, and readiness. One approach to the design of such vehicles is to emphasize energy management and propulsion as being the principal means of improvements given the available industrial capability and the required freedom in selecting configuration concept geometries. A methodology has been developed for the rational synthesis of vehicles based on the setting up and utilization of available data and projections, and a reference vehicle. The application of the methodology is illustrated for a single stage to orbit (SSTO) with various limits for the use of airbreathing propulsion.

  4. Sharing stories: life history narratives in stuttering research.

    PubMed

    Kathard, H

    2001-01-01

    The life experiences of people who stutter (PWS) have not featured prominently in research. Historically, the profession of speech and language therapy has amassed data and developed its theory of stuttering within a positivistic frame. As a consequence, the existing over-arching theory of research and practice does not engage holistically with the dynamic personal, socio-cultural and political contexts of the individual who stutters. Therefore a conceptual shift is required in ontology, epistemology and methodology underpinning the knowledge construction process. The use of the life history narratives as a research tool is promoted. An exploratory study of a single participant is presented to illuminate the methodological approach and emerging theoretical constructs.

  5. Hydrogel microstructure live-cell array for multiplexed analyses of cancer stem cells, tumor heterogeneity and differential drug response at single-element resolution.

    PubMed

    Afrimzon, E; Botchkina, G; Zurgil, N; Shafran, Y; Sobolev, M; Moshkov, S; Ravid-Hermesh, O; Ojima, I; Deutsch, M

    2016-03-21

    Specific phenotypic subpopulations of cancer stem cells (CSCs) are responsible for tumor development, production of heterogeneous differentiated tumor mass, metastasis, and resistance to therapies. The development of therapeutic approaches based on targeting rare CSCs has been limited partially due to the lack of appropriate experimental models and measurement approaches. The current study presents new tools and methodologies based on a hydrogel microstructure array (HMA) for identification and multiplex analyses of CSCs. Low-melt agarose integrated with type I collagen, a major component of the extracellular matrix (ECM), was used to form a solid hydrogel array with natural non-adhesive characteristics and high optical quality. The array contained thousands of individual pyramidal shaped, nanoliter-volume micro-chambers (MCs), allowing concomitant generation and measurement of large populations of free-floating CSC spheroids from single cells, each in an individual micro-chamber (MC). The optical live cell platform, based on an imaging plate patterned with HMA, was validated using CSC-enriched prostate and colon cancer cell lines. The HMA methodology and quantitative image analysis at single-element resolution clearly demonstrates several levels of tumor cell heterogeneity, including morphological and phenotypic variability, differences in proliferation capacity and in drug response. Moreover, the system facilitates real-time examination of single stem cell (SC) fate, as well as drug-induced alteration in expression of stemness markers. The technology may be applicable in personalized cancer treatment, including multiplex ex vivo analysis of heterogeneous patient-derived tumor specimens, precise detection and characterization of potentially dangerous cell phenotypes, and for representative evaluation of drug sensitivity of CSCs and other types of tumor cells.

  6. A Novel Semi-Supervised Methodology for Extracting Tumor Type-Specific MRS Sources in Human Brain Data

    PubMed Central

    Ortega-Martorell, Sandra; Ruiz, Héctor; Vellido, Alfredo; Olier, Iván; Romero, Enrique; Julià-Sapé, Margarida; Martín, José D.; Jarman, Ian H.; Arús, Carles; Lisboa, Paulo J. G.

    2013-01-01

    Background The clinical investigation of human brain tumors often starts with a non-invasive imaging study, providing information about the tumor extent and location, but little insight into the biochemistry of the analyzed tissue. Magnetic Resonance Spectroscopy can complement imaging by supplying a metabolic fingerprint of the tissue. This study analyzes single-voxel magnetic resonance spectra, which represent signal information in the frequency domain. Given that a single voxel may contain a heterogeneous mix of tissues, signal source identification is a relevant challenge for the problem of tumor type classification from the spectroscopic signal. Methodology/Principal Findings Non-negative matrix factorization techniques have recently shown their potential for the identification of meaningful sources from brain tissue spectroscopy data. In this study, we use a convex variant of these methods that is capable of handling negatively-valued data and generating sources that can be interpreted as tumor class prototypes. A novel approach to convex non-negative matrix factorization is proposed, in which prior knowledge about class information is utilized in model optimization. Class-specific information is integrated into this semi-supervised process by setting the metric of a latent variable space where the matrix factorization is carried out. The reported experimental study comprises 196 cases from different tumor types drawn from two international, multi-center databases. The results indicate that the proposed approach outperforms a purely unsupervised process by achieving near perfect correlation of the extracted sources with the mean spectra of the tumor types. It also improves tissue type classification. Conclusions/Significance We show that source extraction by unsupervised matrix factorization benefits from the integration of the available class information, so operating in a semi-supervised learning manner, for discriminative source identification and brain tumor labeling from single-voxel spectroscopy data. We are confident that the proposed methodology has wider applicability for biomedical signal processing. PMID:24376744

  7. Multi-Objective Optimization of Mixed Variable, Stochastic Systems Using Single-Objective Formulations

    DTIC Science & Technology

    2008-03-01

    investigated, as well as the methodology used. Chapter IV presents the data collection and analysis procedures, and the resulting analysis and ... interpolate the data, although a non-interpolating model is possible. For this research Design and Analysis of Computer Experiments (DACE) is used ... followed by the analysis. 4.1. Testing Approach: The initial SMOMADS algorithm used for this research was acquired directly from Walston [70].

  8. Shear-wave velocity profiling according to three alternative approaches: A comparative case study

    NASA Astrophysics Data System (ADS)

    Dal Moro, G.; Keller, L.; Al-Arifi, N. S.; Moustafa, S. S. R.

    2016-11-01

    The paper intends to compare three different methodologies which can be used to analyze surface-wave propagation, thus eventually obtaining the vertical shear-wave velocity (VS) profile. The three presented methods (currently still quite unconventional) are characterized by different field procedures and data processing. The first methodology is a sort of evolution of the classical Multi-channel Analysis of Surface Waves (MASW) here accomplished by jointly considering Rayleigh and Love waves (analyzed according to the Full Velocity Spectrum approach) and the Horizontal-to-Vertical Spectral Ratio (HVSR). The second method is based on the joint analysis of the HVSR curve together with the Rayleigh-wave dispersion determined via Miniature Array Analysis of Microtremors (MAAM), a passive methodology that relies on a small number (4 to 6) of vertical geophones deployed along a small circle (for the common near-surface application the radius usually ranges from 0.6 to 5 m). Finally, the third considered approach is based on the active data acquired by a single 3-component geophone and relies on the joint inversion of the group-velocity spectra of the radial and vertical components of the Rayleigh waves, together with the Radial-to-Vertical Spectral Ratio (RVSR). The results of the analyses performed while considering these approaches (completely different both in terms of field procedures and data analysis) appear extremely consistent thus mutually validating their performances. Pros and cons of each approach are summarized both in terms of computational aspects as well as with respect to practical considerations regarding the specific character of the pertinent field procedures.
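    Of the quantities combined in these joint analyses, the HVSR curve is the simplest to compute; a deliberately crude sketch (no tapering, smoothing or averaging over time windows, which real processing would require) is:

```python
import numpy as np

def hvsr(north, east, vertical, fs):
    """Crude horizontal-to-vertical spectral ratio from a three-component
    ambient-vibration record."""
    freqs = np.fft.rfftfreq(len(vertical), d=1.0 / fs)
    H = np.sqrt((np.abs(np.fft.rfft(north))**2 + np.abs(np.fft.rfft(east))**2) / 2.0)
    V = np.abs(np.fft.rfft(vertical)) + 1e-12      # avoid division by zero
    return freqs, H / V

# synthetic white-noise record, only to show the call signature
rng = np.random.default_rng(0)
f, ratio = hvsr(rng.normal(size=4096), rng.normal(size=4096),
                rng.normal(size=4096), fs=100.0)
print(f[ratio.argmax()], ratio.max())
```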

  9. Effectiveness evaluation of objective and subjective weighting methods for aquifer vulnerability assessment in urban context

    NASA Astrophysics Data System (ADS)

    Sahoo, Madhumita; Sahoo, Satiprasad; Dhar, Anirban; Pradhan, Biswajeet

    2016-10-01

    Groundwater vulnerability assessment has been an accepted practice to identify the zones with relatively increased potential for groundwater contamination. DRASTIC is the most popular secondary-information-based vulnerability assessment approach. The original DRASTIC approach considers the relative importance of features/sub-features based on subjective weighting/rating values. However, the variability of features at a smaller scale is not reflected in this subjective vulnerability assessment process. In contrast to the subjective approach, objective weighting-based methods provide flexibility in weight assignment depending on the variation of the local system. However, experts' opinion is not directly considered in the objective weighting-based methods. Thus, the effectiveness of both subjective and objective weighting-based approaches needs to be evaluated. In the present study, three methods - the Entropy information method (E-DRASTIC), the Fuzzy pattern recognition method (F-DRASTIC) and Single parameter sensitivity analysis (SA-DRASTIC) - were used to modify the weights of the original DRASTIC features to include local variability. Moreover, a grey incidence analysis was used to evaluate the relative performance of the subjective (DRASTIC and SA-DRASTIC) and objective (E-DRASTIC and F-DRASTIC) weighting-based methods. The performance of the developed methodology was tested in an urban area of Kanpur City, India. The relative performance of the subjective and objective methods varies with the choice of water quality parameters. This methodology can be applied with or without suitable modification. These evaluations establish the potential applicability of the methodology for general vulnerability assessment in an urban context.
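    The entropy information method used for E-DRASTIC derives feature weights from how variable each rated feature is across the study area; a minimal sketch on a made-up rating matrix is:

```python
import numpy as np

def entropy_weights(X):
    """Entropy information method: objective feature weights from the spatial
    variability of rated features (rows = grid cells, columns = features)."""
    P = X / X.sum(axis=0)                        # column-wise proportions
    k = 1.0 / np.log(X.shape[0])
    with np.errstate(divide="ignore", invalid="ignore"):
        e = -k * np.nansum(np.where(P > 0, P * np.log(P), 0.0), axis=0)  # entropies
    d = 1.0 - e                                  # degree of diversification
    return d / d.sum()                           # normalized weights

# toy rated cells; the constant middle feature correctly receives zero weight
ratings = np.array([[7, 3, 9], [5, 3, 1], [9, 3, 5], [3, 3, 7.0]])
print(entropy_weights(ratings))
```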

  10. Hazard Interactions and Interaction Networks (Cascades) within Multi-Hazard Methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel; Malamud, Bruce D.

    2016-04-01

    Here we combine research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between 'multi-layer single hazard' approaches and 'multi-hazard' approaches that integrate such interactions. This synthesis suggests that ignoring interactions could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. We proceed to present an enhanced multi-hazard framework, through the following steps: (i) describe and define three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment; (ii) outline three types of interaction relationship (triggering, increased probability, and catalysis/impedance); and (iii) assess the importance of networks of interactions (cascades) through case-study examples (based on literature, field observations and semi-structured interviews). We further propose visualisation frameworks to represent these networks of interactions. Our approach reinforces the importance of integrating interactions between natural hazards, anthropogenic processes and technological hazards/disasters into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential, and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.

  11. Metamaterial bricks and quantization of meta-surfaces

    PubMed Central

    Memoli, Gianluca; Caleap, Mihai; Asakawa, Michihiro; Sahoo, Deepak R.; Drinkwater, Bruce W.; Subramanian, Sriram

    2017-01-01

    Controlling acoustic fields is crucial in diverse applications such as loudspeaker design, ultrasound imaging and therapy or acoustic particle manipulation. The current approaches use fixed lenses or expensive phased arrays. Here, using a process of analogue-to-digital conversion and wavelet decomposition, we develop the notion of quantal meta-surfaces. The quanta here are small, pre-manufactured three-dimensional units—which we call metamaterial bricks—each encoding a specific phase delay. These bricks can be assembled into meta-surfaces to generate any diffraction-limited acoustic field. We apply this methodology to show experimental examples of acoustic focusing, steering and, after stacking single meta-surfaces into layers, the more complex field of an acoustic tractor beam. We demonstrate experimentally single-sided air-borne acoustic levitation using meta-layers at various bit-rates: from a 4-bit uniform to 3-bit non-uniform quantization in phase. This powerful methodology dramatically simplifies the design of acoustic devices and provides a key-step towards realizing spatial sound modulators. PMID:28240283
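    The quantization step can be pictured as rounding the continuous phase-delay map required for a target field onto the discrete set of available bricks; the focusing example below is an illustrative sketch with invented geometry, not the authors' design procedure:

```python
import numpy as np

# Hypothetical focusing example: quantize the phase delay needed at each brick
# position so that an air-borne ultrasound field converges on a focal point.
c, f = 343.0, 40e3                          # speed of sound, 40 kHz carrier (assumed)
k = 2 * np.pi * f / c
pitch, n, focus = 0.005, 16, np.array([0.0, 0.0, 0.05])

x = (np.arange(n) - n / 2 + 0.5) * pitch
xg, yg = np.meshgrid(x, x)
dist = np.sqrt((xg - focus[0])**2 + (yg - focus[1])**2 + focus[2]**2)
phase = (-k * dist) % (2 * np.pi)           # ideal continuous phase map

bits = 4                                    # 4-bit uniform quantization -> 16 brick types
levels = 2 ** bits
brick_index = np.round(phase / (2 * np.pi) * levels).astype(int) % levels
print(brick_index)                          # which pre-manufactured brick to place where
```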

  12. Parameter-free driven Liouville-von Neumann approach for time-dependent electronic transport simulations in open quantum systems

    DOE PAGES

    Zelovich, Tamar; Hansen, Thorsten; Liu, Zhen-Fei; ...

    2017-03-02

    A parameter-free version of the recently developed driven Liouville-von Neumann equation [T. Zelovich et al., J. Chem. Theory Comput. 10(8), 2927-2941 (2014)] for electronic transport calculations in molecular junctions is presented. The single driving rate, appearing as a fitting parameter in the original methodology, is replaced by a set of state-dependent broadening factors applied to the different single-particle lead levels. These broadening factors are extracted explicitly from the self-energy of the corresponding electronic reservoir and are fully transferable to any junction incorporating the same lead model. Furthermore, the performance of the method is demonstrated via tight-binding and extended Hückel calculations of simple junction models. Our analytic considerations and numerical results indicate that the developed methodology constitutes a rigorous framework for the design of "black-box" algorithms to simulate electron dynamics in open quantum systems out of equilibrium.

  13. Metamaterial bricks and quantization of meta-surfaces

    NASA Astrophysics Data System (ADS)

    Memoli, Gianluca; Caleap, Mihai; Asakawa, Michihiro; Sahoo, Deepak R.; Drinkwater, Bruce W.; Subramanian, Sriram

    2017-02-01

    Controlling acoustic fields is crucial in diverse applications such as loudspeaker design, ultrasound imaging and therapy or acoustic particle manipulation. The current approaches use fixed lenses or expensive phased arrays. Here, using a process of analogue-to-digital conversion and wavelet decomposition, we develop the notion of quantal meta-surfaces. The quanta here are small, pre-manufactured three-dimensional units--which we call metamaterial bricks--each encoding a specific phase delay. These bricks can be assembled into meta-surfaces to generate any diffraction-limited acoustic field. We apply this methodology to show experimental examples of acoustic focusing, steering and, after stacking single meta-surfaces into layers, the more complex field of an acoustic tractor beam. We demonstrate experimentally single-sided air-borne acoustic levitation using meta-layers at various bit-rates: from a 4-bit uniform to 3-bit non-uniform quantization in phase. This powerful methodology dramatically simplifies the design of acoustic devices and provides a key-step towards realizing spatial sound modulators.

  14. Mixed methods research - the best of both worlds?

    PubMed

    van Griensven, Hubert; Moore, Ann P; Hall, Valerie

    2014-10-01

    There has been a bias towards quantitative research approaches within manual therapy, which may have resulted in a narrow understanding of manual therapy practice. The aim of this Masterclass is to make a contribution to the expansion of methodologies used in manual therapy enquiry by discussing mixed methods research (MMR), a methodology which utilises both qualitative and quantitative methods within a single study in order to provide more comprehensive insights. To review rationales for MMR, as well as some of the common design options and potential difficulties. The paper also discusses theoretical frameworks that have been used to underpin qualitative and quantitative research, and ongoing debates about the possibility of combining them. Complexities associated with health and manual therapy cannot always be investigated satisfactorily by using a single research method. Some issues require a more comprehensive understanding, which may be provided by combining the strengths of quantitative and qualitative methods in a mixed methods study. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Parameter-free driven Liouville-von Neumann approach for time-dependent electronic transport simulations in open quantum systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zelovich, Tamar; Hansen, Thorsten; Liu, Zhen-Fei

    A parameter-free version of the recently developed driven Liouville-von Neumann equation [T. Zelovich et al., J. Chem. Theory Comput. 10(8), 2927-2941 (2014)] for electronic transport calculations in molecular junctions is presented. The single driving rate, appearing as a fitting parameter in the original methodology, is replaced by a set of state-dependent broadening factors applied to the different single-particle lead levels. These broadening factors are extracted explicitly from the self-energy of the corresponding electronic reservoir and are fully transferable to any junction incorporating the same lead model. Furthermore, the performance of the method is demonstrated via tight-binding and extended Hückel calculations of simple junction models. Our analytic considerations and numerical results indicate that the developed methodology constitutes a rigorous framework for the design of "black-box" algorithms to simulate electron dynamics in open quantum systems out of equilibrium.
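
    The schematic sketch below (illustrative only, not the authors' implementation) shows the general structure of a driven Liouville-von Neumann-type equation of motion in which each lead level carries its own broadening factor gamma_i; the tight-binding Hamiltonian, the gamma values and the applied bias are arbitrary assumptions, whereas in the paper the broadenings are extracted from the lead self-energy.

      # Schematic sketch of a driven Liouville-von Neumann-type equation of motion
      # with a state-dependent broadening gamma_i on each lead level (illustrative
      # parameters only; the paper derives the broadenings from the lead self-energy).
      import numpy as np

      nL, nM, nR = 3, 1, 3                  # left lead, molecule, right lead levels
      n = nL + nM + nR

      # Assumed tight-binding single-particle Hamiltonian (arbitrary units)
      H = np.zeros((n, n))
      for i in range(n - 1):
          H[i, i + 1] = H[i + 1, i] = -0.2  # nearest-neighbour coupling

      # State-dependent broadenings: nonzero on lead levels, zero on the molecule
      gamma = np.array([0.05, 0.08, 0.05, 0.0, 0.05, 0.08, 0.05])
      Gamma = np.diag(gamma)

      # Target density matrix: left lead filled, right lead empty (an assumed bias)
      rho0 = np.diag([1.0] * nL + [0.0] * nM + [0.0] * nR)

      def drho_dt(rho):
          unitary = -1j * (H @ rho - rho @ H)
          driving = -0.5 * (Gamma @ (rho - rho0) + (rho - rho0) @ Gamma)
          return unitary + driving

      # Simple fourth-order Runge-Kutta propagation toward the driven steady state
      rho, dt = rho0.astype(complex), 0.05
      for _ in range(4000):
          k1 = drho_dt(rho)
          k2 = drho_dt(rho + 0.5 * dt * k1)
          k3 = drho_dt(rho + 0.5 * dt * k2)
          k4 = drho_dt(rho + dt * k3)
          rho = rho + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

      print("steady-state level occupations:", np.round(rho.diagonal().real, 3))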

  16. Metamaterial bricks and quantization of meta-surfaces.

    PubMed

    Memoli, Gianluca; Caleap, Mihai; Asakawa, Michihiro; Sahoo, Deepak R; Drinkwater, Bruce W; Subramanian, Sriram

    2017-02-27

    Controlling acoustic fields is crucial in diverse applications such as loudspeaker design, ultrasound imaging and therapy or acoustic particle manipulation. The current approaches use fixed lenses or expensive phased arrays. Here, using a process of analogue-to-digital conversion and wavelet decomposition, we develop the notion of quantal meta-surfaces. The quanta here are small, pre-manufactured three-dimensional units-which we call metamaterial bricks-each encoding a specific phase delay. These bricks can be assembled into meta-surfaces to generate any diffraction-limited acoustic field. We apply this methodology to show experimental examples of acoustic focusing, steering and, after stacking single meta-surfaces into layers, the more complex field of an acoustic tractor beam. We demonstrate experimentally single-sided air-borne acoustic levitation using meta-layers at various bit-rates: from a 4-bit uniform to 3-bit non-uniform quantization in phase. This powerful methodology dramatically simplifies the design of acoustic devices and provides a key-step towards realizing spatial sound modulators.

  17. A Psychobiographical Study of Intuition in a Writer's Life: Paulo Coelho Revisited

    PubMed Central

    Mayer, Claude-Hélène; Maree, David

    2017-01-01

    Intuition is defined as a form of knowledge which materialises as awareness of thoughts, feelings and physical sensations. It is a key to a deeper understanding and meaningfulness. Intuition, used as a psychological function, supports the transmission and integration of perceptions from unconscious and conscious realms. This study uses a psychobiographical single case study approach to explore intuition across the life span of Paulo Coelho. Methodologically, the study is based on a single case study, using the methodological frame of Dilthey's modern hermeneutics. The author, Paulo Coelho, was chosen as a subject of research, based on the content analysis of first- and third-person perspective documents. Findings show that Paulo Coelho, as one of the most famous and most read contemporary authors in the world, uses his intuitions as a deeper guidance in life, for decision-making and self-development. Intuitive decision-making is described throughout his life and by referring to selected creative works. PMID:28904596

  18. Decoding the Regulatory Network for Blood Development from Single-Cell Gene Expression Measurements

    PubMed Central

    Haghverdi, Laleh; Lilly, Andrew J.; Tanaka, Yosuke; Wilkinson, Adam C.; Buettner, Florian; Macaulay, Iain C.; Jawaid, Wajid; Diamanti, Evangelia; Nishikawa, Shin-Ichi; Piterman, Nir; Kouskoff, Valerie; Theis, Fabian J.; Fisher, Jasmin; Göttgens, Berthold

    2015-01-01

    Here we report the use of diffusion maps and network synthesis from state transition graphs to better understand developmental pathways from single cell gene expression profiling. We map the progression of mesoderm towards blood in the mouse by single-cell expression analysis of 3,934 cells, capturing cells with blood-forming potential at four sequential developmental stages. By adapting the diffusion plot methodology for dimensionality reduction to single-cell data, we reconstruct the developmental journey to blood at single-cell resolution. Using transitions between individual cellular states as input, we develop a single-cell network synthesis toolkit to generate a computationally executable transcriptional regulatory network model that recapitulates blood development. Model predictions were validated by showing that Sox7 inhibits primitive erythropoiesis, and that Sox and Hox factors control early expression of Erg. We therefore demonstrate that single-cell analysis of a developing organ coupled with computational approaches can reveal the transcriptional programs that control organogenesis. PMID:25664528
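
    A minimal sketch of the dimensionality-reduction step, run on synthetic single-cell-like data rather than the authors' dataset, is shown below; the Gaussian kernel, bandwidth choice and pseudotime model are assumptions made for the example.

      # Minimal diffusion-map sketch on synthetic "expression" data (illustrative only).
      import numpy as np

      rng = np.random.default_rng(0)
      # Synthetic trajectory: 200 cells along a 1-D developmental path, 20 genes
      t = np.sort(rng.uniform(0, 1, 200))
      X = np.outer(t, rng.normal(size=20)) + 0.1 * rng.normal(size=(200, 20))

      # Gaussian kernel on pairwise squared Euclidean distances
      d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
      K = np.exp(-d2 / (2 * np.median(d2)))

      # Row-normalize to a Markov transition matrix and eigendecompose it
      P = K / K.sum(axis=1, keepdims=True)
      evals, evecs = np.linalg.eig(P)
      order = np.argsort(-evals.real)
      dc1 = evecs[:, order[1]].real   # first non-trivial eigenvector = diffusion component 1

      print("correlation of diffusion component 1 with true pseudotime:",
            round(abs(np.corrcoef(dc1, t)[0, 1]), 3))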

  19. Quantum dot ternary-valued full-adder: Logic synthesis by a multiobjective design optimization based on a genetic algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klymenko, M. V.; Remacle, F., E-mail: fremacle@ulg.ac.be

    2014-10-28

    A methodology is proposed for designing a low-energy consuming ternary-valued full adder based on a quantum dot (QD) electrostatically coupled with a single electron transistor operating as a charge sensor. The methodology is based on design optimization: the values of the physical parameters of the system required for implementing the logic operations are optimized using a multiobjective genetic algorithm. The searching space is determined by elements of the capacitance matrix describing the electrostatic couplings in the entire device. The objective functions are defined as the maximal absolute error over actual device logic outputs relative to the ideal truth tables for the sum and the carry-out in base 3. The logic units are implemented on the same device: a single dual-gate quantum dot and a charge sensor. Their physical parameters are optimized to compute either the sum or the carry out outputs and are compatible with current experimental capabilities. The outputs are encoded in the value of the electric current passing through the charge sensor, while the logic inputs are supplied by the voltage levels on the two gate electrodes attached to the QD. The complex logic ternary operations are directly implemented on an extremely simple device, characterized by small sizes and low-energy consumption compared to devices based on switching single-electron transistors. The design methodology is general and provides a rational approach for realizing non-switching logic operations on QD devices.

  20. Using Multiorder Time-Correlation Functions (TCFs) To Elucidate Biomolecular Reaction Pathways from Microsecond Single-Molecule Fluorescence Experiments.

    PubMed

    Phelps, Carey; Israels, Brett; Marsh, Morgan C; von Hippel, Peter H; Marcus, Andrew H

    2016-12-29

    Recent advances in single-molecule fluorescence imaging have made it possible to perform measurements on microsecond time scales. Such experiments have the potential to reveal detailed information about the conformational changes in biological macromolecules, including the reaction pathways and dynamics of the rearrangements involved in processes, such as sequence-specific DNA "breathing" and the assembly of protein-nucleic acid complexes. Because microsecond-resolved single-molecule trajectories often involve "sparse" data, that is, they contain relatively few data points per unit time, they cannot be easily analyzed using the standard protocols that were developed for single-molecule experiments carried out with tens-of-millisecond time resolution and high "data density." Here, we describe a generalized approach, based on time-correlation functions, to obtain kinetic information from microsecond-resolved single-molecule fluorescence measurements. This approach can be used to identify short-lived intermediates that lie on reaction pathways connecting relatively long-lived reactant and product states. As a concrete illustration of the potential of this methodology for analyzing specific macromolecular systems, we accompany the theoretical presentation with the description of a specific biologically relevant example drawn from studies of reaction mechanisms of the assembly of the single-stranded DNA binding protein of the T4 bacteriophage replication complex onto a model DNA replication fork.
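
    The sketch below illustrates the basic two-point time-correlation-function construction on a synthetic two-state trajectory rather than on the experimental systems discussed in the paper; the interconversion rates, noise level and 1/e read-off of the relaxation time are illustrative assumptions.

      # Illustrative sketch: a two-point time-correlation function from a sparse,
      # noisy two-state single-molecule trajectory (synthetic data).
      import numpy as np

      rng = np.random.default_rng(1)
      dt, n = 1e-6, 200_000                  # 1-microsecond bins
      k12 = k21 = 2e3                        # assumed interconversion rates (1/s)

      # Simulate a two-state telegraph trajectory (states 0 and 1)
      state = np.zeros(n, dtype=int)
      for i in range(1, n):
          rate = k12 if state[i - 1] == 0 else k21
          state[i] = 1 - state[i - 1] if rng.random() < rate * dt else state[i - 1]

      signal = state + 0.05 * rng.normal(size=n)   # noisy fluorescence-like readout

      def tcf(x, max_lag):
          """Two-point TCF C(tau) = <dx(t) dx(t+tau)>, normalized to C(0)."""
          dx = x - x.mean()
          c = np.array([np.mean(dx[:dx.size - lag] * dx[lag:]) for lag in range(max_lag)])
          return c / c[0]

      c = tcf(signal, 400)
      tau = np.argmax(c < np.exp(-1)) * dt         # crude 1/e correlation time
      print(f"estimated relaxation time ~ {tau * 1e6:.0f} us "
            f"(two-state prediction 1/(k12+k21) = {1e6 / (k12 + k21):.0f} us)")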

  1. Reversible on-surface wiring of resistive circuits.

    PubMed

    Inkpen, Michael S; Leroux, Yann R; Hapiot, Philippe; Campos, Luis M; Venkataraman, Latha

    2017-06-01

    Whilst most studies in single-molecule electronics involve components first synthesized ex situ, there is also great potential in exploiting chemical transformations to prepare devices in situ. Here, as a first step towards this goal, we conduct reversible reactions on monolayers to make and break covalent bonds between alkanes of different lengths, then measure the conductance of these molecules connected between electrodes using the scanning tunneling microscopy-based break junction (STM-BJ) method. In doing so, we develop the critical methodology required for assembling and disassembling surface-bound single-molecule circuits. We identify effective reaction conditions for surface-bound reagents, and importantly demonstrate that the electronic characteristics of wires created in situ agree with those created ex situ. Finally, we show that the STM-BJ technique is unique in its ability to definitively probe surface reaction yields both on a local (∼50 nm²) and pseudo-global (≥10 mm²) level. This investigation thus highlights a route to the construction and integration of more complex, and ultimately functional, surface-based single-molecule circuitry, as well as advancing a methodology that facilitates studies beyond the reach of traditional ex situ synthetic approaches.

  2. Laser Nano-Neurosurgery from Gentle Manipulation to Nano-Incision of Neuronal Cells and Scaffolds: An Advanced Neurotechnology Tool.

    PubMed

    Soloperto, Alessandro; Palazzolo, Gemma; Tsushima, Hanako; Chieregatti, Evelina; Vassalli, Massimo; Difato, Francesco

    2016-01-01

    Current optical approaches are progressing far beyond the scope of monitoring the structure and function of living matter, and they are becoming widely recognized as extremely precise, minimally-invasive, contact-free handling tools. Laser manipulation of living tissues, single cells, or even single-molecules is becoming a well-established methodology, thus founding the onset of new experimental paradigms and research fields. Indeed, a tightly focused pulsed laser source permits complex tasks such as developing engineered bioscaffolds, applying calibrated forces, transfecting, stimulating, or even ablating single cells with subcellular precision, and operating intracellular surgical protocols at the level of single organelles. In the present review, we report the state of the art of laser manipulation in neuroscience, to inspire future applications of light-assisted tools in nano-neurosurgery.

  3. Laser Nano-Neurosurgery from Gentle Manipulation to Nano-Incision of Neuronal Cells and Scaffolds: An Advanced Neurotechnology Tool

    PubMed Central

    Soloperto, Alessandro; Palazzolo, Gemma; Tsushima, Hanako; Chieregatti, Evelina; Vassalli, Massimo; Difato, Francesco

    2016-01-01

    Current optical approaches are progressing far beyond the scope of monitoring the structure and function of living matter, and they are becoming widely recognized as extremely precise, minimally-invasive, contact-free handling tools. Laser manipulation of living tissues, single cells, or even single-molecules is becoming a well-established methodology, thus founding the onset of new experimental paradigms and research fields. Indeed, a tightly focused pulsed laser source permits complex tasks such as developing engineered bioscaffolds, applying calibrated forces, transfecting, stimulating, or even ablating single cells with subcellular precision, and operating intracellular surgical protocols at the level of single organelles. In the present review, we report the state of the art of laser manipulation in neuroscience, to inspire future applications of light-assisted tools in nano-neurosurgery. PMID:27013962

  4. Multi-viewpoint clustering analysis

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala; Wild, Chris

    1993-01-01

    In this paper, we address the feasibility of partitioning rule-based systems into a number of meaningful units to enhance the comprehensibility, maintainability and reliability of expert systems software. Preliminary results have shown that no single structuring principle or abstraction hierarchy is sufficient to understand complex knowledge bases. We therefore propose the Multi View Point - Clustering Analysis (MVP-CA) methodology to provide multiple views of the same expert system. We present the results of using this approach to partition a deployed knowledge-based system that navigates the Space Shuttle's entry. We also discuss the impact of this approach on verification and validation of knowledge-based systems.

  5. Chemical silicon surface modification and bioreceptor attachment to develop competitive integrated photonic biosensors.

    PubMed

    Escorihuela, Jorge; Bañuls, María José; García Castelló, Javier; Toccafondo, Veronica; García-Rupérez, Jaime; Puchades, Rosa; Maquieira, Ángel

    2012-12-01

    Methodology for the functionalization of silicon-based materials employed for the development of photonic label-free nanobiosensors is reported. The studied functionalization based on organosilane chemistry allowed the direct attachment of biomolecules in a single step, maintaining their bioavailability. Using this immobilization approach in probe microarrays, successful specific detection of bacterial DNA is achieved, reaching hybridization sensitivities of 10 pM. The utility of the immobilization approach for the functionalization of label-free nanobiosensors based on photonic crystals and ring resonators was demonstrated using bovine serum albumin (BSA)/anti-BSA as a model system.

  6. Building the Core Architecture of a Multiagent System Product Line: With an example from a future NASA Mission

    NASA Technical Reports Server (NTRS)

    Pena, Joaquin; Hinchey, Michael G.; Ruiz-Cortes, Antonio

    2006-01-01

    The field of Software Product Lines (SPL) emphasizes building a core architecture for a family of software products from which concrete products can be derived rapidly. This helps to reduce time-to-market, costs, etc., and can result in improved software quality and safety. Current AOSE methodologies are concerned with developing a single Multiagent System. We propose an initial approach to developing the core architecture of a Multiagent Systems Product Line (MAS-PL), exemplifying our approach with reference to a concept NASA mission based on multiagent technology.

  7. Multistate approaches in computational protein design

    PubMed Central

    Davey, James A; Chica, Roberto A

    2012-01-01

    Computational protein design (CPD) is a useful tool for protein engineers. It has been successfully applied towards the creation of proteins with increased thermostability, improved binding affinity, novel enzymatic activity, and altered ligand specificity. Traditionally, CPD calculations search and rank sequences using a single fixed protein backbone template in an approach referred to as single-state design (SSD). While SSD has enjoyed considerable success, certain design objectives require the explicit consideration of multiple conformational and/or chemical states. Cases where a “multistate” approach may be advantageous over the SSD approach include designing conformational changes into proteins, using native ensembles to mimic backbone flexibility, and designing ligand or oligomeric association specificities. These design objectives can be efficiently tackled using multistate design (MSD), an emerging methodology in CPD that considers any number of protein conformational or chemical states as inputs instead of a single protein backbone template, as in SSD. In this review article, recent examples of the successful design of a desired property into proteins using MSD are described. These studies employing MSD are divided into two categories—those that utilized multiple conformational states, and those that utilized multiple chemical states. In addition, the scoring of competing states during negative design is discussed as a current challenge for MSD. PMID:22811394

  8. Study on the performance of different craniofacial superimposition approaches (I).

    PubMed

    Ibáñez, O; Vicente, R; Navega, D S; Wilkinson, C; Jayaprakash, P T; Huete, M I; Briers, T; Hardiman, R; Navarro, F; Ruiz, E; Cavalli, F; Imaizumi, K; Jankauskas, R; Veselovskaya, E; Abramov, A; Lestón, P; Molinero, F; Cardoso, J; Çağdır, A S; Humpire, D; Nakanishi, Y; Zeuner, A; Ross, A H; Gaudio, D; Damas, S

    2015-12-01

    As part of the scientific tasks coordinated throughout the 'New Methodologies and Protocols of Forensic Identification by Craniofacial Superimposition (MEPROCS)' project, the current study aims to analyse the performance of a diverse set of CFS methodologies and the corresponding technical approaches when dealing with a common dataset of real-world cases. Thus, a multiple-lab study on craniofacial superimposition has been carried out for the first time. In particular, 26 participants from 17 different institutions in 13 countries were asked to deal with 14 identification scenarios, some of them involving the comparison of multiple candidates and unknown skulls. In total, there were 60 craniofacial superimposition problems, divided into two sets of females and males. Each participant followed her/his own methodology and employed her/his particular technological means. For each single case they were asked to report the final identification decision (either positive or negative) along with the rationale supporting the decision and at least one image illustrating the overlay/superimposition outcome. This study is expected to provide important insights to better understand the most convenient characteristics of every method included in this study. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  9. Time-resolved Analysis of Proteome Dynamics by Tandem Mass Tags and Stable Isotope Labeling in Cell Culture (TMT-SILAC) Hyperplexing*

    PubMed Central

    Welle, Kevin A.; Zhang, Tian; Hryhorenko, Jennifer R.; Shen, Shichen; Qu, Jun; Ghaemmaghami, Sina

    2016-01-01

    Recent advances in mass spectrometry have enabled system-wide analyses of protein turnover. By globally quantifying the kinetics of protein clearance and synthesis, these methodologies can provide important insights into the regulation of the proteome under varying cellular and environmental conditions. To facilitate such analyses, we have employed a methodology that combines metabolic isotopic labeling (Stable Isotope Labeling in Cell Culture - SILAC) with isobaric tagging (Tandem Mass Tags - TMT) for analysis of multiplexed samples. The fractional labeling of multiple time-points can be measured in a single mass spectrometry run, providing temporally resolved measurements of protein turnover kinetics. To demonstrate the feasibility of the approach, we simultaneously measured the kinetics of protein clearance and accumulation for more than 3000 proteins in dividing and quiescent human fibroblasts and verified the accuracy of the measurements by comparison to established non-multiplexed approaches. The results indicate that upon reaching quiescence, fibroblasts compensate for lack of cellular growth by globally downregulating protein synthesis and upregulating protein degradation. The described methodology significantly reduces the cost and complexity of temporally-resolved dynamic proteomic experiments and improves the precision of proteome-wide turnover data. PMID:27765818
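
    As an illustration of the kinetic read-out such data provide, the sketch below fits a first-order fractional-labeling curve to synthetic time points to recover a clearance rate and half-life; the time points, rate and use of scipy's curve_fit are assumptions, not the authors' analysis pipeline.

      # Illustrative sketch: recover a first-order clearance rate from fractional
      # labeling time points (synthetic values, not the authors' data or code).
      import numpy as np
      from scipy.optimize import curve_fit

      def fractional_labeling(t, k):
          """Fraction of a protein pool replaced by the new label after time t."""
          return 1.0 - np.exp(-k * t)

      t = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0])   # assumed time points (h)
      rng = np.random.default_rng(2)
      measured = fractional_labeling(t, 0.12) + 0.02 * rng.normal(size=t.size)

      (k_fit,), _ = curve_fit(fractional_labeling, t, measured, p0=[0.1])
      print(f"fitted clearance rate k = {k_fit:.3f} /h, "
            f"half-life = {np.log(2) / k_fit:.1f} h")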

  10. Standardizing economic analysis in prevention will require substantial effort.

    PubMed

    Guyll, Max

    2014-12-01

    It is exceedingly difficult to compare results of economic analyses across studies due to variations in assumptions, methodology, and outcome measures, a fact which surely decreases the impact and usefulness of prevention-related economic research. Therefore, Crowley et al. (Prevention Science, 2013) are precisely correct in their call for increased standardization and have usefully highlighted the issues that must be addressed. However, having made the need clear, the questions become what form the solution should take and how it should be implemented. The present discussion outlines the rudiments of a comprehensive framework for promoting standardized methodology in the estimation of economic outcomes, as encouraged by Crowley et al. In short, a single, standard, reference case approach should be clearly articulated, and all economic research should be encouraged to apply that standard approach, with results from compliant analyses being reported in a central archive. Properly done, the process would increase the ability of those without specialized training to contribute to the body of economic research pertaining to prevention, and the most difficult tasks of predicting and monetizing distal outcomes would be readily completed through predetermined models. These recommendations might be viewed as somewhat forcible, insomuch as they advocate for prescribing the details of a standard methodology and establishing a means of verifying compliance. However, it is unclear that the best practices proposed by Crowley et al. will be widely adopted in the absence of a strong and determined approach.

  11. Integrated control-system design via generalized LQG (GLQG) theory

    NASA Technical Reports Server (NTRS)

    Bernstein, Dennis S.; Hyland, David C.; Richter, Stephen; Haddad, Wassim M.

    1989-01-01

    Thirty years of control systems research has produced an enormous body of theoretical results in feedback synthesis. Yet such results see relatively little practical application, and there remains an unsettling gap between classical single-loop techniques (Nyquist, Bode, root locus, pole placement) and modern multivariable approaches (LQG and H infinity theory). Large scale, complex systems, such as high performance aircraft and flexible space structures, now demand efficient, reliable design of multivariable feedback controllers which optimally trade off performance against modeling accuracy, bandwidth, sensor noise, actuator power, and control law complexity. A methodology is described which encompasses numerous practical design constraints within a single unified formulation. The approach, which is based upon coupled systems of modified Riccati and Lyapunov equations, encompasses time-domain linear-quadratic-Gaussian theory and frequency-domain H-infinity theory, as well as classical objectives such as gain and phase margin via the Nyquist circle criterion. In addition, this approach encompasses the optimal projection approach to reduced-order controller design. The current status of the overall theory will be reviewed including both continuous-time and discrete-time (sampled-data) formulations.

  12. Simulated fault injection - A methodology to evaluate fault tolerant microprocessor architectures

    NASA Technical Reports Server (NTRS)

    Choi, Gwan S.; Iyer, Ravishankar K.; Carreno, Victor A.

    1990-01-01

    A simulation-based fault-injection method for validating fault-tolerant microprocessor architectures is described. The approach uses mixed-mode simulation (electrical/logic analysis), and injects transient errors in run-time to assess the resulting fault impact. As an example, a fault-tolerant architecture which models the digital aspects of a dual-channel real-time jet-engine controller is used. The level of effectiveness of the dual configuration with respect to single and multiple transients is measured. The results indicate 100 percent coverage of single transients. Approximately 12 percent of the multiple transients affect both channels; none result in controller failure since two additional levels of redundancy exist.

  13. Improving the performance of the mass transfer-based reference evapotranspiration estimation approaches through a coupled wavelet-random forest methodology

    NASA Astrophysics Data System (ADS)

    Shiri, Jalal

    2018-06-01

    Among different reference evapotranspiration (ETo) modeling approaches, mass transfer-based methods have been less studied. These approaches utilize temperature and wind speed records. On the other hand, the empirical equations proposed in this context generally produce weak simulations, except when a local calibration is used for improving their performance. This might be a crucial drawback for those equations in cases of local data scarcity for the calibration procedure. So, application of heuristic methods can be considered as a substitute for improving the performance accuracy of the mass transfer-based approaches. However, given that wind speed records usually have higher variation magnitudes than the other meteorological parameters, application of a wavelet transform for coupling with heuristic models would be necessary. In the present paper, a coupled wavelet-random forest (WRF) methodology was proposed for the first time to improve the performance accuracy of the mass transfer-based ETo estimation approaches using cross-validation data management scenarios at both local and cross-station scales. The obtained results revealed that the new coupled WRF model (with the minimum scatter index values of 0.150 and 0.192 for local and external applications, respectively) improved the performance accuracy of the single RF models as well as the empirical equations to a great extent.
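
    A minimal sketch of the coupling idea is given below, assuming synthetic meteorological series, the PyWavelets and scikit-learn packages, and an arbitrary mother wavelet; it splits the wind-speed series into wavelet components and feeds them, together with temperature, to a random forest regressor.

      # Illustrative wavelet-plus-random-forest sketch on synthetic series (assumes
      # the PyWavelets and scikit-learn packages; wavelet and target are arbitrary).
      import numpy as np
      import pywt
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(3)
      day = np.arange(512)
      temp = 15 + 10 * np.sin(2 * np.pi * day / 365) + rng.normal(0, 1, day.size)
      wind = 2 + np.abs(rng.normal(0, 1.5, day.size)) + 0.5 * np.sin(2 * np.pi * day / 30)
      eto = 0.3 * temp + 0.8 * wind + rng.normal(0, 0.5, day.size)   # synthetic target

      def wavelet_components(x, wavelet="db4", level=3):
          """Split a series into equal-length approximation/detail components."""
          coeffs = pywt.wavedec(x, wavelet, level=level)
          comps = []
          for i in range(len(coeffs)):
              keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
              comps.append(pywt.waverec(keep, wavelet)[: x.size])
          return np.column_stack(comps)

      X = np.column_stack([temp, wavelet_components(wind)])
      split = 400
      model = RandomForestRegressor(n_estimators=200, random_state=0)
      model.fit(X[:split], eto[:split])
      rmse = np.sqrt(np.mean((model.predict(X[split:]) - eto[split:]) ** 2))
      print(f"test RMSE with wavelet-decomposed wind features: {rmse:.2f}")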

  14. A new mathematical modeling approach for the energy of threonine molecule

    NASA Astrophysics Data System (ADS)

    Sahiner, Ahmet; Kapusuz, Gulden; Yilmaz, Nurullah

    2017-07-01

    In this paper, we propose an improved methodology for energy conformation problems, aimed at finding optimum energy values. First, we construct the Bezier surfaces near local minimizers based on the data obtained from Density Functional Theory (DFT) calculations. Second, we blend the constructed surfaces in order to obtain a single smooth model. Finally, we apply the global optimization algorithm to find the two torsion angles that make the energy of the molecule a minimum.

  15. Transcriptome In Vivo Analysis (TIVA) of spatially defined single cells in intact live mouse and human brain tissue

    PubMed Central

    Lovatt, Ditte; Ruble, Brittani K.; Lee, Jaehee; Dueck, Hannah; Kim, Tae Kyung; Fisher, Stephen; Francis, Chantal; Spaethling, Jennifer M.; Wolf, John A.; Grady, M. Sean; Ulyanova, Alexandra V.; Yeldell, Sean B.; Griepenburg, Julianne C.; Buckley, Peter T.; Kim, Junhyong; Sul, Jai-Yoon; Dmochowski, Ivan J.; Eberwine, James

    2014-01-01

    Transcriptome profiling is an indispensable tool in advancing the understanding of single cell biology, but depends upon methods capable of isolating mRNA at the spatial resolution of a single cell. Current capture methods lack sufficient spatial resolution to isolate mRNA from individual in vivo resident cells without damaging adjacent tissue. Because of this limitation, it has been difficult to assess the influence of the microenvironment on the transcriptome of individual neurons. Here, we engineered a Transcriptome In Vivo Analysis (TIVA)-tag, which upon photoactivation enables mRNA capture from single cells in live tissue. Using the TIVA-tag in combination with RNA-seq to analyze transcriptome variance among single dispersed cells and in vivo resident mouse and human neurons, we show that the tissue microenvironment shapes the transcriptomic landscape of individual cells. The TIVA methodology provides the first noninvasive approach for capturing mRNA from single cells in their natural microenvironment. PMID:24412976

  16. Rating the methodological quality of single-subject designs and n-of-1 trials: introducing the Single-Case Experimental Design (SCED) Scale.

    PubMed

    Tate, Robyn L; McDonald, Skye; Perdices, Michael; Togher, Leanne; Schultz, Regina; Savage, Sharon

    2008-08-01

    Rating scales that assess methodological quality of clinical trials provide a means to critically appraise the literature. Scales are currently available to rate randomised and non-randomised controlled trials, but there are none that assess single-subject designs. The Single-Case Experimental Design (SCED) Scale was developed for this purpose and evaluated for reliability. Six clinical researchers who were trained and experienced in rating methodological quality of clinical trials developed the scale and participated in reliability studies. The SCED Scale is an 11-item rating scale for single-subject designs, of which 10 items are used to assess methodological quality and use of statistical analysis. The scale was developed and refined over a 3-year period. Content validity was addressed by identifying items to reduce the main sources of bias in single-case methodology as stipulated by authorities in the field, which were empirically tested against 85 published reports. Inter-rater reliability was assessed using a random sample of 20/312 single-subject reports archived in the Psychological Database of Brain Impairment Treatment Efficacy (PsycBITE). Inter-rater reliability for the total score was excellent, both for individual raters (overall ICC = 0.84; 95% confidence interval 0.73-0.92) and for consensus ratings between pairs of raters (overall ICC = 0.88; 95% confidence interval 0.78-0.95). Item reliability was fair to excellent for consensus ratings between pairs of raters (range k = 0.48 to 1.00). The results were replicated with two independent novice raters who were trained in the use of the scale (ICC = 0.88, 95% confidence interval 0.73-0.95). The SCED Scale thus provides a brief and valid evaluation of methodological quality of single-subject designs, with the total score demonstrating excellent inter-rater reliability using both individual and consensus ratings. Items from the scale can also be used as a checklist in the design, reporting and critical appraisal of single-subject designs, thereby assisting to improve standards of single-case methodology.

  17. Life-Cycle Cost/Benefit Assessment of Expedite Departure Path (EDP)

    NASA Technical Reports Server (NTRS)

    Wang, Jianzhong Jay; Chang, Paul; Datta, Koushik

    2005-01-01

    This report presents a life-cycle cost/benefit assessment (LCCBA) of Expedite Departure Path (EDP), an air traffic control Decision Support Tool (DST) currently under development at NASA. This assessment is an update of a previous study performed by bd Systems, Inc. (bd) during FY01, with the following revisions: The life-cycle cost assessment methodology developed by bd for the previous study was refined and calibrated using Free Flight Phase 1 (FFP1) cost information for Traffic Management Advisor (TMA, or TMA-SC in the FAA's terminology). Adjustments were also made to the site selection and deployment scheduling methodology to include airspace complexity as a factor. This technique was also applied to the benefit extrapolation methodology to better estimate potential benefits for other years, and at other sites. This study employed a new benefit estimating methodology because bd's previous single-year potential benefit assessment of EDP used unrealistic assumptions that resulted in optimistic estimates. This methodology uses an air traffic simulation approach to reasonably predict the impacts from the implementation of EDP. The results of the costs and benefits analyses were then integrated into a life-cycle cost/benefit assessment.

  18. Combined Volatolomics for Monitoring of Human Body Chemistry

    PubMed Central

    Broza, Yoav Y.; Zuri, Liat; Haick, Hossam

    2014-01-01

    Analysis of volatile organic compounds (VOCs) is a promising approach for non-invasive, fast and potentially inexpensive diagnostics. Here, we present a new methodology for profiling the body chemistry by using the volatile fraction of molecules in various body fluids. Using mass spectrometry and cross-reactive nanomaterial-based sensors array, we demonstrate that simultaneous VOC detection from breath and skin would provide complementary, non-correlated information of the body's volatile metabolites profile. Eventually, with further wide population validation studies, such a methodology could provide more accurate monitoring of pathological changes compared to the information provided by a single body fluid. The qualitative and quantitative methods presented here offer a variety of options for novel mapping of the metabolic properties of complex organisms, including humans. PMID:24714440

  19. Combined volatolomics for monitoring of human body chemistry.

    PubMed

    Broza, Yoav Y; Zuri, Liat; Haick, Hossam

    2014-04-09

    Analysis of volatile organic compounds (VOCs) is a promising approach for non-invasive, fast and potentially inexpensive diagnostics. Here, we present a new methodology for profiling the body chemistry by using the volatile fraction of molecules in various body fluids. Using mass spectrometry and cross-reactive nanomaterial-based sensors array, we demonstrate that simultaneous VOC detection from breath and skin would provide complementary, non-correlated information of the body's volatile metabolites profile. Eventually, with further wide population validation studies, such a methodology could provide more accurate monitoring of pathological changes compared to the information provided by a single body fluid. The qualitative and quantitative methods presented here offer a variety of options for novel mapping of the metabolic properties of complex organisms, including humans.

  20. Predicting the disease of Alzheimer with SNP biomarkers and clinical data using data mining classification approach: decision tree.

    PubMed

    Erdoğan, Onur; Aydin Son, Yeşim

    2014-01-01

    Single Nucleotide Polymorphisms (SNPs) are the most common genomic variations where only a single nucleotide differs between individuals. Individual SNPs and SNP profiles associated with diseases can be utilized as biological markers. But there is a need to determine the SNP subsets and patients' clinical data which are informative for the diagnosis. Data mining approaches have the highest potential for extracting knowledge from genomic datasets and selecting the representative SNPs as well as the most effective and informative clinical features for the clinical diagnosis of the diseases. In this study, we have applied one of the widely used data mining classification methodologies, the "decision tree", for associating the SNP biomarkers and significant clinical data with Alzheimer's disease (AD), which is the most common form of "dementia". Different tree construction parameters have been compared for the optimization, and the most accurate tree for predicting AD is presented.
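
    A minimal sketch of the classification step is shown below, using scikit-learn's decision tree on entirely synthetic genotype and clinical data; the SNP count, sample size and effect model are invented for illustration and bear no relation to the study's dataset.

      # Illustrative sketch: decision tree over synthetic SNP genotypes (0/1/2) plus
      # one clinical variable; SNPs, sample size and effects are invented.
      import numpy as np
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(4)
      n, n_snps = 600, 50
      geno = rng.integers(0, 3, size=(n, n_snps))      # genotype codes for 50 SNPs
      age = rng.normal(72, 8, size=n)                  # a clinical feature
      # Disease status driven by two "causal" SNPs and age (assumed model)
      risk = 0.8 * geno[:, 3] + 0.6 * geno[:, 17] + 0.05 * (age - 72)
      y = (risk + rng.normal(0, 1, n) > 1.5).astype(int)

      X = np.column_stack([geno, age])
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      # Compare tree-construction parameters, as the abstract describes
      for depth in (3, 5, None):
          tree = DecisionTreeClassifier(max_depth=depth, min_samples_leaf=10,
                                        random_state=0).fit(X_tr, y_tr)
          print(f"max_depth={depth}: test accuracy = {tree.score(X_te, y_te):.2f}")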

  1. Monocular correspondence detection for symmetrical objects by template matching

    NASA Astrophysics Data System (ADS)

    Vilmar, G.; Besslich, Philipp W., Jr.

    1990-09-01

    We describe a possibility to reconstruct 3-D information from a single view of a 3-D bilaterally symmetric object. The symmetry assumption allows us to obtain a "second view" from a different viewpoint by a simple reflection of the monocular image. Therefore we have to solve the correspondence problem in a special case where known feature-based or area-based binocular approaches fail. In principle our approach is based on frequency-domain template matching of the features on the epipolar lines. During a training period our system "learns" the assignment of correspondence models to image features. The object shape is interpolated when no template matches the image features. This fact is an important advantage of this methodology because no "real-world" image holds the symmetry assumption perfectly. To simplify the training process we used single views of human faces (e.g. passport photos) but our system is trainable on any other kind of object.
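
    The toy sketch below illustrates the core mechanism on a synthetic image: the mirrored image is treated as the second view and correspondences along each row (epipolar line) are located by frequency-domain cross-correlation; the random test pattern, the off-centre symmetry axis and the circular-shift model are assumptions made for the example.

      # Toy sketch: use the mirror image as the "second view" and find per-row
      # correspondences by FFT cross-correlation (synthetic symmetric test pattern).
      import numpy as np

      rng = np.random.default_rng(5)
      h, w = 64, 128
      base = rng.random((h, w))
      base = (base + base[:, ::-1]) / 2            # bilaterally symmetric pattern
      img = np.roll(base, 10, axis=1)              # symmetry axis shifted off-centre
      img = img + 0.01 * rng.normal(size=img.shape)

      mirror = img[:, ::-1]                        # the "second view" by reflection

      def row_shift(a, b):
          """Circular shift of b that best matches a, via frequency-domain correlation."""
          corr = np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))).real
          return int(np.argmax(corr))

      shifts = [row_shift(img[r], mirror[r]) for r in range(h)]
      print("per-row correspondence shifts (twice the axis offset, expected 20):",
            sorted(set(shifts)))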

  2. Towards sustainable design for single-use medical devices.

    PubMed

    Hanson, Jacob J; Hitchcock, Robert W

    2009-01-01

    Despite their sophistication and value, single-use medical devices have become commodity items in the developed world. Cheap raw materials along with large scale manufacturing and distribution processes have combined to make many medical devices more expensive to resterilize, package and restock than to simply discard. This practice is not sustainable or scalable on a global basis. As the petrochemicals that provide raw materials become more expensive and the global reach of these devices continues into rapidly developing economies, there is a need for device designs that take into account the total life-cycle of these products, minimize the amount of non-renewable materials consumed and consider alternative hybrid reusable / disposable approaches. In this paper, we describe a methodology to perform life cycle and functional analyses to create additional design requirements for medical devices. These types of sustainable approaches can move the medical device industry even closer to the "triple bottom line"--people, planet, profit.

  3. Complex basis functions for molecular resonances: Methodology and applications

    NASA Astrophysics Data System (ADS)

    White, Alec; McCurdy, C. William; Head-Gordon, Martin

    The computation of positions and widths of metastable electronic states is a challenge for molecular electronic structure theory because, in addition to the difficulty of the many-body problem, such states obey scattering boundary conditions. These resonances cannot be addressed with naïve application of traditional bound state electronic structure theory. Non-Hermitian electronic structure methods employing complex basis functions is one way that we may rigorously treat resonances within the framework of traditional electronic structure theory. In this talk, I will discuss our recent work in this area including the methodological extension from single determinant SCF-based approaches to highly correlated levels of wavefunction-based theory such as equation of motion coupled cluster and many-body perturbation theory. These approaches provide a hierarchy of theoretical methods for the computation of positions and widths of molecular resonances. Within this framework, we may also examine properties of resonances including the dependence of these parameters on molecular geometry. Some applications of these methods to temporary anions and dianions will also be discussed.

  4. In search of the skeletal stem cell: isolation and separation strategies at the macro/micro scale for skeletal regeneration.

    PubMed

    Gothard, David; Tare, Rahul S; Mitchell, Peter D; Dawson, Jonathan I; Oreffo, Richard O C

    2011-04-07

    Skeletal stem cells (SSCs) show great capacity for bone and cartilage repair; however, current in vitro cultures are heterogeneous, displaying a hierarchy of differentiation potential. SSCs represent the diminutive true multipotent stem cell fraction of bone marrow mononuclear cell (BMMNC) populations. Endeavours to isolate SSCs have generated a multitude of separation methodologies. SSCs were first identified and isolated by their ability to adhere to culture plastic. Once isolated, further separation is achieved via culture in selective or conditioned media (CM). Indeed, preferential SSC growth has been demonstrated through selective in vitro culture conditions. Other approaches have utilised cell morphology (size and shape) as selection criteria. Studies have also targeted SSCs based on their preferential adhesion to specified compounds, individually or in combination, on both macro and microscale platforms. Nevertheless, most of these methods, which represent macroscale function with relatively high throughput, yield insufficient purity. Consequently, research has sought to downsize isolation methodologies to the microscale for single cell analysis. The central approach is identification of the requisite cell populations of SSC-specific surface markers that can be targeted for isolation by either positive or negative selection. SELEX and phage display technology provide apt means to sift through substantial numbers of candidate markers. In contrast, single cell analysis is the paramount advantage of microfluidics, a relatively new field for cell biology. Here cells can be separated under continuous or discontinuous flow according to intrinsic phenotypic and physicochemical properties. The combination of macroscale quantity with microscale specificity to generate robust high-throughput (HT) technology for pure SSC sorting, isolation and enrichment offers significant implications therein for skeletal regenerative strategies as a consequence of lab-on-chip-derived methodology.

  5. Evaluating the abuse potential of opioids and abuse-deterrent opioid formulations: A review of clinical study methodology.

    PubMed

    Setnik, Beatrice; Schoedel, Kerri A; Levy-Cooperman, Naama; Shram, Megan; Pixton, Glenn C; Roland, Carl L

    With the development of opioid abuse-deterrent formulations (ADFs), there is a need to conduct well-designed human abuse potential studies to evaluate the effectiveness of their deterrent properties. Although these types of studies have been conducted for many years, largely to evaluate inherent abuse potential of a molecule and inform drug scheduling, methodological approaches have varied across studies. The focus of this review is to describe current "best practices" and methodological adaptations required to assess abuse-deterrent opioid formulations for regulatory submissions. A literature search was conducted in PubMed® to review methodological approaches (study conduct and analysis) used in opioid human abuse potential studies. Search terms included a combination of "opioid," "opiate," "abuse potential," "abuse liability," "liking," AND "pharmacodynamic," and only studies that evaluated single doses of opioids in healthy, nondependent individuals with or without prior opioid experience were included. Seventy-one human abuse potential studies meeting the prespecified criteria were identified, of which 21 studies evaluated a purported opioid ADF. Based on these studies, key methodological considerations were reviewed and summarized according to participant demographics, study prequalification, comparator and dose selection, route of administration and drug manipulation, study blinding, outcome measures and training, safety, and statistical analyses. The authors recommend careful consideration of key elements (eg, a standardized definition of a "nondependent recreational user"), as applicable, and offer key principles and "best practices" when conducting human abuse potential studies for opioid ADFs. Careful selection of appropriate study conditions is dependent on the type of ADF technology being evaluated.

  6. Multi-Objectivising Combinatorial Optimisation Problems by Means of Elementary Landscape Decompositions.

    PubMed

    Ceberio, Josu; Calvo, Borja; Mendiburu, Alexander; Lozano, Jose A

    2018-02-15

    In the last decade, many works in combinatorial optimisation have shown that, due to the advances in multi-objective optimisation, the algorithms from this field could be used for solving single-objective problems as well. In this sense, a number of papers have proposed multi-objectivising single-objective problems in order to use multi-objective algorithms in their optimisation. In this article, we follow up this idea by presenting a methodology for multi-objectivising combinatorial optimisation problems based on elementary landscape decompositions of their objective function. Under this framework, each of the elementary landscapes obtained from the decomposition is considered as an independent objective function to optimise. In order to illustrate this general methodology, we consider four problems from different domains: the quadratic assignment problem and the linear ordering problem (permutation domain), the 0-1 unconstrained quadratic optimisation problem (binary domain), and the frequency assignment problem (integer domain). We implemented two widely known multi-objective algorithms, NSGA-II and SPEA2, and compared their performance with that of a single-objective GA. The experiments conducted on a large benchmark of instances of the four problems show that the multi-objective algorithms clearly outperform the single-objective approaches. Furthermore, a discussion on the results suggests that the multi-objective space generated by this decomposition enhances the exploration ability, thus permitting NSGA-II and SPEA2 to obtain better results in the majority of the tested instances.

  7. Children and youth with disabilities: innovative methods for single qualitative interviews.

    PubMed

    Teachman, Gail; Gibson, Barbara E

    2013-02-01

    There is a paucity of explicit literature outlining methods for single-interview studies with children, and almost none have focused on engaging children with disabilities. Drawing from a pilot study, we address these gaps by describing innovative techniques, strategies, and methods for engaging children and youth with disabilities in a single qualitative interview. In the study, we explored the beliefs, assumptions, and experiences of children and youth with cerebral palsy and their parents regarding the importance of walking. We describe three key aspects of our child-interview methodological approach: collaboration with parents, a toolkit of customizable interview techniques, and strategies to consider the power differential inherent in child-researcher interactions. Examples from our research illustrate what worked well and what was less successful. Researchers can optimize single interviews with children with disabilities by collaborating with family members and by preparing a toolkit of customizable interview techniques.

  8. Lead generation in crop protection research: a portfolio approach to agrochemical discovery.

    PubMed

    Loso, Michael R; Garizi, Negar; Hegde, Vidyadhar B; Hunter, James E; Sparks, Thomas C

    2017-04-01

    The need for increased food and feed supply to support future global demand with the added challenges of resistance pressure and an evolving regulatory environment necessitates the discovery of new crop protection agents for growers of today and tomorrow. Lead generation is the critical 'engine' for maintaining a robust pipeline of new high-value products. A wide variety of approaches exist for the generation of new leads, many of which have demonstrated success. Each approach features some degree of merit or benefit while also having some inherent drawback or level of risk. While risk for any single approach can be mitigated in a variety of different ways depending on the approach, long-term viability of a successful lead generation program merits utilization of a portfolio of different approaches and methodologies for the generation of new leads. © 2016 Society of Chemical Industry.

  9. Improved Temperature Sounding and Quality Control Methodology Using AIRS/AMSU Data: The AIRS Science Team Version 5 Retrieval Algorithm

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Blaisdell, John M.; Iredell, Lena; Keita, Fricky

    2009-01-01

    This paper describes the AIRS Science Team Version 5 retrieval algorithm in terms of its three most significant improvements over the methodology used in the AIRS Science Team Version 4 retrieval algorithm. Improved physics in Version 5 allows for use of AIRS clear column radiances in the entire 4.3 micron CO2 absorption band in the retrieval of temperature profiles T(p) during both day and night. Tropospheric sounding 15 micron CO2 observations are now used primarily in the generation of clear column radiances R(sub i) for all channels. This new approach allows for the generation of more accurate values of R(sub i) and T(p) under most cloud conditions. Secondly, Version 5 contains a new methodology to provide accurate case-by-case error estimates for retrieved geophysical parameters and for channel-by-channel clear column radiances. Thresholds of these error estimates are used in a new approach for Quality Control. Finally, Version 5 also contains for the first time an approach to provide AIRS soundings in partially cloudy conditions that does not require use of any microwave data. This new AIRS Only sounding methodology, referred to as AIRS Version 5 AO, was developed as a backup to AIRS Version 5 should the AMSU-A instrument fail. Results are shown comparing the relative performance of the AIRS Version 4, Version 5, and Version 5 AO for the single day, January 25, 2003. The Goddard DISC is now generating and distributing products derived using the AIRS Science Team Version 5 retrieval algorithm. This paper also describes the Quality Control flags contained in the DISC AIRS/AMSU retrieval products and their intended use for scientific research purposes.

  10. Mixed methods designs: an innovative methodological approach for nursing research.

    PubMed

    Paturzo, Marco; Colaceci, Sofia; Clari, Marco; Mottola, Antonella; Alvaro, Rosaria; Vellone, Ercole

    2016-01-01

    Mixed methods designs: an innovative methodological approach for nursing research. The mixed method research designs (MM) combine qualitative and quantitative approaches in the research process, in a single study or series of studies. Their use can provide a wider understanding of multifaceted phenomena. This article presents a general overview of the structure and design of MM to spread this approach in the Italian nursing research community. The MM designs most commonly used in the nursing field are the convergent parallel design, the sequential explanatory design, the exploratory sequential design and the embedded design. For each method a research example is presented. The use of MM can be an added value to improve clinical practices as, through the integration of qualitative and quantitative methods, researchers can better assess complex phenomena typical of nursing.

  11. Discourse analysis in general practice: a sociolinguistic approach.

    PubMed

    Nessa, J; Malterud, K

    1990-06-01

    It is a simple but important fact that as general practitioners we talk to our patients. The quality of the conversation is of vital importance for the outcome of the consultation. The purpose of this article is to discuss a methodological tool borrowed from sociolinguistics--discourse analysis. To assess the suitability of this method for analysis of general practice consultations, the authors have performed a discourse analysis of one single consultation. Our experiences are presented here.

  12. Development and application of optimum sensitivity analysis of structures

    NASA Technical Reports Server (NTRS)

    Barthelemy, J. F. M.; Hallauer, W. L., Jr.

    1984-01-01

    The research focused on developing an algorithm applying optimum sensitivity analysis for multilevel optimization. The research efforts have been devoted to assisting NASA Langley's Interdisciplinary Research Office (IRO) in the development of a mature methodology for a multilevel approach to the design of complex (large and multidisciplinary) engineering systems. An effort was undertaken to identify promising multilevel optimization algorithms. In the current reporting period, the computer program generating baseline single level solutions was completed and tested out.

  13. Annotate-it: a Swiss-knife approach to annotation, analysis and interpretation of single nucleotide variation in human disease

    PubMed Central

    2012-01-01

    The increasing size and complexity of exome/genome sequencing data requires new tools for clinical geneticists to discover disease-causing variants. Bottlenecks in identifying the causative variation include poor cross-sample querying, constantly changing functional annotation and not considering existing knowledge concerning the phenotype. We describe a methodology that facilitates exploration of patient sequencing data towards identification of causal variants under different genetic hypotheses. Annotate-it facilitates handling, analysis and interpretation of high-throughput single nucleotide variant data. We demonstrate our strategy using three case studies. Annotate-it is freely available and test data are accessible to all users at http://www.annotate-it.org. PMID:23013645

  14. Convergent close coupling versus the generalized Sturmian function approach: Wave-function analysis

    NASA Astrophysics Data System (ADS)

    Ambrosio, M.; Mitnik, D. M.; Gasaneo, G.; Randazzo, J. M.; Kadyrov, A. S.; Fursa, D. V.; Bray, I.

    2015-11-01

    We compare the physical information contained in the Temkin-Poet (TP) scattering wave function representing electron-impact ionization of hydrogen, calculated by the convergent close-coupling (CCC) and generalized Sturmian function (GSF) methodologies. The idea is to show that the ionization cross section can be extracted from the wave functions themselves. Using two different procedures based on hyperspherical Sturmian functions we show that the transition amplitudes contained in both GSF and CCC scattering functions lead to similar single-differential cross sections. The single-continuum channels were also a subject of the present studies, and we show that the elastic and excitation amplitudes are essentially the same as well.

  15. Practical Strategies for Collaboration across Discipline-Based Education Research and the Learning Sciences.

    PubMed

    Peffer, Melanie; Renken, Maggie

    Rather than pursue questions related to learning in biology from separate camps, recent calls highlight the necessity of interdisciplinary research agendas. Interdisciplinary collaborations allow for a complicated and expanded approach to questions about learning within specific science domains, such as biology. Despite its benefits, interdisciplinary work inevitably involves challenges. Some such challenges originate from differences in theoretical and methodological approaches across lines of work. Thus, aims at developing successful interdisciplinary research programs raise important considerations regarding methodologies for studying biology learning, strategies for approaching collaborations, and training of early-career scientists. Our goal here is to describe two fields important to understanding learning in biology, discipline-based education research and the learning sciences. We discuss differences between each discipline's approach to biology education research and the benefits and challenges associated with incorporating these perspectives in a single research program. We then propose strategies for building productive interdisciplinary collaboration. © 2016 M. Peffer and M. Renken. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  16. The effect of vision on knee biomechanics during functional activities - A systematic review.

    PubMed

    Louw, Quinette; Gillion, Nadia; van Niekerk, Sjan-Mari; Morris, Linzette; Baumeister, Jochen

    2015-07-01

    The objective of this study was to assess the effect of occluded vision on lower limb kinematics and kinetics of the knee joint during functional tasks including drop landing (single or double leg), squatting (single or double leg), stepping down, cutting movement and hopping in healthy individuals, or individuals who had an ACL reconstruction or deficiency with no vision impairments. A systematic review was conducted, and electronic databases were searched between March 2012 and April 2013 for eligible papers. Methodological quality of each study was assessed using the Downs and Black revised checklist. Six studies met the eligibility criteria and a wide variation in methodological approaches was reported. This small evidence base indicated equivocal evidence about the effect of vision on knee biomechanics in individuals with healthy and compromised somatosensory function post an ACL reconstruction or injury. Clinicians should consider innovative, individualised ACL rehabilitation strategies when prescribing exercises which involve visual occlusion. Further research to increase the relatively small evidence base for the effect of vision on knee biomechanics is warranted. Copyright © 2014 Sports Medicine Australia. All rights reserved.

  17. An approach to accidents modeling based on compounds road environments.

    PubMed

    Fernandes, Ana; Neves, Jose

    2013-04-01

    The most common approach to study the influence of certain road features on accidents has been the consideration of uniform road segments characterized by a unique feature. However, when an accident is related to the road infrastructure, its cause is usually not a single characteristic but rather a complex combination of several characteristics. The main objective of this paper is to describe a methodology developed in order to consider the road as a complete environment by using compound road environments, overcoming the limitations inherent in considering only uniform road segments. The methodology consists of: dividing a sample of roads into segments; grouping them into quite homogeneous road environments using cluster analysis; and identifying the influence of skid resistance and texture depth on road accidents in each environment by using generalized linear models. The application of this methodology is demonstrated for eight roads. Based on real data from accidents and road characteristics, three compound road environments were established where the pavement surface properties significantly influence the occurrence of accidents. Results clearly showed that road environments where braking maneuvers are more common, or those with small radii of curvature and high speeds, require higher skid resistance and texture depth as an important contribution to accident prevention. Copyright © 2013 Elsevier Ltd. All rights reserved.
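
    As a rough illustration only (not the authors' implementation), the following Python sketch shows the two-stage idea described above: cluster road segments into compound road environments, then fit a Poisson generalized linear model per environment relating accident counts to skid resistance and texture depth. All variable names, settings and the synthetic data are hypothetical.

      import numpy as np
      import pandas as pd
      from sklearn.cluster import KMeans
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      segments = pd.DataFrame({
          "curvature": rng.uniform(0, 1, 400),
          "gradient": rng.uniform(0, 6, 400),
          "speed_limit": rng.choice([50, 80, 100], 400),
          "skid_resistance": rng.uniform(0.3, 0.8, 400),
          "texture_depth": rng.uniform(0.4, 1.5, 400),
          "accidents": rng.poisson(2, 400),        # synthetic accident counts
      })

      # Step 1: group segments into quasi-homogeneous road environments
      env_features = segments[["curvature", "gradient", "speed_limit"]]
      segments["environment"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(env_features)

      # Step 2: Poisson GLM of accident counts within each environment
      for env, grp in segments.groupby("environment"):
          X = sm.add_constant(grp[["skid_resistance", "texture_depth"]])
          model = sm.GLM(grp["accidents"], X, family=sm.families.Poisson()).fit()
          print(f"environment {env}:\n{model.params}\n")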

  18. A stochastic optimal feedforward and feedback control methodology for superagility

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim; Direskeneli, Haldun; Taylor, Deborah B.

    1992-01-01

    A new control design methodology is developed: Stochastic Optimal Feedforward and Feedback Technology (SOFFT). Traditional design techniques optimize a single cost function (which expresses the design objectives) to obtain both the feedforward and feedback control laws. This approach places conflicting demands on the control law such as fast tracking versus noise attenuation/disturbance rejection. In the SOFFT approach, two cost functions are defined. The feedforward control law is designed to optimize one cost function, while the feedback law optimizes the other. By separating the design objectives and decoupling the feedforward and feedback design processes, both objectives can be achieved fully. A new measure of command tracking performance, Z-plots, is also developed. By analyzing these plots at off-nominal conditions, the sensitivity or robustness of the system in tracking commands can be predicted. Z-plots provide an important tool for designing robust control systems. The Variable-Gain SOFFT methodology was used to design a flight control system for the F/A-18 aircraft. It is shown that SOFFT can be used to expand the operating regime and provide greater performance (flying/handling qualities) throughout the extended flight regime. This work was performed under the NASA SBIR program. ICS plans to market the software developed as a new module in its commercial CACSD software package: ACET.

  19. Stochastic approach for radionuclides quantification

    NASA Astrophysics Data System (ADS)

    Clement, A.; Saurel, N.; Perrin, G.

    2018-01-01

    Gamma spectrometry is a passive non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods using empirical calibration with a standard in order to quantify the activity of nuclear materials by determining the calibration coefficient are useless on non-reproducible, complex and single nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another and involve a high variability of objects. The current quantification process uses numerical modelling of the measured scene with few available data such as geometry or composition. These data are density, material, screen, geometric shape, matrix composition, matrix and source distribution. Some of them are strongly dependent on package data knowledge and operator backgrounds. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment and internal package configuration knowledge. This method suggests combining a global stochastic approach which uses, among others, surrogate models available to simulate the gamma attenuation behaviour, a Bayesian approach which considers conditional probability densities of problem inputs, and Markov chain Monte Carlo (MCMC) algorithms which solve inverse problems, with the gamma-ray emission radionuclide spectrum and the outside dimensions of the objects of interest. The methodology is being tested for quantifying actinide activity in source standards of different matrices, compositions and source configurations, in terms of actinide masses, locations and distributions. Activity uncertainties are taken into account by this adjustment methodology.
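
    A minimal Metropolis-Hastings sketch of the Bayesian inversion idea described above, in Python: a toy surrogate forward model maps an unknown activity and matrix density to an expected count rate, and MCMC samples the posterior given one measured rate. The forward model, priors and all numbers are hypothetical placeholders, not the CEA implementation.

      import numpy as np

      rng = np.random.default_rng(1)

      def surrogate_count_rate(activity, density, mu=0.08, thickness=20.0, eff=1e-3):
          # toy surrogate: detected rate = activity * detection efficiency * matrix attenuation
          return activity * eff * np.exp(-mu * density * thickness)

      measured_rate, sigma = 12.0, 1.5  # counts/s and assumed measurement noise

      def log_posterior(theta):
          activity, density = theta
          if activity <= 0 or not (0.5 < density < 3.0):  # flat priors over plausible ranges
              return -np.inf
          predicted = surrogate_count_rate(activity, density)
          return -0.5 * ((measured_rate - predicted) / sigma) ** 2

      theta = np.array([1e5, 1.5])
      logp = log_posterior(theta)
      samples = []
      for _ in range(20000):
          proposal = theta + rng.normal(0.0, [2000.0, 0.05])
          logp_new = log_posterior(proposal)
          if np.log(rng.uniform()) < logp_new - logp:  # Metropolis acceptance rule
              theta, logp = proposal, logp_new
          samples.append(theta.copy())

      samples = np.array(samples[5000:])  # discard burn-in
      print("posterior mean activity:", samples[:, 0].mean())
      print("posterior std of activity:", samples[:, 0].std())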

  20. Knowledge management in secondary pharmaceutical manufacturing by mining of data historians-A proof-of-concept study.

    PubMed

    Meneghetti, Natascia; Facco, Pierantonio; Bezzo, Fabrizio; Himawan, Chrismono; Zomer, Simeone; Barolo, Massimiliano

    2016-05-30

    In this proof-of-concept study, a methodology is proposed to systematically analyze large data historians of secondary pharmaceutical manufacturing systems using data mining techniques. The objective is to develop an approach that automatically retrieves operation-relevant information to assist management in the periodic review of a manufacturing system. The proposed methodology allows one to automatically perform three tasks: the identification of single batches within the entire data-sequence of the historical dataset, the identification of distinct operating phases within each batch, and the characterization of a batch with respect to an assigned multivariate set of operating characteristics. The approach is tested on a six-month dataset of a commercial-scale granulation/drying system, where several millions of data entries are recorded. The quality of results and the generality of the approach indicate that there is a strong potential for extending the method to even larger historical datasets and to different operations, thus making it an advanced PAT tool that can assist the implementation of continual improvement paradigms within a quality-by-design framework. Copyright © 2016 Elsevier B.V. All rights reserved.
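
    A minimal pandas sketch (hypothetical tags and thresholds, not the authors' code) of the first two tasks listed above: split a continuous historian record into batches using an equipment-running flag, then label phases within each batch from an operating variable.

      import numpy as np
      import pandas as pd

      n = 1000
      idx = pd.date_range("2024-01-01", periods=n, freq="min")
      running = (np.sin(np.linspace(0, 20, n)) > 0).astype(int)      # toy on/off signal
      temperature = 40 + 30 * running + np.random.default_rng(2).normal(0, 2, n)
      hist = pd.DataFrame({"running": running, "temperature": temperature}, index=idx)

      # Task 1: identify single batches as contiguous runs where the equipment is on
      hist["batch_id"] = (hist["running"].diff().fillna(0) == 1).cumsum().where(hist["running"] == 1)

      # Task 2: within each batch, label a "heating" vs "drying" phase by a temperature band
      hist["phase"] = np.where(hist["temperature"] < 65, "heating", "drying")

      for batch_id, batch in hist.dropna(subset=["batch_id"]).groupby("batch_id"):
          print(batch_id, batch.index[0], batch.index[-1], batch["phase"].value_counts().to_dict())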

  1. Investigating Nanoscale Electrochemistry with Surface- and Tip-Enhanced Raman Spectroscopy.

    PubMed

    Zaleski, Stephanie; Wilson, Andrew J; Mattei, Michael; Chen, Xu; Goubert, Guillaume; Cardinal, M Fernanda; Willets, Katherine A; Van Duyne, Richard P

    2016-09-20

    The chemical sensitivity of surface-enhanced Raman spectroscopy (SERS) methodologies allows for the investigation of heterogeneous chemical reactions with high sensitivity. Specifically, SERS methodologies are well-suited to study electron transfer (ET) reactions, which lie at the heart of numerous fundamental processes: electrocatalysis, solar energy conversion, energy storage in batteries, and biological events such as photosynthesis. Heterogeneous ET reactions are commonly monitored by electrochemical methods such as cyclic voltammetry, observing billions of electrochemical events per second. Since the first proof of detecting single molecules by redox cycling, there has been growing interest in examining electrochemistry at the nanoscale and single-molecule levels. Doing so unravels details that would otherwise be obscured by an ensemble experiment. The use of optical spectroscopies, such as SERS, to elucidate nanoscale electrochemical behavior is an attractive alternative to traditional approaches such as scanning electrochemical microscopy (SECM). While techniques such as single-molecule fluorescence or electrogenerated chemiluminescence have been used to optically monitor electrochemical events, SERS methodologies, in particular, have shown great promise for exploring electrochemistry at the nanoscale. SERS is ideally suited to study nanoscale electrochemistry because the Raman-enhancing metallic, nanoscale substrate duly serves as the working electrode material. Moreover, SERS has the ability to directly probe single molecules without redox cycling and can achieve nanoscale spatial resolution in combination with super-resolution or scanning probe microscopies. This Account summarizes the latest progress from the Van Duyne and Willets groups toward understanding nanoelectrochemistry using Raman spectroscopic methodologies. The first half of this Account highlights three techniques that have been recently used to probe few- or single-molecule electrochemical events: single-molecule SERS (SMSERS), superlocalization SERS imaging, and tip-enhanced Raman spectroscopy (TERS). While all of the studies we discuss probe model redox dye systems, the experiments described herein push the study of nanoscale electrochemistry toward the fundamental limit, in terms of both chemical sensitivity and spatial resolution. The second half of this Account discusses current experimental strategies for studying nanoelectrochemistry with SERS techniques, which includes relevant electrochemically and optically active molecules, substrates, and substrate functionalization methods. In particular, we highlight the wide variety of SERS-active substrates and optically active molecules that can be implemented for EC-SERS, as well as the need to carefully characterize both the electrochemistry and resultant EC-SERS response of each new redox-active molecule studied. Finally, we conclude this Account with our perspective on the future directions of studying nanoscale electrochemistry with SERS/TERS, which includes the integration of SECM with TERS and the use of theoretical methods to further describe the fundamental intricacies of single-molecule, single-site electrochemistry at the nanoscale.

  2. Status of Single-Case Research Designs for Evidence-Based Practice

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Turygin, Nicole C.; Beighley, Jennifer; Matson, Michael L.

    2012-01-01

    The single-case research design has become a paradoxical methodology in the applied sciences. While various experimental designs have been in place for over 50 years, there has not been wide acceptance of single-case methodology outside clinical and school psychology, or the field of special education. These methods were developed in the U.S.A.,…

  3. Joint Models of Longitudinal and Time-to-Event Data with More Than One Event Time Outcome: A Review.

    PubMed

    Hickey, Graeme L; Philipson, Pete; Jorgensen, Andrea; Kolamunnage-Dona, Ruwanthi

    2018-01-31

    Methodological development and clinical application of joint models of longitudinal and time-to-event outcomes have grown substantially over the past two decades. However, much of this research has concentrated on a single longitudinal outcome and a single event time outcome. In clinical and public health research, patients who are followed up over time may often experience multiple, recurrent, or a succession of clinical events. Models that utilise such multivariate event time outcomes are quite valuable in clinical decision-making. We comprehensively review the literature for implementation of joint models involving more than a single event time per subject. We consider the distributional and modelling assumptions, including the association structure, estimation approaches, software implementations, and clinical applications. Research into this area is proving highly promising, but to date remains in its infancy.

  4. Large-scale structural optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.

    1983-01-01

    Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single, rather than trade-off, design methodology and the incompatibility of large-scale optimization with single program, single computer methods. The software problem can be approached by placing the full analysis outside of the optimization loop. Full analysis is then performed only periodically. Problem-dependent software can be removed from the generic code using a systems programming technique; this problem-dependent software then embodies the definitions of design variables, objective function and design constraints. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problems by an organization of people and machines.

  5. Understanding Scientific Methodology in the Historical and Experimental Sciences via Language Analysis

    NASA Astrophysics Data System (ADS)

    Dodick, Jeff; Argamon, Shlomo; Chase, Paul

    2009-08-01

    A key focus of current science education reforms involves developing inquiry-based learning materials. However, without an understanding of how working scientists actually do science, such learning materials cannot be properly developed. Until now, research on scientific reasoning has focused on cognitive studies of individual scientific fields. However, the question remains as to whether scientists in different fields fundamentally rely on different methodologies. Although many philosophers and historians of science do indeed assert that there is no single monolithic scientific method, this has never been tested empirically. We therefore approach this problem by analyzing patterns of language used by scientists in their published work. Our results demonstrate systematic variation in language use between types of science that are thought to differ in their characteristic methodologies. The features of language use that were found correspond closely to a proposed distinction between Experimental Sciences (e.g., chemistry) and Historical Sciences (e.g., paleontology); thus, different underlying rhetorical and conceptual mechanisms likely operate for scientific reasoning and communication in different contexts.

  6. A self-contained, automated methodology for optimal flow control validated for transition delay

    NASA Technical Reports Server (NTRS)

    Joslin, Ronald D.; Gunzburger, Max D.; Nicolaides, R. A.; Erlebacher, Gordon; Hussaini, M. Yousuff

    1995-01-01

    This paper describes a self-contained, automated methodology for flow control along with a validation of the methodology for the problem of boundary layer instability suppression. The objective of control is to match the stress vector along a portion of the boundary to a given vector; instability suppression is achieved by choosing the given vector to be that of a steady base flow, e.g., Blasius boundary layer. Control is effected through the injection or suction of fluid through a single orifice on the boundary. The present approach couples the time-dependent Navier-Stokes system with an adjoint Navier-Stokes system and optimality conditions from which optimal states, i.e., unsteady flow fields, and control, e.g., actuators, may be determined. The results demonstrate that instability suppression can be achieved without any a priori knowledge of the disturbance, which is significant because other control techniques have required some knowledge of the flow unsteadiness such as frequencies, instability type, etc.
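
    A toy illustration (not the Navier-Stokes/adjoint formulation of the paper) of the state/adjoint coupling idea: for a linear discrete-time system, the gradient of a terminal tracking cost with respect to the control sequence is obtained by one forward state sweep and one backward adjoint sweep, then used in gradient descent. All matrices and settings are hypothetical.

      import numpy as np

      A = np.array([[1.0, 0.1], [0.0, 0.95]])
      B = np.array([[0.0], [0.1]])
      x0 = np.zeros(2)
      target = np.array([1.0, 0.0])
      T, alpha, lr = 50, 1e-3, 0.5
      u = np.zeros((T, 1))                       # control sequence (the "actuator")

      for it in range(200):
          # forward sweep: state equation
          x = np.zeros((T + 1, 2))
          x[0] = x0
          for t in range(T):
              x[t + 1] = A @ x[t] + B @ u[t]
          # backward sweep: adjoint equation
          lam = np.zeros((T + 1, 2))
          lam[T] = x[T] - target
          for t in range(T - 1, -1, -1):
              lam[t] = A.T @ lam[t + 1]
          # gradient of J = 0.5*||x_T - target||^2 + 0.5*alpha*sum ||u_t||^2
          grad = alpha * u + np.array([B.T @ lam[t + 1] for t in range(T)])
          u -= lr * grad

      print("final state:", x[T], "tracking cost:", 0.5 * np.sum((x[T] - target) ** 2))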

  7. Real-Time Prognostics of a Rotary Valve Actuator

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew

    2015-01-01

    Valves are used in many domains and often have system-critical functions. As such, it is important to monitor the health of valves and their actuators and predict remaining useful life. In this work, we develop a model-based prognostics approach for a rotary valve actuator. Due to limited observability of the component with multiple failure modes, a lumped damage approach is proposed for estimation and prediction of damage progression. In order to support the goal of real-time prognostics, an approach to prediction is developed that does not require online simulation to compute remaining life, rather, a function mapping the damage state to remaining useful life is found offline so that predictions can be made quickly online with a single function evaluation. Simulation results demonstrate the overall methodology, validating the lumped damage approach and demonstrating real-time prognostics.
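
    A minimal sketch of the offline/online split described above: an assumed (purely illustrative) damage growth law is simulated once offline to build a damage-state-to-remaining-useful-life table, and the online prediction then reduces to a single interpolation call.

      import numpy as np

      # offline: simulate a hypothetical damage growth law up to a failure threshold
      def simulate_time_to_failure(d0, rate=0.002, threshold=1.0, dt=1.0):
          d, t = d0, 0.0
          while d < threshold:
              d += rate * (1.0 + d) * dt        # toy nonlinear damage progression
              t += dt
          return t

      damage_grid = np.linspace(0.0, 0.99, 200)
      rul_table = np.array([simulate_time_to_failure(d) for d in damage_grid])

      # online: a single function evaluation per prediction
      def predict_rul(damage_estimate):
          return float(np.interp(damage_estimate, damage_grid, rul_table))

      print(predict_rul(0.25), predict_rul(0.8))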

  8. Reduced representation approaches to interrogate genome diversity in large repetitive plant genomes.

    PubMed

    Hirsch, Cory D; Evans, Joseph; Buell, C Robin; Hirsch, Candice N

    2014-07-01

    Technology and software improvements in the last decade now provide methodologies to access the genome sequence of not only a single accession, but also multiple accessions of plant species. This provides a means to interrogate species diversity at the genome level. Ample diversity among accessions in a collection of species can be found, including single-nucleotide polymorphisms, insertions and deletions, copy number variation and presence/absence variation. For species with small, non-repetitive rich genomes, re-sequencing of query accessions is robust, highly informative, and economically feasible. However, for species with moderate to large sized repetitive-rich genomes, technical and economic barriers prevent en masse genome re-sequencing of accessions. Multiple approaches to access a focused subset of loci in species with larger genomes have been developed, including reduced representation sequencing, exome capture and transcriptome sequencing. Collectively, these approaches have enabled interrogation of diversity on a genome scale for large plant genomes, including crop species important to worldwide food security. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  9. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme.

    PubMed

    Makrakis, Vassilios; Kostoulas-Makrakis, Nelly

    2016-02-01

    Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Neural network approach in multichannel auditory event-related potential analysis.

    PubMed

    Wu, F Y; Slater, J D; Ramsay, R E

    1994-04-01

    Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, it is strongly indicated through statistical analysis that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria of P300 ERP assessment and facilitate the computer-aided analysis of neuropsychological functions.
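
    A minimal sketch of a neural-network classifier for multichannel ERP feature vectors, using scikit-learn in place of the original (unspecified) network; the synthetic "patients vs. controls" data and feature counts are purely illustrative.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(3)
      n_subjects, n_channels, n_features = 120, 8, 16     # e.g. P300 latency/amplitude features per channel
      X = rng.normal(0, 1, (n_subjects, n_channels * n_features))
      y = rng.integers(0, 2, n_subjects)                  # 0 = control, 1 = patient (synthetic labels)
      X[y == 1, :10] += 0.8                               # inject a weak group difference

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
      clf = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0))
      clf.fit(X_train, y_train)
      print("test accuracy:", clf.score(X_test, y_test))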

  11. Patch-based Adaptive Mesh Refinement for Multimaterial Hydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lomov, I; Pember, R; Greenough, J

    2005-10-18

    We present a patch-based direct Eulerian adaptive mesh refinement (AMR) algorithm for modeling real equation-of-state, multimaterial compressible flow with strength. Our approach to AMR uses a hierarchical, structured grid approach first developed by (Berger and Oliger 1984). The grid structure is dynamic in time and is composed of nested uniform rectangular grids of varying resolution. The integration scheme on the grid hierarchy is a recursive procedure in which the coarse grids are advanced, then the fine grids are advanced multiple steps to reach the same time, and finally the coarse and fine grids are synchronized to remove conservation errors during the separate advances. The methodology presented here is based on a single grid algorithm developed for multimaterial gas dynamics by (Colella et al. 1993), refined by (Greenough et al. 1995), and extended to the solution of solid mechanics problems with significant strength by (Lomov and Rubin 2003). The single grid algorithm uses a second-order Godunov scheme with an approximate single fluid Riemann solver and a volume-of-fluid treatment of material interfaces. The method also uses a non-conservative treatment of the deformation tensor and an acoustic approximation for shear waves in the Riemann solver. This departure from a strict application of the higher-order Godunov methodology to the equations of solid mechanics is justified due to the fact that highly nonlinear behavior of shear stresses is rare. This algorithm is implemented in two codes, Geodyn and Raptor, the latter of which is a coupled rad-hydro code. The present discussion will be solely concerned with hydrodynamics modeling. Results from a number of simulations for flows with and without strength will be presented.

  12. Control and stabilization of decentralized systems

    NASA Technical Reports Server (NTRS)

    Byrnes, Christopher I.; Gilliam, David; Martin, Clyde F.

    1989-01-01

    Proceeding from the problem posed by the need to stabilize the motion of two helicopters maneuvering a single load, a methodology is developed for the stabilization of classes of decentralized systems based on a more algebraic approach, which involves the external symmetries of decentralized systems. Stabilizing local-feedback laws are derived for any class of decentralized systems having a semisimple algebra of symmetries; the helicopter twin-lift problem, as well as certain problems involving the stabilization of discretizations of distributed parameter problems, have just such algebras of symmetries.

  13. Drug target inference through pathway analysis of genomics data

    PubMed Central

    Ma, Haisu; Zhao, Hongyu

    2013-01-01

    Statistical modeling coupled with bioinformatics is commonly used for drug discovery. Although there exist many approaches for single target based drug design and target inference, recent years have seen a paradigm shift to system-level pharmacological research. Pathway analysis of genomics data represents one promising direction for computational inference of drug targets. This article aims at providing a comprehensive review of the evolving issues in this field, covering methodological developments, their pros and cons, as well as future research directions. PMID:23369829

  14. Refinements to the Graves and Pitarka (2010) Broadband Ground-Motion Simulation Method

    DOE PAGES

    Graves, Robert; Pitarka, Arben

    2014-12-17

    This brief article describes refinements to the Graves and Pitarka (2010) broadband ground-motion simulation methodology (GP2010 hereafter) that have been implemented in version 14.3 of the Southern California Earthquake Center (SCEC) Broadband Platform (BBP). The updated version of our method on the current SCEC BBP is referred to as GP14.3. Here, our simulation technique is a hybrid approach that combines low- and high-frequency motions computed with different methods into a single broadband response.

  15. Application of a single-objective, hybrid genetic algorithm approach to pharmacokinetic model building.

    PubMed

    Sherer, Eric A; Sale, Mark E; Pollock, Bruce G; Belani, Chandra P; Egorin, Merrill J; Ivy, Percy S; Lieberman, Jeffrey A; Manuck, Stephen B; Marder, Stephen R; Muldoon, Matthew F; Scher, Howard I; Solit, David B; Bies, Robert R

    2012-08-01

    A limitation in traditional stepwise population pharmacokinetic model building is the difficulty in handling interactions between model components. To address this issue, a method was previously introduced which couples NONMEM parameter estimation and model fitness evaluation to a single-objective, hybrid genetic algorithm for global optimization of the model structure. In this study, the generalizability of this approach for pharmacokinetic model building is evaluated by comparing (1) correct and spurious covariate relationships in a simulated dataset resulting from automated stepwise covariate modeling, Lasso methods, and single-objective hybrid genetic algorithm approaches to covariate identification and (2) information criteria values, model structures, convergence, and model parameter values resulting from manual stepwise versus single-objective, hybrid genetic algorithm approaches to model building for seven compounds. Both manual stepwise and single-objective, hybrid genetic algorithm approaches to model building were applied, blinded to the results of the other approach, for selection of the compartment structure as well as inclusion and model form of inter-individual and inter-occasion variability, residual error, and covariates from a common set of model options. For the simulated dataset, stepwise covariate modeling identified three of four true covariates and two spurious covariates; Lasso identified two of four true and 0 spurious covariates; and the single-objective, hybrid genetic algorithm identified three of four true covariates and one spurious covariate. For the clinical datasets, the Akaike information criterion was a median of 22.3 points lower (range of 470.5 point decrease to 0.1 point decrease) for the best single-objective hybrid genetic-algorithm candidate model versus the final manual stepwise model: the Akaike information criterion was lower by greater than 10 points for four compounds and differed by less than 10 points for three compounds. The root mean squared error and absolute mean prediction error of the best single-objective hybrid genetic algorithm candidates were a median of 0.2 points higher (range of 38.9 point decrease to 27.3 point increase) and 0.02 points lower (range of 0.98 point decrease to 0.74 point increase), respectively, than that of the final stepwise models. In addition, the best single-objective, hybrid genetic algorithm candidate models had successful convergence and covariance steps for each compound, used the same compartment structure as the manual stepwise approach for 6 of 7 (86 %) compounds, and identified 54 % (7 of 13) of covariates included by the manual stepwise approach and 16 covariate relationships not included by manual stepwise models. The model parameter values between the final manual stepwise and best single-objective, hybrid genetic algorithm models differed by a median of 26.7 % (q₁ = 4.9 % and q₃ = 57.1 %). Finally, the single-objective, hybrid genetic algorithm approach was able to identify models capable of estimating absorption rate parameters for four compounds that the manual stepwise approach did not identify. The single-objective, hybrid genetic algorithm represents a general pharmacokinetic model building methodology whose ability to rapidly search the feasible solution space leads to nearly equivalent or superior model fits to pharmacokinetic data.
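
    A minimal single-objective genetic-algorithm sketch of the covariate-selection idea described above, with ordinary least squares plus AIC standing in for the NONMEM fit and fitness evaluation; the data, fitness function and GA settings are hypothetical and this is a plain GA rather than the hybrid algorithm of the study.

      import numpy as np

      rng = np.random.default_rng(4)
      n, p = 200, 8
      X = rng.normal(size=(n, p))
      y = 1.5 * X[:, 0] - 2.0 * X[:, 3] + 0.8 * X[:, 5] + rng.normal(size=n)   # 3 "true" covariates

      def aic(mask):
          # fitness: AIC of an OLS fit using only the covariates switched on in the binary mask
          cols = np.flatnonzero(mask)
          design = np.column_stack([np.ones(n)] + [X[:, j] for j in cols])
          beta, *_ = np.linalg.lstsq(design, y, rcond=None)
          rss = np.sum((y - design @ beta) ** 2)
          return n * np.log(rss / n) + 2 * (len(cols) + 1)

      pop = rng.integers(0, 2, size=(30, p))              # population of binary chromosomes
      for gen in range(40):
          fitness = np.array([aic(ind) for ind in pop])
          parents = pop[np.argsort(fitness)[:10]]         # truncation selection (lower AIC is better)
          children = []
          while len(children) < len(pop):
              a, b = parents[rng.integers(10)], parents[rng.integers(10)]
              cut = rng.integers(1, p)                    # single-point crossover
              child = np.concatenate([a[:cut], b[cut:]])
              flip = rng.random(p) < 0.05                 # mutation
              child[flip] = 1 - child[flip]
              children.append(child)
          pop = np.array(children)

      best = pop[np.argmin([aic(ind) for ind in pop])]
      print("selected covariates:", np.flatnonzero(best), "AIC:", aic(best))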

  16. Effect of Methodological and Ecological Approaches on Heterogeneity of Nest-Site Selection of a Long-Lived Vulture

    PubMed Central

    Moreno-Opo, Rubén; Fernández-Olalla, Mariana; Margalida, Antoni; Arredondo, Ángel; Guil, Francisco

    2012-01-01

    The application of scientific-based conservation measures requires that sampling methodologies in studies modelling similar ecological aspects produce comparable results, making their interpretation easier. We aimed to show how the choice of different methodological and ecological approaches can affect conclusions in nest-site selection studies along different Palearctic meta-populations of an indicator species. First, a multivariate analysis of the variables affecting nest-site selection in a breeding colony of cinereous vulture (Aegypius monachus) in central Spain was performed. Then, a meta-analysis was applied to establish how methodological and habitat-type factors determine differences and similarities in the results obtained by previous studies that have modelled the forest breeding habitat of the species. Our results revealed patterns in nesting-habitat modelling by the cinereous vulture throughout its whole range: steep and south-facing slopes, great cover of large trees and distance to human activities were generally selected. The ratio and situation of the studied plots (nests/random), the use of plots vs. polygons as sampling units and the number of years of data set determined the variability explained by the model. Moreover, a greater size of the breeding colony implied that ecological and geomorphological variables at landscape level were more influential. Additionally, human activities affected colonies situated in Mediterranean forests to a greater extent. For the first time, a meta-analysis regarding the factors determining nest-site selection heterogeneity for a single species at broad scale was achieved. It is essential to homogenize and coordinate experimental design in modelling the selection of species' ecological requirements, so that differences in results among studies are not due to methodological heterogeneity. This would optimize best conservation and management practices for habitats and species in a global context. PMID:22413023

  17. Next-generation Sequencing (NGS) Analysis on Single Circulating Tumor Cells (CTCs) with No Need of Whole-genome Amplification (WGA).

    PubMed

    Palmirotta, Raffaele; Lovero, Domenica; Silvestris, Erica; Felici, Claudia; Quaresmini, Davide; Cafforio, Paola; Silvestris, Franco

    2017-01-01

    Isolation and genotyping of circulating tumor cells (CTCs) is gaining increasing interest among clinical researchers in oncology, not only for investigative purposes but also for concrete application in clinical practice in terms of diagnosis, prognosis and treatment decisions with targeted therapies. For the mutational analysis of single CTCs, the most advanced biotechnology methodology currently available includes the combination of whole genome amplification (WGA) followed by next-generation sequencing (NGS). However, the sequence of these molecular techniques is time-consuming and may also favor operator-dependent errors, related to the procedures themselves that, as in the case of the WGA technique, might affect downstream molecular analyses. A preliminary molecular analysis by NGS was performed on a CTC model without a preceding WGA step. We set up an artificial sample obtained by spiking the SK-MEL-28 melanoma cell line in normal donor peripheral whole blood. Melanoma cells were first enriched using an AutoMACS® (Miltenyi) cell separator and then isolated as single and pooled CTCs by DEPArray™ System (Silicon Biosystems). NGS analysis, using the Ion AmpliSeq™ Cancer Hotspot Panel v2 (Life Technologies) with the Ion Torrent PGM™ system (Life Technologies), was performed on the SK-MEL-28 cell pellet, a single CTC previously processed with WGA and on 1, 2, 4 and 8 recovered CTCs without WGA pre-amplification. NGS directly carried out on CTCs without WGA showed the same mutations identified in the SK-MEL-28 cell line pellet, with considerable efficiency and avoiding the errors induced by the WGA procedure. We identified a cost-effective, time-saving and reliable methodological approach that could improve the analytical accuracy of the liquid biopsy and appears promising in studying CTCs from cancer patients for both research and clinical purposes. Copyright© 2017, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.

  18. Hazard interactions and interaction networks (cascades) within multi-hazard methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2016-08-01

    This paper combines research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between multi-layer single-hazard approaches and multi-hazard approaches that integrate such interactions. This synthesis suggests that ignoring interactions between important environmental and anthropogenic processes could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. In this paper we proceed to present an enhanced multi-hazard framework through the following steps: (i) description and definition of three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment, (ii) outlining of three types of interaction relationship (triggering, increased probability, and catalysis/impedance), and (iii) assessment of the importance of networks of interactions (cascades) through case study examples (based on the literature, field observations and semi-structured interviews). We further propose two visualisation frameworks to represent these networks of interactions: hazard interaction matrices and hazard/process flow diagrams. Our approach reinforces the importance of integrating interactions between different aspects of the Earth system, together with human activity, into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
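
    A small hypothetical illustration of the "hazard interaction matrix" visualisation proposed above: rows are primary hazards, columns are secondary hazards, and each cell records the interaction type. The entries below are generic examples, not data from the paper.

      import pandas as pd

      hazards = ["earthquake", "landslide", "flood", "wildfire"]
      matrix = pd.DataFrame("", index=hazards, columns=hazards)
      matrix.loc["earthquake", "landslide"] = "triggers"
      matrix.loc["wildfire", "landslide"] = "increases probability"
      matrix.loc["flood", "landslide"] = "triggers"
      matrix.loc["earthquake", "flood"] = "triggers (e.g. via dam failure)"
      print(matrix)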

  19. Using object-oriented analysis to design a multi-mission ground data system

    NASA Technical Reports Server (NTRS)

    Shames, Peter

    1995-01-01

    This paper describes an analytical approach and descriptive methodology that is adapted from Object-Oriented Analysis (OOA) techniques. The technique is described and then used to communicate key issues of system logical architecture. The essence of the approach is to limit the analysis to only service objects, with the idea of providing a direct mapping from the design to a client-server implementation. Key perspectives on the system, such as user interaction, data flow and management, service interfaces, hardware configuration, and system and data integrity are covered. A significant advantage of this service-oriented approach is that it permits mapping all of these different perspectives on the system onto a single common substrate. This services substrate is readily represented diagrammatically, thus making details of the overall design much more accessible.

  20. Condensing Raman spectrum for single-cell phenotype analysis.

    PubMed

    Sun, Shiwei; Wang, Xuetao; Gao, Xin; Ren, Lihui; Su, Xiaoquan; Bu, Dongbo; Ning, Kang

    2015-01-01

    In recent years, the high-throughput and non-invasive Raman spectrometry technique has matured as an effective approach to identification of individual cells by species, even in complex, mixed populations. Raman profiling is an appealing optical microscopic method to achieve this. To fully utilize Raman profiling for single-cell analysis, an extensive understanding of Raman spectra is necessary to answer questions such as which filtering methodologies are effective for pre-processing of Raman spectra, what strains can be distinguished by Raman spectra, and what features serve best as Raman-based biomarkers for single cells. In this work, we have proposed an approach called rDisc to discretize the original Raman spectrum into only a few (usually less than 20) representative peaks (Raman shifts). The approach has advantages in removing noise and condensing the original spectrum. In particular, effective signal processing procedures were designed to eliminate noise, utilising wavelet transform denoising, baseline correction, and signal normalization. In the discretizing process, representative peaks were selected to significantly decrease the Raman data size. More importantly, the selected peaks are suitable to serve as key biological markers to differentiate species and other cellular features. Additionally, the classification performance of discretized spectra was found to be comparable to that of full spectra having more than 1000 Raman shifts. Overall, the discretized spectrum needs about 5% of the storage space of a full spectrum and the processing speed is considerably faster. This makes rDisc clearly superior to other methods for single-cell classification.
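
    A minimal sketch of the condensation idea described above: smooth and normalise a synthetic spectrum, subtract a crude baseline, and keep only the most prominent peaks as the discretized representation. The signal and parameters are hypothetical, and the smoothing here is a simple Savitzky-Golay filter rather than the wavelet denoising used by rDisc.

      import numpy as np
      from scipy.signal import savgol_filter, find_peaks

      rng = np.random.default_rng(5)
      shifts = np.linspace(400, 1800, 1400)                       # Raman shift axis (cm^-1)
      spectrum = 0.001 * shifts                                    # sloping baseline
      for center, height in [(620, 1.0), (1005, 2.5), (1450, 1.4)]:
          spectrum += height * np.exp(-0.5 * ((shifts - center) / 6) ** 2)
      spectrum += rng.normal(0, 0.05, shifts.size)                 # noise

      smoothed = savgol_filter(spectrum, window_length=21, polyorder=3)
      baseline = np.percentile(smoothed, 10)                       # crude baseline estimate
      corrected = np.clip(smoothed - baseline, 0, None)
      normalised = corrected / corrected.max()

      peak_idx, _ = find_peaks(normalised, prominence=0.1)
      condensed = list(zip(shifts[peak_idx].round(1), normalised[peak_idx].round(3)))
      print("representative peaks (shift, intensity):", condensed)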

  1. Residential Building Energy Code Field Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. Bartlett, M. Halverson, V. Mendon, J. Hathaway, Y. Xie

    This document presents a methodology for assessing baseline energy efficiency in new single-family residential buildings and quantifying related savings potential. The approach was developed by Pacific Northwest National Laboratory (PNNL) for the U.S. Department of Energy (DOE) Building Energy Codes Program with the objective of assisting states as they assess energy efficiency in residential buildings and implementation of their building energy codes, as well as to target areas for improvement through energy codes and broader energy-efficiency programs. It is also intended to facilitate a consistent and replicable approach to research studies of this type and establish a transparent data set to represent baseline construction practices across U.S. states.

  2. Rapidly evolving homing CRISPR barcodes

    PubMed Central

    Kalhor, Reza; Mali, Prashant; Church, George M.

    2017-01-01

    We present here an approach for engineering evolving DNA barcodes in living cells. The methodology entails using a homing guide RNA (hgRNA) scaffold that directs the Cas9-hgRNA complex to target the DNA locus of the hgRNA itself. We show that this homing CRISPR-Cas9 system acts as an expressed genetic barcode that diversifies its sequence and that the rate of diversification can be controlled in cultured cells. We further evaluate these barcodes in cell populations and show the barcode RNAs can be assayed as single molecules in situ. This integrated approach will have wide-ranging applications, such as in deep lineage tracing, cellular barcoding, molecular recording, dissecting cancer biology, and connectome mapping. PMID:27918539

  3. A micromechanics-based strength prediction methodology for notched metal matrix composites

    NASA Technical Reports Server (NTRS)

    Bigelow, C. A.

    1992-01-01

    An analytical micromechanics based strength prediction methodology was developed to predict failure of notched metal matrix composites. The stress-strain behavior and notched strength of two metal matrix composites, boron/aluminum (B/Al) and silicon-carbide/titanium (SCS-6/Ti-15-3), were predicted. The prediction methodology combines analytical techniques ranging from a three dimensional finite element analysis of a notched specimen to a micromechanical model of a single fiber. In the B/Al laminates, a fiber failure criterion based on the axial and shear stress in the fiber accurately predicted laminate failure for a variety of layups and notch-length to specimen-width ratios with both circular holes and sharp notches when matrix plasticity was included in the analysis. For the SCS-6/Ti-15-3 laminates, a fiber failure criterion based on the axial stress in the fiber correlated well with experimental results for static and post fatigue residual strengths when fiber matrix debonding and matrix cracking were included in the analysis. The micromechanics based strength prediction methodology offers a direct approach to strength prediction by modeling behavior and damage on a constituent level, thus explicitly including matrix nonlinearity, fiber matrix debonding, and matrix cracking.

  4. A micromechanics-based strength prediction methodology for notched metal-matrix composites

    NASA Technical Reports Server (NTRS)

    Bigelow, C. A.

    1993-01-01

    An analytical micromechanics-based strength prediction methodology was developed to predict failure of notched metal matrix composites. The stress-strain behavior and notched strength of two metal matrix composites, boron/aluminum (B/Al) and silicon-carbide/titanium (SCS-6/Ti-15-3), were predicted. The prediction methodology combines analytical techniques ranging from a three-dimensional finite element analysis of a notched specimen to a micromechanical model of a single fiber. In the B/Al laminates, a fiber failure criterion based on the axial and shear stress in the fiber accurately predicted laminate failure for a variety of layups and notch-length to specimen-width ratios with both circular holes and sharp notches when matrix plasticity was included in the analysis. For the SCS-6/Ti-15-3 laminates, a fiber failure criterion based on the axial stress in the fiber correlated well with experimental results for static and postfatigue residual strengths when fiber matrix debonding and matrix cracking were included in the analysis. The micromechanics-based strength prediction methodology offers a direct approach to strength prediction by modeling behavior and damage on a constituent level, thus explicitly including matrix nonlinearity, fiber matrix debonding, and matrix cracking.

  5. Mechanical modulation method for ultrasensitive phase measurements in photonics biosensing.

    PubMed

    Patskovsky, S; Maisonneuve, M; Meunier, M; Kabashin, A V

    2008-12-22

    A novel polarimetry methodology for phase-sensitive measurements in single reflection geometry is proposed for applications in optical transduction-based biological sensing. The methodology uses altering step-like chopper-based mechanical phase modulation for orthogonal s- and p- polarizations of light reflected from the sensing interface and the extraction of phase information at different harmonics of the modulation. We show that even under a relatively simple experimental arrangement, the methodology provides a resolution of phase measurements as low as 0.007 deg. We also examine the proposed approach using Total Internal Reflection (TIR) and Surface Plasmon Resonance (SPR) geometries. For TIR geometry, the response appears to be strongly dependent on the prism material, with the best values for high refractive index Si. The detection limit for Si-based TIR is estimated as 10^-5 in terms of Refractive Index Unit (RIU) change. SPR geometry offers a much stronger phase response due to much sharper phase characteristics. With a detection limit of 3.2×10^-7 RIU, the proposed methodology provides one of the best sensitivities for phase-sensitive SPR devices. Advantages of the proposed method include high sensitivity, simplicity of the experimental setup and noise immunity as a result of the high-stability modulation.
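
    A generic lock-in style sketch (not the authors' polarimetry setup) of the kind of harmonic analysis the methodology above relies on: the amplitudes of a signal at the first and second harmonics of a known modulation frequency are recovered by correlating against reference sinusoids. The sample rate, modulation frequency and signal are assumed for illustration.

      import numpy as np

      fs, f_mod, duration = 50_000.0, 370.0, 1.0          # sample rate, chopper frequency (assumed)
      t = np.arange(0, duration, 1 / fs)
      signal = (0.8 * np.sin(2 * np.pi * f_mod * t + 0.3)
                + 0.2 * np.sin(2 * np.pi * 2 * f_mod * t + 1.1)
                + np.random.default_rng(6).normal(0, 0.05, t.size))

      def harmonic_amplitude(x, f):
          # in-phase and quadrature averages; the factor 2 recovers the sinusoid amplitude
          ref_c = np.cos(2 * np.pi * f * t)
          ref_s = np.sin(2 * np.pi * f * t)
          return 2 * np.hypot(np.mean(x * ref_c), np.mean(x * ref_s))

      print("1f amplitude:", harmonic_amplitude(signal, f_mod))      # ~0.8
      print("2f amplitude:", harmonic_amplitude(signal, 2 * f_mod))  # ~0.2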

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edjabou, Maklawe Essonanawe, E-mail: vine@env.dtu.dk; Jensen, Morten Bang; Götze, Ramona

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level tiered approach, facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single-family and multi-family house areas), the individual percentage composition of food waste, paper, and glass was significantly different between the housing types. This indicates that housing type is a critical stratification parameter. Separating food leftovers from food packaging during manual sorting of the sampled waste did not have significant influence on the proportions of food waste and packaging materials, indicating that this step may not be required.

  7. Coupled variational formulations of linear elasticity and the DPG methodology

    NASA Astrophysics Data System (ADS)

    Fuentes, Federico; Keith, Brendan; Demkowicz, Leszek; Le Tallec, Patrick

    2017-11-01

    This article presents a general approach akin to domain-decomposition methods to solve a single linear PDE, but where each subdomain of a partitioned domain is associated with a distinct variational formulation coming from a mutually well-posed family of broken variational formulations of the original PDE. It can be exploited to solve challenging problems in a variety of physical scenarios where stability or a particular mode of convergence is desired in a part of the domain. The linear elasticity equations are solved in this work, but the approach can be applied to other equations as well. The broken variational formulations, which are essentially extensions of more standard formulations, are characterized by the presence of mesh-dependent broken test spaces and interface trial variables at the boundaries of the elements of the mesh. This allows necessary information to be naturally transmitted between adjacent subdomains, resulting in coupled variational formulations which are then proved to be globally well-posed. They are solved numerically using the DPG methodology, which is especially crafted to produce stable discretizations of broken formulations. Finally, expected convergence rates are verified in two different and illustrative examples.
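
    For orientation, the generic minimum-residual structure behind the DPG methodology can be sketched in standard notation (this is a hedged summary of the general DPG idea, not the specific elasticity formulations of the article): U is the trial space (including interface unknowns), V(T_h) the broken test space, and optimal test functions are computed element-wise through the Riesz map of V.

      % generic broken variational problem and its DPG (minimum-residual) discretization
      \begin{aligned}
        &\text{Find } u \in U \text{ such that } \quad b(u, v) = \ell(v) \qquad \forall\, v \in V(\mathcal{T}_h),\\
        &u_h = \operatorname*{arg\,min}_{w_h \in U_h} \tfrac{1}{2}\,\bigl\| \ell - B w_h \bigr\|_{V'}^{2},
          \qquad \langle B w, v \rangle := b(w, v),\\
        &\text{equivalently } \quad b(u_h, v_{w_h}) = \ell(v_{w_h}),
          \qquad v_{w_h} = R_V^{-1} B w_h \ \ \text{(optimal test functions)}, \ \ \forall\, w_h \in U_h.
      \end{aligned}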

  8. A unified approach for development of Urdu Corpus for OCR and demographic purpose

    NASA Astrophysics Data System (ADS)

    Choudhary, Prakash; Nain, Neeta; Ahmed, Mushtaq

    2015-02-01

    This paper presents a methodology for the development of an Urdu handwritten text image Corpus and the application of Corpus linguistics in the field of OCR and information retrieval from handwritten documents. Compared with other language scripts, the Urdu script is somewhat complicated for data entry: entering a single character can require a combination of several keystrokes. Here, a mixed approach is proposed and demonstrated for building an Urdu Corpus for OCR and demographic data collection. The demographic part of the database could be used to train a system to fetch data automatically, which would help to simplify the manual data-processing tasks involved in data collection from input forms such as Passport, Ration Card, Voting Card, AADHAR, Driving licence, Indian Railway Reservation and Census data. This would increase the participation of the Urdu language community in understanding and taking benefit of Government schemes. To make the database available and applicable across a broad area of corpus linguistics, we propose a methodology for data collection, mark-up, digital transcription, and XML metadata information for benchmarking.

  9. Modern Methods for Modeling Change in Obesity Research in Nursing.

    PubMed

    Sereika, Susan M; Zheng, Yaguang; Hu, Lu; Burke, Lora E

    2017-08-01

    Persons receiving treatment for weight loss often demonstrate heterogeneity in lifestyle behaviors and health outcomes over time. Traditional repeated measures approaches focus on the estimation and testing of an average temporal pattern, ignoring the interindividual variability about the trajectory. An alternate person-centered approach, group-based trajectory modeling, can be used to identify distinct latent classes of individuals following similar trajectories of behavior or outcome change as a function of age or time and can be expanded to include time-invariant and time-dependent covariates and outcomes. Another latent class method, growth mixture modeling, builds on group-based trajectory modeling to investigate heterogeneity within the distinct trajectory classes. In this applied methodologic study, group-based trajectory modeling for analyzing changes in behaviors or outcomes is described and contrasted with growth mixture modeling. An illustration of group-based trajectory modeling is provided using calorie intake data from a single-group, single-center prospective study for weight loss in adults who are either overweight or obese.

  10. Structural Design Methodology Based on Concepts of Uncertainty

    NASA Technical Reports Server (NTRS)

    Lin, K. Y.; Du, Jiaji; Rusk, David

    2000-01-01

    In this report, an approach to damage-tolerant aircraft structural design is proposed based on the concept of an equivalent "Level of Safety" that incorporates past service experience in the design of new structures. The discrete "Level of Safety" for a single inspection event is defined as the complement of the probability that a single flaw size larger than the critical flaw size for residual strength of the structure exists, and that the flaw will not be detected. The cumulative "Level of Safety" for the entire structure is the product of the discrete "Level of Safety" values for each flaw of each damage type present at each location in the structure. Based on the definition of "Level of Safety", a design procedure was identified and demonstrated on a composite sandwich panel for various damage types, with results showing the sensitivity of the structural sizing parameters to the relative safety of the design. The "Level of Safety" approach has broad potential application to damage-tolerant aircraft structural design with uncertainty.
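
    A minimal numeric sketch of the "Level of Safety" bookkeeping defined above: for each flaw, the discrete level of safety is one minus the probability that a flaw exceeding the critical size exists and goes undetected, and the cumulative level of safety is the product over all flaws and locations. The probabilities below are hypothetical placeholders, not values from the report.

      flaws = [
          # (P[flaw larger than critical size exists], P[inspection misses it])
          (1e-3, 0.10),   # skin location, damage type A
          (5e-4, 0.30),   # spar cap, damage type B
          (2e-3, 0.05),   # sandwich panel core, impact damage
      ]

      cumulative = 1.0
      for p_exist, p_miss in flaws:
          discrete_los = 1.0 - p_exist * p_miss   # discrete Level of Safety for one flaw/inspection
          cumulative *= discrete_los              # cumulative Level of Safety for the structure

      print(f"cumulative Level of Safety: {cumulative:.6f}")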

  11. Associations between attention, affect and cardiac activity in a single yoga session for female cancer survivors: an enactive neurophenomenology-based approach.

    PubMed

    Mackenzie, Michael J; Carlson, Linda E; Paskevich, David M; Ekkekakis, Panteleimon; Wurz, Amanda J; Wytsma, Kathryn; Krenz, Katie A; McAuley, Edward; Culos-Reed, S Nicole

    2014-07-01

    Yoga practice is reported to lead to improvements in quality of life, psychological functioning, and symptom indices in cancer survivors. Importantly, meditative states experienced within yoga practice are correlated to neurophysiological systems that moderate both focus of attention and affective valence. The current study used a mixed methods approach based in neurophenomenology to investigate associations between attention, affect, and cardiac activity during a single yoga session for female cancer survivors. Yoga practice was associated with a linear increase in associative attention and positive affective valence, while shifts in cardiac activity were related to the intensity of each yoga sequence. Changes in attention and affect were predicted by concurrently assessed cardiac activity. Awareness of breathing, physical movement, and increased relaxation were reported by participants as potential mechanisms for yoga's salutary effects. While yoga practice shares commonalities with exercise and relaxation training, yoga may serve primarily as a promising meditative attention-affect regulation training methodology. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Precise detection of de novo single nucleotide variants in human genomes.

    PubMed

    Gómez-Romero, Laura; Palacios-Flores, Kim; Reyes, José; García, Delfino; Boege, Margareta; Dávila, Guillermo; Flores, Margarita; Schatz, Michael C; Palacios, Rafael

    2018-05-22

    The precise determination of de novo genetic variants has enormous implications across different fields of biology and medicine, particularly personalized medicine. Currently, de novo variations are identified by mapping sample reads from a parent-offspring trio to a reference genome, allowing for a certain degree of differences. While widely used, this approach often introduces false-positive (FP) results due to misaligned reads and mischaracterized sequencing errors. In a previous study, we developed an alternative approach to accurately identify single nucleotide variants (SNVs) using only perfect matches. However, this approach could be applied only to haploid regions of the genome and was computationally intensive. In this study, we present a unique approach, coverage-based single nucleotide variant identification (COBASI), which allows the exploration of the entire genome using second-generation short sequence reads without extensive computing requirements. COBASI identifies SNVs using changes in coverage of exactly matching unique substrings, and is particularly suited for pinpointing de novo SNVs. Unlike other approaches that require population frequencies across hundreds of samples to filter out any methodological biases, COBASI can be applied to detect de novo SNVs within isolated families. We demonstrate this capability through extensive simulation studies and by studying a parent-offspring trio we sequenced using short reads. Experimental validation of all 58 candidate de novo SNVs and a selection of non-de novo SNVs found in the trio confirmed zero FP calls. COBASI is available as open source at https://github.com/Laura-Gomez/COBASI for any researcher to use. Copyright © 2018 the Author(s). Published by PNAS.
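
    A toy sketch of the coverage-of-unique-substrings idea behind COBASI (not the published implementation): count how often each reference k-mer occurs exactly in the reads; reference positions whose overlapping k-mers lose all exact-match support are candidate variant sites. The reference, reads and k below are tiny illustrative placeholders.

      from collections import Counter

      k = 5
      reference = "ACGTTGCAACGGTTAGCCATGACGT"
      # reads simulated from a sample carrying a single T->A substitution at position 12
      sample    = "ACGTTGCAACGGATAGCCATGACGT"
      reads = [sample[i:i + 12] for i in range(len(sample) - 12 + 1)]

      # exact-match coverage of every k-mer observed in the reads
      read_kmers = Counter(read[i:i + k] for read in reads for i in range(len(read) - k + 1))

      # reference k-mers with zero exact-match support flag candidate variant sites
      for pos in range(len(reference) - k + 1):
          ref_kmer = reference[pos:pos + k]
          if read_kmers[ref_kmer] == 0:
              print(f"no exact read support for reference k-mer at position {pos}: {ref_kmer}")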

  13. Early breast tumor and late SARS detections using space-variant multispectral infrared imaging at a single pixel

    NASA Astrophysics Data System (ADS)

    Szu, Harold H.; Buss, James R.; Kopriva, Ivica

    2004-04-01

We propose a physics approach to solving a physical inverse problem, namely choosing the unique equilibrium solution at the minimum of the free energy H = E - T0S, which includes the Wiener solution (least-mean-square E) and ICA (maximum S) as special cases. "Unsupervised classification" presumes that the required information must be learned and derived directly and solely from the data, consistent with the classical Duda-Hart ATR definition of "unlabelled data". Such a truly unsupervised methodology is presented for space-variant image processing at a single pixel in real-world cases of remote sensing, early tumor detection, and SARS detection. The indeterminacy among the multiple solutions of the inverse problem is resolved by selecting the absolute minimum of the isothermal free energy as the ground truth of the local equilibrium condition at the single-pixel footprint.

  14. An Agile Course-Delivery Approach

    ERIC Educational Resources Information Center

    Capellan, Mirkeya

    2009-01-01

In the world of software development, agile methodologies have gained popularity thanks to their lightweight processes and flexible approach. Many advocates believe that agile methodologies can provide significant benefits if applied in the educational environment as a teaching method. The need for an approach that engages and motivates…

  15. Spoilt for choice: A critical review on the chemical and biological assessment of current wastewater treatment technologies.

    PubMed

    Prasse, Carsten; Stalter, Daniel; Schulte-Oehlmann, Ulrike; Oehlmann, Jörg; Ternes, Thomas A

    2015-12-15

    The knowledge we have gained in recent years on the presence and effects of compounds discharged by wastewater treatment plants (WWTPs) brings us to a point where we must question the appropriateness of current water quality evaluation methodologies. An increasing number of anthropogenic chemicals is detected in treated wastewater and there is increasing evidence of adverse environmental effects related to WWTP discharges. It has thus become clear that new strategies are needed to assess overall quality of conventional and advanced treated wastewaters. There is an urgent need for multidisciplinary approaches combining expertise from engineering, analytical and environmental chemistry, (eco)toxicology, and microbiology. This review summarizes the current approaches used to assess treated wastewater quality from the chemical and ecotoxicological perspective. Discussed chemical approaches include target, non-target and suspect analysis, sum parameters, identification and monitoring of transformation products, computational modeling as well as effect directed analysis and toxicity identification evaluation. The discussed ecotoxicological methodologies encompass in vitro testing (cytotoxicity, genotoxicity, mutagenicity, endocrine disruption, adaptive stress response activation, toxicogenomics) and in vivo tests (single and multi species, biomonitoring). We critically discuss the benefits and limitations of the different methodologies reviewed. Additionally, we provide an overview of the current state of research regarding the chemical and ecotoxicological evaluation of conventional as well as the most widely used advanced wastewater treatment technologies, i.e., ozonation, advanced oxidation processes, chlorination, activated carbon, and membrane filtration. In particular, possible directions for future research activities in this area are provided. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Dynamics of Coupled Electron-Boson Systems with the Multiple Davydov D1 Ansatz and the Generalized Coherent State.

    PubMed

    Chen, Lipeng; Borrelli, Raffaele; Zhao, Yang

    2017-11-22

The dynamics of a coupled electron-boson system is investigated by employing a multitude of the Davydov D1 trial states, also known as the multi-D1 Ansatz, and a second trial state based on a superposition of time-dependent generalized coherent states (GCS Ansatz). The two Ansätze are applied to study population dynamics in the spin-boson model and the Holstein molecular crystal model, and a detailed comparison with numerically exact results obtained by the (multilayer) multiconfiguration time-dependent Hartree method and the hierarchy equations of motion approach is drawn. It is found that the two methodologies proposed here offer a significant improvement over the single D1 Ansatz, yielding quantitatively accurate results even in the critical cases of large energy biases and large transfer integrals. The two methodologies provide new effective tools for accurate, efficient simulation of many-body quantum dynamics thanks to the relatively small number of parameters which characterize the electron-nuclear wave functions. The wave-function-based approaches are capable of tracking explicitly detailed bosonic dynamics, which is absent by construction in approaches based on the reduced density matrix. The efficiency and flexibility of our methods are also advantages as compared with numerically exact approaches such as QUAPI and HEOM, especially at low temperatures and in the strong coupling regime.

  17. A participatory approach to the study of lifting demands and musculoskeletal symptoms among Hong Kong workers

    PubMed Central

    Yeung, S; Genaidy, A; Deddens, J; Shoaf, C; Leung, P

    2003-01-01

Aims: To investigate the use of a worker-based methodology to assess the physical stresses of lifting tasks on effort expended, and to associate this loading with musculoskeletal outcomes (MO). Methods: A cross sectional study was conducted on 217 male manual handling workers from the Hong Kong area. The effects of four lifting variables (weight of load, horizontal distance, twisting angle, and vertical travel distance) on effort were examined using a linguistic approach (that is, characterising variables in descriptors such as "heavy" for weight of load). The numerical interpretations of linguistic descriptors were established. In addition, the associations between on-the-job effort and MO were investigated for 10 body regions including the spine, and both upper and lower extremities. Results: MO were prevalent in multiple body regions (range 12–58%); effort was significantly associated with MO in 8 of 10 body regions (age-adjusted odds ratios ranged from 1.31 for the low back to 1.71 for the elbows and forearms). The lifting task variables had significant effects on effort, with the weight of load having twice the effect of other variables; each linguistic descriptor was better described by a range of numerical values rather than a single numerical value. Conclusions: The participatory worker-based approach to studying musculoskeletal outcomes is a promising methodology. Further testing of this approach is recommended. PMID:14504360

  18. Dynamic task allocation for a man-machine symbiotic system

    NASA Technical Reports Server (NTRS)

    Parker, L. E.; Pin, F. G.

    1987-01-01

This report presents a methodological approach to the dynamic allocation of tasks in a man-machine symbiotic system in the context of dexterous manipulation and teleoperation. This report addresses a symbiotic system containing two symbiotic partners which work toward controlling a single manipulator arm for the execution of a series of sequential manipulation tasks. It is proposed that an automated task allocator use knowledge about the constraints/criteria of the problem, the available resources, the tasks to be performed, and the environment to dynamically allocate task recommendations for the man and the machine. The presentation of the methodology includes discussions concerning the interaction of the knowledge areas, the flow of control, the necessary communication links, and the replanning of the task allocation. Examples of task allocation are presented to illustrate the results of this methodology.

  19. Multiple reaction monitoring (MRM) of plasma proteins in cardiovascular proteomics.

    PubMed

    Dardé, Verónica M; Barderas, Maria G; Vivanco, Fernando

    2013-01-01

Different methodologies have been used over the years to discover new potential biomarkers related to cardiovascular risk. The conventional proteomic strategy involves a discovery phase that requires the use of mass spectrometry (MS) and a validation phase, usually on an alternative platform such as immunoassays that can be further implemented in clinical practice. This approach is suitable for a single biomarker, but when large panels of biomarkers must be validated, the process becomes inefficient and costly. Therefore, it is essential to find an alternative methodology to perform biomarker discovery, validation, and quantification. The capabilities of quantitative MS make it an extremely attractive alternative to antibody-based technologies. Although it has traditionally been used for quantification of small molecules in clinical chemistry, MRM is now emerging as an alternative to traditional immunoassays for candidate protein biomarker validation.

  20. Simple methodologies to estimate the energy amount stored in a tree due to an explosive seed dispersal mechanism

    NASA Astrophysics Data System (ADS)

    do Carmo, Eduardo; Goncalves Hönnicke, Marcelo

    2018-05-01

There are different ways to introduce and illustrate energy concepts to introductory physics students. The explosive seed dispersal mechanism found in a variety of trees could be one of them. Sibipiruna trees bear fruits (pods) that exhibit such an explosive mechanism. During the explosion, the pods throw seeds several meters away. In this manuscript we show simple methodologies to estimate the amount of energy stored in a Sibipiruna tree due to such a process. Two different physics approaches were used to carry out this study: monitoring the explosive seed dispersal mechanism both indoors and in situ, and measuring the elastic constant of the pod shell. An energy of the order of kJ was found to be stored in a single tree due to this explosive mechanism.
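
    A minimal back-of-the-envelope sketch of the second approach (the elastic constant of the pod shell), treating each pod valve as a linear spring so that the stored energy is E = kx^2/2; the elastic constant, deflection and pod count below are placeholder values, not measurements from the paper.

        def pod_elastic_energy(k, x):
            """Elastic energy of one pod shell modelled as a linear spring:
            E = 0.5 * k * x**2, with k in N/m and x in m."""
            return 0.5 * k * x ** 2

        k = 500.0          # assumed effective elastic constant of a pod valve, N/m
        x = 0.05           # assumed deflection at the moment the pod snaps open, m
        pods_per_tree = 2000   # assumed number of pods on one tree

        energy_per_pod = pod_elastic_energy(k, x)        # 0.625 J with these numbers
        energy_per_tree = pods_per_tree * energy_per_pod
        print(f"{energy_per_pod:.3f} J per pod, {energy_per_tree:.0f} J per tree")

    With placeholder values of this order, the total comes out around a kilojoule per tree, consistent with the order of magnitude reported in the abstract.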

  1. Correcting sample drift using Fourier harmonics.

    PubMed

    Bárcena-González, G; Guerrero-Lebrero, M P; Guerrero, E; Reyes, D F; Braza, V; Yañez, A; Nuñez-Moraleda, B; González, D; Galindo, P L

    2018-07-01

During image acquisition of crystalline materials by high-resolution scanning transmission electron microscopy, sample drift can lead to distortions and shears that hinder quantitative analysis and characterization. In order to measure and correct this effect, several authors have proposed methodologies that make use of series of images. In this work, we introduce a methodology to determine the drift angle via Fourier analysis of a single image, based on measurements of the angles of the second Fourier harmonics in different quadrants. Two different approaches that are independent of the acquisition angle of the image are evaluated. In addition, our results demonstrate that the determination of the drift angle is more accurate when using measurements from non-consecutive quadrants and the acquisition angle is an odd multiple of 45°. Copyright © 2018 Elsevier Ltd. All rights reserved.
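
    The following sketch illustrates the kind of single-image Fourier measurement described above: it locates the strongest non-DC peak in each quadrant of the centred power spectrum of a lattice image and returns its polar angle, so that angles from different (for example, non-consecutive) quadrants can be compared with those expected for the undistorted lattice to estimate the drift-induced shear. It is a simplification of the published method (it does not isolate the second harmonic specifically), and all names are illustrative.

        import numpy as np

        def quadrant_peak_angles(image):
            """Polar angle (degrees) of the strongest non-DC Fourier peak in each
            quadrant of the centred 2D spectrum of a lattice image."""
            spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image)))
            cy, cx = spectrum.shape[0] // 2, spectrum.shape[1] // 2
            spectrum[cy - 2:cy + 3, cx - 2:cx + 3] = 0.0   # suppress the DC region
            yy, xx = np.indices(spectrum.shape)
            u, v = xx - cx, yy - cy
            quadrants = {1: (u > 0) & (v > 0), 2: (u < 0) & (v > 0),
                         3: (u < 0) & (v < 0), 4: (u > 0) & (v < 0)}
            angles = {}
            for q, mask in quadrants.items():
                peak = np.unravel_index(np.argmax(np.where(mask, spectrum, 0.0)),
                                        spectrum.shape)
                angles[q] = float(np.degrees(np.arctan2(v[peak], u[peak])))
            return angles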

  2. Systematic Review of the Effect of Pictorial Warnings on Cigarette Packages in Smoking Behavior

    PubMed Central

    Liu, Bojing; Greiner, Felix; Bremberg, Sven; Galanti, Rosaria

    2014-01-01

    We used a structured approach to assess whether active smokers presented with pictorial warnings on cigarette packages (PWCP) had a higher probability of quitting, reducing, and attempting to quit smoking than did unexposed smokers. We identified 21 articles from among nearly 2500 published between 1993 and 2013, prioritizing coverage over relevance or quality because we expected to find only a few studies with behavioral outcomes. We found very large heterogeneity across studies, poor or very poor methodological quality, and generally null or conflicting findings for any explored outcome. The evidence for or against the use of PWCP is insufficient, suggesting that any effect of PWCP on behavior would be modest. Determining the single impact of PWCP on behavior requires studies with strong methodological designs and longer follow-up periods. PMID:25122019

  3. Single-Vector Calibration of Wind-Tunnel Force Balances

    NASA Technical Reports Server (NTRS)

    Parker, P. A.; DeLoach, R.

    2003-01-01

An improved method of calibrating a wind-tunnel force balance involves the use of a unique load application system integrated with formal experimental design methodology. The Single-Vector Force Balance Calibration System (SVS) overcomes the productivity and accuracy limitations of prior calibration methods. A force balance is a complex structural spring element instrumented with strain gauges for measuring three orthogonal components of aerodynamic force (normal, axial, and side force) and three orthogonal components of aerodynamic torque (rolling, pitching, and yawing moments). Force balances remain the state-of-the-art instruments that provide these measurements on a scale model of an aircraft during wind tunnel testing. Ideally, each electrical channel of the balance would respond only to its respective component of load, and it would have no response to other components of load. This is not entirely possible even though balance designs are optimized to minimize these undesirable interaction effects. Ultimately, a calibration experiment is performed to obtain the necessary data to generate a mathematical model and determine the force measurement accuracy. In order to set the independent variables of applied load for the calibration experiment, a high-precision mechanical system is required. Manual deadweight systems have been in use at Langley Research Center (LaRC) since the 1940s. These simple methodologies produce high-confidence results, but the process is mechanically complex and labor-intensive, requiring three to four weeks to complete. Over the past decade, automated balance calibration systems have been developed. In general, these systems were designed to automate the tedious manual calibration process, resulting in even more complex systems that degrade load application quality. The current calibration approach relies on a one-factor-at-a-time (OFAT) methodology, where each independent variable is incremented individually throughout its full-scale range, while all other variables are held at a constant magnitude. This OFAT approach has been widely accepted because of its inherent simplicity and intuitive appeal to the balance engineer. LaRC has been conducting research in a "modern design of experiments" (MDOE) approach to force balance calibration. Formal experimental design techniques provide an integrated view of the entire calibration process covering all three major aspects of an experiment: the design of the experiment, the execution of the experiment, and the statistical analyses of the data. In order to overcome the weaknesses in the available mechanical systems and to apply formal experimental techniques, a new mechanical system was required. The SVS enables the complete calibration of a six-component force balance with a series of single force vectors.
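
    A toy sketch of the contrast drawn above between an OFAT schedule and a formally designed (MDOE-style) schedule, here for just two balance components at five normalized load levels; the component names and levels are illustrative only.

        import itertools
        import random

        levels = [-1.0, -0.5, 0.0, 0.5, 1.0]        # normalized load levels
        components = ["normal_force", "axial_force"]

        # OFAT: sweep one component through its range while the other is held at zero.
        ofat = [{"normal_force": lv, "axial_force": 0.0} for lv in levels] + \
               [{"normal_force": 0.0, "axial_force": lv} for lv in levels]

        # Designed experiment: a full factorial of combined loadings, executed in
        # random order so that systematic drift is not confounded with the settings.
        factorial = [dict(zip(components, combo))
                     for combo in itertools.product(levels, repeat=len(components))]
        random.shuffle(factorial)

        print(len(ofat), "OFAT points;", len(factorial), "factorial points")

    The combined loadings in the designed schedule are what allow the interaction terms of the balance's calibration model to be estimated directly, which the OFAT schedule cannot do.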

  4. Systematic reviews, overviews of reviews and comparative effectiveness reviews: a discussion of approaches to knowledge synthesis.

    PubMed

    Hartling, Lisa; Vandermeer, Ben; Fernandes, Ricardo M

    2014-06-01

    The Cochrane Collaboration has been at the forefront of developing methods for knowledge synthesis internationally. We discuss three approaches to synthesize evidence for healthcare interventions: systematic reviews (SRs), overviews of reviews and comparative effectiveness reviews. We illustrate these approaches with examples from knowledge syntheses on interventions for bronchiolitis, a common acute paediatric condition. Some of the differences among these approaches are subtle and methods are not necessarily mutually exclusive to a single review type. Systematic reviews bring together evidence from multiple studies in a rigorous fashion for a single intervention or group of interventions. Systematic reviews, as they have developed within healthcare, often focus on single or select interventions and direct pairwise comparisons; therefore, end-users may need to read several individual SRs to inform decision making. Overviews of reviews compile information from multiple SRs relevant to a single health problem. Overviews provide the end-user with a quick overview of the available evidence; however, overviews are dependent on the methods and decisions employed at the SR level. Furthermore, overviews do not often integrate evidence from different SRs quantitatively. Comparative effectiveness reviews, as we define them here, synthesize relevant evidence from individual studies to describe the relative benefits (or harms) of a range of interventions. Comparative effectiveness reviews may use statistical methods (network meta-analysis) to incorporate direct and indirect evidence; therefore, they can provide stronger inferences about the relative effectiveness (or safety) of interventions. While potentially more expensive and time-consuming to produce, a comparative effectiveness review provides a synthesis of a range of interventions for a given condition and the relative efficacy across interventions using consistent and standardized methodology. Copyright © 2014 The Cochrane Collaboration. Published by John Wiley & Sons, Ltd.

  5. The role of physical exercise in cognitive recovery after traumatic brain injury: A systematic review.

    PubMed

    Morris, Timothy; Gomes Osman, Joyce; Tormos Muñoz, Jose Maria; Costa Miserachs, David; Pascual Leone, Alvaro

    2016-11-22

There is a growing body of evidence revealing exercise-induced effects on brain structure and cognitive function across the lifespan. Animal models of traumatic brain injury also suggest exercise is capable of modulating not only the pathophysiological changes following trauma but also the associated cognitive deficits. To evaluate the effect of physical exercise on cognitive impairment following traumatic brain injury in humans. A systematic search of the PubMed database was performed using the search terms "cognition" and "executive function, memory or attention", "traumatic brain injury" and "physical exercise". Adult human traumatic brain injury studies that assessed cognitive function as an outcome measure (primary or secondary) and used physical exercise as a treatment (single or combined) were assessed by two independent reviewers. Data were extracted under the guidance of the population-intervention-comparison-outcome framework, wherein characteristics of included studies (exercise duration, intensity, combined or single intervention, control groups and cognitive measures) were collected, after which methodological quality (Cochrane criteria) was assessed. A total of 240 citations were identified, but only 6 met our inclusion criteria (3 from search records, 3 from reference lists). Only a small number of studies have evaluated the effect of exercise on cognition following traumatic brain injury in humans, and of those, assessment of efficacy is difficult due to low methodological strength and a high risk of different types of bias. Evidence of an effect of physical exercise on cognitive recovery suggests further studies should explore this treatment option with more rigorous methodological approaches. Recommendations to reduce risk of bias and methodological shortfalls are discussed and include stricter inclusion criteria to create homogenous groups and larger patient pools, more rigorous cognitive assessments and the study and reporting of additional and combined rehabilitation techniques.

  6. Approaches to developing alternative and predictive toxicology based on PBPK/PD and QSAR modeling.

    PubMed Central

    Yang, R S; Thomas, R S; Gustafson, D L; Campain, J; Benjamin, S A; Verhaar, H J; Mumtaz, M M

    1998-01-01

Systematic toxicity testing, using conventional toxicology methodologies, of single chemicals and chemical mixtures is highly impractical because of the immense numbers of chemicals and chemical mixtures involved and the limited scientific resources. Therefore, the development of unconventional, efficient, and predictive toxicology methods is imperative. Using carcinogenicity as an end point, we present approaches for developing predictive tools for toxicologic evaluation of chemicals and chemical mixtures relevant to environmental contamination. Central to the approaches presented is the integration of physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) and quantitative structure-activity relationship (QSAR) modeling with focused mechanistically based experimental toxicology. In this development, molecular and cellular biomarkers critical to the carcinogenesis process are evaluated quantitatively between different chemicals and/or chemical mixtures. Examples presented include the integration of PBPK/PD and QSAR modeling with a time-course medium-term liver foci assay, molecular biology and cell proliferation studies, Fourier transform infrared spectroscopic analyses of DNA changes, and cancer modeling to assess and attempt to predict the carcinogenicity of the series of 12 chlorobenzene isomers. Also presented is an ongoing effort to develop and apply a similar approach to chemical mixtures using in vitro cell culture (Syrian hamster embryo cell transformation assay and human keratinocytes) methodologies and in vivo studies. The promise and pitfalls of these developments are elaborated. When successfully applied, these approaches may greatly reduce animal usage, personnel, resources, and time required to evaluate the carcinogenicity of chemicals and chemical mixtures. PMID:9860897

  7. Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards

    ERIC Educational Resources Information Center

    Smith, Justin D.

    2012-01-01

    This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have…

  8. Whole genome sequencing options for bacterial strain typing and epidemiologic analysis based on single nucleotide polymorphism versus gene-by-gene-based approaches.

    PubMed

    Schürch, A C; Arredondo-Alonso, S; Willems, R J L; Goering, R V

    2018-04-01

Whole genome sequence (WGS)-based strain typing finds increasing use in the epidemiologic analysis of bacterial pathogens in both public health and more localized infection control settings. This minireview describes methodologic approaches that have been explored for WGS-based epidemiologic analysis and considers the challenges and pitfalls of data interpretation. Personal collection of relevant publications. When applying WGS to study the molecular epidemiology of bacterial pathogens, genomic variability between strains is translated into measures of distance by determining single nucleotide polymorphisms in core genome alignments or by indexing allelic variation in hundreds to thousands of core genes, assigning types to unique allelic profiles. Interpreting isolate relatedness from these distances is highly organism specific, and attempts to establish species-specific cutoffs are unlikely to be generally applicable. In cases where single nucleotide polymorphism or core gene typing do not provide the resolution necessary for accurate assessment of the epidemiology of bacterial pathogens, inclusion of accessory gene or plasmid sequences may provide the additional required discrimination. As with all epidemiologic analysis, realizing the full potential of the revolutionary advances in WGS-based approaches requires understanding and dealing with issues related to the fundamental steps of data generation and interpretation. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
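
    A minimal sketch of the gene-by-gene idea mentioned above: each isolate is reduced to an allelic profile (locus -> allele number), and the distance between two isolates is the number of shared loci at which the alleles differ, ignoring loci that are missing in either profile. The locus names and profiles are hypothetical.

        def allelic_distance(profile_a, profile_b):
            """Number of shared core-gene loci at which two isolates carry different
            alleles; loci missing in either profile (None) are ignored."""
            shared = [locus for locus in profile_a
                      if locus in profile_b
                      and profile_a[locus] is not None
                      and profile_b[locus] is not None]
            return sum(profile_a[locus] != profile_b[locus] for locus in shared)

        # Hypothetical allele profiles (locus -> allele number):
        isolate_1 = {"gene_0001": 12, "gene_0002": 3, "gene_0003": 7, "gene_0004": None}
        isolate_2 = {"gene_0001": 12, "gene_0002": 5, "gene_0003": 7, "gene_0004": 2}
        print(allelic_distance(isolate_1, isolate_2))   # -> 1

    Single nucleotide polymorphism-based typing would instead count differing positions in a core-genome alignment; in both cases the resulting distance still has to be interpreted against organism-specific expectations, as the review stresses.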

  9. UV Decontamination of MDA Reagents for Single Cell Genomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Janey; Tighe, Damon; Sczyrba, Alexander

    2011-03-18

Single cell genomics, the amplification and sequencing of genomes from single cells, can provide a glimpse into the genetic make-up and thus lifestyle of the vast majority of uncultured microbial cells, making it an immensely powerful and increasingly popular tool. This is accomplished by use of multiple displacement amplification (MDA), which can generate billions of copies of a single bacterial genome, producing the microgram-range DNA required for shotgun sequencing. Here, we address a key challenge inherent to this approach and propose a solution for the improved recovery of single cell genomes. While DNA-free reagents for the amplification of a single cell genome are a prerequisite for successful single cell sequencing and analysis, DNA contamination has been detected in various reagents, which poses a considerable challenge. Our study demonstrates the effect of UV irradiation in efficient elimination of exogenous contaminant DNA found in MDA reagents, while maintaining Phi29 activity. Consequently, we also find that increased UV exposure to Phi29 does not adversely affect genome coverage of MDA-amplified single cells. While additional challenges in single cell genomics remain to be resolved, the proposed methodology is relatively quick and simple and we believe that its application will be of high value for future single cell sequencing projects.

  10. Strategic Decision-Making Learning from Label Distributions: An Approach for Facial Age Estimation.

    PubMed

    Zhao, Wei; Wang, Han

    2016-06-28

Nowadays, label distribution learning is among the state-of-the-art methodologies in facial age estimation. It takes the age of each facial image instance as a label distribution with a series of age labels rather than the single chronological age label that is commonly used. However, this methodology is deficient in its simple decision-making criterion: the final predicted age is selected only as the label with the maximum description degree. In many cases, different age labels may have very similar description degrees. Consequently, blindly deciding the estimated age by virtue of the highest description degree would miss or neglect other valuable age labels that may contribute a lot to the final predicted age. In this paper, we propose a strategic decision-making label distribution learning algorithm (SDM-LDL) with a series of strategies specialized for different types of age label distribution. Experimental results from the most popular aging face database, FG-NET, show the superiority and validity of all the proposed strategic decision-making learning algorithms over the existing label distribution learning and other single-label learning algorithms for facial age estimation. The inner properties of SDM-LDL are further explored, along with its additional advantages.
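
    A minimal numeric sketch of the limitation discussed above: with a label distribution, taking only the argmax discards nearby labels with nearly identical description degrees, whereas a decision that uses the whole distribution (here simply the expected age) lets them contribute. The distribution below is synthetic and the decision rule shown is generic, not one of the specific SDM-LDL strategies.

        import numpy as np

        ages = np.arange(0, 70)                                  # candidate age labels
        degrees = np.exp(-0.5 * ((ages - 31.5) / 4.0) ** 2)      # synthetic description degrees
        degrees /= degrees.sum()

        argmax_age = int(ages[np.argmax(degrees)])    # maximum-description-degree decision
        expected_age = float(np.dot(ages, degrees))   # decision using the whole distribution
        print(argmax_age, round(expected_age, 1))     # e.g. 31 vs 31.5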

  11. Strategic Decision-Making Learning from Label Distributions: An Approach for Facial Age Estimation

    PubMed Central

    Zhao, Wei; Wang, Han

    2016-01-01

Nowadays, label distribution learning is among the state-of-the-art methodologies in facial age estimation. It takes the age of each facial image instance as a label distribution with a series of age labels rather than the single chronological age label that is commonly used. However, this methodology is deficient in its simple decision-making criterion: the final predicted age is selected only as the label with the maximum description degree. In many cases, different age labels may have very similar description degrees. Consequently, blindly deciding the estimated age by virtue of the highest description degree would miss or neglect other valuable age labels that may contribute a lot to the final predicted age. In this paper, we propose a strategic decision-making label distribution learning algorithm (SDM-LDL) with a series of strategies specialized for different types of age label distribution. Experimental results from the most popular aging face database, FG-NET, show the superiority and validity of all the proposed strategic decision-making learning algorithms over the existing label distribution learning and other single-label learning algorithms for facial age estimation. The inner properties of SDM-LDL are further explored, along with its additional advantages. PMID:27367691

  12. Nanoengineered Plasmonic Hybrid Systems for Bio-nanotechnology

    NASA Astrophysics Data System (ADS)

    Leong, Kirsty

    Plasmonic hybrid systems are fabricated using a combination of lithography and layer-by-layer directed self-assembly approaches to serve as highly sensitive nanosensing devices. This layer-by-layer directed self-assembly approach is utilized as a hybrid methodology to control the organization of quantum dots (QDs), nanoparticles, and biomolecules onto inorganic nanostructures with site-specific attachment and functionality. Here, surface plasmon-enhanced nanoarrays are fabricated where the photoluminescence of quantum dots and conjugated polymer nanoarrays are studied. This study was performed by tuning the localized surface plasmon resonance and the distance between the emitter and the metal surface using genetically engineered polypeptides as binding agents and biotin-streptavidin binding as linker molecules. In addition, these nanoarrays were also chemically modified to support the immobilization and label-free detection of DNA using surface enhanced Raman scattering. The surface of the nanoarrays was chemically modified using an acridine containing molecule which can act as an intercalating agent for DNA. The self-assembled monolayer (SAM) showed the ability to immobilize and intercalate DNA onto the surface. This SAM system using surface enhanced Raman scattering (SERS) serves as a highly sensitive methodology for the immobilization and label-free detection of DNA applicable into a wide range of bio-diagnostic platforms. Other micropatterned arrays were also fabricated using a combination of soft lithography and surface engineering. Selective single cell patterning and adhesion was achieved through chemical modifications and surface engineering of poly(dimethylsiloxane) surface. The surface of each microwell was functionally engineered with a SAM which contained an aldehyde terminated fused-ring aromatic thiolated molecule. Cells were found to be attracted and adherent to the chemically modified microwells. By combining soft lithography and surface engineering, a simple methodology produced single cell arrays on biocompatible substrates. Thus the design of plasmonic devices relies heavily on the nature of the plasmonic interactions between nanoparticles in the devices which can potentially be fabricated into lab-on-a-chip devices for multiplex sensing capabilities.

  13. Transaction based approach

    NASA Astrophysics Data System (ADS)

    Hunka, Frantisek; Matula, Jiri

    2017-07-01

The transaction-based approach is utilized in some business process modeling methodologies. Essential parts of these transactions are human beings, usually referred to by the notion of an agent or actor role. Using a particular example, the paper describes the possibilities of the Design Engineering Methodology for Organizations (DEMO) and the Resource-Event-Agent (REA) methodology. Whereas the DEMO methodology can be regarded as a generic methodology with its foundation in the theory of Enterprise Ontology, the REA methodology is a domain-specific methodology with its origin in accountancy systems. The result of these approaches is that the DEMO methodology captures everything that happens in reality with good empirical evidence, whereas the REA methodology captures only changes connected with economic events. Economic events represent either a change of property rights to an economic resource or the consumption or production of economic resources. This follows from the essence of economic events and their connection to economic resources.

  14. A Quasi-3-D Theory for Impedance Eduction in Uniform Grazing Flow

    NASA Technical Reports Server (NTRS)

    Watson, W. R.; Jones, M. G.; Parrott, T. L.

    2005-01-01

A 2-D impedance eduction methodology is extended to quasi-3-D sound fields in uniform or shearing mean flow. We introduce a nonlocal, nonreflecting boundary condition to terminate the duct and then educe the impedance by minimizing an objective function. The introduction of a parallel, sparse, equation solver significantly reduces the wall clock time for educing the impedance when compared to that of the sequential band solver used in the 2-D methodology. The accuracy, efficiency, and robustness of the methodology are demonstrated using two examples. In the first example, we show that the method reproduces the known impedance of a ceramic tubular test liner. In the second example, we illustrate that the approach educes the impedance of a four-segment liner where the first, second, and fourth segments consist of a perforated face sheet bonded to honeycomb, and the third segment is a cut from the ceramic tubular test liner. The ability of the method to educe the impedances of multisegmented liners has the potential to significantly reduce the amount of time and cost required to determine the impedance of several uniform liners by allowing them to be placed in series in the test section and to educe the impedance of each segment using a single numerical experiment. Finally, we probe the objective function in great detail and show that it contains a single minimum. Thus, our objective function is ideal for use with local, inexpensive, gradient-based optimizers.

  15. Evaluation of methodologies for assessing the overall diet: dietary quality scores and dietary pattern analysis.

    PubMed

    Ocké, Marga C

    2013-05-01

This paper aims to describe different approaches for studying the overall diet, with their advantages and limitations. Studies of the overall diet have emerged because the relationship between dietary intake and health is very complex, with all kinds of interactions. These cannot be captured well by studying single dietary components. Three main approaches to study the overall diet can be distinguished. The first method is researcher-defined scores or indices of diet quality. These are usually based on guidelines for a healthy diet or on diets known to be healthy. The second approach, using principal component or cluster analysis, is driven by the underlying dietary data. In principal component analysis, scales are derived based on the underlying relationships between food groups, whereas in cluster analysis, subgroups of the population are created with people that cluster together based on their dietary intake. A third approach includes methods that are driven by a combination of biological pathways and the underlying dietary data. Reduced rank regression defines linear combinations of food intakes that maximally explain nutrient intakes or intermediate markers of disease. Decision tree analysis identifies subgroups of a population whose members share dietary characteristics that influence (intermediate markers of) disease. It is concluded that all approaches have advantages and limitations and essentially answer different questions. The third approach is still in an exploratory phase but seems to have great potential and complementary value. More insight into the utility of conducting studies on the overall diet can be gained if more attention is given to methodological issues.
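
    A minimal sketch of the second, data-driven approach described above: dietary pattern scores are derived by principal component analysis of standardized food-group intakes, and the component loadings define each pattern. The food groups and intake data below are simulated placeholders.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        food_groups = ["vegetables", "fruit", "red_meat", "whole_grains", "soft_drinks"]
        intake = rng.gamma(shape=2.0, scale=50.0, size=(500, len(food_groups)))  # fake g/day

        X = StandardScaler().fit_transform(intake)
        pca = PCA(n_components=2).fit(X)
        pattern_scores = pca.transform(X)   # each participant's score on each pattern
        loadings = pca.components_          # food-group loadings defining the patterns
        print(loadings.round(2))

    Cluster analysis would instead assign each participant to a single subgroup, and reduced rank regression would replace the variance criterion with one driven by nutrient intakes or intermediate disease markers.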

  16. An efficient multistage algorithm for full calibration of the hemodynamic model from BOLD signal responses.

    PubMed

    Zambri, Brian; Djellouli, Rabia; Laleg-Kirati, Taous-Meriem

    2017-11-01

    We propose a computational strategy that falls into the category of prediction/correction iterative-type approaches, for calibrating the hemodynamic model. The proposed method is used to estimate consecutively the values of the two sets of model parameters. Numerical results corresponding to both synthetic and real functional magnetic resonance imaging measurements for a single stimulus as well as for multiple stimuli are reported to highlight the capability of this computational methodology to fully calibrate the considered hemodynamic model. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Purification and cultivation of human pituitary growth hormone secreting cells

    NASA Technical Reports Server (NTRS)

    Hymer, W. C.

    1978-01-01

The maintenance of actively secreting human pituitary growth hormone cells (somatotrophs) in vitro was studied. The primary approach was the testing of agents which may be expected to increase the release of human growth hormone (hGH). A procedure for tissue procurement is described along with the methodologies used to dissociate human pituitary tissue (obtained either at autopsy or surgery) into single cell suspensions. The validity of the Biogel cell column perfusion system for studying the dynamics of GH release was established and documented using a rat pituitary cell system.

  18. Theory and Methodology in Researching Emotions in Education

    ERIC Educational Resources Information Center

    Zembylas, Michalinos

    2007-01-01

    Differing theoretical approaches to the study of emotions are presented: emotions as private (psychodynamic approaches); emotions as sociocultural phenomena (social constructionist approaches); and a third perspective (interactionist approaches) transcending these two. These approaches have important methodological implications in studying…

  19. Methodological Approaches in MOOC Research: Retracing the Myth of Proteus

    ERIC Educational Resources Information Center

    Raffaghelli, Juliana Elisa; Cucchiara, Stefania; Persico, Donatella

    2015-01-01

    This paper explores the methodological approaches most commonly adopted in the scholarly literature on Massive Open Online Courses (MOOCs), published during the period January 2008-May 2014. In order to identify trends, gaps and criticalities related to the methodological approaches of this emerging field of research, we analysed 60 papers…

  20. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    USGS Publications Warehouse

    Crovelli, R.A.

    1988-01-01

The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.

  1. MonoSLAM: real-time single camera SLAM.

    PubMed

    Davison, Andrew J; Reid, Ian D; Molton, Nicholas D; Stasse, Olivier

    2007-06-01

    We present a real-time algorithm which can recover the 3D trajectory of a monocular camera, moving rapidly through a previously unknown scene. Our system, which we dub MonoSLAM, is the first successful application of the SLAM methodology from mobile robotics to the "pure vision" domain of a single uncontrolled camera, achieving real time but drift-free performance inaccessible to Structure from Motion approaches. The core of the approach is the online creation of a sparse but persistent map of natural landmarks within a probabilistic framework. Our key novel contributions include an active approach to mapping and measurement, the use of a general motion model for smooth camera movement, and solutions for monocular feature initialization and feature orientation estimation. Together, these add up to an extremely efficient and robust algorithm which runs at 30 Hz with standard PC and camera hardware. This work extends the range of robotic systems in which SLAM can be usefully applied, but also opens up new areas. We present applications of MonoSLAM to real-time 3D localization and mapping for a high-performance full-size humanoid robot and live augmented reality with a hand-held camera.
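
    The sketch below illustrates the probabilistic core of this kind of SLAM in the simplest possible setting: a single Kalman filter state holding a 1D camera position and one landmark, with a joint covariance that couples the two. MonoSLAM itself uses a nonlinear EKF over a 3D camera state and many landmarks, so everything here (dimensions, models, noise values) is a deliberately reduced stand-in rather than the authors' algorithm.

        import numpy as np

        def kf_predict(x, P, F, Q):
            """Propagate the joint camera/map state and covariance through the motion model."""
            return F @ x, F @ P @ F.T + Q

        def kf_update(x, P, z, H, R):
            """Fuse one landmark observation; the off-diagonal covariance terms are what
            let repeated observations reduce camera drift."""
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            return x + K @ (z - H @ x), (np.eye(len(x)) - K @ H) @ P

        x = np.array([0.0, 2.0])                   # [camera position, landmark position]
        P = np.diag([0.01, 1.0])                   # landmark is initially very uncertain
        F, Q = np.eye(2), np.diag([0.05, 0.0])     # camera moves with process noise
        H, R = np.array([[-1.0, 1.0]]), np.array([[0.02]])   # measure landmark relative to camera

        for z in ([1.9], [2.1], [2.0]):            # noisy relative observations
            x, P = kf_predict(x, P, F, Q)
            x, P = kf_update(x, P, np.array(z), H, R)
        print(x.round(2), P.round(3))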

  2. Using Self-Experimentation and Single-Subject Methodology to Promote Critical Thinking

    ERIC Educational Resources Information Center

    Cowley, Brian J.; Lindgren, Ann; Langdon, David

    2006-01-01

    Critical thinking is often absent from classroom endeavor because it is hard to define (Gelder, 2005) or is difficult to assess (Bissell & Lemons, 2006). Critical thinking is defined as application, analysis, synthesis, and evaluation (Browne & Minnick, 2005). This paper shows how self-experimentation and single-subject methodology can be used to…

  3. Riser Difference Uncertainty Methodology Based on Tank AY-101 Wall Thickness Measurements with Application to Tank AN-107

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weier, Dennis R.; Anderson, Kevin K.; Berman, Herbert S.

    2005-03-10

The DST Integrity Plan (RPP-7574, 2003, Double-Shell Tank Integrity Program Plan, Rev. 1A, CH2M HILL Hanford Group, Inc., Richland, Washington) requires the ultrasonic wall thickness measurement of two vertical scans of the tank primary wall while using a single riser location. The resulting measurements are then used in extreme value methodology to predict the minimum wall thickness expected for the entire tank. The representativeness of using a single riser in this manner to draw conclusions about the entire circumference of a tank has been questioned. The only data available with which to address the representativeness question comes from Tank AY-101, since only for that tank have multiple risers been used for such inspection. The purpose of this report is to (1) further characterize AY-101 riser differences (relative to prior work); (2) propose a methodology for incorporating a "riser difference" uncertainty for subsequent tanks for which only a single riser is used; and (3) specifically apply the methodology to measurements made from a single riser in Tank AN-107.
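
    As a rough illustration of the extreme-value step described above (not the report's actual statistical model, and without the riser-difference uncertainty term), the sketch fits a Gumbel distribution for minima to per-scan minimum wall thicknesses and extrapolates to a larger number of scans; all thickness values are hypothetical.

        import numpy as np
        from scipy.stats import gumbel_l

        # Hypothetical per-scan minimum wall thicknesses (inches) from a few vertical scans:
        scan_minima = np.array([0.496, 0.501, 0.488, 0.493, 0.499, 0.490])

        loc, scale = gumbel_l.fit(scan_minima)        # Gumbel distribution for minima

        # Treat the full circumference as if it were n_scans independent scans
        # (a strong simplifying assumption) and take the median of their minimum:
        n_scans = 40
        p = 1.0 - 0.5 ** (1.0 / n_scans)
        predicted_min = gumbel_l.ppf(p, loc, scale)
        print(round(float(predicted_min), 3))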

  4. Fluorescence polarization measures energy funneling in single light-harvesting antennas—LH2 vs conjugated polymers

    NASA Astrophysics Data System (ADS)

    Camacho, Rafael; Tubasum, Sumera; Southall, June; Cogdell, Richard J.; Sforazzini, Giuseppe; Anderson, Harry L.; Pullerits, Tõnu; Scheblykin, Ivan G.

    2015-10-01

    Numerous approaches have been proposed to mimic natural photosynthesis using artificial antenna systems, such as conjugated polymers (CPs), dendrimers, and J-aggregates. As a result, there is a need to characterize and compare the excitation energy transfer (EET) properties of various natural and artificial antennas. Here we experimentally show that EET in single antennas can be characterized by 2D polarization imaging using the single funnel approximation. This methodology addresses the ability of an individual antenna to transfer its absorbed energy towards a single pool of emissive states, using a single parameter called energy funneling efficiency (ɛ). We studied individual peripheral antennas of purple bacteria (LH2) and single CP chains of 20 nm length. As expected from a perfect antenna, LH2s showed funneling efficiencies close to unity. In contrast, CPs showed lower average funneling efficiencies, greatly varying from molecule to molecule. Cyclodextrin insulation of the conjugated backbone improves EET, increasing the fraction of CPs possessing ɛ = 1. Comparison between LH2s and CPs shows the importance of the protection systems and the protein scaffold of LH2, which keep the chromophores in functional form and at such geometrical arrangement that ensures excellent EET.

  5. Fluorescence polarization measures energy funneling in single light-harvesting antennas--LH2 vs conjugated polymers.

    PubMed

    Camacho, Rafael; Tubasum, Sumera; Southall, June; Cogdell, Richard J; Sforazzini, Giuseppe; Anderson, Harry L; Pullerits, Tõnu; Scheblykin, Ivan G

    2015-10-19

    Numerous approaches have been proposed to mimic natural photosynthesis using artificial antenna systems, such as conjugated polymers (CPs), dendrimers, and J-aggregates. As a result, there is a need to characterize and compare the excitation energy transfer (EET) properties of various natural and artificial antennas. Here we experimentally show that EET in single antennas can be characterized by 2D polarization imaging using the single funnel approximation. This methodology addresses the ability of an individual antenna to transfer its absorbed energy towards a single pool of emissive states, using a single parameter called energy funneling efficiency (ε). We studied individual peripheral antennas of purple bacteria (LH2) and single CP chains of 20 nm length. As expected from a perfect antenna, LH2s showed funneling efficiencies close to unity. In contrast, CPs showed lower average funneling efficiencies, greatly varying from molecule to molecule. Cyclodextrin insulation of the conjugated backbone improves EET, increasing the fraction of CPs possessing ε = 1. Comparison between LH2s and CPs shows the importance of the protection systems and the protein scaffold of LH2, which keep the chromophores in functional form and at such geometrical arrangement that ensures excellent EET.

  6. A Cross-Correlational Analysis between Electroencephalographic and End-Tidal Carbon Dioxide Signals: Methodological Issues in the Presence of Missing Data and Real Data Results

    PubMed Central

    Morelli, Maria Sole; Giannoni, Alberto; Passino, Claudio; Landini, Luigi; Emdin, Michele; Vanello, Nicola

    2016-01-01

    Electroencephalographic (EEG) irreducible artifacts are common and the removal of corrupted segments from the analysis may be required. The present study aims at exploring the effects of different EEG Missing Data Segment (MDS) distributions on cross-correlation analysis, involving EEG and physiological signals. The reliability of cross-correlation analysis both at single subject and at group level as a function of missing data statistics was evaluated using dedicated simulations. Moreover, a Bayesian-based approach for combining the single subject results at group level by considering each subject’s reliability was introduced. Starting from the above considerations, the cross-correlation function between EEG Global Field Power (GFP) in delta band and end-tidal CO2 (PETCO2) during rest and voluntary breath-hold was evaluated in six healthy subjects. The analysis of simulated data results at single subject level revealed a worsening of precision and accuracy in the cross-correlation analysis in the presence of MDS. At the group level, a large improvement in the results’ reliability with respect to single subject analysis was observed. The proposed Bayesian approach showed a slight improvement with respect to simple average results. Real data results were discussed in light of the simulated data tests and of the current physiological findings. PMID:27809243
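
    A minimal sketch of one practical way to handle missing data segments in this kind of analysis: compute the cross-correlation at each lag using only the sample pairs where both signals are present, with missing segments marked as NaN. This is a generic estimator for illustration, not necessarily the one evaluated in the paper.

        import numpy as np

        def nan_crosscorr(x, y, max_lag):
            """Pearson cross-correlation of two equally sampled signals at integer lags,
            using only sample pairs where neither signal is missing (NaN)."""
            x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
            n, r = len(x), {}
            for lag in range(-max_lag, max_lag + 1):
                if lag >= 0:
                    a, b = x[lag:], y[:n - lag]
                else:
                    a, b = x[:n + lag], y[-lag:]
                ok = ~np.isnan(a) & ~np.isnan(b)
                r[lag] = float(np.corrcoef(a[ok], b[ok])[0, 1]) if ok.sum() > 2 else np.nan
            return r

    As the simulations above indicate, the reliability of such estimates degrades as the missing fraction grows, which is why weighting single-subject results by their reliability when combining them at the group level can help.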

  7. Applying GRADE-CERQual to qualitative evidence synthesis findings-paper 3: how to assess methodological limitations.

    PubMed

    Munthe-Kaas, Heather; Bohren, Meghan A; Glenton, Claire; Lewin, Simon; Noyes, Jane; Tunçalp, Özge; Booth, Andrew; Garside, Ruth; Colvin, Christopher J; Wainwright, Megan; Rashidian, Arash; Flottorp, Signe; Carlsen, Benedicte

    2018-01-25

The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Working Group. The approach has been developed to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) coherence, (3) adequacy of data and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual and focuses on CERQual's methodological limitations component. We developed the methodological limitations component by searching the literature for definitions, gathering feedback from relevant research communities and developing consensus through project group meetings. We tested the CERQual methodological limitations component within several qualitative evidence syntheses before agreeing on the current definition and principles for application. When applying CERQual, we define methodological limitations as the extent to which there are concerns about the design or conduct of the primary studies that contributed evidence to an individual review finding. In this paper, we describe the methodological limitations component and its rationale and offer guidance on how to assess methodological limitations of a review finding as part of the CERQual approach. This guidance outlines the information required to assess the methodological limitations component, the steps that need to be taken to assess methodological limitations of data contributing to a review finding and examples of methodological limitation assessments. This paper provides guidance for review authors and others on undertaking an assessment of methodological limitations in the context of the CERQual approach. More work is needed to determine which criteria critical appraisal tools should include when assessing methodological limitations. We currently recommend that whichever tool is used, review authors provide a transparent description of their assessments of methodological limitations in a review finding. We expect the CERQual approach and its individual components to develop further as our experiences with the practical implementation of the approach increase.

  8. An Innovative Structural Mode Selection Methodology: Application for the X-33 Launch Vehicle Finite Element Model

    NASA Technical Reports Server (NTRS)

    Hidalgo, Homero, Jr.

    2000-01-01

An innovative methodology for determining structural target mode selection and mode selection based on a specific criterion is presented. An effective approach to single out modes which interact with specific locations on a structure has been developed for the X-33 Launch Vehicle Finite Element Model (FEM). We present a Root-Sum-Square (RSS) displacement method that computes the resultant modal displacement for each mode at selected degrees of freedom (DOF) and sorts them to locate the modes with the highest values. This method was used to determine the modes which most influenced specific locations/points on the X-33 flight vehicle, such as avionics control components, aero-surface control actuators, propellant valve and engine points, for use in flight control stability analysis and flight POGO stability analysis. Additionally, the modal RSS method allows primary or global target vehicle modes to also be identified in an accurate and efficient manner.
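
    A minimal sketch of the RSS displacement ranking described above: given a mode-shape matrix, compute the root-sum-square of each mode's displacements at the degrees of freedom of interest and sort the modes by that value. The matrix and DOF indices below are placeholders, not X-33 model data.

        import numpy as np

        def rank_modes_by_rss(phi, dof_indices):
            """phi: (n_dof, n_modes) mode-shape matrix.  Returns mode indices sorted by the
            root-sum-square of their displacements at the selected degrees of freedom."""
            rss = np.sqrt((phi[dof_indices, :] ** 2).sum(axis=0))   # one value per mode
            order = np.argsort(rss)[::-1]
            return order, rss[order]

        phi = np.random.default_rng(1).normal(size=(6, 4))          # placeholder mode shapes
        order, rss = rank_modes_by_rss(phi, dof_indices=[2, 5])     # e.g. actuator attachment DOF
        print(order, rss.round(3))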

  9. [Optimization of succinic acid fermentation with Actinobacillus succinogenes by response surface methodology].

    PubMed

    Shen, Naikun; Qin, Yan; Wang, Qingyan; Xie, Nengzhong; Mi, Huizhi; Zhu, Qixia; Liao, Siming; Huang, Ribo

    2013-10-01

Succinic acid is an important C4 platform chemical in the synthesis of many commodity and specialty chemicals. In the present work, different compounds were evaluated for succinic acid production by Actinobacillus succinogenes GXAS 137. Important parameters were screened by single-factor experiments and a Plackett-Burman design. Subsequently, the region of highest succinic acid production was approached by the path of steepest ascent. Then, the optimum values of the parameters were obtained by a Box-Behnken design. The results show that the important parameters were the glucose, yeast extract and MgCO3 concentrations. The optimum condition was as follows (g/L): glucose 70.00, yeast extract 9.20 and MgCO3 58.10. Succinic acid yield reached 47.64 g/L under the optimal condition, an increase of 29.14% over the yield before optimization (36.89 g/L). Response surface methodology was proven to be a powerful tool to optimize succinic acid production.
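
    A minimal sketch of the analysis step behind a Box-Behnken design: build the second-order (linear, squared and interaction) design matrix for the three coded factors and fit it by least squares. The coded runs and yields below are placeholders, not the experimental data from the study.

        import numpy as np
        from itertools import combinations

        def quadratic_design_matrix(X):
            """Intercept, linear, squared and two-factor interaction columns of the
            second-order model used in response surface methodology."""
            n, k = X.shape
            cols = ([np.ones(n)] + [X[:, i] for i in range(k)]
                    + [X[:, i] ** 2 for i in range(k)]
                    + [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)])
            return np.column_stack(cols)

        # Placeholder coded Box-Behnken runs for (glucose, yeast extract, MgCO3) and yields (g/L):
        X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
                      [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
                      [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
                      [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
        y = np.array([38, 41, 40, 43, 37, 42, 41, 45, 39, 42, 43, 46, 47, 48, 47], dtype=float)

        beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
        print(beta.round(2))   # fitted second-order coefficients

    Setting the fitted model's gradient to zero (or maximizing it numerically over the coded region) gives the stationary point, which is then decoded back into actual concentrations.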

  10. Wavelet maxima curves of surface latent heat flux associated with two recent Greek earthquakes

    NASA Astrophysics Data System (ADS)

    Cervone, G.; Kafatos, M.; Napoletani, D.; Singh, R. P.

    2004-05-01

Multi-sensor data available through remote sensing satellites provide information about changes in the state of the oceans, land and atmosphere. Recent studies have shown anomalous changes in ocean, land, atmospheric and ionospheric parameters prior to earthquake events. This paper introduces an innovative data mining technique to identify precursory signals associated with earthquakes. The proposed methodology is a multi-strategy approach which employs one-dimensional wavelet transformations to identify singularities in the data, and an analysis of the continuity of the wavelet maxima in time and space to identify the singularities associated with earthquakes. The proposed methodology has been employed using Surface Latent Heat Flux (SLHF) data to study the earthquakes which occurred on 14 August 2003 and 1 March 2004 in Greece. A single prominent SLHF anomaly was found about two weeks prior to each of the earthquakes.
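
    A rough sketch of the wavelet-maxima idea: compute a continuous wavelet transform of the SLHF time series, locate the local maxima of the coefficient magnitudes at each scale, and keep only the time indices where maxima persist across many scales (maxima "lines"), which is where singularities such as the reported pre-earthquake anomalies would show up. This uses PyWavelets and a Morlet wavelet as stand-ins and is not the paper's full multi-strategy algorithm.

        import numpy as np
        import pywt
        from scipy.signal import argrelmax

        def wavelet_maxima_indices(signal, scales=None, wavelet="morl", min_scales=10):
            """Time indices where local maxima of |CWT coefficients| persist across at
            least min_scales scales -- a simple proxy for wavelet maxima curves."""
            scales = np.arange(1, 64) if scales is None else scales
            coefs, _ = pywt.cwt(np.asarray(signal, dtype=float), scales, wavelet)
            counts = np.zeros(len(signal), dtype=int)
            for row in np.abs(coefs):
                counts[argrelmax(row)[0]] += 1
            return np.where(counts >= min_scales)[0]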

  11. Identifying subgroups of patients using latent class analysis: should we use a single-stage or a two-stage approach? A methodological study using a cohort of patients with low back pain.

    PubMed

    Nielsen, Anne Molgaard; Kent, Peter; Hestbaek, Lise; Vach, Werner; Kongsted, Alice

    2017-02-01

    Heterogeneity in patients with low back pain (LBP) is well recognised and different approaches to subgrouping have been proposed. Latent Class Analysis (LCA) is a statistical technique that is increasingly being used to identify subgroups based on patient characteristics. However, as LBP is a complex multi-domain condition, the optimal approach when using LCA is unknown. Therefore, this paper describes the exploration of two approaches to LCA that may help improve the identification of clinically relevant and interpretable LBP subgroups. From 928 LBP patients consulting a chiropractor, baseline data were used as input to the statistical subgrouping. In a single-stage LCA, all variables were modelled simultaneously to identify patient subgroups. In a two-stage LCA, we used the latent class membership from our previously published LCA within each of six domains of health (activity, contextual factors, pain, participation, physical impairment and psychology) (first stage) as the variables entered into the second stage of the two-stage LCA to identify patient subgroups. The description of the results of the single-stage and two-stage LCA was based on a combination of statistical performance measures, qualitative evaluation of clinical interpretability (face validity) and a subgroup membership comparison. For the single-stage LCA, a model solution with seven patient subgroups was preferred, and for the two-stage LCA, a nine patient subgroup model. Both approaches identified similar, but not identical, patient subgroups characterised by (i) mild intermittent LBP, (ii) recent severe LBP and activity limitations, (iii) very recent severe LBP with both activity and participation limitations, (iv) work-related LBP, (v) LBP and several negative consequences and (vi) LBP with nerve root involvement. Both approaches identified clinically interpretable patient subgroups. The potential importance of these subgroups needs to be investigated by exploring whether they can be identified in other cohorts and by examining their possible association with patient outcomes. This may inform the selection of a preferred LCA approach.

  12. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
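
    As a hedged illustration of what a simulation-based hazard curve is, the snippet below builds an empirical mean-annual-rate-of-exceedance curve for inundation depth from synthetic scenario simulations and converts it to an exceedance probability over a 50-year window under a Poisson assumption; the event rate and depth distribution are invented for the example.

```python
# Minimal sketch of an empirical tsunami hazard curve from simulated events.
import numpy as np

rng = np.random.default_rng(1)
annual_rate = 0.02                                        # assumed mean annual event rate
depths = rng.lognormal(mean=0.0, sigma=1.0, size=5000)    # simulated inundation depths (m)

im_levels = np.linspace(0.1, 10.0, 100)
# Mean annual rate of exceedance = event rate x P(depth > im | event)
exceed_rate = annual_rate * np.array([(depths > im).mean() for im in im_levels])

# Probability of at least one exceedance in 50 years (Poisson assumption)
prob_50yr = 1.0 - np.exp(-exceed_rate * 50.0)
print(prob_50yr[:5])
```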

  13. Intentionality, degree of damage, and moral judgments.

    PubMed

    Berg-Cross, L G

    1975-12-01

    153 first graders were given Piagetian moral judgment problems with a new simplified methodology as well as the usual story-pair paradigm. The new methodology involved making quantitative judgments about single stories and examined the influence of level of intentionality and degree of damage upon absolute punishment ratings. Contrary to results obtained with a story-pair methodology, it was found that with single stories even 6-year-old children responded to the level of intention in the stories as well as the quantity and quality of damage involved. This suggested that Piaget's methodology may be forcing children to employ a simplifying strategy while under other conditions they are able to perform the mental operations necessary to make complex moral judgments.

  14. Application of the Hardman methodology to the Single Channel Ground-Airborne Radio System (SINCGARS)

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The HARDMAN methodology was applied to the various configurations of employment for an emerging Army multipurpose communications system. The methodology was used to analyze the manpower, personnel and training (MPT) requirements and associated costs of the system concepts responsive to the Army's requirement for the Single Channel Ground-Airborne Radio System (SINCGARS). The scope of the application includes the analysis of two conceptual designs (Cincinnati Electronics and ITT Aerospace/Optical Division) for operating and maintenance support, addressed through the general support maintenance echelon.

  15. Relationships between palaeogeography and opal occurrence in Australia: A data-mining approach

    NASA Astrophysics Data System (ADS)

    Landgrebe, T. C. W.; Merdith, A.; Dutkiewicz, A.; Müller, R. D.

    2013-07-01

    Age-coded multi-layered geological datasets are becoming increasingly prevalent with the surge in open-access geodata, yet there are few methodologies for extracting geological information and knowledge from these data. We present a novel methodology, based on the open-source GPlates software in which age-coded digital palaeogeographic maps are used to “data-mine” spatio-temporal patterns related to the occurrence of Australian opal. Our aim is to test the concept that only a particular sequence of depositional/erosional environments may lead to conditions suitable for the formation of gem quality sedimentary opal. Time-varying geographic environment properties are extracted from a digital palaeogeographic dataset of the eastern Australian Great Artesian Basin (GAB) at 1036 opal localities. We obtain a total of 52 independent ordinal sequences sampling 19 time slices from the Early Cretaceous to the present-day. We find that 95% of the known opal deposits are tied to only 27 sequences all comprising fluvial and shallow marine depositional sequences followed by a prolonged phase of erosion. We then map the total area of the GAB that matches these 27 opal-specific sequences, resulting in an opal-prospective region of only about 10% of the total area of the basin. The key patterns underlying this association involve only a small number of key environmental transitions. We demonstrate that these key associations are generally absent at arbitrary locations in the basin. This new methodology allows for the simplification of a complex time-varying geological dataset into a single map view, enabling straightforward application for opal exploration and for future co-assessment with other datasets/geological criteria. This approach may help unravel the poorly understood opal formation process using an empirical spatio-temporal data-mining methodology and readily available datasets to aid hypothesis testing.
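
    A minimal sketch of the sequence-matching step described above: each grid cell's time-ordered environment history is encoded as a tuple and tested for membership in the set of opal-prospective sequences. The environment codes, coordinates and the two example sequences are hypothetical stand-ins, not the paper's 27 sequences.

```python
# Hedged sketch of the palaeogeographic sequence-matching idea.
opal_sequences = {
    ("fluvial", "shallow_marine", "erosion", "erosion"),
    ("shallow_marine", "fluvial", "erosion", "erosion"),
}   # stand-in for the opal-specific sequences

def is_prospective(cell_history):
    """cell_history: environment codes ordered from oldest to youngest."""
    return tuple(cell_history) in opal_sequences

grid = {
    (141.5, -27.0): ("fluvial", "shallow_marine", "erosion", "erosion"),
    (143.0, -25.5): ("deep_marine", "deep_marine", "fluvial", "erosion"),
}
prospective_cells = [loc for loc, hist in grid.items() if is_prospective(hist)]
print(prospective_cells)
```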

  16. Understanding leachate flow in municipal solid waste landfills by combining time-lapse ERT and subsurface flow modelling - Part II: Constraint methodology of hydrodynamic models.

    PubMed

    Audebert, M; Oxarango, L; Duquennoi, C; Touze-Foltz, N; Forquet, N; Clément, R

    2016-09-01

    Leachate recirculation is a key process in the operation of municipal solid waste landfills as bioreactors. To ensure optimal water content distribution, bioreactor operators need tools to design leachate injection systems. Prediction of leachate flow by subsurface flow modelling could provide useful information for the design of such systems. However, hydrodynamic models require additional data to constrain them and to assess hydrodynamic parameters. Electrical resistivity tomography (ERT) is a suitable method to study leachate infiltration at the landfill scale. It can provide spatially distributed information which is useful for constraining hydrodynamic models. However, this geophysical method does not allow ERT users to directly measure water content in waste. The MICS (multiple inversions and clustering strategy) methodology was proposed to delineate the infiltration area precisely during time-lapse ERT surveys in order to avoid the use of empirical petrophysical relationships, which are not adapted to a heterogeneous medium such as waste. The infiltration shapes and hydrodynamic information extracted with MICS were used to constrain hydrodynamic models in assessing parameters. The constraint methodology developed in this paper was tested on two hydrodynamic models: an equilibrium model, where flow within the waste medium is estimated using a single continuum approach, and a non-equilibrium model, where flow is estimated using a dual continuum approach. The latter represents leachate flows into fractures. Finally, this methodology provides insight to identify the advantages and limitations of hydrodynamic models. Furthermore, we suggest an explanation for the large volume detected by MICS when a small volume of leachate is injected. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Review of health information technology usability study methodologies

    PubMed Central

    Bakken, Suzanne

    2011-01-01

    Usability factors are a major obstacle to health information technology (IT) adoption. The purpose of this paper is to review and categorize health IT usability study methods and to provide practical guidance on health IT usability evaluation. A total of 2025 references published from 2003 to 2009 that evaluated health IT used by clinicians were initially retrieved from the Medline database. Titles and abstracts were first reviewed for inclusion. Full-text articles were then examined to determine final eligibility. 629 studies were categorized into the five stages of an integrated usability specification and evaluation framework that was based on a usability model and the system development life cycle (SDLC)-associated stages of evaluation. Theoretical and methodological aspects of 319 studies were extracted in greater detail, and studies that focused on system validation (SDLC stage 2) were not assessed further. The number of studies by stage was: stage 1, task-based or user–task interaction, n=42; stage 2, system–task interaction, n=310; stage 3, user–task–system interaction, n=69; stage 4, user–task–system–environment interaction, n=54; and stage 5, user–task–system–environment interaction in routine use, n=199. The studies applied a variety of quantitative and qualitative approaches. Methodological issues included lack of theoretical framework/model, lack of details regarding qualitative study approaches, single evaluation focus, environmental factors not evaluated in the early stages, and guideline adherence as the primary outcome for decision support system evaluations. Based on the findings, a three-level stratified view of health IT usability evaluation is proposed and methodological guidance is offered based upon the type of interaction that is of primary interest in the evaluation. PMID:21828224

  18. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  19. Advances in single-cell RNA sequencing and its applications in cancer research.

    PubMed

    Zhu, Sibo; Qing, Tao; Zheng, Yuanting; Jin, Li; Shi, Leming

    2017-08-08

    Unlike population-level approaches, single-cell RNA sequencing enables transcriptomic analysis of an individual cell. Through the combination of high-throughput sequencing and bioinformatic tools, single-cell RNA-seq can detect more than 10,000 transcripts in one cell to distinguish cell subsets and dynamic cellular changes. After several years' development, single-cell RNA-seq can now achieve massively parallel, full-length mRNA sequencing as well as in situ sequencing and even has potential for multi-omic detection. One appealing area of single-cell RNA-seq is cancer research, and it is regarded as a promising way to enhance prognosis and provide more precise target therapy by identifying druggable subclones. Indeed, progress has been made regarding solid tumor analysis to reveal intratumoral heterogeneity, correlations between signaling pathways, stemness, drug resistance, and tumor architecture shaping the microenvironment. Furthermore, through investigation into circulating tumor cells, many genes have been shown to promote a propensity toward stemness and the epithelial-mesenchymal transition, to enhance anchoring and adhesion, and to be involved in mechanisms of anoikis resistance and drug resistance. This review focuses on advances and progress of single-cell RNA-seq with regard to the following aspects: 1. Methodologies of single-cell RNA-seq 2. Single-cell isolation techniques 3. Single-cell RNA-seq in solid tumor research 4. Single-cell RNA-seq in circulating tumor cell research 5. Perspectives.

  20. Advances in single-cell RNA sequencing and its applications in cancer research

    PubMed Central

    Zhu, Sibo; Qing, Tao; Zheng, Yuanting; Jin, Li; Shi, Leming

    2017-01-01

    Unlike population-level approaches, single-cell RNA sequencing enables transcriptomic analysis of an individual cell. Through the combination of high-throughput sequencing and bioinformatic tools, single-cell RNA-seq can detect more than 10,000 transcripts in one cell to distinguish cell subsets and dynamic cellular changes. After several years' development, single-cell RNA-seq can now achieve massively parallel, full-length mRNA sequencing as well as in situ sequencing and even has potential for multi-omic detection. One appealing area of single-cell RNA-seq is cancer research, and it is regarded as a promising way to enhance prognosis and provide more precise target therapy by identifying druggable subclones. Indeed, progress has been made regarding solid tumor analysis to reveal intratumoral heterogeneity, correlations between signaling pathways, stemness, drug resistance, and tumor architecture shaping the microenvironment. Furthermore, through investigation into circulating tumor cells, many genes have been shown to promote a propensity toward stemness and the epithelial-mesenchymal transition, to enhance anchoring and adhesion, and to be involved in mechanisms of anoikis resistance and drug resistance. This review focuses on advances and progress of single-cell RNA-seq with regard to the following aspects: 1. Methodologies of single-cell RNA-seq 2. Single-cell isolation techniques 3. Single-cell RNA-seq in solid tumor research 4. Single-cell RNA-seq in circulating tumor cell research 5. Perspectives PMID:28881849

  1. Regional Implementation of a Pediatric Cardiology Syncope Algorithm Using Standardized Clinical Assessment and Management Plans (SCAMPS) Methodology.

    PubMed

    Paris, Yvonne; Toro-Salazar, Olga H; Gauthier, Naomi S; Rotondo, Kathleen M; Arnold, Lucy; Hamershock, Rose; Saudek, David E; Fulton, David R; Renaud, Ashley; Alexander, Mark E

    2016-02-19

    Pediatric syncope is common. Cardiac causes are rarely found. We describe and assess a pragmatic approach to these patients first seen by a pediatric cardiologist in the New England region, using Standardized Clinical Assessment and Management Plans (SCAMPs). Ambulatory patients aged 7 to 21 years initially seen for syncope at participating New England Congenital Cardiology Association practices over a 2.5-year period were evaluated using a SCAMP. Findings were iteratively analyzed and the care pathway was revised. The vast majority (85%) of the 1254 patients had typical syncope. A minority had exercise-related or more problematic symptoms. Guideline-defined testing identified one patient with cardiac syncope. Syncope Severity Scores correlated well between physician and patient perceived symptoms. Orthostatic vital signs were of limited use. Largely incidental findings were seen in 10% of ECGs and 11% of echocardiograms. The 10% returning for follow-up, by design, reported more significant symptoms, but did not have newly recognized cardiac disease. Iterative analysis helped refine the approach. SCAMP methodology confirmed that the vast majority of children referred to the outpatient pediatric cardiology setting had typical low-severity neurally mediated syncope that could be effectively evaluated in a single visit using minimal resources. A simple scoring system can help triage patients into treatment categories. Prespecified criteria permitted the effective diagnosis of the single patient with a clear cardiac etiology. Patients with higher syncope scores still have a very low risk of cardiac disease, but may warrant attention. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.

  2. One-shot calculation of temperature-dependent optical spectra and phonon-induced band-gap renormalization

    NASA Astrophysics Data System (ADS)

    Zacharias, Marios; Giustino, Feliciano

    2016-08-01

    Recently, Zacharias et al. [Phys. Rev. Lett. 115, 177401 (2015), 10.1103/PhysRevLett.115.177401] developed an ab initio theory of temperature-dependent optical absorption spectra and band gaps in semiconductors and insulators. In that work, the zero-point renormalization and the temperature dependence were obtained by sampling the nuclear wave functions using a stochastic approach. In the present work, we show that the stochastic sampling of Zacharias et al. can be replaced by fully deterministic supercell calculations based on a single optimal configuration of the atomic positions. We demonstrate that a single calculation is able to capture the temperature-dependent band-gap renormalization including quantum nuclear effects in direct-gap and indirect-gap semiconductors, as well as phonon-assisted optical absorption in indirect-gap semiconductors. In order to demonstrate this methodology, we calculate from first principles the temperature-dependent optical absorption spectra and the renormalization of direct and indirect band gaps in silicon, diamond, and gallium arsenide, and we obtain good agreement with experiment and with previous calculations. In this work we also establish the formal connection between the Williams-Lax theory of optical transitions and the related theories of indirect absorption by Hall, Bardeen, and Blatt, and of temperature-dependent band structures by Allen and Heine. The present methodology enables systematic ab initio calculations of optical absorption spectra at finite temperature, including both direct and indirect transitions. This feature will be useful for high-throughput calculations of optical properties at finite temperature and for calculating temperature-dependent optical properties using high-level theories such as GW and Bethe-Salpeter approaches.

  3. DNA-Based Diet Analysis for Any Predator

    PubMed Central

    Dunshea, Glenn

    2009-01-01

    Background Prey DNA from diet samples can be used as a dietary marker; yet current methods for prey detection require a priori diet knowledge and/or are designed ad hoc, limiting their scope. I present a general approach to detect diverse prey in the feces or gut contents of predators. Methodology/Principal Findings In the example outlined, I take advantage of the restriction site for the endonuclease Pac I which is present in 16S mtDNA of most Odontoceti mammals, but absent from most other relevant non-mammalian chordates and invertebrates. Thus, in DNA extracted from feces of these mammalian predators, Pac I will cleave and exclude predator DNA from a small region targeted by novel universal primers, while most prey DNA remain intact allowing prey selective PCR. The method was optimized using scat samples from captive bottlenose dolphins (Tursiops truncatus) fed a diet of 6–10 prey species from three phyla. Up to five prey from two phyla were detected in a single scat and all but one minor prey item (2% of the overall diet) were detected across all samples. The same method was applied to scat samples from free-ranging bottlenose dolphins; up to seven prey taxa were detected in a single scat and 13 prey taxa from eight teleost families were identified in total. Conclusions/Significance Data and further examples are provided to facilitate rapid transfer of this approach to any predator. This methodology should prove useful to zoologists using DNA-based diet techniques in a wide variety of study systems. PMID:19390570
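
    The selective-digestion logic can be illustrated in a few lines of code: amplicons containing the Pac I recognition site (TTAATTAA) would be cleaved and lost from the prey-selective PCR, while sequences lacking the site survive. The example sequences are hypothetical and far shorter than real 16S fragments.

```python
# Illustrative check (not the published protocol) of Pac I-based predator exclusion.
PACI_SITE = "TTAATTAA"

def cleaved_by_paci(seq: str) -> bool:
    return PACI_SITE in seq.upper()

amplicons = {
    "dolphin_16S_fragment": "ACGTTTAATTAAGGCT",   # hypothetical predator sequence
    "teleost_16S_fragment": "ACGTTCAGTTAAGGCT",   # hypothetical prey sequence
}
surviving = [name for name, seq in amplicons.items() if not cleaved_by_paci(seq)]
print("templates available for prey-selective PCR:", surviving)
```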

  4. Assessment of levothyroxine sodium bioavailability: recommendations for an improved methodology based on the pooled analysis of eight identically designed trials with 396 drug exposures.

    PubMed

    Walter-Sack, Ingeborg; Clanget, Christof; Ding, Reinhard; Goeggelmann, Christoph; Hinke, Vera; Lang, Matthias; Pfeilschifter, Johannes; Tayrouz, Yorki; Wegscheider, Karl

    2004-01-01

    Assessment of dosage form performance in delivering endogenous compounds, such as hormones, in vivo requires a specific approach. Assessment of relative bioavailability of levothyroxine sodium (L-T4) from eight solid preparations, compared with a liquid formulation, by using pharmacological doses, and critical evaluation of trial methodology based on the pooled analysis of individual data. Eight open-label, randomised, single-dose, crossover phase I studies using eight solid L-T4 dosage forms (25, 50, 75, 100, 125, 150, 175, 200 microg per tablet; administered total doses 600, 625 or 700 microg) and a liquid formulation; assessment of relative bioavailability by 90% confidence intervals for the relative area under the concentration-time curve (AUC) of total thyroxine (TT4), i.e. protein-bound plus free thyroxine, calculated by using the recommended log AUC four-way analysis of variance models for crossover designs. For the pooled analysis, general linear models were applied to assess the validity of model assumptions, to identify potential sources of effect modification, to discuss alternative modelling approaches with respect to endogenous hormone secretion and to give recommendations for future designs and sample sizes. One hundred and sixty-nine healthy males; 29 of these individuals participating in two studies. Single oral doses of L-T4 tablets and the liquid formulation administered after fasting, separated by at least 6 weeks; a total of 396 drug exposures. TT4 AUC from 0 to 48 hours and peak plasma concentration with and without baseline correction. Each study demonstrated equivalence of the tablets to the drinking solution, independent of the chosen analysis model. Sequence effects that could devalidate the chosen crossover approach were not found. Period effects with changing directions that could best be explained by seasonal variation were detected. While the pre-specified method of baseline correction of simply subtracting individual time-zero TT4 values was disadvantageous, the analysis of total AUC could be improved considerably by covariate adjustment for baseline TT4. With this approach, sample sizes could have been substantially reduced or, alternatively, the recommended equivalence ranges could be reduced to +/-6%. Using a single pharmacological dose of L-T4 in two-period crossover designs is a safe and reliable procedure to assess L-T4 dosage form performance. With an adequate statistical modelling approach, the design is efficient and allows general conclusions with moderate sample sizes.
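
    The two analysis options discussed in this record, subtracting the individual time-zero TT4 value before computing the AUC versus adjusting total AUC for baseline TT4 as a covariate, can be sketched on synthetic data as follows; the sampling times, concentrations and the simple least-squares adjustment are illustrative assumptions, not the trial's statistical model.

```python
# Minimal sketch: baseline-subtracted AUC vs covariate-adjusted total AUC.
import numpy as np

rng = np.random.default_rng(2)
t = np.array([0, 1, 2, 4, 6, 8, 12, 24, 36, 48])              # hours
n = 20
baseline = rng.normal(90.0, 10.0, n)                           # hypothetical TT4 at t=0
profiles = baseline[:, None] + 30.0 * np.exp(-t / 24.0) + rng.normal(0, 3, (n, len(t)))

auc_total = np.trapz(profiles, t, axis=1)
auc_baseline_subtracted = np.trapz(profiles - baseline[:, None], t, axis=1)  # option (a)

# Option (b): adjust total AUC for baseline TT4 via ordinary least squares
X = np.column_stack([np.ones(n), baseline])
coef, *_ = np.linalg.lstsq(X, auc_total, rcond=None)
auc_adjusted = auc_total - coef[1] * (baseline - baseline.mean())
print(auc_adjusted[:3])
```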

  5. Single scan parameterization of space-variant point spread functions in image space via a printed array: the impact for two PET/CT scanners.

    PubMed

    Kotasidis, F A; Matthews, J C; Angelis, G I; Noonan, P J; Jackson, A; Price, P; Lionheart, W R; Reader, A J

    2011-05-21

    Incorporation of a resolution model during statistical image reconstruction often produces images of improved resolution and signal-to-noise ratio. A novel and practical methodology to rapidly and accurately determine the overall emission and detection blurring component of the system matrix using a printed point source array within a custom-made Perspex phantom is presented. The array was scanned at different positions and orientations within the field of view (FOV) to examine the feasibility of extrapolating the measured point source blurring to other locations in the FOV and the robustness of measurements from a single point source array scan. We measured the spatially-variant image-based blurring on two PET/CT scanners, the B-Hi-Rez and the TruePoint TrueV. These measured spatially-variant kernels and the spatially-invariant kernel at the FOV centre were then incorporated within an ordinary Poisson ordered subset expectation maximization (OP-OSEM) algorithm and compared to the manufacturer's implementation using projection space resolution modelling (RM). Comparisons were based on a point source array, the NEMA IEC image quality phantom, the Cologne resolution phantom and two clinical studies (carbon-11 labelled anti-sense oligonucleotide [(11)C]-ASO and fluorine-18 labelled fluoro-l-thymidine [(18)F]-FLT). Robust and accurate measurements of spatially-variant image blurring were successfully obtained from a single scan. Spatially-variant resolution modelling resulted in notable resolution improvements away from the centre of the FOV. Comparison between spatially-variant image-space methods and the projection-space approach (the first such report, using a range of studies) demonstrated very similar performance with our image-based implementation producing slightly better contrast recovery (CR) for the same level of image roughness (IR). These results demonstrate that image-based resolution modelling within reconstruction is a valid alternative to projection-based modelling, and that, when using the proposed practical methodology, the necessary resolution measurements can be obtained from a single scan. This approach avoids the relatively time-consuming and involved procedures previously proposed in the literature.
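
    To convey what image-space resolution modelling inside an EM reconstruction looks like, the sketch below collapses the system model to a spatially invariant Gaussian blur, so the update reduces to a Richardson-Lucy-style deconvolution; the scanner's actual OP-OSEM system matrix and the measured spatially variant kernels are not reproduced here.

```python
# Hedged sketch of image-space resolution modelling inside an EM update
# (system model reduced to a Gaussian blur for illustration only).
import numpy as np
from scipy.ndimage import gaussian_filter

def em_deblur(measured, sigma, n_iter=30):
    est = np.full_like(measured, measured.mean())
    for _ in range(n_iter):
        forward = gaussian_filter(est, sigma) + 1e-12       # A x
        ratio = measured / forward                          # y / (A x)
        est *= gaussian_filter(ratio, sigma)                # x <- x * A^T(y / A x)
    return est

truth = np.zeros((64, 64)); truth[32, 32] = 100.0            # point-source phantom
measured = gaussian_filter(truth, sigma=2.0)                  # blurred "acquisition"
recovered = em_deblur(measured, sigma=2.0)
print(recovered.max())
```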

  6. Quantitative Analysis of Mutant Subclones in Chronic Myeloid Leukemia: Comparison of Different Methodological Approaches

    PubMed Central

    Preuner, Sandra; Barna, Agnes; Frommlet, Florian; Czurda, Stefan; Konstantin, Byrgazov; Alikian, Mary; Machova Polakova, Katerina; Sacha, Tomasz; Richter, Johan; Lion, Thomas; Gabriel, Christian

    2016-01-01

    Identification and quantitative monitoring of mutant BCR-ABL1 subclones displaying resistance to tyrosine kinase inhibitors (TKIs) have become important tasks in patients with Ph-positive leukemias. Different technologies have been established for patient screening. Various next-generation sequencing (NGS) platforms facilitating sensitive detection and quantitative monitoring of mutations in the ABL1-kinase domain (KD) have been introduced recently, and are expected to become the preferred technology in the future. However, broad clinical implementation of NGS methods has been hampered by the limited accessibility at different centers and the current costs of analysis which may not be regarded as readily affordable for routine diagnostic monitoring. It is therefore of interest to determine whether NGS platforms can be adequately substituted by other methodological approaches. We have tested three different techniques including pyrosequencing, LD (ligation-dependent)-PCR and NGS in a series of peripheral blood specimens from chronic myeloid leukemia (CML) patients carrying single or multiple mutations in the BCR-ABL1 KD. The proliferation kinetics of mutant subclones in serial specimens obtained during the course of TKI-treatment revealed similar profiles via all technical approaches, but individual specimens showed statistically significant differences between NGS and the other methods tested. The observations indicate that different approaches to detection and quantification of mutant subclones may be applicable for the monitoring of clonal kinetics, but careful calibration of each method is required for accurate size assessment of mutant subclones at individual time points. PMID:27136541
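
    Whatever the platform (pyrosequencing, LD-PCR or NGS), the quantity being tracked over time is essentially the fraction of mutant signal at each sampling point; a minimal sketch of that calculation from hypothetical read counts is given below.

```python
# Minimal sketch: mutant subclone size as a variant allele fraction (illustrative counts).
def variant_allele_fraction(mutant_reads: int, total_reads: int) -> float:
    if total_reads == 0:
        raise ValueError("no coverage at this position")
    return mutant_reads / total_reads

timepoints = [(150, 12000), (900, 10000), (4200, 9800)]   # (mutant, total) reads per sample
for i, (mut, tot) in enumerate(timepoints, start=1):
    print(f"sample {i}: mutant subclone at {variant_allele_fraction(mut, tot):.1%}")
```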

  7. Quantitative Analysis of Mutant Subclones in Chronic Myeloid Leukemia: Comparison of Different Methodological Approaches.

    PubMed

    Preuner, Sandra; Barna, Agnes; Frommlet, Florian; Czurda, Stefan; Konstantin, Byrgazov; Alikian, Mary; Machova Polakova, Katerina; Sacha, Tomasz; Richter, Johan; Lion, Thomas; Gabriel, Christian

    2016-04-29

    Identification and quantitative monitoring of mutant BCR-ABL1 subclones displaying resistance to tyrosine kinase inhibitors (TKIs) have become important tasks in patients with Ph-positive leukemias. Different technologies have been established for patient screening. Various next-generation sequencing (NGS) platforms facilitating sensitive detection and quantitative monitoring of mutations in the ABL1-kinase domain (KD) have been introduced recently, and are expected to become the preferred technology in the future. However, broad clinical implementation of NGS methods has been hampered by the limited accessibility at different centers and the current costs of analysis which may not be regarded as readily affordable for routine diagnostic monitoring. It is therefore of interest to determine whether NGS platforms can be adequately substituted by other methodological approaches. We have tested three different techniques including pyrosequencing, LD (ligation-dependent)-PCR and NGS in a series of peripheral blood specimens from chronic myeloid leukemia (CML) patients carrying single or multiple mutations in the BCR-ABL1 KD. The proliferation kinetics of mutant subclones in serial specimens obtained during the course of TKI-treatment revealed similar profiles via all technical approaches, but individual specimens showed statistically significant differences between NGS and the other methods tested. The observations indicate that different approaches to detection and quantification of mutant subclones may be applicable for the monitoring of clonal kinetics, but careful calibration of each method is required for accurate size assessment of mutant subclones at individual time points.

  8. Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis

    NASA Technical Reports Server (NTRS)

    Babcock, P.; Schor, A.; Rosch, G.

    1998-01-01

    This document is an adjunct to the final report An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.

  9. The Application of a Trade Study Methodology to Determine Which Capabilities to Implement in a Test Facility Data Acquisition System Upgrade

    NASA Technical Reports Server (NTRS)

    McDougal, Kristopher J.

    2008-01-01

    More and more test programs are requiring high-frequency measurements. Marshall Space Flight Center's Cold Flow Test Facility has an interest in acquiring such data. The acquisition of this data requires special hardware and capabilities. This document provides a structured trade study approach for determining which additional capabilities of a VXI-based data acquisition system should be utilized to meet the test facility objectives. The paper is focused on the trade study approach, detailing and demonstrating the methodology. A case is presented in which a trade study was initially performed to provide a recommendation for the data system capabilities. Implementation details of the recommended alternative are briefly provided as well as the system's performance during a subsequent test program. The paper then addresses revisiting the trade study with modified alternatives and attributes to address issues that arose during the subsequent test program. Although the model does not identify a single best alternative for all sensitivities, the trade study process does provide a much better understanding. This better understanding makes it possible to confidently recommend Alternative 3 as the preferred alternative.
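
    A hedged sketch of the weighted-scoring core of such a trade study is shown below: alternatives are scored against attributes, weights encode facility priorities, and a small sensitivity sweep shows how the preferred alternative can change. The attribute names, weights and scores are invented for illustration and are not the values from the study.

```python
# Hedged sketch of a weighted decision matrix with a simple sensitivity sweep.
import numpy as np

attributes = ["sample_rate", "channel_count", "cost", "integration_effort"]
weights = np.array([0.35, 0.25, 0.25, 0.15])                # assumed priorities, sum to 1

scores = {                                                   # 1 (poor) .. 5 (excellent)
    "Alternative 1": np.array([2, 3, 5, 4]),
    "Alternative 2": np.array([4, 3, 3, 3]),
    "Alternative 3": np.array([5, 4, 2, 3]),
}
ranked = sorted(scores, key=lambda a: -(weights @ scores[a]))
print({a: float(weights @ scores[a]) for a in ranked})

# Sensitivity check: perturb the cost weight and re-rank
for w_cost in (0.15, 0.25, 0.35):
    w = np.array([0.35, 0.25, w_cost, 0.15]); w = w / w.sum()
    best = max(scores, key=lambda a: w @ scores[a])
    print(f"cost weight {w_cost:.2f} -> preferred: {best}")
```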

  10. Methodology and issues of integral experiments selection for nuclear data validation

    NASA Astrophysics Data System (ADS)

    Ivanova, Tatiana; Ivanov, Evgeny; Hill, Ian

    2017-09-01

    Nuclear data validation involves a large suite of Integral Experiments (IEs) for criticality, reactor physics and dosimetry applications. [1] Often benchmarks are taken from international handbooks. [2, 3] Depending on the application, IEs have different degrees of usefulness in validation, and usually the use of a single benchmark is not advised; indeed, it may lead to erroneous interpretation and results. [1] This work aims at quantifying the importance of benchmarks used in application-dependent cross section validation. The approach is based on the well-known General Linear Least Squares Method (GLLSM), extended to establish biases and uncertainties for given cross sections (within a given energy interval). The statistical treatment results in a vector of weighting factors for the integral benchmarks. These factors characterize the value added by a benchmark for nuclear data validation for the given application. The methodology is illustrated by one example, selecting benchmarks for 239Pu cross section validation. The studies were performed in the framework of Subgroup 39 (Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files) established at the Working Party on International Nuclear Data Evaluation Cooperation (WPEC) of the Nuclear Science Committee under the Nuclear Energy Agency (NEA/OECD).

  11. Parallel methodology to capture cyclic variability in motored engines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ameen, Muhsin M.; Yang, Xiaofeng; Kuo, Tang-Wei

    2016-07-28

    Numerical prediction of cycle-to-cycle variability (CCV) in SI engines is extremely challenging for two key reasons: (i) high-fidelity methods such as large eddy simulation (LES) are required to accurately capture the in-cylinder turbulent flowfield, and (ii) CCV is experienced over long timescales and hence the simulations need to be performed for hundreds of consecutive cycles. In this study, a new methodology is proposed to dissociate this long time-scale problem into several shorter time-scale problems, which can considerably reduce the computational time without sacrificing the fidelity of the simulations. The strategy is to perform multiple single-cycle simulations in parallel by effectively perturbing the simulation parameters such as the initial and boundary conditions. It is shown that by perturbing the initial velocity field effectively based on the intensity of the in-cylinder turbulence, the mean and variance of the in-cylinder flowfield are captured reasonably well. Adding perturbations in the initial pressure field and the boundary pressure improves the predictions. It is shown that this new approach is able to give accurate predictions of the flowfield statistics in less than one-tenth of the time required for the conventional approach of simulating consecutive engine cycles.
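
    The perturbation strategy can be sketched as follows: rather than simulating consecutive cycles, independent single-cycle runs are launched in parallel with initial velocity fields perturbed in proportion to an assumed turbulence intensity. The "simulation" below is a placeholder function, not an LES solver, and the field size and intensity are illustrative.

```python
# Hedged sketch: parallel single-cycle runs with perturbed initial conditions.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def run_single_cycle(args):
    seed, base_field, turb_intensity = args
    rng = np.random.default_rng(seed)
    perturbed = base_field + turb_intensity * rng.standard_normal(base_field.shape)
    return perturbed.mean(), perturbed.std()      # stand-in for cycle-resolved outputs

if __name__ == "__main__":
    base = np.zeros((32, 32, 32))                 # placeholder mean velocity field (m/s)
    u_prime = 2.5                                 # assumed turbulence intensity (m/s)
    jobs = [(seed, base, u_prime) for seed in range(16)]
    with ProcessPoolExecutor() as pool:
        stats = list(pool.map(run_single_cycle, jobs))
    means, stds = zip(*stats)
    print("cycle-to-cycle variability of the mean:", np.std(means))
```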

  12. A Data-Driven Approach to Develop Physically Sound Predictors: Application to Depth-Averaged Velocities and Drag Coefficients on Vegetated Flows

    NASA Astrophysics Data System (ADS)

    Tinoco, R. O.; Goldstein, E. B.; Coco, G.

    2016-12-01

    We use a machine learning approach to seek accurate, physically sound predictors, to estimate two relevant flow parameters for open-channel vegetated flows: mean velocities and drag coefficients. A genetic programming algorithm is used to find a robust relationship between properties of the vegetation and flow parameters. We use data published from several laboratory experiments covering a broad range of conditions to obtain: a) in the case of mean flow, an equation that matches the accuracy of other predictors from recent literature while showing a less complex structure, and b) for drag coefficients, a predictor that relies on both single element and array parameters. We investigate different criteria for dataset size and data selection to evaluate their impact on the resulting predictor, as well as simple strategies to obtain only dimensionally consistent equations, and avoid the need for dimensional coefficients. The results show that a proper methodology can deliver physically sound models representative of the processes involved, such that genetic programming and machine learning techniques can be used as powerful tools to study complicated phenomena and develop not only purely empirical, but "hybrid" models, coupling results from machine learning methodologies into physics-based models.
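
    A sketch of the genetic-programming step, assuming the gplearn package as a stand-in for the authors' algorithm: symbolic regression evolves an expression for a drag coefficient from candidate features. The feature set (a stem Reynolds number and a solid volume fraction), the synthetic target and all hyperparameters are assumptions for the example.

```python
# Sketch of symbolic regression for a drag-coefficient predictor (assumes gplearn).
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(3)
Re = rng.uniform(1e2, 1e4, 500)                 # stem Reynolds number (assumed feature)
phi = rng.uniform(0.005, 0.10, 500)             # solid volume fraction (assumed feature)
Cd = 1.0 + 10.0 / np.sqrt(Re) + 5.0 * phi + rng.normal(0, 0.05, 500)   # synthetic target

X = np.column_stack([Re, phi])
est = SymbolicRegressor(population_size=2000, generations=20,
                        function_set=('add', 'sub', 'mul', 'div', 'sqrt'),
                        parsimony_coefficient=0.001, random_state=0)
est.fit(X, Cd)
print(est._program)                              # evolved expression in X0 (Re) and X1 (phi)
```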

  13. Artistic image analysis using graph-based learning approaches.

    PubMed

    Carneiro, Gustavo

    2013-08-01

    We introduce a new methodology for the problem of artistic image analysis, which among other tasks, involves the automatic identification of visual classes present in an art work. In this paper, we advocate the idea that artistic image analysis must explore a graph that captures the network of artistic influences by computing the similarities in terms of appearance and manual annotation. One of the novelties of our methodology is the proposed formulation that is a principled way of combining these two similarities in a single graph. Using this graph, we show that an efficient random walk algorithm based on an inverted label propagation formulation produces more accurate annotation and retrieval results compared with the following baseline algorithms: bag of visual words, label propagation, matrix completion, and structural learning. We also show that the proposed approach leads to a more efficient inference and training procedures. This experiment is run on a database containing 988 artistic images (with 49 visual classification problems divided into a multiclass problem with 27 classes and 48 binary problems), where we show the inference and training running times, and quantitative comparisons with respect to several retrieval and annotation performance measures.
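
    As a stand-in for the paper's inverted label propagation on a combined appearance/annotation graph, the snippet below runs scikit-learn's standard label propagation on synthetic image descriptors, with -1 marking unlabelled artworks; it illustrates the semi-supervised idea only, not the proposed formulation.

```python
# Stand-in sketch using scikit-learn's standard label propagation (synthetic data).
import numpy as np
from sklearn.semi_supervised import LabelPropagation

rng = np.random.default_rng(4)
features = rng.normal(size=(200, 32))            # e.g. appearance descriptors per image
labels = np.full(200, -1)                        # -1 marks unlabelled artworks
labels[:20] = rng.integers(0, 3, 20)             # a few manually annotated images

model = LabelPropagation(kernel='rbf', gamma=0.5)
model.fit(features, labels)
print(model.transduction_[:10])                  # propagated class for each image
```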

  14. Applications of Bayesian Procrustes shape analysis to ensemble radar reflectivity nowcast verification

    NASA Astrophysics Data System (ADS)

    Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang

    2016-07-01

    This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
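
    The basic Procrustes step underlying the approach can be illustrated with SciPy's ordinary (non-Bayesian) Procrustes analysis, aligning two sets of landmarks sampled along precipitation-object outlines and reporting the residual shape disparity; the landmark sets below are synthetic.

```python
# Minimal sketch of Procrustes alignment of two precipitation-object outlines.
import numpy as np
from scipy.spatial import procrustes

theta = np.linspace(0, 2 * np.pi, 20, endpoint=False)
forecast_shape = np.column_stack([np.cos(theta), np.sin(theta)])              # landmarks
observed_shape = 1.3 * np.column_stack([np.cos(theta + 0.2), 0.8 * np.sin(theta + 0.2)])

mtx1, mtx2, disparity = procrustes(forecast_shape, observed_shape)
print(f"shape disparity after alignment: {disparity:.4f}")
```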

  15. Direct Maximization of Protein Identifications from Tandem Mass Spectra*

    PubMed Central

    Spivak, Marina; Weston, Jason; Tomazela, Daniela; MacCoss, Michael J.; Noble, William Stafford

    2012-01-01

    The goal of many shotgun proteomics experiments is to determine the protein complement of a complex biological mixture. For many mixtures, most methodological approaches fall significantly short of this goal. Existing solutions to this problem typically subdivide the task into two stages: first identifying a collection of peptides with a low false discovery rate and then inferring from the peptides a corresponding set of proteins. In contrast, we formulate the protein identification problem as a single optimization problem, which we solve using machine learning methods. This approach is motivated by the observation that the peptide and protein level tasks are cooperative, and the solution to each can be improved by using information about the solution to the other. The resulting algorithm directly controls the relevant error rate, can incorporate a wide variety of evidence and, for complex samples, provides 18–34% more protein identifications than the current state of the art approaches. PMID:22052992

  16. Distinguishing Asthma Phenotypes Using Machine Learning Approaches.

    PubMed

    Howard, Rebecca; Rattray, Magnus; Prosperi, Mattia; Custovic, Adnan

    2015-07-01

    Asthma is not a single disease, but an umbrella term for a number of distinct diseases, each of which is caused by a distinct underlying pathophysiological mechanism. These discrete disease entities are often labelled as 'asthma endotypes'. The discovery of different asthma subtypes has moved from subjective approaches in which putative phenotypes are assigned by experts to data-driven ones which incorporate machine learning. This review focuses on the methodological developments of one such machine learning technique, latent class analysis, and how it has contributed to distinguishing asthma and wheezing subtypes in childhood. It also gives a clinical perspective, presenting the findings of studies from the past 5 years that used this approach. The identification of true asthma endotypes may be a crucial step towards understanding their distinct pathophysiological mechanisms, which could ultimately lead to more precise prevention strategies, identification of novel therapeutic targets and the development of effective personalized therapies.

  17. Assessing Potential Energy Cost Savings from Increased Energy Code Compliance in Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenberg, Michael I.; Hart, Philip R.; Athalye, Rahul A.

    The US Department of Energy's most recent commercial energy code compliance evaluation efforts focused on determining a percent compliance rating for states to help them meet requirements under the American Recovery and Reinvestment Act (ARRA) of 2009. That approach included a checklist of code requirements, each of which was graded pass or fail. Percent compliance for any given building was simply the percent of individual requirements that passed. With its binary approach to compliance determination, the previous methodology failed to answer some important questions. In particular, how much energy cost could be saved by better compliance with the commercial energy code and what are the relative priorities of code requirements from an energy cost savings perspective? This paper explores an analytical approach and pilot study using a single building type and climate zone to answer those questions.

  18. MetaSort untangles metagenome assembly by reducing microbial community complexity

    PubMed Central

    Ji, Peifeng; Zhang, Yanming; Wang, Jinfeng; Zhao, Fangqing

    2017-01-01

    Most current approaches to analyse metagenomic data rely on reference genomes. Novel microbial communities extend far beyond the coverage of reference databases and de novo metagenome assembly from complex microbial communities remains a great challenge. Here we present a novel experimental and bioinformatic framework, metaSort, for effective construction of bacterial genomes from metagenomic samples. MetaSort provides a sorted mini-metagenome approach based on flow cytometry and single-cell sequencing methodologies, and employs new computational algorithms to efficiently recover high-quality genomes from the sorted mini-metagenome, complemented by the original metagenome. Through extensive evaluations, we demonstrated that metaSort has an excellent and unbiased performance on genome recovery and assembly. Furthermore, we applied metaSort to an unexplored microflora colonized on the surface of marine kelp and successfully recovered 75 high-quality genomes at one time. This approach will greatly improve access to microbial genomes from complex or novel communities. PMID:28112173

  19. Replica exchange enveloping distribution sampling (RE-EDS): A robust method to estimate multiple free-energy differences from a single simulation.

    PubMed

    Sidler, Dominik; Schwaninger, Arthur; Riniker, Sereina

    2016-10-21

    In molecular dynamics (MD) simulations, free-energy differences are often calculated using free energy perturbation or thermodynamic integration (TI) methods. However, both techniques are only suited to calculate free-energy differences between two end states. Enveloping distribution sampling (EDS) presents an attractive alternative that allows the calculation of multiple free-energy differences in a single simulation. In EDS, a reference state is simulated which "envelops" the end states. The challenge of this methodology is the determination of optimal reference-state parameters to ensure equal sampling of all end states. Currently, the automatic determination of the reference-state parameters for multiple end states is an unsolved issue that limits the application of the methodology. To resolve this, we have generalised the replica-exchange EDS (RE-EDS) approach, introduced by Lee et al. [J. Chem. Theory Comput. 10, 2738 (2014)] for constant-pH MD simulations. By exchanging configurations between replicas with different reference-state parameters, the complexity of the parameter-choice problem can be substantially reduced. A new robust scheme to estimate the reference-state parameters from a short initial RE-EDS simulation with default parameters was developed, which allowed the calculation of 36 free-energy differences between nine small-molecule inhibitors of phenylethanolamine N-methyltransferase from a single simulation. The resulting free-energy differences were in excellent agreement with values obtained previously by TI and two-state EDS simulations.
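
    A hedged numerical sketch of the enveloping reference state is given below: the reference energy is a smoothed minimum of the offset end-state energies, E_R = -(1/(beta*s)) * ln(sum_i exp(-beta*s*(E_i - E_i^off))), where s is the smoothness parameter and the offsets are exactly the parameters whose automatic determination RE-EDS addresses. The energies, offsets and s values are illustrative.

```python
# Hedged sketch of the EDS reference-state energy that "envelops" several end states.
import numpy as np
from scipy.special import logsumexp

def eds_reference_energy(end_state_energies, offsets, beta, s):
    e = np.asarray(end_state_energies) - np.asarray(offsets)
    return -logsumexp(-beta * s * e) / (beta * s)

beta = 1.0 / (0.008314 * 300.0)          # 1/(kJ/mol) at 300 K
energies = [-50.0, -42.0, -47.0]         # instantaneous end-state energies (kJ/mol)
offsets = [0.0, 5.0, 2.0]                # assumed reference-state energy offsets
for s in (1.0, 0.1, 0.01):               # smaller s smooths the envelope further
    print(s, eds_reference_energy(energies, offsets, beta, s))
```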

  20. Comparison of the phenolic composition of fruit juices by single step gradient HPLC analysis of multiple components versus multiple chromatographic runs optimised for individual families.

    PubMed

    Bremner, P D; Blacklock, C J; Paganga, G; Mullen, W; Rice-Evans, C A; Crozier, A

    2000-06-01

    After minimal sample preparation, two different HPLC methodologies, one based on a single gradient reversed-phase HPLC step, the other on multiple HPLC runs each optimised for specific components, were used to investigate the composition of flavonoids and phenolic acids in apple and tomato juices. The principal components in apple juice were identified as chlorogenic acid, phloridzin, caffeic acid and p-coumaric acid. Tomato juice was found to contain chlorogenic acid, caffeic acid, p-coumaric acid, naringenin and rutin. The quantitative estimates of the levels of these compounds, obtained with the two HPLC procedures, were very similar, demonstrating that either method can be used to analyse accurately the phenolic components of apple and tomato juices. Chlorogenic acid in tomato juice was the only component not fully resolved in the single run study and the multiple run analysis prior to enzyme treatment. The single run system of analysis is recommended for the initial investigation of plant phenolics and the multiple run approach for analyses where chromatographic resolution requires improvement.

  1. Reviewing the methodology of an integrative review.

    PubMed

    Hopia, Hanna; Latvala, Eila; Liimatainen, Leena

    2016-12-01

    Whittemore and Knafl's updated description of the methodological approach for integrative reviews was published in 2005. Since then, the five stages of the approach have been regularly used as a basic conceptual structure of the integrative reviews conducted by nursing researchers. However, this methodological approach is seldom examined from the perspective of how systematically and rigorously the stages are implemented in the published integrative reviews. To appraise the selected integrative reviews on the basis of the methodological approach according to the five stages published by Whittemore and Knafl in 2005. A literature review was used in this study. CINAHL (Cumulative Index to Nursing and Allied Health), PubMed, OVID (Journals@Ovid) and the Cochrane Library databases were searched for integrative reviews published between 2002 and 2014. Papers were included if they used the methodological approach described by Whittemore and Knafl, were published in English and were focused on nursing education or nursing expertise. A total of 259 integrative review publications for potential inclusion were identified. Ten integrative reviews fulfilled the inclusion criteria. Findings from the studies were extracted and critically examined according to the five methodological stages. The reviews assessed followed the guidelines of the stated methodological approach to different extents. The stages of literature search, data evaluation and data analysis were fairly poorly formulated and only partially implemented in the studies included in the sample. The other two stages, problem identification and presentation, followed those described in the methodological approach quite well. Increasing use of research in clinical practice is inevitable, and therefore, integrative reviews can play a greater role in developing evidence-based nursing practices. Because of this, nurse researchers should pay more attention to sound integrative nursing research to systematise the review process and make it more rigorous. © 2016 Nordic College of Caring Science.

  2. Auditing as part of the terminology design life cycle.

    PubMed

    Min, Hua; Perl, Yehoshua; Chen, Yan; Halper, Michael; Geller, James; Wang, Yue

    2006-01-01

    To develop and test an auditing methodology for detecting errors in medical terminologies satisfying systematic inheritance. This methodology is based on various abstraction taxonomies that provide high-level views of a terminology and highlight potentially erroneous concepts. Our auditing methodology is based on dividing concepts of a terminology into smaller, more manageable units. First, we divide the terminology's concepts into areas according to their relationships/roles. Then each multi-rooted area is further divided into partial-areas (p-areas) that are singly-rooted. Each p-area contains a set of structurally and semantically uniform concepts. Two kinds of abstraction networks, called the area taxonomy and p-area taxonomy, are derived. These taxonomies form the basis for the auditing approach. Taxonomies tend to highlight potentially erroneous concepts in areas and p-areas. Human reviewers can focus their auditing efforts on the limited number of problematic concepts following two hypotheses on the probable concentration of errors. A sample of the area taxonomy and p-area taxonomy for the Biological Process (BP) hierarchy of the National Cancer Institute Thesaurus (NCIT) was derived from the application of our methodology to its concepts. These views led to the detection of a number of different kinds of errors that are reported, and to confirmation of the hypotheses on error concentration in this hierarchy. Our auditing methodology based on area and p-area taxonomies is an efficient tool for detecting errors in terminologies satisfying systematic inheritance of roles, and thus facilitates their maintenance. This methodology concentrates a domain expert's manual review on portions of the concepts with a high likelihood of errors.
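
    The first partitioning step of the methodology can be sketched in a few lines: concepts are grouped into areas keyed by their exact set of relationships (roles); each multi-rooted area would then be subdivided into singly-rooted partial-areas. The toy concepts and role names below are illustrative, not NCIT content.

```python
# Hedged sketch: group concepts into "areas" by their exact role sets.
from collections import defaultdict

concepts = {
    "Cell Proliferation":  {"has_location", "is_a"},
    "Apoptosis":           {"has_location", "is_a"},
    "Signal Transduction": {"is_a"},
    "Gene Expression":     {"is_a", "has_participant"},
}

areas = defaultdict(list)
for concept, roles in concepts.items():
    areas[frozenset(roles)].append(concept)      # one area per distinct role set

for roles, members in areas.items():
    print(sorted(roles), "->", members)
```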

  3. High-resolution three-dimensional structural microscopy by single-angle Bragg ptychography

    DOE PAGES

    Hruszkewycz, S. O.; Allain, M.; Holt, M. V.; ...

    2016-11-21

    Coherent X-ray microscopy by phase retrieval of Bragg diffraction intensities enables lattice distortions within a crystal to be imaged at nanometre-scale spatial resolutions in three dimensions. While this capability can be used to resolve structure–property relationships at the nanoscale under working conditions, strict data measurement requirements can limit the application of current approaches. In this work, we introduce an efficient method of imaging three-dimensional (3D) nanoscale lattice behaviour and strain fields in crystalline materials with a methodology that we call 3D Bragg projection ptychography (3DBPP). This method enables 3D image reconstruction of a crystal volume from a series of two-dimensional X-ray Bragg coherent intensity diffraction patterns measured at a single incident beam angle. Structural information about the sample is encoded along two reciprocal-space directions normal to the Bragg diffracted exit beam, and along the third dimension in real space by the scanning beam. Finally, we present our approach with an analytical derivation, a numerical demonstration, and an experimental reconstruction of lattice distortions in a component of a nanoelectronic prototype device.

  4. Overview of integrative tools and methods in assessing ecological integrity in estuarine and coastal systems worldwide.

    PubMed

    Borja, Angel; Bricker, Suzanne B; Dauer, Daniel M; Demetriades, Nicolette T; Ferreira, João G; Forbes, Anthony T; Hutchings, Pat; Jia, Xiaoping; Kenchington, Richard; Carlos Marques, João; Zhu, Changbo

    2008-09-01

    In recent years, several sets of legislation worldwide (Oceans Act in USA, Australia or Canada; Water Framework Directive or Marine Strategy in Europe, National Water Act in South Africa, etc.) have been developed in order to address ecological quality or integrity, within estuarine and coastal systems. Most such legislation seeks to define quality in an integrative way, by using several biological elements, together with physico-chemical and pollution elements. Such an approach allows assessment of ecological status at the ecosystem level ('ecosystem approach' or 'holistic approach' methodologies), rather than at species level (e.g. mussel biomonitoring or Mussel Watch) or just at chemical level (i.e. quality objectives) alone. Increasing attention has been paid to the development of tools for different physico-chemical or biological (phytoplankton, zooplankton, benthos, algae, phanerogams, fishes) elements of the ecosystems. However, few methodologies integrate all the elements into a single evaluation of a water body. The need for such integrative tools to assess ecosystem quality is very important, both from a scientific and stakeholder point of view. Politicians and managers need information from simple and pragmatic, but scientifically sound methodologies, in order to show to society the evolution of a zone (estuary, coastal area, etc.), taking into account human pressures or recovery processes. These approaches include: (i) multidisciplinarity, inherent in the teams involved in their implementation; (ii) integration of biotic and abiotic factors; (iii) accurate and validated methods in determining ecological integrity; and (iv) adequate indicators to follow the evolution of the monitored ecosystems. While some countries increasingly use the establishment of marine parks to conserve marine biodiversity and ecological integrity, there is awareness (e.g. in Australia) that conservation and management of marine ecosystems cannot be restricted to Marine Protected Areas but must include areas outside such reserves. This contribution reviews the current situation of integrative ecological assessment worldwide, by presenting several examples from each of the continents: Africa, Asia, Australia, Europe and North America.

  5. Energy saving in WWTP: Daily benchmarking under uncertainty and data availability limitations.

    PubMed

    Torregrossa, D; Schutz, G; Cornelissen, A; Hernández-Sancho, F; Hansen, J

    2016-07-01

    Efficient management of Waste Water Treatment Plants (WWTPs) can produce significant environmental and economic benefits. Energy benchmarking can be used to compare WWTPs, identify targets and use these to improve their performance. Different authors have performed benchmark analysis on a monthly or yearly basis, but their approaches suffer from a time lag between an event, its detection, interpretation and potential actions. The availability of on-line measurement data on many WWTPs should theoretically enable the decrease of the management response time by daily benchmarking. Unfortunately, this approach is often impossible because of limited data availability. This paper proposes a methodology to perform a daily benchmark analysis under database limitations. The methodology has been applied to the Energy Online System (EOS) developed in the framework of the project "INNERS" (INNovative Energy Recovery Strategies in the urban water cycle). EOS calculates a set of Key Performance Indicators (KPIs) for the evaluation of energy and process performances. In EOS, the energy KPIs take into consideration the pollutant load in order to enable the comparison between different plants. For example, EOS does not analyse the absolute energy consumption but the energy consumption per unit of pollutant load. This approach enables the comparison of performances for plants with different loads or for a single plant under different load conditions. The energy consumption is measured by on-line sensors, while the pollutant load is measured in the laboratory approximately every 14 days. Consequently, the unavailability of the water quality parameters is the limiting factor in calculating energy KPIs. In this paper, in order to overcome this limitation, the authors have developed a methodology to estimate the required parameters and manage the uncertainty in the estimation. By coupling the parameter estimation with an interval-based benchmark approach, the authors propose an effective, fast and reproducible way to manage infrequent inlet measurements. Its use enables benchmarking on a daily basis and prepares the ground for further investigation. Copyright © 2016 Elsevier Inc. All rights reserved.
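
    A minimal sketch of the load-normalised KPI idea described above, assuming hypothetical column names (energy_kwh, flow_m3, cod_mg_per_l) and an illustrative ±25% uncertainty band for days without a laboratory sample; it is not the EOS implementation.

    ```python
    import numpy as np
    import pandas as pd

    def cod_load_kg(cod_mg_per_l, flow_m3):
        # mg/L * m3 gives grams of COD; divide by 1000 for kg entering the plant
        return cod_mg_per_l * flow_m3 / 1000.0

    def interval_kpi(df, rel_uncertainty=0.25):
        """Daily kWh per kg COD as a [low, high] interval.

        df columns: energy_kwh, flow_m3, cod_mg_per_l (NaN on days without a
        laboratory sample).  Missing COD values are forward-filled from the
        last lab result and widened by a relative uncertainty band.
        """
        cod_est = df["cod_mg_per_l"].ffill()          # days before the first sample stay NaN
        measured = df["cod_mg_per_l"].notna()
        lo = cod_est * np.where(measured, 1.0, 1.0 - rel_uncertainty)
        hi = cod_est * np.where(measured, 1.0, 1.0 + rel_uncertainty)
        load_lo, load_hi = cod_load_kg(lo, df["flow_m3"]), cod_load_kg(hi, df["flow_m3"])
        # a higher assumed load gives a lower specific energy, so the bounds swap
        return pd.DataFrame({"kpi_low": df["energy_kwh"] / load_hi,
                             "kpi_high": df["energy_kwh"] / load_lo}, index=df.index)
    ```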

  6. A terahertz performance of hybrid single walled CNT based amplifier with analytical approach

    NASA Astrophysics Data System (ADS)

    Kumar, Sandeep; Song, Hanjung

    2018-01-01

    This work focuses on the terahertz performance of a hybrid single-walled carbon nanotube (CNT) based amplifier proposed for soil-parameter measurement applications. The proposed circuit topology provides a hybrid structure that achieves a wide impedance bandwidth of 0.33 THz within the range of 1.07 THz to 1.42 THz, a fractional bandwidth of 28%. The single-walled RF CNT network realizes the proposed design and is shown analytically to resonate at 1.25 THz. Moreover, an RF microstrip transmission-line radiator is used as a compensator in the circuit topology, which achieves more than 30 dB of gain. A suitable methodology is chosen to achieve stability at the circuit level in order to obtain the desired optimal conditions. The fundamental approach optimizes the matched impedance condition at (50+j0) Ω and the noise variation with the impact of series resistances for the proposed hybrid circuit topology, and demonstrates the accuracy of the performance parameters at the circuit level. The chip was fabricated using a commercial 45 nm RF CMOS process and shows results in good agreement with simulation. Additionally, power measurement analysis achieves a highest output power of 26 dBm with a power-added efficiency of 78%. The minimum noise figure achieved, from 0.6 dB to 0.4 dB, is an outstanding result for a circuit topology in the terahertz range. The chip area of the hybrid circuit is 0.65 mm2 and the power consumption is 9.6 mW.

  7. Trial latencies estimation of event-related potentials in EEG by means of genetic algorithms

    NASA Astrophysics Data System (ADS)

    Da Pelo, P.; De Tommaso, M.; Monaco, A.; Stramaglia, S.; Bellotti, R.; Tangaro, S.

    2018-04-01

    Objective. Event-related potentials (ERPs) are usually obtained by averaging, thus neglecting the trial-to-trial latency variability in cognitive electroencephalography (EEG) responses. As a consequence, the shape and the peak amplitude of the averaged ERP are smeared and reduced, respectively, when the single-trial latencies show a relevant variability. To date, the majority of the methodologies for single-trial latency inference are iterative schemes providing suboptimal solutions, the most commonly used being Woody’s algorithm. Approach. In this study, a global approach is developed by introducing a fitness function whose global maximum corresponds to the set of latencies which renders the trial signals as aligned as possible. A suitable genetic algorithm has been implemented to solve the optimization problem, characterized by new genetic operators tailored to the present problem. Main results. The results, on simulated trials, showed that the proposed algorithm performs better than Woody’s algorithm in all conditions, at the cost of an increased computational complexity (justified by the improved quality of the solution). Application of the proposed approach to real data trials resulted in an increased correlation between latencies and reaction times with respect to the output from the RIDE method. Significance. The above-mentioned results on simulated and real data indicate that the proposed method, providing a better estimate of single-trial latencies, will open the way to more accurate study of neural responses as well as to the issue of relating the variability of latencies to the proper cognitive and behavioural correlates.
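
    A toy sketch of the latency-alignment idea: a generic genetic algorithm searches integer latency shifts that maximise the power of the latency-corrected average. The fitness function, truncation selection, and one-point crossover here are illustrative stand-ins, not the tailored operators described in the paper.

    ```python
    import numpy as np

    def fitness(latencies, trials):
        """Alignment score: power of the average of latency-corrected trials."""
        shifted = np.array([np.roll(tr, -lag) for tr, lag in zip(trials, latencies)])
        return np.sum(shifted.mean(axis=0) ** 2)

    def estimate_latencies(trials, max_lag=50, pop_size=60, n_gen=200, seed=0):
        rng = np.random.default_rng(seed)
        n_trials = len(trials)
        pop = rng.integers(-max_lag, max_lag + 1, size=(pop_size, n_trials))
        for _ in range(n_gen):
            scores = np.array([fitness(ind, trials) for ind in pop])
            parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]   # truncation selection
            cuts = rng.integers(1, n_trials, size=pop_size // 2)
            kids = np.array([np.concatenate([parents[i % len(parents)][:c],
                                             parents[(i + 1) % len(parents)][c:]])
                             for i, c in enumerate(cuts)])             # one-point crossover
            mutate = rng.random(kids.shape) < 0.05
            kids[mutate] += rng.integers(-5, 6, size=mutate.sum())     # small random shifts
            kids = np.clip(kids, -max_lag, max_lag)
            pop = np.vstack([parents, kids])
        return pop[np.argmax([fitness(ind, trials) for ind in pop])]
    ```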

  8. Physiotherapy treatment approaches for the recovery of postural control and lower limb function following stroke.

    PubMed

    Pollock, A; Baer, G; Pomeroy, V; Langhorne, P

    2007-01-24

    There are a number of different approaches to physiotherapy treatment following stroke that, broadly speaking, are based on neurophysiological, motor learning and orthopaedic principles. Some physiotherapists base their treatment on a single approach, while others use a mixture of components from a number of different approaches. To determine if there is a difference in the recovery of postural control and lower limb function in patients with stroke if physiotherapy treatment is based on orthopaedic or neurophysiological or motor learning principles, or on a mixture of these treatment principles. We searched the Cochrane Stroke Group Trials Register (last searched May 2005), the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library Issue 2, 2005), MEDLINE (1966 to May 2005), EMBASE (1980 to May 2005) and CINAHL (1982 to May 2005). We contacted experts and researchers with an interest in stroke rehabilitation. Randomised or quasi-randomised controlled trials of physiotherapy treatment approaches aimed at promoting the recovery of postural control and lower limb function in adult participants with a clinical diagnosis of stroke. Outcomes included measures of disability, motor impairment or participation. Two review authors independently categorised the identified trials according to the inclusion and exclusion criteria, documented their methodological quality, and extracted the data. Twenty-one trials were included in the review, five of which were included in two comparisons. Eight trials compared a neurophysiological approach with another approach; eight compared a motor learning approach with another approach; and eight compared a mixed approach with another approach. A mixed approach was significantly more effective than no treatment or placebo control for improving functional independence (standardised mean difference (SMD) 0.94, 95% confidence intervals (CI) 0.08 to 1.80). There was no significant evidence that any single approach had a better outcome than any other single approach or no treatment control. There is evidence that physiotherapy intervention, using a mix of components from different approaches, is significantly more effective than no treatment or placebo control in the recovery of functional independence following stroke. There is insufficient evidence to conclude that any one physiotherapy approach is more effective in promoting recovery of lower limb function or postural control following stroke than any other approach. We recommend that future research should concentrate on investigating the effectiveness of clearly described individual techniques and task-specific treatments, regardless of their historical or philosophical origin.
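
    For readers unfamiliar with the standardised mean difference (SMD) and confidence interval quoted above, the following sketch shows one common way such an effect size is computed from group summaries (Cohen's d with its large-sample variance); the review's actual meta-analytic model may differ.

    ```python
    import numpy as np

    def smd_with_ci(mean_t, sd_t, n_t, mean_c, sd_c, n_c, z=1.96):
        """Standardised mean difference (Cohen's d) and its approximate 95% CI."""
        s_pooled = np.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
        d = (mean_t - mean_c) / s_pooled
        var_d = (n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c))  # large-sample variance
        se = np.sqrt(var_d)
        return d, (d - z * se, d + z * se)
    ```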

  9. Single-molecule studies of the neuronal SNARE fusion machinery.

    PubMed

    Brunger, Axel T; Weninger, Keith; Bowen, Mark; Chu, Steven

    2009-01-01

    SNAREs are essential components of the machinery for Ca(2+)-triggered fusion of synaptic vesicles with the plasma membrane, resulting in neurotransmitter release into the synaptic cleft. Although much is known about their biophysical and structural properties and their interactions with accessory proteins such as the Ca(2+) sensor synaptotagmin, their precise role in membrane fusion remains an enigma. Ensemble studies of liposomes with reconstituted SNAREs have demonstrated that SNAREs and accessory proteins can trigger lipid mixing/fusion, but the inability to study individual fusion events has precluded molecular insights into the fusion process. Thus, this field is ripe for studies with single-molecule methodology. In this review, we discuss applications of single-molecule approaches to observe reconstituted SNAREs, their complexes, associated proteins, and their effect on biological membranes. Some of the findings are provocative, such as the possibility of parallel and antiparallel SNARE complexes or of vesicle docking with only syntaxin and synaptobrevin, but have been confirmed by other experiments.

  10. Collision-induced dissociative chemical cross-linking reagents and methodology: Applications to protein structural characterization using tandem mass spectrometry analysis.

    PubMed

    Soderblom, Erik J; Goshe, Michael B

    2006-12-01

    Chemical cross-linking combined with mass spectrometry is a viable approach to study the low-resolution structure of protein and protein complexes. However, unambiguous identification of the residues involved in a cross-link remains analytically challenging. To enable a more effective analysis across various MS platforms, we have developed a novel set of collision-induced dissociative cross-linking reagents and methodology for chemical cross-linking experiments using tandem mass spectrometry (CID-CXL-MS/MS). These reagents incorporate a single gas-phase cleavable bond within their linker region that can be selectively fragmented within the in-source region of the mass spectrometer, enabling independent MS/MS analysis for each peptide. Initial design concepts were characterized using a synthesized cross-linked peptide complex. Following verification and subsequent optimization of cross-linked peptide complex dissociation, our reagents were applied to homodimeric glutathione S-transferase and monomeric bovine serum albumin. Cross-linked residues identified by our CID-CXL-MS/MS method were in agreement with published crystal structures and previous cross-linking studies using conventional approaches. Common LC/MS/MS acquisition approaches such as data-dependent acquisition experiments using ion trap mass spectrometers and product ion spectral analysis using SEQUEST were shown to be compatible with our CID-CXL-MS/MS reagents, obviating the requirement for high resolution and high mass accuracy measurements to identify both intra- and interpeptide cross-links.

  11. Solar tower cavity receiver aperture optimization based on transient optical and thermo-hydraulic modeling

    NASA Astrophysics Data System (ADS)

    Schöttl, Peter; Bern, Gregor; van Rooyen, De Wet; Heimsath, Anna; Fluri, Thomas; Nitz, Peter

    2017-06-01

    A transient simulation methodology for cavity receivers for Solar Tower Central Receiver Systems with molten salt as heat transfer fluid is described. Absorbed solar radiation is modeled with ray tracing and a sky discretization approach to reduce computational effort. Solar radiation re-distribution in the cavity as well as thermal radiation exchange are modeled based on view factors, which are also calculated with ray tracing. An analytical approach is used to represent convective heat transfer in the cavity. Heat transfer fluid flow is simulated with a discrete tube model, where the boundary conditions at the outer tube surface mainly depend on inputs from the previously mentioned modeling aspects. A specific focus is put on the integration of optical and thermo-hydraulic models. Furthermore, aiming point and control strategies are described, which are used during the transient performance assessment. Eventually, the developed simulation methodology is used for the optimization of the aperture opening size of a PS10-like reference scenario with cavity receiver and heliostat field. The objective function is based on the cumulative gain of one representative day. Results include optimized aperture opening size, transient receiver characteristics and benefits of the implemented aiming point strategy compared to a single aiming point approach. Future work will include annual simulations, cost assessment and optimization of a larger range of receiver parameters.

  12. Institute for High Heat Flux Removal (IHHFR). Phases I, II, and III

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyd, Ronald D.

    2014-08-31

    The IHHFR focused on interdisciplinary applications as it relates to high heat flux engineering issues and problems which arise due to engineering systems being miniaturized, optimized, or requiring increased high heat flux performance. The work in the IHHFR focused on water as a coolant and includes: (1) the development, design, and construction of the high heat flux flow loop and facility; (2) test section development, design, and fabrication; and, (3) single-side heat flux experiments to produce 2-D boiling curves and 3-D conjugate heat transfer measurements for single-side heated test sections. This work provides data for comparisons with previously developed and new single-side heated correlations and approaches that address the single-side heated effect on heat transfer. In addition, this work includes the addition of single-side heated circular TS and a monoblock test section with a helical wire insert. Finally, the present work includes: (1) data base expansion for the monoblock with a helical wire insert (only for the latter geometry), (2) prediction and verification using finite element, (3) monoblock model and methodology development analyses, and (4) an alternate model development for a hypervapotron and related conjugate heat transfer controlling parameters.

  13. An efficient and accurate solution methodology for bilevel multi-objective programming problems using a hybrid evolutionary-local-search algorithm.

    PubMed

    Deb, Kalyanmoy; Sinha, Ankur

    2010-01-01

    Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.

  14. Transferring Codified Knowledge: Socio-Technical versus Top-Down Approaches

    ERIC Educational Resources Information Center

    Guzman, Gustavo; Trivelato, Luiz F.

    2008-01-01

    Purpose: This paper aims to analyse and evaluate the transfer process of codified knowledge (CK) performed under two different approaches: the "socio-technical" and the "top-down". It is argued that the socio-technical approach supports the transfer of CK better than the top-down approach. Design/methodology/approach: Case study methodology was…

  15. Chemical Proteomic Approaches Targeting Cancer Stem Cells: A Review of Current Literature.

    PubMed

    Jung, Hye Jin

    2017-01-01

    Cancer stem cells (CSCs) have been proposed as central drivers of tumor initiation, progression, recurrence, and therapeutic resistance. Therefore, identifying stem-like cells within cancers and understanding their properties is crucial for the development of effective anticancer therapies. Recently, chemical proteomics has become a powerful tool to efficiently determine protein networks responsible for CSC pathophysiology and comprehensively elucidate molecular mechanisms of drug action against CSCs. This review provides an overview of major methodologies utilized in chemical proteomic approaches. In addition, recent successful chemical proteomic applications targeting CSCs are highlighted. Future directions for CSC research, integrating chemical genomic and proteomic data obtained from a single biological sample of CSCs, are also suggested in this review. Copyright© 2017, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.

  16. Evaluation of methodology for the analysis of 'time-to-event' data in pharmacogenomic genome-wide association studies.

    PubMed

    Syed, Hamzah; Jorgensen, Andrea L; Morris, Andrew P

    2016-06-01

    To evaluate the power to detect associations between SNPs and time-to-event outcomes across a range of pharmacogenomic study designs while comparing alternative regression approaches. Simulations were conducted to compare Cox proportional hazards modeling accounting for censoring and logistic regression modeling of a dichotomized outcome at the end of the study. The Cox proportional hazards model was demonstrated to be more powerful than the logistic regression analysis. The difference in power between the approaches was highly dependent on the rate of censoring. Initial evaluation of single-nucleotide polymorphism association signals using computationally efficient software with dichotomized outcomes provides an effective screening tool for some design scenarios, and thus has important implications for the development of analytical protocols in pharmacogenomic studies.
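
    A simplified power simulation in the spirit of the comparison above, assuming the lifelines and statsmodels packages are available; the additive genotype model, hazard ratio, and administrative censoring scheme are illustrative choices, not the study's actual simulation settings.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from lifelines import CoxPHFitter   # assumed available

    def one_replicate(n=1000, maf=0.3, hr=1.3, censor_time=2.0, seed=0):
        rng = np.random.default_rng(seed)
        g = rng.binomial(2, maf, size=n)                      # additive SNP coding 0/1/2
        t = rng.exponential(1.0 / np.exp(np.log(hr) * g))     # hazard increases with genotype
        event = t <= censor_time
        t_obs = np.minimum(t, censor_time)                    # administrative censoring

        # Cox proportional hazards on the (possibly censored) event times
        df = pd.DataFrame({"T": t_obs, "E": event.astype(int), "g": g})
        cox_p = CoxPHFitter().fit(df, duration_col="T", event_col="E").summary.loc["g", "p"]

        # Logistic regression on the dichotomised outcome (event by end of study)
        logit_p = sm.Logit(event.astype(int), sm.add_constant(g.astype(float))).fit(disp=0).pvalues[1]
        return cox_p, logit_p

    def empirical_power(n_rep=200, alpha=0.05):
        results = np.array([one_replicate(seed=i) for i in range(n_rep)])
        return (results < alpha).mean(axis=0)    # [power_cox, power_logistic]
    ```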

  17. Application of Lightweight Formal Methods to Software Security

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt

    2005-01-01

    Formal specification and verification of security has proven a challenging task. There is no single method that has proven feasible. Instead, an integrated approach which combines several formal techniques can increase the confidence in the verification of software security properties. Such an approach, which specifies security properties in a library that can be reused by two instruments, and the methodologies developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL), are described herein. The Flexible Modeling Framework (FMF) is a model-based verification instrument that uses Promela and the SPIN model checker. The Property Based Tester (PBT) uses TASPEC and a Text Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.

  18. An approach to the rationalization of streamflow data collection networks

    NASA Astrophysics Data System (ADS)

    Burn, Donald H.; Goulter, Ian C.

    1991-01-01

    A new procedure for rationalizing a streamflow data collection network is developed. The procedure is a two-phase approach: in the first phase, a hierarchical clustering technique is used to identify groups of similar gauging stations; in the second phase, a single station from each identified group of gauging stations is selected to be retained in the rationalized network. The station selection phase is an inherently heuristic process that incorporates information about the characteristics of the individual stations in the network. The methodology allows the direct inclusion of user judgement into the station selection process in that it is possible to select more than one station from a group, if conditions warrant. The technique is demonstrated using streamflow gauging stations in and near the Pembina River basin, southern Manitoba, Canada.
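
    A minimal sketch of the two-phase idea using SciPy's hierarchical clustering, with hypothetical station attributes and a simple "longest record" heuristic standing in for the judgement-based station selection described above.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    def rationalise_network(features, record_lengths, n_groups=5):
        """Group similar gauging stations and keep one representative per group.

        features       : (n_stations, n_attributes) array of standardised station
                         characteristics (e.g. drainage area, mean flow, flow variability)
        record_lengths : length of record at each station, used as the heuristic
                         for which station within a group to retain
        """
        record_lengths = np.asarray(record_lengths)
        Z = linkage(features, method="ward")                 # phase 1: hierarchical clustering
        labels = fcluster(Z, t=n_groups, criterion="maxclust")
        keep = [int(np.flatnonzero(labels == g)[np.argmax(record_lengths[labels == g])])
                for g in np.unique(labels)]                  # phase 2: one station per group
        return labels, sorted(keep)
    ```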

  19. Automatic Authorship Detection Using Textual Patterns Extracted from Integrated Syntactic Graphs

    PubMed Central

    Gómez-Adorno, Helena; Sidorov, Grigori; Pinto, David; Vilariño, Darnes; Gelbukh, Alexander

    2016-01-01

    We apply the integrated syntactic graph feature extraction methodology to the task of automatic authorship detection. This graph-based representation allows integrating different levels of language description into a single structure. We extract textual patterns based on features obtained from shortest path walks over integrated syntactic graphs and apply them to determine the authors of documents. On average, our method outperforms the state of the art approaches and gives consistently high results across different corpora, unlike existing methods. Our results show that our textual patterns are useful for the task of authorship attribution. PMID:27589740
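
    A much-simplified sketch of the shortest-path-walk feature idea, assuming the networkx package and a pre-built directed graph whose nodes carry a "label" attribute; the integrated syntactic graphs of the paper combine lexical, morphological, and syntactic levels, which this toy version does not attempt. The resulting counts can be vectorised (e.g. with TF-IDF) and passed to any standard classifier for attribution.

    ```python
    from collections import Counter
    import networkx as nx

    def path_features(graph, root="ROOT"):
        """Bag of shortest-path walks from the root to every node, encoded as
        strings of node labels; these act as textual patterns for attribution."""
        feats = Counter()
        for node in graph.nodes:
            if node == root:
                continue
            try:
                path = nx.shortest_path(graph, source=root, target=node)
            except nx.NetworkXNoPath:
                continue
            feats["->".join(graph.nodes[n].get("label", str(n)) for n in path)] += 1
        return feats
    ```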

  20. Observational Versus Experimental Studies: What’s the Evidence for a Hierarchy?

    PubMed Central

    Concato, John

    2004-01-01

    Summary: The tenets of evidence-based medicine include an emphasis on hierarchies of research design (i.e., study architecture). Often, a single randomized, controlled trial is considered to provide “truth,” whereas results from any observational study are viewed with suspicion. This paper describes information that contradicts and discourages such a rigid approach to evaluating the quality of research design. Unless a more balanced strategy evolves, new claims of methodological authority may be just as problematic as the traditional claims of medical authority that have been criticized by proponents of evidence-based medicine. PMID:15717036

  1. A Framework for WWW Query Processing

    NASA Technical Reports Server (NTRS)

    Wu, Binghui Helen; Wharton, Stephen (Technical Monitor)

    2000-01-01

    Query processing is the most common operation in a DBMS. Sophisticated query processing has been mainly targeted at a single enterprise environment providing centralized control over data and metadata. Query submission by anonymous users on the web differs in that load balancing and DBMS access control become the key issues. This paper provides a solution by introducing a framework for WWW query processing. The success of this framework lies in the utilization of query optimization techniques and the ontological approach. This methodology has proved to be cost effective at the NASA Goddard Space Flight Center Distributed Active Archive Center (GDAAC).

  2. Accurate proteome-wide protein quantification from high-resolution 15N mass spectra

    PubMed Central

    2011-01-01

    In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods. PMID:22182234

  3. Vertically aligned single-walled carbon nanotubes by chemical assembly--methodology, properties, and applications.

    PubMed

    Diao, Peng; Liu, Zhongfan

    2010-04-06

    Single-walled carbon nanotubes (SWNTs), among the most promising one-dimensional nanomaterials owing to their unique structure and peculiar chemical, mechanical, thermal, and electronic properties, have long been considered an important building block for constructing ordered alignments. Vertically aligned SWNTs (v-SWNTs) have been successfully prepared by using direct growth and chemical assembly strategies. In this review, we focus explicitly on the v-SWNTs fabricated via the chemical assembly strategy. We provide the readers with a full and systematic summary covering the advances in all aspects of this area, including various approaches for the preparation of v-SWNTs using chemical assembly techniques, characterization, assembly kinetics, and electrochemical properties of v-SWNTs. We also review the applications of v-SWNTs in electrochemical and bioelectrochemical sensors, photoelectric conversion, and scanning probe microscopy.

  4. Refinements to the Graves and Pitarka (2010) Broadband Ground Motion Simulation Method

    USGS Publications Warehouse

    Graves, Robert; Arben Pitarka,

    2015-01-01

    This brief article describes refinements to the Graves and Pitarka (2010) broadband ground motion simulation methodology (GP2010 hereafter) that have been implemented in version 14.3 of the SCEC Broadband Platform (BBP). The updated version of our method on the current SCEC BBP is referred to as GP14.3. Our simulation technique is a hybrid approach that combines low-frequency and high-frequency motions computed with different methods into a single broadband response. The separate low- and high-frequency components have traditionally been called “deterministic” and “stochastic”, respectively; however, this nomenclature is an oversimplification. In reality, the low-frequency approach includes many stochastic elements, and likewise, the high-frequency approach includes many deterministic elements (e.g., Pulido and Kubo, 2004; Hartzell et al., 2005; Liu et al., 2006; Frankel, 2009; Graves and Pitarka, 2010; Mai et al., 2010). While the traditional terminology will likely remain in use by the broader modeling community, in this paper we will refer to these using the generic terminology “low-frequency” and “high-frequency” approaches. Furthermore, one of the primary goals in refining our methodology is to provide a smoother and more consistent transition between the low- and high-frequency calculations, with the ultimate objective being the development of a single unified modeling approach that can be applied over a broad frequency band. GP2010 was validated by modeling recorded strong motions from four California earthquakes. While the method performed well overall, several issues were identified, including the tendency to over-predict the level of longer period (2-5 sec) motions and the effects of rupture directivity. The refinements incorporated in GP14.3 are aimed at addressing these issues with application to the simulation of earthquakes in the Western US (WUS). These refinements include the addition of a deep weak zone (details in following section) to the rupture characterization and allowing perturbations in the correlation of rise time and rupture speed with the specified slip distribution. Additionally, we have extended the parameterization of GP14.3 so that it is also applicable for simulating Eastern North America (ENA) earthquakes. This work has been guided by the comprehensive set of validation studies described in Goulet and Abrahamson (2014) and Dreger et al. (2014). The GP14.3 method shows improved performance relative to GP2010, and we direct the interested reader to Dreger et al. (2014) for a detailed assessment of the current methodology. In this paper, we concentrate on describing the modifications in more detail, and also discussing additional refinements that are currently being developed.
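
    A generic illustration of the hybrid combination step mentioned above: a low-frequency and a high-frequency synthetic are merged with complementary filters around a crossover frequency. The 1 Hz crossover and Butterworth filters are assumptions for the sketch, not the matched filters or parameters of GP14.3.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def hybrid_broadband(low_freq_wave, high_freq_wave, dt, f_cross=1.0, order=4):
        """Combine a low-frequency and a high-frequency synthetic into a single
        broadband time history using complementary low-/high-pass filters."""
        nyq = 0.5 / dt                                      # Nyquist frequency
        b_lo, a_lo = butter(order, f_cross / nyq, btype="low")
        b_hi, a_hi = butter(order, f_cross / nyq, btype="high")
        return filtfilt(b_lo, a_lo, low_freq_wave) + filtfilt(b_hi, a_hi, high_freq_wave)
    ```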

  5. Diffuse interface immersed boundary method for multi-fluid flows with arbitrarily moving rigid bodies

    NASA Astrophysics Data System (ADS)

    Patel, Jitendra Kumar; Natarajan, Ganesh

    2018-05-01

    We present an interpolation-free diffuse interface immersed boundary method for multiphase flows with moving bodies. A single fluid formalism using the volume-of-fluid approach is adopted to handle multiple immiscible fluids which are distinguished using the volume fractions, while the rigid bodies are tracked using an analogous volume-of-solid approach that solves for the solid fractions. The solution to the fluid flow equations are carried out using a finite volume-immersed boundary method, with the latter based on a diffuse interface philosophy. In the present work, we assume that the solids are filled with a "virtual" fluid with density and viscosity equal to the largest among all fluids in the domain. The solids are assumed to be rigid and their motion is solved using Newton's second law of motion. The immersed boundary methodology constructs a modified momentum equation that reduces to the Navier-Stokes equations in the fully fluid region and recovers the no-slip boundary condition inside the solids. An implicit incremental fractional-step methodology in conjunction with a novel hybrid staggered/non-staggered approach is employed, wherein a single equation for normal momentum at the cell faces is solved everywhere in the domain, independent of the number of spatial dimensions. The scalars are all solved for at the cell centres, with the transport equations for solid and fluid volume fractions solved using a high-resolution scheme. The pressure is determined everywhere in the domain (including inside the solids) using a variable coefficient Poisson equation. The solution to momentum, pressure, solid and fluid volume fraction equations everywhere in the domain circumvents the issue of pressure and velocity interpolation, which is a source of spurious oscillations in sharp interface immersed boundary methods. A well-balanced algorithm with consistent mass/momentum transport ensures robust simulations of high density ratio flows with strong body forces. The proposed diffuse interface immersed boundary method is shown to be discretely mass-preserving while being temporally second-order accurate and exhibits nominal second-order accuracy in space. We examine the efficacy of the proposed approach through extensive numerical experiments involving one or more fluids and solids, that include two-particle sedimentation in homogeneous and stratified environment. The results from the numerical simulations show that the proposed methodology results in reduced spurious force oscillations in case of moving bodies while accurately resolving complex flow phenomena in multiphase flows with moving solids. These studies demonstrate that the proposed diffuse interface immersed boundary method, which could be related to a class of penalisation approaches, is a robust and promising alternative to computationally expensive conformal moving mesh algorithms as well as the class of sharp interface immersed boundary methods for multibody problems in multi-phase flows.

  6. A systematic review finds methodological improvements necessary for prognostic models in determining traumatic brain injury outcomes.

    PubMed

    Mushkudiani, Nino A; Hukkelhoven, Chantal W P M; Hernández, Adrián V; Murray, Gordon D; Choi, Sung C; Maas, Andrew I R; Steyerberg, Ewout W

    2008-04-01

    To describe the modeling techniques used for early prediction of outcome in traumatic brain injury (TBI) and to identify aspects for potential improvements. We reviewed key methodological aspects of studies published between 1970 and 2005 that proposed a prognostic model for the Glasgow Outcome Scale of TBI based on admission data. We included 31 papers. Twenty-four were single-center studies, and 22 reported on fewer than 500 patients. The median of the number of initially considered predictors was eight, and on average five of these were selected for the prognostic model, generally including age, Glasgow Coma Score (or only motor score), and pupillary reactivity. The most common statistical technique was logistic regression with stepwise selection of predictors. Model performance was often quantified by accuracy rate rather than by more appropriate measures such as the area under the receiver-operating characteristic curve. Model validity was addressed in 15 studies, but mostly used a simple split-sample approach, and external validation was performed in only four studies. Although most models agree on the three most important predictors, many were developed on small sample sizes within single centers and hence lack generalizability. Modeling strategies have to be improved, and include external validation.

  7. A systematic and transparent approach for assessing the methodological quality of intervention effectiveness research: the Study Design and Implementation Assessment Device (Study DIAD).

    PubMed

    Valentine, Jeffrey C; Cooper, Harris

    2008-06-01

    Assessments of studies meant to evaluate the effectiveness of interventions, programs, and policies can serve an important role in the interpretation of research results. However, evidence suggests that available quality assessment tools have poor measurement characteristics and can lead to opposing conclusions when applied to the same body of studies. These tools tend to (a) be insufficiently operational, (b) rely on arbitrary post-hoc decision rules, and (c) result in a single number to represent a multidimensional construct. In response to these limitations, a multilevel and hierarchical instrument was developed in consultation with a wide range of methodological and statistical experts. The instrument focuses on the operational details of studies and results in a profile of scores instead of a single score to represent study quality. A pilot test suggested that satisfactory between-judge agreement can be obtained using well-trained raters working in naturalistic conditions. Limitations of the instrument are discussed, but these are inherent in making decisions about study quality given incomplete reporting and in the absence of strong, contextually based information about the effects of design flaws on study outcomes. (PsycINFO Database Record (c) 2008 APA, all rights reserved).

  8. A novel multi-model neuro-fuzzy-based MPPT for three-phase grid-connected photovoltaic system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chaouachi, Aymen; Kamel, Rashad M.; Nagasaka, Ken

    This paper presents a novel methodology for Maximum Power Point Tracking (MPPT) of a grid-connected 20 kW photovoltaic (PV) system using a neuro-fuzzy network. The proposed method predicts the reference PV voltage, guaranteeing optimal power transfer between the PV generator and the main utility grid. The neuro-fuzzy network is composed of a fuzzy rule-based classifier and three multi-layered feed-forward Artificial Neural Networks (ANNs). Inputs of the network (irradiance and temperature) are classified before they are fed into the appropriate ANN for either the training or estimation process, while the output is the reference voltage. The main advantage of the proposed methodology, compared to a conventional single neural network-based approach, is its distinct generalization ability with regard to the nonlinear and dynamic behavior of a PV generator. In fact, the neuro-fuzzy network is a neural-network-based multi-model machine learning scheme that defines a set of local models emulating the complex and nonlinear behavior of a PV generator under a wide range of operating conditions. Simulation results under several rapid irradiance variations showed that the proposed MPPT method achieved the highest efficiency compared to a conventional single neural network and the Perturb and Observe (P and O) algorithm. (author)
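
    A bare-bones sketch of the multi-model routing idea, using scikit-learn MLPRegressor models and a crisp irradiance threshold as a stand-in for the fuzzy rule-based classifier; the class name, thresholds, and network sizes are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    class MultiModelMPPT:
        """Route (irradiance, temperature) inputs to one of three local ANNs, each
        trained on its own operating region, and predict the reference PV voltage."""

        def __init__(self):
            self.models = {k: MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=2000)
                           for k in ("low", "mid", "high")}

        @staticmethod
        def classify(irradiance):
            # crisp stand-in for the fuzzy rule-based classifier (W/m2 thresholds assumed)
            return "low" if irradiance < 300 else "mid" if irradiance < 700 else "high"

        def fit(self, X, v_ref):
            labels = np.array([self.classify(g) for g, _ in X])
            for k, model in self.models.items():
                mask = labels == k
                if mask.any():
                    model.fit(X[mask], v_ref[mask])   # each local model sees only its region
            return self

        def predict_one(self, irradiance, temperature):
            return float(self.models[self.classify(irradiance)]
                         .predict([[irradiance, temperature]])[0])
    ```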

  9. Systematic review of communication partner training in aphasia: methodological quality.

    PubMed

    Cherney, Leora R; Simmons-Mackie, Nina; Raymer, Anastasia; Armstrong, Elizabeth; Holland, Audrey

    2013-10-01

    Twenty-three studies identified from a previous systematic review examining the effects of communication partner training on persons with aphasia and their communication partners were evaluated for methodological quality. Two reviewers rated the studies on defined methodological quality criteria relevant to each study design. There were 11 group studies, seven single-subject participant design studies, and five qualitative studies. Quality scores were derived for each study. The mean inter-rater reliability of scores for each study design ranged from 85-93%, with Cohen's Kappa indicating substantial agreement between raters. Methodological quality of research on communication partner training in aphasia was highly varied. Overall, group studies employed the least rigorous methodology as compared to single subject and qualitative research. Only two of 11 group studies complied with more than half of the quality criteria. No group studies reported therapist blinding and only one group study reported participant blinding. Across all types of studies, the criterion of treatment fidelity was most commonly omitted. Failure to explicitly report certain methodological quality criteria may account for low ratings. Using methodological rating scales specific to the type of study design may help improve the methodological quality of aphasia treatment studies, including those on communication partner training.

  10. Comparison and combination of "direct" and fragment based local correlation methods: Cluster in molecules and domain based local pair natural orbital perturbation and coupled cluster theories

    NASA Astrophysics Data System (ADS)

    Guo, Yang; Becker, Ute; Neese, Frank

    2018-03-01

    Local correlation theories have been developed in two main flavors: (1) "direct" local correlation methods apply local approximation to the canonical equations and (2) fragment based methods reconstruct the correlation energy from a series of smaller calculations on subsystems. The present work serves two purposes. First, we investigate the relative efficiencies of the two approaches using the domain-based local pair natural orbital (DLPNO) approach as the "direct" method and the cluster in molecule (CIM) approach as the fragment based approach. Both approaches are applied in conjunction with second-order many-body perturbation theory (MP2) as well as coupled-cluster theory with single-, double- and perturbative triple excitations [CCSD(T)]. Second, we have investigated the possible merits of combining the two approaches by performing CIM calculations with DLPNO methods serving as the method of choice for performing the subsystem calculations. Our cluster-in-molecule approach is closely related to but slightly deviates from approaches in the literature since we have avoided real space cutoffs. Moreover, the neglected distant pair correlations in the previous CIM approach are considered approximately. Six very large molecules (503-2380 atoms) were studied. At both MP2 and CCSD(T) levels of theory, the CIM and DLPNO methods show similar efficiency. However, DLPNO methods are more accurate for 3-dimensional systems. While we have found only little incentive for the combination of CIM with DLPNO-MP2, the situation is different for CIM-DLPNO-CCSD(T). This combination is attractive because (1) the better parallelization opportunities offered by CIM; (2) the methodology is less memory intensive than the genuine DLPNO-CCSD(T) method and, hence, allows for large calculations on more modest hardware; and (3) the methodology is applicable and efficient in the frequently met cases, where the largest subsystem calculation is too large for the canonical CCSD(T) method.

  11. A deep convolutional neural network approach to single-particle recognition in cryo-electron microscopy.

    PubMed

    Zhu, Yanan; Ouyang, Qi; Mao, Youdong

    2017-07-21

    Single-particle cryo-electron microscopy (cryo-EM) has become a mainstream tool for the structural determination of biological macromolecular complexes. However, high-resolution cryo-EM reconstruction often requires hundreds of thousands of single-particle images. Particle extraction from experimental micrographs thus can be laborious and presents a major practical bottleneck in cryo-EM structural determination. Existing computational methods for particle picking often use low-resolution templates for particle matching, making them susceptible to reference-dependent bias. It is critical to develop a highly efficient template-free method for the automatic recognition of particle images from cryo-EM micrographs. We developed a deep learning-based algorithmic framework, DeepEM, for single-particle recognition from noisy cryo-EM micrographs, enabling automated particle picking, selection and verification in an integrated fashion. The kernel of DeepEM is built upon a convolutional neural network (CNN) composed of eight layers, which can be recursively trained to be highly "knowledgeable". Our approach exhibits an improved performance and accuracy when tested on the standard KLH dataset. Application of DeepEM to several challenging experimental cryo-EM datasets demonstrated its ability to avoid the selection of un-wanted particles and non-particles even when true particles contain fewer features. The DeepEM methodology, derived from a deep CNN, allows automated particle extraction from raw cryo-EM micrographs in the absence of a template. It demonstrates an improved performance, objectivity and accuracy. Application of this novel method is expected to free the labor involved in single-particle verification, significantly improving the efficiency of cryo-EM data processing.
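
    An illustrative PyTorch patch classifier in the spirit of template-free particle recognition; it is not the published eight-layer DeepEM network, and the layer sizes and patch dimension are assumptions for the sketch. Scanning such a classifier over a micrograph in a sliding window yields candidate particle positions.

    ```python
    import torch
    import torch.nn as nn

    class ParticleClassifier(nn.Module):
        """Small CNN scoring fixed-size micrograph patches as particle vs background."""

        def __init__(self, patch=64):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(64 * (patch // 8) ** 2, 2)   # particle / non-particle

        def forward(self, x):                     # x: (batch, 1, patch, patch)
            return self.classifier(self.features(x).flatten(1))
    ```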

  12. Navigating the grounded theory terrain. Part 1.

    PubMed

    Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John

    2011-01-01

    The decision to use grounded theory is not an easy one and this article aims to illustrate and explore the methodological complexity and decision-making process. It explores the decision making of one researcher in the first two years of a grounded theory PhD study looking at the psychosocial training needs of nurses and healthcare assistants working with people with dementia in residential care. It aims to map out three different approaches to grounded theory: classic, Straussian and constructivist. In nursing research, grounded theory is often referred to but it is not always well understood. This confusion is due in part to the history of grounded theory methodology, which is one of development and divergent approaches. Common elements across grounded theory approaches are briefly outlined, along with the key differences of the divergent approaches. Methodological literature pertaining to the three chosen grounded theory approaches is considered and presented to illustrate the options and support the choice made. The process of deciding on classical grounded theory as the version best suited to this research is presented. The methodological and personal factors that directed the decision are outlined. The relative strengths of Straussian and constructivist grounded theories are reviewed. All three grounded theory approaches considered offer the researcher a structured, rigorous methodology, but researchers need to understand their choices and make those choices based on a range of methodological and personal factors. In the second article, the final methodological decision will be outlined and its research application described.

  13. Analysis of phase II methodologies for single-arm clinical trials with multiple endpoints in rare cancers: An example in Ewing's sarcoma.

    PubMed

    Dutton, P; Love, S B; Billingham, L; Hassan, A B

    2018-05-01

    Trials run in either rare diseases, such as rare cancers, or rare sub-populations of common diseases are challenging in terms of identifying, recruiting and treating sufficient patients in a sensible period. Treatments for rare diseases are often designed for other disease areas and then later proposed as possible treatments for the rare disease after initial phase I testing is complete. To ensure the trial is in the best interests of the patient participants, frequent interim analyses are needed to force the trial to stop promptly if the treatment is futile or toxic. These non-definitive phase II trials should also be stopped for efficacy to accelerate research progress if the treatment proves to be particularly promising. In this paper, we review frequentist and Bayesian methods that have been adapted to incorporate two binary endpoints and frequent interim analyses. The Eurosarc Trial of Linsitinib in advanced Ewing Sarcoma (LINES) is used as a motivating example and provides a suitable platform to compare these approaches. The Bayesian approach provides greater design flexibility, but does not provide additional value over the frequentist approaches in a single trial setting when the prior is non-informative. However, Bayesian designs are able to borrow from any previous experience, using prior information to improve efficiency.
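
    A minimal Beta-Binomial sketch of a Bayesian interim stopping rule for a single binary endpoint, assuming a non-informative Beta(1, 1) prior and illustrative probability cut-offs; the LINES design monitors two binary endpoints jointly, which this toy rule does not capture.

    ```python
    from scipy.stats import beta

    def interim_decision(responses, n_treated, p_target=0.3,
                         futility_cut=0.05, efficacy_cut=0.95, a0=1.0, b0=1.0):
        """Posterior probability that the response rate exceeds p_target under a
        Beta(a0, b0) prior; stop for futility/efficacy when it crosses a cut-off."""
        posterior = beta(a0 + responses, b0 + n_treated - responses)
        p_exceeds = 1.0 - posterior.cdf(p_target)
        if p_exceeds < futility_cut:
            return "stop: futility", p_exceeds
        if p_exceeds > efficacy_cut:
            return "stop: efficacy", p_exceeds
        return "continue", p_exceeds
    ```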

  14. SAQP pitch walk metrology using single target metrology

    NASA Astrophysics Data System (ADS)

    Fang, Fang; Herrera, Pedro; Kagalwala, Taher; Camp, Janay; Vaid, Alok; Pandev, Stilian; Zach, Franz

    2017-03-01

    Self-aligned quadruple patterning (SAQP) processes have found widespread acceptance in advanced technology nodes to drive device scaling beyond the resolution limitations of immersion scanners. Of the four spaces generated in this process from one lithography pattern, two tend to be equivalent as they are derived from the first spacer deposition. The three independent spaces are commonly labelled α, β and γ, and are controlled by multiple process steps including the initial lithographic patterning process, the two mandrel and spacer etches, as well as the two spacer depositions. Scatterometry has been the preferred metrology approach; however, it is restricted to repetitive arrays. In these arrays, independent measurements, in particular of alpha and gamma, are not possible due to degeneracy of the standard array targets. In this work we present a single-target approach which lifts the degeneracies commonly encountered while using product-relevant layout geometries. We first describe the metrology approach, which combines the previously described SRM (signal response metrology) with reference data derived from CD SEM measurements. The performance of the methodology is shown in figures 1-3, in which the optically determined values for alpha, beta and gamma are compared to the CD SEM reference data. The variations are achieved using controlled process experiments varying mandrel CD and spacer deposition thicknesses.

  15. Fluorescence polarization measures energy funneling in single light-harvesting antennas—LH2 vs conjugated polymers

    PubMed Central

    Camacho, Rafael; Tubasum, Sumera; Southall, June; Cogdell, Richard J.; Sforazzini, Giuseppe; Anderson, Harry L.; Pullerits, Tõnu; Scheblykin, Ivan G.

    2015-01-01

    Numerous approaches have been proposed to mimic natural photosynthesis using artificial antenna systems, such as conjugated polymers (CPs), dendrimers, and J-aggregates. As a result, there is a need to characterize and compare the excitation energy transfer (EET) properties of various natural and artificial antennas. Here we experimentally show that EET in single antennas can be characterized by 2D polarization imaging using the single funnel approximation. This methodology addresses the ability of an individual antenna to transfer its absorbed energy towards a single pool of emissive states, using a single parameter called energy funneling efficiency (ε). We studied individual peripheral antennas of purple bacteria (LH2) and single CP chains of 20 nm length. As expected from a perfect antenna, LH2s showed funneling efficiencies close to unity. In contrast, CPs showed lower average funneling efficiencies, greatly varying from molecule to molecule. Cyclodextrin insulation of the conjugated backbone improves EET, increasing the fraction of CPs possessing ε = 1. Comparison between LH2s and CPs shows the importance of the protection systems and the protein scaffold of LH2, which keep the chromophores in functional form and at such geometrical arrangement that ensures excellent EET. PMID:26478272

  16. A framework for assessing the adequacy and effectiveness of software development methodologies

    NASA Technical Reports Server (NTRS)

    Arthur, James D.; Nance, Richard E.

    1990-01-01

    Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.

  17. Ensemble Nonlinear Autoregressive Exogenous Artificial Neural Networks for Short-Term Wind Speed and Power Forecasting.

    PubMed

    Men, Zhongxian; Yee, Eugene; Lien, Fue-Sang; Yang, Zhiling; Liu, Yongqian

    2014-01-01

    Short-term wind speed and wind power forecasts (for a 72 h period) are obtained using a nonlinear autoregressive exogenous artificial neural network (ANN) methodology which incorporates either numerical weather prediction or high-resolution computational fluid dynamics wind field information as an exogenous input. An ensemble approach is used to combine the predictions from many candidate ANNs in order to provide improved forecasts for wind speed and power, along with the associated uncertainties in these forecasts. More specifically, the ensemble ANN is used to quantify the uncertainties arising from the network weight initialization and from the unknown structure of the ANN. All members forming the ensemble of neural networks were trained using an efficient particle swarm optimization algorithm. The results of the proposed methodology are validated using wind speed and wind power data obtained from an operational wind farm located in Northern China. The assessment demonstrates that this methodology for wind speed and power forecasting generally provides an improvement in predictive skills when compared to the practice of using an "optimal" weight vector from a single ANN while providing additional information in the form of prediction uncertainty bounds.
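
    A compact illustration of the ensemble idea discussed above: identically specified networks trained from different random initialisations, with the percentile spread across members serving as an uncertainty band. It uses scikit-learn MLPRegressor as a convenient stand-in for the NARX networks trained by particle swarm optimisation in the paper.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def ensemble_forecast(X_train, y_train, X_future, n_members=20):
        """Mean forecast plus a 90% prediction band from an ensemble of ANNs that
        differ only in their random weight initialisation."""
        preds = []
        for seed in range(n_members):
            model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=seed)
            preds.append(model.fit(X_train, y_train).predict(X_future))
        preds = np.array(preds)
        lo, hi = np.percentile(preds, [5, 95], axis=0)
        return preds.mean(axis=0), lo, hi
    ```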

  18. Ensemble Nonlinear Autoregressive Exogenous Artificial Neural Networks for Short-Term Wind Speed and Power Forecasting

    PubMed Central

    Lien, Fue-Sang; Yang, Zhiling; Liu, Yongqian

    2014-01-01

    Short-term wind speed and wind power forecasts (for a 72 h period) are obtained using a nonlinear autoregressive exogenous artificial neural network (ANN) methodology which incorporates either numerical weather prediction or high-resolution computational fluid dynamics wind field information as an exogenous input. An ensemble approach is used to combine the predictions from many candidate ANNs in order to provide improved forecasts for wind speed and power, along with the associated uncertainties in these forecasts. More specifically, the ensemble ANN is used to quantify the uncertainties arising from the network weight initialization and from the unknown structure of the ANN. All members forming the ensemble of neural networks were trained using an efficient particle swarm optimization algorithm. The results of the proposed methodology are validated using wind speed and wind power data obtained from an operational wind farm located in Northern China. The assessment demonstrates that this methodology for wind speed and power forecasting generally provides an improvement in predictive skills when compared to the practice of using an “optimal” weight vector from a single ANN while providing additional information in the form of prediction uncertainty bounds. PMID:27382627

  19. Unspoken phenomena: using the photovoice method to enrich phenomenological inquiry.

    PubMed

    Plunkett, Robyn; Leipert, Beverly D; Ray, Susan L

    2013-06-01

    Photovoice is a powerful method that is gaining momentum in nursing research. As a relatively new method in nursing science, the situatedness of photovoice within or alongside various research methodologies in a single study remains in a stage of early development. The purpose of this paper is to discuss the photovoice method as a means to elicit phenomenological data when researching the lived experience. While the foundational bases of phenomenology and photovoice differ substantially, the argument presented in this paper suggests that the photovoice method can be successfully used in phenomenological inquiry provided that significant rigour checks are pursued. This includes reflecting upon the origins and understandings of both methodology and method to promote methodological congruency. Data collection and analysis approaches that contribute to phenomenological inquiry using the photovoice method in addition to rigour and ethical considerations are discussed. The use of data generated from photovoice in phenomenological inquiry may fill a void of understanding furnished by limitations of traditional phenomenological inquiry and of spoken language and can enhance understanding of the lived experience, which may not always be best understood by words alone. © 2012 John Wiley & Sons Ltd.

  20. Improving Mathematics Performance among Secondary Students with EBD: A Methodological Review

    ERIC Educational Resources Information Center

    Mulcahy, Candace A.; Krezmien, Michael P.; Travers, Jason

    2016-01-01

    In this methodological review, the authors apply special education research quality indicators and standards for single case design to analyze mathematics intervention studies for secondary students with emotional and behavioral disorders (EBD). A systematic methodological review of literature from 1975 to December 2012 yielded 19 articles that…

  1. Aggregation of carbon dioxide sequestration storage assessment units

    USGS Publications Warehouse

    Blondes, Madalyn S.; Schuenemeyer, John H.; Olea, Ricardo A.; Drew, Lawrence J.

    2013-01-01

    The U.S. Geological Survey is currently conducting a national assessment of carbon dioxide (CO2) storage resources, mandated by the Energy Independence and Security Act of 2007. Pre-emission capture and storage of CO2 in subsurface saline formations is one potential method to reduce greenhouse gas emissions and the negative impact of global climate change. Like many large-scale resource assessments, the area under investigation is split into smaller, more manageable storage assessment units (SAUs), which must be aggregated with correctly propagated uncertainty to the basin, regional, and national scales. The aggregation methodology requires two types of data: marginal probability distributions of storage resource for each SAU, and a correlation matrix obtained by expert elicitation describing interdependencies between pairs of SAUs. Dependencies arise because geologic analogs, assessment methods, and assessors often overlap. The correlation matrix is used to induce rank correlation, using a Cholesky decomposition, among the empirical marginal distributions representing individually assessed SAUs. This manuscript presents a probabilistic aggregation method tailored to the correlations and dependencies inherent to a CO2 storage assessment. Aggregation results must be presented at the basin, regional, and national scales. A single stage approach, in which one large correlation matrix is defined and subsets are used for different scales, is compared to a multiple stage approach, in which new correlation matrices are created to aggregate intermediate results. Although the single-stage approach requires determination of significantly more correlation coefficients, it captures geologic dependencies among similar units in different basins and it is less sensitive to fluctuations in low correlation coefficients than the multiple stage approach. Thus, subsets of one single-stage correlation matrix are used to aggregate to basin, regional, and national scales.
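
    The rank-correlation step can be illustrated with a Gaussian-copula variant of the same idea: draw correlated normals from the Cholesky factor of the elicited correlation matrix, map them to uniforms, read off the empirical quantiles of each SAU's marginal, and sum. This is a hedged sketch of one common implementation, not necessarily the exact USGS procedure; the marginals and correlation value below are invented:

```python
import numpy as np
from scipy.stats import norm

def aggregate_saus(marginals, corr, n_draws=100_000, seed=1):
    """Sum per-SAU empirical marginals after inducing rank correlation.

    marginals: list of 1-D arrays of storage-resource samples, one per SAU.
    corr: elicited SAU-to-SAU correlation matrix (positive definite).
    """
    rng = np.random.default_rng(seed)
    chol = np.linalg.cholesky(np.asarray(corr, dtype=float))
    z = rng.standard_normal((n_draws, len(marginals))) @ chol.T  # correlated normals
    u = norm.cdf(z)                                              # correlated uniforms
    draws = np.column_stack(
        [np.quantile(np.asarray(m, dtype=float), u[:, i]) for i, m in enumerate(marginals)]
    )
    return draws.sum(axis=1)  # aggregated (basin/regional/national) distribution

# Hypothetical example: two SAUs with an elicited correlation of 0.5
rng = np.random.default_rng(2)
sau_a = rng.lognormal(mean=2.0, sigma=0.5, size=5_000)
sau_b = rng.lognormal(mean=1.5, sigma=0.7, size=5_000)
total = aggregate_saus([sau_a, sau_b], [[1.0, 0.5], [0.5, 1.0]])
```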

  2. Identification and validation of single nucleotide polymorphisms in growth- and maturation-related candidate genes in sole (Solea solea L.).

    PubMed

    Diopere, Eveline; Hellemans, Bart; Volckaert, Filip A M; Maes, Gregory E

    2013-03-01

    Genomic methodologies applied in evolutionary and fisheries research have been of great benefit for understanding the marine ecosystem and the management of natural resources. Although single nucleotide polymorphisms (SNPs) are attractive for the study of local adaptation, spatial stock management and traceability, and investigating the effects of fisheries-induced selection, they have rarely been exploited in non-model organisms. This is partly due to difficulties in finding and validating SNPs in species with limited or no genomic resources. Complementary to random genome-scan approaches, a targeted candidate gene approach has the potential to unveil pre-selected functional diversity and provides more in-depth information on the action of selection at specific genes. For example, genes can be under selective pressure due to climate change and sustained periods of heavy fishing pressure. In this study, we applied a candidate gene approach in sole (Solea solea L.), an important member of the demersal ecosystem. As a consumption flatfish, it is heavily exploited and has experienced associated life-history changes over the last 60 years. To discover novel genetic polymorphisms in or around genes linked to important life history traits in sole, we screened a total of 76 candidate genes related to growth and maturation using a targeted resequencing approach. We identified in total 86 putative SNPs in 22 genes and validated 29 SNPs using a multiplex single-base extension genotyping assay. We found 22 informative SNPs, of which two represent non-synonymous mutations, potentially of functional relevance. These novel markers should be rapidly and broadly applicable in analyses of natural sole populations, as a measure of the evolutionary signature of overfishing and for initiatives on marker-assisted selection. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Recovery in the 21st Century: From Shame to Strength.

    PubMed

    Gumbley, Stephen J

    2016-01-01

    Through the "war on drugs," the just-say-no campaign, and into the early years of this century, the overarching approach to substance use disorders (SUDs) called for a single outcome (abstinence) and a single methodology (spiritual connection with a higher power) as the remedy for SUDs. Those who did not become permanently abstinent or rejected the spiritual approach were seen as "not ready" or "in denial."A seismic shift in thinking about "addiction" and "recovery" began in earnest in the 1990s. In 2005, the Substance Abuse and Mental Health Services Administration brought together leaders of the treatment and recovery field for the historic National Summit on Recovery to develop broad-based consensus on guiding principles for recovery and elements of recovery-oriented systems of care.Major changes associated with the recovery-oriented approach include viewing SUDs as chronic, rather than acute, problems that require long-term support and focusing on recovery management rather than disease management. Complete abstinence is not an absolute requirement for wellness for all persons with SUDs. There are "many pathways to recovery," not only the 12-Step approach (White & Kurtz, 2006). Sustained recovery is self-directed and requires personal choices, the support of peers and allies, and community reinforcement as well as a strength-based approach and the use of research-based interventions. This Perspectives column addresses the historical context for the transformation toward a recovery-oriented system of care, highlights federal efforts to promote recovery-oriented approaches, and describes recovery-oriented terminology to reduce misconceptions, labeling, and stigmatization and promote recovery for individuals, families, and communities.

  4. Methodology or method? A critical review of qualitative case study reports.

    PubMed

    Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia

    2014-01-01

    Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners.

  5. Diagnosing Conceptions about the Epistemology of Science: Contributions of a Quantitative Assessment Methodology

    ERIC Educational Resources Information Center

    Vázquez-Alonso, Ángel; Manassero-Mas, María-Antonia; García-Carmona, Antonio; Montesano de Talavera, Marisa

    2016-01-01

    This study applies a new quantitative methodological approach to diagnose epistemology conceptions in a large sample. The analyses use seven multiple-rating items on the epistemology of science drawn from the item pool Views on Science-Technology-Society (VOSTS). The bases of the new methodological diagnostic approach are the empirical…

  6. AERIS: An Integrated Domain Information System for Aerospace Science and Technology

    ERIC Educational Resources Information Center

    Hatua, Sudip Ranjan; Madalli, Devika P.

    2011-01-01

    Purpose: The purpose of this paper is to discuss the methodology in building an integrated domain information system with illustrations that provide proof of concept. Design/methodology/approach: The present work studies the usual search engine approach to information and its pitfalls. A methodology was adopted for construction of a domain-based…

  7. All-atom molecular dynamics simulations of spin labelled double and single-strand DNA for EPR studies.

    PubMed

    Prior, C; Danilāne, L; Oganesyan, V S

    2018-05-16

    We report the first application of fully atomistic molecular dynamics (MD) simulations to the prediction of electron paramagnetic resonance (EPR) spectra of spin labelled DNA. Models for two structurally different DNA spin probes with either the rigid or flexible position of the nitroxide group in the base pair, employed in experimental studies previously, have been developed. By the application of the combined MD-EPR simulation methodology we aimed at the following. Firstly, to provide a test bed against a sensitive spectroscopic technique for the recently developed improved version of the parmbsc1 force field for MD modelling of DNA. The predicted EPR spectra show good agreement with the experimental ones available from the literature, thus confirming the accuracy of the currently employed DNA force fields. Secondly, to provide a quantitative interpretation of the motional contributions into the dynamics of spin probes in both duplex and single-strand DNA fragments and to analyse their perturbing effects on the local DNA structure. Finally, a combination of MD and EPR allowed us to test the validity of the application of the Model-Free (M-F) approach coupled with the partial averaging of magnetic tensors to the simulation of EPR spectra of DNA systems by comparing the resultant EPR spectra with those simulated directly from MD trajectories. The advantage of the M-F based EPR simulation approach over the direct propagation techniques is that it requires motional and order parameters that can be calculated from shorter MD trajectories. The reported MD-EPR methodology is transferable to the prediction and interpretation of EPR spectra of higher order DNA structures with novel types of spin labels.

  8. Application of the maximum cumulative ratio (MCR) as a screening tool for the evaluation of mixtures in residential indoor air.

    PubMed

    De Brouwere, Katleen; Cornelis, Christa; Arvanitis, Athanasios; Brown, Terry; Crump, Derrick; Harrison, Paul; Jantunen, Matti; Price, Paul; Torfs, Rudi

    2014-05-01

    The maximum cumulative ratio (MCR) method allows the categorisation of mixtures according to whether the mixture is of concern for toxicity and if so whether this is driven by one substance or multiple substances. The aim of the present study was to explore, by application of the MCR approach, whether health risks due to indoor air pollution are dominated by one substance or are due to concurrent exposure to various substances. Analysis was undertaken on monitoring data of four European indoor studies (giving five datasets), involving 1800 records of indoor air or personal exposure. Application of the MCR methodology requires knowledge of the concentrations of chemicals in a mixture together with health-based reference values for those chemicals. For this evaluation, single substance health-based reference values (RVs) were selected through a structured review process. The MCR analysis found high variability in the proportion of samples of concern for mixture toxicity. The fraction of samples in these groups of concern varied from 2% (Flemish schools) to 77% (EXPOLIS, Basel, indoor), the variation being due not only to the variation in indoor air contaminant levels across the studies but also to other factors such as differences in number and type of substances monitored, analytical performance, and choice of RVs. However, in 4 out of the 5 datasets, a considerable proportion of cases were found where a chemical-by-chemical approach failed to identify the need for the investigation of combined risk assessment. Although the MCR methodology applied in the current study provides no consideration of commonality of endpoints, it provides a tool for discrimination between those mixtures requiring further combined risk assessment and those for which a single-substance assessment is sufficient. Copyright © 2014 Elsevier B.V. All rights reserved.
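
    A small worked sketch of the screen for a single sample, assuming hazard quotients HQ_i = C_i/RV_i, a hazard index HI equal to their sum, and MCR = HI / max(HQ_i). The substances, concentrations, reference values and grouping thresholds below are illustrative only, not the values used in the study:

```python
def mcr_screen(concentrations, reference_values):
    """Classify one sample by hazard index (HI) and maximum cumulative ratio (MCR)."""
    hq = {s: c / reference_values[s] for s, c in concentrations.items()}  # hazard quotients
    hi = sum(hq.values())
    mcr = hi / max(hq.values())
    if hi < 1.0:
        group = "low concern for mixture toxicity"
    elif mcr < 2.0:
        group = "concern driven mainly by a single substance"
    else:
        group = "concern driven by multiple substances"
    return hi, mcr, group

# Hypothetical indoor-air sample (µg/m3) and illustrative reference values
sample = {"formaldehyde": 40.0, "benzene": 3.0, "toluene": 120.0}
rvs = {"formaldehyde": 100.0, "benzene": 5.0, "toluene": 260.0}
print(mcr_screen(sample, rvs))   # -> HI ~1.46, MCR ~2.4: multiple substances
```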

  9. A Machine Learning Approach to Identifying Placebo Responders in Late-Life Depression Trials.

    PubMed

    Zilcha-Mano, Sigal; Roose, Steven P; Brown, Patrick J; Rutherford, Bret R

    2018-01-11

    Despite efforts to identify characteristics associated with medication-placebo differences in antidepressant trials, few consistent findings have emerged to guide participant selection in drug development settings and differential therapeutics in clinical practice. Limitations in the methodologies used, particularly searching for a single moderator while treating all other variables as noise, may partially explain the failure to generate consistent results. The present study tested whether interactions between pretreatment patient characteristics, rather than a single-variable solution, may better predict who is most likely to benefit from placebo versus medication. Data were analyzed from 174 patients aged 75 years and older with unipolar depression who were randomly assigned to citalopram or placebo. Model-based recursive partitioning analysis was conducted to identify the most robust significant moderators of placebo versus citalopram response. The greatest signal detection between medication and placebo in favor of medication was among patients with fewer years of education (≤12) who suffered from a longer duration of depression since their first episode (>3.47 years) (B = 2.53, t(32) = 3.01, p = 0.004). Compared with medication, placebo had the greatest response for those who were more educated (>12 years), to the point where placebo almost outperformed medication (B = -0.57, t(96) = -1.90, p = 0.06). Machine learning approaches capable of evaluating the contributions of multiple predictor variables may be a promising methodology for identifying placebo versus medication responders. Duration of depression and education should be considered in the efforts to modulate placebo magnitude in drug development settings and in clinical practice. Copyright © 2018 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  10. [Methodological approaches to the creation of healthy food].

    PubMed

    Kornen, N N; Viktorova, E P; Evdokimova, O V

    2015-01-01

    The paper substantiates the need to create healthy food products and provides their classification. Methodological approaches to the creation of healthy foods of enriched, functional and specialized purpose are formulated.

  11. Developing comparative criminology and the case of China: an introduction.

    PubMed

    Liu, Jianhong

    2007-02-01

    Although comparative criminology has made significant development during the past decade or so, systematic empirical research has only developed along a few topics. Comparative criminology has never occupied a central position in criminology. This article analyzes the major theoretical and methodological impediments in the development of comparative criminology. It stresses a need to shift methodology from a conventional primary approach that uses the nation as the unit of analysis to an in-depth case study method as a primary methodological approach. The article maintains that case study method can overcome the limitation of its descriptive tradition and become a promising methodological approach for comparative criminology.

  12. General implementation of arbitrary nonlinear quadrature phase gates

    NASA Astrophysics Data System (ADS)

    Marek, Petr; Filip, Radim; Ogawa, Hisashi; Sakaguchi, Atsushi; Takeda, Shuntaro; Yoshikawa, Jun-ichi; Furusawa, Akira

    2018-02-01

    We propose a general methodology for deterministic single-mode quantum interactions that nonlinearly modify a single quadrature variable of a continuous-variable system. The methodology is based on linear coupling of the system to ancillary systems subsequently measured by quadrature detectors. The nonlinear interaction is obtained by using the data from the quadrature detection for dynamical manipulation of the coupling parameters. This measurement-induced methodology enables direct realization of arbitrary nonlinear quadrature interactions without the need to construct them from the lowest-order gates. Such nonlinear interactions are crucial for more practical and efficient manipulation of continuous quadrature variables as well as qubits encoded in continuous-variable systems.

  13. OPTIGOV - A new methodology for evaluating Clinical Governance implementation by health providers

    PubMed Central

    2010-01-01

    Background The aim of Clinical Governance (CG) is the pursuit of quality in health care through the integration of all the activities impacting on the patient into a single strategy. OPTIGOV (Optimizing Health Care Governance) is a methodology for the assessment of the level of implementation of CG within healthcare organizations. The aim of this paper is to explain the process underlying the development of OPTIGOV, and describe its characteristics and steps. Methods OPTIGOV was developed in 2006 by the Institute of Hygiene of the Catholic University of the Sacred Heart and Eurogroup Consulting Alliance. The main steps of the process were: choice of areas for analysis and questionnaire development, based on a review of scientific literature; assignment of scores and weights to individual questions and areas; implementation of a software interfaceable with Microsoft Office. Results OPTIGOV consists of: a) a hospital audit with a structured approach; b) development of an improvement operational plan. A questionnaire divided into 13 areas of analysis is used. For each area there is a form with a variable number of questions and "closed" answers. A score is assigned to each answer, area of analysis, healthcare department and unit. The single scores can be gathered for the organization as a whole. The software application allows for collation of data, calculation of scores and development of benchmarks to allow comparisons between healthcare organizations. Implementation consists of three stages: the preparation phase includes a kick off meeting, selection of interviewees and development of a survey plan. The registration phase includes hospital audits, reviewing of hospital documentation, data collection and score processing. Lastly, results are processed, inserted into a final report, and discussed in a meeting with the Hospital Board and in a final workshop. Conclusions The OPTIGOV methodology for the evaluation of CG implementation was developed with an evidence-based approach. The ongoing adoption of OPTIGOV in several projects will put to the test its potential to realistically represent the organization's status, pinpoint criticalities and transferable best practices, provide a plan for improvement, and contribute to triggering changes and pursuit of quality in health care. PMID:20565967
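
    The scoring step lends itself to a small illustration: question scores weighted into area scores, and area scores weighted into an organization-level score. This is a minimal sketch under assumed weights and a 0-100 scale; the actual OPTIGOV questions, weights and 13 areas are defined by the methodology itself and are not reproduced here.

```python
def optigov_scores(answers, question_weights, area_weights):
    """Roll questionnaire answers up to per-area scores and an overall score.

    answers: {area: {question_id: score}}; weights are assumed positive.
    """
    area_scores = {}
    for area, scored in answers.items():
        w = question_weights[area]
        area_scores[area] = sum(s * w[q] for q, s in scored.items()) / sum(w.values())
    total_w = sum(area_weights[a] for a in area_scores)
    overall = sum(area_scores[a] * area_weights[a] for a in area_scores) / total_w
    return area_scores, overall

# Hypothetical two-area example on a 0-100 scale
answers = {"clinical audit": {"q1": 80, "q2": 60}, "risk management": {"q1": 40}}
q_weights = {"clinical audit": {"q1": 2, "q2": 1}, "risk management": {"q1": 1}}
a_weights = {"clinical audit": 1.5, "risk management": 1.0}
print(optigov_scores(answers, q_weights, a_weights))   # area scores ~73.3 and 40, overall 60
```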

  14. Single event test methodology for integrated optoelectronics

    NASA Technical Reports Server (NTRS)

    Label, Kenneth A.; Cooley, James A.; Stassinopoulos, E. G.; Marshall, Paul; Crabtree, Christina

    1993-01-01

    A single event upset (SEU), defined as a transient or glitch on the output of a device, and its applicability to integrated optoelectronics are discussed in the context of spacecraft design and the need for more than a bit error rate viewpoint for testing and analysis. A methodology for testing integrated optoelectronic receivers and transmitters for SEUs is presented, focusing on the actual test requirements and system schemes needed for integrated optoelectronic devices. Two main causes of single event effects in the space environment, including protons and galactic cosmic rays, are considered along with ground test facilities for simulating the space environment.

  15. Auditing as Part of the Terminology Design Life Cycle

    PubMed Central

    Min, Hua; Perl, Yehoshua; Chen, Yan; Halper, Michael; Geller, James; Wang, Yue

    2006-01-01

    Objective To develop and test an auditing methodology for detecting errors in medical terminologies satisfying systematic inheritance. This methodology is based on various abstraction taxonomies that provide high-level views of a terminology and highlight potentially erroneous concepts. Design Our auditing methodology is based on dividing concepts of a terminology into smaller, more manageable units. First, we divide the terminology’s concepts into areas according to their relationships/roles. Then each multi-rooted area is further divided into partial-areas (p-areas) that are singly-rooted. Each p-area contains a set of structurally and semantically uniform concepts. Two kinds of abstraction networks, called the area taxonomy and p-area taxonomy, are derived. These taxonomies form the basis for the auditing approach. Taxonomies tend to highlight potentially erroneous concepts in areas and p-areas. Human reviewers can focus their auditing efforts on the limited number of problematic concepts following two hypotheses on the probable concentration of errors. Results A sample of the area taxonomy and p-area taxonomy for the Biological Process (BP) hierarchy of the National Cancer Institute Thesaurus (NCIT) was derived from the application of our methodology to its concepts. These views led to the detection of a number of different kinds of errors that are reported, and to confirmation of the hypotheses on error concentration in this hierarchy. Conclusion Our auditing methodology based on area and p-area taxonomies is an efficient tool for detecting errors in terminologies satisfying systematic inheritance of roles, and thus facilitates their maintenance. This methodology concentrates a domain expert’s manual review on portions of the concepts with a high likelihood of errors. PMID:16929044

  16. A Methodological Approach to Support Collaborative Media Creation in an E-Learning Higher Education Context

    ERIC Educational Resources Information Center

    Ornellas, Adriana; Muñoz Carril, Pablo César

    2014-01-01

    This article outlines a methodological approach to the creation, production and dissemination of online collaborative audio-visual projects, using new social learning technologies and open-source video tools, which can be applied to any e-learning environment in higher education. The methodology was developed and used to design a course in the…

  17. Employing an ethnographic approach: key characteristics.

    PubMed

    Lambert, Veronica; Glacken, Michele; McCarron, Mary

    2011-01-01

    Nurses are increasingly embracing ethnography as a useful research methodology. This paper presents an overview of some of the main characteristics we considered and the challenges encountered when using ethnography to explore the nature of communication between children and health professionals in a children's hospital. There is no consensual definition or single procedure to follow when using ethnography. This is largely attributable to the re-contextualisation of ethnography over time through diversification in and across many disciplines. Thus, it is imperative to consider some of ethnography's trademark features. To identify core trademark features of ethnography, we collated data following a scoping review of pertinent ethnographic textbooks, journal articles, attendance at ethnographic workshops and discussions with principal ethnographers. This is a methodological paper. Essentially, ethnography is a field-orientated activity that has cultural interpretations at its core, although the levels of those interpretations vary. We identified six trademark features to be considered when embracing an ethnographic approach: naturalism; context; multiple data sources; small case numbers; 'emic' and 'etic' perspectives, and ethical considerations. Ethnography has an assortment of meanings, so it is not often used in a wholly orthodox way and does not fall under the auspices of one epistemological belief. Yet, there are core criteria and trademark features that researchers should take into account alongside their particular epistemological beliefs when embracing an ethnographic inquiry. We hope this paper promotes a clearer vision of the methodological processes to consider when embarking on ethnography and creates an avenue for others to disseminate their experiences of and challenges encountered when applying ethnography's trademark features in different healthcare contexts.

  18. Estimating the cost of epilepsy in Europe: a review with economic modeling.

    PubMed

    Pugliatti, Maura; Beghi, Ettore; Forsgren, Lars; Ekman, Mattias; Sobocki, Patrik

    2007-12-01

    Based on available epidemiologic, health economic, and international population statistics literature, the cost of epilepsy in Europe was estimated. Europe was defined as the 25 European Union member countries, Iceland, Norway, and Switzerland. Guidelines for epidemiological studies on epilepsy were used for a case definition. A bottom-up prevalence-based cost-of-illness approach, the societal perspective for including the cost items, and the human capital approach as valuation principle for indirect costs were used. The cost estimates were based on selected studies with common methodology and valuation principles. The estimated prevalence of epilepsy in Europe in 2004 was 4.3-7.8 per 1,000. The estimated total cost of the disease in Europe was €15.5 billion in 2004, indirect cost being the single most dominant cost category (€8.6 billion). Direct health care costs were €2.8 billion, outpatient care comprising the largest part (€1.3 billion). Direct nonmedical cost was €4.2 billion. That of antiepileptic drugs was €400 million. The total cost per case was €2,000-11,500 and the estimated cost per European inhabitant was €33. Epilepsy is a relevant socioeconomic burden at individual, family, health services, and societal level in Europe. The greater proportion of such burden is outside the formal health care sector, antiepileptic drugs representing a smaller proportion. Lack of economic data from several European countries and other methodological limitations make this report an initial estimate of the cost of epilepsy in Europe. Prospective incidence cost-of-illness studies from well-defined populations and common methodology are encouraged.

  19. Robust decentralized controller for minimizing coupling effect in single inductor multiple output DC-DC converter operating in continuous conduction mode.

    PubMed

    Medeiros, Renan Landau Paiva de; Barra, Walter; Bessa, Iury Valente de; Chaves Filho, João Edgar; Ayres, Florindo Antonio de Cavalho; Neves, Cleonor Crescêncio das

    2018-02-01

    This paper describes a novel robust decentralized control design methodology for a single inductor multiple output (SIMO) DC-DC converter. Based on a nominal multiple input multiple output (MIMO) plant model and performance requirements, a pairing input-output analysis is performed to select the suitable input to control each output aiming to attenuate the loop coupling. Thus, the plant uncertainty limits are selected and expressed in interval form with parameter values of the plant model. A single inductor dual output (SIDO) DC-DC buck converter board is developed for experimental tests. The experimental results show that the proposed methodology can maintain a desirable performance even in the presence of parametric uncertainties. Furthermore, the performance indexes calculated from experimental data show that the proposed methodology outperforms classical MIMO control techniques. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  20. Rough set approach for accident chains exploration.

    PubMed

    Wong, Jinn-Tsai; Chung, Yi-Shih

    2007-05-01

    This paper presents a novel non-parametric methodology--rough set theory--for accident occurrence exploration. The rough set theory allows researchers to analyze accidents in multiple dimensions and to model accident occurrence as factor chains. Factor chains are composed of driver characteristics, trip characteristics, driver behavior and environment factors that imply typical accident occurrence. A real-world database (2003 Taiwan single auto-vehicle accidents) is used as an example to demonstrate the proposed approach. The results show that although most accident patterns are unique, some accident patterns are significant and worth noting. Student drivers who are young and less experienced exhibit a relatively high possibility of being involved in off-road accidents on roads with a speed limit between 51 and 79 km/h under normal driving circumstances. Notably, for bump-into-facility accidents, wet surface is a distinctive environmental factor.
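
    The core rough-set operation behind the factor chains can be sketched as follows: records are grouped into indiscernibility classes by their condition attributes, and the lower and upper approximations of a decision class (say, off-road accidents) collect the classes that certainly or possibly belong to it. The attribute names and records below are hypothetical, not taken from the 2003 Taiwan data:

```python
from collections import defaultdict

def rough_approximations(records, condition_attrs, decision_attr, decision_value):
    """Lower/upper approximations of a decision class under an indiscernibility relation."""
    classes = defaultdict(list)
    for rec in records:
        classes[tuple(rec[a] for a in condition_attrs)].append(rec)

    lower, upper = [], []
    for members in classes.values():
        hits = [m[decision_attr] == decision_value for m in members]
        if all(hits):
            lower.extend(members)   # class certainly belongs to the decision class
        if any(hits):
            upper.extend(members)   # class possibly belongs to the decision class
    return lower, upper

# Hypothetical accident records: condition attributes plus an accident type
records = [
    {"driver": "student", "speed_limit": "51-79", "surface": "dry", "type": "off-road"},
    {"driver": "student", "speed_limit": "51-79", "surface": "dry", "type": "off-road"},
    {"driver": "experienced", "speed_limit": "51-79", "surface": "wet", "type": "bump-into-facility"},
    {"driver": "experienced", "speed_limit": "51-79", "surface": "wet", "type": "off-road"},
]
low, up = rough_approximations(records, ["driver", "speed_limit", "surface"], "type", "off-road")
print(len(low), len(up))   # 2 certain, 4 possible
```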

  1. How to Increase Value in the Footwear Supply Chain

    NASA Astrophysics Data System (ADS)

    Fornasiero, Rosanna; Tescaro, Mauro; Scarso, Enrico; Gottardi, Giorgio

    The Lean approach has been implemented in many different sectors as a methodology to improve industrial performance at company level. In recent years this approach has been further developed in the literature and in practice to integrate the principles of agility, adaptability and the mass customization paradigm, where products and services have to be designed together to meet specific requirements, and where the value originated by the supply chain enhances the value of a single company thanks to the use of ICT and remote control. In this paper we analyze the Beyond-Lean paradigm and propose a path for companies in the footwear sector to improve their performance based on high-value-added products and processes. A detailed process analysis based on Value Stream Mapping is used to define criticalities and suggest improvement paths at both the technological and organizational level.

  2. Comparative Study of Impedance Eduction Methods, Part 2: NASA Tests and Methodology

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.; Watson, Willie R.; Howerton, Brian M.; Busse-Gerstengarbe, Stefan

    2013-01-01

    A number of methods have been developed at NASA Langley Research Center for eduction of the acoustic impedance of sound-absorbing liners mounted in the wall of a flow duct. This investigation uses methods based on the Pridmore-Brown and convected Helmholtz equations to study the acoustic behavior of a single-layer, conventional liner fabricated by the German Aerospace Center and tested in the NASA Langley Grazing Flow Impedance Tube. Two key assumptions are explored in this portion of the investigation. First, a comparison of results achieved with uniform-flow and shear-flow impedance eduction methods is considered. Also, an approach based on the Prony method is used to extend these methods from single-mode to multi-mode implementations. Finally, a detailed investigation into the effects of harmonic distortion on the educed impedance is performed, and the results are used to develop guidelines regarding acceptable levels of harmonic distortion.

  3. Housing conditions and stimulus females: a robust social discrimination task for studying male rodent social recognition

    PubMed Central

    Macbeth, Abbe H.; Edds, Jennifer Stepp; Young, W. Scott

    2010-01-01

    Social recognition (SR) enables rodents to distinguish between familiar and novel conspecifics, largely through individual odor cues. SR tasks utilize the tendency for a male to sniff and interact with a novel individual more than a familiar individual. Many paradigms have been used to study the roles of the neuropeptides oxytocin and vasopressin in SR. However, inconsistencies in results have arisen within similar mouse strains, and across different paradigms and laboratories, making reliable testing of social recognition difficult. The current protocol details a novel approach that is replicable across investigators and in different strains of mice. We created a protocol that utilizes gonadally intact, singly housed females presented within corrals to group-housed males. Housing females singly prior to testing is particularly important for reliable discrimination. This methodology will be useful for studying short-term social memory in rodents, and may also be applicable for longer-term studies. PMID:19816420

  4. Selection of Single-Walled Carbon Nanotube with Narrow Diameter Distribution by Using a PPE PPV Copolymer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perry, Kelly A; Chen, Yusheng; Malkovskiy, Andrey

    2012-01-01

    Electronic and mechanical properties of single-walled carbon nanotubes (SWNTs) are uniquely dependent on the tube's chiralities and diameters. Isolation of different types of SWNTs remains one of the fundamental and challenging issues in nanotube science. Herein, we demonstrate that SWNTs can be effectively enriched to a narrow diameter range by sequential treatment of the HiPco sample with nitric acid and a π-conjugated copolymer poly(phenyleneethynylene) (PPE)-co-poly(phenylenevinylene) (PPV). On the basis of Raman, fluorescence, and microscopic evidence, the nitric acid is found to selectively remove the SWNTs of small diameter. The polymer not only effectively dispersed carbon nanotubes but also exhibited a good selectivity toward a few SWNTs. The reported approach thus offers a new methodology to isolate SWNTs, which has the potential to operate on a relatively large scale.

  5. Supramolecular guests in solvent driven block copolymer assembly: From internally structured nanoparticles to micelles

    PubMed Central

    Klinger, Daniel; Robb, Maxwell J.; Spruell, Jason M.; Lynd, Nathaniel A.; Hawker, Craig J.

    2014-01-01

    Supramolecular interactions between different hydrogen-bonding guests and poly(2-vinyl pyridine)-block-poly(styrene) can be exploited to prepare remarkably diverse self-assembled nanostructures in dispersion from a single block copolymer (BCP). The characteristics of the BCP can be efficiently controlled by tailoring the properties of a guest which preferentially binds to the P2VP block. For example, the incorporation of a hydrophobic guest creates a hydrophobic BCP complex that forms phase separated nanoparticles upon self-assembly. Conversely, the incorporation of a hydrophilic guest results in an amphiphilic BCP complex that forms spherical micelles in water. The ability to tune the self-assembly behavior and access dramatically different nanostructures from a single BCP substrate demonstrates the exceptional versatility of the self-assembly of BCPs driven by supramolecular interactions. This approach represents a new methodology that will enable the further design of complex, responsive self-assembled nanostructures. PMID:25525473

  6. Acetone-butanol-ethanol competitive sorption simulation from single, binary, and ternary systems in a fixed-bed of KA-I resin.

    PubMed

    Wu, Jinglan; Zhuang, Wei; Ying, Hanjie; Jiao, Pengfei; Li, Renjie; Wen, Qingshi; Wang, Lili; Zhou, Jingwei; Yang, Pengpeng

    2015-01-01

    Separation of butanol from acetone-butanol-ethanol (ABE) fermentation broth based on sorption methodology has advantages in terms of biocompatibility and stability, as well as economy, and has therefore gained much attention. In this work a chromatographic column model based on the solid film linear driving force approach and the competitive Langmuir isotherm equations was used to predict the competitive sorption behaviors of ABE single, binary, and ternary mixtures. It was observed that the outlet concentration of the more weakly retained components exceeded the inlet concentration, which is evidence of competitive adsorption. Butanol, the most strongly retained component, could replace ethanol almost completely and also most of the acetone. At the end of this work, the proposed model was validated by comparison of the experimental and predicted ABE ternary breakthrough curves using the real ABE fermentation broth as a feed solution. © 2014 American Institute of Chemical Engineers.
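
    For reference, a standard way to write the competitive Langmuir isotherm together with a solid-film linear driving force rate expression of the kind the column model combines (a hedged sketch of the usual forms; the paper's exact parameterization and parameter values may differ):

```latex
q_i^{*} = \frac{q_{\max,i}\, b_i\, c_i}{1 + \sum_{j=1}^{N} b_j c_j},
\qquad
\frac{\partial q_i}{\partial t} = k_i \left( q_i^{*} - q_i \right),
\qquad i, j \in \{\text{acetone},\ \text{butanol},\ \text{ethanol}\}
```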

  7. Two-probe STM experiments at the atomic level.

    PubMed

    Kolmer, Marek; Olszowski, Piotr; Zuzak, Rafal; Godlewski, Szymon; Joachim, Christian; Szymonski, Marek

    2017-11-08

    Direct characterization of planar atomic or molecular scale devices and circuits on a supporting surface by multi-probe measurements requires unprecedented stability of single atom contacts and manipulation of scanning probes over large, nanometer-scale areas with atomic precision. In this work, we describe the full methodology behind atomically defined two-probe scanning tunneling microscopy (STM) experiments performed on a model system: a dangling bond dimer wire supported on a hydrogenated germanium (0 0 1) surface. We show that a 70 nm long atomic wire can be simultaneously approached by two independent STM scanners with exact probe to probe distance reaching down to 30 nm. This allows direct wire characterization by two-probe I-V characteristics at distances below 50 nm. The technical results presented in this work open a new area for multi-probe research, which can now be performed with precision so far accessible only by single-probe scanning probe microscopy (SPM) experiments.

  8. Electrostatically Directed Self-Assembly of Ultrathin Supramolecular Polymer Microcapsules

    PubMed Central

    Parker, Richard M; Zhang, Jing; Zheng, Yu; Coulston, Roger J; Smith, Clive A; Salmon, Andrew R; Yu, Ziyi; Scherman, Oren A; Abell, Chris

    2015-01-01

    Supramolecular self-assembly offers routes to challenging architectures on the molecular and macroscopic scale. Coupled with microfluidics it has been used to make microcapsules—where a 2D sheet is shaped in 3D, encapsulating the volume within. In this paper, a versatile methodology to direct the accumulation of capsule-forming components to the droplet interface using electrostatic interactions is described. In this approach, charged copolymers are selectively partitioned to the microdroplet interface by a complementary charged surfactant for subsequent supramolecular cross-linking via cucurbit[8]uril. This dynamic assembly process is employed to selectively form both hollow, ultrathin microcapsules and solid microparticles from a single solution. The ability to dictate the distribution of a mixture of charged copolymers within the microdroplet, as demonstrated by the single-step fabrication of distinct core–shell microcapsules, gives access to a new generation of innovative self-assembled constructs. PMID:26213532

  9. Hybrid microfiber-lithium-niobate nanowaveguide structures as high-purity heralded single-photon sources

    NASA Astrophysics Data System (ADS)

    Main, Philip; Mosley, Peter J.; Ding, Wei; Zhang, Lijian; Gorbach, Andrey V.

    2016-12-01

    We propose a compact, fiber-integrated architecture for photon-pair generation by parametric downconversion with unprecedented flexibility in the properties of the photons produced. Our approach is based on a thin-film lithium niobate nanowaveguide, evanescently coupled to a tapered silica microfiber. We demonstrate how controllable mode hybridization between the fiber and waveguide yields control over the joint spectrum of the photon pairs. We also investigate how independent engineering of the linear and nonlinear properties of the structure can be achieved through the addition of a tapered, proton-exchanged layer to the waveguide. This allows further refinement of the joint spectrum through custom profiling of the effective nonlinearity, drastically improving the purity of the heralded photons. We give details of a source design capable of generating heralded single photons in the telecom wavelength range with purity of at least 0.95, and we provide a feasible fabrication methodology.

  10. Single Event Effect (SEE) Test Planning 101

    NASA Technical Reports Server (NTRS)

    LaBel, Kenneth A.; Pellish, Jonathan; Berg, Melanie D.

    2011-01-01

    This is a course on SEE Test Plan development. It is an introductory discussion of the items that go into planning an SEE test that should complement the SEE test methodology used. The material covers only heavy ion SEE testing and not proton, laser, or other test types, though many of the items discussed may be applicable to them. While standards and guidelines for how to perform single event effects (SEE) testing have existed almost since the first cyclotron testing, guidance on the development of SEE test plans has not been as easy to find. In this section of the short course, we attempt to rectify this lack. We consider the approach outlined here as a "living" document: mission-specific constraints and new technology-related issues always need to be taken into account. We note that we will use the term "test planning" in the context of those items being included in a test plan.

  11. Distal radius fracture fixation with a volar locking plate and endoscopic carpal tunnel release using a single 15mm approach: Feasibility study.

    PubMed

    Zemirline, A; Taleb, C; Naito, K; Vernet, P; Liverneaux, P; Lebailly, F

    2018-05-17

    Distal radius fractures (DRF) may trigger, reveal or decompensate acute carpal tunnel syndrome (CTS) in 0.5-21% of cases. Internal fixation and median nerve release must then be carried out urgently. Less invasive approaches have been described for both the median nerve release using an endoscopic device and for the DRF fixation using a volar locking plate. We assessed the feasibility of DRF fixation and median nerve release through a single, minimally-invasive 15mm approach on a series of 10 cases. We reviewed retrospectively 10 consecutive cases of DRF associated with symptomatic CTS in 8 women and 2 men, aged 57 years on average. CTS was diagnosed clinically. All patients were treated during outpatient surgery with a volar locking plate and endoscopic carpal tunnel release using a single 15mm minimally-invasive approach. In one case, arthroscopic scapholunate repair was also required. Six months after the procedure, all patients were reviewed with a clinical examination and a radiological evaluation. The average values for the clinical and radiological outcomes were as follows: pain on VAS 1.5/10; QuickDASH 14.3/100; flexion 90%; extension 90.6%; pronation 95.6%; supination 87.9%; grip strength 90.1%; 2PD test 5.2mm (4-8mm). Five complications occurred: two cases of temporary dysesthesia in the territory of the median nerve and one case of temporary hypoesthesia of the palmar branch of the median nerve, which had all completely recovered; two cases of complex regional pain syndrome type I, which were still active at 6 months. Despite its methodological weaknesses, our study is the only one to describe the technical feasibility of a single 15mm minimally-invasive approach for both internal fixation using a volar locking plate and endoscopic nerve release, with no serious complications. This technique should be added to the surgical toolbox of minimally-invasive procedures for the hand and wrist. Copyright © 2018 SFCM. Published by Elsevier Masson SAS. All rights reserved.

  12. Single-particle cryo-EM using alignment by classification (ABC): the structure of Lumbricus terrestris haemoglobin.

    PubMed

    Afanasyev, Pavel; Seer-Linnemayr, Charlotte; Ravelli, Raimond B G; Matadeen, Rishi; De Carlo, Sacha; Alewijnse, Bart; Portugal, Rodrigo V; Pannu, Navraj S; Schatz, Michael; van Heel, Marin

    2017-09-01

    Single-particle cryogenic electron microscopy (cryo-EM) can now yield near-atomic resolution structures of biological complexes. However, the reference-based alignment algorithms commonly used in cryo-EM suffer from reference bias, limiting their applicability (also known as the 'Einstein from random noise' problem). Low-dose cryo-EM therefore requires robust and objective approaches to reveal the structural information contained in the extremely noisy data, especially when dealing with small structures. A reference-free pipeline is presented for obtaining near-atomic resolution three-dimensional reconstructions from heterogeneous ('four-dimensional') cryo-EM data sets. The methodologies integrated in this pipeline include a posteriori camera correction, movie-based full-data-set contrast transfer function determination, movie-alignment algorithms, (Fourier-space) multivariate statistical data compression and unsupervised classification, 'random-startup' three-dimensional reconstructions, four-dimensional structural refinements and Fourier shell correlation criteria for evaluating anisotropic resolution. The procedures exclusively use information emerging from the data set itself, without external 'starting models'. Euler-angle assignments are performed by angular reconstitution rather than by the inherently slower projection-matching approaches. The comprehensive 'ABC-4D' pipeline is based on the two-dimensional reference-free 'alignment by classification' (ABC) approach, where similar images in similar orientations are grouped by unsupervised classification. Some fundamental differences between X-ray crystallography versus single-particle cryo-EM data collection and data processing are discussed. The structure of the giant haemoglobin from Lumbricus terrestris at a global resolution of ∼3.8 Å is presented as an example of the use of the ABC-4D procedure.

  13. Technology of combined chemical-mechanical fabrication of durable coatings

    NASA Astrophysics Data System (ADS)

    Smolentsev, V. P.; Ivanov, V. V.; Portnykh, A. I.

    2018-03-01

    The article presents the scientific fundamentals of a methodology for calculating the process modes and structuring the technological processes of combined chemical-mechanical fabrication of durable coatings. These are shown to be based on classical relations describing simultaneous chemical and mechanical action. The paper demonstrates how a technological process can be structured by taking a systematic approach to managing each action and by strengthening the mutually reinforcing influence of the chemical and mechanical actions on the combined process. Combined processes have been planned for fabricating model types of chemical-mechanical coatings on durable machine-building products. The planning methodology rests on the hypothesis of a single source of process control, either through the energy potential of the process components themselves or through external energy supplied by mechanical action. Such control is well studied for pulsed external impacts of hard pellets, similar to the vibro-impact hardening processes that have been thoroughly investigated and mastered in many Russian research schools.

  14. Entropy of Leukemia on Multidimensional Morphological and Molecular Landscapes

    NASA Astrophysics Data System (ADS)

    Vilar, Jose M. G.

    2014-04-01

    Leukemia epitomizes the class of highly complex diseases that new technologies aim to tackle by using large sets of single-cell-level information. Achieving such a goal depends critically not only on experimental techniques but also on approaches to interpret the data. A most pressing issue is to identify the salient quantitative features of the disease from the resulting massive amounts of information. Here, I show that the entropies of cell-population distributions on specific multidimensional molecular and morphological landscapes provide a set of measures for the precise characterization of normal and pathological states, such as those corresponding to healthy individuals and acute myeloid leukemia (AML) patients. I provide a systematic procedure to identify the specific landscapes and illustrate how, applied to cell samples from peripheral blood and bone marrow aspirates, this characterization accurately diagnoses AML from just flow cytometry data. The methodology can generally be applied to other types of cell populations and establishes a straightforward link between the traditional statistical thermodynamics methodology and biomedical applications.
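
    A minimal sketch of the entropy measure on a binned landscape, assuming the landscape is a multidimensional histogram over selected per-cell measurements; the choice of channels, bin count and the synthetic data below are illustrative assumptions, not the specific landscapes identified in the paper:

```python
import numpy as np

def landscape_entropy(cell_features, bins=16):
    """Shannon entropy (nats) of a cell population binned on a multidimensional landscape.

    cell_features: array of shape (n_cells, n_dims), e.g. selected flow
    cytometry channels for each cell.
    """
    hist, _ = np.histogramdd(np.asarray(cell_features, dtype=float), bins=bins)
    p = hist.ravel() / hist.sum()   # empirical cell-population distribution
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Hypothetical sample: 10,000 cells measured on three channels
rng = np.random.default_rng(0)
cells = rng.normal(size=(10_000, 3))
print(landscape_entropy(cells))
```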

  15. Sustainability of green jobs in Portugal: a methodological approach using occupational health indicators.

    PubMed

    Moreira, Sandra; Vasconcelos, Lia; Silva Santos, Carlos

    2017-09-28

    This study aimed to develop a methodological tool to analyze and monitor green jobs in the context of Occupational Health and Safety. A literature review in combination with an investigation of Occupational Health Indicators was performed. The resulting tool of Occupational Health Indicators was based on the existing information of the "Single Report" and was validated by national experts. The tool brings together 40 Occupational Health Indicators in four key fields established by the World Health Organization in its conceptual framework "Health indicators of sustainable jobs." The proposed tool allows for assessing whether green jobs follow the principles and requirements of Occupational Health Indicators and whether these jobs are as good for the environment as for the workers' health, and thus whether they can be considered quality jobs. This shows that Occupational Health Indicators are indispensable for assessing the sustainability of green jobs and should be taken into account in the definition and evaluation of policies and strategies for sustainable development.

  16. Clinical judgement and the medical profession

    PubMed Central

    Kienle, Gunver S; Kiene, Helmut

    2011-01-01

    Objectives Clinical judgment is a central element of the medical profession, essential for the performance of the doctor, and potentially generating information also for other clinicians and for scientists and health care managers. The recently renewed interest in clinical judgement is primarily engaged with its role in communication, diagnosis and decision making. Beyond this issue, the present article highlights the interrelations between clinical judgement, therapy assessment and medical professionalism. Methods Literature review and theory development. Results The article presents different methodological approaches to causality assessment in clinical studies and in clinical judgement, and offers criteria for clinical single case causality. The article outlines models of medical professionalism such as technical rationality and practice epistemology, and characterizes features of professional expertise such as tacit knowledge, reflection in action, and gestalt cognition. Conclusions Consequences of a methodological and logistical advancement of clinical judgment are discussed, both in regard to medical progress and to the renewal of the cognitive basis of the medical profession. PMID:20973873

  17. Mixed-Methods Design in Biology Education Research: Approach and Uses.

    PubMed

    Warfa, Abdi-Rizak M

    Educational research often requires mixing different research methodologies to strengthen findings, better contextualize or explain results, or minimize the weaknesses of a single method. This article provides practical guidelines on how to conduct such research in biology education, with a focus on mixed-methods research (MMR) that uses both quantitative and qualitative inquiries. Specifically, the paper provides an overview of mixed-methods design typologies most relevant in biology education research. It also discusses common methodological issues that may arise in mixed-methods studies and ways to address them. The paper concludes with recommendations on how to report and write about MMR. © 2016 L. A.-R. M. Warfa. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  18. Comparison of a 3-D CFD-DSMC Solution Methodology With a Wind Tunnel Experiment

    NASA Technical Reports Server (NTRS)

    Glass, Christopher E.; Horvath, Thomas J.

    2002-01-01

    A solution method for problems that contain both continuum and rarefied flow regions is presented. The methodology is applied to flow about the 3-D Mars Sample Return Orbiter (MSRO) that has a highly compressed forebody flow, a shear layer where the flow separates from a forebody lip, and a low density wake. Because blunt body flow fields contain such disparate regions, employing a single numerical technique to solve the entire 3-D flow field is often impractical, or the technique does not apply. Direct simulation Monte Carlo (DSMC) could be employed to solve the entire flow field; however, the technique requires inordinate computational resources for continuum and near-continuum regions, and is best suited for the wake region. Computational fluid dynamics (CFD) will solve the high-density forebody flow, but continuum assumptions do not apply in the rarefied wake region. The CFD-DSMC approach presented herein may be a suitable way to obtain a higher fidelity solution.

  19. Phase I/II adaptive design for drug combination oncology trials

    PubMed Central

    Wages, Nolan A.; Conaway, Mark R.

    2014-01-01

    Existing statistical methodology on dose finding for combination chemotherapies has focused on toxicity considerations alone in finding a maximum tolerated dose combination to recommend for further testing of efficacy in a phase II setting. Recently, there has been increasing interest in integrating phase I and phase II trials in order to facilitate drug development. In this article, we propose a new adaptive phase I/II method for dual-agent combinations that takes into account both toxicity and efficacy after each cohort inclusion. The primary objective, both within and at the conclusion of the trial, becomes finding a single dose combination with an acceptable level of toxicity that maximizes efficacious response. We assume that there exist monotone dose–toxicity and dose–efficacy relationships among doses of one agent when the dose of other agent is fixed. We perform extensive simulation studies that demonstrate the operating characteristics of our proposed approach, and we compare simulated results to existing methodology in phase I/II design for combinations of agents. PMID:24470329

  20. Multivariate Methods for Meta-Analysis of Genetic Association Studies.

    PubMed

    Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G

    2018-01-01

    Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention as it improves the precision of the analysis. Here, we review, summarize and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we present in brief univariate methods for meta-analysis and we then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms and multiple outcomes, are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumption of the mode of inheritance, deviation from Hardy-Weinberg Equilibrium and gene-environment interactions are also presented. All available methods are enriched with practical applications, and methodologies that could be developed in the future are discussed. Links for all available software implementing multivariate meta-analysis methods are also provided.

  1. Variant-aware saturating mutagenesis using multiple Cas9 nucleases identifies regulatory elements at trait-associated loci.

    PubMed

    Canver, Matthew C; Lessard, Samuel; Pinello, Luca; Wu, Yuxuan; Ilboudo, Yann; Stern, Emily N; Needleman, Austen J; Galactéros, Frédéric; Brugnara, Carlo; Kutlar, Abdullah; McKenzie, Colin; Reid, Marvin; Chen, Diane D; Das, Partha Pratim; A Cole, Mitchel; Zeng, Jing; Kurita, Ryo; Nakamura, Yukio; Yuan, Guo-Cheng; Lettre, Guillaume; Bauer, Daniel E; Orkin, Stuart H

    2017-04-01

    Cas9-mediated, high-throughput, saturating in situ mutagenesis permits fine-mapping of function across genomic segments. Disease- and trait-associated variants identified in genome-wide association studies largely cluster at regulatory loci. Here we demonstrate the use of multiple designer nucleases and variant-aware library design to interrogate trait-associated regulatory DNA at high resolution. We developed a computational tool for the creation of saturating-mutagenesis libraries with single or multiple nucleases with incorporation of variants. We applied this methodology to the HBS1L-MYB intergenic region, which is associated with red-blood-cell traits, including fetal hemoglobin levels. This approach identified putative regulatory elements that control MYB expression. Analysis of genomic copy number highlighted potential false-positive regions, thus emphasizing the importance of off-target analysis in the design of saturating-mutagenesis experiments. Together, these data establish a widely applicable high-throughput and high-resolution methodology to identify minimal functional sequences within large disease- and trait-associated regions.

  2. Conventional and genetic talent identification in sports: will recent developments trace talent?

    PubMed

    Breitbach, Sarah; Tug, Suzan; Simon, Perikles

    2014-11-01

    The purpose of talent identification (TI) is the earliest possible selection of auspicious athletes with the goal of systematically maximizing their potential. The literature proposes excellent reviews on various facets of talent research on different scientific issues such as sports sciences or genetics. However, the approaches of conventional and genetic testing have only been discussed separately by and for the respective groups of interest. In this article, we combine the discoveries of these disciplines into a single review to provide a comprehensive overview and elucidate the prevailing limitations. Fundamental problems in TI reside in the difficulties of defining the construct ‘talent’ or groups of different performance levels that represent the target variable of testing. Conventional and genetic testing reveal a number of methodological and technical limitations, and parallels are summarised in terms of the test designs, the point in time of testing, psychological skills or traits and unknown interactions between different variables. In conclusion, many deficiencies in the current talent research have gained attention. Alternative solutions include the talent development approach, while genetic testing is re-emphasised as a tool for risk stratification in sport participation. Future research needs to clearly define the group of interest and comprehensively implement all methodological improvement suggestions.

  3. Particle Filtering for Obstacle Tracking in UAS Sense and Avoid Applications

    PubMed Central

    Moccia, Antonio

    2014-01-01

    Obstacle detection and tracking is a key function for UAS sense and avoid applications. In fact, obstacles in the flight path must be detected and tracked in an accurate and timely manner in order to execute a collision avoidance maneuver in case of collision threat. The most important parameter for the assessment of a collision risk is the Distance at Closest Point of Approach, that is, the predicted minimum distance between own aircraft and intruder for assigned current position and speed. Since assessed methodologies can cause some loss of accuracy due to nonlinearities, advanced filtering methodologies, such as particle filters, can provide more accurate estimates of the target state in case of nonlinear problems, thus improving system performance in terms of collision risk estimation. The paper focuses on algorithm development and performance evaluation for an obstacle tracking system based on a particle filter. The particle filter algorithm was tested in off-line simulations based on data gathered during flight tests. In particular, radar-based tracking was considered in order to evaluate the impact of particle filtering in a single sensor framework. The analysis shows some accuracy improvements in the estimation of Distance at Closest Point of Approach, thus reducing the delay in collision detection. PMID:25105154
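    The Distance at Closest Point of Approach used as the collision-risk metric follows directly from the relative position and velocity that the tracking filter estimates. A minimal sketch of that standard geometric calculation (the particle filter itself is omitted; the geometry numbers are illustrative):

    ```python
    import numpy as np

    def cpa(rel_pos, rel_vel):
        """Distance and time at Closest Point of Approach for constant velocities.

        rel_pos: intruder position minus own-aircraft position (m)
        rel_vel: intruder velocity minus own-aircraft velocity (m/s)
        Returns (dcpa, t_cpa); if the intruder is already receding, the current
        separation is the minimum (t_cpa = 0).
        """
        rel_pos = np.asarray(rel_pos, dtype=float)
        rel_vel = np.asarray(rel_vel, dtype=float)
        v2 = rel_vel @ rel_vel
        t_cpa = 0.0 if v2 == 0 else max(0.0, -(rel_pos @ rel_vel) / v2)
        dcpa = np.linalg.norm(rel_pos + rel_vel * t_cpa)
        return dcpa, t_cpa

    # Illustrative near-head-on geometry: intruder 2 km ahead, closing at ~50 m/s
    dcpa, t_cpa = cpa([2000.0, 300.0, 0.0], [-50.0, -2.0, 0.0])
    print(f"DCPA = {dcpa:.0f} m at t = {t_cpa:.0f} s")
    ```

    In the paper, the relative state fed into this kind of calculation comes from the particle-filter estimate of the intruder track, which is where the accuracy improvement is claimed.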

  4. Methodology for quantitative rapid multi-tracer PET tumor characterizations.

    PubMed

    Kadrmas, Dan J; Hoffman, John M

    2013-10-04

    Positron emission tomography (PET) can image a wide variety of functional and physiological parameters in vivo using different radiotracers. As more is learned about the molecular basis for disease and treatment, the potential value of molecular imaging for characterizing and monitoring disease status has increased. Characterizing multiple aspects of tumor physiology by imaging multiple PET tracers in a single patient provides additional complementary information, and there is a significant body of literature supporting the potential value of multi-tracer PET imaging in oncology. However, imaging multiple PET tracers in a single patient presents a number of challenges. A number of techniques are under development for rapidly imaging multiple PET tracers in a single scan, where signal-recovery processing algorithms are employed to recover various imaging endpoints for each tracer. Dynamic imaging is generally used with tracer injections staggered in time, and kinetic constraints are utilized to estimate each tracer's contribution to the multi-tracer imaging signal. This article summarizes past and ongoing work in multi-tracer PET tumor imaging, and then organizes and describes the main algorithmic approaches for achieving multi-tracer PET signal-recovery. While significant advances have been made, the complexity of the approach necessitates protocol design, optimization, and testing for each particular tracer combination and application. Rapid multi-tracer PET techniques have great potential for both research and clinical cancer imaging applications, and continued research in this area is warranted.
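    One simplified way to read "kinetic constraints are utilized to estimate each tracer's contribution" is as a constrained separation problem: if kinetic shapes are assumed known (e.g., from population kinetics), the staggered-injection signal can be split by least squares. The sketch below does exactly that on synthetic data; the kinetic_shape model, injection times and amplitudes are illustrative assumptions and not the signal-recovery algorithms reviewed in the article.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.linspace(0, 60, 121)  # minutes

    def kinetic_shape(t, uptake, washout, t0):
        """Toy tracer time-activity shape: zero before injection at t0, then a
        rising-and-decaying curve. A stand-in for an assumed kinetic model."""
        return np.where(t >= t0,
                        (1 - np.exp(-uptake * (t - t0))) * np.exp(-washout * (t - t0)),
                        0.0)

    # Assumed (known) shapes for tracer A injected at t = 0 and tracer B at t = 20 min
    basis = np.column_stack([kinetic_shape(t, 0.5, 0.02, 0.0),
                             kinetic_shape(t, 0.8, 0.05, 20.0)])

    # Synthetic combined measurement with "true" amplitudes 3.0 and 1.5 plus noise
    y = basis @ np.array([3.0, 1.5]) + 0.05 * rng.normal(size=t.size)

    # Separate the tracers' contributions by least squares against the known shapes
    amplitudes, *_ = np.linalg.lstsq(basis, y, rcond=None)
    print("recovered amplitudes:", np.round(amplitudes, 2))
    ```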

  5. Methodology for Quantitative Rapid Multi-Tracer PET Tumor Characterizations

    PubMed Central

    Kadrmas, Dan J.; Hoffman, John M.

    2013-01-01

    Positron emission tomography (PET) can image a wide variety of functional and physiological parameters in vivo using different radiotracers. As more is learned about the molecular basis for disease and treatment, the potential value of molecular imaging for characterizing and monitoring disease status has increased. Characterizing multiple aspects of tumor physiology by imaging multiple PET tracers in a single patient provides additional complementary information, and there is a significant body of literature supporting the potential value of multi-tracer PET imaging in oncology. However, imaging multiple PET tracers in a single patient presents a number of challenges. A number of techniques are under development for rapidly imaging multiple PET tracers in a single scan, where signal-recovery processing algorithms are employed to recover various imaging endpoints for each tracer. Dynamic imaging is generally used with tracer injections staggered in time, and kinetic constraints are utilized to estimate each tracer's contribution to the multi-tracer imaging signal. This article summarizes past and ongoing work in multi-tracer PET tumor imaging, and then organizes and describes the main algorithmic approaches for achieving multi-tracer PET signal-recovery. While significant advances have been made, the complexity of the approach necessitates protocol design, optimization, and testing for each particular tracer combination and application. Rapid multi-tracer PET techniques have great potential for both research and clinical cancer imaging applications, and continued research in this area is warranted. PMID:24312149

  6. Relationships among the structural topology, bond strength, and mechanical properties of single-walled aluminosilicate nanotubes.

    PubMed

    Liou, Kai-Hsin; Tsou, Nien-Ti; Kang, Dun-Yen

    2015-10-21

    Carbon nanotubes (CNTs) are regarded as small but strong due to their nanoscale microstructure and high mechanical strength (Young's modulus exceeds 1000 GPa). A longstanding question has been whether there exist other nanotube materials with mechanical properties as good as those of CNTs. In this study, we investigated the mechanical properties of single-walled aluminosilicate nanotubes (AlSiNTs) using a multiscale computational method and then conducted a comparison with single-walled carbon nanotubes (SWCNTs). By comparing the potential energy estimated from molecular and macroscopic material mechanics, we were able to model the chemical bonds as beam elements for the nanoscale continuum modeling. This method allowed for simulated mechanical tests (tensile, bending, and torsion) with minimum computational resources for deducing their Young's modulus and shear modulus. The proposed approach also enabled the creation of hypothetical nanotubes to elucidate the relative contributions of bond strength and nanotube structural topology to overall nanotube mechanical strength. Our results indicated that it is the structural topology rather than bond strength that dominates the mechanical properties of the nanotubes. Finally, we investigated the relationship between the structural topology and the mechanical properties by analyzing the von Mises stress distribution in the nanotubes. The proposed methodology proved effective in rationalizing differences in the mechanical properties of AlSiNTs and SWCNTs. Furthermore, this approach could be applied to the exploration of new high-strength nanotube materials.

  7. Physician supply forecast: better than peering in a crystal ball?

    PubMed Central

    Roberfroid, Dominique; Leonard, Christian; Stordeur, Sabine

    2009-01-01

    Background: Anticipating physician supply to tackle future health challenges is a crucial but complex task for policy planners. A number of forecasting tools are available, but the methods, advantages and shortcomings of such tools are not straightforward and not always well appraised. Therefore, this paper had two objectives: to present a typology of existing forecasting approaches and to analyse the methodology-related issues. Methods: A literature review was carried out in electronic databases Medline-Ovid, Embase and ERIC. Concrete examples of planning experiences in various countries were analysed. Results: Four main forecasting approaches were identified. The supply projection approach defines the inflow necessary to maintain, or to reach in the future, an arbitrarily predefined level of service provision. The demand-based approach estimates the quantity of health care services used by the population in the future to project physician requirements. The needs-based approach involves defining and predicting health care deficits so that they can be addressed by an adequate workforce. Benchmarking health systems with similar populations and health profiles is the last approach. These different methods can be combined to perform a gap analysis. The methodological challenges of such projections are numerous: most often static models are used and their uncertainty is not assessed; valid and comprehensive data to feed into the models are often lacking; and a rapidly evolving environment affects the likelihood of projection scenarios. As a result, the internal and external validity of the projections included in our review appeared limited. Conclusion: There is no single accepted approach to forecasting physician requirements. The value of projections lies in their utility in identifying the current and emerging trends to which policy-makers need to respond. A genuine gap analysis, an effective monitoring of key parameters and comprehensive workforce planning are key elements to improving the usefulness of physician supply projections. PMID:19216772
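    For the supply projection approach described first, the underlying arithmetic is a stock-and-flow recursion. A deliberately minimal sketch with illustrative numbers, omitting the age structure, migration, activity rates and scenario uncertainty that real planning models need:

    ```python
    def project_supply(initial_supply, annual_graduates, attrition_rate, years):
        """Minimal stock-and-flow supply projection (illustrative only).

        Each year: supply <- supply * (1 - attrition) + new graduates entering
        practice. All parameters here are assumptions for demonstration."""
        supply = float(initial_supply)
        trajectory = [supply]
        for _ in range(years):
            supply = supply * (1.0 - attrition_rate) + annual_graduates
            trajectory.append(supply)
        return trajectory

    # e.g. 40,000 active physicians, 1,500 graduates/year, 3% annual attrition
    print(round(project_supply(40_000, 1_500, 0.03, years=10)[-1]))
    ```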

  8. Methodology or method? A critical review of qualitative case study reports

    PubMed Central

    Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia

    2014-01-01

    Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners. PMID:24809980

  9. [Systemic inflammation: theoretical and methodological approaches to description of general pathological process model. Part 3. Background for nonsyndromic approach].

    PubMed

    Gusev, E Yu; Chereshnev, V A

    2013-01-01

    Theoretical and methodological approaches to the description of systemic inflammation as a general pathological process are discussed. It is shown that a wide range of research types needs to be integrated to develop a model of systemic inflammation.

  10. Beyond Composite Scores and Cronbach's Alpha: Advancing Methodological Rigor in Recreation Research

    ERIC Educational Resources Information Center

    Gagnon, Ryan J.; Stone, Garrett A.; Garst, Barry A.

    2017-01-01

    Critically examining common statistical approaches and their strengths and weaknesses is an important step in advancing recreation and leisure sciences. To continue this critical examination and to inform methodological decision making, this study compared three approaches to determine how alternative approaches may result in contradictory…

  11. PRO_LIGAND: An approach to de novo molecular design. 4. Application to the design of peptides

    NASA Astrophysics Data System (ADS)

    Frenkel, David; Clark, David E.; Li, Jin; Murray, Christopher W.; Robson, Barry; Waszkowycz, Bohdan; Westhead, David R.

    1995-06-01

    In some instances, peptides can play an important role in the discovery of lead compounds. This paper describes the peptide design facility of the de novo drug design package, PRO_LIGAND. The package provides a unified framework for the design of peptides that are similar or complementary to a specified target. The approach uses single amino acid residues, selected from preconstructed libraries of different residues and conformations, and places them on top of predefined target interaction sites. This approach is a well-tested methodology for the design of organics but has not been used for peptides before. Peptides represent a difficulty because of their great conformational flexibility, and a study of the advantages and disadvantages of this simple approach is an important step in the development of design tools. After a description of our general approach, a more detailed discussion of its adaptation to peptides is given. The method is then applied to the design of peptide-based inhibitors to HIV-1 protease and the design of structural mimics of the surface region of lysozyme. The results are encouraging and point the way towards further development of interaction site-based approaches for peptide design.

  12. Development of two surgical approaches to the pituitary gland in the Horse.

    PubMed

    Carmalt, James L; Scansen, Brian A

    2018-12-01

    Current treatment of equine pituitary pars intermedia dysfunction (PPID) requires daily oral medication. Minimally invasive surgical palliation of this condition is appealing as a single treatment to alleviate the clinical signs of disease, dramatically improving the welfare of the horse. The objective was to develop a surgical approach to the equine pituitary gland for subsequent treatment of PPID. The study comprised a cadaver phase to develop the methodology, followed by a terminal procedure under anaesthesia for the most promising techniques. Four surgical approaches to the pituitary gland were investigated in cadaver animals. A ventral trans-basisphenoidal osteotomy and a minimally invasive intravenous approach via the ventral cavernous sinus progressed to live horse trials. Technical complications prevented the myeloscopic and trans-sphenopalatine sinus techniques from being successful. The ventral basisphenoidal osteotomy was repeatable and has potential if an intra-operative imaging guidance system could be employed. The minimally invasive approach was repeatable, atraumatic and relatively inexpensive. A minimally invasive surgical approach to the equine pituitary gland is possible and allows for needle placement within the target tissue. More work is necessary to determine what that treatment might be, but repeatable access to the gland has been obtained, which is a promising step.

  13. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  14. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
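    The two records above describe propagating uncertainty in analysis parameters through engineering failure models to obtain failure probability distributions. A minimal Monte Carlo sketch of that general idea, using an assumed Basquin-type fatigue-life model with illustrative distributions (this is not the documented PFA software or its statistical procedures):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_failure_probability(n_samples=100_000, design_life_cycles=1e4):
        """Minimal Monte Carlo sketch of probabilistic failure assessment.

        A fatigue-life model N = A * S**(-m) is evaluated with uncertain load
        amplitude S and uncertain model parameters A and m; the failure
        probability is the fraction of samples whose predicted life falls short
        of the design life. All distributions and numbers are illustrative."""
        S = rng.lognormal(mean=np.log(200.0), sigma=0.10, size=n_samples)    # stress amplitude, MPa
        A = rng.lognormal(mean=np.log(1.0e16), sigma=0.30, size=n_samples)   # model-parameter uncertainty
        m = rng.normal(loc=5.0, scale=0.1, size=n_samples)                   # S-N exponent uncertainty
        life = A * S ** (-m)                                                 # predicted cycles to failure
        return np.mean(life < design_life_cycles)

    print(f"estimated failure probability: {simulate_failure_probability():.2e}")
    ```

    In the actual PFA framework, this kind of prior failure-probability distribution would then be updated with test and flight experience; the sketch stops at the forward propagation step.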

  15. Using Constructivist Case Study Methodology to Understand Community Development Processes: Proposed Methodological Questions to Guide the Research Process

    ERIC Educational Resources Information Center

    Lauckner, Heidi; Paterson, Margo; Krupa, Terry

    2012-01-01

    Often, research projects are presented as final products with the methodologies cleanly outlined and little attention paid to the decision-making processes that led to the chosen approach. Limited attention paid to these decision-making processes perpetuates a sense of mystery about qualitative approaches, particularly for new researchers who will…

  16. Broadening the Study of Participation in the Life Sciences: How Critical Theoretical and Mixed-Methodological Approaches Can Enhance Efforts to Broaden Participation

    ERIC Educational Resources Information Center

    Metcalf, Heather

    2016-01-01

    This research methods Essay details the usefulness of critical theoretical frameworks and critical mixed-methodological approaches for life sciences education research on broadening participation in the life sciences. First, I draw on multidisciplinary research to discuss critical theory and methodologies. Then, I demonstrate the benefits of these…

  17. Visualizing single molecules interacting with nuclear pore complexes by narrow-field epifluorescence microscopy

    PubMed Central

    Yang, Weidong; Musser, Siegfried M.

    2008-01-01

    The utility of single molecule fluorescence (SMF) for understanding biological reactions has been amply demonstrated by a diverse series of studies over the last decade. In large part, the molecules of interest have been limited to those within a small focal volume or near a surface to achieve the high sensitivity required for detecting the inherently weak signals arising from individual molecules. Consequently, the investigation of molecular behavior with high time and spatial resolution deep within cells using SMF has remained challenging. Recently, we demonstrated that narrow-field epifluorescence microscopy allows visualization of nucleocytoplasmic transport at the single cargo level. We describe here the methodological approach that yields 2 ms and ∼15 nm resolution for a stationary particle. The spatial resolution for a mobile particle is inherently worse, and depends on how fast the particle is moving. The signal-to-noise ratio is sufficiently high to directly measure the time a single cargo molecule spends interacting with the nuclear pore complex. Particle tracking analysis revealed that cargo molecules randomly diffuse within the nuclear pore complex, exiting as a result of a single rate-limiting step. We expect that narrow-field epifluorescence microscopy will be useful for elucidating other binding and trafficking events within cells. PMID:16879979

  18. Insight and Evidence Motivating the Simplification of Dual-Analysis Hybrid Systems into Single-Analysis Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Todling, Ricardo; Diniz, F. L. R.; Takacs, L. L.; Suarez, M. J.

    2018-01-01

    Many hybrid data assimilation systems currently used for NWP employ some form of dual-analysis system approach. Typically a hybrid variational analysis is responsible for creating initial conditions for high-resolution forecasts, and an ensemble analysis system is responsible for creating sample perturbations used to form the flow-dependent part of the background error covariance required in the hybrid analysis component. In many of these, the two analysis components employ different methodologies, e.g., variational and ensemble Kalman filter. In such cases, it is not uncommon to have observations treated rather differently between the two analysis components; recentering of the ensemble analysis around the hybrid analysis is used to compensate for such differences. Furthermore, in many cases, the hybrid variational high-resolution system implements some type of four-dimensional approach, whereas the underlying ensemble system relies on a three-dimensional approach, which again introduces discrepancies in the overall system. Connected to these is the expectation that one can reliably estimate observation impact on forecasts issued from hybrid analyses by using an ensemble approach based on the underlying ensemble strategy of dual-analysis systems. Just the realization that the ensemble analysis makes substantially different use of observations as compared to their hybrid counterpart should serve as enough evidence of the implausibility of such an expectation. This presentation assembles a range of anecdotal evidence to illustrate the fact that hybrid dual-analysis systems must, at the very minimum, strive for consistent use of the observations in both analysis sub-components. Simpler than that, this work suggests that hybrid systems can reliably be constructed without the need to employ a dual-analysis approach. In practice, the idea of relying on a single analysis system is appealing from a cost-maintenance perspective. More generally, single-analysis systems avoid contradictions such as having to use one sub-component to generate performance diagnostics for another, possibly not fully consistent, component.

  19. The IDEA model: A single equation approach to the Ebola forecasting challenge.

    PubMed

    Tuite, Ashleigh R; Fisman, David N

    2018-03-01

    Mathematical modeling is increasingly accepted as a tool that can inform disease control policy in the face of emerging infectious diseases, such as the 2014-2015 West African Ebola epidemic, but little is known about the relative performance of alternate forecasting approaches. The RAPIDD Ebola Forecasting Challenge (REFC) tested the ability of eight mathematical models to generate useful forecasts in the face of simulated Ebola outbreaks. We used a simple, phenomenological single-equation model (the "IDEA" model), which relies only on case counts, in the REFC. Model fits were performed using a maximum likelihood approach. We found that the model performed reasonably well relative to other more complex approaches, with performance metrics ranked on average 4th or 5th among participating models. IDEA appeared better suited to long- than short-term forecasts, and could be fit using nothing but reported case counts. Several limitations were identified, including difficulty in identifying epidemic peak (even retrospectively), unrealistically precise confidence intervals, and difficulty interpolating daily case counts when using a model scaled to epidemic generation time. More realistic confidence intervals were generated when case counts were assumed to follow a negative binomial, rather than Poisson, distribution. Nonetheless, IDEA represents a simple phenomenological model, easily implemented in widely available software packages that could be used by frontline public health personnel to generate forecasts with accuracy that approximates that which is achieved using more complex methodologies. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.
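    For readers unfamiliar with the IDEA form, the published model is I(t) = (R0 / (1 + d)^t)^t with t measured in epidemic generations, and fitting it by maximum likelihood needs only reported case counts. A minimal sketch on synthetic counts, using a Poisson likelihood for brevity (the abstract notes that a negative binomial gives more realistic intervals); all numbers are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def idea_incidence(t, r0, d):
        """IDEA model: expected incident cases at generation t."""
        return (r0 / (1.0 + d) ** t) ** t

    def neg_log_lik(params, t, cases):
        r0, d = params
        if r0 <= 1e-6 or d < 0:
            return np.inf
        mu = np.maximum(idea_incidence(t, r0, d), 1e-9)
        # Poisson log-likelihood up to an additive constant
        return -np.sum(cases * np.log(mu) - mu)

    # Synthetic outbreak: counts per generation, purely illustrative
    t = np.arange(1, 11, dtype=float)
    cases = np.array([3, 7, 15, 28, 40, 45, 38, 25, 12, 5], dtype=float)

    fit = minimize(neg_log_lik, x0=[2.0, 0.05], args=(t, cases), method="Nelder-Mead")
    r0_hat, d_hat = fit.x
    print(f"R0 = {r0_hat:.2f}, d = {d_hat:.3f}")
    ```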

  20. Paradigms, pragmatism and possibilities: mixed-methods research in speech and language therapy.

    PubMed

    Glogowska, Margaret

    2011-01-01

    After the decades of the so-called 'paradigm wars' in social science research methodology and the controversy about the relative place and value of quantitative and qualitative research methodologies, 'paradigm peace' appears to have now been declared. This has come about as many researchers have begun to take a 'pragmatic' approach in the selection of research methodology, choosing the methodology best suited to answering the research question rather than conforming to a methodological orthodoxy. With the differences in the philosophical underpinnings of the two traditions set to one side, an increasing awareness, and valuing, of the 'mixed-methods' approach to research is now present in the fields of social, educational and health research. To explore what is meant by mixed-methods research and the ways in which quantitative and qualitative methodologies and methods can be combined and integrated, particularly in the broad field of health services research and the narrower one of speech and language therapy. The paper discusses the ways in which methodological approaches have already been combined and integrated in health services research and speech and language therapy, highlighting the suitability of mixed-methods research for answering the typically multifaceted questions arising from the provision of complex interventions. The challenges of combining and integrating quantitative and qualitative methods and the barriers to the adoption of mixed-methods approaches are also considered. The questions about healthcare, as it is being provided in the 21st century, call for a range of methodological approaches. This is particularly the case for human communication and its disorders, where mixed-methods research offers a wealth of possibilities. In turn, speech and language therapy research should be able to contribute substantively to the future development of mixed-methods research. © 2010 Royal College of Speech & Language Therapists.

  1. Single-Subject Evaluation: A Tool for Quality Assurance.

    ERIC Educational Resources Information Center

    Nuehring, Elane M.; Pascone, Anne B.

    1986-01-01

    The use of single-subject designs in peer review, in utilization review, and in other quality-assurance audits is encouraged. Presents an overview of the methodologies of single-subject designs and quality assurance, and provides examples of cases in which single-subject techniques furnished relevant quality assurance documentation. (Author/ABB)

  2. Descriptive Analysis of Single Subject Research Designs: 1983-2007

    ERIC Educational Resources Information Center

    Hammond, Diana; Gast, David L.

    2010-01-01

    Single subject research methodology is commonly used and cited in special education courses and journals. This article reviews the types of single subject research designs published in eight refereed journals between 1983 and 2007 used to answer applied research questions. Single subject designs were categorized as withdrawal/reversal, time…

  3. Hemodynamic consequences of LPA stenosis in single ventricle stage 2 LPN circulation with automatic registration

    NASA Astrophysics Data System (ADS)

    Schiavazzi, Daniele E.; Kung, Ethan O.; Dorfman, Adam L.; Hsia, Tain-Yen; Baretta, Alessia; Arbia, Gregory; Marsden, Alison L.

    2013-11-01

    Congenital heart diseases such as hypoplastic left heart syndrome annually affect about 3% of births in the US alone. Surgical palliation of single ventricle patients is performed in stages. Following the stage 2 surgical procedure or other previous conditions, a stenosis of the left pulmonary artery (LPA) is often observed, raising the clinical question of whether or not it should be treated. The severity of stenoses is commonly assessed through geometric inspection or catheter in-vivo pressure measurements with limited quantitative information about patient-specific physiology. The present study uses a multiscale CFD approach to provide an assessment of the severity of LPA stenoses. A lumped parameter 0D model is used to simulate stage 2 circulation, and parameters are automatically identified accounting for uncertainty in the clinical data available for a cohort of patients. The importance of the latter parameters, whether alone or in groups, is also ranked using forward uncertainty propagation methods. Various stenosis levels are applied to the three-dimensional SVC-PA junction model using a dual mesh-morphing approach. Traditional assessment methodologies are compared with our findings and critically discussed.

  4. Effective Hamiltonian approach to bright and dark excitons in single-walled carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Choi, Sangkook; Deslippe, Jack; Louie, Steven G.

    2009-03-01

    Recently, excitons in single-walled carbon nanotubes (SWCNTs) have generated great research interest due to the large binding energies and unique screening properties associated with one-dimensional (1D) materials. Considerable progress in their theoretical understanding has been achieved by studies employing the ab initio GW-Bethe-Salpeter equation methodology. For example, the presence of bright and dark excitons with binding energies of a large fraction of an eV has been predicted and subsequently verified by experiment. Some of these results have also been quantitatively reproduced by recent model calculations using a spatially dependent screened Coulomb interaction between the excited electron and hole, an approach that would be useful for studying large diameter and chiral nanotubes with many atoms per unit cell. However, this previous model neglects the degeneracy of the band states and hence the dark excitons. We present an extension of this exciton model for the SWCNT, incorporating the screened Coulomb interaction as well as state degeneracy, to understand and compute the characteristics of the bright and dark excitons, such as the bright and dark level splittings. Supported by NSF #DMR07-05941, DOE #De-AC02-05CH11231 and computational resources from Teragrid and NERSC.

  5. Quantitative assessment of dynamic PET imaging data in cancer imaging.

    PubMed

    Muzi, Mark; O'Sullivan, Finbarr; Mankoff, David A; Doot, Robert K; Pierce, Larry A; Kurland, Brenda F; Linden, Hannah M; Kinahan, Paul E

    2012-11-01

    Clinical imaging in positron emission tomography (PET) is often performed using single-time-point estimates of tracer uptake or static imaging that provides a spatial map of regional tracer concentration. However, dynamic tracer imaging can provide considerably more information about in vivo biology by delineating both the temporal and spatial pattern of tracer uptake. In addition, several potential sources of error that occur in static imaging can be mitigated. This review focuses on the application of dynamic PET imaging to measuring regional cancer biologic features and especially in using dynamic PET imaging for quantitative therapeutic response monitoring for cancer clinical trials. Dynamic PET imaging output parameters, particularly transport (flow) and overall metabolic rate, have provided imaging end points for clinical trials at single-center institutions for years. However, dynamic imaging poses many challenges for multicenter clinical trial implementations from cross-center calibration to the inadequacy of a common informatics infrastructure. Underlying principles and methodology of PET dynamic imaging are first reviewed, followed by an examination of current approaches to dynamic PET image analysis with a specific case example of dynamic fluorothymidine imaging to illustrate the approach. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. Single-scan 2D NMR: An Emerging Tool in Analytical Spectroscopy

    PubMed Central

    Giraudeau, Patrick; Frydman, Lucio

    2016-01-01

    Two-dimensional Nuclear Magnetic Resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing an increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan nature. A decade ago a so-called “ultrafast” (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or hetero-nuclear correlations in a single scan. During the intervening years the performance of this sub-second 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool witnessing an expanded scope of applications. The present review summarizes the principles and the main developments which have contributed to the success of this approach, and focuses on applications which have been recently demonstrated in various areas of analytical chemistry: from the real-time monitoring of chemical and biochemical processes to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  7. Entropy-Based Analysis and Bioinformatics-Inspired Integration of Global Economic Information Transfer

    PubMed Central

    An, Sungbae; Kwon, Young-Kyun; Yoon, Sungroh

    2013-01-01

    The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis. PMID:23300959

  8. Entropy-based analysis and bioinformatics-inspired integration of global economic information transfer.

    PubMed

    Kim, Jinkyu; Kim, Gunn; An, Sungbae; Kwon, Young-Kyun; Yoon, Sungroh

    2013-01-01

    The assessment of information transfer in the global economic network helps to understand the current environment and the outlook of an economy. Most approaches on global networks extract information transfer based mainly on a single variable. This paper establishes an entirely new bioinformatics-inspired approach to integrating information transfer derived from multiple variables and develops an international economic network accordingly. In the proposed methodology, we first construct the transfer entropies (TEs) between various intra- and inter-country pairs of economic time series variables, test their significances, and then use a weighted sum approach to aggregate information captured in each TE. Through a simulation study, the new method is shown to deliver better information integration compared to existing integration methods in that it can be applied even when intra-country variables are correlated. Empirical investigation with the real world data reveals that Western countries are more influential in the global economic network and that Japan has become less influential following the Asian currency crisis.
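    The building block of the method described in the two records above is the pairwise transfer entropy between time series. A minimal binned (plug-in) estimator is sketched below on synthetic data; the bin count, the synthetic series and the simple histogram estimator are illustrative choices, not the paper's exact estimation or significance-testing procedure.

    ```python
    import numpy as np

    def transfer_entropy(x, y, bins=3):
        """Binned plug-in estimator of transfer entropy TE(X -> Y), in bits.

        TE(X->Y) = sum p(y', y, x) * log2[ p(y' | y, x) / p(y' | y) ],
        where y' is the next value of Y. Fine for a sketch; biased for short series.
        """
        x = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
        y = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
        y_next, y_now, x_now = y[1:], y[:-1], x[:-1]

        def joint_prob(*cols):
            counts = {}
            for key in zip(*cols):
                counts[key] = counts.get(key, 0) + 1
            n = len(cols[0])
            return {k: v / n for k, v in counts.items()}

        p_xyz = joint_prob(y_next, y_now, x_now)
        p_yz = joint_prob(y_now, x_now)
        p_yy = joint_prob(y_next, y_now)
        p_y = joint_prob(y_now)

        te = 0.0
        for (yn, yc, xc), p in p_xyz.items():
            te += p * np.log2((p / p_yz[(yc, xc)]) / (p_yy[(yn, yc)] / p_y[(yc,)]))
        return te

    rng = np.random.default_rng(1)
    x = rng.normal(size=2000)
    y = np.roll(x, 1) + 0.5 * rng.normal(size=2000)  # y is driven by lagged x
    print(f"TE(x -> y) = {transfer_entropy(x, y):.3f} bits")
    print(f"TE(y -> x) = {transfer_entropy(y, x):.3f} bits")
    ```

    The paper then tests each TE for significance and aggregates them with a weighted sum across variable pairs; the sketch stops at the single-pair estimate.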

  9. Probabilistic risk analysis and terrorism risk.

    PubMed

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  10. TAPping into argumentation: Developments in the application of Toulmin's Argument Pattern for studying science discourse

    NASA Astrophysics Data System (ADS)

    Erduran, Sibel; Simon, Shirley; Osborne, Jonathan

    2004-11-01

    This paper reports some methodological approaches to the analysis of argumentation discourse developed as part of the two-and-a-half year project titled 'Enhancing the Quality of Argument in School Science', supported by the Economic and Social Research Council in the United Kingdom. In this project researchers collaborated with middle-school science teachers to develop models of instructional activities in an effort to make argumentation a component of instruction. We begin the paper with a brief theoretical justification for why we consider argumentation to be of significance to science education. We then contextualize the use of Toulmin's Argument Pattern in the study of argumentation discourse and provide a justification for the methodological outcomes our approach generates. We illustrate how our work refines and develops research methodologies in argumentation analysis. In particular, we present two methodological approaches to the analysis of argumentation resulting in whole-class as well as small-group student discussions. For each approach, we illustrate our coding scheme and some results as well as how our methodological approach has enabled our inquiry into the quality of argumentation in the classroom. We conclude with some implications for future research in argumentation in science education.

  11. Methodology for Estimating Total Automotive Manufacturing Costs

    DOT National Transportation Integrated Search

    1983-04-01

    A number of methodologies for estimating manufacturing costs have been developed. This report discusses the different approaches and shows that an approach to estimating manufacturing costs in the automobile industry based on surrogate plants is pref...

  12. External Validity in the Study of Human Development: Theoretical and Methodological Issues

    ERIC Educational Resources Information Center

    Hultsch, David F.; Hickey, Tom

    1978-01-01

    An examination of the concept of external validity from two theoretical perspectives: a traditional mechanistic approach and a dialectical organismic approach. Examines the theoretical and methodological implications of these perspectives. (BD)

  13. The Applied Behavior Analysis Research Paradigm and Single-Subject Designs in Adapted Physical Activity Research.

    PubMed

    Haegele, Justin A; Hodge, Samuel Russell

    2015-10-01

    There are basic philosophical and paradigmatic assumptions that guide scholarly research endeavors, including the methods used and the types of questions asked. Through this article, kinesiology faculty and students with interests in adapted physical activity are encouraged to understand the basic assumptions of applied behavior analysis (ABA) methodology for conducting, analyzing, and presenting research of high quality in this paradigm. The purposes of this viewpoint paper are to present information fundamental to understanding the assumptions undergirding research methodology in ABA, describe key aspects of single-subject research designs, and discuss common research designs and data-analysis strategies used in single-subject studies.

  14. Methodological challenges in measuring vaccine effectiveness using population cohorts in low resource settings.

    PubMed

    King, C; Beard, J; Crampin, A C; Costello, A; Mwansambo, C; Cunliffe, N A; Heyderman, R S; French, N; Bar-Zeev, N

    2015-09-11

    Post-licensure real world evaluation of vaccine implementation is important for establishing evidence of vaccine effectiveness (VE) and programme impact, including indirect effects. Large cohort studies offer an important epidemiological approach for evaluating VE, but have inherent methodological challenges. Since March 2012, we have conducted an open prospective cohort study in two sites in rural Malawi to evaluate the post-introduction effectiveness of 13-valent pneumococcal conjugate vaccine (PCV13) against all-cause post-neonatal infant mortality and monovalent rotavirus vaccine (RV1) against diarrhoea-related post-neonatal infant mortality. Our study sites cover a population of 500,000, with a baseline post-neonatal infant mortality of 25 per 1000 live births. We conducted a methodological review of cohort studies for vaccine effectiveness in a developing country setting, applied to our study context. Based on published literature, we outline key considerations when defining the denominator (study population), exposure (vaccination status) and outcome ascertainment (mortality and cause of death) of such studies. We assess various definitions in these three domains, in terms of their impact on power, effect size and potential biases and their direction, using our cohort study for illustration. Based on this iterative process, we discuss the pros and cons of our final per-protocol analysis plan. Since no single set of definitions or analytical approach accounts for all possible biases, we propose sensitivity analyses to interrogate our assumptions and methodological decisions. In the poorest regions of the world, where routine vital birth and death surveillance are frequently unavailable and the burden of disease and death is greatest, we conclude that, provided the balance between definitions and their overall assumed impact on estimated VE is acknowledged, such large-scale real-world cohort studies can provide crucial information to policymakers by providing robust and compelling evidence of the total benefits of newly introduced vaccines on reducing child mortality. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Methodological challenges in measuring vaccine effectiveness using population cohorts in low resource settings

    PubMed Central

    King, C.; Beard, J.; Crampin, A.C.; Costello, A.; Mwansambo, C.; Cunliffe, N.A.; Heyderman, R.S.; French, N.; Bar-Zeev, N.

    2015-01-01

    Post-licensure real world evaluation of vaccine implementation is important for establishing evidence of vaccine effectiveness (VE) and programme impact, including indirect effects. Large cohort studies offer an important epidemiological approach for evaluating VE, but have inherent methodological challenges. Since March 2012, we have conducted an open prospective cohort study in two sites in rural Malawi to evaluate the post-introduction effectiveness of 13-valent pneumococcal conjugate vaccine (PCV13) against all-cause post-neonatal infant mortality and monovalent rotavirus vaccine (RV1) against diarrhoea-related post-neonatal infant mortality. Our study sites cover a population of 500,000, with a baseline post-neonatal infant mortality of 25 per 1000 live births. We conducted a methodological review of cohort studies for vaccine effectiveness in a developing country setting, applied to our study context. Based on published literature, we outline key considerations when defining the denominator (study population), exposure (vaccination status) and outcome ascertainment (mortality and cause of death) of such studies. We assess various definitions in these three domains, in terms of their impact on power, effect size and potential biases and their direction, using our cohort study for illustration. Based on this iterative process, we discuss the pros and cons of our final per-protocol analysis plan. Since no single set of definitions or analytical approach accounts for all possible biases, we propose sensitivity analyses to interrogate our assumptions and methodological decisions. In the poorest regions of the world, where routine vital birth and death surveillance are frequently unavailable and the burden of disease and death is greatest, we conclude that, provided the balance between definitions and their overall assumed impact on estimated VE is acknowledged, such large-scale real-world cohort studies can provide crucial information to policymakers by providing robust and compelling evidence of the total benefits of newly introduced vaccines on reducing child mortality. PMID:26235370

  16. Ab initio calculations of the lattice parameter and elastic stiffness coefficients of bcc Fe with solutes

    DOE PAGES

    Fellinger, Michael R.; Hector, Louis G.; Trinkle, Dallas R.

    2016-10-28

    Here, we present an efficient methodology for computing solute-induced changes in lattice parameters and elastic stiffness coefficients Cij of single crystals using density functional theory. We also introduce a solute strain misfit tensor that quantifies how solutes change lattice parameters due to the stress they induce in the host crystal. Solutes modify the elastic stiffness coefficients through volumetric changes and by altering chemical bonds. We compute each of these contributions to the elastic stiffness coefficients separately, and verify that their sum agrees with changes in the elastic stiffness coefficients computed directly using fully optimized supercells containing solutes. Computing the two elastic stiffness contributions separately is more computationally efficient and provides more information on solute effects than the direct calculations. We compute the solute dependence of polycrystalline averaged shear and Young's moduli from the solute dependence of the single-crystal Cij. We then apply this methodology to substitutional Al, B, Cu, Mn, Si solutes and octahedral interstitial C and N solutes in bcc Fe. Comparison with experimental data indicates that our approach accurately predicts solute-induced changes in the lattice parameter and elastic coefficients. The computed data can be used to quantify solute-induced changes in mechanical properties such as strength and ductility, and can be incorporated into mesoscale models to improve their predictive capabilities.
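    The polycrystalline averages mentioned at the end follow from the single-crystal Cij by standard Voigt-Reuss-Hill relations, which for cubic symmetry reduce to closed-form expressions. A sketch with elastic constants roughly in the experimental range for bcc Fe (the specific numbers are illustrative, not taken from the paper):

    ```python
    def cubic_polycrystal_moduli(c11, c12, c44):
        """Voigt-Reuss-Hill polycrystalline averages for a cubic crystal (GPa in, GPa out)."""
        k = (c11 + 2.0 * c12) / 3.0                     # bulk modulus (Voigt = Reuss for cubic)
        g_voigt = (c11 - c12 + 3.0 * c44) / 5.0
        g_reuss = 5.0 * (c11 - c12) * c44 / (4.0 * c44 + 3.0 * (c11 - c12))
        g = 0.5 * (g_voigt + g_reuss)                   # Hill-average shear modulus
        e = 9.0 * k * g / (3.0 * k + g)                 # isotropic Young's modulus
        return k, g, e

    # Illustrative bcc Fe elastic constants (GPa)
    k, g, e = cubic_polycrystal_moduli(c11=230.0, c12=135.0, c44=117.0)
    print(f"K = {k:.0f} GPa, G = {g:.0f} GPa, E = {e:.0f} GPa")
    ```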

  17. Rapid and label-free microfluidic neutrophil purification and phenotyping in diabetes mellitus

    NASA Astrophysics Data System (ADS)

    Hou, Han Wei; Petchakup, Chayakorn; Tay, Hui Min; Tam, Zhi Yang; Dalan, Rinkoo; Chew, Daniel Ek Kwang; Li, King Ho Holden; Boehm, Bernhard O.

    2016-07-01

    Advanced management of dysmetabolic syndromes such as diabetes will benefit from a timely mechanistic insight enabling personalized medicine approaches. Herein, we present a rapid microfluidic neutrophil sorting and functional phenotyping strategy for type 2 diabetes mellitus (T2DM) patients using small blood volumes (fingerprick ~100 μL). The developed inertial microfluidics technology enables single-step neutrophil isolation (>90% purity) without immuno-labeling and sorted neutrophils are used to characterize their rolling behavior on E-selectin, a critical step in leukocyte recruitment during inflammation. The integrated microfluidics testing methodology facilitates high throughput single-cell quantification of neutrophil rolling to detect subtle differences in speed distribution. Higher rolling speed was observed in T2DM patients (P < 0.01) which strongly correlated with neutrophil activation, rolling ligand P-selectin glycoprotein ligand 1 (PSGL-1) expression, as well as established cardiovascular risk factors (cholesterol, high-sensitive C-reactive protein (CRP) and HbA1c). Rolling phenotype can be modulated by common disease risk modifiers (metformin and pravastatin). Receiver operating characteristics (ROC) and principal component analysis (PCA) revealed neutrophil rolling as an important functional phenotype in T2DM diagnostics. These results suggest a new point-of-care testing methodology, and neutrophil rolling speed as a functional biomarker for rapid profiling of dysmetabolic subjects in clinical and patient-oriented settings.

  18. Multiscale mutation clustering algorithm identifies pan-cancer mutational clusters associated with pathway-level changes in gene expression

    PubMed Central

    Poole, William; Leinonen, Kalle; Shmulevich, Ilya

    2017-01-01

    Cancer researchers have long recognized that somatic mutations are not uniformly distributed within genes. However, most approaches for identifying cancer mutations focus on either the entire-gene or single amino-acid level. We have bridged these two methodologies with a multiscale mutation clustering algorithm that identifies variable length mutation clusters in cancer genes. We ran our algorithm on 539 genes using the combined mutation data in 23 cancer types from The Cancer Genome Atlas (TCGA) and identified 1295 mutation clusters. The resulting mutation clusters cover a wide range of scales and often overlap with many kinds of protein features including structured domains, phosphorylation sites, and known single nucleotide variants. We statistically associated these multiscale clusters with gene expression and drug response data to illuminate the functional and clinical consequences of mutations in our clusters. Interestingly, we find multiple clusters within individual genes that have differential functional associations: these include PTEN, FUBP1, and CDH1. This methodology has potential implications in identifying protein regions for drug targets, understanding the biological underpinnings of cancer, and personalizing cancer treatments. Toward this end, we have made the mutation clusters and the clustering algorithm available to the public. Clusters and pathway associations can be interactively browsed at m2c.systemsbiology.net. The multiscale mutation clustering algorithm is available at https://github.com/IlyaLab/M2C. PMID:28170390

  19. Multiscale mutation clustering algorithm identifies pan-cancer mutational clusters associated with pathway-level changes in gene expression.

    PubMed

    Poole, William; Leinonen, Kalle; Shmulevich, Ilya; Knijnenburg, Theo A; Bernard, Brady

    2017-02-01

    Cancer researchers have long recognized that somatic mutations are not uniformly distributed within genes. However, most approaches for identifying cancer mutations focus on either the entire-gene or single amino-acid level. We have bridged these two methodologies with a multiscale mutation clustering algorithm that identifies variable length mutation clusters in cancer genes. We ran our algorithm on 539 genes using the combined mutation data in 23 cancer types from The Cancer Genome Atlas (TCGA) and identified 1295 mutation clusters. The resulting mutation clusters cover a wide range of scales and often overlap with many kinds of protein features including structured domains, phosphorylation sites, and known single nucleotide variants. We statistically associated these multiscale clusters with gene expression and drug response data to illuminate the functional and clinical consequences of mutations in our clusters. Interestingly, we find multiple clusters within individual genes that have differential functional associations: these include PTEN, FUBP1, and CDH1. This methodology has potential implications in identifying protein regions for drug targets, understanding the biological underpinnings of cancer, and personalizing cancer treatments. Toward this end, we have made the mutation clusters and the clustering algorithm available to the public. Clusters and pathway associations can be interactively browsed at m2c.systemsbiology.net. The multiscale mutation clustering algorithm is available at https://github.com/IlyaLab/M2C.

  20. A Modified Penalty Parameter Approach for Optimal Estimation of UH with Simultaneous Estimation of Infiltration Parameters

    NASA Astrophysics Data System (ADS)

    Bhattacharjya, Rajib Kumar

    2018-05-01

    The unit hydrograph and the infiltration parameters of a watershed can be obtained from observed rainfall-runoff data by using an inverse optimization technique. This is a two-stage optimization problem: the infiltration parameters are obtained in the first stage and the unit hydrograph ordinates are estimated in the second. In order to combine this two-stage method into a single-stage one, a modified penalty parameter approach is proposed for converting the constrained optimization problem to an unconstrained one. The proposed approach is designed in such a way that the model initially obtains the infiltration parameters and then searches for the optimal unit hydrograph ordinates. The optimization model is solved using Genetic Algorithms. A reduction factor is used in the penalty parameter approach so that the obtained optimal infiltration parameters are not destroyed during the subsequent generations of the genetic algorithm required for searching the optimal unit hydrograph ordinates. The performance of the proposed methodology is evaluated using two example problems. The evaluation shows that the model is superior to the conventional two-stage approach, is simple in concept, and has potential for field application.
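
    As a minimal sketch of the penalty idea just described (assuming an illustrative runoff-volume constraint and an exponential decay of the penalty weight; the names phi_index and uh are hypothetical, not the paper's notation), a genetic-algorithm fitness could look like this:

      import numpy as np

      def fitness(params, rainfall, runoff_obs, generation, penalty0=1e3, reduction=0.9):
          """Objective to minimise: fit error plus a decaying penalty on a volume constraint."""
          phi_index, uh = params[0], np.asarray(params[1:])        # infiltration index + UH ordinates
          excess = np.clip(np.asarray(rainfall) - phi_index, 0.0, None)
          runoff_sim = np.convolve(excess, uh)[:len(runoff_obs)]   # discrete-convolution UH model
          sse = np.sum((runoff_sim - np.asarray(runoff_obs)) ** 2)
          volume_violation = abs(runoff_sim.sum() - np.sum(runoff_obs))   # mass-balance constraint
          penalty = penalty0 * (reduction ** generation)           # reduction factor applied per generation
          return sse + penalty * volume_violation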

  1. Near-Earth object hazardous impact: A Multi-Criteria Decision Making approach.

    PubMed

    Sánchez-Lozano, J M; Fernández-Martínez, M

    2016-11-16

    The impact of a near-Earth object (NEO) may release large amounts of energy and cause serious damage. Several NEO hazard studies conducted over the past few years provide forecasts, impact probabilities and assessment ratings, such as the Torino and Palermo scales. These high-risk NEO assessments involve several criteria, including impact energy, mass, and absolute magnitude. The main objective of this paper is to provide the first Multi-Criteria Decision Making (MCDM) approach to classify hazardous NEOs. Our approach applies a combination of two methods from a widely utilized decision making theory. Specifically, the Analytic Hierarchy Process (AHP) methodology is employed to determine the criteria weights, which influence the decision making, and the Technique for Order Performance by Similarity to Ideal Solution (TOPSIS) is used to obtain a ranking of alternatives (potentially hazardous NEOs). In addition, NEO datasets provided by the NASA Near-Earth Object Program are utilized. This approach allows the classification of NEOs by descending order of their TOPSIS ratio, a single quantity that contains all of the relevant information for each object.
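
    The TOPSIS ranking step named above can be sketched in a few lines; the weights and the three-criterion toy table below are illustrative only, not the AHP-derived weights or the NASA Near-Earth Object Program data used in the paper.

      import numpy as np

      def topsis(matrix, weights, benefit):
          """matrix: alternatives x criteria; benefit[j] is True if larger values are better."""
          X = np.asarray(matrix, dtype=float)
          R = X / np.linalg.norm(X, axis=0)            # vector-normalise each criterion
          V = R * np.asarray(weights, dtype=float)     # weighted normalised matrix
          ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
          anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
          d_pos = np.linalg.norm(V - ideal, axis=1)
          d_neg = np.linalg.norm(V - anti,  axis=1)
          return d_neg / (d_pos + d_neg)               # closeness ratio; rank in descending order

      # Toy NEO table: impact energy, mass, absolute magnitude (lower H means a larger object)
      scores = topsis([[120, 3e10, 21.0],
                       [ 40, 8e9,  24.5],
                       [300, 6e10, 19.2]],
                      weights=[0.5, 0.3, 0.2],
                      benefit=[True, True, False])
      print(scores.argsort()[::-1])   # alternatives ordered by descending TOPSIS ratio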

  2. Polymer on Top: Current Limits and Future Perspectives of Quantitatively Evaluating Surface Grafting.

    PubMed

    Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher

    2018-03-07

    Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determining quantitative grafting densities are assessed. Based on a brief introduction to the key theories describing polymer brush regimes, a user's guide is provided for estimating maximum chain coverage and, importantly, for examining the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. An estimation of the reliability of these determination methods is provided by carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical access guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid the reporting of grafting densities that fall outside the physically realistic parameter space. The assessment is concluded with a perspective on the development of advanced approaches for the determination of grafting density, in particular single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
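
    For the dry-thickness route mentioned above, the grafting density follows from sigma = h * rho * N_A / M_n; the short sketch below evaluates it in chains per nm^2 (the layer thickness, density and molar mass in the example are illustrative, not values from the paper).

      N_A = 6.022e23   # Avogadro's number, 1/mol

      def grafting_density(dry_thickness_nm, density_g_cm3, Mn_g_mol):
          h_cm = dry_thickness_nm * 1e-7                         # nm -> cm
          sigma_cm2 = h_cm * density_g_cm3 * N_A / Mn_g_mol      # chains per cm^2
          return sigma_cm2 * 1e-14                               # chains per nm^2

      # e.g. a 10 nm dry layer, rho ~ 1.18 g/cm^3, Mn ~ 50 kg/mol
      print(round(grafting_density(10, 1.18, 50_000), 3))        # ~0.14 chains/nm^2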

  3. Targeted proteomics coming of age - SRM, PRM and DIA performance evaluated from a core facility perspective.

    PubMed

    Kockmann, Tobias; Trachsel, Christian; Panse, Christian; Wahlander, Asa; Selevsek, Nathalie; Grossmann, Jonas; Wolski, Witold E; Schlapbach, Ralph

    2016-08-01

    Quantitative mass spectrometry is a rapidly evolving methodology applied in a large number of omics-type research projects. During the past years, new designs of mass spectrometers have been developed and launched as commercial systems, while in parallel new data acquisition schemes and data analysis paradigms have been introduced. Core facilities provide access to such technologies, but also actively support researchers in finding and applying the best-suited analytical approach. In order to establish a solid foundation for this decision-making process, core facilities need to constantly compare and benchmark the various approaches. In this article we compare the quantitative accuracy and precision of the current state-of-the-art targeted proteomics approaches, namely single reaction monitoring (SRM), parallel reaction monitoring (PRM) and data-independent acquisition (DIA), across multiple liquid chromatography mass spectrometry (LC-MS) platforms, using a readily available commercial standard sample. All workflows are able to reproducibly generate accurate quantitative data. However, SRM and PRM workflows show higher accuracy and precision compared to DIA approaches, especially when analyzing analytes at low concentrations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.


  4. Modeling Fatigue Damage Onset and Progression in Composites Using an Element-Based Virtual Crack Closure Technique Combined With the Floating Node Method

    NASA Technical Reports Server (NTRS)

    De Carvalho, Nelson V.; Krueger, Ronald

    2016-01-01

    A new methodology is proposed to model the onset and propagation of matrix cracks and delaminations in carbon-epoxy composites subject to fatigue loading. An extended interface element, based on the Floating Node Method, is developed to represent delaminations and matrix cracks explicitly in a mesh-independent fashion. Crack propagation is determined using an element-based Virtual Crack Closure Technique approach to determine mixed-mode energy release rates, and the Paris-Law relationship to obtain the crack growth rate. Crack onset is determined using a stress-based onset criterion coupled with a stress vs. cycle curve and the Palmgren-Miner rule to account for fatigue damage accumulation. The approach is implemented in Abaqus/Standard® via the user subroutine functionality. Verification exercises are performed to assess the accuracy and correct implementation of the approach. Finally, it is demonstrated that this approach captures the differences in fatigue failure morphology for two laminates of identical stiffness, but with layups containing ?deg plies that were either stacked in a single group or distributed through the laminate thickness.
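
    The Paris-Law ingredient named above can be illustrated with a simple constant-amplitude integration of crack length; the stress-intensity relation, geometry factor and material constants below are illustrative stand-ins, and the VCCT/Floating Node Method machinery of the paper is not reproduced.

      import math

      def cycles_to_grow(a0, a_max, C, m, delta_sigma, dN=1000, Y=1.12):
          """Integrate the Paris law da/dN = C * (dK)^m in blocks of dN cycles."""
          a, N = a0, 0
          while a < a_max:
              delta_K = Y * delta_sigma * math.sqrt(math.pi * a)   # MPa*sqrt(m)
              a += C * delta_K ** m * dN
              N += dN
          return N

      # e.g. grow a 1 mm flaw to 10 mm under a 120 MPa stress range
      print(cycles_to_grow(a0=1e-3, a_max=1e-2, C=1e-11, m=3.0, delta_sigma=120.0))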

  5. Implementation of Single Source Based Hospital Information System for the Catholic Medical Center Affiliated Hospitals

    PubMed Central

    Choi, Inyoung; Choi, Ran; Lee, Jonghyun

    2010-01-01

    Objectives The objective of this research is to introduce the unique approach the Catholic Medical Center (CMC) used to integrate its network hospitals, together with the organizational and technical methodologies adopted for seamless implementation. Methods The Catholic Medical Center developed a new hospital information system to connect network hospitals and adopted a new information technology architecture that uses a single source for multiple distributed hospital systems. Results The hospital information system of the CMC was developed to integrate the network hospitals following new system development principles: one source, one route and one management. This information architecture has reduced the cost of system development and operation, and has enhanced the efficiency of the management process. Conclusions Integrating network hospitals through an information system was not simple; it was much more complicated than a single-organization implementation. We are still looking for more efficient communication channels and decision-making processes, and also believe that our new system architecture will be able to improve the CMC health care system and provide much better quality of health care service to patients and customers. PMID:21818432

  6. Bio-inspired formation of functional calcite/metal oxide nanoparticle composites.

    PubMed

    Kim, Yi-Yeoun; Schenk, Anna S; Walsh, Dominic; Kulak, Alexander N; Cespedes, Oscar; Meldrum, Fiona C

    2014-01-21

    Biominerals are invariably composite materials, where occlusion of organic macromolecules within single crystals can significantly modify their properties. In this article, we take inspiration from this biogenic strategy to generate composite crystals in which magnetite (Fe3O4) and zincite (ZnO) nanoparticles are embedded within a calcite single crystal host, thereby endowing it with new magnetic or optical properties. While growth of crystals in the presence of small molecules, macromolecules and particles can lead to their occlusion within the crystal host, this approach requires particles with specific surface chemistries. Overcoming this limitation, we here precipitate crystals within a nanoparticle-functionalised xyloglucan gel, where gels can also be incorporated within single crystals, according to their rigidity. This method is independent of the nanoparticle surface chemistry and as the gel maintains its overall structure when occluded within the crystal, the nanoparticles are maintained throughout the crystal, preventing, for example, their movement and accumulation at the crystal surface during crystal growth. This methodology is expected to be quite general, and could be used to endow a wide range of crystals with new functionalities.

  7. Vibrational self-consistent field theory using optimized curvilinear coordinates.

    PubMed

    Bulik, Ireneusz W; Frisch, Michael J; Vaccaro, Patrick H

    2017-07-28

    A vibrational SCF model is presented in which the functions forming the single-mode functions in the product wavefunction are expressed in terms of internal coordinates and the coordinates used for each mode are optimized variationally. This model involves no approximations to the kinetic energy operator and does not require a Taylor-series expansion of the potential. The non-linear optimization of coordinates is found to give much better product wavefunctions than the limited variations considered in most previous applications of SCF methods to vibrational problems. The approach is tested using published potential energy surfaces for water, ammonia, and formaldehyde. Variational flexibility allowed in the current ansätze results in excellent zero-point energies expressed through single-product states and accurate fundamental transition frequencies realized by short configuration-interaction expansions. Fully variational optimization of single-product states for excited vibrational levels also is discussed. The highlighted methodology constitutes an excellent starting point for more sophisticated treatments, as the bulk characteristics of many-mode coupling are accounted for efficiently in terms of compact wavefunctions (as evident from the accurate prediction of transition frequencies).
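
    For readers unfamiliar with the ansatz referred to above, the generic VSCF product wavefunction and its mean-field single-mode equations can be written as below (standard textbook form; the paper's coordinate-optimised, curvilinear-coordinate variant is not reproduced here).

      % Generic VSCF ansatz and mean-field equations (standard form, illustrative only)
      \Psi(q_1,\dots,q_M) \;=\; \prod_{i=1}^{M} \phi_i(q_i), \qquad
      \hat{h}_i^{\mathrm{eff}} \;=\; \Big\langle \prod_{j\neq i}\phi_j \,\Big|\, \hat{H} \,\Big|\, \prod_{j\neq i}\phi_j \Big\rangle, \qquad
      \hat{h}_i^{\mathrm{eff}}\,\phi_i(q_i) \;=\; \varepsilon_i\,\phi_i(q_i).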

  8. Approach to Teaching Research Methodology for Information Technology

    ERIC Educational Resources Information Center

    Steenkamp, Annette Lerine; McCord, Samual Alan

    2007-01-01

    The paper reports on an approach to teaching a course in information technology research methodology in a doctoral program, the Doctor of Management in Information Technology (DMIT), in which research, with focus on finding innovative solutions to problems found in practice, comprises a significant part of the degree. The approach makes a…

  9. Multiple Cultures of Doing Geography Facilitate Global Studies

    ERIC Educational Resources Information Center

    Ahamer, Gilbert

    2013-01-01

    Purpose: This article aims to explain why geography is a prime discipline for analysing globalisation and a multicultural view of Global Studies. The generic approach of human geography to first select an appropriate methodology is taken as a key approach. Design/methodology/approach: Concepts from aggregate disciplines such as history, economics,…

  10. Using Q Methodology in the Literature Review Process: A Mixed Research Approach

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Frels, Rebecca K.

    2015-01-01

    Because of the mixed research-based nature of literature reviews, it is surprising, then, that insufficient information has been provided as to how reviewers can incorporate mixed research approaches into their literature reviews. Thus, in this article, we provide a mixed methods research approach--Q methodology--for analyzing information…

  11. Identifying Behavioral Barriers to Campus Sustainability: A Multi-Method Approach

    ERIC Educational Resources Information Center

    Horhota, Michelle; Asman, Jenni; Stratton, Jeanine P.; Halfacre, Angela C.

    2014-01-01

    Purpose: The purpose of this paper is to assess the behavioral barriers to sustainable action in a campus community. Design/methodology/approach: This paper reports three different methodological approaches to the assessment of behavioral barriers to sustainable actions on a college campus. Focus groups and surveys were used to assess campus…

  12. A Pareto frontier intersection-based approach for efficient multiobjective optimization of competing concept alternatives

    NASA Astrophysics Data System (ADS)

    Rousis, Damon A.

    The expected growth of civil aviation over the next twenty years places significant emphasis on revolutionary technology development aimed at mitigating the environmental impact of commercial aircraft. As the number of technology alternatives grows along with model complexity, current methods for Pareto finding and multiobjective optimization quickly become computationally infeasible. Coupled with the large uncertainty in the early stages of design, optimal designs are sought while avoiding the computational burden of excessive function calls when a single design change or technology assumption could alter the results. This motivates the need for a robust and efficient evaluation methodology for quantitative assessment of competing concepts. This research presents a novel approach that combines Bayesian adaptive sampling with surrogate-based optimization to efficiently place designs near Pareto frontier intersections of competing concepts. Efficiency is increased over sequential multiobjective optimization by focusing computational resources specifically on the location in the design space where optimality shifts between concepts. At the intersection of Pareto frontiers, the selection decisions are most sensitive to the preferences placed on the objectives, and small perturbations can lead to vastly different final designs. These concepts are incorporated into an evaluation methodology that ultimately reduces the number of failed cases, infeasible designs, and Pareto-dominated solutions across all concepts. A set of algebraic sample problems and a truss design problem are presented as canonical examples for the proposed approach. The methodology is applied to the design of ultra-high bypass ratio turbofans to guide NASA's technology development efforts for future aircraft. Geared-drive and variable geometry bypass nozzle concepts are explored as enablers for increased bypass ratio and as potential alternatives to traditional configurations. The method is shown to improve sampling efficiency and provide clusters of feasible designs that motivate a shift towards revolutionary technologies that reduce fuel burn, emissions, and noise on future aircraft.
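
    The Pareto machinery that the abstract builds on can be sketched compactly; the snippet below extracts the non-dominated set from sampled designs with all objectives minimised (the Bayesian adaptive sampling and surrogate models of the thesis are not reproduced, and the two-objective design table is purely illustrative).

      import numpy as np

      def pareto_front(points):
          """Return indices of non-dominated rows of `points` (all objectives minimised)."""
          pts = np.asarray(points, dtype=float)
          keep = []
          for i, p in enumerate(pts):
              dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
              if not dominated:
                  keep.append(i)
          return keep

      # Toy two-objective example, e.g. (fuel burn, noise) for competing designs
      designs = [(3.0, 9.0), (2.5, 9.5), (4.0, 7.0), (3.5, 8.0), (5.0, 6.5), (4.5, 8.5)]
      print(pareto_front(designs))   # -> [0, 1, 2, 3, 4]; the last design is dominated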

  13. Outcomes of the Bobath concept on upper limb recovery following stroke.

    PubMed

    Luke, Carolyn; Dodd, Karen J; Brock, Kim

    2004-12-01

    To determine the effectiveness of the Bobath concept at reducing upper limb impairments, activity limitations and participation restrictions after stroke. Electronic databases were searched to identify relevant trials published between 1966 and 2003. Two reviewers independently assessed articles for the following inclusion criteria: population of adults with upper limb disability after stroke; stated use of the Bobath concept aimed at improving upper limb disability in isolation from other approaches; outcomes reflecting changes in upper limb impairment, activity limitation or participation restriction. Of the 688 articles initially identified, eight met the inclusion criteria. Five were randomized controlled trials, one used a single-group crossover design and two were single-case design studies. Five studies measured impairments including shoulder pain, tone, muscle strength and motor control. The Bobath concept was found to reduce shoulder pain better than cryotherapy, and to reduce tone compared to no intervention and compared to proprioceptive neuromuscular facilitation (PNF). However, no difference was detected for changes in tone between the Bobath concept and a functional approach. Differences did not reach significance for measures of muscle strength and motor control. Six studies measured activity limitations, none of these found the Bobath concept was superior to other therapy approaches. Two studies measured changes in participation restriction and both found equivocal results. Comparisons of the Bobath concept with other approaches do not demonstrate superiority of one approach over the other at improving upper limb impairment, activity or participation. However, study limitations relating to methodological quality, the outcome measures used and contextual factors investigated limit the ability to draw conclusions. Future research should use sensitive upper limb measures, trained Bobath therapists and homogeneous samples to identify the influence of patient factors on the response to therapy approaches.

  14. Methodological and Epistemological Considerations in Utilizing Qualitative Inquiry to Develop Interventions.

    PubMed

    Duggleby, Wendy; Williams, Allison

    2016-01-01

    The purpose of this article is to discuss methodological and epistemological considerations involved in using qualitative inquiry to develop interventions. These considerations included (a) using diverse methodological approaches and (b) epistemological considerations such as generalization, de-contextualization, and subjective reality. Diverse methodological approaches have the potential to inform different stages of intervention development. Using the development of a psychosocial hope intervention for advanced cancer patients as an example, the authors utilized a thematic study to assess current theories/frameworks and interventions. However, to understand the processes that the intervention needed to target to affect change, grounded theory was used. Epistemological considerations provided a framework to understand and, further, critique the intervention. Using diverse qualitative methodological approaches and examining epistemological considerations were useful in developing an intervention that appears to foster hope in patients with advanced cancer. © The Author(s) 2015.

  15. Spanish methodological approach for biosphere assessment of radioactive waste disposal.

    PubMed

    Agüero, A; Pinedo, P; Cancio, D; Simón, I; Moraleda, M; Pérez-Sánchez, D; Trueba, C

    2007-10-01

    The development of radioactive waste disposal facilities requires implementation of measures that will afford protection of human health and the environment over a specific temporal frame that depends on the characteristics of the wastes. The repository design is based on a multi-barrier system: (i) the near-field or engineered barrier, (ii) far-field or geological barrier and (iii) the biosphere system. Here, the focus is on the analysis of this last system, the biosphere. A description is provided of conceptual developments, methodological aspects and software tools used to develop the Biosphere Assessment Methodology in the context of high-level waste (HLW) disposal facilities in Spain. This methodology is based on the BIOMASS "Reference Biospheres Methodology" and provides a logical and systematic approach with supplementary documentation that helps to support the decisions necessary for model development. It follows a five-stage approach, such that a coherent biosphere system description and the corresponding conceptual, mathematical and numerical models can be built. A discussion on the improvements implemented through application of the methodology to case studies in international and national projects is included. Some facets of this methodological approach still require further consideration, principally an enhanced integration of climatology, geography and ecology into models considering evolution of the environment, some aspects of the interface between the geosphere and biosphere, and an accurate quantification of environmental change processes and rates.

  16. Risk Mapping Case Study: Industrial Area Of Trinec Town (Czech Republic) potentially endangered by floods and landslides

    NASA Astrophysics Data System (ADS)

    Dobes, P.; Hrdina, P.; Kotatko, A.; Danihelka, P.; Bednarik, M.; Krejci, O.; Kasperakova, D.

    2009-04-01

    One of the current questions in natural and technological risk mapping, which has become important in recent years, is the analysis and assessment of selected types of multi-risks. This follows from relevant R&D projects as well as from international workshops and conferences. Various surveys and reported activities show that plenty of data and methodological approaches exist for single risk categories, but there is a lack of tested methodological approaches for multi-risks. Within the framework of the workgroup, a literature search of multi-risk assessment methodologies and innovations was carried out. The idea of this relatively small, local-scale case study arose during the 3rd Risk Mapping Workshop, coordinated by EC DG JRC, IPSC, in November 2007. The proposal was based on a previous risk analysis and assessment project carried out for the Frydek-Mistek County area (Czech Republic) in 2002. Several industrial facilities in Trinec are partly situated in the inundation area of the river Olše and are partly protected by concrete barriers built on its banks. These banks are unstable and in permanent slow movement. If the reinforced-concrete barriers were overflowed by water as the result of a sudden bank landslide or flood wave, several industrial accidents could be triggered at steel and energy production facilities. The area is highly developed from a demographic and socioeconomic point of view, and is well covered by geological, engineering-geological and hydrogeological investigation. The most important accident scenarios in the area were developed by What-If analysis and black-box analysis (qualitative development of several different scenarios). A few years later, further QRA analyses of industrial risks were carried out separately, prompted by the District Office, the public and Seveso II Directive requirements. General scenarios of multi-hazard events were considered. In the case study, three methodologies were applied to assess hazard and risk: a qualitative approach based on the German risk-matrix methodology; a quantitative approach based on statistical methods previously used for the area between the towns of Hlohovec and Sered in Slovakia; and a quantitative approach modelling floods on the river Olse with the HEC-RAS model. Expert assessment was also used to evaluate the impacts of selected scenarios on the facilities and the public, including an evaluation of the existing barriers. The preliminary results indicate that flooding of the industrial facilities is less probable owing to the existing barriers, but several useful recommendations for similarly exposed areas could be derived. Acknowledgements: this work is partially supported by the Czech Ministry of the Environment within the R&D project "Comprehensive Interactions between Natural Processes and Industry with Regard to Major Accident Prevention and Emergency Planning" (Registration Number: SPII 1a10 45/07).

  17. Methodological quality of meta-analyses of single-case experimental studies.

    PubMed

    Jamshidi, Laleh; Heyvaert, Mieke; Declercq, Lies; Fernández-Castilla, Belén; Ferron, John M; Moeyaert, Mariola; Beretvas, S Natasha; Onghena, Patrick; Van den Noortgate, Wim

    2017-12-28

    Methodological rigor is a fundamental factor in the validity and credibility of the results of a meta-analysis. Following an increasing interest in single-case experimental design (SCED) meta-analyses, the current study investigates the methodological quality of SCED meta-analyses. We assessed the methodological quality of 178 SCED meta-analyses published between 1985 and 2015 through the modified Revised-Assessment of Multiple Systematic Reviews (R-AMSTAR) checklist. The main finding of the current review is that the methodological quality of the SCED meta-analyses has increased over time, but is still low according to the R-AMSTAR checklist. A remarkable percentage of the studies (93.80% of the included SCED meta-analyses) did not even reach the midpoint score (22, on a scale of 0-44). The mean and median methodological quality scores were 15.57 and 16, respectively. Relatively high scores were observed for "providing the characteristics of the included studies" and "doing comprehensive literature search". The key areas of deficiency were "reporting an assessment of the likelihood of publication bias" and "using the methods appropriately to combine the findings of studies". Although the results of the current review reveal that the methodological quality of the SCED meta-analyses has increased over time, still more efforts are needed to improve their methodological quality. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Analysis of phase II methodologies for single-arm clinical trials with multiple endpoints in rare cancers: An example in Ewing’s sarcoma

    PubMed Central

    Dutton, P; Love, SB; Billingham, L; Hassan, AB

    2016-01-01

    Trials run in either rare diseases, such as rare cancers, or rare sub-populations of common diseases are challenging in terms of identifying, recruiting and treating sufficient patients in a sensible period. Treatments for rare diseases are often designed for other disease areas and then later proposed as possible treatments for the rare disease after initial phase I testing is complete. To ensure the trial is in the best interests of the patient participants, frequent interim analyses are needed to force the trial to stop promptly if the treatment is futile or toxic. These non-definitive phase II trials should also be stopped for efficacy to accelerate research progress if the treatment proves to be particularly promising. In this paper, we review frequentist and Bayesian methods that have been adapted to incorporate two binary endpoints and frequent interim analyses. The Eurosarc Trial of Linsitinib in advanced Ewing Sarcoma (LINES) is used as a motivating example and provides a suitable platform to compare these approaches. The Bayesian approach provides greater design flexibility, but does not provide additional value over the frequentist approaches in a single trial setting when the prior is non-informative. However, Bayesian designs are able to borrow from any previous experience, using prior information to improve efficiency. PMID:27587590
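
    One Bayesian interim rule of the kind compared above can be sketched with a Beta-Binomial posterior for a single binary endpoint; the prior, target response rate and futility threshold below are illustrative assumptions, not the LINES trial's actual design parameters.

      from scipy.stats import beta

      def stop_for_futility(r, n, p0=0.3, prior=(1, 1), threshold=0.05):
          """Stop if the posterior probability that the response rate exceeds p0 is small."""
          a, b = prior
          prob_promising = 1 - beta.cdf(p0, a + r, b + n - r)   # P(p > p0 | r responses in n)
          return prob_promising < threshold

      print(stop_for_futility(r=1, n=15))   # only 1/15 responders -> True (stop for futility)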

  19. Improving the Method of Roof Fall Susceptibility Assessment based on Fuzzy Approach

    NASA Astrophysics Data System (ADS)

    Ghasemi, Ebrahim; Ataei, Mohammad; Shahriar, Kourosh

    2017-03-01

    Retreat mining is always accompanied by a great number of accidents, and most of them are due to roof fall. Therefore, the development of methodologies to evaluate roof fall susceptibility (RFS) seems essential. Ghasemi et al. (2012) proposed a systematic methodology to assess roof fall risk during retreat mining based on the classic risk assessment approach. The main shortcoming of that method is that it ignores subjective uncertainties arising from the linguistic input values of some factors, low resolution, fixed weighting, sharp class boundaries, etc. To remedy this deficiency and improve the method, this paper presents a novel methodology for assessing RFS using a fuzzy approach. The fuzzy approach provides an effective tool for handling subjective uncertainties. Furthermore, the fuzzy analytic hierarchy process (AHP) is used to structure and prioritize the various risk factors and sub-factors during the development of the method. The methodology is applied to identify the susceptibility of roof fall occurrence in the main panel of the Tabas Central Mine (TCM), Iran. The results indicate that the methodology is effective and efficient in assessing RFS.
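
    One common way to obtain fuzzy AHP weights of the kind mentioned above is Buckley's geometric-mean method with triangular fuzzy numbers; the sketch below is a generic illustration of that step (the 2x2 comparison matrix is invented, and the paper's actual factor hierarchy and judgements are not reproduced).

      import numpy as np

      def fuzzy_ahp_weights(tfn_matrix):
          """tfn_matrix[i][j] = (l, m, u) triangular fuzzy pairwise comparison."""
          M = np.asarray(tfn_matrix, dtype=float)      # shape (n, n, 3)
          n = M.shape[0]
          geo = np.prod(M, axis=1) ** (1.0 / n)        # fuzzy geometric mean per row
          total = geo.sum(axis=0)                      # (l_sum, m_sum, u_sum)
          fuzzy_w = geo / total[::-1]                  # multiply by the fuzzy reciprocal of the sum
          crisp = fuzzy_w.mean(axis=1)                 # centroid defuzzification
          return crisp / crisp.sum()                   # normalised factor weights

      # Two risk factors, the first judged weakly more important than the second
      pairwise = [[(1, 1, 1), (1, 2, 3)],
                  [(1/3, 1/2, 1), (1, 1, 1)]]
      print(fuzzy_ahp_weights(pairwise).round(3))      # roughly [0.64, 0.36]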

  20. Critical dialogical approach: A methodological direction for occupation-based social transformative work.

    PubMed

    Farias, Lisette; Laliberte Rudman, Debbie; Pollard, Nick; Schiller, Sandra; Serrata Malfitano, Ana Paula; Thomas, Kerry; van Bruggen, Hanneke

    2018-05-03

    Calls for embracing the potential and responsibility of occupational therapy to address socio-political conditions that perpetuate occupational injustices have materialized in the literature. However, to reach beyond traditional frameworks informing practices, this social agenda requires the incorporation of diverse epistemological and methodological approaches to support action commensurate with social transformative goals. Our intent is to present a methodological approach that can help extend the ways of thinking or frameworks used in occupational therapy and science to support the ongoing development of practices with and for individuals and collectives affected by marginalizing conditions. We describe the epistemological and theoretical underpinnings of a methodological approach drawing on Freire's and Bakhtin's work. Integrating our shared experience of taking part in an example study, we discuss the unique advantages of co-generating data using two methods aligned with this approach: dialogical interviews and critical reflexivity. Key considerations when employing this approach are presented, based on its proposed epistemological and theoretical stance and our shared experiences engaging in it. A critical dialogical approach offers one way forward in expanding occupational therapy and science scholarship by promoting collaborative knowledge generation and the examination of taken-for-granted understandings that shape individuals' assumptions and actions.

  1. A partially coupled, fraction-by-fraction modelling approach to the subsurface migration of gasoline spills

    NASA Astrophysics Data System (ADS)

    Fagerlund, F.; Niemi, A.

    2007-01-01

    The subsurface spreading behaviour of gasoline, as well as of several other common soil and groundwater pollutants (e.g. diesel, creosote), is complicated by the fact that it is a mixture of hundreds of different constituents, behaving differently with respect to e.g. dissolution, volatilisation, adsorption and biodegradation. Especially for scenarios where the non-aqueous phase liquid (NAPL) phase is highly mobile, such as for sudden spills in connection with accidents, it is necessary to simultaneously analyse the migration of the NAPL and its individual components in order to assess risks and environmental impacts. Although a few fully coupled, multi-phase, multi-constituent models exist, such models are highly complex and may be time consuming to use. A new, somewhat simplified methodology for modelling the subsurface migration of gasoline while taking its multi-constituent nature into account is therefore introduced here. Constituents with similar properties are grouped together into eight fractions. The migration of each fraction in the aqueous and gaseous phases, as well as adsorption, is modelled separately using a single-constituent multi-phase flow model, while the movement of the free-phase gasoline is essentially the same for all fractions. The modelling is done stepwise to allow updating of the free-phase gasoline composition at certain time intervals. The output is the concentration of the eight different fractions in the aqueous, gaseous, free-gasoline and solid phases with time. The approach is evaluated by comparing it to a fully coupled multi-phase, multi-constituent numerical simulator in the modelling of a typical accident-type spill scenario, based on a tanker accident in northern Sweden. Here the partially coupled, fraction-by-fraction (PCFF) method produces results similar to those of the more sophisticated, fully coupled model. The benefit of the method is that it is easy to use and can be applied with any single-constituent multi-phase numerical simulator, which in turn may have different strengths in incorporating various processes. The results demonstrate that the different fractions have significantly different migration behaviours, and although the methodology involves some simplifications, it is a considerable improvement compared to modelling the gasoline constituents completely individually or as one single mixture.
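
    The stepwise, fraction-by-fraction bookkeeping described above can be caricatured in a few lines; the function fraction_mass_loss is a toy stand-in for a single-constituent multi-phase flow step (it is not a real simulator API), and the solubilities and compositions are invented.

      def fraction_mass_loss(solubility, mole_frac, dt, k=0.01):
          # Toy first-order loss to dissolution, scaled Raoult's-law style by mole fraction.
          return k * solubility * mole_frac * dt

      def pcff_composition(solubilities, mole_fractions, n_steps, dt):
          comp = dict(mole_fractions)                  # current free-gasoline composition
          for _ in range(n_steps):
              remaining = {f: max(comp[f] - fraction_mass_loss(s, comp[f], dt), 0.0)
                           for f, s in solubilities.items()}
              total = sum(remaining.values()) or 1.0
              comp = {f: x / total for f, x in remaining.items()}   # renormalise between steps
          return comp

      # Two illustrative fractions: a soluble aromatic-like cut and a sparingly soluble aliphatic cut
      print(pcff_composition({"aromatics": 1.8, "aliphatics": 0.05},
                             {"aromatics": 0.4, "aliphatics": 0.6}, n_steps=50, dt=1.0))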

  2. Methodology for locating defects within hardwood logs and determining their impact on lumber-value yield

    Treesearch

    Thomas Harless; Francis G. Wagner; Phillip Steele; Fred Taylor; Vikram Yadama; Charles W. McMillin

    1991-01-01

    A precise research methodology is described by which internal log-defect locations may help select hardwood log orientation and sawing procedure to improve lumber value. Procedures for data collection, data handling, simulated sawing, and data analysis are described. A single test log verified the methodology. Results from this log showed significant differences in...

  3. Alternative Methods of Base Level Demand Forecasting for Economic Order Quantity Items,

    DTIC Science & Technology

    1975-12-01

    Adaptive Single Exponential Smoothing; Choosing the Smoothing Constant; ... the methodology used in the study, an analysis of results, and a detailed summary. Chapter I, Methodology, contains a description of the data, a... Chapter IV, Detailed Summary, presents a detailed summary of the findings, lists the limitations inherent in the research methodology, and...

  4. A new statistical methodology predicting chip failure probability considering electromigration

    NASA Astrophysics Data System (ADS)

    Sun, Ted

    In this research thesis, we present a new approach to analyzing chip reliability subject to electromigration (EM); the fundamental causes of EM and how the phenomenon manifests in different materials are also presented. This new approach utilizes the statistical nature of EM failure in order to assess overall EM risk. It includes within-die temperature variations, taken from the chip's temperature map extracted by an Electronic Design Automation (EDA) tool, to estimate the failure probability of a design. Both the power estimation and the thermal analysis are performed in the EDA flow. We first used the traditional EM approach to analyze a design involving 6 metal and 5 via layers with a single temperature across the entire chip. Next, we used the same traditional approach but with a realistic temperature map. The traditional EM analysis approach, the same approach coupled with a temperature map, and the comparison between the results with and without the temperature map are presented in this research. This comparison confirms that using a temperature map yields a less pessimistic estimation of the chip's EM risk. Finally, we employed the statistical methodology we developed, considering a temperature map and different use-condition voltages and frequencies, to estimate the overall failure probability of the chip. The statistical model accounts for scaling through the traditional Black equation and four major use conditions. The statistical result comparisons are within our expectations. The results of this statistical analysis confirm that the chip-level failure probability is higher i) at higher use-condition frequencies for all use-condition voltages, and ii) when a single temperature instead of a temperature map across the chip is considered. In this thesis, I start with an overall review of current design types, common flows, and the necessary verification and reliability-checking steps used in the IC design industry. Furthermore, the important concepts of "Scripting Automation", used to integrate the diverse EDA tools in this research, are described in detail with several examples, and the completed code is included in the appendix for reference. This structure is intended to give readers a thorough understanding of the research, from the automation of EDA tools to statistical data generation, from the nature of EM to the construction of the statistical model, and the comparisons between the traditional and statistical EM analysis approaches.
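
    The statistical ingredient named above (the Black equation combined with a failure-time distribution, evaluated per region of a temperature map) can be sketched as follows; all parameter values, the lognormal shape and the independence assumption are illustrative, not the thesis's calibrated model.

      import numpy as np
      from scipy.stats import norm

      K_B = 8.617e-5   # Boltzmann constant, eV/K

      def segment_fail_prob(t_hours, j, T_kelvin, A=1e-5, n=2.0, Ea=0.8, sigma=0.4):
          mttf = A * j ** (-n) * np.exp(Ea / (K_B * T_kelvin))   # Black's equation (median life)
          return norm.cdf(np.log(t_hours / mttf) / sigma)        # lognormal failure probability

      def chip_fail_prob(t_hours, segments):
          """Chip fails if any segment fails (independence assumed for this sketch)."""
          p_survive = 1.0
          for j, T in segments:                                  # (current density, local temperature)
              p_survive *= 1.0 - segment_fail_prob(t_hours, j, T)
          return 1.0 - p_survive

      # Temperature map (one hot, one cooler region) vs. a uniform worst-case temperature
      print(chip_fail_prob(1e5, [(1.5, 400.0), (1.0, 360.0)]))   # map: lower, less pessimistic
      print(chip_fail_prob(1e5, [(1.5, 400.0), (1.0, 400.0)]))   # single hot temperature: higher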

  5. Methods for the guideline-based development of quality indicators--a systematic review

    PubMed Central

    2012-01-01

    Background Quality indicators (QIs) are used in many healthcare settings to measure, compare, and improve quality of care. For the efficient development of high-quality QIs, rigorous, approved, and evidence-based development methods are needed. Clinical practice guidelines are a suitable source to derive QIs from, but no gold standard for guideline-based QI development exists. This review aims to identify, describe, and compare methodological approaches to guideline-based QI development. Methods We systematically searched medical literature databases (Medline, EMBASE, and CINAHL) and grey literature. Two researchers selected publications reporting methodological approaches to guideline-based QI development. In order to describe and compare methodological approaches used in these publications, we extracted detailed information on common steps of guideline-based QI development (topic selection, guideline selection, extraction of recommendations, QI selection, practice test, and implementation) to predesigned extraction tables. Results From 8,697 hits in the database search and several grey literature documents, we selected 48 relevant references. The studies were of heterogeneous type and quality. We found no randomized controlled trial or other studies comparing the ability of different methodological approaches to guideline-based development to generate high-quality QIs. The relevant publications featured a wide variety of methodological approaches to guideline-based QI development, especially regarding guideline selection and extraction of recommendations. Only a few studies reported patient involvement. Conclusions Further research is needed to determine which elements of the methodological approaches identified, described, and compared in this review are best suited to constitute a gold standard for guideline-based QI development. For this research, we provide a comprehensive groundwork. PMID:22436067

  6. Evaluation of complex community-based childhood obesity prevention interventions.

    PubMed

    Karacabeyli, D; Allender, S; Pinkney, S; Amed, S

    2018-05-16

    Multi-setting, multi-component community-based interventions have shown promise in preventing childhood obesity; however, evaluation of these complex interventions remains a challenge. The objective of the study is to systematically review published methodological approaches to outcome evaluation for multi-setting community-based childhood obesity prevention interventions and synthesize a set of pragmatic recommendations. MEDLINE, CINAHL and PsycINFO were searched from inception to 6 July 2017. Papers were included if the intervention targeted children ≤18 years, engaged at least two community sectors and described their outcome evaluation methodology. A single reviewer conducted title and abstract scans, full article review and data abstraction. Directed content analysis was performed by three reviewers to identify prevailing themes. Thirty-three studies were included, and of these, 26 employed a quasi-experimental design; the remaining were randomized control trials. Body mass index was the most commonly measured outcome, followed by health behaviour change and psychosocial outcomes. Six themes emerged, highlighting advantages and disadvantages of active vs. passive consent, quasi-experimental vs. randomized control trials, longitudinal vs. repeat cross-sectional designs and the roles of process evaluation and methodological flexibility in evaluating complex interventions. Selection of study designs and outcome measures compatible with community infrastructure, accompanied by process evaluation, may facilitate successful outcome evaluation. © 2018 World Obesity Federation.

  7. Methodology Developed for Modeling the Fatigue Crack Growth Behavior of Single-Crystal, Nickel-Base Superalloys

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Because of their superior high-temperature properties, gas generator turbine airfoils made of single-crystal, nickel-base superalloys are fast becoming the standard equipment on today's advanced, high-performance aerospace engines. The increased temperature capabilities of these airfoils have allowed for a significant increase in the operating temperatures in turbine sections, resulting in superior propulsion performance and greater efficiencies. However, the previously developed methodologies for life-prediction models are based on experience with polycrystalline alloys and may not be applicable to single-crystal alloys under certain operating conditions. One of the main areas where behavior differences between single-crystal and polycrystalline alloys are readily apparent is subcritical fatigue crack growth (FCG). The NASA Lewis Research Center's work in this area enables accurate prediction of the subcritical fatigue crack growth behavior in single-crystal, nickel-based superalloys at elevated temperatures.

  8. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
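
    As a hedged illustration of the PFA idea of propagating parameter and model-accuracy uncertainty into a failure probability, the Monte Carlo sketch below uses a toy stress-versus-strength limit state; the distributions are invented, and the methodology's Bayesian updating with test and flight experience is not shown.

      import numpy as np

      rng = np.random.default_rng(0)

      def failure_probability(n_samples=200_000):
          stress      = rng.normal(420.0, 35.0, n_samples)               # applied stress, MPa
          strength    = rng.lognormal(np.log(600.0), 0.08, n_samples)    # material strength, MPa
          model_error = rng.normal(1.0, 0.05, n_samples)                 # analysis-model accuracy factor
          return np.mean(stress * model_error > strength)

      print(failure_probability())   # roughly 2e-3 with these illustrative numbers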

  9. Measures of outdoor play and independent mobility in children and youth: A methodological review.

    PubMed

    Bates, Bree; Stone, Michelle R

    2015-09-01

    Declines in children's outdoor play have been documented globally, which are partly due to heightened restrictions around children's independent mobility. Literature on outdoor play and children's independent mobility is increasing, yet no paper has summarized the various methodological approaches used. A methodological review could highlight most commonly used measures and comprehensive research designs that could result in more standardized methodological approaches. Methodological review. A standardized protocol guided a methodological review of published research on measures of outdoor play and children's independent mobility in children and youth (0-18 years). Online searches of 8 electronic databases were conducted and studies included if they contained a subjective/objective measure of outdoor play or children's independent mobility. References of included articles were scanned to identify additional articles. Twenty-four studies were included on outdoor play, and twenty-three on children's independent mobility. Study designs were diverse. Common objective measures included accelerometry, global positioning systems and direct observation; questionnaires, surveys and interviews were common subjective measures. Focus groups, activity logs, monitoring sheets, travel/activity diaries, behavioral maps and guided tours were also utilized. Questionnaires were used most frequently, yet few studies used the same questionnaire. Five studies employed comprehensive, mixed-methods designs. Outdoor play and children's independent mobility have been measured using a wide variety of techniques, with only a few studies using similar methodologies. A standardized methodological approach does not exist. Future researchers should consider including both objective measures (accelerometry and global positioning systems) and subjective measures (questionnaires, activity logs, interviews), as more comprehensive designs will enhance understanding of each multidimensional construct. Creating a standardized methodological approach would improve study comparisons. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  10. A Comparison of Single-Purpose and Non-Single-Purpose Clinical Education on the Retention Rates of Registered Nursing Graduates

    ERIC Educational Resources Information Center

    Bush, Dana I.

    2012-01-01

    There are 26 single-purpose nursing programs in the United States. They are nursing programs operated by hospitals with the single purpose of supplying the hospitals with well-prepared health care staff. Using a quantitative methodology and an ex post facto design, this study compared employment retention rates between single-purpose and…

  11. Experimental Methodology for Measuring Combustion and Injection-Coupled Responses

    NASA Technical Reports Server (NTRS)

    Cavitt, Ryan C.; Frederick, Robert A.; Bazarov, Vladimir G.

    2006-01-01

    A Russian scaling methodology for liquid rocket engines utilizing a single, full-scale element is reviewed. The scaling methodology exploits the supercritical phase of the full-scale propellants to simplify scaling requirements. Many assumptions are utilized in the derivation of the scaling criteria. A test apparatus design is presented to implement the Russian methodology and consequently verify the assumptions. This test apparatus will allow researchers to assess the usefulness of the scaling procedures and possibly enhance the methodology. A matrix of the apparatus capabilities for an RD-170 injector is also presented. Several methods to enhance the methodology have been generated through the design process.

  12. Approaches to capturing the financial cost of family care-giving within a palliative care context: a systematic review.

    PubMed

    Gardiner, Clare; Brereton, Louise; Frey, Rosemary; Wilkinson-Meyers, Laura; Gott, Merryn

    2016-09-01

    The economic burden faced by family caregivers of people at the end of life is well recognised. Financial burden has a significant impact on the provision of family care-giving in the community setting, but has seen limited research attention. A systematic review with realist review synthesis and thematic analysis was undertaken to identify literature relating to the financial costs and impact of family care-giving at the end of life. This paper reports findings relating to previously developed approaches which capture the financial costs and implications of caring for family members receiving palliative/end-of-life care. Seven electronic databases were searched from inception to April 2012, for original research studies relating to the financial impact of care-giving at the end of life. Studies were independently screened to identify those which met the study inclusion criteria, and the methodological quality of included studies was appraised using realist review criteria of relevance and rigour. A descriptive thematic approach was used to synthesise data. Twelve articles met the inclusion criteria for the review. Various approaches to capturing data on the financial costs of care-giving at the end of life were noted; however, no single tool was identified with the sole purpose of exploring these costs. The majority of approaches used structured questionnaires and were administered by personal interview, with most studies using longitudinal designs. Calculation of costs was most often based on recall by patients and family caregivers, in some studies combined with objective measures of resource use. While the studies in this review provide useful data on approaches to capturing costs of care-giving, more work is needed to develop methods which accurately and sensitively capture the financial costs of caring at the end of life. Methodological considerations include study design and method of administration, contextual and cultural relevance, and accuracy of cost estimates. © 2015 John Wiley & Sons Ltd.

  13. Stabilization of perturbed Boolean network attractors through compensatory interactions

    PubMed Central

    2014-01-01

    Background Understanding and ameliorating the effects of network damage are of significant interest, due in part to the variety of applications in which network damage is relevant. For example, the effects of genetic mutations can cascade through within-cell signaling and regulatory networks and alter the behavior of cells, possibly leading to a wide variety of diseases. The typical approach to mitigating network perturbations is to consider the compensatory activation or deactivation of system components. Here, we propose a complementary approach wherein interactions are instead modified to alter key regulatory functions and prevent the network damage from triggering a deregulatory cascade. Results We implement this approach in a Boolean dynamic framework, which has been shown to effectively model the behavior of biological regulatory and signaling networks. We show that the method can stabilize any single state (e.g., fixed point attractors or time-averaged representations of multi-state attractors) to be an attractor of the repaired network. We show that the approach is minimalistic in that few modifications are required to provide stability to a chosen attractor and specific in that interventions do not have undesired effects on the attractor. We apply the approach to random Boolean networks, and further show that the method can in some cases successfully repair synchronous limit cycles. We also apply the methodology to case studies from drought-induced signaling in plants and T-LGL leukemia and find that it is successful in both stabilizing desired behavior and in eliminating undesired outcomes. Code is made freely available through the software package BooleanNet. Conclusions The methodology introduced in this report offers a complementary way to manipulating node expression levels. A comprehensive approach to evaluating network manipulation should take an "all of the above" perspective; we anticipate that theoretical studies of interaction modification, coupled with empirical advances, will ultimately provide researchers with greater flexibility in influencing system behavior. PMID:24885780
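
    The Boolean-dynamics setting described above can be made concrete with a tiny toy network and a brute-force search for fixed-point attractors; the three-node rules are invented for illustration, and the paper's repair algorithm (choosing interaction modifications that stabilise a target state) is not reproduced.

      from itertools import product

      # Toy synchronous Boolean network: each node's next value is a function of the full state
      rules = {
          "A": lambda s: s["A"] and not s["C"],
          "B": lambda s: s["A"] or s["B"],
          "C": lambda s: s["B"] and not s["A"],
      }

      def step(state):
          return {node: bool(f(state)) for node, f in rules.items()}

      def fixed_points():
          states = [dict(zip(rules, bits)) for bits in product([False, True], repeat=len(rules))]
          return [s for s in states if step(s) == s]

      for fp in fixed_points():
          print(fp)   # the network's fixed-point attractors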

  14. Early Amyloidogenic Oligomerization Studied through Fluorescence Lifetime Correlation Spectroscopy

    PubMed Central

    Paredes, Jose M.; Casares, Salvador; Ruedas-Rama, Maria J.; Fernandez, Elena; Castello, Fabio; Varela, Lorena; Orte, Angel

    2012-01-01

    Amyloidogenic protein aggregation is a persistent biomedical problem. Despite active research in disease-related aggregation, the need for multidisciplinary approaches to the problem is evident. Recent advances in single-molecule fluorescence spectroscopy are valuable for examining heterogenic biomolecular systems. In this work, we have explored the initial stages of amyloidogenic aggregation by employing fluorescence lifetime correlation spectroscopy (FLCS), an advanced modification of conventional fluorescence correlation spectroscopy (FCS) that utilizes time-resolved information. FLCS provides size distributions and kinetics for the oligomer growth of the SH3 domain of α-spectrin, whose N47A mutant forms amyloid fibrils at pH 3.2 and 37 °C in the presence of salt. The combination of FCS with additional fluorescence lifetime information provides an exciting approach to focus on the initial aggregation stages, allowing a better understanding of the fibrillization process, by providing multidimensional information, valuable in combination with other conventional methodologies. PMID:22949804

  15. Hybrid cardiac imaging with MR-CAT scan: a feasibility study.

    PubMed

    Hillenbrand, C; Sandstede, J; Pabst, T; Hahn, D; Haase, A; Jakob, P M

    2000-06-01

    We demonstrate the feasibility of a new versatile hybrid imaging concept, the combined acquisition technique (CAT), for cardiac imaging. The cardiac CAT approach, which combines new methodology with existing technology, essentially integrates fast low-angle shot (FLASH) and echoplanar imaging (EPI) modules in a sequential fashion, whereby each acquisition module is employed with independently optimized imaging parameters. One important CAT sequence optimization feature is the ability to use different bandwidths for different acquisition modules. Twelve healthy subjects were imaged using three cardiac CAT acquisition strategies: a) CAT was used to reduce breath-hold duration times while maintaining constant spatial resolution; b) CAT was used to increase spatial resolution in a given breath-hold time; and c) single-heart beat CAT imaging was performed. The results obtained demonstrate the feasibility of cardiac imaging using the CAT approach and the potential of this technique to accelerate the imaging process with almost conserved image quality. Copyright 2000 Wiley-Liss, Inc.

  16. Physiological utility theory and the neuroeconomics of choice

    PubMed Central

    Glimcher, Paul W.; Dorris, Michael C.; Bayer, Hannah M.

    2006-01-01

    Over the past half century, economists have responded to the challenges that Allais [Econometrica (1953) 53], Ellsberg [Quart. J. Econ. (1961) 643] and others raised to neoclassicism either by bounding the reach of economic theory or by turning to descriptive approaches. While both of these strategies have been enormously fruitful, neither has provided a clear programmatic approach that aspires to a complete understanding of human decision making as neoclassicism did. There is, however, growing evidence that economists and neurobiologists are now beginning to reveal the physical mechanisms by which the human neuroarchitecture accomplishes decision making. Although in their infancy, these studies suggest both a single unified framework for understanding human decision making and a methodology for constraining the scope and structure of economic theory. Indeed, there is already evidence that these studies place mathematical constraints on existing economic models. This article reviews some of those constraints and suggests the outline of a neuroeconomic theory of decision. PMID:16845435

  17. Validating Coherence Measurements Using Aligned and Unaligned Coherence Functions

    NASA Technical Reports Server (NTRS)

    Miles, Jeffrey Hilton

    2006-01-01

    This paper describes a novel approach based on the use of coherence functions and statistical theory for sensor validation in a harsh environment. By the use of aligned and unaligned coherence functions and statistical theory one can test for sensor degradation, total sensor failure or changes in the signal. This advanced diagnostic approach and the novel data processing methodology discussed provide a single number that conveys this information. This number, as calculated with standard statistical procedures for comparing the means of two distributions, is compared with results obtained using Yuen's robust statistical method to create confidence intervals. Examination of experimental data from Kulite pressure transducers mounted in a Pratt & Whitney PW4098 combustor using spectrum analysis methods on aligned and unaligned time histories has verified the effectiveness of the proposed method. All the procedures produce good results, demonstrating the robustness of the technique.
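
    As a rough illustration of the idea rather than the paper's exact procedure, the Python sketch below compares the coherence of two synthetic sensor channels when their time histories are aligned with the coherence obtained after one channel is deliberately shifted out of alignment; the sampling rate, frequency band and shift are invented values, and the difference of the band-averaged coherences stands in for the single summary number.

      import numpy as np
      from scipy.signal import coherence

      # Two synthetic sensor channels sharing a broadband component plus independent noise.
      fs = 4096.0                                   # sampling rate in Hz (illustrative)
      n = 8 * int(fs)
      rng = np.random.default_rng(0)
      common = rng.standard_normal(n)               # shared broadband signal
      x = common + 0.7 * rng.standard_normal(n)     # sensor 1
      y = common + 0.7 * rng.standard_normal(n)     # sensor 2

      f, c_aligned = coherence(x, y, fs=fs, nperseg=1024)
      f, c_unaligned = coherence(x, np.roll(y, 5000), fs=fs, nperseg=1024)

      # Single summary number: band-averaged aligned minus unaligned coherence.
      band = (f > 100.0) & (f < 1000.0)
      score = c_aligned[band].mean() - c_unaligned[band].mean()
      print(f"aligned-minus-unaligned coherence score: {score:.3f}")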

  18. Translating Radiometric Requirements for Satellite Sensors to Match International Standards.

    PubMed

    Pearlman, Aaron; Datla, Raju; Kacker, Raghu; Cao, Changyong

    2014-01-01

    International scientific standards organizations created standards on evaluating uncertainty in the early 1990s. Although scientists from many fields use these standards, they are not consistently implemented in the remote sensing community, where the traditional error analysis framework persists. For a satellite instrument under development, this can create confusion in showing whether requirements are met. We aim to create a methodology for translating requirements from the error analysis framework to the modern uncertainty approach using the product level requirements of the Advanced Baseline Imager (ABI) that will fly on the Geostationary Operational Environmental Satellite R-Series (GOES-R). In this paper we prescribe a method to combine several measurement performance requirements, written using a traditional error analysis framework, into a single specification using the propagation of uncertainties formula. By using this approach, scientists can communicate requirements in a consistent uncertainty framework leading to uniform interpretation throughout the development and operation of any satellite instrument.
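
    As an illustration of the combination step described above (with invented component names and values, not the actual ABI requirements), the sketch below collapses several independent requirement terms into one combined standard uncertainty via the root-sum-of-squares form of the propagation-of-uncertainties formula.

      import math

      # Hypothetical component standard uncertainties, in percent of measured radiance.
      # Names and values are illustrative only, not the ABI/GOES-R requirements.
      components = {
          "calibration_bias": 0.30,
          "detector_noise": 0.20,
          "nonlinearity": 0.10,
          "stray_light": 0.15,
      }

      # For independent terms with unit sensitivity coefficients, the propagation of
      # uncertainties formula reduces to the root sum of squares.
      combined = math.sqrt(sum(u ** 2 for u in components.values()))
      expanded = 2.0 * combined  # expanded uncertainty with coverage factor k = 2

      print(f"combined standard uncertainty: {combined:.3f} %")
      print(f"expanded uncertainty (k = 2):  {expanded:.3f} %")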

  19. Translating Radiometric Requirements for Satellite Sensors to Match International Standards

    PubMed Central

    Pearlman, Aaron; Datla, Raju; Kacker, Raghu; Cao, Changyong

    2014-01-01

    International scientific standards organizations created standards on evaluating uncertainty in the early 1990s. Although scientists from many fields use these standards, they are not consistently implemented in the remote sensing community, where the traditional error analysis framework persists. For a satellite instrument under development, this can create confusion in showing whether requirements are met. We aim to create a methodology for translating requirements from the error analysis framework to the modern uncertainty approach using the product level requirements of the Advanced Baseline Imager (ABI) that will fly on the Geostationary Operational Environmental Satellite R-Series (GOES-R). In this paper we prescribe a method to combine several measurement performance requirements, written using a traditional error analysis framework, into a single specification using the propagation of uncertainties formula. By using this approach, scientists can communicate requirements in a consistent uncertainty framework leading to uniform interpretation throughout the development and operation of any satellite instrument. PMID:26601032

  20. Conceptual Challenges of the Systemic Approach in Understanding Cell Differentiation.

    PubMed

    Paldi, Andras

    2018-01-01

    The cells of a multicellular organism are derived from a single zygote and genetically identical. Yet, they are phenotypically very different. This difference is the result of a process commonly called cell differentiation. How the phenotypic diversity emerges during ontogenesis or regeneration is a central and intensely studied but still unresolved issue in biology. Cell biology is facing conceptual challenges that are frequently confused with methodological difficulties. How should a cell type be defined? What do stability and change mean in the context of cell differentiation, and how should we deal with the ubiquitous molecular variation seen in living cells? What are the driving forces of change? We propose to reframe the problem of cell differentiation in a systemic way by incorporating different theoretical approaches. The new conceptual framework is able to capture the insights made at different levels of cellular organization that were previously considered contradictory. It also provides a formal strategy for further experimental studies.

  1. Distinction between added-energy and phase-resetting mechanisms in non-invasively detected somatosensory evoked responses.

    PubMed

    Fedele, T; Scheer, H-J; Burghoff, M; Waterstraat, G; Nikulin, V V; Curio, G

    2013-01-01

    Non-invasively recorded averaged event-related potentials (ERP) represent a convenient opportunity to investigate human brain perceptive and cognitive processes. Nevertheless, generative ERP mechanisms are still debated. Two models have been contested in the past: the added-energy model, in which the response arises independently of the ongoing background activity, and the phase-reset model, based on stimulus-driven synchronization of ongoing oscillatory activity. Many criteria for the distinction of these two models have been proposed, but there is no definitive methodology to disentangle them, owing partly to the limited information available at the single-trial level. Here, we propose a new approach combining low-noise EEG technology and multivariate decomposition techniques. We present theoretical analyses based on simulated data and identify in high-frequency somatosensory evoked responses an optimal target for the distinction between the two mechanisms.

  2. Behavior analysis and social constructionism: Some points of contact and departure

    PubMed Central

    Roche, Bryan; Barnes-Holmes, Dermot

    2003-01-01

    Social constructionists occasionally single out behavior analysis as the field of psychology that most closely resembles the natural sciences in its commitment to empiricism, and accuse it of suffering from many of the limitations to science identified by the postmodernist movement (e.g., K. J. Gergen, 1985a; Soyland, 1994). Indeed, behavior analysis is a natural science in many respects. However, it also shares with social constructionism important epistemological features such as a rejection of mentalism, a functional-analytic approach to language, the use of interpretive methodologies, and a reflexive stance on analysis. The current paper outlines briefly the key tenets of the behavior-analytic and social constructionist perspectives before examining a number of commonalities between these approaches. The paper aims to show that far from being a nemesis to social constructionism, behavior analysis may in fact be its close ally. PMID:22478403

  3. Further developments in the controlled growth approach for optimal structural synthesis

    NASA Technical Reports Server (NTRS)

    Hajela, P.

    1982-01-01

    It is pointed out that the use of nonlinear programming methods in conjunction with finite element and other discrete analysis techniques has provided a powerful tool in the domain of optimal structural synthesis. The present investigation is concerned with new strategies which comprise an extension to the controlled growth method considered by Hajela and Sobieski-Sobieszczanski (1981). This method proposed an approach wherein the standard nonlinear programming (NLP) methodology of working with a very large number of design variables was replaced by a sequence of smaller optimization cycles, each involving a single 'dominant' variable. The current investigation outlines some new features. Attention is given to a modified cumulative constraint representation which is defined in both the feasible and infeasible domains of the design space. Other new features are related to the evaluation of the 'effectiveness measure' on which the choice of the dominant variable and the linking strategy is based.

  4. Behavior analysis and social constructionism: some points of contact and departure.

    PubMed

    Roche, Bryan; Barnes-Holmes, Dermot

    2003-01-01

    Social constructionists occasionally single out behavior analysis as the field of psychology that most closely resembles the natural sciences in its commitment to empiricism, and accuse it of suffering from many of the limitations to science identified by the postmodernist movement (e.g., K. J. Gergen, 1985a; Soyland, 1994). Indeed, behavior analysis is a natural science in many respects. However, it also shares with social constructionism important epistemological features such as a rejection of mentalism, a functional-analytic approach to language, the use of interpretive methodologies, and a reflexive stance on analysis. The current paper outlines briefly the key tenets of the behavior-analytic and social constructionist perspectives before examining a number of commonalities between these approaches. The paper aims to show that far from being a nemesis to social constructionism, behavior analysis may in fact be its close ally.

  5. Measuring Longitudinal Gains in Student Learning: A Comparison of Rasch Scoring and Summative Scoring Approaches

    ERIC Educational Resources Information Center

    Zhao, Yue; Huen, Jenny M. Y.; Chan, Y. W.

    2017-01-01

    This study pioneers a Rasch scoring approach and compares it to a conventional summative approach for measuring longitudinal gains in student learning. In this methodological note, our proposed methodology is demonstrated using an example of rating scales in a student survey as part of a higher education outcome assessment. Such assessments have…

  6. Identifying DNA methylation in a nanochannel

    NASA Astrophysics Data System (ADS)

    Sun, Xiaoyin; Yasui, Takao; Yanagida, Takeshi; Kaji, Noritada; Rahong, Sakon; Kanai, Masaki; Nagashima, Kazuki; Kawai, Tomoji; Baba, Yoshinobu

    2016-01-01

    DNA methylation is a stable epigenetic modification, which is well known to be involved in gene expression regulation. In general, however, analyzing DNA methylation requires rather time-consuming processes (24-96 h) involving DNA replication and protein modification. Here we demonstrate a methodology to analyze DNA methylation at the single-DNA-molecule level without any protein modifications by measuring the contracted length and relaxation time of DNA within a nanochannel. Our methodology is based on the fact that methylation makes DNA molecules stiffer, resulting in a longer contracted length and a longer relaxation time (a slower contraction rate). The present methodology offers a promising way to identify DNA methylation without any protein modification at the single-DNA-molecule level within 2 h.

  7. Evidence for curricular and instructional design approaches in undergraduate medical education: An umbrella review.

    PubMed

    Onyura, Betty; Baker, Lindsay; Cameron, Blair; Friesen, Farah; Leslie, Karen

    2016-01-01

    An umbrella review compiles evidence from multiple reviews into a single accessible document. This umbrella review synthesizes evidence from systematic reviews on curricular and instructional design approaches in undergraduate medical education, focusing on learning outcomes. We conducted bibliographic database searches in Medline, EMBASE and ERIC from database inception to May 2013 inclusive, and digital keyword searches of leading medical education journals. We identified 18,470 abstracts; 467 underwent duplicate full-text scrutiny. Thirty-six articles met all eligibility criteria. Articles were abstracted independently by three authors, using a modified Kirkpatrick model for evaluating learning outcomes. Evidence for the effectiveness of diverse educational approaches is reported. This review maps out empirical knowledge on the efficacy of a broad range of educational approaches in medical education. Critical knowledge gaps, and lapses in methodological rigour, are discussed, providing valuable insight for future research. The findings call attention to the need for adopting evaluative strategies that explore how contextual variabilities and individual (teacher/learner) differences influence efficacy of educational interventions. Additionally, the results underscore that extant empirical evidence does not always provide unequivocal answers about what approaches are most effective. Educators should incorporate best available empirical knowledge with experiential and contextual knowledge.

  8. A new hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1988-01-01

    This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology applicable to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on the application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology provides a viable computational approach, and the numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.

  9. An Approach for Implementation of Project Management Information Systems

    NASA Astrophysics Data System (ADS)

    Běrziša, Solvita; Grabis, Jānis

    Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on an analysis of typical project management concepts and processes and of existing XML-based representations of project management. A demonstration example of a project management information system's configuration is provided.

  10. Potential of SNP markers for the characterization of Brazilian cassava germplasm.

    PubMed

    de Oliveira, Eder Jorge; Ferreira, Cláudia Fortes; da Silva Santos, Vanderlei; de Jesus, Onildo Nunes; Oliveira, Gilmara Alvarenga Fachardo; da Silva, Maiane Suzarte

    2014-06-01

    High-throughput markers, such as SNPs, along with different methodologies, were used to evaluate the applicability of the Bayesian approach and multivariate analysis in structuring the genetic diversity of cassava. The objective of the present work was to evaluate the diversity and genetic structure of the largest cassava germplasm bank in Brazil. Complementary methodological approaches such as discriminant analysis of principal components (DAPC), Bayesian analysis and analysis of molecular variance (AMOVA) were used to understand the structure and diversity of 1,280 accessions genotyped using 402 single nucleotide polymorphism markers. The genetic diversity (0.327) and the average observed heterozygosity (0.322) were high considering the bi-allelic markers. In terms of population, the presence of a complex genetic structure was observed, indicating the formation of 30 clusters by DAPC and 34 clusters by Bayesian analysis. Both methodologies presented difficulties and controversies in terms of the allocation of some accessions to specific clusters. However, the clusters suggested by the DAPC analysis seemed to be more consistent, presenting a higher probability of allocation of the accessions within the clusters. Prior information related to the breeding patterns and geographic origins of the accessions was not sufficient to provide clear differentiation between the clusters according to the AMOVA analysis. In contrast, the FST was maximized when considering the clusters suggested by the Bayesian and DAPC analyses. The high frequency of germplasm exchange between producers and the subsequent renaming of the same material may be one of the causes of the low association between genetic diversity and geographic origin. The results of this study may benefit cassava germplasm conservation programs, and contribute to the maximization of genetic gains in breeding programs.
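
    For readers unfamiliar with DAPC, the sketch below shows a DAPC-style analysis in miniature: a genotype matrix is compressed with PCA, clusters are proposed (here with k-means), and a discriminant analysis on the retained components yields per-accession membership probabilities. The matrix is random and the parameter choices are arbitrary; this is not the workflow used in the study.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Rows are accessions, columns are biallelic SNPs coded as 0/1/2 allele dosages.
      # Shapes and parameters are illustrative, not those of the cassava data set.
      rng = np.random.default_rng(0)
      genotypes = rng.integers(0, 3, size=(200, 402)).astype(float)

      pcs = PCA(n_components=50).fit_transform(genotypes)                 # data compression step
      clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(pcs)

      dapc = LinearDiscriminantAnalysis(n_components=4).fit(pcs, clusters)
      membership = dapc.predict_proba(pcs)                                # per-accession assignment probabilities
      print("mean maximum membership probability:", membership.max(axis=1).mean())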

  11. A Hybrid Model for Predicting the Prevalence of Schistosomiasis in Humans of Qianjiang City, China

    PubMed Central

    Wang, Ying; Lu, Zhouqin; Tian, Lihong; Tan, Li; Shi, Yun; Nie, Shaofa; Liu, Li

    2014-01-01

    Backgrounds/Objective Schistosomiasis is still a major public health problem in China, despite the fact that the government has implemented a series of strategies to prevent and control the spread of the parasitic disease. Advanced warning and reliable forecasting can help policymakers to adjust and implement strategies more effectively, which will lead to the control and elimination of schistosomiasis. Our aim is to explore the application of a hybrid forecasting model to track the trends of the prevalence of schistosomiasis in humans, which provides a methodological basis for predicting and detecting schistosomiasis infection in endemic areas. Methods A hybrid approach combining the autoregressive integrated moving average (ARIMA) model and the nonlinear autoregressive neural network (NARNN) model to forecast the prevalence of schistosomiasis in the future four years. Forecasting performance was compared between the hybrid ARIMA-NARNN model, and the single ARIMA or the single NARNN model. Results The modelling mean square error (MSE), mean absolute error (MAE) and mean absolute percentage error (MAPE) of the ARIMA-NARNN model was 0.1869×10−4, 0.0029, 0.0419 with a corresponding testing error of 0.9375×10−4, 0.0081, 0.9064, respectively. These error values generated with the hybrid model were all lower than those obtained from the single ARIMA or NARNN model. The forecasting values were 0.75%, 0.80%, 0.76% and 0.77% in the future four years, which demonstrated a no-downward trend. Conclusion The hybrid model has high quality prediction accuracy in the prevalence of schistosomiasis, which provides a methodological basis for future schistosomiasis monitoring and control strategies in the study area. It is worth attempting to utilize the hybrid detection scheme in other schistosomiasis-endemic areas including other infectious diseases. PMID:25119882
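
    The hybrid idea can be sketched in a few lines: an ARIMA model captures the linear structure of the prevalence series and a small neural network is trained on its residuals, with the two forecasts summed. The series, model orders and lag length below are invented placeholders, not the Qianjiang City data or the authors' NARNN configuration.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA
      from sklearn.neural_network import MLPRegressor

      # Synthetic annual prevalence series (placeholder data, not the study's records).
      rng = np.random.default_rng(1)
      prevalence = 2.0 * np.exp(-0.08 * np.arange(30)) + 0.05 * rng.standard_normal(30)

      # Linear part: ARIMA fit and its residuals.
      arima_fit = ARIMA(prevalence, order=(1, 1, 1)).fit()
      residuals = prevalence - arima_fit.fittedvalues

      # Nonlinear part: a small network trained on p lagged residuals (NAR with p = 3).
      p = 3
      X = np.column_stack([residuals[i:len(residuals) - p + i] for i in range(p)])
      y = residuals[p:]
      nn = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, y)

      # Four-step-ahead hybrid forecast: ARIMA forecast plus recursively predicted residuals.
      steps = 4
      linear_part = arima_fit.forecast(steps=steps)
      recent = list(residuals[-p:])
      nonlinear_part = []
      for _ in range(steps):
          nxt = nn.predict(np.array(recent[-p:]).reshape(1, -1))[0]
          nonlinear_part.append(nxt)
          recent.append(nxt)
      hybrid_forecast = linear_part + np.array(nonlinear_part)

      # The error metrics quoted in the abstract, computed in-sample for illustration.
      hybrid_fitted = arima_fit.fittedvalues[p:] + nn.predict(X)
      err = prevalence[p:] - hybrid_fitted
      print("in-sample MSE :", np.mean(err ** 2))
      print("in-sample MAE :", np.mean(np.abs(err)))
      print("in-sample MAPE:", np.mean(np.abs(err / prevalence[p:])))
      print("4-year hybrid forecast:", np.round(hybrid_forecast, 3))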

  12. Single-particle cryo-EM using alignment by classification (ABC): the structure of Lumbricus terrestris haemoglobin

    PubMed Central

    Seer-Linnemayr, Charlotte; Ravelli, Raimond B. G.; Matadeen, Rishi; De Carlo, Sacha; Alewijnse, Bart; Portugal, Rodrigo V.; Pannu, Navraj S.; Schatz, Michael; van Heel, Marin

    2017-01-01

    Single-particle cryogenic electron microscopy (cryo-EM) can now yield near-atomic resolution structures of biological complexes. However, the reference-based alignment algorithms commonly used in cryo-EM suffer from reference bias, limiting their applicability (also known as the ‘Einstein from random noise’ problem). Low-dose cryo-EM therefore requires robust and objective approaches to reveal the structural information contained in the extremely noisy data, especially when dealing with small structures. A reference-free pipeline is presented for obtaining near-atomic resolution three-dimensional reconstructions from heterogeneous (‘four-dimensional’) cryo-EM data sets. The methodologies integrated in this pipeline include a posteriori camera correction, movie-based full-data-set contrast transfer function determination, movie-alignment algorithms, (Fourier-space) multivariate statistical data compression and unsupervised classification, ‘random-startup’ three-dimensional reconstructions, four-dimensional structural refinements and Fourier shell correlation criteria for evaluating anisotropic resolution. The procedures exclusively use information emerging from the data set itself, without external ‘starting models’. Euler-angle assignments are performed by angular reconstitution rather than by the inherently slower projection-matching approaches. The comprehensive ‘ABC-4D’ pipeline is based on the two-dimensional reference-free ‘alignment by classification’ (ABC) approach, where similar images in similar orientations are grouped by unsupervised classification. Some fundamental differences between X-ray crystallography versus single-particle cryo-EM data collection and data processing are discussed. The structure of the giant haemoglobin from Lumbricus terrestris at a global resolution of ∼3.8 Å is presented as an example of the use of the ABC-4D procedure. PMID:28989723

  13. OmpF, a nucleotide-sensing nanoprobe, computational evaluation of single channel activities

    NASA Astrophysics Data System (ADS)

    Abdolvahab, R. H.; Mobasheri, H.; Nikouee, A.; Ejtehadi, M. R.

    2016-09-01

    The results of high-throughput single-channel experiments should be formulated and validated by signal analysis approaches to increase the recognition precision of translocating molecules. For this purpose, the activities of the single nano-pore-forming protein, OmpF, in the presence of nucleotides were recorded in real time by the voltage clamp technique and used as a means for nucleotide recognition. The results were analyzed based on the permutation entropy of the current time series (TS), fractality, autocorrelation, structure function, spectral density, and peak fraction to recognize each nucleotide, based on its signature effect on the conductance, gating frequency and voltage sensitivity of the channel at different concentrations and membrane potentials. The amplitude and frequency of ion current fluctuations increased more in the presence of adenine than of cytosine and thymine at 0.5 mM concentration. The variance of the current TS at various applied voltages showed a non-monotonic trend whose initial increasing slope in the presence of thymine changed to a decreasing one in the second phase and was different from that of adenine and cytosine; e.g., on increasing the voltage from 40 to 140 mV at a 0.5 mM concentration of adenine or cytosine, the variance decreased by one third, while for thymine it doubled. Moreover, according to the structure function of the TS, the fractality of the current TS differed as a function of membrane potential (pd) and nucleotide concentration. Accordingly, the calculated permutation entropy of the TS validated the biophysical approach defined for the recognition of different nucleotides at various concentrations, pds and polarities. Thus, the promising outcomes of the combined experimental and theoretical methodologies presented here can be implemented as a complementary means in pore-based nucleotide recognition approaches.
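
    Of the listed measures, permutation entropy is the easiest to reproduce generically. The sketch below is a standard implementation applied to a synthetic current trace; the embedding order and delay are arbitrary choices, not the parameters used in the study.

      import math
      from itertools import permutations
      import numpy as np

      def permutation_entropy(ts, order=3, delay=1):
          # Map each length-`order` window to the ordinal pattern of its sample ranks,
          # then compute the Shannon entropy of the pattern distribution, normalized by
          # log(order!) so that 1.0 corresponds to a maximally irregular signal.
          patterns = {p: 0 for p in permutations(range(order))}
          n = len(ts) - (order - 1) * delay
          for i in range(n):
              window = ts[i:i + order * delay:delay]
              patterns[tuple(np.argsort(window))] += 1
          probs = np.array([c for c in patterns.values() if c > 0], dtype=float) / n
          return float(-(probs * np.log(probs)).sum() / math.log(math.factorial(order)))

      # Illustrative use on a synthetic "channel current" trace (not recorded data).
      rng = np.random.default_rng(0)
      current = np.cumsum(rng.standard_normal(5000))
      print(permutation_entropy(current, order=4, delay=2))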

  14. Global dynamic optimization approach to predict activation in metabolic pathways.

    PubMed

    de Hijas-Liste, Gundián M; Klipp, Edda; Balsa-Canto, Eva; Banga, Julio R

    2014-01-06

    During the last decade, a number of authors have shown that the genetic regulation of metabolic networks may follow optimality principles. Optimal control theory has been successfully used to compute optimal enzyme profiles considering simple metabolic pathways. However, applying this optimal control framework to more general networks (e.g. branched networks, or networks incorporating enzyme production dynamics) yields problems that are analytically intractable and/or numerically very challenging. Further, these previous studies have only considered a single-objective framework. In this work we consider a more general multi-objective formulation and we present solutions based on recent developments in global dynamic optimization techniques. We illustrate the performance and capabilities of these techniques considering two sets of problems. First, we consider a set of single-objective examples of increasing complexity taken from the recent literature. We analyze the multimodal character of the associated non linear optimization problems, and we also evaluate different global optimization approaches in terms of numerical robustness, efficiency and scalability. Second, we consider generalized multi-objective formulations for several examples, and we show how this framework results in more biologically meaningful results. The proposed strategy was used to solve a set of single-objective case studies related to unbranched and branched metabolic networks of different levels of complexity. All problems were successfully solved in reasonable computation times with our global dynamic optimization approach, reaching solutions which were comparable or better than those reported in previous literature. Further, we considered, for the first time, multi-objective formulations, illustrating how activation in metabolic pathways can be explained in terms of the best trade-offs between conflicting objectives. This new methodology can be applied to metabolic networks with arbitrary topologies, non-linear dynamics and constraints.

  15. A Single Conversation with a Wise Man Is Better than Ten Years of Study: A Model for Testing Methodologies for Pedagogy or Andragogy

    ERIC Educational Resources Information Center

    Taylor, Bryan; Kroth, Michael

    2009-01-01

    This article creates the Teaching Methodology Instrument (TMI) to help determine the level of adult learning principles being used by a particular teaching methodology in a classroom. The instrument incorporates the principles and assumptions set forth by Malcolm Knowles of what makes a good adult learning environment. The Socratic method as used…

  16. Incomplete initial nutation diffusion imaging: An ultrafast, single-scan approach for diffusion mapping.

    PubMed

    Ianuş, Andrada; Shemesh, Noam

    2018-04-01

    Diffusion MRI is confounded by the need to acquire at least two images separated by a repetition time, thereby thwarting the detection of rapid dynamic microstructural changes. The issue is exacerbated when diffusivity variations are accompanied by rapid changes in T2. The purpose of the present study is to accelerate diffusion MRI acquisitions such that both reference and diffusion-weighted images necessary for quantitative diffusivity mapping are acquired in a single-shot experiment. A general methodology termed incomplete initial nutation diffusion imaging (INDI), capturing two diffusion contrasts in a single shot, is presented. This methodology creates a longitudinal magnetization reservoir that facilitates the successive acquisition of two images separated by only a few milliseconds. The theory behind INDI is presented, followed by proof-of-concept studies in water phantom, ex vivo, and in vivo experiments at 16.4 and 9.4 T. Mean diffusivities extracted from INDI were comparable with diffusion tensor imaging and the two-shot isotropic diffusion encoding in the water phantom. In ex vivo mouse brain tissues, as well as in the in vivo mouse brain, mean diffusivities extracted from conventional isotropic diffusion encoding and INDI were in excellent agreement. Simulations for signal-to-noise considerations identified the regimes in which INDI is most beneficial. The INDI method accelerates diffusion MRI acquisition to single-shot mode, which can be of great importance for mapping dynamic microstructural properties in vivo without T2 bias. Magn Reson Med 79:2198-2204, 2018. © 2017 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of the International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
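
    INDI itself is a pulse-sequence design, but the post-processing that turns the reference and diffusion-weighted images into a mean-diffusivity (ADC) map reduces to the usual mono-exponential signal model Sb = S0 * exp(-b * D). The sketch below applies that relation to synthetic arrays; the b-value and image sizes are invented and this is not the authors' reconstruction code.

      import numpy as np

      b_value = 1000.0  # s/mm^2, illustrative diffusion weighting

      # Synthetic reference (S0) and diffusion-weighted (Sb) images with a typical
      # tissue-scale diffusivity; in INDI both would come from the same shot.
      rng = np.random.default_rng(0)
      true_D = 0.7e-3 * np.ones((64, 64))                      # mm^2/s
      S0 = 100.0 + rng.standard_normal((64, 64))
      Sb = S0 * np.exp(-b_value * true_D) + rng.standard_normal((64, 64))

      # Voxel-wise ADC from the mono-exponential model (clipped to avoid log of zero).
      adc = np.log(np.clip(S0, 1e-6, None) / np.clip(Sb, 1e-6, None)) / b_value
      print("median ADC [mm^2/s]:", float(np.median(adc)))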

  17. Considerations for the Systematic Analysis and Use of Single-Case Research

    ERIC Educational Resources Information Center

    Horner, Robert H.; Swaminathan, Hariharan; Sugai, George; Smolkowski, Keith

    2012-01-01

    Single-case research designs provide a rigorous research methodology for documenting experimental control. If single-case methods are to gain wider application, however, a need exists to define more clearly (a) the logic of single-case designs, (b) the process and decision rules for visual analysis, and (c) an accepted process for integrating…

  18. [Scientific and methodologic approaches to evaluating medical management for workers of Kazakhstan].

    PubMed

    2012-01-01

    The article covers topical problems of workers' health preservation. The results of comprehensive research enabled the evaluation and analysis of occupational risks in leading industries of Kazakhstan, with the aim of improving scientific and methodological approaches to the medical management of workers exposed to hazardous conditions.

  19. Deliverology

    ERIC Educational Resources Information Center

    Nordstrum, Lee E.; LeMahieu, Paul G.; Dodd, Karen

    2017-01-01

    Purpose: This paper is one of seven in this volume elaborating different approaches to quality improvement in education. This paper aims to delineate a methodology called Deliverology. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study showing an application of Deliverology in the…

  20. Protocol for Reliability Assessment of Structural Health Monitoring Systems Incorporating Model-assisted Probability of Detection (MAPOD) Approach

    DTIC Science & Technology

    2011-09-01

    a quality evaluation with limited data, a model-based assessment must be... that affect system performance, a multistage approach to system validation, a modeling and experimental methodology for efficiently addressing a wide range...

  1. The "Push-Pull" Approach to Fast-Track Management Development: A Case Study in Scientific Publishing

    ERIC Educational Resources Information Center

    Fojt, Martin; Parkinson, Stephen; Peters, John; Sandelands, Eric

    2008-01-01

    Purpose: The purpose of this paper is to explore how a medium sized business has addressed what it has termed a "push-pull" method of management and organization development, based around an action learning approach. Design/methodology/approach: The paper sets out a methodology that other SMEs might look to replicate in their management and…

  2. Concurrent airline fleet allocation and aircraft design with profit modeling for multiple airlines

    NASA Astrophysics Data System (ADS)

    Govindaraju, Parithi

    A "System of Systems" (SoS) approach is particularly beneficial in analyzing complex large scale systems comprised of numerous independent systems -- each capable of independent operations in their own right -- that when brought in conjunction offer capabilities and performance beyond the constituents of the individual systems. The variable resource allocation problem is a type of SoS problem, which includes the allocation of "yet-to-be-designed" systems in addition to existing resources and systems. The methodology presented here expands upon earlier work that demonstrated a decomposition approach that sought to simultaneously design a new aircraft and allocate this new aircraft along with existing aircraft in an effort to meet passenger demand at minimum fleet level operating cost for a single airline. The result of this describes important characteristics of the new aircraft. The ticket price model developed and implemented here enables analysis of the system using profit maximization studies instead of cost minimization. A multiobjective problem formulation has been implemented to determine characteristics of a new aircraft that maximizes the profit of multiple airlines to recognize the fact that aircraft manufacturers sell their aircraft to multiple customers and seldom design aircraft customized to a single airline's operations. The route network characteristics of two simple airlines serve as the example problem for the initial studies. The resulting problem formulation is a mixed-integer nonlinear programming problem, which is typically difficult to solve. A sequential decomposition strategy is applied as a solution methodology by segregating the allocation (integer programming) and aircraft design (non-linear programming) subspaces. After solving a simple problem considering two airlines, the decomposition approach is then applied to two larger airline route networks representing actual airline operations in the year 2005. The decomposition strategy serves as a promising technique for future detailed analyses. Results from the profit maximization studies favor a smaller aircraft in terms of passenger capacity due to its higher yield generation capability on shorter routes while results from the cost minimization studies favor a larger aircraft due to its lower direct operating cost per seat mile.

  3. Using principal component analysis to capture individual differences within a unified neuropsychological model of chronic post-stroke aphasia: Revealing the unique neural correlates of speech fluency, phonology and semantics.

    PubMed

    Halai, Ajay D; Woollams, Anna M; Lambon Ralph, Matthew A

    2017-01-01

    Individual differences in the performance profiles of neuropsychologically-impaired patients are pervasive, yet there is still no resolution on the best way to model and account for the variation in their behavioural impairments and the associated neural correlates. To date, researchers have generally taken one of three different approaches: a single-case study methodology in which each case is considered separately; a case-series design in which all individual patients from a small coherent group are examined and directly compared; or group studies, in which a sample of cases are investigated as one group with the assumption that they are drawn from a homogenous category and that performance differences are of no interest. In recent research, we have developed a complementary alternative through the use of principal component analysis (PCA) of individual data from large patient cohorts. This data-driven approach not only generates a single unified model for the group as a whole (expressed in terms of the emergent principal components) but is also able to capture the individual differences between patients (in terms of their relative positions along the principal behavioural axes). We demonstrate the use of this approach by considering speech fluency, phonology and semantics in aphasia diagnosis and classification, as well as their unique neural correlates. PCA of the behavioural data from 31 patients with chronic post-stroke aphasia resulted in four statistically-independent behavioural components reflecting phonological, semantic, executive-cognitive and fluency abilities. Even after accounting for lesion volume, entering the four behavioural components simultaneously into a voxel-based correlational methodology (VBCM) analysis revealed that speech fluency (speech quanta) was uniquely correlated with left motor cortex and underlying white matter (including the anterior section of the arcuate fasciculus and the frontal aslant tract), phonological skills with regions in the superior temporal gyrus and pars opercularis, and semantics with the anterior temporal stem. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
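
    The decomposition step itself is straightforward to sketch: rows are patients, columns are behavioural test scores, and PCA returns orthogonal components plus per-patient scores on each component. The data below are random placeholders (and the rotation step often used in this literature is omitted), so this only illustrates the mechanics, not the authors' pipeline.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      # Placeholder behavioural matrix: 31 patients x 20 test scores.
      rng = np.random.default_rng(0)
      scores = rng.standard_normal((31, 20))

      z = StandardScaler().fit_transform(scores)        # z-score each test
      pca = PCA(n_components=4).fit(z)

      loadings = pca.components_.T                      # test loadings on each component
      patient_scores = pca.transform(z)                 # per-patient component scores

      print("variance explained:", np.round(pca.explained_variance_ratio_, 3))
      # The per-patient component scores (e.g. "phonology", "semantics") could then be
      # entered as regressors in a lesion-symptom mapping analysis such as VBCM.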

  4. Cryptosporidium as a testbed for single cell genome characterization of unicellular eukaryotes.

    PubMed

    Troell, Karin; Hallström, Björn; Divne, Anna-Maria; Alsmark, Cecilia; Arrighi, Romanico; Huss, Mikael; Beser, Jessica; Bertilsson, Stefan

    2016-06-23

    Concurrent infection with multiple genetically distinct populations of a pathogen is frequent, but difficult to detect or describe with current routine methodology. Cryptosporidium sp. is a widespread gastrointestinal protozoan of global significance in both animals and humans. It cannot be easily maintained in culture and infections with multiple strains have been reported. To explore the potential use of single cell genomics methodology for revealing genome-level variation in clinical samples from Cryptosporidium-infected hosts, we sorted individual oocysts for subsequent genome amplification and full-genome sequencing. Cells were identified with fluorescent antibodies, with an 80% success rate for the entire single cell genomics workflow, demonstrating that the methodology can be applied directly to purified fecal samples. Ten amplified genomes from sorted single cells were selected for genome sequencing and compared both to the original population and to a reference genome in order to evaluate the accuracy and performance of the method. Single cell genome coverage was on average 81% even with a moderate sequencing effort, and by combining the 10 single cell genomes, the full genome was accounted for. By comparison to the original sample, biological variation could be distinguished from noise introduced in the amplification. As a proof of principle, we have demonstrated the power of applying single cell genomics to dissect infectious disease caused by closely related parasite species or subtypes. The workflow can easily be expanded and adapted to target other protozoans, and potential applications include mapping genome-encoded traits, virulence, pathogenicity, host specificity and resistance at the level of cells as truly meaningful biological units.

  5. Assembling evidence for identifying reservoirs of infection

    PubMed Central

    Viana, Mafalda; Mancy, Rebecca; Biek, Roman; Cleaveland, Sarah; Cross, Paul C.; Lloyd-Smith, James O.; Haydon, Daniel T.

    2014-01-01

    Many pathogens persist in multihost systems, making the identification of infection reservoirs crucial for devising effective interventions. Here, we present a conceptual framework for classifying patterns of incidence and prevalence, and review recent scientific advances that allow us to study and manage reservoirs simultaneously. We argue that interventions can have a crucial role in enriching our mechanistic understanding of how reservoirs function and should be embedded as quasi-experimental studies in adaptive management frameworks. Single approaches to the study of reservoirs are unlikely to generate conclusive insights whereas the formal integration of data and methodologies, involving interventions, pathogen genetics, and contemporary surveillance techniques, promises to open up new opportunities to advance understanding of complex multihost systems. PMID:24726345

  6. Ca2+ waves across gaps in non-excitable cells induced by femtosecond laser exposure

    NASA Astrophysics Data System (ADS)

    He, Hao; Wang, Shaoyang; Li, Xun; Li, Shiyang; Hu, Minglie; Cao, Youjia; Wang, Ching-Yue

    2012-04-01

    Calcium is a second messenger involved in numerous cellular processes in all cells. It has been found in astrocytes and neurons that femtosecond laser stimulation can induce Ca2+ wave propagation. In this work, a femtosecond laser with a power above a certain threshold was focused on single HeLa/HEK293T cells for Ca2+ mobilization. Several types of Ca2+ oscillation patterns were found in neighboring cells. The Ca2+ wave propagated rapidly across 40-μm gaps in Ca2+-free medium, mediated by adenosine triphosphate (ATP) released from the cells. This approach could provide a clean methodology for investigating Ca2+ dynamics in non-excitable cells.

  7. Assembling evidence for identifying reservoirs of infection

    USGS Publications Warehouse

    Mafalda, Viana; Rebecca, Mancy; Roman, Biek; Sarah, Cleaveland; Cross, Paul C.; James O, Lloyd-Smith; Daniel T, Haydon

    2014-01-01

    Many pathogens persist in multihost systems, making the identification of infection reservoirs crucial for devising effective interventions. Here, we present a conceptual framework for classifying patterns of incidence and prevalence, and review recent scientific advances that allow us to study and manage reservoirs simultaneously. We argue that interventions can have a crucial role in enriching our mechanistic understanding of how reservoirs function and should be embedded as quasi-experimental studies in adaptive management frameworks. Single approaches to the study of reservoirs are unlikely to generate conclusive insights whereas the formal integration of data and methodologies, involving interventions, pathogen genetics, and contemporary surveillance techniques, promises to open up new opportunities to advance understanding of complex multihost systems.

  8. Comparative tests of bench equipment for fuel control system testing of gas-turbine engine

    NASA Astrophysics Data System (ADS)

    Shendaleva, E. V.

    2018-04-01

    The relevance of interlaboratory comparative research is confirmed by the attention the world metrological community pays to this field. The use of the interlaboratory comparison methodology is proposed not only for the collation of single gauges, but also for bench equipment complexes such as modeling stands for fuel control system testing of gas-turbine engines. In this case, a control fuel pump serves as the comparative measure between different bench installations. The result of this approach is traceability of measurement results obtained at the test benches of various aviation enterprises, the development and introduction of national standards into bench-testing practice and, ultimately, improved quality and safety of aircraft equipment.

  9. A compressive sensing-based computational method for the inversion of wide-band ground penetrating radar data

    NASA Astrophysics Data System (ADS)

    Gelmini, A.; Gottardi, G.; Moriyama, T.

    2017-10-01

    This work presents an innovative computational approach for the inversion of wideband ground penetrating radar (GPR) data. The retrieval of the dielectric characteristics of sparse scatterers buried in a lossy soil is performed by combining a multi-task Bayesian compressive sensing (MT-BCS) solver and a frequency hopping (FH) strategy. The developed methodology is able to benefit from the regularization capabilities of the MT-BCS as well as to exploit the multi-chromatic informative content of GPR measurements. A set of numerical results is reported in order to assess the effectiveness of the proposed GPR inverse scattering technique, as well as to compare it to a simpler single-task implementation.

  10. Aerodynamic optimization studies on advanced architecture computers

    NASA Technical Reports Server (NTRS)

    Chawla, Kalpana

    1995-01-01

    The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to the equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization is included.

  11. Modeling Viral Capsid Assembly

    PubMed Central

    2014-01-01

    I present a review of the theoretical and computational methodologies that have been used to model the assembly of viral capsids. I discuss the capabilities and limitations of approaches ranging from equilibrium continuum theories to molecular dynamics simulations, and I give an overview of some of the important conclusions about virus assembly that have resulted from these modeling efforts. Topics include the assembly of empty viral shells, assembly around single-stranded nucleic acids to form viral particles, and assembly around synthetic polymers or charged nanoparticles for nanotechnology or biomedical applications. I present some examples in which modeling efforts have promoted experimental breakthroughs, as well as directions in which the connection between modeling and experiment can be strengthened. PMID:25663722

  12. Wave-packet approach to transport properties of carrier coupled with intermolecular and intramolecular vibrations of organic semiconductors

    NASA Astrophysics Data System (ADS)

    Ishii, Hiroyuki; Honma, Keisuke; Kobayashi, Nobuhiko; Hirose, Kenji

    2012-06-01

    We present a methodology to study the charge-transport properties of organic semiconductors by the time-dependent wave-packet diffusion method, taking the polaron effects into account. As an example, we investigate the transport properties of single-crystal pentacene organic semiconductors coupled with inter- and intramolecular vibrations within the mixed Holstein and Peierls model, which describes both hopping and bandlike transport behaviors due to small and large polaron formations. Taking into account static disorders, which inevitably exist in the molecular crystals, we present the temperature dependence of charge-transport properties in competition among the thermal fluctuation of molecular motions, the polaron formation, and the static disorders.

  13. Current Status and Challenges of Atmospheric Data Assimilation

    NASA Astrophysics Data System (ADS)

    Atlas, R. M.; Gelaro, R.

    2016-12-01

    The issues of modern atmospheric data assimilation are fairly simple to comprehend but difficult to address, involving the combination of literally billions of model variables and tens of millions of observations daily. In addition to traditional meteorological variables such as wind, temperature, pressure and humidity, model state vectors are being expanded to include explicit representation of precipitation, clouds, aerosols and atmospheric trace gases. At the same time, model resolutions are approaching single-kilometer scales globally and new observation types have error characteristics that are increasingly non-Gaussian. This talk describes the current status and challenges of atmospheric data assimilation, including an overview of current methodologies, the difficulty of estimating error statistics, and progress toward coupled earth system analyses.

  14. Towards a Methodology for the Design of Multimedia Public Access Interfaces.

    ERIC Educational Resources Information Center

    Rowley, Jennifer

    1998-01-01

    Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…

  15. Propellant Readiness Level: A Methodological Approach to Propellant Characterization

    NASA Technical Reports Server (NTRS)

    Bossard, John A.; Rhys, Noah O.

    2010-01-01

    A methodological approach to defining propellant characterization is presented. The method is based on the well-established Technology Readiness Level nomenclature. This approach establishes the Propellant Readiness Level as a metric for ascertaining the readiness of a propellant or a propellant combination by evaluating the following set of propellant characteristics: thermodynamic data, toxicity, applications, combustion data, heat transfer data, material compatibility, analytical prediction modeling, injector/chamber geometry, pressurization, ignition, combustion stability, system storability, qualification testing, and flight capability. The methodology is meant to be applicable to all propellants or propellant combinations; liquid, solid, and gaseous propellants as well as monopropellants and propellant combinations are equally served. The functionality of the proposed approach is tested through the evaluation and comparison of an example set of hydrocarbon fuels.

  16. Predicting muscle forces during the propulsion phase of single leg triple hop test.

    PubMed

    Alvim, Felipe Costa; Lucareli, Paulo Roberto Garcia; Menegaldo, Luciano Luporini

    2018-01-01

    Functional biomechanical tests allow the assessment of musculoskeletal system impairments in a simple way. Muscle force synergies associated with movement can provide additional information for diagnosis. However, such forces cannot be directly measured noninvasively. This study aims to estimate muscle activations and forces exerted during the preparation phase of the single leg triple hop test. Two different approaches were tested: static optimization (SO) and computed muscle control (CMC). As an indirect validation, model-estimated muscle activations were compared with surface electromyography (EMG) of selected hip and thigh muscles. Ten physically healthy active women performed a series of jumps, and ground reaction forces, kinematics and EMG data were recorded. An existing OpenSim model with 92 musculotendon actuators was used to estimate muscle forces. Reflective marker data were processed using the OpenSim Inverse Kinematics tool. The Residual Reduction Algorithm (RRA) was applied recursively before running SO and CMC. For both, the same adjusted kinematics were used as inputs. Both approaches presented similar residual amplitudes. SO showed a closer agreement between the estimated activations and the EMGs of some muscles. Due to inherent EMG methodological limitations, the superiority of SO in relation to CMC can only be hypothesized. It should be confirmed by conducting further studies comparing joint contact forces. The workflow presented in this study can be used to estimate muscle forces during the preparation phase of the single leg triple hop test and allows investigating muscle activation and coordination. Copyright © 2017 Elsevier B.V. All rights reserved.
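
    The essence of static optimization can be shown with a toy single-joint example: at one time frame, choose muscle activations that reproduce the required net joint moment while minimizing the sum of squared activations. The moment arms, maximal isometric forces and target moment below are made-up numbers, not values from the study or from the OpenSim implementation.

      import numpy as np
      from scipy.optimize import minimize

      # Hypothetical three-muscle, single-joint frame (illustrative values only).
      moment_arms = np.array([0.05, 0.04, 0.03])       # m
      max_forces = np.array([3000.0, 2500.0, 1500.0])  # N, maximal isometric forces
      target_moment = 120.0                            # N*m required at this frame

      def cost(a):
          # Classic static-optimization objective: sum of squared activations.
          return np.sum(a ** 2)

      constraints = {
          "type": "eq",
          "fun": lambda a: np.dot(moment_arms * max_forces, a) - target_moment,
      }
      bounds = [(0.0, 1.0)] * 3

      res = minimize(cost, x0=np.full(3, 0.5), bounds=bounds, constraints=constraints)
      activations = res.x
      muscle_forces = activations * max_forces
      print("activations  :", np.round(activations, 3))
      print("muscle forces:", np.round(muscle_forces, 1))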

  17. Multiple Time-of-Flight/Time-of-Flight Events in a Single Laser Shot for Improved Matrix-Assisted Laser Desorption/Ionization Tandem Mass Spectrometry Quantification.

    PubMed

    Prentice, Boone M; Chumbley, Chad W; Hachey, Brian C; Norris, Jeremy L; Caprioli, Richard M

    2016-10-04

    Quantitative matrix-assisted laser desorption/ionization time-of-flight (MALDI TOF) approaches have historically suffered from poor accuracy and precision, mainly due to the nonuniform distribution of matrix and analyte across the target surface, matrix interferences, and ionization suppression. Tandem mass spectrometry (MS/MS) can be used to ensure chemical specificity as well as improve signal-to-noise ratios by eliminating interferences from chemical noise, alleviating some concerns about dynamic range. However, conventional MALDI TOF/TOF modalities typically only scan for a single MS/MS event per laser shot, and multiplex assays require sequential analyses. We describe here a new methodology that allows multiple TOF/TOF fragmentation events to be performed in a single laser shot. This technology allows the analyte intensity to be referenced to that of the internal standard in each laser shot, even when the analyte and internal standard are quite disparate in m/z, thereby improving quantification while maintaining chemical specificity and duty cycle. In the quantitative analysis of the drug enalapril in pooled human plasma with ramipril as an internal standard, a greater than 4-fold improvement in relative standard deviation (<10%) was observed, as well as improved coefficients of determination (R²) and accuracy (>85% quality controls). Using this approach we have also performed simultaneous quantitative analysis of three drugs (promethazine, enalapril, and verapamil) using deuterated analogues of these drugs as internal standards.
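
    The quantification step itself reduces to referencing the analyte signal to the internal-standard signal in each laser shot and reading concentration off a calibration curve of intensity ratio versus concentration. The sketch below uses entirely synthetic numbers and a simple linear calibration; it is not the authors' processing code.

      import numpy as np

      # Synthetic calibration: analyte/internal-standard intensity ratio vs concentration.
      calib_conc = np.array([5.0, 10.0, 50.0, 100.0, 500.0])   # ng/mL standards
      calib_ratio = np.array([0.06, 0.11, 0.52, 1.05, 5.10])   # measured ratios
      slope, intercept = np.polyfit(calib_conc, calib_ratio, 1)

      # Per-laser-shot intensities for an unknown sample (analyte and internal standard
      # measured in the same shot), then shot-wise ratios averaged before back-calculation.
      analyte_counts = np.array([820.0, 1040.0, 760.0, 910.0])
      istd_counts = np.array([400.0, 515.0, 390.0, 450.0])
      mean_ratio = np.mean(analyte_counts / istd_counts)

      estimated_conc = (mean_ratio - intercept) / slope
      print(f"estimated concentration: {estimated_conc:.1f} ng/mL")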

  18. Evaluation of next generation sequencing for the analysis of Eimeria communities in wildlife.

    PubMed

    Vermeulen, Elke T; Lott, Matthew J; Eldridge, Mark D B; Power, Michelle L

    2016-05-01

    Next-generation sequencing (NGS) techniques are well established for studying bacterial communities but not yet for microbial eukaryotes. Parasite communities remain poorly studied, due in part to the lack of reliable and accessible molecular methods to analyse eukaryotic communities. We aimed to develop and evaluate a methodology to analyse communities of the protozoan parasite Eimeria from populations of the Australian marsupial Petrogale penicillata (brush-tailed rock-wallaby) using NGS. An oocyst purification method for small sample sizes and a polymerase chain reaction (PCR) protocol for the 18S rRNA locus targeting Eimeria were developed and optimised prior to sequencing on the Illumina MiSeq platform. A data analysis approach was developed by modifying methods from bacterial metagenomics and utilising existing Eimeria sequences in GenBank. Operational taxonomic unit (OTU) assignment at a high similarity threshold (97%) was more accurate at assigning Eimeria contigs to Eimeria OTUs, but at a lower threshold (95%) there was greater resolution between OTU consensus sequences. The assessment of two amplification PCR methods prior to Illumina MiSeq sequencing, single and nested PCR, determined that single PCR was more sensitive to Eimeria, as more Eimeria OTUs were detected in single-PCR amplicons. We have developed a simple and cost-effective approach to a data analysis pipeline for community analysis of eukaryotic organisms using Eimeria communities as a model. The pipeline provides a basis for evaluation using other eukaryotic organisms and potential for diverse community analysis studies. Copyright © 2016 Elsevier B.V. All rights reserved.
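
    To make the role of the similarity threshold concrete, the toy sketch below clusters a handful of made-up, equal-length reads into OTUs with a greedy centroid scheme, using naive per-position identity; real OTU pipelines are far more involved, but the 95% versus 97% effect on the OTU count is visible even here.

      # Toy greedy OTU clustering to illustrate the effect of the similarity threshold.
      # Sequences are invented, equal-length strings; identity is a naive per-position match.
      def identity(a: str, b: str) -> float:
          return sum(x == y for x, y in zip(a, b)) / len(a)

      def cluster_otus(seqs, threshold):
          centroids, assignments = [], []
          for s in seqs:
              for i, c in enumerate(centroids):
                  if identity(s, c) >= threshold:
                      assignments.append(i)
                      break
              else:
                  centroids.append(s)           # start a new OTU with this read as centroid
                  assignments.append(len(centroids) - 1)
          return centroids, assignments

      reads = [
          "ACGTACGTACGTACGTACGT",
          "ACGTACGTACGTACGTACGA",   # 95% identical to the first read
          "ACGTACGTACGTACGTTTTT",   # 80% identical
          "ACGTACGTACGTACGTACGT",
      ]
      for t in (0.95, 0.97):
          otus, _ = cluster_otus(reads, t)
          print(f"threshold {t:.2f}: {len(otus)} OTUs")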

  19. Simultaneous profiling of activity patterns in multiple neuronal subclasses.

    PubMed

    Parrish, R Ryley; Grady, John; Codadu, Neela K; Trevelyan, Andrew J; Racca, Claudia

    2018-06-01

    Neuronal networks typically comprise heterogeneous populations of neurons. A core objective when seeking to understand such networks, therefore, is to identify what roles these different neuronal classes play. Acquiring single cell electrophysiology data for multiple cell classes can prove to be a large and daunting task. Alternatively, Ca2+ network imaging provides activity profiles of large numbers of neurons simultaneously, but without distinguishing between cell classes. We therefore developed a strategy for combining cellular electrophysiology, Ca2+ network imaging, and immunohistochemistry to provide activity profiles for multiple cell classes at once. This involves cross-referencing easily identifiable landmarks between imaging of the live and fixed tissue, and then using custom MATLAB functions to realign the two imaging data sets, to correct for distortions of the tissue introduced by the fixation or immunohistochemical processing. We illustrate the methodology for analyses of activity profiles during epileptiform events recorded in mouse brain slices. We further demonstrate the activity profile of a population of parvalbumin-positive interneurons prior to, during, and following a seizure-like event. Current approaches to Ca2+ network imaging analyses are severely limited in their ability to subclassify neurons, and often rely on transgenic approaches to identify cell classes. In contrast, our methodology is a generic, affordable, and flexible technique to characterize neuronal behaviour with respect to classification based on morphological and neurochemical identity. We present a new approach for analysing Ca2+ network imaging datasets, and use this to explore parvalbumin-positive interneuron activity during epileptiform events. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Equifinality in empirical studies of cultural transmission.

    PubMed

    Barrett, Brendan J

    2018-01-31

    Cultural systems exhibit equifinal behavior: a single final state may be arrived at via different mechanisms and/or from different initial states. Potential for equifinality exists in all empirical studies of cultural transmission, including controlled experiments, observational field research, and computational simulations. Acknowledging and anticipating the existence of equifinality is important in empirical studies of social learning and cultural evolution; it helps us understand the limitations of analytical approaches and can improve our ability to predict the dynamics of cultural transmission. Here, I illustrate and discuss examples of equifinality in studies of social learning, and how certain experimental designs might be prone to it. I then review examples of equifinality discussed in the social learning literature, namely the use of s-shaped diffusion curves to discern individual from social learning, and the operational definitions and analytical approaches used in studies of conformist transmission. While equifinality exists to some extent in all studies of social learning, I make suggestions for how to address instances of it, with an emphasis on using data simulation and methodological verification alongside modern statistical approaches that emphasize prediction and model comparison. In cases where evaluated learning mechanisms are equifinal due to non-methodological factors, I suggest that this is not always a problem if it helps us predict cultural change. In some cases, equifinal learning mechanisms might offer insight into how individual learning, social learning strategies, and other endogenous social factors can be important in structuring cultural dynamics and within- and between-group heterogeneity. Copyright © 2018 Elsevier B.V. All rights reserved.
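
    As a toy illustration of this point, the simulation below produces cumulative adoption curves from a purely asocial process (heterogeneous individual learning latencies) and from a frequency-dependent social process; both come out roughly s-shaped, so curve shape alone cannot discriminate between the mechanisms. Population size, latency distribution, and adoption rates are invented for illustration.

      # Minimal sketch: two different learning mechanisms, similar diffusion curves.
      import numpy as np

      rng = np.random.default_rng(1)
      N, T = 200, 80

      def asocial_curve() -> np.ndarray:
          # Each individual innovates on its own after a gamma-distributed latency.
          latencies = rng.gamma(shape=4.0, scale=8.0, size=N)
          return np.array([(latencies <= t).mean() for t in range(T)])

      def social_curve() -> np.ndarray:
          # Adoption probability rises with the current frequency of adopters.
          adopted = np.zeros(N, dtype=bool)
          curve = []
          for _ in range(T):
              p = 0.002 + 0.25 * adopted.mean()
              adopted |= (~adopted) & (rng.random(N) < p)
              curve.append(adopted.mean())
          return np.array(curve)

      # Both trajectories are roughly sigmoid when plotted against time.
      print(np.round(asocial_curve()[::10], 2))
      print(np.round(social_curve()[::10], 2))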

  1. How to regress and predict in a Bland-Altman plot? Review and contribution based on tolerance intervals and correlated-errors-in-variables models.

    PubMed

    Francq, Bernard G; Govaerts, Bernadette

    2016-06-30

    Two main methodologies for assessing equivalence in method-comparison studies are presented separately in the literature. The first one is the well-known and widely applied Bland-Altman approach with its agreement intervals, where two methods are considered interchangeable if their differences are not clinically significant. The second approach is based on errors-in-variables regression in a classical (X,Y) plot and focuses on confidence intervals, whereby two methods are considered equivalent when providing similar measures notwithstanding the random measurement errors. This paper reconciles these two methodologies and shows their similarities and differences using both real data and simulations. A new consistent correlated-errors-in-variables regression is introduced, as the errors are shown to be correlated in the Bland-Altman plot. Indeed, the coverage probabilities collapse and the biases soar when this correlation is ignored. Novel tolerance intervals are compared with agreement intervals with or without replicated data, and novel predictive intervals are introduced to predict a single measure in an (X,Y) plot or in a Bland-Altman plot with excellent coverage probabilities. We conclude that the (correlated)-errors-in-variables regressions should not be avoided in method comparison studies, although the Bland-Altman approach is usually applied to avert their complexity. We argue that tolerance or predictive intervals are better alternatives than agreement intervals, and we provide guidelines for practitioners regarding method comparison studies. Copyright © 2016 John Wiley & Sons, Ltd.
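
    For orientation, the sketch below computes the ingredients of a basic Bland-Altman plot (mean bias and the 1.96-SD agreement interval) for two hypothetical methods. The measurement values are made up, and the tolerance and predictive intervals advocated in the paper, as well as the correlated-errors-in-variables regression, are not reproduced here.

      # Minimal sketch of the classic Bland-Altman agreement interval.
      import numpy as np

      x = np.array([10.1, 12.3, 9.8, 15.2, 11.0, 13.4, 14.1, 10.7])   # method X
      y = np.array([10.4, 12.0, 10.1, 15.8, 11.3, 13.1, 14.6, 10.5])  # method Y

      means = (x + y) / 2            # horizontal axis of the Bland-Altman plot
      diffs = x - y                  # vertical axis
      bias = diffs.mean()
      sd = diffs.std(ddof=1)
      lower, upper = bias - 1.96 * sd, bias + 1.96 * sd
      print(f"bias = {bias:.3f}, 95% agreement interval = [{lower:.3f}, {upper:.3f}]")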

  2. Alternate approaches to repress endogenous microRNA activity in Arabidopsis thaliana

    PubMed Central

    Wang, Ming-Bo

    2011-01-01

    MicroRNAs (miRNAs) are an endogenous class of regulatory small RNA (sRNA). In plants, miRNAs are processed from short non-protein-coding messenger RNAs (mRNAs) transcribed from small miRNA genes (MIR genes). Traditionally in the model plant Arabidopsis thaliana (Arabidopsis), the functional analysis of a gene product has relied on the identification of a corresponding T-DNA insertion knockout mutant from a large, randomly-mutagenized population. However, because of the small size of MIR genes and the presence of multiple, highly conserved members in most plant miRNA families, it has been extremely laborious and time consuming to obtain a corresponding single or multiple null mutant plant line. Our recent study published in Molecular Plant1 outlines an alternate method for the functional characterization of miRNA action in Arabidopsis, termed anti-miRNA technology. Using this approach we demonstrated that the expression of individual miRNAs or entire miRNA families can be readily and efficiently knocked down. Our approach is in addition to two previously reported methodologies that also allow for the targeted suppression of either individual miRNAs or all members of a MIR gene family; these include miRNA target mimicry2,3 and transcriptional gene silencing (TGS) of MIR gene promoters.4 All three methodologies rely on endogenous gene regulatory machinery, and in this article we provide an overview of these technologies and discuss their strengths and weaknesses in inhibiting the activity of their targeted miRNA(s). PMID:21358288

  3. Re-engineering the Federal planning process: A total Federal planning strategy, integrating NEPA with modern management tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eccleston, C.H.

    1997-09-05

    The National Environmental Policy Act (NEPA) of 1969 was established by Congress more than a quarter of a century ago, yet there is a surprising lack of specific tools, techniques, and methodologies for effectively implementing these regulatory requirements. Lack of professionally accepted techniques is a principal factor responsible for many inefficiencies. Often, decision makers do not fully appreciate or capitalize on the true potential that NEPA provides as a platform for planning future actions. New approaches and modern management tools must be adopted to fully achieve NEPA's mandate. A new strategy, referred to as Total Federal Planning, is proposed for unifying large-scale federal planning efforts under a single, systematic, structured, and holistic process. Under this approach, the NEPA planning process provides a unifying framework for integrating all early environmental and nonenvironmental decision-making factors into a single comprehensive planning process. To promote effectiveness and efficiency, modern tools and principles from the disciplines of Value Engineering, Systems Engineering, and Total Quality Management are incorporated. Properly integrated and implemented, these planning tools provide the rigorous, structured, and disciplined framework essential in achieving effective planning. Ultimately, the goal of a Total Federal Planning strategy is to construct a unified and interdisciplinary framework that substantially improves decision-making, while reducing the time, cost, redundancy, and effort necessary to comply with environmental and other planning requirements. At a time when Congress is striving to re-engineer the governmental framework, apparatus, and process, a Total Federal Planning philosophy offers a systematic approach for uniting the disjointed and often convoluted planning process currently used by most federal agencies. Potentially, this approach has widespread implications for the way federal planning is approached.

  4. Perceived Managerial and Leadership Effectiveness in Colombia

    ERIC Educational Resources Information Center

    Torres, Luis Eduardo; Ruiz, Carlos Enrique; Hamlin, Bob; Velez-Calle, Andres

    2015-01-01

    Purpose: The purpose of this study was to identify what Colombians perceive as effective and least effective/ineffective managerial behavior. Design/methodology/approach: This study was conducted following a qualitative methodology based on the philosophical assumptions of pragmatism and the "pragmatic approach" (Morgan, 2007). The…

  5. Car-Parrinello simulation of hydrogen bond dynamics in sodium hydrogen bissulfate.

    PubMed

    Pirc, Gordana; Stare, Jernej; Mavri, Janez

    2010-06-14

    We studied proton dynamics of a short hydrogen bond in crystalline sodium hydrogen bissulfate, a hydrogen-bonded ferroelectric system. Our approach was based on the established Car-Parrinello molecular dynamics (CPMD) methodology, followed by an a posteriori quantization of the OH stretching motion. The latter approach is based on snapshot structures taken from the CPMD trajectory, calculation of proton potentials, and solution of the vibrational Schrödinger equation for each of the snapshot potentials. The contour of the OH stretching band obtained in this way has its center of gravity at about 1540 cm(-1) and a half width of about 700 cm(-1), which is in qualitative agreement with the experimental infrared spectrum. The corresponding values for the deuterated form are 1092 and 600 cm(-1), respectively. The hydrogen probability densities obtained by solving the vibrational Schrödinger equation allow for the evaluation of the potential of mean force along the proton transfer coordinate. We demonstrate that for the present system the free energy profile is of the single-well type and features a broad and shallow minimum near the center of the hydrogen bond, allowing for frequent and barrierless proton (or deuteron) jumps. All the calculated time-averaged geometric parameters were in reasonable agreement with the experimental neutron diffraction data. As the present methodology for quantization of proton motion is applicable to a variety of hydrogen-bonded systems, it is promising for potential use in computational enzymology.
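
    The a posteriori quantization step can be sketched as follows: for one snapshot proton potential, the one-dimensional vibrational Schrödinger equation is solved on a grid by finite differences and the OH-stretch fundamental is read off as E1 - E0. The model potential, the grid, and the use of the bare proton mass are illustrative assumptions; the actual calculation uses potentials extracted along the CPMD trajectory.

      # Minimal sketch of grid-based quantization of the OH stretching motion (atomic units).
      import numpy as np

      HARTREE_TO_CM = 219474.63
      m = 1836.15                          # proton mass in electron masses (illustrative choice)

      x = np.linspace(-1.5, 1.5, 600)      # stretching coordinate grid (bohr)
      dx = x[1] - x[0]
      V = 0.05 * x**2 + 0.02 * x**4        # illustrative stand-in for a snapshot potential (hartree)

      # Kinetic energy operator -1/(2m) d^2/dx^2 by central finite differences.
      n = x.size
      T = (-1.0 / (2.0 * m * dx**2)) * (
          np.diag(np.ones(n - 1), -1) - 2.0 * np.eye(n) + np.diag(np.ones(n - 1), 1)
      )
      H = T + np.diag(V)

      E = np.linalg.eigvalsh(H)
      print(f"0 -> 1 transition ≈ {(E[1] - E[0]) * HARTREE_TO_CM:.0f} cm^-1")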

  6. Set-Based Discrete Particle Swarm Optimization Based on Decomposition for Permutation-Based Multiobjective Combinatorial Optimization Problems.

    PubMed

    Yu, Xue; Chen, Wei-Neng; Gu, Tianlong; Zhang, Huaxiang; Yuan, Huaqiang; Kwong, Sam; Zhang, Jun

    2018-07-01

    This paper studies a specific class of multiobjective combinatorial optimization problems (MOCOPs), namely the permutation-based MOCOPs. Many common MOCOPs, e.g., the multiobjective traveling salesman problem (MOTSP) and the multiobjective project scheduling problem (MOPSP), belong to this problem class, yet they can be very different. However, because permutation-based MOCOPs share the inherent similarity that their search space is usually structured as a permutation tree, this paper proposes a generic multiobjective set-based particle swarm optimization methodology based on decomposition, termed MS-PSO/D. To accommodate the properties of permutation-based MOCOPs, MS-PSO/D utilizes an element-based representation and a constructive approach, through which feasible solutions satisfying the constraints can be generated step by step following the permutation-tree-shaped structure. Problem-related heuristic information is also introduced into the constructive approach for efficiency. To address the multiobjective optimization issues, a decomposition strategy is employed, in which the problem is converted into multiple single-objective subproblems according to a set of weight vectors. In addition, a flexible mechanism for diversity control is provided in MS-PSO/D. Extensive experiments have been conducted to study MS-PSO/D on two permutation-based MOCOPs, namely the MOTSP and the MOPSP. Experimental results validate that the proposed methodology is promising.
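
    To illustrate the decomposition ingredient on its own, the sketch below converts a toy two-objective problem into several single-objective subproblems using evenly spread weight vectors and a Tchebycheff aggregation function. The objective values and reference point are invented, and the set-based PSO update and permutation-specific constructive step of MS-PSO/D are not reproduced.

      # Minimal sketch of weight-vector decomposition with Tchebycheff aggregation.
      import numpy as np

      def weight_vectors(k: int) -> np.ndarray:
          """k evenly spread weight vectors for a bi-objective problem."""
          w1 = np.linspace(0.0, 1.0, k)
          return np.column_stack([w1, 1.0 - w1])

      def tchebycheff(f: np.ndarray, w: np.ndarray, z_star: np.ndarray) -> float:
          """Scalarized subproblem value of objective vector f under weights w."""
          return float(np.max(w * np.abs(f - z_star)))

      # Toy objective vectors for three candidate permutations (e.g. tour length, makespan).
      candidates = np.array([[120.0, 35.0], [100.0, 50.0], [140.0, 20.0]])
      z_star = candidates.min(axis=0)      # ideal point

      for w in weight_vectors(5):
          scores = [tchebycheff(f, w, z_star) for f in candidates]
          print(w, "-> best candidate:", int(np.argmin(scores)))

    Each weight vector defines its own scalar subproblem, so different subproblems favour different candidates, which is what lets a decomposition-based swarm spread its particles across the Pareto front.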

  7. Lattice Boltzmann methods for global linear instability analysis

    NASA Astrophysics Data System (ADS)

    Pérez, José Miguel; Aguilar, Alfonso; Theofilis, Vassilis

    2017-12-01

    Modal global linear instability analysis is performed using, for the first time ever, the lattice Boltzmann method (LBM) to analyze incompressible flows with two and three inhomogeneous spatial directions. Four linearization models have been implemented in order to recover the linearized Navier-Stokes equations in the incompressible limit. Two of those models employ the single relaxation time (SRT) and have been proposed previously in the literature as linearizations of the collision operator of the lattice Boltzmann equation. Two additional models are derived herein for the first time by linearizing the local equilibrium probability distribution function. Instability analysis results are obtained in three benchmark problems, two in closed geometries and one in open flow, namely the square and cubic lid-driven cavity flows and the flow in the wake of a circular cylinder. Comparisons with results delivered by classic spectral element methods verify the accuracy of the proposed new methodologies and point to potential limitations particular to the LBM approach. The known appearance of numerical instabilities when the SRT model is used in direct numerical simulations employing the LBM is shown to be reflected in a spurious global eigenmode when the SRT model is used in the instability analysis. Although this mode is absent in the multiple-relaxation-times model, other spurious instabilities can also arise and are documented herein. Areas of potential improvement that would make the proposed methodology competitive with established approaches for global instability analysis are discussed.
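
    The modal-analysis step shared by such studies can be sketched independently of the LBM discretization: once a linearized operator has been assembled, its leading eigenvalues are inspected, and any eigenvalue with positive real part flags a globally unstable mode. The toy dense matrix below merely stands in for the linearized operator; production analyses work on much larger sparse operators and use Krylov/Arnoldi eigensolvers rather than a full dense solve.

      # Minimal sketch: leading eigenvalues of a toy "linearized operator".
      import numpy as np
      from scipy.linalg import eig

      rng = np.random.default_rng(0)
      n = 300
      # Strongly damped diagonal plus weak non-normal coupling, with one weakly
      # amplified mode planted for illustration.
      A = np.diag(-1.0 - rng.random(n)) + 0.01 * rng.standard_normal((n, n))
      A[0, 0] = 0.05

      eigvals = eig(A, right=False)                       # complex spectrum
      for lam in sorted(eigvals, key=lambda z: -z.real)[:5]:
          status = "unstable" if lam.real > 0 else "stable"
          print(f"lambda = {lam.real:+.4f} {lam.imag:+.4f}i  ({status})")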

  8. Segmentation methodology for automated classification and differentiation of soft tissues in multiband images of high-resolution ultrasonic transmission tomography.

    PubMed

    Jeong, Jeong-Won; Shin, Dae C; Do, Synho; Marmarelis, Vasilis Z

    2006-08-01

    This paper presents a novel segmentation methodology for automated classification and differentiation of soft tissues using multiband data obtained with the newly developed system of high-resolution ultrasonic transmission tomography (HUTT) for imaging biological organs. This methodology extends and combines two existing approaches: the L-level set active contour (AC) segmentation approach and the agglomerative hierarchical κ-means approach for unsupervised clustering (UC). To prevent the trapping of the current iterative minimization AC algorithm in a local minimum, we introduce a multiresolution approach that applies the level set functions at successively increasing resolutions of the image data. The resulting AC clusters are subsequently rearranged by the UC algorithm that seeks the optimal set of clusters yielding the minimum within-cluster distances in the feature space. The presented results from Monte Carlo simulations and experimental animal-tissue data demonstrate that the proposed methodology outperforms other existing methods without depending on heuristic parameters and provides a reliable means for soft tissue differentiation in HUTT images.
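
    As a rough illustration of the unsupervised-clustering stage, the sketch below agglomeratively clusters per-region feature vectors and scores candidate groupings by their total within-cluster dispersion. The synthetic feature vectors are invented, and the coupling with the multiresolution level-set segmentation is not reproduced here.

      # Minimal sketch of agglomerative clustering of region feature vectors.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(0)
      features = np.vstack([                      # toy multiband features, rows = segmented regions
          rng.normal(loc=0.0, scale=0.3, size=(10, 4)),
          rng.normal(loc=2.0, scale=0.3, size=(10, 4)),
          rng.normal(loc=4.0, scale=0.3, size=(10, 4)),
      ])

      Z = linkage(features, method="average")     # agglomerative merge tree

      def within_cluster_dispersion(labels: np.ndarray) -> float:
          return sum(pdist(features[labels == c]).sum()
                     for c in np.unique(labels) if (labels == c).sum() > 1)

      for k in (2, 3, 4, 5):
          labels = fcluster(Z, t=k, criterion="maxclust")
          print(k, "clusters, within-cluster dispersion:", round(within_cluster_dispersion(labels), 2))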

  9. Single Event Burnout in DC-DC Converters for the LHC Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Claudio H. Rivetta et al.

    High voltage transistors in DC-DC converters are prone to catastrophic Single Event Burnout in the LHC radiation environment. This paper presents a systematic methodology to analyze single event effects sensitivity in converters and proposes solutions based on de-rating input voltage and output current or voltage.

  10. Case Series Investigations in Cognitive Neuropsychology

    PubMed Central

    Schwartz, Myrna F.; Dell, Gary S.

    2011-01-01

    Case series methodology involves the systematic assessment of a sample of related patients, with the goal of understanding how and why they differ from one another. This method has become increasingly important in cognitive neuropsychology, which has long been identified with single-subject research. We review case series studies dealing with impaired semantic memory, reading, and language production, and draw attention to the affinity of this methodology for testing theories that are expressed as computational models and for addressing questions about neuroanatomy. It is concluded that case series methods usefully complement single-subject techniques. PMID:21714756

  11. Generating Vegetation Leaf Area Index Earth System Data Record from Multiple Sensors. Part 1; Theory

    NASA Technical Reports Server (NTRS)

    Ganguly, Sangram; Schull, Mitchell A.; Samanta, Arindam; Shabanov, Nikolay V.; Milesi, Cristina; Nemani, Ramakrishna R.; Knyazikhin, Yuri; Myneni, Ranga B.

    2008-01-01

    The generation of multi-decade long Earth System Data Records (ESDRs) of Leaf Area Index (LAI) and Fraction of Photosynthetically Active Radiation absorbed by vegetation (FPAR) from remote sensing measurements of multiple sensors is key to monitoring long-term changes in vegetation due to natural and anthropogenic influences. Challenges in developing such ESDRs include problems in remote sensing science (modeling of variability in global vegetation, scaling, atmospheric correction) and sensor hardware (differences in spatial resolution, spectral bands, calibration, and information content). In this paper, we develop a physically based approach for deriving LAI and FPAR products from Advanced Very High Resolution Radiometer (AVHRR) data that are of comparable quality to the Moderate Resolution Imaging Spectroradiometer (MODIS) LAI and FPAR products, thus realizing the objective of producing a long (multi-decadal) time series of these products. The approach is based on the radiative transfer theory of canopy spectral invariants, which facilitates parameterization of the canopy spectral bidirectional reflectance factor (BRF). The methodology permits decoupling of the structural and radiometric components and obeys the energy conservation law. The approach is applicable to any optical sensor; however, it requires selection of sensor-specific values of configurable parameters, namely, the single scattering albedo and data uncertainty. According to the theory of spectral invariants, the single scattering albedo is a function of the spatial scale and thus accounts for the variation in BRF with sensor spatial resolution. Likewise, the single scattering albedo accounts for the variation in spectral BRF with sensor bandwidths. The second adjustable parameter is data uncertainty, which accounts for the varying information content of the remote sensing measurements, i.e., Normalized Difference Vegetation Index (NDVI, low information content) vs. spectral BRF (higher information content). Implementation of this approach indicates good consistency in LAI values retrieved from NDVI (AVHRR-mode) and spectral BRF (MODIS-mode). Specific details of the implementation and evaluation of the derived products are presented in the second part of this two-paper series.

  12. Reliability analysis of single crystal NiAl turbine blades

    NASA Technical Reports Server (NTRS)

    Salem, Jonathan; Noebe, Ronald; Wheeler, Donald R.; Holland, Fred; Palko, Joseph; Duffy, Stephen; Wright, P. Kennard

    1995-01-01

    As part of a co-operative agreement with General Electric Aircraft Engines (GEAE), NASA LeRC is modifying and validating the Ceramic Analysis and Reliability Evaluation of Structures algorithm for use in design of components made of high strength NiAl based intermetallic materials. NiAl single crystal alloys are being actively investigated by GEAE as a replacement for Ni-based single crystal superalloys for use in high pressure turbine blades and vanes. The driving force for this research lies in the numerous property advantages offered by NiAl alloys over their superalloy counterparts. These include a reduction of density by as much as a third without significantly sacrificing strength, higher melting point, greater thermal conductivity, better oxidation resistance, and a better response to thermal barrier coatings. The current drawback to high strength NiAl single crystals is their limited ductility. Consequently, significant efforts including the work agreement with GEAE are underway to develop testing and design methodologies for these materials. The approach to validation and component analysis involves the following steps: determination of the statistical nature and source of fracture in a high strength, NiAl single crystal turbine blade material; measurement of the failure strength envelope of the material; coding of statistically based reliability models; verification of the code and model; and modeling of turbine blades and vanes for rig testing.

  13. Quantifying biopsychosocial aspects in everyday contexts: an integrative methodological approach from the behavioral sciences

    PubMed Central

    Portell, Mariona; Anguera, M Teresa; Hernández-Mendo, Antonio; Jonsson, Gudberg K

    2015-01-01

    Contextual factors are crucial for evaluative research in psychology, as they provide insights into what works, for whom, in what circumstances, in what respects, and why. Studying behavior in context, however, poses numerous methodological challenges. Although a comprehensive framework for classifying methods seeking to quantify biopsychosocial aspects in everyday contexts was recently proposed, this framework does not contemplate contributions from observational methodology. The aim of this paper is to justify and propose a more general framework that includes observational methodology approaches. Our analysis is rooted in two general concepts: ecological validity and methodological complementarity. We performed a narrative review of the literature on research methods and techniques for studying daily life and describe their shared properties and requirements (collection of data in real time, on repeated occasions, and in natural settings) and classification criteria (e.g., variables of interest and level of participant involvement in the data collection process). We provide several examples that illustrate why, despite their higher costs, studies of behavior and experience in everyday contexts offer insights that complement findings provided by other methodological approaches. We urge that observational methodology be included in classifications of research methods and techniques for studying everyday behavior and advocate a renewed commitment to prioritizing ecological validity in behavioral research seeking to quantify biopsychosocial aspects. PMID:26089708

  14. Use of methodological tools for assessing the quality of studies in periodontology and implant dentistry: a systematic review.

    PubMed

    Faggion, Clovis M; Huda, Fahd; Wasiak, Jason

    2014-06-01

    To evaluate the methodological approaches used to assess the quality of studies included in systematic reviews (SRs) in periodontology and implant dentistry. Two electronic databases (PubMed and Cochrane Database of Systematic Reviews) were searched independently to identify SRs examining interventions published through 2 September 2013. The reference lists of included SRs and records of 10 specialty dental journals were searched manually. Methodological approaches were assessed using seven criteria based on the Cochrane Handbook for Systematic Reviews of Interventions. Temporal trends in methodological quality were also explored. Of the 159 SRs with meta-analyses included in the analysis, 44 (28%) reported the use of domain-based tools, 15 (9%) reported the use of checklists and 7 (4%) reported the use of scales. Forty-two (26%) SRs reported use of more than one tool. Criteria were met heterogeneously; authors of 15 (9%) publications incorporated the quality of evidence of primary studies into SRs, whereas 69% of SRs reported methodological approaches in the Materials/Methods section. Reporting of four criteria was significantly better in recent (2010-2013) than in previous publications. The analysis identified several methodological limitations of approaches used to assess evidence in studies included in SRs in periodontology and implant dentistry. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. Research Methodology in Second Language Studies: Trends, Concerns, and New Directions

    ERIC Educational Resources Information Center

    King, Kendall A.; Mackey, Alison

    2016-01-01

    The field of second language studies is using increasingly sophisticated methodological approaches to address a growing number of urgent, real-world problems. These methodological developments bring both new challenges and opportunities. This article briefly reviews recent ontological and methodological debates in the field, then builds on these…

  16. Measurement and Analysis of Multiple Output Transient Propagation in BJT Analog Circuits

    NASA Astrophysics Data System (ADS)

    Roche, Nicolas J.-H.; Khachatrian, A.; Warner, J. H.; Buchner, S. P.; McMorrow, D.; Clymer, D. A.

    2016-08-01

    The propagation of Analog Single Event Transients (ASETs) to multiple outputs of Bipolar Junction Transistor (BJT) Integrated Circuits (ICs) is reported for the first time. The results demonstrate that ASETs can appear at several outputs of a BJT amplifier or comparator as a result of a single ion or single laser pulse strike at a single physical location on the chip of a large-scale integrated BJT analog circuit, independent of interconnect cross-talk or charge-sharing effects. Laser experiments, together with SPICE simulations and analysis of the ASET's propagation in the s-domain, are used to explain how multiple-output transients (MOTs) are generated and propagate in the device. This study demonstrates that both the charge collection associated with an ASET and the ASET's shape, commonly used to characterize the propagation of SETs in devices and systems, are unable to explain quantitatively how MOTs propagate through an integrated analog circuit. The analysis methodology adopted here involves combining the Fourier transform of the propagating signal and the current-source transfer function in the s-domain. This approach reveals the mechanisms involved in the transient signal propagation from its point of generation to one or more outputs without the signal following a continuous interconnect path.
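
    The frequency-domain propagation idea can be sketched as follows: the Fourier transform of an injected double-exponential SET current pulse is multiplied by a node-to-output transfer function evaluated at s = j*omega, and the product is transformed back to give the output transient. The single-pole transfer function and the pulse parameters below are illustrative assumptions, not values extracted from any actual BJT circuit.

      # Minimal sketch of transfer-function-based transient propagation.
      import numpy as np

      fs = 10e9                                                  # 10 GS/s time grid
      t = np.arange(0, 200e-9, 1 / fs)
      i_set = 1e-3 * (np.exp(-t / 20e-9) - np.exp(-t / 2e-9))    # double-exponential SET current (A)

      # Single-pole transimpedance from the struck node to one output: H(s) = R / (1 + s/w0).
      R, w0 = 5e3, 2 * np.pi * 50e6
      freqs = np.fft.rfftfreq(t.size, d=1 / fs)
      H = R / (1 + 1j * 2 * np.pi * freqs / w0)

      v_out = np.fft.irfft(np.fft.rfft(i_set) * H, n=t.size)     # output-voltage transient (V)
      print(f"peak output transient ≈ {v_out.max() * 1e3:.1f} mV")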

  17. Modeling Single-Event Transient Propagation in a SiGe BiCMOS Direct-Conversion Receiver

    NASA Astrophysics Data System (ADS)

    Ildefonso, Adrian; Song, Ickhyun; Tzintzarov, George N.; Fleetwood, Zachary E.; Lourenco, Nelson E.; Wachter, Mason T.; Cressler, John D.

    2017-08-01

    The propagation of single-event transient (SET) signals in a silicon-germanium direct-conversion receiver carrying modulated data is explored. A theoretical analysis of transient propagation, verified by simulation, is presented. A new methodology to characterize and quantify the impact of SETs in communication systems carrying modulated data is proposed. The proposed methodology uses a pulsed radiation source to induce distortions in the signal constellation. The error vector magnitude due to SETs can then be calculated to quantify errors. Two different modulation schemes were simulated: QPSK and 16-QAM. The distortions in the constellation diagram agree with the presented circuit theory. Furthermore, the proposed methodology was applied to evaluate the improvements in the SET response due to a known radiation-hardening-by-design (RHBD) technique, where the common-base device of the low-noise amplifier was operated in inverse mode. The proposed methodology can be a valid technique to determine the most sensitive parts of a system carrying modulated data.
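
    A minimal version of the error-vector-magnitude calculation at the heart of the proposed methodology is sketched below for a QPSK constellation: the error vector is the difference between received and ideal symbols, and EVM is its RMS magnitude normalized by the RMS reference magnitude. The symbol stream, noise level, and the injected burst of distorted symbols are invented for illustration.

      # Minimal sketch of EVM from a distorted constellation.
      import numpy as np

      rng = np.random.default_rng(3)
      bits = rng.integers(0, 4, size=1000)
      ideal = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))        # unit-energy QPSK symbols

      received = ideal + 0.02 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))
      received[200:210] += 0.6 * np.exp(1j * 1.0)                # a short burst of SET-distorted symbols

      error = received - ideal
      evm_rms = np.sqrt(np.mean(np.abs(error) ** 2) / np.mean(np.abs(ideal) ** 2))
      print(f"EVM ≈ {100 * evm_rms:.2f} % rms")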

  18. MetaCAA: A clustering-aided methodology for efficient assembly of metagenomic datasets.

    PubMed

    Reddy, Rachamalla Maheedhar; Mohammed, Monzoorul Haque; Mande, Sharmila S

    2014-01-01

    A key challenge in analyzing metagenomics data pertains to assembly of sequenced DNA fragments (i.e. reads) originating from various microbes in a given environmental sample. Several existing methodologies can assemble reads originating from a single genome. However, these methodologies cannot be applied for efficient assembly of metagenomic sequence datasets. In this study, we present MetaCAA - a clustering-aided methodology which helps in improving the quality of metagenomic sequence assembly. MetaCAA initially groups sequences constituting a given metagenome into smaller clusters. Subsequently, sequences in each cluster are independently assembled using CAP3, an existing single genome assembly program. Contigs formed in each of the clusters along with the unassembled reads are then subjected to another round of assembly for generating the final set of contigs. Validation using simulated and real-world metagenomic datasets indicates that MetaCAA aids in improving the overall quality of assembly. A software implementation of MetaCAA is available at https://metagenomics.atc.tcs.com/MetaCAA. Copyright © 2014 Elsevier Inc. All rights reserved.
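
    The clustering-aided idea can be sketched as a coarse pre-grouping of reads by k-mer signature, so that the downstream single-genome assembler sees more homogeneous input. The greedy signature clustering below is an illustrative stand-in for MetaCAA's actual clustering step, the example reads are invented, and the per-cluster CAP3 assembly and final co-assembly rounds are omitted.

      # Minimal sketch: group reads by shared k-mer content before per-cluster assembly.
      from collections import Counter

      def kmer_profile(seq: str, k: int = 4) -> Counter:
          return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

      def similarity(a: Counter, b: Counter) -> float:
          shared = sum((a & b).values())                   # k-mers common to both profiles
          return shared / max(1, min(sum(a.values()), sum(b.values())))

      def cluster_reads(reads: list[str], threshold: float = 0.3) -> list[list[str]]:
          clusters: list[tuple[Counter, list[str]]] = []
          for r in reads:
              prof = kmer_profile(r)
              for rep, members in clusters:
                  if similarity(prof, rep) >= threshold:
                      members.append(r)
                      break
              else:
                  clusters.append((prof, [r]))
          return [members for _, members in clusters]

      reads = ["ACGTACGTACGTAGG", "ACGTACGTACGTAGC", "TTGGCCAATTGGCCA", "TTGGCCAATTGGCCT"]
      for i, members in enumerate(cluster_reads(reads)):
          print(f"cluster {i}: {len(members)} reads")      # each cluster would be assembled separately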

  19. 76 FR 23825 - Study Methodologies for Diagnostics in the Postmarket Setting; Public Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-28

    ... community on issues related to the studies and methodological approaches examining diagnostics in the... discuss a large number of methodological concerns at the workshop, including, but not limited to the...

  20. 77 FR 66471 - Methodology for Designation of Frontier and Remote Areas

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-05

    ... the use of a shorter, more intuitively appealing descriptive label in research publications and other...) Selection of final methodological approach; and (8) Analyses using final methodology on 2000 data. All the...
