Gale, Nicola K; Heath, Gemma; Cameron, Elaine; Rashid, Sabina; Redwood, Sabi
2013-09-18
The Framework Method is becoming an increasingly popular approach to the management and analysis of qualitative data in health research. However, there is confusion about its potential application and limitations. The article discusses when it is appropriate to adopt the Framework Method and explains the procedure for using it in multi-disciplinary health research teams, or those that involve clinicians, patients and lay people. The stages of the method are illustrated using examples from a published study. Used effectively, with the leadership of an experienced qualitative researcher, the Framework Method is a systematic and flexible approach to analysing qualitative data and is appropriate for use in research teams even where not all members have previous experience of conducting qualitative research.
Chiasson, Mike; Reddy, Madhu; Kaplan, Bonnie; Davidson, Elizabeth
2007-06-01
The effective use of information technology (IT) is a crucial component for the delivery of effective services in health care. Current approaches to medical informatics (MI) research have significantly contributed to the success of IT use in health care but important challenges remain to be addressed. We believe that expanding the multi-disciplinary basis for MI research is important to meeting these research challenges. In this paper, we outline theories and methods used in information systems (IS) research that we believe can inform our understanding of health care IT applications and outcomes. To do so, we discuss some general differences in the focus and methods of MI and IS research to identify broad opportunities. We then review conceptual and methodological approaches in IS that have been applied in health care IT research. These include: technology-use mediation, collaborative work, genre theory, interpretive research, action research, and modeling. Examples of these theories and methods in healthcare IS research are illustrated.
Braithwaite, Jeffrey; Westbrook, Johanna; Pawsey, Marjorie; Greenfield, David; Naylor, Justine; Iedema, Rick; Runciman, Bill; Redman, Sally; Jorm, Christine; Robinson, Maureen; Nathan, Sally; Gibberd, Robert
2006-01-01
Background Accreditation has become ubiquitous across the international health care landscape. Award of full accreditation status in health care is viewed, as it is in other sectors, as a valid indicator of high-quality organisational performance. However, few studies have empirically demonstrated this assertion. The value of accreditation therefore remains uncertain, and this persists as a central legitimacy problem for accreditation providers, policymakers and researchers. The question arises as to how best to research the validity, impact and value of accreditation processes in health care. Most health care organisations participate in some sort of accreditation process, so it is not possible to study its merits using a randomised controlled strategy. Further, tools and processes for accreditation and organisational performance are multifaceted. Methods/design To understand the relationship between them, a multi-method research approach is required that incorporates both quantitative and qualitative data. The generic nature of accreditation standard development and inspection across sectors enhances the extent to which the findings of an in-depth study of an accreditation process in one industry can be generalised to other industries. This paper presents a research design comprising a prospective, multi-method, multi-level, multi-disciplinary approach to assess the validity, impact and value of accreditation. Discussion The accreditation program that assesses over 1,000 health services in Australia is used as an exemplar for testing this design. The paper proposes this design as a framework suitable for application to future international research into accreditation. Our aim is to stimulate debate on the role of accreditation and how to research it. PMID:16968552
Identity, Intersectionality, and Mixed-Methods Approaches
ERIC Educational Resources Information Center
Harper, Casandra E.
2011-01-01
In this article, the author argues that current strategies to study and understand students' identities fall short of fully capturing their complexity. A multi-dimensional perspective and a mixed-methods approach can reveal nuance that is missed with current approaches. The author offers an illustration of how mixed-methods research can promote a…
NASA Astrophysics Data System (ADS)
Kuzle, A.
2018-06-01
The important role that metacognition plays as a predictor of students' mathematical learning and mathematical problem-solving has been extensively documented. But only recently has attention turned to the primary grades, and more research is needed at this level. The goals of this paper are threefold: (1) to present a metacognitive framework for mathematics problem-solving, (2) to describe a multi-method interview approach developed to study students' mathematical metacognition, and (3) to empirically evaluate the utility of the model and the adaptation of the approach in the context of grade 2 and grade 4 mathematics problem-solving. The results are discussed with regard both to further development of the adapted multi-method interview approach and to their theoretical and practical implications.
Shared worlds: multi-sited ethnography and nursing research.
Molloy, Luke; Walker, Kim; Lakeman, Richard
2017-03-22
Background Ethnography, originally developed for the study of supposedly small-scale societies, is now faced with an increasingly mobile, changing and globalised world. Cultural identities can exist without reference to a specific location and extend beyond regional and national boundaries. It is therefore no longer imperative that the sole object of the ethnographer's practice should be a geographically bounded site. Aim To present a critical methodological review of multi-sited ethnography. Discussion Understanding that it can no longer be taken with any certainty that location alone determines culture, multi-sited ethnography provides a method of contextualising multi-sited social phenomena. The method enables researchers to examine social phenomena that are simultaneously produced in different locations. It has been used to undertake cultural analysis of diverse areas such as organ trafficking, global organisations, technologies and anorexia. Conclusion The authors contend that multi-sited ethnography is particularly suited to nursing research as it provides researchers with an ethnographic method that is more relevant to the interconnected world of health and healthcare services. Implications for practice Multi-sited ethnography provides nurse researchers with an approach to cultural analysis in areas such as the social determinants of health, healthcare services and the effects of health policies across multiple locations.
ERIC Educational Resources Information Center
Bértoa, Fernando Casal
2017-01-01
Although much has been written about the process of party system institutionalization in different regions, the reasons why some party systems institutionalize while others do not still remain a mystery. Seeking to fill this lacuna in the literature, and using a mixed-methods research approach, this article constitutes a first attempt to answer…
Multi-label literature classification based on the Gene Ontology graph.
Jin, Bo; Muller, Brian; Zhai, Chengxiang; Lu, Xinghua
2008-12-08
The Gene Ontology is a controlled vocabulary for representing knowledge related to genes and proteins in a computable form. The current effort of manually annotating proteins with the Gene Ontology is outpaced by the rate of accumulation of biomedical knowledge in literature, which urges the development of text mining approaches to facilitate the process by automatically extracting the Gene Ontology annotation from literature. The task is usually cast as a text classification problem, and contemporary methods are confronted with unbalanced training data and the difficulties associated with multi-label classification. In this research, we investigated the methods of enhancing automatic multi-label classification of biomedical literature by utilizing the structure of the Gene Ontology graph. We have studied three graph-based multi-label classification algorithms, including a novel stochastic algorithm and two top-down hierarchical classification methods for multi-label literature classification. We systematically evaluated and compared these graph-based classification algorithms to a conventional flat multi-label algorithm. The results indicate that, through utilizing the information from the structure of the Gene Ontology graph, the graph-based multi-label classification methods can significantly improve predictions of the Gene Ontology terms implied by the analyzed text. Furthermore, the graph-based multi-label classifiers are capable of suggesting Gene Ontology annotations (to curators) that are closely related to the true annotations even if they fail to predict the true ones directly. A software package implementing the studied algorithms is available for the research community. Through utilizing the information from the structure of the Gene Ontology graph, the graph-based multi-label classification methods have better potential than the conventional flat multi-label classification approach to facilitate protein annotation based on the literature.
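As a minimal sketch of how the Gene Ontology graph structure can be exploited (a generic hierarchy-consistency step under the true-path rule, not the paper's stochastic or top-down algorithms; the term scores and the toy DAG below are hypothetical):

```python
# Hierarchy-consistent post-processing for multi-label GO prediction.
# The true-path rule says a protein annotated with a term is implicitly
# annotated with all of its ancestors, so each ancestor's score is raised
# to at least the maximum score among its descendants.

def propagate_scores(scores, parents):
    """scores: {term: float}; parents: {term: [parent terms]} over a DAG.
    Returns scores where every parent scores at least as high as its children."""
    out = dict(scores)
    changed = True
    while changed:  # simple fixed-point iteration; fine for small DAGs
        changed = False
        for child, ps in parents.items():
            for p in ps:
                if out.get(p, 0.0) < out[child]:
                    out[p] = out[child]
                    changed = True
    return out

# toy GO fragment: GO:B and GO:C are children of GO:A
scores = {"GO:A": 0.2, "GO:B": 0.9, "GO:C": 0.1}
parents = {"GO:B": ["GO:A"], "GO:C": ["GO:A"]}
consistent = propagate_scores(scores, parents)
```

This illustrates only why predictions "closely related to the true annotations" are useful: a confident child term pulls its ancestors up, so near-misses still surface the right region of the ontology.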
Chen, Hai; Liang, Xiaoying; Li, Rui
2013-01-01
Multi-Agent Systems (MAS) offer a conceptual approach for including multi-actor decision making in models of land use change. Through MAS-based simulation, this paper shows the application of MAS to micro-scale land use and cover change (LUCC) and examines the mechanism of transformation between scales. The paper starts with a description of the context of MAS research, then adopts the Nested Spatial Choice (NSC) method to construct a multi-scale LUCC decision-making model. A case study of Mengcha village, Mizhi County, Shaanxi Province is reported, and the potentials and drawbacks of the approach are discussed. From our design and implementation of the MAS in the multi-scale model, a number of observations and conclusions can be drawn on the implementation and future research directions. (1) The LUCC decision-making and multi-scale transformation framework provides, in our view, a more realistic model of the multi-scale decision-making process. (2) Using continuous rather than discrete functions to construct household decision-making reflects its effects more realistically. (3) Attempts have been made to analyse household interaction quantitatively, providing a premise and foundation for research on communication and learning among households. (4) The scale-transformation architecture constructed in this paper helps to accumulate theory and experience for research on the interaction between micro-level land use decision-making and the macro-level land use landscape pattern. Our future research will focus on: (1) how to make rational use of the risk-aversion principle and incorporate rules on rotation between household parcels into the model; (2) exploring methods for researching household decision-making over long periods, allowing us to bridge long-term LUCC data and short-term household decision-making; and (3) researching quantitative methods and models, especially scenario-analysis models that may reflect the interaction among different household types.
FODEM: A Multi-Threaded Research and Development Method for Educational Technology
ERIC Educational Resources Information Center
Suhonen, Jarkko; de Villiers, M. Ruth; Sutinen, Erkki
2012-01-01
Formative development method (FODEM) is a multi-threaded design approach originated to support the design and development of various types of educational technology innovations, such as learning tools and online study programmes. The threaded and agile structure of the approach provides flexibility to the design process. Intensive…
Progress in multi-dimensional upwind differencing
NASA Technical Reports Server (NTRS)
Van Leer, Bram
1992-01-01
Multi-dimensional upwind-differencing schemes for the Euler equations are reviewed. On the basis of the first-order upwind scheme for a one-dimensional convection equation, the two approaches to upwind differencing are discussed: the fluctuation approach and the finite-volume approach. The usual extension of the finite-volume method to the multi-dimensional Euler equations is not entirely satisfactory, because the direction of wave propagation is always assumed to be normal to the cell faces. This leads to smearing of shock and shear waves when these are not grid-aligned. Multi-directional methods, in which upwind-biased fluxes are computed in a frame aligned with a dominant wave, overcome this problem, but at the expense of robustness. The same is true for the schemes incorporating a multi-dimensional wave model not based on multi-dimensional data but on an 'educated guess' of what they could be. The fluctuation approach offers the best possibilities for the development of genuinely multi-dimensional upwind schemes. Three building blocks are needed for such schemes: a wave model, a way to achieve conservation, and a compact convection scheme. Recent advances in each of these components are discussed; putting them all together is the present focus of a worldwide research effort. Some numerical results are presented, illustrating the potential of the new multi-dimensional schemes.
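The one-dimensional starting point the review builds on can be sketched directly. Below is a minimal first-order upwind scheme for the linear convection equation u_t + a u_x = 0 with a > 0; the grid size, CFL number and pulse shape are illustrative choices, not from the paper:

```python
import numpy as np

def upwind_step(u, cfl):
    # for a > 0 the upwind neighbour is on the left: difference u[i] - u[i-1]
    return u - cfl * (u - np.roll(u, 1))

n = 100
x = np.linspace(0.0, 1.0, n, endpoint=False)   # periodic domain [0, 1)
u = np.where((x > 0.4) & (x < 0.6), 1.0, 0.0)  # square pulse initial data
mass0 = u.sum()                                 # total "mass" before stepping

cfl = 0.5  # a * dt / dx; the scheme is stable (and monotone) for cfl <= 1
for _ in range(50):
    u = upwind_step(u, cfl)
```

With cfl = 0.5 each step is the convex combination 0.5*u[i] + 0.5*u[i-1], so the solution stays within its initial bounds while the discontinuities smear, which is exactly the shock/shear smearing the review discusses for non-grid-aligned waves in higher dimensions.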
Varmazyar, Mohsen; Dehghanbaghi, Maryam; Afkhami, Mehdi
2016-10-01
Balanced Scorecard (BSC) is a strategic evaluation tool that uses both financial and non-financial indicators to determine the business performance of organizations or companies. In this paper, a new integrated approach based on the BSC and multi-criteria decision-making (MCDM) methods is proposed to evaluate the performance of the research centers of a research and technology organization (RTO) in Iran. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) method is employed to reflect the interdependencies among BSC perspectives. Then, the Analytic Network Process (ANP) is utilized to weight the indices influencing the considered problem. In the next step, we apply four MCDM methods for ranking the alternatives: Additive Ratio Assessment (ARAS), Complex Proportional Assessment (COPRAS), Multi-Objective Optimization by Ratio Analysis (MOORA), and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). Finally, the utility interval technique is applied to combine the ranking results of the MCDM methods. Weighted utility intervals are computed by constructing a correlation matrix between the ranking methods. A real case is presented to show the efficacy of the proposed approach.
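As a rough illustration of one of the ranking steps above (conventional TOPSIS only, not the full DEMATEL-ANP-utility-interval pipeline; the decision matrix, weights, and "research centre" framing below are hypothetical):

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; weights should sum to 1;
    benefit[j] is True when larger is better for criterion j."""
    M = np.asarray(matrix, dtype=float)
    # vector normalisation per criterion, then apply the weights
    V = M / np.linalg.norm(M, axis=0) * np.asarray(weights)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)  # distance to ideal solution
    d_neg = np.linalg.norm(V - anti, axis=1)   # distance to anti-ideal
    return d_neg / (d_pos + d_neg)             # closeness: higher is better

# three hypothetical research centres, two benefit criteria and one cost
scores = topsis([[7, 9, 9], [8, 7, 8], [9, 6, 8]],
                weights=[0.5, 0.3, 0.2],
                benefit=[True, True, False])
ranking = np.argsort(-scores)  # indices of alternatives, best first
```

In the paper's scheme, the closeness scores from this and the three other MCDM methods would then be reconciled through the utility-interval step.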
Data fusion of multi-scale representations for structural damage detection
NASA Astrophysics Data System (ADS)
Guo, Tian; Xu, Zili
2018-01-01
Despite extensive research into structural health monitoring (SHM) over the past decades, few methods can detect multiple instances of slight damage in noisy environments. Here, we introduce a new hybrid method that utilizes multi-scale space theory and a data fusion approach for multiple-damage detection in beams and plates. A cascade filtering approach provides a multi-scale space for noisy mode shapes and filters out the fluctuations caused by measurement noise. In multi-scale space, a series of amplification and data fusion algorithms is used to search for damage features across all possible scales. We verify the effectiveness of the method by numerical simulation using damaged beams and plates with various types of boundary conditions. Monte Carlo simulations illustrate the effectiveness and noise immunity of the proposed method. The applicability is further validated via laboratory case studies focusing on different damage scenarios. Both sets of results demonstrate that the proposed method has superior noise tolerance, as well as damage sensitivity, without requiring knowledge of material properties or boundary conditions.
[Research on the methods for multi-class kernel CSP-based feature extraction].
Wang, Jinjia; Zhang, Lingzhi; Hu, Bei
2012-04-01
To relax the presumption of strictly linear patterns in common spatial patterns (CSP), we studied the kernel CSP (KCSP). A new multi-class KCSP (MKCSP) approach is proposed in this paper, which combines the kernel approach with the multi-class CSP technique. In this approach, we used kernel spatial patterns for each class against all others, and extracted signal components specific to one condition from EEG data sets of multiple conditions. We then performed classification using a logistic linear classifier. Data set IIIa from BCI (brain-computer interface) Competition III was used in the experiment. The experiment showed that this approach can decompose raw EEG signals into spatial patterns extracted from multi-class single-trial EEG, and can obtain good classification results.
NASA Astrophysics Data System (ADS)
Chen, Shiyu; Li, Haiyang; Baoyin, Hexi
2018-06-01
This paper investigates a method for optimizing multi-rendezvous low-thrust trajectories using indirect methods. An efficient technique, termed costate transforming, is proposed to optimize multiple trajectory legs simultaneously rather than optimizing each trajectory leg individually. Complex interior-point constraints and a large number of free variables are the main challenge in optimizing multi-leg transfers via shooting algorithms. This difficulty is reduced by first optimizing each trajectory leg individually; the results are then used as an initial guess in the simultaneous optimization of multiple trajectory legs. The limitations of similar techniques in previous research are overcome, and a homotopic approach is employed to improve the convergence efficiency of the shooting process in multi-rendezvous low-thrust trajectory optimization. Numerical examples demonstrate that the newly introduced techniques are valid and efficient.
Somatic Consequences and Symptomatic Responses to Stress: Directions for Future Research
1999-07-01
endeavors, some early work in developing multi-method, multi-source assessment approaches for identifying cases of PTSD; some clinical studies…research dealing with the entire concept of the cultural shaping of what he calls the illness narrative and the way in which this tends to control the…talk for five to ten minutes about the pattern of the research you've been doing and the directions it's been going in and the directions you think it
The role of simulation in mixed-methods research: a framework & application to patient safety.
Guise, Jeanne-Marie; Hansen, Matthew; Lambert, William; O'Brien, Kerth
2017-05-04
Research in patient safety is an important area of health services research and a national priority. It is challenging to investigate rare occurrences, explore potential causes, and account for the complex, dynamic context of healthcare - yet all are required in patient safety research. Simulation technologies have become widely accepted as education and clinical tools but have yet to become a standard tool for research. We developed a framework for research that integrates accepted patient safety models with mixed-methods research approaches, and we describe the performance of the framework in a working example of a large National Institutes of Health (NIH)-funded R01 investigation. This worked example of the framework in action identifies the strengths and limitations of the qualitative and quantitative research approaches commonly used in health services research. Each approach builds essential layers of knowledge. We describe how the use of simulation ties these layers of knowledge together and adds new and unique dimensions of knowledge. A mixed-methods research approach that includes simulation provides a broad, multi-dimensional approach to health services and patient safety research.
ERIC Educational Resources Information Center
Parsons, Sarah; Cobb, Sue
2014-01-01
Technology design in the field of human-computer interaction has developed a continuum of participatory research methods, closely mirroring methodological approaches and epistemological discussions in other fields. This paper positions such approaches as examples of inclusive research (to varying degrees) within education, and illustrates the…
Australian Public Universities: Are They Practising a Corporate Approach to Governance?
ERIC Educational Resources Information Center
Christopher, Joseph
2014-01-01
This article draws on the multi-theoretical approach to governance and a qualitative research method to examine the extent to which the corporate approach is practised in Australian public universities. The findings reveal that in meeting the needs of multiple stakeholders, universities are faced with a number of structural, legalistic, and…
Beyond mind-reading: multi-voxel pattern analysis of fMRI data.
Norman, Kenneth A; Polyn, Sean M; Detre, Greg J; Haxby, James V
2006-09-01
A key challenge for cognitive neuroscience is determining how mental representations map onto patterns of neural activity. Recently, researchers have started to address this question by applying sophisticated pattern-classification algorithms to distributed (multi-voxel) patterns of functional MRI data, with the goal of decoding the information that is represented in the subject's brain at a particular point in time. This multi-voxel pattern analysis (MVPA) approach has led to several impressive feats of mind reading. More importantly, MVPA methods constitute a useful new tool for advancing our understanding of neural information processing. We review how researchers are using MVPA methods to characterize neural coding and information processing in domains ranging from visual perception to memory search.
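A minimal sketch of one classic MVPA decoding scheme (a correlation-based nearest-centroid classifier in the spirit of early pattern-analysis studies, not any specific pipeline from this review; the "voxel" patterns below are synthetic toy data, not fMRI):

```python
import numpy as np

def correlation_classifier(train_patterns, train_labels, test_pattern):
    """Nearest-centroid decoding: correlate a test pattern with each
    condition's mean training pattern and return the best-matching label."""
    best, best_r = None, -2.0
    for lab in sorted(set(train_labels)):
        centroid = np.mean([p for p, l in zip(train_patterns, train_labels)
                            if l == lab], axis=0)
        r = np.corrcoef(centroid, test_pattern)[0, 1]
        if r > best_r:
            best, best_r = lab, r
    return best

# toy 4-"voxel" patterns: two conditions with distinct mean activity
rng = np.random.default_rng(0)
faces  = [np.array([1.0, 0.0, 1.0, 0.0]) + rng.normal(0, 0.1, 4) for _ in range(10)]
houses = [np.array([0.0, 1.0, 0.0, 1.0]) + rng.normal(0, 0.1, 4) for _ in range(10)]
train = faces + houses
labels = ["face"] * 10 + ["house"] * 10
pred = correlation_classifier(train, labels, np.array([0.9, 0.1, 1.1, 0.0]))
```

The point the abstract makes is visible even in this toy: the label is carried by the distributed pattern across voxels, not by any single voxel's amplitude.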
ERIC Educational Resources Information Center
Rizvi, Sadaf, Ed.
2011-01-01
This book provides an original perspective on a range of controversial issues in educational and social research through case studies of multi-disciplinary and mixed-method research involving children, teachers, schools and communities in Europe and the developing world. These case studies from researchers "across continents" and…
Three-Dimensional Surface Parameters and Multi-Fractal Spectrum of Corroded Steel
Shanhua, Xu; Songbo, Ren; Youde, Wang
2015-01-01
To study the multi-fractal behavior of corroded steel surfaces, a range of fractal surfaces representing corroded surfaces of Q235 steel were constructed using the Weierstrass-Mandelbrot method at high overall accuracy. The multi-fractal spectrum of the fractal surface of corroded steel was calculated to study the multi-fractal characteristics of the W-M corroded surface. Based on the shape of the multi-fractal spectrum of the corroded steel surface, the least squares method was applied to fit the spectrum with a quadratic curve. The fitting function was quantitatively analyzed to simplify the calculation of the multi-fractal characteristics of the corroded surface. The results showed that the multi-fractal spectrum of the corroded surface was fitted well by the quadratic curve, and evolution rules and trends were forecast accurately. The findings can be applied to research on the mechanisms of corroded-surface formation in steel and provide a new approach for the establishment of corrosion damage constitutive models of steel. PMID:26121468
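A minimal sketch of constructing a Weierstrass-Mandelbrot-type fractal (a 1-D profile rather than the paper's surfaces, with illustrative parameter choices; the multi-fractal spectrum computation itself is omitted):

```python
import numpy as np

def wm_profile(x, D, gamma=1.5, n_max=30):
    """Weierstrass-Mandelbrot-type fractal profile for fractal dimension
    1 < D < 2 (the surface analogue uses 2 < D < 3). Superposes cosines
    whose frequency grows geometrically as gamma**n while the amplitude
    decays as gamma**(-(2 - D) * n)."""
    n = np.arange(n_max)
    return np.sum(np.cos(2 * np.pi * gamma**n[:, None] * x[None, :])
                  / gamma**((2 - D) * n)[:, None], axis=0)

x = np.linspace(0.0, 1.0, 512)
z = wm_profile(x, D=1.5)   # rougher profiles correspond to larger D
```

Truncating the sum at n_max terms keeps the profile band-limited relative to the sampling grid; in a spectrum study one would then estimate the multi-fractal spectrum of z by box-counting or a partition-function method.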
NASA Astrophysics Data System (ADS)
Zhang, Bo; Zhang, Long; Ye, Zhongfu
2016-12-01
A novel sky-subtraction method based on non-negative matrix factorisation with sparsity is proposed in this paper. The method is designed for sky-subtraction with the characteristics of the sky emission in mind, and has two constraint terms: one for sparsity and the other for homogeneity. Unlike standard sky-subtraction techniques, such as B-spline curve fitting methods and Principal Component Analysis approaches, sky-subtraction based on non-negative matrix factorisation with sparsity offers higher accuracy and flexibility. The method is of value for sky-subtraction in multi-object fibre spectroscopic telescope surveys. To demonstrate the effectiveness and superiority of the proposed algorithm, experiments are performed on Large Sky Area Multi-Object Fiber Spectroscopic Telescope data, as the mechanisms of multi-object fibre spectroscopic telescopes are similar.
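A minimal sketch of NMF with an L1 sparsity penalty via multiplicative updates (a generic formulation only; the paper's homogeneity constraint and spectroscopic preprocessing are omitted, and the toy "spectra" below are synthetic):

```python
import numpy as np

def nmf_sparse(V, rank, sparsity=0.1, n_iter=200, seed=0):
    """Multiplicative-update NMF minimising ||V - W H||_F^2 + sparsity * sum(H).
    The L1 term on H encourages sparse activations; all factors stay >= 0
    because the updates only multiply by non-negative ratios."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + 1e-3
    H = rng.random((rank, n)) + 1e-3
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + sparsity + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

# toy non-negative "spectra": three observations mixing two components
components = np.array([[1.0, 2.0, 0.0, 1.0],
                       [0.0, 1.0, 3.0, 1.0]])
mixing = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
V = mixing @ components
W, H = nmf_sparse(V, rank=2)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

In a sky-subtraction setting, the columns of V would be fibre spectra, and the learned components would separate sky features from object signal.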
ERIC Educational Resources Information Center
Morales-del-Castillo, Jose Manuel; Peis, Eduardo; Moreno, Juan Manuel; Herrera-Viedma, Enrique
2009-01-01
Introduction: In this paper we propose a multi-agent Selective Dissemination of Information service to improve the research community's access to digital library resources. The service also provides a new recommendation approach to satisfy researchers' specific information requirements. Method: The service model is developed by jointly applying…
Multiscale Modeling in the Clinic: Drug Design and Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clancy, Colleen E.; An, Gary; Cannon, William R.
A wide range of length and time scales are relevant to pharmacology, especially in drug development, drug design and drug delivery. Therefore, multi-scale computational modeling and simulation methods and paradigms that advance the linkage of phenomena occurring at these multiple scales have become increasingly important. Multi-scale approaches present in silico opportunities to advance laboratory research to bedside clinical applications in pharmaceuticals research. This is achievable through the capability of modeling to reveal phenomena occurring across multiple spatial and temporal scales, which are not otherwise readily accessible to experimentation. The resultant models, when validated, are capable of making testable predictions to guide drug design and delivery. In this review we describe the goals, methods, and opportunities of multi-scale modeling in drug design and development. We demonstrate the impact of multiple scales of modeling in this field. We indicate the common mathematical techniques employed for multi-scale modeling approaches used in pharmacology and present several examples illustrating the current state of the art regarding drug development for: Excitable Systems (Heart); Cancer (Metastasis and Differentiation); Cancer (Angiogenesis and Drug Targeting); Metabolic Disorders; and Inflammation and Sepsis. We conclude with a focus on barriers to successful clinical translation of drug development, drug design and drug delivery multi-scale models.
Teaching Action Research: The Role of Demographics
ERIC Educational Resources Information Center
Mcmurray, Adela J.
2006-01-01
This article summarizes a longitudinal study of employed MBA students with particular emphasis on findings involving their choice of action research model to implement personal and organizational change in their environment. A multi-method approach merging both quantitative and qualitative techniques was utilized. A questionnaire consisting of…
Multi-analytical Approaches Informing the Risk of Sepsis
NASA Astrophysics Data System (ADS)
Gwadry-Sridhar, Femida; Lewden, Benoit; Mequanint, Selam; Bauer, Michael
Sepsis is a significant cause of mortality and morbidity and is often associated with increased hospital resource utilization and prolonged intensive care unit (ICU) and hospital stays. The economic burden associated with sepsis is substantial. With advances in medicine, there are now aggressive, goal-oriented treatments that can be used to help these patients. If we were able to predict which patients are at risk for sepsis, we could start treatment early and potentially reduce the risk of mortality and morbidity. Analytic methods currently used in clinical research to determine the risk of a patient developing sepsis may be further enhanced by multi-modal analytic methods that together provide greater precision. Researchers commonly use univariate and multivariate regressions to develop predictive models. We hypothesized that such models could be enhanced by using multiple analytic methods that together provide greater insight. In this paper, we analyze data on patients with and without sepsis using a decision tree approach and a cluster analysis approach. A comparison with a regression approach shows strong similarity among the variables identified, though not an exact match. We compare the variables identified by the different approaches and draw conclusions about their respective predictive capabilities, while considering their clinical significance.
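A minimal sketch of the cluster-analysis side of such a multi-modal comparison (plain Lloyd's k-means on hypothetical two-feature patient data; the feature names and values are invented for illustration and have no clinical standing):

```python
import numpy as np

def kmeans(X, k=2, n_iter=20, seed=0):
    """Plain Lloyd's k-means; returns cluster labels and centroids."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        # assign each point to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# hypothetical patient features (say, heart rate and lactate): two clear groups
X = np.array([[70.0, 1.0], [72.0, 1.2], [68.0, 0.9],     # lower-risk-like
              [120.0, 4.0], [125.0, 4.5], [118.0, 3.8]])  # sepsis-like
labels, _ = kmeans(X, k=2)
```

The multi-modal idea in the abstract is then to compare the variables that drive this clustering against those selected by a regression model and a decision tree.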
Okurut, Kenan; Kulabako, Robinah Nakawunde; Chenoweth, Jonathan; Charles, Katrina
2015-01-01
Sanitation improvement is crucial to saving lives that are lost due to water contamination. Progress towards achieving full sanitation coverage is still slow in low-income informal settlements in most developing countries. Furthermore, resources are being wasted on installing facilities that are later misused or never used because they do not meet local demand. Understanding demand for improved sanitation in the local context is critical if facilities are to be continually used. Various approaches that attempt to change people's behaviours or create demand have been reviewed to identify what they are designed to address. A multi-disciplinary research team using mixed methods is re-emphasised as a comprehensive approach for assessing demand for improved sanitation in low-income informal settlements, where the sanitation situation is more challenging than in other areas. Further research involving a multi-disciplinary research team and the use of mixed methods to assess sanitation demand in informal settlements is needed.
2017-06-09
primary question. This thesis has used the case study research methodology with the Capability-Based Assessment (CBA) approach. My engagement in this…protected by more restrictions in their home countries, in which case further publication or sale of copyrighted images is not permissible…effective coordinating mechanism. The research follows the case study method utilizing the Capability-Based Analysis (CBA) approach to scrutinize the
ERIC Educational Resources Information Center
Genemo, Hussein; Miah, Shah Jahan; McAndrew, Alasdair
2016-01-01
Assessment has been defined as an authentic method that plays an important role in evaluating students' learning attitude in acquiring lifelong knowledge. Traditional methods of assessment including the Computer-Aided Assessment (CAA) for mathematics show limited ability to assess students' full work unless multi-step questions are sub-divided…
Time series modeling of human operator dynamics in manual control tasks
NASA Technical Reports Server (NTRS)
Biezad, D. J.; Schmidt, D. K.
1984-01-01
A time-series technique is presented for identifying the dynamic characteristics of the human operator in manual control tasks from relatively short records of experimental data. Control of system excitation signals used in the identification is not required. The approach is a multi-channel identification technique for modeling multi-input/multi-output situations. The method presented includes statistical tests for validity, is designed for digital computation, and yields estimates for the frequency responses of the human operator. A comprehensive relative power analysis may also be performed for validated models. This method is applied to several sets of experimental data; the results are discussed and shown to compare favorably with previous research findings. New results are also presented for a multi-input task that has not been previously modeled to demonstrate the strengths of the method.
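The identification idea described (fitting operator dynamics from short input/output records, then reading off a frequency response of the fitted model) can be sketched, under simplifying assumptions, as a least-squares ARX fit. The model order and coefficients below are invented for illustration and are not the paper's method.

```python
# Illustrative sketch: least-squares ARX identification of an
# operator-like dynamic from a short input/output record, followed by
# a frequency-response estimate of the fitted model.
import numpy as np

rng = np.random.default_rng(1)
N = 300
u = rng.normal(size=N)                     # tracking-error "input"
# True system: y[k] = 0.6*y[k-1] + 0.5*u[k-1]  (plus small noise)
y = np.zeros(N)
for k in range(1, N):
    y[k] = 0.6 * y[k - 1] + 0.5 * u[k - 1] + 0.01 * rng.normal()

# ARX(1,1) regression: y[k] ~ a*y[k-1] + b*u[k-1]
Phi = np.column_stack([y[:-1], u[:-1]])
a, b = np.linalg.lstsq(Phi, y[1:], rcond=None)[0]

# Frequency response of the identified model H(z) = b / (z - a)
w = np.linspace(0.01, np.pi, 100)
H = b / (np.exp(1j * w) - a)
print(f"a~{a:.3f}, b~{b:.3f}, |H| at low freq ~ {abs(H[0]):.2f}")
```

With only a few hundred samples the least-squares estimates already recover the true coefficients closely, which mirrors the abstract's point about short experimental records.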
Magrans de Abril, Ildefons; Yoshimoto, Junichiro; Doya, Kenji
2018-06-01
This article presents a review of computational methods for connectivity inference from neural activity data derived from multi-electrode recordings or fluorescence imaging. We first identify biophysical and technical challenges in connectivity inference along the data processing pipeline. We then review connectivity inference methods based on two major mathematical foundations, namely, descriptive model-free approaches and generative model-based approaches. We investigate representative studies in both categories and clarify which challenges have been addressed by which method. We further identify critical open issues and possible research directions. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
Noar, Seth M; Mehrotra, Purnima
2011-03-01
Traditional theory testing commonly applies cross-sectional (and occasionally longitudinal) survey research to test health behavior theory. Since such correlational research cannot demonstrate causality, a number of researchers have called for the increased use of experimental methods for theory testing. We introduce the multi-methodological theory-testing (MMTT) framework for testing health behavior theory. The MMTT framework introduces a set of principles that broaden the perspective of how we view evidence for health behavior theory. It suggests that while correlational survey research designs represent one method of testing theory, the weaknesses of this approach demand that complementary approaches be applied. Such approaches include randomized lab and field experiments, mediation analysis of theory-based interventions, and meta-analysis. These alternative approaches to theory testing can demonstrate causality in a much more robust way than is possible with correlational survey research methods. Such approaches should thus be increasingly applied in order to more completely and rigorously test health behavior theory. Greater application of research derived from the MMTT may lead researchers to refine and modify theory and ultimately make theory more valuable to practitioners. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
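One of the complementary approaches the MMTT framework names, mediation analysis of a theory-based intervention, can be sketched as a product-of-coefficients estimate with a bootstrap confidence interval. The data, variable roles, and effect sizes below are simulated, not drawn from any cited study.

```python
# Hedged sketch of mediation analysis: does an intervention (x) affect
# a behavior outcome (y) through a mediator (m, e.g. self-efficacy)?
import numpy as np

rng = np.random.default_rng(2)
n = 400
x = rng.integers(0, 2, size=n).astype(float)   # intervention arm (0/1)
m = 0.5 * x + rng.normal(size=n)               # mediator
y = 0.4 * m + 0.1 * x + rng.normal(size=n)     # outcome

def ols_slope(pred, extra, target):
    # Slope of `pred` in an OLS fit with intercept and optional covariates.
    X = np.column_stack([np.ones_like(pred), pred] + extra)
    return np.linalg.lstsq(X, target, rcond=None)[0][1]

def indirect(idx):
    a = ols_slope(x[idx], [], m[idx])           # x -> m
    b = ols_slope(m[idx], [x[idx]], y[idx])     # m -> y, controlling x
    return a * b                                # product of coefficients

boot = [indirect(rng.integers(0, n, size=n)) for _ in range(500)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")
```

A confidence interval excluding zero is the usual evidence for mediation; unlike a cross-sectional correlation, the randomized x supports a causal reading of the x-to-m path.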
Contributions of Youth Engagement to the Development of Social Capital through Community Mapping
ERIC Educational Resources Information Center
Nathaniel, Keith C.; Kinsey, Sharon B.
2013-01-01
The Multi-State North Central Extension Research Activity (NCERA), Contributions of 4-H Participation to the Development of Social Capital, identified a strategy to pilot a research method that incorporates an inquiry-based approach to understanding community level impact of youth programs. This article focuses on how youth engagement educators…
Researching Employment Relations: A Self-Reflexive Analysis of a Multi-Method, School-Based Project
ERIC Educational Resources Information Center
McDonald, Paula; Graham, Tina
2011-01-01
Drawing on primary data and adjunct material, this article adopts a critical self-reflexive approach to a three-year, Australian Research Council-funded project that explored themes around "employment citizenship" for high school students in Queensland. The article addresses three overlapping areas that reflect some of the central…
The U.S. EPA National Health and Environmental Effects Research Laboratory's (NHEERL) Wildlife Research Strategy was developed to provide methods, models and data to address concerns related to toxic chemicals and habitat alteration in the context of wildlife risk assessment and ...
Pfeifer, Mischa D; Scholkmann, Felix; Labruyère, Rob
2017-01-01
Even though research in the field of functional near-infrared spectroscopy (fNIRS) has been performed for more than 20 years, consensus on signal processing methods is still lacking. A significant knowledge gap exists between established researchers and those entering the field. One major issue regularly observed in publications from researchers new to the field is the failure to consider possible signal contamination by hemodynamic changes unrelated to neurovascular coupling (i.e., scalp blood flow and systemic blood flow). This might be due to the fact that these researchers use the signal processing methods provided by the manufacturers of their measurement device without an advanced understanding of the performed steps. The aim of the present study was to investigate how different signal processing approaches (including and excluding approaches that partially correct for the possible signal contamination) affect the results of a typical functional neuroimaging study performed with fNIRS. In particular, we evaluated one standard signal processing method provided by a commercial company and compared it to three customized approaches. We thereby investigated the influence of the chosen method on the statistical outcome of a clinical data set (task-evoked motor cortex activity). No short-channels were used in the present study and therefore two types of multi-channel corrections based on multiple long-channels were applied. The choice of the signal processing method had a considerable influence on the outcome of the study. While methods that ignored the contamination of the fNIRS signals by task-evoked physiological noise yielded several significant hemodynamic responses over the whole head, the statistical significance of these findings disappeared when accounting for part of the contamination using a multi-channel regression. 
We conclude that adopting signal processing methods that correct for physiological confounding effects might yield more realistic results in cases where multi-distance measurements are not possible. Furthermore, we recommend using manufacturers' standard signal processing methods only if the user has an advanced understanding of every signal processing step performed.
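A minimal sketch of the kind of multi-channel correction discussed above: regressing an estimated global (systemic) component, here taken as the mean across long channels, out of each channel. The data are simulated and this is an assumption-laden stand-in, not the study's pipeline.

```python
# Toy fNIRS-like data: 8 channels sharing a systemic oscillation,
# one channel carrying a true task-evoked response.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 60, 600)
systemic = 0.5 * np.sin(2 * np.pi * 0.1 * t)             # shared physiology
channels = systemic + 0.05 * rng.normal(size=(8, t.size))
channels[0] += 0.3 * (np.abs(t - 30) < 5)                 # true response

global_reg = channels.mean(axis=0)        # global-signal regressor

def regress_out(sig, reg):
    # Remove the least-squares projection of `sig` onto `reg`.
    beta = np.dot(sig, reg) / np.dot(reg, reg)
    return sig - beta * reg

cleaned = np.array([regress_out(c, global_reg) for c in channels])
# After correction the shared oscillation is strongly attenuated
print(np.std(channels, axis=1).round(3))
print(np.std(cleaned, axis=1).round(3))
```

The residual variance of the non-active channels drops sharply, illustrating how apparent "whole-head activation" can vanish once the systemic component is regressed out.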
ERIC Educational Resources Information Center
Brewer-Lowry, Aleshia Nichol; Arcury, Thomas A.; Bell, Ronny A.; Quandt, Sara A.
2010-01-01
Purpose of the Study: This study identified approaches to diabetes self-management that differentiate persons with well-controlled from poorly controlled diabetes. Previous research has focused largely on persons participating in self-management interventions. Design and Methods: In-depth qualitative interviews were conducted with 48 adults, drawn…
Willis, C. D.; Greene, J. K.; Abramowicz, A.; Riley, B. L.
2016-01-01
Abstract Introduction: The Public Health Agency of Canada’s Multi-sectoral Partnerships Initiative, administered by the Centre for Chronic Disease Prevention (CCDP), brings together diverse partners to design, implement and advance innovative approaches for improving population health. This article describes the development and initial priorities of an action research project (a learning and improvement strategy) that aims to facilitate continuous improvement of the CCDP’s partnership initiative and contribute to the evidence on multi-sectoral partnerships. Methods: The learning and improvement strategy for the CCDP’s multi-sectoral partnership initiative was informed by (1) consultations with CCDP staff and senior management, and (2) a review of conceptual frameworks to do with multi-sectoral partnerships. Consultations explored the development of the multi-sectoral initiative, barriers and facilitators to success, and markers of effectiveness. Published and grey literature was reviewed using a systematic search strategy with findings synthesized using a narrative approach. Results: Consultations and the review highlighted the importance of understanding partnership impacts, developing a shared vision, implementing a shared measurement system and creating opportunities for knowledge exchange. With that in mind, we propose a six-component learning and improvement strategy that involves (1) prioritizing learning needs, (2) mapping needs to evidence, (3) using relevant data-collection methods, (4) analyzing and synthesizing data, (5) feeding data back to CCDP staff and teams and (6) taking action. Initial learning needs include investigating partnership reach and the unanticipated effects of multi-sectoral partnerships for individuals, groups, organizations or communities. 
Conclusion: While the CCDP is the primary audience for the learning and improvement strategy, it may prove useful for a range of audiences, including other government departments and external organizations interested in capturing and sharing new knowledge generated from multi-sectoral partnerships. PMID:27284702
Seismic Data Analysis through Multi-Class Classification.
NASA Astrophysics Data System (ADS)
Anderson, P.; Kappedal, R. D.; Magana-Zook, S. A.
2017-12-01
In this research, we conducted twenty experiments of varying time and frequency bands on 5000 seismic signals with the intent of finding a method to classify signals as either an explosion or an earthquake in an automated fashion. We used a multi-class approach by clustering the data through various techniques. Dimensional reduction was examined through the use of wavelet transforms with the coiflet mother wavelet and various coefficients to explore possible computational time vs. accuracy dependencies. Three and four classes were generated from the clustering techniques and examined, with the three-class approach producing the most accurate and realistic results.
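The pipeline described (wavelet-based dimensional reduction followed by clustering) can be sketched as follows. For self-containment a one-level Haar approximation stands in for the coiflet transform used in the study, and the two toy signal classes are invented.

```python
# Sketch: reduce each waveform with a crude wavelet approximation,
# then cluster the reduced features into two classes.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)

def make_signal(kind, n=256):
    t = np.arange(n)
    if kind == "quake":   # lower-frequency, longer coda (toy stand-in)
        return np.sin(0.05 * t) * np.exp(-t / 200) + 0.1 * rng.normal(size=n)
    return np.sin(0.4 * t) * np.exp(-t / 40) + 0.1 * rng.normal(size=n)

signals = np.array([make_signal("quake") for _ in range(50)] +
                   [make_signal("blast") for _ in range(50)])

def haar_approx(x):
    # One-level Haar approximation coefficients (halves the length).
    return (x[0::2] + x[1::2]) / np.sqrt(2)

# Two levels of reduction: 256 samples -> 64 features per signal
features = np.array([haar_approx(haar_approx(s)) for s in signals])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(labels[:50].mean(), labels[50:].mean())
```

On these cleanly separated toy classes the clustering recovers the two groups almost perfectly; real explosion/earthquake discrimination is, of course, far noisier.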
ERIC Educational Resources Information Center
Woodzicka, Julie A.; Ford, Thomas E.; Caudill, Abbie; Ohanmamooreni, Alyna
2015-01-01
A collaborative research grant from the National Science Foundation allowed the first two authors to provide students at primarily undergraduate institutions with a multi-faculty, multi-institution team research experience. Teams of undergraduate students at Western Carolina University and Washington and Lee University collaborated with one…
ERIC Educational Resources Information Center
Cline, Joy F.
2013-01-01
This research investigated admission criteria of baccalaureate nursing students related to their success in a multi-state sample of peer universities in the United States. The researcher used mixed methods to collect data that were analyzed using descriptive and phenomenological approaches. The sample of the study was chairpersons from peer…
Multi-Level Steering and Institution Building: The European Union's Approach to Research Policy
ERIC Educational Resources Information Center
Young, Mitchell
2012-01-01
Adopting the conception of the university as a primary driver of innovation and economic growth has brought increased pressure for the European Union (EU) to actively steer university-based research policy, despite its being outside of the EU's direct jurisdiction. While the open method of coordination (OMC) was developed for such situations, the…
ERIC Educational Resources Information Center
Perkins, Kathleen M.
2016-01-01
Theatre is a multi-dimensional discipline encompassing aspects of several domains in the arts and humanities. Therefore, an array of scholarly practices, pedagogies, and methods might be available to a SoTL researcher from the close reading of texts in script analysis to portfolio critiques in set, costume, and lighting design--approaches shared…
Silva Junqueira, Vinícius; de Azevedo Peixoto, Leonardo; Galvêas Laviola, Bruno; Lopes Bhering, Leonardo; Mendonça, Simone; Agostini Costa, Tania da Silveira; Antoniassi, Rosemar
2016-01-01
The biggest challenge for jatropha breeding is to identify superior genotypes that present high seed yield and seed oil content with reduced toxicity levels. Therefore, the objective of this study was to estimate genetic parameters for three important traits (100-seed weight, seed oil content, and phorbol ester concentration), and to select superior genotypes to be used as progenitors in jatropha breeding. Additionally, the genotypic values and the genetic parameters estimated under the Bayesian multi-trait approach were used to evaluate different selection index scenarios for 179 half-sib families. Three different scenarios and economic weights were considered. It was possible to simultaneously reduce toxicity and increase seed oil content and 100-seed weight by using index selection based on genotypic values estimated by the Bayesian multi-trait approach. Indeed, we identified two families that present these characteristics by evaluating genetic diversity using the Ward clustering method, which suggested nine homogeneous clusters. Future research must integrate Bayesian multi-trait methods with the realized relationship matrix, aiming to build accurate selection index models. PMID:27281340
On the Development of Multi-Step Inverse FEM with Shell Model
NASA Astrophysics Data System (ADS)
Huang, Y.; Du, R.
2005-08-01
The inverse or one-step finite element approach is increasingly used in the sheet metal stamping industry to predict strain distribution and the initial blank shape in the preliminary design stage. Based on the existing theory, there are two types of method: one is based on the principle of virtual work and the other on the principle of extreme work. Much research has been conducted to improve the accuracy of simulation results. For example, based on the virtual work principle, Batoz et al. developed a new method using triangular DKT shell elements, in which the bending and unbending effects are considered. Based on the principle of extreme work, Majlessi et al. proposed a multi-step inverse approach with membrane elements and applied it to an axisymmetric part. Lee et al. presented an axisymmetric shell element model to solve a similar problem. In this paper, a new multi-step inverse method is introduced with no limitation on the workpiece shape. It is a shell element model based on the virtual work principle. The new method is validated by comparison with a commercial software system (PAMSTAMP®). The comparison results indicate that the accuracy is good.
Members' sensemaking in a multi-professional team.
Rovio-Johansson, Airi; Liff, Roy
2012-01-01
The aim of this study is to investigate sensemaking as interaction among team members in a multi-professional team setting in a new public management context at a Swedish Child and Youth Psychiatric Unit. A discursive pragmatic approach grounded in ethnomethodology is taken in the analysis of a treatment conference (TC). In order to interpret and understand the multi-voiced complexity of discourse and of talk-in-interaction, the authors use dialogism in the analysis of the members' sensemaking processes. The analysis is based on the theoretical assumption that language and texts are the primary tools actors use to comprehend the social reality and to make sense of their multi-professional discussions. Health care managers are offered insights, derived from theory and empirical evidence, into how professionals' communications influence multi-professional cooperation. The team leader and members were interviewed before and after the observed TC. Team members create their identities and positions in the group by interpreting and "misinterpreting" talk-in-interaction. The analyses reveal the ways the team members relate to their treatment methods in the discussion of a patient; advocating a treatment method means that the team member and the method are intertwined. The findings may be valuable to health care professionals and managers working in teams by showing them how to achieve greater cooperation through the use of verbal abilities. The findings and methods contribute to the international research on cooperation problems in multi-professional teams and to the empirical research on institutional discourse through text and talk.
Guidance for using mixed methods design in nursing practice research.
Chiang-Hanisko, Lenny; Newman, David; Dyess, Susan; Piyakong, Duangporn; Liehr, Patricia
2016-08-01
The mixed methods approach purposefully combines both quantitative and qualitative techniques, enabling a multi-faceted understanding of nursing phenomena. The purpose of this article is to introduce three mixed methods designs (parallel; sequential; conversion) and highlight interpretive processes that occur with the synthesis of qualitative and quantitative findings. Real world examples of research studies conducted by the authors will demonstrate the processes leading to the merger of data. The examples include: research questions; data collection procedures and analysis with a focus on synthesizing findings. Based on experience with mixed methods studies, the authors introduce two synthesis patterns (complementary; contrasting), considering application for practice and implications for research. Copyright © 2015 Elsevier Inc. All rights reserved.
Design and Implementation of Collaborative Research Approaches
NASA Technical Reports Server (NTRS)
Venti, Mike W.; Berger, David E.
2009-01-01
This poster reviews the collaborative research approaches that NASA has been designing and implementing for the Integrated Vehicle Health Management (IVHM) Project. The inputs for the technical plan are reviewed; the Research Test and Integration Plan (RTIP) WIKI is used to create and propose multi-themed, multi-partner research testing opportunities. The outputs are testing opportunities.
Evoked prior learning experience and approach to learning as predictors of academic achievement.
Trigwell, Keith; Ashwin, Paul; Millan, Elena S
2013-09-01
In separate studies and research from different perspectives, five factors are found to be among those related to higher quality outcomes of student learning (academic achievement). Those factors are higher self-efficacy, deeper approaches to learning, higher quality teaching, students' perceptions that their workload is appropriate, and greater learning motivation. University learning improvement strategies have been built on these research results. To investigate how students' evoked prior experience, perceptions of their learning environment, and their approaches to learning collectively contribute to academic achievement. This is the first study to investigate motivation and self-efficacy in the same educational context as conceptions of learning, approaches to learning and perceptions of the learning environment. Undergraduate students (773) from the full range of disciplines were part of a group of over 2,300 students who volunteered to complete a survey of their learning experience. On completing their degrees 6 and 18 months later, their academic achievement was matched with their learning experience survey data. A 77-item questionnaire was used to gather students' self-report of their evoked prior experience (self-efficacy, learning motivation, and conceptions of learning), perceptions of learning context (teaching quality and appropriate workload), and approaches to learning (deep and surface). Academic achievement was measured using the English honours degree classification system. Analyses were conducted using correlational and multi-variable (structural equation modelling) methods. The results from the correlation methods confirmed those found in numerous earlier studies. The results from the multi-variable analyses indicated that surface approach to learning was the strongest predictor of academic achievement, with self-efficacy and motivation also found to be directly related. 
In contrast to the correlation results, a deep approach to learning was not related to academic achievement, and teaching quality and conceptions of learning were only indirectly related to achievement. Research aimed at understanding how students experience their learning environment and how that experience relates to the quality of their learning needs to be conducted using a wider range of variables and more sophisticated analytical methods. In this study of one context, some of the relations found in earlier bivariate studies, and on which learning intervention strategies have been built, are not confirmed when more holistic teaching-learning contexts are analysed using multi-variable methods. © 2012 The British Psychological Society.
Data structures supporting multi-region adaptive isogeometric analysis
NASA Astrophysics Data System (ADS)
Perduta, Anna; Putanowicz, Roman
2018-01-01
Since the first paper published in 2005, Isogeometric Analysis (IGA) has gained strong interest and found applications in many engineering problems. Despite the advancement of the method, there are still far fewer software implementations compared to the Finite Element Method. The paper presents an approach to the development of data structures that can support multi-region IGA with local (patch-based) mesh refinement and possible application in IGA-FEM models. The purpose of this paper is to share original design concepts that the authors created while developing an IGA package, which other researchers may find beneficial for their own simulation codes.
Christodoulidis, Argyrios; Hurtut, Thomas; Tahar, Houssem Ben; Cheriet, Farida
2016-09-01
Segmenting the retinal vessels from fundus images is a prerequisite for many CAD systems for the automatic detection of diabetic retinopathy lesions. So far, research efforts have concentrated mainly on the accurate localization of the large to medium diameter vessels. However, failure to detect the smallest vessels at the segmentation step can lead to false positive lesion detection counts in a subsequent lesion analysis stage. In this study, a new hybrid method for the segmentation of the smallest vessels is proposed. Line detection and perceptual organization techniques are combined in a multi-scale scheme. Small vessels are reconstructed from the perceptual-based approach via tracking and pixel painting. The segmentation was validated in a high resolution fundus image database including healthy and diabetic subjects using pixel-based as well as perceptual-based measures. The proposed method achieves 85.06% sensitivity rate, while the original multi-scale line detection method achieves 81.06% sensitivity rate for the corresponding images (p<0.05). The improvement in the sensitivity rate for the database is 6.47% when only the smallest vessels are considered (p<0.05). For the perceptual-based measure, the proposed method improves the detection of the vasculature by 7.8% against the original multi-scale line detection method (p<0.05). Copyright © 2016 Elsevier Ltd. All rights reserved.
The Validity of the Multi-Informant Approach to Assessing Child and Adolescent Mental Health
De Los Reyes, Andres; Augenstein, Tara M.; Wang, Mo; Thomas, Sarah A.; Drabick, Deborah A.G.; Burgers, Darcy E.; Rabinowitz, Jill
2015-01-01
Child and adolescent patients may display mental health concerns within some contexts and not others (e.g., home vs. school). Thus, understanding the specific contexts in which patients display concerns may assist mental health professionals in tailoring treatments to patients' needs. Consequently, clinical assessments often include reports from multiple informants who vary in the contexts in which they observe patients' behavior (e.g., patients, parents, teachers). Previous meta-analyses indicate that informants' reports correlate at low-to-moderate magnitudes. However, is it valid to interpret low correspondence among reports as indicating that patients display concerns in some contexts and not others? We meta-analyzed 341 studies published between 1989 and 2014 that reported cross-informant correspondence estimates, and observed low-to-moderate correspondence (mean internalizing: r = .25; mean externalizing: r = .30; mean overall: r = .28). Informant pair, mental health domain, and measurement method moderated magnitudes of correspondence. These robust findings have informed the development of concepts for interpreting multi-informant assessments, allowing researchers to draw specific predictions about the incremental and construct validity of these assessments. In turn, we critically evaluated research on the incremental and construct validity of the multi-informant approach to clinical child and adolescent assessment. In so doing, we identify crucial gaps in knowledge for future research, and provide recommendations for “best practices” in using and interpreting multi-informant assessments in clinical work and research. This paper has important implications for developing personalized approaches to clinical assessment, with the goal of informing techniques for tailoring treatments to target the specific contexts where patients display concerns. PMID:25915035
Cross-Sectional HIV Incidence Estimation in HIV Prevention Research
Brookmeyer, Ron; Laeyendecker, Oliver; Donnell, Deborah; Eshleman, Susan H.
2013-01-01
Accurate methods for estimating HIV incidence from cross-sectional samples would have great utility in prevention research. This report describes recent improvements in cross-sectional methods that significantly improve their accuracy. These improvements are based on the use of multiple biomarkers to identify recent HIV infections. These multi-assay algorithms (MAAs) use assays in a hierarchical approach for testing that minimizes the effort and cost of incidence estimation. These MAAs do not require mathematical adjustments for accurate estimation of the incidence rates in study populations in the year prior to sample collection. MAAs provide a practical, accurate, and cost-effective approach for cross-sectional HIV incidence estimation that can be used for HIV prevention research and global epidemic monitoring. PMID:23764641
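A hedged sketch of the hierarchical idea behind an MAA: cheaper assays screen first, and later assays run only on samples still classified as possibly recent. The assay names and thresholds below are invented for illustration and are not those of any published algorithm.

```python
# Hypothetical hierarchical multi-assay algorithm (MAA) for flagging
# recent infections; each stage can rule a sample out, so later
# (costlier) assays are needed only for remaining candidates.
def classify_recent(sample):
    # Stage 1: avidity assay screens out clearly long-standing infections
    if sample["avidity_index"] >= 80:
        return False
    # Stage 2: a second biomarker is measured only for screen-positives
    if sample["biomarker_od"] >= 1.5:
        return False
    # Stage 3: viral load rules out suppressed / elite-controller cases
    return sample["viral_load"] > 400

samples = [
    {"avidity_index": 95, "biomarker_od": 2.0, "viral_load": 10_000},
    {"avidity_index": 40, "biomarker_od": 0.7, "viral_load": 50_000},
    {"avidity_index": 40, "biomarker_od": 0.7, "viral_load": 100},
]
print([classify_recent(s) for s in samples])  # [False, True, False]
```

The hierarchical ordering is what "minimizes the effort and cost" in the abstract's terms: most samples exit at the first, cheapest stage.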
Multi-fidelity methods for uncertainty quantification in transport problems
NASA Astrophysics Data System (ADS)
Tartakovsky, G.; Yang, X.; Tartakovsky, A. M.; Barajas-Solano, D. A.; Scheibe, T. D.; Dai, H.; Chen, X.
2016-12-01
We compare several multi-fidelity approaches for uncertainty quantification in flow and transport simulations that have a lower computational cost than the standard Monte Carlo method. The cost reduction is achieved by combining a small number of high-resolution (high-fidelity) simulations with a large number of low-resolution (low-fidelity) simulations. We propose a new method, the re-scaled Multi Level Monte Carlo (rMLMC) method. The rMLMC is based on the idea that the statistics of quantities of interest depend on scale/resolution. We compare rMLMC with existing multi-fidelity methods such as Multi Level Monte Carlo (MLMC) and reduced basis methods, and discuss the advantages of each approach.
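The core multi-fidelity idea, many cheap low-fidelity samples plus a small paired correction, can be sketched as a two-level estimator on a toy integrand. The models and sample sizes here are illustrative stand-ins, not the rMLMC method itself.

```python
# Two-level Monte Carlo sketch: E[high] = E[low] + E[high - low],
# with the cheap term sampled heavily and the correction sparsely.
import numpy as np

rng = np.random.default_rng(5)

def low_fidelity(x):      # coarse model (biased)
    return np.sin(x)

def high_fidelity(x):     # fine model (reference)
    return np.sin(x) + 0.1 * x**2

# Level 0: many cheap samples of the coarse model on U(0, 1)
x0 = rng.uniform(0, 1, size=100_000)
level0 = low_fidelity(x0).mean()

# Level 1: few paired samples estimating E[high - low]
x1 = rng.uniform(0, 1, size=1_000)
correction = (high_fidelity(x1) - low_fidelity(x1)).mean()

mlmc_estimate = level0 + correction
exact = (1 - np.cos(1)) + 0.1 / 3   # integral of sin(x) + 0.1*x^2 on [0,1]
print(f"MLMC ~ {mlmc_estimate:.4f}, exact = {exact:.4f}")
```

Because the high/low difference has much smaller variance than the high-fidelity model itself, 1,000 paired samples suffice for the correction where a direct high-fidelity Monte Carlo estimate would need far more.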
2010-01-01
Multi-Disciplinary, Multi-Output Sensitivity Analysis (MIMOSA) [table-of-contents fragment: Introduction to Research Thrust 1; MIMOSA Approach; Collaborative Consistency of MIMOSA; Formulation of MIMOSA]
Wan, Shixiang; Duan, Yucong; Zou, Quan
2017-09-01
Predicting the subcellular localization of proteins is an important and challenging problem. Traditional experimental approaches are often expensive and time-consuming. Consequently, a growing number of research efforts employ a series of machine learning approaches to predict the subcellular location of proteins. There are two main challenges among the state-of-the-art prediction methods. First, most of the existing techniques are designed to deal with multi-class rather than multi-label classification, which ignores connections between multiple labels. In reality, multiple locations of particular proteins imply that there are vital and unique biological significances that deserve special focus and cannot be ignored. Second, techniques for handling imbalanced data in multi-label classification problems are necessary, but never employed. For solving these two issues, we have developed an ensemble multi-label classifier called HPSLPred, which can be applied for multi-label classification with an imbalanced protein source. For convenience, a user-friendly webserver has been established at http://server.malab.cn/HPSLPred. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
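HPSLPred's internals are not given in the abstract; as a generic sketch of the technique class it names, multi-label classification with per-label imbalance handling, here is a one-vs-rest ensemble with balanced class weights on synthetic data. All names and data are illustrative.

```python
# Multi-label classification sketch: each protein-like sample can carry
# several location labels at once, and one label is rare (imbalanced).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

rng = np.random.default_rng(6)
n = 600
X = rng.normal(size=(n, 10))
# Binary indicator matrix: two overlapping labels, the second rare
y = np.column_stack([
    (X[:, 0] > 0).astype(int),
    (X[:, 1] > 1.3).astype(int),     # roughly 10% positive
])

# One binary classifier per label; "balanced" reweights the rare class
clf = OneVsRestClassifier(
    LogisticRegression(class_weight="balanced", max_iter=1000)
).fit(X, y)
pred = clf.predict(X)
print("per-label accuracy:", (pred == y).mean(axis=0).round(3))
```

Treating the problem as multi-label (a 2-D indicator matrix) rather than multi-class preserves proteins with multiple simultaneous locations, which is exactly the distinction the abstract emphasizes.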
Aerodynamic optimization studies on advanced architecture computers
NASA Technical Reports Server (NTRS)
Chawla, Kalpana
1995-01-01
The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization is included.
The role of economics in the QUERI program: QUERI Series
Smith, Mark W; Barnett, Paul G
2008-01-01
Background The United States (U.S.) Department of Veterans Affairs (VA) Quality Enhancement Research Initiative (QUERI) has implemented economic analyses in single-site and multi-site clinical trials. To date, no one has reviewed whether the QUERI Centers are taking an optimal approach to doing so. Consistent with the continuous learning culture of the QUERI Program, this paper provides such a reflection. Methods We present a case study of QUERI as an example of how economic considerations can and should be integrated into implementation research within both single and multi-site studies. We review theoretical and applied cost research in implementation studies outside and within VA. We also present a critique of the use of economic research within the QUERI program. Results Economic evaluation is a key element of implementation research. QUERI has contributed many developments in the field of implementation but has only recently begun multi-site implementation trials across multiple regions within the national VA healthcare system. These trials are unusual in their emphasis on developing detailed costs of implementation, as well as in the use of business case analyses (budget impact analyses). Conclusion Economics appears to play an important role in QUERI implementation studies, but only after implementation has reached the stage of multi-site trials. Economic analysis could better inform the choice of which clinical best practices to implement and the choice of implementation interventions to employ. QUERI economics also would benefit from research on costing methods and development of widely accepted international standards for implementation economics. PMID:18430199
Teske, Steven C
2011-05-01
School officials throughout the United States have adopted zero tolerance policies to address student discipline, resulting in an increase in out-of-school suspensions and expulsions. The introduction of police on school campuses also increased the referral of students to the juvenile courts. Although school personnel generally view zero tolerance policies as a constructive measure, this approach disregards recent research on adolescent brain development showing that mischief is a foreseeable by-product of adolescence. A case study method examined one juvenile court's innovative multi-integrated systems approach to the adverse trends associated with zero tolerance policies. A multi-disciplinary protocol resulted in more effective youth assessments that reduced out-of-school suspensions and school referrals; increased graduation rates by 20%; and decreased delinquent felony rates by nearly 50%. The resulting protocol changed how the system responds to disruptive students by significantly reducing out-of-school suspensions and school referrals, putting alternatives into place, and providing community resources to address the underlying causes of the behavior. A multi-systems approach that targets the reasons for disruptive behavior improves student educational and behavioral outcomes. © 2011 Wiley Periodicals, Inc.
Balasubramanian, Bijal A; Cohen, Deborah J; Davis, Melinda M; Gunn, Rose; Dickinson, L Miriam; Miller, William L; Crabtree, Benjamin F; Stange, Kurt C
2015-03-10
In healthcare change interventions, on-the-ground learning about the implementation process is often lost because of a primary focus on outcome improvements. This paper describes the Learning Evaluation, a methodological approach that blends quality improvement and implementation research methods to study healthcare innovations. Learning Evaluation is an approach to multi-organization assessment. Qualitative and quantitative data are collected to conduct real-time assessment of implementation processes while also assessing changes in context, facilitating quality improvement using run charts and audit and feedback, and generating transportable lessons. Five principles are the foundation of this approach: (1) gather data to describe changes made by healthcare organizations and how changes are implemented; (2) collect process and outcome data relevant to healthcare organizations and to the research team; (3) assess multi-level contextual factors that affect implementation, process, outcome, and transportability; (4) assist healthcare organizations in using data for continuous quality improvement; and (5) operationalize common measurement strategies to generate transportable results. Learning Evaluation principles are applied across organizations by the following: (1) establishing a detailed understanding of the baseline implementation plan; (2) identifying target populations and tracking relevant process measures; (3) collecting and analyzing real-time quantitative and qualitative data on important contextual factors; (4) synthesizing data and emerging findings and sharing with stakeholders on an ongoing basis; and (5) harmonizing and fostering learning from process and outcome data. Application to a multi-site program focused on primary care and behavioral health integration shows the feasibility and utility of Learning Evaluation for generating real-time insights into evolving implementation processes. 
Learning Evaluation generates systematic and rigorous cross-organizational findings about implementing healthcare innovations while also enhancing organizational capacity and accelerating translation of findings by facilitating continuous learning within individual sites. Researchers evaluating change initiatives and healthcare organizations implementing improvement initiatives may benefit from a Learning Evaluation approach.
A Framework for Low-Cost Multi-Platform VR and AR Site Experiences
NASA Astrophysics Data System (ADS)
Wallgrün, J. O.; Huang, J.; Zhao, J.; Masrur, A.; Oprean, D.; Klippel, A.
2017-11-01
Low-cost consumer-level immersive solutions have the potential to revolutionize education and research in many fields by providing virtual experiences of sites that are either inaccessible, too dangerous, or too expensive to visit, or by augmenting in-situ experiences using augmented and mixed reality methods. We present our approach for creating low-cost multi-platform virtual and augmented reality site experiences of real world places for education and research purposes, making extensive use of Structure-from-Motion methods as well as 360° photography and videography. We discuss several example projects, for the Mayan City of Cahal Pech, Iceland's Thrihnukar volcano, the Santa Marta informal settlement in Rio, and for the Penn State Campus, and we propose a framework for creating and maintaining such applications by combining declarative content specification methods with a central linked-data based spatio-temporal information system.
An extended Lagrangian method for subsonic flows
NASA Technical Reports Server (NTRS)
Liou, Meng-Sing; Loh, Ching Y.
1992-01-01
It is well known that fluid motion can be specified by either the Eulerian or the Lagrangian description. Most Computational Fluid Dynamics (CFD) developments over the last three decades have been based on the Eulerian description, and considerable progress has been made. In particular, the upwind methods, inspired and guided by the work of Godunov, have met with many successes in dealing with complex flows, especially where discontinuities exist. However, this shock-capturing property has proven to be accurate only when the discontinuity is aligned with one of the grid lines, since most upwind methods are strictly formulated in a 1-D framework and only formally extended to multi-dimensions. Consequently, the attractive property of crisp resolution of these discontinuities is lost, and research on genuinely multi-dimensional approaches has only recently been undertaken by several leading researchers. Nevertheless, these approaches are still based on the Eulerian description.
Participatory flood vulnerability assessment: a multi-criteria approach
NASA Astrophysics Data System (ADS)
Madruga de Brito, Mariana; Evers, Mariele; Delos Santos Almoradie, Adrian
2018-01-01
This paper presents a participatory multi-criteria decision-making (MCDM) approach for flood vulnerability assessment that considers the relationships between vulnerability criteria. The applicability of the proposed framework is demonstrated in the municipalities of Lajeado and Estrela, Brazil. The model was co-constructed by 101 experts from governmental organizations, universities, research institutes, NGOs, and private companies. Participatory methods such as the Delphi survey, focus groups, and workshops were applied. A participatory problem structuration, in which the modellers work closely with end users, was used to establish the structure of the vulnerability index. The preferences of each participant regarding criteria importance were spatially modelled through the analytical hierarchy process (AHP) and analytical network process (ANP) multi-criteria methods. Experts were also involved at the end of the modelling exercise for validation. The final product is a set of individual and group flood vulnerability maps. Both AHP and ANP proved to be effective for flood vulnerability assessment; however, ANP is preferred as it considers the dependences among criteria. The participatory approach enabled experts to learn from each other and acknowledge different perspectives, fostering social learning. The findings highlight that, to enhance the credibility and deployment of model results, multiple viewpoints should be integrated without forcing consensus.
Developing and Validating Personas in e-Commerce: A Heuristic Approach
NASA Astrophysics Data System (ADS)
Thoma, Volker; Williams, Bryn
A multi-method persona development process in a large e-commerce business is described. Personas are fictional representations of customers that describe typical user attributes to facilitate a user-centered approach in interaction design. In the current project persona attributes were derived from various data sources, such as stakeholder interviews, user tests and interviews, data mining, customer surveys, and ethnographic (direct observation, diary studies) research. The heuristic approach of using these data sources conjointly allowed for an early validation of relevant persona dimensions.
Research of Simple Multi-Attribute Rating Technique for Decision Support
NASA Astrophysics Data System (ADS)
Siregar, Dodi; Arisandi, Diki; Usman, Ari; Irwan, Dedy; Rahim, Robbi
2017-12-01
One role of a decision support system is to assist the decision maker in identifying the alternative that best meets the desired criteria. One method available for this purpose is the SMART method for multi-criteria decision making. In this multi-criteria decision-making theory, every alternative has criteria, and each criterion has a value and a weight; the authors use this approach to facilitate decision making in a compelling case. The problems discussed in this paper are classified as multi-objective (multiple goals to be accomplished) and multi-criteria (many criteria are decisive in reaching such decisions).
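The weighted aggregation at the heart of SMART can be sketched in a few lines. The criteria names, raw weights, and utility scores below are invented for illustration; only the normalize-and-sum logic reflects the technique itself.

```python
# A sketch of the SMART aggregation step (hypothetical criteria and scores):
# raw criterion weights are normalized to sum to 1, each alternative's utility
# scores are combined as a weighted sum, and alternatives are ranked best-first.

def smart_rank(weights, scores):
    """weights: {criterion: raw weight}; scores: {alternative: {criterion: utility}}."""
    total = sum(weights.values())
    norm = {c: w / total for c, w in weights.items()}  # weights now sum to 1
    return sorted(
        ((alt, sum(norm[c] * s[c] for c in norm)) for alt, s in scores.items()),
        key=lambda pair: pair[1],
        reverse=True,  # highest weighted sum first
    )

# Invented example: two alternatives judged on three criteria (utilities 0-100)
weights = {"cost": 4, "quality": 3, "delivery": 3}
scores = {
    "A": {"cost": 70, "quality": 80, "delivery": 60},
    "B": {"cost": 90, "quality": 60, "delivery": 50},
}
ranking = smart_rank(weights, scores)
```

The ranking depends only on the relative sizes of the raw weights, since they are normalized before aggregation.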
The validity of the multi-informant approach to assessing child and adolescent mental health.
De Los Reyes, Andres; Augenstein, Tara M; Wang, Mo; Thomas, Sarah A; Drabick, Deborah A G; Burgers, Darcy E; Rabinowitz, Jill
2015-07-01
Child and adolescent patients may display mental health concerns within some contexts and not others (e.g., home vs. school). Thus, understanding the specific contexts in which patients display concerns may assist mental health professionals in tailoring treatments to patients' needs. Consequently, clinical assessments often include reports from multiple informants who vary in the contexts in which they observe patients' behavior (e.g., patients, parents, teachers). Previous meta-analyses indicate that informants' reports correlate at low-to-moderate magnitudes. However, is it valid to interpret low correspondence among reports as indicating that patients display concerns in some contexts and not others? We meta-analyzed 341 studies published between 1989 and 2014 that reported cross-informant correspondence estimates, and observed low-to-moderate correspondence (mean internalizing: r = .25; mean externalizing: r = .30; mean overall: r = .28). Informant pair, mental health domain, and measurement method moderated magnitudes of correspondence. These robust findings have informed the development of concepts for interpreting multi-informant assessments, allowing researchers to draw specific predictions about the incremental and construct validity of these assessments. In turn, we critically evaluated research on the incremental and construct validity of the multi-informant approach to clinical child and adolescent assessment. In so doing, we identify crucial gaps in knowledge for future research, and provide recommendations for "best practices" in using and interpreting multi-informant assessments in clinical work and research. This article has important implications for developing personalized approaches to clinical assessment, with the goal of informing techniques for tailoring treatments to target the specific contexts where patients display concerns. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
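Pooled estimates like the r = .25 to .30 figures above are commonly computed by converting each study's correlation with Fisher's r-to-z transformation, averaging the z values with weights proportional to n - 3, and back-transforming. The abstract does not specify the exact pooling model the authors used, so the sketch below is a generic illustration with invented study values.

```python
# A generic sketch of pooling cross-informant correlations across studies via
# Fisher's r-to-z transformation. Study correlations and sample sizes below are
# invented; the cited meta-analysis may have used a different weighting scheme.
import math

def pool_correlations(studies):
    """studies: list of (correlation, sample size); each z is weighted by n - 3."""
    num = sum((n - 3) * math.atanh(r) for r, n in studies)  # atanh is Fisher's z
    den = sum(n - 3 for _, n in studies)
    return math.tanh(num / den)  # back-transform the weighted mean z to r

pooled = pool_correlations([(0.25, 120), (0.30, 200), (0.28, 80)])
```

Because the z transform is nonlinear, the pooled r is close to, but not exactly, the weighted mean of the raw correlations.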
Leading multi-professional teams in the children’s workforce: an action research project
Stuart, Kaz
2012-01-01
Introduction The 2004 Children Act in the UK saw the introduction of integrated working in children’s services. A raft of change followed, with processes designed to make joint working easier, and models and theories to support the development of integrated work. This paper explores the links between key concepts and practice. Methods A practitioner action research approach is taken, using an autoethnographic account kept over six months. The research question was: to what extent is this group collaborating? Results When the architecture of practice was revealed, differences between espoused and real practice could be seen. Whilst understanding and displaying the outward signs of an effective multi-professional group, the individuals did not trust one another. This was exhibited by covert interprofessional issues. As a result, collaborative inertia ensued. This realisation prompted them to participate in further developmental and participative action research. Conclusion The paper concludes that trust and relational agency are central to effective leadership of multi-professional teams. PMID:22371690
Best Practice to Order Authors in Multi/Interdisciplinary Health Sciences Research Publications.
Smith, Elise; Master, Zubin
2017-01-01
Misunderstanding and disputes about authorship are commonplace among members of multi/interdisciplinary health research teams. If left unmanaged and unresolved, these conflicts can undermine knowledge sharing and collaboration, obscure accountability for research, and contribute to the incorrect attribution of credit. To mitigate these issues, certain researchers suggest quantitative authorship distributions schemes (e.g., point systems), while others wish to replace or minimize the importance of authorship by using "contributorship"-a system based on authors' self-reporting contributions. While both methods have advantages, we argue that authorship and contributorship will most likely continue to coexist for multiple ethical and practical reasons. In this article, we develop a five-step "best practice" that incorporates the distribution of both contributorship and authorship for multi/interdisciplinary research. This procedure involves continuous dialogue and the use of a detailed contributorship taxonomy ending with a declaration explaining contributorship, which is used to justify authorship order. Institutions can introduce this approach in responsible conduct of research training as it promotes greater fairness, trust, and collegiality among team members and ultimately reduces confusion and facilitates resolution of time-consuming disagreements.
White, Michael J.; Judd, Maya D.; Poliandri, Simone
2012-01-01
Although there has been much optimistic discussion of integrating quantitative and qualitative findings into sociological analysis, there remains a gap regarding the application of mixed approaches. We examine the potential gains and pitfalls of such integration in the context of the growing analytic power of contemporary qualitative data analysis software (QDAS) programs. We illustrate the issues with our own research in a mixed-methods project examining low fertility in Italy, a project that combines analysis of large nationally representative survey data with qualitative in-depth interviews with women across four (4) cities in Italy. Despite the enthusiasm for mixed-methods research, the available software appears to be underutilized. In addition, we suggest that the sociological research community will want to address several conceptual and inferential issues with these approaches. PMID:23543938
ERIC Educational Resources Information Center
Chong, Pei Wen; Graham, Linda J.
2013-01-01
International comparison is complicated by the use of different terms, classification methods, policy frameworks and system structures, not to mention different languages and terminology. Multi-case studies can assist in the understanding of the influence wielded by cultural, social, economic, historical and political forces upon educational…
A Multi-Level Model of Moral Thinking Based on Neuroscience and Moral Psychology
ERIC Educational Resources Information Center
Jeong, Changwoo; Han, Hye Min
2011-01-01
Developments in neurobiology are providing new insights into the biological and physical features of human thinking, and brain-activation imaging methods such as functional magnetic resonance imaging have become the most dominant research techniques to approach the biological part of thinking. With the aid of neurobiology, there also have been…
DOT National Transportation Integrated Search
2016-12-19
The efforts of this project aim to capture and engage these potentials through a design-research method that incorporates a top down, data-driven approach with bottom-up stakeholder perspectives to develop prototypical scenario-based design solutions...
ERIC Educational Resources Information Center
Letwinsky, Karim Medico; Cavender, Monica
2018-01-01
Many preservice teacher (PST) programs throughout the world are preparing students to implement the Core Standards, which require deeper conceptual understandings of mathematics and an informed approach for teaching. In this qualitative multi-case study, researchers explored the teaching methods for two university instructors and changes in PSTs…
The Efficacy of Collaborative Strategic Reading in Middle School Science and Social Studies Classes
ERIC Educational Resources Information Center
Boardman, Alison G.; Klingner, Janette K.; Buckley, Pamela; Annamma, Subini; Lasser, Cristin J.
2015-01-01
This study investigated the efficacy of a multi-component reading comprehension instructional approach, Collaborative Strategic Reading (CSR), compared to business-as-usual instructional methods with 19 teachers and 1074 students in middle school social studies and science classrooms in a large urban district. Researchers collaborated with school…
ERIC Educational Resources Information Center
Hobbs, William D.
2009-01-01
Research on leadership in outdoor adventure programs has focused primarily on Educational and Outdoor Skills. Anecdotal and practical experience has suggested that the performance of highly effective leaders may depend instead on distinctive qualities and components closely tied to individual character--a perspective of transformational…
Johnson, Maxine; O'Hara, Rachel; Hirst, Enid; Weyman, Andrew; Turner, Janette; Mason, Suzanne; Quinn, Tom; Shewan, Jane; Siriwardena, A Niroshan
2017-01-24
Paramedics make important and increasingly complex decisions at scene about patient care. Patient safety implications of influences on decision making in the pre-hospital setting were previously under-researched. Cutting edge perspectives advocate exploring the whole system rather than individual influences on patient safety. Ethnography (the study of people and cultures) has been acknowledged as a suitable method for identifying health care issues as they occur within the natural context. In this paper we compare multiple methods used in a multi-site, qualitative study that aimed to identify system influences on decision making. The study was conducted in three NHS Ambulance Trusts in England and involved researchers from each Trust working alongside academic researchers. Exploratory interviews with key informants e.g. managers (n = 16) and document review provided contextual information. Between October 2012 and July 2013 researchers observed 34 paramedic shifts and ten paramedics provided additional accounts via audio-recorded 'digital diaries' (155 events). Three staff focus groups (total n = 21) and three service user focus groups (total n = 23) explored a range of experiences and perceptions. Data collection and analysis was carried out by academic and ambulance service researchers as well as service users. Workshops were held at each site to elicit feedback on the findings and facilitate prioritisation of issues identified. The use of a multi-method qualitative approach allowed cross-validation of important issues for ambulance service staff and service users. A key factor in successful implementation of the study was establishing good working relationships with academic and ambulance service teams. Enrolling at least one research lead at each site facilitated the recruitment process as well as study progress. Active involvement with the study allowed ambulance service researchers and service users to gain a better understanding of the research process. 
Feedback workshops allowed stakeholders to discuss and prioritise findings as well as identify new research areas. Combining multiple qualitative methods with a collaborative research approach can facilitate exploration of system influences on patient safety in under-researched settings. The paper highlights empirical issues, strengths and limitations for this approach. Feedback workshops were effective for verifying findings and prioritising areas for future intervention and research.
Krishnamoorthy, Kannan; Mahalingam, Manikandan
2015-03-01
The present study aims to select a suitable method for the preparation of camptothecin-loaded polymeric nanoparticles by utilizing a multi-criteria decision-making method. Novel approaches to drug delivery formulated using nanotechnology are revolutionizing the future of medicine. Recent years have witnessed unprecedented growth of research and application in the area of nanotechnology. Nanoparticles have become an important area of research in the field of drug delivery because they can deliver a wide range of drugs to varying areas of the body. Despite extensive research and development, polymeric nanoparticles are frequently used to improve the therapeutic effect of drugs. A number of techniques are available for the preparation of polymeric nanoparticles. The Analytical Hierarchy Process (AHP) is a method for decision making in which priorities for qualitative factors are derived from individual judgements using pair-wise comparison matrices. In AHP, a decision hierarchy is constructed with a goal, criteria and alternatives. The model uses three main criteria: (1) instrument, (2) process and output, and (3) cost. In addition, there are eight sub-criteria as well as eight alternatives. Pair-wise comparison matrices are used to obtain the overall priority weights and ranking for the selection of a suitable method. The nanoprecipitation technique is the most suitable method for the preparation of camptothecin-loaded polymeric nanoparticles, with the highest overall priority weight of 0.297. In conclusion, the result indicates that the priority weights obtained from AHP could be defined as a multiple output for finding the most suitable method for the preparation of camptothecin-loaded polymeric nanoparticles.
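The step that turns an AHP pair-wise comparison matrix into priority weights can be sketched with the normalized-column-average approximation to the principal eigenvector. The 3x3 judgement matrix below is invented for illustration and does not reproduce the study's actual judgements or its 0.297 result.

```python
# A sketch of deriving AHP priority weights from a pair-wise comparison matrix
# using the normalized-column-average approximation to the principal eigenvector.
# The judgement values below are invented, not taken from the study.

def ahp_weights(matrix):
    """matrix[i][j]: how much more important criterion i is than j (reciprocal matrix)."""
    n = len(matrix)
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    # Normalize each column so it sums to 1, then average across each row.
    return [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]

pairwise = [  # e.g. criterion 0 judged 3x as important as criterion 1
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
w = ahp_weights(pairwise)  # priority weights; they sum to 1
```

In a full AHP analysis the same computation is repeated for each sub-criteria block and each set of alternatives, and the weights are multiplied down the hierarchy to obtain the overall priorities.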
NASA Astrophysics Data System (ADS)
Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan
2018-03-01
GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood-susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation to take into account the uncertainty related to criteria weights, spatial heterogeneity of preferences and the risk attitude of the analyst. The approach is applied to a pilot study for Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and can provide a more rational, objective and unbiased tool for flood susceptibility evaluation.
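The core of such an approach, the OWA operator wrapped in a Monte Carlo loop over perturbed weights, can be illustrated for a single raster cell. All numbers, the noise model, and the weight vector below are invented for illustration and are not taken from the study.

```python
# A sketch of probabilistic GIS-MCDA for one raster cell: the ordered weighted
# averaging (OWA) operator applied inside a Monte Carlo loop that perturbs the
# order weights. Values and the Gaussian noise model are illustrative only.
import random

def owa(values, order_weights):
    """Sort criterion values descending, then apply positional (order) weights."""
    return sum(w * v for w, v in zip(order_weights, sorted(values, reverse=True)))

random.seed(42)
cell_criteria = [0.8, 0.4, 0.6]   # standardized criterion scores for one cell
base_weights = [0.5, 0.3, 0.2]    # order weights encoding the analyst's risk attitude

# Monte Carlo: jitter the weights, renormalize, and collect the spread of scores.
samples = []
for _ in range(1000):
    noisy = [max(1e-6, w + random.gauss(0, 0.05)) for w in base_weights]
    total = sum(noisy)
    samples.append(owa(cell_criteria, [w / total for w in noisy]))
mean_score = sum(samples) / len(samples)  # ensemble susceptibility score for the cell
```

Running the loop per cell yields a susceptibility surface plus a per-cell spread, which is what supports the uncertainty and sensitivity analysis described above.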
Kumar, Akhil; Tiwari, Ashish; Sharma, Ashok
2018-03-15
Alzheimer's disease (AD) is now considered a multifactorial neurodegenerative disorder; its prevalence is rising at an alarming rate and it is causing a higher death rate. The one-target, one-ligand hypothesis cannot provide a complete solution for AD because of the multifactorial nature of the disease, and one-target, one-drug approaches seem to fail to provide better treatment against AD. Moreover, currently available treatments are limited, and most of the upcoming treatments under clinical trials are based on modulating a single target. So current AD drug discovery research is shifting towards a new approach for a better solution: simultaneously modulating more than one target in the neurodegenerative cascade. This can be achieved by network pharmacology, multi-modal therapies, multifaceted approaches, and/or the more recently proposed "multi-targeted designed ligands" (MTDLs). A drug discovery project is a tedious, costly and long-term undertaking. Moreover, multi-target AD drug discovery adds extra challenges, such as good binding affinity of ligands for multiple targets, optimal ADME/T properties, no or fewer off-target side effects, and crossing of the blood-brain barrier. These hurdles may be addressed by in silico methods, which offer efficient solutions in less time and at lower cost, as computational methods have been successfully applied to single-target drug discovery projects. Here we summarize some of the most prominent and computationally explored single targets against AD, and we discuss successful examples of dual or multiple inhibitors for the same targets. Moreover, we focus on ligand- and structure-based computational approaches to designing MTDLs against AD. It is not an easy task to balance dual activity in a single molecule, but computational approaches such as virtual screening, docking, QSAR, simulation and free-energy calculations are useful for future MTDL drug discovery, alone or in combination with fragment-based methods. 
However, rational and logical implementation of computational drug design methods is capable of assisting AD drug discovery and plays an important role in optimizing multi-target drug discovery. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.
Waller, Rebecca; Gardner, Frances; Dishion, Thomas; Sitnick, Stephanie L.; Shaw, Daniel S.; Winter, Charlotte E.; Wilson, Melvin
2016-01-01
A large literature provides strong empirical support for the influence of parenting on child outcomes. The current study addresses enduring research questions testing the importance of early parenting behavior to children’s adjustment. Specifically, we developed and tested a novel multi-method observational measure of parental positive behavior support at age 2. Next, we tested whether early parental positive behavior support was related to child adjustment at school age, within a multi-agent and multi-method measurement approach and design. Observational and parent-reported data from mother–child dyads (N = 731; 49 percent female) were collected from a high-risk sample at age 2. Follow-up data were collected via teacher report and child assessment at age 7.5. The results supported combining three different observational methods to assess positive behavior support at age 2 within a latent factor. Further, parents’ observed positive behavior support at age 2 predicted multiple types of teacher-reported and child-assessed problem behavior and competencies at 7.5 years old. Results supported the validity and predictive capability of a multi-method observational measure of parenting and the importance of a continued focus on the early years within preventive interventions. PMID:26997757
Kushniruk, Andre; Senathirajah, Yalini; Borycki, Elizabeth
2017-01-01
The usability and safety of health information systems have become major issues in the design and implementation of useful healthcare IT. In this paper we describe a multi-phased multi-method approach to integrating usability engineering methods into system testing to ensure both usability and safety of healthcare IT upon widespread deployment. The approach involves usability testing followed by clinical simulation (conducted in-situ) and "near-live" recording of user interactions with systems. At key stages in this process, usability problems are identified and rectified forming a usability and technology-induced error "safety net" that catches different types of usability and safety problems prior to releasing systems widely in healthcare settings.
Classen, Sherrilene; Lopez, Ellen DS; Winter, Sandra; Awadzi, Kezia D; Ferree, Nita; Garvan, Cynthia W
2007-01-01
The topic of motor vehicle crashes among the elderly is dynamic and multi-faceted, requiring a comprehensive and synergistic approach to intervention planning. This approach must be based on the values of a given population as well as health statistics, and asserted through community, organizational and policy strategies. An integrated summary of the predictors (quantitative research) and views (qualitative research) of older drivers and their stakeholders does not currently exist. This study provided an explicit socio-ecological view explaining the interrelation of possible causative factors, an integrated summary of these causative factors, and empirical guidelines for developing public health interventions to promote older driver safety. Using a mixed methods approach, we were able to compare and integrate main findings from a national crash dataset with perspectives of stakeholders. We identified: 11 multi-causal factors for safe elderly driving; the importance of environmental factors (previously underrated in the literature) interacting with behavioral and health factors; and the interrelatedness among many socio-ecological factors. For the first time, to our knowledge, we conceptualized the fundamental elements of a multi-causal health promotion plan with measurable intermediate and long-term outcomes. After completing the detailed plan, we will test the effectiveness of this intervention on multiple levels. PMID:18225470
NASA Astrophysics Data System (ADS)
Patriot, E. A.; Suhandi, A.; Chandra, D. T.
2018-05-01
The ultimate goal of learning in the 2013 curriculum is that learning must improve and balance the soft skills and hard skills of learners. In addition to the knowledge aspect, one of the other skills to be trained in a learning process using a scientific approach is communication skill. This study aims to provide an overview of the implementation of interactive conceptual instruction with multi-representation to optimize the achievement of students' scientific communication skills on the work and energy concept. Scientific communication skills comprise the sub-skills of searching for information, scientific writing, group discussion and knowledge presentation. This study was descriptive research using an observation method. The subjects were 35 students of class X in a senior high school in Sumedang. The results indicate optimal achievement of scientific communication skills. The greatest achievement based on observation was at the fourth meeting for KKI-3, the sub-skill of resume writing, at 89%. Almost all students responded positively to the implementation of interactive conceptual instruction with the multi-representation approach. It can be concluded that implementing interactive conceptual instruction with a multi-representation approach can optimize the achievement of students' scientific communication skills on the work and energy concept.
Isocyanates and human health: Multi-stakeholder information needs and research priorities
Lockey, JE; Redlich, CA; Streicher, R; Pfahles-Hutchens, A; Hakkinen, PJ; Ellison, GL; Harber, P; Utell, M; Holland, J; Comai, A; White, Marc
2014-01-01
Objective Outline the knowledge gaps and research priorities identified by a broad base of stakeholders involved in the planning of, and participation in, an international conference and research agenda workshop on isocyanates and human health held in Potomac, Maryland in April 2013. Methods A multi-modal, iterative approach was employed for data collection, including pre-conference surveys, review of a 2001 consensus conference on isocyanates, oral and poster presentations, focused break-out sessions, panel discussions and a post-conference research agenda workshop. Results Participants included representatives of consumer and worker health, health professionals, regulatory agencies, academic and industry scientists, labor, and trade associations. Conclusions Recommendations were summarized regarding knowledge gaps and research priorities in the following areas: worker and consumer exposures; toxicology, animal models, and biomarkers; human cancer risk; environmental exposure and monitoring; respiratory epidemiology and disease; and occupational health surveillance. PMID:25563538
Participatory approaches to understanding practices of flood management across borders
NASA Astrophysics Data System (ADS)
Bracken, L. J.; Forrester, J.; Oughton, E. A.; Cinderby, S.; Donaldson, A.; Anness, L.; Passmore, D.
2012-04-01
The aim of this paper is to outline and present initial results from a study designed to identify principles of and practices for adaptive co-management strategies for resilience to flooding in borderlands, using participatory methods. Borderlands are the complex and sometimes undefined spaces existing at the interface of different territories; the concept draws attention towards messy connections and disconnections (Strathern 2004; Sassen 2006). For this project the borderlands concerned are those between professional and lay knowledge, between responsible agencies, and between one nation and another. Research was focused on the River Tweed catchment, located on the Scottish-English border. This catchment is subject to complex environmental designations and rural development regimes that make integrated management of the whole catchment difficult. A multi-method approach was developed using semi-structured interviews, Q methodology and participatory GIS in order to capture the wide-ranging practices for managing flooding and the judgements behind them, and to 'scale up' participation in the study. Professionals and local experts were involved in the research. The methodology generated a useful set of options for flood management, with research outputs easily understood by key management organisations and the wider public alike. There was wide endorsement of alternative flood management solutions from both managers and local experts. The role of location was particularly important for ensuring communication and data sharing between flood managers from different organisations and more wide-ranging stakeholders. There were complex issues around scale: both the mismatch between communities and evidence of flooding, and the mismatch between governance and the scale of intervention for natural flood management. The multi-method approach was essential in capturing practice and the complexities around the governance of flooding.
The involvement of key flood management organisations was integral to making the research of relevance to professionals.
Sustainability assessment of tertiary wastewater treatment technologies: a multi-criteria analysis.
Plakas, K V; Georgiadis, A A; Karabelas, A J
2016-01-01
Multi-criteria analysis gives researchers, designers and decision-makers the opportunity to examine decision options in a multi-dimensional fashion. On this basis, four tertiary wastewater treatment (WWT) technologies were assessed regarding their sustainability performance in producing recycled wastewater, considering a 'triple bottom line' approach (i.e. economic, environmental, and social). These are powdered activated carbon adsorption coupled with ultrafiltration membrane separation (PAC-UF), reverse osmosis, ozone/ultraviolet-light oxidation, and heterogeneous photo-catalysis coupled with low-pressure membrane separation (photocatalytic membrane reactor, PMR). The participatory method called the simple multi-attribute rating technique exploiting ranks (SMARTER) was employed for assigning weights to selected sustainability indicators. This sustainability assessment approach resulted in the development of a composite index as a final metric for each WWT technology evaluated. The PAC-UF technology appears to be the most appropriate, attaining the highest composite value for sustainability performance. A scenario analysis confirmed the results of the original scenario in five out of seven cases. In parallel, the PMR was highlighted as the technology with the least variability in its performance. Nevertheless, additional actions and approaches are proposed to strengthen the objectivity of the final results.
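The rank-based weighting step can be sketched as follows. SMARTER typically converts criteria ranks into rank-order centroid weights, which are then combined with normalized indicator scores into a composite index. The ranking and the indicator scores below are hypothetical illustrations, not values taken from the study.

```python
def roc_weights(n):
    """Rank-order centroid weights for n criteria ranked 1 (most important) to n."""
    return [sum(1.0 / i for i in range(k, n + 1)) / n for k in range(1, n + 1)]

def composite_index(scores, weights):
    """Weighted sum of normalized (0-1) indicator scores."""
    return sum(s * w for s, w in zip(scores, weights))

# Hypothetical ranking of three sustainability criteria:
# economic > environmental > social.
w = roc_weights(3)                               # roughly [0.611, 0.278, 0.111]
pac_uf = composite_index([0.8, 0.7, 0.6], w)     # hypothetical indicator scores
ro = composite_index([0.6, 0.5, 0.7], w)
print(pac_uf > ro)
```

The centroid weights always sum to one, so the composite index stays on the same 0-1 scale as the normalized indicators.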
Ryu, Ehri; Cheong, Jeewon
2017-01-01
In this article, we evaluated the performance of statistical methods in single-group and multi-group analysis approaches for testing group difference in indirect effects and for testing simple indirect effects in each group. We also investigated whether the performance of the methods in the single-group approach was affected when the assumption of equal variance was not satisfied. The assumption was critical for the performance of the two methods in the single-group analysis: the method using a product term for testing the group difference in a single path coefficient, and the Wald test for testing the group difference in the indirect effect. Bootstrap confidence intervals in the single-group approach and all methods in the multi-group approach were not affected by the violation of the assumption. We compared the performance of the methods and provided recommendations. PMID:28553248
Solving multi-objective optimization problems in conservation with the reference point method
Dujardin, Yann; Chadès, Iadine
2018-01-01
Managing the biodiversity extinction crisis requires wise decision-making processes able to account for the limited resources available. In most decision problems in conservation biology, several conflicting objectives have to be taken into account. Most methods used in conservation either provide suboptimal solutions or use strong assumptions about the decision-maker’s preferences. Our paper reviews some of the existing approaches to solve multi-objective decision problems and presents new multi-objective linear programming formulations of two multi-objective optimization problems in conservation, allowing the use of a reference point approach. Reference point approaches solve multi-objective optimization problems by interactively representing the preferences of the decision-maker with a point in the criteria (objectives) space, called the reference point. We modelled and solved the following two problems in conservation: a dynamic multi-species management problem under uncertainty and a spatial allocation resource management problem. Results show that the reference point method outperforms classic methods while illustrating the use of an interactive methodology for solving combinatorial problems with multiple objectives. The method is general and can be adapted to a wide range of ecological combinatorial problems. PMID:29293650
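A reference point approach can be illustrated with a minimal sketch: alternatives are scored by an augmented Chebyshev achievement function that measures how far each falls short of the decision-maker's aspiration levels (the reference point). The conservation objectives and values below are hypothetical; the paper's actual formulations are multi-objective linear programs solved exactly, not an enumeration.

```python
def achievement(z, ref, rho=1e-3):
    """Augmented Chebyshev achievement function (smaller is better):
    the worst shortfall from the reference point, plus a small
    augmentation term that breaks ties among equally bad alternatives."""
    shortfalls = [r - zi for zi, r in zip(z, ref)]
    return max(shortfalls) + rho * sum(shortfalls)

# Hypothetical management plans scored on two objectives to maximize:
# (expected species persistence, fraction of area protected).
alternatives = {
    "plan_A": (0.9, 0.4),
    "plan_B": (0.7, 0.7),
    "plan_C": (0.5, 0.5),
}
ref = (0.8, 0.6)   # aspiration levels set interactively by the decision-maker
best = min(alternatives, key=lambda k: achievement(alternatives[k], ref))
print(best)
```

Moving the reference point and re-solving is what makes the method interactive: the decision-maker steers the search by restating aspirations rather than by fixing weights in advance.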
Small Wins: An Initiative to Promote Gender Equity in Higher Education
ERIC Educational Resources Information Center
Johnson, Katherine A.; Warr, Deborah J.; Hegarty, Kelsey; Guillemin, Marilys
2015-01-01
Gender inequity in leadership and management roles within the higher education sector remains a widespread problem. Researchers have suggested that a multi-pronged method is the preferred approach to reach and maintain gender equity over time. A large university faculty undertook an audit to gauge the level of gender equity on the senior…
Evaluating Student Satisfaction of Quality at International Branch Campuses
ERIC Educational Resources Information Center
Ahmad, Syed Zamberi
2015-01-01
The aim of this research is to present the determinants of students' perceptions of quality and experience of study at international branch campuses in Malaysia, a country that is set to become an academic hub in Asia. This study used a multi-method approach for data collection. The respondents comprised 245 students (both undergraduate and…
The Diffusion and Impact of Radio Frequency Identification in Supply Chains: A Multi-Method Approach
ERIC Educational Resources Information Center
Wu, Xiaoran
2012-01-01
As a promising and emerging technology for supply chain management, Radio Frequency Identification (RFID) is a new alternative to existing tracking technologies and also allows a range of internal control and supply chain coordination. RFID has generated a significant amount of interest and activities from both practitioners and researchers in…
Measuring Library Space Use and Preferences: Charting a Path toward Increased Engagement
ERIC Educational Resources Information Center
Webb, Kathleen M.; Schaller, Molly A.; Hunley, Sawyer A.
2008-01-01
The University of Dayton (UD) used a multi-method research approach to evaluate current space use in the library. A general campus survey on study spaces, online library surveys, a week-long video study, and data from the "National Survey of Student Engagement (NSSE)" were examined to understand student choices in library usage. Results…
Krishnamoorthy, Kannan; Mahalingam, Manikandan
2015-01-01
Purpose: The present study aimed to select a suitable method for the preparation of camptothecin-loaded polymeric nanoparticles by utilizing a multi-criteria decision-making method. Novel approaches to drug delivery using nanotechnology are revolutionizing the future of medicine, and recent years have witnessed unprecedented growth of research and application in the area of nanotechnology. Nanoparticles have become an important area of research in the field of drug delivery because they can deliver a wide range of drugs to varying areas of the body. Methods: Polymeric nanoparticles are frequently used to improve the therapeutic effect of drugs, and a number of techniques are available for their preparation. The Analytic Hierarchy Process (AHP) is a method for decision-making in which priorities are derived from individual judgements on qualitative factors using pair-wise comparison matrices. In AHP, a decision hierarchy is constructed with a goal, criteria and alternatives. Results: The model uses three main criteria: 1) Instrument, 2) Process and Output, and 3) Cost. In addition, there are eight sub-criteria as well as eight alternatives. Pair-wise comparison matrices are used to obtain the overall priority weight and ranking for the selection of the suitable method. The nanoprecipitation technique is the most suitable method for the preparation of camptothecin-loaded polymeric nanoparticles, with the highest overall priority weight of 0.297. Conclusion: In particular, the result indicates that the priority weights obtained from AHP can serve as a multiple output for identifying the most suitable preparation method for camptothecin-loaded polymeric nanoparticles. PMID:25789220
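The core AHP computation can be sketched as follows: priority weights are the principal eigenvector of a reciprocal pair-wise comparison matrix (obtained here by power iteration), and a consistency ratio checks the coherence of the judgements. The 3x3 matrix below compares the study's three main criteria using hypothetical judgements on Saaty's 1-9 scale, not the study's actual data.

```python
def ahp_weights(M, iters=200):
    """Priority weights: the normalized principal eigenvector of a
    reciprocal comparison matrix, computed by power iteration."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [wi / total for wi in w]
    return w

def consistency_ratio(M, w):
    """CR = CI / RI; judgements are conventionally accepted when CR < 0.10."""
    RI = {3: 0.58, 4: 0.90, 5: 1.12, 8: 1.41}   # Saaty's random indices
    n = len(M)
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    return ((lam - n) / (n - 1)) / RI[n]

# Hypothetical pair-wise judgements for the three main criteria.
M = [[1.0, 1 / 3, 2.0],      # Instrument
     [3.0, 1.0, 5.0],        # Process and Output
     [1 / 2, 1 / 5, 1.0]]    # Cost
w = ahp_weights(M)
print([round(x, 3) for x in w])   # 'Process and Output' receives the largest weight
```

In the full model the same computation is repeated for each sub-criterion and each set of alternatives, and the local weights are multiplied down the hierarchy to give the overall priorities.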
Multisource geological data mining and its utilization of uranium resources exploration
NASA Astrophysics Data System (ADS)
Zhang, Jie-lin
2009-10-01
Nuclear energy, as a clean energy source, plays an important role in economic development in China, and according to the national long-term development strategy many more nuclear power plants will be built in the next few years, so uranium resources exploration faces a great challenge. Research and practice in mineral exploration demonstrate that utilizing modern Earth Observing System (EOS) technology and developing new multi-source geological data mining methods are effective approaches to uranium deposit prospecting. Based on data mining and knowledge discovery technology, this paper uses multi-source geological data to characterize the electromagnetic spectral, geophysical and spatial information of uranium mineralization factors, and provides technical support for uranium prospecting integrated with field remote sensing geological survey. The multi-source geological data used in this paper include satellite hyperspectral imagery (Hyperion), high-spatial-resolution remote sensing data, uranium geological information, airborne radiometric data, and aeromagnetic and gravity data; related data mining methods have been developed, such as data fusion of optical data and Radarsat imagery, and information integration of remote sensing and geophysical data. Based on the above approaches, the multi-geoscience information of uranium mineralization factors, including complex polystage rock masses, mineralization-controlling faults and hydrothermal alterations, has been identified, the metallogenic potential of uranium has been evaluated, and some predicted areas have been located.
On Multifunctional Collaborative Methods in Engineering Science
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.
2001-01-01
Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations, including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented, with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as the most robust. The multiple-method approach is advantageous when interfacing diverse disciplines in which each method's strengths are utilized.
Nutrient supplementation approaches in the treatment of ADHD.
Rucklidge, Julia J; Johnstone, Jeanette; Kaplan, Bonnie J
2009-04-01
Attention-deficit/hyperactivity disorder (ADHD) is a chronic, debilitating psychiatric illness that often co-occurs with other common psychiatric problems. Although empirical evidence supports pharmacological and behavioral treatments, side effects, concerns regarding safety and fears about long-term use all contribute to families searching for alternative methods of treating the symptoms of ADHD. This review presents the published evidence on supplementation, including single ingredients (e.g., minerals, vitamins, amino acids and essential fatty acids), botanicals and multi-ingredient formulas in the treatment of ADHD symptoms. In most cases, evidence is sparse, mixed and lacking information. Of those supplements where we found published studies, the evidence is best for zinc (two positive randomized, controlled trials); there is mixed evidence for carnitine, pycnogenol and essential fatty acids, and more research is needed before drawing conclusions about vitamins, magnesium, iron, SAM-e, tryptophan and Ginkgo biloba with ginseng. To date, there is no evidence to support the use of St John's wort, tyrosine or phenylalanine in the treatment of ADHD symptoms. Multi-ingredient approaches are an intriguing yet under-researched area; we discuss the benefits of this approach considering the heterogeneous nature of ADHD.
Bromuri, Stefano; Zufferey, Damien; Hennebert, Jean; Schumacher, Michael
2014-10-01
This research is motivated by the issue of classifying illnesses of chronically ill patients for decision support in clinical settings. Our main objective is to propose multi-label classification of multivariate time series contained in medical records of chronically ill patients, by means of quantization methods, such as bag of words (BoW), and multi-label classification algorithms. Our second objective is to compare supervised dimensionality reduction techniques to state-of-the-art multi-label classification algorithms. The hypothesis is that kernel methods and locality preserving projections make such algorithms good candidates to study multi-label medical time series. We combine BoW and supervised dimensionality reduction algorithms to perform multi-label classification on health records of chronically ill patients. The considered algorithms are compared with state-of-the-art multi-label classifiers in two real world datasets. Portavita dataset contains 525 diabetes type 2 (DT2) patients, with co-morbidities of DT2 such as hypertension, dyslipidemia, and microvascular or macrovascular issues. MIMIC II dataset contains 2635 patients affected by thyroid disease, diabetes mellitus, lipoid metabolism disease, fluid electrolyte disease, hypertensive disease, thrombosis, hypotension, chronic obstructive pulmonary disease (COPD), liver disease and kidney disease. The algorithms are evaluated using multi-label evaluation metrics such as hamming loss, one error, coverage, ranking loss, and average precision. Non-linear dimensionality reduction approaches behave well on medical time series quantized using the BoW algorithm, with results comparable to state-of-the-art multi-label classification algorithms. Chaining the projected features has a positive impact on the performance of the algorithm with respect to pure binary relevance approaches. 
The evaluation highlights the feasibility of representing medical health records using the BoW for multi-label classification tasks. The study also highlights that dimensionality reduction algorithms based on kernel methods, locality preserving projections or both are good candidates to deal with multi-label classification tasks in medical time series with many missing values and high label density. Copyright © 2014 Elsevier Inc. All rights reserved.
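The quantization step behind the BoW representation can be sketched simply: slide a window over a normalized time series, discretize each window into a symbol, and count symbol occurrences to form the histogram that the multi-label classifiers consume. The window size, bin thresholds and the example series below are hypothetical, not the parameters used in the study.

```python
def bow_histogram(series, window=3, bins=(0.33, 0.66)):
    """Map a (0-1 normalized) time series to a histogram of discrete 'words'.
    Each window's mean is binned into one of len(bins)+1 symbols."""
    counts = {}
    for start in range(len(series) - window + 1):
        mean = sum(series[start:start + window]) / window
        word = sum(mean > b for b in bins)      # symbol 0, 1 or 2
        counts[word] = counts.get(word, 0) + 1
    return counts

# A short, hypothetical normalized glucose trace from one patient record.
glucose = [0.2, 0.25, 0.3, 0.7, 0.8, 0.75, 0.4, 0.5]
print(bow_histogram(glucose))
```

Histograms built this way are fixed-length regardless of record length and tolerate missing segments, which is what makes the representation attractive for the sparse, irregular series in medical records before any dimensionality reduction is applied.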
NASA Astrophysics Data System (ADS)
Rahmanita, E.; Widyaningrum, V. T.; Kustiyahningsih, Y.; Purnama, J.
2018-04-01
SMEs have a very important role in the development of the economy in Indonesia: they assist the government in creating new jobs and can support household income. The number of SMEs in Madura and the number of measurement indicators involved in SME mapping require a systematic method. This research uses the Fuzzy Analytic Network Process (FANP) for SME performance measurement. The FANP method can handle data that contain uncertainty, and a consistency index supports the resulting decisions. Performance measurement in this study is based on the perspectives of the Balanced Scorecard; the research approach integrates the internal business perspective and the learning and growth perspective with the Fuzzy Analytic Network Process (FANP). The result of this research is a framework of priority weightings for SME assessment indicators.
Polnaszek, Brock; Gilmore-Bykovskyi, Andrea; Hovanes, Melissa; Roiland, Rachel; Ferguson, Patrick; Brown, Roger; Kind, Amy JH
2014-01-01
Background Unstructured data encountered during retrospective electronic medical record (EMR) abstraction has routinely been identified as challenging to abstract reliably, as this data is often recorded as free text, without limitations to format or structure. There is increased interest in reliably abstracting this type of data given its prominent role in care coordination and communication, yet limited methodological guidance exists. Objective As standard abstraction approaches resulted in sub-standard data reliability for unstructured data elements collected as part of a multi-site, retrospective EMR study of hospital discharge communication quality, our goal was to develop, apply and examine the utility of a phase-based approach to reliably abstract unstructured data. This approach is examined using the specific example of discharge communication for warfarin management. Research Design We adopted a "fit-for-use" framework to guide the development and evaluation of abstraction methods using a four-step, phase-based approach comprising (1) team building, (2) identification of challenges, (3) adaptation of abstraction methods, and (4) systematic data quality monitoring. Measures Unstructured data elements were the focus of this study, including elements communicating steps in warfarin management (e.g., warfarin initiation) and medical follow-up (e.g., timeframe for follow-up). Results After implementation of the phase-based approach, inter-rater reliability for all unstructured data elements demonstrated kappas of at least 0.89, an average increase of 0.25 for each unstructured data element. Conclusions Compared to standard abstraction methodologies, this phase-based approach was more time intensive, but it markedly increased abstraction reliability for unstructured data elements within multi-site EMR documentation. PMID:27624585
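The reliability statistic behind those results, Cohen's kappa for agreement between two abstractors, can be sketched as follows. The ratings below are hypothetical chart codings, not data from the study.

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance),
    where chance agreement comes from each rater's marginal label frequencies."""
    n = len(rater1)
    labels = set(rater1) | set(rater2)
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    p_chance = sum((rater1.count(l) / n) * (rater2.count(l) / n) for l in labels)
    return (p_obs - p_chance) / (1 - p_chance)

# Two abstractors coding 'warfarin initiation documented?' for 10 charts
# (hypothetical ratings; they disagree on one chart).
r1 = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no", "yes", "yes"]
r2 = ["yes", "yes", "no", "yes", "no", "yes", "no", "no", "yes", "yes"]
print(round(cohens_kappa(r1, r2), 2))
```

Kappa discounts the agreement two raters would reach by chance alone, which is why it is preferred over raw percent agreement for monitoring abstraction quality.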
Diaby, Vakaramoko; Goeree, Ron
2014-02-01
In recent years, the quest for more comprehensiveness, structure and transparency in reimbursement decision-making in healthcare has prompted the research into alternative decision-making frameworks. In this environment, multi-criteria decision analysis (MCDA) is arising as a valuable tool to support healthcare decision-making. In this paper, we present the main MCDA decision support methods (elementary methods, value-based measurement models, goal programming models and outranking models) using a case study approach. For each family of methods, an example of how an MCDA model would operate in a real decision-making context is presented from a critical perspective, highlighting the parameters setting, the selection of the appropriate evaluation model as well as the role of sensitivity and robustness analyses. This study aims to provide a step-by-step guide on how to use MCDA methods for reimbursement decision-making in healthcare.
Simplified neutrosophic sets and their applications in multi-criteria group decision-making problems
NASA Astrophysics Data System (ADS)
Peng, Juan-juan; Wang, Jian-qiang; Wang, Jing; Zhang, Hong-yu; Chen, Xiao-hong
2016-07-01
As a variation of fuzzy sets and intuitionistic fuzzy sets, neutrosophic sets have been developed to represent uncertain, imprecise, incomplete and inconsistent information that exists in the real world. Simplified neutrosophic sets (SNSs) have been proposed for the main purpose of addressing issues with a set of specific numbers. However, there are certain problems regarding the existing operations of SNSs, as well as their aggregation operators and the comparison methods. Therefore, this paper defines the novel operations of simplified neutrosophic numbers (SNNs) and develops a comparison method based on the related research of intuitionistic fuzzy numbers. On the basis of these operations and the comparison method, some SNN aggregation operators are proposed. Additionally, an approach for multi-criteria group decision-making (MCGDM) problems is explored by applying these aggregation operators. Finally, an example to illustrate the applicability of the proposed method is provided and a comparison with some other methods is made.
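The flavor of SNN arithmetic can be sketched with the operations commonly used in this literature: an SNN is a triple (T, I, F) of truth, indeterminacy and falsity degrees in [0, 1], combined with t-conorm/t-norm style operations and compared via a score function. The operator and score forms below are the standard ones, stated as an assumption for illustration; the paper's contribution is precisely a set of refined operations and a new comparison method, which this sketch does not reproduce.

```python
def snn_add(a, b):
    """'Probabilistic sum' style addition of two SNNs (T, I, F)."""
    (t1, i1, f1), (t2, i2, f2) = a, b
    return (t1 + t2 - t1 * t2, i1 * i2, f1 * f2)

def snn_score(a):
    """A commonly used score function: higher is better."""
    t, i, f = a
    return (2 + t - i - f) / 3

a = (0.7, 0.2, 0.1)   # strong support, little indeterminacy (hypothetical)
b = (0.4, 0.5, 0.3)   # weaker, more indeterminate assessment
agg = snn_add(a, b)
print(snn_score(a) > snn_score(b))
```

In an MCGDM setting, each expert's criterion-wise SNN evaluations are merged by aggregation operators of this kind, and the alternatives are then ranked by the score function.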
Innovative architecture design for high performance organic and hybrid multi-junction solar cells
NASA Astrophysics Data System (ADS)
Li, Ning; Spyropoulos, George D.; Brabec, Christoph J.
2017-08-01
The multi-junction concept is especially attractive to the photovoltaic (PV) research community owing to its potential to overcome the Shockley-Queisser limit of single-junction solar cells. Tremendous research interest is now focused on the development of high-performance absorbers and novel device architectures for emerging PV technologies, such as organic and perovskite PVs. It has been predicted that the multi-junction concept can push the organic and perovskite PV technologies toward the 20% and 30% benchmarks, respectively, pointing to a bright future for commercialization of these emerging PV technologies. In this contribution, we will demonstrate innovative architecture design for solution-processed, highly functional organic and hybrid multi-junction solar cells. A simple but elegant approach to fabricating organic and hybrid multi-junction solar cells will be introduced. By laminating single organic/hybrid solar cells together through an intermediate layer, the manufacturing cost and complexity of large-scale multi-junction solar cells can be significantly reduced. This smart approach to balancing the photocurrents as well as the open-circuit voltages in multi-junction solar cells will be demonstrated and discussed in detail.
CFD Methods and Tools for Multi-Element Airfoil Analysis
NASA Technical Reports Server (NTRS)
Rogers, Stuart E.; George, Michael W. (Technical Monitor)
1995-01-01
This lecture will discuss the computational tools currently available for high-lift multi-element airfoil analysis. It will present an overview of a number of different numerical approaches, their current capabilities, shortcomings, and computational costs. The lecture will be limited to viscous methods, including inviscid/boundary-layer coupling methods, and incompressible and compressible Reynolds-averaged Navier-Stokes methods. Both structured and unstructured grid generation approaches will be presented. Two different structured grid procedures are outlined: one uses multi-block patched grids, the other uses overset chimera grids. Turbulence and transition modeling will be discussed.
Kagawa-Singer, Marjorie; Adler, Shelley R; Mouton, Charles E; Ory, Marcia; Underwood, Lynne G
2009-01-01
To outline the lessons learned about the use of focus groups for the multisite, multi-ethnic longitudinal Study of Women Across the Nation (SWAN). Focus groups were designed to identify potential cultural differences in the incidence of symptoms and the meaning of transmenopause among women of diverse cultures, and to identify effective recruitment and retention strategies. Inductive and deductive focus groups for a multi-ethnic study. Seven community research sites across the United States conducted focus groups with six ethnic populations: African American, Chinese American, Japanese American, Mexican American, non-Hispanic white, and Puerto Rican. Community women from each ethnic group of color. A set of four/five focus groups in each ethnic group as the formative stage of the deductive, quantitative SWAN survey. Identification of methodological advantages and challenges to the successful implementation of formative focus groups in a multi-ethnic, multi-site population-based epidemiologic study. We provide recommendations from our lessons learned to improve the use of focus groups in future studies with multi-ethnic populations. Mixed methods using inductive and deductive approaches require the scientific integrity of both research paradigms. Adequate resources and time must be budgeted as essential parts of the overall strategy from the outset of study. Inductive cross-cultural researchers should be key team members, beginning with inception through each subsequent design phase to increase the scientific validity, generalizability, and comparability of the results across diverse ethnic groups, to assure the relevance, validity and applicability of the findings to the multicultural population of focus.
Organic synthesis: march of the machines.
Ley, Steven V; Fitzpatrick, Daniel E; Ingham, Richard J; Myers, Rebecca M
2015-03-09
Organic synthesis is changing; in a world where budgets are constrained and the environmental impacts of practice are scrutinized, it is increasingly recognized that the efficient use of human resource is just as important as material use. New technologies and machines have found use as methods for transforming the way we work, addressing these issues encountered in research laboratories by enabling chemists to adopt a more holistic systems approach in their work. Modern developments in this area promote a multi-disciplinary approach and work is more efficient as a result. This Review focuses on the concepts, procedures and methods that have far-reaching implications in the chemistry world. Technologies have been grouped as topics of opportunity and their recent applications in innovative research laboratories are described. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Conducting a team-based multi-sited focused ethnography in primary care.
Bikker, A P; Atherton, H; Brant, H; Porqueddu, T; Campbell, J L; Gibson, A; McKinstry, B; Salisbury, C; Ziebland, S
2017-09-12
Focused ethnography is an applied and pragmatic form of ethnography that explores a specific social phenomenon as it occurs in everyday life. Based on the literature, a problem-focused research question is formulated before the data collection. The data generation process targets key informants and situations so that relevant results on the pre-defined topic can be obtained within a relatively short time-span. As part of a theory-based evaluation of alternative forms of consultation (such as video, phone and email) in primary care, we used the focused ethnographic method in a multi-site study in general practice across the UK. To date there is a gap in the literature on using focused ethnography in healthcare research. The aim of the paper is to build on the various methodological approaches in health services research by presenting the challenges and benefits we encountered whilst conducting a focused ethnography in British primary care. Our considerations are clustered under three headings: constructing a shared understanding, dividing the tasks within the team, and the functioning of the focused ethnographers within the broader multi-disciplinary team. As a result of using this approach we experienced several advantages, such as the ability to collect focused data in several settings simultaneously within a short time-span. Also, the sharing of experiences and interpretations between the researchers contributed to a more holistic understanding of the research topic. However, mechanisms need to be in place to facilitate and synthesise the observations, guide the analysis, and ensure that all researchers feel engaged. Reflection, trust and flexibility among the team members were crucial to successfully adopting a team-focused ethnographic approach. When used for policy-focussed applied healthcare research, a team-based multi-sited focused ethnography can uncover practices and understandings that would not be apparent through surveys or interviews alone.
If conducted with care, it can provide timely findings within the fast moving context of healthcare policy and research.
Absolute order-of-magnitude reasoning applied to a social multi-criteria evaluation framework
NASA Astrophysics Data System (ADS)
Afsordegan, A.; Sánchez, M.; Agell, N.; Aguado, J. C.; Gamboa, G.
2016-03-01
A social multi-criteria evaluation framework for solving a real-case problem of selecting a wind farm location in the regions of Urgell and Conca de Barberà in Catalonia (northeast of Spain) is studied. This paper applies a qualitative multi-criteria decision analysis approach based on linguistic label assessment that is able to address uncertainty and deal with different levels of precision. The method is based on qualitative reasoning, an artificial intelligence technique, for assessing and ranking multi-attribute alternatives with linguistic labels in order to handle uncertainty. It is suitable for problems in the social domain, such as energy planning, which require the construction of a dialogue process among many social actors under high levels of complexity and uncertainty. The method is compared with an existing approach that has previously been applied to the wind farm location problem: an outranking method based on Condorcet's original method. The results obtained by both approaches are analysed, and their performance in selecting the wind farm location is compared across aggregation procedures. Although the results show that both methods lead to similar rankings of alternatives, the study highlights both their advantages and drawbacks.
A Belief-Space Approach to Integrated Intelligence - Research Area 10.3: Intelligent Networks
2017-12-05
Hybrid configuration spaces exhibit high dimensionality and multi-modality, and planners that perform a purely geometric search are prohibitively slow. Related publication: "Hierarchical planning for multi-contact non-prehensile manipulation" (conference paper or presentation, Hamburg, January).
Enhancing resource coordination for multi-modal evacuation planning.
DOT National Transportation Integrated Search
2013-01-01
This research project seeks to increase knowledge about coordinating effective multi-modal evacuation for disasters. It does so by identifying, evaluating, and assessing current transportation management approaches for multi-modal evacuation planning...
Methods and Model Dependency of Extreme Event Attribution: The 2015 European Drought
NASA Astrophysics Data System (ADS)
Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Vautard, Robert; van Oldenborgh, Geert J.; Wilcox, Laura; Seneviratne, Sonia I.
2017-10-01
Science on the role of anthropogenic influence on extreme weather events, such as heatwaves or droughts, has evolved rapidly in recent years. The approach of "event attribution" compares the occurrence probability of an event in the present, factual climate with its probability in a hypothetical, counterfactual climate without human-induced climate change. Several methods can be used for event attribution, based on climate model simulations and observations, and researchers usually assess only a subset of methods and data sources. Here, we explore the role of methodological choices for the attribution of the 2015 meteorological summer drought in Europe. We present contradicting conclusions on the relevance of human influence as a function of the chosen data source and event attribution methodology. Assessments using the maximum number of models and counterfactual climates with pre-industrial greenhouse gas concentrations point to an enhanced drought risk in Europe. However, other evaluations show contradictory evidence. These results highlight the need for a multi-model and multi-method framework in event attribution research, especially for events with a low signal-to-noise ratio and high model dependency, such as regional droughts.
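The occurrence-probability comparison that defines event attribution can be sketched as follows. The Gaussian distributions, threshold, and sample sizes below are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic summer-mean moisture anomalies (illustrative units):
# the factual climate (with human influence) is shifted toward drier values.
factual = rng.normal(loc=-0.5, scale=1.0, size=10_000)
counterfactual = rng.normal(loc=0.0, scale=1.0, size=10_000)

# Define the "event" as an anomaly at least as severe as a chosen threshold.
threshold = -2.0

p_factual = np.mean(factual <= threshold)
p_counterfactual = np.mean(counterfactual <= threshold)

# Probability ratio: PR > 1 means the event became more likely
# in the factual (human-influenced) climate.
pr = p_factual / p_counterfactual
```

A full attribution study would estimate these probabilities from ensembles of climate model simulations and observations rather than synthetic draws, which is where the paper's multi-model, multi-method spread arises.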
ERIC Educational Resources Information Center
Araya, Saba Q.
2013-01-01
As pressure increases to ensure that limited resources are utilized as effectively as possible, funding adequacy remains a priority for all California public schools. The research was conducted through a multi-methods approach of principal interviews, site-level resource allocation data, and overall student achievement on state assessments. The…
ERIC Educational Resources Information Center
Booth, Margaret Zoller; Abercrombie, Sara; Frey, Christopher J.
2017-01-01
This research examines the relationship between ethnicity, ethnic identity, self-efficacy, and academic achievement within a multi-ethnic mid-sized city in the US Midwest. Utilizing a mixed-methods approach, middle and high school adolescents (Fall, N = 482; Spring, N = 392) were surveyed and interviewed over the course of one year to investigate…
NASA Astrophysics Data System (ADS)
Xu, Z.; Guan, K.; Peng, B.; Casler, N. P.; Wang, S. W.
2017-12-01
Landscapes have complex three-dimensional features that are difficult to extract using conventional methods. Small-footprint LiDAR provides an ideal way of capturing these features. Existing approaches, however, have been limited to raster- or metric-based (two-dimensional) feature extraction from the upper or bottom layer, and thus are not suitable for resolving morphological and intensity features that could be important to fine-scale land cover mapping. Therefore, this research combines airborne LiDAR and multi-temporal Landsat imagery to classify land cover types in Williamson County, Illinois, which has diverse and mixed landscape features. Specifically, we applied a 3D convolutional neural network (CNN) method to extract features from LiDAR point clouds by (1) creating occupancy and intensity grids at 1-meter resolution, and then (2) normalizing the data and feeding it into a 3D CNN feature extractor for many epochs of learning. The learned features (e.g., morphological and intensity features) were combined with multi-temporal spectral data to enhance the performance of land cover classification based on a Support Vector Machine classifier. We used photo interpretation to generate training and testing data. The classification results show that our approach outperforms traditional methods using LiDAR-derived feature maps, and promises to serve as an effective methodology for creating high-quality land cover maps through fusion of complementary types of remote sensing data.
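The occupancy-grid step of such a pipeline can be sketched as below. The voxelization scheme and the toy point cloud are illustrative assumptions (the study's grids also include intensity, which is omitted here):

```python
import numpy as np

def occupancy_grid(points, resolution=1.0):
    """Voxelize a LiDAR point cloud (N x 3 array of x, y, z in metres)
    into a binary occupancy grid at the given resolution."""
    mins = points.min(axis=0)
    idx = np.floor((points - mins) / resolution).astype(int)
    shape = idx.max(axis=0) + 1
    grid = np.zeros(shape, dtype=np.uint8)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = 1  # mark occupied voxels
    return grid

# Toy 3-point "cloud" for illustration
pts = np.array([[0.2, 0.1, 0.0],
                [1.6, 0.4, 0.3],
                [1.7, 1.9, 2.2]])
grid = occupancy_grid(pts, resolution=1.0)
```

In a real workflow, grids like this (stacked with intensity channels) would be normalized and fed batch-wise into the 3D CNN feature extractor.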
Remote Sensing of Cloud Top Heights Using the Research Scanning Polarimeter
NASA Technical Reports Server (NTRS)
Sinclair, Kenneth; van Diedenhoven, Bastiaan; Cairns, Brian; Yorks, John; Wasilewski, Andrzej
2015-01-01
Clouds cover roughly two thirds of the globe and act as an important regulator of Earth's radiation budget. Of these, multilayered clouds occur about half of the time and are predominantly two-layered. Changes in cloud top height (CTH) have been predicted by models to have a globally averaged positive feedback; however, observed changes in CTH have shown uncertain results. Additional CTH observations are necessary to better understand and quantify the effect. Improved CTH observations will also allow for improved sub-grid parameterizations in large-scale models, and accurate CTH information is important when studying variations in freezing point and cloud microphysics. NASA's airborne Research Scanning Polarimeter (RSP) is able to measure cloud top height using a novel multi-angular contrast approach. RSP scans along the aircraft track and obtains measurements at 152 viewing angles at any aircraft location. The approach presented here aggregates measurements from multiple scans to a single location at cloud altitude using a correlation function designed to identify the location-distinct features in each scan. During NASA's SEAC4RS air campaign, the RSP was mounted on the ER-2 aircraft along with the Cloud Physics Lidar (CPL), which made simultaneous measurements of CTH. The RSP's unique method of determining CTH is presented. The capabilities of using single channels and combinations of channels within the approach are investigated. A detailed comparison of RSP-retrieved CTHs with those of CPL reveals the accuracy of the approach. Results indicate a strong ability for the RSP to accurately identify cloud heights. Interestingly, the analysis reveals an ability for the approach to identify multiple cloud layers in a single scene and estimate the CTH of each layer. Capabilities and limitations of identifying single and multiple cloud layer heights are explored. Special focus is given to sources of error in the method, including optically thin clouds, physically thick clouds, and multi-layered clouds, as well as cloud phase. When determining multi-layered CTHs, limits on the upper cloud's opacity are assessed.
Application of multi-grid methods for solving the Navier-Stokes equations
NASA Technical Reports Server (NTRS)
Demuren, A. O.
1989-01-01
The application of a class of multi-grid methods to the solution of the Navier-Stokes equations for two-dimensional laminar flow problems is discussed. The methods consist of combining the full approximation scheme-full multi-grid technique (FAS-FMG) with point-, line-, or plane-relaxation routines for solving the Navier-Stokes equations in primitive variables. The performance of the multi-grid methods is compared to that of several single-grid methods. The results show that much faster convergence can be obtained with the multi-grid approach than with the various suggested improvements to single-grid methods. The importance of the choice of relaxation scheme for the multi-grid method is illustrated.
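As an illustration of the multi-grid idea only (not the FAS-FMG Navier-Stokes solver itself), a minimal two-grid correction cycle for the 1-D Poisson problem -u'' = f with homogeneous boundary conditions can be sketched; the grid size, smoother settings, and test problem are assumptions for the sketch:

```python
import numpy as np

def jacobi(u, f, h, iters=3, w=2.0 / 3.0):
    """Weighted-Jacobi relaxation for -u'' = f with u(0) = u(1) = 0."""
    for _ in range(iters):
        u[1:-1] += w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1] - 2.0 * u[1:-1])
    return u

def two_grid(u, f, h):
    """One two-grid cycle: pre-smooth, coarse-grid correction, post-smooth."""
    u = jacobi(u, f, h)                                          # pre-smoothing
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] + (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2  # residual
    r2 = r[::2].copy()                                           # restrict (injection)
    e2 = jacobi(np.zeros_like(r2), r2, 2.0 * h, iters=200)       # approx. coarse solve
    fine = np.arange(u.size)
    e = np.interp(fine, fine[::2], e2)                           # linear prolongation
    return jacobi(u + e, f, h)                                   # post-smoothing

n, h = 65, 1.0 / 64
x = np.linspace(0.0, 1.0, n)
f = np.pi**2 * np.sin(np.pi * x)     # exact solution: u = sin(pi x)
u = np.zeros(n)
for _ in range(25):
    u = two_grid(u, f, h)
err = np.max(np.abs(u - np.sin(np.pi * x)))
```

A real FAS-FMG solver applies this recursively over many grid levels and handles the nonlinear Navier-Stokes operator; the two-grid cycle above is only the basic building block, showing why coarse-grid correction removes the smooth error that relaxation alone leaves behind.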
Using Evaluation Research as a Means for Policy Analysis in a "New" Mission-Oriented Policy Context
ERIC Educational Resources Information Center
Amanatidou, Effie; Cunningham, Paul; Gök, Abdullah; Garefi, Ioanna
2014-01-01
Grand challenges stress the importance of multi-disciplinary research, a multi-actor approach in examining the current state of affairs and exploring possible solutions, multi-level governance and policy coordination across geographical boundaries and policy areas, and a policy environment for enabling change both in science and technology and in…
Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining
Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin
2016-01-01
Multi-component cutting force sensing systems applied to cutting tools in manufacturing processes are gradually becoming the most significant monitoring indicator. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems with the capability of monitoring cutting forces in machine operations in real time are crucial for realizing the full potential of the cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing. PMID:27854322
On deception detection in multi-agent systems and deception intent
NASA Astrophysics Data System (ADS)
Santos, Eugene, Jr.; Li, Deqing; Yuan, Xiuqing
2008-04-01
Deception detection plays an important role in the military decision-making process, but detecting deception is a challenging task. The deception planning process involves a number of human factors. It is intent-driven, where intentions are usually hidden or not easily observable. As a result, in order to detect deception, any adversary model must have the capability to capture the adversary's intent. This paper discusses deception detection in multi-agent systems and in adversary modeling. We examined psychological and cognitive science research on deception and implemented various theories of deception within our approach. First, in multi-agent expert systems, one detection method uses correlations between agents to predict reasonable opinions/responses of other agents (Santos & Johnson, 2004). We further explore this idea and present studies that show the impact of different factors on detection success rate. Second, in adversary modeling, our detection method focuses on inferring adversary intent. By combining deception "branches" with intent inference models, we can estimate an adversary's deceptive activities and at the same time enhance intent inference. Two major kinds of deception are treated in different fashions in this approach: simulative deception attempts to find inconsistency in observables, while dissimulative deception emphasizes the inference of enemy intentions.
Experimental Design for Multi-drug Combination Studies Using Signaling Networks
Huang, Hengzhen; Fang, Hong-Bin; Tan, Ming T.
2017-01-01
Combinations of multiple drugs are an important approach to maximize the chance of therapeutic success by inhibiting multiple pathways/targets. Analytic methods for studying drug combinations have received increasing attention because major advances in biomedical research have made available a large number of potential agents for testing. The preclinical experiment on multi-drug combinations plays a key role in (especially cancer) drug development because of the complex nature of the disease and the need to reduce development time and costs. Despite recent progress in statistical methods for assessing drug interaction, there is an acute lack of methods for designing experiments on multi-drug combinations. The number of combinations grows exponentially with the number of drugs and dose-levels, quickly precluding laboratory testing. Utilizing experimental dose-response data of single drugs and a few combinations, along with pathway/network information, to obtain an in silico estimate of the functional structure of the dose-response relationship, we propose in this paper an optimal design that allows exploration of the dose-effect surface with the smallest possible sample size. Simulation studies show that our proposed methods perform well. PMID:28960231
TARGET researchers use various sequencing and array-based methods to examine the genomes, transcriptomes, and, for some diseases, epigenomes of select childhood cancers. This "multi-omic" approach generates a comprehensive profile of molecular alterations for each cancer type. Alterations are changes in DNA or RNA, such as rearrangements in chromosome structure or variations in gene expression, respectively. Through computational analyses and assays to validate biological function, TARGET researchers predict which alterations disrupt the function of a gene or pathway and promote cancer growth, progression, and/or survival. Researchers identify candidate therapeutic targets and/or prognostic markers from the cancer-associated alterations.
Mentorship in Practice: A Multi-Method Approach.
ERIC Educational Resources Information Center
Schreck, Timothy J.; And Others
This study was conducted to evaluate a field-based mentorship program using a multi-method approach. It explored the use of mentorship as practiced in the Florida Compact, a business education partnership established in Florida in 1987. The study was designed to identify differences between mentors and mentorees, as well as differences within…
An efficient method for the fusion of light field refocused images
NASA Astrophysics Data System (ADS)
Wang, Yingqian; Yang, Jungang; Xiao, Chao; An, Wei
2018-04-01
Light field cameras have drawn much attention due to the advantage of post-capture adjustments such as refocusing after exposure. The depth of field in refocused images is always shallow because of the large equivalent aperture. As a result, a large number of multi-focus images are obtained and an all-in-focus image is demanded. Most multi-focus image fusion algorithms are not designed for large numbers of source images, and the traditional DWT-based fusion approach has serious problems when dealing with many multi-focus images, causing color distortion and ringing artifacts. To solve this problem, this paper proposes an efficient multi-focus image fusion method based on the stationary wavelet transform (SWT), which can handle a large quantity of multi-focus images with shallow depths of field. We compare the SWT-based approach with the DWT-based approach in various settings, and the results demonstrate that the proposed method performs much better both visually and quantitatively.
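The max-magnitude coefficient selection rule common to such wavelet fusion schemes can be sketched in one dimension with an undecimated Haar transform. This is a simplified stand-in for the paper's 2-D SWT; the transform, fusion rule, and signals are illustrative assumptions:

```python
import numpy as np

def swt_haar(x):
    """One level of an undecimated (stationary) Haar transform."""
    shifted = np.roll(x, -1)
    return (x + shifted) / 2.0, (x - shifted) / 2.0   # (approx, detail)

def iswt_haar(approx, detail):
    """Inverse of the one-level transform: approx + detail recovers x."""
    return approx + detail

def fuse(signals):
    """Average the approximation coefficients; at each position keep the
    detail coefficient with the largest magnitude across all sources."""
    coeffs = [swt_haar(s) for s in signals]
    approx = np.mean([a for a, _ in coeffs], axis=0)
    details = np.stack([d for _, d in coeffs])
    pick = np.argmax(np.abs(details), axis=0)
    detail = np.take_along_axis(details, pick[None, :], axis=0)[0]
    return iswt_haar(approx, detail)

sharp = np.array([0.0, 1.0, 4.0, 1.0, 0.0, 2.0])    # strong local detail
blurred = np.array([0.5, 1.5, 2.5, 1.5, 0.5, 1.5])  # smoothed version
fused = fuse([sharp, blurred])
```

Because the stationary transform is not downsampled, coefficient positions line up across sources, which is what makes this per-position selection (and the avoidance of DWT shift artifacts) straightforward.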
Metric Evaluation Pipeline for 3D Modeling of Urban Scenes
NASA Astrophysics Data System (ADS)
Bosch, M.; Leichtman, A.; Chilcott, D.; Goldberg, H.; Brown, M.
2017-05-01
Publicly available benchmark data and metric evaluation approaches have been instrumental in enabling research to advance state-of-the-art methods for remote sensing applications in urban 3D modeling. Most publicly available benchmark datasets have consisted of high-resolution airborne imagery and lidar suitable for 3D modeling on a relatively modest scale. To enable research in larger-scale 3D mapping, we have recently released a public benchmark dataset with multi-view commercial satellite imagery and metrics to compare 3D point clouds with lidar ground truth. We now define a more complete metric evaluation pipeline, developed as publicly available open source software, to assess semantically labeled 3D models of complex urban scenes derived from multi-view commercial satellite imagery. Evaluation metrics in our pipeline include horizontal and vertical accuracy and completeness, volumetric completeness and correctness, perceptual quality, and model simplicity. Sources of ground truth include airborne lidar and overhead imagery, and we demonstrate a semi-automated process for producing accurate ground truth shape files to characterize building footprints. We validate our current metric evaluation pipeline using 3D models produced using open source multi-view stereo methods. Data and software are made publicly available to enable further research and planned benchmarking activities.
Heideklang, René; Shokouhi, Parisa
2016-01-01
This article focuses on the fusion of flaw indications from multi-sensor nondestructive materials testing. Because each testing method makes use of a different physical principle, a multi-method approach has the potential of effectively differentiating actual defect indications from the many false alarms, thus enhancing detection reliability. In this study, we propose a new technique for aggregating scattered two- or three-dimensional sensory data. Using a density-based approach, the proposed method explicitly addresses localization uncertainties such as registration errors. This feature marks one of the major advantages of this approach over pixel-based image fusion techniques. We provide guidelines on how to set all the key parameters and demonstrate the technique's robustness. Finally, we apply our fusion approach to experimental data and demonstrate its capability to locate small defects by substantially reducing false alarms under conditions where no single-sensor method is adequate. PMID:26784200
Ren, Jingzheng; Liang, Hanwei; Dong, Liang; Sun, Lu; Gao, Zhiqiu
2016-08-15
Industrial symbiosis provides a novel and practical pathway to design for sustainability. A decision support tool for its verification is necessary for practitioners and policy makers, but to date quantitative research has been limited. The objective of this work is to present an innovative approach for supporting decision-making in design for sustainability through the implementation of industrial symbiosis in a chemical complex. By incorporating emergy theory, the model is formulated as a multi-objective approach that can optimize both the economic benefit and the sustainable performance of the integrated industrial system. A set of emergy-based evaluation indices is designed. A multi-objective particle swarm optimization algorithm is proposed to solve the model, and decision-makers are allowed to choose suitable solutions from the Pareto solutions. An illustrative case has been studied by the proposed method; a number of compromises between high profitability and high sustainability can be obtained for the decision-makers/stakeholders. Copyright © 2016 Elsevier B.V. All rights reserved.
Berger-González, Mónica; Stauffacher, Michael; Zinsstag, Jakob; Edwards, Peter; Krütli, Pius
2016-01-01
Transdisciplinarity (TD) is a participatory research approach in which actors from science and society work closely together. It offers means for promoting knowledge integration and finding solutions to complex societal problems, and can be applied within a multiplicity of epistemic systems. We conducted a TD process from 2011 to 2014 between indigenous Mayan medical specialists from Guatemala and Western biomedical physicians and scientists to study cancer. Given the immense cultural gap between the partners, it was necessary to develop new methods to overcome biases induced by ethnocentric behaviors and power differentials. This article describes this intercultural cooperation and presents a method of reciprocal reflexivity (Bidirectional Emic-Etic tool) developed to overcome them. As a result of application, researchers observed successful knowledge integration at the epistemic level, the social-organizational level, and the communicative level throughout the study. This approach may prove beneficial to others engaged in facilitating participatory health research in complex intercultural settings. © The Author(s) 2015.
Harper, S L; Edge, V L; Cunsolo Willox, A
2012-03-01
Global climate change and its impact on public health exemplify the challenge of managing complexity and uncertainty in health research. The Canadian North is currently experiencing dramatic shifts in climate, resulting in environmental changes which impact Inuit livelihoods, cultural practices, and health. For researchers investigating potential climate change impacts on Inuit health, it has become clear that comprehensive and meaningful research outcomes depend on taking a systemic and transdisciplinary approach that engages local citizens in project design, data collection, and analysis. While it is increasingly recognised that using approaches that embrace complexity is a necessity in public health, mobilizing such approaches from theory into practice can be challenging. In 2009, the Rigolet Inuit Community Government in Rigolet, Nunatsiavut, Canada partnered with a transdisciplinary team of researchers, health practitioners, and community storytelling facilitators to create the Changing Climate, Changing Health, Changing Stories project, aimed at developing a multi-media participatory, community-run methodological strategy to gather locally appropriate and meaningful data to explore climate-health relationships. The goal of this profile paper is to describe how an EcoHealth approach guided by principles of transdisciplinarity, community participation, and social equity was used to plan and implement this climate-health research project. An overview of the project, including project development, research methods, project outcomes to date, and challenges encountered, is presented. Though introduced in this one case study, the processes, methods, and lessons learned are broadly applicable to researchers and communities interested in implementing EcoHealth approaches in community-based research.
Multi-objective decision-making model based on CBM for an aircraft fleet
NASA Astrophysics Data System (ADS)
Luo, Bin; Lin, Lin
2018-04-01
Modern production management patterns, in which multiple units (e.g., a fleet of aircraft) are managed in a holistic manner, have brought new challenges for multi-unit maintenance decision making. To schedule a good maintenance plan, not only does the maintenance of each individual machine have to be considered, but the maintenance of the other individuals also has to be taken into account. Since most condition-based maintenance research for aircraft has focused solely on reducing maintenance cost or maximizing the availability of a single aircraft, and few studies have addressed both objectives of minimizing cost and maximizing fleet availability (the total number of available aircraft in the fleet), a multi-objective decision-making model based on condition-based maintenance that addresses both objectives is established. Furthermore, considering that decision makers may prefer the final optimal result in the form of discrete intervals instead of a set of points (non-dominated solutions) in real decision-making problems, a novel multi-objective optimization method based on support vector regression is proposed to solve the above multi-objective decision-making model. Finally, a case study of a fleet is conducted, with the results proving that the approach efficiently generates outcomes that meet schedule requirements.
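The non-dominated (Pareto) filtering that underlies such bi-objective trade-offs can be sketched as follows; the cost and availability figures are made-up illustrative data, and this naive filter stands in for the paper's support-vector-regression-based optimizer:

```python
def pareto_front(cost, availability):
    """Return indices of non-dominated maintenance plans:
    lower cost is better, higher fleet availability is better."""
    n = len(cost)
    front = []
    for i in range(n):
        dominated = any(
            cost[j] <= cost[i] and availability[j] >= availability[i]
            and (cost[j] < cost[i] or availability[j] > availability[i])
            for j in range(n)
        )
        if not dominated:
            front.append(i)
    return front

# Hypothetical plans: (maintenance cost, number of available aircraft)
cost = [10, 12, 8, 15, 9]
avail = [5, 7, 4, 7, 6]
front = pareto_front(cost, avail)
```

Each surviving index is a compromise the decision maker can pick from; presenting these as intervals rather than points is the preference the paper's method is built around.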
Jessica Palazzolo; Joshua Robinson; Phillip Ellis
2016-01-01
Ecosystem restoration design is a relatively new field of work that requires multi-disciplinary expertise in the natural sciences. Although the field is new, federal agencies and public institutions have spent several decades and millions of dollars researching the sciences and methods that underlie restoration activities. However, many restoration practitioners are...
Rosas, Scott R; Cope, Marie T; Villa, Christie; Motevalli, Mahnaz; Utech, Jill; Schouten, Jeffrey T
2014-04-01
Large-scale, multi-network clinical trials are seen as a means for efficient and effective utilization of resources with greater responsiveness to new discoveries. Formal structures instituted within the National Institutes of Health (NIH) HIV/AIDS Clinical Trials facilitate collaboration and coordination across networks and emphasize an integrated approach to HIV/AIDS vaccine, prevention and therapeutics clinical trials. This study examines the joint usage of clinical research sites as a means of gaining efficiency, extending capacity, and adding scientific value to the networks. A semi-structured questionnaire covering eight clinical management domains was administered to 74 (62% of sites) clinical site coordinators at single- and multi-network sites to identify challenges and efficiencies related to clinical trials management activities and coordination with multi-network units. Overall, respondents at multi-network sites did not report more challenges than those at single-network sites, but did report unique challenges to overcome in the areas of study prioritization, community engagement, staff education and training, and policies and procedures. The majority of multi-network sites reported that such affiliations do allow for the consolidation and cost-sharing of research functions. Suggestions for increasing the efficiency or performance of multi-network sites included streamlining standards and requirements, consolidating protocol activation methods, using a single cross-network coordinating centre, and creating common budget and payment mechanisms. The results of this assessment provide important information to consider in the design and management of multi-network configurations for the NIH HIV/AIDS Clinical Trials Networks, as well as for others contemplating and promoting the concept of multi-network settings. © 2013 John Wiley & Sons Ltd.
TRIAD: The Translational Research Informatics and Data Management Grid
Payne, P.; Ervin, D.; Dhaval, R.; Borlawsky, T.; Lai, A.
2011-01-01
Objective: Multi-disciplinary and multi-site biomedical research programs frequently require infrastructures capable of enabling the collection, management, analysis, and dissemination of heterogeneous, multi-dimensional, and distributed data and knowledge collections spanning organizational boundaries. We report on the design and initial deployment of an extensible biomedical informatics platform that is intended to address such requirements. Methods: A common approach to distributed data, information, and knowledge management needs in the healthcare and life science settings is the deployment and use of a service-oriented architecture (SOA). Such SOA technologies provide for strongly-typed, semantically annotated, and stateful data and analytical services that can be combined into data and knowledge integration and analysis "pipelines." Using this overall design pattern, we have implemented and evaluated an extensible SOA platform for clinical and translational science applications known as the Translational Research Informatics and Data-management grid (TRIAD). TRIAD is a derivative and extension of the caGrid middleware and has an emphasis on supporting agile "working interoperability" between data, information, and knowledge resources. Results: Based upon initial verification and validation studies conducted in the context of a collection of driving clinical and translational research problems, we have been able to demonstrate that TRIAD achieves agile "working interoperability" between distributed data and knowledge sources. Conclusion: Informed by our initial verification and validation studies, we believe TRIAD provides an example instance of a lightweight and readily adoptable approach to the use of SOA technologies in the clinical and translational research setting. Furthermore, our initial use cases illustrate the importance and efficacy of enabling "working interoperability" in heterogeneous biomedical environments. PMID:23616879
Gene prioritization and clustering by multi-view text mining
2010-01-01
Background: Text mining has become a useful tool for biologists trying to understand the genetics of diseases. In particular, it can help identify the most interesting candidate genes for a disease for further experimental analysis. Many text mining approaches have been introduced, but the effectiveness of disease-gene identification varies across text mining models. Thus, incorporating more text mining models may be beneficial for obtaining more refined and accurate knowledge. However, how to effectively combine these models remains a challenging question in machine learning. In particular, it is a non-trivial issue to guarantee that the integrated model performs better than the best individual model. Results: We present a multi-view approach to retrieve biomedical knowledge using different controlled vocabularies. These controlled vocabularies are selected on the basis of nine well-known bio-ontologies and are applied to index the vast amounts of gene-based free-text information available in the MEDLINE repository. The text mining result specified by a vocabulary is considered as a view, and the obtained multiple views are integrated by multi-source learning algorithms. We investigate the effect of integration in two fundamental computational disease-gene identification tasks: gene prioritization and gene clustering. The performance of the proposed approach is systematically evaluated and compared on real benchmark data sets. In both tasks, the multi-view approach demonstrates significantly better performance than the other methods compared. Conclusions: In practical research, the relevance of a specific vocabulary to the task is usually unknown. In such cases, multi-view text mining is a superior and promising strategy for text-based disease gene identification. PMID:20074336
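The idea of integrating per-vocabulary views into one gene ranking can be sketched with a simple Borda-style average of rank positions. The gene symbols and the averaging rule are illustrative assumptions; the paper itself uses more sophisticated multi-source learning algorithms:

```python
def aggregate_ranks(views):
    """Borda-style aggregation: average each gene's rank position across
    views (genes absent from a view get rank = len(view)); ties broken
    alphabetically. Lower average rank means higher priority."""
    genes = set().union(*(set(v) for v in views))
    avg = {
        g: sum(v.index(g) if g in v else len(v) for v in views) / len(views)
        for g in genes
    }
    return sorted(genes, key=lambda g: (avg[g], g))

# Hypothetical rankings produced by two vocabulary-specific views
views = [
    ["BRCA1", "TP53", "EGFR"],
    ["TP53", "BRCA1", "MYC"],
]
ranked = aggregate_ranks(views)
```

Even this naive integration shows the intended behaviour: genes ranked highly by several views rise above genes supported by only one, which is the premise behind combining controlled vocabularies.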
Multi-User Hardware Solutions to Combustion Science ISS Research
NASA Technical Reports Server (NTRS)
Otero, Angel M.
2001-01-01
In response to the budget environment, and to expand on the International Space Station (ISS) Fluids and Combustion Facility (FCF) Combustion Integrated Rack (CIR) common-hardware approach, the NASA Combustion Science Program shifted focus in 1999 from single-investigator, PI (Principal Investigator)-specific hardware to multi-user 'mini-facilities'. These mini-facilities take the CIR common-hardware philosophy to the next level. The approach that was developed re-arranged all the investigations in the program into sub-fields of research; common requirements within these sub-fields were then used to develop a common system, complemented by a few PI-specific components. The sub-fields of research selected were droplet combustion, solids and fire safety, and gaseous fuels. From these research areas, three mini-facilities have sprung: the Multi-user Droplet Combustion Apparatus (MDCA) for droplet research, the Flow Enclosure for Novel Investigations in Combustion of Solids (FEANICS) for solids and fire safety, and the Multi-user Gaseous Fuels Apparatus (MGFA) for gaseous fuels. These mini-facilities will develop common Chamber Insert Assemblies (CIA) and diagnostics for their respective investigators, complementing the capability provided by CIR. Presently there are four investigators for MDCA, six for FEANICS, and four for MGFA. The goal of these multi-user facilities is to drive the cost per PI down after the initial development investment is made. Each of these mini-facilities will become a fixture of future Combustion Science NASA Research Announcements (NRAs), enabling investigators to propose against an existing capability. Additionally, an investigation is given the opportunity to enhance the existing capability to bridge the gap between that capability and its specific science requirements.
This multi-user development approach will enable the Combustion Science Program to drive cost per investigation down while drastically reducing the time required to go from selection to space flight.
NASA Astrophysics Data System (ADS)
Peng, Juan-juan; Wang, Jian-qiang; Yang, Wu-E.
2017-01-01
In this paper, multi-criteria decision-making (MCDM) problems based on the qualitative flexible multiple criteria method (QUALIFLEX), in which the criteria values are expressed by multi-valued neutrosophic information, are investigated. First, multi-valued neutrosophic sets (MVNSs), which allow the truth-membership function, indeterminacy-membership function and falsity-membership function to have a set of crisp values between zero and one, are introduced. Then the likelihood of multi-valued neutrosophic number (MVNN) preference relations is defined and the corresponding properties are discussed. Finally, an extended QUALIFLEX approach based on likelihood is explored to solve MCDM problems in which the assessments of alternatives take the form of MVNNs; furthermore, an example is provided to illustrate the application of the proposed method, together with a comparative analysis.
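To make the MVNN idea concrete, the snippet below represents a multi-valued neutrosophic number as three sets of crisp values in [0, 1] and scores it with a simple averaging convention. This scoring rule is a generic illustration chosen for the sketch; it is not the likelihood-based comparison the paper defines:

```python
def mvnn_score(T, I, F):
    """Score a multi-valued neutrosophic number (T, I, F), where T, I, F
    are sets of crisp truth, indeterminacy and falsity values in [0, 1].
    Illustrative convention: reward truth, penalise indeterminacy and
    falsity (NOT the likelihood measure defined in the paper)."""
    avg = lambda s: sum(s) / len(s)
    return (avg(T) + (1 - avg(I)) + (1 - avg(F))) / 3.0

a = mvnn_score({0.6, 0.7}, {0.1}, {0.2})  # fairly good alternative
b = mvnn_score({0.3}, {0.4, 0.5}, {0.6})  # weaker alternative
```

Under this convention, alternative `a` outscores `b`, matching the intuition that high truth and low indeterminacy/falsity values indicate a preferable alternative.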
Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P
2014-09-01
Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
Zheng, Xiao-yong; Cao, Dong-sheng; Ye, Fa-qing; Xiang, Zheng
2014-01-01
Traditional Chinese medicine (TCM) has unique therapeutic effects for complex chronic diseases. However, owing to the lack of an effective systematic approach, research progress on its effective substances and pharmacological mechanisms of action has been very slow. In this paper, by incorporating network biology, bioinformatics and chemoinformatics methods, an integrated approach was proposed to systematically investigate and explain the pharmacological mechanisms of action and effective substances of TCM. This approach includes the following main steps: first, based on the known drug targets, network biology was used to screen out putative drug targets; second, the molecular docking method was used to calculate whether the molecules from TCM interact with drug targets related to chronic kidney disease (CKD); third, according to the results of molecular docking, a natural product-target network, a main component-target network and a compound-target network were constructed; finally, through analysis of network characteristics and literature mining, potential effective multi-components and their synergistic mechanisms were putatively identified. The Bu-shen-Huo-xue formula (BSHX), which is frequently used for treating CKD, was taken as a case study to demonstrate the reliability of the proposed approach. The results show that BSHX exerts its therapeutic effect through multi-channel network regulation, such as regulating the coagulation and fibrinolytic balance and the expression of inflammatory factors, and inhibiting abnormal ECM accumulation. Tanshinone IIA, rhein, curcumin, calycosin and quercetin may be potential effective ingredients of BSHX. This research shows that the integrated approach can be an effective means of discovering the active substances of TCM and revealing their pharmacological mechanisms. PMID:24598793
A Mixtures-of-Trees Framework for Multi-Label Classification
Hong, Charmgil; Batal, Iyad; Hauskrecht, Milos
2015-01-01
We propose a new probabilistic approach for multi-label classification that aims to represent the class posterior distribution P(Y|X). Our approach uses a mixture of tree-structured Bayesian networks, which leverages the computational advantages of conditional tree-structured models and the ability of mixtures to compensate for the restrictions that tree structures impose. We develop algorithms for learning the model from data and for performing multi-label predictions using the learned model. Experiments on multiple datasets demonstrate that our approach outperforms several state-of-the-art multi-label classification methods. PMID:25927011
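The tree-structured building block can be learned, for example, with the classic Chow-Liu procedure: weight each pair of labels by empirical mutual information and keep a maximum-weight spanning tree. The sketch below shows only that structure-learning step over binary labels on toy data; the paper's full model additionally conditions on the inputs X and mixes several such trees:

```python
import numpy as np

def mutual_information(y1, y2):
    # Empirical mutual information between two binary label columns.
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = np.mean((y1 == a) & (y2 == b))
            p_a, p_b = np.mean(y1 == a), np.mean(y2 == b)
            if p_ab > 0:
                mi += p_ab * np.log(p_ab / (p_a * p_b))
    return mi

def chow_liu_edges(Y):
    """Maximum-weight spanning tree over labels, weighted by pairwise
    mutual information (grown with Prim's algorithm)."""
    d = Y.shape[1]
    W = np.array([[mutual_information(Y[:, i], Y[:, j])
                   for j in range(d)] for i in range(d)])
    in_tree, edges = {0}, []
    while len(in_tree) < d:
        best = max(((i, j) for i in in_tree for j in range(d)
                    if j not in in_tree), key=lambda e: W[e])
        edges.append(best)
        in_tree.add(best[1])
    return edges

# Toy label matrix: labels 0 and 1 are perfectly correlated.
Y = np.array([[1, 1, 0], [1, 1, 1], [0, 0, 0], [0, 0, 1]])
edges = chow_liu_edges(Y)
```

Because labels 0 and 1 co-occur perfectly, the learned tree links them directly, which is exactly the dependency a per-label independent classifier would miss.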
Pereira, Suzanne; Névéol, Aurélie; Kerdelhué, Gaétan; Serrot, Elisabeth; Joubert, Michel; Darmoni, Stéfan J.
2008-01-01
Background: To assist with the development of a French online quality-controlled health gateway (CISMeF), an automatic indexing tool assigning MeSH descriptors to medical text in French was created. The French Multi-Terminology Indexer (F-MTI) relies on a multi-terminology approach involving four prominent medical terminologies and the mappings between them. Objective: In this paper, we compare lemmatization and stemming as methods to process French medical text for indexing. We also evaluate the multi-terminology approach implemented in F-MTI. Methods: The indexing strategies were assessed on a corpus of 18,814 resources indexed manually. Results: There is little difference in the indexing performance when lemmatization or stemming is used. However, the multi-terminology approach outperforms indexing relying on a single terminology in terms of recall. Conclusion: F-MTI will soon be used in the CISMeF production environment and in a Health MultiTerminology Server in French. PMID:18998933
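To make the lemmatization-versus-stemming distinction concrete, the toy sketch below contrasts a mechanical suffix-stripping stemmer with a dictionary-based lemma lookup. The suffix list and lemma dictionary are invented for illustration and have no connection to F-MTI's actual resources:

```python
# Toy illustration (hypothetical rules and dictionary, not F-MTI's):
# stemming strips suffixes mechanically, whereas lemmatization looks
# up a canonical dictionary form.
SUFFIXES = ("issements", "issement", "ations", "ation", "euses", "euse", "s")

def stem_fr(word):
    # Strip the longest matching suffix, keeping at least 3 characters.
    for suf in sorted(SUFFIXES, key=len, reverse=True):
        if word.endswith(suf) and len(word) > len(suf) + 2:
            return word[: -len(suf)]
    return word

LEMMAS = {"infections": "infection", "infectieuses": "infectieux"}

def lemmatize_fr(word):
    # Dictionary lookup; unknown words pass through unchanged.
    return LEMMAS.get(word, word)
```

Both routes map inflected forms onto a shared normal form before terminology matching, which is why the two preprocessing choices yield similar indexing performance.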
The role of economics in the QUERI program: QUERI Series.
Smith, Mark W; Barnett, Paul G
2008-04-22
The United States (U.S.) Department of Veterans Affairs (VA) Quality Enhancement Research Initiative (QUERI) has implemented economic analyses in single-site and multi-site clinical trials. To date, no one has reviewed whether the QUERI Centers are taking an optimal approach to doing so. Consistent with the continuous learning culture of the QUERI Program, this paper provides such a reflection. We present a case study of QUERI as an example of how economic considerations can and should be integrated into implementation research within both single-site and multi-site studies. We review theoretical and applied cost research in implementation studies outside and within VA. We also present a critique of the use of economic research within the QUERI program. Economic evaluation is a key element of implementation research. QUERI has contributed many developments in the field of implementation but has only recently begun multi-site implementation trials across multiple regions within the national VA healthcare system. These trials are unusual in their emphasis on developing detailed costs of implementation, as well as in the use of business case analyses (budget impact analyses). Economics appears to play an important role in QUERI implementation studies, but only after implementation has reached the stage of multi-site trials. Economic analysis could better inform the choice of which clinical best practices to implement and the choice of implementation interventions to employ. QUERI economics would also benefit from research on costing methods and the development of widely accepted international standards for implementation economics.
Integrated Research on Disaster Risk - A Review
NASA Astrophysics Data System (ADS)
Beer, T.
2016-12-01
Integrated Research on Disaster Risk, generally known as IRDR, is a decade-long research programme co-sponsored by the International Council for Science (ICSU), the International Social Science Council (ISSC), and the United Nations International Strategy for Disaster Reduction (UNISDR). It is a global, multi-disciplinary approach to dealing with the challenges brought by natural disasters, mitigating their impacts, and improving related policy-making mechanisms. The home page is at: http://www.irdrinternational.org/ The research programme was named Integrated Research on Disaster Risk to indicate that it addresses the challenge of natural and human-induced environmental hazards. In November 2008 and May 2009 respectively, the ISSC and the UNISDR agreed to join the ICSU in co-sponsoring the IRDR programme. Although approaches in the sciences vary, the IRDR programme approaches the issues of natural and human-induced hazards and disasters from several perspectives: from the hazards to the disasters, and from the human exposures and vulnerabilities back to the hazards. This coordinated and multi-dimensional approach takes the IRDR programme beyond approaches that have traditionally been undertaken. To meet its research objectives, the IRDR established four core projects, comprising working groups of experts from diverse disciplines, to formulate new methods for addressing the shortcomings of current disaster risk research: Assessment of Integrated Research on Disaster Risk (AIRDR); Disaster Loss Data (DATA); Forensic Investigations of Disasters (FORIN); and Risk Interpretation and Action (RIA). Dr Tom Beer was a member of both the scoping and planning groups, and of the committee undertaking a mid-term review of IRDR, with terms of reference to examine and report by November 2016 on: 1. Strategic planning and implementation; 2. Governance; 3. Secretariat, funding and operations; 4. Stakeholders and partnerships; 5. Communication, visibility and influence; and 6. Future development. His talk will give an overview of the history and science of IRDR and some of the outcomes of the mid-term review.
Preface paper to the Semi-Arid Land-Surface-Atmosphere (SALSA) Program special issue
Goodrich, D.C.; Chehbouni, A.; Goff, B.; MacNish, B.; Maddock, T.; Moran, S.; Shuttleworth, W.J.; Williams, D.G.; Watts, C.; Hipps, L.H.; Cooper, D.I.; Schieldge, J.; Kerr, Y.H.; Arias, H.; Kirkland, M.; Carlos, R.; Cayrol, P.; Kepner, W.; Jones, B.; Avissar, R.; Begue, A.; Bonnefond, J.-M.; Boulet, G.; Branan, B.; Brunel, J.P.; Chen, L.C.; Clarke, T.; Davis, M.R.; DeBruin, H.; Dedieu, G.; Elguero, E.; Eichinger, W.E.; Everitt, J.; Garatuza-Payan, J.; Gempko, V.L.; Gupta, H.; Harlow, C.; Hartogensis, O.; Helfert, M.; Holifield, C.; Hymer, D.; Kahle, A.; Keefer, T.; Krishnamoorthy, S.; Lhomme, J.-P.; Lagouarde, J.-P.; Lo, Seen D.; Luquet, D.; Marsett, R.; Monteny, B.; Ni, W.; Nouvellon, Y.; Pinker, R.; Peters, C.; Pool, D.; Qi, J.; Rambal, S.; Rodriguez, J.; Santiago, F.; Sano, E.; Schaeffer, S.M.; Schulte, M.; Scott, R.; Shao, X.; Snyder, K.A.; Sorooshian, S.; Unkrich, C.L.; Whitaker, M.; Yucel, I.
2000-01-01
The Semi-Arid Land-Surface-Atmosphere Program (SALSA) is a multi-agency, multi-national research effort that seeks to evaluate the consequences of natural and human-induced environmental change in semi-arid regions. The ultimate goal of SALSA is to advance scientific understanding of the semi-arid portion of the hydrosphere-biosphere interface in order to provide reliable information for environmental decision making. SALSA approaches this goal through a program of long-term, integrated observations, process research, modeling, assessment, and information management that is sustained by cooperation among scientists and information users. In this preface to the SALSA special issue, general program background information and the critical nature of semi-arid regions are presented. A brief description of the Upper San Pedro River Basin, the initial location for focused SALSA research, follows. Several overarching research objectives under which much of the interdisciplinary research contained in the special issue was undertaken are discussed. Principal methods, primary research sites and data collection used by numerous investigators during 1997-1999 are then presented. Scientists from about 20 US, five European (four French and one Dutch), and three Mexican agencies and institutions have collaborated closely to make the research leading to this special issue a reality. The SALSA Program has served as a model of interagency cooperation by breaking new ground in the approach to large-scale interdisciplinary science with relatively limited resources.
ERIC Educational Resources Information Center
Hallberg, Kelly; Cook, Thomas D.; Figlio, David
2013-01-01
The goal of this paper is to provide guidance for applied education researchers in using multi-level data to study the effects of interventions implemented at the school level. Two primary approaches are currently employed in observational studies of the effect of school-level interventions. One approach employs intact school matching: matching…
Multi-objective decision-making under uncertainty: Fuzzy logic methods
NASA Technical Reports Server (NTRS)
Hardy, Terry L.
1995-01-01
Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process. In addition, this paper describes software developed at NASA Lewis Research Center for assisting in the decision-making process. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other example is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.
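A standard way to operationalise fuzzy multi-objective selection is Bellman-Zadeh max-min aggregation: convert each objective's raw value into a membership degree, then judge each alternative by its worst-satisfied objective. The numbers below are invented stand-ins loosely echoing the lunar-lander example; they are not from the paper or the NASA software:

```python
def linear_membership(x, worst, best):
    """Degree to which value x satisfies an objective, rising linearly
    from 0 at 'worst' to 1 at 'best' (handles either direction)."""
    if best > worst:
        t = (x - worst) / (best - worst)
    else:  # smaller is better
        t = (worst - x) / (worst - best)
    return max(0.0, min(1.0, t))

def rank_alternatives(alternatives, objectives):
    """Bellman-Zadeh max-min aggregation: an alternative is only as
    good as its worst-satisfied objective."""
    scores = {}
    for name, values in alternatives.items():
        memberships = [linear_membership(v, *obj)
                       for v, obj in zip(values, objectives)]
        scores[name] = min(memberships)
    return max(scores, key=scores.get), scores

# Hypothetical propulsion trade-off: (thrust in kN, dry mass in kg).
objectives = [(10.0, 30.0),    # thrust: 10 worst, 30 best
              (500.0, 200.0)]  # mass: 500 worst, 200 best
alts = {"A": (25.0, 400.0), "B": (18.0, 250.0)}
best, scores = rank_alternatives(alts, objectives)
```

Alternative A has the higher thrust, but its heavy mass drags its worst membership down, so the balanced option B wins under max-min aggregation.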
The Integrated Multi-Level Bilingual Teaching of "Social Research Methods"
ERIC Educational Resources Information Center
Zhu, Yanhan; Ye, Jian
2012-01-01
"Social Research Methods," as a methodology course, combines theories and practices closely. Based on the synergy theory, this paper tries to establish an integrated multi-level bilingual teaching mode. Starting from the transformation of teaching concepts, we should integrate interactions, experiences, and researches together and focus…
Rowland, Caroline F; Monaghan, Padraic
2017-01-01
In developmental psycholinguistics, we have, for many years, been generating and testing theories that propose both descriptions of adult representations and explanations of how those representations develop. We have learnt that restricting ourselves to any one methodology yields only incomplete data about the nature of linguistic representations. We argue that we need a multi-method approach to the study of representation.
Cuevas, Soledad
Agriculture is a major contributor to greenhouse gas emissions, an important part of which is associated with deforestation and indirect land-use change. Appropriate and coherent food policies can play an important role in aligning health, economic and environmental goals. From the point of view of policy analysis, however, this requires multi-sectoral, interdisciplinary approaches, which can be highly complex. Important methodological advances in the area are not exempt from limitations and criticism. We argue that there is scope for further developments in integrated quantitative and qualitative policy analysis combining existing methods, including mathematical modelling and stakeholder analysis. We outline methodological trends in the field, briefly characterise integrated mixed-methods policy analysis, and identify contributions, challenges and opportunities for future research. In particular, this type of approach can help address issues of uncertainty and context-specific validity, incorporate multiple perspectives, and advance meaningful interdisciplinary collaboration in the field. Substantial challenges remain, however, such as the integration of key issues related to non-communicable disease, or the incorporation of a broader range of qualitative approaches that can address important cultural and ethical dimensions of food.
Wang, Yadong; Li, Xiangrui; Yuan, Yiwen; Patel, Mahomed S
2014-01-01
To describe an innovative approach, generalisable to other settings, for developing and implementing an in-service curriculum in China for staff of the newly established health emergency response offices (HEROs). The multi-method training needs assessment included reviews of the competency domains needed to implement the International Health Regulations (2005) as well as China's policies and emergency regulations. The findings from this review, together with iterative interviews and workshops involving experts from government, academia and the military, as well as HERO staff, were critically reviewed by an expert technical advisory panel. Over 1600 participants contributed to curriculum development. Of the 18 competency domains identified as essential for HERO staff, nine were developed into priority in-service training modules to be conducted over 2.5 weeks. Experts from academia and experienced practitioners prepared and delivered each module through lectures followed by interactive problem-solving exercises and desktop simulations to help trainees apply, experiment with, and consolidate newly acquired knowledge and skills. This study adds to the emerging literature on China's enduring efforts to strengthen its emergency response capabilities since the outbreak of SARS in 2003. The multi-method approach to curriculum development in partnership with senior policy-makers, researchers, and experienced practitioners can be applied in other settings to ensure training is responsive and customized to local needs, resources and priorities. Ongoing curriculum development should reflect international standards and be coupled with the development of appropriate performance support systems at the workplace to motivate staff to apply their newly acquired knowledge and skills effectively and creatively.
Lessons learned from recruiting nursing homes to a quantitative cross-sectional pilot study.
Tzouvara, Vasiliki; Papadopoulos, Chris; Randhawa, Gurch
2016-03-01
A growing older adult population is leading to increased admission rates to long-term care facilities such as nursing homes and residential care homes. Assisted healthcare services should be flexible, integrated, and responsive to older adults' needs. However, the body of empirical evidence is limited because of the recruitment challenges in these settings. To describe the barriers and challenges faced in recruiting to a recent pilot study, consider previously implemented and proposed recruitment strategies, and propose a new multi-method approach to maximising the recruitment of care homes. The proposed multi-method approach harnesses key recruitment strategies previously highlighted as effective in navigating the many challenges and barriers that are likely to be encountered, such as mistrust, scepticism and concerns about disruption to routines. This includes making strategic use of existing personal and professional connections within the research team, engaging with care homes that have previously engaged with the research process, forming relationships of trust, and employing a range of incentives. Implementing carefully planned recruitment strategies is likely to improve relationships between nursing homes and researchers. As a consequence, recruitment can be augmented, enabling the production of the rigorous evidence required for effective nursing practice and patient wellbeing. Boosting recruitment rates is crucial in helping to build new and less biased research evidence and for informing and underpinning all forms of evidence-based practice. The lessons learned from our pilot and the review of the literature highlight these issues and better enable investigators to access research settings that commonly possess many complex recruitment barriers and challenges.
Partnering with Youth to Map Their Neighborhood Environments: A Multi-Layered GIS Approach
Topmiller, Michael; Jacquez, Farrah; Vissman, Aaron T.; Raleigh, Kevin; Miller-Francis, Jenni
2014-01-01
Mapping approaches offer great potential for community-based participatory researchers interested in displaying youth perceptions and advocating for change. We describe a multi-layered approach for gaining local knowledge of neighborhood environments that engages youth as co-researchers and active knowledge producers. By integrating geographic information systems (GIS) with environmental audits, an interactive focus group, and sketch mapping, the approach provides a place-based understanding of physical activity resources from the situated experience of youth. Youth report safety and a lack of recreational resources as inhibiting physical activity. Maps reflecting youth perceptions aid policy-makers in making place-based improvements for youth neighborhood environments. PMID:25423245
Judicialization 2.0: Understanding right-to-health litigation in real time.
Biehl, João; Socal, Mariana P; Gauri, Varun; Diniz, Debora; Medeiros, Marcelo; Rondon, Gabriela; Amon, Joseph J
2018-05-21
Over the past two decades, debate over the whys, the hows, and the effects of the ever-expanding phenomenon of right-to-health litigation ('judicialization') throughout Latin America has been marked by polarised arguments and limited information. In contrast to claims of judicialization as a positive or negative trend, less attention has been paid to ways to better understand the phenomenon in real time. In this article, we propose a new approach-Judicialization 2.0-that recognises judicialization as an integral part of democratic life. This approach seeks to expand access to information about litigation on access to medicines (and health care generally) in order to better characterise the complexity of the phenomenon and thus inform new research and more robust public discussions. Drawing from our multi-disciplinary perspectives and field experiences in highly judicialized contexts, we thus describe a new multi-source, multi-stakeholder mixed-method approach designed to capture the patterns and heterogeneity of judicialization and understand its medical and socio-political impact in real time, along with its counterfactuals. By facilitating greater data availability and open access, we can drive advancements towards transparent and participatory priority setting, as well as accountability mechanisms that promote quality universal health coverage.
Deng, Xinyang; Jiang, Wen
2017-09-12
Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from a perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and other existing method to show the effectiveness of the proposed model. PMID:28895905
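For context, the traditional RPN that the fuzzy approach improves upon is simply the product of severity, occurrence, and detection ratings; one well-known shortcoming is that quite different failure modes can yield identical RPNs. A minimal illustration (ratings invented for the example):

```python
def rpn(severity, occurrence, detection):
    """Traditional risk priority number: S x O x D,
    each typically rated on a 1-10 scale."""
    return severity * occurrence * detection

# Two hypothetical failure modes with very different risk profiles.
modes = {"seal leak": (9, 2, 5), "sensor drift": (3, 6, 5)}
ranking = sorted(modes, key=lambda m: rpn(*modes[m]), reverse=True)
# Both modes score 9*2*5 = 3*6*5 = 90: the classic RPN tie that
# fuzzy evaluation schemes are designed to disambiguate.
```

A severe-but-rare leak and a mild-but-frequent drift tie at RPN 90 despite calling for very different responses, which is precisely the kind of ambiguity that motivates fuzzy risk evaluation.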
NASA Astrophysics Data System (ADS)
Shafii, M.; Tolson, B.; Matott, L. S.
2012-04-01
Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in building of higher levels of complexity into hydrologic models, which eventually makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, the Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden, and predictive capacity, which are evaluated based on multiple comparative measures. 
The comparative measures are calculated for both the calibration and evaluation periods. The uncertainty analysis methodologies are applied to HYMOD, a simple five-parameter rainfall-runoff model.
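Of the informal techniques listed, GLUE is the most easily sketched: sample parameter sets Monte-Carlo style, score each with an informal likelihood such as the Nash-Sutcliffe efficiency, keep the "behavioural" sets above a threshold, and derive prediction limits from their simulations. The sketch below substitutes a one-parameter linear-reservoir toy model for HYMOD, uses synthetic data, and takes plain percentiles of the behavioural runs rather than likelihood-weighted quantiles:

```python
import numpy as np

def nse(sim, obs):
    # Nash-Sutcliffe efficiency, used here as an informal likelihood.
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def toy_model(rain, k):
    # Single linear reservoir (a stand-in for HYMOD): outflow = k * storage.
    s, q = 0.0, []
    for r in rain:
        s += r
        out = k * s
        s -= out
        q.append(out)
    return np.array(q)

rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 2.0, size=100)
obs = toy_model(rain, 0.3) + rng.normal(0, 0.1, size=100)  # "observations"

# GLUE: Monte Carlo sampling, behavioural threshold, prediction limits.
ks = rng.uniform(0.05, 0.95, size=2000)
sims = np.array([toy_model(rain, k) for k in ks])
likes = np.array([nse(s, obs) for s in sims])
behavioural = likes > 0.7
# 90% prediction limits at each time step from the behavioural runs
# (full GLUE would weight the quantiles by likelihood).
lower = np.percentile(sims[behavioural], 5, axis=0)
upper = np.percentile(sims[behavioural], 95, axis=0)
```

Because GLUE requires only forward model runs and a user-chosen threshold, it avoids both the formal error assumptions and the long MCMC chains of the Bayesian route, at the cost of subjectivity in the likelihood and threshold choices.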
USDA-ARS?s Scientific Manuscript database
The important questions about agriculture, climate, and sustainability have become increasingly complex and require a coordinated, multi-faceted approach for developing new knowledge and understanding. A multi-state, transdisciplinary project was begun in 2011 to study the potential for both mitigat...
Kaufman, Michelle R; Cornish, Flora; Zimmerman, Rick S; Johnson, Blair T
2014-08-15
Despite increasing recent emphasis on the social and structural determinants of HIV-related behavior, empirical research and interventions lag behind, partly because of the complexity of social-structural approaches. This article provides a comprehensive and practical review of the diverse literature on multi-level approaches to HIV-related behavior change in the interest of contributing to the ongoing shift to more holistic theory, research, and practice. It has the following specific aims: (1) to provide a comprehensive list of relevant variables/factors related to behavior change at all points on the individual-structural spectrum, (2) to map out and compare the characteristics of important recent multi-level models, (3) to reflect on the challenges of operating with such complex theoretical tools, and (4) to identify next steps and make actionable recommendations. Using a multi-level approach implies incorporating increasing numbers of variables and increasingly context-specific mechanisms, overall producing greater intricacies. We conclude with recommendations on how best to respond to this complexity, which include: using formative research and interdisciplinary collaboration to select the most appropriate levels and variables in a given context; measuring social and institutional variables at the appropriate level to ensure meaningful assessments of multiple levels are made; and conceptualizing intervention and research with reference to theoretical models and mechanisms to facilitate transferability, sustainability, and scalability.
Distributing Planning and Control for Teams of Cooperating Mobile Robots
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parker, L.E.
2004-07-19
This CRADA project involved the cooperative research of investigators in ORNL's Center for Engineering Science Advanced Research (CESAR) with researchers at Caterpillar, Inc. The subject of the research was the development of cooperative control strategies for autonomous vehicles performing applications of interest to Caterpillar customers. The project involved three Phases of research, conducted over the time period of November 1998 through December 2001. This project led to the successful development of several technologies and demonstrations in realistic simulation that illustrated the effectiveness of our control approaches for distributed planning and cooperation in multi-robot teams. The primary objectives of this research project were to: (1) Develop autonomous control technologies to enable multiple vehicles to work together cooperatively, (2) Provide the foundational capabilities for a human operator to exercise oversight and guidance during the multi-vehicle task execution, and (3) Integrate these capabilities into the ALLIANCE-based autonomous control approach for multi-robot teams. These objectives have been successfully met with the results implemented and demonstrated in a near real-time multi-vehicle simulation of up to four vehicles performing mission-relevant tasks.
Multi-site precipitation downscaling using a stochastic weather generator
NASA Astrophysics Data System (ADS)
Chen, Jie; Chen, Hua; Guo, Shenglian
2018-03-01
Statistical downscaling is an efficient way to solve the spatiotemporal mismatch between climate model outputs and the data requirements of hydrological models. However, the most commonly-used downscaling method only produces climate change scenarios for a specific site or watershed average, which is unable to drive distributed hydrological models to study the spatial variability of climate change impacts. By coupling a single-site downscaling method and a multi-site weather generator, this study proposes a multi-site downscaling approach for hydrological climate change impact studies. Multi-site downscaling is done in two stages. The first stage involves spatially downscaling climate model-simulated monthly precipitation from grid scale to a specific site using a quantile mapping method, and the second stage involves the temporal disaggregation of monthly precipitation to daily values by adjusting the parameters of a multi-site weather generator. The inter-station correlation is specifically considered using a distribution-free approach along with an iterative algorithm. The performance of the downscaling approach is illustrated using a 10-station watershed as an example. The precipitation time series derived from the National Centers for Environmental Prediction (NCEP) reanalysis dataset is used as the climate model simulation. The precipitation time series of each station is divided into the 30 odd-numbered years for calibration and the 29 even-numbered years for validation. Several metrics, including the frequencies of wet and dry spells and statistics of the daily, monthly and annual precipitation, are used as criteria to evaluate the multi-site downscaling approach. The results show that the frequencies of wet and dry spells are well reproduced for all stations. In addition, the multi-site downscaling approach performs well with respect to reproducing precipitation statistics, especially at monthly and annual timescales.
The remaining biases mainly result from the non-stationarity of NCEP precipitation. Overall, the proposed approach is efficient for generating multi-site climate change scenarios that can be used to investigate the spatial variability of climate change impacts on hydrology.
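The first-stage quantile mapping described above can be illustrated with a minimal, self-contained sketch (toy precipitation values and a simple empirical-CDF construction, not the authors' implementation):

```python
import bisect

# Hedged sketch of empirical quantile mapping: map a model-simulated value
# onto the observed distribution by matching empirical quantiles computed
# over the calibration period.
def quantile_map(model_calib, obs_calib, value):
    sm, so = sorted(model_calib), sorted(obs_calib)
    p = bisect.bisect_right(sm, value) / len(sm)  # model-side empirical quantile
    idx = max(int(p * len(so)) - 1, 0)            # observed value at that quantile
    return so[idx]

model = [1.0, 2.0, 3.0, 4.0, 5.0]   # model-simulated monthly precipitation (toy)
obs = [2.0, 4.0, 6.0, 8.0, 10.0]    # station observations (calibration years, toy)
corrected = [quantile_map(model, obs, v) for v in model]
```

After correction, the model series inherits the observed distribution, which is the essence of the bias-correction step before temporal disaggregation.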
NASA Astrophysics Data System (ADS)
Miner, Nadine Elizabeth
1998-09-01
This dissertation presents a new wavelet-based method for synthesizing perceptually convincing, dynamic sounds using parameterized sound models. The sound synthesis method is applicable to a variety of applications including Virtual Reality (VR), multi-media, entertainment, and the World Wide Web (WWW). A unique contribution of this research is the modeling of the stochastic, or non-pitched, sound components. This stochastic-based modeling approach leads to perceptually compelling sound synthesis. Two preliminary studies were conducted to provide data on multi-sensory interaction and audio-visual synchronization timing. These results contributed to the design of the new sound synthesis method. The method uses a four-phase development process, including analysis, parameterization, synthesis and validation, to create the wavelet-based sound models. A patent is pending for this dynamic sound synthesis method, which provides perceptually-realistic, real-time sound generation. This dissertation also presents a battery of perceptual experiments developed to verify the sound synthesis results. These experiments are applicable for validation of any sound synthesis technique.
Fazlollahtabar, Hamed
2010-12-01
Consumer expectations for automobile seat comfort continue to rise. It is therefore evident that the current automobile seat comfort development process, which is only sporadically successful, needs to change. In this context, there has been growing recognition of the need to establish a theoretical and methodological basis for automobile seat comfort. Seat producers, in turn, need to know the comfort customers require in order to produce seats matched to their interests. Current research methodologies apply qualitative approaches because of the anthropometric specifications involved. The most significant weakness of these approaches is the inexactness of the extracted inferences. Despite the qualitative nature of consumers' preferences, there are methods that transform qualitative parameters into numerical values, which can help seat producers improve or enhance their products. Such an approach can also help automobile manufacturers source their seats from the best producer according to consumers' preferences. In this paper, a heuristic multi-criteria decision-making technique is applied to express consumer preferences as numerical values. This technique is a combination of the Analytic Hierarchy Process (AHP), the Entropy method, and the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). A case study is conducted to illustrate the applicability and effectiveness of the proposed heuristic approach. Copyright © 2010 Elsevier Ltd. All rights reserved.
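As an illustration of the TOPSIS step of such a hybrid technique, a minimal sketch follows (the seat alternatives, scores, and weights are toy assumptions; in the paper the weights would come from AHP and the Entropy method):

```python
import math

# Hedged TOPSIS sketch: rank alternatives by relative closeness to the
# ideal solution, after vector-normalizing and weighting each criterion.
def topsis(matrix, weights, benefit):
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix))
             for j in range(len(weights))]
    v = [[w * row[j] / norms[j] for j, w in enumerate(weights)]
         for row in matrix]
    # Ideal best/worst per criterion; direction depends on benefit vs cost.
    best = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_best, d_worst = math.dist(row, best), math.dist(row, worst)
        scores.append(d_worst / (d_best + d_worst))  # closeness coefficient
    return scores

# Three hypothetical seat designs scored on comfort (benefit) and cost (cost).
scores = topsis([[7, 300], [9, 450], [5, 200]],
                weights=[0.6, 0.4], benefit=[True, False])
ranking = sorted(range(3), key=lambda i: -scores[i])
```

The alternative with the highest closeness coefficient is preferred; here the comfort weight dominates, so the mid-priced comfortable seat ranks first.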
USDA-ARS's Scientific Manuscript database
The mission of the Sugarcane Research Unit (SRU) is to provide research-based solutions that enhance the viability of domestic sugarcane industry. To accomplish this mission, SRU uses a multidisciplinary approach to develop improved varieties and environmentally friendly production strategies. Cons...
A multi-domain spectral method for time-fractional differential equations
NASA Astrophysics Data System (ADS)
Chen, Feng; Xu, Qinwu; Hesthaven, Jan S.
2015-07-01
This paper proposes an approach for high-order time integration within a multi-domain setting for time-fractional differential equations. Since the kernel is singular or nearly singular, two main difficulties arise after the domain decomposition: how to properly account for the history/memory part and how to perform the integration accurately. To address these issues, we propose a novel hybrid approach for the numerical integration based on the combination of three-term-recurrence relations of Jacobi polynomials and high-order Gauss quadrature. The different approximations used in the hybrid approach are justified theoretically and through numerical examples. Based on this, we propose a new multi-domain spectral method for high-order accurate time integrations and study its stability properties by identifying the method as a generalized linear method. Numerical experiments confirm hp-convergence for both time-fractional differential equations and time-fractional partial differential equations.
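The three-term-recurrence evaluation that underpins such spectral methods can be sketched for the simplest Jacobi case, the Legendre polynomials (alpha = beta = 0); this is an illustration, not the authors' hybrid quadrature scheme:

```python
# Stable evaluation of P_n(x) via the three-term recurrence
#   (k + 1) P_{k+1}(x) = (2k + 1) x P_k(x) - k P_{k-1}(x),
# with P_0(x) = 1 and P_1(x) = x.
def legendre(n, x):
    if n == 0:
        return 1.0
    p_prev, p = 1.0, x
    for k in range(1, n):
        p_prev, p = p, ((2 * k + 1) * x * p - k * p_prev) / (k + 1)
    return p
```

For example, P_2(x) = (3x² − 1)/2, so legendre(2, 0.5) evaluates to −0.125.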
Combining multiple decisions: applications to bioinformatics
NASA Astrophysics Data System (ADS)
Yukinawa, N.; Takenouchi, T.; Oba, S.; Ishii, S.
2008-01-01
Multi-class classification is one of the fundamental tasks in bioinformatics and typically arises in cancer diagnosis studies by gene expression profiling. This article reviews two recent approaches to multi-class classification by combining multiple binary classifiers, which are formulated based on a unified framework of error-correcting output coding (ECOC). The first approach is to construct a multi-class classifier in which each binary classifier to be aggregated has a weight value to be optimally tuned based on the observed data. In the second approach, misclassification of each binary classifier is formulated as a bit inversion error with a probabilistic model by making an analogy to the context of information transmission theory. Experimental studies using various real-world datasets including cancer classification problems reveal that both of the new methods are superior or comparable to other multi-class classification methods.
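The ECOC framework referred to above can be sketched in a few lines (the code matrix and class names are toy assumptions, not the authors' design): each class is assigned a binary codeword, and the predicted class is the one whose codeword is closest in Hamming distance to the concatenated binary decisions.

```python
# Hedged ECOC decoding sketch: 3 classes x 4 binary classifiers (toy matrix).
CODE = {
    "classA": [1, 1, 0, 0],
    "classB": [0, 1, 1, 0],
    "classC": [0, 0, 1, 1],
}

def ecoc_decode(bits, code=CODE):
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))
    return min(code, key=lambda cls: hamming(bits, code[cls]))

# One binary classifier misfires (a bit-inversion error), yet decoding
# still recovers the intended class because of the code's redundancy.
predicted = ecoc_decode([1, 1, 0, 1])
```

This error-correcting property is exactly what the bit-inversion probabilistic model in the second reviewed approach formalizes.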
Learn by Doing - Phase I of the ToxCast Research Program
In 2007, the USEPA embarked on a multi-year, multi-million dollar research program to develop and evaluate a new approach to prioritizing the toxicity testing of environmental chemicals. ToxCast was divided into three main phases of effort – a proof of concept, an expansion and ...
On iterative processes in the Krylov-Sonneveld subspaces
NASA Astrophysics Data System (ADS)
Ilin, Valery P.
2016-10-01
The iterative Induced Dimension Reduction (IDR) methods are considered for solving large systems of linear algebraic equations (SLAEs) with nonsingular nonsymmetric matrices. These approaches have been investigated by many authors and are sometimes characterized as an alternative to the classical Krylov-type processes. The key elements of the IDR algorithms are the construction of the nested Sonneveld subspaces, which have decreasing dimensions and employ orthogonalization with respect to some fixed subspace. Other independent approaches to the analysis and optimization of the iterations are based on augmented and modified Krylov subspaces, using aggregation and deflation procedures that involve various low-rank approximations of the original matrices. The goal of this paper is to show that the IDR methods in Sonneveld subspaces admit an original interpretation as modified algorithms in Krylov subspaces. In particular, such a description is given for the multi-preconditioned semi-conjugate direction methods, which are relevant for parallel algebraic domain decomposition approaches.
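For context, a classical Krylov-type process of the kind the IDR family is contrasted with can be sketched as plain conjugate gradients (shown for a small symmetric positive-definite system in pure Python; IDR itself targets the nonsymmetric case, and the matrix here is a toy assumption):

```python
# Illustrative conjugate-gradient solver: each iteration expands the Krylov
# subspace span{r0, A r0, A^2 r0, ...} and minimizes the A-norm error in it.
def conjugate_gradient(A, b, max_iter=50, tol=1e-12):
    n = len(b)
    x = [0.0] * n
    r = list(b)              # residual r = b - A x, with x starting at zero
    p = list(r)              # initial search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [r[i] + (rs_new / rs) * p[i] for i in range(n)]
        rs = rs_new
    return x

x = conjugate_gradient([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
```

For an n-by-n SPD system, exact arithmetic convergence occurs within n iterations; the IDR idea instead shrinks a sequence of Sonneveld subspaces rather than growing a single Krylov subspace.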
ERIC Educational Resources Information Center
De Laat, Maarten; Lally, Vic; Lipponen, Lasse; Simons, Robert-Jan
2007-01-01
The aim of this paper is to study the online teaching styles of two teachers who each tutor a networked learning community (NLC), within the same workshop. The study is undertaking empirical work using a multi-method approach in order to triangulate and contextualise our findings and enrich our understanding of the teacher participation in these…
NASA Astrophysics Data System (ADS)
Ajadi, O. A.; Meyer, F. J.
2014-12-01
Automatic oil spill detection and tracking from Synthetic Aperture Radar (SAR) images is a difficult task, due in large part to the inhomogeneous properties of the sea surface, the high level of speckle inherent in SAR data, the complexity and the highly non-Gaussian nature of amplitude information, and the low temporal sampling that is often achieved with SAR systems. This research presents a promising new oil spill detection and tracking method that is based on time series of SAR images. Through the combination of a number of advanced image processing techniques, the developed approach is able to mitigate some of these previously mentioned limitations of SAR-based oil-spill detection and enables fully automatic spill detection and tracking across a wide range of spatial scales. The method combines an initial automatic texture analysis with a consecutive change detection approach based on multi-scale image decomposition. The first step of the approach, a texture transformation of the original SAR images, is performed in order to normalize the ocean background and enhance the contrast between oil-covered and oil-free ocean surfaces. The Lipschitz regularity (LR), a local texture parameter, is used here due to its proven ability to normalize the reflectivity properties of ocean water and maximize the visibility of oil in water. To calculate LR, the images are decomposed using a two-dimensional continuous wavelet transform (2D-CWT) and transformed into Hölder space to measure LR. After texture transformation, the now normalized images are inserted into our multi-temporal change detection algorithm. The multi-temporal change detection approach is a two-step procedure including (1) data enhancement and filtering and (2) multi-scale automatic change detection. The performance of the developed approach is demonstrated by an application to oil spill areas in the Gulf of Mexico.
In this example, areas affected by oil spills were identified from a series of ALOS PALSAR images acquired in 2010. The comparison showed exceptional performance of our method. This method can be applied to emergency management and decision support systems with a need for real-time data, and it shows great potential for rapid data analysis in other areas, including volcano detection, flood boundaries, forest health, and wildfires.
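A drastically simplified, pixel-wise version of the change-detection idea can be sketched as follows (toy intensity values; the method described above adds texture normalization and multi-scale wavelet decomposition on top of this):

```python
# Hedged sketch: flag pixels whose intensity change between two co-registered
# acquisitions exceeds a threshold. Real SAR change detection must also handle
# speckle and background variability, which this toy version ignores.
def change_mask(img_t0, img_t1, threshold=0.5):
    return [[abs(a - b) > threshold for a, b in zip(r0, r1)]
            for r0, r1 in zip(img_t0, img_t1)]

before = [[0.1, 0.2], [0.1, 0.1]]
after = [[0.1, 0.9], [0.1, 0.8]]   # two pixels change strongly between dates
mask = change_mask(before, after)
```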
Chapter 9. Benefits of International Collaboration | Science ...
In this chapter, we share what we have learned from working with our Brazilian colleagues on a multi-university, multi-year, and multi-basin ecological assessment and how those experiences were transmitted more broadly. These lessons (each of which is described in subsequent paragraphs) included 1) learning about markedly different ecosystems; 2) values to the U.S. Environmental Protection Agency (USEPA) of testing monitoring protocols in those ecosystems; 3) applying lessons from the CEMIG (Companhia Energética de Minas Gerais) project to research on other continents and elsewhere in Brazil; 4) advantages of academic team research; 5) benefits of corporate-sponsored research and federal student scholarships; 6) communicating with the general public; 7) the research web that has developed out of our work in Brazil; and 8) experiencing Brazilian culture. The USEPA's NARS survey designs and field methods are being applied in large basin stream surveys in countries outside of the U.S. These applications not only provide valuable tests of the NARS approaches, but enhance international cooperation and generate new understandings of natural and anthropogenic controls on biota and physical habitat in streams. These understandings not only aid interpretation of the condition of streams in the regions surveyed, but also refine approaches for interpreting aquatic resource surveys elsewhere. In this book chapter, Robert Hughes and Philip Kaufmann describe th
A Review of Multivariate Methods for Multimodal Fusion of Brain Imaging Data
Adali, Tülay; Yu, Qingbao; Calhoun, Vince D.
2011-01-01
The development of various neuroimaging techniques is rapidly improving the measurements of brain function/structure. However, despite improvements in individual modalities, it is becoming increasingly clear that the most effective research approaches will utilize multi-modal fusion, which takes advantage of the fact that each modality provides a limited view of the brain. The goal of multimodal fusion is to capitalize on the strength of each modality in a joint analysis, rather than a separate analysis of each. This is a more complicated endeavor that must be approached more carefully and efficient methods should be developed to draw generalized and valid conclusions from high dimensional data with a limited number of subjects. Numerous research efforts have been reported in the field based on various statistical approaches, e.g. independent component analysis (ICA), canonical correlation analysis (CCA) and partial least squares (PLS). In this review paper, we survey a number of multivariate methods appearing in previous reports, which are performed with or without prior information and may have utility for identifying potential brain illness biomarkers. We also discuss the possible strengths and limitations of each method, and review their applications to brain imaging data. PMID:22108139
Determinants of Food Safety Risks: A Multi-disciplinary Approach
ERIC Educational Resources Information Center
Knight, Andrew; Warland, Rex
2005-01-01
This research employs a multi-disciplinary approach by developing a model that draws upon psychometric, cultural, and reflexive modernization perspectives of risk perception. Using data from a 1999 national telephone survey, we tested our model on three food risks: pesticides, Salmonella, and fat. Results showed that perceptions of risks do vary…
Dynamic systems and inferential information processing in human communication.
Grammer, Karl; Fink, Bernhard; Renninger, LeeAnn
2002-12-01
Research in human communication on an ethological basis is almost obsolete. The reasons for this are manifold and lie partially in methodological problems connected to the observation and description of behavior, as well as the nature of human behavior itself. In this chapter, we present a new, non-intrusive, technical approach to the analysis of human non-verbal behavior, which could help to solve the problem of categorization that plagues the traditional approaches. We utilize evolutionary theory to propose a new theory-driven methodological approach to the 'multi-unit multi-channel modulation' problem of human nonverbal communication. Within this concept, communication is seen as context-dependent (the meaning of a signal is adapted to the situation), as a multi-channel and a multi-unit process (a string of many events interrelated in 'communicative' space and time), and as related to the function it serves. Such an approach can be utilized to successfully bridge the gap between evolutionary psychological research, which focuses on social cognition adaptations, and human ethology, which describes every day behavior in an objective, systematic way.
Multi-Objective Approach for Energy-Aware Workflow Scheduling in Cloud Computing Environments
Yassa, Sonia; Chelouah, Rachid; Kadima, Hubert; Granado, Bertrand
2013-01-01
We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, the cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on the optimization constrained by time or cost without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds, and present the hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate in different voltage supply levels by sacrificing clock frequencies. This multiple voltage involves a compromise between the quality of schedules and energy. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach. PMID:24319361
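The DVFS energy/time trade-off at the heart of this approach can be illustrated with a back-of-the-envelope model (the capacitance, voltage, and frequency values are toy assumptions; dynamic power is taken to scale as C·V²·f):

```python
# Hedged sketch of the DVFS compromise: lowering the voltage/frequency level
# lengthens a task's execution time while cutting its dynamic energy.
def task_energy(work_cycles, voltage, frequency, capacitance=1e-9):
    power = capacitance * voltage ** 2 * frequency  # dynamic power (W)
    time = work_cycles / frequency                  # execution time (s)
    return power * time, time

# Same task at a high and a low DVFS operating point (toy numbers).
e_high, t_high = task_energy(1e9, voltage=1.2, frequency=2e9)
e_low, t_low = task_energy(1e9, voltage=0.9, frequency=1e9)
```

The low level halves the frequency (doubling runtime) yet still reduces total energy because power falls faster than time grows; this is exactly the schedule-quality-versus-energy compromise the multi-objective scheduler explores.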
Multi-mounted X-ray cone-beam computed tomography
NASA Astrophysics Data System (ADS)
Fu, Jian; Wang, Jingzheng; Guo, Wei; Peng, Peng
2018-04-01
As a powerful nondestructive inspection technique, X-ray computed tomography (X-CT) has been widely applied to clinical diagnosis, industrial production and cutting-edge research. Imaging efficiency is currently one of the major obstacles for the applications of X-CT. In this paper, a multi-mounted three-dimensional cone-beam X-CT (MM-CBCT) method is reported. It consists of a novel multi-mounted cone-beam scanning geometry and the corresponding three-dimensional statistical iterative reconstruction algorithm. The scanning geometry is the most distinctive part of the design and differs significantly from current CBCT systems. By permitting the cone-beam scanning of multiple objects simultaneously, the proposed approach has the potential to achieve an imaging efficiency orders of magnitude greater than conventional methods. Although multiple objects can also be bundled together and scanned simultaneously by conventional CBCT methods, doing so leads to increased penetration thickness and signal crosstalk. In contrast, MM-CBCT substantially avoids these problems. This work comprises a numerical study of the method and its experimental verification using a dataset measured with a developed MM-CBCT prototype system. This technique provides a possible solution for large-scale CT inspection.
A Summary of the Naval Postgraduate School Research Program
1989-08-30
Contents include: Fundamental Theory for Automatically Combining Changes to Software Systems; Database-System Approach to Software Engineering Environments (SEEs); Multilevel Database Security; Temporal Database Management and Real-Time Database Computers; The Multi-lingual, Multi-Model, Multi-Backend Database.
Chiu, Yuan-Shyi Peter; Chou, Chung-Li; Chang, Huei-Hsin; Chiu, Singa Wang
2016-01-01
A multi-customer finite production rate (FPR) model with quality assurance and discontinuous delivery policy was investigated in a recent paper (Chiu et al. in J Appl Res Technol 12(1):5-13, 2014) using a differential calculus approach. This study employs mathematical modeling along with a two-phase algebraic method to resolve such a specific multi-customer FPR model. As a result, the optimal replenishment lot size and number of shipments can be derived without using differential calculus. Such a straightforward method may assist practitioners with insufficient knowledge of calculus in learning and managing real multi-customer FPR systems more effectively.
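For readers unfamiliar with FPR models, the classical single-product finite-production-rate lot size is a useful, simplified stand-in for the multi-customer model studied here (the cost and rate figures below are hypothetical, and the real model adds quality assurance and discontinuous deliveries):

```python
import math

# Classic finite-production-rate (EPQ) lot size:
#   Q* = sqrt( 2 K D / ( h (1 - D/P) ) )
# where D = annual demand, K = setup cost, h = unit holding cost,
# and P = annual production rate (P > D).
def epq(demand, setup_cost, holding_cost, production_rate):
    return math.sqrt(2 * setup_cost * demand /
                     (holding_cost * (1 - demand / production_rate)))

q = epq(demand=4000, setup_cost=450, holding_cost=2, production_rate=10000)
```

With these toy figures Q* is about 1732 units; the paper's contribution is deriving the analogue of this optimum algebraically, without the usual calculus-based derivation.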
2010-01-01
Background There have been a number of interventions to date aimed at improving malaria diagnostic accuracy in sub-Saharan Africa. Yet, limited success is often reported for a number of reasons, especially in rural settings. This paper seeks to provide a framework for applied research aimed at improving malaria diagnosis using a combination of established methods, participatory action research and social entrepreneurship. Methods This case study introduces the idea of using the social entrepreneurship approach (SEA) to create innovative and sustainable applied health research outcomes. The following key elements define the SEA: (1) identifying a locally relevant research topic and plan, (2) recognizing the importance of international multi-disciplinary teams and the incorporation of local knowledge, (3) engaging in a process of continuous innovation, adaptation and learning, (4) remaining motivated and determined to achieve sustainable long-term research outcomes and, (5) sharing and transferring ownership of the project with the international and local partner. Evaluation The SEA has a strong emphasis on innovation led by local stakeholders. In this case, innovation resulted in a unique holistic research program aimed at understanding patient, laboratory and physician influences on accurate diagnosis of malaria. An evaluation of milestones for each SEA element revealed that the success of one element is intricately related to the success of other elements. Conclusions The SEA will provide an additional framework for researchers and local stakeholders that promotes innovation and adaptability. This approach will facilitate the development of new ideas, strategies and approaches to understand how health issues, such as malaria, affect vulnerable communities. PMID:20128922
Triangulation and the importance of establishing valid methods for food safety culture evaluation.
Jespersen, Lone; Wallace, Carol A
2017-10-01
The research evaluates maturity of food safety culture in five multi-national food companies using method triangulation, specifically self-assessment scale, performance documents, and semi-structured interviews. Weaknesses associated with each individual method are known but there are few studies in food safety where a method triangulation approach is used for both data collection and data analysis. Significantly, this research shows that individual results taken in isolation can lead to wrong conclusions, resulting in potentially failing tactics and wasted investments. However, by applying method triangulation and reviewing results from a range of culture measurement tools it is possible to better direct investments and interventions. The findings add to the food safety culture paradigm beyond a single evaluation of food safety culture using generic culture surveys. Copyright © 2017. Published by Elsevier Ltd.
Eliseyev, Andrey; Aksenova, Tetiana
2016-01-01
In the current paper the decoding algorithms for motor-related BCI systems for continuous upper limb trajectory prediction are considered. Two methods for the smooth prediction, namely Sobolev and Polynomial Penalized Multi-Way Partial Least Squares (PLS) regressions, are proposed. The methods are compared to the Multi-Way Partial Least Squares and Kalman Filter approaches. The comparison demonstrated that the proposed methods combined the prediction accuracy of the algorithms of the PLS family and trajectory smoothness of the Kalman Filter. In addition, the prediction delay is significantly lower for the proposed algorithms than for the Kalman Filter approach. The proposed methods could be applied in a wide range of applications beyond neuroscience. PMID:27196417
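As a point of reference for the Kalman Filter baseline mentioned above, a minimal one-dimensional filter can be sketched (the process/measurement noise values are toy assumptions; the paper's setting is multi-dimensional trajectory decoding):

```python
# Hedged 1-D Kalman filter sketch with a random-walk state model:
# each step predicts (inflating the variance by q) and then blends in the
# new measurement z with gain k = p / (p + r).
def kalman_1d(measurements, q=1e-3, r=0.1):
    x, p = measurements[0], 1.0   # state estimate and its variance
    out = [x]
    for z in measurements[1:]:
        p += q                    # predict step
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update with measurement z
        p *= (1 - k)
        out.append(x)
    return out

smoothed = kalman_1d([1.0, 1.2, 0.9, 1.1, 1.0])
```

Each estimate is a convex combination of the previous estimate and the new measurement, which is what produces the trajectory smoothness (and the lag) that the proposed PLS-based decoders aim to match without the delay.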
Multi-criteria decision analysis for waste management in Saharawi refugee camps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garfi, M.; Tondelli, S.; Bonoli, A.
2009-10-15
The aim of this paper is to compare different waste management solutions in Saharawi refugee camps (Algeria) and to test the feasibility of a decision-making method developed to be applied in particular conditions in which environmental and social aspects must be considered. It is based on multi-criteria analysis, in particular on the analytic hierarchy process (AHP), a mathematical technique for multi-criteria decision making (Saaty, T.L., 1980. The Analytic Hierarchy Process. McGraw-Hill, New York, USA; Saaty, T.L., 1990. How to Make a Decision: The Analytic Hierarchy Process. European Journal of Operational Research; Saaty, T.L., 1994. Decision Making for Leaders: The Analytic Hierarchy Process in a Complex World. RWS Publications, Pittsburgh, PA), and on a participatory approach focusing on the local community's concerns. The research compares four different waste collection and management alternatives: waste collection by using three tipper trucks, disposal and burning in an open area; waste collection by using seven dumpers and disposal in a landfill; waste collection by using seven dumpers and three tipper trucks and disposal in a landfill; waste collection by using three tipper trucks and disposal in a landfill. The results show that the second and the third solutions provide better scenarios for waste management. Furthermore, the discussion of the results points out the multidisciplinarity of the approach and the equilibrium between social, environmental and technical impacts. This is a very important aspect in a humanitarian and environmental project, confirming the appropriateness of the chosen method.
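The AHP weighting step can be illustrated with a minimal sketch (the pairwise judgments for hypothetical environmental, social, and technical criteria are invented; the column-normalization approximation of Saaty's principal-eigenvector method is used):

```python
# Hedged AHP sketch: derive criterion weights from a reciprocal pairwise
# comparison matrix by normalizing each column and averaging across rows.
def ahp_weights(pairwise):
    n = len(pairwise)
    col_sums = [sum(pairwise[i][j] for i in range(n)) for j in range(n)]
    normalized = [[pairwise[i][j] / col_sums[j] for j in range(n)]
                  for i in range(n)]
    return [sum(normalized[i]) / n for i in range(n)]

# Toy judgments: environmental vs social = 3, environmental vs technical = 5,
# social vs technical = 2 (reciprocals fill the lower triangle).
matrix = [[1, 3, 5],
          [1 / 3, 1, 2],
          [1 / 5, 1 / 2, 1]]
weights = ahp_weights(matrix)   # priorities summing to 1
```

In a full AHP study one would also check the consistency ratio of the judgments before using the resulting weights to score the four alternatives.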
Fernandes, Michelle; Stein, Alan; Newton, Charles R.; Cheikh-Ismail, Leila; Kihara, Michael; Wulff, Katharina; de León Quintana, Enrique; Aranzeta, Luis; Soria-Frisch, Aureli; Acedo, Javier; Ibanez, David; Abubakar, Amina; Giuliani, Francesca; Lewis, Tamsin; Kennedy, Stephen; Villar, Jose
2014-01-01
Background The International Fetal and Newborn Growth Consortium for the 21st Century (INTERGROWTH-21st) Project is a population-based, longitudinal study describing early growth and development in an optimally healthy cohort of 4607 mothers and newborns. At 24 months, children are assessed for neurodevelopmental outcomes with the INTERGROWTH-21st Neurodevelopment Package. This paper describes neurodevelopment tools for preschoolers and the systematic approach leading to the development of the Package. Methods An advisory panel shortlisted project-specific criteria (such as multi-dimensional assessments and suitability for international populations) to be fulfilled by a neurodevelopment instrument. A literature review of well-established tools for preschoolers revealed 47 candidates, none of which fulfilled all the project's criteria. A multi-dimensional assessment was, therefore, compiled using a package-based approach by: (i) categorizing desired outcomes into domains, (ii) devising domain-specific criteria for tool selection, and (iii) selecting the most appropriate measure for each domain. Results The Package measures vision (Cardiff tests); cortical auditory processing (auditory evoked potentials to a novelty oddball paradigm); and cognition, language skills, behavior, motor skills and attention (the INTERGROWTH-21st Neurodevelopment Assessment) in 35–45 minutes. Sleep-wake patterns (actigraphy) are also assessed. Tablet-based applications with integrated quality checks and automated, wireless electroencephalography make the Package easy to administer in the field by non-specialist staff. The Package is in use in Brazil, India, Italy, Kenya and the United Kingdom. Conclusions The INTERGROWTH-21st Neurodevelopment Package is a multi-dimensional instrument measuring early child development (ECD). Its developmental approach may be useful to those involved in large-scale ECD research and surveillance efforts. PMID:25423589
NASA Astrophysics Data System (ADS)
Salucci, Marco; Tenuti, Lorenza; Nardin, Cristina; Oliveri, Giacomo; Viani, Federico; Rocca, Paolo; Massa, Andrea
2014-05-01
The application of non-destructive testing and evaluation (NDT/NDE) methodologies in civil engineering has attracted growing interest in recent years because of its potential impact in several different scenarios. As a consequence, Ground Penetrating Radar (GPR) technologies have been widely adopted as an instrument for the inspection of the structural stability of buildings and for the detection of cracks and voids. In this framework, the development and validation of GPR algorithms and methodologies represents one of the most active research areas within the ELEDIA Research Center of the University of Trento. More specifically, great efforts have been devoted to the development of inversion techniques based on the integration of deterministic and stochastic search algorithms with multi-focusing strategies. These approaches proved to be effective in mitigating the effects of both the nonlinearity and the ill-posedness of microwave imaging problems, which are the well-known issues arising in GPR inverse scattering formulations. In particular, a regularized multi-resolution approach based on the Inexact Newton Method (INM) has recently been applied to subsurface prospecting, showing a remarkable advantage over a single-resolution implementation [1]. Moreover, the use of multi-frequency or frequency-hopping strategies to exploit the information coming from GPR data collected in the time domain and transformed into its frequency components has been proposed as well. In this framework, the effectiveness of the multi-resolution multi-frequency techniques has been proven on synthetic data generated with numerical models such as GprMax [2]. The application of inversion algorithms based on Bayesian Compressive Sampling (BCS) [3][4] to GPR is also currently under investigation, in order to exploit their capability to provide satisfactory reconstructions in the presence of single and multiple sparse scatterers [3][4]. 
Furthermore, multi-scaling approaches exploiting level-set-based optimization have been developed for the qualitative reconstruction of multiple and disconnected homogeneous scatterers [5]. Finally, the real-time detection and classification of subsurface scatterers has been investigated by means of learning-by-examples (LBE) techniques, such as Support Vector Machines (SVM) [6].
Acknowledgment - This work was partially supported by COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar".
References
[1] M. Salucci, D. Sartori, N. Anselmi, A. Randazzo, G. Oliveri, and A. Massa, "Imaging buried objects within the second-order Born approximation through a multiresolution regularized inexact-Newton method," 2013 International Symposium on Electromagnetic Theory (EMTS), Hiroshima, Japan, May 20-24, 2013 (invited).
[2] A. Giannopoulos, "Modelling ground penetrating radar by GprMax," Construct. Build. Mater., vol. 19, no. 10, pp. 755-762, 2005.
[3] L. Poli, G. Oliveri, P. Rocca, and A. Massa, "Bayesian compressive sensing approaches for the reconstruction of two-dimensional sparse scatterers under TE illumination," IEEE Trans. Geosci. Remote Sensing, vol. 51, no. 5, pp. 2920-2936, May 2013.
[4] L. Poli, G. Oliveri, and A. Massa, "Imaging sparse metallic cylinders through a Local Shape Function Bayesian Compressive Sensing approach," J. Opt. Soc. Am. A, vol. 30, no. 6, pp. 1261-1272, 2013.
[5] M. Benedetti, D. Lesselier, M. Lambert, and A. Massa, "Multiple shapes reconstruction by means of multi-region level sets," IEEE Trans. Geosci. Remote Sensing, vol. 48, no. 5, pp. 2330-2342, May 2010.
[6] L. Lizzi, F. Viani, P. Rocca, G. Oliveri, M. Benedetti, and A. Massa, "Three-dimensional real-time localization of subsurface objects - from theory to experimental validation," 2009 IEEE International Geoscience and Remote Sensing Symposium, vol. 2, pp. II-121-II-124, July 12-17, 2009.
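The frequency-hopping strategy mentioned above starts from time-domain GPR traces decomposed into frequency components. A minimal numpy sketch of that decomposition on a synthetic Ricker-like pulse (the sampling rate, centre frequency, and hopping set below are illustrative assumptions, not the ELEDIA processing chain):

```python
import numpy as np

# Synthetic time-domain GPR trace: a Ricker-like pulse (illustrative, not real data)
fs = 2e9                       # sampling rate [Hz] (assumed)
t = np.arange(0, 20e-9, 1/fs)  # 20 ns time window
f0 = 500e6                     # pulse centre frequency [Hz] (assumed)
tau = t - 5e-9
trace = (1 - 2*(np.pi*f0*tau)**2) * np.exp(-(np.pi*f0*tau)**2)

# Transform into frequency components and pick a set of "hopping" frequencies
spectrum = np.fft.rfft(trace)
freqs = np.fft.rfftfreq(len(trace), d=1/fs)
hop_freqs = np.array([300e6, 500e6, 700e6])  # frequency-hopping set (assumed)
hop_data = [spectrum[np.argmin(np.abs(freqs - f))] for f in hop_freqs]

for f, d in zip(hop_freqs, hop_data):
    print(f"{f/1e6:.0f} MHz: |X(f)| = {abs(d):.3f}")
```

Each complex sample in `hop_data` would then feed one single-frequency inversion step of a hopping scheme.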
The Blended Finite Element Method for Multi-fluid Plasma Modeling
2016-07-01
Briefing charts (dates covered: 07 June 2016 - 01 July 2016) by Éder M. Sousa (ERC Inc., In-Space Propulsion Branch (RQRS), Air Force Research Laboratory) and Uri Shumlak, presenting the blended finite element method for multi-fluid plasma modeling, which combines a nodal continuous Galerkin discretization with a modal discontinuous Galerkin discretization of the multi-fluid plasma model.
The Extraction of One-Dimensional Flow Properties from Multi-Dimensional Data Sets
NASA Technical Reports Server (NTRS)
Baurle, Robert A.; Gaffney, Richard L., Jr.
2007-01-01
The engineering design and analysis of air-breathing propulsion systems relies heavily on zero- or one-dimensional properties (e.g. thrust, total pressure recovery, mixing and combustion efficiency, etc.) for figures of merit. The extraction of these parameters from experimental data sets and/or multi-dimensional computational data sets is therefore an important aspect of the design process. A variety of methods exist for extracting performance measures from multi-dimensional data sets. Some of the information contained in the multi-dimensional flow is inevitably lost when any one-dimensionalization technique is applied. Hence, the unique assumptions associated with a given approach may result in one-dimensional properties that are significantly different than those extracted using alternative approaches. The purpose of this effort is to examine some of the more popular methods used for the extraction of performance measures from multi-dimensional data sets, reveal the strengths and weaknesses of each approach, and highlight various numerical issues that result when mapping data from a multi-dimensional space to a space of one dimension.
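The sensitivity to the one-dimensionalization assumption described above can be illustrated with a toy comparison of two common choices, area-weighted and mass-flux-weighted averaging; all numbers below are made up for illustration:

```python
import numpy as np

# Toy cross-plane profile at a duct station (illustrative values)
A = np.array([1.0, 1.0, 1.0, 1.0])          # cell areas
rho = np.array([1.2, 1.1, 1.0, 0.9])        # density [kg/m^3]
u = np.array([100.0, 250.0, 400.0, 550.0])  # axial velocity [m/s]
T = np.array([300.0, 350.0, 420.0, 500.0])  # temperature [K]

# Two common one-dimensionalizations of the same temperature field:
area_avg = np.sum(T * A) / np.sum(A)        # area-weighted average
mdot = rho * u * A                          # local mass flux
mass_avg = np.sum(T * mdot) / np.sum(mdot)  # mass-flux-weighted average

print(f"area-averaged T: {area_avg:.1f} K")
print(f"mass-averaged T: {mass_avg:.1f} K")
```

The two averages differ by some 30 K on the same data, which is the point the abstract makes: the extracted one-dimensional property depends on the averaging assumption.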
The Art of Extracting One-Dimensional Flow Properties from Multi-Dimensional Data Sets
NASA Technical Reports Server (NTRS)
Baurle, R. A.; Gaffney, R. L.
2007-01-01
The engineering design and analysis of air-breathing propulsion systems relies heavily on zero- or one-dimensional properties (e.g. thrust, total pressure recovery, mixing and combustion efficiency, etc.) for figures of merit. The extraction of these parameters from experimental data sets and/or multi-dimensional computational data sets is therefore an important aspect of the design process. A variety of methods exist for extracting performance measures from multi-dimensional data sets. Some of the information contained in the multi-dimensional flow is inevitably lost when any one-dimensionalization technique is applied. Hence, the unique assumptions associated with a given approach may result in one-dimensional properties that are significantly different than those extracted using alternative approaches. The purpose of this effort is to examine some of the more popular methods used for the extraction of performance measures from multi-dimensional data sets, reveal the strengths and weaknesses of each approach, and highlight various numerical issues that result when mapping data from a multi-dimensional space to a space of one dimension.
Wavepacket dynamics and the multi-configurational time-dependent Hartree approach
NASA Astrophysics Data System (ADS)
Manthe, Uwe
2017-06-01
Multi-configurational time-dependent Hartree (MCTDH) based approaches are efficient, accurate, and versatile methods for high-dimensional quantum dynamics simulations. Applications range from detailed investigations of polyatomic reaction processes in the gas phase to high-dimensional simulations studying the dynamics of condensed phase systems described by typical solid state physics model Hamiltonians. The present article presents an overview of the different areas of application and provides a comprehensive review of the underlying theory. The concepts and guiding ideas underlying the MCTDH approach and its multi-mode and multi-layer extensions are discussed in detail. The general structure of the equations of motion is highlighted. The representation of the Hamiltonian and the correlated discrete variable representation (CDVR), which provides an efficient multi-dimensional quadrature in MCTDH calculations, are discussed. Methods which facilitate the calculation of eigenstates, the evaluation of correlation functions, and the efficient representation of thermal ensembles in MCTDH calculations are described. Different schemes for the treatment of indistinguishable particles in MCTDH calculations and recent developments towards a unified multi-layer MCTDH theory for systems including bosons and fermions are discussed.
ERIC Educational Resources Information Center
Moustafa, Amr; Ghani, Mohd Zuri
2016-01-01
This research examines the effectiveness of a multi-sensory approach for improving knowledge of English letter-sound correspondence among students with mild disabilities in the state of Kuwait. The discussion in this study is based on the multisensory approach that could be applied in the teaching of reading skills as well as phonemic…
Data warehousing methods and processing infrastructure for brain recovery research.
Gee, T; Kenny, S; Price, C J; Seghier, M L; Small, S L; Leff, A P; Pacurar, A; Strother, S C
2010-09-01
In order to accelerate translational neuroscience with the goal of improving clinical care it has become important to support rapid accumulation and analysis of large, heterogeneous neuroimaging samples and their metadata from both normal control and patient groups. We propose a multi-centre, multinational approach to accelerate the data mining of large samples and facilitate data-led clinical translation of neuroimaging results in stroke. Such data-driven approaches are likely to have an early impact on clinically relevant brain recovery while we simultaneously pursue the much more challenging model-based approaches that depend on a deep understanding of the complex neural circuitry and physiological processes that support brain function and recovery. We present a brief overview of three (potentially converging) approaches to neuroimaging data warehousing and processing that aim to support these diverse methods for facilitating prediction of cognitive and behavioral recovery after stroke, or other types of brain injury or disease.
A Generalized Approach for Measuring Relationships Among Genes.
Wang, Lijun; Ahsan, Md Asif; Chen, Ming
2017-07-21
Several methods for identifying relationships among pairs of genes have been developed. In this article, we present a generalized approach for measuring relationships between any pair of genes, based on statistical prediction. We derive two particular versions of the generalized approach: least squares estimation (LSE) and nearest neighbors prediction (NNP). We prove mathematically that LSE is equivalent to correlation-based methods, and simulations and a real dataset show that NNP approximates one popular method, the maximal information coefficient (MIC). Moreover, the approach based on statistical prediction can be extended from two-gene relationships to multi-gene relationships. This extension would help identify relationships among multiple genes.
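A minimal sketch of the two prediction-based measures described: a least-squares fit score (equivalent to squared correlation) and a leave-one-out nearest-neighbour prediction score, which also picks up non-monotonic relationships. The function names and test signals are ours, not the authors':

```python
import numpy as np

def lse_score(x, y):
    """R^2 of the least-squares line y ~ a*x + b (equals squared correlation)."""
    a, b = np.polyfit(x, y, 1)
    resid = y - (a * x + b)
    return 1.0 - resid.var() / y.var()

def nnp_score(x, y):
    """Leave-one-out nearest-neighbour prediction: predict y[i] from the y of
    the sample whose x is closest to x[i], then score with R^2."""
    n = len(x)
    pred = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        d[i] = np.inf                 # exclude the point itself
        pred[i] = y[np.argmin(d)]
    return 1.0 - np.mean((y - pred) ** 2) / y.var()

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 200)
linear = 3 * x + rng.normal(0, 0.1, 200)          # linear relationship
nonlin = np.cos(2 * x) + rng.normal(0, 0.1, 200)  # non-monotonic relationship

print(f"linear: LSE={lse_score(x, linear):.2f}  NNP={nnp_score(x, linear):.2f}")
print(f"cosine: LSE={lse_score(x, nonlin):.2f}  NNP={nnp_score(x, nonlin):.2f}")
```

On the cosine signal the least-squares score collapses while the nearest-neighbour score stays high, mirroring the MIC-like behaviour the abstract attributes to NNP.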
Huang, Ming-Xiong; Anderson, Bill; Huang, Charles W.; Kunde, Gerd J.; Vreeland, Erika C.; Huang, Jeffrey W.; Matlashov, Andrei N.; Karaulanov, Todor; Nettles, Christopher P.; Gomez, Andrew; Minser, Kayla; Weldon, Caroline; Paciotti, Giulio; Harsh, Michael; Lee, Roland R.; Flynn, Edward R.
2017-01-01
Superparamagnetic Relaxometry (SPMR) is a highly sensitive technique for the in vivo detection of tumor cells and may improve early-stage detection of cancers. SPMR employs superparamagnetic iron oxide nanoparticles (SPION). After a brief magnetizing pulse is used to align the SPION, SPMR measures the time decay of the SPION using Superconducting Quantum Interference Device (SQUID) sensors. Substantial research has been carried out in developing the SQUID hardware and in improving the properties of the SPION. However, little research has been done on the pre-processing of sensor signals and post-processing source modeling in SPMR. In the present study, we illustrate new pre-processing tools that were developed to: 1) remove trials contaminated with artifacts, 2) evaluate and ensure that a single decay process associated with bound SPION exists in the data, 3) automatically detect and correct flux jumps, and 4) accurately fit the sensor signals with different decay models. Furthermore, we developed an automated approach based on a multi-start dipole imaging technique to obtain the locations and magnitudes of multiple magnetic sources, without initial guesses from the users. A regularization process was implemented to solve the ambiguity issue related to the SPMR source variables. A procedure based on a reduced chi-square cost function was introduced to objectively obtain the adequate number of dipoles that describe the data. The new pre-processing tools and multi-start source imaging approach have been successfully evaluated using phantom data. In conclusion, these tools and the multi-start source modeling approach substantially enhance the accuracy and sensitivity of detecting and localizing sources from SPMR signals. Furthermore, the multi-start approach with regularization provided robust and accurate solutions under poor SNR conditions, comparable to an SPMR detection sensitivity on the order of 1,000 cells.
We believe such algorithms will help establish industrial standards for SPMR when applying the technique in pre-clinical and clinical settings. PMID:28072579
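The multi-start idea, trying many candidate source configurations and keeping the best fit, can be sketched on a toy 1-D source-localization problem. As in dipole fitting, the amplitudes are linear given candidate positions, so each start reduces to a small least-squares solve; the Lorentzian-like footprint and all numbers below are illustrative assumptions, not the SPMR forward model:

```python
import numpy as np

rng = np.random.default_rng(1)
sensors = np.linspace(-5, 5, 40)  # toy 1-D sensor array positions

def footprint(p):
    """Toy sensor footprint of a unit-amplitude source at position p (assumed model)."""
    return 1.0 / ((sensors - p) ** 2 + 1.0)

true_pos, true_amp = np.array([-2.0, 2.5]), np.array([1.5, 0.8])
data = footprint(true_pos[0]) * true_amp[0] + footprint(true_pos[1]) * true_amp[1]
data += rng.normal(0, 0.005, sensors.size)  # measurement noise

# Multi-start over candidate source positions: for each (p1, p2) pair the
# amplitudes are linear parameters solved by least squares; keep the best fit.
grid = np.linspace(-5, 5, 51)
best = (np.inf, None, None)
for i, p1 in enumerate(grid):
    for p2 in grid[i + 1:]:
        G = np.column_stack([footprint(p1), footprint(p2)])
        amp, *_ = np.linalg.lstsq(G, data, rcond=None)
        resid = np.sum((G @ amp - data) ** 2)
        if resid < best[0]:
            best = (resid, np.array([p1, p2]), amp)

resid, pos, amp = best
order = np.argsort(pos)
print("positions:", pos[order], "amplitudes:", amp[order])
```

No initial guess is needed: the exhaustive set of starts plays the role of the random restarts in the authors' approach.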
NASA Astrophysics Data System (ADS)
Shen, Fei; Chen, Chao; Yan, Ruqiang
2017-05-01
Classical bearing fault diagnosis methods, being designed for one specific task, focus on the effectiveness of the extracted features and the final diagnostic performance. However, most of these approaches suffer from inefficiency when multiple tasks exist, especially in a real-time diagnostic scenario. A fault diagnosis method based on Non-negative Matrix Factorization (NMF) and a Co-clustering strategy is proposed to overcome this limitation. Firstly, high-dimensional matrices are constructed using Short-Time Fourier Transform (STFT) features, where the dimension of each matrix equals the number of target tasks. Then, the NMF algorithm is carried out to obtain different components in each dimension direction through optimized matching, such as Euclidean distance and divergence distance. Finally, a Co-clustering technique based on information entropy is utilized to classify each component. To verify the effectiveness of the proposed approach, a series of bearing data sets were analysed in this research. The tests indicated that although the single-task diagnostic performance is comparable to traditional clustering methods such as the K-means algorithm and the Gaussian Mixture Model, the accuracy and computational efficiency in multi-task fault diagnosis are improved.
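The NMF step can be sketched with the standard Lee-Seung multiplicative updates for the Euclidean cost. This is a generic NMF implementation run on a synthetic non-negative matrix, not the authors' STFT-based pipeline:

```python
import numpy as np

def nmf(V, rank, iters=1000, seed=0):
    """Non-negative Matrix Factorization, V ~ W @ H, via Lee-Seung
    multiplicative updates for the Euclidean (Frobenius) cost."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.uniform(0.1, 1.0, (n, rank))
    H = rng.uniform(0.1, 1.0, (rank, m))
    eps = 1e-9  # guards against division by zero
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Synthetic "feature matrix" with an exact rank-3 non-negative structure
rng = np.random.default_rng(1)
V = rng.uniform(0, 1, (30, 3)) @ rng.uniform(0, 1, (3, 40))

W, H = nmf(V, rank=3)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(f"relative reconstruction error: {err:.4f}")
```

The multiplicative form keeps both factors non-negative throughout, which is what makes the resulting components interpretable as additive parts.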
Multi-atlas segmentation enables robust multi-contrast MRI spleen segmentation for splenomegaly
NASA Astrophysics Data System (ADS)
Huo, Yuankai; Liu, Jiaqi; Xu, Zhoubing; Harrigan, Robert L.; Assad, Albert; Abramson, Richard G.; Landman, Bennett A.
2017-02-01
Non-invasive spleen volume estimation is essential in detecting splenomegaly. Magnetic resonance imaging (MRI) has been used to facilitate splenomegaly diagnosis in vivo. However, achieving accurate spleen volume estimation from MR images is challenging given the great inter-subject variance of human abdomens and wide variety of clinical images/modalities. Multi-atlas segmentation has been shown to be a promising approach to handle heterogeneous data and difficult anatomical scenarios. In this paper, we propose to use multi-atlas segmentation frameworks for MRI spleen segmentation for splenomegaly. To the best of our knowledge, this is the first work that integrates multi-atlas segmentation for splenomegaly as seen on MRI. To address the particular concerns of spleen MRI, automated and novel semi-automated atlas selection approaches are introduced. The automated approach interactively selects a subset of atlases using selective and iterative method for performance level estimation (SIMPLE) approach. To further control the outliers, semi-automated craniocaudal length based SIMPLE atlas selection (L-SIMPLE) is proposed to introduce a spatial prior in a fashion to guide the iterative atlas selection. A dataset from a clinical trial containing 55 MRI volumes (28 T1 weighted and 27 T2 weighted) was used to evaluate different methods. Both automated and semi-automated methods achieved median DSC > 0.9. The outliers were alleviated by the L-SIMPLE (≈1 min manual efforts per scan), which achieved 0.9713 Pearson correlation compared with the manual segmentation. The results demonstrated that the multi-atlas segmentation is able to achieve accurate spleen segmentation from the multi-contrast splenomegaly MRI scans.
Multi-atlas Segmentation Enables Robust Multi-contrast MRI Spleen Segmentation for Splenomegaly.
Huo, Yuankai; Liu, Jiaqi; Xu, Zhoubing; Harrigan, Robert L; Assad, Albert; Abramson, Richard G; Landman, Bennett A
2017-02-11
Non-invasive spleen volume estimation is essential in detecting splenomegaly. Magnetic resonance imaging (MRI) has been used to facilitate splenomegaly diagnosis in vivo. However, achieving accurate spleen volume estimation from MR images is challenging given the great inter-subject variance of human abdomens and wide variety of clinical images/modalities. Multi-atlas segmentation has been shown to be a promising approach to handle heterogeneous data and difficult anatomical scenarios. In this paper, we propose to use multi-atlas segmentation frameworks for MRI spleen segmentation for splenomegaly. To the best of our knowledge, this is the first work that integrates multi-atlas segmentation for splenomegaly as seen on MRI. To address the particular concerns of spleen MRI, automated and novel semi-automated atlas selection approaches are introduced. The automated approach interactively selects a subset of atlases using selective and iterative method for performance level estimation (SIMPLE) approach. To further control the outliers, semi-automated craniocaudal length based SIMPLE atlas selection (L-SIMPLE) is proposed to introduce a spatial prior in a fashion to guide the iterative atlas selection. A dataset from a clinical trial containing 55 MRI volumes (28 T1 weighted and 27 T2 weighted) was used to evaluate different methods. Both automated and semi-automated methods achieved median DSC > 0.9. The outliers were alleviated by the L-SIMPLE (≈1 min manual efforts per scan), which achieved 0.9713 Pearson correlation compared with the manual segmentation. The results demonstrated that the multi-atlas segmentation is able to achieve accurate spleen segmentation from the multi-contrast splenomegaly MRI scans.
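The label-fusion core of multi-atlas segmentation can be sketched with plain per-voxel majority voting over already-registered atlas label maps; this is a simpler fusion rule than the SIMPLE/L-SIMPLE atlas selection described above, and the toy masks are made up:

```python
import numpy as np

def majority_vote(atlas_labels):
    """Fuse registered atlas segmentations (stack of integer label maps,
    shape (n_atlases, *image_shape)) by per-voxel majority voting."""
    stacked = np.asarray(atlas_labels)
    n_labels = stacked.max() + 1
    # votes[l] = number of atlases assigning label l to each voxel
    votes = np.stack([(stacked == l).sum(axis=0) for l in range(n_labels)])
    return votes.argmax(axis=0)

# Three toy 4x4 "registered atlas" spleen masks (1 = spleen), disagreeing at edges
a1 = np.zeros((4, 4), int); a1[1:3, 1:3] = 1
a2 = np.zeros((4, 4), int); a2[1:4, 1:3] = 1
a3 = np.zeros((4, 4), int); a3[1:3, 1:4] = 1

fused = majority_vote([a1, a2, a3])
print(fused)
```

Voxels where only one atlas disagrees are outvoted, which is how fusion suppresses individual registration errors; atlas selection schemes such as SIMPLE decide which atlases get to vote at all.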
Large area sub-micron chemical imaging of magnesium in sea urchin teeth.
Masic, Admir; Weaver, James C
2015-03-01
The heterogeneous and site-specific incorporation of inorganic ions can profoundly influence the local mechanical properties of damage tolerant biological composites. Using the sea urchin tooth as a research model, we describe a multi-technique approach to spatially map the distribution of magnesium in this complex multiphase system. Through the combined use of 16-bit backscattered scanning electron microscopy, multi-channel energy dispersive spectroscopy elemental mapping, and diffraction-limited confocal Raman spectroscopy, we demonstrate a new set of high throughput, multi-spectral, high resolution methods for the large scale characterization of mineralized biological materials. In addition, instrument hardware and data collection protocols can be modified such that several of these measurements can be performed on irregularly shaped samples with complex surface geometries and without the need for extensive sample preparation. Using these approaches, in conjunction with whole animal micro-computed tomography studies, we have been able to spatially resolve micron and sub-micron structural features across macroscopic length scales on entire urchin tooth cross-sections and correlate these complex morphological features with local variability in elemental composition. Copyright © 2015 Elsevier Inc. All rights reserved.
Jordan, Nika; Zakrajšek, Jure; Bohanec, Simona; Roškar, Robert; Grabnar, Iztok
2018-05-01
The aim of the present research is to show that Design of Experiments methodology can be applied to stability data evaluation, since stability studies can be seen as multi-factor and multi-level experimental designs. Linear regression analysis is the usual approach for analyzing stability data, but multivariate statistical methods could also be used to assess drug stability during the development phase. Data from a stability study of a pharmaceutical product with hydrochlorothiazide (HCTZ) as an unstable drug substance were used as a case example in this paper. The design space of the stability study was modeled using Umetrics MODDE 10.1 software. We showed that a Partial Least Squares model could be used for a multi-dimensional presentation of all data generated in a stability study and for determining the relationships among factors that influence drug stability. It might also be used for stability predictions and, potentially, for optimizing the extent of stability testing needed to determine shelf life and storage conditions, which would be time- and cost-effective for the pharmaceutical industry.
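The linear-regression baseline mentioned above can be sketched for a toy two-factor stability design (time, storage temperature) with synthetic assay values; the 95% specification limit and every number below are illustrative assumptions, and the PLS modeling itself (done in MODDE) is not reproduced:

```python
import numpy as np

# Toy stability design: assay (% label claim) at five time points and two temps
time = np.array([0, 3, 6, 9, 12, 0, 3, 6, 9, 12], float)          # months
temp = np.array([25, 25, 25, 25, 25, 40, 40, 40, 40, 40], float)  # deg C
assay = np.array([100.0, 99.4, 98.9, 98.2, 97.6,
                  100.1, 98.3, 96.6, 94.8, 93.1])                 # synthetic data

# Linear model with interaction: assay ~ b0 + b1*time + b2*temp + b3*time*temp
X = np.column_stack([np.ones_like(time), time, temp, time * temp])
beta, *_ = np.linalg.lstsq(X, assay, rcond=None)
b0, b1, b2, b3 = beta

# Shelf life at 25 degC: time when the fitted assay crosses 95% (assumed spec)
slope25 = b1 + b3 * 25
intercept25 = b0 + b2 * 25
shelf_life = (95.0 - intercept25) / slope25
print(f"rate at 25C: {slope25:.3f} %/month, shelf life ~ {shelf_life:.1f} months")
```

The interaction term lets the degradation rate depend on temperature, which is the multi-factor structure the abstract argues a DoE view makes explicit.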
NASA Astrophysics Data System (ADS)
dall'Acqua, Luisa
2011-08-01
The aim of our research is to respond to the call for "innovative, creative teaching" by proposing a methodology to educate creative students in a society characterized by multiple reference points and hyper-dynamic knowledge, continuously subject to review and discussion. We apply a multi-prospective Instructional Design Model (PENTHA ID Model), defined and developed by our research group, which adopts a hybrid pedagogical approach consisting of elements of didactical connectivism intertwined with aspects of social constructivism and enactivism. The contribution proposes an e-course structure and approach applying the theoretical design principles of the above-mentioned ID Model, describing methods, techniques, technologies and assessment criteria for the definition of lesson modes in an e-course.
Pilot users in agile development processes: motivational factors.
Johannessen, Liv Karen; Gammon, Deede
2010-01-01
Despite a wealth of research on user participation, few studies offer insights into how to involve multi-organizational users in agile development methods. This paper is a case study of user involvement in developing a system for electronic laboratory requisitions using agile methodologies in a multi-organizational context. Building on an interpretive approach, we illuminate questions such as: How does collaboration between users and developers evolve and how might it be improved? What key motivational aspects are at play when users volunteer and continue contributing in the face of considerable added burdens? The study highlights how agile methods in themselves appear to facilitate mutually motivating collaboration between user groups and developers. Lessons learned for leveraging the advantages of agile development processes include acknowledging the substantial and ongoing contributions of users and their roles as co-designers of the system.
NASA Astrophysics Data System (ADS)
Szafranko, Elżbieta
2017-10-01
Variant solutions developed for a building investment project need to be assessed at the planning stage. While considering alternative solutions, the investor defines various criteria, but directly evaluating the degree to which the developed variants fulfil them can be very difficult. In practice, different methods enable the user to include a large number of parameters in an analysis, but their implementation can be challenging. Some methods require advanced mathematical computations, preceded by complicated input data processing, and the generated results may not lend themselves easily to interpretation. Hence, during her research, the author has developed a systemic approach which involves several methods and whose goal is to compare their outcomes. The final stage of the proposed method consists of a graphic interpretation of the results. The method has been tested on a variety of building and development projects.
A combined approach of AHP and TOPSIS methods applied in the field of integrated software systems
NASA Astrophysics Data System (ADS)
Berdie, A. D.; Osaci, M.; Muscalagiu, I.; Barz, C.
2017-05-01
Adopting the most appropriate technology for developing applications on an integrated software system for enterprises may result in great savings in both cost and hours of work. This paper proposes a research study for determining a hierarchy among three SAP (System Applications and Products in Data Processing) technologies. The technologies Web Dynpro (WD), Floorplan Manager (FPM) and CRM WebClient UI (CRM WCUI) are evaluated against multiple criteria in terms of the performance obtained through the implementation of the same web business application. To establish the hierarchy, a multi-criteria analysis model that combines the AHP (Analytic Hierarchy Process) and TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) methods was proposed. This model was built with the help of the SuperDecision software, which is based on the AHP method and determines the weights for the selected sets of criteria. The TOPSIS method was used to obtain the final ranking and the technology hierarchy.
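The AHP-then-TOPSIS pipeline can be sketched end to end: criteria weights derived from a pairwise-comparison matrix via its principal eigenvector (the AHP step), then a TOPSIS ranking of three alternatives. The judgement matrix and decision scores below are made up for illustration, not the paper's data:

```python
import numpy as np

# AHP: criteria weights from a pairwise comparison matrix (made-up judgements)
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
vals, vecs = np.linalg.eig(P)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()                               # principal eigenvector -> weights

# TOPSIS: rank three alternatives on three benefit criteria (made-up scores)
D = np.array([[7.0, 9.0, 6.0],                # alternative A
              [8.0, 6.0, 7.0],                # alternative B
              [6.0, 7.0, 9.0]])               # alternative C
R = D / np.linalg.norm(D, axis=0)             # vector-normalized decision matrix
V = R * w                                     # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)    # all criteria treated as benefits
d_pos = np.linalg.norm(V - ideal, axis=1)     # distance to ideal solution
d_neg = np.linalg.norm(V - anti, axis=1)      # distance to anti-ideal solution
closeness = d_neg / (d_pos + d_neg)
ranking = np.argsort(-closeness)
print("closeness:", np.round(closeness, 3), "ranking (best first):", ranking)
```

For cost-type criteria the ideal/anti-ideal roles of `max`/`min` would swap per column; the sketch assumes benefit criteria throughout.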
Asan, Onur; Montague, Enid
2014-01-01
The purpose of this paper is to describe the use of video-based observation research methods in the primary care environment, highlight important methodological considerations, and provide practical guidance for primary care and human factors researchers conducting video studies to understand patient-clinician interaction in primary care settings. We reviewed studies in the literature that used video methods in health care research, and we also drew on our own experience from video studies we conducted in primary care settings. The paper highlights the benefits of using video techniques, such as multi-channel recording and video coding, and compares "unmanned" video recording with the traditional observation method in primary care research. We propose a step-by-step list for conducting an effective video study in a primary care setting for a given problem. We also describe obstacles researchers should anticipate when using video recording methods in future studies. With recent technological improvements, video-based observation research is becoming a promising method in primary care and HFE research. Video recording has been under-utilised as a data collection tool because of confidentiality and privacy issues. However, it has many benefits compared with traditional observation, and recent studies using video recording methods have introduced new research areas and approaches.
Bland, Andrew J; Tobbell, Jane
2015-11-01
Simulation has become an established feature of undergraduate nurse education and as such requires extensive investigation. Research limited to pre-constructed categories imposed by some questionnaire and interview methods may only provide partial understanding. This is problematic in understanding the mechanisms of learning in simulation-based education as contemporary distributed theories of learning posit that learning can be understood as the interaction of individual identity with context. This paper details a method of data collection and analysis that captures interaction of individuals within the simulation experience which can be analysed through multiple lenses, including context and through the lens of both researcher and learner. The study utilised a grounded theory approach involving 31 under-graduate third year student nurses. Data was collected and analysed through non-participant observation, digital recordings of simulation activity and focus group deconstruction of their recorded simulation by the participants and researcher. Focus group interviews enabled further clarification. The method revealed multiple levels of dynamic data, concluding that in order to better understand how students learn in social and active learning strategies, dynamic data is required enabling researchers and participants to unpack what is happening as it unfolds in action. Copyright © 2015 Elsevier Ltd. All rights reserved.
An emerging paradigm: a strength-based approach to exploring mental imagery
MacIntyre, Tadhg E.; Moran, Aidan P.; Collet, Christian; Guillot, Aymeric
2013-01-01
Mental imagery, or the ability to simulate in the mind information that is not currently perceived by the senses, has attracted considerable research interest in psychology since the early 1970s. Within the past two decades, research in this field—as in cognitive psychology more generally—has been dominated by neuroscientific methods that typically involve comparisons between the imagery performance of participants from clinical populations and those who exhibit apparently normal cognitive functioning. Although this approach has been valuable in identifying key neural substrates of visual imagery, it has been less successful in understanding the possible mechanisms underlying another simulation process, namely, motor imagery or the mental rehearsal of actions without engaging in the actual movements involved. In order to address this oversight, a "strength-based" approach has been postulated which is concerned with understanding those at the high-ability end of the imagery performance spectrum. Guided by the expert performance approach and principles of ecological validity, converging methods have the potential to enable imagery researchers to investigate the neural "signature" of elite performers, for example. Therefore, the purpose of this paper is to explain the origin, nature, and implications of the strength-based approach to mental imagery. Following a brief explanation of the background to this latter approach, we highlight some important theoretical advances yielded by recent research on mental practice, mental travel, and meta-imagery processes in expert athletes and dancers. Next, we consider the methodological implications of using a strength-based approach to investigate imagery processes. The implications for the field of motor cognition are outlined, and specific research questions in dynamic imagery, imagery perspective, measurement, multi-sensory imagery, and metacognition that may benefit from this approach in the future are sketched briefly.
PMID:23554591
Multi-Drafting Feedback Process in a Web-Based Environment
ERIC Educational Resources Information Center
Peled, Yehuda; Sarid, Miriam
2010-01-01
Purpose: The purpose of this paper is to explore the nature of multi-drafting among college students according to demographic characteristics and measure its impact on students' achievements. Design/methodology/approach: The research was conducted in two stages. First, a preliminary research based on data from the Highlearn web-based content…
Jones, Hendrée E.; Fischer, Gabriele; Heil, Sarah H.; Kaltenbach, Karol; Martin, Peter R.; Coyle, Mara G.; Selby, Peter; Stine, Susan M.; O’Grady, Kevin E.; Arria, Amelia M.
2015-01-01
Aims The Maternal Opioid Treatment: Human Experimental Research (MOTHER) project, an eight-site randomized, double-blind, double-dummy, flexible-dosing, parallel-group clinical trial is described. This study is the most current – and single most comprehensive – research effort to investigate the safety and efficacy of maternal and prenatal exposure to methadone and buprenorphine. Methods The MOTHER study design is outlined, and its basic features are presented. Conclusions At least seven important lessons have been learned from the MOTHER study: (1) an interdisciplinary focus improves the design and methods of a randomized clinical trial; (2) multiple sites in a clinical trial present continuing challenges to the investigative team due to variations in recruitment goals, patient populations, and hospital practices that in turn differentially impact recruitment rates, treatment compliance, and attrition; (3) study design and protocols must be flexible in order to meet the unforeseen demands of both research and clinical management; (4) staff turnover needs to be addressed with a proactive focus on both hiring and training; (5) the implementation of a protocol for the treatment of a particular disorder may identify important ancillary clinical issues worthy of investigation; (6) timely tracking of data in a multi-site trial is both demanding and unforgiving; and, (7) complex multi-site trials pose unanticipated challenges that complicate the choice of statistical methods, thereby placing added demands on investigators to effectively communicate their results. PMID:23106924
Data Mining Algorithms for Classification of Complex Biomedical Data
ERIC Educational Resources Information Center
Lan, Liang
2012-01-01
In my dissertation, I will present my research which contributes to solve the following three open problems from biomedical informatics: (1) Multi-task approaches for microarray classification; (2) Multi-label classification of gene and protein prediction from multi-source biological data; (3) Spatial scan for movement data. In microarray…
A Mixed Integer Linear Programming Approach to Electrical Stimulation Optimization Problems.
Abouelseoud, Gehan; Abouelseoud, Yasmine; Shoukry, Amin; Ismail, Nour; Mekky, Jaidaa
2018-02-01
Electrical stimulation optimization is a challenging problem. Even when a single region is targeted for excitation, the problem remains a constrained multi-objective optimization problem. The constrained nature of the problem results from safety concerns, while its multiple objectives originate from the requirement that non-targeted regions remain unaffected. In this paper, we propose a mixed integer linear programming formulation that successfully addresses these challenges. Moreover, the proposed framework can conclusively check the feasibility of the stimulation goals. This helps researchers avoid wasting time trying to achieve goals that are impossible under a chosen stimulation setup. The superiority of the proposed framework over alternative methods is demonstrated through simulation examples.
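The structure of the problem, binary electrode on/off choices plus bounded continuous currents, a target-activation constraint, and a conclusive feasibility check, can be sketched by brute-force enumeration as a stand-in for a MILP solver. The toy lead-field and all bounds are made-up assumptions, not the authors' formulation:

```python
import itertools
import numpy as np

# Toy lead-field: activation[region, electrode] per unit current (made-up numbers)
A = np.array([[0.9, 0.2, 0.1, 0.4],   # target region
              [0.3, 0.8, 0.2, 0.1],   # non-target region 1
              [0.1, 0.3, 0.7, 0.2]])  # non-target region 2
I_MAX = 2.0   # per-electrode safety bound on current (assumed)
TARGET = 1.0  # required activation in the target region (assumed)
MAX_ON = 2    # at most this many electrodes may carry current (integer constraint)

best = None
for on in itertools.combinations(range(A.shape[1]), MAX_ON):
    # With the active set fixed, currents are continuous; here a coarse grid
    # stands in for the LP over currents that a MILP solver would handle exactly.
    grid = np.linspace(0, I_MAX, 41)
    for i1 in grid:
        for i2 in grid:
            x = np.zeros(A.shape[1]); x[list(on)] = (i1, i2)
            act = A @ x
            if act[0] >= TARGET:          # target-activation constraint met
                cost = act[1:].sum()      # off-target activation to minimize
                if best is None or cost < best[0]:
                    best = (cost, on, x)

if best is None:
    print("infeasible under the chosen bounds")  # the feasibility check
else:
    cost, on, x = best
    print(f"active electrodes {on}, currents {np.round(x, 2)}, off-target {cost:.3f}")
```

An empty feasible set here corresponds to the MILP framework's conclusive infeasibility certificate: no admissible currents reach the target threshold under the safety bounds.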
On process optimization considering LCA methodology.
Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío
2012-04-15
The goal of this work is to survey the state of the art in process optimization techniques and tools based on LCA, focused on the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled to optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept for defining the system boundaries is the most used approach in practice, rather than the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is expressed linearly through the characterization factors; synergic effects of the contaminants are therefore neglected. Among the LCIA methods, eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most analyzed cases. SimaPro is the most used software for LCA applications in the literature analyzed. Multi-objective optimization is the most common approach for dealing with this kind of problem, and the ε-constraint method for generating the Pareto set is the most applied technique. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and progress made in assessing the costs of environmental externalities. Finally, a trend toward handling multi-period scenarios in integrated LCA-optimization frameworks can be distinguished, providing more accurate results where data are available. Copyright © 2011 Elsevier Ltd. All rights reserved.
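As a minimal illustration of the ε-constraint technique mentioned in this abstract (with invented cost/impact data, not drawn from the review), one objective is minimized while the other is bounded by a sweep of ε values, tracing out the Pareto set:

```python
# Invented (cost, environmental impact) pairs for four candidate designs;
# the epsilon-constraint method minimizes cost subject to impact <= eps.
DESIGNS = {
    "A": (10.0, 9.0),
    "B": (12.0, 6.0),
    "C": (15.0, 4.0),
    "D": (16.0, 8.0),  # dominated: costs more than B and pollutes more
}

def eps_constraint(eps):
    """Cheapest design whose impact satisfies the eps bound (None if none)."""
    feasible = {k: v for k, v in DESIGNS.items() if v[1] <= eps}
    if not feasible:
        return None
    return min(feasible, key=lambda k: feasible[k][0])

# Sweeping eps over the impact range traces out the Pareto set.
pareto = []
for eps in (9.0, 6.0, 4.0):
    best = eps_constraint(eps)
    if best is not None and best not in pareto:
        pareto.append(best)
```

Note how the dominated design "D" never appears in the sweep, which is exactly why the ε-constraint method is popular for generating Pareto sets.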
A Newton-Raphson Method Approach to Adjusting Multi-Source Solar Simulators
NASA Technical Reports Server (NTRS)
Snyder, David B.; Wolford, David S.
2012-01-01
NASA Glenn Research Center has been using an in-house-designed, X25-based multi-source solar simulator since 2003. The simulator is set up for triple-junction solar cells prior to measurements by adjusting the three sources to produce the correct short-circuit current, Isc, in each of three AM0-calibrated sub-cells. The past practice has been to adjust one source on one sub-cell at a time, iterating until all the sub-cells have the calibrated Isc. The new approach is to create a matrix of measured Isc for small source changes on each sub-cell, producing a matrix A. This is normalized to unit changes in the sources so that A·Δs = ΔIsc. This matrix can now be inverted and used with the known Isc differences from the AM0-calibrated values to indicate changes in the source settings, Δs = A⁻¹·ΔIsc. This approach is still an iterative one, but all sources are changed during each iteration step. It typically takes four to six steps to converge on the calibrated Isc values. Even though the source lamps may degrade over time, the initial matrix evaluation is not performed each time, since the measurement matrix needs to be only approximate. Because an iterative approach is used, the method will continue to be valid. This method may become more important as state-of-the-art solar cell junction responses overlap the sources of the simulator. Also, as the number of cell junctions and sources increases, this method should remain applicable.
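The described matrix iteration can be sketched as follows, assuming a linear three-source/three-sub-cell response; the response matrix R_true and the target currents are invented for illustration:

```python
import numpy as np

# Invented linear lamp response: Isc = R_true @ s. R_true is unknown to the
# operator, who can only measure Isc for a given source-setting vector s.
R_true = np.array([[1.0, 0.2, 0.1],
                   [0.1, 1.0, 0.3],
                   [0.2, 0.1, 1.0]])

def measure_isc(s):
    """Measured short-circuit currents of the three sub-cells."""
    return R_true @ s

def calibrate(s0, isc_target, tol=1e-6, max_iter=20):
    # Build the sensitivity matrix A once, from small source perturbations,
    # normalized to unit changes in the sources: A @ delta_s = delta_Isc.
    ds = 0.01
    base = measure_isc(s0)
    A = np.empty((3, 3))
    for j in range(3):
        sp = s0.copy()
        sp[j] += ds
        A[:, j] = (measure_isc(sp) - base) / ds
    s = s0.copy()
    for _ in range(max_iter):
        delta_isc = isc_target - measure_isc(s)
        if np.max(np.abs(delta_isc)) < tol:
            break
        s = s + np.linalg.solve(A, delta_isc)  # delta_s = A^-1 @ delta_Isc
    return s

s_cal = calibrate(np.array([1.0, 1.0, 1.0]), np.array([1.3, 1.4, 1.3]))
```

Because A only needs to be approximate, it is evaluated once and reused even as the lamps drift; the iteration absorbs the error, mirroring the point made in the abstract.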
Multi-country health surveys: are the analyses misleading?
Masood, Mohd; Reidpath, Daniel D
2014-05-01
The aim of this paper was to review the types of approaches currently utilized in the analysis of multi-country survey data, specifically design and modeling issues, focusing on analyses of significant multi-country surveys published in 2010. A systematic search strategy was used to identify 10 multi-country surveys and the articles published from them in 2010. The surveys were selected to reflect diverse topics and foci, and to provide an insight into analytic approaches across research themes. The search identified 159 articles appropriate for full-text review and data extraction. The analyses adopted in the multi-country surveys can be broadly classified as univariate/bivariate analyses and multivariate/multivariable analyses. Multivariate/multivariable analyses may be further divided into design- and model-based analyses. Of the 159 articles reviewed, 129 used model-based analyses and 30 used design-based analyses. Similar patterns could be seen in all the individual surveys. While there is general agreement among survey statisticians that complex surveys are most appropriately analyzed using design-based analyses, most researchers continued to use the more common model-based approaches. Recent developments in design-based multi-level analysis may be one approach to include all the survey design characteristics. This is a relatively new area, however, and statistical as well as applied analytic research remains to be done. An important limitation of this study relates to the selection of the surveys used and the choice of year for the analysis, i.e., year 2010 only. There is, however, no strong reason to believe that analytic strategies have changed radically in the past few years, and 2010 provides a credible snapshot of current practice.
Monson, Daniel H.; Bowen, Lizabeth
2015-01-01
Overall, a variety of indices used to measure population status throughout the sea otter's range have provided insights into the mechanisms driving the trajectory of various sea otter populations that a single index could not, and we suggest using multiple methods to measure a population's status at multiple spatial and temporal scales. The work described here also illustrates the usefulness of long-term data sets and/or approaches that can be used to assess population status retrospectively, providing information otherwise not available. While not all systems will be as amenable to using all the approaches presented here, we expect innovative researchers could adapt analogous multi-scale methods to a broad range of habitats and species, including apex predators occupying the top trophic levels, which are often of conservation concern.
Farquharson, Kelly; Murphy, Kimberly A.
2016-01-01
Purpose: This paper describes methodological procedures involving execution of a large-scale, multi-site longitudinal study of language and reading comprehension in young children. Researchers in the Language and Reading Research Consortium (LARRC) developed and implemented these procedures to ensure data integrity across multiple sites, schools, and grades. Specifically, major features of our approach, as well as lessons learned, are summarized in 10 steps essential for successful completion of a large-scale longitudinal investigation in early grades. Method: Over 5 years, children in preschool through third grade were administered a battery of 35 higher- and lower-level language, listening, and reading comprehension measures (RCM). Data were collected from children, their teachers, and their parents/guardians at four sites across the United States. Substantial and rigorous effort was aimed toward maintaining consistency in processes and data management across sites for children, assessors, and staff. Conclusion: With appropriate planning, flexibility, and communication strategies in place, LARRC developed and executed a successful multi-site longitudinal research study that will meet its goal of investigating the contribution and role of language skills in the development of children's listening and reading comprehension. Through dissemination of our design strategies and lessons learned, research teams embarking on similar endeavors can be better equipped to anticipate the challenges. PMID:27064308
Cross-cultural adaptation of instruments assessing breastfeeding determinants: a multi-step approach
2014-01-01
Background Cross-cultural adaptation is a necessary process to effectively use existing instruments in other cultural and language settings. The process of cross-culturally adapting, including translating, existing instruments is considered a critical step in establishing a meaningful instrument for use in another setting. Using a multi-step approach is considered best practice for achieving cultural and semantic equivalence of the adapted version. We aimed to ensure the content validity of our instruments in the cultural context of KwaZulu-Natal, South Africa. Methods The Iowa Infant Feeding Attitudes Scale, the Breastfeeding Self-Efficacy Scale-Short Form and additional items comprise our consolidated instrument, which was cross-culturally adapted using a multi-step approach during August 2012. Cross-cultural adaptation was achieved through steps to maintain content validity and attain semantic equivalence in the target version. Specifically, Lynn's recommendation to apply an item-level content validity index score was followed. The revised instrument was translated and back-translated. To ensure semantic equivalence, Brislin's back-translation approach was utilized, followed by committee review to address any discrepancies that emerged from translation. Results Our consolidated instrument was adapted to be culturally relevant and translated to yield more reliable and valid results for use in our larger research study to measure infant feeding determinants effectively in our target cultural context. Conclusions Undertaking rigorous steps to effectively ensure cross-cultural adaptation increases our confidence that the conclusions we make based on our self-report instrument(s) will be stronger.
In this way, our aim to achieve strong cross-cultural adaptation of our consolidated instruments was achieved while also providing a clear framework for other researchers choosing to utilize existing instruments for work in other cultural, geographic and population settings. PMID:25285151
Methods and Metrics for Evaluating Environmental Dredging ...
This report documents the objectives, approach, methodologies, results, and interpretation of a collaborative research study conducted by the National Risk Management Research Laboratory (NRMRL) and the National Exposure Research Laboratory (NERL) of the U.S. Environmental Protection Agency’s (U.S. EPA’s) Office of Research and Development (ORD) and the U.S. EPA’s Great Lakes National Program Office (GLNPO). The objectives of the research study were to: 1) evaluate remedy effectiveness of environmental dredging as applied to contaminated sediments in the Ashtabula River in northeastern Ohio, and 2) monitor the recovery of the surrounding ecosystem. The project was carried out over 6 years from 2006 through 2011 and consisted of the development and evaluation of methods and approaches to assess river and ecosystem conditions prior to dredging (2006), during dredging (2006 and 2007), and following dredging, both short term (2008) and long term (2009-2011). This project report summarizes and interprets the results of this 6-year study to develop and assess methods for monitoring pollutant fate and transport and ecosystem recovery through the use of biological, chemical, and physical lines of evidence (LOEs) such as: 1) comprehensive sampling of and chemical analysis of contaminants in surface, suspended, and historic sediments; 2) extensive grab and multi-level real time water sampling and analysis of contaminants in the water column; 3) sampling, chemi
Wang, Wei; Song, Wei-Guo; Liu, Shi-Xing; Zhang, Yong-Ming; Zheng, Hong-Yang; Tian, Wei
2011-04-01
An improved cloud detection method combining K-means clustering and a multi-spectral threshold approach is described. On the basis of landmark spectrum analysis, MODIS data are initially categorized into two major classes by the K-means method. The first class includes clouds, smoke and snow, and the second class includes vegetation, water and land. A multi-spectral threshold detection is then applied to the first class to eliminate interference such as smoke and snow. The method was tested with MODIS data at different times under different underlying surface conditions. Visual assessment of the algorithm's performance showed that it can effectively detect small areas of cloud pixels and exclude the interference of the underlying surface, which provides a good foundation for a subsequent fire detection approach.
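A toy version of the two-stage scheme, assuming invented single-pixel values for a visible-reflectance band and a brightness-temperature band (the real method operates on full MODIS granules and several spectral tests):

```python
import numpy as np

def kmeans_two_class(x, iters=20):
    """1-D K-means with k=2, initialized at the data extremes."""
    c = np.array([x.min(), x.max()], dtype=float)
    labels = np.zeros(len(x), dtype=int)
    for _ in range(iters):
        labels = np.abs(x[:, None] - c[None, :]).argmin(axis=1)
        for k in range(2):
            if np.any(labels == k):
                c[k] = x[labels == k].mean()
    return labels, c

# Invented per-pixel values: visible reflectance and brightness temperature (K).
vis = np.array([0.90, 0.85, 0.80, 0.15, 0.10, 0.82, 0.88])
bt = np.array([250.0, 255.0, 280.0, 290.0, 295.0, 252.0, 300.0])

# Stage 1: K-means splits pixels into a bright class (cloud/smoke/snow)
# and a dark class (vegetation/water/land) on the reflectance band.
labels, c = kmeans_two_class(vis)
bright = labels == np.argmax(c)

# Stage 2: a spectral threshold on the bright class removes warm, bright
# pixels -- an illustrative stand-in for the paper's smoke/snow tests.
cloud = bright & (bt < 270.0)
```

The cheap clustering pass narrows the candidate set so that the more specific spectral thresholds only have to discriminate within the bright class.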
Kahl, Johannes; Bodroza-Solarov, Marija; Busscher, Nicolaas; Hajslova, Jana; Kneifel, Wolfgang; Kokornaczyk, Maria Olga; van Ruth, Saskia; Schulzova, Vera; Stolz, Peter
2014-10-01
Organic food quality determination needs multi-dimensional evaluation tools. The main focus is on authentication as an analytical verification of the certification process. New fingerprinting approaches such as ultra-performance liquid chromatography-mass spectrometry, gas chromatography-mass spectrometry, direct analysis in real time-high-resolution mass spectrometry, as well as crystallization with and without the presence of additives, seem to be promising methods in terms of time of analysis and detection of organic system-related parameters. For further methodological development, a system approach is recommended, which also takes into account food structure aspects. Furthermore, the authentication of processed organic samples needs more attention, since most organic food is complex and processed. © 2013 Society of Chemical Industry.
A multi-state trajectory method for non-adiabatic dynamics simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tao, Guohua, E-mail: taogh@pkusz.edu.cn
2016-03-07
A multi-state trajectory approach is proposed to describe nuclear-electron coupled dynamics in nonadiabatic simulations. In this approach, each electronic state is associated with an individual trajectory, among which electronic transition occurs. The set of these individual trajectories constitutes a multi-state trajectory, and nuclear dynamics is described by one of these individual trajectories as the system is on the corresponding state. The total nuclear-electron coupled dynamics is obtained from the ensemble average of the multi-state trajectories. A variety of benchmark systems such as the spin-boson system have been tested and the results generated using the quasi-classical version of the method show reasonably good agreement with the exact quantum calculations. Featured in a clear multi-state picture, high efficiency, and excellent numerical stability, the proposed method may have advantages in being implemented to realistic complex molecular systems, and it could be straightforwardly applied to general nonadiabatic dynamics involving multiple states.
NASA Technical Reports Server (NTRS)
Montgomery, Raymond C.; Granda, Jose J.
2003-01-01
Conceptually, modeling of flexible, multi-body systems involves a formulation as a set of time-dependent partial differential equations. However, for practical engineering purposes, this modeling is usually done using the method of Finite Elements, which approximates the set of partial differential equations, thus generalizing the approach to all continuous media. This research investigates the links between the Bond Graph method and the classical methods used to develop system models, and advocates the Bond Graph methodology and current bond graph tools as alternate approaches that will lead to a quick and precise understanding of a flexible multi-body system under automatic control. For long-endurance, complex spacecraft, the model of the physical system may change frequently because of articulation and mission evolution. A method of automatic generation and regeneration of system models that does not lead to implicit equations, as the Lagrange equation approach does, is therefore desirable. The bond graph method has been shown to be amenable to automatic generation of equations with appropriate consideration of causality. Indeed, human-interactive software now exists that automatically generates both symbolic and numeric system models and evaluates causality as the user develops the model, e.g. the CAMP-G software package. In this paper the CAMP-G package is used to generate a bond graph model of the International Space Station (ISS) at an early stage in its assembly, Zvezda. The ISS is an ideal example because it is a collection of articulated bodies, many of which are highly flexible. Also, many reaction jets are used to control translation and attitude, and many electric motors are used to articulate appendages, which consist of photovoltaic arrays and composite assemblies. The Zvezda bond graph model is compared to an existing model generated by the NASA Johnson Space Center during the Verification and Analysis Cycle of Zvezda.
Ringstad, Oystein
2010-08-01
This paper presents and evaluates a methodological approach aimed at analysing some of the complex interactions between patients and different health care practitioners working together in teams. Qualitative health care research describes the values, perceptions and conceptions of patients and practitioners. In modern clinical work, patients and professional practitioners often work together on complex cases involving different kinds of knowledge and values, each of them representing different perspectives. We need studies designed to capture this complexity. The methodological approach presented here is exemplified with a study in rehabilitation medicine. In this part of the health care system, clinical work is organized in multi-professional clinical teams, including patients, handling complex rehabilitation processes. In the presented approach, data are collected in individual in-depth interviews to obtain thorough descriptions of each individual perspective. The interaction in the teams is analysed by comparing different descriptions of the same situations from the involved individuals. We may then discuss how these perceptions relate to each other and how the individuals in the team interact. Two examples from an empirical study are presented and discussed, illustrating how communication, differences in evaluations, and the interpretation of incidents, arguments, emotions and interpersonal relations may be discussed. It is argued that this approach may provide information that can supplement the methods commonly applied in qualitative health care research today.
NASA Astrophysics Data System (ADS)
Ayadi, Omar; Felfel, Houssem; Masmoudi, Faouzi
2017-07-01
The current manufacturing environment has changed from traditional single-plant to multi-site supply chains where multiple plants serve customer demands. In this article, a tactical multi-objective, multi-period, multi-product, multi-site supply-chain planning problem is proposed. A corresponding optimization model aiming to simultaneously minimize the total cost, maximize product quality and maximize the customer demand satisfaction level is developed. The proposed solution approach yields a front of Pareto-optimal solutions that represents the trade-offs among the different objectives. Subsequently, the analytic hierarchy process method is applied to select the best Pareto-optimal solution according to the preferences of the decision maker. The robustness of the solutions and the proposed approach are discussed based on a sensitivity analysis and an application to a real case from the textile and apparel industry.
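The analytic-hierarchy-process selection step can be sketched as follows, with an invented pairwise-comparison matrix over three objectives and invented normalized scores for three Pareto-optimal plans; the priority weights are the principal eigenvector of the comparison matrix, obtained here by power iteration:

```python
import numpy as np

# Invented AHP pairwise-comparison matrix over three objectives
# (cost, quality, demand satisfaction): a_ij = importance of i over j.
A = np.array([[1.0, 3.0, 5.0],
              [1.0 / 3.0, 1.0, 3.0],
              [1.0 / 5.0, 1.0 / 3.0, 1.0]])

# Priority weights = principal eigenvector of A, found by power iteration
# and normalized to sum to one.
w = np.ones(3)
for _ in range(100):
    w = A @ w
    w /= w.sum()

# Invented normalized scores of three Pareto-optimal plans (rows) on the
# three objectives; the plan with the best weighted score is selected.
pareto = np.array([[0.9, 0.4, 0.5],
                   [0.6, 0.8, 0.6],
                   [0.4, 0.6, 0.9]])
best = int(np.argmax(pareto @ w))
```

With these invented preferences, cost dominates the weighting, so the cheapest-leaning plan wins; changing the comparison matrix changes which Pareto point is selected, which is exactly the decision-maker dependence the abstract describes.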
A Multigroup Method for the Calculation of Neutron Fluence with a Source Term
NASA Technical Reports Server (NTRS)
Heinbockel, J. H.; Clowdsley, M. S.
1998-01-01
Current research under this grant involves the development of a multigroup method for the calculation of low energy evaporation neutron fluences associated with the Boltzmann equation. This research will enable one to predict radiation exposure under a variety of circumstances. Knowledge of radiation exposure in a free-space environment is a necessity for space travel, high-altitude space planes and satellite design, because certain radiation environments can cause damage to biological and electronic systems, involving both short-term and long-term effects. With a priori knowledge of the environment, one can use prediction techniques to estimate radiation damage to such systems. Appropriate shielding can then be designed to protect both humans and electronic systems that are exposed to a known radiation environment. This is the goal of the current research efforts involving the multigroup method and the Green's function approach.
An Active Learning Approach to Teach Advanced Multi-Predictor Modeling Concepts to Clinicians
ERIC Educational Resources Information Center
Samsa, Gregory P.; Thomas, Laine; Lee, Linda S.; Neal, Edward M.
2012-01-01
Clinicians have characteristics--high scientific maturity, low tolerance for symbol manipulation and programming, limited time outside of class--that limit the effectiveness of traditional methods for teaching multi-predictor modeling. We describe an active-learning based approach that shows particular promise for accommodating these…
Kumar, Dushyant; Hariharan, Hari; Faizy, Tobias D; Borchert, Patrick; Siemonsen, Susanne; Fiehler, Jens; Reddy, Ravinder; Sedlacik, Jan
2018-05-12
We present a computationally feasible and iterative multi-voxel spatially regularized algorithm for myelin water fraction (MWF) reconstruction. This method utilizes 3D spatial correlations present in anatomical/pathological tissues and the underlying B1+ (flip angle) inhomogeneity to enhance the noise robustness of the reconstruction, while intrinsically accounting for stimulated echo contributions using T2-distribution data alone. Simulated data and in vivo data acquired using 3D non-selective multi-echo spin echo (3DNS-MESE) were used to compare the reconstruction quality of the proposed approach against those of the popular algorithm (the method by Prasloski et al.) and our previously proposed 2D multi-slice spatial regularization approach. We also investigated whether the inter-sequence correlations and agreements improved as a result of the proposed approach. MWF quantifications from two sequences, 3DNS-MESE vs 3DNS-gradient and spin echo (3DNS-GRASE), were compared for both reconstruction approaches to assess correlations and agreements between inter-sequence MWF-value pairs. MWF values from whole-brain data of six volunteers and two multiple sclerosis patients are reported as well. In comparison with competing approaches such as Prasloski's method or our previously proposed 2D multi-slice spatial regularization method, the proposed method showed better agreement with simulated truths in regression and Bland-Altman analyses. For 3DNS-MESE data, MWF maps reconstructed using the proposed algorithm provided better depictions of white matter structures in subcortical areas adjoining gray matter, which agreed more closely with corresponding contrasts on T2-weighted images than MWF maps reconstructed with the method by Prasloski et al. We also achieved a higher level of correlations and agreements between inter-sequence (3DNS-MESE vs 3DNS-GRASE) MWF-value pairs.
The proposed algorithm provides more noise-robust fits to T2-decay data and improves MWF-quantifications in white matter structures especially in the sub-cortical white matter and major white matter tract regions. Copyright © 2018 Elsevier Inc. All rights reserved.
A novel method for overlapping community detection using Multi-objective optimization
NASA Astrophysics Data System (ADS)
Ebrahimi, Morteza; Shahmoradi, Mohammad Reza; Heshmati, Zainabolhoda; Salehi, Mostafa
2018-09-01
The problem of community detection, one of the most important applications of network science, can be addressed effectively by multi-objective optimization. In this paper, we present a novel, efficient method based on this approach, and introduce the idea of using all Pareto fronts to detect overlapping communities. The proposed method has two main advantages compared to other multi-objective optimization based approaches: the first is scalability, and the second is the ability to find overlapping communities, which most existing methods cannot do effectively. The new algorithm works by extracting appropriate communities from all the Pareto-optimal solutions, instead of choosing a single optimal solution. Empirical experiments on different features of separated and overlapping communities, on both synthetic and real networks, show that the proposed method performs better in comparison with other methods.
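The idea of keeping all Pareto fronts, rather than only the first, rests on successive non-dominated sorting. A minimal sketch for two maximized objectives follows; the objective values are invented, standing in for community-quality scores:

```python
def dominates(a, b):
    """True if point a Pareto-dominates point b (all objectives maximized)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_fronts(points):
    """Peel off successive non-dominated fronts; front 0 is the Pareto set."""
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Invented 2-objective scores for five candidate communities.
scores = [(3, 1), (1, 3), (2, 2), (1, 1), (0, 0)]
fronts = pareto_fronts(scores)
```

In the paper's scheme, candidate communities would be harvested from every front in `fronts`, not just `fronts[0]`, which is what lets a node belong to more than one extracted community.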
Goulart Coelho, Lineker M; Lange, Liséte C; Coelho, Hosmanny Mg
2017-01-01
Solid waste management is a complex domain involving the interaction of several dimensions; thus, its analysis and control impose continuous challenges for decision makers. In this context, multi-criteria decision-making models have become important and convenient supporting tools for solid waste management because they can handle problems involving multiple dimensions and conflicting criteria. However, selecting a multi-criteria decision-making method is a hard task, since there are several multi-criteria decision-making approaches, each with a large number of variants whose applicability depends on information availability and the aim of the study. Therefore, to support researchers and decision makers, the objectives of this article are to present a literature review of multi-criteria decision-making applications used in solid waste management, offer a critical assessment of current practices, and provide suggestions for future work. A brief review of fundamental concepts on this topic is first provided, followed by the analysis of 260 articles related to the application of multi-criteria decision making in solid waste management. These studies were investigated in terms of methodology, including specific steps such as normalisation, weighting, and sensitivity analysis. In addition, information related to waste type, study objective, and aspects considered was recorded. From the articles analysed, it is noted that studies using multi-criteria decision making in solid waste management predominantly address problems related to municipal solid waste involving facility location or management strategy.
A Monte-Carlo game theoretic approach for Multi-Criteria Decision Making under uncertainty
NASA Astrophysics Data System (ADS)
Madani, Kaveh; Lund, Jay R.
2011-05-01
Game theory provides a useful framework for studying Multi-Criteria Decision Making problems. This paper suggests modeling Multi-Criteria Decision Making problems as strategic games and solving them using non-cooperative game theory concepts. The suggested method can be used to prescribe non-dominated solutions and also to predict the outcome of a decision making problem. Non-cooperative stability definitions for solving the games allow consideration of non-cooperative behaviors, often neglected by other methods, which assume perfect cooperation among decision makers. To deal with uncertainty in input variables, a Monte-Carlo Game Theory (MCGT) approach is suggested which maps the stochastic problem into many deterministic strategic games. The games are solved using non-cooperative stability definitions, and the results include possible effects of uncertainty in input variables on outcomes. The method can handle multi-criteria, multi-decision-maker problems with uncertainty. The suggested method does not require criteria weighting, a compound decision objective, or accurate quantitative (cardinal) information, as it simplifies the decision analysis by solving problems based on qualitative (ordinal) information, reducing the computational burden substantially. The MCGT method is applied to analyze California's Sacramento-San Joaquin Delta problem. The suggested method provides insights, identifies non-dominated alternatives, and predicts likely decision outcomes.
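A minimal sketch of the MCGT idea for a two-player game: sample the uncertain payoffs many times, find the pure-strategy Nash equilibria of each deterministic game, and tally how often each outcome is stable. The payoff structure below is an invented prisoner's-dilemma-like example, not the Delta problem from the paper:

```python
import random

def pure_nash(payoffs):
    """Pure-strategy Nash equilibria of a two-player game.
    payoffs[(i, j)] = (payoff to player 1, payoff to player 2)."""
    rows = {i for i, _ in payoffs}
    cols = {j for _, j in payoffs}
    eq = []
    for i in rows:
        for j in cols:
            p1, p2 = payoffs[(i, j)]
            if (all(payoffs[(k, j)][0] <= p1 for k in rows) and
                    all(payoffs[(i, k)][1] <= p2 for k in cols)):
                eq.append((i, j))
    return eq

def monte_carlo_games(n_samples, sample_game, seed=0):
    """Map the stochastic game into many deterministic ones; tally which
    outcomes are non-cooperatively stable and how often."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(n_samples):
        for e in pure_nash(sample_game(rng)):
            counts[e] = counts.get(e, 0) + 1
    return counts

def noisy_game(rng):
    """Invented two-strategy game with an uncertain temptation payoff t."""
    t = 4.0 + rng.random()
    return {(0, 0): (3, 3), (0, 1): (0, t), (1, 0): (t, 0), (1, 1): (1, 1)}

stable = monte_carlo_games(500, noisy_game)
```

The resulting counts give an ordinal picture of which outcomes stay stable across the uncertainty range, without any criteria weighting or cardinal aggregation.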
Fleisher, Linda; Ruggieri, Dominique G.; Miller, Suzanne M.; Manne, Sharon; Albrecht, Terrance; Buzaglo, Joanne; Collins, Michael A.; Katz, Michael; Kinzy, Tyler G.; Liu, Tasnuva; Manning, Cheri; Charap, Ellen Specker; Millard, Jennifer; Miller, Dawn M.; Poole, David; Raivitch, Stephanie; Roach, Nancy; Ross, Eric A.; Meropol, Neal J.
2014-01-01
Objective This article describes the rigorous development process and initial feedback of the PRE-ACT (Preparatory Education About Clinical Trials) web-based intervention designed to improve preparation for decision making in cancer clinical trials. Methods The multi-step process included stakeholder input, formative research, user testing and feedback. Diverse teams (researchers, advocates and developers) participated in content refinement, identification of actors, and development of video scripts. Patient feedback was provided in the final production period and through a vanguard group (N = 100) from the randomized trial. Results Patients/advocates confirmed barriers to cancer clinical trial participation, including lack of awareness and knowledge, fear of side effects, logistical concerns, and mistrust. Patients indicated they liked the tool’s user-friendly nature, the organized and comprehensive presentation of the subject matter, and the clarity of the videos. Conclusion The development process serves as an example of operationalizing best practice approaches and highlights the value of a multi-disciplinary team in developing a theory-based, sophisticated tool that patients found useful in their decision making process. Practice implications Best practice approaches are important to ensure evidence-based tools that are of value to patients, and they support the usefulness of a process map in the development of e-health tools. PMID:24813474
A Matter of Scale: Multi-Scale Ethnographic Research on Education in the United States
ERIC Educational Resources Information Center
Eisenhart, Margaret
2017-01-01
In recent years, cultural anthropologists conducting educational ethnographies in the US have pursued some new methodological approaches. These new approaches can be attributed to advances in cultural theory, evolving norms of research practice, and the affordances of new technologies. In this article, I review three such approaches under the…
Feedback linearizing control of a MIMO power system
NASA Astrophysics Data System (ADS)
Ilyes, Laszlo
Prior research has demonstrated that either the mechanical or electrical subsystem of a synchronous electric generator may be controlled using single-input single-output (SISO) nonlinear feedback linearization. This research suggests a new approach that applies nonlinear feedback linearization to a multi-input multi-output (MIMO) model of the synchronous electric generator connected to an infinite bus load model. In this way, the electrical and mechanical subsystems may be linearized and simultaneously decoupled through the introduction of a pair of auxiliary inputs. This allows well known, linear, SISO control methods to be effectively applied to the resulting systems. The derivation of the feedback linearizing control law is presented in detail, including a discussion on the use of symbolic math processing as a development tool. The linearizing and decoupling properties of the control law are validated through simulation. Finally, the robustness of the control law is demonstrated.
Parametric study using modal analysis of a bi-material plate with defects
NASA Astrophysics Data System (ADS)
Esola, S.; Bartoli, I.; Horner, S. E.; Zheng, J. Q.; Kontsos, A.
2015-03-01
The feasibility of a global vibrational method as a non-destructive inspection tool for multi-layered composites is evaluated using a simulated parametric study approach. A finite element model of a composite consisting of two isotropic layers of dissimilar materials and a third, thin isotropic layer of adhesive is constructed as the representative test subject. Next, artificial damage is inserted according to systematic variations of the defect morphology parameters. A free-vibrational modal analysis simulation is executed for pristine and damaged plate conditions. Finally, resultant mode shapes and natural frequencies are extracted, compared and analyzed for trends. Though other defect types may be explored, the focus of this research is on interfacial delamination and its effects on the global, free-vibrational behavior of a composite plate. This study is part of a multi-year research effort conducted for the U.S. Army Program Executive Office - Soldier.
Compact polarimetric synthetic aperture radar for monitoring soil moisture condition
NASA Astrophysics Data System (ADS)
Merzouki, A.; McNairn, H.; Powers, J.; Friesen, M.
2017-12-01
Coarse-resolution soil moisture maps are currently delivered operationally by ESA's SMOS and NASA's SMAP passive microwave sensors. Despite this evolution, operational soil moisture monitoring at the field scale remains challenging. A number of factors contribute to this challenge, including the complexity of the retrieval, which requires advanced SAR systems with enhanced temporal revisit capabilities. Since the launch of RADARSAT-2 in 2007, Agriculture and Agri-Food Canada (AAFC) has been evaluating the accuracy of these data for estimating surface soil moisture. A hybrid (multi-angle/multi-polarization) retrieval approach was found to be well suited for the planned RADARSAT Constellation Mission (RCM), considering the more frequent relook expected with its three-satellite configuration. The purpose of this study is to evaluate the capability of C-band compact polarimetric (CP) data to estimate soil moisture over agricultural fields, in anticipation of the launch of RCM. In this research we introduce a new CP approach based on the IEM and simulated RCM CP mode intensities from RADARSAT-2 images acquired at different dates. The accuracy of soil moisture retrieval from the proposed multi-polarization and hybrid methods will be contrasted with that from a more conventional quad-pol approach, and validated against in situ measurements by pooling data collected over AAFC test sites in Ontario, Manitoba and Saskatchewan, Canada.
Hjelm, Markus; Holmgren, Ann-Charlotte; Willman, Ania; Bohman, Doris; Holst, Göran
2015-01-01
Background Family members of older persons (75+) with multi-morbidity are likely to benefit from utilising case management services performed by case managers. However, research has not yet explored their experiences of case managers. Objectives The aim of the study was to deepen the understanding of the importance of case managers to family members of older persons (75+) with multi-morbidity. Design The study design was based on an interpretive phenomenological approach. Method Data were collected through individual interviews with 16 family members in Sweden. The interviews were analysed by means of an interpretive phenomenological approach. Results The findings revealed one overarching theme: “Helps to fulfil my unmet needs”, based on three sub-themes: (1) “Helps me feel secure – Experiencing a trusting relationship”, (2) “Confirms and strengthens me – Challenging my sense of being alone” and (3) “Being my personal guide – Increasing my competence”. Conclusion and discussion The findings indicate that case managers were able to fulfil unmet needs of family members. The latter recognised the importance of case managers providing them with professional services tailored to their individual needs. The findings can contribute to the improvement of case management models not only for older persons but also for their family members. PMID:25918497
Evaluation of prostate segmentation algorithms for MRI: the PROMISE12 challenge
Litjens, Geert; Toth, Robert; van de Ven, Wendy; Hoeks, Caroline; Kerkstra, Sjoerd; van Ginneken, Bram; Vincent, Graham; Guillard, Gwenael; Birbeck, Neil; Zhang, Jindang; Strand, Robin; Malmberg, Filip; Ou, Yangming; Davatzikos, Christos; Kirschner, Matthias; Jung, Florian; Yuan, Jing; Qiu, Wu; Gao, Qinquan; Edwards, Philip “Eddie”; Maan, Bianca; van der Heijden, Ferdinand; Ghose, Soumya; Mitra, Jhimli; Dowling, Jason; Barratt, Dean; Huisman, Henkjan; Madabhushi, Anant
2014-01-01
Prostate MRI image segmentation has been an area of intense research due to the increased use of MRI as a modality for the clinical workup of prostate cancer. Segmentation is useful for various tasks, e.g. to accurately localize prostate boundaries for radiotherapy or to initialize multi-modal registration algorithms. In the past, it has been difficult for research groups to evaluate prostate segmentation algorithms on multi-center, multi-vendor and multi-protocol data. Especially because these are MR images, image appearance, resolution and the presence of artifacts are affected by differences in scanners and/or protocols, which in turn can have a large influence on algorithm accuracy. The Prostate MR Image Segmentation (PROMISE12) challenge was set up to allow a fair and meaningful comparison of segmentation methods on the basis of performance and robustness. In this work we discuss the initial results of the online PROMISE12 challenge, and the results obtained in the live challenge workshop hosted by the MICCAI 2012 conference. In the challenge, 100 prostate MR cases from 4 different centers were included, with differences in scanner manufacturer, field strength and protocol. A total of 11 teams from academic research groups and industry participated. Algorithms showed a wide variety in methods and implementation, including active appearance models, atlas registration and level sets. Evaluation was performed using boundary- and volume-based metrics, which were combined into a single score relating the metrics to human expert performance. The winners of the challenge were the algorithms by teams Imorphics and ScrAutoProstate, with overall scores of 85.72 and 84.29. Both algorithms were significantly better than all other algorithms in the challenge (p < 0.05) and had efficient implementations, with run times of 8 minutes and 3 seconds per case, respectively.
Overall, active appearance model based approaches seemed to outperform other approaches like multi-atlas registration, both on accuracy and computation time. Although average algorithm performance was good to excellent and the Imorphics algorithm outperformed the second observer on average, we showed that algorithm combination might lead to further improvement, indicating that optimal performance for prostate segmentation is not yet obtained. All results are available online at http://promise12.grand-challenge.org/. PMID:24418598
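Volume-overlap metrics of the kind combined into the challenge score can be illustrated with the Dice coefficient. The masks below are toy 8x8 arrays, not challenge data.

```python
import numpy as np

def dice(seg, ref):
    """Dice overlap between two binary segmentation masks (1 = perfect)."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    return 2.0 * np.logical_and(seg, ref).sum() / (seg.sum() + ref.sum())

# Toy 2-D masks standing in for an algorithm contour vs. an expert contour.
ref = np.zeros((8, 8), dtype=bool); ref[2:6, 2:6] = True   # 16 "voxels"
seg = np.zeros((8, 8), dtype=bool); seg[3:7, 2:6] = True   # shifted one row
print(dice(seg, ref))  # 2*12 / (16+16) = 0.75
```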
Agyepong, Irene Akua; Kwamie, Aku; Frimpong, Edith; Defor, Selina; Ibrahim, Abdallah; Aryeetey, Genevieve C; Lokossou, Virgil; Sombie, Issiaka
2017-07-12
Despite improvements over time, West Africa lags behind global as well as sub-Saharan averages in its maternal, newborn and child health (MNCH) outcomes. This is despite the availability of an increasing body of knowledge on interventions that improve such outcomes. Beyond our knowledge of what interventions work, insights are needed on others factors that facilitate or inhibit MNCH outcome improvement. This study aimed to explore health system factors conducive or limiting to MNCH policy and programme implementation and outcomes in West Africa, and how and why they work in context. We conducted a mixed methods multi-country case study focusing predominantly, but not exclusively, on the six West African countries (Burkina Faso, Benin, Mali, Senegal, Nigeria and Ghana) of the Innovating for Maternal and Child Health in Africa initiative. Data collection involved non-exhaustive review of grey and published literature, and 48 key informant interviews. We validated our findings and conclusions at two separate multi-stakeholder meetings organised by the West African Health Organization. To guide our data collection and analysis, we developed a unique theoretical framework of the link between health systems and MNCH, in which we conceptualised health systems as the foundations, pillars and roofing of a shelter for MNCH, and context as the ground on which the foundation is laid. A multitude of MNCH policies and interventions were being piloted, researched or implemented at scale in the sub-region, most of which faced multiple interacting conducive and limiting health system factors to effective implementation, as well as contextual challenges. Context acted through its effect on health system factors as well as on the social determinants of health. 
To accelerate and sustain improvements in MNCH outcomes in West Africa, an integrated approach to research and practice of simultaneously addressing health systems and contextual factors alongside MNCH service delivery interventions is needed. This requires multi-level, multi-sectoral and multi-stakeholder engagement approaches that span current geographical, language, research and practice community boundaries in West Africa, and effectively link the efforts of actors interested in health systems strengthening with those of actors interested in MNCH outcome improvement.
ERIC Educational Resources Information Center
Marulanda Ángel, Nora Lucía; Martínez García, Juan Manuel
2017-01-01
The demands of the academic field and the constraints students have while learning how to write appropriately call for better approaches to teach academic writing. This research study examines the effect of a multifaceted academic writing module on pre-service teachers' composition skills in an English teacher preparation program at a medium sized…
Research status of multi-robot systems task allocation and uncertainty treatment
NASA Astrophysics Data System (ADS)
Li, Dahui; Fan, Qi; Dai, Xuefeng
2017-08-01
Multi-robot coordination algorithms have become a hot research topic in the field of robotics in recent years, with a wide range of applications and good application prospects. This paper analyzes and summarizes the current research status of multi-robot coordination algorithms, both domestic and international. Focusing on task allocation and the treatment of uncertainty, it discusses commonly used multi-robot coordination algorithms and presents the advantages and disadvantages of each method.
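As a concrete, minimal instance of the task-allocation problem the survey discusses, here is a greedy single-item assignment over a robot-task cost matrix. This is a generic baseline, not an algorithm from the surveyed literature, and the costs are invented.

```python
import numpy as np

def greedy_allocate(cost):
    """Repeatedly assign the cheapest remaining robot-task pair."""
    cost = cost.astype(float).copy()
    assignment = {}
    for _ in range(min(cost.shape)):
        r, t = np.unravel_index(np.argmin(cost), cost.shape)
        assignment[int(r)] = int(t)
        cost[r, :] = np.inf   # robot r is now committed
        cost[:, t] = np.inf   # task t is now taken
    return assignment

# Invented travel-cost matrix: rows = robots, columns = tasks.
costs = np.array([[4.0, 1.0, 3.0],
                  [2.0, 0.5, 5.0],
                  [3.0, 2.0, 2.0]])
print(greedy_allocate(costs))  # {1: 1, 2: 2, 0: 0}
```

Market-based (auction) methods refine this greedy pattern with bidding and re-bidding to escape locally poor assignments.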
Raman, Ritu; Mitchell, Marlon; Perez-Pinera, Pablo; Bashir, Rashid; DeStefano, Lizanne
2016-01-01
The rapidly evolving discipline of biological and biomedical engineering requires adaptive instructional approaches that teach students to target and solve multi-pronged and ill-structured problems at the cutting edge of scientific research. Here we present a modular approach to designing a lab-based course in the emerging field of biofabrication and biological design, leading to a final capstone design project that requires students to formulate and test a hypothesis using the scientific method. Students were assessed on a range of metrics designed to evaluate the format of the course, the efficacy of the format for teaching new topics and concepts, and the depth of the contribution this course made to students' training for biological engineering careers. The evaluation showed that the problem-based format of the course was well suited to teaching students how to use the scientific method to investigate and uncover the fundamental biological design rules that govern the field of biofabrication. We show that this approach is an efficient and effective method of translating emergent scientific principles from the lab bench to the classroom and training the next generation of biological and biomedical engineers for careers as researchers and industry practitioners.
Equipment Selection by using Fuzzy TOPSIS Method
NASA Astrophysics Data System (ADS)
Yavuz, Mahmut
2016-10-01
In this study, the Fuzzy TOPSIS method was applied to the selection of open-pit trucks and the optimal solution of the problem was investigated. Data from Turkish Coal Enterprises were used in the application of the method. This paper explains the Fuzzy TOPSIS approach with a group decision-making application in an open-pit coal mine in Turkey. A multi-person, multi-criteria decision-making algorithm with a fuzzy set approach was applied to an equipment selection problem. It was found that Fuzzy TOPSIS with group decision making is a method that may help decision-makers in solving different decision-making problems in mining.
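The ranking logic behind TOPSIS can be sketched in a few lines. The version below is the crisp method (the fuzzy variant replaces each rating with a triangular fuzzy number before the same distance-to-ideal ranking); the truck data and weights are invented.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution."""
    m = matrix / np.linalg.norm(matrix, axis=0)   # vector-normalize columns
    v = m * weights                               # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                # closeness coefficient

# Invented truck alternatives: capacity (t), cost (M$), maintenance (M$/yr).
trucks = np.array([[120.0, 8.5, 3.0],
                   [100.0, 6.0, 2.5],
                   [140.0, 9.5, 4.0]])
weights = np.array([0.5, 0.3, 0.2])
benefit = np.array([True, False, False])  # capacity good; cost/maintenance bad
scores = topsis(trucks, weights, benefit)
print(scores.argmax())   # index of the best alternative
```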
Sach, Tracey H; Desborough, James; Houghton, Julie; Holland, Richard
2014-11-06
Economic methods are underutilised within pharmacy research, resulting in a lack of quality evidence to support funding decisions for pharmacy interventions. The aim of this study is to illustrate the methods of micro-costing within the pharmacy context in order to raise awareness and use of this approach in pharmacy research. Micro-costing methods are particularly useful where a new service or intervention is being evaluated and for which no previous estimates of the costs of providing the service exist. This paper describes the rationale for undertaking a micro-costing study before detailing and illustrating the process involved. The illustration relates to a recently completed trial of multi-professional medication reviews as an intervention provided in care homes. All costs are presented in UK£2012. In general, costing methods involve three broad steps (identification, measurement and valuation); when using micro-costing, closer attention to detail is required within all three stages of this process. The mean (standard deviation; 95% confidence interval (CI)) cost per resident of the multi-professional medication review intervention was £104.80 (50.91; 98.72 to 109.45), such that the overall cost of providing the intervention to all intervention home residents was £36,221.29 (95% CI, £32,810.81 to £39,631.77). This study has demonstrated that micro-costing can be a useful method, not only for estimating the cost of a pharmacy intervention to feed into a pharmacy economic evaluation, but also as a source of information to help inform those designing pharmacy services about the potential time and costs involved in delivering such services. © 2014 Royal Pharmaceutical Society.
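The valuation step of a micro-costing study ultimately reduces to summary statistics over per-participant costs. A minimal sketch with invented per-resident costs (not the trial data):

```python
import math

# Invented per-resident intervention costs (UK pounds), NOT the trial data.
costs = [88.0, 152.5, 61.0, 130.0, 97.5, 110.0, 45.0, 155.0]
n = len(costs)
mean = sum(costs) / n
sd = math.sqrt(sum((c - mean) ** 2 for c in costs) / (n - 1))
se = sd / math.sqrt(n)
ci = (mean - 1.96 * se, mean + 1.96 * se)   # normal-approximation 95% CI
print(f"mean cost {mean:.2f} (SD {sd:.2f}; 95% CI {ci[0]:.2f} to {ci[1]:.2f})")
```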
Multi-scale heat and mass transfer modelling of cell and tissue cryopreservation
Xu, Feng; Moon, Sangjun; Zhang, Xiaohui; Shao, Lei; Song, Young Seok; Demirci, Utkan
2010-01-01
Cells and tissues undergo complex physical processes during cryopreservation. Understanding the underlying physical phenomena is critical to improve current cryopreservation methods and to develop new techniques. Here, we describe multi-scale approaches for modelling cell and tissue cryopreservation including heat transfer at macroscale level, crystallization, cell volume change and mass transport across cell membranes at microscale level. These multi-scale approaches allow us to study cell and tissue cryopreservation. PMID:20047939
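At the microscale, the cell-volume change described above is often modelled as osmotic water transport across the membrane. The sketch below integrates a simple dV/dt = -Lp·A·(Pi_ext - Pi_int) model with explicit Euler; all parameter values are arbitrary illustrative units, not from the paper.

```python
# Toy microscale model: osmotic water loss from a cell during freezing,
# dV/dt = -Lp * A * (Pi_ext - Pi_int), integrated with explicit Euler.
# All values are arbitrary illustrative units, not parameters from the paper.
Lp = 0.5        # membrane hydraulic permeability
A = 1.0         # membrane surface area
V = 1.0         # normalized intracellular water volume
Pi_ext = 2.0    # external osmolality, raised by extracellular ice formation
solutes = 1.0   # fixed intracellular osmoles, so Pi_int = solutes / V
dt, t_end = 0.01, 5.0
t = 0.0
while t < t_end:
    Pi_int = solutes / V              # internal osmolality rises as V shrinks
    V -= Lp * A * (Pi_ext - Pi_int) * dt
    t += dt
print(round(V, 3))   # settles near the equilibrium volume solutes/Pi_ext = 0.5
```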
Yu, Hua-Gen
2015-01-28
We report a rigorous full dimensional quantum dynamics algorithm, the multi-layer Lanczos method, for computing vibrational energies and dipole transition intensities of polyatomic molecules without any dynamics approximation. The multi-layer Lanczos method is developed by using a few advanced techniques including the guided spectral transform Lanczos method, the multi-layer Lanczos iteration approach, the recursive residue generation method, and dipole-wavefunction contraction. The quantum molecular Hamiltonian at total angular momentum J = 0 is represented in a set of orthogonal polyspherical coordinates so that the large-amplitude motions of vibrations are naturally described. In particular, the algorithm is general and problem-independent. An application is illustrated by calculating the infrared vibrational dipole transition spectrum of CH₄ based on the ab initio T8 potential energy surface of Schwenke and Partridge and the low-order truncated ab initio dipole moment surfaces of Yurchenko and co-workers. A comparison with experiments is made. The algorithm is also applicable to Raman polarizability active spectra.
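The building block underneath the multi-layer scheme is the ordinary Lanczos iteration, which projects a large Hermitian Hamiltonian onto a small tridiagonal matrix whose eigenvalues (Ritz values) converge rapidly at the spectrum edges. A bare-bones sketch on an invented toy Hamiltonian (the paper's guided spectral transform and multi-layer machinery go far beyond this):

```python
import numpy as np

def lanczos(A, k, seed=0):
    """k-step Lanczos with full reorthogonalization; returns Ritz values."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    Q = np.zeros((n, k))
    alpha, beta = np.zeros(k), np.zeros(k - 1)
    for j in range(k):
        Q[:, j] = q
        w = A @ q
        alpha[j] = q @ w
        w -= alpha[j] * q
        if j > 0:
            w -= beta[j - 1] * Q[:, j - 1]
        w -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)   # full reorthogonalization
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            q = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return np.linalg.eigvalsh(T)

# Invented tridiagonal "Hamiltonian" whose low eigenvalues stand in for
# vibrational levels.
n = 100
H = (np.diag(np.linspace(0.0, 10.0, n))
     + np.diag(0.05 * np.ones(n - 1), 1)
     + np.diag(0.05 * np.ones(n - 1), -1))
ritz = lanczos(H, 50)
exact = np.linalg.eigvalsh(H)
print(abs(ritz[0] - exact[0]))   # edge eigenvalues converge first
```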
NASA Astrophysics Data System (ADS)
Abdel Raheem, Shehata E.; Ahmed, Mohamed M.; Alazrak, Tarek M. A.
2015-03-01
Soil conditions have a great deal to do with damage to structures during earthquakes. Hence the investigation of the energy transfer mechanism from soils to buildings during earthquakes is critical for the seismic design of multi-story buildings and for upgrading existing structures, and the need for research into soil-structure interaction (SSI) problems is greater than ever. Moreover, recent studies show that the effects of SSI may be detrimental to the seismic response of structures and that neglecting SSI in analysis may lead to unconservative design. Despite this, the conventional design procedure usually assumes fixity at the base of the foundation, neglecting the flexibility of the foundation, the compressibility of the underlying soil and, consequently, the effect of foundation settlement on further redistribution of bending moment and shear force demands. Hence the SSI analysis of multi-story buildings is the main focus of this research; the effects of SSI are analyzed for a typical multi-story building resting on a raft foundation. Three methods of analysis are used for seismic demand evaluation of the target moment-resistant frame buildings: equivalent static load, response spectrum, and nonlinear time history analysis with a suite of nine time-history records. A three-dimensional FE model is constructed to investigate the effects of different soil conditions and number of stories on the vibration characteristics and seismic response demands of building structures. Numerical results obtained using the SSI model with different soil conditions are compared to those corresponding to the fixed-base support modeling assumption. The peak responses of story shear, story moment, story displacement, story drift, moments at beam ends, as well as forces in inner columns are analyzed. The results of the different analysis approaches are used to evaluate the advantages, limitations, and ease of application of each approach for seismic analysis.
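Of the three demand-evaluation methods mentioned, the equivalent static load method is the simplest to illustrate: a total base shear is distributed over the stories in proportion to story weight times height. The numbers below are illustrative, not tied to any particular seismic code or to the studied buildings.

```python
# Illustrative equivalent-static-load pattern: distribute base shear V over
# stories in proportion to story weight times height (generic code-type
# pattern; the weights, heights and V below are invented).
weights = [3000.0, 3000.0, 3000.0, 2500.0]   # story weights (kN)
heights = [3.0, 6.0, 9.0, 12.0]              # story heights above base (m)
V = 1200.0                                   # total base shear (kN)
wh = [w * h for w, h in zip(weights, heights)]
forces = [V * x / sum(wh) for x in wh]
print([round(f, 1) for f in forces])   # upper stories attract larger forces
```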
A multi-level systems perspective for the science of team science.
Börner, Katy; Contractor, Noshir; Falk-Krzesinski, Holly J; Fiore, Stephen M; Hall, Kara L; Keyton, Joann; Spring, Bonnie; Stokols, Daniel; Trochim, William; Uzzi, Brian
2010-09-15
This Commentary describes recent research progress and professional developments in the study of scientific teamwork, an area of inquiry termed the "science of team science" (SciTS, pronounced "sahyts"). It proposes a systems perspective that incorporates a mixed-methods approach to SciTS that is commensurate with the conceptual, methodological, and translational complexities addressed within the SciTS field. The theoretically grounded and practically useful framework is intended to integrate existing and future lines of SciTS research to facilitate the field's evolution as it addresses key challenges spanning macro, meso, and micro levels of analysis.
NASA Astrophysics Data System (ADS)
Hewitson, B.; Jack, C. D.; Gutowski, W. J., Jr.
2014-12-01
Possibly the leading complication for users of climate information for policy and adaptation is the confusing mix of contrasting data sets that offer widely differing (and oftentimes fundamentally contradictory) indications of the magnitude and direction of past and future regional climate change. In this light, the most pressing scientific-societal challenge is the need to find new ways to understand the sources of conflicting messages from multi-model, multi-method and multi-scale disparities, to develop and implement new analytical methodologies to address this difficulty, and so to advance the interpretation and communication of robust climate information to decision makers. Compounding this challenge is the growth of climate services which, in view of the confusing mix of climate change messages, raises serious concerns as to the ethics of communication and dissemination of regional climate change data. The Working Group on Regional Climate (WGRC) of the World Climate Research Program (WCRP) oversees the CORDEX downscaling program, which offers a systematic approach to compare the CMIP5 GCMs alongside RCMs and empirical-statistical downscaling (ESD) within a common experimental design, and which facilitates the evaluation and assessment of the relative information content and sources of error. Using results from the CORDEX RCM and ESD evaluation experiment, and set against the regional messages from the CMIP5 GCMs, we examine the differing messages that arise from each data source. These are then considered in terms of the consequences if each data source were to be independently adopted in a real-world use-case scenario. This is then cast in the context of the emerging developments on the distillation dilemma - where the pressing need is for multi-method integration - and how this relates to the WCRP regional research grand challenges.
Zhang, Xinyan; Li, Bingzong; Han, Huiying; Song, Sha; Xu, Hongxia; Hong, Yating; Yi, Nengjun; Zhuang, Wenzhuo
2018-05-10
Multiple myeloma (MM), like other cancers, is caused by the accumulation of genetic abnormalities. Heterogeneity exists in patients' response to treatments such as bortezomib. This motivates efforts to identify biomarkers from numerous molecular features and to build predictive models for identifying patients that can benefit from a certain treatment scheme. However, previous studies treated the multi-level ordinal drug response as a binary response, where only responsive and non-responsive groups are considered. It is desirable to directly analyze the multi-level drug response, rather than combining the response into two groups. In this study, we present a novel method to identify significantly associated biomarkers and then develop an ordinal genomic classifier using a hierarchical ordinal logistic model. The proposed hierarchical ordinal logistic model employs the heavy-tailed Cauchy prior on the coefficients and is fitted by an efficient quasi-Newton algorithm. We apply our hierarchical ordinal regression approach to analyze two publicly available datasets for MM with five-level drug response and numerous gene expression measures. Our results show that our method is able to identify genes associated with the multi-level drug response and to generate powerful predictive models for predicting the multi-level response. The proposed method allows us to jointly fit numerous correlated predictors and thus build efficient models for predicting the multi-level drug response. The predictive model for the multi-level drug response can be more informative than the previous approaches. Thus, the proposed approach provides a powerful tool for predicting multi-level drug response and has important impact on cancer studies.
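A simplified stand-in for the paper's model can be sketched as proportional-odds logistic regression with a Cauchy penalty on the coefficients, fitted by a quasi-Newton method (here SciPy's BFGS on synthetic data; the paper's hierarchical model and fitting algorithm are more elaborate).

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Synthetic 5-level "drug response" data for a proportional-odds sketch.
rng = np.random.default_rng(1)
n, p, K = 300, 4, 5
X = rng.standard_normal((n, p))
beta_true = np.array([1.5, -1.0, 0.0, 0.0])
cuts = np.array([-2.0, -0.7, 0.7, 2.0])
z = X @ beta_true + rng.logistic(size=n)       # latent response
y = (z[:, None] > cuts[None, :]).sum(axis=1)   # ordinal levels 0..4

def neg_log_post(params, scale=2.5):
    """Ordinal logistic NLL plus a Cauchy log-prior penalty on beta."""
    beta, raw = params[:p], params[p:]
    theta = np.cumsum(np.concatenate(([raw[0]], np.exp(raw[1:]))))  # increasing
    lo = np.concatenate(([-np.inf], theta))[y]
    hi = np.concatenate((theta, [np.inf]))[y]
    lik = expit(hi - X @ beta) - expit(lo - X @ beta)
    nll = -np.sum(np.log(np.clip(lik, 1e-12, None)))
    return nll + np.sum(np.log1p((beta / scale) ** 2))   # Cauchy prior term

fit = minimize(neg_log_post, np.zeros(p + K - 1), method="BFGS")
beta_hat = fit.x[:p]
print(np.round(beta_hat, 2))   # should lie near beta_true
```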
Sampling Methods in Cardiovascular Nursing Research: An Overview.
Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie
2014-01-01
Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study, and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
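The probability-sampling strategies described can be made concrete with Python's standard library; the toy "population" below is invented.

```python
import random

random.seed(7)
# Invented sampling frame: 90 patients across two hypothetical units.
population = [{"id": i, "unit": "cardiac" if i % 3 else "stroke"}
              for i in range(1, 91)]

# Simple random sampling: equal, independent chance for every element.
srs = random.sample(population, k=10)

# Systematic sampling: every k-th element after a random start.
k = len(population) // 10
start = random.randrange(k)
systematic = population[start::k][:10]

# Stratified sampling: sample separately within each stratum (unit).
strata = {}
for person in population:
    strata.setdefault(person["unit"], []).append(person)
stratified = [p for group in strata.values() for p in random.sample(group, 5)]

print(len(srs), len(systematic), len(stratified))   # 10 10 10
```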
Korcsmaros, Tamas; Dunai, Zsuzsanna A; Vellai, Tibor; Csermely, Peter
2013-09-01
The number of bioinformatics tools and resources that support molecular and cell biology approaches is continuously expanding. Moreover, systems and network biology analyses are accompanied more and more by integrated bioinformatics methods. Traditional information-centered university teaching methods often fail, as (1) it is impossible to cover all existing approaches in the frame of a single course, and (2) a large segment of the current bioinformation can become obsolete in a few years. Signaling network offers an excellent example for teaching bioinformatics resources and tools, as it is both focused and complex at the same time. Here, we present an outline of a university bioinformatics course with four sample practices to demonstrate how signaling network studies can integrate biochemistry, genetics, cell biology and network sciences. We show that several bioinformatics resources and tools, as well as important concepts and current trends, can also be integrated to signaling network studies. The research-type hands-on experiences we show enable the students to improve key competences such as teamworking, creative and critical thinking and problem solving. Our classroom course curriculum can be re-formulated as an e-learning material or applied as a part of a specific training course. The multi-disciplinary approach and the mosaic setup of the course have the additional benefit to support the advanced teaching of talented students.
Approaches in Characterizing Genetic Structure and Mapping in a Rice Multiparental Population.
Raghavan, Chitra; Mauleon, Ramil; Lacorte, Vanica; Jubay, Monalisa; Zaw, Hein; Bonifacio, Justine; Singh, Rakesh Kumar; Huang, B Emma; Leung, Hei
2017-06-07
Multi-parent Advanced Generation Intercross (MAGIC) populations are fast becoming mainstream tools for research and breeding, along with the technology and tools for analysis. This paper demonstrates the analysis of a rice MAGIC population from data filtering to imputation and processing of genetic data to characterizing genomic structure, and finally quantitative trait loci (QTL) mapping. In this study, 1316 S6:8 indica MAGIC (MI) lines and the eight founders were sequenced using Genotyping by Sequencing (GBS). As the GBS approach often includes missing data, the first step was to impute the missing SNPs. The observable number of recombinations in the population was then explored. Based on this case study, a general outline of procedures for a MAGIC analysis workflow is provided, as well as for QTL mapping of agronomic traits and biotic and abiotic stress, using the results from both association and interval mapping approaches. QTL for agronomic traits (yield, flowering time, and plant height), physical (grain length and grain width) and cooking properties (amylose content) of the rice grain, abiotic stress (submergence tolerance), and biotic stress (brown spot disease) were mapped. Through presenting this extensive analysis in the MI population in rice, we highlight important considerations when choosing analytical approaches. The methods and results reported in this paper will provide a guide to future genetic analysis methods applied to multi-parent populations. Copyright © 2017 Raghavan et al.
Langlois, Etienne V; Becerril Montekio, Victor; Young, Taryn; Song, Kayla; Alcalde-Rabanal, Jacqueline; Tran, Nhan
2016-03-17
There is an increasing interest worldwide to ensure evidence-informed health policymaking as a means to improve health systems performance. There is a need to engage policymakers in collaborative approaches to generate and use knowledge in real world settings. To address this gap, we implemented two interventions based on iterative exchanges between researchers and policymakers/implementers. This article aims to reflect on the implementation and impact of these multi-site evidence-to-policy approaches implemented in low-resource settings. The first approach was implemented in Mexico and Nicaragua and focused on implementation research facilitated by communities of practice (CoP) among maternal health stakeholders. We conducted a process evaluation of the CoPs and assessed the professionals' abilities to acquire, analyse, adapt and apply research. The second approach, called the Policy BUilding Demand for evidence in Decision making through Interaction and Enhancing Skills (Policy BUDDIES), was implemented in South Africa and Cameroon. The intervention put forth a 'buddying' process to enhance demand and use of systematic reviews by sub-national policymakers. The Policy BUDDIES initiative was assessed using a mixed-methods realist evaluation design. In Mexico, the implementation research supported by CoPs triggered monitoring by local health organizations of the quality of maternal healthcare programs. Health programme personnel involved in CoPs in Mexico and Nicaragua reported improved capacities to identify and use evidence in solving implementation problems. In South Africa, Policy BUDDIES informed a policy framework for medication adherence for chronic diseases, including both HIV and non-communicable diseases. Policymakers engaged in the buddying process reported an enhanced recognition of the value of research, and greater demand for policy-relevant knowledge. 
The collaborative evidence-to-policy approaches underline the importance of iterations and continuity in the engagement of researchers and policymakers/programme managers, in order to account for swift evolutions in health policy planning and implementation. In developing and supporting evidence-to-policy interventions, due consideration should be given to fit-for-purpose approaches, as different needs in policymaking cycles require adapted processes and knowledge. Greater consideration should be provided to approaches embedding the use of research in real-world policymaking, better suited to the complex adaptive nature of health systems.
Nelson, Geoffrey; Macnaughton, Eric; Goering, Paula
2015-11-01
Using the case of a large-scale, multi-site Canadian Housing First research demonstration project for homeless people with mental illness, At Home/Chez Soi, we illustrate the value of qualitative methods in a randomized controlled trial (RCT) of a complex community intervention. We argue that quantitative RCT research can neither capture the complexity nor tell the full story of a complex community intervention. We conceptualize complex community interventions as having multiple phases and dimensions that require both RCT and qualitative research components. Rather than assume that qualitative research and RCTs are incommensurate, a more pragmatic mixed methods approach was used, which included using both qualitative and quantitative methods to understand program implementation and outcomes. At the same time, qualitative research was used to examine aspects of the intervention that could not be understood through the RCT, such as its conception, planning, sustainability, and policy impacts. Through this example, we show how qualitative research can tell a more complete story about complex community interventions. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Andreo, B.; Barberá, J. A.; Mudarra, M.; Marín, A. I.; García-Orellana, J.; Rodellas, V.; Pérez, I.
2018-02-01
Understanding the transference of water resources within hydrogeological systems, particularly in coastal aquifers, in which groundwater discharge may occur through multiple pathways (through springs, into rivers and streams, towards the sea, etc.), is crucial for sustainable groundwater use. This research aims to demonstrate the usefulness of applying conventional recharge assessment methods coupled with isotopic techniques for accurately quantifying the hydrogeological balance and submarine groundwater discharge (SGD) from coastal carbonate aquifers. Sierra Almijara (Southern Spain), a carbonate aquifer formed of Triassic marbles, is considered representative of Mediterranean coastal karst formations. The use of a multi-method approach has permitted the computation of a wide range of groundwater infiltration rates (17-60%) by means of direct application of hydrometeorological methods (Thornthwaite and Kessler) and spatially distributed information (modified APLIS method). A spatially weighted recharge rate of 42% results from the most coherent information on physiographic and hydrogeological characteristics of the studied system. Natural aquifer discharge and groundwater abstraction have been volumetrically quantified, based on flow and water-level data, while the relevance of SGD was estimated from the spatial analysis of salinity, 222Rn and the short-lived radium isotope 224Ra in coastal seawater. The total mean aquifer discharge (44.9-45.9 hm3 year-1) is in agreement with the average recharged groundwater (44.7 hm3 year-1), given that the system is volumetrically equilibrated during the study period. Besides the groundwater resources assessment, the methodological aspects of this research may be of interest for groundwater management and protection strategies in coastal areas, particularly karst environments.
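The spatially weighted recharge rate reported above is an area-weighted average of zone-wise infiltration rates. A minimal sketch with invented zone areas and rates:

```python
# Area-weighted recharge sketch: combine per-zone infiltration rates
# (zone areas in km^2, rates as fractions of precipitation; the numbers
# are invented, not the Sierra Almijara data).
zones = [(120.0, 0.55), (80.0, 0.40), (40.0, 0.20)]   # (area, recharge rate)
total_area = sum(area for area, _ in zones)
weighted = sum(area * rate for area, rate in zones) / total_area
print(f"{weighted:.2f}")   # 0.44
```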
Kleikers, Pamela W M; Hooijmans, Carlijn; Göb, Eva; Langhauser, Friederike; Rewell, Sarah S J; Radermacher, Kim; Ritskes-Hoitinga, Merel; Howells, David W; Kleinschnitz, Christoph; Schmidt, Harald H H W
2015-08-27
Biomedical research suffers from dramatically poor translational success. For example, in ischemic stroke, a condition with a high medical need, over a thousand experimental drug targets have been unsuccessful. Here, we adopt methods from clinical research for a late-stage pre-clinical meta-analysis (MA) and randomized confirmatory trial (pRCT) approach. A profound body of literature suggests NOX2 to be a major therapeutic target in stroke. Systematic review and MA of all available NOX2(-/y) studies revealed a positive publication bias and a lack of statistical power to detect a relevant reduction in infarct size. A fully powered multi-center pRCT rejects NOX2 as a target to improve neurofunctional outcomes or to achieve a translationally relevant infarct size reduction. Thus, stringent statistical thresholds, reporting of negative data and the MA-pRCT approach can ensure the validity of biomedical data and overcome risks of bias.
Rebranding city: A strategic urban planning approach in Indonesia
NASA Astrophysics Data System (ADS)
Firzal, Yohannes
2018-03-01
The onset of the decentralization period has had a significant effect on cities in Indonesia and is seen as a new era for local life. The decentralization period has also generated locally bounded sentiments, which can be identified in the discretion given to the local government in charge of rebranding the city. In this paper, the rebranding phenomenon is examined through the case of Pekanbaru, a city that has changed its brand several times. Using a qualitative research approach and combining multiple methods to collect and process the data, this paper finds that city rebranding has become a strategic approach in contemporary urban planning, used by the local government to inject more sense of place into the city and its local life. This research confirms that, over almost two decades of the decentralization period, the rebranding phenomenon has served not only to generate a local sense of identity, but also as a power marker of the local regime.
Asan, Onur; Montague, Enid
2015-01-01
Objective The purpose of this paper is to describe the use of video-based observation research methods in the primary care environment, highlight important methodological considerations, and provide practical guidance for primary care and human factors researchers conducting video studies to understand patient-clinician interaction in primary care settings. Methods We reviewed studies in the literature that used video methods in health care research, and we also drew on our own experience from video studies we conducted in primary care settings. Results This paper highlights the benefits of video techniques such as multi-channel recording and video coding, and compares “unmanned” video recording with the traditional observation method in primary care research. We propose a step-by-step checklist for conducting an effective video study in a primary care setting for a given problem. This paper also describes obstacles researchers should anticipate when using video recording methods in future studies. Conclusion With new technological improvements, video-based observation research is becoming a promising method in primary care and HFE research. Video recording has been under-utilized as a data collection tool because of confidentiality and privacy issues. However, it has many benefits compared with traditional observation, and recent studies using video recording methods have introduced new research areas and approaches. PMID:25479346
Integration of ultra-high field MRI and histology for connectome based research of brain disorders
Yang, Shan; Yang, Zhengyi; Fischer, Karin; Zhong, Kai; Stadler, Jörg; Godenschweger, Frank; Steiner, Johann; Heinze, Hans-Jochen; Bernstein, Hans-Gert; Bogerts, Bernhard; Mawrin, Christian; Reutens, David C.; Speck, Oliver; Walter, Martin
2013-01-01
Ultra-high field magnetic resonance imaging (MRI) has become increasingly relevant for in vivo neuroscientific research because of improved spatial resolution. However, this is still the unchallenged domain of histological studies, which have long played an important role in the investigation of neuropsychiatric disorders. While the field of biological psychiatry has advanced strongly on macroscopic levels, current developments are rediscovering the richness of immunohistological information when attempting a multi-level systematic approach to brain function and dysfunction. In most studies, histological sections lose three-dimensional information. Translating histological sections to 3D volumes would thus not only allow multi-stain and multi-subject alignment in post mortem data, but also provide a crucial step in big data initiatives involving the network analyses currently performed with in vivo MRI. We therefore investigated potential pitfalls in the integration of MR and histological information where no additional blockface information is available. We demonstrate that the strengths and requirements of both methods can be effectively combined at a spatial resolution of 200 μm. However, the success of this approach depends heavily on choices of hardware, sequence and reconstruction. We provide a fully automated pipeline that optimizes histological 3D reconstructions, providing a potentially powerful solution not only for human post mortem research institutions in neuropsychiatric research, but also to help alleviate the massive workloads in neuroanatomical atlas initiatives. We further demonstrate, for the first time, the feasibility and quality of ultra-high spatial resolution (150 μm isotropic) MRI of the entire human brain at 7 T, offering new opportunities for analyses of MR-derived information. PMID:24098272
Ochi, Kento; Kamiura, Moto
2015-09-01
A multi-armed bandit problem is a search problem in which a learning agent must select the optimal arm among multiple slot machines generating random rewards. The UCB algorithm is one of the most popular methods for solving multi-armed bandit problems. It achieves logarithmic regret by balancing exploration against exploitation. Since the introduction of UCB algorithms, researchers have known empirically that optimistic value functions exhibit good performance in multi-armed bandit problems. The terms optimistic and optimism suggest that the value function is sufficiently larger than the sample mean of rewards. The original definition of the UCB algorithm focuses on the optimization of regret and is not directly based on the optimism of a value function, so we need to consider why optimism yields good performance in multi-armed bandit problems. In the present article, we propose a new method, called the Overtaking method, for solving multi-armed bandit problems. The value function of the proposed method is defined as the upper bound of a confidence interval for an estimator of the expected reward: the value function asymptotically approaches the expected reward from above. If the value function is larger than the expected value under the asymptote, then the learning agent is almost sure to obtain the optimal arm. This structure is called the sand-sifter mechanism, in which the value functions of suboptimal arms do not regrow, meaning that the learning agent plays only the current best arm at each time step. Consequently, the proposed method achieves a high accuracy rate and low regret, and some of its value functions can outperform UCB algorithms. This study demonstrates the advantage of optimism for agents in uncertain environments using one of the simplest frameworks. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
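The UCB baseline the abstract compares against can be sketched as follows. This is the standard UCB1 rule (sample mean plus a sqrt(2 ln t / n_i) confidence radius), not the authors' Overtaking method, and the Bernoulli arm probabilities are hypothetical:

```python
import math
import random

def ucb1(pull, n_arms, horizon, seed=0):
    """Minimal UCB1: play each arm once, then repeatedly pick the arm with
    the highest sample mean plus confidence radius sqrt(2 ln t / n_i)."""
    rng = random.Random(seed)
    counts = [0] * n_arms
    means = [0.0] * n_arms
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1                      # initialization: try every arm once
        else:
            arm = max(range(n_arms),
                      key=lambda i: means[i] + math.sqrt(2 * math.log(t) / counts[i]))
        r = pull(arm, rng)
        counts[arm] += 1
        means[arm] += (r - means[arm]) / counts[arm]   # incremental sample mean
    return counts, means

# Bernoulli arms with hidden success probabilities; arm 2 is optimal.
probs = [0.2, 0.5, 0.8]
counts, means = ucb1(lambda i, rng: 1.0 if rng.random() < probs[i] else 0.0,
                     n_arms=3, horizon=5000)
```

Because the confidence radius shrinks as an arm is sampled, suboptimal arms receive only O(log T) pulls, which is the source of the logarithmic regret mentioned above.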
Generalizing DTW to the multi-dimensional case requires an adaptive approach
Hu, Bing; Jin, Hongxia; Wang, Jun; Keogh, Eamonn
2017-01-01
In recent years Dynamic Time Warping (DTW) has emerged as the distance measure of choice for virtually all time series data mining applications. For example, virtually all applications that process data from wearable devices use DTW as a core sub-routine. This is the result of significant progress in improving DTW’s efficiency, together with multiple empirical studies showing that DTW-based classifiers at least equal (and generally surpass) the accuracy of all their rivals across dozens of datasets. Thus far, most of the research has considered only the one-dimensional case, with practitioners generalizing to the multi-dimensional case in one of two ways, dependent or independent warping. In general, it appears the community believes either that the two ways are equivalent, or that the choice is irrelevant. In this work, we show that this is not the case. The two most commonly used multi-dimensional DTW methods can produce different classifications, and neither dominates the other. This suggests that one should learn the best method for a particular application. However, we will show that this is not necessary; a simple, principled rule can be used on a case-by-case basis to predict which of the two methods we should trust at the time of classification. Our method allows us to ensure that classification results are at least as accurate as the better of the two rival methods, and, in many cases, our method is significantly more accurate. We demonstrate our ideas with the most extensive set of multi-dimensional time series classification experiments ever attempted. PMID:29104448
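A minimal sketch of the two multi-dimensional generalizations discussed above, assuming squared local distances. The example series are constructed so that the two measures disagree; note DTW_I is never larger than DTW_D, since independent warping aligns each dimension with full freedom:

```python
def dtw(a, b, dist):
    """Classic O(nm) dynamic-programming DTW with a given local distance."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i][j] = dist(a[i - 1], b[j - 1]) + min(
                D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def dtw_dependent(A, B):
    """DTW_D: one warping path shared by all dimensions; the local cost is
    the squared Euclidean distance between multi-dimensional points."""
    return dtw(A, B, lambda x, y: sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

def dtw_independent(A, B):
    """DTW_I: warp each dimension separately and sum the per-dimension DTWs."""
    dims = len(A[0])
    return sum(dtw([p[d] for p in A], [q[d] for q in B],
                   lambda x, y: (x - y) ** 2)
               for d in range(dims))

# Two 2-D series whose dimensions need *different* warpings:
A = [(1, 0), (0, 0), (0, 1)]
B = [(0, 0), (1, 1), (0, 0)]
# dtw_independent(A, B) == 2.0, dtw_dependent(A, B) == 3.0
```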
A multi-modal approach to assessing recovery in youth athletes following concussion.
Reed, Nick; Murphy, James; Dick, Talia; Mah, Katie; Paniccia, Melissa; Verweel, Lee; Dobney, Danielle; Keightley, Michelle
2014-09-25
Concussion is one of the most commonly reported injuries amongst children and youth involved in sport participation. Following a concussion, youth can experience a range of short- and long-term neurobehavioral symptoms (somatic, cognitive and emotional/behavioral) that can have a significant impact on participation in daily activities and pursuits of interest (e.g., school, sports, work, family/social life, etc.). Despite this, there remains a paucity of clinically driven research aimed specifically at exploring concussion within the youth sport population and, more specifically, at multi-modal approaches to measuring recovery. This article provides an overview of a novel multi-modal approach to measuring recovery amongst youth athletes following concussion. The presented approach involves the use of both pre-injury/baseline testing and post-injury/follow-up testing to assess performance across a wide variety of domains (post-concussion symptoms, cognition, balance, strength, agility/motor skills and resting-state heart rate variability). The goal of this research is to gain a more objective and accurate understanding of recovery following concussion in youth athletes (ages 10-18 years). Findings from this research can help to inform the development and use of improved approaches to concussion management and rehabilitation specific to the youth sport community.
Combinatorial Methods for Exploring Complex Materials
NASA Astrophysics Data System (ADS)
Amis, Eric J.
2004-03-01
Combinatorial and high-throughput methods have changed the paradigm of pharmaceutical synthesis and have begun to have a similar impact on materials science research. Already there are examples of combinatorial methods used for inorganic materials, catalysts, and polymer synthesis. For many investigations the primary goal has been discovery of new material compositions that optimize properties such as phosphorescence or catalytic activity. In the midst of the excitement generated to "make things", another opportunity arises for materials science to "understand things" by using the efficiency of combinatorial methods. We have shown that combinatorial methods hold potential for rapid and systematic generation of experimental data over the multi-parameter space typical of investigations in polymer physics. We have applied the combinatorial approach to studies of polymer thin films, biomaterials, polymer blends, filled polymers, and semicrystalline polymers. By combining library fabrication, high-throughput measurements, informatics, and modeling we can demonstrate validation of the methodology, new observations, and developments toward predictive models. This talk will present some of our latest work with applications to coating stability, multi-component formulations, and nanostructure assembly.
Investigators’ Successful Strategies for Working with Institutional Review Boards
Cartwright, Juliana C.; Hickman, Susan E.; Nelson, Christine A.; Knafl, Kathleen A.
2014-01-01
This study was designed to identify successful strategies used by investigators for working with their Institutional Review Boards (IRBs) in conducting human subjects research. Telephone interviews were conducted with 46 investigators representing nursing, medicine, and social work. Interview transcripts were analyzed using qualitative descriptive methods. Investigators emphasized the importance of intentionally cultivating positive relationships with IRB staff and members, and managing bureaucracy. A few used evasive measures to avoid conflict with IRBs. Few successful strategies were identified for working with multiple IRBs. Although most investigators developed successful methods for working with IRBs, further research is needed on how differences in IRB culture affect human subjects protection, and on best approaches to IRB approval of multi-site studies. PMID:23813748
MLP: A Parallel Programming Alternative to MPI for New Shared Memory Parallel Systems
NASA Technical Reports Server (NTRS)
Taft, James R.
1999-01-01
Recent developments at the NASA Ames Research Center's NAS Division have demonstrated that the new generation of NUMA-based Symmetric Multi-Processing systems (SMPs), such as the Silicon Graphics Origin 2000, can successfully execute legacy vector-oriented CFD production codes at sustained rates far exceeding the processing rates possible on dedicated 16-CPU Cray C90 systems. This high level of performance is achieved via shared-memory-based Multi-Level Parallelism (MLP). This programming approach, developed at NAS and outlined below, is distinct from the message-passing paradigm of MPI. It offers parallelism at both the fine- and coarse-grained levels, with communication latencies approximately 50-100 times lower than typical MPI implementations on the same platform. Such latency reductions offer the promise of performance scaling to very large CPU counts. The method draws on, but is also distinct from, the newly defined OpenMP specification, which uses compiler directives to support a limited subset of multi-level parallel operations. The NAS MLP method is general, and applicable to a large class of NASA CFD codes.
Erdmann, Włodzimierz S; Kowalczyk, Radosław
2015-01-02
There are several methods for obtaining the location of the centre of mass of the whole body. They are based on cadaver data, on the volume and density of body parts, or on radiation and imaging techniques. Some researchers treated the trunk as a single part, while others divided it into a few parts. In addition, some researchers divided the trunk with planes perpendicular to the trunk's longitudinal axis, although the best approach is to obtain trunk parts as anatomical and functional elements. This procedure was used by Dempster and Erdmann. The latter elaborated a personalized method of estimating the inertial quantities of the trunk, while Clauser et al. gave a similar approach for the extremities. The aim of the investigation was to merge both indirect methods in order to obtain an accurate location of the centre of mass of the whole body. As a reference, a direct method based on the reaction board procedure, i.e. with the body lying on a board supported on a scale, was used. The location of the centre of mass obtained using Clauser's and Erdmann's methods appeared almost identical to the location obtained with the direct method. This approach can be used in several situations, especially for people of different morphology, for a bent trunk, and for asymmetrical movements. Copyright © 2014 Elsevier Ltd. All rights reserved.
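The reaction-board reference method can be sketched as a simple moment balance about the board's fixed edge; the function and the numbers below are hypothetical illustrations, not the authors' data:

```python
def reaction_board_com(scale_with_body, scale_board_only, body_weight,
                       support_distance):
    """Reaction-board estimate of whole-body centre of mass location.

    Taking moments about the fixed edge: the extra force the scale carries
    due to the body alone, times the distance between the two supports,
    equals body weight times the distance of the centre of mass from the
    fixed edge. Forces in newtons, distances in metres.
    """
    net = scale_with_body - scale_board_only      # scale reading due to the body
    return net * support_distance / body_weight   # CoM distance from fixed edge

# Hypothetical numbers: 2.0 m between supports, 700 N subject; the scale
# reading rises by 350 N when the subject lies down, placing the CoM
# midway along the board.
x = reaction_board_com(scale_with_body=550.0, scale_board_only=200.0,
                       body_weight=700.0, support_distance=2.0)
# x == 1.0
```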
New generic indexing technology
NASA Technical Reports Server (NTRS)
Freeston, Michael
1996-01-01
There has been no fundamental change in the dynamic indexing methods supporting database systems since the invention of the B-tree twenty-five years ago. And yet the whole classical approach to dynamic database indexing has long since become inappropriate and increasingly inadequate. We are moving rapidly from the conventional one-dimensional world of fixed-structure text and numbers to a multi-dimensional world of variable structures, objects and images, in space and time. But, even before leaving the confines of conventional database indexing, the situation is highly unsatisfactory. In fact, our research has led us to question the basic assumptions of conventional database indexing. We have spent the past ten years studying the properties of multi-dimensional indexing methods, and in this paper we draw the strands of a number of developments together - some quite old, some very new, to show how we now have the basis for a new generic indexing technology for the next generation of database systems.
An Approach for Autonomy: A Collaborative Communication Framework for Multi-Agent Systems
NASA Technical Reports Server (NTRS)
Dufrene, Warren Russell, Jr.
2005-01-01
Research done during the last three years has studied the emergent properties of Complex Adaptive Systems (CAS). The deployment of Artificial Intelligence (AI) techniques applied to remote Unmanned Aerial Vehicles has led the author to investigate applications of CAS within the field of Autonomous Multi-Agent Systems. Current research efforts focus on the simplicity of Intelligent Agents (IA) and the modeling of these agents within complex systems. This research effort looks at the communication, interaction, and adaptability of multi-agents as applied to complex systems control. The embodiment concept applied to robotics has application possibilities within multi-agent frameworks. A new framework for agent awareness within a virtual 3D world concept is possible, where the vehicle is composed of collaborative agents. This approach has many possibilities for application to complex systems. This paper describes the development of an approach to apply this virtual framework to the NASA Goddard Space Flight Center (GSFC) tetrahedron structure developed under the Autonomous Nano Technology Swarm (ANTS) program and the Super Miniaturized Addressable Reconfigurable Technology (SMART) architecture program. These projects represent an innovative set of novel concepts deploying adaptable, self-organizing structures composed of many tetrahedrons. This technology is pushing current applied agent concepts to new levels of requirements and adaptability.
NASA Astrophysics Data System (ADS)
Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.
2014-04-01
Geospatial data resources are the foundation of the construction of a geo portal designed to provide online geoinformation services for government, enterprise and the public. It is vital to keep geospatial data fresh, accurate and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation and geo search. One of the major problems we face is data acquisition; for us, integrating multi-source geospatial data is the main means of data acquisition. This paper introduces a practical approach to integrating multi-source geospatial data with different data models, structures and formats, which provided the construction of the National Geospatial Information Service Platform of China (NGISP) with effective technical support. NGISP is China's official geo portal, providing online geoinformation services based on the internet, the e-government network and the classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial and municipal. The geospatial data comes from these nodes, and the datasets are heterogeneous. Based on the analysis of the heterogeneous datasets, we first define the basic principles of data fusion, covering: 1. location precision; 2. geometric representation; 3. up-to-date state; 4. attribute values; and 5. spatial relationships. The technical procedure is then developed, and a method for processing different categories of features, such as roads, railways, boundaries, rivers, settlements and buildings, is proposed based on these principles. A case study in Jiangsu province demonstrates the applicability of the principles, procedure and method of multi-source geospatial data integration.
Healy, Judith Mary; Tang, Shenglan; Patcharanarumol, Walaiporn; Annear, Peter Leslie
2018-04-01
Drawing on published work from the Asia Pacific Observatory on Health Systems and Policies, this paper presents a framework for undertaking comparative studies of the health systems of countries. Organized under seven types of research approaches, such as national case-studies using a common format, this framework is illustrated using studies of low- and middle-income countries published by the Asia Pacific Observatory. Such studies are important contributions, since much of the health systems research literature comes from high-income countries. No one research approach, however, can adequately analyse a health system, let alone produce a nuanced comparison of different countries. Multiple comparative studies offer a better understanding, as a health system is a complex entity to describe and analyse. Appreciation of context and culture is crucial: what works in one country may not do so in another. Further, a single research method, such as performance indicators, or a study of a particular health system function or component, produces only a partial picture. Applying a comparative framework of several study approaches helps to inform and explain progress against health system targets, to identify differences among countries, and to assess policies and programmes. Multi-method comparative research produces policy-relevant learning that can assist countries to achieve Sustainable Development Goal 3: ensure healthy lives and promote well-being for all at all ages by 2030.
A multi-strategy approach to informative gene identification from gene expression data.
Liu, Ziying; Phan, Sieu; Famili, Fazel; Pan, Youlian; Lenferink, Anne E G; Cantin, Christiane; Collins, Catherine; O'Connor-McCourt, Maureen D
2010-02-01
An unsupervised multi-strategy approach has been developed to identify informative genes from high-throughput genomic data. Several statistical methods have been used in the field to identify differentially expressed genes. Since different methods generate different lists of genes, it is very challenging to determine the most reliable gene list and the most appropriate method. This paper presents a multi-strategy method in which a combination of several data analysis techniques is applied to a given dataset and a confidence measure is established to select genes from the gene lists generated by these techniques to form the core of our final selection. The remaining genes, which form the peripheral region, are subject to exclusion from or inclusion into the final selection. This paper demonstrates the methodology through its application to an in-house cancer genomics dataset and a public dataset. The results indicate that our method provides a more reliable list of genes, which we validated using biological knowledge, biological experiments, and literature search. We further evaluated our multi-strategy method by consolidating two pairs of independent datasets, each pair covering the same disease but generated by different labs using different platforms. The results showed that our method produced far better gene selections.
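A minimal sketch of the core/periphery idea described above, assuming a simple voting rule across method-specific gene lists. The threshold and gene names are hypothetical, and the authors' confidence measure may be more elaborate than counting votes:

```python
from collections import Counter

def core_and_periphery(gene_lists, core_threshold):
    """Combine gene lists produced by several selection methods.

    Genes selected by at least `core_threshold` methods form the core of
    the final selection; genes selected by at least one but fewer methods
    form the peripheral region, subject to later inclusion or exclusion.
    """
    votes = Counter(g for genes in gene_lists for g in set(genes))
    core = {g for g, v in votes.items() if v >= core_threshold}
    periphery = {g for g in votes if g not in core}
    return core, periphery

# Hypothetical output of three methods (e.g. t-test, fold change, SAM):
lists = [["BRCA1", "TP53", "EGFR"],
         ["TP53", "EGFR", "MYC"],
         ["TP53", "KRAS", "EGFR"]]
core, periphery = core_and_periphery(lists, core_threshold=3)
# core == {"TP53", "EGFR"}; periphery == {"BRCA1", "MYC", "KRAS"}
```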
Pereira, Suzanne; Névéol, Aurélie; Kerdelhué, Gaétan; Serrot, Elisabeth; Joubert, Michel; Darmoni, Stéfan J
2008-11-06
To assist with the development of a French online quality-controlled health gateway (CISMeF), an automatic indexing tool assigning MeSH descriptors to medical text in French was created. The French Multi-Terminology Indexer (F-MTI) relies on a multi-terminology approach involving four prominent medical terminologies and the mappings between them. In this paper, we compare lemmatization and stemming as methods of processing French medical text for indexing. We also evaluate the multi-terminology approach implemented in F-MTI. The indexing strategies were assessed on a corpus of 18,814 resources indexed manually. There is little difference in indexing performance between lemmatization and stemming. However, the multi-terminology approach outperforms indexing relying on a single terminology in terms of recall. F-MTI will soon be used in the CISMeF production environment and in a Health Multi-Terminology Server in French.
Promoting Affordability in Defense Acquisitions: A Multi-Period Portfolio Approach
2014-04-30
…has evolved out of many areas of research, ranging from economics to modern control theory (Powell, 2011). The portfolio approach balances expected profit (performance) against risk (variance) in investments (Markowitz, 1952), yields an efficiency frontier of optimal portfolios given investor risk averseness, and extends to the multi-period case with various …
ERIC Educational Resources Information Center
Gallifa, Josep
2009-01-01
This paper presents an institutional research on service quality conducted to analyze the students' motives and influences on their selection of studies and university. The research was carried out by collecting data from first-year students in a multi-campus system where institutions are independent in their recruitment strategies. Results from…
Applying the Scientific Method of Cybersecurity Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tardiff, Mark F.; Bonheyo, George T.; Cort, Katherine A.
The cyber environment has rapidly evolved from a curiosity to an essential component of the contemporary world. As the cyber environment has expanded and become more complex, so have the nature of adversaries and styles of attacks. Today, cyber incidents are an expected part of life. As a result, cybersecurity research emerged to address adversarial attacks interfering with or preventing normal cyber activities. Historical response to cybersecurity attacks is heavily skewed to tactical responses with an emphasis on rapid recovery. While threat mitigation is important and can be time critical, a knowledge gap exists with respect to developing the science of cybersecurity. Such a science will enable the development and testing of theories that lead to understanding the broad sweep of cyber threats and the ability to assess trade-offs in sustaining network missions while mitigating attacks. The Asymmetric Resilient Cybersecurity Initiative at Pacific Northwest National Laboratory is a multi-year, multi-million dollar investment to develop approaches for shifting the advantage to the defender and sustaining the operability of systems under attack. The initiative established a Science Council to focus attention on the research process for cybersecurity. The Council shares science practices, critiques research plans, and aids in documenting and reporting reproducible research results. The Council members represent ecology, economics, statistics, physics, computational chemistry, microbiology and genetics, and geochemistry. This paper reports the initial work of the Science Council to implement the scientific method in cybersecurity research. The second section describes the scientific method. The third section discusses scientific practices for cybersecurity research. The fourth section describes initial impacts of applying the science practices to cybersecurity research.
Green, Melissa A.; Kim, Mimi M.; Barber, Sharrelle; Odulana, Abedowale A.; Godley, Paul A.; Howard, Daniel L.; Corbie-Smith, Giselle M.
2013-01-01
Introduction Prevention and treatment standards are based on evidence obtained in behavioral and clinical research. However, racial and ethnic minorities remain relatively absent from the science that develops these standards. While investigators have successfully recruited participants for individual studies using tailored recruitment methods, these strategies require considerable time and resources. Research registries, typically developed around a disease or condition, serve as a promising model for a targeted recruitment method to increase minority participation in health research. This study assessed the tailored recruitment methods used to populate a health research registry targeting African-American community members. Methods We describe six recruitment methods applied between September 2004 and October 2008 to recruit members into a health research registry. Recruitment included direct (existing studies, public databases, community outreach) and indirect methods (radio, internet, and email) targeting the general population, local universities, and African American communities. We conducted retrospective analysis of the recruitment by method using descriptive statistics, frequencies, and chi-square statistics. Results During the recruitment period, 608 individuals enrolled in the research registry. The majority of enrollees were African American, female, and in good health. Direct and indirect methods were identified as successful strategies for subgroups. Findings suggest significant associations between recruitment methods and age, presence of existing health condition, prior research participation, and motivation to join the registry. Conclusions A health research registry can be a successful tool to increase minority awareness of research opportunities. Multi-pronged recruitment approaches are needed to reach diverse subpopulations. PMID:23340183
NASA Astrophysics Data System (ADS)
Liu, Likun
2018-01-01
In the field of remote sensing image processing, image segmentation is a preliminary step for later analysis, semi-automatic human interpretation, and fully automatic machine recognition and learning. Since 2000, object-oriented remote sensing image processing methods have prevailed, whose core is the Fractal Net Evolution Approach (FNEA) multi-scale segmentation algorithm. This paper focuses on the analysis and improvement of this algorithm: it reviews existing segmentation algorithms and selects the watershed algorithm as the optimal initialization. The algorithm is then modified by adjusting an area parameter, and further by combining the area parameter with a heterogeneity parameter. Several experiments show that the modified FNEA algorithm yields better segmentation results than both a traditional pixel-based method (the FCM algorithm based on neighborhood information) and the plain combination of FNEA and watershed.
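The spectral part of the FNEA merging criterion can be sketched as follows, assuming the standard formulation in which the cost of merging two segments is the increase in area-weighted standard deviation of pixel values (the paper's modified area and heterogeneity parameters are not reproduced here):

```python
import math

def std(values):
    """Population standard deviation of a list of pixel values."""
    m = sum(values) / len(values)
    return math.sqrt(sum((v - m) ** 2 for v in values) / len(values))

def color_heterogeneity_increase(seg1, seg2):
    """FNEA-style spectral merging cost for one band:
    n_merged * sigma_merged - (n1 * sigma1 + n2 * sigma2).
    Small for spectrally similar segments, large for dissimilar ones, so
    region growing performs the cheapest merges first."""
    merged = seg1 + seg2
    return (len(merged) * std(merged)
            - (len(seg1) * std(seg1) + len(seg2) * std(seg2)))

# Merging two spectrally similar segments is cheap...
low = color_heterogeneity_increase([10, 11, 12], [11, 12, 13])
# ...while merging dissimilar ones is expensive.
high = color_heterogeneity_increase([10, 11, 12], [90, 91, 92])
```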
ERIC Educational Resources Information Center
Vieira, Rodrigo Drumond; Kelly, Gregory J.
2014-01-01
In this paper, we present and apply a multi-level method for discourse analysis in science classrooms. This method is based on the structure of human activity (activity, actions, and operations) and it was applied to study a pre-service physics teacher methods course. We argue that such an approach, based on a cultural psychological perspective,…
Reverse engineering of gene regulatory networks.
Cho, K H; Choo, S M; Jung, S H; Kim, J R; Choi, H S; Kim, J
2007-05-01
Systems biology is a multi-disciplinary approach to the study of the interactions of various cellular mechanisms and cellular components. Owing to the development of new technologies that simultaneously measure the expression of genetic information, systems biological studies involving gene interactions are increasingly prominent. In this regard, reconstructing gene regulatory networks (GRNs) forms the basis for the dynamical analysis of gene interactions and related effects on cellular control pathways. Various approaches of inferring GRNs from gene expression profiles and biological information, including machine learning approaches, have been reviewed, with a brief introduction of DNA microarray experiments as typical tools for measuring levels of messenger ribonucleic acid (mRNA) expression. In particular, the inference methods are classified according to the required input information, and the main idea of each method is elucidated by comparing its advantages and disadvantages with respect to the other methods. In addition, recent developments in this field are introduced and discussions on the challenges and opportunities for future research are provided.
Multi-task feature learning by using trace norm regularization
NASA Astrophysics Data System (ADS)
Jiangmei, Zhang; Binfeng, Yu; Haibo, Ji; Wang, Kunpeng
2017-11-01
Multi-task learning can extract the correlation of multiple related machine learning problems to improve performance. This paper considers applying the multi-task learning method to learn a single task. We propose a new learning approach, which employs the mixture of expert model to divide a learning task into several related sub-tasks, and then uses the trace norm regularization to extract common feature representation of these sub-tasks. A nonlinear extension of this approach by using kernel is also provided. Experiments conducted on both simulated and real data sets demonstrate the advantage of the proposed approach.
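The trace-norm regularization at the heart of this approach reduces, in proximal-gradient form, to soft-thresholding the singular values of the task-weight matrix. A minimal sketch follows, on toy data with invented step size and regularization strength; it is not the paper's mixture-of-experts model, only the shared trace-norm step.

```python
import numpy as np

def trace_norm_prox(W, tau):
    """Proximal operator of tau * ||W||_*: soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Toy setup: three related sub-tasks sharing one underlying weight vector,
# so the optimal task matrix W (one column per sub-task) is nearly rank one.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
w_true = rng.normal(size=5)
Y = np.stack([X @ w_true + 0.1 * rng.normal(size=100) for _ in range(3)], axis=1)

W = np.zeros((5, 3))
step, reg = 1e-3, 0.05
for _ in range(500):                      # proximal gradient descent
    grad = X.T @ (X @ W - Y)              # gradient of the summed squared loss
    W = trace_norm_prox(W - step * grad, step * reg)

fit = np.linalg.norm(X @ W - Y) / np.linalg.norm(Y)
```

Because the sub-tasks share structure, the learned W ends up close to rank one, which is exactly the low-rank coupling the trace norm encourages.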
Prediction of protein-protein interaction network using a multi-objective optimization approach.
Chowdhury, Archana; Rakshit, Pratyusha; Konar, Amit
2016-06-01
Protein-Protein Interactions (PPIs) are very important as they coordinate almost all cellular processes. This paper attempts to formulate PPI prediction problem in a multi-objective optimization framework. The scoring functions for the trial solution deal with simultaneous maximization of functional similarity, strength of the domain interaction profiles, and the number of common neighbors of the proteins predicted to be interacting. The above optimization problem is solved using the proposed Firefly Algorithm with Nondominated Sorting. Experiments undertaken reveal that the proposed PPI prediction technique outperforms existing methods, including gene ontology-based Relative Specific Similarity, multi-domain-based Domain Cohesion Coupling method, domain-based Random Decision Forest method, Bagging with REP Tree, and evolutionary/swarm algorithm-based approaches, with respect to sensitivity, specificity, and F1 score.
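The nondominated-sorting component can be illustrated independently of the firefly search itself. The sketch below extracts the first Pareto front from a handful of candidate solutions scored on three maximization objectives, as in the formulation above; all values are invented.

```python
# Invented three-objective scores (functional similarity, domain-profile
# strength, common-neighbor count), each to be maximized.
cands = {
    "s1": (0.9, 0.4, 5), "s2": (0.7, 0.8, 3),
    "s3": (0.6, 0.3, 2), "s4": (0.8, 0.9, 4),
}

def dominates(a, b):
    """a dominates b if it is no worse on every objective and better on one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# First nondominated front: candidates no other candidate dominates.
front = [name for name, v in cands.items()
         if not any(dominates(w, v) for m, w in cands.items() if m != name)]
```

Repeatedly removing each front and re-sorting the remainder yields the full nondominated ranking used by such multi-objective optimizers.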
Energy Efficient Real-Time Scheduling Using DPM on Mobile Sensors with a Uniform Multi-Cores
Kim, Youngmin; Lee, Chan-Gun
2017-01-01
In wireless sensor networks (WSNs), sensor nodes are deployed for collecting and analyzing data. These nodes use limited-energy batteries for easy deployment and low cost, and battery capacity is closely tied to the lifetime of the sensor nodes, so efficient energy management is important to extending that lifetime. Most effort to improve power efficiency in tiny sensor nodes has focused on reducing the power consumed during data transmission. However, the recent emergence of sensor nodes equipped with multiple cores demands attention to the problem of reducing power consumption in multi-core processors. In this paper, we propose an energy-efficient scheduling method for sensor nodes with uniform multi-core processors. We extend T-Ler plane based scheduling, which provides globally optimal scheduling for uniform multi-core processors and multiprocessors, to enable power management using dynamic power management (DPM). In the proposed approach, a processor-selection and task-to-processor mapping method is proposed to utilize DPM efficiently. Experiments show the effectiveness of the proposed approach compared to other existing methods. PMID:29240695
Application of Multi-Criteria Decision Making (MCDM) Technique for Gradation of Jute Fibres
NASA Astrophysics Data System (ADS)
Choudhuri, P. K.
2014-12-01
Multi-Criteria Decision Making (MCDM) is a branch of Operations Research (OR) with a comparatively short history of about 40 years. It is popularly used in engineering, banking, and policy-making, and can also be applied to everyday decisions such as selecting a car to purchase or choosing a bride or groom. Various MCDM methods, namely the Weighted Sum Model (WSM), Weighted Product Model (WPM), Analytic Hierarchy Process (AHP), Technique for Order Preference by Similarity to Ideal Solutions (TOPSIS), and Elimination and Choice Translating Reality (ELECTRE), exist to solve decision-making problems, each with its own limitations, and it is very difficult to decide which MCDM method is best. MCDM methods are prospective quantitative approaches for solving decision problems involving a finite number of alternatives and criteria. Very few research works in textiles have used this technique, particularly where the main problem is deciding among several alternatives on the basis of conflicting criteria. Grading jute fibres on criteria such as strength, root content, defects, colour, density, and fineness is an important task. The MCDM technique provides ample scope for grading jute fibres, or ranking several varieties, for a particular purpose on the basis of selected criteria and their relative weights. The present paper explores the application of the multiplicative AHP method of multi-criteria decision making to determine the quality values of selected jute fibres on the basis of the important criteria stated above and to rank them accordingly. Good agreement in ranking is observed between the existing Bureau of Indian Standards (BIS) grading and the proposed method.
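The multiplicative (weighted-product) scoring underlying multiplicative AHP can be sketched as follows. The criterion values and weights are invented for illustration, with root content and defects already inverted so that higher is better on every criterion.

```python
import math

weights = {"strength": 0.4, "root": 0.2, "defects": 0.2, "fineness": 0.2}
lots = {  # hypothetical 0-10 scores for three jute fibre lots
    "A": {"strength": 8.1, "root": 6.0, "defects": 7.2, "fineness": 7.5},
    "B": {"strength": 7.4, "root": 7.8, "defects": 6.9, "fineness": 8.0},
    "C": {"strength": 6.8, "root": 8.5, "defects": 8.1, "fineness": 6.2},
}

# Weighted Product Model: overall score is the product of criterion values
# raised to their weights, so ratios (not units) drive the ranking.
scores = {name: math.prod(vals[c] ** w for c, w in weights.items())
          for name, vals in lots.items()}
ranking = sorted(scores, key=scores.get, reverse=True)
```

With these numbers, lot B's balanced profile outweighs lot A's strength advantage, showing how the multiplicative form trades off conflicting criteria.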
NASA Technical Reports Server (NTRS)
Chang, Chau-Lyan; Venkatachari, Balaji Shankar; Cheng, Gary
2013-01-01
With the wide availability of affordable multi-core parallel supercomputers, next-generation numerical simulations of flow physics are focusing on unsteady computations for problems involving multiple time scales and multiple physics. These simulations require higher solution accuracy than most currently available algorithms and computational fluid dynamics codes can provide. This paper focuses on the development of high-fidelity, multi-dimensional, unstructured-mesh flow solvers using the space-time conservation element, solution element (CESE) framework. Two approaches have been investigated in this research to provide high-accuracy, cross-cutting numerical simulations for a variety of flow regimes: 1) time-accurate local time stepping and 2) a high-order CESE method. The first approach uses consistent numerical formulations in the space-time flux integration to preserve temporal conservation across cells with different marching time steps. This relieves the stringent time-step constraint imposed by the smallest time step in the computational domain while preserving temporal accuracy for all cells. For flows involving multiple scales, both numerical accuracy and efficiency can be significantly enhanced. The second approach extends the current CESE solver to higher-order accuracy. Unlike other existing explicit high-order methods for unstructured meshes, the CESE framework maintains a CFL condition of one for arbitrarily high-order formulations while retaining the same compact stencil as its second-order counterpart. For large-scale unsteady computations, this feature substantially enhances numerical efficiency. Numerical formulations and validations using benchmark problems are discussed in this paper along with realistic examples.
Pre-shaping of the Fingertip of Robot Hand Covered with Net Structure Proximity Sensor
NASA Astrophysics Data System (ADS)
Suzuki, Kenji; Suzuki, Yosuke; Hasegawa, Hiroaki; Ming, Aiguo; Ishikawa, Masatoshi; Shimojo, Makoto
To achieve skillful tasks with multi-fingered robot hands, many researchers have been working on sensor-based control. Vision sensors and tactile sensors are indispensable for such tasks; however, the reliability of information from vision sensors decreases as a robot hand approaches an object to be grasped, because of occlusion. This research aims to achieve seamless detection for reliable grasping by using proximity sensors: correcting the positional error of the hand in a vision-based approach, and bringing the fingertip into contact in a posture suited to effective tactile sensing. In this paper, we propose a method for adjusting the posture of the fingertip to the surface of the object. The method applies a “Net-Structure Proximity Sensor” on the fingertip, which can detect the postural error in the roll and pitch axes between the fingertip and the object surface. The experimental result shows that the postural error is corrected in both axes even when the object rotates dynamically.
Green, Melissa A; Kim, Mimi M; Barber, Sharrelle; Odulana, Abedowale A; Godley, Paul A; Howard, Daniel L; Corbie-Smith, Giselle M
2013-05-01
Prevention and treatment standards are based on evidence obtained in behavioral and clinical research. However, racial and ethnic minorities remain relatively absent from the science that develops these standards. While investigators have successfully recruited participants for individual studies using tailored recruitment methods, these strategies require considerable time and resources. Research registries, typically developed around a disease or condition, serve as a promising model for a targeted recruitment method to increase minority participation in health research. This study assessed the tailored recruitment methods used to populate a health research registry targeting African-American community members. We describe six recruitment methods applied between September 2004 and October 2008 to recruit members into a health research registry. Recruitment included direct (existing studies, public databases, community outreach) and indirect methods (radio, internet, and email) targeting the general population, local universities, and African American communities. We conducted retrospective analysis of the recruitment by method using descriptive statistics, frequencies, and chi-square statistics. During the recruitment period, 608 individuals enrolled in the research registry. The majority of enrollees were African American, female, and in good health. Direct and indirect methods were identified as successful strategies for subgroups. Findings suggest significant associations between recruitment methods and age, presence of existing health condition, prior research participation, and motivation to join the registry. A health research registry can be a successful tool to increase minority awareness of research opportunities. Multi-pronged recruitment approaches are needed to reach diverse subpopulations. Copyright © 2013. Published by Elsevier Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mai, Sebastian; Marquetand, Philipp; González, Leticia
2014-08-21
An efficient perturbational treatment of spin-orbit coupling within the framework of high-level multi-reference techniques has been implemented in the most recent version of the COLUMBUS quantum chemistry package, extending the existing fully variational two-component (2c) multi-reference configuration interaction singles and doubles (MRCISD) method. The proposed scheme follows related implementations of quasi-degenerate perturbation theory (QDPT) model space techniques. Our model space is built either from uncontracted, large-scale scalar relativistic MRCISD wavefunctions or based on the scalar-relativistic solutions of the linear-response-theory-based multi-configurational averaged quadratic coupled cluster method (LRT-MRAQCC). The latter approach allows for a consistent, approximately size-consistent and size-extensive treatment of spin-orbit coupling. The approach is described in detail and compared to a number of related techniques. The inherent accuracy of the QDPT approach is validated by comparing cuts of the potential energy surfaces of acrolein and its S, Se, and Te analogues with the corresponding data obtained from matching fully variational spin-orbit MRCISD calculations. The conceptual availability of approximate analytic gradients with respect to geometrical displacements is an attractive feature of the 2c-QDPT-MRCISD and 2c-QDPT-LRT-MRAQCC methods for structure optimization and ab initio molecular dynamics simulations.
Detecting a hierarchical genetic population structure via Multi-InDel markers on the X chromosome
Fan, Guang Yao; Ye, Yi; Hou, Yi Ping
2016-01-01
Detecting population structure and estimating individual biogeographical ancestry are very important in population genetics studies, biomedical research and forensics. Single-nucleotide polymorphism (SNP) has long been considered a primary ancestry-informative marker (AIM), but it is constrained by complex and time-consuming genotyping protocols. Following up on our previous study, we propose that a multi-insertion-deletion polymorphism (Multi-InDel) with multiple haplotypes can be useful in ancestry inference and in resolving hierarchical genetic population structures. A validation study for the X chromosome Multi-InDel marker (X-Multi-InDel) as a novel AIM was conducted. Genetic polymorphisms and genetic distances among three Chinese populations and 14 worldwide populations obtained from the 1000 Genomes database were analyzed. A Bayesian clustering method (STRUCTURE) was used to discern the continental origins of Europe, East Asia, and Africa. A minimal panel of ten X-Multi-InDels was verified to be sufficient to distinguish human ancestries from three major continental regions with nearly the same efficiency as the earlier panel of 21 insertion-deletion AIMs. With the development of more X-Multi-InDels, an approach using this novel marker has the potential for broad applicability as a cost-effective tool toward more accurate determinations of individual biogeographical ancestry and population stratification. PMID:27535707
A novel framework for feature extraction in multi-sensor action potential sorting.
Wu, Shun-Chi; Swindlehurst, A Lee; Nenadic, Zoran
2015-09-30
Extracellular recordings of multi-unit neural activity have become indispensable in neuroscience research. The analysis of the recordings begins with the detection of the action potentials (APs), followed by a classification step where each AP is associated with a given neural source. A feature extraction step is required prior to classification in order to reduce the dimensionality of the data and the impact of noise, allowing source clustering algorithms to work more efficiently. In this paper, we propose a novel framework for multi-sensor AP feature extraction based on the so-called Matched Subspace Detector (MSD), which is shown to be a natural generalization of standard single-sensor algorithms. Clustering using both simulated data and real AP recordings taken in the locust antennal lobe demonstrates that the proposed approach yields features that are discriminatory and lead to promising results. Unlike existing methods, the proposed algorithm finds joint spatio-temporal feature vectors that match the dominant subspace observed in the two-dimensional data without the need for a forward propagation model or AP templates. The proposed MSD approach provides more discriminatory features for unsupervised AP sorting applications. Copyright © 2015 Elsevier B.V. All rights reserved.
A Systems-Based Approach To Integrated Nutrient Management in Narragansett Bay and Its Watershed.
EPA’s Office of Research and Development is embarking on a project to develop and demonstrate a systems-based management approach that will achieve more integrated and effective management of nutrients in southern New England. The geographic focus of this multi-year research proj...
Yu, Tsung-Hsien; Tung, Yu-Chi; Chung, Kuo-Piao
2015-08-01
Volume-infection relation studies have been published for high-risk surgical procedures, although the conclusions remain controversial. Inconsistent results may be caused by inconsistent categorization methods, differing definitions of service volume, and different statistical approaches. The purpose of this study was to examine whether a relation exists between provider volume and coronary artery bypass graft (CABG) surgical site infection (SSI) under different categorization methods. A population-based cross-sectional multi-level study was conducted. A total of 10,405 patients who received CABG surgery between 2006 and 2008 in Taiwan were recruited. The outcome of interest was surgical site infection after CABG surgery. The associations among several patient, surgeon, and hospital characteristics were examined. Surgeons' and hospitals' service volumes were defined as the cumulative CABG volume in the year preceding each CABG operation and categorized by three types of approaches: continuous, quartile, and k-means clustering. The results of multi-level mixed-effects modeling showed that hospital volume had no association with SSI. Although the relation between surgeon volume and surgical site infection was negative, it was inconsistent across the categorization methods. Categorization of service volume is an important issue in volume-infection studies. The findings of the current study suggest that different categorization methods can influence the relation between volume and SSI. The selection of an optimal cutoff point should be taken into account in future research.
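The three categorization schemes compared in the study (continuous, quartile, and k-means) can be sketched on invented surgeon volumes using only the standard library:

```python
import statistics

# Hypothetical annual CABG volumes for 12 surgeons
volumes = [5, 8, 12, 15, 22, 30, 41, 55, 60, 75, 90, 120]

# (1) Continuous: use the raw volume directly as a covariate.

# (2) Quartiles: three cut points split surgeons into Q1..Q4.
cuts = statistics.quantiles(volumes, n=4)
def quartile(v):
    return sum(v > c for c in cuts) + 1   # 1..4

# (3) One-dimensional k-means (k=3) via a few Lloyd iterations.
centers = [min(volumes), statistics.median(volumes), max(volumes)]
for _ in range(20):
    clusters = [[] for _ in centers]
    for v in volumes:
        nearest = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
        clusters[nearest].append(v)
    centers = [statistics.mean(c) if c else centers[i]
               for i, c in enumerate(clusters)]
```

Note how the quartile and k-means cut points land in different places for the same data, which is precisely the sensitivity to categorization the study reports.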
Computational Materials: Modeling and Simulation of Nanostructured Materials and Systems
NASA Technical Reports Server (NTRS)
Gates, Thomas S.; Hinkley, Jeffrey A.
2003-01-01
The paper provides details on the structure and implementation of the Computational Materials program at the NASA Langley Research Center. Examples are given that illustrate the suggested approaches to predicting the behavior and influencing the design of nanostructured materials such as high-performance polymers, composites, and nanotube-reinforced polymers. Primary simulation and measurement methods applicable to multi-scale modeling are outlined. Key challenges including verification and validation of models are highlighted and discussed within the context of NASA's broad mission objectives.
Estimating Top-of-Atmosphere Thermal Infrared Radiance Using MERRA-2 Atmospheric Data
NASA Astrophysics Data System (ADS)
Kleynhans, Tania
Space borne thermal infrared sensors have been extensively used for environmental research as well as cross-calibration of other thermal sensing systems. Thermal infrared data from satellites such as Landsat and Terra/MODIS have limited temporal resolution (with a repeat cycle of 1 to 2 days for Terra/MODIS, and 16 days for Landsat). Thermal instruments with finer temporal resolution on geostationary satellites have limited utility for cross-calibration due to their large view angles. Reanalysis atmospheric data is available on a global spatial grid at three hour intervals making it a potential alternative to existing satellite image data. This research explores using the Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2) reanalysis data product to predict top-of-atmosphere (TOA) thermal infrared radiance globally at time scales finer than available satellite data. The MERRA-2 data product provides global atmospheric data every three hours from 1980 to the present. Due to the high temporal resolution of the MERRA-2 data product, opportunities for novel research and applications are presented. While MERRA-2 has been used in renewable energy and hydrological studies, this work seeks to leverage the model to predict TOA thermal radiance. Two approaches have been followed, namely physics-based approach and a supervised learning approach, using Terra/MODIS band 31 thermal infrared data as reference. The first physics-based model uses forward modeling to predict TOA thermal radiance. The second model infers the presence of clouds from the MERRA-2 atmospheric data, before applying an atmospheric radiative transfer model. The last physics-based model parameterized the previous model to minimize computation time. The second approach applied four different supervised learning algorithms to the atmospheric data. 
The algorithms included a linear least squares regression model, a non-linear support vector regression (SVR) model, a multi-layer perceptron (MLP), and a convolutional neural network (CNN). This research found that the multi-layer perceptron model produced the lowest error rates overall, with an RMSE of 1.22 W/(m²·sr·µm) when compared to actual Terra/MODIS band 31 image data. This research further aimed to characterize the errors associated with each method so that any potential user will have the best information available should they wish to apply these methods towards their own application.
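Of the four learners, the linear least-squares baseline is easy to sketch. The predictor variables, coefficients, and noise level below are invented stand-ins, not MERRA-2 or MODIS values; the point is the regression-and-RMSE workflow, not the physics.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
Ts = rng.uniform(250, 320, n)   # surface temperature [K] (synthetic)
wv = rng.uniform(0.5, 6.0, n)   # column water vapor [cm] (synthetic)
o3 = rng.uniform(0.2, 0.5, n)   # column ozone [atm-cm] (synthetic)

# Invented linear relation plus noise, standing in for band-31 TOA radiance
L = 0.09 * Ts - 0.8 * wv - 2.0 * o3 - 14.0 + rng.normal(0.0, 0.5, n)

# Fit by ordinary least squares and report the in-sample RMSE
A = np.column_stack([Ts, wv, o3, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, L, rcond=None)
rmse = np.sqrt(np.mean((A @ coef - L) ** 2))
```

The nonlinear models in the study (SVR, MLP, CNN) would replace the `lstsq` fit but be scored with the same RMSE comparison against reference imagery.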
Application of a Sensemaking Approach to Ethics Training in the Physical Sciences and Engineering
NASA Astrophysics Data System (ADS)
Kligyte, Vykinta; Marcy, Richard T.; Waples, Ethan P.; Sevier, Sydney T.; Godfrey, Elaine S.; Mumford, Michael D.; Hougen, Dean F.
2008-06-01
Integrity is a critical determinant of the effectiveness of research organizations in terms of producing high quality research and educating the new generation of scientists. A number of responsible conduct of research (RCR) training programs have been developed to address this growing organizational concern. However, in spite of a significant body of research in ethics training, it is still unknown which approach has the highest potential to enhance researchers' integrity. One of the approaches showing some promise in improving researchers' integrity has focused on the development of ethical decision-making skills. The current effort proposes a novel curriculum that focuses on broad metacognitive reasoning strategies researchers use when making sense of day-to-day social and professional practices that have ethical implications for the physical sciences and engineering. This sensemaking training has been implemented in a professional sample of scientists conducting research in electrical engineering, atmospheric and computer sciences at a large multi-cultural, multi-disciplinary, and multi-university research center. A pre-post design was used to assess training effectiveness using scenario-based ethical decision-making measures. The training resulted in enhanced ethical decision-making of researchers in relation to four ethical conduct areas, namely data management, study conduct, professional practices, and business practices. In addition, sensemaking training led to researchers' preference for decisions involving the application of the broad metacognitive reasoning strategies. Individual trainee and training characteristics were used to explain the study findings. Broad implications of the findings for ethics training development, implementation, and evaluation in the sciences are discussed.
A Visual Analytics Approach for Station-Based Air Quality Data
Du, Yi; Ma, Cuixia; Wu, Chao; Xu, Xiaowei; Guo, Yike; Zhou, Yuanchun; Li, Jianhui
2016-01-01
With the deployment of multi-modality and large-scale sensor networks for monitoring air quality, we are now able to collect large and multi-dimensional spatio-temporal datasets. For these sensed data, we present a comprehensive visual analysis approach for air quality analysis. This approach integrates several visual methods, such as map-based views, calendar views, and trends views, to assist the analysis. Among those visual methods, map-based visual methods are used to display the locations of interest, and the calendar and the trends views are used to discover the linear and periodical patterns. The system also provides various interaction tools to combine the map-based visualization, trends view, calendar view and multi-dimensional view. In addition, we propose a self-adaptive calendar-based controller that can flexibly adapt the changes of data size and granularity in trends view. Such a visual analytics system would facilitate big-data analysis in real applications, especially for decision making support. PMID:28029117
Combinatorial Optimization in Project Selection Using Genetic Algorithm
NASA Astrophysics Data System (ADS)
Dewi, Sari; Sawaluddin
2018-01-01
This paper discusses the problem of project selection in the presence of two objective functions, maximizing profit and minimizing cost, together with constraints of limited resource availability and available time, so that resources must be allocated to each project. These resources are human resources, machine resources, and raw-material resources, and the allocation must not exceed the predetermined budget. The problem can thus be formulated mathematically as a multi-objective function subject to constraints that must be satisfied. To assist the project selection process, a multi-objective combinatorial optimization approach is used to obtain an optimal solution for selecting the right projects. A multi-objective genetic algorithm is then described as one multi-objective combinatorial optimization method that simplifies the project selection process at a large scale.
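A minimal genetic-algorithm sketch for project selection follows, with the two objectives scalarized into one fitness (maximize profit, penalize cost) under budget and man-hour constraints. All project data, caps, and GA parameters are hypothetical; the paper's exact encoding is not reproduced.

```python
import random

random.seed(42)

# Hypothetical projects: (profit, cost, man-hours), with caps below.
projects = [(12, 5, 30), (9, 4, 20), (15, 8, 45),
            (7, 3, 15), (11, 6, 25), (6, 2, 10)]
BUDGET, HOURS = 18, 90

def fitness(bits):
    chosen = [p for b, p in zip(bits, projects) if b]
    profit = sum(p for p, c, h in chosen)
    cost = sum(c for p, c, h in chosen)
    hours = sum(h for p, c, h in chosen)
    if cost > BUDGET or hours > HOURS:
        return -1                      # infeasible selection
    return profit - 0.5 * cost         # scalarized two-objective fitness

def evolve(pop_size=40, gens=60):
    pop = [[random.randint(0, 1) for _ in projects] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]         # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(projects))   # one-point crossover
            child = a[:cut] + b[cut:]
            child[random.randrange(len(projects))] ^= 1  # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

A true multi-objective variant would keep a Pareto archive instead of the weighted sum, but the chromosome, crossover, and mutation operators stay the same.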
Adaptation of Decoy Fusion Strategy for Existing Multi-Stage Search Workflows
NASA Astrophysics Data System (ADS)
Ivanov, Mark V.; Levitsky, Lev I.; Gorshkov, Mikhail V.
2016-09-01
A number of proteomic database search engines implement multi-stage strategies aiming at increasing the sensitivity of proteome analysis. These approaches often employ a subset of the original database for the secondary stage of analysis. However, if target-decoy approach (TDA) is used for false discovery rate (FDR) estimation, the multi-stage strategies may violate the underlying assumption of TDA that false matches are distributed uniformly across the target and decoy databases. This violation occurs if the numbers of target and decoy proteins selected for the second search are not equal. Here, we propose a method of decoy database generation based on the previously reported decoy fusion strategy. This method allows unbiased TDA-based FDR estimation in multi-stage searches and can be easily integrated into existing workflows utilizing popular search engines and post-search algorithms.
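The TDA estimate the decoy-fusion work builds on can be sketched directly: rank PSMs by score, count decoys above each threshold, and convert the FDR estimates into monotone q-values. The scores and labels below are synthetic, and the decoy-fusion construction itself is not modeled.

```python
# Synthetic PSMs: (score, label), "t" = target match, "d" = decoy match.
psms = [(9.1, "t"), (8.7, "t"), (8.2, "d"), (7.9, "t"), (7.5, "t"),
        (7.1, "d"), (6.8, "t"), (6.5, "d"), (6.1, "t"), (5.8, "d")]

psms.sort(key=lambda s: s[0], reverse=True)
fdrs = []
targets = decoys = 0
for score, label in psms:
    if label == "t":
        targets += 1
    else:
        decoys += 1
    # FDR estimate at this score threshold: decoys / targets
    fdrs.append(decoys / max(targets, 1))

# q-value: the minimum FDR achievable at any threshold that includes this PSM
qvals = fdrs[:]
for i in range(len(qvals) - 2, -1, -1):
    qvals[i] = min(qvals[i], qvals[i + 1])
```

The uniformity assumption the abstract highlights enters through the decoys/targets ratio: if a second-stage search shrinks the decoy database relative to the target one, this ratio underestimates the FDR.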
Analyzing Human-Landscape Interactions: Tools That Integrate
NASA Astrophysics Data System (ADS)
Zvoleff, Alex; An, Li
2014-01-01
Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature—in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool, and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for analysts to make the broadest possible array of information available to researchers, environmental managers, and policymakers. 
Further development of new approaches to data fusion and integration across sites and disciplines poses an important challenge for future work in integrating human and landscape components.
NASA Astrophysics Data System (ADS)
Ballesteros, Daniel; Jiménez-Sánchez, Montserrat; Giralt, Santiago; García-Sansegundo, Joaquín; Meléndez-Asensio, Mónica
2015-10-01
Speleogenetic research on alpine caves has advanced significantly during the last decades. These investigations require techniques from different geoscience disciplines that must be adapted to the methodological constraints of working in deep caves. The Picos de Europa mountains are one of the most important alpine karsts, including 14% of the World's Deepest Caves (caves with more than 1 km depth). Speleogenetic research is currently under way in selected caves in these mountains; one of them, the Torca La Texa shaft, is the focus of this article. For this purpose, we propose both an optimized multi-method approach for speleogenetic research in alpine caves and a speleogenetic model of the Torca La Texa shaft. The methodology includes: cave surveying, dye-tracing, cave geometry analyses, cave geomorphological mapping, Uranium-series dating (234U/230Th) and geomorphological, structural and stratigraphical studies of the cave surroundings. The SpeleoDisc method was employed to establish the structural control of the cavity. Torca La Texa (2653 m length, 215 m depth) is an alpine cave formed by two cave levels, vadose canyons and shafts, soutirage conduits, and gravity-modified passages. The cave was formed prior to the Middle Pleistocene and its development was controlled by the drop of the base level, which produced the two cave levels. Coeval with the formation of the cave levels, soutirage conduits originated connecting phreatic and epiphreatic conduits, and vadose canyons and shafts were formed. Most of the shafts were created before the local glacial maximum (43-45 ka) and only two cave passages are related to dolines developed in recent times. The cave development is strongly related to the structure: the cave is located in the core of a gentle fold, with the conduits' geometry and orientation controlled by the bedding and five families of joints.
Le Troter, Arnaud; Fouré, Alexandre; Guye, Maxime; Confort-Gouny, Sylviane; Mattei, Jean-Pierre; Gondin, Julien; Salort-Campana, Emmanuelle; Bendahan, David
2016-04-01
Atlas-based segmentation is a powerful method for automatic structural segmentation of several sub-structures in many organs. However, such an approach has been very scarcely used in the context of muscle segmentation, and so far no study has assessed such a method for the automatic delineation of individual muscles of the quadriceps femoris (QF). In the present study, we have evaluated a fully automated multi-atlas method and a semi-automated single-atlas method for the segmentation and volume quantification of the four muscles of the QF and for the QF as a whole. The study was conducted in 32 young healthy males, using high-resolution magnetic resonance images (MRI) of the thigh. The multi-atlas-based segmentation method was conducted in 25 subjects. Different non-linear registration approaches based on free-form deformable (FFD) and symmetric diffeomorphic normalization algorithms (SyN) were assessed. Optimal parameters of two fusion methods, i.e., STAPLE and STEPS, were determined on the basis of the highest Dice similarity index (DSI) considering manual segmentation (MSeg) as the ground truth. Validation and reproducibility of this pipeline were determined using another MRI dataset recorded in seven healthy male subjects on the basis of additional metrics such as the muscle volume similarity values, intraclass coefficient, and coefficient of variation. Both non-linear registration methods (FFD and SyN) were also evaluated as part of a single-atlas strategy in order to assess longitudinal muscle volume measurements. The multi- and the single-atlas approaches were compared for the segmentation and the volume quantification of the four muscles of the QF and for the QF as a whole. Considering each muscle of the QF, the DSI of the multi-atlas-based approach was high (0.87 ± 0.11) and the best results were obtained with the combination of two deformation fields resulting from the SyN registration method and the STEPS fusion algorithm.
The optimal variables for the FFD and SyN registration methods were four templates and a kernel standard deviation ranging between 5 and 8. The segmentation process using a single-atlas-based method was more robust, with DSI values higher than 0.9. From the vantage of muscle volume measurements, the multi-atlas-based strategy provided acceptable results for the QF muscle as a whole but highly variable results for individual muscles. In contrast, the performance of the single-atlas-based pipeline for individual muscles was highly comparable to the MSeg, thereby indicating that this method would be adequate for longitudinal tracking of muscle volume changes in healthy subjects. In the present study, we demonstrated that both multi-atlas and single-atlas approaches were relevant for the segmentation of individual muscles of the QF in healthy subjects. Considering muscle volume measurements, the single-atlas method provided promising perspectives regarding longitudinal quantification of individual muscle volumes.
Optimizing regional collaborative efforts to achieve long-term discipline-specific objectives
USDA-ARS?s Scientific Manuscript database
Current funding programs focused on multi-disciplinary, multi-agency approaches to regional issues can provide opportunities to address discipline-specific advancements in scientific knowledge. Projects funded through the Agricultural Research Service, Joint Fire Science Program, and the Natural Re...
A multi-scale Q1/P0 approach to Lagrangian shock hydrodynamics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shashkov, Mikhail; Love, Edward; Scovazzi, Guglielmo
A new multi-scale, stabilized method for Q1/P0 finite element computations of Lagrangian shock hydrodynamics is presented. Instabilities (of hourglass type) are controlled by a stabilizing operator derived using the variational multi-scale analysis paradigm. The resulting stabilizing term takes the form of a pressure correction. With respect to currently implemented hourglass control approaches, the novelty of the method resides in its residual-based character. The stabilizing residual has a definite physical meaning, since it embeds a discrete form of the Clausius-Duhem inequality. Effectively, the proposed stabilization samples and acts to counter the production of entropy due to numerical instabilities. The proposed technique is applicable to materials with no shear strength, for which there exists a caloric equation of state. The stabilization operator is incorporated into a mid-point, predictor/multi-corrector time integration algorithm, which conserves mass, momentum and total energy. Encouraging numerical results in the context of compressible gas dynamics confirm the potential of the method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aderem, Alan; Adkins, Joshua N.; Ansong, Charles
The 20th century was marked by extraordinary advances in our understanding of microbes and infectious disease, but pandemics remain, food and water borne illnesses are frequent, multi-drug resistant microbes are on the rise, and the needed drugs and vaccines have not been developed. The scientific approaches of the past—including the intense focus on individual genes and proteins typical of molecular biology—have not been sufficient to address these challenges. The first decade of the 21st century has seen remarkable innovations in technology and computational methods. These new tools provide nearly comprehensive views of complex biological systems and can provide a correspondingly deeper understanding of pathogen-host interactions. To take full advantage of these innovations, the National Institute of Allergy and Infectious Diseases recently initiated the Systems Biology Program for Infectious Disease Research. As participants of the Systems Biology Program we think that the time is at hand to redefine the pathogen-host research paradigm.
2008-01-01
element method (BEM). Reynolds-averaged Navier-Stokes (RANS) and the particle finite element method (PFEM) will be used in the water/mine/sand domain, with a deformable sandy seabed (median grain diameter: 0.2 mm). [Figure residue: coupled solver labels SOLID/FEM, SAND/SPH (geomaterials), FNPF/BEM, RANS/PFEM.]
ERIC Educational Resources Information Center
Proyer, Rene T.; Sidler, Nicole; Weber, Marco; Ruch, Willibald
2012-01-01
The relationship between character strengths and vocational interests was tested. In an online study, 197 thirteen to eighteen year-olds completed a questionnaire measuring character strengths and a multi-method measure for interests (questionnaire, nonverbal test, and objective personality tests). The main findings were that intellectual…
The application analysis of the multi-angle polarization technique for ocean color remote sensing
NASA Astrophysics Data System (ADS)
Zhang, Yongchao; Zhu, Jun; Yin, Huan; Zhang, Keli
2017-02-01
The multi-angle polarization technique, which uses the intensity of polarized radiation as the observed quantity, is a new remote sensing means for earth observation. With this method, not only can multi-angle light intensity data be provided, but multi-angle information on polarized radiation can also be obtained. The technique may therefore solve problems that cannot be addressed with traditional remote sensing methods. Nowadays, the multi-angle polarization technique has become one of the hot topics in the field of international quantitative remote sensing research. In this paper, we first introduce the principles of the multi-angle polarization technique; the state of basic research and engineering applications is then summarized and analysed for 1) the method for peeling off sun glitter based on polarization, 2) ocean color remote sensing based on polarization, 3) oil spill detection using the polarization technique, and 4) ocean aerosol monitoring based on polarization. Finally, based on previous work, we briefly present the problems and prospects of the multi-angle polarization technique as applied to China's ocean color remote sensing.
Research into a distributed fault diagnosis system and its application
NASA Astrophysics Data System (ADS)
Qian, Suxiang; Jiao, Weidong; Lou, Yongjian; Shen, Xiaomei
2005-12-01
CORBA (Common Object Request Broker Architecture) is a solution for distributed computing over heterogeneous systems that establishes a communication protocol between distributed objects, with great emphasis on interoperation between them. However, only after developing suitable application approaches and practical monitoring and diagnosis technology can customers share monitoring and diagnosis information, so that remote multi-expert cooperative diagnosis online can be achieved. This paper aims at building an open fault monitoring and diagnosis platform combining CORBA, the Web, and agents. Heterogeneous diagnosis objects interoperate in independent threads through the CORBA soft-bus, enabling resource sharing and online multi-expert cooperative diagnosis, and overcoming shortcomings such as limited diagnostic knowledge, reliance on a single diagnostic technique, and incomplete analysis functions, so that more complicated and deeper diagnosis can be carried out. Taking a high-speed centrifugal air compressor set as an example, we demonstrate distributed diagnosis based on CORBA. This shows that more efficient approaches can be found to settle problems such as real-time monitoring and diagnosis over the network and the decomposition of complicated tasks, by fusing CORBA, Web techniques, and an agent frame model in complementary research. In this system, a multi-diagnosis intelligent agent helps improve diagnosis efficiency. Besides, the system offers an open environment, making it easy for diagnosis objects to upgrade and for new diagnosis server objects to join.
A hybrid approach for efficient anomaly detection using metaheuristic methods
Ghanem, Tamer F.; Elkilani, Wail S.; Abdul-kader, Hatem M.
2014-01-01
Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation. Yet, reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large scale datasets using detectors generated based on multi-start metaheuristic method and genetic algorithms. The proposed approach has taken some inspiration of negative selection-based detector generation. The evaluation of this approach is performed using NSL-KDD dataset which is a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors with an accuracy of 96.1% compared to other competitors of machine learning algorithms. PMID:26199752
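The negative-selection flavour of detector generation described in the abstract above can be sketched in a few lines. This is an illustrative toy, not the authors' NSL-KDD pipeline: the "self" samples, radii, and the use of seeded random restarts as a stand-in for the multi-start metaheuristic are all invented for the example.

```python
import math
import random

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def generate_detectors(self_set, n_detectors, self_radius, dim, n_starts=4):
    """Censoring step of negative selection: random candidates that fall
    within self_radius of any normal ('self') sample are discarded.
    Restarting from several seeds stands in for the multi-start component."""
    detectors = []
    for start in range(n_starts):
        rng = random.Random(start)          # one deterministic seed per start
        for _ in range(500):
            if len(detectors) >= n_detectors:
                break
            cand = tuple(rng.random() for _ in range(dim))
            if all(euclid(cand, s) > self_radius for s in self_set):
                detectors.append(cand)
    return detectors

def is_anomalous(x, detectors, detect_radius):
    """A point is flagged if it lies within detect_radius of any detector."""
    return any(euclid(x, d) <= detect_radius for d in detectors)
```

Points near the normal cluster are never flagged (detectors are censored away from it), while points in the uncovered region fall inside some detector's hypersphere.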
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kucharik, M.; Scovazzi, Guglielmo; Shashkov, Mikhail Jurievich
Hourglassing is a well-known pathological numerical artifact affecting the robustness and accuracy of Lagrangian methods. There exist a large number of hourglass control/suppression strategies. In the community of the staggered compatible Lagrangian methods, the approach of sub-zonal pressure forces is among the most widely used. However, this approach is known to add numerical strength to the solution, which can cause potential problems in certain types of simulations, for instance in simulations of various instabilities. To avoid this complication, we have adapted the multi-scale residual-based stabilization typically used in the finite element approach for staggered compatible framework. In this study, we describe two discretizations of the new approach and demonstrate their properties and compare with the method of sub-zonal pressure forces on selected numerical problems.
A hybrid approach for efficient anomaly detection using metaheuristic methods.
Ghanem, Tamer F; Elkilani, Wail S; Abdul-Kader, Hatem M
2015-07-01
Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation. Yet, reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large scale datasets using detectors generated based on multi-start metaheuristic method and genetic algorithms. The proposed approach has taken some inspiration of negative selection-based detector generation. The evaluation of this approach is performed using NSL-KDD dataset which is a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors with an accuracy of 96.1% compared to other competitors of machine learning algorithms.
Kucharik, M.; Scovazzi, Guglielmo; Shashkov, Mikhail Jurievich; ...
2017-10-28
Hourglassing is a well-known pathological numerical artifact affecting the robustness and accuracy of Lagrangian methods. There exist a large number of hourglass control/suppression strategies. In the community of the staggered compatible Lagrangian methods, the approach of sub-zonal pressure forces is among the most widely used. However, this approach is known to add numerical strength to the solution, which can cause potential problems in certain types of simulations, for instance in simulations of various instabilities. To avoid this complication, we have adapted the multi-scale residual-based stabilization typically used in the finite element approach for staggered compatible framework. In this study, we describe two discretizations of the new approach and demonstrate their properties and compare with the method of sub-zonal pressure forces on selected numerical problems.
On the Design of the Peer-Assisted UGC VoD System
NASA Astrophysics Data System (ADS)
Wan, Yi; Asaka, Takuya; Takahashi, Tatsuro
User Generated Content (UGC) VoD services such as YouTube are becoming more and more popular, and their maintenance costs are growing as well. Many P2P solutions have been proposed to reduce server load in such systems, but almost all of them focus on the single-video approach, which only has limited effect on the systems serving short videos such as UGC. The purpose of this paper is to investigate the potential of an alternative approach, the multi-video approach, and we use a very simple method called collaborative caching to show that methods using the multi-video approach are generally more suitable for current UGC VoD systems. We also study the influence of the major design factors through simulations and provide guidelines for efficiently building systems with this method.
NASA Astrophysics Data System (ADS)
Indarsih, Indrati, Ch. Rini
2016-02-01
In this paper, we define the variance of fuzzy random variables through alpha levels. We have a theorem that can be used to show that the variance of a fuzzy random variable is a fuzzy number. We consider a multi-objective linear programming (MOLP) problem with fuzzy random objective function coefficients. We solve the problem by a variance approach, which transforms the MOLP with fuzzy random objective function coefficients into an MOLP with fuzzy objective function coefficients. By the weighting method, we obtain a linear program with fuzzy coefficients, which we solve by the simplex method for fuzzy linear programming.
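The transformation chain described above (fuzzy random objectives, then fuzzy objectives, then a crisp weighted problem) can be illustrated on a toy problem. This sketch substitutes a simple centroid defuzzification of triangular fuzzy numbers and a brute-force vertex search for the authors' variance approach and fuzzy simplex method; all coefficients are invented.

```python
import itertools

def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number (l, m, u)."""
    l, m, u = tfn
    return (l + m + u) / 3.0

def weighted_objective(fuzzy_costs, weights):
    """Collapse several fuzzy objective vectors into one crisp cost vector
    via the weighting method (weighted sum of defuzzified coefficients)."""
    n_vars = len(fuzzy_costs[0])
    return [sum(w * centroid(obj[j]) for w, obj in zip(weights, fuzzy_costs))
            for j in range(n_vars)]

def solve_2var_lp(c, constraints):
    """Maximize c.x over x >= 0 subject to (a1, a2, b) rows meaning
    a1*x1 + a2*x2 <= b, by enumerating constraint-intersection vertices."""
    cands = [(0.0, 0.0)]
    for a1, a2, b in constraints:           # axis intercepts
        if a1 > 0: cands.append((b / a1, 0.0))
        if a2 > 0: cands.append((0.0, b / a2))
    for (a1, a2, b), (d1, d2, e) in itertools.combinations(constraints, 2):
        det = a1 * d2 - a2 * d1             # pairwise intersections
        if abs(det) > 1e-12:
            cands.append(((b * d2 - a2 * e) / det, (a1 * e - b * d1) / det))
    feas = [p for p in cands
            if p[0] >= -1e-9 and p[1] >= -1e-9
            and all(a1 * p[0] + a2 * p[1] <= b + 1e-9 for a1, a2, b in constraints)]
    return max(feas, key=lambda p: c[0] * p[0] + c[1] * p[1])
```

An LP optimum lies at a vertex of the feasible polygon, so enumerating vertices is a legitimate (if unscalable) substitute for the simplex method in two variables.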
NASA Astrophysics Data System (ADS)
Mao, Cuili; Lu, Rongsheng; Liu, Zhijian
2018-07-01
In fringe projection profilometry, the phase errors caused by the nonlinear intensity response of digital projectors need to be correctly compensated. In this paper, a multi-frequency inverse-phase method is proposed. The theoretical model of periodical phase errors is analyzed. The periodical phase errors can be adaptively compensated in the wrapped maps by using a set of fringe patterns. The compensated phase is then unwrapped with the multi-frequency method. Compared with conventional methods, the proposed method can greatly reduce the periodical phase error without calibrating the measurement system. Simulation and experimental results are presented to demonstrate the validity of the proposed approach.
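The inverse-phase idea above (cancelling the periodic, gamma-induced phase error by averaging estimates from a normal and a pi-shifted fringe set) can be demonstrated with a simulated three-step algorithm. The fringe model, gamma value, and three-step arctangent formula are illustrative assumptions, not the paper's exact multi-frequency pipeline.

```python
import math

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def three_step_phase(phi, gamma, offset=0.0):
    """Simulate three gamma-distorted fringe samples and recover the
    wrapped phase with the standard three-step arctangent formula."""
    I = [(0.5 + 0.4 * math.cos(phi + offset + 2 * math.pi * n / 3)) ** gamma
         for n in range(3)]
    est = math.atan2(math.sqrt(3) * (I[2] - I[1]), 2 * I[0] - I[1] - I[2])
    return wrap(est - offset)               # remove the deliberate offset

def inverse_phase_estimate(phi, gamma):
    """Average the normal and pi-shifted ('inverse') estimates on the unit
    circle; the dominant ripple flips sign under the pi shift and cancels."""
    e1 = three_step_phase(phi, gamma)
    e2 = three_step_phase(phi, gamma, math.pi)
    return math.atan2(math.sin(e1) + math.sin(e2), math.cos(e1) + math.cos(e2))
```

For a three-step algorithm the leading gamma-induced error behaves like sin(3*phi), and sin(3*(phi + pi)) = -sin(3*phi), which is why the pi-shifted set cancels it to first order.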
A Multi-Level Systems Perspective for the Science of Team Science
Börner, Katy; Contractor, Noshir; Falk-Krzesinski, Holly J.; Fiore, Stephen M.; Hall, Kara L.; Keyton, Joann; Spring, Bonnie; Stokols, Daniel; Trochim, William; Uzzi, Brian
2012-01-01
This Commentary describes recent research progress and professional developments in the study of scientific teamwork, an area of inquiry termed the “science of team science” (SciTS, pronounced “sahyts”). It proposes a systems perspective that incorporates a mixed-methods approach to SciTS that is commensurate with the conceptual, methodological, and translational complexities addressed within the SciTS field. The theoretically grounded and practically useful framework is intended to integrate existing and future lines of SciTS research to facilitate the field’s evolution as it addresses key challenges spanning macro, meso, and micro levels of analysis. PMID:20844283
Developing and Validating the Socio-Technical Model in Ontology Engineering
NASA Astrophysics Data System (ADS)
Silalahi, Mesnan; Indra Sensuse, Dana; Giri Sucahyo, Yudho; Fadhilah Akmaliah, Izzah; Rahayu, Puji; Cahyaningsih, Elin
2018-03-01
This paper describes results from an attempt to develop a model in ontology engineering methodology and a way to validate the model. The approach to methodology in ontology engineering is from the point of view of socio-technical system theory. Qualitative research synthesis is used to build the model using meta-ethnography. In order to ensure the objectivity of the measurement, the inter-rater reliability method was applied using multi-rater Fleiss' Kappa. The results show that the research output accords with the diamond model of socio-technical system theory, as evidenced by the interdependency of the four socio-technical variables, namely people, technology, structure and task.
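The multi-rater Fleiss' Kappa used above for inter-rater reliability has a compact closed form. A minimal sketch of the standard formula follows; the example data in the usage below are invented.

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for multi-rater agreement.
    counts[i][j] = number of raters assigning subject i to category j;
    every subject must be rated by the same number of raters."""
    N = len(counts)                          # subjects
    n = sum(counts[0])                       # raters per subject
    k = len(counts[0])                       # categories
    p_j = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]
    P_bar = sum(P_i) / N                     # mean observed agreement
    P_e = sum(p * p for p in p_j)            # chance agreement
    return (P_bar - P_e) / (1 - P_e)
```

Perfect agreement across subjects in different categories yields kappa = 1; systematic disagreement drives it negative.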
Wang, Bao-Zhen; Chen, Zhi
2013-01-01
This article presents a GIS-based multi-source and multi-box modeling approach (GMSMB) to predict the spatial concentration distributions of airborne pollutants on local and regional scales. In this method, an extended multi-box model combined with a multi-source, multi-grid Gaussian model is developed within the GIS framework to examine the contributions from both point- and area-source emissions. By using GIS, a large amount of data required for air quality modeling, including emission sources, air quality monitoring, meteorological data, and spatial location information, is brought into an integrated modeling environment, allowing spatial variation in source distributions and meteorological conditions to be analyzed quantitatively and in more detail. The developed modeling approach has been examined by predicting the spatial concentration distributions of four air pollutants (CO, NO(2), SO(2) and PM(2.5)) for the State of California. The modeling results are compared with monitoring data. Good agreement was obtained, demonstrating that the developed modeling approach can deliver an effective air pollution assessment on both regional and local scales to support air pollution control and management planning.
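The point-source component of such a model is typically the textbook ground-reflected Gaussian plume formula; a minimal sketch follows, with all parameter values in the usage invented. This is the generic formula, not necessarily GMSMB's exact implementation.

```python
import math

def gaussian_plume(q, u, y, z, h, sig_y, sig_z):
    """Steady-state Gaussian plume concentration at crosswind distance y
    and height z, for emission rate q, wind speed u, effective stack
    height h, and dispersion parameters sig_y, sig_z. The second vertical
    term is the image source modeling total reflection at the ground."""
    lateral = math.exp(-y * y / (2 * sig_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sig_z ** 2)) +
                math.exp(-(z + h) ** 2 / (2 * sig_z ** 2)))
    return q / (2 * math.pi * u * sig_y * sig_z) * lateral * vertical
```

Summing this kernel over all point sources on a grid, and adding a box-model background for area sources, gives the flavour of a multi-source, multi-box prediction.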
A multi-method approach toward de novo glycan characterization: a Man-5 case study.
Prien, Justin M; Prater, Bradley D; Cockrill, Steven L
2010-05-01
Regulatory agencies' expectations for biotherapeutic approval are becoming more stringent with regard to product characterization, where minor species as low as 0.1% of a given profile are typically identified. The mission of this manuscript is to demonstrate a multi-method approach toward de novo glycan characterization and quantitation, including minor species at or approaching the 0.1% benchmark. Recently, unexpected isomers of the Man(5)GlcNAc(2) (M(5)) were reported (Prien JM, Ashline DJ, Lapadula AJ, Zhang H, Reinhold VN. 2009. The high mannose glycans from bovine ribonuclease B isomer characterization by ion trap mass spectrometry (MS). J Am Soc Mass Spectrom. 20:539-556). In the current study, quantitative analysis of these isomers found in commercial M(5) standard demonstrated that they are in low abundance (<1% of the total) and therefore an exemplary "litmus test" for minor species characterization. A simple workflow devised around three core well-established analytical procedures: (1) fluorescence derivatization; (2) online rapid resolution reversed-phase separation coupled with negative-mode sequential mass spectrometry (RRRP-(-)-MS(n)); and (3) permethylation derivatization with nanospray sequential mass spectrometry (NSI-MS(n)) provides comprehensive glycan structural determination. All methods have limitations; however, a multi-method workflow is an at-line stopgap/solution which mitigates each method's individual shortcoming(s) providing greater opportunity for more comprehensive characterization. This manuscript is the first to demonstrate quantitative chromatographic separation of the M(5) isomers and the use of a commercially available stable isotope variant of 2-aminobenzoic acid to detect and chromatographically resolve multiple M(5) isomers in bovine ribonuclease B. 
With this multi-method approach, we have the capabilities to comprehensively characterize a biotherapeutic's glycan array in a de novo manner, including structural isomers at ≥0.1% of the total chromatographic peak area.
Bray, Jeremy W.; Kelly, Erin L.; Hammer, Leslie B.; Almeida, David M.; Dearing, James W.; King, Rosalind B.; Buxton, Orfeu M.
2013-01-01
Recognizing a need for rigorous, experimental research to support the efforts of workplaces and policymakers in improving the health and wellbeing of employees and their families, the National Institutes of Health and the Centers for Disease Control and Prevention formed the Work, Family & Health Network (WFHN). The WFHN is implementing an innovative multisite study with a rigorous experimental design (adaptive randomization, control groups), comprehensive multilevel measures, a novel and theoretically based intervention targeting the psychosocial work environment, and translational activities. This paper describes challenges and benefits of designing a multilevel and transdisciplinary research network that includes an effectiveness study to assess intervention effects on employees, families, and managers; a daily diary study to examine effects on family functioning and daily stress; a process study to understand intervention implementation; and translational research to understand and inform diffusion of innovation. Challenges were both conceptual and logistical, spanning all aspects of study design and implementation. In dealing with these challenges, however, the WFHN developed innovative, transdisciplinary, multi-method approaches to conducting workplace research that will benefit both the research and business communities. PMID:24618878
Towards Personal Exposures: How Technology Is Changing Air Pollution and Health Research.
Larkin, A; Hystad, P
2017-12-01
We present a review of emerging technologies and how these can transform personal air pollution exposure assessment and subsequent health research. Estimating personal air pollution exposures is currently split broadly into methods for modeling exposures for large populations versus measuring exposures for small populations. Air pollution sensors, smartphones, and air pollution models capitalizing on big/new data sources offer tremendous opportunity for unifying these approaches and improving long-term personal exposure prediction at scales needed for population-based research. A multi-disciplinary approach is needed to combine these technologies to not only estimate personal exposures for epidemiological research but also determine drivers of these exposures and new prevention opportunities. While available technologies can revolutionize air pollution exposure research, ethical, privacy, logistical, and data science challenges must be met before widespread implementations occur. Available technologies and related advances in data science can improve long-term personal air pollution exposure estimates at scales needed for population-based research. This will advance our ability to evaluate the impacts of air pollution on human health and develop effective prevention strategies.
Landslide hazard assessment: recent trends and techniques.
Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S
2013-01-01
Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ), viz. heuristic, semi-quantitative, quantitative, probabilistic and multi-criteria decision making processes. However, no single method is accepted universally for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare results in order to find the best-suited model. This paper presents a review of research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved effective in spatial prediction of landslides with a high degree of accuracy. Physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision making approaches also play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote Sensing and Geographical Information Systems (GIS) are powerful tools to assess landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful in detecting, mapping and monitoring landslide processes. GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. The advancements in geospatial technologies have opened the doors for detailed and accurate assessment of landslide hazards.
Li, Miao; Li, Jun; Zhou, Yiyu
2015-12-08
The problem of jointly detecting and tracking multiple targets from the raw observations of an infrared focal plane array is a challenging task, especially for the case with uncertain target dynamics. In this paper a multi-model labeled multi-Bernoulli (MM-LMB) track-before-detect method is proposed within the labeled random finite sets (RFS) framework. The proposed track-before-detect method consists of two parts-MM-LMB filter and MM-LMB smoother. For the MM-LMB filter, original LMB filter is applied to track-before-detect based on target and measurement models, and is integrated with the interacting multiple models (IMM) approach to accommodate the uncertainty of target dynamics. For the MM-LMB smoother, taking advantage of the track labels and posterior model transition probability, the single-model single-target smoother is extended to a multi-model multi-target smoother. A Sequential Monte Carlo approach is also presented to implement the proposed method. Simulation results show the proposed method can effectively achieve tracking continuity for multiple maneuvering targets. In addition, compared with the forward filtering alone, our method is more robust due to its combination of forward filtering and backward smoothing.
Li, Miao; Li, Jun; Zhou, Yiyu
2015-01-01
The problem of jointly detecting and tracking multiple targets from the raw observations of an infrared focal plane array is a challenging task, especially for the case with uncertain target dynamics. In this paper a multi-model labeled multi-Bernoulli (MM-LMB) track-before-detect method is proposed within the labeled random finite sets (RFS) framework. The proposed track-before-detect method consists of two parts—MM-LMB filter and MM-LMB smoother. For the MM-LMB filter, original LMB filter is applied to track-before-detect based on target and measurement models, and is integrated with the interacting multiple models (IMM) approach to accommodate the uncertainty of target dynamics. For the MM-LMB smoother, taking advantage of the track labels and posterior model transition probability, the single-model single-target smoother is extended to a multi-model multi-target smoother. A Sequential Monte Carlo approach is also presented to implement the proposed method. Simulation results show the proposed method can effectively achieve tracking continuity for multiple maneuvering targets. In addition, compared with the forward filtering alone, our method is more robust due to its combination of forward filtering and backward smoothing. PMID:26670234
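The IMM component mentioned in the abstract above maintains a probability for each candidate dynamic model. A minimal sketch of one mixing-and-update step of those model probabilities follows; this is only the model-probability bookkeeping, not the full MM-LMB filter or smoother, and the transition matrix and likelihoods in the usage are invented.

```python
def imm_model_probs(mu, trans, likelihoods):
    """One model-probability update of the interacting multiple models
    (IMM) scheme: mix the prior model probabilities mu through the Markov
    transition matrix (trans[j][i] = P(model i at k | model j at k-1)),
    then reweight by each model's measurement likelihood and normalize."""
    m = len(mu)
    pred = [sum(trans[j][i] * mu[j] for j in range(m)) for i in range(m)]
    post = [pred[i] * likelihoods[i] for i in range(m)]
    s = sum(post)
    return [p / s for p in post]
```

A model whose predicted measurement explains the data better (larger likelihood) gains probability mass, which is how the filter adapts to maneuvering targets.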
Marsh, Kevin; Lanitis, Tereza; Neasham, David; Orfanos, Panagiotis; Caro, Jaime
2014-04-01
The objective of this study is to support those undertaking a multi-criteria decision analysis (MCDA) by reviewing the approaches adopted in healthcare MCDAs to date, how these varied with the objective of the study, and the lessons learned from this experience. Searches of EMBASE and MEDLINE identified 40 studies that provided 41 examples of MCDA in healthcare. Data were extracted on the objective of the study, the methods employed, and decision makers' and study authors' reflections on the advantages and disadvantages of the methods. The recent interest in MCDA in healthcare is mirrored in an increase in the application of MCDA to evaluate healthcare interventions. Of the studies identified, the first was published in 1990, but more than half were published since 2011. They were undertaken in 18 different countries, and were designed to support investment (coverage and reimbursement), authorization, prescription, and research funding allocation decisions. Many intervention types were assessed: pharmaceuticals, public health interventions, screening, surgical interventions, and devices. Most used the value measurement approach and scored performance using predefined scales. Beyond these similarities, a diversity of approaches was adopted, with only limited correspondence between the approach and the type of decision or product. Decision makers consulted as part of these studies, as well as the authors of the studies, are positive about the potential of MCDA to improve decision making. Further work is required, however, to develop guidance for those undertaking MCDA.
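The "value measurement" approach with predefined scales that most of the reviewed studies used reduces, in its simplest linear additive form, to rescaling each criterion and taking a weighted sum. A hedged sketch follows; the alternatives, weights, and criteria in the usage are invented, and real MCDAs elicit scales and weights from decision makers rather than min-max normalizing.

```python
def mcda_scores(alternatives, weights, directions):
    """Linear additive value model. alternatives: {name: [criterion
    values]}; weights should sum to 1; directions[j] is 'max' for a
    benefit criterion or 'min' for a cost criterion (value is flipped)."""
    names = list(alternatives)
    k = len(weights)
    cols = [[alternatives[n][j] for n in names] for j in range(k)]
    scores = {}
    for n in names:
        total = 0.0
        for j in range(k):
            lo, hi = min(cols[j]), max(cols[j])
            v = 0.5 if hi == lo else (alternatives[n][j] - lo) / (hi - lo)
            if directions[j] == 'min':
                v = 1.0 - v                  # lower cost scores higher
            total += weights[j] * v
        scores[n] = total
    return scores
```

An alternative that dominates on every criterion necessarily receives the top score under any positive weighting, which is a useful sanity check on an implementation.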
The use of willingness-to-pay (WTP) survey techniques based on multi-attribute utility (MAU) approaches has been recommended by some authors as a way to deal simultaneously with two difficulties that increasingly plague environmental valuation. The first of th...
Show me the data: advances in multi-model benchmarking, assimilation, and forecasting
NASA Astrophysics Data System (ADS)
Dietze, M.; Raiho, A.; Fer, I.; Cowdery, E.; Kooper, R.; Kelly, R.; Shiklomanov, A. N.; Desai, A. R.; Simkins, J.; Gardella, A.; Serbin, S.
2016-12-01
Researchers want their data to inform carbon cycle predictions, but there are considerable bottlenecks between data collection and the use of data to calibrate and validate earth system models and inform predictions. This talk highlights recent advancements in the PEcAn project aimed at making it easier for individual researchers to confront models with their own data: (1) the development of an easily extensible site-scale benchmarking system aimed at ensuring that models capture process rather than just reproducing pattern; (2) efficient emulator-based Bayesian parameter data assimilation to constrain model parameters; (3) a novel, generalized approach to ensemble data assimilation to estimate carbon pools and fluxes and quantify process error; (4) automated processing and downscaling of CMIP climate scenarios to support forecasts that include driver uncertainty; (5) a large expansion in the number of models supported, with new tools for conducting multi-model and multi-site analyses; and (6) a network-based architecture that allows analyses to be shared with model developers and other collaborators. Application of these methods is illustrated with data across a wide range of time scales, from eddy covariance to forest inventories to tree rings to paleoecological pollen proxies.
A Multi-Systemic School-Based Approach for Addressing Childhood Aggression
ERIC Educational Resources Information Center
Runions, Kevin
2008-01-01
School-based approaches to addressing aggression in the early grades have focused on explicit curriculum addressing social and emotional processes. The current study reviews research on the distinct modes of aggression, the status of current research on social and emotional processing relevant to problems of aggression amongst young children, as…
Vigelius, Matthias; Meyer, Bernd
2012-01-01
For many biological applications, a macroscopic (deterministic) treatment of reaction-drift-diffusion systems is insufficient. Instead, one has to properly handle the stochastic nature of the problem and generate true sample paths of the underlying probability distribution. Unfortunately, stochastic algorithms are computationally expensive and, in most cases, the large number of participating particles renders the relevant parameter regimes inaccessible. In an attempt to address this problem, we present a genuinely stochastic, multi-dimensional algorithm that solves the inhomogeneous, non-linear drift-diffusion problem on a mesoscopic level. Our method improves on existing implementations in being multi-dimensional and in handling inhomogeneous drift and diffusion. The algorithm is well suited for implementation on data-parallel hardware architectures such as general-purpose graphics processing units (GPUs). We integrate the method into an operator-splitting approach that decouples chemical reactions from the spatial evolution. We demonstrate the validity and applicability of our algorithm with a comprehensive suite of standard test problems that also serve to quantify the numerical accuracy of the method. We provide a freely available, fully functional GPU implementation. Integration into Inchman, a user-friendly web service that allows researchers to perform parallel simulations of reaction-drift-diffusion systems on GPU clusters, is underway. PMID:22506001
Granovsky, Alexander A
2011-06-07
The distinctive desirable features, both mathematically and physically meaningful, of all partially contracted multi-state multi-reference perturbation theories (MS-MR-PT) are explicitly formulated. An original approach to MS-MR-PT theory, called extended multi-configuration quasi-degenerate perturbation theory (XMCQDPT), which has most, if not all, of the desirable properties, is introduced. The new method is applied at the second order of perturbation theory (XMCQDPT2) to the 1(1)A(')-2(1)A(') conical intersection in the allene molecule, the avoided crossing in the LiF molecule, and the 1(1)A(1) to 2(1)A(1) electronic transition in cis-1,3-butadiene. The new theory has several advantages over well-established approaches such as second-order multi-configuration quasi-degenerate perturbation theory and multi-state second-order complete active space perturbation theory. An analysis of the prevalent approaches to MS-MR-PT theory, performed within the framework of the XMCQDPT theory, unveils the origin of their common inherent problems. We describe an efficient implementation strategy that makes XMCQDPT2 an especially useful general-purpose tool for high-level modeling of small to large molecular systems. © 2011 American Institute of Physics
2014-10-01
Changes in Approach b. Problems/Delays and Plans for Resolution c. Changes that Impacted Expenditures d. Changes in use or care of vertebrate animals ... the field of Restorative Transplantation matures, significant opportunities are emerging for transplant researchers and clinicians to capitalize on ... that the maturing field of Restorative Transplantation will benefit the most from the establishment of a multi-institutional, multi-disciplinary
Douglas, Heather E; Raban, Magdalena Z; Walter, Scott R; Westbrook, Johanna I
2017-03-01
Multi-tasking is an important skill for clinical work that has received limited research attention. Its impacts on clinical work are poorly understood. In contrast, there is substantial multi-tasking research in cognitive psychology, driver distraction, and human-computer interaction. This review synthesises evidence of the extent and impacts of multi-tasking on efficiency and task performance from healthcare and non-healthcare literature, to compare and contrast approaches, identify implications for clinical work, and develop an evidence-informed framework for guiding the measurement of multi-tasking in future healthcare studies. The results showed that healthcare studies using direct observation have focused on descriptive studies to quantify concurrent multi-tasking and its frequency in different contexts, with limited study of impact. In comparison, non-healthcare studies have applied predominantly experimental and simulation designs, focusing on interleaved and concurrent multi-tasking, and testing theories of the mechanisms by which multi-tasking impacts task efficiency and performance. We propose a framework to guide the measurement of multi-tasking in clinical settings that draws together lessons from these siloed research efforts. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Vergo, John; Karat, Clare-Marie; Karat, John; Pinhanez, Claudio; Arora, Renee; Cofino, Thomas; Riecken, Doug; Podlaseck, Mark
This paper summarizes a 10-month long research project conducted at the IBM T.J. Watson Research Center aimed at developing the design concept of a multi-institutional art and culture web site. The work followed a user-centered design (UCD) approach, where interaction with prototypes and feedback from potential users of the web site were sought…
Arulandhu, Alfred J.; Staats, Martijn; Hagelaar, Rico; Voorhuijzen, Marleen M.; Prins, Theo W.; Scholtens, Ingrid; Costessi, Adalberto; Duijsings, Danny; Rechenmann, François; Gaspar, Frédéric B.; Barreto Crespo, Maria Teresa; Holst-Jensen, Arne; Birck, Matthew; Burns, Malcolm; Haynes, Edward; Hochegger, Rupert; Klingl, Alexander; Lundberg, Lisa; Natale, Chiara; Niekamp, Hauke; Perri, Elena; Barbante, Alessandra; Rosec, Jean-Philippe; Seyfarth, Ralf; Sovová, Tereza; Van Moorleghem, Christoff; van Ruth, Saskia; Peelen, Tamara
2017-01-01
DNA metabarcoding provides great potential for species identification in complex samples such as food supplements and traditional medicines. Such a method would aid Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) enforcement officers to combat wildlife crime by preventing illegal trade of endangered plant and animal species. The objective of this research was to develop a multi-locus DNA metabarcoding method for forensic wildlife species identification and to evaluate the applicability and reproducibility of this approach across different laboratories. A DNA metabarcoding method was developed that makes use of 12 DNA barcode markers that have demonstrated universal applicability across a wide range of plant and animal taxa and that facilitate the identification of species in samples containing degraded DNA. The DNA metabarcoding method was developed based on Illumina MiSeq amplicon sequencing of well-defined experimental mixtures, for which a bioinformatics pipeline with user-friendly web-interface was developed. The performance of the DNA metabarcoding method was assessed in an international validation trial by 16 laboratories, in which the method was found to be highly reproducible and sensitive enough to identify species present in a mixture at 1% dry weight content. The advanced multi-locus DNA metabarcoding method assessed in this study provides reliable and detailed data on the composition of complex food products, including information on the presence of CITES-listed species. The method can provide improved resolution for species identification, while verifying species with multiple DNA barcodes contributes to an enhanced quality assurance. PMID:29020743
Arulandhu, Alfred J; Staats, Martijn; Hagelaar, Rico; Voorhuijzen, Marleen M; Prins, Theo W; Scholtens, Ingrid; Costessi, Adalberto; Duijsings, Danny; Rechenmann, François; Gaspar, Frédéric B; Barreto Crespo, Maria Teresa; Holst-Jensen, Arne; Birck, Matthew; Burns, Malcolm; Haynes, Edward; Hochegger, Rupert; Klingl, Alexander; Lundberg, Lisa; Natale, Chiara; Niekamp, Hauke; Perri, Elena; Barbante, Alessandra; Rosec, Jean-Philippe; Seyfarth, Ralf; Sovová, Tereza; Van Moorleghem, Christoff; van Ruth, Saskia; Peelen, Tamara; Kok, Esther
2017-10-01
DNA metabarcoding provides great potential for species identification in complex samples such as food supplements and traditional medicines. Such a method would aid Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) enforcement officers to combat wildlife crime by preventing illegal trade of endangered plant and animal species. The objective of this research was to develop a multi-locus DNA metabarcoding method for forensic wildlife species identification and to evaluate the applicability and reproducibility of this approach across different laboratories. A DNA metabarcoding method was developed that makes use of 12 DNA barcode markers that have demonstrated universal applicability across a wide range of plant and animal taxa and that facilitate the identification of species in samples containing degraded DNA. The DNA metabarcoding method was developed based on Illumina MiSeq amplicon sequencing of well-defined experimental mixtures, for which a bioinformatics pipeline with user-friendly web-interface was developed. The performance of the DNA metabarcoding method was assessed in an international validation trial by 16 laboratories, in which the method was found to be highly reproducible and sensitive enough to identify species present in a mixture at 1% dry weight content. The advanced multi-locus DNA metabarcoding method assessed in this study provides reliable and detailed data on the composition of complex food products, including information on the presence of CITES-listed species. The method can provide improved resolution for species identification, while verifying species with multiple DNA barcodes contributes to an enhanced quality assurance. © The Authors 2017. Published by Oxford University Press.
Generating multi-double-scroll attractors via nonautonomous approach.
Hong, Qinghui; Xie, Qingguo; Shen, Yi; Wang, Xiaoping
2016-08-01
It is a common phenomenon that multi-scroll attractors are realized by introducing various nonlinear functions with multiple breakpoints into double scroll chaotic systems. Differently, we present a nonautonomous approach for generating multi-double-scroll attractors (MDSA) without changing the original nonlinear functions. By using the multi-level-logic pulse excitation technique in double scroll chaotic systems, MDSA can be generated. A Chua's circuit, a Jerk circuit, and a modified Lorenz system are given as design examples, and the Matlab simulation results are presented. Furthermore, the corresponding realization circuits are designed. The Pspice results are in agreement with the numerical simulation results, which verifies the validity and feasibility of this method.
Power calculation for comparing diagnostic accuracies in a multi-reader, multi-test design.
Kim, Eunhee; Zhang, Zheng; Wang, Youdan; Zeng, Donglin
2014-12-01
Receiver operating characteristic (ROC) analysis is widely used to evaluate the performance of diagnostic tests with continuous or ordinal responses. A popular study design for assessing the accuracy of diagnostic tests involves multiple readers interpreting multiple diagnostic test results, called the multi-reader, multi-test design. Although several different approaches to analyzing data from this design exist, few have addressed sample size and power issues. In this article, we develop a power formula for comparing the correlated areas under the ROC curves (AUCs) in a multi-reader, multi-test design. We present a nonparametric approach to estimating and comparing the correlated AUCs by extending the approach of DeLong et al. (1988, Biometrics 44, 837-845). A power formula is derived based on the asymptotic distribution of the nonparametric AUCs. Simulation studies are conducted to demonstrate the performance of the proposed power formula, and an example is provided to illustrate the proposed procedure. © 2014, The International Biometric Society.
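The nonparametric AUC underlying the DeLong-style estimators in this abstract is the Mann-Whitney statistic: the fraction of (diseased, healthy) pairs that the test ranks correctly, with ties counted as half. A minimal sketch (function name illustrative):

```python
def auc_mann_whitney(diseased, healthy):
    """Nonparametric AUC estimate from two groups of test scores.

    Counts the fraction of (diseased, healthy) score pairs in which the
    diseased score exceeds the healthy score; ties contribute 1/2.
    """
    pairs = 0.0
    for x in diseased:
        for y in healthy:
            if x > y:
                pairs += 1.0
            elif x == y:
                pairs += 0.5
    return pairs / (len(diseased) * len(healthy))
```

An AUC of 1.0 corresponds to perfect separation of the two groups; 0.5 corresponds to a test no better than chance.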
Comparison of analysis methods for airway quantification
NASA Astrophysics Data System (ADS)
Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.
2012-03-01
Diseased airways have been known for several years as a possible contributing factor to airflow limitation in Chronic Obstructive Pulmonary Disease (COPD). Quantification of disease severity through the evaluation of airway dimensions (wall thickness and lumen diameter) has gained increased attention, thanks to the availability of multi-slice computed tomography (CT). Novel approaches have focused on automated methods of measurement as a faster and more objective means than the visual assessment routinely employed in the clinic. Since the Full-Width Half-Maximum (FWHM) method of airway measurement was introduced two decades ago [1], several new techniques for quantifying airways have been detailed in the literature, but no approach has truly become a standard for such analysis. Our own research group has presented two alternative approaches for determining airway dimensions, one involving a minimum path and the other active contours [2, 3]. With an increasing number of techniques dedicated to the same goal, we decided to take a step back and analyze the differences among these methods. We therefore put our two methods of analysis and the FWHM approach to the test. We first measured a set of 5 airways from a phantom of known dimensions. Then we compared measurements from the three methods to those of two independent readers, performed on 35 airways in 5 patients. We elaborate on the differences of each approach and draw conclusions on which could be considered the best one.
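The FWHM principle named in this abstract can be sketched at sample resolution on a 1-D intensity profile; real implementations interpolate between samples for sub-voxel accuracy, and the function name here is illustrative:

```python
def fwhm_samples(profile):
    """Full width at half maximum of a 1-D intensity profile, in samples.

    Finds the first and last samples at or above half the peak value and
    returns the distance between them. No sub-sample interpolation.
    """
    peak = max(profile)
    half = peak / 2.0
    above = [i for i, v in enumerate(profile) if v >= half]
    return above[-1] - above[0]
```

For an airway wall, this width would be measured along a ray cast from the lumen center through the wall, which is what limits the method's accuracy for thin walls relative to the scanner's point spread function.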
Kaleem, Muhammad; Gurve, Dharmendra; Guergachi, Aziz; Krishnan, Sridhar
2018-06-25
The objective of the work described in this paper is the development of a computationally efficient methodology for patient-specific automatic seizure detection in long-term multi-channel EEG recordings. Approach: A novel patient-specific seizure detection approach based on a signal-derived Empirical Mode Decomposition (EMD) dictionary is proposed. For this purpose, we use an empirical framework for EMD-based dictionary creation and learning, inspired by traditional dictionary learning methods, in which the EMD-based dictionary is learned from the multi-channel EEG data being analyzed for automatic seizure detection. We present the algorithm for dictionary creation and learning, whose purpose is to learn dictionaries with a small number of atoms. Using training signals belonging to seizure and non-seizure classes, an initial dictionary, termed the raw dictionary, is formed. The atoms of the raw dictionary are composed of intrinsic mode functions obtained after decomposition of the training signals using the empirical mode decomposition algorithm. The raw dictionary is then trained using a learning algorithm, resulting in a substantial decrease in the number of atoms in the trained dictionary. The trained dictionary is then used for automatic seizure detection, such that the coefficients of orthogonal projections of test signals against the trained dictionary form the features used for classification of test signals into seizure and non-seizure classes. Thus no hand-engineered features have to be extracted from the data, as in traditional seizure detection approaches. Main results: The performance of the proposed approach is validated using the CHB-MIT benchmark database, and averaged accuracy, sensitivity and specificity values of 92.9%, 94.3% and 91.5%, respectively, are obtained using a support vector machine classifier and five-fold cross-validation.
These results are compared with other approaches using the same database, and the suitability of the approach for seizure detection in long-term multi-channel EEG recordings is discussed. Significance: The proposed approach describes a computationally efficient method for automatic seizure detection in long-term multi-channel EEG recordings. The method does not rely on hand-engineered features, as are required in traditional approaches. Furthermore, the approach is suitable for scenarios where the dictionary once formed and trained can be used for automatic seizure detection of newly recorded data, making the approach suitable for long-term multi-channel EEG recordings. © 2018 IOP Publishing Ltd.
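The feature-extraction step described above (coefficients of orthogonal projections of a test signal against dictionary atoms) can be sketched generically; this is an illustration of the projection idea, not the authors' implementation, and in their method the atoms would be learned intrinsic mode functions:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def orthonormalize(atoms):
    """Gram-Schmidt orthonormalization of a list of dictionary atoms."""
    basis = []
    for a in atoms:
        v = list(a)
        for q in basis:
            c = dot(v, q)
            v = [vi - c * qi for vi, qi in zip(v, q)]
        n = dot(v, v) ** 0.5
        if n > 1e-12:  # drop atoms that are linearly dependent on earlier ones
            basis.append([vi / n for vi in v])
    return basis

def projection_features(signal, basis):
    """Projection coefficients of a signal onto an orthonormal basis."""
    return [dot(signal, q) for q in basis]
```

The resulting coefficient vector plays the role of the hand-engineered feature vector in conventional pipelines and is what the SVM classifier would consume.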
NASA Technical Reports Server (NTRS)
Ortega, J. M.
1986-01-01
Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian elimination on parallel computers; (3) three-dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete Cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.
Multi-Centrality Graph Spectral Decompositions and Their Application to Cyber Intrusion Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Pin-Yu; Choudhury, Sutanay; Hero, Alfred
Many modern datasets can be represented as graphs, and hence spectral decompositions such as graph principal component analysis (PCA) can be useful. Distinct from previous graph decomposition approaches based on subspace projection of a single topological feature, e.g., the centered graph adjacency matrix (graph Laplacian), we propose spectral decomposition approaches to graph PCA and graph dictionary learning that integrate multiple features, including graph walk statistics, centrality measures, and graph distances to reference nodes. In this paper we propose a new PCA method for single graph analysis, called multi-centrality graph PCA (MC-GPCA), and a new dictionary learning method for ensembles of graphs, called multi-centrality graph dictionary learning (MC-GDL), both based on spectral decomposition of multi-centrality matrices. As an application to cyber intrusion detection, MC-GPCA can be an effective indicator of anomalous connectivity patterns and MC-GDL can provide a discriminative basis for attack classification.
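The multi-centrality matrix this abstract decomposes stacks several node-level features side by side before the spectral step. A minimal sketch using two of the feature types mentioned (degree centrality and BFS distance to a reference node); the function names and the column-centering convention are illustrative assumptions:

```python
from collections import deque

def bfs_distances(adj, src):
    """Hop distances from src in an unweighted graph given as adjacency lists."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def multi_centrality_matrix(adj, ref):
    """One row per node: [degree, distance to reference node], column-centered."""
    nodes = sorted(adj)
    dist = bfs_distances(adj, ref)
    X = [[float(len(adj[u])), float(dist[u])] for u in nodes]
    for j in range(2):  # center each feature column, as PCA requires
        mean = sum(row[j] for row in X) / len(X)
        for row in X:
            row[j] -= mean
    return X
```

PCA of this matrix (e.g., via SVD) would then yield the multi-centrality principal components used to flag nodes with anomalous connectivity.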
Leveraging Citizen Science and Information Technology for Population Physical Activity Promotion
King, Abby C.; Winter, Sandra J.; Sheats, Jylana L.; Rosas, Lisa G.; Buman, Matthew P.; Salvo, Deborah; Rodriguez, Nicole M.; Seguin, Rebecca A.; Moran, Mika; Garber, Randi; Broderick, Bonnie; Zieff, Susan G.; Sarmiento, Olga Lucia; Gonzalez, Silvia A.; Banchoff, Ann; Dommarco, Juan Rivera
2016-01-01
PURPOSE While technology is a major driver of many of society’s comforts, conveniences, and advances, it has been responsible, in a significant way, for engineering regular physical activity and a number of other positive health behaviors out of people’s daily lives. A key question concerns how to harness information and communication technologies (ICT) to bring about positive changes in the health promotion field. One such approach involves community-engaged “citizen science,” in which local residents leverage the potential of ICT to foster data-driven consensus-building and mobilization efforts that advance physical activity at the individual, social, built environment, and policy levels. METHOD The history of citizen science in the research arena is briefly described, and an evidence-based method that embeds citizen science in a multi-level, multi-sectoral community-based participatory research framework for physical activity promotion is presented. RESULTS Several examples of this citizen science-driven community engagement framework for promoting active lifestyles, called “Our Voice”, are discussed, including pilot projects from diverse communities in the U.S. as well as internationally. CONCLUSIONS The opportunities and challenges involved in leveraging citizen science activities as part of a broader population approach to promoting regular physical activity are explored. The strategic engagement of citizen scientists from socio-demographically diverse communities across the globe as both assessment and change agents provides a promising, potentially low-cost and scalable strategy for creating more active, healthful, and equitable neighborhoods and communities worldwide. PMID:27525309
Kim, Sungjin; Jinich, Adrián; Aspuru-Guzik, Alán
2017-04-24
We propose a multiple descriptor multiple kernel (MultiDK) method for efficient molecular discovery using machine learning. We show that the MultiDK method improves both the speed and accuracy of molecular property prediction. We apply the method to the discovery of electrolyte molecules for aqueous redox flow batteries. Using multiple-type (as opposed to single-type) descriptors, we obtain more relevant features for machine learning. Following the principle of the "wisdom of the crowds", the combination of multiple-type descriptors significantly boosts prediction performance. Moreover, by employing multiple kernels (more than one kernel function for a set of input descriptors), MultiDK exploits nonlinear relations between molecular structure and properties better than a linear regression approach. The multiple kernels consist of a Tanimoto similarity kernel and a linear kernel for a set of binary descriptors and a set of nonbinary descriptors, respectively. Using MultiDK, we achieve an average performance of r² = 0.92 on a test set of molecules for solubility prediction. We also extend MultiDK to predict pH-dependent solubility and apply it to a set of quinone molecules with different ionizable functional groups to assess their performance as flow battery electrolytes.
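The kernel combination described (a Tanimoto kernel on binary fingerprints plus a linear kernel on nonbinary descriptors) can be sketched as follows; the equal weighting of the two kernels is an illustrative assumption, not a detail from the paper:

```python
def tanimoto_kernel(a, b):
    """Tanimoto similarity of two binary fingerprints given as sets of on-bits."""
    if not a and not b:
        return 1.0
    return len(a & b) / float(len(a | b))

def linear_kernel(x, y):
    """Ordinary dot product on real-valued descriptor vectors."""
    return sum(xi * yi for xi, yi in zip(x, y))

def multidk_kernel(bits_a, bits_b, desc_a, desc_b, w=0.5):
    # hypothetical equal weighting of the binary and nonbinary parts
    return w * tanimoto_kernel(bits_a, bits_b) + (1.0 - w) * linear_kernel(desc_a, desc_b)
```

A positive combination of valid kernels is itself a valid kernel, so the blended similarity can be dropped directly into any kernel-based regressor or classifier.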
LinkWinds: An Approach to Visual Data Analysis
NASA Technical Reports Server (NTRS)
Jacobson, Allan S.
1992-01-01
The Linked Windows Interactive Data System (LinkWinds) is a prototype visual data exploration and analysis system resulting from a NASA/JPL program of research into graphical methods for rapidly accessing, displaying and analyzing large multivariate multidisciplinary datasets. It is an integrated multi-application execution environment allowing the dynamic interconnection of multiple windows containing visual displays and/or controls through a data-linking paradigm. This paradigm, which results in a system much like a graphical spreadsheet, is not only a powerful method for organizing large amounts of data for analysis, but provides a highly intuitive, easy to learn user interface on top of the traditional graphical user interface.
Multi-Excitation Magnetoacoustic Tomography with Magnetic Induction for Bioimpedance Imaging
Li, Xu; He, Bin
2011-01-01
Magnetoacoustic tomography with magnetic induction (MAT-MI) is an imaging approach proposed to conduct non-invasive electrical conductivity imaging of biological tissue with high spatial resolution. In the present study, based on the analysis of the relationship between the conductivity distribution and the generated MAT-MI acoustic source, we propose a new multi-excitation MAT-MI approach and the corresponding reconstruction algorithms. In the proposed method, multiple magnetic excitations using different coil configurations are employed and ultrasound measurements corresponding to each excitation are collected to derive the conductivity distribution inside the sample. A modified reconstruction algorithm is also proposed for the multi-excitation MAT-MI imaging approach when only limited bandwidth acoustic measurements are available. Computer simulation and phantom experiment studies have been done to demonstrate the merits of the proposed method. It is shown that if unlimited bandwidth acoustic data is available, we can accurately reconstruct the internal conductivity contrast of an object using the proposed method. With limited bandwidth data and the use of the modified algorithm we can reconstruct the relative conductivity contrast of an object instead of only boundaries at the conductivity heterogeneity. Benefits that come with this new method include better differentiation of tissue types with conductivity contrast using the MAT-MI approach, specifically for potential breast cancer screening application in the future. PMID:20529729
ERIC Educational Resources Information Center
de la Torre, Adela
2014-01-01
Niños Sanos, Familia Sana (NSFS) is a 5-year multi-intervention study aimed at preventing childhood obesity among Mexican-origin children in rural California. Using a transdisciplinary approach and community-based participatory research (CBPR) methodology, NSFS's development included a diversely trained team working in collaboration with community…
Manifold regularized matrix completion for multi-label learning with ADMM.
Liu, Bin; Li, Yingming; Xu, Zenglin
2018-05-01
Multi-label learning is a common machine learning problem arising from numerous real-world applications in diverse fields, e.g., natural language processing, bioinformatics, information retrieval and so on. Among various multi-label learning methods, the matrix completion approach has been regarded as a promising approach to transductive multi-label learning. By constructing a joint matrix comprising the feature matrix and the label matrix, the missing labels of test samples are regarded as missing values of the joint matrix. Under the low-rank assumption on the constructed joint matrix, the missing labels can be recovered by minimizing its rank. Despite its success, most matrix completion based approaches ignore the smoothness assumption of unlabeled data, i.e., that neighboring instances should also share a similar set of labels, and thus may underexploit the intrinsic structure of the data. In addition, solving the matrix completion problem can be computationally inefficient. To this end, we propose to efficiently solve the multi-label learning problem with an enhanced matrix completion model with manifold regularization, where the graph Laplacian is used to enforce label smoothness. To speed up the convergence of our model, we develop an efficient iterative algorithm that solves the resulting nuclear norm minimization problem with the alternating direction method of multipliers (ADMM). Experiments on both synthetic and real-world data have shown the promising results of the proposed approach. Copyright © 2018 Elsevier Ltd. All rights reserved.
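The manifold-regularization term mentioned above relies on the graph Laplacian L, for which the quadratic form x^T L x equals the sum of squared differences of x across graph edges, so penalizing it enforces label smoothness over neighboring instances. A minimal sketch for an undirected, unweighted graph (function names illustrative):

```python
def graph_laplacian(adj):
    """Combinatorial Laplacian L = D - A from adjacency lists (undirected)."""
    nodes = sorted(adj)
    idx = {u: i for i, u in enumerate(nodes)}
    n = len(nodes)
    L = [[0.0] * n for _ in range(n)]
    for u in nodes:
        L[idx[u]][idx[u]] = float(len(adj[u]))  # degree on the diagonal
        for v in adj[u]:
            L[idx[u]][idx[v]] -= 1.0            # -1 per edge off-diagonal
    return L

def smoothness(L, x):
    """x^T L x: zero for a constant x, small when neighbors carry similar values."""
    n = len(x)
    return sum(x[i] * L[i][j] * x[j] for i in range(n) for j in range(n))
```

In the model described by the abstract, x would be a column of the label matrix and this penalty would be added to the nuclear-norm objective solved by ADMM.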
NASA Astrophysics Data System (ADS)
Taher, M.; Hamidah, I.; Suwarma, I. R.
2017-09-01
This paper outlines the results of an experimental study on the effects of a multi-representation approach to learning Archimedes' Law on the improvement of students' mental models. The multi-representation techniques implemented in the study were verbal, pictorial, mathematical, and graphical representations. Students' mental models were classified into three levels, i.e. scientific, synthetic, and initial, based on the students' level of understanding. The study employed a pre-experimental methodology, using a one-group pretest-posttest design. The subjects of the study were 32 eleventh-grade students in a public senior high school in Riau Province. The research instrument was a mental model test on the hydrostatic pressure concept, in the form of an essay test judged by experts. The findings showed a positive change in students' mental models, indicating that the multi-representation approach was effective in improving students' mental models.
Ashing, Kimlin; Rosales, Monica; Fernandez, Alejandro
2015-02-01
To better understand research participation among hard-to-reach populations, this exploratory investigation examined characteristics of enrollees and non-enrollees from a population-based longitudinal study of African-American and Latina-American breast cancer survivors. A mixed-method recruitment approach was used to enroll participants, 1-6 years post-diagnosis, from cancer registries and community groups. Four hundred and sixty-eight participants agreed to participate, constituting an 81% participation rate; 65% completed the Time-1 assessment and 55% completed both the Time-1 and Time-2 assessments. African-Americans were more likely to agree to participate and complete the Time-1 assessment (73%) than Latinas (62%) (p < 0.05). Participation was influenced by educational attainment and comorbidities (p < 0.05) for African-Americans. Among Latinas, language proficiency, comorbidities and psychological difficulties (p < 0.01) influenced participation. Our findings suggest that enrollment in research studies may be influenced by complex and multi-dimensional factors stemming from subjects' characteristics, including ethnicity, culture, language proficiency and literacy, and socioeconomic status, as well as medical characteristics, including co-occurring chronic illness and psychological status. Thus, comprehensive, multi-method research studies are urgently needed to better understand and address the challenge of minority recruitment in biomedical research. To increase research participation among cancer survivors, it is imperative to implement focused strategies that will support and encourage individuals' enrollment and continued participation in studies.
Multifunctional Collaborative Modeling and Analysis Methods in Engineering Science
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.; Broduer, Steve (Technical Monitor)
2001-01-01
Engineers are challenged to produce better designs in less time and at less cost. Hence, to investigate novel and revolutionary design concepts, accurate, high-fidelity results must be assimilated rapidly into the design, analysis, and simulation process. This assimilation should consider the diverse mathematical modeling and multi-discipline interactions necessitated by concepts exploiting advanced materials and structures. Integrated high-fidelity methods with diverse engineering applications provide the enabling technologies to assimilate these high-fidelity, multi-disciplinary results rapidly at an early stage in the design. These integrated methods must be multifunctional, collaborative, and applicable to the general field of engineering science and mechanics. Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations, including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented, with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as most robust. The multiple-method approach is advantageous when interfacing diverse disciplines, in which each method's strengths are utilized.
The multifunctional methodology presented provides an effective mechanism by which domains with diverse idealizations are interfaced. This capability rapidly provides the high-fidelity results needed in the early design phase. Moreover, the capability is applicable to the general field of engineering science and mechanics. Hence, it provides a collaborative capability that accounts for interactions among engineering analysis methods.
A multi-frequency receiver function inversion approach for crustal velocity structure
NASA Astrophysics Data System (ADS)
Li, Xuelei; Li, Zhiwei; Hao, Tianyao; Wang, Sheng; Xing, Jian
2017-05-01
To better constrain crustal velocity structures, we developed a new nonlinear inversion approach based on multi-frequency receiver function waveforms. Using the global optimization algorithm of Differential Evolution (DE), low-frequency receiver function waveforms primarily constrain large-scale velocity structures, while high-frequency receiver function waveforms are advantageous for recovering small-scale velocity structures. Synthetic tests with multi-frequency receiver function waveforms show that the proposed approach can simultaneously constrain both long- and short-wavelength characteristics of the crustal velocity structure. Inversions with real data are also conducted for the seismic stations KMNB in southeast China and HYB on the Indian continent, where crustal structures have been well studied by previous researchers. Comparison of the velocity models inverted in previous studies and ours shows good consistency, but our proposed approach achieves a better waveform fit with fewer model parameters. Comprehensive tests with synthetic and real data suggest that the proposed multi-frequency receiver function inversion approach is effective and robust for inverting crustal velocity structures.
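The global optimization step of such an inversion can be illustrated with a minimal DE/rand/1/bin optimizer. This is a toy sketch, not the authors' inversion code: the "misfit" here is a simple quadratic stand-in for a waveform misfit, and the bounds, population size, and control parameters (F, CR) are illustrative assumptions.

```python
import numpy as np

def differential_evolution(misfit, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin optimizer (toy sketch of the DE algorithm)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = lo + rng.random((pop_size, len(lo))) * (hi - lo)
    cost = np.array([misfit(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three randomly chosen population members.
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover between target vector and mutant.
            cross = rng.random(len(lo)) < CR
            trial = np.where(cross, mutant, pop[i])
            t_cost = misfit(trial)
            if t_cost < cost[i]:        # greedy selection
                pop[i], cost[i] = trial, t_cost
    best = np.argmin(cost)
    return pop[best], cost[best]

# Toy "waveform misfit": recover a two-layer velocity model (3.0, 4.0 km/s).
target = np.array([3.0, 4.0])
misfit = lambda v: np.sum((v - target) ** 2)
model, err = differential_evolution(misfit, np.array([[2.0, 5.0], [2.0, 5.0]]))
```

In a real receiver-function inversion the misfit would compare synthetic and observed waveforms at several frequency bands, but the DE machinery is unchanged.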
Elder, John P.; Ayala, Guadalupe X.; McKenzie, Thomas L.; Litrownik, Alan J.; Gallo, Linda C.; Arredondo, Elva M.; Talavera, Gregory A.; Kaplan, Robert M.
2013-01-01
Background The Institute for Behavioral and Community Health (IBACH) is a transdisciplinary organization with a team-oriented approach to the translation of research to practice and policy within the context of behavioral medicine. Objectives This paper tracks the growth of IBACH — in the context of evolving multi-university transdisciplinary research efforts — from a behavioral medicine research focus to community approaches to disease prevention and control, ultimately specializing in Latino health research and practice. We describe how this growth was informed by our partnerships with community members and organizations, and training a diverse array of students and young professionals. Methods Since 1982, IBACH’s research has evolved to address a greater breadth of factors associated with health and well-being. This was driven by our strong community focus and emphasis on collaborations, the diversity of our investigative teams, and our emphasis on training. Although behavioral science still forms the core of IBACH’s scientific orientation, research efforts extend beyond those traditionally examined. Conclusions IBACH’s “team science” successes have been fueled by a specific population emphasis making IBACH one of the nation’s leaders in Latino health behavior research. PMID:25435566
de Bruin, Anique B H
2016-12-01
Since the emergence of the field of 'Educational Neuroscience' (EN) in the late 1990s, a debate has developed about the potential this field holds to influence teaching and learning in the classroom. By now, most agree that the original claims promising direct translations to teaching and learning were too strong. I argue here that research questions in (health professions) education require multi-methodological approaches, including neuroscience, while carefully weighing which (combination of) approaches are most suitable for a given research question. Only through a multi-methodological approach will convergence of evidence emerge, which is so desperately needed for improving teaching and learning in the classroom. However, both researchers and teachers should become aware of the so-called 'seductive allure' of EN; that is, the demonstrable physical location and apparent objectivity of the measurements can be interpreted as yielding more powerful evidence and warranting stronger conclusions than, e.g., behavioral experiments, where in fact the reverse is often the case. I conclude that our tendency as researchers to commit to one methodological approach and to address educational research questions from a single methodological perspective is limiting progress in educational science and in translation to education.
NASA Astrophysics Data System (ADS)
Xue, ShiChuan; Wu, JunJie; Xu, Ping; Yang, XueJun
2018-02-01
Quantum computing offers capabilities beyond classical computing because of its superposition feature. Distinguishing several quantum states from quantum algorithm outputs is often a vital computational task. In most cases the quantum states are non-orthogonal due to superposition; quantum mechanics has proved that no measurement can achieve perfect outcomes, forcing repeated measurements. Hence, it is important to determine the optimal measurement method, one requiring fewer repetitions and a lower error rate. However, extending current measurement approaches, which mainly target quantum cryptography, to multi-qubit situations for quantum computing confronts challenges, such as conducting global operations, which have considerable costs in the experimental realm. Therefore, in this study, we have proposed an optimal subsystem method to avoid these difficulties. We provide an analysis comparing the reduced subsystem method with the global minimum-error method for two-qubit problems; the conclusions have been verified experimentally. The results show that the subsystem method can effectively discriminate non-orthogonal two-qubit states, such as separable states, entangled pure states, and mixed states; the cost of the experimental process is significantly reduced, in most circumstances with an acceptable error rate. We believe the optimal subsystem method is the most valuable and promising approach for multi-qubit quantum computing applications.
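The global minimum-error baseline the subsystem method is compared against has a closed form for two pure states: the Helstrom bound, P_err = (1 - sqrt(1 - 4 p (1-p) |&lt;psi|phi&gt;|^2)) / 2. A small numeric sketch with illustrative states (not the multi-qubit states from the experiments):

```python
import numpy as np

def helstrom_error(psi, phi, p=0.5):
    """Minimum achievable error probability for discriminating two pure
    states |psi> and |phi> given with priors p and 1-p (Helstrom bound)."""
    overlap = abs(np.vdot(psi, phi)) ** 2
    return 0.5 * (1.0 - np.sqrt(1.0 - 4.0 * p * (1.0 - p) * overlap))

# Orthogonal states are perfectly distinguishable ...
e_orth = helstrom_error(np.array([1.0, 0.0]), np.array([0.0, 1.0]))
# ... while non-orthogonal states (here |0> vs |+>) are not.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
e_nonorth = helstrom_error(np.array([1.0, 0.0]), plus)
```

The non-orthogonal case gives a strictly positive error floor, which is the baseline any practical (e.g., subsystem-based) measurement scheme is judged against.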
NASA Astrophysics Data System (ADS)
Bye, B. L.; Godøy, Ø.
2014-12-01
Environmental and climate changes are important elements of our global challenges. They are observed at a global scale and in particular in the Arctic. In order to give better estimates of the future changes, the Arctic has to be monitored and analyzed by a multi-disciplinary observation system that will improve Earth System Models. The best chance to achieve significant results within a relatively short time frame is found in regions with a large natural climate gradient, and where processes sensitive to the expected changes are particularly important. Svalbard and the surrounding ocean areas fulfil all these criteria. The vision for SIOS is to be a regional observational system for long term acquisition and proliferation of fundamental knowledge on global environmental change within an Earth System Science perspective in and around Svalbard. SIOS will systematically develop and implement methods for how observational networks are to be constructed. The distributed SIOS data management system (SDMS) will be implemented through a combination of technologies tailored to the multi-disciplinary nature of the Arctic data. One of these technologies is the Brokering approach or "Framework". The Brokering approach provides a series of services such as discovery, access, transformation and semantics support to enable translation from one discipline/culture to another. These are exactly the challenges the SDMS will have to handle, and thus the Brokering approach is integrated in the design of the system. A description of the design strategy for the SDMS that includes the Brokering approach will be presented. The design and implementation plans for the SDMS are based on research done in the EU funded ESFRI project SIOS, and examples of solutions for interoperable systems producing Arctic datasets and products coordinated through SIOS will be showcased.
The reported experience from the SIOS brokering approach will feed into the process of developing a sustainable brokering governance in the framework of the Research Data Alliance. It will also support the Global Earth Observation System of Systems (GEOSS). This is a contribution to increasing our global capacity to create interoperable systems that provide multi-disciplinary datasets and products.
Innovative Contamination Certification of Multi-Mission Flight Hardware
NASA Technical Reports Server (NTRS)
Hansen, Patricia A.; Hughes, David W.; Montt, Kristina M.; Triolo, Jack J.
1998-01-01
Maintaining contamination certification of multi-mission flight hardware is an innovative approach to controlling mission costs. Methods for assessing ground-induced degradation between missions have been employed by the Hubble Space Telescope (HST) Project for the multi-mission (servicing) hardware. By maintaining the cleanliness of the hardware between missions, and by controlling the materials added to the hardware during modification and refurbishment, both project funding for contamination recertification and schedule have been significantly reduced. These methods will be discussed and HST hardware data will be presented.
A scale-based approach to interdisciplinary research and expertise in sports.
Ibáñez-Gijón, Jorge; Buekers, Martinus; Morice, Antoine; Rao, Guillaume; Mascret, Nicolas; Laurin, Jérome; Montagne, Gilles
2017-02-01
After more than 20 years since the introduction of ecological and dynamical approaches in sports research, their promising opportunity for interdisciplinary research has not been fulfilled yet. The complexity of the research process and the theoretical and empirical difficulties associated with an integrated ecological-dynamical approach have been the major factors hindering the generalisation of interdisciplinary projects in sports sciences. To facilitate this generalisation, we integrate the major concepts from the ecological and dynamical approaches to study behaviour as a multi-scale process. Our integration gravitates around the distinction between functional (ecological) and execution (organic) scales, and their reciprocal intra- and inter-scale constraints. We propose an (epistemological) scale-based definition of constraints that accounts for the concept of synergies as emergent coordinative structures. To illustrate how we can operationalise the notion of multi-scale synergies we use an interdisciplinary model of locomotor pointing. To conclude, we show the value of this approach for interdisciplinary research in sport sciences, as we discuss two examples of task-specific dimensionality reduction techniques in the context of an ongoing project that aims to unveil the determinants of expertise in basketball free throw shooting. These techniques provide relevant empirical evidence to help bootstrap the challenging modelling efforts required in sport sciences.
NASA Astrophysics Data System (ADS)
Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim
2017-10-01
A multi-model approach to remote sensing data processing and interpretation is described. The problem of utilizing satellite data in a multi-model approach to socio-ecological risk assessment is formally defined, and a method for using observation, measurement and modeling data within the multi-model framework is described. A methodology and models for risk assessment within a decision-support approach are defined and described. A method for assessing water quality from satellite observation data is described, based on analysis of the spectral reflectance of aquifers. Spectral signatures of freshwater bodies and offshore waters are analyzed, and correlations between spectral reflectance, pollution and selected water quality parameters are quantified. Data from the MODIS, MISR, AIRS and Landsat sensors received in 2002-2014 have been utilized, verified by in-field spectrometry and laboratory measurements. A fuzzy-logic-based approach to decision support on water quality degradation risk is discussed: the decision on the water quality category is made by a fuzzy algorithm using a limited set of uncertain parameters. Data from satellite observations, field measurements and modeling are utilized within the proposed framework. It is shown that this algorithm allows estimation of water quality degradation rates and pollution risks. Problems of constructing spatial and temporal distributions of the calculated parameters, as well as the problem of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.
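A fuzzy decision on a water quality category from a limited set of uncertain parameters can be sketched as follows. The two indicators, the triangular membership functions, and all thresholds below are hypothetical placeholders, not the calibrated values from the study:

```python
def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def water_quality_category(turbidity_ntu, chlorophyll_ugl):
    """Toy fuzzy classifier: min combines criteria within a category
    (fuzzy AND), and the category with the highest degree wins."""
    good = min(tri(turbidity_ntu, -1, 0, 10), tri(chlorophyll_ugl, -1, 0, 15))
    poor = min(tri(turbidity_ntu, 5, 50, 100), tri(chlorophyll_ugl, 10, 60, 120))
    return "good" if good >= poor else "poor"

category = water_quality_category(turbidity_ntu=2.0, chlorophyll_ugl=5.0)
```

In practice the memberships would be fitted against in-field spectrometry and lab measurements, and more categories and parameters would be used; the min/max inference pattern stays the same.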
A New Method for Setting Calculation Sequence of Directional Relay Protection in Multi-Loop Networks
NASA Astrophysics Data System (ADS)
Haijun, Xiong; Qi, Zhang
2016-08-01
The workload of relay protection setting calculation in multi-loop networks can be reduced effectively by optimizing the setting calculation sequence. A new method for sequencing the setting calculations of directional distance relay protection in multi-loop networks, based on a minimum broken-nodes cost vector (MBNCV), is proposed to solve problems experienced with current methods. Existing methods based on the minimum breakpoint set (MBPS) break more edges when untying the loops in the dependency relationships among relays, potentially leading to a larger iterative calculation workload in setting calculations. A model-driven approach based on behavior trees (BT) is presented to improve adaptability to similar problems. After extending the BT model with real-time system characteristics, a timed BT is derived and the dependency relationships in multi-loop networks are modeled. The model is translated into communicating sequential processes (CSP) models, and an optimized setting calculation sequence for multi-loop networks is finally computed by tools. A 5-node multi-loop network is used as an example to demonstrate the effectiveness of the modeling and calculation method. Several further examples were then calculated, with results indicating that the method effectively reduces the number of forced broken edges for protection setting calculation in multi-loop networks.
Listening to the Voices of Boys: A Mosaic Approach to Exploring the Motivation to Engage in Reading
ERIC Educational Resources Information Center
Fiedler, Krista M.
2012-01-01
The purpose of this study was to examine what information third-grade boys attending school in three rural school districts might contribute to current understanding about what motivates boys to engage in reading. A multi-method, multi-sensory Mosaic Approach was used to explore, record, and interpret the voices of the 14 boys with varying levels…
NASA Astrophysics Data System (ADS)
Nouri, N. M.; Mostafapour, K.; Kamran, M.
2018-02-01
In a closed water-tunnel circuit, multi-component strain gauge force and moment sensors (also known as balances) are generally used to measure the hydrodynamic forces and moments acting on scaled models. These balances are periodically calibrated by static loading, and their performance and accuracy depend significantly on the calibration rig and method. In this research, a new calibration rig was designed and constructed to calibrate multi-component internal strain gauge balances. The calibration rig has six degrees of freedom and six component-loading structures that can be applied separately or synchronously. The system was designed around the applicability of formal experimental design techniques, using gravity for balance loading and for balance positioning and alignment relative to gravity. To evaluate the calibration rig, a six-component internal balance developed by the Iran University of Science and Technology was calibrated using response surface methodology. According to the results, the calibration rig met all design criteria. The rig provides the means by which various formal experimental design techniques can be implemented; its simplicity saves time and money in the design of experiments and in balance calibration while simultaneously increasing the accuracy of these activities.
Culture and community psychology: toward a renewed and reimagined vision.
Kral, Michael J; Ramírez García, Jorge I; Aber, Mark S; Masood, Nausheen; Dutta, Urmitapa; Todd, Nathan R
2011-03-01
Interest is growing in community psychology to look more closely at culture. Culture has resided in community psychology in its emphasis on context, ecology, and diversity, however we believe that the field will benefit from a more explicit focus on culture. We suggest a cultural approach that values the community's points of view and an understanding of shared and divergent meanings, goals, and norms within a theory of empowerment. Furthermore, we posit the importance of pluralistic, multi-method programs of research and action encompassing both idiographic and nomothetic approaches, and critical reflexivity of our roles and agendas. Culture can be further incorporated into all the branches and fibers of community psychology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Lijian, E-mail: ljjiang@hnu.edu.cn; Li, Xinping, E-mail: exping@126.com
Stochastic multiscale modeling has become a necessary approach for quantifying uncertainty and characterizing multiscale phenomena in many practical problems, such as flows in stochastic porous media. The numerical treatment of stochastic multiscale models can be very challenging because of the complex uncertainty and multiple physical scales present in the models. To address this difficulty efficiently, we construct a computational reduced model. To this end, we propose a multi-element least-square high-dimensional model representation (HDMR) method, through which the random domain is adaptively decomposed into a few subdomains, and a local least-square HDMR is constructed in each subdomain. These local HDMRs are represented by a finite number of orthogonal basis functions defined in low-dimensional random spaces, with coefficients determined by least-square methods. We paste all the local HDMR approximations together to form a global HDMR approximation. To further reduce computational cost, we present a multi-element reduced least-square HDMR, which improves both efficiency and approximation accuracy under certain conditions. To treat heterogeneity and multiscale features in the models effectively, we integrate multiscale finite element methods with multi-element least-square HDMR for stochastic multiscale model reduction. This approach significantly reduces the original model's complexity in both the resolution of the physical space and the high-dimensional stochastic space. We analyze the proposed approach and provide a set of numerical experiments to demonstrate the performance of the presented model reduction techniques. Highlights: • A multi-element least-square HDMR is proposed to treat stochastic models. • The random domain is adaptively decomposed into subdomains to obtain an adaptive multi-element HDMR.
• A least-square reduced HDMR is proposed to enhance computational efficiency and approximation accuracy under certain conditions. • Integrating MsFEM with multi-element least-square HDMR can significantly reduce computational complexity.
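The core idea of fitting a first-order HDMR by least squares can be sketched in a few lines; the multi-element/adaptive decomposition and the MsFEM coupling from the paper are not reproduced here. The test function and basis degree are illustrative assumptions:

```python
import numpy as np

def legendre_design(x, degree):
    """Columns P_1(x)..P_degree(x) of Legendre polynomials on [-1, 1]."""
    return np.column_stack([np.polynomial.legendre.Legendre.basis(d)(x)
                            for d in range(1, degree + 1)])

def first_order_hdmr(f, n_samples=400, degree=4, seed=0):
    """Least-square first-order HDMR f ~ f0 + f1(x1) + f2(x2) on [-1, 1]^2.
    Returns the fitted coefficients and the R^2 of the surrogate."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1, 1, (n_samples, 2))
    y = f(X[:, 0], X[:, 1])
    # Design matrix: constant term f0, then the two univariate expansions.
    A = np.hstack([np.ones((n_samples, 1)),
                   legendre_design(X[:, 0], degree),
                   legendre_design(X[:, 1], degree)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return coef, 1.0 - resid.var() / y.var()

# An additive function is captured almost exactly by the first-order HDMR.
_, r2 = first_order_hdmr(lambda x1, x2: x1 ** 2 + np.sin(x2))
```

Functions with strong variable interactions would need higher-order (or, as in the paper, multi-element) HDMR terms; for additive functions the first-order expansion already explains essentially all of the variance.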
NASA Astrophysics Data System (ADS)
Fatrias, D.; Kamil, I.; Meilani, D.
2018-03-01
Coordinating business operations with suppliers is increasingly important for surviving and prospering in a dynamic business environment. A good partnership with suppliers not only increases efficiency but also strengthens corporate competitiveness. With this concern in mind, this study develops a practical approach to multi-criteria supplier evaluation using the combined methods of the Taguchi loss function (TLF), the best-worst method (BWM) and VIseKriterijumska Optimizacija i kompromisno Resenje (VIKOR). A new framework integrating these methods is our main contribution to the supplier evaluation literature. In this integrated approach, a compromise supplier ranking based on the loss scores of suppliers is obtained through an efficient pairwise-comparison-based decision-making process. Implementation on a case problem with real data from the crumb rubber industry shows the usefulness of the proposed approach. Finally, suitable managerial implications are presented.
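The VIKOR ranking stage of such an approach can be sketched as follows. The supplier loss scores and criterion weights below are hypothetical, and the TLF scoring and BWM weighting steps that would produce them are not reproduced:

```python
import numpy as np

def vikor(loss, weights, v=0.5):
    """VIKOR over a loss matrix (rows = suppliers, cols = criteria,
    lower is better). Returns the compromise index Q (lower = better);
    v balances group utility S against individual regret R."""
    best, worst = loss.min(axis=0), loss.max(axis=0)
    span = np.where(worst > best, worst - best, 1.0)
    norm = (loss - best) / span
    S = (weights * norm).sum(axis=1)   # group utility
    R = (weights * norm).max(axis=1)   # individual regret
    def scale(x):
        d = x.max() - x.min()
        return (x - x.min()) / d if d > 0 else np.zeros_like(x)
    return v * scale(S) + (1 - v) * scale(R)

# Hypothetical loss scores for 3 suppliers on 3 criteria (cost, quality, delivery).
loss = np.array([[0.2, 0.1, 0.3],
                 [0.5, 0.4, 0.2],
                 [0.9, 0.8, 0.7]])
weights = np.array([0.4, 0.35, 0.25])   # e.g., obtained from BWM
Q = vikor(loss, weights)
best_supplier = int(np.argmin(Q))
```

In the integrated approach, the loss matrix entries would be Taguchi loss scores and the weights would come from the BWM pairwise comparisons; VIKOR then produces the compromise ranking.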
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nardin, Gaël; Li, Hebin; Autry, Travis M.
2015-03-21
We review our recent work on multi-dimensional coherent optical spectroscopy (MDCS) of semiconductor nanostructures. Two approaches, appropriate for the study of semiconductor materials, are presented and compared. A first method is based on a non-collinear geometry, where the Four-Wave-Mixing (FWM) signal is detected in the form of a radiated optical field. This approach works for samples with translational symmetry, such as Quantum Wells (QWs) or large and dense ensembles of Quantum Dots (QDs). A second method detects the FWM in the form of a photocurrent in a collinear geometry. This second approach extends the horizon of MDCS to sub-diffraction nanostructures, such as single QDs, nanowires, or nanotubes, and small ensembles thereof. Examples of experimental results obtained on semiconductor QW structures are given for each method. In particular, it is shown how MDCS can assess coupling between excitons confined in separated QWs.
Flexible and Scalable Data Fusion using Proactive Schemaless Information Services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widener, Patrick
2014-05-01
Exascale data environments are fast approaching, driven by diverse structured and unstructured data such as system and application telemetry streams, open-source information capture, and on-demand simulation output. Storage costs having plummeted, the question is now one of converting vast stores of data to actionable information. Complicating this problem are the low degrees of awareness across domain boundaries about what potentially useful data may exist, and write-once-read-never issues (data generation/collection rates outpacing data analysis and integration rates). Increasingly, technologists and researchers need to correlate previously unrelated data sources and artifacts to produce fused data views for domain-specific purposes. New tools and approaches for creating such views from vast amounts of data are vitally important to maintaining research and operational momentum. We propose to research and develop tools and services to assist in the creation, refinement, discovery and reuse of fused data views over large, diverse collections of heterogeneously structured data. We innovate in the following ways. First, we enable and encourage end-users to introduce customized index methods selected for local benefit rather than for global interaction (flexible multi-indexing). We envision rich combinations of such views on application data: views that span backing stores with different semantics, that introduce analytic methods of indexing, and that define multiple views on individual data items. We specifically decline to build a big fused database of everything providing a centralized index over all data, or to export a rigid schema to all comers as in federated query approaches. Second, we proactively advertise these application-specific views so that they may be programmatically reused and extended (data proactivity).
Through this mechanism, both changes in state (new data in existing view collected) and changes in structure (new or derived view exists) are made known. Lastly, we embrace found data heterogeneity by coupling multi-indexing to backing stores with appropriate semantics (as opposed to a single store or schema).
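The flexible multi-indexing idea, where callers register custom index functions over schemaless records and a single record can appear in several views, can be sketched as follows. This is an illustrative in-memory toy, not the proposed service; the record fields and index names are hypothetical:

```python
from collections import defaultdict

class MultiIndexStore:
    """Sketch of flexible multi-indexing over schemaless records."""
    def __init__(self):
        self.records = []
        self.indexes = {}  # name -> (key_fn, {key: [record ids]})

    def add_index(self, name, key_fn):
        """Register a user-supplied index view; index existing records."""
        table = defaultdict(list)
        for rid, rec in enumerate(self.records):
            for key in key_fn(rec):
                table[key].append(rid)
        self.indexes[name] = (key_fn, table)

    def insert(self, rec):
        """Store a record and keep every registered view current."""
        rid = len(self.records)
        self.records.append(rec)
        for key_fn, table in self.indexes.values():
            for key in key_fn(rec):
                table[key].append(rid)

    def query(self, name, key):
        return [self.records[r] for r in self.indexes[name][1].get(key, [])]

store = MultiIndexStore()
store.add_index("by_source", lambda r: [r.get("source", "unknown")])
store.add_index("by_tag", lambda r: r.get("tags", []))   # multi-valued view
store.insert({"source": "telemetry", "tags": ["cpu", "node1"], "val": 0.9})
store.insert({"source": "simulation", "tags": ["cpu"], "val": 0.4})
```

No global schema is imposed: each view is defined locally by its key function, and new views can be added after data has already arrived, mirroring the "local benefit rather than global interaction" design choice.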
Programmed Multi-Image Lectures for College Biology Instruction.
ERIC Educational Resources Information Center
Jensen, William A.; Knauft, Robert L.
1977-01-01
Discusses the use of a programmed multi-image lecture approach for teaching a botany course to nonmajor students at the University of California, Berkeley. Also considers the advantages, production, method of presentation, and design of the multimedia lectures. (HM)
Multiphase flow models for hydraulic fracturing technology
NASA Astrophysics Data System (ADS)
Osiptsov, Andrei A.
2017-10-01
The technology of hydraulic fracturing of a hydrocarbon-bearing formation is based on pumping a fluid with particles into a well to create fractures in the porous medium. After the end of pumping, the fractures, filled with closely packed proppant particles, create highly conductive channels for hydrocarbon flow from the far-field reservoir through the well to the surface. The design of a hydraulic fracturing treatment is carried out with a simulator. These simulators are based on mathematical models, which need to be accurate and close to physical reality. The entire process of fracture placement and flowback/cleanup can be conventionally split into four stages: (i) quasi-steady-state, effectively single-phase suspension flow down the wellbore; (ii) particle transport in an open vertical fracture; (iii) displacement of fracturing fluid by hydrocarbons from the closed fracture filled with a random close pack of proppant particles; and, finally, (iv) highly transient gas-liquid flow in the well during cleanup. Stage (i) is relatively well described by existing hydraulics models, while the models for the other three stages need revisiting and considerable improvement, which was the focus of the author's research presented in this review paper. For stage (ii), we consider the derivation of a multi-fluid model for suspension flow in a narrow vertical hydraulic fracture at moderate Re on the scale of fracture height and length, and also the migration of particles across the flow on the scale of fracture width. For the stage of fracture cleanup (iii), a novel multi-continua model for suspension filtration is developed. To provide closure relationships for the permeability of proppant packings used in this model, a 3D direct numerical simulation of single-phase flow is carried out using the lattice-Boltzmann method.
For wellbore cleanup (iv), we present a combined 1D model for highly transient gas-liquid flow based on a combination of multi-fluid and drift-flux approaches. The derivation of the drift-flux model from conservation laws is critically revisited in order to define the list of underlying assumptions and to mark the applicability margins of the model. All these fundamental problems share the same technological application (hydraulic fracturing) and the same method of research, namely the multi-fluid approach to multiphase flow modeling and the consistent use of asymptotic methods. Multi-fluid models are then discussed in comparison with the semi-empirical (often postulated) models widely used in the industry.
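The drift-flux closure underlying stage (iv) relates the gas velocity to the total mixture flux, u_g = C0*j + u_d, from which the void fraction follows. A minimal numeric sketch; the distribution parameter C0 and drift velocity u_d below are generic bubbly-flow values, not parameters derived in the paper:

```python
def void_fraction(jg, jl, C0=1.2, u_drift=0.35):
    """Gas void fraction from the standard drift-flux closure.

    alpha = j_g / (C0 * j + u_d), where j = j_g + j_l is the total
    superficial velocity (m/s). C0 and u_d are flow-regime dependent.
    """
    j = jg + jl
    return jg / (C0 * j + u_drift)

# Gas and liquid superficial velocities in m/s (illustrative values).
alpha = void_fraction(jg=0.5, jl=1.0)
```

A full 1D transient model evolves mass and momentum conservation for the mixture and uses a closure of this form in place of a separate gas momentum equation, which is what makes drift-flux models cheaper than full multi-fluid models.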
Omitola, Olufemi Gbenga; Soyele, Olujide Oladele; Sigbeku, Opeyemi; Okoh, Dickson; Akinshipo, Abdulwarith Olaitan; Butali, Azeez; Adeola, Henry Ademola
2017-01-01
Oral cancer is a leading cause of cancer deaths among African populations. The lack of standard cancer registries and under-reporting have inaccurately depicted its magnitude in Nigeria. The development of multi-centre collaborative oral pathology networks such as the African Oral Pathology Research Consortium (AOPRC) facilitates the exchange of skills and expertise and fosters robust and systematic investigation of oral diseases across Africa. In this descriptive cross-sectional study, we leveraged the auspices of the AOPRC to examine the burden of oral cancer in Nigeria using a multi-centre approach. Data from 4 major tertiary health institutions in Western and Southern Nigeria were generated using a standardized data extraction format and analysed using the SPSS data analysis software (version 20.0; SPSS Inc., Chicago, IL). Of the 162 cases examined across the 4 centres, we observed that oral squamous cell carcinomas (OSCC) occurred mostly in the 6th and 7th decades of life, and maxillary OSCC lesions were more frequent than mandibular ones. Regional variations were observed in location, age group and gender distribution. Significant regional differences were found between poorly, moderately and well differentiated OSCC (p = 0.0071). A multi-centre collaborative oral pathology research approach is an effective way to gain better insight into the patterns and distribution of various oral diseases in people of African descent. The wider outlook for AOPRC is to employ similar approaches to drive intensive oral pathology research targeted at addressing the current morbidity and mortality of various oral diseases across Africa.
Diaz-Balteiro, L; Belavenutti, P; Ezquerro, M; González-Pachón, J; Ribeiro Nobre, S; Romero, C
2018-05-15
There is an important body of literature using multi-criteria distance function methods for the aggregation of a battery of sustainability indicators in order to obtain a composite index. This index is considered to be a proxy of the sustainability goodness of a natural system. Although this approach has been used extensively in the literature, it is not exempt from difficulties and potential pitfalls. Thus, in this paper, a significant number of critical issues have been identified showing different procedures capable of avoiding, or at least of mitigating, the inherent potential pitfalls associated with each one. The recommendations made in the paper could increase the theoretical soundness of the multi-criteria distance function methods when this type of approach is applied in the sustainability field, thus increasing the accuracy and realism of the sustainability measurements obtained. Copyright © 2018 Elsevier Ltd. All rights reserved.
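The distance-function aggregation the paper critiques can be sketched in a few lines: normalized indicators are combined into a weighted L_p distance from an ideal point, and the resulting scalar serves as the composite index. A minimal sketch; the indicator values, ideal point and weights below are illustrative, not from the paper:

```python
# Composite sustainability index as a weighted L_p distance to the ideal
# point. Smaller values mean the system sits closer to the ideal under
# this proxy. All numbers here are illustrative.

def composite_index(indicators, ideal, weights, p=1):
    """Weighted L_p distance of a normalized indicator vector from ideal."""
    terms = [w * abs(i - a) ** p
             for w, i, a in zip(weights, indicators, ideal)]
    return sum(terms) ** (1.0 / p)

# Two hypothetical systems scored on three normalized indicators in [0, 1].
ideal = [1.0, 1.0, 1.0]
weights = [0.5, 0.3, 0.2]
system_a = [0.9, 0.8, 0.7]
system_b = [0.6, 0.9, 0.9]
print(composite_index(system_a, ideal, weights))  # lower = better
print(composite_index(system_b, ideal, weights))
```

Changing `p` moves the index between full compensability (`p=1`) and a max-type, non-compensatory aggregation (large `p`), which is one of the critical choices the paper discusses.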
Teaching University Students Cultural Diversity by Means of Multi-Cultural Picture Books in Taiwan
ERIC Educational Resources Information Center
Wu, Jia-Fen
2017-01-01
In a pluralistic society, learning about foreign cultures is an important goal in the kind of multi-cultural education that will lead to cultural competency. This study adopted a qualitative-dominant mixed-method approach to examine the effectiveness of the multi-cultural picture books on: (1) students' achieving awareness towards cultural…
Tutoring and Multi-Agent Systems: Modeling from Experiences
ERIC Educational Resources Information Center
Bennane, Abdellah
2010-01-01
Tutoring systems become complex and are offering varieties of pedagogical software as course modules, exercises, simulators, systems online or offline, for single user or multi-user. This complexity motivates new forms and approaches to the design and the modelling. Studies and research in this field introduce emergent concepts that allow the…
Novel Approach to Facilitating Tradeoff Multi-Objective Grouping Optimization
ERIC Educational Resources Information Center
Lin, Yu-Shih; Chang, Yi-Chun; Chu, Chih-Ping
2016-01-01
The grouping problem is critical in collaborative learning (CL) because of the complexity and difficulty in adequate grouping, based on various grouping criteria and numerous learners. Previous studies have paid attention to certain research questions, and the consideration for a number of learner characteristics has arisen. Such a multi-objective…
Evaluation of accelerometer based multi-sensor versus single-sensor activity recognition systems.
Gao, Lei; Bourke, A K; Nelson, John
2014-06-01
Physical activity has a positive impact on people's well-being, and it has been shown to decrease the occurrence of chronic diseases in the older adult population. To date, a substantial number of research studies exist that focus on activity recognition using inertial sensors. Many of these studies adopt a single-sensor approach and focus on proposing novel features combined with complex classifiers to improve the overall recognition accuracy. In addition, the implementation of the advanced feature extraction algorithms and the complex classifiers exceeds the computing ability of most current wearable sensor platforms. This paper proposes a method that adopts multiple sensors on distributed body locations to overcome this problem. The objective of the proposed system is to achieve higher recognition accuracy with "light-weight" signal processing algorithms, which run on a distributed computing based sensor system comprised of computationally efficient nodes. For analysing and evaluating the multi-sensor system, eight subjects were recruited to perform eight normal scripted activities in different life scenarios, each repeated three times. Thus a total of 192 activities were recorded, resulting in 864 separate annotated activity states. The design of such a multi-sensor system required consideration of the following: signal pre-processing algorithms, sampling rate, feature selection and classifier selection. Each was investigated, and the most appropriate approach was selected to achieve a trade-off between recognition accuracy and computing execution time. A comparison of six different systems, which employ single or multiple sensors, is presented. The experimental results illustrate that the proposed multi-sensor system can achieve an overall recognition accuracy of 96.4% by adopting the mean and variance features, using the Decision Tree classifier.
The results demonstrate that elaborate classifiers and feature sets are not required to achieve high recognition accuracies on a multi-sensor system. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.
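The "light-weight" pipeline described above reduces to per-window mean/variance extraction followed by a very simple classifier. A minimal stdlib sketch: the signals, window width and variance threshold are invented, and a single decision stump stands in for the study's Decision Tree:

```python
# Per-window mean/variance features from a 1-D accelerometer trace, fed to
# a one-rule decision stump. Illustrative stand-in for the paper's
# mean+variance features and Decision Tree classifier.
from statistics import mean, pvariance

def window_features(signal, width):
    """Split a 1-D signal into non-overlapping windows; return (mean, var)."""
    feats = []
    for start in range(0, len(signal) - width + 1, width):
        w = signal[start:start + width]
        feats.append((mean(w), pvariance(w)))
    return feats

def classify(feat, var_threshold=0.5):
    """Decision stump: high variance -> 'walking', low -> 'sitting'."""
    _, var = feat
    return "walking" if var > var_threshold else "sitting"

# Two hypothetical 8-sample windows: near-still, then vigorous motion.
sitting = [0.0, 0.1, 0.0, -0.1, 0.05, 0.0, -0.05, 0.1]
walking = [0.0, 1.5, -1.2, 1.8, -1.5, 1.2, -1.0, 1.6]
for f in window_features(sitting + walking, 8):
    print(classify(f))
```

On a multi-sensor node, such a stump (or a shallow tree over several sensors' features) keeps per-sample cost trivially low, which is the computational argument the abstract makes.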
A multi-view face recognition system based on cascade face detector and improved Dlib
NASA Astrophysics Data System (ADS)
Zhou, Hongjun; Chen, Pei; Shen, Wei
2018-03-01
In this research, we present a framework for a multi-view face detection and recognition system based on a cascade face detector and improved Dlib. This method aims to solve the problems of low efficiency and low accuracy in multi-view face recognition, to build a multi-view face recognition system, and to discover a suitable monitoring scheme. For face detection, the cascade face detector extracts Haar-like features from the training samples, and these features are used to train a cascade classifier with the AdaBoost algorithm. Next, for face recognition, we propose an improved distance model based on Dlib to improve the accuracy of multi-view face recognition. Furthermore, we applied this method to face images taken from different viewing directions, including horizontal, overhead and looking-up views, and investigated a suitable monitoring scheme. The method works well for multi-view face recognition; it was also simulated and tested, showing satisfactory experimental results.
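The Haar-like features behind cascade detectors rest on the integral image trick, which makes any rectangle sum O(1) so that thousands of candidate features can be evaluated cheaply. A minimal sketch on a toy "image"; the 4x4 array and the feature placement are illustrative:

```python
# Integral image plus a two-rectangle Haar-like feature (left minus right).
# A dark-left / bright-right pattern yields a strongly negative response.

def integral_image(img):
    """ii[y][x] = sum of img over the rectangle [0, y) x [0, x)."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        for x in range(w):
            ii[y + 1][x + 1] = (img[y][x] + ii[y][x + 1]
                                + ii[y + 1][x] - ii[y][x])
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of img over the w-by-h rectangle at (x, y), in O(1)."""
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]

def haar_two_rect(ii, x, y, w, h):
    """Two-rectangle Haar-like feature: left half minus right half."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)

img = [[1, 1, 5, 5],
       [1, 1, 5, 5],
       [1, 1, 5, 5],
       [1, 1, 5, 5]]
ii = integral_image(img)
print(haar_two_rect(ii, 0, 0, 4, 4))  # -> -32 (dark left vs bright right)
```

AdaBoost then selects and weights the most discriminative of these features into stage classifiers, which is the training step the abstract refers to.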
NASA Astrophysics Data System (ADS)
Madani, Kaveh; Hooshyar, Milad
2014-11-01
Reservoir systems with multiple operators can benefit from coordination of operation policies. To maximize the total benefit of these systems the literature has normally used the social planner's approach. Based on this approach operation decisions are optimized using a multi-objective optimization model with a compound system's objective. While the utility of the system can be increased this way, fair allocation of benefits among the operators remains challenging for the social planner who has to assign controversial weights to the system's beneficiaries and their objectives. Cooperative game theory provides an alternative framework for fair and efficient allocation of the incremental benefits of cooperation. To determine the fair and efficient utility shares of the beneficiaries, cooperative game theory solution methods consider the gains of each party in the status quo (non-cooperation) as well as what can be gained through the grand coalition (social planner's solution or full cooperation) and partial coalitions. Nevertheless, estimation of the benefits of different coalitions can be challenging in complex multi-beneficiary systems. Reinforcement learning can be used to address this challenge and determine the gains of the beneficiaries for different levels of cooperation, i.e., non-cooperation, partial cooperation, and full cooperation, providing the essential input for allocation based on cooperative game theory. This paper develops a game theory-reinforcement learning (GT-RL) method for determining the optimal operation policies in multi-operator multi-reservoir systems with respect to fairness and efficiency criteria. As the first step to underline the utility of the GT-RL method in solving complex multi-agent multi-reservoir problems without a need for developing compound objectives and weight assignment, the proposed method is applied to a hypothetical three-agent three-reservoir system.
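The cooperative-game allocation step described above can be illustrated with the Shapley value, which splits the grand-coalition benefit according to average marginal contributions. A minimal sketch for a hypothetical three-agent game; the coalition benefits are invented, not the paper's reservoir results:

```python
# Shapley value for a 3-agent cooperative game. v maps each coalition to
# its benefit (e.g. as estimated by reinforcement learning for each level
# of cooperation); the numbers here are purely illustrative.
from itertools import permutations
from math import factorial

v = {frozenset(): 0, frozenset("A"): 10, frozenset("B"): 20,
     frozenset("C"): 30, frozenset("AB"): 40, frozenset("AC"): 50,
     frozenset("BC"): 60, frozenset("ABC"): 90}

def shapley(players, v):
    """Average each player's marginal contribution over all join orders."""
    shares = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            shares[p] += v[coalition | {p}] - v[coalition]
            coalition |= {p}
    n_orders = factorial(len(players))
    return {p: s / n_orders for p, s in shares.items()}

print(shapley("ABC", v))  # -> {'A': 20.0, 'B': 30.0, 'C': 40.0}
```

The shares always sum to the grand-coalition value, so full cooperation is exactly distributed, which is the fairness-plus-efficiency property the GT-RL method targets.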
Market-based control strategy for long-span structures considering the multi-time delay issue
NASA Astrophysics Data System (ADS)
Li, Hongnan; Song, Jianzhu; Li, Gang
2017-01-01
To address the different time delays that exist among the control devices installed on spatial structures, in this study discrete analysis using a 2^N precise integration algorithm was selected to solve the multi-time-delay issue for long-span structures based on the market-based control (MBC) method. The concept of interval mixed energy was introduced from the computational structural mechanics and optimal control research areas, and it translates the design of the MBC multi-time-delay controller into a solution for the segment matrix. This approach transforms the serial algorithm in time into parallel computing in space, greatly improving the solving efficiency and numerical stability. The designed controller is able to handle time delay with a linear controlling-force combination and is especially effective under large time-delay conditions. A numerical example of a long-span structure was selected to demonstrate the effectiveness of the presented controller, and the time delay was found to have a significant impact on the results.
Diaz, Maureen H; Winchell, Jonas M
2016-01-01
Over the past decade there have been significant advancements in the methods used for detecting and characterizing Mycoplasma pneumoniae, a common cause of respiratory illness and community-acquired pneumonia worldwide. The repertoire of available molecular diagnostics has greatly expanded from nucleic acid amplification techniques (NAATs) that encompass a variety of chemistries used for detection, to more sophisticated characterizing methods such as multi-locus variable-number tandem-repeat analysis (MLVA), multi-locus sequence typing (MLST), matrix-assisted laser desorption ionization-time-of-flight mass spectrometry (MALDI-TOF MS), single nucleotide polymorphism typing, and numerous macrolide susceptibility profiling methods, among others. These many molecular-based approaches have been developed and employed to continually increase the level of discrimination and characterization in order to better understand the epidemiology and biology of M. pneumoniae. This review will summarize recent molecular techniques and procedures and lend perspective to how each has enhanced the current understanding of this organism and will emphasize how Next Generation Sequencing may serve as a resource for researchers to gain a more comprehensive understanding of the genomic complexities of this insidious pathogen.
SChloro: directing Viridiplantae proteins to six chloroplastic sub-compartments.
Savojardo, Castrense; Martelli, Pier Luigi; Fariselli, Piero; Casadio, Rita
2017-02-01
Chloroplasts are organelles found in plants and involved in several important cell processes. Similarly to other compartments in the cell, chloroplasts have an internal structure comprising several sub-compartments, where different proteins are targeted to perform their functions. Given the relation between protein function and localization, the availability of effective computational tools to predict protein sub-organelle localizations is crucial for large-scale functional studies. In this paper we present SChloro, a novel machine-learning approach to predict protein sub-chloroplastic localization, based on targeting signal detection and membrane protein information. The proposed approach performs multi-label predictions discriminating six chloroplastic sub-compartments that include inner membrane, outer membrane, stroma, thylakoid lumen, plastoglobule and thylakoid membrane. In comparative benchmarks, the proposed method outperforms current state-of-the-art methods in both single- and multi-compartment predictions, with an overall multi-label accuracy of 74%. The results demonstrate the relevance of the approach, which is a good candidate for integration into more general large-scale annotation pipelines of protein subcellular localization. The method is available as a web server at http://schloro.biocomp.unibo.it. Contact: gigi@biocomp.unibo.it.
A versatile clearing agent for multi-modal brain imaging
Costantini, Irene; Ghobril, Jean-Pierre; Di Giovanna, Antonino Paolo; Mascaro, Anna Letizia Allegra; Silvestri, Ludovico; Müllenbroich, Marie Caroline; Onofri, Leonardo; Conti, Valerio; Vanzi, Francesco; Sacconi, Leonardo; Guerrini, Renzo; Markram, Henry; Iannello, Giulio; Pavone, Francesco Saverio
2015-01-01
Extensive mapping of neuronal connections in the central nervous system requires high-throughput µm-scale imaging of large volumes. In recent years, different approaches have been developed to overcome the limitations due to tissue light scattering. These methods are generally developed to improve the performance of a specific imaging modality, thus limiting comprehensive neuroanatomical exploration by multi-modal optical techniques. Here, we introduce a versatile brain clearing agent (2,2′-thiodiethanol; TDE) suitable for various applications and imaging techniques. TDE is cost-efficient, water-soluble and low-viscous and, more importantly, it preserves fluorescence, is compatible with immunostaining and does not cause deformations at sub-cellular level. We demonstrate the effectiveness of this method in different applications: in fixed samples by imaging a whole mouse hippocampus with serial two-photon tomography; in combination with CLARITY by reconstructing an entire mouse brain with light sheet microscopy and in translational research by imaging immunostained human dysplastic brain tissue. PMID:25950610
Multi-objective decision-making under uncertainty: Fuzzy logic methods
NASA Technical Reports Server (NTRS)
Hardy, Terry L.
1994-01-01
Selecting the best option among alternatives is often a difficult process. This process becomes even more difficult when the evaluation criteria are vague or qualitative, and when the objectives vary in importance and scope. Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process. In addition, this paper describes software developed at NASA Lewis Research Center for assisting in the decision-making process. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other example is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.
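The fuzzy scoring idea can be sketched with simple ramp membership functions and weighted aggregation: each vague objective maps a raw value into [0, 1], and alternatives are ranked by the weighted average of their memberships. The criteria, membership breakpoints and weights below are invented for illustration, not taken from the NASA examples:

```python
# Fuzzy multi-objective scoring with linear (ramp) membership functions.
# All criteria, breakpoints and weights are hypothetical.

def ramp(x, lo, hi):
    """Piecewise-linear membership: 0 at/below lo, 1 at/above hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def score(memberships, weights):
    """Weighted-average aggregation of per-objective memberships."""
    total = sum(weights[k] for k in memberships)
    return sum(weights[k] * m for k, m in memberships.items()) / total

def memberships(rel, cost, isp):
    """Fuzzify one option: reliability, cost ($M), specific impulse (s)."""
    return {"reliability": ramp(rel, 0.5, 1.0),
            "affordability": 1.0 - ramp(cost, 20, 80),  # cheaper is better
            "performance": ramp(isp, 280, 330)}

weights = {"reliability": 0.5, "affordability": 0.2, "performance": 0.3}
opt1 = memberships(0.9, 40, 310)   # reliable, mid-cost, good performance
opt2 = memberships(0.7, 70, 300)   # less reliable and more expensive
print(score(opt1, weights), score(opt2, weights))
```

Other aggregation operators (e.g. the minimum, for a pessimistic "weakest objective" ranking) slot into `score` without changing the fuzzification step.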
NASA Astrophysics Data System (ADS)
Schrooyen, Pierre; Chatelain, Philippe; Hillewaert, Koen; Magin, Thierry E.
2014-11-01
The atmospheric entry of spacecraft presents several challenges in simulating the aerothermal flow around the heat shield. Predicting an accurate heat flux is a complex task, especially regarding the interaction between the flow in the free stream and the erosion of the thermal protection material. To capture this interaction, a continuum approach is developed to go progressively from the region fully occupied by fluid to a receding porous medium. The volume-averaged Navier-Stokes equations are used to model both phases in the same computational domain considering a single set of conservation laws. The porosity is itself a variable of the computation, allowing volumetric ablation to be taken into account through adequate source terms. This approach is implemented within a computational tool based on a high-order discontinuous Galerkin discretization. The multi-dimensional tool has already been validated and has demonstrated an efficient parallel implementation. Within this platform, a fully implicit method was developed to simulate multi-phase reacting flows. Numerical results to verify and validate the methodology are considered within this work. Interactions between the flow and the ablated geometry are also presented. Supported by the Fund for Research Training in Industry and Agriculture.
Multi-layer plastic/glass microfluidic systems containing electrical and mechanical functionality.
Han, Arum; Wang, Olivia; Graff, Mason; Mohanty, Swomitra K; Edwards, Thayne L; Han, Ki-Ho; Bruno Frazier, A
2003-08-01
This paper describes an approach for fabricating multi-layer microfluidic systems from a combination of glass and plastic materials. Methods and characterization results for the microfabrication technologies underlying the process flow are presented. The approach is used to fabricate and characterize multi-layer plastic/glass microfluidic systems containing electrical and mechanical functionality. Hot embossing, heat staking of plastics, injection molding, microstenciling of electrodes, and stereolithography were combined with conventional MEMS fabrication techniques to realize the multi-layer systems. The approach enabled the integration of multiple plastic/glass materials into a single monolithic system, provided a solution for the integration of electrical functionality throughout the system, provided a mechanism for the inclusion of microactuators such as micropumps/valves, and provided an interconnect technology for interfacing fluids and electrical components between the micro system and the macro world.
Kulmanov, Maxat; Khan, Mohammed Asif; Hoehndorf, Robert; Wren, Jonathan
2018-02-15
A large number of protein sequences are becoming available through the application of novel high-throughput sequencing technologies. Experimental functional characterization of these proteins is time-consuming and expensive, and is often only done rigorously for a few selected model organisms. Computational function prediction approaches have been suggested to fill this gap. The functions of proteins are classified using the Gene Ontology (GO), which contains over 40 000 classes. Additionally, proteins have multiple functions, making function prediction a large-scale, multi-class, multi-label problem. We have developed a novel method to predict protein function from sequence. We use deep learning to learn features from protein sequences as well as a cross-species protein-protein interaction network. Our approach specifically outputs information in the structure of the GO and utilizes the dependencies between GO classes as background information to construct a deep learning model. We evaluate our method using the standards established by the Computational Assessment of Function Annotation (CAFA) and demonstrate a significant improvement over baseline methods such as BLAST, in particular for predicting cellular locations. Web server: http://deepgo.bio2vec.net, Source code: https://github.com/bio-ontology-research-group/deepgo. Contact: robert.hoehndorf@kaust.edu.sa. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
Hazard Interactions and Interaction Networks (Cascades) within Multi-Hazard Methodologies
NASA Astrophysics Data System (ADS)
Gill, Joel; Malamud, Bruce D.
2016-04-01
Here we combine research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between 'multi-layer single hazard' approaches and 'multi-hazard' approaches that integrate such interactions. This synthesis suggests that ignoring interactions could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. We proceed to present an enhanced multi-hazard framework, through the following steps: (i) describe and define three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment; (ii) outline three types of interaction relationship (triggering, increased probability, and catalysis/impedance); and (iii) assess the importance of networks of interactions (cascades) through case-study examples (based on literature, field observations and semi-structured interviews). We further propose visualisation frameworks to represent these networks of interactions. Our approach reinforces the importance of integrating interactions between natural hazards, anthropogenic processes and technological hazards/disasters into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential, and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
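The triggering relationships described above can be represented as a directed interaction network, with cascades enumerated by graph traversal. A minimal sketch with an illustrative, much-simplified edge list (the paper's case-study catalogue is far richer):

```python
# Hazard-interaction network: directed edges record which primary hazard
# can trigger which secondary hazard; a BFS enumerates the potential
# cascade reachable from one initiating event. Edges are illustrative.
from collections import deque

triggers = {
    "earthquake": ["landslide", "tsunami"],
    "landslide": ["river_damming"],
    "river_damming": ["flood"],
    "storm": ["flood", "landslide"],
}

def cascade(start):
    """All hazards reachable from `start` via triggering links."""
    seen, queue = set(), deque([start])
    while queue:
        h = queue.popleft()
        for nxt in triggers.get(h, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(cascade("earthquake")))
# -> ['flood', 'landslide', 'river_damming', 'tsunami']
```

A 'multi-layer single hazard' assessment would score each node in isolation; traversing the network is what exposes the second- and third-order risks the abstract argues are otherwise missed.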
Uncertainty-Based Multi-Objective Optimization of Groundwater Remediation Design
NASA Astrophysics Data System (ADS)
Singh, A.; Minsker, B.
2003-12-01
Management of groundwater contamination is a cost-intensive undertaking filled with conflicting objectives and substantial uncertainty. A critical source of this uncertainty in groundwater remediation design problems comes from the hydraulic conductivity values for the aquifer, upon which the prediction of flow and transport of contaminants are dependent. For a remediation solution to be reliable in practice it is important that it is robust over the potential error in the model predictions. This work focuses on incorporating such uncertainty within a multi-objective optimization framework, to get reliable as well as Pareto optimal solutions. Previous research has shown that small amounts of sampling within a single-objective genetic algorithm can produce highly reliable solutions. However with multiple objectives the noise can interfere with the basic operations of a multi-objective solver, such as determining non-domination of individuals, diversity preservation, and elitism. This work proposes several approaches to improve the performance of noisy multi-objective solvers. These include a simple averaging approach, taking samples across the population (which we call extended averaging), and a stochastic optimization approach. All the approaches are tested on standard multi-objective benchmark problems and a hypothetical groundwater remediation case-study; the best-performing approach is then tested on a field-scale case at Umatilla Army Depot.
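The "simple averaging" approach can be sketched directly: sample each noisy design several times, average the objective vectors, then apply standard non-domination to the averages. A toy two-objective example; the test function and noise level are illustrative, not the paper's groundwater model:

```python
# Noisy bi-objective minimization handled by sample averaging before
# the non-domination test. The quadratic test function is illustrative.
import random

random.seed(7)

def noisy_eval(x):
    """Two conflicting objectives (minimize both) with additive noise."""
    return (x ** 2 + random.gauss(0, 0.05),
            (x - 2) ** 2 + random.gauss(0, 0.05))

def averaged(x, n=20):
    """Average n noisy samples of the objective vector at design x."""
    samples = [noisy_eval(x) for _ in range(n)]
    return tuple(sum(s[i] for s in samples) / n for i in range(2))

def dominates(a, b):
    """a dominates b if it is no worse in both objectives and differs."""
    return all(ai <= bi for ai, bi in zip(a, b)) and a != b

designs = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0]
objs = {x: averaged(x) for x in designs}
pareto = [x for x in designs
          if not any(dominates(objs[y], objs[x]) for y in designs if y != x)]
print(pareto)  # with this seed: the true front [0.0, 0.5, 1.0, 1.5, 2.0]
```

Averaging shrinks the noise on each objective by 1/sqrt(n), which is why x = 3.0 (truly dominated by x = 2.0) is reliably rejected despite the noisy single evaluations.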
Multi-Instance Metric Transfer Learning for Genome-Wide Protein Function Prediction.
Xu, Yonghui; Min, Huaqing; Wu, Qingyao; Song, Hengjie; Ye, Bicui
2017-02-06
Multi-Instance (MI) learning has been proven to be effective for the genome-wide protein function prediction problems where each training example is associated with multiple instances. Many studies in this literature attempted to find an appropriate Multi-Instance Learning (MIL) method for genome-wide protein function prediction under a usual assumption: the underlying distribution of the testing data (target domain, i.e., TD) is the same as that of the training data (source domain, i.e., SD). However, this assumption may be violated in real practice. To tackle this problem, in this paper, we propose a Multi-Instance Metric Transfer Learning (MIMTL) approach for genome-wide protein function prediction. In MIMTL, we first transfer the source domain distribution to the target domain distribution by utilizing the bag weights. Then, we construct a distance metric learning method with the reweighted bags. Finally, we develop an alternative optimization scheme for MIMTL. Comprehensive experimental evidence on seven real-world organisms verifies the effectiveness and efficiency of the proposed MIMTL approach over several state-of-the-art methods.
Divide and Conquer-Based 1D CNN Human Activity Recognition Using Test Data Sharpening †
Yoon, Sang Min
2018-01-01
Human Activity Recognition (HAR) aims to identify the actions performed by humans using signals collected from various sensors embedded in mobile devices. In recent years, deep learning techniques have further improved HAR performance on several benchmark datasets. In this paper, we propose a one-dimensional Convolutional Neural Network (1D CNN) for HAR that employs divide and conquer-based classifier learning coupled with test data sharpening. Our approach leverages a two-stage learning of multiple 1D CNN models; we first build a binary classifier for recognizing abstract activities, and then build two multi-class 1D CNN models for recognizing individual activities. We then introduce test data sharpening during the prediction phase to further improve the activity recognition accuracy. While there have been numerous studies exploring the benefits of activity signal denoising for HAR, few have examined the effect of test data sharpening for HAR. We evaluate the effectiveness of our approach on two popular HAR benchmark datasets, and show that our approach outperforms both the two-stage 1D CNN-only method and other state-of-the-art approaches. PMID:29614767
Bootstrap Enhanced Penalized Regression for Variable Selection with Neuroimaging Data.
Abram, Samantha V; Helwig, Nathaniel E; Moodie, Craig A; DeYoung, Colin G; MacDonald, Angus W; Waller, Niels G
2016-01-01
Recent advances in fMRI research highlight the use of multivariate methods for examining whole-brain connectivity. Complementary data-driven methods are needed for determining the subset of predictors related to individual differences. Although commonly used for this purpose, ordinary least squares (OLS) regression may not be ideal due to multi-collinearity and over-fitting issues. Penalized regression is a promising and underutilized alternative to OLS regression. In this paper, we propose a nonparametric bootstrap quantile (QNT) approach for variable selection with neuroimaging data. We use real and simulated data, as well as annotated R code, to demonstrate the benefits of our proposed method. Our results illustrate the practical potential of our proposed bootstrap QNT approach. Our real data example demonstrates how our method can be used to relate individual differences in neural network connectivity with an externalizing personality measure. Also, our simulation results reveal that the QNT method is effective under a variety of data conditions. Penalized regression yields more stable estimates and sparser models than OLS regression in situations with large numbers of highly correlated neural predictors. Our results demonstrate that penalized regression is a promising method for examining associations between neural predictors and clinically relevant traits or behaviors. These findings have important implications for the growing field of functional connectivity research, where multivariate methods produce numerous, highly correlated brain networks.
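The bootstrap quantile idea can be sketched end to end: refit a penalized regression on bootstrap resamples and keep only predictors whose bootstrap percentile interval excludes zero. The abstract's annotated R workflow is approximated here in Python with a tiny coordinate-descent lasso; the simulated data, penalty and percentile cutoffs are illustrative, not the authors' settings:

```python
# Bootstrap quantile (QNT-style) variable selection around a minimal
# coordinate-descent lasso. Data and tuning values are illustrative.
import random

random.seed(0)

def soft(a, t):
    """Soft-thresholding operator used in lasso coordinate descent."""
    return a - t if a > t else a + t if a < -t else 0.0

def lasso(X, y, lam, iters=25):
    """Tiny coordinate-descent lasso (unstandardized; illustration only)."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            r = [y[i] - sum(beta[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            beta[j] = soft(rho, lam) / z
    return beta

# Simulated data: predictor 0 drives y, predictor 1 is pure noise.
n = 40
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(n)]
y = [2.0 * xi[0] + random.gauss(0, 0.5) for xi in X]

# Bootstrap the fit; select predictors whose 5th-95th percentile
# interval of bootstrap coefficients excludes zero.
B = 30
boots = [[] for _ in range(2)]
for _ in range(B):
    idx = [random.randrange(n) for _ in range(n)]
    b = lasso([X[i] for i in idx], [y[i] for i in idx], lam=10.0)
    for j in range(2):
        boots[j].append(b[j])

selected = []
for j in range(2):
    s = sorted(boots[j])
    lo, hi = s[int(0.05 * B)], s[int(0.95 * B)]
    if lo > 0 or hi < 0:
        selected.append(j)
print(selected)  # the informative predictor is retained, the noise dropped
```

The sparsity of the lasso means the noise predictor's bootstrap distribution piles up at exactly zero, so its percentile interval cannot exclude zero, which is the stabilizing effect the paper highlights over OLS.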
An Integer Programming Model for Multi-Echelon Supply Chain Decision Problem Considering Inventories
NASA Astrophysics Data System (ADS)
Harahap, Amin; Mawengkang, Herman; Siswadi; Effendi, Syahril
2018-01-01
In this paper we address a problem that is of significance to industry, namely the optimal decision-making for a multi-echelon supply chain and the associated inventory systems. Using the guaranteed-service approach to model the multi-echelon inventory system, we develop a mixed integer programming model to simultaneously optimize the transportation, inventory and network structure of a multi-echelon supply chain. To solve the model we develop a direct search approach using a strategy of releasing nonbasic variables from their bounds, combined with the “active constraint” method. This strategy is used to force the appropriate non-integer basic variables to move to their neighbouring integer points.
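A toy version of the joint transportation/inventory decision makes the model concrete: choose integer shipment quantities to two warehouses, trading transport cost against holding cost under demand and capacity constraints. Brute-force enumeration stands in for the paper's direct-search MILP strategy; all costs, demands and capacities below are invented:

```python
# Toy two-warehouse, single-echelon slice of the MILP: integer shipment
# quantities minimize transport + holding cost subject to demand and
# plant capacity. All parameters are illustrative.
from itertools import product

demand = {"W1": 3, "W2": 2}
ship_cost = {"W1": 4.0, "W2": 6.0}   # per unit shipped
hold_cost = 1.5                      # per unit held beyond demand
capacity = 8                         # plant output per period

def total_cost(q1, q2):
    """Cost of shipping q1 to W1 and q2 to W2; inf if infeasible."""
    if q1 < demand["W1"] or q2 < demand["W2"] or q1 + q2 > capacity:
        return float("inf")
    transport = q1 * ship_cost["W1"] + q2 * ship_cost["W2"]
    holding = hold_cost * ((q1 - demand["W1"]) + (q2 - demand["W2"]))
    return transport + holding

best = min(product(range(capacity + 1), repeat=2),
           key=lambda q: total_cost(*q))
print(best, total_cost(*best))  # -> (3, 2) 24.0
```

In the full model, enumeration is replaced by the branch-and-bound-style search over basic and nonbasic variables, but the objective and constraint structure are of this shape.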
Generating multi-double-scroll attractors via nonautonomous approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Qinghui; Xie, Qingguo, E-mail: qgxie@mail.hust.edu.cn; Shen, Yi
It is a common phenomenon that multi-scroll attractors are realized by introducing various nonlinear functions with multiple breakpoints into double scroll chaotic systems. Differently, we present a nonautonomous approach for generating multi-double-scroll attractors (MDSA) without changing the original nonlinear functions. By using the multi-level-logic pulse excitation technique in double scroll chaotic systems, MDSA can be generated. A Chua's circuit, a Jerk circuit, and a modified Lorenz system are given as design examples, and the Matlab simulation results are presented. Furthermore, the corresponding realization circuits are designed. The Pspice results are in agreement with the numerical simulation results, which verifies the availability and feasibility of this method.
Algorithms for the automatic generation of 2-D structured multi-block grids
NASA Technical Reports Server (NTRS)
Schoenfeld, Thilo; Weinerfelt, Per; Jenssen, Carl B.
1995-01-01
Two different approaches to the fully automatic generation of structured multi-block grids in two dimensions are presented. The work aims to simplify the user interactivity necessary for the definition of a multiple block grid topology. The first approach is based on an advancing front method commonly used for the generation of unstructured grids. The original algorithm has been modified toward the generation of large quadrilateral elements. The second method is based on the divide-and-conquer paradigm with the global domain recursively partitioned into sub-domains. For either method each of the resulting blocks is then meshed using transfinite interpolation and elliptic smoothing. The applicability of these methods to practical problems is demonstrated for typical geometries of fluid dynamics.
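The per-block meshing step mentioned at the end, transfinite interpolation, is simple to state in code. This sketch (toy unit-square block with a curved top edge; the edge curves are illustrative assumptions, not the paper's geometry) blends the four boundary curves and subtracts the bilinear corner term:

```python
import math

def tfi(bottom, top, left, right, u, v):
    """Transfinite interpolation: map (u, v) in [0,1]^2 into the block."""
    corners = (bottom(0.0), bottom(1.0), top(0.0), top(1.0))
    pt = []
    for k in (0, 1):  # x and y components
        edge = ((1 - v) * bottom(u)[k] + v * top(u)[k]
                + (1 - u) * left(v)[k] + u * right(v)[k])
        bilinear = ((1 - u) * (1 - v) * corners[0][k]
                    + u * (1 - v) * corners[1][k]
                    + (1 - u) * v * corners[2][k]
                    + u * v * corners[3][k])
        pt.append(edge - bilinear)  # subtracting corners avoids double counting
    return tuple(pt)

# A block bounded by three straight edges and one curved top edge.
bottom = lambda u: (u, 0.0)
top    = lambda u: (u, 1.0 + 0.1 * math.sin(math.pi * u))
left   = lambda v: (0.0, v)
right  = lambda v: (1.0, v)

grid = [[tfi(bottom, top, left, right, i / 10, j / 10) for i in range(11)]
        for j in range(11)]
```

The interpolant reproduces the boundary curves exactly, which is why elliptic smoothing is only needed afterwards to improve interior grid quality.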
Addressing data privacy in matched studies via virtual pooling.
Saha-Chaudhuri, P; Weinberg, C R
2017-09-07
Data confidentiality and shared use of research data are two desirable but sometimes conflicting goals in research with multi-center studies and distributed data. While a single combined dataset would be ideal for straightforward analysis, confidentiality restrictions forbid creating one that includes covariate information from all participants. Current approaches such as aggregate data sharing, distributed regression, meta-analysis and score-based methods can have important limitations. We propose a novel application of an existing epidemiologic tool, specimen pooling, to enable confidentiality-preserving analysis of data arising from a matched case-control, multi-center design. Instead of pooling specimens prior to assay, we apply the methodology to virtually pool (aggregate) covariates within nodes. Such virtual pooling retains most of the information used in an analysis with individual data and, since individual participant data are not shared externally, within-node virtual pooling preserves data confidentiality. We show that aggregated covariate levels can be used in a conditional logistic regression model to estimate individual-level odds ratios of interest. The parameter estimates from the standard conditional logistic regression are compared to the estimates based on a conditional logistic regression model with aggregated data. The parameter estimates are shown to be similar to those without pooling and to have comparable standard errors and confidence interval coverage. Virtual data pooling can be used to maintain confidentiality of data from a multi-center study and can be particularly useful in research with large-scale distributed data.
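The individual-level benchmark the pooled estimates are compared against is standard conditional logistic regression. For 1:1 matched pairs this reduces to a no-intercept logistic model on case-minus-control covariate differences, which can be fit with a few lines of Newton-Raphson. The sketch below uses simulated pairs (not the paper's data) with a true log odds ratio of 1:

```python
import math
import random

def fit_clogit_1to1(diffs, iters=25):
    """Newton-Raphson for the single log-odds-ratio parameter beta."""
    beta = 0.0
    for _ in range(iters):
        grad = hess = 0.0
        for d in diffs:
            s = 1.0 / (1.0 + math.exp(-beta * d))  # P(observed case is the case)
            grad += d * (1.0 - s)
            hess -= d * d * s * (1.0 - s)
        beta -= grad / hess
    return beta

random.seed(1)
true_beta, diffs = 1.0, []
for _ in range(300):
    xa, xb = random.gauss(0, 1), random.gauss(0, 1)
    # case status within the pair follows the conditional probability
    p_a = math.exp(true_beta * xa) / (math.exp(true_beta * xa) + math.exp(true_beta * xb))
    case, ctrl = (xa, xb) if random.random() < p_a else (xb, xa)
    diffs.append(case - ctrl)

beta_hat = fit_clogit_1to1(diffs)
print(round(beta_hat, 2))
```

The virtual-pooling result is that the same conditional likelihood can be evaluated from within-node covariate aggregates, so no individual row ever leaves a node.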
Towards a framework for the elicitation of dilemmas.
Burger, Marc J C
2008-08-01
This paper covers the main findings of doctoral research concerned with extending aspects of dilemma theory. In professional practice, the Trompenaars Hampden-Turner Dilemma Reconciliation Process(TM) is a vehicle delivering dilemma theory in application. It informs a manager or leader on how to explore the dilemmas they face, how to reconcile the tensions that result, and how to structure the action steps for implementing the reconciled solutions. This vehicle forms the professional practice of the author, who seeks to bring more rigor to consulting practice and thereby also contribute to theory development in the domain. The critical review of dilemma theory reveals that previous authors are inconsistent, and in places invalid, in their use of the terms 'dilemma theory,' 'dilemma methodology,' 'dilemma process,' 'dilemma reconciliation,' etc. An attempt is therefore made to resolve these inconsistencies by considering whether 'dilemmaism' at the meta-level might be positioned as a new paradigm of inquiry for (management) research, one that embodies ontological, epistemological, and methodological premises framing an approach to the resolution of real-world business problems in (multi)disciplinary, (multi)functional and (multi)cultural business environments. This research offers contributions to knowledge, professional practice and theory development through the exploration of the SPID model as a way to make the elicitation of dilemmas more rigorous and structured, and, more broadly, through exploring 'dilemmaism' as a new paradigm of inquiry.
Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P
2016-04-01
To identify the key methodological challenges for public health economic modelling and set an agenda for future research. An iterative literature search identified papers describing methodological challenges for developing the structure of public health economic models. Additional multidisciplinary literature searches helped expand upon important ideas raised within the review. Fifteen articles were identified within the formal literature search, highlighting three key challenges: inclusion of non-healthcare costs and outcomes; inclusion of equity; and modelling complex systems and multi-component interventions. Based upon these and multidisciplinary searches about dynamic complexity, the social determinants of health, and models of human behaviour, six areas for future research were specified. Future research should focus on: the use of systems approaches within health economic modelling; approaches to assist the systematic consideration of the social determinants of health; methods for incorporating models of behaviour and social interactions; consideration of equity; and methodology to help modellers develop valid, credible and transparent public health economic model structures.
Simpson, John; Raith, Andrea; Rouse, Paul; Ehrgott, Matthias
2017-10-09
Purpose The operations research method of data envelopment analysis (DEA) shows promise for assessing radiotherapy treatment plan quality. The purpose of this paper is to consider the technical requirements for using DEA for plan assessment. Design/methodology/approach In total, 41 prostate treatment plans were retrospectively analysed using the DEA method. The authors investigate the impact of DEA weight restrictions with reference to the ability to differentiate plan performance at a level of clinical significance. Patient geometry influences plan quality and the authors compare differing approaches for managing patient geometry within the DEA method. Findings The input-oriented DEA method is the method of choice when performing plan analysis using the key undesirable plan metrics as the DEA inputs. When considering multiple inputs, it is necessary to constrain the DEA input weights in order to identify potential plan improvements at a level of clinical significance. All tested approaches for the consideration of patient geometry yielded consistent results. Research limitations/implications This work is based on prostate plans and individual recommendations would therefore need to be validated for other treatment sites. Notwithstanding, the method that requires both optimised DEA weights according to clinical significance and appropriate accounting for patient geometric factors is universally applicable. Practical implications DEA can potentially be used during treatment plan development to guide the planning process or alternatively used retrospectively for treatment plan quality audit. Social implications DEA is independent of the planning system platform and therefore has the potential to be used for multi-institutional quality audit. Originality/value To the authors' knowledge, this is the first published examination of the optimal approach in the use of DEA for radiotherapy treatment plan assessment.
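In the simplest single-input, single-output case, input-oriented DEA efficiency collapses to a ratio against the best-performing unit, which makes the idea easy to see; the full method with multiple weight-restricted inputs requires solving a linear program per plan. The plan metrics below are made-up values, not the study's 41 prostate plans:

```python
# Each plan: (undesirable dose metric as the input, target coverage as the output).
plans = {
    "A": (20.0, 0.95),
    "B": (25.0, 0.96),
    "C": (18.0, 0.90),
}

# Input-oriented efficiency: how far each plan's output/input ratio falls
# short of the best observed ratio (the efficient frontier).
best_ratio = max(out / inp for inp, out in plans.values())
efficiency = {name: (out / inp) / best_ratio
              for name, (inp, out) in plans.items()}
print(efficiency)
```

A plan with efficiency below 1 could, in principle, achieve the same coverage with proportionally less of the undesirable dose metric; the weight restrictions discussed in the abstract keep such gaps clinically meaningful.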
Chang, Wei; Chamie, Gabriel; Mwai, Daniel; Clark, Tamara D.; Thirumurthy, Harsha; Charlebois, Edwin D.; Petersen, Maya; Kabami, Jane; Ssemmondo, Emmanuel; Kadede, Kevin; Kwarisiima, Dalsone; Sang, Norton; Bukusi, Elizabeth A.; Cohen, Craig R.; Kamya, Moses; Havlir, Diane V.; Kahn, James G.
2016-01-01
Background In 2013-14, we achieved 89% adult HIV testing coverage using a hybrid testing approach in 32 communities in Uganda and Kenya (SEARCH: NCT01864603). To inform scalability, we sought to determine: 1) overall cost and efficiency of this approach; and 2) costs associated with point-of-care (POC) CD4 testing, multi-disease services, and community mobilization. Methods We applied micro-costing methods to estimate costs of population-wide HIV testing in 12 SEARCH Trial communities. Main intervention components of the hybrid approach are census, multi-disease community health campaigns (CHC), and home-based testing (HBT) for CHC non-attendees. POC CD4 tests were provided for all HIV-infected participants. Data were extracted from expenditure records, activity registers, staff interviews, and time and motion logs. Results The mean cost per adult tested for HIV was $20.5 (range: $17.1 - $32.1) [2014 US$], including a POC CD4 test at $16 per HIV+ person identified. Cost per adult tested for HIV was $13.8 at CHC vs. $31.7 via HBT. The cost per HIV+ adult identified was $231 ($87 - $1,245), with variability due mainly to HIV prevalence among persons tested (i.e., HIV positivity rate). The marginal costs of multi-disease testing at CHCs were $1.16/person for hypertension and diabetes, and $0.90 for malaria. Community mobilization constituted 15.3% of total costs. Conclusions The hybrid testing approach achieved very high HIV testing coverage, with POC CD4, at costs similar to previously reported mobile, home-based, or venue-based HIV testing approaches in sub-Saharan Africa. By leveraging HIV infrastructure, multi-disease services were offered at low marginal costs. PMID:27741031
Liu, Yang; Yin, Xiu-Wen; Wang, Zi-Yu; Li, Xue-Lian; Pan, Meng; Li, Yan-Ping; Dong, Ling
2017-11-01
One of the advantages of the biopharmaceutics classification system of Chinese materia medica (CMMBCS) is that it expands the level of classification research from single ingredients to the multiple components of a Chinese herb, and from multi-component research to holistic research of the Chinese materia medica. In the present paper, the alkaloids of huanglian extract were chosen as the main research object to explore how solubility and intestinal permeability change between the single-component and multi-component levels, and to determine the biopharmaceutical classification of huanglian extract at the holistic level. The typical shake-flask method and HPLC were used to measure the solubility of each single alkaloid from huanglian extract. Intestinal absorption of the alkaloids was quantified in a single-pass intestinal perfusion experiment, while the permeability coefficient of huanglian extract was calculated by a self-defined weight coefficient method. Copyright© by the Chinese Pharmaceutical Association.
Kim, Won Hwa; Singh, Vikas; Chung, Moo K.; Hinrichs, Chris; Pachauri, Deepti; Okonkwo, Ozioma C.; Johnson, Sterling C.
2014-01-01
Statistical analysis on arbitrary surface meshes such as the cortical surface is an important approach to understanding brain diseases such as Alzheimer’s disease (AD). Surface analysis may be able to identify specific cortical patterns that relate to certain disease characteristics or exhibit differences between groups. Our goal in this paper is to make group analysis of signals on surfaces more sensitive. To do this, we derive multi-scale shape descriptors that characterize the signal around each mesh vertex, i.e., its local context, at varying levels of resolution. In order to define such a shape descriptor, we make use of recent results from harmonic analysis that extend traditional continuous wavelet theory from the Euclidean to a non-Euclidean setting (i.e., a graph, mesh or network). Using this descriptor, we conduct experiments on two different datasets, the Alzheimer’s Disease NeuroImaging Initiative (ADNI) data and images acquired at the Wisconsin Alzheimer’s Disease Research Center (W-ADRC), focusing on individuals labeled as having Alzheimer’s disease (AD), mild cognitive impairment (MCI) and healthy controls. In particular, we contrast traditional univariate methods with our multi-resolution approach, which shows increased sensitivity and improved statistical power to detect group-level effects. We also provide an open source implementation. PMID:24614060
Mapping Informative Clusters in a Hierarchical Framework of fMRI Multivariate Analysis
Xu, Rui; Zhen, Zonglei; Liu, Jia
2010-01-01
Pattern recognition methods have become increasingly popular in fMRI data analysis, which are powerful in discriminating between multi-voxel patterns of brain activities associated with different mental states. However, when they are used in functional brain mapping, the location of discriminative voxels varies significantly, raising difficulties in interpreting the locus of the effect. Here we proposed a hierarchical framework of multivariate approach that maps informative clusters rather than voxels to achieve reliable functional brain mapping without compromising the discriminative power. In particular, we first searched for local homogeneous clusters that consisted of voxels with similar response profiles. Then, a multi-voxel classifier was built for each cluster to extract discriminative information from the multi-voxel patterns. Finally, through multivariate ranking, outputs from the classifiers served as a multi-cluster pattern to identify informative clusters by examining interactions among clusters. Results from both simulated and real fMRI data demonstrated that this hierarchical approach showed better performance in the robustness of functional brain mapping than traditional voxel-based multivariate methods. In addition, the mapped clusters were highly overlapped for two perceptually equivalent object categories, further confirming the validity of our approach. In short, the hierarchical framework of multivariate approach is suitable for both pattern classification and brain mapping in fMRI studies. PMID:21152081
Aging, training, and the brain: A review and future directions
Lustig, Cindy; Shah, Priti; Seidler, Rachael; Reuter-Lorenz, Patricia A.
2010-01-01
As the population ages, the need for effective methods to maintain or even improve older adults’ cognitive performance becomes increasingly pressing. Here we provide a brief review of the major intervention approaches that have been the focus of past research with healthy older adults (strategy training, multi-modal interventions, cardiovascular exercise, and process-based training), and new approaches that incorporate neuroimaging. As outcome measures, neuroimaging data on intervention-related changes in volume, structural integrity, and functional activation can provide important insights into the nature and duration of an intervention's effects. Perhaps even more intriguingly, several recent studies have used neuroimaging data as a guide to identify core cognitive processes that can be trained in one task with effective transfer to other tasks that share the same underlying processes. Although many open questions remain, this research has greatly increased our understanding of how to promote successful aging of cognition and the brain. PMID:19876740
Discontinuities, cross-scale patterns, and the organization of ecosystems
Nash, Kirsty L.; Allen, Craig R.; Angeler, David G.; Barichievy, Chris; Eason, Tarsha; Garmestani, Ahjond S.; Graham, Nicholas A.J.; Granholm, Dean; Knutson, Melinda; Nelson, R. John; Nystrom, Magnus; Stow, Craig A.; Sandstrom, Shana M.
2014-01-01
Ecological structures and processes occur at specific spatiotemporal scales, and interactions that occur across multiple scales mediate scale-specific (e.g., individual, community, local, or regional) responses to disturbance. Despite the importance of scale, explicitly incorporating a multi-scale perspective into research and management actions remains a challenge. The discontinuity hypothesis provides a fertile avenue for addressing this problem by linking measureable proxies to inherent scales of structure within ecosystems. Here we outline the conceptual framework underlying discontinuities and review the evidence supporting the discontinuity hypothesis in ecological systems. Next we explore the utility of this approach for understanding cross-scale patterns and the organization of ecosystems by describing recent advances for examining nonlinear responses to disturbance and phenomena such as extinctions, invasions, and resilience. To stimulate new research, we present methods for performing discontinuity analysis, detail outstanding knowledge gaps, and discuss potential approaches for addressing these gaps.
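A minimal version of the discontinuity analysis mentioned above is to rank-order log-transformed body sizes and flag unusually large gaps. The sketch below uses hypothetical species masses and a crude cutoff (twice the mean gap); formal approaches compare each gap against a simulated null distribution instead:

```python
import math

# Hypothetical species body masses (kg) from one community.
body_mass = [2, 3, 3.5, 4, 19, 22, 25, 110, 130, 160]

sizes = sorted(math.log10(m) for m in body_mass)
gaps = [b - a for a, b in zip(sizes, sizes[1:])]
cutoff = 2 * sum(gaps) / len(gaps)  # crude null expectation: twice the mean gap
discontinuities = [i for i, g in enumerate(gaps) if g > cutoff]
print(discontinuities)
```

Splitting the ranked distribution at the flagged gaps yields the scale-specific size aggregations whose number and membership the discontinuity hypothesis links to inherent scales of ecosystem structure.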
NASA Astrophysics Data System (ADS)
Ward, Eric D.; Webb, Ryan R.; deWeck, Olivier L.
2016-11-01
There is a general consensus that Mars is the next high priority destination for human space exploration. There has been no lack of analysis and recommendations for human missions to Mars, including, for example, the NASA Design Reference Architectures and the Mars Direct proposal. These studies and others usually employ the traditional approach of selecting a baseline mission architecture and running individual trade studies. However, this can cause blind spots, as not all combinations are explored. An alternative approach is to holistically analyze the entire architectural trade-space such that all of the possible system interactions are identified and measured. In such a framework, an optimal design is sought by minimizing cost for maximal value. While cost is relatively easy to model for manned spaceflight, value is more difficult to define. In our efforts to develop a surface base architecture for the MIT Mars 2040 project, we explored several methods for quantifying value, including technology development benefits, challenge, and various metrics for measuring scientific return. We developed a science multi-score method that combines astrobiology and geologic research goals, which is weighted by the crew-member hours that can be used for scientific research rather than other activities.
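The science multi-score idea reduces to weighted aggregation scaled by available crew science time. The sketch below is an illustrative reconstruction with invented weights and scores, not the MIT Mars 2040 values:

```python
def multi_score(goal_scores, weights, science_hours, total_hours):
    """Weighted sum of per-goal scores, scaled by the crew time usable for science."""
    weighted = sum(weights[g] * goal_scores[g] for g in goal_scores)
    return weighted * (science_hours / total_hours)

# Hypothetical assessment of one candidate surface base architecture.
scores  = {"astrobiology": 0.8, "geology": 0.6}
weights = {"astrobiology": 0.6, "geology": 0.4}
value = multi_score(scores, weights, science_hours=30, total_hours=100)
print(value)
```

Because the score is multiplied by the science-hours fraction, an architecture that frees crew time from maintenance and logistics raises its value even at fixed scientific potential, which is exactly the coupling a whole-trade-space search can exploit.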
NASA Astrophysics Data System (ADS)
Kreutzer, Sebastian; Meszner, Sascha; Faust, Dominik; Fuchs, Markus
2014-05-01
Interpreting former landscape evolution asks for understanding the processes that sculpt such landforms by means of deciphering complex systems. For reconstructing terrestrial Quaternary environments based on loess archives this might be considered, at least, as a three step process: (1) Identifying valuable records in appropriate morphological positions in a previously defined research area, (2) analysing the profiles by field work and laboratory methods and finally (3) linking the previously considered pseudo-isolated systems to set up a comprehensive picture. Especially the first and the last step might bring some pitfalls, as it is tempting to specify single records as pseudo-isolated, closed systems. They might be, with regard to their preservation in their specific morphological position, but in fact they are part of a complex, open system. Between 2008 and 2013, Late-Pleistocene loess archives in Saxony have been intensively investigated by field and laboratory methods. Linking pedo- and luminescence dating based chronostratigraphies, a composite profile for the entire Saxonian Loess Region has been established. With this, at least, two-fold approach we tried to avoid misinterpretations that might appear when focussing on one standard profile in an open morphological system. Our contribution focuses on this multi-proxy approach to decipher the Late-Pleistocene landscape evolution in the Saxonian Loess Region. Highlighting the challenges and advantages of combining different methods, we believe that (1) this multi-proxy approach is without alternative, (2) the combination of different profiles may simplify the more complex reality, but it may be a useful generalisation to understand and reveal the stratigraphical significance of the landscape evolution in this region.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-22
... least one, but no more than two, site-specific research projects to test innovative approaches to... Criterion; Disability and Rehabilitation Research Projects and Spinal Cord Injury Model Systems Centers and Multi-Site Collaborative Research Projects AGENCY: Office of Special Education and Rehabilitative...
Polynomial mixture method of solving ordinary differential equations
NASA Astrophysics Data System (ADS)
Shahrir, Mohammad Shazri; Nallasamy, Kumaresan; Ratnavelu, Kuru; Kamali, M. Z. M.
2017-11-01
In this paper, a numerical solution of the fuzzy quadratic Riccati differential equation is estimated using a proposed new approach that iteratively generates the right mixture of polynomials. This mixture provides a generalized formalism of traditional Neural Networks (NN). Previous works have shown reliable results using the Runge-Kutta 4th order (RK4) method, achieved by solving the 1st order non-linear differential equation (ODE) that is found commonly in the Riccati differential equation. This research has shown improved results relative to the RK4 method. It can be said that the Polynomial Mixture Method (PMM) shows promising results, with the advantage of continuous estimation and improved accuracy over Mabood et al., RK4, Multi-Agent NN and the Neuro Method (NM).
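The RK4 baseline referred to above is easy to reproduce on the standard crisp quadratic Riccati benchmark y' = 2y - y² + 1, y(0) = 0, whose closed-form solution allows a direct error check (the step size and end time below are illustrative choices, and this is the baseline method, not PMM):

```python
import math

def f(t, y):
    return 2 * y - y * y + 1  # quadratic Riccati right-hand side

def rk4(y0, t_end, h):
    """Classic 4th-order Runge-Kutta integration from t = 0 to t_end."""
    t, y = 0.0, y0
    while t < t_end - 1e-12:
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return y

def exact(t):
    """Closed-form solution: y = 1 + sqrt(2) * tanh(sqrt(2) t + c)."""
    s2 = math.sqrt(2)
    c = 0.5 * math.log((s2 - 1) / (s2 + 1))
    return 1 + s2 * math.tanh(s2 * t + c)

approx = rk4(0.0, 1.0, 0.1)
err = abs(approx - exact(1.0))
print(approx, err)
```

Any alternative scheme such as PMM can be benchmarked against this same exact solution, which is how the comparative accuracy claims in the abstract are typically evaluated.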
The any particle molecular orbital grid-based Hartree-Fock (APMO-GBHF) approach
NASA Astrophysics Data System (ADS)
Posada, Edwin; Moncada, Félix; Reyes, Andrés
2018-02-01
The any particle molecular orbital grid-based Hartree-Fock approach (APMO-GBHF) is proposed as an initial step to perform multi-component post-Hartree-Fock, explicitly correlated, and density functional theory methods without basis set errors. The method has been applied to a number of electronic and multi-species molecular systems. Results of these calculations show that the APMO-GBHF total energies are comparable with those obtained at the APMO-HF complete basis set limit. In addition, results reveal a considerable improvement in the description of the nuclear cusps of electronic and non-electronic densities.
Research on Multi - Person Parallel Modeling Method Based on Integrated Model Persistent Storage
NASA Astrophysics Data System (ADS)
Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying
2018-03-01
This paper mainly studies a multi-person parallel modeling method based on integrated model persistent storage. The integrated model refers to a set of MDDT modeling graphics systems, which can carry out multi-angle, multi-level and multi-stage description of general aerospace embedded software. Persistent storage refers to converting the data model in memory into a storage model and converting the storage model back into a data model in memory, where the data model is the object model and the storage model is a binary stream. Multi-person parallel modeling refers to the need for multi-person collaboration, separation of roles, and even real-time remote synchronized modeling.
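The object-model-to-binary-stream round trip described above can be sketched in a few lines. The MDDT model classes are not public, so the sketch below uses a hypothetical stand-in diagram node and Python's standard serialization as the binary storage model:

```python
import pickle

class Node:
    """Stand-in for one element of an in-memory object model."""
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

    def __eq__(self, other):
        return self.name == other.name and self.children == other.children

model = Node("system", [Node("task_a"), Node("task_b", [Node("step_1")])])

stream = pickle.dumps(model)     # data model -> binary storage model
restored = pickle.loads(stream)  # storage model -> data model in memory
```

In a multi-person setting, it is this storage-model form that can be synchronized between remote users, with each client rebuilding the in-memory data model from the received stream.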
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kudryashov, Nikolay A.; Shilnikov, Kirill E.
Numerical computation of the three-dimensional problem of freezing interface propagation during cryosurgery, coupled with multi-objective optimization methods, is used in order to improve the efficiency and safety of cryosurgery operations. Prostate cancer treatment and cutaneous cryosurgery are considered. The heat transfer in soft tissue during thermal exposure to low temperature is described by the Pennes bioheat model and is coupled with an enthalpy method for blurred phase change computations. The finite volume method combined with the control volume approximation of the heat fluxes is applied for the cryosurgery numerical modeling on tumor tissue of fairly arbitrary shape. The flux relaxation approach is used to improve the stability of the explicit finite difference schemes. The method of mounting additional heating elements is studied as an approach to control the propagation of the cellular necrosis front. Whereas the undestroyed tumor tissue and destroyed healthy tissue volumes are considered as objective functions, the locations of additional heating elements in cutaneous cryosurgery and of cryotips in prostate cancer cryotreatment are considered as objective variables in the multi-objective problem. A quasi-gradient method is proposed for searching for segments of the Pareto front as solutions of the multi-objective optimization problem.
Supervised Semantic Classification for Nuclear Proliferation Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vatsavai, Raju; Cheriyadat, Anil M; Gleason, Shaun Scott
2010-01-01
Existing feature extraction and classification approaches are not suitable for monitoring proliferation activity using high-resolution multi-temporal remote sensing imagery. In this paper we present a supervised semantic labeling framework based on the Latent Dirichlet Allocation method. This framework is used to analyze over 120 images collected under different spatial and temporal settings over the globe representing three major semantic categories: airports, nuclear, and coal power plants. Initial experimental results show a reasonable discrimination of these three categories even though coal and nuclear images share highly common and overlapping objects. This research also identified several research challenges associated with nuclear proliferation monitoring using high-resolution remote sensing images.
Multi-GPU and multi-CPU accelerated FDTD scheme for vibroacoustic applications
NASA Astrophysics Data System (ADS)
Francés, J.; Otero, B.; Bleda, S.; Gallego, S.; Neipp, C.; Márquez, A.; Beléndez, A.
2015-06-01
The Finite-Difference Time-Domain (FDTD) method is applied to the analysis of vibroacoustic problems and to study the propagation of longitudinal and transversal waves in stratified media. The potential of the scheme and the relevance of each acceleration strategy for massive computations in FDTD are demonstrated in this work. In this paper, we propose two new specific implementations of the bi-dimensional scheme of the FDTD method using multi-CPU and multi-GPU, respectively. In the first implementation, an open source message passing interface (OMPI) has been included in order to massively exploit the resources of a biprocessor station with two Intel Xeon processors. Moreover, regarding the CPU code version, the streaming SIMD extensions (SSE) and also the advanced vector extensions (AVX) have been included with shared memory approaches that take advantage of multi-core platforms. On the other hand, the second implementation, called the multi-GPU code version, is based on Peer-to-Peer communications available in CUDA on two GPUs (NVIDIA GTX 670). Subsequently, this paper presents an accurate analysis of the influence of the different code versions, including shared memory approaches, vector instructions and multi-processors (both CPU and GPU), and compares them in order to delimit the degree of improvement of using distributed solutions based on multi-CPU and multi-GPU. The performance of both approaches was analysed, and it has been demonstrated that the addition of shared memory schemes to CPU computing substantially improves the performance of vector instructions, enlarging the simulation sizes that use the cache memory of CPUs efficiently. In this case GPU computing is roughly twice as fast as the fine-tuned CPU version for both one and two nodes.
However, for massive computations explicit vector instructions are not worthwhile, since memory bandwidth is the limiting factor and performance tends to be the same as the sequential version with auto-vectorisation and also the shared memory approach. In this scenario GPU computing is the best option since it provides homogeneous behaviour. More specifically, the speedup of GPU computing reaches an upper limit of 12 for both one and two GPUs, whereas the performance reaches peak values of 80 GFlops and 146 GFlops for one GPU and two GPUs, respectively. Finally, the method is applied to an earth crust profile in order to demonstrate the potential of our approach and the necessity of applying acceleration strategies in this type of application.
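The computational kernel being accelerated is a stencil update over the whole grid at every time step, which is why memory bandwidth dominates. A one-dimensional acoustic analogue of that leapfrog update is sketched below (grid size, step count, and Courant number are illustrative; the paper's scheme is two-dimensional and includes transversal waves):

```python
# Staggered-grid leapfrog: pressures at integer points, velocities at half points.
N, steps = 200, 300
courant = 0.5              # Courant number below 1 keeps the explicit scheme stable
p = [0.0] * N              # pressure field
v = [0.0] * (N - 1)        # velocity field
p[N // 2] = 1.0            # initial pressure pulse in the middle of the domain

for _ in range(steps):
    for i in range(N - 1):          # update velocities from the pressure gradient
        v[i] -= courant * (p[i + 1] - p[i])
    for i in range(1, N - 1):       # update pressures from the velocity divergence
        p[i] -= courant * (v[i] - v[i - 1])

peak = max(abs(x) for x in p)
print(peak)
```

Each sweep touches every array element while doing only a couple of arithmetic operations per load, the bandwidth-bound access pattern that makes vectorisation alone plateau and makes GPUs attractive.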
A scale space feature based registration technique for fusion of satellite imagery
NASA Technical Reports Server (NTRS)
Raghavan, Srini; Cromp, Robert F.; Campbell, William C.
1997-01-01
Feature based registration is one of the most reliable methods to register multi-sensor images (both active and passive imagery) since features are often more reliable than intensity or radiometric values. The only situation where a feature based approach will fail is when the scene is completely homogeneous or densely textural, in which case a combination of feature and intensity based methods may yield better results. In this paper, we present some preliminary results of testing our scale space feature based registration technique, a modified version of the feature based method developed earlier for classification of multi-sensor imagery. The proposed approach removes the sensitivity to parameter selection experienced in the earlier version, as explained later.
Automatic image enhancement based on multi-scale image decomposition
NASA Astrophysics Data System (ADS)
Feng, Lu; Wu, Zhuangzhi; Pei, Luo; Long, Xiong
2014-01-01
In image processing and computational photography, automatic image enhancement is one of the long-range objectives. Recent automatic image enhancement methods take account not only of global semantics, such as correcting color hue and brightness imbalances, but also of the local content of the image, such as a human face or the sky of a landscape. In this paper we describe a new scheme for automatic image enhancement that considers both the global semantics and the local content of the image. Our automatic image enhancement method employs a multi-scale edge-aware image decomposition approach to detect underexposed regions and enhance the detail of the salient content. The experimental results demonstrate the effectiveness of our approach compared to existing automatic enhancement methods.
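The base/detail split underlying such methods can be shown in miniature. The sketch below is deliberately simplified: it uses a plain box blur on a 1-D signal, whereas the paper's decomposition is multi-scale and edge-aware precisely to avoid the halo artifacts a box blur introduces near strong edges:

```python
def box_blur(signal, radius):
    """Simple moving-average smoothing (the non-edge-aware base layer)."""
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def enhance(signal, radius=2, boost=1.5):
    """Split into base + detail layers, then amplify the detail layer."""
    base = box_blur(signal, radius)
    detail = [s - b for s, b in zip(signal, base)]
    return [b + boost * d for b, d in zip(base, detail)]

signal = [0, 0, 0, 1, 5, 6, 5, 1, 0, 0]
enhanced = enhance(signal)
```

Replacing the blur with an edge-preserving filter, and repeating the split at several scales, gives the multi-scale decomposition on which region detection and detail boosting operate in the paper.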
Allen, Lisa K; Hetherington, Erin; Manyama, Mange; Hatfield, Jennifer M; van Marle, Guido
2010-02-03
There have been a number of interventions to date aimed at improving malaria diagnostic accuracy in sub-Saharan Africa. Yet, limited success is often reported for a number of reasons, especially in rural settings. This paper seeks to provide a framework for applied research aimed to improve malaria diagnosis using a combination of the established methods, participatory action research and social entrepreneurship. This case study introduces the idea of using the social entrepreneurship approach (SEA) to create innovative and sustainable applied health research outcomes. The following key elements define the SEA: (1) identifying a locally relevant research topic and plan, (2) recognizing the importance of international multi-disciplinary teams and the incorporation of local knowledge, (3) engaging in a process of continuous innovation, adaptation and learning, (4) remaining motivated and determined to achieve sustainable long-term research outcomes and, (5) sharing and transferring ownership of the project with the international and local partner. The SEA approach has a strong emphasis on innovation lead by local stakeholders. In this case, innovation resulted in a unique holistic research program aimed at understanding patient, laboratory and physician influences on accurate diagnosis of malaria. An evaluation of milestones for each SEA element revealed that the success of one element is intricately related to the success of other elements. The SEA will provide an additional framework for researchers and local stakeholders that promotes innovation and adaptability. This approach will facilitate the development of new ideas, strategies and approaches to understand how health issues, such as malaria, affect vulnerable communities.
NASA Astrophysics Data System (ADS)
Mueller, K. L.; Callahan, W.; Davis, K. J.; Dickerson, R. R.; Duren, R. M.; Gurney, K. R.; Karion, A.; Keeling, R. F.; Kim, J.; Lauvaux, T.; Miller, C. E.; Shepson, P. B.; Turnbull, J. C.; Weiss, R. F.; Whetstone, J. R.
2017-12-01
City and State governments are increasingly interested in mitigating greenhouse gas (GHG) emissions to improve sustainability within their jurisdictions. Estimation of urban GHG emissions remains an active research area with many sources of uncertainty. To support the effort of improving measurement of trace gas emissions in city environments, several federal agencies along with academic, research, and private entities have been working within a handful of domestic metropolitan areas to improve both (1) the assessment of GHG emissions accuracy using a variety of measurement technologies, and (2) the tools that can better assess GHG inventory data at urban mitigation scales based upon these measurements. The National Institute of Standards and Technology (NIST) activities have focused on three areas, or testbeds: Indianapolis (INFLUX experiment), Los Angeles (the LA Megacities project), and the Northeastern Corridor areas encompassing Washington and Baltimore (the NEC/BW GHG Measurements project). These cities represent diverse meteorological, terrain, demographic, and emissions characteristics having a broad range of complexities. To date this research has involved multiple measurement systems and integrated observing approaches, all aimed at advancing the development of a robust science base upon which higher-accuracy quantification approaches can rest. Progress toward such scientifically robust, widely accepted emissions quantification methods will rely upon continuous performance assessment. Such assessment is challenged by the complexities of cities themselves (e.g., population, urban form) along with the many variables impacting a city's technological ability to estimate its GHG emissions (e.g., meteorology, density of observations). We present the different NIST testbeds and a proposal to initiate conceptual development of a reference framework supporting the comparison of multi-city GHG emissions estimates.
Such a reference framework has potential to provide the basis for city governments to choose the measurements and methods that can quantify their GHG and related trace gas emissions at levels commensurate with their needs.
Large Margin Multi-Modal Multi-Task Feature Extraction for Image Classification.
Yong Luo; Yonggang Wen; Dacheng Tao; Jie Gui; Chao Xu
2016-01-01
The features used in many image analysis-based applications are frequently of very high dimension. Feature extraction offers several advantages in high-dimensional cases, and many recent studies have used multi-task feature extraction approaches, which often outperform single-task feature extraction approaches. However, most of these methods are limited in that they only consider data represented by a single type of feature, even though features usually represent images from multiple modalities. We, therefore, propose a novel large margin multi-modal multi-task feature extraction (LM3FE) framework for handling multi-modal features for image classification. In particular, LM3FE simultaneously learns the feature extraction matrix for each modality and the modality combination coefficients. In this way, LM3FE not only handles correlated and noisy features, but also utilizes the complementarity of different modalities to further help reduce feature redundancy in each modality. The large margin principle employed also helps to extract strongly predictive features, so that they are more suitable for prediction (e.g., classification). An alternating algorithm is developed for problem optimization, and each subproblem can be efficiently solved. Experiments on two challenging real-world image data sets demonstrate the effectiveness and superiority of the proposed method.
Herrgård, Markus; Sukumara, Sumesh; Campodonico, Miguel; Zhuang, Kai
2015-12-01
In recent years, bio-based chemicals have gained interest as a renewable alternative to petrochemicals. However, there is a significant need to assess the technological, biological, economic and environmental feasibility of bio-based chemicals, particularly during the early research phase. Recently, the Multi-scale framework for Sustainable Industrial Chemicals (MuSIC) was introduced to address this issue by integrating modelling approaches at different scales ranging from cellular to ecological scales. This framework can be further extended by incorporating modelling of the petrochemical value chain and the de novo prediction of metabolic pathways connecting existing host metabolism to desirable chemical products. This multi-scale, multi-disciplinary framework for quantitative assessment of bio-based chemicals will play a vital role in supporting engineering, strategy and policy decisions as we progress towards a sustainable chemical industry. © 2015 Authors; published by Portland Press Limited.
Multi-class Mode of Action Classification of Toxic Compounds Using Logic Based Kernel Methods.
Lodhi, Huma; Muggleton, Stephen; Sternberg, Mike J E
2010-09-17
Toxicity prediction is essential for drug design and development of effective therapeutics. In this paper we present an in silico strategy, to identify the mode of action of toxic compounds, that is based on the use of a novel logic based kernel method. The technique uses support vector machines in conjunction with the kernels constructed from first order rules induced by an Inductive Logic Programming system. It constructs multi-class models by using a divide and conquer reduction strategy that splits multi-classes into binary groups and solves each individual problem recursively hence generating an underlying decision list structure. In order to evaluate the effectiveness of the approach for chemoinformatics problems like predictive toxicology, we apply it to toxicity classification in aquatic systems. The method is used to identify and classify 442 compounds with respect to the mode of action. The experimental results show that the technique successfully classifies toxic compounds and can be useful in assessing environmental risks. Experimental comparison of the performance of the proposed multi-class scheme with the standard multi-class Inductive Logic Programming algorithm and multi-class Support Vector Machine yields statistically significant results and demonstrates the potential power and benefits of the approach in identifying compounds of various toxic mechanisms. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
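The divide-and-conquer reduction described above can be sketched independently of the ILP-kernel SVM learner used in the paper: the multi-class problem is split into a chain of "first class vs rest" binary problems, producing a decision-list structure. The toy nearest-centroid binary learner below is only a stand-in for the paper's kernel machine, and all names are hypothetical.

```python
import numpy as np

def train_centroid(X, y_bin):
    """Toy binary learner: class centroids (stand-in for the ILP-kernel SVM)."""
    return X[y_bin].mean(axis=0), X[~y_bin].mean(axis=0)

def build_decision_list(X, y, classes):
    """Recursively split 'first class vs rest', yielding a decision list."""
    if len(classes) == 1:
        return ("leaf", classes[0])
    head, rest = classes[0], classes[1:]
    mask = y == head
    model = train_centroid(X, mask)
    sub = build_decision_list(X[~mask], y[~mask], rest)
    return ("node", head, model, sub)

def predict_one(dlist, x):
    """Walk the list: stop at the first binary test that claims the sample."""
    while dlist[0] == "node":
        _, head, (c_pos, c_neg), sub = dlist
        if np.linalg.norm(x - c_pos) < np.linalg.norm(x - c_neg):
            return head
        dlist = sub
    return dlist[1]
```

Each recursive level removes one class from the pool, which is what gives the underlying decision-list structure mentioned in the abstract.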
Application of a fuzzy multi-attribute decision analysis method to prioritize project success criteria
NASA Astrophysics Data System (ADS)
Phong, Nguyen Thanh; Quyen, Nguyen Le Hoang Thuy To
2017-11-01
Project success is a foundation for a project owner to manage and control not only the current project but also future potential projects in construction companies. However, identifying the key success criteria for evaluating a particular project in real practice is a challenging task. Normally, it depends on many factors, such as the expectations of the project owner and stakeholders, the triple constraints of the project (cost, time, quality), and the company's mission, vision, and objectives. Traditional decision-making methods for measuring project success are usually based on subjective opinions of panel experts, resulting in irrational and inappropriate decisions. Therefore, this paper introduces a multi-attribute decision analysis method (MADAM) for weighting project success criteria by using a fuzzy Analytical Hierarchy Process approach. It is found that this method is useful when dealing with imprecise and uncertain human judgments in evaluating project success criteria. Moreover, this research also suggests that although cost, time, and quality are the three traditional project success criteria, the satisfaction of the project owner and the acceptance of project stakeholders with the completed project is the most important criterion for project success evaluation in Vietnam.
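A minimal numeric sketch of the crisp AHP step underlying the fuzzy approach: priority weights are taken from the principal eigenvector of a pairwise comparison matrix, with Saaty's consistency ratio as a sanity check. The fuzzy AHP additionally replaces crisp judgments with triangular fuzzy numbers, which is not shown here; the pairwise judgments below are purely hypothetical.

```python
import numpy as np

def ahp_weights(A):
    """Priority weights from a pairwise comparison matrix (principal eigenvector)."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    # Saaty consistency ratio; random-index values for n = 1..5
    n = A.shape[0]
    ri = [0.0, 0.0, 0.58, 0.90, 1.12][n - 1]
    ci = (vals.real[k] - n) / (n - 1)
    cr = ci / ri if ri > 0 else 0.0
    return w, cr

# Hypothetical judgments comparing three criteria: cost, time, quality
A = np.array([[1.0, 2.0, 0.5],
              [0.5, 1.0, 0.25],
              [2.0, 4.0, 1.0]])
w, cr = ahp_weights(A)
```

A consistency ratio above roughly 0.1 is conventionally taken to mean the judgments should be revisited before the weights are used.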
NASA Astrophysics Data System (ADS)
Subagadis, Y. H.; Schütze, N.; Grundmann, J.
2014-09-01
The conventional methods used to solve multi-criteria multi-stakeholder problems are often weakly formulated: they normally incorporate only homogeneous information at a time and aggregate the objectives of different decision-makers in a way that neglects water-society interactions. In this contribution, Multi-Criteria Group Decision Analysis (MCGDA) using a fuzzy-stochastic approach has been proposed to rank a set of alternatives in water management decisions incorporating heterogeneous information under uncertainty. The decision-making framework takes hydrologically, environmentally, and socio-economically motivated conflicting objectives into consideration. The criteria related to the performance of the physical system are optimized using multi-criteria simulation-based optimization, and fuzzy linguistic quantifiers have been used to evaluate subjective criteria and to assess stakeholders' degree of optimism. The proposed methodology is applied to find effective and robust intervention strategies for the management of a coastal hydrosystem affected by saltwater intrusion due to excessive groundwater extraction for irrigated agriculture and municipal use. Preliminary results show that the MCGDA based on a fuzzy-stochastic approach gives useful support for robust decision-making and is sensitive to the decision makers' degree of optimism.
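One concrete way fuzzy linguistic quantifiers encode a decision maker's degree of optimism is through ordered weighted averaging (OWA) weights derived from a regular increasing monotone quantifier Q(r) = r**alpha. The sketch below is a generic illustration under that assumption, not the authors' exact procedure; the alternative scores are hypothetical.

```python
import numpy as np

def owa_weights(n, alpha):
    """OWA weights from the RIM quantifier Q(r) = r**alpha.
    alpha < 1 ~ optimistic (emphasis on best scores); alpha > 1 ~ pessimistic."""
    r = np.arange(n + 1) / n
    return np.diff(r ** alpha)

def owa(scores, alpha):
    """Aggregate one alternative's criterion scores, best scores weighted first."""
    w = owa_weights(len(scores), alpha)
    return np.sort(scores)[::-1] @ w

# Hypothetical scores of two management alternatives on four criteria
a = np.array([0.9, 0.4, 0.3, 0.2])   # one outstanding criterion, weak otherwise
b = np.array([0.6, 0.55, 0.5, 0.45])  # uniformly moderate
```

An optimistic aggregation (alpha < 1) favors alternative `a` for its single strong score, while a pessimistic one (alpha > 1) favors the balanced alternative `b`, which is exactly the sensitivity to the degree of optimism the abstract reports.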
USDA-ARS?s Scientific Manuscript database
Collaborative research in Peru sought to promote sustainable potato production and, mitigate adverse impacts of climate change through two approaches: first calcium amendments to increase crop yield and, second to enhance frost tolerance in native potatoes. All the multi-year, multi-location experim...
ERIC Educational Resources Information Center
Willis, Cameron; Greene, Julie; Riley, Barbara
2017-01-01
Inter-organisational partnerships are widely used approaches in public health and chronic disease prevention (CDP), and may include organisations from different sectors, such as research-policy-practice sectors, inter-governmental sectors, or public and private sectors. While multiple conceptual frameworks related to multi-sectoral partnerships…
A Theory of Competence in Anesthesiology: Faculty Perspectives on Resident Performance
ERIC Educational Resources Information Center
Street, John P.
2009-01-01
This study was conducted to develop a theory of resident competence in anesthesiology and was guided by this research question: from the perspective of anesthesiology faculty members, "What are the attributes and indicators of clinical competence in residents?" The author used a grounded theory approach for this multi-case, multi-site…
The U.S. EPA Atlantic Ecology Division (AED) has initiated a multi-year research program to develop empirical nitrogen load-response models for embayments in southern New England. This is part of a multi-regional effort to develop nutrient load-response models for the Gulf of Mex...
A Multi-Channel Approach for Collaborative Web-Based Learning
ERIC Educational Resources Information Center
Azeta, A. A.
2008-01-01
This paper describes an architectural framework and a prototype implementation of a web-based multi-channel e-Learning application that allows students, lecturers and the research communities to collaborate irrespective of the communication device a user is carrying. The application was developed based on the concept of "write once, run on any…
The Development of Solution Focused Multi-Agency Meetings in a Psychological Service
ERIC Educational Resources Information Center
Alexander, Shiona; Sked, Heather
2010-01-01
This article outlines the successful development of multi-agency meetings as part of a staged approach aimed at supporting families and children within the Scottish Highland Council Area. Drawing on the research evidence for the factors which help to make meetings effective, a distinctive meeting structure was developed. This structure is…
NASA Astrophysics Data System (ADS)
Dash, Jonathan P.; Watt, Michael S.; Pearse, Grant D.; Heaphy, Marie; Dungey, Heidi S.
2017-09-01
Research into remote sensing tools for monitoring physiological stress caused by biotic and abiotic factors is critical for maintaining healthy and highly-productive plantation forests. Significant research has focussed on assessing forest health using remotely sensed data from satellites and manned aircraft. Unmanned aerial vehicles (UAVs) may provide new tools for improved forest health monitoring by providing data with very high temporal and spatial resolutions. These platforms also pose unique challenges and methods for health assessments must be validated before use. In this research, we simulated a disease outbreak in mature Pinus radiata D. Don trees using targeted application of herbicide. The objective was to acquire a time-series simulated disease expression dataset to develop methods for monitoring physiological stress from a UAV platform. Time-series multi-spectral imagery was acquired using a UAV flown over a trial at regular intervals. Traditional field-based health assessments of crown health (density) and needle health (discolouration) were carried out simultaneously by experienced forest health experts. Our results showed that multi-spectral imagery collected from a UAV is useful for identifying physiological stress in mature plantation trees even during the early stages of tree stress. We found that physiological stress could be detected earliest in data from the red edge and near infra-red bands. In contrast to previous findings, red edge data did not offer earlier detection of physiological stress than the near infra-red data. A non-parametric approach was used to model physiological stress based on spectral indices and was found to provide good classification accuracy (weighted kappa = 0.694). This model can be used to map physiological stress based on high-resolution multi-spectral data.
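The spectral indices mentioned above can be computed directly from band reflectances. The sketch below shows the standard NDVI and a red-edge analogue with a simple threshold-based stress flag; the threshold is hypothetical and stands in for the authors' calibrated non-parametric model.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red + 1e-9)

def red_edge_ndvi(nir, red_edge):
    """Red-edge analogue of NDVI, sensitive to early chlorophyll changes."""
    return (nir - red_edge) / (nir + red_edge + 1e-9)

def stress_map(nir, red, threshold=0.6):
    """Flag pixels whose NDVI falls below a (hypothetical) healthy threshold."""
    return ndvi(nir, red) < threshold
```

In practice the index maps would be fed into a classifier trained against the field-based crown and needle health scores rather than a fixed threshold.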
USDA-ARS?s Scientific Manuscript database
The cross-site process evaluation plan for the Childhood Obesity Research Demonstration (CORD) project is described here. The CORD project comprises 3 unique demonstration projects designed to integrate multi-level, multi-setting health care and public health interventions over a 4-year funding peri...
NASA Astrophysics Data System (ADS)
Ausaf, Muhammad Farhan; Gao, Liang; Li, Xinyu
2015-12-01
To increase the overall performance of modern manufacturing systems, effective integration of process planning and scheduling functions has been an important area of consideration among researchers. Owing to the complexity of handling process planning and scheduling simultaneously, most of the research work has been limited to solving the integrated process planning and scheduling (IPPS) problem for a single objective function. As there are many conflicting objectives when dealing with process planning and scheduling, real-world problems cannot be fully captured considering only a single objective for optimization. Therefore, considering the multi-objective IPPS (MOIPPS) problem is inevitable. Unfortunately, only a handful of research papers are available on solving the MOIPPS problem. In this paper, an optimization algorithm for solving the MOIPPS problem is presented. The proposed algorithm uses a set of dispatching rules coupled with priority assignment to optimize the IPPS problem for various objectives like makespan, total machine load, total tardiness, etc. A fixed-size external archive coupled with a crowding distance mechanism is used to store and maintain the non-dominated solutions. To compare the results with other algorithms, a C-metric based method has been used. Instances from four recent papers have been solved to demonstrate the effectiveness of the proposed algorithm. The experimental results show that the proposed method is an efficient approach for solving the MOIPPS problem.
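The fixed-size non-dominated archive with a crowding-distance mechanism can be sketched as follows, using an NSGA-II-style crowding distance under a minimize-all-objectives convention. This is a generic illustration of the bookkeeping, not the authors' exact implementation.

```python
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return np.all(a <= b) and np.any(a < b)

def crowding_distance(F):
    """NSGA-II-style crowding distance for objective matrix F (rows = solutions)."""
    n, m = F.shape
    d = np.zeros(n)
    for j in range(m):
        order = np.argsort(F[:, j])
        d[order[0]] = d[order[-1]] = np.inf   # boundary solutions are kept
        span = F[order[-1], j] - F[order[0], j]
        if span == 0:
            continue
        d[order[1:-1]] += (F[order[2:], j] - F[order[:-2], j]) / span
    return d

def update_archive(archive, candidate, max_size):
    """Insert a non-dominated candidate; evict the most crowded entry if full."""
    if any(dominates(a, candidate) for a in archive):
        return archive
    archive = [a for a in archive if not dominates(candidate, a)]
    archive.append(candidate)
    if len(archive) > max_size:
        F = np.array(archive)
        archive.pop(int(np.argmin(crowding_distance(F))))
    return archive
```

Evicting the entry with the smallest crowding distance keeps the archive spread along the Pareto front instead of clustering in one region.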
Ni, Yongnian; Liu, Ying; Kokot, Serge
2011-02-07
This work is concerned with the research and development of methodology for analysis of complex mixtures such as pharmaceutical or food samples, which contain many analytes. Variously treated samples (swill washed, fried and scorched) of the Rhizoma atractylodis macrocephalae (RAM) traditional Chinese medicine (TCM) as well as the common substitute, Rhizoma atractylodis (RA) TCM were chosen as examples for analysis. A combined data matrix of chromatographic 2-D HPLC-DAD-FLD (two-dimensional high performance liquid chromatography with diode array and fluorescence detectors) fingerprint profiles was constructed with the use of the HPLC-DAD and HPLC-FLD individual data matrices; the purpose was to collect maximum information and to interpret this complex data with the use of various chemometrics methods e.g. the rank-ordering multi-criteria decision making (MCDM) PROMETHEE and GAIA, K-nearest neighbours (KNN), partial least squares (PLS), back propagation-artificial neural networks (BP-ANN) methods. The chemometrics analysis demonstrated that the combined 2-D HPLC-DAD-FLD data matrix does indeed provide more information and facilitates better performing classification/prediction models for the analysis of such complex samples as the RAM and RA ones noted above. It is suggested that this fingerprint approach is suitable for analysis of other complex, multi-analyte substances.
Higdon, Roger; Earl, Rachel K.; Stanberry, Larissa; Hudac, Caitlin M.; Montague, Elizabeth; Stewart, Elizabeth; Janko, Imre; Choiniere, John; Broomall, William; Kolker, Natali
2015-01-01
Complex diseases are caused by a combination of genetic and environmental factors, creating a difficult challenge for diagnosis and defining subtypes. This review article describes how distinct disease subtypes can be identified through integration and analysis of clinical and multi-omics data. A broad shift toward molecular subtyping of disease using genetic and omics data has yielded successful results in cancer and other complex diseases. To determine molecular subtypes, patients are first classified by applying clustering methods to different types of omics data, then these results are integrated with clinical data to characterize distinct disease subtypes. An example of this molecular-data-first approach is in research on Autism Spectrum Disorder (ASD), a spectrum of social communication disorders marked by tremendous etiological and phenotypic heterogeneity. In the case of ASD, omics data such as exome sequences and gene and protein expression data are combined with clinical data such as psychometric testing and imaging to enable subtype identification. Novel ASD subtypes have been proposed, such as CHD8, using this molecular subtyping approach. Broader use of molecular subtyping in complex disease research is impeded by data heterogeneity, diversity of standards, and ineffective analysis tools. The future of molecular subtyping for ASD and other complex diseases calls for an integrated resource to identify disease mechanisms, classify new patients, and inform effective treatment options. This in turn will empower and accelerate precision medicine and personalized healthcare. PMID:25831060
A Scalable and Robust Multi-Agent Approach to Distributed Optimization
NASA Technical Reports Server (NTRS)
Tumer, Kagan
2005-01-01
Modularizing a large optimization problem so that the solutions to the subproblems provide a good overall solution is a challenging problem. In this paper we present a multi-agent approach to this problem based on aligning the agent objectives with the system objectives, obviating the need to impose external mechanisms to achieve collaboration among the agents. This approach naturally addresses scaling and robustness issues by ensuring that the agents do not rely on the reliable operation of other agents. We test this approach in the difficult distributed optimization problem of imperfect device subset selection [Challet and Johnson, 2002]. In this problem, there are n devices, each of which has a "distortion", and the task is to find the subset of those n devices that minimizes the average distortion. Our results show that in large systems (1000 agents) the proposed approach provides improvements of over an order of magnitude over both traditional optimization methods and traditional multi-agent methods. Furthermore, the results show that even in extreme cases of agent failures (i.e., half the agents fail midway through the simulation) the system remains coordinated and still outperforms a failure-free and centralized optimization algorithm.
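A minimal sketch of the core idea of aligning agent objectives with the system objective, using a difference-style evaluation on a signed-distortion subset-selection task: each agent's utility is the change in the global cost caused by flipping its own membership decision. The distortion values and the simple asynchronous update scheme below are hypothetical simplifications of the paper's setup.

```python
import numpy as np

def system_objective(member, distortion):
    """Global cost: |average signed distortion of the selected subset|."""
    if not member.any():
        return np.inf
    return abs(distortion[member].mean())

def difference_utility(i, member, distortion):
    """Agent i's aligned objective: reduction in global cost if i flips its
    choice (a simple stand-in for the difference-evaluation idea)."""
    flipped = member.copy()
    flipped[i] = ~flipped[i]
    return system_objective(member, distortion) - system_objective(flipped, distortion)

def coordinate(distortion, steps=2000, seed=0):
    """Asynchronous local search: each sampled agent acts on its own utility."""
    rng = np.random.default_rng(seed)
    member = rng.random(len(distortion)) < 0.5
    for _ in range(steps):
        i = int(rng.integers(len(distortion)))
        if difference_utility(i, member, distortion) > 0:
            member[i] = ~member[i]   # flip only if it lowers the global cost
    return member
```

Because each agent's utility is exactly the marginal effect of its own action on the system objective, selfish improvement steps cannot degrade the global solution, which is the alignment property the abstract emphasizes.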
Cooperative control theory and integrated flight and propulsion control
NASA Technical Reports Server (NTRS)
Schmidt, David K.; Schierman, John D.
1994-01-01
This report documents the activities and research results obtained under a grant (NAG3-998) from the NASA Lewis Research Center. The focus of the research was the investigation of dynamic interactions between airframe and engines for advanced ASTOVL aircraft configurations, and the analysis of the implications of these interactions on the stability and performance of the airframe and engine control systems. In addition, the need for integrated flight and propulsion control for such aircraft was addressed. The major contribution of this research was the exposition of the fact that airframe and engine interactions could be present, and their effects could include loss of stability and performance of the control systems. Also, the significance of two-directional, as opposed to one-directional, coupling was identified and explained. A multivariable stability and performance analysis methodology was developed, and applied to several candidate aircraft configurations. Also exposed was the fact that with interactions present along with some integrated control approaches, the engine command/limiting logic (which represents an important non-linear component of the engine control system) can impact closed-loop airframe/engine system stability. Finally, a brief investigation of control-law synthesis techniques appropriate for the class of systems was pursued, and it was determined that multivariable techniques, including model-following formulations of LQG and/or H (infinity) methods, showed promise. However, for practical reasons, decentralized control architectures are preferred, and such architectures are incompatible with these synthesis methods.
A Generalized Mixture Framework for Multi-label Classification
Hong, Charmgil; Batal, Iyad; Hauskrecht, Milos
2015-01-01
We develop a novel probabilistic ensemble framework for multi-label classification that is based on the mixtures-of-experts architecture. In this framework, we combine multi-label classification models in the classifier chains family that decompose the class posterior distribution P(Y1, …, Yd|X) using a product of posterior distributions over components of the output space. Our approach captures different input–output and output–output relations that tend to change across data. As a result, we can recover a rich set of dependency relations among inputs and outputs that a single multi-label classification model cannot capture due to its modeling simplifications. We develop and present algorithms for learning the mixtures-of-experts models from data and for performing multi-label predictions on unseen data instances. Experiments on multiple benchmark datasets demonstrate that our approach achieves highly competitive results and outperforms the existing state-of-the-art multi-label classification methods. PMID:26613069
GROUND WATER MONITORING AND SAMPLING: MULTI-LEVEL VERSUS TRADITIONAL METHODS: WHAT'S WHAT?
After years of research and many publications, the question still remains: What is the best method to collect representative ground water samples from monitoring wells? Numerous systems and devices are currently available for obtaining both multi-level samples as well as traditi...
Physical insights into the blood-brain barrier translocation mechanisms
NASA Astrophysics Data System (ADS)
Theodorakis, Panagiotis E.; Müller, Erich A.; Craster, Richard V.; Matar, Omar K.
2017-08-01
The number of individuals suffering from diseases of the central nervous system (CNS) is growing with an aging population. While candidate drugs for many of these diseases are available, most of these pharmaceutical agents cannot reach the brain rendering most of the drug therapies that target the CNS inefficient. The reason is the blood-brain barrier (BBB), a complex and dynamic interface that controls the influx and efflux of substances through a number of different translocation mechanisms. Here, we present these mechanisms providing, also, the necessary background related to the morphology and various characteristics of the BBB. Moreover, we discuss various numerical and simulation approaches used to study the BBB, and possible future directions based on multi-scale methods. We anticipate that this review will motivate multi-disciplinary research on the BBB aiming at the design of effective drug therapies.
2012-01-01
Background Tremendous progress has been made in the last ten years in reducing morbidity and mortality caused by malaria, in part because of increases in global funding for malaria control and elimination. Today, many countries are striving for malaria elimination. However, a major challenge is the neglect of cross-border and regional initiatives in malaria control and elimination. This paper seeks to better understand Global Fund support for multi-country initiatives. Methods Documents and proposals were extracted and reviewed from two main sources, the Global Fund website and Aidspan.org. Documents and reports from the Global Fund Technical Review Panel, Board, and Secretariat documents such as guidelines and proposal templates were reviewed to establish the type of policies enacted and guidance provided from the Global Fund on multi-country initiatives and applications. From reviewing this information, the researchers created 29 variables according to eight dimensions to use in a review of Round 10 applications. All Round 10 multi-country applications (for HIV, malaria and tuberculosis) and all malaria multi-country applications (6) from Rounds 1 – 10 were extracted from the Global Fund website. A blind review was conducted of Round 10 applications using the 29 variables as a framework, followed by a review of four of the six successful malaria multi-country grant applications from Rounds 1 – 10. Findings During Rounds 3 – 10 of the Global Fund, only 5.8% of grants submitted were for multi-country initiatives. Out of 83 multi-country proposals submitted, 25.3% were approved by the Technical Review Panel (TRP) for funding, compared to 44.9% of single-country applications. The majority of approved multi-country applications were for HIV (76.2%), followed by malaria (19.0%), then tuberculosis (4.8%). TRP recommendations resulted in improvements to application forms, although guidance was generally vague. 
The in-depth review of Round 10 multi-country proposals showed that applicants described their projects in one of two ways: a regional ‘network approach’ by which benefits are derived from economies of scale or from enhanced opportunities for mutual support and learning or the development of common policies and approaches; or a ‘cross-border’ approach for enabling activities to be more effectively delivered towards border-crossing populations or vectors. In Round 10, only those with a ‘network approach’ were recommended for funding. The Global Fund has only ever approved six malaria multi-country applications. Four approved applications stated strong arguments for a multi-country initiative, combining both ‘cross-border’ and ‘network’ approaches. Conclusion With the cancellation of Round 11 and the proposal that the Global Fund adopt a more targeted and strategic approach to funding, the time is opportune for the Global Fund to develop a clear consensus about the key factors and criteria for funding malaria specific multi-country initiatives. This study found that currently there was a lack of guidance on the key features that a successful multi-country proposal needs to be approved and that applications directed towards the ‘network’ approach were most successful in Round 10. This type of multi-country proposal may favour other diseases such as HIV, whereas the need for malaria control and elimination is different, focusing on cross-border coordination and delivery of interventions to specific groups. The Global Fund should seek to address these issues and give better guidance to countries and regions and investigate disease-specific calls for multi-country and regional applications. PMID:23057734
Thayer, Erin K.; Rathkey, Daniel; Miller, Marissa Fuqua; Palmer, Ryan; Mejicano, George C.; Pusic, Martin; Kalet, Adina; Gillespie, Colleen; Carney, Patricia A.
2016-01-01
Issue Medical educators and educational researchers continue to improve their processes for managing medical student and program evaluation data using sound ethical principles. This is becoming even more important as curricular innovations are occurring across undergraduate and graduate medical education. Dissemination of findings from this work is critical, and peer-reviewed journals often require an institutional review board (IRB) determination. Approach IRB data repositories, originally designed for the longitudinal study of biological specimens, can be applied to medical education research. The benefits of such an approach include obtaining expedited review for multiple related studies within a single IRB application and allowing for more flexibility when conducting complex longitudinal studies involving large datasets from multiple data sources and/or institutions. In this paper, we inform educators and educational researchers on our analysis of the use of the IRB data repository approach to manage ethical considerations as part of best practices for amassing, pooling, and sharing data for educational research, evaluation, and improvement purposes. Implications Fostering multi-institutional studies while following sound ethical principles in the study of medical education is needed, and the IRB data repository approach has many benefits, especially for longitudinal assessment of complex multi-site data. PMID:27443407
Reynolds-averaged Navier-Stokes based ice accretion for aircraft wings
NASA Astrophysics Data System (ADS)
Lashkajani, Kazem Hasanzadeh
This thesis addresses one of the current issues in flight safety towards increasing icing simulation capabilities for prediction of complex 2D and 3D glaze ice shapes over aircraft surfaces. During the 1980s and 1990s, the field of aero-icing was established to support design and certification of aircraft flying in icing conditions. The multidisciplinary technologies used in such codes were: aerodynamics (panel method), droplet trajectory calculations (Lagrangian framework), thermodynamic module (Messinger model) and geometry module (ice accretion). These are embedded in a quasi-steady module to simulate the time-dependent ice accretion process (multi-step procedure). The objective of the present research is to upgrade the aerodynamic module from a Laplace solver to a Reynolds-Averaged Navier-Stokes equations solver. The advantages are many. First, the physical model allows accounting for viscous effects in the aerodynamic module. Second, the solution of the aero-icing module directly provides the means for characterizing the aerodynamic effects of icing, such as loss of lift and increased drag. Third, the use of a finite volume approach to solving the Partial Differential Equations allows rigorous mesh and time convergence analysis. Finally, the approaches developed in 2D can be easily transposed to 3D problems. The research was performed in three major steps, each providing insights into the overall numerical approaches. The most important realization comes from the need to develop specific mesh generation algorithms to ensure feasible solutions in very complex multi-step aero-icing calculations. The contributions are presented in chronological order of their realization. First, a new framework for a RANS-based two-dimensional ice accretion code, CANICE2D-NS, is developed. A multi-block RANS code from the U. of Liverpool (named PMB) provides the aerodynamic field using the Spalart-Allmaras turbulence model. 
The ICEM-CFD commercial tool is used for the iced airfoil remeshing and field smoothing. The new coupling is fully automated and capable of multi-step ice accretion simulations via a quasi-steady approach. In addition, the framework allows for flow analysis and aerodynamic performance prediction of the iced airfoils. The convergence of the quasi-steady algorithm is verified, identifying the need for an order-of-magnitude increase in the number of multi-time steps in icing simulations to achieve solver-independent solutions. Second, a Multi-Block Navier-Stokes code, NSMB, is coupled with the CANICE2D icing framework. Attention is paid to the implementation of the ONERA roughness model within the Spalart-Allmaras turbulence model, and to the convergence of the steady and quasi-steady iterative procedure. Effects of uniform surface roughness in quasi-steady ice accretion simulation are analyzed through different validation test cases. The results of CANICE2D-NS show good agreement with experimental data, both in the predicted ice shapes and in the aerodynamic analysis of predicted and experimental ice shapes. Third, an efficient single-block structured Navier-Stokes CFD code, NSCODE, is coupled with the CANICE2D-NS icing framework. Attention is paid to the implementation of the Boeing roughness model within the Spalart-Allmaras turbulence model, and to acceleration of the convergence of the steady and quasi-steady iterative procedures. Effects of uniform surface roughness in quasi-steady ice accretion simulation are analyzed through different validation test cases, including code-to-code comparisons with the same framework coupled with the NSMB Navier-Stokes solver. The efficiency of the J-multigrid approach to solve the flow equations on complex iced geometries is demonstrated.
Since all these calculations showed that the ICEM-CFD grid generation package produced a number of issues, such as poor mesh quality and smoothing deficiencies (notably grid shocks), a fourth study proposes a new mesh generation algorithm. A PDE-based multi-block structured grid generation code, NSGRID, is developed for this purpose. The study includes the development of novel mesh generation algorithms over complex glaze ice shapes containing multi-curvature ice accretion geometries, such as single/double ice horns. The twofold approach tackles surface geometry discretization as well as field mesh generation. An adaptive curvilinear curvature control algorithm is constructed by solving a 1D elliptic PDE with periodic source terms. This method controls the arclength grid spacing so that highly convex and concave curvature regions around ice horns are appropriately captured, and is shown to effectively treat the grid shock problem. Then, a novel blended method is developed by defining combinations of source terms with 2D elliptic equations. The source terms include two common control functions, Sorenson and Spekreijse, and an additional third source term to improve orthogonality. This blended method is shown to be very effective for improving grid quality metrics for complex glaze ice meshes with RANS resolution. The performance, in terms of residual reduction per non-linear iteration, of several solution algorithms (Point-Jacobi, Gauss-Seidel, ADI, Point and Line SOR) is discussed within the context of a full multigrid operator. Details are given on the various formulations used in the linearization process. It is shown that the performance of the solution algorithm depends on the type of control function used. Finally, the algorithms are validated on standard complex experimental ice shapes, demonstrating the applicability of the methods.
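The relative behaviour of the point-relaxation schemes mentioned above can be illustrated on a minimal 1D elliptic redistribution problem; the source term here is an invented clustering function, not the NSGRID curvature control:

```python
# 1D elliptic point-distribution problem x''(xi) = S(xi), x(0)=0, x(1)=1,
# relaxed with Gauss-Seidel (omega = 1) and SOR (omega = 1.8).  The
# source term is an invented function pulling grid points toward xi=0.5.
import math

N = 41
h = 1.0 / (N - 1)

def source(xi):
    return 2.0 * math.tanh(10.0 * (xi - 0.5))

def solve(omega, tol=1e-10, max_iter=100000):
    x = [i * h for i in range(N)]            # uniform initial spacing
    s = [source(i * h) for i in range(N)]
    for it in range(1, max_iter + 1):
        err = 0.0
        for i in range(1, N - 1):
            gs = 0.5 * (x[i - 1] + x[i + 1] - h * h * s[i])
            new = x[i] + omega * (gs - x[i])   # omega = 1 is Gauss-Seidel
            err = max(err, abs(new - x[i]))
            x[i] = new
        if err < tol:
            return x, it
    return x, max_iter

x_gs, it_gs = solve(omega=1.0)
x_sor, it_sor = solve(omega=1.8)
print(it_gs, it_sor)     # over-relaxation needs far fewer sweeps
```

The resulting distribution stays monotone but clusters points near the middle, which is the same arclength-control idea the thesis applies around ice horns.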
Finally, the automated framework for RANS-based two-dimensional multi-step ice accretion, CANICE2D-NS, is developed, coupled with a Multi-Block Navier-Stokes CFD code, NSCODE2D, a Multi-Block elliptic grid generation code, NSGRID2D, and a Multi-Block Eulerian droplet solver, NSDROP2D (developed at Polytechnique Montreal). The framework allows Lagrangian and Eulerian droplet computations within a chimera approach treating multi-element geometries. The code was tested on public and confidential validation test cases, including standard NATO cases. In addition, up to a 10-times speedup is observed in the mesh generation procedure by using the implicit line SOR and ADI smoothers within a multigrid procedure. The results demonstrate the benefits and robustness of the new framework in predicting ice shapes and aerodynamic performance parameters.
Gregory, Emma; West, Therese A; Cole, Wesley R; Bailie, Jason M; McCulloch, Karen L; Ettenhofer, Mark L; Cecchini, Amy; Qashu, Felicia M
2017-01-01
The large number of U.S. service members diagnosed with concussion/mild traumatic brain injury each year underscores the necessity for clear and effective clinical guidance for managing concussion. Relevant research continues to emerge supporting a gradual return to pre-injury activity levels without aggravating symptoms; however, available guidance does not provide detailed standards for this return to activity process. To fill this gap, the Defense and Veterans Brain Injury Center released a recommendation for primary care providers detailing a step-wise return to unrestricted activity during the acute phase of concussion. This guidance was developed in collaboration with an interdisciplinary group of clinical, military, and academic subject matter experts using an evidence-based approach. Systematic evaluation of the guidance is critical to ensure positive patient outcomes, to discover barriers to implementation by providers, and to identify ways to improve the recommendation. Here we describe a multi-level, mixed-methods approach to evaluate the recommendation incorporating outcomes from both patients and providers. Procedures were developed to implement the study within complex but ecologically-valid settings at multiple military treatment facilities and operational medical units. Special consideration was given to anticipated challenges such as the frequent movement of military personnel, selection of appropriate design and measures, study implementation at multiple sites, and involvement of multiple service branches (Army, Navy, and Marine Corps). We conclude by emphasizing the need to consider contemporary approaches for evaluating the effectiveness of clinical guidance.
Risk Governance of Multiple Natural Hazards: Centralized versus Decentralized Approach in Europe
NASA Astrophysics Data System (ADS)
Komendantova, Nadejda; Scolobig, Anna; Vinchon, Charlotte
2014-05-01
The multi-risk approach is a relatively new field; its definition includes the need to consider multiple hazards and vulnerabilities in their interdependency (Selva, 2013). Recent multi-hazard disasters, such as the 2011 Tohoku earthquake, tsunami and nuclear catastrophe, showed the need for a multi-risk approach in hazard mitigation and management. Our knowledge about multi-risk assessment, including studies from different scientific disciplines and developed assessment tools, is constantly growing (White et al., 2001). However, the link between scientific knowledge, its implementation, and the results in terms of improved governance and decision-making has gained significantly less attention (IRGC, 2005; Kappes et al., 2012), even though interest in risk governance in general has increased significantly in recent years (Verweij and Thompson, 2006). The key research question is therefore how risk assessment is implemented and what the potential is for implementing a multi-risk approach in different governance systems across Europe. More precisely, how do characteristics of risk governance, such as the degree of centralization versus decentralization, influence the implementation of a multi-risk approach? The methodology of this research includes a comparative case study analysis of top-down and bottom-up interactions in governance in the city of Naples (Italy), where the institutional landscape is marked by the significant autonomy of Italian regions in decision-making processes for assessing the majority of natural risks, excluding volcanic risk, and in Guadeloupe (French West Indies), an overseas department of France, where decision-making is marked by greater centralization, associated with well-established state governance within regions delegated to the prefect and to decentralized services of central ministries.
The research design included documentary analysis and extensive empirical work involving policy makers, private sector actors and practitioners in risk and emergency management. This work was informed by 36 semi-structured interviews, three workshops with over seventy participants from eleven different countries, feedback from questionnaires and focus group discussions (Scolobig et al., 2013). The results show that both governance systems have their own strengths and weaknesses (Komendantova et al., 2013). Elements of a centralized multi-risk governance system could lead to improvements in inter-agency communication and the creation of an inter-agency environment, where the different departments at the national level can exchange information, identify the communities that are most exposed to multiple risks and set priorities, while providing consistent information about and responses to multi-risk to the relevant stakeholders at the local level. A decentralized multi-risk governance system, by contrast, can favour the creation of local multi-risk commissions to conduct discussions between experts in meteorological, geological and technological risks and practitioners, to elaborate risk and hazard maps, and to develop local capacities which would include educational and training activities. Both governance systems suffer from common deficiencies, the most important being the frequent lack of capacities at the local level, especially financial, but sometimes also technical and institutional ones, as the responsibilities for disaster risk management are often transferred from the national to local levels without sufficient resources for the implementation of risk management programs (UNISDR, 2013). The difficulty in balancing available resources between short-term and medium-term priorities often complicates the issue.
Our recommendations are that the implementation of a multi-risk approach can be facilitated through knowledge exchange and dialogue between different disciplinary communities, such as geological and meteorological, and between the natural and social sciences. The implementation of a multi-risk approach can be strengthened through the creation of multi-risk platforms and multi-risk commissions, which can liaise between risk management experts and local communities and unify numerous actions on natural hazard management. However, the multi-risk approach cannot be subsidiary to a single-risk approach; both have to be pursued. References: IRGC. (2011). Concept note: Improving the management of emerging risks: Risks from new technologies, system interactions, and unforeseen or changing circumstances. International Risk Governance Council (IRGC), Geneva. Kappes, M. S., Keiler, M., von Elverfeldt, K., & Glade, T. (2012). Challenges of analyzing multi-hazard risk: A review. Natural Hazards, 64(2), 1925-1958. doi: 10.1007/s11069-012-0294-2. Komendantova, N., Scolobig, A., & Vinchon, C. (2013). Multi-risk approach in centralized and decentralized risk governance systems: Case studies of Naples, Italy and Guadeloupe, France. International Relations and Diplomacy, 1(3), 224-239. Scolobig, A., Vinchon, C., Komendantova, N., Bengoubou-Valerius, M., & Patt, A. (2013). Social and institutional barriers to effective multi-hazard and multi-risk decision-making governance. D6.3 MATRIX project. Selva, J. (2013). Long-term multi-risk assessment: statistical treatment of interaction among risks. Natural Hazards, 67(2), 701-722. UNISDR. (2013). Implementing the Hyogo framework for action in Europe: Regional synthesis report 2011-2013. Verweij, M., & Thompson, M. (Eds.). (2006). Clumsy solutions for a complex world: Governance, politics, and plural perceptions. New York: Palgrave Macmillan. White, G., Kates, R., & Burton, I. (2001).
Knowing better and losing even more: the use of knowledge in hazards management. Environmental Hazards, 3, 81-92.
Landform Geodiversity - State of the Art and Future Suggestions
NASA Astrophysics Data System (ADS)
Zwoliński, Zbigniew
2014-05-01
The purpose of this paper is to present the current state of understanding of geodiversity, in general terms and with regard to the relief forms of the earth, and to point out the key factors and elements for investigating landform geodiversity. The subject area of landform geodiversity encompasses, among others, the relationships between geology/lithology and landforms, the connections between landforms, water and climate, the multi-directional relationships and feedbacks between landforms and all other components of the natural environment, the linkage between landform geodiversity and morphoclimatic zones, the role of anthropogenic pressure within landform geodiversity, landform geodiversity in man-made environments, the classification and typology of landform geodiversity, and the location and nature of past and present unique landforms. Geodiversity research is carried out in many countries and by different authors, whose research approaches differ somewhat. An overview of these research approaches forms one part of this presentation; a review of the methodological assumptions is followed by a brief overview of the research methods used by different authors. On the one hand, the variety of research methods is justified, because they correspond to the characteristics of the investigated areas and indicate the best way to describe landform geodiversity. On the other hand, common, universal methodological solutions for investigating geodiversity are needed for comparative studies across scales, from local through regional to global. Finally, selected future aspects of landform geodiversity are presented in the context of, inter alia, the relationship to biodiversity, the role of anthropogenic pressure in geodiversity, ecosystem services, sustainable development, and geoconservation.
NASA Astrophysics Data System (ADS)
Ahmadi, Bahman; Nariman-zadeh, Nader; Jamali, Ali
2017-06-01
In this article, a novel approach based on game theory is presented for multi-objective optimal synthesis of four-bar mechanisms. The multi-objective optimization problem is modelled as a Stackelberg game. The more important objective function, tracking error, is considered as the leader, and the other objective function, deviation of the transmission angle from 90° (TA), is considered as the follower. In a new approach, a group method of data handling (GMDH)-type neural network is also utilized to construct an approximate model for the rational reaction set (RRS) of the follower. Using the proposed game-theoretic approach, the multi-objective optimal synthesis of a four-bar mechanism is then cast into a single-objective optimal synthesis using the leader variables and the obtained RRS of the follower. The superiority of using the synergy game-theoretic method of Stackelberg with a GMDH-type neural network is demonstrated for two case studies on the synthesis of four-bar mechanisms.
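The Stackelberg structure can be sketched with quadratic toy objectives standing in for the tracking error and transmission-angle deviation; here the follower's rational reaction set (RRS) is known in closed form, whereas the paper approximates it with a GMDH-type neural network:

```python
# Stackelberg sketch: the leader chooses x anticipating the follower's
# rational reaction y*(x) = argmin_y f2(x, y).  Quadratic toy objectives
# stand in for the tracking error (leader) and transmission-angle
# deviation (follower); they are not the paper's objective functions.

def f1(x, y):                  # leader objective (stand-in)
    return (x - 1.0) ** 2 + (y - 2.0) ** 2

def f2(x, y):                  # follower objective (stand-in)
    return (y - 0.5 * x) ** 2

def rrs(x):                    # follower's rational reaction set
    return 0.5 * x             # argmin_y f2(x, y) for fixed x

def leader_optimum(grid):
    # the bilevel problem collapses to a single-objective search in x
    return min(grid, key=lambda x: f1(x, rrs(x)))

xs = [i / 1000.0 for i in range(-2000, 4001)]
x_star = leader_optimum(xs)
y_star = rrs(x_star)
print(x_star, y_star)          # analytic optimum: x = 1.6, y = 0.8
```

Substituting the RRS into the leader objective is exactly the reduction the article performs, with the GMDH network playing the role of `rrs` when no closed form exists.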
NASA Technical Reports Server (NTRS)
Pulliam, T. H.; Nemec, M.; Holst, T.; Zingg, D. W.; Kwak, Dochan (Technical Monitor)
2002-01-01
A comparison between an Evolutionary Algorithm (EA) and an Adjoint-Gradient (AG) method applied to a two-dimensional Navier-Stokes code for airfoil design is presented. Both approaches use a common function evaluation code, the steady-state explicit part of the code ARC2D. The parameterization of the design space is a common B-spline approach for the airfoil surface, which, together with a common gridding approach, restricts the AG and EA to the same design space. Results are presented for a class of viscous transonic airfoils in which the optimization tradeoff between drag minimization as one objective and lift maximization as another produces the multi-objective design space. Comparisons are made for efficiency, accuracy and design consistency.
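The multi-objective tradeoff described above amounts to finding non-dominated designs; a minimal Pareto filter over made-up (drag, lift) pairs (not ARC2D results) looks like:

```python
# Minimal non-dominated (Pareto) filter for a drag-minimization /
# lift-maximization tradeoff.  The candidate designs are invented
# (drag, lift) pairs, not ARC2D results.

def dominates(a, b):
    # a dominates b: no worse in either objective, strictly better in one
    return a[0] <= b[0] and a[1] >= b[1] and (a[0] < b[0] or a[1] > b[1])

def pareto_front(designs):
    return [d for d in designs
            if not any(dominates(o, d) for o in designs if o is not d)]

designs = [(0.010, 0.40), (0.012, 0.55), (0.015, 0.60),
           (0.013, 0.50), (0.020, 0.58)]     # (drag, lift)
front = sorted(pareto_front(designs))
print(front)
```

An EA can populate such a front in one run, while a gradient method typically traces it one weighted point at a time, which is one axis of the comparison.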
A quasiparticle-based multi-reference coupled-cluster method.
Rolik, Zoltán; Kállay, Mihály
2014-10-07
The purpose of this paper is to introduce a quasiparticle-based multi-reference coupled-cluster (MRCC) approach. The quasiparticles are introduced via a unitary transformation which allows us to represent a complete active space reference function and other elements of an orthonormal multi-reference (MR) basis in a determinant-like form. The quasiparticle creation and annihilation operators satisfy the fermion anti-commutation relations. On the basis of these quasiparticles, a generalization of the normal-ordered operator products for the MR case can be introduced as an alternative to the approach of Mukherjee and Kutzelnigg [Recent Prog. Many-Body Theor. 4, 127 (1995); Mukherjee and Kutzelnigg, J. Chem. Phys. 107, 432 (1997)]. Based on the new normal ordering any quasiparticle-based theory can be formulated using the well-known diagram techniques. Beyond the general quasiparticle framework we also present a possible realization of the unitary transformation. The suggested transformation has an exponential form where the parameters, holding exclusively active indices, are defined in a form similar to the wave operator of the unitary coupled-cluster approach. The definition of our quasiparticle-based MRCC approach strictly follows the form of the single-reference coupled-cluster method and retains several of its beneficial properties. Test results for small systems are presented using a pilot implementation of the new approach and compared to those obtained by other MR methods.
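The fermion anti-commutation relations that the quasiparticle operators are stated to satisfy are, written out (with $\tilde a_p^\dagger$, $\tilde a_p$ denoting the quasiparticle creation and annihilation operators; the tilde notation here is ours, not necessarily the paper's):

```latex
\{\tilde a_p, \tilde a_q^\dagger\} = \delta_{pq}, \qquad
\{\tilde a_p, \tilde a_q\} = \{\tilde a_p^\dagger, \tilde a_q^\dagger\} = 0
```

It is this algebra that lets the usual Wick's-theorem diagram techniques carry over to the multi-reference setting.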
The Childhood Obesity Declines Project: Implications for Research and Evaluation Approaches.
Young-Hyman, Deborah; Morris, Kathryn; Kettel Khan, Laura; Dawkins-Lyn, Nicola; Dooyema, Carrie; Harris, Carole; Jernigan, Jan; Ottley, Phyllis; Kauh, Tina
2018-03-01
Childhood obesity remains prevalent and is increasing in some disadvantaged populations. Numerous research, policy and community initiatives have been undertaken to address this epidemic, yet natural experiments remain understudied. The need to learn from these efforts is paramount, as the resulting evidence may not be readily available to inform future research, community initiatives, and policy development/implementation. We discuss the implications of using an adaptation of the Systematic Screening and Assessment (SSA) method to evaluate the Childhood Obesity Declines (COBD) project. The project examined successful initiatives, programs and policies in four diverse communities which were concurrent with significant declines in child obesity. In the context of other research designs and evaluation schemas, the rationale for the use of SSA is presented. Evidence generated by this method is highlighted, and guidance is suggested for the evaluation of future studies of community-based childhood obesity prevention initiatives. Support for the role of stakeholder collaboratives, in particular the National Collaborative on Childhood Obesity Research, as a synergistic vehicle to accelerate research on childhood obesity is discussed. SSA mapped active processes and provided contextual understanding of multi-level/component simultaneous efforts to reduce rates of childhood obesity in community settings. Initiatives, programs and policies were not necessarily coordinated. Although direct attribution to intervention/initiative/policy components could not be made, the what, by whom, how, and to whom were temporally associated with statistically significant reductions in childhood obesity. SSA provides evidence for context and processes which are not often captured by other data analytic methods, and thus offers an additional tool to layer with other evaluation approaches.
Multidimensional Normalization to Minimize Plate Effects of Suspension Bead Array Data.
Hong, Mun-Gwan; Lee, Woojoo; Nilsson, Peter; Pawitan, Yudi; Schwenk, Jochen M
2016-10-07
Enhanced by the growing number of biobanks, biomarker studies can now be performed with reasonable statistical power by using large sets of samples. Antibody-based proteomics by means of suspension bead arrays offers one attractive approach to analyze serum, plasma, or CSF samples for such studies in microtiter plates. To expand measurements beyond single batches, with either 96 or 384 samples per plate, suitable normalization methods are required to minimize the variation between plates. Here we propose two normalization approaches utilizing MA coordinates. The multidimensional MA (multi-MA) and MA-loess methods both consider all samples of a microtiter plate per suspension bead array assay and thus do not require any external reference samples. We demonstrate the performance of the two MA normalization methods with data obtained from the analysis of 384 samples including both serum and plasma. Samples were randomized across 96-well sample plates, then processed and analyzed in assay plates. Using principal component analysis (PCA), we could show that plate-wise clusters found in the first two components were eliminated by multi-MA normalization as compared with other normalization methods. Furthermore, we studied the correlation profiles between random pairs of antibodies and found that both MA normalization methods substantially reduced the inflated correlation introduced by plate effects. Normalization approaches using multi-MA and MA-loess minimized batch effects arising from the analysis of several assay plates with antibody suspension bead arrays. In a simulated biomarker study, multi-MA restored associations lost due to plate effects. Our normalization approaches, which are available as the R package MDimNormn, could also be useful in studies using other types of high-throughput assay data.
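The plate-effect removal can be sketched in log space; this is a simplified plate-median centering stand-in for the multi-MA projection implemented in the MDimNormn package, not the method itself:

```python
# Simplified plate-effect removal in log space: per antibody, shift each
# plate so its median matches the grand median.  A stand-in for the
# multi-MA method (R package MDimNormn), not its actual projection.
import math
from statistics import median

def normalize_plates(plates):
    # plates: {plate_id: [raw intensities for one antibody]}
    logged = {p: [math.log2(v) for v in vals] for p, vals in plates.items()}
    grand = median([x for vals in logged.values() for x in vals])
    out = {}
    for p, vals in logged.items():
        offset = median(vals) - grand        # estimated plate effect
        out[p] = [x - offset for x in vals]
    return out

raw = {"plate1": [100, 110, 90, 105],
       "plate2": [200, 220, 180, 210]}       # plate2 shifted by ~2x
norm = normalize_plates(raw)
print(median(norm["plate1"]), median(norm["plate2"]))   # now aligned
```

Because the correction is a constant per plate, within-plate differences between samples are preserved, which is the property that keeps biological signal intact while removing the batch shift.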
A multi-pattern hash-binary hybrid algorithm for URL matching in the HTTP protocol.
Zeng, Ping; Tan, Qingping; Meng, Xiankai; Shao, Zeming; Xie, Qinzheng; Yan, Ying; Cao, Wei; Xu, Jianjun
2017-01-01
In this paper, based on our previous multi-pattern uniform resource locator (URL) binary-matching algorithm called HEM, we propose an improved multi-pattern matching algorithm called MH that is based on hash tables and binary tables. The MH algorithm can be applied to the fields of network security, data analysis, load balancing, cloud robotic communications, and so on, all of which require string matching from a fixed starting position. Our approach effectively solves the performance problems of the classical multi-pattern matching algorithms. This paper explores ways to improve string matching performance under the HTTP protocol by using a hash method combined with a binary method that transforms the symbol-space matching problem into a digital-space numerical-size comparison and hashing problem. The MH approach has a fast matching speed, requires little memory, performs better than both the classical algorithms and HEM for matching fields in an HTTP stream, and shows great promise for use in real-world applications.
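The hash-plus-binary idea of fixed-start matching can be illustrated as follows; this is a simplified sketch of the general technique, not the published MH implementation:

```python
# Simplified fixed-start multi-pattern matcher in the spirit of a
# hash + binary scheme: patterns are encoded as integers, grouped by
# length in a hash table, and each candidate prefix of the input is
# encoded once per length and located by binary search.
from bisect import bisect_left

def build(patterns):
    table = {}
    for p in patterns:
        table.setdefault(len(p), []).append(int.from_bytes(p.encode(), "big"))
    for keys in table.values():
        keys.sort()                      # sorted for binary search
    return table

def match(table, text):
    hits = []
    for length, keys in table.items():
        if len(text) < length:
            continue
        key = int.from_bytes(text[:length].encode(), "big")
        i = bisect_left(keys, key)       # numeric-space comparison
        if i < len(keys) and keys[i] == key:
            hits.append(length)
    return sorted(hits)

urls = build(["/api/", "/api/v1/", "/static/"])
print(match(urls, "/api/v1/users"))      # prefix lengths that matched
```

Turning symbol comparison into integer comparison is the core trick the abstract describes: once encoded, candidates are resolved by hashing (on length here) and numeric binary search rather than character-by-character scanning.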
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, S; Tianjin University, Tianjin; Hara, W
Purpose: MRI has a number of advantages over CT as a primary modality for radiation treatment planning (RTP). However, one key bottleneck problem still remains, which is the lack of electron density information in MRI. In this work, a reliable method to map electron density is developed by leveraging the differential contrast of multi-parametric MRI. Methods: We propose a probabilistic Bayesian approach for electron density mapping based on T1- and T2-weighted MRI, using multiple patients as atlases. For each voxel, we compute two conditional probabilities: (1) electron density given its image intensity on T1- and T2-weighted MR images, and (2) electron density given its geometric location in a reference anatomy. The two sources of information (image intensity and spatial location) are combined into a unifying posterior probability density function using the Bayesian formalism. The mean value of the posterior probability density function provides the estimated electron density. Results: We evaluated the method on 10 head and neck patients and performed leave-one-out cross validation (9 patients as atlases and the remaining 1 as test). The proposed method significantly reduced the errors in electron density estimation, with a mean absolute HU error of 138, compared with 193 for the T1-weighted intensity approach and 261 without density correction. For bone detection (HU>200), the proposed method had an accuracy of 84% and a sensitivity of 73% at a specificity of 90% (AUC = 87%). In comparison, the AUC for bone detection is 73% and 50% using the intensity approach and without density correction, respectively. Conclusion: The proposed unifying method provides accurate electron density estimation and bone detection based on multi-parametric MRI of the head with highly heterogeneous anatomy. This could allow for accurate dose calculation and reference image generation for patient setup in MRI-based radiation treatment planning.
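The Bayesian fusion of the two conditional probabilities can be sketched for a single voxel; both densities below are Gaussian stand-ins, not the paper's atlas-based estimates:

```python
# Toy Bayesian fusion for one voxel: discretize HU, multiply an
# intensity-based density p(HU | T1/T2 intensity) with a location-based
# density p(HU | position), normalize, and take the posterior mean.
# Both densities are Gaussian stand-ins, not the paper's atlas model.
import math

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def posterior_mean_hu(mu_int, sig_int, mu_loc, sig_loc, lo=-1000, hi=2000):
    grid = range(lo, hi + 1)                     # 1-HU bins
    w = [gaussian(h, mu_int, sig_int) * gaussian(h, mu_loc, sig_loc)
         for h in grid]
    z = sum(w)                                   # normalization constant
    return sum(h * wi for h, wi in zip(grid, w)) / z

# intensity suggests soft-tissue-like HU, spatial prior suggests bone
est = posterior_mean_hu(mu_int=100, sig_int=200, mu_loc=700, sig_loc=300)
print(round(est))
```

The posterior mean lands between the two sources, weighted by their precisions, which is how the method arbitrates when intensity and anatomy disagree.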
NASA Astrophysics Data System (ADS)
Salleh, S. A.; Rahman, A. S. A. Abd; Othman, A. N.; Mohd, W. M. N. Wan
2018-02-01
As different approaches produce different results, it is crucial to determine which methods are accurate in order to analyse the event. The aim of this research is to compare the Rank Reciprocal (MCDM) and Artificial Neural Network (ANN) analysis techniques in determining zones susceptible to landslide hazard. The study is based on data obtained from various sources, such as the local authority Dewan Bandaraya Kuala Lumpur (DBKL), Jabatan Kerja Raya (JKR) and other agencies. The data were analysed and processed using ArcGIS. The results were compared by quantifying the risk ranking and area differential, and were also compared with the zonation map classified by DBKL. The results suggest that the ANN method gives better accuracy, with an 18.18% higher accuracy assessment than the MCDM approach. This indicates that ANN provides more reliable results, probably due to its ability to learn from the environment, thus portraying a realistic and accurate result.
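The rank reciprocal scheme referred to above assigns the criterion ranked r the weight (1/r)/Σ(1/r_j); this is the standard MCDM variant of that name, and whether the study used exactly this form is an assumption:

```python
# Rank-reciprocal weighting, a standard MCDM scheme: the criterion
# ranked r gets weight (1/r) / sum_j (1/r_j).  Exact fractions are used
# so the normalization is easy to check.
from fractions import Fraction

def rank_reciprocal_weights(n):
    recips = [Fraction(1, r) for r in range(1, n + 1)]
    total = sum(recips)
    return [w / total for w in recips]

w = rank_reciprocal_weights(3)
print(w)    # three criteria -> weights 6/11, 3/11, 2/11
```

The weighted criteria are then summed per map cell to score susceptibility, which is the layer-overlay step typically done inside the GIS.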
Multi-scale graph-cut algorithm for efficient water-fat separation.
Berglund, Johan; Skorpil, Mikael
2017-09-01
To improve the accuracy and robustness to noise of water-fat separation by unifying the multi-scale and graph-cut based approaches to B0 correction. A previously proposed water-fat separation algorithm that corrects for B0 field inhomogeneity in 3D by a single quadratic pseudo-Boolean optimization (QPBO) graph cut was incorporated into a multi-scale framework, where field map solutions are propagated from coarse to fine scales for voxels that are not resolved by the graph cut. The accuracy of the single-scale and multi-scale QPBO algorithms was evaluated against benchmark reference datasets. The robustness to noise was evaluated by adding noise to the input data prior to water-fat separation. Both algorithms achieved the highest accuracy when compared with seven previously published methods, while computation times were acceptable for implementation in clinical routine. The multi-scale algorithm was more robust to noise than the single-scale algorithm, while causing only a small increase (+10%) of the reconstruction time. The proposed 3D multi-scale QPBO algorithm offers accurate water-fat separation, robustness to noise, and fast reconstruction. The software implementation is freely available to the research community. Magn Reson Med 78:941-949, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
A Multi-Stage Reverse Logistics Network Problem by Using Hybrid Priority-Based Genetic Algorithm
NASA Astrophysics Data System (ADS)
Lee, Jeong-Eun; Gen, Mitsuo; Rhee, Kyong-Gu
Today, the remanufacturing problem is one of the most important problems regarding the environmental aspects of recovering used products and materials. Reverse logistics is therefore gaining momentum and shows great potential for winning consumers in a more competitive future context. This paper considers the multi-stage reverse Logistics Network Problem (m-rLNP), minimizing the total cost, which involves the reverse logistics shipping cost and the fixed cost of opening disassembly centers and processing centers. In this study, we first formulate the m-rLNP model as a three-stage logistics network model. To solve this problem, we then propose a Genetic Algorithm (GA) with a priority-based encoding method consisting of two stages, and introduce a new crossover operator called Weight Mapping Crossover (WMX). Additionally, a heuristic approach is applied in the third stage to ship materials from processing centers to the manufacturer. Finally, numerical experiments with various scales of m-rLNP models demonstrate the effectiveness and efficiency of our approach by comparison with recent research.
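Priority-based decoding for a single transportation stage can be sketched as follows; the instance data and priority vector are made up, and this is the generic decoding idea rather than the paper's exact procedure:

```python
# Priority-based decoding sketch for one transportation stage (e.g.
# returned products -> disassembly centers).  A chromosome assigns a
# priority to every source and sink; decoding repeatedly takes the
# highest-priority node with remaining capacity and ships along its
# cheapest feasible arc.  Data and priorities are invented.

def decode(priority, supply, demand, cost):
    supply, demand = supply[:], demand[:]
    n = len(supply)
    ship = {}
    nodes = list(range(len(priority)))       # 0..n-1 sources, n.. sinks
    while any(supply) and any(demand):
        live = [k for k in nodes
                if (k < n and supply[k] > 0) or (k >= n and demand[k - n] > 0)]
        k = max(live, key=lambda j: priority[j])
        if k < n:      # source: ship to the cheapest sink with demand
            i = k
            j = min((j for j in range(len(demand)) if demand[j] > 0),
                    key=lambda j: cost[i][j])
        else:          # sink: receive from the cheapest source with supply
            j = k - n
            i = min((i for i in range(n) if supply[i] > 0),
                    key=lambda i: cost[i][j])
        q = min(supply[i], demand[j])
        ship[(i, j)] = ship.get((i, j), 0) + q
        supply[i] -= q
        demand[j] -= q
    return ship

cost = [[4, 2], [3, 5]]                      # cost[i][j], 2 sources x 2 sinks
plan = decode([1, 3, 2, 4], [30, 25], [20, 35], cost)
print(plan)
```

The GA then searches over the priority vector (with operators such as WMX) while this decoder turns each chromosome into a feasible shipment plan.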
NASA Astrophysics Data System (ADS)
Kim, Han Seul; Kim, Yong-Hoon
We have been developing a multi-space-constrained density functional theory approach for the first-principles calculations of nano-scale junctions subjected to non-equilibrium conditions and charge transport through them. In this presentation, we apply the method to vertically-stacked graphene/hexagonal boron nitride (hBN)/graphene Van der Waals heterostructures in the context of tunneling transistor applications. Bias-dependent changes in energy level alignment, wavefunction hybridization, and current are extracted. In particular, we compare quantum transport properties of single-layer (graphene) and infinite (graphite) electrode limits on the same ground, which is not possible within the traditional non-equilibrium Green function formalism. The effects of point defects within hBN on the current-voltage characteristics will be also discussed. Global Frontier Program (2013M3A6B1078881), Nano-Material Technology Development Programs (2016M3A7B4024133, 2016M3A7B4909944, and 2012M3A7B4049888), and Pioneer Program (2016M3C1A3906149) of the National Research Foundation.
Governance for public health and health equity: The Trøndelag model for public health work.
Lillefjell, Monica; Magnus, Eva; Knudtsen, Margunn Skjei; Wist, Guri; Horghagen, Sissel; Espnes, Geir Arild; Maass, Ruca; Anthun, Kirsti Sarheim
2018-06-01
Multi-sectoral governance of population health is linked to the realization that health is the property of many societal systems. This study aims to contribute knowledge and methods that can strengthen the capacities of municipalities to work more systematically, knowledge-based, and multi-sectorally in promoting health and health equity in the population. A process evaluation was conducted, applying a mixed-methods research design that combined qualitative and quantitative data collection methods. Processes strengthening the systematic and multi-sectoral development, implementation, and evaluation of research-based measures to promote health, quality of life, and health equity in, for, and with municipalities were revealed. A step-by-step model has been developed that emphasizes the promotion of knowledge-based, systematic, multi-sectoral public health work, as well as joint ownership of local resources, initiatives, and policies. Implementation of systematic, knowledge-based, and multi-sectoral governance of public health measures in municipalities demands a shared understanding of the challenges, an updated overview of population health and its impact factors, anchoring in plans, new skills and methods for the selection and implementation of measures, as well as the development of trust, ownership, and shared ethics and goals among those involved.
Williams, Claire; Lewsey, James D; Briggs, Andrew H; Mackay, Daniel F
2017-05-01
This tutorial provides a step-by-step guide to performing cost-effectiveness analysis using a multi-state modeling approach. Alongside the tutorial, we provide easy-to-use functions in the statistics package R. We argue that this multi-state modeling approach using a package such as R has advantages over approaches where models are built in a spreadsheet package. In particular, using a syntax-based approach means there is a written record of what was done and the calculations are transparent. Reproducing the analysis is straightforward, as the syntax just needs to be run again. The approach can be thought of as an alternative way to build a Markov decision-analytic model, which also has the option to use a state-arrival extended approach. In the state-arrival extended multi-state model, a covariate that represents patients' history is included, allowing the Markov property to be tested. We illustrate the building of multi-state survival models, making predictions from the models, and assessing fits. We then proceed to perform a cost-effectiveness analysis, including deterministic and probabilistic sensitivity analyses. Finally, we show how to create 2 common visualizations of the results, namely cost-effectiveness planes and cost-effectiveness acceptability curves. The analysis is implemented entirely within R. It is based on adaptations of functions in the existing R package mstate to accommodate parametric multi-state modeling that facilitates extrapolation of survival curves.
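The kind of Markov decision-analytic model the tutorial builds in R can be illustrated in miniature. The three states, transition probabilities, costs, utilities, and discount rate below are invented for this sketch; they are not taken from the tutorial or the mstate package.

```python
import numpy as np

# Toy 3-state Markov cohort model (Healthy, Sick, Dead).
# All numbers are illustrative assumptions, not from the tutorial.
P = np.array([[0.85, 0.10, 0.05],
              [0.00, 0.80, 0.20],
              [0.00, 0.00, 1.00]])       # row-stochastic transition matrix
cost = np.array([100.0, 1500.0, 0.0])    # per-cycle cost in each state
utility = np.array([0.95, 0.60, 0.0])    # per-cycle QALY weight in each state
discount = 0.035                         # annual discount rate

state = np.array([1.0, 0.0, 0.0])        # whole cohort starts Healthy
total_cost = 0.0
total_qaly = 0.0
for t in range(40):                      # 40 yearly cycles
    df = 1.0 / (1.0 + discount) ** t     # discount factor for cycle t
    total_cost += df * (state @ cost)
    total_qaly += df * (state @ utility)
    state = state @ P                    # advance the cohort one cycle
```

Running the two treatment arms of a real analysis through such a loop and comparing discounted costs and QALYs gives the incremental cost-effectiveness ratio; the tutorial's state-arrival extension would additionally condition the transition rates on patient history.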
Aircraft Conflict Analysis and Real-Time Conflict Probing Using Probabilistic Trajectory Modeling
NASA Technical Reports Server (NTRS)
Yang, Lee C.; Kuchar, James K.
2000-01-01
Methods for maintaining separation between aircraft in the current airspace system have been built on a foundation of structured routes and evolved procedures. However, as the airspace becomes more congested and the chance of failures or operational errors becomes more problematic, automated conflict alerting systems have been proposed to help provide decision support and to serve as traffic monitoring aids. The problem of conflict detection and resolution has been tackled in a number of different ways, but in this thesis, it is recast as a problem of prediction in the presence of uncertainties. Much of the focus is concentrated on the errors and uncertainties of the working trajectory model used to estimate future aircraft positions. The more accurate the prediction, the more likely an ideal (no false alarms, no missed detections) alerting system can be designed. Additional insights into the problem were brought forth by a review of current operational and developmental approaches found in the literature. An iterative, trial-and-error approach to threshold design was identified. When examined from a probabilistic perspective, the threshold parameters were found to be a surrogate for probabilistic performance measures. To overcome the limitations of the current iterative design method, a new direct approach is presented in which the performance measures are directly computed and used to make the alerting decisions. The methodology is shown to handle complex encounter situations (3-D, multi-aircraft, multi-intent, with uncertainties) with relative ease. Utilizing a Monte Carlo approach, a method was devised to perform the probabilistic computations in near real time. Not only does this greatly increase the method's potential as an analytical tool, but it also opens up the possibility of use as a real-time conflict alerting probe. A prototype alerting logic was developed and has been utilized in several NASA Ames Research Center experimental studies.
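The Monte Carlo conflict-probability computation can be sketched as follows, assuming (for illustration only) sampled trajectories with independent Gaussian position errors and a fixed separation threshold; the thesis's actual trajectory and uncertainty models are considerably richer.

```python
import numpy as np

rng = np.random.default_rng(0)

def conflict_probability(own_traj, other_traj, pos_sigma, sep=5.0, n_samples=500):
    """Monte Carlo estimate of the probability of losing separation.

    own_traj, other_traj: (T, 2) arrays of nominal positions over T time steps.
    pos_sigma: standard deviation of the Gaussian position error per axis.
    sep: required separation distance (same units as the positions).
    """
    T = own_traj.shape[0]
    hits = 0
    for _ in range(n_samples):
        e1 = rng.normal(0.0, pos_sigma, size=(T, 2))
        e2 = rng.normal(0.0, pos_sigma, size=(T, 2))
        dist = np.linalg.norm((own_traj + e1) - (other_traj + e2), axis=1)
        if np.any(dist < sep):        # conflict at any point of the encounter
            hits += 1
    return hits / n_samples
```

An alerting logic would then issue an alert whenever this estimated probability exceeds a chosen threshold, which is exactly the sense in which the probability itself replaces hand-tuned geometric threshold parameters.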
Papanikolaou, Yannis; Tsoumakas, Grigorios; Laliotis, Manos; Markantonatos, Nikos; Vlahavas, Ioannis
2017-09-22
In this paper we present the approach that we employed to deal with large-scale multi-label semantic indexing of biomedical papers. This work was mainly implemented within the context of the BioASQ challenge (2013-2017), a challenge concerned with biomedical semantic indexing and question answering. Our main contribution is a MUlti-Label Ensemble method (MULE) that incorporates a McNemar statistical significance test in order to validate the combination of the constituent machine learning algorithms. Secondary contributions include a study on the temporal aspects of the BioASQ corpus (observations that also apply to BioASQ's super-set, the PubMed articles collection) and the proper parametrization of the algorithms used to deal with this challenging classification task. The ensemble method that we developed is compared to other approaches in experimental scenarios with subsets of the BioASQ corpus, giving positive results. In our participation in the BioASQ challenge we obtained first place in 2013 and second place in the four following years, steadily outperforming MTI, the indexing system of the National Library of Medicine (NLM). The results of our experimental comparisons suggest that employing a statistical significance test to validate the ensemble method's choices is the optimal approach for ensembling multi-label classifiers, especially in contexts with many rare labels.
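A McNemar test of the kind used above to validate classifier combinations can be sketched in its standard exact (binomial) form; this is a generic textbook implementation, not the MULE code.

```python
from math import comb

def mcnemar_exact(b, c):
    """Exact two-sided McNemar test on a pair of classifiers.

    b: number of test instances where only classifier A is correct.
    c: number of test instances where only classifier B is correct.
    Returns the p-value for H0: both classifiers err equally often
    (instances where both agree carry no information and are ignored).
    """
    n = b + c
    if n == 0:
        return 1.0
    # probability of a result at least as extreme as min(b, c) under Binomial(n, 0.5)
    tail = sum(comb(n, k) for k in range(0, min(b, c) + 1)) / 2 ** n
    return min(1.0, 2.0 * tail)
```

An ensemble builder could, for instance, only merge a candidate classifier into the ensemble when this p-value indicates its disagreements with the current ensemble are statistically significant.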
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2000-01-01
Aircraft engines are assemblies of dynamically interacting components. Engine updates to keep present aircraft flying safely, and engines for new aircraft, are progressively required to operate under more demanding technological and environmental requirements. Designs that effectively meet those requirements are necessarily collections of multi-scale, multi-level, multi-disciplinary analysis and optimization methods, and probabilistic methods are necessary to quantify the respective uncertainties. These types of methods are the only ones that can formally evaluate advanced composite designs which satisfy those progressively demanding requirements while assuring minimum cost, maximum reliability and maximum durability. Recent research activities at NASA Glenn Research Center have focused on developing such multi-scale, multi-level, multi-disciplinary analysis and optimization methods. Multi-scale refers to formal methods that describe complex material behavior, metal or composite; multi-level refers to the integration of participating disciplines to describe a structural response at the scale of interest; multi-disciplinary refers to an open-ended framework for the various existing, and yet to be developed, discipline constructs required to formally predict/describe a structural response in engine operating environments. For example, these include, but are not limited to: multi-factor models for material behavior, multi-scale composite mechanics, general-purpose structural analysis, progressive structural fracture for evaluating durability and integrity, noise and acoustic fatigue, emission requirements, hot fluid mechanics, heat transfer, and probabilistic simulations. Many of these, as well as others, are encompassed in an integrated computer code identified as the Engine Structures Technology Benefits Estimator (EST/BEST) or Multi-faceted/Engine Structures Optimization (MP/ESTOP).
The discipline modules integrated in MP/ESTOP include: engine cycle (thermodynamics), engine weights, internal fluid mechanics, cost, mission, coupled structural/thermal analysis, various composite property simulators, and probabilistic methods to evaluate uncertainty effects (scatter ranges) in all the design parameters. The objective of this paper is to briefly describe a multi-faceted design analysis and optimization capability for coupled multi-discipline engine structures optimization. Results are presented for engine- and aircraft-type metrics to illustrate the versatility of that capability. Results are also presented for reliability, noise, and fatigue to illustrate its inclusiveness. For example, replacing metal rotors with composites reduces engine weight by 20 percent and noise by 15 percent, and improves reliability by an order of magnitude. Composite designs exist that increase fatigue life by at least two orders of magnitude compared to state-of-the-art metals.
Semi-Supervised Multi-View Learning for Gene Network Reconstruction
Ceci, Michelangelo; Pio, Gianvito; Kuzmanovski, Vladimir; Džeroski, Sašo
2015-01-01
The task of gene regulatory network reconstruction from high-throughput data has been receiving increasing attention in recent years. As a consequence, many inference methods for solving this task have been proposed in the literature. It has recently been observed, however, that no single inference method performs optimally across all datasets. It has also been shown that the integration of predictions from multiple inference methods is more robust and shows high performance across diverse datasets. Inspired by this research, in this paper, we propose a machine learning solution which learns to combine predictions from multiple inference methods. While this approach adds additional complexity to the inference process, we expect it would also carry substantial benefits. These would come from the automatic adaptation to patterns in the outputs of individual inference methods, so that regulatory interactions can be identified more reliably when these patterns occur. This article demonstrates the benefits (in terms of accuracy of the reconstructed networks) of the proposed method, which exploits an iterative, semi-supervised ensemble-based algorithm. The algorithm learns to combine the interactions predicted by many different inference methods in the multi-view learning setting. The empirical evaluation of the proposed algorithm on a prokaryotic model organism (E. coli) and on a eukaryotic model organism (S. cerevisiae) clearly shows improved performance over the state-of-the-art methods. The results indicate that gene regulatory network reconstruction for the real datasets is more difficult for S. cerevisiae than for E. coli. The software, all the datasets used in the experiments and all the results are available for download at the following link: http://figshare.com/articles/Semi_supervised_Multi_View_Learning_for_Gene_Network_Reconstruction/1604827. PMID:26641091
SU-F-R-10: Selecting the Optimal Solution for Multi-Objective Radiomics Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Z; Folkert, M; Wang, J
2016-06-15
Purpose: To develop an evidential reasoning approach for selecting the optimal solution from a Pareto solution set obtained by a multi-objective radiomics model for predicting distant failure in lung SBRT. Methods: In the multi-objective radiomics model, both sensitivity and specificity are considered as objective functions simultaneously. A Pareto solution set with many feasible solutions results from the multi-objective optimization. In this work, an optimal solution Selection methodology for Multi-Objective radiomics Learning model using the Evidential Reasoning approach (SMOLER) was proposed to select the optimal solution from the Pareto solution set. The proposed SMOLER method used the evidential reasoning approach to calculate the utility of each solution based on pre-set optimal solution selection rules. The solution with the highest utility was chosen as the optimal solution. In SMOLER, an optimal learning model coupled with a clonal selection algorithm was used to optimize model parameters. In this study, PET and CT image features and clinical parameters were utilized for predicting distant failure in lung SBRT. Results: A total of 126 solution sets were generated by adjusting predictive model parameters. Each Pareto set contains 100 feasible solutions. The solution selected by SMOLER within each Pareto set was compared to the manually selected optimal solution. Five-fold cross-validation was used to evaluate the optimal solution selection accuracy of SMOLER. The selection accuracies for the five folds were 80.00%, 69.23%, 84.00%, 84.00%, and 80.00%, respectively. Conclusion: An optimal solution selection methodology for the multi-objective radiomics learning model using the evidential reasoning approach (SMOLER) was proposed. Experimental results show that the optimal solution can be found in approximately 80% of cases.
Shahamiri, Seyed Reza; Salim, Siti Salwah Binti
2014-09-01
Automatic speech recognition (ASR) can be very helpful for speakers who suffer from dysarthria, a neurological disability that damages the control of motor speech articulators. Although a few attempts have been made to apply ASR technologies to sufferers of dysarthria, previous studies show that such ASR systems have not attained an adequate level of performance. In this study, a dysarthric multi-networks speech recognizer (DM-NSR) model is provided using a realization of the multi-views, multi-learners approach called multi-nets artificial neural networks, which tolerates the variability of dysarthric speech. In particular, the DM-NSR model employs several ANNs (as learners) to approximate the likelihood of ASR vocabulary words and to deal with the complexity of dysarthric speech. The proposed DM-NSR approach was presented in both speaker-dependent and speaker-independent paradigms. In order to highlight the performance of the proposed model over legacy models, multi-views single-learner versions of the DM-NSRs were also provided and their efficiencies compared in detail. Moreover, a comparison between the prominent dysarthric ASR methods and the proposed one is provided. The results show that the DM-NSR improved the recognition rate by up to 24.67% and reduced the error rate by up to 8.63% over the reference model.
Single Cell Multi-Omics Technology: Methodology and Application.
Hu, Youjin; An, Qin; Sheu, Katherine; Trejo, Brandon; Fan, Shuxin; Guo, Ying
2018-01-01
In the era of precision medicine, multi-omics approaches enable the integration of data from diverse omics platforms, providing multi-faceted insight into the interrelation of these omics layers on disease processes. Single cell sequencing technology can dissect the genotypic and phenotypic heterogeneity of bulk tissue and promises to deepen our understanding of the underlying mechanisms governing both health and disease. Through modification and combination of single cell assays available for transcriptome, genome, epigenome, and proteome profiling, single cell multi-omics approaches have been developed to simultaneously and comprehensively study not only the unique genotypic and phenotypic characteristics of single cells, but also the combined regulatory mechanisms evident only at single cell resolution. In this review, we summarize the state-of-the-art single cell multi-omics methods and discuss their applications, challenges, and future directions.
Co-Labeling for Multi-View Weakly Labeled Learning.
Xu, Xinxing; Li, Wen; Xu, Dong; Tsang, Ivor W
2016-06-01
It is often expensive and time consuming to collect labeled training samples in many real-world applications. To reduce human effort on annotating training samples, many machine learning techniques (e.g., semi-supervised learning (SSL), multi-instance learning (MIL), etc.) have been studied to exploit weakly labeled training samples. Meanwhile, when the training data is represented with multiple types of features, many multi-view learning methods have shown that classifiers trained on different views can help each other to better utilize the unlabeled training samples for the SSL task. In this paper, we study a new learning problem called multi-view weakly labeled learning, in which we aim to develop a unified approach to learn robust classifiers by effectively utilizing different types of weakly labeled multi-view data from a broad range of tasks including SSL, MIL and relative outlier detection (ROD). We propose an effective approach called co-labeling to solve the multi-view weakly labeled learning problem. Specifically, we model the learning problem on each view as a weakly labeled learning problem, which aims to learn an optimal classifier from a set of pseudo-label vectors generated by using the classifiers trained from other views. Unlike traditional co-training approaches using a single pseudo-label vector for training each classifier, our co-labeling approach explores different strategies to utilize the predictions from different views, biases and iterations for generating the pseudo-label vectors, making our approach more robust for real-world applications. Moreover, to further improve the weakly labeled learning on each view, we also exploit the inherent group structure in the pseudo-label vectors generated from different strategies, which leads to a new multi-layer multiple kernel learning problem. 
Promising results for text-based image retrieval on the NUS-WIDE dataset as well as news classification and text categorization on several real-world multi-view datasets clearly demonstrate that our proposed co-labeling approach achieves state-of-the-art performance for various multi-view weakly labeled learning problems including multi-view SSL, multi-view MIL and multi-view ROD.
Qian, Yu; Wei, Chungwen; Lee, F. Eun-Hyung; Campbell, John; Halliley, Jessica; Lee, Jamie A.; Cai, Jennifer; Kong, Megan; Sadat, Eva; Thomson, Elizabeth; Dunn, Patrick; Seegmiller, Adam C.; Karandikar, Nitin J.; Tipton, Chris; Mosmann, Tim; Sanz, Iñaki; Scheuermann, Richard H.
2011-01-01
Background Advances in multi-parameter flow cytometry (FCM) now allow for the independent detection of larger numbers of fluorochromes on individual cells, generating data with increasingly higher dimensionality. The increased complexity of these data has made it difficult to identify cell populations from high-dimensional FCM data using traditional manual gating strategies based on single-color or two-color displays. Methods To address this challenge, we developed a novel program, FLOCK (FLOw Clustering without K), that uses a density-based clustering approach to algorithmically identify biologically relevant cell populations from multiple samples in an unbiased fashion, thereby eliminating operator-dependent variability. Results FLOCK was used to objectively identify seventeen distinct B cell subsets in a human peripheral blood sample and to identify and quantify novel plasmablast subsets responding transiently to tetanus and other vaccinations in peripheral blood. FLOCK has been implemented in the publicly available Immunology Database and Analysis Portal, ImmPort (http://www.immport.org), for open use by the immunology research community. Conclusions FLOCK is able to identify cell subsets in experiments that use multi-parameter flow cytometry through an objective, automated computational approach. The use of algorithms like FLOCK for FCM data analysis obviates the need for subjective and labor-intensive manual gating to identify and quantify cell subsets. Novel populations identified by these computational approaches can serve as hypotheses for further experimental study. PMID:20839340
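The density-based idea behind FLOCK can be caricatured in two dimensions with a toy grid-binning scheme: bins containing enough events are kept, and adjacent dense bins are merged into clusters by flood fill. This sketch only conveys the flavor of density-based population identification without a preset K; it is not the published FLOCK algorithm, and the bin count and density threshold are arbitrary choices.

```python
import numpy as np

def grid_density_clusters(points, n_bins=10, min_density=3):
    """Toy 2-D grid-based density clustering (illustrative only).

    Bins events into an n_bins x n_bins grid, keeps bins holding at least
    min_density events, and merges 4-adjacent dense bins into clusters.
    Returns a per-event cluster label (-1 for events in sparse bins) and
    the number of clusters found.
    """
    lo, hi = points.min(axis=0), points.max(axis=0)
    idx = np.clip(((points - lo) / (hi - lo + 1e-12) * n_bins).astype(int),
                  0, n_bins - 1)
    counts = np.zeros((n_bins, n_bins), dtype=int)
    for i, j in idx:
        counts[i, j] += 1
    dense = counts >= min_density

    labels = -np.ones((n_bins, n_bins), dtype=int)
    next_label = 0
    for i in range(n_bins):
        for j in range(n_bins):
            if dense[i, j] and labels[i, j] < 0:
                stack = [(i, j)]            # flood fill one dense region
                while stack:
                    a, b = stack.pop()
                    if (0 <= a < n_bins and 0 <= b < n_bins
                            and dense[a, b] and labels[a, b] < 0):
                        labels[a, b] = next_label
                        stack += [(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)]
                next_label += 1
    return np.array([labels[i, j] for i, j in idx]), next_label
```

On two well-separated event clouds this recovers two populations without being told K in advance, which is the property the program's name refers to.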
2010-01-01
Background This review investigates the effectiveness of interventions aimed at generating demand for, and use of, sexual and reproductive health (SRH) services by young people, and of interventions aimed at generating wider community support for their use. Methods Reports and publications were found in the peer-reviewed and grey literature through academic search engines, web searches, the bibliographies of known conference proceedings and papers, and consultation with experts. The studies were reviewed against a set of inclusion criteria, and those that met these were explored in more depth. Results The evidence base for interventions aimed at generating both demand and community support for SRH services for young people was found to be under-developed, and many available studies do not provide strong evidence. However, the potential of several methods to increase youth uptake has been demonstrated; these include the linking of school education programs with youth-friendly services, life-skills approaches, and social marketing and franchising. There is also evidence that the involvement of key community gatekeepers such as parents and religious leaders is vital to generating wider community support. In general, a combined multi-component approach seems most promising, with several success stories to build on. Conclusions Many areas for further research have been highlighted, and there is a great need for more rigorous evaluation of programmes in this area. In particular, further evaluation of individual components within a multi-component approach is needed to elucidate the most effective interventions. PMID:20863411
Recent Results from NASA's Morphing Project
NASA Technical Reports Server (NTRS)
McGowan, Anna-Maria R.; Washburn, Anthony E.; Horta, Lucas G.; Bryant, Robert G.; Cox, David E.; Siochi, Emilie J.; Padula, Sharon L.; Holloway, Nancy M.
2002-01-01
The NASA Morphing Project seeks to develop and assess advanced technologies and integrated component concepts to enable efficient, multi-point adaptability in air and space vehicles. In the context of the project, the word "morphing" is defined as "efficient, multi-point adaptability" and may include macro, micro, structural and/or fluidic approaches. The project includes research on smart materials, adaptive structures, micro flow control, biomimetic concepts, optimization and controls. This paper presents an updated overview of the content of the Morphing Project including highlights of recent research results.
Xu, Gongxian; Liu, Ying; Gao, Qunwang
2016-02-10
This paper deals with multi-objective optimization of the continuous bio-dissimilation process of glycerol to 1,3-propanediol. In order to maximize the production rate of 1,3-propanediol, maximize the conversion rate of glycerol to 1,3-propanediol, maximize the conversion rate of glycerol, and minimize the concentration of the by-product ethanol, we first propose six new multi-objective optimization models that can simultaneously optimize any two of the four objectives above. These multi-objective optimization problems are then solved using the weighted-sum and normal-boundary intersection methods, respectively. Both the Pareto filter algorithm and removal criteria are used to remove the non-Pareto-optimal points obtained by the normal-boundary intersection method. The results show that the normal-boundary intersection method can successfully obtain the approximate Pareto-optimal sets of all the proposed multi-objective optimization problems, while the weighted-sum approach cannot achieve the overall Pareto-optimal solutions of some multi-objective problems. Copyright © 2015 Elsevier B.V. All rights reserved.
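The Pareto-filtering step mentioned above can be illustrated generically for two minimization objectives with a standard non-dominated filter; this textbook sketch is not the paper's specific Pareto filter algorithm or removal criteria. (For maximization objectives such as the conversion rates, negate the objective values first.)

```python
def dominates(q, p):
    """q dominates p if q is no worse in every objective and strictly better in at least one."""
    return (all(qi <= pi for qi, pi in zip(q, p))
            and any(qi < pi for qi, pi in zip(q, p)))

def pareto_filter(points):
    """Keep only the non-dominated points (all objectives minimized)."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

Applying such a filter to the candidate points produced by a scalarization method removes exactly the points that another candidate improves upon in every objective, leaving an approximation of the Pareto front.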
Multiple echo multi-shot diffusion sequence.
Chabert, Steren; Galindo, César; Tejos, Cristian; Uribe, Sergio A
2014-04-01
The aim was to measure both the transverse relaxation time (T2) and diffusion coefficients within a single scan using a multi-shot approach. Both measurements have drawn interest in many applications, especially in skeletal muscle studies, where T2 values are short. Multiple-echo single-shot schemes have been proposed to obtain these variables simultaneously within a single scan, resulting in a reduction of scanning time. However, one problem with those approaches is the associated long echo read-out; consequently, the minimum achievable echo time tends to be long, limiting the application of these sequences to tissues with relatively long T2. To address this problem, we propose to extend the multi-echo sequences using a multi-shot approach, so as to allow shorter echo times. A multi-shot dual-echo EPI sequence with diffusion gradients and echo navigators was modified to include independent diffusion gradients in either of the two echoes. The multi-shot approach allows us to drastically reduce echo times. Results showed good agreement of the T2 and mean diffusivity measurements with gold-standard sequences in phantoms and in vivo data of calf muscles from healthy volunteers. A fast and accurate method is proposed to measure T2 and diffusion coefficients simultaneously, tested in vitro and in healthy volunteers. Copyright © 2013 Wiley Periodicals, Inc.
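Both quantities decay mono-exponentially, S(TE) = S0*exp(-TE/T2) over the echo train and S(b) = S0*exp(-b*ADC) over diffusion weighting, so each can be recovered by a log-linear fit. The sketch below uses noise-free synthetic signals purely to show the two fits side by side; it is not the authors' reconstruction pipeline.

```python
import numpy as np

def fit_t2_and_adc(te, s_te, b, s_b):
    """Log-linear fits for the two quantities the sequence measures.

    T2 from a multi-echo decay   S(TE) = S0 * exp(-TE / T2),
    ADC from a diffusion decay   S(b)  = S0 * exp(-b * ADC).
    te in ms, b in s/mm^2; signals must be positive (noise-free here).
    """
    slope_t2, _ = np.polyfit(te, np.log(s_te), 1)    # slope = -1 / T2
    slope_adc, _ = np.polyfit(b, np.log(s_b), 1)     # slope = -ADC
    return -1.0 / slope_t2, -slope_adc
```

With real (noisy, Rician-distributed) magnitude data, a nonlinear fit of the exponential model is usually preferred over the log-linear form, but the log-linear version makes the estimation principle transparent.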
Proctor, Enola; Luke, Douglas; Calhoun, Annaliese; McMillen, Curtis; Brownson, Ross; McCrary, Stacey; Padek, Margaret
2015-06-11
Little is known about how well or under what conditions health innovations are sustained and their gains maintained once they are put into practice. Implementation science typically focuses on uptake by early adopters of one healthcare innovation at a time. The later-stage challenges of scaling up and sustaining evidence-supported interventions receive too little attention. This project identifies the challenges associated with sustainability research and generates recommendations for accelerating and strengthening this work. A multi-method, multi-stage approach was used: (1) identifying and recruiting experts in sustainability as participants, (2) conducting research on sustainability using concept mapping, (3) action planning during an intensive working conference of sustainability experts to expand the concept mapping quantitative results, and (4) consolidating results into a set of recommendations for research, methodological advances, and infrastructure building to advance understanding of sustainability. Participants comprised researchers, funders, and leaders in health, mental health, and public health with a shared interest in the sustainability of evidence-based health care. Prompted to identify important issues for sustainability research, participants generated 91 distinct statements, for which a concept mapping process produced 11 conceptually distinct clusters. During the conference, participants built upon the concept mapping clusters to generate recommendations for sustainability research. The recommendations fell into three domains: (1) pursue high-priority research questions as a unified agenda on sustainability; (2) advance methods for sustainability research; (3) advance infrastructure to support sustainability research. Implementation science needs to pursue later-stage translation research questions required for population impact.
Priorities include conceptual consistency and operational clarity for measuring sustainability, developing evidence about the value of sustaining interventions over time, identifying correlates of sustainability along with strategies for sustaining evidence-supported interventions, advancing the theoretical base and research designs for sustainability research, and advancing the workforce capacity, research culture, and funding mechanisms for this important work.
ERIC Educational Resources Information Center
Komatsu, Taro
2012-01-01
This article discusses methodological issues associated with education research in Bosnia and Herzegovina (BiH) and describes strategies taken to address them. Within a case study, mixed methods allowed the author to examine school leaders' perceptions multi-dimensionally. Multi-level analysis was essential to the understanding of policy-making…
Giordano, James
2017-01-01
Research in neuroscience and neurotechnology (neuroS/T) is progressing at a rapid pace, with translational applications both in medicine and more widely in the social milieu. Current and projected neuroS/T research and its applications evoke a number of neuroethicolegal and social issues (NELSI). This paper defines inherent and derivative NELSI of current and near-term neuroS/T development and engagement, and provides an overview of our group's ongoing work to develop a systematized approach to addressing them. Our proposed operational neuroethical risk assessment and mitigation paradigm (ONRAMP) is presented, which entails querying, framing, and modeling patterns and trajectories of neuroS/T research and translational uses, and the NELSI generated by such advancements and their applications. Extant ethical methods are addressed, with suggestions toward possible revision or re-formulation to meet the needs and exigencies fostered by neuroS/T and the resultant NELSI in multi-cultural contexts. The relevance and importance of multi-disciplinary expertise in focusing upon NELSI is discussed, and the need for neuroethics education toward cultivating such a cadre of expertise is emphasized. Copyright © 2016 Elsevier Inc. All rights reserved.
Methodology for quantitative rapid multi-tracer PET tumor characterizations.
Kadrmas, Dan J; Hoffman, John M
2013-10-04
Positron emission tomography (PET) can image a wide variety of functional and physiological parameters in vivo using different radiotracers. As more is learned about the molecular basis for disease and treatment, the potential value of molecular imaging for characterizing and monitoring disease status has increased. Characterizing multiple aspects of tumor physiology by imaging multiple PET tracers in a single patient provides additional complementary information, and there is a significant body of literature supporting the potential value of multi-tracer PET imaging in oncology. However, imaging multiple PET tracers in a single patient presents a number of challenges. A number of techniques are under development for rapidly imaging multiple PET tracers in a single scan, where signal-recovery processing algorithms are employed to recover various imaging endpoints for each tracer. Dynamic imaging is generally used with tracer injections staggered in time, and kinetic constraints are utilized to estimate each tracer's contribution to the multi-tracer imaging signal. This article summarizes past and ongoing work in multi-tracer PET tumor imaging, and then organizes and describes the main algorithmic approaches for achieving multi-tracer PET signal-recovery. While significant advances have been made, the complexity of the approach necessitates protocol design, optimization, and testing for each particular tracer combination and application. Rapid multi-tracer PET techniques have great potential for both research and clinical cancer imaging applications, and continued research in this area is warranted.
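The staggered-injection signal-recovery idea can be caricatured with single-exponential tracer curves separated by linear least squares. Real multi-tracer PET methods fit kinetic compartment models rather than fixed exponentials, so this only illustrates the principle that a time offset between injections makes the combined signal separable.

```python
import numpy as np

# Toy dual-tracer separation with staggered injections. Each tracer's
# time-activity curve is modeled as a decaying exponential starting at
# its (known) injection time; the decay constants and amplitudes below
# are invented for illustration.
t = np.linspace(0.0, 60.0, 121)                   # scan time, minutes

def basis(t0, k):
    """Unit-amplitude tracer curve injected at time t0 with decay rate k."""
    return np.where(t >= t0, np.exp(-k * (t - t0)), 0.0)

b1 = basis(0.0, 0.10)      # tracer 1 injected at t = 0
b2 = basis(20.0, 0.05)     # tracer 2 injected 20 minutes later
signal = 3.0 * b1 + 1.5 * b2                      # simulated combined signal

# Recover each tracer's amplitude from the combined signal.
A = np.column_stack([b1, b2])
amp, *_ = np.linalg.lstsq(A, signal, rcond=None)
```

Because the two basis curves start at different times they are linearly independent, and the least-squares solution recovers the amplitude of each tracer exactly in this noise-free setting.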
Model selection and assessment for multi-species occupancy models
Broms, Kristin M.; Hooten, Mevin B.; Fitzpatrick, Ryan M.
2016-01-01
While multi-species occupancy models (MSOMs) are emerging as a popular method for analyzing biodiversity data, formal checking and validation approaches for this class of models have lagged behind. Concurrent with the rise in application of MSOMs among ecologists, a quiet regime shift is occurring in Bayesian statistics where predictive model comparison approaches are experiencing a resurgence. Unlike single-species occupancy models that use integrated likelihoods, MSOMs are usually couched in a Bayesian framework and contain multiple levels. Standard model checking and selection methods are often unreliable in this setting and there is only limited guidance in the ecological literature for this class of models. We examined several different contemporary Bayesian hierarchical approaches for checking and validating MSOMs and applied these methods to a freshwater aquatic study system in Colorado, USA, to better understand the diversity and distributions of plains fishes. Our findings indicated distinct differences among model selection approaches, with cross-validation techniques performing the best in terms of prediction.
Group decision-making approach for flood vulnerability identification using the fuzzy VIKOR method
NASA Astrophysics Data System (ADS)
Lee, G.; Jun, K. S.; Cung, E. S.
2014-09-01
This study proposes an improved group decision making (GDM) framework that combines the VIKOR method with fuzzified data to quantify spatial flood vulnerability across multi-criteria evaluation indicators. In general, the GDM approach is an effective tool for formulating a compromise solution involving various decision makers, since different stakeholders may hold different perspectives on flood risk/vulnerability management responses. It is designed to achieve consensus building that reflects the viewpoints of each participant. The fuzzy VIKOR method was developed to solve multi-criteria decision making (MCDM) problems with conflicting and noncommensurable criteria. This compromise-ranking method can be used to obtain a near-ideal solution according to all established criteria. Triangular fuzzy numbers are used to represent the uncertainty of the weights and of the crisp proxy-variable data. Combining the GDM approach with the fuzzy VIKOR method can thus yield effective compromise decisions. The spatial flood vulnerability of the south Han River obtained with this combined approach was compared with results from general MCDM methods, such as fuzzy TOPSIS, and classical GDM methods, such as those developed by Borda, Condorcet, and Copeland. The evaluated priorities depended significantly on the decision-making method employed. The proposed fuzzy GDM approach can reduce the uncertainty in data confidence and weight derivation. Thus, the combination of the GDM approach with the fuzzy VIKOR method can provide robust prioritization because it actively reflects the opinions of various groups and accounts for uncertainty in the input data.
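The core ranking step can be sketched as follows; this minimal version defuzzifies triangular fuzzy scores by their centroid and then applies crisp VIKOR (the paper's fully fuzzy variant carries the fuzzy arithmetic further, and the district scores and weights here are invented):

```python
import numpy as np

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number (l, m, u)."""
    l, m, u = tfn
    return (l + m + u) / 3.0

def vikor(scores, weights, v=0.5):
    """Crisp VIKOR: lower Q = closer to the compromise solution.
    scores: (alternatives x criteria), all benefit-type criteria."""
    f_star = scores.max(axis=0)    # best value per criterion
    f_minus = scores.min(axis=0)   # worst value per criterion
    d = (f_star - scores) / (f_star - f_minus)
    S = (weights * d).sum(axis=1)  # group utility
    R = (weights * d).max(axis=1)  # individual regret
    Q = v * (S - S.min()) / (S.max() - S.min()) \
        + (1 - v) * (R - R.min()) / (R.max() - R.min())
    return S, R, Q

# Three districts scored on two vulnerability indicators as fuzzy triples.
fuzzy = np.array([[(0.2, 0.3, 0.4), (0.6, 0.7, 0.8)],
                  [(0.5, 0.6, 0.7), (0.3, 0.4, 0.5)],
                  [(0.7, 0.8, 0.9), (0.1, 0.2, 0.3)]])
crisp = np.apply_along_axis(defuzzify, 2, fuzzy)
S, R, Q = vikor(crisp, weights=np.array([0.6, 0.4]))
ranking = np.argsort(Q)   # best (lowest Q) first
```

The weight `v` trades off group utility against individual regret; v = 0.5 is the conventional consensus setting.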
2013-01-01
Background Qualitative research methods are increasingly used within clinical trials to address broader research questions than can be addressed by quantitative methods alone. These methods enable health professionals, service users, and other stakeholders to contribute their views and experiences to evaluation of healthcare treatments, interventions, or policies, and influence the design of trials. Qualitative data often contribute information that is better able to inform policy or influence trial design. Methods Health services researchers, including trialists, clinicians, and qualitative researchers, worked collaboratively to develop a comprehensive portfolio of standard operating procedures (SOPs) for the West Wales Organisation for Rigorous Trials in Health (WWORTH), a clinical trials unit (CTU) at Swansea University, which has recently achieved registration with the UK Clinical Research Collaboration (UKCRC). Although the UKCRC requires a total of 25 SOPs from registered CTUs, WWORTH chose to add an additional qualitative-methods SOP (QM-SOP). Results The qualitative methods SOP (QM-SOP) defines good practice in designing and implementing qualitative components of trials, while allowing flexibility of approach and method. Its basic principles are that: qualitative researchers should be contributors from the start of trials with qualitative potential; the qualitative component should have clear aims; and the main study publication should report on the qualitative component. Conclusions We recommend that CTUs consider developing a QM-SOP to enhance the conduct of quantitative trials by adding qualitative data and analysis. We judge that this improves the value of quantitative trials, and contributes to the future development of multi-method trials. PMID:23433341
Evaluating Urban Resilience to Climate Change: A Multi-Sector Approach (Final Report)
EPA is announcing the availability of this final report prepared by the Air, Climate, and Energy (ACE) Research Program, located within the Office of Research and Development, with support from Cadmus. One of the goals of the ACE research program is to provide scientific informat...
ERIC Educational Resources Information Center
Laursen, Sandra L.; Hassi, Marja-Liisa; Kogan, Marina; Weston, Timothy J.
2014-01-01
Slow faculty uptake of research-based, student-centered teaching and learning approaches limits the advancement of U.S. undergraduate mathematics education. A study of inquiry-based learning (IBL) as implemented in over 100 course sections at 4 universities provides an example of such multicourse, multi-institution uptake. Despite variation in how…
Student Perceptions of Service Quality in a Multi-Campus Higher Education System in Spain
ERIC Educational Resources Information Center
Gallifa, Josep; Batalle, Pere
2010-01-01
Purpose: This paper aims to present an in-depth case study with student perceptions of service quality, discussing the relevance of these perceptions for the important issue of quality improvement in higher education. Design/methodology/approach: The paper presents institutional research carried out in a multi-campus system in Spain made up of…
Multi-classification of cell deformation based on object alignment and run length statistic.
Li, Heng; Liu, Zhiwen; An, Xing; Shi, Yonggang
2014-01-01
Cellular morphology is widely applied in digital pathology and is essential for improving our understanding of the basic physiological processes of organisms. A key practical challenge is developing efficient methods for measuring cell deformation. We propose an innovative indirect approach to analyzing dynamic cell morphology in image sequences. The proposed approach considers both cellular shape change and cytoplasm variation, and takes every frame of the image sequence into account. Cell deformation is measured by the minimum energy function of object alignment, which is invariant to object pose. An indirect analysis strategy then uses run-length statistics to overcome the limitation of gradual deformation. We demonstrate the power of the proposed approach in one application: multi-classification of cell deformation. Experimental results show that the proposed method is sensitive to morphological variation and performs better than standard shape representation methods.
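The run-length step can be illustrated on a quantized per-frame deformation sequence. This is a generic run-length construction with classical feature names (short/long run emphasis, run percentage); the energies, quantization bins, and feature choice are hypothetical, not the authors' exact measure:

```python
import numpy as np

def run_lengths(labels):
    """Lengths of consecutive runs of equal values in a 1-D sequence."""
    labels = np.asarray(labels)
    change = np.flatnonzero(labels[1:] != labels[:-1]) + 1
    bounds = np.concatenate(([0], change, [labels.size]))
    return np.diff(bounds)

def run_length_features(labels):
    """A few classical run-length statistics over the quantized sequence."""
    rl = run_lengths(labels).astype(float)
    return {
        "short_run_emphasis": np.sum(1.0 / rl ** 2) / rl.size,
        "long_run_emphasis": np.sum(rl ** 2) / rl.size,
        "run_percentage": rl.size / rl.sum(),
    }

# Per-frame deformation energies, quantized to three levels.
energy = np.array([0.1, 0.1, 0.4, 0.5, 0.9, 0.9, 0.9, 0.2])
levels = np.digitize(energy, bins=[0.33, 0.66])
feats = run_length_features(levels)
```

Long runs of one level indicate gradual deformation, which is exactly the regime the indirect strategy is meant to capture.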
Agile methods in biomedical software development: a multi-site experience report.
Kane, David W; Hohman, Moses M; Cerami, Ethan G; McCormick, Michael W; Kuhlmman, Karl F; Byrd, Jeff A
2006-05-30
Agile is an iterative approach to software development that relies on strong collaboration and automation to keep pace with dynamic environments. We have successfully used agile development approaches to create and maintain biomedical software, including software for bioinformatics. This paper reports on a qualitative study of our experiences using these methods. We have found that agile methods are well suited to the exploratory and iterative nature of scientific inquiry. They provide a robust framework for reproducing scientific results and for developing clinical support systems. The agile development approach also provides a model for collaboration between software engineers and researchers. We present our experience using agile methodologies in projects at six different biomedical software development organizations. The organizations include academic, commercial and government development teams, and included both bioinformatics and clinical support applications. We found that agile practices were a match for the needs of our biomedical projects and contributed to the success of our organizations. We found that the agile development approach was a good fit for our organizations, and that these practices should be applicable and valuable to other biomedical software development efforts. Although we found differences in how agile methods were used, we were also able to identify a set of core practices that were common to all of the groups, and that could be a focus for others seeking to adopt these methods.
Hoang, Tuan; Tran, Dat; Huang, Xu
2013-01-01
Common Spatial Pattern (CSP) is a state-of-the-art method for feature extraction in Brain-Computer Interface (BCI) systems. However, it is designed for 2-class BCI classification problems. Current extensions of this method to multiple classes, based on subspace union and covariance matrix similarity, do not provide high performance. This paper presents a new approach to solving multi-class BCI classification problems by forming a subspace resembled from the original subspaces; the proposed method is called Approximation-based Common Principal Component (ACPC). We performed experiments on Dataset 2a of BCI Competition IV, which was designed for motor imagery classification with 4 classes, to evaluate the proposed method. Preliminary experiments show that the proposed ACPC feature extraction method, when combined with Support Vector Machines, outperforms CSP-based feature extraction methods on the experimental dataset.
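Since the ACPC formulation is not reproduced here, the 2-class baseline it extends can be sketched instead: standard CSP solved as a generalized eigenvalue problem, with random arrays standing in for EEG trials:

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_filters=2):
    """Standard 2-class CSP: spatial filters maximizing the variance
    ratio between classes, via Ca w = lambda (Ca + Cb) w.
    trials_x: list of (channels x samples) arrays for one class."""
    Ca = np.mean([np.cov(t) for t in trials_a], axis=0)
    Cb = np.mean([np.cov(t) for t in trials_b], axis=0)
    vals, vecs = eigh(Ca, Ca + Cb)
    order = np.argsort(vals)
    # Filters from both ends of the spectrum discriminate best.
    pick = np.concatenate((order[:n_filters], order[-n_filters:]))
    return vecs[:, pick]

def csp_features(trial, W):
    """Log of normalized variances of the filtered trial (usual CSP feature)."""
    var = (W.T @ trial).var(axis=1)
    return np.log(var / var.sum())

rng = np.random.default_rng(1)
trials_a = [rng.normal(size=(8, 200)) for _ in range(20)]            # class A
trials_b = [rng.normal(scale=2.0, size=(8, 200)) for _ in range(20)] # class B
W = csp_filters(trials_a, trials_b)
feats = csp_features(trials_a[0], W)   # 4 features per trial for a classifier
```

Multi-class extensions such as ACPC replace the two-class eigendecomposition with a shared subspace approximated across all class covariances.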
Simulation analysis of impulse characteristics of space debris irradiated by multi-pulse laser
NASA Astrophysics Data System (ADS)
Lin, Zhengguo; Jin, Xing; Chang, Hao; You, Xiangyu
2018-02-01
Cleaning space debris with lasers is a hot topic in space security research, and impulse characteristics are the basis of laser-based debris removal. In order to study the impulse characteristics of rotating irregular space debris irradiated by a multi-pulse laser, an impulse calculation method for rotating space debris under multi-pulse irradiation is established based on the area matrix method. The calculation of impulse and impulsive moment under multi-pulse irradiation is given, and the computation of the total impulse over a pulse train is analyzed. Taking a typical non-planar piece of space debris (a cube) as an example, the impulse characteristics under multi-pulse laser irradiation are simulated and analyzed. The effects of initial angular velocity, spot size, and pulse frequency on the impulse characteristics are investigated.
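A much-simplified sketch of per-pulse impulse accumulation is given below: each illuminated surface element receives an impulse Cm · E along the inward normal, summed over the pulse train. The coupling coefficient, fluence, and geometry are assumed values; the paper's area matrix method additionally tracks rotation, projection, and occlusion between pulses:

```python
import numpy as np

# Assumed constants (illustrative only).
Cm = 5e-5        # impulse coupling coefficient, N*s/J
fluence = 10.0   # deposited energy per pulse per element, J
area = 1.0       # surface element area weight

def pulse_impulse(normals, illuminated):
    """Impulse from one pulse: each lit element contributes Cm * E
    along the inward surface normal."""
    J = np.zeros(3)
    for n, lit in zip(normals, illuminated):
        if lit:
            J += Cm * fluence * area * (-np.asarray(n, dtype=float))
    return J

def total_impulse(pulse_history):
    """Accumulate impulse over a pulse train; because the debris rotates
    between pulses, normals and the illuminated set change per pulse."""
    total = np.zeros(3)
    for normals, illuminated in pulse_history:
        total += pulse_impulse(normals, illuminated)
    return total

# One cube face (+z normal) lit for two successive pulses.
face = [(0.0, 0.0, 1.0)]
J_total = total_impulse([(face, [True]), (face, [True])])
```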
Comparative Study of Impedance Eduction Methods, Part 2: NASA Tests and Methodology
NASA Technical Reports Server (NTRS)
Jones, Michael G.; Watson, Willie R.; Howerton, Brian M.; Busse-Gerstengarbe, Stefan
2013-01-01
A number of methods have been developed at NASA Langley Research Center for eduction of the acoustic impedance of sound-absorbing liners mounted in the wall of a flow duct. This investigation uses methods based on the Pridmore-Brown and convected Helmholtz equations to study the acoustic behavior of a single-layer, conventional liner fabricated by the German Aerospace Center and tested in the NASA Langley Grazing Flow Impedance Tube. Two key assumptions are explored in this portion of the investigation. First, a comparison of results achieved with uniform-flow and shear-flow impedance eduction methods is considered. Also, an approach based on the Prony method is used to extend these methods from single-mode to multi-mode implementations. Finally, a detailed investigation into the effects of harmonic distortion on the educed impedance is performed, and the results are used to develop guidelines regarding acceptable levels of harmonic distortion.
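The Prony step, fitting a uniformly sampled duct signal as a sum of complex exponential modes, can be sketched generically (this is textbook Prony via linear prediction, not NASA's implementation; the two-mode demo data are synthetic):

```python
import numpy as np

def prony(signal, n_modes):
    """Textbook Prony fit: model uniformly sampled data as a sum of
    n_modes complex exponentials; returns the poles and amplitudes."""
    x = np.asarray(signal)
    N, p = len(x), n_modes
    # Linear prediction: x[n] = sum_j c_j * x[n - j], j = 1..p.
    A = np.column_stack([x[p - j - 1:N - j - 1] for j in range(p)])
    c, *_ = np.linalg.lstsq(A, x[p:], rcond=None)
    # Poles are the roots of z^p - c_1 z^(p-1) - ... - c_p.
    poles = np.roots(np.concatenate(([1.0], -c)))
    # Amplitudes from a Vandermonde least-squares fit: x[n] = sum_i a_i z_i^n.
    V = np.vander(poles, N, increasing=True).T
    amps, *_ = np.linalg.lstsq(V, x.astype(complex), rcond=None)
    return poles, amps

# Demo: recover a propagating mode and a decaying mode from noiseless data.
n = np.arange(32)
sig = 2.0 * np.exp(0.3j) ** n + (0.9 * np.exp(-0.2j)) ** n
poles, amps = prony(sig, 2)
```

Each recovered pole encodes one duct mode's axial wavenumber and attenuation, which is what lets the single-mode eduction methods be extended to multi-mode sound fields.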
NASA Astrophysics Data System (ADS)
Antle, J. M.
2017-12-01
AgMIP has developed innovative protocol-based methods for regional integrated assessment (RIA) that can be implemented by national researchers working with local and national stakeholders (http://www.agmip.org/regional-integrated-assessments-handbook/). The approach has been implemented by regional teams in Sub-Saharan Africa and South Asia. This presentation first summarizes novel elements of the AgMIP RIA methods, and their strengths and limitations, based on their application by AgMIP researchers. Key insights from the application of these methods to climate impact and adaptation in Sub-Saharan Africa and South Asia are presented. A major finding is that detailed, site-specific, systems-based analyses show much more local and regional variation in impacts than studies based on analysis of individual crops, and provide the basis for analysis of multi-faceted technology and policy options to facilitate the transition to sustainable and resilient development pathways. The presentation concludes with observations about advancing integrated assessments carried out by and for national and local researchers and stakeholders.
Mathematical model of snake-type multi-directional wave generation
NASA Astrophysics Data System (ADS)
Muarif; Halfiani, Vera; Rusdiana, Siti; Munzir, Said; Ramli, Marwan
2018-01-01
Research on extreme wave generation is an intensive area of water wave study because the occurrence of such waves in the ocean can cause serious damage to ships and offshore structures. One method used to generate these waves is the self-correcting method, which controls the signal driving the wavemakers in a wave tank. Some studies also consider nonlinear wave generation in a wave tank using numerical approaches. The study of wave generation is essential to the effectiveness and efficiency of offshore structure model testing before a structure is operated in the ocean. Generally, two types of wavemakers are implemented in hydrodynamic laboratories: piston-type and flap-type. The flap-type is preferred for testing ships in deep water. Single-flap wavemakers have been explained in many studies, yet the snake-type wavemaker (which has more than one flap) still needs to be examined. Hence, the formulation controlling the wavemaker must be analyzed precisely so that the given input generates the desired wave in the space-limited wave tank. Applying the same analogy and methodology as the previous study, this article presents multi-directional wave generation using snake-type wavemakers.
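The phased-array idea behind a snake-type wavemaker can be illustrated numerically: if flap j at position y_j oscillates with phase lag k·sin(θ)·y_j, the radiated wavelets superpose into a wave travelling at angle θ. The point-source superposition below is a sketch with assumed period, geometry, and amplitude, not the paper's boundary-value formulation:

```python
import numpy as np

g = 9.81
T = 2.0                          # wave period in seconds (assumed)
omega = 2 * np.pi / T
k = omega ** 2 / g               # deep-water dispersion: omega^2 = g*k
theta = np.deg2rad(30)           # target propagation direction

n_flaps = 40
y = np.linspace(0.0, 20.0, n_flaps)   # flap centers along the wavemaker
phase = k * np.sin(theta) * y         # per-flap phase lag steers the beam

def elevation(x, yy, t, amp=0.1):
    """Surface elevation from superposed flap wavelets (point-source sketch)."""
    eta = 0.0
    for yj, ph in zip(y, phase):
        r = np.hypot(x, yy - yj)
        eta += (amp / n_flaps) * np.cos(k * r - omega * t - ph)
    return eta
```

Setting all phase lags to zero recovers the single-direction case, which is why the multi-flap formulation must be controlled precisely to obtain the desired oblique wave in a finite tank.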
Kahn, Arnold
2011-04-01
The Longevity Consortium is a multi-investigator, multi-institutional research group focused on identifying the genetic variants that regulate human lifespan and healthy aging. The text that follows is an introduction to a series of seven articles prepared by Consortium investigators; together they profile planned and ongoing research and provide up-to-date reviews of topics of major interest to biogerontologists and other scientists and clinicians interested in ageing research. Copyright © 2010 Elsevier B.V. All rights reserved.
An Approach for Web Service Selection Based on Confidence Level of Decision Maker
Khezrian, Mojtaba; Jahan, Ali; Wan Kadir, Wan Mohd Nasir; Ibrahim, Suhaimi
2014-01-01
Web services today are among the most widely used groups for Service Oriented Architecture (SOA). Service selection is one of the most significant current discussions in SOA, which evaluates discovered services and chooses the best candidate from them. Although a majority of service selection techniques apply Quality of Service (QoS), the behaviour of QoS-based service selection leads to service selection problems in Multi-Criteria Decision Making (MCDM). In the existing works, the confidence level of decision makers is neglected, and their expertise in assessing Web services is not considered. In this paper, we employ the VIKOR (VIšekriterijumsko KOmpromisno Rangiranje) method, which is absent in the service selection literature but well known in other research areas. We propose a QoS-based approach that deals with service selection by applying VIKOR with improved features. This research determines the weights of criteria based on user preference and accounts for the confidence level of decision makers. The proposed approach is illustrated by an example in order to demonstrate and validate the model. The results of this research may help service consumers make a more efficient decision when selecting the appropriate service. PMID:24897426
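The confidence-level idea can be sketched as a weighted aggregation of the decision makers' QoS ratings before ranking. The ratings, confidence values, and the simple weighted-sum score below are all hypothetical; the paper feeds the aggregated matrix into VIKOR rather than a plain weighted sum:

```python
import numpy as np

# Ratings: (decision makers x services x QoS criteria), all benefit-type.
ratings = np.array([
    [[0.8, 0.6], [0.5, 0.9]],   # decision maker 1
    [[0.7, 0.7], [0.6, 0.8]],   # decision maker 2
    [[0.4, 0.9], [0.8, 0.6]],   # decision maker 3
])
confidence = np.array([0.8, 0.5, 0.2])   # self-assessed confidence levels
w = confidence / confidence.sum()        # normalized aggregation weights

# Confidence-weighted group decision matrix, then a simple weighted score.
group = np.tensordot(w, ratings, axes=1)       # (services x criteria)
criteria_weights = np.array([0.5, 0.5])
scores = group @ criteria_weights
best = int(np.argmax(scores))
```

Down-weighting low-confidence decision makers shifts the group matrix toward the judgments of the more expert raters, which is the effect the paper's approach is designed to capture.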
Object recognition through a multi-mode fiber
NASA Astrophysics Data System (ADS)
Takagi, Ryosuke; Horisaki, Ryoichi; Tanida, Jun
2017-04-01
We present a method of recognizing an object through a multi-mode fiber. A number of speckle patterns transmitted through a multi-mode fiber are provided to a classifier based on machine learning. We experimentally demonstrated binary classification of face and non-face targets based on this method. The measurement process of the experimental setup was random and nonlinear, because a multi-mode fiber is a typical strongly scattering medium and no reference light was used in our setup. Comparisons between three supervised learning methods, support vector machine, adaptive boosting, and neural network, are also provided. All of these learning methods achieved high accuracy, around 90%, for the classification. The approach presented here can realize a compact and smart optical sensor, which is practically useful for medical applications such as endoscopy. Our study also indicates a promising use of artificial intelligence, which has progressed rapidly, for reducing optical and computational costs in optical sensing systems.
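The classification pipeline can be sketched with scikit-learn; synthetic speckle-like intensity patterns, whose statistics differ by class, stand in here for the measured fiber outputs and face/non-face targets:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n, pix = 200, 16 * 16
# Speckle-like intensities: squared Gaussian fields whose scale differs
# between the two hypothetical target classes.
face = rng.normal(0.0, 1.0, (n, pix)) ** 2
nonface = rng.normal(0.0, 1.4, (n, pix)) ** 2
X = np.vstack([face, nonface])
y = np.array([1] * n + [0] * n)

# Train an SVM directly on the raw intensity vectors, as in the paper's
# comparison of supervised learners.
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf", gamma="scale").fit(Xtr, ytr)
acc = clf.score(Xte, yte)
```

The key point is that no descrambling of the fiber's transmission matrix is needed: the classifier learns directly from the scrambled speckle statistics.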