Sample records for method development approach

  1. Method Engineering: A Service-Oriented Approach

    NASA Astrophysics Data System (ADS)

    Cauvet, Corine

    A large variety of methods has been published in the past, ranging from very generic frameworks to methods for specific information systems. Method Engineering has emerged as a research discipline for designing, constructing and adapting methods for Information Systems development. Several approaches have been proposed as paradigms in method engineering: the meta-modeling approach provides means for building methods by instantiation, while the component-based approach supports the development of methods by using modularization constructs such as method fragments, method chunks and method components. This chapter presents an approach (SO2M) for method engineering based on the service paradigm. We consider services as autonomous computational entities that are self-describing, self-configuring and self-adapting. They can be described, published, discovered and dynamically composed to process a consumer's demand (a developer's requirement). The method service concept is proposed to capture a development process fragment for achieving a goal. Goal orientation in service specification and the principle of dynamic service composition support method construction and method adaptation to different development contexts.

  2. A robust quantitative near infrared modeling approach for blend monitoring.

    PubMed

    Mohan, Shikhar; Momose, Wataru; Katz, Jeffrey M; Hossain, Md Nayeem; Velez, Natasha; Drennen, James K; Anderson, Carl A

    2018-01-30

    This study demonstrates a material-sparing Near-Infrared modeling approach for powder blend monitoring. In this new approach, gram-scale powder mixtures are subjected to compression loads on an Instron universal testing system to simulate the effect of scale. Models prepared by the new method development approach (small-scale method) and by traditional method development (blender-scale method) were compared by simultaneously monitoring a 1 kg batch-size blend run. Both models demonstrated similar performance. The small-scale strategy significantly reduces the total resources expended to develop Near-Infrared calibration models for on-line blend monitoring. Further, this development approach does not require the actual equipment (i.e., blender) to which the method will be applied, only a similar optical interface. Thus, a robust on-line blend monitoring method can be fully developed before any large-scale blending experiment is viable, allowing the blend method to be used during scale-up and blend development trials.
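
    As a rough illustration of the calibration-model comparison this abstract describes, the sketch below fits one PLS regression per development strategy and compares prediction error on common test spectra. All arrays are random placeholders, and scikit-learn's PLSRegression is an assumed stand-in for whatever chemometrics software the authors used; this is not their actual pipeline.

```python
# Hedged sketch: comparing two NIR calibration strategies with PLS regression.
# Data and model choice are illustrative assumptions, not the study's own.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Simulated spectra: 50 samples x 200 wavelengths; reference concentrations in %w/w
X_small = rng.normal(size=(50, 200))   # small-scale (compressed gram-scale) set
X_large = rng.normal(size=(50, 200))   # traditional blender-scale set
y = rng.uniform(5, 15, size=50)

# One calibration model per development strategy
model_small = PLSRegression(n_components=5).fit(X_small, y)
model_large = PLSRegression(n_components=5).fit(X_large, y)

# Evaluate both against the same "on-line monitoring" spectra
X_test = rng.normal(size=(20, 200))
y_test = rng.uniform(5, 15, size=20)
for name, m in [("small-scale", model_small), ("blender-scale", model_large)]:
    rmsep = mean_squared_error(y_test, m.predict(X_test).ravel()) ** 0.5
    print(f"{name}: RMSEP = {rmsep:.2f} %w/w")
```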

  3. A Model-Driven Development Method for Management Information Systems

    NASA Astrophysics Data System (ADS)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

    Traditionally, a Management Information System (MIS) has been developed without using formal methods. With informal methods, the MIS is developed over its lifecycle without any models, which causes many problems such as a lack of reliability in system design specifications. In order to overcome these problems, a model theory approach was proposed, based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly accommodate changes in business logic or implementation technologies. In model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies the model-driven development method to a component of the model theory approach. An experiment has shown that the method reduces development effort by more than 30%.

  4. The Water-Energy-Food Nexus: A systematic review of methods for nexus assessment

    NASA Astrophysics Data System (ADS)

    Albrecht, Tamee R.; Crootof, Arica; Scott, Christopher A.

    2018-04-01

    The water-energy-food (WEF) nexus is rapidly expanding in scholarly literature and policy settings as a novel way to address complex resource and development challenges. The nexus approach aims to identify tradeoffs and synergies of water, energy, and food systems, internalize social and environmental impacts, and guide development of cross-sectoral policies. However, while the WEF nexus offers a promising conceptual approach, the use of WEF nexus methods to systematically evaluate water, energy, and food interlinkages or support development of socially and politically relevant resource policies has been limited. This paper reviews WEF nexus methods to provide a knowledge base of existing approaches and promote further development of analytical methods that align with nexus thinking. The systematic review of 245 journal articles and book chapters reveals that (a) use of specific and reproducible methods for nexus assessment is uncommon (less than one-third); (b) nexus methods frequently fall short of capturing interactions among water, energy, and food—the very linkages they conceptually purport to address; (c) assessments strongly favor quantitative approaches (nearly three-quarters); (d) use of social science methods is limited (approximately one-quarter); and (e) many nexus methods are confined to disciplinary silos—only about one-quarter combine methods from diverse disciplines and less than one-fifth utilize both quantitative and qualitative approaches. To help overcome these limitations, we derive four key features of nexus analytical tools and methods—innovation, context, collaboration, and implementation—from the literature that reflect WEF nexus thinking. By evaluating existing nexus analytical approaches based on these features, we highlight 18 studies that demonstrate promising advances to guide future research. This paper finds that to address complex resource and development challenges, mixed-methods and transdisciplinary approaches are needed that incorporate social and political dimensions of water, energy, and food; utilize multiple and interdisciplinary approaches; and engage stakeholders and decision-makers.

  5. WILDLIFE RISK ASSESSMENT: DEVELOPMENT OF METHODS TO ASSESS THE EFFECTS OF MERCURY AND HABITAT ALTERATION ON POPULATIONS OF AQUATIC-DEPENDENT WILDLIFE

    EPA Science Inventory

    NHEERL is conducting a demonstration project to develop tools and approaches for assessing the risks of multiple stressors to populations of piscivorous wildlife, leading to the development of risk-based criteria. Specifically, we are developing methods and approaches to assess...

  6. Introduction to Stand-up Meetings in Agile Methods

    NASA Astrophysics Data System (ADS)

    Hasnain, Eisha; Hall, Tracy

    2009-05-01

    In recent years, agile methods have become more popular in the software industry. Agile methods are a new approach compared to plan-driven approaches. One of the most important shifts in adopting an agile approach is the central focus given to people in the process. This is exemplified by the independence afforded to developers in the development work they do. This work investigates practitioners' opinions about daily stand-up meetings in agile methods and the role of the developer in them. For our investigation we joined a Yahoo group called "Extreme Programming". Our investigation suggests that although trust is an important factor in agile methods, stand-ups are not the place to build it.

  7. FODEM: A Multi-Threaded Research and Development Method for Educational Technology

    ERIC Educational Resources Information Center

    Suhonen, Jarkko; de Villiers, M. Ruth; Sutinen, Erkki

    2012-01-01

    Formative development method (FODEM) is a multithreaded design approach that was originated to support the design and development of various types of educational technology innovations, such as learning tools, and online study programmes. The threaded and agile structure of the approach provides flexibility to the design process. Intensive…

  8. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine-executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Different approaches to higher-level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming and other methods. Though a given approach does not always fall exactly into any specific class, this paper provides a classification for very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths and feasibility for future expansion toward automatic development of software systems.

  9. Approach to method development and validation in capillary electrophoresis for enantiomeric purity testing of active basic pharmaceutical ingredients.

    PubMed

    Sokoliess, Torsten; Köller, Gerhard

    2005-06-01

    A chiral capillary electrophoresis system allowing the determination of the enantiomeric purity of an investigational new drug was developed using a generic method development approach for basic analytes. The method was optimized in terms of type and concentration of both cyclodextrin (CD) and electrolyte, buffer pH, temperature, voltage, and rinsing procedure. Optimal chiral separation of the analyte was obtained using an electrolyte with 2.5% carboxymethyl-beta-CD in 25 mM NaH2PO4 (pH 4.0). Interchanging the inlet and outlet vials after each run improved the method's precision. To assure the method's suitability for the control of enantiomeric impurities in pharmaceutical quality control, its specificity, linearity, precision, accuracy, and robustness were validated according to the requirements of the International Conference on Harmonization. The usefulness of our generic method development approach for the validation of robustness was demonstrated.

  10. State of the art in the validation of screening methods for the control of antibiotic residues: is there a need for further development?

    PubMed

    Gaudin, Valérie

    2017-09-01

    Screening methods are used as a first-line approach to detect the presence of antibiotic residues in food of animal origin. The validation process guarantees that the method is fit-for-purpose, suited to regulatory requirements, and provides evidence of its performance. This article is focused on intra-laboratory validation. The first step in validation is characterisation of performance, and the second step is the validation itself with regard to pre-established criteria. The validation approaches can be absolute (a single method) or relative (comparison of methods), overall (combination of several characteristics in one) or criterion-by-criterion. Various approaches to validation, in the form of regulations, guidelines or standards, are presented and discussed to draw conclusions on their potential application for different residue screening methods, and to determine whether or not they reach the same conclusions. The approach by comparison of methods is not suitable for screening methods for antibiotic residues. The overall approaches, such as probability of detection (POD) and accuracy profile, are increasingly used in other fields of application, and may be of interest for screening methods for antibiotic residues. Finally, the criterion-by-criterion approach (Decision 2002/657/EC and the European guideline for the validation of screening methods), usually applied to screening methods for antibiotic residues, introduced a major characteristic and an improvement in the validation: the detection capability (CCβ). In conclusion, screening methods are constantly evolving, thanks to the development of new biosensors and of liquid chromatography coupled to tandem-mass spectrometry (LC-MS/MS) methods. There have been clear changes in validation approaches over the last 20 years. Continued progress is required, and perspectives for future development of guidelines, regulations and standards for validation are presented here.
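
    The probability of detection (POD) and detection capability (CCβ) concepts mentioned above lend themselves to a small numerical illustration. The sketch below fits a logistic POD curve to simulated binary screening outcomes and reads off CCβ as the lowest concentration with POD ≥ 95% (i.e., assuming β = 5%). The data, the logistic form, and the β choice are illustrative assumptions, not the article's procedure.

```python
# Hedged sketch: estimating a POD curve for a screening method and deriving
# CCbeta as the lowest concentration where POD >= 1 - beta (beta = 5% here).
# The screening outcomes below are simulated placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Spiked concentrations (ug/kg) and binary screening outcomes (1 = detected)
conc = np.repeat([0, 25, 50, 75, 100, 150], 20).reshape(-1, 1)
p_true = 1 / (1 + np.exp(-(conc.ravel() - 60) / 15))   # hidden "true" POD
detected = (rng.random(conc.size) < p_true).astype(int)

# C large -> effectively unpenalized maximum-likelihood logistic fit
pod_model = LogisticRegression(C=1e6, max_iter=1000).fit(conc, detected)

# Scan a fine grid for the first concentration with POD >= 0.95
grid = np.linspace(0, 200, 2001).reshape(-1, 1)
pod = pod_model.predict_proba(grid)[:, 1]
cc_beta = grid[pod >= 0.95][0, 0]
print(f"Estimated CCbeta (POD >= 95%): {cc_beta:.0f} ug/kg")
```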

  11. The use of qualitative methods in developing the descriptive systems of preference-based measures of health-related quality of life for use in economic evaluation.

    PubMed

    Stevens, Katherine; Palfreyman, Simon

    2012-12-01

    To describe how qualitative methods can be used in the development of descriptive systems of preference-based measures (PBMs) of health-related quality of life. The requirements of the National Institute for Health and Clinical Excellence and other agencies, together with the increasing use of patient-reported outcome measures, have led to an increase in the demand for PBMs. Recently, interest has grown in developing new PBMs and while previous research on PBMs has mainly focused on the methods of valuation, research into the methods of developing descriptive systems is an emerging field. Traditionally, descriptive systems of PBMs were developed by using top-down methods, where content was derived from existing measures, the literature, or health surveys. A contrasting approach is a bottom-up methodology, which takes the views of patients or laypeople on how their life is affected by their health. This approach generally requires the use of qualitative methods. Qualitative methods lend themselves well to the development of PBMs. They also ensure that the measure has appropriate language, content validity, and responsiveness to change. While the use of qualitative methods in the development of non-PBMs is fairly standard, their use in developing PBMs was until recently nonexistent. In this article, we illustrate the use of qualitative methods by presenting two case studies of recently developed PBMs, one generic and one condition specific. We outline the stages involved, discuss the strengths and weaknesses of the approach, and compare with the top-down approach used in the majority of PBMs to date.

  12. Forestry sector analysis for developing countries: issues and methods.

    Treesearch

    R.W. Haynes

    1993-01-01

    A satellite meeting of the 10th Forestry World Congress focused on the methods used for forest sector analysis and their applications in both developed and developing countries. The results of that meeting are summarized, and a general approach for forest sector modeling is proposed. The approach includes models derived from the existing...

  13. Method development and qualification of capillary zone electrophoresis for investigation of therapeutic monoclonal antibody quality.

    PubMed

    Suba, Dávid; Urbányi, Zoltán; Salgó, András

    2016-10-01

    Capillary electrophoresis techniques are widely used in analytical biotechnology. Different electrophoretic techniques are well suited to monitoring size- and charge heterogeneities of protein drugs. Method descriptions and development studies of capillary zone electrophoresis (CZE) have been described in the literature, most of them based on the classical one-factor-at-a-time (OFAT) approach. In this study a very simple method development approach is described for capillary zone electrophoresis: a "two-phase-four-step" approach is introduced which allows a rapid, iterative method development process and can be a good platform for CZE methods. In every step the current analytical target profile and an appropriate control strategy were established to monitor the current stage of development. A very good platform was established to investigate intact and digested protein samples. A commercially available monoclonal antibody was chosen as the model protein for the method development study. The CZE method was qualified after the development process and the results are presented. Analytical system stability was represented by the calculated RSD% values of area percentage and migration time of the selected peaks (<0.8% and <5%, respectively) during the intermediate-precision investigation.
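
    The system-stability criterion quoted above (RSD% of peak area percentage and of migration time) reduces to a one-line computation. A minimal sketch with hypothetical replicate values and the abstract's acceptance limits:

```python
# Minimal sketch: the RSD% checks used to judge system stability during
# intermediate-precision runs. Replicate values are hypothetical; the
# acceptance limits follow the abstract (<0.8% for area %, <5% for time).
import statistics

def rsd_percent(values):
    """Relative standard deviation in percent: 100 * stdev / mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

area_percent   = [54.1, 54.3, 54.0, 54.2, 54.1, 54.3]   # selected peak, area %
migration_time = [12.4, 12.6, 12.3, 12.7, 12.5, 12.9]   # minutes

print(f"area RSD%: {rsd_percent(area_percent):.2f}  (limit 0.8)")
print(f"migration-time RSD%: {rsd_percent(migration_time):.2f}  (limit 5)")
```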

  14. Approaches to Mixed Methods Dissemination and Implementation Research: Methods, Strengths, Caveats, and Opportunities.

    PubMed

    Green, Carla A; Duan, Naihua; Gibbons, Robert D; Hoagwood, Kimberly E; Palinkas, Lawrence A; Wisdom, Jennifer P

    2015-09-01

    Limited translation of research into practice has prompted study of diffusion and implementation, and development of effective methods of encouraging adoption, dissemination and implementation. Mixed methods techniques offer approaches for assessing and addressing processes affecting implementation of evidence-based interventions. We describe common mixed methods approaches used in dissemination and implementation research, discuss strengths and limitations of mixed methods approaches to data collection, and suggest promising methods not yet widely used in implementation research. We review qualitative, quantitative, and hybrid approaches to mixed methods dissemination and implementation studies, and describe methods for integrating multiple methods to increase depth of understanding while improving reliability and validity of findings.

  15. Approaches to Mixed Methods Dissemination and Implementation Research: Methods, Strengths, Caveats, and Opportunities

    PubMed Central

    Green, Carla A.; Duan, Naihua; Gibbons, Robert D.; Hoagwood, Kimberly E.; Palinkas, Lawrence A.; Wisdom, Jennifer P.

    2015-01-01

    Limited translation of research into practice has prompted study of diffusion and implementation, and development of effective methods of encouraging adoption, dissemination and implementation. Mixed methods techniques offer approaches for assessing and addressing processes affecting implementation of evidence-based interventions. We describe common mixed methods approaches used in dissemination and implementation research, discuss strengths and limitations of mixed methods approaches to data collection, and suggest promising methods not yet widely used in implementation research. We review qualitative, quantitative, and hybrid approaches to mixed methods dissemination and implementation studies, and describe methods for integrating multiple methods to increase depth of understanding while improving reliability and validity of findings. PMID:24722814

  16. Agile methods in biomedical software development: a multi-site experience report.

    PubMed

    Kane, David W; Hohman, Moses M; Cerami, Ethan G; McCormick, Michael W; Kuhlmman, Karl F; Byrd, Jeff A

    2006-05-30

    Agile is an iterative approach to software development that relies on strong collaboration and automation to keep pace with dynamic environments. We have successfully used agile development approaches to create and maintain biomedical software, including software for bioinformatics. This paper reports on a qualitative study of our experiences using these methods. We have found that agile methods are well suited to the exploratory and iterative nature of scientific inquiry. They provide a robust framework for reproducing scientific results and for developing clinical support systems. The agile development approach also provides a model for collaboration between software engineers and researchers. We present our experience using agile methodologies in projects at six different biomedical software development organizations. The organizations included academic, commercial and government development teams, and the applications included both bioinformatics and clinical support. We found that agile practices were a match for the needs of our biomedical projects and contributed to the success of our organizations. We found that the agile development approach was a good fit for our organizations, and that these practices should be applicable and valuable to other biomedical software development efforts. Although we found differences in how agile methods were used, we were also able to identify a set of core practices that were common to all of the groups, and that could be a focus for others seeking to adopt these methods.

  17. Agile methods in biomedical software development: a multi-site experience report

    PubMed Central

    Kane, David W; Hohman, Moses M; Cerami, Ethan G; McCormick, Michael W; Kuhlmman, Karl F; Byrd, Jeff A

    2006-01-01

    Background: Agile is an iterative approach to software development that relies on strong collaboration and automation to keep pace with dynamic environments. We have successfully used agile development approaches to create and maintain biomedical software, including software for bioinformatics. This paper reports on a qualitative study of our experiences using these methods. Results: We have found that agile methods are well suited to the exploratory and iterative nature of scientific inquiry. They provide a robust framework for reproducing scientific results and for developing clinical support systems. The agile development approach also provides a model for collaboration between software engineers and researchers. We present our experience using agile methodologies in projects at six different biomedical software development organizations. The organizations included academic, commercial and government development teams, and the applications included both bioinformatics and clinical support. We found that agile practices were a match for the needs of our biomedical projects and contributed to the success of our organizations. Conclusion: We found that the agile development approach was a good fit for our organizations, and that these practices should be applicable and valuable to other biomedical software development efforts. Although we found differences in how agile methods were used, we were also able to identify a set of core practices that were common to all of the groups, and that could be a focus for others seeking to adopt these methods. PMID:16734914

  18. Participatory design of healthcare technology with children.

    PubMed

    Sims, Tara

    2018-02-12

    Purpose: There are many frameworks and methods for involving children in design research. Human-Computer Interaction provides rich methods for involving children when designing technologies. The paper aims to discuss these issues. Design/methodology/approach: This paper examines various approaches to involving children in design, considering whether they view children as study objects or as active participants. Findings: The BRIDGE method is a sociocultural approach to product design that views children as active participants, enabling them to contribute to the design process as competent and resourceful partners. An example is provided in which BRIDGE was successfully applied to developing upper-limb prostheses with children. Originality/value: Approaching design in this way can provide children with opportunities to develop social, academic and design skills and to develop autonomy.

  19. DEVELOPMENT AND VALIDATION OF BROMATOMETRIC, DIAZOTIZATION AND VIS-SPECTROPHOTOMETRIC METHODS FOR THE DETERMINATION OF MESALAZINE IN PHARMACEUTICAL FORMULATION.

    PubMed

    Zawada, Elzabieta; Pirianowicz-Chaber, Elzabieta; Somogi, Aleksander; Pawinski, Tomasz

    2017-03-01

    Three new methods were developed for the quantitative determination of mesalazine in the form of the pure substance or in suppositories and tablets: a bromatometric, a diazotization and a visible-light spectrophotometric method, respectively. By optimizing the time and temperature of the bromination reaction (50°C, 50 min), 4-amino-2,3,5,6-tetrabromophenol was obtained. The results obtained were reproducible, accurate and precise. The developed methods were compared to the pharmacopoeial approach, alkalimetry in an aqueous medium. The validation parameters of all methods were comparable. The developed methods for quantification of mesalazine are a viable alternative to other, more expensive approaches.

  20. The dynamical systems approach to numerical integration

    NASA Astrophysics Data System (ADS)

    Wisdom, Jack

    2018-03-01

    The dynamical systems approach to numerical integration is reviewed and extended. The new method is compared to some alternative methods based on the Lie series approach. The test problem is the motion of the outer planets. The algorithms developed using the dynamical systems approach perform well.
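
    For readers unfamiliar with the family of integrators reviewed here, the sketch below shows a generic symplectic (leapfrog/Störmer-Verlet) scheme on a two-body Kepler problem, chosen because it exhibits the good long-term energy behavior that motivates dynamical-systems-based integration of planetary motion. It is an illustrative stand-in under those assumptions, not the paper's specific mapping method.

```python
# Hedged sketch: a generic symplectic (leapfrog / Stoermer-Verlet) integrator
# on a two-body Kepler problem, NOT the paper's dynamical-systems mapping.
import numpy as np

GM = 1.0  # gravitational parameter (dimensionless units)

def accel(q):
    return -GM * q / np.linalg.norm(q) ** 3

def leapfrog(q, p, dt, steps):
    for _ in range(steps):
        p = p + 0.5 * dt * accel(q)   # half kick
        q = q + dt * p                # drift
        p = p + 0.5 * dt * accel(q)   # half kick
    return q, p

# Circular-orbit initial conditions; initial energy is exactly -0.5
q, p = np.array([1.0, 0.0]), np.array([0.0, 1.0])
q, p = leapfrog(q, p, dt=0.01, steps=100_000)   # integrate to t = 1000

energy = 0.5 * p @ p - GM / np.linalg.norm(q)
print(f"energy at t = 1000: {energy:.6f} (initial -0.5, no secular drift)")
```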

  1. A novel quality by design approach for developing an HPLC method to analyze herbal extracts: A case study of sugar content analysis.

    PubMed

    Shao, Jingyuan; Cao, Wen; Qu, Haibin; Pan, Jianyang; Gong, Xingchu

    2018-01-01

    The aim of this study was to present a novel analytical quality by design (AQbD) approach for developing an HPLC method to analyze herbal extracts. In this approach, critical method attributes (CMAs) and critical method parameters (CMPs) of the analytical method were determined using the same data collected from screening experiments. As an example, an HPLC-ELSD method for the separation and quantification of sugars in Codonopsis Radix extract (CRE) and Astragali Radix extract (ARE) samples was developed with the novel AQbD approach. Potential CMAs and potential CMPs were identified from the Analytical Target Profile. After the screening experiments, the retention time of the D-glucose peak of CRE samples, the signal-to-noise ratio of the D-glucose peak of CRE samples, and the retention time of the sucrose peak in ARE samples were considered CMAs. The initial and final composition of the mobile phase, flow rate, and column temperature were found to be CMPs using a standard partial regression coefficient method. The probability-based design space was calculated using a Monte-Carlo simulation method and verified by experiments. The optimized method was validated to be accurate and precise, and was then applied in the analysis of CRE and ARE samples. The present AQbD approach is efficient and suitable for analysis objects with complex compositions.
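
    The probability-based design space idea is easy to illustrate: perturb the CMPs around a candidate operating point, push the perturbations through a fitted CMA response model, and record the fraction of simulations in which all acceptance criteria pass. The response model, factors, disturbance sizes, and limits below are invented placeholders, not the study's fitted model.

```python
# Hedged sketch of a probability-based design space via Monte-Carlo simulation.
# The linear CMA response model and acceptance limits are invented.
import numpy as np

rng = np.random.default_rng(2)

def predict_cmas(flow, temp):
    """Hypothetical fitted model: retention time and S/N vs flow and temperature."""
    rt  = 20.0 - 8.0 * flow - 0.05 * temp    # retention time (min)
    s2n = 5.0 + 30.0 * flow + 0.10 * temp    # signal-to-noise ratio
    return rt, s2n

def pass_probability(flow, temp, n=2000):
    f = flow + rng.normal(0, 0.02, n)        # CMP disturbances around set point
    t = temp + rng.normal(0, 1.0, n)
    rt, s2n = predict_cmas(f, t)
    ok = (rt > 10.0) & (rt < 18.0) & (s2n > 20.0)   # all CMA criteria must pass
    return ok.mean()

for flow in (0.6, 0.8, 1.0):                 # mL/min
    for temp in (25, 35):                    # deg C
        print(f"flow={flow}, temp={temp}: P(pass) = {pass_probability(flow, temp):.2f}")
```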

  2. Creating Worlds, Constructing Meaning: The Scottish Storyline Method. Teacher to Teacher Series.

    ERIC Educational Resources Information Center

    Creswell, Jeff

    The approach known as the Storyline Method was developed by a group of educators at Jordanhill College of Education in Glasgow (Scotland). The development of the Storyline Method took place over years, and the approach, with its simple framework of Storyline, key questions, and activities, has stood the test of time. Storyline uses the power of…

  3. Optimal guidance law development for an advanced launch system

    NASA Technical Reports Server (NTRS)

    Calise, Anthony J.; Leung, Martin S. K.

    1995-01-01

    The objective of this research effort was to develop a real-time guidance approach for launch vehicle ascent to orbit injection. Various analytical approaches combined with a variety of model-order and model-complexity reductions were investigated. Singular perturbation methods were attempted first and found to be unsatisfactory. A second approach based on regular perturbation analysis was subsequently investigated. It also failed, because the aerodynamic effects (ignored in the zero-order solution) are too large to be treated as perturbations. The study therefore demonstrates that perturbation methods alone (both regular and singular) are inadequate for developing a guidance algorithm for the atmospheric flight phase of a launch vehicle. During a second phase of the research effort, a hybrid analytic/numerical approach was developed and evaluated. The approach combines the numerical method of collocation with the analytical method of regular perturbations. The concept of choosing intelligent interpolating functions is also introduced. Regular perturbation analysis allows the use of a crude representation for the collocation solution, and intelligent interpolating functions further reduce the number of elements without sacrificing approximation accuracy. As a result, the combined method forms a powerful tool for solving real-time optimal control problems. Details of the approach are illustrated in a fourth-order nonlinear example. The hybrid approach is then applied to the launch vehicle problem. The collocation solution is derived from a bilinear tangent steering law, and results in a guidance solution for the entire flight regime that includes both atmospheric and exoatmospheric flight phases.

  4. Development of a Compound Optimization Approach Based on Imperialist Competitive Algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Qimei; Yang, Zhihong; Wang, Yong

    In this paper, an improved approach is developed for the imperialist competitive algorithm to achieve greater performance. The Nelder-Mead simplex method is executed alternately with the original procedures of the algorithm. The approach is tested on twelve widely used benchmark functions and is also compared with other related studies. It is shown that the proposed approach has a faster convergence rate, better search ability, and higher stability than the original algorithm and other related methods.
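
    A hedged sketch of the hybrid idea follows: a crude population-based global step (a greatly simplified stand-in for the imperialist competitive algorithm) alternating with SciPy's Nelder-Mead local refinement. It illustrates only the alternation pattern, not the authors' exact algorithm.

```python
# Hedged sketch: alternating a simplified population-based global search with
# Nelder-Mead local polishing, mirroring the paper's hybrid structure.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

def sphere(x):                 # benchmark objective (global minimum 0 at origin)
    return float(np.sum(x ** 2))

dim, pop_size = 5, 30
pop = rng.uniform(-5, 5, size=(pop_size, dim))

for it in range(20):
    # "Imperialist"-style move: drift every solution toward the current best
    best = pop[np.argmin([sphere(x) for x in pop])].copy()
    pop += 0.5 * (best - pop) + rng.normal(0, 0.1, pop.shape)

    # Alternate with a Nelder-Mead polish of the best candidate
    res = minimize(sphere, best, method="Nelder-Mead",
                   options={"maxiter": 50, "xatol": 1e-8, "fatol": 1e-8})
    pop[0] = res.x             # re-inject the refined solution

print("best value found:", min(sphere(x) for x in pop))
```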

  5. How Qualitative Methods Can be Used to Inform Model Development.

    PubMed

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  6. Newton's method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    More, J. J.; Sorensen, D. C.

    1982-02-01

    Newton's method plays a central role in the development of numerical techniques for optimization. In fact, most of the current practical methods for optimization can be viewed as variations on Newton's method. It is therefore important to understand Newton's method as an algorithm in its own right and as a key introduction to the most recent ideas in this area. One of the aims of this expository paper is to present and analyze two main approaches to Newton's method for unconstrained minimization: the line search approach and the trust region approach. The other aim is to present some of the more recent developments in the optimization field which are related to Newton's method. In particular, we explore several variations on Newton's method which are appropriate for large scale problems, and we also show how quasi-Newton methods can be derived quite naturally from Newton's method.
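
    A minimal sketch of the line search approach discussed above, assuming the standard Rosenbrock test function: take the Newton direction where the Hessian is positive definite, fall back to steepest descent otherwise, and backtrack until an Armijo sufficient-decrease condition holds.

```python
# Globalized Newton's method with Armijo backtracking line search,
# demonstrated on the Rosenbrock function (minimum at (1, 1)).
import numpy as np

def f(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def grad(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])

def hess(x):
    return np.array([
        [2 - 400 * (x[1] - 3 * x[0] ** 2), -400 * x[0]],
        [-400 * x[0], 200.0],
    ])

x = np.array([-1.2, 1.0])
for _ in range(100):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    d = np.linalg.solve(hess(x), -g)      # Newton direction
    if g @ d >= 0:                        # not a descent direction (Hessian
        d = -g                            # not positive definite): fall back
    t = 1.0
    while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):   # Armijo backtracking
        t *= 0.5
    x = x + t * d

print("minimizer ~", x)   # converges to (1, 1)
```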

  7. Setting Objectives of Value Education in Constructivist Approach in the Light of Revised Blooms Taxonomy (RBT)

    ERIC Educational Resources Information Center

    Paleeri, Sankaranarayanan

    2015-01-01

    Transaction methods and approaches of value education have to change from lecturing to process based methods according to the development of constructivist approach. The process based methods provide creative interpretation and active participation from student side. Teachers have to organize suitable activities to transact values through process…

  8. The Development of a Robot-Based Learning Companion: A User-Centered Design Approach

    ERIC Educational Resources Information Center

    Hsieh, Yi-Zeng; Su, Mu-Chun; Chen, Sherry Y.; Chen, Gow-Dong

    2015-01-01

    A computer-vision-based method is widely employed to support the development of a variety of applications. In this vein, this study uses a computer-vision-based method to develop a playful learning system, which is a robot-based learning companion named RobotTell. Unlike existing playful learning systems, a user-centered design (UCD) approach is…

  9. Combining existing numerical models with data assimilation using weighted least-squares finite element methods.

    PubMed

    Rajaraman, Prathish K; Manteuffel, T A; Belohlavek, M; Heys, Jeffrey J

    2017-01-01

    A new approach has been developed for combining and enhancing the results from an existing computational fluid dynamics model with experimental data using the weighted least-squares finite element method (WLSFEM). Development of the approach was motivated by the existence of both limited experimental blood velocity data in the left ventricle and inexact numerical models of the same flow. Limitations of the experimental data include measurement noise and having data only along a two-dimensional plane. Most numerical modeling approaches do not provide the flexibility to assimilate noisy experimental data. We previously developed an approach that could assimilate experimental data into the process of numerically solving the Navier-Stokes equations, but the approach was limited because it required the use of specific finite element methods for solving all model equations and did not support alternative numerical approximation methods. The new approach presented here allows virtually any numerical method to be used for approximately solving the Navier-Stokes equations, and then the WLSFEM is used to combine the experimental data with the numerical solution of the model equations in a final step. The approach dynamically adjusts the influence of the experimental data on the numerical solution so that more accurate data are more closely matched by the final solution and less accurate data are not closely matched. The new approach is demonstrated on different test problems and provides significantly reduced computational costs compared with many previous methods for data assimilation.
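
    The core weighted least-squares idea can be shown on a toy problem: minimize a combined functional in which model-equation residuals and data mismatches are traded off through per-point weights, so that more trusted data pull the solution harder. The tridiagonal system below is a 1-D stand-in for the paper's Navier-Stokes/FEM setting, and all numbers are invented.

```python
# Hedged toy sketch of weighted least-squares data assimilation: find u
# minimizing ||A u - b||^2 + sum_i w_i (u[k_i] - d_i)^2, so accurate data
# (large w_i) influence the solution more than noisy data.
import numpy as np

n = 50
# 1-D Poisson-like tridiagonal "model" system (stand-in for the CFD model)
A = (np.diag(np.full(n, 2.0))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1))
b = np.full(n, 0.01)                     # model forcing

# Sparse, noisy "experimental" data at a few interior points
idx  = np.array([10, 25, 40])
data = np.array([0.9, 1.4, 0.8])
w    = np.array([100.0, 100.0, 1.0])     # last observation is least trusted

# Normal equations of the combined functional:
#   (A^T A + H^T W H) u = A^T b + H^T W d
H = np.zeros((3, n))
H[np.arange(3), idx] = 1.0               # observation operator
W = np.diag(w)
u = np.linalg.solve(A.T @ A + H.T @ W @ H, A.T @ b + H.T @ W @ data)

print("solution at data points:", u[idx].round(3), "vs data", data)
```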

  10. A High School English Chairman Looks at the Humanities Approach for All Students.

    ERIC Educational Resources Information Center

    Mersand, Joseph

    As the humanities approach to education expands, the high school English chairman can aid his teachers in developing this approach by suggesting useful teaching methods, praising successful teaching techniques, and permitting teachers to pursue independent methods of instruction. One method is an in-depth study of an historical period (such as the…

  11. Engineering large-scale agent-based systems with consensus

    NASA Technical Reports Server (NTRS)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge-based agents (KBAs) which engage in a collaborative problem-solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code, thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  12. The Continuing Search to Find a More Effective and Less Intimidating Way to Teach Research Methods in Higher Education

    ERIC Educational Resources Information Center

    Bell, Robin

    2016-01-01

    Existing literature examining the teaching of research methods highlights difficulties students face when developing research competencies. Studies of student-centred teaching approaches have found increased student performance and improved confidence in undertaking research projects. To develop a student-centred approach, it could be beneficial…

  13. The Paradigm Revolution in Inquiry: Implications for Vocational Research and Development.

    ERIC Educational Resources Information Center

    Guba, Egon G.

    While the rationalistic approach traditionally employed in research and development efforts in the social sciences may be the best method of inquiry to use in the physical sciences, social scientists, and, more particularly, vocational education researchers, would do better to adopt a naturalistic method of research. The naturalist approach to…

  14. Evaluation of evidence-based literature and formulation of recommendations for the clinical preventive guidelines for immigrants and refugees in Canada

    PubMed Central

    Tugwell, Peter; Pottie, Kevin; Welch, Vivian; Ueffing, Erin; Chambers, Andrea; Feightner, John

    2011-01-01

    Background: This article describes the evidence review and guideline development method developed for the Clinical Preventive Guidelines for Immigrants and Refugees in Canada by the Canadian Collaboration for Immigrant and Refugee Health Guideline Committee. Methods: The Appraisal of Guidelines for Research and Evaluation (AGREE) best-practice framework was combined with the recently developed Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to produce evidence-based clinical guidelines for immigrants and refugees in Canada. Results: A systematic approach was designed to produce the evidence reviews and apply the GRADE approach, including building on evidence from previous systematic reviews, searching for and comparing evidence between general and specific immigrant populations, and applying the GRADE criteria for making recommendations. This method was used for priority health conditions that had been selected by practitioners caring for immigrants and refugees in Canada. Interpretation: This article outlines the 14-step method that was defined to standardize the guideline development process for each priority health condition. PMID:20573711

  15. A Mixed-Methods Analysis in Assessing Students' Professional Development by Applying an Assessment for Learning Approach.

    PubMed

    Peeters, Michael J; Vaidya, Varun A

    2016-06-25

    Objective. To describe an approach for assessing the Accreditation Council for Pharmacy Education's (ACPE) doctor of pharmacy (PharmD) Standard 4.4, which focuses on students' professional development. Methods. This investigation used mixed methods with triangulation of qualitative and quantitative data to assess professional development. Qualitative data came from an electronic developmental portfolio of professionalism and ethics, completed by PharmD students during their didactic studies. Quantitative confirmation came from the Defining Issues Test (DIT)-an assessment of pharmacists' professional development. Results. Qualitatively, students' development reflections described growth through this course series. Quantitatively, the 2015 PharmD class's DIT N2-scores illustrated positive development overall; the lower 50% had a large initial improvement compared to the upper 50%. Subsequently, the 2016 PharmD class confirmed these average initial improvements of students and also showed further substantial development among students thereafter. Conclusion. Applying an assessment for learning approach, triangulation of qualitative and quantitative assessments confirmed that PharmD students developed professionally during this course series.

  16. Methods for the guideline-based development of quality indicators--a systematic review

    PubMed Central

    2012-01-01

    Background: Quality indicators (QIs) are used in many healthcare settings to measure, compare, and improve quality of care. For the efficient development of high-quality QIs, rigorous, approved, and evidence-based development methods are needed. Clinical practice guidelines are a suitable source to derive QIs from, but no gold standard for guideline-based QI development exists. This review aims to identify, describe, and compare methodological approaches to guideline-based QI development. Methods: We systematically searched medical literature databases (Medline, EMBASE, and CINAHL) and grey literature. Two researchers selected publications reporting methodological approaches to guideline-based QI development. In order to describe and compare methodological approaches used in these publications, we extracted detailed information on common steps of guideline-based QI development (topic selection, guideline selection, extraction of recommendations, QI selection, practice test, and implementation) to predesigned extraction tables. Results: From 8,697 hits in the database search and several grey literature documents, we selected 48 relevant references. The studies were of heterogeneous type and quality. We found no randomized controlled trial or other studies comparing the ability of different methodological approaches to guideline-based development to generate high-quality QIs. The relevant publications featured a wide variety of methodological approaches to guideline-based QI development, especially regarding guideline selection and extraction of recommendations. Only a few studies reported patient involvement. Conclusions: Further research is needed to determine which elements of the methodological approaches identified, described, and compared in this review are best suited to constitute a gold standard for guideline-based QI development. For this research, we provide a comprehensive groundwork. PMID:22436067

  17. Developing comparative criminology and the case of China: an introduction.

    PubMed

    Liu, Jianhong

    2007-02-01

    Although comparative criminology has made significant development during the past decade or so, systematic empirical research has only developed along a few topics. Comparative criminology has never occupied a central position in criminology. This article analyzes the major theoretical and methodological impediments in the development of comparative criminology. It stresses a need to shift methodology from a conventional primary approach that uses the nation as the unit of analysis to an in-depth case study method as a primary methodological approach. The article maintains that case study method can overcome the limitation of its descriptive tradition and become a promising methodological approach for comparative criminology.

  18. HPLC-MS/MS method for dexmedetomidine quantification with Design of Experiments approach: application to pediatric pharmacokinetic study.

    PubMed

    Szerkus, Oliwia; Struck-Lewicka, Wiktoria; Kordalewska, Marta; Bartosińska, Ewa; Bujak, Renata; Borsuk, Agnieszka; Bienert, Agnieszka; Bartkowska-Śniatkowska, Alicja; Warzybok, Justyna; Wiczling, Paweł; Nasal, Antoni; Kaliszan, Roman; Markuszewski, Michał Jan; Siluk, Danuta

    2017-02-01

    The purpose of this work was to develop and validate a rapid and robust LC-MS/MS method for the determination of dexmedetomidine (DEX) in plasma, suitable for analysis of a large number of samples. A systematic approach, Design of Experiments, was applied to optimize ESI source parameters and to evaluate method robustness; a rapid, stable and cost-effective assay was therefore developed. Results: The method was validated according to US FDA guidelines. The LLOQ was determined at 5 pg/ml, and the assay was linear over the examined concentration range (5-2500 pg/ml; R² > 0.98). The accuracies and the intra- and interday precisions were within 15%. The stability data confirmed reliable behavior of DEX under the tested conditions. Application of the Design of Experiments approach allowed for fast and efficient analytical method development and validation, as well as reduced usage of the chemicals necessary for regular method optimization. The proposed method was applied to the determination of DEX pharmacokinetics in pediatric patients undergoing long-term sedation in the intensive care unit.
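
    As an illustration of the Design of Experiments style of optimization mentioned above, the sketch below builds a two-level full factorial over three ESI source parameters and estimates main effects on a simulated response. Factor names, levels, and responses are hypothetical placeholders, not the study's settings.

```python
# Hedged sketch of a two-level full factorial DoE with main-effect estimation.
# Factors, levels, and the simulated responses are invented for illustration.
import itertools
import numpy as np

factors = {
    "capillary_kV":  (2.5, 4.0),
    "gas_temp_C":    (300, 400),
    "nebulizer_psi": (20, 40),
}
# Coded design matrix: all +/-1 combinations, 2^3 = 8 runs
design = np.array(list(itertools.product(*[(-1, 1)] * len(factors))))

# Simulated responses for the 8 runs (e.g., normalized analyte peak areas)
rng = np.random.default_rng(4)
true_effects = np.array([0.8, 0.1, -0.4])
response = design @ true_effects + rng.normal(0, 0.05, len(design))

# Main effect of each factor = mean(response at high) - mean(response at low)
for j, name in enumerate(factors):
    effect = (response[design[:, j] == 1].mean()
              - response[design[:, j] == -1].mean())
    print(f"{name}: main effect = {effect:+.2f}")
```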

  19. An integrated bioanalytical method development and validation approach: case studies.

    PubMed

    Xue, Y-J; Melo, Brian; Vallejo, Martha; Zhao, Yuwen; Tang, Lina; Chen, Yuan-Shek; Keller, Karin M

    2012-10-01

    We proposed an integrated bioanalytical method development and validation approach: (1) method screening based on the analyte's physicochemical properties and metabolism information to determine the most appropriate extraction/analysis conditions; (2) preliminary stability evaluation using both quality control and incurred samples to establish sample collection, storage and processing conditions; (3) mock validation to examine method accuracy and precision and incurred sample reproducibility; and (4) method validation to confirm the results obtained during method development. This integrated approach was applied to the determination of compound I in rat plasma and compound II in rat and dog plasma. The effectiveness of the approach was demonstrated by the superior quality of three method validations: (1) a zero run failure rate; (2) >93% of quality control results within 10% of nominal values; and (3) 99% of incurred samples within 9.2% of the original values. In addition, rat and dog plasma methods for compound II were successfully applied to analyze more than 900 plasma samples obtained from Investigational New Drug (IND) toxicology studies in rats and dogs with near-perfect results: (1) a zero run failure rate; (2) excellent accuracy and precision for standards and quality controls; and (3) 98% of incurred samples within 15% of the original values.

  20. DEVELOPMENT OF SAMPLING AND ANALYTICAL METHODS FOR THE MEASUREMENT OF NITROUS OXIDE FROM FOSSIL FUEL COMBUSTION SOURCES

    EPA Science Inventory

    The report documents the technical approach and results achieved while developing a grab sampling method and an automated, on-line gas chromatography method suitable to characterize nitrous oxide (N2O) emissions from fossil fuel combustion sources. The two methods developed have...

  1. Future Issues and Perspectives in the Evaluation of Social Development.

    ERIC Educational Resources Information Center

    Marsden, David; Oakley, Peter

    1991-01-01

    An instrumental/technocratic approach to evaluation of social development relies on primarily quantitative methods. An interpretive approach resists claims to legitimacy and authority of "experts" and questions existing interpretations. The latter approach is characterized by cultural relativism and subjectivity. (SK)

  2. Inquiring into Three Approaches to Hands-On Learning in Elementary and Secondary Science Methods Courses.

    ERIC Educational Resources Information Center

    Barnes, Marianne B.; Foley, Kathleen R.

    1999-01-01

    Investigates three approaches to hands-on science learning in two contexts, an elementary science methods class and a secondary science methods class. Focused on an activity on foam. Concludes that when developing models for teaching science methods courses, methods instructors need to share power with prospective teachers. (Author/MM)

  3. A MULTIPLE TESTING OF THE ABC METHOD AND THE DEVELOPMENT OF A SECOND GENERATION MODEL. PART I, PRELIMINARY DISCUSSIONS OF METHODOLOGY. SUPPLEMENT, COMPUTER PROGRAMS OF THE HDL INFORMATION SYSTEMS.

    ERIC Educational Resources Information Center

    ALTMANN, BERTHOLD; BROWN, WILLIAM G.

    THE FIRST-GENERATION APPROACH BY CONCEPT (ABC) STORAGE AND RETRIEVAL METHOD, A METHOD WHICH UTILIZES AS A SUBJECT APPROACH APPROPRIATE STANDARDIZED ENGLISH-LANGUAGE STATEMENTS PROCESSED AND PRINTED IN A PERMUTED INDEX FORMAT, UNDERWENT A PERFORMANCE TEST, THE PRIMARY OBJECTIVE OF WHICH WAS TO SPOT DEFICIENCIES AND TO DEVELOP A SECOND-GENERATION…

  4. Developing Critical Understanding in HRM Students: Using Innovative Teaching Methods to Encourage Deep Approaches to Study

    ERIC Educational Resources Information Center

    Butler, Michael J. R.; Reddy, Peter

    2010-01-01

    Purpose: This paper aims to focus on developing critical understanding in human resource management (HRM) students in Aston Business School, UK. The paper reveals that innovative teaching methods encourage deep approaches to study, an indicator of students reaching their own understanding of material and ideas. This improves student employability…

  5. Using Student-Centered Cases in the Classroom: An Action Inquiry Approach to Leadership Development

    ERIC Educational Resources Information Center

    Foster, Pacey; Carboni, Inga

    2009-01-01

    This article addresses the concern that business schools are not adequately developing the practical leadership skills that are required in the real world of management. The article begins by discussing the limitations of traditional case methods for teaching behavioral skills. This approach is contrasted with an alternative case method drawn from…

  6. Recommendations for the evaluation of specimen stability for flow cytometric testing during drug development.

    PubMed

    Brown, Lynette; Green, Cherie L; Jones, Nicholas; Stewart, Jennifer J; Fraser, Stephanie; Howell, Kathy; Xu, Yuanxin; Hill, Carla G; Wiwi, Christopher A; White, Wendy I; O'Brien, Peter J; Litwin, Virginia

    2015-03-01

    The objective of this manuscript is to present an approach for evaluating specimen stability for flow cytometric methods used during drug development. While this approach specifically addresses stability assessment for assays to be used in clinical trials with centralized testing facilities, the concepts can be applied to any stability assessment for flow cytometric methods. The proposed approach is implemented during assay development and optimization, and includes suggestions for designing a stability assessment plan, data evaluation and acceptance criteria. Given that no single solution will be applicable in all scenarios, this manuscript offers the reader a roadmap for stability assessment and is intended to guide the investigator during both the method development phase and in the experimental design of the validation plan.

  7. Development of a High-Order Space-Time Matrix-Free Adjoint Solver

    NASA Technical Reports Server (NTRS)

    Ceze, Marco A.; Diosady, Laslo T.; Murman, Scott M.

    2016-01-01

    The growth in computational power and algorithm development in the past few decades has granted the science and engineering community the ability to simulate flows over complex geometries, thus making Computational Fluid Dynamics (CFD) tools indispensable in analysis and design. Currently, one of the pacing items limiting the utility of CFD for general problems is the prediction of unsteady turbulent flows [1-3]. Reynolds-averaged Navier-Stokes (RANS) methods, which predict a time-invariant mean flowfield, struggle to provide consistent predictions when encountering even mild separation, such as the side-of-body separation at a wing-body junction. NASA's Transformative Tools and Technologies project is developing both numerical methods and physical modeling approaches to improve the prediction of separated flows. A major focus of this effort is efficient methods for resolving the unsteady fluctuations occurring in these flows to provide valuable engineering data of the time-accurate flow field for buffet analysis, vortex shedding, etc. This approach encompasses unsteady RANS (URANS), large-eddy simulations (LES), and hybrid LES-RANS approaches such as Detached Eddy Simulations (DES). These unsteady approaches are inherently more expensive than traditional engineering RANS approaches, hence every effort to mitigate this cost must be leveraged. Arguably, the most cost-effective approach to improve the efficiency of unsteady methods is the optimal placement of the spatial and temporal degrees of freedom (DOF) using solution-adaptive methods.

  8. Value Contestations in Development Intervention: Community Development and Sustainable Livelihoods Approaches.

    ERIC Educational Resources Information Center

    Arce, Alberto

    2003-01-01

    Both community development and sustainable livelihood approaches ignore value contestations that underlie people's interests and experiences. A case from Bolivia demonstrates that local values, social relations, actions, and language strategies must underlie policy and method in development. (Contains 28 references.) (SK)

  9. Toward a Method for Exposing and Elucidating Ethical Issues with Human Cognitive Enhancement Technologies.

    PubMed

    Hofmann, Bjørn

    2017-04-01

    To develop a method for exposing and elucidating ethical issues with human cognitive enhancement (HCE). The intended use of the method is to support and facilitate open and transparent deliberation and decision making with respect to this emerging technology with great potential formative implications for individuals and society. Literature search to identify relevant approaches. Conventional content analysis of the identified papers and methods in order to assess their suitability for assessing HCE according to four selection criteria. Method development. Amendment after pilot testing on smart-glasses. Based on three existing approaches in health technology assessment a method for exposing and elucidating ethical issues in the assessment of HCE technologies was developed. Based on a pilot test for smart-glasses, the method was amended. The method consists of six steps and a guiding list of 43 questions. A method for exposing and elucidating ethical issues in the assessment of HCE was developed. The method provides the ground work for context specific ethical assessment and analysis. Widespread use, amendments, and further developments of the method are encouraged.

  10. An Autonomous Sensor Tasking Approach for Large Scale Space Object Cataloging

    NASA Astrophysics Data System (ADS)

    Linares, R.; Furfaro, R.

    The field of Space Situational Awareness (SSA) has progressed over the last few decades with new sensors coming online, the development of new approaches for making observations, and new algorithms for processing them. Although there has been success in the development of new approaches, a missing piece is the translation of SSA goals into sensor and resource allocation, otherwise known as the Sensor Management Problem (SMP). This work solves the SMP using an artificial intelligence approach called Deep Reinforcement Learning (DRL). Stable methods for training DRL approaches based on neural networks exist, but most of these approaches are not suitable for high dimensional systems. The Asynchronous Advantage Actor-Critic (A3C) method is a recently developed and effective approach for high dimensional systems, and this work leverages these results and applies the approach to decision making in SSA. The decision space for SSA problems can be high dimensional, even for the tasking of a single telescope. Since the number of SOs in space is relatively high, each sensor will have a large number of possible actions at a given time. Therefore, efficient DRL approaches are required when solving the SMP for SSA. This work develops an A3C-based method for DRL applied to SSA sensor tasking. One of the key benefits of DRL approaches is the ability to handle high dimensional data; for example, DRL methods have been applied to image processing in autonomous driving, where a 256x256 RGB image has 196,608 parameters (256*256*3 = 196,608) and deep learning approaches routinely take such images as inputs. Therefore, when applied to the whole catalog, the DRL approach offers the ability to solve this high dimensional problem. This work has the potential to, for the first time, solve the non-myopic sensor tasking problem for the whole SO catalog (over 22,000 objects), providing a truly revolutionary result.
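
    Below is a heavily simplified, single-threaded sketch of the advantage actor-critic idea behind A3C, applied to a toy sensor-tasking problem: one telescope chooses which of N objects to observe and is rewarded by the uncertainty reduction it achieves. Real A3C uses neural-network function approximation and asynchronous parallel workers; everything here (the uncertainty dynamics, the reward, the tabular softmax policy) is an assumed toy model, not the paper's implementation.

```python
# Hedged toy sketch: tabular advantage actor-critic for sensor tasking.
import numpy as np

rng = np.random.default_rng(5)
N, steps, lr = 10, 5000, 0.05

theta = np.zeros(N)            # actor: softmax logits over "observe object i"
value = 0.0                    # critic: scalar state-value baseline
sigma = rng.uniform(1, 5, N)   # per-object uncertainty (toy catalog state)

for _ in range(steps):
    probs = np.exp(theta - theta.max())
    probs /= probs.sum()
    a = rng.choice(N, p=probs)          # pick an object to observe

    reward = 0.5 * sigma[a]             # reward = uncertainty removed
    sigma[a] *= 0.5                     # observation shrinks uncertainty
    sigma += 0.05                       # uncertainty grows between looks

    advantage = reward - value          # one-step advantage vs. baseline
    value += lr * advantage             # critic update
    grad_log = -probs.copy()
    grad_log[a] += 1.0                  # gradient of log-softmax at action a
    theta += lr * advantage * grad_log  # actor (policy-gradient) update

print("learned observation probabilities:", probs.round(2))
```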

  11. Function-Task-Competency Approach to Curriculum Development in Vocational Education in Agriculture: Research Report No. 1. Project Background, Plan, and Model Development.

    ERIC Educational Resources Information Center

    Matteson, Harold R.

    The report explains the construction of the function-task-competency method of developing vocational education curricula in agriculture at the secondary and postsecondary levels. It discusses at some length five approaches to the development of vocational education curricula used in the past: the subject approach (which centers on subjects taught…

  12. The effects of a professional development program for physics teachers on their teaching and the learning of their students

    NASA Astrophysics Data System (ADS)

    Lee, Mee-Kyeong

    The purposes of the study were (1) to investigate the effects of the 2000 Iowa Professional Development Program on classroom teaching and student learning and (2) to examine the effectiveness of Constructivist/STS approaches in terms of student perceptions regarding their science classrooms, student attitudes toward science, and student creativity. The 2000 Iowa Professional Development Program which focused on Constructivist/STS approaches was carried out at the University of Iowa for visiting Korean physics teachers. Several methods of data collection were used, including observations by means of classroom videotapes, teacher perception surveys, teacher interviews, and student surveys. The data collected was analyzed using both quantitative and qualitative methods. Major findings include: (1) The 2000 Iowa Professional Development Program did not significantly influence teacher perceptions concerning their teaching in terms of Constructivist/STS approaches in their classrooms. (2) The 2000 Iowa Professional Development Program significantly influenced improvement in teaching practices regarding Constructivist/STS approaches. (3) Students taught with Constructivist/STS approaches perceived their learning environments as more constructivist than did those taught with traditional methods. (4) Students taught with Constructivist/STS approaches improved significantly in the development of more positive attitudes toward science, while such positive attitudes decreased among students taught with traditional methods. (5) Students taught with Constructivist/STS approaches improved significantly in their use of creativity skills over those taught in traditional classrooms. (6) Most teachers favored the implementation of Constructivist/STS approaches. They perceived that students became more interested in lessons utilizing such approaches over time. The major difficulties which the teachers experienced with regard to the implementation of Constructivist/STS teaching include: inability to cover required curriculum content; getting away from textbooks; acceptance by parents, community, and supervisors; motivating students to be involved in classroom activities; and lack of materials for Constructivist/STS teaching. The results imply that efforts to improve educational conditions, in tandem with more consistent and ongoing professional development programs, are necessary to encourage teachers to use what they learned, to keep their initial interest and ideas alive, and to contribute specifically to the reform of science education.

  13. An adaptive signal-processing approach to online adaptive tutoring.

    PubMed

    Bergeron, Bryan; Cline, Andrew

    2011-01-01

    Conventional intelligent or adaptive online tutoring systems rely on domain-specific models of learner behavior based on rules, deep domain knowledge, and other resource-intensive methods. We have developed and studied a domain-independent methodology of adaptive tutoring based on signal-processing approaches that obviate the need for constructing explicit expert and student models. A key advantage of our method over conventional approaches is a lower barrier to entry for educators who want to develop adaptive online learning materials.

  14. A multivariate quadrature based moment method for LES based modeling of supersonic combustion

    NASA Astrophysics Data System (ADS)

    Donde, Pratik; Koo, Heeseok; Raman, Venkat

    2012-07-01

    The transported probability density function (PDF) approach is a powerful technique for large eddy simulation (LES) based modeling of scramjet combustors. In this approach, a high-dimensional transport equation for the joint composition-enthalpy PDF needs to be solved. Quadrature based approaches provide deterministic Eulerian methods for solving the joint-PDF transport equation. In this work, it is first demonstrated that the numerical errors associated with LES require special care in the development of PDF solution algorithms. The direct quadrature method of moments (DQMOM) is one quadrature-based approach developed for supersonic combustion modeling. This approach is shown to generate inconsistent evolution of the scalar moments. Further, gradient-based source terms that appear in the DQMOM transport equations are severely underpredicted in LES leading to artificial mixing of fuel and oxidizer. To overcome these numerical issues, a semi-discrete quadrature method of moments (SeQMOM) is formulated. The performance of the new technique is compared with the DQMOM approach in canonical flow configurations as well as a three-dimensional supersonic cavity stabilized flame configuration. The SeQMOM approach is shown to predict subfilter statistics accurately compared to the DQMOM approach.
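
    As a point of reference (not the paper's solver), the quadrature idea can be shown in a few lines: the subfilter PDF is approximated by a small set of weighted delta functions, so every scalar moment collapses to a finite sum. Weights and abscissas below are invented.

    ```python
    # Minimal sketch of the quadrature representation behind (DQ)MOM:
    # f(phi) ~ sum_n w_n * delta(phi - phi_n), so m_k = sum_n w_n * phi_n**k.
    import numpy as np

    weights = np.array([0.3, 0.5, 0.2])    # quadrature weights w_n (toy)
    abscissas = np.array([0.1, 0.5, 0.9])  # composition nodes phi_n (toy)

    def moment(k):
        """k-th moment of the delta-sum PDF."""
        return float(np.sum(weights * abscissas**k))

    print([moment(k) for k in range(4)])   # m_0 .. m_3
    ```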

  15. Spatial weighting approach in numerical method for disaggregation of MDGs indicators

    NASA Astrophysics Data System (ADS)

    Permai, S. D.; Mukhaiyar, U.; Satyaning PP, N. L. P.; Soleh, M.; Aini, Q.

    2018-03-01

    Disaggregation is used to separate and classify data according to certain characteristics or administrative levels. Disaggregated data are very important because some indicators are not measured for all characteristics. Detailed disaggregation of development indicators is important to ensure that everyone benefits from development and to support better development-related policymaking. This paper explores different methods of disaggregating the national employment-to-population ratio indicator to the province and city levels. A numerical approach is applied to overcome the unavailability of disaggregated data by constructing several spatial weight matrices based on neighbourhood, Euclidean distance, and correlation. These methods can potentially be used and further developed to disaggregate development indicators to lower spatial levels, even by several demographic characteristics.
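
    A minimal sketch of one of the weighting schemes named above, assuming invented region centroids: an inverse-Euclidean-distance matrix is row-normalised so each region's weights sum to one, and an unobserved region is then estimated as the weighted mean of its neighbours.

    ```python
    import numpy as np

    # Toy centroids of four regions (coordinates are assumptions)
    coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

    W = np.zeros_like(d)
    W[d > 0] = 1.0 / d[d > 0]          # inverse-distance weights, zero diagonal
    W /= W.sum(axis=1, keepdims=True)  # row-normalise

    # Estimate region 1's unobserved indicator from observed neighbours 0, 2, 3
    observed = {0: 0.60, 2: 0.65, 3: 0.58}  # hypothetical ratios
    idx = list(observed)
    w = W[1, idx] / W[1, idx].sum()
    print(float(w @ np.array([observed[i] for i in idx])))
    ```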

  16. Introductory Guide to the Statistics of Molecular Genetics

    ERIC Educational Resources Information Center

    Eley, Thalia C.; Rijsdijk, Fruhling

    2005-01-01

    Background: This introductory guide presents the main two analytical approaches used by molecular geneticists: linkage and association. Methods: Traditional linkage and association methods are described, along with more recent advances in methodologies such as those using a variance components approach. Results: New methods are being developed all…

  17. Research approaches to mass casualty incidents response: development from routine perspectives to complexity science.

    PubMed

    Shen, Weifeng; Jiang, Libing; Zhang, Mao; Ma, Yuefeng; Jiang, Guanyu; He, Xiaojun

    2014-01-01

    To systematically review research methods for mass casualty incidents (MCIs) and to introduce the concept and characteristics of complexity science and of the artificial systems, computational experiments and parallel execution (ACP) method. We searched PubMed, Web of Knowledge, China Wanfang and China Biology Medicine (CBM) databases for relevant studies. Searches were performed without year or language restrictions and used combinations of the following key words: "mass casualty incident", "MCI", "research method", "complexity science", "ACP", "approach", "science", "model", "system" and "response". Only articles addressing the research methods of MCIs were included. Research methods for MCIs have multiplied markedly over the past few decades. At present, the dominant research methods for MCIs are the theory-based approach, the empirical approach, evidence-based science, mathematical modeling and computer simulation, simulation experiments, experimental methods, the scenario approach and complexity science. This article provides an overview of the development of research methodology for MCIs; the progress of routine research approaches and of complexity science is briefly presented. Furthermore, the authors conclude that the reductionism underlying the exact sciences is not suitable for complex MCI systems, and that the only feasible alternative is complexity science. Finally, the ACP method, which combines artificial systems, computational experiments and parallel execution, provides a new way to address research on complex MCIs.

  18. Management of Teacher Scientific-Methodical Work in Vocational Educational Institutions on the Basis of Project-Target Approach

    ERIC Educational Resources Information Center

    Shakuto, Elena A.; Dorozhkin, Evgenij M.; Kozlova, Anastasia A.

    2016-01-01

    The relevance of the subject under analysis is determined by the lack of theoretical development of the problem of management of teacher scientific-methodical work in vocational educational institutions based upon innovative approaches in the framework of project paradigm. The purpose of the article is to develop and test a science-based…

  19. Using a Mixed Methods Approach to Explore Strategies, Metacognitive Awareness and the Effects of Task Design on Listening Development

    ERIC Educational Resources Information Center

    O'Bryan, Anne; Hegelheimer, Volker

    2009-01-01

    Although research in the area of listening processes and strategies is increasing, it still remains the least understood and least researched of the four skills (Vandergrift, 2007). Based on research in listening comprehension, task design and strategies, this article uses a mixed methods approach to shed light on the development of four…

  20. Applying Program Theory-Driven Approach to Design and Evaluate a Teacher Professional Development Program

    ERIC Educational Resources Information Center

    Lin, Su-ching; Wu, Ming-sui

    2016-01-01

    This study was the first year of a two-year project which applied a program theory-driven approach to evaluating the impact of teachers' professional development interventions on students' learning by using a mix of methods, qualitative inquiry, and quasi-experimental design. The current study was to show the results of using the method of…

  1. The Application of a Multiphase Triangulation Approach to Mixed Methods: The Research of an Aspiring School Principal Development Program

    ERIC Educational Resources Information Center

    Youngs, Howard; Piggot-Irvine, Eileen

    2012-01-01

    Mixed methods research has emerged as a credible alternative to unitary research approaches. The authors show how a combination of a triangulation convergence model with a triangulation multilevel model was used to research an aspiring school principal development pilot program. The multilevel model is used to show the national and regional levels…

  2. Model correlation and damage location for large space truss structures: Secant method development and evaluation

    NASA Technical Reports Server (NTRS)

    Smith, Suzanne Weaver; Beattie, Christopher A.

    1991-01-01

    On-orbit testing of a large space structure will be required to complete the certification of any mathematical model for the structure's dynamic response. The process of establishing a mathematical model that matches measured structure response is referred to as model correlation. Most model correlation approaches include an identification technique to determine structural characteristics from measurements of the structure's response. This problem is approached with one particular class of identification techniques, matrix adjustment methods, which use measured data to produce an optimal update of the structure property matrix, often the stiffness matrix. New identification methods were developed to handle problems of the size and complexity expected for large space structures. Further development and refinement of these secant-method identification algorithms were undertaken. Also, evaluation of these techniques as an approach for model correlation and damage location was initiated.

  3. Modeling and analysis of cascade solar cells

    NASA Technical Reports Server (NTRS)

    Ho, F. D.

    1986-01-01

    A brief review is given of the present status of the development of cascade solar cells, through which photovoltaic efficiencies can be improved. The design and analysis of multijunction cells, however, are quite complicated. The main goal is to find a method that strikes a compromise between accuracy and simplicity in modeling a cascade solar cell. Three approaches are presently under way: (1) an equivalent circuit approach, (2) a numerical approach, and (3) an analytical approach. Here, the first and second approaches are discussed. The equivalent circuit approach, which applies SPICE (Simulation Program with Integrated Circuit Emphasis) to cascade cells and cascade-cell arrays, is highlighted, and methods of extracting parameters for modeling are discussed.
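
    A rough illustration of the equivalent-circuit idea (a sketch under assumed ideal-diode parameters, not the report's SPICE models): two series-connected subcells carry the same current, and the cascade terminal voltage is the sum of the subcell voltages.

    ```python
    import numpy as np

    KT_Q = 0.02585  # thermal voltage at ~300 K [V]

    def subcell_voltage(i, i_ph, i_0, n=1.0):
        """Invert the ideal diode law i = i_ph - i_0*(exp(v/(n*kT/q)) - 1)."""
        return n * KT_Q * np.log((i_ph - i + i_0) / i_0)

    i = np.linspace(0.0, 0.012, 200)          # series current [A] (toy range)
    v_top = subcell_voltage(i, 0.014, 1e-12)  # top junction (toy parameters)
    v_bot = subcell_voltage(i, 0.013, 1e-10)  # bottom junction (toy parameters)
    v_stack = v_top + v_bot                   # voltages add in series
    print(max(i * v_stack))                   # crude maximum-power estimate
    ```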

  4. Mass spectrometry-based protein identification by integrating de novo sequencing with database searching.

    PubMed

    Wang, Penghao; Wilson, Susan R

    2013-01-01

    Mass spectrometry-based protein identification is a very challenging task. The main identification approaches include de novo sequencing and database searching. Both approaches have shortcomings, so an integrative approach has been developed. The integrative approach first infers partial peptide sequences, known as tags, directly from tandem spectra through de novo sequencing, and then puts these sequences into a database search to see whether a close peptide match can be found. However, current implementations of this integrative approach have several limitations. Firstly, simplistic de novo sequencing is applied and only very short sequence tags are used. Secondly, most integrative methods apply a BLAST-like algorithm to search for exact sequence matches and do not accommodate sequence errors well. Thirdly, in these methods the integrated de novo sequencing makes a limited contribution to the scoring model, which is still largely based on database searching. We have developed a new integrative protein identification method which integrates de novo sequencing more efficiently into database searching. Evaluated on large real datasets, our method outperforms popular identification methods.
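
    The tag-matching step can be illustrated with a toy sketch (not the authors' algorithm): de novo tags are scanned against database sequences, here tolerating one residue mismatch rather than demanding exact matches.

    ```python
    def close_enough(tag, window, max_mismatch=1):
        """True if the window matches the tag with at most max_mismatch errors."""
        return sum(a != b for a, b in zip(tag, window)) <= max_mismatch

    def tag_search(tags, database):
        hits = []
        for name, seq in database.items():
            for tag in tags:
                for i in range(len(seq) - len(tag) + 1):
                    if close_enough(tag, seq[i:i + len(tag)]):
                        hits.append((name, tag, i))
        return hits

    toy_db = {"P1": "MKLVDENNAGTR", "P2": "GASDFKLQWERT"}  # invented sequences
    print(tag_search(["DENN", "KLQW"], toy_db))
    ```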

  5. Methodology in the Assessment of Construction and Development Investment Projects, Including the Graphic Multi-Criteria Analysis - a Systemic Approach

    NASA Astrophysics Data System (ADS)

    Szafranko, Elżbieta

    2017-10-01

    Assessment of the variant solutions developed for a building investment project needs to be made at the planning stage. While considering alternative solutions, the investor defines various criteria, but directly evaluating the degree to which the developed variants fulfil them can be very difficult. In practice, different methods enable the user to include a large number of parameters in an analysis, but their implementation can be challenging. Some methods require advanced mathematical computations preceded by complicated input-data processing, and the generated results may not lend themselves easily to interpretation. Hence, during her research, the author has developed a systemic approach which involves several methods and whose goal is to compare their outcomes. The final stage of the proposed method consists of a graphic interpretation of the results. The method has been tested on a variety of building and development projects.

  6. Development of Fabrication Methods of Filler/Polymer Nanocomposites: With Focus on Simple Melt-Compounding-Based Approach without Surface Modification of Nanofillers

    PubMed Central

    Tanahashi, Mitsuru

    2010-01-01

    Many attempts have been made to fabricate various types of inorganic nanoparticle-filled polymers (filler/polymer nanocomposites) by a mechanical or chemical approach. However, these approaches require modification of the nanofiller surfaces and/or complicated polymerization reactions, making them unsuitable for industrial-scale production of the nanocomposites. The author and coworkers have proposed a simple melt-compounding method for the fabrication of silica/polymer nanocomposites, wherein silica nanoparticles without surface modification were dispersed through the breakdown of loose agglomerates of colloidal nano-silica spheres in a kneaded polymer melt. This review aims to discuss experimental techniques of the proposed method and its advantages over other developed methods.

  7. A rapid, automated approach to optimisation of multiple reaction monitoring conditions for quantitative bioanalytical mass spectrometry.

    PubMed

    Higton, D M

    2001-01-01

    An improvement to the procedure for the rapid optimisation of mass spectrometry (PROMS), used to develop multiple reaction monitoring (MRM) methods for quantitative bioanalytical liquid chromatography/tandem mass spectrometry (LC/MS/MS), is presented. PROMS is an automated protocol that uses flow-injection analysis (FIA) and AppleScripts to create methods and acquire the data for optimisation. The protocol determines the optimum orifice potential and the MRM conditions for each compound, and finally creates the MRM methods needed for sample analysis. The sensitivities of the MRM methods created by PROMS approach those created manually. MRM method development using PROMS currently takes less than three minutes per compound, compared to at least fifteen minutes manually. To further enhance throughput, approaches to MRM optimisation using one injection per compound, two injections per pool of five compounds, and one injection per pool of five compounds have been investigated. No significant difference in the optimised instrumental parameters for MRM methods was found between the original PROMS approach and these new methods, which are up to ten times faster. The time taken for an AppleScript to determine the optimum conditions and build the MRM methods is the same with all approaches. Copyright 2001 John Wiley & Sons, Ltd.

  8. Group decision-making approach for flood vulnerability identification using the fuzzy VIKOR method

    NASA Astrophysics Data System (ADS)

    Lee, G.; Jun, K. S.; Cung, E. S.

    2014-09-01

    This study proposes an improved group decision making (GDM) framework that combines the VIKOR method with fuzzified data to quantify spatial flood vulnerability using multi-criteria evaluation indicators. In general, GDM is an effective tool for formulating a compromise solution among several decision makers, since stakeholders may have different perspectives on flood risk and vulnerability management responses. The GDM approach is designed to achieve consensus building that reflects the viewpoints of each participant. The fuzzy VIKOR method was developed to solve multi-criteria decision making (MCDM) problems with conflicting and noncommensurable criteria. This compromise-ranking method can be used to obtain a nearly ideal solution according to all established criteria. Triangular fuzzy numbers are used to represent the uncertainty of the weights and of the crisp data of the proxy variables. This approach can effectively propose compromise decisions by combining the GDM and fuzzy VIKOR methods. The spatial flood vulnerability of the south Han River obtained using the GDM approach combined with the fuzzy VIKOR method was compared with results from general MCDM methods, such as fuzzy TOPSIS, and from classical GDM methods, such as those developed by Borda, Condorcet, and Copeland. The evaluated priorities depended significantly on the decision-making method employed. The proposed fuzzy GDM approach can reduce uncertainty in the data confidence and weight-derivation techniques. Thus, the combination of the GDM approach with the fuzzy VIKOR method can provide robust prioritization because it actively reflects the opinions of various groups and considers uncertainty in the input data.
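
    For orientation, the crisp core of the VIKOR ranking can be sketched in a few lines (the paper's variant replaces the crisp matrix with triangular fuzzy numbers; all values below are invented): group utility S, individual regret R, and the compromise index Q are computed from weighted normalised distances to the ideal.

    ```python
    import numpy as np

    X = np.array([[0.7, 0.4, 0.9],   # alternatives x criteria (benefit-type, toy)
                  [0.5, 0.8, 0.6],
                  [0.9, 0.3, 0.5]])
    w = np.array([0.5, 0.3, 0.2])    # criterion weights (toy)

    f_best, f_worst = X.max(0), X.min(0)
    D = w * (f_best - X) / (f_best - f_worst)  # weighted normalised regrets
    S, R = D.sum(1), D.max(1)                  # group utility, individual regret
    v = 0.5                                    # weight of the majority strategy
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    print(Q.argsort())  # lower Q = closer to the compromise solution
    ```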

  9. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
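
    Two of the named metrics are simple enough to state directly; a minimal sketch with invented figures (discount rate, cash flows, and plant costs are all assumptions):

    ```python
    def npv(cashflows, r):
        """Net present value of a cash-flow series (year 0 first)."""
        return sum(cf / (1 + r) ** t for t, cf in enumerate(cashflows))

    def lcoe(capex, annual_cost, annual_mwh, r, years):
        """Levelized cost of energy: discounted costs over discounted output."""
        costs = capex + sum(annual_cost / (1 + r) ** t for t in range(1, years + 1))
        energy = sum(annual_mwh / (1 + r) ** t for t in range(1, years + 1))
        return costs / energy  # $/MWh

    print(npv([-1000.0, 300.0, 300.0, 300.0, 300.0], 0.05))
    print(lcoe(1.2e6, 3.0e4, 4.0e3, 0.05, 25))
    ```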

  10. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
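
    The contrast with the safety-factor approach can be made concrete with a toy Monte Carlo estimate of failure probability under assumed lognormal stress and strength distributions (all numbers invented, not from the report):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    stress = rng.lognormal(np.log(300.0), 0.10, n)    # applied stress [MPa], toy
    strength = rng.lognormal(np.log(450.0), 0.08, n)  # material strength [MPa], toy

    p_fail = float(np.mean(stress > strength))
    print(f"estimated P(failure) = {p_fail:.2e}")
    # A deterministic safety factor of 450/300 = 1.5 carries no such number.
    ```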

  11. Method of the active contour for segmentation of bone systems on bitmap images

    NASA Astrophysics Data System (ADS)

    Vu, Hai Anh; Safonov, Roman A.; Kolesnikova, Anna S.; Kirillova, Irina V.; Kossovich, Leonid U.

    2018-02-01

    Within the active contour framework, an approach is developed that extracts the contour of an object in an image during segmentation. The approach is faster than the parametric active contour method while conceding nothing to it in accuracy: it makes it possible to delineate an object's contour both with high accuracy and more quickly than the parametric method.

  12. Fourier analysis and signal processing by use of the Moebius inversion formula

    NASA Technical Reports Server (NTRS)

    Reed, Irving S.; Yu, Xiaoli; Shih, Ming-Tang; Tufts, Donald W.; Truong, T. K.

    1990-01-01

    A novel Fourier technique for digital signal processing is developed. This approach to Fourier analysis is based on the number-theoretic method of the Moebius inversion of series. The Fourier transform method developed is shown also to yield the convolution of two signals. A computer simulation shows that this method for finding Fourier coefficients is quite suitable for digital signal processing. It competes with the classical FFT (fast Fourier transform) approach in terms of accuracy, complexity, and speed.
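
    The underlying identity is easy to state and verify: if g(n) = sum over divisors d of n of f(d), then f(n) = sum over divisors d of n of mu(d)*g(n/d). A self-contained check (illustrative only; the paper applies the inversion to averaged signal samples):

    ```python
    def mobius(n):
        """Moebius function: 0 if n has a squared prime factor, else (-1)^k."""
        mu, p = 1, 2
        while p * p <= n:
            if n % p == 0:
                n //= p
                if n % p == 0:
                    return 0
                mu = -mu
            p += 1
        return -mu if n > 1 else mu

    def divisors(n):
        return [d for d in range(1, n + 1) if n % d == 0]

    f = {n: n * n for n in range(1, 13)}                # arbitrary test function
    g = {n: sum(f[d] for d in divisors(n)) for n in f}  # divisor sums
    f_back = {n: sum(mobius(d) * g[n // d] for d in divisors(n)) for n in f}
    assert f_back == f  # Moebius inversion recovers f exactly
    ```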

  13. A variational dynamic programming approach to robot-path planning with a distance-safety criterion

    NASA Technical Reports Server (NTRS)

    Suh, Suk-Hwan; Shin, Kang G.

    1988-01-01

    An approach to robot-path planning is developed by considering both the traveling distance and the safety of the robot. A computationally-efficient algorithm is developed to find a near-optimal path with a weighted distance-safety criterion by using a variational calculus and dynamic programming (VCDP) method. The algorithm is readily applicable to any factory environment by representing the free workspace as channels. A method for deriving these channels is also proposed. Although it is developed mainly for two-dimensional problems, this method can be easily extended to a class of three-dimensional problems. Numerical examples are presented to demonstrate the utility and power of this method.
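
    The weighted distance-safety criterion lends itself to a small dynamic-programming sketch (toy channel grid and weights, not the paper's VCDP formulation): each stage of the channel offers a row of lateral cells, danger grows near the walls, and DP picks the cheapest predecessor for each cell.

    ```python
    import math

    stages, cells, w = 6, 5, 0.7   # toy channel: 6 cross-sections, 5 lateral cells
    danger = [1.0 / (1 + min(c, cells - 1 - c)) for c in range(cells)]  # high at walls

    cost = [[math.inf] * cells for _ in range(stages)]
    back = [[0] * cells for _ in range(stages)]
    cost[0] = [(1 - w) * danger[c] for c in range(cells)]

    for s in range(1, stages):
        for c in range(cells):
            for p in range(cells):
                step = math.hypot(1.0, c - p)  # travel distance between stages
                trial = cost[s - 1][p] + w * step + (1 - w) * danger[c]
                if trial < cost[s][c]:
                    cost[s][c], back[s][c] = trial, p

    c = min(range(cells), key=lambda k: cost[-1][k])  # backtrack best end cell
    path = [c]
    for s in range(stages - 1, 0, -1):
        c = back[s][c]
        path.append(c)
    print(path[::-1])  # near-optimal lateral profile through the channel
    ```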

  14. Developing Evidence for Public Health Policy and Practice: The Implementation of a Knowledge Translation Approach in a Staged, Multi-Methods Study in England, 2007-09

    ERIC Educational Resources Information Center

    South, Jane; Cattan, Mima

    2014-01-01

    Effective knowledge translation processes are critical for the development of evidence-based public health policy and practice. This paper reports on the design and implementation of an innovative approach to knowledge translation within a mixed methods study on lay involvement in public health programme delivery. The study design drew on…

  15. The GRADE approach for assessing new technologies as applied to apheresis devices in ulcerative colitis

    PubMed Central

    2010-01-01

    Background In the last few years, a new non-pharmacological treatment, termed apheresis, has been developed to lessen the burden of ulcerative colitis (UC). Several methods can be used to establish treatment recommendations, but over the last decade an informal collaboration group of guideline developers, methodologists, and clinicians has developed a more sensible and transparent approach known as the Grading of Recommendations, Assessment, Development and Evaluation (GRADE). GRADE has mainly been used in clinical practice guidelines and systematic reviews. The aim of the present study is to describe the use of this approach in the development of recommendations for a new health technology, and to analyse the strengths, weaknesses, opportunities, and threats found when doing so. Methods A systematic review of the use of apheresis for UC treatment was performed in June 2004 and updated in May 2008. Two related clinical questions were selected, the outcomes of interest defined, and the quality of the evidence assessed. Finally, the overall quality of each question was taken into account to formulate recommendations following the GRADE approach. To evaluate this experience, a SWOT (strengths, weaknesses, opportunities and threats) analysis was performed to enable a comparison with our previous experience with the SIGN (Scottish Intercollegiate Guidelines Network) method. Results Application of the GRADE approach allowed recommendations to be formulated and the method to be clarified and made more explicit and transparent. Two weak recommendations were proposed to answer the formulated questions. Some challenges, such as the limited number of studies found for the new technology and the difficulties encountered when searching for the results for the selected outcomes, none of which are specific to GRADE, were identified. GRADE was considered to be a more time-consuming method, although it has the advantage of taking into account patient values when defining and grading the relevant outcomes, thereby avoiding any influence from literature precedents, which could be considered to be a strength of this method. Conclusions The GRADE approach could be appropriate for making the recommendation development process for Health Technology Assessment (HTA) reports more explicit, especially with regard to new technologies. PMID:20553616

  16. Development and Validation of HPLC-DAD and UHPLC-DAD Methods for the Simultaneous Determination of Guanylhydrazone Derivatives Employing a Factorial Design.

    PubMed

    Azevedo de Brito, Wanessa; Gomes Dantas, Monique; Andrade Nogueira, Fernando Henrique; Ferreira da Silva-Júnior, Edeildo; Xavier de Araújo-Júnior, João; Aquino, Thiago Mendonça de; Adélia Nogueira Ribeiro, Êurica; da Silva Solon, Lilian Grace; Soares Aragão, Cícero Flávio; Barreto Gomes, Ana Paula

    2017-08-30

    Guanylhydrazones are molecules with great pharmacological potential in various therapeutic areas, including antitumoral activity. Factorial design is an excellent tool for the optimization of a chromatographic method, because factors such as temperature, mobile phase composition, mobile phase pH, and column length can be changed quickly to establish the optimal conditions of analysis. The aim of the present work was to develop and validate HPLC and UHPLC methods for the simultaneous determination of guanylhydrazones with anticancer activity, employing experimental design. Precise, accurate, linear, and robust HPLC and UHPLC methods were developed and validated for the simultaneous quantification of the guanylhydrazones LQM10, LQM14, and LQM17. The UHPLC method was more economical, with four times lower solvent consumption and a 20 times smaller injection volume, which allowed better column performance. Comparing the empirical approach employed in the HPLC method development to the DoE approach employed in the UHPLC method development, we can conclude that the factorial design made method development faster, more practical, and more rational. This resulted in methods that can be employed in the analysis, evaluation, and quality control of these new synthetic guanylhydrazones.
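
    For illustration, a full factorial screen of three two-level chromatographic factors is just the Cartesian product of the levels; the factors and values below are assumptions, not the validated conditions of the paper.

    ```python
    from itertools import product

    factors = {                        # hypothetical factors and levels
        "column_temp_C": [25, 40],
        "organic_fraction_pct": [30, 40],
        "buffer_pH": [3.0, 4.5],
    }
    runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
    for i, run in enumerate(runs, 1):  # 2**3 = 8 experiments
        print(i, run)
    ```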

  17. Cost estimation: An expert-opinion approach. [cost analysis of research projects using the Delphi method (forecasting)

    NASA Technical Reports Server (NTRS)

    Buffalano, C.; Fogleman, S.; Gielecki, M.

    1976-01-01

    A methodology is outlined which can be used to estimate the costs of research and development projects. The approach uses the Delphi technique, a method developed by the Rand Corporation for systematically eliciting and evaluating group judgments in an objective manner. Use of the Delphi method allows for the integration of expert opinion into the cost-estimating process in a consistent and rigorous fashion. This approach can also signal potential cost-problem areas, a result that can be a useful tool in planning additional cost analysis or in estimating contingency funds. A Monte Carlo approach is also examined.
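
    The Monte Carlo step can be sketched directly (invented work packages and dollar figures): Delphi-elicited low/most-likely/high estimates become triangular distributions, and the summed draws give a cost distribution from which contingency percentiles are read off.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    packages = {                      # (low, most likely, high) in $k, hypothetical
        "design":      (80.0, 100.0, 150.0),
        "fabrication": (200.0, 260.0, 400.0),
        "test":        (50.0, 70.0, 120.0),
    }
    total = sum(rng.triangular(lo, mode, hi, 100_000)
                for lo, mode, hi in packages.values())
    print(np.percentile(total, [50, 80, 95]))  # median and contingency levels
    ```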

  18. Leveraging object-oriented development at Ames

    NASA Technical Reports Server (NTRS)

    Wenneson, Greg; Connell, John

    1994-01-01

    This paper presents lessons learned by the Software Engineering Process Group (SEPG) from results of supporting two projects at NASA Ames using an Object Oriented Rapid Prototyping (OORP) approach supported by a full featured visual development environment. Supplemental lessons learned from a large project in progress and a requirements definition are also incorporated. The paper demonstrates how productivity gains can be made by leveraging the developer with a rich development environment, correct and early requirements definition using rapid prototyping, and earlier and better effort estimation and software sizing through object-oriented methods and metrics. Although the individual elements of OO methods, RP approach and OO metrics had been used on other separate projects, the reported projects were the first integrated usage supported by a rich development environment. Overall the approach used was twice as productive (measured by hours per OO Unit) as a C++ development.

  19. Sediment quality criteria: A review with recommendations for developing criteria for the Hanford Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Driver, C.J.

    1994-05-01

    Criteria for determining the quality of river sediment are necessary to ensure that concentrations of contaminants in aquatic systems are within acceptable limits for the protection of aquatic and human life. Such criteria should facilitate decision-making about remediation, handling, and disposal of contaminants. Several approaches to the development of sediment quality criteria (SQC) have been described and include both descriptive and numerical methods. However, no single method measures all impacts at all times to all organisms (U.S. EPA 1992b). The U.S. EPA's interest is primarily in establishing chemically based, numerical SQC that are applicable nation-wide (Shea 1988). Of the approaches proposed for SQC development, only three are being considered for numerical SQC on a national level. These approaches include an Equilibrium Partitioning Approach, a site-specific method using bioassays (the Apparent Effects Threshold Approach), and an approach similar to EPA's water quality criteria (Pavlou and Weston 1984). Although national (or even regional) criteria address a number of political, litigative, and engineering needs, some researchers feel that protection of benthic communities requires site-specific, biologically based criteria (Baudo et al. 1990). This is particularly true for areas where complex mixtures of contaminants are present in sediments. Other scientifically valid and accepted procedures for freshwater SQC include a background concentration approach, methods using field or spiked bioassays, a screening level concentration approach, the Apparent Effects Threshold Approach, the Sediment Quality Triad, the International Joint Commission Sediment Assessment Strategy, and the National Status and Trends Program Approach. The various sediment assessment approaches are evaluated for application to the Hanford Reach, and recommendations for Hanford Site sediment quality criteria are discussed.

  20. Just-in-Time Teaching, Just-in-Need Learning: Designing towards Optimized Pedagogical Outcomes

    ERIC Educational Resources Information Center

    Killi, Steinar; Morrison, Andrew

    2015-01-01

    Teaching methods are constantly being changed, new ones are developed and old methods have undergone a renaissance. Two main approaches to teaching prevail: a) lecture-based and project-based and b) an argumentative approach to known knowledge or learning by exploration. Today, there is a balance between these two approaches, and they are more…

  1. Communities of Practice: A Research Paradigm for the Mixed Methods Approach

    ERIC Educational Resources Information Center

    Denscombe, Martyn

    2008-01-01

    The mixed methods approach has emerged as a "third paradigm" for social research. It has developed a platform of ideas and practices that are credible and distinctive and that mark the approach out as a viable alternative to quantitative and qualitative paradigms. However, there are also a number of variations and inconsistencies within the mixed…

  2. Approaches to Developing Health in Early Years Settings

    ERIC Educational Resources Information Center

    Mooney, Ann; Boddy, Janet; Statham, June; Warwick, Ian

    2008-01-01

    Purpose: The purpose of the paper is to consider the opportunities and difficulties in developing health-promotion work in early years settings in the UK. Design/methodology/approach: As the first study of its kind conducted in the UK, a multi-method approach was adopted involving: an overview of health-related guidance and of effective…

  3. Let's Be PALS: An Evidence-Based Approach to Professional Development

    ERIC Educational Resources Information Center

    Dunst, Carl J.; Trivette, Carol M.

    2009-01-01

    An evidence-based approach to professional development is described on the basis of the findings from a series of research syntheses and meta-analyses of adult learning methods and strategies. The approach, called PALS (Participatory Adult Learning Strategy), places major emphasis on both active learner involvement in all aspects of training…

  4. A Strength-Based Approach to Teacher Professional Development

    ERIC Educational Resources Information Center

    Zwart, Rosanne C.; Korthagen, Fred A. J.; Attema-Noordewier, Saskia

    2015-01-01

    Based on positive psychology, self-determination theory and a perspective on teacher quality, this study proposes and examines a strength-based approach to teacher professional development. A mixed method pre-test/post-test design was adopted to study perceived outcomes of the approach for 93 teachers of six primary schools in the Netherlands and…

  5. Using a Community-Engaged Research (CEnR) approach to develop and pilot a photo grid method to gain insights into early child health and development in a socio-economic disadvantaged community.

    PubMed

    Lowrie, Emma; Tyrrell-Smith, Rachel

    2017-01-01

    This paper reports on the use of a Community-Engaged Research (CEnR) approach to develop a new research tool to involve members of the community in thinking about priorities for early child health and development in a deprived area of the UK. The CEnR approach involves researchers, professionals and members of the public working together during all stages of research and development. Researchers used a phased approach to the development of a Photo Grid tool, including reviewing tools which could be used for community engagement and testing the new tool based on feedback from workshops with local early years professionals and parents of young children. The Photo Grid tool is a flat square grid on which photo cards can be placed. Participants were asked to place at the top of the grid the photos they considered most important for early child health and development, working down to the less important ones at the bottom. The findings showed that the resulting Photo Grid tool was a useful and successful method of engaging with the local community; the evidence for this is the high number of participants who completed a pilot study and who provided feedback on the method. By involving community members throughout the research process, it was possible to develop a method that would be acceptable to the local population, thus decreasing the likelihood of poor engagement. The success of the tool is therefore particularly encouraging as it engages "seldom heard voices," such as those with low literacy. The aim of this research was to consult with professionals and parents to develop a new research toolkit (Photo Grid) to understand community assets and priorities in relation to early child health and development in Blackpool, a socio-economically disadvantaged community. A Community-Engaged Research (CEnR) approach was used to consult with community members. This paper describes the process of using a CEnR approach in developing a Photo Grid toolkit. A phased CEnR approach was used to design, test and pilot the Photo Grid tool. Members of the Blackpool community (parents with children aged 0-4 years, health professionals, members of the early years workforce, and community development workers) were involved in the development of the research tool at various stages. They were recruited opportunistically via a venue-based time-space sampling method. In total, 213 parents and 18 professionals engaged in the research process. Using a CEnR approach allowed effective engagement with the local community and professionals, evidenced by high levels of engagement throughout the development process. This approach improved the acceptability and usability of the resulting Photo Grid toolkit. Community members found the method accessible, engaging, useful, and thought provoking. The Photo Grid toolkit was seen by community members as accessible, engaging, useful and thought provoking in an area of high social deprivation, complex problems, and low literacy. The Photo Grid is an adaptable tool which can be used in other areas of socio-economic disadvantage to engage with the community to understand a wide variety of complex topics.

  6. Comparison of methods for developing the dynamics of rigid-body systems

    NASA Technical Reports Server (NTRS)

    Ju, M. S.; Mansour, J. M.

    1989-01-01

    Several approaches for developing the equations of motion for a three-degree-of-freedom PUMA robot were compared on the basis of computational efficiency (i.e., the number of additions, subtractions, multiplications, and divisions). Of particular interest was the investigation of the use of computer algebra as a tool for developing the equations of motion. Three approaches were implemented algebraically: Lagrange's method, Kane's method, and Wittenburg's method. Each formulation was developed in absolute and relative coordinates. These six cases were compared to each other and to a recursive numerical formulation. The results showed that all of the formulations implemented algebraically required fewer calculations than the recursive numerical algorithm. The algebraic formulations required fewer calculations in absolute coordinates than in relative coordinates. Each of the algebraic formulations could be simplified, using patterns from Kane's method, to yield the same number of calculations in a given coordinate system.

  7. A high precision extrapolation method in multiphase-field model for simulating dendrite growth

    NASA Astrophysics Data System (ADS)

    Yang, Cong; Xu, Qingyan; Liu, Baicheng

    2018-05-01

    The phase-field method coupled with thermodynamic data has become a trend for predicting microstructure formation in technical alloys. Nevertheless, the frequent access to thermodynamic databases and the calculation of local equilibrium conditions can be time intensive. Extrapolation methods, which are derived from Taylor expansion, provide approximate results with high computational efficiency and have proven successful in applications. This paper presents a high precision second order extrapolation method for calculating the driving force in phase transformation. To obtain the phase compositions, different methods of solving the quasi-equilibrium condition are tested, and the M-slope approach is chosen for its superior accuracy. The developed second order extrapolation method, along with the M-slope approach and the first order extrapolation method, is applied to simulate dendrite growth in a Ni-Al-Cr ternary alloy. The results of the extrapolation methods are compared with the exact solution with respect to the composition profile and dendrite tip position, demonstrating the high precision and efficiency of the newly developed algorithm. To accelerate the phase-field and extrapolation computations, a graphics processing unit (GPU) based parallel computing scheme is developed. Application to a large-scale simulation of multi-dendrite growth in an isothermal cross-section demonstrates the ability of the developed GPU-accelerated second order extrapolation approach for the multiphase-field model.
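
    The general idea of a second-order extrapolation can be shown with a stand-in function (the real driving force comes from thermodynamic database calls, which is exactly what the expansion avoids re-querying):

    ```python
    import numpy as np

    def expensive_driving_force(c):       # stand-in for a database evaluation
        return np.sin(3.0 * c) + 0.5 * c**2

    c0, h = 0.30, 1e-4                    # reference composition, FD step
    f0 = expensive_driving_force(c0)
    d1 = (expensive_driving_force(c0 + h) - expensive_driving_force(c0 - h)) / (2 * h)
    d2 = (expensive_driving_force(c0 + h) - 2 * f0
          + expensive_driving_force(c0 - h)) / h**2

    def extrapolate(c):                   # second-order Taylor surrogate
        dc = c - c0
        return f0 + d1 * dc + 0.5 * d2 * dc**2

    print(extrapolate(0.33), expensive_driving_force(0.33))  # close near c0
    ```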

  8. A Different Approach to Have Science and Technology Student-Teachers Gain Varied Methods in Laboratory Applications: A Sample of Computer Assisted POE Application

    ERIC Educational Resources Information Center

    Saka, Arzu

    2012-01-01

    The purpose of this study is to develop a new approach and assess the application for the science and technology student-teachers to gain varied laboratory methods in science and technology teaching. It is also aimed to describe the computer-assisted POE application in the subject of "Photosynthesis-Light" developed in the context of…

  9. A Software Engineering Approach based on WebML and BPMN to the Mediation Scenario of the SWS Challenge

    NASA Astrophysics Data System (ADS)

    Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina

    Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).

  10. Assessing metacognition of grade 2 and grade 4 students using an adaptation of multi-method interview approach during mathematics problem-solving

    NASA Astrophysics Data System (ADS)

    Kuzle, A.

    2018-06-01

    The important role that metacognition plays as a predictor of students' mathematical learning and mathematical problem-solving has been extensively documented. But only recently has attention turned to the primary grades, and more research is needed at this level. The goals of this paper are threefold: (1) to present a framework for metacognition during mathematics problem-solving, (2) to describe the multi-method interview approach developed to study students' mathematical metacognition, and (3) to empirically evaluate the utility of the model and the adaptation of the approach in the context of grade 2 and grade 4 mathematics problem-solving. The results are discussed with regard not only to further development of the adapted multi-method interview approach, but also to its theoretical and practical implications.

  11. Glossary of reference terms for alternative test methods and their validation.

    PubMed

    Ferrario, Daniele; Brustio, Roberta; Hartung, Thomas

    2014-01-01

    This glossary was developed to provide technical references to support work in the field of alternatives to animal testing. It was compiled from various existing reference documents coming from different sources and is meant to be a point of reference on alternatives to animal testing. Given the ever-increasing number of alternative test methods and approaches developed over the last decades, a combination, revision, and harmonization of earlier published collections of terms used in the validation of such methods is required. The need to update previous glossary efforts came from the acknowledgement that new words have emerged with the development of new approaches, while others have become obsolete, and the meaning of some terms has partially changed over time. With this glossary we intend to provide guidance on issues related to the validation of new or updated testing methods consistent with current approaches. Moreover, because of new developments and technologies, a glossary needs to be a living, constantly updated document. An Internet-based version based on this compilation may be found at http://altweb.jhsph.edu/, allowing the addition of new material.

  12. Community Development in the School Workplace

    ERIC Educational Resources Information Center

    Brouwer, Patricia; Brekelmans, Mieke; Nieuwenhuis, Loek; Simons, Robert-Jan

    2012-01-01

    Purpose: The aim of this study is to explore whether and to what degree community development of teacher teams takes place and how community development comes about, that is, what community-building efforts teacher teams undertake. Design/methodology/approach: Using a multi method approach, quantitative and qualitative data were gathered from…

  13. Developing Skills in Years 11 and 12 Secondary School Economics

    ERIC Educational Resources Information Center

    Stokes, Anthony; Wright, Sarah

    2013-01-01

    This paper explores different approaches for developing skills in economics in schools. It considers the different preferred learning styles of students through the VARK method and applies a contextual learning approach to engage students and develop skills. The key skills that are considered are literacy, numeracy, information and communication…

  14. On Anticipatory Development of Dual Education Based on the Systemic Approach

    ERIC Educational Resources Information Center

    Alshynbayeva, Zhuldyz; Sarbassova, Karlygash; Galiyeva, Temir; Kaltayeva, Gulnara; Bekmagambetov, Aidos

    2016-01-01

    The article addresses separate theoretical and methodical aspects of the anticipatory development of dual education in the Republic of Kazakhstan based on the systemic approach. It states the need to develop orientating basis of prospective professional activities in students. We define the concepts of anticipatory cognition and anticipatory…

  15. GEOGRAPHIC-SPECIFIC WATER QUALITY CRITERIA DEVELOPMENT WITH MONITORING DATA USING CONDITIONAL PROBABILITIES - A PROPOSED APPROACH

    EPA Science Inventory

    A conditional probability approach using monitoring data to develop geographic-specific water quality criteria for protection of aquatic life is presented. Typical methods to develop criteria using existing monitoring data are limited by two issues: (1) how to extrapolate to an...
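
    The abstract is truncated, but the basic conditional-probability computation it names can be sketched on simulated monitoring data (everything below is invented for illustration): for each candidate criterion, estimate the probability of biological impairment given that the stressor exceeds it.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    stressor = rng.lognormal(0.0, 0.6, 500)   # paired stressor measurements (toy)
    impaired = rng.random(500) < 1.0 / (1.0 + np.exp(-2.0 * (stressor - 1.5)))

    for thr in [0.5, 1.0, 1.5, 2.0]:          # candidate criteria
        sel = stressor > thr
        print(thr, round(float(impaired[sel].mean()), 3))  # P(impaired | exceed)
    ```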

  16. The Influence of Leadership Development Approaches on Social Capital: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Burbaugh, Bradley James

    2015-01-01

    Leadership programs serve as a mechanism to develop the leadership capacity of individuals, groups, and organizations. Although considerable time and resources have been devoted to understanding the outcomes of leadership development, little time and effort has been dedicated to understanding the developmental approaches that influence the…

  17. Fractional Order Modeling of Atmospheric Turbulence - A More Accurate Modeling Methodology for Aero Vehicles

    NASA Technical Reports Server (NTRS)

    Kopasakis, George

    2014-01-01

    The presentation covers a recently developed methodology for modeling atmospheric turbulence as a disturbance for aero vehicle gust loads and for controls development, such as flutter and inlet shock position control. The approach models atmospheric turbulence in its natural fractional order form, which provides more accuracy than traditional methods like the Dryden model, especially for high speed vehicles. The presentation gives a historical background on atmospheric turbulence modeling and the approaches utilized for air vehicles, followed by the motivation for and the methodology used to develop the fractional order atmospheric turbulence modeling approach. Some examples covering the application of this method are provided, followed by concluding remarks.
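
    For orientation, the standard textbook spectra make the "fractional order" point concrete (these are the usual longitudinal forms, not equations taken from the presentation). The Dryden power spectral density is integer order,

    \Phi_u(\omega) = \sigma_u^2 \, \frac{2 L_u}{\pi V} \, \frac{1}{1 + (L_u \omega / V)^2},

    whereas the von Karman spectrum carries the fractional exponent 5/6,

    \Phi_u(\omega) = \sigma_u^2 \, \frac{2 L_u}{\pi V} \, \frac{1}{\left[ 1 + (1.339 \, L_u \omega / V)^2 \right]^{5/6}},

    which integer-order (Dryden-type) filters can only approximate; representing that 5/6 power directly is the essence of a fractional order model.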

  18. Methods of Farm Guidance

    ERIC Educational Resources Information Center

    Vir, Dharm

    1971-01-01

    A survey of teaching methods for farm guidance workers in India, outlining some approaches developed by and used in other nations. Discusses mass educational methods, group educational methods, and the local leadership method. (JB)

  19. The challenge for genetic epidemiologists: how to analyze large numbers of SNPs in relation to complex diseases.

    PubMed

    Heidema, A Geert; Boer, Jolanda M A; Nagelkerke, Nico; Mariman, Edwin C M; van der A, Daphne L; Feskens, Edith J M

    2006-04-21

    Genetic epidemiologists have taken the challenge to identify genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods have been developed for analyzing the relation between large numbers of genetic and environmental predictors to disease or disease-related variables in genetic association studies. In this commentary we discuss logistic regression analysis, neural networks, including the parameter decreasing method (PDM) and genetic programming optimized neural networks (GPNN) and several non-parametric methods, which include the set association approach, combinatorial partitioning method (CPM), restricted partitioning method (RPM), multifactor dimensionality reduction (MDR) method and the random forests approach. The relative strengths and weaknesses of these methods are highlighted. Logistic regression and neural networks can handle only a limited number of predictor variables, depending on the number of observations in the dataset. Therefore, they are less useful than the non-parametric methods to approach association studies with large numbers of predictor variables. GPNN on the other hand may be a useful approach to select and model important predictors, but its performance to select the important effects in the presence of large numbers of predictors needs to be examined. Both the set association approach and random forests approach are able to handle a large number of predictors and are useful in reducing these predictors to a subset of predictors with an important contribution to disease. The combinatorial methods give more insight in combination patterns for sets of genetic and/or environmental predictor variables that may be related to the outcome variable. As the non-parametric methods have different strengths and weaknesses we conclude that to approach genetic association studies using the case-control design, the application of a combination of several methods, including the set association approach, MDR and the random forests approach, will likely be a useful strategy to find the important genes and interaction patterns involved in complex diseases.
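
    One of the discussed methods, the random forests approach, is easy to sketch on simulated genotypes (scikit-learn assumed; data and effect sizes invented): importance scores screen a large SNP panel down to candidate predictors.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(3)
    n, p = 400, 200
    X = rng.integers(0, 3, size=(n, p))            # minor-allele counts (toy)
    logit = 0.8 * X[:, 5] + 0.6 * X[:, 42] - 1.5   # two truly associated SNPs
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
    top = np.argsort(rf.feature_importances_)[::-1][:5]
    print(top)  # SNPs 5 and 42 should rank near the top
    ```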

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Millis, Andrew

    Understanding the behavior of interacting electrons in molecules and solids, so that one can predict new superconductors, catalysts, light harvesters, energy and battery materials and optimize existing ones, is the "quantum many-body problem", one of the scientific grand challenges of the 21st century. A complete solution to the problem has been proven to be exponentially hard, meaning that straightforward numerical approaches fail. New insights and new methods are needed to provide accurate yet feasible approximate solutions. This CMSCN project brought together chemists and physicists to combine insights from the two disciplines to develop innovative new approaches. Outcomes included the Density Matrix Embedding method, a new, computationally inexpensive and extremely accurate approach that may enable first-principles treatment of superconducting and magnetic properties of strongly correlated materials; new techniques for existing methods, including an Adaptively Truncated Hilbert Space approach that will vastly expand the capabilities of the dynamical mean field method; a self-energy embedding theory; and a new memory-function based approach to calculations of the behavior of driven systems. The methods developed under this project are now being applied to improve our understanding of superconductivity, to calculate novel topological properties of materials, and to characterize and improve the properties of nanoscale devices.

  1. AN APPROACH TO METHODS DEVELOPMENT FOR HUMAN EXPOSURE ASSESSMENT STUDIES

    EPA Science Inventory

    Human exposure assessment studies require methods that are rapid, cost-effective and have a high sample through-put. The development of analytical methods for exposure studies should be based on specific information for individual studies. Human exposure studies suggest that di...

  2. Efficient Fluid Dynamic Design Optimization Using Cartesian Grids

    NASA Technical Reports Server (NTRS)

    Dadone, A.; Grossman, B.; Sellers, Bill (Technical Monitor)

    2004-01-01

    This report is subdivided into three parts. The first reviews a new approach to the computation of inviscid flows using Cartesian grid methods. The crux of the method is the curvature-corrected symmetry technique (CCST), developed by the present authors for body-fitted grids. The method introduces ghost cells near the boundaries, whose values are developed from an assumed flow-field model in the vicinity of the wall consisting of a vortex flow which satisfies the normal momentum equation and the non-penetration condition. The CCST boundary condition was shown to be substantially more accurate than traditional boundary condition approaches. This improved boundary condition is adapted to a Cartesian mesh formulation, which we call the Ghost Body-Cell Method (GBCM). In this approach, all cell centers exterior to the body are computed with fluxes at the four surrounding cell edges. There is no need for the special treatment of cut cells that complicates other Cartesian mesh methods.

  3. A Psychometric Approach to Theory-Based Behavior Change Intervention Development: Example From the Colorado Meaning-Activity Project.

    PubMed

    Masters, Kevin S; Ross, Kaile M; Hooker, Stephanie A; Wooldridge, Jennalee L

    2018-05-18

    There has been a notable disconnect between theories of behavior change and behavior change interventions. Because few interventions are both explicitly and adequately theory-based, investigators cannot assess the impact of theory on intervention effectiveness. Theory-based interventions, designed to deliberately engage the theory's proposed mechanisms of change, are needed to adequately test theories. Thus, systematic approaches to theory-based intervention development are needed. This article introduces and discusses the psychometric method of developing theory-based interventions. The psychometric approach utilizes basic psychometric principles at each step of the intervention development process to build a theoretically driven intervention that is subsequently tested in process (mechanism) and outcome studies. Five stages of intervention development are presented: (i) choice of theory; (ii) identification and characterization of key concepts and expected relations; (iii) intervention construction; (iv) initial testing and revision; and (v) empirical testing of the intervention. Examples of this approach from the Colorado Meaning-Activity Project (COMAP) are presented. Based on self-determination theory integrated with meaning or purpose, and utilizing a motivational interviewing approach, the COMAP intervention is individually based, with an initial interview followed by smartphone-delivered interventions for increasing daily activity. The psychometric approach to intervention development is one method to ensure careful consideration of theory at all steps of intervention development. This structured approach supports a research culture that endorses deliberate and systematic operationalization of theory into behavior change interventions from the outset.

  4. Improvement of economic security management system of municipalities with account of transportation system development: methods of assessment

    NASA Astrophysics Data System (ADS)

    Khe Sun, Pak; Vorona-Slivinskaya, Lubov; Voskresenskay, Elena

    2017-10-01

    The article highlights the necessity of a complex approach to assessing the economic security of municipalities, one that considers the specifics of municipal management. The approach allows the economic security levels of municipalities to be compared, but it does not describe parameter differences between the compared municipalities. Therefore, a second method is suggested: the parameter rank order method. Applying these methods made it possible to identify the leaders and outsiders in economic security among municipalities and to rank all economic security parameters by significance. A complex assessment of the economic security of municipalities, based on the combination of the two approaches, allowed the security level to be assessed more accurately. To assure economic security and equalize its threshold values, special attention should be paid to transportation system development in municipalities. Strategic aims of projects in the area of municipal transportation infrastructure development include contributing to the creation and elaboration of transportation logistics and manufacturing transport complexes, developing transportation infrastructure with account of the internal and external functions of the region, developing public transport, improving transport security and reducing its negative influence on the environment.
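
    To make the parameter rank order idea concrete, the sketch below ranks three invented municipalities on three invented security parameters and orders them by mean rank. It is a generic rank-order illustration, not the authors' exact procedure.

        # Generic rank-order sketch (invented data, not the authors' formulas):
        # rank municipalities on each parameter, then order them by mean rank.
        import pandas as pd

        data = pd.DataFrame(
            {"budget_per_capita": [42.0, 35.5, 51.2],
             "transport_coverage": [0.71, 0.64, 0.58],
             "unemployment": [5.2, 7.9, 6.1]},          # lower is better
            index=["Municipality A", "Municipality B", "Municipality C"])

        ranks = pd.DataFrame(index=data.index)
        ranks["budget_per_capita"] = data["budget_per_capita"].rank(ascending=False)
        ranks["transport_coverage"] = data["transport_coverage"].rank(ascending=False)
        ranks["unemployment"] = data["unemployment"].rank(ascending=True)

        print(ranks.mean(axis=1).sort_values())  # leaders first, outsiders last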

  5. Evolving the Principles and Practice of Validation for New Alternative Approaches to Toxicity Testing.

    PubMed

    Whelan, Maurice; Eskes, Chantra

    Validation is essential for the translation of newly developed alternative approaches to animal testing into tools and solutions suitable for regulatory applications. Formal approaches to validation have emerged over the past 20 years or so, and although they have helped greatly to progress the field, it is essential that the principles and practice underpinning validation continue to evolve to keep pace with scientific progress. The modular approach to validation should be exploited to encourage more innovation and flexibility in study design and to increase efficiency in filling data gaps. With the focus now on integrated approaches to testing and assessment that are based on toxicological knowledge captured as adverse outcome pathways, and which incorporate the latest in vitro and computational methods, validation needs to adapt to ensure it adds value rather than hinders progress. Validation needs to be pursued both at the method level, to characterise the performance of in vitro methods in relation to their ability to detect any association of a chemical with a particular pathway or key toxicological event, and at the methodological level, to assess how integrated approaches can predict toxicological endpoints relevant for regulatory decision making. To facilitate this, more emphasis needs to be given to the development of performance standards that can be applied to classes of methods and integrated approaches that provide similar information. Moreover, the challenge of selecting the right reference chemicals to support validation needs to be addressed more systematically, consistently and in a manner that better reflects the state of the science. Above all, however, validation requires true partnership between the development and user communities of alternative methods and the appropriate investment of resources.

  6. Integration of QFD, AHP, and LPP methods in supplier development problems under uncertainty

    NASA Astrophysics Data System (ADS)

    Shad, Zahra; Roghanian, Emad; Mojibian, Fatemeh

    2014-04-01

    Quality function deployment (QFD) is a customer-driven approach, widely used in new product development to maximize customer satisfaction. Previous research used the linear physical programming (LPP) procedure to optimize QFD; however, QFD involves uncertainties, or fuzziness, which must be taken into account for a more realistic study. In this paper, a set of fuzzy data is used to address linguistic values parameterized by triangular fuzzy numbers. An integrated approach combining the analytic hierarchy process (AHP), QFD, and LPP is proposed to maximize overall customer satisfaction under uncertain conditions and is applied to the supplier development problem. The fuzzy AHP approach is adopted as a powerful method to obtain the relationship between the customer requirements and engineering characteristics (ECs) and to construct the house of quality in the QFD method. LPP is used to obtain the optimal achievement level of the ECs and subsequently the customer satisfaction level under different degrees of uncertainty. The effectiveness of the proposed method is illustrated by an example.
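
    One step of the proposal, deriving crisp weights from triangular-fuzzy pairwise comparisons, can be sketched compactly. The snippet below uses centroid defuzzification and row geometric means; the comparison values are invented, and the paper's full QFD/LPP integration is not reproduced.

        # Sketch of a common fuzzy-AHP weighting step (not the paper's full
        # QFD/AHP/LPP model). Triangular fuzzy numbers are (low, mid, high).
        import numpy as np

        # Pairwise comparisons of three customer requirements (illustrative)
        F = np.array([
            [[1, 1, 1],       [2, 3, 4],     [4, 5, 6]],
            [[1/4, 1/3, 1/2], [1, 1, 1],     [1, 2, 3]],
            [[1/6, 1/5, 1/4], [1/3, 1/2, 1], [1, 1, 1]],
        ])

        crisp = F.mean(axis=2)              # centroid defuzzification (l+m+u)/3
        gm = crisp.prod(axis=1) ** (1 / 3)  # geometric mean of each row
        weights = gm / gm.sum()
        print(weights)                      # relative importance of requirements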

  7. A new method to address verification bias in studies of clinical screening tests: cervical cancer screening assays as an example.

    PubMed

    Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D

    2014-03-01

    Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (e.g., sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy-to-apply and accurate method for addressing verification bias in studies of multiple screening methods. Copyright © 2014 Elsevier Inc. All rights reserved.
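
    The correction at the heart of the approach is inverse-probability weighting of the verified subset. The simulation below, with invented rates, shows how weighting screen-negatives by the reciprocal of their verification probability removes the upward bias in estimated sensitivity; it is a simplified stand-in for the authors' weighted GEE model.

        # Simplified stand-in for the weighted GEE idea: weight each verified
        # subject by 1 / P(verified). All rates below are invented.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 20000
        disease = rng.random(n) < 0.05                    # gold-standard truth
        screen = np.where(disease, rng.random(n) < 0.85,  # true sensitivity 0.85
                          rng.random(n) < 0.10)

        # Verify all screen-positives but only 10% of screen-negatives
        p_verify = np.where(screen, 1.0, 0.10)
        verified = rng.random(n) < p_verify
        w = 1.0 / p_verify

        # Sensitivity estimated over the verified diseased subjects only
        v = verified & disease
        weighted = (w[v] * screen[v]).sum() / w[v].sum()
        naive = screen[v].mean()
        print(f"weighted sensitivity {weighted:.3f} vs naive {naive:.3f}")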

  8. Microbial Burden Approach : New Monitoring Approach for Measuring Microbial Burden

    NASA Technical Reports Server (NTRS)

    Venkateswaran, Kasthuri; Vaishampayan, Parag; Barmatz, Martin

    2013-01-01

    JPL has developed a patented approach for measuring the amounts of live and dead cells/spores. This novel "molecular" method takes 5 to 7 hours, compared with the seven days required by conventional techniques. Conventional "molecular" techniques cannot discriminate live cells/spores from dead ones; the JPL-developed method eliminates the false-positive results of those techniques, which lead to unnecessary processing delays and unnecessary destruction of food products. The presentation covers the advantages of the new approach for differentiating live cells/spores from dead cells/spores; four examples of Salmonella outbreaks leading to costly destruction of dairy products; possible collaboration activities between JPL and other industries (for future discussion); limitations of traditional microbial monitoring approaches; an introduction to the new approach for rapid measurement of viable (live) bacterial cells/spores and its areas of application; and a detailed example of determining live spores with the new approach (a similar procedure applies to live cells).

  9. State of the art in non-animal approaches for skin sensitization testing: from individual test methods towards testing strategies.

    PubMed

    Ezendam, Janine; Braakhuis, Hedwig M; Vandebriel, Rob J

    2016-12-01

    The hazard assessment of skin sensitizers relies mainly on animal testing, but much progress has been made in the development, validation and regulatory acceptance and implementation of non-animal predictive approaches. In this review, we provide an update on the available computational tools and animal-free test methods for the prediction of skin sensitization hazard. These individual test methods mostly address one mechanistic step of the process of skin sensitization induction. The adverse outcome pathway (AOP) for skin sensitization describes the key events (KEs) that lead to skin sensitization. In our review, we have clustered the available test methods according to the KE they inform: the molecular initiating event (MIE/KE1), protein binding; KE2, keratinocyte activation; KE3, dendritic cell activation; and KE4, T cell activation and proliferation. In recent years, most progress has been made in the development and validation of in vitro assays that address KE2 and KE3. No standardized in vitro assays for T cell activation are available; thus, KE4 cannot be measured in vitro. Three non-animal test methods, addressing either the MIE, KE2 or KE3, are accepted as OECD test guidelines, and this has accelerated the development of integrated or defined approaches for testing and assessment (e.g. testing strategies). The majority of these approaches are mechanism-based, since they combine results from multiple test methods and/or computational tools that address different KEs of the AOP to estimate skin sensitization potential and sometimes potency. Other approaches are based on statistical tools. Until now, eleven different testing strategies have been published, the majority using the same individual information sources. Our review shows that some of the defined approaches to testing and assessment are able to predict skin sensitization hazard accurately, sometimes even more accurately than the currently used animal test. A few defined approaches are also designed to estimate the potency sub-category of a skin sensitizer, but these need further independent evaluation with a new dataset of chemicals. To conclude, this update shows that the field of non-animal approaches for skin sensitization has evolved greatly in recent years and that it is possible to predict skin sensitization hazard without animal testing.

  10. Repositioning interprofessional education from the margins to the centre of Australian health professional education – what is required?

    PubMed

    Dunston, Roger; Forman, Dawn; Thistlethwaite, Jill; Steketee, Carole; Rogers, Gary D; Moran, Monica

    2018-01-16

    Objective This paper examines the implementation and implications of four development and research initiatives, collectively titled the Curriculum Renewal Studies program (CRS), occurring over a 6-year period ending in 2015 and focusing on interprofessional education (IPE) within Australian pre-registration health professional education. Methods The CRS was developed as an action-focused and participatory program of studies. This research and development program used a mixed-methods approach. Structured survey, interviews and extensive documentary analyses were supplemented by semi-structured interviews, focus groups, large group consultations and consensus building methods. Narrative accounts of participants' experiences and an approach to the future development of Australian IPE were developed. Results Detailed accounts of existing Australian IPE curricula and educational activity were developed. These accounts were published and used in several settings to support curriculum and national workforce development. Reflective activities engaging with the findings facilitated the development of a national approach to the future development of Australian IPE - a national approach focused on coordinated and collective governance and development. Conclusion This paper outlines the design of an innovative approach to national IPE governance and development. It explores how ideas drawn from sociocultural theories were used to guide the choice of methods and to enrich data analysis. Finally, the paper reflects on the implications of CRS findings for health professional education, workforce development and the future of Australian IPE. What is known about the topic? IPE to enable the achievement of interprofessional and collaborative practice capabilities is widely accepted and promoted. However, many problems exist in embedding and sustaining IPE as a system-wide element of health professional education. How these implementation problems can be successfully addressed is a health service and education development priority. What does this paper add? The paper presents a summary of how Australian IPE was conceptualised, developed and delivered across 26 universities during the period of the four CRS studies. It points to strengths and limitations of existing IPE. An innovative approach to the future development of Australian IPE is presented. The importance of sociocultural factors in the development of practitioner identity and practice development is identified. What are the implications for practitioners? The findings of the CRS program present a challenging view of current Australian IPE activity and what will be required to meet industry and health workforce expectations related to the development of an Australian interprofessional- and collaborative-practice-capable workforce. Although the directions identified pose considerable challenges for the higher education and health sectors, they also provide a consensus-based approach to the future development of Australian IPE. As such they can be used as a blueprint for national development.

  11. Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods

    PubMed Central

    2010-01-01

    Background Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. Methods/Design The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. Discussion This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community. PMID:20109202

  12. Old practices, new windows: reflections on a communications skills innovation.

    PubMed

    Cantillon, Peter

    2017-03-01

    Most of the great innovations in communication skills education, from Balint's concept of the 'doctor as drug' to the Calgary Cambridge conceptualisation of the consultation, were founded in general practice. It can be argued, however, that there has been a hiatus in the development of new approaches to analysing the consultation since the mid-1990s. It is most welcome, therefore, that two papers are presented in this issue of the journal that describe and evaluate a novel approach to consultation analysis entitled 'the windows method'. Building on the more structured approaches that preceded it, the windows method offers some genuine innovations in its emphasis on emotional knowledge and in the manner in which it addresses many of the potential deficiencies in feedback practice associated with older methods. The new approach is very much in step with current thinking about emotional development and the establishment of appropriate environments for feedback. The windows method has the potential to breathe fresh life into old and well-established communication skills education practices.

  13. Advances in thickness measurements and dynamic visualization of the tear film using non-invasive optical approaches.

    PubMed

    Bai, Yuqiang; Nichols, Jason J

    2017-05-01

    The thickness of tear film has been investigated under both invasive and non-invasive methods. While invasive methods are largely historical, more recent noninvasive methods are generally based on optical approaches that provide accurate, precise, and rapid measures. Optical microscopy, interferometry, and optical coherence tomography (OCT) have been developed to characterize the thickness of tear film or certain aspects of the tear film (e.g., the lipid layer). This review provides an in-depth overview on contemporary optical techniques used in studying the tear film, including both advantages and limitations of these approaches. It is anticipated that further developments of high-resolution OCT and other interferometric methods will enable a more accurate and precise measurement of the thickness of the tear film and its related dynamic properties. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. A multi-scale method of mapping urban influence

    Treesearch

    Timothy G. Wade; James D. Wickham; Nicola Zacarelli; Kurt H. Riitters

    2009-01-01

    Urban development can impact environmental quality and ecosystem services well beyond urban extent. Many methods to map urban areas have been developed and used in the past, but most have simply tried to map existing extent of urban development, and all have been single-scale techniques. The method presented here uses a clustering approach to look beyond the extant...

  15. Method and Excel VBA Algorithm for Modeling Master Recession Curve Using Trigonometry Approach.

    PubMed

    Posavec, Kristijan; Giacopetti, Marco; Materazzi, Marco; Birk, Steffen

    2017-11-01

    A new method was developed and implemented in an Excel Visual Basic for Applications (VBA) algorithm that uses trigonometry laws in an innovative way to overlap recession segments of time series and create master recession curves (MRCs). Based on a trigonometry approach, the algorithm horizontally translates succeeding recession segments of the time series, placing their vertex, that is, the highest recorded value of each recession segment, directly onto the appropriate connection line defined by measurement points of a preceding recession segment. The new method and algorithm continue the development of methods and algorithms for MRC generation, of which the first published method was based on a multiple linear/nonlinear regression model approach (Posavec et al. 2006). The newly developed trigonometry-based method was tested on real case study examples and compared with the previously published multiple linear/nonlinear regression model-based method. The results show that in some cases, that is, for some time series, the trigonometry-based method creates narrower overlaps of the recession segments, resulting in higher coefficients of determination R², while in other cases the multiple linear/nonlinear regression model-based method remains superior. The Excel VBA algorithm for modeling MRCs using the trigonometry approach is implemented in a spreadsheet tool (MRCTools v3.0, written by and available from Kristijan Posavec, Zagreb, Croatia) containing the previously published VBA algorithms for MRC generation and separation. All algorithms within MRCTools v3.0 are open access and available free of charge, supporting the idea of running science on available, open, and free software. © 2017, National Ground Water Association.
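
    The horizontal-translation step can be sketched outside Excel as well. The snippet below is a simplified stand-in for the published VBA algorithm: it shifts a recession segment in time so that its vertex lands on the curve assembled so far, using toy exponential recessions.

        # Simplified stand-in for the MRC translation step (not the VBA code):
        # shift a segment so its vertex lies on the master curve built so far.
        import numpy as np

        def translate_segment(master_t, master_q, seg_t, seg_q):
            """Shift (seg_t, seg_q) so its first (highest) value lies on the
            master curve; master_q is assumed monotonically decreasing."""
            # Invert the master curve: when does it reach the segment's vertex?
            t_on_master = np.interp(seg_q[0], master_q[::-1], master_t[::-1])
            return seg_t - seg_t[0] + t_on_master

        # Toy exponential recessions q = q0 * exp(-k t)
        t = np.linspace(0, 10, 50)
        master_t, master_q = t, 8.0 * np.exp(-0.3 * t)
        seg_t, seg_q = t[:20], 5.0 * np.exp(-0.3 * t[:20])

        shifted_t = translate_segment(master_t, master_q, seg_t, seg_q)
        print(shifted_t[0])  # vertex now placed where the master curve equals 5.0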

  16. Lagrangian based methods for coherent structure detection

    NASA Astrophysics Data System (ADS)

    Allshouse, Michael R.; Peacock, Thomas

    2015-09-01

    There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.
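
    As a concrete example of the geometric approach, the sketch below computes a finite-time Lyapunov exponent (FTLE) field for the canonical double-gyre flow. The flow parameters follow the usual test case; the grid resolution and integration time are illustrative choices.

        # FTLE sketch for the double gyre (illustrative resolution and horizon).
        import numpy as np

        A, eps, om = 0.1, 0.25, 2 * np.pi / 10  # standard test-case parameters

        def vel(x, y, t):
            a = eps * np.sin(om * t)
            b = 1 - 2 * eps * np.sin(om * t)
            f = a * x**2 + b * x
            dfdx = 2 * a * x + b
            u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
            v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
            return u, v

        # Advect a tracer grid with RK4, then differentiate the flow map to get
        # the Cauchy-Green tensor and the FTLE field.
        nx, ny, T, nt = 201, 101, 15.0, 300
        x, y = np.meshgrid(np.linspace(0, 2, nx), np.linspace(0, 1, ny))
        dt, t = T / nt, 0.0
        px, py = x.copy(), y.copy()
        for _ in range(nt):
            k1 = vel(px, py, t)
            k2 = vel(px + 0.5 * dt * k1[0], py + 0.5 * dt * k1[1], t + 0.5 * dt)
            k3 = vel(px + 0.5 * dt * k2[0], py + 0.5 * dt * k2[1], t + 0.5 * dt)
            k4 = vel(px + dt * k3[0], py + dt * k3[1], t + dt)
            px += dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
            py += dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
            t += dt

        dxdx = np.gradient(px, axis=1) / np.gradient(x, axis=1)
        dxdy = np.gradient(px, axis=0) / np.gradient(y, axis=0)
        dydx = np.gradient(py, axis=1) / np.gradient(x, axis=1)
        dydy = np.gradient(py, axis=0) / np.gradient(y, axis=0)
        # Largest eigenvalue of C = F^T F for the 2x2 flow-map gradient F
        trC = dxdx**2 + dxdy**2 + dydx**2 + dydy**2
        detF = dxdx * dydy - dxdy * dydx
        lam = 0.5 * (trC + np.sqrt(np.maximum(trC**2 - 4 * detF**2, 0)))
        ftle = np.log(np.maximum(lam, 1e-12)) / (2 * T)
        print(ftle.max())  # ridges of this field mark transport barriers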

  17. Systematic process synthesis and design methods for cost effective waste minimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biegler, L.T.; Grossman, I.E.; Westerberg, A.W.

    We present progress on our work to develop synthesis methods that aid in the design of cost-effective approaches to waste minimization. Work continues on combining the approaches of Douglas and coworkers and of Grossmann and coworkers in a hierarchical approach in which bounding information allows the problem to fit within a mixed-integer programming framework. We continue work on the synthesis of reactors and of flexible separation processes. In the first instance, we strive for methods that reduce the production of potential pollutants, while in the second we look for ways to recover and recycle solvents.

  18. HPLC method development for evolving applications in the pharmaceutical industry and nanoscale chemistry

    NASA Astrophysics Data System (ADS)

    Castiglione, Steven Louis

    As scientific research trends towards trace levels and smaller architectures, the analytical chemist is often faced with the challenge of quantitating such species in a variety of matrices. The challenge is heightened when the analytes prove to be potentially toxic or possess physical or chemical properties that make traditional analytical methods problematic. In such cases, the successful development of an acceptable quantitative method plays a critical role in the ability to further develop the species under study. This is particularly true for pharmaceutical impurities and nanoparticles (NPs). The first portion of the research focuses on the development of a part-per-billion level HPLC method for a substituted phenazine-class pharmaceutical impurity. This method was required because a rapid methodology was needed to quantitatively determine levels of a potentially toxic phenazine moiety in order to ensure patient safety. As the synthetic pathway for the active ingredient was continuously refined to produce progressively lower amounts of the phenazine impurity, increasingly sensitive quantitative methods were required. The approaches evolved across four discrete methods, each employing a unique scheme for analyte detection. All developed methods were evaluated with regard to accuracy, precision and linear adherence, as well as ancillary benefits and detriments; e.g., one method in this evolution demonstrated the ability to resolve and detect other species from the phenazine class. The second portion of the research focuses on the development of an HPLC method for the quantitative determination of NP size distributions. The current methodology for determining NP sizes employs transmission electron microscopy (TEM), which requires sample drying without particle size alteration and which, in many cases, may prove infeasible due to cost or availability. The feasibility of an HPLC method for NP size characterization evolved across three methods, each employing a different approach to size resolution. These methods were evaluated primarily for sensitivity, which proved to be a substantial hurdle to further development but does not appear to deter future research efforts.

  19. A novel design process for selection of attributes for inclusion in discrete choice experiments: case study exploring variation in clinical decision-making about thrombolysis in the treatment of acute ischaemic stroke.

    PubMed

    De Brún, Aoife; Flynn, Darren; Ternent, Laura; Price, Christopher I; Rodgers, Helen; Ford, Gary A; Rudd, Matthew; Lancsar, Emily; Simpson, Stephen; Teah, John; Thomson, Richard G

    2018-06-22

    A discrete choice experiment (DCE) is a method used to elicit participants' preferences and the relative importance of different attributes and levels within a decision-making process. DCEs have become popular in healthcare; however, approaches to identifying the attributes/levels that influence a decision of interest, and methods for selecting which of them to include in a DCE, are under-reported. Our objectives were: to explore the development process used to select and present attributes/levels from the identified range that may be influential; to describe a systematic and rigorous development process for the design of a DCE in the context of thrombolytic therapy for acute stroke; and to discuss the advantages of our five-stage approach to enhance current guidance for developing DCEs. A five-stage DCE development process was undertaken. Methods employed included literature review, qualitative analysis of interview and ethnographic data, expert panel discussions, a quantitative structured prioritisation (ranking) exercise and pilot testing of the DCE using a 'think aloud' approach. The five-stage process helped reduce the list of 22 initial patient-related factors to a final set of nine variable factors and six fixed factors for inclusion in a testable DCE using a vignette model of presentation. For the data and conclusions generated by DCEs to be deemed valid, it is crucial that the methods of design and development are documented and reported. This paper has detailed a rigorous and systematic approach to DCE development that may be useful to researchers seeking to establish methods for reducing and prioritising attributes for inclusion in future DCEs.

  20. Assessing and evaluating multidisciplinary translational teams: a mixed methods approach.

    PubMed

    Wooten, Kevin C; Rose, Robert M; Ostir, Glenn V; Calhoun, William J; Ameredes, Bill T; Brasier, Allan R

    2014-03-01

    A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed-methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed-methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team-type taxonomy. Based on team maturation and scientific progress, teams were designated as (a) early in development, (b) traditional, (c) process focused, or (d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored.

  1. The Development and Evaluation of Speaking Learning Model by Cooperative Approach

    ERIC Educational Resources Information Center

    Darmuki, Agus; Andayani; Nurkamto, Joko; Saddhono, Kundharu

    2018-01-01

    A cooperative approach-based Speaking Learning Model (SLM) has been developed to improve speaking skill of Higher Education students. This research aimed at evaluating the effectiveness of cooperative-based SLM viewed from the development of student's speaking ability and its effectiveness on speaking activity. This mixed method study combined…

  2. Sustainable Participation in Regular Exercise amongst Older People: Developing an Action Research Approach

    ERIC Educational Resources Information Center

    Davies, Jeanne; Lester, Carolyn; O'Neill, Martin; Williams, Gareth

    2008-01-01

    Objective: This article describes the Triangle Project's work with a post industrial community, where healthy living activities were developed in response to community members' expressed needs. Method: An action research partnership approach was taken to reduce health inequalities, with local people developing their own activities to address…

  3. Effective Professional Development for E-Learning: What Do the Managers Think?

    ERIC Educational Resources Information Center

    Wilson, Amy

    2012-01-01

    Introducing new methods of teaching and learning requires an institutional approach to professional development in order to cater for the different levels and requirements of staff. The increase in e-learning use has prompted many institutions to adopt a whole organisation approach to professional development for lecturers. This paper proposes to…

  4. The Adolescent Mentalization-based Integrative Treatment (AMBIT) approach to outcome evaluation and manualization: adopting a learning organization approach.

    PubMed

    Fuggle, Peter; Bevington, Dickon; Cracknell, Liz; Hanley, James; Hare, Suzanne; Lincoln, John; Richardson, Garry; Stevens, Nina; Tovey, Heather; Zlotowitz, Sally

    2015-07-01

    AMBIT (Adolescent Mentalization-Based Integrative Treatment) is a developing team approach to working with hard-to-reach adolescents. The approach applies the principle of mentalization to relationships with clients, team relationships and working across agencies. It places a high priority on the need for locally developed evidence-based practice, and proposes that outcome evaluation needs to be explicitly linked with processes of team learning using a learning organization framework. A number of innovative methods of team learning are incorporated into the AMBIT approach, particularly a system of web-based wiki-formatted AMBIT manuals individualized for each participating team. The paper describes early development work of the model and illustrates ways of establishing explicit links between outcome evaluation, team learning and manualization by describing these methods as applied to two AMBIT-trained teams; one team working with young people on the edge of care (AMASS - the Adolescent Multi-Agency Support Service) and another working with substance use (CASUS - Child and Adolescent Substance Use Service in Cambridgeshire). Measurement of the primary outcomes for each team (which were generally very positive) facilitated team learning and adaptations of methods of practice that were consolidated through manualization. © The Author(s) 2014.

  5. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1992-01-01

    Research conducted during the period from July 1991 through December 1992 is covered. A method based upon the quasi-analytical approach was developed for computing the aerodynamic sensitivity coefficients of three dimensional wings in transonic and subsonic flow. In addition, the method computes for comparison purposes the aerodynamic sensitivity coefficients using the finite difference approach. The accuracy and validity of the methods are currently under investigation.

  6. Hubble Space Telescope Angular Velocity Estimation During the Robotic Servicing Mission

    NASA Technical Reports Server (NTRS)

    Thienel, Julie K.; Queen, Steven Z.; VanEepoel, John M.; Sanner, Robert M.

    2005-01-01

    In 2004 NASA began investigation of a robotic servicing mission for the Hubble Space Telescope (HST). Such a mission would require estimates of the HST attitude and rates in order to achieve a capture by the proposed Hubble robotic vehicle (HRV). HRV was to be equipped with vision-based sensors, capable of estimating the relative attitude between HST and HRV. The inertial HST attitude is derived from the measured relative attitude and the HRV computed inertial attitude. However, the relative rate between HST and HRV cannot be measured directly. Therefore, the HST rate with respect to inertial space is not known. Two approaches are developed to estimate the HST rates. Both methods utilize the measured relative attitude and the HRV inertial attitude and rates. First, a non-linear estimator is developed. The nonlinear approach estimates the HST rate through an estimation of the inertial angular momentum. Second, a linearized approach is developed. The linearized approach is a pseudo-linear Kalman filter. Simulation test results for both methods are given. Even though the development began as an application for the HST robotic servicing mission, the methods presented are applicable to any rendezvous/capture mission involving a non-cooperative target spacecraft.
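
    The attitude bookkeeping described above reduces to quaternion composition, which the sketch below illustrates. The conventions, the crude finite-difference rate estimate, and the sample values are illustrative stand-ins for the paper's nonlinear and Kalman-filter estimators.

        # Sketch: inertial HST attitude = HRV inertial attitude composed with the
        # measured relative attitude. Scalar-last quaternions [x, y, z, w]; the
        # finite-difference rate is only a crude stand-in for the estimators.
        import numpy as np

        def qmul(q, r):
            """Hamilton product of scalar-last quaternions."""
            x1, y1, z1, w1 = q
            x2, y2, z2, w2 = r
            return np.array([
                w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
                w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
                w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
                w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            ])

        def hst_inertial(q_hrv_inertial, q_rel):
            return qmul(q_hrv_inertial, q_rel)

        def rate_from_quats(q0, q1, dt):
            """Crude body-rate estimate from two successive attitudes."""
            dq = qmul(q1, q0 * np.array([-1, -1, -1, 1]))  # q1 * conj(q0)
            angle = 2 * np.arccos(np.clip(dq[3], -1.0, 1.0))
            axis = dq[:3] / max(np.linalg.norm(dq[:3]), 1e-12)
            return axis * angle / dt

        q_rel = np.array([0.0, 0.0, np.sin(0.1), np.cos(0.1)])  # vision sensor
        q_hrv = np.array([0.0, 0.0, 0.0, 1.0])
        print(hst_inertial(q_hrv, q_rel))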

  7. A Large-Scale Design Integration Approach Developed in Conjunction with the Ares Launch Vehicle Program

    NASA Technical Reports Server (NTRS)

    Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.

    2012-01-01

    This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three-dimensional computer-aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle in the form of a massive, fully detailed CAD assembly, thereby adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope and interface method reduces both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes benefiting from this approach, through reduced development and design cycle time, include creation of analysis models for the aerodynamics discipline, vehicle-to-ground interface development, and documentation development for the vehicle assembly.

  8. STANDARD OPERATING PROCEDURE FOR QUALITY ASSURANCE IN ANALYTICAL CHEMISTRY METHODS DEVELOPMENT

    EPA Science Inventory

    The Environmental Protection Agency's (EPA) Office of Research and Development (ORD) is engaged in the development, demonstration, and validation of new or newly adapted methods of analysis for environmentally related samples. Recognizing that a "one size fits all" approach to qu...

  9. Approaches to Machine Learning.

    DTIC Science & Technology

    1984-02-16

    The field of machine learning strives to develop methods and techniques to automate the acquisition of new information, new skills, and new ways of organizing existing information. In this article, we review the major approaches to machine learning in symbolic domains, covering the tasks of learning concepts from examples, learning search methods, conceptual clustering, and language acquisition. We illustrate each of the basic approaches with paradigmatic examples. (Author)
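
    For a small, modern illustration of the first task, learning a concept from labeled examples, the sketch below induces a rule-like classifier with a decision tree. The attributes and instances are invented, and the report's symbolic algorithms are not reproduced.

        # Generic concept-learning illustration (not the report's algorithms):
        # induce a classifier from a handful of labeled instances.
        from sklearn import tree

        # Attributes: [has_wings, lays_eggs, can_fly]; concept: "bird"
        X = [[1, 1, 1], [1, 1, 0], [0, 1, 0], [0, 0, 0], [1, 1, 1]]
        y = [1, 1, 0, 0, 1]  # 1 = positive example of the concept

        clf = tree.DecisionTreeClassifier().fit(X, y)
        print(tree.export_text(clf, feature_names=["has_wings", "lays_eggs", "can_fly"]))
        print(clf.predict([[1, 1, 0]]))  # classify an unseen instance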

  10. Introduction to the Theme "New Methods and Novel Therapeutic Approaches in Pharmacology and Toxicology".

    PubMed

    Insel, Paul A; Amara, Susan G; Blaschke, Terrence F; Meyer, Urs A

    2017-01-06

    Major advances in scientific discovery and insights can result from the development and use of new techniques, as exemplified by the work of Solomon Snyder, who writes a prefatory article in this volume. The Editors have chosen "New Methods and Novel Therapeutic Approaches in Pharmacology and Toxicology" as the Theme for a number of articles in this volume. These include ones that review the development and use of new experimental tools and approaches (e.g., nanobodies and techniques to explore protein-protein interactions), new types of therapeutics (e.g., aptamers and antisense oligonucleotides), and systems pharmacology, which assembles (big) data derived from omics studies together with information regarding drugs and patients. The application of these new methods and therapeutic approaches has the potential to have a major impact on basic and clinical research in pharmacology and toxicology as well as on patient care.

  11. Towards efficient multi-scale methods for monitoring sugarcane aphid infestations in sorghum

    USDA-ARS?s Scientific Manuscript database

    We discuss approaches and issues involved with developing optimal monitoring methods for sugarcane aphid infestations (SCA) in grain sorghum. We discuss development of sequential sampling methods that allow for estimation of the number of aphids per sample unit, and statistical decision making rela...

  12. Hubble Space Telescope Angular Velocity Estimation During the Robotic Servicing Mission

    NASA Technical Reports Server (NTRS)

    Thienel, Julie K.; Sanner, Robert M.

    2005-01-01

    In 2004 NASA began investigation of a robotic servicing mission for the Hubble Space Telescope (HST). Such a mission would require estimates of the HST attitude and rates in order to achieve a capture by the proposed Hubble robotic vehicle (HRV). HRV was to be equipped with vision-based sensors, capable of estimating the relative attitude between HST and HRV. The inertial HST attitude is derived from the measured relative attitude and the HRV computed inertial attitude. However, the relative rate between HST and HRV cannot be measured directly. Therefore, the HST rate with respect to inertial space is not known. Two approaches are developed to estimate the HST rates. Both methods utilize the measured relative attitude and the HRV inertial attitude and rates. First, a nonlinear estimator is developed. The nonlinear approach estimates the HST rate through an estimation of the inertial angular momentum. The development includes an analysis of the estimator stability given errors in the measured attitude. Second, a linearized approach is developed. The linearized approach is a pseudo-linear Kalman filter. Simulation test results for both methods are given, including scenarios with erroneous measured attitudes. Even though the development began as an application for the HST robotic servicing mission, the methods presented are applicable to any rendezvous/capture mission involving a non-cooperative target spacecraft.

  13. A Locally Modal B-Spline Based Full-Vector Finite-Element Method with PML for Nonlinear and Lossy Plasmonic Waveguide

    NASA Astrophysics Data System (ADS)

    Karimi, Hossein; Nikmehr, Saeid; Khodapanah, Ehsan

    2016-09-01

    In this paper, we develop a B-spline finite-element method (FEM) based on locally modal wave propagation with anisotropic perfectly matched layers (PMLs), for the first time, to simulate nonlinear and lossy plasmonic waveguides. Conventional approaches, such as the beam propagation method, inherently omit the wave spectrum and do not provide physical insight into nonlinear modes, especially in plasmonic applications, where nonlinear modes are constructed by linear modes with very close propagation constants. Our locally modal B-spline finite-element method (LMBS-FEM) does not suffer from this weakness of the conventional approaches. To validate our method, we first simulate wave propagation for various kinds of linear, nonlinear, lossless and lossy materials in metal-insulator plasmonic structures using LMBS-FEM in MATLAB, and we compare the results with the FEM-BPM module of the COMSOL Multiphysics simulator and with the B-spline finite-element finite-difference wide-angle beam propagation method (BSFEFD-WABPM). The comparisons show that our numerical approach is not only more accurate and computationally efficient than the conventional approaches but also provides physical insight into the nonlinear nature of the propagation modes.

  14. Evaluating a Newly Developed Differentiation Approach in Terms of Student Achievement and Teachers' Opinions

    ERIC Educational Resources Information Center

    Altintas, Esra; Ozdemir, Ahmet S.

    2015-01-01

    This study aims to evaluate a differentiation approach that was recently developed to teach mathematics to gifted middle school students in terms of its practice by teachers by studying the effect of the approach on achievement among both gifted and non-gifted students. From mixed research methods, the study used an explanatory design. It was…

  15. The "Push-Pull" Approach to Fast-Track Management Development: A Case Study in Scientific Publishing

    ERIC Educational Resources Information Center

    Fojt, Martin; Parkinson, Stephen; Peters, John; Sandelands, Eric

    2008-01-01

    Purpose: The purpose of this paper is to explore how a medium sized business has addressed what it has termed a "push-pull" method of management and organization development, based around an action learning approach. Design/methodology/approach: The paper sets out a methodology that other SMEs might look to replicate in their management and…

  16. Efficient Construction of Discrete Adjoint Operators on Unstructured Grids by Using Complex Variables

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Kleb, William L.

    2005-01-01

    A methodology is developed and implemented to mitigate the lengthy software development cycle typically associated with constructing a discrete adjoint solver for aerodynamic simulations. The approach is based on a complex-variable formulation that enables straightforward differentiation of complicated real-valued functions. An automated scripting process is used to create the complex-variable form of the set of discrete equations. An efficient method for assembling the residual and cost function linearizations is developed. The accuracy of the implementation is verified through comparisons with a discrete direct method as well as a previously developed handcoded discrete adjoint approach. Comparisons are also shown for a large-scale configuration to establish the computational efficiency of the present scheme. To ultimately demonstrate the power of the approach, the implementation is extended to high temperature gas flows in chemical nonequilibrium. Finally, several fruitful research and development avenues enabled by the current work are suggested.
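
    The complex-variable trick that underpins this formulation is easy to demonstrate in isolation: for an analytic function f, Im(f(x + ih))/h approximates f'(x) to machine precision with no subtractive cancellation. The sketch below uses a standard test function and illustrates the differentiation principle only, not the adjoint solver itself.

        # Complex-step differentiation: no cancellation, so h can be tiny.
        import numpy as np

        def complex_step_derivative(f, x, h=1e-30):
            return np.imag(f(x + 1j * h)) / h

        f = lambda x: np.exp(x) / np.sqrt(np.sin(x) ** 3 + np.cos(x) ** 3)
        print(complex_step_derivative(f, 1.5))  # matches the analytic derivative
        print((f(1.5 + 1e-8) - f(1.5)) / 1e-8)  # forward difference, less accurate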

  18. Invention in Argument.

    ERIC Educational Resources Information Center

    Fahnestock, Jeanne; Secor, Marie

    A genre approach to teaching the argumentative essay in composition classes has been developed. The need for this approach emanated from problems associated with the other methods of teaching persuasive discourse, such as the logical/analytic, content/problem solving, and rhetorical/generative approaches. The genre approach depends on the…

  19. The Professional Approach to Moral Education.

    ERIC Educational Resources Information Center

    Wright, Derek

    1982-01-01

    Defines the professional approach to moral education and contrasts it with the commonsense approach. The professional approach means deliberately planning school life to develop pupils as moral persons. The commonsense method treats students as members of the moral community, teachers exercising power and control over them. (RM)

  20. A hybrid wavelet analysis-cloud model data-extending approach for meteorologic and hydrologic time series

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ding, Hao; Singh, Vijay P.; Shang, Xiaosan; Liu, Dengfeng; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing

    2015-05-01

    For scientific and sustainable management of water resources, hydrologic and meteorologic data series often need to be extended. This paper proposes a hybrid approach, named WA-CM (wavelet analysis-cloud model), for data series extension. Wavelet analysis has time-frequency localization features, known as the "mathematics microscope," that can decompose and reconstruct hydrologic and meteorologic series by wavelet transform. The cloud model is a mathematical representation of fuzziness and randomness and is strongly robust for uncertain data. The WA-CM approach first employs the wavelet transform to decompose the measured nonstationary series and then uses the cloud model to develop an extension model for each decomposition layer series. The final extension is obtained by summing the extensions of the layers. Two kinds of meteorologic and hydrologic data sets with different characteristics and different degrees of human influence, from six (three pairs of) representative stations, are used to illustrate the WA-CM approach. The approach is also compared with four other methods: the conventional correlation extension method, the Kendall-Theil robust line method, artificial neural network methods (back propagation, multilayer perceptron, and radial basis function), and the single cloud model method. To evaluate the model performance completely and thoroughly, five measures are used: relative error, mean relative error, standard deviation of relative error, root mean square error, and the Theil inequality coefficient. Results show that the WA-CM approach is effective, feasible, and accurate, and it is found to be better than the four other methods compared. The theory employed and the approach developed here can be applied to the extension of data in other areas as well.
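
    Structurally, the WA-CM extension is: decompose, extend each layer, and sum. The snippet below follows that skeleton with PyWavelets; because the cloud model has no standard library implementation, a simple AR(1) forecast stands in for the per-layer extension model, and the series is synthetic.

        # Structural sketch of WA-CM: wavelet-decompose the series, extend each
        # layer, sum the layer extensions. An AR(1) forecast stands in for the
        # cloud model, which has no standard library implementation.
        import numpy as np
        import pywt

        def extend_layer_ar1(layer, steps):
            """Stand-in per-layer extender: first-order autoregressive forecast."""
            x0, x1 = layer[:-1], layer[1:]
            phi = np.dot(x0 - x0.mean(), x1 - x1.mean()) / np.dot(x0 - x0.mean(), x0 - x0.mean())
            mu, last, out = layer.mean(), layer[-1], []
            for _ in range(steps):
                last = mu + phi * (last - mu)
                out.append(last)
            return np.array(out)

        def wa_cm_like_extend(series, steps, wavelet="db4", level=3):
            coeffs = pywt.wavedec(series, wavelet, level=level)
            total = np.zeros(steps)
            for i in range(len(coeffs)):
                # Reconstruct layer i alone in the time domain, then extend it
                keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
                layer = pywt.waverec(keep, wavelet)[: len(series)]
                total += extend_layer_ar1(layer, steps)
            return total

        t = np.arange(400)
        series = np.sin(2 * np.pi * t / 50) + 0.3 * np.random.default_rng(2).standard_normal(400)
        print(wa_cm_like_extend(series, steps=12))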

  1. Method of fan sound mode structure determination

    NASA Technical Reports Server (NTRS)

    Pickett, G. F.; Sofrin, T. G.; Wells, R. W.

    1977-01-01

    A method for the determination of fan sound mode structure in the inlet of turbofan engines using in-duct acoustic pressure measurements is presented. The method is based on the simultaneous solution of a set of equations whose unknowns are modal amplitudes and phases. A computer program for the solution of the equation set was developed. An additional computer program was developed to calculate microphone locations whose use yields an equation set that does not give rise to numerical instabilities. In addition to the development of a method for determining coherent modal structure, experimental and analytical approaches are developed for determining the amplitude frequency spectrum of randomly generated sound modes for use in narrow-annulus ducts. Two approaches are defined: one based on the use of cross-spectral techniques and the other based on the use of an array of microphones.
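
    The core of the coherent-mode method is a linear system: measured complex pressures at several microphones are a superposition of duct modes, p = E a, so the modal amplitudes and phases follow from a simultaneous (least-squares) solve. The sketch below uses a toy circumferential mode matrix in place of the true duct eigenfunctions.

        # Sketch of the simultaneous-equation solve; the mode matrix is a toy
        # stand-in for the true duct eigenfunctions.
        import numpy as np

        rng = np.random.default_rng(3)
        n_modes, n_mics = 4, 8
        theta = np.linspace(0, 2 * np.pi, n_mics, endpoint=False)  # mic angles
        m = np.arange(n_modes)                                     # mode orders
        E = np.exp(1j * np.outer(theta, m))   # circumferential mode matrix

        a_true = rng.standard_normal(n_modes) + 1j * rng.standard_normal(n_modes)
        p = E @ a_true                        # simulated in-duct pressures

        a_hat, *_ = np.linalg.lstsq(E, p, rcond=None)
        print(np.abs(a_hat - a_true).max())   # ~0: amplitudes and phases recovered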

  2. Clarifying the landscape approach: A Letter to the Editor on "Integrated landscape approaches to managing social and environmental issues in the tropics".

    PubMed

    Erbaugh, James; Agrawal, Arun

    2017-11-01

    This letter concerns the objectives, assumptions, and methods of landscape restoration and the landscape approach. World leaders have pledged 350 Mha for restoration using a landscape approach, which is thus poised to become one of the most influential methods for multi-functional land management. Reed et al. (2016) meaningfully advance scholarship on the landscape approach, but they incorrectly define the approach as it exists within their text. This Letter to the Editor clarifies the landscape approach as an ethic for land management, demonstrates how it relates to landscape restoration, and motivates continued theoretical development and empirical assessment of the landscape approach. © 2017 John Wiley & Sons Ltd.

  3. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality-by-design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and relates it to product quality by design and process analytical technology (PAT).

  5. Classification-based quantitative analysis of stable isotope labeling by amino acids in cell culture (SILAC) data.

    PubMed

    Kim, Seongho; Carruthers, Nicholas; Lee, Joohyoung; Chinni, Sreenivasa; Stemmer, Paul

    2016-12-01

    Stable isotope labeling by amino acids in cell culture (SILAC) is a practical and powerful approach for quantitative proteomic analysis. A key advantage of SILAC is the ability to detect the isotopically labeled peptides simultaneously in a single instrument run, guaranteeing relative quantitation for a large number of peptides without introducing variation caused by separate experiments. However, few approaches are available for assessing protein ratios, and none of the existing algorithms pays considerable attention to proteins having only one peptide hit. We introduce new quantitative approaches to the SILAC protein-level summary using classification-based methodologies, such as Gaussian mixture models with EM algorithms and their Bayesian counterparts, as well as K-means clustering. In addition, a new approach is developed using a Gaussian mixture model and a stochastic, metaheuristic global optimization algorithm, particle swarm optimization (PSO), to avoid premature convergence or being stuck in a local optimum. Our simulation studies show that the newly developed PSO-based method performs the best among the others in terms of F1 score, and the proposed methods further demonstrate the ability to detect potential markers in real SILAC experimental data. No matter how many peptide hits a protein has, the developed approach is applicable, rescuing many proteins doomed to removal. Furthermore, no additional correction for multiple comparisons is necessary for the developed methods, enabling direct interpretation of the analysis outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
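
    The classification idea can be sketched with the simplest of the listed tools, a Gaussian mixture fitted by EM. In the snippet below, simulated protein-level log2 ratios are assigned to down-regulated, unchanged, or up-regulated components; the PSO and Bayesian variants from the paper are not shown.

        # Minimal sketch: Gaussian mixture on simulated protein-level
        # log2(heavy/light) ratios; EM fit only, no PSO or Bayesian variant.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(4)
        ratios = np.concatenate([
            rng.normal(-1.5, 0.3, 40),   # down-regulated proteins
            rng.normal(0.0, 0.25, 300),  # unchanged
            rng.normal(1.8, 0.4, 25),    # up-regulated
        ]).reshape(-1, 1)

        gmm = GaussianMixture(n_components=3, random_state=0).fit(ratios)
        labels = gmm.predict(ratios)
        print(gmm.means_.ravel())   # component centers near -1.5, 0.0, 1.8
        print(np.bincount(labels))  # proteins assigned to each component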

  6. The REFLECT Approach to Literacy: Some Issues of Method.

    ERIC Educational Resources Information Center

    Dyer, Caroline; Choksi, Archana

    1998-01-01

    Describes REFLECT (Regenerated Freirean Literacy through Empowering Community Techniques) that is an approach to literacy teaching and learning developed by ActionAid. Focuses on the interplay between this "neutral" literacy method and its impact on the social context of one group of pastoralists, the semisedentary Katchi Rabaris of…

  7. A Mixed Learning Approach in Mechatronics Education

    ERIC Educational Resources Information Center

    Yilmaz, O.; Tuncalp, K.

    2011-01-01

    This study aims to investigate the effect of a Web-based mixed learning approach model on mechatronics education. The model combines different perception methods such as reading, listening, and speaking and practice methods developed in accordance with the vocational background of students enrolled in the course Electromechanical Systems in…

  8. Choice of Appropriate Multimedia Technology and Teaching Methods for Different Culture Groups

    ERIC Educational Resources Information Center

    Taratoukhina, Julia

    2014-01-01

    This paper describes the prerequisites for development in the area of cross-cultural multimedia didactics. This approach is based on research studies of differences between mentalities, ways of working with educational information, culturally-specific teaching methods and teaching techniques that determine differentiated approaches to the choice…

  9. Reimagining Teacher Development: Cultivating Spirit

    ERIC Educational Resources Information Center

    Dress, Amelia

    2012-01-01

    Although well-meaning, some methods of training treat teaching as one-size-fits-all. Yet there are myriad techniques for teaching, and no one method works for all teachers or all students. Indeed, good teachers use a variety of techniques. Unfortunately, the search for objective standards by which to measure quality teaching has…

  10. SPF Full-scale emissions test method development status ...

    EPA Pesticide Factsheets

    This is a non-technical presentation intended to inform ASTM task group members about our intended approach to full-scale emissions testing, which includes the application of spray foam in an environmental chamber. The presentation describes the approach to emissions characterization, the types of measurement systems employed, and the expected outcomes from the planned tests. The purpose of this presentation is to update the ASTM D22.05 work group on the status of our full-scale emissions test method development.

  11. RESEARCH TOWARDS DEVELOPING METHODS FOR SELECTED PHARMACEUTICAL AND PERSONAL CARE PRODUCTS (PPCPS) ADAPTED FOR BIOSOLIDS

    EPA Science Inventory

    Development, standardization, and validation of analytical methods provides state-of-the-science techniques to evaluate the presence, or absence, of select PPCPs in biosolids. This research provides the approaches, methods, and tools to assess the exposures and redu...

  12. Assessing Change in the Teaching Practice of Faculty in a Faculty Development Program for Primary Care Physicians: Toward a Mixed Method Evaluation Approach.

    ERIC Educational Resources Information Center

    Pinheiro, Sandro O.; Rohrer, Jonathan D.; Heimann, C. F. Larry

    This paper describes a mixed method evaluation study that was developed to assess faculty teaching behavior change in a faculty development fellowship program for community-based hospital faculty. Principles of adult learning were taught to faculty participants over the fellowship period. These included instruction in teaching methods, group…

  13. Perspectives of unlicensed assistive personnel on career development.

    PubMed

    Akaragian, Salpy; Crooks, Heidi; Pieters, Huibrie C

    2013-09-01

    An equivalency program, Method 3, is a viable but underused option for unlicensed assistive personnel (UAP) who pursue licensure. This study describes the perceptions of UAP on opportunities for career development. Eighteen UAP participated in three focus groups. Thematic analysis was conducted with verbatim transcription. Three major themes represented the lively discussions that occurred: core driving forces, processes of career development, and anticipated and desirable outcomes. Various subthemes described these major themes. Method 3 provides a realistic approach to help UAP persevere with career development. Collaboration with management and peers, encouragement, and effective communication contributed to the success of participants, despite obstacles and challenges. Camaraderie and flexible scheduling were critical elements in participants' pursuit of first licensure. Taking small steps was described as an effective approach for UAP to persevere with career development. Support for informal career development is essential. Nursing leaders should consider an equivalency approach to accommodate individual preferences and learning needs for career development. Copyright 2013, SLACK Incorporated.

  14. Project-Based Learning in Undergraduate Environmental Chemistry Laboratory: Using EPA Methods to Guide Student Method Development for Pesticide Quantitation

    ERIC Educational Resources Information Center

    Davis, Eric J.; Pauls, Steve; Dick, Jonathan

    2017-01-01

    Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…

  15. The systematic assessment of traditional evidence from the premodern Chinese medical literature: a text-mining approach.

    PubMed

    May, Brian H; Zhang, Anthony; Lu, Yubo; Lu, Chuanjian; Xue, Charlie C L

    2014-12-01

    This project aimed to develop an approach to evaluating information contained in the premodern Traditional Chinese Medicine (TCM) literature that was (1) comprehensive, systematic, and replicable and (2) able to produce quantifiable output that could be used to answer specific research questions in order to identify natural products for clinical and experimental research. The project involved two stages. In stage 1, 14 TCM collections and compendia were evaluated for suitability as sources for searching; 8 of these were compared in detail. The results were published in the Journal of Alternative and Complementary Medicine. Stage 2 developed a text-mining approach for two of these sources: Zhong Hua Yi Dian (Encyclopaedia of Traditional Chinese Medicine, 4th edition) and Zhong Yi Fang Ji Da Ci Dian (Great Compendium of Chinese Medical Formulae). This approach developed procedures for search term selection; methods for screening, classifying, and scoring data; procedures for systematic searching and data extraction; data checking procedures; and approaches for analyzing results. Examples are provided for studies of memory impairment and diabetic nephropathy, and issues relating to data interpretation are discussed. This approach to the analysis of large collections of the premodern TCM literature uses widely available sources and provides a text-mining approach that is systematic, replicable, and adaptable to the requirements of the particular project. Researchers can use these methods to explore changes in the names and conceptions of a disease over time, to identify which therapeutic methods have been more or less frequently used in different eras for particular disorders, and to assist in the selection of natural products for research efforts.

  16. Evaluation Methodology between Globalization and Localization Features Approaches for Skin Cancer Lesions Classification

    NASA Astrophysics Data System (ADS)

    Ahmed, H. M.; Al-azawi, R. J.; Abdulhameed, A. A.

    2018-05-01

    Considerable effort has been devoted to developing diagnostic methods for skin cancer. In this paper, two different approaches are addressed for detecting skin cancer in dermoscopy images. The first approach uses a global method that relies on global features for classifying skin lesions, whereas the second uses a local method that relies on local features. The aim of this paper is to select the best approach for skin lesion classification. The dataset used in this paper consists of 200 dermoscopy images from Pedro Hispano Hospital (PH2). The achieved results are a sensitivity of about 96%, specificity of about 100%, precision of about 100%, and accuracy of about 97% for the globalization approach, and about 100% for all four measures for the localization approach. These results show that the localization approach achieved acceptable accuracy and outperformed the globalization approach for skin cancer lesion classification.
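
    For reference, the four reported figures of merit follow directly from a binary confusion matrix; a small sketch (the counts are hypothetical):

        # Sensitivity, specificity, precision, and accuracy from a binary
        # confusion matrix.
        def classification_metrics(tp, fn, fp, tn):
            sensitivity = tp / (tp + fn)            # true positive rate
            specificity = tn / (tn + fp)            # true negative rate
            precision = tp / (tp + fp)              # positive predictive value
            accuracy = (tp + tn) / (tp + fn + fp + tn)
            return sensitivity, specificity, precision, accuracy

        print(classification_metrics(tp=48, fn=2, fp=0, tn=150))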

  17. Male contraception

    PubMed Central

    Chao, Jing; Page, Stephanie T.; Anderson, Richard A.

    2015-01-01

    Clear evidence shows that many men and women would welcome new male methods of contraception, but none have become available. The hormonal approach is based on suppression of gonadotropins and thus of testicular function and spermatogenesis, and has been investigated for several decades. This approach can achieve sufficient suppression of spermatogenesis for effective contraception in most men, but not all; the basis for these men responding insufficiently is unclear. Alternatively, the nonhormonal approach is based on identifying specific processes in sperm development, maturation and function. A range of targets has been identified in animal models, and targeted effectively. This approach, however, remains in the pre-clinical domain at present. There are, therefore, grounds for considering that safe, effective and reversible methods of contraception for men can be developed. PMID:24947599

  18. Delivering spacecraft control centers with embedded knowledge-based systems: The methodology issue

    NASA Technical Reports Server (NTRS)

    Ayache, S.; Haziza, M.; Cayrac, D.

    1994-01-01

    Matra Marconi Space (MMS) occupies a leading place in Europe in the domain of satellite and space data processing systems. The maturity of knowledge-based systems (KBS) technology and the theoretical and practical experience acquired in the development of prototype, pre-operational and operational applications make it possible today to consider the wide operational deployment of KBSs in space applications. In this perspective, MMS has to prepare the introduction of the new methods and support tools that will form the basis of the development of such systems. This paper introduces elements of the MMS methodology initiatives in the domain and the main rationale that motivated the approach. These initiatives develop along two main axes: knowledge engineering methods and tools, and a hybrid method approach for coexisting knowledge-based and conventional developments.

  19. A hybrid-stress finite element approach for stress and vibration analysis in linear anisotropic elasticity

    NASA Technical Reports Server (NTRS)

    Oden, J. Tinsley; Fly, Gerald W.; Mahadevan, L.

    1987-01-01

    A hybrid stress finite element method is developed for accurate stress and vibration analysis of problems in linear anisotropic elasticity. A modified form of the Hellinger-Reissner principle is formulated for dynamic analysis, and an algorithm for the determination of the anisotropic elastic and compliance constants from experimental data is developed. These schemes were implemented in a finite element program for static and dynamic analysis of linear anisotropic two-dimensional elasticity problems. Specific numerical examples are considered to verify the accuracy of the hybrid stress approach and compare it with that of the standard displacement method, especially for highly anisotropic materials. It is shown that the hybrid stress approach gives much better results than the displacement method. Preliminary work on extensions of this method to three-dimensional elasticity is discussed, and the stress shape functions necessary for this extension are included.

  20. Development of purely structure-based pharmacophores for the topoisomerase I-DNA-ligand binding pocket

    NASA Astrophysics Data System (ADS)

    Drwal, Malgorzata N.; Agama, Keli; Pommier, Yves; Griffith, Renate

    2013-12-01

    Purely structure-based pharmacophores (SBPs) are an alternative method to ligand-based approaches and have the advantage of describing the entire interaction capability of a binding pocket. Here, we present the development of SBPs for topoisomerase I, an anticancer target with an unusual ligand binding pocket consisting of protein and DNA atoms. Different approaches to cluster and select pharmacophore features are investigated, including hierarchical clustering and energy calculations. In addition, the performance of SBPs is evaluated retrospectively and compared to the performance of ligand- and complex-based pharmacophores. SBPs emerge as a valid method in virtual screening and a complementary approach to ligand-focussed methods. The study further reveals that the choice of pharmacophore feature clustering and selection methods has a large impact on the virtual screening hit lists. A prospective application of the SBPs in virtual screening reveals that they can be used successfully to identify novel topoisomerase inhibitors.

  1. A three-dimensional quality-guided phase unwrapping method for MR elastography

    NASA Astrophysics Data System (ADS)

    Wang, Huifang; Weaver, John B.; Perreard, Irina I.; Doyley, Marvin M.; Paulsen, Keith D.

    2011-07-01

    Magnetic resonance elastography (MRE) uses accumulated phases that are acquired at multiple, uniformly spaced relative phase offsets, to estimate harmonic motion information. Heavily wrapped phase occurs when the motion is large and unwrapping procedures are necessary to estimate the displacements required by MRE. Two unwrapping methods were developed and compared in this paper. The first method is a sequentially applied approach. The three-dimensional MRE phase image block for each slice was processed by two-dimensional unwrapping followed by a one-dimensional phase unwrapping approach along the phase-offset direction. This unwrapping approach generally works well for low noise data. However, there are still cases where the two-dimensional unwrapping method fails when noise is high. In this case, the baseline of the corrupted regions within an unwrapped image will not be consistent. Instead of separating the two-dimensional and one-dimensional unwrapping in a sequential approach, an interleaved three-dimensional quality-guided unwrapping method was developed to combine both the two-dimensional phase image continuity and one-dimensional harmonic motion information. The quality of one-dimensional harmonic motion unwrapping was used to guide the three-dimensional unwrapping procedures and it resulted in stronger guidance than in the sequential method. In this work, in vivo results generated by the two methods were compared.
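
    The one-dimensional step along the phase-offset direction can be illustrated with NumPy's standard unwrapping routine; this is a generic sketch of 1-D phase unwrapping, not the authors' quality-guided implementation, and the sampling below is hypothetical:

        import numpy as np

        # Accumulated phase at 16 uniformly spaced phase offsets; large
        # motion wraps the measured phase into (-pi, pi].
        true_phase = np.linspace(0.0, 4.0 * np.pi, 16)
        wrapped = np.angle(np.exp(1j * true_phase))

        unwrapped = np.unwrap(wrapped)   # 1-D unwrap along the offset axis
        print(np.allclose(unwrapped, true_phase))   # True

    Note that np.unwrap assumes successive samples differ by less than pi, a condition that noisy data can violate; that failure mode is what motivates the quality-guided strategy described above.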

  2. Salivary biomarker development using genomic, proteomic and metabolomic approaches

    PubMed Central

    2012-01-01

    The use of saliva as a diagnostic sample provides a non-invasive, cost-efficient method of sample collection for disease screening without the need for highly trained professionals. Saliva collection is far more practical and safe compared with invasive methods of sample collection, because of the infection risk from contaminated needles during, for example, blood sampling. Furthermore, the use of saliva could increase the availability of accurate diagnostics for remote and impoverished regions. However, the development of salivary diagnostics has required technical innovation to allow stabilization and detection of analytes in the complex molecular mixture that is saliva. The recent development of cost-effective room temperature analyte stabilization methods, nucleic acid pre-amplification techniques and direct saliva transcriptomic analysis have allowed accurate detection and quantification of transcripts found in saliva. Novel protein stabilization methods have also facilitated improved proteomic analyses. Although candidate biomarkers have been discovered using epigenetic, transcriptomic, proteomic and metabolomic approaches, transcriptomic analyses have so far achieved the most progress in terms of sensitivity and specificity, and progress towards clinical implementation. Here, we review recent developments in salivary diagnostics that have been accomplished using genomic, transcriptomic, proteomic and metabolomic approaches. PMID:23114182

  3. A Theoretical Framework for Integrating Creativity Development into Curriculum: The Case of a Korean Engineering School

    ERIC Educational Resources Information Center

    Lim, Cheolil; Lee, Jihyun; Lee, Sunhee

    2014-01-01

    Existing approaches to developing creativity rely on the sporadic teaching of creative thinking techniques or the engagement of learners in a creativity-promoting environment. Such methods cannot develop students' creativity as fully as a multilateral approach that integrates creativity throughout a curriculum. The purpose of this study was to…

  4. Evaluating the dynamic response of in-flight thrust calculation techniques during throttle transients

    NASA Technical Reports Server (NTRS)

    Ray, Ronald J.

    1994-01-01

    New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.

  5. Simulating the Mind

    NASA Astrophysics Data System (ADS)

    Dietrich, Dietmar; Fodor, Georg; Zucker, Gerhard; Bruckner, Dietmar

    The approach to developing models described within the following chapters breaks with some of the previously used approaches in Artificial Intelligence. This is the first attempt to use methods from psychoanalysis organized in a strictly top-down design method in order to take an important step towards the creation of intelligent systems. Hence, the vision and the research hypothesis are described in the beginning and will hopefully prove to have sufficient grounds for this approach.

  6. Brain perfusion imaging using a Reconstruction-of-Difference (RoD) approach for cone-beam computed tomography

    NASA Astrophysics Data System (ADS)

    Mow, M.; Zbijewski, W.; Sisniega, A.; Xu, J.; Dang, H.; Stayman, J. W.; Wang, X.; Foos, D. H.; Koliatsos, V.; Aygun, N.; Siewerdsen, J. H.

    2017-03-01

    Purpose: To improve the timely detection and treatment of intracranial hemorrhage or ischemic stroke, recent efforts include the development of cone-beam CT (CBCT) systems for perfusion imaging and new approaches to estimate perfusion parameters despite slow rotation speeds compared to multi-detector CT (MDCT) systems. This work describes the development of a brain perfusion CBCT method using a reconstruction-of-difference (RoD) approach to enable perfusion imaging on a newly developed CBCT head scanner prototype. Methods: A new reconstruction approach using RoD within a penalized-likelihood framework was developed to image the temporal dynamics of vascular enhancement. A digital perfusion simulation was developed to give a realistic representation of brain anatomy, artifacts, noise, scanner characteristics, and hemodynamic properties. This simulation includes a digital brain phantom, time-attenuation curves and noise parameters, a novel forward projection method for improved computational efficiency, and perfusion parameter calculation. Results: Our results show the feasibility of estimating perfusion parameters from a set of images reconstructed from slow scans, sparse data sets, and arc length scans as short as 60 degrees. The RoD framework significantly reduces noise and time-varying artifacts from inconsistent projections. Proper regularization and the use of overlapping reconstructed arcs can potentially further decrease bias and increase temporal resolution, respectively. Conclusions: A digital brain perfusion simulation with the RoD imaging approach has been developed and supports the feasibility of using a CBCT head scanner for perfusion imaging. Future work will include testing with data acquired using a 3D-printed perfusion phantom currently in development, and translation to preclinical and clinical studies.

  7. Comparison of Transmission Line Methods for Surface Acoustic Wave Modeling

    NASA Technical Reports Server (NTRS)

    Wilson, William; Atkinson, Gary

    2009-01-01

    Surface Acoustic Wave (SAW) technology is low cost, rugged, lightweight, and extremely low power, and can be used to develop passive wireless sensors. For these reasons, NASA is investigating the use of SAW technology for Integrated Vehicle Health Monitoring (IVHM) of aerospace structures. To facilitate rapid prototyping of passive SAW sensors for aerospace applications, SAW models have been developed. This paper reports on the comparison of three methods of modeling SAWs. The three models are the Impulse Response Method (a first-order model) and two second-order matrix methods: the conventional matrix approach, and a modified matrix approach that is extended to include internal finger reflections. The second-order models are based upon matrices that were originally developed for analyzing microwave circuits using transmission line theory. Results from the models are presented with measured data from devices. Keywords: Surface Acoustic Wave, SAW, transmission line models, Impulse Response Method.

  8. Cross Sectional Study of Agile Software Development Methods and Project Performance

    ERIC Educational Resources Information Center

    Lambert, Tracy

    2011-01-01

    Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…

  9. Black Ink and Red Ink (BIRI) Testing: A Testing Method to Evaluate Both Recall and Recognition Learning in Accelerated Adult-Learning Courses

    ERIC Educational Resources Information Center

    Rodgers, Joseph Lee; Rodgers, Jacci L.

    2011-01-01

    We propose, develop, and evaluate the black ink-red ink (BIRI) method of testing. This approach uses two different methods within the same test administration setting, one that matches recognition learning and the other that matches recall learning. Students purposively define their own tradeoff between the two approaches. Evaluation of the method…

  10. Assessing and Evaluating Multidisciplinary Translational Teams: A Mixed Methods Approach

    PubMed Central

    Wooten, Kevin C.; Rose, Robert M.; Ostir, Glenn V.; Calhoun, William J.; Ameredes, Bill T.; Brasier, Allan R.

    2014-01-01

    A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team type taxonomy. Based on team maturation and scientific progress, teams were designated as: a) early in development, b) traditional, c) process focused, or d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored. PMID:24064432

  11. State space approach to mixed boundary value problems.

    NASA Technical Reports Server (NTRS)

    Chen, C. F.; Chen, M. M.

    1973-01-01

    A state-space procedure for the formulation and solution of mixed boundary value problems is established. This procedure is a natural extension of the method used in initial value problems; however, certain special theorems and rules must be developed. The scope of the applications of the approach includes beam, arch, and axisymmetric shell problems in structural analysis, boundary layer problems in fluid mechanics, and eigenvalue problems for deformable bodies. Many classical methods in these fields developed by Holzer, Prohl, Myklestad, Thomson, Love-Meissner, and others can be either simplified or unified under new light shed by the state-variable approach. A beam problem is included as an illustration.

  12. DEVELOPMENT AND EVALUATION OF AN AGGREGATE SURFACE SAMPLING METHOD FOR USE IN ASSESSING DERMAL EXPOSURES OF YOUNG CHILDREN

    EPA Science Inventory

    In the macroactivity approach, dermal exposure is estimated using empirically-derived transfer coefficients to aggregate the mass transfer associated with a series of contacts with a contaminated medium. The macroactivity approach affords the possibility of developing screenin...

  13. Alternatives for Jet Engine Control

    NASA Technical Reports Server (NTRS)

    Leake, R. J.; Sain, M. K.

    1976-01-01

    Approaches are developed as alternatives to current design methods which rely heavily on linear quadratic and Riccati equation methods. The main alternatives are discussed in two broad categories, local multivariable frequency domain methods and global nonlinear optimal methods.

  14. A submerged singularity method for calculating potential flow velocities at arbitrary near-field points

    NASA Technical Reports Server (NTRS)

    Maskew, B.

    1976-01-01

    A discrete singularity method has been developed for calculating the potential flow around two-dimensional airfoils. The objective was to calculate velocities at any arbitrary point in the flow field, including points that approach the airfoil surface. That objective was achieved and is demonstrated here on a Joukowski airfoil. The method used combined vortices and sources "submerged" a small distance below the airfoil surface and incorporated a near-field subvortex technique developed earlier. When a velocity calculation point approached the airfoil surface, the number of discrete singularities effectively increased (but only locally) to keep the point just outside the error region of the submerged singularity discretization. The method could be extended to three dimensions, and should improve nonlinear methods, which calculate interference effects between multiple wings, and which include the effects of force-free trailing vortex sheets. The capability demonstrated here would extend the scope of such calculations to allow the close approach of wings and vortex sheets (or vortices).
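
    The building blocks of such a method are the classical induced-velocity formulas for two-dimensional point singularities; a minimal sketch (the positions and strengths below are hypothetical):

        import numpy as np

        # Velocity at (x, y) induced by a point vortex of circulation gamma
        # and a point source of strength m, both located at (x0, y0).
        def induced_velocity(x, y, x0, y0, gamma, m):
            dx, dy = x - x0, y - y0
            r2 = dx**2 + dy**2
            u = (m * dx - gamma * dy) / (2.0 * np.pi * r2)
            v = (m * dy + gamma * dx) / (2.0 * np.pi * r2)
            return u, v

        # Singularity "submerged" slightly below a surface point at (1, 0).
        print(induced_velocity(1.0, 0.0, 1.0, -0.05, gamma=1.0, m=0.2))

    Submerging the singularities below the surface keeps r2 bounded away from zero at surface evaluation points, which is what tames the error as a calculation point approaches the airfoil.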

  15. Orthogonal analytical methods for botanical standardization: Determination of green tea catechins by qNMR and LC-MS/MS

    PubMed Central

    Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.

    2013-01-01

    The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106

  16. Experiences with a Requirements-Based Programming Approach to the Development of a NASA Autonomous Ground Control System

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John

    2005-01-01

    Requirements-to-Design-to-Code (R2D2C) is an approach to the engineering of computer-based systems that embodies the idea of requirements-based programming in system development. It goes further, however, in that the approach offers not only an underlying formalism, but full formal development from requirements capture through to the automatic generation of provably-correct code. As such, the approach has direct application to the development of systems requiring autonomic properties. We describe a prototype tool to support the method, and illustrate its applicability to the development of LOGOS, a NASA autonomous ground control system, which exhibits autonomic behavior. Finally, we briefly discuss other areas where the approach and prototype tool are being considered for application.

  17. A cross-species bi-clustering approach to identifying conserved co-regulated genes.

    PubMed

    Sun, Jiangwen; Jiang, Zongliang; Tian, Xiuchun; Bi, Jinbo

    2016-06-15

    A growing number of studies have explored the process of pre-implantation embryonic development of multiple mammalian species. However, the conservation and variation among different species in their developmental programming are poorly defined due to the lack of effective computational methods for detecting co-regulated genes that are conserved across species. The most sophisticated method to date for identifying conserved co-regulated genes is a two-step approach. This approach first identifies gene clusters for each species by a cluster analysis of gene expression data, and subsequently computes the overlaps of clusters identified from different species to reveal common subgroups. This approach is ineffective at dealing with the noise in the expression data introduced by the complicated procedures of quantifying gene expression. Furthermore, due to the sequential nature of the approach, the gene clusters identified in the first step may have little overlap among different species in the second step, making it difficult to detect conserved co-regulated genes. We propose a cross-species bi-clustering approach which first denoises the gene expression data of each species into a data matrix. The rows of the data matrices of different species represent the same set of genes, characterized by their expression patterns over the developmental stages of each species as columns. A novel bi-clustering method is then developed to cluster genes into subgroups by a joint sparse rank-one factorization of all the data matrices. This method decomposes a data matrix into a product of a column vector and a row vector, where the column vector is a consistent indicator across the matrices (species) identifying the same gene cluster, and the row vector specifies for each species the developmental stages over which the clustered genes co-regulate. An efficient optimization algorithm has been developed, with convergence analysis. This approach was first validated on synthetic data and compared to the two-step method and several recent joint clustering methods. We then applied this approach to two real-world datasets of gene expression during the pre-implantation embryonic development of the human and mouse. Co-regulated genes consistent between the human and mouse were identified, offering insights into conserved functions, as well as similarities and differences in genome activation timing between the human and mouse embryos. The R package containing the implementation of the proposed method in C++ is available at https://github.com/JavonSun/mvbc.git and at the R platform (https://www.r-project.org/). Contact: jinbo@engr.uconn.edu. © The Author 2016. Published by Oxford University Press.
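
    A heavily simplified sketch of the shared rank-one idea: alternate between per-species stage profiles and a common sparse gene indicator. This is an illustrative approximation of the joint sparse rank-one factorization described above, not the authors' algorithm; the thresholding rule, penalty, and data are assumptions:

        import numpy as np

        def soft(x, t):
            # Soft-thresholding promotes sparsity in the shared indicator.
            return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

        def joint_rank_one(mats, lam=0.1, iters=100):
            # mats: one genes-by-stages matrix per species, same gene rows.
            n = mats[0].shape[0]
            u = np.ones(n) / np.sqrt(n)        # shared sparse gene indicator
            vs = [None] * len(mats)
            for _ in range(iters):
                vs = [X.T @ u for X in mats]   # per-species stage profiles
                u = soft(sum(X @ v for X, v in zip(mats, vs)), lam)
                norm = np.linalg.norm(u)
                if norm == 0.0:
                    break
                u /= norm
            return u, vs

        rng = np.random.default_rng(1)
        human, mouse = rng.normal(size=(50, 6)), rng.normal(size=(50, 5))
        u, (v_h, v_m) = joint_rank_one([human, mouse])
        print("genes selected into the shared cluster:", np.flatnonzero(u).size)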

  18. Methods of Measurement in epidemiology: Sedentary Behaviour

    PubMed Central

    Atkin, Andrew J; Gorely, Trish; Clemes, Stacy A; Yates, Thomas; Edwardson, Charlotte; Brage, Soren; Salmon, Jo; Marshall, Simon J; Biddle, Stuart JH

    2012-01-01

    Background Research examining sedentary behaviour as a potentially independent risk factor for chronic disease morbidity and mortality has expanded rapidly in recent years. Methods We present a narrative overview of the sedentary behaviour measurement literature. Subjective and objective methods of measuring sedentary behaviour suitable for use in population-based research with children and adults are examined. The validity and reliability of each method is considered, gaps in the literature specific to each method identified and potential future directions discussed. Results To date, subjective approaches to sedentary behaviour measurement, e.g. questionnaires, have focused predominantly on TV viewing or other screen-based behaviours. Typically, such measures demonstrate moderate reliability but slight to moderate validity. Accelerometry is increasingly being used for sedentary behaviour assessments; this approach overcomes some of the limitations of subjective methods, but detection of specific postures and postural changes by this method is somewhat limited. Instruments developed specifically for the assessment of body posture have demonstrated good reliability and validity in the limited research conducted to date. Miniaturization of monitoring devices, interoperability between measurement and communication technologies and advanced analytical approaches are potential avenues for future developments in this field. Conclusions High-quality measurement is essential in all elements of sedentary behaviour epidemiology, from determining associations with health outcomes to the development and evaluation of behaviour change interventions. Sedentary behaviour measurement remains relatively under-developed, although new instruments, both objective and subjective, show considerable promise and warrant further testing. PMID:23045206

  19. Decentralized model reference adaptive control of large flexible structures

    NASA Technical Reports Server (NTRS)

    Lee, Fu-Ming; Fong, I-Kong; Lin, Yu-Hwan

    1988-01-01

    A decentralized model reference adaptive control (DMRAC) method is developed for large flexible structures (LFS). The development follows that of a centralized model reference adaptive control for LFS that has been shown to be feasible. The proposed method is illustrated using a simply supported beam with collocated actuators and sensors. Results show that the DMRAC can achieve either output regulation or output tracking with adequate convergence, provided the reference model inputs and their time derivatives are integrable, bounded, and approach zero as t approaches infinity.

  20. Integrating Allergen Analysis Within a Risk Assessment Framework: Approaches to Development of Targeted Mass Spectrometry Methods for Allergen Detection and Quantification in the iFAAM Project.

    PubMed

    Nitride, Chiara; Lee, Victoria; Baricevic-Jones, Ivona; Adel-Patient, Karine; Baumgartner, Sabine; Mills, E N Clare

    2018-01-01

    Allergen analysis is central to implementing and monitoring food allergen risk assessment and management processes by the food industry, but current methods for the determination of allergens in foods give highly variable results. The European Union-funded "Integrated Approaches to Food Allergen and Allergy Risk Management" (iFAAM) project has been working to address gaps in knowledge regarding food allergen management and analysis, including the development of novel MS and immuno-based allergen determination methods. Common allergenic food ingredients (peanut, hazelnut, walnut, cow's milk [Bos domesticus], and hen's egg [Gallus domesticus]) and common food matrixes (chocolate dessert and cookie) have been used for both clinical studies and analytical method development to ensure that the new methods are clinically relevant. Allergen molecules have been used as analytical targets and allergenic ingredients incurred into matrixes at levels close to reference doses that may trigger the use of precautionary allergen labeling. An interlaboratory method comparison has been undertaken for the determination of peanut in chocolate dessert using MS and immuno-based methods. The iFAAM approach has highlighted the need for methods to report test results in allergenic protein. This will allow food business operators to use them in risk assessments that are founded on clinical study data in which protein has been used as a measure of allergenic potency.

  1. The Influence of Living Values Education-Based Civic Education Textbook on Students' Character Formation

    ERIC Educational Resources Information Center

    Komalasari, Kokom; Saripudin, Didin

    2018-01-01

    This study aims to develop and examine a civic education textbook model based on living values education in order to foster the development of junior high school students' characters. This research employs a Research and Development approach, with an explorative method used at the model development stage and an experimental method at the model testing…

  2. A practical approach for predicting retention time shifts due to pressure and temperature gradients in ultra-high-pressure liquid chromatography.

    PubMed

    Åsberg, Dennis; Chutkowski, Marcin; Leśko, Marek; Samuelsson, Jörgen; Kaczmarski, Krzysztof; Fornstedt, Torgny

    2017-01-06

    Large pressure gradients are generated in ultra-high-pressure liquid chromatography (UHPLC) using sub-2μm particles, causing significant temperature gradients over the column due to viscous heating. These pressure and temperature gradients affect retention and ultimately result in important selectivity shifts. In this study, we developed an approach for predicting the retention time shifts due to these gradients. The approach is presented as a step-by-step procedure and is based on empirical linear relationships describing how retention varies as a function of temperature and pressure and how the average column temperature increases with the flow rate. It requires only four experiments on standard equipment and is based on straightforward calculations, and it is therefore easy to use in method development. The approach was rigorously validated against experimental data obtained with a quality control method for the active pharmaceutical ingredient omeprazole. The accuracy of retention time predictions was very good, with relative errors always less than 1% and in many cases around 0.5% (n=32). Selectivity shifts observed between omeprazole and the related impurities when changing the flow rate could also be accurately predicted, resulting in good estimates of the resolution between critical peak pairs. The approximations on which the presented approach is based were all justified. The retention factor as a function of pressure and temperature was studied in an experimental design, while the temperature distribution in the column was obtained by solving the fundamental heat and mass balance equations for the different experimental conditions. We strongly believe that this approach is sufficiently accurate and experimentally feasible for this separation to be a valuable tool when developing a UHPLC method. After further validation with other separation systems, it could become a useful approach in UHPLC method development, especially in the pharmaceutical industry where demands for robustness and regulatory oversight are high. Copyright © 2016 Elsevier B.V. All rights reserved.
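
    The core of the approach is easy to express: log-linear dependence of the retention factor on pressure and average column temperature, plus a linear rise of that temperature with flow rate. A sketch under those assumptions (all coefficient values below are hypothetical, not the published ones):

        import numpy as np

        def column_temperature(flow_mL_min, T0=313.0, slope=4.0):
            # Average column temperature rises with flow (viscous heating).
            return T0 + slope * (flow_mL_min - 0.5)

        def retention_factor(P_bar, T_K, k0=5.0, beta_P=4e-4, beta_T=-0.02,
                             P_ref=400.0, T_ref=313.0):
            # Empirical linear model for ln k in pressure and temperature.
            return k0 * np.exp(beta_P * (P_bar - P_ref) + beta_T * (T_K - T_ref))

        for flow, pressure in [(0.5, 400.0), (1.0, 800.0)]:
            k = retention_factor(pressure, column_temperature(flow))
            print(f"flow {flow} mL/min -> predicted k = {k:.2f}")

    The handful of coefficients in such a model can be fitted from the four calibration experiments mentioned in the abstract, which is what makes the approach practical during method development.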

  3. Atomization simulations using an Eulerian-VOF-Lagrangian method

    NASA Technical Reports Server (NTRS)

    Chen, Yen-Sen; Shang, Huan-Min; Liaw, Paul; Chen, C. P.

    1994-01-01

    This paper summarizes the technical development and validation of a multiphase computational fluid dynamics (CFD) numerical method using the volume-of-fluid (VOF) model and a Lagrangian tracking model which can be employed to analyze general multiphase flow problems with free surface mechanism. The gas-liquid interface mass, momentum and energy conservations are modeled by continuum surface mechanisms. A new solution method is developed such that the present VOF model can be applied for all-speed flow regimes. The objectives of the present study are to develop and verify the fractional volume-of-fluid cell partitioning approach into a predictor-corrector algorithm and to demonstrate the effectiveness of the present innovative approach by simulating benchmark problems including the coaxial jet atomization.

  4. A Comparison of Surface Acoustic Wave Modeling Methods

    NASA Technical Reports Server (NTRS)

    Wilson, W. C.; Atkinson, G. M.

    2009-01-01

    Surface Acoustic Wave (SAW) technology is low cost, rugged, lightweight, and extremely low power, and can be used to develop passive wireless sensors. For these reasons, NASA is investigating the use of SAW technology for Integrated Vehicle Health Monitoring (IVHM) of aerospace structures. To facilitate rapid prototyping of passive SAW sensors for aerospace applications, SAW models have been developed. This paper reports on the comparison of three methods of modeling SAWs. The three models are the Impulse Response Method (a first-order model) and two second-order matrix methods: the conventional matrix approach, and a modified matrix approach that is extended to include internal finger reflections. The second-order models are based upon matrices that were originally developed for analyzing microwave circuits using transmission line theory. Results from the models are presented with measured data from devices.

  5. Approach to developing reliable space reactor power systems

    NASA Technical Reports Server (NTRS)

    Mondt, Jack F.; Shinbrot, Charles H.

    1991-01-01

    During Phase II, the Engineering Development Phase, the SP-100 Project has defined and is pursuing a new approach to developing reliable power systems. The approach to developing such a system during the early technology phase is described along with some preliminary examples to help explain the approach. Developing reliable components to meet space reactor power system requirements is based on a top-down systems approach which includes a point design based on a detailed technical specification of a 100-kW power system. The SP-100 system requirements implicitly recognize the challenge of achieving a high system reliability for a ten-year lifetime, while at the same time using technologies that require very significant development efforts. A low-cost method for assessing reliability, based on an understanding of fundamental failure mechanisms and design margins for specific failure mechanisms, is being developed as part of the SP-100 Program.

  6. The Effect of Cooperative Learning Method and Systematic Teaching on Students' Achievement and Retention of Knowledge in Social Studies Lesson

    ERIC Educational Resources Information Center

    Korkmaz Toklucu, Selma; Tay, Bayram

    2016-01-01

    Problem Statement: Many effective instructional strategies, methods, and techniques, which were developed in accordance with constructivist approach, can be used together in social studies lessons. Constructivist education comprises active learning processes. Two active learning approaches are cooperative learning and systematic teaching. Purpose…

  7. A Practical Approach to Implementing the Core Competencies in a Child and Adolescent Psychiatry Residency Program

    ERIC Educational Resources Information Center

    Dingle, Arden D.; Sexson, Sandra B.

    2007-01-01

    Objective: The authors describe the development and implementation of the Accreditation Council for Graduate Medical Education's core competencies in a child and adolescent psychiatry residency program. Method: The authors identify the program's organizational approach and participants and detail various strategies and methods of defining,…

  8. Improving the Method of Roof Fall Susceptibility Assessment based on Fuzzy Approach

    NASA Astrophysics Data System (ADS)

    Ghasemi, Ebrahim; Ataei, Mohammad; Shahriar, Kourosh

    2017-03-01

    Retreat mining is always accompanied by a great number of accidents, most of which are due to roof fall. Therefore, the development of methodologies to evaluate roof fall susceptibility (RFS) seems essential. Ghasemi et al. (2012) proposed a systematic methodology to assess roof fall risk during retreat mining based on the classic risk assessment approach. The main shortcoming of this method is that it ignores subjective uncertainties arising from linguistic input values of some factors, low resolution, fixed weighting, sharp class boundaries, etc. To remove this deficiency and improve the method, in this paper a novel methodology is presented to assess RFS using a fuzzy approach. The application of the fuzzy approach provides an effective tool to handle subjective uncertainties. Furthermore, the fuzzy analytical hierarchy process (AHP) is used to structure and prioritize various risk factors and sub-factors during the development of this method. This methodology is applied to identify the susceptibility of roof fall occurrence in the main panel of Tabas Central Mine (TCM), Iran. The results indicate that this methodology is effective and efficient in assessing RFS.
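
    Linguistic inputs enter such a fuzzy scheme through membership functions; a minimal sketch of a triangular membership grade (the breakpoints are hypothetical, not those of the published RFS classes):

        # Triangular fuzzy membership: 0 outside [a, c], 1 at the peak b.
        def tri_membership(x, a, b, c):
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        # Degree to which a roof-quality score of 55 counts as "poor".
        print(tri_membership(55.0, a=40.0, b=60.0, c=80.0))   # 0.75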

  9. Lagrangian based methods for coherent structure detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allshouse, Michael R., E-mail: mallshouse@chaos.utexas.edu; Peacock, Thomas, E-mail: tomp@mit.edu

    There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.
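
    The benchmark itself is compact enough to reproduce; the sketch below implements the canonical double-gyre velocity field with the parameter values commonly used in this literature (whether the review uses exactly these values is an assumption):

        import numpy as np

        A, EPS, OMEGA = 0.1, 0.25, 2.0 * np.pi / 10.0   # common choices

        def double_gyre(x, y, t):
            # Stream function psi = A sin(pi f(x,t)) sin(pi y) on [0,2]x[0,1].
            a = EPS * np.sin(OMEGA * t)
            b = 1.0 - 2.0 * EPS * np.sin(OMEGA * t)
            f = a * x**2 + b * x
            dfdx = 2.0 * a * x + b
            u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
            v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
            return u, v

        print(double_gyre(1.0, 0.5, 0.0))   # velocity at the domain centre

    Advecting tracer particles through this field supplies the trajectory data that the cluster- and braid-based methods operate on.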

  10. [Determinants of strategic management of a health center].

    PubMed

    Huard, Pierre; Schaller, Philippe

    2014-01-01

    The article highlights the value of a strategic approach for the development of a primary care health centre. The method is adapted from corporate strategy: (i) analysis of the situation of the health centre and the obstacles to its development; (ii) selection of the relations on which the strategy can be developed; (iii) elaboration of a system of interventions to create a cumulative development process; and (iv) illustration of the method by application to a case. The example illustrates the principles and method and highlights the importance of interpretations and choices in the elaboration of a strategy, which is therefore always a unique construction. The strategic approach provides a framework that (i) provides a subject of discussion and negotiation between members of the health centre, (ii) strengthens the consistency of structural decisions, and (iii) helps the health centre to overcome obstacles and initiate a development process.

  11. Recommended approach to software development

    NASA Technical Reports Server (NTRS)

    Mcgarry, F. E.; Page, J.; Eslinger, S.; Church, V.; Merwarth, P.

    1983-01-01

    A set of guidelines for an organized, disciplined approach to software development is presented, based on data collected and studied from 46 flight dynamics software development projects. Methods and practices for each phase of a software development life cycle that starts with requirements analysis and ends with acceptance testing are described; maintenance and operation is not addressed. For each defined life cycle phase, guidelines for the development process and its management, and for the products produced and their reviews, are presented.

  12. Ada developers' supplement to the recommended approach

    NASA Technical Reports Server (NTRS)

    Kester, Rush; Landis, Linda

    1993-01-01

    This document is a collection of guidelines for programmers and managers who are responsible for the development of flight dynamics applications in Ada. It is intended to be used in conjunction with the Recommended Approach to Software Development (SEL-81-305), which describes the software development life cycle, its products, reviews, methods, tools, and measures. The Ada Developers' Supplement provides additional detail on such topics as reuse, object-oriented analysis, and object-oriented design.

  13. Prince Edward Island Newstarts' Comprehensive Manpower Development System.

    ERIC Educational Resources Information Center

    Connor, Thomas R.

    1971-01-01

    An approach to new methods of helping disadvantaged people gain employment taken by Prince Edward Island Newstart is outlined. This approach is a Comprehensive Manpower Development System. The major components of the system consist of: (1) variants of some standard manpower training programs, (2) innovative recruitment and assignment techniques,…

  14. The Science ELF: Assessing the Enquiry Levels Framework as a Heuristic for Professional Development

    ERIC Educational Resources Information Center

    Wheeler, Lindsay B.; Bell, Randy L.; Whitworth, Brooke A.; Maeng, Jennifer L.

    2015-01-01

    This study utilized an explanatory sequential mixed methods approach to explore randomly assigned treatment and control participants' frequency of inquiry instruction in secondary science classrooms. Eleven treatment participants received professional development (PD) that emphasized a structured approach to inquiry instruction, while 10 control…

  15. Coaching: An Apprenticeship Approach for the 21st Century

    ERIC Educational Resources Information Center

    Salavert, Roser

    2015-01-01

    Coaching, an apprentice-based approach to support professional and personal development towards achieving set goals, is a well-established practice in the fields of sports training and management and one of the fastest growing professional development methods in the education field. How the coaching partnership fosters leadership and improves…

  16. Pedagogical Approaches to Develop Critical Thinking and Crisis Leadership

    ERIC Educational Resources Information Center

    Powley, Edward H.; Taylor, Scott N.

    2014-01-01

    Management schools must be prepared to aid leaders and managers to succeed in uncertain environments. We offer two approaches, each designed for critical thinking skill development, to teach graduate management students about leading in and through potential disruption to organizational life. First, we present a personalized case method that…

  17. The rank correlated SLW model of gas radiation in non-uniform media

    NASA Astrophysics Data System (ADS)

    Solovjov, Vladimir P.; Andre, Frederic; Lemonnier, Denis; Webb, Brent W.

    2017-08-01

    A comprehensive theoretical development of possible reference approaches in modelling of radiation transfer in non-uniform gaseous media is developed within the framework of the Generalized SLW Model. The notion of absorption spectrum "correlation" adopted currently for global methods in gas radiation is critically revisited and replaced by a less restrictive concept of rank correlated spectrum. Within this framework it is shown that eight different reference approaches are possible, of which only three have been reported in the literature. Among the approaches presented is a novel Rank Correlated SLW Model, which is distinguished by the fact that i) it does not require the specification of a reference gas thermodynamic state, and ii) it preserves the emission term in the spectrally integrated Radiative Transfer Equation. Construction of this reference model requires only two absorption line blackbody distribution functions, and subdivision into gray gases can be performed using standard quadratures. Consequently, this new reference approach appears to have significant advantages over all other methods, and is, in general, a significant improvement in the global modelling of gas radiation. All reference approaches are summarized in the present work, and their use in radiative transfer prediction is demonstrated for simple example cases. Further, a detailed rigorous theoretical development of the improved methods is provided.

  18. An Artificial Neural Networks Method for Solving Partial Differential Equations

    NASA Astrophysics Data System (ADS)

    Alharbi, Abir

    2010-09-01

    While many analytical and numerical techniques already exist for solving PDEs, this paper introduces an approach using artificial neural networks. The approach consists of a technique developed by combining the standard numerical method of finite differences with the Hopfield neural network. The method is denoted Hopfield-finite-difference (HFD). The architecture of the nets, the energy function, the updating equations, and the algorithms are developed for the method. The HFD method has been used successfully to approximate the solution of classical PDEs, such as the wave, heat, Poisson and diffusion equations, and of a system of PDEs. The software Matlab is used to obtain the results in both tabular and graphical form. The results are similar in terms of accuracy to those obtained by standard numerical methods. In terms of speed, the parallel nature of the Hopfield net methods makes them easier to implement on fast parallel computers, while some numerical methods need extra effort for parallelization.
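
    The finite-difference half of the combination is standard; a sketch of the explicit scheme for the 1-D heat equation is shown below (the Hopfield-network solution of the resulting algebraic system is not reproduced here):

        import numpy as np

        # Explicit finite differences for u_t = alpha * u_xx on [0, 1],
        # with u = 0 at both ends.
        alpha, nx = 1.0, 51
        dx = 1.0 / (nx - 1)
        dt = 0.4 * dx**2 / alpha            # respects the stability limit 0.5
        x = np.linspace(0.0, 1.0, nx)
        u = np.sin(np.pi * x)               # initial condition

        for _ in range(200):
            u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])

        print(u.max())   # decays roughly as exp(-pi**2 * alpha * t)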

  19. CARA: Cognitive Architecture for Reasoning About Adversaries

    DTIC Science & Technology

    2012-01-20

    In the synthesis approach taken here the KIDS principle (Keep It Descriptive, Stupid) applies, and agents and organizations are profiled in great detail… We developed two algorithms to make forecasts about adversarial behavior. We developed game-theoretical approaches to reason about group behavior. We developed methods to automatically make forecasts about group behavior, together with methods to quantify the uncertainty inherent in such forecasts…

  20. Technology Development Risk Assessment for Space Transportation Systems

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Godsell, Aga M.; Go, Susie

    2006-01-01

    A new approach for assessing development risk associated with technology development projects is presented. The method represents technology evolution in terms of sector-specific discrete development stages. A Monte Carlo simulation is used to generate development probability distributions based on statistical models of the discrete transitions. Development risk is derived from the resulting probability distributions and specific program requirements. Two sample cases are discussed to illustrate the approach, a single rocket engine development and a three-technology space transportation portfolio.
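
    A minimal sketch of the simulation idea: treat each technology as a chain of discrete development stages with per-year transition probabilities and sample completion times by Monte Carlo (the stage probabilities below are hypothetical, not the paper's statistical models):

        import numpy as np

        rng = np.random.default_rng(42)
        stage_success = [0.9, 0.7, 0.5, 0.8]    # per-year odds per stage

        def years_to_complete(p_stages):
            # Years spent in each stage follow a geometric distribution.
            return sum(int(rng.geometric(p)) for p in p_stages)

        samples = np.array([years_to_complete(stage_success)
                            for _ in range(10_000)])
        print("P(done within 8 years) =", (samples <= 8).mean())

    Development risk then falls out of comparing this completion-time distribution against the program's schedule requirement.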

  1. The regional approach and regional studies method in the process of geography teaching

    NASA Astrophysics Data System (ADS)

    Dermendzhieva, Stela; Doikov, Martin

    2017-03-01

    We define the regional approach as a manner of relating the global trends of development of the "Society-man-nature" system to the local, differentiating level of knowledge. These interactions interlace under the influence of the character of Geography as a science, of education, and of teaching approaches, goals and methods. Global, national and local development is differentiated into three concentric circles at the level of knowledge. The approach is conceived as a modern, complex and effective mechanism for young people, through which knowledge develops in a regional historical and cultural perspective, and self-consciousness of socio-economic and cultural integration is formed as part of the historical-geographical image of the native land. In this way an attitude towards the native land is formed as a connecting construct between patriotism for the motherland and patriotism in the global aspect. The possibility is outlined for integration and cooperation of the geographical educational content with all the local historical-geographical, regional, profession-orientating, artistic, municipal and district institutions. Contemporary geographical education appears to be a powerful and indispensable mechanism for the organization of human sciences, while the regional approach and the application of the regional studies method stimulate and motivate the development and realization of optimal capacities for a direct connection with local structures and environments.

  2. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  3. TH-CD-207A-07: Prediction of High Dimensional State Subject to Respiratory Motion: A Manifold Learning Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, W; Sawant, A; Ruan, D

    Purpose: The development of high dimensional imaging systems (e.g., volumetric MRI, CBCT, photogrammetry systems) in image-guided radiotherapy provides important pathways to the ultimate goal of real-time volumetric/surface motion monitoring. This study aims to develop a prediction method for the high dimensional state subject to respiratory motion. Compared to conventional linear dimension reduction based approaches, our method utilizes manifold learning to construct a descriptive feature submanifold, where more efficient and accurate prediction can be performed. Methods: We developed a prediction framework for the high-dimensional state subject to respiratory motion. The proposed method performs dimension reduction in a nonlinear setting to permit more descriptive features compared to its linear counterparts (e.g., classic PCA). Specifically, a kernel PCA is used to construct a proper low-dimensional feature manifold, where low-dimensional prediction is performed. A fixed-point iterative pre-image estimation method is applied subsequently to recover the predicted value in the original state space. We evaluated and compared the proposed method with a PCA-based method on 200 level-set surfaces reconstructed from surface point clouds captured by the VisionRT system. The prediction accuracy was evaluated with respect to root-mean-squared error (RMSE) for both 200 ms and 600 ms lookahead lengths. Results: The proposed method outperformed the PCA-based approach with statistically higher prediction accuracy. In a one-dimensional feature subspace, our method achieved mean prediction accuracies of 0.86 mm and 0.89 mm for the 200 ms and 600 ms lookahead lengths respectively, compared to 0.95 mm and 1.04 mm for the PCA-based method. Paired t-tests further demonstrated the statistical significance of the superiority of our method, with p-values of 6.33e-3 and 5.78e-5, respectively. Conclusion: The proposed approach benefits from the descriptiveness of a nonlinear manifold and the prediction reliability in such a low dimensional manifold. The fixed-point iterative approach works well in practice for the pre-image recovery. Our approach is particularly suitable for managing respiratory motion in image-guided radiotherapy. This work is supported in part by NIH grant R01 CA169102-02.
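
    A compact sketch of the described pipeline on synthetic data. Two stand-ins are assumed: scikit-learn's KernelPCA with its built-in (kernel-ridge) inverse_transform replaces the paper's fixed-point iterative pre-image estimation, and a naive linear extrapolation in feature space replaces the actual low-dimensional predictor.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(1)

# Synthetic stand-in for level-set surfaces: each row is a flattened "surface"
# driven by a 1-D respiratory phase (the study used VisionRT point clouds).
phase = np.linspace(0, 4 * np.pi, 400)
X = np.outer(np.sin(phase), rng.normal(size=50)) + 0.01 * rng.normal(size=(400, 50))

# Nonlinear dimension reduction onto a 1-D feature manifold.
kpca = KernelPCA(n_components=1, kernel="rbf", gamma=0.1,
                 fit_inverse_transform=True)
Z = kpca.fit_transform(X)

# Lookahead prediction in feature space, then pre-image recovery.
z_pred = Z[-1] + (Z[-1] - Z[-2])                 # crude linear extrapolation
x_pred = kpca.inverse_transform(z_pred.reshape(1, -1))
rmse = np.sqrt(np.mean((x_pred - X[-1]) ** 2))   # sanity check only
print("RMSE of predicted surface vs last frame:", rmse)
```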

  4. A transdisciplinary approach for supporting the integration of ecosystem services into land and water management

    NASA Astrophysics Data System (ADS)

    Fatt Siew, Tuck; Döll, Petra

    2015-04-01

    Transdisciplinary approaches are useful for supporting integrated land and water management. However, the implementation of the approach in practice to facilitate the co-production of useable socio-hydrological (and -ecological) knowledge among scientists and stakeholders is challenging. It requires appropriate methods to bring individuals with diverse interests and needs together and to integrate their knowledge for generating shared perspectives/understanding, identifying common goals, and developing actionable management strategies. The approach and the methods need, particularly, to be adapted to the local political and socio-cultural conditions. To demonstrate how knowledge co-production and integration can be done in practice, we present a transdisciplinary approach which has been implemented and adapted for supporting land and water management that takes ecosystem services into account in an arid region in northwestern China. Our approach comprises three steps: (1) stakeholder analysis and interdisciplinary knowledge integration, (2) elicitation of perspectives of scientists and stakeholders, scenario development, and identification of management strategies, and (3) evaluation of knowledge integration and social learning. Our adapted approach has enabled interdisciplinary and cross-sectoral communication among scientists and stakeholders. Furthermore, the application of a combination of participatory methods, including actor modeling, Bayesian Network modeling, and participatory scenario development, has contributed to the integration of system, target, and transformation knowledge of involved stakeholders. The realization of identified management strategies is unknown because other important and representative decision makers have not been involved in the transdisciplinary research process. The contribution of our transdisciplinary approach to social learning still needs to be assessed.

  5. Methodical Approaches to Determine the Level of Risk Associated with the Formation of the Capital Structure in Conditions of Unsteady Economy

    ERIC Educational Resources Information Center

    Petrovskaya, Maria V.; Larionova, Anna A.; Zaitseva, Natalia A.; Bondarchuk, Natalya V.; Grigorieva, Elena M.

    2016-01-01

    The relevance of the problem stated in the article is that in conditions of nonstationary economy the modification of existing approaches and methods is necessary during the formation of the capital. These methods allow taking into account the heterogeneity of factors' change in time and the purpose of the development of a particular company,…

  6. [Causal analysis approaches in epidemiology].

    PubMed

    Dumas, O; Siroux, V; Le Moual, N; Varraso, R

    2014-02-01

    Epidemiological research is mostly based on observational studies. Whether such studies can provide evidence of causation remains debated. Several causal analysis methods have been developed in epidemiology. This paper presents an overview of these methods: graphical models, path analysis and its extensions, and models based on the counterfactual approach, with a special emphasis on marginal structural models. Graphical approaches have been developed to allow synthetic representations of supposed causal relationships in a given problem. They serve as qualitative support in the study of causal relationships. The sufficient-component cause model has been developed to deal with the issue of multicausality raised by the emergence of chronic multifactorial diseases. Directed acyclic graphs are mostly used as a visual tool to identify possible sources of confounding in a study. Structural equation models, the main extension of path analysis, combine a system of equations and a path diagram representing a set of possible causal relationships. They allow quantification of direct and indirect effects in a general model in which several relationships can be tested simultaneously. Dynamic path analysis further takes into account the role of time. The counterfactual approach defines causality by comparing the observed event and the counterfactual event (the event that would have been observed if, contrary to fact, the subject had received a different exposure than the one actually received). This theoretical approach has shown the limits of traditional methods in addressing some causality questions. In particular, in longitudinal studies with time-varying confounding, classical methods (regressions) may be biased. Marginal structural models have been developed to address this issue. In conclusion, "causal models", though developed partly independently, rest on equivalent logical foundations. A crucial step in the application of these models is the formulation of causal hypotheses, which form the basis for all methodological choices. Beyond this step, recently developed statistical analysis tools offer new possibilities to delineate complex relationships, in particular in life course epidemiology. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  7. Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems

    NASA Technical Reports Server (NTRS)

    Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.

    2005-01-01

    The current standards for handling uncertainty in control systems use interval bounds to define the uncertain parameters. This approach gives no information about the likelihood of system performance, only the response bounds. When used in design, current methods such as μ-analysis can lead to overly conservative controller designs, because worst-case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strengths of each in handling probability. A hybrid method is developed using these reliability tools for efficiently propagating probabilistic uncertainty through classical control analysis problems. The method is applied to classical response analysis as well as to analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach for calculating the mean and variance of response cumulative distribution functions are shown. Results of the probabilistic analysis of a missile pitch control system and a non-collocated mass-spring system show the added information provided by this hybrid analysis.

  8. Improving the dictionary lookup approach for disease normalization using enhanced dictionary and query expansion

    PubMed Central

    Jonnagaddala, Jitendra; Jue, Toni Rose; Chang, Nai-Wen; Dai, Hong-Jie

    2016-01-01

    The rapidly increasing biomedical literature calls for an automatic approach to the recognition and normalization of disease mentions, in order to increase the precision and effectiveness of disease-based information retrieval. A variety of methods have been proposed to deal with the problem of disease named entity recognition and normalization. Among them, conditional random fields (CRFs) and dictionary lookup are widely used for named entity recognition and normalization, respectively. We herein developed a CRF-based model to allow automated recognition of disease mentions, and studied the effect of various techniques in improving the normalization results based on the dictionary lookup approach. The dataset from the BioCreative V CDR track was used to report the performance of the developed normalization methods and compare them with other existing dictionary lookup based normalization methods. The best configuration achieved an F-measure of 0.77 for disease normalization, outperforming the best dictionary lookup based baseline method studied in this work by an F-measure of 0.13. Database URL: https://github.com/TCRNBioinformatics/DiseaseExtract PMID:27504009

  9. Male contraception.

    PubMed

    Chao, Jing; Page, Stephanie T; Anderson, Richard A

    2014-08-01

    Clear evidence shows that many men and women would welcome new male methods of contraception, but none have become available. The hormonal approach is based on suppression of gonadotropins and thus of testicular function and spermatogenesis, and has been investigated for several decades. This approach can achieve sufficient suppression of spermatogenesis for effective contraception in most men, but not all; the basis for these men responding insufficiently is unclear. Alternatively, the non-hormonal approach is based on identifying specific processes in sperm development, maturation and function. A range of targets has been identified in animal models, and targeted effectively. This approach, however, remains in the pre-clinical domain at present. There are, therefore, grounds for considering that safe, effective and reversible methods of contraception for men can be developed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Designing eHealth that Matters via a Multidisciplinary Requirements Development Approach

    PubMed Central

    Wentzel, Jobke; Van Gemert-Pijnen, Julia EWC

    2013-01-01

    Background Requirements development is a crucial part of eHealth design. It entails all the activities devoted to requirements identification, the communication of requirements to other developers, and their evaluation. Currently, a requirements development approach geared towards the specifics of the eHealth domain is lacking. This is likely to result in a mismatch between the developed technology and end user characteristics, physical surroundings, and the organizational context of use. It also makes it hard to judge the quality of eHealth design, since evaluations cannot easily be geared to the main goals the technology is supposed to serve. Objective In order to facilitate the creation of eHealth that matters, we present a practical, multidisciplinary requirements development approach which is embedded in a holistic design approach for eHealth (the Center for eHealth Research roadmap) that incorporates both human-centered design and business modeling. Methods Our requirements development approach consists of five phases. In the first, preparatory, phase the project team is composed and the overall goal(s) of the eHealth intervention are decided upon. Second, primary end users and other stakeholders are identified by means of audience segmentation techniques and our stakeholder identification method. Third, the designated context of use is mapped and end users are profiled by means of requirements elicitation methods (eg, interviews, focus groups, or observations). Fourth, stakeholder values and eHealth intervention requirements are distilled from data transcripts, which leads to phase five, in which requirements are communicated to other developers using a requirements notation template we developed specifically for the context of eHealth technologies. Results The end result of our requirements development approach for eHealth interventions is a design document which includes functional and non-functional requirements, a list of stakeholder values, and end user profiles in the form of personas (fictitious end users, representative of a primary end user group). Conclusions The requirements development approach presented in this article enables eHealth developers to apply a systematic and multidisciplinary approach towards the creation of requirements. The cooperation between health, engineering, and social sciences creates a situation in which a mismatch between design, end users, and the organizational context can be avoided. Furthermore, we suggest evaluating eHealth on a feature-specific level in order to learn exactly why such a technology does or does not live up to its expectations. PMID:23796508

  11. Optic disk localization by a robust fusion method

    NASA Astrophysics Data System (ADS)

    Zhang, Jielin; Yin, Fengshou; Wong, Damon W. K.; Liu, Jiang; Baskaran, Mani; Cheng, Ching-Yu; Wong, Tien Yin

    2013-02-01

    Optic disk localization plays an important role in developing computer-aided diagnosis (CAD) systems for ocular diseases such as glaucoma, diabetic retinopathy and age-related macular degeneration. In this paper, we propose an intelligent fusion of methods for the localization of the optic disk in retinal fundus images. Three different approaches are developed to detect the location of the optic disk separately. The first is the maximum vessel crossing method, which finds the region with the largest number of blood vessel crossing points. The second is the multichannel thresholding method, targeting the area with the highest intensity. The third searches the vertical and horizontal regions of interest separately on the basis of blood vessel structure and neighborhood entropy profile. Finally, these three methods are combined using an intelligent fusion method to improve the overall accuracy. The proposed algorithm was tested on the STARE database and the ORIGAlight database, each consisting of images with various pathologies. The preliminary result on the STARE database reaches 81.5%, while a higher result of 99% is obtained for the ORIGAlight database. The proposed method outperforms each individual approach as well as a state-of-the-art method that utilizes an intensity-based approach. The results demonstrate a high potential for this method to be used in retinal CAD systems.

  12. Appreciative Inquiry as a Method for Participatory Change in Secondary Schools in Lebanon

    ERIC Educational Resources Information Center

    Shuayb, Maha

    2014-01-01

    Appreciative inquiry is a strategy which takes a positive approach to organizational development. It aims to identify good practice, design effective development plans, and ensure implementation. This article examines the potentials and limitations of using the appreciative inquiry in a mixed methods research design for developing school…

  13. Developing a Competency-Based Pan-European Accreditation Framework for Health Promotion

    ERIC Educational Resources Information Center

    Battel-Kirk, Barbara; Van der Zanden, Gerard; Schipperen, Marielle; Contu, Paolo; Gallardo, Carmen; Martinez, Ana; Garcia de Sola, Silvia; Sotgiu, Alessandra; Zaagsma, Miriam; Barry, Margaret M.

    2012-01-01

    Background: The CompHP Pan-European Accreditation Framework for Health Promotion was developed as part of the CompHP Project that aimed to develop competency-based standards and an accreditation system for health promotion practice, education, and training in Europe. Method: A phased, multiple-method approach was employed to facilitate consensus…

  14. Processing ultrasound backscatter to monitor high-intensity focused ultrasound (HIFU) therapy

    NASA Astrophysics Data System (ADS)

    Kaczkowski, Peter J.; Anand, Ajay; Bailey, Michael R.

    2005-09-01

    The development of new noninvasive surgical methods such as HIFU for the treatment of cancer and internal bleeding requires the simultaneous development of new sensing approaches to guide, monitor, and assess the therapy. Ultrasound imaging using echo amplitude has long been used to map tissue morphology for diagnostic interpretation by the clinician. New quantitative ultrasonic methods that rely on amplitude and phase processing for tissue characterization are being developed for monitoring of ablative therapy. We have been developing the use of full wave ultrasound backscattering for real-time temperature estimation, and to image changes in the tissue backscatter spectrum as therapy progresses. Both approaches rely on differential processing of the backscatter signal in time, and on precise measurement of phase differences. Noise and artifacts from motion and nonstationary speckle statistics are addressed by constraining inversions for tissue parameters with physical models. We present results of experiments with static point and scanned HIFU exposures in which temperature rise can be accurately mapped using a new heat transfer equation (HTE) model-constrained inverse approach. We also present results of a recently developed spectral imaging method that elucidates microbubble-mediated nonlinearity not visible as a change in backscatter amplitude. [Work supported by Army MRMC.]

  15. Coupling of Peridynamics and Finite Element Formulation for Multiscale Simulations

    DTIC Science & Technology

    2012-10-16

    unidirectional fiber-reinforced composites, Computer Methods in Applied Mechanics and Engineering 217 (2012) 247-261. [44] S. A. Silling, M. Epton...numerical testing for different grid-width to horizon ratios, (4) development of an approach to add another material variable in the given approach...partition of unity principle

  16. Quantification of Uncertainty in the Flood Frequency Analysis

    NASA Astrophysics Data System (ADS)

    Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.

    2017-12-01

    Flood frequency analysis (FFA) is usually carried out for the planning and design of water resources and hydraulic structures. Owing to variability in sample representation, in the selection of the distribution, and in the estimation of distribution parameters, the estimation of flood quantiles has always been uncertain. Hence, suitable approaches must be developed to quantify this uncertainty in the form of a prediction interval, as an alternative to the deterministic approach. The framework developed in the present study incorporates uncertainty into FFA through a multi-objective optimization approach that constructs the prediction interval from an ensemble of flood quantiles. Through this approach, an optimal variability of distribution parameters is identified for carrying out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow River at Calgary and Banff, Canada) are used. A major focus of the present study was to evaluate the changes in the magnitude of flood quantiles due to the recent extreme flood event that occurred during the year 2013. In addition, the efficacy of the proposed method was verified against standard bootstrap-based sampling approaches, and the proposed method was found to be more reliable in modeling extreme floods than the bootstrap methods.
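
    A minimal sketch of the standard bootstrap baseline against which the proposed method was verified (the multi-objective optimization approach itself is not reproduced here): fit a GEV distribution to synthetic annual maxima and bootstrap the 100-year flood quantile into an interval.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic annual maximum flows (m^3/s) standing in for a gauge record.
ams = stats.genextreme.rvs(c=-0.1, loc=300, scale=80, size=60, random_state=rng)

T = 100                                 # return period of interest, years
p = 1 - 1 / T                           # non-exceedance probability

def flood_quantile(sample):
    """Fit a GEV distribution and return the T-year flood quantile."""
    c, loc, scale = stats.genextreme.fit(sample)
    return stats.genextreme.ppf(p, c, loc=loc, scale=scale)

# Nonparametric bootstrap: resample the record, refit, collect the quantile.
boots = [flood_quantile(rng.choice(ams, size=ams.size, replace=True))
         for _ in range(500)]
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"100-year flood: {flood_quantile(ams):.0f} m^3/s "
      f"(95% bootstrap interval {lo:.0f}-{hi:.0f})")
```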

  17. Identifying Consumer’s Needs of Health Information Technology through an Innovative Participatory Design Approach among English- and Spanish-speaking Urban Older Adults

    PubMed Central

    Sheehan, B.; Yen, P.; Velez, O.; Nobile-Hernandez, D.; Tiase, V.

    2014-01-01

    Summary Objectives We describe an innovative community-centered participatory design approach, Consumer-centered Participatory Design (C2PD), and the results of applying C2PD to design and develop a web-based fall prevention system. Methods We conducted focus groups and design sessions with English- and Spanish-speaking community-dwelling older adults. Focus group data were summarized and used to inform the context of the design sessions. Descriptive content analysis methods were used to develop categorical descriptions of design session informants' needs related to information technology. Results The C2PD approach enabled the assessment and identification of informants' needs of health information technology (HIT) that informed the development of a fall prevention system. We learned that our informants needed a system that provides variation in functions/content; differentiates between actionable/non-actionable information/structures; and contains sensory cues that support wide-ranging and complex tasks in a varied, simple, and clear interface to facilitate self-management. Conclusions The C2PD approach provides community-based organizations, academic researchers, and commercial entities with a systematic, theoretically informed approach to developing HIT innovations. Our community-centered participatory design approach focuses on consumers' technology needs while taking into account core public health functions. PMID:25589909

  18. Definition of perspective scheme of organization of traffic using methods of forecasting and modeling

    NASA Astrophysics Data System (ADS)

    Vlasov, V. M.; Novikov, A. N.; Novikov, I. A.; Shevtsova, A. G.

    2018-03-01

    In highly developed urban agglomerations one of the main problems is the inability of the road network to cope with a high level of motorization. The introduction of intelligent transport systems can address this problem, but the main question in their implementation remains open: to what extent a given method of improving the transport network will be effective, and whether it can absorb the growth in vehicles, especially over the long term. The main goal of this work was the development of an approach to forecasting the increase in traffic flow intensity for a long-term period using the population and the level of motorization. The developed approach makes it possible to determine the projected population and, taking the level of motorization into account, to determine the growth factor of the traffic flow intensity, which allows the intensity for a long-term period to be calculated with high accuracy. The analysis of the main methods for predicting the characteristics of the traffic stream is performed, and the basic values and parameters necessary for their use are established. An analysis of the urban settlement is carried out and the level of motorization characteristic of the given locality is determined. A new approach to predicting the intensity of the traffic flow has been developed, which makes it possible to predict the change in the transport situation in the long term with high accuracy. Calculations of the magnitude of the intensity increase on the basis of the developed forecasting method are made, and the errors in the data obtained are determined. The main recommendations on the use of the developed forecasting approach for the long-term functioning of the road network are formulated.
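
    The arithmetic core of the forecast can be stated in a few lines. All numbers below are illustrative assumptions, not the paper's data: the projected population and expected motorization level give a future vehicle fleet, and the fleet ratio is the growth factor applied to the present-day intensity.

```python
# Project population, convert to vehicle fleet via the motorization level,
# and scale today's traffic intensity by the fleet growth factor.
pop_now = 390_000          # current population (assumed)
pop_growth = 0.004         # assumed annual population growth rate
motor_now = 310            # cars per 1000 inhabitants today (assumed)
motor_future = 380         # expected motorization level at the horizon
years = 15                 # long-term planning horizon

pop_future = pop_now * (1 + pop_growth) ** years
fleet_now = pop_now * motor_now / 1000
fleet_future = pop_future * motor_future / 1000

growth_factor = fleet_future / fleet_now
intensity_now = 1200       # veh/h at a reference intersection (assumed)
print(f"growth factor: {growth_factor:.2f}, "
      f"forecast intensity: {growth_factor * intensity_now:.0f} veh/h")
```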

  19. Dynamic adaptive learning for decision-making supporting systems

    NASA Astrophysics Data System (ADS)

    He, Haibo; Cao, Yuan; Chen, Sheng; Desai, Sachi; Hohil, Myron E.

    2008-03-01

    This paper proposes a novel adaptive learning method for data mining in support of decision-making systems. Due to the inherent information ambiguity/uncertainty, high dimensionality and noise in many homeland security and defense applications, such as surveillance, monitoring, net-centric battlefield, and others, it is critical to develop autonomous learning methods that efficiently learn useful information from raw data to help the decision-making process. The proposed method is based on a dynamic learning principle in feature spaces. Generally speaking, conventional approaches to learning from high dimensional data sets include various feature extraction (principal component analysis, wavelet transform, and others) and feature selection (embedded approach, wrapper approach, filter approach, and others) methods. However, only a limited understanding of adaptive learning across different feature spaces has been achieved. We propose an integrative approach that takes advantage of feature selection and hypothesis ensemble techniques to achieve our goal. Based on the training data distributions, a feature score function is used to measure the importance of different features for learning purposes. Multiple hypotheses are then iteratively developed in different feature spaces according to their learning capabilities. Unlike the preset iteration steps in many existing ensemble learning approaches, such as adaptive boosting (AdaBoost), the iterative learning process stops automatically when the intelligent system cannot provide a better understanding than a random guess in that particular subset of feature spaces. Finally, a voting algorithm combines all the decisions from the different hypotheses to provide the final prediction results. Simulation analyses of the proposed method on classification of different US military aircraft databases show its effectiveness.
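
    A hedged sketch of the described loop on synthetic data. The paper's actual feature score function, base learners, and stopping test are not specified here, so mutual information, shallow decision trees, and a chance-level training-accuracy test stand in for them.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Score features, grow one hypothesis per feature subspace in decreasing order
# of importance, stop when a hypothesis is no better than chance, then vote.
X, y = make_classification(n_samples=2000, n_features=30, n_informative=8,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

score = mutual_info_classif(Xtr, ytr, random_state=0)   # feature score function
order = np.argsort(score)[::-1]                         # best features first

votes = []
for k in range(len(order) // 3):
    cols = order[3 * k: 3 * (k + 1)]                    # next feature subspace
    h = DecisionTreeClassifier(max_depth=3, random_state=0).fit(Xtr[:, cols], ytr)
    if h.score(Xtr[:, cols], ytr) <= 0.5:               # no better than a guess:
        break                                           # stop adding hypotheses
    votes.append(h.predict(Xte[:, cols]))

ensemble = (np.mean(votes, axis=0) > 0.5).astype(int)   # majority vote
print("ensemble accuracy:", np.mean(ensemble == yte))
```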

  20. Development of a Web-Based Health Care Intervention for Patients With Heart Disease: Lessons Learned From a Participatory Design Study

    PubMed Central

    2017-01-01

    Background The use of telemedicine technologies in health care has increased substantially, together with a growing interest in participatory design methods when developing telemedicine approaches. Objective We present lessons learned from a case study involving patients with heart disease and health care professionals in the development of a personalized Web-based health care intervention. Methods We used a participatory design approach inspired by the method for feasibility studies in software development. We collected qualitative data using multiple methods in 3 workshops and analyzed the data using thematic analysis. Participants were 7 patients with diagnosis of heart disease, 2 nurses, 1 physician, 2 systems architects, 3 moderators, and 3 observers. Results We present findings in 2 parts. (1) Outcomes of the participatory design process: users gave valuable feedback on ease of use of the platforms’ tracking tools, platform design, terminology, and insights into patients’ monitoring needs, information and communication technologies skills, and preferences for self-management tools. (2) Experiences from the participatory design process: patients and health care professionals contributed different perspectives, with the patients using an experience-based approach and the health care professionals using a more attitude-based approach. Conclusions The essential lessons learned concern planning and organization of workshops, including the finding that patients engaged actively and willingly in a participatory design process, whereas it was more challenging to include and engage health care professionals. PMID:28526674

  1. Thermal Conductivity of Metallic Uranium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hin, Celine

    This project has developed modeling and simulation approaches to predict the thermal conductivity of metallic fuels and their alloys, focusing on two methods. The first method, developed by the team at the University of Wisconsin Madison, is a practical and general modeling approach for the thermal conductivity of metals and metal alloys that integrates ab-initio and semi-empirical physics-based models to maximize the strengths of both techniques. The second method, developed by the team at Virginia Tech, determines the thermal conductivity using only ab-initio methods, without any fitting parameters. The two methods were complementary. The models incorporate both phonon and electron contributions, and good agreement with experimental data over a wide temperature range was found. The models also provide insight into the different physical factors that govern the thermal conductivity at different temperatures, and are general enough to incorporate more complex effects, such as additional alloying species, defects, transmutation products and noble gas bubbles, in order to predict the behavior of complex metallic alloys like U-alloy fuel systems under burnup.
    Introduction: Thermal conductivity is an important thermal physical property affecting the performance and efficiency of metallic fuels [1]. Some experimental measurements of thermal conductivity, and its correlation with composition and temperature from empirical fitting, are available for U, Zr and their alloys with Pu and other minor actinides. However, as reviewed by Kim, Cho and Sohn [2], due to the difficulty of doing experiments on actinide materials, thermal conductivities of metallic fuels have only been measured at limited alloy compositions and temperatures, and some reported values are even negative and unphysical. Furthermore, the correlations developed so far are empirical in nature and may not be accurate when used for prediction at conditions far from those used in the original fitting. Moreover, as fuels burn up in the reactor and fission products build up, thermal conductivity also changes significantly [3]; fundamental understanding of the effect of fission products is currently lacking. In this project, we probe the thermal conductivity of metallic fuels with ab-initio calculations, a theoretical tool with the potential to yield better accuracy and predictive power than empirical fitting. This work both complements experimental data, by determining thermal conductivity over wider composition and temperature ranges than are available experimentally, and develops mechanistic understanding to guide better design of metallic fuels in the future. So far, we have focused on the α-U perfect crystal, the ground-state phase of U metal. Both methods proved complementary and very helpful for understanding the physics behind the thermal conductivity of metallic uranium and other materials with similar characteristics.
    In Section I, the combined model developed at UWM is explained. In Section II, the ab-initio method developed at VT is described along with the uranium pseudo-potential and its validation. Section III is devoted to the work done by Jianguo Yu at INL. Finally, we present the performance of the project in terms of milestones, publications, and presentations.

  2. Design synthesis and optimization of permanent magnet synchronous machines based on computationally-efficient finite element analysis

    NASA Astrophysics Data System (ADS)

    Sizov, Gennadi Y.

    In this dissertation, a model-based multi-objective optimal design of permanent magnet ac machines, supplied by sine-wave current regulated drives, is developed and implemented. The design procedure uses an efficient electromagnetic finite element-based solver to accurately model nonlinear material properties and complex geometric shapes associated with magnetic circuit design. Application of an electromagnetic finite element-based solver allows for accurate computation of intricate performance parameters and characteristics. The first contribution of this dissertation is the development of a rapid computational method that allows accurate and efficient exploration of large multi-dimensional design spaces in search of optimum design(s). The computationally efficient finite element-based approach developed in this work provides a framework of tools that allow rapid analysis of synchronous electric machines operating under steady-state conditions. In the developed modeling approach, major steady-state performance parameters such as, winding flux linkages and voltages, average, cogging and ripple torques, stator core flux densities, core losses, efficiencies and saturated machine winding inductances, are calculated with minimum computational effort. In addition, the method includes means for rapid estimation of distributed stator forces and three-dimensional effects of stator and/or rotor skew on the performance of the machine. The second contribution of this dissertation is the development of the design synthesis and optimization method based on a differential evolution algorithm. The approach relies on the developed finite element-based modeling method for electromagnetic analysis and is able to tackle large-scale multi-objective design problems using modest computational resources. Overall, computational time savings of up to two orders of magnitude are achievable, when compared to current and prevalent state-of-the-art methods. These computational savings allow one to expand the optimization problem to achieve more complex and comprehensive design objectives. The method is used in the design process of several interior permanent magnet industrial motors. The presented case studies demonstrate that the developed finite element-based approach practically eliminates the need for using less accurate analytical and lumped parameter equivalent circuit models for electric machine design optimization. The design process and experimental validation of the case-study machines are detailed in the dissertation.
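
    A sketch of the outer design-synthesis loop using SciPy's differential evolution. The analytic objective below is a made-up placeholder for the dissertation's computationally efficient finite element evaluation, and the design variables and bounds are likewise illustrative.

```python
import numpy as np
from scipy.optimize import differential_evolution

def objective(x):
    """Placeholder for an FE evaluation: score a candidate machine design."""
    magnet_thickness, slot_depth, stack_length = x
    torque = 50 * magnet_thickness * stack_length / (1e-3 + slot_depth)  # toy model
    loss = 200 * magnet_thickness**2 + 30 * slot_depth                   # toy model
    return -(torque - 0.05 * loss)       # maximize torque, penalize losses

bounds = [(2e-3, 8e-3),     # magnet thickness, m
          (10e-3, 30e-3),   # slot depth, m
          (0.05, 0.20)]     # stack length, m

result = differential_evolution(objective, bounds, seed=0, maxiter=200)
print("best design:", result.x, "objective:", -result.fun)
```

    In the dissertation's setting, each objective evaluation would invoke the rapid finite element solver, which is what makes a population-based search like this affordable.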

  3. Automated statistical experimental design approach for rapid separation of coenzyme Q10 and identification of its biotechnological process related impurities using UHPLC and UHPLC-APCI-MS.

    PubMed

    Talluri, Murali V N Kumar; Kalariya, Pradipbhai D; Dharavath, Shireesha; Shaikh, Naeem; Garg, Prabha; Ramisetti, Nageswara Rao; Ragampeta, Srinivas

    2016-09-01

    A novel ultra high performance liquid chromatography method development strategy was established by applying a quality-by-design approach. The developed systematic approach was divided into five steps: (i) analytical target profile, (ii) critical quality attributes, (iii) risk assessment of critical parameters using design of experiments (screening and optimization phases), (iv) generation of the design space, and (v) process capability analysis (Cp) for the robustness study using Monte Carlo simulation. The complete quality-by-design-based method development was automated and expedited by employing a sub-2 μm particle column with an ultra high performance liquid chromatography system. Successful chromatographic separation of coenzyme Q10 from its biotechnological process related impurities was achieved on a Waters Acquity phenyl hexyl (100 mm × 2.1 mm, 1.7 μm) column with gradient elution of 10 mM ammonium acetate buffer (pH 4.0) and a mixture of acetonitrile/2-propanol (1:1) as the mobile phase. Through this study, a fast and organized method development workflow was established, and the robustness of the method was demonstrated. The method was validated for specificity, linearity, accuracy, precision, and robustness in compliance with the International Conference on Harmonization Q2(R1) guidelines. The impurities were identified by the atmospheric pressure chemical ionization-mass spectrometry technique. Further, the in silico toxicity of the impurities was analyzed using TOPKAT and DEREK software. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Improving Upon String Methods for Transition State Discovery.

    PubMed

    Chaffey-Millar, Hugh; Nikodem, Astrid; Matveev, Alexei V; Krüger, Sven; Rösch, Notker

    2012-02-14

    Transition state discovery via application of string methods has been researched on two fronts. The first front involves development of a new string method, named the Searching String method, while the second one aims at estimating transition states from a discretized reaction path. The Searching String method has been benchmarked against a number of previously existing string methods and the Nudged Elastic Band method. The developed methods have led to a reduction in the number of gradient calls required to optimize a transition state, as compared to existing methods. The Searching String method reported here places new beads on a reaction pathway at the midpoint between existing beads, such that the resolution of the path discretization in the region containing the transition state grows exponentially with the number of beads. This approach leads to favorable convergence behavior and generates more accurate estimates of transition states from which convergence to the final transition states occurs more readily. Several techniques for generating improved estimates of transition states from a converged string or nudged elastic band have been developed and benchmarked on 13 chemical test cases. Optimization approaches for string methods, and pitfalls therein, are discussed.
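
    A toy illustration of the midpoint bead-placement rule described above, under a simplifying assumption: the "reaction path" is a 1-D coordinate with a known energy, and the gradient-driven string relaxation of the real method is omitted. Repeatedly inserting a bead at the highest-energy midpoint concentrates the path resolution around the barrier top.

```python
def energy(x):
    return (x * x - 1.0) ** 2        # double well: minima at +/-1, barrier at 0

path = [-1.0, 1.0]                   # reactant and product beads
for _ in range(8):
    mids = [(path[i] + path[i + 1]) / 2 for i in range(len(path) - 1)]
    j = max(range(len(mids)), key=lambda i: energy(mids[i]))
    path.insert(j + 1, mids[j])      # new bead at the highest-energy midpoint

ts = max(path, key=energy)           # transition state estimate from the path
print(f"transition state estimate: x = {ts:.3f}, E = {energy(ts):.3f}")
```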

  5. An Ensemble Approach for Drug Side Effect Prediction

    PubMed Central

    Jahid, Md Jamiul; Ruan, Jianhua

    2014-01-01

    In silico prediction of drug side-effects in the early stages of drug development is becoming increasingly popular, as it reduces not only the time for drug design but also drug development costs. In this article we propose an ensemble approach to predict the side-effects of drug molecules based on their chemical structure. Our idea originates from the observation that similar drugs have similar side-effects. Based on this observation, we design an ensemble approach that combines the results from different classification models, where each model is generated from a different set of similar drugs. We applied our approach to 1385 side-effects in the SIDER database for 888 drugs. Results show that our approach outperformed previously published approaches and standard classifiers. Furthermore, we applied our method to a number of uncharacterized drug molecules in the DrugBank database and predicted their side-effect profiles for future use. Results from various sources confirm that our method is able to predict the side-effects of uncharacterized drugs and, more importantly, is able to predict rare side-effects, which are often ignored by other approaches. The method described in this article can be used to predict side-effects at an early stage of drug design, reducing experimental cost and time. PMID:25327524
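
    A minimal sketch of the "similar drugs have similar side-effects" ensemble on synthetic fingerprints: drugs are grouped by structural similarity, one base classifier is trained per group, and the group models' predicted probabilities are averaged. The clustering and classifier choices are assumptions for illustration, not the article's exact models.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

# Synthetic binary structure fingerprints and labels for one side-effect.
X = rng.integers(0, 2, size=(888, 166)).astype(float)
y = rng.integers(0, 2, size=888)

groups = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

models = []
for g in range(5):
    mask = groups == g
    if len(np.unique(y[mask])) > 1:          # need both classes to fit a model
        models.append(LogisticRegression(max_iter=1000).fit(X[mask], y[mask]))

x_new = rng.integers(0, 2, size=(1, 166)).astype(float)  # uncharacterized drug
p = np.mean([m.predict_proba(x_new)[0, 1] for m in models])
print("ensemble P(side-effect):", round(float(p), 3))
```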

  6. A software technology evaluation program

    NASA Technical Reports Server (NTRS)

    Novaes-Card, David N.

    1985-01-01

    A set of quantitative approaches is presented for evaluating software development methods and tools. The basic idea is to generate a set of goals which are refined into quantifiable questions, which in turn specify metrics to be collected on the software development and maintenance process and product. These metrics can be used to characterize, evaluate, predict, and motivate. They can be used actively as well as passively, by learning from the analysis of the data and improving the methods and tools based on what is learned from that analysis. Several examples are given, representing each of the different approaches to evaluation. The cost of the approaches varied inversely with the level of confidence in the interpretation of the results.

  7. Integrated Approaches to Testing and Assessment: OECD Activities on the Development and Use of Adverse Outcome Pathways and Case Studies.

    PubMed

    Sakuratani, Yuki; Horie, Masashi; Leinala, Eeva

    2018-01-09

    The Organisation for Economic Co-operation and Development (OECD) works with member countries and other stakeholders to improve and harmonize chemical assessment methods. In 2012, the OECD Adverse Outcome Pathways (AOPs) Development Programme started. The Programme has published six AOPs thus far and more than 60 AOPs are under various stages of development under the Programme. This article reviews recent OECD activities on the use of AOPs in developing Integrated Approaches to Testing and Assessments (IATAs). The guidance document for the use of AOPs in developing IATA, published in 2016, provides a framework for developing and using IATA and describes how IATA can be based on an AOP. The guidance document on the reporting of defined approaches to be used within IATA, also published in 2016, provides a set of principles for reporting defined approaches to testing and assessment to facilitate their evaluation. In the guidance documents, the AOP concept plays an important role for building IATA approaches in a science-based and transparent way. In 2015, the IATA Case Studies Project was launched to increase experience with the use of IATA and novel hazard methodologies by developing case studies, which constitute examples of predictions that are fit-for-regulatory use. This activity highlights the importance of international collaboration for harmonizing and improving chemical safety assessment methods. © 2018 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).

  8. A Generalized Approach for Measuring Relationships Among Genes.

    PubMed

    Wang, Lijun; Ahsan, Md Asif; Chen, Ming

    2017-07-21

    Several methods for identifying relationships between pairs of genes have been developed. In this article, we present a generalized approach for measuring relationships between any pair of genes, based on statistical prediction. We derive two particular versions of the generalized approach: least squares estimation (LSE) and nearest neighbors prediction (NNP). By mathematical proof, LSE is equivalent to the methods based on correlation, and NNP approximates one popular method, the maximal information coefficient (MIC), according to its performance in simulations and on a real dataset. Moreover, the approach based on statistical prediction can be extended from two-gene relationships to multi-gene relationships, which would help to identify relationships among multiple genes.
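
    A small numerical illustration of the two derived versions on one simulated gene pair: the LSE score (equivalent, per the article, to correlation-based measures; reported here as r²) and a simple nearest-neighbors prediction score. The exact NNP scoring used in the paper may differ from this sketch.

```python
import numpy as np

rng = np.random.default_rng(4)

x = rng.normal(size=200)                        # expression of gene 1
y = np.sin(2 * x) + 0.2 * rng.normal(size=200)  # nonlinear dependence on gene 1

lse_score = np.corrcoef(x, y)[0, 1] ** 2        # variance explained by a line

def nnp_score(x, y, k=5):
    """Predict each y[i] from the y-values of the k nearest x-neighbors."""
    pred = np.empty_like(y)
    for i in range(len(x)):
        nn = np.argsort(np.abs(x - x[i]))[1:k + 1]   # skip the point itself
        pred[i] = y[nn].mean()
    return 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)

# The nonlinear relationship is nearly invisible to LSE but clear to NNP.
print(f"LSE (r^2): {lse_score:.2f}, NNP: {nnp_score(x, y):.2f}")
```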

  9. Paving the COWpath: data-driven design of pediatric order sets

    PubMed Central

    Zhang, Yiye; Padman, Rema; Levin, James E

    2014-01-01

    Objective Evidence indicates that users incur significant physical and cognitive costs in the use of order sets, a core feature of computerized provider order entry systems. This paper develops data-driven approaches for automating the construction of order sets that match closely with user preferences and workflow while minimizing physical and cognitive workload. Materials and methods We developed and tested optimization-based models embedded with clustering techniques using physical and cognitive click cost criteria. By judiciously learning from users’ actual actions, our methods identify items for constituting order sets that are relevant according to historical ordering data and grouped on the basis of order similarity and ordering time. We evaluated performance of the methods using 47 099 orders from the year 2011 for asthma, appendectomy and pneumonia management in a pediatric inpatient setting. Results In comparison with existing order sets, those developed using the new approach significantly reduce the physical and cognitive workload associated with usage by 14–52%. This approach is also capable of accommodating variations in clinical conditions that affect order set usage and development. Discussion There is a critical need to investigate the cognitive complexity imposed on users by complex clinical information systems, and to design their features according to ‘human factors’ best practices. Optimizing order set generation using cognitive cost criteria introduces a new approach that can potentially improve ordering efficiency, reduce unintended variations in order placement, and enhance patient safety. Conclusions We demonstrate that data-driven methods offer a promising approach for designing order sets that are generalizable, data-driven, condition-based, and up to date with current best practices. PMID:24674844

  10. Using Mixed Methods to Analyze Video Data: A Mathematics Teacher Professional Development Example

    ERIC Educational Resources Information Center

    DeCuir-Gunby, Jessica T.; Marshall, Patricia L.; McCulloch, Allison W.

    2012-01-01

    This article uses data from 65 teachers participating in a K-2 mathematics professional development research project as an example of how to analyze video recordings of teachers' classroom lessons using mixed methods. Through their discussion, the authors demonstrate how using a mixed methods approach to classroom video analysis allows researchers…

  11. Methods of the Development Strategy of Service Companies: Logistical Approach

    ERIC Educational Resources Information Center

    Toymentseva, Irina A.; Karpova, Natalya P.; Toymentseva, Angelina A.; Chichkina, Vera D.; Efanov, Andrey V.

    2016-01-01

    The urgency of the analyzed issue is due to lack of attention of heads of service companies to the theory and methodology of strategic management, methods and models of management decision-making in times of economic instability. The purpose of the article is to develop theoretical positions and methodical recommendations on the formation of the…

  12. A simple finite element method for non-divergence form elliptic equation

    DOE PAGES

    Mu, Lin; Ye, Xiu

    2017-03-01

    Here, we develop a simple finite element method for solving second order elliptic equations in non-divergence form by combining the least-squares concept with discontinuous approximations. This simple method has a symmetric positive definite system and can be easily analyzed and implemented. General meshes with polytopal elements and hanging nodes can also be used in the method. We prove that our finite element solution converges to the true solution as the mesh size approaches zero. Numerical examples demonstrate the robustness and flexibility of the method.
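
    For concreteness, the problem class can be stated in its usual textbook form (the paper's precise assumptions on the coefficients may differ):

```latex
\text{Find } u \text{ such that}\quad
\sum_{i,j=1}^{d} a_{ij}(x)\,\partial^{2}_{x_i x_j} u = f \quad \text{in } \Omega,
\qquad u = g \quad \text{on } \partial\Omega .
```

    Because the coefficient matrix (a_ij) is merely uniformly elliptic and need not be differentiable, the equation cannot be integrated by parts into the usual weak (divergence) form, which is what motivates a least-squares formulation over discontinuous approximations.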

  13. A simple finite element method for non-divergence form elliptic equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mu, Lin; Ye, Xiu

    Here, we develop a simple finite element method for solving second order elliptic equations in non-divergence form by combining the least-squares concept with discontinuous approximations. This simple method has a symmetric positive definite system and can be easily analyzed and implemented. General meshes with polytopal elements and hanging nodes can also be used in the method. We prove that our finite element solution converges to the true solution as the mesh size approaches zero. Numerical examples demonstrate the robustness and flexibility of the method.

  14. Demonstrating the Effectiveness of an Integrated and Intensive Research Methods and Statistics Course Sequence

    ERIC Educational Resources Information Center

    Pliske, Rebecca M.; Caldwell, Tracy L.; Calin-Jageman, Robert J.; Taylor-Ritzler, Tina

    2015-01-01

    We developed a two-semester series of intensive (six-contact hours per week) behavioral research methods courses with an integrated statistics curriculum. Our approach includes the use of team-based learning, authentic projects, and Excel and SPSS. We assessed the effectiveness of our approach by examining our students' content area scores on the…

  15. Translating Basic Behavioral and Social Science Research to Clinical Application: The EVOLVE Mixed Methods Approach

    ERIC Educational Resources Information Center

    Peterson, Janey C.; Czajkowski, Susan; Charlson, Mary E.; Link, Alissa R.; Wells, Martin T.; Isen, Alice M.; Mancuso, Carol A.; Allegrante, John P.; Boutin-Foster, Carla; Ogedegbe, Gbenga; Jobe, Jared B.

    2013-01-01

    Objective: To describe a mixed-methods approach to develop and test a basic behavioral science-informed intervention to motivate behavior change in 3 high-risk clinical populations. Our theoretically derived intervention comprised a combination of positive affect and self-affirmation (PA/SA), which we applied to 3 clinical chronic disease…

  16. The Effects of Jigsaw Technique Based on Cooperative Learning on Prospective Science Teachers' Science Process Skill

    ERIC Educational Resources Information Center

    Karacop, Ataman; Diken, Emine Hatun

    2017-01-01

    The purpose of this study is to investigate the effects of laboratory approach based on jigsaw method with cooperative learning and confirmatory laboratory approach on university students' cognitive process development in Science teaching laboratory applications, and to determine the opinions of the students on applied laboratory methods. The…

  17. The Evaluation of HRD: A Critical Study with Applications

    ERIC Educational Resources Information Center

    Tome, Eduardo

    2009-01-01

    Purpose: The purpose of this paper is to analyze critically the most important methods that are used in the evaluation of human resource development (HRD). Design/methodology/approach: The approach is to ask two questions: What are the methods available to define the impact of HRD in the economy? How can we evaluate the evaluations that have been…

  18. Probe molecules (PrM) approach in adverse outcome pathway (AOP) based high throughput screening (HTS): in vivo discovery for developing in vitro target methods

    EPA Science Inventory

    Efficient and accurate adverse outcome pathway (AOP) based high-throughput screening (HTS) methods use a systems biology based approach to computationally model in vitro cellular and molecular data for rapid chemical prioritization; however, not all HTS assays are grounded by rel...

  19. Application of Grey Relational Analysis to Decision-Making during Product Development

    ERIC Educational Resources Information Center

    Hsiao, Shih-Wen; Lin, Hsin-Hung; Ko, Ya-Chuan

    2017-01-01

    A multi-attribute decision-making (MADM) approach was proposed in this study as a prediction method that differs from the conventional production and design methods for a product. When a client has different dimensional requirements, this approach can quickly provide a company with design decisions for each product. The production factors of a…

  20. Empirical and Clinical Methods in the Assessment of Personality and Psychopathology: An Integrative Approach for Training

    ERIC Educational Resources Information Center

    Flanagan, Rosemary; Esquivel, Giselle B.

    2006-01-01

    School psychologists have a critical role in identifying social-emotional problems and psychopathology in youth based on a set of personality-assessment competencies. The development of competencies in assessing personality and psychopathology is complex, requiring a variety of integrated methods and approaches. Given the limited extent and scope…

  1. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
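
    The standard-addition idea lends itself to a short worked example (all values are illustrative, not from the study): spike known amounts of analyte into the biological matrix, regress the response on the spiked amount, and read the endogenous concentration off the x-intercept; comparing the in-matrix slope with the surrogate-matrix calibration slope is one way to probe parallelism.

```python
import numpy as np

added = np.array([0.0, 5.0, 10.0, 20.0, 40.0])        # spiked conc., ug/mL
response = np.array([12.1, 17.9, 24.2, 36.0, 59.8])   # LC-MS/MS response

slope, intercept = np.polyfit(added, response, 1)     # fit response vs spike
endogenous = intercept / slope                        # |x-intercept| of the line
print(f"estimated endogenous concentration: {endogenous:.1f} ug/mL")

# Parallelism check: the in-matrix slope should agree with the slope of a
# surrogate-matrix calibration line within a preset tolerance (value assumed).
surrogate_slope = 1.22
print(f"slope ratio (matrix/surrogate): {slope / surrogate_slope:.2f}")
```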

  2. Identifying Natural Alignments Between Ambulatory Surgery Centers and Local Health Systems: Building Broader Communities of Surgical Care.

    PubMed

    Funk, Russell J; Owen-Smith, Jason; Landon, Bruce E; Birkmeyer, John D; Hollingsworth, John M

    2017-02-01

    To develop and compare methods for identifying natural alignments between ambulatory surgery centers (ASCs) and hospitals that anchor local health systems. Using all-payer data from Florida's State Ambulatory Surgery and Inpatient Databases (2005-2009), we developed 3 methods for identifying alignments between ASCs and hospitals. The first, a geographic proximity approach, used spatial data to assign an ASC to its nearest hospital neighbor. The second, a predominant affiliation approach, assigned an ASC to the hospital with which it shared a plurality of surgeons. The third, a network community approach, linked an ASC with a larger group of hospitals held together by naturally occurring physician networks. We compared each method in terms of its ability to capture meaningful and stable affiliations and its administrative simplicity. Although the proximity approach was simplest to implement and produced the most durable alignments, ASC surgeons' loyalty to the assigned hospital was low with this method. The predominant affiliation and network community approaches performed better and nearly equivalently on these metrics, capturing more meaningful affiliations between ASCs and hospitals. However, the latter's alignments were least durable, and it was complex to administer. We describe 3 methods for identifying natural alignments between ASCs and hospitals, each with strengths and weaknesses. These methods will help health system managers identify ASCs with which to partner. Moreover, health services researchers and policy analysts can use them to study broader communities of surgical care.
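
    A minimal sketch of the predominant affiliation idea (hypothetical records and identifiers; the study's claims-based implementation is necessarily more involved): each ASC is assigned to the hospital where the largest share of its surgeons also operate.

```python
from collections import Counter, defaultdict

# (asc_id, surgeon_id, hospital_id) triples recording where each ASC's
# surgeons also perform inpatient surgery. Entirely hypothetical records.
records = [
    ("ASC1", "s1", "H1"), ("ASC1", "s2", "H1"), ("ASC1", "s3", "H2"),
    ("ASC2", "s4", "H2"), ("ASC2", "s5", "H3"), ("ASC2", "s6", "H3"),
]

by_asc = defaultdict(Counter)
for asc, surgeon, hospital in records:
    by_asc[asc][hospital] += 1

# Assign each ASC to the hospital with the plurality of shared surgeons.
alignment = {asc: counts.most_common(1)[0][0] for asc, counts in by_asc.items()}
print(alignment)  # {'ASC1': 'H1', 'ASC2': 'H3'}
```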

  3. A modeling approach to compare ΣPCB concentrations between congener-specific analyses

    USGS Publications Warehouse

    Gibson, Polly P.; Mills, Marc A.; Kraus, Johanna M.; Walters, David M.

    2017-01-01

    Changes in analytical methods over time pose problems for assessing long-term trends in environmental contamination by polychlorinated biphenyls (PCBs). Congener-specific analyses vary widely in the number and identity of the 209 distinct PCB chemical configurations (congeners) that are quantified, leading to inconsistencies among summed PCB concentrations (ΣPCB) reported by different studies. Here we present a modeling approach using linear regression to compare ΣPCB concentrations derived from different congener-specific analyses measuring different co-eluting groups. The approach can be used to develop a specific conversion model between any two sets of congener-specific analytical data from similar samples (similar matrix and geographic origin). We demonstrate the method by developing a conversion model for an example data set that includes data from two different analytical methods, a low resolution method quantifying 119 congeners and a high resolution method quantifying all 209 congeners. We used the model to show that the 119-congener set captured most (93%) of the total PCB concentration (i.e., Σ209PCB) in sediment and biological samples. ΣPCB concentrations estimated using the model closely matched measured values (mean relative percent difference = 9.6). General applications of the modeling approach include (a) generating comparable ΣPCB concentrations for samples that were analyzed for different congener sets; and (b) estimating the proportional contribution of different congener sets to ΣPCB. This approach may be especially valuable for enabling comparison of long-term remediation monitoring results even as analytical methods change over time. 
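
    The conversion model reduces to ordinary least squares between paired ΣPCB values from the two analyses. A minimal sketch, with invented numbers standing in for paired low- and high-resolution measurements:

```python
import numpy as np

# Paired samples measured by both methods (hypothetical values, ng/g).
sum119 = np.array([12.0, 35.0, 61.0, 88.0, 140.0])   # low-resolution sum
sum209 = np.array([13.1, 37.9, 65.0, 95.2, 150.8])   # high-resolution sum

slope, intercept = np.polyfit(sum119, sum209, 1)     # the conversion model
print(f"Sum209PCB ~ {slope:.3f} * Sum119PCB + {intercept:.2f}")

# Convert a legacy measurement made with the 119-congener method.
print(slope * 50.0 + intercept)
```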

  4. Discovery and Development of ATP-Competitive mTOR Inhibitors Using Computational Approaches.

    PubMed

    Luo, Yao; Wang, Ling

    2017-11-16

    The mammalian target of rapamycin (mTOR) is a central controller of cell growth, proliferation, metabolism, and angiogenesis. This protein is an attractive target for new anticancer drug development. Significant progress has been made in hit discovery, lead optimization, drug candidate development and determination of the three-dimensional (3D) structure of mTOR. Computational methods have been applied to accelerate the discovery and development of mTOR inhibitors, helping to model the structure of mTOR, screen compound databases, uncover structure-activity relationships (SAR), optimize hits, mine privileged fragments, and design focused libraries. Computational approaches have also been applied to study protein-ligand interaction mechanisms and in natural product-driven drug discovery. Herein, we survey the most recent progress on the application of computational approaches to advance the discovery and development of compounds targeting mTOR. Future directions in the discovery of new mTOR inhibitors using computational methods are also discussed. Copyright © Bentham Science Publishers.

  5. Preparative purification of polyethylene glycol derivatives with polystyrene-divinylbenzene beads as chromatographic packing.

    PubMed

    Yu, Pengzhan; Li, Xingqi; Li, Xiunan; Lu, Xiuling; Ma, Guanghui; Su, Zhiguo

    2007-10-15

    A simple and effective chromatographic approach for purifying polyethylene glycol (PEG) derivatives at a preparative scale was reported, based on polystyrene-divinylbenzene beads with ethanol/water as the eluent. The validity of the method was verified using the reaction mixture of mPEG-Glu and mPEG propionaldehyde diethylacetal (ALD-PEG) as a model. The target products were obtained in a single step at gram scale with >99% purity on the polymer resin column. The method avoids disadvantages associated with conventional approaches, such as the use of toxic solvents and a narrow application scope, and provides an attractive alternative for the purification of PEG derivatives at a preparative scale.

  6. An expert systems approach to automated fault management in a regenerative life support subsystem

    NASA Technical Reports Server (NTRS)

    Malin, J. T.; Lance, N., Jr.

    1986-01-01

    This paper describes FIXER, a prototype expert system for automated fault management in a regenerative life support subsystem typical of Space Station applications. The development project provided an evaluation of the use of expert systems technology to enhance controller functions in space subsystems. The software development approach permitted evaluation of the effectiveness of direct involvement of the expert in design and development. The approach also permitted intensive observation of the knowledge and methods of the expert. This paper describes the development of the prototype expert system and presents results of the evaluation.

  7. The strategic approach to contraceptive introduction.

    PubMed

    Simmons, R; Hall, P; Díaz, J; Díaz, M; Fajans, P; Satia, J

    1997-06-01

    The introduction of new contraceptive technologies has great potential for expanding contraceptive choice, but in practice, benefits have not always materialized as new methods have been added to public-sector programs. In response to lessons from the past, the UNDP/UNFPA/WHO/World Bank Special Programme of Research, Development, and Research Training in Human Reproduction (HRP) has taken major steps to develop a new approach and to support governments interested in its implementation. After reviewing previous experience with contraceptive introduction, the article outlines the strategic approach and discusses lessons from eight countries. This new approach shifts attention from promotion of a particular technology to an emphasis on the method mix, the capacity to provide services with quality of care, reproductive choice, and users' perspectives and needs. It also suggests that technology choice should be undertaken through a participatory process that begins with an assessment of the need for contraceptive introduction and is followed by research and policy and program development. Initial results from Bolivia, Brazil, Burkina Faso, Chile, Myanmar, South Africa, Vietnam, and Zambia confirm the value of the new approach.

  8. Model-based elastography: a survey of approaches to the inverse elasticity problem

    PubMed Central

    Doyley, M M

    2012-01-01

    Elastography is emerging as an imaging modality that can distinguish normal versus diseased tissues via their biomechanical properties. This article reviews current approaches to elastography in three areas — quasi-static, harmonic, and transient — and describes inversion schemes for each elastographic imaging approach. Approaches include: first-order approximation methods; direct and iterative inversion schemes for linear elastic, isotropic materials; and advanced reconstruction methods for recovering parameters that characterize complex mechanical behavior. The paper's objective is to document efforts to develop elastography within the framework of solving an inverse problem, so that elastography may provide reliable estimates of shear modulus and other mechanical parameters. We discuss issues that must be addressed if model-based elastography is to become the prevailing approach to quasi-static, harmonic, and transient elastography: (1) developing practical techniques to transform the ill-posed problem into a well-posed one; (2) devising better forward models to capture the transient behavior of soft tissue; and (3) developing better test procedures to evaluate the performance of modulus elastograms. PMID:22222839
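
    The first open issue, turning an ill-posed inverse problem into a well-posed one, is commonly addressed by regularization. The sketch below shows generic Tikhonov regularization on a synthetic ill-conditioned operator, not any specific elastographic inversion scheme:

```python
import numpy as np

def tikhonov(A, b, alpha):
    """Solve min ||A u - b||^2 + alpha ||u||^2 via the regularized
    normal equations (A^T A + alpha I) u = A^T b."""
    return np.linalg.solve(A.T @ A + alpha * np.eye(A.shape[1]), A.T @ b)

# Synthetic ill-conditioned operator with fast-decaying singular values.
rng = np.random.default_rng(1)
U, _ = np.linalg.qr(rng.standard_normal((50, 40)))
V, _ = np.linalg.qr(rng.standard_normal((40, 40)))
s = 1.0 / (1.0 + np.arange(40.0)) ** 4
A = (U * s) @ V.T

u_true = V @ (1.0 / (1.0 + np.arange(40.0)))   # smooth true solution
b = A @ u_true + 1e-6 * rng.standard_normal(50)

for alpha in (0.0, 1e-8, 1e-4):
    err = np.linalg.norm(tikhonov(A, b, alpha) - u_true)
    print(f"alpha={alpha:g}  error={err:.3g}")  # an intermediate alpha wins
```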

  9. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulations methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental in developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  10. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  12. Group decision-making approach for flood vulnerability identification using the fuzzy VIKOR method

    NASA Astrophysics Data System (ADS)

    Lee, G.; Jun, K. S.; Chung, E.-S.

    2015-04-01

    This study proposes an improved group decision making (GDM) framework that combines the VIKOR method with data fuzzification to quantify spatial flood vulnerability across multiple criteria. In general, the GDM method is an effective tool for formulating a compromise solution among various decision makers, since different stakeholders may have different perspectives on flood risk/vulnerability management responses. The GDM approach is designed to achieve consensus building that reflects the viewpoints of each participant. The fuzzy VIKOR method was developed to solve multi-criteria decision making (MCDM) problems with conflicting and noncommensurable criteria. This compromise-ranking method can be used to obtain a solution that is nearly ideal according to all established criteria. By combining the GDM method with the fuzzy VIKOR method, the approach can effectively propose compromise decisions. The spatial flood vulnerability of the southern Han River obtained using the GDM approach combined with the fuzzy VIKOR method was compared with the spatial flood vulnerability obtained using general MCDM methods, such as fuzzy TOPSIS and classical GDM methods (i.e., Borda, Condorcet, and Copeland). As a result, the proposed fuzzy GDM approach can reduce the uncertainty in data confidence and weight-derivation techniques. Thus, the combination of the GDM approach with the fuzzy VIKOR method can provide robust prioritization because it actively reflects the opinions of various groups and considers uncertainty in the input data.
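
    For orientation, the core compromise-ranking arithmetic of VIKOR is shown below in its classical (crisp) form; the study's fuzzy variant additionally fuzzifies the criteria values and aggregates group opinions. All inputs are hypothetical.

```python
import numpy as np

def vikor(F, w, v=0.5):
    """Classical (crisp) VIKOR. F: alternatives x criteria score matrix
    (benefit criteria assumed), w: criteria weights, v: strategy weight."""
    f_best, f_worst = F.max(axis=0), F.min(axis=0)
    norm = (f_best - F) / (f_best - f_worst)
    S = (norm * w).sum(axis=1)          # group utility per alternative
    R = (norm * w).max(axis=1)          # individual regret per alternative
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    return S, R, Q                      # lower Q = better compromise

w = np.array([0.5, 0.3, 0.2])
F = np.array([[0.7, 0.4, 0.9],
              [0.5, 0.8, 0.6],
              [0.9, 0.3, 0.2]])         # hypothetical vulnerability criteria
S, R, Q = vikor(F, w)
print(Q.argsort())                      # compromise ranking of alternatives
```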

  13. Entrepreneurship and Socioeconomic Development in Africa: A Reality or Myth?

    ERIC Educational Resources Information Center

    Nafukho, Fredrick M.; Muyia, Machuma A. Helen

    2010-01-01

    Purpose: The purpose of this paper is to examine the development of entrepreneurship education and training in Kenya as a strategic approach to addressing the unemployment problem among the school and university graduates in Kenya and Africa in general. Design/methodology/approach: The study adopted a critical review of the literature method to…

  14. Promoting Conceptual Development in Physics Teacher Education: Cognitive-Historical Reconstruction of Electromagnetic Induction Law

    ERIC Educational Resources Information Center

    Mantyla, Terhi

    2013-01-01

    In teaching physics, the history of physics offers fruitful starting points for designing instruction. I introduce here an approach that uses historical cognitive processes to enhance the conceptual development of pre-service physics teachers' knowledge. It applies a method called cognitive-historical approach, introduced to the cognitive sciences…

  15. An Agile Course-Delivery Approach

    ERIC Educational Resources Information Center

    Capellan, Mirkeya

    2009-01-01

    In the world of software development, agile methodologies have gained popularity thanks to their lightweight methodologies and flexible approach. Many advocates believe that agile methodologies can provide significant benefits if applied in the educational environment as a teaching method. The need for an approach that engages and motivates…

  16. A systematic approach to engineering ethics education.

    PubMed

    Li, Jessica; Fu, Shengli

    2012-06-01

    Engineering ethics education is a complex field characterized by dynamic topics and diverse students, which results in significant challenges for engineering ethics educators. The purpose of this paper is to introduce a systematic approach to determine what to teach and how to teach in an ethics curriculum. This is a topic that has not been adequately addressed in the engineering ethics literature. This systematic approach provides a method to: (1) develop a context-specific engineering ethics curriculum using the Delphi technique, a process-driven research method; and (2) identify appropriate delivery strategies and instructional strategies using an instructional design model. This approach considers the context-specific needs of different engineering disciplines in ethics education and leverages the collaboration of engineering professors, practicing engineers, engineering graduate students, ethics scholars, and instructional design experts. The proposed approach is most suitable for a department, a discipline/field or a professional society. The approach helps to enhance learning outcomes and to facilitate ethics education curriculum development as part of the regular engineering curriculum.

  17. INTEGRATION OF SPATIAL DATA: METHODS EVALUATION WITH REGARD TO DATA ISSUES AND ASSESSMENT QUESTIONS

    EPA Science Inventory

    EPA's Regional Vulnerability Assessment (REVA) Program is developing and demonstrating approaches to assess current and future environmental vulnerabilities at a regional scale. An initial effort within this research program has been to develop and evaluate methods to synthesize ...

  18. Development of a Novel, Bicombinatorial Approach to Alloy Development, and Application to Rapid Screening of Creep Resistant Titanium Alloys

    NASA Astrophysics Data System (ADS)

    Martin, Brian

    Combinatorial approaches have proven useful for rapid alloy fabrication and optimization. A new method of producing controlled isothermal gradients using the Gleeble thermomechanical simulator has been developed and demonstrated on the metastable beta-Ti alloy beta-21S, achieving a thermal gradient of 525-700 °C. This thermal gradient method has subsequently been coupled with existing combinatorial methods of producing composition gradients using the LENS(TM) additive manufacturing system, through the use of elemental blended powders. This has been demonstrated with a binary Ti-(0-15) wt% Cr build, which has subsequently been characterized with optical and electron microscopy, with special attention to the precipitation of the TiCr2 Laves phase. The TiCr2 phase has been explored for its high-temperature mechanical properties in a new oxidation-resistant beta-Ti alloy, which serves as a demonstration of the new bicombinatorial methods as applied to a multicomponent alloy system.

  19. Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods.

    PubMed

    Pommier, Jeanine; Guével, Marie-Renée; Jourdan, Didier

    2010-01-28

    Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community.

  20. A computational approach to estimate postmortem interval using opacity development of eye for human subjects.

    PubMed

    Cantürk, İsmail; Özyılmaz, Lale

    2018-07-01

    This paper presents an approach to postmortem interval (PMI) estimation, a much debated and complicated area of forensic science. Most of the PMI determination methods reported in the literature are impractical because they require skilled personnel and significant amounts of time, and they often give unsatisfactory results. Additionally, the error margin of PMI estimation increases proportionally with elapsed time after death. It is therefore crucial to develop practical PMI estimation methods for forensic science. In this study, a computational system is developed to determine the PMI of human subjects by investigating postmortem opacity development of the eye. Relevant features were extracted from eye images using image processing techniques to reflect gradual opacity development. The features were then used to predict the time after death using machine learning methods. The experimental results show that the development of opacity can be utilized as a practical computational tool to determine PMI for human subjects. Copyright © 2018 Elsevier Ltd. All rights reserved.
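
    The abstract does not specify the features or the learner, so the sketch below only illustrates the shape of such a pipeline, with a hypothetical opacity feature (mean gray level of the central eye region) fed to an off-the-shelf regressor on invented data:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def opacity_feature(eye_gray):
    """Hypothetical stand-in feature: mean gray level of the central
    region, which rises as postmortem opacity develops."""
    h, w = eye_gray.shape
    return eye_gray[h // 3: 2 * h // 3, w // 3: 2 * w // 3].mean()

# Invented training set: eye images paired with hours since death.
rng = np.random.default_rng(0)
hours = np.arange(0, 48, 4.0)
images = [np.full((60, 90), 40 + 3 * t) + rng.normal(0, 2, (60, 90))
          for t in hours]
X = np.array([[opacity_feature(im)] for im in images])

model = KNeighborsRegressor(n_neighbors=3).fit(X, hours)
print(model.predict([[opacity_feature(images[5])]]))  # ~ hours[5]
```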

  1. Recommended approach to software development, revision 3

    NASA Technical Reports Server (NTRS)

    Landis, Linda; Waligora, Sharon; Mcgarry, Frank; Pajerski, Rose; Stark, Mike; Johnson, Kevin Orlin; Cover, Donna

    1992-01-01

    Guidelines for an organized, disciplined approach to software development that is based on studies conducted by the Software Engineering Laboratory (SEL) since 1976 are presented. It describes methods and practices for each phase of a software development life cycle that starts with requirements definition and ends with acceptance testing. For each defined life cycle phase, guidelines for the development process and its management, and for the products produced and their reviews are presented.

  2. Solving the problem of negative populations in approximate accelerated stochastic simulations using the representative reaction approach.

    PubMed

    Kadam, Shantanu; Vanka, Kumar

    2013-02-15

    Methods based on the stochastic formulation of chemical kinetics have the potential to accurately reproduce the dynamical behavior of various biochemical systems of interest. However, the computational expense makes them impractical for the study of real systems. Attempts to render these methods practical have led to the development of accelerated methods, where the reaction numbers are modeled by Poisson random numbers. However, for certain systems, such methods give rise to physically unrealistic negative numbers for species populations. The methods which make use of binomial variables, in place of Poisson random numbers, have since become popular, and have been partially successful in addressing this problem. In this manuscript, the development of two new computational methods, based on the representative reaction approach (RRA), has been discussed. The new methods endeavor to solve the problem of negative numbers, by making use of tools like the stochastic simulation algorithm and the binomial method, in conjunction with the RRA. It is found that these newly developed methods perform better than other binomial methods used for stochastic simulations, in resolving the problem of negative populations. Copyright © 2012 Wiley Periodicals, Inc.
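
    The paper's RRA-based methods are not reproduced here, but the binomial device they build on is easy to illustrate: a toy binomial tau-leap for the chain A → B → C in which each reaction's firing count is drawn as Binomial(n_available, p), so no species can be driven negative.

```python
import numpy as np

def binomial_leap_step(x, tau, rates, rng):
    """One tau-leap for the chain A -> B -> C with first-order rates.
    Firings are Binomial(n_available, k*tau), so a reaction can never
    consume more molecules than its reactant currently has."""
    for k, src, dst in zip(rates, (0, 1), (1, 2)):
        n_avail = int(x[src])
        if n_avail == 0:
            continue
        fires = rng.binomial(n_avail, min(1.0, k * tau))
        x[src] -= fires
        x[dst] += fires
    return x

rng = np.random.default_rng(2)
x = np.array([1000, 0, 0])
for _ in range(500):
    x = binomial_leap_step(x, 0.01, (1.0, 0.5), rng)
print(x, (x >= 0).all())   # populations stay non-negative by construction
```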

  3. Developmental psycholinguistics teaches us that we need multi-method, not single-method, approaches to the study of linguistic representation.

    PubMed

    Rowland, Caroline F; Monaghan, Padraic

    2017-01-01

    In developmental psycholinguistics, we have, for many years, been generating and testing theories that propose both descriptions of adult representations and explanations of how those representations develop. We have learnt that restricting ourselves to any one methodology yields only incomplete data about the nature of linguistic representations. We argue that we need a multi-method approach to the study of representation.

  4. An automation-assisted generic approach for biological sample preparation and LC-MS/MS method validation.

    PubMed

    Zhang, Jie; Wei, Shimin; Ayres, David W; Smith, Harold T; Tse, Francis L S

    2011-09-01

    Although it is well known that automation can provide significant improvement in the efficiency of biological sample preparation in quantitative LC-MS/MS analysis, it has not been widely implemented in bioanalytical laboratories throughout the industry. This can be attributed to the lack of a sound strategy and practical procedures in working with robotic liquid-handling systems. Several comprehensive automation assisted procedures for biological sample preparation and method validation were developed and qualified using two types of Hamilton Microlab liquid-handling robots. The procedures developed were generic, user-friendly and covered the majority of steps involved in routine sample preparation and method validation. Generic automation procedures were established as a practical approach to widely implement automation into the routine bioanalysis of samples in support of drug-development programs.

  5. Numerical modeling of spray combustion with an advanced VOF method

    NASA Technical Reports Server (NTRS)

    Chen, Yen-Sen; Shang, Huan-Min; Shih, Ming-Hsin; Liaw, Paul

    1995-01-01

    This paper summarizes the technical development and validation of a multiphase computational fluid dynamics (CFD) numerical method using the volume-of-fluid (VOF) model and a Lagrangian tracking model which can be employed to analyze general multiphase flow problems with free surface mechanism. The gas-liquid interface mass, momentum and energy conservation relationships are modeled by continuum surface mechanisms. A new solution method is developed such that the present VOF model can be applied for all-speed flow regimes. The objectives of the present study are to develop and verify the fractional volume-of-fluid cell partitioning approach into a predictor-corrector algorithm and to demonstrate the effectiveness of the present approach by simulating benchmark problems including laminar impinging jets, shear coaxial jet atomization and shear coaxial spray combustion flows.

  6. Development and validation of a FISH-based method for the detection and quantification of E. coli and coliform bacteria in water samples.

    PubMed

    Hügler, Michael; Böckle, Karin; Eberhagen, Ingrid; Thelen, Karin; Beimfohr, Claudia; Hambsch, Beate

    2011-01-01

    Monitoring of microbiological contaminants in water supplies requires fast and sensitive methods for the specific detection of indicator organisms or pathogens. We developed a protocol for the simultaneous detection of E. coli and coliform bacteria based on the Fluorescence in situ Hybridization (FISH) technology. This protocol consists of two approaches. The first allows the direct detection of single E. coli and coliform bacterial cells on the filter membranes. The second approach includes incubation of the filter membranes on a nutrient agar plate and subsequent detection of the grown micro-colonies. Both approaches were validated using drinking water samples spiked with pure cultures and naturally contaminated water samples. The effects of heat, chlorine and UV disinfection were also investigated. The micro-colony approach yielded very good results for all samples and conditions tested, and thus can be thoroughly recommended for usage as an alternative method to detect E. coli and coliform bacteria in water samples. However, during this study, some limitations became visible for the single cell approach. The method cannot be applied for water samples which have been disinfected by UV irradiation. In addition, our results indicated that green fluorescent dyes are not suitable to be used with chlorine disinfected samples.

  7. A Video Ethnography Approach for Linking Naturalistic Behaviors to Research Constructs of Neurocognition in Schizophrenia

    PubMed Central

    Bromley, Elizabeth; Adams, Gail Fox; Brekke, John S.

    2015-01-01

    Few methods are available to explore the impact of neurocognition in schizophrenia on behaviors performed in usual contexts. The authors developed a video ethnography approach to examine the relationship between naturalistic behaviors and research constructs of neurocognition. Video ethnographers accompanied subjects through usual routines gathering continuous video data. Researchers developed codes to measure four behavioral domains observed on video. This paper describes the psychometric characteristics to be considered in the development of observational approaches. It also highlights differences between behaviors performed in usual environments and neuropsychological constructs. The authors demonstrate that everyday behaviors that have been shown to correspond to neurocognitive skills in a pilot feasibility study [1] can be identified and rated. They further suggest that observational methods could provide novel strategies for linking research findings and clinical concerns. PMID:22772661

  8. Deciphering the Epigenetic Code: An Overview of DNA Methylation Analysis Methods

    PubMed Central

    Umer, Muhammad

    2013-01-01

    Significance: Methylation of cytosine in DNA is linked with gene regulation, and this has profound implications in development, normal biology, and disease conditions in many eukaryotic organisms. A wide range of methods and approaches exist for its identification, quantification, and mapping within the genome. While the earliest approaches were nonspecific and were at best useful for quantification of total methylated cytosines in the chunk of DNA, this field has seen considerable progress and development over the past decades. Recent Advances: Methods for DNA methylation analysis differ in their coverage and sensitivity, and the method of choice depends on the intended application and desired level of information. Potential results include global methyl cytosine content, degree of methylation at specific loci, or genome-wide methylation maps. Introduction of more advanced approaches to DNA methylation analysis, such as microarray platforms and massively parallel sequencing, has brought us closer to unveiling the whole methylome. Critical Issues: Sensitive quantification of DNA methylation from degraded and minute quantities of DNA and high-throughput DNA methylation mapping of single cells still remain a challenge. Future Directions: Developments in DNA sequencing technologies as well as the methods for identification and mapping of 5-hydroxymethylcytosine are expected to augment our current understanding of epigenomics. Here we present an overview of methodologies available for DNA methylation analysis with special focus on recent developments in genome-wide and high-throughput methods. While the application focus relates to cancer research, the methods are equally relevant to broader issues of epigenetics and redox science in this special forum. Antioxid. Redox Signal. 18, 1972–1986. PMID:23121567

  9. A New Method for a Virtue-Based Responsible Conduct of Research Curriculum: Pilot Test Results.

    PubMed

    Berling, Eric; McLeskey, Chet; O'Rourke, Michael; Pennock, Robert T

    2018-02-03

    Drawing on Pennock's theory of scientific virtues, we are developing an alternative curriculum for training scientists in the responsible conduct of research (RCR) that emphasizes internal values rather than externally imposed rules. This approach focuses on the virtuous characteristics of scientists that lead to responsible and exemplary behavior. We have been pilot-testing one element of such a virtue-based approach to RCR training by conducting dialogue sessions, modeled upon the approach developed by Toolbox Dialogue Initiative, that focus on a specific virtue, e.g., curiosity and objectivity. During these structured discussions, small groups of scientists explore the roles they think the focus virtue plays and should play in the practice of science. Preliminary results have shown that participants strongly prefer this virtue-based model over traditional methods of RCR training. While we cannot yet definitively say that participation in these RCR sessions contributes to responsible conduct, these pilot results are encouraging and warrant continued development of this virtue-based approach to RCR training.

  10. Hubble Space Telescope Angular Velocity Estimation During the Robotic Servicing Mission

    NASA Technical Reports Server (NTRS)

    Thienel, Julie K.; Queen, Steven Z.; VanEepoel, John M.; Sanner, Robert M.

    2005-01-01

    During the Hubble Robotic Servicing Mission, the Hubble Space Telescope (HST) attitude and rates are necessary to achieve the capture of HST by the Hubble Robotic Vehicle (HRV). The attitude and rates must be determined without the HST gyros or HST attitude estimates. The HRV will be equipped with vision-based sensors, capable of estimating the relative attitude between HST and HRV. The HST attitude is derived from the measured relative attitude and the HRV computed inertial attitude. However, the relative rate between HST and HRV cannot be measured directly. Therefore, the HST rate with respect to inertial space is not known. Two approaches are developed to estimate the HST rates. Both methods utilize the measured relative attitude and the HRV inertial attitude and rates. First, a nonlinear estimator is developed. The nonlinear approach estimates the HST rate through an estimation of the inertial angular momentum. Second, a linearized approach is developed. The linearized approach is based on more traditional Extended Kalman filter techniques. Simulation test results for both methods are given.
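
    Neither estimator is detailed in the abstract. As a point of reference, the simplest conceivable alternative is a finite-difference rate estimate from two successive attitude quaternions (scalar-last convention, small inter-sample rotation assumed); the paper's angular-momentum estimator and EKF refine this idea considerably.

```python
import numpy as np

def quat_conj(q):
    return np.array([-q[0], -q[1], -q[2], q[3]])   # scalar-last [x, y, z, w]

def quat_mult(a, b):
    av, aw = a[:3], a[3]
    bv, bw = b[:3], b[3]
    return np.concatenate([aw * bv + bw * av + np.cross(av, bv),
                           [aw * bw - av @ bv]])

def rate_from_quats(q0, q1, dt):
    """omega ~ 2 * vec(q0^-1 * q1) / dt for small inter-sample rotations."""
    dq = quat_mult(quat_conj(q0), q1)
    return 2.0 * dq[:3] / dt

# Check against a known spin: 0.01 rad/s about z, sampled at dt = 1 s.
theta = 0.01 / 2
q0 = np.array([0.0, 0.0, 0.0, 1.0])
q1 = np.array([0.0, 0.0, np.sin(theta), np.cos(theta)])
print(rate_from_quats(q0, q1, 1.0))   # ~ [0, 0, 0.01]
```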

  11. LC-MS/MS-based approach for obtaining exposure estimates of metabolites in early clinical trials using radioactive metabolites as reference standards.

    PubMed

    Zhang, Donglu; Raghavan, Nirmala; Chando, Theodore; Gambardella, Janice; Fu, Yunlin; Zhang, Duxi; Unger, Steve E; Humphreys, W Griffith

    2007-12-01

    An LC-MS/MS-based approach that employs authentic radioactive metabolites as reference standards was developed to estimate metabolite exposures in early drug development studies. This method is useful to estimate metabolite levels in studies done with non-radiolabeled compounds where metabolite standards are not available to allow standard LC-MS/MS assay development. A metabolite mixture obtained from an in vivo source treated with a radiolabeled compound was partially purified, quantified, and spiked into human plasma to provide metabolite standard curves. Metabolites were analyzed by LC-MS/MS using the specific mass transitions and an internal standard. The metabolite concentrations determined by this approach were found to be comparable to those determined by valid LC-MS/MS assays. This approach does not requires synthesis of authentic metabolites or the knowledge of exact structures of metabolites, and therefore should provide a useful method to obtain early estimates of circulating metabolites in early clinical or toxicological studies.

  12. Data-driven modeling and predictive control for boiler-turbine unit using fuzzy clustering and subspace methods.

    PubMed

    Wu, Xiao; Shen, Jiong; Li, Yiguo; Lee, Kwang Y

    2014-05-01

    This paper develops a novel data-driven fuzzy modeling strategy and predictive controller for a boiler-turbine unit using fuzzy clustering and subspace identification (SID) methods. To deal with the nonlinear behavior of the boiler-turbine unit, fuzzy clustering is used to provide an appropriate division of the operating region and develop the structure of the fuzzy model. Then, by combining the input data with the corresponding fuzzy membership functions, the SID method is extended to extract the local state-space model parameters. Owing to the advantages of both methods, the resulting fuzzy model can represent the boiler-turbine unit very closely, and a fuzzy model predictive controller is designed based on this model. As an alternative approach, a direct data-driven fuzzy predictive control is also developed following the same clustering and subspace methods, where intermediate subspace matrices developed during the identification procedure are utilized directly as the predictor. Simulation results show the advantages and effectiveness of the proposed approach. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
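
    The combination step can be sketched compactly: given per-sample fuzzy memberships (here a smooth two-cluster stand-in rather than real fuzzy c-means), each local linear model is fit by membership-weighted least squares, and predictions blend the local models by membership. Everything below is illustrative, not the paper's SID-based models.

```python
import numpy as np

def weighted_lsq(X, y, w):
    """theta = argmin sum_i w_i (y_i - x_i theta)^2, via sqrt-weighting."""
    sw = np.sqrt(w)[:, None]
    theta, *_ = np.linalg.lstsq(X * sw, y * sw.ravel(), rcond=None)
    return theta

rng = np.random.default_rng(3)
u = rng.uniform(0, 10, 300)
y = np.where(u < 5, 2 * u, 10 + 0.5 * (u - 5)) + rng.normal(0, 0.1, 300)
X = np.column_stack([u, np.ones_like(u)])

# Toy memberships: smooth transition around u = 5 (fuzzy c-means stand-in).
m1 = 1 / (1 + np.exp(u - 5))
m2 = 1 - m1
models = [weighted_lsq(X, y, m) for m in (m1, m2)]
y_hat = m1 * (X @ models[0]) + m2 * (X @ models[1])   # fuzzy blend
print(np.abs(y - y_hat).mean())
```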

  13. Improvement of a stability-indicating method by Quality-by-Design versus Quality-by-Testing: a case of a learning process.

    PubMed

    Hubert, C; Lebrun, P; Houari, S; Ziemons, E; Rozet, E; Hubert, Ph

    2014-01-01

    Understanding the method is a major concern when developing a stability-indicating method, and even more so when dealing with impurity assays in complex matrices. In the presented case study, a Quality-by-Design approach was applied in order to optimize a routinely used method. An analytical issue that arose at the last stage of a long-term stability study, in which unexpected impurities perturbed the monitoring of characterized impurities, needed to be resolved. A compliant Quality-by-Design (QbD) methodology based on a Design of Experiments (DoE) approach was evaluated within the framework of a Liquid Chromatography (LC) method. This approach allows the investigation of Critical Process Parameters (CPPs), which have an impact on Critical Quality Attributes (CQAs) and, consequently, on LC selectivity. Using polynomial regression response modeling as well as Monte Carlo simulations for error propagation, the Design Space (DS) was computed in order to determine robust working conditions for the developed stability-indicating method. This QbD-compliant development was conducted in two phases, allowing the Design Space knowledge acquired during the first phase to define the experimental domain of the second phase, which constitutes a learning process. The selected working condition was then fully validated using accuracy profiles based on statistical tolerance intervals in order to evaluate the reliability of the results generated by this LC/ESI-MS stability-indicating method. A comparison was made between the traditional Quality-by-Testing (QbT) approach and the QbD strategy, highlighting the benefit of the QbD strategy in the case of an unexpected-impurities issue. On this basis, the advantages of a systematic use of the QbD methodology were discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
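
    A generic illustration of the Design Space computation described above (not the authors' model): take a fitted response model for a CQA, propagate coefficient uncertainty by Monte Carlo, and keep the operating points whose predicted probability of meeting the specification exceeds a target. Model form, coefficients, and spec are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical fitted model: resolution = b0 + b1*pH + b2*T + b3*pH*T
beta = np.array([1.2, 0.30, 0.040, -0.010])       # point estimates
cov = np.diag([0.05, 0.01, 0.001, 0.0005]) ** 2   # coefficient covariance

def p_in_spec(ph, temp, spec=2.0, n=4000):
    """Monte Carlo probability that the CQA meets its spec at (pH, T)."""
    draws = rng.multivariate_normal(beta, cov, size=n)
    x = np.array([1.0, ph, temp, ph * temp])
    return float((draws @ x >= spec).mean())

# Design space: grid points where P(resolution >= spec) >= 0.95.
grid = [(ph, t) for ph in np.linspace(2, 5, 7) for t in np.linspace(25, 40, 7)]
ds = [(ph, t) for ph, t in grid if p_in_spec(ph, t) >= 0.95]
print(len(ds), "of", len(grid), "grid points fall inside the design space")
```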

  14. LDRD Report: Topological Design Optimization of Convolutes in Next Generation Pulsed Power Devices.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cyr, Eric C.; von Winckel, Gregory John; Kouri, Drew Philip

    This LDRD project was developed around the ambitious goal of applying PDE-constrained optimization approaches to design Z-machine components whose performance is governed by electromagnetic and plasma models. This report documents the results of this LDRD project. Our differentiating approach was to use topology optimization methods developed for structural design and extend them for application to electromagnetic systems pertinent to the Z-machine. To achieve this objective a suite of optimization algorithms were implemented in the ROL library part of the Trilinos framework. These methods were applied to standalone demonstration problems and the Drekar multi-physics research application. Out of this exploration a new augmented Lagrangian approach to structural design problems was developed. We demonstrate that this approach has favorable mesh-independent performance. Both the final design and the algorithmic performance were independent of the size of the mesh. In addition, topology optimization formulations for the design of conducting networks were developed and demonstrated. Of note, this formulation was used to develop a design for the inner magnetically insulated transmission line on the Z-machine. The resulting electromagnetic device is compared with theoretically postulated designs.
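
    The augmented Lagrangian idea mentioned above is generic and easy to sketch outside topology optimization: alternate unconstrained minimizations of f + λc + (ρ/2)c² with first-order multiplier updates. A minimal illustration on a toy equality-constrained problem (assuming SciPy; not the ROL implementation):

```python
import numpy as np
from scipy.optimize import minimize

def augmented_lagrangian(f, c, x0, lam=0.0, rho=10.0, iters=8):
    """Minimize f(x) subject to c(x) = 0: inner unconstrained solves of the
    augmented Lagrangian, then the multiplier update lam += rho * c(x)."""
    x = np.asarray(x0, float)
    for _ in range(iters):
        L = lambda z: f(z) + lam * c(z) + 0.5 * rho * c(z) ** 2
        x = minimize(L, x).x
        lam += rho * c(x)          # first-order multiplier update
    return x, lam

# Toy use: minimize x^2 + y^2 subject to x + y = 1 (solution: x = y = 0.5).
x, lam = augmented_lagrangian(lambda z: z @ z,
                              lambda z: z.sum() - 1.0, [0.0, 0.0])
print(x, lam)   # ~ [0.5, 0.5], multiplier ~ -1
```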

  15. Edutourism Taka Bonerate National Park through Scientific Approach to Improve Student Learning Outcomes

    NASA Astrophysics Data System (ADS)

    Hayati, R. S.

    2017-02-01

    The aim of this research is to develop the potential of Taka Bonerate National Park as a learning resource through edutourism with a scientific approach to improve student learning outcomes. The learning outcomes in focus are students' psychomotor abilities and their comprehension of the topics Biodiversity of Marine Biota, Coral Ecosystems, and Conservation. The edutourism development products are a teacher manual, an edutourism worksheet, a material booklet, a guide's manual, and a Taka Bonerate National Park governor manual. The edutourism products were developed with the ADDIE research and development model, which consists of the analysis, design, development and production, implementation, and evaluation steps. The subjects in the implementation step were given a pretest, a posttest, and an observation sheet to assess the effect of edutourism in Taka Bonerate National Park through a scientific approach on student learning outcomes for the three topics. The data were analyzed descriptively and qualitatively. The research result is that edutourism in Taka Bonerate National Park through a scientific approach can improve student learning outcomes on the Biodiversity of Marine Biota, Coral Ecosystems, and Conservation topics. Edutourism in Taka Bonerate National Park can be an alternative learning method for these topics.

  16. [Psychiatric Rehabilitation - From the Linear Continuum Approach Towards Supported Inclusion].

    PubMed

    Richter, Dirk; Hertig, Res; Hoffmann, Holger

    2016-11-01

    Background: For many decades, psychiatric rehabilitation in the German-speaking countries has followed a conventional linear continuum approach. Methods: Recent developments in important fields related to psychiatric rehabilitation (the UN Convention on the Rights of Persons with Disabilities, rehabilitation theory, empirical research) are reviewed. Results: Common to all developments in the reviewed fields are the principles of choice, autonomy and social inclusion. These principles contradict the conventional linear continuum approach. Conclusions: The linear continuum approach to psychiatric rehabilitation should be replaced by the "supported inclusion" approach. © Georg Thieme Verlag KG Stuttgart · New York.

  17. A study of alternative methods for reclaiming oxygen from carbon dioxide and water by a solid-electrolyte process for spacecraft applications

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Two alternative technical approaches were studied for application of an electrochemical process using a solid oxide electrolyte (zirconia stabilized by yttria or scandia) to oxygen reclamation from carbon dioxide and water, for spacecraft life support systems. Among the topics considered are the advisability of proceeding to engineering prototype development and fabrication of a full scale model for the system concept, the optimum choice of method or approach to be carried into prototype development, and the technical problem areas which exist.

  18. Manufacture and evaluation of Nb3Sn conductors fabricated by the MJR method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, W.K.; Curtis, C.W.; Scanlan, R.M.

    1982-11-23

    The bronze matrix/niobium filament process has become established as a commercially viable method for producing multifilamentary Nb3Sn superconductors. This paper describes a new method, the Modified Jelly-Roll (MJR) approach, which can produce a structure similar to that in a conventionally fabricated multifilamentary Nb3Sn conductor. This approach utilizes alternate sheets of niobium expanded metal and bronze, which are rolled into a jelly-roll configuration and then extruded. During extrusion and subsequent drawing, the junctures in the niobium are elongated and the material develops a filamentary structure. This method may offer significant advantages in terms of reduced fabrication time and cost over the conventional approach. Results of a manufacturing development program will be presented in which two lengths of conductor were made to High-Field Test Facility conductor specifications. In addition, critical current and transition temperature measurements of the sub-elements used to construct the HFTF-type lengths will be reported.

  19. Final Technical Report: Distributed Controls for High Penetrations of Renewables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byrne, Raymond H.; Neely, Jason C.; Rashkin, Lee J.

    2015-12-01

    The goal of this effort was to apply four potential control analysis/design approaches to the design of distributed grid control systems to address the impact of latency and communications uncertainty with high penetrations of photovoltaic (PV) generation. The four techniques considered were: optimal fixed structure control; Nyquist stability criterion; vector Lyapunov analysis; and Hamiltonian design methods. A reduced order model of the Western Electricity Coordinating Council (WECC) developed for the Matlab Power Systems Toolbox (PST) was employed for the study, as well as representative smaller systems (e.g., a two-area, three-area, and four-area power system). Excellent results were obtained with the optimal fixed structure approach, and the methodology we developed was published in a journal article. This approach is promising because it offers a method for designing optimal control systems with the feedback signals available from Phasor Measurement Unit (PMU) data as opposed to full state feedback or the design of an observer. The Nyquist approach inherently handles time delay and incorporates performance guarantees (e.g., gain and phase margin). We developed a technique that works for moderate sized systems, but the approach does not scale well to extremely large systems because of computational complexity. The vector Lyapunov approach was applied to a two area model to demonstrate the utility for modeling communications uncertainty. Application to large power systems requires a method to automatically expand/contract the state space and partition the system so that communications uncertainty can be considered. The Hamiltonian Surface Shaping and Power Flow Control (HSSPFC) design methodology was selected to investigate grid systems for energy storage requirements to support high penetration of variable or stochastic generation (such as wind and PV) and loads. This method was applied to several small system models.
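
    One reason frequency-domain (Nyquist-type) analysis "inherently handles time delay" is that a pure delay only contributes phase. A toy sketch, assuming the third-party python-control package and a made-up loop rather than the WECC model, Pade-approximates a 100 ms communication delay and reads off the margins:

```python
import control as ct

# Toy open loop: first-order plant under integral control, plus a 100 ms
# communication delay approximated by a 5th-order Pade rational function.
plant = ct.tf([1.0], [1.0, 1.0])
controller = ct.tf([2.0], [1.0, 0.0])
num, den = ct.pade(0.1, 5)
loop = controller * plant * ct.tf(num, den)

gm, pm, wg, wp = ct.margin(loop)   # gain/phase margins, crossover freqs
print(f"gain margin {gm:.2f}, phase margin {pm:.1f} deg")
```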

  20. A Simplified Micromechanical Modeling Approach to Predict the Tensile Flow Curve Behavior of Dual-Phase Steels

    NASA Astrophysics Data System (ADS)

    Nanda, Tarun; Kumar, B. Ravi; Singh, Vishal

    2017-11-01

    Micromechanical modeling is used to predict a material's tensile flow curve behavior based on microstructural characteristics. This research develops a simplified micromechanical modeling approach for predicting the flow curve behavior of dual-phase steels. The existing literature reports two broad approaches for determining the tensile flow curve of these steels. The modeling approach developed in this work attempts to overcome specific limitations of the two existing approaches. It combines a dislocation-based strain-hardening method with the rule of mixtures. In the first step of modeling, the dislocation-based strain-hardening method was employed to predict the tensile behavior of the individual ferrite and martensite phases. In the second step, the individual flow curves were combined using the rule of mixtures to obtain the composite dual-phase flow behavior. To check the accuracy of the proposed model, four distinct dual-phase microstructures comprising different ferrite grain sizes, martensite fractions, and carbon contents in martensite were processed by annealing experiments. The true stress-strain curves for the various microstructures were predicted with the newly developed micromechanical model. The results of the micromechanical model matched closely with those of actual tensile tests. Thus, this micromechanical modeling approach can be used to predict and optimize the tensile flow behavior of dual-phase steels.
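
    A hedged sketch of the second modeling step, the rule of mixtures, with simple Voce-type curves standing in for the dislocation-based strain-hardening laws of the first step (all parameter values invented):

```python
import numpy as np

def voce(strain, sigma0, sigma_sat, k):
    """Voce-type flow curve: sigma0 + (sigma_sat - sigma0)(1 - exp(-k*eps))."""
    return sigma0 + (sigma_sat - sigma0) * (1.0 - np.exp(-k * strain))

eps = np.linspace(0.0, 0.15, 151)
sigma_ferrite = voce(eps, 300.0, 600.0, 15.0)       # MPa, hypothetical
sigma_martensite = voce(eps, 1100.0, 1900.0, 40.0)  # MPa, hypothetical

f_m = 0.25                                           # martensite fraction
sigma_dp = f_m * sigma_martensite + (1 - f_m) * sigma_ferrite
print(sigma_dp[[0, 75, 150]])   # composite flow stress at selected strains
```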

  1. Innovative spectrophotometric methods for simultaneous estimation of the novel two-drug combination: Sacubitril/Valsartan through two manipulation approaches and a comparative statistical study

    NASA Astrophysics Data System (ADS)

    Eissa, Maya S.; Abou Al Alamein, Amal M.

    2018-03-01

    Different innovative spectrophotometric methods were introduced for the first time for the simultaneous quantification of sacubitril/valsartan in their binary mixture and in their combined dosage form, without prior separation, through two manipulation approaches. The first approach is based on wavelength selection in zero-order absorption spectra, namely: the dual wavelength method (DWL) at 226 nm and 275 nm for valsartan; the induced dual wavelength method (IDW) at 226 nm and 254 nm for sacubitril; and advanced absorbance subtraction (AAS) based on their iso-absorptive point at 246 nm (λiso) and 261 nm (where sacubitril shows equal absorbance values at the two selected wavelengths). The second approach is based on ratio spectra using their normalized spectra, namely: the ratio difference spectrophotometric method (RD) at 225 nm and 264 nm for both drugs in their ratio spectra; the first derivative of ratio spectra (DR1) at 232 nm for valsartan and 239 nm for sacubitril; and mean centering of ratio spectra (MCR) at 260 nm for both drugs. Both sacubitril and valsartan showed linearity upon application of these methods in the range of 2.5-25.0 μg/mL. The developed spectrophotometric methods were successfully applied to the analysis of their combined tablet dosage form ENTRESTO™. The adopted spectrophotometric methods were also validated according to ICH guidelines. The results obtained from the proposed methods were statistically compared to those of a reported HPLC method using Student's t-test and the F-test, and a comparative study was also performed with one-way ANOVA; the tests showed no statistically significant difference in precision and accuracy.
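
    The dual-wavelength principle used above is compact enough to demonstrate: choose two wavelengths at which the interfering component absorbs equally, so the absorbance difference depends on the analyte alone. The spectra and wavelengths below are synthetic, not the sacubitril/valsartan data:

```python
import numpy as np

wl = np.linspace(200, 300, 501)                       # wavelength grid, nm
gauss = lambda c, s: np.exp(-0.5 * ((wl - c) / s) ** 2)

eps_analyte = gauss(235, 12)                          # synthetic spectra
eps_interferent = gauss(255, 20)

# lambda1/lambda2 chosen symmetric about the interferent peak, so the
# interferent absorbs equally at both and cancels in the difference.
i1, i2 = np.argmin(np.abs(wl - 240)), np.argmin(np.abs(wl - 270))

c_a, c_i = 0.8, 1.5                                   # mixture composition
A = c_a * eps_analyte + c_i * eps_interferent         # Beer's law, unit path

dA = A[i1] - A[i2]                                    # interferent-free signal
k = eps_analyte[i1] - eps_analyte[i2]                 # calibration factor
print(dA / k)                                         # recovers c_a ~ 0.8
```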

  2. Theoretical development and first-principles analysis of strongly correlated systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Chen

    A variety of quantum many-body methods have been developed for studying strongly correlated electron systems. We have also proposed a computationally efficient and accurate approach, named the correlation matrix renormalization (CMR) method, to address the challenges. The initial implementation of the CMR method is designed for molecules, which have theoretical advantages, including small system size, clear mechanisms, and strong correlation effects such as bond-breaking processes. The theoretical development and benchmark tests of the CMR method are included in this thesis. Meanwhile, the ground state total energy is the most important property of electronic structure calculations. We also investigated an alternative approach to calculate the total energy, and extended this method to the magnetic anisotropy energy (MAE) of ferromagnetic materials. In addition, another theoretical tool, dynamical mean-field theory (DMFT) on top of DFT, has also been used in electronic structure calculations for an iridium oxide to study the phase transition, which results from an interplay of the d electrons' internal degrees of freedom.

  3. A Gentle Approach for Young Infants.

    ERIC Educational Resources Information Center

    Suskind, Diana; Kozma, Marta

    The Gentle Approach is a method for lifting infants younger than 6 months that promotes security and reassurance during adult-imposed changes in position. Developed at the Emmi Pilker National Methodological Institute for Residential Nurseries in Budapest, Hungary, the approach provides continual support and less opportunity for unprotected…

  4. Teaching Analytical Method Development in an Undergraduate Instrumental Analysis Course

    ERIC Educational Resources Information Center

    Lanigan, Katherine C.

    2008-01-01

    Method development and assessment, central components of carrying out chemical research, require problem-solving skills. This article describes a pedagogical approach for teaching these skills through the adaptation of published experiments and application of group-meeting style discussions to the curriculum of an undergraduate instrumental…

  5. A review of consensus test methods for established medical imaging modalities and their implications for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Pfefer, Joshua; Agrawal, Anant

    2012-03-01

    In recent years there has been increasing interest in development of consensus, tissue-phantom-based approaches for assessment of biophotonic imaging systems, with the primary goal of facilitating clinical translation of novel optical technologies. Well-characterized test methods based on tissue phantoms can provide useful tools for performance assessment, thus enabling standardization and device inter-comparison during preclinical development as well as quality assurance and re-calibration in the clinical setting. In this review, we study the role of phantom-based test methods as described in consensus documents such as international standards for established imaging modalities including X-ray CT, MRI and ultrasound. Specifically, we focus on three image quality characteristics - spatial resolution, spatial measurement accuracy and image uniformity - and summarize the terminology, metrics, phantom design/construction approaches and measurement/analysis procedures used to assess these characteristics. Phantom approaches described are those in routine clinical use and tend to have simplified morphology and biologically-relevant physical parameters. Finally, we discuss the potential for applying knowledge gained from existing consensus documents in the development of standardized, phantom-based test methods for optical coherence tomography.

  6. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme.

    PubMed

    Makrakis, Vassilios; Kostoulas-Makrakis, Nelly

    2016-02-01

    Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Flexible Method for Developing Tactics, Techniques, and Procedures for Future Capabilities

    DTIC Science & Technology

    2009-02-01

    levels of ability, military experience, and motivation, (b) number and type of significant events, and (c) other sources of natural variability...research has developed a number of specific instruments designed to aid in this process. Second, the iterative, feed-forward nature of the method allows...FLEX method), but still lack the structured KE approach and iterative, feed-forward nature of the FLEX method. To facilitate decision making

  8. Ground State and Finite Temperature Lanczos Methods

    NASA Astrophysics Data System (ADS)

    Prelovšek, P.; Bonča, J.

The present review focuses on recent developments of exact-diagonalization (ED) methods that use the Lanczos algorithm to transform large sparse matrices into tridiagonal form. We begin with a review of the basic principles of the Lanczos method for computing ground-state static as well as dynamical properties. Next, the generalization to finite temperatures in the form of the well-established finite-temperature Lanczos method is described. The latter allows for the evaluation of static and dynamic quantities at temperatures T>0 within various correlated models. Several extensions and modifications of the latter method introduced more recently are analysed, in particular the low-temperature Lanczos method and the microcanonical Lanczos method, which is especially applicable within the high-T regime. In order to overcome the problem of exponentially growing Hilbert spaces that prevents ED calculations on larger lattices, different approaches based on Lanczos diagonalization within a reduced basis have been developed. In this context, the recently developed method based on ED within a limited functional space is reviewed. Finally, we briefly discuss the real-time evolution of correlated systems far from equilibrium, which can be simulated using ED and Lanczos-based methods, as well as approaches based on diagonalization in a reduced basis.
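
    For concreteness, here is a minimal numpy sketch of the basic three-term Lanczos recurrence that underlies these ED methods. It omits reorthogonalization, so it is only dependable for modest step counts, and the random symmetric test matrix is an arbitrary stand-in for a model Hamiltonian.

```python
import numpy as np

def lanczos(A, k, rng=np.random.default_rng(0)):
    """Reduce a symmetric matrix A to a k x k tridiagonal matrix T via
    the three-term Lanczos recurrence (no reorthogonalization)."""
    n = A.shape[0]
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    V = np.zeros((n, k)); alpha = np.zeros(k); beta = np.zeros(k - 1)
    v_prev = np.zeros(n)
    for j in range(k):
        V[:, j] = v
        w = A @ v
        alpha[j] = v @ w                        # diagonal entry
        w -= alpha[j] * v + (beta[j - 1] * v_prev if j > 0 else 0)
        if j < k - 1:
            beta[j] = np.linalg.norm(w)         # off-diagonal entry
            v_prev, v = v, w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return T, V

# Extremal Ritz values of T approximate extremal eigenvalues of A:
A = np.random.default_rng(1).standard_normal((200, 200)); A = (A + A.T) / 2
T, V = lanczos(A, 30)
print(np.linalg.eigvalsh(T)[0], np.linalg.eigvalsh(A)[0])
```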

  9. IMPLICIT DUAL CONTROL BASED ON PARTICLE FILTERING AND FORWARD DYNAMIC PROGRAMMING.

    PubMed

    Bayard, David S; Schumitzky, Alan

    2010-03-01

    This paper develops a sampling-based approach to implicit dual control. Implicit dual control methods synthesize stochastic control policies by systematically approximating the stochastic dynamic programming equations of Bellman, in contrast to explicit dual control methods that artificially induce probing into the control law by modifying the cost function to include a term that rewards learning. The proposed implicit dual control approach is novel in that it combines a particle filter with a policy-iteration method for forward dynamic programming. The integration of the two methods provides a complete sampling-based approach to the problem. Implementation of the approach is simplified by making use of a specific architecture denoted as an H-block. Practical suggestions are given for reducing computational loads within the H-block for real-time applications. As an example, the method is applied to the control of a stochastic pendulum model having unknown mass, length, initial position and velocity, and unknown sign of its dc gain. Simulation results indicate that active controllers based on the described method can systematically improve closed-loop performance with respect to other more common stochastic control approaches.
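
    As a pointer to the state-estimation half of the approach, below is a minimal bootstrap particle filter in Python. The scalar dynamics f, observation map h and noise levels are assumed purely for illustration; the forward-dynamic-programming/policy-iteration half and the H-block architecture are not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_pf(y, f, h, q_std, r_std, n_particles=1000):
    """Minimal bootstrap particle filter: propagate particles through the
    dynamics, weight by the observation likelihood, then resample."""
    x = rng.standard_normal(n_particles)        # initial particle cloud
    estimates = []
    for yk in y:
        x = f(x) + q_std * rng.standard_normal(n_particles)  # predict
        w = np.exp(-0.5 * ((yk - h(x)) / r_std) ** 2)         # weight
        w /= w.sum()
        x = x[rng.choice(n_particles, n_particles, p=w)]      # resample
        estimates.append(x.mean())
    return np.array(estimates)

# Toy nonlinear system (assumed for illustration)
f = lambda x: 0.9 * x + np.sin(x)
h = lambda x: x
true_x, ys = 1.0, []
for _ in range(50):
    true_x = f(np.array([true_x]))[0] + 0.1 * rng.standard_normal()
    ys.append(true_x + 0.3 * rng.standard_normal())
print(bootstrap_pf(np.array(ys), f, h, 0.1, 0.3)[-5:])
```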

  10. Investigating the Importance of the Pocket-estimation Method in Pocket-based Approaches: An Illustration Using Pocket-ligand Classification.

    PubMed

    Caumes, Géraldine; Borrel, Alexandre; Abi Hussein, Hiba; Camproux, Anne-Claude; Regad, Leslie

    2017-09-01

Small molecules interact with their protein targets in surface cavities known as binding pockets. Pocket-based approaches are very useful in all phases of drug design. Their first step is estimating the binding pocket from the protein structure. The available pocket-estimation methods produce different pockets for the same target. The aim of this work is to investigate the effects of different pocket-estimation methods on the results of pocket-based approaches. We focused on the effect of three pocket-estimation methods on a pocket-ligand (PL) classification. This pocket-based approach is useful for understanding the correspondence between the pocket and ligand spaces and for developing pharmacological profiling models. We found that pocket-estimation methods yield different binding pockets in terms of boundaries and properties. These differences are responsible for the variation in the PL classification results, which can have an impact on the detected correspondence between pocket and ligand profiles. Thus, we highlight the importance of the choice of pocket-estimation method in pocket-based approaches. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Automated solid-phase extraction workstations combined with quantitative bioanalytical LC/MS.

    PubMed

    Huang, N H; Kagel, J R; Rossi, D T

    1999-03-01

An automated solid-phase extraction workstation was used to develop, characterize and validate an LC/MS/MS method for quantifying a novel lipid-regulating drug in dog plasma. Method development was facilitated by workstation functions that allowed wash solvents of varying organic composition to be mixed and tested automatically. Precision estimates for this approach were within 9.8% relative standard deviation (RSD) across the calibration range. Accuracy for replicate determinations of quality controls was between -7.2 and +6.2% relative error (RE) over 5-1,000 ng/ml. Recoveries were evaluated for a wide variety of wash solvents, elution solvents and sorbents; optimized recoveries were generally >95%. A sample throughput benchmark for the method was approximately 8 min per sample. Because of parallel sample processing, 100 samples were extracted in less than 120 min. The approach has proven useful with LC/MS/MS using a multiple reaction monitoring (MRM) approach.

  12. Structural Health Monitoring of Large Structures

    NASA Technical Reports Server (NTRS)

    Kim, Hyoung M.; Bartkowicz, Theodore J.; Smith, Suzanne Weaver; Zimmerman, David C.

    1994-01-01

This paper describes a damage detection and health monitoring method that was developed for large space structures using on-orbit modal identification. After evaluating several existing model refinement and model reduction/expansion techniques, a new approach was developed to identify the location and extent of structural damage with a limited number of measurements. A general area of structural damage is first identified and, subsequently, a specific damaged structural component is located. This approach takes advantage of two different model refinement methods (optimal-update and design sensitivity) and two different model size matching methods (model reduction and eigenvector expansion). Performance of the proposed damage detection approach was demonstrated with test data from two different laboratory truss structures. This space technology can also be applied to the structural inspection of aircraft, offshore platforms, oil tankers, bridges, and buildings. In addition, its applications to model refinement will improve the design of structural systems such as automobiles and electronic packaging.

  13. Recent developments in broadly applicable structure-biodegradability relationships.

    PubMed

    Jaworska, Joanna S; Boethling, Robert S; Howard, Philip H

    2003-08-01

    Biodegradation is one of the most important processes influencing concentration of a chemical substance after its release to the environment. It is the main process for removal of many chemicals from the environment and therefore is an important factor in risk assessments. This article reviews available methods and models for predicting biodegradability of organic chemicals from structure. The first section of the article briefly discusses current needs for biodegradability estimation methods related to new and existing chemicals and in the context of multimedia exposure models. Following sections include biodegradation test methods and endpoints used in modeling, with special attention given to the Japanese Ministry of International Trade and Industry test; a primer on modeling, describing the various approaches that have been used in the structure/biodegradability relationship work, and contrasting statistical and mechanistic approaches; and recent developments in structure/biodegradability relationships, divided into group contribution, chemometric, and artificial intelligence approaches.

  14. Global quasi-linearization (GQL) versus QSSA for a hydrogen-air auto-ignition problem.

    PubMed

    Yu, Chunkan; Bykov, Viatcheslav; Maas, Ulrich

    2018-04-25

A recently developed automatic reduction method for chemical kinetics systems, the so-called Global Quasi-Linearization (GQL) method, has been implemented to study and reduce the dimension of a homogeneous combustion system. The results of applying the GQL and the Quasi-Steady-State Assumption (QSSA) are compared. A number of drawbacks of the QSSA are discussed, i.e. the selection criteria for QSS species and their sensitivity to system parameters, initial conditions, etc. To overcome these drawbacks, the GQL approach has been developed as a robust, automatic and scaling-invariant method for a global analysis of the system's timescale hierarchy and subsequent model reduction. In this work the auto-ignition problem of the hydrogen-air system is considered over a wide range of system parameters and initial conditions. The potential of the suggested approach to overcome most of the drawbacks of the standard approaches is illustrated.
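
    To make the QSSA baseline concrete, here is a minimal sketch on a toy A -> B -> C mechanism, where the fast intermediate B is eliminated algebraically. This illustrates the standard QSSA that GQL is compared against, not the GQL reduction itself; the rate constants are assumed for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy mechanism A -> B -> C with k2 >> k1. Setting d[B]/dt ~ 0 gives
# the QSS expression [B] ~ k1*[A]/k2 and a reduced model without B.
k1, k2 = 1.0, 1000.0

def full(t, y):              # full (stiff) kinetics
    A, B, C = y
    return [-k1*A, k1*A - k2*B, k2*B]

def reduced(t, y):           # QSSA-reduced kinetics (B eliminated)
    A, C = y
    return [-k1*A, k1*A]

sol_f = solve_ivp(full, (0, 5), [1, 0, 0], method="LSODA", rtol=1e-8)
sol_r = solve_ivp(reduced, (0, 5), [1, 0], rtol=1e-8)
print("[C](t=5):", sol_f.y[2, -1], "full vs", sol_r.y[1, -1], "reduced")
```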

  15. A New Conflict Resolution Method for Multiple Mobile Robots in Cluttered Environments With Motion-Liveness.

    PubMed

    Shahriari, Mohammadali; Biglarbegian, Mohammad

    2018-01-01

This paper presents a new conflict resolution methodology for multiple mobile robots that ensures their motion-liveness, especially in cluttered and dynamic environments. Our method constructs a mathematical formulation in the form of an optimization problem, minimizing the overall travel times of the robots subject to resolving all the conflicts in their motion. This optimization problem can be easily solved by coordinating only the robots' speeds. To overcome the computational cost of executing the algorithm in very cluttered environments, we develop an innovative method that clusters the environment into independent subproblems that can be solved using parallel programming techniques. We demonstrate the scalability of our approach through extensive simulations. Simulation results showed that our proposed method is capable of resolving the conflicts of 100 robots in less than 1.23 s in a cluttered environment with 4357 intersections in the robots' paths. We also developed an experimental testbed and demonstrated that our approach can be implemented in real time. We finally compared our approach with other existing methods in the literature, both quantitatively and qualitatively. This comparison shows that, while our approach is mathematically sound, it is also more computationally efficient, scalable to very large numbers of robots, and guarantees live and smooth robot motion.
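
    A toy sketch of the speed-only coordination idea, for two robots and one path intersection: writing the problem in terms of slownesses s_i = 1/v_i makes travel and arrival times linear, and fixing the crossing order turns the conflict constraint into a linear one. The distances, bounds and required gap below are assumed; the paper's full formulation and clustering scheme are not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

# Two robots on fixed paths that cross at one intersection. With
# slowness s_i = 1/v_i as the decision variable, travel time (L_i*s_i)
# and arrival time at the intersection (d_i*s_i) are linear in s.
d = np.array([6.0, 4.0])     # distance from each robot to the intersection (m)
L = np.array([10.0, 12.0])   # total path length of each robot (m)
gap = 1.0                    # required arrival-time separation (s)
s_lo, s_hi = 1/2.0, 1/0.2    # slowness bounds from speeds in [0.2, 2.0] m/s

# Fix the crossing order (robot 0 first): d[1]*s1 - d[0]*s0 >= gap,
# written as d[0]*s0 - d[1]*s1 <= -gap for linprog's A_ub form.
res = linprog(c=L, A_ub=[[d[0], -d[1]]], b_ub=[-gap],
              bounds=[(s_lo, s_hi)] * 2)       # minimize total travel time
print("speeds:", 1 / res.x, "intersection arrivals:", d * res.x)
```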

  16. A Mobile Outdoor Augmented Reality Method Combining Deep Learning Object Detection and Spatial Relationships for Geovisualization

    PubMed Central

    Rao, Jinmeng; Qiao, Yanjun; Ren, Fu; Wang, Junxing; Du, Qingyun

    2017-01-01

    The purpose of this study was to develop a robust, fast and markerless mobile augmented reality method for registration, geovisualization and interaction in uncontrolled outdoor environments. We propose a lightweight deep-learning-based object detection approach for mobile or embedded devices; the vision-based detection results of this approach are combined with spatial relationships by means of the host device’s built-in Global Positioning System receiver, Inertial Measurement Unit and magnetometer. Virtual objects generated based on geospatial information are precisely registered in the real world, and an interaction method based on touch gestures is implemented. The entire method is independent of the network to ensure robustness to poor signal conditions. A prototype system was developed and tested on the Wuhan University campus to evaluate the method and validate its results. The findings demonstrate that our method achieves a high detection accuracy, stable geovisualization results and interaction. PMID:28837096

  17. A Mobile Outdoor Augmented Reality Method Combining Deep Learning Object Detection and Spatial Relationships for Geovisualization.

    PubMed

    Rao, Jinmeng; Qiao, Yanjun; Ren, Fu; Wang, Junxing; Du, Qingyun

    2017-08-24

    The purpose of this study was to develop a robust, fast and markerless mobile augmented reality method for registration, geovisualization and interaction in uncontrolled outdoor environments. We propose a lightweight deep-learning-based object detection approach for mobile or embedded devices; the vision-based detection results of this approach are combined with spatial relationships by means of the host device's built-in Global Positioning System receiver, Inertial Measurement Unit and magnetometer. Virtual objects generated based on geospatial information are precisely registered in the real world, and an interaction method based on touch gestures is implemented. The entire method is independent of the network to ensure robustness to poor signal conditions. A prototype system was developed and tested on the Wuhan University campus to evaluate the method and validate its results. The findings demonstrate that our method achieves a high detection accuracy, stable geovisualization results and interaction.

  18. Validation of Western North America Models based on finite-frequency and ray theory imaging methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larmat, Carene; Maceira, Monica; Porritt, Robert W.

    2015-02-02

We validate seismic models developed for western North America with a focus on the effect of imaging methods on data fit. We use the DNA09 models, for which our collaborators provide models built with both the body-wave finite-frequency (FF) approach and the ray theory (RT) approach, with the data selection, processing and reference models kept the same.

  19. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    NASA Technical Reports Server (NTRS)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  20. Situational Approaches to Direct Practice: Origin, Decline, and Re-Emergence

    ERIC Educational Resources Information Center

    Murdach, Allison D.

    2007-01-01

    During the 1890s and the first three decades of the 20th century, social work in the United States developed a community-based direct practice approach to family assistance and social reform. The basis for this method was a situational view of social life that emphasized the use of interpersonal and transactional methods to achieve social and…

  1. Alternative Test Methods for Developmental Neurotoxicity: A ...

    EPA Pesticide Factsheets

Exposure to environmental contaminants is well documented to adversely impact the development of the nervous system. However, the time-, animal- and resource-intensive EPA and OECD testing guideline methods for developmental neurotoxicity (DNT) are not a viable solution for characterizing the potential chemical hazards of the thousands of untested chemicals currently in commerce. Thus, research efforts over the past decade have endeavored to develop cost-effective alternative DNT testing methods. These efforts have begun to generate data that can inform regulatory decisions, yet there are major challenges to both the acceptance and the use of these data. Major scientific challenges for DNT include the development of new methods and models that are “fit for purpose”, the development of a decision-use framework, and regulatory acceptance of the methods. It is critical to understand that the use of data from these methods will be driven mainly by the regulatory problems being addressed. Some problems may be addressed with limited datasets, while others may require data for large numbers of chemicals, or require the development and use of new biological and computational models. For example, mechanistic information derived from in vitro DNT assays can be used to inform weight of evidence (WoE) or integrated approaches to testing and assessment (IATA) for chemical-specific assessments. Alternatively, in vitro data can be used to prioritize (for further testing) the thousands

  2. A new approach for continuous estimation of baseflow using discrete water quality data: Method description and comparison with baseflow estimates from two existing approaches

    USGS Publications Warehouse

    Miller, Matthew P.; Johnson, Henry M.; Susong, David D.; Wolock, David M.

    2015-01-01

    Understanding how watershed characteristics and climate influence the baseflow component of stream discharge is a topic of interest to both the scientific and water management communities. Therefore, the development of baseflow estimation methods is a topic of active research. Previous studies have demonstrated that graphical hydrograph separation (GHS) and conductivity mass balance (CMB) methods can be applied to stream discharge data to estimate daily baseflow. While CMB is generally considered to be a more objective approach than GHS, its application across broad spatial scales is limited by a lack of high frequency specific conductance (SC) data. We propose a new method that uses discrete SC data, which are widely available, to estimate baseflow at a daily time step using the CMB method. The proposed approach involves the development of regression models that relate discrete SC concentrations to stream discharge and time. Regression-derived CMB baseflow estimates were more similar to baseflow estimates obtained using a CMB approach with measured high frequency SC data than were the GHS baseflow estimates at twelve snowmelt dominated streams and rivers. There was a near perfect fit between the regression-derived and measured CMB baseflow estimates at sites where the regression models were able to accurately predict daily SC concentrations. We propose that the regression-derived approach could be applied to estimate baseflow at large numbers of sites, thereby enabling future investigations of watershed and climatic characteristics that influence the baseflow component of stream discharge across large spatial scales.
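
    A sketch of the two ingredients, assuming the standard conductivity-mass-balance separation and an illustrative regression form (constant, log-discharge, and annual harmonic terms); the paper's actual regression specification may differ.

```python
import numpy as np

def cmb_baseflow(Q, SC, sc_bf, sc_ro):
    """Conductivity mass balance: partition discharge Q into baseflow
    using stream SC and the baseflow/runoff end-member SC values."""
    bf = Q * (SC - sc_ro) / (sc_bf - sc_ro)
    return np.clip(bf, 0.0, Q)

# Hypothetical regression: predict daily SC from log-discharge and
# seasonal terms, trained on the sparse discrete SC samples.
def fit_sc_model(Q_s, t_s, SC_s):
    X = np.column_stack([np.ones_like(Q_s), np.log(Q_s),
                         np.sin(2*np.pi*t_s/365.25), np.cos(2*np.pi*t_s/365.25)])
    coef, *_ = np.linalg.lstsq(X, SC_s, rcond=None)
    return lambda Q, t: np.column_stack(
        [np.ones_like(Q), np.log(Q),
         np.sin(2*np.pi*t/365.25), np.cos(2*np.pi*t/365.25)]) @ coef
```

    With daily discharge Q and day-of-year t, cmb_baseflow(Q, sc_model(Q, t), sc_bf, sc_ro) then gives a daily separation; taking the end-members sc_bf and sc_ro from high and low percentiles of the observed SC record is a common convention assumed here, not necessarily the paper's prescription.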

  3. Characterization of Meta-Materials Using Computational Electromagnetic Methods

    NASA Technical Reports Server (NTRS)

    Deshpande, Manohar; Shin, Joon

    2005-01-01

An efficient and powerful computational method is presented to synthesize a meta-material with specified electromagnetic properties. Exploiting the periodicity of meta-materials, a Finite Element Method (FEM) is developed to estimate the reflection and transmission through the meta-material structure for normal plane-wave incidence. For efficient computation of the reflection and transmission over a wide frequency band, a Finite Difference Time Domain (FDTD) approach is also developed. Using the Nicholson-Ross method and Genetic Algorithms, a robust procedure to extract the electromagnetic properties of a meta-material from knowledge of its reflection and transmission coefficients is described. A few numerical examples are also presented to validate the present approach.
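
    As a rough illustration of the extraction step, here is a simplified Nicholson-Ross-style computation for a TEM line in Python. The root and logarithm branches are handled naively (reliable only for electrically thin slabs), and the Genetic Algorithm refinement mentioned in the record is not shown.

```python
import numpy as np

def nrw_extract(s11, s21, f, L):
    """Estimate (eps_r, mu_r) of a slab of thickness L from S11/S21
    measured on a TEM line (simplified Nicholson-Ross-style extraction)."""
    c0 = 299792458.0
    X = (s11**2 - s21**2 + 1) / (2 * s11)
    gamma_r = X + np.sqrt(X**2 - 1 + 0j)       # interface reflection coeff.
    if abs(gamma_r) > 1:                       # pick the physical root
        gamma_r = X - np.sqrt(X**2 - 1 + 0j)
    T = (s11 + s21 - gamma_r) / (1 - (s11 + s21) * gamma_r)
    gamma = -np.log(T) / L                     # propagation constant (principal branch)
    gamma0 = 2j * np.pi * f / c0               # free-space propagation constant
    z = (1 + gamma_r) / (1 - gamma_r)          # normalized wave impedance
    mu_r = (gamma / gamma0) * z                # gamma = gamma0*sqrt(mu*eps),
    eps_r = (gamma / gamma0) / z               # z = sqrt(mu/eps)
    return eps_r, mu_r
```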

  4. TLNS3D/CDISC Multipoint Design of the TCA Concept

    NASA Technical Reports Server (NTRS)

    Campbell, Richard L.; Mann, Michael J.

    1999-01-01

    This paper presents the work done to date by the authors on developing an efficient approach to multipoint design and applying it to the design of the HSR TCA (High Speed Research Technology Concept Aircraft) configuration. While the title indicates that this exploratory study has been performed using the TLNS3DMB flow solver and the CDISC (Constrained Direct Iterative Surface Curvature) design method, the CDISC method could have been used with any flow solver, and the multipoint design approach does not require the use of CDISC. The goal of the study was to develop a multipoint design method that could achieve a design in about the same time as 10 analysis runs.

  5. Analytical and quasi-Bayesian methods as development of the iterative approach for mixed radiation biodosimetry.

    PubMed

    Słonecka, Iwona; Łukasik, Krzysztof; Fornalski, Krzysztof W

    2018-06-04

The present paper proposes two methods of calculating the components of the dose absorbed by the human body after exposure to a mixed neutron and gamma radiation field. The article presents a novel approach that replaces the common iterative method with an analytical form, thus reducing the calculation time. It also shows a possibility of estimating the neutron and gamma doses when their ratio in a mixed beam is not precisely known.
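
    A minimal sketch of the analytical idea, under assumed response models: a linear-quadratic gamma dose response, a linear neutron response, and a known neutron-to-gamma dose ratio, which together reduce the mixed-field yield equation to a single quadratic in the gamma dose. The coefficients below are illustrative, not calibration data.

```python
import numpy as np

def mixed_field_doses(Y, ratio, a_n, a_g, b_g):
    """Analytical dose partition for a mixed neutron/gamma exposure.

    Yield model (assumed): Y = a_n*D_n + a_g*D_g + b_g*D_g**2,
    with a known neutron-to-gamma dose ratio D_n = ratio * D_g.
    Substituting gives one quadratic in D_g, solved in closed form."""
    b = a_g + ratio * a_n
    D_g = (-b + np.sqrt(b**2 + 4 * b_g * Y)) / (2 * b_g)
    return ratio * D_g, D_g   # (neutron dose, gamma dose)

# Illustrative coefficients (dicentrics per cell per Gy), not real data.
D_n, D_g = mixed_field_doses(Y=0.8, ratio=2.0, a_n=0.6, a_g=0.03, b_g=0.06)
print(f"D_n = {D_n:.2f} Gy, D_g = {D_g:.2f} Gy")
```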

  6. A novel approach to identifying regulatory motifs in distantly related genomes

    PubMed Central

    Van Hellemont, Ruth; Monsieurs, Pieter; Thijs, Gert; De Moor, Bart; Van de Peer, Yves; Marchal, Kathleen

    2005-01-01

    Although proven successful in the identification of regulatory motifs, phylogenetic footprinting methods still show some shortcomings. To assess these difficulties, most apparent when applying phylogenetic footprinting to distantly related organisms, we developed a two-step procedure that combines the advantages of sequence alignment and motif detection approaches. The results on well-studied benchmark datasets indicate that the presented method outperforms other methods when the sequences become either too long or too heterogeneous in size. PMID:16420672

  7. A computational approach to candidate gene prioritization for X-linked mental retardation using annotation-based binary filtering and motif-based linear discriminatory analysis

    PubMed Central

    2011-01-01

Background Several computational candidate gene selection and prioritization methods have recently been developed. These in silico selection and prioritization techniques are usually based on two central approaches - the examination of similarities to known disease genes and/or the evaluation of the functional annotation of genes. Each of these approaches has its own caveats. Here we employ a previously described method of candidate gene prioritization based mainly on gene annotation, together with a technique based on the evaluation of pertinent sequence motifs or signatures, in an attempt to refine the gene prioritization approach. We apply this approach to X-linked mental retardation (XLMR), a group of heterogeneous disorders for which some of the underlying genetics is known. Results The gene annotation-based binary filtering method yielded a ranked list of putative XLMR candidate genes with good plausibility of being associated with the development of mental retardation. In parallel, a motif-finding approach based on linear discriminant analysis (LDA) was employed to identify short sequence patterns that may discriminate XLMR from non-XLMR genes. High rates (>80%) of correct classification were achieved, suggesting that the identification of these motifs effectively captures genomic signals associated with XLMR vs. non-XLMR genes. The computational tools developed for the motif-based LDA are integrated into the freely available genomic analysis portal Galaxy (http://main.g2.bx.psu.edu/). Nine genes (APLN, ZC4H2, MAGED4, MAGED4B, RAP2C, FAM156A, FAM156B, TBL1X, and UXT) were highlighted as highly ranked XLMR candidate genes. Conclusions The combination of gene annotation information and sequence-motif-oriented computational candidate gene prediction methods offers an added benefit in generating a list of plausible candidate genes, as has been demonstrated for XLMR. Reviewers: This article was reviewed by Dr Barbara Bardoni (nominated by Prof Juergen Brosius); Prof Neil Smalheiser and Dr Dustin Holloway (nominated by Prof Charles DeLisi). PMID:21668950
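
    A generic sketch of the LDA classification step, assuming hypothetical k-mer count features and synthetic labels; the study's actual motif features and gene sets are not reproduced here.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical feature matrix: rows are genes, columns are counts of
# short sequence motifs (e.g., k-mers); labels mark XLMR vs non-XLMR.
X = rng.poisson(3.0, size=(120, 50)).astype(float)
y = np.r_[np.ones(40), np.zeros(80)]        # 40 XLMR, 80 non-XLMR
X[y == 1, :5] += 2.0                        # pretend 5 motifs are enriched

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)   # classification accuracy
print(f"mean CV accuracy: {scores.mean():.2f}")
```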

  8. Developing Learners' Second Language Communicative Competence through Active Learning: Clickers or Communicative Approach?

    ERIC Educational Resources Information Center

    Agbatogun, Alaba Olaoluwakotansibe

    2014-01-01

The purpose of this study was to compare the impact of clickers, the communicative approach and the lecture method on the communicative competence development of learners who were taught English as a second language (ESL). Ninety-nine pupils from three primary schools participated in the study. Quasi-experimental non-randomised pre-test posttest…

  9. Student-Led Development of an Interactive and Free Biochemical Methods eBook

    ERIC Educational Resources Information Center

    Hill, Alyssa C.; Nickels, Logan M.; Sims, Paul A.

    2016-01-01

    An approach to create an interactive and inexpensive electronic book (eBook) for an undergraduate biochemistry laboratory course is presented. This approach featured the involvement of an undergraduate student in the lead role of designing and developing the eBook using Apple's iBooks Author application. The eBook, entitled "Introduction to…

  10. Implementation of Active Teaching Methods and Emerging Topics in Photogrammetry and Remote Sensing Subjects

    NASA Astrophysics Data System (ADS)

    Kosmatin Fras, M.; Grigillo, D.

    2016-06-01

Fast technological developments in photogrammetry and remote sensing demand quick and steady changes in education programmes and their realization. University teachers and assistants must provide learning materials, data and software for practical lessons, as well as project proposals for students' team work and bachelor or master theses. In this paper, emerging topics that already have a considerable impact in practice are treated mostly from the educational perspective. The relatively new topics considered here are unmanned aerial systems for spatial data collection, terrestrial and aerial laser scanning, mobile mapping systems, and novelties in satellite remote sensing. The focus is on the practical implementation of these topics in the teaching and learning programme of Geodesy and Geoinformation at the University of Ljubljana, Faculty of Civil and Geodetic Engineering, and on the experiences gained by the authors so far. Together with the technological advances, teaching approaches must be modernized as well. Classical approaches, in which a lecturer lectures ex cathedra and students are only listeners, are not effective enough. Didactics, the science of teaching, has developed and proven in practice many useful approaches that can better motivate students towards more active learning. Different methods of team work can be used, such as pro et contra debate, buzzing groups, press conferences, moderated discussion, etc. An experimental study on active teaching methods was conducted in a class of students of the Master programme of Geodesy and Geoinformation, and the results are presented. After new teaching methods were used in the class, the students were asked to complete two questionnaires. The first was the standard form developed by Noel Entwistle, an educational psychologist who developed the Approaches to Studying Inventory (ASI) for identifying deep and surface approaches to learning. The second was developed for our purpose, to obtain feedback from students on the active teaching and learning methods. Although this investigation covered only one class of master programme students, the results are encouraging, and we extract some recommendations for the future.

  11. A review of lipidomic technologies applicable to sphingolipidomics and their relevant applications

    PubMed Central

    Han, Xianlin; Jiang, Xuntian

    2009-01-01

Sphingolipidomics, a branch of lipidomics, focuses on the large-scale study of cellular sphingolipidomes. In the current review, the two main approaches for the analysis of cellular sphingolipidomes (i.e. the LC-MS- or LC-MS/MS-based approach and the shotgun lipidomics-based approach) are briefly discussed. Their advantages, some considerations regarding these methods, and recent applications of these approaches are summarized. It is the authors' sincere hope that this review article will add to the readers' understanding of the advantages and limitations of each developed method for the analysis of a cellular sphingolipidome. PMID:19690629

  12. Novel strategy of endoscopic submucosal dissection using an insulation-tipped knife for early gastric cancer: near-side approach method

    PubMed Central

    Mori, Genki; Nonaka, Satoru; Oda, Ichiro; Abe, Seiichiro; Suzuki, Haruhisa; Yoshinaga, Shigetaka; Nakajima, Takeshi; Saito, Yutaka

    2015-01-01

Background and study aims: Endoscopic submucosal dissection (ESD) using insulation-tipped knives (IT knives) to treat gastric lesions located on the greater curvature of the gastric body remains technically challenging because of the associated bleeding, control of which can be difficult and time consuming. To eliminate these difficulties, we developed a novel strategy which we have called the “near-side approach method” and assessed its utility. Patients and methods: We reviewed patients who underwent ESD for solitary early gastric cancer located on the greater curvature of the gastric body from January 2003 to September 2014. The technical results of ESD were compared between the group treated with the novel near-side approach method and the group treated with the conventional method. Results: This study included 238 patients with 238 lesions, 118 of which were removed using the near-side approach method and 120 of which were removed using the conventional method. The median procedure time was 92 minutes for the near-side approach method and 120 minutes for the conventional method; the procedure time was significantly shorter with the near-side approach method. Although the procedure time required by an experienced endoscopist was not significantly different between the two groups (100 vs. 110 minutes), the near-side approach group showed a significantly shorter procedure time for less-experienced endoscopists (90 vs. 120 minutes). Conclusions: The near-side approach method appears to require less time to complete gastric ESD than the conventional method using IT knives for technically challenging lesions located on the greater curvature of the gastric body, especially if the procedure is performed by less-experienced endoscopists. PMID:26528496

  13. Discovering Entrepreneurship: An Exploration of a Tripartite Approach to Developing Entrepreneurial Capacities

    ERIC Educational Resources Information Center

    Collins, Lorna A.; Smith, Alison J.; Hannon, Paul D.

    2006-01-01

    Purpose: To describe an exploration in the use of synergistic learning methods in the delivery of an innovative pilot programme designed to teach entrepreneurship capacities. The programme took a tripartite approach involving nascent entrepreneurs, existing entrepreneurs and facilitators using an action research and action learning approach.…

  14. Developing a Mind-Body Exercise Programme for Stressed Children

    ERIC Educational Resources Information Center

    Wang, Claudia; Seo, Dong-Chul; Geib, Roy W

    2017-01-01

    Objective: To describe the process of developing a Health Qigong programme for stressed children using a formative evaluation approach. Methods: A multi-step formative evaluation method was utilised. These steps included (1) identifying programme content and drafting the curriculum, (2) synthesising effective and age-appropriate pedagogies, (3)…

  15. ASSESSMENT OF PC12 CELL DIFFERENTIATION AND NEURITE GROWTH: A COMPARISON OF MORPHOLOGICAL AND NEUROCHEMICAL MEASURES.

    EPA Science Inventory

    In order to screen large numbers of chemicals for their potential to produce developmental neurotoxicity new, in vitro methods are needed. One approach is to develop methods based on the biologic processes which underlie brain development including the growth and maturation of ce...

  16. Interdisciplinary Methods in Water Resources

    ERIC Educational Resources Information Center

    Cosens, Barbara; Fiedler, Fritz; Boll, Jan; Higgins, Lorie; Johnson, Gary; Kennedy, Brian; Strand, Eva; Wilson, Patrick; Laflin, Maureen

    2011-01-01

    In the face of a myriad of complex water resource issues, traditional disciplinary separation is ineffective in developing approaches to promote a sustainable water future. As part of a new graduate program in water resources, faculty at the University of Idaho have developed a course on interdisciplinary methods designed to prepare students for…

  17. Developing analytical approaches to explore the connectionbetween endocrine-active pharmaceuticals in waterto effects in fish

    EPA Science Inventory

    The emphasis of this research project was to develop, and optimize, a solid-phase extraction (SPE) method and high performance liquid chromatography-electrospray ionization- mass spectrometry (LC-MS/MS) method, such that a linkage between the detection of endocrine active pharma...

  18. Direct torque control method applied to the WECS based on the PMSG and controlled with backstepping approach

    NASA Astrophysics Data System (ADS)

    Errami, Youssef; Obbadi, Abdellatif; Sahnoun, Smail; Ouassaid, Mohammed; Maaroufi, Mohamed

    2018-05-01

This paper proposes a Direct Torque Control (DTC) method, combined with a Backstepping approach, for a Wind Power System (WPS) based on a Permanent Magnet Synchronous Generator (PMSG). In this work, a generator-side converter and a grid-side converter with filter are used as the interface between the wind turbine and the grid. The Backstepping approach demonstrates strong performance in the control of complicated nonlinear systems such as the WPS. The control method therefore combines the DTC to achieve Maximum Power Point Tracking (MPPT) with the Backstepping approach to sustain the DC-bus voltage and to regulate the grid-side power factor. In addition, the control strategy is developed in the sense of the Lyapunov stability theorem for the WPS. Simulation results using MATLAB/Simulink validate the effectiveness of the proposed controllers.
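
    One standard way to realize the MPPT objective inside such a DTC loop (not necessarily the authors' exact law) is the optimal-torque rule T* = k_opt * omega^2, which holds the turbine at its optimal tip-speed ratio below rated wind speed. The turbine parameters below are assumed purely for illustration.

```python
import numpy as np

# Optimal-torque MPPT: below rated wind speed, command the generator
# torque T* = k_opt * omega**2 so the rotor settles at lambda_opt.
rho, R = 1.225, 35.0          # air density (kg/m^3), rotor radius (m); assumed
cp_max, lam_opt = 0.48, 8.1   # peak power coefficient, optimal tip-speed ratio

k_opt = 0.5 * rho * np.pi * R**5 * cp_max / lam_opt**3

def torque_ref(omega):
    """Torque reference handed to the DTC inner loop (Nm)."""
    return k_opt * omega**2

print(f"k_opt = {k_opt:.1f}, T* at 2 rad/s = {torque_ref(2.0):.0f} Nm")
```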

  19. The rideability of a deflected bridge approach slab : technical summary report 457.

    DOT National Transportation Integrated Search

    2009-11-01

The Louisiana Department of Transportation and Development (LADOTD) initiated the Louisiana Quality Initiative (LQI) entitled Preservation of Bridge Approach Rideability to explore different potential methods of solving what has been observ...

  20. Development of an integrated configuration management/flight director system for piloted STOL approaches

    NASA Technical Reports Server (NTRS)

    Hoh, R. H.; Klein, R. H.; Johnson, W. A.

    1977-01-01

    A system analysis method for the development of an integrated configuration management/flight director system for IFR STOL approaches is presented. Curved descending decelerating approach trajectories are considered. Considerable emphasis is placed on satisfying the pilot centered requirements (acceptable workload) as well as the usual guidance and control requirements (acceptable performance). The Augmentor Wing Jet STOL Research Aircraft was utilized to allow illustration by example, and to validate the analysis procedure via manned simulation.

  1. A bootstrapping method for development of Treebank

    NASA Astrophysics Data System (ADS)

    Zarei, F.; Basirat, A.; Faili, H.; Mirain, M.

    2017-01-01

Using statistical approaches alongside the traditional methods of natural language processing can significantly improve both the quality and performance of several natural language processing (NLP) tasks. The effective usage of these approaches depends on the availability of informative, accurate and detailed corpora on which the learners are trained. This article introduces a bootstrapping method for developing annotated corpora based on a complex, linguistically rich elementary structure called a supertag. To this end, a hybrid method for supertagging is proposed that combines the generative and discriminative methods of supertagging. The method was applied to a subset of the Wall Street Journal (WSJ) corpus in order to annotate its sentences with a set of linguistically motivated elementary structures of the English XTAG grammar, which uses a lexicalised tree-adjoining grammar formalism. The empirical results confirm that the bootstrapping method provides a satisfactory way of annotating English sentences with the mentioned structures. The experiments show that the method could automatically annotate about 20% of the WSJ with an F-measure of about 80%, which is 12% higher than the F-measure of the XTAG Treebank automatically generated with the approach proposed by Basirat and Faili [(2013). Bridge the gap between statistical and hand-crafted grammars. Computer Speech and Language, 27, 1085-1104].

  2. Development of hybrid genetic-algorithm-based neural networks using regression trees for modeling air quality inside a public transportation bus.

    PubMed

    Kadiyala, Akhil; Kaur, Devinder; Kumar, Ashok

    2013-02-01

The present study developed a novel approach to modeling the indoor air quality (IAQ) of a public transportation bus through hybrid genetic-algorithm-based neural networks (also known as evolutionary neural networks) with input variables optimized using regression trees, referred to as the GART approach. The study validated the applicability of the GART modeling approach to complex nonlinear systems by accurately predicting the monitored contaminants carbon dioxide (CO2), carbon monoxide (CO), nitric oxide (NO), sulfur dioxide (SO2), 0.3-0.4 microm sized particle numbers, 0.4-0.5 microm sized particle numbers, particulate matter (PM) concentrations less than 1.0 microm (PM1.0), and PM concentrations less than 2.5 microm (PM2.5) inside a public transportation bus operating on 20% grade biodiesel in Toledo, OH. First, the important variables affecting each monitored in-bus contaminant were determined using regression trees. Second, analysis of variance was used as a complementary sensitivity analysis to the regression-tree results to determine a subset of statistically significant variables affecting each monitored in-bus contaminant. Finally, the identified subsets of statistically significant variables were used as inputs to develop three artificial neural network (ANN) models: a regression-tree-based back-propagation network (BPN-RT), a regression-tree-based radial basis function network (RBFN-RT), and the GART model. Performance measures were used to validate the predictive capacity of the developed IAQ models, and the results were compared with those obtained from a theoretical approach and a generalized practicable approach that considered additional independent variables when developing the aforementioned ANN models. The hybrid GART models captured the majority of the variance in the monitored in-bus contaminants, and the genetic-algorithm-based neural network IAQ models outperformed the traditional ANN methods of back-propagation and radial basis function networks. The novelty of this research lies in modeling vehicular indoor air quality by integrating the advanced methods of genetic algorithms, regression trees, and analysis of variance for the monitored in-vehicle gaseous and particulate matter contaminants, and in comparing the results with the conventional artificial intelligence techniques of back-propagation and radial basis function networks. The study validated the newly developed approach using holdout and threefold cross-validation. These results are of great interest to scientists, researchers, and the public in understanding the various aspects of modeling an indoor microenvironment. The methodology can easily be extended to other fields of study.
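
    The two-stage screen-then-train idea reads naturally as a pipeline. Below is a minimal sketch under stated assumptions: synthetic data stand in for the bus measurements, sklearn's DecisionTreeRegressor supplies the regression-tree screening, and an ordinary gradient-trained MLPRegressor stands in for the GA-evolved network (the genetic-algorithm weight search itself is not reproduced).

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 12))             # candidate predictors
y = 3*X[:, 0] - 2*X[:, 3] + X[:, 7]**2 + 0.1*rng.standard_normal(500)

# Step 1: a regression tree screens the candidate inputs.
tree = DecisionTreeRegressor(max_depth=5, random_state=0).fit(X, y)
top = np.argsort(tree.feature_importances_)[::-1][:3]

# Step 2: ANN trained on the selected subset (GA-evolved weights in the
# original approach; plain gradient training here for brevity).
Xtr, Xte, ytr, yte = train_test_split(X[:, top], y, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                   random_state=0).fit(Xtr, ytr)
print("selected inputs:", top, " R^2:", round(ann.score(Xte, yte), 2))
```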

  3. Automatically finding relevant citations for clinical guideline development.

    PubMed

    Bui, Duy Duc An; Jonnalagadda, Siddhartha; Del Fiol, Guilherme

    2015-10-01

Literature database search is a crucial step in the development of clinical practice guidelines and systematic reviews. In the age of information technology, the process of literature search is still conducted manually; it is therefore costly, slow and subject to human error. In this research, we sought to improve the traditional search approach using innovative query expansion and citation ranking approaches. We developed a citation retrieval system composed of query expansion and citation ranking methods. The methods are unsupervised and easily integrated over the PubMed search engine. To validate the system, we developed a gold standard consisting of citations that were systematically searched and screened to support the development of cardiovascular clinical practice guidelines. The expansion and ranking methods were evaluated separately and compared with baseline approaches. Compared with the baseline PubMed expansion, the query expansion algorithm improved recall (80.2% vs. 51.5%) with a small loss in precision (0.4% vs. 0.6%). The algorithm could find all citations used to support a larger number of guideline recommendations than the baseline approach (64.5% vs. 37.2%, p<0.001). In addition, the citation ranking approach performed better than PubMed's "most recent" ranking (average precision +6.5%, recall@k +21.1%, p<0.001), PubMed's rank by "relevance" (average precision +6.1%, recall@k +14.8%, p<0.001), and the machine learning classifier that identifies scientifically sound studies from MEDLINE citations (average precision +4.9%, recall@k +4.2%, p<0.001). Our unsupervised query expansion and ranking techniques are more flexible and effective than PubMed's default search engine behavior and the machine learning classifier. Automated citation finding is a promising way to augment the traditional literature search. Copyright © 2015 Elsevier Inc. All rights reserved.
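
    A toy sketch of the two ideas, expansion from a synonym table and similarity-based ranking; the synonym table, candidate citations and TF-IDF scoring below are illustrative stand-ins, not the paper's actual resources or ranking function.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy synonym table standing in for the paper's query-expansion resource.
SYNONYMS = {"heart attack": ["myocardial infarction", "mi"],
            "statin": ["hmg-coa reductase inhibitor"]}

def expand(term):
    return " OR ".join([term] + SYNONYMS.get(term.lower(), []))

# Rank retrieved citations against the expanded query by TF-IDF similarity.
citations = ["Myocardial infarction outcomes under statin therapy ...",
             "Gardening tips for spring ...",
             "HMG-CoA reductase inhibitors after acute MI ..."]
query = expand("heart attack") + " " + expand("statin")
vec = TfidfVectorizer().fit(citations + [query])
sims = cosine_similarity(vec.transform([query]), vec.transform(citations))[0]
print(sorted(zip(sims.round(2), citations), reverse=True)[:2])
```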

  4. IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    G. W. Parry; J.A Forester; V.N. Dang

    2013-09-01

This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.

  5. Incremental checking of Master Data Management model based on contextual graphs

    NASA Astrophysics Data System (ADS)

    Lamolle, Myriam; Menet, Ludovic; Le Duc, Chan

    2015-10-01

The validation of models is a crucial step in distributed heterogeneous systems. In this paper, an incremental validation method is proposed within the scope of a Model Driven Engineering (MDE) approach, applied to the Master Data Management (MDM) field where models are represented by XML Schemas. The MDE approach presented in this paper is based on the definition of an abstraction layer using UML class diagrams. The validation method aims to minimise model errors and to optimise the process of model checking. Therefore, the notion of validation contexts is introduced, allowing the verification of data model views. Description logics specify the constraints that the models have to check. An experiment with the approach is presented through an application developed in the ArgoUML IDE.

  6. Enhanced semantic interoperability by profiling health informatics standards.

    PubMed

    López, Diego M; Blobel, Bernd

    2009-01-01

Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method, and suggest the necessary tooling, for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism for specializing reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIM specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages, such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.

  7. The Island Approach.

    ERIC Educational Resources Information Center

    Schroder, Peter C.

    1994-01-01

    Proposes the study of islands to develop a method of integrating sustainable development with sound resource management that can be extrapolated to more complex, highly populated continental coastal areas. (MDH)

  8. A Graph-Embedding Approach to Hierarchical Visual Word Mergence.

    PubMed

    Wang, Lei; Liu, Lingqiao; Zhou, Luping

    2017-02-01

Appropriately merging visual words is an effective dimension-reduction method for the bag-of-visual-words model in image classification. The approach of hierarchically merging visual words has been extensively employed because it gives a fully determined merging hierarchy. Existing supervised hierarchical merging methods take different approaches and realize the merging process with various formulations. In this paper, we propose a unified hierarchical merging approach built upon the graph-embedding framework. Our approach is able to merge visual words for any scenario in which a preferred structure and an undesired structure are defined, and it can therefore effectively attend to all kinds of requirements for the word-merging process. In terms of computational efficiency, we show that our algorithm can seamlessly integrate a fast search strategy developed in our previous work and thus maintain the state-of-the-art merging speed. To the best of our knowledge, the proposed approach is the first to address hierarchical visual word mergence in such a flexible and unified manner. As demonstrated, it can maintain excellent image classification performance even after a significant dimension reduction, and it outperforms all existing comparable visual word-merging methods. In a broad sense, our work provides an open platform for applying, evaluating, and developing new criteria for hierarchical word-merging tasks.

  9. Robust climate policies under uncertainty: a comparison of robust decision making and info-gap methods.

    PubMed

    Hall, Jim W; Lempert, Robert J; Keller, Klaus; Hackbarth, Andrew; Mijere, Christophe; McInerney, David J

    2012-10-01

    This study compares two widely used approaches for robustness analysis of decision problems: the info-gap method originally developed by Ben-Haim and the robust decision making (RDM) approach originally developed by Lempert, Popper, and Bankes. The study uses each approach to evaluate alternative paths for climate-altering greenhouse gas emissions given the potential for nonlinear threshold responses in the climate system, significant uncertainty about such a threshold response and a variety of other key parameters, as well as the ability to learn about any threshold responses over time. Info-gap and RDM share many similarities. Both represent uncertainty as sets of multiple plausible futures, and both seek to identify robust strategies whose performance is insensitive to uncertainties. Yet they also exhibit important differences, as they arrange their analyses in different orders, treat losses and gains in different ways, and take different approaches to imprecise probabilistic information. The study finds that the two approaches reach similar but not identical policy recommendations and that their differing attributes raise important questions about their appropriate roles in decision support applications. The comparison not only improves understanding of these specific methods, it also suggests some broader insights into robustness approaches and a framework for comparing them. © 2012 RAND Corporation.

  10. Design for a Crane Metallic Structure Based on Imperialist Competitive Algorithm and Inverse Reliability Strategy

    NASA Astrophysics Data System (ADS)

    Fan, Xiao-Ning; Zhi, Bo

    2017-07-01

Uncertainties in parameters such as materials, loading, and geometry are inevitable in designing metallic structures for cranes. When considering these uncertainty factors, reliability-based design optimization (RBDO) offers a more reasonable design approach. However, existing RBDO methods for crane metallic structures are prone to low convergence speed and high computational cost. A unilevel RBDO method, combining a discrete imperialist competitive algorithm with an inverse reliability strategy based on the performance measure approach, is developed. Application of the imperialist competitive algorithm at the optimization level significantly improves the convergence speed of this RBDO method. At the reliability analysis level, the inverse reliability strategy is used to determine the feasibility of each probabilistic constraint at each design point by calculating its α-percentile performance, thereby avoiding the convergence failure, calculation error, and disproportionate computational effort encountered with conventional moment and simulation methods. Application of the RBDO method to an actual crane structure shows that the developed RBDO realizes a design with the best tradeoff between economy and safety, while requiring about one-third of the convergence time and computational cost of the existing method. This paper provides a scientific and effective approach for the design of the metallic structures of cranes.

  11. Improving the dictionary lookup approach for disease normalization using enhanced dictionary and query expansion.

    PubMed

    Jonnagaddala, Jitendra; Jue, Toni Rose; Chang, Nai-Wen; Dai, Hong-Jie

    2016-01-01

The rapidly increasing biomedical literature calls for an automatic approach to the recognition and normalization of disease mentions in order to increase the precision and effectiveness of disease-based information retrieval. A variety of methods have been proposed to deal with the problem of disease named entity recognition and normalization. Among all the proposed methods, conditional random fields (CRFs) and the dictionary lookup method are widely used for named entity recognition and normalization, respectively. We herein developed a CRF-based model to allow automated recognition of disease mentions, and studied the effect of various techniques in improving the normalization results based on the dictionary lookup approach. The dataset from the BioCreative V CDR track was used to report the performance of the developed normalization methods and compare with other existing dictionary lookup based normalization methods. The best configuration achieved an F-measure of 0.77 for the disease normalization, which outperformed the best dictionary lookup based baseline method studied in this work by an F-measure of 0.13. Database URL: https://github.com/TCRNBioinformatics/DiseaseExtract. © The Author(s) 2016. Published by Oxford University Press.
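
    A minimal sketch of the dictionary lookup half, with a crude surface normalization and abbreviation-expansion step; the dictionary entries and abbreviation table are hypothetical stand-ins for resources such as MEDIC, and the CRF recognition stage is not shown.

```python
import re

# Toy disease dictionary: normalized surface form -> concept ID (MeSH-like).
DICT = {
    "hypertension": "D006973",
    "high blood pressure": "D006973",
    "type 2 diabetes mellitus": "D003924",
}
ABBREV = {"t2dm": "type 2 diabetes mellitus", "htn": "hypertension"}

def normalize(mention):
    """Lowercase, strip punctuation, collapse whitespace, expand abbreviations."""
    s = re.sub(r"[^\w\s]", " ", mention.lower())
    s = " ".join(s.split())
    return ABBREV.get(s, s)

def lookup(mention):
    return DICT.get(normalize(mention), None)

print(lookup("HTN"), lookup("High blood-pressure"))  # D006973 D006973
```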

  12. Self-calibrating models for dynamic monitoring and diagnosis

    NASA Technical Reports Server (NTRS)

    Kuipers, Benjamin

    1994-01-01

    The present goal in qualitative reasoning is to develop methods for automatically building qualitative and semiquantitative models of dynamic systems and to use them for monitoring and fault diagnosis. The qualitative approach to modeling provides a guarantee of coverage while our semiquantitative methods support convergence toward a numerical model as observations are accumulated. We have developed and applied methods for automatic creation of qualitative models, developed two methods for obtaining tractable results on problems that were previously intractable for qualitative simulation, and developed more powerful methods for learning semiquantitative models from observations and deriving semiquantitative predictions from them. With these advances, qualitative reasoning comes significantly closer to realizing its aims as a practical engineering method.

  13. Towards a Highly Efficient Meshfree Simulation of Non-Newtonian Free Surface Ice Flow: Application to the Haut Glacier d'Arolla

    NASA Astrophysics Data System (ADS)

    Shcherbakov, V.; Ahlkrona, J.

    2016-12-01

In this work we develop a highly efficient meshfree approach to ice sheet modeling. Traditionally, mesh-based methods such as finite element methods are employed to simulate glacier and ice sheet dynamics. These methods are mature and well developed. However, despite their numerous advantages, they suffer from drawbacks such as the necessity to remesh the computational domain every time it changes its shape, which significantly complicates the implementation on moving domains, and a costly assembly procedure for nonlinear problems. We introduce a novel meshfree approach that frees us from all these issues. The approach is built upon a radial basis function (RBF) method that, thanks to its meshfree nature, allows for efficient handling of moving margins and the free ice surface. RBF methods are also accurate and easy to implement. Since the formulation is stated in strong form, it allows for a substantial reduction of the computational cost associated with the linear system assembly inside the nonlinear solver. We implement a global RBF method that defines an approximation on the entire computational domain. This method exhibits high accuracy. However, it suffers from the disadvantage that the coefficient matrix is dense, so the computational efficiency decreases. To overcome this issue we also implement a localized RBF method that rests upon a partition of unity approach to subdivide the domain into several smaller subdomains. The radial basis function partition of unity method (RBF-PUM) inherits the high approximation characteristics of the global RBF method while resulting in a sparse system of equations, which substantially increases the computational efficiency. To demonstrate the usefulness of the RBF methods we model the velocity field of ice flow in the Haut Glacier d'Arolla. We assume that the flow is governed by the nonlinear Blatter-Pattyn equations. We test the methods for different basal conditions and for a free moving surface. Both RBF methods are compared with a classical finite element method in terms of accuracy and efficiency. We find that the RBF methods are more efficient than the finite element method and well suited for ice dynamics modeling, especially the partition of unity approach.
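
    The building block shared by the global method and RBF-PUM is a dense RBF interpolation/collocation solve. Below is a one-dimensional Gaussian-RBF interpolation sketch for illustration only; the actual solver collocates the nonlinear Blatter-Pattyn equations in higher dimensions, and the shape parameter eps is an assumed value.

```python
import numpy as np

def rbf_interpolate(centers, values, eps=2.0):
    """Global RBF interpolation with a Gaussian kernel: solve A w = f with
    A_ij = phi(|x_i - x_j|), then evaluate s(x) = sum_j w_j phi(|x - x_j|)."""
    phi = lambda r: np.exp(-(eps * r) ** 2)
    r = np.abs(centers[:, None] - centers[None, :])
    w = np.linalg.solve(phi(r), values)
    return lambda x: phi(np.abs(x[:, None] - centers[None, :])) @ w

x = np.linspace(0, 1, 25)                 # interpolation nodes (1-D for brevity)
s = rbf_interpolate(x, np.sin(2 * np.pi * x))
xe = np.linspace(0, 1, 200)
print("max error:", np.abs(s(xe) - np.sin(2 * np.pi * xe)).max())
```

    RBF-PUM performs such solves on overlapping patches and blends them with partition-of-unity weights, which is what turns the dense global system into a sparse one.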

  14. A Multimodal Deep Log-Based User Experience (UX) Platform for UX Evaluation

    PubMed Central

    Ali Khan, Wajahat; Hur, Taeho; Muhammad Bilal, Hafiz Syed; Ul Hassan, Anees; Lee, Sungyoung

    2018-01-01

    The user experience (UX) is an emerging field in user research and design, and the development of UX evaluation methods presents a challenge for both researchers and practitioners. Different UX evaluation methods have been developed to extract accurate UX data. Among UX evaluation methods, the mixed-method approach of triangulation has gained importance. It provides more accurate and precise information about users as they interact with the product. However, this approach requires skilled UX researchers and developers to integrate multiple devices, synchronize them, analyze the data, and ultimately produce an informed decision. In this paper, a method and system for measuring the overall UX over time using a triangulation method are proposed. The proposed platform incorporates observational and physiological measurements in addition to traditional ones. The platform reduces subjective bias and validates the user’s perceptions, which are measured by different sensors through objectification of the subjective nature of the user in the UX assessment. The platform additionally offers plug-and-play support for different devices and powerful analytics for obtaining insight into the UX across multiple participants. PMID:29783712

  15. A Multimodal Deep Log-Based User Experience (UX) Platform for UX Evaluation.

    PubMed

    Hussain, Jamil; Khan, Wajahat Ali; Hur, Taeho; Bilal, Hafiz Syed Muhammad; Bang, Jaehun; Hassan, Anees Ul; Afzal, Muhammad; Lee, Sungyoung

    2018-05-18

    The user experience (UX) is an emerging field in user research and design, and the development of UX evaluation methods presents a challenge for both researchers and practitioners. Different UX evaluation methods have been developed to extract accurate UX data. Among UX evaluation methods, the mixed-method approach of triangulation has gained importance. It provides more accurate and precise information about users as they interact with the product. However, this approach requires skilled UX researchers and developers to integrate multiple devices, synchronize them, analyze the data, and ultimately produce an informed decision. In this paper, a method and system for measuring the overall UX over time using a triangulation method are proposed. The proposed platform incorporates observational and physiological measurements in addition to traditional ones. The platform reduces subjective bias and validates the user's perceptions, which are measured by different sensors through objectification of the subjective nature of the user in the UX assessment. The platform additionally offers plug-and-play support for different devices and powerful analytics for obtaining insight into the UX across multiple participants.

  16. Gradient optimization and nonlinear control

    NASA Technical Reports Server (NTRS)

    Hasdorff, L.

    1976-01-01

    The book represents an introduction to computation in control by an iterative, gradient, numerical method, where linearity is not assumed. The general language and approach used are those of elementary functional analysis. The particular gradient method that is emphasized and used is conjugate gradient descent, a well known method exhibiting quadratic convergence while requiring very little more computation than simple steepest descent. Constraints are not dealt with directly, but rather the approach is to introduce them as penalty terms in the criterion. General conjugate gradient descent methods are developed and applied to problems in control.
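
    A minimal sketch of the book's central recipe, under an illustrative objective and constraint of my own choosing: the constraint enters the criterion as a quadratic penalty term, and the penalized criterion is minimized with a nonlinear conjugate gradient routine.

    # Minimal sketch: constraint handled as a penalty term in the criterion,
    # minimized by nonlinear conjugate gradients. Objective and constraint
    # are illustrative, not taken from the book.
    import numpy as np
    from scipy.optimize import minimize

    def criterion(u, mu=100.0):
        cost = np.sum((u - 1.0) ** 2)            # nominal performance criterion
        violation = max(0.0, np.sum(u) - 2.0)    # constraint: sum(u) <= 2
        return cost + mu * violation ** 2        # quadratic penalty term

    res = minimize(criterion, np.zeros(5), method="CG")  # conjugate gradient descent
    print(res.x, np.sum(res.x))  # minimizer approximately satisfies the constraint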

  17. Employing Design and Development Research (DDR): Approaches in the Design and Development of Online Arabic Vocabulary Learning Games Prototype

    ERIC Educational Resources Information Center

    Sahrir, Muhammad Sabri; Alias, Nor Aziah; Ismail, Zawawi; Osman, Nurulhuda

    2012-01-01

    Design and development research, first proposed by Brown and Collins in the 1990s, is currently among the well-known methods in educational research for testing theory and validating its practicality. The method is also known as developmental research, design research, design-based research, formative research and design-cases, and possesses…

  18. Uncovering a Hidden Professional Agenda for Teacher Educators: A Mixed Method Study on Flemish Teacher Educators and Their Professional Development

    ERIC Educational Resources Information Center

    Tack, Hanne; Valcke, Martin; Rots, Isabel; Struyven, Katrien; Vanderlinde, Ruben

    2018-01-01

    Taking into account the pressing need to understand more about what characterises teacher educators' professional development, this article adopts a mixed method approach to explore Flemish (Dutch-speaking part of Belgium) teacher educators' professional development needs and opportunities. Analysis results of a large-scale survey study with 611…

  19. Human Fecal Source Identification: Real-Time Quantitative PCR Method Standardization

    EPA Science Inventory

    Method standardization or the formal development of a protocol that establishes uniform performance benchmarks and practices is necessary for widespread adoption of a fecal source identification approach. Standardization of a human-associated fecal identification method has been...

  20. Who's in and why? A typology of stakeholder analysis methods for natural resource management.

    PubMed

    Reed, Mark S; Graves, Anil; Dandy, Norman; Posthumus, Helena; Hubacek, Klaus; Morris, Joe; Prell, Christina; Quinn, Claire H; Stringer, Lindsay C

    2009-04-01

    Stakeholder analysis means many things to different people. Various methods and approaches have been developed in different fields for different purposes, leading to confusion over the concept and practice of stakeholder analysis. This paper asks how and why stakeholder analysis should be conducted for participatory natural resource management research. This is achieved by reviewing the development of stakeholder analysis in business management, development and natural resource management. The normative and instrumental theoretical basis for stakeholder analysis is discussed, and a stakeholder analysis typology is proposed. This consists of methods for: i) identifying stakeholders; ii) differentiating between and categorising stakeholders; and iii) investigating relationships between stakeholders. The range of methods that can be used to carry out each type of analysis is reviewed. These methods and approaches are then illustrated through a series of case studies funded through the Rural Economy and Land Use (RELU) programme. These case studies show the wide range of participatory and non-participatory methods that can be used, and discuss some of the challenges and limitations of existing methods for stakeholder analysis. The case studies also propose new tools and combinations of methods that can more effectively identify and categorise stakeholders and help understand their inter-relationships.

  1. District nursing workforce planning: a review of the methods.

    PubMed

    Reid, Bernie; Kane, Kay; Curran, Carol

    2008-11-01

    District nursing services in Northern Ireland face increasing demands and challenges which may be addressed by effective and efficient workforce planning and development. The aim of this paper is to critically analyse district nursing workforce planning and development methods, in an attempt to find a suitable method for Northern Ireland. A systematic analysis of the literature reveals four methods: professional judgement; population-based health needs; caseload analysis and dependency-acuity. Each method has strengths and weaknesses. Professional judgement offers a 'belt and braces' approach but lacks sensitivity to fluctuating patient numbers. Population-based health needs methods develop staffing algorithms that reflect deprivation and geographical spread, but are poorly understood by district nurses. Caseload analysis promotes equitable workloads, but poorly performing district nursing localities may continue if benchmarking processes only consider local data. Dependency-acuity methods provide a means of equalizing and prioritizing workload but are prone to district nurses overstating factors in patient dependency or understating carers' capability. In summary, a mixed-method approach is advocated to evaluate and adjust the size and mix of district nursing teams, using empirically determined patient dependency and activity-based variables based on the population's health needs.

  2. Analysis of enamel development using murine model systems: approaches and limitations

    PubMed Central

    Pugach, Megan K.; Gibson, Carolyn W.

    2014-01-01

    A primary goal of enamel research is to understand and potentially treat or prevent enamel defects related to amelogenesis imperfecta (AI). Rodents are ideal models to assist our understanding of how enamel is formed because they are easily genetically modified, and their continuously erupting incisors display all stages of enamel development and mineralization. While numerous methods have been developed to generate and analyze genetically modified rodent enamel, it is crucial to understand the limitations and challenges associated with these methods in order to draw appropriate conclusions that can be applied translationally, to AI patient care. We have highlighted methods involved in generating and analyzing rodent enamel and potential approaches to overcoming limitations of these methods: (1) generating transgenic, knockout, and knockin mouse models, and (2) analyzing rodent enamel mineral density and functional properties (structure and mechanics) of mature enamel. There is a need for a standardized workflow to analyze enamel phenotypes in rodent models so that investigators can compare data from different studies. These methods include analyses of gene and protein expression, developing enamel histology, enamel pigment, degree of mineralization, enamel structure, and mechanical properties. Standardization of these methods with regard to stage of enamel development and sample preparation is crucial, and ideally investigators can use correlative and complementary techniques with the understanding that developing mouse enamel is dynamic and complex. PMID:25278900

  3. Bayesian-based estimation of acoustic surface impedance: Finite difference frequency domain approach.

    PubMed

    Bockman, Alexander; Fackler, Cameron; Xiang, Ning

    2015-04-01

    Acoustic performance for an interior requires an accurate description of the boundary materials' surface acoustic impedance. Analytical methods may be applied to a small class of test geometries, but inverse numerical methods provide greater flexibility. The parameter estimation problem requires minimizing the discrepancy between predicted and observed acoustic field pressure. The Bayesian-network sampling approach presented here mitigates other methods' susceptibility to noise inherent in the experiment, model, and numerics. A geometry-agnostic method is developed here and its parameter estimation performance is demonstrated for an air-backed micro-perforated panel in an impedance tube. Good agreement is found with predictions from the ISO standard two-microphone impedance-tube method and a theoretical model for the material. Data by-products exclusive to a Bayesian approach are analyzed to assess the sensitivity of the method to nuisance parameters.
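
    As a rough illustration of the sampling idea only (not the paper's Bayesian-network formulation or its finite-difference frequency-domain solver), the sketch below estimates a scalar impedance-like parameter by random-walk Metropolis sampling against a stand-in forward model.

    # Minimal sketch of Bayesian parameter estimation by sampling. The forward
    # model is a hypothetical scalar function standing in for an FDFD solver.
    import numpy as np

    rng = np.random.default_rng(0)

    def forward(z):                  # stand-in forward model: impedance -> pressure
        return 1.0 / (1.0 + z)

    z_true, sigma = 0.5, 0.01
    obs = forward(z_true) + rng.normal(0.0, sigma, size=20)  # noisy "measurements"

    def log_post(z):                 # flat prior on z > 0, Gaussian likelihood
        if z <= 0:
            return -np.inf
        return -0.5 * np.sum((obs - forward(z)) ** 2) / sigma ** 2

    # Random-walk Metropolis sampling of the posterior over z.
    z, samples = 1.0, []
    for _ in range(5000):
        zp = z + rng.normal(0.0, 0.05)
        if np.log(rng.uniform()) < log_post(zp) - log_post(z):
            z = zp
        samples.append(z)
    print(np.mean(samples[1000:]))   # posterior mean, close to z_true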

  4. A Stable Whole Building Performance Method for Standard 90.1-Part II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenberg, Michael I.; Eley, Charles

    2016-06-01

    In May of 2013 we introduced a new approach for compliance with Standard 90.1 that was under development based on the Performance Rating Method of Appendix G to Standard 90.1. Since then, the approach has been finalized through Addendum BM to Standard 90.1-2013 and will be published in the 2016 edition of the Standard. In the meantime, ASHRAE has published an advance copy of Appendix G including Addendum BM and several other addenda so that software developers and energy program administrators can get a preview of what is coming in the 2016 edition of the Standard. This article is an update on Addendum BM, summarizes changes made to the original concept as introduced in May of 2013, and provides an approach for developing performance targets for code compliance and beyond-code programs.

  5. The Balanced Scorecard of acute settings: development process, definition of 20 strategic objectives and implementation.

    PubMed

    Groene, Oliver; Brandt, Elimer; Schmidt, Werner; Moeller, Johannes

    2009-08-01

    Background: Strategy development and implementation in acute care settings is often restricted by competing challenges, the pace of policy reform and the existence of parallel hierarchies. Objective: To describe a generic approach to strategy development, illustrate the use of the Balanced Scorecard as a tool to facilitate strategy implementation and demonstrate how to break down strategic goals into measurable elements. Design: A multi-method approach using three different conceptual models: Health Promoting Hospitals Standards and Strategies, the European Foundation for Quality Management (EFQM) Model and the Balanced Scorecard. A bundle of qualitative and quantitative methods was used, including in-depth interviews and standardized organization-wide surveys on organizational values, staff satisfaction and patient experience. Setting: Three acute care hospitals in four different locations belonging to a German holding group. Participants: The chief executive officer, senior medical officers, working group leaders and hospital staff. Intervention: Development and implementation of the Balanced Scorecard. Main outcome measures: Twenty strategic objectives with corresponding Balanced Scorecard measures. Results: A stepped approach from strategy development to implementation is presented, covering the identification of key themes for strategy development, the drafting of a strategy map and the development of strategic objectives and measures. Conclusions: The Balanced Scorecard, in combination with the EFQM model, is a useful tool to guide strategy development and implementation in health care organizations. As with other quality improvement and management tools not specifically developed for health care organizations, some adaptations are required to improve acceptability among professionals. The step-wise approach to strategy development and implementation presented here may support similar processes in comparable organizations.

  6. Application of high level wavefunction methods in quantum mechanics/molecular mechanics hybrid schemes.

    PubMed

    Mata, Ricardo A

    2010-05-21

    In this Perspective, several developments in the field of quantum mechanics/molecular mechanics (QM/MM) approaches are reviewed. Emphasis is placed on the use of correlated wavefunction theory and new state of the art methods for the treatment of large quantum systems. Until recently, computational chemistry approaches to large/complex chemical problems have seldom been considered as tools for quantitative predictions. However, due to the tremendous development of computational resources and new quantum chemical methods, it is nowadays possible to describe the electronic structure of biomolecules at levels of theory which a decade ago were only possible for system sizes of up to 20 atoms. These advances are here outlined in the context of QM/MM. The article concludes with a short outlook on upcoming developments and possible bottlenecks for future applications.

  7. Sampling enhancement for the quantum mechanical potential based molecular dynamics simulations: a general algorithm and its extension for free energy calculation on rugged energy surface.

    PubMed

    Li, Hongzhi; Yang, Wei

    2007-03-21

    An approach is developed in the replica exchange framework to enhance conformational sampling for quantum mechanical (QM) potential based molecular dynamics simulations. Importantly, with our enhanced sampling treatment, decent convergence of the electronic structure self-consistent-field calculation is robustly guaranteed; this is made possible in our replica exchange design by avoiding direct structure exchanges between the QM-related replicas and the activated (scaled by low scaling parameters or treated with high "effective temperatures") molecular mechanical (MM) replicas. Although the present approach represents one of the early efforts in enhanced sampling developments specifically for quantum mechanical potentials, QM-based simulations treated with the present technique can achieve sampling efficiency similar to that of MM-based simulations treated with the Hamiltonian replica exchange method (HREM). In the present paper, by combining this sampling method with one of our recent developments (the dual-topology alchemical HREM approach), we also introduce a method for sampling-enhanced QM-based free energy calculations.
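
    For orientation, the sketch below shows the generic replica exchange step on which such designs build: neighboring replicas attempt a configuration swap under a Metropolis criterion. Plain temperature replicas are used here for simplicity; the paper's scheme instead activates the MM replicas by Hamiltonian scaling and avoids direct swaps with the QM-related replicas. All numbers are illustrative.

    # Minimal sketch of the parallel-tempering exchange step (generic form,
    # not the paper's QM/MM-specific design).
    import math, random

    def try_swap(E_i, E_j, beta_i, beta_j):
        """Metropolis acceptance for exchanging the configurations of
        replicas i and j: accept with min(1, exp[(b_i - b_j)(E_i - E_j)])."""
        delta = (beta_i - beta_j) * (E_i - E_j)
        return delta >= 0 or random.random() < math.exp(delta)

    # Hypothetical replica energies (kcal/mol) and inverse temperatures.
    energies = [-120.0, -118.5, -115.0, -110.2]
    betas = [1.0 / (0.0019872 * T) for T in (300, 330, 363, 400)]

    for i in range(len(energies) - 1):          # sweep over neighboring pairs
        if try_swap(energies[i], energies[i + 1], betas[i], betas[i + 1]):
            energies[i], energies[i + 1] = energies[i + 1], energies[i]
    print(energies)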

  8. Storybridging: Four steps for constructing effective health narratives

    PubMed Central

    Boeijinga, Anniek; Hoeken, Hans; Sanders, José

    2017-01-01

    Objective: To develop a practical step-by-step approach to constructing narrative health interventions in response to the mixed results and wide diversity of narratives used in health-related narrative persuasion research. Method: Development work was guided by essential narrative characteristics as well as principles enshrined in the Health Action Process Approach. Results: The ‘storybridging’ method for constructing health narratives is described as consisting of four concrete steps: (a) identifying the stage of change, (b) identifying the key elements, (c) building the story, and (d) pre-testing the story. These steps are illustrated by means of a case study in which an effective narrative health intervention was developed for Dutch truck drivers: a high-risk, underprivileged occupational group. Conclusion: Although time and labour intensive, the Storybridging approach suggests integrating the target audience as an important stakeholder throughout the development process. Implications and recommendations are provided for health promotion targeting truck drivers specifically and for constructing narrative health interventions in general. PMID:29276232

  9. Computational Prediction of Metabolism: Sites, Products, SAR, P450 Enzyme Dynamics, and Mechanisms

    PubMed Central

    2012-01-01

    Metabolism of xenobiotics remains a central challenge for the discovery and development of drugs, cosmetics, nutritional supplements, and agrochemicals. Metabolic transformations are frequently related to the incidence of toxic effects that may result from the emergence of reactive species, the systemic accumulation of metabolites, or by induction of metabolic pathways. Experimental investigation of the metabolism of small organic molecules is particularly resource demanding; hence, computational methods are of considerable interest to complement experimental approaches. This review provides a broad overview of structure- and ligand-based computational methods for the prediction of xenobiotic metabolism. Current computational approaches to address xenobiotic metabolism are discussed from three major perspectives: (i) prediction of sites of metabolism (SOMs), (ii) elucidation of potential metabolites and their chemical structures, and (iii) prediction of direct and indirect effects of xenobiotics on metabolizing enzymes, where the focus is on the cytochrome P450 (CYP) superfamily of enzymes, the cardinal xenobiotics metabolizing enzymes. For each of these domains, a variety of approaches and their applications are systematically reviewed, including expert systems, data mining approaches, quantitative structure–activity relationships (QSARs), and machine learning-based methods, pharmacophore-based algorithms, shape-focused techniques, molecular interaction fields (MIFs), reactivity-focused techniques, protein–ligand docking, molecular dynamics (MD) simulations, and combinations of methods. Predictive metabolism is a developing area, and there is still enormous potential for improvement. However, it is clear that the combination of rapidly increasing amounts of available ligand- and structure-related experimental data (in particular, quantitative data) with novel and diverse simulation and modeling approaches is accelerating the development of effective tools for prediction of in vivo metabolism, which is reflected by the diverse and comprehensive data sources and methods for metabolism prediction reviewed here. This review attempts to survey the range and scope of computational methods applied to metabolism prediction and also to compare and contrast their applicability and performance. PMID:22339582

  10. An Efficient, Simple, and Noninvasive Procedure for Genotyping Aquatic and Nonaquatic Laboratory Animals.

    PubMed

    Okada, Morihiro; Miller, Thomas C; Roediger, Julia; Shi, Yun-Bo; Schech, Joseph Mat

    2017-09-01

    Various animal models are indispensable in biomedical research. Increasing awareness and regulations have prompted the adoption of more humane approaches in the use of laboratory animals. With the development of easier and faster methodologies to generate genetically altered animals, convenient and humane methods to genotype these animals are important for research involving such animals. Here, we report skin swabbing as a simple and noninvasive method for extracting genomic DNA from mice and frogs for genotyping. We show that this method is highly reliable and suitable for both immature and adult animals. Our approach allows a simpler and more humane means of genotyping vertebrate animals.

  11. Developing parenting programs to prevent child health risk behaviors: a practice model

    PubMed Central

    Jackson, Christine; Dickinson, Denise M.

    2009-01-01

    Research indicates that developing public health programs to modify parenting behaviors could lead to multiple beneficial health outcomes for children. Developing feasible effective parenting programs requires an approach that applies a theory-based model of parenting to a specific domain of child health and engages participant representatives in intervention development. This article describes this approach to intervention development in detail. Our presentation emphasizes three points that provide key insights into the goals and procedures of parenting program development. These are a generalized theoretical model of parenting derived from the child development literature, an established eight-step parenting intervention development process and an approach to integrating experiential learning methods into interventions for parents and children. By disseminating this framework for a systematic theory-based approach to developing parenting programs, we aim to support the program development efforts of public health researchers and practitioners who recognize the potential of parenting programs to achieve primary prevention of health risk behaviors in children. PMID:19661165

  12. Developing a competitive advantage in the market for radiology services.

    PubMed

    Kropf, R; Szafran, A J

    1988-01-01

    This article describes how managers of outpatient diagnostic radiology services can develop a competitive advantage by increasing the value of services to patients and referring physicians. A method is presented to identify changes to services that increase their value. The method requires the definition of the "value chains" of patients and referring physicians. Particular attention is paid to the use of information systems technology to suggest and implement service changes. A narrow range of health services was selected because the approach requires a detailed understanding of consumers and how they use services. The approach should, however, be examined carefully by managers seeking to develop a competitive advantage for a wide range of health services.

  13. The Water-Energy-Food Nexus: Advancing Innovative, Policy-Relevant Methods

    NASA Astrophysics Data System (ADS)

    Crootof, A.; Albrecht, T.; Scott, C. A.

    2017-12-01

    The water-energy-food (WEF) nexus is rapidly expanding in scholarly literature and policy settings as a novel way to address complex Anthropocene challenges. The nexus approach aims to identify tradeoffs and synergies of water, energy, and food systems, internalize social and environmental impacts, and guide development of cross-sectoral policies. However, a primary limitation of the nexus approach is the absence - or gaps and inconsistent use - of adequate methods to advance an innovative and policy-relevant nexus approach. This paper presents an analytical framework to identify robust nexus methods that align with nexus thinking and highlights innovative nexus methods at the frontier. The current state of nexus methods was assessed with a systematic review of 245 journal articles and book chapters. This review revealed (a) use of specific and reproducible methods for nexus assessment is uncommon - less than one-third of the reviewed studies present explicit methods; (b) nexus methods frequently fall short of capturing interactions among water, energy, and food - the very concept they purport to address; (c) assessments strongly favor quantitative approaches - 70% use primarily quantitative tools; (d) use of social science methods is limited (26%); and (e) many nexus methods are confined to disciplinary silos - only about one-quarter combine methods from diverse disciplines and less than one-fifth utilize both quantitative and qualitative approaches. Despite some pitfalls of current nexus methods, there are a host of studies that offer innovative approaches to help quantify nexus linkages and interactions among sectors, conceptualize dynamic feedbacks, and support mixed method approaches to better understand WEF systems. Applying our analytical framework to all 245 studies, we identify, and analyze herein, seventeen studies that implement innovative multi-method and cross-scalar tools to demonstrate promising advances toward improved nexus assessment. This paper finds that, to make the WEF nexus effective as a policy-relevant analytical tool, methods are needed that incorporate social and political dimensions of water, energy, and food; utilize multiple and interdisciplinary approaches; and engage stakeholders and policy-makers.

  14. Monitoring and evaluation of patient involvement in clinical practice guideline development: lessons from the Multidisciplinary Guideline for Employment and Severe Mental Illness, the Netherlands.

    PubMed

    van der Ham, Alida J; van Erp, Nicole; Broerse, Jacqueline E W

    2016-04-01

    The aim of this study was to gain better insight into the quality of patient participation in the development of clinical practice guidelines and to contribute to approaches for the monitoring and evaluation of such initiatives. In addition, we explore the potential of a dialogue-based approach for reconciling the preferences of patients and professionals in the guideline development process. The development of the Multidisciplinary Guideline for Employment and Severe Mental Illness in the Netherlands served as a case study. Methods for patient involvement in guideline development included the following: four patient representatives in the development group and advisory committee, two focus group discussions with patients, a dialogue session and eight case studies. To evaluate the quality of patient involvement, we developed a monitoring and evaluation framework including both process and outcome criteria. Data collection included observations, document analysis and semi-structured interviews (n = 26). The quality of patient involvement was enhanced by the use of different methods, the reflection of patient input in the guideline text, a supportive attitude among professionals and attention to patient involvement throughout the process. The quality was lower with respect to representing the diversity of the target group, the articulation of the patient perspective in the guideline development group, and clarity and transparency concerning methods of involvement. The monitoring and evaluation framework was useful in providing detailed insights into patient involvement in guideline development. Patient involvement was evaluated as being of good quality. The dialogue-based approach appears to be a promising method for obtaining integrated stakeholder input in a multidisciplinary setting. © 2015 John Wiley & Sons Ltd.

  15. Quantitative and qualitative approaches in the study of poverty and adolescent development: separation or integration?

    PubMed

    Leung, Janet T Y; Shek, Daniel T L

    2011-01-01

    This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows illumination of new insights. With the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness in responding to the research inquiry, a mixed methods approach could be a possibility of integrating quantitative and qualitative approaches and offers an alternative strategy to study the impact of economic disadvantage on family processes and adolescent development.

  16. Assessing sufficient capability: A new approach to economic evaluation.

    PubMed

    Mitchell, Paul Mark; Roberts, Tracy E; Barton, Pelham M; Coast, Joanna

    2015-08-01

    Amartya Sen's capability approach has been discussed widely in the health economics discipline. Although measures have been developed to assess capability in economic evaluation, there has been much less attention paid to the decision rules that might be applied alongside. Here, new methods, drawing on the multidimensional poverty and health economics literature, are developed for conducting economic evaluation within the capability approach and focusing on an objective of achieving "sufficient capability". This objective more closely reflects the concern with equity that pervades the capability approach and the method has the advantage of retaining the longitudinal aspect of estimating outcome that is associated with quality-adjusted life years (QALYs), whilst also drawing on notions of shortfall associated with assessments of poverty. Economic evaluation from this perspective is illustrated in an osteoarthritis patient group undergoing joint replacement, with capability wellbeing assessed using ICECAP-O. Recommendations for taking the sufficient capability approach forward are provided. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. The use of CORE model by metacognitive skill approach in developing characters junior high school students

    NASA Astrophysics Data System (ADS)

    Fisher, Dahlia; Yaniawati, Poppy; Kusumah, Yaya Sukjaya

    2017-08-01

    This study aims to analyze the character of students who received instruction with the CORE learning model using a metacognitive approach. The study used a mixed-method design (qualitative and quantitative) with a concurrent embedded strategy. The research was conducted on two groups: an experimental group consisting of students taught with the CORE model using a metacognitive approach, and a control group consisting of students taught by conventional learning. The subjects were seventh-grade students in one of the public junior high schools in Bandung. Based on this research, the characters developed in students through CORE model learning with a metacognitive approach are: honesty, hard work, curiosity, conscientiousness, creativity and communicativeness. Overall, it can be concluded that the CORE learning model is good for developing the characters of junior high school students.

  18. Use of statistical and neural net approaches in predicting toxicity of chemicals.

    PubMed

    Basak, S C; Grunwald, G D; Gute, B D; Balasubramanian, K; Opitz, D

    2000-01-01

    Hierarchical quantitative structure-activity relationships (H-QSAR) have been developed as a new approach in constructing models for estimating physicochemical, biomedicinal, and toxicological properties of interest. This approach uses increasingly more complex molecular descriptors in a graduated approach to model building. In this study, statistical and neural network methods have been applied to the development of H-QSAR models for estimating the acute aquatic toxicity (LC50) of 69 benzene derivatives to Pimephales promelas (fathead minnow). Topostructural, topochemical, geometrical, and quantum chemical indices were used as the four levels of the hierarchical method. It is clear from both the statistical and neural network models that topostructural indices alone cannot adequately model this set of congeneric chemicals. Not surprisingly, topochemical indices greatly increase the predictive power of both statistical and neural network models. Quantum chemical indices also add significantly to the modeling of this set of acute aquatic toxicity data.
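
    A hedged sketch of the two model families compared, on synthetic stand-in data (the study's actual descriptors and LC50 values are not reproduced here): an ordinary least-squares model versus a small feed-forward network, scored by cross-validation.

    # Minimal sketch: statistical vs. neural network QSAR models on a toy
    # descriptor matrix X (rows: chemicals) and synthetic toxicity values y.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(69, 10))                 # 69 chemicals, 10 indices (toy)
    y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(0.0, 0.1, size=69)

    for model in (LinearRegression(),
                  MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)):
        r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
        print(type(model).__name__, round(r2, 3))  # cross-validated R^2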

  19. An approach to constrained aerodynamic design with application to airfoils

    NASA Technical Reports Server (NTRS)

    Campbell, Richard L.

    1992-01-01

    An approach was developed for incorporating flow and geometric constraints into the Direct Iterative Surface Curvature (DISC) design method. In this approach, an initial target pressure distribution is developed using a set of control points. The chordwise locations and pressure levels of these points are initially estimated either from empirical relationships and observed characteristics of pressure distributions for a given class of airfoils or by fitting the points to an existing pressure distribution. These values are then automatically adjusted during the design process to satisfy the flow and geometric constraints. The flow constraints currently available are lift, wave drag, pitching moment, pressure gradient, and local pressure levels. The geometric constraint options include maximum thickness, local thickness, leading-edge radius, and a 'glove' constraint involving inner and outer bounding surfaces. This design method was also extended to include the successive constraint release (SCR) approach to constrained minimization.

  20. Verifying Hybrid Systems Modeled as Timed Automata: A Case Study

    DTIC Science & Technology

    1997-03-01

    Introduction: Researchers have proposed many innovative formal methods for developing real-time systems [9]. Such methods can give system developers and...customers greater confidence that real-time systems satisfy their requirements, especially their critical requirements. However, applying formal methods...specifying and reasoning about real-time systems that is designed to address these challenging problems. Our approach is to build formal reasoning tools

  1. Developing a Research Method for Testing New Curriculum Ideas: Report No. 1 on Using Marketing Research in a Lifelong Learning Organization.

    ERIC Educational Resources Information Center

    Haskins, Jack B.

    A survey method was developed and used to determine interest in new course topics at the Learning Institute for Elders (LIFE) at the University of Central Florida. In the absence of a known validated method for course concept testing, this approach was modeled after the "message pretesting," or formative, research used and validated in…

  2. Color preservation for tone reproduction and image enhancement

    NASA Astrophysics Data System (ADS)

    Hsin, Chengho; Lee, Zong Wei; Lee, Zheng Zhan; Shin, Shaw-Jyh

    2014-01-01

    Applications based on luminance processing often face the problem of recovering the original chrominance in the output color image. A common approach to reconstructing a color image from the luminance output is to preserve the original hue and saturation. However, this approach often produces an overly colorful image, which is undesirable. We develop a color preservation method that not only retains the ratios of the input tri-chromatic values but also adjusts the output chroma in an appropriate way. Linearizing the output luminance is the key idea in realizing this method. In addition, a lightness difference metric and a colorfulness difference metric are proposed to evaluate the performance of color preservation methods. Results show that the proposed method performs consistently better than the existing approaches.
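
    A minimal sketch of the ratio-based idea, under the commonly used formulation C_out = (C_in/L_in)^s * L_out; the Rec. 709 luminance weights and the exponent value are illustrative assumptions, not the paper's exact model.

    # Minimal sketch: rebuild color from processed luminance while damping
    # colorfulness with an exponent s. s = 1 preserves the input tri-chromatic
    # ratios exactly; s < 1 reduces the over-saturated look that exact ratio
    # preservation often produces.
    import numpy as np

    def recolor(rgb_in, L_out, s=0.7):
        L_in = rgb_in @ np.array([0.2126, 0.7152, 0.0722])   # Rec. 709 luminance
        ratio = rgb_in / np.maximum(L_in[..., None], 1e-6)   # tri-chromatic ratios
        return (ratio ** s) * L_out[..., None]

    rgb = np.array([[0.8, 0.2, 0.1]])    # a saturated reddish pixel
    L_new = np.array([0.5])              # luminance after tone processing
    print(recolor(rgb, L_new))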

  3. Supercoherent states and physical systems

    NASA Technical Reports Server (NTRS)

    Fatyga, B. W.; Kostelecky, V. Alan; Nieto, Michael Martin; Truax, D. Rodney

    1992-01-01

    A method is developed for obtaining coherent states of a system admitting a supersymmetry. These states are called supercoherent states. The presented approach is based on an extension to supergroups of the usual group-theoretic approach. The example of the supersymmetric harmonic oscillator is discussed, thereby illustrating some of the attractive features of the method. Supercoherent states of an electron moving in a constant magnetic field are also described.

  4. Developing Distinct Mathematical and Scientific Pedagogical Content Knowledge in an Early Childhood Dual-Content Methods Course: An Alternative to Integration

    ERIC Educational Resources Information Center

    Kalchman, Mindy; Kozoll, Richard H.

    2017-01-01

    Methods for teaching early childhood mathematics and science are often addressed in a single, dual-content course. Approaches to teaching this type of course include integrating the content and the pedagogy of both subjects, or keeping the subject areas distinct. In this article, the authors discuss and illustrate their approach to such a combined…

  5. Foreground Mitigation in the Epoch of Reionization

    NASA Astrophysics Data System (ADS)

    Chapman, Emma

    2018-05-01

    The EoR foregrounds can be up to three orders of magnitude brighter than the cosmological signal we wish to detect. Multiple methods have been developed to extract the cosmological signal, falling roughly into three categories: foreground removal, foreground suppression and foreground avoidance. These main approaches are briefly discussed in this review, and consideration is given to the future application of these methods in a multi-layered approach.

  6. The Use of the "Indoor-Outdoor-Indoor" Approach to Teaching Science Conservation with Concentration on Methods of Inquiry and Emphasis on Processes of Science, Grades K-3.

    ERIC Educational Resources Information Center

    Busch, Phyllis S.

    Contained are instructional materials developed by the Science Project Related to Upgrading Conservation Education. The lesson plans given are intended to demonstrate the "indoor-outdoor-indoor" approach to teaching science conservation, with concentration on methods of inquiry and emphasis on processes of science. Four subject areas are…

  7. Measuring the Return on Information Technology: A Knowledge-Based Approach for Revenue Allocation at the Process and Firm Level

    DTIC Science & Technology

    2005-07-01

    approach for measuring the return on Information Technology (IT) investments. A review of existing methods suggests the difficulty in adequately...measuring the returns of IT at various levels of analysis (e.g., firm or process level). To address this issue, this study aims to develop a method for...view (KBV), this paper proposes an analytic method for measuring the historical revenue and cost of IT investments by estimating the amount of

  8. Risk prediction model: Statistical and artificial neural network approach

    NASA Astrophysics Data System (ADS)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement and support clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand the suitable approach to, and the development and validation process of, risk prediction models. A qualitative review of the aims, methods and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was carried out. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the studies reviewed, artificial neural network approaches to developing prediction models were more accurate than statistical approaches. However, currently only limited published literature discusses which approach is more accurate for risk prediction model development.
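
    To make the comparison concrete, the sketch below validates a statistical model against a small neural network on synthetic data; the model choices and settings are illustrative, not recommendations drawn from the reviewed articles.

    # Minimal sketch: logistic regression vs. a small neural network for risk
    # prediction, validated on held-out data with AUC. Dataset is synthetic.
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    X, y = make_classification(n_samples=500, n_features=12, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    for model in (LogisticRegression(max_iter=1000),
                  MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)):
        auc = roc_auc_score(y_te, model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1])
        print(type(model).__name__, round(auc, 3))  # held-out discrimination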

  9. Land Research

    EPA Pesticide Factsheets

    EPA is working to develop methods and guidance to manage and clean up contaminated land, groundwater and nutrient pollution as well as develop innovative approaches to managing materials and waste including energy recovery.

  10. HUMAN FECAL SOURCE IDENTIFICATION: REAL-TIME QUANTITATIVE PCR METHOD STANDARDIZATION - abstract

    EPA Science Inventory

    Method standardization or the formal development of a protocol that establishes uniform performance benchmarks and practices is necessary for widespread adoption of a fecal source identification approach. Standardization of a human-associated fecal identification method has been...

  11. Developing and Validating Personas in e-Commerce: A Heuristic Approach

    NASA Astrophysics Data System (ADS)

    Thoma, Volker; Williams, Bryn

    A multi-method persona development process in a large e-commerce business is described. Personas are fictional representations of customers that describe typical user attributes to facilitate a user-centered approach in interaction design. In the current project persona attributes were derived from various data sources, such as stakeholder interviews, user tests and interviews, data mining, customer surveys, and ethnographic (direct observation, diary studies) research. The heuristic approach of using these data sources conjointly allowed for an early validation of relevant persona dimensions.

  12. [A new methodological approach for leptospira persistence studies in case of mixed leptospirosis].

    PubMed

    Samsonova, A P; Petrov, E M; Vyshivkina, N V; Anan'ina, Iu V

    2003-01-01

    A new methodological approach for studying Leptospira persistence in cases of mixed leptospirosis was developed, based on the use of PCR test systems with different taxonomic specificity for the detection and identification of leptospires. Two PCR test systems (G and B) were used in experiments on BALB/c white mice to study patterns of the development of mixed infection caused by leptospires of serovars poi (genomospecies L. borgpeterseni) and grippotyphosa (genomospecies L. kirschneri). It was concluded that this method holds good promise for studies of the symbiotic relationships of leptospires both in vivo and in vitro.

  13. New mobile methods for dietary assessment: review of image-assisted and image-based dietary assessment methods.

    PubMed

    Boushey, C J; Spoden, M; Zhu, F M; Delp, E J; Kerr, D A

    2017-08-01

    For nutrition practitioners and researchers, assessing the dietary intake of children and adults with a high level of accuracy continues to be a challenge. Developments in mobile technologies have created a role for images in the assessment of dietary intake. The objective of this review was to examine peer-reviewed published papers covering the development, evaluation and/or validation of image-assisted or image-based dietary assessment methods from December 2013 to January 2016. Images taken with handheld devices or wearable cameras have been used to assist traditional dietary assessment methods, supporting portion size estimations made by dietitians (image-assisted methods). Image-assisted approaches can supplement either dietary records or 24-h dietary recalls. In recent years, image-based approaches integrating application technology for mobile devices have been developed (image-based methods). Image-based approaches aim at capturing all eating occasions by images as the primary record of dietary intake, and therefore follow the methodology of food records. The present paper reviews several image-assisted and image-based methods and their benefits and challenges, followed by details on an image-based mobile food record. Mobile technology offers a wide range of feasible options for dietary assessment that are easier to incorporate into daily routines. The studies presented illustrate that image-assisted methods can improve the accuracy of conventional dietary assessment methods by adding eating occasion detail via pictures captured by an individual (dynamic images). All of the studies reduced underreporting with the help of images, compared with traditional assessment methods. Studies with larger sample sizes are needed to better delineate attributes with regard to age of user, degree of error and cost.

  14. Innovative spectrophotometric methods for simultaneous estimation of the novel two-drug combination: Sacubitril/Valsartan through two manipulation approaches and a comparative statistical study.

    PubMed

    Eissa, Maya S; Abou Al Alamein, Amal M

    2018-03-15

    Different innovative spectrophotometric methods were introduced for the first time for the simultaneous quantification of sacubitril/valsartan in their binary mixture and in their combined dosage form, without prior separation, through two manipulation approaches. The approaches were based either on selecting two wavelengths in the zero-order absorption spectra, namely the dual wavelength method (DWL) at 226 nm and 275 nm for valsartan, the induced dual wavelength method (IDW) at 226 nm and 254 nm for sacubitril, and advanced absorbance subtraction (AAS) based on their iso-absorptive point at 246 nm (λiso) and 261 nm (where sacubitril shows equal absorbance values at the two selected wavelengths); or on ratio spectra using their normalized spectra, namely the ratio difference spectrophotometric method (RD) at 225 nm and 264 nm for both drugs in their ratio spectra, first derivative of ratio spectra (DR1) at 232 nm for valsartan and 239 nm for sacubitril, and mean centering of ratio spectra (MCR) at 260 nm for both. Both sacubitril and valsartan showed linearity upon application of these methods in the range of 2.5-25.0 μg/mL. The developed spectrophotometric methods were successfully applied to the analysis of their combined tablet dosage form ENTRESTO™. The adopted spectrophotometric methods were also validated according to ICH guidelines. The results obtained from the proposed methods were statistically compared to a reported HPLC method using Student's t-test and F-test, and a comparative study was also conducted with one-way ANOVA, showing no statistical difference with respect to precision and accuracy. Copyright © 2017 Elsevier B.V. All rights reserved.
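
    The dual wavelength idea behind DWL can be stated compactly: choose two wavelengths at which the interfering component absorbs equally, so that the absorbance difference of the mixture depends on the analyte alone. In sketch form, with X the analyte and Y the interferent:

    \[
    \Delta A = A_{\mathrm{mix}}(\lambda_1) - A_{\mathrm{mix}}(\lambda_2)
             = \bigl[A_X(\lambda_1) - A_X(\lambda_2)\bigr]
             + \underbrace{\bigl[A_Y(\lambda_1) - A_Y(\lambda_2)\bigr]}_{=\,0 \text{ by choice of } \lambda_1,\lambda_2}
             \;\propto\; c_X .
    \]

    A calibration of \(\Delta A\) against concentration then quantifies X in the presence of Y; in the DWL configuration above, X is valsartan, with 226 nm and 275 nm presumably chosen so that sacubitril's contribution cancels.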

  15. Student Oriented Approaches in the Teaching of Thermodynamics at Universities--Developing an Effective Course Structure

    ERIC Educational Resources Information Center

    Partanen, Lauri

    2016-01-01

    The aim of this study was to apply current pedagogical research in order to develop an effective course and exercise structure for a physical chemistry thermodynamics course intended for second or third year university students of chemistry. A mixed-method approach was used to measure the impact the changes had on student learning. In its final…

  16. Thinking Outside the Box While Playing the Game: A Creative School-Based Approach to Working with Children and Adolescents

    ERIC Educational Resources Information Center

    Martinez, Angel; Lasser, Jon

    2013-01-01

    The process of creating child-developed board games in a counseling setting may promote social, emotional, and behavioral development in children. Using this creative approach, counselors can actively work with children to address referred concerns and build skills that may generalize outside of counseling sessions. A description of the method is…

  17. Towards an Interoperability Ontology for Software Development Tools

    DTIC Science & Technology

    2003-03-01

    The description of feature models was tied to the introduction of the Feature-Oriented Domain Analysis (FODA) [KANG90] approach in the late eighties...Feature-oriented domain analysis (FODA) is a domain analysis method developed at the Software...these obstacles was to construct a "pilot" ontology that is extensible. We applied the Feature-Oriented Domain Analysis approach to capture the

  18. A system approach for reducing the environmental impact of manufacturing and sustainability improvement of nano-scale manufacturing

    NASA Astrophysics Data System (ADS)

    Yuan, Yingchun

    This dissertation develops an effective and economical system approach to reduce the environmental impact of manufacturing. The system approach is developed by using a process-based holistic method for upstream analysis and source reduction of the environmental impact of manufacturing. The system approach consists of three components of a manufacturing system: technology, energy and material. It is useful for sustainable manufacturing as it establishes a clear link between manufacturing system components and overall sustainability performance, and provides a framework for environmental impact reductions. In this dissertation, the system approach is applied to the environmental impact reduction of a semiconductor nano-scale manufacturing system, with three case scenarios analyzed in depth: manufacturing process improvement, clean energy supply, and toxic chemical material selection. The analysis of manufacturing process improvement focuses on atomic layer deposition (ALD) of Al2O3 dielectric gates on semiconductor microelectronic devices. The sustainability performance and scale-up impact of the ALD technology, in terms of environmental emissions, energy consumption, nano-waste generation and manufacturing productivity, are systematically investigated, and ways to improve the sustainability of the ALD technology are developed. The clean energy supply is studied using solar photovoltaic, wind, and fuel cell systems for electricity generation. Environmental savings from each clean energy supply over grid power are quantitatively analyzed, and the costs of greenhouse gas reductions for each clean energy supply are comparatively studied. For toxic chemical material selection, an innovative schematic method is developed as a visual decision tool for characterizing and benchmarking the human health impact of toxic chemicals, with a case study conducted on six chemicals commonly used as solvents in semiconductor manufacturing. The reliability of the schematic method is validated by comparing its benchmark results on 104 chemicals with those from the conventional Human Toxicity Potential (HTP) method. This dissertation concludes with discussions on the environmental impact assessment of nanotechnologies and the sustainability management of nano-particles. As nano-manufacturing is emerging for wide industrial applications, improvement and expansion of the system approach would be valuable for use in the environmental management of nano-manufacturing and in the risk control of nano-particles in the interests of public health and the environment.

  19. Housing decision making methods for initiation development phase process

    NASA Astrophysics Data System (ADS)

    Zainal, Rozlin; Kasim, Narimah; Sarpin, Norliana; Wee, Seow Ta; Shamsudin, Zarina

    2017-10-01

    Late delivery and 'sick' housing project problems have been attributed to poor decision making. These problems stem from housing developers preferring to create their own approaches based on their experience and expertise, taking the simplest route of applying only readily obtainable standards and rules in decision making. This paper seeks to identify the decision making methods used for housing development at the initiation phase in Malaysia. The research employed the Delphi method, using a questionnaire survey of 50 developers as the sample in the primary data collection stage. However, only 34 developers contributed to the second stage of the information gathering process, and only 12 developers remained for the final data collection process. Findings affirm that Malaysian developers prefer to make their investment decisions based on simple interpolation of historical data, using simple statistical or mathematical techniques to produce the required reports. They appeared to skip several important decision-making functions at the primary development stage. These shortcomings were mainly due to time and financial constraints and the lack of statistical or mathematical expertise among the professional and management groups in developer organisations.

  20. A systems engineering perspective on the human-centered design of health information systems.

    PubMed

    Samaras, George M; Horst, Richard L

    2005-02-01

    The discipline of systems engineering has, over the past five decades, used a structured systematic approach to managing the "cradle to grave" development of products and processes. While elements of this approach are typically used to guide the development of information systems that instantiate a significant user interface, it appears to be rare for the entire process to be implemented. In fact, a number of authors have put forth development lifecycle models that are subsets of the classical systems engineering method, but fail to include steps such as incremental hazard analysis and post-deployment corrective and preventative actions. Given that most health information systems have safety implications, we argue that the design and development of such systems would benefit from implementing this systems engineering approach in full. Particularly with regard to bringing a human-centered perspective to the formulation of system requirements and the configuration of effective user interfaces, this classical systems engineering method provides an excellent framework for incorporating human factors (ergonomics) knowledge and integrating ergonomists into the interdisciplinary development of health information systems.

  1. Mobile Applications for Patient-centered Care Coordination: A Review of Human Factors Methods Applied to their Design, Development, and Evaluation

    PubMed Central

    Westbrook, J. I.

    2015-01-01

    Objectives: To examine if human factors methods were applied in the design, development, and evaluation of mobile applications developed to facilitate aspects of patient-centered care coordination. Methods: We searched MEDLINE and EMBASE (2013-2014) for studies describing the design or the evaluation of a mobile health application that aimed to support patients’ active involvement in the coordination of their care. Results: 34 papers met the inclusion criteria. Applications ranged from tools that supported self-management of specific conditions (e.g. asthma) to tools that provided coaching or education. Twelve of the 15 papers describing the design or development of an app reported the use of a human factors approach. The most frequently used methods were interviews and surveys, which often included an exploration of participants’ current use of information technology. Sixteen papers described the evaluation of a patient application in practice. All of them adopted a human factors approach, typically an examination of the use of app features and/or surveys or interviews which enquired about patients’ views of the effects of using the app on their behaviors (e.g. medication adherence), knowledge, and relationships with healthcare providers. No study in our review assessed the impact of mobile applications on health outcomes. Conclusion: The potential of mobile health applications to assist patients to more actively engage in the management of their care has resulted in a large number of applications being developed. Our review showed that human factors approaches are nearly always adopted to some extent in the design, development, and evaluation of mobile applications. PMID:26293851

  2. Implementing Project Approach in Hong Kong. Preschool.

    ERIC Educational Resources Information Center

    Ho, Rose

    The primary objective of this action research was to shift the teaching method used by preschool teachers in Hong Kong from a teacher-directed mode by training them to use the project approach. The secondary objective was to measure children's achievement while using the project approach, focusing on their language ability, social development, and…

  3. Effectiveness of Social Media for Communicating Health Messages in Ghana

    ERIC Educational Resources Information Center

    Bannor, Richard; Asare, Anthony Kwame; Bawole, Justice Nyigmah

    2017-01-01

    Purpose: The purpose of this paper is to develop an in-depth understanding of the effectiveness, evolution and dynamism of the current health communication media used in Ghana. Design/methodology/approach: This paper uses a multi-method approach which utilizes a combination of qualitative and quantitative approaches. In-depth interviews are…

  4. Teaching Mathematical Induction: An Alternative Approach.

    ERIC Educational Resources Information Center

    Allen, Lucas G.

    2001-01-01

    Describes experience using a new approach to teaching induction that was developed by the Mathematical Methods in High School Project. The basic idea behind the new approach is to use induction to prove that two formulas, one in recursive form and the other in a closed or explicit form, will always agree for whole numbers. (KHR)

  5. An Effective Palmprint Recognition Approach for Visible and Multispectral Sensor Images.

    PubMed

    Gumaei, Abdu; Sammouda, Rachid; Al-Salman, Abdul Malik; Alsanad, Ahmed

    2018-05-15

    Among several palmprint feature extraction methods, the HOG-based method is attractive and performs well against changes in illumination and shadowing of palmprint images. However, it still lacks the robustness to extract palmprint features at different rotation angles. To solve this problem, this paper presents a hybrid feature extraction method, named HOG-SGF, that combines the histogram of oriented gradients (HOG) with a steerable Gaussian filter (SGF) to develop an effective palmprint recognition approach. The approach starts by processing all palmprint images with David Zhang's method to segment only the regions of interest. Next, we extracted palmprint features based on the hybrid HOG-SGF feature extraction method. Then, an optimized auto-encoder (AE) was utilized to reduce the dimensionality of the extracted features. Finally, a fast and robust regularized extreme learning machine (RELM) was applied for the classification task. In the evaluation phase of the proposed approach, a number of experiments were conducted on three publicly available palmprint databases, namely MS-PolyU of multispectral palmprint images and CASIA and Tongji of contactless palmprint images. Experimentally, the results reveal that the proposed approach outperforms the existing state-of-the-art approaches even when a small number of training samples are used.
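
    The pipeline described above begins with HOG descriptors. Below is a minimal sketch of that first stage, assuming scikit-image is available; the steerable Gaussian filter, auto-encoder, and RELM stages of the paper are not reproduced, and the cell and block sizes are illustrative.

    ```python
    import numpy as np
    from skimage.feature import hog

    def palmprint_hog(roi: np.ndarray) -> np.ndarray:
        """Compute a HOG descriptor for a segmented palmprint region of interest."""
        return hog(
            roi,
            orientations=9,           # gradient-orientation bins
            pixels_per_cell=(16, 16),
            cells_per_block=(2, 2),
            block_norm="L2-Hys",      # standard HOG block normalization
        )

    roi = np.random.rand(128, 128)    # stand-in for a segmented palmprint ROI
    print(palmprint_hog(roi).shape)   # (1764,) for these settings
    ```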

  6. Tracing Technological Development Trajectories: A Genetic Knowledge Persistence-Based Main Path Approach.

    PubMed

    Park, Hyunseok; Magee, Christopher L

    2017-01-01

    The aim of this paper is to propose a new method to identify main paths in a technological domain using patent citations. Previous approaches for using main path analysis have greatly improved our understanding of actual technological trajectories but nonetheless have some limitations. They have a high potential to miss some dominant patents from the identified main paths; moreover, the high network complexity of their main paths makes qualitative tracing of trajectories problematic. The proposed method searches backward and forward paths from the high-persistence patents which are identified based on a standard genetic knowledge persistence algorithm. We tested the new method by applying it to the desalination and the solar photovoltaic domains and compared the results to output from the same domains using a prior method. The empirical results show that the proposed method can dramatically reduce network complexity without missing any dominantly important patents. The main paths identified by our approach for two test cases are almost 10x less complex than the main paths identified by the existing approach. The proposed approach identifies all dominantly important patents on the main paths, but the main paths identified by the existing approach miss about 20% of dominantly important patents.
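
    To make the path-tracing idea concrete, here is a minimal sketch (not the authors' algorithm) of greedily following the highest-persistence neighbors forward and backward from a high-persistence patent in a citation network; networkx and the precomputed persistence scores are assumptions.

    ```python
    import networkx as nx

    def trace(G: nx.DiGraph, seed, persistence, forward=True):
        """Greedily follow the highest-persistence neighbor from `seed`."""
        path, node = [seed], seed
        while True:
            nbrs = list(G.successors(node) if forward else G.predecessors(node))
            if not nbrs:
                return path
            node = max(nbrs, key=persistence.get)
            path.append(node)

    # Toy citation graph: an edge u -> v means patent v builds on patent u.
    G = nx.DiGraph([("A", "B"), ("B", "C"), ("B", "D"), ("C", "E")])
    persistence = {"A": 0.2, "B": 0.9, "C": 0.7, "D": 0.1, "E": 0.5}
    seed = max(persistence, key=persistence.get)     # highest-persistence patent
    main_path = trace(G, seed, persistence, False)[::-1][:-1] + trace(G, seed, persistence)
    print(main_path)                                 # ['A', 'B', 'C', 'E']
    ```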

  7. Tracing Technological Development Trajectories: A Genetic Knowledge Persistence-Based Main Path Approach

    PubMed Central

    2017-01-01

    The aim of this paper is to propose a new method to identify main paths in a technological domain using patent citations. Previous approaches for using main path analysis have greatly improved our understanding of actual technological trajectories but nonetheless have some limitations. They have a high potential to miss some dominant patents from the identified main paths; moreover, the high network complexity of their main paths makes qualitative tracing of trajectories problematic. The proposed method searches backward and forward paths from the high-persistence patents which are identified based on a standard genetic knowledge persistence algorithm. We tested the new method by applying it to the desalination and the solar photovoltaic domains and compared the results to output from the same domains using a prior method. The empirical results show that the proposed method can dramatically reduce network complexity without missing any dominantly important patents. The main paths identified by our approach for two test cases are almost 10x less complex than the main paths identified by the existing approach. The proposed approach identifies all dominantly important patents on the main paths, but the main paths identified by the existing approach miss about 20% of dominantly important patents. PMID:28135304

  8. Analytical Quality by Design Approach in RP-HPLC Method Development for the Assay of Etofenamate in Dosage Forms

    PubMed Central

    Peraman, R.; Bhadraya, K.; Reddy, Y. Padmanabha; Reddy, C. Surayaprakash; Lokesh, T.

    2015-01-01

    In view of current regulatory requirements for analytical method development, a reversed-phase high performance liquid chromatographic method for the routine analysis of etofenamate in dosage forms was optimized using an analytical quality by design approach. Unlike the routine approach, the present study began with an understanding of the quality target product profile, the analytical target profile, and a risk assessment of the method variables that affect the method response. A liquid chromatography system equipped with a C18 column (250×4.6 mm, 5 μ), a binary pump, and a photodiode array detector was used in this work. The experiments were planned with a central composite design, which saves time, reagents, and other resources. Sigma Tech software was used to plan and analyse the experimental observations and to obtain a quadratic process model, which was then used to predict retention time. The retention times predicted from the contour diagrams were verified experimentally and agreed with the observed data. The optimized method used a flow rate of 1.2 ml/min with a mobile phase of methanol and 0.2% triethylamine in water (85:15, % v/v), pH adjusted to 6.5. The method was validated and verified for the targeted method performance, robustness, and system suitability during method transfer. PMID:26997704
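
    For illustration, the quadratic process model at the heart of this design-of-experiments workflow can be sketched in a few lines: fit a full second-order model to central-composite-design runs and predict the retention time at a candidate operating point. The factor levels and responses below are invented, not the paper's data.

    ```python
    import numpy as np

    # (flow rate in ml/min, mobile-phase pH) for each hypothetical CCD run
    X = np.array([[1.0, 6.0], [1.4, 6.0], [1.0, 7.0], [1.4, 7.0],
                  [0.9, 6.5], [1.5, 6.5], [1.2, 5.8], [1.2, 7.2], [1.2, 6.5]])
    y = np.array([5.8, 4.6, 6.1, 4.9, 6.5, 4.3, 5.6, 5.9, 5.2])  # retention (min)

    def quad(X):
        f, p = X[:, 0], X[:, 1]
        # full quadratic model in two factors: 1, f, p, f*p, f^2, p^2
        return np.column_stack([np.ones_like(f), f, p, f * p, f ** 2, p ** 2])

    coef, *_ = np.linalg.lstsq(quad(X), y, rcond=None)
    pred = quad(np.array([[1.2, 6.5]])) @ coef
    print(f"predicted retention time at 1.2 ml/min, pH 6.5: {pred[0]:.2f} min")
    ```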

  9. Hybrid method for moving interface problems with application to the Hele-Shaw flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, T.Y.; Li, Zhilin; Osher, S.

    In this paper, a hybrid approach which combines the immersed interface method with the level set approach is presented. The fast version of the immersed interface method is used to solve differential equations whose solutions and derivatives may be discontinuous across the interfaces, due to discontinuous coefficients and/or singular sources along the interfaces. The moving interfaces are then updated using the newly developed fast level set formulation, which involves computation only inside small tubes containing the interfaces. This method combines the advantages of the two approaches and gives a second-order Eulerian discretization for interface problems. Several key steps in the implementation are addressed in detail. The new approach is then applied to Hele-Shaw flow, an unstable flow involving two fluids with very different viscosities. 40 refs., 10 figs., 3 tabs.
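
    As a toy illustration of the interface-capturing half of this hybrid, the sketch below performs one explicit upwind update of a level set function on a Cartesian grid (periodic boundaries via np.roll, for brevity). It shows the basic mechanics only, not the paper's fast tube-based formulation or the immersed interface solver.

    ```python
    import numpy as np

    n, h, dt = 64, 1.0 / 64, 2e-3
    x = np.linspace(0, 1, n)
    X, Y = np.meshgrid(x, x, indexing="ij")
    phi = np.sqrt((X - 0.5) ** 2 + (Y - 0.5) ** 2) - 0.25   # zero level set: circle
    u, v = np.ones_like(phi), np.zeros_like(phi)            # uniform rightward flow

    # First-order upwind differences for the advection term u*phi_x + v*phi_y
    dxm = (phi - np.roll(phi, 1, axis=0)) / h
    dxp = (np.roll(phi, -1, axis=0) - phi) / h
    dym = (phi - np.roll(phi, 1, axis=1)) / h
    dyp = (np.roll(phi, -1, axis=1) - phi) / h
    phi -= dt * (u * np.where(u > 0, dxm, dxp) + v * np.where(v > 0, dym, dyp))
    print("cells near the interface:", int((np.abs(phi) < h).sum()))
    ```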

  10. From Ethnography to Items: A Mixed Methods Approach to Developing a Survey to Examine Graduate Engineering Student Retention

    ERIC Educational Resources Information Center

    Crede, Erin; Borrego, Maura

    2013-01-01

    As part of a sequential exploratory mixed methods study, 9 months of ethnographically guided observations and interviews were used to develop a survey examining graduate engineering student retention. Findings from the ethnographic fieldwork yielded several themes, including international diversity, research group organization and climate,…

  11. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis

    USDA-ARS?s Scientific Manuscript database

    Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...

  12. Dynamic Assessment: One Approach and Some Initial Data. Technical Report No. 361.

    ERIC Educational Resources Information Center

    Campione, Joseph C.; Brown, Ann L.

    In an effort to validate dynamic assessment methods influenced by Vygotsky's (1978) definition of zones of proximal development (an indicator of readiness), three sets of experiments addressed two goals: the development of diagnostic assessment methods and the use of diagnostic results to guide the design of instructional programs. The first two…

  13. A Four-Stage Method for Developing Early Interventions for Alcohol among Aboriginal Adolescents

    ERIC Educational Resources Information Center

    Mushquash, Christopher J.; Comeau, M. Nancy; McLeod, Brian D.; Stewart, Sherry H.

    2010-01-01

    This paper details a four-stage methodology for developing early alcohol interventions for at-risk Aboriginal youth. Stage 1 was an integrative approach to Aboriginal education that upholds Aboriginal traditional wisdom supporting respectful relationships to the Creator, to the land and to each other. Stage 2 used quantitative methods to…

  14. [Theatre systems as a basis for developing medical and rehabilitation methods in psychiatry].

    PubMed

    Stroganov, A E

    2004-01-01

    On the basis of systems that exist and are widely used in theatrical practice, a new direction in medical and rehabilitation psychiatry, namely transdramatherapy, was developed. The approach is illustrated by an original psychotherapeutic method, epos therapy, directed at the treatment of neurotic disorders, which has already been developed and tested in practice.

  15. Aerodynamic design using numerical optimization

    NASA Technical Reports Server (NTRS)

    Murman, E. M.; Chapman, G. T.

    1983-01-01

    The procedure of using numerical optimization methods coupled with computational fluid dynamics (CFD) codes for the development of an aerodynamic design is examined. Several approaches that replace wind tunnel tests, develop pressure distributions and derive designs, or fulfill preset design criteria are presented. The method of Aerodynamic Design by Numerical Optimization (ADNO) is described and illustrated with examples.
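
    The coupling described above reduces, in skeleton form, to wrapping a flow analysis in an optimizer. The sketch below uses a cheap analytic stand-in for the CFD code; in a real ADNO-style loop, evaluate_drag would run a flow solver on the candidate geometry. The design variables and objective are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def evaluate_drag(design: np.ndarray) -> float:
        """Surrogate objective: drag as a smooth function of two shape variables."""
        camber, thickness = design
        return (camber - 0.04) ** 2 + 5.0 * (thickness - 0.12) ** 2 + 0.01

    result = minimize(evaluate_drag, x0=np.array([0.0, 0.2]), method="Nelder-Mead")
    print("optimal design:", result.x, " drag:", result.fun)
    ```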

  16. Investigating habitat value to inform contaminant remediation options: approach

    Treesearch

    Rebecca A. Efroymson; Mark J. Peterson; Christopher J. Welsh; Daniel L. Druckenbrod; Michael G. Ryon; John G. Smith; William W. Hargrove; Neil R. Giffen; W. Kelly Roy; Harry D. Quarles

    2008-01-01

    Habitat valuation methods are most often developed and used to prioritize candidate lands for conservation. In this study the intent of habitat valuation was to inform the decision-making process for remediation of chemical contaminants on specific lands or surface water bodies. Methods were developed to summarize dimensions of habitat value for six representative...

  17. An Integrated Approach Linking Process to Structural Modeling With Microstructural Characterization for Injection-Molded Long-Fiber Thermoplastics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Bapanapalli, Satish K.; Smith, Mark T.

    2008-09-01

    The objective of our work is to enable the optimum design of lightweight automotive structural components using injection-molded long fiber thermoplastics (LFTs). To this end, an integrated approach that links process modeling to structural analysis with experimental microstructural characterization and validation is developed. First, process models for LFTs are developed and implemented into processing codes (e.g. ORIENT, Moldflow) to predict the microstructure of the as-formed composite (i.e. fiber length and orientation distributions). In parallel, characterization and testing methods are developed to obtain necessary microstructural data to validate process modeling predictions. Second, the predicted LFT composite microstructure is imported into a structural finite element analysis by ABAQUS to determine the response of the as-formed composite to given boundary conditions. At this stage, constitutive models accounting for the composite microstructure are developed to predict various types of behaviors (i.e. thermoelastic, viscoelastic, elastic-plastic, damage, fatigue, and impact) of LFTs. Experimental methods are also developed to determine material parameters and to validate constitutive models. Such a process-linked-structural modeling approach allows an LFT composite structure to be designed with confidence through numerical simulations. Some recent results of our collaborative research will be illustrated to show the usefulness and applications of this integrated approach.

  18. Design studies of Laminar Flow Control (LFC) wing concepts using superplastics forming and diffusion bonding (SPF/DB)

    NASA Technical Reports Server (NTRS)

    Wilson, V. E.

    1980-01-01

    Alternate concepts and design approaches were developed for suction panels and techniques were defined for integrating these panel designs into a complete LFC 200R wing. The design concepts and approaches were analyzed to assure that they would meet the strength, stability, and internal volume requirements. Cost and weight comparisons of the concepts were also made. Problems of integrating the concepts into a complete aircraft system were addressed. Methods for making chordwise and spanwise splices, fuel-tight joints, and internal duct installations were developed. Manufacturing problems such as slot alignment, tapered slot spacing, production methods, and repair techniques were addressed. An assessment of the program was used to develop recommendations for additional research in the development of SPF/DB for LFC structure.

  19. Brief summary of the evolution of high-temperature creep-fatigue life prediction models for crack initiation

    NASA Technical Reports Server (NTRS)

    Halford, Gary R.

    1993-01-01

    The evolution of high-temperature, creep-fatigue, life-prediction methods used for cyclic crack initiation is traced from inception in the late 1940's. The methods reviewed are material models as opposed to structural life prediction models. Material life models are used by both structural durability analysts and by material scientists. The latter use micromechanistic models as guidance to improve a material's crack initiation resistance. Nearly one hundred approaches and their variations have been proposed to date. This proliferation poses a problem in deciding which method is most appropriate for a given application. Approaches were identified as being combinations of thirteen different classifications. This review is intended to aid both developers and users of high-temperature fatigue life prediction methods by providing a background from which choices can be made. The need for high-temperature, fatigue-life prediction methods followed immediately on the heels of the development of large, costly, high-technology industrial and aerospace equipment immediately following the second world war. Major advances were made in the design and manufacture of high-temperature, high-pressure boilers and steam turbines, nuclear reactors, high-temperature forming dies, high-performance poppet valves, aeronautical gas turbine engines, reusable rocket engines, etc. These advances could no longer be accomplished simply by trial and error using the 'build-em and bust-em' approach. Development lead times were too great and costs too prohibitive to retain such an approach. Analytic assessments of anticipated performance, cost, and durability were introduced to cut costs and shorten lead times. The analytic tools were quite primitive at first and out of necessity evolved in parallel with hardware development. After forty years more descriptive, more accurate, and more efficient analytic tools are being developed. These include thermal-structural finite element and boundary element analyses, advanced constitutive stress-strain-temperature-time relations, and creep-fatigue-environmental models for crack initiation and propagation. The high-temperature durability methods that have evolved for calculating high-temperature fatigue crack initiation lives of structural engineering materials are addressed. Only a few of the methods were refined to the point of being directly useable in design. Recently, two of the methods were transcribed into computer software for use with personal computers.

  20. Brief summary of the evolution of high-temperature creep-fatigue life prediction models for crack initiation

    NASA Astrophysics Data System (ADS)

    Halford, Gary R.

    1993-10-01

    The evolution of high-temperature, creep-fatigue, life-prediction methods used for cyclic crack initiation is traced from inception in the late 1940's. The methods reviewed are material models as opposed to structural life prediction models. Material life models are used by both structural durability analysts and by material scientists. The latter use micromechanistic models as guidance to improve a material's crack initiation resistance. Nearly one hundred approaches and their variations have been proposed to date. This proliferation poses a problem in deciding which method is most appropriate for a given application. Approaches were identified as being combinations of thirteen different classifications. This review is intended to aid both developers and users of high-temperature fatigue life prediction methods by providing a background from which choices can be made. The need for high-temperature, fatigue-life prediction methods followed immediately on the heels of the development of large, costly, high-technology industrial and aerospace equipment immediately following the second world war. Major advances were made in the design and manufacture of high-temperature, high-pressure boilers and steam turbines, nuclear reactors, high-temperature forming dies, high-performance poppet valves, aeronautical gas turbine engines, reusable rocket engines, etc. These advances could no longer be accomplished simply by trial and error using the 'build-em and bust-em' approach. Development lead times were too great and costs too prohibitive to retain such an approach. Analytic assessments of anticipated performance, cost, and durability were introduced to cut costs and shorten lead times. The analytic tools were quite primitive at first and out of necessity evolved in parallel with hardware development. After forty years more descriptive, more accurate, and more efficient analytic tools are being developed. These include thermal-structural finite element and boundary element analyses, advanced constitutive stress-strain-temperature-time relations, and creep-fatigue-environmental models for crack initiation and propagation. The high-temperature durability methods that have evolved for calculating high-temperature fatigue crack initiation lives of structural engineering materials are addressed. Only a few of the methods were refined to the point of being directly useable in design.

  1. Contemporary screening approaches to reaction discovery and development.

    PubMed

    Collins, Karl D; Gensch, Tobias; Glorius, Frank

    2014-10-01

    New organic reactivity has often been discovered by happenstance. Several recent research efforts have attempted to leverage this to discover new reactions. In this Review, we attempt to unify reported approaches to reaction discovery on the basis of the practical and strategic principles applied. We concentrate on approaches to reaction discovery as opposed to reaction development, though conceptually groundbreaking approaches to identifying efficient catalyst systems are also considered. Finally, we provide a critical overview of the utility and application of the reported methods from the perspective of a synthetic chemist, and consider the future of high-throughput screening in reaction discovery.

  2. Specification and Preliminary Validation of IAT (Integrated Analysis Techniques) Methods: Executive Summary.

    DTIC Science & Technology

    1985-03-01

    conceptual framework, and preliminary validation of IAT concepts. Planned work for FY85, including more extensive validation, is also described. The approach: (1) identify needs and requirements for IAT; (2) develop the IAT conceptual framework; (3) validate IAT methods; (4) develop applications materials.

  3. A probability-based approach for assessment of roadway safety hardware.

    DOT National Transportation Integrated Search

    2017-03-14

    This report presents a general probability-based approach for assessment of roadway safety hardware (RSH). It was achieved using a reliability : analysis method and computational techniques. With the development of high-fidelity finite element (FE) m...

  4. A Person-Oriented Approach: Methods for Today and Methods for Tomorrow

    ERIC Educational Resources Information Center

    Bergman, Lars R.; El-Khouri, Bassam M.

    2003-01-01

    Methodological implications of a person-oriented, holistic-interactionistic perspective in research on individual development are outlined, desirable properties of a mathematical model of a phenomenon are discussed, and selected methods for carrying out person-oriented research are briefly overviewed. These methods are: (1) the classificatory…

  5. GSimp: A Gibbs sampler based left-censored missing value imputation approach for metabolomics studies

    PubMed Central

    Jia, Erik; Chen, Tianlu

    2018-01-01

    Left-censored missing values commonly exist in targeted metabolomics datasets and can be considered missing not at random (MNAR). Improper data processing procedures for missing values will cause adverse impacts on subsequent statistical analyses. However, few imputation methods have been developed and applied to the situation of MNAR in the field of metabolomics. Thus, a practical left-censored missing value imputation method is urgently needed. We developed an iterative Gibbs sampler based left-censored missing value imputation approach (GSimp). We compared GSimp with three other imputation methods on two real-world targeted metabolomics datasets and one simulation dataset using our imputation evaluation pipeline. The results show that GSimp outperforms the other imputation methods in terms of imputation accuracy, observation distribution, univariate and multivariate analyses, and statistical sensitivity. Additionally, a parallel version of GSimp was developed for dealing with large-scale metabolomics datasets. The R code for GSimp, the evaluation pipeline, a tutorial, and the real-world and simulated targeted metabolomics datasets are available at: https://github.com/WandeRum/GSimp. PMID:29385130
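
    The core idea GSimp iterates on can be illustrated in a few lines: treat a censored entry as a draw from the variable's distribution truncated above at the limit of detection. The sketch below (Python rather than the authors' R, with crude moment estimates) shows one such draw; the real package embeds this in an iterative Gibbs scheme.

    ```python
    import numpy as np
    from scipy.stats import truncnorm

    rng = np.random.default_rng(0)
    x = rng.normal(10.0, 2.0, 200)       # hypothetical metabolite abundances
    lod = 8.0                            # limit of detection
    censored = x < lod                   # entries reported as missing (MNAR)

    mu, sd = x[~censored].mean(), x[~censored].std(ddof=1)  # crude estimates
    b = (lod - mu) / sd                                     # upper truncation bound
    draws = truncnorm.rvs(-np.inf, b, loc=mu, scale=sd,
                          size=int(censored.sum()), random_state=0)
    x_imp = x.copy()
    x_imp[censored] = draws              # impute values below the detection limit
    print(f"imputed {censored.sum()} values; all below LOD: {bool((draws < lod).all())}")
    ```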

  6. Revealing Risks in Adaptation Planning: expanding Uncertainty Treatment and dealing with Large Projection Ensembles during Planning Scenario development

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Wood, A.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Ikeda, K.; Pruitt, T.; Arnold, J. R.; Rajagopalan, B.

    2015-12-01

    Adaptation planning assessments often rely on single methods for climate projection downscaling and hydrologic analysis, do not reveal uncertainties from associated method choices, and thus likely produce overly confident decision-support information. Recent work by the authors has highlighted this issue by identifying strengths and weaknesses of widely applied methods for downscaling climate projections and assessing hydrologic impacts. This work has shown that many of the methodological choices made can alter the magnitude, and even the sign of the climate change signal. Such results motivate consideration of both sources of method uncertainty within an impacts assessment. Consequently, the authors have pursued development of improved downscaling techniques spanning a range of method classes (quasi-dynamical and circulation-based statistical methods) and developed approaches to better account for hydrologic analysis uncertainty (multi-model; regional parameter estimation under forcing uncertainty). This presentation summarizes progress in the development of these methods, as well as implications of pursuing these developments. First, having access to these methods creates an opportunity to better reveal impacts uncertainty through multi-method ensembles, expanding on present-practice ensembles which are often based only on emissions scenarios and GCM choices. Second, such expansion of uncertainty treatment combined with an ever-expanding wealth of global climate projection information creates a challenge of how to use such a large ensemble for local adaptation planning. To address this challenge, the authors are evaluating methods for ensemble selection (considering the principles of fidelity, diversity and sensitivity) that is compatible with present-practice approaches for abstracting change scenarios from any "ensemble of opportunity". Early examples from this development will also be presented.

  7. Validation of a Three-Dimensional Method for Counting and Sizing Podocytes in Whole Glomeruli

    PubMed Central

    van der Wolde, James W.; Schulze, Keith E.; Short, Kieran M.; Wong, Milagros N.; Bensley, Jonathan G.; Cullen-McEwen, Luise A.; Caruana, Georgina; Hokke, Stacey N.; Li, Jinhua; Firth, Stephen D.; Harper, Ian S.; Nikolic-Paterson, David J.; Bertram, John F.

    2016-01-01

    Podocyte depletion is sufficient for the development of numerous glomerular diseases and can be absolute (loss of podocytes) or relative (reduced number of podocytes per volume of glomerulus). Commonly used methods to quantify podocyte depletion introduce bias, whereas gold standard stereologic methodologies are time consuming and impractical. We developed a novel approach for assessing podocyte depletion in whole glomeruli that combines immunofluorescence, optical clearing, confocal microscopy, and three-dimensional analysis. We validated this method in a transgenic mouse model of selective podocyte depletion, in which we determined dose-dependent alterations in several quantitative indices of podocyte depletion. This new approach provides a quantitative tool for the comprehensive and time-efficient analysis of podocyte depletion in whole glomeruli. PMID:26975438
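
    The 3D-analysis step such a pipeline ends with can be sketched with standard tools: label connected components in a binarized confocal stack and report their number and sizes. The synthetic spheres below stand in for stained podocyte nuclei; the paper's segmentation is of course more involved.

    ```python
    import numpy as np
    from scipy import ndimage

    stack = np.zeros((40, 64, 64))                 # synthetic confocal z-stack
    zz, yy, xx = np.indices(stack.shape)
    for cz, cy, cx in [(10, 20, 20), (25, 40, 30), (30, 15, 50)]:  # three "nuclei"
        stack[(zz - cz) ** 2 + (yy - cy) ** 2 + (xx - cx) ** 2 < 25] = 1.0

    binary = stack > 0.5
    labels, n = ndimage.label(binary)              # 3D connected components
    sizes = ndimage.sum_labels(binary, labels, index=range(1, n + 1))
    print(f"{n} objects; mean size {sizes.mean():.0f} voxels")
    ```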

  8. Viral Infection at High Magnification: 3D Electron Microscopy Methods to Analyze the Architecture of Infected Cells

    PubMed Central

    Romero-Brey, Inés; Bartenschlager, Ralf

    2015-01-01

    As obligate intracellular parasites, viruses need to hijack their cellular hosts and reprogram their machineries in order to replicate their genomes and produce new virions. For the direct visualization of the different steps of a viral life cycle (attachment, entry, replication, assembly and egress) electron microscopy (EM) methods are extremely helpful. While conventional EM has given important information about virus-host cell interactions, the development of three-dimensional EM (3D-EM) approaches provides unprecedented insights into how viruses remodel the intracellular architecture of the host cell. During the last years several 3D-EM methods have been developed. Here we will provide a description of the main approaches and examples of innovative applications. PMID:26633469

  9. Viral Infection at High Magnification: 3D Electron Microscopy Methods to Analyze the Architecture of Infected Cells.

    PubMed

    Romero-Brey, Inés; Bartenschlager, Ralf

    2015-12-03

    As obligate intracellular parasites, viruses need to hijack their cellular hosts and reprogram their machineries in order to replicate their genomes and produce new virions. For the direct visualization of the different steps of a viral life cycle (attachment, entry, replication, assembly and egress) electron microscopy (EM) methods are extremely helpful. While conventional EM has given important information about virus-host cell interactions, the development of three-dimensional EM (3D-EM) approaches provides unprecedented insights into how viruses remodel the intracellular architecture of the host cell. During the last years several 3D-EM methods have been developed. Here we will provide a description of the main approaches and examples of innovative applications.

  10. An Old Story in the Parallel Synthesis World: An Approach to Hydantoin Libraries.

    PubMed

    Bogolubsky, Andrey V; Moroz, Yurii S; Savych, Olena; Pipko, Sergey; Konovets, Angelika; Platonov, Maxim O; Vasylchenko, Oleksandr V; Hurmach, Vasyl V; Grygorenko, Oleksandr O

    2018-01-08

    An approach to the parallel synthesis of hydantoin libraries by reaction of in situ generated 2,2,2-trifluoroethylcarbamates and α-amino esters was developed. To demonstrate the utility of the method, a library of 1158 hydantoins designed according to lead-likeness criteria (MW 200-350, cLogP 1-3) was prepared. The success rate of the method was analyzed as a function of the physicochemical parameters of the products, and it was found that the method can be considered a tool for lead-oriented synthesis. A hydantoin-bearing submicromolar primary hit acting as an Aurora kinase A inhibitor was discovered through a combination of rational design, parallel synthesis using the procedures developed, and in silico and in vitro screening.

  11. Quantitative PET Imaging in Drug Development: Estimation of Target Occupancy.

    PubMed

    Naganawa, Mika; Gallezot, Jean-Dominique; Rossano, Samantha; Carson, Richard E

    2017-12-11

    Positron emission tomography, an imaging tool using radiolabeled tracers in humans and preclinical species, has been widely used in recent years in drug development, particularly in the central nervous system. One important goal of PET in drug development is assessing the occupancy of various molecular targets (e.g., receptors, transporters, enzymes) by exogenous drugs. The current linear mathematical approaches used to determine occupancy using PET imaging experiments are presented. These algorithms use results from multiple regions with different target content in two scans, a baseline (pre-drug) scan and a post-drug scan. New mathematical estimation approaches to determine target occupancy, using maximum likelihood, are presented. A major challenge in these methods is the proper definition of the covariance matrix of the regional binding measures, accounting for different variance of the individual regional measures and their nonzero covariance, factors that have been ignored by conventional methods. The novel methods are compared to standard methods using simulation and real human occupancy data. The simulation data showed the expected reduction in variance and bias using the proper maximum likelihood methods, when the assumptions of the estimation method matched those in simulation. Between-method differences for data from human occupancy studies were less obvious, in part due to small dataset sizes. These maximum likelihood methods form the basis for development of improved PET covariance models, in order to minimize bias and variance in PET occupancy studies.
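
    For context, the conventional linear approach the review builds on can be written in two lines: regress the regional drop in binding (baseline minus post-drug) on the baseline binding; the slope estimates occupancy and the x-intercept the nondisplaceable signal (the classic Lassen plot). The regional values below are invented.

    ```python
    import numpy as np

    vt_base = np.array([3.1, 2.4, 4.0, 1.8, 2.9])   # baseline regional V_T
    vt_drug = np.array([2.0, 1.7, 2.5, 1.4, 1.9])   # post-drug regional V_T

    slope, intercept = np.polyfit(vt_base, vt_base - vt_drug, 1)
    occupancy = slope                   # fraction of target occupied by the drug
    v_nd = -intercept / slope           # nondisplaceable distribution volume
    print(f"occupancy = {occupancy:.1%}, V_ND = {v_nd:.2f}")
    ```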

  12. Climate Change Impacts and Vulnerability Assessment in Industrial Complexes

    NASA Astrophysics Data System (ADS)

    Lee, H. J.; Lee, D. K.

    2016-12-01

    Climate change has recently caused frequent natural disasters, such as floods, droughts, and heat waves, and these disasters have also increased industrial damage. Climate change adaptation policies are needed to reduce this damage, and accurate vulnerability assessment is essential for establishing them. This study therefore aims to establish a new index for assessing the vulnerability of industrial complexes. Most vulnerability indices have been developed with subjective approaches, such as the Delphi survey and the Analytic Hierarchy Process (AHP). Subjective approaches rely on the knowledge of a few experts, which undermines the reliability of the indices. To alleviate this problem, we designed a vulnerability index incorporating objective approaches, investigating 42 industrial complex sites in the Republic of Korea (ROK). To calculate the weights of the variables, we used the entropy method as an objective method, integrated with the Delphi survey as a subjective method. Finally, we found that our method, integrating both subjective and objective approaches, could generate results; the integration of the entropy method enables the vulnerability to be assessed objectively. Our method will be useful for establishing climate change adaptation policies by reducing the uncertainties of methods based solely on subjective approaches.
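
    The entropy weighting itself is a short computation: normalize each indicator column to proportions, compute its Shannon entropy, and give more weight to the less-entropic (more discriminating) indicators. The matrix below is illustrative (rows = industrial complexes, columns = indicators), not the study's data.

    ```python
    import numpy as np

    data = np.array([[0.3, 120.0, 5.0],
                     [0.6,  80.0, 9.0],
                     [0.1, 150.0, 2.0],
                     [0.8,  60.0, 7.0]])

    p = data / data.sum(axis=0)                    # column-wise proportions
    k = 1.0 / np.log(len(data))                    # normalizes entropy to [0, 1]
    entropy = -k * (p * np.log(p)).sum(axis=0)
    weights = (1 - entropy) / (1 - entropy).sum()  # diversity-based weights
    print("entropy weights:", np.round(weights, 3))
    ```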

  13. Designing eHealth that Matters via a Multidisciplinary Requirements Development Approach.

    PubMed

    Van Velsen, Lex; Wentzel, Jobke; Van Gemert-Pijnen, Julia Ewc

    2013-06-24

    Requirements development is a crucial part of eHealth design. It entails all the activities devoted to requirements identification, the communication of requirements to other developers, and their evaluation. Currently, a requirements development approach geared towards the specifics of the eHealth domain is lacking. This is likely to result in a mismatch between the developed technology and end user characteristics, physical surroundings, and the organizational context of use. It also makes it hard to judge the quality of eHealth design, since evaluations of eHealth cannot easily be geared to the main goals it is supposed to serve. In order to facilitate the creation of eHealth that matters, we present a practical, multidisciplinary requirements development approach which is embedded in a holistic design approach for eHealth (the Center for eHealth Research roadmap) that incorporates both human-centered design and business modeling. Our requirements development approach consists of five phases. In the first, preparatory, phase the project team is composed and the overall goal(s) of the eHealth intervention are decided upon. Second, primary end users and other stakeholders are identified by means of audience segmentation techniques and our stakeholder identification method. Third, the designated context of use is mapped and end users are profiled by means of requirements elicitation methods (eg, interviews, focus groups, or observations). Fourth, stakeholder values and eHealth intervention requirements are distilled from data transcripts, which leads to phase five, in which requirements are communicated to other developers using a requirements notation template we developed specifically for the context of eHealth technologies. The end result of our requirements development approach for eHealth interventions is a design document which includes functional and non-functional requirements, a list of stakeholder values, and end user profiles in the form of personas (fictitious end users, representative of a primary end user group). The requirements development approach presented in this article enables eHealth developers to apply a systematic and multi-disciplinary approach towards the creation of requirements. The cooperation between health, engineering, and social sciences creates a situation in which a mismatch between design, end users, and the organizational context can be avoided. Furthermore, we suggest evaluating eHealth on a feature-specific level in order to learn exactly why such a technology does or does not live up to its expectations.

  14. Ab initio approaches for the determination of heavy element energetics: Ionization energies of trivalent lanthanides (Ln = La-Eu)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Charles; Penchoff, Deborah A.; Wilson, Angela K., E-mail: wilson@chemistry.msu.edu

    2015-11-21

    An effective approach for the determination of lanthanide energetics, as demonstrated by application to the third ionization energy (in the gas phase) for the first half of the lanthanide series, has been developed. This approach uses a combination of highly correlated and fully relativistic ab initio methods to accurately describe the electronic structure of heavy elements. Both scalar and fully relativistic methods are used to achieve an approach that is both computationally feasible and accurate. The impact of basis set choice and the number of electrons included in the correlation space has also been examined.

  15. Engaging a Community in Developing an Entertainment–Education Spanish-Language Radio Novella Aimed at Reducing Chronic Disease Risk Factors, Alabama, 2010–2011

    PubMed Central

    Massingale, Shermetria; Bowen, Michelle; Kohler, Connie

    2012-01-01

    Background US Hispanics have disproportionate rates of diabetes and other chronic diseases. We used the entertainment–education approach to develop a Spanish-language radio novella aimed at reducing risk factors for diabetes, obesity, and tobacco use. The approach is based on social cognitive theory and proposes modeling as a source of vicarious learning of outcome and efficacy expectations. Community Context The Hispanic population in Alabama increased by 145% between 2000 and 2010. Nearly one-quarter of Hispanics aged 18 to 64 live below the federal poverty level, and 49% are uninsured. Several lifestyle factors lead to poor health behaviors in this community. Radio is a popular medium among Hispanic immigrants. The single local Spanish-language radio station reaches a large proportion of the local community and several communities beyond. Methods Through various methods, including workshops, review sessions, and other feedback mechanisms, we engaged stakeholders and community members in developing and evaluating a 48-episode radio novella to be broadcast as part of a variety show. We tracked participation of community members in all phases. Outcome Community members participated significantly in developing, broadcasting, and evaluating the intervention. The desired outcome — development of a culturally relevant storyline that addresses salient health issues and resonates with the community — was realized. Interpretation Our approach to community engagement can serve as a model for other organizations wishing to use community-based participatory methods in addressing Hispanic health issues. The radio novella was a unique approach for addressing health disparities among our community’s Hispanic population. PMID:22863307

  16. The Whole School, Whole Community, Whole Child Model: A New Approach for Improving Educational Attainment and Healthy Development for Students

    ERIC Educational Resources Information Center

    Lewallen, Theresa C.; Hunt, Holly; Potts-Datema, William; Zaza, Stephanie; Giles, Wayne

    2015-01-01

    Background: The Whole Child approach and the coordinated school health (CSH) approach both address the physical and emotional needs of students. However, a unified approach acceptable to both the health and education communities is needed to assure that students are healthy and ready to learn. Methods: During spring 2013, the ASCD (formerly known…

  17. Personalized Modeling for Prediction with Decision-Path Models

    PubMed Central

    Visweswaran, Shyam; Ferreira, Antonio; Ribeiro, Guilherme A.; Oliveira, Alexandre C.; Cooper, Gregory F.

    2015-01-01

    Deriving predictive models in medicine typically relies on a population approach where a single model is developed from a dataset of individuals. In this paper we describe and evaluate a personalized approach in which we construct a new type of decision tree model called decision-path model that takes advantage of the particular features of a given person of interest. We introduce three personalized methods that derive personalized decision-path models. We compared the performance of these methods to that of Classification And Regression Tree (CART) that is a population decision tree to predict seven different outcomes in five medical datasets. Two of the three personalized methods performed statistically significantly better on area under the ROC curve (AUC) and Brier skill score compared to CART. The personalized approach of learning decision path models is a new approach for predictive modeling that can perform better than a population approach. PMID:26098570
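
    The contrast the paper draws can be made concrete with scikit-learn: fit a population CART model, then read off the decision path a single patient of interest takes through it. The paper's personalized methods construct such a path directly for the patient rather than extracting it from a population tree; the data and settings below are synthetic.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(3)
    X = rng.normal(size=(200, 5))                  # 200 patients, 5 features
    y = (X[:, 0] > 0).astype(int)                  # synthetic binary outcome

    tree = DecisionTreeClassifier(max_depth=3).fit(X, y)   # population model
    patient = X[:1]                                        # person of interest
    print("decision path (node ids):", tree.decision_path(patient).indices)
    print("predicted risk:", tree.predict_proba(patient)[0, 1])
    ```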

  18. Qualitative research methods in renal medicine: an introduction.

    PubMed

    Bristowe, Katherine; Selman, Lucy; Murtagh, Fliss E M

    2015-09-01

    Qualitative methodologies are becoming increasingly widely used in health research. However, within some specialties, including renal medicine, qualitative approaches remain under-represented in the high-impact factor journals. Qualitative research can be undertaken: (i) as a stand-alone research method, addressing specific research questions; (ii) as part of a mixed methods approach alongside quantitative approaches or (iii) embedded in clinical trials, or during the development of complex interventions. The aim of this paper is to introduce qualitative research, including the rationale for choosing qualitative approaches, and guidance for ensuring quality when undertaking and reporting qualitative research. In addition, we introduce types of qualitative data (observation, interviews and focus groups) as well as some of the most commonly encountered methodological approaches (case studies, ethnography, phenomenology, grounded theory, thematic analysis, framework analysis and content analysis). © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  19. Participatory Design in Gerontechnology: A Systematic Literature Review.

    PubMed

    Merkel, Sebastian; Kucharski, Alexander

    2018-05-19

    Participatory design (PD) is widely used within gerontechnology, but there is no common understanding about which methods are used for what purposes. This review aims to examine what different forms of PD exist in the field of gerontechnology and how these can be categorized. We conducted a systematic literature review covering several databases. The search strategy was based on 3 elements: (1) participatory methods and approaches with (2) older persons aiming at developing (3) technology for older people. Our final review included 26 studies representing a variety of technologies designed/developed and methods/instruments applied. According to the technologies, the publications reviewed can be categorized into 3 groups: studies that (1) use existing technology with the aim of finding new ways of use; (2) aim at creating new devices; (3) test and/or modify prototypes. The implementation of PD depends on why a participatory approach is applied, who is involved as future user(s), when those future users are involved, and how they are incorporated into the innovation process. There are multiple ways, methods, and instruments to integrate users into the innovation process; which methods should be applied depends on the context. However, most studies do not evaluate whether participatory approaches lead to better acceptance and/or use of the co-developed products. Therefore, participatory design should follow a comprehensive strategy, starting with the users' needs and ending with an evaluation of whether the applied methods have led to better results.

  20. A Combinatorial Approach to Detecting Gene-Gene and Gene-Environment Interactions in Family Studies

    PubMed Central

    Lou, Xiang-Yang; Chen, Guo-Bo; Yan, Lei; Ma, Jennie Z.; Mangold, Jamie E.; Zhu, Jun; Elston, Robert C.; Li, Ming D.

    2008-01-01

    Widespread multifactor interactions present a significant challenge in determining risk factors of complex diseases. Several combinatorial approaches, such as the multifactor dimensionality reduction (MDR) method, have emerged as a promising tool for better detecting gene-gene (G × G) and gene-environment (G × E) interactions. We recently developed a general combinatorial approach, namely the generalized multifactor dimensionality reduction (GMDR) method, which can entertain both qualitative and quantitative phenotypes and allows for both discrete and continuous covariates to detect G × G and G × E interactions in a sample of unrelated individuals. In this article, we report the development of an algorithm that can be used to study G × G and G × E interactions for family-based designs, called pedigree-based GMDR (PGMDR). Compared to the available method, our proposed method has several major improvements, including allowing for covariate adjustments and being applicable to arbitrary phenotypes, arbitrary pedigree structures, and arbitrary patterns of missing marker genotypes. Our Monte Carlo simulations provide evidence that the PGMDR method is superior in performance to identify epistatic loci compared to the MDR-pedigree disequilibrium test (PDT). Finally, we applied our proposed approach to a genetic data set on tobacco dependence and found a significant interaction between two taste receptor genes (i.e., TAS2R16 and TAS2R38) in affecting nicotine dependence. PMID:18834969

  1. Localized-overlap approach to calculations of intermolecular interactions

    NASA Astrophysics Data System (ADS)

    Rob, Fazle

    Symmetry-adapted perturbation theory (SAPT) based on the density functional theory (DFT) description of the monomers [SAPT(DFT)] is one of the most robust tools for computing intermolecular interaction energies. Currently, one can use the SAPT(DFT) method to calculate interaction energies of dimers consisting of about a hundred atoms. To remove the methodological and technical limits and extend the size of the systems that can be calculated with the method, a novel approach has been proposed that redefines the electron densities and polarizabilities in a localized way. In the new method, accurate but computationally expensive quantum-chemical calculations are only applied for the regions where it is necessary and for other regions, where overlap effects of the wave functions are negligible, inexpensive asymptotic techniques are used. Unlike other hybrid methods, this new approach is mathematically rigorous. The main benefit of this method is that with the increasing size of the system the calculation scales linearly and, therefore, this approach will be denoted as local-overlap SAPT(DFT) or LSAPT(DFT). As a byproduct of developing LSAPT(DFT), some important problems concerning distributed molecular response, in particular, the unphysical charge-flow terms were eliminated. Additionally, to illustrate the capabilities of SAPT(DFT), a potential energy function has been developed for an energetic molecular crystal of 1,1-diamino-2,2-dinitroethylene (FOX-7), where an excellent agreement with the experimental data has been found.

  2. How to approach the ENS: various ways to analyse motility disorders in situ and in vitro.

    PubMed

    Schäfer, K-H; Hagl, C I; Wink, E; Holland-Cunz, S; Klotz, M; Rauch, U; Waag, K-L

    2003-06-01

    Motility disorders of the human intestine are so variable that they cannot be diagnosed by just one technique. Their aetiology is obviously so varied that they have to be approached with a broad range of technical methods. These range from simple haematoxylin-stained sections to the isolation of stem or precursor cells. In this study, various methods to investigate the enteric nervous system and its surrounding tissue are demonstrated. While sections from paraffin-embedded material or cryostat sections provide only a two-dimensional perspective of the ENS, the whole-mount method yields three-dimensional perspectives of large areas of the gut wall. The three-dimensional impression can even be enhanced by electron microscopy of the isolated ENS. Dynamic aspects of ENS development can be tackled by in vitro studies. The myenteric plexus can be isolated and cultivated under the influence of the microenvironment (protein extracts). Although the postnatal myenteric plexus is not fully developed, the choice of embryological neuronal cells seems to be more effective for certain approaches. They can be isolated from the embryonic mouse gut and cultivated under the influence of various factors. This method seems to us a valuable tool for the investigation of the aetiology of motility disorders, although only a "complete" approach that considers all available methods will ultimately yield a clear understanding that might lead to new therapeutic concepts.

  3. Evaluating the Effects of Chemicals on Nervous System Development

    EPA Pesticide Factsheets

    There are thousands of chemicals that lack data on their potential effects on neurodevelopment. EPA is developing New Approach Methods to evaluate these chemicals for developmental neurotoxicity hazard.

  4. A new approach to applying feedforward neural networks to the prediction of musculoskeletal disorder risk.

    PubMed

    Chen, C L; Kaber, D B; Dempsey, P G

    2000-06-01

    A new and improved method for feedforward neural network (FNN) development for application to data classification problems, such as the prediction of levels of low-back disorder (LBD) risk associated with industrial jobs, is presented. Background on FNN development for data classification is provided along with discussions of previous research and neighborhood (local) solution search methods for hard combinatorial problems. An analytical study is presented which compared the prediction accuracy of an FNN based on an error back-propagation (EBP) algorithm with that of an FNN developed by considering the results of a local solution search (simulated annealing) for classifying industrial jobs as posing low or high risk for LBDs. The comparison demonstrated superior performance of the FNN generated using the new method. The architecture of this FNN included fewer input (predictor) variables and hidden neurons than the FNN developed based on the EBP algorithm. Independent variable selection methods and the phenomenon of 'overfitting' in FNN (and statistical model) generation for data classification are discussed. The results are supportive of the use of the new approach to FNN development for applications to musculoskeletal disorders and risk forecasting in other domains.
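
    To illustrate the local-search idea in isolation, the sketch below trains a tiny one-hidden-layer network by simulated annealing on synthetic data; the network size, features, and cooling schedule are invented and do not reproduce the paper's model.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 4))                    # 4 hypothetical job features
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)  # synthetic low/high LBD risk

    def loss(w):
        W1, b1, W2, b2 = w
        h = np.tanh(X @ W1 + b1)                     # one hidden layer
        p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))     # sigmoid output
        return float(np.mean((p - y) ** 2))

    w = [rng.normal(size=(4, 3)), np.zeros(3), rng.normal(size=3), 0.0]
    cur, T = loss(w), 1.0
    for _ in range(2000):                            # simulated-annealing loop
        cand = [p + rng.normal(scale=0.1, size=np.shape(p)) for p in w]
        cl = loss(cand)
        if cl < cur or rng.random() < np.exp((cur - cl) / T):  # Metropolis rule
            w, cur = cand, cl
        T *= 0.999                                   # geometric cooling
    print(f"final training MSE: {cur:.3f}")
    ```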

  5. Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-09-20

    These are slides from a seminar given to the University of Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety validation. It uses the sensitivity profile data for an application as computed by MCNP6 along with covariance files for the nuclear data to determine a baseline upper-subcritical-limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6, and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation – making full use of today’s computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.

  6. Human ergology that promotes participatory approach to improving safety, health and working conditions at grassroots workplaces: achievements and actions.

    PubMed

    Kawakami, Tsuyoshi

    2011-12-01

    Participatory approaches are increasingly applied to improve safety, health and working conditions in grassroots workplaces in Asia. The core concepts and methods of human ergology research, such as promoting real work life studies, relying on the positive efforts of local people (daily life-technology), promoting the active participation of local people in identifying practical solutions, and learning from local human networks to reach grassroots workplaces, have provided useful viewpoints for devising such participatory training programmes. This study aimed to analyze how human ergology approaches were applied in the actual development and application of three typical participatory training programmes: WISH (Work Improvement for Safe Home) with home workers in Cambodia, WISCON (Work Improvement in Small Construction Sites) with construction workers in Thailand, and WARM (Work Adjustment for Recycling and Managing Waste) with waste collectors in Fiji. The results revealed that all three programmes, in the course of their development, applied direct observation of the target workers' work before devising the training programmes, learned from existing local good examples and efforts, and emphasized local human networks for cooperation. These methods and approaches were repeatedly applied in grassroots workplaces, taking advantage of their sustainability and impact. It was concluded that human ergology approaches contributed substantially to the development and expansion of participatory training programmes and can continue to support the self-help initiatives of local people in promoting human-centred work.

  7. Intercomparison of 3D pore-scale flow and solute transport simulation methods

    DOE PAGES

    Mehmani, Yashar; Schoenherr, Martin; Pasquali, Andrea; ...

    2015-09-28

    Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include 1) methods that explicitly model the three-dimensional geometry of pore spaces and 2) methods that conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of the first type, using computational fluid dynamics (CFD) codes employing a standard finite volume method (FVM), against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of the first type based on the lattice Boltzmann method (LBM) and smoothed particle hydrodynamics (SPH), as well as a model of the second type, a pore-network model (PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect of conceptualizing complex three-dimensional pore geometries on solute transport in the manner of PNMs has not been fully determined. We apply all four approaches (FVM-based CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and (for capable codes) nonreactive solute transport, and intercompare the model results. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations). Generally good agreement was achieved among the various approaches, but some differences were observed depending on the model context. The intercomparison work was challenging because of variable capabilities of the codes, and inspired some code enhancements to allow consistent comparison of flow and transport simulations across the full suite of methods. This paper provides support for confidence in a variety of pore-scale modeling methods and motivates further development and application of pore-scale simulation methods.
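
    Of the four approaches, the pore-network idea is the easiest to sketch: pores are nodes, throats are conductances, and steady single-phase flow is a linear system expressing mass balance at each pore. The 1D chain below is a deliberately tiny stand-in for a real extracted network.

    ```python
    import numpy as np

    g = np.array([2.0, 1.0, 0.5, 1.0])     # throat conductances along a pore chain
    n = len(g) + 1                          # pores 0..n-1, pressures fixed at ends
    A, b = np.zeros((n, n)), np.zeros(n)
    for i, gi in enumerate(g):              # assemble the conductance (Laplacian)
        A[i, i] += gi; A[i + 1, i + 1] += gi
        A[i, i + 1] -= gi; A[i + 1, i] -= gi
    A[0, :], A[0, 0], b[0] = 0, 1, 1.0      # inlet pressure = 1
    A[-1, :], A[-1, -1], b[-1] = 0, 1, 0.0  # outlet pressure = 0
    P = np.linalg.solve(A, b)
    print("pore pressures:", np.round(P, 3), " flow:", round(g[0] * (P[0] - P[1]), 4))
    ```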

  8. Kinetics and mechanism of solid decompositions — From basic discoveries by atomic absorption spectrometry and quadrupole mass spectroscopy to thorough thermogravimetric analysis

    NASA Astrophysics Data System (ADS)

    L'vov, Boris V.

    2008-02-01

    This paper sums up the evolution of the thermochemical approach to the interpretation of solid decompositions over the past 25 years. This period includes two stages related to decomposition studies by different techniques: by ET AAS and QMS in 1981-2001 and by TG in 2002-2007. As a result of the ET AAS and QMS investigations, a method for determining the absolute rates of solid decompositions was developed and the mechanism of decomposition through congruent dissociative vaporization was discovered. On this basis, in the period from 1997 to 2001, the decomposition mechanisms of several classes of reactants were interpreted and some unusual effects observed in TA were explained. However, the thermochemical approach has not received support from other TA researchers. One potential reason for this distrust is the unreliability of the E values measured by the traditional Arrhenius plot method. Theoretical analysis and comparison of the metrological features of the different methods used to determine thermochemical quantities led to the conclusion that, compared with the Arrhenius plot and second-law methods, the third-law method is very much to be preferred. However, the third-law method cannot be used in kinetic studies based on the Arrhenius approach, because it requires measurement of the equilibrium pressures of the decomposition products. The method of absolute rates, by contrast, is ideally suited to this purpose. As a result of the much higher precision of the third-law method, some quantitative conclusions that follow from the theory were confirmed, and several new effects, invisible in the framework of the Arrhenius approach, were revealed. In spite of the great progress reached in developing a reliable methodology based on the third-law method, the thermochemical approach remains largely unacknowledged.
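
    As context for the metrological comparison above, the two estimation routes can be contrasted schematically (standard van 't Hoff/third-law identities for a one-component equilibrium vapor, not taken from the paper): the second-law method extracts the decomposition enthalpy from the slope of the equilibrium-pressure curve, while the third-law method obtains it from a single pressure measurement combined with a tabulated entropy change:

    \[
      \Delta H^{\circ} \;=\; -R\,\frac{\mathrm{d}\,\ln P_{\mathrm{eq}}}{\mathrm{d}\,(1/T)}
      \qquad \text{(second-law, slope of many points)},
    \]
    \[
      \Delta H^{\circ}_{T} \;=\; T\left(\Delta S^{\circ}_{T} - R\,\ln P_{\mathrm{eq}}\right)
      \qquad \text{(third-law, single point)}.
    \]

    The single-point form avoids the error amplification inherent in differentiating noisy \(\ln P_{\mathrm{eq}}\) data, which is consistent with the precision advantage claimed above.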

  9. Intercomparison of 3D pore-scale flow and solute transport simulation methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xiaofan; Mehmani, Yashar; Perkins, William A.

    2016-09-01

    Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include 1) methods that explicitly model the three-dimensional geometry of pore spaces and 2) methods that conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of the first type, using computational fluid dynamics (CFD) codes employing a standard finite volume method (FVM), against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of the first type based on the lattice Boltzmann method (LBM) and smoothed particle hydrodynamics (SPH), as well as a model of the second type, a pore-network model (PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect of conceptualizing complex three-dimensional pore geometries on solute transport in the manner of PNMs has not been fully determined. We apply all four approaches (FVM-based CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and (for capable codes) nonreactive solute transport, and intercompare the model results. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations). Generally good agreement was achieved among the various approaches, but some differences were observed depending on the model context. The intercomparison work was challenging because of variable capabilities of the codes, and inspired some code enhancements to allow consistent comparison of flow and transport simulations across the full suite of methods. This study provides support for confidence in a variety of pore-scale modeling methods and motivates further development and application of pore-scale simulation methods.

  10. A dynamical-systems approach for computing ice-affected streamflow

    USGS Publications Warehouse

    Holtschlag, David J.

    1996-01-01

    A dynamical-systems approach was developed and evaluated for computing ice-affected streamflow. The approach provides for dynamic simulation and parameter estimation of site-specific equations relating ice effects to routinely measured environmental variables. Comparison indicates that results from the dynamical-systems approach ranked higher than results from 11 analytical methods previously investigated on the basis of accuracy and feasibility criteria. Additional research will likely lead to further improvements in the approach.

  11. Working towards the SDGs: measuring resilience from a practitioner's perspective

    NASA Astrophysics Data System (ADS)

    van Manen, S. M.; Both, M.

    2015-12-01

    The broad universal nature of the SDGs requires integrated approaches across development sectors and action at a variety of scales: from global to local. In humanitarian and development contexts, particularly at the local level, working towards these goals is increasingly approached through the concept of resilience. Resilience is broadly defined as the ability to minimise the impact of, cope with and recover from the consequences of shocks and stresses, both natural and man-made, without compromising long-term prospects. Key in this are the physical resources required and the ability to organise these prior to and during a crisis. However, despite the active debate on the theoretical foundations of resilience, there has been comparatively little development of measurement approaches. The conceptual diversity of the few existing approaches further illustrates the complexity of operationalising the concept. Here we present a practical method to measure community resilience using a questionnaire composed of a generic set of household-level indicators. Rooted in the sustainable livelihoods approach, it considers 6 domains: human, social, natural, economic, physical and political, and evaluates both resources and socio-cognitive factors. It is intended to be combined with more specific intervention-based questionnaires to systematically assess, monitor and evaluate the resilience of a community and the contribution of specific activities to resilience. Its use will be illustrated using a Haiti-based case study. The method presented supports knowledge-based decision making and impact monitoring. Furthermore, the evidence-based way of working contributes to accountability to a range of stakeholders and can be used for resource mobilisation. However, it should be noted that, due to its inherent complexity and comprehensive nature, there is no method or combination of methods and data types that can fully capture resilience in and across all of its facets, scales and domains.

  12. Adaptive Prior Variance Calibration in the Bayesian Continual Reassessment Method

    PubMed Central

    Zhang, Jin; Braun, Thomas M.; Taylor, Jeremy M.G.

    2012-01-01

    Use of the Continual Reassessment Method (CRM) and other model-based approaches to design in Phase I clinical trials has increased due to the ability of the CRM to identify the maximum tolerated dose (MTD) better than the 3+3 method. However, the CRM can be sensitive to the variance selected for the prior distribution of the model parameter, especially when a small number of patients are enrolled. While methods have emerged to adaptively select skeletons and to calibrate the prior variance only at the beginning of a trial, there has not been any approach developed to adaptively calibrate the prior variance throughout a trial. We propose three systematic approaches to adaptively calibrate the prior variance during a trial and compare them via simulation to methods proposed to calibrate the variance at the beginning of a trial. PMID:22987660
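
    To make the role of the prior variance concrete, below is a minimal sketch of a one-parameter power-model CRM update on a grid; the skeleton, target, and starting variance are invented, and sigma2 is the prior variance that the paper proposes to recalibrate adaptively during the trial (the recalibration logic itself is not reproduced here).

        import numpy as np

        # Minimal one-parameter power-model CRM sketch (all values hypothetical).
        skeleton = np.array([0.05, 0.10, 0.20, 0.30, 0.50])  # prior toxicity guesses
        target = 0.25    # target toxicity probability at the MTD
        sigma2 = 1.34    # prior variance of theta -- the quantity being calibrated

        def recommend_dose(doses, tox, sigma2):
            """doses: dose indices given so far; tox: 0/1 toxicity outcomes."""
            theta = np.linspace(-4.0, 4.0, 2001)            # integration grid
            dtheta = theta[1] - theta[0]
            prior = np.exp(-theta**2 / (2.0 * sigma2))      # unnormalized N(0, sigma2)
            # dose-toxicity model: p_d(theta) = skeleton[d] ** exp(theta)
            p = skeleton[np.asarray(doses)][:, None] ** np.exp(theta)
            lik = np.prod(np.where(np.asarray(tox)[:, None] == 1, p, 1.0 - p), axis=0)
            post = prior * lik
            post /= post.sum() * dtheta                     # normalize the posterior
            # posterior-mean toxicity probability at every dose level
            p_hat = [(skeleton[d] ** np.exp(theta) * post).sum() * dtheta
                     for d in range(len(skeleton))]
            return int(np.argmin(np.abs(np.array(p_hat) - target)))

        # e.g., after three patients at the two lowest doses with one toxicity:
        next_dose = recommend_dose([0, 0, 1], [0, 0, 1], sigma2)

    With few patients enrolled, the likelihood is nearly flat, so the choice of sigma2 dominates p_hat; this is the sensitivity the adaptive calibration schemes target.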

  13. An RBF-based compression method for image-based relighting.

    PubMed

    Leung, Chi-Sing; Wong, Tien-Tsin; Lam, Ping-Man; Choy, Kwok-Hung

    2006-04-01

    In image-based relighting, a pixel is associated with a number of sampled radiance values. This paper presents a two-level compression method. In the first level, the plenoptic property of a pixel is approximated by a spherical radial basis function (SRBF) network; that is, the spherical plenoptic function of each pixel is represented by a number of SRBF weights. In the second level, we apply a wavelet-based method to compress these SRBF weights. To reduce the visual artifacts due to quantization noise, we develop a constrained method for estimating the SRBF weights. Our proposed approach is superior to JPEG, JPEG2000, and MPEG. Compared with the spherical harmonics approach, our approach has a lower complexity, while the visual quality is comparable. The real-time rendering method for our SRBF representation is also discussed.
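
    As a rough illustration of the first compression level, the sketch below fits spherical-Gaussian RBF weights for a single pixel by ridge-regularized least squares; the sample count, basis size, bandwidth, and regularizer are invented, and the paper's constrained estimator and wavelet stage are not reproduced.

        import numpy as np

        # Toy SRBF fit for one pixel: radiance samples y_j observed under unit
        # light directions d_j are approximated as sum_k w_k * exp(lam*(d_j.c_k - 1)),
        # a spherical Gaussian basis centered at unit vectors c_k.
        rng = np.random.default_rng(0)
        d = rng.normal(size=(200, 3)); d /= np.linalg.norm(d, axis=1, keepdims=True)
        y = np.maximum(d @ np.array([0.0, 0.0, 1.0]), 0.0)   # toy "plenoptic" samples
        c = rng.normal(size=(16, 3)); c /= np.linalg.norm(c, axis=1, keepdims=True)
        lam = 8.0                                             # basis bandwidth
        Phi = np.exp(lam * (d @ c.T - 1.0))                   # design matrix (200 x 16)

        # Ridge-regularized least squares keeps the weights small (and thus
        # quantization-friendly), loosely in the spirit of constrained estimation.
        w = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(16), Phi.T @ y)
        recon = Phi @ w                                       # reconstructed radiance

    The per-pixel weight vectors w would then be the input to the second, wavelet-based compression level.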

  14. Flight Test Results of a GPS-Based Pitot-Static Calibration Method Using Output-Error Optimization for a Light Twin-Engine Airplane

    NASA Technical Reports Server (NTRS)

    Martos, Borja; Kiszely, Paul; Foster, John V.

    2011-01-01

    As part of the NASA Aviation Safety Program (AvSP), a novel pitot-static calibration method was developed to allow rapid in-flight calibration for subscale aircraft while flying within confined test areas. This approach uses Global Positioning System (GPS) technology coupled with modern system identification methods that rapidly compute optimal pressure error models over a range of airspeed with defined confidence bounds. This method has been demonstrated in subscale flight tests and has shown small 2-sigma error bounds with significant reduction in test time compared to other methods. The current research was motivated by the desire to further evaluate and develop this method for full-scale aircraft. A goal of this research was to develop an accurate calibration method that enables reductions in test equipment and flight time, thus reducing costs. The approach involved analysis of data acquisition requirements, development of efficient flight patterns, and analysis of pressure error models based on system identification methods. Flight tests were conducted at The University of Tennessee Space Institute (UTSI) utilizing an instrumented Piper Navajo research aircraft. In addition, the UTSI engineering flight simulator was used to investigate test maneuver requirements and handling qualities issues associated with this technique. This paper provides a summary of piloted simulation and flight test results that illustrates the performance and capabilities of the NASA calibration method. Discussion of maneuver requirements and data analysis methods is included as well as recommendations for piloting technique.

  15. Flexible functional regression methods for estimating individualized treatment regimes.

    PubMed

    Ciarleglio, Adam; Petkova, Eva; Tarpey, Thaddeus; Ogden, R Todd

    2016-01-01

    A major focus of personalized medicine is on the development of individualized treatment rules. Good decision rules have the potential to significantly advance patient care and reduce the burden of a host of diseases. Statistical methods for developing such rules are progressing rapidly, but few methods have considered the use of pre-treatment functional data to guide in decision-making. Furthermore, those methods that do allow for the incorporation of functional pre-treatment covariates typically make strong assumptions about the relationships between the functional covariates and the response of interest. We propose two approaches for using functional data to select an optimal treatment that address some of the shortcomings of previously developed methods. Specifically, we combine the flexibility of functional additive regression models with Q-learning or A-learning in order to obtain treatment decision rules. Properties of the corresponding estimators are discussed. Our approaches are evaluated in several realistic settings using synthetic data and are applied to real data arising from a clinical trial comparing two treatments for major depressive disorder in which baseline imaging data are available for subjects who are subsequently treated.
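
    For readers unfamiliar with the Q-learning step, here is a minimal single-stage sketch in which a plain linear interaction model stands in for the paper's functional additive regression; the data and model form are toy assumptions.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Single-stage, two-treatment Q-learning: fit Q(x, a) = E[y | x, a],
        # then recommend the treatment maximizing the predicted outcome.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(300, 2))            # pre-treatment covariates
        A = rng.integers(0, 2, size=300)         # assigned treatment 0/1
        y = X[:, 0] + A * (1.5 * X[:, 1]) + rng.normal(scale=0.5, size=300)

        # main effects plus treatment-by-covariate interactions
        design = np.column_stack([X, A, A[:, None] * X])
        q = LinearRegression().fit(design, y)

        def recommend(x):
            """Choose the treatment a with the larger predicted outcome Q(x, a)."""
            rows = [np.concatenate([x, [a], a * x]) for a in (0, 1)]
            return int(np.argmax(q.predict(np.array(rows))))

        print(recommend(np.array([0.5, -1.0])))   # -> 0 here, since x2 < 0

    A-learning differs in modeling only the treatment-contrast part of Q rather than the full outcome regression.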

  16. An Assessment of Phylogenetic Tools for Analyzing the Interplay Between Interspecific Interactions and Phenotypic Evolution.

    PubMed

    Drury, J P; Grether, G F; Garland, T; Morlon, H

    2018-05-01

    Much ecological and evolutionary theory predicts that interspecific interactions often drive phenotypic diversification and that species phenotypes in turn influence species interactions. Several phylogenetic comparative methods have been developed to assess the importance of such processes in nature; however, the statistical properties of these methods have gone largely untested. Focusing mainly on scenarios of competition between closely-related species, we assess the performance of available comparative approaches for analyzing the interplay between interspecific interactions and species phenotypes. We find that many currently used statistical methods often fail to detect the impact of interspecific interactions on trait evolution, that sister-taxa analyses are particularly unreliable in general, and that recently developed process-based models have more satisfactory statistical properties. Methods for detecting predictors of species interactions are generally more reliable than methods for detecting character displacement. In weighing the strengths and weaknesses of different approaches, we hope to provide a clear guide for empiricists testing hypotheses about the reciprocal effect of interspecific interactions and species phenotypes and to inspire further development of process-based models.

  17. Utilizing the Foreign Body Response to Grow Tissue Engineered Blood Vessels in Vivo.

    PubMed

    Geelhoed, Wouter J; Moroni, Lorenzo; Rotmans, Joris I

    2017-04-01

    It is well known that the number of patients requiring vascular grafts for use as vessel replacements in cardiovascular disease, or as vascular access sites for hemodialysis, is ever increasing. The development of tissue engineered blood vessels (TEBVs) is a promising method to meet this increasing demand for vascular grafts without having to rely on poorly performing synthetic options such as polytetrafluoroethylene (PTFE) or Dacron. The generation of in vivo TEBVs involves utilizing the host reaction to an implanted biomaterial for the generation of completely autologous tissues. Essentially, this approach to the development of TEBVs makes use of the foreign body response to biomaterials for the construction of the entire vascular replacement tissue within the patient's own body. In this review we discuss the method of developing in vivo TEBVs and debate the approaches of several research groups that have implemented this method.

  18. Automatic identification and normalization of dosage forms in drug monographs

    PubMed Central

    2012-01-01

    Background Each day, millions of health consumers seek drug-related information on the Web. Despite some efforts in linking related resources, drug information is largely scattered in a wide variety of websites of different quality and credibility. Methods As a step toward providing users with integrated access to multiple trustworthy drug resources, we aim to develop a method capable of identifying a drug's dosage form information in addition to drug name recognition. We developed rules and patterns for identifying dosage forms from different sections of full-text drug monographs, and subsequently normalized them to standardized RxNorm dosage forms. Results Our method represents a significant improvement compared with a baseline lookup approach, achieving overall macro-averaged Precision of 80%, Recall of 98%, and F-Measure of 85%. Conclusions We successfully developed an automatic approach for drug dosage form identification, which is critical for building links between different drug-related resources. PMID:22336431
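
    A toy version of the rule-based identification-plus-normalization step might look like the following; the surface patterns and the normalized target names are illustrative stand-ins, not the paper's actual rule set or the full RxNorm vocabulary.

        import re

        # Hypothetical surface-form -> normalized dosage-form lookup.
        NORMALIZE = {
            "tab": "Oral Tablet", "tablet": "Oral Tablet", "tablets": "Oral Tablet",
            "cap": "Oral Capsule", "capsule": "Oral Capsule",
            "inj": "Injectable Solution", "injection": "Injectable Solution",
        }
        PATTERN = re.compile(r"\b(" + "|".join(NORMALIZE) + r")\b", re.IGNORECASE)

        def dosage_forms(monograph_text):
            """Return the set of normalized dosage forms mentioned in a section."""
            return {NORMALIZE[m.group(1).lower()]
                    for m in PATTERN.finditer(monograph_text)}

        print(dosage_forms("Supplied as 10 mg tablets or as an injection."))
        # -> {'Oral Tablet', 'Injectable Solution'}

    The reported baseline corresponds roughly to such a flat lookup; the paper's gains come from section-aware rules layered on top of it.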

  19. CRISPR Approaches to Small Molecule Target Identification. | Office of Cancer Genomics

    Cancer.gov

    A long-standing challenge in drug development is the identification of the mechanisms of action of small molecules with therapeutic potential. A number of methods have been developed to address this challenge, each with inherent strengths and limitations. We here provide a brief review of these methods with a focus on chemical-genetic methods that are based on systematically profiling the effects of genetic perturbations on drug sensitivity.

  20. Putting social impact assessment to the test as a method for implementing responsible tourism practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCombes, Lucy, E-mail: l.mccombes@leedsbeckett.ac.uk; Vanclay, Frank, E-mail: frank.vanclay@rug.nl; Evers, Yvette, E-mail: y.evers@tft-earth.org

    The discourse on the social impacts of tourism needs to shift from the current descriptive critique of tourism to considering what can be done in actual practice to embed the management of tourism's social impacts into the existing planning, product development and operational processes of tourism businesses. A pragmatic approach for designing research methodologies, social management systems and initial actions, which is shaped by the real world operational constraints and existing systems used in the tourism industry, is needed. Our pilot study with a small Bulgarian travel company put social impact assessment (SIA) to the test to see if it could provide this desired approach and assist in implementing responsible tourism development practice, especially in small tourism businesses. Our findings showed that our adapted SIA method has value as a practical method for embedding a responsible tourism approach. While there were some challenges, SIA proved to be effective in assisting the staff of our test case tourism business to better understand their social impacts on their local communities and to identify actions to take. - Highlights: • A pragmatic approach is needed for the responsible management of the social impacts of tourism. • Our adapted Social Impact Assessment (SIA) method has value as a practical method. • SIA can be embedded into tourism businesses' existing ‘ways of doing things’. • We identified challenges and ways to improve our method to better suit the small tourism business context.

  1. A Statistical Approach for the Concurrent Coupling of Molecular Dynamics and Finite Element Methods

    NASA Technical Reports Server (NTRS)

    Saether, E.; Yamakov, V.; Glaessgen, E.

    2007-01-01

    Molecular dynamics (MD) methods are opening new opportunities for simulating the fundamental processes of material behavior at the atomistic level. However, increasing the size of the MD domain quickly presents intractable computational demands. A robust approach to surmount this computational limitation has been to unite continuum modeling procedures such as the finite element method (FEM) with MD analyses thereby reducing the region of atomic scale refinement. The challenging problem is to seamlessly connect the two inherently different simulation techniques at their interface. In the present work, a new approach to MD-FEM coupling is developed based on a restatement of the typical boundary value problem used to define a coupled domain. The method uses statistical averaging of the atomistic MD domain to provide displacement interface boundary conditions to the surrounding continuum FEM region, which, in return, generates interface reaction forces applied as piecewise constant traction boundary conditions to the MD domain. The two systems are computationally disconnected and communicate only through a continuous update of their boundary conditions. With the use of statistical averages of the atomistic quantities to couple the two computational schemes, the developed approach is referred to as an embedded statistical coupling method (ESCM) as opposed to a direct coupling method where interface atoms and FEM nodes are individually related. The methodology is inherently applicable to three-dimensional domains, avoids discretization of the continuum model down to atomic scales, and permits arbitrary temperatures to be applied.
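
    Schematically, the ESCM outer loop alternates between the two solvers roughly as sketched below; md and fem are placeholder interfaces (not a real MD or FE code), so this shows only the communication structure described in the abstract.

        # Schematic ESCM outer loop: the two solvers stay independent and talk
        # only through boundary-condition updates (placeholder interfaces).
        def escm_couple(md, fem, n_iter=20, n_avg=1000):
            for _ in range(n_iter):
                # 1) time-average atomic displacements in the volume around each
                #    interface FE node -> Dirichlet BC for the continuum region
                u_interface = md.average_nodal_displacements(steps=n_avg)
                fem.set_interface_displacements(u_interface)

                # 2) solve the continuum problem and recover reaction forces
                fem.solve()
                tractions = fem.interface_reaction_tractions()

                # 3) apply them as piecewise-constant tractions on the MD boundary,
                #    then advance the MD system to gather the next averages
                md.set_boundary_tractions(tractions)
                md.run(steps=n_avg)

    The averaging in step 1 is what removes the need to match individual atoms to FE nodes, which in turn is why the FE mesh need not be refined to atomic resolution.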

  2. Lattice Boltzmann simulations of heat transfer in fully developed periodic incompressible flows

    NASA Astrophysics Data System (ADS)

    Wang, Zimeng; Shang, Helen; Zhang, Junfeng

    2017-06-01

    Flow and heat transfer in periodic structures are of great interest for many applications. In this paper, we carefully examine the periodic features of fully developed periodic incompressible thermal flows, and incorporate them in the lattice Boltzmann method (LBM) for flow and heat transfer simulations. Two numerical approaches, the distribution modification (DM) approach and the source term (ST) approach, are proposed; both can be used for periodic thermal flows with constant wall temperature (CWT) and surface heat flux boundary conditions. However, the DM approach might be more efficient, especially for CWT systems, since the ST approach requires calculations of the streamwise temperature gradient at all lattice nodes. Several example simulations are conducted, including flows through flat and wavy channels and flows through a square array with circular cylinders. Results are compared to analytical solutions, previous studies, and our own LBM calculations using different simulation techniques (i.e., the one-module simulation vs. the two-module simulation, and the DM approach vs. the ST approach) with good agreement. These simple yet representative simulations demonstrate the accuracy and usefulness of our proposed LBM methods for future thermal periodic flow simulations.
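
    The periodicity being exploited can be stated compactly for the constant-heat-flux case (a standard decomposition for fully developed periodic thermal flow, given here as context rather than quoted from the paper): the temperature splits into a linear streamwise drift plus a module-periodic part, and substituting the split into the energy equation produces exactly the kind of extra source term that an ST-type approach adds at each node:

    \[
      T(x,\mathbf{r}) = \gamma x + \theta(x,\mathbf{r}), \qquad
      \theta(x+L,\mathbf{r}) = \theta(x,\mathbf{r}),
    \]
    \[
      \mathbf{u}\cdot\nabla\theta \;=\; \alpha\,\nabla^{2}\theta \;-\; u_{x}\,\gamma,
    \]

    where \(L\) is the module length, \(\gamma\) the constant streamwise temperature gradient, and \(\alpha\) the thermal diffusivity.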

  3. Sage Simulation Model for Technology Demonstration Convertor by a Step-by-Step Approach

    NASA Technical Reports Server (NTRS)

    Demko, Rikako; Penswick, L. Barry

    2006-01-01

    The development of a Stirling model using the 1-D Sage design code was completed using a step-by-step approach. This is a method of gradually increasing the complexity of the Sage model while observing the energy balance and energy losses at each step of the development. This step-by-step model development and energy-flow analysis can clarify where the losses occur, their impact, and suggest possible opportunities for design improvement.

  4. A novel method for expediting the development of patient-reported outcome measures and an evaluation across several populations

    PubMed Central

    Garrard, Lili; Price, Larry R.; Bott, Marjorie J.; Gajewski, Byron J.

    2016-01-01

    Item response theory (IRT) models provide an appropriate alternative to the classical ordinal confirmatory factor analysis (CFA) during the development of patient-reported outcome measures (PROMs). Current literature has identified the assessment of IRT model fit as both challenging and underdeveloped (Sinharay & Johnson, 2003; Sinharay, Johnson, & Stern, 2006). This study evaluates the performance of Ordinal Bayesian Instrument Development (OBID), a Bayesian IRT model with a probit link function approach, through applications in two breast cancer-related instrument development studies. The primary focus is to investigate an appropriate method for comparing Bayesian IRT models in PROMs development. An exact Bayesian leave-one-out cross-validation (LOO-CV) approach (Vehtari & Lampinen, 2002) is implemented to assess prior selection for the item discrimination parameter in the IRT model and subject content experts’ bias (in a statistical sense and not to be confused with psychometric bias as in differential item functioning) toward the estimation of item-to-domain correlations. Results support the utilization of content subject experts’ information in establishing evidence for construct validity when sample size is small. However, the incorporation of subject experts’ content information in the OBID approach can be sensitive to the level of expertise of the recruited experts. More stringent efforts need to be invested in the appropriate selection of subject experts to efficiently use the OBID approach and reduce potential bias during PROMs development. PMID:27667878

  5. A novel method for expediting the development of patient-reported outcome measures and an evaluation across several populations.

    PubMed

    Garrard, Lili; Price, Larry R; Bott, Marjorie J; Gajewski, Byron J

    2016-10-01

    Item response theory (IRT) models provide an appropriate alternative to the classical ordinal confirmatory factor analysis (CFA) during the development of patient-reported outcome measures (PROMs). Current literature has identified the assessment of IRT model fit as both challenging and underdeveloped (Sinharay & Johnson, 2003; Sinharay, Johnson, & Stern, 2006). This study evaluates the performance of Ordinal Bayesian Instrument Development (OBID), a Bayesian IRT model with a probit link function approach, through applications in two breast cancer-related instrument development studies. The primary focus is to investigate an appropriate method for comparing Bayesian IRT models in PROMs development. An exact Bayesian leave-one-out cross-validation (LOO-CV) approach (Vehtari & Lampinen, 2002) is implemented to assess prior selection for the item discrimination parameter in the IRT model and subject content experts' bias (in a statistical sense and not to be confused with psychometric bias as in differential item functioning) toward the estimation of item-to-domain correlations. Results support the utilization of content subject experts' information in establishing evidence for construct validity when sample size is small. However, the incorporation of subject experts' content information in the OBID approach can be sensitive to the level of expertise of the recruited experts. More stringent efforts need to be invested in the appropriate selection of subject experts to efficiently use the OBID approach and reduce potential bias during PROMs development.

  6. Combinatorial Strategies for the Development of Bulk Metallic Glasses

    NASA Astrophysics Data System (ADS)

    Ding, Shiyan

    The systematic identification of multi-component alloys out of the vast composition space is still a daunting task, especially in the development of bulk metallic glasses that are typically based on three or more elements. In order to address this challenge, combinatorial approaches have been proposed. However, previous attempts have not successfully coupled the synthesis of combinatorial libraries with high-throughput characterization methods. The goal of my dissertation is to develop efficient high-throughput characterization methods, optimized to identify glass formers systematically. Here, two innovative approaches have been invented. One is to measure the nucleation temperature in parallel for up to 800 compositions. The composition with the lowest nucleation temperature shows reasonable agreement with the best-known glass forming composition. In addition, the thermoplastic formability of a metallic glass forming system is determined through blow molding a compositional library. Our results reveal that the composition with the largest thermoplastic deformation correlates well with the best-known formability composition. I have demonstrated both methods as powerful tools to develop new bulk metallic glasses.

  7. Auditory-Verbal Music Play Therapy: An Integrated Approach (AVMPT)

    PubMed Central

    Mohammad Esmaeilzadeh, Sahar; Sharifi, Shahla; Tayarani Niknezhad, Hamid

    2013-01-01

    Introduction: Hearing loss occurs when there is a problem with one or more parts of the ear or ears and causes children to have a delay in the language-learning process. Hearing loss affects children's lives and their development. Several approaches have been developed over recent decades to help hearing-impaired children develop language skills. Auditory-verbal therapy (AVT) is one such approach. Recently, researchers have found that music and play have a considerable effect on the communication skills of children, leading to the development of music therapy (MT) and play therapy (PT). There have been several studies which focus on the impact of music on hearing-impaired children. The aim of this article is to review studies conducted in AVT, MT, and PT and their efficacy in hearing-impaired children. Furthermore, the authors aim to introduce an integrated approach of AVT, MT, and PT which facilitates language and communication skills in hearing-impaired children. Materials and Methods: In this article we review studies of AVT, MT, and PT and their impact on hearing-impaired children. To achieve this goal, we searched databases and journals including, for example, Elsevier, Chor Teach, and Military Psychology. We also used reliable websites such as the American Choral Directors Association and Joint Committee on Infant Hearing websites. The websites were reviewed, and keywords from this article were used to find appropriate references. Those articles related to ours in content were selected. Conclusion: AVT, MT, and PT enhance children's communication and language skills from an early age. Each method has a meaningful impact on hearing loss, so by integrating them we have a comprehensive method to facilitate communication and language learning. To achieve this goal, the article offers methods and techniques to perform AVT and MT integrated with PT, leading to an approach which offers all the advantages of these three types of therapy. PMID:24303441

  8. Development and usability testing of a web-based cancer symptom and quality-of-life support intervention.

    PubMed

    Wolpin, S E; Halpenny, B; Whitman, G; McReynolds, J; Stewart, M; Lober, W B; Berry, D L

    2015-03-01

    The feasibility and acceptability of computerized screening and patient-reported outcome measures have been demonstrated in the literature. However, patient-centered management of health information entails two challenges: gathering and presenting data using "patient-tailored" methods and supporting "patient-control" of health information. The design and development of many symptom and quality-of-life information systems have not included opportunities for systematically collecting and analyzing user input. As part of a larger clinical trial, the Electronic Self-Report Assessment for Cancer-II project, participatory design approaches were used to build and test new features and interfaces for patient/caregiver users. The research questions centered on patient/caregiver preferences with regard to the following: (a) content, (b) user interface needs, (c) patient-oriented summary, and (d) patient-controlled sharing of information with family, caregivers, and clinicians. Mixed methods were used with an emphasis on qualitative approaches; focus groups and individual usability tests were the primary research methods. Focus group data were content analyzed, while individual usability sessions were assessed with both qualitative and quantitative methods. We identified 12 key patient/caregiver preferences through focus groups with 6 participants. We implemented seven of these preferences during the iterative design process. We deferred development for some of the preferences due to resource constraints. During individual usability testing (n = 8), we were able to identify 65 usability issues ranging from minor user confusion to critical errors that blocked task completion. The participatory development model that we used led to features and design revisions that were patient centered. We are currently evaluating new approaches for the application interface and for future research pathways. We encourage other researchers to adopt user-centered design approaches when building patient-centered technologies. © The Author(s) 2014.

  9. Epidemiologic methods in mastitis treatment and control.

    PubMed

    Thurmond, M C

    1993-11-01

    Methods and concepts of epidemiology offer means whereby udder health can be monitored and evaluated. Prerequisite to a sound epidemiologic approach is development of measures of mastitis that minimize biases and that account for sensitivity and specificity of diagnostic tests. Mastitis surveillance offers an ongoing and passive system for evaluation of udder health, whereas clinical and observational trials offer a more proactive and developmental approach to improving udder health.

  10. The Value of Developing a Mixed-Methods Program of Research.

    PubMed

    Simonovich, Shannon

    2017-07-01

    This article contributes to the discussion of the value of utilizing mixed methodological approaches to conduct nursing research. To this end, the author of this article proposes creating a mixed-methods program of research over time, where both quantitative and qualitative data are collected and analyzed simultaneously, rather than focusing efforts on designing singular mixed-methods studies. A mixed-methods program of research would allow for the best of both worlds: precision through focus on one method at a time, and the benefits of creating a robust understanding of a phenomenon over the trajectory of one's career through examination from various methodological approaches.

  11. Comparison and combination of "direct" and fragment based local correlation methods: Cluster in molecules and domain based local pair natural orbital perturbation and coupled cluster theories

    NASA Astrophysics Data System (ADS)

    Guo, Yang; Becker, Ute; Neese, Frank

    2018-03-01

    Local correlation theories have been developed in two main flavors: (1) "direct" local correlation methods apply local approximation to the canonical equations and (2) fragment based methods reconstruct the correlation energy from a series of smaller calculations on subsystems. The present work serves two purposes. First, we investigate the relative efficiencies of the two approaches using the domain-based local pair natural orbital (DLPNO) approach as the "direct" method and the cluster in molecule (CIM) approach as the fragment based approach. Both approaches are applied in conjunction with second-order many-body perturbation theory (MP2) as well as coupled-cluster theory with single-, double- and perturbative triple excitations [CCSD(T)]. Second, we have investigated the possible merits of combining the two approaches by performing CIM calculations with DLPNO methods serving as the method of choice for performing the subsystem calculations. Our cluster-in-molecule approach is closely related to but slightly deviates from approaches in the literature since we have avoided real space cutoffs. Moreover, the distant pair correlations neglected in the previous CIM approach are considered approximately. Six very large molecules (503-2380 atoms) were studied. At both MP2 and CCSD(T) levels of theory, the CIM and DLPNO methods show similar efficiency. However, DLPNO methods are more accurate for 3-dimensional systems. While we have found only little incentive for the combination of CIM with DLPNO-MP2, the situation is different for CIM-DLPNO-CCSD(T). This combination is attractive because of (1) the better parallelization opportunities offered by CIM; (2) the methodology being less memory intensive than the genuine DLPNO-CCSD(T) method and, hence, allowing for large calculations on more modest hardware; and (3) the methodology being applicable and efficient in the frequently met cases where the largest subsystem calculation is too large for the canonical CCSD(T) method.
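
    The two flavors can be summarized schematically (generic notation, not taken from the paper): the exact correlation energy decomposes over occupied orbital pairs, which "direct" methods truncate and compress locally, while fragment methods reassemble it from subsystem calculations,

    \[
      E_{\mathrm{corr}} \;=\; \sum_{i \le j} \varepsilon_{ij}
      \;\;\approx\;\;
      \sum_{I} E^{\mathrm{corr}}_{I} \quad \text{(fragment/CIM-type reconstruction)},
    \]

    with DLPNO-type methods instead screening the pair energies \(\varepsilon_{ij}\) and representing the surviving pairs in compact pair natural orbital bases.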

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Jianfei; Wang, Shijun; Turkbey, Evrim B.

    Purpose: Renal calculi are common extracolonic incidental findings on computed tomographic colonography (CTC). This work aims to develop a fully automated computer-aided diagnosis system to accurately detect renal calculi on CTC images. Methods: The authors developed a total variation (TV) flow method to reduce image noise within the kidneys while maintaining the characteristic appearance of renal calculi. Maximally stable extremal region (MSER) features were then calculated to robustly identify calculi candidates. Finally, the authors computed texture and shape features that were imported to support vector machines for calculus classification. The method was validated on a dataset of 192 patients and compared to a baseline approach that detects calculi by thresholding. The authors also compared their method with the detection approaches using anisotropic diffusion and nonsmoothing. Results: At a false positive rate of 8 per patient, the sensitivities of the new method and the baseline thresholding approach were 69% and 35% (p < 0.001) on all calculi from 1 to 433 mm³ in the testing dataset. The sensitivities of the detection methods using anisotropic diffusion and nonsmoothing were 36% and 0%, respectively. The sensitivity of the new method increased to 90% if only larger and more clinically relevant calculi were considered. Conclusions: Experimental results demonstrated that TV-flow and MSER features are efficient means to robustly and accurately detect renal calculi on low-dose, high noise CTC images. Thus, the proposed method can potentially improve diagnosis.
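
    A hedged 2D sketch of this denoise-candidate-classify pipeline is shown below; the paper works on 3D volumes with a TV *flow* and its own feature set, so the library calls here are stand-ins (Chambolle TV denoising from scikit-image, OpenCV MSER, and a scikit-learn SVM on toy features).

        import numpy as np
        import cv2
        from skimage.restoration import denoise_tv_chambolle
        from sklearn.svm import SVC

        def detect_candidates(slice_hu):
            """Denoise one CT slice, then propose candidate regions via MSER."""
            den = denoise_tv_chambolle(slice_hu.astype(np.float64), weight=0.1)
            img8 = cv2.normalize(den, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
            regions, _ = cv2.MSER_create().detectRegions(img8)
            feats = []
            for pts in regions:                     # pts: (N, 2) pixel coordinates
                vals = img8[pts[:, 1], pts[:, 0]]
                feats.append([len(pts), vals.mean(), vals.std()])  # size + texture
            return np.array(feats), regions

        # Classification stage (given labeled training candidates):
        # clf = SVC(kernel="rbf").fit(train_feats, train_labels)
        # keep = clf.predict(candidate_feats) == 1

    The motivation for TV-style smoothing over Gaussian blurring is visible even in this sketch: edges of small bright calculi survive the denoising step and so remain detectable as stable extremal regions.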

  13. Accounting for ecosystem services in life cycle assessment, Part I: a critical review.

    PubMed

    Zhang, Yi; Singh, Shweta; Bakshi, Bhavik R

    2010-04-01

    If life cycle oriented methods are to encourage sustainable development, they must account for the role of ecosystem goods and services, since these form the basis of planetary activities and human well-being. This article reviews methods that are relevant to accounting for the role of nature and that could be integrated into life cycle oriented approaches. These include methods developed by ecologists for quantifying ecosystem services, by ecological economists for monetary valuation, and life cycle methods such as conventional life cycle assessment, thermodynamic methods for resource accounting such as exergy and emergy analysis, variations of the ecological footprint approach, and human appropriation of net primary productivity. Each approach has its strengths: economic methods are able to quantify the value of cultural services; LCA considers emissions and assesses their impact; emergy accounts for supporting services in terms of cumulative exergy; and ecological footprint is intuitively appealing and considers biocapacity. However, no method is able to consider all the ecosystem services, often due to the desire to aggregate all resources in terms of a single unit. This review shows that comprehensive accounting for ecosystem services in LCA requires greater integration among existing methods, hierarchical schemes for interpreting results via multiple levels of aggregation, and greater understanding of the role of ecosystems in supporting human activities. These present many research opportunities that must be addressed to meet the challenges of sustainability.

  14. Issues to consider in the derivation of water quality benchmarks for the protection of aquatic life.

    PubMed

    Schneider, Uwe

    2014-01-01

    While water quality benchmarks for the protection of aquatic life have been in use in some jurisdictions for several decades (USA, Canada, several European countries), more and more countries are now setting up their own national water quality benchmark development programs. In doing so, they either adopt an existing method from another jurisdiction, update on an existing approach, or develop their own new derivation method. Each approach has its own advantages and disadvantages, and many issues have to be addressed when setting up a water quality benchmark development program or when deriving a water quality benchmark. Each of these tasks requires a special expertise. They may seem simple, but are complex in their details. The intention of this paper was to provide some guidance for this process of water quality benchmark development on the program level, for the derivation methodology development, and in the actual benchmark derivation step, as well as to point out some issues (notably the inclusion of adapted populations and cryptic species and points to consider in the use of the species sensitivity distribution approach) and future opportunities (an international data repository and international collaboration in water quality benchmark development).
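
    Since the species sensitivity distribution (SSD) approach is singled out above, a minimal numerical illustration may help: fit a log-normal distribution to species endpoint values and take its 5th percentile (HC5) as the basis for a benchmark. The endpoint values below are invented.

        import numpy as np
        from scipy import stats

        # Toy SSD: species toxicity endpoints in mg/L (invented values).
        endpoints = np.array([0.8, 1.5, 2.3, 4.0, 6.5, 9.1, 15.0, 22.0])

        # Fit a log-normal by fitting a normal to the log-endpoints,
        # then take the 5th percentile (HC5) back on the original scale.
        mu, sigma = stats.norm.fit(np.log(endpoints))
        hc5 = np.exp(stats.norm.ppf(0.05, loc=mu, scale=sigma))
        print(f"HC5 = {hc5:.2f} mg/L")

    Issues raised in the paper, such as whether adapted populations or cryptic species belong in the fitted dataset, enter precisely through the choice of endpoints fed into such a fit.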

  15. Beyond the Condom: Frontiers in Male Contraception

    PubMed Central

    Roth, Mara Y.; Amory, John K.

    2016-01-01

    Nearly half of all pregnancies worldwide are unplanned, despite the numerous contraceptive options available. No new contraceptive method has been developed for men since the invention of the condom. Nevertheless, more than 25% of contraception worldwide relies on male methods. Therefore, novel effective methods of male contraception are of interest. Herein we review the physiologic basis for both male hormonal and nonhormonal methods of contraception. We review the history of male hormonal contraception development, current hormonal agents in development, as well as the potential risks and benefits of male hormonal contraception options for men. Nonhormonal methods reviewed include both pharmacological and mechanical approaches in development, with specific focus on methods which inhibit testicular retinoic acid synthesis and action. Multiple hormonal and nonhormonal methods of male contraception are in the drug development pathway, with the hope that a reversible, reliable, safe method of male contraception will be available to couples in the not too distant future. PMID:26947703

  16. Wildland fire management. Volume 1: Prevention methods and analysis. [systems engineering approach to California fire problems

    NASA Technical Reports Server (NTRS)

    Weissenberger, S. (Editor)

    1973-01-01

    A systems engineering approach is reported for the problem of reducing the number and severity of California's wildland fires. Prevention methodologies are reviewed and cost-benefit models are developed for making preignition decisions.

  17. A Comprehensive Planning Model

    ERIC Educational Resources Information Center

    Temkin, Sanford

    1972-01-01

    Combines elements of the problem solving approach inherent in methods of applied economics and operations research and the structural-functional analysis common in social science modeling to develop an approach for economic planning and resource allocation for schools and other public sector organizations. (Author)

  18. Developing a Research Program Using Qualitative and Quantitative Approaches.

    ERIC Educational Resources Information Center

    Beck, Cheryl Tatano

    1997-01-01

    A research program on postpartum depression is used to illustrate the use of both qualitative and quantitative approaches. The direction of a research program is thus not limited by the type of methods in which a researcher has expertise. (SK)

  19. Nanomaterials, and Occupational Health and Safety—A Literature Review About Control Banding and a Semi-Quantitative Method Proposed for Hazard Assessment.

    NASA Astrophysics Data System (ADS)

    Dimou, Kaotar; Emond, Claude

    2017-06-01

    In recent decades, the control banding (CB) approach has been recognised as a hazard assessment methodology because of its increased importance in the occupational safety, health and hygiene (OSHH) field. According to the American Industrial Hygiene Association, this approach originates from the pharmaceutical industry in the United Kingdom. The aim of the CB approach is to protect more than 90% (or approximately 2.7 billion) of the world's workers who do not have access to OSHH professionals and traditional quantitative risk assessment methods. In other words, CB is a qualitative or semi-quantitative tool designed to prevent occupational accidents by controlling worker exposures to potentially hazardous chemicals in the absence of comprehensive toxicological and exposure data. These criteria correspond very precisely to the development and production of engineered nanomaterials (ENMs). Considering the significant lack of scientific knowledge about work-related health risks posed by ENMs, CB is, in general, appropriate for these issues. Currently, CB can be adapted to the specificities of ENMs; hundreds of nanotechnology products containing ENMs are already on the market. In this context, this qualitative or semi-quantitative approach appears to be relevant for characterising and quantifying the degree of physico-chemical and biological reactivity of ENMs, leading towards better control of human health effects and the safe handling of ENMs in workplaces. A greater understanding of the CB approach is important for managing the risks of handling hazardous substances, such as ENMs, without established occupational exposure limits. In recent years, this topic has garnered much interest, including discussions in many technical papers. Several CB models have been developed, and many countries have created their own nano-specific CB instruments. The aims of this research were to perform a literature review of CB approaches, to classify the main approaches developed worldwide, and then to suggest an original methodology based on the characterisation of the hazard. For this research, our team conducted a systematic literature review covering the past 20 years. This approach is important in understanding the conceptual basis for CB and the models' overall effectiveness. These considerations lead to the proposal of an original hazard assessment method based on physico-chemical and biological characteristics. Such a method should help the entire industry better understand the ability of the CB approach to limit workers' exposure, while identifying the strengths and weaknesses of the approach. Developing this practical method will help to provide relevant recommendations to workers who handle hazardous chemicals such as ENMs and to the general population.

  20. Improved productivity through interactive communication

    NASA Technical Reports Server (NTRS)

    Marino, P. P.

    1985-01-01

    New methods and approaches are being tried and evaluated with the goal of increasing productivity and quality. The underlying concept in all of these approaches, methods or processes is that people require interactive communication to maximize the organization's strengths and minimize impediments to productivity improvement. This paper examines Bendix Field Engineering Corporation's organizational structure and experiences with employee involvement programs. The paper focuses on methods Bendix developed and implemented to open lines of communication throughout the organization. The Bendix approach to productivity and quality enhancement shows that interactive communication is critical to the successful implementation of any productivity improvement program. The paper concludes with an examination of the Bendix methodologies which can be adopted by any corporation in any industry.

  1. On continuous and discontinuous approaches for modeling groundwater flow in heterogeneous media using the Numerical Manifold Method: Model development and comparison

    NASA Astrophysics Data System (ADS)

    Hu, Mengsu; Wang, Yuan; Rutqvist, Jonny

    2015-06-01

    One major challenge in modeling groundwater flow within heterogeneous geological media is that of modeling arbitrarily oriented or intersected boundaries and inner material interfaces. The Numerical Manifold Method (NMM) has recently emerged as a promising method for such modeling, in its ability to handle boundaries, its flexibility in constructing physical cover functions (continuous or with gradient jump), its meshing efficiency with a fixed mathematical mesh (covers), its convenience for enhancing approximation precision, and its integration precision, achieved by simplex integration. In this paper, we report on developing and comparing two new approaches for boundary constraints using the NMM, namely a continuous approach with jump functions and a discontinuous approach with Lagrange multipliers. In the discontinuous Lagrange multiplier method (LMM), the material interfaces are regarded as discontinuities which divide mathematical covers into different physical covers. We define and derive stringent forms of Lagrange multipliers to link the divided physical covers, thus satisfying the continuity requirement of the refraction law. In the continuous Jump Function Method (JFM), the material interfaces are regarded as inner interfaces contained within physical covers. We briefly define jump terms to represent the discontinuity of the head gradient across an interface to satisfy the refraction law. We then make a theoretical comparison between the two approaches in terms of global degrees of freedom, treatment of multiple material interfaces, treatment of small area, treatment of moving interfaces, the feasibility of coupling with mechanical analysis and applicability to other numerical methods. The newly derived boundary-constraint approaches are coded into a NMM model for groundwater flow analysis, and tested for precision and efficiency on different simulation examples. We first test the LMM for a Dirichlet boundary and then test both LMM and JFM for an idealized heterogeneous model, comparing the numerical results with analytical solutions. Then we test both approaches for a heterogeneous model and compare the results of hydraulic head and specific discharge. We show that both approaches are suitable for modeling material boundaries, considering high accuracy for the boundary constraints, the capability to deal with arbitrarily oriented or complexly intersected boundaries, and their efficiency using a fixed mathematical mesh.
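
    The refraction-law constraint that both the LMM and JFM enforce at a material interface \(\Gamma\) between regions of hydraulic conductivity \(K_1\) and \(K_2\) can be written in the standard form (given as context, not quoted from the paper):

    \[
      h_1 = h_2, \qquad
      K_1\,\frac{\partial h_1}{\partial n} \;=\; K_2\,\frac{\partial h_2}{\partial n}
      \quad \text{on } \Gamma,
    \]

    i.e., continuity of hydraulic head and of the normal flux; together these imply the tangent law \(\tan\theta_1/\tan\theta_2 = K_1/K_2\) for the angles between the flow direction and the interface normal on either side. The LMM imposes these conditions through multipliers linking separate physical covers, while the JFM builds the head-gradient jump directly into the cover functions.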

  2. Application of Quality by Design Approach to Bioanalysis: Development of a Method for Elvitegravir Quantification in Human Plasma.

    PubMed

    Baldelli, Sara; Marrubini, Giorgio; Cattaneo, Dario; Clementi, Emilio; Cerea, Matteo

    2017-10-01

    The application of Quality by Design (QbD) principles in clinical laboratories can help to develop an analytical method through a systematic approach, providing a significant advance over the traditional heuristic and empirical methodology. In this work, we applied for the first time the QbD concept in the development of a method for drug quantification in human plasma using elvitegravir as the test molecule. The goal of the study was to develop a fast and inexpensive quantification method, with precision and accuracy as requested by the European Medicines Agency guidelines on bioanalytical method validation. The method was divided into operative units, and for each unit critical variables affecting the results were identified. A risk analysis was performed to select critical process parameters that should be introduced in the design of experiments (DoEs). Different DoEs were used depending on the phase of advancement of the study. Protein precipitation and high-performance liquid chromatography-tandem mass spectrometry were selected as the techniques to be investigated. For every operative unit (sample preparation, chromatographic conditions, and detector settings), a model based on factors affecting the responses was developed and optimized. The obtained method was validated and clinically applied with success. To the best of our knowledge, this is the first investigation thoroughly addressing the application of QbD to the analysis of a drug in a biological matrix applied in a clinical laboratory. The extensive optimization process generated a robust method compliant with its intended use. The performance of the method is continuously monitored using control charts.
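
    As a generic illustration of the DoE step applied per operative unit, a full-factorial enumeration can be written in a few lines; the factor names and levels below are invented for illustration and are not the paper's actual critical process parameters.

        from itertools import product

        # Minimal full-factorial DoE enumeration for one operative unit
        # (hypothetical factors and levels).
        factors = {
            "precipitant_ratio": [2, 3, 4],     # v/v solvent:plasma
            "column_temp_C": [30, 40],
            "flow_mL_min": [0.3, 0.5],
        }
        runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
        print(len(runs), "runs; first:", runs[0])   # 12 runs

    In a QbD workflow, only the critical process parameters surviving the risk analysis enter such a design, keeping the run count tractable.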

  3. Predicting Ice Sheet and Climate Evolution at Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heimbach, Patrick

    2016-02-06

    A main research objective of PISCEES is the development of formal methods for quantifying uncertainties in ice sheet modeling. Uncertainties in simulating and projecting mass loss from the polar ice sheets arise primarily from initial conditions, surface and basal boundary conditions, and model parameters. In general terms, two main chains of uncertainty propagation may be identified: 1. inverse propagation of observation and/or prior onto posterior control variable uncertainties; 2. forward propagation of prior or posterior control variable uncertainties onto those of target output quantities of interest (e.g., climate indices or ice sheet mass loss). A related goal is the development of computationally efficient methods for producing initial conditions for an ice sheet that are close to available present-day observations and essentially free of artificial model drift, which is required in order to be useful for model projections ("initialization problem"). To be of maximum value, such optimal initial states should be accompanied by "useful" uncertainty estimates that account for the different sources of uncertainty, as well as the degree to which the optimum state is constrained by available observations. The PISCEES proposal outlined two approaches for quantifying uncertainties. The first targets the full exploration of the uncertainty in model projections with sampling-based methods and a workflow managed by DAKOTA (the main delivery vehicle for software developed under QUEST). This is feasible for low-dimensional problems, e.g., those with a handful of global parameters to be inferred. This approach can benefit from derivative/adjoint information, but it is not necessary, which is why it is often referred to as "non-intrusive". The second approach makes heavy use of derivative information from model adjoints to address quantifying uncertainty in high dimensions (e.g., basal boundary conditions in ice sheet models). The use of local gradient, or Hessian information (i.e., second derivatives of the cost function), requires additional code development and implementation, and is thus often referred to as an "intrusive" approach. Within PISCEES, MIT has been tasked to develop methods for derivative-based UQ, the "intrusive" approach discussed above. These methods rely on the availability of first (adjoint) and second (Hessian) derivative code, developed through intrusive methods such as algorithmic differentiation (AD). While representing a significant burden in terms of code development, derivative-based UQ is able to cope with very high-dimensional uncertainty spaces. That is, unlike sampling methods (all variations of Monte Carlo), the computational burden is independent of the dimension of the uncertainty space. This is a significant advantage for spatially distributed uncertainty fields, such as three-dimensional initial conditions, three-dimensional parameter fields, or two-dimensional surface and basal boundary conditions. Importantly, uncertainty fields for ice sheet models generally fall into this category.
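
    The "intrusive" derivative-based route can be summarized by two standard Laplace-approximation identities (stated as general context, not as the project's exact formulation): the posterior covariance of the controls \(\mathbf{m}\) is approximated by the inverse Hessian of the cost function \(J\), and the variance of a scalar quantity of interest \(q(\mathbf{m})\) follows by linearized forward propagation:

    \[
      C_{\mathrm{post}} \;\approx\; \left(\nabla^{2}_{\mathbf{m}} J\right)^{-1},
      \qquad
      \sigma_{q}^{2} \;\approx\; (\nabla_{\mathbf{m}} q)^{\mathsf{T}}\, C_{\mathrm{post}}\,(\nabla_{\mathbf{m}} q).
    \]

    Both quantities require only Hessian-vector and gradient evaluations from adjoint/AD code, which is why the cost is insensitive to the dimension of \(\mathbf{m}\).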

  4. Mixture experiment methods in the development and optimization of microemulsion formulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furlanetto, Sandra; Cirri, Marzia; Piepel, Gregory F.

    2011-06-25

    Microemulsion formulations represent an interesting delivery vehicle for lipophilic drugs, improving their solubility and dissolution properties. This work developed effective microemulsion formulations using glyburide (a very poorly-water-soluble hypoglycaemic agent) as a model drug. First, the region of stable microemulsion (ME) formation was identified using a new approach based on mixture experiment methods. A 13-run mixture design was carried out in an experimental region defined by constraints on three components: aqueous, oil, and surfactant/cosurfactant. The transmittance percentage (at 550 nm) of ME formulations (indicative of their transparency and thus of their stability) was chosen as the response variable. The results obtained using the mixture experiment approach corresponded well with those obtained using the traditional approach based on pseudo-ternary phase diagrams. However, the mixture experiment approach required far less experimental effort than the traditional approach. A subsequent 13-run mixture experiment, in the region of stable MEs, was then performed to identify the optimal formulation (i.e., having the best glyburide dissolution properties). Percent drug dissolved and dissolution efficiency were selected as the responses to be maximized. The ME formulation optimized via the mixture experiment approach consisted of 78% surfactant/cosurfactant (a mixture of Tween 20 and Transcutol, 1:1 v/v), 5% oil (Labrafac Hydro) and 17% aqueous (water). The stable region of MEs was identified using mixture experiment methods for the first time.
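
    The analysis behind such a constrained mixture design is typically a Scheffé polynomial fit; the sketch below fits a quadratic Scheffé model to transmittance responses, with invented design points and responses standing in for the actual 13-run data.

        import numpy as np

        # Invented three-component design points (aqueous x1, oil x2,
        # surfactant/cosurfactant x3; each row sums to 1) and %T responses.
        X = np.array([[0.15, 0.07, 0.78], [0.25, 0.05, 0.70], [0.10, 0.10, 0.80],
                      [0.20, 0.10, 0.70], [0.30, 0.05, 0.65], [0.17, 0.05, 0.78],
                      [0.22, 0.08, 0.70]])
        y = np.array([98.0, 72.0, 99.0, 81.0, 55.0, 97.0, 85.0])  # %T at 550 nm

        x1, x2, x3 = X.T
        # Scheffe quadratic: y = b1*x1 + b2*x2 + b3*x3 + b12*x1x2 + b13*x1x3 + b23*x2x3
        # (no intercept, since the components sum to one)
        design = np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])
        coef, *_ = np.linalg.lstsq(design, y, rcond=None)

        def predict(mix):
            """Predict %T for a mixture (a, b, c) with a + b + c = 1."""
            a, b, c = mix
            return coef @ np.array([a, b, c, a * b, a * c, b * c])

        print(predict((0.17, 0.05, 0.78)))

    Contours of such a fitted model over the constrained simplex are what delimit the stable-ME region in place of a hand-mapped pseudo-ternary diagram.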

  5. Developing Emotion-Based Case Formulations: A Research-Informed Method.

    PubMed

    Pascual-Leone, Antonio; Kramer, Ueli

    2017-01-01

    New research-informed methods for case conceptualization that cut across traditional therapy approaches are increasingly popular. This paper presents a trans-theoretical approach to case formulation based on research observations of emotion. The sequential model of emotional processing (Pascual-Leone & Greenberg, 2007) is a process research model that provides concrete markers for therapists to observe the emerging emotional development of their clients. We illustrate how this model can be used by clinicians to track change and to provide a 'clinical map' by which therapists may orient themselves in-session and plan treatment interventions. Emotional processing serves as a trans-theoretical framework for therapists who wish to conduct emotion-based case formulations. First, we present criteria for why this research model translates well into practice. Second, two contrasting case studies are presented to demonstrate the method. The model bridges research with practice by using client emotion as an axis of integration. Key Practitioner Message: Process research on emotion can offer a template for therapists to make case formulations while using a range of treatment approaches. The sequential model of emotional processing provides a 'process map' of concrete markers for therapists to (1) observe the emerging emotional development of their clients, and (2) develop a treatment plan. Copyright © 2016 John Wiley & Sons, Ltd.

  6. An Embedded Statistical Method for Coupling Molecular Dynamics and Finite Element Analyses

    NASA Technical Reports Server (NTRS)

    Saether, E.; Glaessgen, E.H.; Yamakov, V.

    2008-01-01

    The coupling of molecular dynamics (MD) simulations with finite element methods (FEM) yields computationally efficient models that link fundamental material processes at the atomistic level with continuum field responses at higher length scales. The theoretical challenge involves developing a seamless connection along an interface between two inherently different simulation frameworks. Various specialized methods have been developed to solve particular classes of problems. Many of these methods link the kinematics of individual MD atoms with FEM nodes at their common interface, necessarily requiring that the finite element mesh be refined to atomic resolution. Some of these coupling approaches also require simulations to be carried out at 0 K and restrict modeling to two-dimensional material domains due to difficulties in simulating full three-dimensional material processes. In the present work, a new approach to MD-FEM coupling is developed based on a restatement of the standard boundary value problem used to define a coupled domain. The method replaces a direct linkage of individual MD atoms and finite element (FE) nodes with a statistical averaging of atomistic displacements in local atomic volumes associated with each FE node in an interface region. The FEM and MD computational systems are effectively independent and communicate only through an iterative update of their boundary conditions. With the use of statistical averages of the atomistic quantities to couple the two computational schemes, the developed approach is referred to as an embedded statistical coupling method (ESCM). ESCM provides an enhanced coupling methodology that is inherently applicable to three-dimensional domains, avoids discretization of the continuum model to atomic scale resolution, and permits finite temperature states to be applied.
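
    The core ESCM interface operation, statistically averaging atomic displacements over the local volume associated with each FE node, reduces to a few lines. The sketch below is a schematic of that step under the assumption of spherical averaging volumes; it is not the authors' implementation.

    ```python
    import numpy as np

    def nodal_averages(atom_pos, atom_disp, node_pos, radius):
        """Average atomic displacements in the local volume of each interface FE node.

        atom_pos, atom_disp: (n_atoms, 3) arrays from the MD side.
        node_pos: (n_nodes, 3) positions of interface FE nodes.
        radius: assumed radius of the spherical averaging volume per node.
        """
        nodal_disp = np.zeros_like(node_pos)
        for i, xn in enumerate(node_pos):
            mask = np.linalg.norm(atom_pos - xn, axis=1) < radius
            if mask.any():
                nodal_disp[i] = atom_disp[mask].mean(axis=0)  # statistical average
        return nodal_disp

    # in an ESCM-style loop, these averages would update the FEM boundary conditions,
    # and the FEM solution would in turn update the MD boundary region, iteratively
    ```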

  7. Robust Methods for Moderation Analysis with a Two-Level Regression Model.

    PubMed

    Yang, Miao; Yuan, Ke-Hai

    2016-01-01

    Moderation analysis has many applications in social sciences. Most widely used estimation methods for moderation analysis assume that errors are normally distributed and homoscedastic. When these assumptions are not met, the results from a classical moderation analysis can be misleading. For more reliable moderation analysis, this article proposes two robust methods with a two-level regression model when the predictors do not contain measurement error. One method is based on maximum likelihood with Student's t distribution and the other is based on M-estimators with Huber-type weights. An algorithm for obtaining the robust estimators is developed. Consistent estimates of standard errors of the robust estimators are provided. The robust approaches are compared against normal-distribution-based maximum likelihood (NML) with respect to power and accuracy of parameter estimates through a simulation study. Results show that the robust approaches outperform NML under various distributional conditions. Application of the robust methods is illustrated through a real data example. An R program is developed and documented to facilitate the application of the robust methods.
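
    In Python, a Huber-type M-estimator of a moderation model (predictor, moderator, and their interaction) can be obtained with statsmodels; this is a generic sketch with simulated heavy-tailed errors, not the authors' R program.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 300
    x = rng.standard_normal(n)   # predictor
    z = rng.standard_normal(n)   # moderator
    # heavy-tailed (t-distributed) errors violate the normality assumption
    y = 1 + 0.5 * x + 0.3 * z + 0.4 * x * z + rng.standard_t(df=3, size=n)

    X = sm.add_constant(np.column_stack([x, z, x * z]))

    ols = sm.OLS(y, X).fit()                                # normal-theory estimate
    huber = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()  # Huber-weighted M-estimator
    print(ols.params, huber.params)  # interaction term = moderation effect
    ```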

  8. A pilot study to explore the feasibility of using the Clinical Care Classification System for developing a reliable costing method for nursing services.

    PubMed

    Dykes, Patricia C; Wantland, Dean; Whittenburg, Luann; Lipsitz, Stuart; Saba, Virginia K

    2013-01-01

    While nursing activities represent a significant proportion of inpatient care, there are no reliable methods for determining nursing costs based on the actual services provided by the nursing staff. Capture of data to support accurate measurement and reporting on the cost of nursing services is fundamental to effective resource utilization. Adopting standard terminologies that support tracking both the quality and the cost of care could reduce the data entry burden on direct care providers. This pilot study evaluated the feasibility of using a standardized nursing terminology, the Clinical Care Classification System (CCC), for developing a reliable costing method for nursing services. Two different approaches were explored: the Relative Value Unit (RVU) method and the simple cost-to-time method. We found that the simple cost-to-time method was more accurate and more transparent in its derivation than the RVU method and may support a more consistent and reliable approach for costing nursing services.
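
    The simple cost-to-time method amounts to multiplying documented intervention time by a loaded cost rate. The sketch below uses invented CCC-style intervention labels, times, and rates purely for illustration.

    ```python
    # hypothetical CCC-coded interventions with observed minutes per occurrence
    minutes_per_intervention = {"wound care": 25, "medication administration": 10,
                                "patient assessment": 15}
    occurrences = {"wound care": 2, "medication administration": 6,
                   "patient assessment": 3}
    cost_per_minute = 1.20  # assumed fully loaded nursing cost, $/minute

    total = sum(minutes * occurrences[name] * cost_per_minute
                for name, minutes in minutes_per_intervention.items())
    print(f"nursing cost for the shift: ${total:.2f}")
    ```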

  9. Safety assessment in plant layout design using indexing approach: implementing inherent safety perspective. Part 1 - guideword applicability and method description.

    PubMed

    Tugnoli, Alessandro; Khan, Faisal; Amyotte, Paul; Cozzani, Valerio

    2008-12-15

    Layout planning plays a key role in the inherent safety performance of process plants since this design feature controls the possibility of accidental chain-events and the magnitude of possible consequences. A lack of suitable methods to promote the effective implementation of inherent safety in layout design calls for the development of new techniques and methods. In the present paper, a safety assessment approach suitable for layout design in the critical early phase is proposed. The concept of inherent safety is implemented within this safety assessment; the approach is based on an integrated assessment of inherent safety guideword applicability within the constraints typically present in layout design. Application of these guidewords is evaluated along with unit hazards and control devices to quantitatively map the safety performance of different layout options. Moreover, the economic aspects related to safety and inherent safety are evaluated by the method. Specific sub-indices are developed within the integrated safety assessment system to analyze and quantify the hazard related to domino effects. The proposed approach is quick in application, auditable and shares a common framework applicable in other phases of the design lifecycle (e.g. process design). The present work is divided into two parts: Part 1 (current paper) presents the application of inherent safety guidelines in layout design and the index method for safety assessment; Part 2 (accompanying paper) describes the domino hazard sub-index and demonstrates the proposed approach with a case study, illustrating the introduction of inherent safety features in layout design.
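
    Schematically, an index method of this kind aggregates guideword-based sub-indices (e.g., a domino hazard sub-index) into a single score per layout option. The weights and scores below are hypothetical placeholders, not the calibrated values of the authors' method.

    ```python
    # illustrative aggregation of sub-indices for two layout options;
    # all numbers are invented for demonstration only
    options = {
        "layout_A": {"separation_distance": 0.8, "domino_hazard": 0.4, "control_devices": 0.6},
        "layout_B": {"separation_distance": 0.5, "domino_hazard": 0.7, "control_devices": 0.9},
    }
    weights = {"separation_distance": 0.4, "domino_hazard": 0.4, "control_devices": 0.2}

    for name, scores in options.items():
        index = sum(weights[k] * scores[k] for k in weights)  # weighted safety index
        print(name, round(index, 2))
    ```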

  10. eSIP: A Novel Solution-Based Sectioned Image Property Approach for Microscope Calibration

    PubMed Central

    Butzlaff, Malte; Weigel, Arwed; Ponimaskin, Evgeni; Zeug, Andre

    2015-01-01

    Fluorescence confocal microscopy represents one of the central tools in modern science, and a growing amount of research relies on the development of novel microscopic methods. During the last decade numerous microscopic approaches were developed for the investigation of various scientific questions, and former qualitative imaging methods have been replaced by advanced quantitative methods to gain more and more information from a given sample. However, modern microscope systems, complex as they are, require very precise and appropriate calibration routines, in particular when quantitative measurements are to be compared over longer time scales or between different setups. Multispectral beads of sub-resolution size are often used to describe the point spread function and thus the optical properties of the microscope. More recently, a fluorescent layer was utilized to describe the axial profile for each pixel, which allows a spatially resolved characterization. However, fabrication of a thin fluorescent layer with a matching refractive index remains technically unsolved. Therefore, we propose a novel calibration concept for sectioned image property (SIP) measurements which is based on a fluorescent solution and makes the calibration concept available to a broader number of users. Compared to the previous approach, additional information can be obtained with this extended SIP chart approach, including penetration depth, detected number of photons, and illumination profile shape. Furthermore, because the complete profile is fitted, our method is less susceptible to noise. Generally, the extended SIP approach represents a simple and highly reproducible method, allowing setup-independent calibration and alignment procedures, which is mandatory for advanced quantitative microscopy. PMID:26244982
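
    The idea of fitting the complete axial profile per pixel can be sketched with a standard least-squares fit. A Gaussian axial response is assumed here purely for simplicity, and the z-stack intensities are simulated; the actual SIP profile model differs.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def axial_profile(z, amplitude, z0, width, offset):
        # assumed Gaussian model of the sectioned axial response at one pixel
        return amplitude * np.exp(-((z - z0) ** 2) / (2 * width**2)) + offset

    rng = np.random.default_rng(0)
    z = np.linspace(0, 20, 41)  # axial positions (micrometres), hypothetical stack
    intensity = axial_profile(z, 1000, 10, 2.5, 50) + rng.normal(0, 20, z.size)

    p0 = [intensity.max(), z[np.argmax(intensity)], 2.0, intensity.min()]
    (amplitude, z0, width, offset), _ = curve_fit(axial_profile, z, intensity, p0=p0)
    # repeating this fit per pixel yields spatially resolved calibration maps
    ```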

  11. Direct reconstruction of cardiac PET kinetic parametric images using a preconditioned conjugate gradient approach

    PubMed Central

    Rakvongthai, Yothin; Ouyang, Jinsong; Guerin, Bastien; Li, Quanzheng; Alpert, Nathaniel M.; El Fakhri, Georges

    2013-01-01

    Purpose: Our research goal is to develop an algorithm to reconstruct cardiac positron emission tomography (PET) kinetic parametric images directly from sinograms and compare its performance with the conventional indirect approach. Methods: Time activity curves of a NCAT phantom were computed according to a one-tissue compartmental kinetic model with realistic kinetic parameters. The sinograms at each time frame were simulated using the activity distribution for that time frame. The authors reconstructed the parametric images directly from the sinograms by optimizing a cost function, which included the Poisson log-likelihood and a spatial regularization term, using the preconditioned conjugate gradient (PCG) algorithm with the proposed preconditioner. The proposed preconditioner is a diagonal matrix whose diagonal entries are the ratio of the parameter and the sensitivity of the radioactivity associated with that parameter. The authors compared the parametric images reconstructed using the direct approach with those reconstructed using the conventional indirect approach. Results: At the same bias, the direct approach yielded significant relative reductions in standard deviation of 12%-29% and 32%-70% for 50 × 10⁶ and 10 × 10⁶ detected coincidence counts, respectively. Also, the PCG method effectively reached a constant value after only 10 iterations (with numerical convergence achieved after 40-50 iterations), while more than 500 iterations were needed for CG. Conclusions: The authors have developed a novel approach based on the PCG algorithm to directly reconstruct cardiac PET parametric images from sinograms; it yields better estimation of kinetic parameters than the conventional indirect approach, i.e., curve fitting of reconstructed images. The PCG method increases the convergence rate of reconstruction significantly compared to the conventional CG method. PMID:24089922
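
    As a minimal stand-in for the inner solver, the sketch below runs preconditioned conjugate gradient on a symmetric positive-definite system with a diagonal preconditioner. The Jacobi-style preconditioner (inverse of the diagonal) is a generic substitute for the paper's parameter-to-sensitivity ratio, and the system is random rather than a PET Hessian.

    ```python
    import numpy as np
    from scipy.sparse.linalg import cg, LinearOperator

    rng = np.random.default_rng(0)
    n = 200
    B = rng.standard_normal((n, n))
    A = B @ B.T + n * np.eye(n)        # SPD matrix standing in for the cost-function Hessian
    b = rng.standard_normal(n)

    # diagonal preconditioner: M approximates A^{-1} via 1/diag(A)
    d = np.diag(A)
    M = LinearOperator((n, n), matvec=lambda x: x / d)

    iters = {"cg": 0, "pcg": 0}
    x_cg, _ = cg(A, b, callback=lambda xk: iters.__setitem__("cg", iters["cg"] + 1))
    x_pcg, _ = cg(A, b, M=M, callback=lambda xk: iters.__setitem__("pcg", iters["pcg"] + 1))
    print(iters)  # preconditioning typically cuts the iteration count substantially
    ```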

  12. Self-Monitoring Approach for the Modification of Smoking Behavior: A Case Study.

    ERIC Educational Resources Information Center

    Faherty, John K.

    This paper presents a review of relevant literature on treatment approaches for the modification of smoking behavior, followed by an outline of an approach developed by the author to decrease his own rate of cigarette smoking. Studies are reviewed which have used various treatment methods: use of electric shock, satiation and/or use of cigarette…

  13. Work-Based Learning: Effectiveness in Information Systems Training and Development

    ERIC Educational Resources Information Center

    Walters, David

    2006-01-01

    The ability to use methodologies is an essential ingredient in the teaching of Information System techniques and approaches. One method to achieve this is to use a practical approach where students undertake "live" projects with local client organisations. They can then reflect on the approach adopted with the aim of producing a "reflective"…

  14. An approach to large scale identification of non-obvious structural similarities between proteins

    PubMed Central

    Cherkasov, Artem; Jones, Steven JM

    2004-01-01

    Background: A new sequence-independent bioinformatics approach allowing genome-wide searches for proteins with similar three-dimensional structures has been developed. By utilizing the numerical output of sequence threading, it establishes putative non-obvious structural similarities between proteins. When applied to a test set of proteins with known three-dimensional structures, the developed approach was able to recognize structurally similar proteins with high accuracy. Results: The method has been developed to identify pathogenic proteins with low sequence identity and high structural similarity to host analogues. Such protein structure relationships would be hypothesized to arise through convergent evolution or through ancient horizontal gene transfer events, now undetectable using current sequence alignment techniques. Pathogen proteins that could mimic or interfere with host activities would represent candidate virulence factors. The developed approach utilizes the numerical outputs from sequence-structure threading. It identifies potential structural similarity between a pair of proteins by correlating the threading scores of the corresponding two primary sequences against a library of standard folds. This approach allowed up to 64% sensitivity and 99.9% specificity in distinguishing protein pairs with high structural similarity. Conclusion: Preliminary results obtained by comparing the genomes of Homo sapiens and several strains of Chlamydia trachomatis have demonstrated the potential usefulness of the method in identifying bacterial proteins with known or potential roles in virulence. PMID:15147578
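
    The decision step of this approach, correlating two proteins' threading-score vectors over a fold library, can be sketched as follows; the simulated scores and the 0.7 threshold are invented for illustration (the paper tunes its cutoff to reach 99.9% specificity).

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_folds = 500  # size of the standard fold library

    # hypothetical threading scores of two sequences against the fold library;
    # sequence B is constructed to be "structurally similar" to A
    scores_a = rng.standard_normal(n_folds)
    scores_b = 0.8 * scores_a + 0.2 * rng.standard_normal(n_folds)

    r = np.corrcoef(scores_a, scores_b)[0, 1]  # correlation of threading profiles
    similar = r > 0.7                          # assumed decision threshold
    print(r, similar)
    ```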

  15. Planning in the Continuous Operations Environment of the International Space Station

    NASA Technical Reports Server (NTRS)

    Maxwell, Theresa; Hagopian, Jeff

    1996-01-01

    The continuous operation planning approach developed for the operations planning of the International Space Station (ISS) is reported on. The approach was designed to be a robust and cost-effective method. It separates ISS planning into two planning functions: long-range planning for a fixed length planning horizon which continually moves forward as ISS operations progress, and short-range planning which takes a small segment of the long-range plan and develops a detailed operations schedule. The continuous approach is compared with the incremental approach, the short and long-range planning functions are described, and the benefits and challenges of implementing a continuous operations planning approach for the ISS are summarized.

  16. Improvement of retinal blood vessel detection by spur removal and Gaussian matched filtering compensation

    NASA Astrophysics Data System (ADS)

    Xiao, Di; Vignarajan, Janardhan; An, Dong; Tay-Kearney, Mei-Ling; Kanagasingam, Yogi

    2016-03-01

    Retinal photography is a non-invasive and well-accepted clinical diagnostic tool for ocular diseases. Qualitative and quantitative assessment of retinal images is crucial in clinical applications related to ocular diseases. In this paper, we propose approaches for improving the quality of blood vessel detection, building on our initial blood vessel detection methods. A blood vessel spur pruning method was developed for removing blood vessel spurs, both on vessel medial lines and on binary vessel masks, which are caused by artifacts and by side effects of Gaussian matched vessel enhancement. A Gaussian matched filtering compensation method was developed for removing incorrect vessel branches in areas of low illumination. The proposed approaches were applied and tested on color fundus images from one publicly available database and from our diabetic retinopathy screening dataset. Preliminary results demonstrated the robustness and good performance of the proposed approaches and their potential application for improving retinal blood vessel detection.
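
    For context, a basic Gaussian matched filter for vessel enhancement (the stage whose side effects the paper prunes and compensates) can be sketched as below. Kernel size, sigma, and the number of orientations are arbitrary choices, not the authors' settings.

    ```python
    import numpy as np
    from scipy.ndimage import convolve, rotate

    def matched_filter_kernel(sigma=2.0, length=9):
        # zero-mean Gaussian cross-section, constant along the vessel axis;
        # vessels are darker than background, hence the negated profile
        x = np.arange(-3 * int(sigma), 3 * int(sigma) + 1)
        profile = -np.exp(-x**2 / (2 * sigma**2))
        profile -= profile.mean()             # zero mean: flat background responds with 0
        return np.tile(profile, (length, 1))  # replicate along the vessel direction

    def vessel_response(image, n_angles=12):
        kernel = matched_filter_kernel()
        responses = []
        for angle in np.linspace(0, 180, n_angles, endpoint=False):
            k = rotate(kernel, angle, reshape=True, order=1)
            responses.append(convolve(image.astype(float), k))
        return np.max(responses, axis=0)      # strongest response over orientations

    # usage: response = vessel_response(green_channel); mask = response > threshold
    ```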

  17. Developing the DESCARTE Model: The Design of Case Study Research in Health Care.

    PubMed

    Carolan, Clare M; Forbat, Liz; Smith, Annetta

    2016-04-01

    Case study is a long-established research tradition which predates the recent surge in mixed-methods research. Although a myriad of nuanced definitions of case study exist, seminal case study authors agree that the use of multiple data sources typifies this research approach. The expansive case study literature demonstrates a lack of clarity and guidance in designing and reporting this approach to research. Informed by two reviews of the current health care literature, we posit that methodological description in case studies principally focuses on description of case study typology, which impedes the construction of methodologically clear and rigorous case studies. We draw from the case study and mixed-methods literature to develop the DESCARTE model as an innovative approach to the design, conduct, and reporting of case studies in health care. We examine how case study fits within the overall enterprise of qualitatively driven mixed-methods research, and the potential strengths of the model are considered. © The Author(s) 2015.

  18. Reverse engineering of gene regulatory networks.

    PubMed

    Cho, K H; Choo, S M; Jung, S H; Kim, J R; Choi, H S; Kim, J

    2007-05-01

    Systems biology is a multi-disciplinary approach to the study of the interactions of various cellular mechanisms and cellular components. Owing to the development of new technologies that simultaneously measure the expression of genetic information, systems biological studies involving gene interactions are increasingly prominent. In this regard, reconstructing gene regulatory networks (GRNs) forms the basis for the dynamical analysis of gene interactions and related effects on cellular control pathways. Various approaches of inferring GRNs from gene expression profiles and biological information, including machine learning approaches, have been reviewed, with a brief introduction of DNA microarray experiments as typical tools for measuring levels of messenger ribonucleic acid (mRNA) expression. In particular, the inference methods are classified according to the required input information, and the main idea of each method is elucidated by comparing its advantages and disadvantages with respect to the other methods. In addition, recent developments in this field are introduced and discussions on the challenges and opportunities for future research are provided.
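
    Of the inference families such reviews classify, the simplest is co-expression thresholding. The sketch below builds an undirected GRN adjacency matrix from a gene-gene correlation matrix; the expression data are simulated and the threshold is arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_genes, n_samples = 20, 50
    expr = rng.standard_normal((n_genes, n_samples))  # hypothetical mRNA expression profiles

    corr = np.corrcoef(expr)  # gene-gene co-expression matrix
    # edges where |correlation| exceeds an assumed cutoff, self-loops removed
    adjacency = (np.abs(corr) > 0.6) & ~np.eye(n_genes, dtype=bool)
    print(adjacency.sum() // 2, "inferred undirected edges")
    ```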

  19. Data Analysis Approaches for the Risk-Informed Safety Margins Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Alfonsi, Andrea; Maljovec, Daniel P.

    2016-09-01

    In the past decades, several numerical simulation codes have been employed to simulate accident dynamics (e.g., RELAP5-3D, RELAP-7, MELCOR, MAAP). In order to evaluate the impact of uncertainties on accident dynamics, several stochastic methodologies have been coupled with these codes. These stochastic methods range from classical Monte-Carlo and Latin Hypercube sampling to stochastic polynomial methods. Similar approaches have been introduced into the risk and safety community, where stochastic methods (such as RAVEN, ADAPT, MCDET, ADS) have been coupled with safety analysis codes in order to evaluate the safety impact of timing and sequencing of events. These approaches are usually called Dynamic PRA or simulation-based PRA methods. These uncertainty and safety methods usually generate a large number of simulation runs (database storage may be on the order of gigabytes or higher). The scope of this paper is to present a broad overview of methods and algorithms that can be used to analyze and extract information from large data sets containing time dependent data. In this context, "extracting information" means constructing input-output correlations, finding commonalities, and identifying outliers. Some of the algorithms presented here have been developed or are under development within the RAVEN statistical framework.
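
    A minimal example of the kind of data reduction described: collapse each time-dependent run into a few scalar features, then flag outlier runs by z-score. The transients below are simulated random walks; this is not RAVEN code.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    runs = [rng.standard_normal(1000).cumsum() for _ in range(200)]  # simulated transients

    # feature extraction per run: peak value, time of peak, final value
    features = np.array([[r.max(), r.argmax(), r[-1]] for r in runs], dtype=float)

    # standardize and flag runs whose any feature deviates by more than 3 sigma
    z = (features - features.mean(axis=0)) / features.std(axis=0)
    outliers = np.where(np.abs(z).max(axis=1) > 3)[0]
    print("outlier run indices:", outliers)
    ```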

  20. A longitudinal multilevel CFA-MTMM model for interchangeable and structurally different methods

    PubMed Central

    Koch, Tobias; Schultze, Martin; Eid, Michael; Geiser, Christian

    2014-01-01

    One of the key interests in the social sciences is the investigation of change and stability of a given attribute. Although numerous models have been proposed in the past for analyzing longitudinal data, including multilevel and/or latent variable modeling approaches, only a few modeling approaches have been developed for studying construct validity in longitudinal multitrait-multimethod (MTMM) measurement designs. The aim of the present study was to extend the spectrum of current longitudinal modeling approaches for MTMM analysis. Specifically, a new longitudinal multilevel CFA-MTMM model for measurement designs with structurally different and interchangeable methods (called the Latent-State-Combination-Of-Methods model, LS-COM) is presented. Interchangeable methods are methods that are randomly sampled from a set of equivalent methods (e.g., multiple student ratings of teaching quality), whereas structurally different methods are methods that cannot be easily replaced by one another (e.g., teacher ratings, self-ratings, principal ratings). Results of a simulation study indicate that the parameters and standard errors in the LS-COM model are well recovered even in conditions with only five observations per estimated model parameter. The advantages and limitations of the LS-COM model relative to other longitudinal MTMM modeling approaches are discussed. PMID:24860515

  1. The Pixon Method for Data Compression Image Classification, and Image Reconstruction

    NASA Technical Reports Server (NTRS)

    Puetter, Richard; Yahil, Amos

    2002-01-01

    As initially proposed, this program had three goals: (1) continue to develop the highly successful Pixon method for image reconstruction and support other scientists in implementing this technique for their applications; (2) develop image compression techniques based on the Pixon method; and (3) develop artificial intelligence algorithms for image classification based on the Pixon approach for simplifying neural networks. Subsequent to proposal review the scope of the program was greatly reduced, and it was decided to investigate the ability of the Pixon method to provide superior restorations of images compressed with standard image compression schemes, specifically JPEG-compressed images.

  2. Multi-analytical Approaches Informing the Risk of Sepsis

    NASA Astrophysics Data System (ADS)

    Gwadry-Sridhar, Femida; Lewden, Benoit; Mequanint, Selam; Bauer, Michael

    Sepsis is a significant cause of mortality and morbidity and is often associated with increased hospital resource utilization and prolonged intensive care unit (ICU) and hospital stays. The economic burden associated with sepsis is huge. With advances in medicine, there are now aggressive, goal-oriented treatments that can be used to help these patients. If we were able to predict which patients may be at risk for sepsis, we could start treatment early and potentially reduce the risk of mortality and morbidity. Analytic methods currently used in clinical research to determine the risk of a patient developing sepsis may be further enhanced by multi-modal analytic methods that together provide greater precision. Researchers commonly use univariate and multivariate regressions to develop predictive models. We hypothesized that such models could be enhanced by using multiple analytic methods that together provide greater insight. In this paper, we analyze data about patients with and without sepsis using a decision tree approach and a cluster analysis approach. A comparison with a regression approach shows strong similarity among the variables identified, though not an exact match. We compare the variables identified by the different approaches and draw conclusions about their respective predictive capabilities, while considering their clinical significance.
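
    The comparison described, a regression against a decision tree on the same predictors, can be sketched with scikit-learn. The data below are simulated stand-ins for clinical variables, not the study's patient data.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(5)
    X = rng.standard_normal((500, 6))  # hypothetical clinical variables
    # simulated sepsis label driven by variables 0 and 2
    y = (X[:, 0] + 0.5 * X[:, 2] + rng.standard_normal(500) > 1).astype(int)

    tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
    logit = LogisticRegression().fit(X, y)

    # compare which variables each method finds informative
    print("tree importances:", tree.feature_importances_.round(2))
    print("logit coefficients:", logit.coef_.round(2))
    ```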

  3. Incidence, risk factors, and pregnancy outcomes of gestational diabetes mellitus using one-step versus two-step diagnostic approaches: A population-based cohort study in Isfahan, Iran.

    PubMed

    Hosseini, Elham; Janghorbani, Mohsen; Aminorroaya, Ashraf

    2018-06-01

    To study the incidence, risk factors, and pregnancy outcomes associated with gestational diabetes mellitus (GDM) diagnosed with one-step and two-step screening approaches, 1000 pregnant women who were eligible and consented to participate underwent fasting plasma glucose testing at the first prenatal visit (6-14 weeks). The women free from GDM or overt diabetes were screened at 24-28 weeks using the 50-g glucose challenge test (GCT) followed by a 100-g, 3-h oral glucose tolerance test (OGTT) (two-step method). Regardless of the GCT result, all women underwent a 75-g, 2-h OGTT within a one-week interval (one-step method). GDM incidence using the one-step and two-step methods was 9.3% (95% CI: 7.4-11.2) and 4.2% (95% CI: 2.9-5.5), respectively. GDM significantly increased the risk of macrosomia, gestational hypertension, preeclampsia, and cesarean section, and older age and a family history of diabetes significantly increased the risk of developing GDM in both approaches. In the two-step method, higher pre-pregnancy body mass index and lower physical activity during pregnancy, along with a higher rate of earlier cesarean section, also significantly increased the risk of developing GDM. Despite a higher incidence of GDM using the one-step approach, more risk factors for and a stronger effect of GDM on adverse pregnancy outcomes were found when using the two-step approach. Longer follow-up of women with and without GDM may change the results using both approaches. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Sequential Pattern Analysis: Method and Application in Exploring How Students Develop Concept Maps

    ERIC Educational Resources Information Center

    Chiu, Chiung-Hui; Lin, Chien-Liang

    2012-01-01

    Concept mapping is a technique that represents knowledge in graphs. It has been widely adopted in science education and cognitive psychology to aid learning and assessment. To understand the sequential manner in which students develop concept maps, most research relies upon human-dependent, qualitative approaches. This article proposes a method for…

  5. New directions for Artificial Intelligence (AI) methods in optimum design

    NASA Technical Reports Server (NTRS)

    Hajela, Prabhat

    1989-01-01

    Developments and applications of artificial intelligence (AI) methods in the design of structural systems are reviewed. Principal shortcomings in the current approach are emphasized, and the need for some degree of formalism in the development environment for such design tools is underscored. Emphasis is placed on efforts to integrate algorithmic computations in expert systems.

  6. Extended Worksheet Developed According to 5E Model Based on Constructivist Learning Approach

    ERIC Educational Resources Information Center

    Töman, Ufuk; Akdeniz, Ali Riza; Odabasi Çimer, Sabiha; Gürbüz, Fatih

    2013-01-01

    Modern learning theories recommend learner-centered methods in order to achieve the targeted objectives at the desired level of education. In this context, the use of worksheets that are developed for and encourage student participation is considered to be one such method. This research addresses the biology topic of ethyl alcohol fermentation and prepares…

  7. A systematic review and critical appraisal of qualitative metasynthetic practice in public health to develop a taxonomy of operations of reciprocal translation.

    PubMed

    Melendez-Torres, G J; Grant, Sean; Bonell, Chris

    2015-12-01

    Reciprocal translation, the understanding of one study's findings in terms of another's, is the foundation of most qualitative metasynthetic methods. In light of the proliferation of metasynthesis methods, the current review sought to create a taxonomy of operations of reciprocal translation using recently published qualitative metasyntheses. On 19 August 2013, MEDLINE, Embase and PsycINFO were searched. Included articles were full reports of metasyntheses of qualitative studies published in 2012 in English-language peer-reviewed journals. Two reviewers, working independently, screened records, assessed full texts for inclusion and extracted data on methods from each included metasynthesis. Systematic review methods used were summarised, and metasynthetic methods were inductively analysed to develop the taxonomy. Of 61 included metasyntheses, 21 (34%) reported fully replicable search strategies and 51 (84%) critically appraised included studies. Based on methods in these metasyntheses, we developed a taxonomy of reciprocal translation with four overlapping categories: visual representation; key paper integration; data reduction and thematic extraction; and line-by-line coding. This systematic review presents an update on methods and reporting currently used in qualitative metasynthesis. It also goes beyond the proliferation of approaches to offer a parsimonious approach to understanding how reciprocal translations are accomplished across metasynthesis methods. Copyright © 2015 John Wiley & Sons, Ltd.

  8. Bayesian flood forecasting methods: A review

    NASA Astrophysics Data System (ADS)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly minimized. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to perform flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvement. Future research in the context of Bayesian flood forecasting should focus on assimilation of various sources of newly available information and improvement of predictive performance assessment methods.
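
    The essence of a Bayesian forecasting system, turning a deterministic model output plus its uncertainty into a predictive distribution, can be reduced to a conjugate normal update. The numbers below are illustrative and not drawn from any reviewed study.

    ```python
    import numpy as np
    from scipy.stats import norm

    # prior on river stage (m) from the hydrologic model, and a noisy observation
    prior_mean, prior_sd = 4.2, 0.8
    obs, obs_sd = 5.0, 0.5

    # conjugate normal-normal update yields the posterior predictive parameters
    post_var = 1 / (1 / prior_sd**2 + 1 / obs_sd**2)
    post_mean = post_var * (prior_mean / prior_sd**2 + obs / obs_sd**2)

    flood_stage = 5.5  # assumed flood threshold (m)
    p_flood = 1 - norm.cdf(flood_stage, post_mean, np.sqrt(post_var))
    print(f"P(stage > {flood_stage} m) = {p_flood:.3f}")
    ```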

  9. Nanotechnology based approaches in cancer therapeutics

    NASA Astrophysics Data System (ADS)

    Kumer Biswas, Amit; Reazul Islam, Md; Sadek Choudhury, Zahid; Mostafa, Asif; Fahim Kadir, Mohammad

    2014-12-01

    The current decades are marked not by the development of new molecules for the cure of various diseases but rather by the development of new delivery methods for optimum treatment outcomes. Nanomedicine is perhaps playing the biggest role in this regard. Nanomedicine offers numerous advantages over conventional drug delivery approaches and is a particularly hot topic in anticancer research. Nanoparticles (NPs) have many unique properties that enable them to be incorporated in anticancer therapy. This topical review looks at the properties and various forms of NPs and their use in anticancer treatment, recent developments in identifying new delivery approaches, and progress in clinical trials with these newer approaches. Although the outcome of cancer therapy can be improved using nanomedicine, there are still many disadvantages to this approach. We aim to discuss all these issues in this review.

  10. Generalized empirical Bayesian methods for discovery of differential data in high-throughput biology.

    PubMed

    Hardcastle, Thomas J

    2016-01-15

    High-throughput data are now commonplace in biological research. Rapidly changing technologies and applications mean that novel methods for detecting differential behaviour that account for a 'large P, small n' setting are required at an increasing rate. The development of such methods is, in general, done on an ad hoc basis, requiring repeated development cycles and leading to a lack of standardization between analyses. We present here a generalized method for identifying differential behaviour within high-throughput biological data through empirical Bayesian methods. This approach is based on our baySeq algorithm for identification of differential expression in RNA-seq data based on a negative binomial distribution, and in paired data based on a beta-binomial distribution. Here we show how the same empirical Bayesian approach can be applied to any parametric distribution, removing the need for lengthy development of novel methods for differently distributed data. Comparisons with existing methods developed to address specific problems in high-throughput biological data show that these generic methods can achieve equivalent or better performance. A number of enhancements to the basic algorithm are also presented to increase flexibility and reduce computational costs. The methods are implemented in the R baySeq (v2) package, available on Bioconductor http://www.bioconductor.org/packages/release/bioc/html/baySeq.html. tjh48@cam.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
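
    A schematic of an empirical Bayesian comparison for count data (not the baySeq algorithm itself): score a gene by the likelihood of its counts under a shared negative binomial mean versus group-specific means, combined with a prior probability of differential behaviour. The counts, dispersion, and prior are invented, and the plug-in means are a simplification of the full marginalization.

    ```python
    import numpy as np
    from scipy.stats import nbinom

    # hypothetical counts for one gene across two groups of samples
    group_a = np.array([12, 15, 9, 14])
    group_b = np.array([31, 40, 28, 35])
    size = 10.0  # assumed NB dispersion (size) parameter, shared across models

    def nb_loglik(counts, mean, size):
        # scipy parameterizes NB by (n, p); mean = n(1-p)/p  =>  p = n/(n+mean)
        p = size / (size + mean)
        return nbinom.logpmf(counts, size, p).sum()

    both = np.concatenate([group_a, group_b])
    ll_null = nb_loglik(both, both.mean(), size)                       # one shared mean
    ll_alt = (nb_loglik(group_a, group_a.mean(), size)
              + nb_loglik(group_b, group_b.mean(), size))              # group-specific means

    prior_de = 0.1  # assumed prior probability of differential behaviour
    log_odds = (ll_alt + np.log(prior_de)) - (ll_null + np.log(1 - prior_de))
    posterior_de = 1.0 / (1.0 + np.exp(-log_odds))
    print(f"posterior probability of differential behaviour: {posterior_de:.3f}")
    ```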

  11. Regenerated cellulose capsules for controlled drug delivery: Part III. Developing a fabrication method and evaluating extemporaneous utility for controlled-release.

    PubMed

    Bhatt, Bhavik; Kumar, Vijay

    2016-08-25

    In this article, we describe a method that utilizes cellulose dissolved in a dimethyl sulfoxide and paraformaldehyde solvent system to fabricate two-piece regenerated cellulose (RC) hard shell capsules for potential use as an a priori vehicle for oral controlled drug delivery. A systematic evaluation of solution rheology, as well as mechanical, visual and thermal analysis of the resulting capsules, was performed to develop a suitable method to reproducibly fabricate RC hard shell capsule halves. Because of the viscoelastic nature of the cellulose solution, a combination of dip-coating and casting, herein referred to as the dip-casting method, was developed. The dip-casting method was formalized using a two-stage 2² full factorial design approach in order to determine a suitable way to fabricate capsules with minimal variability. Thermal annealing is responsible for imparting shape rigidity to the capsules. A proof-of-concept analysis of the utility of these capsules in controlled drug delivery was performed by evaluating the release of KCl from them as well as from commercially available USP-equivalent formulations. Release of KCl from the cellulose capsules was comparable to an extended-release capsule formulation. Copyright © 2016 Elsevier B.V. All rights reserved.
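
    A 2² full factorial design of the kind used to formalize the dip-casting method is simply the set of coded level combinations for two factors. The factor names below are hypothetical stand-ins, not the paper's actual fabrication settings.

    ```python
    from itertools import product

    # coded levels (-1 = low, +1 = high) for two hypothetical fabrication factors
    factors = ["solution_concentration", "annealing_temperature"]
    design = list(product([-1, +1], repeat=2))  # 2^2 = 4 runs

    for run, levels in enumerate(design, start=1):
        print(run, dict(zip(factors, levels)))
    # a second stage would repeat this around the best settings from stage one
    ```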

  12. Improving Video Game Development: Facilitating Heterogeneous Team Collaboration through Flexible Software Processes

    NASA Astrophysics Data System (ADS)

    Musil, Juergen; Schweda, Angelika; Winkler, Dietmar; Biffl, Stefan

    Based on our observations of Austrian video game software development (VGSD) practices, we identified a lack of systematic process/method support and inefficient collaboration between the various disciplines involved, i.e. engineers and artists. VGSD includes heterogeneous disciplines, e.g. creative arts, game/content design, and software. Improving team collaboration and process support is thus an ongoing challenge in enabling a comprehensive view of game development projects. Lessons learned from software engineering practices can help game developers improve game development processes within a heterogeneous environment. Based on a state-of-the-practice survey in the Austrian games industry, this paper presents (a) first results with a focus on process/method support and (b) a candidate flexible process approach based on Scrum to improve VGSD and team collaboration. Results (a) showed a trend toward highly flexible software processes involving various disciplines and (b) identified the suggested flexible process approach as feasible and useful for project application.

  13. A Proposal of Product Development Collaboration Method Using User Support Information and its Experimental Evaluation

    NASA Astrophysics Data System (ADS)

    Tanaka, Mitsuru; Kataoka, Masatoshi; Koizumi, Hisao

    As the market changes more rapidly and new products continue to become more complex and multifunctional, product development collaboration with competent partners and leading users is increasingly important for bringing successful new products to market in a timely manner. ECM (engineering chain management) and SCM (supply chain management) are supply-side approaches toward this collaboration. In this paper, we propose a demand-side approach toward product development collaboration with users based on the information gathered through user support interactions. The approach and methodology proposed here were applied to a real data set, and their effectiveness was verified.

  14. Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory

    ERIC Educational Resources Information Center

    Long, Haiying

    2017-01-01

    Mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research as an important issue, however, has not been examined as extensively as that of quantitative and qualitative research. Additionally, the previous discussions of validity in mixed methods research focus on research…

  15. Fracture Mechanics Analyses for Interface Crack Problems - A Review

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald; Shivakumar, Kunigal; Raju, Ivatury S.

    2013-01-01

    Recent developments in fracture mechanics analyses of the interfacial crack problem are reviewed. The intent of the review is to renew awareness of the oscillatory singularity at the crack tip of a bimaterial interface and the problems that occur when calculating mode mixity using numerical methods, such as the finite element method in conjunction with the virtual crack closure technique. Established approaches to overcome the nonconvergence of the individual-mode strain energy release rates are reviewed. Many attempts to overcome the nonconvergence issue have appeared in the recent literature, but only a few hold the promise of providing practical solutions. These are the resin interlayer method, the method that chooses the crack tip element size greater than the oscillation zone, the crack tip element method based on plate theory, and the crack surface displacement extrapolation method. Each of the methods has been validated only on a very limited set of simple interface crack problems, and their utility for a wide range of interfacial crack problems is yet to be established.
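
    For reference, the standard VCCT estimate of the mode-I strain energy release rate from crack-tip nodal quantities is shown below with hypothetical values; it is precisely this per-mode split (G_I vs. G_II) that fails to converge with mesh refinement at a bimaterial interface, while the total G remains well defined.

    ```python
    # virtual crack closure technique (VCCT), one crack-tip node pair, 2-D mesh
    # all nodal values are hypothetical, as if read from a finite element solution
    F_y = 120.0       # N, crack-tip nodal force normal to the crack plane
    delta_v = 2.0e-4  # m, relative opening displacement one element behind the tip
    delta_a = 1.0e-3  # m, crack-tip element length (virtual crack extension)
    b = 1.0           # m, unit thickness

    G_I = F_y * delta_v / (2 * delta_a * b)  # mode-I strain energy release rate, J/m^2
    print(f"G_I = {G_I:.1f} J/m^2")
    ```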

  16. Using Mixed Methods and Collaboration to Evaluate an Education and Public Outreach Program (Invited)

    NASA Astrophysics Data System (ADS)

    Shebby, S.; Shipp, S. S.

    2013-12-01

    Traditional indicators (such as the number of participants or Likert-type ratings of participant perceptions) are often used to provide stakeholders with basic information about program outputs and to justify funding decisions. However, use of qualitative methods can strengthen the reliability of these data and provide stakeholders with more meaningful information about program challenges, successes, and ultimate impacts (Stern, Stame, Mayne, Forss, David & Befani, 2012). In this session, presenters will discuss how they used a mixed methods evaluation to determine the impact of an education and public outreach (EPO) program. EPO efforts were intended to foster more effective, sustainable, and efficient utilization of science discoveries and learning experiences through three main goals 1) increase engagement and support by leveraging of resources, expertise, and best practices; 2) organize a portfolio of resources for accessibility, connectivity, and strategic growth; and 3) develop an infrastructure to support coordination. The evaluation team used a mixed methods design to conduct the evaluation. Presenters will first discuss five potential benefits of mixed methods designs: triangulation of findings, development, complementarity, initiation, and value diversity (Greene, Caracelli & Graham, 2005). They will next demonstrate how a 'mix' of methods, including artifact collection, surveys, interviews, focus groups, and vignettes, was included in the EPO project's evaluation design, providing specific examples of how alignment between the program theory and the evaluation plan was best achieved with a mixed methods approach. The presentation will also include an overview of different mixed methods approaches and information about important considerations when using a mixed methods design, such as selection of data collection methods and sources, and the timing and weighting of quantitative and qualitative methods (Creswell, 2003). Ultimately, this presentation will provide insight into how a mixed methods approach was used to provide stakeholders with important information about progress toward program goals. Creswell, J.W. (2003). Research design: Qualitative, quantitative, and mixed approaches. Thousand Oaks, CA: Sage. Greene, J. C., Caracelli, V. J., & Graham, W. D. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255-274. Stern, E; Stame, N; Mayne, J; Forss, K; Davis, R & Befani, B (2012) Broadening the range of designs and methods for impact evaluation. Department for International Development.

  17. pkCSM: Predicting Small-Molecule Pharmacokinetic and Toxicity Properties Using Graph-Based Signatures

    PubMed Central

    2015-01-01

    Drug development has a high attrition rate, with poor pharmacokinetic and safety properties a significant hurdle. Computational approaches may help minimize these risks. We have developed a novel approach (pkCSM) which uses graph-based signatures to develop predictive models of central ADMET properties for drug development. pkCSM performs as well or better than current methods. A freely accessible web server (http://structure.bioc.cam.ac.uk/pkcsm), which retains no information submitted to it, provides an integrated platform to rapidly evaluate pharmacokinetic and toxicity properties. PMID:25860834

  18. Breaking from binaries - using a sequential mixed methods design.

    PubMed

    Larkin, Patricia Mary; Begley, Cecily Marion; Devane, Declan

    2014-03-01

    To outline the traditional worldviews of healthcare research and discuss the benefits and challenges of using mixed methods approaches in contributing to the development of nursing and midwifery knowledge. There has been much debate about the contribution of mixed methods research to nursing and midwifery knowledge in recent years. A sequential exploratory design is used as an exemplar of a mixed methods approach. The study discussed used a combination of focus-group interviews and a quantitative instrument to obtain a fuller understanding of women's experiences of childbirth. In the mixed methods study example, qualitative data were analysed using thematic analysis and quantitative data using regression analysis. Polarised debates about the veracity, philosophical integrity and motivation for conducting mixed methods research have largely abated. A mixed methods approach can contribute to a deeper, more contextual understanding of a variety of subjects and experiences; as a result, it furthers knowledge that can be used in clinical practice. The purpose of the research study should be the main instigator when choosing from an array of mixed methods research designs. Mixed methods research offers a variety of models that can augment investigative capabilities and provide richer data than can a discrete method alone. This paper offers an example of an exploratory, sequential approach to investigating women's childbirth experiences. A clear framework for the conduct and integration of the different phases of the mixed methods research process is provided. This approach can be used by practitioners and policy makers to improve practice.

  19. A Bayesian Approach to Determination of F, D, and Z Values Used in Steam Sterilization Validation.

    PubMed

    Faya, Paul; Stamey, James D; Seaman, John W

    2017-01-01

    For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the well-known D_T, z, and F_0 values that are used in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these values to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. LAY ABSTRACT: For manufacturers of sterile drug products, steam sterilization is a common method used to provide assurance of the sterility of manufacturing equipment and products. The validation of sterilization processes is a regulatory requirement and relies upon the estimation of key resistance parameters of microorganisms. Traditional methods have relied upon point estimates for the resistance parameters. In this paper, we propose a Bayesian method for estimation of the critical process parameters that are evaluated in the development and validation of sterilization processes. A Bayesian approach allows the uncertainty about these parameters to be modeled using probability distributions, thereby providing a fully risk-based approach to measures of sterility assurance. An example is given using the survivor curve and fraction negative methods for estimation of resistance parameters, and we present a means by which a probabilistic conclusion can be made regarding the ability of a process to achieve a specified sterility criterion. © PDA, Inc. 2017.
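
    The point estimates that the Bayesian treatment generalizes are simple to compute: a D-value as the negative reciprocal slope of the log-linear survivor curve, and F_0 by integrating lethality at the reference temperature of 121.1 °C with z = 10 °C. The data below are hypothetical.

    ```python
    import numpy as np

    # D-value from a survivor curve: slope of log10(N) versus exposure time
    t = np.array([0, 2, 4, 6, 8], dtype=float)   # minutes (hypothetical)
    logN = np.array([6.0, 5.1, 4.0, 3.1, 2.0])   # log10 surviving organisms
    slope, intercept = np.polyfit(t, logN, 1)
    D = -1 / slope                               # minutes per log reduction
    print(f"D-value: {D:.2f} min")

    # F0 from a measured temperature profile, reference 121.1 C, z = 10 C
    temps = np.array([110, 115, 118, 121, 122, 121, 117], dtype=float)  # C, per minute
    dt = 1.0                                                            # minutes
    F0 = dt * np.sum(10 ** ((temps - 121.1) / 10.0))
    print(f"F0: {F0:.2f} equivalent minutes at 121.1 C")
    ```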

  20. On the stability of the relationship between reference evapotranspiration and single tree transpiration: test and validation over several irrigated tree orchards

    NASA Astrophysics Data System (ADS)

    Ayyoub, Abdellatif; Er-Raki, Salah; Khabba, Saïd; Merlin, Olivier; César Rodriguez, Julio; Ezzahar, Jamal; Bahlaoui, Ahmed; Chehbouni, Abdelghani

    2016-04-01

    The present work aims to develop a simple approach relating normalized daily sap flow (per unit of leaf area) to daily ET0 (mm/day) calculated by two methods: FAO-Penman-Monteith (FAO-PM) and Hargreaves-Samani (HARG). The data sets used for developing this approach were taken from three experimental sites (olive trees, cv. "Olea europaea L.", olive trees, cv. "Arbequino", and citrus trees, cv. "Clementine Afourar") in the Tensift region around Marrakech, Morocco, and one experimental site (pecan orchard, cv. "Carya illinoinensis, Wangenh. K. Koch") in the Yaqui Valley, northwest Mexico. The results showed that the normalized daily sap flow (volume of transpired water per unit of leaf area) was linearly correlated with ET0 (mm per day) calculated by the FAO-PM method. The coefficient of determination (R2) and the slope of this linear regression varied between 0.71 and 0.97 and between 0.30 and 0.35, respectively, depending on the type of orchard. For the HARG method, the relationship between the two terms is also linear but less accurate (R2 = 0.7), as expected due to the underestimation of ET0 by this method. Afterward, validation of the developed linear relationship was performed over an olive orchard ("Olea europaea L.") where sap flow measurements were available for another (2004) cropping season. The scatter plot between the normalized measured and estimated sap flow based on the FAO-PM method reveals very good agreement (slope = 1, with R2 = 0.83 and RMSE = 0.14 L/m2 leaf area). However, for the estimation of normalized sap flow based on the HARG method, the correlation is more scattered, with some underestimation (5%). A further validation was performed using measurements of evapotranspiration (ET) by an eddy correlation system, and the results showed that the correlation between normalized measured ET and estimated normalized sap flow is better when using the FAO-PM method (RMSE = 0.33 L/m2 leaf area) to estimate ET0 than when using the HARG method (RMSE = 0.51 L/m2 leaf area). Finally, the performance of the developed approach was compared to the traditional dual crop coefficient scheme for estimating plant transpiration. Cross-comparison of these two approaches with the measurement data gave satisfactory results, with an average RMSE of about 0.37 mm/day for both approaches.
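
    For concreteness, the Hargreaves-Samani ET0 formula and the paper's linear sap-flow relation can be sketched as follows; the inputs are invented and the 0.33 slope is an illustrative value within the reported 0.30-0.35 range.

    ```python
    import numpy as np

    def et0_hargreaves(tmin, tmax, ra):
        """Hargreaves-Samani reference evapotranspiration (mm/day).
        ra: extraterrestrial radiation expressed in mm/day of evaporation equivalent."""
        tmean = (tmin + tmax) / 2
        return 0.0023 * ra * (tmean + 17.8) * np.sqrt(tmax - tmin)

    # hypothetical daily inputs for a semi-arid site
    et0 = et0_hargreaves(tmin=14.0, tmax=31.0, ra=16.5)

    # the paper's relation: normalized sap flow ~ slope * ET0
    sap_flow_per_leaf_area = 0.33 * et0  # L per m^2 leaf area per day (illustrative)
    print(f"ET0 = {et0:.2f} mm/day, sap flow ~ {sap_flow_per_leaf_area:.2f} L/m^2/day")
    ```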
