Sample records for developing effective methodologies

  1. Methodological issues associated with preclinical drug development and increased placebo effects in schizophrenia clinical trials.

    PubMed

    Brown, Matt A; Bishnoi, Ram J; Dholakia, Sara; Velligan, Dawn I

    2016-01-20

    Recent failures to detect efficacy in clinical trials investigating pharmacological treatments for schizophrenia raise concerns regarding the potential contribution of methodological shortcomings to this research. This review examines two key methodological issues currently suspected of hampering schizophrenia drug development: 1) limitations on the translational utility of preclinical development models, and 2) methodological challenges posed by increased placebo effects. Recommendations for strategies to address these methodological issues are discussed.

  2. Information technology security system engineering methodology

    NASA Technical Reports Server (NTRS)

    Childs, D.

    2003-01-01

    A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  3. A Proposed Theory Seeded Methodology for Design Based Research into Effective Use of MUVES in Vocational Education Contexts

    ERIC Educational Resources Information Center

    Cochrane, Todd; Davis, Niki; Morrow, Donna

    2013-01-01

    A methodology for design based research (DBR) into effective development and use of Multi-User Virtual Environments (MUVE) in vocational education is proposed. It blends software development with DBR, drawing on two theories selected to inform the methodology. Legitimate peripheral participation (LPP; Lave & Wenger, 1991) provides a filter when…

  4. Integrating Low-Cost Rapid Usability Testing into Agile System Development of Healthcare IT: A Methodological Perspective.

    PubMed

    Kushniruk, Andre W; Borycki, Elizabeth M

    2015-01-01

    The development of more usable and effective healthcare information systems has become a critical issue. In the software industry, methodologies such as agile and iterative development processes have emerged that lead to more effective and usable systems. These approaches emphasize a focus on user needs and promote iterative and flexible development practices. Evaluation and testing of iterative agile development cycles is considered an important part of the agile methodology and iterative processes for system design and re-design. However, the issue of how to effectively integrate usability testing methods into rapid and flexible agile design cycles has yet to be fully explored. In this paper we describe our application of an approach known as low-cost rapid usability testing as it has been applied within agile system development in healthcare. The advantages of the integrative approach are described, along with current methodological considerations.

  5. A methodology to support the development of 4-year pavement management plan.

    DOT National Transportation Integrated Search

    2014-07-01

    A methodology for forming and prioritizing pavement maintenance and rehabilitation (M&R) projects was developed. The Texas Department of Transportation (TxDOT) can use this methodology to generate defensible and cost-effective 4-year pavement man...

  6. Methodology for assessing the effectiveness of access management techniques : executive summary.

    DOT National Transportation Integrated Search

    1998-09-14

    A methodology for assessing the effectiveness of access management techniques on suburban arterial highways is developed. The methodology is described as a seven-step process, as follows: (1) establish the purpose of the analysis; (2) establish the mea...

  7. Methodology for assessing the effectiveness of access management techniques : final report, September 14, 1998.

    DOT National Transportation Integrated Search

    1998-09-14

    A methodology for assessing the effectiveness of access management techniques on suburban arterial highways is developed. The methodology is described as a seven-step process, as follows: (1) establish the purpose of the analysis; (2) establish the mea...

  8. Effects-Based Operations in the Cyber Domain

    DTIC Science & Technology

    2017-05-03

    as the joint targeting methodology. The description that Batschelet gave of the traditional targeting methodology included a process of, "Decide, Detect...technology, requires new planning and methodology to fight back. This paper evaluates current Department of Defense doctrine to look at ways to conduct...developing its cyber tactics, techniques, and procedures, which includes various targeting methodologies, such as the use of effects-based

  9. A methodology for analyzing general categorical data with misclassification errors with an application in studying seat belt effectiveness

    DOT National Transportation Integrated Search

    1977-06-01

    Author's abstract: In this report, a methodology for analyzing general categorical data with misclassification errors is developed and applied to the study of seat belt effectiveness. The methodology assumes the availability of an original large samp...

  10. The Differential Effect of Attentional Condition on Subsequent Vocabulary Development

    ERIC Educational Resources Information Center

    Mohammed, Halah Abdulelah; Majid, Norazman Abdul; Abdullah, Tina

    2016-01-01

    This study addressed the effect of attentional condition on subsequent vocabulary development from a different perspective, one that addressed several potential methodological issues of previous research based on the psycholinguistic notion of the second language learner as a limited-capacity processor. The…

  11. The Combined Effect of Mere Exposure, Counterattitudinal Advocacy, and Art Criticism Methodology on Upper Elementary and Junior High Students' Affect Toward Art Works.

    ERIC Educational Resources Information Center

    Hollingsworth, Patricia

    1983-01-01

    Results indicated that, for elementary students, art criticism was more effective than a combination of methodologies for developing positive affect toward art works. For junior high students, the combination methodology was more effective than art criticism, the exposure method, or the counterattitudinal advocacy method. (Author/SR)

  12. Development of a weight/sizing design synthesis computer program. Volume 1: Program formulation

    NASA Technical Reports Server (NTRS)

    Garrison, J. M.

    1973-01-01

    The development of a weight/sizing design synthesis methodology for use in support of the main line space shuttle program is discussed. The methodology has a minimum number of data inputs and quick turnaround capability. The methodology makes it possible to: (1) make weight comparisons between current shuttle configurations and proposed changes, (2) determine the effects of various subsystem trades on total system weight, and (3) determine the effects of weight on performance and performance on weight.

  13. TRAC Innovative Visualization Techniques

    DTIC Science & Technology

    2016-11-14

    Therefore, TRAC analysts need a way to analyze the effectiveness of their visualization design choices. Currently, TRAC does not have a methodology...to analyze visualizations used to support an analysis story. Our research team developed a visualization design methodology to create effective...visualizations that support an analysis story. First, we based our methodology on the latest research on design thinking, cognitive learning, and

  14. Evaluation of Urban After-School Programs: Effective Methodologies for a Diverse and Political Environment.

    ERIC Educational Resources Information Center

    Frank, Martina W.; Walker-Moffat, Wendy

    This study considered how 25 highly diverse after-school programs with funding of $5.6 million were evaluated during a 10-month period. The paper describes the evaluation methodologies used and determines which methodologies were most effective within a diverse and political context. The Bayview Fund for Youth Development (name assumed for…

  15. Prioritization Methodology for Chemical Replacement

    NASA Technical Reports Server (NTRS)

    Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.

    1993-01-01

    This project serves to define an appropriate methodology for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results are being implemented as a guideline for consideration for current NASA propulsion systems.

  16. Three-dimensional viscous design methodology for advanced technology aircraft supersonic inlet systems

    NASA Technical Reports Server (NTRS)

    Anderson, B. H.

    1983-01-01

    A broad program to develop advanced, reliable, and user oriented three-dimensional viscous design techniques for supersonic inlet systems, and encourage their transfer into the general user community is discussed. Features of the program include: (1) develop effective methods of computing three-dimensional flows within a zonal modeling methodology; (2) ensure reasonable agreement between said analysis and selective sets of benchmark validation data; (3) develop user orientation into said analysis; and (4) explore and develop advanced numerical methodology.

  17. Evaluating software development by analysis of changes: The data from the software engineering laboratory

    NASA Technical Reports Server (NTRS)

    1982-01-01

    An effective data collection methodology for evaluating software development methodologies was applied to four different software development projects. Goals of the data collection included characterizing changes and errors, characterizing projects and programmers, identifying effective error detection and correction techniques, and investigating ripple effects. The data collected consisted of changes (including error corrections) made to the software after code was written and baselined, but before testing began. Data collection and validation were concurrent with software development. Changes reported were verified by interviews with programmers.

  18. Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott

    1995-01-01

    This methodology serves to define a system for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi-quantitative approach derived from quality function deployment techniques (QFD Matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.
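
    For readers unfamiliar with QFD-style prioritization, the sketch below illustrates the general idea of a weighted criteria matrix. The criteria weights and candidate scores are hypothetical placeholders for illustration only; they are not values from the NASA study.

        # Illustrative QFD-style prioritization: each candidate replacement technology
        # receives a weighted sum of criterion scores. All weights and ratings are
        # hypothetical examples, not data from the cited work.

        criteria = {                  # criterion -> relative weight (sums to 1.0)
            "environmental": 0.30,
            "cost":          0.20,
            "safety":        0.25,
            "reliability":   0.15,
            "programmatic":  0.10,
        }

        candidates = {                # candidate -> score (1 = poor, 9 = excellent) per criterion
            "replacement solvent A": {"environmental": 9, "cost": 3, "safety": 7,
                                      "reliability": 5, "programmatic": 4},
            "replacement solvent B": {"environmental": 5, "cost": 7, "safety": 6,
                                      "reliability": 8, "programmatic": 6},
        }

        def priority(scores):
            """Weighted sum of criterion scores for one candidate."""
            return sum(criteria[c] * scores[c] for c in criteria)

        # Rank candidates from highest to lowest weighted score.
        for name, scores in sorted(candidates.items(), key=lambda kv: priority(kv[1]), reverse=True):
            print(f"{name}: {priority(scores):.2f}")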

  19. 'Emerging technologies for the changing global market' - Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt

    1993-01-01

    This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.

  20. Development of a design methodology for asphalt treated mixtures.

    DOT National Transportation Integrated Search

    2013-12-01

    This report summarizes the results of a study that was conducted to develop a simplified design methodology for asphalt treated mixtures that are durable, stable, constructible, and cost effective through the examination of the performance of mix...

  1. U.S. EPA'S ACUTE REFERENCE EXPOSURE METHODOLOGY FOR ACUTE INHALATION EXPOSURES

    EPA Science Inventory

    The US EPA National Center for Environmental Assessment has developed a methodology to derive acute inhalation toxicity benchmarks, called acute reference exposures (AREs), for noncancer effects. The methodology provides guidance for the derivation of chemical-specific benchmark...

  2. Development of Chemical Engineering Course Methods Using Action Research: Case Study

    ERIC Educational Resources Information Center

    Virkki-Hatakka, Terhi; Tuunila, Ritva; Nurkka, Niina

    2013-01-01

    This paper reports on the systematic development of a teaching methodology for two chemical engineering courses. The aim was to improve the quality of teaching to achieve expected learning outcomes more effectively. The development was carried out over a period of several years based on an action research methodology with data systematically…

  3. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama

    2003-01-01

    This effort is to investigate probabilistic life prediction methodologies for ceramic matrix composites and MicroElectroMechanical Systems (MEMS) and to analyze designs that determine stochastic properties of MEMS. For CMC's this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and if feasible, run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
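
    As background on the size effect mentioned above, the standard two-parameter weakest-link Weibull model (a textbook form, not necessarily the exact formulation used in CARES/Life) gives the failure probability of a specimen of volume V under stress \sigma as

        P_f(\sigma, V) = 1 - \exp\left[-\frac{V}{V_0}\left(\frac{\sigma}{\sigma_0}\right)^{m}\right]

    where m is the Weibull modulus and \sigma_0 is the characteristic strength of the reference volume V_0. At equal failure probability the strength scales as \sigma \propto (V_0/V)^{1/m}, so larger (or thicker-film) specimens are statistically weaker; confirming this trend is what a bulge-test size-effect study would check.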

  4. La efectividad de la educacion a distancia como metodologia en la desarrollo de destrezas de pensamiento (Effectiveness of Distance Education as a Methodology for Developing Thinking Skills).

    ERIC Educational Resources Information Center

    Melendez Alicea, Juan

    1992-01-01

    Presents steps taken in designing, justifying, and implementing an experimental study designed to investigate the effectiveness of distance education as a methodology for developing thinking skills. A discussion reviews major findings of the study by comparing student experiences from multimedia distance education and student experiences from…

  5. Preliminary methodology to assess the national and regional impact of U.S. wind energy development on birds and bats

    USGS Publications Warehouse

    Diffendorfer, James E.; Beston, Julie A.; Merrill, Matthew D.; Stanton, Jessica C.; Corum, Margo D.; Loss, Scott R.; Thogmartin, Wayne E.; Johnson, Douglas H.; Erickson, Richard A.; Heist, Kevin W.

    2015-01-01

    Components of the methodology are based on simplifying assumptions and require information that, for many species, may be sparse or unreliable. These assumptions are presented in the report and should be carefully considered when using output from the methodology. In addition, this methodology can be used to recommend species for more intensive demographic modeling or highlight those species that may not require any additional protection because effects of wind energy development on their populations are projected to be small.

  6. Non-linear mixed effects modeling - from methodology and software development to driving implementation in drug development science.

    PubMed

    Pillai, Goonaseelan Colin; Mentré, France; Steimer, Jean-Louis

    2005-04-01

    Few scientific contributions have made significant impact unless there was a champion who had the vision to see the potential for its use in seemingly disparate areas, and who then drove active implementation. In this paper, we present a historical summary of the development of non-linear mixed effects (NLME) modeling up to the more recent extensions of this statistical methodology. The paper places strong emphasis on the pivotal role played by Lewis B. Sheiner (1940-2004), who used this statistical methodology to elucidate solutions to real problems identified in clinical practice and in medical research and on how he drove implementation of the proposed solutions. A succinct overview of the evolution of the NLME modeling methodology is presented as well as ideas on how its expansion helped to provide guidance for a more scientific view of (model-based) drug development that reduces empiricism in favor of critical quantitative thinking and decision making.
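
    For orientation, the NLME framework referred to here is commonly written in the following generic hierarchical form (a standard textbook statement, not a formula from this paper):

        y_{ij} = f(x_{ij}, \phi_i) + \epsilon_{ij}, \qquad \phi_i = g(\theta, \eta_i), \qquad \eta_i \sim N(0, \Omega), \quad \epsilon_{ij} \sim N(0, \sigma^2)

    where y_{ij} is the j-th observation on individual i, f is the structural (e.g., pharmacokinetic) model, \theta are fixed population parameters, and \eta_i are random individual effects. Estimating \theta, \Omega, and \sigma^2 jointly across sparse per-subject data is the core of the population approach that Sheiner championed.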

  7. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    State-of-the-art analytical techniques for the development and verification of digital flight control software are studied and a practical designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error preventing and detecting capabilities of the chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible will be documented.

  8. Performance evaluation in full-mission simulation - Methodological advances and research challenges. [in air transport operations

    NASA Technical Reports Server (NTRS)

    Chidester, Thomas R.; Kanki, Barbara G.; Helmreich, Robert L.

    1989-01-01

    The crew-factors research program at NASA Ames has developed a methodology for studying the impact of a variety of variables on the effectiveness of crews flying realistic but high workload simulated trips. The validity of investigations using the methodology is enhanced by careful design of full-mission scenarios, performance assessment using converging sources of data, and recruitment of representative subjects. Recently, portions of this methodology have been adapted for use in assessing the effectiveness of crew coordination among participants in line-oriented flight training.

  9. The Development of Methodologies for Determining Non-Linear Effects in Infrasound Sensors

    DTIC Science & Technology

    2010-09-01

    ...the past year, four new infrasound sensor designs were evaluated for common performance characteristics, i.e., power consumption, response (amplitude and phase), noise, full scale, and dynamic range. In the process of evaluating a fifth infrasound sensor, which is an update of an original design

  10. A framework for assessing the adequacy and effectiveness of software development methodologies

    NASA Technical Reports Server (NTRS)

    Arthur, James D.; Nance, Richard E.

    1990-01-01

    Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.

  11. Event-driven, pattern-based methodology for cost-effective development of standardized personal health devices.

    PubMed

    Martínez-Espronceda, Miguel; Trigo, Jesús D; Led, Santiago; Barrón-González, H Gilberto; Redondo, Javier; Baquero, Alfonso; Serrano, Luis

    2014-11-01

    Experiences applying standards in personal health devices (PHDs) show an inherent trade-off between interoperability and costs (in terms of processing load and development time). Therefore, reducing hardware and software costs as well as time-to-market is crucial for standards adoption. The ISO/IEEE11073 PHD family of standards (also referred to as X73PHD) provides interoperable communication between PHDs and aggregators. Nevertheless, the responsibility of achieving inexpensive implementations of X73PHD in limited resource microcontrollers falls directly on the developer. Hence, the authors previously presented a methodology based on patterns to implement X73-compliant PHDs into devices with low-voltage low-power constraints. That version was based on multitasking, which required additional features and resources. This paper therefore presents an event-driven evolution of the patterns-based methodology for cost-effective development of standardized PHDs. The results of comparing the two versions showed that the mean decreases in memory consumption and cycles of latency are 11.59% and 45.95%, respectively. In addition, several enhancements in terms of cost-effectiveness and development time can be derived from the new version of the methodology. Therefore, the new approach could help in producing cost-effective X73-compliant PHDs, which in turn could foster the adoption of standards.

  12. Building a Three-Dimensional Nano-Bio Interface for Aptasensing: An Analytical Methodology Based on Steric Hindrance Initiated Signal Amplification Effect.

    PubMed

    Du, Xiaojiao; Jiang, Ding; Hao, Nan; Qian, Jing; Dai, Liming; Zhou, Lei; Hu, Jianping; Wang, Kun

    2016-10-04

    The development of novel detection methodologies in electrochemiluminescence (ECL) aptasensor fields with simplicity and ultrasensitivity is essential for constructing biosensing architectures. Herein, a facile, specific, and sensitive methodology was developed for the first time for quantitative detection of microcystin-LR (MC-LR), based on a steric hindrance amplifying effect between the aptamer and target analytes assisted by three-dimensional boron and nitrogen codoped graphene hydrogels (BN-GHs). The recognition reaction was monitored by quartz crystal microbalance (QCM) to validate the possible steric hindrance effect. First, the BN-GHs were synthesized via a self-assembled hydrothermal method and then applied as the Ru(bpy)₃²⁺ immobilization platform for further loading the biomolecule aptamers due to their nanoporous structure and large specific surface area. Interestingly, we discovered for the first time that, without the aid of a conventional double-stranded DNA configuration, such three-dimensional nanomaterials can directly amplify the steric hindrance effect between the aptamer and target analytes to a detectable level, and this facile methodology could be used for an exquisite assay. With MC-LR as a model, this novel ECL biosensor showed a high sensitivity and a wide linear range. This strategy supplies a simple and versatile platform for specific and sensitive determination of a wide range of aptamer-related targets, implying that three-dimensional nanomaterials would play a crucial role in engineering and developing novel detection methodologies for ECL aptasensing fields.

  13. Incorporating Sustainability Content and Pedagogy through Faculty Development

    ERIC Educational Resources Information Center

    Hurney, Carol A.; Nash, Carole; Hartman, Christie-Joy B.; Brantmeier, Edward J.

    2016-01-01

    Purpose: Key elements of a curriculum are presented for a faculty development program that integrated sustainability content with effective course design methodology across a variety of disciplines. The study aims to present self-reported impacts for a small number of faculty participants and their courses. Design/methodology/approach: A yearlong…

  14. Toward quantifying the effectiveness of water trading under uncertainty.

    PubMed

    Luo, B; Huang, G H; Zou, Y; Yin, Y Y

    2007-04-01

    This paper presents a methodology for quantifying the effectiveness of water-trading under uncertainty, by developing an optimization model based on the interval-parameter two-stage stochastic program (TSP) technique. In the study, the effectiveness of a water-trading program is measured by the water volume that can be released through trading from a statistical point of view. The methodology can also deal with recourse water allocation problems generated by randomness in water availability and, at the same time, tackle uncertainties expressed as intervals in the trading system. The developed methodology was tested with a hypothetical water-trading program in an agricultural system in the Swift Current Creek watershed, Canada. Study results indicate that the methodology can effectively measure the effectiveness of a trading program through estimating the water volume being released through trading in a long-term view. A sensitivity analysis was also conducted to analyze the effects of different trading costs on the trading program. It shows that the trading efforts would become ineffective when the trading costs are too high. The case study also demonstrates that the trading program is more effective in a dry season when total water availability is in shortage.
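
    For reference, a generic interval-parameter two-stage stochastic program of the kind described above (written here in its textbook form, not as the exact model of this study) is

        \min_{x} \; c^{\pm T} x + \mathbb{E}_{\xi}\left[ \min_{y} \; q^{\pm T} y \;\; \text{s.t.} \;\; W y \ge h(\xi) - T x, \; y \ge 0 \right]

    where the first-stage decisions x (e.g., water allocation or trading targets) are fixed before the random water availability \xi is realized, the second-stage recourse y corrects any shortfall, and the superscript \pm marks coefficients known only as intervals. The effectiveness of the trading program is then read off as the expected water volume released through trading across the scenarios.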

  15. 32 CFR 310.38 - Training methodology and procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... procedures. (a) Each DoD Component is responsible for the development of training procedures and methodology... widest possible audience. Web-based training and video conferencing have been effective means to provide...

  16. Methodological standards and patient-centeredness in comparative effectiveness research: the PCORI perspective.

    PubMed

    2012-04-18

    Rigorous methodological standards help to ensure that medical research produces information that is valid and generalizable, and are essential in patient-centered outcomes research (PCOR). Patient-centeredness refers to the extent to which the preferences, decision-making needs, and characteristics of patients are addressed, and is the key characteristic differentiating PCOR from comparative effectiveness research. The Patient Protection and Affordable Care Act signed into law in 2010 created the Patient-Centered Outcomes Research Institute (PCORI), which includes an independent, federally appointed Methodology Committee. The Methodology Committee is charged to develop methodological standards for PCOR. The 4 general areas identified by the committee in which standards will be developed are (1) prioritizing research questions, (2) using appropriate study designs and analyses, (3) incorporating patient perspectives throughout the research continuum, and (4) fostering efficient dissemination and implementation of results. A Congressionally mandated PCORI methodology report (to be issued in its first iteration in May 2012) will begin to provide standards in each of these areas, and will inform future PCORI funding announcements and review criteria. The work of the Methodology Committee is intended to enable generation of information that is relevant and trustworthy for patients, and to enable decisions that improve patient-centered outcomes.

  17. Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)

    NASA Astrophysics Data System (ADS)

    Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.

    2014-04-01

    A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimations are based on numerical model results, which provide an appropriate spatio-temporal analysis framework to guarantee an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and the conceptual approaches as a comprehensive and practical management tool.

  18. Emerging technologies for the changing global market

    NASA Technical Reports Server (NTRS)

    Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt

    1993-01-01

    This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi-quantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.

  19. A methodology for collecting valid software engineering data

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Weiss, David M.

    1983-01-01

    An effective data collection method for evaluating software development methodologies and for studying the software development process is described. The method uses goal-directed data collection to evaluate methodologies with respect to the claims made for them. Such claims are used as a basis for defining the goals of the data collection, establishing a list of questions of interest to be answered by data analysis, defining a set of data categorization schemes, and designing a data collection form. The data to be collected are based on the changes made to the software during development, and are obtained when the changes are made. To insure accuracy of the data, validation is performed concurrently with software development and data collection. Validation is based on interviews with those people supplying the data. Results from using the methodology show that data validation is a necessary part of change data collection. Without it, as much as 50% of the data may be erroneous. Feasibility of the data collection methodology was demonstrated by applying it to five different projects in two different environments. The application showed that the methodology was both feasible and useful.

  20. Development of design and analysis methodology for composite bolted joints

    NASA Astrophysics Data System (ADS)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/±45/90) family of laminates using filled hole and unnotched test data.

  1. Four applications of a software data collection and analysis methodology

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Selby, Richard W., Jr.

    1985-01-01

    The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.

  2. EVALUATING THE SUSTAINABILITY OF GREEN CHEMISTRIES

    EPA Science Inventory

    The U.S. EPA's National Risk Management Research Laboratory is developing a methodology for the evaluation of reaction chemistries. This methodology, called GREENSCOPE (Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Objective Proc...

  3. INHALATION EXPOSURE-RESPONSE METHODOLOGY

    EPA Science Inventory

    The Inhalation Exposure-Response Analysis Methodology Document is expected to provide guidance on the development of the basic toxicological foundations for deriving reference values for human health effects, focusing on the hazard identification and dose-response aspects of the ...

  4. Evidence-based development and evaluation of mobile cognitive support apps for people on the autism spectrum: methodological conclusions from two R+D projects.

    PubMed

    Gyori, Miklos; Stefanik, Krisztina; Kanizsai-Nagy, Ildikó

    2015-01-01

    A growing body of evidence confirms that mobile digital devices have key potential as assistive/educational tools for people with autism spectrum disorders. The aim of this paper is to outline key aspects of development and evaluation methodologies that build on, and provide, systematic evidence on the effects of using such apps. We rely on the results of two R+D projects, both using quantitative and qualitative methods to support development and to evaluate the developed apps (n=54 and n=22). Analyzing methodological conclusions from these studies, we outline some guidelines for an 'ideal' R+D methodology, but we also point to important trade-offs between the need for best systematic evidence and the limitations on development time and costs. We see these trade-offs as a key issue to be resolved in this field.

  5. Collaborative Action Research in the Context of Developmental Work Research: A Methodological Approach for Science Teachers' Professional Development

    ERIC Educational Resources Information Center

    Piliouras, Panagiotis; Lathouris, Dimitris; Plakitsi, Katerina; Stylianou, Liana

    2015-01-01

    The paper refers to the theoretical establishment and brief presentation of collaborative action research with the characteristics of "developmental work research" as an effective methodological approach so that science teachers develop themselves professionally. A specific case study is presented, in which we aimed to transform the…

  6. Development of Management Methodology for Engineering Production Quality

    NASA Astrophysics Data System (ADS)

    Gorlenko, O.; Miroshnikov, V.; Borbatc, N.

    2016-04-01

    The authors propose four directions for developing a quality management methodology for engineering products that implements the requirements of the new international standard ISO 9001:2015: analysis of the context of the organization taking stakeholders into account, the use of risk management, management of in-house knowledge, and assessment of enterprise activity against effectiveness criteria.

  7. A systematic review of studies evaluating Australian indigenous community development projects: the extent of community participation, their methodological quality and their outcomes.

    PubMed

    Snijder, Mieke; Shakeshaft, Anthony; Wagemakers, Annemarie; Stephens, Anne; Calabria, Bianca

    2015-11-21

    Community development is a health promotion approach identified as having great potential to improve Indigenous health, because of its potential for extensive community participation. There has been no systematic examination of the extent of community participation in community development projects and little analysis of their effectiveness. This systematic review aims to identify the extent of community participation in community development projects implemented in Australian Indigenous communities, critically appraise the qualitative and quantitative methods used in their evaluation, and summarise their outcomes. Ten electronic peer-reviewed databases and two electronic grey literature databases were searched for relevant studies published between 1990 and 2015. The level of community participation and the methodological quality of the qualitative and quantitative components of the studies were assessed against standardised criteria. Thirty one evaluation studies of community development projects were identified. Community participation varied between different phases of project development, generally high during project implementation, but low during the evaluation phase. For the majority of studies, methodological quality was low and the methods were poorly described. Although positive qualitative or quantitative outcomes were reported in all studies, only two studies reported statistically significant outcomes. Partnerships between researchers, community members and service providers have great potential to improve methodological quality and community participation when research skills and community knowledge are integrated to design, implement and evaluate community development projects. The methodological quality of studies evaluating Australian Indigenous community development projects is currently too weak to confidently determine the cost-effectiveness of community development projects in improving the health and wellbeing of Indigenous Australians. Higher quality studies evaluating community development projects would strengthen the evidence base.

  8. [Progress in methodological characteristics of clinical practice guideline for osteoarthritis].

    PubMed

    Xing, D; Wang, B; Lin, J H

    2017-06-01

    At present, several clinical practice guidelines for the treatment of osteoarthritis have been developed by institutes or societies. The ultimate purpose of developing clinical practice guidelines is to standardize the treatment process for osteoarthritis effectively. However, the methodologies used in developing clinical practice guidelines may influence their translation and application in treating osteoarthritis. The present study summarizes the methodological features of individual clinical practice guidelines and presents the tools for quality evaluation of clinical practice guidelines. The limitations of current Chinese osteoarthritis guidelines are also indicated. The review article might help relevant institutions improve the quality of guideline development and clinical translation.

  9. HETEROGENEITY IN TREATMENT EFFECT AND COMPARATIVE EFFECTIVENESS RESEARCH.

    PubMed

    Luo, Zhehui

    2011-10-01

    The ultimate goal of comparative effectiveness research (CER) is to develop and disseminate evidence-based information about which interventions are most effective for which patients under what circumstances. To achieve this goal it is crucial that researchers in methodology development find appropriate methods for detecting the presence and sources of heterogeneity in treatment effect (HTE). Compared with the typically reported average treatment effect (ATE) in randomized controlled trials and non-experimental (i.e., observational) studies, identifying and reporting HTE better reflect the nature and purposes of CER. Methodologies of CER include meta-analysis, systematic review, design of experiments that encompasses HTE, and statistical correction of various types of estimation bias, which is the focus of this review.
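
    In the potential-outcomes notation commonly used in this literature, the distinction drawn above can be summarized as (standard definitions, not formulas taken from this paper)

        \text{ATE} = \mathbb{E}[Y(1) - Y(0)], \qquad \text{CATE}(x) = \mathbb{E}[Y(1) - Y(0) \mid X = x]

    where Y(1) and Y(0) are potential outcomes under treatment and control and X collects patient characteristics. HTE is present whenever CATE(x) varies with x, so reporting only the ATE can mask the subgroup differences that CER aims to surface.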

  10. Methodologic ramifications of paying attention to sex and gender differences in clinical research.

    PubMed

    Prins, Martin H; Smits, Kim M; Smits, Luc J

    2007-01-01

    Methodologic standards for studies on sex and gender differences should be developed to improve reporting of studies and facilitate their inclusion in systematic reviews. The essence of these studies lies within the concept of effect modification. This article reviews important methodologic issues in the design and reporting of pharmacogenetic studies. Differences in effect based on sex or gender should preferably be expressed in absolute terms (risk differences) to facilitate clinical decisions on treatment. Information on the distribution of potential effect modifiers or prognostic factors should be available to prevent a biased comparison of differences in effect between genotypes. Other considerations included the possibility of selective nonavailability of biomaterial and the choice of a statistical model to study effect modification. To ensure high study quality, additional methodologic issues should be taken into account when designing and reporting studies on sex and gender differences.

  11. OPUS: Optimal Projection for Uncertain Systems. Volume 1

    DTIC Science & Technology

    1991-09-01

    unified control-design methodology that directly addresses these technology issues. In particular, optimal projection theory addresses the need for...effects, and limited identification accuracy in a 1-g environment. The principal contribution of OPUS is a unified design methodology that...characterizing solutions to constrained control-design problems. Transforming OPUS into a practical design methodology requires the development of

  12. EVALUATING METRICS FOR GREEN CHEMISTRIES: INFORMATION AND CALCULATION NEEDS

    EPA Science Inventory

    Research within the U.S. EPA's National Risk Management Research Laboratory is developing a methodology for the evaluation of green chemistries. This methodology called GREENSCOPE (Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Ob...

  13. A methodology to enhance electromagnetic compatibility in joint military operations

    NASA Astrophysics Data System (ADS)

    Buckellew, William R.

    The development and validation of an improved methodology to identify, characterize, and prioritize potential joint EMI (electromagnetic interference) interactions and identify and develop solutions to reduce the effects of the interference are discussed. The methodology identifies potential EMI problems using results from field operations, historical data bases, and analytical modeling. Operational expertise, engineering analysis, and testing are used to characterize and prioritize the potential EMI problems. Results can be used to resolve potential EMI during the development and acquisition of new systems and to develop engineering fixes and operational workarounds for systems already employed. The analytic modeling portion of the methodology is a predictive process that uses progressive refinement of the analysis and the operational electronic environment to eliminate noninterfering equipment pairs, defer further analysis on pairs lacking operational significance, and resolve the remaining EMI problems. Tests are conducted on equipment pairs to ensure that the analytical models provide a realistic description of the predicted interference.

  14. Methodology evaluation: Effects of independent verification and integration on one class of application

    NASA Technical Reports Server (NTRS)

    Page, J.

    1981-01-01

    The effects of an independent verification and integration (V and I) methodology on one class of application are described. Resource profiles are discussed. The development environment is reviewed. Seven measures are presented to test the hypothesis that V and I improve the development and product. The V and I methodology provided: (1) a decrease in requirements ambiguities and misinterpretation; (2) no decrease in design errors; (3) no decrease in the cost of correcting errors; (4) a decrease in the cost of system and acceptance testing; (5) an increase in early discovery of errors; (6) no improvement in the quality of software put into operation; and (7) a decrease in productivity and an increase in cost.

  15. The Path to English Literacy: Analyzing Elementary Sight Word Procurement Using Computer Assisted Language Learning (CALL) in Contrast to Traditional Methodologies

    ERIC Educational Resources Information Center

    Madill, Michael T. R.

    2014-01-01

    Didactical approaches related to teaching English as a Foreign Language (EFL) have developed into a complex array of instructional methodologies, each having potential benefits attributed to elementary reading development. One such effective practice is Computer Assisted Language Learning (CALL), which uses various forms of technology such as…

  16. Regional risk assessment approaches to land planning for industrial polluted areas in China: the Hulunbeier region case study.

    PubMed

    Li, Daiqing; Zhang, Chen; Pizzol, Lisa; Critto, Andrea; Zhang, Haibo; Lv, Shihai; Marcomini, Antonio

    2014-04-01

    The rapid industrial development and urbanization processes that occurred in China over the past 30 years have dramatically increased the consumption of natural resources and raw materials, thus exacerbating human pressure on environmental ecosystems. As a result, large-scale environmental pollution of soil, natural waters and urban air has been recorded. The development of effective industrial planning to support regional sustainable economic development has become an issue of serious concern for local authorities, which need to select safe sites for new industrial settlements (i.e. industrial plants) according to assessment approaches that consider cumulative impacts, synergistic pollution effects and risks of accidental releases. In order to support decision makers in the development of efficient and effective regional land-use plans encompassing the identification of suitable areas for new industrial settlements and areas in need of intervention measures, this study provides a spatial regional risk assessment methodology which integrates relative risk assessment (RRA) and socio-economic assessment (SEA) and makes use of spatial analysis (GIS) methodologies and multicriteria decision analysis (MCDA) techniques. The proposed methodology was applied to the Chinese region of Hulunbeier, which is located in eastern Inner Mongolia Autonomous Region, adjacent to the Republic of Mongolia. The application results demonstrated the effectiveness of the proposed methodology in identifying the most hazardous and risky industrial settlements, the most vulnerable regional receptors and the regional districts most relevant for intervention measures, since they are characterized by high regional risk and excellent socio-economic development conditions.

  17. Computational simulation of coupled material degradation processes for probabilistic lifetime strength of aerospace materials

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Bast, Callie C.

    1992-01-01

    The research included ongoing development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation has been randomized and is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the above described constitutive equation using actual experimental materials data together with linear regression of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from the open literature for materials typically of interest to those studying aerospace propulsion system components. Material data for Inconel 718 were analyzed using the developed methodology.
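
    For context, the multifactor interaction strength degradation model behind the PROMISS family of codes is usually quoted in a product form such as the following (a commonly published form; the specific variables and exponents are calibrated per material, as the abstract notes):

        \frac{S}{S_0} = \prod_{i=1}^{n} \left( \frac{A_{i,F} - A_i}{A_{i,F} - A_{i,0}} \right)^{a_i}

    where S/S_0 is the ratio of current to reference strength, A_i is the current value of the i-th primitive variable (e.g., temperature, number of fatigue cycles), A_{i,0} and A_{i,F} are its reference and final (ultimate) values, and a_i is the empirical exponent obtained by regression against test data, the calibration step performed here by PROMISC.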

  18. Improving the Method of Roof Fall Susceptibility Assessment based on Fuzzy Approach

    NASA Astrophysics Data System (ADS)

    Ghasemi, Ebrahim; Ataei, Mohammad; Shahriar, Kourosh

    2017-03-01

    Retreat mining is always accompanied by a great number of accidents, and most of them are due to roof fall. Therefore, development of methodologies to evaluate the roof fall susceptibility (RFS) seems essential. Ghasemi et al. (2012) proposed a systematic methodology to assess the roof fall risk during retreat mining based on the classic risk assessment approach. The main shortcoming of this method is that it ignores subjective uncertainties arising from the linguistic input values of some factors, low resolution, fixed weighting, sharp class boundaries, etc. To remove this shortcoming and improve the method, in this paper a novel methodology is presented to assess the RFS using a fuzzy approach. The application of the fuzzy approach provides an effective tool to handle the subjective uncertainties. Furthermore, fuzzy analytical hierarchy process (AHP) is used to structure and prioritize various risk factors and sub-factors during development of this method. This methodology is applied to identify the susceptibility of roof fall occurrence in the main panel of Tabas Central Mine (TCM), Iran. The results indicate that this methodology is effective and efficient in assessing RFS.

  19. Seven Performance Drivers.

    ERIC Educational Resources Information Center

    Ross, Linda

    2003-01-01

    Recent work with automotive e-commerce clients led to the development of a performance analysis methodology called the Seven Performance Drivers, including: standards, incentives, capacity, knowledge and skill, measurement, feedback, and analysis. This methodology has been highly effective in introducing and implementing performance improvement.…

  20. Establishing a methodology to evaluate teen driver-training programs.

    DOT National Transportation Integrated Search

    2013-11-01

    The goal of this research project was to develop a methodology to assist the Wisconsin Department of : Transportation (WisDOT) in the evaluation of effectiveness of teen driver education programs over the : short and long terms. The research effort w...

  1. SIMPLIFYING EVALUATIONS OF GREEN CHEMISTRIES: HOW MUCH INFORMATION DO WE NEED?

    EPA Science Inventory

    Research within the U.S. EPA's National Risk Management Research Laboratory is developing a methodology for the evaluation of green chemistries. This methodology called GREENSCOPE (Gauging Reaction Effectiveness for the Environmental Sustainability of Chemistries with a multi-Ob...

  2. A Procedure for Measuring the Effectiveness of Training and Development Programs.

    ERIC Educational Resources Information Center

    Helliwell, Tanis

    1978-01-01

    The article presents an outline of the methodology used in a study designed to evaluate the effectiveness of staff development courses for Ontario's Civil Service Commission. The procedure included a literature search, instructor interviews, and questionnaire development. (MF)

  3. The Contribution of Human Factors in Military System Development: Methodological Considerations

    DTIC Science & Technology

    1980-07-01

    Risk/Uncertainty Analysis - Project Scoring - Utility Scales - Relevance Tree Techniques (Reverse Factor Analysis) - Computer Simulation...effectiveness of mathematical models for R&D project selection. Management Science, April 1973, 18...Souder, W.E., A scoring methodology for...

  4. Building a Better Canine Warrior

    DTIC Science & Technology

    2017-10-12

    without adversely affecting performance and to develop technical methodology that would dissipate metabolic heat without the expense of body water. Neither an increase in dietary salt nor decrease in...from a methodological aspect as well as emerging regulatory issues related to research in working dogs. Data suggested that the effect of high

  5. Development of Testing Methodologies for the Mechanical Properties of MEMS

    NASA Technical Reports Server (NTRS)

    Ekwaro-Osire, Stephen

    2003-01-01

    This effort is to investigate and design testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS) as well as investigate the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be analysis and modeling of material interfaces for strength as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. This will be a continuation of the previous year's work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.

  6. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.
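
    The optimization described in the last sentence amounts to a budget-constrained selection over candidate development practices and design choices. The Python sketch below illustrates that idea with entirely hypothetical actions, risks, costs, and a simple multiplicative risk-reduction model; the actual DDP/probabilistic-risk-reduction data model is richer than this:

      from itertools import combinations

      # Hypothetical mitigation actions: (name, cost, {risk: fraction_of_risk_removed})
      actions = [
          ("unit_tests",  3.0, {"logic_defects": 0.6}),
          ("code_review", 2.0, {"logic_defects": 0.4, "interface_defects": 0.3}),
          ("hw_in_loop",  5.0, {"interface_defects": 0.7, "timing_defects": 0.5}),
      ]
      # Hypothetical risks: expected loss if a risk is left completely unmitigated.
      risks = {"logic_defects": 10.0, "interface_defects": 6.0, "timing_defects": 4.0}
      budget = 7.0

      def residual_loss(selected):
          """Expected loss remaining after the selected actions (effects multiply)."""
          loss = 0.0
          for risk, weight in risks.items():
              remaining = 1.0
              for _, _, effects in selected:
                  remaining *= 1.0 - effects.get(risk, 0.0)
              loss += weight * remaining
          return loss

      best = None
      for r in range(len(actions) + 1):
          for subset in combinations(actions, r):
              cost = sum(a[1] for a in subset)
              if cost <= budget:
                  benefit = sum(risks.values()) - residual_loss(subset)
                  if best is None or benefit > best[0]:
                      best = (benefit, cost, [a[0] for a in subset])

      print("best suite:", best)

    Exhaustive enumeration is adequate for a handful of candidates; a real planning tool would also track requirement weights and use a scalable search.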

  7. Peer-led Aboriginal parent support: Program development for vulnerable populations with participatory action research.

    PubMed

    Munns, Ailsa; Toye, Christine; Hegney, Desley; Kickett, Marion; Marriott, Rhonda; Walker, Roz

    2017-10-01

    Participatory action research (PAR) is a credible, culturally appropriate methodology that can be used to effect collaborative change within vulnerable populations. This PAR study was undertaken in a Western Australian metropolitan setting to develop and evaluate the suitability, feasibility and effectiveness of an Aboriginal peer-led home visiting programme. A secondary aim, addressed in this paper, was to explore and describe research methodology used for the study and provide recommendations for its implementation in other similar situations. PAR using action learning sets was employed to develop the parent support programme and data addressing the secondary, methodological aim were collected through focus groups using semi-structured and unstructured interview schedules. Findings were addressed throughout the action research process to enhance the research process. The themes that emerged from the data and addressed the methodological aim were the need for safe communication processes; supportive engagement processes and supportive organisational processes. Aboriginal peer support workers (PSWs) and community support agencies identified three important elements central to their capacity to engage and work within the PAR methodology. This research has provided innovative data, highlighting processes and recommendations for child health nurses to engage with the PSWs, parents and community agencies to explore culturally acceptable elements for an empowering methodology for peer-led home visiting support. There is potential for this nursing research to credibly inform policy development for Aboriginal child and family health service delivery, in addition to other vulnerable population groups. Child health nurses/researchers can use these new understandings to work in partnership with Aboriginal communities and families to develop empowering and culturally acceptable strategies for developing Aboriginal parent support for the early years. Impact Statement Child health nurses and Aboriginal communities can collaborate through participatory action research to develop peer-led support for the early years. Indigenous Australian peoples are people who identify as Aboriginal or Torres Strait Islander. Respectfully, throughout this paper, they will be described as Aboriginal.

  8. Introducing a new methodology for the calculation of local philicity and multiphilic descriptor: an alternative to the finite difference approximation

    NASA Astrophysics Data System (ADS)

    Sánchez-Márquez, Jesús; Zorrilla, David; García, Víctor; Fernández, Manuel

    2018-07-01

    This work presents a new development based on the condensation scheme proposed by Chamorro and Pérez, in which new terms to correct the frozen molecular orbital approximation have been introduced (improved frontier molecular orbital approximation). The changes performed on the original development allow taking into account the orbital relaxation effects, providing equivalent results to those achieved by the finite difference approximation and leading also to a methodology with great advantages. Local reactivity indices based on this new development have been obtained for a sample set of molecules and they have been compared with those indices based on the frontier molecular orbital and finite difference approximations. A new definition based on the improved frontier molecular orbital methodology for the dual descriptor index is also shown. In addition, taking advantage of the characteristics of the definitions obtained with the new condensation scheme, the descriptor local philicity is analysed by separating the components corresponding to the frontier molecular orbital approximation and orbital relaxation effects, analysing also the local parameter multiphilic descriptor in the same way. Finally, the effect of using the basis set is studied and calculations using DFT, CI and Möller-Plesset methodologies are performed to analyse the consequence of different electronic-correlation levels.
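
    For readers unfamiliar with the quantities being condensed, the frozen-orbital (frontier molecular orbital) definitions of the descriptors discussed above are summarized below; these are the standard textbook forms, not the orbital-relaxation-corrected expressions introduced in the article:

      \[
      f^{+}(\mathbf r)\approx\rho_{\mathrm{LUMO}}(\mathbf r),\qquad
      f^{-}(\mathbf r)\approx\rho_{\mathrm{HOMO}}(\mathbf r),\qquad
      \Delta f(\mathbf r)=f^{+}(\mathbf r)-f^{-}(\mathbf r),
      \]
      \[
      \omega=\frac{\mu^{2}}{2\eta},\qquad
      \omega^{\pm}(\mathbf r)=\omega\,f^{\pm}(\mathbf r),\qquad
      \Delta\omega(\mathbf r)=\omega\,\Delta f(\mathbf r),
      \]

    where mu and eta are the electronic chemical potential and hardness, Delta f is the dual descriptor, omega±(r) is the local philicity, and Delta omega(r) is the multiphilic descriptor.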

  9. Common Effects Methodology for Pesticides

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  10. The Utilization of Navy People-Related RDT&E (Research, Development, Test, and Evaluation): Fiscal Year 1983.

    DTIC Science & Technology

    1984-06-01

    ...demonstration. These research tools were designed and experimental demonstrations were successfully conducted; the research tools were designated for demonstrations. ...Topics: 4.02 Instructional Systems Design Methodology; Instructional Systems Development and Effectiveness Evaluation; 10.07 Human Performance Variables/Factors; 10.08 Man-Machine Design Methodology; Computer Assisted Methods for Human...

  11. Algorithm for evaluating the effectiveness of a high-rise development project based on current yield

    NASA Astrophysics Data System (ADS)

    Soboleva, Elena

    2018-03-01

    The article addresses the operational evaluation of development project efficiency in high-rise construction under the current economic conditions in Russia. The author touches on the following issues: problems of implementing development projects, the influence of the quality of operational evaluation of high-rise construction projects on overall efficiency, the influence of the project's external environment on the effectiveness of project activities under crisis conditions, and the quality of project management. The article proposes an algorithm and a methodological approach to quality management of developer project efficiency based on operational evaluation of current yield. The methodology for calculating the current efficiency of a high-rise construction development project has been updated.

  12. Effective Software Engineering Leadership for Development Programs

    ERIC Educational Resources Information Center

    Cagle West, Marsha

    2010-01-01

    Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…

  13. A review and preliminary evaluation of methodological factors in performance assessments of time-varying aircraft noise effects

    NASA Technical Reports Server (NTRS)

    Coates, G. D.; Alluisi, E. A.

    1975-01-01

    The effects of aircraft noise on human performance is considered. Progress is reported in the following areas: (1) review of the literature to identify the methodological and stimulus parameters involved in the study of noise effects on human performance; (2) development of a theoretical framework to provide working hypotheses as to the effects of noise on complex human performance; and (3) data collection on the first of several experimental investigations designed to provide tests of the hypotheses.

  14. Is Military Advertising Effective? An Estimation Methodology and Applications to Recruiting in the 1980’s and 90s

    DTIC Science & Technology

    2003-01-01

    This report documents research findings from a RAND project titled The Relative Cost Effectiveness of Military Advertising, the goal of which was to...develop and apply a methodology for assessing the cost effectiveness of the services’ advertising programs and to provide guidance for a more...examines issues related to the effectiveness of recruiting advertising during the 1980s and 1990s. It describes the policy context, summarizes the current

  15. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldemir, Tunc; Denning, Richard; Catalyurek, Umit

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  16. Adaptation of a software development methodology to the implementation of a large-scale data acquisition and control system. [for Deep Space Network

    NASA Technical Reports Server (NTRS)

    Madrid, G. A.; Westmoreland, P. T.

    1983-01-01

    A progress report is presented on a program to upgrade the existing NASA Deep Space Network in terms of a redesigned computer-controlled data acquisition system for channelling tracking, telemetry, and command data between a California-based control center and three signal processing centers in Australia, California, and Spain. The methodology for the improvements is oriented towards single subsystem development with consideration for a multi-system and multi-subsystem network of operational software. Details of the existing hardware configurations and data transmission links are provided. The program methodology includes data flow design, interface design and coordination, incremental capability availability, increased inter-subsystem developmental synthesis and testing, system and network level synthesis and testing, and system verification and validation. The software has been implemented thus far to a 65 percent completion level, and the methodology being used to effect the changes, which will permit enhanced tracking and communication with spacecraft, has been found to employ effective techniques.

  17. Force on Force Modeling with Formal Task Structures and Dynamic Geometry

    DTIC Science & Technology

    2017-03-24

    task framework, derived using the MMF methodology to structure a complex mission. It further demonstrated the integration of effects from a range of...application methodology was intended to support a combined developmental testing (DT) and operational testing (OT) strategy for selected systems under test... methodology to develop new or modify existing Models and Simulations (M&S) to: • Apply data from multiple, distributed sources (including test

  18. A Cooperative IDS Approach Against MPTCP Attacks

    DTIC Science & Technology

    2017-06-01

    physical testbeds in order to present a methodology that allows distributed IDSs (DIDS) to cooperate in a manner that permits effective detection of...reconstruct MPTCP subflows and detect malicious content. Next, we build physical testbeds in order to present a methodology that allows distributed IDSs...hypotheses on a more realistic testbed environment. • Developing a methodology to incorporate multiple IDSs, real and virtual, to be able to detect cross

  19. A methodology for Manufacturing Execution Systems (MES) implementation

    NASA Astrophysics Data System (ADS)

    Govindaraju, Rajesri; Putra, Krisna

    2016-02-01

    A manufacturing execution system (MES) is an information systems (IS) application that bridges the gap between IS at the top level, namely enterprise resource planning (ERP), and IS at the lower levels, namely the automation systems. MES provides a medium for optimizing the manufacturing process as a whole on a real-time basis. By using MES in combination with ERP and other automation systems, a manufacturing company is expected to achieve high competitiveness. In implementing MES, functional integration - making all the components of the manufacturing system work well together - is the most difficult challenge. For this, there is an industry standard that specifies the sub-systems of a manufacturing execution system and defines the boundaries between ERP systems, MES, and other automation systems. The standard is known as ISA-95. Although the advantages of using MES have been stated in some studies, not much research has been done on how to implement MES effectively. The purpose of this study is to develop a methodology describing how an MES implementation project should be managed, utilising the ISA-95 reference model to support the system development process. A proposed methodology was developed based on a general IS development methodology. The methodology was then revisited based on an understanding of the specific characteristics of MES implementation projects, gained from an implementation case at an Indonesian steel manufacturing company. The case study highlighted the importance of applying an effective requirement elicitation method during the initial system assessment process, managing system interfaces and labor division in the design process, and performing a pilot deployment before putting the whole system into operation.

  20. Development of a Practical Methodology for Elastic-Plastic and Fully Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    McClung, R. C.; Chell, G. G.; Lee, Y. -D.; Russell, D. A.; Orient, G. E.

    1999-01-01

    A practical engineering methodology has been developed to analyze and predict fatigue crack growth rates under elastic-plastic and fully plastic conditions. The methodology employs the closure-corrected effective range of the J-integral, delta J(sub eff) as the governing parameter. The methodology contains original and literature J and delta J solutions for specific geometries, along with general methods for estimating J for other geometries and other loading conditions, including combined mechanical loading and combined primary and secondary loading. The methodology also contains specific practical algorithms that translate a J solution into a prediction of fatigue crack growth rate or life, including methods for determining crack opening levels, crack instability conditions, and material properties. A critical core subset of the J solutions and the practical algorithms has been implemented into independent elastic-plastic NASGRO modules. All components of the entire methodology, including the NASGRO modules, have been verified through analysis and experiment, and limits of applicability have been identified.
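
    As context for the governing parameter, methodologies of this kind typically correlate growth rate with the closure-corrected J-range through a Paris-type power law; a generic form, in which C and m are fitted material constants rather than values from the report, is

      \[
      \frac{da}{dN} \,=\, C\,\bigl(\Delta J_{\mathrm{eff}}\bigr)^{m},
      \]

    with Delta J(sub eff) evaluated only over the portion of the load cycle for which the crack is open.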

  1. Development of a Practical Methodology for Elastic-Plastic and Fully Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    McClung, R. C.; Chell, G. G.; Lee, Y.-D.; Russell, D. A.; Orient, G. E.

    1999-01-01

    A practical engineering methodology has been developed to analyze and predict fatigue crack growth rates under elastic-plastic and fully plastic conditions. The methodology employs the closure-corrected effective range of the J-integral, (Delta)J(sub eff), as the governing parameter. The methodology contains original and literature J and (Delta)J solutions for specific geometries, along with general methods for estimating J for other geometries and other loading conditions, including combined mechanical loading and combined primary and secondary loading. The methodology also contains specific practical algorithms that translate a J solution into a prediction of fatigue crack growth rate or life, including methods for determining crack opening levels, crack instability conditions, and material properties. A critical core subset of the J solutions and the practical algorithms has been implemented into independent elastic-plastic NASGRO modules. All components of the entire methodology, including the NASGRO modules, have been verified through analysis and experiment, and limits of applicability have been identified.

  2. Explosion/Blast Dynamics for Constellation Launch Vehicles Assessment

    NASA Technical Reports Server (NTRS)

    Baer, Mel; Crawford, Dave; Hickox, Charles; Kipp, Marlin; Hertel, Gene; Morgan, Hal; Ratzel, Arthur; Cragg, Clinton H.

    2009-01-01

    An assessment methodology is developed to guide quantitative predictions of adverse physical environments and the subsequent effects on the Ares-1 crew launch vehicle associated with the loss of containment of cryogenic liquid propellants from the upper stage during ascent. Development of the methodology is led by a team at Sandia National Laboratories (SNL) with guidance and support from a number of National Aeronautics and Space Administration (NASA) personnel. The methodology is based on the current Ares-1 design and feasible accident scenarios. These scenarios address containment failure from debris impact or structural response to pressure or blast loading from an external source. Once containment is breached, the envisioned assessment methodology includes predictions for the sequence of physical processes stemming from cryogenic tank failure. The investigative techniques, analysis paths, and numerical simulations that comprise the proposed methodology are summarized and appropriate simulation software is identified in this report.

  3. Space Transportation Operations: Assessment of Methodologies and Models

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla

    2001-01-01

    The systems design process for future space transportation involves understanding multiple variables and their effect on lifecycle metrics. Variables such as technology readiness or potential environmental impact are qualitative, while variables such as reliability, operations costs or flight rates are quantitative. In deciding what new design concepts to fund, NASA needs a methodology that would assess the sum total of all relevant qualitative and quantitative lifecycle metrics resulting from each proposed concept. The objective of this research was to review the state of operations assessment methodologies and models used to evaluate proposed space transportation systems and to develop recommendations for improving them. It was found that, compared to the models available from other sources, the operations assessment methodology recently developed at Kennedy Space Center has the potential to produce a decision support tool that will serve as the industry standard. Towards that goal, a number of areas of improvement in the Kennedy Space Center's methodology are identified.

  4. Space Transportation Operations: Assessment of Methodologies and Models

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla

    2002-01-01

    The systems design process for future space transportation involves understanding multiple variables and their effect on lifecycle metrics. Variables such as technology readiness or potential environmental impact are qualitative, while variables such as reliability, operations costs or flight rates are quantitative. In deciding what new design concepts to fund, NASA needs a methodology that would assess the sum total of all relevant qualitative and quantitative lifecycle metrics resulting from each proposed concept. The objective of this research was to review the state of operations assessment methodologies and models used to evaluate proposed space transportation systems and to develop recommendations for improving them. It was found that, compared to the models available from other sources, the operations assessment methodology recently developed at Kennedy Space Center has the potential to produce a decision support tool that will serve as the industry standard. Towards that goal, a number of areas of improvement in the Kennedy Space Center's methodology are identified.

  5. The ICA Communication Audit and Perceived Communication Effectiveness Changes in 16 Audited Organizations.

    ERIC Educational Resources Information Center

    Brooks, Keith; And Others

    1979-01-01

    Discusses the benefits of the International Communication Association Communication Audit as a methodology for evaluation of organizational communication processes and outcomes. An "after" survey of 16 audited organizations confirmed the audit as a valid diagnostic methodology and organization development intervention technique which…

  6. Comparative Effectiveness Research in Lung Diseases and Sleep Disorders

    PubMed Central

    Lieu, Tracy A.; Au, David; Krishnan, Jerry A.; Moss, Marc; Selker, Harry; Harabin, Andrea; Connors, Alfred

    2011-01-01

    The Division of Lung Diseases of the National Heart, Lung, and Blood Institute (NHLBI) held a workshop to develop recommendations on topics, methodologies, and resources for comparative effectiveness research (CER) that will guide clinical decision making about available treatment options for lung diseases and sleep disorders. A multidisciplinary group of experts with experience in efficacy, effectiveness, implementation, and economic research identified (a) what types of studies the domain of CER in lung diseases and sleep disorders should include, (b) the criteria and process for setting priorities, and (c) current resources for and barriers to CER in lung diseases. Key recommendations were to (1) increase efforts to engage stakeholders in developing CER questions and study designs; (2) invest in further development of databases and other infrastructure, including efficient methods for data sharing; (3) make full use of a broad range of study designs; (4) increase the appropriate use of observational designs and the support of methodologic research; (5) ensure that committees that review CER grant applications include persons with appropriate perspective and expertise; and (6) further develop the workforce for CER by supporting training opportunities that focus on the methodologic and practical skills needed. PMID:21965016

  7. Group Development of Effective Governance Teams

    ERIC Educational Resources Information Center

    Mar, Deborah Katherine

    2011-01-01

    Purpose. The purpose of this study was to identify and describe the behaviors of effective governance teams as they move through stages of group development during regular school board meetings, utilizing the task and process behaviors identified in the Group Development Assessment (Jones & Bearley, 1994). Methodology. This mixed-methods…

  8. Common Effects Methodology National Stakeholder Meeting December 1, 2010

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  9. Cochrane Qualitative and Implementation Methods Group guidance series-paper 3: methods for assessing methodological limitations, data extraction and synthesis, and confidence in synthesized qualitative findings.

    PubMed

    Noyes, Jane; Booth, Andrew; Flemming, Kate; Garside, Ruth; Harden, Angela; Lewin, Simon; Pantoja, Tomas; Hannes, Karin; Cargo, Margaret; Thomas, James

    2018-05-01

    The Cochrane Qualitative and Implementation Methods Group develops and publishes guidance on the synthesis of qualitative and mixed-method implementation evidence. Choice of appropriate methodologies, methods, and tools is essential when developing a rigorous protocol and conducting the synthesis. Cochrane authors who conduct qualitative evidence syntheses have thus far used a small number of relatively simple methods to address similarly written questions. Cochrane has invested in methodological work to develop new tools and to encourage the production of exemplar reviews to show the value of more innovative methods that address a wider range of questions. In this paper, in the series, we report updated guidance on the selection of tools to assess methodological limitations in qualitative studies and methods to extract and synthesize qualitative evidence. We recommend application of Grades of Recommendation, Assessment, Development, and Evaluation-Confidence in the Evidence from Qualitative Reviews to assess confidence in qualitative synthesized findings. This guidance aims to support review authors to undertake a qualitative evidence synthesis that is intended to be integrated subsequently with the findings of one or more Cochrane reviews of the effects of similar interventions. The review of intervention effects may be undertaken concurrently with or separate to the qualitative evidence synthesis. We encourage further development through reflection and formal testing. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Methodological guidelines for developing accident modification functions.

    PubMed

    Elvik, Rune

    2015-07-01

    This paper proposes methodological guidelines for developing accident modification functions. An accident modification function is a mathematical function describing systematic variation in the effects of road safety measures. The paper describes ten guidelines. An example is given of how to use the guidelines. The importance of exploratory analysis and an iterative approach in developing accident modification functions is stressed. The example shows that strict compliance with all the guidelines may be difficult, but represents a level of stringency that should be strived for. Currently the main limitations in developing accident modification functions are the small number of good evaluation studies and the often huge variation in estimates of effect. It is therefore still not possible to develop accident modification functions for very many road safety measures. Copyright © 2015 Elsevier Ltd. All rights reserved.
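
    As an illustration of the object being developed, an accident modification function relates the expected accident count under a measure "dose" x to a baseline dose x_0; one commonly used shape is exponential, shown below with a purely illustrative coefficient beta (the guidelines do not prescribe this particular form):

      \[
      E[\text{accidents} \mid x] \,=\, E[\text{accidents} \mid x_0]\cdot \mathrm{AMF}(x),
      \qquad
      \mathrm{AMF}(x) \,=\, e^{\beta\,(x - x_0)}
      \]

    Estimating beta, with its uncertainty, from the available evaluation studies is the step the guidelines address.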

  11. Understanding Teachers' Cognitive Processes during Online Professional Learning: A Methodological Comparison

    ERIC Educational Resources Information Center

    Beach, Pamela; Willows, Dale

    2017-01-01

    This study examined the effectiveness of three types of think aloud methods for understanding elementary teachers' cognitive processes as they used a professional development website. A methodology combining a retrospective think aloud procedure with screen capture technology (referred to as the virtual revisit) was compared with concurrent and…

  12. Impact Evaluation of Quality Assurance in Higher Education: Methodology and Causal Designs

    ERIC Educational Resources Information Center

    Leiber, Theodor; Stensaker, Bjørn; Harvey, Lee

    2015-01-01

    In this paper, the theoretical perspectives and general methodological elements of impact evaluation of quality assurance in higher education institutions are discussed, which should be a cornerstone of quality development in higher education and contribute to improving the knowledge about the effectiveness (or ineffectiveness) of quality…

  13. A system management methodology for building successful resource management systems

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda Shaller; Willoughby, John K.

    1989-01-01

    This paper presents a system management methodology for building successful resource management systems that possess lifecycle effectiveness. This methodology is based on an analysis of the traditional practice of Systems Engineering Management as it applies to the development of resource management systems. The analysis produced fifteen significant findings presented as recommended adaptations to the traditional practice of Systems Engineering Management to accommodate system development when the requirements are incomplete, unquantifiable, ambiguous and dynamic. Ten recommended adaptations to achieve operational effectiveness when requirements are incomplete, unquantifiable or ambiguous are presented and discussed. Five recommended adaptations to achieve system extensibility when requirements are dynamic are also presented and discussed. The authors conclude that the recommended adaptations to the traditional practice of Systems Engineering Management should be implemented for future resource management systems and that the technology exists to build these systems extensibly.

  14. To Develop and Test Improved Procedures for the Development and Distribution of Quality Individualized Mediated Instructional Materials in Vocational Education. Final Report.

    ERIC Educational Resources Information Center

    State Fair Community Coll., Sedalia, MO.

    Five objectives are reported for a project to develop and test effective procedures for designing, field testing, reproducing, and disseminating individualized mediated instructional materials: (1) improvement of teacher input, (2) development of individualized instruction modules, (3) development of methodology for evaluating the effectiveness of…

  15. Characteristics and Models of Effective Professional Development: The Case of School Teachers in Qatar

    ERIC Educational Resources Information Center

    Abu-Tineh, Abdullah M.; Sadiq, Hissa M.

    2018-01-01

    The purpose of this study was to investigate the characteristics of effective professional development and effective models of professional development as perceived by school teachers in the State of Qatar. This study is quantitative in nature and was conducted using a survey methodology. Means, standard deviations, t-test, and one-way analysis of…

  16. Children's concepts of physical illness: a review and critique of the cognitive-developmental literature.

    PubMed

    Burbach, D J; Peterson, L

    1986-01-01

    Cognitive-developmental studies relevant to children's concepts of physical illness are reviewed and critiqued. Although numerous methodological weaknesses make firm conclusions difficult, most data appear to suggest that children's concepts of illness do evolve in a systematic and predictable sequence consistent with Piaget's theory of cognitive development. Methodological weaknesses identified include poor description of samples, assessment instruments, and procedures; lack of control over potential observer bias, expectancy effects, and other confounding variables; and minimal attention to reliability and validity issues. Increased methodological rigor and a further explication of the specific and unique ways in which children's concepts of illness develop over the course of cognitive development could substantially increase the value of these studies for professionals in pediatric health care settings.

  17. Implementation of a cooperative methodology to develop organic chemical engineering skills

    NASA Astrophysics Data System (ADS)

    Arteaga, J. F.; Díaz Blanco, M. J.; Toscano Fuentes, C.; Martín Alfonso, J. E.

    2013-08-01

    The objective of this work is to investigate how most of the competences required by engineering students may be developed through an active methodology based on cooperative learning/evaluation. Cooperative learning was employed with the University of Huelva's third-year engineering students. The teaching methodology aims to develop some of the most relevant engineering skills required nowadays, such as the ability to cooperate in finding appropriate information, the ability to solve problems through critical and creative thinking, and the ability to make decisions and communicate effectively. The statistical study carried out supports the hypothesis that comprehensive and well-defined protocols in the development of the subject, the rubric, and cooperative evaluation allow students to achieve successful learning.

  18. A quality-based cost model for new electronic systems and products

    NASA Astrophysics Data System (ADS)

    Shina, Sammy G.; Saigal, Anil

    1998-04-01

    This article outlines a method for developing a quality-based cost model for the design of new electronic systems and products. The model incorporates a methodology for determining a cost-effective design margin allocation for electronic products and systems and its impact on manufacturing quality and cost. A spreadsheet-based cost estimating tool was developed to help implement this methodology in order for the system design engineers to quickly estimate the effect of design decisions and tradeoffs on the quality and cost of new products. The tool was developed with automatic spreadsheet connectivity to current process capability and with provisions to consider the impact of capital equipment and tooling purchases to reduce the product cost.
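
    The margin-to-quality-to-cost link described above can be sketched numerically: for a normally distributed process parameter, widening the design margin (expressed in process standard deviations) reduces fallout, which lowers the expected rework cost per unit. A hypothetical Python illustration, with invented cost figures and a single-rework assumption that are not taken from the article:

      from scipy.stats import norm

      def fallout_fraction(margin_sigma):
          """Two-sided fraction of units outside spec when the spec limits sit
          margin_sigma standard deviations from the mean of a normal process."""
          return 2.0 * norm.sf(margin_sigma)

      def effective_unit_cost(base_cost, rework_cost, margin_sigma):
          """Expected cost per shipped unit when each defective unit is reworked once."""
          return base_cost + fallout_fraction(margin_sigma) * rework_cost

      for m in (2.0, 3.0, 4.0):
          print(f"margin {m:.0f} sigma: fallout {fallout_fraction(m):.4%}, "
                f"unit cost {effective_unit_cost(10.0, 25.0, m):.3f}")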

  19. Stochastic model for fatigue crack size and cost effective design decisions. [for aerospace structures

    NASA Technical Reports Server (NTRS)

    Hanagud, S.; Uppaluri, B.

    1975-01-01

    This paper describes a methodology for making cost effective fatigue design decisions. The methodology is based on a probabilistic model for the stochastic process of fatigue crack growth with time. The development of a particular model for the stochastic process is also discussed in the paper. The model is based on the assumption of continuous time and discrete space of crack lengths. Statistical decision theory and the developed probabilistic model are used to develop the procedure for making fatigue design decisions on the basis of minimum expected cost or risk function and reliability bounds. Selections of initial flaw size distribution, NDT, repair threshold crack lengths, and inspection intervals are discussed.
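
    The decision criterion described above can be stated compactly: choose the inspection interval and repair threshold that minimize expected cost subject to a reliability bound. A generic statement of that trade-off, in illustrative notation rather than the paper's, is

      \[
      \min_{\Delta t,\; a_{\mathrm{th}}}\;
      E[C] \,=\, n_{I}\,C_{I} + E[n_{R}]\,C_{R} + P_{f}(\Delta t, a_{\mathrm{th}})\,C_{F}
      \qquad \text{subject to } P_{f} \le P_{\mathrm{allow}},
      \]

    with n_I inspections at cost C_I each, an expected number of repairs E[n_R] at cost C_R, and a failure probability P_f weighted by the consequence cost C_F.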

  20. Common Effects Methodology National Stakeholder Meeting December 1, 2010 White Papers

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  1. Common Effects Methodology Regional Stakeholder Meeting January 11 -22, 2010

    EPA Pesticide Factsheets

    EPA is exploring how to build on the substantial high quality science developed under both OPP programs to develop additional tools and approaches to support a consistent and common set of effects characterization methods using best available information.

  2. Using discrete choice experiments within a cost-benefit analysis framework: some considerations.

    PubMed

    McIntosh, Emma

    2006-01-01

    A great advantage of the stated preference discrete choice experiment (SPDCE) approach to economic evaluation methodology is its immense flexibility within applied cost-benefit analyses (CBAs). However, while the use of SPDCEs in healthcare has increased markedly in recent years, there has been a distinct lack of equivalent CBAs in healthcare using such SPDCE-derived valuations. This article outlines specific issues and some practical suggestions for consideration relevant to the development of CBAs using SPDCE-derived benefits. The article shows that SPDCE-derived CBA can adopt recent developments in cost-effectiveness methodology including the cost-effectiveness plane, appropriate consideration of uncertainty, the net-benefit framework and probabilistic sensitivity analysis methods, while maintaining the theoretical advantage of the SPDCE approach. The concept of a cost-benefit plane is no different in principle from the cost-effectiveness plane and can be a useful tool for reporting and presenting the results of CBAs. However, there are many challenging issues to address for the advancement of CBA methodology using SPDCEs within healthcare. Particular areas for development include the importance of accounting for uncertainty in SPDCE-derived willingness-to-pay values, the methodology of SPDCEs in clinical trial settings and economic models, measurement issues pertinent to using SPDCEs specifically in healthcare, and the importance of issues such as consideration of the dynamic nature of healthcare and the resulting impact this has on the validity of attribute definitions and context.
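
    The net-benefit logic borrowed from cost-effectiveness analysis reduces, in a CBA with SPDCE-derived valuations, to comparing incremental willingness to pay against incremental cost; a minimal statement of the decision rule, in generic notation, is

      \[
      \mathrm{INB} \,=\, \Delta\mathrm{WTP} - \Delta C,
      \qquad \text{adopt the intervention if } \mathrm{INB} > 0.
      \]

    Propagating the uncertainty in both terms, for example through probabilistic sensitivity analysis, yields the probability that INB exceeds zero, the cost-benefit analogue of a cost-effectiveness acceptability curve.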

  3. A Generalizable Methodology for Quantifying User Satisfaction

    NASA Astrophysics Data System (ADS)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors like loudness and sidetone can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
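
    To make the survival-analysis idea concrete, the sketch below fits a proportional-hazards model to synthetic session data in Python, treating one QoS covariate (network delay) as a candidate performance factor. The data, the choice of the lifelines library, and the covariate are illustrative assumptions, not the paper's actual traces or models:

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(0)
      n = 500
      delay = rng.uniform(20, 200, n)  # hypothetical network delay per session, in ms
      # Synthetic ground truth: sessions end sooner when delay is high.
      session_time = rng.exponential(scale=60 * np.exp(-0.005 * delay))
      df = pd.DataFrame({
          "session_time": session_time,  # minutes the user stayed
          "ended": 1,                    # 1 = session end observed (no censoring here)
          "delay_ms": delay,
      })

      # Proportional-hazards model: a positive coefficient on delay_ms means higher
      # delay -> higher hazard of leaving -> shorter sessions -> lower satisfaction.
      cph = CoxPHFitter()
      cph.fit(df, duration_col="session_time", event_col="ended")
      cph.print_summary()

    Ranking covariates by effect size in such a model is the kind of performance-factor analysis the abstract refers to.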

  4. Ecological monitoring in a discrete-time prey-predator model.

    PubMed

    Gámez, M; López, I; Rodríguez, C; Varga, Z; Garay, J

    2017-09-21

    The paper is aimed at the methodological development of ecological monitoring in discrete-time dynamic models. In earlier papers, in the framework of continuous-time models, we have shown how a systems-theoretical methodology can be applied to the monitoring of the state process of a system of interacting populations, also estimating certain abiotic environmental changes such as pollution, climatic or seasonal changes. In practice, however, there may be good reasons to use discrete-time models. (For instance, there may be discrete cycles in the development of the populations, or observations can be made only at discrete time steps.) Therefore the present paper is devoted to the development of the monitoring methodology in the framework of discrete-time models of population ecology. By monitoring we mean that, observing only certain component(s) of the system, we reconstruct the whole state process. This may be necessary, e.g., when in a complex ecosystem the observation of the densities of certain species is impossible, or too expensive. For the first presentation of the offered methodology, we have chosen a discrete-time version of the classical Lotka-Volterra prey-predator model. This is a minimal but not trivial system where the methodology can still be presented. We also show how this methodology can be applied to estimate the effect of an abiotic environmental change, using a component of the population system as an environmental indicator. Although this approach is illustrated in a simplest possible case, it can be easily extended to larger ecosystems with several interacting populations and different types of abiotic environmental effects. Copyright © 2017 Elsevier Ltd. All rights reserved.
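
    As a deliberately minimal illustration of the setting, the Python sketch below iterates a discrete-time prey-predator map and records only the prey component as the observation. The model form and parameters are generic stand-ins rather than the paper's system, and the observer that reconstructs the full state from the observation record is the part the paper's methodology supplies:

      # Minimal discrete-time prey-predator map (x: prey, y: predator).
      r, K, a, b, m = 1.2, 10.0, 0.15, 0.1, 0.3

      def step(x, y):
          x_next = x + r * x * (1.0 - x / K) - a * x * y
          y_next = y + b * a * x * y - m * y
          return max(x_next, 0.0), max(y_next, 0.0)

      def observe(x, y):
          # Monitoring assumption: only the prey density is measured.
          return x

      x, y = 4.0, 2.0
      states, observations = [], []
      for t in range(60):
          states.append((x, y))
          observations.append(observe(x, y))
          x, y = step(x, y)

      # An observer would reconstruct the full (x, y) trajectory from
      # `observations` alone; here both are printed so they can be compared.
      for t in (0, 20, 40, 59):
          print(t, states[t], observations[t])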

  5. Large-Eddy Simulation (LES) of a Compressible Mixing Layer and the Significance of Inflow Turbulence

    NASA Technical Reports Server (NTRS)

    Mankbadi, Mina Reda; Georgiadis, Nicholas J.; Debonis, James R.

    2017-01-01

    In the context of Large Eddy Simulations (LES), the effects of inflow turbulence are investigated through the Synthetic Eddy Method (SEM). The growth rate of a turbulent compressible mixing layer corresponding to the operating conditions of Goebel-Dutton Case 2 is investigated herein. The effects of spanwise width on the growth rate of the mixing layer are investigated such that spanwise width independence is reached. The error in neglecting inflow turbulence effects is quantified by comparing two methodologies: (1) Hybrid-RANS-LES methodology and (2) SEM-LES methodology. Best practices learned from Case 2 are developed herein and then applied to a higher convective Mach number corresponding to the Case 4 experiments of Goebel-Dutton.

  6. A simple landslide susceptibility analysis for hazard and risk assessment in developing countries

    NASA Astrophysics Data System (ADS)

    Guinau, M.; Vilaplana, J. M.

    2003-04-01

    In recent years, a number of techniques and methodologies have been developed for mitigating natural disasters. The complexity of these methodologies and the scarcity of material and data series justify the need for simple methodologies to obtain the necessary information for minimising the effects of catastrophic natural phenomena. The work with polygonal maps using a GIS allowed us to develop a simple methodology, which was applied in an area of 473 km2 in the Departamento de Chinandega (NW Nicaragua). This area was severely affected by a large number of landslides (mainly debris flows), triggered by the Hurricane Mitch rainfalls in October 1998. With the aid of aerial photography interpretation at 1:40.000 scale, amplified to 1:20.000, and detailed field work, a landslide map at 1:10.000 scale was constructed. The failure zones of landslides were digitized in order to obtain a failure zone digital map. A terrain unit digital map, in which a series of physical-environmental terrain factors are represented, was also used. Dividing the studied area into two zones (A and B) with homogeneous physical and environmental characteristics allows us to develop the proposed methodology and to validate it. In zone A, the failure zone digital map is superimposed onto the terrain unit digital map to establish the relationship between the different terrain factors and the failure zones. The numerical expression of this relationship enables us to classify the terrain by its landslide susceptibility. In zone B, this numerical relationship was employed to obtain a landslide susceptibility map, obviating the need for a failure zone map. The validity of the methodology can be tested in this area by using the degree of superposition of the susceptibility map and the failure zone map. The implementation of the methodology in tropical countries with physical and environmental characteristics similar to those of the study area allows us to carry out a landslide susceptibility analysis in areas where landslide records do not exist. This analysis is essential to landslide hazard and risk assessment, which is necessary to determine the actions for mitigating landslide effects, e.g. land planning, emergency aid actions, etc.
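
    The numerical relationship between terrain factors and failure zones established in zone A can be expressed, for example, as a frequency ratio per terrain-unit class and then transferred to zone B. A small Python sketch with invented class names and areas (the authors' actual weighting scheme may differ):

      # Hypothetical zone-A statistics per terrain-unit class (areas in km2).
      zone_a = {
          # class_id: (total_area, failure_zone_area)
          "steep_volcanic_ash": (40.0, 6.0),
          "moderate_colluvium": (120.0, 4.0),
          "flat_alluvium":      (150.0, 0.5),
      }

      total_area = sum(area for area, _ in zone_a.values())
      total_failed = sum(failed for _, failed in zone_a.values())

      # Frequency ratio: failure density of a class relative to the overall
      # failure density; FR > 1 marks classes that fail more often than average.
      susceptibility = {
          cls: (failed / area) / (total_failed / total_area)
          for cls, (area, failed) in zone_a.items()
      }

      # The same weights can then be mapped onto zone-B terrain units, where no
      # failure inventory exists, to produce a relative susceptibility map.
      for cls, fr in sorted(susceptibility.items(), key=lambda kv: -kv[1]):
          print(f"{cls:22s} FR = {fr:.2f}")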

  7. Measuring Impact of U.S. DOE Geothermal Technologies Office Funding: Considerations for Development of a Geothermal Resource Reporting Metric

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, Katherine R.; Wall, Anna M.; Dobson, Patrick F.

    This paper reviews existing methodologies and reporting codes used to describe extracted energy resources such as coal and oil and describes a comparable proposed methodology to describe geothermal resources. The goal is to provide the U.S. Department of Energy's (DOE) Geothermal Technologies Office (GTO) with a consistent and comprehensible means of assessing the impacts of its funding programs. This framework will allow for GTO to assess the effectiveness of research, development, and deployment (RD&D) funding, prioritize funding requests, and demonstrate the value of RD&D programs to the U.S. Congress. Standards and reporting codes used in other countries and energy sectors provide guidance to inform development of a geothermal methodology, but industry feedback and our analysis suggest that the existing models have drawbacks that should be addressed. In order to formulate a comprehensive metric for use by GTO, we analyzed existing resource assessments and reporting methodologies for the geothermal, mining, and oil and gas industries, and we sought input from industry, investors, academia, national labs, and other government agencies. Using this background research as a guide, we describe a methodology for assessing and reporting on GTO funding according to resource knowledge and resource grade (or quality). This methodology would allow GTO to target funding or measure impact by progression of projects or geological potential for development.

  8. IDR: A Participatory Methodology for Interdisciplinary Design in Technology Enhanced Learning

    ERIC Educational Resources Information Center

    Winters, Niall; Mor, Yishay

    2008-01-01

    One of the important themes that emerged from the CAL'07 conference was the failure of technology to bring about the expected disruptive effect to learning and teaching. We identify one of the causes as an inherent weakness in prevalent development methodologies. While the problem of designing technology for learning is irreducibly…

  9. VALIDATION OF A METHOD FOR ESTIMATING POLLUTION EMISSION RATES FROM AREA SOURCES USING OPEN-PATH FTIR SEPCTROSCOPY AND DISPERSION MODELING TECHNIQUES

    EPA Science Inventory

    The paper describes a methodology developed to estimate emissions factors for a variety of different area sources in a rapid, accurate, and cost effective manner. he methodology involves using an open-path Fourier transform infrared (FTIR) spectrometer to measure concentrations o...
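
    The scaling that usually underlies such open-path FTIR / dispersion-model methods is a simple ratio: run the dispersion model with a trial emission rate, compare the modeled path-averaged concentration with the measured one, and rescale. In generic notation (not necessarily the exact estimator validated in the paper):

      \[
      Q_{\mathrm{est}} \,=\, Q_{\mathrm{model}}\;\frac{\bar C_{\mathrm{measured}}}{\bar C_{\mathrm{modeled}}}
      \]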

  10. Software Size Estimation Using Expert Estimation: A Fuzzy Logic Approach

    ERIC Educational Resources Information Center

    Stevenson, Glenn A.

    2012-01-01

    For decades software managers have been using formal methodologies such as the Constructive Cost Model and Function Points to estimate the effort of software projects during the early stages of project development. While some research shows these methodologies to be effective, many software managers feel that they are overly complicated to use and…

  11. Traditions, Mentoring, and Vietnamese Women Leaders in Higher Education

    ERIC Educational Resources Information Center

    Lazarian-Chehab, Rina

    2017-01-01

    Purpose: The purpose of this study was to explore the effects of informal mentoring on the leadership development of women in leadership positions in Vietnamese universities. Methodology: This study was qualitative in nature; therefore, ethnographic design methodology was utilized to collect data. The data collection was performed in three stages:…

  12. Modeling Contextual Effects in Developmental Research: Linking Theory and Method in the Study of Social Development

    ERIC Educational Resources Information Center

    Ojanen, Tiina; Little, Todd D.

    2010-01-01

    This special section was inspired by the recent increased interest and methodological advances in the assessment of context-specificity in child and adolescent social development. While the effects of groups, situations, and social relationships on cognitive, affective and behavioral development have long been acknowledged in theoretical…

  13. New scoring methodology improves the sensitivity of the Alzheimer's Disease Assessment Scale-Cognitive subscale (ADAS-Cog) in clinical trials.

    PubMed

    Verma, Nishant; Beretvas, S Natasha; Pascual, Belen; Masdeu, Joseph C; Markey, Mia K

    2015-11-12

    As currently used, the Alzheimer's Disease Assessment Scale-Cognitive subscale (ADAS-Cog) has low sensitivity for measuring Alzheimer's disease progression in clinical trials. A major reason behind the low sensitivity is its sub-optimal scoring methodology, which can be improved to obtain better sensitivity. Using item response theory, we developed a new scoring methodology (ADAS-CogIRT) for the ADAS-Cog, which addresses several major limitations of the current scoring methodology. The sensitivity of the ADAS-CogIRT methodology was evaluated using clinical trial simulations as well as a negative clinical trial, which had shown an evidence of a treatment effect. The ADAS-Cog was found to measure impairment in three cognitive domains of memory, language, and praxis. The ADAS-CogIRT methodology required significantly fewer patients and shorter trial durations as compared to the current scoring methodology when both were evaluated in simulated clinical trials. When validated on data from a real clinical trial, the ADAS-CogIRT methodology had higher sensitivity than the current scoring methodology in detecting the treatment effect. The proposed scoring methodology significantly improves the sensitivity of the ADAS-Cog in measuring progression of cognitive impairment in clinical trials focused in the mild-to-moderate Alzheimer's disease stage. This provides a boost to the efficiency of clinical trials requiring fewer patients and shorter durations for investigating disease-modifying treatments.
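
    For readers unfamiliar with item response theory, the simplest building block behind a scoring approach of this kind is the two-parameter logistic item model shown below; the actual ADAS-CogIRT model is multidimensional (memory, language, praxis) and handles graded item responses, so this is only the basic idea:

      \[
      P(X_{ij} = 1 \mid \theta_j) \,=\, \frac{1}{1 + e^{-a_i(\theta_j - b_i)}}
      \]

    Here theta_j is the latent impairment of patient j, and a_i and b_i are the discrimination and difficulty of item i; patient scores are estimated theta values rather than simple sums of item scores.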

  14. Reliability modelling and analysis of thermal MEMS

    NASA Astrophysics Data System (ADS)

    Muratet, Sylvaine; Lavu, Srikanth; Fourniols, Jean-Yves; Bell, George; Desmulliez, Marc P. Y.

    2006-04-01

    This paper presents a MEMS reliability study methodology based on the novel concept of 'virtual prototyping'. This methodology can be used for the development of reliable sensors or actuators and also to characterize their behaviour in specific use conditions and applications. The methodology is demonstrated on the U-shaped micro electro thermal actuator used as a test vehicle. To demonstrate this approach, a 'virtual prototype' has been developed with the modeling tools MATLAB and VHDL-AMS. A best practice FMEA (Failure Mode and Effect Analysis) is applied to the thermal MEMS to investigate and assess the failure mechanisms. The reliability study is performed by injecting the identified faults into the 'virtual prototype'. The reliability characterization methodology predicts the evolution of the behavior of these MEMS as a function of the number of cycles of operation and specific operational conditions.

  15. A dynamic systems engineering methodology research study. Phase 2: Evaluating methodologies, tools, and techniques for applicability to NASA's systems projects

    NASA Technical Reports Server (NTRS)

    Paul, Arthur S.; Gill, Tepper L.; Maclin, Arlene P.

    1989-01-01

    A study of NASA's Systems Management Policy (SMP) concluded that the primary methodology being used by the Mission Operations and Data Systems Directorate and its subordinate, the Networks Division, is very effective. Still, some unmet needs were identified. This study involved evaluating methodologies, tools, and techniques with the potential for resolving the previously identified deficiencies. Six preselected methodologies being used by other organizations with similar development problems were studied. The study revealed a wide range of significant differences in structure. Each system had some strengths, but none would satisfy all of the needs of the Networks Division. Areas for improvement of the methodology being used by the Networks Division are listed with recommendations for specific action.

  16. Employee Motivation for Personal Development Plan Effectiveness

    ERIC Educational Resources Information Center

    Eisele, Lisa; Grohnert, Therese; Beausaert, Simon; Segers, Mien

    2013-01-01

    Purpose: This article aims to understand conditions under which personal development plans (PDPs) can effectively be implemented for professional learning. Both the organization's manner of supporting the PDP practice and the individual employee's motivation are taken into account. Design/methodology/approach: A questionnaire was…

  17. EVALUATING THE WATER QUALITY EFFECTIVENESS OF WATERSHED-SCALE SOURCE WATER PROTECTION PROGRAMS

    EPA Science Inventory

    The US EPA Office of Research and Development, the Ohio River Valley Water Sanitation Commission (ORSANCO) and the Upper Big Walnut Creek Quality Partnership created a collaborative team of eleven agencies and universities to develop a methodology for evaluating the effectiveness...

  18. Congestion Mitigation and Air Quality (CMAQ) Improvement Program: Cost-Effectiveness Tables Development and Methodology

    DOT National Transportation Integrated Search

    2015-05-01

    This document presents summary and detailed findings from a research effort to develop estimates of the cost-effectiveness of a range of project types funded under the Congestion Mitigation and Air Quality (CMAQ) Improvement Program. In this study, c...

  19. The effect of docetaxel on developing oedema in patients with breast cancer: a systematic review.

    PubMed

    Hugenholtz-Wamsteker, W; Robbeson, C; Nijs, J; Hoelen, W; Meeus, M

    2016-03-01

    Docetaxel is extensively used in chemotherapy for the treatment of breast cancer. Little attention has been given to oedema as a possible side effect of docetaxel-containing therapies. Until now, no review was conducted to evaluate docetaxel-containing therapies versus docetaxel-free therapies on the magnitude of the risk of developing oedema. In this systematic review, we investigated the risk of developing oedema in patients being treated for breast cancer with or without docetaxel. In this systematic literature review, we searched PubMed and Web of Knowledge for studies on breast cancer patients treated with chemotherapy containing docetaxel. We included clinical trials comparing docetaxel versus docetaxel-free chemotherapy. Oedema had to be reported and measured as a key outcome or an adverse effect. Methodological checklists were used to assess the risk of bias within the selected studies. Seven randomised clinical trials were included. Six trials were of moderate methodological quality. All trials showed an increased rate of oedema in the docetaxel-treatment arm. The trial of weakest methodological quality reported the highest incidence of oedema. The results moderately suggest that adjuvant chemotherapy containing docetaxel is related to a significantly increased risk of developing oedema, compared with docetaxel-free chemotherapy. © 2014 John Wiley & Sons Ltd.

  20. Multirate Flutter Suppression System Design for the Benchmark Active Controls Technology Wing. Part 2; Methodology Application Software Toolbox

    NASA Technical Reports Server (NTRS)

    Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek

    2002-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes the user's manual and software toolbox developed at the University of Washington to design a multirate flutter suppression control law for the BACT wing.

  1. An integrated methodology to assess the benefits of urban green space.

    PubMed

    De Ridder, K; Adamec, V; Bañuelos, A; Bruse, M; Bürger, M; Damsgaard, O; Dufek, J; Hirsch, J; Lefebre, F; Pérez-Lacorzana, J M; Thierry, A; Weber, C

    2004-12-01

    The interrelated issues of urban sprawl, traffic congestion, noise, and air pollution are major socioeconomic problems faced by most European cities. A methodology is currently being developed for evaluating the role of green space and urban form in alleviating the adverse effects of urbanisation, mainly focusing on the environment but also accounting for socioeconomic aspects. The objectives and structure of the methodology are briefly outlined and illustrated with preliminary results obtained from case studies performed on several European cities.

  2. Optimized planning methodologies of ASON implementation

    NASA Astrophysics Data System (ADS)

    Zhou, Michael M.; Tamil, Lakshman S.

    2005-02-01

    Advanced network planning concerns effective network-resource allocation for a dynamic and open business environment. Planning methodologies for ASON implementation based on qualitative analysis and mathematical modeling are presented in this paper. The methodology includes methods for rationalizing technology and architecture, building network and nodal models, and developing dynamic programming for multi-period deployment. The multi-layered nodal architecture proposed here can accommodate various nodal configurations for a multi-plane optical network, and the network modeling presented here computes the required network elements for optimizing resource allocation.

  3. Adult day health care evaluation study: methodology and implementation. Adult Day Health Care Evaluation Development Group.

    PubMed Central

    Hedrick, S C; Rothman, M L; Chapko, M; Inui, T S; Kelly, J R; Ehreth, J

    1991-01-01

    The Adult Day Health Care Evaluation Study was developed in response to a congressional mandate to study the medical efficacy and cost effectiveness of the Adult Day Health Care (ADHC) effort in the Department of Veterans Affairs (VA). Four sites providing ADHC in VA facilities are participating in an ongoing randomized controlled trial. Three years of developmental work prior to the study addressed methodological issues that were problematic in previous studies. This developmental work resulted in the methodological approaches described here: (1) a patient recruitment process that actively recruits and screens all potential candidates using empirically developed admission criteria based on predictors of nursing home placement in VA; (2) the selection and development of measures of medical efficacy that assess a wide range of patient and caregiver outcomes with sufficient sensitivity to detect small but clinically important changes; and (3) methods for detailed, accurate, and efficient measurement of utilization and costs of health care within and outside VA. These approaches may be helpful to other researchers and may advance the methodological sophistication of long-term care program evaluation. PMID:1991678

  4. Integrated Design Methodology for Highly Reliable Liquid Rocket Engine

    NASA Astrophysics Data System (ADS)

    Kuratani, Naoshi; Aoki, Hiroshi; Yasui, Masaaki; Kure, Hirotaka; Masuya, Goro

    An integrated design methodology is strongly required at the conceptual design phase to achieve highly reliable space transportation systems, especially propulsion systems, not only in Japan but worldwide. In the past, catastrophic failures have caused losses of mission and vehicle (LOM/LOV) during the operational phase and have severely affected schedules and costs during later development phases. In this study, a design methodology for highly reliable liquid rocket engines is preliminarily established and investigated. A sensitivity analysis is performed systematically to demonstrate the effectiveness of this methodology and, in particular, to clarify the correlations among the combustion chamber, turbopump, and main valve as main components. The study describes the essential issues for understanding these correlations, the need to apply the methodology to the remaining critical failure modes in the whole engine system, and the perspective on future engine development.

  5. Computational simulation of probabilistic lifetime strength for aerospace materials subjected to high temperature, mechanical fatigue, creep and thermal fatigue

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Bast, Callie C.; Trimble, Greg A.

    1992-01-01

    This report presents the results of a fourth year effort of a research program, conducted for NASA-LeRC by the University of Texas at San Antonio (UTSA). The research included on-going development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation has been randomized and is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typically used in aerospace propulsion system components. Material data for Inconel 718 have been analyzed using the developed methodology.
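    The abstract does not reproduce the equation itself; a hedged sketch of one commonly cited form of the multifactor interaction strength degradation model (symbols assumed here for illustration, not taken from the report) is

    \[ \frac{S}{S_0} \;=\; \prod_{i=1}^{n} \left[ \frac{A_{iF} - A_i}{A_{iF} - A_{i0}} \right]^{a_i} \]

    where S is the current strength, S_0 the reference strength, A_i the current value of primitive variable i (for example temperature or fatigue cycles), A_{i0} its reference value, A_{iF} its ultimate value, and a_i an empirically calibrated exponent. Randomizing A_{iF}, A_{i0}, and a_i yields a probabilistic version of the kind implemented in PROMISS, while regression on experimental data (as in PROMISC) supplies the exponents.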

  6. Computational simulation of probabilistic lifetime strength for aerospace materials subjected to high temperature, mechanical fatigue, creep, and thermal fatigue

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Bast, Callie C.; Trimble, Greg A.

    1992-01-01

    The results of a fourth year effort of a research program conducted for NASA-LeRC by The University of Texas at San Antonio (UTSA) are presented. The research included on-going development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue, or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation was randomized and is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typically used in aerospace propulsion system components. Material data for Inconel 718 were analyzed using the developed methodology.

  7. Use of evidence-based practice in an aid organisation: a proposal to deal with the variety in terminology and methodology.

    PubMed

    De Buck, Emmy; Pauwels, Nele S; Dieltjens, Tessa; Vandekerckhove, Philippe

    2014-03-01

    As part of its strategy, Belgian Red Cross-Flanders underpins all its activities with evidence-based guidelines and systematic reviews. The aim of this publication is to describe in detail the methodology used to achieve this goal within an action-oriented organisation, in a timely and cost-effective way. To demonstrate transparency in our methods, we wrote a methodological charter describing the way in which we develop evidence-based materials to support our activities. Criteria were drawn up for deciding on project priority and the choice of different types of projects (scoping reviews, systematic reviews and evidence-based guidelines). While searching for rigorous and realistically attainable methodological standards, we encountered a wide variety in the terminology and methodology used in the field of evidence-based practice. Terminologies currently being used by different organisations and institutions include systematic reviews, systematic literature searches, evidence-based guidelines, rapid reviews, pragmatic systematic reviews, and rapid response service. It is not always clear what definition and methodology lie behind these terms, or whether they are used consistently. We therefore describe the terminology and methodology used by Belgian Red Cross-Flanders; criteria for making methodological choices and details on the methodology we use are given. In our search for an appropriate methodology, taking into account time and resource constraints, we encountered an enormous variety of methodological approaches and terminology used for evidence-based materials. In light of this, we recommend that authors of evidence-based guidelines and reviews be transparent and clear about the methodology used. To be transparent about our approach, we developed a methodological charter. This charter may inspire other organisations that want to use evidence-based methodology to support their activities.

  8. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools exist and are easily accessible to anyone with an Internet connection, minimal technical skill, and a significantly reduced motivational threshold for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying the specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of what exploit technology and attack methodologies are being developed in the Information Technology (IT) security research community, across both the black hat and white hat communities. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified, and the gap between those threats and the defensive capabilities of control systems can be analyzed. The results of the gap analysis drive changes in the cyber security of critical infrastructure networks to close the gap between current exploits and existing defenses. The analysis also provides defenders with an idea of how threat technology is evolving and how defenses will need to be modified to address these emerging trends.

  9. Evidence for success in health promotion: suggestions for improvement.

    PubMed

    Macdonald, G; Veen, C; Tones, K

    1996-09-01

    This paper argues that health promotion needs to develop an approach to evaluation and effectiveness that values qualitative methodologies. It posits the idea that qualitative research could learn from the experience of quantitative researchers and promote more useful ways of measuring effectiveness by the use of intermediate and indirect indicators. It refers to a European-wide project designed to gather information on the effectiveness of health promotion interventions. This project discovered that there was a need for an instrument that allowed qualitative intervention methodologies to be assessed in the same way as quantitative methods.

  10. Development of a Flipped Medical School Dermatology Module.

    PubMed

    Fox, Joshua; Faber, David; Pikarsky, Solomon; Zhang, Chi; Riley, Richard; Mechaber, Alex; O'Connell, Mark; Kirsner, Robert S

    2017-05-01

    The flipped classroom module incorporates independent study in advance of in-class instructional sessions. It is unproven whether this methodology is effective within a medical school second-year organ system module. We report the development, implementation, and effectiveness of the flipped classroom methodology in a second-year medical student dermatology module at the University of Miami Leonard M. Miller School of Medicine. In a retrospective cohort analysis, we compared attitudinal survey data and mean scores for a 50-item multiple-choice final examination of the second-year medical students who participated in this 1-week flipped course with those of the previous year's traditional, lecture-based course. Each group comprised nearly 200 students. Students' age, sex, Medical College Admission Test scores, and undergraduate grade point averages were comparable between the flipped and traditional classroom students. The flipped module students' mean final examination score of 92.71% ± 5.03% was greater than the traditional module students' score of 90.92% ± 5.51% (P < 0.001). Three of the five most commonly missed questions were identical between the two cohorts. The majority of students preferred the flipped methodology to attending live lectures or watching previously recorded lectures. The flipped classroom can be an effective instructional methodology for a medical school second-year organ system module.
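    A Welch-type two-sample t-test can be reproduced directly from the reported summary statistics. The sketch below assumes roughly 200 students per cohort, as stated in the abstract, so the computed P value is illustrative only.

```python
# Hedged sketch: comparing the two cohort means from summary statistics only.
# Group sizes (~200 each) are assumed from the abstract; exact values are unknown.
from scipy.stats import ttest_ind_from_stats

t_stat, p_value = ttest_ind_from_stats(
    mean1=92.71, std1=5.03, nobs1=200,   # flipped-module cohort (assumed n)
    mean2=90.92, std2=5.51, nobs2=200,   # traditional-module cohort (assumed n)
    equal_var=False,                     # Welch's t-test
)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # consistent with the reported P < 0.001
```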

  11. Impacts of Outer Continental Shelf (OCS) development on recreation and tourism. Volume 3. Detailed methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.

  12. Interactive multi-mode blade impact analysis

    NASA Technical Reports Server (NTRS)

    Alexander, A.; Cornell, R. W.

    1978-01-01

    The theoretical methodology used in developing an analysis for the response of turbine engine fan blades subjected to soft-body (bird) impacts is reported, and the computer program developed using this methodology as its basis is described. This computer program is an outgrowth of two programs that were previously developed for the purpose of studying problems of a similar nature (a 3-mode beam impact analysis and a multi-mode beam impact analysis). The present program utilizes an improved missile model that is interactively coupled with blade motion which is more consistent with actual observations. It takes into account local deformation at the impact area, blade camber effects, and the spreading of the impacted missile mass on the blade surface. In addition, it accommodates plate-type mode shapes. The analysis capability in this computer program represents a significant improvement in the development of the methodology for evaluating potential fan blade materials and designs with regard to foreign object impact resistance.

  13. Assessing human rights impacts in corporate development projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salcito, Kendyl, E-mail: kendyl.salcito@unibas.ch; University of Basel, P.O. Box, CH-4003 Basel; NomoGaia, 1900 Wazee Street, Suite 303, Denver, CO 80202

    Human rights impact assessment (HRIA) is a process for systematically identifying, predicting and responding to the potential impact on human rights of a business operation, capital project, government policy or trade agreement. Traditionally, it has been conducted as a desktop exercise to predict the effects of trade agreements and government policies on individuals and communities. In line with a growing call for multinational corporations to ensure they do not violate human rights in their activities, HRIA is increasingly incorporated into the standard suite of corporate development project impact assessments. In this context, the policy world's non-structured, desk-based approaches to HRIA are insufficient. Although a number of corporations have commissioned and conducted HRIA, no broadly accepted and validated assessment tool is currently available. The lack of standardisation has complicated efforts to evaluate the effectiveness of HRIA as a risk mitigation tool, and has caused confusion in the corporate world regarding company duties. Hence, clarification is needed. The objectives of this paper are (i) to describe an HRIA methodology, (ii) to provide a rationale for its components and design, and (iii) to illustrate implementation of HRIA using the methodology in two selected corporate development projects—a uranium mine in Malawi and a tree farm in Tanzania. We found that as a prognostic tool, HRIA could examine potential positive and negative human rights impacts and provide effective recommendations for mitigation. However, longer-term monitoring revealed that recommendations were unevenly implemented, dependent on market conditions and personnel movements. This instability in the approach to human rights suggests a need for on-going monitoring and surveillance. -- Highlights: • We developed a novel methodology for corporate human rights impact assessment. • We piloted the methodology on two corporate projects—a mine and a plantation. • Human rights impact assessment exposed impacts not foreseen in ESIA. • Corporations adopted the majority of findings, but not necessarily immediately. • Methodological advancements are expected for monitoring processes.

  14. Geothermal Resource Reporting Metric (GRRM) Developed for the U.S. Department of Energy's Geothermal Technologies Office

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, Katherine R.; Wall, Anna M.; Dobson, Patrick F.

    This paper reviews a methodology being developed for reporting geothermal resources and project progress. The goal is to provide the U.S. Department of Energy's (DOE) Geothermal Technologies Office (GTO) with a consistent and comprehensible means of evaluating the impacts of its funding programs. This framework will allow the GTO to assess the effectiveness of research, development, and deployment (RD&D) funding, prioritize funding requests, and demonstrate the value of RD&D programs to the U.S. Congress and the public. Standards and reporting codes used in other countries and energy sectors provide guidance to develop the relevant geothermal methodology, but industry feedback and our analysis suggest that the existing models have drawbacks that should be addressed. In order to formulate a comprehensive metric for use by the GTO, we analyzed existing resource assessments and reporting methodologies for the geothermal, mining, and oil and gas industries, and sought input from industry, investors, academia, national labs, and other government agencies. Using this background research as a guide, we describe a methodology for evaluating and reporting on GTO funding according to resource grade (geological, technical and socio-economic) and project progress. This methodology would allow GTO to target funding, measure impact by monitoring the progression of projects, or assess geological potential of targeted areas for development.

  15. Diet, Environment and Children's Development.

    ERIC Educational Resources Information Center

    Senemaud, B.

    1988-01-01

    This report describes the relationship between maternal malnutrition and child development. The report is divided into three sections. The first section, which describes child development, focuses on brain, mental, and psychomotor development. The second section describes the methodological difficulties of measuring effects of malnutrition on the…

  16. Procedures for Trade and Industrial Program Development.

    ERIC Educational Resources Information Center

    Campbell, Clifton P.

    The instructional systems development (ISD) approach for the development and accomplishment of vocational training programs provides a methodology for gathering and analyzing job information, developing instructional materials in a variety of media, conducting instruction, and evaluating and improving the effectiveness of training programs. This…

  17. A Practical Methodology for Disaggregating the Drivers of Drug Costs Using Administrative Data.

    PubMed

    Lungu, Elena R; Manti, Orlando J; Levine, Mitchell A H; Clark, Douglas A; Potashnik, Tanya M; McKinley, Carol I

    2017-09-01

    Prescription drug expenditures represent a significant component of health care costs in Canada, with estimates of $28.8 billion spent in 2014. Identifying the major cost drivers and the effect they have on prescription drug expenditures allows policy makers and researchers to interpret current cost pressures and anticipate future expenditure levels. To identify the major drivers of prescription drug costs and to develop a methodology to disaggregate the impact of each of the individual drivers. The methodology proposed in this study uses the Laspeyres approach for cost decomposition. This approach isolates the effect of the change in a specific factor (e.g., price) by holding the other factor(s) (e.g., quantity) constant at the base-period value. The Laspeyres approach is expanded to a multi-factorial framework to isolate and quantify several factors that drive prescription drug cost. Three broad categories of effects are considered: volume, price and drug-mix effects. For each category, important sub-effects are quantified. This study presents a new and comprehensive methodology for decomposing the change in prescription drug costs over time including step-by-step demonstrations of how the formulas were derived. This methodology has practical applications for health policy decision makers and can aid researchers in conducting cost driver analyses. The methodology can be adjusted depending on the purpose and analytical depth of the research and data availability. © 2017 Journal of Population Therapeutics and Clinical Pharmacology. All rights reserved.
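    As a hedged sketch of the Laspeyres idea described above (drug names and numbers are illustrative, not from the study): the change in total cost is split into a price effect computed at base-period quantities and a volume effect computed at base-period prices, with any remainder attributable to interaction or drug-mix effects.

```python
# Hedged sketch of a Laspeyres-style cost decomposition (illustrative data only).
base = {"drug_A": (10.0, 1000), "drug_B": (25.0, 400)}   # price, quantity in base period
curr = {"drug_A": (11.0, 1200), "drug_B": (24.0, 500)}   # price, quantity in current period

cost_base = sum(p * q for p, q in base.values())
cost_curr = sum(p * q for p, q in curr.values())

# Price effect: change in prices, holding quantities at base-period values.
price_effect = sum((curr[d][0] - base[d][0]) * base[d][1] for d in base)
# Volume effect: change in quantities, holding prices at base-period values.
volume_effect = sum((curr[d][1] - base[d][1]) * base[d][0] for d in base)
# Residual (interaction / mix) effect: whatever the first-order terms do not explain.
residual = (cost_curr - cost_base) - price_effect - volume_effect

print(price_effect, volume_effect, residual)
```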

  18. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Astrophysics Data System (ADS)

    Wray, Richard B.

    1991-12-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships were defined, a static analysis was performed to produce a static avionics model, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.

  19. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.

    1991-01-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships were defined, a static analysis was performed to produce a static avionics model, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.

  20. Remote-sensing application for facilitating land resource assessment and monitoring for utility-scale solar energy development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamada, Yuki; Grippo, Mark A.

    2015-01-01

    A monitoring plan that incorporates regional datasets and integrates cost-effective data collection methods is necessary to sustain the long-term environmental monitoring of utility-scale solar energy development in expansive, environmentally sensitive desert environments. Using very high spatial resolution (VHSR; 15 cm) multispectral imagery collected in November 2012 and January 2014, an image processing routine was developed to characterize ephemeral streams, vegetation, and land surface in the southwestern United States where increased utility-scale solar development is anticipated. In addition to knowledge about desert landscapes, the methodology integrates existing spectral indices and transformation (e.g., visible atmospherically resistant index and principal components); a newly developed index, erosion resistance index (ERI); and digital terrain and surface models, all of which were derived from a common VHSR image. The methodology identified fine-scale ephemeral streams with greater detail than the National Hydrography Dataset and accurately estimated vegetation distribution and fractional cover of various surface types. The ERI classified surface types that have a range of erosive potentials. The remote-sensing methodology could ultimately reduce uncertainty and monitoring costs for all stakeholders by providing a cost-effective monitoring approach that accurately characterizes the land resources at potential development sites.
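    As an illustration of the kind of spectral index mentioned above, the visible atmospherically resistant index (VARI) can be computed per pixel from the red, green, and blue bands. The array names, reflectance values, and epsilon guard below are assumptions; the study's ERI is not reproduced here because its definition is specific to that work.

```python
import numpy as np

def vari(red: np.ndarray, green: np.ndarray, blue: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Visible atmospherically resistant index, (G - R) / (G + R - B), per pixel."""
    return (green - red) / (green + red - blue + eps)

# Illustrative 2x2 reflectance patches (values assumed, not from the study imagery).
red   = np.array([[0.20, 0.25], [0.30, 0.22]])
green = np.array([[0.35, 0.28], [0.31, 0.40]])
blue  = np.array([[0.15, 0.18], [0.20, 0.16]])
print(vari(red, green, blue))
```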

  1. Cost-effectiveness methodology for computer systems selection

    NASA Technical Reports Server (NTRS)

    Vallone, A.; Bajaj, K. S.

    1980-01-01

    A new approach to the problem of selecting a computer system design has been developed. The purpose of this methodology is to identify a system design that is capable of fulfilling system objectives in the most economical way. The methodology characterizes each system design by the cost of the system life cycle and by the system's effectiveness in reaching objectives. Cost is measured by a 'system cost index' derived from an analysis of all expenditures and possible revenues over the system life cycle. Effectiveness is measured by a 'system utility index' obtained by combining the impact that each selection factor has on the system objectives, with each impact assessed through a 'utility curve'. A preestablished algorithm combines cost and utility and provides a ranking of the alternative system designs from which the 'best' design is selected.
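    The abstract does not give the combination algorithm, so the sketch below simply assumes one plausible rule (utility divided by cost, higher is better) to show how candidate designs could be ordered once the two indices are known; the names and numbers are invented.

```python
# Hedged sketch: ranking candidate system designs from a cost index and a utility index.
# The combination rule (utility / cost) is an assumption; the paper's algorithm is not specified.
designs = {
    "design_A": {"cost_index": 1.00, "utility_index": 0.72},
    "design_B": {"cost_index": 1.35, "utility_index": 0.90},
    "design_C": {"cost_index": 0.80, "utility_index": 0.55},
}

ranked = sorted(
    designs.items(),
    key=lambda kv: kv[1]["utility_index"] / kv[1]["cost_index"],
    reverse=True,
)
for name, scores in ranked:
    print(name, round(scores["utility_index"] / scores["cost_index"], 3))
```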

  2. Reactor safeguards system assessment and design. Volume I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Varnado, G.B.; Ericson, D.M. Jr.; Daniel, S.L.

    1978-06-01

    This report describes the development and application of a methodology for evaluating the effectiveness of nuclear power reactor safeguards systems. Analytic techniques are used to identify the sabotage acts which could lead to release of radioactive material from a nuclear power plant, to determine the areas of a plant which must be protected to assure that significant release does not occur, to model the physical plant layout, and to evaluate the effectiveness of various safeguards systems. The methodology was used to identify those aspects of reactor safeguards systems which have the greatest effect on overall system performance and which, therefore, should be emphasized in the licensing process. With further refinements, the methodology can be used by the licensing reviewer to aid in assessing proposed or existing safeguards systems.

  3. Methodology of project management at implementation of projects of high-rise construction

    NASA Astrophysics Data System (ADS)

    Papelniuk, Oksana

    2018-03-01

    High-rise construction is a promising direction in urban development. The opportunity to place a large amount of residential and commercial space on a relatively small land plot makes high-rise construction very attractive for developers. However, investment projects for the construction of high-rise buildings are very expensive and complex, which makes effective management of such projects essential for the construction company. The best tool in this area today is the methodology of project management, which becomes a key factor of efficiency.

  4. Distributed collaborative team effectiveness: measurement and process improvement

    NASA Technical Reports Server (NTRS)

    Wheeler, R.; Hihn, J.; Wilkinson, B.

    2002-01-01

    This paper describes a measurement methodology developed for assessing the readiness, and identifying opportunities for improving the effectiveness, of distributed collaborative design teams preparing to conduct a concurrent design session.

  5. Bayesian WLS/GLS regression for regional skewness analysis for regions with large crest stage gage networks

    USGS Publications Warehouse

    Veilleux, Andrea G.; Stedinger, Jery R.; Eash, David A.

    2012-01-01

    This paper summarizes methodological advances in regional log-space skewness analyses that support flood-frequency analysis with the log Pearson Type III (LP3) distribution. A Bayesian Weighted Least Squares/Generalized Least Squares (B-WLS/B-GLS) methodology that relates observed skewness coefficient estimators to basin characteristics in conjunction with diagnostic statistics represents an extension of the previously developed B-GLS methodology. B-WLS/B-GLS has been shown to be effective in two California studies. B-WLS/B-GLS uses B-WLS to generate stable estimators of model parameters and B-GLS to estimate the precision of those B-WLS regression parameters, as well as the precision of the model. The study described here employs this methodology to develop a regional skewness model for the State of Iowa. To provide cost effective peak-flow data for smaller drainage basins in Iowa, the U.S. Geological Survey operates a large network of crest stage gages (CSGs) that only record flow values above an identified recording threshold (thus producing a censored data record). CSGs are different from continuous-record gages, which record almost all flow values and have been used in previous B-GLS and B-WLS/B-GLS regional skewness studies. The complexity of analyzing a large CSG network is addressed by using the B-WLS/B-GLS framework along with the Expected Moments Algorithm (EMA). Because EMA allows for the censoring of low outliers, as well as the use of estimated interval discharges for missing, censored, and historic data, it complicates the calculations of effective record length (and effective concurrent record length) used to describe the precision of sample estimators because the peak discharges are no longer solely represented by single values. Thus new record length calculations were developed. The regional skewness analysis for the State of Iowa illustrates the value of the new B-WLS/B-GLS methodology with these new extensions.
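    The full B-WLS/B-GLS procedure involves Bayesian estimation and cross-correlation modeling that is beyond a short sketch, but the core weighted-least-squares step, regressing at-site skew estimates on basin characteristics with weights reflecting record length, can be illustrated as follows. All variable names and values are assumed and the data are synthetic.

```python
# Hedged sketch of the weighted-least-squares step only (not the full Bayesian B-WLS/B-GLS method).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_sites = 30
drainage_area = rng.uniform(10, 500, n_sites)            # assumed basin characteristic
station_skew = 0.2 + 0.0005 * drainage_area + rng.normal(0, 0.3, n_sites)
record_length = rng.integers(15, 80, n_sites)             # longer records -> more reliable skew

X = sm.add_constant(np.log(drainage_area))                # regress skew on log drainage area
weights = record_length / record_length.mean()            # weight by (scaled) record length
model = sm.WLS(station_skew, X, weights=weights).fit()
print(model.params)                                       # regional skew model coefficients
```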

  6. Effect of periodontal treatment on preterm birth rate: a systematic review of meta-analyses.

    PubMed

    López, Néstor J; Uribe, Sergio; Martinez, Benjamín

    2015-02-01

    Preterm birth is a major cause of neonatal morbidity and mortality in both developed and developing countries. Preterm birth is a highly complex syndrome that includes distinct clinical subtypes in which many different causes may be involved. The results of epidemiological, molecular, microbiological and animal-model studies support a positive association between maternal periodontal disease and preterm birth. However, the results of intervention studies carried out to determine the effect of periodontal treatment on reducing the risk of preterm birth are controversial. This systematic review critically analyzes the methodological issues of meta-analyses of the studies to determine the effect of periodontal treatment to reduce preterm birth. The quality of the individual randomized clinical trials selected is of highest relevance for a systematic review. This article describes the methodological features that should be identified a priori and assessed individually to determine the quality of a randomized controlled trial performed to evaluate the effect of periodontal treatment on pregnancy outcomes. The AMSTAR and the PRISMA checklist tools were used to assess the quality of the six meta-analyses selected, and the bias domain of the Cochrane Collaboration's Tool was applied to evaluate each of the trials included in the meta-analyses. In addition, the methodological characteristics of each clinical trial were assessed. The majority of the trials included in the meta-analyses have significant methodological flaws that threaten their internal validity. The lack of effect of periodontal treatment on preterm birth rate concluded by four meta-analyses, and the positive effect of treatment for reducing preterm birth risk concluded by the remaining two meta-analyses are not based on consistent scientific evidence. Well-conducted randomized controlled trials using rigorous methodology, including appropriate definition of the exposure, adequate control of confounders for preterm birth and application of effective periodontal interventions to eliminate periodontal infection, are needed to confirm the positive association between periodontal disease and preterm birth. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. Development of performance measurement for freight transportation.

    DOT National Transportation Integrated Search

    2014-09-01

    In this project, the researchers built a set of performance measures that are unified, user-oriented, scalable, systematic, effective, and : calculable for intermodal freight management and developed methodologies to calculate and use the measures. :...

  8. Brake testing methodology study : driver effects testing

    DOT National Transportation Integrated Search

    1999-03-01

    The National Highway Traffic Safety Administration (NHTSA) is exploring the feasibility of developing brake tests to measure brake system performance of light vehicles. Developing test procedures requires controlling test variability so that measured...

  9. A proven approach for more effective software development and maintenance

    NASA Technical Reports Server (NTRS)

    Pajerski, Rose; Hall, Dana; Sinclair, Craig

    1994-01-01

    Modern space flight mission operations and associated ground data systems are increasingly dependent upon reliable, quality software. Critical functions such as command load preparation, health and status monitoring, communications link scheduling and conflict resolution, and transparent gateway protocol conversion are routinely performed by software. Given budget constraints and the ever increasing capabilities of processor technology, the next generation of control centers and data systems will be even more dependent upon software across all aspects of performance. A key challenge now is to implement improved engineering, management, and assurance processes for the development and maintenance of that software; processes that cost less, yield higher quality products, and that self-correct for continual improvement evolution. The NASA Goddard Space Flight Center has a unique experience base that can be readily tapped to help solve the software challenge. Over the past eighteen years, the Software Engineering Laboratory within the Code 500 Flight Dynamics Division has evolved a software development and maintenance methodology that accommodates the unique characteristics of an organization while optimizing and continually improving the organization's software capabilities. This methodology relies upon measurement, analysis, and feedback, much as in a control loop system. It is an approach with a time-tested track record proven through repeated applications across a broad range of operational software development and maintenance projects. This paper describes the software improvement methodology employed by the Software Engineering Laboratory, and how it has been exploited within the Flight Dynamics Division of GSFC Code 500. Examples of specific improvement in the software itself and its processes are presented to illustrate the effectiveness of the methodology. Finally, initial findings are given from applying this methodology across the mission operations and ground data systems software domains throughout Code 500.

  10. Vein matching using artificial neural network in vein authentication systems

    NASA Astrophysics Data System (ADS)

    Noori Hoshyar, Azadeh; Sulaiman, Riza

    2011-10-01

    Personal identification technology for security systems is developing rapidly. Traditional authentication modes such as keys, passwords, and cards are not safe enough because they can be stolen or easily forgotten. Biometrics, as a maturing technology, has been applied to a wide range of systems. According to different researchers, vein biometrics is a good candidate for authentication systems among biometric traits such as fingerprint, hand geometry, voice, and DNA. Vein authentication systems can be designed using different methodologies, all of which include a matching stage that is critical for the final verification of the system. The neural network is an effective methodology for matching and recognizing individuals in authentication systems. Therefore, this paper explains and implements a neural network methodology for a finger vein authentication system. The neural network is trained in Matlab to match the vein features of the authentication system. The network simulation shows a matching accuracy of 95%, which is good performance for authentication system matching.
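    The study trains its network in Matlab; as a rough, assumption-laden sketch of the same idea in Python, a small feed-forward classifier can be trained to decide which enrolled identity a vector of extracted vein features belongs to. Feature extraction itself is not shown, and the data here are synthetic.

```python
# Hedged sketch: a small neural-network matcher over synthetic "vein feature" vectors.
# Real systems would use features extracted from finger-vein images; data here are random.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_subjects, samples_per_subject, n_features = 10, 20, 64
X = np.vstack([rng.normal(loc=i, scale=0.5, size=(samples_per_subject, n_features))
               for i in range(n_subjects)])
y = np.repeat(np.arange(n_subjects), samples_per_subject)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0).fit(X_train, y_train)
print("matching accuracy:", clf.score(X_test, y_test))
```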

  11. Sexual health education interventions for young people: a methodological review.

    PubMed Central

    Oakley, A.; Fullerton, D.; Holland, J.; Arnold, S.; France-Dawson, M.; Kelley, P.; McGrellis, S.

    1995-01-01

    OBJECTIVES--To locate reports of sexual health education interventions for young people, assess the methodological quality of evaluations, identify the subgroup with a methodologically sound design, and assess the evidence with respect to the effectiveness of different approaches to promoting young people's sexual health. DESIGN--Survey of reports in English by means of electronic databases and hand searches for relevant studies conducted in the developed world since 1982. Papers were reviewed for eight methodological qualities. The evidence on effectiveness generated by studies meeting four core criteria was assessed. Judgments on effectiveness by reviewers and authors were compared. PAPERS--270 papers reporting sexual health interventions. MAIN OUTCOME MEASURE--The methodological quality of evaluations. RESULTS--73 reports of evaluations of sexual health interventions examining the effectiveness of these interventions in changing knowledge, attitudes, or behavioural outcomes were identified, of which 65 were separate outcome evaluations. Of these studies, 45 (69%) lacked random control groups, 44 (68%) failed to present preintervention and 38 (59%) postintervention data, and 26 (40%) omitted to discuss the relevance of loss of data caused by drop outs. Only 12 (18%) of the 65 outcome evaluations were judged to be methodologically sound. Academic reviewers were more likely than authors to judge studies as unclear because of design faults. Only two of the sound evaluations recorded interventions which were effective in showing an impact on young people's sexual behaviour. CONCLUSIONS--The design of evaluations in sexual health intervention needs to be improved so that reliable evidence of the effectiveness of different approaches to promoting young people's sexual health may be generated. PMID:7833754

  12. On the generalized VIP time integral methodology for transient thermal problems

    NASA Technical Reports Server (NTRS)

    Mei, Youping; Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong

    1993-01-01

    The paper describes the development and applicability of a generalized VIrtual-Pulse (VIP) time integral method of computation for thermal problems. With the advent of high-speed computing technology and the growing importance of parallel computation for efficient use of computing environments, a major motivation for the developments described in this paper, in contrast to past approaches for general heat transfer computations, is the need for explicit computational procedures with improved accuracy and stability characteristics. As a consequence, a new and effective VIP methodology is described which possesses these improved characteristics. Illustrative numerical examples are provided to demonstrate the developments and validate the results obtained for thermal problems.
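    The VIP formulation itself is not reproduced here; as a point of reference for what an "explicit computational procedure" for transient heat conduction looks like, the sketch below advances a 1-D rod one step with a standard forward-Euler update. The VIP method differs precisely in its accuracy and stability characteristics; all parameter values are illustrative.

```python
# Hedged sketch: a standard explicit (forward-Euler) step for 1-D transient heat conduction.
# This is a generic reference scheme, not the VIP method described in the paper.
import numpy as np

alpha, dx, dt = 1.0e-5, 0.01, 2.0     # diffusivity, grid spacing, time step (illustrative)
r = alpha * dt / dx**2                # must satisfy r <= 0.5 for stability of this scheme
assert r <= 0.5

T = np.linspace(100.0, 20.0, 51)      # initial temperature profile along the rod
T_new = T.copy()
T_new[1:-1] = T[1:-1] + r * (T[2:] - 2.0 * T[1:-1] + T[:-2])   # interior-node update
print(T_new[:5])
```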

  13. Knowledge-based system verification and validation

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1990-01-01

    The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBS using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worcester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS for the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBS's cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBS's as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBS's will be developed, and modification of the SSF to support these tools and methods will be addressed.
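    Of the three techniques listed, consistency and completeness checking is the most straightforward to illustrate. The toy rule base and conflict test below are assumptions for illustration only, not the Lockheed AI Center tool.

```python
# Hedged sketch: detecting contradictory rules in a tiny rule base (illustrative only).
# A pair of rules is flagged if identical conditions lead to different conclusions.
rules = [
    ({"pressure": "low", "valve": "open"},  "action: close_valve"),
    ({"pressure": "low", "valve": "open"},  "action: open_backup"),
    ({"pressure": "high"},                  "action: vent"),
]

conflicts = [
    (i, j)
    for i, (cond_i, concl_i) in enumerate(rules)
    for j, (cond_j, concl_j) in enumerate(rules)
    if i < j and cond_i == cond_j and concl_i != concl_j
]
print("potentially conflicting rule pairs:", conflicts)   # -> [(0, 1)]
```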

  14. Recent Approaches to Estimate Associations Between Source-Specific Air Pollution and Health.

    PubMed

    Krall, Jenna R; Strickland, Matthew J

    2017-03-01

    Estimating health effects associated with source-specific exposure is important for better understanding how pollution impacts health and for developing policies to better protect public health. Although epidemiologic studies of sources can be informative, these studies are challenging to conduct because source-specific exposures (e.g., particulate matter from vehicles) often are not directly observed and must be estimated. We reviewed recent studies that estimated associations between pollution sources and health to identify methodological developments designed to address important challenges. Notable advances in epidemiologic studies of sources include approaches for (1) propagating uncertainty in source estimation into health effect estimates, (2) assessing regional and seasonal variability in emissions sources and source-specific health effects, and (3) addressing potential confounding in estimated health effects. Novel methodological approaches to address challenges in studies of pollution sources, particularly evaluation of source-specific health effects, are important for determining how source-specific exposure impacts health.

  15. Development of a general methodology for labelling peptide-morpholino oligonucleotide conjugates using alkyne-azide click chemistry.

    PubMed

    Shabanpoor, Fazel; Gait, Michael J

    2013-11-11

    We describe a general methodology for fluorescent labelling of peptide conjugates of phosphorodiamidate morpholino oligonucleotides (PMOs) by alkyne functionalization of peptides, subsequent conjugation to PMOs and labelling with a fluorescent compound (Cy5-azide). Two peptide-PMO (PPMO) examples are shown. No detrimental effect of such labelled PMOs was seen in a biological assay.

  16. Advanced biosensing methodologies developed for evaluating performance quality and safety of emerging biophotonics technologies and medical devices (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Ilev, Ilko K.; Walker, Bennett; Calhoun, William; Hassan, Moinuddin

    2016-03-01

    Biophotonics is an emerging field in modern biomedical technology that has opened up new horizons for transfer of state-of-the-art techniques from the areas of lasers, fiber optics and biomedical optics to the life sciences and medicine. This field continues to vastly expand with advanced developments across the entire spectrum of biomedical applications ranging from fundamental "bench" laboratory studies to clinical patient "bedside" diagnostics and therapeutics. However, in order to translate these technologies to clinical device applications, the scientific and industrial communities and the FDA face the requirement for a thorough evaluation and review of laser radiation safety and efficacy concerns. In many cases, however, the review process is complicated due to the lack of effective means and standard test methods to precisely analyze the safety and effectiveness of some of the newly developed biophotonics techniques and devices. There is, therefore, an immediate public health need for new test protocols, guidance documents and standard test methods to precisely evaluate fundamental characteristics, performance quality and safety of these technologies and devices. Here, we will overview our recent developments of novel test methodologies for safety and efficacy evaluation of some emerging biophotonics technologies and medical devices. These methodologies are based on integrating the advanced features of state-of-the-art optical sensor technologies and approaches such as high-resolution fiber-optic sensing, confocal and optical coherence tomography imaging, and infrared spectroscopy. The presentation will also illustrate some methodologies developed and implemented for testing intraocular lens implants, biochemical contamination of medical devices, ultrahigh-resolution nanoscopy, and femtosecond laser therapeutics.

  17. Development of a flight software testing methodology

    NASA Technical Reports Server (NTRS)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
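    The abstract describes embedding executable assertions in flight control code; a minimal sketch of the idea is shown below, with invented parameter names and bounds, where range and rate-of-change assertions provide the kind of indirect, collateral checking of other parameters mentioned above.

```python
# Hedged sketch of executable assertions for a control-law update (names and bounds invented).
def update_pitch_command(prev_cmd_deg: float, sensed_rate_dps: float, dt: float) -> float:
    new_cmd_deg = prev_cmd_deg + 0.5 * sensed_rate_dps * dt   # placeholder control law

    # Range assertion on an output parameter.
    assert -25.0 <= new_cmd_deg <= 25.0, "pitch command out of range"
    # Rate-of-change assertion: indirectly checks dt and the sensed rate (collateral testing).
    assert abs(new_cmd_deg - prev_cmd_deg) <= 5.0, "pitch command changed too fast"
    return new_cmd_deg

print(update_pitch_command(2.0, 1.5, 0.02))
```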

  18. Exploring the possibility of modeling a genetic counseling guideline using agile methodology.

    PubMed

    Choi, Jeeyae

    2013-01-01

    Increased demand for genetic counseling services has heightened the need for a computerized genetic counseling decision support system. In order to develop an effective and efficient computerized system, modeling of genetic counseling guidelines is an essential step. In this pilot study, Agile methodology with the Unified Modeling Language (UML) was used to model a guideline. 13 tasks and 14 associated elements were extracted. Successfully constructed conceptual class and activity diagrams revealed that Agile methodology with UML is a suitable tool for modeling a genetic counseling guideline.

  19. Analysis of the impact of safeguards criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mullen, M.F.; Reardon, P.T.

    As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; there are important interactions between variables. That is, the effects of a given variable often depend on the level or value of some other variable. With the methodology used in this study, these interactions can be quantitatively analyzed; reasonably good approximate prediction equations can be developed using the methodology described here.
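    The abstract lists detection probability, false alarm probability, goal quantity, and measurement capability as interacting parameters. A hedged sketch of a standard one-sided-test relation among them, not necessarily the Task C.5 model, is shown below; detection probability rises with the ratio of goal quantity to total measurement uncertainty, and all numbers are illustrative.

```python
# Hedged sketch: detection probability for a one-sided materials-balance test (textbook relation,
# not necessarily the model used in the Task C.5 programs). All values are illustrative.
from scipy.stats import norm

false_alarm_prob = 0.05      # alpha
goal_quantity_kg = 8.0       # amount of material to be detected
sigma_muf_kg = 2.5           # combined operator/inspector measurement uncertainty (1-sigma)

z_alpha = norm.ppf(1.0 - false_alarm_prob)
detection_prob = 1.0 - norm.cdf(z_alpha - goal_quantity_kg / sigma_muf_kg)
print(f"probability of detection: {detection_prob:.2f}")
```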

  20. Measuring the Effectiveness of Professional Development in Early Literacy: Lessons Learned. PREL Research Brief

    ERIC Educational Resources Information Center

    Hammond, Ormond

    2005-01-01

    This Research Brief focuses on the methodology used to measure professional development (PD) effectiveness. It examines the needs that generated this research, what PREL did to meet those needs, and lessons that have been learned as a result. In particular, it discusses the development of a new instrument designed to measure the quality of PD as…

  1. Incorporating social network effects into cost-effectiveness analysis: a methodological contribution with application to obesity prevention

    PubMed Central

    Konchak, Chad; Prasad, Kislaya

    2012-01-01

    Objectives To develop a methodology for integrating social networks into traditional cost-effectiveness analysis (CEA) studies. This will facilitate the economic evaluation of treatment policies in settings where health outcomes are subject to social influence. Design This is a simulation study based on a Markov model. The lifetime health histories of a cohort are simulated, and health outcomes compared, under alternative treatment policies. Transition probabilities depend on the health of others with whom there are shared social ties. Setting The methodology developed is shown to be applicable in any healthcare setting where social ties affect health outcomes. The example of obesity prevention is used for illustration under the assumption that weight changes are subject to social influence. Main outcome measures Incremental cost-effectiveness ratio (ICER). Results When social influence increases, treatment policies become more cost effective (have lower ICERs). The policy of only treating individuals who span multiple networks can be more cost effective than the policy of treating everyone. This occurs when the network is more fragmented. Conclusions (1) When network effects are accounted for, they result in very different values of incremental cost-effectiveness ratios (ICERs). (2) Treatment policies can be devised to take network structure into account. The integration makes it feasible to conduct a cost-benefit evaluation of such policies. PMID:23117559
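    A compact sketch of the study's general idea, not its actual model, appears below: a two-state Markov cohort on a small contact network, where the probability of moving to the unhealthy state rises with the fraction of unhealthy neighbours, and treatment is compared with no treatment via the incremental cost-effectiveness ratio. All probabilities, costs, QALY weights, and the network itself are invented.

```python
# Hedged sketch: Markov cohort with social influence on transitions, and an ICER comparison.
# All parameters (probabilities, costs, QALY weights, network) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n, cycles = 200, 20
neighbours = [rng.choice(n, size=5, replace=False) for _ in range(n)]   # random contact lists

def simulate(treated: bool):
    state = np.zeros(n, dtype=int)            # 0 = healthy, 1 = unhealthy
    cost, qaly = 0.0, 0.0
    for _ in range(cycles):
        frac_unhealthy = np.array([state[nb].mean() for nb in neighbours])
        p_become_unhealthy = 0.05 + 0.20 * frac_unhealthy   # social-influence term
        p_recover = 0.30 if treated else 0.10
        draws = rng.random(n)
        state = np.where(state == 0,
                         (draws < p_become_unhealthy).astype(int),
                         (draws < p_recover).astype(int) ^ 1)
        cost += 500.0 * treated * n + 1000.0 * state.sum()   # treatment + illness costs
        qaly += (n - state.sum()) * 1.0 + state.sum() * 0.7
    return cost, qaly

cost_tx, qaly_tx = simulate(treated=True)
cost_no, qaly_no = simulate(treated=False)
icer = (cost_tx - cost_no) / (qaly_tx - qaly_no)
print(f"ICER: {icer:.0f} per QALY gained")
```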

  2. Capturing security requirements for software systems.

    PubMed

    El-Hadary, Hassan; El-Kassas, Sherif

    2014-07-01

    Security is often an afterthought during software development. Addressing security early, especially in the requirements phase, is important so that security problems can be tackled before the process goes further and rework can be avoided. A more effective approach for security requirements engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames. The methodology aims at early integration of security with software development. The main goal of the methodology is to assist developers in eliciting adequate security requirements in a more systematic way during the requirements engineering process. A security catalog, based on the problem frames, is constructed to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We have made use of evaluation criteria to evaluate the resulting security requirements, concentrating on the identification of conflicts among requirements. We have shown that more complete security requirements can be elicited by such a methodology, in addition to the assistance it offers developers in eliciting security requirements in a more systematic way.

  3. Capturing security requirements for software systems

    PubMed Central

    El-Hadary, Hassan; El-Kassas, Sherif

    2014-01-01

    Security is often an afterthought during software development. Addressing security early, especially in the requirements phase, is important so that security problems can be tackled before the project moves further and rework is avoided. A more effective approach to security requirement engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames. The methodology aims at early integration of security with software development. The main goal of the methodology is to assist developers in eliciting adequate security requirements in a more systematic way during the requirement engineering process. A security catalog, based on the problem frames, is constructed to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. Evaluation criteria are used to assess the resulting security requirements, concentrating on identifying conflicts among requirements. We show that the methodology elicits more complete security requirements, in addition to assisting developers in eliciting security requirements in a more systematic way. PMID:25685514

  4. A top-down design methodology and its implementation for VCSEL-based optical links design

    NASA Astrophysics Data System (ADS)

    Li, Jiguang; Cao, Mingcui; Cai, Zilong

    2005-01-01

    In order to find the optimal design for a given specification of an optical communication link, an integrated simulation of the electronic, optoelectronic, and optical components of the complete system is required. It is very important to be able to simulate at both the system level and the detailed model level; such modelling is feasible thanks to the capabilities of the Verilog-AMS language. In this paper, we propose an effective top-down design methodology and employ it in the development of a complete VCSEL-based optical link simulation. The principle of the top-down methodology is that development proceeds from the system level to the device level. To design a hierarchical model for VCSEL-based optical links, the design framework is organized in three levels of hierarchy. The models are developed and implemented in Verilog-AMS, and the model parameters are fitted to measured data. A sample transient simulation demonstrates the functioning of our implementation. Suggestions for future directions in top-down methodology for optoelectronic systems technology are also presented.

  5. Space, Place, and Social Justice: Developing a Rhythmanalysis of Education in South Africa

    ERIC Educational Resources Information Center

    Christie, Pam

    2013-01-01

    This article develops a methodological approach based on the spatial theory of Henri Lefebvre to address relationships between space, place, and social justice in education. In understanding the contradictory effects of globalization on local education policies and the continuing effects of historical geographies in education, Lefebvre's theory…

  6. HRD Interventions, Employee Competencies and Organizational Effectiveness: An Empirical Study

    ERIC Educational Resources Information Center

    Potnuru, Rama Krishna Gupta; Sahoo, Chandan Kumar

    2016-01-01

    Purpose: The purpose of the study is to examine the impact of human resource development (HRD) interventions on organizational effectiveness by means of employee competencies which are built by some of the selected HRD interventions. Design/methodology/approach: An integrated research model has been developed by combining the principal factors…

  7. The Promise of Virtual Teams: Identifying Key Factors in Effectiveness and Failure

    ERIC Educational Resources Information Center

    Horwitz, Frank M.; Bravington, Desmond; Silvis, Ulrik

    2006-01-01

    Purpose: The aim of the investigation is to identify enabling and disenabling factors in the development and operation of virtual teams; to evaluate the importance of factors such as team development, cross-cultural variables, leadership, communication and social cohesion as contributors to virtual team effectiveness. Design/methodology/approach:…

  8. Cost-effective public health guidance: asking questions from the decision-maker's viewpoint.

    PubMed

    Chalkidou, Kalipso; Culyer, Anthony; Naidoo, Bhash; Littlejohns, Peter

    2008-03-01

    In February 2004, in his assessment of the long-term financial viability of the NHS, Derek Wanless recommended the use of 'a consistent framework, such as the methodology developed by NICE, to evaluate the cost-effectiveness of interventions and initiatives across health care and public health'. One year later public health was added to NICE's remit and the new National Institute for Health and Clinical Excellence (NICE) was established, with amended statutory instruments to permit consideration of broader public sector costs when developing cost-effective guidance for public health. Taking the principle of 'a consistent framework' put forward by Wanless as the starting point, this paper provides an insight into the most challenging aspects of applying the principles of cost-effectiveness analysis in the public health context from the policymaker's perspective. It reflects on the long-term consequences, for the Institute's overall approach to guidance development, of taking on responsibility for producing public health guidance, and describes the tension between striving for consistency and cross-evaluation comparability while ensuring that the methodological tools used are fit for the purpose of developing public health guidance.

  9. Development and exploration of a new methodology for the fitting and analysis of XAS data.

    PubMed

    Delgado-Jaime, Mario Ulises; Kennepohl, Pierre

    2010-01-01

    A new data analysis methodology for X-ray absorption near-edge spectroscopy (XANES) is introduced and tested using several examples. The methodology has been implemented within the context of a new Matlab-based program discussed in a companion article [Delgado-Jaime et al. (2010), J. Synchrotron Rad. 17, 132-137]. The approach makes use of a Monte Carlo search method to seek appropriate starting points for a fit model, allowing for the generation of a large number of independent fits with minimal user-induced bias. The applicability of this methodology is tested using various Cl K-edge XAS data sets for tetragonal CuCl(4)(2-), a common reference compound used for calibration and covalency estimation in M-Cl bonds. A new background model function that effectively blends background profiles with spectral features is an important component of the methodology. The development of a robust evaluation function to fit multiple-edge data is also discussed, and the implications for standard approaches to data analysis are explored within these examples.
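
    A minimal sketch of the general idea (not the companion Matlab program): draw many random starting points for a fit model, run an independent least-squares fit from each, and keep the converged results so the final fit does not hinge on a single user-chosen guess. The model (a Gaussian pre-edge peak plus an arctan edge jump), parameter ranges and data below are synthetic assumptions for illustration only.

      import numpy as np
      from scipy.optimize import least_squares

      def model(p, x):
          height, center, width, step = p
          peak = height * np.exp(-0.5 * ((x - center) / width) ** 2)         # pre-edge peak
          edge = step * (0.5 + np.arctan((x - center - 2.0) / 0.8) / np.pi)  # edge-jump "background"
          return peak + edge

      rng = np.random.default_rng(0)
      x = np.linspace(2815, 2835, 400)                      # energy axis (eV), synthetic
      true_p = np.array([1.2, 2820.5, 0.6, 1.0])
      y = model(true_p, x) + rng.normal(0, 0.02, x.size)    # synthetic "measured" spectrum

      bounds_lo = np.array([0.0, 2816.0, 0.1, 0.1])
      bounds_hi = np.array([3.0, 2830.0, 3.0, 2.0])

      fits = []
      for _ in range(200):                                  # Monte Carlo search over start points
          p0 = rng.uniform(bounds_lo, bounds_hi)
          res = least_squares(lambda p: model(p, x) - y, p0,
                              bounds=(bounds_lo, bounds_hi))
          fits.append((res.cost, res.x))

      fits.sort(key=lambda t: t[0])
      best_cost, best_p = fits[0]
      print("best-fit parameters:", np.round(best_p, 3))

    Inspecting the spread of the 200 converged fits (rather than only the best one) is what gives the method its claimed robustness against a biased starting guess.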

  10. Development and exploration of a new methodology for the fitting and analysis of XAS data

    PubMed Central

    Delgado-Jaime, Mario Ulises; Kennepohl, Pierre

    2010-01-01

    A new data analysis methodology for X-ray absorption near-edge spectroscopy (XANES) is introduced and tested using several examples. The methodology has been implemented within the context of a new Matlab-based program discussed in a companion article [Delgado-Jaime et al. (2010), J. Synchrotron Rad. 17, 132–137]. The approach makes use of a Monte Carlo search method to seek appropriate starting points for a fit model, allowing for the generation of a large number of independent fits with minimal user-induced bias. The applicability of this methodology is tested using various Cl K-edge XAS data sets for tetragonal CuCl4 2−, a common reference compound used for calibration and covalency estimation in M—Cl bonds. A new background model function that effectively blends background profiles with spectral features is an important component of the methodology. The development of a robust evaluation function to fit multiple-edge data is also discussed, and the implications for standard approaches to data analysis are explored within these examples. PMID:20029120

  11. Sustainability of a long term professional development program

    NASA Astrophysics Data System (ADS)

    Ries, Christine E.

    Currently, in most school districts, the main form of teacher education comes from professional development (PD) that claims to improve teaching and student achievement. School districts and teachers spend time and money trying to make sure that they are providing the best quality education for their students, yet educators are still asking what the most effective form of PD should look like. Using a descriptive case study methodology, a long-term PD grant called Science Alliance was evaluated to add to the research on PD and grant program efficacy. Twelve teachers who participated in the Science Alliance grant were interviewed, observed, and given a survey to determine how, and to what degree, they were implementing the inquiry methodology three years after the grant ended. The results were compared with previously existing data collected by a company that Science Alliance hired to conduct external research on the effects of the PD. The findings suggest that the participating teachers have sustained the use and implementation of the methodology learned during the training. School administrators and staff developers could use the findings from this study to see what effective PD may entail, and future researchers may draw on them when reporting on grant program evaluations and PD.

  12. Computer Assisted Chronic Disease Management: Does It Work? A Pilot Study Using Mixed Methods

    PubMed Central

    Jones, Kay M.; Biezen, Ruby; Piterman, Leon

    2013-01-01

    Background. Key factors for effective chronic disease management (CDM) include the availability of practical and effective computer tools and continuing professional development/education. This study tested the effectiveness of a computer-assisted chronic disease management tool, a broadband-based service known as cdmNet, in increasing the development of care plans for patients with chronic disease in general practice. Methodology. The mixed methods comprised the breakthrough series methodology (workshops and plan-do-study-act cycles) and semistructured interviews. Results. Throughout the intervention period a pattern emerged suggesting that GPs' use of cdmNet initially increased and then plateaued, while practice nurses' and practice managers' roles expanded as they became more involved in using cdmNet. Seven main messages emerged from the GP interviews. Discussion. The overall use of cdmNet by participating GPs varied from "no change" to "significant change and developing many of the GPMPs (general practice management plans) using cdmNet." The variation may be due to several factors, not least allowing GPs adequate time to familiarise themselves with the software and to recognise the benefit of the team approach. Conclusion. The breakthrough series methodology facilitated the upskilling of GPs in managing patients diagnosed with a chronic disease and in learning how to use the broadband-based service cdmNet. PMID:24959576

  13. ADVISORY ON UPDATED METHODOLOGY FOR ...

    EPA Pesticide Factsheets

    The National Academy of Sciences (NAS) published the Biological Effects of Ionizing Radiation (BEIR) committee's report (BEIR VII) on risks from ionizing radiation exposures in 2006. The Committee analyzed the most recent epidemiology from the important exposed cohorts and factored in changes resulting from the updated analysis of dosimetry for the Japanese atomic bomb survivors. To the extent practical, the Committee also considered relevant radiobiological data, including that from the Department of Energy's low dose effects research program. Based on the review of this information, the Committee proposed a set of models for estimating risks from low-dose ionizing radiation. ORIA then prepared a white paper revising the Agency's methodology for estimating cancer risks from exposure to ionizing radiation in light of this report and other relevant information. This is the first product to be developed as a result of the BEIR VII report. We requested that the SAB conduct an advisory during the development of this methodology. The second product to be prepared will be a revised version of the document,

  14. Multirate Flutter Suppression System Design for the Benchmark Active Controls Technology Wing. Part 1; Theory and Design Procedure

    NASA Technical Reports Server (NTRS)

    Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek

    2002-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes a project at the University of Washington to design a multirate flutter suppression system for the BACT wing. The objective of the project was twofold: first, to develop a methodology for designing robust multirate compensators, and second, to demonstrate the methodology by applying it to the design of a multirate flutter suppression system for the BACT wing.

  15. Software Risk Identification for Interplanetary Probes

    NASA Technical Reports Server (NTRS)

    Dougherty, Robert J.; Papadopoulos, Periklis E.

    2005-01-01

    The need for a systematic and effective software risk identification methodology is critical for interplanetary probes that are using increasingly complex and critical software. Several probe failures are examined that suggest more attention and resources need to be dedicated to identifying software risks. The direct causes of these failures can often be traced to systemic problems in all phases of the software engineering process. These failures have led to the development of a practical methodology to identify risks for interplanetary probes. The proposed methodology is based upon the tailoring of the Software Engineering Institute's (SEI) method of taxonomy-based risk identification. The use of this methodology will ensure a more consistent and complete identification of software risks in these probes.

  16. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    NASA Astrophysics Data System (ADS)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

    Co-design is a new trend in the social world that tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and effective technical and human implementation of computer-based systems (ETHICS). We examine the characteristics of these methodologies to assess the possibility of co-designing or combining them for developing an information system. To this end, four different aspects are analyzed: social or technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using a quantitative method is analyzed in order to examine the possibility of co-design based on these factors. The paper concludes that RAD and ETHICS are suitable for co-design and offers some suggestions for carrying it out.

  17. Track train dynamics analysis and test program: Methodology development for the derailment safety analysis of six-axle locomotives

    NASA Technical Reports Server (NTRS)

    Marcotte, P. P.; Mathewson, K. J. R.

    1982-01-01

    The operational safety of six-axle locomotives is analyzed. A locomotive model with corresponding data on suspension characteristics, a method of track defect characterization, and a method of characterizing operational safety are used. A user-oriented software package was developed as part of the methodology and was used to study the effect (on operational safety) of various locomotive parameters and operational conditions such as speed, tractive effort, and track curvature. The operational safety of three different locomotive designs was investigated.

  18. Distributed intelligent control and management (DICAM) applications and support for semi-automated development

    NASA Technical Reports Server (NTRS)

    Hayes-Roth, Frederick; Erman, Lee D.; Terry, Allan; Hayes-Roth, Barbara

    1992-01-01

    We have recently begun a 4-year effort to develop a new technology foundation and associated methodology for the rapid development of high-performance intelligent controllers. Our objective in this work is to enable system developers to create effective real-time systems for control of multiple, coordinated entities in much less time than is currently required. Our technical strategy for achieving this objective is like that in other domain-specific software efforts: analyze the domain and task underlying effective performance, construct parametric or model-based generic components and overall solutions to the task, and provide excellent means for specifying, selecting, tailoring or automatically generating the solution elements particularly appropriate for the problem at hand. In this paper, we first present our specific domain focus, briefly describe the methodology and environment we are developing to provide a more regular approach to software development, and then later describe the issues this raises for the research community and this specific workshop.

  19. USE OF GENOTOXIC ACTIVITY PROFILES IN ASSESSMENT OF CARCINOGENESIS AND TRANSMISSIBLE GENETIC EFFECTS

    EPA Science Inventory

    A methodology has been developed to display and evaluate quantitative information from multiple tests of genetic toxicants for purposes of assessing carcinogenesis and transmissible genetic effects. Dose information is collected from the open literature: either the lowest effective dose...

  20. Developing a methodology to assess the impact of research grant funding: a mixed methods approach.

    PubMed

    Bloch, Carter; Sørensen, Mads P; Graversen, Ebbe K; Schneider, Jesper W; Schmidt, Evanthia Kalpazidou; Aagaard, Kaare; Mejlgaard, Niels

    2014-04-01

    This paper discusses the development of a mixed methods approach to analyse research funding. Research policy has taken on an increasingly prominent role in the broader political scene, where research is seen as a critical factor in maintaining and improving growth, welfare and international competitiveness. This has motivated growing emphasis on the impacts of science funding, and how funding can best be designed to promote socio-economic progress. Meeting these demands for impact assessment involves a number of complex issues that are difficult to fully address in a single study or in the design of a single methodology. However, they point to some general principles that can be explored in methodological design. We draw on a recent evaluation of the impacts of research grant funding, discussing both key issues in developing a methodology for the analysis and subsequent results. The case of research grant funding, involving a complex mix of direct and intermediate effects that contribute to the overall impact of funding on research performance, illustrates the value of a mixed methods approach to provide a more robust and complete analysis of policy impacts. Reflections on the strengths and weaknesses of the methodology are used to examine refinements for future work. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Methodologies for Optimum Capital Expenditure Decisions for New Medical Technology

    PubMed Central

    Landau, Thomas P.; Ledley, Robert S.

    1980-01-01

    This study deals with the development of a theory and an analytical model to support decisions regarding capital expenditures for complex new medical technology. Formal methodologies and quantitative techniques developed by applied mathematicians and management scientists can be used by health planners to develop cost-effective plans for the utilization of medical technology on a community or region-wide basis. In order to maximize the usefulness of the model, it was developed and tested against multiple technologies. The types of technologies studied include capital and labor-intensive technologies, technologies whose utilization rates vary with hospital occupancy rate, technologies whose use can be scheduled, and limited-use and large-use technologies.

  2. On computational methods for crashworthiness

    NASA Technical Reports Server (NTRS)

    Belytschko, T.

    1992-01-01

    The evolution of computational methods for crashworthiness and related fields is described and linked with the decreasing cost of computational resources and with improvements in computational methodologies. The latter include more effective time integration procedures and more efficient elements. Some recent developments in methodologies and future trends are also summarized. These include multi-time-step integration (or subcycling), further improvements in elements, adaptive meshes, and the exploitation of parallel computers.
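
    A hedged, simplified sketch of the subcycling idea mentioned above (not taken from the paper): a one-dimensional spring-mass chain is split into a soft and a stiff partition, and the stiff partition is advanced with several small explicit steps inside each large step used for the soft partition, with the soft-side interface motion interpolated over the large step. All stiffness, mass and step values are made up for illustration.

      import numpy as np

      n = 10                                   # masses; node 0 is a fixed wall
      k = np.full(n, 100.0); k[n // 2:] = 1e4  # soft springs, then stiff springs
      m = np.ones(n)
      u = np.zeros(n); v = np.zeros(n)
      v[1] = 1.0                               # initial kick at a soft node

      def accel(u):
          f = np.zeros(n)
          for i in range(1, n):
              f[i] -= k[i] * (u[i] - u[i - 1])           # spring i joins nodes i-1 and i
              if i + 1 < n:
                  f[i] += k[i + 1] * (u[i + 1] - u[i])
          return f / m

      dt_big = 1e-3                            # step used for the soft partition
      n_sub = 20                               # the stiff partition takes dt_big / n_sub
      soft, stiff = slice(1, n // 2), slice(n // 2, n)

      for step in range(2000):
          a = accel(u)
          # advance the soft partition one large step (semi-implicit Euler for brevity)
          v[soft] += dt_big * a[soft]
          u_soft_old = u[soft].copy()
          u_soft_new = u_soft_old + dt_big * v[soft]
          # subcycle the stiff partition, interpolating the soft-side interface motion
          dt_small = dt_big / n_sub
          for s in range(n_sub):
              u[soft] = u_soft_old + (s / n_sub) * (u_soft_new - u_soft_old)
              a_stiff = accel(u)[stiff]
              v[stiff] += dt_small * a_stiff
              u[stiff] += dt_small * v[stiff]
          u[soft] = u_soft_new

      print("tip displacement after run:", u[-1])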

  3. A Proposal for Methodology for Negotiation Practicum with Effective Use of ICT: A Technology Enhanced Course for Communication for Trust Building

    ERIC Educational Resources Information Center

    Yamamoto, Tosh; Tagami, Masanori; Nakazawa, Minoru

    2012-01-01

    This paper demonstrates a problem-solving case encountered in the course of developing a methodology in which the quality of the negotiation practicum is maintained or raised without sacrificing the class contact hours devoted to lessons in reading comprehension skills, on which the essence of the negotiation practicum is based. In this…

  4. Effects of overweight vehicles on NYSDOT infrastructure.

    DOT National Transportation Integrated Search

    2015-09-01

    This report develops a methodology for estimating the effects of different categories of overweight : trucks on NYSDOT pavements and bridges. A data mining algorithm is used to categorize truck : data collected at several Weigh-In-Motion stations aro...

  5. Six methodological steps to build medical data warehouses for research.

    PubMed

    Szirbik, N B; Pelletier, C; Chaussalet, T

    2006-09-01

    We propose a simple methodology for heterogeneous data collection and central repository-style database design in healthcare. Our method can be used with or without other software development frameworks, and we argue that its application can save a considerable amount of implementation effort. We also believe that the method can be used in other fields of research, especially those with a strongly interdisciplinary nature. The idea emerged during a healthcare research project that involved, among other things, grouping information from heterogeneous and distributed information sources. We developed this methodology from the lessons learned while building a data repository containing information about elderly patient flows in the UK's long-term care (LTC) system. We explain thoroughly the aspects that influenced the building of the methodology. The methodology is defined by six steps, which can be aligned with various iterative development frameworks; we describe here its alignment with the RUP (rational unified process) framework. The methodology emphasizes current trends such as early identification of critical requirements, data modelling, close and timely interaction with users and stakeholders, ontology building, quality management, and exception handling. Of special interest is the ontological engineering aspect, which had the highest-impact effects after the project: it helped stakeholders to perform better collaborative negotiations that brought better solutions for the overall system investigated. Insight into the problems faced by others helps lead the negotiators to win-win situations. We consider that this should be the social result of any project that collects data for better decision making, leading ultimately to enhanced global outcomes.

  6. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    PubMed

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of dissipation and bioconversion; methods with low recovery yields could give a false impression of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its three main metabolites, and chlorpyrifos in increasingly complex matrices in which the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first matrix has already been reported; the methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Based) media with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg(-1), yielding recoveries between 72% and 109% and RSDs <11% in all cases. The application of this methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having established that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium. Of the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate gave recoveries between 80% and 110% with RSDs <18%. Linearity, recovery, precision, matrix effect and LODs/LOQs of each method were studied for all the analytes: the endosulfan isomers (α and β) and its metabolites (endosulfan sulfate, ether and diol) as well as chlorpyrifos. In the first laboratory evaluation of these biobeds, endosulfan was bioconverted up to 87% and chlorpyrifos more than 79% after 27 days. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Warehouses information system design and development

    NASA Astrophysics Data System (ADS)

    Darajatun, R. A.; Sukanta

    2017-12-01

    The materials/goods handling function is fundamental for companies to ensure the smooth running of their warehouses, and efficiency and organization within every aspect of the business are essential in order to gain a competitive advantage. The purpose of this research is the design and development of a Kanban-based inventory storage and delivery system. The application aims to make inventory stock checks more efficient and effective. Users can easily enter finished-goods data from the production department, the warehouse, customers, and suppliers. The master data are designed to be as complete as possible so that the application can be used across a variety of warehouse logistics processes. The author uses the Java programming language, which is well suited to building Java web applications, to develop the application, with MySQL as the database. The system development methodology used is the Waterfall methodology, which comprises the stages of analysis, system design, implementation, integration, and operation and maintenance. For data collection, the author uses observation, interviews, and a literature review.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radojcic, Riko; Nowak, Matt; Nakamoto, Mark

    The status of the development of a Design-for-Stress simulation flow that captures the stress effects in packaged 3D-stacked Si products like integrated circuits (ICs) using advanced via-middle Through Si Via technology is outlined. The next set of challenges required to proliferate the methodology and to deploy it for making and dispositioning real Si product decisions are described here. These include the adoption and support of a Process Design Kit (PDK) that includes the relevant material properties, the development of stress simulation methodologies that operate at higher levels of abstraction in a design flow, and the development and adoption of suitable models required to make real product reliability decisions.

  9. METHODS FOR EVALUATING THE SUSTAINABILITY OF GREEN PROCESSES

    EPA Science Inventory

    A methodology, called GREENSCOPE (Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Objective Process Evaluator), has been developed in the U.S. EPA's Office of Research and Development to directly compare the sustainability of proces...

  10. Reduction of adverse aerodynamic effects of large trucks, Volume I. Technical report

    DOT National Transportation Integrated Search

    1978-09-01

    The overall objective of this study has been to develop methods of minimizing three aerodynamic-related phenomena: truck-induced aerodynamic disturbances, splash, and spray. An analytical methodology has been developed and used to characterize aerody...

  11. Parametric evaluation of the cost effectiveness of Shuttle payload vibroacoustic test plans

    NASA Technical Reports Server (NTRS)

    Stahle, C. V.; Gongloff, H. R.; Keegan, W. B.; Young, J. P.

    1978-01-01

    Consideration is given to alternate vibroacoustic test plans for sortie and free flyer Shuttle payloads. Statistical decision models for nine test plans provide a viable method of evaluating the cost effectiveness of alternate vibroacoustic test plans and the associated test levels. The methodology is a major step toward the development of a useful tool for the quantitative tailoring of vibroacoustic test programs to sortie and free flyer payloads. A broader application of the methodology is now possible by the use of the OCTAVE computer code.

  12. Evaluating the cost effectiveness of environmental projects: Case studies in aerospace and defense

    NASA Technical Reports Server (NTRS)

    Shunk, James F.

    1995-01-01

    Using the replacement technology of high pressure waterjet decoating systems as an example, a simple methodology is presented for developing a cost effectiveness model. The model uses a four-step process to formulate an economic justification designed for presentation to decision makers as an assessment of the value of the replacement technology over conventional methods. Three case studies from major U.S. and international airlines are used to illustrate the methodology and resulting model. Tax and depreciation impacts are also presented as potential additions to the model.
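
    As a hedged illustration of the kind of four-step economic justification such a methodology produces (not the paper's actual model or its airline case-study data), the sketch below estimates the cost of the conventional process, the cost of the replacement technology, the resulting annual savings, and the discounted value of those savings against the capital outlay. Every number is a hypothetical placeholder.

      def npv(cashflows, rate):
          """Net present value of a list of yearly cash flows (year 0 first)."""
          return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

      # Step 1: annual cost of the conventional decoating process (hypothetical)
      conventional_annual = 1_200_000.0
      # Step 2: annual operating cost of high-pressure waterjet decoating (hypothetical)
      waterjet_annual = 700_000.0
      capital_cost = 2_000_000.0
      # Step 3: annual savings from switching
      annual_savings = conventional_annual - waterjet_annual
      # Step 4: discounted evaluation over a 10-year horizon
      flows = [-capital_cost] + [annual_savings] * 10
      value = npv(flows, rate=0.08)
      payback_years = capital_cost / annual_savings
      print(f"simple payback: {payback_years:.1f} years, NPV @ 8%: {value:,.0f}")

    Tax and depreciation effects, which the paper flags as potential additions, could be layered onto the yearly cash flows before discounting.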

  13. Methodology for worker neutron exposure evaluation in the PDCF facility design.

    PubMed

    Scherpelz, R I; Traub, R J; Pryor, K H

    2004-01-01

    A project headed by Washington Group International is intended to design the Pit Disassembly and Conversion Facility (PDCF) to convert the plutonium pits from excessed nuclear weapons into plutonium oxide for ultimate disposition. Battelle staff are performing the shielding calculations that will determine appropriate shielding so that the facility workers will not exceed target exposure levels. The target exposure levels for workers in the facility are 5 mSv y(-1) for the whole body and 100 mSv y(-1) for the extremity, which presents a significant challenge to the designers of a facility that will process tons of radioactive material. The design effort depended on shielding calculations to determine appropriate thickness and composition for glove box walls, and concrete wall thicknesses for storage vaults. Pacific Northwest National Laboratory (PNNL) staff used ORIGEN-S and SOURCES to generate gamma and neutron source terms, and the MCNP-4C Monte Carlo neutron-photon transport code to calculate the radiation transport in the facility. The shielding calculations were performed by a team of four scientists, so it was necessary to develop a consistent methodology. There was also a requirement for the study to be cost-effective, so efficient methods of evaluation were required. The calculations were subject to rigorous scrutiny by internal and external reviewers, so acceptability was a major feature of the methodology. Some of the issues addressed in the development of the methodology included selecting appropriate dose factors, developing a method for handling extremity doses, adopting an efficient method for evaluating effective dose equivalent in a non-uniform radiation field, modelling the reinforcing steel in concrete, and modularising the geometry descriptions for efficiency. The relative importance of the neutron dose equivalent compared with the gamma dose equivalent varied substantially depending on the specific shielding conditions, and lessons were learned from this effect. This paper addresses these issues and the resulting methodology.

  14. To Legalize or not to Legalize? That is the Question

    DTIC Science & Technology

    2009-04-01

    and international community to improve security and promote development in Afghanistan. The negative effect on the nation's overall security, the...the problem/solution methodology. The research explores the current illicit opium cultivation in Afghanistan and its effect on the current...negative effect on the current development efforts.

  15. A Methodological Review and Critique of the "Intergenerational Transmission of Violence" Literature.

    PubMed

    Haselschwerdt, Megan L; Savasuk-Luxton, Rachel; Hlavaty, Kathleen

    2017-01-01

    Exposure to interpersonal or interparental violence (EIPV) and child abuse and maltreatment (CAM) are associated with an increased risk of maladaptive outcomes, including later involvement in adult intimate partner violence (IPV; often referred to as the theory of intergenerational transmission of violence). Recent meta-analyses, however, have documented a weak effect size for this association. By focusing on young adulthood, a developmental stage in which identity development and romantic relationship formation are salient tasks, we can provide insight into the association between EIPV, CAM, and IPV. Guided by the methodological critiques from the IPV and EIPV literatures, the present study reviewed the methodology used in 16 studies (published between 2002 and 2016) that tested the theory of intergenerational transmission of violence. The review focused on how EIPV, CAM, and young adult dating violence were measured and analyzed, with the initial goal of better understanding how methodological decisions informed each study's findings. Ultimately, we determined that there was simply too much methodological variability and yet too little methodological complexity to truly inform a review and discussion of the results; therefore, our review focused solely on the studies' methodological decisions. Based on our review, we suggest that both of these challenges, too much variability and too little complexity, hinder our ability to examine the theory of intergenerational transmission of violence. Future research must strike a balance between methodological consistency and complexity to better understand the intricate nuances of IPV experiences and inform practice.

  16. Methodologies and systems for heterogeneous concurrent computing

    NASA Technical Reports Server (NTRS)

    Sunderam, V. S.

    1994-01-01

    Heterogeneous concurrent computing is gaining increasing acceptance as an alternative or complementary paradigm to multiprocessor-based parallel processing as well as to conventional supercomputing. While algorithmic and programming aspects of heterogeneous concurrent computing are similar to their parallel processing counterparts, system issues, partitioning and scheduling, and performance aspects are significantly different. In this paper, we discuss critical design and implementation issues in heterogeneous concurrent computing, and describe techniques for enhancing its effectiveness. In particular, we highlight the system level infrastructures that are required, aspects of parallel algorithm development that most affect performance, system capabilities and limitations, and tools and methodologies for effective computing in heterogeneous networked environments. We also present recent developments and experiences in the context of the PVM system and comment on ongoing and future work.

  17. Propulsion integration of hypersonic air-breathing vehicles utilizing a top-down design methodology

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, Brad Kenneth

    In recent years, a focus of aerospace engineering design has been the development of advanced design methodologies and frameworks to account for increasingly complex and integrated vehicles. Techniques such as parametric modeling, global vehicle analyses, and interdisciplinary data sharing have been employed in an attempt to improve the design process. The purpose of this study is to introduce a new approach to integrated vehicle design known as the top-down design methodology. In the top-down design methodology, the main idea is to relate design changes on the vehicle system and sub-system level to a set of over-arching performance and customer requirements. Rather than focusing on the performance of an individual system, the system is analyzed in terms of the net effect it has on the overall vehicle and other vehicle systems. This detailed level of analysis can only be accomplished through the use of high fidelity computational tools such as Computational Fluid Dynamics (CFD) or Finite Element Analysis (FEA). The utility of the top-down design methodology is investigated through its application to the conceptual and preliminary design of a long-range hypersonic air-breathing vehicle for a hypothetical next generation hypersonic vehicle (NHRV) program. System-level design is demonstrated through the development of the nozzle section of the propulsion system. From this demonstration of the methodology, conclusions are made about the benefits, drawbacks, and cost of using the methodology.

  18. A Methodology and Implementation for Annotating Digital Images for Context-appropriate Use in an Academic Health Care Environment

    PubMed Central

    Goede, Patricia A.; Lauman, Jason R.; Cochella, Christopher; Katzman, Gregory L.; Morton, David A.; Albertine, Kurt H.

    2004-01-01

    Use of digital medical images has become common over the last several years, coincident with the release of inexpensive, mega-pixel quality digital cameras and the transition to digital radiology operation by hospitals. One problem that clinicians, medical educators, and basic scientists encounter when handling images is the difficulty of using business and graphic arts commercial-off-the-shelf (COTS) software in multicontext authoring and interactive teaching environments. The authors investigated and developed software-supported methodologies to help clinicians, medical educators, and basic scientists become more efficient and effective in their digital imaging environments. The software that the authors developed provides the ability to annotate images based on a multispecialty methodology for annotation and visual knowledge representation. This annotation methodology is designed by consensus, with contributions from the authors and physicians, medical educators, and basic scientists in the Departments of Radiology, Neurobiology and Anatomy, Dermatology, and Ophthalmology at the University of Utah. The annotation methodology functions as a foundation for creating, using, reusing, and extending dynamic annotations in a context-appropriate, interactive digital environment. The annotation methodology supports the authoring process as well as output and presentation mechanisms. The annotation methodology is the foundation for a Windows implementation that allows annotated elements to be represented as structured eXtensible Markup Language and stored separate from the image(s). PMID:14527971

  19. Effects of Kindergarten Retention on Children's Social-Emotional Development: An Application of Propensity Score Method to Multivariate, Multilevel Data

    ERIC Educational Resources Information Center

    Hong, Guanglei; Yu, Bing

    2008-01-01

    This study examines the effects of kindergarten retention on children's social-emotional development in the early, middle, and late elementary years. Previous studies have generated mixed results partly due to some major methodological challenges, including selection bias, measurement error, and divergent perceptions of multiple respondents in…

  20. Aeroelastic optimization methodology for viscous and turbulent flows

    NASA Astrophysics Data System (ADS)

    Barcelos Junior, Manuel Nascimento Dias

    2007-12-01

    In recent years, the development of faster computers and parallel processing has allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows by using high-fidelity analysis and sensitivity analysis techniques. Most research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy performs unsatisfactorily when applied to cases with strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the linearized systems of equations for the aeroelastic and sensitivity analyses. The methodologies developed in this work are tested and verified using realistic aeroelastic systems.
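
    A minimal sketch of the block Gauss-Seidel fixed-point idea mentioned above (not the dissertation's solver): an "aerodynamic" block and a "structural" block are solved in turn until the coupled state stops changing, here for a torsional typical-section toy model. The dynamic pressure, stiffness and other numbers are made-up illustration values.

      def aero_load(theta, q=30.0, S=1.0, CLa=5.0, alpha0=0.05, e=0.1):
          """Pitching moment about the elastic axis for a given elastic twist theta."""
          lift = q * S * CLa * (alpha0 + theta)
          return e * lift

      def structural_twist(moment, K=50.0):
          """Elastic twist produced by a given aerodynamic moment (linear torsional spring)."""
          return moment / K

      theta = 0.0
      for it in range(100):
          moment = aero_load(theta)              # "fluid" solve using the latest structural state
          theta_new = structural_twist(moment)   # "structure" solve using the new loads
          if abs(theta_new - theta) < 1e-10:
              theta = theta_new
              print(f"converged in {it + 1} Gauss-Seidel iterations, twist = {theta:.6f} rad")
              break
          theta = theta_new

    With these numbers the coupling factor is well below one, so the staggered iteration converges quickly; as the coupling strengthens (e.g. higher dynamic pressure) the same iteration slows down or diverges, which is the situation the abstract's fixed-point/Schur-complement treatment is meant to handle.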

  1. A false-alarm aware methodology to develop robust and efficient multi-scale infrared small target detection algorithm

    NASA Astrophysics Data System (ADS)

    Moradi, Saed; Moallem, Payman; Sabahi, Mohamad Farzan

    2018-03-01

    False alarm rate and detection rate remain two contradictory metrics for infrared small target detection in an infrared search and track (IRST) system, despite the development of new detection algorithms. In certain circumstances, not detecting true targets is more tolerable than detecting false items as true targets. Hence, considering background clutter and detector noise as the sources of false alarms in an IRST system, this paper presents a false-alarm-aware methodology to reduce the false alarm rate while leaving the detection rate undegraded. To this end, the advantages and disadvantages of each detection algorithm are investigated and the sources of its false alarms are determined. Two target detection algorithms with independent false alarm sources are chosen such that the disadvantages of one algorithm can be compensated by the advantages of the other. In this work, multi-scale average absolute gray difference (AAGD) and Laplacian of point spread function (LoPSF) are utilized as the cornerstones of the desired algorithm of the proposed methodology. After presenting a conceptual model for the desired algorithm, it is implemented through the most straightforward mechanism. The desired algorithm effectively suppresses background clutter and eliminates detector noise. Also, since the input images are processed through just four different scales, the desired algorithm has good potential for real-time implementation. Simulation results in terms of signal-to-clutter ratio and background suppression factor on real and simulated images prove the effectiveness and performance of the proposed methodology. Since the desired algorithm was developed based on independent false alarm sources, our proposed methodology is extendable to any pair of detection algorithms that have different false alarm sources.
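
    The sketch below illustrates only the fusion idea, not the authors' exact algorithm: a multi-scale average-absolute-gray-difference (AAGD) map and a Laplacian-of-Gaussian map (used here as a stand-in for the Laplacian of the point spread function) are thresholded separately, and only pixels that respond strongly in both maps are kept, so clutter that triggers one detector but not the other is rejected. The scales, thresholds and synthetic frame are assumptions.

      import numpy as np
      from scipy import ndimage

      def aagd_map(img, scales=(3, 5, 7, 9)):
          out = np.zeros_like(img, dtype=float)
          for s in scales:
              inner = ndimage.uniform_filter(img, size=s)        # local cell mean
              outer = ndimage.uniform_filter(img, size=3 * s)    # surrounding mean
              diff = np.clip(inner - outer, 0, None)             # bright small targets only
              out = np.maximum(out, diff * diff)                 # best response over scales
          return out

      def lopsf_map(img, sigma=1.5):
          # Negative Laplacian of Gaussian responds positively to small bright blobs.
          return np.clip(-ndimage.gaussian_laplace(img, sigma), 0, None)

      def detect(img, k=4.0):
          a, l = aagd_map(img), lopsf_map(img)
          ta = a > a.mean() + k * a.std()                        # per-map adaptive thresholds
          tl = l > l.mean() + k * l.std()
          return ta & tl                                         # AND-fusion: both must fire

      rng = np.random.default_rng(0)
      frame = rng.normal(100, 5, (128, 128))                     # synthetic background + noise
      frame[60:63, 80:83] += 40                                  # small bright "target"
      mask = detect(frame)
      print("detected pixels:", np.argwhere(mask))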

  2. Students in Transition: Research and Practice in Career Development. The First-Year Experience Monograph Series No. 55

    ERIC Educational Resources Information Center

    Gore, Paul A., Jr., Ed.; Carter, Louisa P., Ed.

    2011-01-01

    Offering a primer on action research methodologies and examples of practice, "Students in Transition: Research and Practice in Career Development" responds to a dual challenge facing career development educators--designing cutting-edge career development interventions and demonstrating their effectiveness. Overviews of quantitative and qualitative…

  3. Payload training methodology study

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PCT) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.

  4. Development of a Design Methodology for Reconfigurable Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.; McLean, C.

    2000-01-01

    A methodology is presented for the design of flight control systems that exhibit stability and performance-robustness in the presence of actuator failures. The design is based upon two elements. The first element consists of a control law that will ensure at least stability in the presence of a class of actuator failures. This law is created by inner-loop, reduced-order, linear dynamic inversion, and outer-loop compensation based upon Quantitative Feedback Theory. The second element consists of adaptive compensators obtained from simple and approximate time-domain identification of the dynamics of the 'effective vehicle' with failed actuator(s). An example involving the lateral-directional control of a fighter aircraft is employed both to introduce the proposed methodology and to demonstrate its effectiveness and limitations.
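
    A toy sketch of the inner-loop dynamic-inversion element described above (not the report's design): for a scalar plant xdot = f(x) + g(x)*u, the inner loop picks u = (v - f(x)) / g(x) so that the closed inner loop behaves like xdot = v, and a simple outer-loop proportional compensator then shapes v. The functions f and g, the gain and the reference are all made-up illustration values.

      import math

      def f(x):          # nonlinear plant dynamics (hypothetical)
          return -2.0 * x + 0.5 * math.sin(x)

      def g(x):          # control effectiveness (hypothetical, assumed nonzero)
          return 1.0 + 0.1 * x * x

      x, dt, x_ref, kp = 0.0, 0.01, 1.0, 3.0
      for step in range(500):
          v = kp * (x_ref - x)               # outer-loop compensator output (pseudo-control)
          u = (v - f(x)) / g(x)              # inner-loop dynamic inversion
          x += dt * (f(x) + g(x) * u)        # plant integration (forward Euler)
      print(f"state after 5 s: {x:.4f} (reference {x_ref})")

    In the report's setting, an actuator failure changes the effective g, and the adaptive compensators re-identify the "effective vehicle" so the outer loop can be retuned; that adaptation step is omitted from this sketch.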

  5. Intelligent systems/software engineering methodology - A process to manage cost and risk

    NASA Technical Reports Server (NTRS)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  6. Creativity Development: The Role of Foreign Language Learning

    ERIC Educational Resources Information Center

    Sadykova, Aida G.; Shelestova, Olga V.

    2016-01-01

    The relevance of the present research stems from the need to resolve the conflict between the objective necessity of developing students' creative activity in the learning process and the insufficient development, in theoretical and methodological terms, of the pedagogical conditions for its effective implementation. The article is…

  7. Developing Knowledge Intensive Ideas in Engineering Education: The Application of Camp Methodology

    ERIC Educational Resources Information Center

    Lassen, Astrid Heidemann; Nielsen, Suna Lowe

    2011-01-01

    Background: Globalization, technological advancement, environmental problems, etc. challenge organizations not just to consider cost-effectiveness, but also to develop new ideas in order to build competitive advantages. Hence, methods to deliberately enhance creativity and facilitate its processes of development must also play a central role in…

  8. Working towards Skills: Perspectives on Workforce Development in SMEs. Research Report.

    ERIC Educational Resources Information Center

    Hughes, Maria; Keddie, Vince; Webb, Peter; Corney, Mark

    Research into workforce development (WD) considered the relationship between corporate assessments of workers' development needs and WD strategies; how learning at work takes place; and what learning methods are used and their effectiveness. Focus was on practice in small and medium-sized enterprises (SMEs). Methodology included a literature…

  9. The Role of Research in Making Interactive Products Effective.

    ERIC Educational Resources Information Center

    Rossi, Robert J.

    1986-01-01

    Argues that research and development (R&D) methods should be utilized to develop new technologies for training and retailing and describes useful research tools--critical incident methodology, task analysis, performance recording. Discussion covers R&D applications to interactive systems development in the areas of product need, customer…

  10. Effective peer education in HIV: defining factors that maximise success.

    PubMed

    Lambert, Steven M; Debattista, Joseph; Bodiroza, Aleksandar; Martin, Jack; Staunton, Shaun; Walker, Rebecca

    2013-08-01

    Background: Peer education is considered an effective health promotion and education strategy, particularly for populations traditionally resistant to conventional forms of health information dissemination. This has made it very applicable to HIV education and prevention, where those who are affected or at risk are often amongst the most vulnerable in society. However, there still remains uncertainty as to the reasons for its effectiveness, what constitutes an effective methodology and why a consistent methodology can often result in widely variable outcomes. Between 2008 and 2010, three separate reviews of peer education were undertaken across more than 30 countries in three distinct geographical regions across the globe. The reviews sought to identify determinants of the strengths and weaknesses inherent in approaches to peer education, particularly targeting young people and the most at-risk populations. By assessing the implementation of peer education programs across a variety of social environments, it was possible to develop a contextual understanding of peer education's effectiveness and provide a picture of the social, cultural, political, legal and geographic enablers and disablers of effective peer education. Several factors were significant contributors to program success, not as strategies of methodology, but as elements of the social, cultural, political and organisational context in which peer education was situated. Contextual elements create environments supportive of peer education. Consequently, adherence to a methodology or strategy without proper regard to its situational context rarely contributes to effective peer education.

  11. Traditional chinese medicine: an update on clinical evidence.

    PubMed

    Xue, Charlie C L; Zhang, Anthony L; Greenwood, Kenneth M; Lin, Vivian; Story, David F

    2010-03-01

    As an alternative medical system, Traditional Chinese Medicine (TCM) has been increasingly used over the last several decades. Such a consumer-driven development has resulted in introduction of education programs for practitioner training, development of product and practitioner regulation systems, and generation of an increasing interest in research. Significant efforts have been made in validating the quality, effectiveness, and safety of TCM interventions evidenced by a growing number of published trials and systematic reviews. Commonly, the results of these studies were inconclusive due to the lack of quality and quantity of the trials to answer specific and answerable clinical questions. The methodology of a randomized clinical trial (RCT) is not free from bias, and the unique features of TCM (such as individualization and holism) further complicate effective execution of RCTs in TCM therapies. Thus, data from limited RCTs and systematic reviews need to be interpreted with great caution. Nevertheless, until new and specific methodology is developed that can adequately address these methodology challenges for RCTs in TCM, evidence from quality RCTs and systematic reviews still holds the credibility of TCM in the scientific community. This article summarizes studies on TCM utilization, and regulatory and educational development with a focus on updating the TCM clinical evidence from RCTs and systematic reviews over the last decade. The key issues and challenges associated with evidence-based TCM developments are also explored.

  12. Analysis methods for Thematic Mapper data of urban regions

    NASA Technical Reports Server (NTRS)

    Wang, S. C.

    1984-01-01

    Studies have indicated the difficulty of deriving a detailed land-use/land-cover classification for heterogeneous metropolitan areas with Landsat MSS and TM data. The major methodological issues of digital analysis that have possibly affected the classification results are examined. In response to these methodological issues, a multichannel hierarchical clustering algorithm has been developed and tested for a more complete analysis of the data for urban areas.
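
    The abstract does not specify the algorithm's details, so the sketch below is only a generic illustration of hierarchical clustering applied to multichannel (multi-band) pixel vectors; the bands, sample size, linkage method and synthetic image are all assumptions.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(0)
      # Synthetic "image": 3 spectral bands, 60x60 pixels, three loose land-cover classes.
      h, w, bands = 60, 60, 3
      means = np.array([[40, 60, 30], [90, 80, 50], [150, 120, 100]], dtype=float)
      labels_true = rng.integers(0, 3, size=(h, w))
      image = means[labels_true] + rng.normal(0, 8, size=(h, w, bands))

      pixels = image.reshape(-1, bands)
      sample_idx = rng.choice(pixels.shape[0], size=500, replace=False)   # subsample for the linkage tree
      Z = linkage(pixels[sample_idx], method="ward")                      # agglomerative clustering
      sample_labels = fcluster(Z, t=3, criterion="maxclust")              # cut the tree into 3 clusters

      # Assign every pixel to the nearest cluster centroid found on the sample.
      centroids = np.array([pixels[sample_idx][sample_labels == c].mean(axis=0)
                            for c in np.unique(sample_labels)])
      d = ((pixels[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
      label_map = d.argmin(axis=1).reshape(h, w)
      print("cluster sizes:", np.bincount(label_map.ravel()))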

  13. Integrated structure/control design - Present methodology and future opportunities

    NASA Technical Reports Server (NTRS)

    Weisshaar, T. A.; Newsom, J. R.; Zeiler, T. A.; Gilbert, M. G.

    1986-01-01

    Attention is given to current methodology applied to the integration of the optimal design process for structures and controls. Multilevel linear decomposition techniques proved to be most effective in organizing the computational efforts necessary for ISCD (integrated structures and control design) tasks. With the development of large orbiting space structures and actively controlled, high performance aircraft, there will be more situations in which this concept can be applied.

  14. Effectiveness evaluation of the R&D projects in organizations financed by the budget expenses

    NASA Astrophysics Data System (ADS)

    Yakovlev, D.; Yushkov, E.; Pryakhin, A.; Bogatyreova, M.

    2017-01-01

    The issues of R&D project performance and project prospects are closely connected with knowledge management. In the initial stages of project development, the quality of the project evaluation is crucial for the result and for the generation of future knowledge. Currently there is no common methodology for the evaluation of new R&D financed from the budget. The assessment of scientific and technical (ST) projects varies greatly depending on the type of customer - government or business structures. An extensive methodological groundwork has been formed for orders placed by business structures, including "internal administrative orders" by company management for the results of STA intended for the company's own ST divisions. Regrettably, this is not the case for state orders in the field of STA, although the issue requires state regulation and official methodological support. The article addresses the methodological assessment of the scientific and technical effectiveness of studies performed at the expense of budget funds and suggests a new concept based on the definition of a cost-effectiveness index. The study shows that it is necessary to extend the previous approach to projects of different levels - micro-, meso- and macro-projects. The preliminary results of the research indicate that a common methodological approach should underpin the financing of projects under government contracts within the framework of budget financing and stock financing. This should be developed as general guidelines as well as recommendations that reflect specific sectors of the public sector, different project levels and forms of financing, and different stages of the project life cycle.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, E.A., E-mail: kingea@tcd.ie; O'Malley, V.P.

    The Irish National Roads Authority (NRA) recently completed over twenty post environmental impact assessment evaluations of noise chapters prepared as part of Environmental Impact Statements (EISs) for new national road schemes in Ireland. The study focused on a range of issues including a review of noise monitoring procedures, noise prediction methodologies and an assessment of the effectiveness of noise mitigation measures currently in use on national road schemes. This review was carried out taking cognisance of best international practices for noise assessment and methodologies used to mitigate road traffic noise. The primary focus of the study was to assess the actual noise impacts of national road scheme developments and to revise, where necessary, methodologies recommended in the current NRA guidance document describing the treatment of noise on national road schemes. This paper presents a summary of the study and identifies a number of key areas that should be considered prior to the development of future guidance documents. Highlights: • Presents a post-EIS evaluation of noise assessments for national roads in Ireland. • The effectiveness of some noise mitigation measures is critically evaluated. • Issues related to the current EIS noise assessment methodologies are discussed. • Implications for alterations to the NRA noise guidelines.

  16. A methodology for the quantification of doctrine and materiel approaches in a capability-based assessment

    NASA Astrophysics Data System (ADS)

    Tangen, Steven Anthony

    Due to the complexities of modern military operations and the technologies employed on today's military systems, acquisition costs and development times are becoming increasingly large. Meanwhile, the transformation of the global security environment is driving the U.S. military's own transformation. In order to meet the required capabilities of the next generation without buying prohibitively costly new systems, it is necessary for the military to evolve across the spectrum of doctrine, organization, training, materiel, leadership and education, personnel, and facilities (DOTMLPF). However, the methods for analyzing DOTMLPF approaches within the early acquisition phase of a capability-based assessment (CBA) are not as well established as the traditional technology design techniques. This makes it difficult for decision makers to decide if investments should be made in materiel or non-materiel solutions. This research develops an agent-based constructive simulation to quantitatively assess doctrine alongside materiel approaches. Additionally, life-cycle cost techniques are provided to enable a cost-effectiveness trade. These techniques are wrapped together in a decision-making environment that brings crucial information forward so informed and appropriate acquisition choices can be made. The methodology is tested on a future unmanned aerial vehicle design problem. Through the implementation of this quantitative methodology on the proof-of-concept study, it is shown that doctrinal changes including fleet composition, asset allocation, and patrol pattern were capable of dramatic improvements in system effectiveness at a much lower cost than the incorporation of candidate technologies. Additionally, this methodology was able to quantify the precise nature of strong doctrine-doctrine and doctrine-technology interactions which have been observed only qualitatively throughout military history. This dissertation outlines the methodology and demonstrates how potential approaches to capability-gaps can be identified with respect to effectiveness, cost, and time. When implemented, this methodology offers the opportunity to achieve system capabilities in a new way, improve the design of acquisition programs, and field the right combination of ways and means to address future challenges to national security.

  17. Comprehensive Psychopathological Assessment Based on the Association for Methodology and Documentation in Psychiatry (AMDP) System: Development, Methodological Foundation, Application in Clinical Routine, and Research.

    PubMed

    Stieglitz, Rolf-Dieter; Haug, Achim; Fähndrich, Erdmann; Rösler, Michael; Trabert, Wolfgang

    2017-01-01

    The documentation of psychopathology is core to the clinical practice of the psychiatrist and clinical psychologist. However, in both initial and further training and specialization in their fields, this particular aspect of their work receives only scant attention. Yet, for the past 50 years, the Association for Methodology and Documentation in Psychiatry (AMDP) System has been in existence and available as a tool to serve precisely the purpose of offering a systematic introduction to the terminology and documentation of psychopathology. The motivation for its development was based on the need for an assessment procedure for the reliable documentation of the effectiveness of newly developed psychopharmacological substances. Subsequently, the AMDP-System began to be applied in the context of investigations into a number of methodological issues in psychiatry (e.g., the frequency and specificity of particular symptoms, the comparison of rating scales). The System then became increasingly important also in clinical practice and, today, represents the most widely used instrument for the documentation of psychopathology in the German-speaking countries of Europe. This paper intends to offer an overview of the AMDP-System, its origins, design, and functionality. After an initial account of the history and development of the AMDP-System, the discussion will in turn focus on the System's underlying methodological principles, the transfer of clinical skills and competencies in its practical application, and its use in research and clinical practice. Finally, potential future areas of development in relation to the AMDP-System are explored.

  18. Comprehensive Psychopathological Assessment Based on the Association for Methodology and Documentation in Psychiatry (AMDP) System: Development, Methodological Foundation, Application in Clinical Routine, and Research

    PubMed Central

    Stieglitz, Rolf-Dieter; Haug, Achim; Fähndrich, Erdmann; Rösler, Michael; Trabert, Wolfgang

    2017-01-01

    The documentation of psychopathology is core to the clinical practice of the psychiatrist and clinical psychologist. However, in both initial and further training and specialization in their fields, this particular aspect of their work receives only scant attention. Yet, for the past 50 years, the Association for Methodology and Documentation in Psychiatry (AMDP) System has been in existence and available as a tool to serve precisely the purpose of offering a systematic introduction to the terminology and documentation of psychopathology. The motivation for its development was based on the need for an assessment procedure for the reliable documentation of the effectiveness of newly developed psychopharmacological substances. Subsequently, the AMDP-System began to be applied in the context of investigations into a number of methodological issues in psychiatry (e.g., the frequency and specificity of particular symptoms, the comparison of rating scales). The System then became increasingly important also in clinical practice and, today, represents the most widely used instrument for the documentation of psychopathology in the German-speaking countries of Europe. This paper intends to offer an overview of the AMDP-System, its origins, design, and functionality. After an initial account of the history and development of the AMDP-System, the discussion will in turn focus on the System’s underlying methodological principles, the transfer of clinical skills and competencies in its practical application, and its use in research and clinical practice. Finally, potential future areas of development in relation to the AMDP-System are explored. PMID:28439242

  19. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    NASA Astrophysics Data System (ADS)

    Luo, Keqin

    1999-11-01

    The electroplating industry, comprising over 10,000 plating plants nationwide, is one of the major industrial waste generators. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are generated daily in plants, which costs the industry tremendously for waste treatment and disposal and hinders its further development. There is therefore an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules; these form the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This allows a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed to quantify drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments on a typical plating line of an industrial partner. The unique contribution of this research is that, for the first time, the electroplating industry can (i) use systematically the available WM strategies, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work forms a solid foundation for the further development of powerful WM technologies for comprehensive waste minimization in the following decade.
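
    For illustration, the rule-based half of such a methodology can be sketched as a handful of fuzzy rules mapping observed process conditions to candidate waste-minimization strategies. The membership function, rule names, and thresholds below are invented for this sketch and are not taken from the actual WMEP-Advisor knowledge base.

    ```python
    # Minimal sketch of encoding waste-minimization (WM) strategies as fuzzy rules,
    # in the spirit of the knowledge-based part of the methodology described above.
    # Membership functions, rules, and thresholds are illustrative placeholders.

    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def advise(dragout_ml_per_rack, rinse_efficiency):
        """Return candidate WM strategies with fuzzy confidence in [0, 1]."""
        high_dragout = tri(dragout_ml_per_rack, 5.0, 20.0, 40.0)
        poor_rinse = 1.0 - rinse_efficiency  # crude complement of a [0, 1] efficiency
        rules = {
            "Install drain boards / extend drip time": high_dragout,
            "Convert to counter-current rinsing": min(high_dragout, poor_rinse),
            "Add conductivity-controlled rinse water feed": poor_rinse,
        }
        return {k: round(v, 2) for k, v in rules.items() if v > 0.1}

    print(advise(dragout_ml_per_rack=25.0, rinse_efficiency=0.6))
    ```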

  20. The U. S. Environmental Protection Agency's inhalation RfD methodology: Risk assessment for air toxics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarabek, A.M.; Menache, M.G.; Overton, J.H. Jr.

    1990-10-01

    The U.S. Environmental Protection Agency (U.S. EPA) has advocated the establishment of general and scientific guidelines for the evaluation of toxicological data and their use in deriving benchmark values to protect exposed populations from adverse health effects. The Agency's reference dose (RfD) methodology for deriving benchmark values for noncancer toxicity originally addressed risk assessment of oral exposures. This paper presents a brief background on the development of the inhalation reference dose (RfDi) methodology, including concepts and issues related to addressing the dynamics of the respiratory system as the portal of entry. Different dosimetric adjustments are described that were incorporated into the methodology to account for the nature of the inhaled agent (particle or gas) and the site of the observed toxic effects (respiratory or extra-respiratory). Impacts of these adjustments on the extrapolation of toxicity data of inhaled agents for human health risk assessment and future research directions are also discussed.
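
    As a rough illustration of the arithmetic behind such benchmark values, the sketch below adjusts an animal NOAEL to a continuous-exposure human equivalent concentration and divides by uncertainty factors. The single scalar dosimetric adjustment and all numeric values are placeholders; the actual RfDi methodology applies agent-specific (particle versus gas) dosimetry models rather than one factor.

    ```python
    # Simplified, illustrative sketch of an inhalation reference value calculation:
    # duration-adjust an animal NOAEL, apply a dosimetric adjustment to obtain a
    # human equivalent concentration (HEC), then divide by uncertainty factors.
    # All numbers and the single "dosimetric_adjustment" scalar are placeholders.

    def inhalation_reference_value(noael_mg_m3,
                                   hours_per_day, days_per_week,
                                   dosimetric_adjustment,
                                   uncertainty_factors):
        # Duration adjustment to a continuous-exposure basis
        noael_adj = noael_mg_m3 * (hours_per_day / 24.0) * (days_per_week / 7.0)
        # Animal-to-human dosimetric adjustment (placeholder scalar)
        noael_hec = noael_adj * dosimetric_adjustment
        # Divide by the product of uncertainty factors
        # (e.g., interspecies, intraspecies, subchronic-to-chronic)
        uf_total = 1.0
        for uf in uncertainty_factors:
            uf_total *= uf
        return noael_hec / uf_total

    print(inhalation_reference_value(50.0, 6, 5, 0.8, [10, 10, 3]))  # mg/m3
    ```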

  1. U. S. Environmental Protection Agency's inhalation RFD methodology: Risk assessment for air toxics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarabek, A.M.; Menache, M.G.; Overton, J.H.

    1989-01-01

    The U.S. Environmental Protection Agency (U.S. EPA) has advocated the establishment of general and scientific guidelines for the evaluation of toxicological data and their use in deriving benchmark values to protect exposed populations from adverse health effects. The Agency's reference dose (RfD) methodology for deriving benchmark values for noncancer toxicity originally addressed risk assessment of oral exposures. The paper presents a brief background on the development of the inhalation reference dose (RFDi) methodology, including concepts and issues related to addressing the dynamics of the respiratory system as the portal of entry. Different dosimetric adjustments are described that were incorporated into the methodology to account for the nature of the inhaled agent (particle or gas) and the site of the observed toxic effects (respiratory or extrarespiratory). Impacts of these adjustments on the extrapolation of toxicity data of inhaled agents for human health risk assessment and future research directions are also discussed.

  2. Planning, Execution, and Assessment of Effects-Based Operations (EBO)

    DTIC Science & Technology

    2006-05-01

    GMU (George Mason University) has developed a methodology named ECAD-EA (Effective Course of Action-Evolutionary Algorithm) for planning, executing, and assessing Effects-Based Operations (EBO), i.e., for identifying courses of action and times of execution that maximize the likelihood of achieving a desired effect.

  3. ASSESSING THE EFFECTS OF HABITAT ALTERATION ON BAY SCALLOP, ARGOPECTIN IRRADIANS, POPULATIONS

    EPA Science Inventory

    The U.S. EPA's National Health and Environmental Effects Laboratory is developing approaches for protecting and restoring the ecological integrity of aquatic ecosystems from the impacts of multiple aquatic stressors. These approaches will improve assessment methodologies, diagno...

  4. Effects of overweight vehicles on New York State DOT infrastructure.

    DOT National Transportation Integrated Search

    2015-09-01

    This report develops a methodology for estimating the effects of different categories of overweight : trucks on NYSDOT pavements and bridges. A data mining algorithm is used to categorize truck : data collected at several Weigh-In-Motion stations aro...

  5. 1-Propanol probing methodology: two-dimensional characterization of the effect of solute on H2O.

    PubMed

    Koga, Yoshikata

    2013-09-21

    The wording "hydrophobicity/hydrophilicity" has been used loosely, based on human experience. We have devised a more quantitative way to redefine "hydrophobes" and "hydrophiles" in terms of the mole fraction dependence pattern of one of the third derivative quantities, the enthalpic interaction between solute molecules. We then devised a thermodynamic methodology to characterize the effect of a solute on H2O in terms of its hydrophobicity and/or hydrophilicity. We use a thermodynamic signature, the enthalpic interaction of 1-propanol, H, to monitor how the test solute modifies H2O. By this method, characterization is facilitated by two indices: one pertaining to hydrophobicity and the other to hydrophilicity. Hence differences among amphiphiles are quantified in a two-dimensional manner. Furthermore, an individual ion can be characterized independently of its counter ion. Using this methodology, we have studied the effects of a number of solutes on H2O and gained some important new insights. For example, such commonly used examples of hydrophobes in the literature as tetramethyl urea, trimethylamine-N-oxide, and tetramethylammonium salts are in fact surprisingly hydrophilic. Hence conclusions about "hydrophobes" drawn using these samples ought to be interpreted with caution. The effects of anions on H2O found by this methodology follow the same sequence as the Hofmeister ranking, which will no doubt aid further investigation into this enigma in biochemistry. Thus, this methodology is likely to play an important role in characterizing the effects of solutes in H2O, and a perspective view may be useful. Here, we describe the basis on which the methodology was developed and the methodology itself in more detail than given in individual papers. We then summarize the results in two-dimensional hydrophobicity/hydrophilicity maps.

  6. A data-driven multi-model methodology with deep feature selection for short-term wind forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias

    With growing wind penetration into power systems worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by the first-layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks. The effectiveness of the proposed methodology is evaluated for 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network. Numerical results show that, compared to single-algorithm models, the developed multi-model framework with the deep feature selection procedure improves forecasting accuracy by up to 30%.
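
    A minimal sketch of the two-layer idea follows, with a simple univariate feature selector standing in for the paper's deep feature selection framework and synthetic data in place of the Surface Radiation network measurements; the scikit-learn estimators and all settings are illustrative assumptions, not the authors' implementation.

    ```python
    # Two-layer ("stacked") multi-model forecaster sketch: several first-layer
    # learners whose predictions are blended by a second-layer model.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.feature_selection import SelectKBest, f_regression
    from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor, StackingRegressor
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_error

    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 12))            # e.g., lagged wind speeds, met variables
    y = X[:, 0] * 2.0 + np.sin(X[:, 1]) + rng.normal(scale=0.3, size=2000)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    stack = make_pipeline(
        SelectKBest(f_regression, k=6),        # simple stand-in for deep feature selection
        StackingRegressor(
            estimators=[("rf", RandomForestRegressor(n_estimators=100, random_state=0)),
                        ("gbr", GradientBoostingRegressor(random_state=0))],
            final_estimator=Ridge(),           # second-layer blending model
        ),
    )
    stack.fit(X_train, y_train)
    print("MAE:", mean_absolute_error(y_test, stack.predict(X_test)))
    ```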

  7. Sharing methodology: a worked example of theoretical integration with qualitative data to clarify practical understanding of learning and generate new theoretical development.

    PubMed

    Yardley, Sarah; Brosnan, Caragh; Richardson, Jane

    2013-01-01

    Theoretical integration is a necessary element of study design if clarification of experiential learning is to be achieved, yet there are few published examples demonstrating how this can be done. This methodological article provides a worked example of research methodology that achieved clarification of authentic early experiences (AEEs) through a bi-directional approach to theory and data. Bi-directional refers to our simultaneous use of theory to guide and interrogate empirical data and the use of empirical data to refine theory. We explain the five steps of our methodological approach: (1) understanding the context; (2) critique of existing applications of socio-cultural models to inform study design; (3) data generation; (4) analysis and interpretation; and (5) theoretical development through a novel application of Metis. These steps resulted in an understanding of how and why different outcomes arose for students participating in AEEs. Our approach offers a mechanism for clarification, without which evidence-based, effective ways to maximise constructive learning cannot be developed. In our example it also contributed to greater theoretical understanding of the influence of social interactions. By sharing this example of research undertaken to develop both theory and educational practice, we hope to assist others seeking to conduct similar research.

  8. Cost-effectiveness analyses of hepatitis A vaccine: a systematic review to explore the effect of methodological quality on the economic attractiveness of vaccination strategies.

    PubMed

    Anonychuk, Andrea M; Tricco, Andrea C; Bauch, Chris T; Pham, Ba'; Gilca, Vladimir; Duval, Bernard; John-Baptiste, Ava; Woo, Gloria; Krahn, Murray

    2008-01-01

    Hepatitis A vaccines have been available for more than a decade. Because the burden of hepatitis A virus has fallen in developed countries, the appropriate role of vaccination programmes, especially universal vaccination strategies, remains unclear. Cost-effectiveness analysis is a useful method of relating the costs of vaccination to its benefits, and may inform policy. This article systematically reviews the evidence on the cost effectiveness of hepatitis A vaccination in varying populations, and explores the effects of methodological quality and key modelling issues on the cost-effectiveness ratios. Cost-effectiveness/cost-utility studies of hepatitis A vaccine were identified via a series of literature searches (MEDLINE, EMBASE, HSTAR and SSCI). Citations and full-text articles were reviewed independently by two reviewers. Reference searching, author searches and expert consultation ensured literature saturation. Incremental cost-effectiveness ratios (ICERs) were abstracted for base-case analyses, converted to $US, year 2005 values, and categorised to reflect various levels of cost effectiveness. Quality of reporting, methodological issues and key modelling issues were assessed using frameworks published in the literature. Thirty-one cost-effectiveness studies (including 12 cost-utility analyses) were included from full-text article review (n = 58) and citation screening (n = 570). These studies evaluated universal mass vaccination (n = 14), targeted vaccination (n = 17) and vaccination of susceptibles (i.e. individuals initially screened for antibody and, if susceptible, vaccinated) [n = 13]. For universal vaccination, 50% of the ICERs were <$US20 000 per QALY or life-year gained. Analyses evaluating vaccination in children, particularly in high incidence areas, produced the most attractive ICERs. For targeted vaccination, cost effectiveness was highly dependent on the risk of infection. Incidence, vaccine cost and discount rate were the most influential parameters in sensitivity analyses. Overall, analyses that evaluated the combined hepatitis A/hepatitis B vaccine, adjusted incidence for under-reporting, included societal costs and that came from studies of higher methodological quality tended to have more attractive cost-effectiveness ratios. Methodological quality varied across studies. Major methodological flaws included inappropriate model type, comparator, incidence estimate and inclusion/exclusion of costs.
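
    For readers unfamiliar with the summary measure used throughout the review, the incremental cost-effectiveness ratio is simply the cost difference between two strategies divided by their effect difference. The strategies and numbers in the sketch below are invented and are not drawn from any of the 31 reviewed studies.

    ```python
    # Worked illustration of an incremental cost-effectiveness ratio (ICER):
    # ICER = (cost_new - cost_comparator) / (effect_new - effect_comparator).
    # All figures are hypothetical placeholders.

    def icer(cost_new, cost_old, effect_new, effect_old):
        """Incremental cost per unit of effect (e.g., $ per QALY gained)."""
        return (cost_new - cost_old) / (effect_new - effect_old)

    # Hypothetical universal childhood vaccination vs. no vaccination
    cost_per_qaly = icer(cost_new=1_450_000, cost_old=1_200_000,
                         effect_new=10_018.0, effect_old=10_000.0)
    print(f"ICER: ${cost_per_qaly:,.0f} per QALY gained")  # about $13,889 per QALY
    ```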

  9. Transient Reliability of Ceramic Structures For Heat Engine Applications

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Jadaan, Osama M.

    2002-01-01

    The objective of this report was to develop a methodology to predict the time-dependent reliability (probability of failure) of brittle material components subjected to transient thermomechanical loading, taking into account the change in material response with time. This methodology for computing the transient reliability of ceramic components subjected to fluctuating thermomechanical loading was developed assuming slow crack growth (SCG) as the delayed mode of failure. It takes into account the effect of a Weibull modulus and material response that vary with time. The methodology was also coded into a beta version of NASA's CARES/Life code, and an example demonstrating its viability is presented.
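
    A toy discretization can make the idea of transient, time-dependent reliability concrete: approximate the survival probability over a stress history as a product of per-step Weibull survival probabilities, allowing the Weibull modulus to change with time. This is only an illustration with invented numbers and is not the slow-crack-growth formulation coded into CARES/Life.

    ```python
    # Crude time-stepped Weibull reliability sketch: multiply per-step survival
    # probabilities for a transient stress history, with a time-varying modulus.
    import math

    def survival_probability(stress_history, weibull_modulus, sigma_0):
        """stress_history: per-step stress (MPa); weibull_modulus: m(t) per step."""
        p_survive = 1.0
        for sigma, m in zip(stress_history, weibull_modulus):
            p_step = math.exp(-((sigma / sigma_0) ** m))
            p_survive *= p_step
        return p_survive

    stresses = [120.0, 180.0, 150.0]   # MPa, a transient thermomechanical cycle
    moduli   = [12.0, 11.5, 11.0]      # Weibull modulus drifting with time
    p_s = survival_probability(stresses, moduli, sigma_0=400.0)
    print(f"P(survive) = {p_s:.4f}, P(fail) = {1 - p_s:.4f}")
    ```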

  10. Development and testing of incident detection algorithms. Vol. 2, research methodology and detailed results.

    DOT National Transportation Integrated Search

    1976-04-01

    The development and testing of incident detection algorithms was based on Los Angeles and Minneapolis freeway surveillance data. Algorithms considered were based on times series and pattern recognition techniques. Attention was given to the effects o...

  11. Open Learning within Growing Businesses

    ERIC Educational Resources Information Center

    Klofsten, Magnus; Jones-Evans, Dylan

    2013-01-01

    Purpose: Understanding the factors behind successful enterprise policy interventions are critical in ensuring effective programme development. The aim of this paper is to analyse an academic-industry initiative in Sweden developed to support knowledge-intensive businesses in expanding their operations. Design/methodology/approach: This paper…

  12. Multirate flutter suppression system design for the Benchmark Active Controls Technology Wing

    NASA Technical Reports Server (NTRS)

    Berg, Martin C.; Mason, Gregory S.

    1994-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies will be applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing (also called the PAPA wing). Eventually, the designs will be implemented in hardware and tested on the BACT wing in a wind tunnel. This report describes a project at the University of Washington to design a multirate flutter suppression system for the BACT wing. The objective of the project was twofold: first, to develop a methodology for designing robust multirate compensators, and second, to demonstrate the methodology by applying it to the design of a multirate flutter suppression system for the BACT wing. The contributions of this project are (1) development of an algorithm for synthesizing robust low-order multirate control laws (the algorithm is capable of synthesizing a single compensator which stabilizes both the nominal plant and multiple plant perturbations); (2) development of a multirate design methodology, and supporting software, for modeling, analyzing and synthesizing multirate compensators; and (3) design of a multirate flutter suppression system for NASA's BACT wing which satisfies the specified design criteria. This report describes each of these contributions in detail. Section 2.0 discusses our design methodology. Section 3.0 details the results of our multirate flutter suppression system design for the BACT wing. Finally, Section 4.0 presents our conclusions and suggestions for future research. The body of the report focuses primarily on the results. The associated theoretical background appears in the three technical papers that are included as Attachments 1-3. Attachment 4 is a user's manual for the software that is key to our design methodology.

  13. Some methodological aspects of ethics committees' expertise: the Ukrainian example.

    PubMed

    Pustovit, Svitlana V

    2006-01-01

    Today, local, national and international ethics committees have become an effective means of social regulation in many European countries. Science itself is an important precondition for the development of bioethical knowledge and ethics expertise. Cultural, social, historical and religious preconditions can facilitate different forms and methods of ethics expertise in each country. Ukrainian ethics expertise has some methodological problems connected with the particularities of its socio-cultural, historical, scientific and philosophical development. In this context, clarification of some common regularities and methodological approaches to ethics committee (EC) phenomena, such as globalization, scientization and the prioritization of an ethics paradigm, is very important. On the other hand, elaborate study and critical analysis of international experience by Ukraine and other Eastern European countries will support the integration of their local and national ethics expertise into a world bioethics ethos.

  14. A novelty detection diagnostic methodology for gearboxes operating under fluctuating operating conditions using probabilistic techniques

    NASA Astrophysics Data System (ADS)

    Schmidt, S.; Heyns, P. S.; de Villiers, J. P.

    2018-02-01

    In this paper, a fault diagnostic methodology is developed which is able to detect, locate and trend gear faults under fluctuating operating conditions when only vibration data from a single transducer, measured on a healthy gearbox, are available. A two-phase feature extraction and modelling process is proposed to infer the operating condition and, based on the operating condition, to detect changes in the machine condition. Information from optimised machine and operating condition hidden Markov models is statistically combined to generate a discrepancy signal, which is post-processed to infer the condition of the gearbox. The discrepancy signal is processed and combined with statistical methods for automatic fault detection and localisation and to perform fault trending over time. The proposed methodology is validated on experimental data, and a tacholess order tracking methodology is used to enhance the cost-effectiveness of the diagnostic methodology.
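
    A heavily simplified sketch of the core idea, a model trained on healthy data whose per-window negative log-likelihood serves as a discrepancy signal, is shown below using a single Gaussian HMM (the hmmlearn package and the synthetic features are assumptions of the sketch). The actual methodology combines separate machine- and operating-condition models and a tacholess order-tracking front end, none of which is reproduced here.

    ```python
    # Sketch of HMM-based novelty detection: fit an HMM to healthy-gearbox features,
    # then use the negative log-likelihood of new windows as a discrepancy signal.
    import numpy as np
    from hmmlearn.hmm import GaussianHMM   # assumes hmmlearn is installed

    rng = np.random.default_rng(1)
    healthy = rng.normal(0.0, 1.0, size=(500, 3))   # stand-in vibration features
    faulty  = rng.normal(1.5, 1.3, size=(100, 3))   # shifted statistics -> "fault"

    model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50, random_state=0)
    model.fit(healthy)

    def discrepancy(window):
        """Higher value = less like the healthy model."""
        return -model.score(window) / len(window)   # per-sample negative log-likelihood

    print("healthy window:", round(discrepancy(healthy[-100:]), 2))
    print("faulty  window:", round(discrepancy(faulty), 2))  # expect a larger value
    ```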

  15. Effects of Tennis Training on Personality Development in Children and Early Adolescents

    ERIC Educational Resources Information Center

    Demir, Erdal; Sahin, Gülsah; Sentürk, Ugur; Aydin, Halide; Altinkök, Mustafa

    2016-01-01

    The objective of this study was to investigate the effects of a 12-week basic tennis training program on the personality development of early adolescents aged between 9 and 11 years. The research methodology consisted of a single group pre-test/post-test design implemented with a total of eight volunteer children (three boys and five girls). The…

  16. The Effect of Perceived Spiritual Leadership on Envy Management of Faculty Members through the Role of Professional Development Mediation and Job Satisfaction

    ERIC Educational Resources Information Center

    Haris, Zarin Daneshvar; Saidabadi, Reza Yousefi; Niazazari, Kiumars

    2016-01-01

    Purpose: The present study aimed to investigate the effect of perceived spiritual leadership on envy management of faculty members of Islamic Azad Universities of East Azerbaijan province through the role of professional development mediation and job satisfaction. Methodology: This study was a descriptive and correlational study that was conducted…

  17. The Applications of Computers in Education in Developing Countries--with Specific Reference to the Cost-Effectiveness of Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Lai, Kwok-Wing

    Designed to examine the application and cost-effectiveness of computer-assisted instruction (CAI) for secondary education in developing countries, this document is divided into eight chapters. A general introduction defines the research problem, describes the research methodology, and provides definitions of key terms used throughout the paper.…

  18. Advanced piloted aircraft flight control system design methodology. Volume 2: The FCX flight control design expert system

    NASA Technical Reports Server (NTRS)

    Myers, Thomas T.; Mcruer, Duane T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for the conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages, starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. The FCX expert system as presently developed is only a limited prototype capable of supporting basic lateral-directional FCS design activities related to the design example used. FCX presently supports design of only one FCS architecture (yaw damper plus roll damper), and the rules are largely focused on Class IV (highly maneuverable) aircraft. Despite this limited scope, the major elements which appear necessary for applying knowledge-based software concepts to flight control design were assembled, and thus FCX represents a prototype which can be tested, critiqued and evolved in an ongoing process of development.

  19. SH-2F LAMPS Instructional Systems Development: Phase II. Final Report.

    ERIC Educational Resources Information Center

    Gibbons, Andrew S.; Hymes, Jonah P.

    This project was one of four aircrew training development projects in a continuing study of the methodology, effectiveness, and resource requirements of the Instructional Systems Development (ISD) process. This report covers the Phase II activities of a two-phase project for the development of aircrew training for SH-2F anti-submarine warfare…

  20. A mechanics framework for a progressive failure methodology for laminated composites

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Allen, David H.; Lo, David C.

    1989-01-01

    A laminate strength and life prediction methodology has been postulated for laminated composites which accounts for the progressive development of microstructural damage up to structural failure. A damage-dependent constitutive model predicts, in an average sense, the stress redistribution that accompanies damage development in laminates. Each mode of microstructural damage is represented by a second-order tensor-valued internal state variable which is a strain-like quantity. The mechanics framework, together with the global-local strategy for predicting laminate strength and life, is presented in the paper. The kinematic effects of damage are represented by effective engineering moduli in the global analysis, and the results of the global analysis provide the boundary conditions for the local ply-level stress analysis. Damage evolution laws are based on experimental results.

  1. Power processing methodology. [computerized design of spacecraft electric power systems

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hansen, I. G.; Hayden, J. H.

    1974-01-01

    Discussion of the interim results of a program to investigate the feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems. The object of the total program is to develop a flexible engineering tool which will allow the power processor designer to effectively and rapidly assess and analyze the tradeoffs available by providing, in one comprehensive program, a mathematical model, an analysis of expected performance, simulation, and a comparative evaluation with alternative designs. This requires an understanding of electrical power source characteristics and the effects of load control, protection, and total system interaction.

  2. Optimization of the computational load of a hypercube supercomputer onboard a mobile robot.

    PubMed

    Barhen, J; Toomarian, N; Protopopescu, V

    1987-12-01

    A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing, or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.

  3. Optimization of the computational load of a hypercube supercomputer onboard a mobile robot

    NASA Technical Reports Server (NTRS)

    Barhen, Jacob; Toomarian, N.; Protopopescu, V.

    1987-01-01

    A combinatorial optimization methodology is developed, which enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented in terms of large-scale concurrent algorithms based either on fast simulated annealing, or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the systems' configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.
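
    As a toy illustration of the combinatorial optimization being described, the sketch below anneals an assignment of tasks to hypercube nodes against a made-up configuration energy (load imbalance plus a communication penalty). The energy function, task costs, and cooling schedule are invented for the example and are not the authors' formulation.

    ```python
    # Toy simulated-annealing assignment of tasks to multiprocessor nodes.
    import math, random

    random.seed(0)
    N_TASKS, N_NODES = 20, 4
    cost = [random.uniform(1.0, 5.0) for _ in range(N_TASKS)]    # per-task compute cost
    comm_pairs = [(i, i + 1) for i in range(N_TASKS - 1)]        # chain of dependencies

    def energy(assign):
        """Configuration energy: load imbalance plus communication penalty."""
        loads = [0.0] * N_NODES
        for t, node in enumerate(assign):
            loads[node] += cost[t]
        imbalance = max(loads) - min(loads)
        comm = sum(1.0 for a, b in comm_pairs if assign[a] != assign[b])
        return imbalance + 0.5 * comm

    assign = [random.randrange(N_NODES) for _ in range(N_TASKS)]
    T = 5.0
    for step in range(20000):
        t = random.randrange(N_TASKS)
        old_node = assign[t]
        before = energy(assign)
        assign[t] = random.randrange(N_NODES)
        delta = energy(assign) - before
        if delta > 0 and random.random() >= math.exp(-delta / T):
            assign[t] = old_node          # reject the uphill move
        T *= 0.9997                       # geometric cooling schedule

    print("final energy:", round(energy(assign), 3))
    ```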

  4. Economic evaluation of health promotion interventions for older people: do applied economic studies meet the methodological challenges?

    PubMed

    Huter, Kai; Dubas-Jakóbczyk, Katarzyna; Kocot, Ewa; Kissimova-Skarbek, Katarzyna; Rothgang, Heinz

    2018-01-01

    In the light of demographic developments, health promotion interventions for older people are gaining importance. In addition to methodological challenges arising from the economic evaluation of health promotion interventions in general, there are specific methodological problems for the particular target group of older people. Four main methodological challenges are discussed in the literature: measurement and valuation of informal caregiving, accounting for productivity costs, the effects of unrelated costs in added life years, and the inclusion of 'beyond-health' benefits. This paper focuses on the question of whether, and to what extent, these specific methodological requirements are actually met in applied health economic evaluations. Following a systematic review of pertinent health economic evaluations, the included studies are analysed on the basis of four assessment criteria derived from methodological debates on the economic evaluation of health promotion interventions in general and economic evaluations targeting older people in particular. Of the 37 studies included in the systematic review, only very few include cost and outcome categories discussed as being of specific relevance to the assessment of health promotion interventions for older people. The few studies that consider these aspects use very heterogeneous methods, so there is no common methodological standard. There is a strong need for the development of guidelines to achieve better comparability and to include cost categories and outcomes that are relevant for older people. Disregarding these methodological obstacles could implicitly lead to discrimination against the elderly in terms of health promotion and disease prevention and, hence, to age-based rationing of public health care.

  5. Design, Progressive Modeling, Manufacture, and Testing of Composite Shield for Turbine Engine Blade Containment

    NASA Technical Reports Server (NTRS)

    Binienda, Wieslaw K.; Sancaktar, Erol; Roberts, Gary D. (Technical Monitor)

    2002-01-01

    An effective design methodology was established for composite jet engine containment structures. The methodology included the development of full- and reduced-size prototypes and FEA models of the containment structure, experimental and numerical examination of the modes of failure due to a turbine blade-out event, identification of materials and design candidates for future industrial applications, and design and building of prototypes for testing and evaluation purposes.

  6. Effects of surface chemistry on hot corrosion life

    NASA Technical Reports Server (NTRS)

    Fryxell, R. E.; Leese, G. E.

    1985-01-01

    This program has as its primary objective the development of a hot corrosion life prediction methodology based on a combination of laboratory test data and the evaluation of field service turbine components which show evidence of hot corrosion. The laboratory program comprises burner rig testing by TRW. A summary of results is given for two series of burner rig tests. The life prediction methodology parameters to be appraised in a final campaign of burner rig tests are outlined.

  7. Parallel processing in a host plus multiple array processor system for radar

    NASA Technical Reports Server (NTRS)

    Barkan, B. Z.

    1983-01-01

    Host plus multiple array processor architecture is demonstrated to yield a modular, fast, and cost-effective system for radar processing. Software methodology for programming such a system is developed. Parallel processing with pipelined data flow among the host, array processors, and discs is implemented. Theoretical analysis of performance is made and experimentally verified. The broad class of problems to which the architecture and methodology can be applied is indicated.

  8. WOULD YOU BELIEVE A 20% EXCESS RISK OF CARDIOVASCULAR MORTALITY FOR A 10UG/M3 INCREASE IN FINE PM (FOR PEOPLE 65-99 YEARS OLD) IN PHOENIX, AZ 1995-1997? IF SO, WHAT IS SPECIAL ABOUT PHOENIX? IF NOT, FIND THE ERROR!

    EPA Science Inventory

    The US EPA National Center for Environmental Assessment has developed a methodology to derive acute inhalation toxicity benchmarks, called acute reference exposures (AREs), for noncancer effects. The methodology provides guidance for the derivation of chemical-specific benchmark...

  9. Technology CAD for integrated circuit fabrication technology development and technology transfer

    NASA Astrophysics Data System (ADS)

    Saha, Samar

    2003-07-01

    In this paper, systematic simulation-based methodologies for integrated circuit (IC) manufacturing technology development and technology transfer are presented. In technology development, technology computer-aided design (TCAD) tools are used to optimize device and process parameters for a new generation of IC manufacturing technology by reverse engineering from the target product specifications. In technology transfer to a manufacturing co-location, TCAD is used for process centering with respect to the high-volume manufacturing equipment of the target manufacturing facility. A quantitative model is developed to demonstrate the potential benefits of the simulation-based methodology in reducing the cycle time and cost of typical technology development and technology transfer projects relative to traditional practices. The strategy for predictive simulation to improve the effectiveness of a TCAD-based project is also discussed.

  10. Human-Computer System Development Methodology for the Dialogue Management System.

    DTIC Science & Technology

    1982-05-01

    methodologies [HOSIJ78] are given below: 1. The Michael Jackson Methodology [JACKM75]; 2. The Warnier-Orr Methodology [HOSIJ78]; 3. SADT (Structured... All the mentioned methodologies use a top-down development strategy. The first two methodologies above (Michael Jackson and Warnier-Orr) use data as the...

  11. An Evaluation of Management Training and Coaching

    ERIC Educational Resources Information Center

    Berg, Morten Emil; Karlsen, Jan Terje

    2012-01-01

    Purpose: The focus of this paper is on management training and development. The purpose has been to address how coaching can be applied to learn about leadership tools and what effect this has on management behaviour and development. Design/methodology/approach: This is a qualitative case study of a management development program. The empirical…

  12. Leadership Development in Social Housing: A Research Agenda

    ERIC Educational Resources Information Center

    Ward, Carolyn; Blenkinsopp, John; McCauley-Smith, Catherine

    2010-01-01

    Purpose: The purpose of this paper is to develop a research agenda to underpin leadership development activity in the social housing sector, in the light of an identified need for effective leadership in this sector owing to the continual reform and changes it faces. Design/methodology/approach: A literature review is conducted by searching a…

  13. Early Experience and the Development of Cognitive Competence: Some Theoretical and Methodological Issues.

    ERIC Educational Resources Information Center

    Ulvund, Stein Erik

    1982-01-01

    Argues that in analyzing the effects of early experience on the development of cognitive competence, theoretical analyses as well as empirical investigations should be based on a transactional model of development. Shows that the optimal stimulation hypothesis, particularly the enhancement prediction, seems to represent a transactional approach to the study of…

  14. Development of Usability Criteria for E-Learning Content Development Software

    ERIC Educational Resources Information Center

    Celik, Serkan

    2012-01-01

    Revolutionary advancements have been observed in e-learning technologies, though an amalgamated evaluation methodology for new-generation e-learning content development tools is not available. The evaluation of educational software for online use must consider its usability as well as its pedagogic effectiveness. This study is a first step…

  15. Use of Case Study Methods in Human Resource Management, Development, and Training Courses: Strategies and Techniques

    ERIC Educational Resources Information Center

    Maxwell, James R.; Gilberti, Anthony F.; Mupinga, Davison M.

    2006-01-01

    This paper will study some of the problems associated with case studies and make recommendations using standard and innovative methodologies effectively. Human resource management (HRM) and resource development cases provide context for analysis and decision-making designs in different industries. In most HRM development and training courses…

  16. Talent Development Environment and Workplace Adaptation: The Mediating Effects of Organisational Support

    ERIC Educational Resources Information Center

    Kunasegaran, Mageswari; Ismail, Maimunah; Rasdi, Roziah Mohd; Ismail, Ismi Arif; Ramayah, T.

    2016-01-01

    Purpose: This study aims to examine the relationship between talent development environment (TDE) variables of job focus and long-term development with the workplace adaptation (WA) of Malaysian professional returnees as mediated by the organisational support. Design/methodology/approach: A total of 130 respondents who are Malaysian professional…

  17. ISSUES IN DEVELOPING A TWO-GENERATION AVIAN TOXICITY TEST WITH JAPANESE QUAIL

    EPA Science Inventory

    As a subgroup of the OECD Expert Group on Assessment of Endocrine Disrupting Effects in Birds, we reviewed unresolved methodological issues important for the development of a two-generation toxicity test, discussed advantages and disadvantages of alternative approaches, and prop...

  18. Assurance of Learning in the MIS Program

    ERIC Educational Resources Information Center

    Harper, Jeffrey S.; Harder, Joseph T.

    2009-01-01

    This article describes the development of a systematic and practical methodology for assessing program effectiveness and monitoring student development in undergraduate decision sciences programs. The model we present is based on a student's progression through learning stages associated with four key competencies: technical, analytical,…

  19. Suggested criteria for evaluating systems engineering methodologies

    NASA Technical Reports Server (NTRS)

    Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.

    1989-01-01

    Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.

  20. System cost/performance analysis (study 2.3). Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Kazangey, T.

    1973-01-01

    The relationships between performance, safety, cost, and schedule parameters were identified and quantified in support of an overall effort to generate program models and methodology that provide insight into a total space vehicle program. A specific space vehicle system, the attitude control system (ACS), was used, and a modeling methodology was selected that develops a consistent set of quantitative relationships among performance, safety, cost, and schedule, based on the characteristics of the components utilized in candidate mechanisms. These descriptive equations were developed for a three-axis, earth-pointing, mass expulsion ACS. A data base describing typical candidate ACS components was implemented, along with a computer program to perform sample calculations. This approach, implemented on a computer, is capable of determining the effect of a change in functional requirements to the ACS mechanization and the resulting cost and schedule. By a simple extension of this modeling methodology to the other systems in a space vehicle, a complete space vehicle model can be developed. Study results and recommendations are presented.

  1. Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O`Kula, K.R.; Paik, I.K.; Chung, D.Y.

    1996-12-31

    Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological) are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.

  2. A macro environmental risk assessment methodology for establishing priorities among risks to human health and the environment in the Philippines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gernhofer, S.; Oliver, T.J.; Vasquez, R.

    1994-12-31

    A macro environmental risk assessment (ERA) methodology was developed for the Philippine Department of Environment and Natural Resources (DENR) as part of the US Agency for International Development Industrial Environmental Management Project. The DENR allocates its limited resources to mitigate those environmental problems that pose the greatest threat to human health and the environment. The National Regional Industry Prioritization Strategy (NRIPS) methodology was developed as a risk assessment tool to establish a national ranking of industrial facilities. The ranking establishes regional and national priorities, based on risk factors, that DENR can use to determine the most effective allocation of its limited resources. NRIPS is a systematic framework that examines the potential risk to human health and the environment from hazardous substances released from a facility, and, in doing so, generates a relative numerical score that represents that risk. More than 3,300 facilities throughout the Philippines were evaluated successfully with the NRIPS.

  3. The development of evidence-based guidelines in dentistry.

    PubMed

    Faggion, C M

    2013-02-01

    Use of guidelines is an important means of reducing the gap between research and clinical practice. Sound and unbiased information should be available to enable dental professionals to provide better clinical treatment for their patients. The development of clinical guidelines in dentistry should follow standard and transparent methodology. The purpose of this article is to propose important steps for developing evidence-based clinical recommendations in dentistry. Initially, dental guidelines should be extensively sought and assessed to answer focused clinical questions. If there is a paucity of guidelines or if existing guidelines are not of good methodological quality, systematic reviews should be searched or conducted to serve as a basis for the development of evidence-based guidelines. When systematic reviews are produced, they should be rigorous in order to provide the best evidence possible. In the last phase of the process, the overall quality of evidence should be scrutinized and assessed, together with other factors (balance between treatment effects and side effects, patients' values, and cost-effectiveness of therapy) to determine the strength of recommendations. It is expected this approach will result in the development of sound clinical guidelines and consequent improvement of dental treatment.

  4. The need for a comprehensive expert system development methodology

    NASA Technical Reports Server (NTRS)

    Baumert, John; Critchfield, Anna; Leavitt, Karen

    1988-01-01

    In a traditional software development environment, the introduction of standardized approaches has led to higher-quality, maintainable products on the technical side and greater visibility into the status of the effort on the management side. This study examined expert system development to determine whether it differed enough from traditional systems to warrant a reevaluation of current software development methodologies. Its purpose was to identify areas of similarity with traditional software development and areas requiring tailoring to the unique needs of expert systems. A second purpose was to determine whether existing expert system development methodologies meet the needs of expert system development, management, and maintenance personnel. The study consisted of a literature search and personal interviews. It was determined that existing methodologies and approaches to developing expert systems are neither comprehensive nor easily applied, especially to cradle-to-grave system development. As a result, requirements were derived for an expert system development methodology, and an initial annotated outline was derived for such a methodology.

  5. Decision analysis to complete diagnostic research by closing the gap between test characteristics and cost-effectiveness.

    PubMed

    Schaafsma, Joanna D; van der Graaf, Yolanda; Rinkel, Gabriel J E; Buskens, Erik

    2009-12-01

    The lack of a standard methodology in diagnostic research impedes adequate evaluation before implementation of constantly developing diagnostic techniques. We discuss the methodology of diagnostic research and underscore the relevance of decision analysis in the process of evaluation of diagnostic tests. Overview and conceptual discussion. Diagnostic research requires a stepwise approach comprising assessment of test characteristics followed by evaluation of added value, clinical outcome, and cost-effectiveness. These multiple goals are generally incompatible with a randomized design. Decision-analytic models provide an important alternative through integration of the best available evidence. Thus, critical assessment of clinical value and efficient use of resources can be achieved. Decision-analytic models should be considered part of the standard methodology in diagnostic research. They can serve as a valid alternative to diagnostic randomized clinical trials (RCTs).
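
    As a rough illustration of the decision-analytic modelling the authors advocate, the sketch below compares two diagnostic strategies on expected cost and expected effectiveness and computes an incremental cost-effectiveness ratio; every prevalence, test characteristic, cost, and effectiveness value is invented for the example.

    ```python
    # Minimal decision-analytic sketch comparing two diagnostic strategies.
    # All probabilities, costs, and effectiveness values are hypothetical.

    prevalence = 0.10  # prior probability of disease

    strategies = {
        # sensitivity, specificity, test cost
        "new_test": {"sens": 0.95, "spec": 0.90, "cost": 300.0},
        "standard": {"sens": 0.80, "spec": 0.85, "cost": 100.0},
    }

    COST_TREAT = 5000.0       # cost of treating a (true or false) positive
    QALY_TREATED_TP = 0.90    # effectiveness if disease is detected and treated
    QALY_UNTREATED_FN = 0.60  # effectiveness if disease is missed
    QALY_HEALTHY = 0.95       # effectiveness without disease

    def evaluate(s):
        tp = prevalence * s["sens"]
        fn = prevalence * (1 - s["sens"])
        fp = (1 - prevalence) * (1 - s["spec"])
        tn = (1 - prevalence) * s["spec"]
        cost = s["cost"] + (tp + fp) * COST_TREAT
        effect = tp * QALY_TREATED_TP + fn * QALY_UNTREATED_FN + (fp + tn) * QALY_HEALTHY
        return cost, effect

    (c_new, e_new), (c_std, e_std) = evaluate(strategies["new_test"]), evaluate(strategies["standard"])
    icer = (c_new - c_std) / (e_new - e_std)  # incremental cost-effectiveness ratio
    print(f"new test: cost={c_new:.0f}, effect={e_new:.3f}")
    print(f"standard: cost={c_std:.0f}, effect={e_std:.3f}")
    print(f"ICER = {icer:.0f} per unit of effectiveness gained")
    ```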

  6. An applicational process for dynamic balancing of turbomachinery shafting

    NASA Technical Reports Server (NTRS)

    Verhoff, Vincent G.

    1990-01-01

    The NASA Lewis Research Center has developed and implemented a time-efficient methodology for dynamically balancing turbomachinery shafting. This methodology minimizes costly facility downtime by using a balancing arbor (mandrel) that simulates the turbomachinery (rig) shafting. The need for precision dynamic balancing of turbomachinery shafting and for a dynamic balancing methodology is discussed in detail. Additionally, the inherent problems (and their causes and effects) associated with unbalanced turbomachinery shafting as a function of increasing shaft rotational speeds are discussed. Included are the design criteria concerning rotor weight differentials for rotors made of different materials that have similar parameters and shafting. The balancing methodology for applications where rotor replaceability is a requirement is also covered. This report is intended for use as a reference when designing, fabricating, and troubleshooting turbomachinery shafting.
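
    A short calculation illustrates why precision balancing matters more as shaft speed rises: the centrifugal force produced by a residual unbalance grows with the square of rotational speed. The residual unbalance value below is an assumed figure, not one taken from the report.

    ```python
    # Illustration of why unbalance becomes critical at high shaft speeds:
    # the force from a residual unbalance scales with speed squared.
    import math

    unbalance_g_mm = 50.0        # residual unbalance (gram-millimetres), assumed
    m_e = unbalance_g_mm * 1e-6  # convert to kg*m

    for rpm in (5_000, 20_000, 60_000):
        omega = 2 * math.pi * rpm / 60  # shaft speed in rad/s
        force = m_e * omega**2          # F = m*e*omega^2, in newtons
        print(f"{rpm:>6} rpm -> unbalance force {force:,.0f} N")
    ```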

  7. 12th meeting of the Scientific Group on Methodologies for the Safety Evaluation of Chemicals: susceptibility to environmental hazards.

    PubMed Central

    Barrett, J C; Vainio, H; Peakall, D; Goldstein, B D

    1997-01-01

    The 12th meeting of the Scientific Group on Methodologies for the Safety Evaluation of Chemicals (SGOMSEC) considered the topic of methodologies for determining human and ecosystem susceptibility to environmental hazards. The report prepared at the meeting describes measurement of susceptibility through the use of biological markers of exposure, biological markers of effect, and biomarkers directly indicative of susceptibility of humans or of ecosystems. The utility and validity of these biological markers for the study of susceptibility are evaluated, as are opportunities for developing newer approaches for the study of humans or of ecosystems. For the first time a SGOMSEC workshop also formally considered the issue of ethics in relation to methodology, an issue of particular concern for studies of susceptibility. PMID:9255554

  8. The Greek National Observatory of Forest Fires (NOFFi)

    NASA Astrophysics Data System (ADS)

    Tompoulidou, Maria; Stefanidou, Alexandra; Grigoriadis, Dionysios; Dragozi, Eleni; Stavrakoudis, Dimitris; Gitas, Ioannis Z.

    2016-08-01

    Efficient forest fire management is a key element for alleviating the catastrophic impacts of wildfires. Overall, the effective response to fire events necessitates adequate planning and preparedness before the start of the fire season, as well as quantifying the environmental impacts in case of wildfires. Moreover, the estimation of fire danger provides crucial information required for the optimal allocation and distribution of the available resources. The Greek National Observatory of Forest Fires (NOFFi)—established by the Greek Forestry Service in collaboration with the Laboratory of Forest Management and Remote Sensing of the Aristotle University of Thessaloniki and the International Balkan Center—aims to develop a series of modern products and services for supporting the efficient forest fire prevention management in Greece and the Balkan region, as well as to stimulate the development of transnational fire prevention and impacts mitigation policies. More specifically, NOFFi provides three main fire-related products and services: a) a remote sensing-based fuel type mapping methodology, b) a semi-automatic burned area mapping service, and c) a dynamically updatable fire danger index providing mid- to long-term predictions. The fuel type mapping methodology was developed and applied across the country, following an object-oriented approach and using Landsat 8 OLI satellite imagery. The results showcase the effectiveness of the generated methodology in obtaining highly accurate fuel type maps on a national level. The burned area mapping methodology was developed as a semi-automatic object-based classification process, carefully crafted to minimize user interaction and, hence, be easily applicable on a near real-time operational level as well as for mapping historical events. NOFFi's products can be visualized through the interactive Fire Forest portal, which allows the involvement and awareness of the relevant stakeholders via the Public Participation GIS (PPGIS) tool.
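
    NOFFi's burned area service is a semi-automatic object-based classification; the fragment below is not that workflow, only a much simpler pixel-based illustration of burned-area detection using the differenced Normalized Burn Ratio (dNBR) on hypothetical pre- and post-fire NIR/SWIR reflectance.

    ```python
    # Illustration only: a simple pixel-based burned-area estimate using dNBR
    # from pre- and post-fire NIR/SWIR reflectance. NOFFi's operational service
    # is a semi-automatic object-based classification, not this sketch.

    import numpy as np

    def nbr(nir, swir):
        """Normalized Burn Ratio, computed per pixel."""
        return (nir - swir) / (nir + swir + 1e-9)

    # Hypothetical 3x3 reflectance chips (e.g., Landsat 8 OLI bands 5 and 7).
    nir_pre,  swir_pre  = np.full((3, 3), 0.40), np.full((3, 3), 0.15)
    nir_post, swir_post = np.array([[0.40, 0.18, 0.17],
                                    [0.39, 0.16, 0.15],
                                    [0.41, 0.40, 0.39]]), np.full((3, 3), 0.20)

    dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
    burned_mask = dnbr > 0.27  # illustrative threshold for burned pixels
    print("dNBR:\n", np.round(dnbr, 2))
    print("burned pixels:", int(burned_mask.sum()), "of", burned_mask.size)
    ```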

  9. Effects of heavy ion radiation on the brain vascular system and embryonic development

    NASA Technical Reports Server (NTRS)

    Yang, T. C.; Tobias, C. A.

    1984-01-01

    The present investigation is concerned with the effects of heavy-ion radiation on the vascular system and the embryonic development, taking into account the results of experiments with neonatal rats and mouse embryos. It is found that heavy ions can be highly effective in producing brain hemorrhages and in causing body deformities. Attention is given to aspects of methodology, the induction of brain hemorrhages by X-rays and heavy ions, and the effect of iron particles on embryonic development. Reported results suggest that high linear energy transfer (LET) heavy ions can be very effective in producing developmental abnormalities.

  10. Development of an Unstructured Mesh Code for Flows About Complete Vehicles

    NASA Technical Reports Server (NTRS)

    Peraire, Jaime; Gupta, K. K. (Technical Monitor)

    2001-01-01

    This report describes the research work undertaken at the Massachusetts Institute of Technology, under NASA Research Grant NAG4-157. The aim of this research is to identify effective algorithms and methodologies for the efficient and routine solution of flow simulations about complete vehicle configurations. For over ten years we have received support from NASA to develop unstructured mesh methods for Computational Fluid Dynamics. As a result of this effort a methodology based on the use of unstructured adapted meshes of tetrahedra and finite volume flow solvers has been developed. A number of gridding algorithms, flow solvers, and adaptive strategies have been proposed. The most successful of these algorithms form the basis of the unstructured mesh system FELISA. The FELISA system has been used extensively for the analysis of transonic and hypersonic flows about complete vehicle configurations. The system is highly automatic and allows for the routine aerodynamic analysis of complex configurations starting from CAD data. The code has been parallelized and utilizes efficient solution algorithms. For hypersonic flows, a version of the code that incorporates real gas effects has been produced. The FELISA system is also a component of the STARS aeroservoelastic system developed at NASA Dryden. One of the latest developments before the start of this grant was to extend the system to include viscous effects. This required the development of viscous grid generators, capable of generating the anisotropic grids required to represent boundary layers, and viscous flow solvers. We show some sample hypersonic viscous computations using the developed viscous grid generators and solvers. Although these initial results were encouraging, it became apparent that in order to develop a fully functional capability for viscous flows, several advances in solution accuracy, robustness and efficiency were required. In this grant we set out to investigate some novel methodologies that could lead to the required improvements. In particular we focused on two fronts: (1) finite element methods and (2) iterative algebraic multigrid solution techniques.

  11. Using an action research process in pharmacy practice research--a cooperative project between university and internship pharmacies.

    PubMed

    Sørensen, Ellen Westh; Haugbølle, Lotte Stig

    2008-12-01

    Action research (AR) is a common research-based methodology useful for development and organizational changes in health care when participant involvement is key. However, AR is not widely used for research in the development of pharmaceutical care services in pharmacy practice. To disseminate the experience from using AR methodology to develop cognitive services in pharmacies by describing how the AR process was conducted in a specific study, and to describe the outcome for participants. The study was conducted over a 3-year period and run by a steering group of researchers, pharmacy students, and preceptors. The study design was based on AR methodology. The following data production methods were used to describe and evaluate the AR model: documentary analysis, qualitative interviews, and questionnaires. Experiences from using AR methodology and the outcome for participants are described. A set of principles was followed while the study, called the Pharmacy-University study, was being conducted. These principles are considered useful for designing future AR studies. Outcome for participating pharmacies was registered for staff-oriented and patient-oriented activities. Outcome for students was practice as project leaders and enhancement of clinical pharmacy-based skills. Outcome for researchers and the steering group conducting the study was in-depth knowledge of the status of pharmacies in giving advice to patient groups, and effective learning methods for students. Developing and implementing cognitive pharmaceutical services (CPS) involves wide-reaching changes that require the willingness of pharmacy and staff as well as external partners. The use of AR methodology creates a platform that supports raising the awareness and the possible inclusion of these partners. During this study, a set of tools was developed for use in implementing CPS as part of AR.

  12. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques

    DTIC Science & Technology

    2017-11-01

    ARL-TR-8225, November 2017, US Army Research Laboratory: technical report describing a methodology for designing and developing a new ultra-wideband antenna based on bio-inspired optimization techniques.

  13. Alignment of Developments in Higher Education

    ERIC Educational Resources Information Center

    Cowan, John; George, Judith W.; Pinheiro-Torres, Andreia

    2004-01-01

    This study builds upon the concept of alignment within the curriculum (due to Biggs) and suggests, in the context of two current examples, an integrated methodology for effectively aligned development activities within universities. Higher Education institutions face important challenges. Firstly, quality enhancement of the curriculum is now an…

  14. Becoming Independent Storytellers: Modeling Children's Development of Narrative Macrostructure

    ERIC Educational Resources Information Center

    Kelly, Kimberly Reynolds; Bailey, Alison Louise

    2013-01-01

    For parents to provide effective support for their children's language development, they must be attuned to their child's changing abilities. This study presents a theoretically driven strategy that addresses a methodological challenge present when tracking longitudinally the cessation or "fading" of behaviors by capturing withdrawal of…

  15. Entrepreneurial Education at University Level and Entrepreneurship Development

    ERIC Educational Resources Information Center

    Hasan, Sk. Mahmudul; Khan, Eijaz Ahmed; Nabi, Md. Noor Un

    2017-01-01

    Purpose: The purpose of this paper is to contribute to the literature on effectiveness of entrepreneurship education by empirically assessing the role of university entrepreneurial education in entrepreneurship development and reporting the results. Design/methodology/approach: A quantitative method was applied for this study. This research was…

  16. Cost-Effectiveness Analysis of the Automation of a Circulation System.

    ERIC Educational Resources Information Center

    Mosley, Isobel

    A general methodology for cost effectiveness analysis was developed and applied to the Colorado State University library loan desk. The cost effectiveness of the existing semi-automated circulation system was compared with that of a fully manual one, based on the existing manual subsystem. Faculty users' time and computer operating costs were…

  17. Institutional Effectiveness Analysis and Student Goal Attainment in the Community College.

    ERIC Educational Resources Information Center

    Meyer, Marilyn Wertheimer

    In an effort to effect institutional change through an analysis of institutional effectiveness, California's Fresno City College (FCC) undertook a 3-year project to examine student success. In order to determine appropriate measures of and methodologies for improving student success, a Student Success Task was established, developing 13 core…

  18. The Public Library Effectiveness Study: Final Report.

    ERIC Educational Resources Information Center

    Childers, Thomas; Van House, Nancy A.

    This study investigated the construct of effectiveness as it applies to public libraries and developed a methodology that can be transferred to other types of libraries and organizations. The research team began by compiling a list of indicators that are commonly used to gauge library effectiveness within the areas of: (1) services access; (2)…

  19. Understanding Effective High Schools: Evidence for Personalization for Academic and Social Emotional Learning

    ERIC Educational Resources Information Center

    Rutledge, Stacey A.; Cohen-Vogel, Lora; Osborne-Lampkin, La'Tara; Roberts, Ronnie L.

    2015-01-01

    This article presents findings from a year-long multilevel comparative case study exploring the characteristics of effective urban high schools. We developed a comprehensive framework from the school effectiveness research that guided our data collection and analysis at the four high schools. Using value-added methodology, we identified two higher…

  20. 7T MRI subthalamic nucleus atlas for use with 3T MRI.

    PubMed

    Milchenko, Mikhail; Norris, Scott A; Poston, Kathleen; Campbell, Meghan C; Ushe, Mwiza; Perlmutter, Joel S; Snyder, Abraham Z

    2018-01-01

    Deep brain stimulation (DBS) of the subthalamic nucleus (STN) reduces motor symptoms in most patients with Parkinson disease (PD), yet may produce untoward effects. Investigation of DBS effects requires accurate localization of the STN, which can be difficult to identify on magnetic resonance images collected with clinically available 3T scanners. The goal of this study is to develop a high-quality STN atlas that can be applied to standard 3T images. We created a high-definition STN atlas derived from seven older participants imaged at 7T. This atlas was nonlinearly registered to a standard template representing 56 patients with PD imaged at 3T. This process required development of methodology for nonlinear multimodal image registration. We demonstrate mm-scale STN localization accuracy by comparison of our 3T atlas with a publicly available 7T atlas. We also demonstrate less agreement with an earlier histological atlas. STN localization error in the 56 patients imaged at 3T was less than 1 mm on average. Our methodology enables accurate STN localization in individuals imaged at 3T. The STN atlas and underlying 3T average template in MNI space are freely available to the research community. The image registration methodology developed in the course of this work may be generally applicable to other datasets.

  1. Natural Language Processing Methods and Systems for Biomedical Ontology Learning

    PubMed Central

    Liu, Kaihong; Hogan, William R.; Crowley, Rebecca S.

    2010-01-01

    While the biomedical informatics community widely acknowledges the utility of domain ontologies, there remain many barriers to their effective use. One important requirement of domain ontologies is that they must achieve a high degree of coverage of the domain concepts and concept relationships. However, the development of these ontologies is typically a manual, time-consuming, and often error-prone process. Limited resources result in missing concepts and relationships as well as difficulty in updating the ontology as knowledge changes. Methodologies developed in the fields of natural language processing, information extraction, information retrieval and machine learning provide techniques for automating the enrichment of an ontology from free-text documents. In this article, we review existing methodologies and developed systems, and discuss how existing methods can benefit the development of biomedical ontologies. PMID:20647054
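
    As a toy illustration of one enrichment step the review discusses, the sketch below mines frequent word bigrams from free text and flags those not already in an ontology as candidate concepts; real systems rely on far richer NLP (parsing, term recognition, relation learning), and all terms and documents here are invented.

    ```python
    # Minimal sketch of one ontology-enrichment step: mining candidate concept
    # terms from free text by frequency of word bigrams not already present in
    # the ontology. Illustrative only; terms and documents are invented.

    import re
    from collections import Counter

    ontology_terms = {"myocardial infarction", "diabetes mellitus"}
    corpus = [
        "Patients with atrial fibrillation and myocardial infarction were enrolled.",
        "Atrial fibrillation increases stroke risk in elderly patients.",
        "Chronic kidney disease often coexists with diabetes mellitus.",
        "Chronic kidney disease progression was monitored.",
    ]

    def bigrams(text):
        words = re.findall(r"[a-z]+", text.lower())
        return [" ".join(pair) for pair in zip(words, words[1:])]

    counts = Counter(b for doc in corpus for b in bigrams(doc))
    candidates = [(term, n) for term, n in counts.most_common()
                  if n >= 2 and term not in ontology_terms]
    print("candidate new concepts:", candidates)
    ```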

  2. Development of CAG Model for Developing Instructional Materials for Teaching Physical Science Concepts for Grade 8 Students.

    ERIC Educational Resources Information Center

    Hse, Shun-Yi

    1991-01-01

    The development of an instructional model based on a learning cycle including correlation, analysis, and generalization (CAG) is described. A module developed for heat and temperature was administered to test its effects by comparing its use with the same unit in the New Physical Science Curriculum (NPSC). The methodology, results, and discussion…

  3. Development and application of stir bar sorptive extraction with polyurethane foams for the determination of testosterone and methenolone in urine matrices.

    PubMed

    Sequeiros, R C P; Neng, N R; Portugal, F C M; Pinto, M L; Pires, J; Nogueira, J M F

    2011-04-01

    This work describes the development, validation, and application of a novel methodology for the determination of testosterone and methenolone in urine matrices by stir bar sorptive extraction using polyurethane foams [SBSE(PU)] followed by liquid desorption and high-performance liquid chromatography with diode array detection. The methodology was optimized in terms of extraction time, agitation speed, pH, ionic strength and organic modifier, as well as back-extraction solvent and desorption time. Under optimized experimental conditions, convenient accuracy was achieved, with average recoveries of 49.7 ± 8.6% for testosterone and 54.2 ± 4.7% for methenolone. Additionally, the methodology showed good precision (<9%), excellent linear dynamic ranges (>0.9963) and convenient detection limits (0.2-0.3 μg/L). When the efficiency obtained by SBSE(PU) is compared with that of the conventional polydimethylsiloxane phase [SBSE(PDMS)], yields up to four-fold higher are attained for the former under the same experimental conditions. The application of the proposed methodology to the analysis of testosterone and methenolone in urine matrices showed negligible matrix effects and good analytical performance.

  4. Inter-provider comparison of patient-reported outcomes: developing an adjustment to account for differences in patient case mix.

    PubMed

    Nuttall, David; Parkin, David; Devlin, Nancy

    2015-01-01

    This paper describes the development of a methodology for the case-mix adjustment of patient-reported outcome measures (PROMs) data permitting the comparison of outcomes between providers on a like-for-like basis. Statistical models that take account of provider-specific effects form the basis of the proposed case-mix adjustment methodology. Indirect standardisation provides a transparent means of case mix adjusting the PROMs data, which are updated on a monthly basis. Recently published PROMs data for patients undergoing unilateral knee replacement are used to estimate empirical models and to demonstrate the application of the proposed case-mix adjustment methodology in practice. The results are illustrative and are used to highlight a number of theoretical and empirical issues that warrant further exploration. For example, because of differences between PROMs instruments, case-mix adjustment methodologies may require instrument-specific approaches. A number of key assumptions are made in estimating the empirical models, which could be open to challenge. The covariates of post-operative health status could be expanded, and alternative econometric methods could be employed. © 2013 Crown copyright.
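
    A minimal sketch of the indirect standardisation step, assuming a hypothetical case-mix model and made-up patient data: the provider's observed outcome is compared with the outcome expected for its particular case mix, and the ratio is applied to the national average.

    ```python
    # Minimal sketch of indirect standardisation for case-mix adjustment of a
    # PROM score. The case-mix model, coefficients, and data are hypothetical;
    # the published methodology estimates provider effects from the full
    # national PROMs dataset.

    national_mean_gain = 15.0  # national average post-minus-pre score gain

    # Hypothetical national case-mix model: expected gain given pre-op score and age.
    def expected_gain(pre_op_score, age):
        return 20.0 - 0.10 * pre_op_score - 0.05 * age

    # One provider's patients: (pre-op score, age, observed gain).
    patients = [(30, 70, 12.0), (45, 65, 14.0), (25, 80, 10.0)]

    observed = sum(g for _, _, g in patients) / len(patients)
    expected = sum(expected_gain(s, a) for s, a, _ in patients) / len(patients)

    # Indirectly standardised (case-mix adjusted) gain for this provider.
    adjusted = national_mean_gain * observed / expected
    print(f"observed={observed:.1f}, expected={expected:.1f}, adjusted={adjusted:.1f}")
    ```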

  5. Multiobjective Optimization of Atmospheric Plasma Spray Process Parameters to Deposit Yttria-Stabilized Zirconia Coatings Using Response Surface Methodology

    NASA Astrophysics Data System (ADS)

    Ramachandran, C. S.; Balasubramanian, V.; Ananthapadmanabhan, P. V.

    2011-03-01

    Atmospheric plasma spraying is used extensively to make Thermal Barrier Coatings of 7-8% yttria-stabilized zirconia powders. The main problem faced in the manufacture of yttria-stabilized zirconia coatings by the atmospheric plasma spraying process is the selection of the optimum combination of input variables for achieving the required qualities of coating. This problem can be solved by the development of empirical relationships between the process parameters (input power, primary gas flow rate, stand-off distance, powder feed rate, and carrier gas flow rate) and the coating quality characteristics (deposition efficiency, tensile bond strength, lap shear bond strength, porosity, and hardness) through effective and strategic planning and the execution of experiments by response surface methodology. This article highlights the use of response surface methodology by designing a five-factor five-level central composite rotatable design matrix with full replication for planning, conduction, execution, and development of empirical relationships. Further, response surface methodology was used for the selection of optimum process parameters to achieve desired quality of yttria-stabilized zirconia coating deposits.
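
    The empirical relationships at the heart of response surface methodology are second-order polynomials (linear, interaction, and quadratic terms) fitted to the designed experiments. The sketch below fits such a model by ordinary least squares using two coded factors and invented deposition-efficiency data, rather than the study's five-factor central composite design.

    ```python
    # Minimal sketch of fitting a second-order response surface model by least
    # squares. Two coded factors and made-up response data, for brevity.

    import numpy as np

    # Coded factor settings (x1, x2) and a hypothetical measured response y.
    X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                  [0, 0], [0, 0], [-1.414, 0], [1.414, 0], [0, -1.414], [0, 1.414]])
    y = np.array([62, 70, 65, 78, 75, 76, 60, 74, 66, 72], dtype=float)

    # Design matrix for y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

    labels = ["b0", "b1", "b2", "b12", "b11", "b22"]
    print({k: round(v, 2) for k, v in zip(labels, coeffs)})
    ```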

  6. Archetype modeling methodology.

    PubMed

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. A Human-Centered Design Methodology to Enhance the Usability, Human Factors, and User Experience of Connected Health Systems: A Three-Phase Methodology

    PubMed Central

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid

    2017-01-01

    Background Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. Objective We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. Methods We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. Results We report a successful implementation of the methodology for the design and development of a system for detecting and predicting falls in older adults. We describe in detail what testing and evaluation activities we carried out to effectively test the system and overcome usability and human factors problems. Conclusions We feel this methodology can be applied to a wide variety of connected health devices and systems. We consider this a methodology that can be scaled to different-sized projects accordingly. PMID:28302594

  8. Space system operations and support cost analysis using Markov chains

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Dean, Edwin B.; Moore, Arlene A.; Fairbairn, Robert E.

    1990-01-01

    This paper evaluates the use of Markov chain process in probabilistic life cycle cost analysis and suggests further uses of the process as a design aid tool. A methodology is developed for estimating operations and support cost and expected life for reusable space transportation systems. Application of the methodology is demonstrated for the case of a hypothetical space transportation vehicle. A sensitivity analysis is carried out to explore the effects of uncertainty in key model inputs.
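
    A minimal sketch of the kind of Markov chain computation described: a transition matrix is applied once per mission cycle, and expected operations-and-support cost and expected life follow by accumulating over the evolving state distribution. The states, transition probabilities, and per-cycle costs are hypothetical.

    ```python
    # Minimal sketch of a Markov-chain operations-and-support cost model for a
    # reusable vehicle. States, probabilities, and costs are hypothetical.

    import numpy as np

    states = ["operational", "minor_repair", "major_overhaul", "retired"]
    # Transition probabilities applied once per mission cycle (rows sum to 1).
    P = np.array([
        [0.80, 0.15, 0.04, 0.01],
        [0.90, 0.05, 0.04, 0.01],
        [0.85, 0.05, 0.05, 0.05],
        [0.00, 0.00, 0.00, 1.00],   # retired is an absorbing state
    ])
    cost_per_cycle = np.array([1.0, 3.0, 10.0, 0.0])  # $M while in each state

    dist = np.array([1.0, 0.0, 0.0, 0.0])  # start in the operational state
    expected_cost, expected_cycles = 0.0, 0.0
    for _ in range(200):                    # horizon of 200 mission cycles
        expected_cost += dist @ cost_per_cycle
        expected_cycles += 1.0 - dist[-1]   # cycles during which vehicle is not retired
        dist = dist @ P

    print(f"expected O&S cost: ${expected_cost:.1f}M, expected life: {expected_cycles:.0f} cycles")
    ```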

  9. PLANNING AND COORDINATION OF ACTIVITIES SUPPORTING THE RUSSIAN SYSTEM OF CONTROL AND ACCOUNTING OF NUCLEAR MATERIALS AT ROSATOM FACILITIES IN THE FRAMEWORK OF THE U.S.-RUSSIAN COOPERATION.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SVIRIDOVA, V.V.; ERASTOV, V.V.; ISAEV, N.V.

    2005-05-16

    The MC&A Equipment and Methodological Support Strategic Plan (MEMS SP) for implementing modern MC&A equipment and methodologies at Rosatom facilities has been developed within the framework of the U.S.-Russian MPC&A Program. This plan was developed by Rosatom's Russian MC&A Equipment and Methodologies (MEM) Working Group and is coordinated by that group, with support and coordination provided by the MC&A Measurements Project, Office of National Infrastructure and Sustainability, US DOE. Implementation of different tasks of the MEMS Strategic Plan is coordinated by Rosatom and US-DOE in cooperation with different U.S.-Russian MC&A-related working groups and joint site project teams. This cooperation makes it possible to obtain and analyze information about problems, current needs, and successes at Rosatom facilities, and it facilitates solving those problems, satisfying the facilities' needs, and effectively exchanging expertise and lessons learned. The objective of the MEMS Strategic Plan is to enhance the effectiveness of activities implementing modern equipment and methodologies in the Russian State MC&A system. These activities are conducted within the joint Russian-US MPC&A program aiming at reduction of the possibility for theft or diversion of nuclear materials and enhancement of control of nuclear materials.

  10. Model for the Effect of Fiber Bridging on the Fracture Resistance of Reinforced-Carbon-Carbon

    NASA Technical Reports Server (NTRS)

    Chan, Kwai S.; Lee, Yi-Der; Hudak, Stephen J., Jr.

    2009-01-01

    A micromechanical methodology has been developed for analyzing fiber bridging and resistance-curve behavior in reinforced-carbon-carbon (RCC) panels with a three-dimensional (3D) composite architecture and a silicon carbide (SiC) surface coating. The methodology involves treating fiber bridging traction on the crack surfaces in terms of a weight function approach and a bridging law that relates the bridging stress to the crack opening displacement. A procedure has been developed to deduce material constants in the bridging law from the linear portion of the K-resistance curve. This report contains information on the application of procedures and outcomes.
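
    The report's exact equations are not reproduced in the abstract; purely for orientation, the expressions below show a commonly used generic form of a power-law bridging law, the weight-function integral for the bridging contribution to the stress intensity factor, and the resulting resistance curve.

    ```latex
    % Illustrative generic forms (not necessarily the report's exact equations).
    \[
      \sigma_{b}(\delta) = \sigma_{0}\left(\frac{\delta}{\delta_{0}}\right)^{n},
      \qquad
      K_{b} = \int_{0}^{a} h(x,a)\,\sigma_{b}\bigl(\delta(x)\bigr)\,dx,
      \qquad
      K_{R} = K_{0} + K_{b},
    \]
    % where \sigma_b is the bridging traction, \delta the crack opening
    % displacement, h(x,a) the weight function for a crack of length a,
    % K_0 the intrinsic (unbridged) toughness, and K_R the measured
    % crack-growth resistance.
    ```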

  11. Early Implementation of QbD in Biopharmaceutical Development: A Practical Example

    PubMed Central

    Zurdo, Jesús; Arnell, Andreas; Obrezanova, Olga; Smith, Noel; Gómez de la Cuesta, Ramón; Gallagher, Thomas R. A.; Michael, Rebecca; Stallwood, Yvette; Ekblad, Caroline; Abrahmsén, Lars; Höidén-Guthenberg, Ingmarie

    2015-01-01

    In drug development, the “onus” of the low R&D efficiency has been put traditionally onto the drug discovery process (i.e., finding the right target or “binding” functionality). Here, we show that manufacturing is not only a central component of product success, but also that, by integrating manufacturing and discovery activities in a “holistic” interpretation of QbD methodologies, we could expect to increase the efficiency of the drug discovery process as a whole. In this new context, early risk assessment, using developability methodologies and computational methods in particular, can assist in reducing risks during development in a cost-effective way. We define specific areas of risk and how they can impact product quality in a broad sense, including essential aspects such as product efficacy and patient safety. Emerging industry practices around developability are introduced, including some specific examples of applications to biotherapeutics. Furthermore, we suggest some potential workflows to illustrate how developability strategies can be introduced in practical terms during early drug development in order to mitigate risks, reduce drug attrition and ultimately increase the robustness of the biopharmaceutical supply chain. Finally, we also discuss how the implementation of such methodologies could accelerate the access of new therapeutic treatments to patients in the clinic. PMID:26075248

  12. Characteristics of Effective Leadership Networks

    ERIC Educational Resources Information Center

    Leithwood, Kenneth; Azah, Vera Ndifor

    2016-01-01

    Purpose: The purpose of this paper is to inquire about the characteristics of effective school leadership networks and the contribution of such networks to the development of individual leaders' professional capacities. Design/methodology/approach: The study used path-analytic techniques with survey data provided by 450 school and district leaders…

  13. Focal Event, Contextualization, and Effective Communication in the Mathematics Classroom

    ERIC Educational Resources Information Center

    Nilsson, Per; Ryve, Andreas

    2010-01-01

    The aim of this article is to develop analytical tools for studying mathematical communication in collaborative activities. The theoretical construct of contextualization is elaborated methodologically in order to study diversity in individual thinking in relation to effective communication. The construct of contextualization highlights issues of…

  14. Experiential Learning in Education for Sustainable Development: Experiences from a Czech-Kazakh Social Learning Programme

    ERIC Educational Resources Information Center

    Cincera, Jan

    2013-01-01

    The article presents experience from a joint Czech-Kazakh project based on experiential education. The goal of the project was to develop trust and cooperation between various stakeholders to promote effective public participation in local sustainable development issues in Kazakhstan. The article describes the methodology of the programme and its…

  15. Development of a Peer-Assisted Learning Strategy in Computer-Supported Collaborative Learning Environments for Elementary School Students

    ERIC Educational Resources Information Center

    Tsuei, Mengping

    2011-01-01

    This study explores the effects of Electronic Peer-Assisted Learning for Kids (EPK), on the quality and development of reading skills, peer interaction and self-concept in elementary students. The EPK methodology uses a well-developed, synchronous computer-supported, collaborative learning system to facilitate students' learning in Chinese. We…

  16. Design and Implementation Issues in Surveying the Views of Young Children in Ethnolinguistically Diverse Developing Country Contexts

    ERIC Educational Resources Information Center

    Smith, Hilary A.; Haslett, Stephen J.

    2016-01-01

    This paper discusses issues in the development of a methodology appropriate for eliciting sound quantitative data from primary school children in the complex contexts of ethnolinguistically diverse developing countries. Although these issues often occur in field-based surveys, the large extent and compound effects of their occurrence in…

  17. Protegiendo Nuestra Comunidad: empowerment participatory education for HIV prevention.

    PubMed

    McQuiston, C; Choi-Hevel, S; Clawson, M

    2001-10-01

    To be effective, HIV/AIDS interventions must be culturally and linguistically appropriate and must occur within the context of the specific community in which they are delivered. In this article, the development of a culture-specific lay health advisor (LHA) program, Protegiendo Nuestra Comunidad, for recently immigrated Mexicans is described. This program is one component of a collaborative inquiry research project involving community participants and researchers working as partners in carrying out and assessing a program for the prevention of HIV/AIDS. The collaborative inquiry process was applied as an empowerment philosophy and methodology of Paulo Freire and an ecological framework was used for the development of Protegiendo Nuestra Comunidad. The use of principles of empowerment for curriculum development, teaching methodology, and program delivery are described.

  18. Methodology for determining the investment attractiveness of construction of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Nezhnikova, Ekaterina; Kashirin, Valentin; Davydova, Yana; Kazakova, Svetlana

    2018-03-01

    The article presents an analysis of the existing methods for assessing the investment attractiveness of high-rise construction. The authors determined and justified the primary choice of objects and territories that are the most attractive for the development of high-rise construction. A system of risk indicators has been developed that allows a quantitative adjustment to be made for a particular project when evaluating the efficiency of investment projects. The study is aimed at developing basic methodological concepts for a comparative evaluation of the prospects of constructing high-rise facilities, concepts that take into consideration the features of investment in construction and enable a quantitative evaluation of investment effectiveness in high-rise construction.

  19. Facilitating Lecturer Development and Student Learning through Action Research

    ERIC Educational Resources Information Center

    van der Westhuizen, C. N.

    2008-01-01

    The aim of the action research project is to improve my own practice as research methodology lecturer to facilitate effective student learning to enable students to become reflective practitioners with responsibility for their own professional development through action research in their own classrooms, and to motivate the students and increase…

  20. A Scenario Approach to Assessment of New Communications Media.

    ERIC Educational Resources Information Center

    Spangler, Kathleen; And Others

    In a study supported by the Charles F. Kettering Foundation, a research team developed a methodology for illustrating the effective and ineffective uses of audio, video, and computer teleconferencing by developing scenarios for each medium. The group first invented a general situation--a conference involving participants with global, regional, and…

  1. Developing a Biostatistical Collaboration Course in a Health Science Research Methodology Program

    ERIC Educational Resources Information Center

    Thabane, Lehana; Walter, Stephen D.; Hanna, Steven; Goldsmith, Charles H.; Pullenayegum, Eleanor

    2008-01-01

    Effective statistical collaboration in a multidisciplinary health research environment requires skills not taught in the usual statistics courses. Graduates often learn such collaborative skills through trial and error. In this paper, we discuss the development of a biostatistical collaboration course aimed at graduate students in a Health…

  2. Auto Mechanics; Methodology. Technical Instruction Manual.

    ERIC Educational Resources Information Center

    Systems Operation Support, Inc., King of Prussia, PA.

    This student instruction manual was written in conformance with selected criteria for programed instruction books as developed previously for various military training courses. The manual was developed as a part of "A Study of the Effectiveness of a Military-Type Computer-Based Instructional System When Used in Civilian High School Courses in…

  3. Development of Teaching Materials for Field Identification of Plants and Analysis of Their Effectiveness in Science Education.

    ERIC Educational Resources Information Center

    Ohkawa, Chizuru

    2000-01-01

    Introduces teaching materials developed for field identification of plants with synoptical keys, identification tables, cards, and programs. Selects approximately 2000 seed plants and uses visibly identifiable characteristics for classification. Recommends using the methodology of identification in other areas for biological identification. (YDS)

  4. University-Industry Linkages in Developing Countries: Perceived Effect on Innovation

    ERIC Educational Resources Information Center

    Vaaland, Terje I.; Ishengoma, Esther

    2016-01-01

    Purpose: The purpose of this paper is to assess the perceptions of both universities and the resource-extractive companies on the influence of university-industry linkages (UILs) on innovation in a developing country. Design/Methodology/Approach: A total of 404 respondents were interviewed. Descriptive analysis and multinomial logistic regression…

  5. Research in Special Education: Scientific Methods and Evidence-Based Practices

    ERIC Educational Resources Information Center

    Odom, Samuel L.; Brantlinger, Ellen; Gersten, Russell; Horner, Robert H.; Thompson, Bruce; Harris, Karen R.

    2005-01-01

    This article sets the context for the development of research quality indicators and guidelines for evidence of effective practices provided by different methodologies. The current conceptualization of scientific research in education and the complexity of conducting research in special education settings underlie the development of quality…

  6. Enhancing Young Graduates' Intention towards Entrepreneurship Development in Malaysia

    ERIC Educational Resources Information Center

    Mohamed, Zainalabidin; Rezai, Golnaz; Shamsudin, Mad Nasir; Mahmud, Muhammad Mu'az

    2012-01-01

    Purpose: The purpose of this paper is to evaluate the effectiveness of the Basic Student Entrepreneurial Programme (BSEP) among local university graduates who have undergone the training programme in entrepreneurship development. Design/methodology/approach: In total, 410 respondents who had participated in BSEP were interviewed with a structural…

  7. A Low Cost Course Information Syndication System

    ERIC Educational Resources Information Center

    Ajayi, A. O.; Olajubu, E. A.; Bello, S. A.; Soriyan, H. A.; Obamuyide, A. V.

    2011-01-01

    This study presents a cost effective, reliable, and convenient mobile web-based system to facilitate the dissemination of course information to students, to support interaction that goes beyond the classroom. The system employed the Really Simple Syndication (RSS) technology and was developed using Rapid Application Development (RAD) methodology.…

  8. When is good, good enough? Methodological pragmatism for sustainable guideline development.

    PubMed

    Browman, George P; Somerfield, Mark R; Lyman, Gary H; Brouwers, Melissa C

    2015-03-06

    Continuous escalation in methodological and procedural rigor for evidence-based processes in guideline development is associated with increasing costs and production delays that threaten sustainability. While health research methodologists are appropriately responsible for promoting increasing rigor in guideline development, guideline sponsors are responsible for funding such processes. This paper acknowledges that other stakeholders in addition to methodologists should be more involved in negotiating trade-offs between methodological procedures and efficiency in guideline production to produce guidelines that are 'good enough' to be trustworthy and affordable under specific circumstances. The argument for reasonable methodological compromise to meet practical circumstances is consistent with current implicit methodological practice. This paper proposes a conceptual tool as a framework to be used by different stakeholders in negotiating, and explicitly reporting, reasonable compromises for trustworthy as well as cost-worthy guidelines. The framework helps fill a transparency gap in how methodological choices in guideline development are made. The principle, 'when good is good enough' can serve as a basis for this approach. The conceptual tool 'Efficiency-Validity Methodological Continuum' acknowledges trade-offs between validity and efficiency in evidence-based guideline development and allows for negotiation, guided by methodologists, of reasonable methodological compromises among stakeholders. Collaboration among guideline stakeholders in the development process is necessary if evidence-based guideline development is to be sustainable.

  9. A real options approach to biotechnology investment policy-the case of developing a Campylobacter vaccine to poultry.

    PubMed

    Lund, Mogens; Jensen, Jørgen Dejgård

    2016-06-01

    The aim of the article is to identify and analyse public-private incentives for the development and marketing of new animal vaccines within a real options methodological framework, and to investigate how real options methodology can be utilized to support economic incentives for vaccine development in a cost-effective way. The development of a vaccine against Campylobacter jejuni in poultry is applied as a case study. Employing the real options methodology, the net present value of the vaccine R&D project becomes larger than a purely probabilistic expected present value throughout the different stages of the project, and the net present value increases further when more types of real options are taken into consideration. The insight from the real options analysis reveals opportunities for new policies to promote the development of animal vaccines. One such approach might be to develop schemes combining stage-by-stage optimized subsidies in the individual development stages, with proper account taken of investors'/developers' economic incentives to proceed, sell or cancel the project in the respective stages. Another way of using the real options approach to support the development of desirable animal vaccines could be to issue put options for the vaccine candidate, enabling vaccine developers to hedge against the economic risk from market volatility. Copyright © 2016 Elsevier B.V. All rights reserved.
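
    The sketch below is a simple decision-tree analogue of the staged real options valuation described, with an option to abandon at each development stage; all cash flows, probabilities, and the discount factor are hypothetical.

    ```python
    # Minimal sketch of a real-options style valuation of a staged vaccine R&D
    # project with the option to abandon at each stage. All cash flows,
    # probabilities, and the discount factor are hypothetical.

    DISCOUNT = 0.9  # one-period discount factor

    # Terminal market payoffs (NPV of commercialization) and their probabilities.
    terminal = [(400.0, 0.3), (120.0, 0.4), (10.0, 0.3)]

    STAGE2_COST = 60.0    # cost to run the final development stage
    STAGE1_COST = 20.0    # cost to run the first development stage
    P_TECH_SUCCESS = 0.7  # probability stage 1 is technically successful

    # Value at the stage-2 decision: invest only if continuing beats abandoning (0).
    expected_market = DISCOUNT * sum(v * p for v, p in terminal)
    stage2_value = max(expected_market - STAGE2_COST, 0.0)

    # Value at the stage-1 decision, accounting for technical risk and the
    # option to abandon after stage 1.
    stage1_value = max(DISCOUNT * P_TECH_SUCCESS * stage2_value - STAGE1_COST, 0.0)

    print(f"option value of the staged project: {stage1_value:.1f}")
    ```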

  10. A probabilistic assessment of health risks associated with short-term exposure to tropospheric ozone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitfield, R.G; Biller, W.F.; Jusko, M.J.

    1996-06-01

    The work described in this report is part of a larger risk assessment sponsored by the U.S. Environmental Protection Agency. Earlier efforts developed exposure-response relationships for acute health effects among populations engaged in heavy exertion. Those efforts also developed a probabilistic national ambient air quality standards exposure model and a general methodology for integrating probabilistic exposure-response relationships and exposure estimates to calculate overall risk results. Recently published data make it possible to model additional health endpoints (for exposure at moderate exertion), including hospital admissions. New air quality and exposure estimates for alternative national ambient air quality standards for ozone are combined with exposure-response models to produce the risk results for hospital admissions and acute health effects. Sample results explain the methodology and introduce risk output formats.
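
    A minimal sketch of the risk-integration step, assuming a made-up population exposure distribution and exposure-response function: expected cases are obtained by summing, over exposure levels, the probability of exposure times the response rate at that level, scaled by the population.

    ```python
    # Minimal sketch of integrating an exposure distribution with an
    # exposure-response relationship to estimate expected cases. The exposure
    # bins, probabilities, response model, and population size are hypothetical.

    population = 1_000_000  # people engaged in moderate exertion

    # P(daily max 8-h ozone exposure falls in each bin), keyed by ppm midpoint.
    exposure_dist = {0.06: 0.70, 0.08: 0.20, 0.10: 0.07, 0.12: 0.03}

    def response_rate(ozone_ppm):
        """Hypothetical exposure-response: probability of an acute respiratory
        symptom per exposed person-day (zero at or below 0.06 ppm)."""
        return max(0.0, 0.5 * (ozone_ppm - 0.06))

    expected_cases = population * sum(p * response_rate(c) for c, p in exposure_dist.items())
    print(f"expected symptomatic person-days: {expected_cases:,.0f}")
    ```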

  11. PRA (Probabilistic Risk Assessments) Participation versus Validation

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Banke, Richard

    2013-01-01

    Probabilistic Risk Assessments (PRAs) are performed for projects or programs where the consequences of failure are highly undesirable. PRAs primarily address the level of risk those projects or programs pose during operations. PRAs are often developed after the design has been completed. Design and operational details used to develop models include approved and accepted design information regarding equipment, components, systems and failure data. This methodology basically validates the risk parameters of the project or system design. For high risk or high dollar projects, using PRA methodologies during the design process provides new opportunities to influence the design early in the project life cycle to identify, eliminate or mitigate potential risks. Identifying risk drivers before the design has been set allows the design engineers to understand the inherent risk of their current design and consider potential risk mitigation changes. This can become an iterative process in which the PRA model is used to determine whether a mitigation technique is effective in reducing risk, which can result in more efficient and cost-effective design changes. PRA methodology can be used to assess the risk of design alternatives and can demonstrate how major design changes or program modifications impact the overall program or project risk. PRA has been used for the last two decades to validate risk predictions and acceptability. By providing risk information that can positively influence final system and equipment design, the PRA tool can also play a role in design development, supporting a safe and cost-effective product.

  12. Navigating the grounded theory terrain. Part 2.

    PubMed

    Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John

    2011-01-01

    In this paper, the choice of classic grounded theory will be discussed and justified in the context of the first author's PhD research. The methodological discussion takes place within the context of PhD research entitled: Development of a stakeholder-led framework for a structured education programme that will prepare nurses and healthcare assistants to deliver a psychosocial intervention for people with dementia. There is a lack of research and limited understanding of the effect of psychosocial interventions on people with dementia. The first author thought classic grounded theory a suitable research methodology to investigate as it is held to be ideal for areas of research where there is little understanding of the social processes at work. The literature relating to the practical application of classic grounded theory is illustrated using examples relating to four key grounded theory components: theory development using constant comparison and memoing; methodological rigour; emergence of a core category; and inclusion of self and engagement with participants. Following discussion of the choice and application of classic grounded theory, this paper explores the need for researchers to visit and understand the various grounded theory options. This paper argues that researchers new to grounded theory must be familiar with and understand the various options. The researchers will then be able to apply the methodologies they choose consistently and critically. Doing so will allow them to develop theory rigorously and they will ultimately be able to better defend their final methodological destinations.

  13. Simulation of Attacks for Security in Wireless Sensor Network.

    PubMed

    Diaz, Alvaro; Sanchez, Pablo

    2016-11-18

    The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node's software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work.
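
    As a toy illustration of the kind of estimate such a simulator can provide, the fragment below compares a forwarding node's energy use with and without a packet-flooding attacker; the energy figures and attack model are assumptions, not the virtual platform described in the paper.

    ```python
    # Toy illustration of attack impact on node power consumption: a forwarding
    # node's energy use with and without a packet-flooding attacker. Energy
    # figures and the attack model are assumed values, for illustration only.

    ENERGY_RX = 0.000050   # joules per received packet (assumed)
    ENERGY_TX = 0.000060   # joules per transmitted packet (assumed)
    ENERGY_IDLE = 0.0005   # joules per second of idle listening (assumed)

    def node_energy(duration_s, legit_pkts_per_s, attack_pkts_per_s=0.0):
        """Energy consumed by a node that receives and forwards all packets."""
        rx = (legit_pkts_per_s + attack_pkts_per_s) * duration_s
        tx = rx  # the forwarding node retransmits everything it receives
        return rx * ENERGY_RX + tx * ENERGY_TX + duration_s * ENERGY_IDLE

    baseline = node_energy(3600, legit_pkts_per_s=2)
    attacked = node_energy(3600, legit_pkts_per_s=2, attack_pkts_per_s=50)
    print(f"baseline: {baseline:.2f} J, under flooding attack: {attacked:.2f} J "
          f"({attacked / baseline:.1f}x)")
    ```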

  14. Cyclic Load Effects on Long Term Behavior of Polymer Matrix Composites

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Chamis, C. C.

    1996-01-01

    A methodology has been developed and demonstrated to compute the fatigue life for different ratios, r, of applied stress to the laminate strength (based on first-ply-failure criteria), combined with thermal cyclic loads. Degradation effects resulting from long term environmental exposure and thermo-mechanical cyclic loads are considered in the simulation process. A unified time-stress dependent multi-factor interaction equation model developed at NASA Lewis Research Center has been used to account for the degradation of material properties caused by cyclic and aging loads. Effect of variation in the thermal cyclic load amplitude on a quasi-symmetric graphite/epoxy laminate has been studied with respect to the impending failure modes. The results show that, for the laminate under consideration, the fatigue life under combined mechanical and low thermal amplitude cyclic loads is higher than that due to mechanical loads only. However, as the thermal amplitude increases, the life decreases. The failure mode changes from tensile under mechanical loads only to compressive and shear at high mechanical and thermal loads. Also, implementation of the developed methodology in the design process has been discussed.
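
    The abstract cites the time-stress dependent multi-factor interaction equation developed at NASA Lewis; its commonly published general form is reproduced below for orientation only, with effect variables and exponents calibrated per material and load history.

    ```latex
    % General form of the multi-factor interaction equation (as commonly published).
    \[
      \frac{P}{P_{0}} \;=\; \prod_{i=1}^{N}
      \left(\frac{A_{iF}-A_{i}}{A_{iF}-A_{i0}}\right)^{n_{i}}
    \]
    % P is the current value of a material property and P_0 its reference value;
    % A_i is the current value of the i-th effect variable (temperature, stress,
    % number of mechanical or thermal cycles, time), A_{i0} its reference value,
    % A_{iF} its final (failure) value, and n_i an empirical exponent.
    ```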

  15. Development of a methodology for strategic environmental assessment: application to the assessment of golf course installation policy in Taiwan.

    PubMed

    Chen, Ching-Ho; Wu, Ray-Shyan; Liu, Wei-Lin; Su, Wen-Ray; Chang, Yu-Min

    2009-01-01

    Some countries, including Taiwan, have adopted strategic environmental assessment (SEA) to assess and modify proposed policies, plans, and programs (PPPs) in the planning phase for pursuing sustainable development. However, there were only some sketchy steps focusing on policy assessment in the system of Taiwan. This study aims to develop a methodology for SEA in Taiwan to enhance the effectiveness associated with PPPs. The proposed methodology comprises an SEA procedure involving PPP management and assessment in various phases, a sustainable assessment framework, and an SEA management system. The SEA procedure is devised based on the theoretical considerations by systems thinking and the regulative requirements in Taiwan. The positive and negative impacts on ecology, society, and economy are simultaneously considered in the planning (including policy generation and evaluation), implementation, and control phases of the procedure. This study used the analytic hierarchy process, Delphi technique, and systems analysis to develop a sustainable assessment framework. An SEA management system was built based on geographic information system software to process spatial, attribute, and satellite image data during the assessment procedure. The proposed methodology was applied in the SEA of golf course installation policy in 2001 as a case study, which was the first SEA in Taiwan. Most of the 82 existing golf courses in 2001 were installed on slope lands and caused a serious ecological impact. Assessment results indicated that 15 future golf courses installed on marginal lands (including buffer zones, remedied lands, and wastelands) were acceptable because the comprehensive environmental (ecological, social, and economic) assessment value was better based on environmental characteristics and management regulations of Taiwan. The SEA procedure in the planning phase for this policy was completed but the implementation phase of this policy was not begun because the related legislation procedure could not be arranged due to a few senators' resistance. A self-review of the control phase was carried out in 2006 using this methodology. Installation permits for 12 courses on slope lands were terminated after 2001 and then 27 future courses could be installed on marginal lands. The assessment value of this policy using the data on ecological, social, and economic conditions from 2006 was higher than that using the data from 2001. The analytical results illustrate that the proposed methodology can be used to effectively and efficiently assist the related authorities for SEA.
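
    One building block of the sustainable assessment framework is the analytic hierarchy process. The sketch below derives criterion weights from a pairwise comparison matrix via its principal eigenvector and checks consistency; the 3x3 comparison values are hypothetical.

    ```python
    # Minimal sketch of the AHP step: derive criterion weights from a pairwise
    # comparison matrix via its principal eigenvector. Comparison values are
    # hypothetical, not those elicited in the study.

    import numpy as np

    # Pairwise comparisons of ecological, social, and economic criteria (Saaty scale).
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    # Consistency ratio (random index RI = 0.58 for a 3x3 matrix).
    ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
    cr = ci / 0.58
    print("weights (ecological, social, economic):", np.round(weights, 3))
    print(f"consistency ratio: {cr:.3f} (acceptable if < 0.10)")
    ```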

  16. Applying GRADE-CERQual to qualitative evidence synthesis findings-paper 3: how to assess methodological limitations.

    PubMed

    Munthe-Kaas, Heather; Bohren, Meghan A; Glenton, Claire; Lewin, Simon; Noyes, Jane; Tunçalp, Özge; Booth, Andrew; Garside, Ruth; Colvin, Christopher J; Wainwright, Megan; Rashidian, Arash; Flottorp, Signe; Carlsen, Benedicte

    2018-01-25

    The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Working Group. The approach has been developed to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) coherence, (3) adequacy of data and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual and focuses on CERQual's methodological limitations component. We developed the methodological limitations component by searching the literature for definitions, gathering feedback from relevant research communities and developing consensus through project group meetings. We tested the CERQual methodological limitations component within several qualitative evidence syntheses before agreeing on the current definition and principles for application. When applying CERQual, we define methodological limitations as the extent to which there are concerns about the design or conduct of the primary studies that contributed evidence to an individual review finding. In this paper, we describe the methodological limitations component and its rationale and offer guidance on how to assess methodological limitations of a review finding as part of the CERQual approach. This guidance outlines the information required to assess methodological limitations component, the steps that need to be taken to assess methodological limitations of data contributing to a review finding and examples of methodological limitation assessments. This paper provides guidance for review authors and others on undertaking an assessment of methodological limitations in the context of the CERQual approach. More work is needed to determine which criteria critical appraisal tools should include when assessing methodological limitations. We currently recommend that whichever tool is used, review authors provide a transparent description of their assessments of methodological limitations in a review finding. We expect the CERQual approach and its individual components to develop further as our experiences with the practical implementation of the approach increase.

  17. Emerging Concepts and Methodologies in Cancer Biomarker Discovery.

    PubMed

    Lu, Meixia; Zhang, Jinxiang; Zhang, Lanjing

    2017-01-01

    Cancer biomarker discovery is a critical part of cancer prevention and treatment. Despite decades of effort, only a small number of cancer biomarkers have been identified for and validated in clinical settings. Conceptual and methodological breakthroughs may help accelerate the discovery of additional cancer biomarkers, particularly their use for diagnostics. In this review, we have attempted to review the emerging concepts in cancer biomarker discovery, including real-world evidence, open access data, and data paucity in rare or uncommon cancers. We have also summarized the recent methodological progress in cancer biomarker discovery, such as high-throughput sequencing, liquid biopsy, big data, artificial intelligence (AI), and deep learning and neural networks. Much attention has been given to the methodological details and comparison of the methodologies. Notably, these concepts and methodologies interact with each other and will likely lead to synergistic effects when carefully combined. Newer, more innovative concepts and methodologies emerge as the current ones become mainstream and widely applied in the field. Some future challenges are also discussed. This review contributes to the development of future theoretical frameworks and technologies in cancer biomarker discovery and will contribute to the discovery of more useful cancer biomarkers.

  18. Multiphysics Analysis of a Solid-Core Nuclear Thermal Engine Thrust Chamber

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Canabal, Francisco; Cheng, Gary; Chen, Yen-Sen

    2006-01-01

    The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for a hypothetical solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics methodology. Formulations for heat transfer in solids and porous media were implemented and anchored. A two-pronged approach was employed in this effort: A detailed thermo-fluid analysis on a multi-channel flow element for mid-section corrosion investigation; and a global modeling of the thrust chamber to understand the effect of hydrogen dissociation and recombination on heat transfer and thrust performance. The formulations and preliminary results on both aspects are presented.

  19. A Quantitative, Non-Destructive Methodology for Habitat Characterisation and Benthic Monitoring at Offshore Renewable Energy Developments

    PubMed Central

    Sheehan, Emma V.; Stevens, Timothy F.; Attrill, Martin J.

    2010-01-01

    Following governments' policies to tackle global climate change, the development of offshore renewable energy sites is likely to increase substantially over coming years. All such developments interact with the seabed to some degree and so a key need exists for suitable methodology to monitor the impacts of large-scale Marine Renewable Energy Installations (MREIs). Many of these will be situated on mixed or rocky substrata, where conventional methods to characterise the habitat are unsuitable. Traditional destructive sampling is also inappropriate in conservation terms, particularly as safety zones around MREIs could function as Marine Protected Areas, with positive benefits for biodiversity. Here we describe a technique developed to effectively monitor the impact of MREIs and report the results of its field testing, enabling large areas to be surveyed accurately and cost-effectively. The methodology is based on a high-definition video camera, plus LED lights and laser scale markers, mounted on a “flying array” that maintains itself above the seabed grounded by a length of chain, thus causing minimal damage. Samples are taken by slow-speed tows of the gear behind a boat (200 m transects). The HD video and randomly selected frame grabs are analysed to quantify species distribution. The equipment was tested over two years in Lyme Bay, UK (25 m depth), then subsequently successfully deployed in demanding conditions at the deep (>50 m) high-energy Wave Hub site off Cornwall, UK, and a potential tidal stream energy site in Guernsey, Channel Islands (1.5 m s⁻¹ current), the first time remote samples from such a habitat have been achieved. The next stage in the monitoring development process is described, involving the use of Remote Operated Vehicles to survey the seabed post-deployment of MREI devices. The complete methodology provides the first quantitative, relatively non-destructive method for monitoring mixed-substrate benthic communities beneath MPAs and MREIs pre- and post-device deployment. PMID:21206748
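
    The quantification step, turning laser-scaled frame grabs into densities, can be sketched as follows. The laser-dot spacing, image dimensions and counts are hypothetical values for illustration, not figures from the Lyme Bay or Wave Hub surveys.

    ```python
    # Minimal sketch of converting frame-grab counts into seabed densities; the
    # laser-dot spacing, pixel measurements and counts are hypothetical placeholders.
    LASER_SPACING_M = 0.1          # assumed physical distance between laser dots

    def frame_area_m2(laser_px, frame_w_px, frame_h_px):
        """Estimate the seabed area covered by one frame from the laser scale."""
        m_per_px = LASER_SPACING_M / laser_px
        return (frame_w_px * m_per_px) * (frame_h_px * m_per_px)

    def density_per_m2(counts, areas):
        """Pooled density over the randomly selected frames of one 200 m transect."""
        return sum(counts) / sum(areas)

    areas = [frame_area_m2(210, 1920, 1080), frame_area_m2(190, 1920, 1080)]
    counts = [7, 4]                # e.g. individuals of one sessile taxon per frame
    print(f"{density_per_m2(counts, areas):.2f} individuals per m^2")
    ```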

  20. Quantitative evaluation of geodiversity: development of methodological procedures with application to territorial management

    NASA Astrophysics Data System (ADS)

    Forte, J.; Brilha, J.; Pereira, D.; Nolasco, M.

    2012-04-01

    Although geodiversity is considered the setting for biodiversity, there is still a huge gap in the social recognition of these two concepts. The concept of geodiversity, less developed, is now making its own way as a robust and fundamental idea concerning the abiotic component of nature. From a conservationist point of view, the lack of a broader knowledge concerning the type and spatial variation of geodiversity, as well as its relationship with biodiversity, makes the protection and management of natural or semi-natural areas incomplete. There is a growing need to understand the patterns of geodiversity in different landscapes and to translate this knowledge into practical and effective territorial management. This kind of management can also represent an important tool for the development of sustainable tourism, particularly geotourism, which can bring benefits not only for the environment, but also for social and economic purposes. The quantification of geodiversity is an important step in this process, but few researchers have so far invested in the development of a proper methodology. The assessment methodologies published so far are mainly focused on the evaluation of geomorphological elements, sometimes complemented with information about lithology, soils, hydrology, morphometric variables, climatic surfaces and geosites. This results in very dissimilar areas at very different spatial scales, showing the complexity of the task and the need for further research. The current work aims to develop an effective methodology, based on GIS routines, for assessing as many elements of geodiversity as possible (rocks, minerals, fossils, landforms, soils). The main determinant factor for the quantitative assessment is scale, but other factors are also very important, such as the existence of suitable spatial data with a sufficient degree of detail. The aim is to establish procedures for assessing geodiversity at different scales and to produce maps showing the spatial representation of the geodiversity index, which could be an invaluable contribution to land-use management.
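
    One common way to operationalise such a GIS-based index is to count, per grid cell, the number of distinct classes of each abiotic layer within a moving window and then sum the layer richness values. The sketch below uses tiny synthetic rasters; a real implementation would read lithology, landform and soil maps at a chosen scale.

    ```python
    import numpy as np

    # Minimal sketch of a grid-based geodiversity index built from class rasters.
    # The layers below are tiny synthetic placeholders, not real map data.
    rng = np.random.default_rng(0)
    lithology = rng.integers(1, 4, size=(5, 5))   # class codes per cell
    landforms = rng.integers(1, 3, size=(5, 5))
    soils     = rng.integers(1, 5, size=(5, 5))

    def neighbourhood_richness(layer, radius=1):
        """Number of distinct classes in a (2*radius+1)^2 window around each cell."""
        rows, cols = layer.shape
        out = np.zeros_like(layer)
        for i in range(rows):
            for j in range(cols):
                win = layer[max(i - radius, 0):i + radius + 1,
                            max(j - radius, 0):j + radius + 1]
                out[i, j] = len(np.unique(win))
        return out

    # Sum the per-layer richness values to obtain a simple per-cell geodiversity index.
    geodiversity_index = sum(neighbourhood_richness(l) for l in (lithology, landforms, soils))
    print(geodiversity_index)
    ```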

  1. A quantitative, non-destructive methodology for habitat characterisation and benthic monitoring at offshore renewable energy developments.

    PubMed

    Sheehan, Emma V; Stevens, Timothy F; Attrill, Martin J

    2010-12-29

    Following governments' policies to tackle global climate change, the development of offshore renewable energy sites is likely to increase substantially over coming years. All such developments interact with the seabed to some degree and so a key need exists for suitable methodology to monitor the impacts of large-scale Marine Renewable Energy Installations (MREIs). Many of these will be situated on mixed or rocky substrata, where conventional methods to characterise the habitat are unsuitable. Traditional destructive sampling is also inappropriate in conservation terms, particularly as safety zones around (MREIs) could function as Marine Protected Areas, with positive benefits for biodiversity. Here we describe a technique developed to effectively monitor the impact of MREIs and report the results of its field testing, enabling large areas to be surveyed accurately and cost-effectively. The methodology is based on a high-definition video camera, plus LED lights and laser scale markers, mounted on a "flying array" that maintains itself above the seabed grounded by a length of chain, thus causing minimal damage. Samples are taken by slow-speed tows of the gear behind a boat (200 m transects). The HD video and randomly selected frame grabs are analysed to quantify species distribution. The equipment was tested over two years in Lyme Bay, UK (25 m depth), then subsequently successfully deployed in demanding conditions at the deep (>50 m) high-energy Wave Hub site off Cornwall, UK, and a potential tidal stream energy site in Guernsey, Channel Islands (1.5 ms⁻¹ current), the first time remote samples from such a habitat have been achieved. The next stage in the monitoring development process is described, involving the use of Remote Operated Vehicles to survey the seabed post-deployment of MREI devices. The complete methodology provides the first quantitative, relatively non-destructive method for monitoring mixed-substrate benthic communities beneath MPAs and MREIs pre- and post-device deployment.

  2. Benchmarking for the Effective Use of Student Evaluation Data

    ERIC Educational Resources Information Center

    Smithson, John; Birks, Melanie; Harrison, Glenn; Nair, Chenicheri Sid; Hitchins, Marnie

    2015-01-01

    Purpose: The purpose of this paper is to examine current approaches to interpretation of student evaluation data and present an innovative approach to developing benchmark targets for the effective and efficient use of these data. Design/Methodology/Approach: This article discusses traditional approaches to gathering and using student feedback…

  3. An Improvement in Instructional Quality: Can Evaluation of Teaching Effectiveness Make a Difference?

    ERIC Educational Resources Information Center

    Ngware, Moses Waithanji; Ndirangu, Mwangi

    2005-01-01

    Purpose: To report study findings on teaching effectiveness and feedback mechanisms in Kenyan universities, which can guide management in developing a comprehensive quality control policy. Design/methodology/approach: The study adopted an exploratory descriptive design. Three public and two private universities were randomly selected to…

  4. Scale development on consumer behavior toward counterfeit drugs in a developing country: a quantitative study exploiting the tools of an evolving paradigm

    PubMed Central

    2013-01-01

    Background: Although desperate need and drug counterfeiting are linked in developing countries, little research has been carried out to address this link, and there is a lack of proper tools and methodology. This study addresses the need for a new methodological approach by developing a scale to aid in understanding the demand side of drug counterfeiting in a developing country. Methods: The study presents a quantitative, non-representative survey conducted in Sudan. A face-to-face structured interview survey methodology was employed to collect the data from the general population (people in the street) in two phases: pilot (n = 100) and final survey (n = 1003). Data were analyzed by examining means, variances, squared multiple correlations, item-to-total correlations, and the results of an exploratory factor analysis and a confirmatory factor analysis. Results: As an approach to scale purification, internal consistency was examined and improved. The scale was reduced from 44 to 41 items and Cronbach’s alpha improved from 0.818 to 0.862. Finally, scale items were assessed. The result was an eleven-factor solution. Convergent and discriminant validity were demonstrated. Conclusion: The results of this study indicate that the “Consumer Behavior Toward Counterfeit Drugs Scale” is a valid, reliable measure with a solid theoretical base. Ultimately, the study offers public health policymakers a valid measurement tool and, consequently, a new methodological approach with which to build a better understanding of the demand side of counterfeit drugs and to develop more effective strategies to combat the problem. PMID:24020730
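
    The internal-consistency step of the scale purification can be sketched with Cronbach's alpha and corrected item-total correlations. The simulated Likert responses below are placeholders, not the Sudanese survey data; items with low corrected item-total correlations would be the candidates for removal, mirroring the reduction from 44 to 41 items.

    ```python
    import numpy as np

    # Minimal sketch: Cronbach's alpha and corrected item-total correlations on a
    # respondents-by-items matrix. The data are simulated placeholders.
    rng = np.random.default_rng(1)
    latent = rng.normal(size=(200, 1))                                  # common trait
    X = np.clip(np.rint(3 + latent + rng.normal(0, 0.8, (200, 10))), 1, 5)

    def cronbach_alpha(items):
        k = items.shape[1]
        return (k / (k - 1)) * (1 - items.var(axis=0, ddof=1).sum()
                                / items.sum(axis=1).var(ddof=1))

    def corrected_item_total(items):
        """Correlation of each item with the sum of the remaining items."""
        return np.array([
            np.corrcoef(items[:, j], np.delete(items, j, axis=1).sum(axis=1))[0, 1]
            for j in range(items.shape[1])
        ])

    print("alpha:", round(cronbach_alpha(X), 3))
    print("item-total r:", corrected_item_total(X).round(2))
    ```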

  5. Scale development on consumer behavior toward counterfeit drugs in a developing country: a quantitative study exploiting the tools of an evolving paradigm.

    PubMed

    Alfadl, Abubakr A; Ibrahim, Mohamed Izham b Mohamed; Hassali, Mohamed Azmi Ahmad

    2013-09-11

    Although desperate need and drug counterfeiting are linked in developing countries, little research has been carried out to address this link, and there is a lack of proper tools and methodology. This study addresses the need for a new methodological approach by developing a scale to aid in understanding the demand side of drug counterfeiting in a developing country. The study presents a quantitative, non-representative survey conducted in Sudan. A face-to-face structured interview survey methodology was employed to collect the data from the general population (people in the street) in two phases: pilot (n = 100) and final survey (n = 1003). Data were analyzed by examining means, variances, squared multiple correlations, item-to-total correlations, and the results of an exploratory factor analysis and a confirmatory factor analysis. As an approach to scale purification, internal consistency was examined and improved. The scale was reduced from 44 to 41 items and Cronbach's alpha improved from 0.818 to 0.862. Finally, scale items were assessed. The result was an eleven-factor solution. Convergent and discriminant validity were demonstrated. The results of this study indicate that the "Consumer Behavior Toward Counterfeit Drugs Scale" is a valid, reliable measure with a solid theoretical base. Ultimately, the study offers public health policymakers a valid measurement tool and, consequently, a new methodological approach with which to build a better understanding of the demand side of counterfeit drugs and to develop more effective strategies to combat the problem.

  6. Computational fluid dynamics combustion analysis evaluation

    NASA Technical Reports Server (NTRS)

    Kim, Y. M.; Shang, H. M.; Chen, C. P.; Ziebarth, J. P.

    1992-01-01

    This study involves the development of numerical modelling in spray combustion. These modelling efforts are mainly motivated by the need to improve the computational efficiency of the stochastic particle tracking method as well as to incorporate the physical submodels of turbulence, combustion, vaporization, and dense spray effects. The present mathematical formulation and numerical methodologies can be cast in any time-marching pressure-correction methodology (PCM), such as the FDNS and MAST codes. A sequence of validation cases involving steady burning sprays and transient evaporating sprays will be included.

  7. The ethics of placebo-controlled trials: methodological justifications.

    PubMed

    Millum, Joseph; Grady, Christine

    2013-11-01

    The use of placebo controls in clinical trials remains controversial. Ethical analysis and international ethical guidance permit the use of placebo controls in randomized trials when scientifically indicated in four cases: (1) when there is no proven effective treatment for the condition under study; (2) when withholding treatment poses negligible risks to participants; (3) when there are compelling methodological reasons for using placebo, and withholding treatment does not pose a risk of serious harm to participants; and, more controversially, (4) when there are compelling methodological reasons for using placebo, and the research is intended to develop interventions that can be implemented in the population from which trial participants are drawn, and the trial does not require participants to forgo treatment they would otherwise receive. The concept of methodological reasons is essential to assessing the ethics of placebo controls in these controversial last two cases. This article sets out key considerations relevant to considering whether methodological reasons for a placebo control are compelling. © 2013.

  8. Applications of cost-effectiveness methodologies in behavioral medicine.

    PubMed

    Kaplan, Robert M; Groessl, Erik J

    2002-06-01

    In 1996, the Panel on Cost-Effectiveness in Health and Medicine developed standards for cost-effectiveness analysis. The standards include the use of a societal perspective, that treatments be evaluated in comparison with the best available alternative (rather than with no care at all), and that health benefits be expressed in standardized units. Guidelines for cost accounting were also offered. Among 24,562 references on cost-effectiveness in Medline between 1995 and 2000, only a handful were relevant to behavioral medicine. Only 19 studies published between 1983 and 2000 met criteria for further evaluation. Among analyses that were reported, only 2 studies were found consistent with the Panel's criteria for high-quality analyses, although more recent studies were more likely to meet methodological standards. There are substantial opportunities to advance behavioral medicine by performing standardized cost-effectiveness analyses.
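
    The core calculation recommended by the Panel, comparison against the best available alternative with benefits in standardized units, reduces to an incremental cost-effectiveness ratio. The figures below are invented for illustration only.

    ```python
    # Minimal sketch of a cost-effectiveness comparison in the spirit of the 1996
    # Panel recommendations: compare with the best available alternative (not "no
    # care") and express benefit in standardized units (QALYs). Numbers are illustrative.
    def icer(cost_new, qaly_new, cost_alt, qaly_alt):
        """Incremental cost-effectiveness ratio, cost per QALY gained."""
        return (cost_new - cost_alt) / (qaly_new - qaly_alt)

    cost_new, qaly_new = 4200.0, 6.35     # behavioural intervention under study
    cost_alt, qaly_alt = 1800.0, 6.20     # best available alternative
    print(f"ICER = {icer(cost_new, qaly_new, cost_alt, qaly_alt):,.0f} per QALY gained")
    ```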

  9. Developing Army Leaders through Increased Rigor in Professional Military Training and Education

    DTIC Science & Technology

    2017-06-09

    leadership. Research Methodology An applied, exploratory, qualitative research methodology via a structured and focused case study comparison was...research methodology via a structured and focused case study comparison. Finally, it will discuss how the methodology will be conducted to make...development models; it serves as the base data for case study comparison. 48 Research Methodology and Data Analysis A qualitative research

  10. A Methodology to Apply Business Process Reengineering within the Brazilian Aeronautical Ministry

    DTIC Science & Technology

    1999-03-01

    systems of the agency are designed, developed , maintained, and used effectively and efficiently. Defense Directives Within DoD the regulations that...over the last two years in an attempt to develop the best and most efficient and effective approach to BPR in DoD. The tasks to be performed are: 1...how I would develop my research. To my classmates, GIR/GIS class 98S, and also class 99D, thank you all for the encouragement and wisdom you gave me

  11. Extending the Instructional Systems Development Methodology.

    ERIC Educational Resources Information Center

    O'Neill, Colin E.

    1993-01-01

    Describes ways that components of Information Engineering (IE) methodology can be used by training system developers to extend Instructional Systems Development (ISD) methodology. Aspects of IE that are useful in ISD are described, including requirements determination, group facilitation, integrated automated tool support, and prototyping.…

  12. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    NASA Technical Reports Server (NTRS)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.
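
    Whatever mix of parametric, analogy and expert-opinion estimates populates the cost elements, the roll-up is a discounted sum over the life cycle. The sketch below is a minimal illustration with hypothetical cost streams and discount rate, not figures from the NASA or FAA methodology.

    ```python
    # Minimal sketch of a life-cycle cost roll-up for a decision support tool:
    # development plus recurring operations, discounted to present value.
    # Cost figures and the discount rate are hypothetical placeholders.
    def present_value(annual_costs, rate):
        return sum(c / (1 + rate) ** t for t, c in enumerate(annual_costs))

    development = [2.0, 3.5, 1.5]              # $M per year (e.g. parametric estimate)
    operations  = [0.0, 0.0, 0.6] + [0.8] * 7  # $M per year (e.g. analogy with a fielded tool)
    total = [d + o for d, o in zip(development + [0.0] * 7, operations)]
    print(f"10-year LCC ~ ${present_value(total, 0.07):.1f}M (7% discount rate)")
    ```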

  13. Weapon Simulator Test Methodology Investigation: Comparison of Live Fire and Weapon Simulator Test Methodologies and the Effects of Clothing and Individual Equipment on Marksmanship

    DTIC Science & Technology

    2016-09-15

    METHODOLOGY INVESTIGATION: COMPARISON OF LIVE FIRE AND WEAPON SIMULATOR TEST METHODOLOGIES AND THE EFFECTS OF CLOTHING AND INDIVIDUAL EQUIPMENT ON...2. REPORT TYPE Final 3. DATES COVERED (From - To) October 2014 – August 2015 4. TITLE AND SUBTITLE WEAPON SIMULATOR TEST METHODOLOGY INVESTIGATION...COMPARISON OF LIVE FIRE AND WEAPON SIMULATOR TEST METHODOLOGIES AND THE EFFECTS OF CLOTHING AND INDIVIDUAL EQUIPMENT ON MARKSMANSHIP 5a. CONTRACT

  14. AMD NOX REDUCTION IMPACTS

    EPA Science Inventory

    This is the first phase of a potentially multi-phase project aimed at identifying scientific methodologies that will lead to the development of innovative analytical tools supporting the analysis of control strategy effectiveness, namely, accountability. Significant reductions i...

  15. Rail-Highway Crossing Resource Allocation Model

    DOT National Transportation Integrated Search

    1981-04-01

    This report describes a methodology developed at the Transportation Systems Center for the Federal Railroad Administration and the Federal Highway Administration to aid in determining the most effective allocation of funds to improve safety at rail-h...

  16. Rapid development of xylanase assay conditions using Taguchi methodology.

    PubMed

    Prasad Uday, Uma Shankar; Bandyopadhyay, Tarun Kanti; Bhunia, Biswanath

    2016-11-01

    The present investigation is mainly concerned with the rapid development of extracellular xylanase assay conditions by using Taguchi methodology. The extracellular xylanase was produced from Aspergillus niger (KP874102.1), a new strain isolated from a soil sample of the Baramura forest, Tripura West, India. Four physical parameters including temperature, pH, buffer concentration and incubation time were considered as key factors for xylanase activity and were optimized using Taguchi robust design methodology for enhanced xylanase activity. The main effect, interaction effects and optimal levels of the process factors were determined using the signal-to-noise (S/N) ratio, which the Taguchi method recommends for measuring quality characteristics. Analysis of variance (ANOVA) was performed to evaluate statistically significant process factors. ANOVA results showed that temperature had the greatest impact on xylanase activity (62.58%), followed by pH (22.69%), buffer concentration (9.55%) and incubation time (5.16%). Predicted results showed that enhanced xylanase activity (81.47%) can be achieved with pH 2, temperature 50°C, buffer concentration 50 mM and incubation time 10 min.
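
    The S/N-based analysis can be sketched as follows: a larger-the-better signal-to-noise ratio per factor level, and an ANOVA-style percentage contribution. The activity values below are invented placeholders, not the measured xylanase data.

    ```python
    import numpy as np

    # Minimal sketch of the Taguchi "larger-the-better" S/N ratio and a
    # percentage-contribution estimate for one factor; values are hypothetical.
    def sn_larger_is_better(y):
        y = np.asarray(y, dtype=float)
        return -10.0 * np.log10(np.mean(1.0 / y**2))

    # Replicate activities (U/mL) at three temperature levels of an orthogonal
    # array, pooled over the other factors.
    levels = {40: [61.0, 64.0], 50: [80.0, 83.0], 60: [70.0, 72.0]}
    sn = {lvl: sn_larger_is_better(vals) for lvl, vals in levels.items()}
    print("S/N by temperature level:", {k: round(v, 2) for k, v in sn.items()})

    # Percentage contribution: factor sum of squares / total sum of squares.
    grand = np.mean([v for vals in levels.values() for v in vals])
    ss_factor = sum(len(v) * (np.mean(v) - grand) ** 2 for v in levels.values())
    ss_total = sum((x - grand) ** 2 for vals in levels.values() for x in vals)
    print(f"temperature contribution ~ {100 * ss_factor / ss_total:.1f}%")
    ```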

  17. [Methodological aspects of integrated care pathways].

    PubMed

    Gomis, R; Mata Cases, M; Mauricio Puente, D; Artola Menéndez, S; Ena Muñoz, J; Mediavilla Bravo, J J; Miranda Fernández-Santos, C; Orozco Beltrán, D; Rodríguez Mañas, L; Sánchez Villalba, C; Martínez, J A

    An Integrated Healthcare Pathway (PAI) is a tool whose aim is to increase the effectiveness of clinical performance through greater coordination and to ensure continuity of care. A PAI places the patient at the centre of the organisation of health services. It is defined as the set of activities carried out by the health care providers in order to increase the level of health and satisfaction of the population receiving services. The development of a PAI requires the analysis of the flow of activities, the inter-relationships between professionals and care teams, and patient expectations. The methodology for the development of a PAI is presented and discussed in this article, as well as the success factors for its definition and its effective implementation. It also explains, as an example, the recent PAI for Hypoglycaemia in patients with Type 2 Diabetes Mellitus developed by a multidisciplinary team and supported by several scientific societies. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  18. Building quality into medical product software design.

    PubMed

    Mallory, S R

    1993-01-01

    The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.

  19. Assessment of reproductive and developmental effects of DINP, DnHP and DCHP using quantitative weight of evidence.

    PubMed

    Dekant, Wolfgang; Bridges, James

    2016-11-01

    Quantitative weight of evidence (QWoE) methodology utilizes detailed scoring sheets to assess the quality/reliability of each publication on toxicity of a chemical and gives numerical scores for quality and observed toxicity. This QWoE methodology was applied to the reproductive toxicity data on diisononylphthalate (DINP), di-n-hexylphthalate (DnHP), and dicyclohexylphthalate (DCHP) to determine if the scientific evidence for adverse effects meets the requirements for classification as reproductive toxicants. The scores for DINP were compared to those obtained when applying the methodology to DCHP and DnHP, which have harmonized classifications. Based on the quality/reliability scores, application of the QWoE shows that the three databases are of similar quality, but the effect scores differ widely. Application of QWoE to DINP studies resulted in an overall score well below the benchmark required to trigger classification. For DCHP, the QWoE also results in low scores. The high scores from the application of the QWoE methodology to the toxicological data for DnHP represent clear evidence for adverse effects and justify a classification of DnHP as category 1B for both development and fertility. The conclusions on classification based on the QWoE are well supported using a narrative assessment of consistency and biological plausibility. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
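
    The roll-up behind such a QWoE comparison can be pictured as a quality-weighted mean of per-study effect scores set against a classification benchmark. The scores, scales and benchmark below are hypothetical illustrations, not the published DINP, DnHP or DCHP values.

    ```python
    # Minimal sketch of a quantitative weight-of-evidence roll-up: each scored
    # publication contributes a quality/reliability score and an effect score, and
    # the quality-weighted mean effect is compared with a classification benchmark.
    # All numbers here are hypothetical placeholders.
    studies = [
        # (quality 0-4, effect 0-4) per scored publication
        (3.5, 0.5),
        (2.0, 1.0),
        (3.0, 0.0),
    ]

    def qwoe_score(studies):
        """Quality-weighted mean effect score."""
        wsum = sum(q for q, _ in studies)
        return sum(q * e for q, e in studies) / wsum

    BENCHMARK = 1.5   # assumed threshold that would trigger classification
    score = qwoe_score(studies)
    print(f"QWoE score = {score:.2f} -> classify: {score >= BENCHMARK}")
    ```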

  20. Entropy Filtered Density Function for Large Eddy Simulation of Turbulent Reacting Flows

    NASA Astrophysics Data System (ADS)

    Safari, Mehdi

    Analysis of local entropy generation is an effective means to optimize the performance of energy and combustion systems by minimizing the irreversibilities in transport processes. Large eddy simulation (LES) is employed to describe entropy transport and generation in turbulent reacting flows. The entropy transport equation in LES contains several unclosed terms. These are the subgrid scale (SGS) entropy flux and entropy generation caused by irreversible processes: heat conduction, mass diffusion, chemical reaction and viscous dissipation. The SGS effects are taken into account using a novel methodology based on the filtered density function (FDF). This methodology, entitled entropy FDF (En-FDF), is developed and utilized in the form of joint entropy-velocity-scalar-turbulent frequency FDF and the marginal scalar-entropy FDF, both of which contain the chemical reaction effects in a closed form. The former constitutes the most comprehensive form of the En-FDF and provides closure for all the unclosed filtered moments. This methodology is applied for LES of a turbulent shear layer involving transport of passive scalars. Predictions show favorable agreements with the data generated by direct numerical simulation (DNS) of the same layer. The marginal En-FDF accounts for entropy generation effects as well as scalar and entropy statistics. This methodology is applied to a turbulent nonpremixed jet flame (Sandia Flame D) and predictions are validated against experimental data. In both flows, sources of irreversibility are predicted and analyzed.

  1. A history of the development of Brucella vaccines.

    PubMed

    Avila-Calderón, Eric Daniel; Lopez-Merino, Ahidé; Sriranganathan, Nammalwar; Boyle, Stephen M; Contreras-Rodríguez, Araceli

    2013-01-01

    Brucellosis is a worldwide zoonosis affecting animal and human health. In the last several decades, much research has been performed to develop safer Brucella vaccines to control the disease mainly in animals. To date, no effective human vaccine is available. The aim of this paper is to review and discuss the importance of methodologies used to develop Brucella vaccines in pursuing this challenge.

  2. Data Centric Development Methodology

    ERIC Educational Resources Information Center

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have been mostly adopting a software methodology, such as a waterfall or Rational Unified Process, as the framework for their development. These methodologies could work on structural, procedural, or object-oriented applications, but fail to capture…

  3. Environment, genes, and experience: lessons from behavior genetics.

    PubMed

    Barsky, Philipp I

    2010-11-01

    The article reviews the theoretical analysis of the problems inherent in studying the environment within behavior genetics across several periods in the development of environmental studies in behavior genetics and proposes some possible alternatives to traditional approaches to studying the environment in behavior genetics. The first period (from the end of the 1920s to the end of the 1970s), when the environment was not actually studied, is called pre-environmental; during this time, the basic principles and theoretical models of understanding environmental effects in behavior genetics were developed. The second period is characterized by the development of studies on environmental influences within the traditional behavior genetics paradigm; several approaches to studying the environment emerged in behavior genetics during this period, from the beginning of the 1980s until today. At the present time, the field is undergoing paradigmatic changes, concerned with methodology, theory, and mathematical models of genotype-environment interplay; this might be the beginning of a third period of development of environmental studies in behavior genetics. In another part, the methodological problems related to environmental studies in behavior genetics are discussed. Although the methodology used in differential psychology is applicable for assessment of differences between individuals, it is insufficient to explain the sources of these differences. In addition, we stress that psychoanalytic studies of twins and their experiences, initiated in the 1930s and continued episodically until the 1980s, could bring an interesting methodology and contribute to the explanation of puzzling findings from environmental studies of behavior genetics. Finally, we will conclude with implications from the results of environmental studies in behavior genetics, including methodological issues. Copyright © 2010 Elsevier Ltd. All rights reserved.

  4. Methodology, status and plans for development and assessment of Cathare code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bestion, D.; Barre, F.; Faydide, B.

    1997-07-01

    This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented. The general strategy used for the development and the assessment of the code is presented. Analytical experiments with separate effect tests, and component tests are used for the development and the validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of the Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for the future developments of the code are presented. They concern the optimization of the code performance through parallel computing - the code will be used for real time full scope plant simulators - the coupling with many other codes (neutronic codes, severe accident codes), the application of the code for containment thermalhydraulics. Also, physical improvements are required in the field of low pressure transients and in the modeling for the 3-D model.

  5. Research methodology workshops evaluation using the Kirkpatrick's model: translating theory into practice.

    PubMed

    Abdulghani, Hamza Mohammad; Shaik, Shaffi Ahamed; Khamis, Nehal; Al-Drees, Abdulmajeed Abdulrahman; Irshad, Mohammad; Khalil, Mahmoud Salah; Alhaqwi, Ali Ibrahim; Isnani, Arthur

    2014-04-01

    Qualitative and quantitative evaluation of academic programs can enhance the development, effectiveness, and dissemination of comparative quality reports as well as quality improvement efforts. The aim was to evaluate five research methodology workshops by assessing participants' satisfaction, knowledge and skills gain, and impact on practice using Kirkpatrick's evaluation model. The four-level Kirkpatrick model was applied for the evaluation. Training feedback questionnaires, pre- and post-tests, learner development plan reports and behavioral surveys were used to evaluate the effectiveness of the workshop programs. Of the 116 participants, 28 (24.1%) liked with appreciation, 62 (53.4%) liked with suggestions and 26 (22.4%) disliked the programs. Mean pre- and post-test MCQ scores showed a significant improvement in relevant basic knowledge and cognitive skills of 17.67% (p ≤ 0.005). Pre- and post-test scores on workshop sub-topics also significantly improved for manuscript writing (p ≤ 0.031) and proposal writing (p ≤ 0.834). As for the impact, 56.9% of participants started research, and 6.9% published their studies. The results from participants' performance revealed overall positive feedback, and 79% of participants reported transfer of training skills at their workplace. The achievement of course outcomes and the suggestions given for improvement offer encouraging and very useful insight into the program. Encouraging "research culture" and work-based learning are probably the most powerful determinants for research promotion. These findings therefore encourage the faculty development unit to continue its training and development in the research methodology aspects.
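
    The level-2 (learning) part of such a Kirkpatrick-style evaluation is essentially a paired pre/post comparison. The sketch below simulates scores and computes the percentage gain and a paired t statistic; the numbers are placeholders, not the workshop data.

    ```python
    import numpy as np

    # Minimal sketch of a paired pre/post knowledge comparison (Kirkpatrick level 2).
    # Scores are simulated placeholders, not the reported workshop results.
    rng = np.random.default_rng(7)
    pre = rng.normal(55, 10, size=116)
    post = pre + rng.normal(10, 8, size=116)           # assumed average gain of ~10 points

    gain_pct = 100 * (post.mean() - pre.mean()) / pre.mean()
    d = post - pre
    t = d.mean() / (d.std(ddof=1) / np.sqrt(d.size))   # paired t statistic, df = n - 1
    print(f"mean knowledge gain ~ {gain_pct:.1f}%  (paired t = {t:.1f}, df = {d.size - 1})")
    ```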

  6. Risk analysis within environmental impact assessment of proposed construction activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeleňáková, Martina; Zvijáková, Lenka

    Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions – flood protection structures using risk analysis methods. The application of methodology designed for the process of environmental impact assessment will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, the set objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.

  7. American Society of Composites, 32nd Technical Conference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aitharaju, Venkat; Wollschlager, Jeffrey; Plakomytis, Dimitrios

    This paper will present a general methodology by which weave draping manufacturing simulation results can be utilized to include the effects of weave draping and scissor angle in a structural multiscale simulation. While the methodology developed is general in nature, this paper will specifically demonstrate the methodology applied to a truncated pyramid, utilizing manufacturing simulation weave draping results from ESI PAM-FORM, and multiscale simulation using Altair Multiscale Designer (MDS) and OptiStruct. From a multiscale simulation perspective, the weave draping manufacturing simulation results will be used to develop a series of woven unit cells which cover the range of weave scissor angles existing within the part. For each unit cell, a multiscale material model will be developed, and applied to the corresponding spatial locations within the structural simulation mesh. In addition, the principal material orientation will be mapped from the weave draping manufacturing simulation mesh to the structural simulation mesh using Altair HyperMesh mapping technology. Results of the coupled simulation will be compared and verified against experimental data available via the General Motors (GM) Department of Energy (DOE) project.

  8. Application of Observing System Simulation Experiments (OSSEs) to determining science and user requirements for space-based missions

    NASA Astrophysics Data System (ADS)

    Atlas, R. M.

    2016-12-01

    Observing System Simulation Experiments (OSSEs) provide an effective method for evaluating the potential impact of proposed new observing systems, as well as for evaluating trade-offs in observing system design, and in developing and assessing improved methodology for assimilating new observations. As such, OSSEs can be an important tool for determining science and user requirements, and for incorporating these requirements into the planning for future missions. Detailed OSSEs have been conducted at NASA/ GSFC and NOAA/AOML in collaboration with Simpson Weather Associates and operational data assimilation centers over the last three decades. These OSSEs determined correctly the quantitative potential for several proposed satellite observing systems to improve weather analysis and prediction prior to their launch, evaluated trade-offs in orbits, coverage and accuracy for space-based wind lidars, and were used in the development of the methodology that led to the first beneficial impacts of satellite surface winds on numerical weather prediction. In this talk, the speaker will summarize the development of OSSE methodology, early and current applications of OSSEs and how OSSEs will evolve in order to enhance mission planning.

  9. Improving the completion of Quality Improvement projects amongst psychiatry core trainees.

    PubMed

    Ewins, Liz

    2015-01-01

    Quality Improvement (QI) projects are seen increasingly as more valuable and effective in developing services than traditional audit. However, the development of this methodology has been slower in the mental health field and QI projects are new to most psychiatrists. This project describes a way of engaging trainees across Avon and Wiltshire Mental Health Partnership (AWP) Trust and the Severn School of Psychiatry in QI projects, using QI methodology itself. Through the implementation and development of training sessions and simple, low cost and sustainable interventions over a 10 month period, two thirds of core trainees and over a half of the advanced psychiatry trainees in the School are now participating in 28 individual QI projects and QI project methodology is to become embedded in the core psychiatry training course. As an additional positive outcome, specialty doctors, consultants, foundation doctors, GP trainees, medical students, as well as the wider multidisciplinary team, have all become engaged in QI projects alongside trainees, working with service users and their families to identify problems to tackle and ideas to test.

  10. Developing and Implementing an Online Doctoral Programme

    ERIC Educational Resources Information Center

    Combe, Colin

    2005-01-01

    Purpose: This article is a critical reflection of the development and implementation of one of the first online doctoral programs in the UK set up at the University of Northumbria, Newcastle in 2000. Design/methodology/approach: The method adopted for analysis takes the form of a case study. Findings: Effective market research has to be undertaken…

  11. The Integration of Green Chemistry Experiments with Sustainable Development Concepts in Pre-Service Teachers' Curriculum: Experiences from Malaysia

    ERIC Educational Resources Information Center

    Karpudewan, Mageswary; Ismail, Zurida Hg; Mohamed, Norita

    2009-01-01

    Purpose: The purpose of this paper is to introduce green chemistry experiments as laboratory-based pedagogy and to evaluate effectiveness of green chemistry experiments in delivering sustainable development concepts (SDCs) and traditional environmental concepts (TECs). Design/methodology/approach: Repeated measure design was employed to evaluate…

  12. A Methodological Review of Research on Leadership Development and Social Capital: Is There a Cause and Effect Relationship?

    ERIC Educational Resources Information Center

    Van De Valk, Lawrence J.; Constas, Mark A.

    2011-01-01

    Recent interest in studying social aspects of leadership has brought attention to the relationship between leadership and social capital. There is also growing interest among stakeholders (researchers, practitioners, funders, and program participants) to improve evaluation methods for leadership development programs (LDPs). The purpose of the…

  13. Parenting and Psychosocial Development of IVF Children: Review of the Research Literature.

    ERIC Educational Resources Information Center

    Colpin, Hilde

    2002-01-01

    Examines the many hypotheses formulated about the possible effects that in vitro fertilization as a method of conception may have on the parent-child relationship and the child's psychosocial development. Discusses potential explanations for the various study findings since the 1990s (including methodological issues) and suggestions for future…

  14. Development of field-deployable instrumentation based on “antigen–antibody” reactions for detection of hemorrhagic disease in ruminants

    USDA-ARS?s Scientific Manuscript database

    Development of field-deployable methodology utilizing antigen–antibody reactions and the surface plasmon resonance (SPR) effect to provide a rapid diagnostic test for recognition of bluetongue virus (BTV) and epizootic hemorrhagic disease virus (EHDV) in wild and domestic ruminants is reported. ...

  15. Evaluating Interactive Policy Making on Biotechnology: The Case of the Dutch Ministry of Health, Welfare and Sport

    ERIC Educational Resources Information Center

    Broerse, Jacqueline E. W.; de Cock Buning, Tjard; Roelofsen, Anneloes; Bunders, Joske F. G.

    2009-01-01

    Public engagement is increasingly advocated and applied in the development and implementation of technological innovations. However, initiatives so far are rarely considered effective. There is a need for more methodological rigor and insight into conducive conditions. The authors developed an evaluative framework and assessed accordingly the…

  16. Development of Managers' Emotional Competencies: Mind-Body Training Implication

    ERIC Educational Resources Information Center

    Gruicic, Dusan; Benton, Stephen

    2015-01-01

    Purpose: This paper aims to research about the effect of mind-body training on the development of emotional competencies of managers. Design/methodology/approach: Quasi-experimental design, i.e. before and after (test-retest). Findings: Results showed that the experimental group, after training, achieved around 15 per cent higher scores compared…

  17. Effective Rating Scale Development for Speaking Tests: Performance Decision Trees

    ERIC Educational Resources Information Center

    Fulcher, Glenn; Davidson, Fred; Kemp, Jenny

    2011-01-01

    Rating scale design and development for testing speaking is generally conducted using one of two approaches: the measurement-driven approach or the performance data-driven approach. The measurement-driven approach prioritizes the ordering of descriptors onto a single scale. Meaning is derived from the scaling methodology and the agreement of…

  18. Design, Development and Analysis of Centrifugal Blower

    NASA Astrophysics Data System (ADS)

    Baloni, Beena Devendra; Channiwala, Salim Abbasbhai; Harsha, Sugnanam Naga Ramannath

    2018-06-01

    Centrifugal blowers are turbomachines widely used in modern industrial and domestic applications. The manufacturing of blowers seldom follows an optimum design solution for each individual blower. Although centrifugal blowers are developed as highly efficient machines, design is still based on various empirical and semi-empirical rules proposed by fan designers. There are different methodologies used to design the impeller and other components of blowers. The objective of the present study is to examine these explicit design methodologies and to trace a unified design that gives better design-point performance. The unified design methodology is based more on fundamental concepts and minimum assumptions. A parametric study is also carried out on the effect of design parameters on pressure ratio and their interdependency in the design. The code is developed in C based on the unified design. Numerical analysis is carried out to check the flow parameters inside the blower. Two blowers, one based on the present design and the other on an industrial design, are developed with a standard OEM blower manufacturing unit. A comparison of both designs is done based on experimental performance analysis as per IS standard. The results suggest better efficiency and more flow rate for the same pressure head for the present design compared with the industrial one.
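
    Design codes of this kind typically start from the Euler turbomachinery relation for the pressure rise at the design point. The sketch below shows that sizing step in Python; the geometry, speed, slip factor and efficiency are assumed illustrative values, not the parameters of the blower in the paper.

    ```python
    import math

    # Minimal sketch of an Euler-equation sizing step for a backward-swept impeller.
    # All values below are assumed illustrative inputs.
    rho   = 1.2      # air density, kg/m^3
    N     = 2900     # shaft speed, rpm
    D2    = 0.40     # impeller outlet diameter, m
    b2    = 0.05     # outlet blade width, m
    beta2 = math.radians(35.0)   # blade angle from tangential at outlet
    Q     = 0.8      # volume flow rate, m^3/s
    slip, eta = 0.85, 0.80       # assumed slip factor and hydraulic efficiency

    U2  = math.pi * D2 * N / 60.0              # blade speed at outlet
    Cm2 = Q / (math.pi * D2 * b2)              # meridional velocity at outlet
    Cu2 = slip * (U2 - Cm2 / math.tan(beta2))  # tangential velocity with slip
    dp  = eta * rho * U2 * Cu2                 # estimated total pressure rise, Pa
    print(f"U2 = {U2:.1f} m/s, delta_p ~ {dp:.0f} Pa")
    ```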

  19. Improving Junior Infantry Officer Leader Development and Performance

    DTIC Science & Technology

    2017-06-09

    researcher used a qualitative literature review and semi-structured interview methodology to analyze Army leadership theories and leader development...researcher used a qualitative literature review and semi-structured interview methodology to analyze Army leadership theories and leader development...CHAPTER 3 RESEARCH METHODOLOGY ..............................................................132 CHAPTER 4 QUALITATIVE ANALYSIS

  20. Alzheimer’s Disease Drug Development in 2008 and Beyond: Problems and Opportunities

    PubMed Central

    Becker, Robert E.; Greig, Nigel H.

    2008-01-01

    Recently, a number of Alzheimer’s disease (AD) multi-center clinical trials (CT) have failed to provide statistically significant evidence of drug efficacy. To test for possible design or execution flaws we analyzed in detail CTs for two failed drugs that were strongly supported by preclinical evidence and by proven CT AD efficacy for other drugs in their class. Studies of the failed commercial trials suggest that methodological flaws may contribute to the failures and that these flaws lurk within current drug development practices ready to impact other AD drug development [1]. To identify and counter risks we considered the relevance to AD drug development of the following factors: (1) effective dosing of the drug product, (2) reliable evaluations of research subjects, (3) effective implementation of quality controls over data at research sites, (4) resources for practitioners to effectively use CT results in patient care, (5) effective disease modeling, (6) effective research designs. New drugs currently under development for AD address a variety of specific mechanistic targets. Mechanistic targets provide AD drug development opportunities to escape from many of the factors that currently undermine AD clinical pharmacology, especially the problems of inaccuracy and imprecision associated with using rated outcomes. In this paper we conclude that many of the current problems encountered in AD drug development can be avoided by changing practices. Current problems with human errors in clinical trials make it difficult to differentiate drugs that fail to evidence efficacy from apparent failures due to Type II errors. This uncertainty and the lack of publication of negative data impede researchers’ abilities to improve methodologies in clinical pharmacology and to develop a sound body of knowledge about drug actions. We consider the identification of molecular targets as offering further opportunities for overcoming current failures in drug development. PMID:18690832

  1. A Screening Method for Assessing Cumulative Impacts

    PubMed Central

    Alexeeff, George V.; Faust, John B.; August, Laura Meehan; Milanes, Carmen; Randles, Karen; Zeise, Lauren; Denton, Joan

    2012-01-01

    The California Environmental Protection Agency (Cal/EPA) Environmental Justice Action Plan calls for guidelines for evaluating “cumulative impacts.” As a first step toward such guidelines, a screening methodology for assessing cumulative impacts in communities was developed. The method, presented here, is based on the working definition of cumulative impacts adopted by Cal/EPA [1]: “Cumulative impacts means exposures, public health or environmental effects from the combined emissions and discharges in a geographic area, including environmental pollution from all sources, whether single or multi-media, routinely, accidentally, or otherwise released. Impacts will take into account sensitive populations and socio-economic factors, where applicable and to the extent data are available.” The screening methodology is built on this definition as well as current scientific understanding of environmental pollution and its adverse impacts on health, including the influence of both intrinsic, biological factors and non-intrinsic socioeconomic factors in mediating the effects of pollutant exposures. It addresses disparities in the distribution of pollution and health outcomes. The methodology provides a science-based tool to screen places for relative cumulative impacts, incorporating both the pollution burden on a community (including exposures to pollutants and their public health and environmental effects) and community characteristics, specifically sensitivity and socioeconomic factors. The screening methodology provides relative rankings to distinguish more highly impacted communities from less impacted ones. It may also help identify which factors are the greatest contributors to a community’s cumulative impact. It is not designed to provide quantitative estimates of community-level health impacts. A pilot screening analysis is presented here to illustrate the application of this methodology. Once guidelines are adopted, the methodology can serve as a screening tool to help Cal/EPA programs prioritize their activities and target those communities with the greatest cumulative impacts. PMID:22470315
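
    The scoring logic described, a pollution-burden component combined with a community-characteristics component built from percentile-scaled indicators, can be sketched as follows. The indicator values and the half-weighting of environmental effects are illustrative assumptions, not the adopted Cal/EPA weights.

    ```python
    # Minimal sketch of a relative cumulative-impact screening score: a pollution-burden
    # component multiplied by a population-characteristics component, each a weighted
    # average of 0-10 percentile-scaled indicators. All values and weights are
    # illustrative assumptions.
    def component(indicators, weights):
        """Weighted average of percentile-scaled indicators (0-10 each)."""
        return sum(v * w for v, w in zip(indicators, weights)) / sum(weights)

    exposures   = [7.2, 5.8, 8.1]        # e.g. PM2.5, ozone, traffic density
    env_effects = [6.0, 4.5]             # e.g. cleanup sites, impaired water bodies
    sensitivity = [6.5, 7.0]             # e.g. asthma rate, low birth weight
    socioecon   = [8.0, 7.5]             # e.g. poverty, linguistic isolation

    pollution_burden = component(exposures + env_effects, [1, 1, 1, 0.5, 0.5])
    population_chars = component(sensitivity + socioecon, [1, 1, 1, 1])
    print(f"relative cumulative impact score = {pollution_burden * population_chars:.1f} / 100")
    ```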

  2. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Harris, Charles E. (Editor)

    1994-01-01

    International technical experts in durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The symposium focused on the dissemination of new knowledge and the peer-review of progress on the development of advanced methodologies. Papers were presented on: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and advanced approaches to resist corrosion and environmentally assisted fatigue.

  3. Investigation of indigenous water, salt and soil for solar ponds

    NASA Astrophysics Data System (ADS)

    Marsh, H. E.

    The existence of salt-gradient solar ponds in nature is a strong indication that the successful exploitation of this phenomenon must account adequately for the influences of the local setting. Sun, weather and other general factors are treated elsewhere. This paper deals with water, salt, and soil. A general methodology for evaluating and, where feasible, adjusting the effects of these elements is under development. Eight essential solar pond characteristics have been identified, along with a variety of their dependencies upon properties of water, salt and soil. The comprehensive methodology, when fully developed, will include laboratory investigation in such diverse areas as brine physical chemistry, light transmission, water treatment, brine-soil interactions, sealants, and others. With the Salton Sea solar pond investigation as an example, some methods under development will be described.

  4. Investigation of indigenous water, salt and soil for solar ponds

    NASA Technical Reports Server (NTRS)

    Marsh, H. E.

    1983-01-01

    The existence of salt-gradient solar ponds in nature is a strong indication that the successful exploitation of this phenomenon must account adequately for the influences of the local setting. Sun, weather and other general factors are treated elsewhere. This paper deals with water, salt, and soil. A general methodology for evaluating and, where feasible, adjusting the effects of these elements is under development. Eight essential solar pond characteristics have been identified, along with a variety of their dependencies upon properties of water, salt and soil. The comprehensive methodology, when fully developed, will include laboratory investigation in such diverse areas as brine physical chemistry, light transmission, water treatment, brine-soil interactions, sealants, and others. With the Salton Sea solar pond investigation as an example, some methods under development will be described.

  5. A method for the design of transonic flexible wings

    NASA Technical Reports Server (NTRS)

    Smith, Leigh Ann; Campbell, Richard L.

    1990-01-01

    Methodology was developed for designing airfoils and wings at transonic speeds which includes a technique that can account for static aeroelastic deflections. This procedure is capable of designing either supercritical or more conventional airfoil sections. Methods for including viscous effects are also illustrated and are shown to give accurate results. The methodology developed is an interactive system containing three major parts. A design module was developed which modifies airfoil sections to achieve a desired pressure distribution. This design module works in conjunction with an aerodynamic analysis module, which for this study is a small perturbation transonic flow code. Additionally, an aeroelastic module is included which determines the wing deformation due to the calculated aerodynamic loads. Because of the modular nature of the method, it can be easily coupled with any aerodynamic analysis code.
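
    The modular loop described above can be sketched as follows: a design step nudges the section shape toward a target pressure distribution, an analysis step recomputes the pressures, and an aeroelastic step updates the deflection. The Python sketch below uses placeholder surrogate models in place of the transonic small-perturbation code and structural solver, so it only illustrates how the three modules are coupled.

        import numpy as np

        # Schematic of the three-module design loop: the "analysis" and "aeroelastic"
        # steps below are placeholder surrogates, not real CFD or structural models.

        def analyze_pressure(geometry, twist):
            """Surrogate aerodynamic analysis: pressure loosely follows geometry and twist."""
            return -2.0 * geometry + 0.1 * twist

        def aeroelastic_deflection(pressure):
            """Surrogate aeroelastic module: twist proportional to integrated load."""
            return 0.05 * np.cumsum(pressure) / len(pressure)

        def design_update(geometry, cp, cp_target, relax=0.3):
            """Design module: modify section shape to reduce the Cp mismatch."""
            return geometry - relax * (cp_target - cp)

        n = 20
        geometry = np.zeros(n)                    # initial section shape parameters
        cp_target = np.linspace(-1.0, -0.2, n)    # desired pressure distribution
        twist = np.zeros(n)

        for it in range(50):
            cp = analyze_pressure(geometry, twist)             # aerodynamic analysis module
            if np.max(np.abs(cp - cp_target)) < 1e-3:
                break
            twist = aeroelastic_deflection(cp)                 # static aeroelastic module
            geometry = design_update(geometry, cp, cp_target)  # design module

        print(f"stopped after {it} design iterations, "
              f"max Cp error {np.max(np.abs(cp - cp_target)):.2e}")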

  6. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance, part 2

    NASA Technical Reports Server (NTRS)

    Harris, Charles E. (Editor)

    1994-01-01

    The international technical experts in the areas of durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The principal focus of the symposium was on the dissemination of new knowledge and the peer-review of progress on the development of advanced methodologies. Papers were presented on the following topics: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and corrosion resistance.

  7. Evaluating building performance in healthcare facilities: an organizational perspective.

    PubMed

    Steinke, Claudia; Webster, Lynn; Fontaine, Marie

    2010-01-01

    Using the environment as a strategic tool is one of the most cost-effective and enduring approaches for improving public health; however, it is one that requires multiple perspectives. The purpose of this article is to highlight an innovative methodology that has been developed for conducting comprehensive performance evaluations in public sector health facilities in Canada. The building performance evaluation methodology described in this paper is a government initiative. The project team developed a comprehensive building evaluation process for all new capital health projects that would respond to the aforementioned need for stakeholders to be more accountable and to better integrate the larger organizational strategy of facilities. The Balanced Scorecard, which is a multiparadigmatic, performance-based business framework, serves as the underlying theoretical framework for this initiative. It was applied in the development of the conceptual model entitled the Building Performance Evaluation Scorecard, which provides the following benefits: (1) It illustrates a process to link facilities more effectively to the overall mission and goals of an organization; (2) It is both a measurement and a management system that has the ability to link regional facilities to measures of success and larger business goals; (3) It provides a standardized methodology that ensures consistency in assessing building performance; and (4) It is more comprehensive than traditional building evaluations. The methodology presented in this paper is both a measurement and management system that integrates the principles of evidence-based design with the practices of pre- and post-occupancy evaluation. It promotes accountability and continues throughout the life cycle of a project. The advantage of applying this framework is that it engages health organizations in clarifying a vision and strategy for their facilities and helps translate those strategies into action and measurable performance outcomes.

  8. Developing purchasing strategy: a case study of a District Health Authority using soft systems methodology.

    PubMed

    Brown, A D

    1997-02-01

    This paper examines the attempt by a District Health Authority (DHA) to create structures (called Purchasing Strategy Groups or PSGs) to facilitate the effective development of its purchasing strategy. The paper is based on a case study design conducted using Soft Systems Methodology (SSM). The research contribution the paper makes is twofold. First, it analyses some of the fundamental management-related difficulties that a DHA can experience when attempting to come to terms with its role and responsibilities in the 1990s. Second, it provides a discussion and evaluation of the utility of SSM for qualitative research in the National Health Service (NHS) in the UK.

  9. Using soft systems methodology to develop a simulation of out-patient services.

    PubMed

    Lehaney, B; Paul, R J

    1994-10-01

    Discrete event simulation is an approach to modelling a system in the form of a set of mathematical equations and logical relationships, usually used for complex problems, which are difficult to address by using analytical or numerical methods. Managing out-patient services is such a problem. However, simulation is not in itself a systemic approach, in that it provides no methodology by which system boundaries and system activities may be identified. The investigation considers the use of soft systems methodology as an aid to drawing system boundaries and identifying system activities, for the purpose of simulating the outpatients' department at a local hospital. The long term aims are to examine the effects that the participative nature of soft systems methodology has on the acceptability of the simulation model, and to provide analysts and managers with a process that may assist in planning strategies for health care.
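
    As a minimal illustration of the discrete event simulation component, the Python sketch below simulates a single-room outpatient clinic with random arrivals and consultation times and reports the average wait. The arrival and service rates are invented; in the approach described above, the system boundary and activities that would inform such a model come from the soft systems analysis.

        import heapq
        import random

        # Minimal discrete event simulation of a one-room outpatient clinic.
        # Events are (time, kind) tuples kept in a priority queue. All rates are illustrative.

        random.seed(1)
        MEAN_INTERARRIVAL = 12.0   # minutes (assumed)
        MEAN_CONSULT = 10.0        # minutes (assumed)
        CLOSE_TIME = 8 * 60        # clinic stops admitting after 8 hours

        events = [(random.expovariate(1 / MEAN_INTERARRIVAL), "arrival")]
        queue = []                 # arrival times of waiting patients
        doctor_busy = False
        waits = []

        while events:
            time, kind = heapq.heappop(events)
            if kind == "arrival":
                if time < CLOSE_TIME:   # schedule the next arrival only while open
                    heapq.heappush(events, (time + random.expovariate(1 / MEAN_INTERARRIVAL), "arrival"))
                if doctor_busy:
                    queue.append(time)
                else:
                    doctor_busy = True
                    waits.append(0.0)
                    heapq.heappush(events, (time + random.expovariate(1 / MEAN_CONSULT), "departure"))
            else:  # departure: free the doctor or call the next waiting patient
                if queue:
                    arrived = queue.pop(0)
                    waits.append(time - arrived)
                    heapq.heappush(events, (time + random.expovariate(1 / MEAN_CONSULT), "departure"))
                else:
                    doctor_busy = False

        print(f"patients seen: {len(waits)}, mean wait: {sum(waits) / len(waits):.1f} min")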

  10. Development of Behavioral Toxicology Methodology for Interactive Exposure Regimens.

    DTIC Science & Technology

    1983-12-01

    exposures conducted at weekly intervals had identical effects. Five consecutive daily exposures resulted in partial tolerance to the disruptive effects...No. 2001C) for which the tap line was also located at the top of the chamber head measured the negative pressure of the chamber interior in relation to...result in partial tolerance development. Although in contrast to Ator and Merigan’s finding, tolerance was not complete after

  11. The Effectiveness of a Program Based on the Combination of Relevance and Confidence Motivational Strategies in Developing EFL Argumentative Writing Skills and Overcoming Writing Apprehension among Students Teachers at Faculty of Education

    ERIC Educational Resources Information Center

    Ahmed Helwa, Hasnaa Sabry Abdel-Hamid

    2014-01-01

    The aim of this research is to investigate the effectiveness of a program based on the combination of relevance and confidence motivational strategies in developing EFL argumentative writing skills and overcoming writing apprehension among student teachers at the Faculty of Education. The design of the research is a mixed research methodology. It…

  12. Transportation Infrastructure Robustness : Joint Engineering and Economic Analysis

    DOT National Transportation Integrated Search

    2017-11-01

    The objectives of this study are to develop a methodology for assessing the robustness of transportation infrastructure facilities and to assess the effect of damage to such facilities on travel demand and the facilities' users' welfare. The robustness...

  13. Corrosion and corrosion fatigue of airframe aluminum alloys

    NASA Technical Reports Server (NTRS)

    Chen, G. S.; Gao, M.; Harlow, D. G.; Wei, R. P.

    1994-01-01

    Localized corrosion and corrosion fatigue crack nucleation and growth are recognized as degradation mechanisms that affect the durability and integrity of commercial transport aircraft. A mechanistically based understanding is needed to aid the development of effective methodologies for assessing the durability and integrity of airframe components. As part of the methodology development, experiments on pitting corrosion, and on corrosion fatigue crack nucleation and early growth from these pits, were conducted. Pitting was found to be associated with constituent particles in the alloys, and pit growth often involved coalescence of individual particle-nucleated pits, both laterally and in depth. Fatigue cracks typically nucleated from one of the larger pits formed by a cluster of particles. The size of the pit at which a fatigue crack nucleates is a function of stress level and fatigue loading frequency. The experimental results are summarized, and their implications for service performance and life prediction are discussed.

  14. Analytical Methodologies for the Determination of Endocrine Disrupting Compounds in Biological and Environmental Samples

    PubMed Central

    Sosa-Ferrera, Zoraida; Mahugo-Santana, Cristina; Santana-Rodríguez, José Juan

    2013-01-01

    Endocrine-disruptor compounds (EDCs) can mimic natural hormones and produce adverse effects in the endocrine functions by interacting with estrogen receptors. EDCs include both natural and synthetic chemicals, such as hormones, personal care products, surfactants, and flame retardants, among others. EDCs are characterised by their ubiquitous presence at trace-level concentrations and their wide diversity. Since the discovery of the adverse effects of these pollutants on wildlife and human health, analytical methods have been developed for their qualitative and quantitative determination. In particular, mass-based analytical methods show excellent sensitivity and precision for their quantification. This paper reviews recently published analytical methodologies for the sample preparation and for the determination of these compounds in different environmental and biological matrices by liquid chromatography coupled with mass spectrometry. The various sample preparation techniques are compared and discussed. In addition, recent developments and advances in this field are presented. PMID:23738329

  15. Three-dimensional stochastic adjustment of volcano geodetic network in Arenal volcano, Costa Rica

    NASA Astrophysics Data System (ADS)

    Muller, C.; van der Laat, R.; Cattin, P.-H.; Del Potro, R.

    2009-04-01

    Volcano geodetic networks are a key instrument for understanding magmatic processes and, thus, for forecasting potentially hazardous activity. These networks are extensively used on volcanoes worldwide and generally comprise a number of different traditional and modern geodetic surveying techniques such as levelling, distance measurement, triangulation and GNSS. However, in most cases, data from the different methodologies are surveyed, adjusted and analysed independently. Experience shows that the problem with this procedure is the mismatch between the excellent correlation of position values within a single technique and the low cross-correlation of such values across different techniques, or when the same network is surveyed shortly afterwards using the same technique. Moreover, maintaining a separate independent network for each geodetic surveying technique strongly increases the logistics and thus the cost of each measurement campaign. It is therefore important to develop geodetic networks which combine the different geodetic surveying techniques, and to adjust the geodetic data together in order to better quantify the uncertainties associated with the measured displacements. To overcome the lack of inter-methodology data integration, the Geomatic Institute of the University of Applied Sciences of Western Switzerland (HEIG-VD) has developed a methodology which uses TRINET+, a 3D stochastic adjustment software package for redundant geodetic networks. The methodology consists of using each geodetic measurement technique for its strengths relative to the other techniques. The combination of the measurements in a single network also allows more cost-effective surveying. The geodetic data are then adjusted and analysed in the same reference frame. The adjustment is based on the least-squares method and links the data with the network geometry. TRINET+ also allows a priori simulations of the network to be run, so the quality and resolution to be expected for a given network can be tested even before it is built. Moreover, a posteriori analysis enables identifying, and hence dismissing, measurement errors (antenna height, atmospheric effects, etc.). Here we present a preliminary effort to apply this technique to volcano deformation. A geodetic network has been developed on the western flank of Arenal volcano in Costa Rica and is surveyed with GNSS, angular and EDM (electronic distance measurement) observations. Three measurement campaigns were carried out between February and June 2008. The results show consistent and accurate estimates of deformation and uncertainty for each of the 12 benchmarks surveyed. The three campaigns also demonstrate the repeatability and consistency of the statistical indicators and the displacement vectors. Although this methodology has only recently been applied to volcanoes, we suggest that, owing to its cost-effective, high-quality results, it has the potential to be incorporated into the design and analysis of volcano geodetic networks worldwide.
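
    At the heart of such an adjustment is a weighted least-squares problem that merges redundant, differently weighted observations into one set of coordinates with propagated uncertainties. The Python sketch below adjusts a toy one-dimensional network of two benchmarks from three distance observations; it illustrates only the generic least-squares machinery, not the TRINET+ software itself.

        import numpy as np

        # Toy weighted least-squares adjustment: estimate positions of two benchmarks
        # (P1, P2) from redundant distance observations relative to a fixed origin.
        # Observation model: l = A x + v, solved as x = (A^T W A)^-1 A^T W l.
        # All observation values and precisions are invented for illustration.

        l = np.array([100.02, 250.07, 149.98])    # origin->P1, origin->P2, P1->P2 (m)
        sigma = np.array([0.02, 0.03, 0.02])      # a priori standard deviations (m)
        W = np.diag(1.0 / sigma**2)               # weight matrix

        # Design matrix for unknowns x = [P1, P2].
        A = np.array([[1.0, 0.0],
                      [0.0, 1.0],
                      [-1.0, 1.0]])

        N = A.T @ W @ A                           # normal matrix
        x_hat = np.linalg.solve(N, A.T @ W @ l)   # adjusted coordinates
        v = A @ x_hat - l                         # residuals

        # A posteriori variance factor and coordinate standard deviations.
        dof = len(l) - len(x_hat)
        s0_sq = float(v.T @ W @ v) / dof
        std_x = np.sqrt(np.diag(s0_sq * np.linalg.inv(N)))

        print(f"P1 = {x_hat[0]:.3f} +/- {std_x[0]:.3f} m")
        print(f"P2 = {x_hat[1]:.3f} +/- {std_x[1]:.3f} m")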

  16. Ligand and structure-based methodologies for the prediction of the activity of G protein-coupled receptor ligands

    NASA Astrophysics Data System (ADS)

    Costanzi, Stefano; Tikhonova, Irina G.; Harden, T. Kendall; Jacobson, Kenneth A.

    2009-11-01

    Accurate in silico models for the quantitative prediction of the activity of G protein-coupled receptor (GPCR) ligands would greatly facilitate the process of drug discovery and development. Several methodologies have been developed based on the properties of the ligands, the direct study of the receptor-ligand interactions, or a combination of both approaches. Ligand-based three-dimensional quantitative structure-activity relationships (3D-QSAR) techniques, not requiring knowledge of the receptor structure, have been historically the first to be applied to the prediction of the activity of GPCR ligands. They are generally endowed with robustness and good ranking ability; however they are highly dependent on training sets. Structure-based techniques generally do not provide the level of accuracy necessary to yield meaningful rankings when applied to GPCR homology models. However, they are essentially independent from training sets and have a sufficient level of accuracy to allow an effective discrimination between binders and nonbinders, thus qualifying as viable lead discovery tools. The combination of ligand and structure-based methodologies in the form of receptor-based 3D-QSAR and ligand and structure-based consensus models results in robust and accurate quantitative predictions. The contribution of the structure-based component to these combined approaches is expected to become more substantial and effective in the future, as more sophisticated scoring functions are developed and more detailed structural information on GPCRs is gathered.
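
    One simple way to realise a ligand- and structure-based consensus is to standardise each method's predictions and average them, so that compounds ranked highly by both approaches rise to the top. The Python sketch below does this for a handful of hypothetical compounds; the scores and the equal weighting are illustrative assumptions, not the consensus scheme of the paper.

        import statistics

        # Hypothetical predictions for five compounds: a ligand-based 3D-QSAR predicted
        # activity (higher = better) and a structure-based docking score (lower = better).
        qsar = {"c1": 7.2, "c2": 6.1, "c3": 8.0, "c4": 5.5, "c5": 6.9}
        docking = {"c1": -9.1, "c2": -7.4, "c3": -8.2, "c4": -6.0, "c5": -9.5}

        def zscores(values):
            mu = statistics.mean(values.values())
            sd = statistics.stdev(values.values())
            return {k: (v - mu) / sd for k, v in values.items()}

        z_qsar = zscores(qsar)
        z_dock = {k: -v for k, v in zscores(docking).items()}  # flip sign: lower docking = better

        # Equal-weight consensus; weights could instead reflect each method's validated accuracy.
        consensus = {k: 0.5 * z_qsar[k] + 0.5 * z_dock[k] for k in qsar}

        for cpd, score in sorted(consensus.items(), key=lambda kv: -kv[1]):
            print(f"{cpd}: consensus z-score {score:+.2f}")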

  17. Designing Facilities for Collaborative Operations

    NASA Technical Reports Server (NTRS)

    Norris, Jeffrey; Powell, Mark; Backes, Paul; Steinke, Robert; Tso, Kam; Wales, Roxana

    2003-01-01

    A methodology for designing operational facilities for collaboration by multiple experts has begun to take shape as an outgrowth of a project to design such facilities for scientific operations of the planned 2003 Mars Exploration Rover (MER) mission. The methodology could also be applicable to the design of military "situation rooms" and other facilities for terrestrial missions. It was recognized in this project that modern mission operations depend heavily upon the collaborative use of computers. It was further recognized that tests have shown that layout of a facility exerts a dramatic effect on the efficiency and endurance of the operations staff. The facility designs (for example, see figure) and the methodology developed during the project reflect this recognition. One element of the methodology is a metric, called effective capacity, that was created for use in evaluating proposed MER operational facilities and may also be useful for evaluating other collaboration spaces, including meeting rooms and military situation rooms. The effective capacity of a facility is defined as the number of people in the facility who can be meaningfully engaged in its operations. A person is considered to be meaningfully engaged if the person can (1) see, hear, and communicate with everyone else present; (2) see the material under discussion (typically data on a piece of paper, computer monitor, or projection screen); and (3) provide input to the product under development by the group. The effective capacity of a facility is less than the number of people that can physically fit in the facility. For example, a typical office that contains a desktop computer has an effective capacity of .4, while a small conference room that contains a projection screen has an effective capacity of around 10. Little or no benefit would be derived from allowing the number of persons in an operational facility to exceed its effective capacity: At best, the operations staff would be underutilized; at worst, operational performance would deteriorate. Elements of this methodology were applied to the design of three operations facilities for a series of rover field tests. These tests were observed by human-factors researchers and their conclusions are being used to refine and extend the methodology to be used in the final design of the MER operations facility. Further work is underway to evaluate the use of personal digital assistant (PDA) units as portable input interfaces and communication devices in future mission operations facilities. A PDA equipped for wireless communication and Ethernet, Bluetooth, or another networking technology would cost less than a complete computer system, and would enable a collaborator to communicate electronically with computers and with other collaborators while moving freely within the virtual environment created by a shared immersive graphical display.

  18. Breaking the Link between Environmental Degradation and Oil Palm Expansion: A Method for Enabling Sustainable Oil Palm Expansion

    PubMed Central

    Smit, Hans Harmen; Meijaard, Erik; van der Laan, Carina; Mantel, Stephan; Budiman, Arif; Verweij, Pita

    2013-01-01

    Land degradation is a global concern. In tropical areas it primarily concerns the conversion of forest into non-forest lands and the associated losses of environmental services. Defining such degradation is not straightforward, which hampers effective reduction of degradation and the use of already degraded lands for more productive purposes. To facilitate the processes of avoided degradation and land rehabilitation, we have developed a methodology in which international environmental and social sustainability standards are used to determine the suitability of lands for sustainable agricultural expansion. The method was developed and tested in one of the frontiers of agricultural expansion, West Kalimantan province in Indonesia. The focus was on oil palm expansion, which is considered a major driver of deforestation in tropical regions globally. The results suggest that substantial changes in current land-use planning are necessary for most new plantations to comply with international sustainability standards. Through visualizing options for sustainable expansion with our methodology, we demonstrate that the link between oil palm expansion and degradation can be broken. Application of the methodology with criteria and thresholds similar to ours could help the Indonesian government and the industry achieve their pro-growth, pro-job, pro-poor and pro-environment development goals. For sustainable agricultural production, context-specific guidance has to be developed in areas suitable for expansion. Our methodology can serve as a template for designing such commodity- and country-specific tools and delivering such guidance. PMID:24039700

  19. Quasi-Maximum Likelihood Estimation of Structural Equation Models with Multiple Interaction and Quadratic Effects

    ERIC Educational Resources Information Center

    Klein, Andreas G.; Muthen, Bengt O.

    2007-01-01

    In this article, a nonlinear structural equation model is introduced and a quasi-maximum likelihood method for simultaneous estimation and testing of multiple nonlinear effects is developed. The focus of the new methodology lies on efficiency, robustness, and computational practicability. Monte-Carlo studies indicate that the method is highly…
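
    For readers unfamiliar with the model class, a structural equation with latent interaction and quadratic effects can be written as below (the notation is assumed here, following common LISREL-style conventions rather than the article's own):

        \eta = \alpha + \gamma_1 \xi_1 + \gamma_2 \xi_2 + \omega_{11} \xi_1^2
               + \omega_{12} \xi_1 \xi_2 + \omega_{22} \xi_2^2 + \zeta,
        \qquad
        x = \tau_x + \Lambda_x \xi + \delta, \quad y = \tau_y + \Lambda_y \eta + \varepsilon

    where the \omega coefficients carry the quadratic and interaction effects, and quasi-maximum likelihood estimation targets all structural coefficients simultaneously.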

  20. Methodological Controversies in the Treatment of Panic Disorder.

    ERIC Educational Resources Information Center

    McNally, Richard J.

    1996-01-01

    Although the National Institutes of Health Consensus Development Conference on the Treatment of Panic Disorder endorsed the effectiveness of cognitive-behavior therapy (CBT), D. F. Klein argues that fatal flaws in all but one CBT study undermine claims about the effectiveness of CBT for panic disorder. This article critiques Klein's arguments and…

  1. An Exploratory Study of Video Browsing User Interface Designs and Research Methodologies: Effectiveness in Information Seeking Tasks.

    ERIC Educational Resources Information Center

    Tse, Tony; Vegh, Sandor; Shneiderman, Ben; Marchionini, Gary

    1999-01-01

    The purpose of this exploratory study was to develop research methods to compare the effectiveness of two video browsing interface designs, or surrogates--one static (storyboard) and one dynamic (slide show)--on two distinct information seeking tasks (gist determination and object recognition). (AEF)

  2. Effective Teaching of the Physical Design of Integrated Circuits Using Educational Tools

    ERIC Educational Resources Information Center

    Aziz, Syed Mahfuzul; Sicard, Etienne; Ben Dhia, Sonia

    2010-01-01

    This paper presents the strategies used for effective teaching and skill development in integrated circuit (IC) design using project-based learning (PBL) methodologies. It presents the contexts in which these strategies are applied to IC design courses at the University of South Australia, Adelaide, Australia, and the National Institute of Applied…

  3. Effectiveness of Social Media for Communicating Health Messages in Ghana

    ERIC Educational Resources Information Center

    Bannor, Richard; Asare, Anthony Kwame; Bawole, Justice Nyigmah

    2017-01-01

    Purpose: The purpose of this paper is to develop an in-depth understanding of the effectiveness, evolution and dynamism of the current health communication media used in Ghana. Design/methodology/approach: This paper uses a multi-method approach which utilizes a combination of qualitative and quantitative approaches. In-depth interviews are…

  4. The Spacing Effect and Its Relevance to Second Language Acquisition

    ERIC Educational Resources Information Center

    Rogers, John

    2017-01-01

    This commentary discusses some theoretical and methodological issues related to research on the spacing effect in second language acquisition research (SLA). There has been a growing interest in SLA in how the temporal distribution of input might impact language development. SLA research in this area has frequently drawn upon the rich field of…

  5. A Critical Look at Methodologies Used to Evaluate Charter School Effectiveness

    ERIC Educational Resources Information Center

    Ackerman, Matthew; Egalite, Anna J.

    2017-01-01

    There is no consensus among researchers on charter school effectiveness in the USA, in part because of discrepancies in the research methods employed across various studies. Causal impact estimates from experimental studies demonstrate large positive impacts, but concerns about the generalizability of these results have prompted the development of…

  6. Benchmark Eye Movement Effects during Natural Reading in Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Howard, Philippa L.; Liversedge, Simon P.; Benson, Valerie

    2017-01-01

    In 2 experiments, eye tracking methodology was used to assess on-line lexical, syntactic and semantic processing in autism spectrum disorder (ASD). In Experiment 1, lexical identification was examined by manipulating the frequency of target words. Both typically developed (TD) and ASD readers showed normal frequency effects, suggesting that the…

  7. Whale Hearing Models

    DTIC Science & Technology

    2005-06-20

    methodologies and partnership projects developed under the ONR Effect of Sound in the Marine Environment (ESME) Program. The effort involved an integration...computational models to predict audiograms for these species. National Security These data will assist in designing effective noise mitigation measures and...includes marine species for which there are reliable hearing data as well as sample sources with appropriate distance effects in their renditions, including

  8. The Effectiveness of Policies and Programs that Attempt to Reduce Firearm Violence: A Meta-Analysis

    ERIC Educational Resources Information Center

    Makarios, Matthew D.; Pratt, Travis C.

    2012-01-01

    In response to rising rates of firearms violence that peaked in the mid-1990s, a wide range of policy interventions have been developed in an attempt to reduce violent crimes committed with firearms. Although some of these approaches appear to be effective at reducing gun violence, methodological variations make comparing effects across program…

  9. Place-based education: An impetus for teacher efficacy

    NASA Astrophysics Data System (ADS)

    Coleman, Tamara Chase

    This research investigated the effect of professional development in place-based (PB) methodology on the efficacy of science teachers. While teachers are expected to use best practices, they do not always implement them, owing to a lack of efficacy in implementation. A professional development (PD) program was designed to increase confidence among teachers planning to incorporate PB methods. Place-based education (PBE) is recognized as a best practice among professional educators. PBE includes the selection and design of, and engagement with, science using the geographic place as the content. The literature reports that student learning and teacher efficacy improve when teachers are prepared effectively in PB practices. This dissertation research examined the effects of PD in PB methodology and its influence on the efficacy of the seven science teachers who participated in the research. An exploratory, qualitative research approach was used to study the characteristics of change among teachers. Qualitative information about the teachers' confidence with PBE methodology and practices was collected through interviews, reflective journals and observations of the teachers working with students in PB settings. Changes in teacher efficacy were accompanied by the teachers becoming more intentional with PBE, networking with experts and expressing a commitment to connect content with the community. The consistency of changes in efficacy among the seven teachers in the study was mixed. Three of the teachers became more confident in their approach to teaching using PB methods and reported that the gain in confidence was influenced by the PBE professional development. Three teachers reported that the PD had little effect on their efficacy in implementing PBE; they cited complications from more pressing issues in their careers, such as the time needed to prepare PBE lessons and to participate meaningfully in the PD, and those difficulties hindered their development of efficacy in implementing PBE. Themes emerging from this research are: PBE is accepted by teachers as a positive methodology for improving efficacy; PBE was recognized as connecting students with, and engaging them in learning about, their local community and environment; and longevity in teaching does not equate with efficacy, while the level of efficacy improves when teachers meaningfully engage in PBE.

  10. Techniques for assessing water resource potentials in the developing countries: with emphasis on streamflow, erosion and sediment transport, water movement in unsaturated soils, ground water, and remote sensing in hydrologic applications

    USGS Publications Warehouse

    Taylor, George C.

    1971-01-01

    Hydrologic instrumentation and methodology for assessing water-resource potentials have originated largely in the developed countries of the temperate zone. The developing countries lie largely in the tropical zone, which contains the full gamut of the earth's climatic environments, including most of those of the temperate zone. For this reason, most hydrologic techniques have world-wide applicability. Techniques for assessing water-resource potentials for the high-priority goals of economic growth are well established in the developing countries--but much more so in some than in others. Conventional techniques for measurement and evaluation of basic hydrologic parameters are now well understood in the developing countries and are generally adequate for their current needs and those of the immediate future. Institutional and economic constraints, however, inhibit growth of sustained programs of hydrologic data collection and application of the data to problems in engineering technology. Computer-based technology, including processing of hydrologic data and mathematical modelling of hydrologic parameters, is also well begun in many developing countries and has much wider potential application. In some developing countries, however, there is a tendency to look on the computer as a panacea for deficiencies in basic hydrologic data collection programs. This fallacy must be discouraged, as the computer is a tool and not a "magic box." There is no real substitute for sound programs of basic data collection. Nuclear and isotopic techniques are being used increasingly in the developed countries in the measurement and evaluation of virtually all hydrologic parameters for which conventional techniques have traditionally been used. Even in the developed countries, however, many hydrologists are not using nuclear techniques, simply because they lack knowledge of the principles involved and of the potential benefits. Nuclear methodology in hydrologic applications is generally more complex than conventional methodology and hence requires a high level of technical expertise for effective use. Application of nuclear techniques to hydrologic problems in the developing countries is likely to be marginal for some years to come, owing to the higher costs involved and expertise required. Nuclear techniques, however, would seem to have particular promise in studies of water movement in unsaturated soils and of erosion and sedimentation, where conventional techniques are inadequate, inefficient and in some cases costly. Remote sensing offers great promise for synoptic evaluations of water resources and hydrologic processes, including the transient phenomena of the hydrologic cycle. Remote sensing is not, however, a panacea for deficiencies in hydrologic data programs in the developing countries. Rather, it is a means for extending and augmenting on-the-ground observations and surveys (ground truth) to evaluate water resources and hydrologic processes on a regional or even continental scale. With respect to economic growth goals in developing countries, there are few identifiable gaps in existing hydrologic instrumentation and methodology insofar as appraisal, development and management of available water resources are concerned. What is needed is acceleration of institutional development and professional motivation toward more effective use of existing and proven methodology. Moreover, much sophisticated methodology can be applied effectively in the developing countries only when adequate levels of indigenous scientific skills have been reached and supportive institutional frameworks have evolved to viability.

  11. Improved Atmospheric Soundings and Error Estimates from Analysis of AIRS/AMSU Data

    NASA Technical Reports Server (NTRS)

    Susskind, Joel

    2007-01-01

    The AIRS Science Team Version 5.0 retrieval algorithm became operational at the Goddard DAAC in July 2007, generating near real-time products from analysis of AIRS/AMSU sounding data. This algorithm contains many significant theoretical advances over the AIRS Science Team Version 4.0 retrieval algorithm used previously. Three very significant developments of Version 5 are: 1) the development and implementation of an improved Radiative Transfer Algorithm (RTA) which allows for accurate treatment of non-Local Thermodynamic Equilibrium (non-LTE) effects on shortwave sounding channels; 2) the development of methodology to obtain very accurate case-by-case product error estimates which are in turn used for quality control; and 3) development of an accurate AIRS-only cloud clearing and retrieval system. These theoretical improvements taken together enabled a new methodology to be developed which further improves soundings in partially cloudy conditions, without the need for microwave observations in the cloud clearing step as has been done previously. In this methodology, longwave CO2 channel observations in the spectral region 700 cm-1 to 750 cm-1 are used exclusively for cloud clearing purposes, while shortwave CO2 channels in the spectral region 2195 cm-1 to 2395 cm-1 are used for temperature sounding purposes. The new methodology for improved error estimates and their use in quality control is described briefly and results are shown indicative of their accuracy. Results are also shown of forecast impact experiments assimilating AIRS Version 5.0 retrieval products in the Goddard GEOS 5 Data Assimilation System using different quality control thresholds.

  12. Identifying treatment effect heterogeneity in clinical trials using subpopulations of events: STEPP.

    PubMed

    Lazar, Ann A; Bonetti, Marco; Cole, Bernard F; Yip, Wai-Ki; Gelber, Richard D

    2016-04-01

    Investigators conducting randomized clinical trials often explore treatment effect heterogeneity to assess whether treatment efficacy varies according to patient characteristics. Identifying heterogeneity is central to making informed personalized healthcare decisions. Treatment effect heterogeneity can be investigated using subpopulation treatment effect pattern plot (STEPP), a non-parametric graphical approach that constructs overlapping patient subpopulations with varying values of a characteristic. Procedures for statistical testing using subpopulation treatment effect pattern plot when the endpoint of interest is survival remain an area of active investigation. A STEPP analysis was used to explore patterns of absolute and relative treatment effects for varying levels of a breast cancer biomarker, Ki-67, in the phase III Breast International Group 1-98 randomized clinical trial, comparing letrozole to tamoxifen as adjuvant therapy for postmenopausal women with hormone receptor-positive breast cancer. Absolute treatment effects were measured by differences in 4-year cumulative incidence of breast cancer recurrence, while relative effects were measured by the subdistribution hazard ratio in the presence of competing risks using O-E (observed-minus-expected) methodology, an intuitive non-parametric method. While estimation of hazard ratio values based on O-E methodology has been shown, a similar development for the subdistribution hazard ratio has not. Furthermore, we observed that the subpopulation treatment effect pattern plot analysis may not produce results, even with 100 patients within each subpopulation. After further investigation through simulation studies, we observed inflation of the type I error rate of the traditional test statistic and sometimes singular variance-covariance matrix estimates that may lead to results not being produced. This is due to the lack of sufficient number of events within the subpopulations, which we refer to as instability of the subpopulation treatment effect pattern plot analysis. We introduce methodology designed to improve stability of the subpopulation treatment effect pattern plot analysis and generalize O-E methodology to the competing risks setting. Simulation studies were designed to assess the type I error rate of the tests for a variety of treatment effect measures, including subdistribution hazard ratio based on O-E estimation. This subpopulation treatment effect pattern plot methodology and standard regression modeling were used to evaluate heterogeneity of Ki-67 in the Breast International Group 1-98 randomized clinical trial. We introduce methodology that generalizes O-E methodology to the competing risks setting and that improves stability of the STEPP analysis by pre-specifying the number of events across subpopulations while controlling the type I error rate. The subpopulation treatment effect pattern plot analysis of the Breast International Group 1-98 randomized clinical trial showed that patients with high Ki-67 percentages may benefit most from letrozole, while heterogeneity was not detected using standard regression modeling. The STEPP methodology can be used to study complex patterns of treatment effect heterogeneity, as illustrated in the Breast International Group 1-98 randomized clinical trial. For the subpopulation treatment effect pattern plot analysis, we recommend a minimum of 20 events within each subpopulation. © The Author(s) 2015.
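
    The defining step of a STEPP analysis is the construction of overlapping subpopulations along the covariate (here Ki-67) and the estimation of a treatment effect within each one. The Python sketch below orders hypothetical patients by the covariate, forms overlapping windows, skips windows with too few events (echoing the minimum-events recommendation above), and reports a crude cumulative-incidence difference per window; the data and the simple effect measure are illustrative, not the BIG 1-98 analysis or the O-E competing-risks estimator.

        import random

        random.seed(7)

        # Illustrative patient records: (ki67_percent, treatment_arm, event_by_4yr).
        patients = [(random.uniform(1, 60), random.choice(["letrozole", "tamoxifen"]),
                     random.random() < 0.15) for _ in range(600)]

        MIN_EVENTS = 20   # minimum events per overlapping subpopulation (as recommended above)
        WINDOW = 200      # patients per window (assumed)
        STEP = 50         # overlap: each window shares 150 patients with the next (assumed)

        patients.sort(key=lambda p: p[0])        # order by the covariate (Ki-67)

        def cuminc_difference(window):
            """Difference in crude 4-year event proportion, letrozole minus tamoxifen."""
            arms = {"letrozole": [], "tamoxifen": []}
            for _, arm, event in window:
                arms[arm].append(event)
            return (sum(arms["letrozole"]) / len(arms["letrozole"])
                    - sum(arms["tamoxifen"]) / len(arms["tamoxifen"]))

        for start in range(0, len(patients) - WINDOW + 1, STEP):
            window = patients[start:start + WINDOW]
            n_events = sum(1 for _, _, e in window if e)
            if n_events < MIN_EVENTS:
                continue                         # skip unstable subpopulations
            ki67_mid = window[WINDOW // 2][0]
            print(f"Ki-67 ~{ki67_mid:5.1f}%: effect (letrozole - tamoxifen) "
                  f"= {cuminc_difference(window):+.3f} ({n_events} events)")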

  13. EPRI/NRC-RES fire human reliability analysis guidelines.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, Stuart R.; Cooper, Susan E.; Najafi, Bijan

    2010-03-01

    During the 1990s, the Electric Power Research Institute (EPRI) developed methods for fire risk analysis to support its utility members in the preparation of responses to Generic Letter 88-20, Supplement 4, 'Individual Plant Examination - External Events' (IPEEE). This effort produced a Fire Risk Assessment methodology for operations at power that was used by the majority of U.S. nuclear power plants (NPPs) in support of the IPEEE program and several NPPs overseas. Although these methods were acceptable for accomplishing the objectives of the IPEEE, EPRI and the U.S. Nuclear Regulatory Commission (NRC) recognized that they required upgrades to support current requirements for risk-informed, performance-based (RI/PB) applications. In 2001, EPRI and the USNRC's Office of Nuclear Regulatory Research (RES) embarked on a cooperative project to improve the state-of-the-art in fire risk assessment to support a new risk-informed environment in fire protection. This project produced a consensus document, NUREG/CR-6850 (EPRI 1011989), entitled 'Fire PRA Methodology for Nuclear Power Facilities' which addressed fire risk for at power operations. NUREG/CR-6850 developed high level guidance on the process for identification and inclusion of human failure events (HFEs) into the fire PRA (FPRA), and a methodology for assigning quantitative screening values to these HFEs. It outlined the initial considerations of performance shaping factors (PSFs) and related fire effects that may need to be addressed in developing best-estimate human error probabilities (HEPs). However, NUREG/CR-6850 did not describe a methodology to develop best-estimate HEPs given the PSFs and the fire-related effects. In 2007, EPRI and RES embarked on another cooperative project to develop explicit guidance for estimating HEPs for human failure events under fire generated conditions, building upon existing human reliability analysis (HRA) methods. This document provides a methodology and guidance for conducting a fire HRA. This process includes identification and definition of post-fire human failure events, qualitative analysis, quantification, recovery, dependency, and uncertainty. This document provides three approaches to quantification: screening, scoping, and detailed HRA. Screening is based on the guidance in NUREG/CR-6850, with some additional guidance for scenarios with long time windows. Scoping is a new approach to quantification developed specifically to support the iterative nature of fire PRA quantification. Scoping is intended to provide less conservative HEPs than screening, but requires fewer resources than a detailed HRA analysis. For detailed HRA quantification, guidance has been developed on how to apply existing methods to assess post-fire HEPs.

  14. Technology-enhanced focus groups as a component of instrument development.

    PubMed

    Strout, Tania D; DiFazio, Rachel L; Vessey, Judith A

    2017-06-22

    Background Bullying is a critical public health problem and a screening tool for use in healthcare is needed. Focus groups are a common tool for generating qualitative data when developing an instrument and evidence suggests that technology-enhanced focus groups can be effective in simultaneously engaging participants from diverse settings. Aim To examine the use of technology-enhanced focus groups in generating an item pool to develop a youth-bullying screening tool. Discussion The authors explore methodological and ethical issues related to conducting technology-enhanced focus groups, drawing on their experience in developing a youth-bullying measure. They conducted qualitative focus groups with professionals from the front lines of bullying response and intervention. They describe the experience of conducting technology-enhanced focus group sessions, focusing on the methodological and ethical issues that researchers engaging in similar work may encounter. Challenges associated with this methodology include establishing rapport among participants, privacy concerns and limited non-verbal communication. Conclusion The use of technology-enhanced focus groups can be valuable in obtaining rich data from a wide variety of disciplines and contexts. Organising these focus groups was inexpensive and preferred by the study's participants. Implications for practice Researchers should consider using technology-enhanced focus groups to generate data to develop health-related measurement tools.

  15. A Human-Centered Design Methodology to Enhance the Usability, Human Factors, and User Experience of Connected Health Systems: A Three-Phase Methodology.

    PubMed

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul Ma; Scharf, Thomas; Quinlan, Leo R; ÓLaighin, Gearóid

    2017-03-16

    Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. We report a successful implementation of the methodology for the design and development of a system for detecting and predicting falls in older adults. We describe in detail what testing and evaluation activities we carried out to effectively test the system and overcome usability and human factors problems. We feel this methodology can be applied to a wide variety of connected health devices and systems. We consider this a methodology that can be scaled to different-sized projects accordingly. ©Richard Harte, Liam Glynn, Alejandro Rodríguez-Molinero, Paul MA Baker, Thomas Scharf, Leo R Quinlan, Gearóid ÓLaighin. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 16.03.2017.

  16. Methodologies for Reservoir Characterization Using Fluid Inclusion Gas Chemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dilley, Lorie M.

    2015-04-13

    The purpose of this project was to: 1) evaluate the relationship between geothermal fluid processes and the compositions of the fluid inclusion gases trapped in the reservoir rocks; and 2) develop methodologies for interpreting fluid inclusion gas data in terms of the chemical, thermal and hydrological properties of geothermal reservoirs. Phase 1 of this project was designed to conduct the following: 1) model the effects of boiling, condensation, conductive cooling and mixing on selected gaseous species, using fluid compositions obtained from geothermal wells; 2) evaluate, using quantitative analyses provided by New Mexico Tech (NMT), how these processes are recorded by fluid inclusions trapped in individual crystals; and 3) determine if the results obtained on individual crystals can be applied to the bulk fluid inclusion analyses determined by Fluid Inclusion Technology (FIT). Our initial studies, however, suggested that numerical modeling of the data would be premature. We observed that the gas compositions, determined on bulk and individual samples, were not the same as those discharged by the geothermal wells. Gases discharged from geothermal wells are CO2-rich and contain low concentrations of light gases (i.e., H2, He, N, Ar, CH4). In contrast, many of our samples displayed enrichments in these light gases. Efforts were initiated to evaluate the reasons for the observed gas distributions. As a first step, we examined the potential importance of different reservoir processes using a variety of commonly employed gas ratios (e.g. Giggenbach plots). The second technical target was the development of interpretational methodologies. We have developed methodologies for the interpretation of fluid inclusion gas data, based on the results of Phase 1, geologic interpretation of fluid inclusion data, and integration of the data. These methodologies can be used in conjunction with the relevant geological and hydrological information on the system to create fluid models for the system. The hope is that the methodologies developed will allow bulk fluid inclusion gas analysis to be a useful tool for estimating relative temperatures, identifying the sources and origins of the geothermal fluids, and developing conceptual models that can be used to help target areas of enhanced permeability.

  17. Simulation of Attacks for Security in Wireless Sensor Network

    PubMed Central

    Diaz, Alvaro; Sanchez, Pablo

    2016-01-01

    The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node’s software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work. PMID:27869710
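
    The attacker-model idea can be illustrated with a toy node-level energy estimate in which a few generic attacker types inflate the radio and CPU duty cycles and the simulator reports the resulting energy cost. In the Python sketch below the attacker types, duty-cycle multipliers, and power figures are invented assumptions standing in for the paper's virtual platform.

        # Toy node-level estimate: energy use under three generic attacker types.
        # All parameters are illustrative; a real virtual platform would model hardware,
        # firmware execution time and the radio channel in detail.

        SIM_SECONDS = 3600
        TX_POWER_MW = 60.0        # radio transmit power (assumed)
        CPU_POWER_MW = 8.0        # processing power (assumed)
        BASE_TX_DUTY = 0.01       # fraction of time transmitting under normal operation
        BASE_CPU_DUTY = 0.05

        ATTACKS = {
            "none":          {"tx_duty": 1.0, "cpu_duty": 1.0},
            "jamming":       {"tx_duty": 4.0, "cpu_duty": 1.2},   # retransmissions increase radio use
            "flooding":      {"tx_duty": 2.5, "cpu_duty": 2.0},   # bogus packets to forward and process
            "crypto_replay": {"tx_duty": 1.1, "cpu_duty": 3.0},   # extra verification work on the CPU
        }

        def energy_mj(tx_scale, cpu_scale):
            """Energy in millijoules over the simulated hour for scaled duty cycles."""
            tx = TX_POWER_MW * min(1.0, BASE_TX_DUTY * tx_scale) * SIM_SECONDS
            cpu = CPU_POWER_MW * min(1.0, BASE_CPU_DUTY * cpu_scale) * SIM_SECONDS
            return tx + cpu

        baseline = energy_mj(1.0, 1.0)
        for name, a in ATTACKS.items():
            e = energy_mj(a["tx_duty"], a["cpu_duty"])
            print(f"{name:13s}: {e:8.0f} mJ  ({100.0 * (e / baseline - 1):+.0f}% vs normal)")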

  18. No evidence for intervention-dependent influence of methodological features on treatment effect.

    PubMed

    Jacobs, Wilco C H; Kruyt, Moyo C; Moojen, Wouter A; Verbout, Ab J; Oner, F Cumhur

    2013-12-01

    The goal of this systematic review was to evaluate if the influence of methodological features on treatment effect differs between types of intervention. MEDLINE, Embase, Web of Science, Cochrane methodology register, and reference lists were searched for meta-epidemiologic studies on the influence of methodological features on treatment effect. Studies analyzing influence of methodological features related to internal validity were included. We made a distinction among surgical, pharmaceutical, and therapeutical as separate types of intervention. Heterogeneity was calculated to identify differences among these types. Fourteen meta-epidemiologic studies were found with 51 estimates of influence of methodological features on treatment effect. Heterogeneity was observed among the intervention types for randomization. Surgical intervention studies showed a larger treatment effect when randomized; this was in contrast to pharmaceutical studies that found the opposite. For allocation concealment and double blinding, the influence of methodological features on the treatment effect was comparable across different types of intervention. For the remaining methodological features, there were insufficient observations. The influence of allocation concealment and double blinding on the treatment effect is consistent across studies of different interventional types. The influence of randomization although, may be different between surgical and nonsurgical studies. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. Application of Autonomous Smart Inverter Volt-VAR Function for Voltage Reduction Energy Savings and Power Quality in Electric Distribution Systems: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Nagarajan, Adarsh; Baggu, Murali

    This paper evaluated the impact of smart inverter Volt-VAR function on voltage reduction energy saving and power quality in electric power distribution systems. A methodology to implement the voltage reduction optimization was developed by controlling the substation LTC and capacitor banks, and having smart inverters participate through their autonomous Volt-VAR control. In addition, a power quality scoring methodology was proposed and utilized to quantify the effect on power distribution system power quality. All of these methodologies were applied to a utility distribution system model to evaluate the voltage reduction energy saving and power quality under various PV penetrations and smart inverter densities.
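
    An autonomous Volt-VAR characteristic is typically a piecewise-linear droop curve: the inverter absorbs reactive power when its terminal voltage is high and injects reactive power when the voltage is low, with a deadband around nominal. The Python sketch below implements such a curve; the breakpoints and reactive-power limits are illustrative assumptions, not the settings used in the study.

        # Illustrative autonomous Volt-VAR droop curve for a smart inverter.
        # Voltage in per-unit; output is reactive power as a fraction of rated capability
        # (positive = injection, negative = absorption). Breakpoints are assumptions.

        CURVE = [  # (voltage_pu, q_fraction) breakpoints, monotonically decreasing Q
            (0.92, +0.44),
            (0.98, 0.00),   # start of deadband
            (1.02, 0.00),   # end of deadband
            (1.08, -0.44),
        ]

        def volt_var(v_pu):
            """Piecewise-linear interpolation on the Volt-VAR curve, clamped at the ends."""
            if v_pu <= CURVE[0][0]:
                return CURVE[0][1]
            if v_pu >= CURVE[-1][0]:
                return CURVE[-1][1]
            for (v1, q1), (v2, q2) in zip(CURVE, CURVE[1:]):
                if v1 <= v_pu <= v2:
                    return q1 + (q2 - q1) * (v_pu - v1) / (v2 - v1)

        for v in (0.90, 0.95, 1.00, 1.04, 1.10):
            print(f"V = {v:.2f} pu -> Q = {volt_var(v):+.2f} of rated")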

  20. Subgroup analyses in confirmatory clinical trials: time to be specific about their purposes.

    PubMed

    Tanniou, Julien; van der Tweel, Ingeborg; Teerenstra, Steven; Roes, Kit C B

    2016-02-18

    It is well recognized that treatment effects may not be homogeneous across the study population. Subgroup analyses constitute a fundamental step in the assessment of evidence from confirmatory (Phase III) clinical trials, where conclusions for the overall study population might not hold. Subgroup analyses can have different and distinct purposes, requiring specific design and analysis solutions. It is relevant to evaluate methodological developments in subgroup analyses against these purposes to guide health care professionals and regulators as well as to identify gaps in current methodology. We defined four purposes for subgroup analyses: (1) Investigate the consistency of treatment effects across subgroups of clinical importance, (2) Explore the treatment effect across different subgroups within an overall non-significant trial, (3) Evaluate safety profiles limited to one or a few subgroup(s), (4) Establish efficacy in the targeted subgroup when included in a confirmatory testing strategy of a single trial. We reviewed the methodology in line with this "purpose-based" framework. The review covered papers published between January 2005 and April 2015 and aimed to classify them in none, one or more of the aforementioned purposes. In total 1857 potentially eligible papers were identified. Forty-eight papers were selected and 20 additional relevant papers were identified from their references, leading to 68 papers in total. Nineteen were dedicated to purpose 1, 16 to purpose 4, one to purpose 2 and none to purpose 3. Seven papers were dedicated to more than one purpose, the 25 remaining could not be classified unambiguously. Purposes of the methods were often not specifically indicated, methods for subgroup analysis for safety purposes were almost absent and a multitude of diverse methods were developed for purpose (1). It is important that researchers developing methodology for subgroup analysis explicitly clarify the objectives of their methods in terms that can be understood from a patient's, health care provider's and/or regulator's perspective. A clear operational definition for consistency of treatment effects across subgroups is lacking, but is needed to improve the usability of subgroup analyses in this setting. Finally, methods to particularly explore benefit-risk systematically across subgroups need more research.

  1. Preparative Purification of Recombinant Proteins: Current Status and Future Trends

    PubMed Central

    Saraswat, Mayank; Ravidá, Alessandra; Holthofer, Harry

    2013-01-01

    Advances in fermentation technologies have resulted in the production of increased yields of proteins of economic, biopharmaceutical, and medicinal importance. Consequently, there is an absolute requirement for the development of rapid, cost-effective methodologies which facilitate the purification of such products in the absence of contaminants, such as superfluous proteins and endotoxins. Here, we provide a comprehensive overview of a selection of key purification methodologies currently being applied in both academic and industrial settings and discuss how innovative and effective protocols such as aqueous two-phase partitioning, membrane chromatography, and high-performance tangential flow filtration may be applied independently of or in conjunction with more traditional protocols for downstream processing applications. PMID:24455685

  2. The Methodology for Developing Mobile Agent Application for Ubiquitous Environment

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Kazutaka; Yoshioka, Nobukazu; Honiden, Shinichi

    This study provides a methodology that enables flexible and reusable development of mobile agent applications for mobility-aware indoor environments. The methodology, named the Workflow-Awareness model, is based on the concept of a pair of mobile agents cooperating to perform a given task. A monolithic mobile agent application with numerous concerns in a mobility-aware setting is divided into a master agent (MA) and a shadow agent (SA) according to the type of task. The MA executes the main application logic, which includes monitoring the user's physical movement and coordinating various services. The SA performs additional, environment-dependent tasks to help the MA achieve efficient execution without losing application logic. "Workflow-awareness" (WFA) means that the SA knows the MA's execution state transitions, so the SA can provide the proper task at the proper time. A prototype implementation of the methodology was built with AspectJ, which is used to automate WFA by weaving communication modules into both the MA and the SA. The usefulness of the methodology with respect to efficiency and software engineering is analyzed. Regarding efficiency, the overhead of WFA is small relative to the total execution time. From a software engineering perspective, WFA provides a mechanism to deploy one application in various situations.

  3. Prognostics and health management design for rotary machinery systems—Reviews, methodology and applications

    NASA Astrophysics Data System (ADS)

    Lee, Jay; Wu, Fangji; Zhao, Wenyu; Ghaffari, Masoud; Liao, Linxia; Siegel, David

    2014-01-01

    Much research has been conducted in prognostics and health management (PHM), an emerging field in mechanical engineering that is gaining interest from both academia and industry. Most of these efforts have been in the area of machinery PHM, resulting in the development of many algorithms for this particular application. The majority of these algorithms concentrate on applications involving common rotary machinery components, such as bearings and gears. Knowledge of this prior work is a necessity for any future research efforts to be conducted; however, there has not been a comprehensive overview that details previous and on-going efforts in PHM. In addition, a systematic method for developing and deploying a PHM system has yet to be established. Such a method would enable rapid customization and integration of PHM systems for diverse applications. To address these gaps, this paper provides a comprehensive review of the PHM field, followed by an introduction of a systematic PHM design methodology, 5S methodology, for converting data to prognostics information. This methodology includes procedures for identifying critical components, as well as tools for selecting the most appropriate algorithms for specific applications. Visualization tools are presented for displaying prognostics information in an appropriate fashion for quick and accurate decision making. Industrial case studies are included in this paper to show how this methodology can help in the design of an effective PHM system.

  4. Remedial Action Assessment System: A computer-based methodology for conducting feasibility studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, M.K.; Buelt, J.L.; Stottlemyre, J.A.

    1991-02-01

    Because of the complexity and number of waste sites facing the US Department of Energy (DOE) for potential cleanup, DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process. The Remedial Action Assessment System (RAAS) can be used for screening, linking, and evaluating established technology processes in support of conducting feasibility studies. It is also intended to do the same in support of corrective measures studies. The user interface employs menus, windows, help features, and graphical information while RAAS is in operation. Object-oriented programming is used to link unit processes into sets of compatible processes that form appropriate remedial alternatives. Once the remedial alternatives are formed, the RAAS methodology can evaluate them in terms of effectiveness, implementability, and cost. RAAS will access a user-selected risk assessment code to determine the reduction of risk after remedial action by each recommended alternative. The methodology will also help determine the implementability of the remedial alternatives at a site and access cost estimating tools to provide estimates of capital, operating, and maintenance costs. This paper presents the characteristics of two RAAS prototypes currently being developed. These include the RAAS Technology Information System, which accesses graphical, tabular and textual information about technologies, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab.

  5. Development and Attestation of Gamma-Ray Measurement Methodologies for use by Rostekhnadzor Inspectors in the Russian Federation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Sanders

    2006-09-01

    Development and attestation of gamma-ray non-destructive assay measurement methodologies for use by inspectors of the Russian Federal Service for Environmental, Technological, and Nuclear Oversight (Rostekhnadzor, formerly Gosatomnadzor or GAN), as well as for use by Russian nuclear facilities, has been completed. Specifically, a methodology utilizing the gamma-ray multi-group analysis (MGA) method for determining plutonium isotopic composition has been developed, while existing methodologies for determining uranium enrichment and isotopic composition have been revised to make them more appropriate to the material types and conditions present in nuclear facilities in the Russian Federation. This paper will discuss the development and revision of these methodologies, the metrological characteristics of the final methodologies, as well as the limitations and concerns specific to the utilization of these analysis methods in the Russian Federation.

  6. Experiences of Structured Elicitation for Model-Based Cost-Effectiveness Analyses.

    PubMed

    Soares, Marta O; Sharples, Linda; Morton, Alec; Claxton, Karl; Bojke, Laura

    2018-06-01

    Empirical evidence supporting the cost-effectiveness estimates of particular health care technologies may be limited, or it may even be missing entirely. In these situations, additional information, often in the form of expert judgments, is needed to reach a decision. There are formal methods to quantify experts' beliefs, termed structured expert elicitation (SEE), but only limited research is available in support of methodological choices. Perhaps as a consequence, the use of SEE in the context of cost-effectiveness modelling is limited. This article reviews applications of SEE in cost-effectiveness modelling with the aim of summarizing the basis for the methodological choices made in each application and recording the difficulties and challenges reported by the authors in the design, conduct, and analyses. The methods used in each application were extracted along with the criteria used to support methodological and practical choices and any issues or challenges discussed in the text. Issues and challenges were extracted using an open field, and then categorised and grouped for reporting. The review demonstrates considerable heterogeneity in the methods used, and authors acknowledge great methodological uncertainty in justifying their choices. Specificities of the context area that emerge as potentially important in determining further methodological research in elicitation are between-expert variation and its interpretation, the fact that substantive experts in the area may not be trained in quantitative subjects, that judgments are often needed on various parameter types, the need for some form of assessment of validity, and the need for more integration with behavioural research to devise relevant debiasing strategies. This review of experiences of SEE highlights a number of specificities and constraints that can shape the development of guidance and target future research efforts in this area. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  7. Screening Methodologies to Support Risk and Technology ...

    EPA Pesticide Factsheets

    The Clean Air Act establishes a two-stage regulatory process for addressing emissions of hazardous air pollutants (HAPs) from stationary sources. In the first stage, the Act requires the EPA to develop technology-based standards for categories of industrial sources. We have largely completed the required “Maximum Achievable Control Technology” (MACT) standards. In the second stage of the regulatory process, EPA must review each MACT standard at least every eight years and revise them as necessary, “taking into account developments in practices, processes and control technologies.” We call this requirement the “technology review.” EPA is also required to complete a one-time assessment of the health and environmental risks that remain after sources come into compliance with MACT. This residual risk review also must be done within 8 years of setting the initial MACT standard. If additional risk reductions are necessary to protect public health with an ample margin of safety or to prevent adverse environmental effects, EPA must develop standards to address these remaining risks. Because the risk review is an important component of the RTR process, EPA is seeking SAB input on the scientific credibility of specific enhancements made to our risk assessment methodologies, particularly with respect to screening methodologies, since the last SAB review was completed in 2010. These enhancements to our risk methodologies are outlined in the document title

  8. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
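    The abstract contrasts the traditional nested RBDO formulation with the proposed unilevel and decoupled variants but does not reproduce the formulations. For orientation, the sketch below shows the nested structure on a toy problem: an outer design optimization whose probabilistic constraint is evaluated by an inner reliability analysis. The limit state, distributions, costs, and target failure probability are all invented for illustration and are not taken from this work.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def failure_probability(d):
    """Inner reliability analysis for a toy limit state g = capacity - demand,
    with capacity = d0 * d1 and demand ~ N(1.5, 0.3). Because g is linear in a
    single normal variable, P_f has a closed form here; a real RBDO problem
    would use FORM or sampling at this step instead."""
    mu_demand, sigma_demand = 1.5, 0.3
    return float(norm.cdf((mu_demand - d[0] * d[1]) / sigma_demand))

def cost(d):
    """Outer objective: illustrative material cost."""
    return d[0] + 2.0 * d[1]

p_target = 1e-2  # allowable failure probability (illustrative)

# Nested formulation: every outer iterate triggers an inner reliability evaluation.
result = minimize(
    cost,
    x0=np.array([1.5, 1.5]),
    bounds=[(0.5, 5.0), (0.5, 5.0)],
    constraints=[{"type": "ineq",
                  "fun": lambda d: p_target - failure_probability(d)}],
    method="SLSQP",
)
print("optimal design:", result.x)
print("failure probability at optimum:", failure_probability(result.x))
```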

  9. TRENDS IN RURAL SULFUR CONCENTRATIONS

    EPA Science Inventory

    As the focus of environmental management has shifted toward regional-scale strategies, there is a growing need to develop statistical methodology for the estimation of regional trends in air pollution. This information is critical to assessing the effects of legislated emission ...

  10. GENETIC ACTIVITY PROFILES AND HAZARD ASSESSMENT

    EPA Science Inventory

    A methodology has been developed to display and evaluate multiple test quantitative information on genetic toxicants for purposes of hazard/risk assessment. Dose information is collected from the open literature: either the lowest effective dose (LED) or the highest ineffective do...

  11. Analysis of effects of impurities intentionally incorporated into silicon

    NASA Technical Reports Server (NTRS)

    Uno, F.

    1977-01-01

    A methodology was developed and implemented to allow silicon samples containing intentionally incorporated impurities to be fabricated into finished solar cells under carefully controlled conditions. The electrical and spectral properties were then measured for each group processed.

  12. Managing In-House Development of a Campus-Wide Information System

    ERIC Educational Resources Information Center

    Shurville, Simon; Williams, John

    2005-01-01

    Purpose: To show how a combination of hard and soft project and change management methodologies guided successful in-house development of a campus-wide information system. Design/methodology/approach: A case study of the methodologies and management structures that guided the development is presented. Findings: Applying a combination of the…

  13. Predictive Model and Methodology for Heat Treatment Distortion Final Report CRADA No. TC-298-92

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikkel, D. J.; McCabe, J.

    This project was a multi-lab, multi-partner CRADA involving LLNL, Los Alamos National Laboratory, Sandia National Laboratories, Oak Ridge National Laboratory, Martin Marietta Energy Systems and the industrial partner, The National Center of Manufacturing Sciences (NCMS). A number of member companies of NCMS participated, including General Motors Corporation, Ford Motor Company, The Torrington Company, Gear Research, the Illinois Institute of Technology Research Institute, and Deformation Control Technology. LLNL was the lead laboratory for the metrology technology used for validation of the computational tool/methodology. LLNL was also the lead laboratory for the development of the software user interface for the computational tool. This report focuses on the participation of LLNL and NCMS. The purpose of the project was to develop a computational tool/methodology that engineers would use to predict the effects of heat treatment on the size and shape of industrial parts made of quench-hardenable alloys. Initially, the target application of the tool was gears for automotive power trains.

  14. [Reconsidering children's dreams. A critical review of methods and results in developmental dream research from Freud to contemporary works].

    PubMed

    Sándor, Piroska; Bódizs, Róbert

    2014-01-01

    Examining children's dream development is a significant challenge for researchers. Results from studies on children's dreaming may enlighten us on the nature and role of dreaming as well as broaden our knowledge of consciousness and cognitive development. This review summarizes the main questions and historical progress in developmental dream research, with the aim of shedding light on the advantages, disadvantages and effects of different settings and methods on research outcomes. A typical example would be the dreams of 3 to 5 year-olds: they are simple and static, with a relative absence of emotions and active self participation according to laboratory studies; studies using different methodology however found them to be vivid, rich in emotions, with the self as an active participant. Questions about the validity of different methods arise, and are considered within this review. Given that methodological differences can result in highly divergent outcomes, it is strongly recommended for future research to select methodology and treat results more carefully.

  15. Systematic and progressive implementation of the centers of excellence for rheumatoid arthritis: a methodological proposal.

    PubMed

    Santos-Moreno, Pedro; Caballero-Uribe, Carlo V; Massardo, Maria Loreto; Maldonado, Claudio Galarza; Soriano, Enrique R; Pineda, Carlos; Cardiel, Mario; Benavides, Juan Alberto; Beltrán, Paula Andrea

    2017-12-01

    The implementation of centers of excellence for specific diseases has been gaining recognition in the health field. In rheumatoid arthritis specifically, where prognosis depends on early diagnosis and timely intervention, health services must be delivered in an environment of quality, timeliness, and safety with the highest standards of care. A methodology that makes this implementation achievable by most care centers is therefore a priority for improving care of populations with this disease. In this paper, we propose a systematic and progressive methodology that will help institutions develop successful models without faltering in the process. The expected impact on public health is better effective coverage of high-quality treatments, yielding better health outcomes with safety and accessibility while reducing the budgetary impact on the health systems of our countries.

  16. Solution- and solid-phase parallel synthesis of 4-alkoxy-substituted pyrimidines with high molecular diversity.

    PubMed

    Font, David; Heras, Montserrat; Villalgordo, José M

    2003-01-01

    A simple and straightforward methodology for the synthesis of novel 2,6-disubstituted-4-alkoxypyrimidine derivatives of type 16 and 19 has been developed. This methodology, initially developed in solution, can be readily adapted to solid support under analogous conditions, taking full advantage of automated parallel synthesis systems. The methodology benefits from the key role played by the thioether linkage at the 2-position in 3, 9, or 13 in two ways: on the one hand, the steric effect exerted by the thioether linkage is likely responsible for the very high observed selectivity toward the formation of the O-alkylation products. On the other hand, this sulfur linkage can serve not only as a robust point of attachment for the heterocycle, stable to a number of reaction conditions, but also as a means of introducing a new element of diversity through activation to the corresponding sulfone (safety-catch linker concept) and subsequent ipso-substitution reaction with a variety of different N-nucleophiles.

  17. NASA Handbook for Spacecraft Structural Dynamics Testing

    NASA Technical Reports Server (NTRS)

    Kern, Dennis L.; Scharton, Terry D.

    2005-01-01

    Recent advances in the area of structural dynamics and vibrations, in both methodology and capability, have the potential to make spacecraft system testing more effective from technical, cost, schedule, and hardware safety points of view. However, application of these advanced test methods varies widely among the NASA Centers and their contractors. Identification and refinement of the best of these test methodologies and implementation approaches has been an objective of efforts by the Jet Propulsion Laboratory on behalf of the NASA Office of the Chief Engineer. But to develop the most appropriate overall test program for a flight project from the selection of advanced methodologies, as well as conventional test methods, spacecraft project managers and their technical staffs will need overall guidance and technical rationale. Thus, the Chief Engineer's Office has recently tasked JPL to prepare a NASA Handbook for Spacecraft Structural Dynamics Testing. An outline of the proposed handbook, with a synopsis of each section, has been developed and is presented herein. Comments on the proposed handbook are solicited from the spacecraft structural dynamics testing community.

  18. NASA Handbook for Spacecraft Structural Dynamics Testing

    NASA Technical Reports Server (NTRS)

    Kern, Dennis L.; Scharton, Terry D.

    2004-01-01

    Recent advances in the area of structural dynamics and vibrations, in both methodology and capability, have the potential to make spacecraft system testing more effective from technical, cost, schedule, and hardware safety points of view. However, application of these advanced test methods varies widely among the NASA Centers and their contractors. Identification and refinement of the best of these test methodologies and implementation approaches has been an objective of efforts by the Jet Propulsion Laboratory on behalf of the NASA Office of the Chief Engineer. But to develop the most appropriate overall test program for a flight project from the selection of advanced methodologies, as well as conventional test methods, spacecraft project managers and their technical staffs will need overall guidance and technical rationale. Thus, the Chief Engineer's Office has recently tasked JPL to prepare a NASA Handbook for Spacecraft Structural Dynamics Testing. An outline of the proposed handbook, with a synopsis of each section, has been developed and is presented herein. Comments on the proposed handbook are solicited from the spacecraft structural dynamics testing community.

  19. A multi-scale modelling procedure to quantify hydrological impacts of upland land management

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.; Jackson, B.; Bulygina, N.; Ballard, C.; McIntyre, N.; Marshall, M.; Frogbrook, Z.; Solloway, I.; Reynolds, B.

    2008-12-01

    Recent UK floods have focused attention on the effects of agricultural intensification on flood risk. However, quantification of these effects raises important methodological issues. Catchment-scale data have proved inadequate to support analysis of impacts of land management change, due to climate variability, uncertainty in input and output data, spatial heterogeneity in land use and lack of data to quantify historical changes in management practices. Manipulation experiments to quantify the impacts of land management change have necessarily been limited and small scale, and in the UK mainly focused on the lowlands and arable agriculture. There is a need to develop methods to extrapolate from small scale observations to predict catchment-scale response, and to quantify impacts for upland areas. With assistance from a cooperative of Welsh farmers, a multi-scale experimental programme has been established at Pontbren, in mid-Wales, an area of intensive sheep production. The data have been used to support development of a multi-scale modelling methodology to assess impacts of agricultural intensification and the potential for mitigation of flood risk through land use management. Data are available from replicated experimental plots under different land management treatments, from instrumented field and hillslope sites, including tree shelter belts, and from first and second order catchments. Measurements include climate variables, soil water states and hydraulic properties at multiple depths and locations, tree interception, overland flow and drainflow, groundwater levels, and streamflow from multiple locations. Fine resolution physics-based models have been developed to represent soil and runoff processes, conditioned using experimental data. The detailed models are used to calibrate simpler 'meta-models' to represent individual hydrological elements, which are then combined in a semi-distributed catchment-scale model. The methodology is illustrated using field and catchment-scale simulations to demonstrate the response of improved and unimproved grassland, and the potential effects of land management interventions, including farm ponds, tree shelter belts and buffer strips. It is concluded that the methodology developed has the potential to represent and quantify catchment-scale effects of upland management; continuing research is extending the work to a wider range of upland environments and land use types, with the aim of providing generic simulation tools that can be used to provide strategic policy guidance.

  20. Task III: Development of an Effective Computational Methodology for Body Force Representation of High-speed Rotor 37

    NASA Technical Reports Server (NTRS)

    Tan, Choon-Sooi; Suder, Kenneth (Technical Monitor)

    2003-01-01

    A framework for an effective computational methodology for characterizing the stability and the impact of distortion in high-speed multi-stage compressors is being developed. The methodology consists of using a few isolated-blade-row Navier-Stokes solutions for each blade row to construct a body force database. The purpose of the body force database is to replace each blade row in a multi-stage compressor by a body force distribution that produces the same pressure rise and flow turning. To do this, each body force database is generated in such a way that it can respond to changes in local flow conditions. Once the database is generated, no further Navier-Stokes computations are necessary. The process is repeated for every blade row in the multi-stage compressor. The body forces are then embedded as source terms in an Euler solver. The method is developed to have the capability to compute the performance in a flow that has radial as well as circumferential non-uniformity with a length scale larger than a blade pitch; thus it can potentially be used to characterize the stability of a compressor under design. It is these two latter features, as well as the accompanying procedure to obtain the body force representation, that distinguish the present methodology from the streamline curvature method. The overall computational procedures have been developed. A dimensional analysis was carried out to determine the local flow conditions for parameterizing the magnitudes of the local body force representation of blade rows. An Euler solver was modified to embed the body forces as source terms. The results from the dimensional analysis show that the body forces can be parameterized in terms of the two relative flow angles, the relative Mach number, and the Reynolds number. For flow in a high-speed transonic blade row, they can be parameterized in terms of the local relative Mach number alone.
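    The key mechanism described above is that blade-row body forces are tabulated from a few isolated-blade-row Navier-Stokes solutions, parameterized by local flow conditions (for the transonic rotor, essentially the local relative Mach number alone), and then added as source terms in an Euler solver. The report gives no numerical data, so the table values and force scaling below are invented; the sketch only illustrates the run-time lookup that such a parameterization implies.

```python
import numpy as np

# Hypothetical database built from a handful of isolated-blade-row
# Navier-Stokes solutions: local relative Mach number -> nondimensional
# body-force coefficient. The values are invented for illustration.
mach_table = np.array([0.6, 0.8, 1.0, 1.2, 1.4])
force_coeff_table = np.array([0.12, 0.18, 0.27, 0.22, 0.15])

def body_force_source(mach_rel, rho, u_rel):
    """Interpolate the tabulated coefficient at the local relative Mach number
    and rescale it dimensionally; the result would be added as a momentum
    source term in each Euler cell occupied by the blade row."""
    c_f = np.interp(mach_rel, mach_table, force_coeff_table)
    return c_f * rho * u_rel**2   # illustrative scaling, not the report's

if __name__ == "__main__":
    print(body_force_source(mach_rel=0.9, rho=1.1, u_rel=280.0))
```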

  1. Detecting and correcting for publication bias in meta-analysis - A truncated normal distribution approach.

    PubMed

    Zhu, Qiaohao; Carriere, K C

    2016-01-01

    Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusions about a research question from independent studies. Most research on the detection and correction of publication bias in meta-analysis focuses mainly on funnel plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem and propose new parametric solutions. We develop methodologies for estimating the underlying overall effect size and the severity of publication bias. We distinguish two major situations in which publication bias may be induced by: (1) small effect size or (2) large p-value. We consider both fixed and random effects models, and derive estimators for the overall mean and the truncation proportion. These estimators are obtained using maximum likelihood estimation and the method of moments under fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology and to compare it with the non-parametric Trim and Fill method based on the funnel plot. We find that our methods based on the truncated normal distribution perform consistently well, both in detecting and correcting publication bias under various situations.
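    The fixed-effect case described above treats each published effect estimate as a draw from a normal distribution truncated by the publication mechanism. The authors' exact parameterization is not reproduced here; the sketch below illustrates the general idea for situation (1), truncation by small effect size, assuming a known publication cutoff c. The data, cutoff, and standard errors are simulated for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(1)

# Simulate a meta-analysis with publication bias: true effect 0.3, and
# studies whose estimate falls below the cutoff c are never published.
theta_true, c, n_studies = 0.3, 0.1, 400
se = rng.uniform(0.1, 0.3, size=n_studies)
y = rng.normal(theta_true, se)
published = y > c
y_pub, se_pub = y[published], se[published]

def neg_loglik(theta):
    """Fixed-effect truncated-normal likelihood for the published studies:
    each y_i ~ N(theta, se_i^2) conditioned on y_i > c."""
    z = (y_pub - theta) / se_pub
    log_dens = norm.logpdf(z) - np.log(se_pub)
    log_norm = norm.logsf((c - theta) / se_pub)   # log P(y_i > c)
    return -np.sum(log_dens - log_norm)

fit = minimize_scalar(neg_loglik, bounds=(-1.0, 1.0), method="bounded")
print("naive mean of published effects:", y_pub.mean())
print("truncation-corrected MLE of the overall effect:", fit.x)
```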

  2. Development of a Quantitative Methodology to Assess the Impacts of Urban Transport Interventions and Related Noise on Well-Being

    PubMed Central

    Braubach, Matthias; Tobollik, Myriam; Mudu, Pierpaolo; Hiscock, Rosemary; Chapizanis, Dimitris; Sarigiannis, Denis A.; Keuken, Menno; Perez, Laura; Martuzzi, Marco

    2015-01-01

    Well-being impact assessments of urban interventions are a difficult challenge, as there is no agreed methodology and scarce evidence on the relationship between environmental conditions and well-being. The European Union (EU) project “Urban Reduction of Greenhouse Gas Emissions in China and Europe” (URGENCHE) explored a methodological approach to assess traffic noise-related well-being impacts of transport interventions in three European cities (Basel, Rotterdam and Thessaloniki) linking modeled traffic noise reduction effects with survey data indicating noise-well-being associations. Local noise models showed a reduction of high traffic noise levels in all cities as a result of different urban interventions. Survey data indicated that perception of high noise levels was associated with lower probability of well-being. Connecting the local noise exposure profiles with the noise-well-being associations suggests that the urban transport interventions may have a marginal but positive effect on population well-being. This paper also provides insight into the methodological challenges of well-being assessments and highlights the range of limitations arising from the current lack of reliable evidence on environmental conditions and well-being. Due to these limitations, the results should be interpreted with caution. PMID:26016437

  3. Development of a quantitative methodology to assess the impacts of urban transport interventions and related noise on well-being.

    PubMed

    Braubach, Matthias; Tobollik, Myriam; Mudu, Pierpaolo; Hiscock, Rosemary; Chapizanis, Dimitris; Sarigiannis, Denis A; Keuken, Menno; Perez, Laura; Martuzzi, Marco

    2015-05-26

    Well-being impact assessments of urban interventions are a difficult challenge, as there is no agreed methodology and scarce evidence on the relationship between environmental conditions and well-being. The European Union (EU) project "Urban Reduction of Greenhouse Gas Emissions in China and Europe" (URGENCHE) explored a methodological approach to assess traffic noise-related well-being impacts of transport interventions in three European cities (Basel, Rotterdam and Thessaloniki) linking modeled traffic noise reduction effects with survey data indicating noise-well-being associations. Local noise models showed a reduction of high traffic noise levels in all cities as a result of different urban interventions. Survey data indicated that perception of high noise levels was associated with lower probability of well-being. Connecting the local noise exposure profiles with the noise-well-being associations suggests that the urban transport interventions may have a marginal but positive effect on population well-being. This paper also provides insight into the methodological challenges of well-being assessments and highlights the range of limitations arising from the current lack of reliable evidence on environmental conditions and well-being. Due to these limitations, the results should be interpreted with caution.

  4. Regional Shelter Analysis Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillon, Michael B.; Dennison, Deborah; Kane, Jave

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country-specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
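    The aggregation step described in the report combines building-specific protection factors with the distribution of people across building types and postures. The numbers below are invented placeholders rather than values from the report; the sketch only shows the population-weighted aggregation that such an analysis implies.

```python
# Illustrative Regional Shelter Analysis aggregation (all numbers invented).
# Protection factor PF = outside dose / inside dose, so dose fraction = 1 / PF.
building_mix = [
    # (fraction of population, protection factor) for a hypothetical daytime posture
    (0.35, 3.0),    # light residential
    (0.40, 10.0),   # multi-storey offices / schools
    (0.15, 40.0),   # large concrete buildings, interior locations
    (0.10, 1.0),    # outdoors / unsheltered
]

def regional_dose_fraction(mix):
    """Population-weighted fraction of the unsheltered dose actually received."""
    assert abs(sum(f for f, _ in mix) - 1.0) < 1e-9
    return sum(f / pf for f, pf in mix)

outdoor_dose = 4.0  # Gy, hypothetical unsheltered fallout dose at one location
effective = outdoor_dose * regional_dose_fraction(building_mix)
print(f"effective regional dose: {effective:.2f} Gy")
```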

  5. Influence of the Participatory Budgeting on the Infrastructural Development of the Territories in the Russian Federation

    ERIC Educational Resources Information Center

    Tsurkan, Marina V.; Sotskova, Svetlana I.; Aksinina, Olga S.; Lyubarskaya, Maria A.; Tkacheva, Oksana N.

    2016-01-01

    The relevance of the investigated problem is caused by the need for the advancing of participatory budgeting practice in the Russian Federation. Due to insufficient development of theoretical, scientific, and methodological aspects of the participatory budgeting, very few territories in the Russian Federation use this tool effectively. The most…

  6. Effective Learning Systems through Blended Teaching Modules in Adult Secondary Education Systems in Developing Nations: Need for Partnership

    ERIC Educational Resources Information Center

    Ike, Eucharia; Okechukwu, Ibeh Bartholomew

    2015-01-01

    We investigated methodological lessons in randomly selected adult secondary schools to construct a case for international partnership while examining education development in Nigeria. Standard database and web-based searches were conducted for publications between 1985 and 2012 on learning systems. This paper presents its absence and finds a heavy…

  7. 40 CFR Appendix C to Part 132 - Great Lakes Water Quality Initiative Methodologies for Development of Human Health Criteria and...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... organisms where higher doses or concentrations resulted in an adverse effect. Quantitative structure... probable or possible human carcinogen, when, because of major qualitative or quantitative limitations, the... quantitative risk assessment, but for which data are inadequate for Tier I criterion development due to a tumor...

  8. 40 CFR Appendix C to Part 132 - Great Lakes Water Quality Initiative Methodologies for Development of Human Health Criteria and...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... organisms where higher doses or concentrations resulted in an adverse effect. Quantitative structure... probable or possible human carcinogen, when, because of major qualitative or quantitative limitations, the... quantitative risk assessment, but for which data are inadequate for Tier I criterion development due to a tumor...

  9. 40 CFR Appendix C to Part 132 - Great Lakes Water Quality Initiative Methodologies for Development of Human Health Criteria and...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... organisms where higher doses or concentrations resulted in an adverse effect. Quantitative structure... probable or possible human carcinogen, when, because of major qualitative or quantitative limitations, the... quantitative risk assessment, but for which data are inadequate for Tier I criterion development due to a tumor...

  10. Equipped for Change: Development and Implementation of a Case Statement at an Urban Community College

    ERIC Educational Resources Information Center

    Krishnan, Sathasivam

    2010-01-01

    This action research study examined the process of creation and implementation of a case statement for an urban community college foundation. An instrumental case study methodology was used in examining this process. The study chronicled a successful participatory development process that allowed a number of stakeholders to effectively work on…

  11. [The methods of assessment of health risk from exposure to radon and radon daughters].

    PubMed

    Demin, V F; Zhukovskiy, M V; Kiselev, S M

    2014-01-01

    A critical analysis of existing dose-effect relationship (RDE) models for the effect of radon exposure on human health has been performed. A conclusion about the necessity and possibility of improving these models has been drawn. A new, improved version of the RDE has been developed. A technique for assessing the human health risk of exposure to radon is described, including a method for estimating radon exposure doses, an improved RDE model, and a proper risk assessment methodology. The methodology is proposed for use in the territory of Russia.

  12. The state of research on the effects of therapeutic touch.

    PubMed

    Easter, A

    1997-06-01

    Therapeutic Touch is investigated using an integrative review of the literature. Using Ganong's (1987) methodology, the article explores the research question "What is the state of development of research regarding Therapeutic Touch?" by analyzing primary research reports from 23 articles in 14 refereed journals. The findings of the review indicate positive regard for the use of Therapeutic Touch. All research points to the need for further study in this area. Research methods used are satisfactory, but more rigorous methodologies would promote a more scientific contribution to the body of literature on Therapeutic Touch.

  13. Apparatus and methodology for fire gas characterization by means of animal exposure

    NASA Technical Reports Server (NTRS)

    Marcussen, W. H.; Hilado, C. J.; Furst, A.; Leon, H. A.; Kourtides, D. A.; Parker, J. A.; Butte, J. C.; Cummins, J. M.

    1976-01-01

    While there is a great deal of information available from small-scale laboratory experiments and for relatively simple mixtures of gases, considerable uncertainty exists regarding appropriate bioassay techniques for the complex mixture of gases generated in full-scale fires. Apparatus and methodology have been developed, based on the current state of the art, for determining the effects on laboratory animals of fire gases in the critical first 10 minutes of a full-scale fire. This information is presented for its potential value and use while further improvements are being made.

  14. Multiphysics Thrust Chamber Modeling for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Cheng, Gary; Chen, Yen-Sen

    2006-01-01

    The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for a solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation. A two-pronged approach is employed in this effort: A detailed thermo-fluid analysis on a multi-channel flow element for mid-section corrosion investigation; and a global modeling of the thrust chamber to understand the effect of heat transfer on thrust performance. Preliminary results on both aspects are presented.

  15. An Alternative Methodological Approach for Cost-Effectiveness Analysis and Decision Making in Genomic Medicine.

    PubMed

    Fragoulakis, Vasilios; Mitropoulou, Christina; van Schaik, Ron H; Maniadakis, Nikolaos; Patrinos, George P

    2016-05-01

    Genomic Medicine aims to improve therapeutic interventions and diagnostics, the quality of life of patients, but also to rationalize healthcare costs. To reach this goal, careful assessment and identification of evidence gaps for public health genomics priorities are required so that a more efficient healthcare environment is created. Here, we propose a public health genomics-driven approach to adjust the classical healthcare decision making process with an alternative methodological approach of cost-effectiveness analysis, which is particularly helpful for genomic medicine interventions. By combining classical cost-effectiveness analysis with budget constraints, social preferences, and patient ethics, we demonstrate the application of this model, the Genome Economics Model (GEM), based on a previously reported genome-guided intervention from a developing country environment. The model and the attendant rationale provide a practical guide by which all major healthcare stakeholders could ensure the sustainability of funding for genome-guided interventions, their adoption and coverage by health insurance funds, and prioritization of Genomic Medicine research, development, and innovation, given the restriction of budgets, particularly in developing countries and low-income healthcare settings in developed countries. The implications of the GEM for the policy makers interested in Genomic Medicine and new health technology and innovation assessment are also discussed.
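    The Genome Economics Model layers budget constraints, social preferences, and patient ethics onto classical cost-effectiveness analysis; only the classical core is sketched here. The code computes an incremental cost-effectiveness ratio for a hypothetical genome-guided intervention and checks it against a willingness-to-pay threshold and a fixed budget; all figures, thresholds, and variable names are invented for illustration.

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical per-patient figures for a genome-guided vs standard strategy.
cost_genomic, qaly_genomic = 4_200.0, 6.10
cost_standard, qaly_standard = 3_000.0, 6.02

wtp_threshold = 20_000.0      # willingness to pay per QALY (illustrative)
annual_budget = 1_500_000.0   # payer budget for the programme (illustrative)
eligible_patients = 1_000

ratio = icer(cost_genomic, qaly_genomic, cost_standard, qaly_standard)
budget_impact = eligible_patients * (cost_genomic - cost_standard)

print(f"ICER: {ratio:,.0f} per QALY gained")
print("cost-effective at threshold:", ratio <= wtp_threshold)
print("affordable within budget:", budget_impact <= annual_budget)
```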

  16. Searching for qualitative research for inclusion in systematic reviews: a structured methodological review.

    PubMed

    Booth, Andrew

    2016-05-04

    Qualitative systematic reviews or qualitative evidence syntheses (QES) are increasingly recognised as a way to enhance the value of systematic reviews (SRs) of clinical trials. They can explain the mechanisms by which interventions, evaluated within trials, might achieve their effect. They can investigate differences in effects between different population groups. They can identify which outcomes are most important to patients, carers, health professionals and other stakeholders. QES can explore the impact of acceptance, feasibility, meaningfulness and implementation-related factors within a real world setting and thus contribute to the design and further refinement of future interventions. To produce valid, reliable and meaningful QES requires systematic identification of relevant qualitative evidence. Although the methodologies of QES, including methods for information retrieval, are well-documented, little empirical evidence exists to inform their conduct and reporting. This structured methodological overview examines papers on searching for qualitative research identified from the Cochrane Qualitative and Implementation Methods Group Methodology Register and from citation searches of 15 key papers. A single reviewer reviewed 1299 references. Papers reporting methodological guidance, use of innovative methodologies or empirical studies of retrieval methods were categorised under eight topical headings: overviews and methodological guidance, sampling, sources, structured questions, search procedures, search strategies and filters, supplementary strategies and standards. This structured overview presents a contemporaneous view of information retrieval for qualitative research and identifies a future research agenda. This review concludes that poor empirical evidence underpins current information practice in information retrieval of qualitative research. A trend towards improved transparency of search methods and further evaluation of key search procedures offers the prospect of rapid development of search methods.

  17. Cognition in multiple sclerosis

    PubMed Central

    Benedict, Ralph; Enzinger, Christian; Filippi, Massimo; Geurts, Jeroen J.; Hamalainen, Paivi; Hulst, Hanneke; Inglese, Matilde; Leavitt, Victoria M.; Rocca, Maria A.; Rosti-Otajarvi, Eija M.; Rao, Stephen

    2018-01-01

    Cognitive decline is recognized as a prevalent and debilitating symptom of multiple sclerosis (MS), especially deficits in episodic memory and processing speed. The field aims to (1) incorporate cognitive assessment into standard clinical care and clinical trials, (2) utilize state-of-the-art neuroimaging to more thoroughly understand neural bases of cognitive deficits, and (3) develop effective, evidence-based, clinically feasible interventions to prevent or treat cognitive dysfunction, which are lacking. There are obstacles to these goals. Our group of MS researchers and clinicians with varied expertise took stock of the current state of the field, and we identify several important practical and theoretical challenges, including key knowledge gaps and methodologic limitations related to (1) understanding and measurement of cognitive deficits, (2) neuroimaging of neural bases and correlates of deficits, and (3) development of effective treatments. This is not a comprehensive review of the extensive literature, but instead a statement of guidelines and priorities for the field. For instance, we provide recommendations for improving the scientific basis and methodologic rigor for cognitive rehabilitation research. Toward this end, we call for multidisciplinary collaborations toward development of biologically based theoretical models of cognition capable of empirical validation and evidence-based refinement, providing the scientific context for effective treatment discovery. PMID:29343470

  18. A new hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1988-01-01

    This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.

  19. A multi-criteria analysis of options for energy recovery from municipal solid waste in India and the UK.

    PubMed

    Yap, H Y; Nixon, J D

    2015-12-01

    Energy recovery from municipal solid waste plays a key role in sustainable waste management and energy security. However, there are numerous technologies that vary in suitability for different economic and social climates. This study sets out to develop and apply a multi-criteria decision making methodology that can be used to evaluate the trade-offs between the benefits, opportunities, costs and risks of alternative energy from waste technologies in both developed and developing countries. The technologies considered are mass burn incineration, refuse derived fuel incineration, gasification, anaerobic digestion and landfill gas recovery. By incorporating qualitative and quantitative assessments, a preference ranking of the alternative technologies is produced. The effect of variations in decision criteria weightings are analysed in a sensitivity analysis. The methodology is applied principally to compare and assess energy recovery from waste options in the UK and India. These two countries have been selected as they could both benefit from further development of their waste-to-energy strategies, but have different technical and socio-economic challenges to consider. It is concluded that gasification is the preferred technology for the UK, whereas anaerobic digestion is the preferred technology for India. We believe that the presented methodology will be of particular value for waste-to-energy decision-makers in both developed and developing countries. Copyright © 2015 Elsevier Ltd. All rights reserved.
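    The ranking step described above combines qualitative and quantitative scores under benefit, opportunity, cost, and risk criteria and then varies the criteria weightings in a sensitivity analysis. The scores and weights below are invented placeholders, and a plain weighted sum stands in for the paper's actual aggregation method; the sketch only illustrates the scoring and weight-perturbation mechanics.

```python
import numpy as np

technologies = ["mass burn", "RDF incineration", "gasification",
                "anaerobic digestion", "landfill gas"]
criteria = ["benefits", "opportunities", "costs", "risks"]

# Hypothetical normalized scores in [0, 1]; higher is better (cost and risk
# scores are already inverted so that higher means preferable).
scores = np.array([
    [0.6, 0.5, 0.4, 0.5],
    [0.6, 0.6, 0.5, 0.5],
    [0.8, 0.7, 0.5, 0.4],
    [0.7, 0.6, 0.7, 0.7],
    [0.4, 0.3, 0.8, 0.6],
])
weights = np.array([0.3, 0.2, 0.3, 0.2])  # illustrative criterion weights

def rank(w):
    """Weighted-sum score per technology, sorted best first."""
    totals = scores @ (w / w.sum())
    return sorted(zip(technologies, totals), key=lambda t: -t[1])

print("baseline ranking:", rank(weights))

# Simple sensitivity analysis: perturb each weight by +/-50% and re-rank.
for i, name in enumerate(criteria):
    for factor in (0.5, 1.5):
        w = weights.copy()
        w[i] *= factor
        print(f"{name} x{factor}: top option ->", rank(w)[0][0])
```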

  20. Mechanical model of suture joints with fibrous connective layer

    NASA Astrophysics Data System (ADS)

    Miroshnichenko, Kateryna; Liu, Lei; Tsukrov, Igor; Li, Yaning

    2018-02-01

    A composite model for suture joints with a connective layer of aligned fibers embedded in soft matrix is proposed. Based on the principle of complementary virtual work, composite cylinder assemblage (CCA) approach and generalized self-consistent micro-mechanical models, a hierarchical homogenization methodology is developed to systematically quantify the synergistic effects of suture morphology and fiber orientation on the overall mechanical properties of sutures. Suture joints with regular triangular wave-form serve as an example material system to apply this methodology. Both theoretical and finite element mechanical models are developed and compared to evaluate the overall normal stiffness of sutures as a function of wavy morphology of sutures, fiber orientation, fiber volume fraction, and the mechanical properties of fibers and matrix in the interfacial layer. It is found that generally due to the anisotropy-induced coupling effects between tensile and shear deformation, the effective normal stiffness of sutures is highly dependent on the fiber orientation in the connective layer. Also, the effective shear modulus of the connective layer and the stiffness ratio between the fiber and matrix significantly influence the effects of fiber orientation. In addition, optimal fiber orientations are found to maximize the stiffness of suture joints.

  1. The temporal structure of pollution levels in developed cities.

    PubMed

    Barrigón Morillas, Juan Miguel; Ortiz-Caraballo, Carmen; Prieto Gajardo, Carlos

    2015-06-01

    Currently, the need for mobility can cause significant pollution levels in cities, with important effects on health and quality of life. Any approach to the study of urban pollution and its effects requires an analysis of spatial distribution and temporal variability. A crucial challenge is to obtain proven methodologies that improve prediction quality while saving resources in spatial and temporal sampling. This work proposes a new analytical methodology for the study of temporal structure. As a result, a model for estimating annual levels of urban traffic noise is proposed. The average errors are less than one decibel for all acoustic indicators. A new working methodology for urban noise studies has thus been initiated. Additionally, the approach has general applicability to the study of pollution impacts associated with traffic, with implications for urban design and possibly for economic and sociological aspects. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Methodology for the Elimination of Reflection and System Vibration Effects in Particle Image Velocimetry Data Processing

    NASA Technical Reports Server (NTRS)

    Bremmer, David M.; Hutcheson, Florence V.; Stead, Daniel J.

    2005-01-01

    A methodology to eliminate model reflection and system vibration effects from post processed particle image velocimetry data is presented. Reflection and vibration lead to loss of data, and biased velocity calculations in PIV processing. A series of algorithms were developed to alleviate these problems. Reflections emanating from the model surface caused by the laser light sheet are removed from the PIV images by subtracting an image in which only the reflections are visible from all of the images within a data acquisition set. The result is a set of PIV images where only the seeded particles are apparent. Fiduciary marks painted on the surface of the test model were used as reference points in the images. By locating the centroids of these marks it was possible to shift all of the images to a common reference frame. This image alignment procedure as well as the subtraction of model reflection are performed in a first algorithm. Once the images have been shifted, they are compared with a background image that was recorded under no flow conditions. The second and third algorithms find the coordinates of fiduciary marks in the acquisition set images and the background image and calculate the displacement between these images. The final algorithm shifts all of the images so that fiduciary mark centroids lie in the same location as the background image centroids. This methodology effectively eliminated the effects of vibration so that unbiased data could be used for PIV processing. The PIV data used for this work was generated at the NASA Langley Research Center Quiet Flow Facility. The experiment entailed flow visualization near the flap side edge region of an airfoil model. Commercial PIV software was used for data acquisition and processing. In this paper, the experiment and the PIV acquisition of the data are described. The methodology used to develop the algorithms for reflection and system vibration removal is stated, and the implementation, testing and validation of these algorithms are presented.
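    The two image-processing steps described above, subtracting a reflection-only image and then shifting each frame so that its fiduciary-mark centroids coincide with those of a no-flow background image, can be expressed compactly with array operations. The sketch below assumes the images are already loaded as 2-D arrays and that the fiduciary marks have been segmented into binary masks; it is a simplified illustration, not the commercial PIV software's implementation.

```python
import numpy as np
from scipy import ndimage

def remove_reflections(image, reflection_image):
    """Subtract the reflection-only image and clip negatives, leaving the
    seeded particles."""
    return np.clip(image.astype(float) - reflection_image.astype(float), 0, None)

def mark_centroid(mask):
    """Centroid (row, col) of the fiduciary-mark pixels in a binary mask."""
    return np.array(ndimage.center_of_mass(mask))

def align_to_background(image, image_mark_mask, background_mark_mask):
    """Shift the image so its mark centroid lands on the background's centroid,
    compensating for model/system vibration between frames."""
    shift = mark_centroid(background_mark_mask) - mark_centroid(image_mark_mask)
    return ndimage.shift(image, shift, order=1, mode="nearest")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.random((64, 64))
    reflection = np.zeros((64, 64)); reflection[30:34, :] = 0.8
    mark = np.zeros((64, 64), bool); mark[10:13, 10:13] = True
    mark_shifted = np.roll(mark, (2, -1), axis=(0, 1))   # simulated vibration offset
    cleaned = remove_reflections(frame + reflection, reflection)
    aligned = align_to_background(cleaned, mark_shifted, mark)
    print(aligned.shape)
```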

  3. Quantifying the influence of previously burned areas on suppression effectiveness and avoided exposure: A case study of the Las Conchas Fire

    Treesearch

    Matthew P. Thompson; Patrick Freeborn; Jon D. Rieck; Dave Calkin; Julie W. Gilbertson-Day; Mark A. Cochrane; Michael S. Hand

    2016-01-01

    We present a case study of the Las Conchas Fire (2011) to explore the role of previously burned areas (wildfires and prescribed fires) on suppression effectiveness and avoided exposure. Methodological innovations include characterisation of the joint dynamics of fire growth and suppression activities, development of a fire line effectiveness framework, and...

  4. A Methodology for Performing Effects-Based Assessments

    DTIC Science & Technology

    2006-03-01

    can then be used to make assessments (Smith, 2002:384). 2.8 Cause-and-Effect Diagrams. Developed by Dr. Kaoru Ishikawa in the field of Quality Control... John Wiley & Sons, Inc., 2004. Ishikawa, Kaoru. Guide to Quality Control. White Plains, NY: Asian Productivity Organization, 1989... various causes affecting product quality by sorting out and relating the causes. Cause-and-effect diagrams are sometimes called Ishikawa diagrams, or

  5. Integrating automated support for a software management cycle into the TAME system

    NASA Technical Reports Server (NTRS)

    Sunazuka, Toshihiko; Basili, Victor R.

    1989-01-01

    Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. The SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
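    The goal/question/metric (GQM) paradigm cited above structures measurement top-down: each management goal is refined into questions, and each question into the metrics that answer it. The minimal data-structure sketch below uses an invented goal, questions, and metrics; it is not drawn from the TAME or SQMAR systems.

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    metrics: list[str] = field(default_factory=list)

@dataclass
class Goal:
    purpose: str          # e.g. improve / evaluate
    issue: str            # quality focus
    obj: str              # object of study
    viewpoint: str        # whose perspective
    questions: list[Question] = field(default_factory=list)

# Invented example of a GQM refinement for a development project.
goal = Goal(
    purpose="improve", issue="defect density", obj="the coding process",
    viewpoint="project manager",
    questions=[
        Question("What is the current defect density per module?",
                 ["defects found in review / KLOC", "defects found in test / KLOC"]),
        Question("Is defect density decreasing across releases?",
                 ["trend of defects/KLOC over the last N releases"]),
    ],
)

for q in goal.questions:
    print(q.text, "->", q.metrics)
```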

  6. Airport emissions quantification: Impacts of electrification. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geba, V.

    1998-07-01

    Four airports were assessed to demonstrate that electrification of economically viable air- and land-side vehicles and equipment can significantly reduce total airport emissions. Assessments were made using the FAA's Emissions and Dispersion Modeling System and EPRI Airport Electrification Project data. Development and implementation of cost-effective airport emissions reduction strategies can be complex, requiring successful collaboration of local, state, and federal regulatory agencies with airport authorities. The methodology developed in this study helps to simplify this task. The objectives of this study were: to develop a methodology to quantify annual emissions at US airports from all sources--aircraft, vehicles, and infrastructure; and to demonstrate that electrification of economically viable air- and land-side vehicles and equipment can significantly reduce total airport emissions on-site, even when allowing for emissions from the generation of electricity.

  7. The Cost Effectiveness of the Pomona Plan.

    ERIC Educational Resources Information Center

    Metzler, Howard C.

    1979-01-01

    The cost effectiveness of the Pomona Plan, a well-established annuity and trust program, is examined. The historical development of the deferred-giving plan, its ability to elicit support, and methodologies for evaluating gift values and related costs are discussed. (Author/SF)

  8. Evaluation of Ethanol Fuel Blends in EPA MOVES2014 Model

    DOT National Transportation Integrated Search

    2016-01-01

    In this report, the methodology and prediction effects of the MOVES model development are reviewed and evaluated in relation to the use of ethanol fuel blends. Particular attention is placed on mid-level ethanol fuel blends (containing between ...

  9. Mentos and Scientific Method: A Sweet Combination

    ERIC Educational Resources Information Center

    Eichler, Jack F.; Patrick, Heather; Harmon, Brenda; Coonce, Janet

    2007-01-01

    Several active-learning techniques and inquiry-driven laboratory exercises were incorporated in labs to determine the effects of these methodologies on the fundamental skills of the students. The practice has been found extremely useful for developing the learning abilities of the students.

  10. Develop a Methodology to Evaluate the Effectiveness of QC/QA Specifications (Phase II)

    DOT National Transportation Integrated Search

    1998-08-01

    The Texas Department of Transportation (TxDOT) has been implementing statistically based quality control/quality assurance (QC/QA) specifications for hot mix asphalt concrete pavements since the early 1990s. These specifications have been continuousl...

  11. Simulation analysis of route diversion strategies for freeway incident management : final report.

    DOT National Transportation Integrated Search

    1995-02-01

    The purpose of this project was to investigate whether simulation models could : be used as decision aids for defining traffic diversion strategies for effective : incident management. A methodology was developed for using such a model to : determine...

  12. Development of Methodology for Programming Autonomous Agents

    NASA Technical Reports Server (NTRS)

    Erol, Kutluhan; Levy, Renato; Lang, Lun

    2004-01-01

    A brief report discusses the rationale for, and the development of, a methodology for generating computer code for autonomous-agent-based systems. The methodology is characterized as enabling an increase in the reusability of the generated code among and within such systems, thereby making it possible to reduce the time and cost of development of the systems. The methodology is also characterized as enabling reduction of the incidence of those software errors that are attributable to the human failure to anticipate distributed behaviors caused by the software. A major conceptual problem said to be addressed in the development of the methodology was that of how to efficiently describe the interfaces between several layers of agent composition by use of a language that is both familiar to engineers and descriptive enough to describe such interfaces unambiguously.

  13. "Pedagogic Strategies": A Conceptual Framework for Effective Parent and Practitioner Strategies When Working with Children under Five

    ERIC Educational Resources Information Center

    Lawrence, Penny; Gallagher, Tracy

    2015-01-01

    This article traces the development of adult Pedagogic Strategies with children aged 0-5 years at the Pen Green Centre for Children and Their Families in England. Pedagogical Strategies are a conceptual framework of effective strategies both practitioners and parents "already" have to support children's learning. The methodology was…

  14. Development of a methodology to measure the effect of ergot alkaloids on forestomach motility using real-time wireless telemetry

    USDA-ARS?s Scientific Manuscript database

    The objectives of these experiments were to characterize rumen motility patterns of cattle fed once daily using a real-time wireless telemetry system, determine when to measure rumen motility with this system, and determine the effect of ruminal dosing of ergot alkaloids on rumen motility. Ruminally...

  15. Teaching Competitive Intelligence Skills to North American and Overseas Audiences: A World of Difference in Pedagogical Effectiveness

    ERIC Educational Resources Information Center

    Blenkhorn, David L.; Fleisher, Craig S.

    2010-01-01

    This article contrasts teaching methodologies and pedagogical effectiveness in executive development programs delivered in North America and three diverse regions of the world. Based on the authors' collective teaching experience exceeding 40 years encompassing over 24 countries, and augmented by a review of the literature, a theoretical model is…

  16. Methods, History, Selected Findings, and Recommendations from the Louisiana School Effectiveness Study, 1980-85

    ERIC Educational Resources Information Center

    Teddlie, Charles; Stringfield, Samuel; Desselle, Stephanie

    2017-01-01

    An overview of the first five years of the Louisiana School Effectiveness Study (LSES) is described. The longitudinal nature of the study has allowed the research team to develop an evolving methodology, one benefiting from prior external studies as well as prior phases of LSES. Practical implications and recommendations for future research are…

  17. Guiding the Development and Use of Cost-Effectiveness Analysis in Education

    ERIC Educational Resources Information Center

    Levin, Henry M.; Belfield, Clive

    2015-01-01

    Cost-effectiveness analysis is rarely used in education. When it is used, it often fails to meet methodological standards, especially with regard to cost measurement. Although there are occasional criticisms of these failings, we believe that it is useful to provide a listing of the more common concerns and how they might be addressed. Based upon…

  18. Invited Reaction: Outsourcing Relationships between Firms and their Training Providers--The Role of Trust

    ERIC Educational Resources Information Center

    Leimbach, Michael P.

    2005-01-01

    Outsourcing in the training and development industry has been steadily increasing and shows no indication of slowing (Surgue & Kim, 2004). Gainey and Klaas's study shines light on the role of interfirm trust in effective outsourcing relationships. This reaction addresses a methodological question of the effect of the rating target on the results,…

  19. Learning from Abdallah: A Case Study of an Arabic-Speaking Child in a U.S. School

    ERIC Educational Resources Information Center

    Palmer, Barbara C.; El-Ashry, Fathi; Leclere, Judith T.; Chang, Sara

    2007-01-01

    Abdallah, an Arabic-speaking, Palestinian 9-year-old student, was observed as he worked to understand his new English language and culture. Some issues and questions addressed in the article include effective methodologies for the assessment of literacy development in the Arabic and English languages, effective instructional strategies to scaffold…

  20. Revisiting an Old Methodology for Teaching Counting, Computation, and Place Value: The Effectiveness of the Finger Calculation Method for At-Risk Children

    ERIC Educational Resources Information Center

    Calder Stegemann, Kim; Grünke, Matthias

    2014-01-01

    Number sense is critical to the development of higher order mathematic abilities. However, some children have difficulty acquiring these fundamental skills and the knowledge base of effective interventions/remediation is relatively limited. Based on emerging neuro-scientific research which has identified the association between finger…

  1. Effect of Participatory Research on Farmers' Knowledge and Practice of IPM: The Case of Cotton in Benin

    ERIC Educational Resources Information Center

    Togbé, Codjo Euloge; Haagsma, Rein; Aoudji, Augustin K. N.; Vodouhê, Simplice D.

    2015-01-01

    Purpose: This study assesses the effect of participatory research on farmers' knowledge and practice of Integrated Pest Management (IPM) in Benin. The participatory field experiments were carried out during the 2011-2012 cotton growing season, and focused on the development and application of pest management knowledge. Methodology: A…

  2. ON THE CONSTRUCTION OF LATIN SQUARES COUNTERBALANCED FOR IMMEDIATE SEQUENTIAL EFFECTS.

    ERIC Educational Resources Information Center

    HOUSTON, TOM R., JR.

    This report is one of a series describing new developments in the area of research methodology. It deals with Latin squares as a control for progressive and adjacency effects in experimental designs. The history of Latin squares is also reviewed, and several algorithms for the construction of Latin and Greco-Latin squares are proposed. The report…

  3. Development of a Probabilistic Tornado Wind Hazard Model for the Continental United States Volume I: Main Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boissonnade, A; Hossain, Q; Kimball, J

    Since the mid-1980s, assessment of the wind and tornado risks at the Department of Energy (DOE) high and moderate hazard facilities has been based on the straight wind/tornado hazard curves given in UCRL-53526 (Coats, 1985). These curves were developed using a methodology that utilized a model, developed by McDonald, for severe winds at sub-tornado wind speeds and a separate model, developed by Fujita, for tornado wind speeds. For DOE sites not covered in UCRL-53526, wind and tornado hazard assessments are based on the criteria outlined in DOE-STD-1023-95 (DOE, 1996), utilizing the methodology in UCRL-53526. Subsequent to the publication of UCRL-53526, in a study sponsored by the Nuclear Regulatory Commission (NRC), the Pacific Northwest Laboratory developed tornado wind hazard curves for the contiguous United States, NUREG/CR-4461 (Ramsdell, 1986). Because of the different modeling assumptions and underlying data used to develop the tornado wind information, the wind speeds at specified exceedance levels, at a given location, based on the methodology in UCRL-53526, are different than those based on the methodology in NUREG/CR-4461. In 1997, Lawrence Livermore National Laboratory (LLNL) was funded by the DOE to review the current methodologies for characterizing tornado wind hazards and to develop a state-of-the-art wind/tornado characterization methodology based on probabilistic hazard assessment techniques and current historical wind data. This report describes the process of developing the methodology and the database of relevant tornado information needed to implement the methodology. It also presents the tornado wind hazard curves obtained from the application of the method to DOE sites throughout the contiguous United States.
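
    A generic sketch of how an annual-exceedance-probability hazard curve can be assembled from simulated tornado strikes at a single site; the strike rate and intensity distribution below are placeholder assumptions for illustration, not the LLNL model or its data.

```python
# Build an annual exceedance probability curve for tornado wind speed at one site
# from a simple Monte Carlo occurrence/intensity model. All parameters are invented.
import numpy as np

rng = np.random.default_rng(0)

n_years = 100_000            # simulated years
strike_rate = 1e-3           # assumed annual probability of a tornado strike at the site

def sample_strike_speed(size):
    """Assumed intensity model: wind speed (m/s) of a striking tornado (lognormal)."""
    return rng.lognormal(mean=np.log(50.0), sigma=0.4, size=size)

strikes_per_year = rng.poisson(strike_rate, size=n_years)
annual_max = np.zeros(n_years)
hit = strikes_per_year > 0
annual_max[hit] = [sample_strike_speed(k).max() for k in strikes_per_year[hit]]

for speed in (30.0, 50.0, 70.0, 90.0, 110.0):
    exceedance = (annual_max > speed).mean()     # annual exceedance probability
    print(f"{speed:6.1f} m/s  P(exceed) = {exceedance:.2e}")
```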

  4. How Methodologic Differences Affect Results of Economic Analyses: A Systematic Review of Interferon Gamma Release Assays for the Diagnosis of LTBI

    PubMed Central

    Oxlade, Olivia; Pinto, Marcia; Trajman, Anete; Menzies, Dick

    2013-01-01

    Introduction Cost effectiveness analyses (CEA) can provide useful information on how to invest limited funds; however, they are less useful if different analyses of the same intervention provide unclear or contradictory results. The objective of our study was to conduct a systematic review of methodologic aspects of CEA that evaluate Interferon Gamma Release Assays (IGRA) for the detection of Latent Tuberculosis Infection (LTBI), in order to understand how differences affect study results. Methods A systematic review of studies was conducted with particular focus on study quality and the variability in inputs used in models used to assess cost-effectiveness. A common decision analysis model of the IGRA versus Tuberculin Skin Test (TST) screening strategy was developed and used to quantify the impact on predicted results of observed differences of model inputs taken from the studies identified. Results Thirteen studies were ultimately included in the review. Several specific methodologic issues were identified across studies, including how study inputs were selected, inconsistencies in the costing approach, the utility of the QALY (Quality Adjusted Life Year) as the effectiveness outcome, and how authors choose to present and interpret study results. When the IGRA versus TST test strategies were compared using our common decision analysis model, predicted effectiveness largely overlapped. Implications Many methodologic issues that contribute to inconsistent results and reduced study quality were identified in studies that assessed the cost-effectiveness of the IGRA test. More specific and relevant guidelines are needed in order to help authors standardize modelling approaches, inputs, assumptions and how results are presented and interpreted. PMID:23505412
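
    A hedged sketch of the kind of decision-analysis comparison the review examines: computing an incremental cost-effectiveness ratio (ICER) for an IGRA-based versus a TST-based screening strategy. All numerical inputs are placeholders, not values drawn from the reviewed studies or the authors' common model.

```python
# Simple one-test screening strategy comparison; every parameter value is illustrative.
def strategy_outcome(cost_test, specificity, sensitivity, prevalence,
                     cost_treatment, qaly_gain_true_positive):
    """Expected cost and QALYs per person screened under a single-test strategy."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    expected_cost = cost_test + (true_pos + false_pos) * cost_treatment
    expected_qalys = true_pos * qaly_gain_true_positive
    return expected_cost, expected_qalys

tst_cost, tst_qaly = strategy_outcome(10, 0.60, 0.80, 0.10, 300, 0.05)
igra_cost, igra_qaly = strategy_outcome(60, 0.95, 0.85, 0.10, 300, 0.05)

icer = (igra_cost - tst_cost) / (igra_qaly - tst_qaly)
print(f"ICER (IGRA vs TST): {icer:,.0f} per QALY gained")
```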

  5. A Proven Methodology for Developing Secure Software and Applying It to Ground Systems

    NASA Technical Reports Server (NTRS)

    Bailey, Brandon

    2016-01-01

    Part Two expands upon Part One in an attempt to translate the methodology for ground system personnel. The goal is to build upon the methodology presented in Part One by showing examples and details on how to implement the methodology. Section 1: Ground Systems Overview; Section 2: Secure Software Development; Section 3: Defense in Depth for Ground Systems; Section 4: What Now?

  6. Methodological considerations in observational comparative effectiveness research for implantable medical devices: an epidemiologic perspective.

    PubMed

    Jalbert, Jessica J; Ritchey, Mary Elizabeth; Mi, Xiaojuan; Chen, Chih-Ying; Hammill, Bradley G; Curtis, Lesley H; Setoguchi, Soko

    2014-11-01

    Medical devices play a vital role in diagnosing, treating, and preventing diseases and are an integral part of the health-care system. Many devices, including implantable medical devices, enter the market through a regulatory pathway that was not designed to assure safety and effectiveness. Several recent studies and high-profile device recalls have demonstrated the need for well-designed, valid postmarketing studies of medical devices. Medical device epidemiology is a relatively new field compared with pharmacoepidemiology, which for decades has been developed to assess the safety and effectiveness of medications. Many methodological considerations in pharmacoepidemiology apply to medical device epidemiology. Fundamental differences in mechanisms of action and use and in how exposure data are captured mean that comparative effectiveness studies of medical devices often necessitate additional and different considerations. In this paper, we discuss some of the most salient issues encountered in conducting comparative effectiveness research on implantable devices. We discuss special methodological considerations regarding the use of data sources, exposure and outcome definitions, timing of exposure, and sources of bias. © The Author 2014. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. A scaleable methodology for assessing the impacts of urban shade on the summer electricity use of residential homes

    NASA Astrophysics Data System (ADS)

    Taylor, Robert Vanderlei

    Our cities are experiencing unprecedented growth while net global temperatures continue to trend warmer, making sustainable urban development and energy conservation pressing public issues. This research explores how urban landscaping -- in particular trees and buildings -- affects summer electricity use in residential homes. I studied the interactions of urban shade and temperature to explore how vegetation distribution and intensity could play a meaningful role in heat mitigation in urban environments. Only a few studies have reconciled modeled electricity savings from tree shade with actual electricity consumption data. This research proposes a methodology for modeling the isolated effects of urban shade (tree shade vs building shade) on buildings' summertime electricity consumption from micro to mesoscales, empirically validating the modeled shade with actual electricity billing data, and comparing the electric energetic impact of tree shade effects with building shade effects. This proposed methodology seeks to resolve three primary research questions: 1) What are the modeled quantities of urban shade associated with the area of interest (AOI)? 2) To what extent do the effects of shading from trees and buildings mitigate summertime heat in the AOI? 3) To what extent do the shade effects from trees and buildings reduce summertime electricity consumption in the AOI?

  8. Probabilistic lifetime strength of aerospace materials via computational simulation

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Keating, Jerome P.; Lovelace, Thomas B.; Bast, Callie C.

    1991-01-01

    The results of a second year effort of a research program are presented. The research included development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic phenomenological constitutive relationship, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects of primitive variables. These primitive variables often originate in the environment and may include stress from loading, temperature, chemical, or radiation attack. This multifactor interaction constitutive equation is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the constitutive equation using actual experimental materials data together with the multiple linear regression of that data.
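
    A hedged sketch of a randomized multifactor-interaction strength-degradation relationship of the general form such models use, S/S0 = prod_i [(A_iF - A_i)/(A_iF - A_i0)]^a_i; the primitive variables, distributions, and exponents below are illustrative assumptions, not the PROMISS inputs.

```python
# Randomized multifactor interaction strength degradation, general form only;
# all variables, distributions, and exponents are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)

def strength_ratio(current, reference, failure_value, exponent):
    """One factor of the multifactor interaction equation for a single primitive variable."""
    return ((failure_value - current) / (failure_value - reference)) ** exponent

n = 10_000
temperature = rng.normal(600.0, 25.0, n)     # service temperature, K
stress = rng.normal(300.0, 20.0, n)          # applied stress, MPa

ratio = (strength_ratio(temperature, reference=300.0, failure_value=1500.0, exponent=0.5)
         * strength_ratio(stress, reference=0.0, failure_value=900.0, exponent=0.25))

s0 = 1200.0                                  # reference strength, MPa
lifetime_strength = s0 * ratio
print(f"mean = {lifetime_strength.mean():.0f} MPa, "
      f"5th percentile = {np.percentile(lifetime_strength, 5):.0f} MPa")
```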

  9. Methodological and Epistemological Considerations in Utilizing Qualitative Inquiry to Develop Interventions.

    PubMed

    Duggleby, Wendy; Williams, Allison

    2016-01-01

    The purpose of this article is to discuss methodological and epistemological considerations involved in using qualitative inquiry to develop interventions. These considerations included (a) using diverse methodological approaches and (b) epistemological considerations such as generalization, de-contextualization, and subjective reality. Diverse methodological approaches have the potential to inform different stages of intervention development. Using the development of a psychosocial hope intervention for advanced cancer patients as an example, the authors utilized a thematic study to assess current theories/frameworks and interventions. However, to understand the processes that the intervention needed to target to affect change, grounded theory was used. Epistemological considerations provided a framework to understand and, further, critique the intervention. Using diverse qualitative methodological approaches and examining epistemological considerations were useful in developing an intervention that appears to foster hope in patients with advanced cancer. © The Author(s) 2015.

  10. Accounting for the drug life cycle and future drug prices in cost-effectiveness analysis.

    PubMed

    Hoyle, Martin

    2011-01-01

    Economic evaluations of health technologies typically assume constant real drug prices and model only the cohort of patients currently eligible for treatment. It has recently been suggested that, in the UK, we should assume that real drug prices decrease at 4% per annum and, in New Zealand, that real drug prices decrease at 2% per annum and at patent expiry the drug price falls. It has also recently been suggested that we should model multiple future incident cohorts. In this article, the cost effectiveness of drugs is modelled based on these ideas. Algebraic expressions are developed to capture all costs and benefits over the entire life cycle of a new drug. The lifetime of a new drug in the UK, a key model parameter, is estimated as 33 years, based on the historical lifetime of drugs in England over the last 27 years. Under the proposed methodology, cost effectiveness is calculated for seven new drugs recently appraised in the UK. Cost effectiveness as assessed in the future is also estimated. Whilst the article is framed in mathematics, the findings and recommendations are also explained in non-mathematical language. The 'life-cycle correction factor' is introduced, which is used to convert estimates of cost effectiveness as traditionally calculated into estimates under the proposed methodology. Under the proposed methodology, all seven drugs appear far more cost effective in the UK than published. For example, the incremental cost-effectiveness ratio decreases by 46%, from £61,900 to £33,500 per QALY, for cinacalcet versus best supportive care for end-stage renal disease, and by 45%, from £31,100 to £17,000 per QALY, for imatinib versus interferon-α for chronic myeloid leukaemia. Assuming real drug prices decrease over time, the chance that a drug is publicly funded increases over time, and is greater when modelling multiple cohorts than with a single cohort. Using the methodology (compared with traditional methodology) all drugs in the UK and New Zealand are predicted to be more cost effective. It is suggested that the willingness-to-pay threshold should be reduced in the UK and New Zealand. The ranking of cost effectiveness will change with drugs assessed as relatively more cost effective and medical devices and surgical procedures relatively less cost effective than previously thought. The methodology is very simple to implement. It is suggested that the model should be parameterized for other countries.
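
    A much-simplified illustration (not the published model) of the intuition behind the 'life-cycle correction factor' described above: average the discounted real drug price over incident cohorts starting in each year of the drug's lifetime, then rescale the drug-cost component of a conventional ICER. All parameter values are illustrative.

```python
# Life-cycle price adjustment of an ICER under declining real drug prices; a sketch
# of the idea only, with placeholder numbers.
def lifecycle_price_factor(drug_lifetime=33, price_decline=0.04, discount=0.035):
    """Discount-weighted average relative drug price across cohorts starting in each
    year of the drug's lifetime, with real prices falling each year."""
    weights = [(1 + discount) ** -t for t in range(drug_lifetime)]
    prices = [(1 - price_decline) ** t for t in range(drug_lifetime)]
    return sum(w * p for w, p in zip(weights, prices)) / sum(weights)

factor = lifecycle_price_factor()
drug_cost, other_cost, qalys = 30_000, 5_000, 1.2      # illustrative per-patient values
icer_traditional = (drug_cost + other_cost) / qalys
icer_lifecycle = (drug_cost * factor + other_cost) / qalys
print(f"life-cycle price factor: {factor:.2f}")
print(f"traditional ICER: {icer_traditional:,.0f} per QALY")
print(f"life-cycle ICER:  {icer_lifecycle:,.0f} per QALY")
```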

  11. Implementation of an active instructional design for teaching the concepts of current, voltage and resistance

    NASA Astrophysics Data System (ADS)

    Orlaineta-Agüero, S.; Del Sol-Fernández, S.; Sánchez-Guzmán, D.; García-Salcedo, R.

    2017-01-01

    In the present work we show the implementation of a learning sequence based on an active learning methodology for teaching Physics. This proposal aims to promote better learning in high school students with the use of a comic book, and it combines different low-cost experimental activities for teaching the electrical concepts of Current, Resistance and Voltage. We consider that this kind of strategy can be easily extrapolated to higher-education levels, such as engineering college/university courses, and to other disciplines of Science. To evaluate this proposal, we used some conceptual questions from the Electric Circuits Concept Evaluation survey developed by Sokoloff, and the results from this survey were analysed with the Normalized Conceptual Gain proposed by Hake and the Concentration Factor proposed by Bao and Redish, to identify the effectiveness of the methodology and the models that the students presented after and before the instruction, respectively. We found that this methodology was more effective than the implementation of traditional lectures alone. We consider that these results cannot be generalized, but they gave us the opportunity to view many important approaches in Physics Education; finally, we will continue to apply the same experiment with more students, at the same and upper levels of education, to confirm and validate the effectiveness of this methodological proposal.
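
    A small sketch of the two analysis quantities named above, under their standard definitions: Hake's normalized gain g = (post - pre) / (100 - pre), and the Bao-Redish concentration factor C = (sqrt(m)/(sqrt(m)-1)) * (sqrt(sum n_i^2)/N - 1/sqrt(m)), where m is the number of answer choices and n_i the number of students choosing option i. The score and response data below are made up for illustration.

```python
# Normalized gain (Hake) and concentration factor (Bao & Redish); invented example data.
import numpy as np

def normalized_gain(pre_percent, post_percent):
    return (post_percent - pre_percent) / (100.0 - pre_percent)

def concentration_factor(choice_counts):
    n = np.asarray(choice_counts, dtype=float)
    m, N = len(n), n.sum()
    return (np.sqrt(m) / (np.sqrt(m) - 1)) * (np.sqrt((n ** 2).sum()) / N - 1 / np.sqrt(m))

print(normalized_gain(pre_percent=38.0, post_percent=65.0))   # ~0.44
print(concentration_factor([4, 25, 3, 2, 1]))                 # ~0.51, responses concentrated on one choice
```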

  12. Assessment of Effectiveness of Geologic Isolation Systems: REFERENCE SITE INITIAL ASSESSMENT FOR A SALT DOME REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harwell, M. A.; Brandstetter, A.; Benson, G. L.

    1982-06-01

    As a methodology demonstration for the Office of Nuclear Waste Isolation (ONWI), the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program conducted an initial reference site analysis of the long-term effectiveness of a salt dome repository. The Hainesville Salt Dome in Texas was chosen to be representative of the Gulf Coast interior salt domes; however, the Hainesville Site has been eliminated as a possible nuclear waste repository site. The data used for this exercise are not adequate for an actual assessment, nor have all the parametric analyses been made that would adequately characterize the response of the geosystem surrounding the repository. Additionally, because this was the first exercise of the complete AEGIS and WASTE Rock Interaction Technology (WRIT) methodology, this report provides the initial opportunity for the methodology, specifically applied to a site, to be reviewed by the community outside the AEGIS. The scenario evaluation, as a part of the methodology demonstration, involved consideration of a large variety of potentially disruptive phenomena, which alone or in concert could lead to a breach in a salt dome repository and to a subsequent transport of the radionuclides to the environment. Without waste- and repository-induced effects, no plausible natural geologic events or processes which would compromise the repository integrity could be envisioned over the one-million-year time frame after closure. Near-field (waste- and repository-induced) effects were excluded from consideration in this analysis, but they can be added in future analyses when that methodology development is more complete. The potential for consequential human intrusion into salt domes within a million-year time frame led to the consideration of a solution mining intrusion scenario. The AEGIS staff developed a specific human intrusion scenario at 100 years and 1000 years post-closure, which is one of a whole suite of possible scenarios. This scenario resulted in the delivery of radionuclide-contaminated brine to the surface, where a portion was diverted to culinary salt for direct ingestion by the existing population. Consequence analyses indicated calculated human doses that would be highly deleterious. Additional analyses indicated that doses well above background would occur from such a scenario, even if it occurred a million years into the future. The way to preclude such an intrusion is for continued control over the repository site, either through direct institutional control or through the effective passive transfer of information. A secondary aspect of the specific human intrusion scenario involved a breach through the side of the salt dome, through which radionuclides migrated via the ground-water system to the accessible environment. This provided a demonstration of the geotransport methodology that AEGIS can use in actual site evaluations, as well as the WRIT program's capabilities with respect to defining the source term and retardation rates of the radionuclides in the repository. This reference site analysis was initially published as a Working Document in December 1979. That version was distributed for a formal peer review by individuals and organizations not involved in its development. The present report represents a revision, based in part on the responses received from the external reviewers. Summaries of the comments from the reviewers and responses to these comments by the AEGIS staff are presented. The exercise of the AEGIS methodology was successful in demonstrating the methodology and thus in providing a basis for substantive peer review, in terms of further development of the AEGIS site-applications capability and in terms of providing insight into the potential for consequential human intrusion into a salt dome repository.

  14. Developing an ethical code for engineers: the discursive approach.

    PubMed

    Lozano, J Félix

    2006-04-01

    From the Hippocratic Oath on, deontological codes and other professional self-regulation mechanisms have been used to legitimize and identify professional groups. New technological challenges and, above all, changes in the socioeconomic environment require adaptable codes which can respond to new demands. We assume that ethical codes for professionals should not simply focus on regulative functions, but must also consider ideological and educative functions. Any adaptations should take into account both contents (values, norms and recommendations) and the drafting process itself. In this article we propose a process for developing a professional ethical code for an official professional association (Colegio Oficial de Ingenieros Industriales de Valencia, COIIV), starting from the philosophical assumptions of discursive ethics but adapting them to critical hermeneutics. Our proposal is based on the Integrity Approach rather than the Compliance Approach. A process aiming to achieve an effective ethical document that fulfils regulative and ideological functions requires a participative, dialogical and reflexive methodology. This process must respond to moral exigencies and demands for efficiency and professional effectiveness. In addition to the methodological proposal we present our experience of producing an ethical code for the industrial engineers' association in Valencia (Spain) where this methodology was applied, and we evaluate the detected problems and future potential.

  15. The VeTOOLS Project: an example of how to strengthen collaboration between scientists and Civil Protections in disaster risk reduction

    NASA Astrophysics Data System (ADS)

    Marti, Joan; Bartolini, Stefania; Becerril, Laura

    2016-04-01

    VeTOOLS is a project funded by the European Commission's Humanitarian Aid and Civil Protection department (ECHO), and aims at creating an integrated software platform specially designed to assess and manage volcanic risk. The project facilitates interaction and cooperation between scientists and Civil Protection Agencies in order to share, unify, and exchange procedures, methodologies and technologies to effectively reduce the impacts of volcanic disasters. The project aims at 1) improving and developing volcanic risk assessment and management capacities in active volcanic regions; 2) developing universal methodologies, scenario definitions, response strategies and alert protocols to cope with the full range of volcanic threats; 4) improving quantitative methods and tools for vulnerability and risk assessment; and 5) defining thresholds and protocols for civil protection. With these objectives, the VeTOOLS project addresses two of the Sendai Framework resolutions for its implementation: i) Provide guidance on methodologies and standards for risk assessments, disaster risk modelling and the use of data; ii) Promote and support the availability and application of science and technology to decision-making, and offers a good example of how a close collaboration between science and civil protection is an effective way to contribute to DRR. European Commission ECHO Grant SI2.695524

  16. How Much Can Non-industry Standard Measurement Methodologies Benefit Methane Reduction Programs?

    NASA Astrophysics Data System (ADS)

    Risk, D. A.; O'Connell, L.; Atherton, E.

    2017-12-01

    In recent years, energy sector methane emissions have been recorded in large part by applying modern non-industry-standard techniques. Industry may lack the regulatory flexibility to use such techniques, or in some cases may not understand the possible associated economic advantage. As progressive jurisdictions move from estimation and towards routine measurement, the research community should provide guidance to help regulators and companies measure more effectively, and economically if possible. In this study, we outline a modelling experiment in which we explore the integration of non-industry-standard measurement techniques as part of a generalized compliance measurement program. The study was not intended to be exhaustive, or to recommend particular combinations, but instead to explore the inter-relationships between methodologies, development type, and compliance practice. We first defined the role, applicable scale, detection limits, working distances, and approximate deployment cost of several measurement methodologies. We then considered a variety of development types differing mainly in footprint, density, and emissions "profile". Using a Monte Carlo approach, we evaluated the effect of these various factors on the cost and confidence of the compliance measurement program. We found that when added individually, some of the research techniques were indeed able to deliver an improvement in cost and/or confidence when used alongside industry-standard Optical Gas Imaging. When applied in combination, the ideal fraction of each measurement technique depended on development type, emission profile, and whether confidence or cost was more important. Results suggest that measurement cost and confidence could be improved if energy companies exploited a wider range of measurement techniques, and in a manner tailored to each development. In the short-term, combining clear scientific guidance with economic information could benefit immediate mitigation efforts over developing new super sensors.
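
    A generic sketch of the Monte Carlo comparison described above: for a candidate measurement program (a mix of survey techniques), simulate which emitting sites are detected and tally total cost. The technique parameters and the site-emission model are invented placeholders, not values from the study.

```python
# Compare candidate measurement programs on simulated cost and detection confidence;
# all costs, detection probabilities, and the leak model are illustrative.
import numpy as np

rng = np.random.default_rng(1)

techniques = {                      # per-site survey cost, probability of detecting a leak
    "OGI camera":     {"cost": 600.0, "p_detect": 0.60},
    "vehicle survey": {"cost": 150.0, "p_detect": 0.45},
    "aerial survey":  {"cost":  60.0, "p_detect": 0.30},
}

def simulate_program(mix, n_sites=500, leak_rate=0.05, n_trials=2000):
    """mix maps technique name -> fraction of sites surveyed with that technique."""
    detected, cost = 0.0, 0.0
    for name, frac in mix.items():
        k = int(round(frac * n_sites))
        cost += k * techniques[name]["cost"]
        leaking = rng.binomial(k, leak_rate, n_trials)              # leaking sites surveyed
        found = rng.binomial(leaking, techniques[name]["p_detect"])  # leaks actually detected
        detected += found.mean()
    return cost, detected / (n_sites * leak_rate)    # fraction of expected leaks found

for mix in [{"OGI camera": 1.0},
            {"OGI camera": 0.5, "aerial survey": 0.5},
            {"vehicle survey": 0.6, "aerial survey": 0.4}]:
    cost, confidence = simulate_program(mix)
    print(mix, f"cost={cost:,.0f}", f"detection fraction={confidence:.2f}")
```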

  17. Probabilistic fatigue life prediction of metallic and composite materials

    NASA Astrophysics Data System (ADS)

    Xiang, Yibing

    Fatigue is one of the most common failure modes for engineering structures, such as aircraft, rotorcraft and aviation transports. Both metallic materials and composite materials are widely used and affected by fatigue damage. Huge uncertainties arise from material properties, measurement noise, imperfect models, future anticipated loads and environmental conditions. These uncertainties are critical issues for accurate remaining useful life (RUL) prediction for engineering structures in service. Probabilistic fatigue prognosis considering various uncertainties is of great importance for structural safety. The objective of this study is to develop probabilistic fatigue life prediction models for metallic materials and composite materials. A fatigue model based on crack growth analysis and the equivalent initial flaw size concept is proposed for metallic materials. Following this, the developed model is extended to include structural geometry effects (notch effect), environmental effects (corroded specimens) and manufacturing effects (shot peening effects). Due to the inhomogeneity and anisotropy, the fatigue model suitable for metallic materials cannot be directly applied to composite materials. A composite fatigue life prediction model is proposed based on a mixed-mode delamination growth model and a stiffness degradation law. After the development of deterministic fatigue models for metallic and composite materials, a general probabilistic life prediction methodology is developed. The proposed methodology incorporates an efficient Inverse First-Order Reliability Method (IFORM) for uncertainty propagation in fatigue life prediction. An equivalent stress transformation has been developed to enhance the computational efficiency under realistic random amplitude loading. A systematic reliability-based maintenance optimization framework is proposed for fatigue risk management and mitigation of engineering structures.
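
    A generic sketch (not the model developed above) of the crack-growth basis such fatigue models build on: integrate the Paris law da/dN = C (dK)^m, with dK = Y * dsigma * sqrt(pi * a), from an equivalent initial flaw size to a critical crack length. Monte Carlo sampling stands in here for the IFORM step named above, and all parameter values are illustrative assumptions.

```python
# Paris-law life integration with a sampled equivalent initial flaw size; a sketch
# with invented parameters, not the thesis model.
import numpy as np

def cycles_to_failure(a0, a_crit, delta_sigma, C=1e-11, m=3.0, Y=1.12, steps=2_000):
    a = np.linspace(a0, a_crit, steps)                 # crack length, metres
    delta_k = Y * delta_sigma * np.sqrt(np.pi * a)     # stress-intensity range, MPa*sqrt(m)
    dn_da = 1.0 / (C * delta_k ** m)                   # cycles per metre of crack growth
    return np.sum(0.5 * (dn_da[1:] + dn_da[:-1]) * np.diff(a))   # trapezoidal integration

rng = np.random.default_rng(3)
a0 = rng.lognormal(np.log(25e-6), 0.3, 2_000)          # equivalent initial flaw size, m
delta_sigma = rng.normal(180.0, 10.0, 2_000)           # stress range, MPa
lives = np.array([cycles_to_failure(a, 5e-3, s) for a, s in zip(a0, delta_sigma)])
print(f"median life = {np.median(lives):.3g} cycles, "
      f"1st percentile = {np.percentile(lives, 1):.3g} cycles")
```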

  18. [Counseling interventions for smoking cessation: systematic review].

    PubMed

    Alba, Luz Helena; Murillo, Raúl; Castillo, Juan Sebastián

    2013-04-01

    A systematic review on the efficacy and safety of smoking cessation counseling was developed. The ADAPTE methodology was used with a search of Clinical Practice Guidelines (CPG) in Medline, EMBASE, CINAHL, LILACS, and Cochrane. DELBI was used to select CPG with a score over 60 in methodological rigor and applicability to the Colombian health system. Smoking cessation rates at 6 months were assessed according to counseling provider, model, and format. In total, 5 CPG out of 925 references were selected, comprising 44 systematic reviews and meta-analyses. Physician brief counseling and trained health professionals' intensive counseling (individual, group, proactive telephone) are effective, with abstinence rates between 2.1% and 17.4%. Only practical counseling and motivational interviewing were found to be effective intensive interventions. The clinical effect of smoking cessation counseling is low, and long-term cessation rates are uncertain. Cost-effectiveness analyses are recommended for the implementation of counseling in public health programs.

  19. Progressive failure methodologies for predicting residual strength and life of laminated composites

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Allen, David H.; Obrien, T. Kevin

    1991-01-01

    Two progressive failure methodologies currently under development by the Mechanics of Materials Branch at NASA Langley Research Center are discussed. The damage tolerance/fail safety methodology developed by O'Brien is an engineering approach to ensuring adequate durability and damage tolerance by treating only delamination onset and the subsequent delamination accumulation through the laminate thickness. The continuum damage model developed by Allen and Harris employs continuum damage laws to predict laminate strength and life. The philosophy, mechanics framework, and current implementation status of each methodology are presented.

  20. Assessment of cognitive safety in clinical drug development

    PubMed Central

    Roiser, Jonathan P.; Nathan, Pradeep J.; Mander, Adrian P.; Adusei, Gabriel; Zavitz, Kenton H.; Blackwell, Andrew D.

    2016-01-01

    Cognitive impairment is increasingly recognised as an important potential adverse effect of medication. However, many drug development programmes do not incorporate sensitive cognitive measurements. Here, we review the rationale for cognitive safety assessment, and explain several basic methodological principles for measuring cognition during clinical drug development, including study design and statistical analysis, from Phase I through to postmarketing. The crucial issue of how cognition should be assessed is emphasized, especially the sensitivity of measurement. We also consider how best to interpret the magnitude of any identified effects, including comparison with benchmarks. We conclude by discussing strategies for the effective communication of cognitive risks. PMID:26610416

  1. Integrated cost-effectiveness analysis of agri-environmental measures for water quality.

    PubMed

    Balana, Bedru B; Jackson-Blake, Leah; Martin-Ortega, Julia; Dunn, Sarah

    2015-09-15

    This paper presents an application of an integrated methodological approach for identifying cost-effective combinations of agri-environmental measures to achieve water quality targets. The methodological approach involves linking hydro-chemical modelling with the economic costs of mitigation measures. The utility of the approach was explored for the River Dee catchment in North East Scotland, examining the cost-effectiveness of mitigation measures for nitrogen (N) and phosphorus (P) pollutants. In-stream nitrate concentration was modelled using the STREAM-N model and phosphorus using the INCA-P model. Both models were first run for baseline conditions and then their effectiveness for changes in land management was simulated. Costs were based on farm income foregone, capital and operational expenditures. The costs and effects data were integrated using 'Risk Solver Platform' optimization in Excel to produce the most cost-effective combination of measures by which target nutrient reductions could be attained at a minimum economic cost. The analysis identified different combinations of measures as most cost-effective for the two pollutants. An important aspect of this paper is the integration of model-based effectiveness estimates with the economic cost of measures for cost-effectiveness analysis of land and water management options. The methodological approach developed is not limited to the two pollutants and the selected agri-environmental measures considered in the paper; the approach can be adapted to the cost-effectiveness analysis of any catchment-scale environmental management options. Copyright © 2015 Elsevier Ltd. All rights reserved.
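
    A minimal sketch of the cost-minimisation step described above: choose adoption levels of candidate mitigation measures so that target N and P load reductions are met at least cost. The measures, effectiveness figures, and costs are illustrative placeholders, not the River Dee values; the paper used a spreadsheet optimiser, and linear programming is shown here only as one standard way of posing the same least-cost problem.

```python
# Least-cost selection of mitigation measures subject to nutrient-reduction targets;
# all numbers are invented for illustration.
from scipy.optimize import linprog

# Per-unit annual cost of each measure (e.g. per hectare enrolled)
costs = [120.0, 45.0, 300.0]                     # buffer strips, reduced fertiliser, wetland

# Per-unit reduction delivered by each measure: row 0 = nitrogen, row 1 = phosphorus (kg/yr)
reductions = [[2.0, 1.5, 0.5],
              [0.3, 0.1, 0.9]]

targets = [4000.0, 600.0]                        # required N and P load reductions (kg/yr)
max_extent = [1500, 5000, 400]                   # upper bound on each measure's uptake

# linprog minimises c @ x subject to A_ub @ x <= b_ub, so express "reduction >= target"
# as "-reduction <= -target".
res = linprog(c=costs,
              A_ub=[[-r for r in row] for row in reductions],
              b_ub=[-t for t in targets],
              bounds=list(zip([0] * 3, max_extent)))
print(res.x, f"minimum cost = {res.fun:,.0f}")
```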

  2. Developing comparative criminology and the case of China: an introduction.

    PubMed

    Liu, Jianhong

    2007-02-01

    Although comparative criminology has made significant development during the past decade or so, systematic empirical research has only developed along a few topics. Comparative criminology has never occupied a central position in criminology. This article analyzes the major theoretical and methodological impediments in the development of comparative criminology. It stresses a need to shift methodology from a conventional primary approach that uses the nation as the unit of analysis to an in-depth case study method as a primary methodological approach. The article maintains that case study method can overcome the limitation of its descriptive tradition and become a promising methodological approach for comparative criminology.

  3. Principles of sustainable development of the territory and priorities of architectural and urban construction activity

    NASA Astrophysics Data System (ADS)

    Dontsov, Dmitry; Yushkova, Natalia

    2017-01-01

    The paper aims to detect conceptual conflicts within architectural and urban construction activity (AUCA), define their causes, and substantiate ways to reduce the adverse effects they produce. Cause-and-effect analysis is used, together with evolutionary and comparative analyses, to derive the principles for forming an activity model in the modern environment and to rank its elements. The relevance of the paper lies in establishing the scientific and theoretical grounds for improving the methodology of AUCA by adapting it to the imperatives of state management. System analysis demonstrated the practicability of taking institutional-environment factors into account when reorganizing the AUCA model, which allows the fullest implementation of sustainable development principles. It was shown that territorial planning is not only the leading type of AUCA but also an integrator for the structures of state management involved in planning social and economic development. The main result of the paper is the identification of promising directions for the evolution of the current methodology, whose growing interdisciplinary character leads to a qualitative renewal of territorial management principles.

  4. [Addictive potential in man: methodological aspects].

    PubMed

    Warot, D; Marra, D

    1995-01-01

    Different methods have been developed in clinical abuse liability testing in man. Tolerance, psychic and/or physical dependence must be investigated through clinical studies during drug development of a new substance. Adequate methodology is needed using double-blind, time-blind evaluations, comparisons of different dose levels and duration of treatment for a given drug, abrupt and gradual interruption of treatment, appropriate period of observation after treatment cessation ... The optimal scale to evaluate properly the symptoms occurring after drug discontinuation is still under investigation. These studies will or should permit the differentiation of rebound, withdrawal and recurrence. Methods developed to study reinforcing effects in post-addicts and healthy subjects are self-administration and choice procedures. In addition, the more traditional approach has been through assessing self-reported effects in which standardized questionnaires are used (Addiction Research Center Inventory or A.R.C.I.; Single Dose Questionnaire or S.D.Q.). A third focus of measurement has been discrimination studies performed in individuals with histories of drug abuse as well as healthy subjects. Abuse-liability testing of a new compound needs a multidimensional assessment to optimize the predictivity in defining the relative risk.

  5. Cognitive simulators for medical education and training.

    PubMed

    Kahol, Kanav; Vankipuram, Mithra; Smith, Marshall L

    2009-08-01

    Simulators for honing procedural skills (such as surgical skills and central venous catheter placement) have proven to be valuable tools for medical educators and students. While such simulations represent an effective paradigm in surgical education, there is an opportunity to add a layer of cognitive exercises to these basic simulations that can facilitate robust skill learning in residents. This paper describes a controlled methodology, inspired by neuropsychological assessment tasks and embodied cognition, to develop cognitive simulators for laparoscopic surgery. These simulators provide psychomotor skill training and offer the additional challenge of accomplishing cognitive tasks in realistic environments. A generic framework for design, development and evaluation of such simulators is described. The presented framework is generalizable and can be applied to different task domains. It is independent of the types of sensors, simulation environment and feedback mechanisms that the simulators use. A proof of concept of the framework is provided through developing a simulator that includes cognitive variations to a basic psychomotor task. The results of two pilot studies are presented that show the validity of the methodology in providing an effective evaluation and learning environments for surgeons.

  6. The usefulness of lean six sigma to the development of a clinical pathway for hip fractures.

    PubMed

    Niemeijer, Gerard C; Flikweert, Elvira; Trip, Albert; Does, Ronald J M M; Ahaus, Kees T B; Boot, Anja F; Wendt, Klaus W

    2013-10-01

    The objective of this study was to show the usefulness of lean six sigma (LSS) for the development of a multidisciplinary clinical pathway. A single centre, both retrospective and prospective, non-randomized controlled study design was used to identify the variables of a prolonged length of stay (LOS) for hip fractures in the elderly and to measure the effect of the process improvements--with the aim of improving efficiency of care and reducing the LOS. The project identified several variables influencing LOS, and interventions were designed to improve the process of care. Significant results were achieved by reducing both the average LOS by 4.2 days (-31%) and the average duration of surgery by 57 minutes (-36%). The average LOS of patients discharged to a nursing home reduced by 4.4 days. The findings of this study show a successful application of LSS methodology within the development of a clinical pathway. Further research is needed to explore the effect of the use of LSS methodology at clinical outcome and quality of life. © 2012 John Wiley & Sons Ltd.

  7. 24 CFR 904.205 - Training methodology.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Training methodology. 904.205... DEVELOPMENT LOW RENT HOUSING HOMEOWNERSHIP OPPORTUNITIES Homeownership Counseling and Training § 904.205 Training methodology. Equal in importance to the content of the pre- and post-occupancy training is the...

  8. 24 CFR 904.205 - Training methodology.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 24 Housing and Urban Development 4 2014-04-01 2014-04-01 false Training methodology. 904.205... DEVELOPMENT LOW RENT HOUSING HOMEOWNERSHIP OPPORTUNITIES Homeownership Counseling and Training § 904.205 Training methodology. Equal in importance to the content of the pre- and post-occupancy training is the...

  9. 24 CFR 904.205 - Training methodology.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 4 2012-04-01 2012-04-01 false Training methodology. 904.205... DEVELOPMENT LOW RENT HOUSING HOMEOWNERSHIP OPPORTUNITIES Homeownership Counseling and Training § 904.205 Training methodology. Equal in importance to the content of the pre- and post-occupancy training is the...

  10. 24 CFR 904.205 - Training methodology.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 4 2013-04-01 2013-04-01 false Training methodology. 904.205... DEVELOPMENT LOW RENT HOUSING HOMEOWNERSHIP OPPORTUNITIES Homeownership Counseling and Training § 904.205 Training methodology. Equal in importance to the content of the pre- and post-occupancy training is the...

  11. The substantiation of methodical instrumentation to increase the tempo of high-rise construction in region

    NASA Astrophysics Data System (ADS)

    Belyaeva, Svetlana; Makeeva, Tatyana; Chugunov, Andrei; Andreeva, Peraskovya

    2018-03-01

    One of the important conditions for effective renovation of regional housing stock through high-rise construction projects is the attraction of investment by creating a favorable investment climate, together with the reduction of administrative barriers in construction and the renewal of the fixed assets of housing and communal services. The article proposes a methodological basis for assessing the state of the investment climate in a region, as well as a methodology for forming and evaluating the investment program of a housing and communal services enterprise. The proposed methodologies are tested on the example of the Voronezh region. The authors also show the necessity and expediency of using a consulting mechanism in the development of state and non-state investment projects and programs.

  12. Assessment of health risks of policies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ádám, Balázs, E-mail: badam@cmss.sdu.dk; Department of Preventive Medicine, Faculty of Public Health, University of Debrecen, P.O. Box 9, H-4012 Debrecen; Molnár, Ágnes, E-mail: MolnarAg@smh.ca

    The assessment of health risks of policies is an inevitable, although challenging, prerequisite for the inclusion of health considerations in political decision making. The aim of our project was to develop a so far missing methodological guide for the assessment of the complex impact structure of policies. The guide was developed in a consensual way based on experiences gathered during the assessment of specific national policies selected by the partners of an EU project. Methodological considerations were discussed and summarized in workshops and pilot tested on the EU Health Strategy for finalization. The combined tool, which includes a textual guidance and a checklist, follows the top-down approach, that is, it guides the analysis of causal chains from the policy through related health determinants and risk factors to health outcomes. The tool discusses the most important practical issues of assessment by impact level. It emphasises the transparent identification and prioritisation of factors, and the consideration of the feasibility of exposure and outcome assessment, with special focus on quantification. The developed guide provides useful methodological instructions for the comprehensive assessment of health risks of policies that can be effectively used in the health impact assessment of policy proposals. - Highlights: • Methodological guide for the assessment of health risks of policies is introduced. • The tool is developed based on the experiences from several case studies. • The combined tool consists of a textual guidance and a checklist. • The top-down approach is followed through the levels of the full impact chain. • The guide provides assistance for the health impact assessment of policy proposals.

  13. Improved Surface Parameter Retrievals using AIRS/AMSU Data

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Blaisdell, John

    2008-01-01

    The AIRS Science Team Version 5.0 retrieval algorithm became operational at the Goddard DAAC in July 2007, generating near real-time products from analysis of AIRS/AMSU sounding data. This algorithm contains many significant theoretical advances over the AIRS Science Team Version 4.0 retrieval algorithm used previously. Two very significant developments of Version 5 are: 1) the development and implementation of an improved Radiative Transfer Algorithm (RTA), which allows for accurate treatment of non-Local Thermodynamic Equilibrium (non-LTE) effects on shortwave sounding channels; and 2) the development of methodology to obtain very accurate case-by-case product error estimates, which are in turn used for quality control. These theoretical improvements taken together enabled a new methodology to be developed which further improves soundings in partially cloudy conditions. In this methodology, longwave CO2 channel observations in the spectral region 700 cm^-1 to 750 cm^-1 are used exclusively for cloud clearing purposes, while shortwave CO2 channels in the spectral region 2195 cm^-1 to 2395 cm^-1 are used for temperature sounding purposes. This allows for accurate temperature soundings under more difficult cloud conditions. This paper further improves on the methodology used in Version 5 to derive surface skin temperature and surface spectral emissivity from AIRS/AMSU observations. Now, following the approach used to improve tropospheric temperature profiles, surface skin temperature is also derived using only shortwave window channels. This produces improved surface parameters, both day and night, compared to what was obtained in Version 5. These in turn result in improved boundary layer temperatures and retrieved total O3 burden.
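
    As a rough illustration of the channel-partitioning and error-estimate quality control ideas described above, the Python sketch below separates channels by wavenumber band and filters retrievals by a per-case error estimate. The channel list, field names and threshold are hypothetical and are not taken from the operational Version 5 code.

      # Partition channels by wavenumber band: longwave CO2 channels for cloud
      # clearing, shortwave CO2 channels for temperature sounding (bands as above).
      def partition_channels(wavenumbers):
          cloud_clearing, sounding = [], []
          for wn in wavenumbers:
              if 700.0 <= wn <= 750.0:
                  cloud_clearing.append(wn)
              elif 2195.0 <= wn <= 2395.0:
                  sounding.append(wn)
          return cloud_clearing, sounding

      def quality_control(retrievals, max_error=2.0):
          """Keep only cases whose per-case error estimate (K) is below a threshold."""
          return [r for r in retrievals if r["error_estimate"] <= max_error]

      cc, snd = partition_channels([705.1, 742.3, 1200.0, 2210.5, 2380.0])
      print(cc, snd)
      print(quality_control([{"error_estimate": 1.1}, {"error_estimate": 3.7}]))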

  14. Innovative educational methods and technologies applicable to continuing professional development in periodontology.

    PubMed

    Mattheos, N; Schoonheim-Klein, M; Walmsley, A D; Chapple, I L C

    2010-05-01

    Continuous professional development (CPD) in Periodontology refers to the overall framework of opportunities that facilitate a life-long learning practice, driven by the learner-practitioner and supported by a variety of institutions and individuals. CPD must address different needs for a great diversity of practitioners. It is clear that no particular methodology or technology is able to successfully accommodate the entire spectrum of CPD in Periodontology. Course designers must choose from and combine a wide array of methodologies and technologies, depending upon the needs of the learners and the objectives of the intended education. Research suggests that 'interactivity', 'flexibility', 'continuity' and 'relevance to learners' practice' are major characteristics of successful CPD. Various methods of mentoring, peer-learning environments and work-based learning have been combined with reflective practice and self-study to form the methodological backbone of CPD courses. Blended learning encompasses a wide array of technologies and methodologies and has been successfully used in CPD courses. Internet-based content learning management systems, portable Internet devices, powerful databases and search engines, together with initiatives such as 'open access' and 'open courseware', provide an array of effective instructional and communication tools. Assessment remains a key issue in CPD, providing learners with valuable feedback and ensuring the credibility and effectiveness of the learning process. Assessment is a multi-level process using different methods for different learning outcomes, as directed by current evidence and best practices. Finally, quality assurance of the education provided must accompany CPD courses at all times through a structured and credible process.

  15. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors.

    PubMed

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-08-24

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design.
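
    For readers unfamiliar with the moving least squares idea mentioned above, the following Python sketch shows one way a locally weighted linear fit can map a three-axis magnetic field reading to a three-axis force estimate. The calibration data, Gaussian kernel and affine basis are illustrative assumptions, not the calibration procedure used for MagOne.

      # Moving-least-squares-style mapping from a 3-axis field reading to a 3-axis force.
      # Calibration pairs (B_cal -> F_cal) are simulated for the example.
      import numpy as np

      def mls_predict(b_query, B_cal, F_cal, h=0.5):
          """Locally weighted affine fit around b_query, then evaluate at b_query."""
          d2 = np.sum((B_cal - b_query) ** 2, axis=1)
          w = np.exp(-d2 / (2.0 * h ** 2))                  # Gaussian weights near the query
          X = np.hstack([np.ones((len(B_cal), 1)), B_cal])  # affine basis [1, Bx, By, Bz]
          W = np.diag(np.sqrt(w))                           # sqrt so lstsq minimises sum(w * r^2)
          coef, *_ = np.linalg.lstsq(W @ X, W @ F_cal, rcond=None)
          return np.concatenate([[1.0], b_query]) @ coef

      rng = np.random.default_rng(0)
      B_cal = rng.normal(size=(200, 3))                     # simulated field readings
      F_cal = B_cal @ np.array([[2.0, 0.1, 0.0],
                                [0.0, 1.5, 0.2],
                                [0.1, 0.0, 3.0]])           # simulated forces with cross-talk
      print(mls_predict(np.array([0.2, -0.1, 0.3]), B_cal, F_cal))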

  16. Design Methodology for Magnetic Field-Based Soft Tri-Axis Tactile Sensors

    PubMed Central

    Wang, Hongbo; de Boer, Greg; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Hewson, Robert; Culmer, Peter

    2016-01-01

    Tactile sensors are essential if robots are to safely interact with the external world and to dexterously manipulate objects. Current tactile sensors have limitations restricting their use, notably being too fragile or having limited performance. Magnetic field-based soft tactile sensors offer a potential improvement, being durable, low cost, accurate and high bandwidth, but they are relatively undeveloped because of the complexities involved in design and calibration. This paper presents a general design methodology for magnetic field-based three-axis soft tactile sensors, enabling researchers to easily develop specific tactile sensors for a variety of applications. All aspects (design, fabrication, calibration and evaluation) of the development of tri-axis soft tactile sensors are presented and discussed. A moving least square approach is used to decouple and convert the magnetic field signal to force output to eliminate non-linearity and cross-talk effects. A case study of a tactile sensor prototype, MagOne, was developed. This achieved a resolution of 1.42 mN in normal force measurement (0.71 mN in shear force), good output repeatability and has a maximum hysteresis error of 3.4%. These results outperform comparable sensors reported previously, highlighting the efficacy of our methodology for sensor design. PMID:27563908

  17. Contemporary Impact Analysis Methodology for Planetary Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Perino, Scott V.; Bayandor, Javid; Samareh, Jamshid A.; Armand, Sasan C.

    2015-01-01

    The development of an Earth entry vehicle and of the methodology created to evaluate the vehicle's impact landing response when returning to Earth are reported. NASA's future Mars Sample Return Mission requires a robust vehicle to return Martian samples back to Earth for analysis. The Earth entry vehicle is a proposed solution to this Mars mission requirement. During Earth reentry, the vehicle slows within the atmosphere and then impacts the ground at its terminal velocity. To protect the Martian samples, a spherical energy absorber called an impact sphere is under development. The impact sphere is composed of hybrid composite and crushable foam elements that endure large plastic deformations during impact and cause a highly nonlinear vehicle response. The developed analysis methodology captures a range of complex structural interactions and much of the failure physics that occurs during impact. Numerical models were created and benchmarked against experimental tests conducted at NASA Langley Research Center. The post-impact structural damage assessment showed close correlation between simulation predictions and experimental results. Acceleration, velocity, displacement, damage modes, and failure mechanisms were all effectively captured. These investigations demonstrate that the Earth entry vehicle has great potential in facilitating future sample return missions.
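
    A minimal sketch of the kind of simulation-versus-test comparison used when benchmarking impact models like the one described above. The pulse shapes and error metrics below are hypothetical placeholders, not NASA Langley test data.

      # Compare a simulated acceleration trace with a measured-like one using
      # simple peak and normalised RMS error metrics.
      import numpy as np

      def compare_traces(sim, test):
          """Return (relative peak error, normalised RMS difference) for two traces."""
          peak_err = abs(sim.max() - test.max()) / abs(test.max())
          rms_err = np.sqrt(np.mean((sim - test) ** 2)) / np.sqrt(np.mean(test ** 2))
          return peak_err, rms_err

      t = np.linspace(0.0, 0.02, 200)                        # 20 ms impact event
      test = 300.0 * np.exp(-((t - 0.0100) / 0.0030) ** 2)   # measured-like pulse (g)
      sim = 290.0 * np.exp(-((t - 0.0102) / 0.0031) ** 2)    # simulated pulse (g)
      print(compare_traces(sim, test))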

  18. Cost-Effectiveness of HBV and HCV Screening Strategies – A Systematic Review of Existing Modelling Techniques

    PubMed Central

    Geue, Claudia; Wu, Olivia; Xin, Yiqiao; Heggie, Robert; Hutchinson, Sharon; Martin, Natasha K.; Fenwick, Elisabeth; Goldberg, David

    2015-01-01

    Introduction: Studies evaluating the cost-effectiveness of screening for Hepatitis B Virus (HBV) and Hepatitis C Virus (HCV) are generally heterogeneous in terms of risk groups, settings, screening intervention, outcomes and the economic modelling framework. It is therefore difficult to compare cost-effectiveness results between studies. This systematic review aims to summarise and critically assess existing economic models for HBV and HCV in order to identify the main methodological differences in modelling approaches. Methods: A structured search strategy was developed and a systematic review carried out. A critical assessment of the decision-analytic models was carried out according to the guidelines and framework developed for assessment of decision-analytic models in Health Technology Assessment of health care interventions. Results: The overall approach to analysing the cost-effectiveness of screening strategies was found to be broadly consistent for HBV and HCV. However, modelling parameters and related structure differed between models, producing different results. More recent publications performed better against a performance matrix evaluating model components and methodology. Conclusion: When assessing screening strategies for HBV and HCV infection, the focus should be on more recent studies, which applied the latest treatment regimes and test methods and had better and more complete data on which to base their models. In addition to parameter selection and associated assumptions, careful consideration of dynamic versus static modelling is recommended. Future research may want to focus on these methodological issues. In addition, the ability to evaluate screening strategies for multiple infectious diseases (e.g. HCV and HIV at the same time) might prove important for decision makers. PMID:26689908
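
    The core quantity behind the cost-effectiveness comparisons reviewed above is the incremental cost-effectiveness ratio (ICER), the extra cost per unit of health gained when moving from one strategy to another. The short Python sketch below computes it for two hypothetical strategies; all numbers are purely illustrative and are not drawn from the review.

      # ICER = (C1 - C0) / (E1 - E0): incremental cost per QALY gained.
      def icer(cost_new, qaly_new, cost_old, qaly_old):
          return (cost_new - cost_old) / (qaly_new - qaly_old)

      # Hypothetical screening strategy vs. no screening for a fixed cohort
      print(icer(cost_new=1_200_000, qaly_new=950.0,
                 cost_old=900_000, qaly_old=930.0))   # 15000.0 cost units per QALY gained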

  19. Liquid Pipeline Operator's Control Room Human Factors Risk Assessment and Management Guide

    DOT National Transportation Integrated Search

    2008-11-26

    The purpose of this guide is to document methodologies, tools, procedures, guidance, and instructions that have been developed to provide liquid pipeline operators with an efficient and effective means of managing the human factors risks in their con...

  20. The Developmental Neurotoxicity Guideline Study: Issues with Methodology, Evaluation and Regulation

    EPA Science Inventory

    Recently, social concern has been increasing about the effects of environmental factors on children's health, especially on the nervous system. The U.S. Environmental Protection Agency (EPA) and the Organization for Economic Co-operation and Development (OECD) have published testing ...
