Sample records for rating methodology development

  1. Load and resistance factor rating (LRFR) in New York State : volume II.

    DOT National Transportation Integrated Search

    2011-09-01

    This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The propo...

  2. Load and resistance factor rating (LRFR) in NYS : volume II final report.

    DOT National Transportation Integrated Search

    2011-09-01

    This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The proposed LR...

  3. Load and resistance factor rating (LRFR) in NYS : volume I final report.

    DOT National Transportation Integrated Search

    2011-09-01

    This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The proposed LR...

  4. Load and resistance factor rating (LRFR) in New York State : volume I.

    DOT National Transportation Integrated Search

    2011-09-01

    This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The propo...

  5. Success Rates by Software Development Methodology in Information Technology Project Management: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Wright, Gerald P.

    2013-01-01

    Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…

  6. Rating the methodological quality of single-subject designs and n-of-1 trials: introducing the Single-Case Experimental Design (SCED) Scale.

    PubMed

    Tate, Robyn L; McDonald, Skye; Perdices, Michael; Togher, Leanne; Schultz, Regina; Savage, Sharon

    2008-08-01

    Rating scales that assess methodological quality of clinical trials provide a means to critically appraise the literature. Scales are currently available to rate randomised and non-randomised controlled trials, but there are none that assess single-subject designs. The Single-Case Experimental Design (SCED) Scale was developed for this purpose and evaluated for reliability. Six clinical researchers who were trained and experienced in rating methodological quality of clinical trials developed the scale and participated in reliability studies. The SCED Scale is an 11-item rating scale for single-subject designs, of which 10 items are used to assess methodological quality and use of statistical analysis. The scale was developed and refined over a 3-year period. Content validity was addressed by identifying items to reduce the main sources of bias in single-case methodology as stipulated by authorities in the field, which were empirically tested against 85 published reports. Inter-rater reliability was assessed using a random sample of 20/312 single-subject reports archived in the Psychological Database of Brain Impairment Treatment Efficacy (PsycBITE). Inter-rater reliability for the total score was excellent, both for individual raters (overall ICC = 0.84; 95% confidence interval 0.73-0.92) and for consensus ratings between pairs of raters (overall ICC = 0.88; 95% confidence interval 0.78-0.95). Item reliability was fair to excellent for consensus ratings between pairs of raters (range κ = 0.48 to 1.00). The results were replicated with two independent novice raters who were trained in the use of the scale (ICC = 0.88, 95% confidence interval 0.73-0.95). The SCED Scale thus provides a brief and valid evaluation of methodological quality of single-subject designs, with the total score demonstrating excellent inter-rater reliability using both individual and consensus ratings. Items from the scale can also be used as a checklist in the design, reporting and critical appraisal of single-subject designs, thereby helping to improve standards of single-case methodology.
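
    The reliability figures above are intraclass correlation coefficients (ICCs). As a minimal sketch of how such a coefficient is computed (the abstract does not state which ICC form was used; the two-way random-effects ICC(2,1) of Shrout and Fleiss is assumed here, and the ratings are invented):

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater
    (Shrout & Fleiss, 1979). `ratings` has shape (subjects, raters)."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_total = ((x - grand) ** 2).sum()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between raters
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Toy data: six reports scored by two raters on an 11-point scale.
print(round(icc_2_1([[7, 8], [5, 5], [9, 9], [3, 4], [6, 6], [8, 7]]), 3))
```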

  7. Determination of Time Dependent Virus Inactivation Rates

    NASA Astrophysics Data System (ADS)

    Chrysikopoulos, C. V.; Vogler, E. T.

    2003-12-01

    A methodology is developed for estimating temporally variable virus inactivation rate coefficients from experimental virus inactivation data. The methodology consists of a technique for slope estimation of normalized virus inactivation data in conjunction with a resampling parameter estimation procedure. The slope estimation technique is based on a relatively flexible geostatistical method known as universal kriging. Drift coefficients are obtained by nonlinear fitting of bootstrap samples and the corresponding confidence intervals are obtained by bootstrap percentiles. The proposed methodology yields more accurate time dependent virus inactivation rate coefficients than those estimated by fitting virus inactivation data to a first-order inactivation model. The methodology is successfully applied to a set of poliovirus batch inactivation data. Furthermore, the importance of accurate inactivation rate coefficient determination on virus transport in water saturated porous media is demonstrated with model simulations.
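
    A rough illustration of the resampling step described above (a sketch only: the study estimates slopes by universal kriging, while this toy fits the simpler first-order model the authors compare against; the data are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical batch data: time (h) and normalized concentration C/C0.
t = np.array([0.0, 2.0, 4.0, 8.0, 12.0, 24.0, 48.0])
c = np.array([1.0, 0.82, 0.66, 0.45, 0.31, 0.10, 0.012])

def inactivation_rate(t, c):
    # First-order model ln(C/C0) = -lambda*t, fitted by least squares.
    return -np.polyfit(t, np.log(c), 1)[0]

lam = inactivation_rate(t, c)
boot = []
for _ in range(2000):                       # resample (t, c) pairs
    i = rng.integers(0, len(t), len(t))
    boot.append(inactivation_rate(t[i], c[i]))
lo, hi = np.percentile(boot, [2.5, 97.5])   # bootstrap percentile CI
print(f"lambda = {lam:.3f} 1/h, 95% CI [{lo:.3f}, {hi:.3f}]")
```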

  8. ChargeOut! : determining machine and capital equipment charge-out rates using discounted cash-flow analysis

    Treesearch

    E.M. (Ted) Bilek

    2007-01-01

    The model ChargeOut! was developed to determine charge-out rates or rates of return for machines and capital equipment. This paper introduces a costing methodology and applies it to a piece of capital equipment. Although designed for the forest industry, the methodology is readily transferable to other sectors. Based on discounted cash-flow analysis, ChargeOut!...
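
    A minimal sketch of the discounted cash-flow arithmetic behind a charge-out rate (all figures invented; the actual ChargeOut! model tracks many more cost categories such as repairs, taxes, and insurance):

```python
# Hourly rate at which discounted charges just cover discounted costs.
purchase = 250_000.0        # machine cost at year 0
salvage = 50_000.0          # resale value at the end of the horizon
years = 5
hours_per_year = 1_500.0
operating_cost = 60_000.0   # fuel, labor, maintenance per year
r = 0.08                    # discount rate (required rate of return)

pv_annuity = sum(1.0 / (1.0 + r) ** y for y in range(1, years + 1))
pv_costs = (purchase - salvage / (1.0 + r) ** years
            + operating_cost * pv_annuity)
rate = pv_costs / (hours_per_year * pv_annuity)
print(f"break-even charge-out rate: ${rate:.2f}/h")
```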

  9. Application Of The Iberdrola Licensing Methodology To The Cofrentes BWR-6 110% Extended Power Up-rate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mata, Pedro; Fuente, Rafael de la; Iglesias, Javier

    Iberdrola (Spanish utility) and Iberdrola Ingenieria (engineering branch) have been developing the 110% Extended Power Up-rate Project (EPU 110%) for Cofrentes BWR-6 during the last two years. Iberdrola has an in-house design and licensing reload methodology that has been approved by the Spanish Nuclear Regulatory Authority. This methodology has already been used to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 to 14. The methodology has also been applied to develop a significant number of safety analyses for the Cofrentes Extended Power Up-rate, including: Reactor Heat Balance, Core and Fuel Performance, Thermal Hydraulic Stability, ECCS LOCA Evaluation, Transient Analysis, Anticipated Transient Without Scram (ATWS), and Station Blackout (SBO). Since the scope of the licensing process of the Cofrentes Extended Power Up-rate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it was necessary to extend the applicability of the Cofrentes licensing methodology to the analysis of new transients. This is the case of the TLFW (Total Loss of Feedwater) transient. This paper shows the benefits of having an in-house design and licensing methodology and describes the process of extending the applicability of the methodology to the analysis of new transients. The analysis of Total Loss of Feedwater with the Cofrentes Retran Model is included as an example of this process.

  10. Predictive Methodology for Delamination Growth in Laminated Composites Part 1: Theoretical Development and Preliminary Experimental Results

    DOT National Transportation Integrated Search

    1998-04-01

    A methodology is presented for the prediction of delamination growth in laminated structures. The methodology is aimed at overcoming computational difficulties in the determination of energy release rate and mode mix. It also addresses the issue that...

  11. Development of a Practical Methodology for Elastic-Plastic and Fully Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    McClung, R. C.; Chell, G. G.; Lee, Y. -D.; Russell, D. A.; Orient, G. E.

    1999-01-01

    A practical engineering methodology has been developed to analyze and predict fatigue crack growth rates under elastic-plastic and fully plastic conditions. The methodology employs the closure-corrected effective range of the J-integral, ΔJ_eff, as the governing parameter. The methodology contains original and literature J and ΔJ solutions for specific geometries, along with general methods for estimating J for other geometries and other loading conditions, including combined mechanical loading and combined primary and secondary loading. The methodology also contains specific practical algorithms that translate a J solution into a prediction of fatigue crack growth rate or life, including methods for determining crack opening levels, crack instability conditions, and material properties. A critical core subset of the J solutions and the practical algorithms has been implemented into independent elastic-plastic NASGRO modules. All components of the entire methodology, including the NASGRO modules, have been verified through analysis and experiment, and limits of applicability have been identified.

  12. Development of a Practical Methodology for Elastic-Plastic and Fully Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    McClung, R. C.; Chell, G. G.; Lee, Y.-D.; Russell, D. A.; Orient, G. E.

    1999-01-01

    A practical engineering methodology has been developed to analyze and predict fatigue crack growth rates under elastic-plastic and fully plastic conditions. The methodology employs the closure-corrected effective range of the J-integral, ΔJ_eff, as the governing parameter. The methodology contains original and literature J and ΔJ solutions for specific geometries, along with general methods for estimating J for other geometries and other loading conditions, including combined mechanical loading and combined primary and secondary loading. The methodology also contains specific practical algorithms that translate a J solution into a prediction of fatigue crack growth rate or life, including methods for determining crack opening levels, crack instability conditions, and material properties. A critical core subset of the J solutions and the practical algorithms has been implemented into independent elastic-plastic NASGRO modules. All components of the entire methodology, including the NASGRO modules, have been verified through analysis and experiment, and limits of applicability have been identified.

  13. 45 CFR 98.101 - Case Review Methodology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    45 Public Welfare 1 (2011-10-01). Section 98.101, Public Welfare, DEPARTMENT OF HEALTH AND HUMAN SERVICES, GENERAL ADMINISTRATION, CHILD CARE AND DEVELOPMENT FUND, Error Rate Reporting, § 98.101 Case Review Methodology. (a) Case Reviews and Sampling—In preparing...

  14. 45 CFR 98.101 - Case Review Methodology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    45 Public Welfare 1 (2014-10-01). Section 98.101, Public Welfare, Department of Health and Human Services, GENERAL ADMINISTRATION, CHILD CARE AND DEVELOPMENT FUND, Error Rate Reporting, § 98.101 Case Review Methodology. (a) Case Reviews and Sampling—In preparing...

  15. 45 CFR 98.101 - Case Review Methodology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    45 Public Welfare 1 (2010-10-01). Section 98.101, Public Welfare, DEPARTMENT OF HEALTH AND HUMAN SERVICES, GENERAL ADMINISTRATION, CHILD CARE AND DEVELOPMENT FUND, Error Rate Reporting, § 98.101 Case Review Methodology. (a) Case Reviews and Sampling—In preparing...

  16. 45 CFR 98.101 - Case Review Methodology.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    45 Public Welfare 1 (2013-10-01). Section 98.101, Public Welfare, DEPARTMENT OF HEALTH AND HUMAN SERVICES, GENERAL ADMINISTRATION, CHILD CARE AND DEVELOPMENT FUND, Error Rate Reporting, § 98.101 Case Review Methodology. (a) Case Reviews and Sampling—In preparing...

  17. 45 CFR 98.101 - Case Review Methodology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    45 Public Welfare 1 (2012-10-01). Section 98.101, Public Welfare, DEPARTMENT OF HEALTH AND HUMAN SERVICES, GENERAL ADMINISTRATION, CHILD CARE AND DEVELOPMENT FUND, Error Rate Reporting, § 98.101 Case Review Methodology. (a) Case Reviews and Sampling—In preparing...

  18. Development of an Optimization Methodology for the Aluminum Alloy Wheel Casting Process

    NASA Astrophysics Data System (ADS)

    Duan, Jianglan; Reilly, Carl; Maijer, Daan M.; Cockcroft, Steve L.; Phillion, Andre B.

    2015-08-01

    An optimization methodology has been developed for the aluminum alloy wheel casting process. The methodology is focused on improving the timing of cooling processes in a die to achieve improved casting quality. This methodology utilizes (1) a casting process model developed within the commercial finite element package ABAQUS (a trademark of Dassault Systèmes); (2) a Python-based results extraction procedure; and (3) a numerical optimization module from the open-source Python library SciPy. To achieve optimal casting quality, a set of constraints has been defined to ensure directional solidification, and an objective function, based on the solidification cooling rates, has been defined to either maximize, or target a specific, cooling rate. The methodology has been applied to a series of casting and die geometries with different cooling system configurations, including a 2-D axisymmetric wheel and die assembly generated from a full-scale prototype wheel. The results show that, with properly defined constraint and objective functions, solidification conditions can be improved and optimal cooling conditions can be achieved, leading to process productivity and product quality improvements.
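
    A minimal sketch of the constrained-optimization step (assumptions: the toy objective below stands in for cooling rates that the real methodology extracts from ABAQUS results, and the inequality is an invented proxy for a directional-solidification constraint):

```python
import numpy as np
from scipy.optimize import minimize

def neg_mean_cooling_rate(x):
    # Stand-in response: x holds two cooling-channel start times (s);
    # SciPy minimizes, so return the negative of the quantity to maximize.
    return -(10.0 - 0.5 * (x[0] - 3.0) ** 2 - 0.8 * (x[1] - 5.0) ** 2)

# "Channel 2 must start at least 3 s after channel 1" (>= 0 for SLSQP).
cons = [{"type": "ineq", "fun": lambda x: x[1] - x[0] - 3.0}]
bounds = [(0.0, 20.0), (0.0, 20.0)]

res = minimize(neg_mean_cooling_rate, x0=[2.0, 8.0],
               bounds=bounds, constraints=cons, method="SLSQP")
print(res.x, -res.fun)   # optimal timings and the achieved cooling rate
```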

  19. Measuring the impact of methodological research: a framework and methods to identify evidence of impact.

    PubMed

    Brueton, Valerie C; Vale, Claire L; Choodari-Oskooei, Babak; Jinks, Rachel; Tierney, Jayne F

    2014-11-27

    Providing evidence of impact highlights the benefits of medical research to society. Such evidence is increasingly requested by research funders and commonly relies on citation analysis. However, other indicators may be more informative. Although frameworks to demonstrate the impact of clinical research have been reported, no complementary framework exists for methodological research. Therefore, we assessed the impact of methodological research projects conducted or completed between 2009 and 2012 at the UK Medical Research Council Clinical Trials Unit Hub for Trials Methodology Research, with a view to developing an appropriate framework. Various approaches to the collection of data on research impact were employed. Citation rates were obtained using Web of Science (http://www.webofknowledge.com/) and analyzed descriptively. Semistructured interviews were conducted to obtain information on the rates of different types of research output that indicated impact for each project. Results were then pooled across all projects. Finally, email queries pertaining to methodology projects were collected retrospectively and their content analyzed. Simple citation analysis established the citation rates per year since publication for 74 methodological publications; however, further detailed analysis revealed more about the potential influence of these citations. Interviews that spanned 20 individual research projects demonstrated a variety of types of impact not otherwise collated, for example, applications and further developments of the research; release of software and provision of guidance materials to facilitate uptake; formation of new collaborations and broad dissemination. Finally, 194 email queries relating to 6 methodological projects were received from 170 individuals across 23 countries. They provided further evidence that the methodologies were impacting on research and research practice, both nationally and internationally. We have used the information gathered in this study to adapt an existing framework for the impact of clinical research for use in methodological research. Gathering evidence on the research impact of methodological research from a variety of sources has enabled us to obtain multiple indicators and thus to demonstrate broad impacts of methodological research. The adapted framework can be applied to future methodological research and thus provides a tool for methodologists to better assess and report research impacts.

  20. Evaluating Cross-National Metrics of Tertiary Graduation Rates for OECD Countries: A Case for Increasing Methodological Congruence and Data Comparability

    ERIC Educational Resources Information Center

    Heuser, Brian L.; Drake, Timothy A.; Owens, Taya L.

    2013-01-01

    By examining the different methods and processes by which national data gathering agencies compile and submit their findings to the Organization for Economic Cooperation and Development (OECD), the authors (1) assess the methodological challenges of accurately reporting tertiary completion and graduation rates cross-nationally; (2) to examine the…

  1. 76 FR 33419 - Nationally Recognized Statistical Rating Organizations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-08

    ... documentation of the internal control structure) or should the factors focus on the design (i.e., establishment... related to implementing them. a. Controls reasonably designed to ensure that a newly developed methodology... U.S.C. 78o-7(r)(1)(A). b. Controls reasonably designed to ensure that a newly developed methodology...

  2. EPA’s AP-42 development methodology: Converting or rerating current AP-42 datasets

    USDA-ARS?s Scientific Manuscript database

    In August 2013, the U.S. Environmental Protection Agency (EPA) published its new methodology for updating the Compilation of Air Pollution Emission Factors (AP-42). The “Recommended Procedures for Development of Emissions Factors and Use of the WebFIRE Database” instructs that the ratings of the...

  3. Design Development Test and Evaluation (DDT and E) Considerations for Safe and Reliable Human Rated Spacecraft Systems

    NASA Technical Reports Server (NTRS)

    Miller, James; Leggett, Jay; Kramer-White, Julie

    2008-01-01

    A team directed by the NASA Engineering and Safety Center (NESC) collected methodologies for how best to develop safe and reliable human rated systems and how to identify the drivers that provide the basis for assessing safety and reliability. The team also identified techniques, methodologies, and best practices to assure that NASA can develop safe and reliable human rated systems. The results are drawn from a wide variety of resources, from experts involved with the space program since its inception to the best practices espoused in contemporary engineering doctrine. This report focuses on safety and reliability considerations and does not duplicate or update any existing references. Neither does it intend to replace existing standards and policy.

  4. A Descent Rate Control Approach to Developing an Autonomous Descent Vehicle

    NASA Astrophysics Data System (ADS)

    Fields, Travis D.

    Circular parachutes have been used for aerial payload/personnel deliveries for over 100 years. In the past two decades, significant work has been done to improve the landing accuracies of cargo deliveries for humanitarian and military applications. This dissertation discusses the approach developed in which a circular parachute is used in conjunction with an electro-mechanical reefing system to manipulate the landing location. Rather than attempt to steer the autonomous descent vehicle directly, control of the landing location is accomplished by modifying the amount of time spent in a particular wind layer. Descent rate control is performed by reversibly reefing the parachute canopy. The first stage of the research investigated the use of a single actuation during descent (with periodic updates), in conjunction with a curvilinear target. Simulation results using real-world wind data are presented, illustrating the utility of the methodology developed. Additionally, hardware development and flight-testing of the single actuation autonomous descent vehicle are presented. The next phase of the research focuses on expanding the single actuation descent rate control methodology to incorporate a multi-actuation path-planning system. By modifying the parachute size throughout the descent, the controllability of the system greatly increases. The trajectory planning methodology developed provides a robust approach to accurately manipulate the landing location of the vehicle. The primary benefits of this system are the inherent robustness to release location errors and the ability to overcome vehicle uncertainties (mass, parachute size, etc.). A separate application of the path-planning methodology is also presented. An in-flight path-prediction system was developed for use in high-altitude ballooning by utilizing the path-planning methodology developed for descent vehicles. The developed onboard system improves landing location predictions in-flight using collected flight information during the ascent and descent. Simulation and real-world flight tests (using the developed low-cost hardware) demonstrate the significance of the improvements achievable when flying the developed system.

  5. Multiobjective Optimization of Atmospheric Plasma Spray Process Parameters to Deposit Yttria-Stabilized Zirconia Coatings Using Response Surface Methodology

    NASA Astrophysics Data System (ADS)

    Ramachandran, C. S.; Balasubramanian, V.; Ananthapadmanabhan, P. V.

    2011-03-01

    Atmospheric plasma spraying is used extensively to make Thermal Barrier Coatings of 7-8% yttria-stabilized zirconia powders. The main problem faced in the manufacture of yttria-stabilized zirconia coatings by the atmospheric plasma spraying process is the selection of the optimum combination of input variables for achieving the required qualities of coating. This problem can be solved by the development of empirical relationships between the process parameters (input power, primary gas flow rate, stand-off distance, powder feed rate, and carrier gas flow rate) and the coating quality characteristics (deposition efficiency, tensile bond strength, lap shear bond strength, porosity, and hardness) through effective and strategic planning and execution of experiments by response surface methodology. This article highlights the use of response surface methodology by designing a five-factor, five-level central composite rotatable design matrix with full replication for the planning, conduct, and execution of experiments and the development of empirical relationships. Further, response surface methodology was used for the selection of optimum process parameters to achieve the desired quality of yttria-stabilized zirconia coating deposits.
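
    A hedged sketch of fitting such an empirical relationship: a quadratic with a two-factor interaction, shown for only two of the five factors with invented data (the study used a five-factor central composite design):

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented runs: input power (kW) and powder feed rate (g/min);
# response: deposition efficiency (%).
X = rng.uniform([25.0, 10.0], [45.0, 40.0], size=(30, 2))
y = (60 + 0.8 * X[:, 0] - 0.012 * X[:, 0] ** 2
     + 0.3 * X[:, 1] - 0.006 * X[:, 1] ** 2
     + 0.004 * X[:, 0] * X[:, 1] + rng.normal(0, 0.5, 30))

# Model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coef, 4))
```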

  6. Methodologies for Optimum Capital Expenditure Decisions for New Medical Technology

    PubMed Central

    Landau, Thomas P.; Ledley, Robert S.

    1980-01-01

    This study deals with the development of a theory and an analytical model to support decisions regarding capital expenditures for complex new medical technology. Formal methodologies and quantitative techniques developed by applied mathematicians and management scientists can be used by health planners to develop cost-effective plans for the utilization of medical technology on a community or region-wide basis. In order to maximize the usefulness of the model, it was developed and tested against multiple technologies. The types of technologies studied include capital and labor-intensive technologies, technologies whose utilization rates vary with hospital occupancy rate, technologies whose use can be scheduled, and limited-use and large-use technologies.

  7. Phenological prediction of forest pest-defoliators

    Treesearch

    Valentina Meshkova

    2003-01-01

    The methodology for predicting phenological events is useful for predicting the seasonal development of insects in the current year, for analyzing variation in the timing and rate of insect population development in different years, and for comparing different geographical and ecological insect populations by the timing and rate of different stages of seasonal development....

  8. A false-alarm aware methodology to develop robust and efficient multi-scale infrared small target detection algorithm

    NASA Astrophysics Data System (ADS)

    Moradi, Saed; Moallem, Payman; Sabahi, Mohamad Farzan

    2018-03-01

    False alarm rate and detection rate are still two contradictory metrics for infrared small target detection in an infrared search and track (IRST) system, despite the development of new detection algorithms. In certain circumstances, not detecting true targets is more tolerable than detecting false items as true targets. Hence, considering background clutter and detector noise as the sources of false alarms in an IRST system, a false-alarm-aware methodology is presented in this paper to reduce the false alarm rate while the detection rate remains undegraded. To this end, the advantages and disadvantages of each detection algorithm are investigated and the sources of its false alarms are determined. Two target detection algorithms having independent false alarm sources are chosen in such a way that the disadvantages of one algorithm can be compensated by the advantages of the other. In this work, multi-scale average absolute gray difference (AAGD) and Laplacian of point spread function (LoPSF) are utilized as the cornerstones of the desired algorithm of the proposed methodology. After presenting a conceptual model for the desired algorithm, it is implemented through the most straightforward mechanism. The desired algorithm effectively suppresses background clutter and eliminates detector noise. Also, since the input images are processed through just four different scales, the desired algorithm has good capability for real-time implementation. Simulation results, in terms of signal-to-clutter ratio and background suppression factor on real and simulated images, prove the effectiveness and performance of the proposed methodology. Since the desired algorithm was developed based on independent false alarm sources, the proposed methodology is expandable to any pair of detection algorithms that have different false alarm sources.
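
    A conceptual sketch of the fusion idea (assumptions: a single-scale uniform-filter difference stands in for AAGD, a Laplacian-of-Gaussian stands in for the LoPSF filter, and the mean-plus-k-sigma thresholds are invented):

```python
import numpy as np
from scipy.ndimage import gaussian_laplace, uniform_filter

def aagd(img, inner=3, outer=9):
    # Average absolute gray difference: a small-window mean minus a
    # large-window mean highlights point-like brightness bumps.
    return np.abs(uniform_filter(img, inner) - uniform_filter(img, outer))

def lopsf_like(img, sigma=1.5):
    # Stand-in for LoPSF: negative Laplacian of Gaussian, which peaks
    # on PSF-sized blobs.
    return -gaussian_laplace(img, sigma)

def fuse(a, b, k=4.0):
    # Keep only pixels that BOTH detectors flag: false alarms from
    # independent sources rarely coincide.
    return ((a > a.mean() + k * a.std()) &
            (b > b.mean() + k * b.std()))

rng = np.random.default_rng(2)
frame = rng.normal(100.0, 5.0, (128, 128))
frame[60:62, 60:62] += 40.0   # synthetic small target
print(np.argwhere(fuse(aagd(frame), lopsf_like(frame))))
```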

  9. Development of the Comprehensive Cervical Dystonia Rating Scale: Methodology

    PubMed Central

    Comella, Cynthia L.; Fox, Susan H.; Bhatia, Kailash P.; Perlmutter, Joel S.; Jinnah, Hyder A.; Zurowski, Mateusz; McDonald, William M.; Marsh, Laura; Rosen, Ami R.; Waliczek, Tracy; Wright, Laura J.; Galpern, Wendy R.; Stebbins, Glenn T.

    2016-01-01

    We present the methodology utilized for development and clinimetric testing of the Comprehensive Cervical Dystonia (CD) Rating Scale, or CCDRS. The CCDRS includes a revision of the Toronto Western Spasmodic Torticollis Rating Scale (TWSTRS-2), a newly developed psychiatric screening tool (TWSTRS-PSYCH), and the previously validated Cervical Dystonia Impact Profile (CDIP-58). For the revision of the TWSTRS, the original TWSTRS was examined by a committee of dystonia experts at a dystonia rating scales workshop organized by the Dystonia Medical Research Foundation. During this workshop, deficiencies in the standard TWSTRS were identified and recommendations for revision of the severity and pain subscales were incorporated into the TWSTRS-2. Given that no scale currently evaluates the psychiatric features of cervical dystonia (CD), we used a modified Delphi methodology and a reiterative process of item selection to develop the TWSTRS-PSYCH. We also included the CDIP-58 to capture the impact of CD on quality of life. The three scales (TWSTRS-2, TWSTRS-PSYCH, and CDIP-58) were combined to construct the CCDRS. Clinimetric testing of the reliability and validity of the CCDRS is described. The CCDRS was designed to be used in a modular fashion that can measure the full spectrum of CD. This scale will provide rigorous assessment for studies of natural history as well as novel symptom-based or disease-modifying therapies. PMID:27088112

  10. Development of the Comprehensive Cervical Dystonia Rating Scale: Methodology.

    PubMed

    Comella, Cynthia L; Fox, Susan H; Bhatia, Kailash P; Perlmutter, Joel S; Jinnah, Hyder A; Zurowski, Mateusz; McDonald, William M; Marsh, Laura; Rosen, Ami R; Waliczek, Tracy; Wright, Laura J; Galpern, Wendy R; Stebbins, Glenn T

    2015-06-01

    We present the methodology utilized for development and clinimetric testing of the Comprehensive Cervical Dystonia (CD) Rating Scale, or CCDRS. The CCDRS includes a revision of the Toronto Western Spasmodic Torticollis Rating Scale (TWSTRS-2), a newly developed psychiatric screening tool (TWSTRS-PSYCH), and the previously validated Cervical Dystonia Impact Profile (CDIP-58). For the revision of the TWSTRS, the original TWSTRS was examined by a committee of dystonia experts at a dystonia rating scales workshop organized by the Dystonia Medical Research Foundation. During this workshop, deficiencies in the standard TWSTRS were identified and recommendations for revision of the severity and pain subscales were incorporated into the TWSTRS-2. Given that no scale currently evaluates the psychiatric features of cervical dystonia (CD), we used a modified Delphi methodology and a reiterative process of item selection to develop the TWSTRS-PSYCH. We also included the CDIP-58 to capture the impact of CD on quality of life. The three scales (TWSTRS-2, TWSTRS-PSYCH, and CDIP-58) were combined to construct the CCDRS. Clinimetric testing of the reliability and validity of the CCDRS is described. The CCDRS was designed to be used in a modular fashion that can measure the full spectrum of CD. This scale will provide rigorous assessment for studies of natural history as well as novel symptom-based or disease-modifying therapies.

  11. Transfer Assembly Project, 2001: VCCS Transfer Rates.

    ERIC Educational Resources Information Center

    McHewitt, Earl R.; Taylor, Garry

    This document discusses the transfer rates of students who entered Virginia's community colleges in the fall of 1995, using the methodology developed by the Center for the Study of Community Colleges. Numerous tables in the document include individual college rates with breakdowns by race/ethnicity and gender. College-specific transfer rates are…

  12. PRECEPT: an evidence assessment framework for infectious disease epidemiology, prevention and control.

    PubMed

    Harder, Thomas; Takla, Anja; Eckmanns, Tim; Ellis, Simon; Forland, Frode; James, Roberta; Meerpohl, Joerg J; Morgan, Antony; Rehfuess, Eva; Schünemann, Holger; Zuiderent-Jerak, Teun; de Carvalho Gomes, Helena; Wichmann, Ole

    2017-10-01

    Decisions in public health should be based on the best available evidence, reviewed and appraised using a rigorous and transparent methodology. The Project on a Framework for Rating Evidence in Public Health (PRECEPT) defined a methodology for evaluating and grading evidence in infectious disease epidemiology, prevention and control that takes different domains and question types into consideration. The methodology rates evidence in four domains: disease burden, risk factors, diagnostics and intervention. The framework guiding it has four steps going from overarching questions to an evidence statement. In step 1, approaches for identifying relevant key areas and developing specific questions to guide systematic evidence searches are described. In step 2, methodological guidance for conducting systematic reviews is provided; 15 study quality appraisal tools are proposed and an algorithm is given for matching a given study design with a tool. In step 3, a standardised evidence-grading scheme using the Grading of Recommendations Assessment, Development and Evaluation Working Group (GRADE) methodology is provided, whereby findings are documented in evidence profiles. Step 4 consists of preparing a narrative evidence summary. Users of this framework should be able to evaluate and grade scientific evidence from the four domains in a transparent and reproducible way.

  13. PRECEPT: an evidence assessment framework for infectious disease epidemiology, prevention and control

    PubMed Central

    Harder, Thomas; Takla, Anja; Eckmanns, Tim; Ellis, Simon; Forland, Frode; James, Roberta; Meerpohl, Joerg J; Morgan, Antony; Rehfuess, Eva; Schünemann, Holger; Zuiderent-Jerak, Teun; de Carvalho Gomes, Helena; Wichmann, Ole

    2017-01-01

    Decisions in public health should be based on the best available evidence, reviewed and appraised using a rigorous and transparent methodology. The Project on a Framework for Rating Evidence in Public Health (PRECEPT) defined a methodology for evaluating and grading evidence in infectious disease epidemiology, prevention and control that takes different domains and question types into consideration. The methodology rates evidence in four domains: disease burden, risk factors, diagnostics and intervention. The framework guiding it has four steps going from overarching questions to an evidence statement. In step 1, approaches for identifying relevant key areas and developing specific questions to guide systematic evidence searches are described. In step 2, methodological guidance for conducting systematic reviews is provided; 15 study quality appraisal tools are proposed and an algorithm is given for matching a given study design with a tool. In step 3, a standardised evidence-grading scheme using the Grading of Recommendations Assessment, Development and Evaluation Working Group (GRADE) methodology is provided, whereby findings are documented in evidence profiles. Step 4 consists of preparing a narrative evidence summary. Users of this framework should be able to evaluate and grade scientific evidence from the four domains in a transparent and reproducible way. PMID:29019317

  14. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation. Volume 2, Part 2: Appendixes B, C, D and E

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    The derivation of the equations is presented, the rate control algorithm described, and simulation methodologies summarized. A set of dynamics equations that can be used recursively to calculate forces and torques acting at the joints of an n-link manipulator, given the manipulator joint rates, is derived. The equations are valid for any n-link manipulator system with any kind of joints connected in any sequence. The equations of motion for the class of manipulators consisting of n rigid links interconnected by rotary joints are derived. A technique is outlined for reducing the system of equations to eliminate constraint torques. The linearized dynamics equations for an n-link manipulator system are derived. The general n-link linearized equations are then applied to a two-link configuration. The coordinated rate control algorithm used to compute individual joint rates when given end-effector rates is described. A short discussion of simulation methodologies is presented.
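
    A minimal sketch of a coordinated rate control step of the kind described: mapping commanded end-effector rates to joint rates. The Jacobian-pseudoinverse form below is a standard realization, not necessarily the report's exact algorithm; the planar two-link Jacobian is a textbook example.

```python
import numpy as np

def joint_rates(jacobian, ee_rates):
    # Resolved-rate mapping: qdot = pinv(J) @ xdot.
    return np.linalg.pinv(jacobian) @ ee_rates

# Planar two-link arm with unit link lengths and joint angles q1, q2:
# x = cos(q1) + cos(q1+q2), y = sin(q1) + sin(q1+q2).
q1, q2 = 0.3, 0.8
J = np.array([
    [-np.sin(q1) - np.sin(q1 + q2), -np.sin(q1 + q2)],
    [ np.cos(q1) + np.cos(q1 + q2),  np.cos(q1 + q2)],
])
xdot = np.array([0.1, 0.0])      # desired end-effector velocity (m/s)
print(joint_rates(J, xdot))      # joint rates (rad/s)
```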

  15. Methodology and Implications of Statewide Success Rates of Community College Students.

    ERIC Educational Resources Information Center

    McConochie, Daniel D.; Rajasekhara, Koosappa

    In 1991, the Maryland State Board for Community Colleges developed the "success rate," a reporting index which combined graduation, transfer, and persistence rates. Success rate matrices were produced by tracking first-time, full-time students representing seven cohorts (1980 to 1986) over a 4-year period, and matching entering…

  16. Effective Rating Scale Development for Speaking Tests: Performance Decision Trees

    ERIC Educational Resources Information Center

    Fulcher, Glenn; Davidson, Fred; Kemp, Jenny

    2011-01-01

    Rating scale design and development for testing speaking is generally conducted using one of two approaches: the measurement-driven approach or the performance data-driven approach. The measurement-driven approach prioritizes the ordering of descriptors onto a single scale. Meaning is derived from the scaling methodology and the agreement of…

  17. VALIDATION OF A METHOD FOR ESTIMATING POLLUTION EMISSION RATES FROM AREA SOURCES USING OPEN-PATH FTIR SPECTROSCOPY AND DISPERSION MODELING TECHNIQUES

    EPA Science Inventory

    The paper describes a methodology developed to estimate emissions factors for a variety of different area sources in a rapid, accurate, and cost-effective manner. The methodology involves using an open-path Fourier transform infrared (FTIR) spectrometer to measure concentrations o...

  18. Generalized quantum kinetic expansion: Higher-order corrections to multichromophoric Förster theory

    NASA Astrophysics Data System (ADS)

    Wu, Jianlan; Gong, Zhihao; Tang, Zhoufei

    2015-08-01

    For a general two-cluster energy transfer network, a generalized quantum kinetic expansion (GQKE) method is developed that predicts an exact time-convolution equation for the cluster population evolution under the initial condition of the local cluster equilibrium state. The cluster-to-cluster rate kernel is expanded in the inter-cluster couplings. The lowest, second-order GQKE rate recovers the multichromophoric Förster theory (MCFT) rate. The higher-order corrections to the MCFT rate are systematically included using the continued-fraction resummation form, resulting in the resummed GQKE method. The reliability of the GQKE methodology is verified in two model systems, revealing the relevance of higher-order corrections.
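
    A hedged LaTeX sketch of the structure the abstract describes (notation assumed: K^(2) is the second-order kernel that recovers the MCFT rate and K^(4) the next correction; the resummed form shown is the lowest continued-fraction, Padé-type approximant and may differ in detail from the paper's):

```latex
\begin{align}
  \dot{P}_1(t) &= -\int_0^{t} \mathrm{d}\tau\, K(\tau)\, P_1(t-\tau)
                  + \int_0^{t} \mathrm{d}\tau\, K'(\tau)\, P_2(t-\tau), \\
  K &= K^{(2)} + K^{(4)} + \cdots
  \quad\longrightarrow\quad
  K_{\mathrm{resum}} = \frac{K^{(2)}}{1 - K^{(4)}/K^{(2)}}.
\end{align}
```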

  19. Scoping reviews: time for clarity in definition, methods, and reporting.

    PubMed

    Colquhoun, Heather L; Levac, Danielle; O'Brien, Kelly K; Straus, Sharon; Tricco, Andrea C; Perrier, Laure; Kastner, Monika; Moher, David

    2014-12-01

    The scoping review has become increasingly popular as a form of knowledge synthesis. However, a lack of consensus on scoping review terminology, definition, methodology, and reporting limits the potential of this form of synthesis. In this article, we propose recommendations to further advance the field of scoping review methodology. We summarize current understanding of scoping review publication rates, terms, definitions, and methods. We propose three recommendations for clarity in term, definition and methodology. We recommend adopting the terms "scoping review" or "scoping study" and the use of a proposed definition. Until such time as further guidance is developed, we recommend the use of the methodological steps outlined in the Arksey and O'Malley framework and further enhanced by Levac et al. The development of reporting guidance for the conduct and reporting of scoping reviews is underway. Consistency in the proposed domains and methodologies of scoping reviews, along with the development of reporting guidance, will facilitate methodological advancement, reduce confusion, facilitate collaboration and improve knowledge translation of scoping review findings.

  20. Mixed oxidizer hybrid propulsion system optimization under uncertainty using applied response surface methodology and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Whitehead, James Joshua

    The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. Ternary diagrams, including contour and surface plots, were developed and utilized to aid in visualization. The concept of Expanded-Durov diagrams was also adopted and adapted to this study to aid in visualization of uncertainty bounds. Regions of maximum regression rate and associated uncertainties were determined for each set of case scenarios. Application of response surface methodology coupled with probabilistic-based MCS allowed for flexible and comprehensive interrogation of mixture and operating design space during optimization cases. Analyses were also conducted to assess sensitivity of uncertainty to variations in key elemental uncertainty estimates. The methodology developed during this research provides an innovative optimization tool for future propulsion design efforts.
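
    A minimal sketch of the Monte Carlo propagation step (hypothetical response-surface coefficients and uncertainty levels; the study's coefficients came from its DOE mixture experiments):

```python
import numpy as np

rng = np.random.default_rng(3)

def regression_rate(ox_frac, pressure):
    # Hypothetical quadratic response surface for fuel regression rate (mm/s).
    return (0.5 + 1.2 * ox_frac - 0.6 * ox_frac ** 2
            + 0.08 * pressure - 0.002 * pressure ** 2
            + 0.01 * ox_frac * pressure)

# Gaussian scatter around a nominal operating point.
n = 100_000
ox = rng.normal(0.70, 0.02, n)    # oxidizer mass fraction
p = rng.normal(20.0, 1.0, n)      # chamber pressure (bar)
r = regression_rate(ox, p)
print(f"mean {r.mean():.3f} mm/s, "
      f"95% band [{np.percentile(r, 2.5):.3f}, {np.percentile(r, 97.5):.3f}]")
```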

  1. Rain Rate Statistics in Southern New Mexico

    NASA Technical Reports Server (NTRS)

    Paulic, Frank J., Jr.; Horan, Stephen

    1997-01-01

    The methodology used in determining empirical rain-rate distributions for Southern New Mexico in the vicinity of the White Sands APT site is discussed. The hardware and the software developed to extract rain rate from the rain accumulation data collected at the White Sands APT site are described. The accuracy of Crane's Global Model for rain-rate predictions is analyzed.
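
    A sketch of how an empirical rain-rate exceedance distribution, the usual basis for comparison with Crane-model predictions, is formed from rate samples (the samples below are an invented stand-in for rates extracted from the accumulation records):

```python
import numpy as np

rng = np.random.default_rng(5)
rates = rng.lognormal(mean=0.0, sigma=1.2, size=5000)  # stand-in rates, mm/h

levels = np.array([1.0, 5.0, 10.0, 20.0, 40.0])        # mm/h thresholds
for lv in levels:
    pct = (rates > lv).mean() * 100.0                  # % of time exceeded
    print(f"{lv:5.1f} mm/h exceeded {pct:6.3f}% of the time")
```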

  2. The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation.

    PubMed

    Lewis, Cara C; Stanick, Cameo F; Martinez, Ruben G; Weiner, Bryan J; Kim, Mimi; Barwick, Melanie; Comtois, Katherine A

    2015-01-08

    Identification of psychometrically strong instruments for the field of implementation science is a high priority underscored in a recent National Institutes of Health working meeting (October 2013). Existing instrument reviews are limited in scope, methods, and findings. The Society for Implementation Research Collaboration Instrument Review Project's objectives address these limitations by identifying and applying a unique methodology to conduct a systematic and comprehensive review of quantitative instruments assessing constructs delineated in two of the field's most widely used frameworks, adopt a systematic search process (using standard search strings), and engage an international team of experts to assess the full range of psychometric criteria (reliability, construct and criterion validity). Although this work focuses on implementation of psychosocial interventions in mental health and health-care settings, the methodology and results will likely be useful across a broad spectrum of settings. This effort has culminated in a centralized online open-access repository of instruments depicting graphical head-to-head comparisons of their psychometric properties. This article describes the methodology and preliminary outcomes. The seven stages of the review, synthesis, and evaluation methodology include (1) setting the scope for the review, (2) identifying frameworks to organize and complete the review, (3) generating a search protocol for the literature review of constructs, (4) literature review of specific instruments, (5) development of an evidence-based assessment rating criteria, (6) data extraction and rating instrument quality by a task force of implementation experts to inform knowledge synthesis, and (7) the creation of a website repository. To date, this multi-faceted and collaborative search and synthesis methodology has identified over 420 instruments related to 34 constructs (total 48 including subconstructs) that are relevant to implementation science. Despite numerous constructs having greater than 20 available instruments, which implies saturation, preliminary results suggest that few instruments stem from gold standard development procedures. We anticipate identifying few high-quality, psychometrically sound instruments once our evidence-based assessment rating criteria have been applied. The results of this methodology may enhance the rigor of implementation science evaluations by systematically facilitating access to psychometrically validated instruments and identifying where further instrument development is needed.

  3. Space Transportation Operations: Assessment of Methodologies and Models

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla

    2001-01-01

    The systems design process for future space transportation involves understanding multiple variables and their effect on lifecycle metrics. Variables such as technology readiness or potential environmental impact are qualitative, while variables such as reliability, operations costs or flight rates are quantitative. In deciding what new design concepts to fund, NASA needs a methodology that would assess the sum total of all relevant qualitative and quantitative lifecycle metrics resulting from each proposed concept. The objective of this research was to review the state of operations assessment methodologies and models used to evaluate proposed space transportation systems and to develop recommendations for improving them. It was found that, compared to the models available from other sources, the operations assessment methodology recently developed at Kennedy Space Center has the potential to produce a decision support tool that will serve as the industry standard. Towards that goal, a number of areas of improvement in the Kennedy Space Center's methodology are identified.

  4. Space Transportation Operations: Assessment of Methodologies and Models

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla

    2002-01-01

    The systems design process for future space transportation involves understanding multiple variables and their effect on lifecycle metrics. Variables such as technology readiness or potential environmental impact are qualitative, while variables such as reliability, operations costs or flight rates are quantitative. In deciding what new design concepts to fund, NASA needs a methodology that would assess the sum total of all relevant qualitative and quantitative lifecycle metrics resulting from each proposed concept. The objective of this research was to review the state of operations assessment methodologies and models used to evaluate proposed space transportation systems and to develop recommendations for improving them. It was found that, compared to the models available from other sources, the operations assessment methodology recently developed at Kennedy Space Center has the potential to produce a decision support tool that will serve as the industry standard. Towards that goal, a number of areas of improvement in the Kennedy Space Center's methodology are identified.

  5. 18 CFR 342.4 - Other rate changing methodologies.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    18 Conservation of Power and Water Resources 1 (2012-04-01). Section 342.4, Conservation of Power and Water Resources, FEDERAL ENERGY REGULATORY... METHODOLOGIES AND PROCEDURES, § 342.4 Other rate changing methodologies. (a) Cost-of-service rates. A carrier may...

  6. Brief report: reporting practices of methodological information in four journals of pediatric and child psychology.

    PubMed

    Raad, Jennifer M; Bellinger, Skylar; McCormick, Erica; Roberts, Michael C; Steele, Ric G

    2008-08-01

    To replicate Sifers, Puddy, Warren, and Roberts (2002), who examined reporting rates of demographic, methodological, and ethical information in articles published during 1997, and to compare those rates to rates found in articles published during 2005, in order to determine whether and how reporting practices of these variables have changed over time. We examined the reporting of demographic, methodological, and ethical information in articles in four journals: Journal of Pediatric Psychology, Journal of Clinical Child and Adolescent Psychology, Journal of Abnormal Child Psychology, and Child Development. Reporting rates during 2005 were compared to those in articles published during 1997. These four journals improved on many of the 23 variables examined by Sifers et al., including increases in the reporting of ethnicity, attrition, child assent procedures, socioeconomic status, reliability, and rewards/incentives offered to participants. Improvements in descriptive information have implications for the interpretation, replication, and generalizability of research findings.

  7. REE radiation fault model: a tool for organizing and communicating radiation test data and constructing COTS-based spaceborne computing systems

    NASA Technical Reports Server (NTRS)

    Ferraro, R.; Some, R.

    2002-01-01

    The growth in data rates of instruments on future NASA spacecraft continues to outstrip the improvement in communications bandwidth and processing capabilities of radiation-hardened computers. Sophisticated autonomous operations strategies will further increase the processing workload. Given the reductions in spacecraft size and available power, standard radiation-hardened computing systems alone will not be able to address the requirements of future missions. The REE project was intended to overcome this obstacle by developing a COTS-based supercomputer suitable for use as a science and autonomy data processor in most space environments. This development required a detailed knowledge of system behavior in the presence of Single Event Effect (SEE)-induced faults so that mitigation strategies could be designed to recover system-level reliability while maintaining the COTS throughput advantage. The REE project has developed a suite of tools and a methodology for predicting SEU-induced transient fault rates in a range of natural space environments from ground-based radiation testing of component parts. In this paper we provide an overview of this methodology and tool set, with a concentration on the radiation fault model and its use in the REE system development methodology. Using test data reported elsewhere in this and other conferences, we predict upset rates for a particular COTS single-board computer configuration in several space environments.
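
    A back-of-envelope version of the core prediction (all numbers invented; the REE tools account for particle spectra, shielding, and device response in far more detail):

```python
# Upset rate ~ particle flux x per-bit upset cross-section x bit count.
flux = 2.0e2          # particles/cm^2/day in the assumed environment
sigma_bit = 1.0e-14   # cm^2/bit, from ground-based testing
bits = 512e6 * 8      # a hypothetical 512 MB memory
print(f"{flux * sigma_bit * bits:.2e} upsets/day")
```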

  8. Large-Eddy Simulation (LES) of a Compressible Mixing Layer and the Significance of Inflow Turbulence

    NASA Technical Reports Server (NTRS)

    Mankbadi, Mina Reda; Georgiadis, Nicholas J.; Debonis, James R.

    2017-01-01

    In the context of Large Eddy Simulation (LES), the effects of inflow turbulence are investigated through the Synthetic Eddy Method (SEM). The growth rate of a turbulent compressible mixing layer corresponding to the operating conditions of Goebel-Dutton Case 2 is investigated herein. The effects of spanwise width on the growth rate of the mixing layer are investigated such that spanwise-width independence is reached. The error in neglecting inflow turbulence effects is quantified by comparing two methodologies: (1) a hybrid RANS-LES methodology and (2) an SEM-LES methodology. Best practices learned from Case 2 are developed herein and then applied to a higher convective Mach number corresponding to the Case 4 experiments of Goebel and Dutton.

  9. A multimedia approach for teaching human embryology: Development and evaluation of a methodology.

    PubMed

    Moraes, Suzana Guimarães; Pereira, Luis Antonio Violin

    2010-12-20

    Human embryology requires students to understand the simultaneous changes in embryos, but students find it difficult to grasp the concepts presented and to visualise the related processes in three dimensions. The aims of this study have been to develop and evaluate new educational materials and a teaching methodology based on multimedia approaches to improve the comprehension of human development. The materials developed at the State University of Campinas include clinical histories, movies, animations, and ultrasound, as well as autopsy images from embryos and foetuses. The series of embryology lectures were divided into two parts. The first part of the series addressed the development of the body's structures, while in the second part, clinical history and the corresponding materials were shown to the students, who were encouraged to discuss the malformations. The teaching materials were made available on software used by the students in classes. At the end of the discipline, the material and methodology were evaluated with an attitudinal instrument, interviews, and knowledge examination. The response rate to the attitudinal instrument was 95.35%, and the response rate to the interview was 46%. The students approved of the materials and the teaching methodology (reliability of the attitudinal instrument was 0.9057). The exams showed that most students scored above 6.0. A multimedia approach proved useful for solving an important problem associated with teaching methods in many medical institutions: the lack of integration between basic sciences and clinical disciplines.

  10. Pressure Decay Testing Methodology for Quantifying Leak Rates of Full-Scale Docking System Seals

    NASA Technical Reports Server (NTRS)

    Dunlap, Patrick H., Jr.; Daniels, Christopher C.; Wasowski, Janice L.; Garafolo, Nicholas G.; Penney, Nicholas; Steinetz, Bruce M.

    2010-01-01

    NASA is developing a new docking system to support future space exploration missions to low-Earth orbit and the Moon. This system, called the Low Impact Docking System, is a mechanism designed to connect the Orion Crew Exploration Vehicle to the International Space Station, the lunar lander (Altair), and other future Constellation Project vehicles. NASA Glenn Research Center is playing a key role in developing the main interface seal for this docking system. This seal will be relatively large, with an outside diameter in the range of 54 to 58 in. (137 to 147 cm). As part of this effort, a new test apparatus has been designed, fabricated, and installed to measure leak rates of candidate full-scale seals under simulated thermal, vacuum, and engagement conditions. Using this test apparatus, a pressure decay testing and data processing methodology has been developed to quantify full-scale seal leak rates. Tests performed on untreated 54 in. diameter seals at room temperature in a fully compressed state resulted in leak rates below the requirement of 0.0025 lbm of air per day (0.0011 kg/day).
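
    A minimal sketch of the pressure-decay arithmetic (numbers invented): with the test volume sealed, the enclosed air mass is m = PV/(RT), so at constant volume and temperature the leak rate follows directly from the fitted pressure slope:

```python
R_AIR = 287.05    # specific gas constant of air, J/(kg*K)
V = 0.15          # sealed test volume, m^3 (invented)
T = 293.15        # gas temperature, K
dp_dt = -0.005    # fitted pressure slope, Pa/s (invented)

mdot = (V / (R_AIR * T)) * dp_dt   # kg/s; negative means mass leaving
print(f"{-mdot * 86_400.0:.2e} kg/day (requirement: < 0.0011 kg/day)")
```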

  11. Weigh-in-motion (WIM) data for site-specific LRFR bridge load rating.

    DOT National Transportation Integrated Search

    2011-08-12

    The live load factors in the Load and Resistance Factor Rating (LRFR) Manual are based on load data from Ontario thought to be representative of traffic volumes nationwide. However, in accordance with the methodology for developing site-specific l...

  12. Analysis of bubbles and crashes in the TRY/USD, TRY/EUR, TRY/JPY and TRY/CHF exchange rate within the scope of econophysics

    NASA Astrophysics Data System (ADS)

    Deviren, Bayram; Kocakaplan, Yusuf; Keskin, Mustafa; Balcılar, Mehmet; Özdemir, Zeynel Abidin; Ersoy, Ersan

    2014-09-01

    In this study, we analyze the Turkish Lira/US Dollar (TRY/USD), Turkish Lira/Euro (TRY/EUR), Turkish Lira/Japanese Yen (TRY/JPY) and Turkish Lira/Swiss Franc (TRY/CHF) exchange rates in the global financial crisis period to detect bubbles and crashes in the TRY, using a mathematical methodology developed by Watanabe et al. (2007). The methodology identifies bubbles and crashes in financial market price fluctuations by means of an exponential fitting of the associated data. It is applied to the TRY/USD, TRY/EUR, TRY/JPY and TRY/CHF exchange rates from January 1, 2005 to December 20, 2013. In this methodology, the whole period of a bubble or crash can be determined purely from past data, and the start of a bubble or crash can be identified even before it bursts. In this way, the periods of bubbles and crashes in the TRY/USD, TRY/EUR, TRY/JPY and TRY/CHF are determined, and the beginning and end points of these periods are detected. The results show that crashes in the TRY/CHF exchange rate commonly end earlier than in the other exchange rates; hence, the end of a crash in the TRY/CHF suggests that the crashes in the other exchange rates are likely to end soon afterwards. We also find that the periods of crashes in the TRY/EUR and TRY/USD exchange rates last longer than in the other exchange rates. This information can be used in risk management and/or for speculative gain.
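
    The core operation the abstract describes is an exponential fit to price data. The sketch below is a loose illustration of that idea, not Watanabe et al.'s actual criterion: it fits log-price against time in sliding windows and reports the fitted growth exponent, which becomes markedly positive in a bubble-like run-up; the window length and the synthetic series are assumptions.

    ```python
    import numpy as np

    def window_growth_exponents(prices, window=30):
        """Fit log(p) ~ a + b*t in sliding windows; b is the exponential
        growth exponent (b >> 0 suggests a bubble, b << 0 a crash)."""
        logp = np.log(prices)
        t = np.arange(window)
        exponents = []
        for i in range(len(prices) - window + 1):
            b, _ = np.polyfit(t, logp[i:i + window], 1)   # slope of log-price
            exponents.append(b)
        return np.array(exponents)

    # Hypothetical series: calm random drift, then an exponential run-up
    rng = np.random.default_rng(1)
    calm = 1.5 * np.exp(np.cumsum(rng.normal(0, 0.002, 200)))
    bubble = calm[-1] * np.exp(0.01 * np.arange(1, 61))
    b = window_growth_exponents(np.concatenate([calm, bubble]))
    print("max exponent in calm part :", b[:150].max().round(4))
    print("max exponent near run-up  :", b[-30:].max().round(4))
    ```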

  13. Epidemiology of fetal alcohol syndrome in a South African community in the Western Cape Province.

    PubMed Central

    May, P A; Brooke, L; Gossage, J P; Croxford, J; Adnams, C; Jones, K L; Robinson, L; Viljoen, D

    2000-01-01

    OBJECTIVES: This study determined the characteristics of fetal alcohol syndrome in a South African community, and methodology was designed for the multidisciplinary study of fetal alcohol syndrome in developing societies. METHODS: An active case ascertainment, 2-tier methodology was used among 992 first-grade pupils. A case-control design, using measures of growth, development, dysmorphology, and maternal risk, delineated characteristics of children with fetal alcohol syndrome. RESULTS: A high rate of fetal alcohol syndrome was found in the schools--40.5 to 46.4 per 1000 children aged 5 to 9 years--and age-specific community rates (ages 6-7) were 39.2 to 42.9. These rates are 18 to 141 times greater than in the United States. Rural residents had significantly more fetal alcohol syndrome. After control for ethnic variation, children with fetal alcohol syndrome had traits similar to those elsewhere: poor growth and development, congruent dysmorphology, and lower intellectual functioning. CONCLUSIONS: This study documented the highest fetal alcohol syndrome rate to date in an overall community population. Fetal alcohol syndrome initiatives that incorporate innovative sampling and active case ascertainment methods can be used to obtain timely and accurate data among developing populations. PMID:11111264

  14. Spanish methodological approach for biosphere assessment of radioactive waste disposal.

    PubMed

    Agüero, A; Pinedo, P; Cancio, D; Simón, I; Moraleda, M; Pérez-Sánchez, D; Trueba, C

    2007-10-01

    The development of radioactive waste disposal facilities requires implementation of measures that will afford protection of human health and the environment over a specific temporal frame that depends on the characteristics of the wastes. The repository design is based on a multi-barrier system: (i) the near-field or engineered barrier, (ii) far-field or geological barrier and (iii) the biosphere system. Here, the focus is on the analysis of this last system, the biosphere. A description is provided of conceptual developments, methodological aspects and software tools used to develop the Biosphere Assessment Methodology in the context of high-level waste (HLW) disposal facilities in Spain. This methodology is based on the BIOMASS "Reference Biospheres Methodology" and provides a logical and systematic approach with supplementary documentation that helps to support the decisions necessary for model development. It follows a five-stage approach, such that a coherent biosphere system description and the corresponding conceptual, mathematical and numerical models can be built. A discussion on the improvements implemented through application of the methodology to case studies in international and national projects is included. Some facets of this methodological approach still require further consideration, principally an enhanced integration of climatology, geography and ecology into models considering evolution of the environment, some aspects of the interface between the geosphere and biosphere, and an accurate quantification of environmental change processes and rates.

  15. 42 CFR 413.220 - Methodology for calculating the per-treatment base rate under the ESRD prospective payment system...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ....171 of this part, into a single per treatment base rate developed from 2007 claims data. The steps to..., or 2009. CMS removes the effects of enrollment and price growth from total expenditures for 2007...

  16. Preloading To Accelerate Slow-Crack-Growth Testing

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, John P.; Choi, Sung R.; Pawlik, Ralph J.

    2004-01-01

    An accelerated-testing methodology has been developed for measuring the slow-crack-growth (SCG) behavior of brittle materials. Like the prior methodology, the accelerated-testing methodology involves dynamic fatigue (constant stress-rate) testing, in which a load or a displacement is applied to a specimen at a constant rate. SCG parameters or life-prediction parameters needed for designing components made of the same material as that of the specimen are calculated from the relationship between (1) the strength of the material as measured in the test and (2) the applied stress rate used in the test. Despite its simplicity and convenience, dynamic fatigue testing as practiced heretofore has one major drawback: it is extremely time-consuming, especially at low stress rates. The present accelerated methodology reduces the time needed to test a specimen at a given rate of applied load, stress, or displacement. Instead of starting the test from zero applied load or displacement as in the prior methodology, one preloads the specimen and increases the applied load at the specified rate (see Figure 1). One might expect the preload to alter the results of the test, and indeed it does, but fortunately it is possible to account for the effect of the preload in interpreting the results. The accounting is done by calculating the normalized strength (defined as the strength in the presence of preload divided by the strength in the absence of preload) as a function of (1) the preloading factor (defined as the preload stress divided by the strength in the absence of preload) and (2) an SCG parameter, denoted n, that is used in a power-law crack-speed formulation. Figure 2 presents numerical results from this theoretical calculation.
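
    The normalized-strength accounting can be sketched under idealized assumptions: power-law crack growth, failure when the damage integral of sigma(t)**n reaches its no-preload value, and an instantaneously applied preload. Under those assumptions the normalized strength works out to (1 + alpha**(n+1))**(1/(n+1)); this is a derivation sketch consistent with the abstract's description, not necessarily the authors' exact expression.

    ```python
    def normalized_strength(alpha, n):
        """Strength with preload / strength without preload.

        Assumes power-law crack growth (v ~ K**n), a constant applied stress
        rate after an instantaneously applied preload, and failure when the
        damage integral of sigma(t)**n reaches its no-preload value.
        """
        return (1.0 + alpha ** (n + 1)) ** (1.0 / (n + 1))

    for n in (10, 20, 40):                 # typical SCG exponents for ceramics
        for alpha in (0.5, 0.8, 0.9):      # preloading factor
            print(f"n={n:2d} alpha={alpha:.1f} -> {normalized_strength(alpha, n):.4f}")
    ```

    Even at a preloading factor of 0.9 the correction is small for large n, which is consistent with the abstract's point that the preload effect can be accounted for straightforwardly.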

  17. Strengthening the Validity of Population-Based Suicide Rate Comparisons: An Illustration Using U.S. Military and Civilian Data

    ERIC Educational Resources Information Center

    Eaton, Karen M.; Messer, Stephen C.; Garvey Wilson, Abigail L.; Hoge, Charles W.

    2006-01-01

    The objectives of this study were to generate precise estimates of suicide rates in the military while controlling for factors contributing to rate variability such as demographic differences and classification bias, and to develop a simple methodology for the determination of statistically derived thresholds for detecting significant rate…

  18. Promoting energy efficiency through improved electricity pricing: A mid-project report

    NASA Astrophysics Data System (ADS)

    Acton, J. P.; Kohler, D. F.; Mitchell, B. M.; Park, R. E.

    1982-03-01

    Five related areas of electricity demand analysis under alternative rate forms were studied. Adjustments by large commercial and industrial customers are examined. Residential demand under time of day (TOD) pricing is examined. A methodology for evaluating alternative rate structures is developed and applied.

  19. Development of a methodology for accident causation research

    DOT National Transportation Integrated Search

    1983-06-01

    The objective of this study was to fully develop and apply a methodology to study accident causation, which was outlined in a previous study. "Causal" factors are those pre-crash factors which are statistically related to the accident rate ...

  20. The National Aviation Operational Monitoring Service (NAOMS): A Documentation of the Development of a Survey Methodology

    NASA Technical Reports Server (NTRS)

    Connors, Mary M.; Mauro, Robert; Statler, Irving C.

    2012-01-01

    The National Aviation Operational Monitoring Service (NAOMS) was a research project under NASA's Aviation Safety Program during the years from 2000 to 2005. The purpose of this project was to develop a methodology for gaining reliable information on changes over time in the rates of occurrence of safety-related events, as a means of assessing the safety of the national airspace. The approach was a scientifically designed survey of the operators of the aviation system concerning their safety-related experiences. This report presents the methodology that was developed and a demonstration of the NAOMS concept through a survey of nearly 20,000 randomly selected air-carrier pilots. Results give evidence that the NAOMS methodology can provide a statistically sound basis for evaluating trends in incidents that could compromise safety. The approach and results are summarized in the report, and supporting documentation and complete analyses of results are presented in 14 appendices.

  1. Simplified bridge load rating methodology using the national bridge inventory file : user manual

    DOT National Transportation Integrated Search

    1988-08-01

    The purpose of this research was to develop a computerized system to determine the adequacy of a bridge or group of bridges to carry specified overload vehicles. The system utilizes two levels of analysis. The Level 1 analysis is the basic rating sys...

  2. Simplified bridge load rating methodology using the national bridge inventory file : program listing

    DOT National Transportation Integrated Search

    1987-08-01

    The purpose of this research was to develop a computerized system to determine the adequacy of a bridge or group of bridges to carry specified overload vehicles. The system utilizes two levels of analysis. The Level 1 analysis is the basic rating sys...

  3. Quantitative trait loci analysis using the false discovery rate.

    PubMed

    Benjamini, Yoav; Yekutieli, Daniel

    2005-10-01

    False discovery rate control has become an essential tool in any study that has a very large multiplicity problem. False discovery rate-controlling procedures have also been found to be very effective in QTL analysis, ensuring reproducible results with few falsely discovered linkages and offering increased power to discover QTL, although their acceptance has been slower than in microarray analysis, for example. The reason is partly that the methodological aspects of applying the false discovery rate to QTL mapping are not well developed. Our aim in this work is to lay a solid foundation for the use of the false discovery rate in QTL mapping. We review the false discovery rate criterion, the appropriate interpretation of the FDR, and alternative formulations of the FDR that have appeared in the statistical and genetics literature. We discuss important features of the FDR approach, some stemming from new developments in FDR theory and methodology, which make it especially useful in linkage analysis. We review false discovery rate-controlling procedures (the BH procedure, the resampling procedure, and the adaptive two-stage procedure) and discuss the validity of these procedures in single- and multiple-trait QTL mapping. Finally, we argue that control of the false discovery rate has an important role in suggesting, indicating the significance of, and confirming QTL, and we present guidelines for its use.
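
    The BH step-up procedure mentioned above is simple enough to state in a few lines; the following is a compact reference implementation (the p-values in the demo are made up):

    ```python
    import numpy as np

    def benjamini_hochberg(pvals, q=0.05):
        """Return a boolean mask of rejections controlling the FDR at level q.

        BH step-up rule: sort the p-values, find the largest k with
        p_(k) <= k*q/m, and reject hypotheses 1..k in the sorted order.
        """
        p = np.asarray(pvals)
        m = p.size
        order = np.argsort(p)
        thresholds = q * np.arange(1, m + 1) / m
        below = p[order] <= thresholds
        reject = np.zeros(m, dtype=bool)
        if below.any():
            k = np.max(np.nonzero(below)[0])       # largest sorted index passing
            reject[order[: k + 1]] = True
        return reject

    pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.57, 0.86]
    print(benjamini_hochberg(pvals, q=0.05))
    ```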

  4. Operational rate-distortion performance for joint source and channel coding of images.

    PubMed

    Ruf, M J; Modestino, J W

    1999-01-01

    This paper describes a methodology for evaluating the operational rate-distortion behavior of combined source and channel coding schemes, with particular application to images. In particular, we demonstrate use of the operational rate-distortion function to obtain the optimum tradeoff between source coding accuracy and channel error protection under the constraint of a fixed transmission bandwidth for the investigated transmission schemes. Furthermore, we develop information-theoretic bounds on performance for specific source and channel coding systems and demonstrate that our combined source-channel coding methodology, applied to different schemes, results in operational rate-distortion performance which closely approaches these theoretical limits. We concentrate specifically on a wavelet-based subband source coding scheme and the use of binary rate-compatible punctured convolutional (RCPC) codes for transmission over the additive white Gaussian noise (AWGN) channel. Explicit results for real-world images demonstrate the efficacy of this approach.
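
    As a toy illustration of an operational rate-distortion tradeoff (not the paper's wavelet/RCPC system), the sketch below splits a fixed channel bit budget between source bits and channel redundancy, models end-to-end distortion as quantizer distortion plus a decoding-failure penalty, and picks the best operating point by brute force; the code-rate family and failure probabilities are invented for the demo.

    ```python
    import numpy as np

    R_TOTAL = 2.0          # channel bits per source sample (fixed bandwidth)
    D_MAX = 1.0            # distortion if the decoder fails (unit-variance source)

    code_rates = np.array([8/9, 4/5, 2/3, 1/2, 1/3])      # RCPC-like family
    # Hypothetical post-decoding failure probabilities at the operating SNR:
    p_fail = np.array([3e-1, 8e-2, 1e-2, 1e-3, 1e-5])

    best = None
    for rc, pf in zip(code_rates, p_fail):
        rs = R_TOTAL * rc                       # surviving source bits per sample
        d_q = 2.0 ** (-2.0 * rs)                # Gaussian rate-distortion proxy
        d = (1.0 - pf) * d_q + pf * D_MAX       # expected end-to-end distortion
        if best is None or d < best[0]:
            best = (d, rc, rs)

    print(f"best code rate {best[1]:.3f}: {best[2]:.2f} source bits/sample, D={best[0]:.4f}")
    ```

    With these invented numbers the optimum sits at an interior code rate: stronger protection buys reliability at the cost of source fidelity, which is exactly the tradeoff the operational rate-distortion function captures.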

  5. Constructing a question bank based on script concordance approach as a novel assessment methodology in surgical education.

    PubMed

    Aldekhayel, Salah A; Alselaim, Nahar A; Magzoub, Mohi Eldin; Al-Qattan, Mohammad M; Al-Namlah, Abdullah M; Tamim, Hani; Al-Khayal, Abdullah; Al-Habdan, Sultan I; Zamakhshary, Mohammed F

    2012-10-24

    The Script Concordance Test (SCT) is a new assessment tool that reliably assesses clinical reasoning skills. Previous descriptions of developing SCT question banks have been merely subjective. This study addresses two gaps in the literature: (1) conducting the first phase of a multistep validation process of the SCT in plastic surgery, and (2) providing an objective methodology for constructing a question bank based on the SCT. After developing a test blueprint, 52 test items were written. Five validation questions were developed and a validation survey was established online. Seven reviewers were asked to answer this survey. They were recruited from two countries, Saudi Arabia and Canada, to improve the test's external validity. Their ratings were transformed into percentages. Analysis was performed to compare the reviewers' ratings by looking at correlations, ranges, means, medians, and overall scores. Scores of reviewers' ratings were between 76% and 95% (mean 86% ± 5). We found poor correlations between reviewers (Pearson's: +0.38 to -0.22). Ratings of individual validation questions ranged between 0 and 4 (on a scale of 1-5). Means and medians of these ranges were computed for each test item (mean: 0.8 to 2.4; median: 1 to 3). A subset of 27 test items was generated based on a set of inclusion and exclusion criteria. This study proposes an objective methodology for the validation of an SCT question bank. The validation survey is analyzed from all angles, i.e., reviewers, validation questions, and test items. Finally, a subset of test items is generated based on a set of criteria.

  6. The UCERF3 grand inversion: Solving for the long‐term rate of ruptures in a fault system

    USGS Publications Warehouse

    Page, Morgan T.; Field, Edward H.; Milner, Kevin; Powers, Peter M.

    2014-01-01

    We present implementation details, testing, and results from a new inversion‐based methodology, known colloquially as the “grand inversion,” developed for the Uniform California Earthquake Rupture Forecast (UCERF3). We employ a parallel simulated annealing algorithm to solve for the long‐term rate of all ruptures that extend through the seismogenic thickness on major mapped faults in California while simultaneously satisfying available slip‐rate, paleoseismic event‐rate, and magnitude‐distribution constraints. The inversion methodology enables the relaxation of fault segmentation and allows for the incorporation of multifault ruptures, which are needed to remove magnitude‐distribution misfits that were present in the previous model, UCERF2. The grand inversion is more objective than past methodologies, as it eliminates the need to prescriptively assign rupture rates. It also provides a means to easily update the model as new data become available. In addition to UCERF3 model results, we present verification of the grand inversion, including sensitivity tests, tuning of equation set weights, convergence metrics, and a synthetic test. These tests demonstrate that while individual rupture rates are poorly resolved by the data, integrated quantities such as magnitude–frequency distributions and, most importantly, hazard metrics, are much more robust.
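
    A minimal sketch of the inversion idea, using stand-in data rather than anything from UCERF3: simulated annealing searches for non-negative rates x that minimize the misfit ||Ax - d||^2, where the rows of A would encode slip-rate, paleoseismic event-rate, and magnitude-distribution constraints; the matrix, data vector, cooling schedule, and move size are all illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Toy inversion: find non-negative rupture rates x >= 0 with A @ x ~= d.
    A = rng.uniform(0.0, 1.0, size=(6, 12))     # hypothetical constraint matrix
    x_true = rng.exponential(1.0, size=12)
    d = A @ x_true                               # hypothetical data vector

    def energy(x):
        r = A @ x - d
        return float(r @ r)                      # least-squares misfit

    x = np.ones(12)
    e = energy(x)
    n_steps = 20000
    for step in range(n_steps):
        temp = 1.0 * (1.0 - step / n_steps) + 1e-4        # linear cooling schedule
        cand = x.copy()
        i = rng.integers(12)
        cand[i] = max(0.0, cand[i] + rng.normal(0.0, 0.1))  # perturb one rate
        e_cand = energy(cand)
        # Accept improvements always; accept worse moves with Boltzmann probability
        if e_cand < e or rng.random() < np.exp((e - e_cand) / temp):
            x, e = cand, e_cand

    print(f"final misfit: {e:.4f}")
    ```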

  7. Design Of Combined Stochastic Feedforward/Feedback Control

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1989-01-01

    Methodology accommodates variety of control structures and design techniques. In methodology for combined stochastic feedforward/feedback control, main objectives of feedforward and feedback control laws seen clearly. Inclusion of error-integral feedback, dynamic compensation, rate-command control structure, and like integral element of methodology. Another advantage of methodology flexibility to develop variety of techniques for design of feedback control with arbitrary structures to obtain feedback controller: includes stochastic output feedback, multiconfiguration control, decentralized control, or frequency and classical control methods. Control modes of system include capture and tracking of localizer and glideslope, crab, decrab, and flare. By use of recommended incremental implementation, control laws simulated on digital computer and connected with nonlinear digital simulation of aircraft and its systems.

  8. Estimation of the Reactive Flow Model Parameters for an Ammonium Nitrate-Based Emulsion Explosive Using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Ribeiro, J. B.; Silva, C.; Mendes, R.

    2010-10-01

    A real-coded genetic algorithm methodology that has been developed for the estimation of the parameters of the reaction rate equation of the Lee-Tarver reactive flow model is described in detail. This methodology allows the 15 parameters of the reaction rate equation that fit the numerical results to the experimental ones to be sought in a single optimization procedure, using only one experimental result and without the need for any starting solution. Mass averaging and the plate-gap model have been used for the determination of the shock data used in the unreacted explosive JWL equation of state (EOS) assessment, and the thermochemical code THOR retrieved the data used in the detonation products' JWL EOS assessment. The developed methodology was applied to the estimation of the referred parameters for an ammonium nitrate-based emulsion explosive using poly(methyl methacrylate) (PMMA)-embedded manganin gauge pressure-time data. The obtained parameters allow a reasonably good description of the experimental data and show some peculiarities arising from the intrinsic nature of this kind of composite explosive.
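
    A generic real-coded GA of the kind the abstract describes can be sketched as follows; here the expensive hydrocode evaluation of the Lee-Tarver model is replaced by a stand-in quadratic misfit, and the population size, operators, and normalized bounds are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    N_PARAMS, POP, GENS = 15, 60, 150
    LO, HI = np.zeros(N_PARAMS), np.ones(N_PARAMS)   # normalized parameter bounds

    target = rng.uniform(0.2, 0.8, N_PARAMS)         # stand-in "experiment"

    def misfit(p):
        # In the real application this would run a hydrocode with the candidate
        # reaction-rate parameters and compare simulated and measured gauge records.
        return float(np.sum((p - target) ** 2))

    pop = rng.uniform(LO, HI, (POP, N_PARAMS))
    for gen in range(GENS):
        fit = np.array([misfit(ind) for ind in pop])
        parents = pop[np.argsort(fit)[: POP // 2]]         # truncation selection
        kids = []
        while len(kids) < POP - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            w = rng.uniform(size=N_PARAMS)                 # blend crossover
            child = w * a + (1 - w) * b
            child += rng.normal(0.0, 0.02, N_PARAMS)       # Gaussian mutation
            kids.append(np.clip(child, LO, HI))
        pop = np.vstack([parents, kids])

    print(f"best misfit after {GENS} generations: {min(misfit(i) for i in pop):.2e}")
    ```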

  9. Accounting for Greenhouse Gas Emissions from Reservoirs

    NASA Astrophysics Data System (ADS)

    Beaulieu, J. J.; Deemer, B. R.; Harrison, J. A.; Nietch, C. T.; Waldo, S.

    2016-12-01

    Nearly three decades of research has demonstrated that the impoundment of rivers and the flooding of terrestrial ecosystems behind dams can increase rates of greenhouse gas emission, particularly methane. The 2006 IPCC Guidelines for National Greenhouse Gas Inventories includes a methodology for estimating methane emissions from flooded lands, but the methodology was published as an appendix to be used as a 'basis for future methodological development' due to a lack of data. Since the 2006 Guidelines were published there has been a 6-fold increase in the number of peer reviewed papers published on the topic including reports from reservoirs in India, China, Africa, and Russia. Furthermore, several countries, including Iceland, Switzerland, and Finland, have developed country specific methodologies for including flooded lands methane emissions in their National Greenhouse Gas Inventories. This presentation will include a review of the literature on flooded land methane emissions and approaches that have been used to upscale emissions for national inventories. We will also present ongoing research in the United States to develop a country specific methodology. In the U.S., research approaches include: 1) an effort to develop predictive relationships between methane emissions and reservoir characteristics that are available in national databases, such as reservoir size and drainage area, and 2) a national-scale probabilistic survey of reservoir methane emissions linked to the National Lakes Assessment.

  10. 77 FR 10767 - Rate Adjustments for Indian Irrigation Projects

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-23

    ... Irrigation Project on the proposed rates about the following issues: (1) The methodology for O&M rate setting... BIA's responses are provided below. Comment: The BIA's methodology for setting the 2013 O&M assessment rate was unreasonable. Response: The methodology used by the BIA to determine the 2013 O&M assessment...

  11. Using Reported Rates of Sexually Transmitted Diseases to Illustrate Potential Methodological Issues in the Measurement of Racial and Ethnic Disparities.

    PubMed

    Chesson, Harrell W; Patel, Chirag G; Gift, Thomas L; Bernstein, Kyle T; Aral, Sevgi O

    2017-09-01

    Racial disparities in the burden of sexually transmitted diseases (STDs) have been documented and described for decades. Similarly, methodological issues and limitations in the use of disparity measures to quantify disparities in health have also been well documented. The purpose of this study was to use historic STD surveillance data to illustrate four of the most well-known methodological issues associated with the use of disparity measures. We manually searched STD surveillance reports to find examples of racial/ethnic distributions of reported STDs that illustrate key methodological issues in the use of disparity measures. The disparity measures we calculated included the black-white rate ratio, the Index of Disparity (weighted and unweighted by subgroup population), and the Gini coefficient. The four examples we developed included illustrations of potential differences in relative and absolute disparity measures, potential differences in weighted and nonweighted disparity measures, the importance of the reference point when calculating disparities, and differences in disparity measures in the assessment of trends in disparities over time. For example, the gonorrhea rate increased for all minority groups (relative to whites) from 1992 to 1993, yet the Index of Disparity suggested that racial/ethnic disparities had decreased. Although imperfect, disparity measures can be useful to quantify racial/ethnic disparities in STDs, to assess trends in these disparities, and to inform interventions to reduce these disparities. Our study uses reported STD rates to illustrate potential methodological issues with these disparity measures and highlights key considerations when selecting disparity measures for quantifying disparities in STDs.
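
    The disparity measures named above can be computed directly from subgroup rates and populations. The sketch below uses one common formulation of each measure (formulations vary in the literature, e.g., in the choice of reference group and whether it is included); the rates and populations in the demo are invented.

    ```python
    import numpy as np

    def rate_ratio(rate_a, rate_b):
        """Simple relative disparity of group A versus reference group B."""
        return rate_a / rate_b

    def index_of_disparity(rates, ref_rate, weights=None):
        """Mean (optionally population-weighted) absolute deviation of group
        rates from a reference rate, expressed as a percent of the reference."""
        rates = np.asarray(rates, dtype=float)
        dev = np.abs(rates - ref_rate)
        if weights is None:
            return 100.0 * dev.mean() / ref_rate
        w = np.asarray(weights, dtype=float) / np.sum(weights)
        return 100.0 * np.sum(w * dev) / ref_rate

    def gini(rates, pops):
        """Gini coefficient of the rate distribution across subgroups,
        built from the Lorenz curve of cumulative cases versus population."""
        rates, pops = np.asarray(rates, float), np.asarray(pops, float)
        order = np.argsort(rates)
        cases = rates[order] * pops[order]
        cum_pop = np.concatenate([[0.0], np.cumsum(pops[order]) / pops.sum()])
        cum_case = np.concatenate([[0.0], np.cumsum(cases) / cases.sum()])
        # 1 minus twice the area under the Lorenz curve (trapezoid rule)
        return 1.0 - np.sum((cum_pop[1:] - cum_pop[:-1]) * (cum_case[1:] + cum_case[:-1]))

    rates = [60.0, 450.0, 110.0, 75.0]          # hypothetical rates per 100,000
    pops = [2.0e8, 4.0e7, 5.0e7, 1.5e7]         # hypothetical group populations
    print(rate_ratio(rates[1], rates[0]))
    print(index_of_disparity(rates, ref_rate=rates[0]))
    print(index_of_disparity(rates, ref_rate=rates[0], weights=pops))
    print(gini(rates, pops))
    ```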

  12. Determination of Strain Rate Sensitivity of Micro-struts Manufactured Using the Selective Laser Melting Method

    NASA Astrophysics Data System (ADS)

    Gümrük, Recep; Mines, R. A. W.; Karadeniz, Sami

    2018-03-01

    Micro-lattice structures manufactured using the selective laser melting (SLM) process provide the opportunity to realize optimal cellular materials for impact energy absorption. In this paper, strain rate-dependent material properties are measured for stainless steel 316L SLM micro-lattice struts in the strain rate range of 10^-3 to 6000 s^-1. For the high strain rates, a novel version of the split Hopkinson bar has been developed. The strain rate-dependent materials data have been used in the Cowper-Symonds material model, and the scope and limits of this model in the context of SLM struts are discussed. The strain rate material data and the Cowper-Symonds model have been applied to the finite element analysis of a micro-lattice block subjected to drop-weight impact loading. The model output has been compared to experimental results, and it has been shown that the increase in crush stress under impact loading is mainly the result of strain rate material behavior. Hence, a systematic methodology has been developed to investigate the impact energy absorption of a micro-lattice structure manufactured using additive layer manufacture (SLM). This methodology can be extended to other micro-lattice materials and configurations, and to other impact conditions.
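
    The Cowper-Symonds model referred to above scales the quasi-static flow stress by 1 + (strain_rate / D)**(1/q). A minimal sketch, with illustrative constants rather than the values fitted in the paper:

    ```python
    def cowper_symonds_factor(strain_rate, D, q):
        """Dynamic/static flow stress ratio: 1 + (edot / D)**(1/q)."""
        return 1.0 + (strain_rate / D) ** (1.0 / q)

    # Illustrative constants only -- not the fitted values from the paper.
    D, q = 1000.0, 5.0
    for edot in (1e-3, 1.0, 100.0, 6000.0):
        factor = cowper_symonds_factor(edot, D, q)
        print(f"strain rate {edot:>8.3f} 1/s -> dynamic factor {factor:.3f}")
    ```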

  13. A methodology to reduce uncertainties in the high-flow portion of a rating curve

    USDA-ARS?s Scientific Manuscript database

    Flow monitoring at watershed scale relies on the establishment of a rating curve that describes the relationship between stage and flow and is developed from actual flow measurements at various stages. Measurement errors increase with out-of-bank flow conditions because of safety concerns and diffic...

  14. Analysis of the methods for assessing socio-economic development level of urban areas

    NASA Astrophysics Data System (ADS)

    Popova, Olga; Bogacheva, Elena

    2017-01-01

    The present paper provides a targeted analysis of current approaches (ratings) to the assessment of the socio-economic development of urban areas. The survey focuses on identifying standardized methodologies for constructing area-assessment techniques that can support a system of intelligent monitoring, dispatching, building management, scheduling and effective management of an administrative-territorial unit. Such a system is characterized by a complex hierarchical structure, including tangible and intangible properties (parameters, attributes). Investigating the abovementioned methods should increase an administrative-territorial unit's attractiveness for investors and residents. The research aims at studying methods for evaluating the socio-economic development level of territories of the Russian Federation. Experimental and theoretical territory-estimating methods were reviewed. A complex analysis of the characteristics of the areas was carried out and evaluation parameters were determined. Integral indicators (resulting rating criteria values) as well as the overall rankings (parameters, characteristics) were analyzed. An inventory of the most widely used partial indicators (parameters, characteristics) of urban areas was compiled. The homogeneity of the resulting rating criteria values was verified and confirmed by determining the root mean square deviation, i.e. the divergence of indices. The principal shortcomings of the assessment methodologies were identified, and assessment methods with enhanced effectiveness and homogeneity were proposed.

  15. 78 FR 4369 - Rates for Interstate Inmate Calling Services

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-22

    .... Marginal Location Methodology. In 2008, ICS providers submitted the ICS Provider Proposal for ICS rates. The ICS Provider Proposal uses the ``marginal location'' methodology, previously adopted by the... ``marginal location'' methodology provides a ``basis for rates that represent `fair compensation' as set...

  16. NREL module energy rating methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitaker, C.; Newmiller, J.; Kroposki, B.

    1995-11-01

    The goals of this project were to develop a tool for evaluating one module in different climates and for comparing different modules; to provide a quick-and-dirty (Q&D) method for estimating periodic energy production; to provide an achievable module rating; to provide an incentive for manufacturers to optimize modules for non-STC conditions; and to have a consensus-based, NREL-sponsored activity. The approach taken was to simulate module energy production for five reference days of various weather conditions. A performance model was developed.

  17. New Methodology for Estimating Fuel Economy by Vehicle Class

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, Shih-Miao; Dabbs, Kathryn; Hwang, Ho-Ling

    2011-01-01

    This work was performed for the Office of Highway Policy Information to develop a new methodology to generate annual estimates of average fuel efficiency and the number of motor vehicles registered by vehicle class for Table VM-1 of the Highway Statistics annual publication. This paper describes the new methodology developed under this effort and compares the results of the existing manual method and the new systematic approach. The methodology developed under this study takes a two-step approach. First, preliminary fuel efficiency rates are estimated based on vehicle stock models for different classes of vehicles. Then, a reconciliation model is used to adjust the initial fuel consumption rates from the vehicle stock models and match the VMT information for each vehicle class and the reported total fuel consumption. This reconciliation model utilizes a systematic approach that produces documentable and reproducible results. The basic framework utilizes a mathematical programming formulation to minimize the deviations between the fuel economy estimates published in the previous year's Highway Statistics and the results from the vehicle stock models, subject to the constraint that fuel consumption for the different vehicle classes must sum to the total fuel consumption estimate published in Table MF-21 of the current year's Highway Statistics. The results generated from this new approach provide a smoother time series for the fuel economies by vehicle class. It also utilizes the most up-to-date and best available data with sound econometric models to generate MPG estimates by vehicle class.
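
    The reconciliation step described above can be sketched as a small constrained optimization: stay close to the stock-model MPG estimates while forcing computed fuel use (VMT divided by MPG, summed over classes) to match the reported total. All numbers below are invented placeholders, not Highway Statistics data.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical vehicle classes: stock-model MPG guesses and annual VMT (miles)
    mpg0 = np.array([24.0, 17.5, 7.0])          # car, light truck, combination truck
    vmt = np.array([2.2e12, 6.5e11, 1.8e11])    # vehicle miles traveled by class
    total_fuel = 1.30e11                         # gallons; stand-in for Table MF-21

    def deviation(mpg):
        return np.sum((mpg - mpg0) ** 2)         # stay close to stock-model rates

    cons = {"type": "eq",
            "fun": lambda mpg: np.sum(vmt / mpg) - total_fuel}  # fuel must balance

    res = minimize(deviation, x0=mpg0, constraints=[cons], method="SLSQP",
                   bounds=[(1.0, 100.0)] * 3)
    print("reconciled MPG by class:", np.round(res.x, 2))
    print("fuel check (gallons):   ", f"{np.sum(vmt / res.x):.3e}")
    ```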

  18. Assessing Communication Skills of Medical Students in Objective Structured Clinical Examinations (OSCE) - A Systematic Review of Rating Scales

    PubMed Central

    Cömert, Musa; Zill, Jördis Maria; Christalle, Eva; Dirmaier, Jörg; Härter, Martin; Scholl, Isabelle

    2016-01-01

    Background: Teaching and assessment of communication skills have become essential in medical education. The Objective Structured Clinical Examination (OSCE) has been found to be an appropriate means to assess communication skills within medical education. Studies have demonstrated the importance of a valid assessment of medical students' communication skills. Yet, the validity of the performance scores depends fundamentally on the quality of the rating scales used in an OSCE. Thus, this systematic review aimed at providing an overview of existing rating scales, describing their underlying definition of communication skills, and determining the methodological quality of psychometric studies and the quality of psychometric properties of the identified rating scales. Methods: We conducted a systematic review to identify psychometrically tested rating scales which have been applied in OSCE settings to assess communication skills of medical students. Our search strategy comprised three databases (EMBASE, PsycINFO, and PubMed), reference tracking, and consultation of experts. We included studies that reported psychometric properties of communication skills assessment rating scales used in OSCEs by examiners only. The methodological quality of included studies was assessed using the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist. The quality of psychometric properties was evaluated using the quality criteria of Terwee and colleagues. Results: Data of twelve studies reporting on eight rating scales on communication skills assessment in OSCEs were included. Five of eight rating scales were explicitly developed based on a specific definition of communication skills. The methodological quality of studies was mainly poor. The psychometric quality of the eight rating scales was mainly intermediate. Discussion: Our results reveal that future psychometric evaluation studies focusing on improving the methodological quality are needed in order to yield psychometrically sound results of the OSCEs assessing communication skills. This is especially important given that most OSCE rating scales are used for summative assessment, and thus have an impact on medical students' academic success. PMID:27031506

  19. Assessing Communication Skills of Medical Students in Objective Structured Clinical Examinations (OSCE)--A Systematic Review of Rating Scales.

    PubMed

    Cömert, Musa; Zill, Jördis Maria; Christalle, Eva; Dirmaier, Jörg; Härter, Martin; Scholl, Isabelle

    2016-01-01

    Teaching and assessment of communication skills have become essential in medical education. The Objective Structured Clinical Examination (OSCE) has been found to be an appropriate means to assess communication skills within medical education. Studies have demonstrated the importance of a valid assessment of medical students' communication skills. Yet, the validity of the performance scores depends fundamentally on the quality of the rating scales used in an OSCE. Thus, this systematic review aimed at providing an overview of existing rating scales, describing their underlying definition of communication skills, and determining the methodological quality of psychometric studies and the quality of psychometric properties of the identified rating scales. We conducted a systematic review to identify psychometrically tested rating scales which have been applied in OSCE settings to assess communication skills of medical students. Our search strategy comprised three databases (EMBASE, PsycINFO, and PubMed), reference tracking, and consultation of experts. We included studies that reported psychometric properties of communication skills assessment rating scales used in OSCEs by examiners only. The methodological quality of included studies was assessed using the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist. The quality of psychometric properties was evaluated using the quality criteria of Terwee and colleagues. Data of twelve studies reporting on eight rating scales on communication skills assessment in OSCEs were included. Five of eight rating scales were explicitly developed based on a specific definition of communication skills. The methodological quality of studies was mainly poor. The psychometric quality of the eight rating scales was mainly intermediate. Our results reveal that future psychometric evaluation studies focusing on improving the methodological quality are needed in order to yield psychometrically sound results of the OSCEs assessing communication skills. This is especially important given that most OSCE rating scales are used for summative assessment, and thus have an impact on medical students' academic success.

  20. Fleet management performance monitoring.

    DOT National Transportation Integrated Search

    2013-05-01

    The principal goal of this project was to enhance and expand the analytical modeling methodology previously developed as part of the Fleet Management Criteria: Disposal Points and Utilization Rates project completed in 2010. The enhanced and ex...

  1. Early Childhood Longitudinal Study, Birth Cohort (ECLS-B): Methodology Report for the 9-Month Data Collection (2001-02). Volume 2: Sampling. NCES 2005-147

    ERIC Educational Resources Information Center

    Bethel, James; Green, James L.; Nord, Christine; Kalton, Graham; West, Jerry

    2005-01-01

    This report is Volume 2 of the methodology report that provides information about the development, design, and conduct of the 9-month data collection of the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B). This volume begins with a brief overview of the ECLS-B, but focuses on the sample design, calculation of response rates, development…

  2. Signal processing methodologies for an acoustic fetal heart rate monitor

    NASA Technical Reports Server (NTRS)

    Pretlow, Robert A., III; Stoughton, John W.

    1992-01-01

    The research and development of real-time signal processing methodologies for the detection of fetal heart tones within a noise-contaminated signal from a passive acoustic sensor is presented. A linear predictor algorithm is utilized for detection of the heart-tone event, and additional processing derives the heart rate. The linear predictor is adaptively 'trained', in a least-mean-square-error sense, on generic fetal heart tones recorded from patients. A real-time monitor system is described which outputs to a strip chart recorder for plotting the time history of the fetal heart rate. The system is validated in the context of the fetal nonstress test. Comparisons are made with ultrasonic nonstress tests on a series of patients. The comparative data provide favorable indications of the feasibility of the acoustic monitor for clinical use.
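
    A minimal sketch of an LMS-adapted linear predictor of the kind described (the actual monitor's training data, filter order, and detection logic are not reproduced here): the weights adapt toward the signal's structure, so the prediction error power falls as training proceeds.

    ```python
    import numpy as np

    def lms_predictor(x, order=8, mu=0.01):
        """Adapt linear-predictor weights w with the LMS rule
        w <- w + mu * e * x_window, returning the prediction-error signal."""
        w = np.zeros(order)
        err = np.zeros(len(x))
        for n in range(order, len(x)):
            window = x[n - order:n][::-1]          # most recent samples first
            pred = w @ window                      # one-step-ahead prediction
            err[n] = x[n] - pred
            w += mu * err[n] * window
        return err, w

    # Hypothetical quasi-periodic "heart tone" buried in noise
    rng = np.random.default_rng(3)
    t = np.arange(4000)
    signal = np.sin(2 * np.pi * 0.02 * t) + 0.3 * rng.normal(size=t.size)
    err, w = lms_predictor(signal)
    print(f"error power, first 500: {np.mean(err[8:500] ** 2):.3f}")
    print(f"error power, last 500 : {np.mean(err[-500:] ** 2):.3f}")
    ```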

  3. Methodological Variability Using Electronic Nose Technology For Headspace Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knobloch, Henri; Turner, Claire; Spooner, Andrew

    Since the idea of electronic noses was published, numerous electronic nose (e-nose) developments and applications have been used in analyzing solid, liquid and gaseous samples in the food and automotive industries or for medical purposes. However, little is known about the methodological pitfalls that might be associated with e-nose technology. Some of the methodological variation caused by changes in ambient temperature, the use of different filters, and changes in mass flow rates is described. Reasons for a lack of stability and reproducibility are given, explaining why methodological variation influences sensor responses and why e-nose technology may not always be sufficiently robust for headspace analysis. However, the potential of e-nose technology is also discussed.

  4. Developing a set of consensus indicators to support maternity service quality improvement: using Core Outcome Set methodology including a Delphi process.

    PubMed

    Bunch, K J; Allin, B; Jolly, M; Hardie, T; Knight, M

    2018-05-16

    To develop a core metric set to monitor the quality of maternity care. Delphi process followed by a face-to-face consensus meeting. English maternity units. Three representative expert panels: service designers, providers and users. Maternity care metrics judged important by participants. Participants were asked to complete a two-phase Delphi process, scoring metrics from existing local maternity dashboards. A consensus meeting discussed the results and re-scored the metrics. In all, 125 distinct metrics across six domains were identified from existing dashboards. Following the consensus meeting, 14 metrics met the inclusion criteria for the final core set: smoking rate at booking; rate of birth without intervention; caesarean section delivery rate in Robson group 1 women; caesarean section delivery rate in Robson group 2 women; caesarean section delivery rate in Robson group 5 women; third- and fourth-degree tear rate among women delivering vaginally; rate of postpartum haemorrhage of ≥1500 ml; rate of successful vaginal birth after a single previous caesarean section; smoking rate at delivery; proportion of babies born at term with an Apgar score <7 at 5 minutes; proportion of babies born at term admitted to the neonatal intensive care unit; proportion of babies readmitted to hospital at <30 days of age; breastfeeding initiation rate; and breastfeeding rate at 6-8 weeks. Core outcome set methodology can be used to incorporate the views of key stakeholders in developing a core metric set to monitor the quality of care in maternity units, thus enabling improvement. Achieving consensus on core metrics for monitoring the quality of maternity care. © 2018 The Authors. BJOG: An International Journal of Obstetrics and Gynaecology published by John Wiley & Sons Ltd on behalf of Royal College of Obstetricians and Gynaecologists.

  5. Methodology for urban rail and construction technology research and development planning

    NASA Technical Reports Server (NTRS)

    Rubenstein, L. D.; Land, J. E.; Deshpande, G.; Dayman, B.; Warren, E. H.

    1980-01-01

    A series of transit system visits, organized by the American Public Transit Association (APTA), was conducted in which the system operators identified the most pressing development needs. These varied by property and were reformulated into a series of potential projects. To assist in the evaluation, a data base useful for estimating the present capital and operating costs of various transit system elements was generated from published data. An evaluation model was developed which considered the rate of deployment of the research and development project, potential benefits, development time and cost. An outline of an evaluation methodology that considered benefits other than capital and operating cost savings was also presented. During the course of the study, five candidate projects were selected for detailed investigation; (1) air comfort systems; (2) solid state auxiliary power conditioners; (3) door systems; (4) escalators; and (5) fare collection systems. Application of the evaluation model to these five examples showed the usefulness of modeling deployment rates and indicated a need to increase the scope of the model to quantitatively consider reliability impacts.

  6. 76 FR 34270 - Federal-State Extended Benefits Program-Methodology for Calculating “on” or “off” Total...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-13

    ...--Methodology for Calculating ``on'' or ``off'' Total Unemployment Rate Indicators for Purposes of Determining...'' or ``off'' total unemployment rate (TUR) indicators to determine when extended benefit (EB) periods...-State Extended Benefits Program--Methodology for Calculating ``on'' or ``off'' Total Unemployment Rate...

  7. A methodology to reduce uncertainties in the high-flow portion of the rating curve for Goodwater Creek Watershed

    USDA-ARS?s Scientific Manuscript database

    Flow monitoring at watershed scale relies on the establishment of a rating curve that describes the relationship between stage and flow and is developed from actual flow measurements at various stages. Measurement errors increase with out-of-bank flow conditions because of safety concerns and diffic...

  8. A Methodology for Modeling Nuclear Power Plant Passive Component Aging in Probabilistic Risk Assessment under the Impact of Operating Conditions, Surveillance and Maintenance Activities

    NASA Astrophysics Data System (ADS)

    Guler Yigitoglu, Askin

    In the context of long operation of nuclear power plants (NPPs) (i.e., 60-80 years, and beyond), investigation of the aging of passive systems, structures and components (SSCs) is important to assess safety margins and to decide on reactor life extension, as indicated within the U.S. Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) Program. In the traditional probabilistic risk assessment (PRA) methodology, evaluating the potential significance of aging of passive SSCs on plant risk is challenging. Although passive SSC failure rates can be added as initiating event frequencies or basic event failure rates in the traditional event-tree/fault-tree methodology, these failure rates are generally based on generic plant failure data, which means that the true state of a specific plant is not reflected in a realistic manner in the modeled aging effects. Dynamic PRA methodologies have gained attention recently due to their capability to account for the plant state and thus address the difficulties in traditional PRA modeling of the aging effects of passive components using physics-based models (and also in the modeling of digital instrumentation and control systems). Physics-based models can capture the impact of complex aging processes (e.g., fatigue, stress corrosion cracking, flow-accelerated corrosion, etc.) on SSCs and can be utilized to estimate passive SSC failure rates using realistic NPP data from reactor simulation, while also considering the effects of surveillance and maintenance activities. The objectives of this dissertation are twofold: the development of a methodology for incorporating aging models of passive SSCs into a reactor simulation environment, to provide a framework for evaluating their risk contribution in both dynamic and traditional PRA; and the demonstration of the methodology through its application to pressurizer surge line pipe welds and steam generator tubes in commercial nuclear power plants. In the proposed methodology, a multi-state physics-based model is selected to represent the aging process. The model is modified via a sojourn-time approach to reflect the dependence of the transition rates on operational and maintenance history. Thermal-hydraulic parameters of the model are calculated via the reactor simulation environment, and uncertainties associated with both the parameters and the models are assessed via a two-loop Monte Carlo approach (Latin hypercube sampling) to propagate input probability distributions through the physical model. The effort documented in this thesis towards this overall objective consists of: (i) defining a process for selecting critical passive components and related aging mechanisms; (ii) aging model selection; (iii) calculating the probability that aging would cause the component to fail; (iv) uncertainty/sensitivity analyses; (v) procedure development for modifying an existing PRA to accommodate consideration of passive component failures; and (vi) including the calculated failure probability in the modified PRA. The proposed methodology is applied to pressurizer surge line pipe weld aging and steam generator tube degradation in pressurized water reactors.
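
    The parameter-uncertainty loop described above can be sketched with Latin hypercube sampling propagated through a stand-in degradation model; the distributions, the one-parameter crack-growth model, and the failure criterion below are illustrative assumptions, not the dissertation's models.

    ```python
    import numpy as np
    from scipy.stats import qmc, norm

    # Stand-in degradation model: years for a flaw to grow through the wall.
    def years_to_failure(growth_rate_mm_per_yr, wall_mm):
        return wall_mm / growth_rate_mm_per_yr

    sampler = qmc.LatinHypercube(d=2, seed=11)
    u = sampler.random(n=10_000)               # stratified uniforms in [0, 1)^2

    # Map LHS uniforms onto input distributions (illustrative choices only):
    growth = np.exp(norm.ppf(u[:, 0], loc=np.log(0.08), scale=0.5))  # lognormal, mm/yr
    wall = norm.ppf(u[:, 1], loc=10.0, scale=0.5)                    # normal, mm

    ttf = years_to_failure(growth, wall)
    print(f"P(failure within 60 yr) ~= {np.mean(ttf < 60.0):.4f}")
    ```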

  9. Mild cognitive impairment: historical development and summary of research

    PubMed Central

    Golomb, James; Kluger, Alan; Ferris, Steven H

    2004-01-01

    This review article broadly traces the historical development, diagnostic criteria, clinical and neuropathological characteristics, and treatment strategies related to mild cognitive impairment (MCI). The concept of MCI is considered in the context of other terms that have been developed to characterize the elderly with varying degrees of cognitive impairment. Criteria based on clinical global scale ratings, cognitive test performance, and performance on other domains of functioning are discussed. Approaches employing clinical, neuropsychological, neuroimaging, biological, and molecular genetic methodology used in the validation of MCI are considered, including results from cross-sectional, longitudinal, and postmortem investigations. Results of recent drug treatment studies of MCI and related methodological issues are also addressed. PMID:22034453

  10. Predicting boundary shear stress and sediment transport over bed forms

    USGS Publications Warehouse

    McLean, S.R.; Wolfe, S.R.; Nelson, J.M.

    1999-01-01

    To estimate bed-load sediment transport rates in flows over bed forms such as ripples and dunes, spatially averaged velocity profiles are frequently used to predict mean boundary shear stress. However, such averaging obscures the complex, nonlinear interaction of wake decay, boundary-layer development, and topographically induced acceleration downstream of flow separation and often leads to inaccurate estimates of boundary stress, particularly skin friction, which is critically important in predicting bed-load transport rates. This paper presents an alternative methodology for predicting skin friction over 2D bed forms. The approach is based on combining the equations describing the mechanics of the internal boundary layer with semiempirical structure functions to predict the velocity at the crest of a bedform, where the flow is most similar to a uniform boundary layer. Significantly, the methodology is directed toward making specific predictions only at the bed-form crest, and as a result it avoids the difficulty and questionable validity of spatial averaging. The model provides an accurate estimate of the skin friction at the crest where transport rates are highest. Simple geometric constraints can be used to derive the mean transport rates as long as bed load is dominant.

  11. Pattern recognition of satellite cloud imagery for improved weather prediction

    NASA Technical Reports Server (NTRS)

    Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.

    1986-01-01

    The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.

  12. Crude and intrinsic birth rates for Asian countries.

    PubMed

    Rele, J R

    1978-01-01

    An attempt is made to estimate birth rates for Asian countries. The main source of information in developing countries has been the census age-sex distribution, although inaccuracies in the basic data have made it difficult to reach a high degree of accuracy, and different methods bring widely varying results. The methodology presented here is based on the use of the conventional child-woman ratio from the census age-sex distribution, together with a rough estimate of the expectation of life at birth. From the established relationship between the child-woman ratio and the intrinsic birth rate, of the form y = a + bx + cx^2 at each level of life expectation, the intrinsic birth rate is first computed using previously derived coefficients. The crude birth rate is then obtained using an adjustment based on the census age-sex distribution. An advantage of this methodology is that the intrinsic birth rate, normally an involved computation, can be obtained relatively easily as a byproduct of the crude birth rate computation. The bases for the calculations are given for each of 33 Asian countries, in some cases over several time periods.
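
    Given fitted coefficients for a particular life-expectancy level, the intrinsic birth rate is just the quadratic evaluated at the observed child-woman ratio; the coefficients below are invented for illustration, not Rele's published values.

    ```python
    def intrinsic_birth_rate(cwr, a, b, c):
        """Evaluate the fitted relation y = a + b*x + c*x**2, where x is the
        child-woman ratio and the coefficients belong to one level of life
        expectancy (illustrative values below, not the published tables)."""
        return a + b * cwr + c * cwr ** 2

    # Hypothetical coefficients for one life-expectancy level
    a, b, c = 2.0, 55.0, -8.0
    for cwr in (0.35, 0.50, 0.65):
        y = intrinsic_birth_rate(cwr, a, b, c)
        print(f"child-woman ratio {cwr:.2f} -> intrinsic birth rate {y:.1f} per 1000")
    ```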

  13. [Methodological problems of noninfectious epidemiology and hygiene under chemical pollution of the environment].

    PubMed

    Rusakov, N V

    Under modern conditions, the basis for protecting human beings from harmful environmental factors is the hygienic regulation (standard-setting) of those factors. The use of this methodological principle has led to a considerable decline in the level of chemical pollution of environmental media. However, tens of millions of Russians are still exposed to chemicals above the admissible hygienic levels, and a high prevalence of, and mortality from, noninfectious diseases has been noted in the population. Hygienic science therefore needs to develop and introduce a methodology of personalized prevention for the protection of individuals against chemical environmental pollution.

  14. Identification of critical sediment source areas at regional level

    NASA Astrophysics Data System (ADS)

    Fargas, D.; Casasnovas, J. A. Martínez; Poch, R.

    In order to identify critical sediment sources in large catchments, using easily available terrain information at the regional scale, a methodology has been developed to obtain the qualitative assessment necessary for further studies. The main objective of the model is to use basic terrain data related to the erosive processes which contribute to the production, transport and accumulation of sediments through the main water paths in the watershed. The model is based on the selection of homogeneous zones regarding drainage density and lithology, achieved by joining the spatial basic units through a rating system. The values of drainage density are rated according to an erosion class (Bucko & Mazurova, 1958). The lithology is rated by erosion indexes adapted from FAO (1977). The combination and reclassification of the results yields five qualitative classes of sediment emission risk. This methodology has been tested and validated for the watershed of the Joaquín Costa reservoir (NE Spain), with an area of 1500 km². The mapping scale was 1:100,000 and the model was implemented through a vector GIS (Arc/Info). The prediction was checked by means of photo-interpretation and field work, which gave an accuracy of 78.5%. The proposed methodology has proved useful as an initial approach for erosion assessment and soil conservation planning at the regional level, and also for selecting priority areas where further analyses can be developed.
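
    The rating-and-reclassification logic can be sketched in a few lines; the class limits, lithology indexes, and reclassification bands below are illustrative stand-ins, not the Bucko & Mazurova (1958) or FAO (1977) values.

    ```python
    def drainage_density_class(dd_km_per_km2):
        """Rate drainage density into an erosion class 1 (low) .. 5 (high)."""
        bounds = [0.5, 1.0, 2.0, 3.5]          # hypothetical class limits
        return 1 + sum(dd_km_per_km2 > b for b in bounds)

    # Hypothetical lithology erosion indexes (stand-ins for the FAO-derived ones)
    LITHOLOGY_INDEX = {"limestone": 1, "sandstone": 2, "marl": 4, "gypsum": 5}

    def sediment_emission_risk(dd, lithology):
        """Combine the two ratings and reclassify into five qualitative classes."""
        score = drainage_density_class(dd) + LITHOLOGY_INDEX[lithology]   # 2..10
        labels = ["very low", "low", "moderate", "high", "very high"]
        return labels[min((score - 2) // 2, 4)]

    for dd, lith in [(0.4, "limestone"), (1.8, "sandstone"), (3.0, "marl"), (4.2, "gypsum")]:
        print(dd, lith, "->", sediment_emission_risk(dd, lith))
    ```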

  15. Review of PCR methodology.

    DOT National Transportation Integrated Search

    1998-03-01

    This study was conducted to review the Pavement Condition Rating (PCR) : methodology currently used by the Ohio DOT. The results of the literature search in this : connection indicated that many Highway agencies use a similar methodology to rate thei...

  16. Accounting For Greenhouse Gas Emissions From Flooded ...

    EPA Pesticide Factsheets

    Nearly three decades of research has demonstrated that the inundation of rivers and terrestrial ecosystems behind dams can lead to enhanced rates of greenhouse gas emissions, particularly methane. The 2006 IPCC Guidelines for National Greenhouse Gas Inventories includes a methodology for estimating methane emissions from flooded lands, but the methodology was published as an appendix to be used as a ‘basis for future methodological development’ due to a lack of data. Since the 2006 Guidelines were published there has been a 6-fold increase in the number of peer reviewed papers published on the topic including reports from reservoirs in India, China, Africa, and Russia. Furthermore, several countries, including Iceland, Switzerland, and Finland, have developed country specific methodologies for including flooded lands methane emissions in their National Greenhouse Gas Inventories. This presentation will include a review of the literature on flooded land methane emissions and approaches that have been used to upscale emissions for national inventories. We will also present ongoing research in the United States to develop a country specific methodology. The research approaches include 1) an effort to develop predictive relationships between methane emissions and reservoir characteristics that are available in national databases, such as reservoir size and drainage area, and 2) a national-scale probabilistic survey of reservoir methane emissions. To inform th

  17. Accounting for Greenhouse Gas Emissions from Reservoirs ...

    EPA Pesticide Factsheets

    Nearly three decades of research has demonstrated that the impoundment of rivers and the flooding of terrestrial ecosystems behind dams can increase rates of greenhouse gas emission, particularly methane. The 2006 IPCC Guidelines for National Greenhouse Gas Inventories includes a methodology for estimating methane emissions from flooded lands, but the methodology was published as an appendix to be used as a ‘basis for future methodological development’ due to a lack of data. Since the 2006 Guidelines were published there has been a 6-fold increase in the number of peer reviewed papers published on the topic including reports from reservoirs in India, China, Africa, and Russia. Furthermore, several countries, including Iceland, Switzerland, and Finland, have developed country specific methodologies for including flooded lands methane emissions in their National Greenhouse Gas Inventories. This presentation will include a review of the literature on flooded land methane emissions and approaches that have been used to upscale emissions for national inventories. We will also present ongoing research in the United States to develop a country specific methodology. In the U.S., research approaches include: 1) an effort to develop predictive relationships between methane emissions and reservoir characteristics that are available in national databases, such as reservoir size and drainage area, and 2) a national-scale probabilistic survey of reservoir methane em

  18. Flow Rates Measurement and Uncertainty Analysis in Multiple-Zone Water-Injection Wells from Fluid Temperature Profiles

    PubMed Central

    Reges, José E. O.; Salazar, A. O.; Maitelli, Carla W. S. P.; Carvalho, Lucas G.; Britto, Ursula J. B.

    2016-01-01

    This work is a contribution to the development of flow sensors in the oil and gas industry. It presents a methodology to measure the flow rates into multiple-zone water-injection wells from fluid temperature profiles and estimate the measurement uncertainty. First, a method to iteratively calculate the zonal flow rates using the Ramey (exponential) model was described. Next, this model was linearized to perform an uncertainty analysis. Then, a computer program to calculate the injected flow rates from experimental temperature profiles was developed. In the experimental part, a fluid temperature profile from a dual-zone water-injection well located in the Northeast Brazilian region was collected. Thus, calculated and measured flow rates were compared. The results proved that linearization error is negligible for practical purposes and the relative uncertainty increases as the flow rate decreases. The calculated values from both the Ramey and linear models were very close to the measured flow rates, presenting a difference of only 4.58 m³/d and 2.38 m³/d, respectively. Finally, the measurement uncertainties from the Ramey and linear models were equal to 1.22% and 1.40% (for injection zone 1); 10.47% and 9.88% (for injection zone 2). Therefore, the methodology was successfully validated and all objectives of this work were achieved. PMID:27420068
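
    The record's iterative Ramey-model step can be illustrated with a short sketch. Everything below is hypothetical: a simplified single-zone exponential profile, synthetic data, and an assumed linear calibration between the relaxation distance and the injection rate; the paper's actual multi-zone iteration and uncertainty analysis are more involved.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def ramey_temperature(z, A, T_in, a=0.03, b=25.0):
        """Simplified Ramey-type injection profile: the fluid temperature relaxes
        toward the geothermal line T_e(z) = b + a*z with relaxation distance A."""
        return b + a * z - a * A + (T_in - b + a * A) * np.exp(-z / A)

    rng = np.random.default_rng(0)
    z = np.linspace(0.0, 1500.0, 60)                         # depth, m
    T_obs = ramey_temperature(z, 400.0, 30.0) + rng.normal(0.0, 0.05, z.size)

    (A_fit, T_fit), cov = curve_fit(ramey_temperature, z, T_obs, p0=[300.0, 28.0])
    k = 0.05          # hypothetical calibration, (m3/d) per metre of relaxation
    print(f"A = {A_fit:.0f} m -> q = {k*A_fit:.1f} m3/d; "
          f"relative uncertainty {100*np.sqrt(cov[0, 0])/A_fit:.2f}%")
    ```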

  19. Development and Field Test of an Audit Tool and Tracer Methodology for Clinician Assessment of Quality in End-of-Life Care.

    PubMed

    Bookbinder, Marilyn; Hugodot, Amandine; Freeman, Katherine; Homel, Peter; Santiago, Elisabeth; Riggs, Alexa; Gavin, Maggie; Chu, Alice; Brady, Ellen; Lesage, Pauline; Portenoy, Russell K

    2018-02-01

    Quality improvement in end-of-life care generally acquires data from charts or caregivers. "Tracer" methodology, which assesses real-time information from multiple sources, may provide complementary information. The objective of this study was to develop a valid brief audit tool that can guide assessment and rate care when used in a clinician tracer to evaluate the quality of care for the dying patient. To identify items for a brief audit tool, 248 items were created to evaluate overall quality, quality in specific content areas (e.g., symptom management), and specific practices. Collected into three instruments, these items were used to interview professional caregivers and evaluate the charts of hospitalized patients who died. Evidence that this information could be validly captured using a small number of items was obtained through factor analyses, canonical correlations, and group comparisons. A nurse manager field tested tracer methodology using candidate items to evaluate the care provided to other patients who died. The survey of 145 deaths provided chart data and data from 445 interviews (26 physicians, 108 nurses, 18 social workers, and nine chaplains). The analyses yielded evidence of construct validity for a small number of items, demonstrating significant correlations between these items and content areas identified as latent variables in factor analyses. Criterion validity was suggested by significant differences in the ratings on these items between the palliative care unit and other units. The field test evaluated 127 deaths, demonstrated the feasibility of tracer methodology, and informed reworking of the candidate items into the 14-item Tracer EoLC v1. The Tracer EoLC v1 can be used with tracer methodology to guide the assessment and rate the quality of end-of-life care. Copyright © 2017 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  20. Plate motions and deformations from geologic and geodetic data

    NASA Technical Reports Server (NTRS)

    Jordan, T. H.

    1986-01-01

    Research effort on behalf of the Crustal Dynamics Project focused on the development of methodologies suitable for the analysis of space-geodetic data sets for the estimation of crustal motions, in conjunction with results derived from land-based geodetic data, neo-tectonic studies, and other geophysical data. These methodologies were used to provide estimates of both global plate motions and intraplate deformation in the western U.S. Results from the satellite ranging experiment for the rate of change of the baseline length between San Diego and Quincy, California, indicated that relative motion between the North American and Pacific plates over the 1972 to 1982 observing period was consistent with estimates calculated from geologic data averaged over the past few million years. This result, when combined with other kinematic constraints on western U.S. deformation derived from land-based geodesy, neo-tectonic studies, and other geophysical data, places limits on the possible extension of the Basin and Range province, and implies that significant deformation is occurring west of the San Andreas fault. A new methodology was developed to analyze vector-position space-geodetic data to provide estimates of relative vector motions of the observing sites. The algorithm is suitable for the reduction of large, inhomogeneous data sets; it takes into account the full position covariances and errors due to poorly resolved Earth orientation parameters and vertical positions, and reduces biases due to inhomogeneous sampling of the data. This methodology was applied to the problem of estimating the rate-scaling parameter of a global plate tectonic model using satellite laser ranging observations over a five-year interval. The results indicate that the mean rate of global plate motions for that interval is consistent with rates averaged over several million years, and is not consistent with quiescent or greatly accelerated plate motions. This methodology was also used to provide constraints on deformation in the western U.S. using very long baseline interferometry observations over a two-year period.
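
    The rate-estimation idea behind such analyses can be shown with a weighted least-squares fit of a baseline-length time series; the numbers below are invented for illustration and are not the San Diego-Quincy data.

    ```python
    import numpy as np

    t = np.array([1972.5, 1974.0, 1976.1, 1978.3, 1980.2, 1982.0])   # epoch, yr
    L = np.array([0.0, -8.0, -21.0, -30.0, -44.0, -55.0])            # length, mm
    sigma = np.full_like(L, 5.0)                                     # 1-sigma, mm

    A = np.column_stack([np.ones_like(t), t - t.mean()])  # intercept + rate model
    coef, *_ = np.linalg.lstsq(A / sigma[:, None], L / sigma, rcond=None)
    print(f"baseline rate = {coef[1]:.1f} mm/yr")
    ```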

  1. Methodologies for extracting kinetic constants for multiphase reacting flow simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, S.L.; Lottes, S.A.; Golchert, B.

    1997-03-01

    Flows in industrial reactors often involve complex reactions of many species. A computational fluid dynamics (CFD) computer code, ICRKFLO, was developed to simulate multiphase, multi-species reacting flows. ICRKFLO uses a hybrid technique to calculate species concentration and reaction for a large number of species in a reacting flow. This technique consists of a hydrodynamic and reacting flow simulation with a small but sufficient number of lumped reactions to compute flow field properties, followed by a calculation of local reaction kinetics and transport of many subspecies (on the order of 10 to 100). Kinetic rate constants of the numerous subspecies chemical reactions are difficult to determine. A methodology has been developed to extract kinetic constants from experimental data efficiently. A flow simulation of a fluid catalytic cracking (FCC) riser was successfully used to demonstrate this methodology.

  2. Development of a Barbershop-Based Cancer Communication Intervention

    ERIC Educational Resources Information Center

    Holt, Cheryl L.; Wynn, Theresa A.; Lewis, Ivey; Litaker, Mark S.; Jeames, Sanford; Huckaby, Francine; Stroud, Leonardo; Southward, Penny L.; Simons, Virgil; Lee, Crystal; Ross, Louis; Mitchell, Theodies

    2009-01-01

    Purpose: Prostate and colorectal cancer (CRC) rates are disproportionately high among African-American men. The purpose of this paper is to describe the development of an intervention in which barbers were trained to educate clients about early detection for prostate and CRC. Design/methodology/approach: Working with an advisory panel of local…

  3. Mobile Applications and 4G Wireless Networks: A Framework for Analysis

    ERIC Educational Resources Information Center

    Yang, Samuel C.

    2012-01-01

    Purpose: The use of mobile wireless data services continues to increase worldwide. New fourth-generation (4G) wireless networks can deliver data rates exceeding 2 Mbps. The purpose of this paper is to develop a framework of 4G mobile applications that utilize such high data rates and run on small form-factor devices. Design/methodology/approach:…

  4. The Forgotten Half of Program Evaluation: A Focus on the Translation of Rating Scales for Use with Hispanic Populations

    ERIC Educational Resources Information Center

    Dogan, Shannon J.; Sitnick, Stephanie L.; Onati, Lenna L.

    2012-01-01

    Extension professionals often work with diverse clientele; however, most assessment tools have been developed and validated with English-speaking samples. There is little research and practical guidance on the cultural adaptation and translation of rating scales. The purpose of this article is to summarize the methodological work in this area as…

  5. Integral Design Methodology of Photocatalytic Reactors for Air Pollution Remediation.

    PubMed

    Passalía, Claudio; Alfano, Orlando M; Brandi, Rodolfo J

    2017-06-07

    An integral reactor design methodology was developed to address the optimal design of photocatalytic wall reactors to be used in air pollution control. For a target pollutant to be eliminated from an air stream, the proposed methodology is initiated with a mechanistic derived reaction rate. The determination of intrinsic kinetic parameters is associated with the use of a simple geometry laboratory scale reactor, operation under kinetic control and a uniform incident radiation flux, which allows computing the local superficial rate of photon absorption. Thus, a simple model can describe the mass balance and a solution may be obtained. The kinetic parameters may be estimated by the combination of the mathematical model and the experimental results. The validated intrinsic kinetics obtained may be directly used in the scaling-up of any reactor configuration and size. The bench scale reactor may require the use of complex computational software to obtain the fields of velocity, radiation absorption and species concentration. The complete methodology was successfully applied to the elimination of airborne formaldehyde. The kinetic parameters were determined in a flat plate reactor, whilst a bench scale corrugated wall reactor was used to illustrate the scaling-up methodology. In addition, an optimal folding angle of the corrugated reactor was found using computational fluid dynamics tools.
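
    As a sketch of the kinetic-parameter estimation step, one can fit a generic Langmuir-Hinshelwood-type rate law to laboratory data; the rate form, concentrations, and rates below are hypothetical, and the paper derives its own mechanistic expression for formaldehyde.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def lh_rate(C, k, K):
        """Generic Langmuir-Hinshelwood form r = k*K*C / (1 + K*C)."""
        return k * K * C / (1.0 + K * C)

    C = np.array([0.5, 1.0, 2.0, 5.0, 10.0])          # pollutant conc., ppm
    r = np.array([0.8, 1.3, 1.9, 2.6, 3.0]) * 1e-9    # observed rate, mol cm-2 s-1
    (k_fit, K_fit), _ = curve_fit(lh_rate, C, r, p0=[4e-9, 0.3])
    print(f"k = {k_fit:.2e}, K = {K_fit:.2f}")
    ```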

  6. GRADE equity guidelines 1: considering health equity in GRADE guideline development: introduction and rationale.

    PubMed

    Welch, Vivian A; Akl, Elie A; Guyatt, Gordon; Pottie, Kevin; Eslava-Schmalbach, Javier; Ansari, Mohammed T; de Beer, Hans; Briel, Matthias; Dans, Tony; Dans, Inday; Hultcrantz, Monica; Jull, Janet; Katikireddi, Srinivasa Vittal; Meerpohl, Joerg; Morton, Rachael; Mosdol, Annhild; Petkovic, Jennifer; Schünemann, Holger J; Sharaf, Ravi N; Singh, Jasvinder A; Stanev, Roger; Tonia, Thomy; Tristan, Mario; Vitols, Sigurd; Watine, Joseph; Tugwell, Peter

    2017-10-01

    This article introduces the rationale and methods for explicitly considering health equity in the Grading of Recommendations Assessment, Development and Evaluation (GRADE) methodology for development of clinical, public health, and health system guidelines. We searched for guideline methodology articles, conceptual articles about health equity, and examples of guidelines that considered health equity explicitly. We held three meetings with GRADE Working Group members and invited comments from the GRADE Working Group listserve. We developed three articles on incorporating equity considerations into the overall approach to guideline development, rating certainty, and assembling the evidence base and evidence to decision and/or recommendation. Clinical and public health guidelines have a role to play in promoting health equity by explicitly considering equity in the process of guideline development. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Review of PCR methodology : executive summary.

    DOT National Transportation Integrated Search

    1998-01-01

    This study was conducted to review the Pavement Condition Rating (PCR) : methodology currently used by the Ohio DOT. The results of the literature search in this : connection indicated that many Highway agencies use a similar methodology to rate thei...

  8. A new approach to assessing the water footprint of wine: an Italian case study.

    PubMed

    Lamastra, Lucrezia; Suciu, Nicoleta Alina; Novelli, Elisa; Trevisan, Marco

    2014-08-15

    Agriculture is the largest freshwater consumer, accounting for 70% of the world's water withdrawal. Water footprints (WFs) are being increasingly used to indicate the impacts of water use by production systems. A new methodology to assess the WF of wine was developed in the framework of the V.I.V.A. project (Valutazione Impatto Viticoltura sull'Ambiente), launched by the Italian Ministry for the Environment in 2011 to improve the Italian wine sector's sustainability. The new methodology enables different vineyards from the same winery to be compared. This was achieved by calculating the gray water footprint, following the Tier III approach proposed by Hoekstra et al. (2011). The impact of water use during the life cycle of grape-wine production was assessed for six different wines from the same winery in Sicily, Italy, using both the newly developed methodology (V.I.V.A.) and the classical methodology proposed by the Water Footprint Network (WFN). In all cases green water was the largest contributor to WF, but the new methodology also detected differences between vineyards of the same winery. Furthermore, the V.I.V.A. methodology assesses water body contamination by pesticide application, whereas the WFN methodology considers only fertilization; this explains the higher WF of vineyard 4 calculated by V.I.V.A. compared with the WF calculated with the WFN methodology. Comparing the WFs of the six different wines, the factors most strongly influencing the results obtained in this study were: distance from the water body, fertilization rate, and the amount and eco-toxicological behavior of the active ingredients used. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. 78 FR 69647 - Drill Pipe From the People's Republic of China: Notice of Court Decision Not in Harmony With...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... drill pipe green tubes and the labor wage rate in the less-than-fair-value investigation. \\1\\ Downhole... Department revised the labor wage rate and applied the wage rate methodology from Labor Methodologies.\\4\\ On... States, 604 F.3d 1363, 1372 (Fed. Cir. 2010) (``Dorbest''); see also Antidumping Methodologies in...

  10. Global, regional and national levels and trends of preterm birth rates for 1990 to 2014: protocol for development of World Health Organization estimates.

    PubMed

    Vogel, Joshua P; Chawanpaiboon, Saifon; Watananirun, Kanokwaroon; Lumbiganon, Pisake; Petzold, Max; Moller, Ann-Beth; Thinkhamrop, Jadsada; Laopaiboon, Malinee; Seuc, Armando H; Hogan, Daniel; Tunçalp, Ozge; Allanson, Emma; Betrán, Ana Pilar; Bonet, Mercedes; Oladapo, Olufemi T; Gülmezoglu, A Metin

    2016-06-17

    The official WHO estimates of preterm birth are an essential global resource for assessing the burden of preterm birth and developing public health programmes and policies. This protocol describes the methods that will be used to identify, critically appraise and analyse all eligible preterm birth data, in order to develop global, regional and national estimates of levels and trends in preterm birth rates for the period 1990-2014. We will conduct a systematic review of civil registration and vital statistics (CRVS) data on preterm birth for all WHO Member States, via national Ministries of Health and Statistics Offices. For Member States with absent, limited or lower-quality CRVS data, a systematic review of surveys and/or research studies will be conducted. Modelling will be used to develop country, regional and global rates for 2014, with time trends for Member States where sufficient data are available. Member States will be invited to review the methodology and provide additional eligible data via a country consultation before final estimates are developed and disseminated. This research will be used to generate estimates of the burden of preterm birth globally for 1990 to 2014. We invite feedback on the methodology described, and call on the public health community to submit pertinent data for consideration. Registered at PROSPERO: CRD42015027439. Contact: pretermbirth@who.int.

  11. Bayesian Inference on Proportional Elections

    PubMed Central

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology was developed to estimate the probability that a given party will have representation in the chamber of deputies. Inferences were made in a Bayesian framework using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259

  12. Bayesian inference on proportional elections.

    PubMed

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology was developed to estimate the probability that a given party will have representation in the chamber of deputies. Inferences were made in a Bayesian framework using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.
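
    The Monte Carlo scheme these two records describe can be sketched in a few lines: draw vote shares from a Dirichlet posterior and allocate seats per draw. The seat rule below is a plain D'Hondt highest-averages allocation, a simplified stand-in for the Brazilian quotient-plus-remainders procedure, and the poll counts are hypothetical.

    ```python
    import numpy as np

    def dhondt(shares, seats):
        """Allocate seats by the D'Hondt highest-averages rule."""
        alloc = np.zeros(len(shares), dtype=int)
        for _ in range(seats):
            alloc[np.argmax(shares / (alloc + 1))] += 1
        return alloc

    rng = np.random.default_rng(0)
    poll = np.array([420, 310, 150, 80, 40])        # hypothetical poll counts
    seats, sims = 10, 20_000
    hits = np.zeros(len(poll))
    for _ in range(sims):
        shares = rng.dirichlet(poll + 1)            # posterior draw, uniform prior
        hits += dhondt(shares, seats) >= 1
    print("P(at least one seat):", np.round(hits / sims, 3))
    ```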

  13. Assessment of Integrated Pedestrian Protection Systems with Autonomous Emergency Braking (AEB) and Passive Safety Components.

    PubMed

    Edwards, Mervyn; Nathanson, Andrew; Carroll, Jolyon; Wisch, Marcus; Zander, Oliver; Lubbe, Nils

    2015-01-01

    Autonomous emergency braking (AEB) systems fitted to cars for pedestrians have been predicted to offer substantial benefit. On this basis, consumer rating programs, for example the European New Car Assessment Programme (Euro NCAP), are developing rating schemes to encourage fitment of these systems. One of the questions that needs to be answered to do this fully is how the assessment of the speed reduction offered by the AEB is integrated with the current assessment of the passive safety for mitigation of pedestrian injury. Ideally, this should be done on a benefit-related basis. The objective of this research was to develop a benefit-based methodology for assessment of integrated pedestrian protection systems with AEB and passive safety components. The method should include weighting procedures to ensure that it represents injury patterns from accident data and replicates an independently estimated benefit of AEB. A methodology has been developed to calculate the expected societal cost of pedestrian injuries, assuming that all pedestrians in the target population (i.e., pedestrians impacted by the front of a passenger car) are impacted by the car being assessed, taking into account the impact speed reduction offered by the car's AEB (if fitted) and the passive safety protection offered by the car's frontal structure. For rating purposes, the cost for the assessed car is normalized by comparing it to the cost calculated for a reference car. The speed reductions measured in AEB tests are used to determine the speed at which each pedestrian in the target population will be impacted. Injury probabilities for each impact are then calculated using the results from Euro NCAP pedestrian impactor tests and injury risk curves. These injury probabilities are converted into cost using "harm"-type costs for the body regions tested. These costs are weighted and summed. Weighting factors were determined using accident data from Germany and Great Britain and an independently estimated AEB benefit. German and Great Britain versions of the methodology are available. The methodology was used to assess cars with good, average, and poor Euro NCAP pedestrian ratings, in combination with a current AEB system. The fitment of a hypothetical A-pillar airbag was also investigated. It was found that the decrease in casualty injury cost achieved by fitting an AEB system was approximately equivalent to that achieved by increasing the passive safety rating from poor to average. Because the assessment was influenced strongly by the level of head protection offered in the scuttle and windscreen area, a hypothetical A-pillar airbag showed high potential to reduce overall casualty cost. A benefit-based methodology for assessment of integrated pedestrian protection systems with AEB has been developed and tested. It uses input from AEB tests and Euro NCAP passive safety tests to give an integrated assessment of the system performance, which includes consideration of effects such as the change in head impact location caused by the impact speed reduction given by the AEB.
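
    The benefit-based scoring can be condensed into a sketch: reduce each impact speed by the AEB test result, evaluate an injury-risk curve, weight by accident-data frequencies, and normalize by a reference car. All curves, weights, and speeds below are hypothetical placeholders for the Euro NCAP and accident-data inputs.

    ```python
    import numpy as np

    def expected_cost(speeds, risk, weights, harm=1.0):
        """Weighted expected societal cost over the casualty population."""
        return np.sum(weights * risk(speeds) * harm)

    speeds = np.array([20.0, 30.0, 40.0, 50.0])       # impact speeds, km/h
    weights = np.array([0.4, 0.3, 0.2, 0.1])          # accident-data weighting
    risk = lambda v: 1.0 / (1.0 + np.exp(-(v - 35.0) / 6.0))  # logistic risk curve
    aeb_cut = 10.0                                    # speed reduction from AEB tests

    rating = (expected_cost(np.clip(speeds - aeb_cut, 0.0, None), risk, weights)
              / expected_cost(speeds, risk, weights))  # reference car: no AEB
    print(f"normalized cost vs reference: {rating:.2f}")
    ```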

  14. 49 CFR 1109.4 - Mandatory mediation in rate cases to be considered under the stand-alone cost methodology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Mandatory mediation in rate cases to be considered... § 1109.4 Mandatory mediation in rate cases to be considered under the stand-alone cost methodology. (a) A... methodology must engage in non-binding mediation of its dispute with the railroad upon filing a formal...

  15. Evaluating Managerial Styles for System Development Life Cycle Stages to Ensure Software Project Success

    ERIC Educational Resources Information Center

    Kocherla, Showry

    2012-01-01

    Information technology (IT) projects are considered successful if they are completed on time, within budget, and within scope. Even though the required tools and methodologies are in place, IT projects continue to fail at a high rate. Current literature lacks explanation for success within the stages of system development life-cycle (SDLC) such…

  16. [Counseling interventions for smoking cessation: systematic review].

    PubMed

    Alba, Luz Helena; Murillo, Raúl; Castillo, Juan Sebastián

    2013-04-01

    A systematic review on the efficacy and safety of smoking cessation counseling was conducted. The ADAPTE methodology was used, with a search of Clinical Practice Guidelines (CPG) in Medline, EMBASE, CINAHL, LILACS, and Cochrane. DELBI was used to select CPG scoring over 60 in methodological rigor and applicability to the Colombian health system. Smoking cessation rates at 6 months were assessed according to counseling provider, model, and format. In total, 5 CPG out of 925 references were selected, comprising 44 systematic reviews and meta-analyses. Physician brief counseling and trained health professionals' intensive counseling (individual, group, proactive telephone) are effective, with abstinence rates between 2.1% and 17.4%. Only practical counseling and motivational interviewing were found to be effective intensive interventions. The clinical effect of smoking cessation counseling is low and long-term cessation rates are uncertain. Cost-effectiveness analyses are recommended for the implementation of counseling in public health programs.

  17. Long-term cliff retreat and erosion hotspots along the central shores of the Monterey Bay National Marine Sanctuary

    USGS Publications Warehouse

    Moore, Laura J.; Griggs, Gary B.

    2002-01-01

    Quantification of cliff retreat rates for the southern half of Santa Cruz County, CA, USA, located within the Monterey Bay National Marine Sanctuary, using the softcopy/geographic information system (GIS) methodology results in average cliff retreat rates of 7–15 cm/yr between 1953 and 1994. The coastal dunes at the southern end of Santa Cruz County migrate seaward and landward through time and display net accretion between 1953 and 1994, which is partially due to development. In addition, three critically eroding segments of coastline with high average erosion rates ranging from 20 to 63 cm/yr are identified as erosion ‘hotspots’. These locations include: Opal Cliffs, Depot Hill and Manresa. Although cliff retreat is episodic, spatially variable at the scale of meters, and the factors affecting cliff retreat vary along the Santa Cruz County coastline, there is a compensation between factors affecting retreat such that over the long-term the coastline maintains a relatively smooth configuration. The softcopy/GIS methodology significantly reduces errors inherent in the calculation of retreat rates in high-relief areas (e.g. erosion rates generated in this study are generally correct to within 10 cm) by removing errors due to relief displacement. Although the resulting root mean squared error for erosion rates is relatively small, simple projections of past erosion rates are inadequate to provide predictions of future cliff position. Improved predictions can be made for individual coastal segments by using a mean erosion rate and the standard deviation as guides to future cliff behavior in combination with an understanding of processes acting along the coastal segments in question. This methodology can be applied on any high-relief coast where retreat rates can be measured.
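
    The record's suggested projection (mean rate plus a standard-deviation envelope) amounts to a two-line calculation; the segment rates below are invented for illustration.

    ```python
    import numpy as np

    rates = np.array([0.07, 0.12, 0.15, 0.10])    # hypothetical segment rates, m/yr
    mean, sd = rates.mean(), rates.std(ddof=1)
    horizon = 50                                  # years
    print(f"projected retreat: {mean*horizon:.1f} m +/- {sd*horizon:.1f} m")
    ```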

  18. Variable frame rate transmission - A review of methodology and application to narrow-band LPC speech coding

    NASA Astrophysics Data System (ADS)

    Viswanathan, V. R.; Makhoul, J.; Schwartz, R. M.; Huggins, A. W. F.

    1982-04-01

    The variable frame rate (VFR) transmission methodology developed, implemented, and tested in the years 1973-1978 for efficiently transmitting linear predictive coding (LPC) vocoder parameters extracted from the input speech at a fixed frame rate is reviewed. With the VFR method, parameters are transmitted only when their values have changed sufficiently over the interval since their preceding transmission. Two distinct approaches to automatic implementation of the VFR method are discussed. The first bases the transmission decisions on comparisons between the parameter values of the present frame and the last transmitted frame. The second, which is based on a functional perceptual model of speech, compares the parameter values of all the frames that lie in the interval between the present frame and the last transmitted frame against a linear model of parameter variation over that interval. Also considered is the application of VFR transmission to the design of narrow-band LPC speech coders with average bit rates of 2000-2400 bits/s.
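
    The first VFR approach (transmit only when parameters have moved enough from the last transmitted frame) is easy to sketch; the Euclidean distance measure and threshold here are hypothetical choices, not the coder's actual perceptual criterion.

    ```python
    import numpy as np

    def vfr_select(frames, threshold):
        """Return indices of frames to transmit under a change-threshold rule."""
        sent, last = [0], frames[0]               # always send the first frame
        for i, f in enumerate(frames[1:], start=1):
            if np.linalg.norm(f - last) > threshold:
                sent.append(i)
                last = f
        return sent

    rng = np.random.default_rng(1)
    frames = np.cumsum(rng.normal(0.0, 0.1, (100, 10)), axis=0)  # synthetic params
    idx = vfr_select(frames, threshold=0.8)
    print(f"transmitted {len(idx)}/100 frames")
    ```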

  19. Accelerated Testing Methodology in Constant Stress-Rate Testing for Advanced Structural Ceramics: A Preloading Technique

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Gyekenyesi, John P.; Huebert, Dean; Bartlett, Allen; Choi, Han-Ho

    2001-01-01

    A preloading technique was used as a means of an accelerated testing methodology in constant stress-rate ('dynamic fatigue') testing for two different brittle materials. The theory developed previously for fatigue strength as a function of preload was further verified through extensive constant stress-rate testing for glass-ceramic and CRT glass in room-temperature distilled water. The preloading technique was also used in this study to identify the prevailing failure mechanisms at elevated temperatures, particularly at lower test rates in which a series of mechanisms would be associated simultaneously with material failure, resulting in significant strength increase or decrease. Two different advanced ceramics, including SiC whisker-reinforced composite silicon nitride and 96 wt% alumina, were used at elevated temperatures. It was found that the preloading technique can be used as an additional tool to pinpoint the dominant failure mechanism that is associated with such a phenomenon of considerable strength increase or decrease.

  20. Accelerated Testing Methodology in Constant Stress-Rate Testing for Advanced Structural Ceramics: A Preloading Technique

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Gyekenyesi, John P.; Huebert, Dean; Bartlett, Allen; Choi, Han-Ho

    2001-01-01

    A preloading technique was used as a means of an accelerated testing methodology in constant stress-rate (dynamic fatigue) testing for two different brittle materials. The theory developed previously for fatigue strength as a function of preload was further verified through extensive constant stress-rate testing for glass-ceramic and CRT glass in room-temperature distilled water. The preloading technique was also used in this study to identify the prevailing failure mechanisms at elevated temperatures, particularly at lower test rates in which a series of mechanisms would be associated simultaneously with material failure, resulting in significant strength increase or decrease. Two different advanced ceramics, including SiC whisker-reinforced composite silicon nitride and 96 wt% alumina, were used at elevated temperatures. It was found that the preloading technique can be used as an additional tool to pinpoint the dominant failure mechanism that is associated with such a phenomenon of considerable strength increase or decrease.
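
    The dynamic-fatigue analysis underlying both records is the standard slow-crack-growth fit in which log strength is linear in log stress rate with slope 1/(n+1); the data below are hypothetical.

    ```python
    import numpy as np

    rate = np.array([0.03, 0.3, 3.0, 30.0])        # applied stress rate, MPa/s
    strength = np.array([62.0, 70.0, 79.0, 90.0])  # mean fatigue strength, MPa

    slope, _ = np.polyfit(np.log10(rate), np.log10(strength), 1)
    n = 1.0 / slope - 1.0                          # slow-crack-growth exponent
    print(f"n = {n:.1f}")
    ```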

  1. Hybrid response surface methodology-artificial neural network optimization of drying process of banana slices in a forced convective dryer.

    PubMed

    Taheri-Garavand, Amin; Karimi, Fatemeh; Karimi, Mahmoud; Lotfi, Valiullah; Khoobbakht, Golmohammad

    2018-06-01

    The aim of the study was to fit predictive models using the response surface methodology and an artificial neural network, and to optimize the hot air drying of banana slices for maximum acceptability using the desirability function methodology. The drying air temperature, air velocity, and drying time were chosen as independent factors; moisture content, drying rate, energy efficiency, and exergy efficiency were the dependent variables or responses in the mentioned drying process. A rotatable central composite design was used to develop models for the responses in the response surface methodology. Moreover, isoresponse contour plots were useful to predict the results by performing only a limited set of experiments. The optimum operating conditions obtained from the artificial neural network models were moisture content 0.14 g/g, drying rate 1.03 g water/g h, energy efficiency 0.61, and exergy efficiency 0.91, when the air temperature, air velocity, and drying time values were equal to -0.42 (74.2 °C), 1.00 (1.50 m/s), and -0.17 (2.50 h) in coded units, respectively.
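
    The desirability-function step used to pick the optimum can be sketched as follows; the desirability bounds and the candidate response rows are hypothetical (the first row reuses the optimum values quoted above purely as one candidate point).

    ```python
    import numpy as np

    def d_max(y, lo, hi):   # larger-is-better desirability
        return np.clip((y - lo) / (hi - lo), 0.0, 1.0)

    def d_min(y, lo, hi):   # smaller-is-better desirability
        return np.clip((hi - y) / (hi - lo), 0.0, 1.0)

    # candidate settings: moisture, drying rate, energy eff., exergy eff.
    Y = np.array([[0.14, 1.03, 0.61, 0.91],
                  [0.20, 0.80, 0.55, 0.85],
                  [0.11, 1.20, 0.48, 0.80]])
    D = np.column_stack([d_min(Y[:, 0], 0.10, 0.25),
                         d_max(Y[:, 1], 0.50, 1.50),
                         d_max(Y[:, 2], 0.40, 0.70),
                         d_max(Y[:, 3], 0.70, 1.00)])
    overall = D.prod(axis=1) ** (1.0 / D.shape[1])   # geometric mean
    print("best candidate:", overall.argmax())       # -> 0
    ```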

  2. Operational Impacts of Wind Energy Resources in the Bonneville Power Administration Control Area - Phase I Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Lu, Shuai

    2008-07-15

    This report presents a methodology developed to study the future impact of wind on BPA power system load following and regulation requirements. The methodology uses historical data and stochastic processes to simulate the load balancing processes in the BPA power system, mimicking actual power system operations. Therefore, the results are close to reality, yet a study based on this methodology is convenient to conduct. Existing methodologies for similar analyses include dispatch model simulation and standard deviation evaluation on load and wind data. Dispatch model simulation is constrained by the design of the dispatch program, and standard deviation evaluation is artificial in separating the load following and regulation requirements; neither usually reflects actual operational practice. The methodology used in this study provides not only capacity requirement information, it also analyzes the ramp rate requirements for system load following and regulation processes. The ramp rate data can be used to evaluate generator response/maneuverability requirements, which is another necessary capability of the generation fleet for the smooth integration of wind energy. The study results are presented in an innovative way such that the increased generation capacity or ramp requirements are compared for two different years, across 24 hours a day. Therefore, the impact of different levels of wind energy on generation requirements at different times can be easily visualized.

  3. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist.

    PubMed

    Terwee, Caroline B; Mokkink, Lidwine B; Knol, Dirk L; Ostelo, Raymond W J G; Bouter, Lex M; de Vet, Henrica C W

    2012-05-01

    The COSMIN checklist is a standardized tool for assessing the methodological quality of studies on measurement properties. It contains 9 boxes, each dealing with one measurement property, with 5-18 items per box about design aspects and statistical methods. Our aim was to develop a scoring system for the COSMIN checklist to calculate quality scores per measurement property when using the checklist in systematic reviews of measurement properties. The scoring system was developed based on discussions among experts and testing of the scoring system on 46 articles from a systematic review. Four response options were defined for each COSMIN item (excellent, good, fair, and poor). A quality score per measurement property is obtained by taking the lowest rating of any item in a box ("worst score counts"). Specific criteria for excellent, good, fair, and poor quality for each COSMIN item are described. In defining the criteria, the "worst score counts" algorithm was taken into consideration. This means that only fatal flaws were defined as poor quality. The scores of the 46 articles show how the scoring system can be used to provide an overview of the methodological quality of studies included in a systematic review of measurement properties. Based on experience in testing this scoring system on 46 articles, the COSMIN checklist with the proposed scoring system seems to be a useful tool for assessing the methodological quality of studies included in systematic reviews of measurement properties.
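
    The "worst score counts" rule reduces to a one-line minimum over item ratings, for example:

    ```python
    RATING = {"excellent": 3, "good": 2, "fair": 1, "poor": 0}

    def box_quality(item_ratings):
        """'Worst score counts': a box's score is its lowest item rating."""
        return min(item_ratings, key=RATING.get)

    # one fatal flaw drags an otherwise strong box down to 'poor'
    print(box_quality(["excellent", "good", "poor", "good"]))   # -> 'poor'
    ```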

  4. Comprehensive Psychopathological Assessment Based on the Association for Methodology and Documentation in Psychiatry (AMDP) System: Development, Methodological Foundation, Application in Clinical Routine, and Research.

    PubMed

    Stieglitz, Rolf-Dieter; Haug, Achim; Fähndrich, Erdmann; Rösler, Michael; Trabert, Wolfgang

    2017-01-01

    The documentation of psychopathology is core to the clinical practice of the psychiatrist and clinical psychologist. However, both in initial training and in further specialization in their fields, this particular aspect of their work receives only scant attention. Yet, for the past 50 years, the Association for Methodology and Documentation in Psychiatry (AMDP) System has been available as a tool serving precisely this purpose: offering a systematic introduction to the terminology and documentation of psychopathology. The motivation for its development was the need for an assessment procedure for the reliable documentation of the effectiveness of newly developed psychopharmacological substances. Subsequently, the AMDP-System began to be applied in investigations of a number of methodological issues in psychiatry (e.g., the frequency and specificity of particular symptoms, the comparison of rating scales). The System then became increasingly important in clinical practice as well and, today, represents the most used instrument for the documentation of psychopathology in the German-speaking countries of Europe. This paper offers an overview of the AMDP-System, its origins, design, and functionality. After an initial account of the history and development of the AMDP-System, the discussion focuses in turn on the System's underlying methodological principles, the transfer of clinical skills and competencies in its practical application, and its use in research and clinical practice. Finally, potential future areas of development in relation to the AMDP-System are explored.

  5. Comprehensive Psychopathological Assessment Based on the Association for Methodology and Documentation in Psychiatry (AMDP) System: Development, Methodological Foundation, Application in Clinical Routine, and Research

    PubMed Central

    Stieglitz, Rolf-Dieter; Haug, Achim; Fähndrich, Erdmann; Rösler, Michael; Trabert, Wolfgang

    2017-01-01

    The documentation of psychopathology is core to the clinical practice of the psychiatrist and clinical psychologist. However, both in initial training and in further specialization in their fields, this particular aspect of their work receives only scant attention. Yet, for the past 50 years, the Association for Methodology and Documentation in Psychiatry (AMDP) System has been available as a tool serving precisely this purpose: offering a systematic introduction to the terminology and documentation of psychopathology. The motivation for its development was the need for an assessment procedure for the reliable documentation of the effectiveness of newly developed psychopharmacological substances. Subsequently, the AMDP-System began to be applied in investigations of a number of methodological issues in psychiatry (e.g., the frequency and specificity of particular symptoms, the comparison of rating scales). The System then became increasingly important in clinical practice as well and, today, represents the most used instrument for the documentation of psychopathology in the German-speaking countries of Europe. This paper offers an overview of the AMDP-System, its origins, design, and functionality. After an initial account of the history and development of the AMDP-System, the discussion focuses in turn on the System's underlying methodological principles, the transfer of clinical skills and competencies in its practical application, and its use in research and clinical practice. Finally, potential future areas of development in relation to the AMDP-System are explored. PMID:28439242

  6. Spacecraft software training needs assessment research, appendices

    NASA Technical Reports Server (NTRS)

    Ratcliff, Shirley; Golas, Katharine

    1990-01-01

    The appendices to the previously reported study are presented: statistical data from task rating worksheets; SSD references; survey forms; fourth generation language, a powerful, long-term solution to maintenance cost; task list; methodology; SwRI's instructional systems development model; relevant research; and references.

  7. A combined stochastic feedforward and feedback control design methodology with application to autoland design

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1987-01-01

    A combined stochastic feedforward and feedback control design methodology was developed. The objective of the feedforward control law is to track the commanded trajectory, whereas the feedback control law tries to maintain the plant state near the desired trajectory in the presence of disturbances and uncertainties about the plant. The feedforward control law design is formulated as a stochastic optimization problem and is embedded into the stochastic output feedback problem, where the plant contains unstable and uncontrollable modes. An algorithm to compute the optimal feedforward is developed. In this approach, the use of error integral feedback, dynamic compensation, and control rate command structures is an integral part of the methodology. An incremental implementation is recommended. Results on the eigenvalues of the implemented versus designed control laws are presented. The stochastic feedforward/feedback control methodology is used to design a digital automatic landing system for the ATOPS Research Vehicle, a Boeing 737-100 aircraft. The system control modes include localizer and glideslope capture and track, and flare to touchdown. Results of a detailed nonlinear simulation of the digital control laws, actuator systems, and aircraft aerodynamics are presented.

  8. Fire Hazards from Combustible Ammunition, Methodology Development. Phase I

    DTIC Science & Technology

    1980-06-01

    [Abstract not available; the record text is front-matter residue. Recoverable section headings cover flame length, flame diameter and mass burning rate; flame emissive power; fire plume axial gas velocity; flame temperature; exit velocity; rate of energy flow; chamber characteristics; flame lift angle; and scaling trends for mass burning rate and effective flame emissive power.]

  9. A scenario elicitation methodology to identify the drivers of electricity infrastructure cost in South America

    NASA Astrophysics Data System (ADS)

    Moksnes, Nandi; Taliotis, Constantinos; Broad, Oliver; de Moura, Gustavo; Howells, Mark

    2017-04-01

    Developing a set of scenarios to assess a proposed policy or future development pathways requires a certain level of information, as well as an established socio-economic context. As the future is difficult to predict, great care is needed in defining the selected scenarios; even so, it can be difficult to assess whether the selected scenarios cover the possible solution space. Instead, the methodology in this paper develops a large set of scenarios (324) in OSeMOSYS using the SAMBA 2.0 (South America Model Base) model to assess long-term electricity supply, and applies a scenario-discovery statistical data mining algorithm, the Patient Rule Induction Method (PRIM). By creating a multidimensional space, regions of high and low cost can be identified, as well as their key drivers. The six key drivers are defined a priori with three levels (high, medium, low) or two levels (high, low): 1) demand projected from GDP, population, urbanization and transport; 2) fossil fuel price; 3) climate change impact on hydropower; 4) renewable technology learning rate; 5) discount rate; 6) CO2 emission targets.
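
    The scenario grid itself is a Cartesian product of the driver levels. The sketch below assumes, for illustration, that the first four drivers take three levels and the last two take two, which reproduces the stated count (3^4 x 2^2 = 324); the record does not say which drivers are two-level.

    ```python
    from itertools import product

    THREE, TWO = ["high", "medium", "low"], ["high", "low"]
    drivers = {"demand": THREE, "fossil_fuel_price": THREE,
               "hydro_climate_impact": THREE, "re_learning_rate": THREE,
               "discount_rate": TWO, "co2_target": TWO}   # two-level: assumed

    scenarios = [dict(zip(drivers, combo)) for combo in product(*drivers.values())]
    print(len(scenarios))   # 324
    ```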

  10. Subthreshold posttraumatic stress disorder: A meta-analytic review of DSM-IV prevalence and a proposed DSM-5 approach to measurement.

    PubMed

    Brancu, Mira; Mann-Wrobel, Monica; Beckham, Jean C; Wagner, H Ryan; Elliott, Alyssa; Robbins, Allison T; Wong, Madrianne; Berchuck, Ania E; Runnals, Jennifer J

    2016-03-01

    Subthreshold posttraumatic stress disorder (PTSD) is a chronic condition that is often ignored, the cumulative effects of which can negatively impact an individual's quality of life and overall health care costs. However, subthreshold PTSD prevalence rates and impairment remain unclear due to variations in research methodology. This study examined the existing literature in order to recommend approaches to standardize subthreshold PTSD assessment. We conducted (a) a meta-analysis of subthreshold PTSD prevalence rates and (b) a comparison of the functional impairment associated with the 3 most commonly studied subthreshold PTSD definitions. Meta-analytic results revealed that the average prevalence rate of subthreshold PTSD across studies was 14.7%, with a lower rate (12.6%) among the most methodologically rigorous studies and a higher rate (15.6%) across less rigorous studies. There were significant methodological differences among reviewed studies with regard to definition, measurement, and population. Different definitions led to prevalence rates ranging between 13.7% and 16.4%. Variability in prevalence rates was most related to population and sample composition, with trauma type and community (vs. epidemiological) samples significantly impacting heterogeneity. Qualitative information gathered from studies presenting functional correlates supported current evidence that psychological and behavioral parameters were worse among subthreshold PTSD groups compared with no-PTSD groups, but not as severe as impairment in PTSD groups. Several studies also reported significantly increased risk of suicidality and hopelessness as well as higher health care utilization rates among those with subthreshold PTSD (compared with trauma-exposed no-PTSD samples). Based on findings, we propose recommendations for developing a standard approach to evaluation of subthreshold PTSD. (c) 2016 APA, all rights reserved.
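
    Pooling prevalence proportions of the kind meta-analyzed here is commonly done with a DerSimonian-Laird random-effects model on the logit scale; a minimal sketch with invented study counts follows (the review's actual data and moderator analyses are richer).

    ```python
    import numpy as np

    def pooled_prevalence(events, n):
        """DerSimonian-Laird random-effects pooling of logit proportions."""
        p = events / n
        y = np.log(p / (1.0 - p))                    # logit transform
        v = 1.0 / events + 1.0 / (n - events)        # approximate variance
        w = 1.0 / v
        q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)      # between-study variance
        w_re = 1.0 / (v + tau2)
        mu = np.sum(w_re * y) / np.sum(w_re)
        return 1.0 / (1.0 + np.exp(-mu))             # back-transform

    events = np.array([30.0, 52.0, 18.0, 75.0])      # hypothetical counts
    n = np.array([210.0, 340.0, 150.0, 480.0])
    print(f"pooled prevalence = {pooled_prevalence(events, n):.1%}")
    ```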

  11. Enhanced styrene recovery from waste polystyrene pyrolysis using response surface methodology coupled with Box-Behnken design.

    PubMed

    Mo, Yu; Zhao, Lei; Wang, Zhonghui; Chen, Chia-Lung; Tan, Giin-Yu Amy; Wang, Jing-Yuan

    2014-04-01

    A response surface methodology coupled with Box-Behnken design (RSM-BBD) approach was developed to enhance styrene recovery from waste polystyrene (WPS) through pyrolysis. The relationship between styrene yield and three selected operating parameters (i.e., temperature, heating rate, and carrier gas flow rate) was investigated. A second-order polynomial equation was successfully built to describe the process and predict styrene yield under the study conditions. The factors identified as statistically significant to styrene production were: temperature, with a quadratic effect; heating rate, with a linear effect; carrier gas flow rate, with a quadratic effect; the interaction between temperature and carrier gas flow rate; and the interaction between heating rate and carrier gas flow rate. The optimum conditions for the current system were determined to be a temperature range of 470-505°C, a heating rate of 40°C/min, and a carrier gas flow rate range of 115-140 mL/min. Under such conditions, 64.52% of WPS was recovered as styrene, which was 12% more than the highest reported yield for reactors of similar size. It is concluded that RSM-BBD is an effective approach for yield optimization of styrene recovery from WPS pyrolysis. Copyright © 2014 Elsevier Ltd. All rights reserved.
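
    The second-order RSM-BBD model is an ordinary least-squares fit of a full quadratic in the three coded factors; the 15-run Box-Behnken layout below is the standard one for three factors, while the yields are hypothetical.

    ```python
    import numpy as np

    # 3-factor Box-Behnken design (12 edge points + 3 center replicates), coded units
    X = np.array([[-1,-1,0],[1,-1,0],[-1,1,0],[1,1,0],
                  [-1,0,-1],[1,0,-1],[-1,0,1],[1,0,1],
                  [0,-1,-1],[0,1,-1],[0,-1,1],[0,1,1],
                  [0,0,0],[0,0,0],[0,0,0]], dtype=float)
    y = np.array([48,55,52,60,45,53,50,58,47,54,51,59,64,63,64], dtype=float)

    x1, x2, x3 = X.T
    M = np.column_stack([np.ones(len(X)), x1, x2, x3,
                         x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])
    beta, *_ = np.linalg.lstsq(M, y, rcond=None)   # quadratic model coefficients
    print(np.round(beta, 2))
    ```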

  12. Factors that affect implementation of a nurse staffing directive: results from a qualitative multi-case evaluation.

    PubMed

    Robinson, Claire H; Annis, Ann M; Forman, Jane; Krein, Sarah L; Yankey, Nicholas; Duffy, Sonia A; Taylor, Beth; Sales, Anne E

    2016-08-01

    To assess implementation of the Veterans Health Administration staffing methodology directive. In 2010 the Veterans Health Administration promulgated a staffing methodology directive for inpatient nursing units to address staffing and budget forecasting. A qualitative multi-case evaluation approach assessed staffing methodology implementation. Semi-structured telephone interviews were conducted from March to June 2014 with Nurse Executives and their teams at 21 facilities. Interviews focused on the budgeting process, implementation experiences, use of data, leadership support, and training. An implementation score was created for each facility using a 4-point rating scale. The scores were used to select three facilities (low, medium and high implementation) for more detailed case studies. After analysing interview summaries, the evaluation team developed a four-domain scoring structure: (1) integration of staffing methodology into budget development; (2) implementation of the Directive elements; (3) engagement of leadership and staff; and (4) use of data to support the staffing methodology process. The high-implementation facility had leadership understanding and endorsement of staffing methodology, confidence in and ability to work with data, and integration of staffing methodology results into the budgeting process. The low-implementation facility reported poor leadership engagement and little understanding of data sources and interpretation. Implementation varies widely across facilities. Implementing staffing methodology in facilities with complex and changing staffing needs requires substantial commitment at all organizational levels, especially for facilities that have traditionally relied on historical levels to budget for staffing. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  13. 42 CFR 416.171 - Determination of payment rates for ASC services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Determination of payment rates for ASC services... Determination of payment rates for ASC services. (a) Standard methodology. The standard methodology for determining the national unadjusted payment rate for ASC services is to calculate the product of the...

  14. Engineering of solar photocatalytic detoxification and disinfection process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goswami, D.Y.

    1995-12-31

    Use of solar radiation for photocatalytic detoxification and disinfection is a fascinating and fast-developing area. Although scientific research on these processes, especially photocatalytic oxidation, has been conducted for at least the last three decades, the development of industrial/commercial applications, engineering systems, and engineering design methodologies has occurred only recently. A number of reactor concepts and designs, including concentrating and non-concentrating types and various methods of catalyst deployment, have been developed. Some of these reactors have been used in field demonstrations of groundwater and wastewater remediation. Recent research has focused on improvements of catalysts to increase the reaction rates, as well as on finding new applications of the process. This paper reviews the latest developments in solar detoxification and disinfection, including catalyst development, industrial/commercial applications, reactor design, and engineering system design methodologies. 80 refs., 20 figs., 3 tabs.

  15. Non-adiabatic quantum reactive scattering in hyperspherical coordinates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kendrick, Brian K.

    A new electronically non-adiabatic quantum reactive scattering methodology is presented based on a time-independent coupled channel formalism and the adiabatically adjusting principal axis hyperspherical coordinates of Pack and Parker [J. Chem. Phys. 87, 3888 (1987)]. The methodology computes the full state-to-state scattering matrix for A + B2(v, j) ↔ AB(v', j') + B and A + AB(v, j) → A + AB(v', j') reactions that involve two coupled electronic states which exhibit a conical intersection. The methodology accurately treats all six degrees of freedom relative to the center-of-mass which includes non-zero total angular momentum J and identical particle exchange symmetry. The new methodology is applied to the ultracold hydrogen exchange reaction for which large geometric phase effects have been recently reported [B. K. Kendrick et al., Phys. Rev. Lett. 115, 153201 (2015)]. Rate coefficients for the H/D + HD(v = 4, j = 0) → H/D + HD(v', j') reactions are reported for collision energies between 1 μK and 100 K (total energy ≈1.9 eV). A new diabatic potential energy matrix is developed based on the Boothroyd, Keogh, Martin, and Peterson (BKMP2) and double many body expansion plus single-polynomial (DSP) adiabatic potential energy surfaces for the ground and first excited electronic states of H3, respectively. The rate coefficients computed using the new non-adiabatic methodology and diabatic potential matrix reproduce the recently reported rates that include the geometric phase and are computed using a single adiabatic ground electronic state potential energy surface (BKMP2). The dramatic enhancement and suppression of the ultracold rates due to the geometric phase are confirmed as well as its effects on several shape resonances near 1 K. In conclusion, the results reported here represent the first fully non-adiabatic quantum reactive scattering calculation for an ultracold reaction and validate the importance of the geometric phase on the Wigner threshold behavior.

  16. Non-adiabatic quantum reactive scattering in hyperspherical coordinates

    NASA Astrophysics Data System (ADS)

    Kendrick, Brian K.

    2018-01-01

    A new electronically non-adiabatic quantum reactive scattering methodology is presented based on a time-independent coupled channel formalism and the adiabatically adjusting principal axis hyperspherical coordinates of Pack and Parker [J. Chem. Phys. 87, 3888 (1987)]. The methodology computes the full state-to-state scattering matrix for A + B2(v , j) ↔ AB(v ', j') + B and A + AB(v , j) → A + AB(v ', j') reactions that involve two coupled electronic states which exhibit a conical intersection. The methodology accurately treats all six degrees of freedom relative to the center-of-mass which includes non-zero total angular momentum J and identical particle exchange symmetry. The new methodology is applied to the ultracold hydrogen exchange reaction for which large geometric phase effects have been recently reported [B. K. Kendrick et al., Phys. Rev. Lett. 115, 153201 (2015)]. Rate coefficients for the H/D + HD(v = 4, j = 0) → H/D + HD(v ', j') reactions are reported for collision energies between 1 μK and 100 K (total energy ≈1.9 eV). A new diabatic potential energy matrix is developed based on the Boothroyd, Keogh, Martin, and Peterson (BKMP2) and double many body expansion plus single-polynomial (DSP) adiabatic potential energy surfaces for the ground and first excited electronic states of H3, respectively. The rate coefficients computed using the new non-adiabatic methodology and diabatic potential matrix reproduce the recently reported rates that include the geometric phase and are computed using a single adiabatic ground electronic state potential energy surface (BKMP2). The dramatic enhancement and suppression of the ultracold rates due to the geometric phase are confirmed as well as its effects on several shape resonances near 1 K. The results reported here represent the first fully non-adiabatic quantum reactive scattering calculation for an ultracold reaction and validate the importance of the geometric phase on the Wigner threshold behavior.

  17. Non-adiabatic quantum reactive scattering in hyperspherical coordinates

    DOE PAGES

    Kendrick, Brian K.

    2018-01-28

    A new electronically non-adiabatic quantum reactive scattering methodology is presented based on a time-independent coupled channel formalism and the adiabatically adjusting principal axis hyperspherical coordinates of Pack and Parker [J. Chem. Phys. 87, 3888 (1987)]. The methodology computes the full state-to-state scattering matrix for A + B2(v, j) ↔ AB(v', j') + B and A + AB(v, j) → A + AB(v', j') reactions that involve two coupled electronic states which exhibit a conical intersection. The methodology accurately treats all six degrees of freedom relative to the center-of-mass, which includes non-zero total angular momentum J and identical particle exchange symmetry. The new methodology is applied to the ultracold hydrogen exchange reaction for which large geometric phase effects have been recently reported [B. K. Kendrick et al., Phys. Rev. Lett. 115, 153201 (2015)]. Rate coefficients for the H/D + HD(v = 4, j = 0) → H/D + HD(v', j') reactions are reported for collision energies between 1 μK and 100 K (total energy ≈1.9 eV). A new diabatic potential energy matrix is developed based on the Boothroyd, Keogh, Martin, and Peterson (BKMP2) and double many body expansion plus single-polynomial (DSP) adiabatic potential energy surfaces for the ground and first excited electronic states of H3, respectively. The rate coefficients computed using the new non-adiabatic methodology and diabatic potential matrix reproduce the recently reported rates that include the geometric phase and are computed using a single adiabatic ground electronic state potential energy surface (BKMP2). The dramatic enhancement and suppression of the ultracold rates due to the geometric phase are confirmed as well as its effects on several shape resonances near 1 K. In conclusion, the results reported here represent the first fully non-adiabatic quantum reactive scattering calculation for an ultracold reaction and validate the importance of the geometric phase on the Wigner threshold behavior.

  18. Load capacity of hollowed timber piles.

    DOT National Transportation Integrated Search

    1998-12-01

    The goal of this study was to develop a reliable load-rating methodology for timber piles based on the level of documented damage. Louisiana currently has over 4,000 timber bridges in its inventory of over 13,800 bridges. A quarter of these 4,000 tim...

  19. 76 FR 65504 - Proposed Agency Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-21

    ..., including the validity of the methodology and assumptions used; (c) ways to enhance the quality, utility... Reliability Standard, FAC- 008-3--Facility Ratings, developed by the North American Electric Reliability... Reliability Standard FAC- 008-3 is pending before the Commission. The proposed Reliability Standard modifies...

  20. Ground water contamination and costs of pesticide restrictions in the southeastern coastal plain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danielson, L.E.; Carlson, G.A.; Liu, S.

    The project developed new methodology for estimating: (1) groundwater contamination potential (GWCP) in the Southeast Coastal Plain, and (2) the potential economic impacts of selected policies that restrict pesticide use. The potential for ground water contamination was estimated by use of a simple matrix for combining ratings for both soil leaching potential and pesticide leaching potential. Key soil variables included soil texture, soil acidity and organic matter content. Key pesticide characteristics included Koc, pesticide half-life, the rate of application and the fraction of the pesticide hitting the soil. Comparisons of pesticide use from various farmer and expert opinion surveys were made for pesticide groups and for individual pesticide products. Methodology for merging the GWCP changes and lost benefits from selected herbicide cancellations was developed using corn production in the North Carolina Coastal Plain. Economic evaluations of pesticide cancellations for corn included national and Coastal Plain estimates for atrazine; metolachlor; dicamba; dicamba and atrazine; and dicamba, atrazine and metolachlor.
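
    A minimal sketch of the matrix-combination step described above, assuming hypothetical three-level ratings for both soil and pesticide leaching potential (the category labels and matrix entries are illustrative, not the study's actual scheme):

        # Hypothetical matrix combining soil and pesticide leaching ratings into a
        # ground water contamination potential (GWCP) rating.
        GWCP_MATRIX = {
            ("low", "low"): "low",       ("low", "medium"): "low",       ("low", "high"): "medium",
            ("medium", "low"): "low",    ("medium", "medium"): "medium", ("medium", "high"): "high",
            ("high", "low"): "medium",   ("high", "medium"): "high",     ("high", "high"): "high",
        }

        def gwcp(soil_rating: str, pesticide_rating: str) -> str:
            """Look up contamination potential from the two component ratings."""
            return GWCP_MATRIX[(soil_rating, pesticide_rating)]

        print(gwcp("high", "medium"))  # -> high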

  1. Methodology for computing the burden of disease of adverse events following immunization.

    PubMed

    McDonald, Scott A; Nijsten, Danielle; Bollaerts, Kaatje; Bauwens, Jorgen; Praet, Nicolas; van der Sande, Marianne; Bauchau, Vincent; de Smedt, Tom; Sturkenboom, Miriam; Hahné, Susan

    2018-03-24

    Composite disease burden measures such as disability-adjusted life-years (DALY) have been widely used to quantify the population-level health impact of disease or injury, but application has been limited for the estimation of the burden of adverse events following immunization. Our objective was to assess the feasibility of adapting the DALY approach for estimating adverse event burden. We developed a practical methodological framework, explicitly describing all steps involved: acquisition of relative or absolute risks and background event incidence rates, selection of disability weights and durations, and computation of the years lived with disability (YLD) measure, with appropriate estimation of uncertainty. We present a worked example, in which YLD is computed for 3 recognized adverse reactions following 3 childhood vaccination types, based on background incidence rates and relative/absolute risks retrieved from the literature. YLD provided extra insight into the health impact of an adverse event over presentation of incidence rates only, as severity and duration are additionally incorporated. As well as providing guidance for the deployment of DALY methodology in the context of adverse events associated with vaccination, we also identified where data limitations potentially occur. Burden of disease methodology can be applied to estimate the health burden of adverse events following vaccination in a systematic way. As with all burden of disease studies, interpretation of the estimates must consider the quality and accuracy of the data sources contributing to the DALY computation. © 2018 The Authors. Pharmacoepidemiology & Drug Safety Published by John Wiley & Sons Ltd.
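
    The YLD computation outlined above can be written schematically. This is a hedged sketch with generic symbols, shown for the relative-risk case; it is not the paper's exact notation:

        \[
        \mathrm{YLD} = n_{\mathrm{AE}} \times DW \times L,
        \qquad
        n_{\mathrm{AE}} = (RR - 1) \times I_{0} \times N_{\mathrm{vaccinated}},
        \]

    where I0 is the background incidence rate of the event, RR the relative risk following vaccination (so RR − 1 captures the excess), N_vaccinated the exposed population, DW the disability weight, and L the average duration of the adverse event in years.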

  2. Novel methodology for pharmaceutical expenditure forecast.

    PubMed

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    The way new drugs are valued across countries has recently been disrupted, making the historical data used for forecasting pharmaceutical expenditure poorly reliable. In addition, forecasting methods have rarely addressed uncertainty. The objective of this project was to propose a methodology to perform pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the 'EU Pharmaceutical expenditure forecast'; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings of generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Clinical expected benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis. This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and discount price, distribution chains, and the time to reach peak sales for new drugs; 2) estimating the statistical distribution of the budget impact; and 3) testing different pricing and reimbursement policy decisions on health expenditures. This methodology was independent of historical data and appeared to be highly flexible and adapted to test robustness and provide probabilistic analysis to support policy decision making.
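
    A minimal sketch of the generics-savings component of such a forecast, under illustrative assumptions (the parameter names and the linear penetration ramp are inventions for this sketch, not the EU forecast model's actual specification):

        def generic_savings(originator_sales: float, price_discount: float,
                            peak_penetration: float, years_to_peak: float,
                            years_after_expiry: float) -> float:
            """Annual savings from generic entry: sales volume shifted to
            generics multiplied by the generic price discount."""
            # Penetration ramps linearly until the peak generic share is reached.
            penetration = min(peak_penetration,
                              peak_penetration * years_after_expiry / years_to_peak)
            return originator_sales * penetration * price_discount

        # Year 2 after expiry: 100M originator sales, 60% discount, 80% peak share in 3 years.
        print(generic_savings(100e6, 0.60, 0.80, 3.0, 2.0))  # approximately 32 million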

  3. Novel measurement techniques (development and analysis of silicon solar cells near 20% efficiency)

    NASA Technical Reports Server (NTRS)

    Wolf, M.; Newhouse, M.

    1986-01-01

    Work in identifying, developing, and analyzing techniques for measuring bulk recombination rates, and surface recombination velocities and rates in all regions of high-efficiency silicon solar cells is presented. The accuracy of the previously developed DC measurement system was improved by adding blocked interference filters. The system was further automated by writing software that completely samples the unknown solar cell regions with data of numerous recombination velocity and lifetime pairs. The results can be displayed in three dimensions and the best fit can be found numerically using the simplex minimization algorithm. Also described is a theoretical methodology to analyze and compare existing dynamic measurement techniques.
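
    A hedged sketch of the simplex-based fitting step mentioned above, using SciPy's Nelder-Mead implementation (the response model below is a stand-in; the DC measurement system's actual model is not reproduced here):

        import numpy as np
        from scipy.optimize import minimize

        measured = np.array([0.61, 0.68, 0.74, 0.79])  # illustrative response data

        def model(s_rec: float, tau: float) -> np.ndarray:
            # Placeholder response in (surface recombination velocity, lifetime).
            return 1.0 / (1.0 + s_rec * 1e-4 + 1.0 / (tau * np.array([1.0, 1.5, 2.0, 2.5])))

        def sum_sq_residual(params: np.ndarray) -> float:
            s_rec, tau = params
            return float(np.sum((model(s_rec, tau) - measured) ** 2))

        fit = minimize(sum_sq_residual, x0=[100.0, 10.0], method="Nelder-Mead")
        print(fit.x)  # best-fit (recombination velocity, lifetime) pair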

  4. Novel measurement techniques (development and analysis of silicon solar cells near 20% efficiency)

    NASA Astrophysics Data System (ADS)

    Wolf, M.; Newhouse, M.

    Work in identifying, developing, and analyzing techniques for measuring bulk recombination rates, and surface recombination velocities and rates in all regions of high-efficiency silicon solar cells is presented. The accuracy of the previously developed DC measurement system was improved by adding blocked interference filters. The system was further automated by writing software that completely samples the unknown solar cell regions with data of numerous recombination velocity and lifetime pairs. The results can be displayed in three dimensions and the best fit can be found numerically using the simplex minimization algorithm. Also described is a theoretical methodology to analyze and compare existing dynamic measurement techniques.

  5. Terminology and Methodology Related to the Use of Heart Rate Responsivity in Infancy Research

    ERIC Educational Resources Information Center

    Woodcock, James M.

    1971-01-01

    Methodological problems in measuring and interpreting infantile heart rate reactivity in research are discussed. Various ways of describing cardiac activity are listed. Attention is given to the relationship between resting state and heart rate responsivity. (Author/WY)

  6. Methodology for dynamic biaxial tension testing of pregnant uterine tissue.

    PubMed

    Manoogian, Sarah; Mcnally, Craig; Calloway, Britt; Duma, Stefan

    2007-01-01

    Placental abruption accounts for 50% to 70% of fetal losses in motor vehicle crashes. Since automobile crashes are the leading cause of traumatic fetal injury mortality in the United States, research on this injury mechanism is important. Before research can adequately evaluate current and future restraint designs, a detailed model of the pregnant uterine tissues is necessary. The purpose of this study is to develop a methodology for testing the pregnant uterus in biaxial tension at a rate normally seen in a motor vehicle crash. Since the majority of previous biaxial work has established methods for quasi-static testing, this paper combines previous research and new methods to develop a custom designed system to strain the tissue at a dynamic rate. Load cells and optical markers are used for calculating stress-strain curves of the perpendicular loading axes. Results for this methodology show images of a tissue specimen under load and a finite element verification of the optical strain measurement. The biaxial test system dynamically pulls the tissue to failure with synchronous motion of four tissue grips that are rigidly coupled to the tissue specimen. The test device models in situ loading conditions of the pregnant uterus and overcomes previous limitations of biaxial testing. A non-contact method of measuring strains combined with data reduction to resolve the stresses in two directions provides the information necessary to develop a three-dimensional constitutive model of the material. Moreover, future research can apply this method to other soft tissues with similar in situ loading conditions.
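
    The data reduction described above follows the usual planar-biaxial relations. A generic sketch (the membrane-stress simplification and the symbols are assumptions, not the authors' exact formulation):

        \[
        \sigma_{i} = \frac{F_{i}}{w_{i}\, t}, \qquad
        \lambda_{i} = \frac{l_{i}}{l_{i,0}}, \qquad
        E_{i} = \tfrac{1}{2}\left(\lambda_{i}^{2} - 1\right), \qquad i = 1, 2,
        \]

    where F_i are the load-cell forces on the two perpendicular axes, w_i and t the specimen edge widths and thickness, λ_i the stretch ratios computed from the optical marker displacements, and E_i the corresponding Green strains.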

  7. Development of the MCNPX depletion capability: A Monte Carlo linked depletion method that automates the coupling between MCNPX and CINDER90 for high fidelity burnup calculations

    NASA Astrophysics Data System (ADS)

    Fensin, Michael Lorne

    Monte Carlo-linked depletion methods have gained recent interest due to the ability to more accurately model complex 3-dimensional geometries and better track the evolution of temporal nuclide inventory by simulating the actual physical process utilizing continuous energy coefficients. The integration of CINDER90 into the MCNPX Monte Carlo radiation transport code provides a high-fidelity completely self-contained Monte-Carlo-linked depletion capability in a well established, widely accepted Monte Carlo radiation transport code that is compatible with most nuclear criticality (KCODE) particle tracking features in MCNPX. MCNPX depletion tracks all necessary reaction rates and follows as many isotopes as cross section data permits in order to achieve a highly accurate temporal nuclide inventory solution. This work chronicles relevant nuclear history, surveys current methodologies of depletion theory, details the methodology applied in MCNPX, and provides benchmark results for three independent OECD/NEA benchmarks. Relevant nuclear history, from the Oklo reactor two billion years ago to the current major United States nuclear fuel cycle development programs, is addressed in order to supply the motivation for the development of this technology. A survey of current reaction rate and temporal nuclide inventory techniques is then provided to offer justification for the depletion strategy applied within MCNPX. The MCNPX depletion strategy is then dissected and each code feature is detailed, chronicling the methodology development from the original linking of MONTEBURNS and MCNP to the most recent public release of the integrated capability (MCNPX 2.6.F). Calculation results of the OECD/NEA Phase IB benchmark, H. B. Robinson benchmark and OECD/NEA Phase IVB are then provided. The acceptable results of these calculations offer sufficient confidence in the predictive capability of the MCNPX depletion method. This capability sets up a significant foundation, in a well established and supported radiation transport code, for further development of a Monte Carlo-linked depletion methodology which is essential to the future development of advanced reactor technologies that exceed the limitations of current deterministic based methods.
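
    The temporal nuclide inventory tracked by a linked depletion scheme is governed by coupled production-destruction (Bateman-type) equations, shown here in generic form (a sketch of the standard equations, not CINDER90's internal formulation):

        \[
        \frac{dN_{i}}{dt} = \sum_{j \neq i} \left( \sigma_{j \to i}\,\phi + \lambda_{j \to i} \right) N_{j}
        - \left( \sigma_{i}^{a}\,\phi + \lambda_{i} \right) N_{i},
        \]

    where N_i is the atom density of nuclide i, φ the flux from the Monte Carlo transport step, σ_{j→i} and λ_{j→i} the transmutation cross sections and decay constants feeding nuclide i, and σ_i^a φ and λ_i its absorption and decay losses. The transport and depletion steps alternate, with fluxes and reaction rates from one transport solution driving the inventory update for the next time step.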

  8. Training Objectives for Tank Platoon Leaders: Interview Excerpts and Analysis

    DTIC Science & Technology

    1984-02-01

    2. Composite post-interview ratings of the O'Brien-Drucker methodology. 3. Characteristics and ratings of...sibility, and other characteristics that are pertinent to the format's further development and refinement. This research note provides a comprehensive...all down because he is impressionable. Then you take this away from him and put him in a tank under fire, he will fall completely apart. He will not be

  9. 42 CFR 413.337 - Methodology for calculating the prospective payment rates.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...-STAGE RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospective Payment for Skilled Nursing Facilities § 413.337 Methodology for calculating the...

  10. 42 CFR 413.337 - Methodology for calculating the prospective payment rates.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-STAGE RENAL DISEASE SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospective Payment for Skilled Nursing Facilities § 413.337 Methodology for calculating the...

  11. Trip Energy Estimation Methodology and Model Based on Real-World Driving Data for Green Routing Applications: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holden, Jacob; Van Til, Harrison J; Wood, Eric W

    A data-informed model to predict energy use for a proposed vehicle trip has been developed in this paper. The methodology leverages nearly 1 million miles of real-world driving data to generate the estimation model. Driving is categorized at the sub-trip level by average speed, road gradient, and road network geometry, then aggregated by category. An average energy consumption rate is determined for each category, creating an energy rates look-up table. Proposed vehicle trips are then categorized in the same manner, and estimated energy rates are appended from the look-up table. The methodology is robust and applicable to almost any type of driving data. The model has been trained on vehicle global positioning system data from the Transportation Secure Data Center at the National Renewable Energy Laboratory and validated against on-road fuel consumption data from testing in Phoenix, Arizona. The estimation model has demonstrated an error range of 8.6% to 13.8%. The model results can be used to inform control strategies in routing tools, such as change in departure time, alternate routing, and alternate destinations to reduce energy consumption. This work provides a highly extensible framework that allows the model to be tuned to a specific driver or vehicle type.
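
    A minimal sketch of the categorize-and-look-up idea described above (the bin edges, category keys, and illustrative sub-trip data are assumptions, not NREL's actual implementation):

        import numpy as np

        def categorize(avg_speed_mph: float, gradient_pct: float) -> tuple:
            """Assign a sub-trip to a (speed bin, gradient bin) category."""
            speed_bin = int(np.digitize(avg_speed_mph, [15, 30, 45, 60]))
            grade_bin = int(np.digitize(gradient_pct, [-2.0, 2.0]))
            return (speed_bin, grade_bin)

        # Build the energy-rate table from observed sub-trips:
        # (avg speed, gradient, energy kWh, distance mi), illustrative values.
        observed = [(25.0, 0.5, 0.9, 3.0), (55.0, 1.0, 2.4, 8.0), (28.0, -3.0, 0.5, 3.5)]
        totals = {}
        for speed, grade, kwh, miles in observed:
            key = categorize(speed, grade)
            e, d = totals.get(key, (0.0, 0.0))
            totals[key] = (e + kwh, d + miles)
        rates = {k: e / d for k, (e, d) in totals.items()}  # kWh/mi by category

        # Estimate a proposed trip by summing rate * distance over its legs.
        trip = [(25.0, 0.5, 3.0), (55.0, 1.0, 8.0)]
        print(sum(rates[categorize(s, g)] * d for s, g, d in trip))  # estimated kWh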

  12. The Navigation Guide—Evidence-Based Medicine Meets Environmental Health: Systematic Review of Human Evidence for PFOA Effects on Fetal Growth

    PubMed Central

    Sutton, Patrice; Atchley, Dylan S.; Koustas, Erica; Lam, Juleen; Sen, Saunak; Robinson, Karen A.; Axelrad, Daniel A.; Woodruff, Tracey J.

    2014-01-01

    Background: The Navigation Guide methodology was developed to meet the need for a robust method of systematic and transparent research synthesis in environmental health science. We conducted a case study systematic review to support proof of concept of the method. Objective: We applied the Navigation Guide systematic review methodology to determine whether developmental exposure to perfluorooctanoic acid (PFOA) affects fetal growth in humans. Methods: We applied the first 3 steps of the Navigation Guide methodology to human epidemiological data: 1) specify the study question, 2) select the evidence, and 3) rate the quality and strength of the evidence. We developed a protocol, conducted a comprehensive search of the literature, and identified relevant studies using prespecified criteria. We evaluated each study for risk of bias and conducted meta-analyses on a subset of studies. We rated quality and strength of the entire body of human evidence. Results: We identified 18 human studies that met our inclusion criteria, and 9 of these were combined through meta-analysis. Through meta-analysis, we estimated that a 1-ng/mL increase in serum or plasma PFOA was associated with a –18.9 g (95% CI: –29.8, –7.9) difference in birth weight. We concluded that the risk of bias across studies was low, and we assigned a “moderate” quality rating to the overall body of human evidence. Conclusion: On the basis of this first application of the Navigation Guide systematic review methodology, we concluded that there is “sufficient” human evidence that developmental exposure to PFOA reduces fetal growth. Citation: Johnson PI, Sutton P, Atchley DS, Koustas E, Lam J, Sen S, Robinson KA, Axelrad DA, Woodruff TJ. 2014. The Navigation Guide—evidence-based medicine meets environmental health: systematic review of human evidence for PFOA effects on fetal growth. Environ Health Perspect 122:1028–1039; http://dx.doi.org/10.1289/ehp.1307893 PMID:24968388
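
    The pooled estimate quoted above is of the kind produced by inverse-variance weighting. The standard fixed-effect form is sketched below for reference (the review may have used a random-effects variant; the formula is illustrative):

        \[
        \hat{\beta} = \frac{\sum_{i} w_{i} \beta_{i}}{\sum_{i} w_{i}},
        \qquad
        w_{i} = \frac{1}{SE_{i}^{2}},
        \qquad
        SE\!\left(\hat{\beta}\right) = \frac{1}{\sqrt{\sum_{i} w_{i}}},
        \]

    where β_i is study i's estimated birth-weight difference per 1-ng/mL increase in serum or plasma PFOA and SE_i its standard error.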

  13. Design, Development and Analysis of Centrifugal Blower

    NASA Astrophysics Data System (ADS)

    Baloni, Beena Devendra; Channiwala, Salim Abbasbhai; Harsha, Sugnanam Naga Ramannath

    2018-06-01

    Centrifugal blowers are widely used turbomachines in modern industrial and domestic settings. Blower manufacturing seldom follows an optimum design solution for the individual blower. Although centrifugal blowers have developed into highly efficient machines, design is still based on various empirical and semi-empirical rules proposed by fan designers. Different methodologies are used to design the impeller and other components of blowers. The objective of the present study is to examine explicit design methodologies and trace a unified design that achieves better design-point performance. This unified design methodology is based more on fundamental concepts and minimum assumptions. A parametric study is also carried out on the effect of design parameters on pressure ratio and their interdependency in the design. A code based on the unified design is developed in C. Numerical analysis is carried out to check the flow parameters inside the blower. Two blowers, one based on the present design and the other on an industrial design, are built with a standard OEM blower manufacturing unit. Both designs are compared through experimental performance analysis per the IS standard. The results suggest better efficiency and a higher flow rate for the same pressure head for the present design compared with the industrial one.

  14. Evaluating thermoregulation in reptiles: the fallacy of the inappropriately applied method.

    PubMed

    Seebacher, Frank; Shine, Richard

    2004-01-01

    Given the importance of heat in most biological processes, studies on thermoregulation have played a major role in understanding the ecology of ectothermic vertebrates. It is, however, difficult to assess whether body temperature is actually regulated, and several techniques have been developed that allow an objective assessment of thermoregulation. Almost all recent studies on reptiles follow a single methodology that, when used correctly, facilitates comparisons between species, climates, and so on. However, the use of operative temperatures in this methodology assumes zero heat capacity of the study animals and is, therefore, appropriate for small animals only. Operative temperatures represent potentially available body temperatures accurately for small animals but can substantially overestimate the ranges of body temperature available to larger animals whose slower rates of heating and cooling mean that they cannot reach equilibrium if they encounter operative temperatures that change rapidly through either space or time. This error may lead to serious misinterpretations of field data. We derive correction factors specific for body mass and rate of movement that can be used to estimate body temperature null distributions of larger reptiles, thereby overcoming this methodological problem.
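
    The heating/cooling lag at the core of this argument is commonly modeled as a first-order response. A hedged sketch (the allometric mass scaling is a generic assumption, not the authors' fitted correction factors):

        \[
        \frac{dT_{b}}{dt} = \frac{T_{e}(t) - T_{b}(t)}{\tau(m)},
        \qquad
        \tau(m) \propto m^{b}, \quad b > 0,
        \]

    where T_b is body temperature, T_e the operative temperature, and τ a thermal time constant that grows with body mass m. Large animals therefore filter out rapid fluctuations in T_e that small animals can track, which is why equilibrium-based operative temperatures overestimate the range of body temperatures actually available to them.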

  15. Tracking the demise of state hospital rate setting.

    PubMed

    McDonough, J E

    1997-01-01

    From its once preeminent position in state health policy, prospective hospital rate setting has declined in use from more than thirty states in 1980 to two today. This essay tracks the trend toward deregulation in various states--especially Massachusetts, New Jersey, and New York-- and examines the continuation of rate setting in Maryland. Principally, the decline reflects the development of managed care and capitation as alternative means to control health spending growth. This trend represents both an evolution in prospective payment methodology and a renewed preference for private over public-sector price controls.

  16. 76 FR 3060 - Call for Information: Information Related to the Development of Emission-Estimating Methodologies...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-19

    ... approach that incorporates ``mass balance'' constraints to determine emissions from AFOs. Unfortunately... ventilation rate of the monitored confinement structure. Nitrogen content of process inputs and outputs (e.g., feed, water, bedding, eggs, milk). Nitrogen content of manure excreted. Description of any control...

  17. Ecological Development and Validation of a Music Performance Rating Scale for Five Instrument Families

    ERIC Educational Resources Information Center

    Wrigley, William J.; Emmerson, Stephen B.

    2013-01-01

    This study investigated ways to improve the quality of music performance evaluation in an effort to address the accountability imperative in tertiary music education. An enhanced scientific methodology was employed incorporating ecological validity and using recognized qualitative methods involving grounded theory and quantitative methods…

  18. AN IN VIVO MICRODIALYSIS METHOD FOR THE QUALITATIVE ANALYSIS OF HEPATIC PHASE I METABOLITES OF PHENOL IN RAINBOW TROUT (ONCORHYNCHUS MYKISS)

    EPA Science Inventory

    Development of reliable and accurate methodologies for determination of xenobiotic hepatic biotransformation rate and capacity parameters is important to the derivation of precise physiologically-based toxicokinetic (PB-TK) models. Biotransformation data incorporated into PB-TK m...

  19. 42 CFR 413.348 - Limitation on review.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... payment rates, the case-mix methodology, and the development and application of the wage index. This... 42 Public Health 2 2011-10-01 2011-10-01 false Limitation on review. 413.348 Section 413.348 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE...

  20. Systematic review of communication partner training in aphasia: methodological quality.

    PubMed

    Cherney, Leora R; Simmons-Mackie, Nina; Raymer, Anastasia; Armstrong, Elizabeth; Holland, Audrey

    2013-10-01

    Twenty-three studies identified from a previous systematic review examining the effects of communication partner training on persons with aphasia and their communication partners were evaluated for methodological quality. Two reviewers rated the studies on defined methodological quality criteria relevant to each study design. There were 11 group studies, seven single-subject participant design studies, and five qualitative studies. Quality scores were derived for each study. The mean inter-rater reliability of scores for each study design ranged from 85-93%, with Cohen's Kappa indicating substantial agreement between raters. Methodological quality of research on communication partner training in aphasia was highly varied. Overall, group studies employed the least rigorous methodology as compared to single subject and qualitative research. Only two of 11 group studies complied with more than half of the quality criteria. No group studies reported therapist blinding and only one group study reported participant blinding. Across all types of studies, the criterion of treatment fidelity was most commonly omitted. Failure to explicitly report certain methodological quality criteria may account for low ratings. Using methodological rating scales specific to the type of study design may help improve the methodological quality of aphasia treatment studies, including those on communication partner training.
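
    For reference, Cohen's kappa, used above to characterize inter-rater agreement, corrects raw agreement for chance (standard definition):

        \[
        \kappa = \frac{p_{o} - p_{e}}{1 - p_{e}},
        \]

    where p_o is the observed proportion of agreement between the two reviewers and p_e the agreement expected by chance from the marginal rating frequencies.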

  1. Methodologic quality and relevance of references in pharmaceutical advertisements in a Canadian medical journal.

    PubMed

    Lexchin, J; Holbrook, A

    1994-07-01

    To evaluate the methodologic quality and relevance of references in pharmaceutical advertisements in the Canadian Medical Association Journal (CMAJ). Analytic study. All 114 references cited in the first 22 distinct pharmaceutical advertisements in volume 146 of CMAJ. Mean methodologic quality score (modified from the 6-point scale used to assess articles in the American College of Physicians' Journal Club) and mean relevance score (based on a new 5-point scale) for all references in each advertisement. Twenty of the 22 companies responded, sending 78 (90%) of the 87 references requested. The mean methodologic quality score was 58% (95% confidence limits [CL] 51% and 65%) and the mean relevance score 76% (95% CL 72% and 80%). The two mean scores were statistically lower than the acceptable score of 80% (p < 0.05), and the methodologic quality score was outside the preset clinically significant difference of 15%. The poor rating for methodologic quality was primarily because of the citation of references to low-quality review articles and "other" sources (i.e., other than reports of clinical trials). Half of the advertisements had a methodologic quality score of less than 65%, but only five had a relevance score of less than 65%. Although the relevance of most of the references was within minimal acceptable limits, the methodologic quality was often unacceptable. Because advertisements are an important part of pharmaceutical marketing and education, we suggest that companies develop written standards for their advertisements and monitor their advertisements for adherence to these standards. We also suggest that the Pharmaceutical Advertising Advisory Board develop more stringent guidelines for advertising and that it enforce these guidelines in a consistent, rigorous fashion.

  2. Methodologic quality and relevance of references in pharmaceutical advertisements in a Canadian medical journal.

    PubMed Central

    Lexchin, J; Holbrook, A

    1994-01-01

    OBJECTIVE: To evaluate the methodologic quality and relevance of references in pharmaceutical advertisements in the Canadian Medical Association Journal (CMAJ). DESIGN: Analytic study. DATA SOURCE: All 114 references cited in the first 22 distinct pharmaceutical advertisements in volume 146 of CMAJ. MAIN OUTCOME MEASURES: Mean methodologic quality score (modified from the 6-point scale used to assess articles in the American College of Physicians' Journal Club) and mean relevance score (based on a new 5-point scale) for all references in each advertisement. MAIN RESULTS: Twenty of the 22 companies responded, sending 78 (90%) of the 87 references requested. The mean methodologic quality score was 58% (95% confidence limits [CL] 51% and 65%) and the mean relevance score 76% (95% CL 72% and 80%). The two mean scores were statistically lower than the acceptable score of 80% (p < 0.05), and the methodologic quality score was outside the preset clinically significant difference of 15%. The poor rating for methodologic quality was primarily because of the citation of references to low-quality review articles and "other" sources (i.e., other than reports of clinical trials). Half of the advertisements had a methodologic quality score of less than 65%, but only five had a relevance score of less than 65%. CONCLUSIONS: Although the relevance of most of the references was within minimal acceptable limits, the methodologic quality was often unacceptable. Because advertisements are an important part of pharmaceutical marketing and education, we suggest that companies develop written standards for their advertisements and monitor their advertisements for adherence to these standards. We also suggest that the Pharmaceutical Advertising Advisory Board develop more stringent guidelines for advertising and that it enforce these guidelines in a consistent, rigorous fashion. PMID:8004560

  3. A comparative review of nurse turnover rates and costs across countries.

    PubMed

    Duffield, Christine M; Roche, Michael A; Homer, Caroline; Buchan, James; Dimitrelis, Sofia

    2014-12-01

    To compare nurse turnover rates and costs from four studies in four countries (US, Canada, Australia, New Zealand) that have used the same costing methodology: the original Nursing Turnover Cost Calculation Methodology. Measuring and comparing the costs and rates of turnover is difficult because of differences in definitions and methodologies. Comparative review. Searches were carried out within CINAHL, Business Source Complete and Medline for studies that used the original Nursing Turnover Cost Calculation Methodology and reported on both costs and rates of nurse turnover, published in 2014 or earlier. A comparative review of turnover data was conducted using four studies that employed the original Nursing Turnover Cost Calculation Methodology. Costing data items were converted to percentages, while total turnover costs were converted to US 2014 dollars and adjusted according to inflation rates, to permit cross-country comparisons. Despite using the same methodology, Australia reported significantly higher turnover costs ($48,790) due to higher termination (~50% of indirect costs) and temporary replacement costs (~90% of direct costs). Costs were almost 50% lower in the US ($20,561), Canada ($26,652) and New Zealand ($23,711). Turnover rates also varied significantly across countries, with the highest rate reported in New Zealand (44·3%), followed by the US (26·8%), Canada (19·9%) and Australia (15·1%). A significant proportion of turnover costs is attributed to temporary replacement, highlighting the importance of nurse retention. The authors suggest a minimum dataset is also required to eliminate potential variability across countries, states, hospitals and departments. © 2014 John Wiley & Sons Ltd.
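
    A minimal sketch of the cross-country normalization step described above (the exchange rate and inflation factor are placeholders, not the study's figures):

        def to_usd_2014(cost_local: float, usd_per_local: float,
                        cumulative_us_inflation: float) -> float:
            """Convert a local-currency turnover cost to US 2014 dollars.
            usd_per_local: exchange rate at the study's time point;
            cumulative_us_inflation: US price-level growth factor to 2014."""
            return cost_local * usd_per_local * cumulative_us_inflation

        # e.g. AUD 45,000 in 2012, at 1.03 USD/AUD, with 3% cumulative inflation:
        print(to_usd_2014(45_000, 1.03, 1.03))  # approximately 47,740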

  4. Evaluation of criteria for developing traffic safety materials for Latinos.

    PubMed

    Streit-Kaplan, Erica L; Miara, Christine; Formica, Scott W; Gallagher, Susan Scavo

    2011-03-01

    This quantitative study assessed the validity of guidelines that identified four key characteristics of culturally appropriate Spanish-language traffic safety materials: language, translation, formative evaluation, and credible source material. From a sample of 190, the authors randomly selected 12 Spanish-language educational materials for analysis by 15 experts. Hypotheses included that the experts would rate materials with more of the key characteristics as more effective (likely to affect behavioral change) and rate materials originally developed in Spanish and those that utilized formative evaluation (e.g., pilot tests, focus groups) as more culturally appropriate. Although results revealed a weak association between the number of key characteristics in a material and the rating of its effectiveness, reviewers rated materials originally created in Spanish and those utilizing formative evaluation as significantly more culturally appropriate. The findings and methodology demonstrated important implications for developers and evaluators of any health-related materials for Spanish speakers and other population groups.

  5. [Detailed methodological recommendations for the treatment of Clostridium difficile-associated diarrhea with faecal transplantation].

    PubMed

    Nagy, Gergely György; Várvölgyi, Csaba; Balogh, Zoltán; Orosi, Piroska; Paragh, György

    2013-01-06

    The incidence of Clostridium difficile associated enteral disease shows a dramatic increase worldwide, with appallingly high treatment costs, mortality figures, recurrence rates and treatment refractoriness. It is not surprising that there is significant interest in the development and introduction of alternative therapeutic strategies. Among these, only stool transplantation (or faecal bacteriotherapy) is gaining international acceptance due to its excellent cure rate (≈92%), low recurrence rate (≈6%), safety and cost-effectiveness. Unfortunately, faecal transplantation is not available for most patients, although based on promising international results, its introduction into routine clinical practice is well justified and widely expected. The authors would like to facilitate this process by presenting a detailed faecal transplantation protocol prepared in their institution based on the available literature and clinical rationality. Officially accepted national methodological guidelines will need to be issued in the future, founded on the expert opinion of relevant professional societies and upcoming advances in this field.

  6. Using Innovative Methodologies From Technology and Manufacturing Companies to Reduce Heart Failure Readmissions.

    PubMed

    Johnson, Amber E; Winner, Laura; Simmons, Tanya; Eid, Shaker M; Hody, Robert; Sampedro, Angel; Augustine, Sharon; Sylvester, Carol; Parakh, Kapil

    2016-05-01

    Heart failure (HF) patients have high 30-day readmission rates with high costs and poor quality of life. This study investigated the impact of a framework blending Lean Sigma, design thinking, and Lean Startup on 30-day all-cause readmissions among HF patients. This was a prospective study in an academic hospital in Baltimore, Maryland. Thirty-day all-cause readmission was assessed using the hospital's electronic medical record. The baseline readmission rate for HF was 28.4% in 2010 with 690 discharges. The framework was developed and interventions implemented in the second half of 2011. The impact of the interventions was evaluated through 2012. The rate declined to 18.9% among 703 discharges (P < .01). There was no significant change for non-HF readmissions. This study concluded that methodologies from technology and manufacturing companies can reduce 30-day readmissions in HF, demonstrating the potential of this innovations framework to improve chronic disease care. © The Author(s) 2014.

  7. Application of low-cost methodologies for mobile phone app development.

    PubMed

    Zhang, Melvyn; Cheow, Enquan; Ho, Cyrus Sh; Ng, Beng Yeong; Ho, Roger; Cheok, Christopher Cheng Soon

    2014-12-09

    The usage of mobile phones and mobile phone apps has become markedly more prevalent over the past decade. Previous research has highlighted a method of using just the Internet browser and a text editor to create an app, but this does not eliminate the challenges faced by clinicians. More recently, two methodologies of app development have been shared, but there have been no disclosures pertaining to the costs involved. In addition, limitations such as the distribution and dissemination of the apps have not been addressed. The aims of this research article are to: (1) highlight a low-cost methodology that clinicians without technical knowledge could use to develop educational apps; (2) clarify the respective costs involved in the process of development; (3) illustrate how limitations pertaining to dissemination could be addressed; and (4) report initial utilization data for the apps and share initial users' self-rated perceptions of the apps. In this study, we will present two techniques for creating a mobile app using two of the well-established online mobile app building websites. The costs of development are specified and the methodology of dissemination of the apps will be shared. The application of the low-cost methodologies in the creation of the "Mastering Psychiatry" app for undergraduates and "Déjà vu" app for postgraduates will be discussed. A questionnaire survey has been administered to undergraduate students collating their perceptions towards the app. For the Mastering Psychiatry app, a cumulative total of 722 users have used the mobile app since inception, based on our analytics. For the Déjà vu app, there has been a cumulative total of 154 downloads since inception. The utilization data demonstrated the receptiveness towards these apps, and this is reinforced by the positive perceptions undergraduate students (n=185) had towards the low-cost self-developed apps. This is one of the few studies that have demonstrated low-cost methodologies of app development, as well as student and trainee receptivity toward self-created Web-based mobile phone apps. The results obtained have demonstrated that these Web-based low-cost apps are applicable in real life, and suggest that the methodologies shared in this research paper might be of benefit to other specialities and disciplines.

  8. Application of Low-Cost Methodologies for Mobile Phone App Development

    PubMed Central

    Ng, Beng Yeong; Ho, Roger; Cheok, Christopher Cheng Soon

    2014-01-01

Background The usage of mobile phones and mobile phone apps has become markedly more prevalent over the past decade. Previous research has highlighted a method of using just the Internet browser and a text editor to create an app, but this does not eliminate the challenges faced by clinicians. More recently, two methodologies of app development have been shared, but there have been no disclosures pertaining to the costs involved. In addition, limitations such as the distribution and dissemination of the apps have not been addressed. Objective The aims of this research article are to: (1) highlight a low-cost methodology that clinicians without technical knowledge could use to develop educational apps; (2) clarify the respective costs involved in the process of development; (3) illustrate how limitations pertaining to dissemination could be addressed; and (4) report initial utilization data for the apps and share initial users’ self-rated perceptions of the apps. Methods In this study, we will present two techniques for creating a mobile app using two of the well-established online mobile app building websites. The costs of development are specified and the methodology of dissemination of the apps will be shared. The application of the low-cost methodologies in the creation of the “Mastering Psychiatry” app for undergraduates and “Déjà vu” app for postgraduates will be discussed. A questionnaire survey has been administered to undergraduate students collating their perceptions towards the app. Results For the Mastering Psychiatry app, a cumulative total of 722 users have used the mobile app since inception, based on our analytics. For the Déjà vu app, there has been a cumulative total of 154 downloads since inception. The utilization data demonstrated the receptiveness towards these apps, and this is reinforced by the positive perceptions undergraduate students (n=185) had towards the low-cost self-developed apps. Conclusions This is one of the few studies that have demonstrated low-cost methodologies of app development, as well as student and trainee receptivity toward self-created Web-based mobile phone apps. The results obtained have demonstrated that these Web-based low-cost apps are applicable in real life, and suggest that the methodologies shared in this research paper might be of benefit to other specialities and disciplines. PMID:25491323

  9. A UVM simulation environment for the study, optimization and verification of HL-LHC digital pixel readout chips

    NASA Astrophysics Data System (ADS)

    Marconi, S.; Conti, E.; Christiansen, J.; Placidi, P.

    2018-05-01

    The operating conditions of the High Luminosity upgrade of the Large Hadron Collider are very demanding for the design of next generation hybrid pixel readout chips in terms of particle rate, radiation level and data bandwidth. To this purpose, the RD53 Collaboration has developed for the ATLAS and CMS experiments a dedicated simulation and verification environment using industry-consolidated tools and methodologies, such as SystemVerilog and the Universal Verification Methodology (UVM). This paper presents how the so-called VEPIX53 environment has first guided the design of digital architectures, optimized for processing and buffering very high particle rates, and secondly how it has been reused for the functional verification of the first large scale demonstrator chip designed by the collaboration, which has recently been submitted.

  10. Future Mission Trends and their Implications for the Deep Space Network

    NASA Technical Reports Server (NTRS)

    Abraham, Douglas S.

    2006-01-01

    Planning for the upgrade and/or replacement of Deep Space Network (DSN) assets that typically operate for forty or more years necessitates understanding potential customer needs as far into the future as possible. This paper describes the methodology Deep Space Network (DSN) planners use to develop this understanding, some key future mission trends that have emerged from application of this methodology, and the implications of the trends for the DSN's future evolution. For NASA's current plans out to 2030, these trends suggest the need to accommodate: three times as many communication links, downlink rates two orders of magnitude greater than today's, uplink rates some four orders of magnitude greater, and end-to-end link difficulties two-to-three orders of magnitude greater. To meet these challenges, both DSN capacity and capability will need to increase.

  11. Self-Care Behaviors of African Americans Living with Heart Failure.

    PubMed

    Woda, Aimee; Haglund, Kristin; Belknap, Ruth Ann; Sebern, Margaret

    2015-01-01

    African Americans have a higher risk of developing heart failure (HF) than persons from other ethnic groups. Once diagnosed, they have lower rates of HF self-care and poorer health outcomes. Promoting engagement in HF self-care is amenable to change and represents an important way to improve the health of African Americans with HF. This study used a community-based participatory action research methodology called photovoice to explore the practice of HF self-care among low-income, urban, community dwelling African Americans. Using the photovoice methodology, themes emerged regarding self-care management and self-care maintenance.

  12. Solution methods for one-dimensional viscoelastic problems

    NASA Technical Reports Server (NTRS)

    Stubstad, John M.; Simitses, George J.

    1987-01-01

    A recently developed differential methodology for the solution of one-dimensional nonlinear viscoelastic problems is presented. Using the example of an eccentrically loaded cantilever beam-column, the results from the differential formulation are compared to results generated using a previously published integral solution technique. It is shown that the results obtained from these distinct methodologies exhibit a surprisingly high degree of correlation with one another. A discussion of the various factors affecting the numerical accuracy and rate of convergence of these two procedures is also included. Finally, the influences of some 'higher order' effects, such as straining along the centroidal axis, are discussed.

  13. Non-isothermal elastoviscoplastic analysis of planar curved beams

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.; Carlson, R. L.; Riff, R.

    1988-01-01

    The development of a general mathematical model and solution methodologies to examine the behavior of thin structural elements such as beams, rings, and arches subjected to large nonisothermal elastoviscoplastic deformations is presented. Thus, geometric as well as material-type nonlinearities of higher order are present in the analysis. For this purpose a complete true ab initio rate theory of kinematics and kinetics for thin bodies, without any restriction on the magnitude of the transformation, is presented. A previously formulated elasto-thermo-viscoplastic material constitutive law is employed in the analysis. The methodology is demonstrated through three different straight and curved beam problems.

  14. Improving patient care in cardiac surgery using Toyota production system based methodology.

    PubMed

    Culig, Michael H; Kunkle, Richard F; Frndak, Diane C; Grunden, Naida; Maher, Thomas D; Magovern, George J

    2011-02-01

    A new cardiac surgery program was developed in a community hospital setting using the operational excellence (OE) method, which is based on the principles of the Toyota production system. The initial results of the first 409 heart operations, performed over the 28 months between March 1, 2008, and June 30, 2010, are presented. Operational excellence methodology was taught to the cardiac surgery team. Coaching started 2 months before the opening of the program and continued for 24 months. Of the 409 cases presented, 253 were isolated coronary artery bypass graft operations. One operative death occurred. According to the database maintained by The Society of Thoracic Surgeons, the risk-adjusted operative mortality rate was 61% lower than the regional rate. Likewise, the risk-adjusted rate of major complications was 57% lower than The Society of Thoracic Surgeons regional rate. Daily solution to determine cause was attempted on 923 distinct perioperative problems by all team members. Using the cost of complications as described by Speir and coworkers, avoiding predicted complications resulted in a savings of at least $884,900 as compared with the regional average. By the systematic use of a real time, highly formatted problem-solving methodology, processes of care improved daily. Using carefully disciplined teamwork, reliable implementation of evidence-based protocols was realized by empowering the front line to make improvements. Low rates of complications were observed, and a cost savings of $3,497 per each case of isolated coronary artery bypass graft was realized. Copyright © 2011 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  15. Client Perceptions of Helpfulness in Therapy: a Novel Video-Rating Methodology for Examining Process Variables at Brief Intervals During a Single Session.

    PubMed

    Cocklin, Alexandra A; Mansell, Warren; Emsley, Richard; McEvoy, Phil; Preston, Chloe; Comiskey, Jody; Tai, Sara

    2017-11-01

    The value of clients' reports of their experiences in therapy is widely recognized, yet quantitative methodology has rarely been used to measure clients' self-reported perceptions of what is helpful over a single session. A video-rating method was developed to gather data at brief intervals, using process measures of client perceived experience and standardized measures of working alliance (Session Rating Scale; SRS). Data were collected over the course of a single video-recorded session of cognitive therapy (Method of Levels Therapy; Carey, 2006; Mansell et al., 2012). We examined the acceptability and feasibility of the methodology and tested the concurrent validity of the measure by utilizing theory-led constructs. Eighteen therapy sessions were video-recorded and clients each rated a 20-minute session of therapy at two-minute intervals using repeated measures. A multi-level analysis was used to test for correlations between perceived levels of helpfulness and client process variables. The design proved to be feasible. Concurrent validity was borne out through high correlations between constructs. A multi-level regression examined the independent contributions of client process variables to client perceived helpfulness. Client perceived control (b = 0.39, 95% CI .05 to 0.73), the ability to talk freely (b = 0.30, SE = 0.11, 95% CI .09 to 0.51) and therapist approach (b = 0.31, SE = 0.14, 95% CI .04 to 0.57) predicted client-rated helpfulness. We identify a feasible and acceptable method for studying continuous measures of helpfulness and their psychological correlates during a single therapy session.

  16. ChargeOut! : discounted cash flow compared with traditional machine-rate analysis

    Treesearch

    Ted Bilek

    2008-01-01

    ChargeOut!, a discounted cash-flow methodology in spreadsheet format for analyzing machine costs, is compared with traditional machine-rate methodologies. Four machine-rate models are compared and a common data set representative of logging skidders’ costs is used to illustrate the differences between ChargeOut! and the machine-rate methods. The study found that the...
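
    The core difference between the approaches compared above is discounting. The discounted cash-flow criterion underlying ChargeOut!-style analyses is, schematically (a generic net-present-value form, not the tool's exact cost model):

        \[
        \mathrm{NPV} = \sum_{t=0}^{T} \frac{R_{t} - C_{t}}{(1 + r)^{t}},
        \]

    where R_t and C_t are the machine's revenues and costs in year t and r the discount rate. Traditional machine-rate methods instead spread average annual ownership and operating costs over scheduled machine hours without discounting the timing of the cash flows, which is the main source of divergence between the two methods.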

  17. A Rapid Python-Based Methodology for Target-Focused Combinatorial Library Design.

    PubMed

    Li, Shiliang; Song, Yuwei; Liu, Xiaofeng; Li, Honglin

    2016-01-01

    The chemical space is so vast that only a small portion of it has been examined. As a complementary approach to systematically probe the chemical space, virtual combinatorial library design has had an enormous impact on generating novel and diverse structures for drug discovery. Despite these favorable contributions, high attrition rates in drug development, mainly resulting from lack of efficacy and side effects, make it increasingly challenging to discover good chemical starting points. In most cases, focused libraries, which are restricted to particular regions of the chemical space, are deftly exploited to maximize hit rate and improve efficiency at the beginning of the drug discovery and drug development pipeline. This paper presents a valid methodology for fast target-focused combinatorial library design in both reaction-based and production-based ways, with library creation rates of approximately 70,000 molecules per second. Simple, quick and convenient operating procedures are the specific features of the method. SHAFTS, a hybrid 3D similarity calculation software, was embedded to help refine the size of the libraries and improve hit rates. Two target-focused (p38-focused and COX2-focused) libraries were constructed efficiently in this study. This rapid library enumeration method is portable and applicable to any other target for identifying good chemical starting points in combination with either structure-based or ligand-based virtual screening.
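
    A hedged sketch of reaction-based enumeration of the kind described above, using a toy two-component coupling (the fragment lists and the string-level "reaction" are placeholders; real enumeration would apply reaction transforms with a cheminformatics toolkit):

        from itertools import product

        # Toy building-block lists for a two-component coupling reaction.
        acids = ["c1ccccc1C(=O)O", "CC(=O)O"]        # illustrative acid SMILES
        amines = ["NCCO", "Nc1ccccc1", "NCC(C)C"]    # illustrative amine SMILES

        def couple(acid: str, amine: str) -> str:
            # Naive string-level product record standing in for the transform.
            return f"{acid}.{amine}>>amide"

        library = [couple(a, b) for a, b in product(acids, amines)]
        print(len(library), "products enumerated")  # 6 = 2 x 3 combinations

    The cartesian product over fragment lists is what makes enumeration fast; similarity filtering (as with SHAFTS above) then prunes the library toward the target-focused subset.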

  18. Epidemiology of multiple chronic conditions: an international perspective.

    PubMed

    Schellevis, François G

    2013-01-01

    The epidemiology of multimorbidity, or multiple chronic conditions (MCCs), is one of the research priority areas of the U.S. Department of Health and Human Services (HHS), as identified in its Strategic Framework on MCCs. A conceptual model addressing methodological issues leading to a valid measurement of the prevalence rates of MCCs has been developed and applied in descriptive epidemiological studies. Comparing these results with those from prevalence studies performed earlier and in other countries is hampered by methodological limitations. Therefore, this paper aims to put the size and patterns of MCCs in the USA, as established within the HHS Strategic Framework on MCCs, into perspective against findings on the prevalence of MCCs in other countries. General common trends can be observed: increasing prevalence rates with increasing age, and multimorbidity being the rule rather than the exception in old age. The most frequent combinations of chronic diseases include the most frequently occurring single chronic diseases. New descriptive epidemiological studies will probably not provide new results; therefore, future descriptive studies should focus on the prevalence rates of MCCs in subpopulations, statistical clustering of chronic conditions, and the development of the prevalence rates of MCCs over time. The finding of common trends also indicates the necessary transition to the next phase of MCC research, addressing the quality of care of patients with MCCs from an organizational perspective and with respect to the content of care. Journal of Comorbidity 2013;3:36-40.

  19. MUSIC-Expected maximization gaussian mixture methodology for clustering and detection of task-related neuronal firing rates.

    PubMed

    Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A

    2017-01-15

    Researchers often rely on simple methods to identify involvement of neurons in a particular motor task. The historical approach has been to inspect large groups of neurons and subjectively separate neurons into groups based on the expertise of the investigator. In cases where neuron populations are small it is reasonable to inspect these neuronal recordings and their firing rates carefully to avoid data omissions. In this paper, a new methodology is presented for automatic objective classification of neurons recorded in association with behavioral tasks into groups. By identifying characteristics of neurons in a particular group, the investigator can then identify functional classes of neurons based on their relationship to the task. The methodology is based on integration of a multiple signal classification (MUSIC) algorithm to extract relevant features from the firing rate and an expectation-maximization Gaussian mixture algorithm (EM-GMM) to cluster the extracted features. The methodology is capable of identifying and clustering similar firing rate profiles automatically based on specific signal features. An empirical wavelet transform (EWT) was used to validate the features found in the MUSIC pseudospectrum and the resulting signal features captured by the methodology. Additionally, this methodology was used to inspect behavioral elements of neurons to physiologically validate the model. This methodology was tested using a set of data collected from awake behaving non-human primates. Copyright © 2016 Elsevier B.V. All rights reserved.
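
    A minimal sketch of the feature-then-cluster pipeline described above, with an FFT power spectrum standing in for the MUSIC pseudospectrum step (the synthetic firing-rate data and the feature choice are illustrative assumptions):

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        # Synthetic firing-rate profiles: 30 neurons x 200 bins, two latent groups.
        slow = np.sin(np.linspace(0, 2 * np.pi, 200))
        fast = np.sin(np.linspace(0, 8 * np.pi, 200))
        rates = np.vstack(
            [10 + 5 * slow + rng.normal(0, 1, 200) for _ in range(15)]
            + [10 + 5 * fast + rng.normal(0, 1, 200) for _ in range(15)])

        # Spectral features extracted from each profile (MUSIC stand-in).
        spectra = np.abs(np.fft.rfft(rates - rates.mean(axis=1, keepdims=True))) ** 2
        features = spectra[:, 1:6]  # power in the first few nonzero frequency bins

        # EM-fitted Gaussian mixture assigns each neuron to a cluster.
        labels = GaussianMixture(n_components=2, random_state=0).fit_predict(features)
        print(labels)  # group assignment per neuron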

  20. Development of Improved Accelerated Corrosion Qualification Test Methodology for Aerospace Materials

    DTIC Science & Technology

    2014-11-01

    [Briefing-slide excerpt; only fragments survive extraction.] Recoverable points: exposure testing combined irradiation and ozone gas; a cumulative damage model for predicting atmospheric corrosion rates of 1010 steel was developed using weather-data inputs (temperature, relative humidity (%RH), and atmospheric contaminant levels: chloride, SO2, and ozone); test materials included silver, Al alloys 7075, 2024, and 6061, copper, and steel, with an ozone generator and ozone monitor in the exposure setup. DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited.

  1. Life Prediction/Reliability Data of Glass-Ceramic Material Determined for Radome Applications

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Gyekenyesi, John P.

    2002-01-01

    Brittle ceramic materials are candidates for a variety of structural applications over a wide range of temperatures. However, the process of slow crack growth, which occurs in any loading configuration, limits the service life of structural components. Therefore, it is important to accurately determine the slow crack growth parameters required for component life prediction using an appropriate test methodology. This test methodology should also be useful in determining the influence of component processing and composition variables on the slow crack growth behavior of newly developed or existing materials, thereby allowing the component processing and composition to be tailored and optimized to specific needs. Through the American Society for Testing and Materials (ASTM), the authors recently developed two test methods to determine the life prediction parameters of ceramics. The two test standards, ASTM C 1368 for room temperature and ASTM C 1465 for elevated temperatures, were published in the 2001 Annual Book of ASTM Standards, Vol. 15.01. Briefly, the test method employs constant stress-rate (or dynamic fatigue) testing to determine flexural strength as a function of the applied stress rate. The merit of this test method lies in its simplicity: strengths are measured in a routine manner in flexure at four or more applied stress rates, with an appropriate number of test specimens at each applied stress rate. The slow crack growth parameters necessary for life prediction are then determined from a simple relationship between the strength and the applied stress rate. Extensive life prediction testing was conducted at the NASA Glenn Research Center using the developed ASTM C 1368 test method to determine the life prediction parameters of a glass-ceramic material that the Navy will use for radome applications.
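    The "simple relationship" is a power law: in constant stress-rate testing the measured strength grows with the applied stress rate as log(sigma_f) = log(D) + log(sigma_dot)/(n + 1), so the slow crack growth exponent n falls out of a straight-line fit on log-log axes. A minimal sketch (with invented strength data) follows.

        # Constant stress-rate (dynamic fatigue) analysis: fit
        # log10(strength) vs log10(stress rate); slope = 1/(n+1).
        import numpy as np

        stress_rate = np.array([0.03, 0.3, 3.0, 30.0])    # MPa/s (illustrative)
        strength = np.array([98.0, 108.0, 119.0, 131.0])  # mean strengths, MPa

        slope, intercept = np.polyfit(np.log10(stress_rate), np.log10(strength), 1)
        n = 1.0 / slope - 1.0   # slow crack growth exponent
        D = 10.0 ** intercept   # strength at unit stress rate
        print(f"n = {n:.1f}, D = {D:.1f} MPa")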

  2. 18 CFR 342.4 - Other rate changing methodologies.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Other rate changing methodologies. 342.4 Section 342.4 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... regard to the applicable ceiling level under § 342.3. (b) Market-based rates. A carrier may attempt to...

  3. Effect of periodontal treatment on preterm birth rate: a systematic review of meta-analyses.

    PubMed

    López, Néstor J; Uribe, Sergio; Martinez, Benjamín

    2015-02-01

    Preterm birth is a major cause of neonatal morbidity and mortality in both developed and developing countries. Preterm birth is a highly complex syndrome that includes distinct clinical subtypes in which many different causes may be involved. The results of epidemiological, molecular, microbiological and animal-model studies support a positive association between maternal periodontal disease and preterm birth. However, the results of intervention studies carried out to determine the effect of periodontal treatment on reducing the risk of preterm birth are controversial. This systematic review critically analyzes the methodological issues of meta-analyses of studies assessing the effect of periodontal treatment on preterm birth. The quality of the individual randomized clinical trials selected is of the highest relevance for a systematic review. This article describes the methodological features that should be identified a priori and assessed individually to determine the quality of a randomized controlled trial performed to evaluate the effect of periodontal treatment on pregnancy outcomes. The AMSTAR and PRISMA checklist tools were used to assess the quality of the six meta-analyses selected, and the bias domains of the Cochrane Collaboration's tool were applied to evaluate each of the trials included in the meta-analyses. In addition, the methodological characteristics of each clinical trial were assessed. The majority of the trials included in the meta-analyses have significant methodological flaws that threaten their internal validity. Neither the lack of effect of periodontal treatment on preterm birth rate concluded by four meta-analyses, nor the positive effect of treatment for reducing preterm birth risk concluded by the remaining two, is based on consistent scientific evidence. Well-conducted randomized controlled trials using rigorous methodology, including appropriate definition of the exposure, adequate control of confounders for preterm birth and application of effective periodontal interventions to eliminate periodontal infection, are needed to confirm the positive association between periodontal disease and preterm birth. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  4. A methodology for creating greenways through multidisciplinary sustainable landscape planning.

    PubMed

    Pena, Selma Beatriz; Abreu, Maria Manuela; Teles, Rui; Espírito-Santo, Maria Dalila

    2010-01-01

    This research proposes a methodology for defining greenways via sustainable planning. The approach includes the analysis and discussion of the cultural and natural processes that occur in the landscape. The proposed methodology is structured in three phases: eco-cultural analysis; synthesis and diagnosis; and proposal. An interdisciplinary approach provides an assessment of the relationships between landscape structure and landscape dynamics, which are essential to any landscape management or land use. The eco-cultural landscape analysis provides a biophysical, dynamic (geomorphologic rate), vegetation (habitats from Directive 92/43/EEC) and cultural characterisation. The knowledge obtained by this analysis then supports the definition of priority actions to stabilise the landscape and of management measures for the habitats. After the analysis and diagnosis phases, a proposal for the development of sustainable greenways can be produced. This methodology was applied to a study area in the Azambuja Municipality of the Lisbon Metropolitan Area (Portugal). The application shows that landscape stability is crucial if greenway users are to appreciate the landscape and its natural and cultural elements in a sustainable and healthy way, whether cycling or on foot. A balanced landscape will increase the value of greenways and, in return, greenways can support socio-economic activities that benefit rural communities. Copyright 2009 Elsevier Ltd. All rights reserved.

  5. A methodology to condition distorted acoustic emission signals to identify fracture timing from human cadaver spine impact tests.

    PubMed

    Arun, Mike W J; Yoganandan, Narayan; Stemper, Brian D; Pintar, Frank A

    2014-12-01

    While studies have used acoustic sensors to determine fracture initiation time in biomechanical studies, no systematic procedure has been established for processing the acoustic signals. The objective of this study was to develop a methodology for conditioning distorted acoustic emission data using signal processing techniques to identify fracture initiation time. The methodology was developed from testing a human cadaver lumbar spine column. Acoustic sensors were glued to all vertebrae, high-rate impact loading was applied, load-time histories were recorded with a load cell, and fracture was documented using CT. A compression fracture occurred at L1 while the other vertebrae remained intact. FFTs of the raw voltage-time traces were used to determine an optimum frequency range associated with high decibel levels, and the signals were bandpass filtered in this range. A bursting pattern was found in the fractured vertebra, while signals from the other vertebrae were silent; the bursting time was associated with the time of fracture initiation. The force at fracture was determined from this time and the force-time data. The methodology is independent of parameters selected a priori, such as a fixed voltage level, bandpass frequency and/or reliance on the force-time signal, and it allows determination of force based on the time identified during signal processing. The methodology can be used for different body regions in cadaver experiments. Copyright © 2014 Elsevier Ltd. All rights reserved.
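    The conditioning chain described (FFT to pick a band, bandpass filter, then burst timing) is easy to reproduce. The sketch below uses SciPy on a synthetic trace; the sampling rate, band edges, and envelope threshold are illustrative assumptions, not the study's values.

        # FFT-guided bandpass plus envelope threshold to time an AE burst.
        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        fs = 100_000                                    # 100 kHz sampling (assumed)
        t = np.arange(0, 0.05, 1/fs)
        sig = 0.05 * np.random.default_rng(1).standard_normal(t.size)
        burst = (t > 0.02) & (t < 0.022)                # synthetic "fracture" at 20 ms
        sig[burst] += np.sin(2*np.pi*15_000*t[burst])   # 15 kHz emission

        # band chosen here by assumption; in practice, from the FFT of the trace
        b, a = butter(4, [10_000, 20_000], btype="band", fs=fs)
        filtered = filtfilt(b, a, sig)

        envelope = np.abs(hilbert(filtered))            # analytic-signal envelope
        baseline = envelope[: int(0.01 * fs)].std()     # quiet first 10 ms
        onset = np.argmax(envelope > 5 * baseline)      # first threshold crossing
        print(f"estimated fracture initiation: {t[onset]*1000:.2f} ms")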

  6. Direct Administration of Nerve-Specific Contrast to Improve Nerve Sparing Radical Prostatectomy

    PubMed Central

    Barth, Connor W.; Gibbs, Summer L.

    2017-01-01

    Nerve damage remains a major morbidity following nerve sparing radical prostatectomy, significantly affecting quality of life post-surgery. Nerve-specific fluorescence guided surgery offers a potential solution by enhancing nerve visualization intraoperatively. However, the prostate is highly innervated, and only the cavernous nerve structures require preservation to maintain continence and potency. Systemic administration of a nerve-specific fluorophore would lower the nerve signal to background ratio (SBR) in vital nerve structures, making them difficult to distinguish from all nervous tissue in the pelvic region. A direct administration methodology enabling selective nerve highlighting for enhanced nerve SBR in a specific nerve structure has been developed herein. The direct administration methodology demonstrated nerve-specific contrast equivalent to systemic administration at optimal exposure times. However, it provided a brighter fluorescent nerve signal, facilitating nerve-specific fluorescence imaging at video rate, which was not possible following systemic administration. Additionally, the direct administration methodology required a significantly lower fluorophore dose than systemic administration, one that, when scaled to humans, falls within the microdosing range. Furthermore, a dual-fluorophore tissue staining method was developed that alleviates background fluorescence from adipose tissue accumulation using a spectrally distinct, adipose-tissue-specific fluorophore. These results validate the use of the direct administration methodology for specific nerve visualization with fluorescence image-guided surgery, which would improve vital nerve structure identification and visualization during nerve sparing radical prostatectomy. PMID:28255352

  7. Direct Administration of Nerve-Specific Contrast to Improve Nerve Sparing Radical Prostatectomy.

    PubMed

    Barth, Connor W; Gibbs, Summer L

    2017-01-01

    Nerve damage remains a major morbidity following nerve sparing radical prostatectomy, significantly affecting quality of life post-surgery. Nerve-specific fluorescence guided surgery offers a potential solution by enhancing nerve visualization intraoperatively. However, the prostate is highly innervated, and only the cavernous nerve structures require preservation to maintain continence and potency. Systemic administration of a nerve-specific fluorophore would lower the nerve signal to background ratio (SBR) in vital nerve structures, making them difficult to distinguish from all nervous tissue in the pelvic region. A direct administration methodology enabling selective nerve highlighting for enhanced nerve SBR in a specific nerve structure has been developed herein. The direct administration methodology demonstrated nerve-specific contrast equivalent to systemic administration at optimal exposure times. However, it provided a brighter fluorescent nerve signal, facilitating nerve-specific fluorescence imaging at video rate, which was not possible following systemic administration. Additionally, the direct administration methodology required a significantly lower fluorophore dose than systemic administration, one that, when scaled to humans, falls within the microdosing range. Furthermore, a dual-fluorophore tissue staining method was developed that alleviates background fluorescence from adipose tissue accumulation using a spectrally distinct, adipose-tissue-specific fluorophore. These results validate the use of the direct administration methodology for specific nerve visualization with fluorescence image-guided surgery, which would improve vital nerve structure identification and visualization during nerve sparing radical prostatectomy.

  8. Methodological approach for the collection and simultaneous estimation of greenhouse gases emission from aquaculture ponds.

    PubMed

    Vasanth, Muthuraman; Muralidhar, Moturi; Saraswathy, Ramamoorthy; Nagavel, Arunachalam; Dayal, Jagabattula Syama; Jayanthi, Marappan; Lalitha, Natarajan; Kumararaja, Periyamuthu; Vijayan, Koyadan Kizhakkedath

    2016-12-01

    Global warming/climate change is the greatest environmental threat of our time. The rapidly developing aquaculture sector is an anthropogenic activity whose contribution to global warming is little understood, and estimation of greenhouse gas (GHG) emissions from aquaculture ponds is a key step in predicting the impact of aquaculture on global warming. A comprehensive methodology was developed for the sampling and simultaneous analysis of the GHGs carbon dioxide (CO2), methane (CH4) and nitrous oxide (N2O) from aquaculture ponds. The GHG fluxes were collected using a cylindrical acrylic chamber, an air pump and Tedlar bags; a cylindrical acrylic floating chamber was fabricated to collect the GHGs emanating from the pond surface. The sampling methodology was standardized, and in-house method validation was established by demonstrating linearity, accuracy, precision and specificity. GHG samples were found to be stable for 3 days when stored at 10 ± 2 °C. The developed methodology was used to quantify GHGs in Pacific white shrimp (Penaeus vannamei) and black tiger shrimp (Penaeus monodon) culture ponds for a period of 4 months. The rate of emission of carbon dioxide was found to be much greater than that of the other two GHGs. Average GHG emissions (g ha-1 day-1) during the culture period were comparatively high in the P. vannamei culture ponds.
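    Converting chamber measurements to a flux is a one-line calculation once the concentration-time slope is known: regress headspace concentration on time, then scale by chamber volume over covered area and the ideal gas law. The sketch below is a minimal version; the chamber dimensions and CO2 readings are invented assumptions.

        # Closed-chamber flux: slope of headspace CO2 vs time -> g m^-2 h^-1.
        import numpy as np

        V = 0.030         # chamber headspace volume, m^3 (assumed)
        A = 0.071         # pond surface area covered, m^2 (assumed)
        T = 303.15        # chamber air temperature, K
        P = 101_325.0     # pressure, Pa
        R = 8.314         # J mol^-1 K^-1
        M_CO2 = 44.01     # g/mol

        t_min = np.array([0.0, 10.0, 20.0, 30.0])      # sampling times, min
        ppm = np.array([410.0, 432.0, 455.0, 476.0])   # CO2 mixing ratios

        slope = np.polyfit(t_min / 60.0, ppm, 1)[0]    # ppm per hour
        dc_dt = slope * 1e-6 * P / (R * T)             # mol m^-3 h^-1
        flux = dc_dt * V / A * M_CO2                   # g CO2 m^-2 h^-1
        print(f"CO2 flux = {flux:.3f} g m^-2 h^-1")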

  9. An Inexpensive and Simple Method to Demonstrate Soil Water and Nutrient Flow

    ERIC Educational Resources Information Center

    Nichols, K. A.; Samson-Liebig, S.

    2011-01-01

    Soil quality, soil health, and soil sustainability are concepts that are being widely used but are difficult to define and illustrate, especially to a non-technical audience. The objectives of this manuscript were to develop simple and inexpensive methodologies to both qualitatively and quantitatively estimate water infiltration rates (IR),…

  10. The Rankings Game: Who's Playing Whom?

    ERIC Educational Resources Information Center

    Burness, John F.

    2008-01-01

    This summer, Forbes magazine published its new rankings of "America's Best Colleges," implying that it had developed a methodology that would give the public the information that it needed to choose a college wisely. "U.S. News & World Report," which in 1983 published the first annual ranking, just announced its latest ratings last week--including…

  11. From the Users' Perspective-The UCSD Libraries User Survey Project.

    ERIC Educational Resources Information Center

    Talbot, Dawn E.; Lowell, Gerald R.; Martin, Kerry

    1998-01-01

    Discussion of a user-driven survey conducted at the University of California, San Diego libraries focuses on the methodology that resulted in a high response rate. Highlights goals for the survey, including acceptance of data by groups outside the library and for benchmarking data; planning; user population; and questionnaire development. (LRW)

  12. Evolution of Project-Based Learning in Small Groups in Environmental Engineering Courses

    ERIC Educational Resources Information Center

    Requies, Jesús M.; Agirre, Ion; Barrio, V. Laura; Graells, Moisès

    2018-01-01

    This work presents the assessment of the development and evolution of an active methodology (Project-Based Learning--PBL) implemented on the course "Unit Operations in Environmental Engineering", within the bachelor's degree in Environmental Engineering, with the purpose of decreasing the dropout rate in this course. After the initial…

  13. Quartz dissolution. I - Negative crystal experiments and a rate law. II - Theory of rough and smooth surfaces

    NASA Technical Reports Server (NTRS)

    Gratz, Andrew J.; Bird, Peter

    1993-01-01

    The range of measured quartz dissolution rates, as a function of temperature, pOH, extent of saturation, and ionic strength, is extended to cover a wider range of solution chemistries, using the negative crystal methodology of Gratz et al. (1990) to measure the dissolution rate. A simple rate law describing quartz dissolution kinetics above the point of zero charge of quartz is derived for ionic strengths above 0.003 m. Measurements were also performed on defective crystals, and the mathematics of step motion during quartz dissolution was developed and compared with rough-face behavior using two different models.

  14. Rating methodological quality: toward improved assessment and investigation.

    PubMed

    Moyer, Anne; Finney, John W

    2005-01-01

    Assessing methodological quality is considered essential in deciding what investigations to include in research syntheses and in detecting potential sources of bias in meta-analytic results. Quality assessment is also useful in characterizing the strengths and limitations of the research in an area of study. Although numerous instruments to measure research quality have been developed, they have lacked empirically-supported components. In addition, different summary quality scales have yielded different findings when they were used to weight treatment effect estimates for the same body of research. Suggestions for developing improved quality instruments include: distinguishing distinct domains of quality, such as internal validity, external validity, the completeness of the study report, and adherence to ethical practices; focusing on individual aspects, rather than domains of quality; and focusing on empirically-verified criteria. Other ways to facilitate the constructive use of quality assessment are to improve and standardize the reporting of research investigations, so that the quality of studies can be more equitably and thoroughly compared, and to identify optimal methods for incorporating study quality ratings into meta-analyses.

  15. A methodology for enhancing implementation science proposals: comparison of face-to-face versus virtual workshops.

    PubMed

    Marriott, Brigid R; Rodriguez, Allison L; Landes, Sara J; Lewis, Cara C; Comtois, Katherine A

    2016-05-06

    With the current funding climate and need for advancements in implementation science, there is a growing demand for grantsmanship workshops to increase the quality and rigor of proposals. A group-based implementation science-focused grantsmanship workshop, the Implementation Development Workshop (IDW), is one methodology to address this need. This manuscript provides an overview of the IDW structure, format, and findings regarding its utility. The IDW methodology allows researchers to vet projects in the proposal stage in a structured format with a facilitator and two types of expert participants: presenters and attendees. The presenter uses a one-page handout and verbal presentation to present their proposal and questions. The facilitator elicits feedback from attendees using a format designed to maximize the number of unique points made. After each IDW, participants completed an anonymous survey assessing perceptions of the IDW. Presenters completed a funding survey measuring grant submission and funding success. Qualitative interviews were conducted with a subset of participants who participated in both delivery formats. Mixed method analyses were performed to evaluate the effectiveness and acceptability of the IDW and compare the delivery formats. Of those who participated in an IDW (N = 72), 40 participated in face-to-face only, 16 in virtual only, and 16 in both formats. Thirty-eight (face-to-face n = 12, 35% response rate; virtual n = 26, 66.7% response rate) responded to the surveys and seven (15.3% response rate), who had attended both formats, completed an interview. Of 36 total presenters, 17 (face-to-face n = 12, 42.9% response rate; virtual n = 5, 62.9% response rate) responded to the funding survey. Mixed method analyses indicated that the IDW was effective for collaboration and growth, effective for enhancing success in obtaining grants, and acceptable. A third (35.3%) of presenters ultimately received funding for their proposal, and more than 80% of those who presented indicated they would present again in the future. The IDW structure and facilitation process were found to be acceptable, with both formats rated as equally strong. The IDW presents an acceptable and successful methodology for increasing competitiveness of implementation science grant proposals.

  16. Development of a standardized training course for laparoscopic procedures using Delphi methodology.

    PubMed

    Bethlehem, Martijn S; Kramp, Kelvin H; van Det, Marc J; ten Cate Hoedemaker, Henk O; Veeger, Nicolaas J G M; Pierie, Jean Pierre E N

    2014-01-01

    Content, evaluation, and certification of laparoscopic skills and procedure training lack uniformity among different hospitals in The Netherlands. Within the process of developing a new regional laparoscopic training curriculum, a uniform and transferable curriculum was constructed for a series of laparoscopic procedures. The aim of this study was to determine regional expert consensus regarding the key steps for laparoscopic appendectomy and cholecystectomy using Delphi methodology. Lists of suggested key steps for laparoscopic appendectomy and cholecystectomy were created using surgical textbooks, available guidelines, and local practice. A total of 22 experts, working for teaching hospitals throughout the region, were asked to rate the suggested key steps for both procedures on a Likert scale from 1-5. Consensus was reached with Cronbach's α ≥ 0.90. Of the 22 experts, 21 completed and returned the survey (95%). Data analysis already showed consensus after the first round of Delphi on the key steps for laparoscopic appendectomy (Cronbach's α = 0.92) and laparoscopic cholecystectomy (Cronbach's α = 0.90). After the second round, 15 proposed key steps for laparoscopic appendectomy and 30 proposed key steps for laparoscopic cholecystectomy were rated as important (≥4 by at least 80% of the expert panel). These key steps were used for the further development of the training curriculum. By using the Delphi methodology, regional consensus was reached on the key steps for laparoscopic appendectomy and cholecystectomy. These key steps are going to be used for standardized training and evaluation purposes in a new regional laparoscopic curriculum. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
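    The consensus statistic used here, Cronbach's α, is a short computation over an experts-by-items matrix: α = k/(k-1) · (1 - sum of item variances / variance of the summed score). A minimal sketch with synthetic Likert ratings (21 experts, 15 key steps, numbers invented) follows.

        # Cronbach's alpha over an experts x items matrix of Likert ratings.
        import numpy as np

        rng = np.random.default_rng(2)
        leniency = rng.normal(0, 0.8, size=(21, 1))   # per-expert severity
        ratings = 4 + leniency + rng.normal(0, 0.5, size=(21, 15))
        ratings = ratings.round().clip(1, 5)          # Likert scale 1-5

        def cronbach_alpha(x):
            x = np.asarray(x, dtype=float)
            k = x.shape[1]
            item_vars = x.var(axis=0, ddof=1).sum()   # sum of item variances
            total_var = x.sum(axis=1).var(ddof=1)     # variance of total score
            return k / (k - 1) * (1 - item_vars / total_var)

        print(f"alpha = {cronbach_alpha(ratings):.2f}")  # >= 0.90 signals consensus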

  17. Methodology for Knowledge Synthesis of the Management of Vaccination Pain and Needle Fear.

    PubMed

    Taddio, Anna; McMurtry, C Meghan; Shah, Vibhuti; Yoon, Eugene W; Uleryk, Elizabeth; Pillai Riddell, Rebecca; Lang, Eddy; Chambers, Christine T; Noel, Melanie; MacDonald, Noni E

    2015-10-01

    A knowledge synthesis was undertaken to inform the development of a revised and expanded clinical practice guideline about managing vaccination pain in children, extended to include the management of pain across the lifespan and the management of fear in individuals with high levels of needle fear. This manuscript describes the methodological details of the knowledge synthesis and presents the list of included clinical questions, critical and important outcomes, the search strategy, and the search strategy results. The Grading of Recommendations Assessment, Development and Evaluation (GRADE) and Cochrane methodologies provided the general framework. The project team voted on clinical questions for inclusion and on critically important and important outcomes. A broad search strategy was used to identify relevant randomized controlled trials and quasi-randomized controlled trials. The quality of research evidence was assessed using the Cochrane risk of bias tool, and quality across studies was assessed using GRADE. Multiple measures of the same construct within studies (e.g., observer-rated and parent-rated infant distress) were combined before pooling. The standardized mean difference with 95% confidence interval (CI) or the relative risk with 95% CI was used to express the effects of an intervention. Altogether, 55 clinical questions were selected for inclusion in the knowledge synthesis; 49 pertained to pain management during vaccine injections and 6 pertained to fear management in individuals with high levels of needle fear. Pain, fear, and distress were typically prioritized as critically important outcomes across clinical questions. The search strategy identified 136 relevant studies. This manuscript describes the methodological details of a knowledge synthesis about pain management during vaccination and fear management in individuals with high levels of needle fear. Subsequent manuscripts in this series will present the results for the included questions.
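    The effect-size arithmetic behind "standardized mean difference and 95% CI" is compact enough to show inline. The sketch below computes Cohen's d with the usual large-sample standard error for a two-arm trial; the summary statistics are invented for illustration.

        # Standardized mean difference (Cohen's d) with a 95% CI.
        import math

        n1, m1, s1 = 40, 3.1, 1.9   # intervention arm: n, mean distress, SD
        n2, m2, s2 = 42, 4.0, 2.1   # control arm (numbers invented)

        s_pooled = math.sqrt(((n1 - 1)*s1**2 + (n2 - 1)*s2**2) / (n1 + n2 - 2))
        d = (m1 - m2) / s_pooled
        se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
        lo, hi = d - 1.96 * se, d + 1.96 * se
        print(f"SMD = {d:.2f} (95% CI {lo:.2f} to {hi:.2f})")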

  18. FRAGS: estimation of coding sequence substitution rates from fragmentary data

    PubMed Central

    Swart, Estienne C; Hide, Winston A; Seoighe, Cathal

    2004-01-01

    Background: Rates of substitution in protein-coding sequences can provide important insights into evolutionary processes that are of biomedical and theoretical interest. Increased availability of coding sequence data has enabled researchers to estimate more accurately the coding sequence divergence of pairs of organisms. However, the use of different data sources, alignment protocols and methods to estimate substitution rates leads to widely varying estimates of key parameters that define the coding sequence divergence of orthologous genes. Although complete genome sequence data are not available for all organisms, fragmentary sequence data can provide accurate estimates of substitution rates, provided that an appropriate and consistent methodology is used and that differences in the estimates obtainable from different data sources are taken into account. Results: We have developed FRAGS, an application framework that uses existing, freely available software components to construct in-frame alignments and estimate coding substitution rates from fragmentary sequence data. Coding sequence substitution estimates for human and chimpanzee sequences, generated by FRAGS, reveal that methodological differences can give rise to significantly different estimates of important substitution parameters. The estimated substitution rates were also used to infer upper bounds on the amount of sequencing error in the datasets that we have analysed. Conclusion: We have developed a system that performs robust estimation of substitution rates for orthologous sequences from a pair of organisms. Our system can be used when fragmentary genomic or transcript data are available from one of the organisms and the other is a completely sequenced genome within the Ensembl database. As well as estimating substitution statistics, our system enables the user to manage and query alignment and substitution data. PMID:15005802

  19. Estimation of radionuclide (137Cs) emission rates from a nuclear power plant accident using the Lagrangian Particle Dispersion Model (LPDM).

    PubMed

    Park, Soon-Ung; Lee, In-Hye; Ju, Jae-Won; Joo, Seung Jin

    2016-10-01

    A methodology for estimating the emission rate of 137Cs with the Lagrangian Particle Dispersion Model (LPDM), using monitored 137Cs concentrations around a nuclear power plant, has been developed. The method was employed with the MM5 meteorological model in a 600 km × 600 km domain with a horizontal grid of 3 km × 3 km centered on the Fukushima nuclear power plant, to estimate the 137Cs emission rate for the accident period from 00 UTC 12 March to 00 UTC 6 April 2011. Lagrangian particles are released continuously at a rate of one particle per minute at the first model level, about 15 m above the power plant site. The method reproduced the 137Cs emission rates estimated by other studies quite reasonably, suggesting its potential usefulness for estimating the emission rate from a damaged power plant without detailed inventories of reactors, fuel assemblies and spent fuel. An advantage of the method is that it is not complicated: it requires only a single forward LPDM simulation together with monitored concentrations around the power plant, in contrast to other, inverse models. It was also found that continuously monitored radionuclide concentrations from as many sites as possible, located in all directions around the power plant, are required to obtain accurate continuous emission rates. The methodology can also be used to verify the radionuclide emission estimates used by other modeling groups for cases of intermittent or discontinuous sampling. Copyright © 2016. Published by Elsevier Ltd.
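    Because dispersion is linear in the source strength, a single forward run with a unit emission rate is enough: modelled unit-response concentrations at the monitors are scaled to best match the observations. A minimal least-squares sketch (all numbers invented) follows.

        # Source-term scaling from one unit-emission forward run.
        import numpy as np

        c_unit = np.array([2.1e-9, 7.4e-10, 3.9e-9, 1.2e-9])  # Bq m^-3 per Bq s^-1
        c_obs = np.array([1.9e2, 6.1e1, 3.5e2, 1.4e2])        # monitored 137Cs, Bq m^-3

        q_hat = (c_unit @ c_obs) / (c_unit @ c_unit)  # least-squares emission rate
        print(f"estimated emission rate ~ {q_hat:.2e} Bq/s")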

  20. System Analysis Applied to Autonomy: Application to Human-Rated Lunar/Mars Landers

    NASA Technical Reports Server (NTRS)

    Young, Larry A.

    2006-01-01

    System analysis is an essential technical discipline for the modern design of spacecraft and their associated missions. Specifically, system analysis is a powerful aid in identifying and prioritizing the technologies needed for mission and/or vehicle development efforts. The maturation of intelligent systems technologies, and their incorporation into spacecraft systems, is dictating the development of new analysis tools, and the incorporation of such tools into existing system analysis methodologies, in order to fully capture the trade-offs that autonomy imposes on vehicle and mission success. A "system analysis of autonomy" methodology is outlined and applied to a set of notional human-rated lunar/Mars lander missions toward answering two questions: (1) what is the optimum level of vehicle autonomy and intelligence required? and (2) what attributes of an autonomous system implementation are essential for a given surface lander mission/application in order to maximize mission success? Future human-rated lunar/Mars landers, though nominally under the control of their crew, will nonetheless be highly automated systems. These automated systems will range from mission/flight control functions, to vehicle health monitoring and prognostication, to life-support and other "housekeeping" functions. The optimum degree of autonomy afforded to these spacecraft systems/functions has profound implications from an exploration system architecture standpoint.

  1. Application of Adjoint Methodology in Various Aspects of Sonic Boom Design

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.

    2014-01-01

    One of the advances in computational design has been the development of adjoint methods allowing efficient calculation of sensitivities in gradient-based shape optimization. This paper discusses two new applications of adjoint methodology that have been developed to aid in sonic boom mitigation exercises. In the first, equivalent area targets are generated using adjoint sensitivities of selected boom metrics. These targets may then be used to drive the vehicle shape during optimization. The second application is the computation of adjoint sensitivities of boom metrics on the ground with respect to parameters such as flight conditions, propagation sampling rate, and selected inputs to the propagation algorithms. These sensitivities enable the designer to make more informed selections of flight conditions at which the chosen cost functionals are less sensitive.
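    For reference, the machinery named above has a compact general form (standard discrete-adjoint notation, assumed here rather than taken from the paper): with state q, design or condition parameters alpha, flow residual R(q, alpha) = 0 and boom metric J,

        \left(\frac{\partial R}{\partial q}\right)^{T} \lambda
            = -\left(\frac{\partial J}{\partial q}\right)^{T},
        \qquad
        \frac{dJ}{d\alpha}
            = \frac{\partial J}{\partial \alpha}
            + \lambda^{T} \frac{\partial R}{\partial \alpha}.

    One adjoint solve for lambda yields the sensitivity of a single metric with respect to arbitrarily many parameters, which is why the approach scales to flight conditions, sampling rates and propagation inputs alike.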

  2. Weighted Ensemble Simulation: Review of Methodology, Applications, and Software

    PubMed Central

    Zuckerman, Daniel M.; Chong, Lillian T.

    2018-01-01

    The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling—the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes—protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation. PMID:28301772

  3. Weighted Ensemble Simulation: Review of Methodology, Applications, and Software.

    PubMed

    Zuckerman, Daniel M; Chong, Lillian T

    2017-05-22

    The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling: the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes: protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation.
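    The core WE bookkeeping is a weight-conserving split/merge step applied bin by bin after each propagation interval. The sketch below is a minimal illustration of that step only (a toy, not the published WE software): heavy walkers are cloned with halved weights, light walkers are merged by a weighted coin flip, and the bin's total probability weight is conserved exactly.

        # One weighted-ensemble resampling step for a single bin.
        import random

        def resample_bin(walkers, target=4):
            """walkers: list of (state, weight) pairs; returns a fixed number
            of walkers whose weights sum to the same total."""
            walkers = sorted(walkers, key=lambda w: w[1])
            while len(walkers) > target:                   # merge lightest pair
                (s1, w1), (s2, w2) = walkers.pop(0), walkers.pop(0)
                keep = s1 if random.random() < w1 / (w1 + w2) else s2
                walkers.append((keep, w1 + w2))
                walkers.sort(key=lambda w: w[1])
            while len(walkers) < target:                   # split heaviest
                s, w = walkers.pop()
                walkers += [(s, w / 2), (s, w / 2)]
                walkers.sort(key=lambda w: w[1])
            return walkers

        new = resample_bin([("a", 0.5), ("b", 0.25), ("c", 0.05)])
        print(new, "total weight:", sum(w for _, w in new))  # total stays 0.80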

  4. Microplastic Generation in the Marine Environment Through Degradation and Fragmentation

    NASA Astrophysics Data System (ADS)

    Perryman, M. E.; Jambeck, J.; Woodson, C. B.; Locklin, J.

    2016-02-01

    Plastic use has become requisite in our global economy; as population continues to increase, so too, will plastic production. At its end-of-life, some amount of plastic is mismanaged and ends up in the ocean. Once there, various environmental stresses eventually fragment plastic into microplastic pieces, now ubiquitous in the marine environment. Microplastics pose a serious threat to marine biota and possibly humans. Though the general mechanisms of microplastic formation are known, the rate and extent is not. Currently, no standard methodology for testing the formation of microplastic exists. We developed a replicable and flexible methodology for testing the formation of microplastics. We used this methodology to test the effects of UV, thermal, and mechanical stress on various types of plastic. We tested for fragmentation by measuring weight and size distribution, and looked for signs of degraded plastic using Fourier transform infrared spectroscopy. Though our results did not find any signs of fragmentation, we did see degradation. Additionally, we established a sound methodology and provided a benchmark for additional studies.

  5. PWR Facility Dose Modeling Using MCNP5 and the CADIS/ADVANTG Variance-Reduction Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blakeman, Edward D; Peplow, Douglas E.; Wagner, John C

    2007-09-01

    The feasibility of modeling a pressurized-water-reactor (PWR) facility and calculating dose rates at all locations within the containment and adjoining structures using MCNP5 with mesh tallies is presented. Calculations of dose rates resulting from neutron and photon sources from the reactor (operating and shut down for various periods) and the spent fuel pool, as well as for the photon source from the primary coolant loop, were all of interest. Identification of the PWR facility, development of the MCNP-based model and automation of the run process, calculation of the various sources, and development of methods for visually examining mesh tally files and extracting dose rates were all a significant part of the project. Advanced variance reduction, which was required because of the size of the model and the large amount of shielding, was performed via the CADIS/ADVANTG approach. This methodology uses an automatically generated three-dimensional discrete ordinates model to calculate adjoint fluxes from which MCNP weight windows and source bias parameters are generated. Investigative calculations were performed using a simple block model and a simplified full-scale model of the PWR containment, in which the adjoint source was placed in various regions. In general, it was shown that placement of the adjoint source on the periphery of the model provided adequate results for regions reasonably close to the source (e.g., within the containment structure for the reactor source). A modification to the CADIS/ADVANTG methodology was also studied in which a global adjoint source is weighted by the reciprocal of the dose response calculated by an earlier forward discrete ordinates calculation. This method showed improved results over those using the standard CADIS/ADVANTG approach, and its further investigation is recommended for future efforts.
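    In the standard CADIS formulation (textbook form, not specific to this report), the deterministic adjoint flux drives both the biased source and the weight windows, so particles are born with weights already inside their windows:

        R = \int \phi^{\dagger}(\vec{r},E)\, q(\vec{r},E)\, d\vec{r}\, dE
        \qquad \text{(estimated detector response)},

        \hat{q}(\vec{r},E) = \frac{\phi^{\dagger}(\vec{r},E)\, q(\vec{r},E)}{R}
        \qquad \text{(biased source)},

        \bar{w}(\vec{r},E) = \frac{R}{\phi^{\dagger}(\vec{r},E)}
        \qquad \text{(weight-window target)}.

    The global-response variant described in the abstract amounts to replacing the single detector response with a dose-response field and weighting the adjoint source by its reciprocal.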

  6. Air-kerma evaluation at the maze entrance of HDR brachytherapy facilities.

    PubMed

    Pujades, M C; Granero, D; Vijande, J; Ballester, F; Perez-Calatayud, J; Papagiannis, P; Siebert, F A

    2014-12-01

    In the absence of procedures for evaluating the design of brachytherapy (BT) facilities for radiation protection purposes, the methodology used for external beam radiotherapy facilities is often adapted. The purpose of this study is to adapt the NCRP 151 methodology for estimating the air-kerma rate at the door of BT facilities. The methodology was checked against Monte Carlo (MC) calculations using the code Geant4. Five different facility designs were studied for 192Ir and 60Co HDR applications to account for several different bunker layouts. For the estimation of the lead thickness needed at the door, the use of transmission data for the real spectra at the door, instead of the spectra emitted by 192Ir and 60Co, reduces the lead thickness by a factor of five for 192Ir and ten for 60Co. This significantly lightens the door and hence simplifies construction and operating requirements for all bunkers. The adaptation proposed in this study to estimate the air-kerma rate at the door depends on the complexity of the maze: it provides good results for bunkers with a maze (i.e. similar to those used for linacs, for which the NCRP 151 methodology was developed) but fails for less conventional designs. For those facilities, a specific Monte Carlo study is in order for reasons of safety and cost-effectiveness.
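    As a point of orientation only (a toy calculation, not the NCRP-style maze formalism, which adds scatter terms): the direct line-of-sight component at a door a distance d from a source of air-kerma strength S_k, behind shielding of broad-beam transmission T, is an inverse-square estimate. All values below are assumptions.

        # Direct (line-of-sight) air-kerma rate at the door; maze scatter,
        # which the adapted NCRP 151 method accounts for, is ignored here.
        S_k = 40_000.0   # air-kerma strength, uGy m^2/h (~10 Ci 192Ir, assumed)
        d = 6.0          # source-to-door distance, m (assumed)
        T = 0.1          # broad-beam transmission of the door (assumed)

        K_door = S_k * T / d**2
        print(f"K at door ~ {K_door:.0f} uGy/h")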

  7. Global trends in the incidence and prevalence of type 2 diabetes in children and adolescents: a systematic review and evaluation of methodological approaches.

    PubMed

    Fazeli Farsani, S; van der Aa, M P; van der Vorst, M M J; Knibbe, C A J; de Boer, A

    2013-07-01

    This study aimed to systematically review what has been reported on the incidence and prevalence of type 2 diabetes in children and adolescents, to scrutinise the methodological issues observed in the included studies and to prepare recommendations for future research and surveillances. PubMed, the Cochrane Database of Systematic Reviews, Scopus, EMBASE and Web of Science were searched from inception to February 2013. Population-based studies on incidence and prevalence of type 2 diabetes in children and adolescents were summarised and methodologically evaluated. Owing to substantial methodological heterogeneity and considerable differences in study populations a quantitative meta-analysis was not performed. Among 145 potentially relevant studies, 37 population-based studies met the inclusion criteria. Variations in the incidence and prevalence rates of type 2 diabetes in children and adolescents were mainly related to age of the study population, calendar time, geographical regions and ethnicity, resulting in a range of 0-330 per 100,000 person-years for incidence rates, and 0-5,300 per 100,000 population for prevalence rates. Furthermore, a substantial variation in the methodological characteristics was observed for response rates (60-96%), ascertainment rates (53-99%), diagnostic tests and criteria used to diagnose type 2 diabetes. Worldwide incidence and prevalence of type 2 diabetes in children and adolescents vary substantially among countries, age categories and ethnic groups and this can be explained by variations in population characteristics and methodological dissimilarities between studies.

  8. Expectations for methodology and translation of animal research: a survey of health care workers.

    PubMed

    Joffe, Ari R; Bara, Meredith; Anton, Natalie; Nobis, Nathan

    2015-05-07

    Health care workers (HCW) often perform, promote, and advocate the use of public funds for animal research (AR); therefore, awareness of the empirical costs and benefits of animal research is an important issue for HCW. We aimed to determine what health care workers consider acceptable standards of AR methodology and translation rates to humans. After development and validation, an e-mail survey was sent to all pediatricians and pediatric intensive care unit nurses and respiratory therapists (RTs) affiliated with a Canadian university. We presented questions about demographics, methodology of AR, and expectations from AR. Responses of pediatricians and nurses/RTs were compared using chi-square, with P < .05 considered significant. The response rate was 44/114 (39%) for pediatricians and 69/120 (58%) for nurses/RTs. Asked about methodological quality, most respondents expect that: AR is done to a high standard of quality; costs and difficulty are not acceptable justifications for low quality; findings should be reproducible between laboratories and between strains of the same species; and guidelines for AR funded with public money should be consistent with these expectations. Asked about benefits of AR, most thought that there are sometimes/often large benefits to humans from AR, and disagreed that "AR rarely produces benefit to humans." Asked about expectations of translation to humans (of toxicity, carcinogenicity, teratogenicity, and treatment findings), most: expect translation >40% of the time; thought that misleading AR results should occur <21% of the time; and would be less supportive of AR if translation occurred <20% of the time. There were few differences between pediatricians and nurses/RTs. HCW have high expectations for the methodological quality of, and the rate of translation to humans of, findings from AR. These expectations are higher than the empirical data show having been achieved. Unless these areas of AR significantly improve, HCW support of AR may be tenuous.

  9. Damage detection methodology under variable load conditions based on strain field pattern recognition using FBGs, nonlinear principal component analysis, and clustering techniques

    NASA Astrophysics Data System (ADS)

    Sierra-Pérez, Julián; Torres-Arredondo, M.-A.; Alvarez-Montoya, Joham

    2018-01-01

    Structural health monitoring consists of using sensors integrated within structures, together with algorithms, to perform load monitoring, damage detection, damage location, damage size and severity estimation, and prognosis. One possibility is to use strain sensors to infer structural integrity by comparing patterns in the strain field between the pristine and damaged conditions. In previous works, the authors have demonstrated that it is possible to detect small defects based on strain field pattern recognition by using robust machine learning techniques. They have focused on methodologies based on principal component analysis (PCA) and on the development of several unfolding and standardization techniques, which allow dealing with multiple load conditions. However, before a real implementation of this approach in engineering structures, changes in the strain field due to conditions other than damage occurrence need to be isolated. Since load conditions may vary in most engineering structures and promote significant changes in the strain field, it is necessary to implement novel techniques for uncoupling such changes from those produced by damage occurrence. A damage detection methodology based on optimal baseline selection (OBS) by means of clustering techniques is presented. The methodology includes the use of hierarchical nonlinear PCA as a nonlinear modeling technique, in conjunction with Q and nonlinear-T2 damage indices. The methodology is experimentally validated using strain measurements obtained by 32 fiber Bragg grating sensors bonded to an aluminum beam under dynamic bending loads and simultaneously submitted to variations in its pitch angle. The results demonstrated the capability of the methodology for clustering data according to 13 different load conditions (pitch angles), performing the OBS and detecting six different damages induced in a cumulative way. The proposed methodology showed a true positive rate of 100% and a false positive rate of 1.28% at a 99% confidence level.
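    The two damage indices named here are standard in PCA-based monitoring and are easy to sketch. Below, ordinary linear PCA stands in for the paper's hierarchical nonlinear PCA; Q is the squared reconstruction error and T2 is Hotelling's statistic over the retained scores. The strain data are synthetic.

        # Q (squared prediction error) and T^2 indices from a baseline PCA model.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(3)
        baseline = rng.standard_normal((200, 32)) @ rng.standard_normal((32, 32))
        pca = PCA(n_components=5).fit(baseline)   # baseline strain-field model

        def damage_indices(x):
            t = pca.transform(x)                  # scores in the PCA subspace
            x_hat = pca.inverse_transform(t)      # reconstruction
            q = ((x - x_hat) ** 2).sum(axis=1)    # Q / SPE index
            t2 = (t ** 2 / pca.explained_variance_).sum(axis=1)  # Hotelling T^2
            return q, t2

        shifted = baseline[:5] + 0.8              # a shifted strain pattern
        q, t2 = damage_indices(shifted)
        print(q.round(2), t2.round(2))            # large values flag damage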

  10. The effect of docetaxel on developing oedema in patients with breast cancer: a systematic review.

    PubMed

    Hugenholtz-Wamsteker, W; Robbeson, C; Nijs, J; Hoelen, W; Meeus, M

    2016-03-01

    Docetaxel is extensively used in chemotherapy for the treatment of breast cancer. Little attention has been given to oedema as a possible side effect of docetaxel-containing therapies, and until now no review had evaluated the magnitude of the risk of developing oedema under docetaxel-containing versus docetaxel-free therapies. In this systematic review, we investigated the risk of developing oedema in patients treated for breast cancer with or without docetaxel. We searched PubMed and Web of Knowledge for studies of breast cancer patients treated with chemotherapy containing docetaxel, and included clinical trials comparing docetaxel versus docetaxel-free chemotherapy in which oedema was reported and measured as a key outcome or an adverse effect. Methodological checklists were used to assess the risk of bias within the selected studies. Seven randomised clinical trials were included; six were of moderate methodological quality. All trials showed an increased rate of oedema in the docetaxel-treatment arm, and the trial of weakest methodological quality reported the highest incidence of oedema. The results moderately suggest that adjuvant chemotherapy containing docetaxel is related to a significantly increased risk of developing oedema, compared with docetaxel-free chemotherapy. © 2014 John Wiley & Sons Ltd.

  11. Entropy-Based Performance Analysis of Jet Engines; Methodology and Application to a Generic Single-Spool Turbojet

    NASA Astrophysics Data System (ADS)

    Abbas, Mohammad

    A recently developed methodology that provides a direct assessment of the traditional thrust-based performance of aerospace vehicles in terms of entropy generation (i.e., exergy destruction) is modified for stand-alone jet engines. The methodology is applied to a specific single-spool turbojet engine configuration. A generic compressor performance map, along with modeled engine component performance characterizations, is utilized to provide comprehensive traditional engine performance results (engine thrust, mass capture, and RPM) for on- and off-design engine operation. Details of the exergy losses in engine components, across the entire engine, and in the engine wake are provided, and the engine performance penalties associated with these losses are discussed. Results are provided across the engine operating envelope as defined by operational ranges of flight Mach number, altitude, and fuel throttle setting. The exergy destruction that occurs in the engine wake is shown to be dominant with respect to other losses, including all exergy losses that occur inside the engine. Specifically, the ratio of the exergy destruction rate in the wake to the exergy destruction rate inside the engine itself ranges from 1 to 2.5 across the operational envelope of the modeled engine.
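    The bridge between entropy generation and lost performance is the Gouy-Stodola relation (a standard result, quoted here for context rather than from the thesis): the rate of exergy destruction equals the dead-state temperature times the entropy generation rate,

        \dot{X}_{\mathrm{dest}} = T_{0}\, \dot{S}_{\mathrm{gen}},

    so summing T_0 times the entropy generation rate over the components and the wake converts the entropy audit directly into the work potential forfeited by the engine.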

  12. Research to reduce the suicide rate among older adults: methodology roadblocks and promising paradigms

    PubMed Central

    Szanto, Katalin; Lenze, Eric J.; Waern, Margda; Duberstein, Paul; Bruce, Martha L.; Epstein-Lubow, Gary; Conwell, Yeates

    2013-01-01

    The National Institute of Mental Health and the National Action Alliance for Suicide Prevention have requested input into the development of a national suicide research agenda. In response, a working group of the American Association for Geriatric Psychiatry has prepared recommendations to ensure that the suicide prevention dialogue includes older adults, a large and fast-growing population at high risk of suicide. In this Open Forum, the working group describes three methodology roadblocks to research into suicide prevention among elderly persons and three paradigms that might provide directions for future research into suicide prevention strategies for older adults. PMID:23728601

  13. Research to reduce the suicide rate among older adults: methodology roadblocks and promising paradigms.

    PubMed

    Szanto, Katalin; Lenze, Eric J; Waern, Margda; Duberstein, Paul; Bruce, Martha L; Epstein-Lubow, Gary; Conwell, Yeates

    2013-06-01

    The National Institute of Mental Health and the National Action Alliance for Suicide Prevention have requested input into the development of a national suicide research agenda. In response, a working group of the American Association for Geriatric Psychiatry has prepared recommendations to ensure that the suicide prevention dialogue includes older adults, a large and fast-growing population at high risk of suicide. In this Open Forum, the working group describes three methodology roadblocks to research into suicide prevention among elderly persons and three paradigms that might provide directions for future research into suicide prevention strategies for older adults.

  14. Use of the 2-chlorotrityl chloride resin for microwave-assisted solid phase peptide synthesis.

    PubMed

    Ieronymaki, Matthaia; Androutsou, Maria Eleni; Pantelia, Anna; Friligou, Irene; Crisp, Molly; High, Kirsty; Penkman, Kirsty; Gatos, Dimitrios; Tselios, Theodore

    2015-09-01

    A fast and efficient microwave (MW)-assisted solid-phase peptide synthesis protocol using the 2-chlorotrityl chloride resin and the Fmoc/tBu methodology has been developed. The established protocol combines the advantages of MW irradiation and the acid-labile 2-chlorotrityl chloride resin. The effect of temperature during MW irradiation, the degree of resin substitution during the coupling of the first amino acids, and the rate of racemization for each amino acid were evaluated. The suggested solid-phase methodology is applicable to orthogonal peptide synthesis and to the synthesis of cyclic peptides. © 2015 Wiley Periodicals, Inc.

  15. Novel methodology for pharmaceutical expenditure forecast

    PubMed Central

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    Background and objective: The value appreciation of new drugs across countries is undergoing a disruption that makes the historical data used for forecasting pharmaceutical expenditure poorly reliable, and forecasting methods have rarely addressed uncertainty. The objective of this project was to propose a methodology for pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the 'EU Pharmaceutical expenditure forecast'; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). Methods: 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings from generics and biosimilars; inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment; expected clinical benefits as well as commercial potential were assessed for each product by clinical experts, with inputs being development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and of new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing the uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis. Results: The methodology proved effective by 1) identifying the main parameters driving the variation in pharmaceutical expenditure forecasts across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, penetration and price discount of biosimilars, distribution chains, and the time to reach peak sales for new drugs; 2) estimating the statistical distribution of the budget impact; and 3) testing different pricing and reimbursement policy decisions on health expenditures. Conclusions: This methodology was independent of historical data and proved highly flexible, well suited to testing robustness, and able to provide probabilistic analysis to support policy decision making. PMID:27226843
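    The off-patent savings model of phase 2 is essentially multiplicative, which makes the probabilistic sensitivity analysis cheap to sketch. The toy below draws the uncertain inputs (discount, generic penetration, brand price response) from uniform ranges and reports a median and a 90% interval; every figure is invented.

        # Toy probabilistic off-patent savings model (all inputs invented).
        import random

        originator_sales = 120.0   # annual sales at patent expiry, EUR million

        def annual_savings():
            discount = random.uniform(0.3, 0.6)      # generic price discount
            penetration = random.uniform(0.4, 0.8)   # generic volume share
            brand_cut = random.uniform(0.0, 0.2)     # originator price drop
            return originator_sales * (penetration * discount
                                       + (1 - penetration) * brand_cut)

        draws = sorted(annual_savings() for _ in range(10_000))
        print(f"median {draws[5000]:.1f} MEUR, "
              f"90% interval {draws[500]:.1f}-{draws[9500]:.1f} MEUR")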

  16. Reporting and Methodology of Multivariable Analyses in Prognostic Observational Studies Published in 4 Anesthesiology Journals: A Methodological Descriptive Review.

    PubMed

    Guglielminotti, Jean; Dechartres, Agnès; Mentré, France; Montravers, Philippe; Longrois, Dan; Laouénan, Cedric

    2015-10-01

    Prognostic research studies in anesthesiology aim to identify risk factors for an outcome (explanatory studies) or to calculate the risk of this outcome on the basis of patients' risk factors (predictive studies). Multivariable models express the relationship between predictors and an outcome and are used in both explanatory and predictive studies. Model development demands a strict methodology and clear reporting so that its reliability can be assessed. In this methodological descriptive review, we critically assessed the reporting and methodology of multivariable analyses used in observational prognostic studies published in anesthesiology journals. A systematic search was conducted on Medline, through Web of Knowledge, PubMed, and journal websites, to identify observational prognostic studies with multivariable analysis published in Anesthesiology, Anesthesia & Analgesia, British Journal of Anaesthesia, and Anaesthesia in 2010 and 2011. Data were extracted by 2 independent readers. First, studies were analyzed with respect to reporting of outcomes, design, size, methods of analysis, model performance (discrimination and calibration), model validation, clinical usefulness, and the STROBE (i.e., Strengthening the Reporting of Observational Studies in Epidemiology) checklist. A reporting rate was calculated on the basis of 21 items covering the aforementioned points. Second, studies were analyzed with respect to some predefined methodological points. Eighty-six studies were included: 87.2% were explanatory, and 80.2% investigated a postoperative event. The reporting was fairly good, with a median reporting rate of 79% (75% in explanatory studies and 100% in predictive studies). Six items had a reporting rate <36% (i.e., the 25th percentile), some of them not identified in the STROBE checklist: blinded evaluation of the outcome (11.9%), reason for sample size (15.1%), handling of missing data (36.0%), assessment of colinearity (17.4%), assessment of interactions (13.9%), and calibration (34.9%). When reported, a few methodological shortcomings were observed, in both explanatory and predictive studies, such as an insufficient number of events for the outcome (44.6%), exclusion of cases with missing data (93.6%), or categorization of continuous variables (65.1%). The reporting of multivariable analyses was fairly good and could be further improved by consulting reporting guidelines and the EQUATOR Network website. Limiting the number of candidate variables, including cases with missing data, and not arbitrarily categorizing continuous variables should be encouraged.

  17. Instruments to assess patients with rotator cuff pathology: a systematic review of measurement properties.

    PubMed

    Longo, Umile Giuseppe; Saris, Daniël; Poolman, Rudolf W; Berton, Alessandra; Denaro, Vincenzo

    2012-10-01

    The aims of this study were to obtain an overview of the methodological quality of studies on the measurement properties of rotator cuff questionnaires and to describe how well various aspects of the design and statistical analyses of studies on measurement properties are performed. A systematic review of published studies on the measurement properties of rotator cuff questionnaires was performed. Two investigators independently rated the quality of the studies using the Consensus-based Standards for the selection of health Measurement Instruments checklist. This checklist was developed in an international Delphi consensus study. Sixteen studies were included, in which two measurement instruments were evaluated, namely the Western Ontario Rotator Cuff Index and the Rotator Cuff Quality-of-Life Measure. The methodological quality of the included studies was adequate for some properties (construct validity, reliability, responsiveness, internal consistency, and translation) but needs to be improved in other respects. The most important methodological aspects in need of development are measurement error, content validity, structural validity, cross-cultural validity, criterion validity, and interpretability. Considering the importance of adequate measurement properties, it is concluded that, in the field of rotator cuff pathology, there is room for improvement in the methodological quality of studies on measurement properties. Level of evidence: II.

  18. Outcome of schizophrenia: some transcultural observations with particular reference to developing countries.

    PubMed

    Kulhara, P

    1994-01-01

    The present paper provides a description of data-based and methodologically sound studies of the outcome of schizophrenia from developing and non-Western countries and compares the results. Major studies reviewed include the 2- and 5-year follow-up of the cohort of the International Pilot Study of Schizophrenia, the patients of the World Health Organization Collaborative Study on the Determinants of Outcome of Severe Mental Disorders, a few Indian studies including the study sponsored by the Indian Council of Medical Research, and some studies from Colombia and South-East Asia. The studies are compared in terms of the quality of methodology and the rate of attrition. Although the outcome criteria of these studies are not similar, it is obvious that the outcome of schizophrenia in developing countries is generally more favourable. The reasons for this are far from clear. Research concerning the issues pertaining to the better outcome of schizophrenia in developing countries in the context of socio-cultural differences is woefully lacking. This is an area that deserves research attention.

  19. Measuring self-rated health status among resettled adult refugee populations to inform practice and policy - a scoping review.

    PubMed

    Dowling, Alison; Enticott, Joanne; Russell, Grant

    2017-12-08

    The health status of refugees is a significant factor in determining their success in resettlement and relies heavily on self-rated measures of refugee health. The selection of robust and appropriate self-rated health measurement tools is challenging due to the number of tools available and the methodological variation in their use across refugee health studies. This study describes the existing self-report health measures which have been used in studies of adult refugees living in the community, to allow us to address the challenges of selecting appropriate assessments to measure health within refugee groups. The electronic databases Ovid Medline, CINAHL, Scopus and Embase were searched. This review identified 45 different self-rated health measurements in 183 studies. Most of the studies were cross-sectional explorations of the mental health status of refugees living in community settings within Western nations. A third of the tools were designed specifically for use within refugee populations. More than half of the identified measurement tools have been evaluated for reliability and/or validity within refugee populations. Much variation was found in the selection, development and testing of measurement tools across the reviewed studies. This review shows that there are currently a number of reliable and valid tools available for use in refugee health research; however, further work is required to achieve consistency in the quality and in the use of these tools. Methodological guidelines are required to assist researchers and clinicians in the development and testing of self-rated health measurement tools for use in refugee research.

  20. Invited Reaction: Outsourcing Relationships between Firms and their Training Providers--The Role of Trust

    ERIC Educational Resources Information Center

    Leimbach, Michael P.

    2005-01-01

    Outsourcing in the training and development industry has been steadily increasing and shows no indication of slowing (Sugrue & Kim, 2004). Gainey and Klaas's study shines light on the role of interfirm trust in effective outsourcing relationships. This reaction addresses a methodological question of the effect of the rating target on the results,…

  1. The Science of Learning. 2nd Edition

    ERIC Educational Resources Information Center

    Pear, Joseph J.

    2016-01-01

    For over a century and a quarter, the science of learning has expanded at an increasing rate and has achieved the status of a mature science. It has developed powerful methodologies and applications. The rise of this science has been so swift that other learning texts often overlook the fact that, like other mature sciences, the science of…

  2. Estimating Children's Soil/Dust Ingestion Rates through ...

    EPA Pesticide Factsheets

    Background: Soil/dust ingestion rates are important variables in assessing children’s health risks in contaminated environments. Current estimates are based largely on soil tracer methodology, which is limited by analytical uncertainty, small sample size, and short study duration. Objectives: The objective was to estimate site-specific soil/dust ingestion rates through reevaluation of the lead absorption dose–response relationship using new bioavailability data from the Bunker Hill Mining and Metallurgical Complex Superfund Site (BHSS) in Idaho, USA. Methods: The U.S. Environmental Protection Agency (EPA) in vitro bioavailability methodology was applied to archived BHSS soil and dust samples. Using age-specific biokinetic slope factors, we related bioavailable lead from these sources to children’s blood lead levels (BLLs) monitored during cleanup from 1988 through 2002. Quantitative regression analyses and exposure assessment guidance were used to develop candidate soil/dust source partition scenarios estimating lead intake, allowing estimation of age-specific soil/dust ingestion rates. These ingestion rate and bioavailability estimates were simultaneously applied to the U.S. EPA Integrated Exposure Uptake Biokinetic Model for Lead in Children to determine those combinations best approximating observed BLLs. Results: Absolute soil and house dust bioavailability averaged 33% (SD ± 4%) and 28% (SD ± 6%), respectively. Estimated BHSS age-specific soil/du

  3. Pressure-based high-order TVD methodology for dynamic stall control

    NASA Astrophysics Data System (ADS)

    Yang, H. Q.; Przekwas, A. J.

    1992-01-01

    The quantitative prediction of the dynamics of separating unsteady flows, such as dynamic stall, is of crucial importance. This six-month SBIR Phase 1 study developed several new pressure-based methodologies for solving the 3D Navier-Stokes equations in both stationary and moving (body-conforming) coordinates. The present pressure-based algorithm is equally efficient for low speed incompressible flows and high speed compressible flows. The discretization of convective terms by the newly developed high-order TVD schemes requires no artificial dissipation and can properly resolve the concentrated vortices in the wing-body flow field with minimum numerical diffusion. It is demonstrated that the proposed Newton's iteration technique not only increases the convergence rate but also strongly couples the iteration between pressure and velocities. The proposed hyperbolization of the pressure correction equation is shown to increase the solver's efficiency. The proposed methodologies were implemented in an existing CFD code, REFLEQS. The modified code was used to simulate both static and dynamic stall on two- and three-dimensional wing-body configurations. Three-dimensional effects and flow physics are discussed.
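
    To make the role of the TVD discretization concrete, here is a minimal, generic minmod-limited advection step in Python; it is a textbook TVD scheme for 1D linear advection under assumed periodic boundaries, not the REFLEQS implementation.

```python
# A generic minmod-limited TVD step for 1D linear advection (a > 0), with
# periodic boundaries; a textbook illustration of TVD convective
# discretization, not the REFLEQS scheme.
import numpy as np

def minmod(a, b):
    """Minmod slope limiter: zero at extrema, the smaller slope otherwise."""
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def tvd_advect(u, c):
    """One explicit step of u_t + a*u_x = 0, with Courant number c = a*dt/dx."""
    slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
    u_face = u + 0.5 * (1 - c) * slope            # limited face value at i+1/2
    return u - c * (u_face - np.roll(u_face, 1))

x = np.linspace(0.0, 1.0, 200, endpoint=False)
u = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)     # square wave: sharp fronts
for _ in range(100):
    u = tvd_advect(u, c=0.5)                      # advances without new over/undershoots
```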

  4. Reporting Items for Updated Clinical Guidelines: Checklist for the Reporting of Updated Guidelines (CheckUp)

    PubMed Central

    Vernooij, Robin W. M.; Alonso-Coello, Pablo; Brouwers, Melissa

    2017-01-01

    Background Scientific knowledge is in constant development. Consequently, regular review to assure the trustworthiness of clinical guidelines is required. However, there is still a lack of preferred reporting items of the updating process in updated clinical guidelines. The present article describes the development process of the Checklist for the Reporting of Updated Guidelines (CheckUp). Methods and Findings We developed an initial list of items based on an overview of research evidence on clinical guideline updating, the Appraisal of Guidelines for Research and Evaluation (AGREE) II Instrument, and the advice of the CheckUp panel (n = 33 professionals). A multistep process was used to refine this list, including an assessment of ten existing updated clinical guidelines, interviews with key informants (response rate: 54.2%; 13/24), a three-round Delphi consensus survey with the CheckUp panel (33 participants), and an external review with clinical guideline methodologists (response rate: 90%; 53/59) and users (response rate: 55.6%; 10/18). CheckUp includes 16 items that address (1) the presentation of an updated guideline, (2) editorial independence, and (3) the methodology of the updating process. In this article, we present the methodology to develop CheckUp and include as a supplementary file an explanation and elaboration document. Conclusions CheckUp can be used to evaluate the completeness of reporting in updated guidelines and as a tool to inform guideline developers about reporting requirements. Editors may request its completion from guideline authors when submitting updated guidelines for publication. Adherence to CheckUp will likely enhance the comprehensiveness and transparency of clinical guideline updating for the benefit of patients and the public, health care professionals, and other relevant stakeholders. PMID:28072838

  5. Methods for Measuring Specific Rates of Mercury Methylation and Degradation and Their Use in Determining Factors Controlling Net Rates of Mercury Methylation

    PubMed Central

    Ramlal, Patricia S.; Rudd, John W. M.; Hecky, Robert E.

    1986-01-01

    A method was developed to estimate specific rates of demethylation of methyl mercury in aquatic samples by measuring the volatile ¹⁴C end products of ¹⁴CH₃HgI demethylation. This method was used in conjunction with a ²⁰³Hg²⁺ radiochemical method which determines specific rates of mercury methylation. Together, these methods enabled us to examine some factors controlling the net rate of mercury methylation. The methodologies were field tested, using lake sediment samples from a recently flooded reservoir in the Southern Indian Lake system which had developed a mercury contamination problem in fish. Ratios of the specific rates of methylation/demethylation were calculated. The highest ratios of methylation/demethylation occurred in the flooded shorelines of Southern Indian Lake. These results provide an explanation for the observed increases in the methyl mercury concentrations in fish after flooding. PMID:16346959

  6. Towards Comprehensive Variation Models for Designing Vehicle Monitoring Systems

    NASA Technical Reports Server (NTRS)

    McAdams, Daniel A.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    When designing vehicle vibration monitoring systems for aerospace devices, it is common to use well-established models of vibration features to determine whether failures or defects exist. Most of the algorithms used for failure detection rely on these models to detect significant changes in a flight environment. In actual practice, however, most vehicle vibration monitoring systems are corrupted by high rates of false alarms and missed detections. This crucial roadblock makes their implementation in real vehicles (e.g., helicopter transmissions and aircraft engines) difficult, making their operation costly and unreliable. Research conducted at the NASA Ames Research Center has determined that a major reason for the high rates of false alarms and missed detections is the numerous sources of statistical variations that are not taken into account in the modeling assumptions. In this paper, we address one such source of variations, namely, those caused during the design and manufacturing of rotating machinery components that make up aerospace systems. We present a novel way of modeling the vibration response by including design variations via probabilistic methods. Using such models, we develop a methodology to account for design and manufacturing variations, and explore the changes in the vibration response to determine its stochastic nature. We explore the potential of the methodology using a nonlinear cam-follower model, where the spring stiffness values are assumed to follow a normal distribution. The results demonstrate initial feasibility of the method, showing great promise in developing a general methodology for designing more accurate aerospace vehicle monitoring systems.
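
    A minimal sketch of the probabilistic idea under assumed values: spring stiffness is sampled from a normal distribution and the resulting spread in a linearized natural frequency is inspected. The mass, mean stiffness and standard deviation are placeholders, not values from the cam-follower study.

```python
# Propagating an assumed manufacturing variation through a simple vibration
# model: stiffness ~ Normal, inspect the spread of the linearized natural
# frequency. All values are placeholders, not from the cam-follower study.
import numpy as np

rng = np.random.default_rng(0)
m = 0.5                                               # follower mass, kg (assumed)
k = rng.normal(loc=2.0e4, scale=1.0e3, size=10_000)   # stiffness, N/m (assumed ~5% sd)

f_n = np.sqrt(k / m) / (2 * np.pi)                    # natural frequency, Hz
print(f"f_n: mean = {f_n.mean():.1f} Hz, sd = {f_n.std():.1f} Hz")
```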

  7. A frontier analysis approach for benchmarking hospital performance in the treatment of acute myocardial infarction.

    PubMed

    Stanford, Robert E

    2004-05-01

    This paper uses a non-parametric frontier model and adaptations of the concepts of cross-efficiency and peer-appraisal to develop a formal methodology for benchmarking provider performance in the treatment of Acute Myocardial Infarction (AMI). Parameters used in the benchmarking process are the rates of proper recognition of indications of six standard treatment processes for AMI; the decision making units (DMUs) to be compared are the Medicare eligible hospitals of a particular state; the analysis produces an ordinal ranking of individual hospital performance scores. The cross-efficiency/peer-appraisal calculation process is constructed to accommodate DMUs that experience no patients in some of the treatment categories. While continuing to rate highly the performances of DMUs which are efficient in the Pareto-optimal sense, our model produces individual DMU performance scores that correlate significantly with good overall performance, as determined by a comparison of the sums of the individual DMU recognition rates for the six standard treatment processes. The methodology is applied to data collected from 107 state Medicare hospitals.

  8. Eutrophication risk assessment in coastal embayments using simple statistical models.

    PubMed

    Arhonditsis, G; Eleftheriadou, M; Karydis, M; Tsirtsis, G

    2003-09-01

    A statistical methodology is proposed for assessing the risk of eutrophication in marine coastal embayments. The procedure followed was the development of regression models relating the levels of chlorophyll a (Chl) with the concentration of the limiting nutrient--usually nitrogen--and the renewal rate of the systems. The method was applied in the Gulf of Gera, Island of Lesvos, Aegean Sea and a surrogate for renewal rate was created using the Canberra metric as a measure of the resemblance between the Gulf and the oligotrophic waters of the open sea in terms of their physical, chemical and biological properties. The Chl-total dissolved nitrogen-renewal rate regression model was the most significant, accounting for 60% of the variation observed in Chl. Predicted distributions of Chl for various combinations of the independent variables, based on Bayesian analysis of the models, enabled comparison of the outcomes of specific scenarios of interest as well as further analysis of the system dynamics. The present statistical approach can be used as a methodological tool for testing the resilience of coastal ecosystems under alternative managerial schemes and levels of exogenous nutrient loading.
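
    The core of the approach can be sketched as an ordinary least-squares fit of Chl on nitrogen and a renewal-rate surrogate, with the Canberra metric used for the open-sea resemblance; the data below are synthetic placeholders, not Gulf of Gera measurements.

```python
# OLS fit of chlorophyll a on TDN and a renewal-rate surrogate, plus a toy
# Canberra resemblance calculation; all data are synthetic placeholders.
import numpy as np
from scipy.spatial.distance import canberra

# Synthetic observations: columns are TDN (ug/L) and a renewal-rate surrogate
X = np.array([[80, 0.2], [120, 0.3], [150, 0.5], [200, 0.6], [260, 0.8]], float)
chl = np.array([1.1, 1.8, 2.0, 3.2, 3.9])          # chlorophyll a (ug/L)

A = np.column_stack([np.ones(len(X)), X])          # intercept + predictors
coef, *_ = np.linalg.lstsq(A, chl, rcond=None)
pred = A @ coef
r2 = 1 - ((chl - pred) ** 2).sum() / ((chl - chl.mean()) ** 2).sum()
print(f"coefficients: {coef}, R^2 = {r2:.2f}")

# Renewal surrogate idea: Canberra resemblance between a bay sample's
# property profile and an open-sea reference profile (toy numbers)
print(canberra([80.0, 2.1, 0.5], [20.0, 0.3, 0.1]))
```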

  9. Methodological variations and their effects on reported medication administration error rates.

    PubMed

    McLeod, Monsey Chan; Barber, Nick; Franklin, Bryony Dean

    2013-04-01

    Medication administration errors (MAEs) are a problem, yet methodological variation between studies presents a potential barrier to understanding how best to increase safety. Using the UK as a case-study, we systematically summarised methodological variations in MAE studies, and their effects on reported MAE rates. Nine healthcare databases were searched for quantitative observational MAE studies in UK hospitals. Methodological variations were analysed and meta-analysis of MAE rates performed using studies that used the same definitions. Odds ratios (OR) were calculated to compare MAE rates between intravenous (IV) and non-IV doses, and between paediatric and adult doses. We identified 16 unique studies reporting three MAE definitions, 44 MAE subcategories and four different denominators. Overall adult MAE rates were 5.6% of a total of 21 533 non-IV opportunities for error (OE) (95% CI 4.6% to 6.7%) and 35% of a total of 154 IV OEs (95% CI 2% to 68%). MAEs were five times more likely in IV than non-IV doses (pooled OR 5.1; 95% CI 3.5 to 7.5). Including timing errors of ±30 min increased the MAE rate from 27% to 69% of 320 IV doses in one study. Five studies were unclear as to whether the denominator included dose omissions; omissions accounted for 0%-13% of IV doses and 1.8%-5.1% of non-IV doses. Wide methodological variations exist even within one country, some with significant effects on reported MAE rates. We have made recommendations for future MAE studies; these may be applied both within and outside the UK.
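
    For readers unfamiliar with the odds-ratio arithmetic, the sketch below computes an OR with a Woolf (log-method) 95% CI from a single 2x2 table; the counts are hypothetical, and the review's pooled OR of 5.1 came from combining several studies rather than one crude table.

```python
# Odds ratio with a Woolf (log-method) 95% CI from one 2x2 table; counts are
# hypothetical, not the review's raw data.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a,b = MAEs/error-free doses in group 1; c,d = the same in group 2."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = (math.exp(math.log(or_) + s * z * se_log) for s in (-1, 1))
    return or_, lo, hi

# e.g., 35 MAEs in 100 IV doses vs. 10 MAEs in 100 non-IV doses (hypothetical)
print(odds_ratio_ci(35, 65, 10, 90))
```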

  10. Problem posing and cultural tailoring: developing an HIV/AIDS health literacy toolkit with the African American community.

    PubMed

    Rikard, R V; Thompson, Maxine S; Head, Rachel; McNeil, Carlotta; White, Caressa

    2012-09-01

    The rate of HIV infection among African Americans is disproportionately higher than for other racial groups in the United States. Previous research suggests that low level of health literacy (HL) is an underlying factor to explain racial disparities in the prevalence and incidence of HIV/AIDS. The present research describes a community and university project to develop a culturally tailored HIV/AIDS HL toolkit in the African American community. Paulo Freire's pedagogical philosophy and problem-posing methodology served as the guiding framework throughout the development process. Developing the HIV/AIDS HL toolkit occurred in a two-stage process. In Stage 1, a nonprofit organization and research team established a collaborative partnership to develop a culturally tailored HIV/AIDS HL toolkit. In Stage 2, African American community members participated in focus groups conducted as Freirian cultural circles to further refine the HIV/AIDS HL toolkit. In both stages, problem posing engaged participants' knowledge, experiences, and concerns to evaluate a working draft toolkit. The discussion and implications highlight how Freire's pedagogical philosophy and methodology enhances the development of culturally tailored health information.

  11. Collecting standardized urban health indicator data at an individual level for school-aged children living in urban areas: methods from EURO-URHIS 2.

    PubMed

    Pope, D; Katreniak, Z; Guha, J; Puzzolo, E; Higgerson, J; Steels, S; Woode-Owusu, M; Bruce, N; Birt, Christopher A; Ameijden, E van; Verma, A

    2017-05-01

    Measuring health and its determinants in urban populations is essential to effectively develop public health policies maximizing health gain within this context. Adolescents are important in this regard given that the origins of leading causes of morbidity and mortality develop pre-adulthood. Obtaining comprehensive, accurate and comparable information on adolescent urban health indicators from heterogeneous urban contexts is an important challenge. EURO-URHIS 2 aimed to develop standardized tools and methodologies for collecting data from adolescents across heterogeneous European urban contexts. Questionnaires were developed through (i) comprehensive assessment of urban health indicators from 7 pre-defined domains, (ii) use of previously validated questions from a literature review and other European surveys, (iii) translation/back-translation into European languages and (iv) piloting. Urban area-specific data collection methodologies were established through literature review, consultation and piloting. School-based surveys of 14-16-year olds (400-800 per urban area) were conducted in 13 European countries (33 urban areas). Participation rates were high (80-100%) for students from schools taking part in the surveys from all urban areas, and data quality was generally good (low rates of missing/spoiled data). Overall, 13 850 questionnaires were collected, coded and entered for EURO-URHIS 2. Dissemination included production of urban area health profiles (allowing benchmarking for a number of important public health indicators in young people) and use of visualization tools as part of the EURO-URHIS 2 project. EURO-URHIS 2 has developed standardized survey tools and methodologies for assessing key measures of health and its determinants in adolescents from heterogeneous urban contexts and has demonstrated the utility of these data to public health practitioners and policy makers. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  12. Abort Trigger False Positive and False Negative Analysis Methodology for Threshold-Based Abort Detection

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Cruz, Jose A.; Johnson, Stephen B.; Lo, Yunnhon

    2015-01-01

    This paper describes a quantitative methodology for bounding the false positive (FP) and false negative (FN) probabilities associated with a human-rated launch vehicle abort trigger (AT) that includes sensor data qualification (SDQ). In this context, an AT is a hardware and software mechanism designed to detect the existence of a specific abort condition. Also, SDQ is an algorithmic approach used to identify sensor data suspected of being corrupt so that suspect data does not adversely affect an AT's detection capability. The FP and FN methodologies presented here were developed to support estimation of the probabilities of loss of crew and loss of mission for the Space Launch System (SLS) which is being developed by the National Aeronautics and Space Administration (NASA). The paper provides a brief overview of system health management as being an extension of control theory; and describes how ATs and the calculation of FP and FN probabilities relate to this theory. The discussion leads to a detailed presentation of the FP and FN methodology and an example showing how the FP and FN calculations are performed. This detailed presentation includes a methodology for calculating the change in FP and FN probabilities that result from including SDQ in the AT architecture. To avoid proprietary and sensitive data issues, the example incorporates a mixture of open literature and fictitious reliability data. Results presented in the paper demonstrate the effectiveness of the approach in providing quantitative estimates that bound the probability of a FP or FN abort determination.
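
    A stripped-down version of the threshold calculation, under the assumption of Gaussian sensor readings: the false-positive probability is the nominal distribution's tail beyond the abort threshold, and the false-negative probability is the abort-condition distribution's mass below it. All numbers are illustrative; the SLS methodology additionally models SDQ and sensor redundancy.

```python
# FP/FN probabilities for a single threshold-based abort trigger, assuming
# Gaussian sensor readings; all numbers below are illustrative.
from scipy.stats import norm

mu_nominal, mu_abort, sigma = 100.0, 140.0, 8.0  # assumed reading model
threshold = 120.0                                # assumed abort threshold

p_fp = norm.sf(threshold, loc=mu_nominal, scale=sigma)   # triggers with no abort condition
p_fn = norm.cdf(threshold, loc=mu_abort, scale=sigma)    # misses a real abort condition
print(f"P(FP) = {p_fp:.3e}, P(FN) = {p_fn:.3e}")
```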

  13. Improving default risk prediction using Bayesian model uncertainty techniques.

    PubMed

    Kazemi, Reza; Mosleh, Ali

    2012-11-01

    Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. The potential of exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century. They provide their assessment of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis. © 2012 Society for Risk Analysis.
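
    One simple way to picture the pooling idea (a deliberate simplification of the article's full Bayesian framework): weight each agency's default-probability estimate by an assumed historical-accuracy score, then express the uncertainty with a matched Beta distribution. Every number below, including the effective-sample-size scaling, is an assumption for illustration.

```python
# Accuracy-weighted pooling of agency default-probability estimates, with a
# matched Beta distribution for uncertainty; a simplification for illustration.
import numpy as np

pd_estimates = np.array([0.020, 0.035, 0.028])  # agency one-year PD estimates
accuracy = np.array([0.9, 0.6, 0.75])           # hypothetical past-performance scores

w = accuracy / accuracy.sum()
pooled_pd = float(w @ pd_estimates)             # linear opinion pool

# Beta posterior matched to the pooled mean, with an effective sample size
# scaled by total accuracy (an assumption, not the article's derivation)
n_eff = 200 * accuracy.sum()
alpha, beta = pooled_pd * n_eff, (1 - pooled_pd) * n_eff
samples = np.random.default_rng(2).beta(alpha, beta, 10_000)
print(pooled_pd, np.percentile(samples, [2.5, 97.5]))
```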

  14. Indicators of Student Flow Rates in Honduras: An Assessment of an Alternative Methodology, with Two Methodologies for Estimating Student Flow Rates. BRIDGES Research Report No. 6.

    ERIC Educational Resources Information Center

    Cuadra, Ernesto; Crouch, Luis

    Student promotion, repetition, and dropout rates constitute the basic data needed to forecast future enrollment and new resources. Information on student flow is significantly related to policy formulation aimed at improving internal efficiency, because dropping out and grade repetition increase per pupil cost, block access to eligible school-age…

  15. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review

    PubMed Central

    Pollard, Beth; Johnston, Marie; Dixon, Diane

    2007-01-01

    Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of its i) theoretical framework (was there a definition of what was being assessed and was it part of a theoretical model?) and ii) methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest), and no measure assessed was part of a theoretical model. None of the clinician report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or the scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of a theoretical framework for both clinician report and patient self-report measures. It also drew attention to the wide variation in the methodological development of commonly used measures in OA. While, in general, the patient self-report measures had good methodological development, the clinician report measures appeared less well developed. It would be of value if new measures defined the construct of interest and if that construct were part of a theoretical model. By ensuring that measures are both theoretically and empirically valid, improvements in subjective health outcome measures should be possible. PMID:17343739

  16. Short-term forecasting of turbidity in trunk main networks.

    PubMed

    Meyers, Gregory; Kapelan, Zoran; Keedwell, Edward

    2017-11-01

    Water discolouration is an increasingly important and expensive issue due to rising customer expectations, tighter regulatory demands and ageing Water Distribution Systems (WDSs) in the UK and abroad. This paper presents a new turbidity forecasting methodology capable of aiding operational staff and enabling proactive management strategies. The turbidity forecasting methodology developed here is completely data-driven and does not require a hydraulic or water quality network model, which would be expensive to build and maintain. The methodology is tested and verified on a real trunk main network with observed turbidity measurement data. Results obtained show that the methodology can detect whether discolouration material is mobilised, estimate whether sufficient turbidity will be generated to exceed a preselected threshold, and approximate how long the material will take to reach the downstream meter. Classification-based forecasts of turbidity can be reliably made up to 5 h ahead, although at the expense of increased false alarm rates. The methodology presented here could be used as an early warning system enabling a multitude of cost-beneficial proactive management strategies to be implemented as an alternative to expensive trunk mains cleaning programs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Underlying theory of actuarial analyses.

    PubMed

    Benjamin, B

    1985-05-01

    Developments in the theory governing the calculation of mortality rates for use in survival measurement are reviewed, from the initial basic concept of exposure to risk through the later introduction of stochastic elements. I have indicated the way in which actuaries and statisticians who work closely with those in the fields of medicine and biology have, by the exchange of methodologic ideas, come to an identity of approach. Recent new actuarial work and likely future developments in actuarial interests are reviewed.

  18. Characterization of a mine fire using atmospheric monitoring system sensor data.

    PubMed

    Yuan, L; Thomas, R A; Zhou, L

    2017-06-01

    Atmospheric monitoring systems (AMS) have been widely used in underground coal mines in the United States for the detection of fire in the belt entry and the monitoring of other ventilation-related parameters such as airflow velocity and methane concentration in specific mine locations. In addition to an AMS being able to detect a mine fire, the AMS data have the potential to provide fire characteristic information such as fire growth - in terms of heat release rate - and exact fire location. Such information is critical in making decisions regarding fire-fighting strategies, underground personnel evacuation and optimal escape routes. In this study, a methodology was developed to calculate the fire heat release rate using AMS sensor data for carbon monoxide concentration, carbon dioxide concentration and airflow velocity based on the theory of heat and species transfer in ventilation airflow. Full-scale mine fire experiments were then conducted in the Pittsburgh Mining Research Division's Safety Research Coal Mine using an AMS with different fire sources. Sensor data collected from the experiments were used to calculate the heat release rates of the fires using this methodology. The calculated heat release rate was compared with the value determined from the mass loss rate of the combustible material using a digital load cell. The experimental results show that the heat release rate of a mine fire can be calculated using AMS sensor data with reasonable accuracy.
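
    The species-based estimate can be sketched as follows: the CO2 rise across the fire, multiplied by the volumetric airflow, gives a CO2 generation rate, which a fuel-dependent constant converts to heat. The 13.3 MJ per kg of CO2, the gas density and the sensor readings below are all illustrative assumptions, not the paper's calibrated relation.

```python
# Species-based heat release rate estimate from AMS-style readings; the
# energy constant, density and readings are illustrative assumptions.
def heat_release_rate(co2_up_ppm, co2_down_ppm, velocity, area,
                      e_co2=13.3e6, rho_co2=1.83):
    """Return an approximate HRR in watts.
    co2_*_ppm : CO2 concentration up/downstream of the fire (ppm by volume)
    velocity  : airflow velocity (m/s); area: entry cross-section (m^2)
    e_co2     : heat released per kg of CO2 generated (J/kg, fuel dependent)
    rho_co2   : CO2 density at assumed mine conditions (kg/m^3)
    """
    q_air = velocity * area                          # volumetric airflow, m^3/s
    delta_x = (co2_down_ppm - co2_up_ppm) * 1e-6     # CO2 volume-fraction rise
    m_dot_co2 = delta_x * q_air * rho_co2            # CO2 generation rate, kg/s
    return m_dot_co2 * e_co2

print(f"{heat_release_rate(600, 1400, 1.5, 10.0) / 1e3:.0f} kW")
```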

  19. Methodological Quality of National Guidelines for Pediatric Inpatient Conditions

    PubMed Central

    Hester, Gabrielle; Nelson, Katherine; Mahant, Sanjay; Eresuma, Emily; Keren, Ron; Srivastava, Rajendu

    2014-01-01

    Background Guidelines help inform standardization of care for quality improvement (QI). The Pediatric Research in Inpatient Settings (PRIS) network published a prioritization list of inpatient conditions with high prevalence, cost, and variation in resource utilization across children’s hospitals. The methodological quality of guidelines for priority conditions is unknown. Objective To rate the methodological quality of national guidelines for 20 priority pediatric inpatient conditions. Design We searched sources including PubMed for national guidelines published 2002–2012. Guidelines specific to one organism, test or treatment, or institution were excluded. Guidelines were rated by two raters using a validated tool (AGREE II) with an overall rating on a 7-point scale (7 = highest). Inter-rater reliability was measured with a weighted kappa coefficient. Results 17 guidelines met inclusion criteria for 13 conditions; 7 conditions yielded no relevant national guidelines. The highest methodological quality guidelines were for asthma, tonsillectomy, and bronchiolitis (mean overall ratings 7, 6.5, and 6.5, respectively); the lowest were for sickle cell disease (2 guidelines) and dental caries (mean overall ratings 4, 3.5, and 3, respectively). The overall weighted kappa was 0.83 (95% confidence interval 0.78–0.87). Conclusions We identified a group of moderate to high methodological quality national guidelines for priority pediatric inpatient conditions. Hospitals should consider these guidelines to inform QI initiatives. PMID:24677729
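
    For reference, the agreement statistic reported above can be computed as below; the ratings are fabricated, and quadratic weighting is an assumption, since the paper specifies only that a weighted kappa was used.

```python
# Weighted Cohen's kappa between two raters' 7-point overall ratings; the
# ratings are fabricated and the quadratic weighting is an assumption.
from sklearn.metrics import cohen_kappa_score

rater1 = [7, 6, 7, 4, 3, 2, 5, 6, 4, 3, 7, 6, 5, 4, 2, 3, 6]
rater2 = [7, 7, 6, 4, 3, 2, 5, 5, 4, 4, 7, 6, 5, 3, 2, 3, 6]
print(cohen_kappa_score(rater1, rater2, weights="quadratic"))
```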

  20. Cross Time-Frequency Analysis for Combining Information of Several Sources: Application to Estimation of Spontaneous Respiratory Rate from Photoplethysmography

    PubMed Central

    Peláez-Coca, M. D.; Orini, M.; Lázaro, J.; Bailón, R.; Gil, E.

    2013-01-01

    A methodology that combines information from several nonstationary biological signals is presented. This methodology is based on time-frequency coherence, that quantifies the similarity of two signals in the time-frequency domain. A cross time-frequency analysis method, based on quadratic time-frequency distribution, has been used for combining information of several nonstationary biomedical signals. In order to evaluate this methodology, the respiratory rate from the photoplethysmographic (PPG) signal is estimated. The respiration provokes simultaneous changes in the pulse interval, amplitude, and width of the PPG signal. This suggests that the combination of information from these sources will improve the accuracy of the estimation of the respiratory rate. Another target of this paper is to implement an algorithm which provides a robust estimation. Therefore, respiratory rate was estimated only in those intervals where the features extracted from the PPG signals are linearly coupled. In 38 spontaneous breathing subjects, among which 7 were characterized by a respiratory rate lower than 0.15 Hz, this methodology provided accurate estimates, with the median error {0.00; 0.98} mHz ({0.00; 0.31}%) and the interquartile range error {4.88; 6.59} mHz ({1.60; 1.92}%). The estimation error of the presented methodology was largely lower than the estimation error obtained without combining different PPG features related to respiration. PMID:24363777
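
    As a rough stand-in for the paper's quadratic time-frequency method, the sketch below estimates a respiratory rate from the ordinary (stationary) magnitude-squared coherence between two synthetic respiration-modulated PPG features; the signal parameters and the 4 Hz feature sampling rate are assumptions.

```python
# Respiratory-rate estimate from the coherence between two synthetic,
# respiration-modulated PPG features; Welch coherence stands in for the
# paper's quadratic time-frequency method, and all parameters are assumed.
import numpy as np
from scipy.signal import coherence

fs = 4.0                                   # feature series sampling rate, Hz (assumed)
t = np.arange(0.0, 300.0, 1.0 / fs)
resp = np.sin(2 * np.pi * 0.25 * t)        # 0.25 Hz breathing component
rng = np.random.default_rng(4)
amp = resp + 0.5 * rng.standard_normal(t.size)    # pulse-amplitude feature
width = resp + 0.5 * rng.standard_normal(t.size)  # pulse-width feature

f, cxy = coherence(amp, width, fs=fs, nperseg=256)
band = (f >= 0.05) & (f <= 1.0)            # plausible respiratory band (assumed)
print(f"respiratory rate estimate: {f[band][np.argmax(cxy[band])]:.2f} Hz")
```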

  1. Direct access to dithiobenzoate RAFT agent fragmentation rate coefficients by ESR spin-trapping.

    PubMed

    Ranieri, Kayte; Delaittre, Guillaume; Barner-Kowollik, Christopher; Junkers, Thomas

    2014-12-01

    The β-scission rate coefficient of tert-butyl radicals fragmenting off the intermediate resulting from their addition to tert-butyl dithiobenzoate, a reversible addition-fragmentation chain transfer (RAFT) agent, is estimated via the recently introduced electron spin resonance (ESR) spin-trapping methodology as a function of temperature. The newly introduced ESR-trapping methodology is critically evaluated and found to be reliable. At 20 °C, a fragmentation rate coefficient of close to 0.042 s⁻¹ is observed, whereas the activation parameters for the fragmentation reaction, determined for the first time, read EA = 82 ± 13.3 kJ mol⁻¹ and A = (1.4 ± 0.25) × 10¹³ s⁻¹. The ESR spin-trapping methodology thus efficiently probes the stability of the RAFT adduct radical under conditions relevant for the pre-equilibrium of the RAFT process. It particularly indicates that stable RAFT adduct radicals are indeed formed in the early stages of RAFT polymerization, at least when dithiobenzoates are employed as controlling agents, as stipulated by the so-called slow fragmentation theory. By design of the methodology, the obtained fragmentation rate coefficients represent an upper limit. The ESR spin-trapping methodology is thus seen as a suitable tool for evaluating the fragmentation rate coefficients of a wide range of RAFT adduct radicals. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
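
    The reported activation parameters can be checked for consistency with the 20 °C rate coefficient via the Arrhenius equation k = A·exp(-EA/RT), using the values quoted above:

```python
# Arrhenius consistency check of the reported activation parameters against
# the 20 degC fragmentation rate coefficient (values from the abstract).
import math

A = 1.4e13    # pre-exponential factor, 1/s
Ea = 82e3     # activation energy, J/mol
R = 8.314     # gas constant, J/(mol K)
T = 293.15    # 20 degC in kelvin

k = A * math.exp(-Ea / (R * T))
# ~0.034 1/s, consistent with the reported ~0.042 1/s within the stated uncertainties
print(f"k(20 degC) = {k:.3f} s^-1")
```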

  2. Child Mortality Estimation: Accelerated Progress in Reducing Global Child Mortality, 1990–2010

    PubMed Central

    Hill, Kenneth; You, Danzhen; Inoue, Mie; Oestergaard, Mikkel Z.; Hill, Kenneth; Alkema, Leontine; Cousens, Simon; Croft, Trevor; Guillot, Michel; Pedersen, Jon; Walker, Neff; Wilmoth, John; Jones, Gareth

    2012-01-01

    Monitoring development indicators has become a central interest of international agencies and countries for tracking progress towards the Millennium Development Goals. In this review, which also provides an introduction to a collection of articles, we describe the methodology used by the United Nations Inter-agency Group for Child Mortality Estimation to track country-specific changes in the key indicator for Millennium Development Goal 4 (MDG 4), the decline of the under-five mortality rate (the probability of dying between birth and age five, also denoted in the literature as U5MR and ₅q₀). We review how relevant data from civil registration, sample registration, population censuses, and household surveys are compiled and assessed for United Nations member states, and how time series regression models are fitted to all points of acceptable quality to establish the trends in U5MR from which infant and neonatal mortality rates are generally derived. The application of this methodology indicates that, between 1990 and 2010, the global U5MR fell from 88 to 57 deaths per 1,000 live births, and the annual number of under-five deaths fell from 12.0 to 7.6 million. Although the annual rate of reduction in the U5MR accelerated from 1.9% for the period 1990–2000 to 2.5% for the period 2000–2010, it remains well below the 4.4% annual rate of reduction required to achieve the MDG 4 goal of a two-thirds reduction in U5MR from its 1990 value by 2015. Thus, despite progress in reducing child mortality worldwide, and an encouraging increase in the pace of decline over the last two decades, MDG 4 will not be met without greatly increasing efforts to reduce child deaths. PMID:22952441
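
    The annual rate of reduction figures quoted above follow from assuming exponential decline, ARR = -ln(U5MR_end / U5MR_start) / years, as the short check below shows using the abstract's own numbers:

```python
# Annual rate of reduction (ARR) implied by two U5MR values, assuming
# exponential decline; inputs are the figures quoted in the abstract.
import math

def arr(u5mr_start, u5mr_end, years):
    """Average annual rate of reduction under exponential decline."""
    return -math.log(u5mr_end / u5mr_start) / years

print(f"1990-2010 average ARR: {arr(88, 57, 20):.1%}")       # ~2.2%/yr
print(f"ARR required for MDG 4: {arr(88, 88 / 3, 25):.1%}")  # two-thirds cut by 2015: ~4.4%/yr
```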

  3. A Quantitative Examination of Critical Success Factors Comparing Agile and Waterfall Project Management Methodologies

    ERIC Educational Resources Information Center

    Pedersen, Mitra

    2013-01-01

    This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires use of project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…

  4. 77 FR 59348 - Revisions to Page 700 of FERC Form No. 6

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-27

    .... The components of an oil pipeline's rate base are governed by the Trended Original Cost Methodology... ratemaking methodology to the Trended Original Cost methodology as adopted in Opinion 154-B. The SRB was to... trended original cost methodology divides the nominal return on equity component of the cost of service...

  5. Bioremediation of chlorpyrifos contaminated soil by two phase bioslurry reactor: Processes evaluation and optimization by Taguchi's design of experimental (DOE) methodology.

    PubMed

    Pant, Apourv; Rai, J P N

    2018-04-15

    A two-phase bioslurry reactor was constructed, designed and developed to evaluate chlorpyrifos remediation. Six biotic and abiotic factors (substrate-loading rate, slurry phase pH, slurry phase dissolved oxygen (DO), soil water ratio, temperature and soil microflora load) were evaluated by a design of experiments (DOE) methodology employing Taguchi's orthogonal array (OA). The selected six factors were considered at two levels in an L-8 array (2^7, 15 experiments) in the experimental design. The optimum operating conditions obtained from the methodology enhanced chlorpyrifos degradation from 283.86 µg/g to 955.364 µg/g, an overall enhancement of 70.34%. In the present study, with the help of a few well-defined experimental parameters, a mathematical model was constructed to understand the complex bioremediation process and to optimize the parameters to high accuracy. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Using Lean Process Improvement to Enhance Safety and Value in Orthopaedic Surgery: The Case of Spine Surgery.

    PubMed

    Sethi, Rajiv; Yanamadala, Vijay; Burton, Douglas C; Bess, Robert Shay

    2017-11-01

    Lean methodology was developed in the manufacturing industry to increase output and decrease costs. These labor organization methods have become the mainstay of major manufacturing companies worldwide. Lean methods involve continuous process improvement through the systematic elimination of waste, prevention of mistakes, and empowerment of workers to make changes. Because of the profit and productivity gains made in the manufacturing arena using lean methods, several healthcare organizations have adopted lean methodologies for patient care. Lean methods have now been implemented in many areas of health care. In orthopaedic surgery, lean methods have been applied to reduce complication rates and create a culture of continuous improvement. A step-by-step guide based on our experience can help surgeons use lean methods in practice. Surgeons and hospital centers well versed in lean methodology will be poised to reduce complications, improve patient outcomes, and optimize cost/benefit ratios for patient care.

  7. Single Event Test Methodologies and System Error Rate Analysis for Triple Modular Redundant Field Programmable Gate Arrays

    NASA Technical Reports Server (NTRS)

    Allen, Gregory; Edmonds, Larry D.; Swift, Gary; Carmichael, Carl; Tseng, Chen Wei; Heldt, Kevin; Anderson, Scott Arlo; Coe, Michael

    2010-01-01

    We present a test methodology for estimating system error rates of Field Programmable Gate Arrays (FPGAs) mitigated with Triple Modular Redundancy (TMR). The test methodology is founded in a mathematical model, which is also presented. Accelerator data from a 90 nm Xilinx Military/Aerospace grade FPGA are shown to fit the model. Fault injection (FI) results are discussed and related to the test data. Design implementation and the corresponding impact of multiple bit upsets (MBUs) are also discussed.
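
    The baseline TMR relation underlying such models is easy to state: a 2-of-3 voter fails when at least two replicas are upset, so for independent per-replica upset probability p the mitigated failure probability is 3p²(1-p) + p³. The sketch below uses an illustrative p; MBUs, discussed in the paper, can violate the independence assumption and push the real rate above this idealized figure.

```python
# Core TMR relation: a 2-of-3 vote fails when at least two replicas are
# upset; the per-replica upset probability p below is illustrative.
def tmr_failure_prob(p):
    """P(majority vote fails) = 3*p^2*(1 - p) + p^3, assuming independence."""
    return 3 * p**2 * (1 - p) + p**3

p = 1e-4  # assumed per-replica upset probability (per exposure window)
print(f"unmitigated: {p:.1e}, TMR-mitigated: {tmr_failure_prob(p):.1e}")
```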

  8. Methodological Choices in Rating Speech Samples

    ERIC Educational Resources Information Center

    O'Brien, Mary Grantham

    2016-01-01

    Much pronunciation research critically relies upon listeners' judgments of speech samples, but researchers have rarely examined the impact of methodological choices. In the current study, 30 German native listeners and 42 German L2 learners (L1 English) rated speech samples produced by English-German L2 learners along three continua: accentedness,…

  9. 77 FR 52110 - Agency Response to Public Comments of Safety Measurement System Changes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-28

    ... University of Michigan Transportation Research Institute ( http://csa.fmcsa.dot.gov/Documents/Evaluation-of... the revised methodology have a 3.9% greater future crash rate and 3.6% greater future HM violation rate than those previously identified for intervention using the existing SMS methodology. Details...

  10. Predicting Vision-Related Disability in Glaucoma.

    PubMed

    Abe, Ricardo Y; Diniz-Filho, Alberto; Costa, Vital P; Wu, Zhichao; Medeiros, Felipe A

    2018-01-01

    To present a new methodology for investigating predictive factors associated with development of vision-related disability in glaucoma. Prospective, observational cohort study. Two hundred thirty-six patients with glaucoma followed up for an average of 4.3±1.5 years. Vision-related disability was assessed by the 25-item National Eye Institute Visual Function Questionnaire (NEI VFQ-25) at baseline and at the end of follow-up. A latent transition analysis model was used to categorize NEI VFQ-25 results and to estimate the probability of developing vision-related disability during follow-up. Patients were tested with standard automated perimetry (SAP) at 6-month intervals, and evaluation of rates of visual field change was performed using mean sensitivity (MS) of the integrated binocular visual field. Baseline disease severity, rate of visual field loss, and duration of follow-up were investigated as predictive factors for development of disability during follow-up. The relationship between baseline and rates of visual field deterioration and the probability of vision-related disability developing during follow-up. At baseline, 67 of 236 (28%) glaucoma patients were classified as disabled based on NEI VFQ-25 results, whereas 169 (72%) were classified as nondisabled. Patients classified as nondisabled at baseline had 14.2% probability of disability developing during follow-up. Rates of visual field loss as estimated by integrated binocular MS were almost 4 times faster for those in whom disability developed versus those in whom it did not (-0.78±1.00 dB/year vs. -0.20±0.47 dB/year, respectively; P < 0.001). In the multivariate model, each 1-dB lower baseline binocular MS was associated with 34% higher odds of disability developing over time (odds ratio [OR], 1.34; 95% confidence interval [CI], 1.06-1.70; P = 0.013). In addition, each 0.5-dB/year faster rate of loss of binocular MS during follow-up was associated with a more than 3.5 times increase in the risk of disability developing (OR, 3.58; 95% CI, 1.56-8.23; P = 0.003). A new methodology for classification and analysis of change in patient-reported quality-of-life outcomes allowed construction of models for predicting vision-related disability in glaucoma. Copyright © 2017 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  11. Kinetic modelling and optimisation of antimicrobial compound production by Candida pyralidae KU736785 for control of Candida guilliermondii.

    PubMed

    Mewa-Ngongang, Maxwell; du Plessis, Heinrich W; Hutchinson, Ucrecia F; Mekuto, Lukhanyo; Ntwampe, Seteno Ko

    2017-06-01

    Biological antimicrobial compounds from yeast can be used to address the critical need for safer preservatives in food, fruit and beverages. The inhibition of Candida guilliermondii, a common fermented beverage spoilage organism, was achieved using antimicrobial compounds produced by Candida pyralidae KU736785. The antimicrobial production system was modelled and optimised using response surface methodology, with 22.5 °C and pH 5.0 being the optimum conditions. A new concept for quantifying spoilage organism inhibition was developed. The inhibition activity of the antimicrobial compounds was observed to be at a maximum after 17-23 h of fermentation, with the C. pyralidae concentration being between 0.40 and 1.25 × 10⁹ CFU ml⁻¹, while its maximum specific growth rate was 0.31-0.54 h⁻¹. The maximum inhibitory activity was between 0.19 and 1.08 l of contaminated solidified media per millilitre of antimicrobial compound used. Furthermore, the antimicrobial compound formation rate was 0.037-0.086 l VZI ml⁻¹ ACU h⁻¹. The response surface methodology analysis showed that the model developed sufficiently described the maximum antimicrobial activity of 1.08 l VZI ml⁻¹ ACU, predicting 1.17 l VZI ml⁻¹ ACU under the optimum production conditions.

  12. Quality Assurance of UMLS Semantic Type Assignments Using SNOMED CT Hierarchies.

    PubMed

    Gu, H; Chen, Y; He, Z; Halper, M; Chen, L

    2016-01-01

    The Unified Medical Language System (UMLS) is one of the largest biomedical terminological systems, with over 2.5 million concepts in its Metathesaurus repository. The UMLS's Semantic Network (SN) with its collection of 133 high-level semantic types serves as an abstraction layer on top of the Metathesaurus. In particular, the SN elaborates an aspect of the Metathesaurus's concepts via the assignment of one or more types to each concept. Due to the scope and complexity of the Metathesaurus, errors are all but inevitable in this semantic-type assignment process. To develop a semi-automated methodology to help assure the quality of semantic-type assignments within the UMLS. The methodology uses a cross-validation strategy involving SNOMED CT's hierarchies in combination with UMLS semantic types. Semantically uniform, disjoint concept groups are generated programmatically by partitioning the collection of all concepts in the same SNOMED CT hierarchy according to their respective semantic-type assignments in the UMLS. Domain experts are then called upon to review the concepts in any group having a small number of concepts. It is our hypothesis that a semantic-type assignment combination applicable only to a very small number of concepts in a SNOMED CT hierarchy is an indicator of potential problems. The methodology was applied to the UMLS 2013AA release along with the SNOMED CT from January 2013. An overall error rate of 33% was found for concepts proposed by the quality-assurance methodology. Supporting our hypothesis, that number was four times higher than the error rate found in control samples. The results show that the quality-assurance methodology can aid in effective and efficient identification of UMLS semantic-type assignment errors.
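
    A toy version of the partitioning step, with invented concepts and an arbitrary size threshold: concepts from one hierarchy are grouped by their semantic-type assignment, and small groups are flagged for expert review.

```python
# Partition concepts in one SNOMED CT hierarchy by their UMLS semantic-type
# assignment and flag small groups for review; all data here are invented.
from collections import defaultdict

# (concept, frozenset of UMLS semantic types) within a single hierarchy
concepts = [
    ("Appendectomy", frozenset({"Therapeutic or Preventive Procedure"})),
    ("Colectomy", frozenset({"Therapeutic or Preventive Procedure"})),
    ("Biopsy", frozenset({"Diagnostic Procedure"})),
    ("OddConcept", frozenset({"Organism Function"})),  # suspicious assignment
]

groups = defaultdict(list)
for name, types in concepts:
    groups[types].append(name)

THRESHOLD = 2  # "small group" cutoff; an assumption for illustration
for types, members in groups.items():
    if len(members) < THRESHOLD:
        print(f"review: {members} with types {sorted(types)}")
```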

  13. The UNO Aviation Monograph Series: The Airline Quality Rating 1997

    NASA Technical Reports Server (NTRS)

    Bowen, Brent D.; Headley, Dean E.

    1997-01-01

    The Airline Quality Rating (AQR) was developed and first announced in early 1991 as an objective method of comparing airline performance on combined multiple factors important to consumers. Development history and calculation details for the AQR rating system are detailed in The Airline Quality Rating 1991, issued in April 1991 by the National Institute for Aviation Research at Wichita State University. This current report, The Airline Quality Rating 1997, contains monthly Airline Quality Rating scores for 1996. Additional copies are available by contacting Wichita State University or the University of Nebraska at Omaha. The Airline Quality Rating 1997 is a summary of month-by-month quality ratings for the nine major domestic U.S. airlines operating during 1996. Using the Airline Quality Rating system and monthly performance data for each airline for the calendar year of 1996, individual and comparative ratings are reported. This research monograph contains a brief summary of the AQR methodology, detailed data and charts that track comparative quality for major domestic airlines across the 12-month period of 1996, and industry average results. Also, comparative Airline Quality Rating data for 1991 through 1995 are included to provide a longer-term view of quality in the industry.

  14. The UNO Aviation Monograph Series: The Airline Quality Rating 1998

    NASA Technical Reports Server (NTRS)

    Bowen, Brent D.; Headley, Dean E.

    1998-01-01

    The Airline Quality Rating (AQR) was developed and first announced in early 1991 as an objective method of comparing airline performance on combined multiple factors important to consumers. Development history and calculation details for the AQR rating system are detailed in The Airline Quality Rating 1991 issued in April, 1991, by the National Institute for Aviation Research at Wichita State University. This current report, Airline Quality Rating 1998, contains monthly Airline Quality Rating scores for 1997. Additional copies are available by contacting Wichita State University or University of Nebraska at Omaha. The Airline Quality Rating 1998 is a summary of month-by-month quality ratings for the ten major U.S. airlines operating during 1997. Using the Airline Quality Rating system and monthly performance data for each airline for the calendar year of 1997, individual and comparative ratings are reported. This research monograph contains a brief summary of the AQR methodology, detailed data and charts that track comparative quality for major airlines domestic operations for the 12 month period of 1997, and industry average results. Also, comparative Airline Quality Rating data for 1991 through 1996 are included to provide a longer term view of quality in the industry.

  15. First-Year Student Experiences in Community College: Making Transitions, Forming Connections, and Developing Perceptions of Student Learning. Draft.

    ERIC Educational Resources Information Center

    Jalomo, Romero Espinoza, Jr.

    This paper provides theoretical background and methodology for a focus group study of influences on first-time Latino community college students. The first chapter identifies the need for research on Latino students, citing high attrition rates and focusing on three critical dynamics: making the transition to college, making connections on campus…

  16. The Effectiveness of Policies and Programs that Attempt to Reduce Firearm Violence: A Meta-Analysis

    ERIC Educational Resources Information Center

    Makarios, Matthew D.; Pratt, Travis C.

    2012-01-01

    In response to rising rates of firearms violence that peaked in the mid-1990s, a wide range of policy interventions have been developed in an attempt to reduce violent crimes committed with firearms. Although some of these approaches appear to be effective at reducing gun violence, methodological variations make comparing effects across program…

  17. Methodology for developing life tables for sessile insects in the field using the Whitefly, Bemisia tabaci, in cotton as a model system

    USDA-ARS?s Scientific Manuscript database

    Life tables provide a means of measuring the schedules of birth and death from populations over time. They also can be used to quantify the sources and rates of mortality in populations, which has a variety of applications in ecology, including agricultural ecosystems. Horizontal, or cohort-based, l...

  18. Adapting Child Care Market Price Surveys to Support State Quality Initiatives. White Paper

    ERIC Educational Resources Information Center

    Branscome, Kenley

    2016-01-01

    Recent changes to the Child Care and Development Fund (CCDF) require a state's child care market price survey to: (1) be statistically valid and reliable and (2) reflect variations in the cost of child care services by geographic area, type of provider, and age of child. States may use an alternative methodology for setting payment rates--such as…

  19. Using existing case-mix methods to fund trauma cases.

    PubMed

    Monakova, Julia; Blais, Irene; Botz, Charles; Chechulin, Yuriy; Picciano, Gino; Basinski, Antoni

    2010-01-01

    Policymakers frequently face the need to increase funding in isolated and frequently heterogeneous (clinically and in terms of resource consumption) patient subpopulations. This article presents a methodologic solution for testing the appropriateness of using existing grouping and weighting methodologies for funding subsets of patients in the scenario where a case-mix approach is preferable to a flat-rate based payment system. Using as an example the subpopulation of trauma cases of Ontario lead trauma hospitals, the statistical techniques of linear and nonlinear regression models, regression trees, and spline models were applied to examine the fit of the existing case-mix groups and reference weights for the trauma cases. The analyses demonstrated that for funding Ontario trauma cases, the existing case-mix systems can form the basis for rational and equitable hospital funding, decreasing the need to develop a different grouper for this subset of patients. This study confirmed that Injury Severity Score is a poor predictor of costs for trauma patients. Although our analysis used the Canadian case-mix classification system and cost weights, the demonstrated concept of using existing case-mix systems to develop funding rates for specific subsets of patient populations may be applicable internationally.

  20. Assessing importance and satisfaction judgments of intermodal work commuters with electronic survey methodology.

    DOT National Transportation Integrated Search

    2013-09-01

    Recent advances in multivariate methodology provide an opportunity to further the assessment of service offerings in public transportation for work commuting. We offer methodologies that are alternative to direct rating scale and have advantages in t...

  1. Estimation of under-reporting in epidemics using approximations.

    PubMed

    Gamado, Kokouvi; Streftaris, George; Zachary, Stan

    2017-06-01

    Under-reporting in epidemics, when it is ignored, leads to under-estimation of the infection rate and therefore of the reproduction number. In the case of stochastic models with temporal data, a usual approach for dealing with such issues is to apply data augmentation techniques through Bayesian methodology. Departing from earlier literature approaches implemented using reversible jump Markov chain Monte Carlo (RJMCMC) techniques, we make use of approximations to obtain faster estimation with simple MCMC. Comparisons among the methods developed here, and with the RJMCMC approach, are carried out and highlight that approximation-based methodology offers useful alternative inference tools for large epidemics, with a good trade-off between time cost and accuracy.
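
    As a hedged illustration only (the likelihood, priors and names below are assumptions, not the authors' model), a minimal Python sketch of the Bayesian idea: binomially under-reported Poisson counts have a thinned-Poisson likelihood, and a simple random-walk Metropolis sampler with an informative prior on the reporting probability recovers both parameters without reversible-jump machinery:

        import numpy as np

        rng = np.random.default_rng(1)

        # simulate data: true weekly infections, binomially under-reported
        true_lambda, true_p = 40.0, 0.6                          # infection rate, reporting probability
        y = rng.binomial(rng.poisson(true_lambda, 52), true_p)   # observed counts

        def log_post(lam, p):
            if lam <= 0 or not 0 < p < 1:
                return -np.inf
            loglik = np.sum(y * np.log(lam * p) - lam * p)       # y_i ~ Poisson(lam*p), constants dropped
            logprior = 11 * np.log(p) + 7 * np.log(1 - p)        # Beta(12, 8) on p makes (lam, p) identifiable
            return loglik + logprior

        lam, p, chain = 20.0, 0.5, []
        for _ in range(20000):                                   # random-walk Metropolis
            lam_new, p_new = lam + rng.normal(0, 2.0), p + rng.normal(0, 0.05)
            if np.log(rng.uniform()) < log_post(lam_new, p_new) - log_post(lam, p):
                lam, p = lam_new, p_new
            chain.append((lam, p))

        print("posterior means (lambda, p):", np.array(chain[5000:]).mean(axis=0))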

  2. Recent advances in sortase-catalyzed ligation methodology.

    PubMed

    Antos, John M; Truttmann, Matthias C; Ploegh, Hidde L

    2016-06-01

    The transpeptidation reaction catalyzed by bacterial sortases continues to see increasing use in the construction of novel protein derivatives. In addition to growth in the number of applications that rely on sortase, this field has also seen methodology improvements that enhance reaction performance and scope. In this opinion, we present an overview of key developments in the practice and implementation of sortase-based strategies, including applications relevant to structural biology. Topics include the use of engineered sortases to increase reaction rates, the use of redesigned acyl donors and acceptors to mitigate reaction reversibility, and strategies for expanding the range of substrates that are compatible with a sortase-based approach. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Classification of samples into two or more ordered populations with application to a cancer trial.

    PubMed

    Conde, D; Fernández, M A; Rueda, C; Salvador, B

    2012-12-10

    In many applications, especially in cancer treatment and diagnosis, investigators are interested in classifying patients into various diagnosis groups on the basis of molecular data such as gene expression or proteomic data. Often, some of the diagnosis groups are known to be related to higher or lower values of some of the predictors. The standard methods of classifying patients into various groups do not take into account the underlying order. This could potentially result in high misclassification rates, especially when the number of groups is larger than two. In this article, we develop classification procedures that exploit the underlying order among the mean values of the predictor variables and the diagnostic groups by using ideas from order-restricted inference. We generalize the existing methodology on discrimination under restrictions and provide empirical evidence to demonstrate that the proposed methodology improves over the existing unrestricted methodology. The proposed methodology is applied to a bladder cancer data set where the researchers are interested in classifying patients into various groups. Copyright © 2012 John Wiley & Sons, Ltd.
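
    The abstract does not give the estimator, but the core idea can be sketched: estimate the per-group means, project them onto the known ordering with the pooled-adjacent-violators algorithm, and classify new samples to the nearest order-restricted mean. A minimal Python sketch under those assumptions (not the authors' procedure):

        import numpy as np

        def pava(values, weights):
            """Pooled adjacent violators: closest non-decreasing sequence to `values`."""
            out = []
            for v, w in zip(values, weights):
                out.append([v, w, 1])
                while len(out) > 1 and out[-2][0] > out[-1][0]:   # merge order violations
                    v2, w2, n2 = out.pop()
                    v1, w1, n1 = out.pop()
                    out.append([(w1 * v1 + w2 * v2) / (w1 + w2), w1 + w2, n1 + n2])
            return np.array([v for v, _, n in out for _ in range(n)])

        rng = np.random.default_rng(0)
        groups = [rng.normal(mu, 1.0, 30) for mu in (0.0, 0.4, 0.3, 1.2)]  # sample means may violate order
        restricted = pava([g.mean() for g in groups], [len(g) for g in groups])

        def classify(x):
            return int(np.argmin((x - restricted) ** 2))   # nearest order-restricted mean

        print("restricted means:", np.round(restricted, 2), " classify(0.9) ->", classify(0.9))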

  4. Optimizing low impact development (LID) for stormwater runoff treatment in urban area, Korea: Experimental and modeling approach.

    PubMed

    Baek, Sang-Soo; Choi, Dong-Ho; Jung, Jae-Woon; Lee, Hyung-Jin; Lee, Hyuk; Yoon, Kwang-Sik; Cho, Kyung Hwa

    2015-12-01

    Currently, continued urbanization and development result in an increase of impervious areas and surface runoff, including pollutants. One of the greatest issues in pollutant emissions is the first flush effect (FFE), which implies a greater discharge rate of pollutant mass in the early part of the storm. Low impact development (LID) practices have been mentioned as a promising strategy to control urban stormwater runoff and pollution in the urban ecosystem. However, this requires many experimental and modeling efforts to test LID characteristics and to propose an adequate guideline for optimizing LID management. In this study, we propose a novel methodology to optimize the sizes of different types of LID by conducting intensive stormwater monitoring and numerical modeling at a commercial site in Korea. The proposed methodology optimizes LID size in an attempt to moderate the FFE on a receiving waterbody. Thereby, the main objective of the optimization is to minimize the mass first flush (MFF), an indicator for quantifying the FFE. The optimal sizes of six different LIDs ranged from 1.2 mm to 3.0 mm in terms of runoff depths, which significantly moderates the FFE. We hope that the newly proposed methodology can be instructive for establishing LID strategies to mitigate the FFE. Copyright © 2015 Elsevier Ltd. All rights reserved.
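
    The calibrated model is not reproduced in the abstract; as a hedged sketch of the optimization step only, the toy Python example below minimizes a hypothetical saturating MFF model over six LID design depths under a total-capacity constraint (all coefficients are invented for illustration):

        import numpy as np
        from scipy.optimize import minimize

        benefit = np.array([0.06, 0.05, 0.04, 0.05, 0.03, 0.04])  # per-LID effectiveness (made up)
        halfsat = np.array([1.5, 2.0, 1.8, 2.2, 1.6, 2.0])        # half-saturation depths, mm (made up)

        def mff30(d):
            """Hypothetical mass first flush for the first 30% of runoff volume."""
            return 0.55 - np.sum(benefit * d / (halfsat + d))      # 0.55 = untreated MFF30

        res = minimize(mff30, x0=np.full(6, 2.0), method="SLSQP",
                       bounds=[(0.5, 4.0)] * 6,                    # per-LID depth limits, mm
                       constraints=({"type": "ineq", "fun": lambda d: 12.0 - d.sum()},))
        print("optimal depths (mm):", np.round(res.x, 2), " MFF30:", round(res.fun, 3))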

  5. Systematic review of guidelines for management of intermediate hepatocellular carcinoma using the Appraisal of Guidelines Research and Evaluation II instrument.

    PubMed

    Holvoet, Tom; Raevens, Sarah; Vandewynckel, Yves-Paul; Van Biesen, Wim; Geboes, Karen; Van Vlierberghe, Hans

    2015-10-01

    Hepatocellular carcinoma is the second leading cause of cancer-related mortality worldwide. Multiple guidelines have been developed to assist clinicians in its management. We aimed to explore the methodological quality of these guidelines, focusing on treatment of intermediate hepatocellular carcinoma by transarterial chemoembolization. A systematic search was performed for Clinical Practice Guidelines and Consensus statements for hepatocellular carcinoma management. Guideline quality was appraised using the Appraisal of Guidelines Research and Evaluation II instrument, which rates guideline development processes across 6 domains: 'Scope and purpose', 'Stakeholder involvement', 'Rigour of development', 'Clarity of presentation', 'Applicability' and 'Editorial independence'. Thematic analysis of guidelines was performed to map differences in recommendations. The quality of the 21 included guidelines varied widely but was overall poor, with only one guideline passing the 50% mark on all domains. Key recommendations, such as (contra)indications and technical aspects, were inconsistent between guidelines. Aspects of side effects and health economics were largely neglected. The methodological quality of guidelines on transarterial chemoembolization in hepatocellular carcinoma management is poor. This results in important discrepancies between guideline recommendations, creating confusion in clinical practice. Incorporation of the Appraisal of Guidelines Research and Evaluation II instrument in guideline development may improve the quality of future guidelines by increasing focus on methodological aspects. Copyright © 2015 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.

  6. MASTER: a model to improve and standardize clinical breakpoints for antimicrobial susceptibility testing using forecast probabilities.

    PubMed

    Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael

    2017-09-01

    The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of < 0.1% applying current EUCAST CBPs. Error rates were > 0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to < 0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability. © The Author 2017. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
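
    As a hedged numerical illustration of the categorization-error idea (the distribution parameters are invented, and this is not the MASTER code), the sketch below models observed diameters as true diameters plus normal methodological noise and evaluates the error rate for a single CBP or for a pair of breakpoints enclosing a ZMU:

        import numpy as np
        from scipy.stats import norm

        mu_r, sd_r, w_r = 12.0, 2.0, 0.30   # resistant component of true diameters (mm)
        mu_s, sd_s, w_s = 24.0, 2.5, 0.70   # susceptible component
        sd_m = 1.2                          # methodological SD from QC-strain replicates

        def error_rate(cbp_s, cbp_r=None):
            """P(observed diameter lands on the wrong side); cbp_r < cbp_s defines a ZMU."""
            cbp_r = cbp_s if cbp_r is None else cbp_r
            err_s = w_s * norm.cdf(cbp_r, mu_s, np.hypot(sd_s, sd_m))  # susceptible called resistant
            err_r = w_r * norm.sf(cbp_s, mu_r, np.hypot(sd_r, sd_m))   # resistant called susceptible
            return err_s + err_r

        print("error at CBP = 18 mm:", round(error_rate(18.0), 4))
        print("with ZMU 17-19 mm  :", round(error_rate(19.0, 17.0), 4))  # ZMU isolates left unclassified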

  7. On the development of HSCT tail sizing criteria using linear matrix inequalities

    NASA Technical Reports Server (NTRS)

    Kaminer, Isaac

    1995-01-01

    This report presents the results of a study to extend existing high speed civil transport (HSCT) tail sizing criteria using linear matrix inequalities (LMI). In particular, the effects of feedback specifications, such as MIL STD 1797 Level 1 and 2 flying qualities requirements, and actuator amplitude and rate constraints on the maximum allowable cg travel for a given set of tail sizes are considered. Results comparing previously developed industry criteria and the LMI methodology on an HSCT concept airplane are presented.

  8. Machine cost analysis using the traditional machine-rate method and ChargeOut!

    Treesearch

    E. M. (Ted) Bilek

    2009-01-01

    Forestry operations require ever more use of expensive capital equipment. Mechanization is frequently necessary to perform cost-effective and safe operations. Increased capital should mean more sophisticated capital costing methodologies. However, the machine rate method, which is the costing methodology most frequently used, dates back to 1942. ChargeOut!, a recently...
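
    For readers unfamiliar with the 1942-vintage approach, a minimal sketch of a traditional machine-rate calculation (straight-line depreciation plus charges on the average value of investment, then hourly operating costs); the structure follows common published formulations, and every number is illustrative:

        P, S, N, H = 250_000.0, 50_000.0, 5, 1_800.0   # price, salvage ($), life (yr), scheduled h/yr
        iit = 0.18                                     # interest + insurance + taxes, fraction of AVI

        depreciation = (P - S) / (N * H)               # $/h, straight line
        avi = (P - S) * (N + 1) / (2 * N) + S          # average value of investment, $
        ownership = depreciation + iit * avi / H       # $/h

        fuel = 28.0 * 1.10                             # L/h * $/L
        lube = 0.37 * fuel                             # rule-of-thumb fraction of fuel cost
        repairs = 0.9 * depreciation                   # repair & maintenance allowance
        labour = 32.0 * 1.35                           # wage * fringe-benefit multiplier
        print(f"machine rate: {ownership + fuel + lube + repairs + labour:.2f} $/h")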

  9. Computerized system for assessing heart rate variability.

    PubMed

    Frigy, A; Incze, A; Brânzaniuc, E; Cotoi, S

    1996-01-01

    The principal theoretical, methodological and clinical aspects of heart rate variability (HRV) analysis are reviewed. This method has been developed over the last 10 years as a useful noninvasive means of measuring the activity of the autonomic nervous system. The main components and the functioning of the computerized rhythm-analyzer system developed by our team are presented. The system is able to perform short-term (maximum 20 minutes) time-domain HRV analysis and statistical analysis of the ventricular rate in any rhythm, particularly in atrial fibrillation. The performance of our system is demonstrated using the graphics (RR histograms, delta RR histograms, RR scattergrams) and the statistical parameters resulting from the processing of three ECG recordings, obtained from a normal subject, from a patient with advanced heart failure, and from a patient with atrial fibrillation.
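
    The system itself is not published as code; as a generic illustration of the time-domain statistics such analyzers report, a short Python sketch computing standard measures from a series of RR intervals:

        import numpy as np

        def time_domain_hrv(rr_ms):
            """Standard time-domain HRV statistics from RR intervals in milliseconds."""
            rr = np.asarray(rr_ms, dtype=float)
            drr = np.diff(rr)                            # successive differences (delta RR)
            return {"mean_RR": rr.mean(),
                    "SDNN": rr.std(ddof=1),              # overall variability
                    "RMSSD": np.sqrt(np.mean(drr ** 2)), # beat-to-beat variability
                    "pNN50": 100.0 * np.mean(np.abs(drr) > 50.0)}

        rng = np.random.default_rng(7)
        rr = rng.normal(800.0, 30.0, 1500)               # synthetic ~75 bpm recording
        print({k: round(v, 1) for k, v in time_domain_hrv(rr).items()})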

  10. Cleavage of influenza RNA by using a human PUF-based artificial RNA-binding protein–staphylococcal nuclease hybrid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mori, Tomoaki; Nakamura, Kento; Masaoka, Keisuke

    Various viruses infect animals and humans and cause a variety of diseases, including cancer. However, effective methodologies to prevent virus infection have not yet been established. Therefore, development of technologies to inactivate viruses is highly desired. We have already demonstrated that cleavage of a DNA virus genome was effective to prevent its replication. Here, we expanded this methodology to RNA viruses. In the present study, we used staphylococcal nuclease (SNase) instead of the PIN domain (PilT N-terminus) of human SMG6 as an RNA-cleavage domain and fused the SNase to a human Pumilio/fem-3 binding factor (PUF)-based artificial RNA-binding protein to construct an artificial RNA restriction enzyme with enhanced RNA-cleavage rates for influenza virus. The resulting SNase-fusion nuclease cleaved influenza RNA at rates 120-fold greater than the corresponding PIN-fusion nuclease. The cleaving ability of the PIN-fusion nuclease was not improved even when the linker moiety between the PUF and the RNA-cleavage domain was changed. Gel shift assays revealed that the RNA-binding properties of the PUF derivative used were not as good as those of wild-type PUF. Improvement of the binding properties or the design method will allow the SNase-fusion nuclease to cleave an RNA target in mammalian cells and/or organisms. - Highlights: • A novel RNA restriction enzyme using SNase was developed to cleave viral RNA. • Our enzyme cleaved influenza RNA at rates >120-fold higher than the PIN-fusion one. • Our artificial enzyme with the L5 linker showed the highest RNA cleavage rate. • Our artificial enzyme site-selectively cleaved influenza RNA in vitro.

  11. Application of the Navigation Guide systematic review methodology to the evidence for developmental and reproductive toxicity of triclosan.

    PubMed

    Johnson, Paula I; Koustas, Erica; Vesterinen, Hanna M; Sutton, Patrice; Atchley, Dylan S; Kim, Allegra N; Campbell, Marlissa; Donald, James M; Sen, Saunak; Bero, Lisa; Zeise, Lauren; Woodruff, Tracey J

    2016-01-01

    There are reports of developmental and reproductive health effects associated with the widely used biocide triclosan. Apply the Navigation Guide systematic review methodology to answer the question: Does exposure to triclosan have adverse effects on human development or reproduction? We applied the first 3 steps of the Navigation Guide methodology: 1) Specify a study question, 2) Select the evidence, and 3) Rate quality and strength of the evidence. We developed a protocol, conducted a comprehensive search of the literature, and identified relevant studies using pre-specified criteria. We assessed the number and type of all relevant studies. We evaluated each included study for risk of bias and rated the quality and strength of the evidence for the selected outcomes. We conducted a meta-analysis on a subset of suitable data. We found 4282 potentially relevant records, and 81 records met our inclusion criteria. Of the more than 100 endpoints identified by our search, we focused our evaluation on hormone concentration outcomes, which had the largest human and non-human mammalian data set. Three human studies and 8 studies conducted in rats reported thyroxine levels as outcomes. The rat data were amenable to meta-analysis. Because only one of the human thyroxine studies quantified exposure, we did not conduct a meta-analysis of the human data. Through meta-analysis of the data for rats, we estimated for prenatal exposure a 0.09% (95% CI: -0.20, 0.02) reduction in thyroxine concentration per mg triclosan/kg-bw in fetal and young rats compared to control. For postnatal exposure we estimated a 0.31% (95% CI: -0.38, -0.23) reduction in thyroxine per mg triclosan/kg-bw, also compared to control. Overall, we found low to moderate risk of bias across the human studies and moderate to high risk of bias across the non-human studies, and assigned a "moderate/low" quality rating to the body of evidence for human thyroid hormone alterations and a "moderate" quality rating to the body of evidence for non-human thyroid hormone alterations. Based on this application of the Navigation Guide systematic review methodology, we concluded that there was "sufficient" non-human evidence and "inadequate" human evidence of an association between triclosan exposure and thyroxine concentrations, and consequently, triclosan is "possibly toxic" to reproductive and developmental health. Thyroid hormone disruption is an upstream indicator of developmental toxicity. Additional endpoints may be identified as being of equal or greater concern as other data are developed or evaluated. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  12. Application of the Navigation Guide Systematic Review Methodology to the Evidence for Developmental and Reproductive Toxicity of Triclosan

    PubMed Central

    Johnson, Paula I.; Koustas, Erica; Vesterinen, Hanna M.; Sutton, Patrice; Atchley, Dylan S.; Kim, Allegra N.; Campbell, Marlissa; Donald, James M.; Sen, Saunak; Bero, Lisa; Zeise, Lauren; Woodruff, Tracey J.

    2016-01-01

    Background There are reports of developmental and reproductive health effects associated with the widely used biocide triclosan. Objective Apply the Navigation Guide systematic review methodology to answer the question: Does exposure to triclosan have adverse effects on human development or reproduction? Methods We applied the first 3 steps of the Navigation Guide methodology: 1) Specify a study question, 2) Select the evidence, and 3) Rate quality and strength of the evidence. We developed a protocol, conducted a comprehensive search of the literature, and identified relevant studies using pre-specified criteria. We assessed the number and type of all relevant studies. We evaluated each included study for risk of bias and rated the quality and strength of the evidence for the selected outcomes. We conducted a meta-analysis on a subset of suitable data. Results We found 4,282 potentially relevant records, and 81 records met our inclusion criteria. Of the more than 100 endpoints identified by our search, we focused our evaluation on hormone concentration outcomes, which had the largest human and non-human mammalian data set. Three human studies and 8 studies conducted in rats reported thyroxine levels as outcomes. The rat data were amenable to meta-analysis. Because only one of the human thyroxine studies quantified exposure, we did not conduct a meta-analysis of the human data. Through meta-analysis of the data for rats, we estimated for prenatal exposure a 0.09% (95% CI: −0.20, 0.02) reduction in thyroxine concentration per mg triclosan/kg-bw in fetal and young rats compared to control. For postnatal exposure we estimated a 0.31% (95% CI: −0.38, −0.23) reduction in thyroxine per mg triclosan/kg-bw, also compared to control. Overall we found low to moderate risk of bias across the human studies and moderate to high risk of bias across the non-human studies, and assigned a “moderate/low” quality rating to the body of evidence for human thyroid hormone alterations and a “moderate” quality rating to the body of evidence for non-human thyroid hormone alterations. Conclusion Based on this application of the Navigation Guide systematic review methodology, we concluded that there was “sufficient” non-human evidence and “inadequate” human evidence of an association between triclosan exposure and thyroxine concentrations, and consequently, triclosan is “possibly toxic” to reproductive and developmental health. Thyroid hormone disruption is an upstream indicator of developmental toxicity. Additional endpoints may be identified as being of equal or greater concern as other data are developed or evaluated. PMID:27156197

  13. An in vitro methodology for forecasting luminal concentrations and precipitation of highly permeable lipophilic weak bases in the fasted upper small intestine.

    PubMed

    Psachoulias, Dimitrios; Vertzoni, Maria; Butler, James; Busby, David; Symillides, Moira; Dressman, Jennifer; Reppas, Christos

    2012-12-01

    To develop an in vitro methodology for predicting the concentrations and potential precipitation of highly permeable, lipophilic weak bases in the fasted upper small intestine, based on ketoconazole and dipyridamole luminal data, and to evaluate the usefulness of the methodology in predicting luminal precipitation of AZD0865 and SB705498 based on plasma data. A three-compartment in vitro setup was used. Depending on the dosage form administered in the in vivo studies, a solution or a suspension was placed in the gastric compartment. A medium simulating the luminal environment (FaSSIF-V2plus) was initially placed in the duodenal compartment. Concentrated FaSSIF-V2plus was placed in the reservoir compartment. In vitro ketoconazole and dipyridamole concentrations and precipitated fractions adequately reflected luminal data. Unlike the luminal precipitates, the in vitro ketoconazole precipitates were crystalline. In vitro AZD0865 data confirmed previously published human pharmacokinetic data suggesting that absorption rates are not affected by luminal precipitation. In vitro SB705498 data predicted that significant luminal precipitation occurs after a 100 mg or 400 mg dose but not after a 10 mg dose, consistent with human pharmacokinetic data. An in vitro methodology for predicting concentrations and potential precipitation of highly permeable, lipophilic weak bases in the fasted upper small intestine was thus developed and evaluated for its ability to predict luminal precipitation.

  14. Reducing hospital associated infection: a role for social marketing.

    PubMed

    Conway, Tony; Langley, Sue

    2013-01-01

    Although hand hygiene is seen as the most important method of preventing the transmission of hospital associated infection in the UK, hand hygiene compliance rates appear to remain poor. This research aims to assess the degree to which social marketing methodology can be adopted by a particular organisation to promote hand hygiene compliance. The research design is based on a conceptual framework developed from analysis of the social marketing literature. Data collection involved taped interviews given by nursing staff working within a specific Hospital Directorate in Manchester, England. Supplementary data were obtained from archival records of hand hygiene compliance rates. Findings highlighted gaps in the Directorate's approach to the promotion of hand hygiene compared with what could be achieved using social marketing methodology. Respondents highlighted how the Directorate failed to fully optimise the resources required to endorse hand hygiene practice, and this resulted in poorer compliance. From the experiences and events documented, the study suggests how the emergent phenomena could be utilised by the Directorate to apply a social marketing approach that could positively influence hand hygiene compliance. The paper explores the use of social marketing in nursing to promote hand hygiene compliance and offers a conceptual framework that provides a way of measuring the strength of the impact that social marketing methodology could have.

  15. Methodology for Flight Relevant Arc-Jet Testing of Flexible Thermal Protection Systems

    NASA Technical Reports Server (NTRS)

    Mazaheri, Alireza; Bruce, Walter E., III; Mesick, Nathaniel J.; Sutton, Kenneth

    2013-01-01

    A methodology to correlate flight aeroheating environments to the arc-jet environment is presented. For a desired hot-wall flight heating rate, the methodology provides the arc-jet bulk enthalpy for the corresponding cold-wall heating rate. A series of analyses was conducted to examine the effects of the test sample model holder geometry on the overall performance of the test sample. The analyses were compared with arc-jet test samples, and challenges and issues are presented. The transient flight environment was calculated for the Hypersonic Inflatable Aerodynamic Decelerator (HIAD) Earth Atmospheric Reentry Test (HEART) vehicle, which is a planned demonstration vehicle using a large inflatable, flexible thermal protection system to reenter the Earth's atmosphere from the International Space Station. A series of correlations was developed to define the arc-jet test environment that properly approximates the HEART flight environment. The computed arc-jet environments were compared with the measured arc-jet values to define the uncertainty of the correlated environment. The results show that for a given flight surface heat flux and a fully catalytic TPS, the flight-relevant arc-jet heat flux increases with the arc-jet bulk enthalpy, while for a non-catalytic TPS the arc-jet heat flux decreases with the bulk enthalpy.
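
    The report's correlations are not reproduced in the abstract; as a hedged sketch of the basic hot-wall/cold-wall conversion that underlies such mappings (assuming the common film-coefficient scaling, heat flux proportional to the enthalpy difference, not the report's exact correlation):

        def coldwall_flux(q_hotwall, h_bulk, h_wall):
            """Cold-wall flux equivalent to a hot-wall rate, via q ~ (h_bulk - h_wall)."""
            return q_hotwall * h_bulk / (h_bulk - h_wall)

        # e.g. 30 W/cm^2 hot-wall at 2 MJ/kg wall enthalpy in a 10 MJ/kg stream
        print(f"{coldwall_flux(30.0, 10e6, 2e6):.1f} W/cm^2 cold-wall equivalent")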

  16. The 2014 International Pressure Ulcer Guideline: methods and development.

    PubMed

    Haesler, Emily; Kottner, Jan; Cuddigan, Janet

    2017-06-01

    A discussion of the methodology used to develop the Prevention and Treatment of Pressure Ulcers: Clinical Practice Guideline (2014). International experts representing the National Pressure Ulcer Advisory Panel, the European Pressure Ulcer Advisory Panel and the Pan Pacific Pressure Injury Alliance developed the second edition of this clinical guideline. Discussion paper - methodology. A comprehensive search for papers published up to July 2013 was conducted in 11 databases and identified 4286 studies. After critical appraisal, 356 studies were included and assigned a level of evidence. Guideline recommendations provide guidance on best practice in pressure ulcer prevention and treatment. Understanding the development process of a guideline increases the meaningfulness of its recommendations to clinicians. Five hundred and seventy-five recommendations arose from the research and its interpretation. The body of evidence supporting each recommendation was assigned a strength of evidence. A strength of recommendation was assigned to recommendation statements using the GRADE system. Recommendations are primarily supported by a body of evidence rated as C (87% of recommendations), representing low-quality and/or indirect evidence (30%) and expert opinion (57%). Two hundred and forty-seven recommendations (43%) received a strong recommendation ('Do it'). Recommendations were developed with consideration of the highest methodological quality evidence available and of studies that add clinical insight and provide guidance for areas of care where minimal research has been conducted. Recommendations in the guideline reflect best practice and should be implemented with consideration of the local context and resources and the individual's preferences and needs. © 2016 John Wiley & Sons Ltd.

  17. A Longitudinal Study on Human Outdoor Decomposition in Central Texas.

    PubMed

    Suckling, Joanna K; Spradley, M Katherine; Godde, Kanya

    2016-01-01

    The development of a methodology that estimates the postmortem interval (PMI) from stages of decomposition is a goal for which forensic practitioners strive. A proposed equation (Megyesi et al. 2005) that utilizes total body score (TBS) and accumulated degree days (ADD) was tested using longitudinal data collected from human remains donated to the Forensic Anthropology Research Facility (FARF) at Texas State University-San Marcos. Exact binomial tests examined the rate at which the equation successfully predicted ADD. Statistically significant differences were found between the ADD estimated by the equation and the observed value for each decomposition stage. Differences remained significant after carnivore-scavenged donations were removed from the analysis. The low success rates of the equation in predicting ADD from TBS, and the wide standard errors, demonstrate the need to re-evaluate the use of this equation and methodology for PMI estimation in different environments; rather, multivariate methods and equations should be derived that are environmentally specific. © 2015 American Academy of Forensic Sciences.
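
    The commonly cited form of the tested regression is log10(ADD) = 0.002 * TBS^2 + 1.81; the coefficients below reproduce that widely quoted form as an assumption and should be checked against Megyesi et al. (2005) before any use:

        def estimate_add(tbs):
            """Accumulated degree days from total body score (commonly cited form; verify)."""
            return 10 ** (0.002 * tbs ** 2 + 1.81)

        add = estimate_add(20)   # a mid-range TBS
        print(f"ADD ~= {add:.0f} degree-days; PMI at a 25 C daily mean ~= {add / 25:.1f} days")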

  18. Evaluation of background radiation dose contributions in the United Arab Emirates.

    PubMed

    Goddard, Braden; Bosc, Emmanuel; Al Hasani, Sarra; Lloyd, Cody

    2018-09-01

    The natural background radiation consists of three main components: cosmic, terrestrial, and skyshine. Although there are currently methods available to measure the total dose rate from background radiation, no established methods exist that allow for the measurement of each component of the background radiation. This analysis consists of a unique methodology in which the dose rate contribution from each component of the natural background radiation is measured and calculated. This project evaluates the natural background dose rate in the Abu Dhabi City region from all three of these components using the developed methodology. Evaluating and understanding the different components of background radiation provides a baseline allowing for the detection, and possibly attribution, of elevated radiation levels. Measurements using a high-pressure ion chamber with different shielding configurations, together with two offshore measurements, provided dose rate information that was attributed to the different components of the background radiation. Additional spectral information was obtained using an HPGe detector to verify and quantify the presence of terrestrial radionuclides. By evaluating the dose rates of the different shielding configurations, the cosmic, terrestrial, and skyshine contributions in the Abu Dhabi City region were determined to be 33.0 ± 1.7, 15.7 ± 2.5, and 2.4 ± 2.1 nSv/h, respectively. Copyright © 2018. Published by Elsevier Ltd.
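
    The arithmetic behind the attribution can be sketched as differences between shielding configurations with uncertainties combined in quadrature; the configuration assignments below are illustrative assumptions, not the paper's exact geometry:

        import numpy as np

        total         = (51.1, 1.0)   # nSv/h: open field, cosmic + terrestrial + skyshine (made up)
        offshore      = (35.4, 1.5)   # over water: cosmic + skyshine, no terrestrial (made up)
        shielded_side = (33.0, 1.7)   # side-shielded: cosmic only (made up)

        def subtract(a, b):
            """Difference of (value, sigma) pairs; sigmas combined in quadrature."""
            return a[0] - b[0], float(np.hypot(a[1], b[1]))

        components = {"cosmic": shielded_side,
                      "terrestrial": subtract(total, offshore),
                      "skyshine": subtract(offshore, shielded_side)}
        for name, (v, s) in components.items():
            print(f"{name:11s} {v:5.1f} +/- {s:.1f} nSv/h")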

  19. Review of sampling, sample and data collection procedures in nursing research--An example of research on ethical climate as perceived by nurses.

    PubMed

    Suhonen, Riitta; Stolt, Minna; Katajisto, Jouko; Leino-Kilpi, Helena

    2015-12-01

    To report a review of quality regarding sampling, sample and data collection procedures in empirical nursing research on ethical climate in which nurses were informants. Surveys are needed to obtain generalisable information about topics sensitive to nursing. The methodological quality of such studies is of key concern, especially the description of sampling and data collection procedures. Methodological literature review. Using the electronic MEDLINE database, empirical nursing research articles focusing on ethical climate were accessed in 2013 (earliest to 22 November 2013). Using the search terms 'ethical' AND ('climate*' OR 'environment*') AND ('nurse*' OR 'nursing'), 376 citations were retrieved. Based on a four-phase retrieval process, 26 studies were included in the detailed analysis. The sampling method was reported in 58% of the studies, and it was random in a minority of them (26%). The target sample and its size were identified in most studies (92%), whereas a justification for the sample size was less often given. In over two-thirds (69%) of the studies with an identifiable response rate, the rate was below 75%. A variety of data collection procedures were used, with a large amount of missing detail about who distributed, recruited and collected the questionnaires. Methods to increase response rates were seldom described. Discussion of nonresponse, representativeness of the sample and generalisability of the results was missing in many studies. This review highlights the methodological challenges and developments that need to be considered in ensuring the use of valid information in developing health care through research findings. © 2015 Nordic College of Caring Science.

  20. Sport Injuries Sustained by Athletes with Disability: A Systematic Review.

    PubMed

    Weiler, Richard; Van Mechelen, Willem; Fuller, Colin; Verhagen, Evert

    2016-08-01

    Fifteen percent of the world's population live with disability, and many of these individuals choose to play sport. There are barriers to sport participation for athletes with disability and sports injury can greatly impact on daily life, which makes sports injury prevention additionally important. The purpose of this review is to systematically review the definitions, methodologies and injury rates in disability sport, which should assist future identification of risk factors and development of injury prevention strategies. A secondary aim is to highlight the most pressing issues for improvement of the quality of injury epidemiology research for disability sport. A search of NICE, AMED, British Nursing Index, CINAHL, EMBASE and Medline was conducted to identify all publications up to 16 June 2015. Of 489 potentially relevant articles and reference searching, a total of 15 studies were included. Wide study sample heterogeneity prevented data pooling and meta-analysis. Results demonstrated an evolving field of epidemiology, but with wide differences in sports injury definition and with studies focused on short competitions. Background data were generally sparse; there was minimal exposure analysis, and no analysis of injury severity, all of which made comparison of injury risk and injury severity difficult. There is an urgent need for consensus on sports injury definition and methodology in disability sports. The quality of studies is variable, with inconsistent sports injury definitions, methodologies and injury rates, which prevents comparison, conclusions and development of injury prevention strategies. The authors highlight the most pressing issues for improvement of the quality in injury epidemiology research for disability sport.

  1. Including information about comorbidity in estimates of disease burden: Results from the WHO World Mental Health Surveys

    PubMed Central

    Alonso, Jordi; Vilagut, Gemma; Chatterji, Somnath; Heeringa, Steven; Schoenbaum, Michael; Üstün, T. Bedirhan; Rojas-Farreras, Sonia; Angermeyer, Matthias; Bromet, Evelyn; Bruffaerts, Ronny; de Girolamo, Giovanni; Gureje, Oye; Haro, Josep Maria; Karam, Aimee N.; Kovess, Viviane; Levinson, Daphna; Liu, Zhaorui; Mora, Maria Elena Medina; Ormel, J.; Posada-Villa, Jose; Uda, Hidenori; Kessler, Ronald C.

    2010-01-01

    Background The methodology commonly used to estimate disease burden, featuring ratings of severity of individual conditions, has been criticized for ignoring comorbidity. A methodology that addresses this problem is proposed and illustrated here with data from the WHO World Mental Health Surveys. Although the analysis is based on self-reports about one’s own conditions in a community survey, the logic applies equally well to analysis of hypothetical vignettes describing comorbid condition profiles. Methods Face-to-face interviews in 13 countries (six developing, nine developed; n = 31,067; response rate = 69.6%) assessed 10 classes of chronic physical and 9 of mental conditions. A visual analog scale (VAS) was used to assess overall perceived health. Multiple regression analysis with interactions for comorbidity was used to estimate associations of conditions with VAS. Simulation was used to estimate condition-specific effects. Results The best-fitting model included condition main effects and interactions of types by numbers of conditions. Neurological conditions, insomnia, and major depression were rated most severe. Adjustment for comorbidity reduced condition-specific estimates with substantial between-condition variation (.24–.70 ratios of condition-specific estimates with and without adjustment for comorbidity). The societal-level burden rankings were quite different from the individual-level rankings, with the highest societal-level rankings associated with conditions having high prevalence rather than high individual-level severity. Conclusions Plausible estimates of disorder-specific effects on VAS can be obtained using methods that adjust for comorbidity. These adjustments substantially influence condition-specific ratings. PMID:20553636

  2. Practical Loop-Shaping Design of Feedback Control Systems

    NASA Technical Reports Server (NTRS)

    Kopasakis, George

    2010-01-01

    An improved methodology for designing feedback control systems has been developed based on systematically shaping the loop gain of the system to meet performance requirements such as stability margins, disturbance attenuation, and transient response, while taking into account the actuation system limitations such as actuation rates and range. Loop-shaping for controls design is not new, but past techniques do not directly address how to systematically design the controller to maximize its performance. As a result, classical feedback control systems are designed predominantly using ad hoc control design approaches such as proportional-integral-derivative (PID) control, with designers normally satisfied when a workable solution is achieved, without a good understanding of how to maximize the effectiveness of the control design in terms of competing performance requirements, in relation to the limitations of the plant design. The conception of this improved methodology was motivated by challenges in designing control systems of the types needed for supersonic propulsion. But the methodology is generally applicable to any classical control-system design where the transfer function of the plant is known or can be evaluated. In the case of a supersonic aerospace vehicle, a major challenge is to design the system to attenuate anticipated external and internal disturbances, using such actuators as fuel injectors and valves, bypass doors, and ramps, all of which are subject to limitations in actuator response, rates, and ranges. Also, for supersonic vehicles, which have long, slim structures, coupling between the engine and the structural dynamics can produce undesirable effects that could adversely affect vehicle stability and ride quality. In order to design distributed controls that can suppress these potential adverse effects, within the full capabilities of the actuation system, it is important to employ a systematic control design methodology such as this that can maximize the effectiveness of the control design in a methodical and quantifiable way. The emphasis is on generating simple but rather powerful design techniques that will allow even designers with a layman's knowledge of controls to develop effective feedback control designs. Unlike conventional ad hoc methodologies of feedback control design, in this approach actuator rates are incorporated into the design right from the start: the relation between actuator speeds and the desired control bandwidth of the system is established explicitly. The technique developed is demonstrated via design examples in a step-by-step tutorial way. Given the actuation system rates and range limits together with design specifications in terms of stability margins, disturbance rejection, and transient response, the procedure involves designing the feedback loop gain to meet the requirements and maximizing the control system effectiveness, without exceeding the actuation system limits and saturating the controller. Then, knowing the plant transfer function, the procedure involves designing the controller so that the product of the controller and plant transfer functions equals the designed loop gain. The technique also shows what the limitations of the controller design are and how to trade competing design requirements such as stability margins and disturbance rejection. Finally, the technique is contrasted against other more familiar control design techniques, like PID control, to show its advantages.
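
    A minimal sketch of the "shape the loop gain first, then back out the controller" step, using the python-control package and an invented first-order plant (the bandwidth choice is where actuator rate limits would enter; this is not the report's example):

        import control as ct

        P = ct.tf([2.0], [1.0, 3.0])      # assumed plant: 2 / (s + 3)
        L = ct.tf([40.0], [1.0, 0.0])     # target loop gain: integrator, ~40 rad/s crossover
        C = ct.minreal(L / P)             # controller such that C(s) * P(s) = L(s)

        gm, pm, wcg, wcp = ct.margin(L)   # check margins of the shaped loop
        print("controller:", C)
        print(f"phase margin {pm:.1f} deg at {wcp:.1f} rad/s")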

  3. Payload training methodology study

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PCT) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.

  4. The ALHAMBRA survey: accurate merger fractions derived by PDF analysis of photometrically close pairs

    NASA Astrophysics Data System (ADS)

    López-Sanjuan, C.; Cenarro, A. J.; Varela, J.; Viironen, K.; Molino, A.; Benítez, N.; Arnalte-Mur, P.; Ascaso, B.; Díaz-García, L. A.; Fernández-Soto, A.; Jiménez-Teja, Y.; Márquez, I.; Masegosa, J.; Moles, M.; Pović, M.; Aguerri, J. A. L.; Alfaro, E.; Aparicio-Villegas, T.; Broadhurst, T.; Cabrera-Caño, J.; Castander, F. J.; Cepa, J.; Cerviño, M.; Cristóbal-Hornillos, D.; Del Olmo, A.; González Delgado, R. M.; Husillos, C.; Infante, L.; Martínez, V. J.; Perea, J.; Prada, F.; Quintana, J. M.

    2015-04-01

    Aims: Our goal is to develop and test a novel methodology to compute accurate close-pair fractions with photometric redshifts. Methods: We improved the currently used methodologies for estimating the merger fraction fm from photometric redshifts by (i) using the full probability distribution functions (PDFs) of the sources in redshift space; (ii) including the variation in the luminosity of the sources with z in both the sample selection and the luminosity ratio constraint; and (iii) splitting individual PDFs into red and blue spectral templates to reliably work with colour selections. We tested the performance of our new methodology with the PDFs provided by the ALHAMBRA photometric survey. Results: The merger fractions and rates from the ALHAMBRA survey agree very well with those from spectroscopic work, for both the general population and red and blue galaxies. With the merger rate of bright (MB ≤ -20-1.1z) galaxies evolving as (1 + z)n, the power-law index n is higher for blue galaxies (n = 2.7 ± 0.5) than for red galaxies (n = 1.3 ± 0.4), confirming previous results. Integrating the merger rate over cosmic time, we find that the average number of mergers per galaxy since z = 1 is Nmred = 0.57 ± 0.05 for red galaxies and Nmblue = 0.26 ± 0.02 for blue galaxies. Conclusions: Our new methodology statistically exploits all the available information provided by photometric redshift codes and yields accurate measurements of the merger fraction by close pairs using photometric redshifts alone. Current and future photometric surveys will benefit from this new methodology. Based on observations collected at the German-Spanish Astronomical Center, Calar Alto, jointly operated by the Max-Planck-Institut für Astronomie (MPIA) at Heidelberg and the Instituto de Astrofísica de Andalucía (CSIC). The catalogues, probabilities, and figures of the ALHAMBRA close pairs detected in Sect. 5.1 are available at https://cloud.iaa.csic.es/alhambra/catalogues/ClosePairs
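
    The number of mergers per galaxy follows from integrating the merger rate over cosmic time; a hedged sketch with astropy (the power-law index is from the abstract, while the local rate R0 is a placeholder, not the paper's value):

        import numpy as np
        from astropy.cosmology import Planck15 as cosmo

        n_blue, R0 = 2.7, 0.08                           # index for blue galaxies; R0 in mergers/Gyr (placeholder)
        z = np.linspace(0.0, 1.0, 400)
        t_lb = cosmo.lookback_time(z).to_value("Gyr")    # lookback time at each redshift
        N_m = np.trapz(R0 * (1.0 + z) ** n_blue, t_lb)   # mergers per galaxy since z = 1
        print(f"N_m(z < 1) = {N_m:.2f}")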

  5. [Growing up as an occupation child of World War II in Germany: Rationale and methods of a study on German occupation children].

    PubMed

    Kaiser, Marie; Kuwert, Philipp; Glaesmer, Heide

    2015-01-01

    To date the experiences of German occupation children (GOC) have been described solely in historical studies; empirical research on the psychosocial consequences of growing up as a German occupation child was missing. This paper provides an introduction to the background and methodological approach of the first German empirical study on this topic, along with descriptive information on its sample. It also touches on methodological challenges and how they were resolved. Children born of war represent a target group that is difficult to reach (a hidden population). An investigation therefore requires consultation both with people from the target group and with scientific experts (a participatory approach), as well as specific methodological approaches. The questionnaire utilized contains adaptations of established, psychometrically validated instruments as well as adapted self-developed items. N = 146 occupation children were surveyed (mean age 63.4 years, 63.0% women), recruited via press releases and contact with platforms for children born of war. Despite the methodological challenges, an instrument to assess the target group was developed through participatory methods. The instrument shows high relevance for the target group and is highly accepted. The high rates of American and French occupation children among participants show the influence of networking in platforms on successful recruitment.

  6. The Reliability of Methodological Ratings for speechBITE Using the PEDro-P Scale

    ERIC Educational Resources Information Center

    Murray, Elizabeth; Power, Emma; Togher, Leanne; McCabe, Patricia; Munro, Natalie; Smith, Katherine

    2013-01-01

    Background: speechBITE (http://www.speechbite.com) is an online database established to help speech and language therapists gain faster access to relevant research that can be used in clinical decision-making. In addition to containing more than 3000 journal references, the database also provides methodological ratings on the PEDro-P (an…

  7. From Study to Work: Methodological Challenges of a Graduate Destination Survey in the Western Cape, South Africa

    ERIC Educational Resources Information Center

    du Toit, Jacques; Kraak, Andre; Favish, Judy; Fletcher, Lizelle

    2014-01-01

    Current literature proposes several strategies for improving response rates to student evaluation surveys. Graduate destination surveys pose the difficulty of tracing graduates years later when their contact details may have changed. This article discusses the methodology of one such a survey to maximise response rates. Compiling a sample frame…

  8. Intelligent systems/software engineering methodology - A process to manage cost and risk

    NASA Technical Reports Server (NTRS)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  9. Cruise design for a 5-year period of the 50-year timber sales in Alaska.

    Treesearch

    John W. Hazard

    1985-01-01

    Sampling rules and estimation procedures are described for a new cruise design that was developed for 50-year timber sales in Alaska. An example is given of the rate redetermination cruise and analysis for the 1984-1989 period of the Ketchikan Pulp Company sale. In addition, methodology is presented for an alternative sampling technique of sampling with probability...

  10. Network Models of Entrepreneurial Ecosystems in Developing Economies

    DTIC Science & Technology

    2014-01-01

    Department of Mathematical Sciences, U.S. Military Academy; Candice Price, Ph.D., Department of Mathematical Sciences, U.S. Military Academy. [Report front matter ("NOTICES") and a cover photograph of protesters in Tahrir Square, Cairo, omitted.] "Youth unemployment is a ticking time bomb," –Alexander Chikwanda, Finance Minister, Zambia. …with the recent political and social changes in the region, only contributes to this high unemployment rate. As the Finance Minister of Zambia stated…

  11. 75 FR 24757 - Order Making Fiscal Year 2011 Annual Adjustments to the Fee Rates Applicable Under Section 6(b...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-05

    ... Management and Budget (``OMB'') to project aggregate offering price for purposes of the fiscal year 2010... methodology it developed in consultation with the CBO and OMB to project dollar volume for purposes of prior... AAMOP is given by exp(FLAAMOP_t + σ_n²/2), where σ_n denotes the standard error of the n...
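
    The exp(FLAAMOP_t + σ_n²/2) form is the standard mean of a lognormal variable: a forecast made on the log scale is converted to a level forecast with a half-variance correction. A short illustration with invented numbers:

        import numpy as np

        flaamop_t = np.log(4.2e12)   # hypothetical log-scale forecast of aggregate offering price
        sigma_n = 0.15               # hypothetical standard error of the log-scale model

        naive = np.exp(flaamop_t)                          # biased low for a lognormal variable
        corrected = np.exp(flaamop_t + sigma_n ** 2 / 2)   # E[exp(X)] for X ~ N(mu, sigma^2)
        print(f"naive {naive:.3e}, corrected {corrected:.3e}")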

  12. Modified Methodology for Projecting Coastal Louisiana Land Changes over the Next 50 Years

    USGS Publications Warehouse

    Hartley, Steve B.

    2009-01-01

    The coastal Louisiana landscape is continually undergoing geomorphologic changes (in particular, land loss); however, after the 2005 hurricane season, the changes were intensified because of Hurricanes Katrina and Rita. The amount of land loss caused by the 2005 hurricane season was 42 percent (562 km2) of the total land loss (1,329 km2) that was projected for the next 50 years in the Louisiana Coastal Area (LCA), Louisiana Ecosystem Restoration Study. The purpose of this study is to provide information on potential changes to coastal Louisiana by using a revised LCA study methodology. In the revised methodology, we used classified Landsat TM satellite imagery from 1990, 2001, 2004, and 2006 to calculate the 'background' or ambient land-water change rates but divided the Louisiana coastal area differently on the basis of (1) geographic regions ('subprovinces') and (2) specific homogeneous habitat types. Defining polygons by subprovinces (1, Pontchartrain Basin; 2, Barataria Basin; 3, Vermilion/Terrebonne Basins; and 4, the Chenier Plain area) allows for a specific erosion rate to be applied to that area. Further subdividing the provinces by habitat type allows for specific erosion rates for a particular vegetation type to be applied. Our modified methodology resulted in 24 polygons rather than the 183 that were used in the LCA study; further, actively managed areas and the CWPPRA areas were not masked out and dealt with separately as in the LCA study. This revised methodology assumes that erosion rates for habitat types by subprovince are under the influence of similar environmental conditions (sediment depletion, subsidence, and saltwater intrusion). Background change rates for three time periods (1990-2001, 1990-2004, and 1990-2006) were calculated by taking the difference in water or land between the endpoints of each period and dividing it by the time interval. This calculation gives an annual change rate for each polygon per time period. Change rates for each time period were then used to compute the projected change in each subprovince and habitat type over 50 years by using the same compound rate functions used in the LCA study. The resulting maps show projected land changes based on the revised methodology and inclusion of damage by Hurricanes Katrina and Rita. Comparison of projected land change values between the LCA study and this study shows that this revised methodology - that is, using a reduced polygon subset (reduced from 183 to 24) based on habitat type and subprovince - can be used as a quick projection of land loss.
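
    As a worked illustration of the ambient-rate and compounding arithmetic for a single polygon (areas are invented, and the study's exact rate definition may differ):

        land_1990, land_2006 = 920.0, 860.0                            # km^2 of land (made up)
        years = 2006 - 1990

        annual_rate = (land_2006 / land_1990) ** (1.0 / years) - 1.0   # compound annual change
        projected = land_2006 * (1.0 + annual_rate) ** 50              # 50-year projection
        print(f"annual rate {annual_rate:+.4%}; land in 50 years {projected:.0f} km^2 "
              f"(loss of {land_2006 - projected:.0f} km^2)")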

  13. A measurement-based study of concurrency in a multiprocessor

    NASA Technical Reports Server (NTRS)

    Mcguire, Patrick John

    1987-01-01

    A systematic measurement-based methodology for characterizing the amount of concurrency present in a workload, and the effect of concurrency on system performance indices such as cache miss rate and bus activity, is developed. Hardware and software instrumentation of an Alliant FX/8 was used to obtain data from a real workload environment. Results show that 35% of the workload is concurrent, with the concurrent periods typically using all available processors. Measurements of periods of change in concurrency show uneven usage of processors during these times. Other system measures, including cache miss rate and processor bus activity, are analyzed with respect to the concurrency measures. The probability of a cache miss is seen to increase with concurrency. The change in cache miss rate is much more sensitive to the fraction of concurrent code in the workload than to the number of processors active during concurrency. Regression models are developed to quantify the relationships between cache miss rate, bus activity, and the concurrency measures. The model for cache miss rate predicts an increase in the median miss rate value of as much as 300% for a 100% increase in concurrency in the workload.
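
    As a generic sketch of the kind of regression model described (synthetic data; the coefficients are chosen only to echo the reported sensitivity of miss rate to the fraction of concurrent code):

        import numpy as np

        rng = np.random.default_rng(3)
        frac_conc = rng.uniform(0.0, 0.7, 200)     # fraction of concurrent code
        n_active = rng.integers(1, 9, 200)         # processors active during concurrency
        miss = 0.02 + 0.05 * frac_conc + 0.001 * n_active + rng.normal(0, 0.003, 200)

        X = np.column_stack([np.ones(200), frac_conc, n_active])
        beta, *_ = np.linalg.lstsq(X, miss, rcond=None)   # ordinary least squares
        print("intercept, frac_conc, n_active:", np.round(beta, 4))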

  14. Regional Hospital Input Price Indexes

    PubMed Central

    Freeland, Mark S.; Schendler, Carol Ellen; Anderson, Gerard

    1981-01-01

    This paper describes the development of regional hospital input price indexes that is consistent with the general methodology used for the National Hospital Input Price Index. The feasibility of developing regional indexes was investigated because individuals inquired whether different regions experienced different rates of increase in hospital input prices. The regional indexes incorporate variations in cost-share weights (the amount an expense category contributes to total spending) associated with hospital type and location, and variations in the rate of input price increases for various regions. We found that between 1972 and 1979 none of the regional price indexes increased at average annual rates significantly different from the national rate. For the more recent period 1977 through 1979, the increase in one Census Region was significantly below the national rate. Further analyses indicated that variations in cost-share weights for various types of hospitals produced no substantial variations in the regional price indexes relative to the national index. We consider these findings preliminary because of limitations in the availability of current, relevant, and reliable data, especially for local area wage rate increases. PMID:10309557
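
    The construction can be illustrated with a fixed-weight input price index: cost-share weights applied to category price relatives (all numbers invented, not the national index's actual weights):

        import numpy as np

        weights = np.array([0.55, 0.12, 0.08, 0.25])    # labor, energy, food, other cost shares
        relatives = np.array([1.08, 1.22, 1.05, 1.07])  # category price levels vs. base year

        index = 100.0 * weights @ relatives             # Laspeyres-type fixed-weight index
        print(f"input price index: {index:.1f} (base year = 100)")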

  15. Characterization of a mine fire using atmospheric monitoring system sensor data

    PubMed Central

    Yuan, L.; Thomas, R.A.; Zhou, L.

    2017-01-01

    Atmospheric monitoring systems (AMS) have been widely used in underground coal mines in the United States for the detection of fire in the belt entry and the monitoring of other ventilation-related parameters such as airflow velocity and methane concentration in specific mine locations. In addition to an AMS being able to detect a mine fire, the AMS data have the potential to provide fire characteristic information such as fire growth — in terms of heat release rate — and exact fire location. Such information is critical in making decisions regarding fire-fighting strategies, underground personnel evacuation and optimal escape routes. In this study, a methodology was developed to calculate the fire heat release rate using AMS sensor data for carbon monoxide concentration, carbon dioxide concentration and airflow velocity based on the theory of heat and species transfer in ventilation airflow. Full-scale mine fire experiments were then conducted in the Pittsburgh Mining Research Division’s Safety Research Coal Mine using an AMS with different fire sources. Sensor data collected from the experiments were used to calculate the heat release rates of the fires using this methodology. The calculated heat release rate was compared with the value determined from the mass loss rate of the combustible material using a digital load cell. The experimental results show that the heat release rate of a mine fire can be calculated using AMS sensor data with reasonable accuracy. PMID:28845058
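
    As a hedged sketch of species-based heat release estimation (carbon-dioxide generation calorimetry with a commonly used average heat-of-combustion constant; this is not the authors' exact formulation):

        def hrr_from_co2(d_co2_ppm, velocity_ms, area_m2,
                         e_co2=13.3e6,    # J per kg CO2 generated (common average; assumption)
                         rho_co2=1.80):   # kg/m^3 near 25 C
            """Heat release rate (W) from the CO2 rise across the fire zone."""
            vdot = velocity_ms * area_m2                 # entry airflow, m^3/s
            m_co2 = d_co2_ppm * 1e-6 * vdot * rho_co2    # kg/s of CO2 generated
            return m_co2 * e_co2

        # e.g. a 900 ppm CO2 rise in a 10 m^2 entry at 1.5 m/s
        print(f"HRR ~= {hrr_from_co2(900.0, 1.5, 10.0) / 1e3:.0f} kW")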

  16. A Comparative Analysis of Disaster Risk, Vulnerability and Resilience Composite Indicators.

    PubMed

    Beccari, Benjamin

    2016-03-14

    In the past decade significant attention has been given to the development of tools that attempt to measure the vulnerability, risk or resilience of communities to disasters. Particular attention has been given to the development of composite indices to quantify these concepts mirroring their deployment in other fields such as sustainable development. Whilst some authors have published reviews of disaster vulnerability, risk and resilience composite indicator methodologies, these have been of a limited nature. This paper seeks to dramatically expand these efforts by analysing 106 composite indicator methodologies to understand the breadth and depth of practice. An extensive search of the academic and grey literature was undertaken for composite indicator and scorecard methodologies that addressed multiple/all hazards; included social and economic aspects of risk, vulnerability or resilience; were sub-national in scope; explained the method and variables used; focussed on the present-day; and, had been tested or implemented. Information on the index construction, geographic areas of application, variables used and other relevant data was collected and analysed. Substantial variety in construction practices of composite indicators of risk, vulnerability and resilience were found. Five key approaches were identified in the literature, with the use of hierarchical or deductive indices being the most common. Typically variables were chosen by experts, came from existing statistical datasets and were combined by simple addition with equal weights. A minimum of 2 variables and a maximum of 235 were used, although approximately two thirds of methodologies used less than 40 variables. The 106 methodologies used 2298 unique variables, the most frequently used being common statistical variables such as population density and unemployment rate. Classification of variables found that on average 34% of the variables used in each methodology related to the social environment, 25% to the disaster environment, 20% to the economic environment, 13% to the built environment, 6% to the natural environment and 3% were other indices. However variables specifically measuring action to mitigate or prepare for disasters only comprised 12%, on average, of the total number of variables in each index. Only 19% of methodologies employed any sensitivity or uncertainty analysis and in only a single case was this comprehensive. A number of potential limitations of the present state of practice and how these might impact on decision makers are discussed. In particular the limited deployment of sensitivity and uncertainty analysis and the low use of direct measures of disaster risk, vulnerability and resilience could significantly limit the quality and reliability of existing methodologies. Recommendations for improvements to indicator development and use are made, as well as suggested future research directions to enhance the theoretical and empirical knowledge base for composite indicator development.
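
    The most common construction the review identifies (expert-chosen statistical variables, min-max normalization, equal weights, additive aggregation) is easy to sketch; the variables and districts below are synthetic:

        import numpy as np

        rng = np.random.default_rng(5)
        X = rng.uniform(size=(8, 3)) * [5000.0, 15.0, 25.0]   # e.g. density, unemployment, % elderly
        higher_is_worse = np.array([True, True, True])         # orient all variables the same way

        Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))  # min-max normalize to [0, 1]
        Z = np.where(higher_is_worse, Z, 1.0 - Z)
        index = Z.mean(axis=1)                                 # equal weights, simple addition
        print("vulnerability index by district:", np.round(index, 3))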

  17. A Comparative Analysis of Disaster Risk, Vulnerability and Resilience Composite Indicators

    PubMed Central

    Beccari, Benjamin

    2016-01-01

    Introduction: In the past decade significant attention has been given to the development of tools that attempt to measure the vulnerability, risk or resilience of communities to disasters. Particular attention has been given to the development of composite indices to quantify these concepts, mirroring their deployment in other fields such as sustainable development. Whilst some authors have published reviews of disaster vulnerability, risk and resilience composite indicator methodologies, these have been of a limited nature. This paper seeks to dramatically expand these efforts by analysing 106 composite indicator methodologies to understand the breadth and depth of practice. Methods: An extensive search of the academic and grey literature was undertaken for composite indicator and scorecard methodologies that addressed multiple/all hazards; included social and economic aspects of risk, vulnerability or resilience; were sub-national in scope; explained the method and variables used; focussed on the present-day; and had been tested or implemented. Information on the index construction, geographic areas of application, variables used and other relevant data was collected and analysed. Results: Substantial variety in the construction practices of composite indicators of risk, vulnerability and resilience was found. Five key approaches were identified in the literature, with the use of hierarchical or deductive indices being the most common. Typically, variables were chosen by experts, came from existing statistical datasets and were combined by simple addition with equal weights. A minimum of 2 variables and a maximum of 235 were used, although approximately two thirds of methodologies used fewer than 40 variables. The 106 methodologies used 2298 unique variables, the most frequently used being common statistical variables such as population density and unemployment rate. Classification of variables found that on average 34% of the variables used in each methodology related to the social environment, 25% to the disaster environment, 20% to the economic environment, 13% to the built environment, 6% to the natural environment and 3% were other indices. However, variables specifically measuring action to mitigate or prepare for disasters only comprised 12%, on average, of the total number of variables in each index. Only 19% of methodologies employed any sensitivity or uncertainty analysis, and in only a single case was this comprehensive. Discussion: A number of potential limitations of the present state of practice and how these might impact on decision makers are discussed. In particular, the limited deployment of sensitivity and uncertainty analysis and the low use of direct measures of disaster risk, vulnerability and resilience could significantly limit the quality and reliability of existing methodologies. Recommendations for improvements to indicator development and use are made, as well as suggested future research directions to enhance the theoretical and empirical knowledge base for composite indicator development. PMID:27066298
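
    As a concrete illustration of the most common construction found in this review (expert-chosen variables, rescaled and combined by simple addition with equal weights), here is a minimal sketch; the indicators and values are hypothetical, and real indices differ in normalisation, weighting and aggregation choices.

      import numpy as np

      # Rows = communities, columns = indicators (e.g. population density,
      # unemployment rate); values are invented for illustration.
      X = np.array([[1200.0, 0.07],
                    [ 300.0, 0.12],
                    [ 800.0, 0.05]])

      # Min-max rescale each indicator to [0, 1], then aggregate by simple
      # addition with equal weights (expressed here as the mean).
      Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
      index = Z.mean(axis=1)
      print(index)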

  18. The tear turnover and tear clearance tests - a review.

    PubMed

    Garaszczuk, Izabela K; Montes Mico, Robert; Iskander, D Robert; Expósito, Alejandro Cerviño

    2018-03-01

    The aim is to provide a summary of methods available for the assessment of tear turnover and tear clearance rates. The review defines tear clearance and tear turnover and describes their implications for ocular surface health. Additionally, it describes the main types of techniques for measuring tear turnover, including fluorescein tear clearance tests, techniques utilizing the electromagnetic spectrum and a tracer molecule, and novel experimental techniques utilizing optical coherence tomography and fluorescein profilometry. Areas covered: Internet databases (PubMed, Science Direct, Google Scholar) and the most frequently cited references were used as the principal resources of information on tear turnover rate and tear clearance rate, presenting methodologies and equipment, as well as their definitions and implications for anterior eye surface health and function. Keywords used for the data search were as follows: tear turnover, tear clearance, fluorescein clearance, scintigraphy, fluorophotometry, tear flow, drainage, tear meniscus dynamics, Krehbiel flow and lacrimal functional unit. Expert commentary: After decades, the topic of tear turnover assessment has been reintroduced. Recently, new techniques have been developed to propose less invasive, less time-consuming and simpler methodologies for the assessment of tear dynamics that have the potential to be utilized in clinical practice.

  19. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    NASA Astrophysics Data System (ADS)

    Izzuddin, Nur; Sunarsih; Priyanto, Agoes

    2015-05-01

    For a vessel operating in the open seas, a marine diesel engine simulator whose engine rotation is controlled and transmitted through the propeller shaft offers a new methodology for self-propulsion tests that tracks fuel savings in real time. Against this background, this paper presents a real-time marine diesel engine simulator system that tracks the actual performance of a ship through a computer-simulated model. A mathematical model of the marine diesel engine and the propeller is used in the simulation to estimate the fuel rate, the engine rotating speed, and the thrust and torque of the propeller, and thus achieve the target vessel speed. The inputs and outputs form a real-time control system for the fuel saving rate and propeller rotating speed that represents the marine diesel engine characteristics. Self-propulsion tests in calm water were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate fuel savings by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will be beneficial for users in analysing different vessel-speed conditions to obtain better characteristics and hence optimize the fuel saving rate.

  20. Implementation of a multi-variable regression analysis in the assessment of the generation rate and composition of hospital solid waste for the design of a sustainable management system in developing countries.

    PubMed

    Al-Khatib, Issam A; Abu Fkhidah, Ismail; Khatib, Jumana I; Kontogianni, Stamatia

    2016-03-01

    Forecasting of hospital solid waste generation is a critical challenge for future planning. The proposed methodology of the present article was applied to the composition and generation rate of hospital solid waste in hospital units in order to validate the results and secure the outcomes of the management plan in national hospitals. A set of three multiple-variable regression models has been derived for estimating the daily total hospital waste, general hospital waste, and total hazardous waste as a function of the number of inpatients, number of total patients, and number of beds. The application of several key indicators and validation procedures indicates the high significance and reliability of the developed models in predicting the hospital solid waste of any hospital. Methodology data were drawn from the existing scientific literature, and useful raw data were retrieved from international organisations and the investigated hospitals' personnel. The generation outcomes are compared with those of other local hospitals and with hospitals in other countries. The main outcome, the developed model results, is presented and analysed thoroughly. The goal is for this model to act as leverage in the discussions among governmental authorities on the implementation of a national plan for safe hospital waste management in Palestine. © The Author(s) 2016.
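
    The regression setup described here (daily waste quantities as a function of inpatients, total patients and beds) can be sketched with ordinary least squares; the records below are invented for illustration and are not the study's data.

      import numpy as np

      # Hypothetical daily records: columns = inpatients, total patients, beds;
      # y = total hospital waste (kg/day).
      X = np.array([[120, 340, 150],
                    [ 95, 280, 150],
                    [140, 400, 160],
                    [110, 310, 150],
                    [130, 360, 160]], dtype=float)
      y = np.array([610.0, 480.0, 720.0, 560.0, 660.0])

      A = np.column_stack([np.ones(len(X)), X])        # intercept + predictors
      coef, res, rank, sv = np.linalg.lstsq(A, y, rcond=None)
      print(coef)                                      # b0, b_inpatients, b_patients, b_beds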

  1. [Integrated Quality Management System (IQMS): a model for improving the quality of reproductive health care in rural Kenya].

    PubMed

    Herrler, Claudia; Bramesfeld, Anke; Brodowski, Marc; Prytherch, Helen; Marx, Irmgard; Nafula, Maureen; Richter-Aairijoki, Heide; Musyoka, Lucy; Marx, Michael; Szecsenyi, Joachim

    2015-01-01

    The aim was to develop a model for improving the quality of reproductive health care services in rural Kenya, designed to measure the quality of these services in a way that allows them to identify measures for improving their performance. The Integrated Quality Management System (IQMS) was developed on the basis of a pre-existing and validated model for quality promotion, namely the European Practice Assessment (EPA). The methodology for quality assessment and feedback of assessment results to the service teams was adopted from the EPA model. The quality assessment methodology included data collection through staff surveys, patient surveys and service visitation. Quality is assessed by indicators, so indicators appropriate for assessing reproductive health care in rural Kenya had to be developed. A search of the Kenyan and international literature was conducted to identify potential indicators. These were then rated for their relevance and clarity by a panel of Kenyan experts; 260 indicators were rated as relevant and assigned to 29 quality dimensions and 5 domains. The implementation of IQMS in ten facilities showed that IQMS is a feasible model for assessing the quality of reproductive health services in rural Kenya. IQMS enables these services to identify quality improvement targets and necessary improvement measures. Both strengths and limitations of IQMS are discussed. Copyright © 2015. Published by Elsevier GmbH.

  2. Purchasing power of civil servant health workers in Mozambique.

    PubMed

    Ferrinho, Fátima; Amaral, Marta; Russo, Giuliano; Ferrinho, Paulo

    2012-01-01

    Health workers' purchasing power is an important consideration in the development of strategies for health workforce development. This work explores the purchasing power variation of Mozambican public sector health workers between 1999 and 2007. This was done through a simple and easy-to-apply methodology to estimate the salaries' capitalization rate by means of the accumulated inflation rate, after taking wage revisions into account. All the career categories in the Ministry of Health and affiliated public sector institutions were considered. In general, the calculated purchasing power increased for most careers under study, and the highest percentage increase was observed for the lowest remuneration careers, contributing in this way to a relative reduction in the difference between the higher and the lower salaries. These results seem to contradict a commonly held assumption that health sector pay has deteriorated over the years, with substantial damage to the poorest. Further studies appear to be needed to design a more accurate methodology to better understand the evolution and impact of public sector health workers' remunerations across the years.
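
    The deflation idea can be sketched in a few lines: accumulate inflation into a deflator and divide the revised nominal salaries by it. All numbers below are invented, not the Mozambican series.

      # Hypothetical salary index after wage revisions, and annual inflation
      # between consecutive years.
      nominal = [100.0, 112.0, 130.0, 155.0]
      inflation = [0.08, 0.06, 0.09]

      deflator = 1.0
      real = [nominal[0]]
      for salary, rate in zip(nominal[1:], inflation):
          deflator *= 1.0 + rate
          real.append(salary / deflator)

      change = real[-1] / real[0] - 1.0
      print(f"purchasing power change: {change:+.1%}")   # positive => a real gain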

  3. Development of building energy asset rating using stock modelling in the USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Na; Goel, Supriya; Makhmalbaf, Atefe

    2016-01-29

    The US Building Energy Asset Score helps building stakeholders quickly gain insight into the efficiency of building systems (envelope, electrical and mechanical systems). A robust, easy-to-understand 10-point scoring system was developed to facilitate an unbiased comparison of similar building types across the country. The Asset Score does not rely on a database or specific building baselines to establish a rating. Rather, distributions of energy use intensity (EUI) for various building use types were constructed using Latin hypercube sampling and converted to a series of stepped linear scales to score buildings. A score is calculated based on the modelled source EUI after adjusting for climate. A web-based scoring tool, which incorporates an analytical engine and a simulation engine, was developed to standardize energy modelling and reduce implementation cost. This paper discusses the methodology used to perform several hundred thousand building simulation runs and develop the scoring scales.
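
    The sampling-and-scaling step can be sketched as follows; the parameter ranges, the stand-in EUI function and the decile-based binning are assumptions for illustration, not the Asset Score tool's actual engine.

      import numpy as np
      from scipy.stats import qmc

      # Draw building parameters by Latin hypercube sampling.
      sampler = qmc.LatinHypercube(d=3, seed=0)
      unit = sampler.random(n=10000)
      lo = [0.2, 1.5, 0.5]       # insulation, lighting W/sf, HVAC efficiency (invented)
      hi = [0.8, 3.5, 1.0]
      params = qmc.scale(unit, lo, hi)

      # Stand-in for a simulation engine: EUI falls with better parameters.
      eui = 300.0 - 120.0 * params[:, 0] - 30.0 * params[:, 2] + 25.0 * params[:, 1]

      # Decile edges of the EUI distribution become stepped 10-point score bins.
      edges = np.percentile(eui, np.arange(10, 100, 10))
      score = 10 - np.searchsorted(edges, 230.0)       # score a building with EUI 230
      print(edges, score)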

  4. A portable circulating tumor cell capture microdevice

    NASA Astrophysics Data System (ADS)

    Datar, Ram H.

    2009-03-01

    Sensitive detection of earliest metastatic spread of tumors in a minimally invasive and user-friendly manner will revolutionize the clinical management of cancer patients. The current methodologies for circulating tumor cell (CTC) capture and identification have significant limitations including time, cost, limited capture efficiency and lack of standardization. We have developed and optimized a novel parylene membrane filter-based portable microdevice for size-based isolation of CTC from human peripheral blood. Following characterization with a model system to study the recovery rate and enrichment factor, a comparison of the microdevice with the commercially available system using blood from cancer patients demonstrated superior recovery rate and the promise of clinical utility of the microdevice. The development of the microdevice and its potential clinical applicability will be discussed.

  5. Sedimentation and the Economics of Selecting an Optimum Reservoir Size

    NASA Astrophysics Data System (ADS)

    Miltz, David; White, David C.

    1987-08-01

    This paper attempts to develop an easily reproducible methodology for the economic selection of an optimal reservoir size given an annual sedimentation rate. The optimal capacity is that at which the marginal cost of constructing additional storage capacity is equal to the dredging costs avoided by having that additional capacity available to store sediment. The cost implications of misestimating dredging costs, construction costs, and sediment delivery rates are investigated. In general, it is shown that oversizing is a rational response to uncertainty in the estimation of parameters. The sensitivity of the results to alternative discount rates is also discussed. The theoretical discussion is illustrated with a case study drawn from Highland Silver Lake in southwestern Illinois.
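
    The marginal rule has a closed form under a hypothetical power-law construction cost; all parameter values below are invented and serve only to illustrate the equal-marginal condition.

      # With construction cost C(K) = c0 * K**b (b < 1, economies of scale),
      # marginal cost c0*b*K**(b-1) declines in K, so the optimum is where it
      # equals P, the per-m3 present value of dredging costs avoided.
      c0, b = 2.5, 0.7          # invented cost parameters
      P = 0.05                  # invented PV of avoided dredging, $/m3

      K_opt = (c0 * b / P) ** (1.0 / (1.0 - b))
      print(f"optimal capacity ~ {K_opt:,.0f} m3")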

  6. Self-tuning control of attitude and momentum management for the Space Station

    NASA Technical Reports Server (NTRS)

    Shieh, L. S.; Sunkel, J. W.; Yuan, Z. Z.; Zhao, X. M.

    1992-01-01

    This paper presents a hybrid state-space self-tuning design methodology using dual-rate sampling for suboptimal digital adaptive control of attitude and momentum management for the Space Station. This new hybrid adaptive control scheme combines an on-line recursive estimation algorithm for indirectly identifying the parameters of a continuous-time system from the available fast-rate sampled data of the inputs and states and a controller synthesis algorithm for indirectly finding the slow-rate suboptimal digital controller from the designed optimal analog controller. The proposed method enables the development of digitally implementable control algorithms for the robust control of Space Station Freedom with unknown environmental disturbances and slowly time-varying dynamics.

  7. Methodological framework for heart rate variability analysis during exercise: application to running and cycling stress testing.

    PubMed

    Hernando, David; Hernando, Alberto; Casajús, Jose A; Laguna, Pablo; Garatachea, Nuria; Bailón, Raquel

    2018-05-01

    Standard methodologies for heart rate variability analysis and its physiological interpretation as a marker of autonomic nervous system condition have been published largely for rest, but much less for exercise. A methodological framework for heart rate variability (HRV) analysis during exercise is proposed, which deals with the non-stationary nature of HRV during exercise, includes respiratory information, and identifies and corrects spectral components related to cardiolocomotor coupling (CC). This was applied to 23 male subjects who underwent different tests (maximal and submaximal, running and cycling) in which the ECG, respiratory frequency and oxygen consumption were simultaneously recorded. High-frequency (HF) power estimates change markedly when moving from the standard fixed band to the proposed methodology: for medium and high levels of exercise and recovery, HF power increases by 20 to 40%. When cycling, HF power increases by around 40% with respect to running, while CC power is around 20% stronger in running.
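
    The band-selection issue can be illustrated with a toy computation: HF power integrated over the standard fixed band (0.15-0.4 Hz) misses a respiratory peak lying above 0.4 Hz, as can happen at exercise breathing rates, while a respiration-centred band captures it. The signal below is synthetic, and the 0.125 Hz band half-width is an assumption, not the paper's exact implementation.

      import numpy as np
      from scipy.signal import welch

      def band_power(x, fs, f_lo, f_hi):
          # Integrate the Welch PSD of an evenly resampled heart-period signal.
          f, p = welch(x, fs=fs, nperseg=512)
          mask = (f >= f_lo) & (f <= f_hi)
          return np.sum(p[mask]) * (f[1] - f[0])

      # Toy 4 Hz heart-period series with a respiratory component at 0.45 Hz.
      fs, f_resp = 4.0, 0.45
      rng = np.random.default_rng(0)
      t = np.arange(0, 300, 1 / fs)
      rr = 0.6 + 0.02 * np.sin(2 * np.pi * f_resp * t) + 0.005 * rng.standard_normal(t.size)

      hf_fixed = band_power(rr, fs, 0.15, 0.40)                     # classic band
      hf_resp = band_power(rr, fs, f_resp - 0.125, f_resp + 0.125)  # respiration-centred
      print(hf_fixed, hf_resp)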

  8. Airfoil deposition model

    NASA Technical Reports Server (NTRS)

    Kohl, F. J.

    1982-01-01

    The methodology to predict deposit evolution (deposition rate and subsequent flow of liquid deposits) as a function of fuel and air impurity content and relevant aerodynamic parameters for turbine airfoils is developed in this research. The spectrum of deposition conditions encountered in gas turbine operations includes the mechanisms of vapor deposition, small particle deposition with thermophoresis, and larger particle deposition with inertial effects. The focus is on using a simplified version of the comprehensive multicomponent vapor diffusion formalism to make deposition predictions for: (1) simple geometry collectors; and (2) gas turbine blade shapes, including both developing laminar and turbulent boundary layers. For the gas turbine blade the insights developed in previous programs are being combined with heat and mass transfer coefficient calculations using the STAN 5 boundary layer code to predict vapor deposition rates and corresponding liquid layer thicknesses on turbine blades. A computer program is being written which utilizes the local values of the calculated deposition rate and skin friction to calculate the increment in liquid condensate layer growth along a collector surface.

  9. Analysis of optimal phenotypic space using elementary modes as applied to Corynebacterium glutamicum

    PubMed Central

    Gayen, Kalyan; Venkatesh, KV

    2006-01-01

    Background Quantification of the metabolic network of an organism offers insights into possible ways of developing a mutant strain for better productivity of an extracellular metabolite. The first step in this quantification is the enumeration of the stoichiometries of all reactions occurring in a metabolic network. The structural details of the network, in combination with experimentally observed accumulation rates of external metabolites, can yield the flux distribution at steady state. One such methodology for quantification is the use of elementary modes, which are minimal sets of enzymes connecting external metabolites. Here, we have used a linear objective function subject to elementary modes as constraints to determine the fluxes in the metabolic network of Corynebacterium glutamicum. The feasible phenotypic space was evaluated at various combinations of oxygen and ammonia uptake rates. Results Quantification of the fluxes of the elementary modes in the metabolism of C. glutamicum was formulated as a linear programming problem. The analysis demonstrated that the solution was dependent on the choice of objective function when fewer than four accumulation rates of the external metabolites were considered. The analysis yielded feasible ranges of fluxes of elementary modes that satisfy the experimental accumulation rates. In C. glutamicum, the elementary modes relating to biomass synthesis through glycolysis and the TCA cycle were predominantly operational in the initial growth phase. At a later time, the elementary modes contributing to lysine synthesis became active. The oxygen and ammonia uptake rates were shown to be bounded in the phenotypic space due to the stoichiometric constraint of the elementary modes. Conclusion We have demonstrated the use of elementary modes and linear programming to quantify a metabolic network. We have used the methodology to quantify the network of C. glutamicum, which evaluates the set of operational elementary modes at different phases of fermentation. The methodology was also used to determine the feasible solution space for a given set of substrate uptake rates under specific optimization criteria. Such an approach can be used to determine the optimality of the accumulation rates of any metabolite in a given network. PMID:17038164
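
    A hedged toy version of the quantification step: fluxes are non-negative combinations of elementary modes, and a linear objective subject to measured accumulation rates can be solved by linear programming. The mode matrix and rates below are invented, not C. glutamicum's actual network.

      import numpy as np
      from scipy.optimize import linprog

      E = np.array([[1.0, 1.0, 0.0],    # substrate uptake per unit of each mode
                    [0.0, 1.0, 1.0],    # product (e.g. lysine) secretion
                    [1.0, 0.0, 1.0]])   # biomass formation
      r = np.array([10.0, 4.0, 8.0])    # measured accumulation rates

      # Minimize total mode activity subject to E @ w = r, w >= 0.
      res = linprog(c=np.ones(3), A_eq=E, b_eq=r, bounds=[(0, None)] * 3,
                    method="highs")
      print(res.x)                      # mode weights reproducing the measurements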

  10. Design and implementation of co-operative control strategy for hybrid AC/DC microgrids

    NASA Astrophysics Data System (ADS)

    Mahmud, Rasel

    This thesis is mainly divided into two major sections: 1) modeling and control of AC, DC and hybrid AC/DC microgrids using distributed co-operative control, and 2) development of a four-bus laboratory prototype of an AC microgrid system. First, a distributed cooperative control (DCC) for a DC microgrid considering the state-of-charge (SoC) of the batteries in a typical plug-in electric vehicle (PEV) is developed. In DC microgrids, this methodology assists load sharing amongst the distributed generation units (DGs) according to their ratings, with improved voltage regulation. Subsequently, a DCC-based control algorithm for the AC microgrid is investigated to improve its performance in terms of power sharing among the DGs, voltage regulation and frequency deviation. The results validate the advantages of the proposed methodology compared to traditional droop control of AC microgrids. The DCC-based control methodologies for the AC and DC microgrids are further expanded to develop a DCC-based power management algorithm for the hybrid AC/DC microgrid. The developed algorithm controls the power flow through the interfacing converter (IC) between the AC and DC microgrids, facilitating power sharing between the DGs according to their power ratings. Moreover, it enables fixed scheduled power delivery under different operating conditions, while maintaining good voltage regulation and an improved frequency profile. The second section provides a detailed explanation and step-by-step design and development of an AC/DC microgrid testbed. Controllers for the three-phase inverters are designed and tested on different generation units along with their corresponding inductor-capacitor-inductor (LCL) filters to eliminate the switching-frequency harmonics. Electric power distribution line models are developed to form the microgrid network topology. Voltage and current sensors are placed in the proper positions to achieve full visibility over the microgrid. A running-average-filter (RAF) based enhanced phase-locked loop (EPLL) is designed and implemented to extract frequency and phase angle information. A PLL-based synchronizing scheme is also developed to synchronize the DGs to the microgrid. The developed laboratory prototype runs on a dSpace platform for real-time data acquisition, communication and controller implementation.

  11. Methodological Review of Intimate Partner Violence Prevention Research

    ERIC Educational Resources Information Center

    Murray, Christine E.; Graybeal, Jennifer

    2007-01-01

    The authors present a methodological review of empirical program evaluation research in the area of intimate partner violence prevention. The authors adapted and utilized criterion-based rating forms to standardize the evaluation of the methodological strengths and weaknesses of each study. The findings indicate that the limited amount of…

  12. 76 FR 72134 - Annual Charges for Use of Government Lands

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-22

    ... revise the methodology used to compute these annual charges. Under the proposed rule, the Commission would create a fee schedule based on the U.S. Bureau of Land Management's (BLM) methodology for calculating rental rates for linear rights of way. This methodology includes a land value per acre, an...

  13. Assessment of Anthelmintic Efficacy of Mebendazole in School Children in Six Countries Where Soil-Transmitted Helminths Are Endemic

    PubMed Central

    Levecke, Bruno; Montresor, Antonio; Albonico, Marco; Ame, Shaali M.; Behnke, Jerzy M.; Bethony, Jeffrey M.; Noumedem, Calvine D.; Engels, Dirk; Guillard, Bertrand; Kotze, Andrew C.; Krolewiecki, Alejandro J.; McCarthy, James S.; Mekonnen, Zeleke; Periago, Maria V.; Sopheak, Hem; Tchuem-Tchuenté, Louis-Albert; Duong, Tran Thanh; Huong, Nguyen Thu; Zeynudin, Ahmed; Vercruysse, Jozef

    2014-01-01

    Background Robust reference values for fecal egg count reduction (FECR) rates of the most widely used anthelmintic drugs in preventive chemotherapy (PC) programs for controlling soil-transmitted helminths (STHs; Ascaris lumbricoides, Trichuris trichiura, and hookworm) are still lacking. However, they are urgently needed to ensure detection of reduced efficacies that are predicted to occur due to growing drug pressure. Here, using a standardized methodology, we assessed the FECR rate of a single oral dose of mebendazole (MEB; 500 mg) against STHs in six trials in school children in different locations around the world. Our results are compared with those previously obtained for similarly conducted trials of a single oral dose of albendazole (ALB; 400 mg). Methodology The efficacy of MEB, as assessed by FECR, was determined in six trials involving 5,830 school children in Brazil, Cambodia, Cameroon, Ethiopia, United Republic of Tanzania, and Vietnam. The efficacy of MEB was compared to that of ALB as previously assessed in 8,841 school children in India and all the above-mentioned study sites, using identical methodologies. Principal Findings The estimated FECR rate [95% confidence interval] of MEB was highest for A. lumbricoides (97.6% [95.8; 99.5]), followed by hookworm (79.6% [71.0; 88.3]). For T. trichiura, the estimated FECR rate was 63.1% [51.6; 74.6]. Compared to MEB, ALB was significantly more efficacious against hookworm (96.2% [91.1; 100], p<0.001) and only marginally, although significantly, better against A. lumbricoides infections (99.9% [99.0; 100], p = 0.012), but equally efficacious for T. trichiura infections (64.5% [44.4; 84.7], p = 0.906). Conclusions/Significance A minimum FECR rate of 95% for A. lumbricoides, 70% for hookworm, and 50% for T. trichiura is expected in MEB-dependent PC programs. Lower FECR results may indicate the development of potential drug resistance. PMID:25299391
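
    At its core, the FECR estimate is the relative reduction in group mean egg counts. A minimal sketch using arithmetic means of eggs per gram (EPG) follows; the counts are invented, and the trials' actual estimator and confidence intervals are more involved.

      def fecr(pre_epg, post_epg):
          # FECR (%) = 100 * (1 - mean post-treatment EPG / mean baseline EPG)
          pre = sum(pre_epg) / len(pre_epg)
          post = sum(post_epg) / len(post_epg)
          return 100.0 * (1.0 - post / pre)

      print(fecr([1200, 800, 1500, 950], [30, 0, 120, 10]))   # ~96.4%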

  14. Fuzzy Based Decision Support System for Condition Assessment and Rating of Bridges

    NASA Astrophysics Data System (ADS)

    Srinivas, Voggu; Sasmal, Saptarshi; Karusala, Ramanjaneyulu

    2016-09-01

    In this work, a knowledge-based decision support system has been developed to efficiently handle issues such as distress diagnosis, assessment of damages and condition rating of existing bridges, towards developing an exclusive and robust Bridge Management System (BMS) for sustainable bridges. The Knowledge Based Expert System (KBES) diagnoses the distresses and finds the cause of distress in the bridge by processing heuristic data combined with site inspection results, laboratory test results, etc. The coupling of symbolic and numeric types of data has been successfully implemented in the expert system to strengthen its decision-making process. Finally, the condition rating of the bridge is carried out using the assessment results obtained from the KBES and the information received from the bridge inspector. A systematic procedure has been developed using fuzzy mathematics for the condition rating of bridges by combining the fuzzy weighted average and the resolution identity technique. The proposed methodologies and the decision support system will facilitate the development of a robust and exclusive BMS for a network of bridges across the country and allow bridge engineers and decision makers to carry out maintenance of bridges in a rational and systematic way.

  15. The price of innovation: new estimates of drug development costs.

    PubMed

    DiMasi, Joseph A; Hansen, Ronald W; Grabowski, Henry G

    2003-03-01

    The research and development costs of 68 randomly selected new drugs were obtained from a survey of 10 pharmaceutical firms. These data were used to estimate the average pre-tax cost of new drug development. The costs of compounds abandoned during testing were linked to the costs of compounds that obtained marketing approval. The estimated average out-of-pocket cost per new drug is 403 million US dollars (2000 dollars). Capitalizing out-of-pocket costs to the point of marketing approval at a real discount rate of 11% yields a total pre-approval cost estimate of 802 million US dollars (2000 dollars). When compared to the results of an earlier study with a similar methodology, total capitalized costs were shown to have increased at an annual rate of 7.4% above general price inflation. Copyright 2003 Elsevier Science B.V.
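
    The capitalization step can be illustrated by compounding each outlay forward to approval at the 11% real rate; the spending profile below is invented (chosen so the totals echo the paper's 403 and roughly 800 million dollar figures).

      rate = 0.11
      # Invented out-of-pocket spending (millions of 2000 dollars), keyed by
      # years before marketing approval.
      spend_by_years_before_approval = {12: 50.0, 9: 80.0, 6: 120.0, 3: 100.0, 0: 53.0}

      out_of_pocket = sum(spend_by_years_before_approval.values())
      capitalized = sum(amount * (1.0 + rate) ** years
                        for years, amount in spend_by_years_before_approval.items())
      print(out_of_pocket, round(capitalized))   # 403 -> ~794: roughly double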

  16. The method of expected number of deaths, 1786-1886-1986.

    PubMed

    Keiding, N

    1987-04-01

    "The method of expected number of deaths is an integral part of standardization of vital rates, which is one of the oldest statistical techniques. The expected number of deaths was calculated in 18th century actuarial mathematics...but the method seems to have been forgotten, and was reinvented in connection with 19th century studies of geographical and occupational variations of mortality.... It is noted that standardization of rates is intimately connected to the study of relative mortality, and a short description of very recent developments in the methodology of that area is included." (SUMMARY IN FRE) excerpt

  17. Novel methodology to obtain salient biomechanical characteristics of insole materials.

    PubMed

    Lavery, L A; Vela, S A; Ashry, H R; Lanctot, D R; Athanasiou, K A

    1997-06-01

    Viscoelastic inserts are commonly used as artificial shock absorbers to prevent neuropathic foot ulcerations by decreasing pressure on the sole of the foot. Unfortunately, there is little scientific information available to guide physicians in the selection of appropriate insole materials. Therefore, a novel methodology was developed to form a rational platform for biomechanical characterizations of insole material durability, which consisted of in vivo gait analysis and in vitro bioengineering measurements. Results show significant differences in the compressive stiffness of the tested insoles and the rate of change over time in both compressive stiffness and peak pressures measured. Good correlations were found between pressure-time integral and Young's modulus (r2 = 0.93), and total energy applied and Young's modulus (r2 = 0.87).

  18. The inverse problem of brain energetics: ketone bodies as alternative substrates

    NASA Astrophysics Data System (ADS)

    Calvetti, D.; Occhipinti, R.; Somersalo, E.

    2008-07-01

    Little is known about brain energy metabolism under ketosis, although there is evidence that ketone bodies have a neuroprotective role in several neurological disorders. We investigate the inverse problem of estimating reaction fluxes and transport rates in the different cellular compartments of the brain, when the data amounts to a few measured arterial venous concentration differences. By using a recently developed methodology to perform Bayesian Flux Balance Analysis and a new five compartment model of the astrocyte-glutamatergic neuron cellular complex, we are able to identify the preferred biochemical pathways during shortage of glucose and in the presence of ketone bodies in the arterial blood. The analysis is performed in a minimally biased way, therefore revealing the potential of this methodology for hypothesis testing.

  19. Composite Dry Structure Cost Improvement Approach

    NASA Technical Reports Server (NTRS)

    Nettles, Alan; Nettles, Mindy

    2015-01-01

    This effort demonstrates that, by focusing only on properties of relevance, composite interstage and shroud structures can be placed on the Space Launch System vehicle in a way that simultaneously reduces cost, improves reliability and maximizes performance, thus providing the Advanced Development Group with a new methodology for utilizing composites to reduce the weight of composite structures on launch vehicles. Interstage and shroud structures were chosen since both are simple in configuration, do not experience extreme environments (such as cryogenic or hot-gas temperatures) and should represent a good starting point for flying composites on a 'man-rated' vehicle. They are used as an example only. The project involves using polymer matrix composites for launch vehicle structures and presents the logic and rationale behind the proposed new methodology.

  20. Signal transduction and amplification through enzyme-triggered ligand release and accelerated catalysis.

    PubMed

    Goggins, Sean; Marsh, Barrie J; Lubben, Anneke T; Frost, Christopher G

    2015-08-01

    Signal transduction and signal amplification are both important mechanisms used within biological signalling pathways. Inspired by this process, we have developed a signal amplification methodology that utilises the selectivity and high activity of enzymes in combination with the robustness and generality of an organometallic catalyst, achieving a hybrid biological and synthetic catalyst cascade. A proligand enzyme substrate was designed to selectively self-immolate in the presence of the enzyme to release a ligand that can bind to a metal pre-catalyst and accelerate the rate of a transfer hydrogenation reaction. Enzyme-triggered catalytic signal amplification was then applied to a range of catalyst substrates demonstrating that signal amplification and signal transduction can both be achieved through this methodology.

  1. Extraction of breathing pattern using temperature sensor based on Arduino board

    NASA Astrophysics Data System (ADS)

    Patel, Rajesh; Sengottuvel, S.; Gireesan, K.; Janawadkar, M. P.; Radhakrishnan, T. S.

    2015-06-01

    Most of the basic functions of the human body are assessed by measuring different parameters from the body, such as temperature, pulse activity and blood pressure. Respiration rate is the number of inhalations a person takes per minute and needs to be quantitatively assessed, as it modulates other measurements such as SQUID-based magnetocardiography (MCG) by bringing the chest closer to or away from the sensor array located inside a stationary liquid-helium cryostat. The respiration rate is usually measured when a person is at rest and simply involves counting the number of inhalations for one minute. This paper aims at the development of a suitable methodology for the measurement of respiration rate with the help of a temperature sensor which monitors the very slight change in temperature near the nostril during inhalation and exhalation. The design and development of the proposed system are presented, along with typical experimental results.
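
    One plausible post-processing step for such a sensor (an assumption for illustration, not the paper's exact algorithm) is to count rising zero crossings of the mean-removed temperature signal, one per breathing cycle.

      import numpy as np

      def breaths_per_minute(temp, fs):
          # One rising zero crossing of the mean-removed signal per breath.
          x = np.asarray(temp) - np.mean(temp)
          rising = np.sum((x[:-1] < 0) & (x[1:] >= 0))
          return rising / (len(x) / fs / 60.0)

      fs = 10.0                                            # Hz, assumed sampling rate
      t = np.arange(0, 60, 1 / fs)
      temp = 34.0 + 0.5 * np.sin(2 * np.pi * 0.25 * t - np.pi / 2)  # 15 breaths/min
      print(breaths_per_minute(temp, fs))                  # ~15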

  2. Expectation maximization-based likelihood inference for flexible cure rate models with Weibull lifetimes.

    PubMed

    Balakrishnan, Narayanaswamy; Pal, Suvra

    2016-08-01

    Recently, a flexible cure rate survival model has been developed by assuming the number of competing causes of the event of interest to follow the Conway-Maxwell-Poisson distribution. This model includes some of the well-known cure rate models discussed in the literature as special cases. Data obtained from cancer clinical trials are often right censored and expectation maximization algorithm can be used in this case to efficiently estimate the model parameters based on right censored data. In this paper, we consider the competing cause scenario and assuming the time-to-event to follow the Weibull distribution, we derive the necessary steps of the expectation maximization algorithm for estimating the parameters of different cure rate survival models. The standard errors of the maximum likelihood estimates are obtained by inverting the observed information matrix. The method of inference developed here is examined by means of an extensive Monte Carlo simulation study. Finally, we illustrate the proposed methodology with a real data on cancer recurrence. © The Author(s) 2013.
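
    The competing-causes construction behind this family of models can be sketched directly: if the number of latent causes N is Conway-Maxwell-Poisson and each latent time is Weibull, the population survival function is the probability generating function of N evaluated at the Weibull survival probability, and the cured fraction is P(N = 0). The parameters below are invented, the series is truncated numerically, and the paper's EM machinery for censored data is not reproduced here.

      import math

      def com_poisson_norm(lam, nu, n_max=200):
          # Truncated normalizing constant Z(lam, nu) = sum lam**n / (n!)**nu.
          return sum(math.exp(n * math.log(lam) - nu * math.lgamma(n + 1))
                     for n in range(n_max + 1))

      def population_survival(t, lam, nu, shape, scale):
          # S_pop(t) = E[S(t)**N] = Z(lam * S(t), nu) / Z(lam, nu), Weibull S(t).
          s = math.exp(-((t / scale) ** shape))
          return com_poisson_norm(lam * s, nu) / com_poisson_norm(lam, nu)

      lam, nu, shape, scale = 1.5, 0.8, 1.4, 2.0           # invented parameters
      cure_fraction = 1.0 / com_poisson_norm(lam, nu)      # P(N = 0), the plateau
      print(population_survival(1.0, lam, nu, shape, scale), cure_fraction)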

  3. [Cancer pain management: Systematic review and critical appraisal of clinical practice guidelines].

    PubMed

    Martínez-Nicolás, I; Ángel-García, D; Saturno, P J; López-Soriano, F

    2016-01-01

    Although several clinical practice guidelines have been developed in the last decades, cancer pain management is still deficient. The purpose of this work was to carry out a comprehensive and systematic literature review of current clinical practice guidelines on cancer pain management and to critically appraise their methodology and content in order to evaluate their quality and validity to cope with this public health issue. A systematic review was performed in the main databases, using English, French and Spanish as languages, from 2008 to 2013. Reporting and methodological quality were rated with the Appraisal of Guidelines for Research and Evaluation II (AGREE-II) tool, including an inter-rater reliability analysis. Guideline recommendations were extracted and classified into several categories and levels of evidence, aiming to analyse guideline variability and the comprehensiveness of their evidence-based content. Six guidelines were included. A wide variability was found in both the reporting and methodological quality of the guidelines, as well as in the content and the level of evidence of their recommendations. The Scottish Intercollegiate Guidelines Network guideline was the best rated using AGREE-II, while the Sociedad Española de Oncología Médica guideline was the worst rated. The Ministry of Health Malaysia guideline was the most comprehensive, and the Scottish Intercollegiate Guidelines Network guideline was the second one. The current guidelines on cancer pain management have limited quality and content. We recommend the Ministry of Health Malaysia and Scottish Intercollegiate Guidelines Network guidelines, whilst the Sociedad Española de Oncología Médica guideline still needs to improve. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.

  4. Assessing Aircraft Susceptibility to Nonlinear Aircraft-Pilot Coupling/Pilot-Induced Oscillations

    NASA Technical Reports Server (NTRS)

    Hess, R.A.; Stout, P. W.

    1997-01-01

    A unified approach for assessing aircraft susceptibility to aircraft-pilot coupling (or pilot-induced oscillations) which was previously reported in the literature and applied to linear systems is extended to nonlinear systems, with emphasis upon vehicles with actuator rate saturation. The linear methodology provided a tool for predicting: (1) handling qualities levels, (2) pilot-induced oscillation rating levels and (3) a frequency range in which pilot-induced oscillations are likely to occur. The extension to nonlinear systems provides a methodology for predicting the latter two quantities. Eight examples are presented to illustrate the use of the technique. The dearth of experimental flight-test data involving systematic variation and assessment of the effects of actuator rate limits presently prevents a more thorough evaluation of the methodology.

  5. Wall jet analysis for circulation control aerodynamics. Part 1: Fundamental CFD and turbulence modeling concepts

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; York, B. J.; Sinha, N.; Dvorak, F. A.

    1987-01-01

    An overview of parabolic and PNS (Parabolized Navier-Stokes) methodology developed to treat highly curved subsonic and supersonic wall jets is presented. The fundamental database to which these models were applied is discussed in detail. The analysis of strong curvature effects was found to require a semi-elliptic extension of the parabolic modeling to account for turbulent contributions to the normal pressure variations, as well as an extension of the turbulence models utilized to account for the highly enhanced mixing rates observed in situations with large convex curvature. A noniterative, pressure-split procedure is shown to extend parabolic models to account for such normal pressure variations in an efficient manner, requiring minimal additional run time over a standard parabolic approach. A new PNS methodology, which extends the parabolic methodology via the addition of a characteristic-based wave solver, is presented to solve this problem. Applications of this approach to analyzing the interaction of wave and turbulence processes in wall jets are presented.

  6. Joint source-channel coding for motion-compensated DCT-based SNR scalable video.

    PubMed

    Kondi, Lisimachos P; Ishtiaq, Faisal; Katsaggelos, Aggelos K

    2002-01-01

    In this paper, we develop an approach toward joint source-channel coding for motion-compensated DCT-based scalable video coding and transmission. A framework for the optimal selection of the source and channel coding rates over all scalable layers is presented such that the overall distortion is minimized. The algorithm utilizes universal rate distortion characteristics which are obtained experimentally and show the sensitivity of the source encoder and decoder to channel errors. The proposed algorithm allocates the available bit rate between scalable layers and, within each layer, between source and channel coding. We present the results of this rate allocation algorithm for video transmission over a wireless channel using the H.263 Version 2 signal-to-noise ratio (SNR) scalable codec for source coding and rate-compatible punctured convolutional (RCPC) codes for channel coding. We discuss the performance of the algorithm with respect to the channel conditions, coding methodologies, layer rates, and number of layers.
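
    The allocation idea can be sketched as a brute-force search over discrete rate menus; the source rates, distortion values and residual loss probabilities below are invented, and the distortion bookkeeping is deliberately simplistic compared with the universal rate-distortion characteristics used in the paper.

      import itertools

      source = {64: 12.0, 128: 7.0, 256: 4.0}        # kbps -> distortion if received
      channel = {0.5: 0.01, 0.67: 0.05, 0.8: 0.15}   # code rate -> residual loss prob
      budget = 600.0                                  # total kbps
      lost_penalty = 20.0                             # distortion if a layer is lost

      best = None
      for (s1, c1), (s2, c2) in itertools.product(
              itertools.product(source, channel), repeat=2):
          total = s1 / c1 + s2 / c2                   # channel coding inflates each layer
          if total > budget:
              continue
          d = sum((1 - p) * source[s] + p * lost_penalty
                  for s, p in ((s1, channel[c1]), (s2, channel[c2])))
          if best is None or d < best[0]:
              best = (d, s1, c1, s2, c2)
      print(best)                                     # min expected distortion + rates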

  7. Discharge ratings for control gates at Mississippi River lock and dam 12, Bellevue, Iowa

    USGS Publications Warehouse

    Heinitz, Albert J.

    1986-01-01

    The water levels of the navigation pools on the Mississippi River are maintained by the operation of tainter and roller gates at the locks and dams. Discharge ratings for the gates at Lock and Dam 12, at Bellevue, Iowa, were developed from current-meter discharge measurements made in the forebays of the gate structures. Methodology is given to accurately compute the gate openings of the tainter gates. Discharge coefficients, in equations that express discharge as a function of tailwater head, forebay head, and height of gate opening, were determined for conditions of submerged-orifice and free-weir flow. A comparison of the rating discharges to the hydraulic-model rating discharges is given for submerged-orifice flow for the tainter and roller gates.
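
    A hedged sketch of the submerged-orifice form of such a rating (the coefficient and dimensions are invented; the report derives its coefficients from current-meter measurements):

      import math

      def gate_discharge_cms(h_forebay, h_tail, gate_opening, gate_width, cd=0.7):
          # Q = Cd * A * sqrt(2 g (h_forebay - h_tail)), submerged-orifice flow.
          g = 9.81
          area = gate_opening * gate_width             # m2
          return cd * area * math.sqrt(2 * g * (h_forebay - h_tail))

      print(gate_discharge_cms(h_forebay=5.0, h_tail=3.8,
                               gate_opening=1.2, gate_width=18.0))   # ~73 m3/s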

  8. Human-Computer System Development Methodology for the Dialogue Management System.

    DTIC Science & Technology

    1982-05-01

    methodologies [HOSIJ78] are given below: 1. The Michael Jackson Methodology [JACKM75] 2. The Warnier-Orr Methodology [HOSIJ78] 3. SADT (Structured...All the mentioned methodologies use a top-down development strategy. The first two methodologies above (Michael Jackson and Warnier-Orr) use data as the

  9. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    PubMed

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile phones offers promise for future data collection in Ghana and may be suitable for other developing countries.
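
    One simplified set of such outcome rates can be computed from call-outcome counts as below; the counts are invented (the paper reports 31%, 81%, 7% and 39%), and AAPOR defines several numbered variants that differ in how partials and unknown-eligibility cases are treated.

      def outcome_rates(complete, partial, refusal, noncontact, other):
          # Simplified response, cooperation, refusal and contact rates.
          denom = complete + partial + refusal + noncontact + other
          response = complete / denom
          cooperation = complete / (complete + partial + refusal)
          refusal_rate = refusal / denom
          contact = (complete + partial + refusal + other) / denom
          return response, cooperation, refusal_rate, contact

      print(outcome_rates(complete=9469, partial=3547, refusal=900,
                          noncontact=15000, other=1500))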

  10. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    PubMed Central

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile phones offers promise for future data collection in Ghana and may be suitable for other developing countries. PMID:29351349

  11. Suggested criteria for evaluating systems engineering methodologies

    NASA Technical Reports Server (NTRS)

    Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.

    1989-01-01

    Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.

  12. Is Mauritius Ready to Become the HRD Leader in Africa? An Assessment of Strategic Human Resource Development in Mauritius

    ERIC Educational Resources Information Center

    Dusoye, Indravidoushi C.; Oogarah, Kavi

    2016-01-01

    Purpose: This paper aims to explore the applicability of Strategic HRD in Mauritius. Additionally, it assesses if Mauritius, with a high HDI factor, can take the lead on Strategic HRD in Africa. Design/methodology/approach: This paper used a mixed-approach questionnaire. A sample of 21 managers was contacted and received a response rate of 67 per…

  13. Development of a Flapping Wing Design Incorporating Shape Memory Alloy Actuation

    DTIC Science & Technology

    2010-03-01

    blimp platform. The Methodology section describes the manner in which functional kinematics of Nitinol were determined, the design and fabrication...functional kinematics of Nitinol . The direction of this research aimed at quantifying the stroke length of selected diameter Nitinol wires as a function...of cycling rate. Several Nitinol wires, trademarked as FlexinolTM and advertised as 50:50 Nickel-Titanium in composition, were purchased online

  14. Analysis of the Database of Theses and Dissertations from DME/UFSCAR about Astronomy Education

    NASA Astrophysics Data System (ADS)

    Rodrigues Ferreira, Orlando; Voelzke, Marcos Rincon

    2013-11-01

    The paper presents a brief analysis of the "Database of Theses and Dissertations about Astronomy Education" from the Department of Teaching Methodology (DME) of the Federal University of São Carlos (UFSCar). This kind of study made it possible to develop new analyses and statistical data, as well as to conduct a rating of Brazilian institutions that produce academic work in the area.

  15. "You Still Got to See Where She's Coming From": Using Photovoice to Understand African American Female Adolescents' Perspectives on Sexual Risk

    ERIC Educational Resources Information Center

    Sidibe, Turquoise; Turner, Kea; Sparks, Alicia; Woods-Jaeger, Briana; Lightfoot, Alexandra

    2018-01-01

    African Americans have the highest rate of new HIV infection in the United States. This photovoice study explored the perspectives and experiences of African American female youth and sought to understand how adolescent development impacts HIV risk. This study used the photovoice methodology with seven African American or Biracial female youth, in…

  16. 76 FR 26324 - Order Making Fiscal Year 2012 Annual Adjustments to Section 31 Fee Rates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-06

    ... Commission is using the same methodology it developed in consultation with the CBO and OMB to project dollar ... Δ_1, * * *, Δ_120. These are given by μ = 0.0074 and σ = 0.123, respectively. 4. Assume that ... the expected value of ADS_t/ADS_(t-1) is given by exp(μ + σ²/2), or on average ADS_t = 1.015 × ADS_(t-1) ...
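
    A quick check of the quoted growth factor (the period of the Δ series is not stated in this excerpt, so this is only an arithmetic verification):

      import math

      # If per-period log changes in average daily dollar volume (ADS) have
      # mean mu and standard deviation sigma, the lognormal expectation of
      # ADS_t/ADS_(t-1) is exp(mu + sigma**2 / 2).
      mu, sigma = 0.0074, 0.123
      print(math.exp(mu + sigma ** 2 / 2))   # ~1.0151, i.e. the quoted 1.015 factor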

  17. The need for a comprehensive expert system development methodology

    NASA Technical Reports Server (NTRS)

    Baumert, John; Critchfield, Anna; Leavitt, Karen

    1988-01-01

    In a traditional software development environment, the introduction of standardized approaches has led to higher quality, maintainable products on the technical side and greater visibility into the status of the effort on the management side. This study examined expert system development to determine whether it differed enough from traditional systems to warrant a reevaluation of current software development methodologies. Its purpose was to identify areas of similarity with traditional software development and areas requiring tailoring to the unique needs of expert systems. A second purpose was to determine whether existing expert system development methodologies meet the needs of expert system development, management, and maintenance personnel. The study consisted of a literature search and personal interviews. It was determined that existing methodologies and approaches to developing expert systems are neither comprehensive nor easily applied, especially to cradle-to-grave system development. As a result, requirements for an expert system development methodology were derived, along with an initial annotated outline for such a methodology.

  18. Preliminary Multi-Variable Cost Model for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip; Hendrichs, Todd

    2010-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. This paper reviews the methodology used to develop space telescope cost models; summarizes recently published single variable models; and presents preliminary results for two and three variable cost models. Some of the findings are that increasing mass reduces cost; it costs less per square meter of collecting aperture to build a large telescope than a small telescope; and technology development as a function of time reduces cost at the rate of 50% per 17 years.
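
    The quoted learning effect corresponds to a simple halving law; a one-line check:

      def technology_cost_multiplier(years_elapsed):
          # Cost falls 50% per 17 years: multiplier = 0.5 ** (dt / 17).
          return 0.5 ** (years_elapsed / 17.0)

      print(technology_cost_multiplier(17.0), technology_cost_multiplier(34.0))  # 0.5 0.25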

  19. The Navigation Guide Systematic Review Methodology: A Rigorous and Transparent Method for Translating Environmental Health Science into Better Health Outcomes

    PubMed Central

    Sutton, Patrice

    2014-01-01

    Background: Synthesizing what is known about the environmental drivers of health is instrumental to taking prevention-oriented action. Methods of research synthesis commonly used in environmental health lag behind systematic review methods developed in the clinical sciences over the past 20 years. Objectives: We sought to develop a proof of concept of the “Navigation Guide,” a systematic and transparent method of research synthesis in environmental health. Discussion: The Navigation Guide methodology builds on best practices in research synthesis in evidence-based medicine and environmental health. Key points of departure from current methods of expert-based narrative review prevalent in environmental health include a prespecified protocol, standardized and transparent documentation including expert judgment, a comprehensive search strategy, assessment of “risk of bias,” and separation of the science from values and preferences. Key points of departure from evidence-based medicine include assigning a “moderate” quality rating to human observational studies and combining diverse evidence streams. Conclusions: The Navigation Guide methodology is a systematic and rigorous approach to research synthesis that has been developed to reduce bias and maximize transparency in the evaluation of environmental health information. Although novel aspects of the method will require further development and validation, our findings demonstrated that improved methods of research synthesis under development at the National Toxicology Program and under consideration by the U.S. Environmental Protection Agency are fully achievable. The institutionalization of robust methods of systematic and transparent review would provide a concrete mechanism for linking science to timely action to prevent harm. Citation: Woodruff TJ, Sutton P. 2014. The Navigation Guide systematic review methodology: a rigorous and transparent method for translating environmental health science into better health outcomes. Environ Health Perspect 122:1007–1014; http://dx.doi.org/10.1289/ehp.1307175 PMID:24968373

  20. Diagnosing Conceptions about the Epistemology of Science: Contributions of a Quantitative Assessment Methodology

    ERIC Educational Resources Information Center

    Vázquez-Alonso, Ángel; Manassero-Mas, María-Antonia; García-Carmona, Antonio; Montesano de Talavera, Marisa

    2016-01-01

    This study applies a new quantitative methodological approach to diagnose epistemology conceptions in a large sample. The analyses use seven multiple-rating items on the epistemology of science drawn from the item pool Views on Science-Technology-Society (VOSTS). The bases of the new methodological diagnostic approach are the empirical…

  1. From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument.

    PubMed

    Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R

    2012-05-17

    Although empirical and theoretical understanding of processes of implementation in health care is advancing, the translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals, drawing on two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and the extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and the collaborative nature of the work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.

  2. Teaching clinical research methodology to the academic medical community: a fifteen-year retrospective of a comprehensive curriculum.

    PubMed

    Supino, Phyllis G; Borer, Jeffrey S

    2007-05-01

    Due to inadequate preparation, many medical professionals are unable to critically evaluate published research articles or properly design, execute and present their own research. To increase exposure among physicians, medical students, and allied health professionals to the diverse methodological issues involved in performing research, a comprehensive course on research methodology was designed for physicians and other members of an academic medical community and has been successfully implemented since 1991. The role of the study hypothesis is highlighted, and interactive pedagogical techniques are employed to promote audience engagement. Participants complete an annual evaluation to assess course quality and perceived outcomes; outcomes also are assessed qualitatively by faculty. More than 500 physicians and other professionals have participated. Ratings have been consistently high. The topics deemed most valuable are investigational planning, hypothesis construction and study designs. An enhancement of participants' capacity to define hypotheses and apply methodological concepts in the criticism of scientific papers and the development of protocols and manuscripts has been observed. Participants and faculty believe the course improves critical appraisal skills and the ability to conduct research. Our experience shows it is feasible to accomplish these objectives, with a high level of satisfaction, through a didactic program targeted to the general academic community.

  3. Information technology security system engineering methodology

    NASA Technical Reports Server (NTRS)

    Childs, D.

    2003-01-01

    A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  4. Predictive aging results in radiation environments

    NASA Astrophysics Data System (ADS)

    Gillen, Kenneth T.; Clough, Roger L.

    1993-06-01

    We have previously derived a time-temperature-dose rate superposition methodology, which, when applicable, can be used to predict polymer degradation versus dose rate, temperature and exposure time. This methodology results in predictive capabilities at the low dose rates and long time periods appropriate, for instance, to ambient nuclear power plant environments. The methodology was successfully applied to several polymeric cable materials and then verified for two of the materials by comparisons of the model predictions with 12 year, low-dose-rate aging data on these materials from a nuclear environment. In this paper, we provide a more detailed discussion of the methodology and apply it to data obtained on a number of additional nuclear power plant cable insulation (a hypalon, a silicone rubber and two ethylene-tetrafluoroethylenes) and jacket (a hypalon) materials. We then show that the predicted, low-dose-rate results for our materials are in excellent agreement with long-term (7-9 year) low-dose-rate results recently obtained for the same material types actually aged under nuclear power plant conditions. Based on a combination of the modelling and long-term results, we find indications of reasonably similar degradation responses among several different commercial formulations for each of the following "generic" materials: hypalon, ethylene-tetrafluoroethylene, silicone rubber and PVC. If such "generic" behavior can be further substantiated through modelling and long-term results on additional formulations, predictions of cable life for other commercial materials of the same generic types would be greatly facilitated.
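
    A minimal sketch of the thermal part of such a superposition, assuming the common Arrhenius form for the time-shift factor between accelerated-aging and service temperatures; the activation energy is hypothetical, and the full methodology also superposes data across dose rates.

      import math

      R = 8.314e-3  # gas constant, kJ/(mol*K)

      def arrhenius_shift(T_test_K, T_service_K, Ea_kJ_mol=100.0):
          """Multiplicative time-shift factor: damage accumulated in time t at
          the test temperature corresponds to a_T * t at the service
          temperature (Ea is an illustrative activation energy)."""
          return math.exp((Ea_kJ_mol / R) * (1.0 / T_service_K - 1.0 / T_test_K))

      # Shifting a 120 C accelerated test to a 40 C ambient plant environment.
      print(f"time shift factor: {arrhenius_shift(393.15, 313.15):.0f}")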

  5. Web-based automation of green building rating index and life cycle cost analysis

    NASA Astrophysics Data System (ADS)

    Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul

    2018-04-01

    The sudden decline in financial markets and the economic meltdown have slowed adoption of, and lowered investor interest in, green-certified buildings because of their higher initial costs. It is therefore essential to attract investors toward further development of green buildings through automated tools for construction projects. However, there is a historical dearth of work on the automation of green building rating tools, an essential gap that motivates the development of an automated computerized tool. This paper presents proposed research aiming to develop an integrated, web-based, automated tool that applies a green building rating assessment tool, green technology and life cycle cost (LCC) analysis. It also aims to identify the variables of MyCrest and LCC to be integrated and developed in a framework, then transformed into an automated computerized program. A mixed qualitative and quantitative survey methodology is planned to carry the MyCrest-LCC integration to an automated level. In this study, a preliminary literature review enriches understanding of the integration of Green Building Rating Tools (GBRT) with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute towards green buildings and future agendas.
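
    The life cycle cost side of the proposed integration reduces to a discounted sum of cost streams. A minimal sketch, with a hypothetical discount rate and cost figures:

      def life_cycle_cost(initial, annual_costs, discount_rate):
          """Present value of the initial cost plus discounted annual costs."""
          return initial + sum(c / (1 + discount_rate) ** t
                               for t, c in enumerate(annual_costs, start=1))

      # Hypothetical comparison: a green building with a higher initial cost
      # but lower operating costs over a 30-year horizon.
      conventional = life_cycle_cost(1_000_000, [60_000] * 30, 0.05)
      green = life_cycle_cost(1_200_000, [35_000] * 30, 0.05)
      print(green < conventional)  # True for these illustrative figures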

  6. Differing antidepressant maintenance methodologies.

    PubMed

    Safer, Daniel J

    2017-10-01

    The principal evidence that antidepressant medication (ADM) is an effective maintenance treatment for adults with major depressive disorder (MDD) comes from placebo substitution trials. These trials enter responders from ADM efficacy trials into randomized, double-blind placebo-controlled (RDBPC) effectiveness trials to measure the rate of MDD relapse over time. However, other randomized maintenance trial methodologies merit consideration and comparison. A systematic review of ADM randomized maintenance trials included research reports from multiple databases. Relapse rate was the main effectiveness outcome assessed. Five ADM randomized maintenance methodologies for MDD responders are described and compared for outcome. These effectiveness trials include: placebo substitution, ADM/placebo extension, ADM extension, ADM vs. psychotherapy, and treatment as usual. The placebo substitution trials for those abruptly switched to placebo resulted in unusually high (46%) rates of relapse over 6-12 months, twice the continuing-ADM rate. These trials were characterized by selective screening, high attrition, an anxious anticipation of a switch to placebo, and a risk of drug withdrawal symptoms. Selectively screened ADM efficacy responders who entered 4-12 month extension trials experienced relapse rates averaging ~10% with a low attrition rate. Non-industry-sponsored randomized trials of adults with multiple prior MDD episodes who were treated with ADM maintenance for 1-2 years experienced relapse rates averaging 40%. Placebo substitution trial methodology represents only one approach to assessing ADM maintenance. Antidepressant maintenance research for adults with MDD should be evaluated for industry sponsorship, attrition, the impact of the switch to placebo, and major relapse differences in MDD subpopulations. Copyright © 2017. Published by Elsevier Inc.

  7. Development of the Likelihood of Low Glucose (LLG) algorithm for evaluating risk of hypoglycemia: a new approach for using continuous glucose data to guide therapeutic decision making.

    PubMed

    Dunn, Timothy C; Hayter, Gary A; Doniger, Ken J; Wolpert, Howard A

    2014-07-01

    The objective was to develop an analysis methodology for generating diabetes therapy decision guidance using continuous glucose (CG) data. The novel Likelihood of Low Glucose (LLG) methodology, which exploits the relationship between glucose median, glucose variability, and hypoglycemia risk, is mathematically based and can be implemented in computer software. Using JDRF Continuous Glucose Monitoring Clinical Trial data, CG values for all participants were divided into 4-week periods starting at the first available sensor reading. The safety and sensitivity performance regarding hypoglycemia guidance "stoplights" were compared between the LLG method and one based on 10th percentile (P10) values. Examining 13,932 hypoglycemia guidance outputs, the safety performance of the LLG method ranged from 0.5% to 5.4% incorrect "green" indicators, compared with 0.9% to 6.0% for a P10 value of 110 mg/dL. Guidance with lower P10 values yielded higher rates of incorrect indicators, such as 11.7% to 38% at 80 mg/dL. When evaluated only for periods of higher glucose (median above 155 mg/dL), the safety performance of the LLG method was superior to the P10 method. The sensitivity performance of correct "red" indicators of the LLG method had an in-sample rate of 88.3% and an out-of-sample rate of 59.6%, comparable with the P10 method up to about 80 mg/dL. To aid in therapeutic decision making, we developed an algorithm-supported report that graphically highlights low glucose risk and increased variability. When tested with clinical data, the proposed method demonstrated equivalent or superior safety and sensitivity performance. © 2014 Diabetes Technology Society.
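
    The LLG mathematics itself is not given in the abstract, but the 10th-percentile comparator it was benchmarked against can be sketched directly. The thresholds and the stoplight mapping below are illustrative assumptions:

      import numpy as np

      def p10_stoplight(glucose_mg_dl, green_threshold=110.0, red_threshold=80.0):
          """Map a 4-week block of CGM readings to a hypoglycemia-risk
          'stoplight' by comparing the 10th percentile of glucose to fixed
          thresholds (the P10 comparator described in the abstract)."""
          p10 = np.percentile(glucose_mg_dl, 10)
          if p10 >= green_threshold:
              return "green"   # low likelihood of low glucose
          if p10 >= red_threshold:
              return "yellow"
          return "red"         # elevated hypoglycemia risk

      readings = np.random.default_rng(0).normal(150, 35, size=2000)
      print(p10_stoplight(readings))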

  8. Test Standard Developed for Determining the Slow Crack Growth of Advanced Ceramics at Ambient Temperature

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Salem, Jonathan A.

    1998-01-01

    The service life of structural ceramic components is often limited by the process of slow crack growth. Therefore, it is important to develop an appropriate testing methodology for accurately determining the slow crack growth design parameters necessary for component life prediction. In addition, an appropriate test methodology can be used to determine the influences of component processing variables and composition on the slow crack growth and strength behavior of newly developed materials, thus allowing the component process to be tailored and optimized to specific needs. At the NASA Lewis Research Center, work to develop a standard test method to determine the slow crack growth parameters of advanced ceramics was initiated by the authors in early 1994 in the C 28 (Advanced Ceramics) committee of the American Society for Testing and Materials (ASTM). After about 2 years of required balloting, the draft written by the authors was approved and established as a new ASTM test standard: ASTM C 1368-97, Standard Test Method for Determination of Slow Crack Growth Parameters of Advanced Ceramics by Constant Stress-Rate Flexural Testing at Ambient Temperature. Briefly, the test method uses constant stress-rate testing to determine strengths as a function of stress rate at ambient temperature. Strengths are measured in a routine manner at four or more stress rates by applying constant displacement or loading rates. The slow crack growth parameters required for design are then estimated from a relationship between strength and stress rate. This new standard will be published in the Annual Book of ASTM Standards, Vol. 15.01, in 1998. Currently, a companion draft ASTM standard for determination of the slow crack growth parameters of advanced ceramics at elevated temperatures is being prepared by the authors and will be presented to the committee by the middle of 1998. Consequently, Lewis will maintain an active leadership role in advanced ceramics standardization within ASTM. In addition, the authors have been and are involved with several international standardization organizations including the Versailles Project on Advanced Materials and Standards (VAMAS), the International Energy Agency (IEA), and the International Organization for Standardization (ISO). The associated standardization activities involve fracture toughness, strength, elastic modulus, and the machining of advanced ceramics.
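
    A minimal sketch of the parameter estimation the standard prescribes: log fracture strength is linear in log stress rate with slope 1/(n+1), so the slow-crack-growth exponent n follows from a straight-line fit. The strength data below are fabricated for illustration:

      import numpy as np

      stress_rates = np.array([0.1, 1.0, 10.0, 100.0])     # applied stress rates, MPa/s
      strengths = np.array([320.0, 355.0, 395.0, 440.0])   # measured strengths, MPa (invented)

      # Fit log10(strength) vs log10(stress rate); the slope equals 1/(n+1).
      slope, intercept = np.polyfit(np.log10(stress_rates), np.log10(strengths), 1)
      n = 1.0 / slope - 1.0
      print(f"estimated slow crack growth parameter n = {n:.1f}")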

  9. Population causes and consequences of leading chronic diseases: a comparative analysis of prevailing explanations.

    PubMed

    Stuckler, David

    2008-06-01

    The mortality numbers and rates of chronic disease are rising faster in developing than in developed countries. This article compares prevailing explanations of population chronic disease trends with theoretical and empirical models of population chronic disease epidemiology and assesses some economic consequences of the growth of chronic diseases in developing countries based on the experiences of developed countries. Four decades of male mortality rates of cardiovascular and chronic noncommunicable diseases were regressed on changes in and levels of country income per capita, market integration, foreign direct investment, urbanization rates, and population aging in fifty-six countries for which comparative data were available. Neoclassical economic growth models were used to estimate the effect of the mortality rates of chronic noncommunicable diseases on economic growth in high-income OECD countries. Processes of economic growth, market integration, foreign direct investment, and urbanization were significant determinants of long-term changes in mortality rates of heart disease and chronic noncommunicable disease, and the observed relationships with these social and economic factors were roughly three times stronger than the relationships with the population's aging. In low-income countries, higher levels of country income per capita, population urbanization, foreign direct investment, and market integration were associated with greater mortality rates of heart disease and chronic noncommunicable disease, less increased or sometimes reduced rates in middle-income countries, and decreased rates in high-income countries. Each 10 percent increase in the working-age mortality rates of chronic noncommunicable disease decreased economic growth rates by close to a half percent. Macrosocial and macroeconomic forces are major determinants of population rises in chronic disease mortality, and some prevailing demographic explanations, such as population aging, are incomplete on methodological, empirical, and policy grounds. Rising chronic disease mortality rates will significantly reduce economic growth in developing countries and further widen the health and economic gap between the developed and developing world.

  10. Population Causes and Consequences of Leading Chronic Diseases: A Comparative Analysis of Prevailing Explanations

    PubMed Central

    Stuckler, David

    2008-01-01

    Context The mortality numbers and rates of chronic disease are rising faster in developing than in developed countries. This article compares prevailing explanations of population chronic disease trends with theoretical and empirical models of population chronic disease epidemiology and assesses some economic consequences of the growth of chronic diseases in developing countries based on the experiences of developed countries. Methods Four decades of male mortality rates of cardiovascular and chronic noncommunicable diseases were regressed on changes in and levels of country income per capita, market integration, foreign direct investment, urbanization rates, and population aging in fifty-six countries for which comparative data were available. Neoclassical economic growth models were used to estimate the effect of the mortality rates of chronic noncommunicable diseases on economic growth in high-income OECD countries. Findings Processes of economic growth, market integration, foreign direct investment, and urbanization were significant determinants of long-term changes in mortality rates of heart disease and chronic noncommunicable disease, and the observed relationships with these social and economic factors were roughly three times stronger than the relationships with the population's aging. In low-income countries, higher levels of country income per capita, population urbanization, foreign direct investment, and market integration were associated with greater mortality rates of heart disease and chronic noncommunicable disease, less increased or sometimes reduced rates in middle-income countries, and decreased rates in high-income countries. Each 10 percent increase in the working-age mortality rates of chronic noncommunicable disease decreased economic growth rates by close to a half percent. Conclusions Macrosocial and macroeconomic forces are major determinants of population rises in chronic disease mortality, and some prevailing demographic explanations, such as population aging, are incomplete on methodological, empirical, and policy grounds. Rising chronic disease mortality rates will significantly reduce economic growth in developing countries and further widen the health and economic gap between the developed and developing world. PMID:18522614

  11. New well pattern optimization methodology in mature low-permeability anisotropic reservoirs

    NASA Astrophysics Data System (ADS)

    Qin, Jiazheng; Liu, Yuetian; Feng, Yueli; Ding, Yao; Liu, Liu; He, Youwei

    2018-02-01

    In China, many well patterns were designed before the principal permeability direction of low-permeability anisotropic reservoirs was known. After several years of production, the well line direction often turns out not to be parallel to the principal permeability direction. However, traditional well location optimization methods (in terms of objective functions such as net present value and/or ultimate recovery) are inapplicable, since wells are not free to move around in a mature oilfield. Thus, the well pattern optimization (WPO) of mature low-permeability anisotropic reservoirs is a significant but challenging task, since the original well pattern (WP) will be distorted and reconstructed due to permeability anisotropy. In this paper, we investigate the destruction and reconstruction of WP when the principal permeability direction and well line direction are not parallel. A new methodology was developed to quantitatively optimize the well locations of a mature large-scale WP through a WPO algorithm on the basis of coordinate transformation (i.e. rotating and stretching). For a mature oilfield, the large-scale WP has settled, so it is not economically viable to carry out further infill drilling. This paper circumvents this difficulty by combining the WPO algorithm with well status (open or shut-in) and schedule adjustment. Finally, the methodology is applied to an example. Cumulative oil production rates of the optimized WP are higher, and water-cut is lower, which highlights the potential of applying the WPO methodology in mature large-scale field development projects.
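
    A minimal sketch of the rotate-and-stretch step, using the standard anisotropy transform (rotation into the principal permeability axes, then stretching one axis by the square root of the permeability ratio); the angle and permeabilities are illustrative:

      import numpy as np

      def to_equivalent_isotropic(x, y, theta_deg, kx, ky):
          """Rotate well coordinates into the principal permeability axes,
          then stretch by sqrt(kx/ky) so flow in the transformed plane is
          isotropic (standard transform; all inputs are illustrative)."""
          t = np.radians(theta_deg)
          xr = x * np.cos(t) + y * np.sin(t)    # rotate into principal axes
          yr = -x * np.sin(t) + y * np.cos(t)
          return xr, yr * np.sqrt(kx / ky)      # stretch the minor axis

      # A square pattern distorts into a rectangle in the transformed plane,
      # which is why a pattern laid out off the principal direction no longer
      # behaves as designed.
      print(to_equivalent_isotropic(np.array([0.0, 300.0]), np.array([300.0, 0.0]),
                                    theta_deg=30.0, kx=50.0, ky=5.0))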

  12. [A general review of the discussion at the Beijing International Symposium on Population and Development].

    PubMed

    Ren, Y

    1985-03-29

    A general review of papers and discussions at the Beijing International Symposium on Population and Development held December 10-14, 1984 is presented. Discussions on population and development included China's population change 1949-1982, impacts of economic change on Tianjin's population, the population factor in economic development policy-making, Japanese population and development, recent population development in Hungary, population and economy, comprehensive long-term population development in Russia, fertility rate change factors in China, Shanghai's population change, and population and economic development in Mian County, Shaanxi Province. Fertility rate changes were discussed, including multinational borderline value assumptions, recent trends in life span fertility rate in China, fertility rate in Jiangsu Province, fertility rate change in Zhejiang Province, and sterilization in Yangjiaping, Thailand. Population and employment discussions included the economic impact of world population change, the 1984 International Population Conference, changes in economically productive population and employment strategy, employed/unemployed populations in Guangdong Province, and the economic composition of China's population. Urbanization discussions covered population and development methodological problems, population growth and economic development in the Pacific region, surplus rural population transfer and economic development in China, urbanization analysis, trends and urban population distribution problems, and Liaoning Province population development. Issues in migration, population distribution, and regional population included migration and development of the Great Northwest, internal migration to Beijing, Chinese population growth and economic development by major region, and current population changes of Chinese Tibetans. Under social problems of population, discussions included women's status, development and population change, Shanghai's aging trend, analysis of the aged population, analysis of educational quality in Anhui Province, and the retirement system in Chinese villages.

  13. The Navigation Guide systematic review methodology: a rigorous and transparent method for translating environmental health science into better health outcomes.

    PubMed

    Woodruff, Tracey J; Sutton, Patrice

    2014-10-01

    Synthesizing what is known about the environmental drivers of health is instrumental to taking prevention-oriented action. Methods of research synthesis commonly used in environmental health lag behind systematic review methods developed in the clinical sciences over the past 20 years. We sought to develop a proof of concept of the "Navigation Guide," a systematic and transparent method of research synthesis in environmental health. The Navigation Guide methodology builds on best practices in research synthesis in evidence-based medicine and environmental health. Key points of departure from current methods of expert-based narrative review prevalent in environmental health include a prespecified protocol, standardized and transparent documentation including expert judgment, a comprehensive search strategy, assessment of "risk of bias," and separation of the science from values and preferences. Key points of departure from evidence-based medicine include assigning a "moderate" quality rating to human observational studies and combining diverse evidence streams. The Navigation Guide methodology is a systematic and rigorous approach to research synthesis that has been developed to reduce bias and maximize transparency in the evaluation of environmental health information. Although novel aspects of the method will require further development and validation, our findings demonstrated that improved methods of research synthesis under development at the National Toxicology Program and under consideration by the U.S. Environmental Protection Agency are fully achievable. The institutionalization of robust methods of systematic and transparent review would provide a concrete mechanism for linking science to timely action to prevent harm.

  14. From Theory-Inspired to Theory-Based Interventions: A Protocol for Developing and Testing a Methodology for Linking Behaviour Change Techniques to Theoretical Mechanisms of Action.

    PubMed

    Michie, Susan; Carey, Rachel N; Johnston, Marie; Rothman, Alexander J; de Bruin, Marijn; Kelly, Michael P; Connell, Lauren E

    2018-05-18

    Understanding links between behaviour change techniques (BCTs) and mechanisms of action (the processes through which they affect behaviour) helps inform the systematic development of behaviour change interventions. This research aims to develop and test a methodology for linking BCTs to their mechanisms of action. Study 1 (published explicit links): Hypothesised links between 93 BCTs (from the 93-item BCT taxonomy, BCTTv1) and mechanisms of action will be identified from published interventions and their frequency, explicitness and precision documented. Study 2 (expert-agreed explicit links): Behaviour change experts will identify links between 61 BCTs and 26 mechanisms of action in a formal consensus study. Study 3 (integrated matrix of explicit links): Agreement between studies 1 and 2 will be evaluated and a new group of experts will discuss discrepancies. An integrated matrix of BCT-mechanism of action links, annotated to indicate strength of evidence, will be generated. Study 4 (published implicit links): To determine whether groups of co-occurring BCTs can be linked to theories, we will identify groups of BCTs that are used together from the study 1 literature. A consensus exercise will be used to rate strength of links between groups of BCT and theories. A formal methodology for linking BCTs to their hypothesised mechanisms of action can contribute to the development and evaluation of behaviour change interventions. This research is a step towards developing a behaviour change 'ontology', specifying relations between BCTs, mechanisms of action, modes of delivery, populations, settings and types of behaviour.

  15. How to assess and compare inter-rater reliability, agreement and correlation of ratings: an exemplary analysis of mother-father and parent-teacher expressive vocabulary rating pairs

    PubMed Central

    Stolarova, Margarita; Wolf, Corinna; Rinker, Tanja; Brielmann, Aenne

    2014-01-01

    This report has two main purposes. First, we combine well-known analytical approaches to conduct a comprehensive assessment of agreement and correlation of rating-pairs and to disentangle these often confused concepts, providing a best-practice example on concrete data and a tutorial for future reference. Second, we explore whether a screening questionnaire developed for use with parents can be reliably employed with daycare teachers when assessing early expressive vocabulary. A total of 53 vocabulary rating pairs (34 parent–teacher and 19 mother–father pairs) collected for two-year-old children (12 bilingual) are evaluated. First, inter-rater reliability both within and across subgroups is assessed using the intra-class correlation coefficient (ICC). Next, based on this analysis of reliability and on the test-retest reliability of the employed tool, inter-rater agreement is analyzed, and the magnitude and direction of rating differences are considered. Finally, Pearson correlation coefficients of standardized vocabulary scores are calculated and compared across subgroups. The results underline the necessity to distinguish between reliability measures, agreement and correlation. They also demonstrate the impact of the employed reliability on agreement evaluations. This study provides evidence that parent–teacher ratings of children's early vocabulary can achieve agreement and correlation comparable to those of mother–father ratings on the assessed vocabulary scale. Bilingualism of the evaluated child decreased the likelihood of raters' agreement. We conclude that future reports of agreement, correlation and reliability of ratings will benefit from better definition of terms and stricter methodological approaches. The methodological tutorial provided here holds the potential to increase comparability across empirical reports and can help improve research practices and knowledge transfer to educational and therapeutic settings. PMID:24994985
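
    A minimal sketch of the two-way random-effects, absolute-agreement ICC commonly used for such rating pairs (ICC(2,1) in the Shrout-Fleiss notation); the paired ratings below are invented:

      import numpy as np

      def icc_2_1(ratings):
          """ICC(2,1): two-way random effects, absolute agreement, single
          rater; `ratings` is an (n subjects x k raters) array."""
          n, k = ratings.shape
          grand = ratings.mean()
          row_means, col_means = ratings.mean(axis=1), ratings.mean(axis=0)
          msr = k * ((row_means - grand) ** 2).sum() / (n - 1)   # subjects
          msc = n * ((col_means - grand) ** 2).sum() / (k - 1)   # raters
          sse = ((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
          mse = sse / ((n - 1) * (k - 1))
          return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

      pairs = np.array([[22, 25], [14, 12], [30, 28], [18, 21], [25, 24]])  # invented
      print(f"ICC(2,1) = {icc_2_1(pairs):.2f}")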

  16. The impact of proto- and metazooplankton on the fate of organic carbon in continental ocean margins. Final progress report, May 1992--July 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paffenhofer, G.A.; Verity, P.G.

    1995-12-31

    Three fates potentially consume primary production occurring on ocean margins: portions can be oxidized within the water column, portions can sediment to shelf/slope depots, and portions can be exported to the interior ocean. Zooplankton mediate all three of these processes and thus can alter the pathway and residence time of particulate organic carbon, depending on the size structure and composition of the zooplankton (and phytoplankton). To achieve the long-term goal of quantifying the role of proto- and metazooplankton in removing newly formed POC (primary production), the authors must accomplish two major component objectives: (a) determine plankton carbon biomass at relevant temporal and spatial scales; and (b) measure zooplankton carbon consumption rates and (for metazoan zooplankton) fecal pellet production. These measurements will specify the importance of different zooplankton groups as consumers and transformers of phytoplankton carbon. During Phase 1, they concentrated on methodological and technological developments prerequisite to an organized field program. Specifically, they proposed to develop and test an optical zooplankton counter, and to fully enhance the color image analysis system. In addition, they proposed to evaluate a solid-phase enzyme-linked immunospot assay to quantify predation by metazoan zooplankton on protozoans; and to improve methodology to determine ingestion and growth rates of salps, and accompanying pellet production rates, under conditions which very closely resemble their environment. The image analyzer data provide insights on basic ecosystem parameters relevant to carbon flux from the continental ocean to the deep ocean. Together these approaches provide a powerful set of tools to probe food web relationships in greater detail, to increase the accuracy and speed of carbon biomass and rate measurements, and to enhance data collection and analysis.

  17. Developing methods for systematic reviewing in health services delivery and organisation

    PubMed Central

    Alborz, Alison; McNally, Rosalind

    2007-01-01

    Objectives To develop methods to facilitate the ‘systematic’ review of evidence from a range of methodologies on diffuse or ‘soft’ topics, as exemplified by ‘access to healthcare’. Data sources 28 bibliographic databases, research registers, organisational web sites or library catalogues. Reference lists from identified studies. Contact with experts and service users. Current awareness and contents alerting services in the area of learning disabilities. Review methods Inclusion criteria were English language literature from 1980 onwards, relating to people with learning disabilities of any age and all study designs. The main criterion for assessment was relevance to Gulliford’s model of access to health care, which was adapted to the circumstances of people with learning disabilities. Selected studies were evaluated for scientific rigour, then data were extracted and the results synthesised. Quality assessment was by an initial set of ‘generic’ quality indicators. This enabled further evidence selection before evaluation of findings according to specific criteria for qualitative, quantitative or mixed-method studies. Results 82 studies were fully evaluated. Five studies were rated ‘highly rigorous’, 22 ‘rigorous’ and 46 ‘less rigorous’; 9 ‘poor’ papers were retained where they were the sole evidence covering aspects of the guiding model. The majority of studies were quantitative but used only descriptive statistics. Most evidence lacked methodological detail, which often lowered final quality ratings. Conclusions The application of a consistent structure to quality evaluation can facilitate data appraisal, extraction and synthesis across a range of methodologies in diffuse or ‘soft’ topics. Synthesis can be facilitated further by using software, such as the Microsoft ‘Access’ database, for managing information. PMID:15606880

  18. Developing methods for systematic reviewing in health services delivery and organization: an example from a review of access to health care for people with learning disabilities. Part 2. Evaluation of the literature--a practical guide.

    PubMed

    Alborz, Alison; McNally, Rosalind

    2004-12-01

    To develop methods to facilitate the 'systematic' review of evidence from a range of methodologies on diffuse or 'soft' topics, as exemplified by 'access to health care'. Twenty-eight bibliographic databases, research registers, organizational websites or library catalogues. Reference lists from identified studies. Contact with experts and service users. Current awareness and contents alerting services in the area of learning disabilities. Inclusion criteria were English language literature from 1980 onwards, relating to people with learning disabilities of any age and all study designs. The main criterion for assessment was relevance to Gulliford's model of access to health care, which was adapted to the circumstances of people with learning disabilities. Selected studies were evaluated for scientific rigour, then data were extracted and the results synthesized. Quality assessment was by an initial set of 'generic' quality indicators. This enabled further evidence selection before evaluation of findings according to specific criteria for qualitative, quantitative or mixed-method studies. Eighty-two studies were fully evaluated. Five studies were rated 'highly rigorous', 22 'rigorous' and 46 'less rigorous'; nine 'poor' papers were retained where they were the sole evidence covering aspects of the guiding model. The majority of studies were quantitative but used only descriptive statistics. Most evidence lacked methodological detail, which often lowered final quality ratings. The application of a consistent structure to quality evaluation can facilitate data appraisal, extraction and synthesis across a range of methodologies in diffuse or 'soft' topics. Synthesis can be facilitated further by using software, such as the Microsoft 'Access' database, for managing information.

  19. Multi-parameter vital sign database to assist in alarm optimization for general care units.

    PubMed

    Welch, James; Kanter, Benjamin; Skora, Brooke; McCombie, Scott; Henry, Isaac; McCombie, Devin; Kennedy, Rosemary; Soller, Babs

    2016-12-01

    Continual vital sign assessment on the general care, medical-surgical floor is expected to provide early indication of patient deterioration and increase the effectiveness of rapid response teams. However, there is concern that continual, multi-parameter vital sign monitoring will produce alarm fatigue. The objective of this study was the development of a methodology to help care teams optimize alarm settings. An on-body wireless monitoring system was used to continually assess heart rate, respiratory rate, SpO2 and noninvasive blood pressure in the general ward of ten hospitals between April 1, 2014 and January 19, 2015. These data, 94,575 h for 3430 patients, are contained in a large database, accessible with cloud computing tools. Simulation scenarios assessed the total alarm rate as a function of threshold and annunciation delay (s). The total alarm rate of ten alarms/patient/day predicted from the cloud-hosted database was the same as the total alarm rate for a 10 day evaluation (1550 h for 36 patients) in an independent hospital. Plots of vital sign distributions in the cloud-hosted database were similar to other large databases published by different authors. The cloud-hosted database can be used to run simulations for various alarm thresholds and annunciation delays to predict the total alarm burden experienced by nursing staff. This methodology might, in the future, be used to help reduce alarm fatigue without sacrificing the ability to continually monitor all vital signs.
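
    A minimal sketch of the kind of simulation scenario described: given a vital sign series, an alarm annunciates only when the signal stays beyond the threshold for the full annunciation delay. The sample data, threshold, and delays are illustrative:

      import numpy as np

      def count_alarms(signal, threshold, delay_samples, low=True):
          """Count annunciated alarms: the signal must breach `threshold` for
          `delay_samples` consecutive samples before one alarm fires."""
          breach = signal < threshold if low else signal > threshold
          alarms, run = 0, 0
          for b in breach:
              run = run + 1 if b else 0
              if run == delay_samples:  # fire once per sustained breach
                  alarms += 1
          return alarms

      # One simulated day of 1 Hz SpO2 readings; longer annunciation delays
      # suppress brief excursions and lower the total alarm rate.
      spo2 = np.clip(np.random.default_rng(1).normal(96, 2, 86_400), 70, 100)
      print(count_alarms(spo2, 90, 5), count_alarms(spo2, 90, 60))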

  20. Fault Diagnosis approach based on a model-based reasoner and a functional designer for a wind turbine. An approach towards self-maintenance

    NASA Astrophysics Data System (ADS)

    Echavarria, E.; Tomiyama, T.; van Bussel, G. J. W.

    2007-07-01

    The objective of this on-going research is to develop a design methodology to increase the availability for offshore wind farms, by means of an intelligent maintenance system capable of responding to faults by reconfiguring the system or subsystems, without increasing service visits, complexity, or costs. The idea is to make use of the existing functional redundancies within the system and sub-systems to keep the wind turbine operational, even at a reduced capacity if necessary. Re-configuration is intended to be a built-in capability to be used as a repair strategy, based on these existing functionalities provided by the components. The possible solutions can range from using information from adjacent wind turbines, such as wind speed and direction, to setting up different operational modes, for instance re-wiring, re-connecting, changing parameters or control strategy. The methodology described in this paper is based on qualitative physics and consists of a fault diagnosis system based on a model-based reasoner (MBR), and on a functional redundancy designer (FRD). Both design tools make use of a function-behaviour-state (FBS) model. A design methodology based on the re-configuration concept to achieve self-maintained wind turbines is an interesting and promising approach to reduce stoppage rate, failure events, maintenance visits, and to maintain energy output possibly at reduced rate until the next scheduled maintenance.

  1. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques

    DTIC Science & Technology

    2017-11-01

    ARL-TR-8225, November 2017, US Army Research Laboratory: Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques.

  2. Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States federal legislation has required ozone-depleting chemicals (Class 1 and 2) to be banned from production, the National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated to develop a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme - chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a guideline to help direct the research for replacement technology. The approach for prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used to determine numerical values corresponding to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research for chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced). The methodology consists of comparison matrices (and the smaller comparison components), which allow replacement technology to be quantitatively compared in several categories, and a QFD matrix, which allows process/chemical pairs to be rated against one another for importance (using consistent categories). Depending on the need for application, one can choose the part(s) needed or have the methodology completed in its entirety. For example, if a program needs to show the risk of changing a process/chemical, one may choose to use part of Matrix A and Matrix C. If a chemical is being used and the process must be changed, one might use the Process Concerns part of Matrix D for the existing process and all possible replacement processes. If an overall analysis of a program is needed, one may request the QFD to be completed.
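
    A minimal sketch of the QFD-style scoring at the heart of the prioritization: each process/chemical pair is rated against weighted concern categories, and the weighted sum ranks replacement urgency. All weights and ratings below are invented:

      import numpy as np

      # Rows are process/chemical pairs, columns are concern categories
      # (e.g. regulatory pressure, cost, safety risk, schedule); 1-9 scale.
      weights = np.array([9, 5, 3, 7])
      ratings = np.array([[9, 3, 5, 7],   # pair A
                          [3, 9, 7, 1],   # pair B
                          [5, 5, 9, 7]])  # pair C
      scores = ratings @ weights
      for pair, s in zip("ABC", scores):
          print(f"pair {pair}: urgency score {s}")
      print("replace first:", "ABC"[int(np.argmax(scores))])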

  3. Human Integration Design Processes (HIDP)

    NASA Technical Reports Server (NTRS)

    Boyer, Jennifer

    2014-01-01

    The purpose of the Human Integration Design Processes (HIDP) document is to provide human-systems integration design processes, including methodologies and best practices that NASA has used to meet human systems and human rating requirements for developing crewed spacecraft. HIDP content is framed around human-centered design methodologies and processes in support of human-system integration requirements and human rating. NASA-STD-3001, Space Flight Human-System Standard, is a two-volume set of National Aeronautics and Space Administration (NASA) Agency-level standards established by the Office of the Chief Health and Medical Officer, directed at minimizing health and performance risks for flight crews in human space flight programs. Volume 1 of NASA-STD-3001, Crew Health, sets standards for fitness for duty, space flight permissible exposure limits, permissible outcome limits, levels of medical care, medical diagnosis, intervention, treatment and care, and countermeasures. Volume 2 of NASA-STD-3001, Human Factors, Habitability, and Environmental Health, focuses on human physical and cognitive capabilities and limitations and defines standards for spacecraft (including orbiters, habitats, and suits), internal environments, facilities, payloads, and related equipment, hardware, and software with which the crew interfaces during space operations. The NASA Procedural Requirements (NPR) 8705.2B, Human-Rating Requirements for Space Systems, specifies the Agency's human-rating processes, procedures, and requirements. The HIDP was written to share NASA's knowledge of processes directed toward achieving human certification of a spacecraft through implementation of human-systems integration requirements. Although the HIDP speaks directly to implementation of NASA-STD-3001 and NPR 8705.2B requirements, the human-centered design, evaluation, and design processes described in this document can be applied to any set of human-systems requirements and are independent of reference missions. The HIDP is a reference document that is intended to be used during the development of crewed space systems and operations to guide human-systems development process activities.

  4. 42 CFR 413.312 - Methodology for calculating rates.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospectively Determined Payment Rates for Low-Volume Skilled Nursing Facilities, for Cost Reporting Periods Beginning...

  5. 42 CFR 413.312 - Methodology for calculating rates.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospectively Determined Payment Rates for Low-Volume Skilled Nursing Facilities, for Cost Reporting Periods Beginning...

  6. Modulations of Heart Rate, ECG, and Cardio-Respiratory Coupling Observed in Polysomnography

    PubMed Central

    Penzel, Thomas; Kantelhardt, Jan W.; Bartsch, Ronny P.; Riedl, Maik; Kraemer, Jan F.; Wessel, Niels; Garcia, Carmen; Glos, Martin; Fietze, Ingo; Schöbel, Christoph

    2016-01-01

    The cardiac component of cardio-respiratory polysomnography is covered by ECG and heart rate recordings. However, their evaluation is often underrepresented in summarizing reports. As complements to EEG, EOG, and EMG, these signals provide diagnostic information for autonomic nervous activity during sleep. This review presents major methodological developments in sleep research regarding heart rate, ECG, and cardio-respiratory couplings in a chronological (historical) sequence. It presents physiological and pathophysiological insights related to sleep medicine obtained by new technical developments. Recorded nocturnal ECG facilitates conventional heart rate variability (HRV) analysis, studies of cyclical variations of heart rate, and analysis of ECG waveform. In healthy adults, the autonomic nervous system is regulated in totally different ways during wakefulness, slow-wave sleep, and REM sleep. Analysis of beat-to-beat heart-rate variations with statistical methods enables us to estimate sleep stages based on the differences in autonomic nervous system regulation. Furthermore, up to some degree, it is possible to track transitions from wakefulness to sleep by analysis of heart-rate variations. ECG and heart rate analysis allow assessment of selected sleep disorders as well. Sleep disordered breathing can be detected reliably by studying cyclical variation of heart rate combined with respiration-modulated changes in ECG morphology (amplitude of R wave and T wave). PMID:27826247

  7. Modulations of Heart Rate, ECG, and Cardio-Respiratory Coupling Observed in Polysomnography.

    PubMed

    Penzel, Thomas; Kantelhardt, Jan W; Bartsch, Ronny P; Riedl, Maik; Kraemer, Jan F; Wessel, Niels; Garcia, Carmen; Glos, Martin; Fietze, Ingo; Schöbel, Christoph

    2016-01-01

    The cardiac component of cardio-respiratory polysomnography is covered by ECG and heart rate recordings. However, their evaluation is often underrepresented in summarizing reports. As complements to EEG, EOG, and EMG, these signals provide diagnostic information for autonomic nervous activity during sleep. This review presents major methodological developments in sleep research regarding heart rate, ECG, and cardio-respiratory couplings in a chronological (historical) sequence. It presents physiological and pathophysiological insights related to sleep medicine obtained by new technical developments. Recorded nocturnal ECG facilitates conventional heart rate variability (HRV) analysis, studies of cyclical variations of heart rate, and analysis of ECG waveform. In healthy adults, the autonomic nervous system is regulated in totally different ways during wakefulness, slow-wave sleep, and REM sleep. Analysis of beat-to-beat heart-rate variations with statistical methods enables us to estimate sleep stages based on the differences in autonomic nervous system regulation. Furthermore, up to some degree, it is possible to track transitions from wakefulness to sleep by analysis of heart-rate variations. ECG and heart rate analysis allow assessment of selected sleep disorders as well. Sleep disordered breathing can be detected reliably by studying cyclical variation of heart rate combined with respiration-modulated changes in ECG morphology (amplitude of R wave and T wave).
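
    A minimal sketch of two of the conventional time-domain HRV statistics computable from the beat-to-beat (RR) intervals of a nocturnal ECG; the interval series here is simulated:

      import numpy as np

      def sdnn(rr_ms):
          """Standard deviation of normal-to-normal intervals (overall HRV)."""
          return np.std(rr_ms, ddof=1)

      def rmssd(rr_ms):
          """Root mean square of successive differences, reflecting
          short-term, vagally mediated variability."""
          return np.sqrt(np.mean(np.diff(rr_ms) ** 2))

      rr = np.random.default_rng(2).normal(850, 40, 600)  # simulated RR series, ms
      print(f"SDNN = {sdnn(rr):.1f} ms, RMSSD = {rmssd(rr):.1f} ms")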

  8. Space-Time Dependent Transport, Activation, and Dose Rates for Radioactivated Fluids.

    NASA Astrophysics Data System (ADS)

    Gavazza, Sergio

    Two methods are developed to calculate the space- and time-dependent mass transport of radionuclides, their production and decay, and the associated dose rates generated from the radioactivated fluids flowing through pipes. The work couples space- and time-dependent phenomena, treated as only space- or time-dependent in the open literature. The transport and activation methodology (TAM) is used to numerically calculate space- and time-dependent transport and activation of radionuclides in fluids flowing through pipes exposed to radiation fields, and volumetric radioactive sources created by radionuclide motions. The computer program Radionuclide Activation and Transport in Pipe (RNATPA1) performs the numerical calculations required in TAM. The gamma ray dose methodology (GAM) is used to numerically calculate space- and time-dependent gamma ray dose equivalent rates from the volumetric radioactive sources determined by TAM. The computer program Gamma Ray Dose Equivalent Rate (GRDOSER) performs the numerical calculations required in GAM. The scope of conditions considered by TAM and GAM herein includes (a) laminar flow in straight pipe, (b) recirculating flow schemes, (c) time-independent fluid velocity distributions, (d) space-dependent monoenergetic neutron flux distribution, (e) space- and time-dependent activation of a single parent nuclide and transport and decay of a single daughter radionuclide, and (f) assessment of space- and time-dependent gamma ray dose rates, outside the pipe, generated by the space- and time-dependent source term distributions inside of it. The methodologies, however, can be easily extended to include all the situations of interest for solving the phenomena addressed in this dissertation. A comparison is made between results obtained by the described calculational procedures and analytical expressions. The physics of the problems addressed by the new technique, and its increased accuracy relative to methods that are not space- and time-dependent, are presented. The value of the methods is also discussed. It has been demonstrated that TAM and GAM can be used to enhance the understanding of the space- and time-dependent mass transport of radionuclides, their production and decay, and the associated dose rates related to radioactivated fluids flowing through pipes.
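
    A minimal sketch of the single parent/daughter activation-and-decay step, using the standard saturation-activation equation; the flux, cross section, and nuclide values are illustrative:

      import math

      def activity_bq(phi, sigma_cm2, n_atoms, half_life_s, t_irr_s, t_decay_s=0.0):
          """A = phi*sigma*N*(1 - exp(-lambda*t_irr))*exp(-lambda*t_decay):
          activity built up during irradiation, then decayed downstream
          (single parent nuclide, single daughter; inputs illustrative)."""
          lam = math.log(2) / half_life_s
          return (phi * sigma_cm2 * n_atoms
                  * (1 - math.exp(-lam * t_irr_s)) * math.exp(-lam * t_decay_s))

      # A short-lived activation product saturates within seconds of residence
      # in the flux and decays quickly once the fluid leaves the field.
      print(activity_bq(phi=1e13, sigma_cm2=1e-29, n_atoms=1e22,
                        half_life_s=7.13, t_irr_s=5.0, t_decay_s=10.0))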

  9. Archetype modeling methodology.

    PubMed

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Prediction of road accidents: A Bayesian hierarchical approach.

    PubMed

    Deublein, Markus; Schubert, Matthias; Adey, Bryan T; Köhler, Jochen; Faber, Michael H

    2013-03-01

    In this paper a novel methodology for the prediction of the occurrence of road accidents is presented. The methodology utilizes a combination of three statistical methods: (1) gamma-updating of the occurrence rates of injury accidents and injured road users, (2) hierarchical multivariate Poisson-lognormal regression analysis taking into account correlations amongst multiple dependent model response variables and effects of discrete accident count data e.g. over-dispersion, and (3) Bayesian inference algorithms, which are applied by means of data mining techniques supported by Bayesian Probabilistic Networks in order to represent non-linearity between risk indicating and model response variables, as well as different types of uncertainties which might be present in the development of the specific models. Prior Bayesian Probabilistic Networks are first established by means of multivariate regression analysis of the observed frequencies of the model response variables, e.g. the occurrence of an accident, and observed values of the risk indicating variables, e.g. degree of road curvature. Subsequently, parameter learning is done using updating algorithms, to determine the posterior predictive probability distributions of the model response variables, conditional on the values of the risk indicating variables. The methodology is illustrated through a case study using data of the Austrian rural motorway network. In the case study, on randomly selected road segments the methodology is used to produce a model to predict the expected number of accidents in which an injury has occurred and the expected number of light, severe and fatally injured road users. Additionally, the methodology is used for geo-referenced identification of road sections with increased occurrence probabilities of injury accident events on a road link between two Austrian cities. It is shown that the proposed methodology can be used to develop models to estimate the occurrence of road accidents for any road network provided that the required data are available. Copyright © 2012 Elsevier Ltd. All rights reserved.
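
    A minimal sketch of the gamma-updating step (1): with a Gamma prior on a Poisson occurrence rate, observing x events over exposure n gives a Gamma posterior by conjugacy. The prior parameters and counts are invented:

      # Gamma-Poisson conjugate updating of an accident occurrence rate.
      alpha0, beta0 = 2.0, 4.0  # invented prior: mean 0.5 accidents/segment-year
      x, n = 7, 10.0            # invented data: 7 injury accidents in 10 segment-years

      alpha1, beta1 = alpha0 + x, beta0 + n
      print(f"posterior mean rate = {alpha1 / beta1:.3f} accidents/segment-year")
      print(f"posterior variance = {alpha1 / beta1**2:.4f}")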

  11. Experimental validation of an ultrasonic flowmeter for unsteady flows

    NASA Astrophysics Data System (ADS)

    Leontidis, V.; Cuvier, C.; Caignaert, G.; Dupont, P.; Roussette, O.; Fammery, S.; Nivet, P.; Dazin, A.

    2018-04-01

    An ultrasonic flowmeter was developed for further applications in cryogenic conditions and for measuring flow rate fluctuations in the range of 0 to 70 Hz. The prototype was installed in a flow test rig and was validated experimentally in both steady and unsteady water flow conditions. A Coriolis flowmeter was used for the calibration under steady-state conditions, whereas in the unsteady case the validation was done simultaneously against two methods: particle image velocimetry (PIV), and pressure transducers installed flush on the wall of the pipe. The results show that the developed flowmeter and the proposed methodology can accurately measure the frequency and amplitude of unsteady fluctuations in the experimental range of 0-9 l s⁻¹ of the mean main flow rate and 0-70 Hz of the imposed disturbances.
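
    As a hedged illustration of how the frequency and amplitude of an imposed fluctuation can be verified, the sketch below applies an FFT to a synthetic flow-rate signal; the sampling rate, mean flow and disturbance values are invented, not the rig's.

      import numpy as np

      # Synthetic flow-rate signal: 5 l/s mean with a 0.4 l/s fluctuation at 12 Hz.
      fs = 1000.0                                     # sampling rate, Hz (assumed)
      t = np.arange(0.0, 5.0, 1.0 / fs)
      q = 5.0 + 0.4 * np.sin(2.0 * np.pi * 12.0 * t)

      # FFT of the zero-mean signal; the dominant bin gives frequency and amplitude.
      spec = np.fft.rfft(q - q.mean())
      freqs = np.fft.rfftfreq(q.size, 1.0 / fs)
      peak = np.argmax(np.abs(spec))
      print(freqs[peak], 2.0 * np.abs(spec[peak]) / q.size)  # -> 12.0 Hz, ~0.4 l/s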

  12. Ergonomics strategies and actions for achieving productive use of an ageing work-force.

    PubMed

    Kumashiro, M

    2000-07-01

    In this report, a basic ERGOMA (Ergonomics in Industrial Management) strategy is proposed as a policy for corporate production and employment in countries facing imminent population ageing and declining birth rates, together with a related company-level strategy. Specifically, the report summarizes the results of survey studies aimed at developing methods for determining job capacity, to enable effective use of the labour of ageing workers. A number of the insights gained here are steps towards a foundational methodology for practical use, and many of them must still be validated by measurement in actual practice. However, the theory and newly developed methodology described here are thought to represent significant changes from previously published approaches to job capacity diagnosis and assessment, and from the prevailing stance towards utilization of an ageing work-force. The author is confident that this represents new progress in the ergonomics approach to the working environment of ageing workers and an ageing work-force in general.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tom Elicson; Bentley Harwood; Jim Bouchard

    Over a 12-month period, a fire PRA was developed for a DOE facility using the NUREG/CR-6850 EPRI/NRC fire PRA methodology. The fire PRA modeling included calculation of fire severity factors (SFs) and fire non-suppression probabilities (PNS) for each safe shutdown (SSD) component considered in the fire PRA model. The SFs were developed by performing detailed fire modeling through a combination of CFAST fire zone model calculations and Latin Hypercube Sampling (LHS). Component damage times and automatic fire suppression system actuation times calculated in the CFAST LHS analyses were then input to a time-dependent model of fire non-suppression probability. The fire non-suppression probability model is based on the modeling approach outlined in NUREG/CR-6850 and is supplemented with plant-specific data. This paper presents the methodology used in the DOE facility fire PRA for modeling fire-induced SSD component failures and includes discussions of modeling techniques for: (1) development of time-dependent fire heat release rate profiles (required as input to CFAST), (2) calculation of fire severity factors based on CFAST detailed fire modeling, and (3) calculation of fire non-suppression probabilities.
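
    A minimal sketch of the time-dependent non-suppression step, assuming the exponential manual-suppression model described in NUREG/CR-6850; the suppression rate and damage times below are invented, not the facility's.

      import math

      # P_ns = exp(-lam * t): probability that manual suppression has not
      # succeeded before the component damage time t (minutes), with lam the
      # suppression rate. Damage times stand in for CFAST/LHS realizations.
      lam = 0.1                                 # suppressions per minute (assumed)
      damage_times = [12.0, 18.5, 25.0, 30.0]   # minutes, per LHS sample (assumed)

      p_ns = sum(math.exp(-lam * t) for t in damage_times) / len(damage_times)
      print(f"mean non-suppression probability: {p_ns:.3f}")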

  14. Toward the Long-Term Scientific Study of Encounter Group Phenomena: I. Methodological Considerations.

    ERIC Educational Resources Information Center

    Diamond, Michael Jay; Shapiro, Jerrold Lee

    This paper proposes a model for the long-term scientific study of encounter, T-, and sensitivity groups. The authors see the need for overcoming major methodological and design inadequacies of such research. They discuss major methodological flaws in group outcome research as including: (1) lack of adequate base rate or pretraining measures; (2)…

  15. Systematic Review of the Application of Lean and Six Sigma Quality Improvement Methodologies in Radiology.

    PubMed

    Amaratunga, Thelina; Dobranowski, Julian

    2016-09-01

    Preventable yet clinically significant rates of medical error remain systemic, while health care spending is at a historic high. Industry-based quality improvement (QI) methodologies show potential for utility in health care and radiology because they use an empirical approach to reduce variability and improve workflow. The aim of this review was to systematically assess the literature with regard to the use and efficacy of Lean and Six Sigma (the most popular of the industrial QI methodologies) within radiology. MEDLINE, the Allied & Complementary Medicine Database, Embase Classic + Embase, Health and Psychosocial Instruments, and the Ovid HealthStar database, alongside the Cochrane Library databases, were searched in June 2015. Empirical studies in peer-reviewed journals were included if they assessed the use of Lean, Six Sigma, or Lean Six Sigma with regard to their ability to improve a variety of quality metrics in a radiology-centered clinical setting. Of the 278 articles returned, 23 studies were suitable for inclusion. Of these, 10 assessed Six Sigma, 7 assessed Lean, and 6 assessed Lean Six Sigma. The diverse range of measured outcomes can be organized into 7 common aims: cost savings, reducing appointment wait time, reducing in-department wait time, increasing patient volume, reducing cycle time, reducing defects, and increasing staff and patient safety and satisfaction. All of the included studies demonstrated improvements across a variety of outcomes. However, there were high rates of systematic bias and imprecision as per the Grading of Recommendations Assessment, Development and Evaluation guidelines. Lean and Six Sigma QI methodologies have the potential to reduce error and costs and improve quality within radiology. However, there is a pressing need to conduct high-quality studies in order to realize the true potential of these QI methodologies in health care and radiology. Recommendations on how to improve the quality of the literature are proposed. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  16. Investigation of the fiber/matrix interphase under high loading rates

    NASA Astrophysics Data System (ADS)

    Tanoglu, Metin

    2000-10-01

    This research focuses on characterization of the interphases of various sized E-glass-fiber/epoxy-amine systems under high loading rates. The systems include unsized, epoxy-amine compatible, and epoxy-amine incompatible glass fibers. A new experimental technique (dynamic micro-debonding technique) was developed to directly characterize the fiber/matrix interphase properties under various loading rates. Displacement rates of up to 3000 μm/s that induce high-strain-rate interphase loading were obtained using the rapid expansion capability of the piezoelectric actuators (PZT). A straightforward data reduction scheme, which does not require complex numerical solutions, was also developed by employing thin specimens. This method enables quantification of the strength and specific absorbed energies due to debonding and frictional sliding. Moreover, the technique offers the potential to obtain the shear stress/strain response of the interphases at various rates. A new methodology was also developed to independently investigate the properties of the fiber/matrix interphase. This methodology is based on the assumption that the portion of sizing bound to the glass fiber strongly affects the interphase formation. Conventional burnout and acetone extraction experiments in conjunction with nuclear magnetic resonance spectroscopy were used to determine the composition of the bound sizing. Using the determined composition, model interphase compounds were made to replicate the actual interphase and tested utilizing dynamic mechanical analyzer (DMA) and differential scanning calorimeter (DSC) techniques. The rate-dependent behavior of the model interphase materials and the bulk epoxy matrix was characterized by constructing storage modulus master curves as a function of strain rate using the time-temperature superposition principle. The results of dynamic micro-debonding experiments showed that the values of interphase strength and specific absorbed energies vary depending on the sizing and exhibited significant sensitivity to loading rates. The unsized fibers exhibit greater energy-absorbing capability that could provide better ballistic resistance, while the compatibly sized fibers show higher strength values that improve the structural integrity of the polymeric composites. The calculated interphase shear modulus values from micro-debonding experiments increase with the loading rate, consistent with DMA results. In addition, significantly higher amounts of energy are absorbed within the frictional sliding regime compared to debonding. Characterization of model interphase compounds revealed that the interphase formed due to the presence of bound sizing has a Tg below room temperature, a modulus more compliant than that of the bulk matrix, and a thickness of about 10 nm. The results showed that the properties of the interphases are significantly affected by the interphase network structure.

  17. Developing a self-rating measure of patient competence in the context of oncology: a multi-center study.

    PubMed

    Giesler, Jürgen M; Weis, Joachim

    2008-11-01

    Concepts of patient competence (PC) are being increasingly used, but seldom clearly defined, in the context of shared medical treatment decision making and coping with cancer. The meaning of such concepts should therefore be clarified, and measures developed that permit the assessment of different facets of this patient characteristic. Consequently, this study attempted to contribute to the definition and measurement of PC. Employing literature reviews and qualitative interviews, we developed a working definition of PC in the context of cancer, from which we designed a self-rating measure of this patient characteristic that was then tested for validity and reliability in a sample of N=536 patients with cancer. Using factor analyses, we developed five problem-focused and three emotion-focused subscales that measure distinct facets of PC with satisfactory reliability. Additional analyses provide preliminary evidence of the instrument's validity. This study represents an essential first step in developing a reliable self-rating measure of PC in the context of cancer. Although further refinement of this measure is clearly required, it provides a preliminary methodological basis for empirically investigating the determinants and potential health effects of PC. (c) 2008 John Wiley & Sons, Ltd.

  18. Methodological issues associated with preclinical drug development and increased placebo effects in schizophrenia clinical trials.

    PubMed

    Brown, Matt A; Bishnoi, Ram J; Dholakia, Sara; Velligan, Dawn I

    2016-01-20

    Recent failures to detect efficacy in clinical trials investigating pharmacological treatments for schizophrenia raise concerns regarding the potential contribution of methodological shortcomings to this research. This review provides an examination of two key methodological issues currently suspected of playing a role in hampering schizophrenia drug development: (1) limitations on the translational utility of preclinical development models, and (2) methodological challenges posed by increased placebo effects. Recommendations for strategies to address these methodological issues are provided.

  19. Somatic and gastrointestinal in vivo biotransformation rates of hydrophobic chemicals in fish.

    PubMed

    Lo, Justin C; Campbell, David A; Kennedy, Christopher J; Gobas, Frank A P C

    2015-10-01

    To improve current bioaccumulation assessment methods, a methodology is developed, applied, and investigated for measuring in vivo biotransformation rates of hydrophobic organic substances in the body (soma) and gastrointestinal tract of fish. The method resembles the Organisation for Economic Co-operation and Development (OECD) 305 dietary bioaccumulation test but includes reference chemicals to determine both somatic and gastrointestinal biotransformation rates of test chemicals. Somatic biotransformation rate constants for the test chemicals ranged between 0 d⁻¹ and 0.38 (standard error [SE] 0.03) d⁻¹. Gastrointestinal biotransformation rate constants varied from 0 d⁻¹ to 46 (SE 7) d⁻¹. Gastrointestinal biotransformation contributed more to the overall biotransformation in fish than somatic biotransformation for all test substances but one. Results suggest that biomagnification tests can reveal the full extent of biotransformation in fish. The common presumption that the liver is the main site of biotransformation may not apply to many substances exposed through the diet. The results suggest that the application of quantitative structure-activity relationships (QSARs) for somatic biotransformation rates and hepatic in vitro models to assess the effect of biotransformation on bioaccumulation can underestimate biotransformation rates and overestimate the biomagnification potential of chemicals that are biotransformed in the gastrointestinal tract. With some modifications, the OECD 305 test can generate somatic and gastrointestinal biotransformation data to develop biotransformation QSARs and test in vitro-in vivo biotransformation extrapolation methods. © 2015 SETAC.
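
    A hedged sketch of the rate-constant extraction implied by the method: the depuration rate of a test chemical is compared with that of a non-biotransformed reference chemical, and the difference estimates the biotransformation rate constant. All concentrations below are invented.

      import numpy as np

      # First-order depuration: the slope of ln(concentration) vs. time gives
      # the total elimination rate constant; subtracting the reference
      # chemical's rate isolates the biotransformation contribution (per day).
      days = np.array([0.0, 2.0, 4.0, 8.0, 16.0])
      conc_test = np.array([100.0, 72.0, 52.0, 27.0, 7.3])   # ng/g (invented)
      conc_ref = np.array([100.0, 88.0, 77.0, 60.0, 36.0])   # ng/g (invented)

      k_test = -np.polyfit(days, np.log(conc_test), 1)[0]
      k_ref = -np.polyfit(days, np.log(conc_ref), 1)[0]
      print(k_test - k_ref)  # estimated somatic biotransformation rate, d^-1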

  20. Methodology and reporting of diagnostic accuracy studies of automated perimetry in glaucoma: evaluation using a standardised approach.

    PubMed

    Fidalgo, Bruno M R; Crabb, David P; Lawrenson, John G

    2015-05-01

    To evaluate methodological and reporting quality of diagnostic accuracy studies of perimetry in glaucoma and to determine whether there had been any improvement since the publication of the Standards for Reporting of Diagnostic Accuracy (STARD) guidelines. A systematic review of English language articles published between 1993 and 2013 reporting the diagnostic accuracy of perimetry in glaucoma. Articles were appraised for methodological quality using the 14-item Quality assessment tool for diagnostic accuracy studies (QUADAS) and evaluated for quality of reporting by applying the STARD checklist. Fifty-eight articles were appraised. Overall methodological quality of these studies was moderate with a median number of QUADAS items rated as 'yes' equal to nine (out of a maximum of 14) (IQR 7-10). The studies were often poorly reported; median score of STARD items fully reported was 11 out of 25 (IQR 10-14). A comparison of the studies published in 10-year periods before and after the publication of the STARD checklist in 2003 found quality of reporting had not substantially improved. Methodological and reporting quality of diagnostic accuracy studies of perimetry is sub-optimal and appears not to have improved substantially following the development of the STARD reporting guidance. This observation is consistent with previous studies in ophthalmology and in other medical specialities. © 2015 The Authors Ophthalmic & Physiological Optics © 2015 The College of Optometrists.

  1. Installation Restoration Program Records Search for Kingsley Field, Oregon.

    DTIC Science & Technology

    1982-06-01

    Hazard Assessment Rating Methodology (HARM) is now used for all Air Force IRP studies. To maintain consistency, AFESC had their on-call contractors review...Installation History D. Industrial Facilities E. POL Storage Tanks F. Abandoned Tanks G. Oil/Water Separators H. Site Hazard Rating Methodology I. Site...and implementing regulations. The purpose of DOD policy is to control the migration of hazardous material contaminants from DOD installations.

  2. Sex estimation standards for medieval and contemporary Croats

    PubMed Central

    Bašić, Željana; Kružić, Ivana; Jerković, Ivan; Anđelinović, Deny; Anđelinović, Šimun

    2017-01-01

    Aim To develop discriminant functions for sex estimation on a medieval Croatian population and test their application on a contemporary Croatian population. Methods From a total of 519 skeletons, we chose 84 excellently preserved adult skeletons free of antemortem and postmortem changes and took all standard measurements. Sex was estimated/determined using standard anthropological procedures and ancient DNA (amelogenin analysis) where the pelvis was insufficiently preserved or where morphological sex indicators were not consistent. We explored which measurements showed sexual dimorphism and used them for developing univariate and multivariate discriminant functions for sex estimation. We included only those functions that reached an accuracy rate ≥80%. We tested the applicability of the developed functions on a modern Croatian sample (n = 37). Results Of the 69 standard skeletal measurements used in this study, 56 (74.7%) showed statistically significant sexual dimorphism. We developed five univariate discriminant functions with classification rates of 80.6%-85.2% and seven multivariate discriminant functions with accuracy rates of 81.8%-93.0%. When tested on the modern population, the functions showed classification rates of 74.1%-100%, and ten of them reached the target accuracy rate. Females showed higher classification rates in the medieval population, whereas males were better classified in the modern population. Conclusion The developed discriminant functions are sufficiently accurate for reliable sex estimation in both the medieval Croatian population and modern Croatian samples and may be used in forensic settings. The methodological issues that emerged regarding the importance of considering external factors in the development and application of discriminant functions for sex estimation should be further explored. PMID:28613039
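
    As a hedged illustration of how such discriminant functions are typically derived, the sketch below fits a two-measurement linear discriminant on a training sample and checks its classification rate on a separate test sample, mirroring the medieval-to-modern validation. The measurement names, values, and 1 = male coding are invented, and scikit-learn's LinearDiscriminantAnalysis stands in for the anthropological software actually used.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Two invented measurements (e.g. femoral head diameter, mm; femur
      # length, mm) for a "medieval" training sample; 1 = male, 0 = female.
      X_train = np.array([[45.1, 430.0], [41.2, 398.0], [46.3, 441.0], [40.0, 390.0],
                          [44.8, 425.0], [39.5, 385.0], [47.0, 450.0], [40.8, 395.0]])
      y_train = np.array([1, 0, 1, 0, 1, 0, 1, 0])

      # A small "modern" test sample, mirroring the paper's external validation.
      X_test = np.array([[45.5, 435.0], [40.5, 392.0], [46.0, 438.0], [39.8, 388.0]])
      y_test = np.array([1, 0, 1, 0])

      lda = LinearDiscriminantAnalysis().fit(X_train, y_train)
      print(lda.score(X_test, y_test))  # keep functions with accuracy >= 0.80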

  3. Purchasing power of civil servant health workers in Mozambique

    PubMed Central

    Ferrinho, Fátima; Amaral, Marta; Russo, Giuliano; Ferrinho, Paulo

    2012-01-01

    Background Health workers’ purchasing power is an important consideration in the development of strategies for health workforce development. This work explores the purchasing power variation of Mozambican public sector health workers between 1999 and 2007. Methods This was done through a simple and easy-to-apply methodology to estimate salaries’ capitalization rate, by means of the accumulated inflation rate, after taking wage revisions into account. All the career categories in the Ministry of Health and affiliated public sector institutions were considered. Results In general, the calculated purchasing power increased for most careers under study, and the highest percentage increase was observed for the lowest remuneration careers, contributing in this way to a relative reduction in the difference between the higher and the lower salaries. Conclusion These results seem to contradict a commonly held assumption that health sector pay has deteriorated over the years, with substantial damage to the poorest. Further studies appear to be needed to design a more accurate methodology to better understand the evolution and impact of public sector health workers’ remunerations across the years. PMID:22368757
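
    Illustrative arithmetic for the capitalization-rate idea: deflate a later year's nominal salary by the accumulated inflation since the base year and compare with the base-year salary. The inflation rates and salary figures below are invented, not the Mozambican series.

      # Purchasing-power change: nominal salary deflated by cumulative inflation.
      inflation = [0.05, 0.12, 0.09]          # annual inflation rates (assumed)
      nominal = {1999: 100.0, 2002: 140.0}    # salary after wage revisions (assumed)

      deflator = 1.0
      for rate in inflation:
          deflator *= 1.0 + rate              # accumulated inflation factor

      real_2002 = nominal[2002] / deflator
      print(real_2002 / nominal[1999] - 1.0)  # purchasing-power change vs. 1999, ~+9%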

  4. 47 CFR 65.800 - Rate base.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    Title 47 Telecommunication, Volume 3 (2012-10-01), Section 65.800: FEDERAL COMMUNICATIONS COMMISSION (CONTINUED), COMMON CARRIER SERVICES (CONTINUED), INTERSTATE RATE OF RETURN PRESCRIPTION PROCEDURES AND METHODOLOGIES, Rate Base. § 65.800 Rate base. The rate base shall...

  5. 47 CFR 65.800 - Rate base.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Title 47 Telecommunication, Volume 3 (2011-10-01), Section 65.800: FEDERAL COMMUNICATIONS COMMISSION (CONTINUED), COMMON CARRIER SERVICES (CONTINUED), INTERSTATE RATE OF RETURN PRESCRIPTION PROCEDURES AND METHODOLOGIES, Rate Base. § 65.800 Rate base. The rate base shall...

  6. 47 CFR 65.800 - Rate base.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    Title 47 Telecommunication, Volume 3 (2013-10-01), Section 65.800: FEDERAL COMMUNICATIONS COMMISSION (CONTINUED), COMMON CARRIER SERVICES (CONTINUED), INTERSTATE RATE OF RETURN PRESCRIPTION PROCEDURES AND METHODOLOGIES, Rate Base. § 65.800 Rate base. The rate base shall...

  7. 47 CFR 65.800 - Rate base.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Title 47 Telecommunication, Volume 3 (2010-10-01), Section 65.800: FEDERAL COMMUNICATIONS COMMISSION (CONTINUED), COMMON CARRIER SERVICES (CONTINUED), INTERSTATE RATE OF RETURN PRESCRIPTION PROCEDURES AND METHODOLOGIES, Rate Base. § 65.800 Rate base. The rate base shall...

  8. Visual performance-based image enhancement methodology: an investigation of contrast enhancement algorithms

    NASA Astrophysics Data System (ADS)

    Neriani, Kelly E.; Herbranson, Travis J.; Reis, George A.; Pinkus, Alan R.; Goodyear, Charles D.

    2006-05-01

    While vast numbers of image-enhancing algorithms have already been developed, the majority of these algorithms have not been assessed in terms of their visual performance-enhancing effects using militarily relevant scenarios. The goal of this research was to apply a visual performance-based assessment methodology to evaluate six algorithms that were specifically designed to enhance the contrast of digital images. The image-enhancing algorithms used in this study included three different histogram equalization algorithms, the Autolevels function, the Recursive Rational Filter technique described in Marsi, Ramponi, and Carrato [1], and the multiscale Retinex algorithm described in Rahman, Jobson and Woodell [2]. The methodology used in the assessment has been developed to acquire objective human visual performance data as a means of evaluating the contrast enhancement algorithms. Objective performance metrics, response time and error rate, were used to compare algorithm-enhanced images versus two baseline conditions, original non-enhanced images and contrast-degraded images. Observers completed a visual search task using a spatial forced-choice paradigm. Observers searched images for a target (a military vehicle) hidden among foliage and then indicated in which quadrant of the screen the target was located. Response time and percent correct were measured for each observer. Results of the study and future directions are discussed.

  9. When is good, good enough? Methodological pragmatism for sustainable guideline development.

    PubMed

    Browman, George P; Somerfield, Mark R; Lyman, Gary H; Brouwers, Melissa C

    2015-03-06

    Continuous escalation in methodological and procedural rigor for evidence-based processes in guideline development is associated with increasing costs and production delays that threaten sustainability. While health research methodologists are appropriately responsible for promoting increasing rigor in guideline development, guideline sponsors are responsible for funding such processes. This paper acknowledges that other stakeholders in addition to methodologists should be more involved in negotiating trade-offs between methodological procedures and efficiency in guideline production to produce guidelines that are 'good enough' to be trustworthy and affordable under specific circumstances. The argument for reasonable methodological compromise to meet practical circumstances is consistent with current implicit methodological practice. This paper proposes a conceptual tool as a framework to be used by different stakeholders in negotiating, and explicitly reporting, reasonable compromises for trustworthy as well as cost-worthy guidelines. The framework helps fill a transparency gap in how methodological choices in guideline development are made. The principle, 'when good is good enough' can serve as a basis for this approach. The conceptual tool 'Efficiency-Validity Methodological Continuum' acknowledges trade-offs between validity and efficiency in evidence-based guideline development and allows for negotiation, guided by methodologists, of reasonable methodological compromises among stakeholders. Collaboration among guideline stakeholders in the development process is necessary if evidence-based guideline development is to be sustainable.

  10. Synthesis and Process Optimization of Electrospun PEEK-Sulfonated Nanofibers by Response Surface Methodology

    PubMed Central

    Boaretti, Carlo; Roso, Martina; Lorenzetti, Alessandra; Modesti, Michele

    2015-01-01

    In this study electrospun nanofibers of partially sulfonated polyether ether ketone have been produced as a preliminary step for a possible development of composite proton exchange membranes for fuel cells. Response surface methodology has been employed for the modelling and optimization of the electrospinning process, using a Box-Behnken design. The investigation, based on a second order polynomial model, has been focused on the analysis of the effect of both process (voltage, tip-to-collector distance, flow rate) and material (sulfonation degree) variables on the mean fiber diameter. The final model has been verified by a series of statistical tests on the residuals and validated by a comparison procedure of samples at different sulfonation degrees, realized according to optimized conditions, for the production of homogeneous thin nanofibers. PMID:28793427
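
    A hedged sketch of the response-surface step: fit a full second-order polynomial to design runs and predict the mean fiber diameter at new settings. For brevity only two of the four variables are shown, and all run data are invented, not the Box-Behnken results.

      import numpy as np
      from sklearn.preprocessing import PolynomialFeatures
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline

      # Invented design runs: voltage (kV) and flow rate (mL/h) vs. mean fiber
      # diameter (nm); the real design also varied distance and sulfonation degree.
      X = np.array([[15, 0.5], [15, 1.0], [15, 1.5],
                    [20, 0.5], [20, 1.0], [20, 1.5],
                    [25, 0.5], [25, 1.0], [25, 1.5]], dtype=float)
      d = np.array([310.0, 295.0, 320.0, 270.0, 250.0, 285.0, 290.0, 265.0, 300.0])

      # Second-order polynomial model, as in response surface methodology.
      model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
      model.fit(X, d)
      print(model.predict([[18.0, 0.9]]))  # predicted diameter at new settings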

  11. Fracture Mechanics for Composites: State of the Art and Challenges

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald

    2006-01-01

    Interlaminar fracture mechanics has proven useful for characterizing the onset of delaminations in composites and has been used with limited success primarily to investigate onset in fracture toughness specimens and laboratory-size coupon-type specimens. Future acceptance of the methodology by industry and certification authorities, however, requires the successful demonstration of the methodology on the structural level. In this paper, the state of the art in fracture toughness characterization and interlaminar fracture mechanics analysis tools is described. To demonstrate the application on the structural level, a stringer-reinforced panel was selected. Full implementation of interlaminar fracture mechanics in design, however, remains a challenge and requires continuing development of codes to calculate energy release rates and advancements in delamination onset and growth criteria under mixed-mode conditions.

  12. Scalability analysis methodology for passive optical interconnects in data center networks using PAM

    NASA Astrophysics Data System (ADS)

    Lin, R.; Szczerba, Krzysztof; Agrell, Erik; Wosinska, Lena; Tang, M.; Liu, D.; Chen, J.

    2017-11-01

    A framework is developed for modeling the fundamental impairments in optical datacenter interconnects, i.e., the power loss and the receiver noises. This framework makes it possible to analyze the trade-offs between data rate, modulation order, and the number of ports that can be supported in optical interconnect architectures, while guaranteeing that the required signal-to-noise ratios are satisfied. To the best of our knowledge, such an assessment methodology has not previously been available. As a case study, the trade-offs are investigated for three coupler-based top-of-rack interconnect architectures, which suffer from serious insertion loss. The results show that using single-port transceivers with 10 GHz bandwidth, avalanche photodiode detectors, and 4-level pulse amplitude modulation, more than 500 ports can be supported.
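
    A back-of-the-envelope sketch of the trade-off described above, under simplifying assumptions (not the paper's model): an N-port passive coupler imposes roughly 10·log10(N) dB of splitting loss, and higher-order PAM needs roughly 20·log10(M-1) dB more SNR than on-off keying, so a fixed power budget caps the port count as the modulation order rises. All dB figures are invented for illustration.

      import math

      # Required SNR for PAM-M relative to OOK: eye spacing shrinks by (M-1).
      def required_snr_db(pam_order, snr_ook_db=17.0):
          return snr_ook_db + 20.0 * math.log10(pam_order - 1)

      budget_db = 30.0  # transmit power minus receiver sensitivity (assumed)
      excess_db = 1.0   # coupler excess loss (assumed)
      for m in (2, 4, 8):
          margin_db = budget_db - required_snr_db(m) - excess_db
          n_ports = int(10.0 ** (margin_db / 10.0))  # invert 10*log10(N) splitting loss
          print(f"PAM-{m}: up to {max(n_ports, 0)} ports")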

  13. ICS-II USA research design and methodology.

    PubMed

    Rana, H; Andersen, R M; Nakazono, T T; Davidson, P L

    1997-05-01

    The purpose of the WHO-sponsored International Collaborative Study of Oral Health Outcomes (ICS-II) was to provide policy-makers and researchers with detailed, reliable, and valid data on the oral health situation in their countries or regions, together with comparative data from other dental care delivery systems. ICS-II used a cross-sectional design with no explicit control groups or experimental interventions. A standardized methodology was developed and tested for collecting and analyzing epidemiological, sociocultural, economic, and delivery system data. Respondent information was obtained by household interviews, and clinical examinations were conducted by calibrated oral epidemiologists. Discussed are the sampling design characteristics for the USA research locations, response rates, sample sizes for interview and oral examination data, weighting procedures, and statistical methods. SUDAAN was used to adjust variance calculations, since complex sampling designs were used.

  14. Synthesis and Process Optimization of Electrospun PEEK-Sulfonated Nanofibers by Response Surface Methodology.

    PubMed

    Boaretti, Carlo; Roso, Martina; Lorenzetti, Alessandra; Modesti, Michele

    2015-07-07

    In this study electrospun nanofibers of partially sulfonated polyether ether ketone have been produced as a preliminary step for a possible development of composite proton exchange membranes for fuel cells. Response surface methodology has been employed for the modelling and optimization of the electrospinning process, using a Box-Behnken design. The investigation, based on a second order polynomial model, has been focused on the analysis of the effect of both process (voltage, tip-to-collector distance, flow rate) and material (sulfonation degree) variables on the mean fiber diameter. The final model has been verified by a series of statistical tests on the residuals and validated by a comparison procedure of samples at different sulfonation degrees, realized according to optimized conditions, for the production of homogeneous thin nanofibers.

  15. Optimization of the Electrochemical Extraction and Recovery of Metals from Electronic Waste Using Response Surface Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diaz, Luis A.; Clark, Gemma G.; Lister, Tedd E.

    The rapid growth of electronic waste can be viewed both as an environmental threat and as an attractive source of minerals that can reduce the mining of natural resources and stabilize the market of critical materials, such as rare earths. In this article, response surface methodology was used to optimize a previously developed electrochemical recovery process for base metals from electronic waste using a mild oxidant (Fe3+). Through this process an effective extraction of base metals can be achieved, enriching the concentration of precious metals and significantly reducing the environmental impacts and operational costs associated with waste generation and chemical consumption. The optimization was performed using a bench-scale system specifically designed for this process. Operational parameters such as flow rate, applied current density and iron concentration were optimized to reduce the specific energy consumption of the electrochemical recovery process to 1.94 kWh per kg of metal recovered at a processing rate of 3.3 g of electronic waste per hour.

  16. Optimization of the Electrochemical Extraction and Recovery of Metals from Electronic Waste Using Response Surface Methodology

    DOE PAGES

    Diaz, Luis A.; Clark, Gemma G.; Lister, Tedd E.

    2017-06-08

    The rapid growth of electronic waste can be viewed both as an environmental threat and as an attractive source of minerals that can reduce the mining of natural resources and stabilize the market of critical materials, such as rare earths. In this article, response surface methodology was used to optimize a previously developed electrochemical recovery process for base metals from electronic waste using a mild oxidant (Fe3+). Through this process an effective extraction of base metals can be achieved, enriching the concentration of precious metals and significantly reducing the environmental impacts and operational costs associated with waste generation and chemical consumption. The optimization was performed using a bench-scale system specifically designed for this process. Operational parameters such as flow rate, applied current density and iron concentration were optimized to reduce the specific energy consumption of the electrochemical recovery process to 1.94 kWh per kg of metal recovered at a processing rate of 3.3 g of electronic waste per hour.

  17. Detection and Processing Techniques of FECG Signal for Fetal Monitoring

    PubMed Central

    2009-01-01

    Fetal electrocardiogram (FECG) signal contains potentially precise information that could assist clinicians in making more appropriate and timely decisions during labor. The ultimate reason for the interest in FECG signal analysis is in clinical diagnosis and biomedical applications. The extraction and detection of the FECG signal from composite abdominal signals with powerful and advanced methodologies are becoming very important requirements in fetal monitoring. The purpose of this review paper is to illustrate the various methodologies and developed algorithms for FECG signal detection and analysis to provide efficient and effective ways of understanding the FECG signal and its nature for fetal monitoring. A comparative study has been carried out to show the performance and accuracy of various methods of FECG signal analysis for fetal monitoring. Finally, this paper focuses on some of the hardware implementations using electrical signals for monitoring the fetal heart rate. This paper opens up an avenue for researchers, physicians, and end users to acquire an excellent understanding of the FECG signal and its analysis procedures for fetal heart rate monitoring systems. PMID:19495912

  18. A post-earthquake psychopathological investigation in Armenia: methodology, summary of findings, and follow-up.

    PubMed

    Khachadourian, Vahe; Armenian, Haroutune; Demirchyan, Anahit; Melkonian, Arthur; Hovanesian, Ashot

    2016-07-01

    The post-earthquake psychopathological investigation (PEPSI) was designed to probe the short- and long-term effects of the earthquake in northern Armenia on 7 December 1988 on survivors' mental and physical health. Four phases of this study have been conducted to date, and, overall, more than 80 per cent of a sub-sample of 1,773 drawn from an initial cohort of 32,743 was successfully followed during 2012. This paper describes the methodology employed in the evaluation, summarises previous findings, details the current objectives, and examines the general characteristics of the sample based on the most recent follow-up phase outcomes. Despite a significant decrease in psychopathology rates between 1990 and 2012, prevalence rates of post-traumatic stress disorder and depression among study participants in 2012 were greater than 15 and 26 per cent, respectively. The paper also notes the strengths and limitations of the study vis-à-vis future research and highlights the importance and potential practical implications of similar assessments and their outcomes. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.

  19. A complete equation of state for non-ideal condensed phase explosives

    NASA Astrophysics Data System (ADS)

    Wilkinson, S. D.; Braithwaite, M.; Nikiforakis, N.; Michael, L.

    2017-12-01

    The objective of this work is to improve the robustness and accuracy of numerical simulations of both ideal and non-ideal explosives by introducing temperature dependence in mechanical equations of state for reactants and products. To this end, we modify existing mechanical equations of state to appropriately approximate the temperature in the reaction zone. Mechanical equations of state of the Mie-Grüneisen form are developed with extensions, which allow the temperature to be evaluated appropriately and the temperature equilibrium condition to be applied robustly. Furthermore, the snow plow model is used to capture the effect of porosity on the reactant equation of state. We apply the methodology to predict the velocity of compliantly confined detonation waves. Once reaction rates are calibrated for unconfined detonation velocities, simulations of confined rate sticks and slabs are performed, and the experimental detonation velocities are matched without further parameter alteration, demonstrating the predictive capability of our simulations. We apply the same methodology to both ideal (PBX9502, a high explosive with principal ingredient TATB) and non-ideal (EM120D, an ANE or ammonium nitrate based emulsion) explosives.

  20. Healthcare tariffs for specialist inpatient neurorehabilitation services: rationale and development of a UK casemix and costing methodology.

    PubMed

    Turner-Stokes, Lynne; Sutch, Stephen; Dredge, Robert

    2012-03-01

    To describe the rationale and development of a casemix model and costing methodology for tariff development for specialist neurorehabilitation services in the UK. Patients with complex needs incur higher treatment costs. Fair payment should be weighted in proportion to costs of providing treatment, and should allow for variation over time. CASEMIX MODEL AND BAND-WEIGHTING: Case complexity is measured by the Rehabilitation Complexity Scale (RCS). Cases are divided into five bands of complexity, based on the total RCS score. The principal determinant of costs in rehabilitation is staff time. Total staff hours/week (estimated from the Northwick Park Nursing and Therapy Dependency Scales) are analysed within each complexity band, through cross-sectional analysis of parallel ratings. A 'band-weighting' factor is derived from the relative proportions of staff time within each of the five bands. Total unit treatment costs are obtained from retrospective analysis of provider hospitals' budget and accounting statements. Mean bed-day costs (total unit cost/occupied bed days) are divided broadly into 'variable' and 'non-variable' components. In the weighted costing model, the band-weighting factor is applied to the variable portion of the bed-day cost to derive a banded cost, and thence a set of cost-multipliers. Preliminary data from one unit are presented to illustrate how this weighted costing model will be applied to derive a multilevel banded payment model, based on serial complexity ratings, to allow for change over time.
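
    Illustrative arithmetic for the weighted costing model described above: the band-weighting factor scales only the variable share of the mean bed-day cost, while the non-variable share is charged flat, yielding a banded cost and a cost-multiplier per band. All figures and weights are invented, not actual UK tariff numbers.

      # Band-weighted bed-day costs and the resulting cost-multipliers.
      bed_day_cost = 600.0   # total unit cost / occupied bed days (assumed)
      variable_share = 0.6   # fraction of cost driven by staff time (assumed)
      band_weights = {1: 0.6, 2: 0.8, 3: 1.0, 4: 1.3, 5: 1.7}  # hypothetical factors

      fixed = bed_day_cost * (1.0 - variable_share)
      for band, w in band_weights.items():
          banded = fixed + w * bed_day_cost * variable_share
          print(band, round(banded, 2), round(banded / bed_day_cost, 2))  # cost multiplier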

  1. Establishing confidence in the output of qualitative research synthesis: the ConQual approach.

    PubMed

    Munn, Zachary; Porritt, Kylie; Lockwood, Craig; Aromataris, Edoardo; Pearson, Alan

    2014-09-20

    The importance of findings derived from syntheses of qualitative research has been increasingly acknowledged. Findings that arise from qualitative syntheses inform questions of practice and policy in their own right and are commonly used to complement findings from quantitative research syntheses. The GRADE approach has been widely adopted by international organisations to rate the quality and confidence of the findings of quantitative systematic reviews. To date, there has been no widely accepted corresponding approach to assist health care professionals and policy makers in establishing confidence in the synthesised findings of qualitative systematic reviews. A methodological group was formed to develop a process for assessing confidence in synthesised qualitative research findings and to develop Summary of Findings tables for meta-aggregative qualitative systematic reviews. Dependability and credibility are the two elements considered by the methodological group to influence the confidence of qualitative synthesised findings. A set of critical appraisal questions is proposed to establish dependability, whilst credibility can be ranked according to the goodness of fit between the author's interpretation and the original data. By following the processes outlined in this article, an overall ranking can be assigned to rate the confidence of synthesised qualitative findings, a system we have labelled ConQual. The development and use of the ConQual approach will assist users of qualitative systematic reviews to establish confidence in the evidence produced in these types of reviews and can serve as a practical tool to assist in decision making.

  2. Comparing Alternatives For Replacing Harmful Chemicals

    NASA Technical Reports Server (NTRS)

    Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.

    1995-01-01

    Methodology developed to provide guidance for replacement of industrial chemicals that must be phased out by law because they are toxic and/or affect environment adversely. Chemicals and processes ranked numerically. Applies mostly to chemicals contributing to depletion of ozone in upper atmosphere; some other harmful chemicals included. Quality function deployment matrix format provides convenient way to compare alternative processes and chemicals. Overall rating at bottom of each process-and-chemical column indicates relative advantage.

  3. Towards a Model of Technology Adoption: A Conceptual Model Proposition

    NASA Astrophysics Data System (ADS)

    Costello, Pat; Moreton, Rob

    A conceptual model for Information Communication Technology (ICT) adoption by Small and Medium Enterprises (SMEs) is proposed. The research uses several ICT adoption models as its basis, with theoretical underpinning provided by the Diffusion of Innovation theory and the Technology Acceptance Model (TAM). Taking an exploratory research approach, the model was investigated amongst 200 SMEs whose core business is ICT. Evidence from this study demonstrates that these SMEs face the same issues as all other industry sectors. This work points out weaknesses in SMEs' environments regarding ICT adoption and suggests what they may need to do to increase the success rate of any proposed adoption. The methodology for development of the framework is described and recommendations are made for improved Government-led ICT adoption initiatives. Application of the general methodology has resulted in new opportunities to embed the ethos and culture surrounding the issues into the framework of new projects developed as a result of Government intervention. A conceptual model is proposed that may lead to a deeper understanding of the issues under consideration.

  4. Parameter-free driven Liouville-von Neumann approach for time-dependent electronic transport simulations in open quantum systems

    DOE PAGES

    Zelovich, Tamar; Hansen, Thorsten; Liu, Zhen-Fei; ...

    2017-03-02

    A parameter-free version of the recently developed driven Liouville-von Neumann equation [T. Zelovich et al., J. Chem. Theory Comput. 10(8), 2927-2941 (2014)] for electronic transport calculations in molecular junctions is presented. The single driving rate, appearing as a fitting parameter in the original methodology, is replaced by a set of state-dependent broadening factors applied to the different single-particle lead levels. These broadening factors are extracted explicitly from the self-energy of the corresponding electronic reservoir and are fully transferable to any junction incorporating the same lead model. Furthermore, the performance of the method is demonstrated via tight-binding and extended Hückel calculations of simple junction models. Our analytic considerations and numerical results indicate that the developed methodology constitutes a rigorous framework for the design of "black-box" algorithms to simulate electron dynamics in open quantum systems out of equilibrium.

  5. A method based on infrared detection for determining the moisture content of ceramic plaster materials.

    PubMed

    Macias-Melo, E V; Aguilar-Castro, K M; Alvarez-Lemus, M A; Flores-Prieto, J J

    2015-09-01

    In this work, we describe a methodology for developing a mathematical model based on infrared (IR) detection to determine the moisture content (M) in solid samples. For this purpose, an experimental setup was designed, developed and calibrated against the gravimetric method. The experimental arrangement allowed for the simultaneous measurement of M and the electromotive force (EMF), fitting the experimental variables as closely as possible. These variables were correlated by a mathematical model, and the obtained correlation was M=1.12×exp(3.47×EMF), ±2.54%. This finding suggests that it is feasible to measure the moisture content when it exceeds 2.54%. The proposed methodology could be used under different conditions of temperature, relative humidity and drying rate to evaluate the influence of these variables on the amount of energy received by the IR detector. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
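
    The fitted correlation reported above translates directly into code; the EMF value below is an arbitrary example, and the units follow whatever the instrument's calibration used.

      import math

      # M = 1.12 * exp(3.47 * EMF), applicable (per the paper) only for
      # moisture contents above about 2.54%.
      def moisture_content(emf):
          return 1.12 * math.exp(3.47 * emf)

      print(moisture_content(0.5))  # ~6.35 (% moisture) for an EMF reading of 0.5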

  6. Parameter-free driven Liouville-von Neumann approach for time-dependent electronic transport simulations in open quantum systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zelovich, Tamar; Hansen, Thorsten; Liu, Zhen-Fei

    A parameter-free version of the recently developed driven Liouville-von Neumann equation [T. Zelovich et al., J. Chem. Theory Comput. 10(8), 2927-2941 (2014)] for electronic transport calculations in molecular junctions is presented. The single driving rate, appearing as a fitting parameter in the original methodology, is replaced by a set of state-dependent broadening factors applied to the different single-particle lead levels. These broadening factors are extracted explicitly from the self-energy of the corresponding electronic reservoir and are fully transferable to any junction incorporating the same lead model. Furthermore, the performance of the method is demonstrated via tight-binding and extended Hückel calculations of simple junction models. Our analytic considerations and numerical results indicate that the developed methodology constitutes a rigorous framework for the design of "black-box" algorithms to simulate electron dynamics in open quantum systems out of equilibrium.

  7. Identification of vegetable oil botanical speciation in refined vegetable oil blends using an innovative combination of chromatographic and spectroscopic techniques.

    PubMed

    Osorio, Maria Teresa; Haughey, Simon A; Elliott, Christopher T; Koidis, Anastasios

    2015-12-15

    European Regulation 1169/2011 requires producers of foods that contain refined vegetable oils to label the oil types. A rapid, staged methodology has been developed for the first time to identify common oil species in oil blends. The qualitative method consists of a combination of Fourier Transform Infrared (FTIR) spectroscopy to profile the oils and fatty acid chromatographic analysis to confirm the composition of the oils when required. Calibration models and specific classification criteria were developed, and all data were fused into a simple decision-making system. The single-laboratory validation of the method demonstrated very good performance (96% correct classification, 100% specificity, 4% false positive rate). Only a small fraction of the samples needed to be confirmed, with the majority of oils identified rapidly using only the spectroscopic procedure. The results demonstrate the huge potential of the methodology for a wide range of oil authenticity work. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Analytical methodology for determination of helicopter IFR precision approach requirements. [pilot workload and acceptance level

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.

    1980-01-01

    A systematic analytical approach to the determination of helicopter IFR precision approach requirements is formulated. The approach is based upon the hypothesis that pilot acceptance level or opinion rating of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach-to-landing task, incorporating appropriate models for the UH-1H aircraft, the environmental disturbances, and the human pilot, was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representation of the human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well-known performance-workload relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development and validation.

  9. Heart Rate Variability and Cardiac Vagal Tone in Psychophysiological Research – Recommendations for Experiment Planning, Data Analysis, and Data Reporting

    PubMed Central

    Laborde, Sylvain; Mosley, Emma; Thayer, Julian F.

    2017-01-01

    Psychophysiological research integrating heart rate variability (HRV) has increased during the last two decades, particularly given the fact that HRV is able to index cardiac vagal tone. Cardiac vagal tone, which represents the contribution of the parasympathetic nervous system to cardiac regulation, is acknowledged to be linked with many phenomena relevant for psychophysiological research, including self-regulation at the cognitive, emotional, social, and health levels. The ease of HRV collection and measurement, coupled with the fact that it is relatively affordable, non-invasive and pain-free, makes it widely accessible to many researchers. This ease of access should not obscure the difficulty of interpreting HRV findings, which can be easily misconstrued; however, this can be controlled to some extent through correct methodological processes. Standards of measurement were developed two decades ago by a Task Force within HRV research, and recent reviews have updated several aspects of the Task Force paper. However, many methodological aspects related to HRV in psychophysiological research have to be considered if one aims to draw sound conclusions; otherwise it is difficult to interpret findings and to compare results across laboratories. Those methodological issues have mainly been discussed in separate outlets, making it difficult to get a grasp on them, and thus this paper aims to address this issue. It provides psychophysiological researchers with recommendations and practical advice concerning experimental designs, data analysis, and data reporting. This will ensure that researchers starting a project with HRV and cardiac vagal tone are well informed regarding methodological considerations, in order for their findings to contribute to knowledge advancement in their field. PMID:28265249
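
    A minimal sketch of one widely used time-domain HRV index linked to cardiac vagal tone, RMSSD (root mean square of successive differences between inter-beat intervals); the RR values below are invented.

      import numpy as np

      # RMSSD from a short series of RR (inter-beat) intervals in milliseconds.
      rr_ms = np.array([812, 795, 830, 841, 808, 823, 799])

      diffs = np.diff(rr_ms).astype(float)     # successive RR differences
      rmssd = np.sqrt(np.mean(diffs ** 2))     # root mean square of those differences
      print(f"RMSSD = {rmssd:.1f} ms")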

  10. Applying GRADE-CERQual to qualitative evidence synthesis findings-paper 3: how to assess methodological limitations.

    PubMed

    Munthe-Kaas, Heather; Bohren, Meghan A; Glenton, Claire; Lewin, Simon; Noyes, Jane; Tunçalp, Özge; Booth, Andrew; Garside, Ruth; Colvin, Christopher J; Wainwright, Megan; Rashidian, Arash; Flottorp, Signe; Carlsen, Benedicte

    2018-01-25

    The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Working Group. The approach has been developed to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) coherence, (3) adequacy of data and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual and focuses on CERQual's methodological limitations component. We developed the methodological limitations component by searching the literature for definitions, gathering feedback from relevant research communities and developing consensus through project group meetings. We tested the CERQual methodological limitations component within several qualitative evidence syntheses before agreeing on the current definition and principles for application. When applying CERQual, we define methodological limitations as the extent to which there are concerns about the design or conduct of the primary studies that contributed evidence to an individual review finding. In this paper, we describe the methodological limitations component and its rationale and offer guidance on how to assess the methodological limitations of a review finding as part of the CERQual approach. This guidance outlines the information required to assess the methodological limitations component, the steps that need to be taken to assess the methodological limitations of data contributing to a review finding, and examples of methodological limitations assessments. This paper provides guidance for review authors and others on undertaking an assessment of methodological limitations in the context of the CERQual approach. More work is needed to determine which criteria critical appraisal tools should include when assessing methodological limitations. We currently recommend that, whichever tool is used, review authors provide a transparent description of their assessments of methodological limitations in a review finding. We expect the CERQual approach and its individual components to develop further as our experiences with the practical implementation of the approach increase.

  11. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Izzuddin, Nur; Sunarsih,; Priyanto, Agoes

    A marine diesel engine simulator whose engine rotation is controlled and transmitted through a propeller shaft offers a new methodology for self-propulsion tests that tracks fuel saving in real time as a vessel operates in the open seas. Considering this circumstance, this paper presents a real-time marine diesel engine simulator system to track the real performance of a ship through a computer-simulated model. A mathematical model of the marine diesel engine and the propeller is used in the simulation to estimate fuel rate, engine rotating speed, and thrust and torque of the propeller, and thus achieve the target vessel's speed. The inputs and outputs form a real-time control system for fuel saving rate and propeller rotating speed, representing the marine diesel engine characteristics. Self-propulsion tests in calm water were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate fuel saving by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will allow users to analyse different vessel-speed conditions to obtain better characteristics and hence optimize the fuel saving rate.

  12. A model to predict the thermal reaction norm for the embryo growth rate from field data.

    PubMed

    Girondot, Marc; Kaska, Yakup

    2014-10-01

    The incubation of eggs is strongly influenced by temperature as observed in all species studied to date. For example, incubation duration, sexual phenotype, growth, and performances in many vertebrate hatchlings are affected by incubation temperature. Yet it is very difficult to predict temperature effect based on the temperature within a field nest, as temperature varies throughout incubation. Previous works used egg incubation at constant temperatures in the laboratory to evaluate the dependency of growth rate on temperature. However, generating such data is time consuming and not always feasible due to logistical and legislative constraints. This paper therefore presents a methodology to extract the thermal reaction norm for the embryo growth rate directly from a time series of incubation temperatures recorded within natural nests. This methodology was successfully applied to the nests of the marine turtle Caretta caretta incubated on Dalyan Beach in Turkey, although it can also be used for any egg-laying species, with some of its limitations being discussed in the paper. Knowledge about embryo growth patterns is also important when determining the thermosensitive period for species with temperature-dependent sex determination. Indeed, in this case, sexual phenotype is sensitive to temperature only during this window of embryonic development. Copyright © 2014 Elsevier Ltd. All rights reserved.
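
    A hedged sketch of the core idea: given a candidate thermal reaction norm r(T), accumulate embryo growth over the recorded nest-temperature series; the norm's parameters are then adjusted so that predicted and observed incubation outcomes match. The Gaussian-shaped r(T) and all numbers below are placeholders, not the Caretta caretta fit.

      import math

      # Candidate thermal reaction norm for embryo growth rate (placeholder shape).
      def growth_rate(temp_c, r_max=1.0, t_opt=31.0, width=4.0):
          return r_max * math.exp(-((temp_c - t_opt) / width) ** 2)

      nest_temps = [28.5, 29.0, 30.2, 31.5, 30.8, 29.9]  # one reading per interval

      size = 0.0
      for t in nest_temps:
          size += growth_rate(t) * 1.0  # growth increment per recording interval
      print(size)  # relative embryo size accumulated over the recorded period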

  13. The relationship between the prescription of psychotropic drugs and suicide rates in older people in England and Wales.

    PubMed

    Shah, Ajit; Zhinchin, Galina; Zarate-Escudero, Sofia; Somyaji, Manjunath

    2014-02-01

    Several studies have reported an inverse correlation between general population and elderly suicide rates and antidepressant prescribing rates. Correlations between general population and elderly suicide rates and prescribing rates of other psychotropic drugs have also been reported. All studies of elderly suicide rates have used data over a decade old. The relationship between elderly suicide rates and prescription rates of psychotropic drugs by the broad British National Formulary (BNF) categories, for individual psychotropic drug groups within the BNF categories (e.g. SSRIs), and for individual psychotropic drugs was examined over a 12-year period (1995-2006) using Spearman's rank correlation. All data were ascertained from the archives of the National Statistics Office. There was an absence of significant correlations between elderly suicide rates and rates of prescriptions of psychotropic drugs in the broad BNF categories, individual psychotropic drug groups and individual psychotropic drugs. The findings may be due to methodological flaws. However, if they are genuine, then the following approaches require consideration to further reduce suicide rates: (1) development of strategies to ensure continued prescription of psychotropic drugs at the current level; (2) development of strategies to improve non-pharmacological measures, including improved mental health services provision for older people, improved assessment of suicide risk, increased availability of psychosocial interventions and restricting the availability of methods of suicide; and (3) development of strategies to implement improvements in distal risk (e.g. societal socio-economic status) and protective (e.g. societal educational attainment) factors for suicide at a societal level.
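
    The statistical step itself is a plain Spearman rank correlation over annual series. A minimal sketch, with invented numbers standing in for the National Statistics Office data:

    ```python
    from scipy.stats import spearmanr

    # Synthetic annual series (1995-2006): elderly suicide rate per 100,000
    # and an antidepressant prescribing rate. Values are made up purely for
    # illustration; the study drew its data from national statistics.
    suicide_rate = [11.2, 10.8, 10.9, 10.1, 10.4, 9.8,
                    9.9, 9.5, 9.7, 9.2, 9.3, 9.0]
    prescribing_rate = [52, 55, 59, 61, 66, 70, 74, 79, 83, 88, 93, 97]

    rho, p_value = spearmanr(suicide_rate, prescribing_rate)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
    ```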

  14. Method applied to the background analysis of energy data to be considered for the European Reference Life Cycle Database (ELCD).

    PubMed

    Fazio, Simone; Garraín, Daniel; Mathieux, Fabrice; De la Rúa, Cristina; Recchioni, Marco; Lechón, Yolanda

    2015-01-01

    Under the framework of the European Platform on Life Cycle Assessment, the European Reference Life-Cycle Database (ELCD - developed by the Joint Research Centre of the European Commission), provides core Life Cycle Inventory (LCI) data from front-running EU-level business associations and other sources. The ELCD contains energy-related data on power and fuels. This study describes the methods to be used for the quality analysis of energy data for European markets (available in third-party LC databases and from authoritative sources) that are, or could be, used in the context of the ELCD. The methodology was developed and tested on the energy datasets most relevant for the EU context, derived from GaBi (the reference database used to derive datasets for the ELCD), Ecoinvent, E3 and Gemis. The criteria for the database selection were based on the availability of EU-related data, the inclusion of comprehensive datasets on energy products and services, and the general approval of the LCA community. The proposed approach was based on the quality indicators developed within the International Reference Life Cycle Data System (ILCD) Handbook, further refined to facilitate their use in the analysis of energy systems. The overall Data Quality Rating (DQR) of the energy datasets can be calculated by summing up the quality rating (ranging from 1 to 5, where 1 represents very good, and 5 very poor quality) of each of the quality criteria indicators, divided by the total number of indicators considered. The quality of each dataset can be estimated for each indicator, and then compared with the different databases/sources. The results can be used to highlight the weaknesses of each dataset and can be used to guide further improvements to enhance the data quality with regard to the established criteria. This paper describes the application of the methodology to two exemplary datasets, in order to show the potential of the methodological approach. The analysis helps LCA practitioners to evaluate the usefulness of the ELCD datasets for their purposes, and dataset developers and reviewers to derive information that will help improve the overall DQR of databases.
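
    The DQR arithmetic described above is a simple average of per-indicator ratings. A minimal sketch, with illustrative indicator names loosely following the ILCD criteria:

    ```python
    # DQR as described: average of the per-indicator quality ratings
    # (1 = very good ... 5 = very poor). Indicator names are illustrative.
    ratings = {
        "technological representativeness": 2,
        "geographical representativeness": 1,
        "time-related representativeness": 3,
        "completeness": 2,
        "precision/uncertainty": 3,
        "methodological appropriateness": 2,
    }

    dqr = sum(ratings.values()) / len(ratings)
    print(f"Data Quality Rating = {dqr:.2f} (lower is better)")
    ```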

  15. Key Factors Influencing the Energy Absorption of Dual-Phase Steels: Multiscale Material Model Approach and Microstructural Optimization

    NASA Astrophysics Data System (ADS)

    Belgasam, Tarek M.; Zbib, Hussein M.

    2018-06-01

    The increase in use of dual-phase (DP) steel grades by vehicle manufacturers to enhance crash resistance and reduce car body weight requires the development of a clear understanding of the effect of various microstructural parameters on the energy absorption in these materials. Accordingly, DP steelmakers are interested in predicting the effect of various microscopic factors as well as optimizing microstructural properties for application in crash-relevant components of vehicle bodies. This study presents a microstructure-based approach using a multiscale material and structure model. In this approach, Digimat and LS-DYNA software were coupled and employed to provide a full micro-macro multiscale material model, which is then used to simulate tensile tests. Microstructures with varied ferrite grain sizes, martensite volume fractions, and carbon content in DP steels were studied. The impact of these microstructural features at different strain rates on energy absorption characteristics of DP steels is investigated numerically using an elasto-viscoplastic constitutive model. The model is implemented in a multiscale finite-element framework. A comprehensive statistical parametric study using response surface methodology is performed to determine the optimum microstructural features for a required tensile toughness at different strain rates. The simulation results are validated using experimental data found in the literature. The developed methodology proved to be effective for investigating the influence and interaction of key microscopic properties on the energy absorption characteristics of DP steels. Furthermore, it is shown that this method can be used to identify optimum microstructural conditions at different strain-rate conditions.

  16. Clinical practice guidelines for the surgical management of colon cancer: a consensus statement of the Hellenic and Cypriot Colorectal Cancer Study Group by the HeSMO*

    PubMed Central

    Xynos, Evaghelos; Gouvas, Nikolaos; Triantopoulou, Charina; Tekkis, Paris; Vini, Louiza; Tzardi, Maria; Boukovinas, Ioannis; Androulakis, Nikolaos; Athanasiadis, Athanasios; Christodoulou, Christos; Chrysou, Evangelia; Dervenis, Christos; Emmanouilidis, Christos; Georgiou, Panagiotis; Katopodi, Ourania; Kountourakis, Panteleimon; Makatsoris, Thomas; Papakostas, Pavlos; Papamichael, Demetris; Pentheroudakis, Georgios; Pilpilidis, Ioannis; Sgouros, Joseph; Vassiliou, Vassilios; Xynogalos, Spyridon; Ziras, Nikolaos; Karachaliou, Niki; Zoras, Odysseas; Agalianos, Christos; Souglakos, John

    2016-01-01

    Despite considerable improvement in the management of colon cancer, there is a great deal of variation in outcomes among European countries, and in particular among different hospital centers in Greece and Cyprus. Discrepancies in approach strategies and lack of adherence to guidelines for the management of colon cancer may explain the situation. The aim was to elaborate a consensus on the multidisciplinary management of colon cancer, based on European guidelines (ESMO and EURECCA), while also taking into account the local characteristics of our healthcare system. Following discussion and online communication among members of an executive team, a consensus was developed. Statements entered the Delphi voting system in two rounds to achieve consensus by multidisciplinary international experts. Statements with an agreement rate of ≥80% achieved a large consensus, while those with an agreement rate of 60-80% achieved a moderate consensus. Statements achieving an agreement rate of <60% after both rounds were rejected and not presented. Sixty statements on the management of colon cancer were subjected to the Delphi methodology. There were 109 voting experts. The median rate of abstention per statement was 10% (range: 0-41%). At the end of the voting process, all statements achieved a consensus by more than 80% of the experts. A consensus on the management of colon cancer was developed by applying the Delphi methodology. Guidelines are proposed along with algorithms for diagnosis and treatment. The importance of centralization, care by a multidisciplinary team, and adherence to guidelines is emphasized. PMID:26752945

  17. Key Factors Influencing the Energy Absorption of Dual-Phase Steels: Multiscale Material Model Approach and Microstructural Optimization

    NASA Astrophysics Data System (ADS)

    Belgasam, Tarek M.; Zbib, Hussein M.

    2018-03-01

    The increase in use of dual-phase (DP) steel grades by vehicle manufacturers to enhance crash resistance and reduce car body weight requires the development of a clear understanding of the effect of various microstructural parameters on the energy absorption in these materials. Accordingly, DP steelmakers are interested in predicting the effect of various microscopic factors as well as optimizing microstructural properties for application in crash-relevant components of vehicle bodies. This study presents a microstructure-based approach using a multiscale material and structure model. In this approach, Digimat and LS-DYNA software were coupled and employed to provide a full micro-macro multiscale material model, which is then used to simulate tensile tests. Microstructures with varied ferrite grain sizes, martensite volume fractions, and carbon content in DP steels were studied. The impact of these microstructural features at different strain rates on energy absorption characteristics of DP steels is investigated numerically using an elasto-viscoplastic constitutive model. The model is implemented in a multiscale finite-element framework. A comprehensive statistical parametric study using response surface methodology is performed to determine the optimum microstructural features for a required tensile toughness at different strain rates. The simulation results are validated using experimental data found in the literature. The developed methodology proved to be effective for investigating the influence and interaction of key microscopic properties on the energy absorption characteristics of DP steels. Furthermore, it is shown that this method can be used to identify optimum microstructural conditions at different strain-rate conditions.

  18. Developing Army Leaders through Increased Rigor in Professional Military Training and Education

    DTIC Science & Technology

    2017-06-09

    An applied, exploratory, qualitative research methodology via a structured and focused case study comparison was used... Finally, it will discuss how the methodology will be conducted to make... development models; it serves as the base data for case study comparison.

  19. A morphology independent methodology for quantifying planview river change and characteristics from remotely sensed imagery

    DOE PAGES

    Rowland, Joel C.; Shelef, Eitan; Pope, Paul A.; ...

    2016-07-15

    Remotely sensed imagery of rivers has long served as a means for characterizing channel properties and detecting planview change. In the last decade, the dramatic increase in the availability of satellite imagery and processing tools has created the potential to greatly expand the spatial and temporal scale of our understanding of river morphology and dynamics. To date, the majority of GIS and automated analyses of planview changes in rivers from remotely sensed data have been developed for single-threaded meandering river systems. These methods have limited applicability to many of the earth's rivers with complex multi-channel planforms. Here we present the methodologies of a set of analysis algorithms collectively called Spatially Continuous Riverbank Erosion and Accretion Measurements (SCREAM). SCREAM analyzes planview river metrics regardless of river morphology. These algorithms quantify both the erosion and accretion rates of riverbanks from binary masks of channels generated from imagery acquired at two time periods. Additionally, the program quantifies the area of change between river channels and the surrounding floodplain and the area of islands lost or formed between these two time periods. To examine variations in erosion rates in relation to local channel attributes and make rate comparisons between river systems of varying sizes, the program determines channel widths and bank curvature at every bank pixel. SCREAM was developed and tested on rivers with diverse and complex planform morphologies in imagery acquired from a range of observational platforms with varying spatial resolutions. Validation and verification of SCREAM-generated metrics against manual measurements show no significant measurement errors in the determination of channel width, erosion, and bank aspects. SCREAM has the potential to provide data both for the quantitative examination of the controls on erosion rates and for the comparison of these rates across river systems ranging broadly in size and planform morphology.
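
    The basic erosion/accretion bookkeeping from two binary channel masks can be illustrated in a few lines; the masks and pixel size below are toy assumptions, not SCREAM itself:

    ```python
    import numpy as np

    # Toy version of the bank-change bookkeeping: given binary channel masks
    # (1 = water) from two dates on the same grid, erosion is floodplain that
    # became channel and accretion is channel that became floodplain.
    PIXEL_AREA_M2 = 30.0 * 30.0  # e.g. Landsat-scale pixels (assumed)

    mask_t1 = np.array([[0, 1, 1, 0],
                        [0, 1, 1, 0],
                        [0, 1, 1, 0]])
    mask_t2 = np.array([[0, 0, 1, 1],
                        [0, 0, 1, 1],
                        [0, 1, 1, 0]])

    erosion_px = np.logical_and(mask_t1 == 0, mask_t2 == 1).sum()
    accretion_px = np.logical_and(mask_t1 == 1, mask_t2 == 0).sum()
    print(f"eroded area:   {erosion_px * PIXEL_AREA_M2:.0f} m^2")
    print(f"accreted area: {accretion_px * PIXEL_AREA_M2:.0f} m^2")
    ```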

  20. Extending the Instructional Systems Development Methodology.

    ERIC Educational Resources Information Center

    O'Neill, Colin E.

    1993-01-01

    Describes ways that components of Information Engineering (IE) methodology can be used by training system developers to extend Instructional Systems Development (ISD) methodology. Aspects of IE that are useful in ISD are described, including requirements determination, group facilitation, integrated automated tool support, and prototyping.…

  1. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    NASA Technical Reports Server (NTRS)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.

  2. Ecological and methodological drivers of species' distribution and phenology responses to climate change.

    PubMed

    Brown, Christopher J; O'Connor, Mary I; Poloczanska, Elvira S; Schoeman, David S; Buckley, Lauren B; Burrows, Michael T; Duarte, Carlos M; Halpern, Benjamin S; Pandolfi, John M; Parmesan, Camille; Richardson, Anthony J

    2016-04-01

    Climate change is shifting species' distribution and phenology. Ecological traits, such as mobility or reproductive mode, explain variation in observed rates of shift for some taxa. However, estimates of relationships between traits and climate responses could be influenced by how responses are measured. We compiled a global data set of 651 published marine species' responses to climate change, from 47 papers on distribution shifts and 32 papers on phenology change. We assessed the relative importance of two classes of predictors of the rate of change, ecological traits of the responding taxa and methodological approaches for quantifying biological responses. Methodological differences explained 22% of the variation in range shifts, more than the 7.8% of the variation explained by ecological traits. For phenology change, methodological approaches accounted for 4% of the variation in measurements, whereas 8% of the variation was explained by ecological traits. Our ability to predict responses from traits was hindered by poor representation of species from the tropics, where temperature isotherms are moving most rapidly. Thus, the mean rate of distribution change may be underestimated by this and other global syntheses. Our analyses indicate that methodological approaches should be explicitly considered when designing, analysing and comparing results among studies. To improve climate impact studies, we recommend that (1) reanalyses of existing time series state how the existing data sets may limit the inferences about possible climate responses; (2) qualitative comparisons of species' responses across different studies be limited to studies with similar methodological approaches; (3) meta-analyses of climate responses include methodological attributes as covariates; and (4) that new time series be designed to include the detection of early warnings of change or ecologically relevant change. Greater consideration of methodological attributes will improve the accuracy of analyses that seek to quantify the role of climate change in species' distribution and phenology changes. © 2015 John Wiley & Sons Ltd.

  3. Dynamic Creep Buckling: Analysis of Shell Structures Subjected to Time-dependent Mechanical and Thermal Loading

    NASA Technical Reports Server (NTRS)

    Simitses, G. J.; Carlson, R. L.; Riff, R.

    1985-01-01

    The objective of the present research is to develop a general mathematical model and solution methodologies for analyzing the structural response of thin, metallic shell structures under large transient, cyclic, or static thermomechanical loads. Among the system responses associated with these loads and conditions are thermal buckling, creep buckling, and ratcheting. Thus geometric and material nonlinearities (of high order) can be anticipated and must be considered in developing the mathematical model. A complete, true ab-initio rate theory of kinematics and kinetics for continuum and curved thin structures, without any restriction on the magnitude of the strains or the deformations, was formulated. The time dependence and large strain behavior are incorporated through the introduction of the time rates of metric and curvature in two coordinate systems: fixed (spatial) and convected (material). The relations between the time derivative and the covariant derivative (gradient) were developed for curved space and motion, so the velocity components supply the connection between the equations of motion and the time rates of change of the metric and curvature tensors.

  4. Data Centric Development Methodology

    ERIC Educational Resources Information Center

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  5. Curated Collection for Educators: Five Key Papers about the Flipped Classroom Methodology.

    PubMed

    King, Andrew; Boysen-Osborn, Megan; Cooney, Robert; Mitzman, Jennifer; Misra, Asit; Williams, Jennifer; Dulani, Tina; Gottlieb, Michael

    2017-10-25

    The flipped classroom (FC) pedagogy is becoming increasingly popular in medical education due to its appeal to the millennial learner and potential benefits in knowledge acquisition. Despite its popularity and effectiveness, the FC educational method is not without challenges. In this article, we identify and summarize several key papers relevant to medical educators interested in exploring the FC teaching methodology. The authors identified an extensive list of papers relevant to FC pedagogy via online discussions within the Academic Life in Emergency Medicine (ALiEM) Faculty Incubator. This list was augmented by an open call on Twitter (utilizing the #meded, #FOAMed, and #flippedclassroom hashtags) yielding a list of 33 papers. We then conducted a three-round modified Delphi process within the authorship group, which included both junior and senior clinician educators, to identify the most impactful papers for educators interested in FC pedagogy. The three-round modified Delphi process ranked all of the selected papers and selected the five most highly-rated papers for inclusion. The authorship group reviewed and summarized these papers with specific consideration given to their value to junior faculty educators and faculty developers interested in the flipped classroom approach. The list of papers featured in this article serves as a key reading list for junior clinician educators and faculty developers interested in the flipped classroom technique. The associated commentaries contextualize the importance of these papers for medical educators aiming to optimize their understanding and implementation of the flipped classroom methodology in their teaching and through faculty development.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, G.-H.; Pesaran, A.; Smith, K.

    The objectives of this paper are: (1) continue to explore thermal abuse behaviors of Li-ion cells and modules that are affected by local conditions of heat and materials; (2) use the 3D Li-ion battery thermal abuse 'reaction' model developed for cells to explore the impact of the location of internal short, its heating rate, and thermal properties of the cell; (3) continue to understand the mechanisms and interactions between heat transfer and chemical reactions during thermal runaway for Li-ion cells and modules; and (4) explore the use of the developed methodology to support the design of abuse-tolerant Li-ion battery systems.

  7. Closing the Certification Gaps in Adaptive Flight Control Software

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2008-01-01

    Over the last five decades, extensive research has been performed to design and develop adaptive control systems for aerospace systems and other applications where the capability to change controller behavior at different operating conditions is highly desirable. Although adaptive flight control has been partially implemented through the use of gain-scheduled control, truly adaptive control systems using learning algorithms and on-line system identification methods have not seen commercial deployment. The reason is that the certification process for adaptive flight control software for use in national airspace has not yet been decided. The purpose of this paper is to examine the gaps between the state-of-the-art methodologies used to certify conventional (i.e., non-adaptive) flight control system software and what will likely be needed to satisfy FAA airworthiness requirements. These gaps include the lack of a certification plan or process guide, the need to develop verification and validation tools and methodologies to analyze adaptive controller stability and convergence, and the development of metrics to evaluate adaptive controller performance at off-nominal flight conditions. This paper presents the major certification gap areas, a description of the current state of the verification methodologies, and the further research efforts that will likely be needed to close the gaps remaining in current certification practices. It is envisioned that closing the gap will require certain advances in simulation methods, comprehensive methods to determine learning algorithm stability and convergence rates, the development of performance metrics for adaptive controllers, the application of formal software assurance methods, the application of on-line software monitoring tools for adaptive controller health assessment, and the development of a certification case for adaptive system safety of flight.

  8. Cooperative learning combined with short periods of lecturing: A good alternative in teaching biochemistry.

    PubMed

    Fernández-Santander, Ana

    2008-01-01

    The informal activities of cooperative learning and short periods of lecturing have been combined and used in the university teaching of biochemistry as part of the first-year course in Optics and Optometry in the academic years 2004-2005 and 2005-2006. The lessons were elaborated beforehand by the teacher and included all that is necessary to understand the topic (text, figures, graphics, diagrams, pictures, etc.). Additionally, a questionnaire was prepared for every chapter. All lessons contained three parts: objectives, approach and development, and the assessment of the topic. Team work, responsibility, and communication skills were some of the abilities developed with this new methodology. Students worked collaboratively in small groups of two or three following the teacher's instructions, with short periods of lecturing that clarified misunderstood concepts. Homework was minimized. On comparing this combined methodology with the traditional one (lecture only), students were found to exhibit higher satisfaction with the new method. They were more involved in the learning process and had a better attitude toward the subject. The use of this new methodology showed a significant increase in the mean score of the students' academic results. The rate of students who failed the subject was significantly lower than in previous years when only lecturing was applied. This combined methodology helped the teacher to better observe the apprenticeship process of students and to act as a facilitator in the process of building students' knowledge. Copyright © 2008 International Union of Biochemistry and Molecular Biology, Inc.

  9. A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care

    PubMed Central

    Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis

    2017-01-01

    This article describes a framework that has been developed to monetize the real value of simulation-based training in health care. Significant consideration has been given to the incorporation of the intangible and qualitative benefits, not only the tangible and quantitative benefits, of simulation-based training in health care. The framework builds from three works: the value measurement methodology (VMM) used by several departments of the US Government, a methodology documented in several books by Dr Jack Phillips to monetize various training approaches, and a traditional return on investment methodology put forth by Frost and Sullivan and Immersion Medical. All three source materials were adapted to create an integrated methodology that can be readily implemented. This article presents details on each of these methods, how they can be integrated, and a framework that integrates the previous methods. In addition, it describes the concept and the application of the developed framework. As a test of the applicability of the framework, a real case study has been used to demonstrate its application. This case study provides real data on the correlation between pediatric patient cardiopulmonary arrest (CPA) survival rates and simulation-based mock codes at the University of Michigan tertiary care academic medical center. It is important to point out that the proposed framework offers the capability to consider a wide range of benefits and values; on the other hand, there are several limitations that have been discussed and need to be taken into consideration. PMID:28133988

  10. A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care.

    PubMed

    Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis

    2017-01-01

    This article describes a framework that has been developed to monetize the real value of simulation-based training in health care. Significant consideration has been given to the incorporation of the intangible and qualitative benefits, not only the tangible and quantitative benefits, of simulation-based training in health care. The framework builds from three works: the value measurement methodology (VMM) used by several departments of the US Government, a methodology documented in several books by Dr Jack Phillips to monetize various training approaches, and a traditional return on investment methodology put forth by Frost and Sullivan and Immersion Medical. All three source materials were adapted to create an integrated methodology that can be readily implemented. This article presents details on each of these methods, how they can be integrated, and a framework that integrates the previous methods. In addition, it describes the concept and the application of the developed framework. As a test of the applicability of the framework, a real case study has been used to demonstrate its application. This case study provides real data on the correlation between pediatric patient cardiopulmonary arrest (CPA) survival rates and simulation-based mock codes at the University of Michigan tertiary care academic medical center. It is important to point out that the proposed framework offers the capability to consider a wide range of benefits and values; on the other hand, there are several limitations that have been discussed and need to be taken into consideration.
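
    The ROI arithmetic underlying such frameworks (benefit-cost ratio and Phillips-style ROI percentage) is straightforward; the figures below are invented for illustration, not taken from the case study:

    ```python
    # Phillips-style ROI arithmetic used in such frameworks:
    # BCR = benefits / costs; ROI% = (benefits - costs) / costs * 100.
    # The figures below are illustrative assumptions, not case-study data.
    monetized_benefits = 450_000.0  # e.g. value of improved CPA survival
    program_costs = 180_000.0       # simulators, staff time, facilities

    bcr = monetized_benefits / program_costs
    roi_pct = (monetized_benefits - program_costs) / program_costs * 100
    print(f"benefit-cost ratio = {bcr:.2f}, ROI = {roi_pct:.0f}%")
    ```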

  11. Improving Junior Infantry Officer Leader Development and Performance

    DTIC Science & Technology

    2017-06-09

    The researcher used a qualitative literature review and semi-structured interview methodology to analyze Army leadership theories and leader development...

  12. 78 FR 44459 - Rate Regulation Reforms

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-24

    ... the interest rate. A simple multiplication of the nominal rate by the portion of the year covered by... makes technical changes to the full and simplified rate procedures; changes the interest rate that... allocation methodology for cross-over traffic. Part IV sets out the change in the interest rate carriers must...

  13. A Computational Tool for Evaluating THz Imaging Performance in Brownout Conditions at Land Sites Throughout the World

    DTIC Science & Technology

    2009-03-01

    III. Methodology: Overview... applications relating to this research and the results they have obtained, as well as the background on LEEDR. Chapter 3 will detail the methodology... different in that the snow dissipates faster and it is better to descend slower, at rates of 200-300 ft/min.

  14. Benefit-cost methodology study with example application of the use of wind generators

    NASA Technical Reports Server (NTRS)

    Zimmer, R. P.; Justus, C. G.; Mason, R. M.; Robinette, S. L.; Sassone, P. G.; Schaffer, W. A.

    1975-01-01

    An example application for cost-benefit methodology is presented for the use of wind generators. The approach adopted for the example application consisted of the following activities: (1) surveying of the available wind data and wind power system information, (2) developing models which quantitatively described wind distributions, wind power systems, and cost-benefit differences between conventional systems and wind power systems, and (3) applying the cost-benefit methodology to compare a conventional electrical energy generation system with systems which included wind power generators. Wind speed distribution data were obtained from sites throughout the contiguous United States and were used to compute plant factor contours shown on an annual and seasonal basis. Plant factor values (ratio of average output power to rated power) are found to be as high as 0.6 (on an annual average basis) in portions of the central U. S. and in sections of the New England coastal area. Two types of wind power systems were selected for the application of the cost-benefit methodology. A cost-benefit model was designed and implemented on a computer to establish a practical tool for studying the relative costs and benefits of wind power systems under a variety of conditions and to efficiently and effectively perform associated sensitivity analyses.
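
    The plant factor calculation (ratio of average output power to rated power) can be sketched against an assumed wind regime; the Rayleigh mean speed and the idealized power curve below are illustrative assumptions, not the study's models:

    ```python
    import numpy as np

    # Plant factor = mean output / rated power. Wind speeds follow an
    # assumed Rayleigh distribution; cut-in/rated/cut-out speeds and the
    # 7 m/s mean are assumptions for illustration.
    rng = np.random.default_rng(0)
    v = rng.rayleigh(scale=7.0 * np.sqrt(2 / np.pi), size=100_000)  # mean ~7 m/s

    V_IN, V_RATED, V_OUT, P_RATED = 3.0, 12.0, 25.0, 1.0  # m/s, power in p.u.
    p = np.where(v < V_IN, 0.0,
        np.where(v < V_RATED, P_RATED * (v**3 - V_IN**3) / (V_RATED**3 - V_IN**3),
        np.where(v < V_OUT, P_RATED, 0.0)))

    print(f"plant factor ~ {p.mean() / P_RATED:.2f}")
    ```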

  15. Metalexical awareness: development, methodology or written language? A cross-linguistic comparison.

    PubMed

    Kurvers, Jeanne; Uri, Helene

    2006-07-01

    This study explores the ability of pre-school children to access word boundaries, using an on-line methodology (Karmiloff-Smith, Grant, Sims, Jones, & Cockle, 1996, Cognition, 58, 197-219) which has hardly been used outside English-speaking countries. In a cross-linguistic study in the Netherlands and Norway, four- and five-year-old children were asked to repeat the last word every time a narrator stopped reading a story. In total 32 target words were used, both closed and open class words, and both monosyllabic and disyllabic words. The outcomes in both countries differed from those of the original English study (Karmiloff-Smith et al., 1996): four- and five-year-olds were successful in only about 26% of the cases, whereas the success rate in the earlier English experiment was 75% for the younger and 96% for the older children. No differences were found between age groups or between open and closed class words. This methodology does reveal the ability to access word boundaries, but probably not because of the ease of the on-line methodology in itself; rather, literacy introduces new representations of language, even in on-line processing. The outcomes imply that the ability to mark word boundaries does not seem to be a valid indication of who is ready for reading.

  16. Session: Monitoring wind turbine project sites for avian impacts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Wally

    This third session at the Wind Energy and Birds/Bats workshop consisted of one presentation followed by a discussion/question and answer period. The focus of the session was on existing wind projects that are monitored for their impacts on birds and bats. The presentation given was titled ''Bird and Bat Fatality Monitoring Methods'' by Wally Erickson, West, Inc. Sections included protocol development and review, methodology, adjusting for scavenging rates, and adjusting for observer detection bias.

  17. Identifying Aircraft and Personnel Needs to Meet On-station Patrol Requirements

    DTIC Science & Technology

    2014-06-17

    One option would be to develop a fully stochastic model that explicitly examined unplanned maintenance (Marlow and Novak 2013; Mattila et al. 2008... stationed at the base and the serviceability rate, respectively (as in Marlow and Novak 2013). Next, if one assumes that, for the number of available AU... of Intelligent & Robotic Systems 70: 347-359. 7. Marlow D and Novak A (2013). Fleet Sizing Analysis Methodologies for the Royal Australian Navy's

  18. Understanding the rates of nonpolar organic chemical accumulation into passive samplers deployed in the environment: Guidance for passive sampler deployments.

    PubMed

    Apell, Jennifer N; Tcaciuc, A Patricia; Gschwend, Philip M

    2016-07-01

    Polymeric passive samplers have become a common method for estimating freely dissolved concentrations in environmental media. However, this approach has not yet been adopted by investigators conducting remedial investigations of contaminated environmental sites. Successful adoption of this sampling methodology relies on an understanding of how passive samplers accumulate chemical mass as well as developing guidance for the design and deployment of passive samplers. Herein, we outline the development of a simple mathematical relationship of the environmental, polymer, and chemical properties that control the uptake rate. This relationship, called a timescale, is then used to illustrate how each property controls the rate of equilibration in samplers deployed in the water or in the sediment. Guidance is also given on how to use the timescales to select an appropriate polymer, deployment time, and suite of performance reference compounds. Integr Environ Assess Manag 2016;12:486-492. © 2015 SETAC.
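
    A common simplification of such uptake behaviour is a first-order approach to equilibrium governed by a single timescale; the sketch below assumes that form and a 20-day timescale purely for illustration (the paper derives its timescale from polymer, chemical and environmental properties):

    ```python
    import numpy as np

    # Generic first-order uptake model often used for passive samplers:
    # C(t) = C_eq * (1 - exp(-t / tau)). The 20-day tau is an assumption.
    tau_days = 20.0

    def fractional_equilibration(t_days):
        return 1.0 - np.exp(-t_days / tau_days)

    for t in (7, 28, 60):
        print(f"after {t:2d} days: "
              f"{100 * fractional_equilibration(t):.0f}% of equilibrium")
    ```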

  19. Cadmium biosorption rate in protonated Sargassum biomass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, J.; Volesky, B.

    1999-03-01

    Biosorption of the heavy metal ion Cd²⁺ by protonated nonliving brown alga Sargassum fluitans biomass was accompanied by the release of hydrogen protons from the biomass. The uptake of cadmium and the release of protons matched each other throughout the biosorption process. The end-point titration methodology was used to maintain a constant pH of 4.0 while determining the dynamic sorption rate. The sorption isotherm could be well represented by the Langmuir sorption model. A mass transfer model assuming intraparticle diffusion in a one-dimensional thin plate as the controlling step was developed to describe the overall biosorption rate of cadmium ions in flat seaweed biomass particles. The overall biosorption mathematical model equations were solved numerically, yielding an effective diffusion coefficient Dₑ of about 3.5 × 10⁻⁶ cm²/s. This value matches that obtained for the desorption process and is approximately half the molecular diffusion coefficient for cadmium ions in aqueous solution.
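
    The fractional-uptake behaviour of such a one-dimensional plate-diffusion model follows Crank's series solution for a plane sheet; the sketch below uses the reported Dₑ but assumes an illustrative particle half-thickness:

    ```python
    import numpy as np

    # Fractional uptake for diffusion into a thin plate (Crank's series):
    # M_t/M_inf = 1 - sum_m 8/((2m+1)^2 pi^2) exp(-D (2m+1)^2 pi^2 t / (4 l^2)).
    # D_e is from the abstract; the half-thickness is an assumption.
    D_E = 3.5e-6       # cm^2/s, reported effective diffusivity
    HALF_THICK = 0.05  # cm, assumed biomass particle half-thickness

    def fractional_uptake(t_s, n_terms=50):
        m = np.arange(n_terms)
        coeff = 8.0 / ((2 * m + 1) ** 2 * np.pi ** 2)
        expo = np.exp(-D_E * (2 * m + 1) ** 2 * np.pi ** 2 * t_s
                      / (4 * HALF_THICK ** 2))
        return 1.0 - np.sum(coeff * expo)

    for minutes in (5, 15, 60):
        print(f"{minutes:3d} min: M_t/M_inf = {fractional_uptake(60 * minutes):.2f}")
    ```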

  20. Development of dielectrophoresis MEMS device for PC12 cell patterning to elucidate nerve-network generation

    NASA Astrophysics Data System (ADS)

    Nakamachi, Eiji; Koga, Hirotaka; Morita, Yusuke; Yamamoto, Koji; Sakamoto, Hidetoshi

    2018-01-01

    We developed a PC12 cell trapping and patterning device by combining the dielectrophoresis (DEP) methodology and micro electro mechanical systems (MEMS) technology for time-lapse observation of the morphological change of a nerve network, to elucidate the generation mechanism of neural networks. We succeeded in generating a neural network, consisting of cell bodies, axons and dendrites, using tetragonal and hexagonal cell patterning. Further, time-lapse observations were carried out to evaluate the axonal extension rate. The axon extended in the channel and reached the target cell body. We found that the shorter the PC12 cell distance, the less the axonal connection time in both tetragonal and hexagonal structures. After 48 hours of culture, the maximum success rate of network formation was 85%, in the case of the 40 μm distance tetragonal structure.

  1. Development of the hard and soft constraints based optimisation model for unit sizing of the hybrid renewable energy system designed for microgrid applications

    NASA Astrophysics Data System (ADS)

    Sundaramoorthy, Kumaravel

    2017-02-01

    Hybrid energy system (HES) based electricity generation has become an attractive solution for rural electrification. Economically feasible and technically reliable HESs are solidly based on an optimisation stage. This article discusses an optimal unit sizing model with the objective function of minimising the total cost of the HES. Three typical rural sites from the southern part of India were selected for the application of the developed optimisation methodology. Feasibility studies and sensitivity analysis of the optimal HES are discussed in detail in this article. A comparison has been carried out with the Hybrid Optimization Model for Electric Renewable optimisation model for the three sites. The optimal HES is found to have a lower total net present cost and cost of energy compared with the existing method.

  2. Basic actions to reduce dropout rates in distance learning.

    PubMed

    Gregori, Pablo; Martínez, Vicente; Moyano-Fernández, Julio José

    2018-02-01

    Today's society, which is strongly based on knowledge and interaction with information, has a key component in technological innovation, a fundamental tool for the development of the current teaching methodologies. Nowadays, there are a lot of online resources, such as MOOCs (Massive Open Online Courses) and distance learning courses. One aspect that is common to all of these is a high dropout rate: about 90% in MOOCs and 50% in the courses of the Spanish National Distance Education University, among other examples. In this paper, we analyze a number of actions undertaken in the Master's Degree in Computational Mathematics at Universitat Jaume I in Castellón, Spain. These actions seem to help decrease the dropout rate in distance learning; the available data confirm their effectiveness. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Vegetation ecological restoration during geothermic exploratory perforation: A case study in Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ortega-Rubio, A.; Salinas, F.; Naranjo, A.

    1997-12-31

    At Las Tres Virgenes, B.C.S., Mexico, geothermal exploratory drilling of the area was developed. One of the main recommendations of our Environmental Impact Assessment Study was the transplantation of the plant individuals found in the zones of roads and drilling platforms. In this work we describe the methodologies used to transplant the plant individuals found in these zones. We list the species selected and the survivorship rate obtained for each of them. From a total of 4,266 transplanted individuals, including many endemic species, 2,349 survived. Members of the Agavaceae and Cactaceae families showed the maximum survivorship rates, while members of the Burseraceae, Euphorbiaceae and Fouqueriaceae families exhibited the minimum survivorship rates (between 12.7% and 20%).

  4. Methodological choices affect cancer incidence rates: a cohort study.

    PubMed

    Brooke, Hannah L; Talbäck, Mats; Feychting, Maria; Ljung, Rickard

    2017-01-19

    Incidence rates are fundamental to epidemiology, but their magnitude and interpretation depend on methodological choices. We aimed to examine the extent to which the definition of the study population affects cancer incidence rates. All primary cancer diagnoses in Sweden between 1958 and 2010 were identified from the national Cancer Register. Age-standardized and age-specific incidence rates of 29 cancer subtypes between 2000 and 2010 were calculated using four definitions of the study population: persons resident in Sweden 1) based on general population statistics; 2) with no previous subtype-specific cancer diagnosis; 3) with no previous cancer diagnosis except non-melanoma skin cancer; and 4) with no previous cancer diagnosis of any type. We calculated absolute and relative differences between methods. Age-standardized incidence rates calculated using general population statistics ranged from 6% lower (prostate cancer, incidence rate difference: -13.5/100,000 person-years) to 8% higher (breast cancer in women, incidence rate difference: 10.5/100,000 person-years) than incidence rates based on individuals with no previous subtype-specific cancer diagnosis. Age-standardized incidence rates in persons with no previous cancer of any type were up to 10% lower (bladder cancer in women) than rates in those with no previous subtype-specific cancer diagnosis; however, absolute differences were <5/100,000 person-years for all cancer subtypes. For some cancer subtypes incidence rates vary depending on the definition of the study population. For these subtypes, standardized incidence ratios calculated using general population statistics could be misleading. Moreover, etiological arguments should be used to inform methodological choices during study design.
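
    The underlying rate arithmetic is simple; what changes between definitions is which cases and person-years qualify. A sketch with invented counts (not the Swedish register values):

    ```python
    # Incidence rate per 100,000 person-years under two study-population
    # definitions. Case counts and person-years are illustrative only.
    def incidence_rate(cases, person_years, per=100_000):
        return cases / person_years * per

    # Definition 1: general population denominator.
    rate_general = incidence_rate(cases=9_500, person_years=45_000_000)
    # Definition 4: persons with no previous cancer of any type (smaller
    # denominator and slightly fewer eligible cases).
    rate_restricted = incidence_rate(cases=9_100, person_years=42_000_000)

    print(f"general population: {rate_general:.1f} per 100,000 py")
    print(f"cancer-free cohort: {rate_restricted:.1f} per 100,000 py")
    print(f"relative difference: {100 * (rate_general / rate_restricted - 1):.1f}%")
    ```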

  5. In situ photoacoustic characterization for porous silicon growing: Detection principles

    NASA Astrophysics Data System (ADS)

    Ramirez-Gutierrez, C. F.; Castaño-Yepes, J. D.; Rodriguez-García, M. E.

    2016-05-01

    There are a few methodologies for monitoring the in-situ formation of porous silicon (PS); one of them is photoacoustics. Previous works that reported the use of photoacoustics to study PS formation did not provide a physical explanation of the origin of the signal. In this paper, a physical explanation of the origin of the photoacoustic signal during PS etching is provided. The incident modulated radiation and changes in the reflectance are taken as thermal sources. A useful methodology is proposed to determine the etching rate, porosity, and refractive index of a PS film through the determination of the sample thickness, using scanning electron microscopy images. This method was developed by carrying out two different experiments using the same anodization conditions. The first experiment consisted of growing samples with different etching times to prove the periodicity of the photoacoustic signal, while the second consisted of growing samples using three different laser wavelengths correlated with the period of the photoacoustic signal. The latter experiment showed that the period of the photoacoustic signal is proportional to the laser wavelength.
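
    If one adopts the standard thin-film interference reading of such oscillations (one signal period per λ/(2·n_eff) of thickness change at normal incidence, an assumption here rather than a statement of the paper's model), the etch rate follows directly:

    ```python
    # Standard thin-film interference bookkeeping behind such measurements:
    # one oscillation of the signal corresponds to a porous-layer thickness
    # increase of lambda / (2 * n_eff) at normal incidence. Numbers assumed.
    WAVELENGTH_NM = 808.0  # probe laser wavelength (assumed)
    N_EFF = 1.8            # effective refractive index of the layer (assumed)
    PERIOD_S = 60.0        # measured oscillation period of the signal (assumed)

    thickness_per_period_nm = WAVELENGTH_NM / (2.0 * N_EFF)
    etch_rate_nm_s = thickness_per_period_nm / PERIOD_S
    print(f"etch rate ~ {etch_rate_nm_s:.2f} nm/s "
          f"({thickness_per_period_nm:.0f} nm per oscillation)")
    ```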

  6. Using Lean Six Sigma Methodology to Improve a Mass Immunizations Process at the United States Naval Academy.

    PubMed

    Ha, Chrysanthy; McCoy, Donald A; Taylor, Christopher B; Kirk, Kayla D; Fry, Robert S; Modi, Jitendrakumar R

    2016-06-01

    Lean Six Sigma (LSS) is a process improvement methodology developed in the manufacturing industry to increase process efficiency while maintaining product quality. The efficacy of LSS application to the health care setting has not been adequately studied. This article presents a quality improvement project at the U.S. Naval Academy that uses LSS to improve the mass immunizations process for Midshipmen during in-processing. The process was standardized to give all vaccinations at one station instead of giving a different vaccination at each station. After project implementation, the average immunizations lead time decreased by 79% and staffing decreased by 10%. The process was shown to be in control with a capability index of 1.18 and performance index of 1.10, resulting in a defect rate of 0.04%. This project demonstrates that the LSS methodology can be applied successfully to the health care setting to make sustainable process improvements if used correctly and completely. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.
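
    The capability figures quoted above follow the usual Cp/Cpk arithmetic; the sketch below uses assumed spec limits and process statistics chosen only to roughly reproduce the quoted values:

    ```python
    from scipy.stats import norm

    # Capability arithmetic of the kind quoted in the abstract:
    # Cp = (USL - LSL) / (6*sigma); Cpk = min(USL - mu, mu - LSL) / (3*sigma).
    # Spec limits, mean and sigma below are illustrative assumptions.
    usl, lsl = 30.0, 10.0   # lead-time spec limits, minutes (assumed)
    mu, sigma = 19.5, 2.82  # process mean and standard deviation (assumed)

    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    defect_rate = norm.sf(usl, mu, sigma) + norm.cdf(lsl, mu, sigma)

    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, "
          f"defect rate = {100 * defect_rate:.3f}%")
    ```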

  7. Procedure for calculating estimated ultimate recoveries of Bakken and Three Forks Formations horizontal wells in the Williston Basin

    USGS Publications Warehouse

    Cook, Troy A.

    2013-01-01

    Estimated ultimate recoveries (EURs) are a key component in determining productivity of wells in continuous-type oil and gas reservoirs. EURs form the foundation of a well-performance-based assessment methodology initially developed by the U.S. Geological Survey (USGS; Schmoker, 1999). This methodology was formally reviewed by the American Association of Petroleum Geologists Committee on Resource Evaluation (Curtis and others, 2001). The EUR estimation methodology described in this paper was used in the 2013 USGS assessment of continuous oil resources in the Bakken and Three Forks Formations and incorporates uncertainties that would not normally be included in a basic decline-curve calculation. These uncertainties relate to (1) the mean time before failure of the entire well-production system (excluding economics), (2) the uncertainty of when (and if) a stable hyperbolic-decline profile is revealed in the production data, (3) the particular formation involved, (4) relations between initial production rates and a stable hyperbolic-decline profile, and (5) the final behavior of the decline extrapolation as production becomes more dependent on matrix storage.
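
    A common building block in such EUR work is the hyperbolic decline curve and its closed-form cumulative; the parameters and 30-year cutoff below are illustrative assumptions, not USGS assessment inputs:

    ```python
    # Hyperbolic decline of the kind used in EUR work:
    # q(t) = qi * (1 + b*Di*t)^(-1/b), with cumulative production
    # Np(t) = qi / (Di*(1 - b)) * (1 - (1 + b*Di*t)^(1 - 1/b)) for b != 1.
    def cumulative(qi, di, b, t_years):
        t_days = 365.25 * t_years
        return qi / (di * (1 - b)) * (1 - (1 + b * di * t_days) ** (1 - 1 / b))

    qi = 450.0  # initial rate, bbl/day (assumed)
    di = 0.004  # nominal decline, 1/day (assumed)
    b = 0.8     # hyperbolic exponent (assumed)

    eur_bbl = cumulative(qi, di, b, t_years=30)
    print(f"EUR over 30 years ~ {eur_bbl / 1e3:.0f} thousand bbl")
    ```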

  8. Lean Methodology Reduces Inappropriate Use of Antipsychotics for Agitation at a Psychiatric Hospital.

    PubMed

    Goga, Joshana K; Depaolo, Antonio; Khushalani, Sunil; Walters, J Ken; Roca, Robert; Zisselman, Marc; Borleis, Christopher

    2017-01-01

    To evaluate the effects of applying Lean methodology (improving quality and increasing efficiency by eliminating waste and reducing costs) as an approach to decrease the prescribing frequency of antipsychotics for the indication of agitation. Design: historically controlled study. Setting: Sheppard Pratt Health System, the largest private provider of psychiatric care in Maryland, with a total bed capacity of 300. There were 4,337 patient days from November 1, 2012 to October 31, 2013 on the dementia unit. Patients: all patients admitted to the dementia unit were 65 years of age and older with a primary diagnosis of dementia. Intervention: our multidisciplinary team used Lean methodology to identify the root causes of, and interventions necessary to reduce, inappropriate antipsychotic use. The primary outcome was the rate of inappropriately indicating agitation as the rationale when prescribing antipsychotic medications. There was a 90% (P < 0.001) reduction in the rate of antipsychotic prescribing with an indication of agitation. The Lean methodology interventions led to a 90% (P < 0.001) reduction in the rate of antipsychotic prescribing with an indication of agitation and a 10% reduction in the overall rate of antipsychotic prescribing. Key words: agitation, Alzheimer's, antipsychotics, behavioral and psychological symptoms of dementia, Centers for Medicare & Medicaid Services, dementia, root-cause analysis. Abbreviations: BPSD = behavioral and psychological symptoms of dementia; CATIE-AD = Clinical Antipsychotic Trials of Intervention Effectiveness in Alzheimer's Disease; EMR = electronic medical records; GAO = Government Accountability Office; GNCIS = Geriatric Neuropsychiatric Clinical Indicator Scale.

  9. An automated laboratory-scale methodology for the generation of sheared mammalian cell culture samples.

    PubMed

    Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel

    2017-05-01

    Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot scale. The process of scaling down centrifugation has historically been challenging due to the difficulties in mimicking the energy dissipation rates (EDRs) of typical machines. This paper describes an alternative and easy-to-assemble automated capillary-based methodology to generate levels of EDR consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDR found to match the level of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10⁵ W/kg) is consistent with that obtained through previously published computational fluid dynamics (CFD) studies (2.0×10⁵ W/kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of the effects of culture hold time, culture temperature and EDR on centrate quality. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Pyrolysis Model Development for a Multilayer Floor Covering

    PubMed Central

    McKinnon, Mark B.; Stoliarov, Stanislav I.

    2015-01-01

    Comprehensive pyrolysis models that are integral to computational fire codes have improved significantly over the past decade as the demand for improved predictive capabilities has increased. High fidelity pyrolysis models may improve the design of engineered materials for better fire response, the design of the built environment, and may be used in forensic investigations of fire events. A major limitation to widespread use of comprehensive pyrolysis models is the large number of parameters required to fully define a material and the lack of effective methodologies for measurement of these parameters, especially for complex materials. The work presented here details a methodology used to characterize the pyrolysis of a low-pile carpet tile, an engineered composite material that is common in commercial and institutional occupancies. The studied material includes three distinct layers of varying composition and physical structure. The methodology utilized a comprehensive pyrolysis model (ThermaKin) to conduct inverse analyses on data collected through several experimental techniques. Each layer of the composite was individually parameterized to identify its contribution to the overall response of the composite. The set of properties measured to define the carpet composite were validated against mass loss rate curves collected at conditions outside the range of calibration conditions to demonstrate the predictive capabilities of the model. The mean error between the predicted curve and the mean experimental mass loss rate curve was calculated as approximately 20% on average for heat fluxes ranging from 30 to 70 kW·m⁻², which is within the mean experimental uncertainty. PMID:28793556

  11. Fatigue Life Methodology for Bonded Composite Skin/Stringer Configurations

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald; Paris, Isabelle L.; OBrien, T. Kevin; Minguet, Pierre J.

    2001-01-01

    A methodology is presented for determining the fatigue life of composite structures based on fatigue characterization data and geometric nonlinear finite element (FE) analyses. To demonstrate the approach, predicted results were compared to fatigue tests performed on specimens which represented a tapered composite flange bonded onto a composite skin. In a first step, tension tests were performed to evaluate the debonding mechanisms between the flange and the skin. In a second step, a 2D FE model was developed to analyze the tests. To predict matrix cracking onset, the relationship between the tension load and the maximum principal stresses transverse to the fiber direction was determined through FE analysis. Transverse tension fatigue life data were used to generate an onset fatigue life P-N curve for matrix cracking. The resulting prediction was in good agreement with data from the fatigue tests. In a third step, a fracture mechanics approach based on FE analysis was used to determine the relationship between the tension load and the critical energy release rate. Mixed-mode energy release rate fatigue life data were used to create a fatigue life onset G-N curve for delamination. The resulting prediction was in good agreement with data from the fatigue tests. Further, the prediction curve for cumulative life to failure was generated from the previous onset fatigue life curves. The results showed that the methodology offers significant potential to predict the cumulative fatigue life of composite structures.

  12. Methodology for speech assessment in the Scandcleft project--an international randomized clinical trial on palatal surgery: experiences from a pilot study.

    PubMed

    Lohmander, A; Willadsen, E; Persson, C; Henningsson, G; Bowden, M; Hutters, B

    2009-07-01

    To present the methodology for speech assessment in the Scandcleft project and discuss issues from a pilot study. Description of methodology and blinded test for speech assessment. Speech samples and instructions for data collection and analysis for comparisons of speech outcomes across five included languages were developed and tested. PARTICIPANTS AND MATERIALS: Randomly selected video recordings of 10 5-year-old children from each language (n = 50) were included in the project. Speech material consisted of test consonants in single words, connected speech, and syllable chains with nasal consonants. Five experienced speech and language pathologists participated as observers. Narrow phonetic transcription of test consonants translated into cleft speech characteristics, ordinal scale rating of resonance, and perceived velopharyngeal closure (VPC). A velopharyngeal composite score (VPC-sum) was extrapolated from raw data. Intra-agreement comparisons were performed. Range for intra-agreement for consonant analysis was 53% to 89%, for hypernasality on high vowels in single words the range was 20% to 80%, and the agreement between the VPC-sum and the overall rating of VPC was 78%. Pooling data of speakers of different languages in the same trial and comparing speech outcome across trials seems possible if the assessment of speech concerns consonants and is confined to speech units that are phonetically similar across languages. Agreed conventions and rules are important. A composite variable for perceptual assessment of velopharyngeal function during speech seems usable; whereas, the method for hypernasality evaluation requires further testing.

  13. Pilot-scale treatment of atrazine production wastewater by UV/O3/ultrasound: Factor effects and system optimization.

    PubMed

    Jing, Liang; Chen, Bing; Wen, Diya; Zheng, Jisi; Zhang, Baiyu

    2017-12-01

    This study shed light on removing atrazine from pesticide production wastewater using a pilot-scale UV/O₃/ultrasound flow-through system. A significant quadratic polynomial prediction model with an adjusted R² of 0.90 was obtained from a central composite design with response surface methodology. The optimal atrazine removal rate (97.68%) was obtained at 75 W UV power, 10.75 g h⁻¹ O₃ flow rate and 142.5 W ultrasound power. A Monte Carlo simulation-aided artificial neural network model was further developed to quantify the importance of O₃ flow rate (40%), UV power (30%) and ultrasound power (30%). Their individual and interaction effects were also discussed in terms of reaction kinetics. UV and ultrasound could both enhance the decomposition of O₃ and promote hydroxyl radical (OH·) formation. Nonetheless, the dose of O₃ was the dominant factor and must be optimized because excess O₃ can react with OH·, thereby reducing the rate of atrazine degradation. The presence of other organic compounds in the background matrix appreciably inhibited the degradation of atrazine, while the effects of Cl⁻, CO₃²⁻ and HCO₃⁻ were comparatively negligible. It was concluded that the optimization of system performance using response surface methodology and neural networks would be beneficial for scaling up UV/O₃/ultrasound treatment to the industrial level. Copyright © 2017 Elsevier Ltd. All rights reserved.
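    To illustrate the response-surface step, the sketch below fits the full quadratic model of a three-factor central composite design by least squares. The factor settings and removal rates are invented stand-ins, not the pilot-plant data.

```python
import numpy as np

def quad_features(X):
    """Expand [x1, x2, x3] into the 10 terms of a full quadratic model."""
    x1, x2, x3 = X.T
    one = np.ones(len(X))
    return np.column_stack([one, x1, x2, x3,
                            x1 * x1, x2 * x2, x3 * x3,
                            x1 * x2, x1 * x3, x2 * x3])

# Coded factor settings of a three-factor CCD (factorial, axial, centre runs)
a = 1.682
X = np.array([[-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
              [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1],
              [-a, 0, 0], [a, 0, 0], [0, -a, 0], [0, a, 0],
              [0, 0, -a], [0, 0, a], [0, 0, 0], [0, 0, 0]], dtype=float)
# Invented atrazine removal rates (%) -- stand-ins for the pilot-scale runs
y = np.array([72., 80., 85., 91., 75., 82., 88., 94.,
              70., 83., 78., 90., 74., 86., 96., 95.])

beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
y_hat = quad_features(X) @ beta
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"fitted coefficients: {np.round(beta, 2)}\nR^2 = {r2:.3f}")
```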

  14. High-rate RTK and PPP multi-GNSS positioning for small-scale dynamic displacements monitoring

    NASA Astrophysics Data System (ADS)

    Paziewski, Jacek; Sieradzki, Rafał; Baryła, Radosław; Wielgosz, Pawel

    2017-04-01

    The monitoring of dynamic displacements and deformations of engineering structures such as buildings, towers and bridges is of great interest for both practical and theoretical reasons. The most important is to provide the information required for safe maintenance of these constructions. The high temporal resolution and precision of GNSS observations make this technology well suited to the most demanding applications in terms of accuracy, availability and reliability. The GNSS technique, supported by appropriate processing methodology, may meet the specific demands and requirements of ground and structure monitoring. Thus, high-rate multi-GNSS signals may be used as a reliable source of information on dynamic displacements of the ground and engineering structures, also in real-time applications. In this study we present initial results of applying precise relative GNSS positioning to the detection of small-scale (cm-level), high-temporal-resolution dynamic displacements. The methodology and algorithms applied in self-developed software for relative positioning using high-rate dual-frequency phase and pseudorange GPS+Galileo observations are also given. Additionally, an attempt was made to apply the Precise Point Positioning technique to this application. The experiment used observations from high-rate (20 Hz) geodetic receivers. The dynamic displacements were simulated using a specially constructed device that moved the GNSS antenna with a prescribed amplitude and frequency. The results indicate that, after suitable signal processing, dynamic displacements of the GNSS antenna can be detected even at the level of a few millimetres using both relative and Precise Point Positioning techniques.
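    A minimal sketch of the detection problem, under assumed numbers: a simulated 20 Hz position series containing a millimetre-level sinusoidal displacement plus noise, with the amplitude and frequency recovered from the spectrum. None of the values come from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 20.0                                  # receiver sampling rate, Hz
t = np.arange(0, 60, 1 / fs)               # one minute of positions
true_amp, true_freq = 0.004, 0.5           # 4 mm oscillation at 0.5 Hz (assumed)
noise = rng.normal(0.0, 0.002, t.size)     # ~2 mm coordinate noise (assumed)
series = true_amp * np.sin(2 * np.pi * true_freq * t) + noise

spec = np.fft.rfft(series - series.mean())
freqs = np.fft.rfftfreq(series.size, d=1 / fs)
peak = np.argmax(np.abs(spec[1:])) + 1     # skip the DC bin
amp_est = 2 * np.abs(spec[peak]) / series.size
print(f"detected {amp_est * 1000:.1f} mm displacement at {freqs[peak]:.2f} Hz")
```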

  15. Systematic content evaluation and review of measurement properties of questionnaires for measuring self-reported fatigue among older people.

    PubMed

    Egerton, Thorlene; Riphagen, Ingrid I; Nygård, Arnhild J; Thingstad, Pernille; Helbostad, Jorunn L

    2015-09-01

    The assessment of fatigue in older people requires simple and user-friendly questionnaires that capture the phenomenon, yet are free from items indistinguishable from other disorders and experiences. This study aimed to evaluate the content, and systematically review and rate the measurement properties of self-report questionnaires for measuring fatigue, in order to identify the most suitable questionnaires for older people. This study firstly involved identification of questionnaires that purport to measure self-reported fatigue, and evaluation of the content using a rating scale developed for the purpose from contemporary understanding of the construct. Secondly, for the questionnaires that had acceptable content, we identified studies reporting measurement properties and rated the methodological quality of those studies according to the COSMIN system. Finally, we extracted and synthesised the results of the studies to give an overall rating for each questionnaire for each measurement property. The protocol was registered with PROSPERO (CRD42013005589). Of the 77 identified questionnaires, twelve were selected for review after content evaluation. Methodological quality varied, and there was a lack of information on measurement error and responsiveness. The PROMIS-Fatigue item bank and short forms perform best. The FACIT-Fatigue scale, Parkinson's Fatigue Scale, Perform Questionnaire, and Uni-dimensional Fatigue Impact Scale also perform well and can be recommended. Minor modifications to improve performance are suggested. Further evaluation of unresolved measurement properties, particularly with samples including older people, is needed for all the recommended questionnaires.

  16. Using a Lean Six Sigma Approach to Yield Sustained Pressure Ulcer Prevention for Complex Critical Care Patients.

    PubMed

    Donovan, Elizabeth A; Manta, Christine J; Goldsack, Jennifer C; Collins, Michelle L

    2016-01-01

    Under value-based purchasing, Medicare withholds reimbursements for hospital-acquired pressure ulcer occurrence and rewards hospitals that meet performance standards. With little evidence of a validated prevention process, nurse managers are challenged to find evidence-based interventions. The aim of this study was to reduce the unit-acquired pressure ulcer (UAPU) rate on targeted intensive care and step-down units by 15% using Lean Six Sigma (LSS) methodology. An interdisciplinary team designed a pilot program using LSS methodology to test 4 interventions: standardized documentation, equipment monitoring, patient out-of-bed-to-chair monitoring, and a rounding checklist. During the pilot, the UAPU rate decreased from 4.4% to 2.8%, exceeding the goal of a 15% reduction. The rate remained below the goal through the program control phase at 2.9%, demonstrating a statistically significant reduction after intervention implementation. The program significantly reduced UAPU rates in high-risk populations. LSS methodologies are a sustainable approach to reducing hospital-acquired conditions that should be broadly tested and implemented.

  17. Quality and methodological challenges in Internet-based mental health trials.

    PubMed

    Ye, Xibiao; Bapuji, Sunita Bayyavarapu; Winters, Shannon; Metge, Colleen; Raynard, Mellissa

    2014-08-01

    To review the quality of Internet-based mental health intervention studies and their methodological challenges. We searched multiple literature databases to identify relevant studies according to the Population, Interventions, Comparators, Outcomes, and Study Design framework. Two reviewers independently assessed selection bias, allocation bias, confounding bias, blinding, data collection methods, and withdrawals/dropouts, using the Quality Assessment Tool for Quantitative Studies. We rated each component as strong, moderate, or weak and assigned a global rating (strong, moderate, or weak) to each study. We discussed methodological issues related to study quality. Of the 122 studies included, 31 (25%), 44 (36%), and 47 (39%) were rated strong, moderate, and weak, respectively. Only five studies were rated strong for all six quality components (three of them were published by the same group). Lack of blinding, selection bias, and low adherence were the top three challenges in Internet-based mental health intervention studies. The overall quality of Internet-based mental health intervention studies needs to improve. In particular, studies need to improve sample selection, intervention allocation, and blinding.

  18. Working group written presentation: Solar radiation

    NASA Technical Reports Server (NTRS)

    Slemp, Wayne S.

    1989-01-01

    The members of the Solar Radiation Working Group arrived at two major solar radiation technology needs: (1) generation of a long-term flight database; and (2) development of a standardized UV testing methodology. The flight database should include 1- to 5-year exposures of optical filters, windows, thermal control coatings, hardened coatings, polymeric films, and structural composites. The UV flux and wavelength distribution, as well as particulate radiation flux and energy, should be measured during this flight exposure. A standard testing methodology is needed to establish techniques for highly accelerated UV exposure that correlate well with flight test data. Currently, UV exposure can be accelerated only to about 3 solar constants while still correlating well with flight exposure data. With space missions lasting up to 30 years, acceleration rates of 30 to 100X are needed for efficient laboratory testing.

  19. Using Mixed Methods to Evaluate a Community Intervention for Sexual Assault Survivors: A Methodological Tale.

    PubMed

    Campbell, Rebecca; Patterson, Debra; Bybee, Deborah

    2011-03-01

    This article reviews current epistemological and design issues in the mixed methods literature and then examines the application of one specific design, a sequential explanatory mixed methods design, in an evaluation of a community-based intervention to improve postassault care for sexual assault survivors. Guided by a pragmatist epistemological framework, this study collected quantitative and qualitative data to understand how the implementation of a Sexual Assault Nurse Examiner (SANE) program affected prosecution rates of adult sexual assault cases in a large midwestern community. Quantitative results indicated that the program was successful in affecting legal systems change and the qualitative data revealed the mediating mechanisms of the intervention's effectiveness. Challenges of implementing this design are discussed, including epistemological and practical difficulties that developed from blending methodologies into a single project. © The Author(s) 2011.

  20. Low-order modeling of internal heat transfer in biomass particle pyrolysis

    DOE PAGES

    Wiggins, Gavin M.; Daw, C. Stuart; Ciesielski, Peter N.

    2016-05-11

    We present a computationally efficient, one-dimensional simulation methodology for biomass particle heating under conditions typical of fast pyrolysis. Our methodology is based on identifying the rate-limiting geometric and structural factors for conductive heat transport in biomass particle models with realistic morphology to develop low-order approximations that behave appropriately. Comparisons of transient temperature trends predicted by our one-dimensional method with three-dimensional simulations of woody biomass particles reveal good agreement, if the appropriate equivalent spherical diameter and bulk thermal properties are used. Here, we conclude that, for particle sizes and heating regimes typical of fast pyrolysis, it is possible to simulate biomass particle heating with reasonable accuracy and minimal computational overhead, even when variable size, aspherical shape, anisotropic conductivity, and complex, species-specific internal pore geometry are incorporated.
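    The record does not give its equations, but a minimal sketch of the kind of low-order surrogate it describes is an explicit 1-D conduction solve on an equivalent sphere; the diffusivity, diameter, and temperatures below are assumed for illustration only.

```python
import numpy as np

alpha = 1.5e-7          # effective thermal diffusivity of wood, m^2/s (assumed)
d_eq = 0.5e-3           # equivalent spherical diameter, m (assumed)
T0, T_surf = 300.0, 773.0   # initial particle and reactor temperatures, K

n = 50
r = np.linspace(0.0, d_eq / 2, n)
dr = r[1] - r[0]
dt = 0.1 * dr ** 2 / alpha      # explicit step, stable for the centre node too
T = np.full(n, T0)
T[-1] = T_surf                  # step change imposed at the particle surface

time = 0.0
target = T0 + 0.99 * (T_surf - T0)
while T[0] < target:            # march until the centre is 99% heated
    lap = np.zeros(n)
    # spherical Laplacian: T'' + (2/r) T' at interior nodes
    lap[1:-1] = ((T[2:] - 2 * T[1:-1] + T[:-2]) / dr ** 2
                 + (2.0 / r[1:-1]) * (T[2:] - T[:-2]) / (2 * dr))
    lap[0] = 6.0 * (T[1] - T[0]) / dr ** 2   # symmetry limit at r = 0
    T[:-1] += alpha * dt * lap[:-1]
    time += dt
print(f"centre reaches 99% of the surface step after {time:.3f} s")
```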

  1. Low-Order Modeling of Internal Heat Transfer in Biomass Particle Pyrolysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiggins, Gavin M.; Ciesielski, Peter N.; Daw, C. Stuart

    2016-06-16

    We present a computationally efficient, one-dimensional simulation methodology for biomass particle heating under conditions typical of fast pyrolysis. Our methodology is based on identifying the rate-limiting geometric and structural factors for conductive heat transport in biomass particle models with realistic morphology to develop low-order approximations that behave appropriately. Comparisons of transient temperature trends predicted by our one-dimensional method with three-dimensional simulations of woody biomass particles reveal good agreement, if the appropriate equivalent spherical diameter and bulk thermal properties are used. We conclude that, for particle sizes and heating regimes typical of fast pyrolysis, it is possible to simulate biomass particle heating with reasonable accuracy and minimal computational overhead, even when variable size, aspherical shape, anisotropic conductivity, and complex, species-specific internal pore geometry are incorporated.

  2. A Sequential Fluid-mechanic Chemical-kinetic Model of Propane HCCI Combustion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aceves, S M; Flowers, D L; Martinez-Frias, J

    2000-11-29

    We have developed a methodology for predicting combustion and emissions in a Homogeneous Charge Compression Ignition (HCCI) Engine. This methodology combines a detailed fluid mechanics code with a detailed chemical kinetics code. Instead of directly linking the two codes, which would require an extremely long computational time, the methodology consists of first running the fluid mechanics code to obtain temperature profiles as a function of time. These temperature profiles are then used as input to a multi-zone chemical kinetics code. The advantage of this procedure is that a small number of zones (10) is enough to obtain accurate results. This procedure achieves the benefits of linking the fluid mechanics and the chemical kinetics codes with a great reduction in the computational effort, to a level that can be handled with current computers. The success of this procedure is in large part a consequence of the fact that for much of the compression stroke the chemistry is inactive and thus has little influence on fluid mechanics and heat transfer. Then, when chemistry is active, combustion is rather sudden, leaving little time for interaction between chemistry and fluid mixing and heat transfer. This sequential methodology has been capable of explaining the main characteristics of HCCI combustion that have been observed in experiments. In this paper, we use our model to explore an HCCI engine running on propane. The paper compares experimental and numerical pressure traces, heat release rates, and hydrocarbon and carbon monoxide emissions. The results show an excellent agreement, even in parameters that are difficult to predict, such as chemical heat release rates. Carbon monoxide emissions are reasonably well predicted, even though it is intrinsically difficult to make good predictions of CO emissions in HCCI engines. The paper includes a sensitivity study on the effect of the heat transfer correlation on the results of the analysis. Importantly, the paper also shows a numerical study on how parameters such as swirl rate, crevices and ceramic walls could help in reducing HC and CO emissions from HCCI engines.
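    A toy sketch of the sequential idea, under stated assumptions: synthesized zone temperature histories stand in for the fluid-mechanics output, and a one-step Arrhenius reaction stands in for the detailed kinetics. None of the parameters come from the paper.

```python
import numpy as np

R = 8.314
A, Ea = 2.0e10, 1.6e5       # one-step Arrhenius parameters, 1/s and J/mol (assumed)

t = np.linspace(0.0, 0.02, 2000)   # 20 ms simulated window
dt = t[1] - t[0]

# Stand-in for the fluid-mechanics output: ten zone temperature histories
# sharing a compression-like peak, offset from core to near-wall zones.
base_T = 750.0 + 450.0 * np.exp(-((t - 0.012) / 0.004) ** 2)
zone_offsets = np.linspace(60.0, -60.0, 10)        # K, hottest to coldest zone

heat_release = np.zeros_like(t)
for dT in zone_offsets:
    Y = 1.0                                        # zone fuel mass fraction
    for i, T in enumerate(base_T + dT):
        dY = min(A * np.exp(-Ea / (R * T)) * Y * dt, Y)  # consumed this step
        Y -= dY
        heat_release[i] += dY / zone_offsets.size  # normalized heat release
print(f"cumulative heat-release fraction: {heat_release.sum():.2f}")
```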

  3. Rates of climatic niche evolution are correlated with species richness in a large and ecologically diverse radiation of songbirds.

    PubMed

    Title, Pascal O; Burns, Kevin J

    2015-05-01

    By employing a recently inferred phylogeny and museum occurrence records, we examine the relationship of ecological niche evolution to diversification in the largest family of songbirds, the tanagers (Thraupidae). We test whether differences in species numbers among the major clades of tanagers can be explained by differences in the rate of climatic niche evolution. We develop a methodological pipeline to process and filter occurrence records. We find that, of the ecological variables examined, clade richness is higher in clades with higher rates of climatic niche evolution, and that this rate is also greater for clades that occupy a greater extent of climatic space. Additionally, we find that more speciose clades contain species with narrower niche breadths, suggesting that clades in which species are more successful at diversifying across climatic gradients have greater potential for speciation or are more buffered from the risk of extinction. © 2015 John Wiley & Sons Ltd/CNRS.

  4. Development and Attestation of Gamma-Ray Measurement Methodologies for use by Rostekhnadzor Inspectors in the Russian Federation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Sanders

    2006-09-01

    Development and attestation of gamma-ray non-destructive assay measurement methodologies for use by inspectors of the Russian Federal Service for Environmental, Technological, and Nuclear Oversight (Rostekhnadzor, formerly Gosatomnadzor or GAN), as well as for use by Russian nuclear facilities, has been completed. Specifically, a methodology utilizing the gamma-ray multi-group analysis (MGA) method for determining plutonium isotopic composition has been developed, while existing methodologies for determining uranium enrichment and isotopic composition have been revised to make them more appropriate to the material types and conditions present in nuclear facilities in the Russian Federation. This paper will discuss the development and revision of these methodologies, the metrological characteristics of the final methodologies, as well as the limitations and concerns specific to the utilization of these analysis methods in the Russian Federation.

  5. Development and Evaluation of a Training Program for Organ Procurement Coordinators Using Standardized Patient Methodology.

    PubMed

    Odabasi, Orhan; Elcin, Melih; Uzun Basusta, Bilge; Gulkaya Anik, Esin; Aki, Tuncay F; Bozoklar, Ata

    2015-12-01

    The low rate of consent by next of kin of donor-eligible patients is a major limiting factor in organ transplant. Educating health care professionals about their role may lead to measurable improvements in the process. Our aim was to describe the developmental steps of a communication skills training program for health care professionals using standardized patients and to evaluate the results. We developed a rubric and 5 cases for standardized family interviews. The 20 participants interviewed standardized families at the beginning and at the end of the training course, with interviews followed by debriefing sessions. Participants also provided feedback before and after the course. The performance of each participant was assessed by his or her peers using the rubric. We calculated the generalizability coefficient to measure the reliability of the rubric and used the Wilcoxon signed rank test to compare achievement among participants. Statistical analyses were performed with SPSS software (SPSS: An IBM Company, version 17.0, IBM Corporation, Armonk, NY, USA). All participants received higher scores in their second interview, including novice participants who expressed great discomfort during their first interview. The participants rated the scenarios and the standardized patients as very representative of real-life situations, with feedback forms showing that the interviews, the video recording sessions, and the debriefing sessions contributed to their learning. Our program was designed to meet the current expectations and implications in the field of donor consent from next of kin. Results showed that our training program developed using standardized patient methodology was effective in obtaining the communication skills needed for family interviews during the consent process. The rubric developed during the study was a valid and reliable assessment tool that could be used in further educational activities. The participants showed significant improvements in communication skills.
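    For illustration, the paired comparison named in the record (first- vs. second-interview rubric scores for the same trainees, tested with the Wilcoxon signed-rank test) might look like the following sketch; the scores are invented.

```python
import numpy as np
from scipy.stats import wilcoxon

# Invented rubric scores for the same 10 trainees, first vs. second interview
first = np.array([62, 55, 70, 48, 66, 59, 73, 51, 64, 57])
second = np.array([78, 69, 82, 65, 80, 71, 85, 70, 79, 74])

stat, p = wilcoxon(first, second)
print(f"W = {stat}, p = {p:.4f}")   # every participant improved -> small p
```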

  6. Using grounded theory methodology to conceptualize the mother-infant communication dynamic: potential application to compliance with infant feeding recommendations.

    PubMed

    Waller, Jennifer; Bower, Katherine M; Spence, Marsha; Kavanagh, Katherine F

    2015-10-01

    Excessive, rapid weight gain in early infancy has been linked to risk of later overweight and obesity. Inappropriate infant feeding practices associated with this rapid weight gain are currently of great interest. Understanding the origin of these practices may increase the effectiveness of interventions. Low-income populations in the Southeastern United States are at increased risk for development of inappropriate infant feeding practices, secondary to the relatively low rates of breastfeeding reported from this region. The objective was to use grounded theory methodology (GTM) to explore interactions between mothers and infants that may influence development of feeding practices, and to do so among low-income, primiparous, Southeastern United States mothers. Analysis of 15 in-depth phone interviews resulted in development of a theoretical model in which Mother-Infant Communication Dynamic emerged as the central concept. The central concept suggests a communication pattern developed over the first year of life, based on a positive feedback loop, which is harmonious and results in the maternal perception of mother and infant now speaking the same language. Importantly, though harmonious, this dynamic may result from inaccurate maternal interpretation of infant cues and behaviours, subsequently leading to inappropriate infant feeding practices. Future research should test this theoretical model using direct observation of mother-infant communication, to increase the understanding of maternal interpretation of infant cues. Subsequently, interventions targeting accurate maternal interpretation of and response to infant cues, and impact on rate of infant weight gain could be tested. If effective, health care providers could potentially use these concepts to attenuate excess rapid infant weight gain. © 2013 John Wiley & Sons Ltd.

  7. 76 FR 59896 - Wage Methodology for the Temporary Non-Agricultural Employment H-2B Program; Postponement of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... Wage Rule revised the methodology by which we calculate the prevailing wages to be paid to H-2B workers... methodology by which we calculate the prevailing wages to be paid to H-2B workers and United States (U.S... concerning the calculation of the prevailing wage rate in the H-2B program. CATA v. Solis, Dkt. No. 103-1...

  8. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
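    A minimal sketch of the nested formulation the abstract takes as its baseline: an outer design optimizer that calls an inner reliability analysis at every iterate. The toy linear limit state below has a closed-form reliability index; the statistics and target are assumptions, not from the dissertation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

mu_S, sig_S, sig_R = 100.0, 10.0, 5.0   # load mean/sd, resistance sd (assumed)
beta_target = 3.0                        # required reliability index (assumed)

def reliability_index(d):
    """Inner loop: exact index for the linear limit state g = R - S."""
    return (d[0] - mu_S) / np.hypot(sig_R, sig_S)

res = minimize(lambda d: d[0],           # cost grows with mean resistance
               x0=[150.0], method="SLSQP",
               constraints={"type": "ineq",
                            "fun": lambda d: reliability_index(d) - beta_target})
pf = norm.cdf(-reliability_index(res.x))
print(f"optimal mean resistance: {res.x[0]:.2f}, failure probability: {pf:.2e}")
```

    In the nested form every outer iterate triggers a full inner reliability analysis; the unilevel reformulation studied in the dissertation replaces that inner loop with first-order optimality conditions imposed as constraints, which is where the reported cost savings come from.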

  9. Back to BAC: The Use of Infectious Clone Technologies for Viral Mutagenesis

    PubMed Central

    Hall, Robyn N.; Meers, Joanne; Fowler, Elizabeth; Mahony, Timothy

    2012-01-01

    Bacterial artificial chromosome (BAC) vectors were first developed to facilitate the propagation and manipulation of large DNA fragments in molecular biology studies for uses such as genome sequencing projects and genetic disease models. To facilitate these studies, methodologies have been developed to introduce specific mutations that can be directly applied to the mutagenesis of infectious clones (icBAC) using BAC technologies. This has resulted in rapid identification of gene function and expression at unprecedented rates. Here we review the major developments in BAC mutagenesis in vitro. This review summarises the technologies used to construct and introduce mutations into herpesvirus icBAC. It also explores developing technologies likely to provide the next leap in understanding these important viruses. PMID:22470833

  10. Managing In-House Development of a Campus-Wide Information System

    ERIC Educational Resources Information Center

    Shurville, Simon; Williams, John

    2005-01-01

    Purpose: To show how a combination of hard and soft project and change management methodologies guided successful in-house development of a campus-wide information system. Design/methodology/approach: A case study of the methodologies and management structures that guided the development is presented. Findings: Applying a combination of the…

  11. The Airline Quality Rating 2003

    NASA Technical Reports Server (NTRS)

    Bowen, Brent D.; Headley, Dean E.

    2003-01-01

    The Airline Quality Rating (AQR) was developed and first announced in early 1991 as an objective method of comparing airline quality on combined multiple performance criteria. This current report, the Airline Quality Rating 2003, reflects monthly Airline Quality Rating scores for 2002. AQR scores for the calendar year 2002 are based on 15 elements that focus on airline performance areas important to air travel consumers. The Airline Quality Rating 2003 is a summary of month-by-month quality ratings for the 10 largest U.S. airlines operating during 2002. Using the Airline Quality Rating system of weighted averages and monthly performance data in the areas of on-time arrivals, involuntary denied boardings, mishandled baggage, and a combination of 12 customer complaint categories, airlines' comparative performance for the calendar year of 2002 is reported. This research monograph contains a brief summary of the AQR methodology, detailed data and charts that track comparative quality for domestic airline operations for the 12-month period of 2002, and industry average results. Also, comparative Airline Quality Rating data for 2001 are included for each airline to provide historical perspective regarding performance quality in the industry.
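    A sketch of the weighted-average structure described above; the four criteria come from the abstract, but the weights and monthly figures below are illustrative assumptions rather than the published AQR values.

```python
def aqr_score(on_time, denied_boardings, mishandled_bags, complaints,
              weights=(8.63, 8.03, 7.92, 7.17)):   # assumed weight magnitudes
    """Weighted average: the on-time criterion adds, the rest subtract."""
    w_ot, w_db, w_mb, w_cc = weights
    numerator = (w_ot * on_time - w_db * denied_boardings
                 - w_mb * mishandled_bags - w_cc * complaints)
    return numerator / sum(weights)

# Hypothetical month: on-time fraction, then rates per 10,000 passengers
print(f"AQR = {aqr_score(0.82, 0.95, 4.36, 1.12):+.3f}")
```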

  12. The Airline Quality Rating 2002

    NASA Technical Reports Server (NTRS)

    Bowen, Brent D.; Headley, Dean E.

    2002-01-01

    The Airline Quality Rating (AQR) was developed and first announced in early 1991 as an objective method of comparing airline quality on combined multiple performance criteria. This current report, Airline Quality Rating 2002, reflects monthly Airline Quality Rating scores for 2001. AQR scores for the calendar year 2001 are based on 15 elements that focus on airline performance areas important to air travel consumers. The Airline Quality Rating 2002 is a summary of month-by-month quality ratings for the 11 largest U.S. airlines operating during 2001. Using the Airline Quality Rating system of weighted averages and monthly performance data in the areas of on-time arrivals, involuntary denied boardings, mishandled baggage, and a combination of 12 customer complaint categories, airlines' comparative performance for the calendar year of 2001 is reported. This research monograph contains a brief summary of the AQR methodology, detailed data and charts that track comparative quality for domestic airline operations for the 12-month period of 2001, and industry average results. Also, comparative Airline Quality Rating data for 2000 are included for each airline to provide historical perspective regarding performance quality in the industry.

  13. The Airline Quality Rating 1999

    NASA Technical Reports Server (NTRS)

    Bowen, Brent D.; Headley, Dean E.

    1999-01-01

    The Airline Quality Rating (AQR) was developed and first announced in early 1991 as an objective method of comparing airline performance on combined multiple criteria. This current report, Airline Quality Rating 1999, reflects an updated approach to calculating monthly Airline Quality Rating scores for 1998. AQR scores for the calendar year 1998 are based on 15 elements that focus on airline performance areas important to air travel consumers. The Airline Quality Rating is a summary of month-by-month quality ratings for the ten major U.S. airlines operating during 1998. Using the Airline Quality Rating system of weighted averages and monthly performance data in the areas of on-time arrivals, involuntary denied boardings, mishandled baggage, and a combination of 12 customer complaint categories, major airlines' comparative performance for the calendar year 1998 is reported. This research monograph contains a brief summary of the AQR methodology, detailed data and charts that track comparative quality for major airlines' domestic operations for the 12-month period of 1998, and industry average results. Also, comparative Airline Quality Rating data for 1997, using the updated criteria, are included to provide a reference point regarding quality in the industry.

  14. The Airline Quality Rating 2004

    NASA Technical Reports Server (NTRS)

    Bowen, Brent D.; Headley, Dean E.

    2004-01-01

    The Airline Quality Rating (AQR) was developed and first announced in early 1991 as an objective method of comparing airline quality on combined multiple performance criteria. This current report, the Airline Quality Rating 2004, reflects monthly Airline Quality Rating scores for 2003. AQR scores for the calendar year 2003 are based on 15 elements in four major areas that focus on airline performance aspects important to air travel consumers. The Airline Quality Rating 2004 is a summary of month-by-month quality ratings for U.S. airlines that had at least 1% of domestic passenger volume during 2003. Using the Airline Quality Rating system of weighted averages and monthly performance data in the areas of on-time arrivals, involuntary denied boardings, mishandled baggage, and a combination of 12 customer complaint categories, airlines' comparative performance for the calendar year of 2003 is reported. This research monograph contains a brief summary of the AQR methodology, detailed data and charts that track comparative quality for domestic airline operations for the 12-month period of 2003, and industry results. Also, comparative Airline Quality Rating data for 2002 are included, where available, to provide historical perspective.

  15. The Airline Quality Rating 2001

    NASA Technical Reports Server (NTRS)

    Bowen, Brent D.; Headley, Dean E.

    2001-01-01

    The Airline Quality Rating (AQR) was developed and first announced in early 1991 as an objective method of comparing airline quality on combined multiple performance criteria. This current report, Airline Quality Rating 2001, reflects monthly Airline Quality Rating scores for 2000. AQR scores for the calendar year 2000 are based on 15 elements that focus on airline performance areas important to air travel consumers. The Airline Quality Rating 2001 is a summary of month-by-month quality ratings for the ten major U.S. airlines operating during 2000. Using the Airline Quality Rating system of weighted averages and monthly performance data in the areas of on-time arrivals, involuntary denied boardings, mishandled baggage, and a combination of 12 customer complaint categories, major airlines' comparative performance for the calendar year of 2000 is reported. This research monograph contains a brief summary of the AQR methodology, detailed data and charts that track comparative quality for major airlines' domestic operations for the 12-month period of 2000, and industry average results. Also, comparative Airline Quality Rating data for 1999 are included for each airline to provide historical perspective regarding performance quality in the industry.

  16. Association of journal quality indicators with methodological quality of clinical research articles.

    PubMed

    Lee, Kirby P; Schotland, Marieka; Bacchetti, Peter; Bero, Lisa A

    2002-06-05

    The ability to identify scientific journals that publish high-quality research would help clinicians, scientists, and health-policy analysts to select the most up-to-date medical literature to review. To assess whether journal characteristics of (1) peer-review status, (2) citation rate, (3) impact factor, (4) circulation, (5) manuscript acceptance rate, (6) MEDLINE indexing, and (7) Brandon/Hill Library List indexing are predictors of methodological quality of research articles, we conducted a cross-sectional study of 243 original research articles involving human subjects published in general internal medical journals. The mean (SD) quality score of the 243 articles was 1.37 (0.22). All journals reported a peer-review process and were indexed on MEDLINE. In models that controlled for article type (randomized controlled trial [RCT] or non-RCT), journal citation rate was the most statistically significant predictor (0.051 increase per doubling; 95% confidence interval [CI], 0.037-0.065; P<.001). In separate analyses by article type, acceptance rate was the strongest predictor for RCT quality (-0.113 per doubling; 95% CI, -0.148 to -0.078; P<.001), while journal citation rate was the most predictive factor for non-RCT quality (0.051 per doubling; 95% CI, 0.044-0.059; P<.001). High citation rates, impact factors, and circulation rates, and low manuscript acceptance rates and indexing on Brandon/Hill Library List appear to be predictive of higher methodological quality scores for journal articles.

  17. Identifying treatment effect heterogeneity in clinical trials using subpopulations of events: STEPP.

    PubMed

    Lazar, Ann A; Bonetti, Marco; Cole, Bernard F; Yip, Wai-Ki; Gelber, Richard D

    2016-04-01

    Investigators conducting randomized clinical trials often explore treatment effect heterogeneity to assess whether treatment efficacy varies according to patient characteristics. Identifying heterogeneity is central to making informed personalized healthcare decisions. Treatment effect heterogeneity can be investigated using subpopulation treatment effect pattern plot (STEPP), a non-parametric graphical approach that constructs overlapping patient subpopulations with varying values of a characteristic. Procedures for statistical testing using subpopulation treatment effect pattern plot when the endpoint of interest is survival remain an area of active investigation. A STEPP analysis was used to explore patterns of absolute and relative treatment effects for varying levels of a breast cancer biomarker, Ki-67, in the phase III Breast International Group 1-98 randomized clinical trial, comparing letrozole to tamoxifen as adjuvant therapy for postmenopausal women with hormone receptor-positive breast cancer. Absolute treatment effects were measured by differences in 4-year cumulative incidence of breast cancer recurrence, while relative effects were measured by the subdistribution hazard ratio in the presence of competing risks using O-E (observed-minus-expected) methodology, an intuitive non-parametric method. While estimation of hazard ratio values based on O-E methodology has been shown, a similar development for the subdistribution hazard ratio has not. Furthermore, we observed that the subpopulation treatment effect pattern plot analysis may not produce results, even with 100 patients within each subpopulation. After further investigation through simulation studies, we observed inflation of the type I error rate of the traditional test statistic and sometimes singular variance-covariance matrix estimates that may lead to results not being produced. This is due to the lack of sufficient number of events within the subpopulations, which we refer to as instability of the subpopulation treatment effect pattern plot analysis. We introduce methodology designed to improve stability of the subpopulation treatment effect pattern plot analysis and generalize O-E methodology to the competing risks setting. Simulation studies were designed to assess the type I error rate of the tests for a variety of treatment effect measures, including subdistribution hazard ratio based on O-E estimation. This subpopulation treatment effect pattern plot methodology and standard regression modeling were used to evaluate heterogeneity of Ki-67 in the Breast International Group 1-98 randomized clinical trial. We introduce methodology that generalizes O-E methodology to the competing risks setting and that improves stability of the STEPP analysis by pre-specifying the number of events across subpopulations while controlling the type I error rate. The subpopulation treatment effect pattern plot analysis of the Breast International Group 1-98 randomized clinical trial showed that patients with high Ki-67 percentages may benefit most from letrozole, while heterogeneity was not detected using standard regression modeling. The STEPP methodology can be used to study complex patterns of treatment effect heterogeneity, as illustrated in the Breast International Group 1-98 randomized clinical trial. For the subpopulation treatment effect pattern plot analysis, we recommend a minimum of 20 events within each subpopulation. © The Author(s) 2015.
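    A schematic sketch of the overlapping-subpopulation construction, on simulated data: windows slide across the covariate and a simple risk difference is computed in each. The real method uses cumulative incidence in the presence of competing risks and O-E estimation, which this sketch does not reproduce.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 600
ki67 = rng.uniform(0.0, 100.0, n)           # biomarker values (simulated)
arm = rng.integers(0, 2, n)                 # 0 = tamoxifen, 1 = letrozole
# Simulated 4-year recurrence: the treatment helps more at high Ki-67
p_event = 0.25 - arm * (0.02 + 0.0015 * ki67)
event = rng.random(n) < p_event

window, step = 25.0, 5.0                    # window width and slide, Ki-67 units
for lo in np.arange(0.0, 100.0 - window + step, step):
    in_win = (ki67 >= lo) & (ki67 < lo + window)
    ctrl, trt = event[in_win & (arm == 0)], event[in_win & (arm == 1)]
    if min(ctrl.size, trt.size) < 20:       # require adequately sized arms
        continue
    print(f"Ki-67 in [{lo:4.0f}, {lo + window:4.0f}): "
          f"risk difference = {ctrl.mean() - trt.mean():+.3f}")
```

    The authors' recommendation of a minimum of 20 events per subpopulation addresses exactly the instability this kind of filter guards against: too few events makes the window-level estimates (and the test's variance-covariance matrix) unreliable.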

  18. Examination of commercial aviation operational energy conservation strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Forty-seven fuel conservation strategies are identified for commercial aviation and the fuel saving potential, costs, constraints, and current implementation levels of these options are examined. This assessment is based on a comprehensive review of published data and discussions with representatives from industry and government. Analyses were performed to quantify the fuel saving potential of each option, and to assess the fuel savings achieved to date by the airline industry. Those options requiring further government support for option implementation were identified, rated, and ranked in accordance with a rating methodology developed in the study. Finally, recommendations are made for future government efforts in the area of fuel conservation in commercial aviation.

  19. Capital projects: Economic and financial analyses of nine capital projects in Egypt. Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanrahan, M.; Walker, J.

    1994-03-01

    Over the period 1977-92, the US Agency for International Development (A.I.D.) funded nine capital projects in Egypt, which collectively increased electric power generation, introduced a modern telephone system in Cairo and Alexandria, and rehabilitated a water and sewer system that served 23 million people. This study presents detailed ex post facto analyses of the projects' economic and financial internal rates of return. The methodology, assumptions, and data are examined and described. Results indicate a mixed performance, with generally low to medium financial and economic rates of return. In large measure, the poor performance was due to the Egyptian Government's poor economic policies.

  20. New forecasting methodology indicates more disease and earlier mortality ahead for today's younger Americans.

    PubMed

    Reither, Eric N; Olshansky, S Jay; Yang, Yang

    2011-08-01

    Traditional methods of projecting population health statistics, such as estimating future death rates, can give inaccurate results and lead to inferior or even poor policy decisions. A new "three-dimensional" method of forecasting vital health statistics is more accurate because it takes into account the delayed effects of the health risks being accumulated by today's younger generations. Applying this forecasting technique to the US obesity epidemic suggests that future death rates and health care expenditures could be far worse than currently anticipated. We suggest that public policy makers adopt this more robust forecasting tool and redouble efforts to develop and implement effective obesity-related prevention programs and interventions.

  1. An ontology-driven, case-based clinical decision support model for removable partial denture design

    NASA Astrophysics Data System (ADS)

    Chen, Qingxiao; Wu, Ji; Li, Shusen; Lyu, Peijun; Wang, Yong; Li, Miao

    2016-06-01

    We present the initial work toward developing a clinical decision support model for specific design of removable partial dentures (RPDs) in dentistry. We developed an ontological paradigm to represent knowledge of a patient’s oral conditions and denture component parts. During the case-based reasoning process, a cosine similarity algorithm was applied to calculate similarity values between input patients and standard ontology cases. A group of designs from the most similar cases were output as the final results. To evaluate this model, the output designs of RPDs for 104 randomly selected patients were compared with those selected by professionals. A receiver operating characteristic (ROC) curve was created by plotting the true-positive rate against the false-positive rate at various threshold settings, and the area under the curve (AUC-ROC) was computed. The precision at position 5 of the retrieved cases was 0.67 and at the top of the curve it was 0.96, both of which are very high. The mean average precision (MAP) was 0.61 and the normalized discounted cumulative gain (NDCG) was 0.74, both of which confirmed the efficient performance of our model. All the metrics demonstrated the efficiency of our model. This methodology merits further research development to match clinical applications for designing RPDs. This paper is organized as follows. After the introduction and description of the basis for the paper, the evaluation and results are presented in Section 2. Section 3 provides a discussion of the methodology and results. Section 4 describes the details of the ontology, similarity algorithm, and application.
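    For illustration, the retrieval step might be sketched as follows, with invented binary feature encodings standing in for the ontology-based patient representation.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical binary encodings of oral-condition features per stored case
case_library = {
    "case_A": np.array([1, 0, 1, 1, 0, 1], float),
    "case_B": np.array([1, 1, 0, 1, 0, 0], float),
    "case_C": np.array([0, 0, 1, 1, 1, 1], float),
}
patient = np.array([1, 0, 1, 1, 1, 1], float)   # incoming patient's features

# Rank stored cases by similarity; the top cases contribute candidate designs
ranked = sorted(case_library.items(),
                key=lambda kv: cosine(patient, kv[1]), reverse=True)
for name, vec in ranked:
    print(name, f"{cosine(patient, vec):.3f}")
```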

  2. An ontology-driven, case-based clinical decision support model for removable partial denture design.

    PubMed

    Chen, Qingxiao; Wu, Ji; Li, Shusen; Lyu, Peijun; Wang, Yong; Li, Miao

    2016-06-14

    We present the initial work toward developing a clinical decision support model for specific design of removable partial dentures (RPDs) in dentistry. We developed an ontological paradigm to represent knowledge of a patient's oral conditions and denture component parts. During the case-based reasoning process, a cosine similarity algorithm was applied to calculate similarity values between input patients and standard ontology cases. A group of designs from the most similar cases were output as the final results. To evaluate this model, the output designs of RPDs for 104 randomly selected patients were compared with those selected by professionals. A receiver operating characteristic (ROC) curve was created by plotting the true-positive rate against the false-positive rate at various threshold settings, and the area under the curve (AUC-ROC) was computed. The precision at position 5 of the retrieved cases was 0.67 and at the top of the curve it was 0.96, both of which are very high. The mean average precision (MAP) was 0.61 and the normalized discounted cumulative gain (NDCG) was 0.74, both of which confirmed the efficient performance of our model. All the metrics demonstrated the efficiency of our model. This methodology merits further research development to match clinical applications for designing RPDs. This paper is organized as follows. After the introduction and description of the basis for the paper, the evaluation and results are presented in Section 2. Section 3 provides a discussion of the methodology and results. Section 4 describes the details of the ontology, similarity algorithm, and application.

  3. Determination of struvite crystallization mechanisms in urine using turbidity measurement.

    PubMed

    Triger, Aurélien; Pic, Jean-Stéphane; Cabassud, Corinne

    2012-11-15

    Sanitation improvement in developing countries could be achieved through wastewater treatment processes. Nowadays, alternative concepts such as separate urine collection are being developed. These processes would be an efficient way to reduce the pollution of wastewater while recovering nutrients, especially phosphorus, which are lost in current wastewater treatment methods. The precipitation of struvite (MgNH₄PO₄·6H₂O) from urine is an efficient process yielding more than 98% phosphorus recovery with very high reaction rates. The work presented here aims to determine the kinetics and mechanisms of struvite precipitation in order to supply data for the design of efficient urine treatment processes. A methodology coupling the resolution of the population balance equation with turbidity measurement was developed, and batch experiments with synthetic and real urine were performed. The main mechanisms of struvite crystallization were identified as crystal growth and nucleation. A satisfactory approximation of the volumetric crystal size distribution was obtained. The study has shown the low influence of the natural organic matter contained in real urine on the crystallization process. It has also highlighted the impact of operational parameters. Mixing conditions can create segregation and attrition, which influence the nucleation rate, resulting in a change in crystal number and size, and thus in the final crystal size distribution (CSD). Moreover, urine storage conditions can affect urea hydrolysis and lead to spontaneous struvite precipitation in the stock solution, also influencing the final CSD. A few limits of the applied methodology and of the proposed modelling, due to these phenomena and to the turbidity measurement, are also discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. The cleanroom case study in the Software Engineering Laboratory: Project description and early analysis

    NASA Technical Reports Server (NTRS)

    Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David

    1990-01-01

    This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.

  5. Development of phantom and methodology for 3D and 4D dose intercomparisons for advanced lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Caloz, Misael; Kafrouni, Marilyne; Leturgie, Quentin; Corde, Stéphanie; Downes, Simon; Lehmann, Joerg; Thwaites, David

    2015-01-01

    There are few reported intercomparisons or audits of combinations of advanced radiotherapy methods, particularly for 4D treatments. As part of an evaluation of the implementation of advanced radiotherapy technology, a phantom and associated methods, initially developed for in-house commissioning and QA of 4D lung treatments, has been developed further with the aim of using it for end-to-end dose intercomparison of 4D treatment planning and delivery. The respiratory thorax phantom can house moving inserts with variable speed (breathing rate) and motion amplitude. In one set-up mode it contains a small ion chamber for point dose measurements, or alternatively it can hold strips of radiochromic film to measure dose distributions. Initial pilot and feasibility measurements have been carried out in one hospital to thoroughly test the methods and procedures before using it more widely across a range of hospitals and treatment systems. Overall, the results show good agreement between measured and calculated doses and distributions, supporting the use of the phantom and methodology for multi-centre intercomparisons. However, before wider use, refinements of the method and analysis are currently underway, particularly for the film measurements.

  6. Appropriate Use Criteria in Dermatopathology: Initial Recommendations from the American Society of Dermatopathology.

    PubMed

    Vidal, Claudia I; Armbrect, Eric A; Andea, Aleodor A; Bohlke, Angela K; Comfere, Nneka I; Hughes, Sarah R; Kim, Jinah; Kozel, Jessica A; Lee, Jason B; Linos, Konstantinos; Litzner, Brandon R; Missall, Tricia A; Novoa, Roberto A; Sundram, Uma; Swick, Brian L; Hurley, M Yadira; Alam, Murad; Argenyi, Zsolt; Duncan, Lyn M; Elston, Dirk M; Emanuel, Patrick O; Ferringer, Tammie; Fung, Maxwell A; Hosler, Gregory A; Lazar, Alexander J; Lowe, Lori; Plaza, Jose A; Prieto, Victor G; Robinson, June K; Schaffer, Andras; Subtil, Antonio; Wang, Wei-Lien

    2018-04-21

    Appropriate use criteria (AUC) provide physicians guidance in test selection, and can affect health care delivery, reimbursement policy, and physician decision-making. The American Society of Dermatopathology (ASDP), with input from the American Academy of Dermatology (AAD) and the College of American Pathologists (CAP), sought to develop AUC in dermatopathology. The RAND/UCLA appropriateness methodology, which combines evidence-based medicine, clinical experience and expert judgment, was used to develop AUC in dermatopathology. With the number of ratings predetermined at 3, AUC were developed for 211 clinical scenarios (CS) involving 12 ancillary studies (AS). Consensus was reached for 188 (89%) CS, with 93 (44%) considered "usually appropriate", 52 (25%) "rarely appropriate", and 43 (20%) of "uncertain appropriateness". The methodology requires a focus on appropriateness without comparison between tests and irrespective of cost. The ultimate decision of when to order a specific test rests with the physician, and is one where the expected benefit exceeds the negative consequences. This publication outlines the recommendations of appropriateness (AUC) for 12 tests used in dermatopathology. Importantly, these recommendations may change in light of new evidence. Scenarios deemed of "uncertain appropriateness" and those where consensus was not reached may benefit from further research. Copyright © 2018. Published by Elsevier Inc.

  7. Development of smart textiles with embedded fiber optic chemical sensors

    NASA Astrophysics Data System (ADS)

    Khalil, Saif E.; Yuan, Jianming; El-Sherif, Mahmoud A.

    2004-03-01

    Smart textiles are defined as textiles capable of monitoring their own health conditions or structural behavior, as well as sensing external environmental conditions. Smart textiles appear to be a future focus of the textile industry. As technology accelerates, textiles are proving more useful and practical for potential advanced technologies. The majority of textiles are used in the clothing industry, which gave rise to the idea of developing smart clothes for various applications. Examples of such applications are medical trauma assessment and medical patient monitoring (heart and respiration rates), and environmental monitoring for public safety officials. Fiber optics have played a major role in the development of smart textiles, as they have in smart structures in general. The integration of optical fibers into textile structures (knitted, woven, and non-woven) is presented, and an appropriate methodology for the manufacturing of smart textiles is defined. Samples of fabrics with integrated optical fibers were processed and tested for optical signal transmission, in order to investigate the effect of textile production procedures on optical fiber performance. The tests proved the effectiveness of the developed methodology for integrating optical fibers without changing their optical performance or structural integrity.

  8. NASA PC software evaluation project

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kuan, Julie C.

    1986-01-01

    The USL NASA PC software evaluation project is intended to provide a structured framework for facilitating the development of quality NASA PC software products. The project will assist NASA PC development staff to understand the characteristics and functions of NASA PC software products. Based on the results of the project teams' evaluations and recommendations, users can judge the reliability, usability, acceptability, maintainability and customizability of all the PC software products. The objective here is to provide initial, high-level specifications and guidelines for NASA PC software evaluation. The primary tasks to be addressed in this project are as follows: to gain a strong understanding of what software evaluation entails and how to organize a structured software evaluation process; to define a structured methodology for conducting the software evaluation process; to develop a set of PC software evaluation criteria and evaluation rating scales; and to conduct PC software evaluations in accordance with the identified methodology. The software categories evaluated include Communication Packages, Network System Software, Graphics Support Software, Environment Management Software, and General Utilities. This report represents one of the 72 attachment reports to the University of Southwestern Louisiana's Final Report on NASA Grant NGT-19-010-900. Accordingly, appropriate care should be taken in using this report out of context of the full Final Report.

  9. A non-intrusive screening methodology for environmental hazard assessment at waste disposal sites for water resources protection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simons, B.A.; Woldt, W.E.; Jones, D.D.

    The environmental and health risks posed by unregulated waste disposal sites are potential concerns of Pacific Rim regions and island areas because of the need to protect aquifers and other valuable water resources. A non-intrusive screening methodology to determine site characteristics, including possible soil and/or groundwater contamination, areal extent of waste, etc., is being developed and tested at waste disposal sites in Nebraska. This type of methodology would be beneficial to Pacific Rim regions in investigating and/or locating unknown or poorly documented contamination areas for hazard assessment and groundwater protection. Traditional assessment methods are generally expensive, time consuming, and potentially exacerbate the problem. Ideally, a quick and inexpensive assessment method to reliably characterize these sites is desired. Electromagnetic (EM) conductivity surveying and soil-vapor sampling techniques, combined with innovative three-dimensional geostatistical methods, are used to map the data to develop a site characterization of the subsurface and to aid in tracking any contaminant plumes. The EM data are analyzed to determine/estimate the extent and volume of waste and/or leachate. Soil-vapor data are analyzed to estimate a site's volatile organic compound (VOC) emission rate to the atmosphere. The combined information could then be incorporated as one part of an overall hazard assessment system.

  10. How few and far between? Examining the effects of probe rate on self-reported mind wandering

    PubMed Central

    Seli, Paul; Carriere, Jonathan S. A.; Levene, Merrick; Smilek, Daniel

    2013-01-01

    We examined whether the temporal rate at which thought probes are presented affects the likelihood that people will report periods of mind wandering. To evaluate this possibility, we had participants complete a sustained-attention task (the Metronome Response Task; MRT) during which we intermittently presented thought probes. Critically, we varied the average time between probes (i.e., probe rate) across participants, allowing us to examine the relation between probe rate and mind-wandering rate. We observed a positive relation between these variables, indicating that people are more likely to report mind wandering as the time between probes increases. We discuss the methodological implications of this finding in the context of the mind-wandering literature, and suggest that researchers include a range of probe rates in future work to provide more insight into this methodological issue. PMID:23882239
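
    As a toy illustration of the reported effect (all parameters invented), simulating observers whose probability of being off-task grows with the time since the last probe reproduces a positive relation between mean probe interval and mind-wandering reports:

        import random

        def simulate_mw_rate(mean_interval_s: float, n_probes: int = 200,
                             drift_per_s: float = 0.005, seed: int = 1) -> float:
            """Fraction of probes answered 'mind wandering' when the chance
            of being off-task grows linearly with time since the last probe
            (a toy model; all parameters are invented)."""
            rng = random.Random(seed)
            hits = 0
            for _ in range(n_probes):
                interval = rng.expovariate(1.0 / mean_interval_s)
                p_mw = min(1.0, drift_per_s * interval)
                hits += rng.random() < p_mw
            return hits / n_probes

        for interval in (15, 30, 60, 120):      # mean seconds between probes
            print(interval, simulate_mw_rate(interval))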

  11. A framework for assessing the adequacy and effectiveness of software development methodologies

    NASA Technical Reports Server (NTRS)

    Arthur, James D.; Nance, Richard E.

    1990-01-01

    Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.
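
    The paper's actual linkages are not reproduced here, but their shape (objectives achieved through principles, principles evidenced by attributes) can be modelled directly; the entries below are invented placeholders:

        # Invented example of the objectives -> principles -> attributes
        # structure described in the abstract; the entries are placeholders.
        LINKAGES = {
            "maintainability": {                     # objective
                "modularity": ["low coupling", "high cohesion"],
                "information hiding": ["minimal interfaces"],
            },
        }

        def attributes_for(objective: str) -> list:
            """Attributes whose presence supports the given objective."""
            return [attr
                    for attrs in LINKAGES.get(objective, {}).values()
                    for attr in attrs]

        print(attributes_for("maintainability"))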

  12. Analysis of Multiple Cracks in an Infinite Functionally Graded Plate

    NASA Technical Reports Server (NTRS)

    Shbeeb, N. I.; Binienda, W. K.; Kreider, K. L.

    1999-01-01

    A general methodology was constructed to develop the fundamental solution for a crack embedded in an infinite non-homogeneous material in which the shear modulus varies exponentially with the y coordinate. The fundamental solution was used to generate a solution to fully interactive multiple-crack problems for stress intensity factors and strain energy release rates. Parametric studies were conducted for two crack configurations. The model displayed sensitivity to crack distance, relative angular orientation, and the coefficient of non-homogeneity.
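
    For orientation, the exponential grading named in the abstract is conventionally written as shown below, and the homogeneous infinite-plate stress intensity factor against which such results are usually normalized is a standard reference solution, not a result of the paper:

        \mu(y) = \mu_0 \, e^{\beta y}, \qquad K_I^{\mathrm{hom}} = \sigma \sqrt{\pi a}

    where $\beta$ is the coefficient of non-homogeneity, $\sigma$ the remote stress, and $a$ the half crack length.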

  13. A general methodology for maximum likelihood inference from band-recovery data

    USGS Publications Warehouse

    Conroy, M.J.; Williams, B.K.

    1984-01-01

    A numerical procedure is described for obtaining maximum likelihood estimates and associated maximum likelihood inference from band-recovery data. The method is used to illustrate previously developed one-age-class band-recovery models, and is extended to new models, including the analysis with a covariate for survival rates and variable-time-period recovery models. Extensions to R-age-class band-recovery, mark-recapture models, and twice-yearly marking are discussed. A FORTRAN program provides computations for these models.
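
    The FORTRAN program itself is not reproduced here, but the one-age-class idea can be sketched numerically: with constant annual survival S and recovery rate f, the probability that a band is recovered in year j after banding is f*S^(j-1), and (S, f) maximize the resulting multinomial likelihood. A minimal Python sketch with invented counts:

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical data: 1000 birds banded once; recoveries in years 1-4.
        n_banded = 1000
        recoveries = np.array([80, 52, 34, 22])

        def neg_log_lik(params):
            s, f = params                     # annual survival, recovery rate
            if not (0.0 < s < 1.0 and 0.0 < f < 1.0):
                return np.inf                 # outside the valid region
            j = np.arange(len(recoveries))
            p = f * s**j                      # P(band recovered in year j+1)
            p_never = 1.0 - p.sum()
            if p_never <= 0.0:
                return np.inf
            return -(recoveries @ np.log(p)
                     + (n_banded - recoveries.sum()) * np.log(p_never))

        res = minimize(neg_log_lik, x0=[0.5, 0.1], method="Nelder-Mead")
        print("MLE (S, f):", res.x)           # roughly (0.65, 0.08) here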

  14. Development of Fatigue and Crack Propagation Design and Analysis Methodology in a Corrosive Environment for Typical Mechanically-Fastened Joints. Volume 2. State-of-the-Art Assessment.

    DTIC Science & Technology

    1983-03-01

    [120] hypothesized a linear summation model to predict the corrosion-fatigue behavior above K_Iscc for a high-strength steel. The model considers the ... [120] could satisfactorily predict the rates of corrosion-fatigue-crack growth for 18-Ni maraging steels tested in several gaseous and aqueous ... The corrosion fatigue behavior of titanium alloys is very complex. Therefore, a better understanding of corrosion fatigue ...
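
    The report text above is fragmentary, but a linear summation model of this kind is conventionally written as the superposition of inert-environment fatigue crack growth and frequency-scaled sustained-load stress-corrosion growth (a standard formulation assumed here, not quoted from the report):

        \left(\frac{da}{dN}\right)_{cf} = \left(\frac{da}{dN}\right)_{fatigue} + \frac{1}{\nu}\left(\frac{da}{dt}\right)_{scc}, \qquad K_{max} > K_{Iscc}

    where $\nu$ is the cyclic loading frequency.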

  15. A new hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1988-01-01

    This paper describes recent advances in the development of a hybrid transfinite element computational methodology applicable to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is a hybrid approach based on the application of transform techniques in conjunction with classical Galerkin schemes. The purpose of this paper is to provide a viable hybrid computational methodology for general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology provides a viable computational approach, and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
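
    The transform step that distinguishes this family of methods can be indicated for the simplest case, transient conduction: a Laplace transform in time removes the time derivative, after which a Galerkin finite element discretization is applied in space (a textbook illustration of the idea, not a formulation taken from the paper):

        \frac{\partial T}{\partial t} = \alpha \nabla^2 T \quad\xrightarrow{\;\mathcal{L}\;}\quad s\,\bar{T}(\mathbf{x}, s) - T(\mathbf{x}, 0) = \alpha \nabla^2 \bar{T}(\mathbf{x}, s)

    so the problem is solved in the transform domain and the temperature history is recovered by numerical inversion.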

  16. Sugar-Sweetened Beverages and Obesity Risk in Children and Adolescents: A Systematic Analysis on How Methodological Quality May Influence Conclusions.

    PubMed

    Bucher Della Torre, Sophie; Keller, Amélie; Laure Depeyre, Jocelyne; Kruseman, Maaike

    2016-04-01

    In the context of a worldwide high prevalence of childhood obesity, the role of sugar-sweetened beverage (SSB) consumption as a cause of excess weight gain remains controversial. Conflicting results may be due to methodological issues in original studies and in reviews. The aim of this review was to systematically analyze the methodology of studies investigating the influence of SSB consumption on the risk of overweight and obesity among children and adolescents, and the studies' ability to answer this research question. A systematic review of cohort and experimental studies published until December 2013 in peer-reviewed journals was performed on Medline, CINAHL, Web of Knowledge, and ClinicalTrials.gov. Studies investigating the influence of SSB consumption on the risk of overweight and obesity among children and adolescents were included, and methodological quality to answer this question was assessed independently by two investigators using the Academy of Nutrition and Dietetics Quality Criteria Checklist. Among the 32 identified studies, nine had positive quality ratings and 23 had at least one major methodological issue. The main methodological issues included the definition of SSBs and inadequate measurement of exposure. Studies with positive quality ratings found either a positive association between SSB consumption and the risk of overweight or obesity (n=5) (i.e., when SSB consumption increased, so did obesity) or mixed results (n=4). Studies with a neutral quality rating found a positive association (n=7), mixed results (n=9), or no association (n=7). The present review shows that the majority of studies with strong methodology indicated a positive association between SSB consumption and the risk of overweight or obesity, especially among overweight children. In addition, the findings highlight the need for careful and precise measurement of SSB consumption and of important confounders.

  17. A methodology to support the development of 4-year pavement management plan.

    DOT National Transportation Integrated Search

    2014-07-01

    A methodology for forming and prioritizing pavement maintenance and rehabilitation (M&R) projects was developed. The Texas Department of Transportation (TxDOT) can use this methodology to generate defensible and cost-effective 4-year pavement man...

  18. Robust PV Degradation Methodology and Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, Dirk; Deline, Christopher A; Kurtz, Sarah

    The degradation rate plays an important role in predicting and assessing the long-term energy generation of PV systems. Many methods have been proposed for extracting the degradation rate from operational data of PV systems, but most of the published approaches are susceptible to bias due to inverter clipping, module soiling, temporary outages, seasonality, and sensor degradation. In this manuscript, we propose a methodology for determining PV degradation leveraging available modeled clear-sky irradiance data rather than site sensor data, and a robust year-over-year (YOY) rate calculation. We show the method to provide reliable degradation rate estimates even in the case of sensor drift, data shifts, and soiling. Compared with alternate methods, we demonstrate that the proposed method delivers the lowest uncertainty in degradation rate estimates for a fleet of 486 PV systems.
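
    The year-over-year calculation at the core of the method can be sketched compactly: pair each daily performance-index value with the value one year later, convert each pair to an annual percentage change, and take the median so that outages, soiling events, and other outliers carry little weight (a simplified reading of the approach; the data below are synthetic):

        import numpy as np
        import pandas as pd

        def yoy_degradation_rate(perf: pd.Series) -> float:
            """Median year-over-year change (%/yr) of a daily performance
            index; the median makes the estimate robust to outliers."""
            prev_year = perf.shift(365)              # value one year earlier
            yoy = (perf - prev_year) / prev_year * 100.0
            return float(yoy.median())

        # Synthetic daily performance index: 0.5 %/yr decline plus noise.
        days = pd.date_range("2015-01-01", periods=4 * 365, freq="D")
        trend = 1.0 - 0.005 * np.arange(len(days)) / 365.0
        noise = np.random.default_rng(0).normal(0.0, 0.01, len(days))
        perf = pd.Series(trend + noise, index=days)

        print(f"{yoy_degradation_rate(perf):+.2f} %/yr")   # about -0.50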

  19. Robust PV Degradation Methodology and Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, Dirk C.; Deline, Chris; Kurtz, Sarah R.

    The degradation rate plays an important role in predicting and assessing the long-term energy generation of photovoltaics (PV) systems. Many methods have been proposed for extracting the degradation rate from operational data of PV systems, but most of the published approaches are susceptible to bias due to inverter clipping, module soiling, temporary outages, seasonality, and sensor degradation. In this paper, we propose a methodology for determining PV degradation leveraging available modeled clear-sky irradiance data rather than site sensor data, and a robust year-over-year rate calculation. We show the method to provide reliable degradation rate estimates even in the case of sensor drift, data shifts, and soiling. Compared with alternate methods, we demonstrate that the proposed method delivers the lowest uncertainty in degradation rate estimates for a fleet of 486 PV systems.

  20. Robust PV Degradation Methodology and Application

    DOE PAGES

    Jordan, Dirk C.; Deline, Chris; Kurtz, Sarah R.; ...

    2017-12-21

    The degradation rate plays an important role in predicting and assessing the long-term energy generation of photovoltaics (PV) systems. Many methods have been proposed for extracting the degradation rate from operational data of PV systems, but most of the published approaches are susceptible to bias due to inverter clipping, module soiling, temporary outages, seasonality, and sensor degradation. In this paper, we propose a methodology for determining PV degradation leveraging available modeled clear-sky irradiance data rather than site sensor data, and a robust year-over-year rate calculation. We show the method to provide reliable degradation rate estimates even in the case of sensor drift, data shifts, and soiling. Compared with alternate methods, we demonstrate that the proposed method delivers the lowest uncertainty in degradation rate estimates for a fleet of 486 PV systems.
