Sample records for develop standardized methodologies

  1. Workshop on LCA: Methodology, Current Development, and Application in Standards - LCA Methodology

    EPA Science Inventory

    As ASTM standards that incorporate Life Cycle Assessment are being developed, it is imperative that practitioners in the field learn more about what LCA is and how to conduct it. This presentation will include an overview of the LCA process and will concentrate on ...

  2. Methodological standards and patient-centeredness in comparative effectiveness research: the PCORI perspective.

    PubMed

    2012-04-18

    Rigorous methodological standards help to ensure that medical research produces information that is valid and generalizable, and are essential in patient-centered outcomes research (PCOR). Patient-centeredness refers to the extent to which the preferences, decision-making needs, and characteristics of patients are addressed, and is the key characteristic differentiating PCOR from comparative effectiveness research. The Patient Protection and Affordable Care Act signed into law in 2010 created the Patient-Centered Outcomes Research Institute (PCORI), which includes an independent, federally appointed Methodology Committee. The Methodology Committee is charged to develop methodological standards for PCOR. The 4 general areas identified by the committee in which standards will be developed are (1) prioritizing research questions, (2) using appropriate study designs and analyses, (3) incorporating patient perspectives throughout the research continuum, and (4) fostering efficient dissemination and implementation of results. A Congressionally mandated PCORI methodology report (to be issued in its first iteration in May 2012) will begin to provide standards in each of these areas, and will inform future PCORI funding announcements and review criteria. The work of the Methodology Committee is intended to enable generation of information that is relevant and trustworthy for patients, and to enable decisions that improve patient-centered outcomes.

  3. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.

    1996-08-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.

  4. Intermountain Health Care, Inc.: Standard Costing System Methodology and Implementation

    PubMed Central

    Rosqvist, W.V.

    1984-01-01

    Intermountain Health Care, Inc. (IHC), a not-for-profit hospital chain with 22 hospitals in the intermountain area and corporate offices located in Salt Lake City, Utah, has developed a Standard Costing System to provide hospital management with a tool for confronting increased cost pressures in the health care environment. This document describes the methodology used in developing the standard costing system and outlines the implementation process.

  5. Event-driven, pattern-based methodology for cost-effective development of standardized personal health devices.

    PubMed

    Martínez-Espronceda, Miguel; Trigo, Jesús D; Led, Santiago; Barrón-González, H Gilberto; Redondo, Javier; Baquero, Alfonso; Serrano, Luis

    2014-11-01

    Experiences applying standards in personal health devices (PHDs) show an inherent trade-off between interoperability and costs (in terms of processing load and development time). Reducing hardware and software costs as well as time-to-market is therefore crucial for standards adoption. The ISO/IEEE 11073 PHD family of standards (also referred to as X73PHD) provides interoperable communication between PHDs and aggregators. Nevertheless, the responsibility for achieving inexpensive implementations of X73PHD on resource-limited microcontrollers falls directly on the developer. Hence, the authors previously presented a pattern-based methodology for implementing X73-compliant PHDs on devices with low-voltage, low-power constraints. That version was based on multitasking, which required additional features and resources. This paper therefore presents an event-driven evolution of the pattern-based methodology for cost-effective development of standardized PHDs. A comparison of the two versions showed mean decreases in memory consumption and cycles of latency of 11.59% and 45.95%, respectively. In addition, several enhancements in terms of cost-effectiveness and development time can be derived from the new version of the methodology. The new approach could therefore help in producing cost-effective X73-compliant PHDs, which in turn could foster the adoption of standards.
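
    The multitasking-versus-event-driven comparison in this abstract is easier to picture with a small sketch. The following is a minimal, hypothetical illustration of the event-driven style, assuming a table-driven state machine with an event queue; the states, events, and handlers are invented for illustration and are not the authors' actual X73PHD patterns.

      # Minimal sketch of the event-driven style: a single loop dispatches
      # queued events to state handlers, so no scheduler, task stacks, or
      # locks are needed. All names (states, events) are hypothetical.
      from collections import deque

      class EventDrivenAgent:
          def __init__(self):
              self.state = "disconnected"
              self.events = deque()
              # (state, event) -> (handler, next_state)
              self.table = {
                  ("disconnected", "connect"): (self.on_connect, "associating"),
                  ("associating", "assoc_ok"): (self.on_assoc, "operating"),
                  ("operating", "measurement"): (self.on_measurement, "operating"),
              }

          def post(self, event, payload=None):
              self.events.append((event, payload))

          def run(self):
              while self.events:
                  event, payload = self.events.popleft()
                  handler, next_state = self.table.get((self.state, event), (None, self.state))
                  if handler:
                      handler(payload)
                      self.state = next_state

          def on_connect(self, _): print("transport up")
          def on_assoc(self, _): print("association accepted")
          def on_measurement(self, m): print("report measurement:", m)

      agent = EventDrivenAgent()
      agent.post("connect"); agent.post("assoc_ok"); agent.post("measurement", 97.1)
      agent.run()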

  6. A standard description and costing methodology for the balance-of-plant items of a solar thermal electric power plant. Report of a multi-institutional working group

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Standard descriptions for solar thermal power plants are established and uniform costing methodologies for nondevelopmental balance of plant (BOP) items are developed. The descriptions and methodologies developed are applicable to the major systems. These systems include the central receiver, parabolic dish, parabolic trough, hemispherical bowl, and solar pond. The standard plant is defined in terms of four categories comprising (1) solar energy collection, (2) power conversion, (3) energy storage, and (4) balance of plant. Each of these categories is described in terms of the type and function of components and/or subsystems within the category. A detailed description is given for the BOP category. BOP contains a number of nondevelopmental items that are common to all solar thermal systems. A standard methodology for determining the costs of these nondevelopmental BOP items is given. The methodology is presented in the form of cost equations involving cost factors such as unit costs. A set of baseline values for the normalized cost factors is also given.
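
    The report's cost equations can be sketched generically: each nondevelopmental BOP item's cost is a normalized unit cost factor multiplied by a plant-size driver, summed and scaled by an indirect-cost factor. The item names, values, and the 15% indirect rate below are hypothetical placeholders, not the report's baseline values.

      # Hedged sketch of a BOP cost equation: unit cost factors scaled by
      # plant-size drivers, summed, with an indirect/overhead multiplier.
      bop_unit_costs = {                      # normalized cost factors
          "land_and_site_prep": 25_000.0,     # $/acre (hypothetical)
          "roads_and_fencing": 12_000.0,      # $/acre (hypothetical)
          "control_building": 900.0,          # $/m^2 (hypothetical)
      }
      drivers = {"land_and_site_prep": 40, "roads_and_fencing": 40, "control_building": 350}

      def bop_cost(unit_costs, drivers, indirect_rate=0.15):
          direct = sum(unit_costs[k] * drivers[k] for k in unit_costs)
          return direct * (1.0 + indirect_rate)   # add indirect cost factor

      print(f"Estimated BOP cost: ${bop_cost(bop_unit_costs, drivers):,.0f}")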

  7. A taxonomy of multinational ethical and methodological standards for clinical trials of therapeutic interventions

    PubMed Central

    Ashton, Carol M; Wray, Nelda P; Jarman, Anna F; Kolman, Jacob M; Wenner, Danielle M; Brody, Baruch A

    2013-01-01

    Background If trials of therapeutic interventions are to serve society’s interests, they must be of high methodological quality and must satisfy moral commitments to human subjects. The authors set out to develop a clinical-trials compendium in which standards for the ethical treatment of human subjects are integrated with standards for research methods. Methods The authors rank-ordered the world’s nations and chose the 31 with >700 active trials as of 24 July 2008. Governmental and other authoritative entities of the 31 countries were searched, and 1004 English-language documents containing ethical and/or methodological standards for clinical trials were identified. The authors extracted standards from 144 of those: 50 designated as ‘core’, 39 addressing trials of invasive procedures and a 5% sample (N=55) of the remainder. As the integrating framework for the standards, we developed a coherent taxonomy encompassing all elements of a trial’s stages. Findings Review of the 144 documents yielded nearly 15 000 discrete standards. After duplicates were removed, 5903 substantive standards remained, distributed in the taxonomy as follows: initiation, 1401 standards, 8 divisions; design, 1869 standards, 16 divisions; conduct, 1473 standards, 8 divisions; analysing and reporting results, 997 standards, 4 divisions; and post-trial standards, 168 standards, 5 divisions. Conclusions The overwhelming number of source documents and standards uncovered in this study was not anticipated beforehand and confirms the extraordinary complexity of the clinical trials enterprise. This taxonomy of multinational ethical and methodological standards may help trialists and overseers improve the quality of clinical trials, particularly given the globalisation of clinical research. PMID:21429960

  8. A taxonomy of multinational ethical and methodological standards for clinical trials of therapeutic interventions.

    PubMed

    Ashton, Carol M; Wray, Nelda P; Jarman, Anna F; Kolman, Jacob M; Wenner, Danielle M; Brody, Baruch A

    2011-06-01

    If trials of therapeutic interventions are to serve society's interests, they must be of high methodological quality and must satisfy moral commitments to human subjects. The authors set out to develop a clinical-trials compendium in which standards for the ethical treatment of human subjects are integrated with standards for research methods. The authors rank-ordered the world's nations and chose the 31 with >700 active trials as of 24 July 2008. Governmental and other authoritative entities of the 31 countries were searched, and 1004 English-language documents containing ethical and/or methodological standards for clinical trials were identified. The authors extracted standards from 144 of those: 50 designated as 'core', 39 addressing trials of invasive procedures and a 5% sample (N=55) of the remainder. As the integrating framework for the standards, we developed a coherent taxonomy encompassing all elements of a trial's stages. Review of the 144 documents yielded nearly 15 000 discrete standards. After duplicates were removed, 5903 substantive standards remained, distributed in the taxonomy as follows: initiation, 1401 standards, 8 divisions; design, 1869 standards, 16 divisions; conduct, 1473 standards, 8 divisions; analysing and reporting results, 997 standards, 4 divisions; and post-trial standards, 168 standards, 5 divisions. The overwhelming number of source documents and standards uncovered in this study was not anticipated beforehand and confirms the extraordinary complexity of the clinical trials enterprise. This taxonomy of multinational ethical and methodological standards may help trialists and overseers improve the quality of clinical trials, particularly given the globalisation of clinical research.

  9. Improving automation standards via semantic modelling: Application to ISA88.

    PubMed

    Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès

    2017-03-01

    Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software that rely on the efficient modelling of the addressed systems. The work presented here is part of the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and to support the improvement of that consistency. The formalization of conceptual models and the subsequent writing of technical standards are analyzed together, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the paradigm suggested for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency checking method.

  10. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    NASA Astrophysics Data System (ADS)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II, supported by the ESA PSS-05-0 standards, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used together with the ESA PSS-05-0 standards. Our outcomes may be used in general by teams who need to build small satellites, but in particular they will be used when we build the on-board software applications for the SATEX-II.

  11. Payload training methodology study

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PTC) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.

  12. Methodology issues in implementation science.

    PubMed

    Newhouse, Robin; Bobay, Kathleen; Dykes, Patricia C; Stevens, Kathleen R; Titler, Marita

    2013-04-01

    Putting evidence into practice at the point of care delivery requires an understanding of implementation strategies that work, in what context and how. To identify methodological issues in implementation science using 4 studies as cases and make recommendations for further methods development. Four cases are presented and methodological issues identified. For each issue raised, evidence on the state of the science is described. Issues in implementation science identified include diverse conceptual frameworks, potential weaknesses in pragmatic study designs, and the paucity of standard concepts and measurement. Recommendations to advance methods in implementation include developing a core set of implementation concepts and metrics, generating standards for implementation methods including pragmatic trials, mixed methods designs, complex interventions and measurement, and endorsing reporting standards for implementation studies.

  13. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    ERIC Educational Resources Information Center

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore methodological and institutional challenges in the application of the work-process analysis approach to the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…

  14. Design of experiments enhanced statistical process control for wind tunnel check standard testing

    NASA Astrophysics Data System (ADS)

    Phillips, Ben D.

    The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification, and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with current SPC practices, one can characterize uncertainties more efficiently and more robustly and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients, and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research is applicable to any wind tunnel check standard testing program.
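
    As a rough illustration of the enhanced approach, the sketch below tracks a series of fitted coefficient values in a Shewhart individuals chart, using the standard limits of center plus or minus 2.66 times the mean moving range; the coefficient values are synthetic placeholders, not Langley data.

      # Sketch: individuals (Shewhart) control chart for a tracked
      # regression coefficient. Limits: center +/- 2.66 * mean moving range,
      # the standard individuals-chart rule. Data are synthetic.
      data = [0.998, 1.001, 0.997, 1.003, 1.000, 0.999, 1.004, 0.996, 1.002, 1.020]

      center = sum(data) / len(data)
      moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
      mr_bar = sum(moving_ranges) / len(moving_ranges)
      ucl, lcl = center + 2.66 * mr_bar, center - 2.66 * mr_bar

      for i, x in enumerate(data, 1):
          flag = "OUT OF CONTROL" if not (lcl <= x <= ucl) else ""
          print(f"run {i:2d}: {x:.3f} {flag}")
      print(f"center={center:.4f}  LCL={lcl:.4f}  UCL={ucl:.4f}")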

  15. An Approach for Implementation of Project Management Information Systems

    NASA Astrophysics Data System (ADS)

    Běrziša, Solvita; Grabis, Jānis

    Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on analysis of typical project management concepts and processes and of existing XML-based representations of project management. A demonstration example of a project management information system's configuration is provided.
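
    A minimal sketch of the configuration idea, assuming a small invented XML schema for the methodology specification (the chapter's actual specification framework is richer): the XML description is parsed and transformed into a configuration structure for the project management information system.

      # Sketch: parse a standardized XML methodology specification and
      # transform it into a PM information system configuration. The XML
      # schema here is invented for illustration.
      import xml.etree.ElementTree as ET

      spec = """<methodology name="ExampleMethod">
        <phase name="Initiation"><artifact>Charter</artifact></phase>
        <phase name="Planning"><artifact>WBS</artifact><artifact>Schedule</artifact></phase>
      </methodology>"""

      def configure(spec_xml):
          root = ET.fromstring(spec_xml)
          config = {"methodology": root.get("name"), "workflows": []}
          for phase in root.findall("phase"):
              config["workflows"].append({
                  "stage": phase.get("name"),
                  "required_artifacts": [a.text for a in phase.findall("artifact")],
              })
          return config

      print(configure(spec))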

  16. Software production methodology testbed project

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1976-01-01

    The history and results of a 3 1/2-year study in software development methodology are reported. The findings of this study have become the basis for DSN software development guidelines and standard practices. The article discusses accomplishments, discoveries, problems, recommendations and future directions.

  17. Steps towards the international regulatory acceptance of non-animal methodology in safety assessment.

    PubMed

    Sewell, Fiona; Doe, John; Gellatly, Nichola; Ragan, Ian; Burden, Natalie

    2017-10-01

    The current animal-based paradigm for safety assessment must change. In September 2016, the UK National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs) brought together scientists from regulatory authorities, academia and industry to review progress in bringing new methodology into regulatory use, and to identify ways to expedite progress. Progress has been slow. Science is advancing to make this possible, but changes are necessary. The new paradigm should allow new methodology to be adopted once it is developed, rather than being based on a fixed set of studies. Regulatory authorities can help by developing Performance-Based Standards. The most pressing need is in repeat-dose toxicology, although setting standards will be more complex than in areas such as sensitization. Performance standards should be aimed directly at human safety, not at reproducing the results of animal studies. Regulatory authorities can also aid progress towards the acceptance of non-animal-based methodology by promoting "safe-haven" trials, where traditional and new methodology data can be submitted in parallel to build up experience in the new methods. Industry can play its part in the acceptance of new methodology by contributing to the setting of performance standards and by actively contributing to "safe-haven" trials.

  18. Screening Methodologies to Support Risk and Technology ...

    EPA Pesticide Factsheets

    The Clean Air Act establishes a two-stage regulatory process for addressing emissions of hazardous air pollutants (HAPs) from stationary sources. In the first stage, the Act requires the EPA to develop technology-based standards for categories of industrial sources. We have largely completed the required “Maximum Achievable Control Technology” (MACT) standards. In the second stage of the regulatory process, EPA must review each MACT standard at least every eight years and revise them as necessary, “taking into account developments in practices, processes and control technologies.” We call this requirement the “technology review.” EPA is also required to complete a one-time assessment of the health and environmental risks that remain after sources come into compliance with MACT. This residual risk review also must be done within 8 years of setting the initial MACT standard. If additional risk reductions are necessary to protect public health with an ample margin of safety or to prevent adverse environmental effects, EPA must develop standards to address these remaining risks. Because the risk review is an important component of the RTR process, EPA is seeking SAB input on the scientific credibility of specific enhancements made to our risk assessment methodologies, particularly with respect to screening methodologies, since the last SAB review was completed in 2010. These enhancements to our risk methodologies are outlined in the document title ...

  19. Experience with abstract notation one

    NASA Technical Reports Server (NTRS)

    Harvey, James D.; Weaver, Alfred C.

    1990-01-01

    The development of computer science has produced a vast number of machine architectures, programming languages, and compiler technologies. The cross product of these three characteristics defines the spectrum of previous and present data representation methodologies. With regard to computer networks, the uniqueness of these methodologies presents an obstacle when disparate host environments are to be interconnected. Interoperability within a heterogeneous network relies upon the establishment of data representation commonality. The International Organization for Standardization (ISO) is currently developing the abstract syntax notation one standard (ASN.1) and the basic encoding rules standard (BER) that collectively address this problem. When used within the presentation layer of the open systems interconnection reference model, these two standards provide the data representation commonality required to facilitate interoperability. The details of a compiler that was built to automate the use of ASN.1 and BER are described. From this experience, insights into both standards are given and potential problems relating to this development effort are discussed.
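
    To make the encoding side concrete, here is a minimal sketch of BER's tag-length-value form for the ASN.1 INTEGER type, covering only small non-negative values with short-form lengths; a real ASN.1/BER compiler such as the one described must handle the full type system.

      # Minimal BER encoder for ASN.1 INTEGER, illustrating the TLV
      # (tag-length-value) structure. Sketch only: non-negative values,
      # definite short-form lengths.
      def ber_encode_integer(n: int) -> bytes:
          if n < 0:
              raise NotImplementedError("sketch covers non-negative values only")
          value = n.to_bytes(max(1, (n.bit_length() + 7) // 8), "big")
          if value[0] & 0x80:              # keep the sign bit clear (two's complement)
              value = b"\x00" + value
          return bytes([0x02, len(value)]) + value   # tag 0x02 = INTEGER

      print(ber_encode_integer(5).hex())      # 020105
      print(ber_encode_integer(65535).hex())  # 020300ffff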

  20. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    NASA Technical Reports Server (NTRS)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.

  1. ASTM and VAMAS activities in titanium matrix composites test methods development

    NASA Technical Reports Server (NTRS)

    Johnson, W. S.; Harmon, D. M.; Bartolotta, P. A.; Russ, S. M.

    1994-01-01

    Titanium matrix composites (TMC's) are being considered for a number of aerospace applications ranging from high performance engine components to airframe structures in areas that require high stiffness to weight ratios at temperatures up to 400 C. TMC's exhibit unique mechanical behavior due to fiber-matrix interface failures, matrix cracks bridged by fibers, thermo-viscoplastic behavior of the matrix at elevated temperatures, and the development of significant thermal residual stresses in the composite due to fabrication. Standard testing methodology must be developed to reflect the uniqueness of this type of material system. The purpose of this paper is to review the current activities in ASTM and the Versailles Project on Advanced Materials and Standards (VAMAS) that are directed toward the development of standard test methodology for titanium matrix composites.

  2. The Development of a Methodology for Estimating the Cost of Air Force On-the-Job Training.

    ERIC Educational Resources Information Center

    Samers, Bernard N.; And Others

    The Air Force uses a standardized costing methodology for resident technical training schools (TTS); no comparable methodology exists for computing the cost of on-the-job training (OJT). This study evaluates three alternative survey methodologies and a number of cost models for estimating the cost of OJT for airmen training in the Administrative…

  3. The need for a comprehensive expert system development methodology

    NASA Technical Reports Server (NTRS)

    Baumert, John; Critchfield, Anna; Leavitt, Karen

    1988-01-01

    In a traditional software development environment, the introduction of standardized approaches has led to higher quality, maintainable products on the technical side and greater visibility into the status of the effort on the management side. This study examined expert system development to determine whether it differed enough from traditional systems to warrant a reevaluation of current software development methodologies. Its purpose was to identify areas of similarity with traditional software development and areas requiring tailoring to the unique needs of expert systems. A second purpose was to determine whether existing expert system development methodologies meet the needs of expert system development, management, and maintenance personnel. The study consisted of a literature search and personal interviews. It was determined that existing methodologies and approaches to developing expert systems are neither comprehensive nor easily applied, especially for cradle-to-grave system development. As a result, requirements were derived for an expert system development methodology, and an initial annotated outline was derived for such a methodology.

  4. Methods for the guideline-based development of quality indicators--a systematic review

    PubMed Central

    2012-01-01

    Background Quality indicators (QIs) are used in many healthcare settings to measure, compare, and improve quality of care. For the efficient development of high-quality QIs, rigorous, approved, and evidence-based development methods are needed. Clinical practice guidelines are a suitable source to derive QIs from, but no gold standard for guideline-based QI development exists. This review aims to identify, describe, and compare methodological approaches to guideline-based QI development. Methods We systematically searched medical literature databases (Medline, EMBASE, and CINAHL) and grey literature. Two researchers selected publications reporting methodological approaches to guideline-based QI development. In order to describe and compare methodological approaches used in these publications, we extracted detailed information on common steps of guideline-based QI development (topic selection, guideline selection, extraction of recommendations, QI selection, practice test, and implementation) to predesigned extraction tables. Results From 8,697 hits in the database search and several grey literature documents, we selected 48 relevant references. The studies were of heterogeneous type and quality. We found no randomized controlled trial or other studies comparing the ability of different methodological approaches to guideline-based development to generate high-quality QIs. The relevant publications featured a wide variety of methodological approaches to guideline-based QI development, especially regarding guideline selection and extraction of recommendations. Only a few studies reported patient involvement. Conclusions Further research is needed to determine which elements of the methodological approaches identified, described, and compared in this review are best suited to constitute a gold standard for guideline-based QI development. For this research, we provide a comprehensive groundwork. PMID:22436067

  5. Expanded uncertainty estimation methodology in determining the sandy soils filtration coefficient

    NASA Astrophysics Data System (ADS)

    Rusanova, A. D.; Malaja, L. D.; Ivanov, R. N.; Gruzin, A. V.; Shalaj, V. V.

    2018-04-01

    A methodology for estimating the combined standard uncertainty in determining the filtration coefficient of sandy soils has been developed. Laboratory studies were carried out, resulting in determination of the filtration coefficient and an estimate of its combined uncertainty.
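
    The combined standard uncertainty in such a determination follows the usual GUM propagation rule, u_c = sqrt(sum over i of (df/dx_i * u_i)^2), with expanded uncertainty U = k * u_c. The sketch below assumes, purely for illustration, a constant-head model k_f = Q*L/(A*h*t) with hypothetical values; the paper's actual model and data may differ.

      # GUM-style uncertainty propagation sketch for a filtration
      # coefficient. Model and numbers are hypothetical illustrations.
      import math

      def combined_uncertainty(partials, u):
          """partials, u: dicts of sensitivity coefficients and standard uncertainties."""
          return math.sqrt(sum((partials[v] * u[v]) ** 2 for v in u))

      # hypothetical measured values and standard uncertainties (SI units)
      Q, L, A, h, t = 2.0e-6, 0.10, 8.0e-3, 0.25, 60.0
      kf = Q * L / (A * h * t)
      partials = {"Q": kf / Q, "L": kf / L, "A": -kf / A, "h": -kf / h, "t": -kf / t}
      u = {"Q": 2.0e-8, "L": 5.0e-4, "A": 4.0e-5, "h": 1.0e-3, "t": 0.2}

      u_c = combined_uncertainty(partials, u)
      print(f"k_f = {kf:.3e} m/s, expanded U (k=2) = {2 * u_c:.2e} m/s")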

  6. Towards a Trans-Disciplinary Methodology for a Game-Based Intervention Development Process

    ERIC Educational Resources Information Center

    Arnab, Sylvester; Clarke, Samantha

    2017-01-01

    The application of game-based learning adds play into educational and instructional contexts. Even though there is a lack of standard methodologies or formulaic frameworks to better inform game-based intervention development, there exist scientific and empirical studies that can serve as benchmarks for establishing scientific validity in terms of…

  7. FIELD EVALUATION OF A SAMPLING APPROACH FOR PM-COARSE AEROSOLS

    EPA Science Inventory

    Subsequent to a 1997 revision of the national ambient air quality standards (NAAQS) for particulate matter (PM), the US Environmental Protection Agency is investigating the development of sampling methodology for a possible new coarse particle standard. When developed, this me...

  8. On the development of a methodology for extensive in-situ and continuous atmospheric CO2 monitoring

    NASA Astrophysics Data System (ADS)

    Wang, K.; Chang, S.; Jhang, T.

    2010-12-01

    Carbon dioxide is recognized as the dominant greenhouse gas contributing to anthropogenic global warming. Stringent controls on carbon dioxide emissions are viewed as necessary steps in controlling atmospheric carbon dioxide concentrations. From the viewpoint of policy making, regulation of carbon dioxide emissions and its monitoring are key to the success of stringent controls on carbon dioxide emissions. In particular, extensive atmospheric CO2 monitoring is a crucial step to ensure that CO2 emission control strategies are closely followed. In this work we develop a methodology that enables reliable and accurate in-situ and continuous atmospheric CO2 monitoring for policy making. The methodology comprises the use of a gas filter correlation (GFC) instrument for in-situ CO2 monitoring, the use of CO2 working standards accompanying the continuous measurements, and the use of NOAA WMO CO2 standard gases for calibrating the working standards. The use of GFC instruments enables a 1-second data sampling frequency, with the interference of water vapor removed by an added dryer. The CO2 measurements are conducted in the following timed and cycled manner: zero CO2 measurement, two standard CO2 gas measurements, and ambient air measurements. The standard CO2 gases are calibrated against NOAA WMO CO2 standards. The methodology has been used for indoor CO2 measurements in a commercial office (about 120 people working inside), for ambient CO2 measurements, and in a fleet of in-service commercial cargo ships for monitoring CO2 over the global marine boundary layer. These measurements demonstrate that our method is reliable, accurate, and traceable to NOAA WMO CO2 standards. The portability of the instrument and the working standards makes the method readily applicable to large-scale and extensive CO2 measurements.
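
    The two-standard measurement cycle supports a simple two-point linear calibration: the instrument's responses to the two standards of known CO2 mole fraction define a slope and offset that are then applied to ambient readings. The values below are hypothetical; the actual standards are traceable to the NOAA WMO scale.

      # Two-point working-standard calibration sketch with hypothetical
      # values: known standards define a linear correction for raw readings.
      std_true = (380.0, 420.0)        # assigned values of the two standards, ppm
      std_measured = (378.6, 419.1)    # raw instrument responses, ppm

      slope = (std_true[1] - std_true[0]) / (std_measured[1] - std_measured[0])
      offset = std_true[0] - slope * std_measured[0]

      def calibrate(raw_ppm):
          return slope * raw_ppm + offset

      for raw in (390.2, 405.7, 512.3):
          print(f"raw {raw:.1f} ppm -> calibrated {calibrate(raw):.1f} ppm")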

  9. Establishment of Requirements and Methodology for the Development and Implementation of GreyMatters, a Memory Clinic Information System.

    PubMed

    Tapuria, Archana; Evans, Matt; Curcin, Vasa; Austin, Tony; Lea, Nathan; Kalra, Dipak

    2017-01-01

    The aim of the paper is to establish the requirements and methodology for the development process of GreyMatters, a memory clinic system, outlining the conceptual, practical, technical, and ethical challenges, and the experiences of capturing clinical and research-oriented data, along with the implementation of the system. The methodology for development of the information system involved phases of requirements gathering, modeling and prototype creation, and 'bench testing' the prototype with experts. The standard Institute of Electrical and Electronics Engineers (IEEE) recommended approach for the specification of software requirements was adopted. An electronic health record (EHR) standard, EN 13606, was used; clinical modelling was done through archetypes; and the project complied with data protection and privacy legislation. The requirements for GreyMatters were established. Though the initial development was complex, the requirements, methodology, and standards adopted made the construction, deployment, adoption, and population of a memory clinic and research database feasible. The electronic patient data, including the assessment scales, provide a rich source of objective data for audits and research and help to establish study feasibility and identify potential participants for clinical trials. The establishment of requirements and methodology, addressing issues of data security and confidentiality, future data compatibility and interoperability, and medico-legal aspects such as access controls and audit trails, led to a robust and useful system. The evaluation supports that the system is an acceptable tool for clinical, administrative, and research use and forms a useful part of the wider information architecture.

  10. Development of Management Methodology for Engineering Production Quality

    NASA Astrophysics Data System (ADS)

    Gorlenko, O.; Miroshnikov, V.; Borbatc, N.

    2016-04-01

    The authors of the paper propose four directions for developing a quality management methodology for engineering products that implements the requirements of the new international standard ISO 9001:2015: analysis of the organization's context taking stakeholders into account, the use of risk management, management of in-house knowledge, and assessment of enterprise activity according to effectiveness criteria.

  11. Seven Performance Drivers.

    ERIC Educational Resources Information Center

    Ross, Linda

    2003-01-01

    Recent work with automotive e-commerce clients led to the development of a performance analysis methodology called the Seven Performance Drivers, comprising standards, incentives, capacity, knowledge and skill, measurement, feedback, and analysis. This methodology has been highly effective in introducing and implementing performance improvement.…

  12. Health Data Standards and Adoption Process: Preliminary Findings of a Qualitative Study in Saudi Arabia

    ERIC Educational Resources Information Center

    Alkraiji, Abdullah; Jackson, Thomas; Murray, Ian

    2011-01-01

    Purpose: This paper seeks to carry out a critical study of health data standards and adoption process with a focus on Saudi Arabia. Design/methodology/approach: Many developed nations have initiated programs to develop, promote, adopt and customise international health data standards to the local needs. The current status of, and future plans for,…

  13. Sharing behavioral data through a grid infrastructure using data standards

    PubMed Central

    Min, Hua; Ohira, Riki; Collins, Michael A; Bondy, Jessica; Avis, Nancy E; Tchuvatkina, Olga; Courtney, Paul K; Moser, Richard P; Shaikh, Abdul R; Hesse, Bradford W; Cooper, Mary; Reeves, Dianne; Lanese, Bob; Helba, Cindy; Miller, Suzanne M; Ross, Eric A

    2014-01-01

    Objective In an effort to standardize behavioral measures and their data representation, the present study develops a methodology for incorporating measures found in the National Cancer Institute's (NCI) grid-enabled measures (GEM) portal, a repository for behavioral and social measures, into the cancer data standards registry and repository (caDSR). Methods The methodology consists of four parts for curating GEM measures into the caDSR: (1) develop unified modeling language (UML) models for behavioral measures; (2) create common data elements (CDE) for UML components; (3) bind CDE with concepts from the NCI thesaurus; and (4) register CDE in the caDSR. Results UML models have been developed for four GEM measures, which have been registered in the caDSR as CDE. New behavioral concepts related to these measures have been created and incorporated into the NCI thesaurus. Best practices for representing measures using UML models have been utilized in the practice (eg, caDSR). One dataset based on a GEM-curated measure is available for use by other systems and users connected to the grid. Conclusions Behavioral and population science data can be standardized by using and extending current standards. A new branch of CDE for behavioral science was developed for the caDSR. It expands the caDSR domain coverage beyond the clinical and biological areas. In addition, missing terms and concepts specific to the behavioral measures addressed in this paper were added to the NCI thesaurus. A methodology was developed and refined for curation of behavioral and population science data. PMID:24076749
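
    Steps (2) and (3) of the curation methodology can be sketched as follows, representing a measure item as a common data element and attaching thesaurus concept bindings before registration; the class, measure, and concept code are illustrative inventions, not the real caDSR API or NCI Thesaurus identifiers.

      # Sketch of CDE creation and concept binding. All names and codes
      # are hypothetical illustrations of the curation steps.
      from dataclasses import dataclass, field

      @dataclass
      class CommonDataElement:
          long_name: str
          definition: str
          value_domain: list                                 # permissible values
          concept_codes: list = field(default_factory=list)  # thesaurus bindings

      cde = CommonDataElement(
          long_name="Smoking Status",
          definition="Self-reported current smoking behavior of the respondent.",
          value_domain=["Current", "Former", "Never"],
      )
      cde.concept_codes.append("C-XXXX (hypothetical 'Smoking Status' concept)")
      print(cde)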

  14. 76 FR 65504 - Proposed Agency Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-21

    ..., including the validity of the methodology and assumptions used; (c) ways to enhance the quality, utility... Reliability Standard, FAC-008-3--Facility Ratings, developed by the North American Electric Reliability... Reliability Standard FAC-008-3 is pending before the Commission. The proposed Reliability Standard modifies...

  15. Langley Wind Tunnel Data Quality Assurance-Check Standard Results

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.

    2000-01-01

    A framework for statistical evaluation, control, and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.

  16. Moving to the Next Generation of Standards for Science: Building on Recent Practices. CRESST Report 762

    ERIC Educational Resources Information Center

    Herman, Joan L.

    2009-01-01

    In this report, Joan Herman, director of the National Center for Research on Evaluation, Standards, & Student Testing (CRESST), recommends that the new generation of science standards be based on lessons learned from current practice and on recent examples of standards-development methodology. In support of this, recent, promising efforts to…

  17. A Knowledge Engineering Approach to Develop Domain Ontology

    ERIC Educational Resources Information Center

    Yun, Hongyan; Xu, Jianliang; Xiong, Jing; Wei, Moji

    2011-01-01

    Ontologies are one of the most popular and widespread means of knowledge representation and reuse. A few research groups have proposed a series of methodologies for developing their own standard ontologies. However, because this ontological construction concerns special fields, there is no standard method to build domain ontology. In this paper,…

  18. Can Global Learning Raise Standards within Pupils' Writing in the Primary Phase? Development Education Research Centre. Research Paper No. 16

    ERIC Educational Resources Information Center

    Alcock, Hilary L.; Ramirez Barker, Linda

    2016-01-01

    This study was primarily undertaken by teachers for teachers, and focuses on the potential contribution of global learning and development education (DE) methodologies to a core aspect of curriculum provision, namely writing. The aim of the study is to explore whether using global learning and DE methodologies can have an impact on pupils'…

  19. Sharing behavioral data through a grid infrastructure using data standards.

    PubMed

    Min, Hua; Ohira, Riki; Collins, Michael A; Bondy, Jessica; Avis, Nancy E; Tchuvatkina, Olga; Courtney, Paul K; Moser, Richard P; Shaikh, Abdul R; Hesse, Bradford W; Cooper, Mary; Reeves, Dianne; Lanese, Bob; Helba, Cindy; Miller, Suzanne M; Ross, Eric A

    2014-01-01

    In an effort to standardize behavioral measures and their data representation, the present study develops a methodology for incorporating measures found in the National Cancer Institute's (NCI) grid-enabled measures (GEM) portal, a repository for behavioral and social measures, into the cancer data standards registry and repository (caDSR). The methodology consists of four parts for curating GEM measures into the caDSR: (1) develop unified modeling language (UML) models for behavioral measures; (2) create common data elements (CDE) for UML components; (3) bind CDE with concepts from the NCI thesaurus; and (4) register CDE in the caDSR. UML models have been developed for four GEM measures, which have been registered in the caDSR as CDE. New behavioral concepts related to these measures have been created and incorporated into the NCI thesaurus. Best practices for representing measures using UML models have been utilized in the practice (eg, caDSR). One dataset based on a GEM-curated measure is available for use by other systems and users connected to the grid. Behavioral and population science data can be standardized by using and extending current standards. A new branch of CDE for behavioral science was developed for the caDSR. It expands the caDSR domain coverage beyond the clinical and biological areas. In addition, missing terms and concepts specific to the behavioral measures addressed in this paper were added to the NCI thesaurus. A methodology was developed and refined for curation of behavioral and population science data.

  20. Specification of Energy Assessment Methodologies to Satisfy ISO 50001 Energy Management Standard

    NASA Astrophysics Data System (ADS)

    Kanneganti, Harish

    Energy management has become more crucial for the industrial sector as a way to lower the cost of production and reduce its carbon footprint. Environmental regulations also push the industrial sector to increase the efficiency of its energy usage. Hence the industrial sector has come to rely on energy management consultancies for improvements in energy efficiency. With the development of the ISO 50001 standard, energy management took on a new dimension, involving top-level management and securing their commitment to energy efficiency. One of the key requirements of ISO 50001 is for a facility to demonstrate continual improvement in its energy efficiency. The major aim of this work is to develop an energy assessment methodology and reporting format tailored to the needs of ISO 50001. The developed methodology integrates the energy reduction aspect of an energy assessment with the requirements of sections 4.4.3 (Energy Review) through 4.4.6 (Objectives, Targets and Action Plans) of ISO 50001, thus helping facilities implement ISO 50001 more easily.
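
    One common way to demonstrate the continual improvement ISO 50001 requires is to regress baseline-period energy use on a relevant variable and compare reporting-period consumption against the model's prediction. The sketch below assumes a single-variable production model with hypothetical data (and Python 3.10+ for statistics.linear_regression); a real energy review would be more elaborate.

      # Baseline-regression sketch for demonstrating energy savings.
      # Data are hypothetical; requires Python 3.10+.
      from statistics import linear_regression

      baseline_production = [820, 910, 760, 880, 940, 800]   # units/month
      baseline_energy     = [410, 452, 385, 441, 468, 404]   # MWh/month

      slope, intercept = linear_regression(baseline_production, baseline_energy)

      reporting = [(900, 430), (850, 415), (950, 452)]        # (production, actual MWh)
      for prod, actual in reporting:
          expected = slope * prod + intercept
          print(f"production {prod}: expected {expected:.1f} MWh, "
                f"actual {actual:.1f} MWh, savings {expected - actual:.1f} MWh")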

  1. Summary Brief: International Baccalaureate Standards Development and Alignment Project

    ERIC Educational Resources Information Center

    Conley, David T.; Ward, Terri

    2009-01-01

    Although the International Baccalaureate (IB) Diploma Programme is offered by many high schools in the United States and considered to be challenging and rich in content, the curriculum has not been analyzed to determine its alignment with college readiness standards or state educational standards in the U.S. The research methodology employed by…

  2. Final Report of the Working Group Meeting C, "Standards in Vocational Training" (Berlin, Germany, February 15-16, 1996). [and] Minutes of the Working Group Meeting C, "Standards in Vocational Education and Training."

    ERIC Educational Resources Information Center

    German Federal Inst. for Vocational Training Affairs, Berlin (Germany).

    Representatives from 13 Central and Eastern European countries, the European Centre for the Development of Vocational Training, and the Organization for Economic Cooperation and Development met for 2 days in Berlin to continue European Training Foundation (ETF) efforts to design a methodology for formulating standards in vocational training (VT)…

  3. Movie Mitosis

    ERIC Educational Resources Information Center

    Bogiages, Christopher; Hitt, Austin M.

    2008-01-01

    Mitosis and meiosis are essential for the growth, development, and reproduction of organisms. Because these processes are essential to life, both are emphasized in biology texts, state standards, and the National Science Education Standards. In this article, the authors present their methodology for teaching mitosis by having students produce…

  4. Technical Support Documents Used to Develop the Chesapeake Bay TMDL

    EPA Pesticide Factsheets

    The Chesapeake Bay TMDL development was supported by several technical documents for water quality standards and allocation methodologies specific to the Chesapeake Bay. This page provides the technical support documents.

  5. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems is typically very complex, due to the increasing number of features as well as the high demands for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examines the whole state space and, consequently, results in full test coverage. Nevertheless, despite the obvious advantages, this technique is as yet rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy, and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.
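
    The model-checking idea the paper relies on, exhaustively exploring every reachable state and checking a property in each, can be sketched on a toy model: a breadth-first search over an invented three-mode satellite mode-switch model verifying an invariant. Real tools such as the SCADE verifier operate on the design models themselves.

      # Explicit-state model-checking sketch: BFS over all reachable states
      # of an invented mode-switch model, asserting an invariant in each.
      from collections import deque

      def successors(state):
          mode, torque_on = state
          if mode == "safe":
              yield ("acquisition", torque_on)
          if mode == "acquisition":
              yield ("pointing", True)
              yield ("safe", False)
          if mode == "pointing":
              yield ("safe", False)

      def invariant(state):
          mode, torque_on = state
          return not (mode == "safe" and torque_on)   # torquers off in safe mode

      initial = ("safe", False)
      seen, frontier = {initial}, deque([initial])
      while frontier:
          s = frontier.popleft()
          assert invariant(s), f"invariant violated in {s}"
          for t in successors(s):
              if t not in seen:
                  seen.add(t); frontier.append(t)
      print(f"verified invariant over {len(seen)} reachable states")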

  6. Cryogenic insulation standard data and methodologies

    NASA Astrophysics Data System (ADS)

    Demko, J. A.; Fesmire, J. E.; Johnson, W. L.; Swanger, A. M.

    2014-01-01

    Although some standards exist for thermal insulation, few address the sub-ambient temperature range and cold-side temperatures below 100 K. Standards for cryogenic insulation systems require cryostat testing and data analysis that will allow the development of the tools needed by design engineers and thermal analysts for the design of practical cryogenic systems. Thus, this critically important information can provide reliable data and methodologies for industrial efficiency and energy conservation. Two task groups have been established in the area of cryogenic insulation systems under ASTM International's Committee C16 on Thermal Insulation: WK29609, New Standard for Thermal Performance Testing of Cryogenic Insulation Systems, and WK29608, Standard Practice for Multilayer Insulation in Cryogenic Service. The Cryogenics Test Laboratory of NASA Kennedy Space Center and the Thermal Energy Laboratory of LeTourneau University are conducting an Inter-Laboratory Study (ILS) of selected insulation materials. Each lab carries out measurements of the thermal properties of these materials using identical flat-plate boil-off calorimeter instruments. Parallel testing will provide the comparisons necessary to validate the measurements and methodologies. Here we discuss test methods, some initial data in relation to the experimental approach, and the manner of reporting the thermal performance data. This initial study of insulation materials for sub-ambient temperature applications is aimed at paving the way for further ILS comparative efforts that will produce standard data sets for several commercial materials. Discrepancies found between measurements will be used to improve the testing and data reduction techniques being developed as part of the future ASTM International standards.
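
    The data reduction behind flat-plate boil-off calorimetry can be sketched simply: the measured boil-off mass flow gives the heat load Q = mdot * h_fg, and Fourier's law across the insulation thickness yields an effective thermal conductivity. The test values below are hypothetical; only the LN2 latent heat is a physical constant (approximately 199 kJ/kg).

      # Boil-off calorimetry reduction sketch with hypothetical test values.
      h_fg = 199_000.0          # latent heat of vaporization of LN2, J/kg (approx.)
      mdot = 2.5e-5             # boil-off mass flow, kg/s (hypothetical)
      area = 0.30               # effective heat-transfer area, m^2
      thickness = 0.025         # insulation thickness, m
      t_warm, t_cold = 293.0, 77.0   # boundary temperatures, K

      Q = mdot * h_fg                                   # heat load, W
      k_e = Q * thickness / (area * (t_warm - t_cold))  # effective conductivity, W/m-K
      print(f"Q = {Q:.2f} W, k_e = {k_e * 1000:.3f} mW/m-K")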

  7. Applications of cost-effectiveness methodologies in behavioral medicine.

    PubMed

    Kaplan, Robert M; Groessl, Erik J

    2002-06-01

    In 1996, the Panel on Cost-Effectiveness in Health and Medicine developed standards for cost-effectiveness analysis. The standards include the use of a societal perspective, the evaluation of treatments against the best available alternative (rather than against no care at all), and the expression of health benefits in standardized units. Guidelines for cost accounting were also offered. Among 24,562 references on cost-effectiveness in Medline between 1995 and 2000, only a handful were relevant to behavioral medicine. Only 19 studies published between 1983 and 2000 met criteria for further evaluation. Among analyses that were reported, only 2 studies were found consistent with the Panel's criteria for high-quality analyses, although more recent studies were more likely to meet methodological standards. There are substantial opportunities to advance behavioral medicine by performing standardized cost-effectiveness analyses.
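
    The panel's comparison rule, evaluating a treatment against the best available alternative with benefits in standardized units, reduces to the incremental cost-effectiveness ratio. A minimal sketch with hypothetical numbers:

      # Incremental cost-effectiveness ratio (ICER) sketch; values are
      # hypothetical, with benefits in quality-adjusted life years (QALYs).
      def icer(cost_new, qaly_new, cost_alt, qaly_alt):
          """Incremental cost-effectiveness ratio, $ per QALY gained."""
          return (cost_new - cost_alt) / (qaly_new - qaly_alt)

      behavioral_program = {"cost": 2_400.0, "qalys": 10.35}
      usual_care = {"cost": 1_100.0, "qalys": 10.30}

      ratio = icer(behavioral_program["cost"], behavioral_program["qalys"],
                   usual_care["cost"], usual_care["qalys"])
      print(f"ICER = ${ratio:,.0f} per QALY gained")   # $26,000 per QALY here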

  8. Standards to Assure Quality in Tertiary Education: The Case of Tanzania

    ERIC Educational Resources Information Center

    Manyaga, Timothy

    2008-01-01

    Purpose: The purpose of this paper is to provide information on development of standards in Tanzania which may be of help to training providers in other countries as they seek to improve the quality and standards of their provision. Design/methodology/approach: The need to provide quality assured tertiary qualifications in Tanzania to win both…

  9. A methodology for Manufacturing Execution Systems (MES) implementation

    NASA Astrophysics Data System (ADS)

    Govindaraju, Rajesri; Putra, Krisna

    2016-02-01

    A manufacturing execution system (MES) is an information systems (IS) application that bridges the gap between IS at the top level, namely enterprise resource planning (ERP), and IS at the lower levels, namely automation systems. MES provides a medium for optimizing the manufacturing process as a whole on a real-time basis. By using MES in combination with ERP and other automation systems, a manufacturing company is expected to achieve high competitiveness. In implementing MES, functional integration, that is, making all the components of the manufacturing system work well together, is the most difficult challenge. For this, an industry standard exists that specifies the sub-systems of a manufacturing execution system and defines the boundaries between ERP systems, MES, and other automation systems. The standard is known as ISA-95. Although the advantages of using MES have been stated in some studies, not much research has been done on how to implement MES effectively. The purpose of this study is to develop a methodology describing how an MES implementation project should be managed, utilizing the support of the ISA-95 reference model in the system development process. A proposed methodology was developed based on a general IS development methodology. The methodology was then revisited based on an understanding of the specific characteristics of MES implementation projects found in an implementation case at an Indonesian steel manufacturing company. The case study highlighted the importance of applying an effective requirements elicitation method during the initial system assessment, managing system interfaces and labor division in the design process, and performing a pilot deployment before putting the whole system into operation.

  10. [Method for the quality assessment of data collection processes in epidemiological studies].

    PubMed

    Schöne, G; Damerow, S; Hölling, H; Houben, R; Gabrys, L

    2017-10-01

    For a quantitative evaluation of primary data collection processes in epidemiological surveys based on accompaniments and observations (in the field), there is no description of test criteria and methodologies in the relevant literature and thus no known application in practice. Therefore, methods need to be developed and existing procedures adapted. The aim was to identify quality-relevant developments within quality dimensions by means of inspection points (quality indicators) during the process of data collection. As a result, we seek to implement and establish a methodology for the assessment of overall survey quality, supplementary to standardized data analyses. Monitors detect deviations from the data collection standard during site visits by applying standardized checklists. Quantitative results, overall and for each dimension, are obtained by numerical calculation of quality indicators. Score results are categorized and color coded; this visual prioritization indicates the necessity for intervention. The results obtained give clues regarding the current quality of data collection and allow for the identification of sections where interventions for quality improvement are needed. In addition, the development of process quality can be shown over time on an intercomparable basis. This methodology for the evaluation of data collection quality can identify deviations from norms, focus quality analyses, and help trace the causes of significant deviations.
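
    The scoring step can be sketched as follows: checklist items (quality indicators) are aggregated into a percentage score per quality dimension and color coded against thresholds to prioritize intervention. The dimensions, item values, and thresholds are hypothetical, not the study's actual instrument.

      # Checklist scoring sketch: per-dimension percentage scores with
      # color-coded prioritization. All values are hypothetical.
      checklist = {
          "standardization": [1, 1, 0, 1],   # 1 = conforms to standard procedure
          "completeness":    [1, 1, 1, 1],
          "documentation":   [0, 0, 1, 1],
      }

      def score(items):
          return 100.0 * sum(items) / len(items)

      def color(s):
          return "green" if s >= 90 else "yellow" if s >= 70 else "red"

      overall = []
      for dim, items in checklist.items():
          s = score(items)
          overall.append(s)
          print(f"{dim:15s} {s:5.1f}%  -> {color(s)}")
      avg = sum(overall) / len(overall)
      print(f"{'overall':15s} {avg:5.1f}%  -> {color(avg)}")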

  11. Variable Star Signature Classification using Slotted Symbolic Markov Modeling

    NASA Astrophysics Data System (ADS)

    Johnston, K. B.; Peter, A. M.

    2017-01-01

    With the advent of digital astronomy, new benefits and new challenges have been presented to the modern-day astronomer. No longer can the astronomer rely on manual processing; instead, the profession as a whole has begun to adopt more advanced computational means. This paper focuses on the construction and application of a novel time-domain signature extraction methodology and the development of a supporting supervised pattern classification algorithm for the identification of variable stars. A methodology for the reduction of stellar variable observations (time-domain data) into a novel feature space representation is introduced. The methodology presented will be referred to as Slotted Symbolic Markov Modeling (SSMM) and has a number of advantages which will be demonstrated to be beneficial, specifically to the supervised classification of stellar variables. It will be shown that the methodology outperformed a baseline standard methodology on a standardized set of stellar light curve data. The performance on a set of data derived from the LINEAR dataset will also be shown.
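
    The core of the feature construction can be sketched compactly: slot the irregularly sampled light curve onto a regular time grid, quantize the slotted amplitudes into a small symbol alphabet, and use the symbol-to-symbol Markov transition matrix as the feature vector. The slot width, alphabet size, and binning scheme below are illustrative assumptions, not the paper's tuned values:

    ```python
    # Hedged sketch of slotted symbolic Markov modelling for a light curve.
    import numpy as np

    def ssmm_features(times, mags, slot_width=1.0, n_symbols=4):
        """Slot an irregular time series, symbolize it, and return the
        row-normalized Markov transition matrix as a feature vector."""
        # 1. Slotting: average observations falling in each regular time slot.
        slots = np.floor((times - times.min()) / slot_width).astype(int)
        slotted = np.array([mags[slots == s].mean()
                            for s in range(slots.max() + 1) if np.any(slots == s)])
        # 2. Symbolization: quantile binning into n_symbols amplitude states.
        edges = np.quantile(slotted, np.linspace(0, 1, n_symbols + 1)[1:-1])
        symbols = np.digitize(slotted, edges)
        # 3. Markov model: count symbol-to-symbol transitions, normalize rows.
        T = np.zeros((n_symbols, n_symbols))
        for a, b in zip(symbols[:-1], symbols[1:]):
            T[a, b] += 1
        T /= np.clip(T.sum(axis=1, keepdims=True), 1, None)
        return T.ravel()  # feature vector for a supervised classifier

    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 100, 300))
    m = np.sin(2 * np.pi * t / 7.3) + 0.1 * rng.standard_normal(300)
    print(ssmm_features(t, m).round(2))
    ```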

  12. Variable Star Signature Classification using Slotted Symbolic Markov Modeling

    NASA Astrophysics Data System (ADS)

    Johnston, Kyle B.; Peter, Adrian M.

    2016-01-01

    With the advent of digital astronomy, new benefits and new challenges have been presented to the modern-day astronomer. No longer can the astronomer rely on manual processing; instead, the profession as a whole has begun to adopt more advanced computational means. Our research focuses on the construction and application of a novel time-domain signature extraction methodology and the development of a supporting supervised pattern classification algorithm for the identification of variable stars. A methodology for the reduction of stellar variable observations (time-domain data) into a novel feature space representation is introduced. The methodology presented will be referred to as Slotted Symbolic Markov Modeling (SSMM) and has a number of advantages which will be demonstrated to be beneficial, specifically to the supervised classification of stellar variables. It will be shown that the methodology outperformed a baseline standard methodology on a standardized set of stellar light curve data. The performance on a set of data derived from the LINEAR dataset will also be shown.

  13. 75 FR 18751 - FBI Criminal Justice Information Services Division User Fees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-13

    ... Standards (SFFAS-4): Managerial Cost Accounting Concepts and Standards for the Federal Government; and other relevant financial management directives, BearingPoint developed a cost accounting methodology and related... management process that provides information about the relationships between inputs (costs) and outputs...

  14. Development Algorithm of the Technological Process of Manufacturing Gas Turbine Parts by Selective Laser Melting

    NASA Astrophysics Data System (ADS)

    Sotov, A. V.; Agapovichev, A. V.; Smelov, V. G.; Kyarimov, R. R.

    2018-01-01

    The technology of selective laser melting (SLM) allows making products from powders of aluminum, titanium, heat-resistant alloys and stainless steels. Today, SLM technology is increasingly used to manufacture functional parts. This in turn requires the development of a methodology for designing the technological processes (TP) of manufacturing parts, including databases of standard TP. Using such a technique makes product quality less dependent on the individual technologist's qualification, and also reduces the labor input and energy consumption of TP development, thanks to the databases of standard TP integrated into the methodology. To validate the developed methodology, the influence of laser-emission modes on the surface roughness of the synthesized material was investigated. The best roughness values of the specimens in the longitudinal and transversal directions were 1.98 μm and 3.59 μm, respectively. These values were obtained at a specific energy density of 6.25 J/mm2, corresponding to a laser power of 200 W, a scanning speed of 400 mm/s, and a hatch distance of 0.08 mm.
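
    The reported figures are mutually consistent under the standard areal energy density relation for SLM, E = P / (v · h); a quick check:

    ```python
    # Areal energy density for SLM: E = P / (v * h).
    P = 200.0   # laser power, W
    v = 400.0   # scanning speed, mm/s
    h = 0.08    # hatch distance, mm

    E = P / (v * h)  # J/mm^2
    print(f"E = {E:.2f} J/mm^2")  # -> 6.25, matching the reported value
    ```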

  15. Architectural approaches for HL7-based health information systems implementation.

    PubMed

    López, D M; Blobel, B

    2010-01-01

    Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology addressing different aspects of systems architecture, such as the business, information, computational, engineering and technology viewpoints, has to be considered. The paper contributes an analysis and demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standard-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface, the message server and the mediator models. The point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with the service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise Java Beans technology. Selecting the appropriate integration architecture is a fundamental issue in any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model - offered by HIS-DF and supported by HL7 v3 artifacts - is the most promising one, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented and standard-based health information systems.
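
    Of the three architectures, the mediator is the least obvious, so a sketch may help: instead of N×N point-to-point interfaces, each system talks to a mediator that translates to and from a shared canonical model (in HL7 v3, typically RIM-derived artifacts exposed as services). The class names and the trivial "translation" below are illustrative assumptions, not HIS-DF's actual design:

    ```python
    # Hedged sketch contrasting point-to-point integration with a mediator.
    # System names and message shapes are hypothetical.

    class Mediator:
        """Translates each system's local format to a shared canonical model."""
        def __init__(self):
            self.subscribers = []

        def register(self, system):
            self.subscribers.append(system)

        def publish(self, sender, local_msg):
            canonical = sender.to_canonical(local_msg)  # e.g., an HL7 v3 artifact
            for system in self.subscribers:
                if system is not sender:
                    system.receive(system.from_canonical(canonical))

    class System:
        def __init__(self, name):
            self.name = name
        def to_canonical(self, msg):
            return {"source": self.name, "body": msg}
        def from_canonical(self, canonical):
            return f"[{self.name}] got '{canonical['body']}' from {canonical['source']}"
        def receive(self, msg):
            print(msg)

    med = Mediator()
    lab, surveillance = System("LabSystem"), System("SurveillanceSystem")
    for s in (lab, surveillance):
        med.register(s)
    med.publish(lab, "positive dengue result")
    ```

    Each system then implements one translation pair rather than one interface per peer, which is what makes the mediator attractive as the number of systems grows.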

  16. Development of Gold Standard Ion-Selective Electrode-Based Methods for Fluoride Analysis

    PubMed Central

    Martínez-Mier, E.A.; Cury, J.A.; Heilman, J.R.; Katz, B.P.; Levy, S.M.; Li, Y.; Maguire, A.; Margineda, J.; O’Mullane, D.; Phantumvanit, P.; Soto-Rojas, A.E.; Stookey, G.K.; Villa, A.; Wefel, J.S.; Whelton, H.; Whitford, G.M.; Zero, D.T.; Zhang, W.; Zohouri, V.

    2011-01-01

    Background/Aims: Currently available techniques for fluoride analysis are not standardized. Therefore, this study was designed to develop standardized methods for analyzing fluoride in biological and nonbiological samples used for dental research. Methods: A group of nine laboratories analyzed a set of standardized samples for fluoride concentration using their own methods. The group then reviewed existing analytical techniques for fluoride analysis, identified inconsistencies in the use of these techniques and conducted testing to resolve differences. Based on the results of the testing undertaken to define the best approaches for the analysis, the group developed recommendations for direct and microdiffusion methods using the fluoride ion-selective electrode. Results: Initial results demonstrated that there was no consensus regarding the choice of analytical techniques for different types of samples. Although for several types of samples the results of the fluoride analyses were similar among some laboratories, greater differences were observed for saliva, food and beverage samples. In spite of these initial differences, precise and true values of fluoride concentration, as well as smaller differences between laboratories, were obtained once the standardized methodologies were used. Intraclass correlation coefficients ranged from 0.90 to 0.93 for the analysis of a certified reference material using the standardized methodologies. Conclusion: The results of this study demonstrate that the development and use of standardized protocols for F analysis significantly decreased differences among laboratories and resulted in more precise and true values. PMID:21160184

  17. How Much Can Non-industry Standard Measurement Methodologies Benefit Methane Reduction Programs?

    NASA Astrophysics Data System (ADS)

    Risk, D. A.; O'Connell, L.; Atherton, E.

    2017-12-01

    In recent years, energy sector methane emissions have been recorded in large part by applying modern non-industry-standard techniques. Industry may lack the regulatory flexibility to use such techniques, or in some cases may not understand the possible associated economic advantage. As progressive jurisdictions move from estimation towards routine measurement, the research community should provide guidance to help regulators and companies measure more effectively, and more economically where possible. In this study, we outline a modelling experiment in which we explore the integration of non-industry-standard measurement techniques as part of a generalized compliance measurement program. The study was not intended to be exhaustive, or to recommend particular combinations, but instead to explore the inter-relationships between methodology, development type, and compliance practice. We first defined the role, applicable scale, detection limits, working distances, and approximate deployment cost of several measurement methodologies. We then considered a variety of development types differing mainly in footprint, density, and emissions "profile". Using a Monte Carlo approach, we evaluated the effect of these various factors on the cost and confidence of the compliance measurement program. We found that when added individually, some of the research techniques were indeed able to deliver an improvement in cost and/or confidence when used alongside industry-standard Optical Gas Imaging. When applied in combination, the ideal fraction of each measurement technique depended on development type, emission profile, and whether confidence or cost was more important. The results suggest that measurement cost and confidence could be improved if energy companies exploited a wider range of measurement techniques, tailored to each development. In the short term, combining clear scientific guidance with economic information could benefit immediate mitigation efforts more than the development of new super-sensors.
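
    The Monte Carlo structure described is straightforward to sketch: draw a random emitter population for a development type, apply each technique's detection limit and per-site cost, and tally program cost against the fraction of emissions detected. All numbers below (costs, detection limits, the emission distribution) are invented for illustration:

    ```python
    # Hedged Monte Carlo sketch of a compliance measurement program.
    # Technique parameters and the emission profile are illustrative only.
    import random

    TECHNIQUES = {            # per-site cost ($), minimum detectable rate (kg/h)
        "OGI camera":     (500, 1.0),
        "vehicle survey": (150, 2.0),
        "fixed sensor":   (900, 0.2),
    }

    def simulate_program(mix, n_sites=200, n_trials=2000, seed=1):
        """mix: {technique: fraction of sites covered}. Returns (cost, confidence)."""
        rng = random.Random(seed)
        detected_fraction = []
        for _ in range(n_trials):
            emissions = [rng.lognormvariate(0, 1.5) for _ in range(n_sites)]  # kg/h
            found = 0.0
            for rate in emissions:
                tech = rng.choices(list(mix), weights=list(mix.values()))[0]
                if rate >= TECHNIQUES[tech][1]:
                    found += rate
            detected_fraction.append(found / sum(emissions))
        cost = n_sites * sum(f * TECHNIQUES[t][0] for t, f in mix.items())
        return cost, sum(detected_fraction) / n_trials

    for mix in ({"OGI camera": 1.0},
                {"OGI camera": 0.5, "fixed sensor": 0.5},
                {"vehicle survey": 0.7, "fixed sensor": 0.3}):
        cost, conf = simulate_program(mix)
        print(mix, f"cost=${cost:,.0f}", f"confidence={conf:.2f}")
    ```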

  18. ESTABLISH AND STANDARDIZE METHODOLOGY FOR DETECTION OF WATERBORNE VIRUSES FROM HUMAN SOURCES

    EPA Science Inventory

    Research is conducted to develop and standardize methods to detect and measure occurrence of human enteric viruses that cause waterborne disease. The viruses of concern include the emerging pathogens--hepatitis E virus and group B rotaviruses. Also of concern are the coxsackiev...

  19. Direct cost analysis of intensive care unit stay in four European countries: applying a standardized costing methodology.

    PubMed

    Tan, Siok Swan; Bakker, Jan; Hoogendoorn, Marga E; Kapila, Atul; Martin, Joerg; Pezzi, Angelo; Pittoni, Giovanni; Spronk, Peter E; Welte, Robert; Hakkaart-van Roijen, Leona

    2012-01-01

    The objective of the present study was to measure and compare the direct costs of intensive care unit (ICU) days at seven ICU departments in Germany, Italy, the Netherlands, and the United Kingdom by means of a standardized costing methodology. A retrospective cost analysis of ICU patients was performed from the hospital's perspective. The standardized costing methodology was developed on the basis of the availability of data at the seven ICU departments. It entailed the application of the bottom-up approach for "hotel and nutrition" and the top-down approach for "diagnostics," "consumables," and "labor." Direct costs per ICU day ranged from €1168 to €2025. Even though the distribution of costs varied by cost component, labor was the most important cost driver at all departments. The costs for "labor" amounted to €1629 at department G but were fairly similar at the other departments (€711 ± 115). Direct costs of ICU days vary widely between the seven departments. Our standardized costing methodology could serve as a valuable instrument to compare actual cost differences, such as those resulting from differences in patient case-mix. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
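
    The distinction between the two costing approaches is compact enough to illustrate. A minimal sketch, with all monetary figures and volumes invented for illustration (they are not study data):

    ```python
    # Hedged sketch of the two costing approaches used in the study.
    # All monetary figures and volumes are illustrative, not study data.

    # Bottom-up (e.g., "hotel and nutrition"): per-patient resource use x unit cost.
    patient_items = [("hotel", 1, 210.0), ("nutrition", 3, 12.5)]  # item, units, EUR/unit
    bottom_up = sum(units * unit_cost for _, units, unit_cost in patient_items)

    # Top-down (e.g., "labor"): departmental annual expense / total ICU days.
    annual_labor_expense = 4_500_000.0  # EUR
    annual_icu_days = 6_000
    top_down = annual_labor_expense / annual_icu_days

    print(f"bottom-up cost per ICU day:     EUR {bottom_up:.2f}")
    print(f"top-down labor cost per ICU day: EUR {top_down:.2f}")
    ```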

  20. The Standard of Quality for HEIs in Vietnam: A Step in the Right Direction?

    ERIC Educational Resources Information Center

    Tran, Nga D.; Nguyen, Thanh T.; Nguyen, My T. N.

    2011-01-01

    Purpose: The purpose of this paper is to provide a critical analysis of the Standard of Quality for higher education institutions in Vietnam which was developed in response to an urgent call for a fundamental reform to enhance the quality of educational provision, particularly of teaching and learning. Design/methodology/approach: The standard and…

  1. SHINE: Strategic Health Informatics Networks for Europe.

    PubMed

    Kruit, D; Cooper, P A

    1994-10-01

    The mission of SHINE is to construct an open systems framework for the development of regional community healthcare telematic services that support and add to the strategic business objectives of European healthcare providers and purchasers. This framework will contain a Methodology, which identifies healthcare business processes and develops a supporting IT strategy, and the Open Health Environment, which consists of an architecture and information standards that are 'open' and available to any organisation wishing to construct SHINE-conformant regional healthcare telematic services. Results are: generic models (e.g., regional healthcare business networks, IT strategies); demonstrables (e.g., pilot demonstrators, application and service prototypes); reports (e.g., the SHINE Methodology, pilot specifications and evaluations); and proposals (e.g., service/interface specifications, standards conformance).

  2. Procurement Contracting Officer’s Guide to Cost Accounting Standards,

    DTIC Science & Technology

    1977-09-01

    Procurement Contracting Officer's Guide to Cost Accounting Standards ... discussing the history and development of Cost Accounting Standards, the functions of the Cost Accounting Standards Board, and the methodology ... Abstract (continued): the tasks that Cost Accounting Standards have placed on the procurement officer. By understanding these tasks the ...

  3. Development of a Methodology for Assessing Aircrew Workloads.

    DTIC Science & Technology

    1981-11-01

    Contents fragment: Workload Feasibility Study; Subjects; Equipment; Data Analysis ... Descriptors: analysis; simulation; standard time systems; switching synthetic time systems; task activities; task interference; time study; tracking; workload; work sampl... ... standard data systems, information content analysis, work sampling and job evaluation. Conventional methods were found to be deficient in accounting

  4. Ballast water regulations and the move toward concentration-based numeric discharge limits.

    PubMed

    Albert, Ryan J; Lishman, John M; Saxena, Juhi R

    2013-03-01

    Ballast water from shipping is a principal source for the introduction of nonindigenous species. As a result, numerous government bodies have adopted various ballast water management practices and discharge standards to slow or eliminate the future introduction and dispersal of these nonindigenous species. For researchers studying ballast water issues, understanding the regulatory framework is helpful to define the scope of research needed by policy makers to develop effective regulations. However, for most scientists, this information is difficult to obtain because it is outside the standard scientific literature and often difficult to interpret. This paper provides a brief review of the regulatory framework directed toward scientists studying ballast water and aquatic invasive species issues. We describe different approaches to ballast water management in international, U.S. federal and state, and domestic ballast water regulation. Specifically, we discuss standards established by the International Maritime Organization (IMO), the U.S. Coast Guard and U.S. Environmental Protection Agency, and individual states in the United States including California, New York, and Minnesota. Additionally, outside the United States, countries such as Australia, Canada, and New Zealand have well-established domestic ballast water regulatory regimes. Different approaches to regulation have recently resulted in variations between numeric concentration-based ballast water discharge limits, particularly in the United States, as well as reliance on use of ballast water exchange pending development and adoption of rigorous science-based discharge standards. To date, numeric concentration-based discharge limits have not generally been based upon a thorough application of risk-assessment methodologies. Regulators, making decisions based on the available information and methodologies before them, have consequently established varying standards, or not established standards at all. The review and refinement of ballast water discharge standards by regulatory agencies will benefit from activity by the scientific community to improve and develop more precise risk-assessment methodologies.

  5. Design, Development and Analysis of Centrifugal Blower

    NASA Astrophysics Data System (ADS)

    Baloni, Beena Devendra; Channiwala, Salim Abbasbhai; Harsha, Sugnanam Naga Ramannath

    2018-06-01

    Centrifugal blowers are widely used turbomachines in all kinds of industrial and domestic applications. The manufacturing of blowers seldom follows an optimum design solution for the individual blower. Although centrifugal blowers have developed into highly efficient machines, their design is still based on various empirical and semi-empirical rules proposed by fan designers. Different methodologies are used to design the impeller and other components of blowers. The objective of the present study is to examine explicit design methodologies and to trace a unified design that achieves better design-point performance. This unified design methodology is based more on fundamental concepts and minimal assumptions. A parametric study is also carried out on the effect of design parameters on the pressure ratio and their interdependency in the design. A code was developed based on the unified design using the C programming language. Numerical analysis was carried out to check the flow parameters inside the blower. Two blowers, one based on the present design and the other on an industrial design, were developed with a standard OEM blower manufacturing unit. The two designs were compared through experimental performance analysis as per the IS standard. The results suggest better efficiency and a higher flow rate for the same pressure head in the case of the present design compared with the industrial one.

  6. Development of an Evaluation Methodology for Triple Bottom Line Reports Using International Standards on Reporting

    NASA Astrophysics Data System (ADS)

    Skouloudis, Antonis; Evangelinos, Konstantinos; Kourmousis, Fotis

    2009-08-01

    The purpose of this article is twofold. First, evaluation scoring systems for triple bottom line (TBL) reports to date are examined and potential methodological weaknesses and problems are highlighted. In this context, a new assessment methodology is presented based explicitly on the most widely acknowledged standard on non-financial reporting worldwide, the Global Reporting Initiative (GRI) guidelines. The set of GRI topics and performance indicators was converted into scoring criteria, while the generic scoring scale was set from 0 to 4 points. Secondly, the proposed benchmark tool was applied to the TBL reports published by Greek companies. Results reveal major gaps in reporting practices, stressing the need for the further development of internal systems and processes in order to collect essential non-financial performance data. A critical overview of the structure and rationale of the evaluation tool in conjunction with the Greek case study is discussed, while recommendations for future research in the field of this relatively new form of reporting are suggested.
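
    The scoring logic is simple to illustrate: each GRI topic or indicator becomes a criterion scored on the 0-4 scale, and a report's total is the sum (or a normalized fraction) across criteria. The criterion labels below paraphrase GRI G3 indicators and the scores are invented; this is not the authors' actual instrument:

    ```python
    # Hedged sketch of a GRI-based 0-4 scoring of a TBL report.
    # Criteria paraphrase GRI G3 indicators; scores are illustrative.

    SCORES = {  # criterion -> score on the 0-4 scale
        "EC1 economic value generated and distributed": 3,
        "EN16 greenhouse gas emissions": 2,
        "LA7 rates of injury and absenteeism": 0,   # not reported
        "SO1 impacts on local communities": 1,
    }

    MAX_PER_CRITERION = 4
    total = sum(SCORES.values())
    coverage = total / (MAX_PER_CRITERION * len(SCORES))
    print(f"total score: {total}/{MAX_PER_CRITERION * len(SCORES)}")
    print(f"normalized reporting completeness: {coverage:.0%}")
    ```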

  7. Development of an evaluation methodology for triple bottom line reports using international standards on reporting.

    PubMed

    Skouloudis, Antonis; Evangelinos, Konstantinos; Kourmousis, Fotis

    2009-08-01

    The purpose of this article is twofold. First, evaluation scoring systems for triple bottom line (TBL) reports to date are examined and potential methodological weaknesses and problems are highlighted. In this context, a new assessment methodology is presented based explicitly on the most widely acknowledged standard on non-financial reporting worldwide, the Global Reporting Initiative (GRI) guidelines. The set of GRI topics and performance indicators was converted into scoring criteria, while the generic scoring scale was set from 0 to 4 points. Secondly, the proposed benchmark tool was applied to the TBL reports published by Greek companies. Results reveal major gaps in reporting practices, stressing the need for the further development of internal systems and processes in order to collect essential non-financial performance data. A critical overview of the structure and rationale of the evaluation tool in conjunction with the Greek case study is discussed, while recommendations for future research in the field of this relatively new form of reporting are suggested.

  8. C-Band Airport Surface Communications System Standards Development. Phase II Final Report. Volume 2: Test Bed Performance Evaluation and Final AeroMACS Recommendations

    NASA Technical Reports Server (NTRS)

    Hall, Edward; Magner, James

    2011-01-01

    This report is provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract NNC05CA85C, Task 7: New ATM Requirements-Future Communications, C-Band and L-Band Communications Standard Development, and was based on direction provided by FAA project-level agreements for New ATM Requirements-Future Communications. Task 7 included two subtasks. Subtask 7-1 addressed C-band (5091- to 5150-MHz) airport surface data communications standards development, systems engineering, test bed and prototype development, and tests and demonstrations to establish operational capability for the Aeronautical Mobile Airport Communications System (AeroMACS). Subtask 7-2 focused on systems engineering and development support of the L-band digital aeronautical communications system (L-DACS). Subtask 7-1 consisted of two phases. Phase I included development of AeroMACS concepts of use, requirements, architecture, and initial high-level safety risk assessment. Phase II builds on Phase I results and is presented in two volumes. Volume I is devoted to concepts of use, system requirements, and architecture, including AeroMACS design considerations. Volume II (this document) describes an AeroMACS prototype evaluation and presents final AeroMACS recommendations. This report also describes airport categorization and channelization methodologies. The purposes of the airport categorization task were (1) to facilitate initial AeroMACS architecture designs and enable budgetary projections by creating a set of airport categories based on common airport characteristics and design objectives, and (2) to offer high-level guidance to potential AeroMACS technology and policy development sponsors and service providers. A channelization plan methodology was developed because a common global methodology is needed to assure seamless interoperability among diverse AeroMACS services potentially supplied by multiple service providers.
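
    A channelization plan of the kind described can be sketched by tiling the allocated band into equal channels. The sketch below assumes non-overlapping 5-MHz channels across the 5091-5150 MHz allocation; real AeroMACS channel plans may differ in guard-band and overlap handling:

    ```python
    # Hedged sketch of a simple channelization plan for the 5091-5150 MHz band.
    # Assumes non-overlapping 5-MHz channels; actual AeroMACS plans may differ.

    BAND_LOW, BAND_HIGH = 5091.0, 5150.0   # MHz
    CHANNEL_BW = 5.0                       # MHz

    n_channels = int((BAND_HIGH - BAND_LOW) // CHANNEL_BW)
    centers = [BAND_LOW + CHANNEL_BW / 2 + k * CHANNEL_BW for k in range(n_channels)]

    print(f"{n_channels} channels of {CHANNEL_BW} MHz:")
    for k, fc in enumerate(centers, start=1):
        print(f"  ch {k}: {fc - CHANNEL_BW/2:.1f}-{fc + CHANNEL_BW/2:.1f} MHz "
              f"(center {fc:.1f} MHz)")
    ```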

  9. C-Band Airport Surface Communications System Standards Development. Phase II Final Report. Volume 1: Concepts of Use, Initial System Requirements, Architecture, and AeroMACS Design Considerations

    NASA Technical Reports Server (NTRS)

    Hall, Edward; Isaacs, James; Henriksen, Steve; Zelkin, Natalie

    2011-01-01

    This report is provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract NNC05CA85C, Task 7: New ATM Requirements-Future Communications, C-Band and L-Band Communications Standard Development, and was based on direction provided by FAA project-level agreements for New ATM Requirements-Future Communications. Task 7 included two subtasks. Subtask 7-1 addressed C-band (5091- to 5150-MHz) airport surface data communications standards development, systems engineering, test bed and prototype development, and tests and demonstrations to establish operational capability for the Aeronautical Mobile Airport Communications System (AeroMACS). Subtask 7-2 focused on systems engineering and development support of the L-band digital aeronautical communications system (L-DACS). Subtask 7-1 consisted of two phases. Phase I included development of AeroMACS concepts of use, requirements, architecture, and initial high-level safety risk assessment. Phase II builds on Phase I results and is presented in two volumes. Volume I (this document) is devoted to concepts of use, system requirements, and architecture, including AeroMACS design considerations. Volume II describes an AeroMACS prototype evaluation and presents final AeroMACS recommendations. This report also describes airport categorization and channelization methodologies. The purposes of the airport categorization task were (1) to facilitate initial AeroMACS architecture designs and enable budgetary projections by creating a set of airport categories based on common airport characteristics and design objectives, and (2) to offer high-level guidance to potential AeroMACS technology and policy development sponsors and service providers. A channelization plan methodology was developed because a common global methodology is needed to assure seamless interoperability among diverse AeroMACS services potentially supplied by multiple service providers.

  10. Methodological Validation of Quality of Life Questionnaire for Coal Mining Groups-Indian Scenario

    ERIC Educational Resources Information Center

    Sen, Sayanti; Sen, Goutam; Tewary, B. K.

    2012-01-01

    Maslow's hierarchy-of-needs theory has been used to predict the development of Quality of Life (QOL) in countries over time. In this paper, an attempt has been made to derive a methodological validation of the quality of life questionnaire prepared for the study area. The objective of the study is to standardize a questionnaire tool to…

  11. Nuclear power plant life extension using subsize surveillance specimens. Performance report (4/15/92 - 4/14/98)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Arvind S.

    2001-03-05

    A new methodology to predict the Upper Shelf Energy (USE) of standard (full-size) Charpy specimens based on subsize specimens has been developed. The prediction methodology uses Finite Element Modeling (FEM) to model the fracture behavior. The inputs to the FEM are the tensile properties of the material and subsize Charpy specimen test data.

  12. Integrated methodology for standard-setting norms of innovative product in the new competitive environment

    NASA Astrophysics Data System (ADS)

    Polyakova, Marina; Rubin, Gennadiy

    2017-07-01

    The modern theory of technological and economic development is based on long-term cycles. It has been shown that the technological structure of the economy can be subdivided into groups of technological complexes, inter-related by similar technological links, so-called technological modes. A technological mode is defined as a complex of interrelated production units of a similar technological level which develop simultaneously. In order to keep products competitive in new, changing conditions, it is necessary to make sure that they meet all the regulatory requirements specified in standards. But the existing, fast-changing situation on merchandise markets causes an imbalance between growing customer requirements and the technological capabilities of the manufacturer. This makes the development of standardization even more urgent, both for establishing current positions and for identifying promising development trends in technology. In the paper, scientific and engineering principles of developing standardization as a science are described. It is shown that the further development of standardization rests on the principles of advanced standardization, the main idea of which is to set prospective requirements for the innovative product. Modern approaches to advanced standardization are presented. The complexity of the negotiation procedure between customer and manufacturer, and of achieving consensus in particular, makes it necessary to find conceptually new approaches to developing mathematical models. The developed methodology pictures the process of reaching consensus between customer and manufacturer while setting standard norms as a decreasing S-curve: by the end of the negotiation process, there is no difference between the customer's and the manufacturer's positions. The assessment is grounded in a differential equation relating the rate of change of the quality assessment to the distance of the estimated parameter value from the best value towards the worst one. The resulting mathematical model can be used in standardization practice to reduce the time needed to set standard norms.
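
    The abstract names the differential equation without stating it. One plausible formalization, offered purely as an assumption consistent with the described decreasing S-curve, is a logistic decay of the disagreement between the two positions:

    ```latex
    % A hypothetical formalization, not taken from the paper.
    % d(t): distance between the customer's and the manufacturer's positions.
    \frac{\mathrm{d}d}{\mathrm{d}t} = -k\,d\left(1-\frac{d}{d_0}\right),
    \qquad k>0,
    \qquad d(t)=\frac{d_0}{1+e^{\,k\,(t-t_0)}},
    ```

    which starts near the initial disagreement d_0 and decays along an S-shaped curve toward zero as consensus is reached.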

  13. Space Transportation Operations: Assessment of Methodologies and Models

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla

    2001-01-01

    The systems design process for future space transportation involves understanding multiple variables and their effect on lifecycle metrics. Variables such as technology readiness or potential environmental impact are qualitative, while variables such as reliability, operations costs or flight rates are quantitative. In deciding what new design concepts to fund, NASA needs a methodology that would assess the sum total of all relevant qualitative and quantitative lifecycle metrics resulting from each proposed concept. The objective of this research was to review the state of operations assessment methodologies and models used to evaluate proposed space transportation systems and to develop recommendations for improving them. It was found that, compared to the models available from other sources, the operations assessment methodology recently developed at Kennedy Space Center has the potential to produce a decision support tool that will serve as the industry standard. Towards that goal, a number of areas of improvement in the Kennedy Space Center's methodology are identified.

  14. Space Transportation Operations: Assessment of Methodologies and Models

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla

    2002-01-01

    The systems design process for future space transportation involves understanding multiple variables and their effect on lifecycle metrics. Variables such as technology readiness or potential environmental impact are qualitative, while variables such as reliability, operations costs or flight rates are quantitative. In deciding what new design concepts to fund, NASA needs a methodology that would assess the sum total of all relevant qualitative and quantitative lifecycle metrics resulting from each proposed concept. The objective of this research was to review the state of operations assessment methodologies and models used to evaluate proposed space transportation systems and to develop recommendations for improving them. It was found that, compared to the models available from other sources, the operations assessment methodology recently developed at Kennedy Space Center has the potential to produce a decision support tool that will serve as the industry standard. Towards that goal, a number of areas of improvement in the Kennedy Space Center's methodology are identified.

  15. Implementation of a formulary management process.

    PubMed

    Karel, Lauren I; Delisle, Dennis R; Anagnostis, Ellena A; Wordell, Cindy J

    2017-08-15

    The application of lean methodology in an initiative to redesign the formulary maintenance process at an academic medical center is described. Maintaining a hospital formulary requires clear communication and coordination among multiple members of the pharmacy department. Using principles of lean methodology, pharmacy department personnel within a multihospital health system launched a multifaceted initiative to optimize formulary management systemwide. The ongoing initiative began with creation of a formulary maintenance redesign committee consisting of pharmacy department personnel with expertise in informatics, automation, purchasing, drug information, and clinical pharmacy services. The committee met regularly and used lean methodology to design a standardized process for management of formulary additions and deletions and changes to medications' formulary status. Through value stream analysis, opportunities for process and performance improvement were identified; staff suggestions on process streamlining were gathered during a series of departmental kaizen events. A standardized template for development and dissemination of monographs associated with formulary additions and status changes was created. In addition, a shared Web-based checklist was developed to facilitate information sharing and timely initiation and completion of tasks involved in formulary status changes, and a permanent formulary maintenance committee was established to monitor and refine the formulary management process. A clearly defined, standardized process within the pharmacy department was developed for tracking necessary steps in enacting formulary changes to encourage safe and efficient workflow. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  16. Participatory Development and Analysis of a Fuzzy Cognitive Map of the Establishment of a Bio-Based Economy in the Humber Region

    PubMed Central

    Penn, Alexandra S.; Knight, Christopher J. K.; Lloyd, David J. B.; Avitabile, Daniele; Kok, Kasper; Schiller, Frank; Woodward, Amy; Druckman, Angela; Basson, Lauren

    2013-01-01

    Fuzzy Cognitive Mapping (FCM) is a widely used participatory modelling methodology in which stakeholders collaboratively develop a ‘cognitive map’ (a weighted, directed graph), representing the perceived causal structure of their system. This can be directly transformed by a workshop facilitator into simple mathematical models to be interrogated by participants by the end of the session. Such simple models provide thinking tools which can be used for discussion and exploration of complex issues, as well as sense checking the implications of suggested causal links. They increase stakeholder motivation and understanding of whole systems approaches, but cannot be separated from an intersubjective participatory context. Standard FCM methodologies make simplifying assumptions, which may strongly influence results, presenting particular challenges and opportunities. We report on a participatory process, involving local companies and organisations, focussing on the development of a bio-based economy in the Humber region. The initial cognitive map generated consisted of factors considered key for the development of the regional bio-based economy and their directional, weighted, causal interconnections. A verification and scenario generation procedure, to check the structure of the map and suggest modifications, was carried out with a second session. Participants agreed on updates to the original map and described two alternate potential causal structures. In a novel analysis all map structures were tested using two standard methodologies usually used independently: linear and sigmoidal FCMs, demonstrating some significantly different results alongside some broad similarities. We suggest a development of FCM methodology involving a sensitivity analysis with different mappings and discuss the use of this technique in the context of our case study. Using the results and analysis of our process, we discuss the limitations and benefits of the FCM methodology in this case and in general. We conclude by proposing an extended FCM methodology, including multiple functional mappings within one participant-constructed graph. PMID:24244303
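
    The two mappings compared in the analysis can be reproduced in a few lines: the state vector is repeatedly combined with the weighted adjacency matrix and passed through either a linear (clipped) or a sigmoidal squashing function. The three-factor map below is invented for illustration; it is not the Humber-region map:

    ```python
    # Hedged sketch of FCM iteration under linear vs. sigmoidal mappings.
    # The example weight matrix and factor names are illustrative.
    import numpy as np

    W = np.array([  # W[i, j]: causal weight of factor i on factor j
        [0.0,  0.6,  0.4],   # feedstock supply
        [0.0,  0.0,  0.7],   # refinery capacity
        [-0.3, 0.2,  0.0],   # bio-based output
    ])

    def step_linear(x):
        return np.clip(x + x @ W, -1, 1)             # linear map, clipped to [-1, 1]

    def step_sigmoid(x, lam=2.0):
        return 1.0 / (1.0 + np.exp(-lam * (x @ W)))  # sigmoidal map into (0, 1)

    x = np.array([0.5, 0.2, 0.0])  # initial activation scenario
    for step in (step_linear, step_sigmoid):
        state = x.copy()
        for _ in range(30):        # iterate toward a fixed point or cycle
            state = step(state)
        print(step.__name__, state.round(3))
    ```

    Running both mappings on the same map, as done in the paper's sensitivity analysis, makes it easy to see where the qualitative conclusions agree and where they depend on the choice of squashing function.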

  17. Methodologic quality and relevance of references in pharmaceutical advertisements in a Canadian medical journal.

    PubMed

    Lexchin, J; Holbrook, A

    1994-07-01

    To evaluate the methodologic quality and relevance of references in pharmaceutical advertisements in the Canadian Medical Association Journal (CMAJ). Analytic study. All 114 references cited in the first 22 distinct pharmaceutical advertisements in volume 146 of CMAJ. Mean methodologic quality score (modified from the 6-point scale used to assess articles in the American College of Physicians' Journal Club) and mean relevance score (based on a new 5-point scale) for all references in each advertisement. Twenty of the 22 companies responded, sending 78 (90%) of the 87 references requested. The mean methodologic quality score was 58% (95% confidence limits [CL] 51% and 65%) and the mean relevance score 76% (95% CL 72% and 80%). The two mean scores were statistically lower than the acceptable score of 80% (p < 0.05), and the methodologic quality score was outside the preset clinically significant difference of 15%. The poor rating for methodologic quality was primarily because of the citation of references to low-quality review articles and "other" sources (i.e., other than reports of clinical trials). Half of the advertisements had a methodologic quality score of less than 65%, but only five had a relevance score of less than 65%. Although the relevance of most of the references was within minimal acceptable limits, the methodologic quality was often unacceptable. Because advertisements are an important part of pharmaceutical marketing and education, we suggest that companies develop written standards for their advertisements and monitor their advertisements for adherence to these standards. We also suggest that the Pharmaceutical Advertising Advisory Board develop more stringent guidelines for advertising and that it enforce these guidelines in a consistent, rigorous fashion.

  18. Methodologic quality and relevance of references in pharmaceutical advertisements in a Canadian medical journal.

    PubMed Central

    Lexchin, J; Holbrook, A

    1994-01-01

    OBJECTIVE: To evaluate the methodologic quality and relevance of references in pharmaceutical advertisements in the Canadian Medical Association Journal (CMAJ). DESIGN: Analytic study. DATA SOURCE: All 114 references cited in the first 22 distinct pharmaceutical advertisements in volume 146 of CMAJ. MAIN OUTCOME MEASURES: Mean methodologic quality score (modified from the 6-point scale used to assess articles in the American College of Physicians' Journal Club) and mean relevance score (based on a new 5-point scale) for all references in each advertisement. MAIN RESULTS: Twenty of the 22 companies responded, sending 78 (90%) of the 87 references requested. The mean methodologic quality score was 58% (95% confidence limits [CL] 51% and 65%) and the mean relevance score 76% (95% CL 72% and 80%). The two mean scores were statistically lower than the acceptable score of 80% (p < 0.05), and the methodologic quality score was outside the preset clinically significant difference of 15%. The poor rating for methodologic quality was primarily because of the citation of references to low-quality review articles and "other" sources (i.e., other than reports of clinical trials). Half of the advertisements had a methodologic quality score of less than 65%, but only five had a relevance score of less than 65%. CONCLUSIONS: Although the relevance of most of the references was within minimal acceptable limits, the methodologic quality was often unacceptable. Because advertisements are an important part of pharmaceutical marketing and education, we suggest that companies develop written standards for their advertisements and monitor their advertisements for adherence to these standards. We also suggest that the Pharmaceutical Advertising Advisory Board develop more stringent guidelines for advertising and that it enforce these guidelines in a consistent, rigorous fashion. PMID:8004560

  19. A design and implementation methodology for diagnostic systems

    NASA Technical Reports Server (NTRS)

    Williams, Linda J. F.

    1988-01-01

    A methodology for the design and implementation of diagnostic systems is presented. Also discussed are the advantages of embedding a diagnostic system in a host system environment. The methodology utilizes an architecture for diagnostic system development that is hierarchical and makes use of object-oriented representation techniques. Additionally, qualitative models are used to describe the host system components and their behavior. The architecture includes a diagnostic engine that uses heuristic knowledge to control the sequence of diagnostic reasoning. The methodology provides an integrated approach to the development of diagnostic system requirements that is more rigorous than standard systems engineering techniques. The advantages of using this methodology during various life cycle phases of the host system (e.g., the National Aerospace Plane (NASP)) include: the capability to analyze diagnostic instrumentation requirements during the host system design phase, a ready software architecture for implementation of diagnostics in the host system, and the opportunity to analyze instrumentation for failure coverage in safety-critical host system operations.
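
    A minimal sketch of the described architecture: components represented as objects with qualitative behaviour, arranged hierarchically, and a diagnostic engine that orders its checks heuristically. The class names, the failure-likelihood heuristic, and the toy observations are assumptions for illustration:

    ```python
    # Hedged sketch of an object-oriented, hierarchical diagnostic architecture.
    # Component models and the heuristic ordering are illustrative.

    class Component:
        def __init__(self, name, failure_likelihood, children=()):
            self.name = name
            self.failure_likelihood = failure_likelihood  # heuristic prior
            self.children = list(children)

        def behaviour_ok(self, observations):
            """Qualitative model: component is nominal unless flagged anomalous."""
            return observations.get(self.name, "nominal") == "nominal"

    def diagnose(component, observations):
        """Heuristically ordered, hierarchical fault isolation."""
        if component.behaviour_ok(observations):
            return []
        if not component.children:
            return [component.name]
        faults = []
        # Heuristic: inspect the most failure-prone subsystems first.
        for child in sorted(component.children,
                            key=lambda c: c.failure_likelihood, reverse=True):
            faults += diagnose(child, observations)
        return faults or [component.name]

    pump = Component("fuel pump", 0.3)
    valve = Component("fuel valve", 0.1)
    feed = Component("fuel feed system", 0.2, [pump, valve])
    obs = {"fuel feed system": "anomalous", "fuel pump": "anomalous"}
    print(diagnose(feed, obs))  # -> ['fuel pump']
    ```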

  20. The use of concept maps during knowledge elicitation in ontology development processes – the nutrigenomics use case

    PubMed Central

    Castro, Alexander Garcia; Rocca-Serra, Philippe; Stevens, Robert; Taylor, Chris; Nashar, Karim; Ragan, Mark A; Sansone, Susanna-Assunta

    2006-01-01

    Background: Incorporation of ontologies into annotations has enabled 'semantic integration' of complex data, making explicit the knowledge within a certain field. One of the major bottlenecks in developing bio-ontologies is the lack of a unified methodology. Different methodologies have been proposed for different scenarios, but there is no agreed-upon standard methodology for building ontologies. The involvement of geographically distributed domain experts, the need for domain experts to lead the design process, the application of the ontologies and the life cycles of bio-ontologies are amongst the features not considered by previously proposed methodologies. Results: Here, we present a methodology for developing ontologies within the biological domain. We describe our scenario, competency questions, results and milestones for each methodological stage. We introduce the use of concept maps during knowledge acquisition phases as a feasible transition between domain expert and knowledge engineer. Conclusion: The contributions of this paper are the thorough description of the steps we suggest when building an ontology, example use of concept maps, consideration of applicability to the development of lower-level ontologies and application to decentralised environments. We have found that within our scenario conceptual maps played an important role in the development process. PMID:16725019

  1. Classifying E-Trainer Standards

    ERIC Educational Resources Information Center

    Julien, Anne

    2005-01-01

    Purpose: To set up a classification of the types of profiles and competencies that are required to set up a good e-learning programme. This approach provides a framework within which a set of standards can be defined for e-trainers. Design/methodology/approach: Open and distance learning (ODL) has been developing in Europe, due to new tools in…

  2. The standard of healthcare accreditation standards: a review of empirical research underpinning their development and impact

    PubMed Central

    2012-01-01

    Background Healthcare accreditation standards are advocated as an important means of improving clinical practice and organisational performance. Standard development agencies have documented methodologies to promote open, transparent, inclusive development processes where standards are developed by members. They assert that their methodologies are effective and efficient at producing standards appropriate for the health industry. However, the evidence to support these claims requires scrutiny. The study’s purpose was to examine the empirical research that grounds the development methods and application of healthcare accreditation standards. Methods A multi-method strategy was employed over the period March 2010 to August 2011. Five academic health research databases (Medline, Psych INFO, Embase, Social work abstracts, and CINAHL) were interrogated, the websites of 36 agencies associated with the study topic were investigated, and a snowball search was undertaken. Search criteria included accreditation research studies, in English, addressing standards and their impact. Searching in stage 1 initially selected 9386 abstracts. In stage 2, this selection was refined against the inclusion criteria; empirical studies (n = 2111) were identified and refined to a selection of 140 papers with the exclusion of clinical or biomedical and commentary pieces. These were independently reviewed by two researchers and reduced to 13 articles that met the study criteria. Results The 13 articles were analysed according to four categories: overall findings; standards development; implementation issues; and impact of standards. Studies have only occurred in the acute care setting, predominately in 2003 (n = 5) and 2009 (n = 4), and in the United States (n = 8). A multidisciplinary focus (n = 9) and mixed method approach (n = 11) are common characteristics. Three interventional studies were identified, with the remaining 10 studies having research designs to investigate clinical or organisational impacts. No study directly examined standards development or other issues associated with their progression. Only one study noted implementation issues, identifying several enablers and barriers. Standards were reported to improve organisational efficiency and staff circumstances. However, the impact on clinical quality was mixed, with both improvements and a lack of measurable effects recorded. Conclusion Standards are ubiquitous within healthcare and are generally considered to be an important means by which to improve clinical practice and organisational performance. However, there is a lack of robust empirical evidence examining the development, writing, implementation and impacts of healthcare accreditation standards. PMID:22995152

  3. The standard of healthcare accreditation standards: a review of empirical research underpinning their development and impact.

    PubMed

    Greenfield, David; Pawsey, Marjorie; Hinchcliff, Reece; Moldovan, Max; Braithwaite, Jeffrey

    2012-09-20

    Healthcare accreditation standards are advocated as an important means of improving clinical practice and organisational performance. Standard development agencies have documented methodologies to promote open, transparent, inclusive development processes where standards are developed by members. They assert that their methodologies are effective and efficient at producing standards appropriate for the health industry. However, the evidence to support these claims requires scrutiny. The study's purpose was to examine the empirical research that grounds the development methods and application of healthcare accreditation standards. A multi-method strategy was employed over the period March 2010 to August 2011. Five academic health research databases (Medline, Psych INFO, Embase, Social work abstracts, and CINAHL) were interrogated, the websites of 36 agencies associated with the study topic were investigated, and a snowball search was undertaken. Search criteria included accreditation research studies, in English, addressing standards and their impact. Searching in stage 1 initially selected 9386 abstracts. In stage 2, this selection was refined against the inclusion criteria; empirical studies (n = 2111) were identified and refined to a selection of 140 papers with the exclusion of clinical or biomedical and commentary pieces. These were independently reviewed by two researchers and reduced to 13 articles that met the study criteria. The 13 articles were analysed according to four categories: overall findings; standards development; implementation issues; and impact of standards. Studies have only occurred in the acute care setting, predominately in 2003 (n = 5) and 2009 (n = 4), and in the United States (n = 8). A multidisciplinary focus (n = 9) and mixed method approach (n = 11) are common characteristics. Three interventional studies were identified, with the remaining 10 studies having research designs to investigate clinical or organisational impacts. No study directly examined standards development or other issues associated with their progression. Only one study noted implementation issues, identifying several enablers and barriers. Standards were reported to improve organisational efficiency and staff circumstances. However, the impact on clinical quality was mixed, with both improvements and a lack of measurable effects recorded. Standards are ubiquitous within healthcare and are generally considered to be an important means by which to improve clinical practice and organisational performance. However, there is a lack of robust empirical evidence examining the development, writing, implementation and impacts of healthcare accreditation standards.

  4. Recent developments in photovoltaic energy by ERDA/NASA-LeRC

    NASA Technical Reports Server (NTRS)

    Deyo, J. N.

    1977-01-01

    Application development activities were designed to stimulate the market for photovoltaics so that as costs are reduced there will be an increasing market demand to encourage the expansion of industrial solar array production capacity. Supporting these application development activities are tasks concerned with: (1) establishing standards and methodology for terrestrial solar cell calibration; (2) conducting standard and diagnostic measurements on solar cells and modules; and (3) conducting real time and accelerated testing of solar cell modules and materials of construction under outdoor sunlight conditions.

  5. Integrating Susceptibility into Environmental Policy: An Analysis of the National Ambient Air Quality Standard for Lead

    PubMed Central

    Chari, Ramya; Burke, Thomas A.; White, Ronald H.; Fox, Mary A.

    2012-01-01

    Susceptibility to chemical toxins has not been adequately addressed in risk assessment methodologies. As a result, environmental policies may fail to meet their fundamental goal of protecting the public from harm. This study examines how characterization of risk may change when susceptibility is explicitly considered in policy development; in particular we examine the process used by the U.S. Environmental Protection Agency (EPA) to set a National Ambient Air Quality Standard (NAAQS) for lead. To determine a NAAQS, EPA estimated air lead-related decreases in child neurocognitive function through a combination of multiple data elements including concentration-response (CR) functions. In this article, we present alternative scenarios for determining a lead NAAQS using CR functions developed in populations more susceptible to lead toxicity due to socioeconomic disadvantage. The use of CR functions developed in susceptible groups resulted in cognitive decrements greater than original EPA estimates. EPA’s analysis suggested that a standard level of 0.15 µg/m3 would fulfill decision criteria, but by incorporating susceptibility we found that options for the standard could reasonably be extended to lower levels. The use of data developed in susceptible populations would result in the selection of a more protective NAAQS under the same decision framework applied by EPA. Results are used to frame discussion regarding why cumulative risk assessment methodologies are needed to help inform policy development. PMID:22690184
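
    The comparison driving the analysis can be sketched simply: an air-lead-related cognitive decrement is estimated by multiplying an assumed air-to-blood conversion by a concentration-response slope, and the slope is steeper for the susceptible group. All numbers below are invented placeholders, not EPA's or the study's values:

    ```python
    # Hedged sketch: air-lead-related cognitive decrement under two CR functions.
    # Slopes, the air-to-blood factor, and the decision criterion are placeholders.

    AIR_TO_BLOOD = 5.0     # ug/dL blood lead per ug/m^3 air lead (assumed)
    SLOPES = {             # IQ points lost per ug/dL blood lead (assumed)
        "general population": 0.5,
        "susceptible subgroup": 0.9,
    }
    CANDIDATE_STANDARDS = [0.15, 0.10, 0.05]   # ug/m^3
    MAX_ALLOWED_LOSS = 0.5                     # IQ points (assumed criterion)

    for std in CANDIDATE_STANDARDS:
        for group, slope in SLOPES.items():
            loss = std * AIR_TO_BLOOD * slope
            verdict = "meets" if loss <= MAX_ALLOWED_LOSS else "exceeds"
            print(f"{std} ug/m^3, {group}: {loss:.2f} IQ points ({verdict} criterion)")
    ```

    Under these placeholder numbers, a level that satisfies the criterion for the general population exceeds it for the susceptible subgroup, which is the structural point the study makes about extending the standard to lower levels.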

  6. How to Select the most Relevant Roughness Parameters of a Surface: Methodology Research Strategy

    NASA Astrophysics Data System (ADS)

    Bobrovskij, I. N.

    2018-01-01

    In this paper, the foundations for creating a new methodology are considered, one that addresses the conflict between the huge number of surface texture parameters introduced by new standards and the much smaller number of parameters that can practically be measured; reducing this set lowers measurement complexity. At the moment, there is no single assessment of the importance of the parameters. Applying the presented methodology to the surfaces of aerospace-cluster components makes it possible to create the necessary foundation, to develop a scientific estimation of surface texture parameters, and to obtain material for investigators of the chosen technological procedure. The methods needed for further work are selected, creating a fundamental basis for developing the assessment of the significance of microgeometry parameters as a scientific direction.

  7. Weighted Ensemble Simulation: Review of Methodology, Applications, and Software

    PubMed Central

    Zuckerman, Daniel M.; Chong, Lillian T.

    2018-01-01

    The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling—the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes—protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation. PMID:28301772
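
    The essential WE bookkeeping - splitting and merging weighted walkers within bins so that total weight is conserved - fits in a short sketch. The bin edges, target counts, and toy one-dimensional dynamics are illustrative assumptions, not any WE package's defaults:

    ```python
    # Hedged sketch of one weighted-ensemble resampling step.
    # Toy 1-D dynamics; bins and target walker counts are illustrative.
    import random

    TARGET_PER_BIN = 2
    BINS = [(-1e9, 0.0), (0.0, 1.0), (1.0, 1e9)]  # progress-coordinate bins

    def resample(walkers):
        """walkers: list of (position, weight). Split/merge to TARGET_PER_BIN
        walkers per occupied bin, conserving total weight within each bin."""
        new = []
        for lo, hi in BINS:
            group = [w for w in walkers if lo <= w[0] < hi]
            while group and len(group) < TARGET_PER_BIN:        # split heaviest
                x, wt = max(group, key=lambda w: w[1])
                group.remove((x, wt))
                group += [(x, wt / 2), (x, wt / 2)]
            while len(group) > TARGET_PER_BIN:                  # merge lightest pair
                group.sort(key=lambda w: w[1])
                (x1, w1), (x2, w2) = group[0], group[1]
                keep = x1 if random.random() < w1 / (w1 + w2) else x2
                group = [(keep, w1 + w2)] + group[2:]
            new += group
        return new

    walkers = [(0.1, 0.25)] * 4                      # initial ensemble, total weight 1
    for _ in range(5):
        walkers = [(x + random.gauss(0.05, 0.3), w) for x, w in walkers]  # dynamics
        walkers = resample(walkers)
    print(f"total weight: {sum(w for _, w in walkers):.3f}")  # conserved at 1.000
    ```

    The merge rule - keeping one position with probability proportional to its weight - is what preserves unbiased averages, which is the property that lets WE recover rate constants and state populations without reweighting.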

  8. Weighted Ensemble Simulation: Review of Methodology, Applications, and Software.

    PubMed

    Zuckerman, Daniel M; Chong, Lillian T

    2017-05-22

    The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling-the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes-protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation.

  9. Breaking the Link between Environmental Degradation and Oil Palm Expansion: A Method for Enabling Sustainable Oil Palm Expansion

    PubMed Central

    Smit, Hans Harmen; Meijaard, Erik; van der Laan, Carina; Mantel, Stephan; Budiman, Arif; Verweij, Pita

    2013-01-01

    Land degradation is a global concern. In tropical areas it primarily concerns the conversion of forest into non-forest lands and the associated losses of environmental services. Defining such degradation is not straightforward, hampering both effective reduction of degradation and the use of already degraded lands for more productive purposes. To facilitate the processes of avoided degradation and land rehabilitation, we have developed a methodology in which we have used international environmental and social sustainability standards to determine the suitability of lands for sustainable agricultural expansion. The method was developed and tested in one of the frontiers of agricultural expansion, West Kalimantan province in Indonesia. The focus was on oil palm expansion, which is considered a major driver of deforestation in tropical regions globally. The results suggest that substantial changes in current land-use planning are necessary for most new plantations to comply with international sustainability standards. Through visualizing options for sustainable expansion with our methodology, we demonstrate that the link between oil palm expansion and degradation can be broken. Application of the methodology with criteria and thresholds similar to ours could help the Indonesian government and the industry to achieve its pro-growth, pro-job, pro-poor and pro-environment development goals. For sustainable agricultural production, context-specific guidance has to be developed in areas suitable for expansion. Our methodology can serve as a template for designing such commodity- and country-specific tools and deliver such guidance. PMID:24039700

  10. Methodological Issues of Sample Collection and Analysis of Exhaled Breath

    EPA Science Inventory

    Recommended standardized procedures have been developed for measurement of exhaled lower respiratory nitric oxide (NO) and nasal NO. It would be desirable to develop similar guidelines for the sampling of exhaled breath related to other compounds. For such systemic volatile o...

  11. Decision analysis to complete diagnostic research by closing the gap between test characteristics and cost-effectiveness.

    PubMed

    Schaafsma, Joanna D; van der Graaf, Yolanda; Rinkel, Gabriel J E; Buskens, Erik

    2009-12-01

    The lack of a standard methodology in diagnostic research impedes adequate evaluation before implementation of constantly developing diagnostic techniques. We discuss the methodology of diagnostic research and underscore the relevance of decision analysis in the process of evaluation of diagnostic tests. Overview and conceptual discussion. Diagnostic research requires a stepwise approach comprising assessment of test characteristics followed by evaluation of added value, clinical outcome, and cost-effectiveness. These multiple goals are generally incompatible with a randomized design. Decision-analytic models provide an important alternative through integration of the best available evidence. Thus, critical assessment of clinical value and efficient use of resources can be achieved. Decision-analytic models should be considered part of the standard methodology in diagnostic research. They can serve as a valid alternative to diagnostic randomized clinical trials (RCTs).

  12. MOTOR VEHICLE SAFETY: NHTSA’s Ability to Detect and Recall Defective Replacement Crash Parts Is Limited

    DTIC Science & Technology

    2001-01-01

    incorporate airbags, under the used vehicle provision. NHTSA has not developed such standards because it has not identified significant problems with... might incorporate airbags. NHTSA has not developed such standards because it has not identified significant problems with occupant restraint systems... Appendix I: Scope and Methodology; Appendix II: State Legislation Governing Aftermarket Crash Parts and Recycled Airbags; Figures: Figure 1: Replacement

  13. A Survey Examining Photopatch Test and Phototest Methodologies of Contact Dermatologists in the United States: Platform for Developing A Consensus.

    PubMed

    Asemota, Eseosa; Crawford, Glen; Kovarik, Carrie; Brod, Bruce A

    There is currently no standardized protocol for photopatch testing and phototesting in the United States. Certain testing parameters (such as chemicals tested, time between test application and irradiation, and time of final interpretation) vary from provider to provider. These variations may impact comparability and consistency of test results. The goal of our survey-based study was to outline the photopatch test and phototest protocols used by US contact dermatologists. The information obtained will aid in the development of a national consensus on testing methodologies. Based on a literature search conducted on differences in testing methodologies, we constructed a questionnaire. The survey was distributed at the American Contact Dermatitis Society annual meeting and via the American Contact Dermatitis Society Web site. Standard descriptive analysis was performed on data obtained. Of the 800 dermatologists contacted, 117 agreed to participate in the survey. Among these respondents, 64 (54.8%) conduct photopatch testing. Results of the survey are presented, and they confirm that a variety of techniques and testing materials are used. It would be beneficial to enlist a panel of expert contact dermatologists to create, by formal consensus and drawing on these research findings, a standard photopatch test protocol for use in this country.

  14. Advanced biosensing methodologies developed for evaluating performance quality and safety of emerging biophotonics technologies and medical devices (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Ilev, Ilko K.; Walker, Bennett; Calhoun, William; Hassan, Moinuddin

    2016-03-01

    Biophotonics is an emerging field in modern biomedical technology that has opened up new horizons for transfer of state-of-the-art techniques from the areas of lasers, fiber optics and biomedical optics to the life sciences and medicine. This field continues to vastly expand with advanced developments across the entire spectrum of biomedical applications ranging from fundamental "bench" laboratory studies to clinical patient "bedside" diagnostics and therapeutics. However, in order to translate these technologies to clinical device applications, the scientific and industrial communities and the FDA are facing the requirement for a thorough evaluation and review of laser radiation safety and efficacy concerns. In many cases, however, the review process is complicated due to the lack of effective means and standard test methods to precisely analyze safety and effectiveness of some of the newly developed biophotonics techniques and devices. There is, therefore, an immediate public health need for new test protocols, guidance documents and standard test methods to precisely evaluate fundamental characteristics, performance quality and safety of these technologies and devices. Here, we will review our recent development of novel test methodologies for safety and efficacy evaluation of some emerging biophotonics technologies and medical devices. These methodologies are based on integrating the advanced features of state-of-the-art optical sensor technologies and approaches such as high-resolution fiber-optic sensing, confocal and optical coherence tomography imaging, and infrared spectroscopy. The presentation will also illustrate some methodologies developed and implemented for testing intraocular lens implants, biochemical contaminations of medical devices, ultrahigh-resolution nanoscopy, and femtosecond laser therapeutics.

  15. Introduction to a special issue on concept mapping.

    PubMed

    Trochim, William M; McLinden, Daniel

    2017-02-01

    Concept mapping was developed in the 1980s as a unique integration of qualitative (group process, brainstorming, unstructured sorting, interpretation) and quantitative (multidimensional scaling, hierarchical cluster analysis) methods designed to enable a group of people to articulate and depict graphically a coherent conceptual framework or model of any topic or issue of interest. This introduction provides the basic definition and description of the methodology for the newcomer and describes the steps typically followed in its most standard canonical form (preparation, generation, structuring, representation, interpretation and utilization). It also introduces this special issue which reviews the history of the methodology, describes its use in a variety of contexts, shows the latest ways it can be integrated with other methodologies, considers methodological advances and developments, and sketches a vision of the future of the method's evolution. Copyright © 2016 Elsevier Ltd. All rights reserved.
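    For readers new to the quantitative half of the method, the sketch below shows how a participant co-sorting matrix might be turned into a two-dimensional point map (multidimensional scaling) and then into concept clusters (hierarchical clustering). The co-sort matrix, library choices, and cluster count are assumptions for illustration, not part of the special issue.

    ```python
    import numpy as np
    from sklearn.manifold import MDS
    from scipy.cluster.hierarchy import linkage, fcluster

    # Hypothetical co-sort matrix: entry [i][j] = number of participants
    # who placed statements i and j in the same pile (5 statements, 10 sorters).
    cosort = np.array([
        [10, 7, 6, 1, 0],
        [ 7, 10, 8, 2, 1],
        [ 6, 8, 10, 1, 2],
        [ 1, 2, 1, 10, 9],
        [ 0, 1, 2, 9, 10],
    ])
    distance = cosort.max() - cosort  # rarely co-sorted => far apart

    # Two-dimensional point map via multidimensional scaling
    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(distance)

    # Hierarchical clustering of the map into concept clusters
    clusters = fcluster(linkage(coords, method="ward"), t=2, criterion="maxclust")
    print(coords.round(2), clusters)
    ```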

  16. Listening to Students: Customer Journey Mapping at Birmingham City University Library and Learning Resources

    ERIC Educational Resources Information Center

    Andrews, Judith; Eade, Eleanor

    2013-01-01

    Birmingham City University's Library and Learning Resources' strategic aim is to improve student satisfaction. A key element is the achievement of the Customer Excellence Standard. An important component of the standard is the mapping of services to improve quality. Library and Learning Resources has developed a methodology to map these…

  17. Standard area diagrams for aiding severity estimation scientometrics, pathosystems and methodological trends in the last 25 years

    USDA-ARS?s Scientific Manuscript database

    Standard area diagrams (SADs) have long been used as a tool to aid the estimation of plant disease severity, an essential variable in phytopathometry. Formal validation of SADs was not considered prior to the early 1990s, when considerable effort began to be invested developing SADs and assessing th...

  18. Implementing the PAIN RelieveIt Randomized Controlled Trial in Hospice Care: Mechanisms for Success and Meeting PCORI Methodology Standards.

    PubMed

    Ezenwa, Miriam O; Suarez, Marie L; Carrasco, Jesus D; Hipp, Theresa; Gill, Anayza; Miller, Jacob; Shea, Robert; Shuey, David; Zhao, Zhongsheng; Angulo, Veronica; McCurry, Timothy; Martin, Joanna; Yao, Yingwei; Molokie, Robert E; Wang, Zaijie Jim; Wilkie, Diana J

    2017-07-01

    The purpose of this article is to describe how we adhere to the Patient-Centered Outcomes Research Institute's (PCORI) methodology standards relevant to the design and implementation of our PCORI-funded study, the PAIN RelieveIt Trial. We present details of the PAIN RelieveIt Trial organized by the PCORI methodology standards and components that are relevant to our study. The PAIN RelieveIt Trial adheres to four PCORI standards and 21 subsumed components. The four standards include standards for formulating research questions, standards associated with patient centeredness, standards for data integrity and rigorous analyses, and standards for preventing and handling missing data. In the past 24 months, we screened 2,837 cancer patients and their caregivers; 874 dyads were eligible; 223.5 dyads consented and provided baseline data. Only 55 patients were lost to follow-up, a 25% attrition rate. The design and implementation of the PAIN RelieveIt Trial adhered to PCORI's methodology standards for research rigor.
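    As a quick consistency check on the reported figures, taking the stated 223.5 consented dyads at face value:

    \[
    \frac{55}{223.5} \approx 0.246 \approx 25\%,
    \]

    which matches the stated attrition rate.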

  19. Nanoparticle standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Havrilla, George Joseph

    2016-12-08

    We will purchase a COTS materials printer and adapt it for solution printing of known elemental concentration solutions. A methodology will be developed to create deposits of known mass in known locations on selected substrates. The deposits will be characterized for deposited mass, physical morphology, thickness and uniformity. Once an acceptable methodology has been developed and validated, we will create round robin samples to be characterized by LGSIMS instruments at LANL, PNNL and NIST. We will demonstrate the feasibility of depositing nanoparticles in known masses with the goal of creating separated nanoparticles in known locations.

  20. A comprehensive education plan: the key to a successful Joint Commission on Accreditation of Healthcare Organizations survey.

    PubMed

    Thurber, Raymond; Read, Linda Eklof

    2008-01-01

    This article describes how education specialists from a 359-bed acute care hospital in the Northeast developed and implemented a comprehensive educational plan to prepare all staff members on the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) tracer methodology and upcoming triennial survey. This methodology can be utilized by staff development educators in any setting not only to prepare staff members for a successful JCAHO survey but also to meet or exceed JCAHO standards in their everyday jobs.

  1. Characteristics of a semi-custom library development system

    NASA Technical Reports Server (NTRS)

    Yancey, M.; Cannon, R.

    1990-01-01

    Standard cell and gate array macro libraries are in common use with workstation computer-aided design (CAD) tools for application-specific integrated circuit (ASIC) semi-custom applications and have resulted in significant improvements in overall design efficiency as contrasted with custom design methodologies. Similar design methodology enhancements providing for the efficient development of the library cells themselves are an important factor in responding to the need for continuous technology improvement. The characteristics of a library development system that provides design flexibility and productivity enhancements for the library development engineer working in state-of-the-art process technologies are presented. An overview of Gould's library development system ('Accolade') is also presented.

  2. Medical Differential Diagnosis (MDD) as the Architectural Framework for a Knowledge Model: A Vulnerability Detection and Threat Identification Methodology for Cyber-Crime and Cyber-Terrorism

    ERIC Educational Resources Information Center

    Conley-Ware, Lakita D.

    2010-01-01

    This research addresses a real world cyberspace problem, where currently no cross industry standard methodology exists. The goal is to develop a model for identification and detection of vulnerabilities and threats of cyber-crime or cyber-terrorism where cyber-technology is the vehicle to commit the criminal or terrorist act (CVCT). This goal was…

  3. The Development of Standard Operating Procedures for Social Mobilization and Community Engagement in Sierra Leone During the West Africa Ebola Outbreak of 2014-2015.

    PubMed

    Pedi, Danielle; Gillespie, Amaya; Bedson, Jamie; Jalloh, Mohamed F; Jalloh, Mohammad B; Kamara, Alusine; Bertram, Kathryn; Owen, Katharine; Jalloh, Mohamed A; Conte, Lansana

    2017-01-01

    This article describes the development of standard operating procedures (SOPs) for social mobilization and community engagement (SM/CE) in Sierra Leone during the Ebola outbreak of 2014-2015. It aims to (a) explain the rationale for a standardized approach, (b) describe the methodology used to develop the resulting SOPs, and (c) discuss the implications of the SOPs for future outbreak responses. Mixed methodologies were applied, including analysis of data on Ebola-related knowledge, attitudes, and practices; consultation through a national forum; and a series of workshops with more than 250 participants active in SM/CE in seven districts with recent confirmed cases. Specific challenges, best practices, and operational models were identified in relation to (a) the quality of SM/CE approaches; (b) coordination and operational structures; and (c) integration with Ebola services, including case management, burials, quarantine, and surveillance. This information was synthesized and codified into the SOPs, which include principles, roles, and actions for partners engaging in SM/CE as part of the Ebola response. This experience points to the need for a set of global principles and standards for meaningful SM/CE that can be rapidly adapted as a high-priority response component at the outset of future health and humanitarian crises.

  4. A probabilistic assessment of health risks associated with short-term exposure to tropospheric ozone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitfield, R.G; Biller, W.F.; Jusko, M.J.

    1996-06-01

    The work described in this report is part of a larger risk assessment sponsored by the U.S. Environmental Protection Agency. Earlier efforts developed exposure-response relationships for acute health effects among populations engaged in heavy exertion. Those efforts also developed a probabilistic national ambient air quality standards exposure model and a general methodology for integrating probabilistic exposure-response relationships and exposure estimates to calculate overall risk results. Recently published data make it possible to model additional health endpoints (for exposure at moderate exertion), including hospital admissions. New air quality and exposure estimates for alternative national ambient air quality standards for ozone are combined with exposure-response models to produce the risk results for hospital admissions and acute health effects. Sample results explain the methodology and introduce risk output formats.
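    A minimal sketch of the kind of risk integration described above, combining a discretized exposure distribution with an exposure-response relationship, might look as follows; every number below is an invented placeholder, not an EPA estimate.

    ```python
    import numpy as np

    # Hypothetical discretized exposure distribution and an assumed
    # exposure-response curve for one health endpoint; placeholders only.
    exposure_levels = np.array([0.06, 0.08, 0.10, 0.12])    # ozone, ppm
    p_exposure      = np.array([0.50, 0.30, 0.15, 0.05])    # fraction of person-days
    p_response      = np.array([0.001, 0.004, 0.012, 0.030])# risk given exposure

    population_at_risk = 1_000_000  # exposed population (assumed)

    # Overall risk = sum over levels of P(exposure) * P(response | exposure)
    expected_cases = population_at_risk * np.sum(p_exposure * p_response)
    print(f"Expected cases: {expected_cases:.0f}")
    ```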

  5. A Venezuelan Experience: Professional Development for Teachers, Meaningful Activities for Students.

    ERIC Educational Resources Information Center

    LeLoup, Jean W.; Schmidt-Rinehart, Barbara C.

    2003-01-01

    Presents a model of professional development that is suited to the inservice Spanish teacher with limited time and financial resources. Details a summer program for Spanish teachers in Venezuela that combines an immersion experience with an advanced methodology course emphasizing a standards-based approach to curriculum development. (Author/VWL)

  6. An Analysis and Plan of Test Development for the Law Enforcement Basic Training Course.

    ERIC Educational Resources Information Center

    Vineberg, Robert; Taylor, John E.

    A test development plan is described to evaluate police enrolled in the law enforcement basic training course developed by California's Commission on Peace Officer Standards and Training (POST). Some general test methodologies are discussed: performance tests, knowledge tests, and situational tests, including role playing simulations and…

  7. Use of Case Study Methods in Human Resource Management, Development, and Training Courses: Strategies and Techniques

    ERIC Educational Resources Information Center

    Maxwell, James R.; Gilberti, Anthony F.; Mupinga, Davison M.

    2006-01-01

    This paper will study some of the problems associated with case studies and make recommendations using standard and innovative methodologies effectively. Human resource management (HRM) and resource development cases provide context for analysis and decision-making designs in different industries. In most HRM development and training courses…

  8. Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation

    DTIC Science & Technology

    2016-05-01

    Excerpts: identifying and mapping flaw size distributions on glass surfaces for predicting mechanical response (International Journal of Applied Glass...). ARL-TN-0756, May 2016, US Army Research Laboratory: Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation, by Clayton M Weiss, Oak Ridge Institute for Science and Education.

  9. Single point aerosol sampling: evaluation of mixing and probe performance in a nuclear stack.

    PubMed

    Rodgers, J C; Fairchild, C I; Wood, G O; Ortiz, C A; Muyshondt, A; McFarland, A R

    1996-01-01

    Alternative reference methodologies have been developed for sampling of radionuclides from stacks and ducts, which differ from the methods previously required by the United States Environmental Protection Agency. These alternative reference methodologies have recently been approved by the U.S. EPA for use in lieu of the current standard techniques. The standard EPA methods are prescriptive in selection of sampling locations and in design of sampling probes whereas the alternative reference methodologies are performance driven. Tests were conducted in a stack at Los Alamos National Laboratory to demonstrate the efficacy of some aspects of the alternative reference methodologies. Coefficients of variation of velocity, tracer gas, and aerosol particle profiles were determined at three sampling locations. Results showed that numerical criteria placed upon the coefficients of variation by the alternative reference methodologies were met at sampling stations located 9 and 14 stack diameters from the flow entrance, but not at a location that was 1.5 diameters downstream from the inlet. Experiments were conducted to characterize the transmission of 10-μm aerodynamic diameter liquid aerosol particles through three types of sampling probes. The transmission ratio (ratio of aerosol concentration at the probe exit plane to the concentration in the free stream) was 107% for a 113 L/min (4 cfm) anisokinetic shrouded probe, but only 20% for an isokinetic probe that follows the existing EPA standard requirements. A specially designed isokinetic probe showed a transmission ratio of 63%. The shrouded probe performance would conform to the alternative reference methodologies criteria; however, the isokinetic probes would not.
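    The uniformity criteria referred to above are expressed as coefficients of variation (standard deviation divided by mean) over traverse points at a candidate sampling location. A minimal sketch, with invented velocity readings and an assumed 20% ceiling of the kind used in performance-based sampling standards:

    ```python
    import numpy as np

    # Hypothetical velocity readings (m/s) across a traverse at one
    # candidate sampling station; values are illustrative only.
    velocity = np.array([11.8, 12.1, 12.4, 12.0, 11.9, 12.3])

    cov = velocity.std(ddof=1) / velocity.mean() * 100  # percent
    verdict = "meets" if cov <= 20 else "fails"         # assumed 20% ceiling
    print(f"COV = {cov:.1f}% -> {verdict} the uniformity criterion")
    ```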

  10. A multicenter study to standardize reporting and analyses of fluorescence-activated cell-sorted murine intestinal epithelial cells

    PubMed Central

    Magness, Scott T.; Puthoff, Brent J.; Crissey, Mary Ann; Dunn, James; Henning, Susan J.; Houchen, Courtney; Kaddis, John S.; Kuo, Calvin J.; Li, Linheng; Lynch, John; Martin, Martin G.; May, Randal; Niland, Joyce C.; Olack, Barbara; Qian, Dajun; Stelzner, Matthias; Swain, John R.; Wang, Fengchao; Wang, Jiafang; Wang, Xinwei; Yan, Kelley; Yu, Jian

    2013-01-01

    Fluorescence-activated cell sorting (FACS) is an essential tool for studies requiring isolation of distinct intestinal epithelial cell populations. Inconsistent or lack of reporting of the critical parameters associated with FACS methodologies has complicated interpretation, comparison, and reproduction of important findings. To address this problem, a comprehensive multicenter study was designed to develop guidelines that limit experimental and data reporting variability and provide a foundation for accurate comparison of data between studies. Common methodologies and data reporting protocols for tissue dissociation, cell yield, cell viability, FACS, and postsort purity were established. Seven centers tested the standardized methods by FACS-isolating a specific crypt-based epithelial population (EpCAM+/CD44+) from murine small intestine. Genetic biomarkers for stem/progenitor (Lgr5 and Atoh1) and differentiated cell lineages (lysozyme, mucin 2, chromogranin A, and sucrase-isomaltase) were interrogated in target and control populations to assess intra- and intercenter variability. Wilcoxon's rank sum test on gene expression levels showed limited intracenter variability between biological replicates. Principal component analysis demonstrated significant intercenter reproducibility among four centers. Analysis of data collected by standardized cell isolation methods and data reporting requirements readily identified methodological problems, indicating that standard reporting parameters facilitate post hoc error identification. These results indicate that the complexity of FACS isolation of target intestinal epithelial populations can be highly reproducible between biological replicates and different institutions by adherence to common cell isolation methods and FACS gating strategies. This study can be considered a foundation for continued method development and a starting point for investigators that are developing cell isolation expertise to study physiology and pathophysiology of the intestinal epithelium. PMID:23928185

  11. Onto-clust--a methodology for combining clustering analysis and ontological methods for identifying groups of comorbidities for developmental disorders.

    PubMed

    Peleg, Mor; Asbeh, Nuaman; Kuflik, Tsvi; Schertz, Mitchell

    2009-02-01

    Children with developmental disorders usually exhibit multiple developmental problems (comorbidities). Hence, diagnosis needs to be oriented around groups of co-occurring developmental disorders. Our objective is to systematically identify developmental disorder groups and represent them in an ontology. We developed a methodology that combines two methods: (1) a literature-based ontology that we created, which represents developmental disorders and potential developmental disorder groups, and (2) clustering for detecting comorbid developmental disorders in patient data. The ontology is used to interpret and improve clustering results and the clustering results are used to validate the ontology and suggest directions for its development. We evaluated our methodology by applying it to data of 1175 patients from a child development clinic. We demonstrated that the ontology improves clustering results, bringing them closer to an expert-generated gold standard. We have shown that our methodology successfully combines an ontology with a clustering method to support systematic identification and representation of developmental disorder groups.

  12. Novel methodology to isolate microplastics from vegetal-rich samples.

    PubMed

    Herrera, Alicia; Garrido-Amador, Paloma; Martínez, Ico; Samper, María Dolores; López-Martínez, Juan; Gómez, May; Packard, Theodore T

    2018-04-01

    Microplastics are small plastic particles, globally distributed throughout the oceans. To properly study them, all the methodologies for their sampling, extraction, and measurement should be standardized. For heterogeneous samples containing sediments, animal tissues and zooplankton, several procedures have been described. However, definitive methodologies for samples, rich in algae and plant material, have not yet been developed. The aim of this study was to find the best extraction protocol for vegetal-rich samples by comparing the efficacies of five previously described digestion methods, and a novel density separation method. A protocol using 96% ethanol for density separation was better than the five digestion methods tested, even better than using H2O2 digestion. As it was the most efficient, simple, safe and inexpensive method for isolating microplastics from vegetal-rich samples, we recommend it as a standard separation method. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. A Data Preparation Methodology in Data Mining Applied to Mortality Population Databases.

    PubMed

    Pérez, Joaquín; Iturbide, Emmanuel; Olivares, Víctor; Hidalgo, Miguel; Martínez, Alicia; Almanza, Nelva

    2015-11-01

    It is known that the data preparation phase is the most time-consuming in the data mining process, consuming from 50% up to 70% of the total project time. Currently, data mining methodologies are general-purpose, and one of their limitations is that they do not provide guidance about which particular tasks to perform in a specific domain. This paper presents a new data preparation methodology oriented to the epidemiological domain, in which we have identified two sets of tasks: General Data Preparation and Specific Data Preparation. For both sets, the Cross-Industry Standard Process for Data Mining (CRISP-DM) is adopted as a guideline. The main contribution of our methodology is fourteen specialized tasks concerning this domain. To validate the proposed methodology, we developed a data mining system and the entire process was applied to real mortality databases. The results were encouraging because it was observed that the use of the methodology reduced some of the time-consuming tasks and the data mining system showed findings of unknown and potentially useful patterns for the public health services in Mexico.
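    To make the general/specific distinction concrete, the small pandas sketch below pairs one generic cleaning step with one epidemiology-specific plausibility rule; the column names, codes, and thresholds are hypothetical, not taken from the Mexican databases.

    ```python
    import pandas as pd

    # Toy mortality records; column names and values are hypothetical.
    raw = pd.DataFrame({
        "age":   [34, 61, None, 45, 130],              # 130 is implausible
        "cause": ["E11", "e11 ", "I21", None, "I21"],  # inconsistent coding
    })

    # General preparation: drop rows missing the key field, normalize codes
    clean = raw.dropna(subset=["cause"]).copy()
    clean["cause"] = clean["cause"].str.strip().str.upper()

    # Domain-specific preparation: enforce a plausible human age range
    clean = clean[clean["age"].between(0, 120)]
    print(clean)
    ```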

  14. Generic simulation of multi-element ladar scanner kinematics in USU LadarSIM

    NASA Astrophysics Data System (ADS)

    Omer, David; Call, Benjamin; Pack, Robert; Fullmer, Rees

    2006-05-01

    This paper presents a generic simulation model for a ladar scanner with up to three scan elements, each having a steering, stabilization and/or pattern-scanning role. Of interest is the development of algorithms that automatically generate commands to the scan elements given beam-steering objectives out of the ladar aperture, and the base motion of the sensor platform. First, a straightforward single-element body-fixed beam-steering methodology is presented. Then a unique multi-element redirective and reflective space-fixed beam-steering methodology is explained. It is shown that standard direction cosine matrix decomposition methods fail when using two orthogonal, space-fixed rotations, thus demanding the development of a new algorithm for beam steering. Finally, a related steering control methodology is presented that uses two separate optical elements mathematically combined to determine the necessary scan element commands. Limits, restrictions, and results of this methodology are presented.
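    The geometry at issue can be seen in a toy two-axis case. The numpy sketch below composes two space-fixed rotations (about X, then Y) and solves in closed form for the angles that point a boresight, initially along +Z, at a target direction. This idealized case only illustrates the problem setup; it is not the paper's algorithm, which addresses configurations where the standard decomposition fails.

    ```python
    import numpy as np

    def rot_x(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

    def rot_y(b):
        c, s = np.cos(b), np.sin(b)
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

    def steer_commands(target):
        """Angles (a, b) for two space-fixed rotations (X, then Y) that
        point a boresight initially along +Z at the unit vector target."""
        tx, ty, tz = target / np.linalg.norm(target)
        a = -np.arcsin(ty)       # rotation about space-fixed X
        b = np.arctan2(tx, tz)   # rotation about space-fixed Y
        return a, b

    t = np.array([0.3, -0.2, 0.93])
    a, b = steer_commands(t)
    # Space-fixed rotations compose by left-multiplication.
    beam = rot_y(b) @ rot_x(a) @ np.array([0.0, 0.0, 1.0])
    print(np.allclose(beam, t / np.linalg.norm(t)))  # True
    ```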

  15. Status of emerging standards for removable computer storage media and related contributions of NIST

    NASA Technical Reports Server (NTRS)

    Podio, Fernando L.

    1992-01-01

    Standards for removable computer storage media are needed so that users may reliably interchange data both within and among various computer installations. Furthermore, media interchange standards support competition in industry and prevent sole-source lock-in. NIST participates in magnetic tape and optical disk standards development through Technical Committees X3B5, Digital Magnetic Tapes, X3B11, Optical Digital Data Disk, and the Joint Technical Commission on Data Permanence. NIST also participates in other relevant national and international standards committees for removable computer storage media. Industry standards for digital magnetic tapes require the use of Standard Reference Materials (SRM's) developed and maintained by NIST. In addition, NIST has been studying care and handling procedures required for digital magnetic tapes. NIST has developed a methodology for determining the life expectancy of optical disks. NIST is developing care and handling procedures for optical digital data disks and is involved in a program to investigate error reporting capabilities of optical disk drives. This presentation reflects the status of emerging magnetic tape and optical disk standards, as well as NIST's contributions in support of these standards.

  16. Ada and the rapid development lifecycle

    NASA Technical Reports Server (NTRS)

    Deforrest, Lloyd; Gref, Lynn

    1991-01-01

    JPL is under contract, through NASA, with the US Army to develop a state-of-the-art Command Center System for the US European Command (USEUCOM). The Command Center System will receive, process, and integrate force status information from various sources and provide this integrated information to staff officers and decision makers in a format designed to enhance user comprehension and utility. The system is based on distributed workstation class microcomputers, VAX- and SUN-based data servers, and interfaces to existing military mainframe systems and communication networks. JPL is developing the Command Center System utilizing an incremental delivery methodology called the Rapid Development Methodology with adherence to government and industry standards including the UNIX operating system, X Windows, OSF/Motif, and the Ada programming language. Through a combination of software engineering techniques specific to the Ada programming language and the Rapid Development Methodology, JPL was able to deliver capability to the military user incrementally, with quality comparable to, and economics improved over, projects developed under more traditional software-intensive system implementation methodologies.

  17. Benefits estimates of highway capital improvements with uncertain parameters.

    DOT National Transportation Integrated Search

    2006-01-01

    This report warrants consideration in the development of goals, performance measures, and standard cost-benefit methodology required of transportation agencies by the Virginia 2006 Appropriations Act. The Virginia Department of Transportation has beg...

  18. Perfect Mirror Design Technology

    DTIC Science & Technology

    1999-02-01

    with Prof. Mario Molina, recipient of the 1995 Nobel Prize in Chemistry. The partnership, along with Aerodyne Research Inc., looked at how sulfur... Corporation is developing standards for nondestructive evaluation (NDE) techniques for industry use. Use of the new standards will result in improved... novel testing methodology that dramatically improves the accuracy of NDE techniques used to detect flaws. Basic Research: Five years ago, the main

  19. How to Select a Questionnaire with a Good Methodological Quality?

    PubMed

    Paiva, Saul Martins; Perazzo, Matheus de França; Ortiz, Fernanda Ruffo; Pordeus, Isabela Almeida; Martins-Júnior, Paulo Antônio

    2018-01-01

    In the last decades, several instruments have been used to evaluate the impact of oral health problems on the oral health-related quality of life (OHRQoL) of individuals. However, some instruments lack thorough methodological validation or present conceptual differences that hinder comparisons between instruments. Thus, it can be difficult for clinicians and researchers to select a questionnaire that accurately reflects what is really meaningful to individuals. This short communication aims to discuss the importance of using an appropriate checklist to select an instrument of good methodological quality. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist was developed to provide tools for evidence-based instrument selection. The COSMIN checklist comprises ten boxes that evaluate whether a study meets the standard for good methodological quality, plus two additional boxes covering studies that use the Item Response Theory method and general requirements for generalization of results, resulting in four steps to be followed. Accordingly, at least some expertise in psychometrics or clinimetrics is required for wide-ranging use of this checklist. Applications of COSMIN include ensuring the standardization of cross-cultural adaptations, enabling safer comparisons between measurement studies, and evaluating the methodological quality of systematic reviews of measurement properties. It can also be used by students learning about measurement properties and by editors and reviewers when revising manuscripts on this topic. The popularization of the COSMIN checklist is therefore necessary to improve the selection and evaluation of health measurement instruments.

  20. Influence of test procedures on the thermomechanical properties of a 55NiTi shape memory alloy

    NASA Astrophysics Data System (ADS)

    Padula, Santo A., II; Gaydosh, Darrell J.; Noebe, Ronald D.; Bigelow, Glen S.; Garg, Anita; Lagoudas, Dimitris; Karaman, Ibrahim; Atli, Kadri C.

    2008-03-01

    Over the past few decades, binary NiTi shape memory alloys have received attention due to their unique mechanical characteristics, leading to their potential use in low-temperature, solid-state actuator applications. However, prior to using these materials for such applications, the physical response of these systems to mechanical and thermal stimuli must be thoroughly understood and modeled to aid designers in developing SMA-enabled systems. Even though shape memory alloys have been around for almost five decades, very little effort has been made to standardize testing procedures. Although some standards for measuring the transformation temperatures of SMAs are available, no real standards exist for determining the various mechanical and thermomechanical properties that govern the usefulness of these unique materials. Consequently, this study involved testing a 55NiTi alloy using a variety of different test methodologies. All samples tested were taken from the same heat and batch to remove the influence of sample pedigree on the observed results. When the material was tested under constant-stress, thermal-cycle conditions, variations in the characteristic material responses were observed, depending on test methodology. The transformation strain and irreversible strain were impacted more than the transformation temperatures, which only showed an effect with regard to applied external stress. In some cases, test methodology altered the transformation strain by 0.005-0.01 mm/mm, which translates into a difference in work output capability of approximately 2 J/cm3 (290 in·lbf/in3). These results indicate the need for the development of testing standards so that meaningful data can be generated and successfully incorporated into viable models and hardware. The use of consistent testing procedures is also important when comparing results from one research organization to another. To this end, differences in the observed responses will be presented, contrasted and rationalized, in hopes of eventually developing standardized testing procedures for shape memory alloys.
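    For orientation, the quoted work-output difference follows from treating specific work as stress times transformation strain. Assuming, purely for illustration, a constant applied stress of 200 MPa:

    \[
    \Delta W = \sigma \, \Delta\varepsilon_{tr} = 200\ \mathrm{MPa} \times 0.01 = 2\ \mathrm{MJ/m^3} = 2\ \mathrm{J/cm^3} \approx 290\ \mathrm{in \cdot lbf/in^3}.
    \]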

  1. Influence of Test Procedures on the Thermomechanical Properties of a 55NiTi Shape Memory Alloy

    NASA Technical Reports Server (NTRS)

    Padula, Santo A., II; Gaydosh, Darrell J.; Noebe, Ronald D.; Bigelow, Glen S.; Garg, Anita; Lagoudas, Dimitris; Karaman, Ibrahim; Atli, Kadri C.

    2008-01-01

    Over the past few decades, binary NiTi shape memory alloys have received attention due to their unique mechanical characteristics, leading to their potential use in low-temperature, solid-state actuator applications. However, prior to using these materials for such applications, the physical response of these systems to mechanical and thermal stimuli must be thoroughly understood and modeled to aid designers in developing SMA-enabled systems. Even though shape memory alloys have been around for almost five decades, very little effort has been made to standardize testing procedures. Although some standards for measuring the transformation temperatures of SMAs are available, no real standards exist for determining the various mechanical and thermomechanical properties that govern the usefulness of these unique materials. Consequently, this study involved testing a 55NiTi alloy using a variety of different test methodologies. All samples tested were taken from the same heat and batch to remove the influence of sample pedigree on the observed results. When the material was tested under constant-stress, thermal-cycle conditions, variations in the characteristic material responses were observed, depending on test methodology. The transformation strain and irreversible strain were impacted more than the transformation temperatures, which only showed an effect with regard to applied external stress. In some cases, test methodology altered the transformation strain by 0.005-0.01 mm/mm, which translates into a difference in work output capability of approximately 2 J/cu cm (290 in·lbf/cu in). These results indicate the need for the development of testing standards so that meaningful data can be generated and successfully incorporated into viable models and hardware. The use of consistent testing procedures is also important when comparing results from one research organization to another. To this end, differences in the observed responses will be presented, contrasted and rationalized, in hopes of eventually developing standardized testing procedures for shape memory alloys.

  2. DEVELOPMENT OF A MULTIMETRIC INDEX FOR ASSESSING THE BIOLOGICAL CONDITION OF THE OHIO RIVER

    EPA Science Inventory

    The use of fish communities to assess environmental quality is common for streams, but a standard methodology for large rivers is largely undeveloped. We developed an index to assess the condition of fish assemblages along 1580 km of the Ohio River. Representative samples of th...

  3. [Regulations for decontamination of surfaces polluted as a result of chemical accidents (concept approaches)].

    PubMed

    Filatov, B N; Britanov, N G; Tochilkina, L P; Zhukov, V E; Maslennikov, A A; Ignatenko, M N; Volchek, K

    2011-01-01

    The threat of industrial chemical accidents and terrorist attacks requires the development of safety regulations for the cleanup of contaminated surfaces. This paper presents principles and a methodology for the development of a new toxicological parameter, "relative value unit" (RVU) as the primary decontamination standard.

  4. Assessment of capillary suction time (CST) test methodologies.

    PubMed

    Sawalha, O; Scholz, M

    2007-12-01

    The capillary suction time (CST) test is a commonly used method to measure the filterability of, and the ease of removing moisture from, slurry and sludge in numerous environmental and industrial applications. This study assessed several novel alterations of both the test methodology and the current standard capillary suction time (CST) apparatus. Twelve different papers including the standard Whatman No. 17 chromatographic paper were tested. The tests were run using four different types of sludge including a synthetic sludge, which was specifically developed for benchmarking purposes. The standard apparatus was altered by the introduction of a novel rectangular funnel instead of a standard circular one. A stirrer was also introduced to solve the problem of test inconsistency (e.g. high CST variability) particularly for heavy types of sludge. Results showed that several alternative papers, which are cheaper than the standard paper, can be used to estimate CST values accurately, and that the test repeatability can be improved in many cases and for different types of sludge. The introduction of the rectangular funnel demonstrated a clear improvement in test repeatability. The use of a stirrer to avoid sedimentation of heavy sludge did not have a statistically significant impact on the CST values or the corresponding data variability. The application of synthetic sludge can support the testing of experimental methodologies and should be used for subsequent benchmarking purposes.

  5. Implementation methodology for interoperable personal health devices with low-voltage low-power constraints.

    PubMed

    Martinez-Espronceda, Miguel; Martinez, Ignacio; Serrano, Luis; Led, Santiago; Trigo, Jesús Daniel; Marzo, Asier; Escayola, Javier; Garcia, José

    2011-05-01

    Traditionally, e-Health solutions were located at the point of care (PoC), while the new ubiquitous user-centered paradigm draws on standard-based personal health devices (PHDs). Such devices place strict constraints on computation and battery efficiency that encouraged the International Organization for Standardization/IEEE11073 (X73) standard for medical devices to evolve from X73PoC to X73PHD. In this context, low-voltage low-power (LV-LP) technologies meet the restrictions of X73PHD-compliant devices. Since X73PHD does not address the software architecture, achieving an efficient design falls directly on the software developer. Therefore, the computational and battery performance of such LV-LP-constrained devices can be further improved through an efficient X73PHD implementation design. In this context, this paper proposes a new methodology to implement X73PHD on microcontroller-based platforms with LV-LP constraints. This implementation methodology has been developed through a patterns-based approach and applied to a number of X73PHD-compliant agents (including weighing scale, blood pressure monitor, and thermometer specializations) and microprocessor architectures (8, 16, and 32 bits) as a proof of concept. As a reference, the results obtained for the weighing scale show all features of X73PHD running on an ARM7TDMI-based microcontroller architecture while requiring only 168 B of RAM and 2546 B of flash memory.

  6. Characteristics and Models of Effective Professional Development: The Case of School Teachers in Qatar

    ERIC Educational Resources Information Center

    Abu-Tineh, Abdullah M.; Sadiq, Hissa M.

    2018-01-01

    The purpose of this study was to investigate the characteristics of effective professional development and effective models of professional development as perceived by school teachers in the State of Qatar. This study is quantitative in nature and was conducted using a survey methodology. Means, standard deviations, t-test, and one-way analysis of…

  7. Short Serious Games Creation under the Paradigm of Software Process and Competencies as Software Requirements. Case Study: Elementary Math Competencies

    ERIC Educational Resources Information Center

    Barajas-Saavedra, Arturo; Álvarez-Rodriguez, Francisco J.; Mendoza-González, Ricardo; Oviedo-De-Luna, Ana C.

    2015-01-01

    Development of digital resources is difficult due to their particular complexity relying on pedagogical aspects. Another aspect is the lack of well-defined development processes, experiences documented, and standard methodologies to guide and organize game development. Added to this, there is no documented technique to ensure correct…

  8. Development and Evaluation of a Training Program for Organ Procurement Coordinators Using Standardized Patient Methodology.

    PubMed

    Odabasi, Orhan; Elcin, Melih; Uzun Basusta, Bilge; Gulkaya Anik, Esin; Aki, Tuncay F; Bozoklar, Ata

    2015-12-01

    The low rate of consent by next of kin of donor-eligible patients is a major limiting factor in organ transplant. Educating health care professionals about their role may lead to measurable improvements in the process. Our aim was to describe the developmental steps of a communication skills training program for health care professionals using standardized patients and to evaluate the results. We developed a rubric and 5 cases for standardized family interviews. The 20 participants interviewed standardized families at the beginning and at the end of the training course, with interviews followed by debriefing sessions. Participants also provided feedback before and after the course. The performance of each participant was assessed by his or her peers using the rubric. We calculated the generalizability coefficient to measure the reliability of the rubric and used the Wilcoxon signed rank test to compare achievement among participants. Statistical analyses were performed with SPSS software (SPSS: An IBM Company, version 17.0, IBM Corporation, Armonk, NY, USA). All participants received higher scores in their second interview, including novice participants who expressed great discomfort during their first interview. The participants rated the scenarios and the standardized patients as very representative of real-life situations, with feedback forms showing that the interviews, the video recording sessions, and the debriefing sessions contributed to their learning. Our program was designed to meet the current expectations and implications in the field of donor consent from next of kin. Results showed that our training program developed using standardized patient methodology was effective in obtaining the communication skills needed for family interviews during the consent process. The rubric developed during the study was a valid and reliable assessment tool that could be used in further educational activities. The participants showed significant improvements in communication skills.
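    A minimal sketch of the paired pre/post comparison described above, using SciPy's Wilcoxon signed rank test on hypothetical rubric scores (the values are invented; the real study assessed 20 participants with a peer-scored rubric):

    ```python
    from scipy.stats import wilcoxon

    # Hypothetical rubric scores (0-100) for the same participants before
    # and after the standardized-family interview training.
    pre  = [52, 61, 47, 70, 58, 66, 49, 73, 55, 60]
    post = [68, 75, 66, 82, 71, 80, 63, 85, 70, 74]

    # Paired, non-parametric comparison of pre vs. post performance
    stat, p = wilcoxon(pre, post)
    print(f"W = {stat}, p = {p:.4f}")
    ```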

  9. 7T MRI subthalamic nucleus atlas for use with 3T MRI.

    PubMed

    Milchenko, Mikhail; Norris, Scott A; Poston, Kathleen; Campbell, Meghan C; Ushe, Mwiza; Perlmutter, Joel S; Snyder, Abraham Z

    2018-01-01

    Deep brain stimulation (DBS) of the subthalamic nucleus (STN) reduces motor symptoms in most patients with Parkinson disease (PD), yet may produce untoward effects. Investigation of DBS effects requires accurate localization of the STN, which can be difficult to identify on magnetic resonance images collected with clinically available 3T scanners. The goal of this study is to develop a high-quality STN atlas that can be applied to standard 3T images. We created a high-definition STN atlas derived from seven older participants imaged at 7T. This atlas was nonlinearly registered to a standard template representing 56 patients with PD imaged at 3T. This process required development of methodology for nonlinear multimodal image registration. We demonstrate mm-scale STN localization accuracy by comparison of our 3T atlas with a publicly available 7T atlas. We also demonstrate less agreement with an earlier histological atlas. STN localization error in the 56 patients imaged at 3T was less than 1 mm on average. Our methodology enables accurate STN localization in individuals imaged at 3T. The STN atlas and underlying 3T average template in MNI space are freely available to the research community. The image registration methodology developed in the course of this work may be generally applicable to other datasets.

  10. The ISO 50001 Impact Estimator Tool (IET 50001 V1.1.4) - User Guide and Introduction to the ISO 50001 Impacts Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Therkelsen, Peter L.; Rao, Prakash; Aghajanzadeh, Arian

    ISO 50001-Energy management systems – Requirements with guidance for use, is an internationally developed standard that provides organizations with a flexible framework for implementing an energy management system (EnMS) with the goal of continual energy performance improvement. The ISO 50001 standard was first published in 2011 and has since seen growth in the number of certificates issued around the world, primarily in the industrial (agriculture, manufacturing, and mining) and service (commercial) sectors. Policy makers in many regions and countries are looking to or are already using ISO 50001 as a basis for energy efficiency, carbon reduction, and other energy performance improvement schemes. The Impact Estimator Tool 50001 (IET 50001 Tool) is a computational model developed to assist researchers and policy makers determine the potential impact of ISO 50001 implementation in the industrial and service (commercial) sectors for a given region or country. The IET 50001 Tool is based upon a methodology initially developed by the Lawrence Berkeley National Laboratory that has been improved upon and vetted by a group of international researchers. By using a commonly accepted and transparent methodology, users of the IET 50001 Tool can easily and clearly communicate the potential impact of ISO 50001 for a region or country.

  11. Assessment of the Impacts of Standards and Labeling Programs inMexico (four products).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanchez, Itha; Pulido, Henry; McNeil, Michael A.

    2007-06-12

    This study analyzes impacts from energy efficiency standards and labeling in Mexico from 1994 through 2005 for four major products: household refrigerators, room air conditioners, three-phase (squirrel cage) induction motors, and clothes washers. It is a retrospective analysis, seeking to assess verified impacts on product efficiency in the Mexican market in the first ten years after standards were implemented. Such an analysis allows the Mexican government to compare actual to originally forecast program benefits. In addition, it provides an extremely valuable benchmark for other countries considering standards, and to the energy policy community as a whole. The methodology for evaluation begins with historical test data taken for a large number of models of each product type between 1994 and 2005. The pre-standard efficiency of models in 1994 is taken as a baseline throughout the analysis. Model efficiency data were provided by an independent certification laboratory (ANCE), which tested products as part of the certification and enforcement mechanism defined by the standards program. Using this data, together with economic and market data provided by both government and private sector sources, the analysis considers several types of national level program impacts. These include: Energy savings; Environmental (emissions) impacts, and Net financial impacts to consumers, manufacturers and utilities. Energy savings impacts are calculated using the same methodology as the original projections, allowing a comparison. Other impacts are calculated using a robust and sophisticated methodology developed by the Instituto de Investigaciones Electricas (IIE) and Lawrence Berkeley National Laboratory (LBNL), in a collaboration supported by the Collaborative Labeling and Standards Program (CLASP).
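    In the spirit of the evaluation methodology described above, a back-of-the-envelope savings estimate might be structured as follows; every number is an illustrative assumption, not a result of the Mexican program.

    ```python
    # Illustrative stock-based savings estimate; all inputs are assumptions.
    stock           = 5_000_000   # appliances shipped under the standard
    uec_baseline    = 450.0       # kWh/yr per unit at the pre-1994 baseline
    uec_standard    = 360.0       # kWh/yr per unit after the standard
    emission_factor = 0.6         # kg CO2 per kWh of grid electricity

    delta_kwh          = stock * (uec_baseline - uec_standard)
    annual_savings_gwh = delta_kwh / 1e6               # kWh -> GWh
    annual_co2_kt      = delta_kwh * emission_factor / 1e6  # kg -> kt

    print(f"{annual_savings_gwh:.0f} GWh/yr, {annual_co2_kt:.0f} kt CO2/yr")
    ```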

  12. Calibration Testing of Network Tap Devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Popovsky, Barbara; Chee, Brian; Frincke, Deborah A.

    2007-11-14

    Understanding the behavior of network forensic devices is important to support prosecutions of malicious conduct on computer networks as well as legal remedies for false accusations of network management negligence. Individuals who seek to establish the credibility of network forensic data must speak competently about how the data was gathered and the potential for data loss. Unfortunately, manufacturers rarely provide information about the performance of low-layer network devices at a level that will survive legal challenges. This paper proposes a first step toward an independent calibration standard by establishing a validation testing methodology for evaluating forensic taps against manufacturer specifications. The methodology and the theoretical analysis that led to its development are offered as a conceptual framework for developing a standard and to "operationalize" network forensic readiness. This paper also provides details of an exemplar test, testing environment, procedures and results.
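    A single calibration trial of the kind proposed could reduce to a pass/fail comparison of observed frame loss against a manufacturer's claimed bound, as in this sketch (all counts and the spec value are invented):

    ```python
    # One validation trial: compare frames captured through the tap against
    # a known injected count and the manufacturer's claimed loss bound.
    injected = 1_000_000       # frames sent by the traffic generator
    captured = 999_910         # frames seen on the tap's monitor port
    claimed_max_loss = 1e-4    # assumed spec: at most 0.01% frame loss

    observed_loss = (injected - captured) / injected
    verdict = "within spec" if observed_loss <= claimed_max_loss else "out of spec"
    print(f"loss = {observed_loss:.2%} -> {verdict}")
    ```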

  13. Measuring Impact of U.S. DOE Geothermal Technologies Office Funding: Considerations for Development of a Geothermal Resource Reporting Metric

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, Katherine R.; Wall, Anna M.; Dobson, Patrick F.

    This paper reviews existing methodologies and reporting codes used to describe extracted energy resources such as coal and oil and describes a comparable proposed methodology to describe geothermal resources. The goal is to provide the U.S. Department of Energy's (DOE) Geothermal Technologies Office (GTO) with a consistent and comprehensible means of assessing the impacts of its funding programs. This framework will allow for GTO to assess the effectiveness of research, development, and deployment (RD&D) funding, prioritize funding requests, and demonstrate the value of RD&D programs to the U.S. Congress. Standards and reporting codes used in other countries and energy sectors provide guidance to inform development of a geothermal methodology, but industry feedback and our analysis suggest that the existing models have drawbacks that should be addressed. In order to formulate a comprehensive metric for use by GTO, we analyzed existing resource assessments and reporting methodologies for the geothermal, mining, and oil and gas industries, and we sought input from industry, investors, academia, national labs, and other government agencies. Using this background research as a guide, we describe a methodology for assessing and reporting on GTO funding according to resource knowledge and resource grade (or quality). This methodology would allow GTO to target funding or measure impact by progression of projects or geological potential for development.

  14. The EuroPrevall outpatient clinic study on food allergy: background and methodology.

    PubMed

    Fernández-Rivas, M; Barreales, L; Mackie, A R; Fritsche, P; Vázquez-Cortés, S; Jedrzejczak-Czechowicz, M; Kowalski, M L; Clausen, M; Gislason, D; Sinaniotis, A; Kompoti, E; Le, T-M; Knulst, A C; Purohit, A; de Blay, F; Kralimarkova, T; Popov, T; Asero, R; Belohlavkova, S; Seneviratne, S L; Dubakiene, R; Lidholm, J; Hoffmann-Sommergruber, K; Burney, P; Crevel, R; Brill, M; Fernández-Pérez, C; Vieths, S; Clare Mills, E N; van Ree, R; Ballmer-Weber, B K

    2015-05-01

    The EuroPrevall project aimed to develop effective management strategies in food allergy through a suite of interconnected studies and a multidisciplinary integrated approach. To address some of the gaps in food allergy diagnosis, allergen risk management and socio-economic impact, and to complement the EuroPrevall population-based surveys, a cross-sectional study in 12 outpatient clinics across Europe was conducted. We describe the study protocol. Patients referred for immediate food adverse reactions underwent a consistent and standardized allergy work-up that comprised collection of medical history; assessment of sensitization to 24 foods, 14 inhalant allergens and 55 allergenic molecules; and confirmation of clinical reactivity and food thresholds by standardized double-blind placebo-controlled food challenges (DBPCFCs) to milk, egg, fish, shrimp, peanut, hazelnut, celeriac, apple and peach. A standardized methodology for a comprehensive evaluation of food allergy was developed and implemented in 12 outpatient clinics across Europe. A total of 2121 patients (22.6% <14 years) reporting 8257 reactions to foods were studied, and 516 DBPCFCs were performed. This is the largest multicentre European case series in food allergy, in which subjects underwent a comprehensive, uniform and standardized evaluation including DBPCFC, by a methodology which is made available for further studies in food allergy. The analysis of this population will provide information on the different phenotypes of food allergy across Europe, and will make it possible to validate novel in vitro diagnostic tests, establish threshold values for major allergenic foods and analyse the socio-economic impact of food allergy. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Astrophysics Data System (ADS)

    Wray, Richard B.

    1991-12-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for the space avionics architecture was developed. In this methodology, external interfaces and relationships were defined, a static analysis was performed to produce a static avionics model, operating concepts for simulating the requirements were assembled, and a dynamic analysis of the execution needs for dynamic model operation was planned. The systems engineering approach was used to perform a top-down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were compiled, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object-Oriented Analysis, were identified.

  16. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.

    1991-01-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for the space avionics architecture was developed. In this methodology, external interfaces and relationships were defined, a static analysis was performed to produce a static avionics model, operating concepts for simulating the requirements were assembled, and a dynamic analysis of the execution needs for dynamic model operation was planned. The systems engineering approach was used to perform a top-down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were compiled, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object-Oriented Analysis, were identified.

  17. Development and Application of Health-Based Screening Levels for Use in Water-Quality Assessments

    USGS Publications Warehouse

    Toccalino, Patricia L.

    2007-01-01

    Health-Based Screening Levels (HBSLs) are non-enforceable water-quality benchmarks that were developed by the U.S. Geological Survey in collaboration with the U.S. Environmental Protection Agency (USEPA) and others. HBSLs supplement existing Federal drinking-water standards and guidelines, thereby providing a basis for a more comprehensive evaluation of contaminant-occurrence data in the context of human health. Since the original methodology used to calculate HBSLs for unregulated contaminants was published in 2003, revisions have been made to the HBSL methodology in order to reflect updates to relevant USEPA policies. These revisions allow for the use of the most recent, USEPA peer-reviewed, publicly available human-health toxicity information in the development of HBSLs. This report summarizes the revisions to the HBSL methodology for unregulated contaminants, and updates the guidance on the use of HBSLs for interpreting water-quality data in the context of human health.
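
    As a rough illustration of how such benchmarks are typically derived (the exact HBSL equations and parameter values are given in the report; the form below is an assumption based on general USEPA lifetime-health-advisory practice):

    ```python
    # Illustrative sketch of a lifetime-health-advisory-style equation of the
    # kind that noncancer drinking-water benchmarks are based on. The exact
    # HBSL methodology and parameter values are in the cited USGS report.
    def noncancer_benchmark_ug_per_L(rfd_mg_per_kg_day: float,
                                     body_weight_kg: float = 70.0,
                                     intake_L_per_day: float = 2.0,
                                     rsc: float = 0.2) -> float:
        """Benchmark concentration in micrograms per liter.

        rsc is the relative source contribution: the fraction of total
        exposure assumed to come from drinking water.
        """
        return rfd_mg_per_kg_day * body_weight_kg * rsc / intake_L_per_day * 1000.0

    # Example: a contaminant with a reference dose (RfD) of 0.004 mg/kg/day
    print(noncancer_benchmark_ug_per_L(0.004))  # -> 28.0 µg/L
    ```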

  18. [HL7 standard--features, principles, and methodology].

    PubMed

    Koncar, Miroslav

    2005-01-01

    The mission of HL7 Inc., a non-profit organization, is to provide standards for the exchange, management and integration of data that support clinical patient care, and the management, delivery and evaluation of healthcare services. As the standards developed by HL7 Inc. represent the world's most influential standardization efforts in the field of medical informatics, the HL7 family of standards has been recognized by the technical and scientific community as the foundation for next-generation healthcare information systems. Versions 1 and 2 of the HL7 standard solved many issues, but also demonstrated the size and complexity of the health information sharing problem. As a solution, a completely new methodology was adopted, encompassed in the HL7 Version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all derived domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project in the Republic of Croatia in 2002, the decision was made to go directly to Version 3. The target scope of work includes clinical, financial and administrative data management in the domain of healthcare processes. By using the standardized HL7 v3 methodology we were able to map the Croatian primary healthcare domain completely to HL7 v3 artefacts. Further refinement processes planned for the future will provide semantic interoperability and a detailed description of all elements in HL7 messages. Our HL7 Business Component continuously studies different legacy applications, building a solid foundation for their integration into an HL7-enabled communication environment.
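
    As a toy illustration of the RIM idea described above (this is not the normative HL7 model, only a sketch of how a small set of backbone classes can generate derived domain structures):

    ```python
    # Toy sketch, not the normative HL7 RIM: the backbone idea is that every
    # domain concept derives from a few core classes such as Entity, Role, Act.
    from dataclasses import dataclass, field

    @dataclass
    class Entity:            # a physical thing or organization, e.g. a person
        name: str

    @dataclass
    class Role:              # a competency played by an Entity, e.g. "patient"
        player: Entity
        class_code: str

    @dataclass
    class Act:               # an intentional action, e.g. an observation
        class_code: str
        mood_code: str       # e.g. "EVN" (event) vs. "RQO" (request)
        participations: list = field(default_factory=list)

    # A blood-pressure observation event in which a patient participates:
    patient = Role(player=Entity(name="Ivana Horvat"), class_code="PAT")
    obs = Act(class_code="OBS", mood_code="EVN", participations=[patient])
    print(obs)
    ```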

  19. Evaluative report on the Institute for Materials Research, National Bureau of Standards - fiscal year 1976. Annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The Institute for Materials Research (IMR), one of the major organizational units of the National Bureau of Standards, conducts research to provide a better understanding of the basic properties of materials and develops methodology and standards for measuring their properties to help ensure effective utilization of technologically important materials by the nation's scientific, commercial, and industrial communities. This report covers activities of the Institute during the 12 months preceding the Panel meeting on January 26-27, 1976.

  20. Evaluative report on the Institute for Materials Research, National Bureau of Standards - fiscal year 1977. Annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The Institute for Materials Research (IMR), one of the major organizational units of the National Bureau of Standards, conducts research to provide a better understanding of the basic properties of materials and develops methodology and standards for measuring their properties to help ensure effective utilization of technologically important materials by the nation's scientific, commercial, and industrial communities. This report covers activities of the Institute during the 12 months preceding the Panel meeting on January 25-26, 1977.

  1. Towards A Topological Framework for Integrating Semantic Information Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Hogan, Emilie A.; Robinson, Michael

    2014-09-07

    In this position paper we argue for the role that topological modeling principles can play in providing a framework for sensor integration. While these principles have been used successfully with standard (quantitative) sensors, we are developing the methodology in new directions to make it appropriate specifically for semantic information sources, including keyterms, ontology terms, and other general Boolean, categorical, ordinal, and partially ordered data types. We illustrate the basics of the methodology in an extended use case/example, and discuss the path forward.

  2. A study of the selection of microcomputer architectures to automate planetary spacecraft power systems

    NASA Technical Reports Server (NTRS)

    Nauda, A.

    1982-01-01

    Performance and reliability models of alternative microcomputer architectures were examined as a methodology for optimizing system design. A methodology for selecting an optimum microcomputer architecture for autonomous operation of planetary spacecraft power systems was developed. Various microcomputer system architectures were analyzed to determine their applicability to spacecraft power systems. It is suggested that no standardization formula or common set of guidelines exists which provides an optimum configuration for a given set of specifications.

  3. Methodological adequacy of articles published in two open-access Brazilian cardiology periodicals.

    PubMed

    Macedo, Cristiane Rufino; Silva, Davi Leite da; Puga, Maria Eduarda

    2010-01-01

    The use of rigorous scientific methods has contributed towards developing scientific articles of excellent methodological quality. This has made it possible to promote their citation and increase the impact factor. Brazilian periodicals have had to adapt to certain quality standards demanded by indexing organizations, such as the content and the number of original articles published in each issue. This study aimed to evaluate the methodological adequacy of two Brazilian periodicals within the field of cardiology that are indexed in several databases and freely accessible through the Scientific Electronic Library Online (SciELO), and which are now indexed by the Web of Science (Institute for Scientific Information, ISI). This was a descriptive study at the Brazilian Cochrane Center. All the published articles were evaluated according to merit assessment (content) and form assessment (performance). Ninety-six percent of the articles analyzed presented study designs that were adequate for answering the objectives. These two Brazilian periodicals within the field of cardiology published methodologically adequate articles, since they followed the quality standards. Thus, these periodicals can be considered both for consultation and as vehicles for publishing future articles. For further analyses, it is essential to apply other indicators of scientific activity such as bibliometrics, which evaluates quantitative aspects of the production, dissemination and use of information, and scientometrics, which is also concerned with the development of science policy and often overlaps with bibliometrics.

  4. Development and implementation of an audit tool for quality control of parenteral nutrition.

    PubMed

    García-Rodicio, Sonsoles; Abajo, Celia; Godoy, Mercedes; Catalá, Miguel Angel

    2009-01-01

    The aim of this article is to describe the development of a quality control methodology applied to patients receiving parenteral nutrition (PN) and to present the results obtained over the past 10 years. Development of the audit tool: In 1995, a total of 13 PN quality criteria and their standards were defined based on literature and past experiences. They were applied during 5 different 6-month audits carried out in subsequent years. According to the results of each audit, the criteria with lower validity were eliminated, while others were optimized and new criteria were introduced to complete the monitoring of other areas not previously examined. Currently, the quality control process includes 22 quality criteria and their standards that examine the following 4 different areas: (1) indication and duration of PN; (2) nutrition assessment, adequacy of the nutrition support, and monitoring; (3) metabolic and infectious complications; and (4) global efficacy of the nutrition support regimen. The authors describe the current definition of each criterion and present the results obtained in the 5 audits performed. In the past year, 9 of the 22 criteria reached the predefined standards. The areas detected for further improvements were: indication for PN, nutrition assessment, and management of catheter infections. The definition of quality criteria and their standards is an efficient method of providing a qualitative and quantitative analysis of the clinical care of patients receiving PN. It detects areas for improvement and assists in developing a methodology to work efficiently.

  5. Materials Selection Criteria for Nuclear Power Applications: A Decision Algorithm

    NASA Astrophysics Data System (ADS)

    Rodríguez-Prieto, Álvaro; Camacho, Ana María; Sebastián, Miguel Ángel

    2016-02-01

    An innovative methodology based on stringency levels is proposed in this paper that improves the current selection method for structural materials used in demanding industrial applications. This paper describes a new approach for quantifying the stringency of materials requirements, based on a novel deterministic algorithm, to prevent potential failures. We applied the new methodology to different standardized specifications used in pressure vessel design, such as the SA-533 Grade B Cl.1 and SA-508 Cl.3 (issued by the American Society of Mechanical Engineers), DIN 20MnMoNi55 (issued by the German Institute of Standardization) and 16MND5 (issued by the French Nuclear Commission) specifications, and determined the influence of design code selection. This study is based on key scientific publications on the influence of chemical composition on the mechanical behavior of materials, which were not considered when the technological requirements were established in the aforementioned specifications. For this purpose, a new method to quantify the efficacy of each standard was developed using a deterministic algorithm. The process of assigning relative weights was performed by consulting a panel of experts in materials selection for reactor pressure vessels to provide a more objective methodology; thus, the resulting mathematical calculations for quantitative analysis are greatly simplified. The final results show that steel DIN 20MnMoNi55 is the best material option. Additionally, the more recently developed materials DIN 20MnMoNi55, 16MND5 and SA-508 Cl.3 exhibit mechanical requirements more stringent than those of SA-533 Grade B Cl.1. The methodology presented in this paper can be used as a decision tool in the selection of materials for a wide range of applications.
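
    A minimal sketch of what an expert-weighted stringency scoring scheme of this kind might look like; all criteria, weights and scores below are invented for illustration and do not reproduce the paper's algorithm:

    ```python
    # Illustrative sketch of a stringency-weighted materials-ranking scheme.
    # Criteria, expert weights and per-material scores are invented.
    def stringency_score(requirement_scores: dict, weights: dict) -> float:
        """Weighted sum of per-requirement stringency scores (0-1 scale)."""
        return sum(weights[k] * requirement_scores[k] for k in requirement_scores)

    weights = {"P_limit": 0.3, "S_limit": 0.3, "toughness": 0.4}  # expert panel
    candidates = {
        "DIN 20MnMoNi55":  {"P_limit": 0.9, "S_limit": 0.9, "toughness": 0.8},
        "SA-533 Gr.B Cl.1": {"P_limit": 0.6, "S_limit": 0.6, "toughness": 0.7},
    }
    best = max(candidates, key=lambda m: stringency_score(candidates[m], weights))
    print(best)  # -> DIN 20MnMoNi55 under these illustrative numbers
    ```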

  6. Standards and Methodologies for Characterizing Radiobiological Impact of High-Z Nanoparticles

    PubMed Central

    Subiel, Anna; Ashmore, Reece; Schettino, Giuseppe

    2016-01-01

    Research on the application of high-Z nanoparticles (NPs) in cancer treatment and diagnosis has recently been the subject of growing interest, with much promise being shown with regards to a potential transition into clinical practice. In spite of numerous publications related to the development and application of nanoparticles for use with ionizing radiation, the literature is lacking coherent and systematic experimental approaches to fully evaluate the radiobiological effectiveness of NPs, validate mechanistic models and allow direct comparison of the studies undertaken by various research groups. The lack of standards and established methodology is commonly recognised as a major obstacle for the transition of innovative research ideas into clinical practice. This review provides a comprehensive overview of radiobiological techniques and quantification methods used in in vitro studies on high-Z nanoparticles and aims to provide recommendations for future standardization for NP-mediated radiation research. PMID:27446499

  7. Professional Development of Russian HEIs' Management and Faculty in CDIO Standards Application

    ERIC Educational Resources Information Center

    Chuchalin, Alexander; Malmqvist, Johan; Tayurskaya, Marina

    2016-01-01

    The paper presents the approach to complex training of managers and faculty staff for system modernisation of Russian engineering education. As a methodological basis of design and implementation of the faculty development programme, the CDIO (Conceive-Design-Implement-Operate) Approach was chosen due to compliance of its concept to the purposes…

  8. Developing a User Oriented Design Methodology for Learning Activities Using Boundary Objects

    ERIC Educational Resources Information Center

    Fragou, Olga; Kameas, Achilles

    2013-01-01

    International standards in higher, open and distance education are used for developing Open Educational Resources (OERs). Current issues in the e-learning community are the specification of learning chunks and the definition of describing designs for different units of learning (activities, units, courses) in a generic though expandable format.…

  9. Analysis of rocket engine injection combustion processes

    NASA Technical Reports Server (NTRS)

    Salmon, J. W.; Saltzman, D. H.

    1977-01-01

    Mixing methodology improvements for the JANNAF DER and CICM injection/combustion analysis computer programs were accomplished. Development of the ZOM plane prediction model was improved for installation into the new standardized DER computer program. An intra-element mixing model development approach was recommended for gas/liquid coaxial injection elements for possible future incorporation into the CICM computer program.

  10. Product environmental footprint in policy and market decisions: Applicability and impact assessment.

    PubMed

    Lehmann, Annekatrin; Bach, Vanessa; Finkbeiner, Matthias

    2015-07-01

    In April 2013, the European Commission published the Product and Organisation Environmental Footprint (PEF/OEF) methodology--a life cycle-based multicriteria measure of the environmental performance of products, services, and organizations. With its approach of "comparability over flexibility," the PEF/OEF methodology aims at harmonizing existing methods, while decreasing the flexibility provided by the International Organization for Standardization (ISO) standards regarding methodological choices. Currently, a 3-y pilot phase is running, aiming at testing the methodology and developing product category and organization sector rules (PEFCR/OEFSR). Although a harmonized method is in theory a good idea, the PEF/OEF methodology presents challenges, including a risk of confusion and limitations in applicability to practice. The paper discusses the main differences between the PEF and ISO methodologies and highlights challenges regarding PEF applicability, with a focus on impact assessment. Some methodological aspects of the PEF and PEFCR Guides are found to contradict the ISO 14044 (2006) and ISO 14025 (2006). Others, such as prohibition of inventory cutoffs, are impractical. The evaluation of the impact assessment methods proposed in the PEF/OEF Guide showed that the predefined methods for water consumption, land use, and abiotic resources are not adequate because of modeling artefacts, missing inventory data, or incomplete characterization factors. However, the methods for global warming and ozone depletion perform very well. The results of this study are relevant for the PEF (and OEF) pilot phase, which aims at testing the PEF (OEF) methodology (and potentially adapting it) as well as addressing challenges and coping with them. © 2015 SETAC.

  11. Remote-sensing applications as utilized in Florida's coastal zone management program

    NASA Technical Reports Server (NTRS)

    Worley, D. R.

    1975-01-01

    Land use maps were developed from photomaps obtained by remote sensing in order to develop a comprehensive state plan for the protection, development, and zoning of coastal regions. Only photographic remote sensors have been used in support of the coastal council's planning/management methodology. Standard photointerpretation and cartographic application procedures for map compilation were used in preparing base maps.

  12. Calibration methodology for proportional counters applied to yield measurements of a neutron burst.

    PubMed

    Tarifeño-Saldivia, Ariel; Mayer, Roberto E; Pavez, Cristian; Soto, Leopoldo

    2014-01-01

    This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. The methodology is to be applied when single neutron events cannot be resolved in time by standard nuclear electronics, or when a continuous current cannot be measured at the output of the counter. It is based on the calibration of the counter in pulse mode, and on the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. The implementation of the methodology for the measurement of fast neutron yields generated from plasma focus experiments, using a moderated proportional counter, is discussed herein. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology, compared with previous calibration methods.
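
    The core estimate behind such a methodology can be sketched as follows (an assumed simplified form, not the authors' full statistical model): calibrate the counter in pulse mode to obtain the mean and spread of the charge per detected neutron, then divide the integrated burst charge by the mean single-event charge.

    ```python
    # Simplified sketch: infer the number of detected events in a burst from
    # the integrated charge, using pulse-mode calibration of the single-event
    # charge distribution. Numbers are invented for illustration.
    import math

    def events_from_charge(q_total: float, q_mean: float, q_std: float):
        """Estimated detected events and an approximate 1-sigma uncertainty."""
        n_hat = q_total / q_mean
        # relative spread of the single-event charge propagates as ~1/sqrt(N)
        sigma_n = math.sqrt(n_hat) * (q_std / q_mean)
        return n_hat, sigma_n

    # Example: 2.0e-7 C integrated; 4.0e-11 C mean single-event charge, 30% spread
    n, dn = events_from_charge(2.0e-7, 4.0e-11, 1.2e-11)
    print(f"{n:.0f} ± {dn:.0f} detected events")
    ```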

  13. Architectural Methodology Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    The establishment of conventions between two communicating entities in the end systems is essential for communications. Examples of the kind of decisions that need to be made in establishing a protocol convention include the nature of the data representation, the format and the speed of the data representation over the communications path, and the sequence of control messages (if any) which are sent. One of the main functions of a protocol is to establish a standard path between the communicating entities. This is necessary to create a virtual communications medium with certain desirable characteristics. In essence, it is the function of the protocol to transform the characteristics of the physical communications environment into a more useful virtual communications model. The final function of a protocol is to establish standard data elements for communications over the path; that is, the protocol serves to create a virtual data element for exchange. Other systems may be constructed in which the transferred element is a program or a job. Finally, there are special purpose applications in which the element to be transferred may be a complex structure such as all or part of a graphic display. NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to describe the methodologies used in developing a protocol architecture for an in-space Internet node. The node would support NASA's four mission areas: Earth Science; Space Science; Human Exploration and Development of Space (HEDS); Aerospace Technology. This report presents the methodology for developing the protocol architecture. The methodology addresses the architecture for a computer communications environment. It does not address an analog voice architecture.

  14. Geothermal Resource Reporting Metric (GRRM) Developed for the U.S. Department of Energy's Geothermal Technologies Office

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, Katherine R.; Wall, Anna M.; Dobson, Patrick F.

    This paper reviews a methodology being developed for reporting geothermal resources and project progress. The goal is to provide the U.S. Department of Energy's (DOE) Geothermal Technologies Office (GTO) with a consistent and comprehensible means of evaluating the impacts of its funding programs. This framework will allow the GTO to assess the effectiveness of research, development, and deployment (RD&D) funding, prioritize funding requests, and demonstrate the value of RD&D programs to the U.S. Congress and the public. Standards and reporting codes used in other countries and energy sectors provide guidance to develop the relevant geothermal methodology, but industry feedback and our analysis suggest that the existing models have drawbacks that should be addressed. In order to formulate a comprehensive metric for use by the GTO, we analyzed existing resource assessments and reporting methodologies for the geothermal, mining, and oil and gas industries, and sought input from industry, investors, academia, national labs, and other government agencies. Using this background research as a guide, we describe a methodology for evaluating and reporting on GTO funding according to resource grade (geological, technical and socio-economic) and project progress. This methodology would allow GTO to target funding, measure impact by monitoring the progression of projects, or assess geological potential of targeted areas for development.

  15. Guidelines for the Design and Conduct of Clinical Studies in Knee Articular Cartilage Repair

    PubMed Central

    Mithoefer, Kai; Saris, Daniel B.F.; Farr, Jack; Kon, Elizaveta; Zaslav, Kenneth; Cole, Brian J.; Ranstam, Jonas; Yao, Jian; Shive, Matthew; Levine, David; Dalemans, Wilfried; Brittberg, Mats

    2011-01-01

    Objective: To summarize current clinical research practice and develop methodological standards for objective scientific evaluation of knee cartilage repair procedures and products. Design: A comprehensive literature review was performed of high-level original studies providing information relevant for the design of clinical studies on articular cartilage repair in the knee. Analysis of cartilage repair publications and synopses of ongoing trials were used to identify important criteria for the design, reporting, and interpretation of studies in this field. Results: Current literature reflects the methodological limitations of the scientific evidence available for articular cartilage repair. However, clinical trial databases of ongoing trials document a trend suggesting improved study designs and clinical evaluation methodology. Based on the current scientific information and standards of clinical care, detailed methodological recommendations were developed for the statistical study design, patient recruitment, control group considerations, study endpoint definition, documentation of results, use of validated patient-reported outcome instruments, and inclusion and exclusion criteria for the design and conduct of scientifically sound cartilage repair study protocols. A consensus statement among the International Cartilage Repair Society (ICRS) and contributing authors experienced in clinical trial design and implementation was achieved. Conclusions: High-quality clinical research methodology is critical for the optimal evaluation of current and new cartilage repair technologies. In addition to generally applicable principles for orthopedic study design, specific criteria and considerations apply to cartilage repair studies. Systematic application of these criteria and considerations can facilitate study designs that are scientifically rigorous, ethical, practical, and appropriate for the question(s) being addressed in any given cartilage repair research project. PMID:26069574

  16. Systematic review of the methodological and reporting quality of case series in surgery.

    PubMed

    Agha, R A; Fowler, A J; Lee, S-Y; Gundogan, B; Whitehurst, K; Sagoo, H K; Jeong, K J L; Altman, D G; Orgill, D P

    2016-09-01

    Case series are an important and common study type. No guideline exists for reporting case series and there is evidence of key data being missed from such reports. The first step in the process of developing a methodologically sound reporting guideline is a systematic review of literature relevant to the reporting deficiencies of case series. A systematic review of methodological and reporting quality in surgical case series was performed. The electronic search strategy was developed by an information specialist and included MEDLINE, Embase, Cochrane Methods Register, Science Citation Index and Conference Proceedings Citation index, from the start of indexing to 5 November 2014. Independent screening, eligibility assessments and data extraction were performed. Included articles were then analysed for five areas of deficiency: failure to use standardized definitions, missing or selective data (including the omission of whole cases or important variables), transparency or incomplete reporting, whether alternative study designs were considered, and other issues. Database searching identified 2205 records. Through the process of screening and eligibility assessments, 92 articles met inclusion criteria. Frequencies of methodological and reporting issues identified were: failure to use standardized definitions (57 per cent), missing or selective data (66 per cent), transparency or incomplete reporting (70 per cent), whether alternative study designs were considered (11 per cent) and other issues (52 per cent). The methodological and reporting quality of surgical case series needs improvement. The data indicate that evidence-based guidelines for the conduct and reporting of case series may be useful. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.

  17. CellML metadata standards, associated tools and repositories

    PubMed Central

    Beard, Daniel A.; Britten, Randall; Cooling, Mike T.; Garny, Alan; Halstead, Matt D.B.; Hunter, Peter J.; Lawson, James; Lloyd, Catherine M.; Marsh, Justin; Miller, Andrew; Nickerson, David P.; Nielsen, Poul M.F.; Nomura, Taishin; Subramanium, Shankar; Wimalaratne, Sarala M.; Yu, Tommy

    2009-01-01

    The development of standards for encoding mathematical models is an important component of model building and model sharing among scientists interested in understanding multi-scale physiological processes. CellML provides such a standard, particularly for models based on biophysical mechanisms, and a substantial number of models are now available in the CellML Model Repository. However, there is an urgent need to extend the current CellML metadata standard to provide biological and biophysical annotation of the models in order to facilitate model sharing, automated model reduction and connection to biological databases. This paper gives a broad overview of a number of new developments on CellML metadata and provides links to further methodological details available from the CellML website. PMID:19380315

  18. The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation.

    PubMed

    Lewis, Cara C; Stanick, Cameo F; Martinez, Ruben G; Weiner, Bryan J; Kim, Mimi; Barwick, Melanie; Comtois, Katherine A

    2015-01-08

    Identification of psychometrically strong instruments for the field of implementation science is a high priority underscored in a recent National Institutes of Health working meeting (October 2013). Existing instrument reviews are limited in scope, methods, and findings. The Society for Implementation Research Collaboration Instrument Review Project's objectives address these limitations by identifying and applying a unique methodology to conduct a systematic and comprehensive review of quantitative instruments assessing constructs delineated in two of the field's most widely used frameworks, adopt a systematic search process (using standard search strings), and engage an international team of experts to assess the full range of psychometric criteria (reliability, construct and criterion validity). Although this work focuses on implementation of psychosocial interventions in mental health and health-care settings, the methodology and results will likely be useful across a broad spectrum of settings. This effort has culminated in a centralized online open-access repository of instruments depicting graphical head-to-head comparisons of their psychometric properties. This article describes the methodology and preliminary outcomes. The seven stages of the review, synthesis, and evaluation methodology include (1) setting the scope for the review, (2) identifying frameworks to organize and complete the review, (3) generating a search protocol for the literature review of constructs, (4) literature review of specific instruments, (5) development of an evidence-based assessment rating criteria, (6) data extraction and rating instrument quality by a task force of implementation experts to inform knowledge synthesis, and (7) the creation of a website repository. To date, this multi-faceted and collaborative search and synthesis methodology has identified over 420 instruments related to 34 constructs (total 48 including subconstructs) that are relevant to implementation science. Despite numerous constructs having greater than 20 available instruments, which implies saturation, preliminary results suggest that few instruments stem from gold standard development procedures. We anticipate identifying few high-quality, psychometrically sound instruments once our evidence-based assessment rating criteria have been applied. The results of this methodology may enhance the rigor of implementation science evaluations by systematically facilitating access to psychometrically validated instruments and identifying where further instrument development is needed.

  19. Development and exploration of a new methodology for the fitting and analysis of XAS data.

    PubMed

    Delgado-Jaime, Mario Ulises; Kennepohl, Pierre

    2010-01-01

    A new data analysis methodology for X-ray absorption near-edge spectroscopy (XANES) is introduced and tested using several examples. The methodology has been implemented within the context of a new Matlab-based program discussed in a companion related article [Delgado-Jaime et al. (2010), J. Synchrotron Rad. 17, 132-137]. The approach makes use of a Monte Carlo search method to seek appropriate starting points for a fit model, allowing for the generation of a large number of independent fits with minimal user-induced bias. The applicability of this methodology is tested using various data sets on the Cl K-edge XAS data for tetragonal CuCl(4)(2-), a common reference compound used for calibration and covalency estimation in M-Cl bonds. A new background model function that effectively blends together background profiles with spectral features is an important component of the discussed methodology. The development of a robust evaluation function to fit multiple-edge data is discussed and the implications regarding standard approaches to data analysis are discussed and explored within these examples.
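
    A minimal sketch of the Monte Carlo starting-point idea, using an illustrative XANES-like model (Gaussian pre-edge peak plus arctangent edge step); the authors' actual model, background function and evaluation criteria differ:

    ```python
    # Sketch: draw many random initial parameter vectors, run a local
    # least-squares fit from each, and keep the best of the converged fits.
    # Model, parameter ranges and synthetic data are invented for illustration.
    import numpy as np
    from scipy.optimize import curve_fit

    def model(e, amp, e0, width, step_h):
        """Gaussian pre-edge peak plus arctangent edge step."""
        peak = amp * np.exp(-((e - e0) ** 2) / (2 * width ** 2))
        edge = step_h * (0.5 + np.arctan((e - (e0 + 5)) / width) / np.pi)
        return peak + edge

    rng = np.random.default_rng(0)
    e = np.linspace(2815, 2840, 200)                  # eV, Cl K-edge region
    y = model(e, 1.0, 2820.0, 1.0, 2.0) + rng.normal(0, 0.02, e.size)

    fits = []
    for _ in range(100):                              # independent random starts
        p0 = rng.uniform([0.1, 2816, 0.3, 0.5], [3.0, 2830, 3.0, 4.0])
        try:
            popt, _ = curve_fit(model, e, y, p0=p0, maxfev=2000)
            fits.append((float(np.sum((model(e, *popt) - y) ** 2)), popt))
        except RuntimeError:
            pass                                      # discard non-converged start
    best_sse, best_params = min(fits, key=lambda t: t[0])
    print(best_params)
    ```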

  20. Development and exploration of a new methodology for the fitting and analysis of XAS data

    PubMed Central

    Delgado-Jaime, Mario Ulises; Kennepohl, Pierre

    2010-01-01

    A new data analysis methodology for X-ray absorption near-edge spectroscopy (XANES) is introduced and tested using several examples. The methodology has been implemented within the context of a new Matlab-based program discussed in a companion related article [Delgado-Jaime et al. (2010 ▶), J. Synchrotron Rad. 17, 132–137]. The approach makes use of a Monte Carlo search method to seek appropriate starting points for a fit model, allowing for the generation of a large number of independent fits with minimal user-induced bias. The applicability of this methodology is tested using various data sets on the Cl K-edge XAS data for tetragonal CuCl4 2−, a common reference compound used for calibration and covalency estimation in M—Cl bonds. A new background model function that effectively blends together background profiles with spectral features is an important component of the discussed methodology. The development of a robust evaluation function to fit multiple-edge data is discussed and the implications regarding standard approaches to data analysis are discussed and explored within these examples. PMID:20029120

  1. The methodology for defining the European Standards for the certification of Haemophilia Centres in Europe

    PubMed Central

    Candura, Fabio; Menichini, Ivana; Calizzani, Gabriele; Giangrande, Paul; Mannucci, Pier Mannuccio; Makris, Michael

    2014-01-01

    Introduction Work Package 4, "Development of the standardisation criteria", of the European Haemophilia Network project has the main objective of implementing a common and shared European strategy for a certification system for two levels of Haemophilia Centres: European Haemophilia Treatment Centres and European Haemophilia Comprehensive Care Centres in the Member States of the European Union. Materials and methods An inclusive and participatory process for developing shared standards and criteria for the management of patients with inherited bleeding disorders was carried out. The process was implemented through four different consultation events involving the entire European community of stakeholders, which contributed significantly to the drafting of the European Guidelines for the certification of Haemophilia Centres. Results The Guidelines set the standards for the designation of centres that provide specialised and multidisciplinary care (Haemophilia Comprehensive Care Centres) as well as local routine care (Haemophilia Treatment Centres). Standards cover several issues such as: general requirements; patient care; advisory services; laboratory; networking of clinical and specialised services. Conclusions The drafting of the European Guidelines for the certification of Haemophilia Centres was performed adopting a rigorous methodological approach. In order to build the widest possible consensus on the quality standards, the main institutional and scientific stakeholders were involved. The resulting document will contribute significantly to promoting standardisation in the quality of diagnosis and treatment in European Haemophilia Centres. PMID:24922292

  2. Drilled Shaft Foundations for Noise Barrier Walls and Slope Stabilization

    DOT National Transportation Integrated Search

    2002-12-01

    This research project focused on two primary objectives. The first objective relates to the development of a methodology for using SPT (Standard Penetration Test) results to design laterally loaded drilled shafts. The second objective aims...

  3. Updated methods for traffic impact analysis, including evaluation of innovative intersection designs.

    DOT National Transportation Integrated Search

    2013-12-01

    In 1992, an Applicant's Guide and a Reviewer's Guide to Traffic Impact Analyses, intended to standardize the methodologies for conducting traffic impact analyses (TIAs) in Indiana, were developed for the Indiana Department of Transportation (INDOT). The m...

  4. Review of Recent Methodological Developments in Group-Randomized Trials: Part 1—Design

    PubMed Central

    Li, Fan; Gallis, John A.; Prague, Melanie; Murray, David M.

    2017-01-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have highlighted the developments of the past 13 years in design with a companion article to focus on developments in analysis. As a pair, these articles update the 2004 review. We have discussed developments in the topics of the earlier review (e.g., clustering, matching, and individually randomized group-treatment trials) and in new topics, including constrained randomization and a range of randomized designs that are alternatives to the standard parallel-arm GRT. These include the stepped-wedge GRT, the pseudocluster randomized trial, and the network-randomized GRT, which, like the parallel-arm GRT, require clustering to be accounted for in both their design and analysis. PMID:28426295

  5. Review of Recent Methodological Developments in Group-Randomized Trials: Part 1-Design.

    PubMed

    Turner, Elizabeth L; Li, Fan; Gallis, John A; Prague, Melanie; Murray, David M

    2017-06-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have highlighted the developments of the past 13 years in design with a companion article to focus on developments in analysis. As a pair, these articles update the 2004 review. We have discussed developments in the topics of the earlier review (e.g., clustering, matching, and individually randomized group-treatment trials) and in new topics, including constrained randomization and a range of randomized designs that are alternatives to the standard parallel-arm GRT. These include the stepped-wedge GRT, the pseudocluster randomized trial, and the network-randomized GRT, which, like the parallel-arm GRT, require clustering to be accounted for in both their design and analysis.
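
    As a side note not drawn from the review itself: the reason clustering must be accounted for in the design of a parallel-arm GRT is the variance inflation known as the design effect, sketched below for equal cluster sizes.

    ```python
    # The standard design effect for cluster (group) randomization:
    # DEFF = 1 + (m - 1) * ICC, where m is the cluster size and ICC the
    # intraclass correlation coefficient.
    def design_effect(cluster_size: float, icc: float) -> float:
        """Variance inflation factor for equal cluster sizes."""
        return 1.0 + (cluster_size - 1.0) * icc

    # Even a small ICC inflates the required sample size substantially:
    print(design_effect(cluster_size=100, icc=0.01))  # -> 1.99, i.e. ~2x
    ```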

  6. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
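
    For readers unfamiliar with standard addition, the core calculation can be sketched as follows (numbers invented for illustration; the assay-specific details are in the paper): regress the response on the added concentration and read the endogenous level off the x-intercept.

    ```python
    # Standard-addition sketch: fit a line to (added concentration, response)
    # and recover the endogenous concentration from the x-intercept magnitude.
    import numpy as np

    added = np.array([0.0, 5.0, 10.0, 20.0])          # spiked amount, nmol/mL
    response = np.array([42.0, 63.0, 84.0, 126.0])    # illustrative peak areas

    slope, intercept = np.polyfit(added, response, 1)
    endogenous = intercept / slope                    # |x-intercept| at y = 0
    print(f"endogenous concentration ≈ {endogenous:.1f} nmol/mL")  # -> 10.0
    ```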

  7. Technical Support Document: 50% Energy Savings Design Technology Packages for Highway Lodging Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Wei; Gowri, Krishnan; Lane, Michael D.

    2009-09-28

    This Technical Support Document (TSD) describes the process, methodology and assumptions for development of the 50% Energy Savings Design Technology Packages for Highway Lodging Buildings, a design guidance document intended to provide recommendations for achieving 50% energy savings in highway lodging properties over the energy-efficiency levels contained in ANSI/ASHRAE/IESNA Standard 90.1-2004, Energy Standard for Buildings Except Low-Rise Residential Buildings.

  8. Metropolitan Forensic Anthropology Team (MFAT) studies in identification: 1. Race and sex assessment by discriminant function analysis of the postcranial skeleton.

    PubMed

    Taylor, J V; DiBennardo, R; Linares, G H; Goldman, A D; DeForest, P R

    1984-07-01

    A case study is presented to demonstrate the utility of the team approach to the identification of human remains, and to illustrate a methodological innovation developed by MFAT. Case 1 represents the first of several planned case studies, each designed to present new methodological solutions to standard problems in identification. The present case describes a test, by application, of race and sex assessment of the postcranial skeleton by discriminant function analysis.
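
    As a generic illustration of discriminant function analysis (the data and measurement choices below are synthetic; MFAT's actual discriminant functions are in the paper):

    ```python
    # Illustrative sketch only: a linear discriminant function of the kind
    # used for sex assessment from postcranial measurements, on fake data.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    # Synthetic femoral head diameter and bicondylar breadth (mm), two groups
    females = rng.normal([42.0, 74.0], [2.0, 3.0], size=(50, 2))
    males = rng.normal([48.0, 82.0], [2.0, 3.0], size=(50, 2))
    X = np.vstack([females, males])
    y = np.array([0] * 50 + [1] * 50)                 # 0 = female, 1 = male

    lda = LinearDiscriminantAnalysis().fit(X, y)
    print(lda.predict([[45.0, 78.0]]))                # classify a new case
    print(lda.score(X, y))                            # resubstitution accuracy
    ```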

  9. [50 years of the methodology of weather forecasting for medicine].

    PubMed

    Grigor'ev, K I; Povazhnaia, E L

    2014-01-01

    The materials reported in the present article illustrate, in historical perspective, the possibilities of weather forecasting for medical purposes. The main characteristics of the relevant organizational and methodological approaches to meteoprophylaxis based on standard medical forecasts are presented. The emphasis is laid on the priority of the domestic medical school in the development of the principles of diagnostics and treatment of meteosensitivity and meteotropic complications in patients presenting with various diseases, with special reference to their age-related characteristics.

  10. A method to preserve trends in quantile mapping bias correction of climate modeled temperature

    NASA Astrophysics Data System (ADS)

    Grillakis, Manolis G.; Koutroulis, Aristeidis G.; Daliakopoulos, Ioannis N.; Tsanis, Ioannis K.

    2017-09-01

    Bias correction of climate variables is a standard practice in climate change impact (CCI) studies. Various methodologies have been developed within the framework of quantile mapping. However, it is well known that quantile mapping may significantly modify the long-term statistics due to the time dependency of the temperature bias. Here, a method to overcome this issue without compromising the day-to-day correction statistics is presented. The methodology separates the modeled temperature signal into a normalized and a residual component relative to the modeled reference period climatology, in order to adjust the biases only for the former and preserve the signal of the latter. The results show that this method allows for the preservation of the originally modeled long-term signal in the mean, the standard deviation and higher and lower percentiles of temperature. To illustrate the improvements, the methodology is tested on daily time series obtained from five EURO-CORDEX regional climate models (RCMs).
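
    One simplified reading of the idea, not the authors' exact algorithm: split the modeled series into a slowly varying signal plus anomalies, quantile-map only the anomalies against observations, and restore the modeled signal afterwards. A sketch with synthetic daily series:

    ```python
    # Simplified trend-preserving quantile mapping sketch (assumed form).
    # Inputs are daily temperature series; the smoothing window is arbitrary.
    import numpy as np

    def qm_preserving_signal(obs_ref, mod_ref, mod_fut, window=365):
        """Empirical quantile mapping applied only to detrended anomalies."""
        kernel = np.ones(window) / window
        signal_ref = np.convolve(mod_ref, kernel, mode="same")
        signal_fut = np.convolve(mod_fut, kernel, mode="same")
        anom_ref, anom_fut = mod_ref - signal_ref, mod_fut - signal_fut
        obs_anom = obs_ref - obs_ref.mean()
        # map each future anomaly through the reference-period quantiles
        q = np.interp(anom_fut, np.sort(anom_ref),
                      np.linspace(0, 1, anom_ref.size))
        corrected_anom = np.quantile(obs_anom, q)
        # restore the modeled signal, shifted to the observed climatology
        return signal_fut + corrected_anom + (obs_ref.mean() - mod_ref.mean())

    rng = np.random.default_rng(1)
    t = np.arange(365 * 30)
    obs = 15 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 3, t.size)
    mod = 17 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 4, t.size)
    corrected = qm_preserving_signal(obs, mod, mod + 0.0005 * t)  # with trend
    print(corrected.mean())
    ```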

  11. Improving the adaptability of WHO evidence-informed guidelines for nutrition actions: results of a mixed methods evaluation.

    PubMed

    Dedios, Maria Cecilia; Esperato, Alexo; De-Regil, Luz Maria; Peña-Rosas, Juan Pablo; Norris, Susan L

    2017-03-21

    Over the past decade, the World Health Organization (WHO) has implemented a standardized, evidence-informed guideline development process to assure technically sound and policy-relevant guidelines. This study is an independent evaluation of the adaptability of the guidelines produced by the Evidence and Programme Guidance unit, at the Department of Nutrition for Health and Development (NHD). The study systematizes the lessons learned by the NHD group at WHO. We used a mixed methods approach to determine the adaptability of the nutrition guidelines. Adaptability was defined as having two components: methodological quality and implementability of guidelines. Additionally, we gathered recommendations to improve future guideline development in nutrition actions for health and development. Data sources for this evaluation were official documentation and feedback (both qualitative and quantitative) from key stakeholders involved in the development of nutrition guidelines. The qualitative data were collected through a desk review and two waves of semi-structured interviews (n = 12) and were analyzed through axial coding. Guideline adaptability was assessed quantitatively using two standardized instruments completed by key stakeholders. The Appraisal of Guidelines for Research and Evaluation instrument, version II, was used to assess guideline quality (n = 6), while implementability was assessed with the electronic version of the GuideLine Implementability Appraisal (n = 7). The nutrition evidence-informed guideline development process has several strengths, among them the appropriate management of conflicts of interest of guideline developers and the systematic use of high-quality evidence to inform the recommendations. These features contribute to increasing the methodological quality of the guidelines. The key areas for improvement are the limited implementability of the recommendations, the lack of explicit and precise implementation advice in the guidelines, and challenges related to collaborative work within interdisciplinary groups. Overall, our study found that the nutrition evidence-informed guidelines are of good methodological quality but that their implementability requires improvement. The recommendations to improve guideline adaptability address the guideline content, the dynamics shaping interdisciplinary work, and actions for implementation feasibility. As WHO relies heavily on a standardized procedure to develop guidelines, the lessons learned may be applicable to guideline development across the organization and to other groups developing guidelines.

  12. Study for standardization of the lighting system in fruit sorting

    NASA Astrophysics Data System (ADS)

    Gomes, J. F. S.; Baldner, F. O.; Costa, P. B.; Guedes, M. B.; Oliveira, I. A. A.; Leta, F. R.

    2016-07-01

    Sorting is a very important step in fruit processing. The definition and characterization of attributes are important for both marketing and the end user, making it necessary to establish regulations for classification and standardization in order to unify the language of the market, enable a more efficient market and increase consumer awareness. To this end, it is necessary to standardize the technical criteria that can change the perception of the product. Studies have been developed in order to standardize a methodology to determine the subclass of fruit ripening, evaluating the influence of different light sources on the subclass evaluation.

  13. Improving sexuality education: the development of teacher-preparation standards.

    PubMed

    Barr, Elissa M; Goldfarb, Eva S; Russell, Susan; Seabert, Denise; Wallen, Michele; Wilson, Kelly L

    2014-06-01

    Teaching sexuality education to support young people's sexual development and overall sexual health is both needed and supported. Data continue to highlight the high rates of teen pregnancy, sexually transmitted disease, including human immunodeficiency virus (HIV) infections, among young people in the United States as well as the overwhelming public support for sexuality education instruction. In support of the implementation of the National Sexuality Education Standards, the current effort focuses on better preparing teachers to deliver sexuality education. An expert panel was convened by the Future of Sex Education Initiative to develop teacher-preparation standards for sexuality education. Their task was to develop standards and indicators that addressed the unique elements intrinsic to sexuality education instruction. Seven standards and associated indicators were developed that address professional disposition, diversity and equity, content knowledge, legal and professional ethics, planning, implementation, and assessment. The National Teacher-Preparation Standards for Sexuality Education represent an unprecedented unified effort to enable prospective health education teachers to become competent in teaching methodology, theory, practice of pedagogy, content, and skills, specific to sexuality education. Higher education will play a key role in ensuring the success of these standards. © 2014, American School Health Association.

  14. Ultrasound for assessing disease activity in IBD patients: a systematic review of activity scores.

    PubMed

    Bots, S; Nylund, K; Löwenberg, M; Gecse, K; Gilja, O H; D'Haens, G

    2018-04-19

    Ultrasound (US) indices for assessing disease activity in IBD patients have never been critically reviewed. We aimed to systematically review the quality and reliability of available US indices compared with reference standards for grading disease activity in IBD patients. PubMed, Embase and Medline were searched from 1990 until June 2017. Relevant publications were identified through full text review after initial screening by 2 investigators. Data on methodology and index characteristics were collected. Study quality was assessed with a modified version of the Quadas-2 tool for risk of bias assessment. Of 20 studies with a US index, 11 studies met the inclusion criteria. Of these 11 studies, 7 studied Crohn's disease (CD) and 4 ulcerative colitis (UC) activity indices. Parameters that were used in these indices included bowel wall thickness (BWT), Doppler signal (DS), wall layer stratification (WLS), compressibility, peristalsis, haustrations, fatty wrapping, contrast enhancement (CE) and strain pattern. Study quality was graded high in 5 studies, moderate in 3 studies and low in 3 studies. Ileocolonoscopy was used as the reference standard in 9 studies. In 1 study a combined index of ileocolonoscopy and barium contrast radiography was used as the reference standard, and in 1 study histology. Only 5 studies used an established endoscopic index for comparison with US. Several US indices for assessing disease activity in IBD are available; however, the methodology for their development was suboptimal in most studies. For the development of future indices a stringent methodological design is required.

  15. Use of evidence-based practice in an aid organisation: a proposal to deal with the variety in terminology and methodology.

    PubMed

    De Buck, Emmy; Pauwels, Nele S; Dieltjens, Tessa; Vandekerckhove, Philippe

    2014-03-01

    As part of its strategy Belgian Red Cross-Flanders underpins all its activities with evidence-based guidelines and systematic reviews. The aim of this publication is to describe in detail the methodology used to achieve this goal within an action-oriented organisation, in a timely and cost-effective way. To demonstrate transparency in our methods, we wrote a methodological charter describing the way in which we develop evidence-based materials to support our activities. Criteria were drawn up for deciding on project priority and the choice of different types of projects (scoping reviews, systematic reviews and evidence-based guidelines). While searching for rigorous and realistically attainable methodological standards, we encountered a wide variety in terminology and methodology used in the field of evidence-based practice. Terminologies currently being used by different organisations and institutions include systematic reviews, systematic literature searches, evidence-based guidelines, rapid reviews, pragmatic systematic reviews, and rapid response service. It is not always clear what the definition and methodology is behind these terms and whether they are used consistently. We therefore describe the terminology and methodology used by Belgian Red Cross-Flanders; criteria for making methodological choices and details on the methodology we use are given. In our search for an appropriate methodology, taking into account time and resource constraints, we encountered an enormous variety of methodological approaches and terminology used for evidence-based materials. In light of this, we recommend that authors of evidence-based guidelines and reviews are transparent and clear about the methodology used. To be transparent about our approach, we developed a methodological charter. This charter may inspire other organisations that want to use evidence-based methodology to support their activities.

  16. Cassini's Test Methodology for Flight Software Verification and Operations

    NASA Technical Reports Server (NTRS)

    Wang, Eric; Brown, Jay

    2007-01-01

    The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft comprises various subsystems, including the Attitude and Articulation Control Subsystem (AACS). Development of the AACS Flight Software (FSW) has been an ongoing effort spanning design, development and operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).

  17. Allocation of nursing care hours in a combined ophthalmic nursing unit.

    PubMed

    Navarro, V B; Stout, W A; Tolley, F M

    1995-04-01

    Traditional service configuration with separate nursing units for outpatient and inpatient care is becoming ineffective for new patient care delivery models. With the new configuration of a combined nursing unit, it was necessary to rethink traditional reporting methodologies and calculation of hours of care. This project management plan is an initial attempt to develop a standard costing/productivity model for a combined unit. The methodology developed from this plan measures nursing care hours for each patient population to determine the number of full time equivalents (FTEs) for a combined unit and allocates FTEs based on inpatient (IP), outpatient (OP), and emergency room (ER) volumes.
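
    For illustration, the hours-to-FTE arithmetic the plan describes reduces to a short calculation. The figures below (hours of care per encounter, annual volumes, paid hours per FTE and a productivity factor) are hypothetical stand-ins, not the unit's published numbers; this is a minimal sketch of the allocation logic only.

        # Hypothetical illustration of allocating FTEs across IP/OP/ER volumes.
        # Hours-per-encounter, volumes, and the productivity factor are invented;
        # a real costing model would substitute measured nursing care hours.
        HOURS_PER_FTE = 2080 * 0.85   # paid hours/year times an assumed productive fraction

        populations = {
            # population: (annual encounters, nursing care hours per encounter)
            "inpatient":  (1200, 6.5),
            "outpatient": (9000, 1.2),
            "emergency":  (2500, 2.0),
        }

        total_hours = {p: vol * hrs for p, (vol, hrs) in populations.items()}
        ftes = {p: h / HOURS_PER_FTE for p, h in total_hours.items()}

        for p, f in ftes.items():
            print(f"{p}: {total_hours[p]:.0f} care hours -> {f:.1f} FTEs")
        print(f"combined unit total: {sum(ftes.values()):.1f} FTEs")

    The split per population then becomes the allocation key for staffing the combined unit.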

  18. The TMIS life-cycle process document, revision A

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Technical and Management Information System (TMIS) Life-Cycle Process Document describes the processes that shall be followed in the definition, design, development, test, deployment, and operation of all TMIS products and data base applications. This document is a roll-out of the TMIS Standards Document (SSP 30546). Its purpose is to define the life-cycle methodology that the developers of all products and data base applications, and of any subsequent modifications, shall follow. Included in this methodology are descriptions of the tasks, deliverables, reviews, and approvals that are required before a product or data base application is accepted in the TMIS environment.

  19. Effective Learning Systems through Blended Teaching Modules in Adult Secondary Education Systems in Developing Nations: Need for Partnership

    ERIC Educational Resources Information Center

    Ike, Eucharia; Okechukwu, Ibeh Bartholomew

    2015-01-01

    We investigated methodological lessons in randomly selected adult secondary schools to construct a case for international partnership while examining education development in Nigeria. Standard database and web-based searches were conducted for publications between 1985 and 2012 on learning systems. This paper presents its absence and finds a heavy…

  20. Does Maltreatment Beget Maltreatment? A Systematic Review of the Intergenerational Literature

    PubMed Central

    Thornberry, Terence P.; Knight, Kelly E.; Lovegrove, Peter J.

    2014-01-01

    In this paper, we critically review the literature testing the cycle of maltreatment hypothesis which posits continuity in maltreatment across adjacent generations. That is, we examine whether a history of maltreatment victimization is a significant risk factor for the later perpetration of maltreatment. We begin by establishing 11 methodological criteria that studies testing this hypothesis should meet. They include such basic standards as using representative samples, valid and reliable measures, prospective designs, and different reporters for each generation. We identify 47 studies that investigated this issue and then evaluate them with regard to the 11 methodological criteria. Overall, most of these studies report findings consistent with the cycle of maltreatment hypothesis. Unfortunately, at the same time, few of them satisfy the basic methodological criteria that we established; indeed, even the stronger studies in this area only meet about half of them. Moreover, the methodologically stronger studies present mixed support for the hypothesis. As a result, the positive association often reported in the literature appears to be based largely on the methodologically weaker designs. Based on our systematic methodological review, we conclude that this small and methodologically weak body of literature does not provide a definitive test of the cycle of maltreatment hypothesis. We conclude that it is imperative to develop more robust and methodologically adequate assessments of this hypothesis to more accurately inform the development of prevention and treatment programs. PMID:22673145

  1. Drilled Shaft Foundations for Noise Barrier Walls and Slope Stabilization : Executive Summary

    DOT National Transportation Integrated Search

    2002-12-01

    This research project is focused on two primary objectives. The first objective relates to the development of a methodology for using the SPT (Standard Penetration Test) results to design the laterally loaded drilled shafts. The second objective aims...

  2. Collecting standardized urban health indicator data at an individual level for school-aged children living in urban areas: methods from EURO-URHIS 2.

    PubMed

    Pope, D; Katreniak, Z; Guha, J; Puzzolo, E; Higgerson, J; Steels, S; Woode-Owusu, M; Bruce, N; Birt, Christopher A; Ameijden, E van; Verma, A

    2017-05-01

    Measuring health and its determinants in urban populations is essential to effectively develop public health policies maximizing health gain within this context. Adolescents are important in this regard given that the origins of leading causes of morbidity and mortality develop pre-adulthood. Obtaining comprehensive, accurate and comparable information on adolescent urban health indicators from heterogeneous urban contexts is an important challenge. EURO-URHIS 2 aimed to develop standardized tools and methodologies for collecting data from adolescents across heterogeneous European urban contexts. Questionnaires were developed including (i) comprehensive assessment of urban health indicators from 7 pre-defined domains, (ii) use of previously validated questions from a literature review and other European surveys, (iii) translation/back-translation into European languages and (iv) piloting. Urban area-specific data collection methodologies were established through literature review, consultation and piloting. School-based surveys of 14-16-year-olds (400-800 per urban area) were conducted in 13 European countries (33 urban areas). Participation rates were high (80-100%) for students from schools taking part in the surveys from all urban areas, and data quality was generally good (low rates of missing/spoiled data). Overall, 13 850 questionnaires were collected, coded and entered for EURO-URHIS 2. Dissemination included production of urban area health profiles (allowing benchmarking for a number of important public health indicators in young people) and use of visualization tools as part of the EURO-URHIS 2 project. EURO-URHIS 2 has developed standardized survey tools and methodologies for assessing key measures of health and its determinants in adolescents from heterogeneous urban contexts and demonstrated the utility of these data to public health practitioners and policy makers. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  3. 76 FR 70680 - Small Business Size Standards: Real Estate and Rental and Leasing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-15

    ... industries and one sub-industry in North American Industry Classification System (NAICS) Sector 53, Real... industries grouped by NAICS Sector. SBA issued a White Paper entitled ``Size Standards Methodology'' and published in the October 21, 2009 issue of the Federal Register. That ``Size Standards Methodology'' is...

  4. RAMESES publication standards: meta-narrative reviews

    PubMed Central

    2013-01-01

    Background Meta-narrative review is one of an emerging menu of new approaches to qualitative and mixed-method systematic review. A meta-narrative review seeks to illuminate a heterogeneous topic area by highlighting the contrasting and complementary ways in which researchers have studied the same or a similar topic. No previous publication standards exist for the reporting of meta-narrative reviews. This publication standard was developed as part of the RAMESES (Realist And MEta-narrative Evidence Syntheses: Evolving Standards) project. The project's aim is to produce preliminary publication standards for meta-narrative reviews. Methods We (a) collated and summarized existing literature on the principles of good practice in meta-narrative reviews; (b) considered the extent to which these principles had been followed by published reviews, thereby identifying how rigor may be lost and how existing methods could be improved; (c) used a three-round online Delphi method with an interdisciplinary panel of national and international experts in evidence synthesis, meta-narrative reviews, policy and/or publishing to produce and iteratively refine a draft set of methodological steps and publication standards; (d) provided real-time support to ongoing meta-narrative reviews and the open-access RAMESES online discussion list so as to capture problems and questions as they arose; and (e) synthesized expert input, evidence review and real-time problem analysis into a definitive set of standards. Results We identified nine published meta-narrative reviews, provided real-time support to four ongoing reviews and captured questions raised in the RAMESES discussion list. Through analysis and discussion within the project team, we summarized the published literature, and common questions and challenges into briefing materials for the Delphi panel, comprising 33 members. Within three rounds this panel had reached consensus on 20 key publication standards, with an overall response rate of 90%. Conclusion This project used multiple sources to draw together evidence and expertise in meta-narrative reviews. For each item we have included an explanation for why it is important and guidance on how it might be reported. Meta-narrative review is a relatively new method for evidence synthesis and as experience and methodological developments occur, we anticipate that these standards will evolve to reflect further theoretical and methodological developments. We hope that these standards will act as a resource that will contribute to improving the reporting of meta-narrative reviews. To encourage dissemination of the RAMESES publication standards, this article is co-published in the Journal of Advanced Nursing and is freely accessible on Wiley Online Library (http://www.wileyonlinelibrary.com/journal/jan). Please see related articles http://www.biomedcentral.com/1741-7015/11/21 and http://www.biomedcentral.com/1741-7015/11/22 PMID:23360661

  5. The Harmonizing Outcome Measures for Eczema (HOME) roadmap: a methodological framework to develop core sets of outcome measurements in dermatology.

    PubMed

    Schmitt, Jochen; Apfelbacher, Christian; Spuls, Phyllis I; Thomas, Kim S; Simpson, Eric L; Furue, Masutaka; Chalmers, Joanne; Williams, Hywel C

    2015-01-01

    Core outcome sets (COSs) are consensus-derived minimum sets of outcomes to be assessed in a specific situation. COSs are being increasingly developed to limit outcome-reporting bias, allow comparisons across trials, and strengthen clinical decision making. Despite the increasing interest in outcomes research, methods to develop COSs have not yet been standardized. The aim of this paper is to present the Harmonizing Outcomes Measures for Eczema (HOME) roadmap for the development and implementation of COSs, which was developed on the basis of our experience in the standardization of outcome measurements for atopic eczema. Following the establishment of a panel representing all relevant stakeholders and a research team experienced in outcomes research, the scope and setting of the core set should be defined. The next steps are the definition of a core set of outcome domains such as symptoms or quality of life, followed by the identification or development and validation of appropriate outcome measurement instruments to measure these core domains. Finally, the consented COS needs to be disseminated, implemented, and reviewed. We believe that the HOME roadmap is a useful methodological framework to develop COSs in dermatology, with the ultimate goal of better decision making and promoting patient-centered health care.

  6. Intimate Partner Violence, 1993-2010

    MedlinePlus

    ... appendix table 2 for standard errors. *Due to methodological changes, use caution when comparing 2006 NCVS criminal ...

  7. Operational Impacts of Wind Energy Resources in the Bonneville Power Administration Control Area - Phase I Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Lu, Shuai

    2008-07-15

    This report presents a methodology developed to study the future impact of wind on BPA power system load following and regulation requirements. The methodology uses historical data and stochastic processes to simulate the load-balancing processes in the BPA power system, mimicking actual operations; the results are therefore close to reality, yet a study based on this methodology remains convenient to conduct. Existing methodologies for similar analyses include dispatch-model simulation and standard-deviation evaluation of load and wind data. Dispatch-model simulation is constrained by the design of the dispatch program, and standard-deviation evaluation is artificial in how it separates the load following and regulation requirements; neither usually reflects actual operational practice. The methodology used in this study provides not only capacity-requirement information but also the ramp-rate requirements for the system load following and regulation processes. The ramp-rate data can be used to evaluate generator response/maneuverability requirements, another capability the generation fleet needs for the smooth integration of wind energy. The study results are presented in an innovative way such that the increased generation capacity or ramp requirements are compared for two different years, across 24 hours a day, so the impact of different levels of wind energy on generation requirements at different times can be easily visualized.
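
    As a rough illustration of how capacity and ramp-rate requirements can be pulled from a minute-resolution net-load series: the sketch below uses an entirely synthetic day, and a simple rolling-average split between the scheduled (load-following) component and the regulation residual. This is only one simple convention, not the stochastic simulation methodology of the report itself.

        import numpy as np

        rng = np.random.default_rng(0)
        minutes = np.arange(24 * 60)                 # one synthetic day at 1-min resolution
        load = 7000 + 800 * np.sin(2 * np.pi * minutes / (24 * 60))  # MW, diurnal shape
        wind = np.clip(500 + np.cumsum(rng.normal(0, 4, minutes.size)), 0, None)  # MW
        net_load = load - wind

        # Split the balancing duty: a 30-min rolling mean stands in for the
        # load-following schedule; the residual is treated as regulation.
        kernel = np.ones(30) / 30
        schedule = np.convolve(net_load, kernel, mode="same")
        regulation = net_load - schedule

        cap_req = np.percentile(np.abs(regulation), 99.5)        # MW of regulating capacity
        ramp_req = np.percentile(np.abs(np.diff(schedule)), 99.5)  # MW/min of ramping
        print(f"regulation capacity ~ {cap_req:.0f} MW, ramp ~ {ramp_req:.1f} MW/min")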

  8. Design Development Test and Evaluation (DDT and E) Considerations for Safe and Reliable Human Rated Spacecraft Systems

    NASA Technical Reports Server (NTRS)

    Miller, James; Leggett, Jay; Kramer-White, Julie

    2008-01-01

    A team directed by the NASA Engineering and Safety Center (NESC) collected methodologies for how best to develop safe and reliable human rated systems and how to identify the drivers that provide the basis for assessing safety and reliability. The team also identified techniques, methodologies, and best practices to assure that NASA can develop safe and reliable human rated systems. The results are drawn from a wide variety of resources, from experts involved with the space program since its inception to the best-practices espoused in contemporary engineering doctrine. This report focuses on safety and reliability considerations and does not duplicate or update any existing references. Neither does it intend to replace existing standards and policy.

  9. Development of a Composite Delamination Fatigue Life Prediction Methodology

    NASA Technical Reports Server (NTRS)

    OBrien, Thomas K.

    2009-01-01

    Delamination is one of the most significant and unique failure modes in composite structures. Because of a lack of understanding of the consequences of delamination and the inability to predict delamination onset and growth, many composite parts are unnecessarily rejected upon inspection, both immediately after manufacture and while in service. NASA Langley is leading the efforts in the U.S. to develop a fatigue life prediction methodology for composite delamination using fracture mechanics. Research being performed to this end will be reviewed. Emphasis will be placed on the development of test standards for delamination characterization, incorporation of approaches for modeling delamination in commercial finite element codes, and efforts to mature the technology for use in design handbooks and certification documents.

  10. Test Standard Developed for Determining the Slow Crack Growth of Advanced Ceramics at Ambient Temperature

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Salem, Jonathan A.

    1998-01-01

    The service life of structural ceramic components is often limited by the process of slow crack growth. Therefore, it is important to develop an appropriate testing methodology for accurately determining the slow crack growth design parameters necessary for component life prediction. In addition, an appropriate test methodology can be used to determine the influences of component processing variables and composition on the slow crack growth and strength behavior of newly developed materials, thus allowing the component process to be tailored and optimized to specific needs. At the NASA Lewis Research Center, work to develop a standard test method to determine the slow crack growth parameters of advanced ceramics was initiated by the authors in early 1994 in the C 28 (Advanced Ceramics) committee of the American Society for Testing and Materials (ASTM). After about 2 years of required balloting, the draft written by the authors was approved and established as a new ASTM test standard: ASTM C 1368-97, Standard Test Method for Determination of Slow Crack Growth Parameters of Advanced Ceramics by Constant Stress-Rate Flexural Testing at Ambient Temperature. Briefly, the test method uses constant stress-rate testing to determine strengths as a function of stress rate at ambient temperature. Strengths are measured in a routine manner at four or more stress rates by applying constant displacement or loading rates. The slow crack growth parameters required for design are then estimated from a relationship between strength and stress rate. This new standard will be published in the Annual Book of ASTM Standards, Vol. 15.01, in 1998. Currently, a companion draft ASTM standard for determination of the slow crack growth parameters of advanced ceramics at elevated temperatures is being prepared by the authors and will be presented to the committee by the middle of 1998. Consequently, Lewis will maintain an active leadership role in advanced ceramics standardization within ASTM. In addition, the authors have been and are involved with several international standardization organizations including the Versailles Project on Advanced Materials and Standards (VAMAS), the International Energy Agency (IEA), and the International Organization for Standardization (ISO). The associated standardization activities involve fracture toughness, strength, elastic modulus, and the machining of advanced ceramics.
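
    For context, the strength/stress-rate relationship underlying constant stress-rate testing is commonly quoted in the power-law form below (a standard textbook statement, not quoted from ASTM C 1368 itself):

        \sigma_f = D\,\dot{\sigma}^{\,1/(n+1)}
        \quad\Longleftrightarrow\quad
        \log\sigma_f = \frac{1}{n+1}\,\log\dot{\sigma} + \log D

    where \sigma_f is the measured flexural strength and \dot{\sigma} the applied stress rate. A linear regression of log-strength on log-stress-rate then yields the slow crack growth parameters: n from the slope and D from the intercept.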

  11. Constraint Force Equation Methodology for Modeling Multi-Body Stage Separation Dynamics

    NASA Technical Reports Server (NTRS)

    Toniolo, Matthew D.; Tartabini, Paul V.; Pamadi, Bandu N.; Hotchko, Nathaniel

    2008-01-01

    This paper discusses a generalized approach to the multi-body separation problems in a launch vehicle staging environment based on constraint force methodology and its implementation into the Program to Optimize Simulated Trajectories II (POST2), a widely used trajectory design and optimization tool. This development facilitates the inclusion of stage separation analysis into POST2 for seamless end-to-end simulations of launch vehicle trajectories, thus simplifying the overall implementation and providing a range of modeling and optimization capabilities that are standard features in POST2. Analysis and results are presented for two test cases that validate the constraint force equation methodology in a stand-alone mode and its implementation in POST2.
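
    In textbook Lagrange-multiplier form (the paper's exact formulation may differ in detail), a constraint force approach augments each body's equations of motion with constraint reactions:

        M(q)\,\ddot{q} = F(q,\dot{q},t) + \Phi_q^{\mathsf{T}}\lambda, \qquad \Phi(q,t) = 0

    Differentiating the constraint twice gives \Phi_q\ddot{q} = \gamma(q,\dot{q},t), so the accelerations and multipliers solve the augmented linear system

        \begin{bmatrix} M & \Phi_q^{\mathsf{T}} \\ \Phi_q & 0 \end{bmatrix}
        \begin{bmatrix} \ddot{q} \\ -\lambda \end{bmatrix}
        =
        \begin{bmatrix} F \\ \gamma \end{bmatrix}

    The term \Phi_q^{\mathsf{T}}\lambda is the constraint (attachment) force between the mated stages; once the stages separate, the constraints are released and each body integrates freely.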

  12. Developing a methodology to inspect and assess conditions of short span structures on county roads in Wyoming.

    DOT National Transportation Integrated Search

    2015-12-01

    Ever since the introduction of the National Bridge Inspection Standards (NBIS) in 1971, there has been a : tremendous amount of effort put into bridge rehabilitation programs and safety inspections. The : Wyoming Department of Transportation (WYDOT) ...

  13. On the spot damage detection methodology for highway bridges during natural crises : tech transfer summary.

    DOT National Transportation Integrated Search

    2010-07-01

    The objective of this work was to develop a : low-cost portable damage detection tool to : assess and predict damage areas in highway : bridges. : The proposed tool was based on standard : vibration-based damage identification (VBDI) : techniques but...

  14. Impact of sampling techniques on measured stormwater quality data for small streams

    USDA-ARS?s Scientific Manuscript database

    Science-based sampling methodologies are needed to enhance water quality characterization for developing Total Maximum Daily Loads (TMDLs), setting appropriate water quality standards, and managing nonpoint source pollution. Storm event sampling, which is vital for adequate assessment of water qual...

  15. Approaches to biofilm-associated infections: the need for standardized and relevant biofilm methods for clinical applications.

    PubMed

    Malone, Matthew; Goeres, Darla M; Gosbell, Iain; Vickery, Karen; Jensen, Slade; Stoodley, Paul

    2017-02-01

    Biofilms are now widely accepted as a cause of chronic infection in human health and disease. Typically, biofilms show remarkable tolerance to many forms of treatment and to the host immune response. This has led to a vast increase in research to identify new (and sometimes old) anti-biofilm strategies that demonstrate effectiveness against these tolerant phenotypes. Areas covered: Unfortunately, a standardized methodological approach to biofilm models has not been adopted, leading to a large disparity between testing conditions. This has made it almost impossible to compare data across multiple laboratories, leaving large gaps in the evidence. Furthermore, many biofilm models testing anti-biofilm strategies aimed at the medical arena have not considered relevance to the intended application. This may explain why some in vitro models whose designs ignore the intended application fail when applied in vivo at the clinical level. Expert commentary: This review explores the issues that need to be considered in developing performance standards for anti-biofilm therapeutics and provides a rationale for standardizing models and methods that are clinically relevant. We also offer some rationale for why no standards currently exist.

  16. Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O`Kula, K.R.; Paik, I.K.; Chung, D.Y.

    1996-12-31

    Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.

  17. A stochastic conflict resolution model for trading pollutant discharge permits in river systems.

    PubMed

    Niksokhan, Mohammad Hossein; Kerachian, Reza; Amin, Pedram

    2009-07-01

    This paper presents an efficient methodology for developing pollutant discharge permit trading in river systems, considering the conflicting interests of the decision-makers and stakeholders involved. In this methodology, a trade-off curve between objectives is developed using a powerful and recently developed multi-objective genetic algorithm known as the Nondominated Sorting Genetic Algorithm-II (NSGA-II). The best non-dominated solution on the trade-off curve is selected using the Young conflict resolution theory, which considers the utility functions of the decision-makers and stakeholders of the system. These utility functions are related to the total treatment cost and a fuzzy risk of violating the water quality standards. The fuzzy risk is evaluated using Monte Carlo analysis. Finally, an optimization model provides the discharge permit trading policies. The practical utility of the proposed methodology in decision-making is illustrated through a realistic example of the Zarjub River in the northern part of Iran.
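
    To make the trade-off curve concrete: NSGA-II itself involves non-dominated sorting into fronts, crowding distance, and genetic operators, but its core is the non-domination test below. This minimal sketch filters hypothetical (treatment cost, fuzzy risk) pairs, both to be minimized, down to the Pareto front; the candidate values are invented.

        # Minimal non-domination filter for (treatment cost, fuzzy risk) pairs,
        # both minimized. q dominates p if q is no worse in both objectives and
        # is a different point; NSGA-II layers sorting, crowding distance, and
        # genetic operators on top of this test.
        def pareto_front(points):
            front = []
            for p in points:
                dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                                for q in points)
                if not dominated:
                    front.append(p)
            return sorted(front)

        candidates = [(10.0, 0.80), (12.0, 0.55), (15.0, 0.30),
                      (14.0, 0.60), (20.0, 0.28), (11.0, 0.90)]
        print(pareto_front(candidates))
        # -> [(10.0, 0.8), (12.0, 0.55), (15.0, 0.3), (20.0, 0.28)]

    The surviving points trace the cost-risk trade-off curve from which the conflict resolution step picks the preferred compromise.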

  18. New battery model considering thermal transport and partial charge stationary effects in photovoltaic off-grid applications

    NASA Astrophysics Data System (ADS)

    Sanz-Gorrachategui, Iván; Bernal, Carlos; Oyarbide, Estanis; Garayalde, Erik; Aizpuru, Iosu; Canales, Jose María; Bono-Nuez, Antonio

    2018-02-01

    The optimization of the battery pack in an off-grid photovoltaic application must consider the minimum sizing that assures the availability of the system under the worst environmental conditions. Thus, it is necessary to predict the evolution of the state of charge of the battery under incomplete daily charging and discharging processes and fluctuating temperatures over day-night cycles. Much previous development work has modeled only the short-term evolution of battery variables. Many works focus on on-line parameter estimation of the available charge, using standard or advanced estimators, but not on developing a model with predictive capabilities; moreover, they normally assume stable environmental conditions and standard charge-discharge patterns. As actual cycle patterns differ from the manufacturer's tests, batteries fail to perform as expected. This paper proposes a novel methodology to model these issues, with predictive capabilities to estimate the remaining charge in a battery after several solar cycles. A new non-linear state-space model is proposed as a basis, and the methodology to feed and train the model is introduced. The new methodology is validated using experimental data, showing only 5% error at temperatures above nominal.
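
    For intuition only, a schematic discrete-time state-space update is sketched below, showing how temperature derating and partial-charge effects might enter the state equation. The paper's actual non-linear model is trained from data; every coefficient here is invented for illustration.

        import numpy as np

        def soc_step(soc, current_a, temp_c, dt_h, capacity_ah=100.0):
            """One hypothetical discrete state-space update of state of charge.

            current_a > 0 charges, < 0 discharges; cold weather derates usable
            capacity and charge acceptance tapers near full charge (all
            coefficients are invented for this sketch)."""
            cap_eff = capacity_ah * (1.0 - 0.006 * max(25.0 - temp_c, 0.0))
            eta = 0.95 if current_a > 0 else 1.0   # simple charge efficiency
            if current_a > 0:
                eta *= max(0.0, 1.0 - soc**4)      # partial-charge acceptance taper
            soc_next = soc + eta * current_a * dt_h / cap_eff
            return float(np.clip(soc_next, 0.0, 1.0))

        # One synthetic day-night cycle: charge midday, rest, discharge overnight.
        soc = 0.6
        for hour in range(24):
            i = 10.0 if 8 <= hour < 14 else (-4.0 if hour >= 18 or hour < 4 else 0.0)
            soc = soc_step(soc, i, temp_c=10.0 + 10.0 * (8 <= hour < 18), dt_h=1.0)
        print(f"SOC after one simulated cycle: {soc:.2f}")

    Iterating such an update over several solar cycles is what lets a trained model predict whether the pack drifts toward depletion under worst-case weather.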

  19. Group Collaboration in Organizations: Architectures, Methodologies and Tools

    DTIC Science & Technology

    2002-03-01

    collaboration, its definition and characteristics was completed. Next, existing technologies and standards were studied as well as the ... (2000). For effective collaboration, the technology must support the dynamic world of work, be it individual, group and/or teamwork, as well as ... develop it or simply use it as the basis of discussion. If collaborators are all contributing to the development of a

  20. Methods for systematic reviews of health economic evaluations: a systematic review, comparison, and synthesis of method literature.

    PubMed

    Mathes, Tim; Walgenbach, Maren; Antoine, Sunya-Lee; Pieper, Dawid; Eikermann, Michaela

    2014-10-01

    The quality of systematic reviews of health economic evaluations (SR-HE) is often limited because of methodological shortcomings. One reason for this poor quality is that there are no established standards for the preparation of SR-HE. The objective of this study is to compare existing methods and suggest best practices for the preparation of SR-HE. To identify the relevant methodological literature on SR-HE, a systematic literature search was performed in Embase, Medline, the National Health Service (NHS) Economic Evaluation Database, the Health Technology Assessment Database, and the Cochrane methodology register, and webpages of international health technology assessment agencies were searched. The study selection was performed independently by 2 reviewers. Data were extracted by one reviewer and verified by a second reviewer. On the basis of the overlaps in the recommendations for the methods of SR-HE in the included papers, suggestions for best practices for the preparation of SR-HE were developed. Nineteen relevant publications were identified. The recommendations within them often differed. However, for most process steps there was some overlap between recommendations for the methods of preparation. The overlaps were taken as the basis on which to develop suggestions for the following process steps of preparation: defining the research question, developing eligibility criteria, conducting a literature search, selecting studies, assessing the methodological study quality, assessing transferability, and synthesizing data. The differences in the proposed recommendations are not always explainable by the focus on certain evaluation types, target audiences, or integration in the decision process. Currently, there seem to be no standard methods for the preparation of SR-HE. The suggestions presented here can contribute to the harmonization of methods for the preparation of SR-HE. © The Author(s) 2014.

  1. Analysis of Material Sample Heated by Impinging Hot Hydrogen Jet in a Non-Nuclear Tester

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Foote, John; Litchford, Ron

    2006-01-01

    A computational conjugate heat transfer methodology was developed and anchored with data obtained from a hot-hydrogen-jet-heated, non-nuclear materials tester, as a first step towards developing an efficient and accurate multiphysics, thermo-fluid computational methodology to predict environments for a hypothetical solid-core nuclear thermal engine thrust chamber. The computational methodology is based on a multidimensional, finite-volume, turbulent, chemically reacting, thermally radiating, unstructured-grid, and pressure-based formulation. The multiphysics invoked in this study include hydrogen dissociation kinetics and thermodynamics, turbulent flow, and convective, radiative, and conjugate heat transfer. Predicted hot-hydrogen jet and material surface temperatures were compared with measurements, and predicted solid temperatures were compared with those obtained with a standard heat transfer code. The interrogation of the physics revealed that hydrogen dissociation and recombination reactions are highly correlated with local temperature and are necessary for accurate prediction of the hot-hydrogen jet temperature.

  2. Cryogenic Insulation Standard Data and Methodologies Project

    NASA Technical Reports Server (NTRS)

    Summerfield, Burton; Thompson, Karen; Zeitlin, Nancy; Mullenix, Pamela; Fesmire, James; Swanger, Adam

    2015-01-01

    Extending recent developments in technical consensus standards for cryogenic thermal insulation systems, a preliminary inter-laboratory study of foam insulation materials was performed by NASA Kennedy Space Center and LeTourneau University (LETU). The initial focus was ambient-pressure cryogenic boiloff testing using the Cryostat-400 flat-plate instrument. Completion of a test facility at LETU has enabled direct, comparative testing using identical cryostat instruments and methods, and the production of standard thermal data sets for a number of materials under sub-ambient conditions. The two sets of measurements were analyzed and indicate reasonable agreement between the two laboratories. Based on cryogenic boiloff calorimetry, new equipment and methods for testing thermal insulation systems have been successfully developed. These boiloff instruments (or cryostats) include both flat-plate and cylindrical models and are applicable to a wide range of materials and test conditions. Test measurements are generally made at a large temperature difference (boundary temperatures of 293 K and 78 K are typical) and cover the full vacuum pressure range. Results are generally reported as effective thermal conductivity (ke) and mean heat flux (q) through the insulation system. The new cryostat instruments provide an effective and reliable way to characterize the thermal performance of materials under sub-ambient conditions; proven through thousands of tests of hundreds of material systems, they have supported a wide range of aerospace, industry, and research projects. Boiloff testing, when adequately coupled with a technical standards basis, is a cost-effective, field-representative methodology to test any material or system for applications at sub-ambient to cryogenic temperatures. A growing need for energy efficiency and cryogenic applications is creating worldwide demand for improved low-temperature thermal insulation systems, and the need for thermal characterization of these systems and materials raises a corresponding need for insulation test standards and thermal data targeted at cryogenic-vacuum applications. Such standards have a strong correlation to energy, transportation, and the environment, and to the advancement of new materials technologies in these areas. In conjunction with this project, two new standards on cryogenic insulation were recently published by ASTM International: C1774 and C740. Following the requirements of NPR 7120.10, Technical Standards for NASA Programs and Projects, the appropriate information in this report can be provided to the NASA Chief Engineer as input for NASA's annual report to NIST, as required by OMB Circular No. A-119, describing NASA's use of voluntary consensus standards and participation in the development of voluntary consensus standards and bodies.
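
    The data reduction behind boiloff calorimetry is essentially Fourier's law once the boiloff mass flow fixes the heat leak. The sketch below shows the flat-plate reduction; the flow rate, specimen geometry, and the liquid-nitrogen latent heat value are hypothetical or approximate, not values from the inter-laboratory study.

        # Reducing a flat-plate boiloff measurement to mean heat flux q and
        # effective thermal conductivity k_e = q * d / dT. Numbers are
        # hypothetical; LN2 latent heat is approximate.
        H_FG_LN2 = 199e3               # J/kg, latent heat of vaporization of LN2
        T_WARM, T_COLD = 293.0, 78.0   # K, typical boundary temperatures

        mdot = 2.0e-5      # kg/s, measured boiloff mass flow (hypothetical)
        thickness = 0.025  # m, specimen thickness
        area = 0.03        # m^2, heat-transfer area of the flat plate

        q_heat = mdot * H_FG_LN2                      # W, heat leak through specimen
        flux = q_heat / area                          # W/m^2, mean heat flux q
        k_e = flux * thickness / (T_WARM - T_COLD)    # W/(m*K)
        print(f"heat flux = {flux:.1f} W/m^2, k_e = {k_e*1000:.2f} mW/(m*K)")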

  3. Hanford Technical Basis for Multiple Dosimetry Effective Dose Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, Robin L.; Rathbone, Bruce A.

    2010-08-01

    The current method at Hanford for dealing with the results from multiple dosimeters worn during non-uniform irradiation is to use a compartmentalization method to calculate the effective dose (E). The method, as documented in the current version of Section 6.9.3 of the 'Hanford External Dosimetry Technical Basis Manual, PNL-MA-842,' is based on the compartmentalization method presented in the 1997 ANSI/HPS N13.41 standard, 'Criteria for Performing Multiple Dosimetry.' With the adoption of the ICRP 60 methodology in the 2007 revision to 10 CFR 835 came changes that have a direct effect on the compartmentalization method described in the 1997 ANSI/HPS N13.41 standard, and thus on the method used at Hanford. The ANSI/HPS N13.41 standard committee is in the process of updating the standard, but the changes have not yet been approved, and the drafts of the revision tend to align more with ICRP 60 than with the changes specified in the 2007 revision to 10 CFR 835. Therefore, a revised method for calculating effective dose from non-uniform external irradiation using a compartmental method was developed using the tissue weighting factors and remainder organs specified in 10 CFR 835 (2007).
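
    Schematically, a compartment method assigns each tissue's equivalent dose from the dosimeter judged to best represent that compartment and then forms E = sum over tissues of w_T * H_T. The sketch below uses the ICRP 60 tissue weighting factors adopted by 10 CFR 835 (2007); the dosimeter readings and the dosimeter-to-tissue mapping are purely illustrative, not the Hanford assignment.

        # Illustrative compartment sum E = sum_T w_T * H_T. Weighting factors
        # are the ICRP 60 values adopted by 10 CFR 835 (2007); readings (mSv)
        # and the compartment mapping are invented for the example.
        W_T = {"gonads": 0.20, "bone_marrow": 0.12, "colon": 0.12, "lung": 0.12,
               "stomach": 0.12, "bladder": 0.05, "breast": 0.05, "liver": 0.05,
               "oesophagus": 0.05, "thyroid": 0.05, "skin": 0.01,
               "bone_surface": 0.01, "remainder": 0.05}   # sums to 1.00

        dosimeter_mSv = {"chest": 1.8, "waist": 2.6, "head": 0.9}  # hypothetical
        tissue_to_dosimeter = {  # illustrative compartment assignment
            "lung": "chest", "breast": "chest", "oesophagus": "chest",
            "skin": "chest", "bone_surface": "chest", "bone_marrow": "chest",
            "thyroid": "head",
            "stomach": "waist", "colon": "waist", "bladder": "waist",
            "gonads": "waist", "liver": "waist", "remainder": "waist",
        }

        E = sum(W_T[t] * dosimeter_mSv[tissue_to_dosimeter[t]] for t in W_T)
        print(f"effective dose E = {E:.2f} mSv")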

  4. Development of a New Intelligent Joystick for People with Reduced Mobility.

    PubMed

    Mrabet, Makrem; Rabhi, Yassine; Fnaiech, Farhat

    2018-01-01

    Despite the diversity of electric wheelchairs, many people with physical limitations and seniors have difficulty using their standard joystick. As a result, they cannot meet their needs or ensure safe travel. Recent assistive technologies can help to give them autonomy and independence. This work deals with the real-time implementation of an artificial intelligence device to overcome these problems. Following a review of the literature from previous work, we present the methodology and process for implementing our intelligent control system on an electric wheelchair. The system is based on a neural algorithm that overcomes problems with standard joystick maneuvers such as the inability to move correctly in one direction. However, this implies the need for an appropriate methodology to map the position of the joystick handle. Experiments on a real wheelchair are carried out with real patients of the Mohamed Kassab National Institute Orthopedic, Physical and Functional Rehabilitation Hospital of Tunis. The proposed intelligent system gives good results compared to the use of a standard joystick.

  5. Development of a New Intelligent Joystick for People with Reduced Mobility

    PubMed Central

    Mrabet, Makrem; Fnaiech, Farhat

    2018-01-01

    Despite the diversity of electric wheelchairs, many people with physical limitations and seniors have difficulty using their standard joystick. As a result, they cannot meet their needs or ensure safe travel. Recent assistive technologies can help to give them autonomy and independence. This work deals with the real-time implementation of an artificial intelligence device to overcome these problems. Following a review of the literature from previous work, we present the methodology and process for implementing our intelligent control system on an electric wheelchair. The system is based on a neural algorithm that overcomes problems with standard joystick maneuvers such as the inability to move correctly in one direction. However, this implies the need for an appropriate methodology to map the position of the joystick handle. Experiments on a real wheelchair are carried out with real patients of the Mohamed Kassab National Institute Orthopedic, Physical and Functional Rehabilitation Hospital of Tunis. The proposed intelligent system gives good results compared to the use of a standard joystick. PMID:29765462

  6. Protocol for Standardizing High-to-Moderate Abundance Protein Biomarker Assessments Through an MRM-with-Standard-Peptides Quantitative Approach.

    PubMed

    Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H

    2016-01-01

    Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a six-order-of-magnitude range, from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.
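
    The regression step reduces to fitting a calibration line of light/heavy (endogenous/SIS) peak-area ratio against spiked concentration and inverting it for unknowns. The sketch below uses invented calibration points for a single surrogate peptide and an unweighted fit; actual workflows often use weighted regression and per-peptide acceptance criteria.

        import numpy as np

        # Hypothetical calibration points for one surrogate peptide:
        # spiked concentration (fmol/uL) vs. light/heavy SIS peak-area ratio.
        conc  = np.array([0.5, 1.0, 5.0, 10.0, 50.0, 100.0])
        ratio = np.array([0.048, 0.11, 0.52, 1.05, 5.1, 10.2])

        slope, intercept = np.polyfit(conc, ratio, 1)  # unweighted linear fit

        def back_calculate(measured_ratio):
            """Invert the calibration line to recover concentration."""
            return (measured_ratio - intercept) / slope

        print(f"slope={slope:.4f}, intercept={intercept:.4f}")
        print(f"sample at ratio 2.3 -> {back_calculate(2.3):.1f} fmol/uL")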

  7. A methodological, task-based approach to Procedure-Specific Simulations training.

    PubMed

    Setty, Yaki; Salzman, Oren

    2016-12-01

    Procedure-Specific Simulations (PSS) are realistic 3D simulations that provide a platform to practice complete surgical procedures in a virtual-reality environment. While PSS have the potential to improve surgeons' proficiency, there are no existing standards or guidelines for structured PSS development. We employ a unique platform inspired by game design to develop three-dimensional virtual-reality simulations of urethrovesical anastomosis during radical prostatectomy. 3D visualization is supported by stereo vision, providing a fully realistic view of the simulation. The software can be executed on any robotic surgery platform; specifically, we tested the simulation under a Windows environment on the RobotiX Mentor. Using the urethrovesical anastomosis simulation as a representative example, we present a task-based methodological approach to PSS training. The methodology provides tasks at increasing levels of difficulty, from a novice level of basic anatomy identification to an expert level that permits testing new surgical approaches. The modular methodology presented here can be easily extended to support more complex tasks. We foresee this methodology as a tool to integrate PSS as a complementary training process for surgical procedures.

  8. Using electrical resistance probes for moisture determination in switchgrass windrows

    USDA-ARS?s Scientific Manuscript database

    Determining moisture levels in windrowed biomass is important for both forage producers and researchers. Energy crops such as switchgrass have been troublesome when using the standard methods set for electrical resistance meters. The objectives of this study were to i) develop the methodologies need...

  9. 75 FR 24796 - FBI Records Management Division National Name Check Program Section User Fees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-06

    ... with generally accepted accounting principles, also include such expenses as capital investment... by RMD. Referencing OMB Circular A-25; the Statement of Federal Financial Accounting Standards (SFFAS... financial management directives, Grant Thornton developed a cost accounting methodology and related cost...

  10. Models for Effective Service Delivery in Special Education Programs

    ERIC Educational Resources Information Center

    Epler, Pam; Ross, Rorie

    2015-01-01

    Educators today are challenged with the task of designing curricula and standards for students of varying abilities. While technology and innovation steadily improve classroom learning, teachers and administrators continue to struggle in developing the best methodologies and practices for students with disabilities. "Models for Effective…

  11. Appraising the methodological quality of the clinical practice guideline for diabetes mellitus using the AGREE II instrument: a methodological evaluation.

    PubMed

    Radwan, Mahmoud; Akbari Sari, Ali; Rashidian, Arash; Takian, Amirhossein; Abou-Dagga, Sanaa; Elsous, Aymen

    2017-02-01

    To evaluate the methodological quality of the Palestinian Clinical Practice Guideline for Diabetes Mellitus using the Translated Arabic Version of the AGREE II. Methodological evaluation. A cross-cultural adaptation framework was followed to translate and develop a standardised Translated Arabic Version of the AGREE II. Palestinian primary healthcare centres. Sixteen appraisers independently evaluated the Clinical Practice Guideline for Diabetes Mellitus using the Translated Arabic Version of the AGREE II. Outcome: methodological quality of the diabetes guideline. The Translated Arabic Version of the AGREE II showed acceptable reliability and validity: internal consistency ranged between 0.67 and 0.88 (Cronbach's α), and the intra-class coefficient among appraisers ranged between 0.56 and 0.88. The quality of this guideline is low. The domains 'Scope and Purpose' and 'Clarity of Presentation' had the highest quality scores (66.7% and 61.5%, respectively), whereas the scores for 'Applicability', 'Stakeholder Involvement', 'Rigour of Development' and 'Editorial Independence' were the lowest (27%, 35%, 36.5%, and 40%, respectively). The findings suggest that the quality of this Clinical Practice Guideline is disappointingly low. To improve the quality of current and future guidelines, the AGREE II instrument is strongly recommended as a gold standard for developing, evaluating and updating Palestinian Clinical Practice Guidelines. Future guidelines can be improved by setting specific strategies to overcome implementation barriers with respect to economic considerations, engaging all relevant end-users and patients, ensuring a rigorous methodology for searching, selecting and synthesising the evidence and recommendations, and addressing potential conflicts of interest within the development group.
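
    The percentage domain scores quoted above follow (as we understand the AGREE II manual) from scaling the summed item ratings between the minimum and maximum possible. The sketch below shows that arithmetic; the 4x3 ratings matrix is hypothetical, not data from this study.

        # Scaled AGREE II domain score from raw item ratings (1-7 scale).
        # The ratings matrix (appraisers x items) below is hypothetical.
        def scaled_domain_score(ratings):
            n_appraisers = len(ratings)
            n_items = len(ratings[0])
            obtained = sum(sum(row) for row in ratings)
            min_possible = 1 * n_items * n_appraisers
            max_possible = 7 * n_items * n_appraisers
            return 100.0 * (obtained - min_possible) / (max_possible - min_possible)

        rigour_ratings = [[3, 4, 2], [4, 4, 3], [2, 3, 3], [4, 5, 2]]
        print(f"Rigour of Development: {scaled_domain_score(rigour_ratings):.1f}%")
        # -> 37.5% for this invented matrix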

  12. Toward a food service quality management system for compliance with the Mediterranean dietary model.

    PubMed

    Grigoroudis, Evangelos; Psaroudaki, Antonia; Diakaki, Christina

    2013-01-01

    The traditional diet of Cretan people in the 1960s is the basis of the Mediterranean dietary model. This article investigates the potential of this model to inspire proposals of meals by food-serving businesses, and suggests a methodology for the development of a quality management system, which will certify the delivery of food service according to this dietary model. The proposed methodology is built upon the principles and structure of the ISO 9001:2008 quality standard to enable integration with other quality, environmental, and food safety management systems.

  13. Tracer methodology: an appropriate tool for assessing compliance with accreditation standards?

    PubMed

    Bouchard, Chantal; Jean, Olivier

    2017-10-01

    Tracer methodology has been used by Accreditation Canada since 2008 to collect evidence on the quality and safety of care and services, and to assess compliance with accreditation standards. Given the importance of this methodology in the accreditation program, the objective of this study is to assess the quality of the methodology and identify its strengths and weaknesses. A mixed quantitative and qualitative approach was adopted to evaluate consistency, appropriateness, effectiveness and stakeholder synergy in applying the methodology. An online questionnaire was sent to 468 Accreditation Canada surveyors. According to surveyors' perceptions, tracer methodology is an effective tool for collecting useful, credible and reliable information to assess compliance with Qmentum program standards and priority processes. The results show good coherence between methodology components (appropriateness of the priority processes evaluated, activities to evaluate a tracer, etc.). The main weaknesses are the time constraints faced by surveyors and management's lack of cooperation during the evaluation of tracers. The inadequate amount of time allowed for the methodology to be applied properly raises questions about the quality of the information obtained. This study paves the way for a future, more in-depth exploration of the identified weaknesses to help the accreditation organization make more targeted improvements to the methodology. Copyright © 2016 John Wiley & Sons, Ltd.

  14. A Methodology for the Integration of a Mechanistic Source Term Analysis in a Probabilistic Framework for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk-significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk-significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.

  15. Methodology for estimation of total body composition in laboratory mammals

    NASA Technical Reports Server (NTRS)

    Pace, N.; Rahlmann, D. F.; Smith, A. H.

    1979-01-01

    A standardized dissection and chemical analysis procedure was developed for individual animals of several species in the size range mouse to monkey (15 g to 15 kg). The standardized procedure permits rigorous comparisons to be made both interspecifically and intraspecifically of organ weights and gross chemical composition in mammalian species series, and was applied successfully to laboratory mice, hamsters, rats, guinea pigs, and rabbits, as well as to macaque monkeys. The procedure is described in detail.

  16. PCB congener analysis with Hall electrolytic conductivity detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edstrom, R.D.

    1989-01-01

    This work reports the development of an analytical methodology for the analysis of PCB congeners based on integrating relative retention data provided by other researchers. The retention data were transposed into a multiple-retention-marker system which provided good precision in the calculation of relative retention indices for PCB congener analysis. Analytical run times for the developed methodology were approximately one hour using a commercially available GC capillary column. A Tracor Model 700A Hall Electrolytic Conductivity Detector (HECD) was employed in the GC detection of Aroclor standards and environmental samples. Responses by the HECD provided good sensitivity and were reasonably predictable. Ten response factors were calculated based on the molar chlorine content of each homolog group. Homolog distributions were determined for Aroclors 1016, 1221, 1232, 1242, 1248, 1254, 1260, and 1262, along with binary and ternary mixtures of the same. These distributions were compared with distributions reported by other researchers using electron capture detection as well as chemical ionization mass spectrometric methodologies. Homolog distributions acquired by the HECD methodology showed good correlation with the previously mentioned methodologies. The developed analytical methodology was used in the analysis of bluefish (Pomatomas saltatrix) and weakfish (Cynoscion regalis) collected from the York River, lower James River and lower Chesapeake Bay in Virginia. Total PCB concentrations were calculated and homolog distributions were constructed from the acquired data. Increases in total PCB concentrations were found in the fish samples collected from the lower James River and lower Chesapeake Bay during the fall of 1985.
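
    One common way to realize a multiple-retention-marker system is to interpolate an index linearly between the two markers bracketing each peak. The abstract does not give the exact formula used, so the sketch below is generic, with hypothetical marker times and index values.

        import bisect

        # Generic marker-bracketing interpolation for a relative retention index.
        # Marker retention times (min) and assigned index values are hypothetical.
        marker_times = [5.2, 12.8, 21.5, 30.1, 38.7]
        marker_index = [100.0, 200.0, 300.0, 400.0, 500.0]

        def retention_index(t):
            """Linearly interpolate an index for retention time t between markers."""
            i = bisect.bisect_right(marker_times, t) - 1
            i = max(0, min(i, len(marker_times) - 2))  # clamp to outermost segment
            t0, t1 = marker_times[i], marker_times[i + 1]
            r0, r1 = marker_index[i], marker_index[i + 1]
            return r0 + (r1 - r0) * (t - t0) / (t1 - t0)

        print(f"peak at 17.3 min -> relative retention index {retention_index(17.3):.1f}")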

  17. Combining natural background levels (NBLs) assessment with indicator kriging analysis to improve groundwater quality data interpretation and management.

    PubMed

    Ducci, Daniela; de Melo, M Teresa Condesso; Preziosi, Elisabetta; Sellerino, Mariangela; Parrone, Daniele; Ribeiro, Luis

    2016-11-01

    The natural background level (NBL) concept is revisited and combined with indicator kriging to analyze the spatial distribution of groundwater quality within a groundwater body (GWB). The aim is to provide a methodology to easily identify areas with the same probability of exceeding a given threshold (which may be a groundwater quality criterion, a standard, or a recommended limit for selected properties and constituents). Three case studies with different hydrogeological settings, located in two countries (Portugal and Italy), are used to derive NBLs using the preselection method and to validate the proposed methodology, illustrating its main advantages over conventional statistical water quality analysis. Indicator kriging analysis was used to create probability maps of three potential groundwater contaminants. The results clearly indicate the areas within a groundwater body where concentrations exceed the drinking water standards or even the local NBL and cannot be justified by a geogenic origin. The combined methodology facilitates the management of groundwater quality because it allows for the spatial interpretation of NBL values. Copyright © 2016 Elsevier B.V. All rights reserved.
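
    The indicator step is simple to sketch: binary-code each well against the NBL-derived threshold, then krige the indicators so the interpolated surface reads as an exceedance probability. The coordinates, concentrations, and threshold below are hypothetical, and pykrige is just one library choice; this is a sketch, not the paper's implementation.

        import numpy as np
        from pykrige.ok import OrdinaryKriging  # one common kriging library choice

        # Hypothetical monitoring wells: coordinates (km) and concentrations (ug/L).
        x = np.array([0.5, 1.8, 2.2, 3.4, 4.1, 4.9, 1.1, 3.0])
        y = np.array([0.7, 1.1, 3.5, 0.9, 2.8, 4.2, 2.6, 3.9])
        conc = np.array([12.0, 45.0, 8.0, 60.0, 15.0, 5.0, 30.0, 22.0])
        threshold = 25.0  # e.g., an NBL or quality standard for this analyte

        indicator = (conc > threshold).astype(float)  # 1 = exceeds the threshold

        ok = OrdinaryKriging(x, y, indicator, variogram_model="spherical")
        gridx = np.linspace(0.0, 5.0, 50)
        gridy = np.linspace(0.0, 5.0, 50)
        prob, ss = ok.execute("grid", gridx, gridy)  # kriged indicators ~ P(exceedance)
        # In practice, clip kriged values to [0, 1] before mapping them.
        print(f"max mapped exceedance probability: {float(prob.max()):.2f}")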

  18. Detecting the optic disc boundary in digital fundus images using morphological, edge detection, and feature extraction techniques.

    PubMed

    Aquino, Arturo; Gegundez-Arias, Manuel Emilio; Marin, Diego

    2010-11-01

    Optic disc (OD) detection is an important step in developing systems for automated diagnosis of various serious ophthalmic pathologies. This paper presents a new template-based methodology for segmenting the OD from digital retinal images. This methodology uses morphological and edge detection techniques followed by the Circular Hough Transform to obtain a circular OD boundary approximation. It requires a pixel located within the OD as initial information; for this purpose, a location methodology based on a voting-type algorithm is also proposed. The algorithms were evaluated on the 1200 images of the publicly available MESSIDOR database. The location procedure succeeded in 99% of cases, taking an average computational time of 1.67 s with a standard deviation of 0.14 s. The segmentation algorithm rendered an average overlap between automated segmentations and true OD regions of 86%, with an average computational time of 5.69 s and a standard deviation of 0.54 s. Moreover, a discussion of the advantages and disadvantages of the models most commonly used for OD segmentation is also presented in this paper.
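
    A schematic OpenCV pipeline in the same spirit is sketched below: morphological closing on the green channel to suppress the vessel tree, then a circular Hough transform for the disc boundary. The file name, kernel size, and Hough parameters are hypothetical and would need tuning; this is not the authors' implementation.

        import cv2
        import numpy as np

        img = cv2.imread("fundus.png")        # hypothetical retinal image
        green = img[:, :, 1]                  # green channel: strong vessel/OD contrast

        # Morphological closing suppresses the dark vessel tree crossing the disc.
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
        closed = cv2.morphologyEx(green, cv2.MORPH_CLOSE, kernel)

        # Circular Hough transform; HOUGH_GRADIENT runs Canny edge detection
        # internally (param1 = upper Canny threshold, param2 = accumulator vote threshold).
        circles = cv2.HoughCircles(closed, cv2.HOUGH_GRADIENT, dp=1, minDist=200,
                                   param1=120, param2=30, minRadius=40, maxRadius=90)
        if circles is not None:
            cx, cy, r = np.round(circles[0, 0]).astype(int)
            cv2.circle(img, (cx, cy), r, (0, 0, 255), 2)  # draw the OD approximation
            cv2.imwrite("fundus_od.png", img)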

  19. Standardization and application of an index of community integrity for waterbirds in the Chesapeake Bay, USA

    USGS Publications Warehouse

    Prosser, Diann J.; Nagel, Jessica L.; Marban, Paul; Ze, Luo; Day, Daniel D.; Erwin, R. Michael

    2017-01-01

    In recent decades, there has been increasing interest in the application of ecological indices to assess ecosystem condition in response to anthropogenic activities. An Index of Waterbird Community Integrity was previously developed for the Chesapeake Bay, USA. However, the scoring criteria were not defined well enough to generate scores for new species that were not observed in the original study. The goal of this study was to explicitly define the scoring criteria for the existing index and to develop index scores for all waterbirds of the Chesapeake Bay. The standardized index then was applied to a case study investigating the relationship between waterbird community integrity and shoreline development during late summer and late fall (2012–2014) using an alternative approach to survey methodology, which allowed for greater area coverage compared to the approach used in the original study. Index scores for both seasons were negatively related to percentage of developed shorelines. Providing these updated tools using the detailed scoring system will facilitate future application to new species or development of the index in other estuaries worldwide. This methodology allows for consistent cross-study comparisons and can be combined with other community integrity indices, allowing for more effective estuarine management.

  20. Working group written presentation: Solar radiation

    NASA Technical Reports Server (NTRS)

    Slemp, Wayne S.

    1989-01-01

    The members of the Solar Radiation Working Group identified two major solar radiation technology needs: (1) generation of a long-term flight data base; and (2) development of a standardized UV testing methodology. The flight data base should include 1- to 5-year exposures of optical filters, windows, thermal control coatings, hardened coatings, polymeric films, and structural composites. The UV flux and wavelength distribution, as well as particulate radiation flux and energy, should be measured during this flight exposure. A standard testing methodology is needed to establish techniques for highly accelerated UV exposure that correlate well with flight test data. Currently, UV exposure can only be accelerated to about 3 solar constants while still correlating well with flight exposure data. With space missions lasting up to 30 years, acceleration rates of 30 to 100X are needed for efficient laboratory testing.

  1. Considerations on methodological challenges for water footprint calculations.

    PubMed

    Thaler, S; Zessner, M; De Lis, F Bertran; Kreuzinger, N; Fehringer, R

    2012-01-01

    We have investigated how different approaches for water footprint (WF) calculations lead to different results, taking sugar beet production and sugar refining as examples. To a large extent, results obtained from any WF calculation reflect the method used and the assumptions made. Real irrigation data for 59 European sugar beet growing areas showed that a widely used simple approach estimated irrigation water inadequately: the method overestimated blue water and underestimated green water usage. Depending on the chosen (available) water quality standard, the final grey WF can differ by a factor of 10 or more. We conclude that further development and standardisation of the WF is needed to reach comparable and reliable results. A special focus should be on standardisation of the grey WF methodology based on receiving water quality standards.
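
    The grey WF is commonly computed as the pollutant load divided by the difference between the ambient quality standard and the natural background concentration; a small sketch with invented numbers shows how the choice of standard alone can move the result by roughly a factor of 10:

```python
# Sketch: grey water footprint sensitivity to the chosen quality standard,
# using the common formulation WF_grey = L / (c_max - c_nat). The load and
# concentrations below are illustrative, not the paper's data.

def grey_wf(load_kg: float, c_max: float, c_nat: float) -> float:
    """Grey WF in m^3 for a pollutant load (kg) and concentrations (kg/m^3)."""
    return load_kg / (c_max - c_nat)

load = 10.0            # kg nitrogen leached per hectare, say
c_nat = 0.0001         # natural background, kg/m^3 (0.1 mg/L)
for c_max in (0.0112, 0.0012):   # two candidate standards (11.2 vs 1.2 mg/L N)
    print(f"c_max={c_max*1000:.1f} mg/L -> "
          f"grey WF = {grey_wf(load, c_max, c_nat):,.0f} m^3")
```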

  2. My Teaching Philosophy

    ERIC Educational Resources Information Center

    Mambetaliev, Askarbek

    2007-01-01

    Since the collapse of the Soviet Union the Ministry of Education of the Kyrgyz Republic has included a few social science disciplines in the list of the Educational State Standards, though the content of these subjects and teaching methodologies are still weak. One of the problems, which I constantly face in Kyrgyzstan when developing a new…

  3. Using Photo-Interviewing as Tool for Research and Evaluation.

    ERIC Educational Resources Information Center

    Dempsey, John V.; Tucker, Susan A.

    Arguing that photo-interviewing yields richer data than that usually obtained from verbal interviewing procedures alone, it is proposed that this method of data collection be added to "standard" methodologies in instructional development research and evaluation. The process, as described in this paper, consists of using photographs of…

  4. Testing for change in structural elements of forest inventories

    Treesearch

    Melinda Vokoun; David Wear; Robert Abt

    2009-01-01

    In this article we develop a methodology to test for changes in the underlying relationships between measures of forest productivity (structural elements) and site characteristics, herein referred to as structural changes, using standard forest inventories. Changes in measures of forest growing stock volume and number of trees for both...

  5. Developing a methodology to inspect and assess conditions of short span structures on county roads in Wyoming : [project brief].

    DOT National Transportation Integrated Search

    2015-12-01

    Even though the FHWA's National Bridge Inspection Standards are a very comprehensive tool for bridge inspection, they only apply to structures with spans of more than 20 feet. WYDOT inspects these larger bridges on regular intervals, but there ...

  6. Screening Methodologies to Support Risk and Technology Reviews (RTR): A Case Study Analysis

    EPA Science Inventory

    The Clean Air Act establishes a two-stage regulatory process for addressing emissions of hazardous air pollutants (HAPs) from stationary sources. In the first stage, the Act requires the EPA to develop technology-based standards for categories of industrial sources. We have lar...

  7. Comparing Accessibility Auditing Methods for Ebooks: Crowdsourced, Functionality-Led Versus Web Content Methodologies.

    PubMed

    James, Abi; Draffan, E A; Wald, Mike

    2017-01-01

    This paper presents a gap analysis between crowdsourced functional accessibility evaluations of ebooks conducted by non-experts and the technical accessibility standards employed by developers. It also illustrates how combining these approaches can provide more appropriate information for a wider group of users with print impairments.

  8. Authentic Geometry Adventures

    ERIC Educational Resources Information Center

    Minetola, Janice; Serr, Konnie; Nelson, Laureen

    2012-01-01

    Teachers have the opportunity to capitalize on a vast array of real-world, two- and three-dimensional objects as they guide students in developing a conceptual understanding of geometric shapes. An important component of the NCTM Standards is to use teaching methodologies that engage children in making real-world connections to the mathematics…

  9. Optimization of protocol design: a path to efficient, lower cost clinical trial execution

    PubMed Central

    Malikova, Marina A

    2016-01-01

    Managing clinical trials requires strategic planning and efficient execution. In order to achieve timely delivery of important clinical trial outcomes, it is useful to establish standardized trial management guidelines and develop a robust scoring methodology for evaluating study protocol complexity. This review explores the challenges clinical teams face in developing protocols to ensure that the right patients are enrolled and the right data are collected to demonstrate that a drug is safe and efficacious, while managing study costs and study complexity based on a proposed comprehensive scoring model. Key factors to consider when developing protocols, and techniques to minimize complexity, are discussed. A methodology to identify processes at the planning phase, and approaches to increase fiscal return and mitigate fiscal compliance risk for clinical trials, are also addressed. PMID:28031939

  10. Rotary-wing flight test methods used for the evaluation of night vision devices

    NASA Astrophysics Data System (ADS)

    Haworth, Loran A.; Blanken, Christopher J.; Szoboszlay, Zoltan P.

    2001-08-01

    The U.S. Army Aviation mission includes flying helicopters at low altitude, at night, and in adverse weather. Night Vision Devices (NVDs) are used to supplement the pilot's visual cues for night flying. As the military requirement to conduct night helicopter operations has increased, quantifying the impact of helicopter flight operations with NVD technology in the Degraded Visual Environment (DVE) has become increasingly important. Aeronautical Design Standard-33 (ADS-33) was introduced to update rotorcraft handling qualities requirements and to quantify the impact of NVDs in the DVE. As reported in this paper, the flight test methodology in ADS-33 has been used by the handling qualities community to measure the impact of NVDs on task performance in the DVE. This paper provides the background and rationale behind the development of the ADS-33 flight test methodology for handling qualities in the DVE, as well as the test methodology developed for human factors assessment of NVDs in the DVE. Lessons learned, shortcomings, and recommendations for NVD flight test methodology are also provided.

  11. [Development of New Mathematical Methodology in Air Traffic Control for the Analysis of Hybrid Systems]

    NASA Technical Reports Server (NTRS)

    Hermann, Robert

    1997-01-01

    The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; in joint work by the PI and M. Oberguggenberger, it has been extended to apply to ordinary differential systems of the type encountered in control. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves mixtures of two sorts of mathematical problems: (1) the 'continuous' dynamics of a standard control type described by ordinary differential equations (ODE) of the form dx/dt = f(x, u); and (2) the discrete lattice dynamics of cellular automata (CA). Most of the CA literature involves a discretization of a partial differential equation system of the type encountered in physics problems (e.g., fluid and gas problems). Both of these directions require considerable thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than consider CA as 'discretizations' of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of jet bundles and/or the sheaf-theoretic techniques of topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.

  12. Annual Research Progress Report.

    DTIC Science & Technology

    1979-09-30

    will be trained in SLRL test procedures and the methodology will be developed for the incorporation of test materials into the standard rearing diet ...requirements exist for system software maintenance and development of software to report dosing data, to calculate diet preparation data, to manage collected...influence of diet and exercise on myoglobin and metmyoglobin reductase were evaluated in the rat. The activity of metmyoglobin reductase was

  13. Bridging the gap in complementary and alternative medicine research: manualization as a means of promoting standardization and flexibility of treatment in clinical trials of acupuncture.

    PubMed

    Schnyer, Rosa N; Allen, John J B

    2002-10-01

    An important methodological challenge encountered in acupuncture clinical research involves the design of treatment protocols that help ensure standardization and replicability while allowing for the necessary flexibility to tailor treatments to each individual. Manualization of protocols used in clinical trials of acupuncture and other traditionally-based complementary and alternative medicine (CAM) systems facilitates the systematic delivery of replicable and standardized, yet individually-tailored treatments. To facilitate high-quality CAM acupuncture research by outlining a method for the systematic design and implementation of protocols used in CAM clinical trials based on the concept of treatment manualization. A series of treatment manuals was developed to systematically articulate the Chinese medical theoretical and clinical framework for a given Western-defined illness, to increase the quality and consistency of treatment, and to standardize the technical aspects of the protocol. In all, three manuals were developed for National Institutes of Health (NIH)-funded clinical trials of acupuncture for depression, spasticity in cerebral palsy, and repetitive stress injury. In Part I, the rationale underlying these manuals and the challenges encountered in creating them are discussed, and qualitative assessments of their utility are provided. In Part II, a methodology to develop treatment manuals for use in clinical trials is detailed, and examples are given. A treatment manual provides a precise way to train and supervise practitioners, enable evaluation of conformity and competence, facilitate the training process, and increase the ability to identify the active therapeutic ingredients in clinical trials of acupuncture.

  14. Sharing on Web 3d Models of Ancient Theatres. a Methodological Workflow

    NASA Astrophysics Data System (ADS)

    Scianna, A.; La Guardia, M.; Scaduto, M. L.

    2016-06-01

    In the last few years, the need to share knowledge of Cultural Heritage (CH) on the Web through navigable 3D models has increased. This need requires the availability of Web-based virtual reality systems and 3D WebGIS. In order to make the information available to all stakeholders, these instruments should be powerful and at the same time very user-friendly. However, research and experiments carried out so far show that a standardized methodology doesn't exist. This is due both to the complexity and dimensions of the geometric models to be published, on the one hand, and to the excessive costs of hardware and software tools, on the other. In light of this background, the paper describes a methodological approach for creating 3D models of CH, freely exportable on the Web, based on HTML5 and free and open source software. HTML5, supporting the WebGL standard, allows the exploration of 3D spatial models using the most common Web browsers (Chrome, Firefox, Safari, Internet Explorer). The methodological workflow described here has been tested in the construction of a multimedia geo-spatial platform developed for three-dimensional exploration and documentation of the ancient theatres of Segesta and of Carthage, and the surrounding landscapes. The experimental application has allowed us to explore the potential and limitations of sharing WebGL-based 3D CH models on the Web. Sharing capabilities could be extended by defining suitable geospatial Web services based on the capabilities of HTML5 and WebGL technology.

  15. Health level 7 development framework for medication administration.

    PubMed

    Kim, Hwa Sun; Cho, Hune

    2009-01-01

    We propose the creation of a standard data model for medication administration activities through the development of a clinical document architecture using the Health Level 7 Development Framework process based on an object-oriented analysis and the development method of Health Level 7 Version 3. Medication administration is the most common activity performed by clinical professionals in healthcare settings. A standardized information model and structured hospital information system are necessary to achieve evidence-based clinical activities. A virtual scenario is used to demonstrate the proposed method of administering medication. We used the Health Level 7 Development Framework and other tools to create the clinical document architecture, which allowed us to illustrate each step of the Health Level 7 Development Framework in the administration of medication. We generated an information model of the medication administration process as one clinical activity. It should become a fundamental conceptual model for understanding international-standard methodology by healthcare professionals and nursing practitioners with the objective of modeling healthcare information systems.

  16. Non-invasive monitoring of chewing and swallowing for objective quantification of ingestive behavior

    PubMed Central

    Sazonov, Edward; Schuckers, Stephanie; Lopez-Meyer, Paulo; Makeyev, Oleksandr; Sazonova, Nadezhda; Melanson, Edward L.; Neuman, Michael

    2008-01-01

    A methodology for studying ingestive behavior by non-invasive monitoring of swallowing (deglutition) and chewing (mastication) has been developed. The target application for the developed methodology is to study the behavioral patterns of food consumption and to produce volumetric and weight estimates of energy intake. Monitoring is non-invasive, based on detecting swallowing by a sound sensor located over the laryngopharynx or by a bone conduction microphone, and detecting chewing through a below-the-ear strain sensor. The proposed sensors may be implemented in a wearable monitoring device, thus enabling monitoring of ingestive behavior in free living individuals. In this paper, the goals in the development of this methodology are two-fold. First, a system comprised of sensors, related hardware and software for multimodal data capture is designed for data collection in a controlled environment. Second, a protocol is developed for manual scoring of chewing and swallowing for use as a gold standard. The multi-modal data capture was tested by measuring chewing and swallowing in twenty one volunteers during periods of food intake and quiet sitting (no food intake). Video footage and sensor signals were manually scored by trained raters. An inter-rater reliability study for three raters conducted on a sample set of 5 subjects resulted in high average intra-class correlation coefficients of 0.996 for bites, 0.988 for chews, and 0.98 for swallows. The collected sensor signals and the resulting manual scores will be used in future research as a gold standard for further assessment of sensor design, development of automatic pattern recognition routines, and study of the relationship between swallowing/chewing and ingestive behavior. PMID:18427161
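
    For readers unfamiliar with the reliability statistic reported above, a self-contained sketch of a two-way intra-class correlation, ICC(2,1), on an invented subjects-by-raters count matrix (the paper does not state which ICC variant was used):

```python
# Sketch: two-way intra-class correlation, ICC(2,1), of the kind used to
# report inter-rater reliability for chew/swallow counts; the data matrix
# below (subjects x raters) is made up.

import numpy as np

def icc2_1(scores: np.ndarray) -> float:
    """ICC(2,1) for an n-subjects x k-raters matrix (Shrout & Fleiss)."""
    n, k = scores.shape
    grand = scores.mean()
    ms_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    sse = ((scores - scores.mean(axis=1, keepdims=True)
                   - scores.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_err = sse / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

counts = np.array([[31, 30, 32],      # swallows scored by 3 raters
                   [24, 25, 24],
                   [40, 41, 39],
                   [18, 18, 17],
                   [27, 28, 27]], dtype=float)
print(f"ICC(2,1) = {icc2_1(counts):.3f}")
```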

  17. Non-invasive monitoring of chewing and swallowing for objective quantification of ingestive behavior.

    PubMed

    Sazonov, Edward; Schuckers, Stephanie; Lopez-Meyer, Paulo; Makeyev, Oleksandr; Sazonova, Nadezhda; Melanson, Edward L; Neuman, Michael

    2008-05-01

    A methodology for studying ingestive behavior by non-invasive monitoring of swallowing (deglutition) and chewing (mastication) has been developed. The target application for the developed methodology is to study the behavioral patterns of food consumption and to produce volumetric and weight estimates of energy intake. Monitoring is non-invasive, based on detecting swallowing by a sound sensor located over the laryngopharynx or by a bone-conduction microphone, and detecting chewing through a below-the-ear strain sensor. The proposed sensors may be implemented in a wearable monitoring device, thus enabling monitoring of ingestive behavior in free-living individuals. In this paper, the goals in the development of this methodology are two-fold. First, a system comprising sensors, related hardware and software for multi-modal data capture is designed for data collection in a controlled environment. Second, a protocol is developed for manual scoring of chewing and swallowing for use as a gold standard. The multi-modal data capture was tested by measuring chewing and swallowing in 21 volunteers during periods of food intake and quiet sitting (no food intake). Video footage and sensor signals were manually scored by trained raters. An inter-rater reliability study for three raters conducted on a sample set of five subjects resulted in high average intra-class correlation coefficients of 0.996 for bites, 0.988 for chews and 0.98 for swallows. The collected sensor signals and the resulting manual scores will be used in future research as a gold standard for further assessment of sensor design, development of automatic pattern recognition routines and study of the relationship between swallowing/chewing and ingestive behavior.

  18. Measures of outdoor play and independent mobility in children and youth: A methodological review.

    PubMed

    Bates, Bree; Stone, Michelle R

    2015-09-01

    Declines in children's outdoor play have been documented globally, which are partly due to heightened restrictions around children's independent mobility. Literature on outdoor play and children's independent mobility is increasing, yet no paper has summarized the various methodological approaches used. A methodological review could highlight most commonly used measures and comprehensive research designs that could result in more standardized methodological approaches. Methodological review. A standardized protocol guided a methodological review of published research on measures of outdoor play and children's independent mobility in children and youth (0-18 years). Online searches of 8 electronic databases were conducted and studies included if they contained a subjective/objective measure of outdoor play or children's independent mobility. References of included articles were scanned to identify additional articles. Twenty-four studies were included on outdoor play, and twenty-three on children's independent mobility. Study designs were diverse. Common objective measures included accelerometry, global positioning systems and direct observation; questionnaires, surveys and interviews were common subjective measures. Focus groups, activity logs, monitoring sheets, travel/activity diaries, behavioral maps and guided tours were also utilized. Questionnaires were used most frequently, yet few studies used the same questionnaire. Five studies employed comprehensive, mixed-methods designs. Outdoor play and children's independent mobility have been measured using a wide variety of techniques, with only a few studies using similar methodologies. A standardized methodological approach does not exist. Future researchers should consider including both objective measures (accelerometry and global positioning systems) and subjective measures (questionnaires, activity logs, interviews), as more comprehensive designs will enhance understanding of each multidimensional construct. Creating a standardized methodological approach would improve study comparisons. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  19. Sizing Single Cantilever Beam Specimens for Characterizing Facesheet/Core Peel Debonding in Sandwich Structure

    NASA Technical Reports Server (NTRS)

    Ratcliffe, James G.

    2010-01-01

    This technical publication details part of an effort focused on the development of a standardized facesheet/core peel debonding test procedure. The purpose of the test is to characterize facesheet/core peel in sandwich structure, accomplished through the measurement of the critical strain energy release rate associated with the debonding process. Following an examination of previously developed tests and a recent evaluation of a selection of these methods, a single cantilever beam (SCB) specimen was identified as being a promising candidate for establishing such a standardized test procedure. The objective of the work described here was to begin development of a protocol for conducting a SCB test that will render the procedure suitable for standardization. To this end, a sizing methodology was developed to ensure appropriate SCB specimen dimensions are selected for a given sandwich system. Application of this method to actual sandwich systems yielded SCB specimen dimensions that would be practical for use. This study resulted in the development of a practical SCB specimen sizing method, which should be well-suited for incorporation into a standardized testing protocol.

  20. Handling the difficult Brownfields issues: A case study of privately funded remediation to residential standards update 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McLeod, D.P.; Ridley, A.P.

    Most Brownfields projects are based on either direct or indirect government funding. This paper describes a more unusual scenario: the remediation of a contaminated industrial site for re-use as residential property. Using the ASTM RBCA risk assessment methodology and an innovative fixed fee arrangement between Woodward-Clyde Consultants and the site owner, they developed and successfully implemented a plan to clean up the site to residential standards over a twelve (12) month time period.

  1. Standard methodologies for virus research in Apis mellifera

    USDA-ARS?s Scientific Manuscript database

    The international research network COLOSS (Prevention of honey bee COlony LOSSes) was established to coordinate efforts towards improving the health of the western honey bee at the global level. The COLOSS BEEBOOK contains a collection of chapters intended to standardize methodologies for monitoring ...

  2. Standard methodologies for Nosema apis and N. ceranae research

    USDA-ARS?s Scientific Manuscript database

    The international research network COLOSS (Prevention of honey bee COlony LOSSes) was established to coordinate efforts towards improving the health of the western honey bee at the global level. The COLOSS BEEBOOK contains a collection of chapters intended to standardize methodologies for monitoring ...

  3. Methodologic ramifications of paying attention to sex and gender differences in clinical research.

    PubMed

    Prins, Martin H; Smits, Kim M; Smits, Luc J

    2007-01-01

    Methodologic standards for studies on sex and gender differences should be developed to improve reporting of studies and facilitate their inclusion in systematic reviews. The essence of these studies lies within the concept of effect modification. This article reviews important methodologic issues in the design and reporting of pharmacogenetic studies. Differences in effect based on sex or gender should preferably be expressed in absolute terms (risk differences) to facilitate clinical decisions on treatment. Information on the distribution of potential effect modifiers or prognostic factors should be available to prevent a biased comparison of differences in effect between genotypes. Other considerations included the possibility of selective nonavailability of biomaterial and the choice of a statistical model to study effect modification. To ensure high study quality, additional methodologic issues should be taken into account when designing and reporting studies on sex and gender differences.
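
    A small sketch of reporting a sex-stratified effect in absolute terms, as the article recommends, computing a risk difference with a Wald confidence interval per stratum; the event counts are invented:

```python
# Sketch: expressing a sex-based difference in treatment effect as a risk
# difference with a Wald confidence interval; the event counts are invented.

import math

def risk_difference(e1, n1, e0, n0, z=1.96):
    p1, p0 = e1 / n1, e0 / n0
    rd = p1 - p0
    se = math.sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
    return rd, (rd - z * se, rd + z * se)

# Events / N on the active arm minus events / N on control, per stratum.
rd_women, ci_women = risk_difference(30, 200, 50, 200)
rd_men, ci_men = risk_difference(28, 200, 35, 200)
print(f"women: RD={rd_women:+.3f}, 95% CI {ci_women}")
print(f"men:   RD={rd_men:+.3f}, 95% CI {ci_men}")
```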

  4. Toward the First Data Acquisition Standard in Synthetic Biology.

    PubMed

    Sainz de Murieta, Iñaki; Bultelle, Matthieu; Kitney, Richard I

    2016-08-19

    This paper describes the development of a new data acquisition standard for synthetic biology. This comprises the creation of a methodology that is designed to capture all the data, metadata, and protocol information associated with biopart characterization experiments. The new standard, called DICOM-SB, is based on the highly successful Digital Imaging and Communications in Medicine (DICOM) standard in medicine. A data model is described which has been specifically developed for synthetic biology. The model is a modular, extensible data model for the experimental process, which can optimize data storage for large amounts of data. DICOM-SB also includes services orientated toward the automatic exchange of data and information between modalities and repositories. DICOM-SB has been developed in the context of systematic design in synthetic biology, which is based on the engineering principles of modularity, standardization, and characterization. The systematic design approach utilizes the design, build, test, and learn design cycle paradigm. DICOM-SB has been designed to be compatible with and complementary to other standards in synthetic biology, including SBOL. In this regard, the software provides effective interoperability. The new standard has been tested by experiments and data exchange between Nanyang Technological University in Singapore and Imperial College London.

  5. Model driven development of clinical information systems using openEHR.

    PubMed

    Atalag, Koray; Yang, Hong Yul; Tempero, Ewan; Warren, Jim

    2011-01-01

    openEHR and the recent international standard (ISO 13606) defined a model driven software development methodology for health information systems. However, there is little evidence in the literature describing implementation, especially for desktop clinical applications. This paper presents an implementation pathway using .Net/C# technology for Microsoft Windows desktop platforms. An endoscopy reporting application driven by openEHR Archetypes and Templates has been developed. A set of novel GUI directives has been defined and presented which guides the automatic graphical user interface generator to render widgets properly. We also describe the development steps and important design decisions, from modelling to the final software product. This might provide guidance for other developers and form the evidence required for the adoption of these standards by vendors and national programs alike.

  6. Constructing post-surgical discharge instructions through a Delphi consensus methodology.

    PubMed

    Scott, Aaron R; Sanderson, Cody J; Rush, Augustus J; Alore, Elizabeth A; Naik, Aanand D; Berger, David H; Suliburk, James W

    2018-05-01

    Patient education materials are a crucial part of physician-patient communication. We hypothesize that available discharge instructions are difficult to read and fail to address necessary topics. Our objective is to evaluate the readability and content of surgical discharge instructions, using thyroidectomy to develop standardized discharge materials. Thyroidectomy discharge materials were analyzed for readability and assessed for content. Fifteen endocrine surgeons participated in a modified Delphi consensus panel to select necessary topics. Using readability best practices, we created standardized discharge instructions which included all selected topics. The panel evaluated 40 topics, selected 23, deemed 4 inappropriate, consolidated 5, and did not reach consensus on 8 topics after 4 rounds. The evaluated instructions' reading levels ranged from grade 6.5 to 13.2; none contained all consensus topics. Current post-surgical thyroidectomy discharge instructions are more difficult to read than recommended by literacy standards and omit consensus warning signs of major complications. Our easy-to-read discharge instructions cover pertinent topics and may enhance patient education. Delphi methodology is useful for developing post-surgical instructions. Patient education materials need appropriate readability levels and content. We recommend the Delphi method to select content using consensus expert opinion whenever higher-level data are lacking. Copyright © 2017 Elsevier B.V. All rights reserved.
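
    A sketch of the kind of readability check involved, using the Flesch-Kincaid grade-level formula with a crude vowel-group syllable counter; the paper does not specify which readability instrument was applied:

```python
# Sketch: Flesch-Kincaid grade-level check of the kind used to judge
# whether discharge instructions meet literacy recommendations. The
# syllable counter is a rough vowel-group heuristic, not a clinical tool.

import re

def count_syllables(word: str) -> int:
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def fk_grade(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syllables / len(words) - 15.59

sample = ("Keep the incision clean and dry. Call your surgeon if you notice "
          "swelling, numbness around the mouth, or trouble breathing.")
print(f"Flesch-Kincaid grade: {fk_grade(sample):.1f}")
```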

  7. Standards for the user interface - Developing a user consensus. [for Space Station Information System

    NASA Technical Reports Server (NTRS)

    Moe, Karen L.; Perkins, Dorothy C.; Szczur, Martha R.

    1987-01-01

    The user support environment (USE) which is a set of software tools for a flexible standard interactive user interface to the Space Station systems, platforms, and payloads is described in detail. Included in the USE concept are a user interface language, a run time environment and user interface management system, support tools, and standards for human interaction methods. The goals and challenges of the USE are discussed as well as a methodology based on prototype demonstrations for involving users in the process of validating the USE concepts. By prototyping the key concepts and salient features of the proposed user interface standards, the user's ability to respond is greatly enhanced.

  8. Query Health: standards-based, cross-platform population health surveillance

    PubMed Central

    Klann, Jeffrey G; Buck, Michael D; Brown, Jeffrey; Hadley, Marc; Elmore, Richard; Weber, Griffin M; Murphy, Shawn N

    2014-01-01

    Objective Understanding population-level health trends is essential to effectively monitor and improve public health. The Office of the National Coordinator for Health Information Technology (ONC) Query Health initiative is a collaboration to develop a national architecture for distributed, population-level health queries across diverse clinical systems with disparate data models. Here we review Query Health activities, including a standards-based methodology, an open-source reference implementation, and three pilot projects. Materials and methods Query Health defined a standards-based approach for distributed population health queries, using an ontology based on the Quality Data Model and Consolidated Clinical Document Architecture, Health Quality Measures Format (HQMF) as the query language, the Query Envelope as the secure transport layer, and the Quality Reporting Document Architecture as the result language. Results We implemented this approach using Informatics for Integrating Biology and the Bedside (i2b2) and hQuery for data analytics and PopMedNet for access control, secure query distribution, and response. We deployed the reference implementation at three pilot sites: two public health departments (New York City and Massachusetts) and one pilot designed to support Food and Drug Administration post-market safety surveillance activities. The pilots were successful, although improved cross-platform data normalization is needed. Discussion This initiative resulted in a standards-based methodology for population health queries, a reference implementation, and revision of the HQMF standard. It also informed future directions regarding interoperability and data access for ONC's Data Access Framework initiative. Conclusions Query Health was a test of the learning health system that supplied a functional methodology and reference implementation for distributed population health queries that has been validated at three sites. PMID:24699371

  9. Query Health: standards-based, cross-platform population health surveillance.

    PubMed

    Klann, Jeffrey G; Buck, Michael D; Brown, Jeffrey; Hadley, Marc; Elmore, Richard; Weber, Griffin M; Murphy, Shawn N

    2014-01-01

    Understanding population-level health trends is essential to effectively monitor and improve public health. The Office of the National Coordinator for Health Information Technology (ONC) Query Health initiative is a collaboration to develop a national architecture for distributed, population-level health queries across diverse clinical systems with disparate data models. Here we review Query Health activities, including a standards-based methodology, an open-source reference implementation, and three pilot projects. Query Health defined a standards-based approach for distributed population health queries, using an ontology based on the Quality Data Model and Consolidated Clinical Document Architecture, Health Quality Measures Format (HQMF) as the query language, the Query Envelope as the secure transport layer, and the Quality Reporting Document Architecture as the result language. We implemented this approach using Informatics for Integrating Biology and the Bedside (i2b2) and hQuery for data analytics and PopMedNet for access control, secure query distribution, and response. We deployed the reference implementation at three pilot sites: two public health departments (New York City and Massachusetts) and one pilot designed to support Food and Drug Administration post-market safety surveillance activities. The pilots were successful, although improved cross-platform data normalization is needed. This initiative resulted in a standards-based methodology for population health queries, a reference implementation, and revision of the HQMF standard. It also informed future directions regarding interoperability and data access for ONC's Data Access Framework initiative. Query Health was a test of the learning health system that supplied a functional methodology and reference implementation for distributed population health queries that has been validated at three sites. Published by the BMJ Publishing Group Limited.

  10. Protocol—the RAMESES II study: developing guidance and reporting standards for realist evaluation

    PubMed Central

    Greenhalgh, Trisha; Wong, Geoff; Jagosh, Justin; Greenhalgh, Joanne; Manzano, Ana; Westhorp, Gill; Pawson, Ray

    2015-01-01

    Introduction Realist evaluation is an increasingly popular methodology in health services research. For realist evaluations (RE) this project aims to: develop quality and reporting standards and training materials; build capacity for undertaking and critically evaluating them; produce resources and training materials for lay participants, and those seeking to involve them. Methods To achieve our aims, we will: (1) Establish management and governance infrastructure; (2) Recruit an interdisciplinary Delphi panel of 35 participants with diverse relevant experience of RE; (3) Summarise current literature and expert opinion on best practice in RE; (4) Run an online Delphi panel to generate and refine items for quality and reporting standards; (5) Capture 'real world' experiences and challenges of RE—for example, by providing ongoing support to realist evaluations, hosting the RAMESES JISCmail list on realist research, and feeding problems and insights from these into the deliberations of the Delphi panel; (6) Produce quality and reporting standards; (7) Collate examples of the learning and training needs of researchers, students, reviewers and lay members in relation to RE; (8) Develop, deliver and evaluate training materials for RE and deliver training workshops; (9) Develop and evaluate information and resources for patients and other lay participants in RE (eg, draft template information sheets and model consent forms); and (10) Disseminate training materials and other resources. Planned outputs: (1) Quality and reporting standards and training materials for RE. (2) Methodological support for RE. (3) Increase in capacity to support and evaluate RE. (4) Accessible, plain-English resources for patients and the public participating in RE. Discussion Realist evaluation is a relatively new approach to evaluation and its overall place in the field is not yet fully established. As with all primary research approaches, guidance on quality assurance and uniform reporting is an important step towards improving quality and consistency. PMID:26238395

  11. 49 CFR 1111.9 - Procedural schedule in cases using simplified standards.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) SURFACE TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION RULES OF PRACTICE COMPLAINT AND INVESTIGATION... the simplified standards: (1) In cases relying upon the Simplified-SAC methodology: Day 0—Complaint... dominance. (b) Defendant's second disclosure. In cases using the Simplified-SAC methodology, the defendant...

  12. Credit risk migration rates modeling as open systems: A micro-simulation approach

    NASA Astrophysics Data System (ADS)

    Landini, S.; Uberti, M.; Casellina, S.

    2018-05-01

    The last financial crisis of 2008 stimulated the development of new Regulatory Criteria (commonly known as Basel III) that pushed banking activity to become more prudential, in both the short and the long run. As is well known, in 2014 the International Accounting Standards Board (IASB) promulgated the new International Financial Reporting Standard 9 (IFRS 9) for financial instruments, which became effective in January 2018. Since the delayed recognition of credit losses on loans was identified as a weakness in existing accounting standards, the IASB has introduced an Expected Loss model that requires more timely recognition of credit losses. Specifically, the new standards require entities to account for expected losses both from when impairments are first recognized and over the full loan lifetime; moreover, a clear preference toward forward-looking models is expressed. In this new framework, a re-thinking of the widespread standard theoretical approach on which the well-known prudential model is founded is necessary. The aim of this paper is therefore to define an original methodological approach to migration rates modeling for credit risk which is innovative with respect to the standard method, from the point of view of a bank as well as from a regulatory perspective. Accordingly, the proposed non-standard approach considers a portfolio as an open sample, allowing for entries and exits as well as migrations of stayers. While being consistent with the empirical observations, this open-sample approach contrasts with the standard closed-sample method. In particular, this paper offers a methodology to integrate the outcomes of the standard closed-sample method within the open-sample perspective while removing some of the assumptions of the standard method. Three main conclusions can be drawn in terms of economic capital provision: (a) based on the Markovian hypothesis with an a-priori absorbing state at default, the standard closed-sample method should be abandoned because, by construction, it predicts lenders' bankruptcy; (b) to obtain more reliable estimates in line with the new regulatory standards, the sample used to estimate migration rate matrices for credit risk should include both entries and exits; (c) the static eigen-decomposition standard procedure for forecasting migration rates should be replaced with a stochastic process dynamics methodology, conditioning forecasts on macroeconomic scenarios.
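
    A toy sketch contrasting the standard closed-sample estimator (row-normalized migration counts among survivors) with an open-sample variant that adds explicit entry and exit flows; all counts are synthetic:

```python
# Sketch: migration (transition) rate estimation from rating histories.
# The closed-sample estimator row-normalizes counts over the surviving
# cohort only; the open-sample variant adds explicit "entry" and "exit"
# flows, as the paper advocates. Ratings and counts are synthetic.

import numpy as np

states = ["A", "B", "C", "D"]            # D = default (absorbing in the
                                         # standard closed-sample view)
# Counts of obligors moving from row state to column state over one year.
counts = np.array([[80, 12, 3, 1],
                   [10, 70, 15, 5],
                   [2, 12, 60, 10],
                   [0, 0, 0, 25]], dtype=float)

closed = counts / counts.sum(axis=1, keepdims=True)   # standard estimator

# Open-sample: append an "exit" column (repaid/withdrawn) and an "entry"
# row (new business), so row sums describe everything that can happen.
exits = np.array([6.0, 8.0, 9.0, 0.0])
entries = np.array([30.0, 45.0, 20.0, 0.0, 0.0])
open_counts = np.hstack([counts, exits[:, None]])
open_counts = np.vstack([open_counts, entries])
open_rates = open_counts / np.maximum(
    open_counts.sum(axis=1, keepdims=True), 1.0)

print("closed-sample P(A->D)   =", round(closed[0, 3], 4))
print("open-sample   P(A->exit) =", round(open_rates[0, 4], 4))
```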

  13. Technical Support Document: Development of the Advanced Energy Design Guide for K-12 Schools--30% Energy Savings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pless, S.; Torcellini, P.; Long, N.

    2007-09-01

    This Technical Support Document describes the process and methodology for the development of the Advanced Energy Design Guide for K-12 School Buildings (K-12 AEDG), a design guidance document intended to provide recommendations for achieving 30% energy savings in K-12 schools over the levels contained in ANSI/ASHRAE/IESNA Standard 90.1-1999, Energy Standard for Buildings Except Low-Rise Residential Buildings. The 30% energy savings target is the first step toward achieving net-zero energy schools: schools that, on an annual basis, draw no more energy from outside sources than they generate on site from renewable energy sources.

  14. A semi-automated methodology for finding lipid-related GO terms.

    PubMed

    Fan, Mengyuan; Low, Hong Sang; Wenk, Markus R; Wong, Limsoon

    2014-01-01

    Although semantic similarity in Gene Ontology (GO) and other approaches may be used to find similar GO terms, there is as yet no method to systematically find a class of GO terms sharing a common property with high accuracy (e.g., involving human curation). We have developed a methodology to address this issue and applied it to identify lipid-related GO terms, owing to the important and varied roles of lipids in many biological processes. Our methodology finds lipid-related GO terms in a semi-automated manner, requiring only moderate manual curation. We first obtain a list of lipid-related gold-standard GO terms by keyword search and manual curation. Then, based on the hypothesis that co-annotated GO terms share similar properties, we develop a machine learning method that expands the list of lipid-related terms from the gold standard. The terms predicted most likely to be lipid related are examined by a human curator following specific curation rules to confirm the class labels. The structure of GO is also exploited to help reduce the curation effort. The prediction and curation cycle is repeated until no further lipid-related term is found. Our approach has covered a high proportion, if not all, of lipid-related terms with relatively high efficiency. http://compbio.ddns.comp.nus.edu.sg/∼lipidgo. © The Author(s) 2014. Published by Oxford University Press.
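
    A minimal sketch of one expand-then-curate round, assuming bag-of-genes co-annotation features and a scikit-learn classifier; the term IDs, feature matrix, and curate() stub are invented stand-ins for the paper's actual pipeline:

```python
# Sketch: one round of the expand-then-curate loop for finding GO terms
# that share a property, assuming co-annotation indicator features and
# scikit-learn; the term IDs, features, and curate() stub are invented.

import numpy as np
from sklearn.linear_model import LogisticRegression

terms = ["GO:0006629", "GO:0008610", "GO:0006412", "GO:0006099", "GO:0046486"]
# Rows: terms; columns: hypothetical gene-annotation indicator features.
X = np.array([[1, 1, 0, 1, 0],
              [1, 1, 0, 0, 1],
              [0, 0, 1, 0, 0],
              [0, 1, 1, 0, 0],
              [1, 0, 0, 1, 1]], dtype=float)
labels = {0: 1, 2: 0}               # curated seed labels (1 = lipid-related)

def curate(term: str) -> int:
    """Stand-in for the human curation step described in the paper."""
    return int(term in {"GO:0008610", "GO:0046486"})

while True:
    idx = sorted(labels)
    clf = LogisticRegression().fit(X[idx], [labels[i] for i in idx])
    unlabeled = [i for i in range(len(terms)) if i not in labels]
    if not unlabeled:
        break
    probs = clf.predict_proba(X[unlabeled])[:, 1]
    if probs.max() < 0.5:                          # nothing promising left
        break
    best = unlabeled[int(np.argmax(probs))]        # most likely candidate
    labels[best] = curate(terms[best])             # send to the curator

print({terms[i]: lab for i, lab in labels.items()})
```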

  15. Determination of the coefficient of dynamic friction between coatings of alumina and metallic materials

    NASA Astrophysics Data System (ADS)

    Santos, A.; Córdoba, E.; Ramírez, Z.; Sierra, C.; Ortega, Y.

    2017-12-01

    This project aims to determine the coefficient of dynamic friction between micrometric-size coatings of alumina and metallic materials (steel and aluminium). The methodology used to achieve this objective consisted of four phases. First, a procedure was developed that allowed the coefficient of dynamic friction between two materials in contact to be determined using a Pin on Disk machine built to the specifications of the ASTM G99-05 standard (Standard test method for wear tests with a Pin on Disk machine). The methodology was then verified through tests between steel-steel and steel-aluminium pairs, because these values are widely reported in the literature. Third, deposits of alumina particles of micrometric size were made on a steel substrate through thermal spraying by flame. Finally, tests were carried out between pins of steel or aluminium and the alumina coating to determine the coefficients of dynamic friction between these surfaces. The results allowed us to verify that the developed methodology is valid for obtaining coefficients of dynamic friction between surfaces in contact, since the error percentages were 3.5% and 2.1% for steel-steel and aluminium-steel, respectively; additionally, it was found that the coefficient of friction between steel and the alumina coating is 0.36, and between aluminium and the alumina coating is 0.25.
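
    The underlying calculation is simply mu = F_tangential / F_normal, averaged over the steady-state portion of the run; a sketch with a simulated force trace (not measured data):

```python
# Sketch: dynamic friction coefficient from pin-on-disk data, mu = F/N,
# averaged over the steady-state portion of the run (ASTM G99 defines the
# test setup; the force trace below is simulated, not measured).

import numpy as np

normal_load = 10.0                               # N, applied on the pin
rng = np.random.default_rng(1)
t = np.linspace(0, 600, 6000)                    # 10 min sliding test
friction_force = 3.6 + 0.05 * rng.standard_normal(t.size)  # tangential F, N
friction_force[:600] *= np.linspace(1.6, 1.0, 600)         # run-in transient

steady = t > 120                                 # discard the first 2 min
mu = friction_force[steady].mean() / normal_load
spread = friction_force[steady].std() / normal_load
print(f"mu_dynamic = {mu:.2f} +/- {spread:.2f}")
```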

  16. Autism and family home movies: a comprehensive review.

    PubMed

    Palomo, Rubén; Belinchón, Mercedes; Ozonoff, Sally

    2006-04-01

    In this article, we focus on the early development of autism studied through family home movies. We review all investigations published in English that met specific methodological standards, including the use of comparison samples, coding blind to group membership, and adequate levels of interrater reliability. After discussing in detail the pros and cons of the home-movie methodology, we review the results of all empirical studies conducted to date. We then present a summary of the features found consistently across studies that differentiate autism from typical development and mental retardation in the first 2 years of life. How family home movies can contribute to our understanding of the regression phenomenon is also addressed. Finally, the results are interpreted from both a theoretical and clinical point of view.

  17. Economic evaluation of health promotion interventions for older people: do applied economic studies meet the methodological challenges?

    PubMed

    Huter, Kai; Dubas-Jakóbczyk, Katarzyna; Kocot, Ewa; Kissimova-Skarbek, Katarzyna; Rothgang, Heinz

    2018-01-01

    In the light of demographic developments, health promotion interventions for older people are gaining importance. In addition to the methodological challenges arising from the economic evaluation of health promotion interventions in general, there are specific methodological problems for the particular target group of older people. Four main methodological challenges are discussed in the literature: measurement and valuation of informal caregiving, accounting for productivity costs, effects of unrelated costs in added life years, and the inclusion of 'beyond-health' benefits. This paper focuses on the question whether, and to what extent, these specific methodological requirements are actually met in applied health economic evaluations. Following a systematic review of pertinent health economic evaluations, the included studies are analysed on the basis of four assessment criteria derived from methodological debates on the economic evaluation of health promotion interventions in general and economic evaluations targeting older people in particular. Of the 37 studies included in the systematic review, only very few include cost and outcome categories discussed as being of specific relevance to the assessment of health promotion interventions for older people. The few studies that consider these aspects use very heterogeneous methods; thus there is no common methodological standard. There is a strong need for the development of guidelines to achieve better comparability and to include cost categories and outcomes that are relevant for older people. Disregarding these methodological obstacles could implicitly lead to discrimination against the elderly in terms of health promotion and disease prevention and, hence, an age-based rationing of public health care.

  18. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
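
    A bare-bones stress-strength Monte Carlo estimate of failure probability, the basic calculation underlying such probabilistic approaches; the distributions and their parameters are illustrative only:

```python
# Sketch: stress-strength Monte Carlo estimate of failure probability,
# contrasted with the single ratio a deterministic safety factor reports.
# Distributions and parameters are invented for illustration.

import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
stress = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)   # MPa
strength = rng.normal(loc=420.0, scale=30.0, size=n)             # MPa

p_fail = np.mean(stress >= strength)
se = np.sqrt(p_fail * (1 - p_fail) / n)
print(f"P(failure) ~ {p_fail:.2e} (MC std err {se:.1e})")

# The safety-factor view collapses the same comparison to one number,
# with no statement of probability:
print("safety factor =", 420.0 / 300.0)
```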

  19. DICOM static and dynamic representation through unified modeling language

    NASA Astrophysics Data System (ADS)

    Martinez-Martinez, Alfonso; Jimenez-Alaniz, Juan R.; Gonzalez-Marquez, A.; Chavez-Avelar, N.

    2004-04-01

    The DICOM standard, as all standards, specifies in a generic way the management of digital medical images and their related information in network and storage media environments. However, understanding the specifications for a particular implementation is not a trivial task. Thus, this work is about understanding and modelling parts of the DICOM standard using object-oriented methodologies, as part of software development processes. This has offered different static and dynamic views in accordance with the standard specifications, and the resultant models have been represented through the Unified Modeling Language (UML). The modelled parts are related to the network conformance claim: Network Communication Support for Message Exchange, Message Exchange, Information Object Definitions, Service Class Specifications, Data Structures and Encoding, and Data Dictionary. The resultant models have given a better understanding of the DICOM parts and have opened the possibility of creating a software library to develop DICOM-conformant PACS applications.

  20. Translating Radiometric Requirements for Satellite Sensors to Match International Standards.

    PubMed

    Pearlman, Aaron; Datla, Raju; Kacker, Raghu; Cao, Changyong

    2014-01-01

    International scientific standards organizations created standards on evaluating uncertainty in the early 1990s. Although scientists from many fields use these standards, they are not consistently implemented in the remote sensing community, where traditional error analysis framework persists. For a satellite instrument under development, this can create confusion in showing whether requirements are met. We aim to create a methodology for translating requirements from the error analysis framework to the modern uncertainty approach using the product level requirements of the Advanced Baseline Imager (ABI) that will fly on the Geostationary Operational Environmental Satellite R-Series (GOES-R). In this paper we prescribe a method to combine several measurement performance requirements, written using a traditional error analysis framework, into a single specification using the propagation of uncertainties formula. By using this approach, scientists can communicate requirements in a consistent uncertainty framework leading to uniform interpretation throughout the development and operation of any satellite instrument.
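
    A small sketch of the central operation, combining several component-level requirements treated as standard uncertainties into one specification by root-sum-square propagation (unit sensitivity coefficients assumed for simplicity; the values are placeholders, not ABI requirements):

```python
# Sketch: combining several component-level requirements, each stated as
# a traditional "error" bound treated as a standard uncertainty, into one
# combined specification via root-sum-square propagation. Unit sensitivity
# coefficients are assumed; the values are placeholders.

import math

component_uncertainties = {
    "detector noise": 0.15,        # all in % of radiance, k=1
    "calibration source": 0.20,
    "linearity": 0.10,
    "stray light": 0.08,
}

u_combined = math.sqrt(sum(u**2 for u in component_uncertainties.values()))
u_expanded = 2 * u_combined        # k=2 expanded uncertainty (~95 %)
print(f"combined standard uncertainty: {u_combined:.3f} %")
print(f"expanded (k=2): {u_expanded:.3f} %")
```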

  1. Translating Radiometric Requirements for Satellite Sensors to Match International Standards

    PubMed Central

    Pearlman, Aaron; Datla, Raju; Kacker, Raghu; Cao, Changyong

    2014-01-01

    International scientific standards organizations created standards on evaluating uncertainty in the early 1990s. Although scientists from many fields use these standards, they are not consistently implemented in the remote sensing community, where traditional error analysis framework persists. For a satellite instrument under development, this can create confusion in showing whether requirements are met. We aim to create a methodology for translating requirements from the error analysis framework to the modern uncertainty approach using the product level requirements of the Advanced Baseline Imager (ABI) that will fly on the Geostationary Operational Environmental Satellite R-Series (GOES-R). In this paper we prescribe a method to combine several measurement performance requirements, written using a traditional error analysis framework, into a single specification using the propagation of uncertainties formula. By using this approach, scientists can communicate requirements in a consistent uncertainty framework leading to uniform interpretation throughout the development and operation of any satellite instrument. PMID:26601032

  2. Development of Field Methodology and Processes for Task Analysis and Training Feedback

    DTIC Science & Technology

    1978-10-31

    To evaluate technical ability and/or administration of shop supply element ... If part is in, notifies Shop Office ... job status ... Repairs are completed within a reasonable time frame consistent with prevailing conditions and published standards ... Completion of work must be

  3. The Transformative Power of Taking an Inquiry Stance on Practice: Practitioner Research as Narrative and Counter-Narrative

    ERIC Educational Resources Information Center

    Ravitch, Sharon M.

    2014-01-01

    Within the ever-developing, intersecting, and overlapping contexts of globalization, top-down policy, mandates, and standardization of public and higher education, many conceptualize and position practitioner research as a powerful stance and a tool of social, communal, and educational transformation, a set of methodological processes that…

  4. A Methodology to Develop Ontologies for Emerging Domains

    ERIC Educational Resources Information Center

    Meenorngwar, Chai

    2013-01-01

    The characteristic of complex, dynamic domains, such as an emerging domain, is that the information necessary to describe them is not fully established. Standards are not yet established for these domains, and hence they are difficult to describe and present, and methods are needed that will reflect the changes that will occur as the domains…

  5. Standard Review Plan for Environmental Restoration Program Quality Management Plans. Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-12-01

    The Department of Energy, Richland Operations Office (RL) Manual Environmental Restoration Program Quality System Requirements (QSR) for the Hanford Site, defines all quality requirements governing Hanford Environmental Restoration (ER) Program activities. The QSR requires that ER Program participants develop Quality Management Plans (QMPs) that describe how the QSR requirements will be implemented for their assigned scopes of work. This standard review plan (SRP) describes the ER Program participant responsibilities for submittal of QMPs to the RL Environmental Restoration Division for review and the RL methodology for performing the reviews of participant QMPs. The SRP serves the following functions: acts as a guide in the development or revision of QMPs to assure that the content is complete and adequate; acts as a checklist to be used by the RL staff in their review of participant QMPs; acts as an index or matrix between the requirements of the QSR and implementing methodologies described in the QMPs; decreases the time and subjectivity of document reviews; and provides a formal, documented method for describing exceptions, modifications, or waivers to established ER Program quality requirements.

  6. Density matters: Review of approaches to setting organism-based ballast water discharge standards

    USGS Publications Warehouse

    Lee II; Frazier; Ruiz

    2010-01-01

    As part of their effort to develop national ballast water discharge standards under NPDES permitting, the Office of Water requested that WED scientists identify and review existing approaches to generating organism-based discharge standards for ballast water. Six potential approaches were identified, and the utility and uncertainties of each approach were evaluated. During the review of the existing approaches, the WED scientists, in conjunction with scientists at the USGS and Smithsonian Institution, developed a new approach (per capita invasion probability, or "PCIP") that addresses many of the limitations of the previous methodologies. The PCIP approach allows risk managers to generate quantitative discharge standards using historical invasion rates, ballast water discharge volumes, and ballast water organism concentrations. The statistical power of sampling ballast water is limited with existing methods, both for the validation of ballast water treatment systems and for ship-board compliance monitoring, though it should be possible to obtain sufficient samples during treatment validation. The report will go to a National Academy of Sciences expert panel that will use it in their evaluation of approaches to developing ballast water discharge standards for the Office of Water.
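
    The abstract names the inputs to PCIP but not its functional form; one plausible formalization consistent with the quantities listed (a sketch, not the report's actual definition) is:

```latex
\mathrm{PCIP} \,=\, \frac{I}{V \cdot C}, \qquad
C_{\mathrm{std}} \,=\, \frac{I_{\mathrm{acceptable}}}{V \cdot \mathrm{PCIP}}
```

    where I is the historical number of ballast-mediated invasions per year, V the annual ballast discharge volume, and C the organism concentration in discharged water. Inverting the relation at a policy-chosen acceptable invasion rate then yields a concentration-based discharge standard.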

  7. A Methodology for the Development of a Reliability Database for an Advanced Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of a reliability database (RDB) methodology to determine applicable reliability data for inclusion in the quantification of the PRA. The RDB method developed during this project seeks to satisfy the requirements of the Data Analysis element of the ASME/ANS Non-LWR PRA standard. The RDB methodology utilizes a relevancy test to examine reliability data and determine whether it is appropriate to include as part of the reliability database for the PRA. The relevancy test compares three component properties to establish the level of similarity to components examined as part of the PRA. These properties include the component function, the component failure modes, and the environment/boundary conditions of the component. The relevancy test is used to gauge the quality of data found in a variety of sources, such as advanced reactor-specific databases, non-advanced reactor nuclear databases, and non-nuclear databases. The RDB also establishes the integration of expert judgment or separate reliability analysis with past reliability data. This paper provides details on the RDB methodology, and includes an example application of the RDB methodology for determining the reliability of the intermediate heat exchanger of a sodium fast reactor. The example explores a variety of reliability data sources, and assesses their applicability for the PRA of interest through the use of the relevancy test.

  8. The availability of public information for insurance risk decision-making in the UK

    NASA Astrophysics Data System (ADS)

    Davis, Nigel; Gibbs, Mark; Chadwick, Ben; Foote, Matthew

    2010-05-01

    At present, there is a wealth of hazard and exposure data which cannot be, or is not being, fully used by the risk-modelling community. The reasons for this under-utilisation of data are many: restrictive and complex data policies and pricing, risks involved in information sharing, technological shortcomings, and variable resolution of data, particularly with catastrophe models only recently having been adjusted to consume high-resolution exposure data. There is therefore an urgent need for the development of common modelling practices and applications for climate and geo-hazard risk assessment, all of which would be highly relevant to the public policy, disaster risk management, and financial risk transfer communities. This paper will present a methodology to overcome these obstacles and to review the availability of hazard data at research institutions in a consistent format. Such a methodology would facilitate the collation of hazard and other auxiliary data, as well as present data within a geo-spatial framework suitable for public and commercial use. The methodology would also review the suitability of datasets and how these could be made more freely available in conjunction with other research institutions in order to present a consistent data standard. It is clear that an understanding of these different issues of data and data standards has significant ramifications when used in natural hazard risk assessment. Scrutinising the issue of data standards also allows the data to be evaluated and re-evaluated for its gaps, omissions, fitness, purpose, availability and precision. Not only would there be a quality check on data, but it would also help develop and fine-tune the tools used for decision-making and assessment of risk.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Vleet, Mary J.; Misquitta, Alston J.; Stone, Anthony J.

    Short-range repulsion within inter-molecular force fields is conventionally described by either Lennard-Jones or Born-Mayer forms. Despite their widespread use, these simple functional forms are often unable to describe the interaction energy accurately over a broad range of inter-molecular distances, thus creating challenges in the development of ab initio force fields and potentially leading to decreased accuracy and transferability. Herein, we derive a novel short-range functional form based on a simple Slater-like model of overlapping atomic densities and an iterated stockholder atom (ISA) partitioning of the molecular electron density. We demonstrate that this Slater-ISA methodology yields a more accurate, transferable, and robust description of the short-range interactions at minimal additional computational cost compared to standard Lennard-Jones or Born-Mayer approaches. Lastly, we show how this methodology can be adapted to yield the standard Born-Mayer functional form while still retaining many of the advantages of the Slater-ISA approach.

  10. High-Penetration Photovoltaic Planning Methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, David Wenzhong; Muljadi, Eduard; Tian, Tian

    The main objective of this report is to provide an overview of select U.S. utility methodologies for performing high-penetration photovoltaic (HPPV) system planning and impact studies. This report covers the Federal Energy Regulatory Commission's orders related to photovoltaic (PV) power system interconnection, particularly the interconnection processes for the Large Generation Interconnection Procedures and Small Generation Interconnection Procedures. In addition, it includes U.S. state interconnection standards and procedures. The procedures used by these regulatory bodies consider the impacts of HPPV power plants on the networks. Technical interconnection requirements for HPPV voltage regulation include aspects of power monitoring, grounding, synchronization, connection to the overall distribution system, back-feeds, disconnecting means, abnormal operating conditions, and power quality. This report provides a summary of mitigation strategies to minimize the impact of HPPV. Recommendations and revisions to the standards may take place as the penetration level of renewables on the grid increases and new technologies develop in future years.

  11. Evaluation of an ontological resource for pharmacovigilance.

    PubMed

    Jaulent, Marie-Christine; Alecu, Iulian

    2009-01-01

    In this work, we present a methodology for evaluating an ontology designed in a previous study to describe adverse drug reactions. We evaluate it in terms of its fitness for grouping cases in pharmacovigilance. We define as gold standard the Standardized MedDRA Queries (SMQs) developed manually to group terms representing similar medical conditions. We perform an automatic search in the ontology in order to retrieve concepts related to the medical conditions. An optimal query is built for each medical condition. The evaluation relies on the comparison between the terms in the SMQ and the terms subsumed by the query. The result is quantified by sensitivity and specificity. We applied this methodology to 24 SMQs and obtained a mean sensitivity of 0.82. This work validates the semantic resource and provides, in perspective, tools to maintain the ontology as the knowledge evolves.
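
    The evaluation step — comparing the terms subsumed by an optimal ontology query against the terms of the gold-standard SMQ — reduces to set arithmetic. A minimal sketch, with hypothetical term identifiers (the real comparison runs over MedDRA terms):

```python
# Hedged sketch: sensitivity/specificity of an ontology query against an SMQ
# gold standard. Term identifiers and the term universe are hypothetical.

def sensitivity_specificity(gold: set, retrieved: set, universe: set):
    """Compare terms subsumed by a query against the gold-standard SMQ terms."""
    tp = len(gold & retrieved)             # SMQ terms the query found
    fn = len(gold - retrieved)             # SMQ terms the query missed
    fp = len(retrieved - gold)             # extra terms the query pulled in
    tn = len(universe - gold - retrieved)  # terms correctly left out
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return sensitivity, specificity

universe = {f"t{i}" for i in range(100)}
gold = {"t1", "t2", "t3", "t4", "t5"}          # terms in the SMQ
retrieved = {"t1", "t2", "t3", "t4", "t9"}     # terms subsumed by the query
print(sensitivity_specificity(gold, retrieved, universe))  # (0.8, ~0.99)
```

    Averaging the first component over the 24 SMQs is what produces a mean sensitivity figure like the 0.82 reported.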

  12. Development of the Advanced Energy Design Guide for K-12 Schools -- 50% Energy Savings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonnema, E.; Leach, M.; Pless, S.

    2013-02-01

    This Technical Support Document (TSD) describes the process and methodology for the development of the Advanced Energy Design Guide for K-12 School Buildings: Achieving 50% Energy Savings Toward a Net Zero Energy Building (AEDG-K12) (ASHRAE et al. 2011a). The AEDG-K12 provides recommendations for achieving 50% whole-building energy savings in K-12 schools over levels achieved by following ANSI/ASHRAE/IESNA Standard 90.1-2004, Energy Standard for Buildings Except Low-Rise Residential Buildings (Standard 90.1-2004) (ASHRAE 2004b). The AEDG-K12 was developed in collaboration with the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), the American Institute of Architects (AIA), the Illuminating Engineering Society of North America (IES), the U.S. Green Building Council (USGBC), and the U.S. Department of Energy (DOE).

  13. Enabling reliability assessments of pre-commercial perovskite photovoltaics with lessons learned from industrial standards

    NASA Astrophysics Data System (ADS)

    Snaith, Henry J.; Hacke, Peter

    2018-06-01

    Photovoltaic modules are expected to operate in the field for more than 25 years, so reliability assessment is critical for the commercialization of new photovoltaic technologies. In early development stages, understanding and addressing the device degradation mechanisms are the priorities. However, any technology targeting large-scale deployment must eventually pass industry-standard qualification tests and undergo reliability testing to validate the module lifetime. In this Perspective, we review the methodologies used to assess the reliability of established photovoltaics technologies and to develop standardized qualification tests. We present the stress factors and stress levels for degradation mechanisms currently identified in pre-commercial perovskite devices, along with engineering concepts for mitigation of those degradation modes. Recommendations for complete and transparent reporting of stability tests are given, to facilitate future inter-laboratory comparisons and to further the understanding of field-relevant degradation mechanisms, which will benefit the development of accelerated stress tests.

  14. Technical Support Document: Development of the Advanced Energy Design Guide for Medium to Big Box Retail Buildings - 50% Energy Savings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonnema, E.; Leach, M.; Pless, S.

    2013-06-01

    This Technical Support Document describes the process and methodology for the development of the Advanced Energy Design Guide for Medium to Big Box Retail Buildings: Achieving 50% Energy Savings Toward a Net Zero Energy Building (AEDG-MBBR) (ASHRAE et al. 2011b). The AEDG-MBBR is intended to provide recommendations for achieving 50% whole-building energy savings in retail stores over levels achieved by following ANSI/ASHRAE/IESNA Standard 90.1-2004, Energy Standard for Buildings Except Low-Rise Residential Buildings (Standard 90.1-2004) (ASHRAE 2004b). The AEDG-MBBR was developed in collaboration with the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), the American Institute of Architects (AIA), the Illuminating Engineering Society of North America (IES), the U.S. Green Building Council (USGBC), and the U.S. Department of Energy.

  15. Development of ASTM Standard for SiC-SiC Joint Testing Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobsen, George; Back, Christina

    2015-10-30

    As the nuclear industry moves to advanced ceramic-based materials for cladding and core structural materials for a variety of advanced reactors, new standards and test methods are required for material development and licensing purposes. For example, General Atomics (GA) is actively developing silicon carbide (SiC) based composite cladding (SiC-SiC) for its Energy Multiplier Module (EM2), a high efficiency gas cooled fast reactor. Through DOE funding via the advanced reactor concept program, GA developed a new test method for the nominal joint strength of an endplug sealed to advanced ceramic tubes, Fig. 1-1, at ambient and elevated temperatures, called the endplug pushout (EPPO) test. This test utilizes widely available universal mechanical testers coupled with clam shell heaters, and specimen size is relatively small, making it a viable post-irradiation test method. The culmination of this effort was a draft of an ASTM test standard that will be submitted for approval to the ASTM C28 ceramic committee. Once the standard has been vetted by the ceramics test community, an industry-wide standard methodology to test joined tubular ceramic components will be available for the entire nuclear materials community.

  16. Most systematic reviews of high methodological quality on psoriasis interventions are classified as high risk of bias using ROBIS tool.

    PubMed

    Gómez-García, Francisco; Ruano, Juan; Gay-Mimbrera, Jesus; Aguilar-Luque, Macarena; Sanz-Cabanillas, Juan Luis; Alcalde-Mellado, Patricia; Maestre-López, Beatriz; Carmona-Fernández, Pedro Jesús; González-Padilla, Marcelino; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz

    2017-12-01

    No gold standard exists to assess methodological quality of systematic reviews (SRs). Although Assessing the Methodological Quality of Systematic Reviews (AMSTAR) is widely accepted for analyzing quality, the ROBIS instrument has recently been developed. This study aimed to compare the capacity of both instruments to capture the quality of SRs concerning psoriasis interventions. Systematic literature searches were undertaken on relevant databases. For each review, methodological quality and bias risk were evaluated using the AMSTAR and ROBIS tools. Descriptive and principal component analyses were conducted to describe similarities and discrepancies between both assessment tools. We classified 139 intervention SRs as displaying high/moderate/low methodological quality and as high/low risk of bias. A high risk of bias was detected for most SRs classified as displaying high or moderate methodological quality by AMSTAR. When comparing ROBIS result profiles, responses to domain 4 signaling questions showed the greatest differences between bias risk assessments, whereas domain 2 items showed the least. When considering SRs published about psoriasis, methodological quality remains suboptimal, and the risk of bias is elevated, even for SRs exhibiting high methodological quality. Furthermore, the AMSTAR and ROBIS tools may be considered as complementary when conducting quality assessment of SRs. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Assessment methodology for computer-based instructional simulations.

    PubMed

    Koenig, Alan; Iseli, Markus; Wainess, Richard; Lee, John J

    2013-10-01

    Computer-based instructional simulations are becoming more and more ubiquitous, particularly in military and medical domains. As the technology that drives these simulations grows ever more sophisticated, the underlying pedagogical models for how instruction, assessment, and feedback are implemented within these systems must evolve accordingly. In this article, we review some of the existing educational approaches to medical simulations, and present pedagogical methodologies that have been used in the design and development of games and simulations at the University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing. In particular, we present a methodology for how automated assessments of computer-based simulations can be implemented using ontologies and Bayesian networks, and discuss their advantages and design considerations for pedagogical use. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
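
    The abstract names ontologies and Bayesian networks without giving a model. As a minimal sketch of the Bayesian ingredient only — updating a belief about a trainee's skill from observed in-simulation actions, with all probabilities invented for illustration — the core update is:

```python
# Minimal Bayesian-update sketch for automated simulation assessment.
# The conditional probabilities below are invented; a real system would
# encode them in an ontology-driven Bayesian network over many skill nodes.

def update_skill_belief(prior_mastery: float, correct_action: bool) -> float:
    """Posterior P(mastery) after observing one scored action, via Bayes' rule."""
    p_correct_given_mastery = 0.9      # assumed CPT entry
    p_correct_given_no_mastery = 0.3   # assumed CPT entry
    if correct_action:
        like_m, like_n = p_correct_given_mastery, p_correct_given_no_mastery
    else:
        like_m, like_n = 1 - p_correct_given_mastery, 1 - p_correct_given_no_mastery
    evidence = like_m * prior_mastery + like_n * (1 - prior_mastery)
    return like_m * prior_mastery / evidence

belief = 0.5                                   # uninformative prior
for action_ok in [True, True, False, True]:    # scored actions in the scenario
    belief = update_skill_belief(belief, action_ok)
print(f"posterior P(mastery) = {belief:.2f}")  # ~0.79 for this sequence
```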

  18. Technology transfer through a network of standard methods and recommended practices - The case of petrochemicals

    NASA Astrophysics Data System (ADS)

    Batzias, Dimitris F.; Karvounis, Sotirios

    2012-12-01

    Technology transfer may take place in parallel with cooperative action between companies participating in the same organizational scheme or using one another as subcontractor (outsourcing). In this case, cooperation should be realized by means of Standard Methods and Recommended Practices (SRPs) to achieve (i) quality of intermediate/final products according to specifications and (ii) industrial process control as required to guarantee such quality with minimum deviation (corresponding to maximum reliability) from preset mean values of representative quality parameters. This work deals with the design of the network of SRPs needed in each case for successful cooperation, implying also the corresponding technology transfer, effectuated through a methodological framework developed in the form of an algorithmic procedure with 20 activity stages and 8 decision nodes. The functionality of this methodology is proved by presenting the path leading from (and relating) a standard test method for toluene, as petrochemical feedstock in toluene diisocyanate production, to the performance evaluation of industrial process control systems six generations upstream (i.e., from ASTM D5606 to BS EN 61003-1:2004 in the SRPs network).

  19. Product pricing in the Solar Array Manufacturing Industry - An executive summary of SAMICS

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. G.

    1978-01-01

    Capabilities, methodology, and a description of input data to the Solar Array Manufacturing Industry Costing Standards (SAMICS) are presented. SAMICS was developed to provide a standardized procedure and database for comparing the manufacturing processes of Low-cost Solar Array (LSA) subcontractors, guiding the setting of research priorities, and assessing the progress of LSA toward its hundred-fold cost-reduction goal. SAMICS can be used to estimate manufacturing costs and product prices and to determine the impact of inflation, taxes, and interest rates, but it is limited by ignoring the effects of market supply and demand and by the assumption that all factories operate in a production-line mode. The SAMICS methodology defines the industry structure, hypothetical supplier companies, and manufacturing processes, and maintains a body of standardized data which is used to compute the final product price. The input data include the product description, process characteristics, equipment cost factors, and production data for the preparation of detailed cost estimates. Activities validating that SAMICS produced realistic price estimates and cost breakdowns are described.

  1. A reference case for economic evaluations in osteoarthritis: an expert consensus article from the European Society for Clinical and Economic Aspects of Osteoporosis and Osteoarthritis (ESCEO).

    PubMed

    Hiligsmann, Mickaël; Cooper, Cyrus; Guillemin, Francis; Hochberg, Marc C; Tugwell, Peter; Arden, Nigel; Berenbaum, Francis; Boers, Maarten; Boonen, Annelies; Branco, Jaime C; Brandi, Maria-Luisa; Bruyère, Olivier; Gasparik, Andrea; Kanis, John A; Kvien, Tore K; Martel-Pelletier, Johanne; Pelletier, Jean-Pierre; Pinedo-Villanueva, Rafael; Pinto, Daniel; Reiter-Niesert, Susanne; Rizzoli, René; Rovati, Lucio C; Severens, Johan L; Silverman, Stuart; Reginster, Jean-Yves

    2014-12-01

    General recommendations for a reference case for economic studies in rheumatic diseases were published in 2002 in an initiative to improve the comparability of cost-effectiveness studies in the field. Since then, economic evaluations in osteoarthritis (OA) continue to show considerable heterogeneity in methodological approach. To develop a reference case specific for economic studies in OA, including the standard optimal care, with which to judge new pharmacologic and non-pharmacologic interventions. Four subgroups of an ESCEO expert working group on economic assessments (13 experts representing diverse aspects of clinical research and/or economic evaluations) were charged with producing lists of recommendations that would potentially improve the comparability of economic analyses in OA: outcome measures, comparators, costs and methodology. These proposals were discussed and refined during a face-to-face meeting in 2013. They are presented here in the format of the recommendations of the recently published Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement, so that an initiative on economic analysis methodology might be consolidated with an initiative on reporting standards. Overall, three distinct reference cases are proposed, one for each hand, knee and hip OA; with diagnostic variations in the first two, giving rise to different treatment options: interphalangeal or thumb-based disease for hand OA and the presence or absence of joint malalignment for knee OA. A set of management strategies is proposed, which should be further evaluated to help establish a consensus on the "standard optimal care" in each proposed reference case. The recommendations on outcome measures, cost itemisation and methodological approaches are also provided. The ESCEO group proposes a set of disease-specific recommendations on the conduct and reporting of economic evaluations in OA that could help the standardisation and comparability of studies that evaluate therapeutic strategies of OA in terms of costs and effectiveness. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Systematic and progressive implementation of the centers of excellence for rheumatoid arthritis: a methodological proposal.

    PubMed

    Santos-Moreno, Pedro; Caballero-Uribe, Carlo V; Massardo, Maria Loreto; Maldonado, Claudio Galarza; Soriano, Enrique R; Pineda, Carlos; Cardiel, Mario; Benavides, Juan Alberto; Beltrán, Paula Andrea

    2017-12-01

    The implementation of centers of excellence for specific diseases has been gaining recognition in the field of health. Specifically in rheumatoid arthritis, where the prognosis of the disease depends on early diagnosis and timely intervention, it is necessary that health services be delivered in an environment of quality, opportunity, and safety with the highest standards of care. A methodology that makes this implementation achievable by most care centers is a priority for achieving better care for populations with this disease. In this paper, we propose a systematic and progressive methodology that will help institutions develop successful models without faltering in the process. The expected impact on public health is defined by better effective coverage of high-quality treatments, obtaining better health outcomes with safety and accessibility while reducing the budgetary impact on the health systems of our countries.

  3. DETERMINATION OF THE STRONG ACIDITY OF ATMOSPHERIC FINE PARTICLES (<2.5 UM) USING ANNULAR DENUDER TECHNOLOGY

    EPA Science Inventory

    This report is a standardized methodology description for the determination of strong acidity of fine particles (less than 2.5 microns) in ambient air using annular denuder technology. This methodology description includes two parts: Part A - Standard Method and Part B - Enhanced Method.

  4. Renewable Energy used in State Renewable Portfolio Standards Yielded

    Science.gov Websites

    According to NREL, the analysis of renewable energy used in state Renewable Portfolio Standards also shows national water withdrawals and water consumption by fossil-fuel generation, while recognizing that states could perform their own more-detailed assessments. Ranges are presented, as the models and methodologies used are sensitive to multiple parameters.

  5. 42 CFR 416.171 - Determination of payment rates for ASC services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    § 416.171 Determination of payment rates for ASC services. (a) Standard methodology. The standard methodology for determining the national unadjusted payment rate for ASC services is to calculate the product of the…

  6. 45 CFR 153.510 - Risk corridors establishment and payment methodology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    § 153.510 Risk corridors establishment and payment methodology. Public Welfare; Department of Health and Human Services; Requirements Relating to Health Care Access; Standards Related to Reinsurance, Risk Corridors, and Risk Adjustment Under the Affordable Care Act; Health Insurance Issuer Standards Related to the Risk Corridors Program…

  7. 45 CFR 153.510 - Risk corridors establishment and payment methodology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    § 153.510 Risk corridors establishment and payment methodology. Public Welfare; Department of Health and Human Services; Requirements Relating to Health Care Access; Standards Related to Reinsurance, Risk Corridors, and Risk Adjustment Under the Affordable Care Act; Health Insurance Issuer Standards Related to the Risk Corridors Program…

  8. The impact of leadership and team behavior on standard of care delivered during human patient simulation: a pilot study for undergraduate medical students.

    PubMed

    Carlson, Jim; Min, Elana; Bridges, Diane

    2009-01-01

    Methodology to train team behavior during simulation has received increased attention, but standard performance measures are lacking, especially at the undergraduate level. Our purposes were to develop a reliable team behavior measurement tool and explore the relationship between team behavior and the delivery of an appropriate standard of care specific to the simulated case. Authors developed a unique team measurement tool based on previous work. Trainees participated in a simulated event involving the presentation of acute dyspnea. Performance was rated by separate raters using the team behavior measurement tool. Interrater reliability was assessed. The relationship between team behavior and the standard of care delivered was explored. The instrument proved to be reliable for this case and group of raters. Team behaviors had a positive relationship with the standard of medical care delivered specific to the simulated case. The methods used provide a possible method for training and assessing team performance during simulation.

  9. Development and application of a methodology for a clean development mechanism to avoid methane emissions in closed landfills.

    PubMed

    Janke, Leandro; Lima, André O S; Millet, Maurice; Radetski, Claudemir M

    2013-01-01

    In Brazil, Solid Waste Disposal Sites have operated without consideration of environmental criteria, these areas being characterized by methane (CH4) emissions during the anaerobic degradation of organic matter. The United Nations has made efforts to control this situation, through the United Nations Framework Convention on Climate Change (UNFCCC) and the Kyoto Protocol, where projects that seek to reduce the emissions of greenhouse gases (GHG) can be financially rewarded through Certified Emission Reductions (CERs) if they respect the requirements established by the Clean Development Mechanism (CDM), such as the use of methodologies approved by the CDM Executive Board (CDM-EB). Thus, a methodology was developed according to the CDM standards related to the aeration, excavation and composting of closed Municipal Solid Waste (MSW) landfills, which was submitted to CDM-EB for assessment and, after its approval, applied to a real case study in Maringá City (Brazil) with a view to avoiding negative environmental impacts due to the production of methane and leachates even after closure. This paper describes the establishment of this CDM-EB-approved methodology to determine baseline emissions, project emissions and the resultant emission reductions with the application of appropriate aeration, excavation and composting practices at closed MSW landfills. A further result obtained through the application of the methodology in the landfill case study was that it would be possible to achieve an ex-ante emission reduction of 74,013 tCO2 equivalent if the proposed CDM project activity were implemented.
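
    In CDM accounting, the credited quantity follows the standard relation below (a general CDM identity; whether a leakage term applies to this particular aeration/excavation/composting methodology is an assumption here):

```latex
ER_y \,=\, BE_y - PE_y - LE_y
```

    where, for crediting year y, ER is the emission reduction (tCO2e), BE the baseline emissions the closed landfill would otherwise have produced, PE the project emissions from the aeration, excavation, and composting activities, and LE any leakage.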

  10. An economic toolkit for identifying the cost of emergency medical services (EMS) systems: detailed methodology of the EMS Cost Analysis Project (EMSCAP).

    PubMed

    Lerner, E Brooke; Garrison, Herbert G; Nichol, Graham; Maio, Ronald F; Lookman, Hunaid A; Sheahan, William D; Franz, Timothy R; Austad, James D; Ginster, Aaron M; Spaite, Daniel W

    2012-02-01

    Calculating the cost of an emergency medical services (EMS) system using a standardized method is important for determining the value of EMS. This article describes the development of a methodology for calculating the cost of an EMS system to its community. This includes a tool for calculating the cost of EMS (the "cost workbook") and detailed directions for determining cost (the "cost guide"). The 12-step process that was developed is consistent with current theories of health economics, applicable to prehospital care, flexible enough to be used in varying sizes and types of EMS systems, and comprehensive enough to provide meaningful conclusions. It was developed by an expert panel (the EMS Cost Analysis Project [EMSCAP] investigator team) in an iterative process that included pilot testing the process in three diverse communities. The iterative process allowed ongoing modification of the toolkit during the development phase, based upon direct, practical, ongoing interaction with the EMS systems that were using the toolkit. The resulting methodology estimates EMS system costs within a user-defined community, allowing either the number of patients treated or the estimated number of lives saved by EMS to be assessed in light of the cost of those efforts. Much controversy exists about the cost of EMS and whether the resources spent for this purpose are justified. However, the existence of a validated toolkit that provides a standardized process will allow meaningful assessments and comparisons to be made and will supply objective information to inform EMS and community officials who are tasked with determining the utilization of scarce societal resources. © 2012 by the Society for Academic Emergency Medicine.

  11. DICOM: a standard for medical imaging

    NASA Astrophysics Data System (ADS)

    Horii, Steven C.; Bidgood, W. Dean

    1993-01-01

    Since 1983, the American College of Radiology (ACR) and the National Electrical Manufacturers Association (NEMA) have been engaged in developing standards related to medical imaging. This alliance of users and manufacturers was formed to meet the needs of the medical imaging community as its use of digital imaging technology increased. The development of electronic picture archiving and communications systems (PACS), which could connect a number of medical imaging devices together in a network, led to the need for a standard interface and data structure for use on imaging equipment. Since medical image files tend to be very large and include much text information along with the image, the need for a fast, flexible, and extensible standard was quickly established. The ACR-NEMA Digital Imaging and Communications Standards Committee developed a standard which met these needs. The standard (ACR-NEMA 300-1988) was first published in 1985 and revised in 1988. It is increasingly available from equipment manufacturers. The current work of the ACR- NEMA Committee has been to extend the standard to incorporate direct network connection features, and build on standards work done by the International Standards Organization in its Open Systems Interconnection series. This new standard, called Digital Imaging and Communication in Medicine (DICOM), follows an object-oriented design methodology and makes use of as many existing internationally accepted standards as possible. This paper gives a brief overview of the requirements for communications standards in medical imaging, a history of the ACR-NEMA effort and what it has produced, and a description of the DICOM standard.

  12. [Shoulder disability questionnaires: a systematic review].

    PubMed

    Fayad, F; Mace, Y; Lefevre-Colau, M M

    2005-07-01

    To identify all available shoulder disability questionnaires designed to measure physical functioning and to examine those with satisfactory clinimetric quality. We used the Medline database and the "Guide des outils de mesure de l'évaluation en médecine physique et de réadaptation" textbook to search for questionnaires. Analysis took into account the development methodology, clinimetric quality of the instruments and frequency of their utilization. We classified the instruments according to the International Classification of Functioning, Disability and Health. Thirty-eight instruments have been developed to measure disease-, shoulder- or upper extremity-specific outcome. Four scales assess upper-extremity disability and 3 others shoulder disability. We found 6 scales evaluating disability and shoulder pain, 7 scales measuring the quality of life in patients with various conditions of the shoulder, 14 scales combining objective and subjective measures, 2 pain scales and 2 unclassified scales. Older instruments developed before the advent of modern measurement development methodology usually combine objective and subjective measures. Recent instruments were designed with appropriate methodology. Most are self-administered questionnaires. Numerous shoulder outcome measure instruments are available. There is no "gold standard" for assessing shoulder function outcome in the general population.

  13. Efficacy of Monitoring Devices in Support of Prevention of Pressure Injuries: Systematic Review and Meta-analysis.

    PubMed

    Walia, Gurjot S; Wong, Alison L; Lo, Andrea Y; Mackert, Gina A; Carl, Hannah M; Pedreira, Rachel A; Bello, Ricardo; Aquino, Carla S; Padula, William V; Sacks, Justin M

    2016-12-01

    To present a systematic review of the literature assessing the efficacy of monitoring devices for reducing the risk of developing pressure injuries. This continuing education activity is intended for physicians, physician assistants, nurse practitioners, and nurses with an interest in skin and wound care. After participating in this educational activity, the participant should be better able to: 1. Explain the methodology of the literature review and its results. 2. Discuss the scope of the problem and the implications of the research. OBJECTIVE: To assess the efficacy of monitoring devices for reducing the risk of developing pressure injuries (PIs). The authors systematically reviewed the literature by searching the PubMed/MEDLINE and CINAHL databases through January 2016. Articles included clinical trials and cohort studies that tested monitoring devices, evaluating PI risk factors in patients in acute and skilled nursing settings. The articles were scored using the Methodological Index for Non-randomized Studies. Using a standardized extraction form, the authors extracted patient inclusion/exclusion criteria, care setting, key baseline characteristics, description of the monitoring device and methodology, number of patients included in each group, description of any standard of care, follow-up period, and outcomes. Of the 1866 publications identified, 9 met the inclusion criteria. The high-quality studies averaged Methodological Index for Non-randomized Studies scores of 19.4 for clinical trials and 12.2 for observational studies. These studies evaluated monitoring devices that measured interface pressure, subdermal tissue stress, motion, and moisture. Most studies found a statistically significant decrease in PIs; 2 studies were eligible for meta-analysis, demonstrating that use of monitoring devices was associated with an 88% reduction in the risk of developing PIs (Mantel-Haenszel risk ratio, 0.12; 95% confidence interval, 0.04-0.41; I² = 0%). Pressure injury monitoring devices are associated with a strong reduction in the risk of developing PIs. These devices provide clinicians and patients with critical information to implement prevention guidelines. Randomized controlled trials would help assess which technologies are most effective at reducing the risk of developing PIs.
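
    For readers unfamiliar with the pooling step, the following is a compact sketch of the Mantel-Haenszel risk-ratio computation used in such meta-analyses; the two 2x2 tables are invented and are not the data behind the reported risk ratio of 0.12:

```python
# Mantel-Haenszel pooled risk ratio across k 2x2 tables.
# Each table: (events_device, n_device, events_control, n_control).
# The example studies are invented for illustration only.

def mantel_haenszel_rr(tables):
    num = den = 0.0
    for a, n1, c, n2 in tables:
        n = n1 + n2
        num += a * n2 / n   # weighted events in the device arm
        den += c * n1 / n   # weighted events in the control arm
    return num / den

studies = [(3, 50, 12, 48), (2, 45, 10, 47)]  # hypothetical device vs. usual care
print(f"RR_MH = {mantel_haenszel_rr(studies):.2f}")  # < 1 favors the device
```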

  14. The method of selecting an integrated development territory for the high-rise unique constructions

    NASA Astrophysics Data System (ADS)

    Sheina, Svetlana; Shevtsova, Elina; Sukhinin, Alexander; Priss, Elena

    2018-03-01

    On the basis of data provided by the Department of Architecture and Urban Planning of the city of Rostov-on-Don, the problem of selecting a territory for integrated development to be prioritized for the construction of high-rise and unique buildings is solved. The objective of the study was to develop a methodology for selecting such an area and to implement the proposed method on the example of evaluating four territories for integrated development. Along with standard indicators of integrated evaluation, the developed method considers additional indicators that assess a territory from the standpoint of high-rise unique construction. The final result of the study is a ranking of the functional priority of the areas that takes into account the construction of residential as well as public and business objects of unique high-rise construction. The use of the developed methodology will allow investors and customers to assess the investment attractiveness of a future unique construction project on a proposed site.

  15. Development of a standardized training course for laparoscopic procedures using Delphi methodology.

    PubMed

    Bethlehem, Martijn S; Kramp, Kelvin H; van Det, Marc J; ten Cate Hoedemaker, Henk O; Veeger, Nicolaas J G M; Pierie, Jean Pierre E N

    2014-01-01

    Content, evaluation, and certification of laparoscopic skills and procedure training lack uniformity among different hospitals in The Netherlands. Within the process of developing a new regional laparoscopic training curriculum, a uniform and transferrable curriculum was constructed for a series of laparoscopic procedures. The aim of this study was to determine regional expert consensus regarding the key steps for laparoscopic appendectomy and cholecystectomy using Delphi methodology. Lists of suggested key steps for laparoscopic appendectomy and cholecystectomy were created using surgical textbooks, available guidelines, and local practice. A total of 22 experts, working for teaching hospitals throughout the region, were asked to rate the suggested key steps for both procedures on a Likert scale from 1-5. Consensus was reached with Cronbach's α ≥ 0.90. Of the 22 experts, 21 completed and returned the survey (95%). Data analysis already showed consensus after the first round of Delphi on the key steps for laparoscopic appendectomy (Cronbach's α = 0.92) and laparoscopic cholecystectomy (Cronbach's α = 0.90). After the second round, 15 proposed key steps for laparoscopic appendectomy and 30 proposed key steps for laparoscopic cholecystectomy were rated as important (≥4 by at least 80% of the expert panel). These key steps were used for the further development of the training curriculum. By using the Delphi methodology, regional consensus was reached on the key steps for laparoscopic appendectomy and cholecystectomy. These key steps are going to be used for standardized training and evaluation purposes in a new regional laparoscopic curriculum. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
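
    Consensus here is quantified with Cronbach's α over the experts' Likert ratings. A dependency-free sketch of that computation, on an invented experts-by-key-steps rating matrix:

```python
# Cronbach's alpha for an experts-by-items matrix of Likert ratings (1-5).
# Rows = experts, columns = proposed key steps; the ratings are invented.

def cronbach_alpha(ratings):
    k = len(ratings[0])                       # number of items (key steps)
    n = len(ratings)                          # number of raters (experts)
    item_vars = []
    for j in range(k):
        col = [row[j] for row in ratings]
        mean = sum(col) / n
        item_vars.append(sum((x - mean) ** 2 for x in col) / (n - 1))
    totals = [sum(row) for row in ratings]    # each expert's total score
    tmean = sum(totals) / n
    total_var = sum((t - tmean) ** 2 for t in totals) / (n - 1)
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

ratings = [
    [5, 5, 5, 5, 5],
    [4, 4, 4, 4, 4],
    [5, 4, 5, 4, 5],
    [3, 3, 3, 3, 3],
]
print(f"alpha = {cronbach_alpha(ratings):.2f}")  # ~0.98 for this toy matrix
```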

  16. Development of a diaphragmatic motion-based elastography framework for assessment of liver stiffness

    NASA Astrophysics Data System (ADS)

    Weis, Jared A.; Johnsen, Allison M.; Wile, Geoffrey E.; Yankeelov, Thomas E.; Abramson, Richard G.; Miga, Michael I.

    2015-03-01

    Evaluation of mechanical stiffness imaging biomarkers, through magnetic resonance elastography (MRE), has shown considerable promise for non-invasive assessment of liver stiffness to monitor hepatic fibrosis. MRE typically requires specialized externally-applied vibratory excitation and scanner-specific motion-sensitive pulse sequences. In this work, we have developed an elasticity imaging approach that utilizes natural diaphragmatic respiratory motion to induce deformation and eliminates the need for external deformation excitation hardware and specialized pulse sequences. Our approach uses clinically available standard-of-care volumetric imaging acquisitions, combined with offline model-based post-processing, to generate volumetric estimates of stiffness within the liver and surrounding tissue structures. We have previously developed a novel methodology for non-invasive elasticity imaging which utilizes a model-based elasticity reconstruction algorithm and MR image volumes acquired under different states of deformation. In prior work, deformation was externally applied through inflation of an air bladder placed within the MR radiofrequency coil. In this work, we extend the methodology with the goal of determining the feasibility of assessing liver mechanical stiffness using diaphragmatic respiratory motion between end-inspiration and end-expiration breath-holds as a source of deformation. We present initial investigations towards applying this methodology to assess liver stiffness in healthy volunteers and cirrhotic patients. Our preliminary results suggest that this method is capable of non-invasive image-based assessment of liver stiffness using natural diaphragmatic respiratory motion and provides considerable enthusiasm for extension of our approach towards monitoring liver stiffness in cirrhotic patients with limited impact on standard-of-care clinical imaging acquisition workflow.

  17. Development and Application of a Clinical Microsystem Simulation Methodology for Human Factors-Based Research of Alarm Fatigue.

    PubMed

    Kobayashi, Leo; Gosbee, John W; Merck, Derek L

    2017-07-01

    (1) To develop a clinical microsystem simulation methodology for alarm fatigue research with a human factors engineering (HFE) assessment framework and (2) to explore its application to the comparative examination of different approaches to patient monitoring and provider notification. Problems with the design, implementation, and real-world use of patient monitoring systems result in alarm fatigue. A multidisciplinary team is developing an open-source tool kit to promote bedside informatics research and mitigate alarm fatigue. Simulation, HFE, and computer science experts created a novel simulation methodology to study alarm fatigue. Featuring multiple interconnected simulated patient scenarios with scripted timeline, "distractor" patient care tasks, and triggered true and false alarms, the methodology incorporated objective metrics to assess provider and system performance. Developed materials were implemented during institutional review board-approved study sessions that assessed and compared an experimental multiparametric alerting system with a standard monitor telemetry system for subject response, use characteristics, and end-user feedback. A four-patient simulation setup featuring objective metrics for participant task-related performance and response to alarms was developed along with accompanying structured HFE assessment (questionnaire and interview) for monitor systems use testing. Two pilot and four study sessions with individual nurse subjects elicited true alarm and false alarm responses (including diversion from assigned tasks) as well as nonresponses to true alarms. In-simulation observation and subject questionnaires were used to test the experimental system's approach to suppressing false alarms and alerting providers. A novel investigative methodology applied simulation and HFE techniques to replicate and study alarm fatigue in controlled settings for systems assessment and experimental research purposes.

  18. "Heidelberg standard examination" and "Heidelberg standard procedures" - Development of faculty-wide standards for physical examination techniques and clinical procedures in undergraduate medical education.

    PubMed

    Nikendei, C; Ganschow, P; Groener, J B; Huwendiek, S; Köchel, A; Köhl-Hackert, N; Pjontek, R; Rodrian, J; Scheibe, F; Stadler, A-K; Steiner, T; Stiepak, J; Tabatabai, J; Utz, A; Kadmon, M

    2016-01-01

    The competent physical examination of patients and the safe and professional implementation of clinical procedures constitute essential components of medical practice in nearly all areas of medicine. The central objective of the projects "Heidelberg standard examination" and "Heidelberg standard procedures", which were initiated by students, was to establish uniform interdisciplinary standards for physical examination and clinical procedures, and to distribute them in coordination with all clinical disciplines at the Heidelberg University Hospital. The presented project report illuminates the background of the initiative and its methodological implementation. Moreover, it describes the multimedia documentation in the form of pocketbooks and a multimedia internet-based platform, as well as the integration into the curriculum. The project presentation aims to provide orientation and action guidelines to facilitate similar processes in other faculties.

  19. A Methodological Analysis of Randomized Clinical Trials of Computer-Assisted Therapies for Psychiatric Disorders: Toward Improved Standards for an Emerging Field

    PubMed Central

    Kiluk, Brian D.; Sugarman, Dawn E.; Nich, Charla; Gibbons, Carly J.; Martino, Steve; Rounsaville, Bruce J.; Carroll, Kathleen M.

    2013-01-01

    Objective Computer-assisted therapies offer a novel, cost-effective strategy for providing evidence-based therapies to a broad range of individuals with psychiatric disorders. However, the extent to which the growing body of randomized trials evaluating computer-assisted therapies meets current standards of methodological rigor for evidence-based interventions is not clear. Method A methodological analysis of randomized clinical trials of computer-assisted therapies for adult psychiatric disorders, published between January 1990 and January 2010, was conducted. Seventy-five studies that examined computer-assisted therapies for a range of axis I disorders were evaluated using a 14-item methodological quality index. Results Results indicated marked heterogeneity in study quality. No study met all 14 basic quality standards, and three met 13 criteria. Consistent weaknesses were noted in evaluation of treatment exposure and adherence, rates of follow-up assessment, and conformity to intention-to-treat principles. Studies utilizing weaker comparison conditions (e.g., wait-list controls) had poorer methodological quality scores and were more likely to report effects favoring the computer-assisted condition. Conclusions While several well-conducted studies have indicated promising results for computer-assisted therapies, this emerging field has not yet achieved a level of methodological quality equivalent to those required for other evidence-based behavioral therapies or pharmacotherapies. Adoption of more consistent standards for methodological quality in this field, with greater attention to potential adverse events, is needed before computer-assisted therapies are widely disseminated or marketed as evidence based. PMID:21536689

  20. The European Stroke Organisation Guidelines: a standard operating procedure.

    PubMed

    Ntaios, George; Bornstein, Natan M; Caso, Valeria; Christensen, Hanne; De Keyser, Jacques; Diener, Hans-Christoph; Diez-Tejedor, Exuperio; Ferro, Jose M; Ford, Gary A; Grau, Armin; Keller, Emanuella; Leys, Didier; Russell, David; Toni, Danilo; Turc, Guillaume; Van der Worp, Bart; Wahlgren, Nils; Steiner, Thorsten

    2015-10-01

    In 2008, the recently founded European Stroke Organisation published its guidelines for the management of ischemic stroke and transient ischemic attack. This highly cited document was translated into several languages and was updated in 2009. Since then, the European Stroke Organisation has published guidelines for the management of intracranial aneurysms and subarachnoid hemorrhage, for the establishment of stroke units and stroke centers, and recently for the management of intracerebral hemorrhage. In recent years, the methodology for the development of guidelines has evolved significantly. To keep pace with this progress and driven by the strong determination of the European Stroke Organisation to further promote stroke management, education, and research, the European Stroke Organisation decided to delineate a detailed standard operating procedure for its guidelines. There are two important cornerstones in this standard operating procedure: The first is the implementation of the Grading of Recommendations Assessment, Development, and Evaluation methodology for the development of its Guideline Documents. The second is the decision of the European Stroke Organisation to move from the classical model of a single Guideline Document about a major topic (e.g. management of ischemic stroke) to focused modules (i.e. subdivisions of a major topic). This will enable the European Stroke Organisation to react faster when new developments in a specific stroke field occur and to update its recommendations on the related module rather swiftly; with the previous approach of a single large Guideline Document, its entire revision had to be completed before an updated publication, delaying the production of up-to-date guidelines. After discussion within the European Stroke Organisation Guidelines Committee and significant input from European Stroke Organisation members as well as methodologists and analysts, this document presents the official standard operating procedure for the development of the Guideline Documents of the European Stroke Organisation. © 2015 World Stroke Organization.

  1. The development of a test methodology for the evaluation of EVA gloves

    NASA Technical Reports Server (NTRS)

    O'Hara, John M.; Cleland, John; Winfield, Dan

    1988-01-01

    This paper describes the development of a standardized set of tests designed to assess EVA-gloved hand capabilities in six measurement domains: range of motion, strength, tactile perception, dexterity, fatigue, and comfort. Based upon an assessment of general human-hand functioning and EVA task requirements, several tests within each measurement domain were developed to provide a comprehensive evaluation. All tests were designed to be conducted in a glove box with the bare hand as a baseline and the EVA glove at operating pressure.

  2. Unbiased roughness measurements: the key to better etch performance

    NASA Astrophysics Data System (ADS)

    Liang, Andrew; Mack, Chris; Sirard, Stephen; Liang, Chen-wei; Yang, Liu; Jiang, Justin; Shamma, Nader; Wise, Rich; Yu, Jengyi; Hymes, Diane

    2018-03-01

    Edge placement error (EPE) has become an increasingly critical metric to enable Moore's Law scaling. Stochastic variations, as characterized for lines by line width roughness (LWR) and line edge roughness (LER), are dominant factors in EPE and known to increase with the introduction of EUV lithography. However, despite recommendations from ITRS, NIST, and SEMI standards, the industry has not agreed upon a methodology to quantify these properties. Thus, differing methodologies applied to the same image often result in different roughness measurements and conclusions. To standardize LWR and LER measurements, Fractilia has developed an unbiased measurement that uses a raw unfiltered linescan to subtract out image noise and distortions. By using Fractilia's inverse linescan model (FILM) to guide development, we will highlight the key influences of roughness metrology on plasma-based resist smoothing processes. Test wafers were deposited to represent a 5 nm node EUV logic stack. The patterning stack consists of a core Si target layer with spin-on carbon (SOC) as the hardmask and spin-on glass (SOG) as the cap. Next, these wafers were exposed on an ASML NXE 3350B EUV scanner with an advanced chemically amplified resist (CAR). Afterwards, these wafers were etched through a variety of plasma-based resist smoothing techniques using a Lam Kiyo conductor etch system. Dense line and space patterns on the etched samples were imaged with advanced Hitachi CD-SEMs, and the LER and LWR were measured with both Fractilia and an industry-standard roughness measurement software package. By employing Fractilia to guide plasma-based etch development, we demonstrate that Fractilia produces accurate roughness measurements on resist, in contrast to an industry-standard measurement software package. These results highlight the importance of subtracting out SEM image noise to obtain quicker developmental cycle times and lower target layer roughness.
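
    The noise-subtraction idea behind unbiased roughness measurement can be stated compactly: if SEM image noise is uncorrelated with the true edge positions, the measured (biased) roughness variance is the sum of the true variance and a noise variance, so

```latex
\sigma_{\mathrm{unbiased}}^{2} \,=\, \sigma_{\mathrm{biased}}^{2} - \sigma_{\mathrm{noise}}^{2}
```

    This is a simplified statement of the approach; in practice the noise term is estimated from the power spectral density of raw, unfiltered linescans rather than subtracted as a single scalar.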

  3. Using continuous process improvement methodology to standardize nursing handoff communication.

    PubMed

    Klee, Kristi; Latta, Linda; Davis-Kirsch, Sallie; Pecchia, Maria

    2012-04-01

    The purpose of this article was to describe the use of continuous performance improvement (CPI) methodology to standardize nurse shift-to-shift handoff communication. The goals of the process were to standardize the content and process of shift handoff, improve patient safety, increase patient and family involvement in the handoff process, and decrease end-of-shift overtime. This article will describe process changes made over a 4-year period as result of application of the plan-do-check-act procedure, which is an integral part of the CPI methodology, and discuss further work needed to continue to refine this critical nursing care process. Copyright © 2012 Elsevier Inc. All rights reserved.

  4. The Advantages and Challenges of Unannounced Standardized Patients Methodology to Assess Healthcare Communication

    PubMed Central

    Siminoff, Laura A.; Rogers, Heather L.; Waller, Allison C.; Harris-Haywood, Sonja; Epstein, Ronald M.; Borrell Carrio, Francesc; Gliva-McConvey, Gayle; Longo, Daniel R.

    2011-01-01

    Objective This paper provides an overview of the use of Unannounced Standardized Patients (USPs) to conduct health communication research in clinical settings. Methods Certain types of health communication situations are difficult to capture because of their rarity or unpredictable nature. In primary care, the real reasons for a visit are frequently unknown until the consultation is well under way, so it is logistically difficult for communication studies to capture many real-time communications between patients and their physicians. Although the USP methodology is ideal for capturing these communication behaviors, challenges to using this method include developing collaborative relationships with clinical practices, logistical issues such as safeguarding the identity of the USP, training USPs and creating their identities, maintaining fidelity to the role, and analyzing the resultant data. Results This paper discusses the challenges of and solutions to USP implementation. We provide an example of how to implement a USP study using an ongoing study being conducted in primary care practices. Conclusion This paper explores the advantages and challenges, as well as strategies to overcome obstacles, in implementing a USP study. Practice Implications Despite the challenges, USP methodology can contribute much to our understanding of health communication and practice. PMID:21316182

  5. Common methodologies in the evaluation of food allergy: pitfalls and prospects of food allergy prevalence studies.

    PubMed

    Shu, Shang-an; Chang, Christopher; Leung, Patrick S C

    2014-06-01

    Global and regional studies on the prevalence of food allergies are plagued by inconsistent methodologies, variations in the interpretation of results, and non-standardized study designs, making it difficult to compare the prevalence of food allergies across communities. Comparable prevalence data would enhance research to elucidate the nature of food allergies and the role of gene-environment interactions in the sensitization of children and adults to foods. Testing methodologies range from questionnaires to objective in vitro and in vivo testing, to the gold standard, the double-blind placebo-controlled food challenge (DBPCFC). Although considered the most accurate and reliable method for determining the prevalence of food allergy, DBPCFC is not always practical in epidemiological studies. Multiple logistic regression studies have been conducted to determine the predictability of food challenge outcomes, and skin prick testing and in vitro specific serum IgE appear to be the best predictors. Future studies directed towards confirming the validity of these methods, as well as developing algorithms to predict food challenge outcomes, are required, as they may someday become accessory tools to complement DBPCFC.

  6. Clinical results from a noninvasive blood glucose monitor

    NASA Astrophysics Data System (ADS)

    Blank, Thomas B.; Ruchti, Timothy L.; Lorenz, Alex D.; Monfre, Stephen L.; Makarewicz, M. R.; Mattu, Mutua; Hazen, Kevin

    2002-05-01

    Non-invasive blood glucose monitoring has long been proposed as a means of advancing the management of diabetes through increased measurement and control. A near-infrared (NIR) spectroscopy-based methodology for noninvasive monitoring has been pursued by a number of groups. The accuracy of the NIR measurement technology is limited by challenges related to the instrumentation, the heterogeneity and time-variant nature of skin tissue, and the complexity of the calibration methodology. In this work, we discuss results from a clinical study that evaluated individual calibrations for each subject based on a series of controlled calibration visits. While customizing the calibrations to individuals was intended to reduce model complexity, the extensive data requirements for each individual calibration were difficult to meet and required several days of measurement. Through careful selection of a small subset of the samples collected from the 138 participants in a previous study, we have developed a methodology for applying a single standard calibration to multiple persons. The standard calibrations have been applied to multiple individuals and shown to persist over periods greater than 24 weeks.
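
    The abstract does not name the multivariate method behind the standard calibration; partial least squares (PLS) regression is the usual workhorse for NIR calibration, so a minimal sketch of fitting and cross-validating a single shared calibration might look like the following (data shapes and values are stand-ins, not the study's):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(0)
        X = rng.random((200, 400))            # 200 spectra x 400 wavelengths (stand-in)
        y = 70 + 130 * rng.random(200)        # reference glucose, mg/dL (stand-in)

        pls = PLSRegression(n_components=10)  # latent variables chosen by validation
        y_hat = cross_val_predict(pls, X, y, cv=10).ravel()
        sep = np.sqrt(np.mean((y_hat - y) ** 2))  # standard error of prediction
        print(f"cross-validated SEP: {sep:.1f} mg/dL")

    Persistence of a standard calibration would then be checked by applying the frozen model to spectra collected weeks later, rather than by cross-validation alone.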

  7. A Multiple Testing of the ABC Method and the Development of a Second Generation Model. Part I, Preliminary Discussions of Methodology. Supplement, Computer Programs of the HDL Information Systems.

    ERIC Educational Resources Information Center

    Altmann, Berthold; Brown, William G.

    The first-generation Approach by Concept (ABC) storage and retrieval method, a method which utilizes as its subject approach appropriate standardized English-language statements processed and printed in a permuted index format, underwent a performance test, the primary objective of which was to spot deficiencies and to develop a second-generation…

  8. Laboratory Training Manual on the Use of Nuclear Techniques in Pesticide Research. Technical Reports Series No. 225.

    ERIC Educational Resources Information Center

    International Atomic Energy Agency, Vienna (Austria).

    Radiolabelled pesticides are used: in studies involving improved formulations of pesticides, to assist in developing standard residue analytical methodology, and in obtaining metabolism data to support registration of pesticides. This manual is designed to give the scientist involved in pesticide research the basic terms and principles for…

  9. Guiding the Development and Use of Cost-Effectiveness Analysis in Education

    ERIC Educational Resources Information Center

    Levin, Henry M.; Belfield, Clive

    2015-01-01

    Cost-effectiveness analysis is rarely used in education. When it is used, it often fails to meet methodological standards, especially with regard to cost measurement. Although there are occasional criticisms of these failings, we believe that it is useful to provide a listing of the more common concerns and how they might be addressed. Based upon…

  10. Developing and Testing Locally Derived Mental Health Scales: Examples from North India and Haiti

    ERIC Educational Resources Information Center

    Weaver, Lesley Jo; Kaiser, Bonnie N.

    2015-01-01

    Cross-cultural studies of mental health and illness generally adhere to one of two agendas: the comparison of mental health between sites using standard measurement tools, or the identification of locally specific ways of discussing mental illness. Here, we illustrate a methodological approach to measuring mental health that unites these two…

  11. Library Manpower, A Preliminary Study of Essential Factors Contributing to Library Staffing Patterns.

    ERIC Educational Resources Information Center

    Fairholm, G.W.; And Others

    This study was conducted to develop quantitative and qualitative productivity standards, work measures, and activity reports to facilitate effective budgeting for library staff in the State University of New York (SUNY) library system. The research methodology used by the study team involved a survey of 11 libraries of the 22 institutions in the…

  12. Handling Math Expressions in Economics: Recoding Spreadsheet Teaching Tool of Growth Models

    ERIC Educational Resources Information Center

    Moro-Egido, Ana I.; Pedauga, Luis E.

    2017-01-01

    In the present paper, we develop a teaching methodology for economic theory. The main contribution of this paper relies on combining the interactive characteristics of spreadsheet programs such as Excel and Unicode plain-text linear format for mathematical expressions. The advantage of Unicode standard rests on its ease for writing and reading…

  13. Estimating Teacher Turnover Costs: A Case Study

    ERIC Educational Resources Information Center

    Levy, Abigail Jurist; Joy, Lois; Ellis, Pamela; Jablonski, Erica; Karelitz, Tzur M.

    2012-01-01

    High teacher turnover in large U.S. cities is a critical issue for schools and districts, and the students they serve; but surprisingly little work has been done to develop methodologies and standards that districts and schools can use to make reliable estimates of turnover costs. Even less is known about how to detect variations in turnover costs…

  14. A Review of Methodological Issues in the Differential Diagnosis of Autism Spectrum Disorders in Children

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Nebel-Schwalm, Marie; Matson, Michael L.

    2007-01-01

    The development of standardized tests to assess autism, particularly in young children, is a topic of considerable interest in the research community. Recent years have seen an exponential growth in scales for differential diagnosis. Particular emphasis has been placed on defining and better delineating the symptoms of the disorder relative to…

  15. Reliability-Based Design Optimization of a Composite Airframe Component

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Coroneos, Rula; Patnaik, Surya N.

    2011-01-01

    A stochastic design optimization (SDO) methodology has been developed to design airframe structural components made of metallic and composite materials. The design method accommodates uncertainties in load, strength, and material properties that are defined by distribution functions with mean values and standard deviations. A response parameter, such as a failure mode, is thereby expressed as a function of reliability. The primitive variables, such as thermomechanical loads, material properties, and failure theories, as well as design variables such as beam depth or membrane thickness, are treated as random parameters with specified distribution functions defined by mean values and standard deviations.
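
    The notion of a response parameter becoming a function of reliability can be made concrete with a small Monte Carlo sketch: sample the primitive variables from their specified distributions and read off the probability that a failure mode is triggered. The distributions and limit state below are invented for illustration; the SDO method wraps an optimizer around this kind of evaluation, which is not shown:

        import numpy as np

        rng = np.random.default_rng(42)
        n = 1_000_000

        # Primitive variables ~ Normal(mean, std dev) -- illustrative values
        load = rng.normal(100e3, 15e3, n)          # applied load, N
        strength = rng.normal(160e3, 12e3, n)      # material capacity, N
        thickness = rng.normal(2.0e-3, 0.1e-3, n)  # membrane thickness, m

        # Failure mode as a limit state g <= 0 (capacity minus demand)
        g = strength * (thickness / 2.0e-3) - load
        p_fail = np.mean(g <= 0.0)
        print(f"P(failure) = {p_fail:.2e}, reliability = {1.0 - p_fail:.6f}")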

  16. Childhood blindness: a new form for recording causes of visual loss in children.

    PubMed Central

    Gilbert, C.; Foster, A.; Négrel, A. D.; Thylefors, B.

    1993-01-01

    The new standardized form for recording the causes of visual loss in children is accompanied by coding instructions and by a database for statistical analysis. The aim is to record the causes of childhood visual loss, with an emphasis on preventable and treatable causes, so that appropriate control measures can be planned. With this standardized methodology, it will be possible to monitor the changing patterns of childhood blindness over a period of time in response to changes in health care services, specific interventions, and socioeconomic development. PMID:8261552

  17. MASTER: a model to improve and standardize clinical breakpoints for antimicrobial susceptibility testing using forecast probabilities.

    PubMed

    Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael

    2017-09-01

    The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of < 0.1% applying current EUCAST CBPs. Error rates were > 0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to < 0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability.
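
    The model's central calculation, the probability that methodological variation pushes an observed zone diameter across a breakpoint that its true diameter would not cross, can be sketched with a two-component normal mixture plus Gaussian measurement noise. All parameters below are invented; the paper fits the mixture to real E. coli diameter distributions and takes the noise SD from repeated QC-strain measurements:

        import numpy as np

        rng = np.random.default_rng(7)
        n = 1_000_000

        # True diameters (mm): wild-type vs. resistant subpopulation
        wild_type = rng.random(n) < 0.8
        true_d = np.where(wild_type,
                          rng.normal(26.0, 2.0, n),
                          rng.normal(12.0, 2.5, n))

        sigma_method = 1.2  # methodological SD from a QC strain (assumed)
        observed_d = true_d + rng.normal(0.0, sigma_method, n)

        cbp = 19.0          # susceptible if diameter >= CBP (illustrative)
        error = (true_d >= cbp) != (observed_d >= cbp)
        print(f"categorization error rate: {error.mean():.4%}")

        # A zone of methodological uncertainty (ZMU) withholds calls near the CBP
        zmu = np.abs(observed_d - cbp) < 2.0
        err_out = (error & ~zmu).sum() / (~zmu).sum()
        print(f"error outside ZMU: {err_out:.4%}; isolates in ZMU: {zmu.mean():.1%}")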

  18. About subjective evaluation of adaptive video streaming

    NASA Astrophysics Data System (ADS)

    Tavakoli, Samira; Brunnström, Kjell; Garcia, Narciso

    2015-03-01

    The usage of HTTP Adaptive Streaming (HAS) technology by content providers is increasing rapidly. With the video content available in multiple qualities, HAS allows the quality of the downloaded video to be adapted to current network conditions, providing smooth playback. However, the time-varying video quality itself introduces a new type of impairment. Quality adaptation can be done in different ways, and finding the adaptation strategy that maximizes users' perceptual quality requires investigating the subjective perception of adaptation-related impairments. The novelty of these impairments and their comparatively long duration make most standardized assessment methodologies ill-suited for studying HAS degradations. Furthermore, in traditional testing methodologies the video quality of audiovisual services is often evaluated separately, not in the presence of audio, and jointly evaluating audio and video within a subjective test remains a relatively under-explored research field. In this work, we address the research question of determining an appropriate assessment methodology for evaluating sequences whose quality varies over time due to adaptation. We studied the influence of different adaptation-related parameters through two subjective experiments using a methodology developed to evaluate long test sequences. To study the impact of audio presence on quality assessment by the test subjects, one of the experiments was conducted in the presence of audio stimuli. The experimental results were subsequently compared with another experiment using the standardized single-stimulus Absolute Category Rating (ACR) methodology.

  19. Building Energy Monitoring and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Feng, Wei; Lu, Alison

    This project aimed to develop a standard methodology for building energy data definition, collection, presentation, and analysis; apply the developed methods to a standardized energy monitoring platform, including hardware and software, to collect and analyze building energy use data; and compile offline statistical data and online real-time data in both countries for fully understanding the current status of building energy use. This helps decode the driving forces behind the discrepancy of building energy use between the two countries; identify gaps and deficiencies of current building energy monitoring, data collection, and analysis; and create knowledge and tools to collect and analyze good building energy data to provide valuable and actionable information for key stakeholders.

  20. eSPEM - A SPEM Extension for Enactable Behavior Modeling

    NASA Astrophysics Data System (ADS)

    Ellner, Ralf; Al-Hilank, Samir; Drexler, Johannes; Jung, Martin; Kips, Detlef; Philippsen, Michael

    OMG's SPEM - by means of its (semi-)formal notation - allows for a detailed description of development processes and methodologies, but can only be used for a rather coarse description of their behavior. Concepts for a more fine-grained behavior model are considered out of scope of the SPEM standard and have to be provided by other standards like BPDM/BPMN or UML. However, a coarse-grained behavior model often impedes computer-aided enactment of a process model. In this paper we therefore present eSPEM, an extension of SPEM that is based on the UML meta-model and focuses on fine-grained behavior and life-cycle modeling, thereby supporting automated enactment of development processes.

  1. Conventionalism and Methodological Standards in Contending with Skepticism about Uncertainty

    NASA Astrophysics Data System (ADS)

    Brumble, K. C.

    2012-12-01

    What it means to measure and interpret confidence and uncertainty in a result is often particular to a specific scientific community and its methodology of verification, and methodology varies greatly across disciplines and scientific communities. Understanding the accuracy of a particular science's predictions thus depends largely upon an intimate working knowledge of the methods, standards, and conventions that underpin discoveries in that field. Valid criticism of scientific predictions and discoveries must therefore come from those who are literate in the field in question: critics need an intimate working knowledge of the methods of the particular community and of the particular research under question. The interpretation and acceptance of uncertainty is one such shared, community-based convention. In the philosophy of science, this methodological and community-based way of understanding scientific work is referred to as conventionalism. By applying the conventionalism of historian and philosopher of science Thomas Kuhn to recent attacks upon methods of multi-proxy mean temperature reconstructions, I hope to illuminate how climate skeptics and their adherents fail to appreciate the need for fluency in the methodological standards for understanding uncertainty shared by the wider climate science community. Further, I will flesh out a picture of climate science community standards of evidence and statistical argument following the work of philosopher of science Helen Longino. Failure to appreciate the conventions of professionalism and the standards of evidence accepted in the climate science community results in the application of naïve falsification criteria. Appeal to naïve falsification has, in turn, allowed scientists outside the standards and conventions of the mainstream climate science community to consider themselves, and to be judged by climate skeptics as, valid critics of particular statistical reconstructions, even where their methodological criticism is naïve or misapplied. Examples will include the skeptical responses to multi-proxy mean temperature reconstructions and the congressional hearings criticizing Michael Mann et al.'s Hockey Stick reconstruction.

  2. Wastewater Treatment Costs and Outlays in Organic Petrochemicals: Standards Versus Taxes With Methodology Suggestions for Marginal Cost Pricing and Analysis

    NASA Astrophysics Data System (ADS)

    Thompson, Russell G.; Singleton, F. D., Jr.

    1986-04-01

    With the methodology recommended by Baumol and Oates, comparable estimates of wastewater treatment costs and industry outlays are developed for effluent standard and effluent tax instruments for pollution abatement in five hypothetical organic petrochemicals (olefins) plants. The computational method uses a nonlinear simulation model of wastewater treatment to estimate the system state inputs for linear programming cost estimation, following a practice developed in a National Science Foundation (Research Applied to National Needs) study at the University of Houston and used to estimate Houston Ship Channel pollution abatement costs for the National Commission on Water Quality. A comparison focusing on best-practical and best-available technology standards, with effluent taxes adjusted to give nearly equal pollution discharges, shows that average daily treatment costs (and the confidence intervals for treatment cost) would always be less under the effluent tax than under the effluent standard approach. However, industry's total outlay for these treatment costs plus effluent taxes would always be greater under the effluent tax approach than total treatment costs would be under the effluent standard approach. Thus the practical necessity of showing smaller outlays as a prerequisite for a policy change toward efficiency dictates the need to link the economics at the microlevel with that at the macrolevel. Aggregating the plants into a programming modeling basis for individual sectors and for the economy would provide a sound basis for effective policy reform, because the opportunity costs of the salient regulatory policies would be captured. Government policymakers would then have the informational insights necessary to legislate more efficient environmental policies in light of the wealth distribution effects.
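
    The cost-versus-outlay result reported here is a textbook property of effluent taxes that a stylized two-plant example can reproduce (quadratic abatement costs and all numbers invented; this is not the paper's olefins simulation). A tax equalizes marginal abatement costs across plants, minimizing total treatment cost for a given aggregate discharge, but firms also pay the tax on every unit still discharged, so their outlay exceeds what the uniform standard would cost them:

        # Two plants with abatement cost C_i(a) = c_i * a^2 (MC_i = 2*c_i*a)
        c = [1.0, 4.0]        # cost slopes (invented)
        emissions = 100.0     # uncontrolled discharge per plant
        target = 120.0        # required total abatement, both instruments

        # Uniform standard: each plant abates the same amount
        a_std = [target / 2] * 2
        cost_std = sum(ci * a**2 for ci, a in zip(c, a_std))

        # Effluent tax: each plant abates until 2*c_i*a = t => a_i = t/(2*c_i);
        # choose t so total abatement matches the standard's
        t = target / sum(1.0 / (2 * ci) for ci in c)
        a_tax = [t / (2 * ci) for ci in c]
        cost_tax = sum(ci * a**2 for ci, a in zip(c, a_tax))
        tax_bill = sum(t * (emissions - a) for a in a_tax)

        print(f"treatment cost: standard {cost_std:.0f}, tax {cost_tax:.0f}")
        print(f"outlay under tax (cost + tax payments): {cost_tax + tax_bill:.0f}")

    With these numbers the tax cuts treatment cost from 18000 to 11520, yet tax payments of 15360 push the industry outlay to 26880, mirroring the paper's comparison.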

  3. Proposed Risk-Informed Seismic Hazard Periodic Reevaluation Methodology for Complying with DOE Order 420.1C

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kammerer, Annie

    Department of Energy (DOE) nuclear facilities must comply with DOE Order 420.1C Facility Safety, which requires that all such facilities review their natural phenomena hazards (NPH) assessments no less frequently than every ten years. The Order points the reader to Standard DOE-STD-1020-2012. In addition to providing a discussion of the applicable evaluation criteria, the Standard references other documents, including ANSI/ANS-2.29-2008 and NUREG-2117. These documents provide supporting criteria and approaches for evaluating the need to update an existing probabilistic seismic hazard analysis (PSHA). All of the documents are consistent at a high level regarding the general conceptual criteria that should be considered. However, none of the documents provides step-by-step detailed guidance on the required or recommended approach for evaluating the significance of new information and determining whether or not an existing PSHA should be updated. Further, all of the conceptual approaches and criteria given in these documents deal with changes that may have occurred in the knowledge base that might impact the inputs to the PSHA, the calculated hazard itself, or the technical basis for the hazard inputs. Given that the DOE Order is aimed at achieving and assuring the safety of nuclear facilities, which is a function not only of the level of the seismic hazard but also of the capacity of the facility to withstand vibratory ground motions, the inclusion of risk information in the evaluation process would appear to be both prudent and in line with the objectives of the Order. The purpose of this white paper is to describe a risk-informed methodology for evaluating the need for an update of an existing PSHA consistent with the DOE Order. While the development of the proposed methodology was undertaken as a result of assessments for specific SDC-3 facilities at Idaho National Laboratory (INL), and it is expected that the application at INL will provide a demonstration of the methodology, there is potential for general applicability to other facilities across the DOE complex. As such, both a general methodology and a specific approach intended for INL are described in this document. The general methodology proposed in this white paper is referred to as the “seismic hazard periodic review methodology,” or SHPRM. It presents a graded approach for SDC-3, SDC-4 and SDC-5 facilities that can be applied in any risk-informed regulatory environment once risk objectives appropriate for the framework are developed. While the methodology was developed for seismic hazard considerations, it can also be directly applied to other types of natural hazards.

  4. Predicting Great Lakes fish yields: tools and constraints

    USGS Publications Warehouse

    Lewis, C.A.; Schupp, D.H.; Taylor, W.W.; Collins, J.J.; Hatch, Richard W.

    1987-01-01

    Prediction of yield is a critical component of fisheries management. The development of sound yield prediction methodology and the application of the results of yield prediction are central to the evolution of strategies to achieve stated goals for Great Lakes fisheries and to the measurement of progress toward those goals. Despite general availability of species yield models, yield prediction for many Great Lakes fisheries has been poor due to the instability of the fish communities and the inadequacy of available data. A host of biological, institutional, and societal factors constrain both the development of sound predictions and their application to management. Improved predictive capability requires increased stability of Great Lakes fisheries through rehabilitation of well-integrated communities, improvement of data collection, data standardization and information-sharing mechanisms, and further development of the methodology for yield prediction. Most important is the creation of a better-informed public that will in turn establish the political will to do what is required.

  5. Teaching and assessing procedural skills using simulation: metrics and methodology.

    PubMed

    Lammers, Richard L; Davenport, Moira; Korley, Frederick; Griswold-Theodorson, Sharon; Fitch, Michael T; Narang, Aneesh T; Evans, Leigh V; Gross, Amy; Rodriguez, Elliot; Dodge, Kelly L; Hamann, Cara J; Robey, Walter C

    2008-11-01

    Simulation allows educators to develop learner-focused training and outcomes-based assessments. However, the effectiveness and validity of simulation-based training in emergency medicine (EM) require further investigation. Teaching and testing technical skills require methods and assessment instruments that are somewhat different from those used for cognitive or team skills. Drawing from work published by other medical disciplines as well as educational, behavioral, and human factors research, the authors developed six research themes: measurement of procedural skills; development of performance standards; assessment and validation of training methods, simulator models, and assessment tools; optimization of training methods; transfer of skills learned on simulator models to patients; and prevention of skill decay over time. The article reviews relevant and established educational research methodologies and identifies gaps in our knowledge of how physicians learn procedures. The authors present questions requiring further research that, once answered, will advance understanding of simulation-based procedural training and assessment in EM.

  6. Methodological standards for in vitro models of epilepsy and epileptic seizures. A TASK1-WG4 report of the AES/ILAE Translational Task Force of the ILAE.

    PubMed

    Raimondo, Joseph V; Heinemann, Uwe; de Curtis, Marco; Goodkin, Howard P; Dulla, Chris G; Janigro, Damir; Ikeda, Akio; Lin, Chou-Ching K; Jiruska, Premysl; Galanopoulou, Aristea S; Bernard, Christophe

    2017-11-01

    In vitro preparations are a powerful tool to explore the mechanisms and processes underlying epileptogenesis and ictogenesis. In this review, we critically examine the numerous in vitro methodologies utilized in epilepsy research. We provide support for the inclusion of detailed descriptions of techniques, including often-ignored parameters with unpredictable yet significant effects on study reproducibility and outcomes. In addition, we explore how recent developments in brain slice preparation relate to their use as models of epileptic activity.

  7. 77 FR 55737 - Small Business Size Standards: Finance and Insurance and Management of Companies and Enterprises

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    ...The U.S. Small Business Administration (SBA) proposes to increase small business size standards for 37 industries in North American Industry Classification System (NAICS) Sector 52, Finance and Insurance, and for two industries in NAICS Sector 55, Management of Companies and Enterprises. In addition, SBA proposes to change the measure of size from average assets to average receipts for NAICS 522293, International Trade Financing. As part of its ongoing comprehensive size standards review, SBA evaluated all receipts based and assets based size standards in NAICS Sectors 52 and 55 to determine whether they should be retained or revised. This proposed rule is one of a series of proposed rules that will review size standards of industries grouped by NAICS Sector. SBA issued a White Paper entitled "Size Standards Methodology" and published a notice in the October 21, 2009 issue of the Federal Register to advise the public that the document is available on its Web site at www.sba.gov/size for public review and comments. The "Size Standards Methodology" White Paper explains how SBA establishes, reviews, and modifies its receipts based and employee based small business size standards. In this proposed rule, SBA has applied its methodology that pertains to establishing, reviewing, and modifying a receipts based size standard.

  8. Protocol-developing meta-ethnography reporting guidelines (eMERGe).

    PubMed

    France, E F; Ring, N; Noyes, J; Maxwell, M; Jepson, R; Duncan, E; Turley, R; Jones, D; Uny, I

    2015-11-25

    Designing and implementing high-quality health care services and interventions requires robustly synthesised evidence. Syntheses of qualitative research studies can provide evidence of patients' experiences of health conditions; intervention feasibility, appropriateness and acceptability to patients; and advance understanding of health care issues. The unique, interpretive, theory-based meta-ethnography synthesis approach is suited to conveying patients' views and developing theory to inform service design and delivery. However, meta-ethnography reporting is often poor quality, which discourages trust in, and use of, meta-ethnography findings. Users of evidence syntheses require reports that clearly articulate analytical processes and findings. Tailored research reporting guidelines can raise reporting standards but none exists for meta-ethnography. This study aims to create an evidence-based meta-ethnography reporting guideline articulating the methodological standards and depth of reporting required to improve reporting quality. The mixed-methods design of this National Institute of Health Research-funded study (http://www.stir.ac.uk/emerge/) follows good practice in research reporting guideline development comprising: (1) a methodological systematic review (PROSPERO registration: CRD42015024709) to identify recommendations and guidance in conducting/reporting meta-ethnography; (2) a review and audit of published meta-ethnographies to identify good practice principles and develop standards in conduct/reporting; (3) an online workshop and Delphi studies to agree guideline content with 45 international qualitative synthesis experts and 45 other stakeholders including patients; (4) development and wide dissemination of the guideline and its accompanying detailed explanatory document, a report template for National Institute of Health Research commissioned meta-ethnographies, and training materials on guideline use. Meta-ethnography, devised in the field of education, is now used widely in other disciplines. Methodological advances relevant to meta-ethnography conduct exist. The extent of discipline-specific adaptations of meta-ethnography and the fit of any adaptions with the underpinning philosophy of meta-ethnography require investigation. Well-reported meta-ethnography findings could inform clinical decision-making. A bespoke meta-ethnography reporting guideline is needed to improve reporting quality, but to be effective potential users must know it exists, trust it and use it. Therefore, a rigorous study has been designed to develop and promote a guideline. By raising reporting quality, the guideline will maximise the likelihood that high-quality meta-ethnographies will contribute robust evidence to improve health care and patient outcomes.

  9. Standards should be applied in the prevention and handling of missing data for patient-centered outcomes research: a systematic review and expert consensus.

    PubMed

    Li, Tianjing; Hutfless, Susan; Scharfstein, Daniel O; Daniels, Michael J; Hogan, Joseph W; Little, Roderick J A; Roy, Jason A; Law, Andrew H; Dickersin, Kay

    2014-01-01

    To recommend methodological standards in the prevention and handling of missing data for primary patient-centered outcomes research (PCOR). We searched National Library of Medicine Bookshelf and Catalog as well as regulatory agencies' and organizations' Web sites in January 2012 for guidance documents that had formal recommendations regarding missing data. We extracted the characteristics of included guidance documents and recommendations. Using a two-round modified Delphi survey, a multidisciplinary panel proposed mandatory standards on the prevention and handling of missing data for PCOR. We identified 1,790 records and assessed 30 as having relevant recommendations. We proposed 10 standards as mandatory, covering three domains. First, the single best approach is to prospectively prevent missing data occurrence. Second, use of valid statistical methods that properly reflect multiple sources of uncertainty is critical when analyzing missing data. Third, transparent and thorough reporting of missing data allows readers to judge the validity of the findings. We urge researchers to adopt rigorous methodology and promote good science by applying best practices to the prevention and handling of missing data. Developing guidance on the prevention and handling of missing data for observational studies and studies that use existing records is a priority for future research.
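
    As one concrete instance of the second domain (valid statistical methods that properly reflect multiple sources of uncertainty), multiple imputation pools results across imputed datasets with Rubin's rules, so the final standard error carries both within- and between-imputation uncertainty. A minimal sketch with invented numbers:

        import numpy as np

        def pool_rubins_rules(estimates, variances):
            """Combine estimates from m multiply-imputed datasets (Rubin's rules)."""
            q = np.asarray(estimates, dtype=float)  # per-imputation point estimates
            u = np.asarray(variances, dtype=float)  # per-imputation squared SEs
            m = len(q)
            q_bar = q.mean()
            u_bar = u.mean()                        # within-imputation variance
            b = q.var(ddof=1)                       # between-imputation variance
            total = u_bar + (1 + 1 / m) * b         # total variance
            return q_bar, np.sqrt(total)

        # e.g. treatment effects from m=5 imputed datasets (numbers illustrative)
        est, se = pool_rubins_rules([1.8, 2.1, 1.9, 2.3, 2.0],
                                    [0.25, 0.22, 0.28, 0.24, 0.26])
        print(f"pooled estimate {est:.2f} (SE {se:.2f})")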

  10. A Quantitative Examination of Critical Success Factors Comparing Agile and Waterfall Project Management Methodologies

    ERIC Educational Resources Information Center

    Pedersen, Mitra

    2013-01-01

    This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires use of project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…

  11. Efficient design and inference for multistage randomized trials of individualized treatment policies.

    PubMed

    Dawson, Ree; Lavori, Philip W

    2012-01-01

    Clinical demand for individualized "adaptive" treatment policies in diverse fields has spawned development of clinical trial methodology for their experimental evaluation via multistage designs, building upon methods intended for the analysis of naturalistically observed strategies. Because often there is no need to parametrically smooth multistage trial data (in contrast to observational data for adaptive strategies), it is possible to establish direct connections among different methodological approaches. We show by algebraic proof that the maximum likelihood (ML) and optimal semiparametric (SP) estimators of the population mean of the outcome of a treatment policy and its standard error are equal under certain experimental conditions. This result is used to develop a unified and efficient approach to design and inference for multistage trials of policies that adapt treatment according to discrete responses. We derive a sample size formula expressed in terms of a parametric version of the optimal SP population variance. Nonparametric (sample-based) ML estimation performed well in simulation studies, in terms of achieved power, for scenarios most likely to occur in real studies, even though sample sizes were based on the parametric formula. ML outperformed the SP estimator; differences in achieved power predominantly reflected differences in their estimates of the population mean (rather than estimated standard errors). Neither methodology could mitigate the potential for overestimated sample sizes when strong nonlinearity was purposely simulated for certain discrete outcomes; however, such departures from linearity may not be an issue for many clinical contexts that make evaluation of competitive treatment policies meaningful.
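
    For a discrete-response design of this kind, the nonparametric (sample-based) ML estimate of a policy's population mean is a plug-in average over the policy's treatment paths. The sketch below simulates one two-stage trial and evaluates the policy "start on A; continue A on response; switch to C on non-response"; all response rates and outcome distributions are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 2000

        # Simulated two-stage trial: everyone starts treatment A; responders
        # continue A; non-responders are re-randomized 1:1 to switch to B or C.
        respond = rng.random(n) < 0.4
        second = np.where(respond, "A", rng.choice(["B", "C"], size=n))
        outcome = np.where(respond, rng.normal(10, 2, n),
                           np.where(second == "C", rng.normal(7, 2, n),
                                    rng.normal(5, 2, n)))

        # Plug-in (nonparametric ML) estimate of the policy mean:
        p_hat = respond.mean()
        mu_resp = outcome[respond].mean()
        mu_nonresp_c = outcome[~respond & (second == "C")].mean()
        mu_policy = p_hat * mu_resp + (1 - p_hat) * mu_nonresp_c
        print(f"estimated policy mean: {mu_policy:.2f}")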

  12. A new methodology for hydro-abrasive erosion tests simulating penstock erosive flow

    NASA Astrophysics Data System (ADS)

    Aumelas, V.; Maj, G.; Le Calvé, P.; Smith, M.; Gambiez, B.; Mourrat, X.

    2016-11-01

    Hydro-abrasive resistance is an important property requirement for the hydroelectric power plant penstock coating systems used by EDF. The selection of durable coating systems requires an experimental characterisation of coating performance, which can be achieved by performing accelerated and representative laboratory tests. For severe erosion induced by a penstock flow, there is no suitable method or standard representative of real erosive flow conditions. The presented study aims at developing a new methodology and an associated laboratory experimental device. The objective of the laboratory apparatus is to subject coated test specimens to wear conditions similar to those generated at the penstock lower generatrix in actual flow conditions. Thirteen preselected coating solutions were first tested in a 45-hour erosion test, and a ranking of the thirteen solutions was determined after characterisation. To complete this first evaluation and to determine the wear kinetics of the four best coating solutions, additional erosion tests were conducted over a longer duration of 216 hours. A comparison of this new method with standardized tests and with real service operating flow conditions is also discussed. To complete the final ranking based on hydro-abrasive erosion tests, trial tests were carried out on penstock samples to check the application method of the selected coating systems. The paper gives some perspectives on erosion test methodologies for materials and coating solutions for hydraulic applications. The developed test method can also be applied in other fields.

  13. Development of a North American paleoclimate pollen-based reconstruction database application

    NASA Astrophysics Data System (ADS)

    Ladd, Matthew; Mosher, Steven; Viau, Andre

    2013-04-01

    Recent efforts to synthesize paleoclimate records across the globe have prompted an effort to standardize the different paleoclimate archives currently available, in order to facilitate data-model comparisons and hence improve our estimates of future climate change. The methodology and programs behind a reconstruction often make it challenging for other researchers to reproduce its results, so there is a need to standardize paleoclimate reconstruction databases in an application specific to proxy data. Here we present a methodology, implemented in the open-source R language on North American pollen databases (e.g., NAPD, NEOTOMA), in which the application can easily be used to perform new reconstructions and to quickly analyze, output, and plot the data. The application was developed to easily test methodological and spatial/temporal issues that might affect reconstruction results, and it allows users to spend more time analyzing and interpreting results instead of on data management and processing. Among its features are two menu-driven modules that make the user feel at ease with the program, the ability to use different pollen sums, a choice among 70 available climate variables, substitution of an appropriate modern climate dataset, a user-friendly regional target domain, temporal resolution criteria, linear interpolation, and many other features for thorough exploratory data analysis. The application will be available for North American pollen-based reconstructions and will eventually be made available as a package through the CRAN repository by late 2013.
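
    The abstract does not state which transfer function the application uses; the modern analog technique (MAT) is a common choice for pollen-based reconstruction and illustrates the pipeline compactly. The sketch below is written in Python rather than R for consistency with the other examples here, it shows a generic MAT rather than the application's specific method, and every dataset in it is invented:

        import numpy as np

        def mat_reconstruct(fossil, modern_pollen, modern_climate, k=5):
            """Modern-analog sketch: find the k modern pollen samples closest
            to the fossil spectrum by squared-chord distance and average their
            observed climate, weighting by inverse distance. Pollen spectra
            are proportions summing to 1."""
            d = np.sum((np.sqrt(modern_pollen) - np.sqrt(fossil)) ** 2, axis=1)
            nearest = np.argsort(d)[:k]
            w = 1.0 / np.maximum(d[nearest], 1e-9)
            return np.average(modern_climate[nearest], weights=w)

        rng = np.random.default_rng(1)
        modern = rng.dirichlet(np.ones(20), size=300)  # 300 modern sites, 20 taxa
        july_t = rng.uniform(5, 25, size=300)          # site mean July temp (degC)
        fossil = rng.dirichlet(np.ones(20))            # one fossil sample
        print(f"reconstructed July temperature: "
              f"{mat_reconstruct(fossil, modern, july_t):.1f} degC")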

  14. Microplastic Generation in the Marine Environment Through Degradation and Fragmentation

    NASA Astrophysics Data System (ADS)

    Perryman, M. E.; Jambeck, J.; Woodson, C. B.; Locklin, J.

    2016-02-01

    Plastic use has become requisite in our global economy; as population continues to increase, so too will plastic production. At its end of life, some plastic is mismanaged and ends up in the ocean. Once there, various environmental stresses eventually fragment plastic into microplastic pieces, now ubiquitous in the marine environment. Microplastics pose a serious threat to marine biota and possibly humans. Though the general mechanisms of microplastic formation are known, the rate and extent are not, and no standard methodology for testing the formation of microplastics currently exists. We developed a replicable and flexible methodology for testing the formation of microplastics and used it to test the effects of UV, thermal, and mechanical stress on various types of plastic. We tested for fragmentation by measuring weight and size distribution, and looked for signs of degraded plastic using Fourier transform infrared spectroscopy. Although our results showed no signs of fragmentation, we did observe degradation. We thereby established a sound methodology and provided a benchmark for further studies.

  15. The Effectiveness of Educational Technology Applications for Enhancing Mathematics Achievement in K-12 Classrooms: A Meta-Analysis

    ERIC Educational Resources Information Center

    Cheung, Alan C. K.; Slavin, Robert E.

    2013-01-01

    The present review examines research on the effects of educational technology applications on mathematics achievement in K-12 classrooms. Unlike previous reviews, this review applies consistent inclusion standards to focus on studies that met high methodological standards. In addition, methodological and substantive features of the studies are…

  16. 75 FR 5589 - Science Advisory Board Staff Office; Request for Public Nominations of Experts To Conduct a Peer...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-03

    ... Conductivity Using Field Data: An Adaptation of the U.S. EPA's Standard Methodology for Deriving Water Quality... Adaptation of the U.S. EPA's Standard Methodology for Deriving Water Quality Criteria'' DATES: Nominations... Deriving Water Quality Criteria'' should be directed to Dr. Michael Slimak, ORD's Associate Director of...

  17. Formulation of a parametric systems design framework for disaster response planning

    NASA Astrophysics Data System (ADS)

    Mma, Stephanie Weiya

    The occurrence of devastating natural disasters in the past several years has prompted communities, responding organizations, and governments to seek ways to improve disaster preparedness capabilities locally, regionally, nationally, and internationally. A holistic approach to design used in the aerospace and industrial engineering fields enables efficient allocation of resources through applied parametric changes within a particular design to improve performance metrics to selected standards. In this research, this methodology is applied to disaster preparedness, using a community's time to restoration after a disaster as the response metric. A review of the responses to Hurricane Katrina and the 2010 Haiti earthquake, among other prominent disasters, provides observations leading to some current capability benchmarking. A need for holistic assessment and planning exists for communities, but the current response planning infrastructure lacks a standardized framework and standardized assessment metrics. Within the humanitarian logistics community, several different metrics exist, enabling quantification and measurement of a particular area's vulnerability. These metrics, combined with design and planning methodologies from related fields, such as engineering product design, military response planning, and business process redesign, provide insight and a framework from which to begin developing a methodology to enable holistic disaster response planning. The developed methodology was applied to the communities of Shelby County, TN and pre-Hurricane-Katrina Orleans Parish, LA. Available literature and reliable media sources provide information about the different values of system parameters within the decomposition of the community aspects and also about relationships among the parameters. The community was modeled as a system dynamics model and was tested in the implementation of two-, five-, and ten-year improvement plans for Preparedness, Response, and Development capabilities, and combinations of these capabilities. For Shelby County and for Orleans Parish, the Response improvement plan reduced restoration time the most. For the combined capabilities, Shelby County experienced the greatest reduction in restoration time with the implementation of Development and Response capability improvements, and for Orleans Parish it was the Preparedness and Response capability improvements. Optimization of restoration time with community parameters was tested using a Particle Swarm Optimization algorithm. Fifty different optimized restoration times were generated using the Particle Swarm Optimization algorithm and ranked using the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). The optimization results indicate that the greatest reduction in restoration time for a community is achieved with a particular combination of different parameter values instead of the maximization of each parameter.
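
    TOPSIS, used above to rank the fifty optimized solutions, scores each alternative by its closeness to an ideal point and distance from an anti-ideal point in weighted, normalized criterion space. A minimal sketch follows; the criteria, weights, and scores are invented, since the study's actual decision matrix is not given in this abstract:

        import numpy as np

        def topsis(matrix, weights, benefit):
            """Rank alternatives with TOPSIS.

            matrix : alternatives x criteria scores
            weights: criterion weights summing to 1
            benefit: True where larger is better, False where smaller is better
            """
            m = np.asarray(matrix, dtype=float)
            norm = m / np.linalg.norm(m, axis=0)   # vector-normalize columns
            v = norm * np.asarray(weights)
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
            d_pos = np.linalg.norm(v - ideal, axis=1)
            d_neg = np.linalg.norm(v - anti, axis=1)
            return d_neg / (d_pos + d_neg)         # closeness: higher is better

        scores = topsis(matrix=[[120, 0.8, 3.0],   # restoration days, coverage,
                                [150, 0.9, 2.0],   # cost index (all invented)
                                [100, 0.6, 4.0]],
                        weights=[0.5, 0.3, 0.2],
                        benefit=[False, True, False])
        print("ranking (best first):", scores.argsort()[::-1])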

  18. A normative price for a manufactured product: The SAMICS methodology. Volume 2: Analysis

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. G.

    1979-01-01

    The Solar Array Manufacturing Industry Costing Standards provide standard formats, data, assumptions, and procedures for determining the price a hypothetical solar array manufacturer would have to obtain in the market to realize a specified after-tax rate of return on equity at a specified level of production. The methodology and its theoretical background are presented. The model is sufficiently general to be used in any production-line manufacturing environment. Implementation of this methodology by the Solar Array Manufacturing Industry Simulation computer program is discussed.
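
    A one-period caricature shows what a normative price means operationally: the lowest price at which after-tax profit delivers exactly the required return on equity. SAMICS itself accounts for depreciation schedules, inflation, and financing structure; none of that is modeled below, and all numbers are invented:

        def normative_price(annual_cost, equity, quantity, roe=0.20, tax_rate=0.48):
            """Price per unit at which the hypothetical manufacturer earns
            exactly the specified after-tax return on equity (one period)."""
            pretax_profit_needed = roe * equity / (1.0 - tax_rate)
            return (annual_cost + pretax_profit_needed) / quantity

        # e.g. $2M annual cost, $1.5M equity, 500 kW of modules shipped per year
        price = normative_price(2.0e6, 1.5e6, 500e3)
        print(f"required price: ${price:.2f} per peak watt")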

  20. “Heidelberg standard examination” and “Heidelberg standard procedures” – Development of faculty-wide standards for physical examination techniques and clinical procedures in undergraduate medical education

    PubMed Central

    Nikendei, C.; Ganschow, P.; Groener, J. B.; Huwendiek, S.; Köchel, A.; Köhl-Hackert, N.; Pjontek, R.; Rodrian, J.; Scheibe, F.; Stadler, A.-K.; Steiner, T.; Stiepak, J.; Tabatabai, J.; Utz, A.; Kadmon, M.

    2016-01-01

    The competent physical examination of patients and the safe and professional implementation of clinical procedures constitute essential components of medical practice in nearly all areas of medicine. The central objective of the projects “Heidelberg standard examination” and “Heidelberg standard procedures”, which were initiated by students, was to establish uniform interdisciplinary standards for physical examination and clinical procedures, and to distribute them in coordination with all clinical disciplines at the Heidelberg University Hospital. The presented project report illuminates the background of the initiative and its methodological implementation. Moreover, it describes the multimedia documentation in the form of pocketbooks and a multimedia internet-based platform, as well as the integration into the curriculum. The project presentation aims to provide orientation and action guidelines to facilitate similar processes in other faculties. PMID:27579354

  1. Standardization in gully erosion studies: methodology and interpretation of magnitudes from a global review

    NASA Astrophysics Data System (ADS)

    Castillo, Carlos; Gomez, Jose Alfonso

    2016-04-01

    Standardization is the process of developing common conventions or procedures to facilitate the communication, use, comparison and exchange of products or information among different parties. It has been a useful tool in fields from industry to statistics, for technical, economic and social reasons. In science, the need for standardization has been recognised in the definition of methods as well as in publication formats. With respect to gully erosion, a number of initiatives have proposed common methodologies, for instance for gully delineation (Castillo et al., 2014) and geometrical measurements (Casalí et al., 2015). The main aims of this work are: 1) to examine previous proposals in the gully erosion literature involving standardization processes; 2) to contribute new approaches to improve the homogeneity of methodologies and the presentation of results, for better communication within the gully erosion community. For this purpose, we evaluated the basic information provided on environmental factors, discussed the delineation and measurement procedures proposed in previous works and, finally, analysed statistically the severity of degradation levels derived from different indicators at the world scale. As a result, we present suggestions intended as guidance for survey design as well as for the interpretation of vulnerability levels and degradation rates in future gully erosion studies. References Casalí, J., Giménez, R., and Campo-Bescós, M. A.: Gully geometry: what are we measuring?, SOIL, 1, 509-513, doi:10.5194/soil-1-509-2015, 2015. Castillo, C., Taguas, E. V., Zarco-Tejada, P., James, M. R., and Gómez, J. A. (2014), The normalized topographic method: an automated procedure for gully mapping using GIS, Earth Surf. Process. Landforms, 39, 2002-2015, doi: 10.1002/esp.3595

  2. Methodological approach for the collection and simultaneous estimation of greenhouse gases emission from aquaculture ponds.

    PubMed

    Vasanth, Muthuraman; Muralidhar, Moturi; Saraswathy, Ramamoorthy; Nagavel, Arunachalam; Dayal, Jagabattula Syama; Jayanthi, Marappan; Lalitha, Natarajan; Kumararaja, Periyamuthu; Vijayan, Koyadan Kizhakkedath

    2016-12-01

    Global warming/climate change is the greatest environmental threat of our time. The rapidly developing aquaculture sector is an anthropogenic activity whose contribution to global warming is little understood, and estimation of greenhouse gas (GHG) emissions from aquaculture ponds is a key step in predicting the impact of aquaculture on global warming. A comprehensive methodology was developed for sampling and simultaneous analysis of the GHGs carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) from aquaculture ponds. The GHG fluxes were collected using a cylindrical acrylic chamber, an air pump, and Tedlar bags; the cylindrical acrylic floating chamber was fabricated to collect the GHGs emanating from the pond surface. The sampling methodology was standardized, and in-house method validation was established by demonstrating linearity, accuracy, precision, and specificity. GHG samples were found to be stable for 3 days when stored at 10 ± 2 °C. The developed methodology was used to quantify GHGs in Pacific white shrimp (Penaeus vannamei) and black tiger shrimp (Penaeus monodon) culture ponds over a period of 4 months. The rate of emission of carbon dioxide was much greater than that of the other two GHGs. Average GHG emissions (g ha-1 day-1) during the culture were comparatively high in P. vannamei culture ponds.
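
    Floating-chamber measurements are typically converted to a flux by fitting the concentration rise over time and scaling by chamber volume over enclosed surface area, with the ideal-gas law converting ppm to mass. The sketch below shows that standard calculation; the chamber dimensions and time series are invented, and the paper's exact computation may differ:

        import numpy as np

        def chamber_flux(times_h, conc_ppm, volume_m3, area_m2,
                         molar_mass=44.01, temp_k=303.15, pressure_pa=101325.0):
            """Flux in mg m^-2 h^-1 from a floating-chamber time series.

            Fit dC/dt by least squares, then convert ppm/h to mass with the
            ideal-gas law: 1 ppm = M*P/(R*T) mg per m^3 of chamber air.
            """
            slope_ppm_h = np.polyfit(times_h, conc_ppm, 1)[0]  # ppm per hour
            mg_per_m3_per_ppm = molar_mass * pressure_pa / (8.314 * temp_k) / 1000.0
            return slope_ppm_h * mg_per_m3_per_ppm * volume_m3 / area_m2

        # e.g. CO2 rising in a 30 L chamber over 0.07 m^2 of pond surface
        t = np.array([0.0, 0.25, 0.5, 0.75, 1.0])          # hours
        c = np.array([410.0, 455.0, 498.0, 540.0, 586.0])  # ppm (invented)
        print(f"{chamber_flux(t, c, 0.030, 0.07):.1f} mg CO2 m^-2 h^-1")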

  3. Bringing Standardized Processes in Atom-Probe Tomography: I Establishing Standardized Terminology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Ian M; Danoix, F; Forbes, Richard

    2011-01-01

    Defining standardized methods requires careful consideration of the entire field and its applications. The International Field Emission Society (IFES) has elected a Standards Committee, whose task is to determine the steps needed to establish atom-probe tomography as an accepted metrology technique. Specific tasks include developing protocols or standards for: terminology and nomenclature; metrology and instrumentation, including specifications for reference materials; test methodologies; modeling and simulations; and science-based health, safety, and environmental practices. The Committee is currently working on defining terminology related to atom-probe tomography, with the goal of including the terms in a document published by the International Organization for Standardization (ISO). Many terms also used in other disciplines have already been defined and will be discussed for adoption in the context of atom-probe tomography.

  4. Toward improved guideline quality: using the COGS statement with GEM.

    PubMed

    Shiffman, Richard N; Michel, Georges

    2004-01-01

    The Conference on Guideline Standardization (COGS) was convened to create a standardized documentation checklist for clinical practice guidelines in an effort to promote guideline quality and facilitate implementation. The statement was created by a multidisciplinary panel using a rigorous consensus development methodology. The Guideline Elements Model (GEM) provides a standardized approach to representing guideline documents using XML. In this work, we demonstrate the sufficiency of GEM for describing COGS components. Using the mapping between COGS and GEM elements we built an XSLT application to examine a guideline's adherence (or non-adherence) to the COGS checklist. Once a guideline has been marked up according to the GEM hierarchy, its knowledge content can be reused in multiple ways.
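
    A guideline marked up in GEM XML can be checked against a COGS-style checklist mechanically. The Python stand-in below mirrors what the XSLT application does; the element paths in the mapping are hypothetical placeholders, not the published COGS-GEM mapping:

        import xml.etree.ElementTree as ET

        # Hypothetical COGS-item -> GEM-element mapping (illustrative names only)
        COGS_TO_GEM = {
            "Overview material": ".//identity/title",
            "Developer": ".//developer",
            "Release date": ".//releaseDate",
            "Evidence rating scheme": ".//evidenceQuality",
        }

        def cogs_adherence(gem_xml_path):
            """Report which COGS checklist items have content in a GEM file."""
            root = ET.parse(gem_xml_path).getroot()
            for item, xpath in COGS_TO_GEM.items():
                node = root.find(xpath)
                present = node is not None and (node.text or "").strip() != ""
                print(f"{item:25s} {'PRESENT' if present else 'MISSING'}")

        # cogs_adherence("guideline.gem.xml")  # path is a placeholder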

  5. Summary Report Panel 1: The Need for Protocols and Standards in Research on Underwater Noise Impacts on Marine Life.

    PubMed

    Erbe, Christine; Ainslie, Michael A; de Jong, Christ A F; Racca, Roberto; Stocker, Michael

    2016-01-01

    As concern about anthropogenic noise and its impacts on marine fauna is increasing around the globe, data are being compared across populations, species, noise sources, geographic regions, and time. However, much of the raw and processed data are not comparable due to differences in measurement methodology, analysis and reporting, and a lack of metadata. Common protocols and more formal, international standards are needed to ensure the effectiveness of research, conservation, regulation and practice, and unambiguous communication of information and ideas. Developing standards takes time and effort, is largely driven by a few expert volunteers, and would benefit from stakeholders' contribution and support.

  6. Development Of Methodologies Using PhabrOmeter For Fabric Drape Evaluation

    NASA Astrophysics Data System (ADS)

    Lin, Chengwei

    Evaluation of fabric drape is important for the textile industry as it reveals the aesthetics and functionality of cloth and apparel. Although fabric drape measuring methods have been developed over several decades, they are falling behind the industry's need for fast product development. To meet industry requirements, it is necessary to develop an effective and reliable method to evaluate fabric drape. The purpose of the present study is to determine whether the PhabrOmeter, a fabric sensory performance evaluation instrument developed to provide fast and reliable quality testing results, can be applied to fabric drape evaluation. The study also sought to determine the relationship between fabric drape and other fabric attributes. In addition, a series of conventional methods, including AATCC, ASTM and ISO standards, were used to characterize the fabric samples. All the data were compared and analyzed using linear correlation. The results indicate that the PhabrOmeter is a reliable and effective instrument for fabric drape evaluation. Effects including fabric structure and testing direction were also considered to examine their impact on fabric drape.
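
    A minimal sketch of the linear correlation analysis mentioned above (all numbers synthetic, not the study's measurements):

    ```python
    # Correlate instrument scores against a conventional drape coefficient
    # for the same (hypothetical) fabric samples.
    from scipy.stats import pearsonr

    phabrometer_drape = [2.1, 3.4, 4.0, 5.2, 6.8]       # hypothetical scores
    drape_coefficient = [31.0, 42.5, 47.9, 58.3, 70.1]  # hypothetical % values
    r, p = pearsonr(phabrometer_drape, drape_coefficient)
    print(f"r = {r:.3f}, p = {p:.4f}")
    ```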

  7. Developing standards for the development of glaucoma virtual clinics using a modified Delphi approach.

    PubMed

    Kotecha, Aachal; Longstaff, Simon; Azuara-Blanco, Augusto; Kirwan, James F; Morgan, James Edwards; Spencer, Anne Fiona; Foster, Paul J

    2018-04-01

    To obtain consensus opinion on a standards framework for the development and implementation of virtual clinics for glaucoma monitoring in the UK, using a modified Delphi methodology. A modified Delphi technique was used that involved sampling members of the UK and Eire Glaucoma Society (UKEGS). The first round scored the strength of agreement with a series of standards statements on a 9-point Likert scale. The revised standards were subjected to a second round of scoring and free-text comment. The final standards were discussed and agreed by an expert panel consisting of seven glaucoma subspecialists from across the UK. A version of the standards was submitted to external stakeholders for a 3-month consultation. There was a 44% response rate of UKEGS members to rounds 1 and 2, consisting largely of consultant ophthalmologists with a specialist interest in glaucoma. The final version of the standards document was validated by stakeholder consultation and contains four sections pertaining to the patient groups, testing methods, staffing requirements and governance structure of NHS secondary care glaucoma virtual clinic models. Use of a modified Delphi approach has provided consensus agreement on the standards required for the development of virtual clinics to monitor glaucoma in the UK. It is anticipated that this document will be useful as a guide for those implementing this model of service delivery. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  8. Evaluation of the Turkish translation of the Minimal Standard Terminology for Digestive Endoscopy by development of an endoscopic information system.

    PubMed

    Atalağ, Koray; Bilgen, Semih; Gür, Gürden; Boyacioğlu, Sedat

    2007-09-01

    There are very few evaluation studies of the Minimal Standard Terminology for Digestive Endoscopy. This study aims to evaluate the usage of the Turkish translation of the Minimal Standard Terminology by developing an endoscopic information system. After elicitation of requirements, database modeling and software development were performed. Minimal Standard Terminology-driven forms were designed for rapid data entry. The endoscopic report was rapidly created by applying basic Turkish syntax and grammar rules. Entering free text and editing the final report were also possible. After three years of live usage, data analysis was performed and the results were evaluated. The system has been used for reporting all endoscopic examinations. 15,638 valid records were analyzed, including 11,381 esophagogastroduodenoscopies, 2,616 colonoscopies, 1,079 rectoscopies and 562 endoscopic retrograde cholangiopancreatographies. In accordance with previous validation studies, the overall usage of Minimal Standard Terminology terms was very high: 85% for examination characteristics, 94% for endoscopic findings and 94% for endoscopic diagnoses. Some new terms, attributes and allowed values were also added for better clinical coverage. Minimal Standard Terminology has been shown to cover a high proportion of routine endoscopy reports. Good user acceptance indicates that both the terms and the structure of the Minimal Standard Terminology are consistent with usual clinical thinking. However, future work on the Minimal Standard Terminology is needed for better coverage of endoscopic retrograde cholangiopancreatography examinations. Technically, new software development methodologies have to be sought to lower the cost of development and maintenance. They should also address integration and interoperability of disparate information systems.

  9. Encouraging the pursuit of advanced degrees in science and engineering: Top-down and bottom-up methodologies

    NASA Technical Reports Server (NTRS)

    Maddox, Anthony B.; Smith-Maddox, Renee P.; Penick, Benson E.

    1989-01-01

    The MassPEP/NASA Graduate Research Development Program (GRDP) whose objective is to encourage Black Americans, Mexican Americans, American Indians, Puerto Ricans, and Pacific Islanders to pursue graduate degrees in science and engineering is described. The GRDP employs a top-down or goal driven methodology through five modules which focus on research, graduate school climate, technical writing, standardized examinations, and electronic networking. These modules are designed to develop and reinforce some of the skills necessary to seriously consider the goal of completing a graduate education. The GRDP is a community-based program which seeks to recruit twenty participants from a pool of Boston-area undergraduates enrolled in engineering and science curriculums and recent graduates with engineering and science degrees. The program emphasizes that with sufficient information, its participants can overcome most of the barriers perceived as preventing them from obtaining graduate science and engineering degrees. Experience has shown that the top-down modules may be complemented by a more bottom-up or event-driven methodology. This approach considers events in the academic and professional experiences of participants in order to develop the personal and leadership skills necessary for graduate school and similar endeavors.

  10. Development of a standardized methodology for quantifying total chlorophyll and carotenoids from foliage of hardwood and conifer tree species

    Treesearch

    Rakesh Minocha; Gabriela Martinez; Benjamin Lyons; Stephanie Long

    2009-01-01

    Despite the availability of several protocols for the extraction of chlorophylls and carotenoids from foliage of forest trees, information regarding their respective extraction efficiencies is scarce. We compared the efficiencies of acetone, ethanol, dimethyl sulfoxide (DMSO), and N, N-dimethylformamide (DMF) over a range of incubation times for the extraction of...

  11. Progress toward developing field protocols for a North American marsh bird monitoring program

    Treesearch

    Courtney J. Conway; Steven T. A. Timmermans

    2005-01-01

    Populations of many marsh-dependent birds appear to be declining, but we currently lack a continental program that provides estimates of population trends for most secretive marshbirds. The survey protocol outlined here is a standardized survey methodology being used on a pilot basis at National Wildlife Refuges and other protected wetland areas across North America...

  12. A User's Guide to CGNS. 1.0

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Poirier, Diane M. A.; Bush, Robert H.; Towne, Charles E.

    2001-01-01

    The CFD General Notation System (CGNS) was developed to be a self-descriptive, machine-independent standard for storing CFD aerodynamic data. This guide aids users in the implementation of CGNS. It is intended as a tutorial on the usage of the CGNS mid-level library routines for reading and writing grid and flow solution datasets for both structured and unstructured methodologies.
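
    The mid-level library itself is a C/Fortran API; as a hedged aside (the file name is hypothetical, and the 'label' attribute reflects the CGNS/HDF5 mapping as understood here, so verify against the standard), a CGNS file written with the HDF5 back end can be inspected with generic HDF5 tools to see the node hierarchy the guide describes:

    ```python
    # Walk the node tree of an HDF5-flavored CGNS file and print each node
    # path with its CGNS label attribute, if present.
    import h5py

    def walk(name, obj):
        label = obj.attrs.get("label", b"")
        if isinstance(label, bytes):
            label = label.decode(errors="replace")
        print(f"{name}  [{label}]")

    with h5py.File("grid.cgns", "r") as f:  # hypothetical file name
        f.visititems(walk)
    ```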

  13. Learning Styles Inequity for Small to Micro Firms (SMFs): Social Exclusion through Work-Based E-Learning Practice in Europe

    ERIC Educational Resources Information Center

    Hardaker, Glenn; Dockery, Richard; Sabki, Aishah

    2007-01-01

    Purpose: The elearn2work study of learning styles in the context of small to micro firms (SMFs) and their perceived satisfaction has identified some important findings specific to e-learning content design, delivery and international standards development. Design/methodology/approach: The method of research adopts a deductive rather than an…

  14. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Programs in use today generally have all of the function and information processing capabilities required to do their specified job. However, older programs usually use obsolete technology, are not integrated properly with other programs, and are difficult to maintain. Reengineering is becoming a prominent discipline as organizations try to move their systems to more modern and maintainable technologies. The Johnson Space Center (JSC) Software Technology Branch (STB) is researching and developing a system to support reengineering of older FORTRAN programs into more maintainable forms that can also be more readily translated to modern languages such as FORTRAN 8x, Ada, or C. This activity has led to the development of maintenance strategies for design recovery and reengineering. These strategies include a set of standards, methodologies, and the concepts for a software environment to support design recovery and reengineering. A brief description of the problem being addressed and the approach being taken by the STB toward providing an economic solution to the problem is provided. A statement of the maintenance problems, the benefits and drawbacks of three alternative solutions, and a brief history of the STB experience in software reengineering are followed by the STB's new FORTRAN standards, methodology, and the concepts for a software environment.

  15. Opto-Technical Monitoring - a Standardized Methodology to Assess the Treatment of Historical Stone Surfaces

    NASA Astrophysics Data System (ADS)

    Rahrig, M.; Drewello, R.; Lazzeri, A.

    2018-05-01

    Monitoring is an essential requirement for the planning, assessment and evaluation of conservation measures. It should be based on a standardized and reproducible observation of the historical surface. For many areas and materials, suitable methods for long-term monitoring already exist, but hardly any non-destructive testing methods have been used to test new materials for the conservation of damaged stone surfaces. The Nano-Cathedral project, funded by the European Union's Horizon 2020 research and innovation program, is developing new materials and technologies for preserving damaged stone surfaces of built heritage. The prototypes developed are adjusted to the needs and problems of a total of six major cultural monuments in Europe. In addition to testing the materials under controlled laboratory conditions, the products have been applied to trial areas on the original stone surfaces. For a location-independent, standardized assessment of surface changes across the entire trial areas, a monitoring method based on opto-technical, non-contact and non-destructive testing methods has been developed. This method involves a three-dimensional measurement of the surface topography using structured-light scanning and the analysis of the surfaces in different light ranges using high-resolution VIS photography, UV-A fluorescence photography and reflected near-field IR photography. The paper will show the workflow of this methodology, including a detailed description of the equipment used, the data processing, and the advantages for monitoring highly valuable stone surfaces. Alongside the theoretical discussion, the results of two measuring campaigns on trial areas of the Nano-Cathedral project will be shown.

  16. Developing a standardized healthcare cost data warehouse.

    PubMed

    Visscher, Sue L; Naessens, James M; Yawn, Barbara P; Reinalda, Megan S; Anderson, Stephanie S; Borah, Bijan J

    2017-06-12

    Research addressing value in healthcare requires a measure of cost. While there are many sources and types of cost data, each has strengths and weaknesses. Many researchers appear to create study-specific cost datasets, but the explanations of their costing methodologies are not always clear, causing their results to be difficult to interpret. Our solution, described in this paper, was to use widely accepted costing methodologies to create a service-level, standardized healthcare cost data warehouse from an institutional perspective that includes all professional and hospital-billed services for our patients. The warehouse is based on a National Institutes of Health-funded research infrastructure containing the linked health records and medical care administrative data of two healthcare providers and their affiliated hospitals. Since all patients are identified in the data warehouse, their costs can be linked to other systems and databases, such as electronic health records, tumor registries, and disease or treatment registries. We describe the two institutions' administrative source data; the reference files, which include Medicare fee schedules and cost reports; the process of creating standardized costs; and the warehouse structure. The costing algorithm can create inflation-adjusted standardized costs at the service-line level for defined study cohorts on request. The resulting standardized costs contained in the data warehouse can be used to create detailed, bottom-up analyses of professional and facility costs of procedures, medical conditions, and patient care cycles without revealing business-sensitive information. After its creation, a standardized cost data warehouse is relatively easy to maintain and can be expanded to include data from other providers. Individual investigators who may not have sufficient knowledge of administrative data do not have to create their own standardized costs on a project-by-project basis because our data warehouse generates standardized costs for defined cohorts upon request.
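
    A schematic sketch of the costing step described above, under loudly hypothetical assumptions (column names, fee values, and deflators are all invented; the real algorithm is far more detailed): billed services are joined to a fee-schedule reference file and inflated to a common base year:

    ```python
    # Join service lines to standardized unit costs, then inflation-adjust.
    import pandas as pd

    services = pd.DataFrame({
        "hcpcs": ["99213", "71020"], "year": [2012, 2014], "units": [1, 2]})
    fee_schedule = pd.DataFrame({
        "hcpcs": ["99213", "71020"], "fee": [70.0, 30.0]})  # standardized cost
    deflator = {2012: 1.06, 2014: 1.02}                     # to a base year

    costed = services.merge(fee_schedule, on="hcpcs")
    costed["std_cost"] = costed.fee * costed.units * costed.year.map(deflator)
    print(costed)
    ```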

  17. Methodological aspects of clinical trials in tinnitus: A proposal for an international standard

    PubMed Central

    Landgrebe, Michael; Azevedo, Andréia; Baguley, David; Bauer, Carol; Cacace, Anthony; Coelho, Claudia; Dornhoffer, John; Figueiredo, Ricardo; Flor, Herta; Hajak, Goeran; van de Heyning, Paul; Hiller, Wolfgang; Khedr, Eman; Kleinjung, Tobias; Koller, Michael; Lainez, Jose Miguel; Londero, Alain; Martin, William H.; Mennemeier, Mark; Piccirillo, Jay; De Ridder, Dirk; Rupprecht, Rainer; Searchfield, Grant; Vanneste, Sven; Zeman, Florian; Langguth, Berthold

    2013-01-01

    Chronic tinnitus is a common condition with a high burden of disease. While many different treatments are used in clinical practice, the evidence for the efficacy of these treatments is low and the variance of treatment response between individuals is high. This is most likely due to the great heterogeneity of tinnitus with respect to clinical features as well as underlying pathophysiological mechanisms. There is a clear need to find effective treatment options in tinnitus, however, clinical trials differ substantially with respect to methodological quality and design. Consequently, the conclusions that can be derived from these studies are limited and jeopardize comparison between studies. Here, we discuss our view of the most important aspects of trial design in clinical studies in tinnitus and make suggestions for an international methodological standard in tinnitus trials. We hope that the proposed methodological standard will stimulate scientific discussion and will help to improve the quality of trials in tinnitus. PMID:22789414

  18. Modern proposal of methodology for retrieval of characteristic synthetic rainfall hyetographs

    NASA Astrophysics Data System (ADS)

    Licznar, Paweł; Burszta-Adamiak, Ewa; Łomotowski, Janusz; Stańczyk, Justyna

    2017-11-01

    The modern engineering practice of designing and modelling complex drainage systems is based on hydrodynamic modelling and has a probabilistic character. Its practical application requires a change in the rainfall models accepted as input. Previously used artificial rainfall models of simplified form, e.g. block precipitation or Euler type II design rainfall, are no longer sufficient. There is an urgent need to clarify the methodology for standardized rainfall hyetographs that takes into consideration the temporal dynamics of local storm rainfall. The aim of the paper is to present a proposal for an innovative methodology for determining standardized rainfall hyetographs, based on statistical processing of a collection of actual local precipitation records. The proposed methodology is based on the classification of standardized rainfall hyetographs with the use of cluster analysis. Its application is presented using the example of selected rain gauges located in Poland. The synthetic rainfall hyetographs obtained as the final result may be used for hydrodynamic modelling of sewerage systems, including probabilistic determination of the necessary capacity of retention reservoirs.
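
    A minimal sketch of the classification idea (synthetic events, not the Polish gauge data): each event is rescaled to a dimensionless cumulative mass curve and the curves are grouped with k-means:

    ```python
    # Standardize hyetographs to dimensionless cumulative curves, then cluster.
    import numpy as np
    from sklearn.cluster import KMeans

    def mass_curve(depths, n_points=10):
        c = np.cumsum(depths) / np.sum(depths)               # cumulative fraction
        x = np.linspace(0, 1, len(c))
        return np.interp(np.linspace(0, 1, n_points), x, c)  # common support

    rng = np.random.default_rng(0)
    events = [rng.gamma(2.0, 1.0, size=rng.integers(6, 30))  # synthetic events
              for _ in range(50)]
    X = np.vstack([mass_curve(e) for e in events])
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    print(np.bincount(labels))
    ```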

  19. A web based health technology assessment in tele-echocardiography: the experience within an Italian project.

    PubMed

    Giansanti, Daniele; Morelli, Sandra; Maccioni, Giovanni; Guerriero, Lorenzo; Bedini, Remo; Pepe, Gennaro; Colombo, Cesare; Borghi, Gabriella; Macellari, Velio

    2009-01-01

    Due to major advances in information technology, telemedicine applications are ready for widespread use. Nonetheless, to allow their diffusion in National Health Care Systems (NHCSs), specific methodologies of health technology assessment (HTA) should be used to assess standardization, overall quality, interoperability, and legal, economic and cost-benefit aspects. One of the limits to the diffusion of digital tele-echocardiography (T-E) applications in the NHCS is the lack of a specific methodology for their HTA. In the present study, a solution offering a structured HTA of T-E products was designed. The methodology also assured the definition of standardized quality levels for the application. The first level represents the minimum level of acceptance; the other levels are accessory levels useful for a more accurate assessment of the product. The methodology proved useful in rationalizing the process of standardization and received a high degree of acceptance from the subjects involved in the study.

  20. Methodological Issues in Meta-Analyzing Standard Deviations: Comment on Bond and DePaulo (2008)

    ERIC Educational Resources Information Center

    Pigott, Therese D.; Wu, Meng-Jia

    2008-01-01

    In this comment on C. F. Bond and B. M. DePaulo, the authors raise methodological concerns about the approach used to analyze the data. The authors suggest further refinement of the procedures used, and they compare the approach taken by Bond and DePaulo with standard methods for meta-analysis. (Contains 1 table and 2 figures.)

  1. Quality Measurement Recommendations Relevant to Clinical Guidelines in Germany and the United Kingdom: (What) Can We Learn From Each Other?

    PubMed Central

    Petzold, Thomas; Deckert, Stefanie; Williamson, Paula R.; Schmitt, Jochen

    2018-01-01

    We conducted a systematic review of clinical guidelines (CGs) to examine the methodological approaches of quality indicator derivation in CGs, the frequency of quality indicators to check CG recommendations in routine care, and clinimetric properties of quality indicators. We analyzed the publicly available CG databases of the Association of the Scientific Medical Societies in Germany (AWMF) and National Institute for Health and Care Excellence (NICE). Data on the methodology of subsequent quality indicator derivation, the content and definition of recommended quality indicators, and clinimetric properties of measurement instruments were extracted. In Germany, no explicit methodological guidance exists, but 3 different approaches are used. For NICE, a general approach is used for the derivation of quality indicators out of quality standards. Quality indicators were defined in 34 out of 87 CGs (39%) in Germany and for 58 out of 133 (43%) NICE CGs. Statements regarding measurement properties of instruments for quality indicator assessment were missing in German and NICE documents. Thirteen pairs of CGs (32%) have associated quality indicators. Thirty-four quality indicators refer to the same aspect of the quality of care, which corresponds to 27% of the German and 7% of NICE quality indicators. The development of a standardized and internationally accepted methodology for the derivation of quality indicators relevant to CGs is needed to measure and compare quality of care in health care systems. PMID:29591538

  2. Advanced Oil Recovery Technologies for Improved Recovery From Slope Basin Clastic Reservoirs, Nash Draw Brushy Canyon Pool, Eddy County, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mark B. Murphy

    The overall goal of this project is to demonstrate that an advanced development drilling and pressure maintenance program based on advanced reservoir management methods can significantly improve oil recovery. The plan included developing a control area using standard reservoir management techniques and comparing its performance to an area developed using advanced methods. A key goal is to transfer advanced methodologies to oil and gas producers in the Permian Basin and elsewhere, and throughout the US oil and gas industry.

  3. Placement-aware decomposition of a digital standard cells library for double patterning lithography

    NASA Astrophysics Data System (ADS)

    Wassal, Amr G.; Sharaf, Heba; Hammouda, Sherif

    2012-11-01

    To continue scaling circuit features down, Double Patterning (DP) technology is needed at 22nm technology nodes and below. DP requires decomposing the layout features into two masks for pitch relaxation, such that the spacing between any two features on each mask is greater than the minimum allowed mask spacing. The relaxed pitches of each mask are then processed in two separate exposure steps. In many cases, post-layout decomposition fails to decompose the layout into two masks due to the presence of conflicts. Post-layout decomposition of a standard cells block can result in native conflicts inside the cells (internal conflicts), or native conflicts on the boundary between two cells (boundary conflicts). Resolving native conflicts requires a redesign and/or multiple iterations of the placement and routing phases to get a clean decomposition. Therefore, DP compliance must be considered in earlier phases, before the final placed cell block is produced. The main focus of this paper is generating a library of decomposed standard cells to be used in a DP-aware placer. This library should contain all possible decompositions for each standard cell, i.e., decompositions that consider all possible combinations of boundary conditions. However, the large number of combinations of boundary conditions for each standard cell significantly increases the processing time and effort required to obtain all possible decompositions. Therefore, an efficient methodology is required to reduce this large number of combinations. In this paper, three different reduction methodologies are proposed to reduce the number of combinations processed to get the decomposed library. Experimental results show a significant reduction in the number of combinations and decompositions needed for the library processing. To generate and verify the proposed flow and methodologies, a prototype for a placement-aware DP-ready cell library was developed with an optimized number of cell views.
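
    A hedged sketch of the core feasibility test behind DP decomposition (toy conflict graph, not the paper's tooling): features closer than the minimum same-mask spacing become edges of a conflict graph, and decomposability is exactly 2-colorability; an odd cycle is a native conflict that no mask assignment can fix:

    ```python
    # Check whether a layout's conflict graph can be split into two masks.
    import networkx as nx

    conflicts = nx.Graph()
    conflicts.add_edges_from([("A", "B"), ("B", "C"), ("C", "A")])  # triangle

    if nx.is_bipartite(conflicts):
        masks = nx.bipartite.color(conflicts)  # 0/1 mask assignment per feature
        print("decomposable:", masks)
    else:
        print("native conflict: odd cycle present, redesign required")
    ```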

  4. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  5. An efficient auto TPT stitch guidance generation for optimized standard cell design

    NASA Astrophysics Data System (ADS)

    Samboju, Nagaraj C.; Choi, Soo-Han; Arikati, Srini; Cilingir, Erdem

    2015-03-01

    As the technology continues to shrink below 14nm, triple patterning lithography (TPT) is a worthwhile lithography methodology for printing dense layers such as Metal1. However, this increases the complexity of standard cell design, as it is very difficult to develop a TPT-compliant layout without compromising on area. Hence, it is important to have an accurate stitch generation methodology to meet the standard cell area requirement defined by the technology shrink factor. In this paper, we present an efficient automatic TPT stitch guidance generation technique for optimized standard cell design. The basic idea is to first identify the conflicting polygons based on the Fix Guidance [1] solution developed by Synopsys. Fix Guidance is a reduced sub-graph containing the minimum set of edges along with the connecting polygons; by eliminating these edges in a design, 3-color conflicts can be resolved. Once the conflicting polygons are identified using this method, they are categorized into four types [2] (Types 1 to 4). The categorization is based on the number of interactions a polygon has with the coloring links and the triangle loops of the fix guidance. For each type, a certain criterion for the keep-out region is defined, based on which the final stitch guidance locations are generated. This technique provides various possible stitch locations to the user and helps the user select the best stitch location considering both design flexibility (maximum pin access/small area) and process preferences. Based on this technique, a standard cell library for place and route (P&R) can be developed with colorless data and a stitch marker defined by the designer using our proposed method. After P&R, the full chip (block) contains the colorless data and standard cell stitch markers only. These stitch markers are considered "must be stitch" candidates. Hence, during full-chip decomposition it is not necessary to generate and select the stitch markers again for the complete data; therefore, the proposed method reduces the decomposition time significantly.
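
    For comparison with the double-patterning case above, a hedged toy illustration of the underlying TPT question (not the paper's algorithm): can the conflict graph be colored with three masks, and where does coloring fail, suggesting stitch candidates:

    ```python
    # Tiny backtracking 3-coloring of a conflict graph; an instance with no
    # valid 3-coloring (e.g. K4) is unresolvable without stitch insertion.
    import networkx as nx

    def three_color(g, order=None, assignment=None):
        order = order or list(g.nodes)
        assignment = assignment or {}
        if len(assignment) == len(order):
            return assignment
        node = order[len(assignment)]
        for color in range(3):
            if all(assignment.get(nb) != color for nb in g[node]):
                result = three_color(g, order, {**assignment, node: color})
                if result:
                    return result
        return None  # no consistent mask assignment found

    g = nx.complete_graph(4)  # K4 needs four colors -> TPT native conflict
    print(three_color(g))     # None -> consider a stitch to split a polygon
    ```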

  6. Beyond Born-Mayer: Improved models for short-range repulsion in ab initio force fields

    DOE PAGES

    Van Vleet, Mary J.; Misquitta, Alston J.; Stone, Anthony J.; ...

    2016-06-23

    Short-range repulsion within inter-molecular force fields is conventionally described by either Lennard-Jones or Born-Mayer forms. Despite their widespread use, these simple functional forms are often unable to describe the interaction energy accurately over a broad range of inter-molecular distances, thus creating challenges in the development of ab initio force fields and potentially leading to decreased accuracy and transferability. Herein, we derive a novel short-range functional form based on a simple Slater-like model of overlapping atomic densities and an iterated stockholder atom (ISA) partitioning of the molecular electron density. We demonstrate that this Slater-ISA methodology yields a more accurate, transferable, and robust description of the short-range interactions at minimal additional computational cost compared to standard Lennard-Jones or Born-Mayer approaches. Lastly, we show how this methodology can be adapted to yield the standard Born-Mayer functional form while still retaining many of the advantages of the Slater-ISA approach.
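
    For reference, the conventional forms named above, plus a Slater-like overlap form of the kind the abstract describes (the last expression is the textbook overlap result for identical Slater densities, quoted from memory; it should be checked against the paper itself):

    ```latex
    \begin{align}
      V_{\mathrm{LJ}}(r)     &= 4\varepsilon\left[\left(\frac{\sigma}{r}\right)^{12}
                                - \left(\frac{\sigma}{r}\right)^{6}\right] \\
      V_{\mathrm{BM}}(r)     &= A\,e^{-Br} \\
      V_{\mathrm{Slater}}(r) &\propto \left[\tfrac{1}{3}(Br)^{2} + Br + 1\right]e^{-Br}
    \end{align}
    ```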

  7. Studies on the asparagine-linked oligosaccharides from cartilage-specific proteoglycan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cioffi, L.C.

    1987-01-01

    Chondrocytes synthesize and secrete a cartilage-specific proteoglycan (PG-H) as one of their major products. This proteoglycan has several types of carbohydrate chains attached to it, including chondroitin sulfate, keratan sulfate, O-linked oligosaccharides, and asparagine-linked oligosaccharides. The asparagine-linked oligosaccharides found on PG-H were investigated in these studies. Methodology was developed for the isolation and separation of standard complex and high-mannose type oligosaccharides. This included digesting glycoproteins with N-glycanase and separating the oligosaccharides according to type by concanavalin A lectin chromatography. The different oligosaccharide types were then analyzed by high pressure liquid chromatography. This methodology was used in the subsequent studies on the PG-H asparagine-linked oligosaccharides. Initially, the asparagine-linked oligosaccharides recovered from the culture medium (CM) and cell-associated (Ma) fractions of PG-H from tibial chondrocytes were labeled with (³H)-mannose, and the oligosaccharides were isolated and analyzed.

  8. Estimating the Cost of Cancer Care in British Columbia and Ontario: A Canadian Inter-Provincial Comparison

    PubMed Central

    Pataky, Reka; Bremner, Karen E.; Rangrej, Jagadish; Chan, Kelvin K.W.; Cheung, Winson Y.; Hoch, Jeffrey S.; Peacock, Stuart; Krahn, Murray D.

    2017-01-01

    Background: Costing studies are useful to measure the economic burden of cancer. Comparing costs between healthcare systems can inform evaluation, development or modification of cancer care policies. Objectives: To estimate and compare cancer costs in British Columbia and Ontario from the payers' perspectives. Methods: Using linked cancer registry and administrative data, and standardized costing methodology and analyses, we estimated costs for 21 cancer sites by phase of care to determine potential differences between provinces. Results: Overall, costs were higher in Ontario. Costs were highest in the initial post-diagnosis and pre-death phases and lowest in the pre-diagnosis and continuing phases, and generally higher for brain cancer and multiple myeloma, and lower for melanoma. Hospitalization was the major cost category. Costs for physician services and diagnostic tests differed the most between provinces. Conclusions: The standardization of data and costing methodology is challenging, but it enables interprovincial and international comparative costing analyses. PMID:28277207

  9. [A preliminary mapping methodology for occupational hazards and biomechanical risk evaluation: presentation of a simple, computerized tool kit for ergonomic hazards identification and risk assessment].

    PubMed

    Colombini, Daniela; Occhipinti, E; Di Leone, G

    2011-01-01

    During the last Congress of the International Ergonomics Association (IEA), Beijing, August 2009, an international group was founded with the task of developing a "toolkit for MSD prevention" under the IEA and in collaboration with the World Health Organization. The possible users of the toolkits are: members of health and safety committees; health and safety representatives; line supervisors; foremen; workers; government representatives; health workers providing basic occupational health services; and occupational health and safety specialists. In accordance with the ISO 11228 standard series and the new Draft CD ISO 12259-2009 application document, which guides the potential user, our group developed a preliminary "mapping" methodology for occupational hazards in the craft industry, supported by software (Excel). The proposed methodology, using specific key entries and quick assessment criteria, allows simple ergonomic hazard identification and risk estimation. It is thus possible to decide for which occupational hazards a more exhaustive risk assessment is necessary and which occupational consultant should be involved (occupational physician, safety engineer, industrial hygienist, etc.).

  10. A comprehensive evaluation of tyrosol and hydroxytyrosol derivatives in extra virgin olive oil by microwave-assisted hydrolysis and HPLC-MS/MS.

    PubMed

    Bartella, Lucia; Mazzotti, Fabio; Napoli, Anna; Sindona, Giovanni; Di Donna, Leonardo

    2018-03-01

    A rapid and reliable method to assay the total amount of tyrosol and hydroxytyrosol derivatives in extra virgin olive oil has been developed. The methodology is intended to establish the nutritional quality of this edible oil, addressing recent international health claim legislation (European Commission Regulation No. 432/2012) and raising the classification of extra virgin olive oil to the status of a nutraceutical. The method is based on high-performance liquid chromatography coupled with tandem mass spectrometry and labeled internal standards, preceded by a fast hydrolysis step performed with the aid of microwaves under acid conditions. The overall process is particularly time saving, much shorter than any methodology previously reported. The developed approach combines rapidity and accuracy: recoveries near 100% were found on different fortified vegetable oils, while the RSD% values, calculated from repeatability and reproducibility experiments, were in all cases under 7%. Graphical abstract: Schematic of the methodology applied to the determination of tyrosol and hydroxytyrosol ester conjugates.
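
    A schematic sketch of quantitation against a labeled internal standard (peak areas and spike amount are hypothetical, and a unit response-factor ratio is assumed; a real assay would calibrate this ratio):

    ```python
    # Isotope-dilution style quantitation: analyte amount equals the response
    # ratio to the labeled standard times the amount of standard spiked.
    def quantify(area_analyte, area_labeled_std, std_spiked_ug):
        return (area_analyte / area_labeled_std) * std_spiked_ug

    # e.g., tyrosol peak vs. a labeled tyrosol internal standard:
    print(quantify(area_analyte=1.85e6, area_labeled_std=9.2e5,
                   std_spiked_ug=10.0))  # micrograms of analyte
    ```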

  11. Innovation design of medical equipment based on TRIZ.

    PubMed

    Gao, Changqing; Guo, Leiming; Gao, Fenglan; Yang, Bo

    2015-01-01

    Medical equipment is closely related to personal health and safety, which can be of concern to the equipment user. Furthermore, there is much competition among medical equipment manufacturers, and innovative design is the key to success for those enterprises. The design of medical equipment usually covers vastly different domains of knowledge, so the application of modern design methodology to medical equipment and technology invention is an urgent requirement. TRIZ (a Russian acronym that can be translated as 'theory of inventive problem solving') originated in Russia and contains problem-solving methods developed through worldwide patent analysis, including the Conflict Matrix, Substance-Field Analysis, Standard Solutions, Effects, etc. TRIZ is an inventive methodology for problem solving. As an engineering example, an infusion system is analyzed and redesigned using TRIZ; the innovative idea generated frees the caretaker from having to watch the infusion bag. The research in this paper shows the process of applying TRIZ to medical device invention and demonstrates that TRIZ is an inventive problem-solving methodology that can be used widely in medical device development.

  12. [Scientific, practical and educational aspects of clinical epidemiology].

    PubMed

    Briko, N I

    2012-01-01

    This article defines clinical epidemiology and describes its goal and objectives. The author claims that clinical epidemiology is a section of epidemiology which underlies the development of evidence-based standards for diagnostics, treatment and prevention and helps to select the appropriate algorithm for each clinical case. The study provides a comprehensive overview of the relationship between clinical epidemiology and evidence-based medicine. Epidemiological research is shown to be methodological basis of clinical epidemiology and evidence-based medicine with randomized controlled trials being the "gold standard" for obtaining reliable data. The key stages in the history of clinical epidemiology are discussed and further development of clinical epidemiology and the integration of courses on clinical epidemiology in education is outlined for progress in medical research and health care practice.

  13. Prediction of Regulation Reserve Requirements in California ISO Control Area based on BAAL Standard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel V.; Makarov, Yuri V.; Samaan, Nader A.

    This paper presents new methodologies developed at Pacific Northwest National Laboratory (PNNL) to estimate regulation capacity requirements in the California ISO control area. Two approaches have been developed: (1) an approach based on statistical analysis of actual historical area control error (ACE) and regulation data, and (2) an approach based on the balancing authority ACE limit (BAAL) control performance standard. The approaches predict regulation reserve requirements on a day-ahead basis, including upward and downward requirements, for each operating hour of a day. California ISO data have been used to test the performance of the proposed algorithms. Results show that the software tool allows saving up to 30% of the regulation procurement cost.
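
    A hedged sketch of approach (1) only (synthetic data; the PNNL tool is considerably more elaborate): hour-of-day regulation-up/down requirements taken as empirical percentiles of historical imbalance records:

    ```python
    # Estimate per-hour regulation requirements from historical imbalances.
    import numpy as np

    rng = np.random.default_rng(1)
    hours = np.tile(np.arange(24), 90)             # 90 days of hourly records
    imbalance_mw = rng.normal(0, 50, hours.size)   # synthetic ACE-like data

    for h in (0, 8, 18):
        hr = imbalance_mw[hours == h]
        up, down = np.percentile(hr, 97.5), np.percentile(hr, 2.5)
        print(f"hour {h:2d}: reg-up {up:6.1f} MW, reg-down {down:6.1f} MW")
    ```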

  14. Instruments to assess patients with rotator cuff pathology: a systematic review of measurement properties.

    PubMed

    Longo, Umile Giuseppe; Saris, Daniël; Poolman, Rudolf W; Berton, Alessandra; Denaro, Vincenzo

    2012-10-01

    The aims of this study were to obtain an overview of the methodological quality of studies on the measurement properties of rotator cuff questionnaires and to describe how well various aspects of the design and statistical analyses of studies on measurement properties are performed. A systematic review of published studies on the measurement properties of rotator cuff questionnaires was performed. Two investigators independently rated the quality of the studies using the Consensus-based Standards for the selection of health Measurement Instruments (COSMIN) checklist. This checklist was developed in an international Delphi consensus study. Sixteen studies were included, in which two measurement instruments were evaluated, namely the Western Ontario Rotator Cuff Index and the Rotator Cuff Quality-of-Life Measure. The methodological quality of the included studies was adequate for some properties (construct validity, reliability, responsiveness, internal consistency, and translation) but needs to be improved in other respects. The most important methodological aspects that need to be developed are: measurement error, content validity, structural validity, cross-cultural validity, criterion validity, and interpretability. Considering the importance of adequate measurement properties, it is concluded that, in the field of rotator cuff pathology, there is room for improvement in the methodological quality of studies of measurement properties. Level of evidence: II.

  15. The method of expected number of deaths, 1786-1886-1986.

    PubMed

    Keiding, N

    1987-04-01

    "The method of expected number of deaths is an integral part of standardization of vital rates, which is one of the oldest statistical techniques. The expected number of deaths was calculated in 18th century actuarial mathematics...but the method seems to have been forgotten, and was reinvented in connection with 19th century studies of geographical and occupational variations of mortality.... It is noted that standardization of rates is intimately connected to the study of relative mortality, and a short description of very recent developments in the methodology of that area is included." (SUMMARY IN FRE) excerpt

  16. Non-prescription medicines: a process for standards development and testing in community pharmacy.

    PubMed

    Benrimoj, Shalom Charlie I; Gilbert, Andrew; Quintrell, Neil; Neto, Abilio C de Almeida

    2007-08-01

    The objective of the study was to develop and test standards of practice for handling non-prescription medicines. In consultation with pharmacy registering authorities, key professional and consumer groups, and selected community pharmacists, standards of practice were developed in the areas of Resource Management; Professional Practice; Pharmacy Design and Environment; and Rights and Needs of Customers. These standards defined and described the minimum professional activities required in the provision of non-prescription medicines at a consistent and measurable level of practice. Seven standards were described and further defined by 20 criteria, including practice indicators. The standards were tested in 40 community pharmacies in two states and, after further adaptation, endorsed by all Australian pharmacy registering authorities and major Australian pharmacy and consumer organisations. The consultation process effectively engaged practicing pharmacists in developing standards that enable community pharmacists to meet their legislative and professional responsibilities. Community pharmacies were audited against the standards at baseline, mid-intervention and post-intervention, and the behavior of community pharmacists and their staff in relation to the standards was measured by conducting pseudo-patron visits to participating pharmacies. The testing process demonstrated a significant improvement in the quality of service delivered by staff in community pharmacies when managing requests involving non-prescription medicines. The use of pseudo-patron visits as a training tool with immediate feedback was an acceptable and effective method of achieving changes in practice, and feedback from pharmacy staff regarding the visits was very positive. The results demonstrated that the methodology employed was effective in increasing overall compliance with the standards from 47.4% to 70.0% (P < 0.01). This project led to a recommendation for the development and execution of a national implementation strategy.

  17. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    PubMed

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separated and undifferentiated waste. The methodology allows cost-efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. The benchmarking and variance analysis it enables can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently. The methodology can also be implemented by companies lacking a sophisticated management accounting system. Copyright © 2015 Elsevier Ltd. All rights reserved.
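
    A minimal sketch of the variance analysis that standard costing enables (all figures hypothetical): collection cost is budgeted as standard unit cost times actual quantity, and the gap to actual spending is flagged:

    ```python
    # Flexible-budget variance per waste stream: actual spend minus
    # (standard cost per tonne * actual tonnes collected).
    standard_cost_per_tonne = {"separated": 95.0, "undifferentiated": 60.0}
    actual_tonnes = {"separated": 1200, "undifferentiated": 3400}
    actual_spend = {"separated": 123_000, "undifferentiated": 198_000}

    for stream, std in standard_cost_per_tonne.items():
        flexible_budget = std * actual_tonnes[stream]
        variance = actual_spend[stream] - flexible_budget
        print(f"{stream}: budget {flexible_budget:,.0f}, variance {variance:+,.0f}")
    ```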

  18. MPHASYS: a mouse phenotype analysis system

    PubMed Central

    Calder, R Brent; Beems, Rudolf B; van Steeg, Harry; Mian, I Saira; Lohman, Paul HM; Vijg, Jan

    2007-01-01

    Background Systematic, high-throughput studies of mouse phenotypes have been hampered by the inability to analyze individual animal data from a multitude of sources in an integrated manner. Studies generally make comparisons at the level of genotype or treatment thereby excluding associations that may be subtle or involve compound phenotypes. Additionally, the lack of integrated, standardized ontologies and methodologies for data exchange has inhibited scientific collaboration and discovery. Results Here we introduce a Mouse Phenotype Analysis System (MPHASYS), a platform for integrating data generated by studies of mouse models of human biology and disease such as aging and cancer. This computational platform is designed to provide a standardized methodology for working with animal data; a framework for data entry, analysis and sharing; and ontologies and methodologies for ensuring accurate data capture. We describe the tools that currently comprise MPHASYS, primarily ones related to mouse pathology, and outline its use in a study of individual animal-specific patterns of multiple pathology in mice harboring a specific germline mutation in the DNA repair and transcription-specific gene Xpd. Conclusion MPHASYS is a system for analyzing multiple data types from individual animals. It provides a framework for developing data analysis applications, and tools for collecting and distributing high-quality data. The software is platform independent and freely available under an open-source license [1]. PMID:17553167

  19. An Interoperability Framework and Capability Profiling for Manufacturing Software

    NASA Astrophysics Data System (ADS)

    Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.

    ISO/TC184/SC5/WG4 is working on ISO 16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of a manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
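
    As a hedged illustration only (field names are invented, not the ISO 16100 schema), a capability profile can be thought of as a small data structure supporting a matching test between requirements and software units:

    ```python
    # Toy capability profile with a requirements-coverage check.
    from dataclasses import dataclass, field

    @dataclass
    class CapabilityProfile:
        unit_name: str
        manufacturing_functions: list = field(default_factory=list)
        attributes: dict = field(default_factory=dict)

        def supports(self, required_functions) -> bool:
            # Does this unit cover all requested functions?
            return set(required_functions) <= set(self.manufacturing_functions)

    scheduler = CapabilityProfile(
        unit_name="ShopFloorScheduler",
        manufacturing_functions=["scheduling", "dispatching"],
        attributes={"vendor": "ExampleCo", "version": "2.1"})
    print(scheduler.supports(["scheduling"]))  # True
    ```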

  20. INTEGRATING DATA ANALYTICS AND SIMULATION METHODS TO SUPPORT MANUFACTURING DECISION MAKING

    PubMed Central

    Kibira, Deogratias; Hatim, Qais; Kumara, Soundar; Shao, Guodong

    2017-01-01

    Modern manufacturing systems are installed with smart devices such as sensors that monitor system performance and collect data to manage uncertainties in their operations. However, multiple parameters and variables affect system performance, making it impossible for a human to make informed decisions without systematic methodologies and tools. Further, the large volume and variety of streaming data collected is beyond simulation analysis alone. Simulation models are run with well-prepared data. Novel approaches, combining different methods, are needed to use this data for making guided decisions. This paper proposes a methodology whereby parameters that most affect system performance are extracted from the data using data analytics methods. These parameters are used to develop scenarios for simulation inputs; system optimizations are performed on simulation data outputs. A case study of a machine shop demonstrates the proposed methodology. This paper also reviews candidate standards for data collection, simulation, and systems interfaces. PMID:28690363
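
    A hedged sketch of the parameter-screening step (synthetic data; the paper's pipeline is richer): fit a model to logged operational data, rank which inputs most affect a performance measure, and carry the top-ranked parameters into simulation scenarios:

    ```python
    # Rank candidate parameters by random-forest feature importance.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))                            # logged inputs
    y = 3 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=500)   # throughput proxy

    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
    for name, imp in zip(["speed", "buffer", "batch", "staff"],
                         model.feature_importances_):
        print(f"{name}: {imp:.2f}")
    ```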

  1. The maximum specific hydrogen-producing activity of anaerobic mixed cultures: definition and determination

    PubMed Central

    Mu, Yang; Yang, Hou-Yun; Wang, Ya-Zhou; He, Chuan-Shu; Zhao, Quan-Bao; Wang, Yi; Yu, Han-Qing

    2014-01-01

    Fermentative hydrogen production from wastes has many advantages compared to various chemical methods. A methodology for characterizing the hydrogen-producing activity of anaerobic mixed cultures is essential for monitoring reactor operation in fermentative hydrogen production; however, such standardized methodologies are lacking. In the present study, a new index, the maximum specific hydrogen-producing activity (SHAm) of anaerobic mixed cultures, was proposed, and a reliable and simple method, named the SHAm test, was developed to determine it. Furthermore, the influences of various parameters on SHAm determination in anaerobic mixed cultures were evaluated. Additionally, the SHAm assay was tested for different types of substrates and bacterial inocula. Our results demonstrate that this novel SHAm assay is a rapid, accurate and simple methodology for determining the hydrogen-producing activity of anaerobic mixed cultures. Application of this approach is thus beneficial for establishing a stable anaerobic hydrogen-producing system. PMID:24912488

  2. The maximum specific hydrogen-producing activity of anaerobic mixed cultures: definition and determination

    NASA Astrophysics Data System (ADS)

    Mu, Yang; Yang, Hou-Yun; Wang, Ya-Zhou; He, Chuan-Shu; Zhao, Quan-Bao; Wang, Yi; Yu, Han-Qing

    2014-06-01

    Fermentative hydrogen production from wastes has many advantages compared to various chemical methods. A methodology for characterizing the hydrogen-producing activity of anaerobic mixed cultures is essential for monitoring reactor operation in fermentative hydrogen production; however, such standardized methodologies are lacking. In the present study, a new index, the maximum specific hydrogen-producing activity (SHAm) of anaerobic mixed cultures, was proposed, and a reliable and simple method, named the SHAm test, was developed to determine it. Furthermore, the influences of various parameters on SHAm determination in anaerobic mixed cultures were evaluated. Additionally, the SHAm assay was tested for different types of substrates and bacterial inocula. Our results demonstrate that this novel SHAm assay is a rapid, accurate and simple methodology for determining the hydrogen-producing activity of anaerobic mixed cultures. Application of this approach is thus beneficial for establishing a stable anaerobic hydrogen-producing system.
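
    A hedged sketch of one common way to obtain such a maximum rate (not necessarily the authors' exact protocol; all values synthetic): fit the modified Gompertz model to a cumulative hydrogen curve, take the maximum production rate Rm, and normalize by biomass:

    ```python
    # Fit a modified Gompertz curve to cumulative H2 data and normalize the
    # maximum rate Rm by the biomass (VSS) in the test bottle.
    import numpy as np
    from scipy.optimize import curve_fit

    def gompertz(t, P, Rm, lam):
        return P * np.exp(-np.exp(Rm * np.e / P * (lam - t) + 1.0))

    t = np.linspace(0, 24, 25)  # hours
    h2 = gompertz(t, 180.0, 20.0, 3.0) \
         + np.random.default_rng(0).normal(0, 2, t.size)  # synthetic data

    (P, Rm, lam), _ = curve_fit(gompertz, t, h2, p0=[150, 10, 2])
    vss_g = 1.6  # hypothetical biomass in the bottle, g VSS
    print(f"SHAm ~ {Rm / vss_g:.1f} mL H2 / (g VSS * h)")
    ```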

  3. Standardized development of computer software. Part 1: Methods

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1976-01-01

    This work is a two-volume set on standards for modern software engineering methodology. This volume presents a tutorial and practical guide to the efficient development of reliable computer software, a unified and coordinated discipline for design, coding, testing, documentation, and project organization and management. The aim of the monograph is to provide formal disciplines for increasing the probability of securing software that is characterized by high degrees of initial correctness, readability, and maintainability, and to promote practices which aid in the consistent and orderly development of a total software system within schedule and budgetary constraints. These disciplines are set forth as a set of rules to be applied during software development to drastically reduce the time traditionally spent in debugging, to increase documentation quality, to foster understandability among those who must come in contact with it, and to facilitate operations and alterations of the program as requirements on the program environment change.

  4. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  5. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  6. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  7. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  8. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  9. Building a gold standard to construct search filters: a case study with biomarkers for oral cancer.

    PubMed

    Frazier, John J; Stein, Corey D; Tseytlin, Eugene; Bekhuis, Tanja

    2015-01-01

    To support clinical researchers, librarians and informationists may need search filters for particular tasks. Development of filters typically depends on a "gold standard" dataset. This paper describes generalizable methods for creating a gold standard to support future filter development and evaluation using oral squamous cell carcinoma (OSCC) as a case study. OSCC is the most common malignancy affecting the oral cavity. Investigation of biomarkers with potential prognostic utility is an active area of research in OSCC. The methods discussed here should be useful for designing quality search filters in similar domains. The authors searched MEDLINE for prognostic studies of OSCC, developed annotation guidelines for screeners, ran three calibration trials before annotating the remaining body of citations, and measured inter-annotator agreement (IAA). We retrieved 1,818 citations. After calibration, we screened the remaining citations (n = 1,767; 97.2%); IAA was substantial (kappa = 0.76). The dataset has 497 (27.3%) citations representing OSCC studies of potential prognostic biomarkers. The gold standard dataset is likely to be high quality and useful for future development and evaluation of filters for OSCC studies of potential prognostic biomarkers. The methodology we used is generalizable to other domains requiring a reference standard to evaluate the performance of search filters. A gold standard is essential because the labels regarding relevance enable computation of diagnostic metrics, such as sensitivity and specificity. Librarians and informationists with data analysis skills could contribute to developing gold standard datasets and subsequent filters tuned for their patrons' domains of interest.
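
    A small sketch of the agreement statistic reported above, with synthetic include/exclude labels (not the study's annotations):

    ```python
    # Cohen's kappa between two screeners' relevance judgments.
    from sklearn.metrics import cohen_kappa_score

    screener_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
    screener_b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
    print(cohen_kappa_score(screener_a, screener_b))
    ```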

  10. Dipolar recoupling in solid state NMR by phase alternating pulse sequences

    PubMed Central

    Lin, J.; Bayro, M.; Griffin, R. G.; Khaneja, N.

    2009-01-01

    We describe new methodological developments for making heteronuclear and homonuclear recoupling experiments in solid-state NMR insensitive to rf inhomogeneity by phase-alternating the irradiation on the spin system every rotor period. By incorporating delays of half rotor periods in the pulse sequences, these phase-alternating experiments can be made γ-encoded. The proposed methodology is conceptually different from the standard methods of making recoupling experiments robust through ramps and adiabatic pulses in the recoupling periods. We show how the concept of phase alternation can be incorporated in the design of homonuclear recoupling experiments that are insensitive to both chemical-shift dispersion and rf inhomogeneity. PMID:19157931

  11. Microbiota transplantation: concept, methodology and strategy for its modernization.

    PubMed

    Zhang, Faming; Cui, Bota; He, Xingxiang; Nie, Yuqiang; Wu, Kaichun; Fan, Daiming

    2018-05-01

    Fecal microbiota transplantation (FMT) has become a research focus of biomedicine and clinical medicine in recent years. The clinical response to FMT for different diseases has provided evidence for microbiota-host interactions associated with various disorders, including Clostridium difficile infection, inflammatory bowel disease, diabetes mellitus, cancer, liver cirrhosis, gut-brain disease and others. Discussing the experience of using microbes to treat human diseases, from ancient China to the current era, is important for moving standardized FMT forward and achieving a better future. Here, we review the changing concept of microbiota transplantation from FMT to selective microbiota transplantation, the methodological development of FMT, and the step-up FMT strategy, based on the literature and experts' perspectives.

  12. [Methodological aspects in the evaluation of turn-over and up/down sizing as indicators of work-related stress].

    PubMed

    Veronesi, G; Bertù, L; Mombelli, S; Cimmino, L; Caravati, G; Conti, M; Abate, T; Ferrario, M M

    2011-01-01

    We discuss the methodological aspects of evaluating turnover and up/down-sizing as indicators of work-related stress in complex organizations such as a university hospital. To estimate the active worker population, we developed an algorithm that integrates several administrative databases. The indicators were standardized to take into account potential confounders (age, sex, work seniority) when comparing different hospital structures and job categories. The main advantages of our method include flexibility in the choice of the level of analysis (hospital units, job categories, or a combination of both) and the ability to describe trends over time in order to measure the success of preventive strategies.

  13. The CMC/3DPNS computer program for prediction of three-dimension, subsonic, turbulent aerodynamic juncture region flow. Volume 2: Users' manual

    NASA Technical Reports Server (NTRS)

    Manhardt, P. D.

    1982-01-01

    The CMC fluid mechanics program system was developed to translate finite element numerical solution methodology, applied to nonlinear field problems, into a versatile computer code for comprehensive flow field analysis. Data procedures for the CMC three-dimensional Parabolic Navier-Stokes (PNS) algorithm are presented. General data procedures are described, along with a juncture corner flow standard test case data deck. A listing of the data deck and an explanation of the grid generation methodology are presented. All commands and variables available to the user are tabulated in alphabetical order, with cross-reference numbers that refer to storage addresses.

  14. Model-based testing with UML applied to a roaming algorithm for bluetooth devices.

    PubMed

    Dai, Zhen Ru; Grabowski, Jens; Neukirchen, Helmut; Pals, Holger

    2004-11-01

    In late 2001, the Object Management Group issued a Request for Proposal to develop a testing profile for UML 2.0. In June 2003, the work on the UML 2.0 Testing Profile was finally adopted by the OMG. Since March 2004, it has become an official standard of the OMG. The UML 2.0 Testing Profile provides support for UML based model-driven testing. This paper introduces a methodology on how to use the testing profile in order to modify and extend an existing UML design model for test issues. The application of the methodology will be explained by applying it to an existing UML Model for a Bluetooth device.

  15. Another HISA--the new standard: health informatics--service architecture.

    PubMed

    Klein, Gunnar O; Sottile, Pier Angelo; Endsleff, Frederik

    2007-01-01

    In addition to standing for the Health Informatics Society of Australia, HISA is the acronym used for the new European standard Health Informatics - Service Architecture. This standard, EN 12967, has been developed by CEN, the federation of 29 national standards bodies in Europe. It defines the essential elements of a service-oriented architecture and a methodology for localization that is particularly useful for large healthcare organizations. It is based on the Open Distributed Processing (ODP) framework of ISO 10746 and contains the following parts: Part 1: Enterprise viewpoint; Part 2: Information viewpoint; Part 3: Computational viewpoint. This standard is now also the starting point for consideration of an international standard in ISO/TC 215. The basic principles, in which a set of health-specific middleware services provides a common platform for various applications in regional health information systems or large integrated hospital information systems, are well established, following a previous prestandard. Examples of large-scale deployments in Sweden, Denmark and Italy are described.

  16. The Research Diagnostic Criteria for Temporomandibular Disorders. I: overview and methodology for assessment of validity.

    PubMed

    Schiffman, Eric L; Truelove, Edmond L; Ohrbach, Richard; Anderson, Gary C; John, Mike T; List, Thomas; Look, John O

    2010-01-01

    The purpose of the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Validation Project was to assess the diagnostic validity of this examination protocol. The aim of this article is to provide an overview of the project's methodology, descriptive statistics, and data for the study participant sample. This article also details the development of reliable methods to establish the reference standards for assessing criterion validity of the Axis I RDC/TMD diagnoses. The Axis I reference standards were based on the consensus of two criterion examiners independently performing a comprehensive history, clinical examination, and evaluation of imaging. Intersite reliability was assessed annually for criterion examiners and radiologists. Criterion examination reliability was also assessed within study sites. Study participant demographics were comparable to those of participants in previous studies using the RDC/TMD. Diagnostic agreement of the criterion examiners with each other and with the consensus-based reference standards was excellent, with all kappas ≥ 0.81, except for osteoarthrosis (moderate agreement, k = 0.53). Intrasite criterion examiner agreement with reference standards was excellent (k ≥ 0.95). Intersite reliability of the radiologists for detecting computed tomography-disclosed osteoarthrosis and magnetic resonance imaging-disclosed disc displacement was good to excellent (k = 0.71 and 0.84, respectively). The Validation Project study population was appropriate for assessing the reliability and validity of the RDC/TMD Axis I and II. The reference standards used to assess the validity of Axis I TMD were based on reliable and clinically credible methods.

  17. [Evidence-based Chinese medicine:theory and practice].

    PubMed

    Zhang, Jun-Hua; Li, You-Ping; Zhang, Bo-Li

    2018-01-01

    The introduction and popularization of evidence-based medicine has opened up a new research field for the clinical efficacy evaluation of traditional Chinese medicine (TCM), produced new research ideas and methods, and promoted the progress of TCM clinical research. After about 20 years of assiduous study and earnest practice, evidence-based evaluation methods and techniques that conform to the characteristics of TCM theory and practice have been developing continuously. Evidence-based Chinese medicine (EBCM) has gradually formed and become an important branch of evidence-based medicine. The basic concept of EBCM: it is an applied discipline that follows the theory and methodology of evidence-based medicine to collect, evaluate, produce, and transform evidence on the effectiveness, safety, and economy of TCM; to reveal the features and regular patterns of how TCM takes effect; and to guide the development of clinical guidelines, clinical pathways, and health decisions. The effects and achievements of EBCM development: secondary studies, mainly systematic reviews/meta-analyses, have been carried out extensively; clinical efficacy studies, mainly randomized controlled trials, have grown rapidly; clinical safety evaluations based on real-world studies have been conducted; methodological research, mainly focused on study quality control, has deepened gradually; internationalization research, mainly on reporting specifications, has achieved some breakthroughs; standardization research based on treatment specifications has been strengthened gradually; and interdisciplinary research teams and talents have steadily increased. A number of high-quality research findings have been published in well-known international journals; the clinical efficacy and safety evidence of TCM has increased; the level of rational clinical use of TCM has improved; and a large number of Chinese patent medicines with large markets have been cultivated. The future missions of EBCM fall into four categories (scientific research, methodology and standards, platform construction, and personnel training) comprising nine tasks: ① carry out systematic reviews to collect clinical trial reports of TCM and establish a database of TCM clinical evidence; ② carry out evidence-transformation research to lay the foundation for developing TCM clinical diagnosis and treatment guidelines and clinical pathways, for screening the essential drug and medical insurance lists, and for TCM-related policy-making; ③ conduct research to evaluate the advantages and effective regular patterns of TCM and form the evidence chain of TCM efficacy; ④ carry out research on the safety evaluation of TCM, providing evidence to support its rational and safe clinical use; ⑤ conduct research on EBCM methodology and provide methods for developing high-quality evidence; ⑥ carry out research to develop TCM standards and norms, forming methods, standards, specifications, and technical systems; ⑦ establish a data management platform for the evidence-based evaluation of TCM and promote data sharing; ⑧ build an international academic exchange platform to promote international cooperation and mutual recognition of EBCM research; and ⑨ carry out education and popularization of evidence-based evaluation methods, training undergraduates, graduate students, clinical healthcare providers, and TCM practitioners. The development of EBCM has not only promoted the transformation of TCM clinical research and decision-making modes and contributed to the modernization and internationalization of TCM, but has also enriched the connotation of evidence-based medicine. Copyright© by the Chinese Pharmaceutical Association.

  18. Assessment of the Validity of the Research Diagnostic Criteria for Temporomandibular Disorders: Overview and Methodology

    PubMed Central

    Schiffman, Eric L.; Truelove, Edmond L.; Ohrbach, Richard; Anderson, Gary C.; John, Mike T.; List, Thomas; Look, John O.

    2011-01-01

    AIMS The purpose of the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Validation Project was to assess the diagnostic validity of this examination protocol. An overview is presented, including Axis I and II methodology and descriptive statistics for the study participant sample. This paper details the development of reliable methods to establish the reference standards for assessing criterion validity of the Axis I RDC/TMD diagnoses. Validity testing for the Axis II biobehavioral instruments was based on previously validated reference standards. METHODS The Axis I reference standards were based on the consensus of 2 criterion examiners independently performing a comprehensive history, clinical examination, and evaluation of imaging. Intersite reliability was assessed annually for criterion examiners and radiologists. Criterion exam reliability was also assessed within study sites. RESULTS Study participant demographics were comparable to those of participants in previous studies using the RDC/TMD. Diagnostic agreement of the criterion examiners with each other and with the consensus-based reference standards was excellent with all kappas ≥ 0.81, except for osteoarthrosis (moderate agreement, k = 0.53). Intrasite criterion exam agreement with reference standards was excellent (k ≥ 0.95). Intersite reliability of the radiologists for detecting computed tomography-disclosed osteoarthrosis and magnetic resonance imaging-disclosed disc displacement was good to excellent (k = 0.71 and 0.84, respectively). CONCLUSION The Validation Project study population was appropriate for assessing the reliability and validity of the RDC/TMD Axis I and II. The reference standards used to assess the validity of Axis I TMD were based on reliable and clinically credible methods. PMID:20213028

  19. Technical Support Document: The Development of the Advanced Energy Design Guide for Highway Lodging Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Wei; Jarnagin, Ronald E.; Gowri, Krishnan

    2008-09-30

    This Technical Support Document (TSD) describes the process and methodology for development of the Advanced Energy Design Guide for Highway Lodgings (AEDG-HL or the Guide), a design guidance document intended to provide recommendations for achieving 30% energy savings in highway lodging properties over levels contained in ANSI/ASHRAE/IESNA Standard 90.1-1999, Energy Standard for Buildings Except Low-Rise Residential Buildings. The AEDG-HL is the fifth in a series of guides being developed by a partnership of organizations, including the American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc. (ASHRAE), the American Institute of Architects (AIA), the Illuminating Engineering Society of North America (IESNA), the United States Green Buildings Council (USGBC), and the U.S. Department of Energy (DOE).

  20. Evaluation of the international standardized 24-h dietary recall methodology (GloboDiet) for potential application in research and surveillance within African settings.

    PubMed

    Aglago, Elom Kouassivi; Landais, Edwige; Nicolas, Geneviève; Margetts, Barrie; Leclercq, Catherine; Allemand, Pauline; Aderibigbe, Olaide; Agueh, Victoire Damienne; Amuna, Paul; Annor, George Amponsah; El Ati, Jalila; Coates, Jennifer; Colaiezzi, Brooke; Compaore, Ella; Delisle, Hélène; Faber, Mieke; Fungo, Robert; Gouado, Inocent; El Hamdouchi, Asmaa; Hounkpatin, Waliou Amoussa; Konan, Amoin Georgette; Labzizi, Saloua; Ledo, James; Mahachi, Carol; Maruapula, Segametsi Ditshebo; Mathe, Nonsikelelo; Mbabazi, Muniirah; Mirembe, Mandy Wilja; Mizéhoun-Adissoda, Carmelle; Nzi, Clement Diby; Pisa, Pedro Terrence; El Rhazi, Karima; Zotor, Francis; Slimani, Nadia

    2017-06-19

    Collection of reliable and comparable individual food consumption data is of primary importance to better understand, control and monitor malnutrition and its related comorbidities in low- and middle-income countries (LMICs), including in Africa. The lack of standardised dietary tools and their related research support infrastructure remains a major obstacle to implementing concerted and region-specific research and action plans worldwide. Citing the magnitude and importance of this challenge, the International Agency for Research on Cancer (IARC/WHO) launched the "Global Nutrition Surveillance initiative" to pilot test the use of a standardized 24-h dietary recall research tool (GloboDiet), validated in Europe, in other regions. In this regard, the development of GloboDiet-Africa can be optimised by better understanding the local methodological needs, barriers and opportunities. The study aimed to evaluate the standardized 24-h dietary recall research tool (GloboDiet) as a possible common methodology for research and surveillance across Africa. A consultative panel of African and international experts in dietary assessment participated in six e-workshop sessions. They completed an in-depth e-questionnaire to evaluate the GloboDiet dietary methodology before and after participating in the e-workshop. The 29 experts expressed their satisfaction with the potential of the software to address local needs when evaluating the main structure of the software, the stepwise approach to data collection, and the standardisation concept. Nevertheless, additional information to better describe local foods and recipes, as well as particular culinary patterns (e.g. mortar pounding), was proposed. Furthermore, food quantification in shared-plate and shared-bowl eating situations and the interviewing of populations with low literacy skills, especially in rural settings, were acknowledged as requiring further specific consideration and appropriate solutions. An overall positive evaluation of the GloboDiet methodology by both African and international experts supports the flexibility and potential applicability of this tool in diverse African settings and sets a positive platform for improved dietary monitoring and surveillance. Following this evaluation, and as a prerequisite for future implementation and/or adaptation of GloboDiet in Africa, rigorous and robust capacity building and knowledge transfer will be required to roadmap a stepwise approach to implementing this methodology across pilot African countries/regions.

  1. Decision-problem state analysis methodology

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1980-01-01

    A methodology for analyzing a decision-problem state is presented. The methodology is based on the analysis of an incident in terms of the set of decision-problem conditions encountered. By decomposing the events that preceded an unwanted outcome, such as an accident, into the set of decision-problem conditions that were resolved, a more comprehensive understanding is possible. Not all human-error accidents are caused by faulty decision-problem resolutions, but this appears to be one of the major areas of accidents cited in the literature. A three-phase methodology is presented which accommodates a wide spectrum of events. It allows for a systems content analysis of the available data to establish: (1) the resolutions made, (2) alternatives not considered, (3) resolutions missed, and (4) possible conditions not considered. The product is a map of the decision-problem conditions that were encountered, as well as a projected, assumed set of conditions that should have been considered. The application of this methodology introduces a systematic approach to decomposing the events that transpired prior to the accident. The initial emphasis is on decision and problem resolution. The technique provides a standardized method for decomposing an accident into a scenario which may be used for review or for the development of a training simulation.

  2. Using the GLIMMIX Procedure in SAS 9.3 to Fit a Standard Dichotomous Rasch and Hierarchical 1-PL IRT Model

    ERIC Educational Resources Information Center

    Black, Ryan A.; Butler, Stephen F.

    2012-01-01

    Although Rasch models have been shown to be a sound methodological approach to develop and validate measures of psychological constructs for more than 50 years, they remain underutilized in psychology and other social sciences. Until recently, one reason for this underutilization was the lack of syntactically simple procedures to fit Rasch and…
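
    Since the abstract is cut off, a brief illustration may help: the dichotomous Rasch (1-PL) model the record refers to gives P(correct) = 1/(1 + exp(-(theta - b))) for person ability theta and item difficulty b. A minimal Python sketch (not SAS PROC GLIMMIX), with illustrative abilities, difficulties, and responses:

    ```python
    # Hedged sketch of the dichotomous Rasch (1-PL) model, in Python rather
    # than SAS PROC GLIMMIX. All values below are illustrative, not estimates.
    import numpy as np

    def rasch_prob(theta, b):
        """P(correct) for ability theta and item difficulty b."""
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    def log_likelihood(responses, theta, b):
        """responses: persons x items 0/1 matrix."""
        p = rasch_prob(theta[:, None], b[None, :])
        return np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))

    theta = np.array([-1.0, 0.0, 1.5])      # person abilities
    b = np.array([-0.5, 0.5])               # item difficulties
    X = np.array([[0, 0], [1, 0], [1, 1]])  # observed 0/1 responses
    print(rasch_prob(theta[:, None], b[None, :]))
    print(log_likelihood(X, theta, b))
    ```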

  3. Serial Scanning and Registration of High Resolution Quantitative Computed Tomography Volume Scans for the Determination of Local Bone Density Changes

    NASA Technical Reports Server (NTRS)

    Whalen, Robert T.; Napel, Sandy; Yan, Chye H.

    1996-01-01

    Progress in development of the methods required to study bone remodeling as a function of time is reported. The following topics are presented: 'A New Methodology for Registration Accuracy Evaluation', 'Registration of Serial Skeletal Images for Accurately Measuring Changes in Bone Density', and 'Precise and Accurate Gold Standard for Multimodality and Serial Registration Method Evaluations.'

  4. Web Content Accessibility Guidelines 2.0: A Further Step towards Accessible Digital Information

    ERIC Educational Resources Information Center

    Ribera, Mireia; Porras, Merce; Boldu, Marc; Termens, Miquel; Sule, Andreu; Paris, Pilar

    2009-01-01

    Purpose: The purpose of this paper is to explain the changes in the Web Content Accessibility Guidelines (WCAG) 2.0 compared with WCAG 1.0 within the context of its historical development. Design/methodology/approach: In order to compare WCAG 2.0 with WCAG 1.0 a diachronic analysis of the evolution of these standards is done. Known authors and…

  5. 75 FR 24757 - Order Making Fiscal Year 2011 Annual Adjustments to the Fee Rates Applicable Under Section 6(b...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-05

    ... Management and Budget ("OMB") to project aggregate offering price for purposes of the fiscal year 2010... methodology it developed in consultation with the CBO and OMB to project dollar volume for purposes of prior... AAMOP is given by exp(FLAAMOP_t + σ_n²/2), where σ_n denotes the standard error of the n...

  6. Chemiluminescent optical fiber immunosensor for the detection of anti-West Nile virus IgG.

    PubMed

    Herrmann, Sebastien; Leshem, Boaz; Landes, Shimi; Rager-Zisman, Bracha; Marks, Robert S

    2005-03-31

    An ELISA-based optical fiber methodology developed for the detection of anti-West Nile virus IgG antibodies in serum was compared to standard colorimetric and chemiluminescent ELISA based on microtiter plates. Colorimetric ELISA was the least sensitive, especially at high titer dilutions. The fiber-optic immunosensor based on the same ELISA immunological rationale was the most sensitive technique.

  7. Minimum Information about a Genotyping Experiment (MIGEN)

    PubMed Central

    Huang, Jie; Mirel, Daniel; Pugh, Elizabeth; Xing, Chao; Robinson, Peter N.; Pertsemlidis, Alexander; Ding, LiangHao; Kozlitina, Julia; Maher, Joseph; Rios, Jonathan; Story, Michael; Marthandan, Nishanth; Scheuermann, Richard H.

    2011-01-01

    Genotyping experiments are widely used in clinical and basic research laboratories to identify associations between genetic variations and normal/abnormal phenotypes. Genotyping assay techniques vary from single genomic regions interrogated using PCR reactions to high-throughput assays examining genome-wide sequence and structural variation. The resulting genotype data may include millions of markers for thousands of individuals, requiring various statistical, modeling or other data analysis methodologies to interpret the results. To date, there are no standards for reporting genotyping experiments. Here we present the Minimum Information about a Genotyping Experiment (MIGen) standard, defining the minimum information required for reporting genotyping experiments. The MIGen standard covers experimental design, subject description, genotyping procedure, quality control and data analysis. MIGen is a registered project under MIBBI (Minimum Information for Biological and Biomedical Investigations) and is being developed by an interdisciplinary group of experts in basic biomedical science, clinical science, biostatistics and bioinformatics. To accommodate the wide variety of techniques and methodologies applied in current and future genotyping experiments, MIGen leverages foundational concepts from the Ontology for Biomedical Investigations (OBI) for the description of the various types of planned processes, and it implements a hierarchical document structure. The adoption of MIGen by the research community will facilitate consistent genotyping data interpretation and independent data validation. MIGen can also serve as a framework for the development of data models for capturing and storing genotyping results and experiment metadata in a structured way, to facilitate the exchange of metadata. PMID:22180825
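
    The hierarchical document structure described above can be pictured as nested records. A much-abbreviated, hypothetical sketch; the field names are illustrative and do not reproduce the actual MIGen checklist:

    ```python
    # Hypothetical, much-abbreviated sketch of a hierarchical minimum-information
    # record in the spirit of MIGen. Field names are illustrative only and do
    # not reproduce the actual MIGen schema.
    migen_like_record = {
        "experimental_design": {"study_type": "case-control", "n_subjects": 2000},
        "subject_description": {"species": "Homo sapiens", "consent": "broad"},
        "genotyping_procedure": {"platform": "SNP array", "n_markers": 650000},
        "quality_control": {"call_rate_min": 0.98, "hwe_p_min": 1e-6},
        "data_analysis": {"association_test": "logistic regression"},
    }

    for section, fields in migen_like_record.items():
        print(section, "->", fields)
    ```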

  8. Methodological Review of Intimate Partner Violence Prevention Research

    ERIC Educational Resources Information Center

    Murray, Christine E.; Graybeal, Jennifer

    2007-01-01

    The authors present a methodological review of empirical program evaluation research in the area of intimate partner violence prevention. The authors adapted and utilized criterion-based rating forms to standardize the evaluation of the methodological strengths and weaknesses of each study. The findings indicate that the limited amount of…

  9. Improving Mathematics Performance among Secondary Students with EBD: A Methodological Review

    ERIC Educational Resources Information Center

    Mulcahy, Candace A.; Krezmien, Michael P.; Travers, Jason

    2016-01-01

    In this methodological review, the authors apply special education research quality indicators and standards for single case design to analyze mathematics intervention studies for secondary students with emotional and behavioral disorders (EBD). A systematic methodological review of literature from 1975 to December 2012 yielded 19 articles that…

  10. Methodology for a vaginal and urinary microbiome study in women with mixed urinary incontinence.

    PubMed

    Komesu, Yuko M; Richter, Holly E; Dinwiddie, Darrell L; Siddiqui, Nazema Y; Sung, Vivian W; Lukacz, Emily S; Ridgeway, Beri; Arya, Lily A; Zyczynski, Halina M; Rogers, Rebecca G; Gantz, Marie

    2017-05-01

    We describe the rationale and methods of a study designed to compare vaginal and urinary microbiomes in women with mixed urinary incontinence (MUI) and similarly aged, asymptomatic controls. This paper delineates the methodology of a supplementary microbiome study nested in an ongoing randomized controlled trial comparing a standardized perioperative behavioral/pelvic floor exercise intervention plus midurethral sling versus midurethral sling alone for MUI. Women in the parent study had at least "moderate bother" from urgency and stress urinary incontinence (SUI) symptoms on a validated questionnaire and confirmed MUI on a bladder diary. Controls had no incontinence symptoms. All participants underwent vaginal and urine collection for DNA analysis and conventional urine culture. Standardized protocols were designed, and a central lab received samples for subsequent polymerase chain reaction (PCR) amplification and sequencing of the bacterial 16S ribosomal RNA (rRNA) gene. The composition of bacterial communities will be determined by dual amplicon sequencing of variable regions 1-3 and 4-6 from vaginal and urine specimens to compare the microbiome of patients with that of controls. Sample-size estimates determined that 126 MUI and 84 control participants were sufficient to detect a 20% difference in predominant urinary genera, with 80% power and a 0.05 significance level. Specimen collection commenced January 2015 and finished April 2016. DNA was extracted and stored for subsequent evaluation. Methods papers sharing information regarding the development of genitourinary microbiome studies, particularly with control populations, are few. We describe the rigorous methodology developed for a novel urogenital microbiome study in women with MUI.
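
    The sample-size statement above corresponds to a standard two-proportion power calculation. A hedged sketch using statsmodels; the baseline proportions (0.50 vs. 0.30, i.e. a 20% absolute difference) are assumptions for illustration, since the abstract does not report the exact inputs:

    ```python
    # Hedged sketch of a two-proportion power calculation of the kind the
    # abstract summarizes. The baseline proportions are assumed, not reported.
    from statsmodels.stats.power import NormalIndPower
    from statsmodels.stats.proportion import proportion_effectsize

    effect = proportion_effectsize(0.50, 0.30)  # 20% absolute difference
    analysis = NormalIndPower()
    # Solve for the number of cases, with controls at an 84/126 = 2/3 ratio.
    n_cases = analysis.solve_power(effect_size=effect, alpha=0.05, power=0.80,
                                   ratio=84 / 126, alternative='two-sided')
    print(round(n_cases))  # cases needed; controls = ratio * n_cases
    ```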

  11. A methodology for TLD postal dosimetry audit of high-energy radiotherapy photon beams in non-reference conditions.

    PubMed

    Izewska, Joanna; Georg, Dietmar; Bera, Pranabes; Thwaites, David; Arib, Mehenna; Saravi, Margarita; Sergieva, Katia; Li, Kaibao; Yip, Fernando Garcia; Mahant, Ashok Kumar; Bulski, Wojciech

    2007-07-01

    A strategy for national TLD audit programmes has been developed by the International Atomic Energy Agency (IAEA). It involves progression through three sequential dosimetry audit steps. The first step audits the beam output in reference conditions for high-energy photon beams. The second step audits the dose in reference and non-reference conditions on the beam axis for photon and electron beams. The third step involves measurements of the dose in reference and non-reference conditions off-axis, for open and wedged, symmetric and asymmetric fields, for photon beams. Through a co-ordinated research project, the IAEA developed the methodology to extend the scope of national TLD auditing activities to more complex audit measurements for regular fields. Based on the IAEA standard TLD holder for high-energy photon beams, a TLD holder was developed with a horizontal arm to enable measurements 5 cm off the central axis. Basic correction factors were determined for the holder in the energy range between Co-60 and 25 MV photon beams. New procedures were developed for TLD irradiation in hospitals. The off-axis measurement methodology for photon beams was tested in a multi-national pilot study. The statistical distribution of the dosimetric parameters (off-axis ratios for open and wedge beam profiles, output factors, wedge transmission factors) checked in 146 measurements was 0.999 ± 0.012. The methodology of TLD audits in non-reference conditions with a modified IAEA TLD holder has been shown to be feasible.

  12. STRengthening analytical thinking for observational studies: the STRATOS initiative.

    PubMed

    Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James

    2014-12-30

    The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even 'standard' analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. © 2014 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd.

  13. Evaluation of a new approach to compute intervertebral disc height measurements from lateral radiographic views of the spine.

    PubMed

    Allaire, Brett T; DePaolis Kaluza, M Clara; Bruno, Alexander G; Samelson, Elizabeth J; Kiel, Douglas P; Anderson, Dennis E; Bouxsein, Mary L

    2017-01-01

    Current standard methods to quantify disc height, namely distortion compensated Roentgen analysis (DCRA), have been mostly utilized in the lumbar and cervical spine and have strict exclusion criteria. Specifically, discs adjacent to a vertebral fracture are excluded from measurement, thus limiting the use of DCRA in studies that include older populations with a high prevalence of vertebral fractures. Thus, we developed and tested a modified DCRA algorithm that does not depend on vertebral shape. Participants included 1186 men and women from the Framingham Heart Study Offspring and Third Generation Multidetector CT Study. Lateral CT scout images were used to place 6 morphometry points around each vertebra at 13 vertebral levels in each participant. Disc heights were calculated utilizing these morphometry points using DCRA methodology and our modified version of DCRA, which requires information from fewer morphometry points than the standard DCRA. Modified DCRA and standard DCRA measures of disc height are highly correlated, with concordance correlation coefficients above 0.999. Both measures demonstrate good inter- and intra-operator reproducibility. 13.9 % of available disc heights were not evaluable or excluded using the standard DCRA algorithm, while only 3.3 % of disc heights were not evaluable using our modified DCRA algorithm. Using our modified DCRA algorithm, it is not necessary to exclude vertebrae with fracture or other deformity from disc height measurements as in the standard DCRA. Modified DCRA also yields identical measurements to the standard DCRA. Thus, the use of modified DCRA for quantitative assessment of disc height will lead to less missing data without any loss of accuracy, making it a preferred alternative to the current standard methodology.
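
    Agreement between the modified and standard DCRA is reported as concordance correlation coefficients. Assuming Lin's CCC is the statistic meant, a minimal sketch with illustrative disc-height arrays:

    ```python
    # Minimal sketch of Lin's concordance correlation coefficient, the kind of
    # agreement statistic the abstract reports (values above 0.999) for
    # modified vs. standard DCRA disc heights. Input arrays are illustrative.
    import numpy as np

    def lins_ccc(x, y):
        """Lin's CCC between paired measurements x and y."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()            # population (ddof=0) variances
        cov = ((x - mx) * (y - my)).mean()   # population covariance
        return 2 * cov / (vx + vy + (mx - my) ** 2)

    standard = [8.1, 7.4, 9.0, 6.8, 7.9]   # disc heights, standard DCRA (mm)
    modified = [8.0, 7.5, 9.1, 6.8, 7.8]   # same discs, modified DCRA (mm)
    print(lins_ccc(standard, modified))
    ```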

  14. Development and validation of a bioassay to evaluate binding of adalimumab to cell membrane-anchored TNFα using flow cytometry detection.

    PubMed

    Camacho-Sandoval, Rosa; Sosa-Grande, Eréndira N; González-González, Edith; Tenorio-Calvo, Alejandra; López-Morales, Carlos A; Velasco-Velázquez, Marco; Pavón-Romero, Lenin; Pérez-Tapia, Sonia Mayra; Medina-Rivero, Emilio

    2018-06-05

    The physicochemical and structural properties of proteins used as active pharmaceutical ingredients of biopharmaceuticals are determinant for their biological activity. In this regard, the assays intended to evaluate the functionality of biopharmaceuticals provide confirmatory evidence that they possess the appropriate physicochemical properties and structural conformation. The validation of the methodologies used for the assessment of critical quality attributes of biopharmaceuticals is a key requirement for manufacturing under GMP environments. Herein we present the development and validation of a flow cytometry-based methodology for the evaluation of adalimumab's affinity towards membrane-bound TNFα (mTNFα) on recombinant CHO cells. This in vitro methodology measures the interaction between an in-solution antibody and its target molecule on the cell surface through a fluorescent signal. The characteristics evaluated during the validation exercise showed that this methodology is suitable for its intended purpose. The assay was demonstrated to be accurate (r² = 0.92, slope = 1.20), precise (%CV ≤ 18.31) and specific (curve fitting, r² = 0.986-0.997) for evaluating the binding of adalimumab to mTNFα. The results obtained here provide evidence that detection by flow cytometry is a viable alternative for bioassays used in the pharmaceutical industry. In addition, this methodology could be standardized for the evaluation of other biomolecules acting through the same mechanism of action. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
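
    The abstract cites curve fitting for specificity without naming the model; the four-parameter logistic (4PL) is the usual choice for bioassay dose-response data, so the sketch below assumes it. Data are synthetic:

    ```python
    # Hedged sketch of a four-parameter logistic (4PL) dose-response fit, a
    # standard bioassay model; 4PL is an assumption here, since the abstract
    # does not name the fitted model. Data are synthetic.
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(x, bottom, top, ec50, hill):
        """Increasing 4PL: bottom at low dose, top at high dose."""
        return bottom + (top - bottom) / (1 + (x / ec50) ** (-hill))

    conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100])           # antibody, ug/mL
    mfi = np.array([120, 180, 400, 900, 1500, 1800, 1900])   # fluorescence
    popt, _ = curve_fit(four_pl, conc, mfi, p0=[100, 2000, 5, 1])
    print(dict(zip(['bottom', 'top', 'ec50', 'hill'], popt)))
    ```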

  15. Acceleration-based methodology to assess the blast mitigation performance of explosive ordnance disposal helmets

    NASA Astrophysics Data System (ADS)

    Dionne, J. P.; Levine, J.; Makris, A.

    2018-01-01

    To design the next generation of blast mitigation helmets that offer increasing levels of protection against explosive devices, manufacturers must be able to rely on appropriate test methodologies and human surrogates that will differentiate the performance level of various helmet solutions and ensure user safety. Ideally, such test methodologies and associated injury thresholds should be based on widely accepted injury criteria relevant within the context of blast. Unfortunately, even though significant research has taken place over the last decade in the area of blast neurotrauma, there currently exists no agreement in terms of injury mechanisms for blast-induced traumatic brain injury. In absence of such widely accepted test methods and injury criteria, the current study presents a specific blast test methodology focusing on explosive ordnance disposal protective equipment, involving the readily available Hybrid III mannequin, initially developed for the automotive industry. The unlikely applicability of the associated brain injury criteria (based on both linear and rotational head acceleration) is discussed in the context of blast. Test results encompassing a large number of blast configurations and personal protective equipment are presented, emphasizing the possibility to develop useful correlations between blast parameters, such as the scaled distance, and mannequin engineering measurements (head acceleration). Suggestions are put forward for a practical standardized blast testing methodology taking into account limitations in the applicability of acceleration-based injury criteria as well as the inherent variability in blast testing results.
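
    The scaled distance mentioned above is conventionally the Hopkinson-Cranz scaled distance, Z = R / W^(1/3). A one-line sketch with illustrative values:

    ```python
    # The "scaled distance" referred to above is conventionally the
    # Hopkinson-Cranz scaled distance, Z = R / W**(1/3), with standoff R in
    # meters and charge mass W in kg TNT equivalent. Values are illustrative.
    def scaled_distance(standoff_m, charge_kg_tnt):
        """Hopkinson-Cranz scaled distance Z (m/kg^(1/3))."""
        return standoff_m / charge_kg_tnt ** (1.0 / 3.0)

    print(scaled_distance(2.0, 5.0))  # 2 m from 5 kg TNT -> Z ~ 1.17
    ```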

  16. Practical no-gold-standard evaluation framework for quantitative imaging methods: application to lesion segmentation in positron emission tomography

    PubMed Central

    Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.

    2017-01-01

    Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
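
    The bootstrap-based step can be illustrated generically: resample lesions with replacement and recompute the figure of merit (FoM). In this sketch the data are synthetic and `compute_fom` merely stands in for the NGS estimation step, which is not reproduced here:

    ```python
    # Generic bootstrap-over-cases sketch of the patient-sampling-uncertainty
    # idea described above. `compute_fom` is a stand-in for the NGS estimator.
    import numpy as np

    rng = np.random.default_rng(0)

    def bootstrap_fom(measurements, compute_fom, n_boot=1000):
        """Return the FoM's bootstrap mean and 95% percentile interval."""
        n = len(measurements)
        foms = []
        for _ in range(n_boot):
            sample = measurements[rng.integers(0, n, size=n)]  # resample
            foms.append(compute_fom(sample))
        foms = np.sort(foms)
        lo, hi = foms[int(0.025 * n_boot)], foms[int(0.975 * n_boot)]
        return np.mean(foms), (lo, hi)

    volumes = rng.normal(20.0, 5.0, size=120)   # illustrative lesion volumes
    print(bootstrap_fom(volumes, np.std))       # precision-like FoM example
    ```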

  17. Development of a Standardized Methodology for the Use of COSI-Corr Sub-Pixel Image Correlation to Determine Surface Deformation Patterns in Large Magnitude Earthquakes.

    NASA Astrophysics Data System (ADS)

    Milliner, C. W. D.; Dolan, J. F.; Hollingsworth, J.; Leprince, S.; Ayoub, F.

    2014-12-01

    Coseismic surface deformation is typically measured in the field by geologists and with a range of geophysical methods such as InSAR, LiDAR and GPS. Current methods, however, either fail to capture the near-field coseismic surface deformation pattern, where vital information is needed, or lack pre-event data. We develop a standardized and reproducible methodology to fully constrain the near-field coseismic surface deformation pattern in high resolution using aerial photography. We apply our methodology, using the program COSI-corr, to cross-correlate pairs of aerial optical imagery from before and after the 1992 Mw 7.3 Landers and 1999 Mw 7.1 Hector Mine earthquakes. This technique allows measurement of the coseismic slip distribution and of the magnitude and width of off-fault deformation with sub-pixel precision, and it can be applied cost-effectively to recent and historic earthquakes using archive aerial imagery. We also use synthetic tests to constrain and correct for the bias imposed on the result by the use of a sliding window during correlation. Correcting for artificial smearing of the tectonic signal allows us to robustly measure the fault zone width along a surface rupture. Furthermore, the synthetic tests have constrained, for the first time, the measurement precision and accuracy of estimated fault displacements and fault-zone widths. Our methodology provides the unique ability to robustly understand the kinematics of surface faulting while accounting for both off-fault deformation and the measurement biases that typically complicate such data. For both earthquakes we find that our displacement measurements derived from cross-correlation are systematically larger than the field displacement measurements, indicating the presence of off-fault deformation. We show that the Landers and Hector Mine earthquakes accommodated 46% and 38% of displacement away from the primary rupture as off-fault deformation, over mean deformation widths of 183 m and 133 m, respectively. We envisage that correlation results derived from our methodology will provide vital data on near-field deformation patterns and will be of significant use in constraining inversion solutions for fault slip at depth.
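
    The core operation, sub-pixel cross-correlation of pre- and post-event image patches, can be sketched with scikit-image's phase correlation standing in for COSI-corr. The images below are synthetic and the offset is simulated:

    ```python
    # Hedged sketch of sub-pixel image correlation in the spirit of the
    # workflow above, using scikit-image's phase correlation rather than
    # COSI-corr itself. Images are synthetic; the offset is simulated.
    import numpy as np
    from scipy.ndimage import shift as nd_shift
    from skimage.registration import phase_cross_correlation

    rng = np.random.default_rng(1)
    pre = rng.random((128, 128))                  # pre-event image patch
    post = nd_shift(pre, (0.6, -1.3), order=3)    # simulated coseismic offset

    # upsample_factor=100 resolves displacements to 1/100 of a pixel.
    shift, error, _ = phase_cross_correlation(pre, post, upsample_factor=100)
    print(shift)  # shift registering post to pre, ~(-0.6, 1.3) pixels
    ```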

  18. A standardized kit for automated quantitative assessment of candidate protein biomarkers in human plasma.

    PubMed

    Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H

    2015-12-01

    An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.
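
    At its core, MRM-with-SIS quantification is relative response: the endogenous ("light") concentration is the light/heavy peak-area ratio times the known spiked SIS concentration. A minimal sketch of that arithmetic, assuming a 1:1 response factor and illustrative values (real assays use calibration curves):

    ```python
    # Minimal sketch of the core MRM-with-SIS arithmetic: endogenous ("light")
    # peptide concentration = (light/heavy peak-area ratio) x known spiked SIS
    # ("heavy") concentration. A 1:1 response factor is assumed; values are
    # illustrative, and real assays use calibration curves.
    def endogenous_conc(light_area, heavy_area, sis_conc_fmol_ul):
        """Relative-response quantification with a 1:1 response assumption."""
        return (light_area / heavy_area) * sis_conc_fmol_ul

    # Example: light peak area 4.2e5, heavy 2.1e5, SIS spiked at 10 fmol/uL:
    print(endogenous_conc(4.2e5, 2.1e5, 10.0))  # -> 20.0 fmol/uL
    ```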

  19. The case for applying tissue engineering methodologies to instruct human organoid morphogenesis.

    PubMed

    Marti-Figueroa, Carlos R; Ashton, Randolph S

    2017-05-01

    Three-dimensional organoids derived from human pluripotent stem cell (hPSC) derivatives have become widely used in vitro models for studying development and disease. Their ability to recapitulate facets of normal human development during in vitro morphogenesis produces tissue structures with unprecedented biomimicry. Current organoid derivation protocols rely primarily on spontaneous morphogenesis occurring within 3-D spherical cell aggregates, with minimal to no exogenous control. This yields organoids containing microscale regions of biomimetic tissues, but at the macroscale (i.e., hundreds of microns to millimeters) the organoids' morphology, cytoarchitecture, and cellular composition are non-biomimetic and variable. The current lack of control over in vitro organoid morphogenesis at the microscale induces aberrations at the macroscale, which impedes realization of the technology's potential to reproducibly form anatomically correct human tissue units that could serve as optimal human in vitro models and even transplants. Here, we review tissue engineering methodologies that could be used to develop powerful approaches for instructing multiscale, 3-D human organoid morphogenesis. Such technological mergers are critically needed to harness organoid morphogenesis as a tool for engineering functional human tissues with biomimetic anatomy and physiology. Human PSC-derived 3-D organoids are revolutionizing the biomedical sciences. They enable the study of development and disease within patient-specific genetic backgrounds and unprecedented biomimetic tissue microenvironments. However, their uncontrolled, spontaneous morphogenesis at the microscale yields inconsistencies in macroscale organoid morphology, cytoarchitecture, and cellular composition that limit their standardization and application. Integration of tissue engineering methods with organoid derivation protocols could allow us to harness their potential by instructing standardized in vitro morphogenesis to generate organoids with biomimicry at all scales. Such advancements would enable the use of organoids as a basis for 'next-generation' tissue engineering of functional, anatomically mimetic human tissues and potentially novel organ transplants. Here, we discuss critical aspects of organoid morphogenesis where application of innovative tissue engineering methodologies would yield significant advancement towards this goal. Copyright © 2017. Published by Elsevier Ltd.

  20. Evidence based herbal drug standardization approach in coping with challenges of holistic management of diabetes: a dreadful lifestyle disorder of 21st century

    PubMed Central

    2013-01-01

    By virtue of containing multiple constituents, developed during growth under various environmental stresses, plants provide a plethora of chemical families with medicinal utility. Researchers are exploring this wealth and trying to decode its utility for enhancing human health standards. Diabetes is a dreadful lifestyle disorder of the 21st century, caused by a lack of insulin production or physiological unresponsiveness to insulin. The chronic impact of untreated diabetes significantly affects vital organs. Allopathic medicine offers five classes of drugs, or otherwise insulin in Type I diabetes, targeting insulin secretion, decreasing the effect of glucagon, sensitizing receptors for enhanced glucose uptake, etc. In addition, diet management, increased dietary fiber intake, resistant starch intake, and routine exercise aid in managing this dangerous metabolic disorder. One of the key factors limiting the commercial utility of herbal drugs is standardization. Standardization poses numerous challenges related to marker identification, active principle(s), lack of defined regulations, non-availability of universally acceptable technical standards for testing, and implementation of quality control/safety standards (toxicological testing). The present study proposes an integrated herbal drug development and standardization model which is an amalgamation of the classical approach of Ayurvedic therapeutics; a reverse pharmacological approach based on observational therapeutics; technical standards for the complete product cycle; chemi-informatics; herbal qualitative structure-activity relationship and pharmacophore modeling; and post-launch market analysis. Further studies are warranted to ensure that an effective herbal drug standardization methodology is developed, backed by a regulatory standard, to guide future research endeavors in a more focused manner. PMID:23822656

  1. Evidence based herbal drug standardization approach in coping with challenges of holistic management of diabetes: a dreadful lifestyle disorder of 21st century.

    PubMed

    Chawla, Raman; Thakur, Pallavi; Chowdhry, Ayush; Jaiswal, Sarita; Sharma, Anamika; Goel, Rajeev; Sharma, Jyoti; Priyadarshi, Smruti Sagar; Kumar, Vinod; Sharma, Rakesh Kumar; Arora, Rajesh

    2013-07-04

    By virtue of containing multiple constituents, developed during growth under various environmental stresses, plants provide a plethora of chemical families with medicinal utility. Researchers are exploring this wealth and trying to decode its utility for enhancing human health standards. Diabetes is a dreadful lifestyle disorder of the 21st century, caused by a lack of insulin production or physiological unresponsiveness to insulin. The chronic impact of untreated diabetes significantly affects vital organs. Allopathic medicine offers five classes of drugs, or otherwise insulin in Type I diabetes, targeting insulin secretion, decreasing the effect of glucagon, sensitizing receptors for enhanced glucose uptake, etc. In addition, diet management, increased dietary fiber intake, resistant starch intake, and routine exercise aid in managing this dangerous metabolic disorder. One of the key factors limiting the commercial utility of herbal drugs is standardization. Standardization poses numerous challenges related to marker identification, active principle(s), lack of defined regulations, non-availability of universally acceptable technical standards for testing, and implementation of quality control/safety standards (toxicological testing). The present study proposes an integrated herbal drug development and standardization model which is an amalgamation of the classical approach of Ayurvedic therapeutics; a reverse pharmacological approach based on observational therapeutics; technical standards for the complete product cycle; chemi-informatics; herbal qualitative structure-activity relationship and pharmacophore modeling; and post-launch market analysis. Further studies are warranted to ensure that an effective herbal drug standardization methodology is developed, backed by a regulatory standard, to guide future research endeavors in a more focused manner.

  2. Classification of samples into two or more ordered populations with application to a cancer trial.

    PubMed

    Conde, D; Fernández, M A; Rueda, C; Salvador, B

    2012-12-10

    In many applications, especially in cancer treatment and diagnosis, investigators are interested in classifying patients into various diagnosis groups on the basis of molecular data such as gene expression or proteomic data. Often, some of the diagnosis groups are known to be related to higher or lower values of some of the predictors. The standard methods of classifying patients into various groups do not take into account the underlying order. This could potentially result in high misclassification rates, especially when the number of groups is larger than two. In this article, we develop classification procedures that exploit the underlying order among the mean values of the predictor variables and the diagnostic groups by using ideas from order-restricted inference. We generalize the existing methodology on discrimination under restrictions and provide empirical evidence to demonstrate that the proposed methodology improves over the existing unrestricted methodology. The proposed methodology is applied to a bladder cancer data set where the researchers are interested in classifying patients into various groups. Copyright © 2012 John Wiley & Sons, Ltd.
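
    The order-restriction idea can be illustrated by projecting group means onto a monotone ordering (via isotonic regression) before nearest-mean classification. This is a hedged sketch with synthetic data, not the authors' exact estimator:

    ```python
    # Hedged sketch of order-restricted classification: group means of a
    # predictor are projected onto a non-decreasing order across ordered
    # diagnosis groups before nearest-mean assignment. Illustration only.
    import numpy as np
    from sklearn.isotonic import IsotonicRegression

    def restricted_means(values, groups, n_groups):
        """Group means projected onto a non-decreasing order restriction."""
        raw = np.array([values[groups == g].mean() for g in range(n_groups)])
        weights = np.array([(groups == g).sum() for g in range(n_groups)])
        iso = IsotonicRegression(increasing=True)
        return iso.fit_transform(np.arange(n_groups), raw, sample_weight=weights)

    def classify(x, means):
        """Assign x to the group with the nearest restricted mean."""
        return int(np.argmin(np.abs(means - x)))

    rng = np.random.default_rng(2)
    groups = np.repeat([0, 1, 2], 30)  # ordered diagnosis groups
    # Raw means 1.0, 2.0, 1.8 violate the known order, so the projection
    # pools groups 1 and 2 into a monotone sequence.
    values = np.concatenate([rng.normal(m, 0.5, 30) for m in (1.0, 2.0, 1.8)])
    means = restricted_means(values, groups, 3)
    print(means, classify(1.6, means))
    ```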

  3. The use of geospatial web services for exchanging utilities data

    NASA Astrophysics Data System (ADS)

    Kuczyńska, Joanna

    2013-04-01

    Geographic information technologies and related geo-information systems currently play an important role in the management of public administration in Poland. One of these tasks is to maintain and update the Geodetic Evidence of Public Utilities (GESUT), part of the National Geodetic and Cartographic Resource, which contains information on technical infrastructure that is important to many institutions. This requires an active exchange of data between the Geodesy and Cartography Documentation Centers and the institutions which administer transmission lines. The administrator of public utilities is legally obliged to provide information about utilities to GESUT. The aim of the research work was to develop a universal data exchange methodology which can be implemented on a variety of hardware and software platforms. This methodology uses the Unified Modeling Language (UML), eXtensible Markup Language (XML), and Geography Markup Language (GML). The proposed methodology is based on two different strategies: Model Driven Architecture (MDA) and Service Oriented Architecture (SOA). The solutions used are consistent with the INSPIRE Directive and the ISO 19100 series of standards for geographic information. On the basis of an analysis of the input data structures, conceptual models were built for both databases. The models were written in the universal modeling language UML. A combined model defining a common data structure was also built. This model was transformed into the GML standard, developed for the exchange of geographic information. The structure of the document describing the data that may be exchanged is defined in an .xsd file. Network services were selected and implemented in a data exchange system based on open source tools. The methodology was implemented and tested. Data in the agreed data structure, together with metadata, were set up on the server. Data access was provided by geospatial network services: data discovery via Catalogue Service for the Web (CSW) and data collection via Web Feature Service (WFS). WFS also provides operations for modifying data, for example allowing the utility administrator to update them. The proposed solution significantly increases the efficiency of data exchange and facilitates maintenance of the National Geodetic and Cartographic Resource.
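
    Data collection through WFS, as described above, can be sketched with the OWSLib client. The service URL and layer name below are placeholders, not the actual GESUT endpoint:

    ```python
    # Hedged sketch of collecting features from a WFS endpoint with OWSLib.
    # The URL and layer name are placeholders, not the GESUT service.
    from owslib.wfs import WebFeatureService

    wfs = WebFeatureService(url='https://example.org/geoserver/wfs',
                            version='1.1.0')
    print(list(wfs.contents))                 # layers offered by the service
    response = wfs.getfeature(typename=['utilities:power_lines'],
                              maxfeatures=10)
    with open('power_lines.gml', 'wb') as f:  # GML, as in the methodology
        f.write(response.read())
    ```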

  4. Protocol for the Solid-phase Synthesis of Oligomers of RNA Containing a 2'-O-thiophenylmethyl Modification and Characterization via Circular Dichroism.

    PubMed

    Francis, Andrew J; Resendiz, Marino J E

    2017-07-28

    Solid-phase synthesis has been used to obtain canonical and modified polymers of nucleic acids, specifically of DNA or RNA, which has made it a popular methodology for applications in various fields and for different research purposes. The procedure described herein focuses on the synthesis, purification, and characterization of dodecamers of RNA 5'-[CUA CGG AAU CAU]-3' containing zero, one, or two modifications located at the C2'-O-position. The probes are based on 2-thiophenylmethyl groups, incorporated into RNA nucleotides via standard organic synthesis and introduced into the corresponding oligonucleotides via their respective phosphoramidites. This report makes use of phosphoramidite chemistry with the four canonical nucleobases (Uridine (U), Cytosine (C), Guanosine (G), Adenosine (A)), as well as 2-thiophenylmethyl-functionalized nucleotides modified at the 2'-O-position; however, the methodology is amenable to the large variety of modifications that have been developed over the years. The oligonucleotides were synthesized on a controlled-pore glass (CPG) support, followed by cleavage from the resin and deprotection under standard conditions, i.e., a mixture of ammonia and methylamine (AMA) followed by hydrogen fluoride/triethylamine/N-methylpyrrolidinone. The corresponding oligonucleotides were purified via polyacrylamide gel electrophoresis (20% denaturing), followed by elution, desalting, and isolation via reversed-phase chromatography (Sep-pak, C18 column). Quantification and structural parameters were assessed via ultraviolet-visible (UV-vis) and circular dichroism (CD) photometric analysis, respectively. This report aims to serve as a resource and guide for beginner and expert researchers interested in embarking on work in this field. It is expected to serve as a work-in-progress as new technologies and methodologies are developed. The description of the methodologies and techniques within this document corresponds to a DNA/RNA synthesizer (refurbished and purchased in 2013) that uses phosphoramidite chemistry.

  5. COPRED: prediction of fold, GO molecular function and functional residues at the domain level.

    PubMed

    López, Daniel; Pazos, Florencio

    2013-07-15

    Only recently have the first resources devoted to the functional annotation of proteins at the domain level begun to appear. The next step is to develop specific methodologies for predicting function at the domain level based on these resources, and to implement them in web servers to be used by the community. In this work, we present COPRED, a web server for the concomitant prediction of fold, molecular function, and functional sites at the domain level, based on a previously developed and benchmarked methodology for domain molecular function prediction and a resource of domain functional annotations. COPRED can be freely accessed at http://csbg.cnb.csic.es/copred. The interface works in all standard web browsers. WebGL (natively supported by most browsers) is required for the in-line preview and manipulation of protein 3D structures. The website includes a detailed help section and usage examples. Contact: pazos@cnb.csic.es.

  6. Methodology Developed for Modeling the Fatigue Crack Growth Behavior of Single-Crystal, Nickel-Base Superalloys

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Because of their superior high-temperature properties, gas generator turbine airfoils made of single-crystal, nickel-base superalloys are fast becoming standard equipment on today's advanced, high-performance aerospace engines. The increased temperature capabilities of these airfoils have allowed a significant increase in the operating temperatures of turbine sections, resulting in superior propulsion performance and greater efficiencies. However, previously developed life-prediction methodologies are based on experience with polycrystalline alloys and may not be applicable to single-crystal alloys under certain operating conditions. One of the main areas where behavior differences between single-crystal and polycrystalline alloys are readily apparent is subcritical fatigue crack growth (FCG). The NASA Lewis Research Center's work in this area enables accurate prediction of subcritical fatigue crack growth behavior in single-crystal, nickel-base superalloys at elevated temperatures.
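
    The record does not give the NASA models themselves, but a common first-order description of subcritical FCG is the Paris law, da/dN = C(ΔK)^m; the sketch below integrates it numerically under illustrative constants (single-crystal alloys generally require orientation-dependent treatments beyond this).

    ```python
    # Minimal sketch: integrate the Paris law da/dN = C * (dK)^m with
    # dK = Y * dSigma * sqrt(pi * a). Constants are illustrative placeholders,
    # not values from the NASA Lewis work; units must be kept consistent.
    import math

    C, m = 1e-11, 3.0          # Paris constants (illustrative; dK in MPa*sqrt(m), da/dN in m/cycle)
    Y = 1.0                    # geometry factor
    d_sigma = 150.0            # stress range, MPa
    a, a_final = 0.001, 0.01   # crack length from 1 mm to 10 mm, in meters
    da = 1e-5                  # integration step, m

    cycles = 0.0
    while a < a_final:
        dK = Y * d_sigma * math.sqrt(math.pi * a)   # stress intensity factor range
        dadN = C * dK ** m                          # crack growth per cycle
        cycles += da / dadN                         # cycles needed to grow by da
        a += da

    print(f"Estimated life: {cycles:.3e} cycles")
    ```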

  7. Determination of anionic surface active agents using silica coated magnetite nanoparticles modified with cationic surfactant aggregates.

    PubMed

    Pena-Pereira, Francisco; Duarte, Regina M B O; Trindade, Tito; Duarte, Armando C

    2013-07-19

    The development of a novel methodology for the extraction and preconcentration of the most commonly used anionic surface active agents (SAAs), linear alkylbenzene sulfonates (LAS), is presented herein. The present method, based on the use of silica-magnetite nanoparticles modified with cationic surfactant aggregates, was developed for the determination of C10-C13 LAS homologues. The proposed methodology allowed quantitative recoveries of C10-C13 LAS homologues using a reduced amount of magnetic nanoparticles. Limits of detection were in the range 0.8-1.9 μg/L for the C10-C13 LAS homologues, while the repeatability, expressed as relative standard deviation (RSD), ranged from 2.0 to 3.9% (N = 6). Finally, the proposed method was successfully applied to the analysis of a variety of natural water samples. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. The French connection: some contributions of French-language research in the post-Piagetian era.

    PubMed

    Larivée, S; Normandeau, S; Parent, S

    2000-01-01

    This article presents French-speaking researchers' contribution to the field of differential developmental psychology. Following a brief review of key Piagetian ideas pertaining to his conceptualization of individual differences, the core of the article traces the methodological and theoretical transformations that were necessary for understanding individual differences within a general theory of cognitive development. On a methodological level, French-speaking researchers went from standardizing Piaget's clinical method to constructing developmental scales and operational tests. On a theoretical level, Reuchlin's writings guided Longeot, and several other French (Lautrey and Bideaud) and Genevan (de Ribaupierre and Rieben) researchers into a scientific quest for a genuine integration of differential and developmental psychology. We present an overview of the pluralistic and multidimensional model of cognitive functioning and development that emerged from the work of the French-Swiss team of researchers. Concluding remarks focus on the current research agendas of researchers interested in resolving the challenging issue of understanding relationships between inter- and intraindividual differences and general tendencies in cognitive development.

  9. Building quality into medical product software design.

    PubMed

    Mallory, S R

    1993-01-01

    The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.

  10. Examination of the impact of animal and dairy science journals based on traditional and newly developed bibliometric indices.

    PubMed

    Malesios, C; Abas, Z

    2012-12-01

    Using traditional bibliometric indices such as the well-known journal impact factor (IFAC), as well as more recently developed measures like the (journal) h-index and its modifications, we assessed the impact of the most prolific scientific journals in the field of animal and dairy science. To this end, we performed a detailed investigation of journal quality, using a total of 50 journals selected from the category "Agriculture, Dairy & Animal Science" of the Thomson Reuters (formerly Institute for Scientific Information, ISI) Web of Science. Our analysis showed that among the top journals in the field are the Journal of Dairy Research, the Journal of Dairy Science, and the Journal of Animal Science. In particular, the Journal of Animal Science, the most productive and frequently cited journal, has shown rapid development, especially in recent years. The majority of the top-tier, highly cited articles are those describing statistical methodology and standard chemical analytical methodologies.
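
    For readers unfamiliar with the h-index mentioned above: a journal has index h if h of its articles have each received at least h citations. A minimal sketch, with made-up citation counts:

    ```python
    # Minimal sketch of the (journal) h-index: the largest h such that h of the
    # journal's articles have at least h citations each. Data are illustrative.
    def h_index(citations):
        ranked = sorted(citations, reverse=True)
        h = 0
        for i, c in enumerate(ranked, start=1):
            if c >= i:
                h = i        # the i-th most cited article still has >= i citations
            else:
                break
        return h

    print(h_index([25, 8, 5, 3, 3, 1]))  # -> 3
    ```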

  11. Can context justify an ethical double standard for clinical research in developing countries?

    PubMed Central

    Landes, Megan

    2005-01-01

    Background: The design of clinical research deserves special caution so as to safeguard the rights of participating individuals. While the international community has agreed on ethical standards for the design of research, these frameworks still remain open to interpretation, revision, and debate. Recently a breach in the consensus on how to apply these ethical standards to research in developing countries has occurred, notably beginning with the 1994 placebo-controlled trials to reduce mother-to-child transmission of HIV-1 in Africa, Asia, and the Caribbean. The design of these trials sparked intense debate over the inclusion of a placebo-control group despite the existence of a 'gold standard', and trial supporters grounded their justifications of the trial design in the context of scarcity in resource-poor settings. Discussion: These 'contextual' apologetics are arguably an ethical loophole inherent in current bioethical methodology. However, this convenient appropriation of 'contextual' analysis simply fails to acknowledge the underpinnings of the feminist ethical analysis upon which it must stand. A more rigorous analysis of the political, social, and economic structures pertaining to the global context of developing countries reveals that the bioethical principles of beneficence and justice fail to be met in this trial design. Conclusion: Within this broader, and theoretically necessary, understanding of context, it becomes impossible to justify an ethical double standard for research in developing countries. PMID:16045801

  12. Human Integration Design Processes (HIDP)

    NASA Technical Reports Server (NTRS)

    Boyer, Jennifer

    2014-01-01

    The purpose of the Human Integration Design Processes (HIDP) document is to provide human-systems integration design processes, including methodologies and best practices that NASA has used to meet human systems and human rating requirements for developing crewed spacecraft. HIDP content is framed around human-centered design methodologies and processes in support of human-system integration requirements and human rating. NASA-STD-3001, Space Flight Human-System Standard, is a two-volume set of National Aeronautics and Space Administration (NASA) Agency-level standards established by the Office of the Chief Health and Medical Officer, directed at minimizing health and performance risks for flight crews in human space flight programs. Volume 1 of NASA-STD-3001, Crew Health, sets standards for fitness for duty, space flight permissible exposure limits, permissible outcome limits, levels of medical care, medical diagnosis, intervention, treatment and care, and countermeasures. Volume 2 of NASA-STD-3001, Human Factors, Habitability, and Environmental Health, focuses on human physical and cognitive capabilities and limitations and defines standards for spacecraft (including orbiters, habitats, and suits), internal environments, facilities, payloads, and related equipment, hardware, and software with which the crew interfaces during space operations. The NASA Procedural Requirements (NPR) 8705.2B, Human-Rating Requirements for Space Systems, specifies the Agency's human-rating processes, procedures, and requirements. The HIDP was written to share NASA's knowledge of processes directed toward achieving human certification of a spacecraft through implementation of human-systems integration requirements. Although the HIDP speaks directly to implementation of NASA-STD-3001 and NPR 8705.2B requirements, the human-centered design, evaluation, and design processes described in this document can be applied to any set of human-systems requirements and are independent of reference missions. The HIDP is a reference document that is intended to be used during the development of crewed space systems and operations to guide human-systems development process activities.

  13. Intelligent Devices - Sensors and Actuators - A KSC Perspective

    NASA Technical Reports Server (NTRS)

    Mata, Carlos T.; Perotti, Jose M.

    2008-01-01

    The primary objective of this workshop is to identify areas of advancement in sensor measurements and technologies that will help to define standard practices and procedures, better enabling the infusion into flight programs of sensors with improved capabilities but limited or no flight heritage. These standards would be crucial to demonstrating a methodology for validating current models, while also creating the possibility of having sufficient data either to update these models (e.g., spatial or temporal resolution) or to develop new models based on the ability to simulate the newly measured physical parameters. The workshop is also intended to narrow the gap between sensor measurements (and techniques), data processing techniques, and the ability to make use of that data, by gathering experts in the field for a short workshop. This collaboration will unite NASA and other government agencies with contractor capabilities industry-wide to prevent duplication, spawn synergistic growth in sensor technology, help analysts make good engineering decisions, and help focus new sensor maturation efforts to better meet the needs of future flight program customers. This is the first such workshop designed specifically to address establishing a standardized protocol/methodology for demonstrating the technology readiness of sensor systems without flight heritage. While other similar workshops cover many areas of interest to the sensor development community, no other meeting is specific enough to address this vital but often overlooked topic. By encouraging cross-fertilization of ideas among instrument experts from many different backgrounds, it is hoped that this workshop will initiate innovative new ideas and concepts in sensor development, calibration, and validation. It is anticipated that this workshop will repeat periodically as needed.

  14. Gas-diffusion microextraction coupled with spectrophotometry for the determination of formaldehyde in cork agglomerates.

    PubMed

    Brandão, Pedro F; Ramos, Rui M; Valente, Inês M; Almeida, Paulo J; Carro, Antonia M; Lorenzo, Rosa A; Rodrigues, José A

    2017-04-01

    In this work, a simple methodology was developed for the extraction and determination of free formaldehyde content in cork agglomerate samples. For the first time, gas-diffusion microextraction was used for the extraction of volatile formaldehyde directly from samples, with simultaneous derivatization with acetylacetone (Hantzsch reaction). The absorbance of the coloured solution was read in a spectrophotometer at 412 nm. Different extraction parameters were studied and optimized (extraction temperature, sample mass, volume of acceptor solution, extraction time and concentration of derivatization reagent) by means of an asymmetric screening. The developed methodology proved to be a reliable tool for the determination of formaldehyde in cork agglomerates, with the following suitable method features: low LOD (0.14 mg/kg) and LOQ (0.47 mg/kg), r² = 0.9994, and intraday and interday precision of 3.5 and 4.9%, respectively. The developed methodology was applied to the determination of formaldehyde in different cork agglomerate samples, and contents between 1.9 and 9.4 mg/kg were found. Furthermore, formaldehyde was also determined by the standard method EN 717-3 for comparison purposes; no significant differences between the results of the two methods were observed. Graphical abstract: representation of the GDME system and its main components.
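
    As an illustration of how LOD and LOQ figures like those above are conventionally derived from a linear calibration (LOD = 3.3s/slope, LOQ = 10s/slope, with s the standard deviation of the calibration residuals), here is a minimal sketch with invented standards data:

    ```python
    # Minimal sketch: fit a linear calibration and derive LOD/LOQ from the
    # residual standard deviation. The standards data below are made up.
    import numpy as np

    conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])               # standards, mg/kg
    absorb = np.array([0.031, 0.060, 0.118, 0.242, 0.479])   # absorbance at 412 nm

    slope, intercept = np.polyfit(conc, absorb, 1)
    residuals = absorb - (slope * conc + intercept)
    s = residuals.std(ddof=2)          # ddof=2: two fitted parameters

    lod = 3.3 * s / slope
    loq = 10.0 * s / slope
    print(f"slope={slope:.4f}, LOD={lod:.2f} mg/kg, LOQ={loq:.2f} mg/kg")
    ```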

  15. Organizational Change Efforts: Methodologies for Assessing Organizational Effectiveness and Program Costs versus Benefits.

    ERIC Educational Resources Information Center

    Macy, Barry A.; Mirvis, Philip H.

    1982-01-01

    A standardized methodology for identifying, defining, and measuring work behavior and performance rather than production, and a methodology that estimates the costs and benefits of work innovation are presented for assessing organizational effectiveness and program costs versus benefits in organizational change programs. Factors in a cost-benefit…

  16. Maintaining Equivalent Cut Scores for Small Sample Test Forms

    ERIC Educational Resources Information Center

    Dwyer, Andrew C.

    2016-01-01

    This study examines the effectiveness of three approaches for maintaining equivalent performance standards across test forms with small samples: (1) common-item equating, (2) resetting the standard, and (3) rescaling the standard. Rescaling the standard (i.e., applying common-item equating methodology to standard setting ratings to account for…
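
    The truncated abstract refers to applying common-item equating methodology to standard-setting ratings. As a generic illustration of a common-item linear transformation in the mean-sigma spirit (not the study's own procedure, and with hypothetical anchor statistics):

    ```python
    # Minimal sketch of a mean-sigma style common-item transformation: place
    # new-form values on the reference scale using the means and SDs of
    # statistics from items shared by both forms. All numbers are hypothetical.
    import statistics

    anchor_ref = [1.2, 0.8, 1.5, 1.1, 0.9]   # anchor item stats on the reference form
    anchor_new = [1.0, 0.6, 1.3, 0.9, 0.7]   # same items on the new form

    A = statistics.stdev(anchor_ref) / statistics.stdev(anchor_new)    # slope
    B = statistics.mean(anchor_ref) - A * statistics.mean(anchor_new)  # intercept

    def to_reference_scale(x):
        """Map a new-form value onto the reference form's scale."""
        return A * x + B

    cut_score_new = 0.95
    print(f"Equated cut score: {to_reference_scale(cut_score_new):.3f}")
    ```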

  17. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) and how to select an outcome measurement instrument.

    PubMed

    Mokkink, Lidwine B; Prinsen, Cecilia A C; Bouter, Lex M; Vet, Henrica C W de; Terwee, Caroline B

    2016-01-19

    COSMIN (COnsensus-based Standards for the selection of health Measurement INstruments) is an initiative of an international multidisciplinary team of researchers who aim to improve the selection of outcome measurement instruments, both in research and in clinical practice, by developing tools for selecting the most appropriate available instrument. In this paper these tools are described, i.e. the COSMIN taxonomy and definition of measurement properties; the COSMIN checklist to evaluate the methodological quality of studies on measurement properties; a search filter for finding studies on measurement properties; a protocol for systematic reviews of outcome measurement instruments; a database of systematic reviews of outcome measurement instruments; and a guideline for selecting outcome measurement instruments for Core Outcome Sets in clinical trials. Currently, we are updating the COSMIN checklist, particularly the standards for content validity studies. New standards for studies using Item Response Theory methods will also be developed. Additionally, in the future we want to develop standards for studies on the quality of non-patient-reported outcome measures, such as clinician-reported outcomes and performance-based outcomes. In summary, we plead for more standardization in the use of outcome measurement instruments, for conducting high-quality systematic reviews on measurement instruments in which the best available outcome measurement instrument is recommended, and for stopping the use of poor outcome measurement instruments.

  18. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) and how to select an outcome measurement instrument

    PubMed Central

    Mokkink, Lidwine B.; Prinsen, Cecilia A. C.; Bouter, Lex M.; de Vet, Henrica C. W.; Terwee, Caroline B.

    2016-01-01

    Background: COSMIN (COnsensus-based Standards for the selection of health Measurement INstruments) is an initiative of an international multidisciplinary team of researchers who aim to improve the selection of outcome measurement instruments, both in research and in clinical practice, by developing tools for selecting the most appropriate available instrument. Method: In this paper these tools are described, i.e. the COSMIN taxonomy and definition of measurement properties; the COSMIN checklist to evaluate the methodological quality of studies on measurement properties; a search filter for finding studies on measurement properties; a protocol for systematic reviews of outcome measurement instruments; a database of systematic reviews of outcome measurement instruments; and a guideline for selecting outcome measurement instruments for Core Outcome Sets in clinical trials. Currently, we are updating the COSMIN checklist, particularly the standards for content validity studies. New standards for studies using Item Response Theory methods will also be developed. Additionally, in the future we want to develop standards for studies on the quality of non-patient-reported outcome measures, such as clinician-reported outcomes and performance-based outcomes. Conclusions: In summary, we plead for more standardization in the use of outcome measurement instruments, for conducting high-quality systematic reviews on measurement instruments in which the best available outcome measurement instrument is recommended, and for stopping the use of poor outcome measurement instruments. PMID:26786084

  19. Assessing medical professionalism: A systematic review of instruments and their measurement properties.

    PubMed

    Li, Honghe; Ding, Ning; Zhang, Yuanyuan; Liu, Yang; Wen, Deliang

    2017-01-01

    Over the last three decades, various instruments have been developed and employed to assess medical professionalism, but their measurement properties have yet to be fully evaluated. This study aimed to systematically evaluate these instruments' measurement properties and the methodological quality of their related studies within a universally accepted standardized framework, and to provide corresponding recommendations. A systematic search of the electronic databases PubMed, Web of Science, and PsycINFO was conducted to collect studies published from 1990-2015. After screening titles, abstracts, and full texts for eligibility, the articles included in this study were classified according to their respective instrument's usage. A two-phase assessment was conducted: 1) methodological quality was assessed by following the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist; and 2) the quality of measurement properties was assessed according to Terwee's criteria. Results were integrated using best-evidence synthesis to identify recommendable instruments. After screening 2,959 records, 74 instruments from 80 existing studies were included. The overall methodological quality of these studies was unsatisfactory, for reasons including but not limited to unknown missing data, inadequate sample sizes, and vague hypotheses. Content validity, cross-cultural validity, and criterion validity were either unreported or received negative ratings in most studies. Based on best-evidence synthesis, three instruments were recommended: Hisar's instrument for nursing students, the Nurse Practitioners' Roles and Competencies Scale, and the Perceived Faculty Competency Inventory. Although instruments measuring medical professionalism are diverse, only a limited number of studies were methodologically sound. Future studies should give priority to systematically improving the performance of existing instruments and to longitudinal studies.

  20. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review

    PubMed Central

    Pollard, Beth; Johnston, Marie; Dixon, Diane

    2007-01-01

    Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity, and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of its i) theoretical framework (was there a definition of what was being assessed, and was it part of a theoretical model?) and ii) methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format, and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest), and no measure assessed was part of a theoretical model. None of the clinician-report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or the scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method; the DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of theoretical framework for both clinician-report and patient self-report measures, and drew attention to the wide variation in the methodological development of commonly used measures in OA. While the patient self-report measures generally had good methodological development, the clinician-report measures appeared less well developed. It would be of value if new measures defined the construct of interest and if that construct were part of a theoretical model. By ensuring that measures are both theoretically and empirically valid, improvements in subjective health outcome measures should be possible. PMID:17343739

  1. Strain-specific induction of experimental autoimmune prostatitis (EAP) in mice.

    PubMed

    Jackson, Christopher M; Flies, Dallas B; Mosse, Claudio A; Parwani, Anil; Hipkiss, Edward L; Drake, Charles G

    2013-05-01

    Prostatitis, a clinical syndrome characterized by pelvic pain and inflammation, is common in adult males. Although several induced and spontaneous murine models of prostatitis have been explored, the role of genetic background on induction has not been well-defined. Using a standard methodology for the induction of experimental autoimmune prostatitis (EAP), we investigated both acute and chronic inflammation on several murine genetic backgrounds. In our colony, nonobese diabetic (NOD) mice evinced spontaneous prostatitis that was not augmented by immunization with rat prostate extract (RPE). In contrast, the standard laboratory strain Balb/c developed chronic inflammation in response to RPE immunization. Development of EAP in other strains was variable. These data suggest that Balb/c mice injected with RPE may provide a useful model for chronic prostatic inflammation. Copyright © 2012 Wiley Periodicals, Inc.

  2. Strain-Specific Induction of Experimental Autoimmune Prostatitis (EAP) in Mice

    PubMed Central

    Jackson, Christopher M.; Flies, Dallas B.; Mosse, Claudio A.; Parwani, Anil; Hipkiss, Edward L.; Drake, Charles G.

    2013-01-01

    BACKGROUND Prostatitis, a clinical syndrome characterized by pelvic pain and inflammation, is common in adult males. Although several induced and spontaneous murine models of prostatitis have been explored, the role of genetic background on induction has not been well-defined. METHODS Using a standard methodology for the induction of experimental autoimmune prostatitis (EAP), we investigated both acute and chronic inflammation on several murine genetic backgrounds. RESULTS In our colony, nonobese diabetic (NOD) mice evinced spontaneous prostatitis that was not augmented by immunization with rat prostate extract (RPE). In contrast, the standard laboratory strain Balb/c developed chronic inflammation in response to RPE immunization. Development of EAP in other strains was variable. CONCLUSIONS These data suggest that Balb/c mice injected with RPE may provide a useful model for chronic prostatic inflammation. PMID:23129407

  3. Towards standardized testing methodologies for optical properties of components in concentrating solar thermal power plants

    NASA Astrophysics Data System (ADS)

    Sallaberry, Fabienne; Fernández-García, Aránzazu; Lüpfert, Eckhard; Morales, Angel; Vicente, Gema San; Sutter, Florian

    2017-06-01

    Precise knowledge of the optical properties of the components used in the solar field of concentrating solar thermal power plants is essential to ensuring optimum power production. These properties are measured and evaluated with different techniques and equipment, in laboratory conditions and/or in the field. Standards for such measurements, and international consensus on the appropriate techniques, are in preparation. The reference materials used as standards for the calibration of the equipment are under discussion. This paper summarizes current testing methodologies and guidelines for the characterization of the optical properties of solar mirrors and absorbers.

  4. Predictive Inference Using Latent Variables with Covariates

    PubMed Central

    Schofield, Lynne Steuerle; Junker, Brian; Taylor, Lowell J.; Black, Dan A.

    2014-01-01

    Plausible Values (PVs) are a standard multiple-imputation tool for the analysis of large education surveys that measure latent proficiency variables. When latent proficiency is the dependent variable, we reconsider the standard institutionally generated PV methodology and find that it applies with greater generality than previously shown. When latent proficiency is an independent variable, we show that the standard institutional PV methodology produces biased inference because the institutional conditioning model places restrictions on the form of the secondary analysts' model. We offer an alternative approach that avoids these biases, based on the mixed effects structural equations (MESE) model of Schofield (2008). PMID:25231627
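
    Plausible values are conventionally analyzed by estimating the quantity of interest once per PV set and then pooling with Rubin's multiple-imputation rules; the sketch below shows that pooling step with made-up per-PV estimates and variances.

    ```python
    # Minimal sketch of Rubin's rules for pooling plausible-value analyses:
    # run the analysis once per PV, then combine. All numbers are illustrative.
    import statistics

    estimates = [0.52, 0.49, 0.55, 0.51, 0.50]              # one estimate per PV set
    variances = [0.0040, 0.0042, 0.0039, 0.0041, 0.0040]    # sampling variance of each

    m = len(estimates)
    q_bar = statistics.mean(estimates)     # pooled point estimate
    u_bar = statistics.mean(variances)     # within-imputation variance
    b = statistics.variance(estimates)     # between-imputation variance (ddof=1)
    total_var = u_bar + (1 + 1 / m) * b    # Rubin's total variance

    print(f"pooled estimate = {q_bar:.3f}, SE = {total_var ** 0.5:.4f}")
    ```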

  5. REE radiation fault model: a tool for organizing and communicating radiation test data and constructing COTS-based spaceborne computing systems

    NASA Technical Reports Server (NTRS)

    Ferraro, R.; Some, R.

    2002-01-01

    The growth in data rates of instruments on future NASA spacecraft continues to outstrip the improvement in communications bandwidth and the processing capabilities of radiation-hardened computers. Sophisticated autonomous operations strategies will further increase the processing workload. Given the reductions in spacecraft size and available power, standard radiation-hardened computing systems alone will not be able to address the requirements of future missions. The REE project was intended to overcome this obstacle by developing a COTS-based supercomputer suitable for use as a science and autonomy data processor in most space environments. This development required a detailed knowledge of system behavior in the presence of Single Event Effect (SEE)-induced faults, so that mitigation strategies could be designed to recover system-level reliability while maintaining the COTS throughput advantage. The REE project has developed a suite of tools and a methodology for predicting SEU-induced transient fault rates in a range of natural space environments from ground-based radiation testing of component parts. In this paper we provide an overview of this methodology and tool set, with a concentration on the radiation fault model and its use in the REE system development methodology. Using test data reported elsewhere in this and other conferences, we predict upset rates for a particular COTS single-board computer configuration in several space environments.
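
    The REE tools themselves are not reproduced here, but the first-order calculation behind any such prediction multiplies a ground-measured per-bit cross section by the expected on-orbit flux and the memory size; the numbers below are illustrative, not REE project data.

    ```python
    # Minimal sketch of a first-order SEU rate estimate from ground test data.
    # All figures are illustrative placeholders.
    sigma_per_bit = 1e-14       # saturated cross section, cm^2/bit (from beam testing)
    flux = 5.0                  # particle flux in the target orbit, particles/cm^2/s
    n_bits = 256 * 2**20 * 8    # 256 MiB of memory, expressed in bits

    upsets_per_second = sigma_per_bit * flux * n_bits
    upsets_per_day = upsets_per_second * 86400
    print(f"~{upsets_per_day:.2f} upsets/day")
    ```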

  6. Development of the Comprehensive Cervical Dystonia Rating Scale: Methodology

    PubMed Central

    Comella, Cynthia L.; Fox, Susan H.; Bhatia, Kailash P.; Perlmutter, Joel S.; Jinnah, Hyder A.; Zurowski, Mateusz; McDonald, William M.; Marsh, Laura; Rosen, Ami R.; Waliczek, Tracy; Wright, Laura J.; Galpern, Wendy R.; Stebbins, Glenn T.

    2016-01-01

    We present the methodology utilized for the development and clinimetric testing of the Comprehensive Cervical Dystonia (CD) Rating Scale (CCDRS). The CCDRS includes a revision of the Toronto Western Spasmodic Torticollis Rating Scale (TWSTRS-2), a newly developed psychiatric screening tool (TWSTRS-PSYCH), and the previously validated Cervical Dystonia Impact Profile (CDIP-58). For the revision of the TWSTRS, the original TWSTRS was examined by a committee of dystonia experts at a dystonia rating scales workshop organized by the Dystonia Medical Research Foundation. During this workshop, deficiencies in the standard TWSTRS were identified, and recommendations for revision of the severity and pain subscales were incorporated into the TWSTRS-2. Given that no existing scale evaluates the psychiatric features of cervical dystonia (CD), we used a modified Delphi methodology and a reiterative process of item selection to develop the TWSTRS-PSYCH. We also included the CDIP-58 to capture the impact of CD on quality of life. The three scales (TWSTRS-2, TWSTRS-PSYCH, and CDIP-58) were combined to construct the CCDRS. Clinimetric testing of the reliability and validity of the CCDRS is described. The CCDRS was designed to be used in a modular fashion that can measure the full spectrum of CD. This scale will provide rigorous assessment for studies of natural history as well as of novel symptom-based or disease-modifying therapies. PMID:27088112

  7. Development of the Comprehensive Cervical Dystonia Rating Scale: Methodology.

    PubMed

    Comella, Cynthia L; Fox, Susan H; Bhatia, Kailash P; Perlmutter, Joel S; Jinnah, Hyder A; Zurowski, Mateusz; McDonald, William M; Marsh, Laura; Rosen, Ami R; Waliczek, Tracy; Wright, Laura J; Galpern, Wendy R; Stebbins, Glenn T

    2015-06-01

    We present the methodology utilized for the development and clinimetric testing of the Comprehensive Cervical Dystonia (CD) Rating Scale (CCDRS). The CCDRS includes a revision of the Toronto Western Spasmodic Torticollis Rating Scale (TWSTRS-2), a newly developed psychiatric screening tool (TWSTRS-PSYCH), and the previously validated Cervical Dystonia Impact Profile (CDIP-58). For the revision of the TWSTRS, the original TWSTRS was examined by a committee of dystonia experts at a dystonia rating scales workshop organized by the Dystonia Medical Research Foundation. During this workshop, deficiencies in the standard TWSTRS were identified, and recommendations for revision of the severity and pain subscales were incorporated into the TWSTRS-2. Given that no existing scale evaluates the psychiatric features of cervical dystonia (CD), we used a modified Delphi methodology and a reiterative process of item selection to develop the TWSTRS-PSYCH. We also included the CDIP-58 to capture the impact of CD on quality of life. The three scales (TWSTRS-2, TWSTRS-PSYCH, and CDIP-58) were combined to construct the CCDRS. Clinimetric testing of the reliability and validity of the CCDRS is described. The CCDRS was designed to be used in a modular fashion that can measure the full spectrum of CD. This scale will provide rigorous assessment for studies of natural history as well as of novel symptom-based or disease-modifying therapies.

  8. Development of a Mobile Phone App to Support Self-Monitoring of Emotional Well-Being: A Mental Health Digital Innovation

    PubMed Central

    2016-01-01

    Background: Emotional well-being is a primary component of mental health and well-being. Monitoring changes in emotional state daily over extended periods is, however, difficult using traditional methodologies. Providing mental health support is also challenging when only about 1 in 2 people with mental health issues seek professional help. Mobile phone technology offers a sustainable means of enhancing self-management of emotional well-being. Objective: This paper aims to describe the development of a mobile phone tool designed to monitor emotional changes in a natural everyday context and in real time. Methods: This evidence-informed mobile phone app monitors emotional mental health and well-being, and it provides links to mental health organization websites and resources. The app obtains data via self-report psychological questionnaires, experience sampling methodology (ESM), and automated behavioral data collection. Results: Feedback from 11 individuals (age range 16-52 years; 4 males, 7 females), who tested the app over 30 days, confirmed via survey and focus group methods that the app was functional and usable. Conclusions: Recommendations for future researchers and developers of mental health apps to be used for research are also presented. The methodology described in this paper offers a powerful tool for a range of potential mental health research studies and provides a valuable standard against which the development of future mental health apps should be considered. PMID:27881358

  9. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met by new generations of distributed, communicating, and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation, and maintenance of such systems and their intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system in order to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured to integrate other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared, and finally harmonized into an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes in health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework is exemplified in a public health scenario.

  10. Evaluation of a proposed expert system development methodology: Two case studies

    NASA Technical Reports Server (NTRS)

    Gilstrap, Lewey

    1990-01-01

    Two expert system development projects were studied to evaluate a proposed Expert Systems Development Methodology (ESDM). The ESDM was developed to provide guidance to managers and technical personnel and serve as a standard in the development of expert systems. It was agreed that the proposed ESDM must be evaluated before it could be adopted; therefore a study was planned for its evaluation. This detailed study is now underway. Before the study began, however, two ongoing projects were selected for a retrospective evaluation. They were the Ranging Equipment Diagnostic Expert System (REDEX) and the Backup Control Mode Analysis and Utility System (BCAUS). Both projects were approximately 1 year into development. Interviews of project personnel were conducted, and the resulting data was used to prepare the retrospective evaluation. Decision models of the two projects were constructed and used to evaluate the completeness and accuracy of key provisions of ESDM. A major conclusion reached from these case studies is that suitability and risk analysis should be required for all AI projects, large and small. Further, the objectives of each stage of development during a project should be selected to reduce the next largest area of risk or uncertainty on the project.

  11. A Comparison of Civil and Military, European and United States Regulations and Standards for the Certification of Helicopter Structure

    DTIC Science & Technology

    2013-01-01

    Christopher Dore, Air Vehicles Division, Defence Science and Technology Organisation, DSTO-TN-1136. ABSTRACT: A comparison ... of rotary wing aircraft structure was conducted. The comparison utilised a graphical hierarchy-based methodology developed as an improvement on text... Science and Technology Organisation researchers on the intent of the subject documents and the similarities and differences between them.

  12. Medical Technology Base Master Plan

    DTIC Science & Technology

    1990-03-01

    ... methodologies to evaluate the effectiveness of current and new integrated protective equipment systems. The failure of the system designer to adequately consider... with animal studies to use fewer than one-tenth the number of animals used in standard factorial experimental designs, and preliminary development of... research and development have produced cost savings as well as sustained and augmented combat and non-combat mission effectiveness. Examples of the Army's...

  13. Control of autonomous ground vehicles: a brief technical review

    NASA Astrophysics Data System (ADS)

    Babak, Shahian-Jahromi; Hussain, Syed A.; Karakas, Burak; Cetin, Sabri

    2017-07-01

    This paper presents a brief review of the developments achieved in autonomous vehicle systems technology. A concise history of autonomous driver assistance systems is presented, followed by a review of the current state-of-the-art sensor technology used in autonomous vehicles. A standard sensor fusion method that has recently been explored is discussed, with a minimal illustration sketched below. Finally, advances in embedded software methodologies that define the logic between sensory information and actuation decisions are reviewed.
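
    As a minimal illustration of the kind of standard fusion method such reviews cover, the sketch below fuses two noisy range sensors with a one-dimensional Kalman filter; all noise figures are invented.

    ```python
    # Minimal sketch of sensor fusion with a scalar Kalman filter: interleaved
    # readings from two sensors of different quality are fused into one estimate.
    measurements = [(10.2, 0.9), (9.7, 0.4), (10.1, 0.9), (9.9, 0.4)]  # (reading, sensor variance)

    x, p = 0.0, 1e6      # state estimate and its variance (uninformative prior)
    q = 0.01             # process noise: the true range may drift slightly

    for z, r in measurements:
        p += q                   # predict: uncertainty grows between readings
        k = p / (p + r)          # Kalman gain: trust in data vs. current estimate
        x += k * (z - x)         # update the estimate toward the measurement
        p *= (1 - k)             # shrink the uncertainty accordingly

    print(f"fused range estimate: {x:.2f} (variance {p:.3f})")
    ```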

  14. Radiation Status of Sub-65 nm Electronics

    NASA Technical Reports Server (NTRS)

    Pellish, Jonathan A.

    2011-01-01

    Ultra-scaled complementary metal oxide semiconductor (CMOS) technology includes commercial foundry capabilities at and below the 65 nm technology node. Radiation evaluations take place using standard products and test characterization vehicles (memories, logic/latch chains, etc.). The NEPP focus is two-fold: (1) conduct early radiation evaluations to ascertain viability for future NASA missions (i.e., leverage commercial technology development), and (2) uncover gaps in current testing methodologies and mechanism comprehension, for early risk mitigation.

  15. 50% Advanced Energy Design Guides: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonnema, E.; Leach, M.; Pless, S.

    2012-07-01

    This paper presents the process, methodology, and assumptions for the development of the 50% Energy Savings Advanced Energy Design Guides (AEDGs), a series of design guidance documents that provide specific recommendations for achieving 50% energy savings above the requirements of ANSI/ASHRAE/IESNA Standard 90.1-2004 in four building types: (1) small to medium office buildings, (2) K-12 school buildings, (3) medium to big-box retail buildings, and (4) large hospital buildings.

  16. Second-career science teachers' classroom conceptions of science and engineering practices examined through the lens of their professional histories

    NASA Astrophysics Data System (ADS)

    Antink-Meyer, Allison; Brown, Ryan A.

    2017-07-01

    Science standards in the U.S. have shifted to emphasise science and engineering process skills (i.e. specific practices within inquiry) to a greater extent than previous standards' emphases on broad representations of inquiry. This study examined the alignment between second-career science teachers' personal histories with the latter and examined the extent to which they viewed that history as a factor in their teaching. Four, second-career science teachers with professional backgrounds in engineering, environmental, industrial, and research and development careers participated. Through the examination of participants' methodological and contextual histories in science and engineering, little evidence of conflict with teaching was found. They generally exemplified the agency and motivation of a second-career teacher-scientist that has been found elsewhere [Gilbert, A. (2011). There and back again: Exploring teacher attrition and mobility with two transitioning science teachers. Journal of Science Teacher Education, 22(5), 393-415; Grier, J. M., & Johnston, C. C. (2009). An inquiry into the development of teacher identities in STEM career changers. Journal of Science Teacher Education, 20(1), 57-75]. The methodological and pedagogical perspectives of participants are explored and a discussion of the implications of findings for science teacher education are presented.

  17. The effects of organizational flexibility on nurse utilization and vacancy statistics in Ontario hospitals.

    PubMed

    Fisher, Anita; Baumann, Andrea; Blythe, Jennifer

    2007-01-01

    Social and economic changes in industrial societies during the past quarter-century encouraged organizations to develop greater flexibility in their employment systems in order to adapt to organizational restructuring and labour market shifts (Kalleberg 2003). During the 1990s this trend became evident in healthcare organizations. Before healthcare restructuring, employment in the acute hospital sector was more stable, with higher levels of full-time staff. However, in the downsizing era, employers favoured more flexible, contingent workforces (Zeytinoglu 1999). As healthcare systems evolved, staffing patterns became more chaotic and predicting staffing requirements more complex. Increased use of casual and part-time staff, overtime, and agency nurses, as well as alterations in skill mix, masked vacancy counts and thus made this measurement of nursing demand increasingly difficult. This study explores flexible nurse staffing practices and demonstrates how data such as nurse vacancy statistics, considered in isolation from nurse utilization information, are inaccurate indicators of nursing demand and nurse shortage. It develops an algorithm that provides a standard methodology for improved monitoring and management of nurse utilization data and better quantification of vacancy statistics. Use of a standard methodology promotes more accurate measurement of nurse utilization and shortage, and provides a solid base for improved nursing workforce planning, production, and management.
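
    The paper's algorithm is not reproduced here; the sketch below only illustrates, hypothetically, the underlying idea of converting all worked hours (full-time, part-time, casual, overtime, agency) into full-time equivalents so that vacancies can be read against actual utilization. Field names and figures are invented.

    ```python
    # Hypothetical sketch: express all worked hours as full-time equivalents (FTE)
    # and compare against funded positions. Not the paper's actual algorithm.
    FTE_HOURS_PER_YEAR = 1950  # assumed annual full-time hours

    unit = {
        "funded_positions_fte": 40.0,
        "fulltime_hours": 54_600,
        "parttime_hours": 11_700,
        "casual_hours": 3_900,
        "overtime_hours": 2_925,
        "agency_hours": 1_950,
    }

    # Sum every hours field and convert to FTEs worked.
    worked_fte = sum(v for k, v in unit.items() if k.endswith("_hours")) / FTE_HOURS_PER_YEAR
    effective_vacancy_fte = unit["funded_positions_fte"] - worked_fte

    print(f"utilized: {worked_fte:.1f} FTE, effective vacancy: {effective_vacancy_fte:.1f} FTE")
    ```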

  18. Strategies to design clinical studies to identify predictive biomarkers in cancer research.

    PubMed

    Perez-Gracia, Jose Luis; Sanmamed, Miguel F; Bosch, Ana; Patiño-Garcia, Ana; Schalper, Kurt A; Segura, Victor; Bellmunt, Joaquim; Tabernero, Josep; Sweeney, Christopher J; Choueiri, Toni K; Martín, Miguel; Fusco, Juan Pablo; Rodriguez-Ruiz, Maria Esperanza; Calvo, Alfonso; Prior, Celia; Paz-Ares, Luis; Pio, Ruben; Gonzalez-Billalabeitia, Enrique; Gonzalez Hernandez, Alvaro; Páez, David; Piulats, Jose María; Gurpide, Alfonso; Andueza, Mapi; de Velasco, Guillermo; Pazo, Roberto; Grande, Enrique; Nicolas, Pilar; Abad-Santos, Francisco; Garcia-Donas, Jesus; Castellano, Daniel; Pajares, María J; Suarez, Cristina; Colomer, Ramon; Montuenga, Luis M; Melero, Ignacio

    2017-02-01

    The discovery of reliable biomarkers to predict the efficacy and toxicity of anticancer drugs remains one of the key challenges in cancer research. Despite its relevance, no efficient study designs to identify promising candidate biomarkers have been established. This has led to the proliferation of a myriad of exploratory studies using dissimilar strategies, most of which fail to identify any promising targets and are seldom validated. The lack of a proper methodology also means that many anti-cancer drugs are developed below their potential, due to failure to identify predictive biomarkers. While some drugs will be systematically administered to many patients who will not benefit from them, leading to unnecessary toxicities and costs, others will never reach registration due to our inability to identify the specific patient population in which they are active. Despite these drawbacks, a limited number of outstanding predictive biomarkers have been successfully identified and validated, and have changed the standard practice of oncology. In this manuscript, a multidisciplinary panel reviews how those key biomarkers were identified and, based on those experiences, proposes a methodological framework, the DESIGN guidelines, to standardize the clinical design of biomarker identification studies and to develop future research in this pivotal field. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. GEOTHERMAL EFFLUENT SAMPLING WORKSHOP

    EPA Science Inventory

    This report outlines the major recommendations resulting from a workshop to identify gaps in existing geothermal effluent sampling methodologies, define needed research to fill those gaps, and recommend strategies to lead to a standardized sampling methodology.

  20. Quality evaluation of health information system's architectures developed using the HIS-DF methodology.

    PubMed

    López, Diego M; Blobel, Bernd; Gonzalez, Carolina

    2010-01-01

    Requirement analysis, design, implementation, evaluation, use, and maintenance of semantically interoperable Health Information Systems (HIS) have to be based on eHealth standards. HIS-DF is a comprehensive approach for HIS architectural development based on standard information models and vocabulary. The empirical validity of HIS-DF had not previously been demonstrated. Through an empirical experiment, the paper demonstrates that, using HIS-DF and HL7 information models, the semantic quality of a HIS architecture can be improved compared to architectures developed using the traditional RUP process. Semantic quality of the architecture was measured in terms of the model's completeness and validity metrics. The experimental results demonstrated an increase in completeness of 14.38% and an increase in validity of 16.63% when using HIS-DF and HL7 information models in a sample HIS development project. Quality assurance of the system architecture in the earlier stages of HIS development implies increased quality in the final HIS, and thus an indirect positive impact on patient care.

  1. Comparison of Damage Path Predictions for Composite Laminates by Explicit and Standard Finite Element Analysis Tools

    NASA Technical Reports Server (NTRS)

    Bogert, Philip B.; Satyanarayana, Arunkumar; Chunchu, Prasad B.

    2006-01-01

    Splitting, ultimate failure load, and the damage path in center-notched composite specimens subjected to in-plane tension loading are predicted using a progressive failure analysis methodology. A 2-D Hashin-Rotem failure criterion is used in determining intra-laminar fiber and matrix failures. This progressive failure methodology has been implemented in the Abaqus/Explicit and Abaqus/Standard finite element codes through the user-written subroutines "VUMAT" and "USDFLD", respectively. A 2-D finite element model is used for predicting the intra-laminar damage. Analysis results obtained from the Abaqus/Explicit and Abaqus/Standard codes show good agreement with experimental results. The importance of modeling delamination in the progressive failure analysis methodology is recognized for future studies. The use of an explicit integration dynamics code for simple specimen geometry and static loading establishes a foundation for future analyses, where complex loading and nonlinear dynamic interactions of damage and structure will necessitate it.
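
    As an illustration of the 2-D Hashin-Rotem criterion named above (in one common form: a linear fiber index and a quadratic matrix index), here is a minimal sketch with illustrative allowables, not the paper's material data:

    ```python
    # Minimal sketch of a 2-D Hashin-Rotem ply failure check. Allowables and the
    # trial stress state are illustrative placeholders.
    def hashin_rotem_2d(s11, s22, t12, Xt, Xc, Yt, Yc, S):
        """Return (fiber_index, matrix_index); failure is predicted at index >= 1."""
        # Fiber mode: axial stress against the tensile or compressive allowable.
        fiber = s11 / Xt if s11 >= 0 else -s11 / Xc
        # Matrix mode: transverse stress combined with in-plane shear.
        Y = Yt if s22 >= 0 else Yc
        matrix = (s22 / Y) ** 2 + (t12 / S) ** 2
        return fiber, matrix

    # Illustrative carbon/epoxy allowables (MPa) and a trial ply stress state.
    f, m = hashin_rotem_2d(s11=1200.0, s22=30.0, t12=50.0,
                           Xt=2000.0, Xc=1200.0, Yt=50.0, Yc=200.0, S=80.0)
    print(f"fiber index {f:.2f}, matrix index {m:.2f}")
    ```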

  2. [Fair use of tests in health sciences].

    PubMed

    Espelt, Albert; Viladrich, Carme; Doval, Eduardo; Aliaga, Joan; García-Rueda, Rebeca; Tárrega, Salomé

    2014-01-01

    Standardized measurement instruments (tests) have become an essential tool in health sciences. The concept of equity in the development, adaptation, and administration of psychometric tests was first introduced in "Standards for Educational and Psychological Testing", published in 1999 by the American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education. Despite its importance, this concept has been scarcely used in epidemiology and public health. Consequently, this methodological note aims to explain the concept of equity in testing and to provide tools and indications to detect and address their inequitable use. Copyright © 2014 SESPAS. Published by Elsevier España. All rights reserved.

  3. [Is there a German history of evidence-based medicine? Methodic standards of therapeutic research in the early 20th century and Paul Martini's "Methodology of therapeutic investigation" (1932)].

    PubMed

    Stoll, S; Roelcke, V; Raspe, H

    2005-07-29

    The article addresses the history of evidence-based medicine in Germany. Its aim was to reconstruct the standards of clinical-therapeutic investigation in Germany at the beginning of the 20th century. A historical investigation of five important German general medical journals from 1918 to 1932 provides an overview of the state of clinical investigation in that period. 268 clinical trials are identified and analysed with regard to their methodological design. The results are heterogeneous: while a few examples of sophisticated methodology exist, the design of the majority of the studies is poor. A response to this situation can be seen in Paul Martini's book "Methodology of Therapeutic Investigation", first published in 1932. Paul Martini's biography, his criticism of the clinical-therapeutic investigation of his time, the major points of his methodology, and the reception of the book in Germany and abroad are described.

  4. Multi-factor energy price models and exotic derivatives pricing

    NASA Astrophysics Data System (ADS)

    Hikspoors, Samuel

    The high pace at which many of the world's energy markets have gradually been opened to competition has generated a significant amount of new financial activity. Academics and practitioners alike have recently started to develop the tools of energy derivatives pricing/hedging as a quantitative topic in its own right. The energy contract structures, as well as the properties of the underlying assets, set the energy risk management industry apart from its more standard equity and fixed-income counterparts. This thesis contributes to these broad market developments by participating in the advance of the mathematical tools aimed at a better theory of energy contingent claim pricing/hedging. We propose several realistic two-factor and three-factor models for spot and forward price processes that generalize some well-known and standard modeling assumptions. We develop the associated pricing methodologies and propose stable calibration algorithms that motivate the application of the relevant modeling schemes.
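
    As a flavor of the model class involved, the sketch below simulates a one-factor mean-reverting (Schwartz-type) log-spot process, a standard building block that two- and three-factor models of this kind generalize; all parameters are illustrative, not calibrated values.

    ```python
    # Minimal sketch: Euler simulation of a mean-reverting log-spot process,
    # dX = kappa*(mu - X)*dt + sigma*dW, with spot S = exp(X). Parameters are
    # illustrative placeholders.
    import math
    import random

    kappa, mu, sigma = 2.0, math.log(50.0), 0.5   # reversion speed, long-run log level, volatility
    dt, n_steps = 1 / 252, 252                    # one year of daily steps

    x = math.log(48.0)                            # initial log spot
    path = []
    for _ in range(n_steps):
        dw = random.gauss(0.0, math.sqrt(dt))     # Brownian increment
        x += kappa * (mu - x) * dt + sigma * dw   # Euler step of the OU process
        path.append(math.exp(x))

    print(f"final simulated spot: {path[-1]:.2f}")
    ```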

  5. Human-Centered Design Study: Enhancing the Usability of a Mobile Phone App in an Integrated Falls Risk Detection System for Use by Older Adult Users

    PubMed Central

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid

    2017-01-01

    Background: Design processes such as human-centered design (HCD), which involve the end user throughout the product development and testing process, can be crucial in ensuring that a product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of HCD can often conflict with the rapid product development life-cycles that the competitive connected health industry demands. Objective: The aim of this study was to apply a structured HCD methodology to the development of a smartphone app to be used within a connected health fall risk detection system. Our methodology utilizes so-called discount usability engineering techniques to minimize the burden on resources during development and maintain a rapid pace of development. This study provides prospective designers a detailed description of the application of an HCD methodology. Methods: A 3-phase methodology was applied. In the first phase, a descriptive "use case" was developed by the system designers and analyzed by both expert stakeholders and end users. The use case described the use of the app and how various actors would interact with it and in what context. A working app prototype and a user manual were then developed based on this feedback and were subjected to a rigorous usability inspection. Further changes were made to both the interface and the support documentation. The now advanced prototype was exposed to user testing by end users, where further design recommendations were made. Results: With combined expert and end-user analysis of a comprehensive use case having originally identified 21 problems with the system interface, we observed only 3 of these problems in user testing, implying that 18 problems were eliminated between phases 1 and 3. Satisfactory ratings were obtained during validation testing by both experts and end users, and final testing by users shows the system requires low mental, physical, and temporal demands according to the NASA Task Load Index (NASA-TLX). Conclusions: From our observation of older adults' interactions with smartphone interfaces, there were some recurring themes. Clear and relevant feedback as the user attempts to complete a task is critical. Feedback should include pop-ups, sound tones, color or texture changes, or icon changes to indicate that a function has been completed successfully, such as for the connection sequence. For text feedback, clear and unambiguous language should be used so as not to create anxiety, particularly when it comes to saving data. Warning tones or symbols, such as caution symbols or shrill tones, should only be used if absolutely necessary. Our HCD methodology, designed and implemented based on the principles of the International Organization for Standardization (ISO) 9241-210 standard, produced a functional app interface within a short production cycle, which is now suitable for use by older adults in long-term clinical trials. PMID:28559227

  6. End-to-end observatory software modeling using domain specific languages

    NASA Astrophysics Data System (ADS)

    Filgueira, José M.; Bec, Matthieu; Liu, Ning; Peng, Chien; Soto, José

    2014-07-01

    The Giant Magellan Telescope (GMT) is a 25-meter extremely large telescope being built by an international consortium of universities and research institutions. Its software and control system is being developed using a set of Domain-Specific Languages (DSLs) that support a model-driven development methodology integrated with an Agile management process. This approach promotes the use of standardized models that capture the component architecture of the system, facilitate the construction of technical specifications in a uniform way, ease communication between developers and domain experts, and provide a framework to ensure the successful integration of the software subsystems developed by the GMT partner institutions.
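
    To make the model-driven idea concrete, the sketch below shows the kind of typed component-and-port model such a DSL might capture, with a connection check standing in for generated integration validation. The names and fields are hypothetical and do not reflect the GMT's actual languages.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Port:
        name: str
        datatype: str
        direction: str  # "in" or "out"

    @dataclass
    class Component:
        name: str
        ports: list = field(default_factory=list)

        def connect(self, out_port, other, in_port):
            """Validate a connection the way a checker generated from the model might."""
            src = next(p for p in self.ports if p.name == out_port and p.direction == "out")
            dst = next(p for p in other.ports if p.name == in_port and p.direction == "in")
            if src.datatype != dst.datatype:
                raise TypeError(f"{self.name}.{out_port} -> {other.name}.{in_port}: type mismatch")
            return (self.name, out_port, other.name, in_port)

    # Hypothetical subsystems built by different partners, integrated via the shared model:
    mount = Component("mount_controller", [Port("target", "coordinates", "in"),
                                           Port("status", "telemetry", "out")])
    ui = Component("operator_ui", [Port("telemetry_in", "telemetry", "in")])
    print(mount.connect("status", ui, "telemetry_in"))
    ```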

  7. Feasibility of "Standardized Clinician" Methodology for Patient Training on Hospital-to-Home Transitions.

    PubMed

    Wehbe-Janek, Hania; Hochhalter, Angela K; Castilla, Theresa; Jo, Chanhee

    2015-02-01

    Patient engagement in health care is increasingly recognized as essential for promoting the health of individuals and populations. This study pilot-tested the standardized clinician (SC) methodology, a novel adaptation of standardized patient methodology, for teaching patient engagement skills for the complex health care situation of transitioning from hospital back to home. Sixty-seven participants at heightened risk for hospitalization were randomly assigned to either a simulation exposure-only or a full-intervention group. Both groups participated in simulation scenarios with "standardized clinicians" around tasks related to hospital discharge and follow-up. The full-intervention group was also debriefed after scenario sets and learned about tools for actively participating in hospital-to-home transitions. Measures included changes in observed behaviors between baseline and follow-up and an overall program evaluation. The full-intervention group showed increases in observed tool possession (P = 0.014) and in expression of their preferences and values (P = 0.043). The simulation exposure-only group showed improved worksheet scores (P = 0.002) but fewer engagement skills (P = 0.021). Both groups showed a decrease in telling an SC about their hospital admission (P < 0.05). Open-ended comments from the program evaluation were largely positive. Both groups benefited from exposure to the SC intervention. Program evaluation data suggest that simulation training is feasible and may provide a useful methodology for teaching patients skills for active engagement in health care. Future studies are warranted to determine whether this methodology can be used to assess overall patient engagement and whether new patient learning transfers to health care encounters.

  8. Methodological proposal for validation of the disinfecting efficacy of an automated flexible endoscope reprocessor

    PubMed Central

    Graziano, Kazuko Uchikawa; Pereira, Marta Elisa Auler; Koda, Elaine

    2016-01-01

    Objective: to develop and apply a method for assessing the efficacy of automated flexible endoscope reprocessors at a time when no official method or trained laboratories exist in Brazil to verify compliance with the requirements described in the specific standards for this type of health product. Method: this methodological study was developed based on the following references: International Organization for Standardization (ISO) standard ISO 15883-4:2008 and Brazilian Health Surveillance Agency (Agência Nacional de Vigilância Sanitária - ANVISA) Collegiate Board Resolutions (Resoluções de Diretoria Colegiada - RDC) no. 35/2010 and 15/2012. The proposed method was applied to a commercially available device using a high-level 0.2% peracetic acid-based disinfectant. Results: the proposed method of assessment proved robust when the recommendations made in the relevant legislation were incorporated with some adjustments to ensure their feasibility. Application of the proposed method provided evidence of the efficacy of the tested equipment for the high-level disinfection of endoscopes. Conclusion: the proposed method may serve as a reference for the assessment of flexible endoscope reprocessors, thereby providing solid ground for the purchase of this category of health products. PMID:27508915

  9. Amendment of water quality standards in China: viewpoint on strategic considerations.

    PubMed

    Zhao, Xiaoli; Wang, Hao; Tang, Zhi; Zhao, Tianhui; Qin, Ning; Li, Huixian; Wu, Fengchang; Giesy, John P

    2018-02-01

    Water quality standards (WQS) are the most important tool for protecting the quality of aquatic environments in China and play a decisive role in the management of China's aquatic environments. Because limited scientific information was available previously, China's WQS were developed largely from water quality criteria (WQC) or WQS recommended by developed countries, which may not suit current circumstances in China. The Chinese government recently initiated a revision of the Environmental Quality Standards for Surface Water (EQSSW) (GB3838-2002) to meet the challenge of environmental protection. This review analyzes how the WQS developed and applied in China differ from those of more developed countries and points out that the lack of a strong scientific basis for China's WQC poses a major limitation on the current WQS. We focus on six aspects that require close attention in establishing a national WQC system to support the revision of the WQS (Table 1): developing methodology, refining water function zoning, establishing a priority pollutants list, improving the protection of drinking water sources, developing site-specific water quality criteria, and conducting field toxicity tests. It is essential that China and other developing countries establish a relatively mature system for promulgating, applying, and enforcing WQC and implement a dynamic system that incorporates the most recent research results into periodically updated WQS.

  10. [AIDS and pauperization: principal concepts and empirical evidence].

    PubMed

    Bastos, F I; Szwarcwald, C L

    2000-01-01

    This paper discusses methodologies for analyzing the relations between social inequalities, marginalization, prejudice, and vulnerability to HIV/AIDS, highlighting current difficulties and alternative research strategies. It also reviews the international and Brazilian literature, emphasizing: economic and macropolitical dimensions in the spread of HIV/AIDS; the role of drug policies and consumption; gender inequalities and prejudice; racial/ethnic inequalities and prejudice; interaction with other STIs and their relationship to poverty; HIV/AIDS and health care standards, especially access to antiretroviral therapy; and human rights violations. Despite current methodological dilemmas in analyzing the relations between psychosocial, cultural, and sociopolitical variables and vulnerability to HIV/AIDS, and despite the limited Brazilian literature, these themes merit further investigation that addresses Brazilian social and cultural specificities and profits from recently developed research strategies.

  11. Functioning information in the learning health system.

    PubMed

    Stucki, Gerold; Bickenbach, Jerome

    2017-02-01

    In this methodological note on applying the ICF in rehabilitation, we introduce functioning information as fundamental to the "learning health system" and to the continuous improvement of the health system's response to people's functioning needs through the provision of rehabilitation. A learning health system for rehabilitation operates at the micro-level of the individual patient, the meso-level of operational management, and the macro-level of policy that guides rehabilitation programming. All three levels rely on the capacity of the health system's informational infrastructure for standardized documentation and coding of functioning information, and on the development of national rehabilitation quality management systems. This methodological note describes how functioning information is used for the continuous improvement of functioning outcomes in a learning health system across these three levels.

  12. Handwriting Examination: Moving from Art to Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarman, Kristin H.; Hanlen, Richard C.; Manzolillo, P. A.

    The scientific basis for handwriting individuality and the expertise of handwriting examiners have been questioned in several court cases and law review articles. The criticisms were originally directed at the proficiency and expertise of forensic document examiners (FDEs). However, these criticisms also illustrate the lack of empirical data to support and validate the premises and methodology of handwriting examination. As a result, the admissibility and weight of FDE testimony have been called into question. These assaults on the scientific integrity of handwriting analysis have created an urgent need for the forensic document examination community to develop objective standards, measurable criteria, and a uniform methodology supported by properly controlled studies that evaluate and validate the significance of measurable handwriting characteristics.

  13. Advances in bioanalytical techniques to measure steroid hormones in serum.

    PubMed

    French, Deborah

    2016-06-01

    Steroid hormones are measured clinically to determine whether a patient has a pathological process occurring in the adrenal gland or in other hormone-responsive organs. They are very similar in structure, making them analytically challenging to measure, and their vast concentration differences in human serum add to the measurement complexity. GC-MS was long the gold-standard methodology for measuring steroid hormones clinically, followed by radioimmunoassay, which was in turn replaced by immunoassay due to ease of use. LC-MS/MS has now become a popular alternative, owing to simpler sample preparation than GC-MS requires and to greater specificity and sensitivity than immunoassay offers. This review discusses these methodologies and some new developments that could simplify and improve steroid hormone analysis in serum.

  14. Recipe for Success: Digital Viewables

    NASA Technical Reports Server (NTRS)

    LaPha, Steven; Gaydos, Frank

    2014-01-01

    The Engineering Services Contract (ESC) and the Information Management Communication Support (IMCS) contract at Kennedy Space Center (KSC) provide services to NASA in support of flight and ground systems design and development. These groups provide the tools, assistance, and best-practice methodologies required for efficient, optimized design and process development. The team is responsible for configuring and implementing systems and software, along with training, documentation, and administering standards. The team supports over 200 engineers and design specialists in the use of Windchill, Creo Parametric, NX, AutoCAD, and a variety of other design and analysis tools.

  15. Development of a frequency regulation duty-cycle for standardized energy storage performance testing

    DOE PAGES

    Rosewater, David; Ferreira, Summer

    2016-05-25

    The US DOE Protocol for uniformly measuring and expressing the performance of energy storage systems, first developed in 2012 through inclusive working group activities, provides standardized methodologies for evaluating an energy storage system's ability to supply specific services to electrical grids. This article elaborates on the data and decisions behind the duty-cycle used for frequency regulation in this protocol. A year of publicly available frequency regulation control signal data from a utility was analyzed in developing the representative signal for this use case. This analysis showed that signal standard deviation can be used as a metric for aggressiveness or rigor. From these data, we select representative 2-h-long signals that exhibit nearly all of the dynamics of actual usage under two distinct regimes, one for average use and the other for highly aggressive use. Our results were combined into a 24-h duty-cycle comprising average and aggressive segments. The benefits and drawbacks of the selected duty-cycle are discussed, along with its potential implications for the energy storage industry.
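
    The selection idea described above can be sketched as follows: score fixed-length windows of a normalized regulation signal by their standard deviation, then take a median-std window as the "average" segment and the highest-std window as the "aggressive" one. The signal below is synthetic and the 4-second resolution is assumed; the study used a year of utility control-signal data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    samples_per_hour = 900                       # 4-second resolution, assumed
    n = 30 * 24 * samples_per_hour               # 30 synthetic days
    t = np.linspace(0, 20 * np.pi, n)
    amplitude = 0.2 + 0.8 * np.abs(np.sin(t))    # slowly varying "aggressiveness"
    signal = np.clip(amplitude * rng.normal(0, 0.4, n), -1, 1)

    win = 2 * samples_per_hour                   # samples in a 2 h window
    windows = signal[: len(signal) // win * win].reshape(-1, win)
    stds = windows.std(axis=1)                   # aggressiveness metric per window

    average_idx = int(np.argsort(stds)[len(stds) // 2])   # median-std window
    aggressive_idx = int(np.argmax(stds))                 # highest-std window
    print(f"average window {average_idx} (std={stds[average_idx]:.3f}), "
          f"aggressive window {aggressive_idx} (std={stds[aggressive_idx]:.3f})")
    ```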

  16. Accelerator controls at CERN: Some converging trends

    NASA Astrophysics Data System (ADS)

    Kuiper, B.

    1990-08-01

    CERN's growing services to the high-energy physics community, delivered with frozen resources, have led to the implementation of "Technical Boards", mandated to assist the management by recommending rationalizations in various technological domains. The Board on Process Control and Electronics for Accelerators, TEBOCO, has emphasized four main lines that might yield economies in resources. First, a common architecture for accelerator controls has been agreed between the three accelerator divisions. Second, a common hardware/software kit has been defined, from which the large majority of future process interfacing may be composed; a support service for this kit is an essential part of the plan. Third, high-level protocols have been developed for standardizing access to process devices. They derive from agreed standard models of the devices and involve a standard control message, which should ease application development and the mobility of equipment. Fourth, a common software engineering methodology and a commercial package of application development tools have been adopted. Some rationalization in the field of the man-machine interface and in matters of synchronization is also under way.
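
    A hypothetical sketch of what a standard control message of the kind described might look like follows; the field names and JSON envelope are illustrative assumptions, not CERN's actual protocol.

    ```python
    import json

    def control_message(device, prop, action, value=None):
        """Build a uniform control message: every device exposes named
        properties that are read or set through the same envelope."""
        if action not in ("get", "set"):
            raise ValueError("action must be 'get' or 'set'")
        msg = {"device": device, "property": prop, "action": action}
        if action == "set":
            msg["value"] = value
        return json.dumps(msg)

    # The same envelope addresses a power converter and a beam monitor alike
    # (device and property names are invented):
    print(control_message("PSB.QF01", "current", "set", 120.5))
    print(control_message("PSB.BPM03", "position", "get"))
    ```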

  17. Assessing human rights impacts in corporate development projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salcito, Kendyl, E-mail: kendyl.salcito@unibas.ch; University of Basel, P.O. Box, CH-4003 Basel; NomoGaia, 1900 Wazee Street, Suite 303, Denver, CO 80202

    Human rights impact assessment (HRIA) is a process for systematically identifying, predicting and responding to the potential impact on human rights of a business operation, capital project, government policy or trade agreement. Traditionally, it has been conducted as a desktop exercise to predict the effects of trade agreements and government policies on individuals and communities. In line with a growing call for multinational corporations to ensure they do not violate human rights in their activities, HRIA is increasingly incorporated into the standard suite of corporate development project impact assessments. In this context, the policy world's non-structured, desk-based approaches to HRIA are insufficient. Although a number of corporations have commissioned and conducted HRIA, no broadly accepted and validated assessment tool is currently available. The lack of standardisation has complicated efforts to evaluate the effectiveness of HRIA as a risk mitigation tool, and has caused confusion in the corporate world regarding company duties. Hence, clarification is needed. The objectives of this paper are (i) to describe an HRIA methodology, (ii) to provide a rationale for its components and design, and (iii) to illustrate implementation of HRIA using the methodology in two selected corporate development projects—a uranium mine in Malawi and a tree farm in Tanzania. We found that as a prognostic tool, HRIA could examine potential positive and negative human rights impacts and provide effective recommendations for mitigation. However, longer-term monitoring revealed that recommendations were unevenly implemented, dependent on market conditions and personnel movements. This instability in the approach to human rights suggests a need for on-going monitoring and surveillance. -- Highlights: • We developed a novel methodology for corporate human rights impact assessment. • We piloted the methodology on two corporate projects—a mine and a plantation. • Human rights impact assessment exposed impacts not foreseen in ESIA. • Corporations adopted the majority of findings, but not necessarily immediately. • Methodological advancements are expected for monitoring processes.

  18. Surface electromyography in animals: A systematic review

    PubMed Central

    Valentin, Stephanie; Zsoldos, Rebeka R.

    2017-01-01

    The study of muscle activity using surface electromyography (sEMG) is commonly used for investigations of the neuromuscular system in man. Although sEMG has faced methodological challenges, considerable technical advances have been made in the last few decades. Similarly, the field of animal biomechanics, including sEMG, has grown despite often complex experimental conditions. In human sEMG research, standardised protocols have been developed; these are lacking in animal sEMG. Before standards can be proposed for animal sEMG, the existing research should be collated and evaluated. Therefore the aim of this review is to systematically identify and summarise the literature in animal sEMG, focussing on (1) species, breeds, activities and muscles investigated, and (2) electrode placement and normalisation methods used. The databases PubMed, Web of Science, Scopus, and Vetmed Resource were searched systematically for sEMG studies in animals, and 38 articles were included in the final review. Data on methodological quality were collected and summarised. The findings from this systematic review indicate the divergence in animal sEMG methodology, and as a result, future steps required to develop standardisation in animal sEMG are proposed. PMID:26763600
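
    One of the normalisation methods such reviews survey is amplitude normalisation to a maximum voluntary contraction (MVC): a task's RMS envelope is expressed as a percentage of the amplitude recorded during an MVC trial. The sketch below illustrates that calculation on synthetic signals; the sampling rate and window length are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    fs = 1000                                   # sampling rate in Hz, assumed
    task_emg = rng.normal(0, 0.3, 5 * fs)       # 5 s task recording (mV), synthetic
    mvc_emg = rng.normal(0, 1.0, 3 * fs)        # 3 s MVC recording (mV), synthetic

    def rms_envelope(x, fs, window_ms=100):
        """Moving-window RMS amplitude of a raw EMG signal."""
        w = int(fs * window_ms / 1000)
        sq = np.convolve(x**2, np.ones(w) / w, mode="valid")
        return np.sqrt(sq)

    mvc_ref = rms_envelope(mvc_emg, fs).max()   # reference MVC amplitude
    task_pct = 100 * rms_envelope(task_emg, fs) / mvc_ref
    print(f"task intensity: mean {task_pct.mean():.1f}% MVC, peak {task_pct.max():.1f}% MVC")
    ```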

  19. Surface electromyography in animal biomechanics: A systematic review.

    PubMed

    Valentin, Stephanie; Zsoldos, Rebeka R

    2016-06-01

    The study of muscle activity using surface electromyography (sEMG) is commonly used for investigations of the neuromuscular system in man. Although sEMG has faced methodological challenges, considerable technical advances have been made in the last few decades. Similarly, the field of animal biomechanics, including sEMG, has grown despite often complex experimental conditions. In human sEMG research, standardised protocols have been developed; these are lacking in animal sEMG. Before standards can be proposed for animal sEMG, the existing research should be collated and evaluated. Therefore the aim of this review is to systematically identify and summarise the literature in animal sEMG, focussing on (1) species, breeds, activities and muscles investigated, and (2) electrode placement and normalisation methods used. The databases PubMed, Web of Science, Scopus, and Vetmed Resource were searched systematically for sEMG studies in animals, and 38 articles were included in the final review. Data on methodological quality were collected and summarised. The findings from this systematic review indicate the divergence in animal sEMG methodology, and as a result, future steps required to develop standardisation in animal sEMG are proposed.

  20. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  1. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  2. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  3. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  4. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  5. Life Cycle Assessment and Carbon Footprint in the Wine Supply-Chain

    NASA Astrophysics Data System (ADS)

    Pattara, Claudio; Raggi, Andrea; Cichelli, Angelo

    2012-06-01

    Global warming represents one of the most critical internationally perceived environmental issues. The growing, and increasingly global, wine sector is one of the industries under increasing pressure to adopt approaches for environmental assessment and reporting of product-related greenhouse gas emissions. The International Organization for Vine and Wine has recently recognized the need to develop a standard, objective methodology and a related tool for calculating carbon footprint (CF). This study applied this tool to a wine previously analyzed using the life cycle assessment (LCA) methodology. The objective was to test the tool with regard to both its potential and its possible limitations, and thus to assess its suitability as a standard tool. Despite the tool's user-friendliness, a number of limitations were noted, including the lack of accurate baseline data, a partial system boundary, and the impossibility of dealing with the multi-functionality issue. When the CF and LCA results are compared in absolute terms, large discrepancies become obvious due to a number of different assumptions as well as the modeling framework adopted. Nonetheless, in relative terms the results seem quite consistent. However, a critical limitation of the CF methodology is its focus on a single issue, which can lead to burden shifting. In conclusion, the study confirmed the need both for further improvement and adaptation to additional contexts and for further studies to validate the use of this tool in different companies.
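
    At its core, the accounting such a CF tool performs is a sum of activity data multiplied by emission factors across life-cycle stages, as the generic sketch below illustrates. Every quantity and factor value is invented for illustration; none are OIV or study data.

    ```python
    EMISSION_FACTORS = {          # kg CO2e per unit, hypothetical values
        "diesel_l": 2.7,
        "electricity_kwh": 0.4,
        "glass_bottle_kg": 1.0,
        "fertilizer_kg_n": 6.0,
    }

    activity = {                  # per 0.75 l bottle of wine, hypothetical values
        "diesel_l": 0.05,
        "electricity_kwh": 0.3,
        "glass_bottle_kg": 0.5,
        "fertilizer_kg_n": 0.01,
    }

    # Carbon footprint = sum over stages of activity data x emission factor
    cf = sum(activity[k] * EMISSION_FACTORS[k] for k in activity)
    print(f"carbon footprint: {cf:.2f} kg CO2e per bottle")
    ```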

  6. Life cycle assessment and carbon footprint in the wine supply-chain.

    PubMed

    Pattara, Claudio; Raggi, Andrea; Cichelli, Angelo

    2012-06-01

    Global warming represents one of the most critical internationally perceived environmental issues. The growing, and increasingly global, wine sector is one of the industries under increasing pressure to adopt approaches for environmental assessment and reporting of product-related greenhouse gas emissions. The International Organization for Vine and Wine has recently recognized the need to develop a standard, objective methodology and a related tool for calculating carbon footprint (CF). This study applied this tool to a wine previously analyzed using the life cycle assessment (LCA) methodology. The objective was to test the tool with regard to both its potential and its possible limitations, and thus to assess its suitability as a standard tool. Despite the tool's user-friendliness, a number of limitations were noted, including the lack of accurate baseline data, a partial system boundary, and the impossibility of dealing with the multi-functionality issue. When the CF and LCA results are compared in absolute terms, large discrepancies become obvious due to a number of different assumptions as well as the modeling framework adopted. Nonetheless, in relative terms the results seem quite consistent. However, a critical limitation of the CF methodology is its focus on a single issue, which can lead to burden shifting. In conclusion, the study confirmed the need both for further improvement and adaptation to additional contexts and for further studies to validate the use of this tool in different companies.

  7. Research priorities: women in Africa.

    PubMed

    Okeyo, A P

    1979-01-01

    In December 1979, an Expert Meeting on Research and Data Collection on Women and Development was convened in Nairobi for the purpose of defining research priorities and methodological approaches for studying the role of African women in development. After reviewing current literature relevant to the subject matter, the participants developed a number of hypotheses regarding the impact of development activities on the role and status of women and recommended that these hypotheses be tested in future research. In general, agrarian reform, mechanization of agriculture, the introduction of cash cropping, and modernization were hypothesized as having a negative impact on the role, status, productive activities, and nutritional standards of women. Other hypotheses stated that development programs and agricultural extension services tended to neglect women. Recommended research methodologies included: 1) efforts to involve community members in the development and implementation of research projects undertaken in their communities; 2) increased use of local experts and community members in data collection; and 3) interdisciplinary collaboration. The participants also recommended that each country compile a statistical profile of its women. The profiles should include comparable information on: 1) fertility; 2) educational levels, employment status, and income levels of women; 3) household composition; and 4) types of services available to women.

  8. A systematic review and metaanalysis of energy intake and weight gain in pregnancy.

    PubMed

    Jebeile, Hiba; Mijatovic, Jovana; Louie, Jimmy Chun Yu; Prvan, Tania; Brand-Miller, Jennie C

    2016-04-01

    Gestational weight gain within the recommended range produces optimal pregnancy outcomes, yet many women exceed the guidelines. Official recommendations to increase energy intake by ∼1000 kJ/day in pregnancy may be excessive. The objective was to determine, by meta-analysis of relevant studies, whether greater increments in energy intake from early to late pregnancy corresponded to greater or excessive gestational weight gain. We systematically searched electronic databases for observational and intervention studies published from 1990 to the present. The databases included Ovid Medline, Cochrane Library, Excerpta Medica DataBASE (EMBASE), Cumulative Index to Nursing and Allied Health Literature (CINAHL), and Science Direct. In addition, we hand-searched the reference lists of all identified articles. Studies were included if they reported gestational weight gain and energy intake in early and late gestation in women of any age with a singleton pregnancy. The search also encompassed journals from both developed and developing countries. Studies were individually assessed for quality based on the Quality Criteria Checklist from the Evidence Analysis Manual: Steps in the Academy Evidence Analysis Process. Publication bias was assessed using a funnel plot of standardized mean difference against standard error. Identified studies were meta-analyzed, stratified by body mass index, study design, dietary methodology, and country status (developed/developing), using a random-effects model. Of 2487 articles screened, 18 studies met the inclusion criteria. On average, women gained 12.0 (2.8) kg (standardized mean difference = 1.306, P < .0005) yet reported only a small increment in energy intake that did not reach statistical significance (∼475 kJ/day, standardized mean difference = 0.266, P = .016). Irrespective of baseline body mass index, study design, dietary methodology, or country status, changes in energy intake were not significantly correlated with the amount of gestational weight gain (r = 0.321, P = .11). Despite rapid physiologic weight gain, women report little or no change in energy intake during pregnancy. Current recommendations to increase energy intake by ∼1000 kJ/day may, therefore, encourage excessive weight gain and adverse pregnancy outcomes.
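
    The random-effects pooling named above is commonly done DerSimonian-Laird style; a minimal sketch follows, with invented per-study effect sizes and standard errors rather than the review's data.

    ```python
    import numpy as np

    y = np.array([0.31, 0.10, 0.42, 0.22, 0.05])    # per-study standardized mean differences (invented)
    se = np.array([0.12, 0.15, 0.20, 0.10, 0.18])   # their standard errors (invented)

    w = 1 / se**2                                    # fixed-effect weights
    ybar = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - ybar) ** 2)                  # Cochran's Q heterogeneity statistic
    k = len(y)
    # DerSimonian-Laird between-study variance estimate
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_star = 1 / (se**2 + tau2)                      # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    pooled_se = np.sqrt(1 / np.sum(w_star))
    print(f"tau^2={tau2:.4f}, pooled SMD={pooled:.3f} (SE {pooled_se:.3f})")
    ```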

  9. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Texas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Texas. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.
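
    For the run of state cost-effectiveness records that follows, the underlying comparison can be sketched as a net-present-value calculation in the spirit of Scenario 1 (no borrowing or taxes). All numbers below are hypothetical, not the reports' results.

    ```python
    def lcc_per_sqft(added_first_cost, annual_energy_savings,
                     annual_maintenance_delta=0.0, years=30, discount_rate=0.03):
        """Added life-cycle cost of moving to the newer standard, $/ft^2.
        A negative result means the upgrade pays for itself over the period."""
        pv_factor = sum(1 / (1 + discount_rate) ** t for t in range(1, years + 1))
        pv_savings = (annual_energy_savings - annual_maintenance_delta) * pv_factor
        return added_first_cost - pv_savings

    # e.g. $1.20/ft^2 extra construction cost vs $0.10/ft^2/yr energy savings:
    print(f"added LCC: {lcc_per_sqft(1.20, 0.10):+.2f} $/ft^2")
    ```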

  10. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Minnesota

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Minnesota. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  11. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Indiana

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Indiana. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  12. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Florida

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Florida. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  13. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Maine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Maine. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  14. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Vermont

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Vermont. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  15. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Michigan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Michigan. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  16. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Alabama

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Alabama. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  17. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of New Hampshire

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of New Hampshire. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  18. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of New Mexico. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  19. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Colorado

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Colorado. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  20. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Washington

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Washington. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.
