Sample records for high methodological standards

  1. Standardized Laboratory Test Requirements for Hardening Equipment to Withstand Wave Impact Shock in Small High Speed Craft

    DTIC Science & Technology

    2017-02-06

    Engineering rationale, assumptions, and methodology for transitioning craft acceleration data to laboratory shock test requirements are summarized and example requirements for... Methodologies for Small High-Speed Craft Structure, Equipment, Shock Isolation Seats, and Human Performance At-Sea, 10th Symposium on High...

  2. Methodological aspects of clinical trials in tinnitus: A proposal for an international standard

    PubMed Central

    Landgrebe, Michael; Azevedo, Andréia; Baguley, David; Bauer, Carol; Cacace, Anthony; Coelho, Claudia; Dornhoffer, John; Figueiredo, Ricardo; Flor, Herta; Hajak, Goeran; van de Heyning, Paul; Hiller, Wolfgang; Khedr, Eman; Kleinjung, Tobias; Koller, Michael; Lainez, Jose Miguel; Londero, Alain; Martin, William H.; Mennemeier, Mark; Piccirillo, Jay; De Ridder, Dirk; Rupprecht, Rainer; Searchfield, Grant; Vanneste, Sven; Zeman, Florian; Langguth, Berthold

    2013-01-01

    Chronic tinnitus is a common condition with a high burden of disease. While many different treatments are used in clinical practice, the evidence for the efficacy of these treatments is low and the variance of treatment response between individuals is high. This is most likely due to the great heterogeneity of tinnitus with respect to clinical features as well as underlying pathophysiological mechanisms. There is a clear need to find effective treatment options in tinnitus, however, clinical trials differ substantially with respect to methodological quality and design. Consequently, the conclusions that can be derived from these studies are limited and jeopardize comparison between studies. Here, we discuss our view of the most important aspects of trial design in clinical studies in tinnitus and make suggestions for an international methodological standard in tinnitus trials. We hope that the proposed methodological standard will stimulate scientific discussion and will help to improve the quality of trials in tinnitus. PMID:22789414

  3. The Effectiveness of Educational Technology Applications for Enhancing Mathematics Achievement in K-12 Classrooms: A Meta-Analysis

    ERIC Educational Resources Information Center

    Cheung, Alan C. K.; Slavin, Robert E.

    2013-01-01

    The present review examines research on the effects of educational technology applications on mathematics achievement in K-12 classrooms. Unlike previous reviews, this review applies consistent inclusion standards to focus on studies that met high methodological standards. In addition, methodological and substantive features of the studies are…

  4. Validity of High-School Grades in Predicting Student Success beyond the Freshman Year: High-School Record vs. Standardized Tests as Indicators of Four-Year College Outcomes. Research & Occasional Paper Series: CSHE.6.07

    ERIC Educational Resources Information Center

    Geiser, Saul; Santelices, Maria Veronica

    2007-01-01

    High-school grades are often viewed as an unreliable criterion for college admissions, owing to differences in grading standards across high schools, while standardized tests are seen as methodologically rigorous, providing a more uniform and valid yardstick for assessing student ability and achievement. The present study challenges that…

  5. A taxonomy of multinational ethical and methodological standards for clinical trials of therapeutic interventions

    PubMed Central

    Ashton, Carol M; Wray, Nelda P; Jarman, Anna F; Kolman, Jacob M; Wenner, Danielle M; Brody, Baruch A

    2013-01-01

    Background: If trials of therapeutic interventions are to serve society’s interests, they must be of high methodological quality and must satisfy moral commitments to human subjects. The authors set out to develop a clinical-trials compendium in which standards for the ethical treatment of human subjects are integrated with standards for research methods. Methods: The authors rank-ordered the world’s nations and chose the 31 with >700 active trials as of 24 July 2008. Governmental and other authoritative entities of the 31 countries were searched, and 1004 English-language documents containing ethical and/or methodological standards for clinical trials were identified. The authors extracted standards from 144 of those: 50 designated as ‘core’, 39 addressing trials of invasive procedures and a 5% sample (N=55) of the remainder. As the integrating framework for the standards we developed a coherent taxonomy encompassing all elements of a trial’s stages. Findings: Review of the 144 documents yielded nearly 15 000 discrete standards. After duplicates were removed, 5903 substantive standards remained, distributed in the taxonomy as follows: initiation, 1401 standards, 8 divisions; design, 1869 standards, 16 divisions; conduct, 1473 standards, 8 divisions; analysing and reporting results, 997 standards, four divisions; and post-trial standards, 168 standards, 5 divisions. Conclusions: The overwhelming number of source documents and standards uncovered in this study was not anticipated beforehand and confirms the extraordinary complexity of the clinical trials enterprise. This taxonomy of multinational ethical and methodological standards may help trialists and overseers improve the quality of clinical trials, particularly given the globalisation of clinical research. PMID:21429960

  6. A taxonomy of multinational ethical and methodological standards for clinical trials of therapeutic interventions.

    PubMed

    Ashton, Carol M; Wray, Nelda P; Jarman, Anna F; Kolman, Jacob M; Wenner, Danielle M; Brody, Baruch A

    2011-06-01

    If trials of therapeutic interventions are to serve society's interests, they must be of high methodological quality and must satisfy moral commitments to human subjects. The authors set out to develop a clinical-trials compendium in which standards for the ethical treatment of human subjects are integrated with standards for research methods. The authors rank-ordered the world's nations and chose the 31 with >700 active trials as of 24 July 2008. Governmental and other authoritative entities of the 31 countries were searched, and 1004 English-language documents containing ethical and/or methodological standards for clinical trials were identified. The authors extracted standards from 144 of those: 50 designated as 'core', 39 addressing trials of invasive procedures and a 5% sample (N=55) of the remainder. As the integrating framework for the standards we developed a coherent taxonomy encompassing all elements of a trial's stages. Review of the 144 documents yielded nearly 15 000 discrete standards. After duplicates were removed, 5903 substantive standards remained, distributed in the taxonomy as follows: initiation, 1401 standards, 8 divisions; design, 1869 standards, 16 divisions; conduct, 1473 standards, 8 divisions; analysing and reporting results, 997 standards, four divisions; and post-trial standards, 168 standards, 5 divisions. The overwhelming number of source documents and standards uncovered in this study was not anticipated beforehand and confirms the extraordinary complexity of the clinical trials enterprise. This taxonomy of multinational ethical and methodological standards may help trialists and overseers improve the quality of clinical trials, particularly given the globalisation of clinical research.

  7. Seven Performance Drivers.

    ERIC Educational Resources Information Center

    Ross, Linda

    2003-01-01

    Recent work with automotive e-commerce clients led to the development of a performance analysis methodology called the Seven Performance Drivers, including: standards, incentives, capacity, knowledge and skill, measurement, feedback, and analysis. This methodology has been highly effective in introducing and implementing performance improvement.…

  8. Methodological framework for heart rate variability analysis during exercise: application to running and cycling stress testing.

    PubMed

    Hernando, David; Hernando, Alberto; Casajús, Jose A; Laguna, Pablo; Garatachea, Nuria; Bailón, Raquel

    2018-05-01

    Standard methodologies for heart rate variability analysis and its physiological interpretation as a marker of autonomic nervous system condition have largely been published for rest conditions, but much less so for exercise. A methodological framework for heart rate variability (HRV) analysis during exercise is proposed, which deals with the non-stationary nature of HRV during exercise, includes respiratory information, and identifies and corrects spectral components related to cardiolocomotor coupling (CC). The framework is applied to 23 male subjects who underwent maximal and submaximal running and cycling tests, during which the ECG, respiratory frequency and oxygen consumption were simultaneously recorded. High-frequency (HF) power estimates obtained with the proposed methodology differ markedly from those obtained with the standard fixed band: for medium and high exercise levels and during recovery, HF power is 20 to 40% higher. When cycling, HF power increases by around 40% with respect to running, while CC power is around 20% stronger in running.
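
    To make the band-selection issue above concrete, here is a minimal Python sketch (synthetic RR-tachogram data and an assumed respiratory frequency, not the authors' signal-processing pipeline) comparing HF power integrated over the standard fixed 0.15-0.4 Hz band with HF power integrated over a band centred on the respiratory frequency:

      import numpy as np
      from scipy.signal import welch

      fs = 4.0                        # Hz, resampling rate for the RR tachogram (assumed)
      t = np.arange(0, 300, 1 / fs)   # 5-minute segment
      f_resp = 0.45                   # Hz, respiratory frequency during exercise (hypothetical)

      # Hypothetical HRV signal: LF component + respiration-driven HF component + noise
      hrv = (0.03 * np.sin(2 * np.pi * 0.10 * t)
             + 0.02 * np.sin(2 * np.pi * f_resp * t)
             + 0.005 * np.random.randn(t.size))

      f, pxx = welch(hrv, fs=fs, nperseg=512)

      def band_power(f, pxx, lo, hi):
          m = (f >= lo) & (f <= hi)
          return np.trapz(pxx[m], f[m])

      hf_fixed = band_power(f, pxx, 0.15, 0.40)                        # standard fixed HF band
      hf_resp = band_power(f, pxx, f_resp - 0.125, f_resp + 0.125)     # band centred on respiration
      print(hf_fixed, hf_resp)   # the fixed band misses the respiration-driven power at 0.45 Hz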

  9. Most systematic reviews of high methodological quality on psoriasis interventions are classified as high risk of bias using ROBIS tool.

    PubMed

    Gómez-García, Francisco; Ruano, Juan; Gay-Mimbrera, Jesus; Aguilar-Luque, Macarena; Sanz-Cabanillas, Juan Luis; Alcalde-Mellado, Patricia; Maestre-López, Beatriz; Carmona-Fernández, Pedro Jesús; González-Padilla, Marcelino; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz

    2017-12-01

    No gold standard exists to assess methodological quality of systematic reviews (SRs). Although Assessing the Methodological Quality of Systematic Reviews (AMSTAR) is widely accepted for analyzing quality, the ROBIS instrument has recently been developed. This study aimed to compare the capacity of both instruments to capture the quality of SRs concerning psoriasis interventions. Systematic literature searches were undertaken on relevant databases. For each review, methodological quality and bias risk were evaluated using the AMSTAR and ROBIS tools. Descriptive and principal component analyses were conducted to describe similarities and discrepancies between both assessment tools. We classified 139 intervention SRs as displaying high/moderate/low methodological quality and as high/low risk of bias. A high risk of bias was detected for most SRs classified as displaying high or moderate methodological quality by AMSTAR. When comparing ROBIS result profiles, responses to domain 4 signaling questions showed the greatest differences between bias risk assessments, whereas domain 2 items showed the least. When considering SRs published about psoriasis, methodological quality remains suboptimal, and the risk of bias is elevated, even for SRs exhibiting high methodological quality. Furthermore, the AMSTAR and ROBIS tools may be considered as complementary when conducting quality assessment of SRs. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Standards and Methodologies for Characterizing Radiobiological Impact of High-Z Nanoparticles

    PubMed Central

    Subiel, Anna; Ashmore, Reece; Schettino, Giuseppe

    2016-01-01

    Research on the application of high-Z nanoparticles (NPs) in cancer treatment and diagnosis has recently been the subject of growing interest, with much promise being shown with regards to a potential transition into clinical practice. In spite of numerous publications related to the development and application of nanoparticles for use with ionizing radiation, the literature is lacking coherent and systematic experimental approaches to fully evaluate the radiobiological effectiveness of NPs, validate mechanistic models and allow direct comparison of the studies undertaken by various research groups. The lack of standards and established methodology is commonly recognised as a major obstacle for the transition of innovative research ideas into clinical practice. This review provides a comprehensive overview of radiobiological techniques and quantification methods used in in vitro studies on high-Z nanoparticles and aims to provide recommendations for future standardization for NP-mediated radiation research. PMID:27446499

  11. ASTM and VAMAS activities in titanium matrix composites test methods development

    NASA Technical Reports Server (NTRS)

    Johnson, W. S.; Harmon, D. M.; Bartolotta, P. A.; Russ, S. M.

    1994-01-01

    Titanium matrix composites (TMC's) are being considered for a number of aerospace applications ranging from high performance engine components to airframe structures in areas that require high stiffness to weight ratios at temperatures up to 400 C. TMC's exhibit unique mechanical behavior due to fiber-matrix interface failures, matrix cracks bridged by fibers, thermo-viscoplastic behavior of the matrix at elevated temperatures, and the development of significant thermal residual stresses in the composite due to fabrication. Standard testing methodology must be developed to reflect the uniqueness of this type of material systems. The purpose of this paper is to review the current activities in ASTM and Versailles Project on Advanced Materials and Standards (VAMAS) that are directed toward the development of standard test methodology for titanium matrix composites.

  12. A web based health technology assessment in tele-echocardiography: the experience within an Italian project.

    PubMed

    Giansanti, Daniele; Morelli, Sandra; Maccioni, Giovanni; Guerriero, Lorenzo; Bedini, Remo; Pepe, Gennaro; Colombo, Cesare; Borghi, Gabriella; Macellari, Velio

    2009-01-01

    Due to major advances in information technology, telemedicine applications are ready for widespread use. Nonetheless, to allow their diffusion in National Health Care Systems (NHCSs), specific methodologies of health technology assessment (HTA) should be used to assess standardization, overall quality, interoperability, and legal, economic, and cost-benefit aspects. One of the limits to the diffusion of digital tele-echocardiography (T-E) applications in the NHCS is the lack of a specific methodology for HTA. In the present study, a solution offering a structured HTA of T-E products was designed. The methodology also provided a definition of standardized quality levels for the application: the first level represents the minimum level of acceptance, while the other levels are accessory levels useful for a more accurate assessment of the product. The methodology proved useful in rationalizing the process of standardization and received a high degree of acceptance from the subjects involved in the study.

  13. How recalibration method, pricing, and coding affect DRG weights

    PubMed Central

    Carter, Grace M.; Rogowski, Jeannette A.

    1992-01-01

    We compared diagnosis-related group (DRG) weights calculated using the hospital-specific relative-value (HSRV) methodology with those calculated using the standard methodology for each year from 1985 through 1989 and analyzed differences between the two methods in detail for 1989. We provide evidence suggesting that classification error and subsidies of higher weighted cases by lower weighted cases caused compression in the weights used for payment as late as the fifth year of the prospective payment system. However, later weights calculated by the standard method are not compressed because a statistical correlation between high markups and high case-mix indexes offsets the cross-subsidization. HSRV weights from the same files are compressed because this methodology is more sensitive to cross-subsidies. However, both sets of weights produce equally good estimates of hospital-level costs net of those expenses that are paid by outlier payments. The greater compression of the HSRV weights is counterbalanced by the fact that more high-weight cases qualify as outliers. PMID:10127456

  14. Who Gets In, and Who Doesn't? Selecting Medical Students: An Australian Case Study

    ERIC Educational Resources Information Center

    Mercer, Annette

    2009-01-01

    Medicine is a profession with a long history of research and high standards of methodological rigour and evidence-based decision making. These standards are being transferred into medical education. Universities are starting to collect high-quality data on their admissions systems. This has not been a short-term proposition. Medical courses are…

  15. Summary Brief: International Baccalaureate Standards Development and Alignment Project

    ERIC Educational Resources Information Center

    Conley, David T.; Ward, Terri

    2009-01-01

    Although the International Baccalaureate (IB) Diploma Programme is offered by many high schools in the United States and considered to be challenging and rich in content, the curriculum has not been analyzed to determine its alignment with college readiness standards or state educational standards in the U.S. The research methodology employed by…

  16. Stochastic capture zone analysis of an arsenic-contaminated well using the generalized likelihood uncertainty estimator (GLUE) methodology

    NASA Astrophysics Data System (ADS)

    Morse, Brad S.; Pohll, Greg; Huntington, Justin; Rodriguez Castillo, Ramiro

    2003-06-01

    In 1992, Mexican researchers discovered concentrations of arsenic in excess of World Health Organization (WHO) standards in several municipal wells in the Zimapan Valley of Mexico. This study describes a method to delineate a capture zone for one of the most highly contaminated wells to aid in future well siting. A stochastic approach was used to model the capture zone because of the high level of uncertainty in several input parameters. Two stochastic techniques were performed and compared: "standard" Monte Carlo analysis and the generalized likelihood uncertainty estimator (GLUE) methodology. The GLUE procedure differs from standard Monte Carlo analysis in that it incorporates a goodness of fit (termed a likelihood measure) in evaluating the model. This allows for more information (in this case, head data) to be used in the uncertainty analysis, resulting in smaller prediction uncertainty. Two likelihood measures are tested in this study to determine which are in better agreement with the observed heads. While the standard Monte Carlo approach does not aid in parameter estimation, the GLUE methodology indicates best fit models when hydraulic conductivity is approximately 10^-6.5 m/s, with vertically isotropic conditions and large quantities of interbasin flow entering the basin. Probabilistic isochrones (capture zone boundaries) are then presented, and as predicted, the GLUE-derived capture zones are significantly smaller in area than those from the standard Monte Carlo approach.
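
    A minimal, generic sketch of the GLUE idea described above, using a hypothetical one-parameter forward model and synthetic head observations rather than the authors' groundwater model: behavioural parameter sets are retained and weighted by a likelihood measure based on fit to the observed heads, whereas plain Monte Carlo treats all sampled sets equally.

      import numpy as np

      rng = np.random.default_rng(0)

      def simulate_heads(log10_K):
          """Stand-in forward model: maps log10 hydraulic conductivity to heads at 5 wells (hypothetical)."""
          x = np.linspace(0.0, 1.0, 5)
          return 10.0 + (log10_K + 6.5) * x      # arbitrary illustrative relationship

      obs = simulate_heads(-6.5) + rng.normal(0, 0.05, 5)   # synthetic head observations

      # Monte Carlo sampling of the uncertain parameter
      samples = rng.uniform(-8.0, -5.0, 5000)

      # GLUE likelihood measure: inverse error variance, with a behavioural threshold
      def likelihood(log10_K):
          err = simulate_heads(log10_K) - obs
          return 1.0 / np.var(err)

      L = np.array([likelihood(s) for s in samples])
      behavioural = L > np.quantile(L, 0.90)          # keep the best-fitting 10% (threshold is a choice)
      w = L[behavioural] / L[behavioural].sum()       # normalised GLUE weights

      # Weighted (GLUE) vs unweighted (plain Monte Carlo) parameter estimates
      print("GLUE weighted mean log10 K:", np.sum(w * samples[behavioural]))
      print("Plain Monte Carlo mean log10 K:", samples.mean())

    In a capture-zone application the same weights would be applied to the predicted isochrones, which is why the GLUE-derived zones end up tighter than the unweighted Monte Carlo ones.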

  17. Applications of cost-effectiveness methodologies in behavioral medicine.

    PubMed

    Kaplan, Robert M; Groessl, Erik J

    2002-06-01

    In 1996, the Panel on Cost-Effectiveness in Health and Medicine developed standards for cost-effectiveness analysis. The standards include the use of a societal perspective, that treatments be evaluated in comparison with the best available alternative (rather than with no care at all), and that health benefits be expressed in standardized units. Guidelines for cost accounting were also offered. Among 24,562 references on cost-effectiveness in Medline between 1995 and 2000, only a handful were relevant to behavioral medicine. Only 19 studies published between 1983 and 2000 met criteria for further evaluation. Among analyses that were reported, only 2 studies were found consistent with the Panel's criteria for high-quality analyses, although more recent studies were more likely to meet methodological standards. There are substantial opportunities to advance behavioral medicine by performing standardized cost-effectiveness analyses.
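
    For context, the central quantity in such standardized analyses is the incremental cost-effectiveness ratio (ICER) of a treatment relative to the best available alternative; a toy calculation with purely hypothetical figures:

      # Hypothetical illustration only: costs in dollars, effects in quality-adjusted life years (QALYs)
      cost_new, cost_alt = 12_000.0, 9_000.0
      qaly_new, qaly_alt = 2.10, 1.95

      icer = (cost_new - cost_alt) / (qaly_new - qaly_alt)
      print(f"ICER: ${icer:,.0f} per QALY gained")   # (12000 - 9000) / (2.10 - 1.95) = $20,000/QALY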

  18. 77 FR 39287 - Self-Regulatory Organizations; Chicago Mercantile Exchange, Inc.; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-02

    ... Change To Adopt Changes That Would Affect Its Standard Portfolio Analysis of Risk Methodology for Certain... Rule Change CME proposes to adopt certain changes that would affect its Standard Portfolio Analysis of... calibrates the risk of portfolios, consisting of positions in highly similar and correlated futures and...

  19. Detection of beryllium in digested autopsy tissues by inductively coupled plasma mass spectrometry using a high matrix interface configuration.

    PubMed

    Larivière, Dominic; Tremblay, Mélodie; Durand-Jézéquel, Myriam; Tolmachev, Sergei

    2012-04-01

    This article describes a robust methodology using the combination of instrumental design (high matrix interface-HMI), sample dilution and internal standardization for the quantification of beryllium (Be) in various digested autopsy tissues using inductively coupled plasma mass spectrometry. The applicability of rhodium as a proper internal standard for Be was demonstrated in three types of biological matrices (i.e., femur, hair, lung tissues). Using HMI, it was possible to achieve instrumental detection limits and sensitivity of 0.6 ng L(-1) and 157 cps L ng(-1), respectively. Resilience to high salt matrices of the HMI setup was also highlighted using bone mimicking solution ([Ca(2+)] = 26 to 1,400 mg L(-1)), providing a 14-fold increase in tolerance and a 2.7-fold decrease in method detection limit compared to optimized experimental conditions obtained without the HMI configuration. Precision of the methodology to detect low levels of Be in autopsy samples was demonstrated using hair and blood certified reference materials. Be concentrations ranging from 0.015 to 255 μg kg(-1) in autopsy samples obtained from the U.S. Transuranium and Uranium Registries were measured using the methodology presented.
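
    A schematic of ratio-based internal standardization as described above, with hypothetical intensities (rhodium as the internal standard): the analyte signal is normalized to the internal-standard signal before calibration, which compensates for matrix-induced signal drift or suppression.

      import numpy as np

      # Hypothetical calibration standards: Be concentration (ng/L) and raw ICP-MS intensities (cps)
      conc_std = np.array([0.0, 5.0, 10.0, 25.0, 50.0])
      be_counts = np.array([30., 810., 1600., 3950., 7900.])
      rh_counts = np.array([1.00e5, 0.99e5, 1.01e5, 1.00e5, 0.98e5])   # internal standard (Rh)

      ratio = be_counts / rh_counts                       # normalize analyte to the internal standard
      slope, intercept = np.polyfit(conc_std, ratio, 1)   # linear calibration on the ratio

      # Unknown sample: suppose the matrix suppresses both Be and Rh signals by ~20%
      be_sample, rh_sample = 2500.0 * 0.8, 1.0e5 * 0.8
      conc_sample = (be_sample / rh_sample - intercept) / slope
      print(f"Estimated Be concentration: {conc_sample:.1f} ng/L")   # ratio is unaffected by the suppression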

  20. Ultra-high-performance liquid chromatography/tandem high-resolution mass spectrometry analysis of sixteen red beverages containing carminic acid: identification of degradation products by using principal component analysis/discriminant analysis.

    PubMed

    Gosetti, Fabio; Chiuminatto, Ugo; Mazzucco, Eleonora; Mastroianni, Rita; Marengo, Emilio

    2015-01-15

    The study investigates the sunlight photodegradation process of carminic acid, a natural red colourant used in beverages. For this purpose, both carminic acid aqueous standard solutions and sixteen different commercial beverages, ten containing carminic acid and six containing E120 dye, were subjected to photoirradiation. The results show different patterns of degradation, not only between the standard solutions and the beverages, but also from beverage to beverage. Due to the different beverage recipes, unpredictable reactions take place between the dye and the other ingredients. To identify the dye degradation products in a very complex scenario, a methodology was used, based on the combined use of principal component analysis with discriminant analysis and ultra-high-performance liquid chromatography coupled with tandem high resolution mass spectrometry. The methodology is unaffected by beverage composition and allows the degradation products of carminic acid dye to be identified for each beverage. Copyright © 2014 Elsevier Ltd. All rights reserved.
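
    A minimal sketch of the chemometric step described above, on a hypothetical peak-area matrix rather than the authors' dataset: PCA reduces the UHPLC-MS feature space, and a discriminant analysis on the scores separates samples before and after irradiation; features that drive the separation point to candidate degradation products.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(1)

      # Hypothetical data: 20 samples x 50 chromatographic features (peak areas)
      X = rng.normal(0.0, 1.0, (20, 50))
      y = np.array([0] * 10 + [1] * 10)     # 0 = before irradiation, 1 = after
      X[y == 1, 3] += 2.5                   # feature 3 mimics a growing degradation product

      scores = PCA(n_components=5).fit_transform(X)      # dimensionality reduction
      lda = LinearDiscriminantAnalysis().fit(scores, y)  # discriminant analysis on the PCA scores
      print("Separation accuracy on training data:", lda.score(scores, y))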

  1. Frequency of data extraction errors and methods to increase data extraction quality: a methodological review.

    PubMed

    Mathes, Tim; Klaßen, Pauline; Pieper, Dawid

    2017-11-28

    Our objective was to assess the frequency of data extraction errors and its potential impact on results in systematic reviews. Furthermore, we evaluated the effect of different extraction methods, reviewer characteristics and reviewer training on error rates and results. We performed a systematic review of methodological literature in PubMed, Cochrane methodological registry, and by manual searches (12/2016). Studies were selected by two reviewers independently. Data were extracted in standardized tables by one reviewer and verified by a second. The analysis included six studies; four studies on extraction error frequency, one study comparing different reviewer extraction methods and two studies comparing different reviewer characteristics. We did not find a study on reviewer training. There was a high rate of extraction errors (up to 50%). Errors often had an influence on effect estimates. Different data extraction methods and reviewer characteristics had moderate effect on extraction error rates and effect estimates. The evidence base for established standards of data extraction seems weak despite the high prevalence of extraction errors. More comparative studies are needed to get deeper insights into the influence of different extraction methods.

  2. Vertically aligned carbon nanotubes for microelectrode arrays applications.

    PubMed

    Castro Smirnov, J R; Jover, Eric; Amade, Roger; Gabriel, Gemma; Villa, Rosa; Bertran, Enric

    2012-09-01

    In this work, a methodology to fabricate carbon nanotube-based electrodes using plasma-enhanced chemical vapour deposition has been explored and defined. The final integrated microelectrode-based devices should present specific properties that make them suitable for microelectrode array applications. The methodology studied has been focused on the preparation of a highly regular and dense vertically aligned carbon nanotube (VACNT) mat compatible with the standard lithography used in microelectrode array technology.

  3. Rapid quantitative chemical mapping of surfaces with sub-2 nm resolution

    NASA Astrophysics Data System (ADS)

    Lai, Chia-Yun; Perri, Saverio; Santos, Sergio; Garcia, Ricardo; Chiesa, Matteo

    2016-05-01

    We present a theory that exploits four observables in bimodal atomic force microscopy to produce maps of the Hamaker constant H. The quantitative H maps may be employed by the broader community to directly interpret the high resolution of standard bimodal AFM images as chemical maps while simultaneously quantifying chemistry in the non-contact regime. We further provide a simple methodology to optimize a range of operational parameters for which H is in the closest agreement with the Lifshitz theory in order to (1) simplify data acquisition and (2) generalize the methodology to any set of cantilever-sample systems. Electronic supplementary information (ESI) available. See DOI: 10.1039/c6nr00496b

  4. Methods for the guideline-based development of quality indicators--a systematic review

    PubMed Central

    2012-01-01

    Background: Quality indicators (QIs) are used in many healthcare settings to measure, compare, and improve quality of care. For the efficient development of high-quality QIs, rigorous, approved, and evidence-based development methods are needed. Clinical practice guidelines are a suitable source to derive QIs from, but no gold standard for guideline-based QI development exists. This review aims to identify, describe, and compare methodological approaches to guideline-based QI development. Methods: We systematically searched medical literature databases (Medline, EMBASE, and CINAHL) and grey literature. Two researchers selected publications reporting methodological approaches to guideline-based QI development. In order to describe and compare methodological approaches used in these publications, we extracted detailed information on common steps of guideline-based QI development (topic selection, guideline selection, extraction of recommendations, QI selection, practice test, and implementation) to predesigned extraction tables. Results: From 8,697 hits in the database search and several grey literature documents, we selected 48 relevant references. The studies were of heterogeneous type and quality. We found no randomized controlled trial or other studies comparing the ability of different methodological approaches to guideline-based development to generate high-quality QIs. The relevant publications featured a wide variety of methodological approaches to guideline-based QI development, especially regarding guideline selection and extraction of recommendations. Only a few studies reported patient involvement. Conclusions: Further research is needed to determine which elements of the methodological approaches identified, described, and compared in this review are best suited to constitute a gold standard for guideline-based QI development. For this research, we provide a comprehensive groundwork. PMID:22436067

  5. Implementing the PAIN RelieveIt Randomized Controlled Trial in Hospice Care: Mechanisms for Success and Meeting PCORI Methodology Standards.

    PubMed

    Ezenwa, Miriam O; Suarez, Marie L; Carrasco, Jesus D; Hipp, Theresa; Gill, Anayza; Miller, Jacob; Shea, Robert; Shuey, David; Zhao, Zhongsheng; Angulo, Veronica; McCurry, Timothy; Martin, Joanna; Yao, Yingwei; Molokie, Robert E; Wang, Zaijie Jim; Wilkie, Diana J

    2017-07-01

    The purpose of this article is to describe how we adhere to the Patient-Centered Outcomes Research Institute's (PCORI) methodology standards relevant to the design and implementation of our PCORI-funded study, the PAIN RelieveIt Trial. We present details of the PAIN RelieveIt Trial organized by the PCORI methodology standards and components that are relevant to our study. The PAIN RelieveIt Trial adheres to four PCORI standards and 21 subsumed components. The four standards include standards for formulating research questions, standards associated with patient centeredness, standards for data integrity and rigorous analyses, and standards for preventing and handling missing data. In the past 24 months, we screened 2,837 cancer patients and their caregivers; 874 dyads were eligible; 223.5 dyads consented and provided baseline data. Only 55 patients were lost to follow-up, a 25% attrition rate. The design and implementation of the PAIN RelieveIt Trial adhered to PCORI's methodology standards for research rigor.

  6. Determining radiated sound power of building structures by means of laser Doppler vibrometry

    NASA Astrophysics Data System (ADS)

    Roozen, N. B.; Labelle, L.; Rychtáriková, M.; Glorieux, C.

    2015-06-01

    This paper introduces a methodology that makes use of laser Doppler vibrometry to assess the acoustic insulation performance of a building element. The sound power radiated by the surface of the element is numerically determined from the vibrational pattern, offering an alternative for classical microphone measurements. Compared to the latter the proposed analysis is not sensitive to room acoustical effects. This allows the proposed methodology to be used at low frequencies, where the standardized microphone based approach suffers from a high uncertainty due to a low acoustic modal density. Standardized measurements as well as laser Doppler vibrometry measurements and computations have been performed on two test panels, a light-weight wall and a gypsum block wall and are compared and discussed in this paper. The proposed methodology offers an adequate solution for the assessment of the acoustic insulation of building elements at low frequencies. This is crucial in the framework of recent proposals of acoustic standards for measurement approaches and single number sound insulation performance ratings to take into account frequencies down to 50 Hz.
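
    A simplified version of the final step, assuming the textbook relation W = rho * c * S * sigma * <v^2> with an assumed radiation efficiency sigma (the paper instead evaluates the radiated power numerically from the measured vibration pattern): sound power is estimated from the surface-averaged mean-square normal velocity measured by the vibrometer.

      import numpy as np

      def radiated_sound_power(v_rms_points, area_m2, sigma=1.0, rho=1.21, c=343.0):
          """Radiated power W = rho * c * S * sigma * <v^2>, where <v^2> is the surface-averaged
          mean-square normal velocity (simplified model with an assumed radiation efficiency sigma)."""
          v2_mean = np.mean(np.asarray(v_rms_points) ** 2)
          return rho * c * area_m2 * sigma * v2_mean

      # Hypothetical LDV scan: RMS normal velocities (m/s) at grid points on a 10 m^2 wall
      v_rms = [2e-4, 3e-4, 1.5e-4, 2.5e-4]
      W = radiated_sound_power(v_rms, area_m2=10.0, sigma=0.3)
      print(f"Radiated sound power: {W:.2e} W ({10 * np.log10(W / 1e-12):.1f} dB re 1 pW)")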

  7. Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation

    DTIC Science & Technology

    2016-05-01

    ...identifying and mapping flaw size distributions on glass surfaces for predicting mechanical response. International Journal of Applied Glass... ARL-TN-0756, May 2016, US Army Research Laboratory: Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation, by Clayton M Weiss, Oak Ridge Institute for Science and Education.

  8. A hybrid design methodology for structuring an Integrated Environmental Management System (IEMS) for shipping business.

    PubMed

    Celik, Metin

    2009-03-01

    The International Safety Management (ISM) Code defines a broad framework for the safe management and operation of merchant ships, maintaining high standards of safety and environmental protection. On the other hand, ISO 14001:2004 provides a generic, worldwide environmental management standard that has been utilized by several industries. Both the ISM Code and ISO 14001:2004 have the practical goal of establishing a sustainable Integrated Environmental Management System (IEMS) for shipping businesses. This paper presents a hybrid design methodology that shows how requirements from both standards can be combined into a single execution scheme. Specifically, the Analytic Hierarchy Process (AHP) and Fuzzy Axiomatic Design (FAD) are used to structure an IEMS for ship management companies. This research provides decision aid to maritime executives in order to enhance the environmental performance in the shipping industry.

  9. Best Available Evidence: Three Complementary Approaches

    ERIC Educational Resources Information Center

    Slocum, Timothy A.; Spencer, Trina D.; Detrich, Ronnie

    2012-01-01

    The best available evidence is one of the three critical features of evidence-based practice. Best available evidence is often considered to be synonymous with extremely high standards for research methodology. However, this notion may limit the scope and impact of evidence based practice to those educational decisions on which high quality…

  10. Multidimensional Scaling of High School Students' Perceptions of Academic Dishonesty

    ERIC Educational Resources Information Center

    Schmelkin, Liora Pedhazur; Gilbert, Kimberly A.; Silva, Rebecca

    2010-01-01

    Although cheating on tests and other forms of academic dishonesty are considered rampant, no standard definition of academic dishonesty exists. The current study was conducted to investigate the perceptions of academic dishonesty in high school students, utilizing an innovative methodology, multidimensional scaling (MDS). Two methods were used to…

  11. Methodological standards and patient-centeredness in comparative effectiveness research: the PCORI perspective.

    PubMed

    2012-04-18

    Rigorous methodological standards help to ensure that medical research produces information that is valid and generalizable, and are essential in patient-centered outcomes research (PCOR). Patient-centeredness refers to the extent to which the preferences, decision-making needs, and characteristics of patients are addressed, and is the key characteristic differentiating PCOR from comparative effectiveness research. The Patient Protection and Affordable Care Act signed into law in 2010 created the Patient-Centered Outcomes Research Institute (PCORI), which includes an independent, federally appointed Methodology Committee. The Methodology Committee is charged to develop methodological standards for PCOR. The 4 general areas identified by the committee in which standards will be developed are (1) prioritizing research questions, (2) using appropriate study designs and analyses, (3) incorporating patient perspectives throughout the research continuum, and (4) fostering efficient dissemination and implementation of results. A Congressionally mandated PCORI methodology report (to be issued in its first iteration in May 2012) will begin to provide standards in each of these areas, and will inform future PCORI funding announcements and review criteria. The work of the Methodology Committee is intended to enable generation of information that is relevant and trustworthy for patients, and to enable decisions that improve patient-centered outcomes.

  12. The effect of instructional methodology on high school students' natural sciences standardized test scores

    NASA Astrophysics Data System (ADS)

    Powell, P. E.

    Educators have recently come to consider inquiry-based instruction a more effective method of instruction than didactic instruction. Experience-based learning theory suggests that student performance is linked to teaching method. However, research is limited on inquiry teaching and its effectiveness in preparing students to perform well on standardized tests. The purpose of the study was to investigate whether one of these two teaching methodologies was more effective in increasing student performance on standardized science tests. The quasi-experimental quantitative study comprised two stages. Stage 1 used a survey to identify the teaching methods of a convenience sample of 57 teacher participants and determined the level of inquiry used in instruction to place participants into instructional groups (the independent variable). Stage 2 used analysis of covariance (ANCOVA) to compare posttest scores on a standardized exam by teaching method. Additional analyses were conducted to examine differences in science achievement by ethnicity, gender, and socioeconomic status by teaching methodology. Results demonstrated a statistically significant gain in test scores when students were taught using inquiry-based instruction. Subpopulation analyses indicated all groups showed improved mean standardized test scores except African American students. The findings benefit teachers and students by presenting data supporting a method of content delivery that increases teacher efficacy and produces students with a greater cognition of science content that meets the school's mission and goals.
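
    A minimal sketch of the Stage 2 analysis described above, on synthetic scores rather than the study data, with a pretest covariate assumed for illustration: ANCOVA compares posttest scores across instructional groups while adjusting for the covariate.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(7)
      n = 60

      # Synthetic data: a small benefit added for the inquiry-based group
      df = pd.DataFrame({
          "method": np.repeat(["didactic", "inquiry"], n // 2),
          "pretest": rng.normal(50, 10, n),
      })
      df["posttest"] = (0.8 * df["pretest"]
                        + np.where(df["method"] == "inquiry", 5.0, 0.0)
                        + rng.normal(0, 5, n))

      model = smf.ols("posttest ~ C(method) + pretest", data=df).fit()   # ANCOVA as a linear model
      print(sm.stats.anova_lm(model, typ=2))                             # F-test for the group effect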

  13. Assessment of capillary suction time (CST) test methodologies.

    PubMed

    Sawalha, O; Scholz, M

    2007-12-01

    The capillary suction time (CST) test is a commonly used method to measure the filterability and the ease of removing moisture from slurry and sludge in numerous environmental and industrial applications. This study assessed several novel alterations of both the test methodology and the current standard capillary suction time (CST) apparatus. Twelve different papers including the standard Whatman No. 17 chromatographic paper were tested. The tests were run using four different types of sludge including a synthetic sludge, which was specifically developed for benchmarking purposes. The standard apparatus was altered by the introduction of a novel rectangular funnel instead of a standard circular one. A stirrer was also introduced to solve the problem of test inconsistency (e.g. high CST variability), particularly for heavy types of sludge. Results showed that several alternative papers, which are cheaper than the standard paper, can be used to estimate CST values accurately, and that the test repeatability can be improved in many cases and for different types of sludge. The introduction of the rectangular funnel demonstrated an obvious enhancement of test repeatability. The use of a stirrer to avoid sedimentation of heavy sludge did not have a statistically significant impact on the CST values or the corresponding data variability. The application of synthetic sludge can support the testing of experimental methodologies and should be used for subsequent benchmarking purposes.

  14. High-Penetration Photovoltaic Planning Methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, David Wenzhong; Muljadi, Eduard; Tian, Tian

    The main objective of this report is to provide an overview of select U.S. utility methodologies for performing high-penetration photovoltaic (HPPV) system planning and impact studies. This report covers the Federal Energy Regulatory Commission's orders related to photovoltaic (PV) power system interconnection, particularly the interconnection processes for the Large Generation Interconnection Procedures and Small Generation Interconnection Procedures. In addition, it includes U.S. state interconnection standards and procedures. The procedures used by these regulatory bodies consider the impacts of HPPV power plants on the networks. Technical interconnection requirements for HPPV voltage regulation include aspects of power monitoring, grounding, synchronization, connection to the overall distribution system, back-feeds, disconnecting means, abnormal operating conditions, and power quality. This report provides a summary of mitigation strategies to minimize the impact of HPPV. Recommendations and revisions to the standards may take place as the penetration level of renewables on the grid increases and new technologies develop in future years.

  15. An Analysis Methodology for the Gamma-ray Large Area Space Telescope

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.; Cohen-Tanugi, Johann

    2004-01-01

    The Large Area Telescope (LAT) instrument on the Gamma Ray Large Area Space Telescope (GLAST) has been designed to detect high-energy gamma rays and determine their direction of incidence and energy. We propose a reconstruction algorithm based on recent advances in statistical methodology. This method, an alternative to the standard event analysis inherited from high-energy collider physics experiments, incorporates more accurately the physical processes occurring in the detector and makes full use of the statistical information available. It could thus provide a better estimate of the direction and energy of the primary photon.

  16. The Effectiveness of Educational Technology Applications for Enhancing Reading Achievement in K-12 Classrooms: A Meta-Analysis. Educator's Summary. Revised

    ERIC Educational Resources Information Center

    Center for Research and Reform in Education, 2012

    2012-01-01

    This review examines research on the effects of technology use on reading achievement in K-12 classrooms. It applies consistent inclusion standards to focus on studies that met high methodological standards. A total of 84 qualified studies based on over 60,000 K-12 participants were included in the final analysis. Four major categories of…

  17. Investigation of Science Inquiry Items for Use on an Alternate Assessment Based on Modified Achievement Standards Using Cognitive Lab Methodology

    ERIC Educational Resources Information Center

    Dickenson, Tammiee S.; Gilmore, Joanna A.; Price, Karen J.; Bennett, Heather L.

    2013-01-01

    This study evaluated the benefits of item enhancements applied to science-inquiry items for incorporation into an alternate assessment based on modified achievement standards for high school students. Six items were included in the cognitive lab sessions involving both students with and without disabilities. The enhancements (e.g., use of visuals,…

  18. Low- and High-Achieving Sixth-Grade Students' Access to Participation during Mathematics Discourse

    ERIC Educational Resources Information Center

    Lack, Brian; Swars, Susan Lee; Meyers, Barbara

    2014-01-01

    A descriptive, holistic, multiple-case methodology was applied to examine the nature of participation in discourse of two low- and two high-achieving grade 6 students while solving mathematical tasks in a standards-based classroom. Data collected via classroom observations and student interviews were analyzed through a multiple-cycle coding…

  19. Methodology for enabling high-throughput simultaneous saccharification and fermentation screening of yeast using solid biomass as a substrate.

    PubMed

    Elliston, Adam; Wood, Ian P; Soucouri, Marie J; Tantale, Rachelle J; Dicks, Jo; Roberts, Ian N; Waldron, Keith W

    2015-01-01

    High-throughput (HTP) screening is becoming an increasingly useful tool for collating biological data which would otherwise require the employment of excessive resources. Second generation biofuel production is one such process. HTP screening allows the investigation of large sample sets to be undertaken with increased speed and cost effectiveness. This paper outlines a methodology that will enable solid lignocellulosic substrates to be hydrolyzed and fermented at a 96-well plate scale, facilitating HTP screening of ethanol production, whilst maintaining repeatability similar to that achieved at a larger scale. The results showed that utilizing sheets of biomass of consistent density (handbills), for paper, and slurries of pretreated biomass that could be pipetted allowed standardized and accurate transfers to 96-well plates to be achieved (±3.1 and 1.7%, respectively). Processing these substrates by simultaneous saccharification and fermentation (SSF) at various volumes showed no significant difference on final ethanol yields, either at standard shake flask (200 mL), universal bottle (10 mL) or 96-well plate (1 mL) scales. Substrate concentrations of up to 10% (w/v) were trialed successfully for SSFs at 1 mL volume. The methodology was successfully tested by showing the effects of steam explosion pretreatment on both oilseed rape and wheat straws. This methodology could be used to replace large shake flask reactions with comparatively fast 96-well plate SSF assays allowing for HTP experimentation. Additionally this method is compatible with a number of standardized assay techniques such as simple colorimetric, High-performance liquid chromatography (HPLC) and Nuclear magnetic resonance (NMR) spectroscopy. Furthermore this research has practical uses in the biorefining of biomass substrates for second generation biofuels and novel biobased chemicals by allowing HTP SSF screening, which should allow selected samples to be scaled up or studied in more detail.

  20. “Retention Projection” Enables Reliable Use of Shared Gas Chromatographic Retention Data Across Labs, Instruments, and Methods

    PubMed Central

    Barnes, Brian B.; Wilson, Michael B.; Carr, Peter W.; Vitha, Mark F.; Broeckling, Corey D.; Heuberger, Adam L.; Prenni, Jessica; Janis, Gregory C.; Corcoran, Henry; Snow, Nicholas H.; Chopra, Shilpi; Dhandapani, Ramkumar; Tawfall, Amanda; Sumner, Lloyd W.; Boswell, Paul G.

    2014-01-01

    Gas chromatography-mass spectrometry (GC-MS) is a primary tool used to identify compounds in complex samples. Both mass spectra and GC retention times are matched to those of standards, but it is often impractical to have standards on hand for every compound of interest, so we must rely on shared databases of MS data and GC retention information. Unfortunately, retention databases (e.g. linear retention index libraries) are experimentally restrictive, notoriously unreliable, and strongly instrument dependent, relegating GC retention information to a minor, often negligible role in compound identification despite its potential power. A new methodology called “retention projection” has great potential to overcome the limitations of shared chromatographic databases. In this work, we tested the reliability of the methodology in five independent laboratories. We found that even when each lab ran nominally the same method, the methodology was 3-fold more accurate than retention indexing because it properly accounted for unintentional differences between the GC-MS systems. When the labs used different methods of their own choosing, retention projections were 4- to 165-fold more accurate. More importantly, the distribution of error in the retention projections was predictable across different methods and labs, thus enabling automatic calculation of retention time tolerance windows. Tolerance windows at 99% confidence were generally narrower than those widely used even when physical standards are on hand to measure their retention. With its high accuracy and reliability, the new retention projection methodology makes GC retention a reliable, precise tool for compound identification, even when standards are not available to the user. PMID:24205931
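
    For reference, the linear (temperature-programmed) retention index that retention projection is compared against brackets the analyte between n-alkane standards; a small sketch of that baseline calculation (the retention projection calculation itself is more involved and is not reproduced here):

      def linear_retention_index(t_analyte, t_alkane_lo, t_alkane_hi, carbons_lo):
          """van den Dool & Kratz linear retention index:
          RI = 100 * (n + (t_x - t_n) / (t_{n+1} - t_n)),
          where t_n and t_{n+1} are the retention times of the n-alkanes bracketing the analyte."""
          return 100.0 * (carbons_lo + (t_analyte - t_alkane_lo) / (t_alkane_hi - t_alkane_lo))

      # Hypothetical retention times (min): C10 at 6.20, C11 at 7.45, analyte at 6.80
      print(linear_retention_index(6.80, 6.20, 7.45, 10))   # -> 1048.0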

  1. Guidelines for the Design and Conduct of Clinical Studies in Knee Articular Cartilage Repair

    PubMed Central

    Mithoefer, Kai; Saris, Daniel B.F.; Farr, Jack; Kon, Elizaveta; Zaslav, Kenneth; Cole, Brian J.; Ranstam, Jonas; Yao, Jian; Shive, Matthew; Levine, David; Dalemans, Wilfried; Brittberg, Mats

    2011-01-01

    Objective: To summarize current clinical research practice and develop methodological standards for objective scientific evaluation of knee cartilage repair procedures and products. Design: A comprehensive literature review was performed of high-level original studies providing information relevant for the design of clinical studies on articular cartilage repair in the knee. Analysis of cartilage repair publications and synopses of ongoing trials were used to identify important criteria for the design, reporting, and interpretation of studies in this field. Results: Current literature reflects the methodological limitations of the scientific evidence available for articular cartilage repair. However, clinical trial databases of ongoing trials document a trend suggesting improved study designs and clinical evaluation methodology. Based on the current scientific information and standards of clinical care, detailed methodological recommendations were developed for the statistical study design, patient recruitment, control group considerations, study endpoint definition, documentation of results, use of validated patient-reported outcome instruments, and inclusion and exclusion criteria for the design and conduct of scientifically sound cartilage repair study protocols. A consensus statement among the International Cartilage Repair Society (ICRS) and contributing authors experienced in clinical trial design and implementation was achieved. Conclusions: High-quality clinical research methodology is critical for the optimal evaluation of current and new cartilage repair technologies. In addition to generally applicable principles for orthopedic study design, specific criteria and considerations apply to cartilage repair studies. Systematic application of these criteria and considerations can facilitate study designs that are scientifically rigorous, ethical, practical, and appropriate for the question(s) being addressed in any given cartilage repair research project. PMID:26069574

  2. Expect the Best.

    ERIC Educational Resources Information Center

    Omotani, Barbara J.; Omotani, Les

    1996-01-01

    School leaders can create an environment that supports highly effective beliefs, attitudes, and behaviors in teachers. Effective teachers believe every student has abundant, innate potential. Instead of watering down standards and expectations, they modify three key variables (time, grouping, and methodology) to help specific students achieve…

  3. A data envelope analysis to assess factors affecting technical and economic efficiency of individual broiler breeder hens.

    PubMed

    Romero, L F; Zuidhof, M J; Jeffrey, S R; Naeima, A; Renema, R A; Robinson, F E

    2010-08-01

    This study evaluated the effect of feed allocation and energetic efficiency on the technical and economic efficiency of broiler breeder hens using the data envelope analysis methodology and quantified the effect of variables affecting technical efficiency. A total of 288 Ross 708 pullets were placed in individual cages at 16 wk of age and assigned to 1 of 4 feed allocation groups. Three of them had feed allocated on a group basis with divergent BW targets: standard, high (standard x 1.1), and low (standard x 0.9). The fourth group had feed allocated on an individual bird basis following the standard BW target. Birds were classified in 3 energetic efficiency categories: low, average, and high, based on estimated maintenance requirements. Technical efficiency considered saleable chicks as output and cumulative ME intake and time as inputs. Economic efficiency of feed allocation treatments was analyzed under different cost scenarios. Birds with low feed allocation exhibited a lower technical efficiency (69.4%) than standard (72.1%), which reflected a reduced egg production rate. Feed allocation of the high treatment could have been reduced by 10% with the same chick production as the standard treatment. The low treatment exhibited reduced economic efficiency at greater capital costs, whereas the high treatment had reduced economic efficiency at greater feed costs. Hens of average energetic efficiency had lower technical efficiency under the low feed allocation than under the standard allocation. A 1% increment in estimated maintenance requirement changed technical efficiency by -0.23%, whereas a 1% increment in ME intake had a -0.47% effect. The negative relationship between technical efficiency and ME intake was counterbalanced by a positive correlation of ME intake and egg production. The negative relationship of technical efficiency and maintenance requirements was synergized by a negative correlation of hen maintenance and egg production. Economic efficiency methodologies are effective tools to assess the economic effect of selection and flock management programs because biological, allocative, and economic factors can be independently analyzed.
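
    A minimal sketch of an input-oriented DEA efficiency score (a generic CCR envelopment model solved as a linear program; the hen-level inputs and outputs shown are hypothetical, not the study data):

      import numpy as np
      from scipy.optimize import linprog

      def dea_efficiency(X, Y, o):
          """Input-oriented CCR efficiency of unit o.
          X: (n_units, n_inputs), Y: (n_units, n_outputs).
          min theta  s.t.  sum_j lam_j * x_j <= theta * x_o,  sum_j lam_j * y_j >= y_o,  lam >= 0."""
          n, m = X.shape
          s = Y.shape[1]
          c = np.zeros(n + 1)
          c[0] = 1.0                      # minimize theta (variable 0); the lambdas follow
          A_ub = np.zeros((m + s, n + 1))
          b_ub = np.zeros(m + s)
          A_ub[:m, 0] = -X[o]             # inputs:  X^T lam - theta * x_o <= 0
          A_ub[:m, 1:] = X.T
          A_ub[m:, 1:] = -Y.T             # outputs: -Y^T lam <= -y_o
          b_ub[m:] = -Y[o]
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1), method="highs")
          return res.x[0]

      # Hypothetical hens: inputs = [cumulative ME intake (MJ), days on feed], output = [saleable chicks]
      X = np.array([[1500., 420.], [1400., 420.], [1600., 420.], [1450., 420.]])
      Y = np.array([[120.], [118.], [110.], [125.]])
      print([round(dea_efficiency(X, Y, i), 3) for i in range(len(X))])

    Units on the efficient frontier score 1.0; the others receive the proportional input reduction (theta) needed to reach it, which is the sense in which the abstract speaks of feed that "could have been reduced" at equal chick output.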

  4. A semi-automated methodology for finding lipid-related GO terms.

    PubMed

    Fan, Mengyuan; Low, Hong Sang; Wenk, Markus R; Wong, Limsoon

    2014-01-01

    Although semantic similarity in Gene Ontology (GO) and other approaches may be used to find similar GO terms, there is as yet no method to systematically find a class of GO terms sharing a common property with high accuracy (e.g., involving human curation). We have developed a methodology to address this issue and applied it to identify lipid-related GO terms, owing to the important and varied roles of lipids in many biological processes. Our methodology finds lipid-related GO terms in a semi-automated manner, requiring only moderate manual curation. We first obtain a list of lipid-related gold-standard GO terms by keyword search and manual curation. Then, based on the hypothesis that co-annotated GO terms share similar properties, we develop a machine learning method that expands the list of lipid-related terms from the gold standard. Those terms predicted most likely to be lipid related are examined by a human curator following specific curation rules to confirm the class labels. The structure of GO is also exploited to help reduce the curation effort. The prediction and curation cycle is repeated until no further lipid-related term is found. Our approach has covered a high proportion, if not all, of lipid-related terms with relatively high efficiency. http://compbio.ddns.comp.nus.edu.sg/∼lipidgo. © The Author(s) 2014. Published by Oxford University Press.
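
    A schematic of the prediction-and-curation loop described above, using toy features and a stub curator in place of the co-annotation-derived features and human curation rules of the actual pipeline:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)

      # Toy setup: 200 GO terms x 10 co-annotation-derived features (all hypothetical)
      X = rng.normal(0, 1, (200, 10))
      truth = (X[:, 0] + 0.5 * X[:, 1] > 1.0)          # hidden "lipid-related" property

      def curator(idx):
          return bool(truth[idx])                      # stub for the human curation step

      seed_pos = np.where(truth)[0][:15]               # gold-standard lipid-related terms
      seed_neg = np.where(~truth)[0][:15]              # curated non-lipid terms
      labels = {int(i): True for i in seed_pos}
      labels.update({int(i): False for i in seed_neg})

      for _ in range(5):                               # prediction / curation cycles
          known = np.array(sorted(labels))
          clf = LogisticRegression().fit(X[known], [labels[i] for i in known])
          unknown = np.setdiff1d(np.arange(len(X)), known)
          if unknown.size == 0:
              break
          probs = clf.predict_proba(X[unknown])[:, 1]
          for idx in unknown[np.argsort(probs)[::-1][:10]]:   # top candidates go to curation
              labels[int(idx)] = curator(idx)

      print("curated lipid-related terms:", sum(labels.values()))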

  5. 76 FR 70680 - Small Business Size Standards: Real Estate and Rental and Leasing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-15

    ... industries and one sub-industry in North American Industry Classification System (NAICS) Sector 53, Real... industries grouped by NAICS Sector. SBA issued a White Paper entitled "Size Standards Methodology" and published in the October 21, 2009 issue of the Federal Register. That "Size Standards Methodology" is...

  6. Workshop on LCA: Methodology, Current Development, and Application in Standards - LCA Methodology

    EPA Science Inventory

    As ASTM standards are being developed including Life Cycle Assessment within the Standards it is imperative that practitioners in the field learn more about what LCA is, and how to conduct it. This presentation will include an overview of the LCA process and will concentrate on ...

  7. Studies on the asparagine-linked oligosaccharides from cartilage-specific proteoglycan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cioffi, L.C.

    1987-01-01

    Chondrocytes synthesize and secrete a cartilage-specific proteoglycan (PG-H) as one of their major products. This proteoglycan has attached to it several types of carbohydrate chains, including chondroitin sulfate, keratan sulfate, O-linked oligosaccharides, and asparagine-linked oligosaccharides. The asparagine-linked oligosaccharides found on PG-H were investigated in these studies. Methodology was developed for the isolation and separation of standard complex and high mannose type oligosaccharides. This included digesting glycoproteins with N-glycanase and separation of the oligosaccharides according to type by concanavalin-A lectin chromatography. The different oligosaccharide types were then analyzed by high pressure liquid chromatography. This methodology was used in the subsequent studies on the PG-H asparagine-linked oligosaccharides. Initially, the asparagine-linked oligosaccharides recovered from the culture medium (CM) and cell-associated (Ma) fractions of PG-H from tibial chondrocytes were labeled with [3H]-mannose and the oligosaccharides were isolated and analyzed.

  8. Intimate Partner Violence, 1993-2010

    MedlinePlus

    ... appendix table 2 for standard errors. *Due to methodological changes, use caution when comparing 2006 NCVS criminal ...

  9. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems typically is very complex. This is due to the increasing number of features as well as the high demand for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is not yet widely used in the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use the SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in avionics and defense, rail transportation, energy and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.
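
    The abstract's central point, that model checking exhaustively examines the state space, can be illustrated with a toy reachability check. The mode machine and safety property below are invented for illustration and are unrelated to the SCADE/ECSS workflow or the TET-1 software.

```python
# Toy illustration of exhaustive exploration underlying model checking:
# breadth-first search over a small state machine to verify a safety property
# ("the system never reaches an unsafe state"). Didactic stand-in only.
from collections import deque

transitions = {                      # hypothetical attitude-control mode machine
    "idle":         ["detumble"],
    "detumble":     ["coarse_point", "safe_hold"],
    "coarse_point": ["fine_point", "safe_hold"],
    "fine_point":   ["coarse_point"],
    "safe_hold":    ["detumble"],
}

def violates_safety(state):
    return state == "uncontrolled_spin"    # property: this state is unreachable

def check(initial="idle"):
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if violates_safety(state):
            return False
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True                            # full state space explored, property holds

print("safety property holds:", check())
```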

  10. MPHASYS: a mouse phenotype analysis system

    PubMed Central

    Calder, R Brent; Beems, Rudolf B; van Steeg, Harry; Mian, I Saira; Lohman, Paul HM; Vijg, Jan

    2007-01-01

    Background Systematic, high-throughput studies of mouse phenotypes have been hampered by the inability to analyze individual animal data from a multitude of sources in an integrated manner. Studies generally make comparisons at the level of genotype or treatment thereby excluding associations that may be subtle or involve compound phenotypes. Additionally, the lack of integrated, standardized ontologies and methodologies for data exchange has inhibited scientific collaboration and discovery. Results Here we introduce a Mouse Phenotype Analysis System (MPHASYS), a platform for integrating data generated by studies of mouse models of human biology and disease such as aging and cancer. This computational platform is designed to provide a standardized methodology for working with animal data; a framework for data entry, analysis and sharing; and ontologies and methodologies for ensuring accurate data capture. We describe the tools that currently comprise MPHASYS, primarily ones related to mouse pathology, and outline its use in a study of individual animal-specific patterns of multiple pathology in mice harboring a specific germline mutation in the DNA repair and transcription-specific gene Xpd. Conclusion MPHASYS is a system for analyzing multiple data types from individual animals. It provides a framework for developing data analysis applications, and tools for collecting and distributing high-quality data. The software is platform independent and freely available under an open-source license [1]. PMID:17553167

  11. 7T MRI subthalamic nucleus atlas for use with 3T MRI.

    PubMed

    Milchenko, Mikhail; Norris, Scott A; Poston, Kathleen; Campbell, Meghan C; Ushe, Mwiza; Perlmutter, Joel S; Snyder, Abraham Z

    2018-01-01

    Deep brain stimulation (DBS) of the subthalamic nucleus (STN) reduces motor symptoms in most patients with Parkinson disease (PD), yet may produce untoward effects. Investigation of DBS effects requires accurate localization of the STN, which can be difficult to identify on magnetic resonance images collected with clinically available 3T scanners. The goal of this study is to develop a high-quality STN atlas that can be applied to standard 3T images. We created a high-definition STN atlas derived from seven older participants imaged at 7T. This atlas was nonlinearly registered to a standard template representing 56 patients with PD imaged at 3T. This process required development of methodology for nonlinear multimodal image registration. We demonstrate mm-scale STN localization accuracy by comparison of our 3T atlas with a publicly available 7T atlas. We also demonstrate less agreement with an earlier histological atlas. STN localization error in the 56 patients imaged at 3T was less than 1 mm on average. Our methodology enables accurate STN localization in individuals imaged at 3T. The STN atlas and underlying 3T average template in MNI space are freely available to the research community. The image registration methodology developed in the course of this work may be generally applicable to other datasets.

  12. Protocol for Standardizing High-to-Moderate Abundance Protein Biomarker Assessments Through an MRM-with-Standard-Peptides Quantitative Approach.

    PubMed

    Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H

    2016-01-01

    Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a 6-order-of-magnitude range from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.
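
    The quantitation step rests on regression analysis of standard curves built with SIS internal standards. The sketch below shows the general idea on made-up numbers: a linear calibration of peak-area ratios against known concentrations, then inversion for an unknown sample. It is not the kit's actual calibration design.

```python
# Illustrative standard-curve quantitation for one peptide (hypothetical values).
import numpy as np

# Known calibrator concentrations (fmol/uL) and measured peak-area ratios
# (natural peptide over SIS internal standard).
conc  = np.array([1.0, 5.0, 25.0, 100.0, 400.0])
ratio = np.array([0.021, 0.11, 0.52, 2.05, 8.3])

slope, intercept = np.polyfit(conc, ratio, 1)   # linear calibration: ratio = a*conc + b

def quantify(sample_ratio):
    """Interpolate an unknown sample's concentration from its area ratio."""
    return (sample_ratio - intercept) / slope

print(f"estimated concentration: {quantify(1.3):.1f} fmol/uL")
```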

  13. On the use of EEG or MEG brain imaging tools in neuromarketing research.

    PubMed

    Vecchiato, Giovanni; Astolfi, Laura; De Vico Fallani, Fabrizio; Toppi, Jlenia; Aloise, Fabio; Bez, Francesco; Wei, Daming; Kong, Wanzeng; Dai, Jounging; Cincotti, Febo; Mattia, Donatella; Babiloni, Fabio

    2011-01-01

    Here we present an overview of some published papers of interest for marketing research employing electroencephalogram (EEG) and magnetoencephalogram (MEG) methods. The interest in these methodologies lies in their high temporal resolution, as opposed to the functional Magnetic Resonance Imaging (fMRI) methodology also widely used in marketing research. In addition, EEG and MEG technologies have greatly improved their spatial resolution in recent decades with the introduction of advanced signal processing methodologies. By presenting data gathered through MEG and high-resolution EEG, we show which kinds of information can be gathered with these methodologies while participants watch marketing-relevant stimuli. Such information relates to the memorization of, and the pleasantness evoked by, such stimuli. We noted that temporal and frequency patterns of brain signals can provide descriptors conveying information about the cognitive and emotional processes of subjects observing commercial advertisements. This information could be unobtainable through the common tools used in standard marketing research. We also show an example of how an EEG methodology could be used to analyze cultural differences in the fruition of video commercials for carbonated beverages in Western and Eastern countries.
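
    One family of the frequency patterns mentioned above is band-limited spectral power. The following sketch computes conventional theta and alpha band power per channel from a synthetic EEG epoch; the band limits, sampling rate, and data are illustrative and not taken from the cited studies.

```python
# Minimal sketch: frequency-band descriptors from an EEG epoch
# (assumed shape: channels x samples, sampling rate fs).
import numpy as np
from scipy.signal import welch

def band_power(eeg, fs, bands=None):
    """Return integrated PSD per band and per channel."""
    if bands is None:
        bands = {"theta": (4, 8), "alpha": (8, 12)}   # conventional limits, assumed here
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs) * 2, axis=-1)
    out = {}
    for name, (lo, hi) in bands.items():
        mask = (freqs >= lo) & (freqs < hi)
        out[name] = np.trapz(psd[..., mask], freqs[mask], axis=-1)  # power per channel
    return out

eeg = np.random.randn(32, 10 * 256)            # synthetic: 32 channels, 10 s at 256 Hz
print(band_power(eeg, fs=256)["alpha"].shape)  # -> (32,)
```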

  14. On the Use of EEG or MEG Brain Imaging Tools in Neuromarketing Research

    PubMed Central

    Vecchiato, Giovanni; Astolfi, Laura; De Vico Fallani, Fabrizio; Toppi, Jlenia; Aloise, Fabio; Bez, Francesco; Wei, Daming; Kong, Wanzeng; Dai, Jounging; Cincotti, Febo; Mattia, Donatella; Babiloni, Fabio

    2011-01-01

    Here we present an overview of some published papers of interest for marketing research employing electroencephalogram (EEG) and magnetoencephalogram (MEG) methods. The interest in these methodologies lies in their high temporal resolution, as opposed to the functional Magnetic Resonance Imaging (fMRI) methodology also widely used in marketing research. In addition, EEG and MEG technologies have greatly improved their spatial resolution in recent decades with the introduction of advanced signal processing methodologies. By presenting data gathered through MEG and high-resolution EEG, we show which kinds of information can be gathered with these methodologies while participants watch marketing-relevant stimuli. Such information relates to the memorization of, and the pleasantness evoked by, such stimuli. We noted that temporal and frequency patterns of brain signals can provide descriptors conveying information about the cognitive and emotional processes of subjects observing commercial advertisements. This information could be unobtainable through the common tools used in standard marketing research. We also show an example of how an EEG methodology could be used to analyze cultural differences in the fruition of video commercials for carbonated beverages in Western and Eastern countries. PMID:21960996

  15. Evidence for current recommendations concerning the management of foot health for people with chronic long-term conditions: a systematic review.

    PubMed

    Edwards, Katherine; Borthwick, Alan; McCulloch, Louise; Redmond, Anthony; Pinedo-Villanueva, Rafael; Prieto-Alhambra, Daniel; Judge, Andrew; Arden, Nigel; Bowen, Catherine

    2017-01-01

    Research focusing on management of foot health has become more evident over the past decade, especially related to chronic conditions such as diabetes. The level of methodological rigour across this body of work, however, is varied, and outputs do not appear to have been developed or translated into clinical practice. The aim of this systematic review was to assess the latest guidelines, standards of care and current recommendations relative to people with chronic conditions to ascertain the level of supporting evidence concerning the management of foot health. A systematic search of electronic databases (Medline, Embase, Cinahl, Web of Science, SCOPUS and The Cochrane Library) for literature on recommendations for foot health management for people with chronic conditions was performed between 2000 and 2016 using predefined criteria. Data from the included publications were synthesised via template analysis, employing a thematic organisation and structure. The methodological quality of all included publications was appraised using the Appraisal of Guidelines for Research and Evaluation (AGREE II) instrument. A more in-depth analysis was carried out that specifically considered the levels of evidence that underpinned the strength of their recommendations concerning management of foot health. The data collected revealed 166 publications, of which the majority (102) were guidelines, standards of care or recommendations related to the treatment and management of diabetes. We noted a trend towards a systematic year-on-year increase in guidelines, standards of care or recommendations related to the treatment and management of long-term conditions other than diabetes over the past decade. The most common recommendation is for preventive care or assessments (e.g. vascular tests), followed by clinical interventions such as foot orthoses, foot ulcer care and foot health education. Methodological quality was spread across the range of AGREE II scores, with 62 publications falling into the category of high quality (scores 6-7). The number of publications providing a recommendation in the context of a narrative but without an indication of the strength or quality of the underlying evidence was high (79 out of 166). It is clear that the generation of evidence needs to be accelerated and put in place to support the future of the podiatry workforce. Whilst high-level evidence for podiatry is currently low in quantity, the methodological quality is growing. Where levels of evidence have been given in high-quality guidelines, standards of care or recommendations, they also tend to be of strong to moderate quality, such that further strategically prioritised research, if performed, is likely to have an important impact in the field.

  16. Evaluation of a new approach to compute intervertebral disc height measurements from lateral radiographic views of the spine.

    PubMed

    Allaire, Brett T; DePaolis Kaluza, M Clara; Bruno, Alexander G; Samelson, Elizabeth J; Kiel, Douglas P; Anderson, Dennis E; Bouxsein, Mary L

    2017-01-01

    Current standard methods to quantify disc height, namely distortion compensated Roentgen analysis (DCRA), have been mostly utilized in the lumbar and cervical spine and have strict exclusion criteria. Specifically, discs adjacent to a vertebral fracture are excluded from measurement, thus limiting the use of DCRA in studies that include older populations with a high prevalence of vertebral fractures. Thus, we developed and tested a modified DCRA algorithm that does not depend on vertebral shape. Participants included 1186 men and women from the Framingham Heart Study Offspring and Third Generation Multidetector CT Study. Lateral CT scout images were used to place 6 morphometry points around each vertebra at 13 vertebral levels in each participant. Disc heights were calculated utilizing these morphometry points using DCRA methodology and our modified version of DCRA, which requires information from fewer morphometry points than the standard DCRA. Modified DCRA and standard DCRA measures of disc height are highly correlated, with concordance correlation coefficients above 0.999. Both measures demonstrate good inter- and intra-operator reproducibility. 13.9 % of available disc heights were not evaluable or excluded using the standard DCRA algorithm, while only 3.3 % of disc heights were not evaluable using our modified DCRA algorithm. Using our modified DCRA algorithm, it is not necessary to exclude vertebrae with fracture or other deformity from disc height measurements as in the standard DCRA. Modified DCRA also yields identical measurements to the standard DCRA. Thus, the use of modified DCRA for quantitative assessment of disc height will lead to less missing data without any loss of accuracy, making it a preferred alternative to the current standard methodology.
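
    For orientation only, the snippet below shows one naive way a disc height could be derived from paired endplate morphometry points. It is a simplified geometric illustration with hypothetical coordinates, not the published DCRA or modified-DCRA algorithm, which includes distortion compensation.

```python
# Simplified illustration only: a disc height from vertebral morphometry points.
# NOT the published DCRA/modified-DCRA formula; all coordinates are hypothetical.
import numpy as np

def disc_height(upper_vertebra_inferior, lower_vertebra_superior):
    """Each argument: (3, 2) array of anterior, middle, posterior endplate points (mm)."""
    upper = np.asarray(upper_vertebra_inferior, dtype=float)
    lower = np.asarray(lower_vertebra_superior, dtype=float)
    # Mean point-to-point distance across the three paired landmarks.
    return float(np.mean(np.linalg.norm(lower - upper, axis=1)))

inf_plate = [[10.0, 52.0], [22.0, 51.0], [34.0, 52.5]]   # inferior endplate, upper vertebra
sup_plate = [[10.2, 45.0], [22.1, 44.2], [34.3, 45.1]]   # superior endplate, lower vertebra
print(f"mean disc height ~ {disc_height(inf_plate, sup_plate):.1f} mm")
```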

  17. Tracer methodology: an appropriate tool for assessing compliance with accreditation standards?

    PubMed

    Bouchard, Chantal; Jean, Olivier

    2017-10-01

    Tracer methodology has been used by Accreditation Canada since 2008 to collect evidence on the quality and safety of care and services, and to assess compliance with accreditation standards. Given the importance of this methodology in the accreditation program, the objective of this study is to assess the quality of the methodology and identify its strengths and weaknesses. A mixed quantitative and qualitative approach was adopted to evaluate consistency, appropriateness, effectiveness and stakeholder synergy in applying the methodology. An online questionnaire was sent to 468 Accreditation Canada surveyors. According to surveyors' perceptions, tracer methodology is an effective tool for collecting useful, credible and reliable information to assess compliance with Qmentum program standards and priority processes. The results show good coherence between methodology components (appropriateness of the priority processes evaluated, activities to evaluate a tracer, etc.). The main weaknesses are the time constraints faced by surveyors and management's lack of cooperation during the evaluation of tracers. The inadequate amount of time allowed for the methodology to be applied properly raises questions about the quality of the information obtained. This study paves the way for a future, more in-depth exploration of the identified weaknesses to help the accreditation organization make more targeted improvements to the methodology. Copyright © 2016 John Wiley & Sons, Ltd.

  18. 29 CFR 1910.119 - Process safety management of highly hazardous chemicals.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...

  19. 29 CFR 1910.119 - Process safety management of highly hazardous chemicals.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...

  20. The Study of a Simple Redox Reaction as an Experimental Approach to Chemical Kinetics.

    ERIC Educational Resources Information Center

    Elias, Horst; Zipp, Arden P.

    1988-01-01

    Recommends using iodide ions and peroxodisulfate ions for studying rate laws instead of the standard iodine clock for kinetic study. Presents the methodology and a discussion of the kinetics involved for a laboratory experiment for a high school or introductory college course. (ML)
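
    A typical analysis for such a kinetics experiment is the method of initial rates. The sketch below estimates the reaction orders and the rate constant for rate = k[I-]^m[S2O8^2-]^n from invented teaching data; the concentrations and rates are not taken from the cited article.

```python
# Illustrative method-of-initial-rates analysis for the iodide/peroxodisulfate
# reaction; all numbers are invented teaching values.
import numpy as np

conc_I = np.array([0.04, 0.08, 0.04])          # [I-] in mol/L
conc_S = np.array([0.04, 0.04, 0.08])          # [S2O8^2-] in mol/L
rate   = np.array([1.1e-5, 2.2e-5, 2.2e-5])    # initial rates in mol/(L*s)

m = np.log(rate[1] / rate[0]) / np.log(conc_I[1] / conc_I[0])   # order in iodide
n = np.log(rate[2] / rate[0]) / np.log(conc_S[2] / conc_S[0])   # order in persulfate
k = rate[0] / (conc_I[0]**m * conc_S[0]**n)
print(f"orders: m ~ {m:.1f}, n ~ {n:.1f}; k ~ {k:.3g} L/(mol*s)")
```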

  1. Using a Lean Six Sigma Approach to Yield Sustained Pressure Ulcer Prevention for Complex Critical Care Patients.

    PubMed

    Donovan, Elizabeth A; Manta, Christine J; Goldsack, Jennifer C; Collins, Michelle L

    2016-01-01

    Under value-based purchasing, Medicare withholds reimbursements for hospital-acquired pressure ulcer occurrence and rewards hospitals that meet performance standards. With little evidence of a validated prevention process, nurse managers are challenged to find evidence-based interventions. The aim of this study was to reduce the unit-acquired pressure ulcer (UAPU) rate on targeted intensive care and step-down units by 15% using Lean Six Sigma (LSS) methodology. An interdisciplinary team designed a pilot program using LSS methodology to test 4 interventions: standardized documentation, equipment monitoring, patient out-of-bed-to-chair monitoring, and a rounding checklist. During the pilot, the UAPU rate decreased from 4.4% to 2.8%, exceeding the goal of a 15% reduction. The rate remained below the goal through the program control phase at 2.9%, demonstrating a statistically significant reduction after intervention implementation. The program significantly reduced UAPU rates in high-risk populations. LSS methodologies are a sustainable approach to reducing hospital-acquired conditions that should be broadly tested and implemented.
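
    To illustrate how a before/after drop in the UAPU rate might be tested for statistical significance, the sketch below runs a two-proportion z-test on the reported rates with assumed patient denominators; the counts are hypothetical, not the study's data.

```python
# Hypothetical significance check for a drop in unit-acquired pressure ulcer rate.
import numpy as np
from scipy.stats import norm

x = np.array([88, 56])        # UAPU cases before / after (assumed counts, 4.4% and 2.8%)
n = np.array([2000, 2000])    # patients assessed in each period (assumed)
p = x / n
p_pool = x.sum() / n.sum()
z = (p[0] - p[1]) / np.sqrt(p_pool * (1 - p_pool) * (1 / n[0] + 1 / n[1]))
p_value = 2 * norm.sf(abs(z))
print(f"rates: {p[0]:.1%} -> {p[1]:.1%}, z = {z:.2f}, p = {p_value:.3f}")
```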

  2. Working group written presentation: Solar radiation

    NASA Technical Reports Server (NTRS)

    Slemp, Wayne S.

    1989-01-01

    The members of the Solar Radiation Working Group arrived at two major solar radiation technology needs: (1) generation of a long-term flight data base; and (2) development of a standardized UV testing methodology. The flight data base should include 1 to 5 year exposure of optical filters, windows, thermal control coatings, hardened coatings, polymeric films, and structural composites. The UV flux and wavelength distribution, as well as particulate radiation flux and energy, should be measured during this flight exposure. A standard testing methodology is needed to establish techniques for highly accelerated UV exposure which will correlate well with flight test data. Currently, UV exposure can only be accelerated to about 3 solar constants while still correlating well with flight exposure data. With space missions lasting up to 30 years, acceleration rates of 30 to 100X are needed for efficient laboratory testing.

  3. A standard description and costing methodology for the balance-of-plant items of a solar thermal electric power plant. Report of a multi-institutional working group

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Standard descriptions for solar thermal power plants are established and uniform costing methodologies for nondevelopmental balance of plant (BOP) items are developed. The descriptions and methodologies developed are applicable to the major systems. These systems include the central receiver, parabolic dish, parabolic trough, hemispherical bowl, and solar pond. The standard plant is defined in terms of four categories comprising (1) solar energy collection, (2) power conversion, (3) energy storage, and (4) balance of plant. Each of these categories is described in terms of the type and function of components and/or subsystems within the category. A detailed description is given for the BOP category. BOP contains a number of nondevelopmental items that are common to all solar thermal systems. A standard methodology for determining the costs of these nondevelopmental BOP items is given. The methodology is presented in the form of cost equations involving cost factors such as unit costs. A set of baseline values for the normalized cost factors is also given.

  4. [Impact of Lean methodology to improve care processes and levels of satisfaction in patient care in a clinical laboratory].

    PubMed

    Morón-Castañeda, L H; Useche-Bernal, A; Morales-Reyes, O L; Mojica-Figueroa, I L; Palacios-Carlos, A; Ardila-Gómez, C E; Parra-Ardila, M V; Martínez-Nieto, O; Sarmiento-Echeverri, N; Rodríguez, C A; Alvarado-Heine, C; Isaza-Ruget, M A

    2015-01-01

    The application of the Lean methodology in health institutions is an effective tool to improve capacity and workflow, as well as to increase the level of satisfaction of patients and employees. The objective was to optimise the time of outpatient care in a clinical laboratory by implementing a methodology based on the organisation of operational procedures, in order to improve user satisfaction and reduce the number of complaints about delays in care. A quasi-experimental before-and-after study was conducted from October 2011 to September 2012. X-bar and S charts were used to monitor the mean service times and their standard deviation. User satisfaction was assessed using service questionnaires. A reduction of 17 minutes was observed in the time of patient care from arrival to leaving the laboratory, and a decrease of 60% in complaints about delays in care. Despite high staff turnover and a 38% increase in the number of patients seen, a culture of empowerment and continuous improvement was acquired, as well as greater efficiency and productivity in the care process, which was reflected by maintaining standards 12 months after implementation. Lean is a viable methodology for clinical laboratory procedures, improving their efficiency and effectiveness. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
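
    The X-bar and S charts mentioned above monitor subgroup means and standard deviations against control limits. A minimal sketch is shown below, assuming synthetic subgroups of turnaround times; the chart constants are derived from the usual c4 relation rather than looked up from tables.

```python
# Sketch of X-bar and S control limits for monitoring service times.
# `samples` is assumed to be a (subgroups x n) array of turnaround times
# in minutes; the data below are synthetic, not the laboratory's measurements.
import numpy as np
from scipy.special import gamma

def xbar_s_limits(samples):
    k, n = samples.shape
    xbar = samples.mean(axis=1)
    s = samples.std(axis=1, ddof=1)
    xbarbar, sbar = xbar.mean(), s.mean()
    c4 = np.sqrt(2.0 / (n - 1)) * gamma(n / 2) / gamma((n - 1) / 2)
    a3 = 3.0 / (c4 * np.sqrt(n))                          # X-bar chart factor
    b3 = max(0.0, 1.0 - 3.0 * np.sqrt(1.0 - c4**2) / c4)  # S chart lower factor
    b4 = 1.0 + 3.0 * np.sqrt(1.0 - c4**2) / c4            # S chart upper factor
    return {"xbar": (xbarbar - a3 * sbar, xbarbar + a3 * sbar),
            "s": (b3 * sbar, b4 * sbar)}

rng = np.random.default_rng(0)
samples = rng.normal(35, 6, size=(25, 5))   # 25 daily subgroups of 5 patients (synthetic)
print(xbar_s_limits(samples))
```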

  5. [Methodological quality and reporting quality evaluation of randomized controlled trials published in China Journal of Chinese Materia Medica].

    PubMed

    Yu, Dan-Dan; Xie, Yan-Ming; Liao, Xing; Zhi, Ying-Jie; Jiang, Jun-Jie; Chen, Wei

    2018-02-01

    To evaluate the methodological quality and reporting quality of randomized controlled trials (RCTs) published in China Journal of Chinese Materia Medica, we searched CNKI and the China Journal of Chinese Materia Medica webpage to collect RCTs published since the establishment of the journal. The Cochrane risk of bias assessment tool was used to evaluate the methodological quality of RCTs. The CONSORT 2010 list was adopted as the reporting quality evaluation tool. Finally, 184 RCTs were included and evaluated methodologically, of which 97 RCTs were also evaluated for reporting quality. For the methodological evaluation, 62 trials (33.70%) reported the random sequence generation; 9 (4.89%) trials reported the allocation concealment; 25 (13.59%) trials adopted a method of blinding; 30 (16.30%) trials reported the number of patients withdrawing, dropping out and those lost to follow-up; 2 trials (1.09%) reported trial registration and none of the trials reported a trial protocol; only 8 (4.35%) trials reported the sample size estimation in detail. For the reporting quality appraisal, 3 of the 25 reporting items were evaluated as high quality, including abstract, participant eligibility criteria, and statistical methods; 4 reporting items were of medium quality, including purpose, intervention, random sequence method, and data collection sites and locations; 9 items were of low quality, including title, background, random sequence type, allocation concealment, blinding, recruitment of subjects, baseline data, harms, and funding; the rest of the items were of extremely low quality (compliance rate of the reporting item < 10%). On the whole, the methodological and reporting quality of RCTs published in the journal is generally low. Further improvement in both the methodological and reporting quality of RCTs of traditional Chinese medicine is warranted. It is recommended that international standards and procedures for RCT design be strictly followed to conduct high-quality trials. At the same time, in order to improve the reporting quality of randomized controlled trials, CONSORT standards should be adopted in the preparation of research reports and submissions. Copyright© by the Chinese Pharmaceutical Association.

  6. The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation.

    PubMed

    Lewis, Cara C; Stanick, Cameo F; Martinez, Ruben G; Weiner, Bryan J; Kim, Mimi; Barwick, Melanie; Comtois, Katherine A

    2015-01-08

    Identification of psychometrically strong instruments for the field of implementation science is a high priority underscored in a recent National Institutes of Health working meeting (October 2013). Existing instrument reviews are limited in scope, methods, and findings. The Society for Implementation Research Collaboration Instrument Review Project's objectives address these limitations by identifying and applying a unique methodology to conduct a systematic and comprehensive review of quantitative instruments assessing constructs delineated in two of the field's most widely used frameworks, adopt a systematic search process (using standard search strings), and engage an international team of experts to assess the full range of psychometric criteria (reliability, construct and criterion validity). Although this work focuses on implementation of psychosocial interventions in mental health and health-care settings, the methodology and results will likely be useful across a broad spectrum of settings. This effort has culminated in a centralized online open-access repository of instruments depicting graphical head-to-head comparisons of their psychometric properties. This article describes the methodology and preliminary outcomes. The seven stages of the review, synthesis, and evaluation methodology include (1) setting the scope for the review, (2) identifying frameworks to organize and complete the review, (3) generating a search protocol for the literature review of constructs, (4) literature review of specific instruments, (5) development of an evidence-based assessment rating criteria, (6) data extraction and rating instrument quality by a task force of implementation experts to inform knowledge synthesis, and (7) the creation of a website repository. To date, this multi-faceted and collaborative search and synthesis methodology has identified over 420 instruments related to 34 constructs (total 48 including subconstructs) that are relevant to implementation science. Despite numerous constructs having greater than 20 available instruments, which implies saturation, preliminary results suggest that few instruments stem from gold standard development procedures. We anticipate identifying few high-quality, psychometrically sound instruments once our evidence-based assessment rating criteria have been applied. The results of this methodology may enhance the rigor of implementation science evaluations by systematically facilitating access to psychometrically validated instruments and identifying where further instrument development is needed.

  7. An Approach for Implementation of Project Management Information Systems

    NASA Astrophysics Data System (ADS)

    Běrziša, Solvita; Grabis, Jānis

    Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on analysis of typical project management concepts and processes and existing XML-based representations of project management. A demonstration example of a project management information system's configuration is provided.
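
    As a hedged illustration of the transformation idea, the sketch below parses a small XML methodology description and maps it onto configuration settings. The element names, attributes, and configuration keys are invented; the chapter's actual specification framework is not reproduced here.

```python
# Hypothetical sketch: a standardized XML methodology description is read and
# transformed into PMIS configuration settings. Schema and keys are invented.
import xml.etree.ElementTree as ET

SPEC = """
<methodology name="ExampleMethod">
  <phase id="initiation"><artifact>charter</artifact></phase>
  <phase id="planning"><artifact>wbs</artifact><artifact>risk-register</artifact></phase>
</methodology>
"""

def configure_pmis(spec_xml):
    root = ET.fromstring(spec_xml)
    config = {"workflow": [], "document_templates": set()}
    for phase in root.findall("phase"):
        config["workflow"].append(phase.get("id"))                 # ordered phases
        config["document_templates"].update(a.text for a in phase.findall("artifact"))
    return config

print(configure_pmis(SPEC))
```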

  8. Methodologic ramifications of paying attention to sex and gender differences in clinical research.

    PubMed

    Prins, Martin H; Smits, Kim M; Smits, Luc J

    2007-01-01

    Methodologic standards for studies on sex and gender differences should be developed to improve reporting of studies and facilitate their inclusion in systematic reviews. The essence of these studies lies within the concept of effect modification. This article reviews important methodologic issues in the design and reporting of pharmacogenetic studies. Differences in effect based on sex or gender should preferably be expressed in absolute terms (risk differences) to facilitate clinical decisions on treatment. Information on the distribution of potential effect modifiers or prognostic factors should be available to prevent a biased comparison of differences in effect between genotypes. Other considerations included the possibility of selective nonavailability of biomaterial and the choice of a statistical model to study effect modification. To ensure high study quality, additional methodologic issues should be taken into account when designing and reporting studies on sex and gender differences.
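
    The recommendation to express sex or gender differences in absolute terms can be illustrated with stratified risk differences. The counts below are hypothetical and serve only to show the arithmetic of an additive interaction contrast.

```python
# Illustration of expressing a sex difference in treatment effect in absolute
# terms (risk differences) rather than ratios; all counts are hypothetical.
def risk(events, n):
    return events / n

# (events, patients) for treated and control groups within each stratum
women = {"treated": (30, 400), "control": (60, 400)}
men   = {"treated": (45, 400), "control": (55, 400)}

def risk_difference(stratum):
    return risk(*stratum["control"]) - risk(*stratum["treated"])

rd_women, rd_men = risk_difference(women), risk_difference(men)
print(f"absolute risk reduction: women {rd_women:.1%}, men {rd_men:.1%}")
print(f"additive interaction (difference in risk differences): {rd_women - rd_men:.1%}")
```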

  9. Updated methodology for nuclear magnetic resonance characterization of shales

    NASA Astrophysics Data System (ADS)

    Washburn, Kathryn E.; Birdwell, Justin E.

    2013-08-01

    Unconventional petroleum resources, particularly in shales, are expected to play an increasingly important role in the world's energy portfolio in the coming years. Nuclear magnetic resonance (NMR), particularly at low-field, provides important information in the evaluation of shale resources. Most of the low-field NMR analyses performed on shale samples rely heavily on standard T1 and T2 measurements. We present a new approach using solid echoes in the measurement of T1 and T1-T2 correlations that addresses some of the challenges encountered when making NMR measurements on shale samples compared to conventional reservoir rocks. Combining these techniques with standard T1 and T2 measurements provides a more complete assessment of the hydrogen-bearing constituents (e.g., bitumen, kerogen, clay-bound water) in shale samples. These methods are applied to immature and pyrolyzed oil shale samples to examine the solid and highly viscous organic phases present during the petroleum generation process. The solid echo measurements produce additional signal in the oil shale samples compared to the standard methodologies, indicating the presence of components undergoing homonuclear dipolar coupling. The results presented here include the first low-field NMR measurements performed on kerogen as well as detailed NMR analysis of highly viscous thermally generated bitumen present in pyrolyzed oil shale.
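
    Relaxation behaviour in such measurements is often summarized by fitting exponential decays to the echo train. A minimal sketch, using synthetic data and a biexponential model, is given below; real shale data are usually inverted with regularized multi-exponential (T2 distribution) methods rather than a two-component fit.

```python
# Sketch of extracting relaxation components from a CPMG-style echo decay by
# fitting a biexponential model; synthetic data stand in for shale measurements.
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a1, t2_1, a2, t2_2):
    return a1 * np.exp(-t / t2_1) + a2 * np.exp(-t / t2_2)

rng = np.random.default_rng(0)
t = np.linspace(0.0002, 0.1, 500)                         # echo times in seconds
signal = biexp(t, 0.7, 0.001, 0.3, 0.03) + rng.normal(0, 0.005, t.size)

# Initial guesses (p0) are assumed to be roughly known from the data.
params, _ = curve_fit(biexp, t, signal, p0=(0.5, 0.005, 0.5, 0.05))
print("fitted T2 components (s):", params[1], params[3])
```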

  10. Methodological proposal for validation of the disinfecting efficacy of an automated flexible endoscope reprocessor

    PubMed Central

    Graziano, Kazuko Uchikawa; Pereira, Marta Elisa Auler; Koda, Elaine

    2016-01-01

    ABSTRACT Objective: to elaborate and apply a method to assess the efficacy of automated flexible endoscope reprocessors at a time when there is not an official method or trained laboratories to comply with the requirements described in specific standards for this type of health product in Brazil. Method: the present methodological study was developed based on the following theoretical references: International Organization for Standardization (ISO) standard ISO 15883-4/2008 and Brazilian Health Surveillance Agency (Agência Nacional de Vigilância Sanitária - ANVISA) Collegiate Board Resolution (Resolução de Diretoria Colegiada - RDC) no. 35/2010 and 15/2012. The proposed method was applied to a commercially available device using a high-level 0.2% peracetic acid-based disinfectant. Results: the proposed method of assessment was found to be robust when the recommendations made in the relevant legislation were incorporated with some adjustments to ensure their feasibility. Application of the proposed method provided evidence of the efficacy of the tested equipment for the high-level disinfection of endoscopes. Conclusion: the proposed method may serve as a reference for the assessment of flexible endoscope reprocessors, thereby providing solid ground for the purchase of this category of health products. PMID:27508915

  11. Updated methodology for nuclear magnetic resonance characterization of shales

    USGS Publications Warehouse

    Washburn, Kathryn E.; Birdwell, Justin E.

    2013-01-01

    Unconventional petroleum resources, particularly in shales, are expected to play an increasingly important role in the world’s energy portfolio in the coming years. Nuclear magnetic resonance (NMR), particularly at low-field, provides important information in the evaluation of shale resources. Most of the low-field NMR analyses performed on shale samples rely heavily on standard T1 and T2 measurements. We present a new approach using solid echoes in the measurement of T1 and T1–T2 correlations that addresses some of the challenges encountered when making NMR measurements on shale samples compared to conventional reservoir rocks. Combining these techniques with standard T1 and T2 measurements provides a more complete assessment of the hydrogen-bearing constituents (e.g., bitumen, kerogen, clay-bound water) in shale samples. These methods are applied to immature and pyrolyzed oil shale samples to examine the solid and highly viscous organic phases present during the petroleum generation process. The solid echo measurements produce additional signal in the oil shale samples compared to the standard methodologies, indicating the presence of components undergoing homonuclear dipolar coupling. The results presented here include the first low-field NMR measurements performed on kerogen as well as detailed NMR analysis of highly viscous thermally generated bitumen present in pyrolyzed oil shale.

  12. Measures of outdoor play and independent mobility in children and youth: A methodological review.

    PubMed

    Bates, Bree; Stone, Michelle R

    2015-09-01

    Declines in children's outdoor play have been documented globally, partly due to heightened restrictions on children's independent mobility. Literature on outdoor play and children's independent mobility is increasing, yet no paper has summarized the various methodological approaches used. A methodological review could highlight the most commonly used measures and comprehensive research designs that could result in more standardized methodological approaches. Methodological review. A standardized protocol guided a methodological review of published research on measures of outdoor play and children's independent mobility in children and youth (0-18 years). Online searches of 8 electronic databases were conducted and studies were included if they contained a subjective or objective measure of outdoor play or children's independent mobility. References of included articles were scanned to identify additional articles. Twenty-four studies were included on outdoor play, and twenty-three on children's independent mobility. Study designs were diverse. Common objective measures included accelerometry, global positioning systems and direct observation; questionnaires, surveys and interviews were common subjective measures. Focus groups, activity logs, monitoring sheets, travel/activity diaries, behavioral maps and guided tours were also utilized. Questionnaires were used most frequently, yet few studies used the same questionnaire. Five studies employed comprehensive, mixed-methods designs. Outdoor play and children's independent mobility have been measured using a wide variety of techniques, with only a few studies using similar methodologies. A standardized methodological approach does not exist. Future researchers should consider including both objective measures (accelerometry and global positioning systems) and subjective measures (questionnaires, activity logs, interviews), as more comprehensive designs will enhance understanding of each multidimensional construct. Creating a standardized methodological approach would improve study comparisons. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  13. Standard methodologies for virus research in Apis mellifera

    USDA-ARS?s Scientific Manuscript database

    The international research network COLOSS (Prevention of honey bee COlony LOSSes) was established to coordinate efforts towards improving the health of the western honey bee at the global level. The COLOSS BEEBOOK contains a collection of chapters intended to standardize methodologies for monitoring ...

  14. Standard methodologies for Nosema apis and N. ceranae research

    USDA-ARS?s Scientific Manuscript database

    The international research network COLOSS (Prevention of honey bee COlony LOSSes) was established to coordinate efforts towards improving the health of the western honey bee at the global level. The COLOSS BEEBOOK contains a collection of chapters intended to standardize methodologies for monitoring ...

  15. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only for formal verification and program testing, but also for the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given the assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or better). The coming years shall address methodologies to realistically estimate the impact of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The objectives are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a given confidence level; and D) quantify and justify the reliability estimate for systems developed using various methods.
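
    The motivation for combining V&V evidence with testing follows from how many failure-free tests a purely statistical demonstration would need. The sketch below applies the standard zero-failure demonstration argument (pass probability (1-p0)^n); this is background reasoning, not the statistical framework developed in this research.

```python
# Back-of-envelope illustration: number of failure-free tests needed to claim,
# with confidence C, that the per-demand failure probability is at most p0.
import math

def tests_needed(p0, confidence=0.99):
    # Require (1 - p0)^n <= 1 - C, i.e. n >= ln(1 - C) / ln(1 - p0).
    return math.ceil(math.log(1 - confidence) / math.log(1 - p0))

for p0 in (1e-3, 1e-4, 1e-5):
    print(f"p0 = {p0:g}: {tests_needed(p0):,} failure-free tests")
```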

  16. Methodological quality of randomized trials published in the Journal of the American Podiatric Medical Association, 1999-2013.

    PubMed

    Landorf, Karl B; Menz, Hylton B; Armstrong, David G; Herbert, Robert D

    2015-07-01

    Randomized trials must be of high methodological quality to yield credible, actionable findings. The main aim of this project was to evaluate whether there has been an improvement in the methodological quality of randomized trials published in the Journal of the American Podiatric Medical Association (JAPMA). Randomized trials published in JAPMA during a 15-year period (January 1999 to December 2013) were evaluated. The methodological quality of randomized trials was evaluated using the PEDro scale (scores range from 0 to 10, with 0 being lowest quality). Linear regression was used to assess changes in methodological quality over time. A total of 1,143 articles were published in JAPMA between January 1999 and December 2013. Of these, 44 articles were reports of randomized trials. Although the number of randomized trials published each year increased, there was only minimal improvement in their methodological quality (mean rate of improvement = 0.01 points per year). The methodological quality of the trials studied was typically moderate, with a mean ± SD PEDro score of 5.1 ± 1.5. Although there were a few high-quality randomized trials published in the journal, most (84.1%) scored between 3 and 6. Although there has been an increase in the number of randomized trials published in JAPMA, there is substantial opportunity for improvement in the methodological quality of trials published in the journal. Researchers seeking to publish reports of randomized trials should seek to meet current best-practice standards in the conduct and reporting of their trials.

  17. Logical and Methodological Issues Affecting Genetic Studies of Humans Reported in Top Neuroscience Journals.

    PubMed

    Grabitz, Clara R; Button, Katherine S; Munafò, Marcus R; Newbury, Dianne F; Pernet, Cyril R; Thompson, Paul A; Bishop, Dorothy V M

    2018-01-01

    Genetics and neuroscience are two areas of science that pose particular methodological problems because they involve detecting weak signals (i.e., small effects) in noisy data. In recent years, increasing numbers of studies have attempted to bridge these disciplines by looking for genetic factors associated with individual differences in behavior, cognition, and brain structure or function. However, different methodological approaches to guarding against false positives have evolved in the two disciplines. To explore methodological issues affecting neurogenetic studies, we conducted an in-depth analysis of 30 consecutive articles in 12 top neuroscience journals that reported on genetic associations in nonclinical human samples. It was often difficult to estimate effect sizes in neuroimaging paradigms. Where effect sizes could be calculated, the studies reporting the largest effect sizes tended to have two features: (i) they had the smallest samples and were generally underpowered to detect genetic effects, and (ii) they did not fully correct for multiple comparisons. Furthermore, only a minority of studies used statistical methods for multiple comparisons that took into account correlations between phenotypes or genotypes, and only nine studies included a replication sample or explicitly set out to replicate a prior finding. Finally, presentation of methodological information was not standardized and was often distributed across Methods sections and Supplementary Material, making it challenging to assemble basic information from many studies. Space limits imposed by journals could mean that highly complex statistical methods were described in only a superficial fashion. In summary, methods that have become standard in the genetics literature-stringent statistical standards, use of large samples, and replication of findings-are not always adopted when behavioral, cognitive, or neuroimaging phenotypes are used, leading to an increased risk of false-positive findings. Studies need to correct not just for the number of phenotypes collected but also for the number of genotypes examined, genetic models tested, and subsamples investigated. The field would benefit from more widespread use of methods that take into account correlations between the factors corrected for, such as spectral decomposition, or permutation approaches. Replication should become standard practice; this, together with the need for larger sample sizes, will entail greater emphasis on collaboration between research groups. We conclude with some specific suggestions for standardized reporting in this area.
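
    One of the correlation-aware corrections mentioned above estimates an effective number of independent tests from the eigenvalues of the phenotype correlation matrix. The sketch below uses a Nyholt-style estimator on synthetic correlated phenotypes; it is one of several proposed estimators, not the method of any particular reviewed study.

```python
# Sketch of correcting for correlated phenotypes using the effective number of
# independent tests estimated by spectral decomposition (Nyholt-style estimator).
import numpy as np

def effective_tests(data):
    """data: (subjects x phenotypes) matrix; returns estimated Meff."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals = np.linalg.eigvalsh(corr)
    m = corr.shape[0]
    return 1 + (m - 1) * (1 - np.var(eigvals, ddof=1) / m)

rng = np.random.default_rng(1)
shared = rng.normal(size=(200, 1))
phenotypes = 0.6 * shared + 0.4 * rng.normal(size=(200, 8))   # 8 correlated measures
m_eff = effective_tests(phenotypes)
print(f"effective tests ~ {m_eff:.1f}; adjusted alpha = {0.05 / m_eff:.4f}")
```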

  18. Developing a User Oriented Design Methodology for Learning Activities Using Boundary Objects

    ERIC Educational Resources Information Center

    Fragou, Olga; Kameas, Achilles

    2013-01-01

    International Standards in Higher and Open and Distance Education are used for developing Open Educational Resources (OERs). Current issues in the e-learning community are the specification of learning chunks and the definition of describing designs for different units of learning (activities, units, courses) in a generic though expandable format.…

  19. CHARACTERIZATION OF GROUNDWATER SAMPLES FROM SUPERFUND SITES BY GAS CHROMATOGRAPHY/MASS SPECTROMETRY AND LIQUID CHROMATOGRAPHY/MASS SPECTROMETRY.

    EPA Science Inventory

    Groundwater at or near Superfund sites often contains much organic matter, as indicated by total organic carbon (TOC) measurements. Analyses by standard GC and GC/MS methodology often miss the more polar or nonvolatile of these organic compounds. The identification of the highly p...

  20. Understanding, Embracing and Reflecting upon the Messiness of Doctoral Fieldwork

    ERIC Educational Resources Information Center

    Naveed, Arif; Sakata, Nozomi; Kefallinou, Anthoula; Young, Sara; Anand, Kusha

    2017-01-01

    This Forum issue discusses the centrality of the fieldwork in doctoral research. The inevitability of researchers' influence and of their values apparent during and after their fieldwork calls for a high degree of reflexivity. Since the standard methodology textbooks do not sufficiently guide on addressing such challenges, doctoral researchers go…

  1. The Effects of Cooperative Learning on Student Achievement in Algebra I

    ERIC Educational Resources Information Center

    Brandy, Travis D.

    2013-01-01

    It is a well-documented finding that high school students in schools across the nation, including California, fail to achieve at the proficient level in mathematics, based on standardized test scores. The purpose of this research study was to compare the findings of students taught using traditional instructional methodologies versus cooperative…

  2. 49 CFR 1111.9 - Procedural schedule in cases using simplified standards.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) SURFACE TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION RULES OF PRACTICE COMPLAINT AND INVESTIGATION... the simplified standards: (1) In cases relying upon the Simplified-SAC methodology: Day 0—Complaint... dominance. (b) Defendant's second disclosure. In cases using the Simplified-SAC methodology, the defendant...

  3. The availability of public information for insurance risk decision-making in the UK

    NASA Astrophysics Data System (ADS)

    Davis, Nigel; Gibbs, Mark; Chadwick, Ben; Foote, Matthew

    2010-05-01

    At present, there is a wealth of hazard and exposure data which cannot be, or is not being, fully used by the risk modelling community. The reasons for this under-utilisation of data are many: restrictive and complex data policies and pricing, risks involved in information sharing, technological shortcomings, and variable resolution of data, particularly with catastrophe models only recently having been adjusted to consume high-resolution exposure data. There is therefore an urgent need for the development of common modelling practices and applications for climate and geo-hazard risk assessment, all of which would be highly relevant to the public policy, disaster risk management and financial risk transfer communities. This paper will present a methodology to overcome these obstacles and to review the availability of hazard data at research institutions in a consistent format. Such a methodology would facilitate the collation of hazard and other auxiliary data, as well as present data within a geo-spatial framework suitable for public and commercial use. The methodology would also review the suitability of datasets and how these could be made more freely available in conjunction with other research institutions in order to present a consistent data standard. It is clear that an understanding of these different issues of data and data standards has significant ramifications when used in Natural Hazard Risk Assessment. Scrutinising the issue of data standards also allows the data to be evaluated and re-evaluated for gaps, omissions, fitness for purpose, availability and precision. Not only would there be a quality check on data, but it would also help develop and fine-tune the tools used for decision-making and assessment of risk.

  4. C-Band Airport Surface Communications System Standards Development. Phase II Final Report. Volume 2: Test Bed Performance Evaluation and Final AeroMACS Recommendations

    NASA Technical Reports Server (NTRS)

    Hall, Edward; Magner, James

    2011-01-01

    This report is provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract NNC05CA85C, Task 7: New ATM Requirements-Future Communications, C-Band and L-Band Communications Standard Development, and was based on direction provided by FAA project-level agreements for New ATM Requirements-Future Communications. Task 7 included two subtasks. Subtask 7-1 addressed C-band (5091- to 5150-MHz) airport surface data communications standards development, systems engineering, test bed and prototype development, and tests and demonstrations to establish operational capability for the Aeronautical Mobile Airport Communications System (AeroMACS). Subtask 7-2 focused on systems engineering and development support of the L-band digital aeronautical communications system (L-DACS). Subtask 7-1 consisted of two phases. Phase I included development of AeroMACS concepts of use, requirements, architecture, and initial high-level safety risk assessment. Phase II builds on Phase I results and is presented in two volumes. Volume I is devoted to concepts of use, system requirements, and architecture, including AeroMACS design considerations. Volume II (this document) describes an AeroMACS prototype evaluation and presents final AeroMACS recommendations. This report also describes airport categorization and channelization methodologies. The purposes of the airport categorization task were (1) to facilitate initial AeroMACS architecture designs and enable budgetary projections by creating a set of airport categories based on common airport characteristics and design objectives, and (2) to offer high-level guidance to potential AeroMACS technology and policy development sponsors and service providers. A channelization plan methodology was developed because a common global methodology is needed to assure seamless interoperability among diverse AeroMACS services potentially supplied by multiple service providers.

  5. C-Band Airport Surface Communications System Standards Development. Phase II Final Report. Volume 1: Concepts of Use, Initial System Requirements, Architecture, and AeroMACS Design Considerations

    NASA Technical Reports Server (NTRS)

    Hall, Edward; Isaacs, James; Henriksen, Steve; Zelkin, Natalie

    2011-01-01

    This report is provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract NNC05CA85C, Task 7: New ATM Requirements-Future Communications, C-Band and L-Band Communications Standard Development, and was based on direction provided by FAA project-level agreements for New ATM Requirements-Future Communications. Task 7 included two subtasks. Subtask 7-1 addressed C-band (5091- to 5150-MHz) airport surface data communications standards development, systems engineering, test bed and prototype development, and tests and demonstrations to establish operational capability for the Aeronautical Mobile Airport Communications System (AeroMACS). Subtask 7-2 focused on systems engineering and development support of the L-band digital aeronautical communications system (L-DACS). Subtask 7-1 consisted of two phases. Phase I included development of AeroMACS concepts of use, requirements, architecture, and initial high-level safety risk assessment. Phase II builds on Phase I results and is presented in two volumes. Volume I (this document) is devoted to concepts of use, system requirements, and architecture, including AeroMACS design considerations. Volume II describes an AeroMACS prototype evaluation and presents final AeroMACS recommendations. This report also describes airport categorization and channelization methodologies. The purposes of the airport categorization task were (1) to facilitate initial AeroMACS architecture designs and enable budgetary projections by creating a set of airport categories based on common airport characteristics and design objectives, and (2) to offer high-level guidance to potential AeroMACS technology and policy development sponsors and service providers. A channelization plan methodology was developed because a common global methodology is needed to assure seamless interoperability among diverse AeroMACS services potentially supplied by multiple service providers.

  6. Methodology of Clinical Trials Aimed at Assessing Interventions for Cutaneous Leishmaniasis

    PubMed Central

    Olliaro, Piero; Vaillant, Michel; Arana, Byron; Grogl, Max; Modabber, Farrokh; Magill, Alan; Lapujade, Olivier; Buffet, Pierre; Alvar, Jorge

    2013-01-01

    The current evidence-base for recommendations on the treatment of cutaneous leishmaniasis (CL) is generally weak. Systematic reviews have pointed to a general lack of standardization of methods for the conduct and analysis of clinical trials of CL, compounded with poor overall quality of several trials. For CL, there is a specific need for methodologies which can be applied generally, while allowing the flexibility needed to cover the diverse forms of the disease. This paper intends to provide clinical investigators with guidance for the design, conduct, analysis and report of clinical trials of treatments for CL, including the definition of measurable, reproducible and clinically-meaningful outcomes. Having unified criteria will help strengthen evidence, optimize investments, and enhance the capacity for high-quality trials. The limited resources available for CL have to be concentrated in clinical studies of excellence that meet international quality standards. PMID:23556016

  7. Extracting insights from the shape of complex data using topology

    PubMed Central

    Lum, P. Y.; Singh, G.; Lehman, A.; Ishkanov, T.; Vejdemo-Johansson, M.; Alagappan, M.; Carlsson, J.; Carlsson, G.

    2013-01-01

    This paper applies topological methods to study complex high dimensional data sets by extracting shapes (patterns) and obtaining insights about them. Our method combines the best features of existing standard methodologies such as principal component and cluster analyses to provide a geometric representation of complex data sets. Through this hybrid method, we often find subgroups in data sets that traditional methodologies fail to find. Our method also permits the analysis of individual data sets as well as the analysis of relationships between related data sets. We illustrate the use of our method by applying it to three very different kinds of data, namely gene expression from breast tumors, voting data from the United States House of Representatives and player performance data from the NBA, in each case finding stratifications of the data which are more refined than those produced by standard methods. PMID:23393618
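
    The approach described resembles a Mapper-style topological pipeline combining a filter function with local clustering. Below is a minimal sketch of that general idea, assuming a first-principal-component filter, a fixed number of overlapping intervals, and DBSCAN for per-interval clustering; these choices and all parameters are illustrative assumptions, not the authors' implementation.

      # Minimal Mapper-style sketch: a PCA filter plus clustering over
      # overlapping intervals, yielding a graph whose nodes are local clusters.
      # Interval count, overlap, and clustering settings are illustrative.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import DBSCAN

      def mapper_graph(X, n_intervals=10, overlap=0.5, eps=0.5, min_samples=3):
          f = PCA(n_components=1).fit_transform(X).ravel()   # filter function
          lo, hi = f.min(), f.max()
          width = (hi - lo) / n_intervals
          step = width * (1.0 - overlap)
          nodes, edges = [], set()
          start = lo
          while start < hi:
              idx = np.where((f >= start) & (f <= start + width))[0]
              if idx.size >= min_samples:
                  labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(X[idx])
                  for lab in set(labels) - {-1}:             # skip noise points
                      nodes.append(set(idx[labels == lab]))
              start += step
          for i in range(len(nodes)):                        # shared points create edges
              for j in range(i + 1, len(nodes)):
                  if nodes[i] & nodes[j]:
                      edges.add((i, j))
          return nodes, edges

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(4, 1, (100, 5))])
          nodes, edges = mapper_graph(X)
          print(len(nodes), "nodes,", len(edges), "edges")

    In a graph of this kind, subgroups appear as distinct branches or connected components, which is the sense in which such methods can refine the stratifications produced by standard clustering.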

  8. Extracting insights from the shape of complex data using topology.

    PubMed

    Lum, P Y; Singh, G; Lehman, A; Ishkanov, T; Vejdemo-Johansson, M; Alagappan, M; Carlsson, J; Carlsson, G

    2013-01-01

    This paper applies topological methods to study complex high dimensional data sets by extracting shapes (patterns) and obtaining insights about them. Our method combines the best features of existing standard methodologies such as principal component and cluster analyses to provide a geometric representation of complex data sets. Through this hybrid method, we often find subgroups in data sets that traditional methodologies fail to find. Our method also permits the analysis of individual data sets as well as the analysis of relationships between related data sets. We illustrate the use of our method by applying it to three very different kinds of data, namely gene expression from breast tumors, voting data from the United States House of Representatives and player performance data from the NBA, in each case finding stratifications of the data which are more refined than those produced by standard methods.

  9. Wall jet analysis for circulation control aerodynamics. Part 1: Fundamental CFD and turbulence modeling concepts

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; York, B. J.; Sinha, N.; Dvorak, F. A.

    1987-01-01

    An overview of parabolic and PNS (Parabolized Navier-Stokes) methodology developed to treat highly curved sub- and supersonic wall jets is presented. The fundamental data base to which these models were applied is discussed in detail. The analysis of strong curvature effects was found to require a semi-elliptic extension of the parabolic modeling to account for turbulent contributions to the normal pressure variations, as well as an extension to the turbulence models utilized, to account for the highly enhanced mixing rates observed in situations with large convex curvature. A noniterative, pressure-split procedure is shown to extend parabolic models to account for such normal pressure variations in an efficient manner, requiring minimal additional run time over a standard parabolic approach. A new PNS methodology is presented to solve this problem, which extends the parabolic methodology via the addition of a characteristic-based wave solver. Applications of this approach to analyzing the interaction of wave and turbulence processes in wall jets are presented.

  10. A systematic review of the diagnostic accuracy of provocative tests of the neck for diagnosing cervical radiculopathy

    PubMed Central

    Pool, Jan J. M.; van Tulder, Maurits W.; Riphagen, Ingrid I.; de Vet, Henrica C. W.

    2006-01-01

    Clinical provocative tests of the neck, which position the neck and arm in order to aggravate or relieve arm symptoms, are commonly used in clinical practice in patients with a suspected cervical radiculopathy. Their diagnostic accuracy, however, has never been examined in a systematic review. A comprehensive search was conducted in order to identify all possible studies fulfilling the inclusion criteria. A study was included if: (1) any provocative test of the neck for diagnosing cervical radiculopathy was identified; (2) any reference standard was used; (3) sensitivity and specificity were reported or could be (re-)calculated; and, (4) the publication was a full report. Two reviewers independently selected studies, and assessed methodological quality. Only six studies met the inclusion criteria, which evaluated five provocative tests. In general, Spurling’s test demonstrated low to moderate sensitivity and high specificity, as did traction/neck distraction, and Valsalva’s maneuver. The upper limb tension test (ULTT) demonstrated high sensitivity and low specificity, while the shoulder abduction test demonstrated low to moderate sensitivity and moderate to high specificity. Common methodological flaws included lack of an optimal reference standard, disease progression bias, spectrum bias, and review bias. Limitations include few primary studies, substantial heterogeneity, and numerous methodological flaws among the studies; therefore, a meta-analysis was not conducted. This review suggests that, when consistent with the history and other physical findings, a positive Spurling’s, traction/neck distraction, and Valsalva’s might be indicative of a cervical radiculopathy, while a negative ULTT might be used to rule it out. However, the lack of evidence precludes any firm conclusions regarding their diagnostic value, especially when used in primary care. More high quality studies are necessary in order to resolve this issue. PMID:17013656
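
    As a small illustration of the "(re-)calculated" criterion above, sensitivity and specificity can be recovered from a 2x2 table of test result against the reference standard; the counts below are hypothetical, not taken from the included studies.

      # Re-calculating sensitivity and specificity from a 2x2 table of a
      # provocative test against a reference standard (counts are hypothetical).
      tp, fn, fp, tn = 18, 12, 4, 66

      sensitivity = tp / (tp + fn)   # proportion of radiculopathy cases with a positive test
      specificity = tn / (tn + fp)   # proportion of non-cases with a negative test
      print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")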

  11. [Quality standards for duplex ultrasonographic assessment (duplex us) of abdominal aortic stent grafts].

    PubMed

    Diard, A; Becker, F; Pichot, O

    2017-05-01

    The quality standards of the French Society of Vascular Medicine for the ultrasound assessment of lower limb arteries in vascular medicine practice are based on the principle that these examinations have to meet two requirements: technical know-how (knowledge of devices and methodologies) and medical know-how (a level of examination matching the indication and purpose of the examination, with interpretation and critical analysis of results). The aims are to describe an optimal level of examination adjusted to the indication or clinical hypothesis; to establish harmonized practices, methodologies, terminologies, and reporting of results; and to provide good-practice reference points and promote a high-quality process. The document covers the three levels of examination, with indications and objectives for each level; the reference standard examination (level 2) and its variants according to indications; the minimal content of the exam report and the medical conclusion letter to the corresponding physician (synthesis, conclusion and management suggestions); a commented glossary (anatomy, hemodynamics, signs and symptoms); technical basis; and device settings. Here, we discuss duplex ultrasound for the surveillance of abdominal aortic stent grafts. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  12. Effect of High Intensity Interval and Continuous Swimming Training on Body Mass Adiposity Level and Serum Parameters in High-Fat Diet Fed Rats.

    PubMed

    da Rocha, Guilherme L; Crisp, Alex H; de Oliveira, Maria R M; da Silva, Carlos A; Silva, Jadson O; Duarte, Ana C G O; Sene-Fiorese, Marcela; Verlengia, Rozangela

    2016-01-01

    This study aimed to investigate the effects of interval and continuous training on the body mass gain and adiposity levels of rats fed a high-fat diet. Forty-eight male Sprague-Dawley rats were randomly divided into two groups, standard diet and high-fat diet, and received their respective diets for a period of four weeks without exercise stimuli. After this period, the animals were randomly divided into six groups (n = 8): control standard diet (CS), control high-fat diet (CH), continuous training standard diet (CTS), continuous training high-fat diet (CTH), interval training standard diet (ITS), and interval training high-fat diet (ITH). The interval and continuous training consisted of a swimming exercise performed over eight weeks. CH rats had greater body mass gain, sum of adipose tissues mass, and lower serum high density lipoprotein values than CS. The trained groups showed lower values of feed intake, caloric intake, body mass gain, and adiposity levels compared with the CH group. No significant differences were observed between the trained groups (CTS versus ITS and CTH versus ITH) on body mass gains and adiposity levels. In conclusion, both training methodologies were shown to be effective in controlling body mass gain and adiposity levels in high-fat diet fed rats.

  13. Steps towards the international regulatory acceptance of non-animal methodology in safety assessment.

    PubMed

    Sewell, Fiona; Doe, John; Gellatly, Nichola; Ragan, Ian; Burden, Natalie

    2017-10-01

    The current animal-based paradigm for safety assessment must change. In September 2016, the UK National Centre for Replacement, Refinement and Reduction of Animals in Research (NC3Rs) brought together scientists from regulatory authorities, academia and industry to review progress in bringing new methodology into regulatory use, and to identify ways to expedite progress. Progress has been slow. Science is advancing to make this possible but changes are necessary. The new paradigm should allow new methodology to be adopted once it is developed rather than being based on a fixed set of studies. Regulatory authorities can help by developing Performance-Based Standards. The most pressing need is in repeat dose toxicology, although setting standards will be more complex than in areas such as sensitization. Performance standards should be aimed directly at human safety, not at reproducing the results of animal studies. Regulatory authorities can also aid progress towards the acceptance of non-animal based methodology by promoting "safe-haven" trials where traditional and new methodology data can be submitted in parallel to build up experience in the new methods. Industry can play its part in the acceptance of new methodology, by contributing to the setting of performance standards and by actively contributing to "safe-haven" trials. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Intermountain Health Care, Inc.: Standard Costing System Methodology and Implementation

    PubMed Central

    Rosqvist, W.V.

    1984-01-01

    Intermountain Health Care, Inc. (IHC), a not-for-profit hospital chain with 22 hospitals in the intermountain area and corporate offices located in Salt Lake City, Utah, has developed a Standard Costing System to provide hospital management with a tool for confronting increased cost pressures in the health care environment. This document serves as a description of the methodology used in developing the standard costing system and outlines the implementation process.

  15. DETERMINATION OF THE STRONG ACIDITY OF ATMOSPHERIC FINE PARTICLES (<2.5 UM) USING ANNULAR DENUDER TECHNOLOGY

    EPA Science Inventory

    This report is a standardized methodology description for the determination of strong acidity of fine particles (less than 2.5 microns) in ambient air using annular denuder technology. This methodology description includes two parts: Part A - Standard Method and Part B - Enhanced M...

  16. Renewable Energy used in State Renewable Portfolio Standards Yielded

    Science.gov Websites


  17. 42 CFR 416.171 - Determination of payment rates for ASC services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Determination of payment rates for ASC services... Determination of payment rates for ASC services. (a) Standard methodology. The standard methodology for determining the national unadjusted payment rate for ASC services is to calculate the product of the...

  18. 45 CFR 153.510 - Risk corridors establishment and payment methodology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... methodology. 153.510 Section 153.510 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE AFFORDABLE CARE ACT Health Insurance Issuer Standards Related to the Risk Corridors Program § 153...

  19. 45 CFR 153.510 - Risk corridors establishment and payment methodology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... methodology. 153.510 Section 153.510 Public Welfare Department of Health and Human Services REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE AFFORDABLE CARE ACT Health Insurance Issuer Standards Related to the Risk Corridors Program § 153...

  20. Design, Development and Analysis of Centrifugal Blower

    NASA Astrophysics Data System (ADS)

    Baloni, Beena Devendra; Channiwala, Salim Abbasbhai; Harsha, Sugnanam Naga Ramannath

    2018-06-01

    Centrifugal blowers are widely used turbomachines in all kinds of industrial and domestic applications. Manufacturing of blowers seldom follows an optimum design solution for the individual blower. Although centrifugal blowers have been developed into highly efficient machines, design is still based on various empirical and semi-empirical rules proposed by fan designers. Different methodologies are used to design the impeller and the other components of a blower. The objective of the present study is to examine explicit design methodologies and trace a unified design that gives better design-point performance. This unified design methodology is based more on fundamental concepts and minimal assumptions. A parametric study is also carried out to assess the effect of design parameters on pressure ratio and their interdependency in the design. A design code based on the unified methodology is developed in C. Numerical analysis is carried out to check the flow parameters inside the blower. Two blowers, one based on the present design and the other on an industrial design, are developed with a standard OEM blower manufacturing unit. Both designs are compared through experimental performance analysis per the applicable IS standard. The results suggest better efficiency and a higher flow rate for the same pressure head for the present design compared with the industrial one.
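
    As an illustration of the kind of fundamental relation on which such a unified design code is built, the sketch below estimates the design-point pressure rise of a centrifugal impeller from the Euler turbomachinery equation with a Stanitz-type slip factor; all input values, the slip correlation, and the assumed hydraulic efficiency are illustrative assumptions, not the authors' code.

      # Euler work with a slip factor: a first estimate of blower pressure rise.
      # Inputs and the correlation choice are illustrative assumptions.
      import math

      rho = 1.2                       # air density, kg/m^3
      N = 2900.0                      # shaft speed, rpm
      D2 = 0.40                       # impeller outlet diameter, m
      b2 = 0.04                       # outlet blade width, m
      beta2 = math.radians(90.0)      # outlet blade angle (radial blades)
      Z = 12                          # number of blades
      Q = 0.5                         # volume flow rate, m^3/s
      eta_h = 0.80                    # assumed hydraulic efficiency

      U2 = math.pi * D2 * N / 60.0                  # blade tip speed, m/s
      Cm2 = Q / (math.pi * D2 * b2)                 # meridional velocity at outlet
      sigma = 1.0 - 0.63 * math.pi / Z              # Stanitz slip factor
      Cu2 = sigma * (U2 - Cm2 / math.tan(beta2))    # tangential velocity with slip

      dp = eta_h * rho * U2 * Cu2                   # estimated pressure rise, Pa
      print(f"U2 = {U2:.1f} m/s, estimated delta_p = {dp:.0f} Pa")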

  1. A Methodological Analysis of Randomized Clinical Trials of Computer-Assisted Therapies for Psychiatric Disorders: Toward Improved Standards for an Emerging Field

    PubMed Central

    Kiluk, Brian D.; Sugarman, Dawn E.; Nich, Charla; Gibbons, Carly J.; Martino, Steve; Rounsaville, Bruce J.; Carroll, Kathleen M.

    2013-01-01

    Objective Computer-assisted therapies offer a novel, cost-effective strategy for providing evidence-based therapies to a broad range of individuals with psychiatric disorders. However, the extent to which the growing body of randomized trials evaluating computer-assisted therapies meets current standards of methodological rigor for evidence-based interventions is not clear. Method A methodological analysis of randomized clinical trials of computer-assisted therapies for adult psychiatric disorders, published between January 1990 and January 2010, was conducted. Seventy-five studies that examined computer-assisted therapies for a range of axis I disorders were evaluated using a 14-item methodological quality index. Results Results indicated marked heterogeneity in study quality. No study met all 14 basic quality standards, and three met 13 criteria. Consistent weaknesses were noted in evaluation of treatment exposure and adherence, rates of follow-up assessment, and conformity to intention-to-treat principles. Studies utilizing weaker comparison conditions (e.g., wait-list controls) had poorer methodological quality scores and were more likely to report effects favoring the computer-assisted condition. Conclusions While several well-conducted studies have indicated promising results for computer-assisted therapies, this emerging field has not yet achieved a level of methodological quality equivalent to those required for other evidence-based behavioral therapies or pharmacotherapies. Adoption of more consistent standards for methodological quality in this field, with greater attention to potential adverse events, is needed before computer-assisted therapies are widely disseminated or marketed as evidence based. PMID:21536689

  2. An eLearning Standard Approach for Supporting PBL in Computer Engineering

    ERIC Educational Resources Information Center

    Garcia-Robles, R.; Diaz-del-Rio, F.; Vicente-Diaz, S.; Linares-Barranco, A.

    2009-01-01

    Problem-based learning (PBL) has proved to be a highly successful pedagogical model in many fields, although it is not that common in computer engineering. PBL goes beyond the typical teaching methodology by promoting student interaction. This paper presents a PBL trial applied to a course in a computer engineering degree at the University of…

  3. A Methodology for Validation of High Resolution Combat Models

    DTIC Science & Technology

    1988-06-01

    Excerpt (table-of-contents and text fragments): C. Teleological Problem; D. Epistemological Problem; Uncertainty Principle. The "Teleological Problem" concerns how a model, by its nature, formulates an explicit cause-and-effect relationship that excludes others; the excerpt also touches on the role of "experts" in establishing the standard for reality and notes that generalization from personal experience is often hampered by its parochial aspects.

  4. Estimating Teacher Turnover Costs: A Case Study

    ERIC Educational Resources Information Center

    Levy, Abigail Jurist; Joy, Lois; Ellis, Pamela; Jablonski, Erica; Karelitz, Tzur M.

    2012-01-01

    High teacher turnover in large U.S. cities is a critical issue for schools and districts, and the students they serve; but surprisingly little work has been done to develop methodologies and standards that districts and schools can use to make reliable estimates of turnover costs. Even less is known about how to detect variations in turnover costs…

  5. On the temporal and spatial variability of near-surface soil moisture for the identification of representative in situ soil moisture monitoring stations

    USDA-ARS?s Scientific Manuscript database

    The high spatio-temporal variability of soil moisture complicates the validation of remotely sensed soil moisture products using in-situ monitoring stations. Therefore, a standard methodology for selecting the most representative stations for the purpose of validating satellites and land surface ...

  6. Using continuous process improvement methodology to standardize nursing handoff communication.

    PubMed

    Klee, Kristi; Latta, Linda; Davis-Kirsch, Sallie; Pecchia, Maria

    2012-04-01

    The purpose of this article was to describe the use of continuous performance improvement (CPI) methodology to standardize nurse shift-to-shift handoff communication. The goals of the process were to standardize the content and process of shift handoff, improve patient safety, increase patient and family involvement in the handoff process, and decrease end-of-shift overtime. This article will describe process changes made over a 4-year period as result of application of the plan-do-check-act procedure, which is an integral part of the CPI methodology, and discuss further work needed to continue to refine this critical nursing care process. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. Standards and guidelines for observational studies: quality is in the eye of the beholder.

    PubMed

    Morton, Sally C; Costlow, Monica R; Graff, Jennifer S; Dubois, Robert W

    2016-03-01

    Patient care decisions demand high-quality research. To assist those decisions, numerous observational studies are being performed. Are the standards and guidelines to assess observational studies consistent and actionable? What policy considerations should be considered to ensure decision makers can determine if an observational study is of high-quality and valid to inform treatment decisions? Based on a literature review and input from six experts, we compared and contrasted nine standards/guidelines using 23 methodological elements involved in observational studies (e.g., study protocol, data analysis, and so forth). Fourteen elements (61%) were addressed by at least seven standards/guidelines; 12 of these elements disagreed in the approach. Nine elements (39%) were addressed by six or fewer standards/guidelines. Ten elements (43%) were not actionable in at least one standard/guideline that addressed the element. The lack of observational study standard/guideline agreement may contribute to variation in study conduct; disparities in what is considered credible research; and ultimately, what evidence is adopted. A common set of agreed on standards/guidelines for conducting observational studies will benefit funders, researchers, journal editors, and decision makers. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  8. Adherence of hip and knee arthroplasty studies to RSA standardization guidelines. A systematic review.

    PubMed

    Madanat, Rami; Mäkinen, Tatu J; Aro, Hannu T; Bragdon, Charles; Malchau, Henrik

    2014-09-01

    Guidelines for standardization of radiostereometry (RSA) of implants were published in 2005 to facilitate comparison of outcomes between various research groups. In this systematic review, we determined how well studies have adhered to these guidelines. We carried out a literature search to identify all articles published between January 2000 and December 2011 that used RSA in the evaluation of hip or knee prosthesis migration. 2 investigators independently evaluated each of the studies for adherence to the 13 individual guideline items. Since some of the 13 points included more than 1 criterion, studies were assessed on whether each point was fully met, partially met, or not met. 153 studies that met our inclusion criteria were identified. 61 of these were published before the guidelines were introduced (2000-2005) and 92 after the guidelines were introduced (2006-2011). The methodological quality of RSA studies clearly improved from 2000 to 2011. None of the studies fully met all 13 guidelines. Nearly half (43) of the studies published after the guidelines demonstrated a high methodological quality and adhered at least partially to 10 of the 13 guidelines, whereas less than one-fifth (11) of the studies published before the guidelines had the same methodological quality. Commonly unaddressed guideline items were related to imaging methodology, determination of precision from double examinations, and also mean error of rigid-body fitting and condition number cutoff levels. The guidelines have improved methodological reporting in RSA studies, but adherence to these guidelines is still relatively low. There is a need to update and clarify the guidelines for clinical hip and knee arthroplasty RSA studies.

  9. A methodology for Manufacturing Execution Systems (MES) implementation

    NASA Astrophysics Data System (ADS)

    Govindaraju, Rajesri; Putra, Krisna

    2016-02-01

    A manufacturing execution system (MES) is an information systems (IS) application that bridges the gap between IS at the top level, namely enterprise resource planning (ERP), and IS at the lower levels, namely the automation systems. MES provides a medium for optimizing the manufacturing process as a whole on a real-time basis. By using MES in combination with ERP and other automation systems, a manufacturing company is expected to achieve high competitiveness. In implementing MES, functional integration - making all the components of the manufacturing system work well together - is the most difficult challenge. For this, there is an industry standard that specifies the sub-systems of a manufacturing execution system and defines the boundaries between ERP systems, MES, and other automation systems. The standard is known as ISA-95. Although the advantages of using MES have been stated in some studies, not much research has been done on how to implement MES effectively. The purpose of this study is to develop a methodology describing how an MES implementation project should be managed, utilizing the support of the ISA-95 reference model in the system development process. A proposed methodology was developed based on a general IS development methodology and then revisited in light of the specific characteristics of MES implementation projects found in an Indonesian steel manufacturing company implementation case. The case study highlighted the importance of applying an effective requirement elicitation method during the initial system assessment process, managing system interfaces and labor division in the design process, and performing a pilot deployment before putting the whole system into operation.

  10. Methodological quality of randomised controlled trials in burns care. A systematic review.

    PubMed

    Danilla, Stefan; Wasiak, Jason; Searle, Susana; Arriagada, Cristian; Pedreros, Cesar; Cleland, Heather; Spinks, Anneliese

    2009-11-01

    To evaluate the methodological quality of published randomised controlled trials (RCTs) in burn care treatment and management. Using a predetermined search strategy we searched Ovid MEDLINE (1950 to January 2008) database to identify all English RCTs related to burn care. Full text studies identified were reviewed for key demographic and methodological characteristics. Methodological trial quality was assessed using the Jadad scale. A total of 257 studies involving 14,535 patients met the inclusion criteria. The median Jadad score was 2 (out of a best possible score of 5). Information was given in the introduction and discussion sections of most RCTs, although insufficient detail was provided on randomisation, allocation concealment, and blinding. The number of RCTs increased between 1950 and 2008 (Spearman's rho=0.6129, P<0.001), although the reporting quality did not improve over the same time period (P=0.1896) and was better in RCTs with larger sample sizes (median Jadad score, 4 vs. 2 points, P<0.0001). Methodological quality did not correlate with journal impact factor (P=0.2371). The reporting standards of RCTs are highly variable and less than optimal in most cases. The advent of evidence-based medicine heralds a new approach to burns care and systematic steps are needed to improve the quality of RCTs in this field. Identifying and reviewing the existing number of RCTs not only highlights the need for burn clinicians to conduct more trials, but may also encourage burn health clinicians to consider the importance of conducting trials that follow appropriate, evidence-based standards.
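
    For readers unfamiliar with the Jadad scale used above, the sketch below scores a trial with the commonly used 0-5 formulation (randomisation, blinding, and accounting of withdrawals); the item wording is a paraphrase and the example trial is hypothetical.

      # Jadad scale, common 0-5 formulation: +1 randomisation mentioned,
      # +1 appropriate randomisation method, +1 blinding mentioned,
      # +1 appropriate blinding method, +1 withdrawals/dropouts described;
      # -1 each for an inappropriate randomisation or blinding method.
      def jadad_score(rand_mentioned, rand_appropriate, rand_inappropriate,
                      blind_mentioned, blind_appropriate, blind_inappropriate,
                      withdrawals_described):
          score = 0
          score += 1 if rand_mentioned else 0
          score += 1 if rand_appropriate else 0
          score -= 1 if rand_inappropriate else 0
          score += 1 if blind_mentioned else 0
          score += 1 if blind_appropriate else 0
          score -= 1 if blind_inappropriate else 0
          score += 1 if withdrawals_described else 0
          return max(score, 0)

      # A hypothetical open-label burn-care trial with computer-generated
      # randomisation and a full account of dropouts scores 3.
      print(jadad_score(True, True, False, False, False, False, True))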

  11. Modified Dynamic Inversion to Control Large Flexible Aircraft: What's Going On?

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.

    1999-01-01

    High performance aircraft of the future will be designed to be lighter and more maneuverable, and will operate over an ever-expanding flight envelope. One of the largest differences from the flight control perspective between current and future advanced aircraft is elasticity. Over the last decade, dynamic inversion methodology has gained considerable popularity in application to highly maneuverable fighter aircraft, which were treated as rigid vehicles. This paper explores dynamic inversion application to an advanced, highly flexible aircraft. An initial application has been made to a large flexible supersonic aircraft. In the course of controller design for this advanced vehicle, modifications were made to the standard dynamic inversion methodology. The results of this application were deemed rather promising. An analytical study has been undertaken to better understand the nature of these modifications and to determine their general applicability. This paper presents the results of this initial analytical look at the modifications to dynamic inversion to control large flexible aircraft.
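
    For context, the standard dynamic inversion (feedback linearization) law for a rigid vehicle is sketched below in generic form; the flexible-aircraft modifications examined in the paper are not reproduced here.

      % Standard dynamic inversion for a control-affine system (rigid-vehicle case).
      \begin{align}
        \dot{x} &= f(x) + g(x)\,u, \\
        u &= g(x)^{-1}\bigl(\nu - f(x)\bigr) \quad\Rightarrow\quad \dot{x} = \nu,
      \end{align}
      % where \nu is the commanded (desired) dynamics, e.g.
      % \nu = \dot{x}_{cmd} + K\,(x_{cmd} - x).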

  12. Methodological considerations for designing a community water fluoridation cessation study.

    PubMed

    Singhal, Sonica; Farmer, Julie; McLaren, Lindsay

    2017-06-01

    High-quality, up-to-date research on community water fluoridation (CWF), and especially on the implications of CWF cessation for dental health, is limited. Although CWF cessation studies have been conducted, they are few in number; one of the major reasons is the methodological complexity of conducting such a study. This article draws on a systematic review of existing cessation studies (n=15) to explore methodological considerations of conducting CWF cessation studies in future. We review nine important methodological aspects (study design, comparison community, target population, time frame, sampling strategy, clinical indicators, assessment criteria, covariates and biomarkers) and provide recommendations for planning future CWF cessation studies that examine effects on dental caries. There is no one ideal study design to answer a research question. However, recommendations proposed regarding methodological aspects to conduct an epidemiological study to observe the effects of CWF cessation on dental caries, coupled with our identification of important methodological gaps, will be useful for researchers who are looking to optimize resources to conduct such a study with standards of rigour. © 2017 Her Majesty the Queen in Right of Canada. Community Dentistry and Oral Epidemiology © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  13. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.

    1996-08-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but this data is not provided by this document.
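
    For orientation, aircraft crash impact frequency estimates of this kind are generally built from a four-factor product summed over aircraft categories; the form below and the factor definitions are a generic illustration under that assumption, not the ACRAM Standard's exact equation.

      % Generic four-factor crash impact frequency estimate:
      \begin{equation}
        F \;=\; \sum_{i} N_i \, P_i \, f_i(x, y) \, A_{\mathrm{eff}}
      \end{equation}
      % N_i    : number of operations per year for aircraft category i
      % P_i    : crash rate per operation for category i
      % f_i    : crash location probability per unit area at the site (x, y)
      % A_eff  : effective target area of the facility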

  14. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    ERIC Educational Resources Information Center

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…

  15. Design of experiments enhanced statistical process control for wind tunnel check standard testing

    NASA Astrophysics Data System (ADS)

    Phillips, Ben D.

    The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that, by utilizing design of experiments methodology in conjunction with the current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research is applicable to any wind tunnel check standard testing program.
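
    A minimal sketch of the combined idea follows: fit a simple regression model to each check standard test, then track one fitted coefficient over repeated tests on an individuals (Shewhart) control chart. The model form, the simulated data, and the moving-range limit estimate are illustrative assumptions, not the facility's actual process.

      # Fit a per-test regression and track its slope in an individuals chart.
      import numpy as np

      rng = np.random.default_rng(1)

      def fitted_slope(alpha, cn):
          # Toy per-test model: CN = b0 + b1*alpha; return the slope b1.
          X = np.column_stack([np.ones_like(alpha), alpha])
          beta, *_ = np.linalg.lstsq(X, cn, rcond=None)
          return beta[1]

      slopes = []
      for _ in range(20):                                   # 20 check standard tests
          alpha = np.linspace(-4, 10, 15)                   # angle of attack, deg
          cn = 0.08 * alpha + 0.02 + rng.normal(0, 0.01, alpha.size)
          slopes.append(fitted_slope(alpha, cn))
      slopes = np.array(slopes)

      # Individuals chart limits from the average moving range (d2 = 1.128 for n = 2).
      sigma_hat = np.abs(np.diff(slopes)).mean() / 1.128
      center = slopes.mean()
      ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat
      flagged = np.where((slopes > ucl) | (slopes < lcl))[0]
      print(f"center = {center:.4f}, UCL = {ucl:.4f}, LCL = {lcl:.4f}, flagged tests: {flagged}")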

  16. MASTER: a model to improve and standardize clinical breakpoints for antimicrobial susceptibility testing using forecast probabilities.

    PubMed

    Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael

    2017-09-01

    The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of < 0.1% applying current EUCAST CBPs. Error rates were > 0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to < 0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability. © The Author 2017. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
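
    A minimal sketch of the model's core idea follows: true zone diameters are described by a normal mixture, methodological variation is added to produce observed diameters, and the categorization error rate is estimated for candidate susceptible breakpoints. All distribution parameters and the simulation itself are illustrative assumptions, not the paper's fitted values.

      # Normal mixture for true diameters plus methodological noise; estimate the
      # probability that noise pushes an isolate across a candidate breakpoint.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(2)

      true_d = np.concatenate([rng.normal(12, 2, 400), rng.normal(26, 3, 600)])  # mm
      obs_d = true_d + rng.normal(0, 1.5, true_d.size)      # add methodological variation

      gmm = GaussianMixture(n_components=2, random_state=0).fit(obs_d.reshape(-1, 1))
      print("fitted component means (mm):", np.round(np.sort(gmm.means_.ravel()), 1))

      def error_rate(cbp_s, n=200_000):
          t = np.concatenate([rng.normal(12, 2, int(0.4 * n)), rng.normal(26, 3, int(0.6 * n))])
          o = t + rng.normal(0, 1.5, t.size)
          return np.mean((t >= cbp_s) != (o >= cbp_s))      # true vs observed category disagree

      for cbp in (19, 21, 23):
          print(f"susceptible CBP >= {cbp} mm: categorization error rate ~ {error_rate(cbp):.4f}")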

  17. The Development of Standard Operating Procedures for Social Mobilization and Community Engagement in Sierra Leone During the West Africa Ebola Outbreak of 2014-2015.

    PubMed

    Pedi, Danielle; Gillespie, Amaya; Bedson, Jamie; Jalloh, Mohamed F; Jalloh, Mohammad B; Kamara, Alusine; Bertram, Kathryn; Owen, Katharine; Jalloh, Mohamed A; Conte, Lansana

    2017-01-01

    This article describes the development of standard operating procedures (SOPs) for social mobilization and community engagement (SM/CE) in Sierra Leone during the Ebola outbreak of 2014-2015. It aims to (a) explain the rationale for a standardized approach, (b) describe the methodology used to develop the resulting SOPs, and (c) discuss the implications of the SOPs for future outbreak responses. Mixed methodologies were applied, including analysis of data on Ebola-related knowledge, attitudes, and practices; consultation through a national forum; and a series of workshops with more than 250 participants active in SM/CE in seven districts with recent confirmed cases. Specific challenges, best practices, and operational models were identified in relation to (a) the quality of SM/CE approaches; (b) coordination and operational structures; and (c) integration with Ebola services, including case management, burials, quarantine, and surveillance. This information was synthesized and codified into the SOPs, which include principles, roles, and actions for partners engaging in SM/CE as part of the Ebola response. This experience points to the need for a set of global principles and standards for meaningful SM/CE that can be rapidly adapted as a high-priority response component at the outset of future health and humanitarian crises.

  18. Conventionalism and Methodological Standards in Contending with Skepticism about Uncertainty

    NASA Astrophysics Data System (ADS)

    Brumble, K. C.

    2012-12-01

    What it means to measure and interpret confidence and uncertainty in a result is often particular to a specific scientific community and its methodology of verification. Additionally, methodology in the sciences varies greatly across disciplines and scientific communities. Understanding the accuracy of predictions of a particular science thus depends largely upon having an intimate working knowledge of the methods, standards, and conventions utilized and underpinning discoveries in that scientific field. Thus, valid criticism of scientific predictions and discoveries must be conducted by those who are literate in the field in question: they must have intimate working knowledge of the methods of the particular community and of the particular research under question. The interpretation and acceptance of uncertainty is one such shared, community-based convention. In the philosophy of science, this methodological and community-based way of understanding scientific work is referred to as conventionalism. By applying the conventionalism of historian and philosopher of science Thomas Kuhn to recent attacks upon methods of multi-proxy mean temperature reconstructions, I hope to illuminate how climate skeptics and their adherents fail to appreciate the need for community-based fluency in the methodological standards for understanding uncertainty shared by the wider climate science community. Further, I will flesh out a picture of climate science community standards of evidence and statistical argument following the work of philosopher of science Helen Longino. I will describe how failure to appreciate the conventions of professionalism and standards of evidence accepted in the climate science community results in the application of naïve falsification criteria. Appeal to naïve falsification in turn has allowed scientists outside the standards and conventions of the mainstream climate science community to consider themselves and to be judged by climate skeptics as valid critics of particular statistical reconstructions with naïve and misapplied methodological criticism. Examples will include the skeptical responses to multi-proxy mean temperature reconstructions and congressional hearings criticizing the work of Michael Mann et al.'s Hockey Stick.

  19. A Methodology for the Hybridization Based in Active Components: The Case of cGA and Scatter Search.

    PubMed

    Villagra, Andrea; Alba, Enrique; Leguizamón, Guillermo

    2016-01-01

    This work presents the results of a new methodology for hybridizing metaheuristics. By first locating the active components (parts) of one algorithm and then inserting them into a second one, we can build efficient and accurate optimization, search, and learning algorithms. This gives a concrete way of constructing new techniques that contrasts with the widespread ad hoc way of hybridizing. In this paper, the enhanced algorithm is a Cellular Genetic Algorithm (cGA), which has been successfully used in the past to find solutions to hard optimization problems. In order to extend and corroborate the use of active components as an emerging hybridization methodology, we propose here the use of active components taken from Scatter Search (SS) to improve cGA. The results obtained over a varied set of benchmarks are highly satisfactory in efficacy and efficiency when compared with a standard cGA. Moreover, the proposed hybrid approach (i.e., cGA+SS) has shown encouraging results with regard to earlier applications of our methodology.

  20. Methodological Quality of National Guidelines for Pediatric Inpatient Conditions

    PubMed Central

    Hester, Gabrielle; Nelson, Katherine; Mahant, Sanjay; Eresuma, Emily; Keren, Ron; Srivastava, Rajendu

    2014-01-01

    Background Guidelines help inform standardization of care for quality improvement (QI). The Pediatric Research in Inpatient Settings (PRIS) network published a prioritization list of inpatient conditions with high prevalence, cost, and variation in resource utilization across children’s hospitals. The methodological quality of guidelines for priority conditions is unknown. Objective To rate the methodological quality of national guidelines for 20 priority pediatric inpatient conditions. Design We searched sources including PubMed for national guidelines published 2002–2012. Guidelines specific to one organism, test or treatment, or institution were excluded. Guidelines were rated by two raters using a validated tool (AGREE II) with an overall rating on a 7-point scale (7 = highest). Inter-rater reliability was measured with a weighted kappa coefficient. Results 17 guidelines met inclusion criteria for 13 conditions; the remaining 7 conditions yielded no relevant national guidelines. The highest methodological quality guidelines were for asthma, tonsillectomy, and bronchiolitis (mean overall rating 7, 6.5 and 6.5 respectively); the lowest were for sickle cell disease (2 guidelines) and dental caries (mean overall rating 4, 3.5, and 3 respectively). The overall weighted kappa was 0.83 (95% confidence interval 0.78–0.87). Conclusions We identified a group of moderate to high methodological quality national guidelines for priority pediatric inpatient conditions. Hospitals should consider these guidelines to inform QI initiatives. PMID:24677729

  1. Event-driven, pattern-based methodology for cost-effective development of standardized personal health devices.

    PubMed

    Martínez-Espronceda, Miguel; Trigo, Jesús D; Led, Santiago; Barrón-González, H Gilberto; Redondo, Javier; Baquero, Alfonso; Serrano, Luis

    2014-11-01

    Experiences applying standards in personal health devices (PHDs) show an inherent trade-off between interoperability and costs (in terms of processing load and development time). Therefore, reducing hardware and software costs as well as time-to-market is crucial for standards adoption. The ISO/IEEE11073 PHD family of standards (also referred to as X73PHD) provides interoperable communication between PHDs and aggregators. Nevertheless, the responsibility of achieving inexpensive implementations of X73PHD in limited resource microcontrollers falls directly on the developer. Hence, the authors previously presented a methodology based on patterns to implement X73-compliant PHDs into devices with low-voltage low-power constraints. That version was based on multitasking, which required additional features and resources. This paper therefore presents an event-driven evolution of the patterns-based methodology for cost-effective development of standardized PHDs. The results of comparing between the two versions showed that the mean values of decrease in memory consumption and cycles of latency are 11.59% and 45.95%, respectively. In addition, several enhancements in terms of cost-effectiveness and development time can be derived from the new version of the methodology. Therefore, the new approach could help in producing cost-effective X73-compliant PHDs, which in turn could foster the adoption of standards. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  2. Ultrasound for assessing disease activity in IBD patients: a systematic review of activity scores.

    PubMed

    Bots, S; Nylund, K; Löwenberg, M; Gecse, K; Gilja, O H; D'Haens, G

    2018-04-19

    Ultrasound (US) indices for assessing disease activity in IBD patients have never been critically reviewed. We aimed to systematically review the quality and reliability of available ultrasound (US) indices compared with reference standards for grading disease activity in IBD patients. PubMed, Embase and MEDLINE were searched from 1990 until June 2017. Relevant publications were identified through full text review after initial screening by 2 investigators. Data on methodology and index characteristics were collected. Study quality was assessed with a modified version of the QUADAS-2 tool for risk of bias assessment. Of 20 studies with a US index, 11 met the inclusion criteria; of these, 7 studied CD activity indices and 4 studied UC activity indices. Parameters that were used in these indices included bowel wall thickness (BWT), Doppler signal (DS), wall layer stratification (WLS), compressibility, peristalsis, haustrations, fatty wrapping, contrast enhancement (CE) and strain pattern. Study quality was graded high in 5 studies, moderate in 3 studies and low in 3 studies. Ileocolonoscopy was used as the reference standard in 9 studies. In 1 study a combined index of ileocolonoscopy and barium contrast radiography was used as the reference standard, and in 1 study histology was used. Only 5 studies used an established endoscopic index for comparison with US. Several US indices for assessing disease activity in IBD are available; however, the methodology for development was suboptimal in most studies. For the development of future indices, a stringent methodological design is required.

  3. A primer on standards setting as it applies to surgical education and credentialing.

    PubMed

    Cendan, Juan; Wier, Daryl; Behrns, Kevin

    2013-07-01

    Surgical technological advances in the past three decades have led to dramatic reductions in the morbidity associated with abdominal procedures and permanently altered the surgical practice landscape. Significant changes continue apace including surgical robotics, natural orifice-based surgery, and single-incision approaches. These disruptive technologies have on occasion been injurious to patients, and high-stakes assessment before adoption of new technologies would be reasonable. We reviewed the drivers for well-established psychometric techniques available for the standards-setting process. We present a series of examples that are relevant in the surgical domain including standards setting for knowledge and skills assessments. Defensible standards for knowledge and procedural skills will likely become part of surgical clinical practice. Understanding the methodology for determining standards should position the surgical community to assist in the process and lead within their clinical settings as standards are considered that may affect patient safety and physician credentialing.

  4. 77 FR 55737 - Small Business Size Standards: Finance and Insurance and Management of Companies and Enterprises

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    ...The U.S. Small Business Administration (SBA) proposes to increase small business size standards for 37 industries in North American Industry Classification System (NAICS) Sector 52, Finance and Insurance, and for two industries in NAICS Sector 55, Management of Companies and Enterprises. In addition, SBA proposes to change the measure of size from average assets to average receipts for NAICS 522293, International Trade Financing. As part of its ongoing comprehensive size standards review, SBA evaluated all receipts based and assets based size standards in NAICS Sectors 52 and 55 to determine whether they should be retained or revised. This proposed rule is one of a series of proposed rules that will review size standards of industries grouped by NAICS Sector. SBA issued a White Paper entitled ``Size Standards Methodology'' and published a notice in the October 21, 2009 issue of the Federal Register to advise the public that the document is available on its Web site at www.sba.gov/size for public review and comments. The ``Size Standards Methodology'' White Paper explains how SBA establishes, reviews, and modifies its receipts based and employee based small business size standards. In this proposed rule, SBA has applied its methodology that pertains to establishing, reviewing, and modifying a receipts based size standard.

  5. A Quantitative Examination of Critical Success Factors Comparing Agile and Waterfall Project Management Methodologies

    ERIC Educational Resources Information Center

    Pedersen, Mitra

    2013-01-01

    This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires use of project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…

  6. A multicenter study to standardize reporting and analyses of fluorescence-activated cell-sorted murine intestinal epithelial cells

    PubMed Central

    Magness, Scott T.; Puthoff, Brent J.; Crissey, Mary Ann; Dunn, James; Henning, Susan J.; Houchen, Courtney; Kaddis, John S.; Kuo, Calvin J.; Li, Linheng; Lynch, John; Martin, Martin G.; May, Randal; Niland, Joyce C.; Olack, Barbara; Qian, Dajun; Stelzner, Matthias; Swain, John R.; Wang, Fengchao; Wang, Jiafang; Wang, Xinwei; Yan, Kelley; Yu, Jian

    2013-01-01

    Fluorescence-activated cell sorting (FACS) is an essential tool for studies requiring isolation of distinct intestinal epithelial cell populations. Inconsistent or lack of reporting of the critical parameters associated with FACS methodologies has complicated interpretation, comparison, and reproduction of important findings. To address this problem a comprehensive multicenter study was designed to develop guidelines that limit experimental and data reporting variability and provide a foundation for accurate comparison of data between studies. Common methodologies and data reporting protocols for tissue dissociation, cell yield, cell viability, FACS, and postsort purity were established. Seven centers tested the standardized methods by FACS-isolating a specific crypt-based epithelial population (EpCAM+/CD44+) from murine small intestine. Genetic biomarkers for stem/progenitor (Lgr5 and Atoh 1) and differentiated cell lineages (lysozyme, mucin2, chromogranin A, and sucrase isomaltase) were interrogated in target and control populations to assess intra- and intercenter variability. Wilcoxon's rank sum test on gene expression levels showed limited intracenter variability between biological replicates. Principal component analysis demonstrated significant intercenter reproducibility among four centers. Analysis of data collected by standardized cell isolation methods and data reporting requirements readily identified methodological problems, indicating that standard reporting parameters facilitate post hoc error identification. These results indicate that the complexity of FACS isolation of target intestinal epithelial populations can be highly reproducible between biological replicates and different institutions by adherence to common cell isolation methods and FACS gating strategies. This study can be considered a foundation for continued method development and a starting point for investigators that are developing cell isolation expertise to study physiology and pathophysiology of the intestinal epithelium. PMID:23928185
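
    A minimal sketch of the reported analysis pattern follows: a Wilcoxon rank-sum test between biological replicates within a center, and principal component analysis of per-center expression profiles. The marker list mirrors the genes named above, but all expression values are simulated placeholders, not study data.

      # Wilcoxon rank-sum between replicates, then PCA across centers.
      import numpy as np
      from scipy.stats import ranksums
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(3)
      markers = ["Lgr5", "Atoh1", "Lyz", "Muc2", "ChgA", "SI"]

      # Within-center check: one marker, two biological replicates (log2 expression).
      rep1 = rng.normal(5.0, 0.3, 8)
      rep2 = rng.normal(5.1, 0.3, 8)
      stat, p = ranksums(rep1, rep2)
      print(f"Lgr5, replicate 1 vs replicate 2: p = {p:.3f}")

      # Across-center check: one mean profile per center (7 centers x 6 markers).
      profiles = rng.normal(5.0, 0.5, (7, len(markers)))
      scores = PCA(n_components=2).fit_transform(profiles)
      print("PCA scores (centers x first 2 components):")
      print(np.round(scores, 2))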

  7. Effects of Mat Pilates on Physical Functional Performance of Older Adults: A Meta-analysis of Randomized Controlled Trials.

    PubMed

    Bueno de Souza, Roberta Oliveira; Marcon, Liliane de Faria; Arruda, Alex Sandro Faria de; Pontes Junior, Francisco Luciano; Melo, Ruth Caldeira de

    2018-06-01

    The present meta-analysis aimed to examine evidence from randomized controlled trials to determine the effects of mat Pilates on measures of physical functional performance in the older population. A search was conducted in the MEDLINE/PubMed, Scopus, Scielo, and PEDro databases between February and March 2017. Only randomized controlled trials that were written in English, included subjects aged 60 yrs who used mat Pilates exercises, included a comparison (control) group, and reported performance-based measures of physical function (balance, flexibility, muscle strength, and cardiorespiratory fitness) were included. The methodological quality of the studies was analyzed according to the PEDro scale and the best-evidence synthesis. The meta-analysis was conducted with the Review Manager 5.3 software. The search retrieved 518 articles, nine of which fulfilled the inclusion criteria. High methodological quality was found in five of these studies. Meta-analysis indicated a large effect of mat Pilates on dynamic balance (standardized mean difference = 1.10, 95% confidence interval = 0.29-1.90), muscle strength (standardized mean difference = 1.13, 95% confidence interval = 0.30-1.96), flexibility (standardized mean difference = 1.22, 95% confidence interval = 0.39-2.04), and cardiorespiratory fitness (standardized mean difference = 1.48, 95% confidence interval = 0.42-2.54) of elderly subjects. There is evidence that mat Pilates improves dynamic balance, lower limb strength, hip and lower back flexibility, and cardiovascular endurance in elderly individuals. Furthermore, high-quality studies are necessary to clarify the effects of mat Pilates on other physical functional measurements among older adults.
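
    As a worked illustration of the effect measure used above, the sketch below computes a standardized mean difference (Hedges' g) and its 95% confidence interval for a single hypothetical trial; the group sizes and balance scores are invented, not taken from the included studies.

      # Standardized mean difference (Hedges' g) with 95% CI for one trial.
      import math

      n1, m1, sd1 = 20, 54.2, 4.1    # hypothetical Pilates group
      n2, m2, sd2 = 20, 50.0, 4.5    # hypothetical control group

      sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
      d = (m1 - m2) / sd_pooled                  # Cohen's d
      g = (1 - 3 / (4 * (n1 + n2) - 9)) * d      # small-sample (Hedges) correction
      se = math.sqrt((n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2)))
      print(f"g = {g:.2f}, 95% CI = ({g - 1.96 * se:.2f}, {g + 1.96 * se:.2f})")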

  8. 75 FR 5589 - Science Advisory Board Staff Office; Request for Public Nominations of Experts To Conduct a Peer...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-03

    ... Conductivity Using Field Data: An Adaptation of the U.S. EPA's Standard Methodology for Deriving Water Quality... Adaptation of the U.S. EPA's Standard Methodology for Deriving Water Quality Criteria'' DATES: Nominations... Deriving Water Quality Criteria'' should be directed to Dr. Michael Slimak, ORD's Associate Director of...

  9. Methodology issues in implementation science.

    PubMed

    Newhouse, Robin; Bobay, Kathleen; Dykes, Patricia C; Stevens, Kathleen R; Titler, Marita

    2013-04-01

    Putting evidence into practice at the point of care delivery requires an understanding of implementation strategies that work, in what context and how. To identify methodological issues in implementation science using 4 studies as cases and make recommendations for further methods development. Four cases are presented and methodological issues identified. For each issue raised, evidence on the state of the science is described. Issues in implementation science identified include diverse conceptual frameworks, potential weaknesses in pragmatic study designs, and the paucity of standard concepts and measurement. Recommendations to advance methods in implementation include developing a core set of implementation concepts and metrics, generating standards for implementation methods including pragmatic trials, mixed methods designs, complex interventions and measurement, and endorsing reporting standards for implementation studies.

  10. A normative price for a manufactured product: The SAMICS methodology. Volume 2: Analysis

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. G.

    1979-01-01

    The Solar Array Manufacturing Industry Costing Standards provide standard formats, data, assumptions, and procedures for determining the price a hypothetical solar array manufacturer would have to be able to obtain in the market to realize a specified after-tax rate of return on equity for a specified level of production. The methodology and its theoretical background are presented. The model is sufficiently general to be used in any production-line manufacturing environment. Implementation of this methodology by the Solar Array Manufacturing Industry Simulation computer program is discussed.

  11. Methodological Measurement Fruitfulness of Exploratory Structural Equation Modeling (ESEM): New Approaches to Key Substantive Issues in Motivation and Engagement

    ERIC Educational Resources Information Center

    Marsh, Herbert W.; Liem, Gregory Arief D.; Martin, Andrew J.; Morin, Alexandre J. S.; Nagengast, Benjamin

    2011-01-01

    The most popular measures of multidimensional constructs typically fail to meet standards of good measurement: goodness of fit, measurement invariance, lack of differential item functioning, and well-differentiated factors that are not so highly correlated as to detract from their discriminant validity. Part of the problem, the authors argue, is…

  12. Methodology Used for Gas Analysis and Control of Trace Chemical Contaminants at a Hyperbaric Facility. 1. Gas Sampling

    DTIC Science & Technology

    1988-12-01

    made using a gas sampling valve. All instruments were calibrated using gravimetric standards certified to ±1-2% relative of stated value (Air Products and Chemicals, Inc., Allentown...cannister - 985410 7. High Purity Gas Cylinder Regulators - several sources Air Products and Chemicals, Inc., P.O. Box 1536, Washington, DC 20013 (301

  13. Case Study Analyses of Play Behaviors of 12-Month-Old Infants Later Diagnosed with Autism

    ERIC Educational Resources Information Center

    Mulligan, Shelley

    2015-01-01

    Case study research methodology was used to describe the play behaviors of three infants at 12 months of age, who were later diagnosed with an autism spectrum disorder. Data included standardized test scores, and analyses of video footage of semi-structured play sessions from infants identified as high risk for autism, because of having a sibling…

  14. Using the Malcolm Baldrige "are we making progress" survey for organizational self-assessment and performance improvement.

    PubMed

    Shields, Judith A; Jennings, Jerry L

    2013-01-01

    A national healthcare company applied the Malcolm Baldrige Criteria for Performance Excellence and its "Are We Making Progress?" survey as an annual organizational self-assessment to identify areas for improvement. For 6 years, Liberty Healthcare Corporation reviewed the survey results on an annual basis to analyze positive and negative trends, monitor company progress toward targeted goals and develop new initiatives to address emerging areas for improvement. As such, the survey provided a simple and inexpensive methodology to gain useful information from employees at all levels and from multiple service sites and business sectors. In particular, it provided a valuable framework for assessing and improving the employees' commitment to the company's mission and values, high standards and ethics, quality of work, and customer satisfaction. The methodology also helped the company to incorporate the philosophy and principles of continuous quality improvement in a unified fashion. Corporate and local leadership used the same measure to evaluate the performance of individual programs relative to each other, to the company as a whole, and to the "best practices" standard of highly successful companies that received the Malcolm Baldrige National Quality Award. © 2012 National Association for Healthcare Quality.

  15. Effect of High Intensity Interval and Continuous Swimming Training on Body Mass Adiposity Level and Serum Parameters in High-Fat Diet Fed Rats

    PubMed Central

    da Rocha, Guilherme L.; Crisp, Alex H.; de Oliveira, Maria R. M.; da Silva, Carlos A.; Silva, Jadson O.; Duarte, Ana C. G. O.; Sene-Fiorese, Marcela; Verlengia, Rozangela

    2016-01-01

    This study aimed to investigate the effects of interval and continuous training on the body mass gain and adiposity levels of rats fed a high-fat diet. Forty-eight male Sprague-Dawley rats were randomly divided into two groups, standard diet and high-fat diet, and received their respective diets for a period of four weeks without exercise stimuli. After this period, the animals were randomly divided into six groups (n = 8): control standard diet (CS), control high-fat diet (CH), continuous training standard diet (CTS), continuous training high-fat diet (CTH), interval training standard diet (ITS), and interval training high-fat diet (ITH). The interval and continuous training consisted of a swimming exercise performed over eight weeks. CH rats had greater body mass gain, sum of adipose tissues mass, and lower serum high density lipoprotein values than CS. The trained groups showed lower values of feed intake, caloric intake, body mass gain, and adiposity levels compared with the CH group. No significant differences were observed between the trained groups (CTS versus ITS and CTH versus ITH) on body mass gains and adiposity levels. In conclusion, both training methodologies were shown to be effective in controlling body mass gain and adiposity levels in high-fat diet fed rats. PMID:26904718

  16. Head-To-Head Comparison Between High- and Standard-b-Value DWI for Detecting Prostate Cancer: A Systematic Review and Meta-Analysis.

    PubMed

    Woo, Sungmin; Suh, Chong Hyun; Kim, Sang Youn; Cho, Jeong Yeon; Kim, Seung Hyup

    2018-01-01

    The purpose of this study was to perform a head-to-head comparison between high-b-value (>1000 s/mm²) and standard-b-value (800-1000 s/mm²) DWI regarding diagnostic performance in the detection of prostate cancer. The MEDLINE and EMBASE databases were searched up to April 1, 2017. The analysis included diagnostic accuracy studies in which high- and standard-b-value DWI were used for prostate cancer detection with histopathologic examination as the reference standard. Methodologic quality was assessed with the revised Quality Assessment of Diagnostic Accuracy Studies tool. Sensitivity and specificity of all studies were calculated and were pooled and plotted in a hierarchic summary ROC plot. Meta-regression and multiple-subgroup analyses were performed to compare the diagnostic performances of high- and standard-b-value DWI. Eleven studies (789 patients) were included. High-b-value DWI had greater pooled sensitivity (0.80 [95% CI, 0.70-0.87]) (p = 0.03) and specificity (0.92 [95% CI, 0.87-0.95]) (p = 0.01) than standard-b-value DWI (sensitivity, 0.78 [95% CI, 0.66-0.86]; specificity, 0.87 [95% CI, 0.77-0.93]) (p < 0.01). Multiple-subgroup analyses showed that specificity was consistently higher for high- than for standard-b-value DWI (p ≤ 0.05). Sensitivity was significantly higher for high- than for standard-b-value DWI only in the following subgroups: peripheral zone only, transition zone only, multiparametric protocol (DWI and T2-weighted imaging), visual assessment of DW images, and per-lesion analysis (p ≤ 0.04). In a head-to-head comparison, high-b-value DWI had significantly better sensitivity and specificity for detection of prostate cancer than did standard-b-value DWI. Multiple-subgroup analyses showed that specificity was consistently superior for high-b-value DWI.

  17. Process optimization of an auger pyrolyzer with heat carrier using response surface methodology.

    PubMed

    Brown, J N; Brown, R C

    2012-01-01

    A 1 kg/h auger reactor utilizing mechanical mixing of steel shot heat carrier was used to pyrolyze red oak wood biomass. Response surface methodology was employed using a circumscribed central composite design of experiments to optimize the system. Factors investigated were: heat carrier inlet temperature and mass flow rate, rotational speed of screws in the reactor, and volumetric flow rate of sweep gas. Conditions for maximum bio-oil and minimum char yields were high flow rate of sweep gas (3.5 standard L/min), high heat carrier temperature (∼600 °C), high auger speeds (63 RPM) and high heat carrier mass flow rates (18 kg/h). Regression models for bio-oil and char yields are described including identification of a novel interaction effect between heat carrier mass flow rate and auger speed. Results suggest that auger reactors, which are rarely described in literature, are well suited for bio-oil production. The reactor achieved liquid yields greater than 73 wt.%. Copyright © 2011 Elsevier Ltd. All rights reserved.
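
    For readers unfamiliar with the approach, the sketch below builds a circumscribed central composite design for two coded factors and fits a second-order response surface by least squares; the factor labels and yield values are invented placeholders rather than the four-factor design and measurements reported in the study.

      import numpy as np
      from itertools import product

      # Two illustrative coded factors (e.g., heat-carrier temperature and auger speed);
      # the real study screened four factors and measured bio-oil and char yields.
      alpha = np.sqrt(2)                                   # axial distance for a circumscribed CCD, k = 2
      factorial = np.array(list(product([-1, 1], repeat=2)), dtype=float)
      axial = np.array([[-alpha, 0], [alpha, 0], [0, -alpha], [0, alpha]])
      center = np.zeros((3, 2))
      X = np.vstack([factorial, axial, center])            # 11 runs in total

      # Placeholder bio-oil yields for each run (wt.%), NOT the published data.
      y = np.array([60, 68, 65, 73, 58, 72, 62, 70, 71, 71, 72])

      # Second-order model: intercept, linear, interaction, and quadratic terms.
      design = np.column_stack([
          np.ones(len(X)), X[:, 0], X[:, 1],
          X[:, 0] * X[:, 1], X[:, 0] ** 2, X[:, 1] ** 2,
      ])
      coef, *_ = np.linalg.lstsq(design, y, rcond=None)
      print("fitted coefficients (b0, b1, b2, b12, b11, b22):", np.round(coef, 3))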

  18. The method of selecting an integrated development territory for the high-rise unique constructions

    NASA Astrophysics Data System (ADS)

    Sheina, Svetlana; Shevtsova, Elina; Sukhinin, Alexander; Priss, Elena

    2018-03-01

    On the basis of data provided by the Department of Architecture and Urban Planning of the city of Rostov-on-Don, the problem of selecting the territory whose complex development should be prioritized for the construction of high-rise and unique buildings is solved. The objective of the study was to develop a methodology for selecting such an area and to apply the proposed method to the evaluation of four territories slated for complex development. Along with standard indicators of complex evaluation, the developed method considers additional indicators that assess a territory from the standpoint of high-rise unique construction. The final result of the study is a ranking of the functional priority of the areas that takes into account the construction of both residential and public and business objects of unique high-rise construction. The use of the developed methodology will allow investors and customers to assess the investment attractiveness of a future unique construction project on the proposed site.

  19. Applying an Empirical Hydropathic Forcefield in Refinement May Improve Low-Resolution Protein X-Ray Crystal Structures

    PubMed Central

    Koparde, Vishal N.; Scarsdale, J. Neel; Kellogg, Glen E.

    2011-01-01

    Background The quality of X-ray crystallographic models for biomacromolecules refined from data obtained at high-resolution is assured by the data itself. However, at low-resolution, >3.0 Å, additional information is supplied by a forcefield coupled with an associated refinement protocol. These resulting structures are often of lower quality and thus unsuitable for downstream activities like structure-based drug discovery. Methodology An X-ray crystallography refinement protocol that enhances standard methodology by incorporating energy terms from the HINT (Hydropathic INTeractions) empirical forcefield is described. This protocol was tested by refining synthetic low-resolution structural data derived from 25 diverse high-resolution structures, and referencing the resulting models to these structures. The models were also evaluated with global structural quality metrics, e.g., Ramachandran score and MolProbity clashscore. Three additional structures, for which only low-resolution data are available, were also re-refined with this methodology. Results The enhanced refinement protocol is most beneficial for reflection data at resolutions of 3.0 Å or worse. At the low-resolution limit, ≥4.0 Å, the new protocol generated models with Cα positions that have RMSDs that are 0.18 Å more similar to the reference high-resolution structure, Ramachandran scores improved by 13%, and clashscores improved by 51%, all in comparison to models generated with the standard refinement protocol. The hydropathic forcefield terms are at least as effective as Coulombic electrostatic terms in maintaining polar interaction networks, and significantly more effective in maintaining hydrophobic networks, as synthetic resolution is decremented. Even at resolutions ≥4.0 Å, these latter networks are generally native-like, as measured with a hydropathic interactions scoring tool. PMID:21246043

  20. Split-screen display system and standardized methods for ultrasound image acquisition and multi-frame data processing

    NASA Technical Reports Server (NTRS)

    Selzer, Robert H. (Inventor); Hodis, Howard N. (Inventor)

    2011-01-01

    A standardized acquisition methodology assists operators to accurately replicate high resolution B-mode ultrasound images obtained over several spaced-apart examinations utilizing a split-screen display in which the arterial ultrasound image from an earlier examination is displayed on one side of the screen while a real-time "live" ultrasound image from a current examination is displayed next to the earlier image on the opposite side of the screen. By viewing both images, whether simultaneously or alternately, while manually adjusting the ultrasound transducer, an operator is able to bring into view the real-time image that best matches a selected image from the earlier ultrasound examination. Utilizing this methodology, dynamic material properties of arterial structures, such as IMT and diameter, are measured in a standard region over successive image frames. Each frame of the sequence has its echo edge boundaries automatically determined by using the immediately prior frame's true echo edge coordinates as initial boundary conditions. Computerized echo edge recognition and tracking over multiple successive image frames enhances measurement of arterial diameter and IMT and allows for improved vascular dimension measurements, including vascular stiffness and IMT determinations.

  1. Validation of mid-infrared spectroscopy for macronutrient analysis of human milk.

    PubMed

    Parat, S; Groh-Wargo, S; Merlino, S; Wijers, C; Super, D M

    2017-07-01

    Human milk has considerable variation in its composition. Hence, the nutrient profile is only an estimate and can result in under- or over-estimation of the intake of preterm infants. Mid-infrared (MIR) spectroscopy is an evolving technique for analyzing human milk but needs validation before use in clinical practice. Human milk samples from 35 mothers delivering at 35 weeks to term gestation were analyzed for macronutrients by MIR spectroscopy and by standard laboratory methods using Kjeldahl assay for protein, Mojonnier assay for fat and high-pressure liquid chromatography assay for lactose. MIR analysis of the macronutrients in human milk correlated well with standard laboratory tests with intraclass correlation coefficients of 0.997 for fat, 0.839 for protein and 0.776 for lactose. Agreement between the two methods was excellent for fat, and moderate for protein and lactose (P<0.001). This methodological paper provides evidence that MIR spectroscopy can be used to analyze macronutrient composition of human milk. Agreement between the methodologies varies by macronutrient.

  2. A theoretical-experimental methodology for assessing the sensitivity of biomedical spectral imaging platforms, assays, and analysis methods.

    PubMed

    Leavesley, Silas J; Sweat, Brenner; Abbott, Caitlyn; Favreau, Peter; Rich, Thomas C

    2018-01-01

    Spectral imaging technologies have been used for many years by the remote sensing community. More recently, these approaches have been applied to biomedical problems, where they have shown great promise. However, biomedical spectral imaging has been complicated by the high variance of biological data and the reduced ability to construct test scenarios with fixed ground truths. Hence, it has been difficult to objectively assess and compare biomedical spectral imaging assays and technologies. Here, we present a standardized methodology that allows assessment of the performance of biomedical spectral imaging equipment, assays, and analysis algorithms. This methodology incorporates real experimental data and a theoretical sensitivity analysis, preserving the variability present in biomedical image data. We demonstrate that this approach can be applied in several ways: to compare the effectiveness of spectral analysis algorithms, to compare the response of different imaging platforms, and to assess the level of target signature required to achieve a desired performance. Results indicate that it is possible to compare even very different hardware platforms using this methodology. Future applications could include a range of optimization tasks, such as maximizing detection sensitivity or acquisition speed, providing high utility for investigators ranging from design engineers to biomedical scientists. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Natural background levels and threshold values for groundwater in fluvial Pleistocene and Tertiary marine aquifers in Flanders, Belgium

    NASA Astrophysics Data System (ADS)

    Coetsiers, Marleen; Blaser, Petra; Martens, Kristine; Walraevens, Kristine

    2009-05-01

    Aquifers from the same typology can have strongly different groundwater chemistry. Deducing the groundwater quality of less well-characterized aquifers from well-documented aquifers belonging to the same typology should be done with great caution, and can only be considered a preliminary approach. In the EU's 6th FP BRIDGE project "Background cRiteria for the IDentification of Groundwater thrEsholds", a methodology for the derivation of threshold values (TV) for groundwater bodies is proposed. This methodology is tested on four aquifers in Flanders of the sand and gravel typology. The methodology works well for all but the Ledo-Paniselian aquifer, where the subdivision into a fresh and saline part is disproved, as a gradual natural transition from fresh to saline conditions in the aquifer is observed. The 90th percentile is proposed as the natural background level (NBL) for the unconfined Pleistocene deposits, ascribing the outliers to possible influence of pollution. For the Tertiary aquifers, high values for different parameters have a natural origin and the 97.7th percentile is preferred as the NBL. The methodology leads to high TVs for parameters presenting low NBLs when compared to the standard used as a reference. This would allow for substantial anthropogenic inputs of these parameters.

  4. Modern proposal of methodology for retrieval of characteristic synthetic rainfall hyetographs

    NASA Astrophysics Data System (ADS)

    Licznar, Paweł; Burszta-Adamiak, Ewa; Łomotowski, Janusz; Stańczyk, Justyna

    2017-11-01

    The modern engineering practice of designing and modelling complex drainage systems is based on hydrodynamic modelling and has a probabilistic character. Its practical application requires a change in the rainfall models accepted as input. Previously used artificial rainfall models of simplified form, e.g. block precipitation or Euler's type II model rainfall, are no longer sufficient. There is an urgent need to clarify the methodology for deriving standardized rainfall hyetographs that take into consideration the specifics of local storm rainfall temporal dynamics. The aim of the paper is to present a proposal for an innovative methodology for determining standardized rainfall hyetographs, based on statistical processing of a collection of actual local precipitation records. The proposed methodology is based on the classification of standardized rainfall hyetographs with the use of cluster analysis. Its application is presented using the example of selected rain gauges located in Poland. The synthetic rainfall hyetographs obtained as a final result may be used for hydrodynamic modelling of sewerage systems, including probabilistic determination of the necessary capacity of retention reservoirs.
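
    The clustering step can be illustrated with a minimal sketch: synthetic rainfall events are converted to dimensionless cumulative mass curves and grouped with k-means, whose centroids play the role of characteristic standardized hyetographs; the event data and the number of clusters are assumptions, not the gauges or settings used in the paper.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)

      # Synthetic rainfall events: random intensity profiles resampled to 10 time steps.
      events = rng.gamma(shape=1.5, scale=1.0, size=(200, 10))

      # Standardize each event to a dimensionless mass curve: cumulative depth scaled
      # to [0, 1] so events of different totals and durations become comparable.
      cum = np.cumsum(events, axis=1)
      mass_curves = cum / cum[:, -1:]

      # Cluster the standardized hyetographs; the cluster centroids serve as the
      # characteristic synthetic hyetographs (number of clusters is an assumption).
      km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(mass_curves)
      for i, centroid in enumerate(km.cluster_centers_):
          print(f"cluster {i}: {np.sum(km.labels_ == i)} events, centroid {np.round(centroid, 2)}")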

  5. Methodological Issues in Meta-Analyzing Standard Deviations: Comment on Bond and DePaulo (2008)

    ERIC Educational Resources Information Center

    Pigott, Therese D.; Wu, Meng-Jia

    2008-01-01

    In this comment on C. F. Bond and B. M. DePaulo, the authors raise methodological concerns about the approach used to analyze the data. The authors suggest further refinement of the procedures used, and they compare the approach taken by Bond and DePaulo with standard methods for meta-analysis. (Contains 1 table and 2 figures.)

  6. On the development of a methodology for extensive in-situ and continuous atmospheric CO2 monitoring

    NASA Astrophysics Data System (ADS)

    Wang, K.; Chang, S.; Jhang, T.

    2010-12-01

    Carbon dioxide is recognized as the dominant greenhouse gas contributing to anthropogenic global warming. Stringent controls on carbon dioxide emissions are viewed as necessary steps in controlling atmospheric carbon dioxide concentrations. From the viewpoint of policy making, regulation of carbon dioxide emissions and its monitoring are key to the success of stringent emission controls. In particular, extensive atmospheric CO2 monitoring is a crucial step to ensure that CO2 emission control strategies are closely followed. In this work we develop a methodology that enables reliable and accurate in-situ and continuous atmospheric CO2 monitoring for policy making. The methodology comprises the use of a gas filter correlation (GFC) instrument for in-situ CO2 monitoring, the use of CO2 working standards accompanying the continuous measurements, and the use of NOAA WMO CO2 standard gases for calibrating the working standards. The use of GFC instruments enables a 1-second data sampling frequency, with the interference of water vapor removed by an added dryer. The CO2 measurements are conducted in a timed and cycled manner: a zero CO2 measurement, measurements of two standard CO2 gases, and ambient air measurements. The standard CO2 gases are calibrated against NOAA WMO CO2 standards. The methodology has been used for indoor CO2 measurements in a commercial office (about 120 people working inside), for ambient CO2 measurements, and, installed in a fleet of in-service commercial cargo ships, for monitoring CO2 over the global marine boundary layer. These measurements demonstrate that our method is reliable, accurate, and traceable to NOAA WMO CO2 standards. The portability of the instrument and the working standards makes the method readily applicable to large-scale and extensive CO2 measurements.
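
    The zero/two-standard calibration cycle described above can be illustrated with a minimal sketch that derives a linear calibration from the working-standard responses and applies it to ambient readings; all concentrations and instrument readings below are invented placeholders.

      import numpy as np

      # Assigned CO2 mole fractions of the working standards (ppm) -- placeholder
      # values; in practice these are calibrated against NOAA WMO standards.
      std_true = np.array([0.0, 380.0, 420.0])             # zero gas + two working standards
      std_reading = np.array([0.02, 1.87, 2.07])           # raw instrument response (arbitrary units)

      # Linear calibration (slope and offset) from the standard measurements.
      slope, offset = np.polyfit(std_reading, std_true, 1)

      # Apply the calibration to a block of 1-second ambient readings.
      ambient_raw = np.array([1.95, 1.96, 1.94, 1.97])
      ambient_ppm = slope * ambient_raw + offset
      print(np.round(ambient_ppm, 1))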

  7. Adherence of hip and knee arthroplasty studies to RSA standardization guidelines

    PubMed Central

    Mäkinen, Tatu J; Aro, Hannu T; Bragdon, Charles; Malchau, Henrik

    2014-01-01

    Background and purpose Guidelines for standardization of radiostereometry (RSA) of implants were published in 2005 to facilitate comparison of outcomes between various research groups. In this systematic review, we determined how well studies have adhered to these guidelines. Methods We carried out a literature search to identify all articles published between January 2000 and December 2011 that used RSA in the evaluation of hip or knee prosthesis migration. 2 investigators independently evaluated each of the studies for adherence to the 13 individual guideline items. Since some of the 13 points included more than 1 criterion, studies were assessed on whether each point was fully met, partially met, or not met. Results 153 studies that met our inclusion criteria were identified. 61 of these were published before the guidelines were introduced (2000–2005) and 92 after the guidelines were introduced (2006–2011). The methodological quality of RSA studies clearly improved from 2000 to 2011. None of the studies fully met all 13 guidelines. Nearly half (43) of the studies published after the guidelines demonstrated a high methodological quality and adhered at least partially to 10 of the 13 guidelines, whereas less than one-fifth (11) of the studies published before the guidelines had the same methodological quality. Commonly unaddressed guideline items were related to imaging methodology, determination of precision from double examinations, and also mean error of rigid-body fitting and condition number cutoff levels. Interpretation The guidelines have improved methodological reporting in RSA studies, but adherence to these guidelines is still relatively low. There is a need to update and clarify the guidelines for clinical hip and knee arthroplasty RSA studies. PMID:24954489

  8. Establishing Inter- and Intrarater Reliability for High-Stakes Testing Using Simulation.

    PubMed

    Kardong-Edgren, Suzan; Oermann, Marilyn H; Rizzolo, Mary Anne; Odom-Maryon, Tamara

    This article reports one approach to developing a standardized training method for establishing the inter- and intrarater reliability of a group of raters for high-stakes testing. Simulation is used increasingly for high-stakes testing, but without research into the development of inter- and intrarater reliability for raters. Eleven raters were trained using a standardized methodology. Raters scored 28 student videos over a six-week period. Raters then rescored all videos over a two-day period to establish both intra- and interrater reliability. One rater demonstrated poor intrarater reliability; a second rater failed all students. Kappa statistics improved from the moderate to substantial agreement range with the exclusion of the two outlier raters' scores. There may be faculty who, for different reasons, should not be included in high-stakes testing evaluations. All faculty are content experts, but not all are expert evaluators.
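
    A minimal sketch of the kind of agreement statistic reported here, computing Cohen's kappa for one rater pair on pass/fail scores; the ratings are invented and the published analysis may have used a different kappa variant.

      import numpy as np

      # Illustrative pass(1)/fail(0) scores from two raters on the same 28 videos (invented data).
      rater_a = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1])
      rater_b = np.array([1, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 0, 1, 1, 1, 0, 1, 0, 1, 1, 1, 1, 1, 0, 1, 1, 0, 0])

      # Observed agreement
      p_o = np.mean(rater_a == rater_b)

      # Chance agreement from the marginal proportions of each category
      p_e = sum(np.mean(rater_a == c) * np.mean(rater_b == c) for c in (0, 1))

      kappa = (p_o - p_e) / (1 - p_e)
      print(f"observed agreement = {p_o:.2f}, kappa = {kappa:.2f}")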

  9. A systematic tale of two differing reviews: evaluating the evidence on public and private sector quality of primary care in low and middle income countries.

    PubMed

    Coarasa, Jorge; Das, Jishnu; Gummerson, Elizabeth; Bitton, Asaf

    2017-04-12

    Systematic reviews are powerful tools for summarizing vast amounts of data in controversial areas; but their utility is limited by methodological choices and assumptions. Two systematic reviews of literature on the quality of private sector primary care in low and middle income countries (LMIC), published in the same journal within a year, reached conflicting conclusions. The difference in findings reflects different review methodologies, but more importantly, a weak underlying body of literature. A detailed examination of the literature cited in both reviews shows that only one of the underlying studies met the gold standard for methodological robustness. Given the current policy momentum on universal health coverage and primary health care reform across the globe, there is an urgent need for high quality empirical evidence on the quality of private versus public sector primary health care in LMIC.

  10. Analysis of Material Sample Heated by Impinging Hot Hydrogen Jet in a Non-Nuclear Tester

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Foote, John; Litchford, Ron

    2006-01-01

    A computational conjugate heat transfer methodology was developed and anchored with data obtained from a hot-hydrogen jet heated, non-nuclear materials tester, as a first step towards developing an efficient and accurate multiphysics, thermo-fluid computational methodology to predict environments for a hypothetical solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on a multidimensional, finite-volume, turbulent, chemically reacting, thermally radiating, unstructured-grid, and pressure-based formulation. The multiphysics invoked in this study include hydrogen dissociation kinetics and thermodynamics, turbulent flow, and convective, thermally radiative, and conjugate heat transfer. Predicted hot-hydrogen jet and material surface temperatures were compared with measurements. Predicted solid temperatures were compared with those obtained with a standard heat transfer code. The interrogation of physics revealed that reactions of hydrogen dissociation and recombination are highly correlated with local temperature and are necessary for accurate prediction of the hot-hydrogen jet temperature.

  11. Tsunamis: bridging science, engineering and society.

    PubMed

    Kânoğlu, U; Titov, V; Bernard, E; Synolakis, C

    2015-10-28

    Tsunamis are high-impact, long-duration disasters that in most cases allow for only minutes of warning before impact. Since the 2004 Boxing Day tsunami, there have been significant advancements in warning methodology, pre-disaster preparedness and basic understanding of related phenomena. Yet, the trail of destruction of the 2011 Japan tsunami, broadcast live to a stunned world audience, underscored the difficulties of implementing advances in applied hazard mitigation. We describe state of the art methodologies, standards for warnings and summarize recent advances in basic understanding, and identify cross-disciplinary challenges. The stage is set to bridge science, engineering and society to help build up coastal resilience and reduce losses. © 2015 The Author(s).

  12. Food control from farm to fork: implementing the standards of Codex and the OIE.

    PubMed

    Hathaway, S C

    2013-08-01

    The Codex Alimentarius (Codex) international food standards help to ensure food safety and promote fair practices in the international food trade. Implementing these standards using a risk management framework (RMF) approach to decision-making is an increasingly common aspect of the food control programmes of national governments. The Codex Alimentarius Commission (CAC) provides guidance at both the system and food commodity levels. In the case of zoonoses, similarities in the risk analysis methodologies used to underpin standard setting by the CAC and the World Organisation for Animal Health (OIE) are highly enabling of integrated food control systems. The CAC and the OIE are increasingly working together to develop their respective standards for foodborne zoonoses and other hazards so that they are non-duplicative, cohesive and utilise the whole food chain. There is a clear need for effective integration of food safety and animal health monitoring and surveillance information to better control foodborne zoonoses. This is increasingly supported by Codex and OIE standards working together in a variety of ways and realisation of benefits is highly dependent on coordination and sharing of information between Competent Authorities and other food safety stakeholders at the national level.

  13. The Development of a Methodology for Estimating the Cost of Air Force On-the-Job Training.

    ERIC Educational Resources Information Center

    Samers, Bernard N.; And Others

    The Air Force uses a standardized costing methodology for resident technical training schools (TTS); no comparable methodology exists for computing the cost of on-the-job training (OJT). This study evaluates three alternative survey methodologies and a number of cost models for estimating the cost of OJT for airmen training in the Administrative…

  14. Serial Scanning and Registration of High Resolution Quantitative Computed Tomography Volume Scans for the Determination of Local Bone Density Changes

    NASA Technical Reports Server (NTRS)

    Whalen, Robert T.; Napel, Sandy; Yan, Chye H.

    1996-01-01

    Progress in development of the methods required to study bone remodeling as a function of time is reported. The following topics are presented: 'A New Methodology for Registration Accuracy Evaluation', 'Registration of Serial Skeletal Images for Accurately Measuring Changes in Bone Density', and 'Precise and Accurate Gold Standard for Multimodality and Serial Registration Method Evaluations.'

  15. Chemiluminescent optical fiber immunosensor for the detection of anti-West Nile virus IgG.

    PubMed

    Herrmann, Sebastien; Leshem, Boaz; Landes, Shimi; Rager-Zisman, Bracha; Marks, Robert S

    2005-03-31

    An ELISA-based optical fiber methodology developed for the detection of anti-West Nile virus IgG antibodies in serum was compared to standard colorimetric and chemiluminescent ELISA based on microtiter plates. Colorimetric ELISA was the least sensitive, especially at high titer dilutions. The fiber-optic immunosensor based on the same ELISA immunological rationale was the most sensitive technique.

  16. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    PubMed

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separate and undifferentiated waste. Our methodology allows cost efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. Our methodology allows benchmarking and variance analysis that can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently. Our methodology can be implemented by companies lacking a sophisticated management accounting system. Copyright © 2015 Elsevier Ltd. All rights reserved.
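
    The "standard costs and actual quantities" idea can be illustrated with a minimal sketch that compares budgeted and actual collection costs per waste stream; all unit costs and tonnages below are invented for illustration.

      # Illustrative FCA-style variance analysis: collection cost computed from
      # standard unit costs and actual collected quantities (all figures invented).
      standard_cost_per_tonne = {"paper": 95.0, "glass": 80.0, "organic": 110.0, "residual": 60.0}
      actual_tonnes = {"paper": 1200, "glass": 800, "organic": 1500, "residual": 4000}
      actual_cost = {"paper": 121000, "glass": 62000, "organic": 172000, "residual": 236000}

      for stream, tonnes in actual_tonnes.items():
          budget = standard_cost_per_tonne[stream] * tonnes   # standard cost x actual quantity
          variance = actual_cost[stream] - budget             # off-standard performance
          print(f"{stream:9s} budget {budget:>9.0f}  actual {actual_cost[stream]:>9.0f}  variance {variance:>+8.0f}")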

  17. Combined Heat Transfer in High-Porosity High-Temperature Fibrous Insulations: Theory and Experimental Validation

    NASA Technical Reports Server (NTRS)

    Daryabeigi, Kamran; Cunnington, George R.; Miller, Steve D.; Knutson, Jeffry R.

    2010-01-01

    Combined radiation and conduction heat transfer through various high-temperature, high-porosity, unbonded (loose) fibrous insulations was modeled based on first principles. The diffusion approximation was used for modeling the radiation component of heat transfer in the optically thick insulations. The relevant parameters needed for the heat transfer model were derived from experimental data. Semi-empirical formulations were used to model the solid conduction contribution of heat transfer in fibrous insulations with the relevant parameters inferred from thermal conductivity measurements at cryogenic temperatures in a vacuum. The specific extinction coefficient for radiation heat transfer was obtained from high-temperature steady-state thermal measurements with large temperature gradients maintained across the sample thickness in a vacuum. Standard gas conduction modeling was used in the heat transfer formulation. This heat transfer modeling methodology was applied to silica, two types of alumina, and a zirconia-based fibrous insulation, and to a variation of opacified fibrous insulation (OFI). OFI is a class of insulations manufactured by embedding efficient ceramic opacifiers in various unbonded fibrous insulations to significantly attenuate the radiation component of heat transfer. The heat transfer modeling methodology was validated by comparison with more rigorous analytical solutions and with standard thermal conductivity measurements. The validated heat transfer model is applicable to various densities of these high-porosity insulations as long as the fiber properties are the same (index of refraction, size distribution, orientation, and length). Furthermore, the heat transfer data for these insulations can be obtained at any static pressure in any working gas environment without the need to perform tests in various gases at various pressures.
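
    For reference, the diffusion (Rosseland) approximation used for the radiation component in optically thick insulation is commonly written as

      k_r = \frac{16\,\sigma\, n^{2}\, T^{3}}{3\, e\, \rho}

    where \sigma is the Stefan-Boltzmann constant, n the effective index of refraction, T the local temperature, e the specific extinction coefficient, and \rho the insulation density. This is the standard textbook form of the approximation and may differ in notation from the exact expression used in the report.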

  18. Screening Methodologies to Support Risk and Technology ...

    EPA Pesticide Factsheets

    The Clean Air Act establishes a two-stage regulatory process for addressing emissions of hazardous air pollutants (HAPs) from stationary sources. In the first stage, the Act requires the EPA to develop technology-based standards for categories of industrial sources. We have largely completed the required “Maximum Achievable Control Technology” (MACT) standards. In the second stage of the regulatory process, EPA must review each MACT standard at least every eight years and revise them as necessary, “taking into account developments in practices, processes and control technologies.” We call this requirement the “technology review.” EPA is also required to complete a one-time assessment of the health and environmental risks that remain after sources come into compliance with MACT. This residual risk review also must be done within 8 years of setting the initial MACT standard. If additional risk reductions are necessary to protect public health with an ample margin of safety or to prevent adverse environmental effects, EPA must develop standards to address these remaining risks. Because the risk review is an important component of the RTR process, EPA is seeking SAB input on the scientific credibility of specific enhancements made to our risk assessment methodologies, particularly with respect to screening methodologies, since the last SAB review was completed in 2010. These enhancements to our risk methodologies are outlined in the document title

  19. Empiric determination of corrected visual acuity standards for train crews.

    PubMed

    Schwartz, Steven H; Swanson, William H

    2005-08-01

    Probably the most common visual standard for employment in the transportation industry is best-corrected, high-contrast visual acuity. Because such standards were often established absent empiric linkage to job performance, it is possible that a job applicant or employee who has visual acuity less than the standard may be able to satisfactorily perform the required job activities. For the transportation system that we examined, the train crew is required to inspect visually the length of the train before and during the time it leaves the station. The purpose of the inspection is to determine if an individual is in a hazardous position with respect to the train. In this article, we determine the extent to which high-contrast visual acuity can predict performance on a simulated task. Performance at discriminating hazardous from safe conditions, as depicted in projected photographic slides, was determined as a function of visual acuity. For different levels of visual acuity, which was varied through the use of optical defocus, a subject was required to label scenes as hazardous or safe. Task performance was highly correlated with visual acuity as measured under conditions normally used for vision screenings (high-illumination and high-contrast): as the acuity decreases, performance at discriminating hazardous from safe scenes worsens. This empirically based methodology can be used to establish a corrected high-contrast visual acuity standard for safety-sensitive work in transportation that is linked to the performance of a job-critical task.

  20. Langley Wind Tunnel Data Quality Assurance-Check Standard Results

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.

    2000-01-01

    A framework for statistical evaluation, control, and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check-standard testing and a small number of customer repeat-run sets. The results of check-standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
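
    As an illustration of the Shewhart-style monitoring described above, the sketch below computes individuals control limits from the average moving range of repeated check-standard results and flags out-of-control runs; the measurement values are invented, not Langley data.

      import numpy as np

      # Invented check-standard results (e.g., a force coefficient measured on
      # repeated check-standard runs); not actual facility data.
      x = np.array([0.512, 0.515, 0.509, 0.514, 0.511, 0.517, 0.510, 0.513, 0.516, 0.512])

      # Individuals chart: center line and limits from the average moving range.
      mr = np.abs(np.diff(x))
      center = x.mean()
      ucl = center + 2.66 * mr.mean()     # 2.66 = 3 / d2 with d2 = 1.128 for subgroups of 2
      lcl = center - 2.66 * mr.mean()

      out_of_control = np.where((x > ucl) | (x < lcl))[0]
      print(f"CL = {center:.4f}, UCL = {ucl:.4f}, LCL = {lcl:.4f}, flagged runs: {out_of_control}")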

  1. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  2. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  3. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  4. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  5. 43 CFR 11.83 - Damage determination phase-use value methodologies.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...

  6. Improving automation standards via semantic modelling: Application to ISA88.

    PubMed

    Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès

    2017-03-01

    Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software that rely on the efficient modelling of the addressed systems. The work presented here is part of the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and support the improvement of technical document consistency. The formalization of conceptual models and the subsequent writing of technical standards are simultaneously analyzed, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the suggested paradigm for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects that are worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency checking method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  7. Systematic review of the methodological quality of controlled trials evaluating Chinese herbal medicine in patients with rheumatoid arthritis

    PubMed Central

    Pan, Xin; Lopez-Olivo, Maria A; Song, Juhee; Pratt, Gregory; Suarez-Almazor, Maria E

    2017-01-01

    Objectives We appraised the methodological and reporting quality of randomised controlled clinical trials (RCTs) evaluating the efficacy and safety of Chinese herbal medicine (CHM) in patients with rheumatoid arthritis (RA). Design For this systematic review, electronic databases were searched from inception until June 2015. The search was limited to humans and non-case report studies, but was not limited by language, year of publication or type of publication. Two independent reviewers selected RCTs, evaluating CHM in RA (herbals and decoctions). Descriptive statistics were used to report on risk of bias and their adherence to reporting standards. Multivariable logistic regression analysis was performed to determine study characteristics associated with high or unclear risk of bias. Results Out of 2342 unique citations, we selected 119 RCTs including 18 919 patients: 10 108 patients received CHM alone and 6550 received one of 11 treatment combinations. A high risk of bias was observed across all domains: 21% had a high risk for selection bias (11% from sequence generation and 30% from allocation concealment), 85% for performance bias, 89% for detection bias, 4% for attrition bias and 40% for reporting bias. In multivariable analysis, fewer authors were associated with selection bias (allocation concealment), performance bias and attrition bias, and earlier year of publication and funding source not reported or disclosed were associated with selection bias (sequence generation). Studies published in non-English language were associated with reporting bias. Poor adherence to recommended reporting standards (<60% of the studies not providing sufficient information) was observed in 11 of the 23 sections evaluated. Limitations Study quality and data extraction were performed by one reviewer and cross-checked by a second reviewer. Translation to English was performed by one reviewer in 85% of the included studies. Conclusions Studies evaluating CHM often fail to meet expected methodological criteria, and high-quality evidence is lacking. PMID:28249848

  8. Systematic Review of Childhood Sedentary Behavior Questionnaires: What do We Know and What is Next?

    PubMed

    Hidding, Lisan M; Altenburg, Teatske M; Mokkink, Lidwine B; Terwee, Caroline B; Chinapaw, Mai J M

    2017-04-01

    Accurate measurement of child sedentary behavior is necessary for monitoring trends, examining health effects, and evaluating the effectiveness of interventions. We therefore aimed to summarize studies examining the measurement properties of self-report or proxy-report sedentary behavior questionnaires for children and adolescents under the age of 18 years. Additionally, we provided an overview of the characteristics of the evaluated questionnaires. We performed systematic literature searches in the EMBASE, PubMed, and SPORTDiscus electronic databases. Studies had to report on at least one measurement property of a questionnaire assessing sedentary behavior. Questionnaire data were extracted using a standardized checklist, i.e. the Quality Assessment of Physical Activity Questionnaire (QAPAQ) checklist, and the methodological quality of the included studies was rated using a standardized tool, i.e. the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. Forty-six studies on 46 questionnaires met our inclusion criteria, of which 33 examined test-retest reliability, nine examined measurement error, two examined internal consistency, 22 examined construct validity, eight examined content validity, and two examined structural validity. The majority of the included studies were of fair or poor methodological quality. Of the studies with at least a fair methodological quality, six scored positive on test-retest reliability, and two scored positive on construct validity. None of the questionnaires included in this review were considered as both valid and reliable. High-quality studies on the most promising questionnaires are required, with more attention to the content validity of the questionnaires. PROSPERO registration number: CRD42016035963.

  9. Methodological Review of Intimate Partner Violence Prevention Research

    ERIC Educational Resources Information Center

    Murray, Christine E.; Graybeal, Jennifer

    2007-01-01

    The authors present a methodological review of empirical program evaluation research in the area of intimate partner violence prevention. The authors adapted and utilized criterion-based rating forms to standardize the evaluation of the methodological strengths and weaknesses of each study. The findings indicate that the limited amount of…

  10. Improving Mathematics Performance among Secondary Students with EBD: A Methodological Review

    ERIC Educational Resources Information Center

    Mulcahy, Candace A.; Krezmien, Michael P.; Travers, Jason

    2016-01-01

    In this methodological review, the authors apply special education research quality indicators and standards for single case design to analyze mathematics intervention studies for secondary students with emotional and behavioral disorders (EBD). A systematic methodological review of literature from 1975 to December 2012 yielded 19 articles that…

  11. SSME Investment in Turbomachinery Inducer Impeller Design Tools and Methodology

    NASA Technical Reports Server (NTRS)

    Zoladz, Thomas; Mitchell, William; Lunde, Kevin

    2010-01-01

    Within the rocket engine industry, SSME turbomachines are the de facto standards of success with regard to meeting aggressive performance requirements under challenging operational environments. Over the Shuttle era, SSME has invested heavily in our national inducer impeller design infrastructure. While both low- and high-pressure turbopump failure/anomaly resolution efforts spurred some of these investments, the SSME program was a major beneficiary of key areas of turbomachinery inducer-impeller research outside of flight manifest pressures. Over the past several decades, key turbopump internal environments have been interrogated via highly instrumented hot-fire and cold-flow testing. Likewise, SSME has sponsored the advancement of time-accurate and cavitating inducer-impeller computational fluid dynamics (CFD) tools. Together, these investments have led to a better understanding of the complex internal flow fields within aggressive, high-performing inducers and impellers. New design tools and methodologies have evolved that intend to provide confident blade designs which strike an appropriate balance between performance and self-induced load management.

  12. Structuring intuition with theory: The high-throughput way

    NASA Astrophysics Data System (ADS)

    Fornari, Marco

    2015-03-01

    First principles methodologies have grown in accuracy and applicability to the point where large databases can be built, shared, and analyzed with the goal of predicting novel compositions, optimizing functional properties, and discovering unexpected relationships between the data. In order to be useful to a large community of users, data should be standardized, validated, and distributed. In addition, tools to easily manage large datasets should be made available to effectively lead to materials development. Within the AFLOW consortium we have developed a simple frame to expand, validate, and mine data repositories: the MTFrame. Our minimalistic approach complements AFLOW and other existing high-throughput infrastructures and aims to integrate data generation with data analysis. We present a few examples from our work on materials for energy conversion. Our intent is to pinpoint the usefulness of high-throughput methodologies to guide the discovery process by quantitatively structuring scientific intuition. This work was supported by ONR-MURI under Contract N00014-13-1-0635 and the Duke University Center for Materials Genomics.

  13. Methodology Developed for Modeling the Fatigue Crack Growth Behavior of Single-Crystal, Nickel-Base Superalloys

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Because of their superior high-temperature properties, gas generator turbine airfoils made of single-crystal, nickel-base superalloys are fast becoming the standard equipment on today's advanced, high-performance aerospace engines. The increased temperature capabilities of these airfoils have allowed for a significant increase in the operating temperatures in turbine sections, resulting in superior propulsion performance and greater efficiencies. However, the previously developed methodologies for life-prediction models are based on experience with polycrystalline alloys and may not be applicable to single-crystal alloys under certain operating conditions. One of the main areas where behavior differences between single-crystal and polycrystalline alloys are readily apparent is subcritical fatigue crack growth (FCG). The NASA Lewis Research Center's work in this area enables accurate prediction of the subcritical fatigue crack growth behavior in single-crystal, nickel-based superalloys at elevated temperatures.

  14. A Methodology for the Hybridization Based in Active Components: The Case of cGA and Scatter Search

    PubMed Central

    Alba, Enrique; Leguizamón, Guillermo

    2016-01-01

    This work presents the results of a new methodology for hybridizing metaheuristics. By first locating the active components (parts) of one algorithm and then inserting them into a second one, we can build efficient and accurate optimization, search, and learning algorithms. This gives a concrete way of constructing new techniques that contrasts with the widespread ad hoc way of hybridizing. In this paper, the enhanced algorithm is a Cellular Genetic Algorithm (cGA), which has been successfully used in the past to find solutions to hard optimization problems. In order to extend and corroborate the use of active components as an emerging hybridization methodology, we propose here the use of active components taken from Scatter Search (SS) to improve cGA. The results obtained over a varied set of benchmarks are highly satisfactory in efficacy and efficiency when compared with a standard cGA. Moreover, the proposed hybrid approach (i.e., cGA+SS) has shown encouraging results with regard to earlier applications of our methodology. PMID:27403153

  15. Classification of samples into two or more ordered populations with application to a cancer trial.

    PubMed

    Conde, D; Fernández, M A; Rueda, C; Salvador, B

    2012-12-10

    In many applications, especially in cancer treatment and diagnosis, investigators are interested in classifying patients into various diagnosis groups on the basis of molecular data such as gene expression or proteomic data. Often, some of the diagnosis groups are known to be related to higher or lower values of some of the predictors. The standard methods of classifying patients into various groups do not take into account the underlying order. This could potentially result in high misclassification rates, especially when the number of groups is larger than two. In this article, we develop classification procedures that exploit the underlying order among the mean values of the predictor variables and the diagnostic groups by using ideas from order-restricted inference. We generalize the existing methodology on discrimination under restrictions and provide empirical evidence to demonstrate that the proposed methodology improves over the existing unrestricted methodology. The proposed methodology is applied to a bladder cancer data set where the researchers are interested in classifying patients into various groups. Copyright © 2012 John Wiley & Sons, Ltd.
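
    A minimal, one-dimensional sketch of the underlying idea: estimate group means subject to a known ordering with the pool-adjacent-violators algorithm and assign a new observation to the nearest restricted mean. The data are synthetic and this simplification omits the multivariate machinery of the published procedure.

      import numpy as np

      def pava(y, w):
          """Pool-adjacent-violators: non-decreasing fit of group means y with weights w."""
          blocks = [[v, wt, 1] for v, wt in zip(y, w)]      # [mean, weight, number of groups pooled]
          i = 0
          while i < len(blocks) - 1:
              if blocks[i][0] > blocks[i + 1][0]:           # order violation: pool adjacent blocks
                  m0, w0, n0 = blocks[i]
                  m1, w1, n1 = blocks[i + 1]
                  blocks[i:i + 2] = [[(m0 * w0 + m1 * w1) / (w0 + w1), w0 + w1, n0 + n1]]
                  i = max(i - 1, 0)
              else:
                  i += 1
          out = []
          for m, _, n in blocks:
              out.extend([m] * n)                           # expand pooled blocks back to groups
          return np.array(out)

      rng = np.random.default_rng(1)
      # Synthetic biomarker values for three diagnosis groups known to be ordered low < mid < high.
      groups = [rng.normal(loc, 1.0, 30) for loc in (2.0, 2.4, 4.0)]
      raw_means = np.array([g.mean() for g in groups])
      weights = np.array([len(g) for g in groups], dtype=float)
      restricted = pava(raw_means, weights)                 # order-restricted mean estimates

      def classify(x):
          return int(np.argmin(np.abs(restricted - x)))     # assign to nearest restricted mean

      print("restricted means:", np.round(restricted, 2), " classify(3.1) ->", classify(3.1))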

  16. Evaluation and optimization of hepatocyte culture media factors by design of experiments (DoE) methodology

    PubMed Central

    Dong, Jia; Lübberstedt, Marc; Urbaniak, Thomas; Nüssler, Andreas K.N.; Knobeloch, Daniel; Gerlach, Jörg C.; Zeilinger, Katrin

    2008-01-01

    Optimization of cell culture media based on statistical experimental design methodology is a widely used approach for improving cultivation conditions. We applied this methodology to refine the composition of an established culture medium for growth of a human hepatoma cell line, C3A. A selection of growth factors and nutrient supplements were systematically screened according to standard design of experiments (DoE) procedures. The results of the screening indicated that the medium additives hepatocyte growth factor, oncostatin M, and fibroblast growth factor 4 significantly influenced the metabolic activities of the C3A cell line. Surface response methodology revealed that the optimum levels for these factors were 30 ng/ml for hepatocyte growth factor and 35 ng/ml for oncostatin M. Additional experiments on primary human hepatocyte cultures showed high variance in metabolic activities between cells from different individuals, making determination of optimal levels of factors more difficult. Still, it was possible to conclude that hepatocyte growth factor, epidermal growth factor, and oncostatin M had decisive effects on the metabolic functions of primary human hepatocytes. PMID:19003182

  17. Evaluation and optimization of hepatocyte culture media factors by design of experiments (DoE) methodology.

    PubMed

    Dong, Jia; Mandenius, Carl-Fredrik; Lübberstedt, Marc; Urbaniak, Thomas; Nüssler, Andreas K N; Knobeloch, Daniel; Gerlach, Jörg C; Zeilinger, Katrin

    2008-07-01

    Optimization of cell culture media based on statistical experimental design methodology is a widely used approach for improving cultivation conditions. We applied this methodology to refine the composition of an established culture medium for growth of a human hepatoma cell line, C3A. A selection of growth factors and nutrient supplements were systematically screened according to standard design of experiments (DoE) procedures. The results of the screening indicated that the medium additives hepatocyte growth factor, oncostatin M, and fibroblast growth factor 4 significantly influenced the metabolic activities of the C3A cell line. Surface response methodology revealed that the optimum levels for these factors were 30 ng/ml for hepatocyte growth factor and 35 ng/ml for oncostatin M. Additional experiments on primary human hepatocyte cultures showed high variance in metabolic activities between cells from different individuals, making determination of optimal levels of factors more difficult. Still, it was possible to conclude that hepatocyte growth factor, epidermal growth factor, and oncostatin M had decisive effects on the metabolic functions of primary human hepatocytes.
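
    A minimal sketch of the response-surface step only, assuming a two-factor quadratic model for hepatocyte growth factor (HGF) and oncostatin M (OSM): the surface is fitted by least squares and its stationary point is located. All design points and response values below are invented and do not reproduce the study's measurements.

        import numpy as np

        # hypothetical two-factor design: levels in ng/ml and a measured metabolic response
        hgf = np.array([10, 10, 50, 50, 30, 30, 30,  2, 58, 30, 30])
        osm = np.array([15, 55, 15, 55, 35, 35, 35, 35, 35,  7, 63])
        response = np.array([0.62, 0.70, 0.72, 0.68, 0.90, 0.88, 0.91, 0.55, 0.66, 0.60, 0.69])

        # full quadratic response surface: b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
        X = np.column_stack([np.ones_like(hgf), hgf, osm, hgf**2, osm**2, hgf * osm])
        beta, *_ = np.linalg.lstsq(X, response, rcond=None)

        # stationary point of the fitted surface (set the gradient to zero and solve)
        H = np.array([[2 * beta[3], beta[5]], [beta[5], 2 * beta[4]]])
        optimum = np.linalg.solve(H, -np.array([beta[1], beta[2]]))
        print("estimated optimum (HGF, OSM) in ng/ml:", optimum)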

  18. High-throughput tandem mass spectrometry multiplex analysis for newborn urinary screening of creatine synthesis and transport disorders, Triple H syndrome and OTC deficiency.

    PubMed

    Auray-Blais, Christiane; Maranda, Bruno; Lavoie, Pamela

    2014-09-25

    Creatine synthesis and transport disorders, Triple H syndrome and ornithine transcarbamylase deficiency are treatable inborn errors of metabolism. Early screening of patients was found to be beneficial. Mass spectrometry analysis of specific urinary biomarkers might lead to early detection and treatment in the neonatal period. We developed a high-throughput mass spectrometry methodology applicable to newborn screening using dried urine on filter paper for these aforementioned diseases. A high-throughput methodology was devised for the simultaneous analysis of creatine, guanidineacetic acid, orotic acid, uracil, creatinine and respective internal standards, using both positive and negative electrospray ionization modes, depending on the compound. The precision and accuracy varied by <15%. Stability during storage at different temperatures was confirmed for three weeks. The limits of detection and quantification for each biomarker varied from 0.3 to 6.3 μmol/l and from 1.0 to 20.9 μmol/l, respectively. Analyses of urine specimens from affected patients revealed abnormal results. Targeted biomarkers in urine were detected in the first weeks of life. This rapid, simple and robust liquid chromatography/tandem mass spectrometry methodology is an efficient tool applicable to urine screening for inherited disorders by biochemical laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.
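
    A minimal sketch, assuming the common ICH calibration-curve convention (3.3 and 10 times the residual standard deviation over the slope) rather than the paper's exact procedure, of how limits of detection and quantification could be estimated for a single urinary biomarker. The calibrator concentrations and signals are invented.

        import numpy as np

        # hypothetical calibration curve for one biomarker: concentration (umol/l) vs peak-area ratio
        conc = np.array([0.0, 2.5, 5.0, 10.0, 20.0, 40.0])
        signal = np.array([0.02, 0.11, 0.21, 0.40, 0.83, 1.61])

        slope, intercept = np.polyfit(conc, signal, 1)
        residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)

        lod = 3.3 * residual_sd / slope    # limit of detection
        loq = 10.0 * residual_sd / slope   # limit of quantification
        print(f"LOD = {lod:.2f} umol/l, LOQ = {loq:.2f} umol/l")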

  19. Direct cost analysis of intensive care unit stay in four European countries: applying a standardized costing methodology.

    PubMed

    Tan, Siok Swan; Bakker, Jan; Hoogendoorn, Marga E; Kapila, Atul; Martin, Joerg; Pezzi, Angelo; Pittoni, Giovanni; Spronk, Peter E; Welte, Robert; Hakkaart-van Roijen, Leona

    2012-01-01

    The objective of the present study was to measure and compare the direct costs of intensive care unit (ICU) days at seven ICU departments in Germany, Italy, the Netherlands, and the United Kingdom by means of a standardized costing methodology. A retrospective cost analysis of ICU patients was performed from the hospital's perspective. The standardized costing methodology was developed on the basis of the availability of data at the seven ICU departments. It entailed the application of the bottom-up approach for "hotel and nutrition" and the top-down approach for "diagnostics," "consumables," and "labor." Direct costs per ICU day ranged from €1168 to €2025. Even though the distribution of costs varied by cost component, labor was the most important cost driver at all departments. The costs for "labor" amounted to €1629 at department G but were fairly similar at the other departments (€711 ± 115). Direct costs of ICU days vary widely between the seven departments. Our standardized costing methodology could serve as a valuable instrument to compare actual cost differences, such as those resulting from differences in patient case-mix. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
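
    A minimal sketch, with invented figures, of how the mixed costing approach described above could be assembled: hotel and nutrition costs are built bottom-up from quantities and unit prices, while labor, diagnostics, and consumables are allocated top-down from annual department totals divided by annual ICU days.

        # bottom-up component: resource quantities per ICU day times unit prices (hypothetical, in euros)
        hotel_items = {"ward overhead": (1, 120.0), "meals": (3, 6.5), "linen sets": (2, 4.0)}
        hotel_and_nutrition = sum(qty * price for qty, price in hotel_items.values())

        # top-down component: annual department totals allocated over annual ICU days
        annual_totals = {"labor": 6.2e6, "diagnostics": 0.9e6, "consumables": 1.4e6}
        annual_icu_days = 8000
        top_down = {name: total / annual_icu_days for name, total in annual_totals.items()}

        direct_cost_per_day = hotel_and_nutrition + sum(top_down.values())
        print(f"direct cost per ICU day: {direct_cost_per_day:.0f} EUR")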

  20. Standard operation procedures for conducting the on-the-road driving test, and measurement of the standard deviation of lateral position (SDLP)

    PubMed Central

    Verster, Joris C; Roth, Thomas

    2011-01-01

    This review discusses the methodology of the standardized on-the-road driving test and standard operation procedures to conduct the test and analyze the data. The on-the-road driving test has proven to be a sensitive and reliable method to examine driving ability after administration of central nervous system (CNS) drugs. The test is performed on a public highway in normal traffic. Subjects are instructed to drive with a steady lateral position and constant speed. Its primary parameter, the standard deviation of lateral position (SDLP), ie, an index of ‘weaving’, is a stable measure of driving performance with high test–retest reliability. SDLP differences from placebo are dose-dependent, and do not depend on the subject’s baseline driving skills (placebo SDLP). It is important that standard operation procedures are applied to conduct the test and analyze the data in order to allow comparisons between studies from different sites. PMID:21625472

  1. Standard operation procedures for conducting the on-the-road driving test, and measurement of the standard deviation of lateral position (SDLP).

    PubMed

    Verster, Joris C; Roth, Thomas

    2011-01-01

    This review discusses the methodology of the standardized on-the-road driving test and standard operation procedures to conduct the test and analyze the data. The on-the-road driving test has proven to be a sensitive and reliable method to examine driving ability after administration of central nervous system (CNS) drugs. The test is performed on a public highway in normal traffic. Subjects are instructed to drive with a steady lateral position and constant speed. Its primary parameter, the standard deviation of lateral position (SDLP), ie, an index of 'weaving', is a stable measure of driving performance with high test-retest reliability. SDLP differences from placebo are dose-dependent, and do not depend on the subject's baseline driving skills (placebo SDLP). It is important that standard operation procedures are applied to conduct the test and analyze the data in order to allow comparisons between studies from different sites.
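
    A minimal sketch of the primary outcome computation: SDLP is the standard deviation of the vehicle's sampled lateral position over the measured highway segment. The sampling rate and the simulated drift signal below are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(1)
        sampling_hz = 4                                    # hypothetical lateral-position sampling rate
        duration_s = 60 * 60                               # one hour of driving
        t = np.arange(0, duration_s, 1 / sampling_hz)

        # simulated lateral position (cm) relative to the lane centre: slow weaving plus noise
        lateral_position = 10 * np.sin(2 * np.pi * t / 45) + rng.normal(0, 12, t.size)

        sdlp_cm = np.std(lateral_position, ddof=1)         # standard deviation of lateral position
        print(f"SDLP = {sdlp_cm:.1f} cm")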

  2. WAIS-III index score profiles in the Canadian standardization sample.

    PubMed

    Lange, Rael T

    2007-01-01

    Representative index score profiles were examined in the Canadian standardization sample of the Wechsler Adult Intelligence Scale-Third Edition (WAIS-III). The identification of profile patterns was based on the methodology proposed by Lange, Iverson, Senior, and Chelune (2002) that aims to maximize the influence of profile shape and minimize the influence of profile magnitude on the cluster solution. A two-step cluster analysis procedure was used (i.e., hierarchical and k-means analyses). Cluster analysis of the four index scores (i.e., Verbal Comprehension [VCI], Perceptual Organization [POI], Working Memory [WMI], Processing Speed [PSI]) identified six profiles in this sample. Profiles were differentiated by pattern of performance and were primarily characterized as (a) high VCI/POI, low WMI/PSI, (b) low VCI/POI, high WMI/PSI, (c) high PSI, (d) low PSI, (e) high VCI/WMI, low POI/PSI, and (f) low VCI, high POI. These profiles are potentially useful for determining whether a patient's WAIS-III performance is unusual in a normal population.
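
    A minimal sketch of the two-step idea (a hierarchical solution used to seed k-means) on simulated four-index profiles. Centring each profile on its own mean is used here as a simple stand-in for emphasizing shape over magnitude; it is not the exact Lange et al. procedure, and all data are simulated.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(2)
        profiles = rng.normal(100, 15, size=(400, 4))            # simulated VCI, POI, WMI, PSI scores

        shape = profiles - profiles.mean(axis=1, keepdims=True)  # remove profile magnitude, keep shape

        # step 1: hierarchical clustering (Ward) to derive initial centroids
        labels_hier = fcluster(linkage(shape, method="ward"), t=6, criterion="maxclust")
        seeds = np.vstack([shape[labels_hier == k].mean(axis=0) for k in range(1, 7)])

        # step 2: k-means refinement starting from the hierarchical centroids
        km = KMeans(n_clusters=6, init=seeds, n_init=1, random_state=0).fit(shape)
        print(np.bincount(km.labels_))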

  3. Design of clinical trials of antidepressants: should a placebo control arm be included?

    PubMed

    Fritze, J; Möller, H J

    2001-01-01

    There is no doubt that available antidepressants are efficacious and effective. Nevertheless, more effective drugs with improved tolerability are needed. With this need in mind, some protagonists claim that future antidepressants should be proved superior to, or at least as effective as, established antidepressants, making placebo control methodologically dispensable in clinical trials. Moreover, the use of placebo control is criticised as unethical because it might result in effective treatment being withheld. There are, however, a number of methodological reasons why placebo control is indispensable for the proof of efficacy of antidepressants. Comparing investigational antidepressants only with standard antidepressants and not placebo yields ambiguous results that are difficult to interpret, be it in superiority or equivalence testing, and this method of assessment requires larger sample sizes than those required with the use of placebo control. Experimental methodology not adhering to the optimal study design is ethically questionable. Restricting the testing of investigational antidepressants only to superiority over standard antidepressants is an obstacle to therapeutic progress in terms of tolerability and the detection of innovative mechanisms of action from which certain subgroups of future patients might benefit. The use of a methodology that requires larger samples for testing of superiority or equivalence is also ethically questionable. In view of the high placebo response rates in trials of antidepressants, placebo treatment does not mean withholding effective treatment. Accepting the necessity of the clinical evaluation of new, potentially ineffective antidepressants implicitly means accepting placebo control as ethically justified. Three- or multi-arm comparisons including placebo and an active reference represent the optimal study design.

  4. Advanced biosensing methodologies developed for evaluating performance quality and safety of emerging biophotonics technologies and medical devices (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Ilev, Ilko K.; Walker, Bennett; Calhoun, William; Hassan, Moinuddin

    2016-03-01

    Biophotonics is an emerging field in modern biomedical technology that has opened up new horizons for transfer of state-of-the-art techniques from the areas of lasers, fiber optics and biomedical optics to the life sciences and medicine. This field continues to vastly expand with advanced developments across the entire spectrum of biomedical applications ranging from fundamental "bench" laboratory studies to clinical patient "bedside" diagnostics and therapeutics. However, in order to translate these technologies to clinical device applications, the scientific and industrial communities and the FDA face the requirement for a thorough evaluation and review of laser radiation safety and efficacy concerns. In many cases, however, the review process is complicated by the lack of effective means and standard test methods to precisely analyze the safety and effectiveness of some of the newly developed biophotonics techniques and devices. There is, therefore, an immediate public health need for new test protocols, guidance documents and standard test methods to precisely evaluate fundamental characteristics, performance quality and safety of these technologies and devices. Here, we will give an overview of our recent development of novel test methodologies for safety and efficacy evaluation of some emerging biophotonics technologies and medical devices. These methodologies are based on integrating the advanced features of state-of-the-art optical sensor technologies and approaches such as high-resolution fiber-optic sensing, confocal and optical coherence tomography imaging, and infrared spectroscopy. The presentation will also illustrate some methodologies developed and implemented for testing intraocular lens implants, biochemical contamination of medical devices, ultrahigh-resolution nanoscopy, and femtosecond laser therapeutics.

  5. Payload training methodology study

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PTC) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.

  6. Organizational Change Efforts: Methodologies for Assessing Organizational Effectiveness and Program Costs versus Benefits.

    ERIC Educational Resources Information Center

    Macy, Barry A.; Mirvis, Philip H.

    1982-01-01

    A standardized methodology for identifying, defining, and measuring work behavior and performance rather than production, and a methodology that estimates the costs and benefits of work innovation are presented for assessing organizational effectiveness and program costs versus benefits in organizational change programs. Factors in a cost-benefit…

  7. Maintaining Equivalent Cut Scores for Small Sample Test Forms

    ERIC Educational Resources Information Center

    Dwyer, Andrew C.

    2016-01-01

    This study examines the effectiveness of three approaches for maintaining equivalent performance standards across test forms with small samples: (1) common-item equating, (2) resetting the standard, and (3) rescaling the standard. Rescaling the standard (i.e., applying common-item equating methodology to standard setting ratings to account for…

  8. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    NASA Astrophysics Data System (ADS)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is only beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite), a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II, supported by the SSP-05-0 standard and in particular by ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and on how these methodologies could be used with the ESA PSS-05-0 standards. Our outcomes may, in general, be used by teams that need to build small satellites, but in particular they will be used when we build the on-board software applications for the SATEX-II.

  9. Music therapy for depression.

    PubMed

    Maratos, A S; Gold, C; Wang, X; Crawford, M J

    2008-01-23

    Depression is a highly prevalent disorder associated with reduced social functioning, impaired quality of life, and increased mortality. Music therapy has been used in the treatment of a variety of mental disorders, but its impact on those with depression is unclear. To examine the efficacy of music therapy with standard care compared to standard care alone among people with depression and to compare the effects of music therapy for people with depression against other psychological or pharmacological therapies. CCDANCTR-Studies and CCDANCTR-References were searched on 7/11/2007; MEDLINE, PsycINFO, EMBASE, PsycLit, PSYindex, and other relevant sites were searched in November 2006. Reference lists of retrieved articles were hand-searched, as well as specialist music and arts therapies journals. All randomised controlled trials comparing music therapy with standard care or other interventions for depression were included. Data on participants, interventions and outcomes were extracted and entered onto a database independently by two review authors. The methodological quality of each study was also assessed independently by two review authors. The primary outcome was reduction in symptoms of depression, based on a continuous scale. Five studies met the inclusion criteria of the review. Marked variations in the interventions offered and the populations studied meant that meta-analysis was not appropriate. Four of the five studies individually reported greater reduction in symptoms of depression among those randomised to music therapy than among those in standard care conditions. The fifth study, in which music therapy was used as an active control treatment, reported no significant change in mental state for music therapy compared with standard care. Dropout rates from music therapy conditions appeared to be low in all studies. Findings from individual randomised trials suggest that music therapy is accepted by people with depression and is associated with improvements in mood. However, the small number and low methodological quality of studies mean that it is not possible to be confident about its effectiveness. High quality trials evaluating the effects of music therapy on depression are required.

  10. A Robust Framework for Microbial Archaeology

    PubMed Central

    Warinner, Christina; Herbig, Alexander; Mann, Allison; Yates, James A. Fellows; Weiß, Clemens L.; Burbano, Hernán A.; Orlando, Ludovic; Krause, Johannes

    2017-01-01

    Microbial archaeology is flourishing in the era of high-throughput sequencing, revealing the agents behind devastating historical plagues, identifying the cryptic movements of pathogens in prehistory, and reconstructing the ancestral microbiota of humans. Here, we introduce the fundamental concepts and theoretical framework of the discipline, then discuss applied methodologies for pathogen identification and microbiome characterization from archaeological samples. We give special attention to the process of identifying, validating, and authenticating ancient microbes using high-throughput DNA sequencing data. Finally, we outline standards and precautions to guide future research in the field. PMID:28460196

  11. A very simple, highly stereoselective and modular synthesis of ferrocene-based P-chiral phosphine ligands.

    PubMed

    Chen, Weiping; Mbafor, William; Roberts, Stanley M; Whittall, John

    2006-03-29

    A very simple, highly stereoselective and modular synthesis of ferrocene-based P-chiral phosphine ligands has been developed. On the basis of this new methodology, several new families of ferrocene-based phosphine ligands have been prepared coupling chirality at phosphorus with other, more standard stereogenic features. The introduction of P-chirality into ferrocene-based phosphine ligands enhances the enantioselective discrimination produced by the corresponding Rh catalyst when a matching among the planar chirality, carbon chirality, and the chirality of phosphorus is achieved.

  12. Towards standardized testing methodologies for optical properties of components in concentrating solar thermal power plants

    NASA Astrophysics Data System (ADS)

    Sallaberry, Fabienne; Fernández-García, Aránzazu; Lüpfert, Eckhard; Morales, Angel; Vicente, Gema San; Sutter, Florian

    2017-06-01

    Precise knowledge of the optical properties of the components used in the solar field of concentrating solar thermal power plants is essential to ensure their optimum power production. These properties are measured and evaluated by different techniques and equipment, in laboratory conditions and/or in the field. Standards for such measurements and an international consensus on the appropriate techniques are in preparation. The reference materials used as standards for the calibration of the equipment are under discussion. This paper summarizes current testing methodologies and guidelines for the characterization of the optical properties of solar mirrors and absorbers.

  13. Predictive Inference Using Latent Variables with Covariates*

    PubMed Central

    Schofield, Lynne Steuerle; Junker, Brian; Taylor, Lowell J.; Black, Dan A.

    2014-01-01

    Plausible Values (PVs) are a standard multiple imputation tool for analysis of large education survey data that measures latent proficiency variables. When latent proficiency is the dependent variable, we reconsider the standard institutionally-generated PV methodology and find it applies with greater generality than shown previously. When latent proficiency is an independent variable, we show that the standard institutional PV methodology produces biased inference because the institutional conditioning model places restrictions on the form of the secondary analysts’ model. We offer an alternative approach that avoids these biases based on the mixed effects structural equations (MESE) model of Schofield (2008). PMID:25231627
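
    A minimal sketch of the multiple-imputation combining step that underlies PV analysis: the analyst's regression is re-run on each plausible value and the estimates are pooled with Rubin's rules. The data, the number of plausible values, and the outcome model are invented, and this does not implement the MESE model itself.

        import numpy as np

        rng = np.random.default_rng(3)
        n, m = 500, 5
        true_theta = rng.normal(0, 1, n)
        outcome = 2.0 + 0.8 * true_theta + rng.normal(0, 1, n)

        # m plausible values per respondent (simulated here as noisy draws around true proficiency)
        pvs = true_theta[:, None] + rng.normal(0, 0.5, (n, m))

        estimates, variances = [], []
        for j in range(m):
            X = np.column_stack([np.ones(n), pvs[:, j]])
            beta, res, *_ = np.linalg.lstsq(X, outcome, rcond=None)
            sigma2 = res[0] / (n - 2)
            cov = sigma2 * np.linalg.inv(X.T @ X)
            estimates.append(beta[1])
            variances.append(cov[1, 1])

        qbar = np.mean(estimates)                  # pooled slope on latent proficiency
        ubar = np.mean(variances)                  # within-imputation variance
        b = np.var(estimates, ddof=1)              # between-imputation variance
        total_var = ubar + (1 + 1 / m) * b         # Rubin's total variance
        print(f"slope on proficiency: {qbar:.3f} (SE {total_var ** 0.5:.3f})")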

  14. GEOTHERMAL EFFLUENT SAMPLING WORKSHOP

    EPA Science Inventory

    This report outlines the major recommendations resulting from a workshop to identify gaps in existing geothermal effluent sampling methodologies, define needed research to fill those gaps, and recommend strategies to lead to a standardized sampling methodology.

  15. Eligibility criteria in systematic reviews published in prominent medical journals: a methodological review.

    PubMed

    McCrae, Niall; Purssell, Edward

    2015-12-01

    Clear and logical eligibility criteria are fundamental to the design and conduct of a systematic review. This methodological review examined the quality of reporting and application of eligibility criteria in systematic reviews published in three leading medical journals. All systematic reviews in the BMJ, JAMA and The Lancet in the years 2013 and 2014 were extracted. These were assessed using a refined version of a checklist previously designed by the authors. A total of 113 papers were eligible, of which 65 were in BMJ, 17 in The Lancet and 31 in JAMA. Although a generally high level of reporting was found, eligibility criteria were often problematic. In 67% of papers, eligibility was specified after the search sources or terms. Unjustified time restrictions were used in 21% of reviews, and unpublished or unspecified data in 27%. Inconsistency between journals was apparent in the requirements for systematic reviews. The quality of reviews in these leading medical journals was high; however, there were issues that reduce the clarity and replicability of the review process. As well as providing a useful checklist, this methodological review informs the continued development of standards for systematic reviews. © 2015 John Wiley & Sons, Ltd.

  16. Reviewing methodologically disparate data: a practical guide for the patient safety research field.

    PubMed

    Brown, Katrina F; Long, Susannah J; Athanasiou, Thanos; Vincent, Charles A; Kroll, J Simon; Sevdalis, Nick

    2012-02-01

    This article addresses key questions frequently asked by researchers conducting systematic reviews in patient safety. This discipline is relatively young, and asks complex questions about complex aspects of health care delivery and experience, therefore its studies are typically methodologically heterogeneous, non-randomized and complex; but content rich and highly relevant to practice. Systematic reviews are increasingly necessary to drive forward practice and research in this area, but the data do not always lend themselves to 'standard' review methodologies. This accessible 'how-to' article demonstrates that data diversity need not preclude high-quality systematic reviews. It draws together information from published guidelines and experience within our multidisciplinary patient safety research group to provide entry-level advice for the clinician-researcher new to systematic reviewing, to non-biomedical research data or to both. It offers entry-level advice, illustrated with detailed practical examples, on defining a research question, creating a comprehensive search strategy, selecting articles for inclusion, assessing study quality, extracting data, synthesizing data and evaluating the impact of your review. The article concludes with a comment on the vital role of robust systematic reviews in the continuing advancement of the patient safety field. © 2010 Blackwell Publishing Ltd.

  17. GIS applied to location of fires detection towers in domain area of tropical forest.

    PubMed

    Eugenio, Fernando Coelho; Rosa Dos Santos, Alexandre; Fiedler, Nilton Cesar; Ribeiro, Guido Assunção; da Silva, Aderbal Gomes; Juvanhol, Ronie Silva; Schettino, Vitor Roberto; Marcatti, Gustavo Eduardo; Domingues, Getúlio Fonseca; Alves Dos Santos, Gleissy Mary Amaral Dino; Pezzopane, José Eduardo Macedo; Pedra, Beatriz Duguy; Banhos, Aureo; Martins, Lima Deleon

    2016-08-15

    In most countries, the loss of biodiversity caused by fires is worrying. Fire detection towers are crucial for rapid identification of fire outbreaks and can also be used for environmental inspection, biodiversity monitoring, telecommunications, telemetry and other purposes. Currently, the methodologies for allocating fire detection towers over large areas are numerous, complex and not standardized by government supervisory agencies. Therefore, this study proposes and evaluates different methodologies for selecting the best locations to install fire detection towers, considering topography, risk areas, conservation units and heat spots. Geographic Information Systems (GIS) techniques and unaligned stratified systematic sampling were used to implement and evaluate 9 methods for allocating fire detection towers. Among the methods evaluated, the C3 method was chosen, represented by 140 fire detection towers, with coverage of: a) 67% of the study area, b) 73.97% of the areas with high risk, c) 70.41% of the areas with very high risk, d) 70.42% of the conservation units and e) 84.95% of the heat spots in 2014. The proposed methodology can be adapted to areas of other countries. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Comparison of Damage Path Predictions for Composite Laminates by Explicit and Standard Finite Element Analysis Tools

    NASA Technical Reports Server (NTRS)

    Bogert, Philip B.; Satyanarayana, Arunkumar; Chunchu, Prasad B.

    2006-01-01

    Splitting, ultimate failure load and the damage path in center notched composite specimens subjected to in-plane tension loading are predicted using progressive failure analysis methodology. A 2-D Hashin-Rotem failure criterion is used in determining intra-laminar fiber and matrix failures. This progressive failure methodology has been implemented in the Abaqus/Explicit and Abaqus/Standard finite element codes through the user-written subroutines "VUMAT" and "USDFLD", respectively. A 2-D finite element model is used for predicting the intra-laminar damages. Analysis results obtained from the Abaqus/Explicit and Abaqus/Standard codes show good agreement with experimental results. The importance of modeling delamination in progressive failure analysis methodology is recognized for future studies. The use of an explicit integration dynamics code for simple specimen geometry and static loading establishes a foundation for future analyses where complex loading and nonlinear dynamic interactions of damage and structure will necessitate it.
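
    A minimal sketch of a 2-D Hashin-Rotem check at a single integration point, the kind of test a user material subroutine would perform before degrading ply stiffness. The strength allowables are invented, and the stiffness-degradation bookkeeping of the actual VUMAT/USDFLD routines is omitted.

        def hashin_rotem_2d(s11, s22, t12, Xt=2000.0, Xc=1500.0, Yt=50.0, Yc=200.0, S=80.0):
            """Return (fiber_index, matrix_index); an index >= 1 signals ply failure.
            Stresses and strengths in MPa; strength values are hypothetical."""
            fiber = (s11 / Xt) ** 2 if s11 >= 0 else (s11 / Xc) ** 2
            Y = Yt if s22 >= 0 else Yc
            matrix = (s22 / Y) ** 2 + (t12 / S) ** 2
            return fiber, matrix

        # ply stresses (MPa) at one element integration point
        fiber_idx, matrix_idx = hashin_rotem_2d(s11=1200.0, s22=35.0, t12=60.0)
        print(fiber_idx, matrix_idx, "matrix failed" if matrix_idx >= 1 else "intact")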

  19. Variable Star Signature Classification using Slotted Symbolic Markov Modeling

    NASA Astrophysics Data System (ADS)

    Johnston, K. B.; Peter, A. M.

    2017-01-01

    With the advent of digital astronomy, new benefits and new challenges have been presented to the modern day astronomer. No longer can the astronomer rely on manual processing, instead the profession as a whole has begun to adopt more advanced computational means. This paper focuses on the construction and application of a novel time-domain signature extraction methodology and the development of a supporting supervised pattern classification algorithm for the identification of variable stars. A methodology for the reduction of stellar variable observations (time-domain data) into a novel feature space representation is introduced. The methodology presented will be referred to as Slotted Symbolic Markov Modeling (SSMM) and has a number of advantages which will be demonstrated to be beneficial; specifically to the supervised classification of stellar variables. It will be shown that the methodology outperformed a baseline standard methodology on a standardized set of stellar light curve data. The performance on a set of data derived from the LINEAR dataset will also be shown.

  20. Variable Star Signature Classification using Slotted Symbolic Markov Modeling

    NASA Astrophysics Data System (ADS)

    Johnston, Kyle B.; Peter, Adrian M.

    2016-01-01

    With the advent of digital astronomy, new benefits and new challenges have been presented to the modern day astronomer. No longer can the astronomer rely on manual processing, instead the profession as a whole has begun to adopt more advanced computational means. Our research focuses on the construction and application of a novel time-domain signature extraction methodology and the development of a supporting supervised pattern classification algorithm for the identification of variable stars. A methodology for the reduction of stellar variable observations (time-domain data) into a novel feature space representation is introduced. The methodology presented will be referred to as Slotted Symbolic Markov Modeling (SSMM) and has a number of advantages which will be demonstrated to be beneficial; specifically to the supervised classification of stellar variables. It will be shown that the methodology outperformed a baseline standard methodology on a standardized set of stellar light curve data. The performance on a set of data derived from the LINEAR dataset will also be shown.
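
    A minimal sketch of the feature-extraction idea: an unevenly sampled light curve is averaged into fixed-width time slots, the slotted values are discretized into a small symbolic alphabet, and the symbol-to-symbol Markov transition matrix is flattened into a feature vector for a downstream classifier. The slot width, alphabet size, and light curve below are assumptions, not the authors' tuned settings.

        import numpy as np

        def ssmm_features(times, mags, slot_width=0.5, n_symbols=4):
            """Slot an irregular light curve, symbolize it, and return the flattened transition matrix."""
            slots = np.floor((times - times.min()) / slot_width).astype(int)
            slotted = np.array([mags[slots == s].mean() for s in np.unique(slots)])

            # map slotted magnitudes to symbols via quantile bins
            edges = np.quantile(slotted, np.linspace(0, 1, n_symbols + 1)[1:-1])
            symbols = np.digitize(slotted, edges)

            transitions = np.zeros((n_symbols, n_symbols))
            for a, b in zip(symbols[:-1], symbols[1:]):
                transitions[a, b] += 1
            row_sums = transitions.sum(axis=1, keepdims=True)
            transitions = np.divide(transitions, row_sums,
                                    out=np.zeros_like(transitions), where=row_sums > 0)
            return transitions.ravel()

        rng = np.random.default_rng(4)
        t = np.sort(rng.uniform(0, 50, 300))
        m = 12.0 + 0.4 * np.sin(2 * np.pi * t / 3.7) + rng.normal(0, 0.05, t.size)
        print(ssmm_features(t, m).round(2))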

  1. [Is there a German history of evidence-based medicine? Methodic standards of therapeutic research in the early 20th century and Paul Martini's "Methodology of therapeutic investigation" (1932)].

    PubMed

    Stoll, S; Roelcke, V; Raspe, H

    2005-07-29

    The article addresses the history of evidence-based medicine in Germany. Its aim was to reconstruct the standard of clinical-therapeutic investigation in Germany at the beginning of the 20th century. A historical survey of five important German general medical journals for the period between 1918 and 1932 gives an overview of the state of clinical investigation. In total, 268 clinical trials are identified and analysed with regard to their methodological design. The results are heterogeneous: while a few examples of sophisticated methodology exist, the design of the majority of the studies is poor. A response to this situation can be seen in Paul Martini's book "Methodology of Therapeutic Investigation", first published in 1932. Paul Martini's biography, his criticism of the clinical-therapeutic investigation of his time, the major points of his methodology, and the reception of the book in Germany and abroad are described.

  2. Feasibility of "Standardized Clinician" Methodology for Patient Training on Hospital-to-Home Transitions.

    PubMed

    Wehbe-Janek, Hania; Hochhalter, Angela K; Castilla, Theresa; Jo, Chanhee

    2015-02-01

    Patient engagement in health care is increasingly recognized as essential for promoting the health of individuals and populations. This study pilot-tested the standardized clinician (SC) methodology, a novel adaptation of standardized patient methodology, for teaching patient engagement skills for the complex health care situation of transitioning from a hospital back to home. Sixty-seven participants at heightened risk for hospitalization were randomly assigned to either a simulation exposure-only group or a full-intervention group. Both groups participated in simulation scenarios with "standardized clinicians" around tasks related to hospital discharge and follow-up. The full-intervention group was also debriefed after scenario sets and learned about tools for actively participating in hospital-to-home transitions. Measures included changes in observed behaviors at baseline and follow-up and an overall program evaluation. The full-intervention group showed increases in observed tool possession (P = 0.014) and expression of their preferences and values (P = 0.043). The simulation exposure-only group showed improvement in worksheet scores (P = 0.002) and fewer engagement skills (P = 0.021). Both groups showed a decrease in telling an SC about their hospital admission (P < 0.05). Open-ended comments from the program evaluation were largely positive. Both groups benefited from exposure to the SC intervention. Program evaluation data suggest that simulation training is feasible and may provide a useful methodology for teaching patient skills for active engagement in health care. Future studies are warranted to determine if this methodology can be used to assess overall patient engagement and whether new patient learning transfers to health care encounters.

  3. A Public Health Grid (PHGrid): Architecture and value proposition for 21st century public health.

    PubMed

    Savel, T; Hall, K; Lee, B; McMullin, V; Miles, M; Stinn, J; White, P; Washington, D; Boyd, T; Lenert, L

    2010-07-01

    This manuscript describes the value of and proposal for a high-level architectural framework for a Public Health Grid (PHGrid), which the authors feel has the capability to afford the public health community a robust technology infrastructure for secure and timely data, information, and knowledge exchange, not only within the public health domain, but between public health and the overall health care system. The CDC facilitated multiple Proof-of-Concept (PoC) projects, leveraging an open-source-based software development methodology, to test four hypotheses with regard to this high-level framework. The outcomes of the four PoCs, in combination with the use of the Federal Enterprise Architecture Framework (FEAF) and the newly emerging Federal Segment Architecture Methodology (FSAM), were used to develop and refine a high-level architectural framework for a Public Health Grid infrastructure. The authors were successful in documenting a robust high-level architectural framework for a PHGrid. The documentation generated provided a level of granularity needed to validate the proposal, and included examples of both information standards and services to be implemented. Both the results of the PoCs and feedback from selected public health partners were used to develop the granular documentation. A robust high-level cohesive architectural framework for a Public Health Grid (PHGrid) has been successfully articulated, with its feasibility demonstrated via multiple PoCs. In order to successfully implement this framework for a Public Health Grid, the authors recommend moving forward with a three-pronged approach focusing on interoperability and standards, streamlining the PHGrid infrastructure, and developing robust and high-impact public health services. Published by Elsevier Ireland Ltd.

  4. Application of machine learning methodology for pet-based definition of lung cancer

    PubMed Central

    Kerhet, A.; Small, C.; Quon, H.; Riauka, T.; Schrader, L.; Greiner, R.; Yee, D.; McEwan, A.; Roa, W.

    2010-01-01

    We applied a learning methodology framework to assist in the threshold-based segmentation of non-small-cell lung cancer (nsclc) tumours in positron-emission tomography–computed tomography (pet–ct) imaging for use in radiotherapy planning. Gated and standard free-breathing studies of two patients were independently analysed (four studies in total). Each study had a pet–ct and a treatment-planning ct image. The reference gross tumour volume (gtv) was identified by two experienced radiation oncologists who also determined reference standardized uptake value (suv) thresholds that most closely approximated the gtv contour on each slice. A set of uptake distribution-related attributes was calculated for each pet slice. A machine learning algorithm was trained on a subset of the pet slices to cope with slice-to-slice variation in the optimal suv threshold: that is, to predict the most appropriate suv threshold from the calculated attributes for each slice. The algorithm’s performance was evaluated using the remainder of the pet slices. A high degree of geometric similarity was achieved between the areas outlined by the predicted and the reference suv thresholds (Jaccard index exceeding 0.82). No significant difference was found between the gated and the free-breathing results in the same patient. In this preliminary work, we demonstrated the potential applicability of a machine learning methodology as an auxiliary tool for radiation treatment planning in nsclc. PMID:20179802
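
    A minimal sketch of the learning step described above: per-slice attributes of the uptake distribution are used to train a regressor that predicts a slice-specific SUV threshold, and segmentation quality is scored with the Jaccard index. The features, the choice of regressor, and the synthetic data are assumptions, not the authors' exact pipeline.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        def slice_attributes(suv_slice):
            vals = suv_slice.ravel()
            return [vals.max(), vals.mean(), vals.std(), np.percentile(vals, 95)]

        def jaccard(a, b):
            a, b = a.astype(bool), b.astype(bool)
            return np.logical_and(a, b).sum() / max(np.logical_or(a, b).sum(), 1)

        rng = np.random.default_rng(5)
        slices = [rng.gamma(2.0, 1.5, (64, 64)) for _ in range(40)]        # synthetic SUV slices
        ref_thresholds = np.array([0.4 * s.max() for s in slices])         # hypothetical reference thresholds

        X = np.array([slice_attributes(s) for s in slices])
        model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X[:30], ref_thresholds[:30])

        pred = model.predict(X[30:])
        scores = [jaccard(s >= p, s >= r) for s, p, r in zip(slices[30:], pred, ref_thresholds[30:])]
        print(f"mean Jaccard on held-out slices: {np.mean(scores):.2f}")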

  5. Methodology for Sensitivity Analysis, Approximate Analysis, and Design Optimization in CFD for Multidisciplinary Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1996-01-01

    An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations that are associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For the smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when these equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions this methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.
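
    A minimal sketch of the incremental ('delta' or 'correction') form on a toy linear system A x = b: each pass solves an approximate operator M for a correction driven by the current residual, so approximations in M affect only the convergence rate, not the converged answer. The matrix below is a small diagonally dominant stand-in, not a flow-solver Jacobian.

        import numpy as np

        rng = np.random.default_rng(6)
        n = 50
        A = np.eye(n) * 20 + rng.normal(0, 0.1, (n, n))    # stand-in for the sensitivity-equation matrix
        b = rng.normal(0, 1, n)

        # crude approximate operator: keep only the diagonal of A
        M = np.diag(np.diag(A))

        x = np.zeros(n)
        for it in range(200):
            residual = b - A @ x                           # right-hand side of the correction form
            dx = np.linalg.solve(M, residual)              # M * dx = residual
            x += dx
            if np.linalg.norm(residual) < 1e-10:
                break

        print(it, np.linalg.norm(A @ x - b))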

  6. Circulating microRNA Biomarkers as Liquid Biopsy for Cancer Patients: Pros and Cons of Current Assays

    PubMed Central

    Ono, Shigeshi; Lam, Stella; Nagahara, Makoto; Hoon, Dave S. B.

    2015-01-01

    An increasing number of studies have focused on circulating microRNAs (cmiRNA) in cancer patients' blood for their potential as minimally-invasive biomarkers. Studies have reported the utility of assessing specific miRNAs in blood as diagnostic/prognostic biomarkers; however, the methodologies are not validated or standardized across laboratories. Unfortunately, there is often minimal overlap in techniques between the results reported, even in similar types of studies on the same cancer. This hampers the interpretation and reliability of cmiRNAs as potential cancer biomarkers. Blood collection and processing, cmiRNA extractions, quality and quantity control of assays, defined patient population assessment, reproducibility, and reference standards all affect the cmiRNA assay results. To date, there is no reported definitive method to assess cmiRNAs. Therefore, appropriate and reliable methodologies are highly necessary in order for cmiRNAs to be used in regulated clinical diagnostic laboratories. In this review, we summarize the developments made over the past decade towards cmiRNA detection and discuss the pros and cons of the assays. PMID:26512704

  7. A frontier analysis approach for benchmarking hospital performance in the treatment of acute myocardial infarction.

    PubMed

    Stanford, Robert E

    2004-05-01

    This paper uses a non-parametric frontier model and adaptations of the concepts of cross-efficiency and peer-appraisal to develop a formal methodology for benchmarking provider performance in the treatment of Acute Myocardial Infarction (AMI). Parameters used in the benchmarking process are the rates of proper recognition of indications of six standard treatment processes for AMI; the decision making units (DMUs) to be compared are the Medicare eligible hospitals of a particular state; the analysis produces an ordinal ranking of individual hospital performance scores. The cross-efficiency/peer-appraisal calculation process is constructed to accommodate DMUs that experience no patients in some of the treatment categories. While continuing to rate highly the performances of DMUs which are efficient in the Pareto-optimal sense, our model produces individual DMU performance scores that correlate significantly with good overall performance, as determined by a comparison of the sums of the individual DMU recognition rates for the six standard treatment processes. The methodology is applied to data collected from 107 state Medicare hospitals.
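
    A minimal sketch, with invented recognition rates, of a cross-efficiency/peer-appraisal calculation for the single-input, multi-output setting described above: each hospital's DEA weights are found by linear programming, every hospital is then scored under every other hospital's weights, and the scores are averaged. It ignores the paper's handling of hospitals with no patients in some treatment categories.

        import numpy as np
        from scipy.optimize import linprog

        # recognition rates for six AMI treatment processes at five hypothetical hospitals (rows = DMUs)
        Y = np.array([
            [0.95, 0.90, 0.88, 0.92, 0.85, 0.80],
            [0.70, 0.75, 0.80, 0.65, 0.72, 0.78],
            [0.99, 0.60, 0.95, 0.55, 0.90, 0.70],
            [0.85, 0.85, 0.85, 0.85, 0.85, 0.85],
            [0.60, 0.65, 0.55, 0.70, 0.62, 0.58],
        ])
        n_dmu, n_out = Y.shape

        weight_sets = []
        for k in range(n_dmu):
            # DEA with a single unit input: maximise u.y_k subject to u.y_j <= 1 for all j, u >= 0
            res = linprog(c=-Y[k], A_ub=Y, b_ub=np.ones(n_dmu),
                          bounds=[(0, None)] * n_out, method="highs")
            weight_sets.append(res.x)

        cross = np.array([[w @ Y[j] for j in range(n_dmu)] for w in weight_sets])
        peer_appraisal = cross.mean(axis=0)     # each hospital appraised under every weight set
        print(np.argsort(-peer_appraisal), peer_appraisal.round(3))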

  8. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  9. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  10. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  11. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  12. 7 CFR 3600.3 - Functions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    .... agricultural and rural economy. (2) Administering a methodological research program to improve agricultural... design and data collection methodologies to the agricultural statistics program. Major functions include...) Designing, testing, and establishing survey techniques and standards, including sample design, sample...

  13. Opto-Technical Monitoring - a Standardized Methodology to Assess the Treatment of Historical Stone Surfaces

    NASA Astrophysics Data System (ADS)

    Rahrig, M.; Drewello, R.; Lazzeri, A.

    2018-05-01

    Monitoring is an essential requirement for the planning, assessment and evaluation of conservation measures. It should be based on a standardized and reproducible observation of the historical surface. For many areas and materials, suitable methods for long-term monitoring already exist, but hardly any non-destructive testing methods have been used to assess new materials for the conservation of damaged stone surfaces. The Nano-Cathedral project, funded by the European Union's Horizon 2020 research and innovation program, is developing new materials and technologies for preserving damaged stone surfaces of built heritage. The prototypes developed are adjusted to the needs and problems of a total of six major cultural monuments in Europe. In addition to the testing of the materials under controlled laboratory conditions, the products have been applied to trial areas on the original stone surfaces. For a location-independent, standardized assessment of surface changes across the entire trial areas, a monitoring method based on opto-technical, non-contact and non-destructive testing methods has been developed. This method involves a three-dimensional measurement of the surface topography using Structured-Light-Scanning and the analysis of the surfaces in different light ranges using high-resolution VIS photography, as well as UV-A-fluorescence photography and reflected near-field IR photography. The paper will show the workflow of this methodology, including a detailed description of the equipment used, the data processing, and the advantages for monitoring highly valuable stone surfaces. Alongside the theoretical discussion, the results of two measuring campaigns on trial areas of the Nano-Cathedral project will be shown.

  14. A methodology for TLD postal dosimetry audit of high-energy radiotherapy photon beams in non-reference conditions.

    PubMed

    Izewska, Joanna; Georg, Dietmar; Bera, Pranabes; Thwaites, David; Arib, Mehenna; Saravi, Margarita; Sergieva, Katia; Li, Kaibao; Yip, Fernando Garcia; Mahant, Ashok Kumar; Bulski, Wojciech

    2007-07-01

    A strategy for national TLD audit programmes has been developed by the International Atomic Energy Agency (IAEA). It involves progression through three sequential dosimetry audit steps. The first step audits are for the beam output in reference conditions for high-energy photon beams. The second step audits are for the dose in reference and non-reference conditions on the beam axis for photon and electron beams. The third step audits involve measurements of the dose in reference and non-reference conditions off-axis for open and wedged symmetric and asymmetric fields for photon beams. Through a co-ordinated research project the IAEA developed the methodology to extend the scope of national TLD auditing activities to more complex audit measurements for regular fields. Based on the IAEA standard TLD holder for high-energy photon beams, a TLD holder was developed with a horizontal arm to enable measurements 5 cm off the central axis. Basic correction factors were determined for the holder in the energy range between Co-60 and 25 MV photon beams. New procedures were developed for the TLD irradiation in hospitals. The off-axis measurement methodology for photon beams was tested in a multi-national pilot study. The statistical distribution of dosimetric parameters (off-axis ratios for open and wedge beam profiles, output factors, wedge transmission factors) checked in 146 measurements was 0.999 ± 0.012. The methodology of TLD audits in non-reference conditions with a modified IAEA TLD holder has been shown to be feasible.

  15. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Texas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Texas. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.
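
    A minimal sketch of the kind of per-square-foot life-cycle cost comparison summarized above (Scenario 1 style: no borrowing or taxes): incremental first cost is weighed against discounted annual energy cost savings over a study period. The dollar figures, discount rate, and study period are placeholders, not the report's inputs.

        def lcc_savings(added_first_cost, annual_energy_savings, years=30, discount_rate=0.03,
                        annual_maintenance_delta=0.0):
            """Net present value per square foot of adopting the newer standard (positive = cost-effective)."""
            pv_factor = sum(1 / (1 + discount_rate) ** t for t in range(1, years + 1))
            pv_savings = (annual_energy_savings - annual_maintenance_delta) * pv_factor
            return pv_savings - added_first_cost

        # hypothetical per-square-foot figures for one building type and climate zone
        print(f"LCC savings: ${lcc_savings(added_first_cost=1.20, annual_energy_savings=0.15):.2f}/ft^2")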

  16. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Minnesota

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Minnesota. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  17. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Indiana

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Indiana. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  18. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Florida

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Florida. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  19. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Maine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Maine. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  20. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Vermont

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Vermont. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  1. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Michigan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Michigan. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  2. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Alabama

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Alabama. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  3. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of New Hampshire

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of New Hampshire. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  4. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of New Mexico. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  5. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Colorado

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Colorado. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  6. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Washington

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Washington. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  7. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Montana

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Montana. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  8. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the District of Columbia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the District of Columbia. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  9. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Massachusetts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Massachusetts. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  10. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Oregon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Oregon. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  11. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Wisconsin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Wisconsin. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  12. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Ohio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Ohio. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  13. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of South Carolina

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of South Carolina. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  14. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of North Carolina

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of North Carolina. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  15. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Iowa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Iowa. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  16. Collective Protection Factors Methodology Development Using High Concentration Polydisperse Inert Aerosols: Results of FY09 Testing

    DTIC Science & Technology

    2012-04-01

    Rupprecht & Patashnick (Thermo Scientific) Sequoia Turner*** submicron all airborne sizes 0.4-10 (8 stages) 0.523-20 (5 channels) 0.3-20 (6...Products (San Diego, CA) ** Clean Air Engineering, Inc. (Palatine, IL) *** Sequoia Turner, Block Scientific (Bohemia, NY) IS Table 2. Coordinate...Corporation, New Port Richey, FL [type A/E]), analyzed by extraction, and quantitated by standard curve extrapolation using a Sequoia Turner model

  17. Ammunition for Law Enforcements. Part I. Methodology for Evaluating Relative Stopping Power and Results

    DTIC Science & Technology

    1979-10-01

    expansion, relative stopping power according to one or another formula or test method, penetration, ricochet, and other fragments. In the past, the solution...the data gathered for each test round, in the following documents: a. "Ammunition For Law Enforcement: Part II, Data Obtained for Bullets Penetrating...high-velocity testing, chamber pressures exceeded those permissible in standard handguns. For safety, then, Mann test barrels were used. At this point

  18. Application of the Critical Success Factor Methodology to DoD Organization.

    DTIC Science & Technology

    1984-09-01

    high technology manufacturing, banking, airline, insurance, railway, and automobile. Sullen (6t22-25) lists the current CSFs of the 14 S automobile ...industry as image, quality dealer system, cost control, and meeting energy standards. However, in 1981 the automobile CSFs included only styling, quality...bearing on current car purchases as well as future car buys. And finally cost control influenced the auto industry as a CSF, since profit per automobile had

  19. Accurate quantitation standards of glutathione via traceable sulfur measurement by inductively coupled plasma optical emission spectrometry and ion chromatography

    PubMed Central

    Rastogi, L.; Dash, K.; Arunachalam, J.

    2013-01-01

    The quantitative analysis of glutathione (GSH) is important in different fields like medicine, biology, and biotechnology. Accurate quantitative measurements of this analyte have been hampered by the lack of well-characterized reference standards. The proposed procedure is intended to provide an accurate and definitive method for the quantitation of GSH for reference measurements. Measurement of the stoichiometrically existing sulfur content in purified GSH offers an approach to its quantitation; calibration against an appropriately characterized reference material (CRM) for sulfur then provides a methodology for certifying the GSH quantity that is traceable to the SI (International System of Units). The inductively coupled plasma optical emission spectrometry (ICP-OES) approach negates the need for any sample digestion. The sulfur content of the purified GSH is quantitatively converted into sulfate ions by microwave-assisted UV digestion in the presence of hydrogen peroxide prior to ion chromatography (IC) measurements. The measurement of sulfur by ICP-OES and IC (as sulfate) using the “high performance” methodology could be useful for characterizing primary calibration standards and certified reference materials with low uncertainties. The relative expanded uncertainties (% U) expressed at the 95% confidence interval for ICP-OES analyses varied from 0.1% to 0.3%, while in the case of IC, they were between 0.2% and 1.2%. The described methods are more suitable for characterizing primary calibration standards and certifying reference materials of GSH than for routine measurements. PMID:29403814
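
    Because each glutathione molecule contains exactly one sulfur atom, a traceable sulfur measurement maps directly onto a GSH amount. The sketch below shows only that stoichiometric conversion with an invented sulfur result; it is not the paper's procedure or uncertainty budget.

    ```python
    # Stoichiometric conversion from a traceable sulfur mass fraction to an implied GSH
    # mass fraction (one S atom per GSH molecule). The measured value is a made-up example.

    M_S = 32.06     # molar mass of sulfur, g/mol
    M_GSH = 307.32  # molar mass of glutathione (C10H17N3O6S), g/mol

    def gsh_mass_fraction(sulfur_mass_fraction):
        """GSH mass fraction implied by the measured sulfur mass fraction."""
        return sulfur_mass_fraction * (M_GSH / M_S)

    w_S = 0.102  # example: 10.2 % sulfur by mass from ICP-OES or IC (as sulfate)
    print(f"Implied GSH content: {gsh_mass_fraction(w_S) * 100:.1f} % by mass")
    ```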

  20. Systematic and progressive implementation of the centers of excellence for rheumatoid arthritis: a methodological proposal.

    PubMed

    Santos-Moreno, Pedro; Caballero-Uribe, Carlo V; Massardo, Maria Loreto; Maldonado, Claudio Galarza; Soriano, Enrique R; Pineda, Carlos; Cardiel, Mario; Benavides, Juan Alberto; Beltrán, Paula Andrea

    2017-12-01

    The implementation of excellence centers for specific diseases has been gaining recognition in the field of health. Specifically in rheumatoid arthritis, where the prognosis of the disease is related to an early diagnosis and a timely intervention, it is necessary that health services be provided in an environment of quality, opportunity, and safety with the highest standards of care. A methodology that makes this implementation achievable by most care centers is a priority for providing better attention to populations with this disease. In this paper, we propose a systematic and progressive methodology that will help institutions develop successful models without faltering in the process. The expected impact on public health is defined by better effective coverage of high-quality treatments, obtaining better health outcomes with safety and accessibility that reduces the budgetary impact for the health systems of our countries.

  1. Selected Streamflow Statistics and Regression Equations for Predicting Statistics at Stream Locations in Monroe County, Pennsylvania

    USGS Publications Warehouse

    Thompson, Ronald E.; Hoffman, Scott A.

    2006-01-01

    A suite of 28 streamflow statistics, ranging from extreme low to high flows, was computed for 17 continuous-record streamflow-gaging stations and predicted for 20 partial-record stations in Monroe County and contiguous counties in northeastern Pennsylvania. The predicted statistics for the partial-record stations were based on regression analyses relating intermittent flow measurements made at the partial-record stations indexed to concurrent daily mean flows at continuous-record stations during base-flow conditions. The same statistics also were predicted for 134 ungaged stream locations in Monroe County on the basis of regression analyses relating the statistics to GIS-determined basin characteristics for the continuous-record station drainage areas. The prediction methodology for developing the regression equations used to estimate statistics was developed for estimating low-flow frequencies. This study and a companion study found that the methodology also has application potential for predicting intermediate- and high-flow statistics. The statistics included mean monthly flows, mean annual flow, 7-day low flows for three recurrence intervals, nine flow durations, mean annual base flow, and annual mean base flows for two recurrence intervals. Low standard errors of prediction and high coefficients of determination (R2) indicated good results in using the regression equations to predict the statistics. Regression equations for the larger flow statistics tended to have lower standard errors of prediction and higher coefficients of determination (R2) than equations for the smaller flow statistics. The report discusses the methodologies used in determining the statistics and the limitations of the statistics and of the equations used to predict them. Caution is indicated in using the predicted statistics for small drainage-area situations. Study results constitute input needed by water-resource managers in Monroe County for planning purposes and evaluation of water-resources availability.
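
    The regression step described above can be sketched as an ordinary least-squares fit in log space, a common form for equations that relate flow statistics to basin characteristics. The example below uses synthetic data and a single predictor (drainage area); it is not the study's equations, data, or coefficients.

    ```python
    # Generic log-space regression of a flow statistic on a basin characteristic.
    # Synthetic data only; the study uses GIS-determined basin characteristics and
    # reports standard errors of prediction for each equation.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 17                                  # e.g., number of continuous-record stations
    drainage_area = rng.uniform(5, 200, n)  # mi^2 (synthetic)
    mean_flow = 1.2 * drainage_area ** 0.95 * rng.lognormal(0, 0.1, n)  # cfs (synthetic)

    # Fit log10(Q) = b0 + b1 * log10(A)
    X = np.column_stack([np.ones(n), np.log10(drainage_area)])
    beta, *_ = np.linalg.lstsq(X, np.log10(mean_flow), rcond=None)

    def predict_flow(area_mi2):
        """Predict the flow statistic at an ungaged site from its drainage area."""
        return 10 ** (beta[0] + beta[1] * np.log10(area_mi2))

    print(f"fitted exponent on drainage area: {beta[1]:.2f}")
    print(f"predicted statistic for a 50 mi^2 basin: {predict_flow(50):.1f} cfs")
    ```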

  2. Lognormal Kalman filter for assimilating phase space density data in the radiation belts

    NASA Astrophysics Data System (ADS)

    Kondrashov, D.; Ghil, M.; Shprits, Y.

    2011-11-01

    Data assimilation combines a physical model with sparse observations and has become an increasingly important tool for scientists and engineers in the design, operation, and use of satellites and other high-technology systems in the near-Earth space environment. Of particular importance is predicting fluxes of high-energy particles in the Van Allen radiation belts, since these fluxes can damage spaceborne platforms and instruments during strong geomagnetic storms. In transiting from a research setting to operational prediction of these fluxes, improved data assimilation is of the essence. The present study is motivated by the fact that phase space densities (PSDs) of high-energy electrons in the outer radiation belt—both simulated and observed—are subject to spatiotemporal variations that span several orders of magnitude. Standard data assimilation methods that are based on least squares minimization of normally distributed errors may not be adequate for handling the range of these variations. We propose herein a modification of Kalman filtering that uses a log-transformed, one-dimensional radial diffusion model for the PSDs and includes parameterized losses. The proposed methodology is first verified on model-simulated, synthetic data and then applied to actual satellite measurements. When the model errors are sufficiently smaller than observational errors, our methodology can significantly improve analysis and prediction skill for the PSDs compared to those of the standard Kalman filter formulation. This improvement is documented by monitoring the variance of the innovation sequence.
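
    The central idea, filtering in log space so that order-of-magnitude variations in PSD become additive, can be shown with a scalar Kalman update. The sketch below is a generic one-dimensional illustration with a random-walk forecast standing in for the radial diffusion model; all values are synthetic and none come from the paper.

    ```python
    # Scalar Kalman filter applied to log-transformed phase space density (PSD).
    # The forecast step here is a random walk, not the paper's radial diffusion model
    # with parameterized losses; all numbers are synthetic.
    import numpy as np

    def kalman_update(x_prior, P_prior, z, R):
        """Standard scalar Kalman measurement update."""
        K = P_prior / (P_prior + R)  # Kalman gain
        return x_prior + K * (z - x_prior), (1 - K) * P_prior

    Q, R = 0.05, 0.2            # model and observation error variances (log10 units)
    x, P = np.log10(1e-7), 1.0  # initial log10(PSD) estimate and its variance

    for obs_psd in (2e-7, 5e-7, 1e-6):  # synthetic observations spanning a decade
        P = P + Q                       # forecast step (state unchanged in a random walk)
        x, P = kalman_update(x, P, np.log10(obs_psd), R)
        print(f"analysis PSD estimate: {10 ** x:.2e}")
    ```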

  3. Prevalence of Overweight and Obesity among Female Adolescents in Jordan: A comparison between Two International Reference Standards

    PubMed Central

    O. Musaiger, Abdulrahman; Al-Mannai, Mariam; Tayyem, Reema

    2013-01-01

    Objective: To find out the prevalence of overweight and obesity among female adolescents in Jordan. Methodology: A cross-sectional survey on females aged 15–18 in Amman, Jordan, was carried out using a multistage stratified random sampling method. The total sample size was 475 girls. Weight and height were measured and body mass index for age was used to determine overweight and obesity using the IOTF and WHO international standards. Results: The prevalence of overweight and obesity decreased with age. The highest prevalence of overweight and obesity was reported at age 15 (24.4% and 8.9%, respectively). The WHO standard showed a higher prevalence of obesity than the IOTF standard in all age groups. Conclusions: Overweight and obesity are serious public health problems among adolescents in Jordan, using both international standards. A program to combat obesity among schoolchildren, therefore, should be given a high priority in school health policy in Jordan. PMID:24353605
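
    Classifying the same measurements against two reference standards amounts to two lookups of age- and sex-specific BMI cut-offs. The sketch below only shows the comparison logic; the cut-off values are placeholders, not the actual IOTF or WHO tables.

    ```python
    # Compare overweight/obesity classification of the same BMI values under two reference
    # standards. Cut-off values are PLACEHOLDERS; real IOTF and WHO cut-offs are tabulated
    # by age and sex and are not reproduced here.

    def classify(bmi, cutoffs):
        overweight, obese = cutoffs
        if bmi >= obese:
            return "obese"
        if bmi >= overweight:
            return "overweight"
        return "not overweight"

    # Hypothetical (overweight, obese) cut-offs for one age/sex group under two standards.
    CUTOFFS = {"standard_A": (23.9, 29.1), "standard_B": (23.1, 27.8)}
    sample_bmis = [21.5, 24.2, 28.5, 30.3]  # synthetic BMI-for-age values

    for name, cutoffs in CUTOFFS.items():
        flagged = sum(classify(b, cutoffs) != "not overweight" for b in sample_bmis)
        print(f"{name}: overweight or obese in {100 * flagged / len(sample_bmis):.0f}% of the sample")
    ```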

  4. Two-part models with stochastic processes for modelling longitudinal semicontinuous data: Computationally efficient inference and modelling the overall marginal mean.

    PubMed

    Yiu, Sean; Tom, Brian Dm

    2017-01-01

    Several researchers have described two-part models with patient-specific stochastic processes for analysing longitudinal semicontinuous data. In theory, such models can offer greater flexibility than the standard two-part model with patient-specific random effects. However, in practice, the high dimensional integrations involved in the marginal likelihood (i.e. integrated over the stochastic processes) significantly complicates model fitting. Thus, non-standard computationally intensive procedures based on simulating the marginal likelihood have so far only been proposed. In this paper, we describe an efficient method of implementation by demonstrating how the high dimensional integrations involved in the marginal likelihood can be computed efficiently. Specifically, by using a property of the multivariate normal distribution and the standard marginal cumulative distribution function identity, we transform the marginal likelihood so that the high dimensional integrations are contained in the cumulative distribution function of a multivariate normal distribution, which can then be efficiently evaluated. Hence, maximum likelihood estimation can be used to obtain parameter estimates and asymptotic standard errors (from the observed information matrix) of model parameters. We describe our proposed efficient implementation procedure for the standard two-part model parameterisation and when it is of interest to directly model the overall marginal mean. The methodology is applied on a psoriatic arthritis data set concerning functional disability.
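
    The computational point, that the awkward high-dimensional integral can be pushed into a multivariate normal cumulative distribution function which standard libraries evaluate efficiently, can be illustrated generically. The snippet below only demonstrates evaluating such a CDF with SciPy (assuming scipy.stats.multivariate_normal); the dimension, correlation structure, and limits are arbitrary, and it is not the authors' likelihood.

    ```python
    # Generic illustration: a moderately high-dimensional Gaussian probability evaluated
    # via the multivariate normal CDF, the kind of quantity the transformed marginal
    # likelihood reduces to. Dimensions, covariance, and limits are arbitrary.
    import numpy as np
    from scipy.stats import multivariate_normal

    d = 8                                                # number of repeated measurements (arbitrary)
    rho = 0.5                                            # exchangeable correlation (arbitrary)
    cov = rho * np.ones((d, d)) + (1 - rho) * np.eye(d)
    upper = np.full(d, 0.8)                              # upper integration limits (arbitrary)

    prob = multivariate_normal(mean=np.zeros(d), cov=cov).cdf(upper)
    print(f"P(Z_1 <= 0.8, ..., Z_{d} <= 0.8) = {prob:.4f}")
    ```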

  5. Refining a methodology for determining the economic impacts of transportation improvements.

    DOT National Transportation Integrated Search

    2012-07-01

    Estimating the economic impact of transportation improvements has previously proven to be a difficult task. After an exhaustive literature review, it was clear that the transportation profession lacked standards and methodologies for determining econ...

  6. Expanded uncertainty estimation methodology in determining the sandy soils filtration coefficient

    NASA Astrophysics Data System (ADS)

    Rusanova, A. D.; Malaja, L. D.; Ivanov, R. N.; Gruzin, A. V.; Shalaj, V. V.

    2018-04-01

    The combined standard uncertainty estimation methodology in determining the sandy soils filtration coefficient has been developed. The laboratory researches were carried out which resulted in filtration coefficient determination and combined uncertainty estimation obtaining.
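
    The combined and expanded uncertainty reported for the filtration coefficient follows the usual propagation pattern: combine the input standard uncertainties in quadrature, weighted by sensitivity coefficients, and multiply by a coverage factor. The sketch below shows that generic GUM-style calculation with made-up components; it is not the authors' actual uncertainty budget.

    ```python
    # Generic GUM-style combined and expanded uncertainty for y = f(x1, x2, ...).
    # The (standard uncertainty, sensitivity coefficient) pairs are hypothetical.
    import math

    components = [
        (0.02, 1.0),   # e.g., hydraulic head measurement (hypothetical)
        (0.5,  0.01),  # e.g., specimen geometry (hypothetical)
        (0.03, 0.8),   # e.g., flow-rate measurement (hypothetical)
    ]

    u_combined = math.sqrt(sum((u * c) ** 2 for u, c in components))
    k = 2.0  # coverage factor for roughly 95 % coverage
    print(f"combined standard uncertainty: {u_combined:.3f}")
    print(f"expanded uncertainty (k = 2):  {k * u_combined:.3f}")
    ```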

  7. 76 FR 61287 - Request for Public Comment on the United States Standards for Barley

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-04

    ... barley marketing and define U.S. barley quality in the domestic and global marketplace. The standards define commonly used industry terms; contain basic principles governing the application of standards... standards using approved methodologies and can be applied at any point in the marketing chain. Furthermore...

  8. Air Force Energy Plan 2010

    DTIC Science & Technology

    2009-11-24

    production on Air Bases  Field the Critical Asset Prioritization Methodology (CAPM) tool  Manage costs  Provide energy leadership throughout the Air...residing on military installations • Field the Critical Asset Prioritization Methodology (CAPM) tool. This CAPM tool will allow prioritization of Air...fielding of the Critical Asset Prioritization Methodology (CAPM) tool and the adoption of financial standards to enable transparency across Air

  9. Single point aerosol sampling: evaluation of mixing and probe performance in a nuclear stack.

    PubMed

    Rodgers, J C; Fairchild, C I; Wood, G O; Ortiz, C A; Muyshondt, A; McFarland, A R

    1996-01-01

    Alternative reference methodologies have been developed for sampling of radionuclides from stacks and ducts, which differ from the methods previously required by the United States Environmental Protection Agency. These alternative reference methodologies have recently been approved by the U.S. EPA for use in lieu of the current standard techniques. The standard EPA methods are prescriptive in selection of sampling locations and in design of sampling probes whereas the alternative reference methodologies are performance driven. Tests were conducted in a stack at Los Alamos National Laboratory to demonstrate the efficacy of some aspects of the alternative reference methodologies. Coefficients of variation of velocity, tracer gas, and aerosol particle profiles were determined at three sampling locations. Results showed that numerical criteria placed upon the coefficients of variation by the alternative reference methodologies were met at sampling stations located 9 and 14 stack diameters from the flow entrance, but not at a location that was 1.5 diameters downstream from the inlet. Experiments were conducted to characterize the transmission of 10 microns aerodynamic diameter liquid aerosol particles through three types of sampling probes. The transmission ratio (ratio of aerosol concentration at the probe exit plane to the concentration in the free stream) was 107% for a 113 L min-1 (4-cfm) anisokinetic shrouded probe, but only 20% for an isokinetic probe that follows the existing EPA standard requirements. A specially designed isokinetic probe showed a transmission ratio of 63%. The shrouded probe performance would conform to the alternative reference methodologies criteria; however, the isokinetic probes would not.

  10. Architectural Methodology Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    The establishment of conventions between two communicating entities in the end systems is essential for communications. Examples of the kind of decisions that need to be made in establishing a protocol convention include the nature of the data representation, the format and the speed of the data representation over the communications path, and the sequence of control messages (if any) which are sent. One of the main functions of a protocol is to establish a standard path between the communicating entities. This is necessary to create a virtual communications medium with certain desirable characteristics. In essence, it is the function of the protocol to transform the characteristics of the physical communications environment into a more useful virtual communications model. The final function of a protocol is to establish standard data elements for communications over the path; that is, the protocol serves to create a virtual data element for exchange. Other systems may be constructed in which the transferred element is a program or a job. Finally, there are special purpose applications in which the element to be transferred may be a complex structure such as all or part of a graphic display. NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to describe the methodologies used in developing a protocol architecture for an in-space Internet node. The node would support NASA's four mission areas: Earth Science; Space Science; Human Exploration and Development of Space (HEDS); Aerospace Technology. This report presents the methodology for developing the protocol architecture. The methodology addresses the architecture for a computer communications environment. It does not address an analog voice architecture.

  11. Standardizing economic analysis in prevention will require substantial effort.

    PubMed

    Guyll, Max

    2014-12-01

    It is exceedingly difficult to compare results of economic analyses across studies due to variations in assumptions, methodology, and outcome measures, a fact which surely decreases the impact and usefulness of prevention-related economic research. Therefore, Crowley et al. (Prevention Science, 2013) are precisely correct in their call for increased standardization and have usefully highlighted the issues that must be addressed. However, having made the need clear, the questions become what form the solution should take, and how should it be implemented. The present discussion outlines the rudiments of a comprehensive framework for promoting standardized methodology in the estimation of economic outcomes, as encouraged by Crowley et al. In short, a single, standard, reference case approach should be clearly articulated, and all economic research should be encouraged to apply that standard approach, with results from compliant analyses being reported in a central archive. Properly done, the process would increase the ability of those without specialized training to contribute to the body of economic research pertaining to prevention, and the most difficult tasks of predicting and monetizing distal outcomes would be readily completed through predetermined models. These recommendations might be viewed as somewhat forcible, insomuch as they advocate for prescribing the details of a standard methodology and establishing a means of verifying compliance. However, it is unclear that the best practices proposed by Crowley et al. will be widely adopted in the absence of a strong and determined approach.

  12. 76 FR 52892 - Energy Conservation Program: Energy Conservation Standards for Fluorescent Lamp Ballasts

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-24

    ... between the DOE test data and the data submitted by NEMA; describe the methodological changes DOE is... differences between test data obtained by DOE and test data submitted by NEMA; (3) describe the methodological...

  13. Methodology for the Assessment of the Macroeconomic Impacts of Stricter CAFE Standards - Addendum

    EIA Publications

    2002-01-01

    This assessment of the economic impacts of Corporate Average Fuel Economy (CAFE) standards marks the first time the Energy Information Administration has used the new direct linkage of the DRI-WEFA Macroeconomic Model to the National Energy Modeling System (NEMS) in a policy setting. This methodology assures an internally consistent solution between the energy market concepts forecast by NEMS and the aggregate economy as forecast by the DRI-WEFA Macroeconomic Model of the U.S. Economy.

  14. Non-destructive fraud detection in rosehip oil by MIR spectroscopy and chemometrics.

    PubMed

    Santana, Felipe Bachion de; Gontijo, Lucas Caixeta; Mitsutake, Hery; Mazivila, Sarmento Júnior; Souza, Leticia Maria de; Borges Neto, Waldomiro

    2016-10-15

    Rosehip oil (Rosa eglanteria L.) is an important oil in the food, pharmaceutical and cosmetic industries. However, due to its high added value, it is liable to adulteration with other cheaper or lower quality oils. With this perspective, this work provides a new simple, fast and accurate methodology using mid-infrared (MIR) spectroscopy and partial least squares discriminant analysis (PLS-DA) as a means to discriminate authentic rosehip oil from adulterated rosehip oil containing soybean, corn and sunflower oils in different proportions. The model showed excellent sensitivity and specificity with 100% correct classification. Therefore, the developed methodology is a viable alternative for use in the laboratory and industry for standard quality analysis of rosehip oil since it is fast, accurate and non-destructive. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. A Methodology for Protective Vibration Monitoring of Hydropower Units Based on the Mechanical Properties.

    PubMed

    Nässelqvist, Mattias; Gustavsson, Rolf; Aidanpää, Jan-Olov

    2013-07-01

    It is important to monitor the radial loads in hydropower units in order to protect the machine from harmful radial loads. Existing recommendations in the standards regarding the radial movements of the shaft and bearing housing in hydropower units, ISO-7919-5 (International Organization for Standardization, 2005, "ISO 7919-5: Mechanical Vibration-Evaluation of Machine Vibration by Measurements on Rotating Shafts-Part 5: Machine Sets in Hydraulic Power Generating and Pumping Plants," Geneva, Switzerland) and ISO-10816-5 (International Organization for Standardization, 2000, "ISO 10816-5: Mechanical Vibration-Evaluation of Machine Vibration by Measurements on Non-Rotating Parts-Part 5: Machine Sets in Hydraulic Power Generating and Pumping Plants," Geneva, Switzerland), have alarm levels based on statistical data and do not consider the mechanical properties of the machine. The synchronous speed of the unit determines the maximum recommended shaft displacement and housing acceleration, according to these standards. This paper presents a methodology for setting alarm and trip levels based on the design criteria of the hydropower unit and the measured radial loads in the machine during operation. When a hydropower unit is designed, one of its design criteria is to withstand certain load spectra without the occurrence of fatigue in the mechanical components. These calculated limits for fatigue are used to set limits for the maximum radial loads allowed in the machine before it shuts down in order to protect itself from damage due to high radial loads. Radial loads in hydropower units are caused by unbalance, shape deviations, dynamic flow properties in the turbine, etc. Standards exist for balancing, and manufacturers (and power plant owners) have recommendations for maximum allowed shape deviations in generators. These standards and recommendations determine which loads, at a maximum, should be allowed before an alarm is issued indicating that the machine needs maintenance. The radial bearing load can be determined using load cells, bearing properties multiplied by shaft displacement, or bearing bracket stiffness multiplied by housing compression or movement. Different load measurement methods should be used depending on the design of the machine and accuracy demands in the load measurement. The methodology presented in the paper is applied to a 40 MW hydropower unit; suggestions are presented for the alarm and trip levels for the machine based on the mechanical properties and radial loads.
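
    One of the measurement routes listed above, bearing properties multiplied by shaft displacement, reduces in the simplest case to comparing a computed radial load against fatigue-based alarm and trip limits. The sketch below is a schematic illustration with invented stiffness and limit values; it is not the 40 MW unit's data or the paper's full procedure.

    ```python
    # Schematic protective-monitoring check: radial bearing load estimated from shaft
    # displacement and bearing stiffness, compared with alarm/trip limits derived from
    # the unit's fatigue design criteria. All numbers are invented placeholders.

    BEARING_STIFFNESS = 8.0e8  # N/m, hypothetical radial bearing stiffness
    ALARM_LOAD = 1.5e5         # N, hypothetical maintenance-alarm limit
    TRIP_LOAD = 3.0e5          # N, hypothetical shutdown limit from fatigue design

    def check_radial_load(shaft_displacement_m):
        """Return the estimated radial load and the protective action it implies."""
        load = BEARING_STIFFNESS * shaft_displacement_m
        if load >= TRIP_LOAD:
            return load, "TRIP: shut down unit"
        if load >= ALARM_LOAD:
            return load, "ALARM: schedule maintenance"
        return load, "OK"

    for disp_um in (50, 250, 450):  # shaft displacement in micrometres
        load, action = check_radial_load(disp_um * 1e-6)
        print(f"{disp_um} um -> {load / 1e3:.0f} kN -> {action}")
    ```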

  16. High-throughput alternative splicing detection using dually constrained correspondence analysis (DCCA).

    PubMed

    Baty, Florent; Klingbiel, Dirk; Zappa, Francesco; Brutsche, Martin

    2015-12-01

    Alternative splicing is an important component of tumorigenesis. The recent advent of exon array technology enables the detection of alternative splicing at a genome-wide scale. The analysis of high-throughput alternative splicing is not yet standard and methodological developments are still needed. We propose a novel statistical approach, Dually Constrained Correspondence Analysis (DCCA), for the detection of splicing changes in exon array data. Using this methodology, we investigated the genome-wide alteration of alternative splicing in patients with non-small cell lung cancer treated by bevacizumab/erlotinib. Splicing candidates reveal a series of genes related to carcinogenesis (SFTPB), cell adhesion (STAB2, PCDH15, HABP2), tumor aggressiveness (ARNTL2), apoptosis, proliferation and differentiation (PDE4D, FLT3, IL1R2), cell invasion (ETV1), as well as tumor growth (OLFM4, FGF14), tumor necrosis (AFF3) or tumor suppression (TUSC3, CSMD1, RHOBTB2, SERPINB5), with indication of known alternative splicing in a majority of genes. DCCA facilitates the identification of putative biologically relevant alternative splicing events in high-throughput exon array data. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Proceedings of the Seminar on the DOD Computer Security Initiative (4th) Held at the National Bureau of Standards, Gaithersburg, Maryland on August 10-12, 1981.

    DTIC Science & Technology

    1981-01-01

    comparison of formal and informal design methodologies will show how we think they are converging. Lastly, I will describe our involvement with the DoD...computer security must begin with the design methodology, with the objective being provability. The idea of a formal evaluation and on-the-shelf... Methodologies] Here we can compare the formal design methodologies with those used by informal practitioners like Control Data. Obviously, both processes

  18. Combat Stress: A Collateral Effect in the Operational Effectiveness Loss Multiplier (OELM) Methodology

    DTIC Science & Technology

    2015-02-01

    5202, Draft Final (Alexandria, VA: IDA, April 2015), 10-4. 14 North Atlantic Treaty Organization (NATO) Standardization Agency (NSA), NATO Glossary of...Belgium: NSA, 2012), 2-C-2. 15 Disraelly et al., “A New Methodology for CBRN Casualty Estimation,” 228. 16 Disraelly et al., A Methodology for...20 NATO NSA, AAP-06, 2-K-1. 21 Ibid., 2-D-6. 22 Disraelly et al., A Methodology for Examining Collateral Effects on Military Operations during

  19. Experience with abstract notation one

    NASA Technical Reports Server (NTRS)

    Harvey, James D.; Weaver, Alfred C.

    1990-01-01

    The development of computer science has produced a vast number of machine architectures, programming languages, and compiler technologies. The cross product of these three characteristics defines the spectrum of previous and present data representation methodologies. With regard to computer networks, the uniqueness of these methodologies presents an obstacle when disparate host environments are to be interconnected. Interoperability within a heterogeneous network relies upon the establishment of data representation commonality. The International Standards Organization (ISO) is currently developing the abstract syntax notation one standard (ASN.1) and the basic encoding rules standard (BER) that collectively address this problem. When used within the presentation layer of the open systems interconnection reference model, these two standards provide the data representation commonality required to facilitate interoperability. The details of a compiler that was built to automate the use of ASN.1 and BER are described. From this experience, insights into both standards are given and potential problems relating to this development effort are discussed.

  20. Minimum reporting standards for clinical research on groin pain in athletes

    PubMed Central

    Delahunt, Eamonn; Thorborg, Kristian; Khan, Karim M; Robinson, Philip; Hölmich, Per; Weir, Adam

    2015-01-01

    Groin pain in athletes is a priority area for sports physiotherapy and sports medicine research. Heterogeneous studies with low methodological quality dominate research related to groin pain in athletes. Low-quality studies undermine the external validity of research findings and limit the ability to generalise findings to the target patient population. Minimum reporting standards for research on groin pain in athletes are overdue. We propose a set of minimum reporting standards based on best available evidence to be utilised in future research on groin pain in athletes. Minimum reporting standards are provided in relation to: (1) study methodology, (2) study participants and injury history, (3) clinical examination, (4) clinical assessment and (5) radiology. Adherence to these minimum reporting standards will strengthen the quality and transparency of research conducted on groin pain in athletes. This will allow an easier comparison of outcomes across studies in the future. PMID:26031644

  1. Proposed Objective Odor Control Test Methodology for Waste Containment

    NASA Technical Reports Server (NTRS)

    Vos, Gordon

    2010-01-01

    The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentrations quantitatively. IAA is a clear, colorless liquid with a banana-like odor, a documented detectable smell threshold for humans of 0.025 PPM, and a limit of quantitation of 15 PPB.
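
    The quantitative part of such a test parallels respirator fit testing: challenge the sealed bag with a known tracer concentration, measure what escapes, and relate the leaked concentration to the documented odor threshold. The sketch below is a hypothetical illustration of that ratio calculation; only the 0.025 PPM odor threshold comes from the text above, and all other numbers are invented.

    ```python
    # Hypothetical containment-factor calculation for an isoamyl acetate (IAA) odor
    # containment test. Concentrations are invented example values.

    IAA_ODOR_THRESHOLD_PPM = 0.025  # documented detectable smell threshold for humans

    def containment_factor(challenge_ppm, leaked_ppm):
        """Ratio of challenge concentration to the concentration measured outside the bag
        (analogous to a respirator fit factor)."""
        return challenge_ppm / leaked_ppm

    challenge = 50.0  # ppm IAA inside the waste bag (example)
    leaked = 0.010    # ppm IAA measured outside the sealed bag (example)

    print(f"containment factor: {containment_factor(challenge, leaked):.0f}")
    print("odor detectable outside bag" if leaked >= IAA_ODOR_THRESHOLD_PPM
          else "odor not detectable outside bag")
    ```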

  2. Enhanced Satellite Remote Sensing of Coastal Waters Using Spatially Improved Bio-Optical Products from SNPP-VIIRS

    DTIC Science & Technology

    2015-01-01

    a spatial resolution of 250-m. The Gumley et al. computation for MODIS sharpening is given as a ratio of high to low resolution top of the atmosphere...NIR) correction (Stumpf, Arnone, Gould, Martinolich, & Ransibrahamanakul, 2003). Standard flags were used to mask interference from land, clouds, sun...technique This new approach expands on the methodology described by Gumley et al. (2010), with some modifications. We will compute a similar spatial

  3. High Performance Embedded Computing Software Initiative (HPEC-SI)

    DTIC Science & Technology

    2004-08-20

    models, methodologies, and standards Slide-5 www.hpec-si.org MITRE AFRL MIT Lincoln...Linderman AFRL Dr. Richard Games MITRE Mr. John Grosh OSD Mr. Bob Graybill DARPA/ITO Dr. Keith Bromley SPAWAR Dr. Mark Richards GTRI Dr. Jeremy Kepner...Capt. Bergmann AFRL Dr. Tony Skjellum MPISoft ... Advanced Research Mr. Bob Graybill DARPA • Partnership with ODUSD(S&T), Government Labs, FFRDCs

  4. Safety in construction--a comprehensive description of the characteristics of high safety standards in construction work, from the combined perspective of supervisors and experienced workers.

    PubMed

    Törner, Marianne; Pousette, Anders

    2009-01-01

    The often applied engineering approach to safety management in the construction industry needs to be supplemented by organizational measures and measures based on how people conceive and react to their social environment. This requires in-depth knowledge of the broad preconditions for high safety standards in construction. The aim of the study was to comprehensively describe the preconditions and components of high safety standards in the construction industry from the perspective of both experienced construction workers and first-line managers. Five worker safety representatives and 19 first-line managers were interviewed, all strategically selected from within a large Swedish construction project. Phenomenographic methodology was used for data acquisition and analysis and to categorize the information. Nine informants verified the results. The study identified four main categories of work safety preconditions and components: (1) Project characteristics and nature of the work, which set the limits of safety management; (2) Organization and structures, with the subcategories planning, work roles, procedures, and resources; (3) Collective values, norms, and behaviors, with the subcategories climate and culture, and interaction and cooperation; and (4) Individual competence and attitudes, with the subcategories knowledge, ability and experience, and individual attitudes. The results comprehensively describe high safety standards in construction, incorporating organizational, group, individual, and technical aspects. High-quality interaction between different organizational functions and hierarchical levels stood out as important aspects of safety. The results are discussed in relation to previous research into safety and into the social-psychological preconditions for other desired outcomes in occupational settings. The results can guide construction companies in planning and executing construction projects to a high safety standard.

  5. Software production methodology tested project

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1976-01-01

    The history and results of a 3 1/2-year study in software development methodology are reported. The findings of this study have become the basis for DSN software development guidelines and standard practices. The article discusses accomplishments, discoveries, problems, recommendations and future directions.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Arizona. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  7. A standard methodology for the analysis, recording, and control of verbal behavior

    PubMed Central

    Drash, Philip W.; Tudor, Roger M.

    1991-01-01

    Lack of a standard methodology has been one of the major obstacles preventing advancement of behavior analytic research in verbal behavior. This article presents a standard method for the analysis, recording, and control of verbal behavior that overcomes several major methodological problems that have hindered operant research in verbal behavior. The system divides all verbal behavior into four functional response classes, correct, error, no response, and inappropriate behavior, from which all vocal responses of a subject may be classified and consequated. The effects of contingencies of reinforcement on verbal operants within each category are made immediately visible to the researcher as changes in frequency of response. Incorporating frequency of response within each category as the unit of response allows both rate and probability of verbal response to be utilized as basic dependent variables. This method makes it possible to record and consequate verbal behavior in essentially the same way as any other operant response. It may also facilitate an experimental investigation of Skinner's verbal response categories. PMID:22477629

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Hawaii. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Connecticut. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  10. Toward a new culture in verified quantum operations

    NASA Astrophysics Data System (ADS)

    Flammia, Steve

    Measuring error rates of quantum operations has become an indispensable component in any aspiring platform for quantum computation. As the quality of controlled quantum operations increases, the demands on the accuracy and precision with which we measure these error rates also grow. However, well-meaning scientists who report these error measures are faced with a sea of non-standardized methodologies and are often asked during publication for only coarse information about how their estimates were obtained. Moreover, there are serious incentives to use methodologies and measures that will continually produce numbers that improve with time to show progress. These problems will only be exacerbated as our typical error rates go from 1 in 100 to 1 in 1000 or less. This talk will survey existing challenges presented by the current paradigm and offer some suggestions for solutions that can help us move toward fair and standardized methods for error metrology in quantum computing experiments, and towards a culture that values full disclosure of methodologies and higher standards for data analysis.

  11. Systematic review of the methodological quality of controlled trials evaluating Chinese herbal medicine in patients with rheumatoid arthritis.

    PubMed

    Pan, Xin; Lopez-Olivo, Maria A; Song, Juhee; Pratt, Gregory; Suarez-Almazor, Maria E

    2017-03-01

    We appraised the methodological and reporting quality of randomised controlled clinical trials (RCTs) evaluating the efficacy and safety of Chinese herbal medicine (CHM) in patients with rheumatoid arthritis (RA). For this systematic review, electronic databases were searched from inception until June 2015. The search was limited to humans and non-case report studies, but was not limited by language, year of publication or type of publication. Two independent reviewers selected RCTs, evaluating CHM in RA (herbals and decoctions). Descriptive statistics were used to report on risk of bias and their adherence to reporting standards. Multivariable logistic regression analysis was performed to determine study characteristics associated with high or unclear risk of bias. Out of 2342 unique citations, we selected 119 RCTs including 18 919 patients: 10 108 patients received CHM alone and 6550 received one of 11 treatment combinations. A high risk of bias was observed across all domains: 21% had a high risk for selection bias (11% from sequence generation and 30% from allocation concealment), 85% for performance bias, 89% for detection bias, 4% for attrition bias and 40% for reporting bias. In multivariable analysis, fewer authors were associated with selection bias (allocation concealment), performance bias and attrition bias, and earlier year of publication and funding source not reported or disclosed were associated with selection bias (sequence generation). Studies published in non-English language were associated with reporting bias. Poor adherence to recommended reporting standards (<60% of the studies not providing sufficient information) was observed in 11 of the 23 sections evaluated. Study quality and data extraction were performed by one reviewer and cross-checked by a second reviewer. Translation to English was performed by one reviewer in 85% of the included studies. Studies evaluating CHM often fail to meet expected methodological criteria, and high-quality evidence is lacking. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  12. The Minnesota Grading System Using Fundus Autofluorescence of Eye Bank Eyes: A Correlation To Age-Related Macular Degeneration (An AOS Thesis)

    PubMed Central

    Olsen, Timothy W.

    2008-01-01

    Purpose To establish a grading system of eye bank eyes using fundus autofluorescence (FAF) and identify a methodology that correlates FAF to age-related macular degeneration (AMD) with clinical correlation to the Age-Related Eye Disease Study (AREDS). Methods Two hundred sixty-two eye bank eyes were evaluated using a standardized analysis of FAF. Measurements were taken with the confocal scanning laser ophthalmoscope (cSLO). First, high-resolution, digital, stereoscopic, color images were obtained and graded according to AREDS criteria. With the neurosensory retina removed, mean FAF values were obtained from cSLO images using software analysis that excludes areas of atrophy and other artifact, generating an FAF value from a grading template. Age and AMD grade were compared to FAF values. An internal fluorescence reference standard was tested. Results Standardization of the cSLO machine demonstrated that reliable data could be acquired after a 1-hour warm-up. Images obtained prior to 1 hour had falsely elevated levels of FAF. In this initial analysis, there was no statistical correlation of age to mean FAF. There was a statistically significant decrease in FAF from AREDS grade 1, 2 to 3, 4 (P < .0001). An internal fluorescent standard may serve as a quantitative reference. Conclusions The Minnesota Grading System (MGS) of FAF (MGS-FAF) establishes a standardized methodology for grading eye bank tissue to quantify FAF compounds in the retinal pigment epithelium and correlate these findings to the AREDS. Future studies could then correlate specific FAF to the aging process, histopathology AMD phenotypes, and other maculopathies, as well as to analyze the biochemistry of autofluorescent fluorophores. PMID:19277247

  13. The Minnesota Grading System using fundus autofluorescence of eye bank eyes: a correlation to age-related macular degeneration (an AOS thesis).

    PubMed

    Olsen, Timothy W

    2008-01-01

    To establish a grading system of eye bank eyes using fundus autofluorescence (FAF) and identify a methodology that correlates FAF to age-related macular degeneration (AMD) with clinical correlation to the Age-Related Eye Disease Study (AREDS). Two hundred sixty-two eye bank eyes were evaluated using a standardized analysis of FAF. Measurements were taken with the confocal scanning laser ophthalmoscope (cSLO). First, high-resolution, digital, stereoscopic, color images were obtained and graded according to AREDS criteria. With the neurosensory retina removed, mean FAF values were obtained from cSLO images using software analysis that excludes areas of atrophy and other artifact, generating an FAF value from a grading template. Age and AMD grade were compared to FAF values. An internal fluorescence reference standard was tested. Standardization of the cSLO machine demonstrated that reliable data could be acquired after a 1-hour warm-up. Images obtained prior to 1 hour had falsely elevated levels of FAF. In this initial analysis, there was no statistical correlation of age to mean FAF. There was a statistically significant decrease in FAF from AREDS grade 1, 2 to 3, 4 (P < .0001). An internal fluorescent standard may serve as a quantitative reference. The Minnesota Grading System (MGS) of FAF (MGS-FAF) establishes a standardized methodology for grading eye bank tissue to quantify FAF compounds in the retinal pigment epithelium and correlate these findings to the AREDS. Future studies could then correlate specific FAF to the aging process, histopathology AMD phenotypes, and other maculopathies, as well as to analyze the biochemistry of autofluorescent fluorophores.

  14. The long-term cost-effectiveness of varenicline (12-week standard course and 12 + 12-week extended course) vs. other smoking cessation strategies in Canada.

    PubMed

    von Wartburg, M; Raymond, V; Paradis, P E

    2014-05-01

    Smoking is the leading risk factor for preventable morbidity and mortality as a result of heart and lung diseases and various forms of cancer. Reimbursement coverage for smoking cessation therapies remains limited in Canada and the United States despite the health and economic benefits of smoking cessation. This study aimed to evaluate the long-term cost-effectiveness of varenicline compared with other smoking cessation interventions in Canada using the Benefits of Smoking Cessation on Outcomes (BENESCO) model. Efficacy rates of the standard course (12 weeks) varenicline, extended course (12 + 12 weeks) varenicline, bupropion, nicotine replacement therapy and unaided intervention were derived based on a published mixed treatment comparison methodology and analysed within a Markov cohort model to estimate their cost-effectiveness over the lifetime cycle. Study cohort, smoking rates and prevalence, incidence and mortality of smoking-related diseases were calibrated to represent the Canadian population. Over the subjects' lifetime, both the standard and the extended course of varenicline are shown to dominate (e.g. less costly and more effective) all other alternative smoking cessation interventions considered. Compared with the standard varenicline treatment course, the extended course is highly cost-effective with an incremental cost-effectiveness ratio (ICER) less than $4000 per quality-adjusted life year. Including indirect cost and benefits of smoking cessation interventions further strengthens the result with the extended course of varenicline dominating all other alternatives considered. Evidence from complex smoking cessation models requiring numerous inputs and assumptions should be assessed in conjunction with evidence from other methodologies. The standard and extended courses of varenicline are decidedly cost-effective treatment regimes compared with alternative smoking cessation interventions and can provide significant cost savings to the healthcare system. © 2014 John Wiley & Sons Ltd.
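
    The comparison above turns on an incremental cost-effectiveness ratio (ICER). A hedged sketch of that calculation follows; the costs and QALYs are hypothetical placeholders, not values from the study.

```python
# Sketch of the incremental cost-effectiveness ratio (ICER) comparison
# described above; inputs are hypothetical, not the published model outputs.

def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost per quality-adjusted life year (QALY) gained."""
    delta_cost = cost_new - cost_ref
    delta_qaly = qaly_new - qaly_ref
    if delta_cost <= 0 and delta_qaly >= 0:
        return "dominant (less costly, more effective)"
    return delta_cost / delta_qaly

# Extended (12 + 12 weeks) vs. standard (12 weeks) course, hypothetical numbers
print(icer(cost_new=1200.0, qaly_new=14.05, cost_ref=900.0, qaly_ref=13.95))
```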

  15. Psychometric evaluation of commonly used game-specific skills tests in rugby: A systematic review

    PubMed Central

    Oorschot, Sander; Chiwaridzo, Matthew; CM Smits-Engelsman, Bouwien

    2017-01-01

    Objectives To (1) give an overview of commonly used game-specific skills tests in rugby and (2) evaluate available psychometric information of these tests. Methods The databases PubMed, MEDLINE, CINAHL and Africa Wide Information were systematically searched for articles published between January 1995 and March 2017. First, commonly used game-specific skills tests were identified. Second, the available psychometrics of these tests were evaluated and the methodological quality of the studies assessed using the Consensus-based Standards for the selection of health Measurement Instruments checklist. Studies included in the first step had to report detailed information on the construct and testing procedure of at least one game-specific skill, and studies included in the second step had additionally to report at least one psychometric property evaluating reliability, validity or responsiveness. Results 287 articles were identified in the first step, of which 30 met the inclusion criteria; 64 articles were identified in the second step, of which 10 were included. Reactive agility, tackling and simulated rugby games were the most commonly used tests. All 10 studies reporting psychometrics reported reliability outcomes, revealing mainly strong evidence. However, all studies scored poor or fair on methodological quality. Four studies reported validity outcomes in which mainly moderate evidence was indicated, but all articles had fair methodological quality. Conclusion Game-specific skills tests indicated mainly high reliability and validity evidence, but the studies lacked methodological quality. Reactive agility seems to be a promising domain, but the specific tests need further development. Future high methodological quality studies are required in order to develop valid and reliable test batteries for rugby talent identification. Trial registration number PROSPERO CRD42015029747. PMID:29259812

  16. 76 FR 65504 - Proposed Agency Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-21

    ..., including the validity of the methodology and assumptions used; (c) ways to enhance the quality, utility... Reliability Standard, FAC-008-3--Facility Ratings, developed by the North American Electric Reliability... Reliability Standard FAC-008-3 is pending before the Commission. The proposed Reliability Standard modifies...

  17. [Methods for evaluating diagnostic tests in Enfermedades Infecciosas y Microbiología Clínica].

    PubMed

    Ramos, J M; Hernández, I

    1998-04-01

    In the field of infectious diseases and clinical microbiology, the evaluation of diagnostic tests (DTs) is an important research area. The specific difficulties of this type of research mean that it has not reached the methodological rigour of other areas of clinical research. This article assesses and characterizes the methodology of articles about DTs published in the journal Enfermedades Infecciosas y Microbiología Clínica (EIMC). Forty-five articles published in EIMC during the 1990-1996 period that determined the sensitivity and specificity of different DTs were selected and appraised against widely accepted methodological standards. In all articles except one (98%), the gold standard used was specified; however, 4 studies (9%) included the DT in the gold standard (incorporation bias). A correct description of the DT was reported in 75% of cases, but the reproducibility of the test was evaluated in only 11%. The source of the reference population, the inclusion criteria, and the spectrum composition were described in 58%, 33% and 40% of articles, respectively. Workup bias was present in 33% of studies, only 6% reported blind analysis of results, and 11% reported indeterminate test results. Half of the studies reported test indexes for clinical subgroups, only one article (2%) provided numerical precision for test indexes, and only 7% reported receiver operating characteristic curves. The methodological quality of DT research in EIMC could improve in several aspects of design and presentation of results.
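
    For readers unfamiliar with the test indexes and the "numerical precision" criterion mentioned above, here is a minimal, hypothetical sketch of the core calculations from a 2x2 table, including a simple confidence interval; the counts are illustrative only.

```python
# Minimal sketch of standard diagnostic-test indexes computed from a 2x2 table,
# plus a Wald confidence interval as one way to report numerical precision.
# Counts are hypothetical.
import math

def diagnostic_indexes(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)   # positive predictive value
    npv = tn / (tn + fn)   # negative predictive value
    return sensitivity, specificity, ppv, npv

def wald_ci(p, n, z=1.96):
    """Large-sample 95% confidence interval for a proportion."""
    se = math.sqrt(p * (1 - p) / n)
    return (max(0.0, p - z * se), min(1.0, p + z * se))

sens, spec, ppv, npv = diagnostic_indexes(tp=45, fp=5, fn=10, tn=90)
print(sens, wald_ci(sens, n=45 + 10))
```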

  18. Protocol - realist and meta-narrative evidence synthesis: Evolving Standards (RAMESES)

    PubMed Central

    2011-01-01

    Background There is growing interest in theory-driven, qualitative and mixed-method approaches to systematic review as an alternative to (or to extend and supplement) conventional Cochrane-style reviews. These approaches offer the potential to expand the knowledge base in policy-relevant areas - for example by explaining the success, failure or mixed fortunes of complex interventions. However, the quality of such reviews can be difficult to assess. This study aims to produce methodological guidance, publication standards and training resources for those seeking to use the realist and/or meta-narrative approach to systematic review. Methods/design We will: [a] collate and summarise existing literature on the principles of good practice in realist and meta-narrative systematic review; [b] consider the extent to which these principles have been followed by published and in-progress reviews, thereby identifying how rigour may be lost and how existing methods could be improved; [c] using an online Delphi method with an interdisciplinary panel of experts from academia and policy, produce a draft set of methodological steps and publication standards; [d] produce training materials with learning outcomes linked to these steps; [e] pilot these standards and training materials prospectively on real reviews-in-progress, capturing methodological and other challenges as they arise; [f] synthesise expert input, evidence review and real-time problem analysis into more definitive guidance and standards; [g] disseminate outputs to audiences in academia and policy. The outputs of the study will be threefold: 1. Quality standards and methodological guidance for realist and meta-narrative reviews for use by researchers, research sponsors, students and supervisors 2. A 'RAMESES' (Realist and Meta-review Evidence Synthesis: Evolving Standards) statement (comparable to CONSORT or PRISMA) of publication standards for such reviews, published in an open-access academic journal. 3. A training module for researchers, including learning outcomes, outline course materials and assessment criteria. Discussion Realist and meta-narrative review are relatively new approaches to systematic review whose overall place in the secondary research toolkit is not yet fully established. As with all secondary research methods, guidance on quality assurance and uniform reporting is an important step towards improving quality and consistency of studies. PMID:21843376

  19. Methodology for Knowledge Synthesis of the Management of Vaccination Pain and Needle Fear.

    PubMed

    Taddio, Anna; McMurtry, C Meghan; Shah, Vibhuti; Yoon, Eugene W; Uleryk, Elizabeth; Pillai Riddell, Rebecca; Lang, Eddy; Chambers, Christine T; Noel, Melanie; MacDonald, Noni E

    2015-10-01

    A knowledge synthesis was undertaken to inform the development of a revised and expanded clinical practice guideline about managing vaccination pain in children to include the management of pain across the lifespan and the management of fear in individuals with high levels of needle fear. This manuscript describes the methodological details of the knowledge synthesis and presents the list of included clinical questions, critical and important outcomes, search strategy, and search strategy results. The Grading of Assessments, Recommendations, Development and Evaluation (GRADE) and Cochrane methodologies provided the general framework. The project team voted on clinical questions for inclusion and critically important and important outcomes. A broad search strategy was used to identify relevant randomized-controlled trials and quasi-randomized-controlled trials. Quality of research evidence was assessed using the Cochrane risk of bias tool and quality across studies was assessed using GRADE. Multiple measures of the same construct within studies (eg, observer-rated and parent-rated infant distress) were combined before pooling. The standardized mean difference and 95% confidence intervals (CI) or relative risk and 95% CI was used to express the effects of an intervention. Altogether, 55 clinical questions were selected for inclusion in the knowledge synthesis; 49 pertained to pain management during vaccine injections and 6 pertained to fear management in individuals with high levels of needle fear. Pain, fear, and distress were typically prioritized as critically important outcomes across clinical questions. The search strategy identified 136 relevant studies. This manuscript describes the methodological details of a knowledge synthesis about pain management during vaccination and fear management in individuals with high levels of needle fear. Subsequent manuscripts in this series will present the results for the included questions.
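
    As a concrete illustration of the effect measure named above, the following hedged sketch computes a standardized mean difference (Cohen's d with its usual large-sample standard error) and 95% CI; the group summaries are hypothetical, not data from the synthesis.

```python
# Hedged sketch of the standardized mean difference (Cohen's d) and 95% CI used
# to express intervention effects for continuous outcomes. Inputs are hypothetical.
import math

def smd_with_ci(mean_t, sd_t, n_t, mean_c, sd_c, n_c, z=1.96):
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / pooled_sd
    # Large-sample standard error of d
    se = math.sqrt((n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c)))
    return d, (d - z * se, d + z * se)

# Hypothetical pain scores (lower is better) for intervention vs. control
print(smd_with_ci(mean_t=3.1, sd_t=1.2, n_t=40, mean_c=4.0, sd_c=1.1, n_c=42))
```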

  20. Stochastic approach for radionuclides quantification

    NASA Astrophysics Data System (ADS)

    Clement, A.; Saurel, N.; Perrin, G.

    2018-01-01

    Gamma spectrometry is a passive, non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods that quantify the activity of nuclear materials through empirical calibration against a standard, by determining a calibration coefficient, are of no use for non-reproducible, complex, one-of-a-kind nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another and give rise to a high variability of objects. The current quantification process relies on numerical modelling of the measured scene from the few available data, such as geometry and composition: density, material, screens, geometric shape, matrix composition, and matrix and source distribution. Some of these inputs depend strongly on knowledge of the package and on the operator's background. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment and without knowledge of the internal package configuration. The method combines a global stochastic approach that uses, among other tools, surrogate models of the gamma attenuation behaviour; a Bayesian approach that considers conditional probability densities of the problem inputs; and Markov chain Monte Carlo (MCMC) algorithms that solve the inverse problem, using the gamma-ray emission spectrum of the radionuclides and the outside dimensions of the objects of interest. The methodology is being tested by quantifying actinide activity in standard sources with different matrices, compositions, and configurations, in terms of actinide masses, locations, and distributions. Activity uncertainties are taken into account by this adjustment methodology.
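
    To illustrate the Bayesian/MCMC ingredient in schematic form, here is a toy Metropolis-Hastings sampler for a single "activity" parameter under a simplified Gaussian likelihood from a stand-in surrogate model. Everything in it (model, prior, data) is a hypothetical placeholder, not the CEA methodology.

```python
# Toy Metropolis-Hastings sketch of the MCMC step described above. The surrogate
# response, noise level, and measured rate are invented for illustration.
import math, random

measured_rate = 120.0   # counts/s, hypothetical
sigma = 10.0            # measurement noise, hypothetical

def surrogate_response(activity):
    # Stand-in for a surrogate gamma-attenuation model: simple linear response.
    return 0.8 * activity

def log_posterior(activity):
    if activity <= 0:   # flat prior on positive activities
        return -math.inf
    resid = measured_rate - surrogate_response(activity)
    return -0.5 * (resid / sigma) ** 2

def metropolis(n_steps=5000, step=5.0, start=100.0):
    x, lp = start, log_posterior(start)
    samples = []
    for _ in range(n_steps):
        prop = x + random.gauss(0.0, step)
        lp_prop = log_posterior(prop)
        # Accept with probability min(1, exp(lp_prop - lp))
        if random.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

samples = metropolis()
print(sum(samples[1000:]) / len(samples[1000:]))   # posterior mean after burn-in
```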

  1. Online dynamical downscaling of temperature and precipitation within the iLOVECLIM model (version 1.1)

    NASA Astrophysics Data System (ADS)

    Quiquet, Aurélien; Roche, Didier M.; Dumas, Christophe; Paillard, Didier

    2018-02-01

    This paper presents the inclusion of an online dynamical downscaling of temperature and precipitation within the model of intermediate complexity iLOVECLIM v1.1. We describe the following methodology to generate temperature and precipitation fields on a 40 km × 40 km Cartesian grid of the Northern Hemisphere from the T21 native atmospheric model grid. Our scheme is not grid specific and conserves energy and moisture in the same way as the original climate model. We show that we are able to generate a high-resolution field which presents a spatial variability in better agreement with the observations compared to the standard model. Although the large-scale model biases are not corrected, for selected model parameters, the downscaling can induce a better overall performance compared to the standard version on both the high-resolution grid and on the native grid. Foreseen applications of this new model feature include the improvement of ice sheet model coupling and high-resolution land surface models.
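
    The conservation property mentioned above can be illustrated with a small sketch: a coarse-grid amount is redistributed over its fine-grid sub-cells with weights that sum to one, so the coarse-cell total is unchanged. The weights here are hypothetical (a real scheme might derive them from sub-grid elevation, for example).

```python
# Minimal sketch of conservative redistribution of a coarse-grid precipitation
# amount onto fine-grid sub-cells. Weights are hypothetical placeholders.

def downscale_conservative(coarse_value, weights):
    """Distribute coarse_value over sub-cells in proportion to weights."""
    total = sum(weights)
    if total <= 0:
        raise ValueError("weights must have a positive sum")
    fine = [coarse_value * w / total for w in weights]
    assert abs(sum(fine) - coarse_value) < 1e-9   # moisture is conserved
    return fine

print(downscale_conservative(4.0, weights=[0.5, 1.0, 1.5, 1.0]))
```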

  2. Robust ridge regression estimators for nonlinear models with applications to high throughput screening assay data.

    PubMed

    Lim, Changwon

    2015-03-30

    Nonlinear regression is often used to evaluate the toxicity of a chemical or a drug by fitting data from a dose-response study. Toxicologists and pharmacologists may draw a conclusion about whether a chemical is toxic by testing the significance of the estimated parameters. However, sometimes the null hypothesis cannot be rejected even though the fit is quite good. One possible reason for such cases is that the estimated standard errors of the parameter estimates are extremely large. In this paper, we propose robust ridge regression estimation procedures for nonlinear models to solve this problem. The asymptotic properties of the proposed estimators are investigated; in particular, their mean squared errors are derived. The performances of the proposed estimators are compared with several standard estimators using simulation studies. The proposed methodology is also illustrated using high throughput screening assay data obtained from the National Toxicology Program. Copyright © 2014 John Wiley & Sons, Ltd.
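
    As a rough illustration of the idea (not the authors' estimator), the sketch below fits a nonlinear dose-response model with a robust (Huber) loss plus a ridge penalty, showing how the penalty can stabilize otherwise wildly varying parameter estimates. The model form, data, and tuning constants are hypothetical.

```python
# Hedged sketch: ridge-penalized, Huber-loss fit of a simple dose-response model.
# Data and constants are hypothetical; this is not the proposed estimator itself.
import numpy as np
from scipy.optimize import minimize

dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
resp = np.array([0.02, 0.10, 0.35, 0.70, 0.95, 0.99])   # hypothetical responses

def hill(theta, d):
    emax, ed50 = theta
    return emax * d / (ed50 + d)

def huber(r, delta=0.1):
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

def objective(theta, lam=0.01):
    r = resp - hill(theta, dose)
    return huber(r).sum() + lam * np.sum(theta**2)   # robust loss + ridge penalty

fit = minimize(objective, x0=np.array([1.0, 1.0]), method="Nelder-Mead")
print(fit.x)   # shrunken, robust estimates of (emax, ed50)
```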

  3. How to Select a Questionnaire with a Good Methodological Quality?

    PubMed

    Paiva, Saul Martins; Perazzo, Matheus de França; Ortiz, Fernanda Ruffo; Pordeus, Isabela Almeida; Martins-Júnior, Paulo Antônio

    2018-01-01

    In recent decades, several instruments have been used to evaluate the impact of oral health problems on the oral health-related quality of life (OHRQoL) of individuals. However, some instruments lack thorough methodological validation or present conceptual differences that hinder comparisons between instruments. Thus, it can be difficult for clinicians and researchers to select a questionnaire that accurately reflects what is really meaningful to individuals. This short communication discusses the importance of using an appropriate checklist to select an instrument of good methodological quality. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist was developed to provide tools for evidence-based instrument selection. The COSMIN checklist comprises ten boxes that evaluate whether a study meets the standard for good methodological quality, plus two additional boxes covering studies that use the Item Response Theory method and general requirements for generalization of results, organized into four steps to be followed. Consequently, at least some expertise in psychometrics or clinimetrics is required to make full use of the checklist. Applications of COSMIN include standardizing cross-cultural adaptations, enabling safer comparisons between measurement studies, and evaluating the methodological quality of systematic reviews of measurement properties. It can also be used by students learning about measurement properties and by editors and reviewers when assessing manuscripts on this topic. Wider adoption of the COSMIN checklist is therefore needed to improve the selection and evaluation of health measurement instruments.

  4. "Assessing the methodological quality of systematic reviews in radiation oncology: A systematic review".

    PubMed

    Hasan, Haroon; Muhammed, Taaha; Yu, Jennifer; Taguchi, Kelsi; Samargandi, Osama A; Howard, A Fuchsia; Lo, Andrea C; Olson, Robert; Goddard, Karen

    2017-10-01

    The objective of our study was to evaluate the methodological quality of systematic reviews and meta-analyses in Radiation Oncology. A systematic literature search was conducted for all eligible systematic reviews and meta-analyses in Radiation Oncology from 1966 to 2015. Methodological characteristics were abstracted from all works that satisfied the inclusion criteria, and quality was assessed using the critical appraisal tool AMSTAR. Regression analyses were performed to determine factors associated with a higher quality score. Following exclusion based on a priori criteria, 410 studies (157 systematic reviews and 253 meta-analyses) satisfied the inclusion criteria. Meta-analyses were found to be of fair to good quality, while systematic reviews were found to be of less than fair quality. Factors associated with higher quality scores in the multivariable analysis were the inclusion of randomized controlled trials as primary studies, performance of a meta-analysis, and application of a recommended guideline for establishing a systematic review protocol and/or reporting. Based on AMSTAR, systematic reviews and meta-analyses may carry a high risk of bias if used to inform decision-making. We recommend that decision-makers in Radiation Oncology scrutinize the methodological quality of systematic reviews and meta-analyses prior to assessing their utility to inform evidence-based medicine and that researchers adhere to methodological standards outlined in validated guidelines when embarking on a systematic review. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. 42 CFR 493.2001 - Establishment and function of the Clinical Laboratory Improvement Advisory Committee.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... laboratory testing or methodology, and others as approved by HHS. (c) HHS will designate specialized...: (1) Criteria for categorizing nonwaived testing; (2) Determination of waived tests; (3) Personnel standards; (4) Facility administration and quality systems standards. (5) Proficiency testing standards; (6...

  6. 42 CFR 493.2001 - Establishment and function of the Clinical Laboratory Improvement Advisory Committee.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... laboratory testing or methodology, and others as approved by HHS. (c) HHS will designate specialized...: (1) Criteria for categorizing nonwaived testing; (2) Determination of waived tests; (3) Personnel standards; (4) Facility administration and quality systems standards. (5) Proficiency testing standards; (6...

  7. 42 CFR 493.2001 - Establishment and function of the Clinical Laboratory Improvement Advisory Committee.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... laboratory testing or methodology, and others as approved by HHS. (c) HHS will designate specialized...: (1) Criteria for categorizing nonwaived testing; (2) Determination of waived tests; (3) Personnel standards; (4) Facility administration and quality systems standards. (5) Proficiency testing standards; (6...

  8. Updated Design Standards and Guidance from the What Works Clearinghouse: Regression Discontinuity Designs and Cluster Designs

    ERIC Educational Resources Information Center

    Cole, Russell; Deke, John; Seftor, Neil

    2016-01-01

    The What Works Clearinghouse (WWC) maintains design standards to identify rigorous, internally valid education research. As education researchers advance new methodologies, the WWC must revise its standards to include an assessment of the new designs. Recently, the WWC has revised standards for two emerging study designs: regression discontinuity…

  9. The Empirical and Moral Foundations of the ISLLC Standards

    ERIC Educational Resources Information Center

    Murphy, Joseph

    2015-01-01

    Purpose: The purpose of this paper is to unpack the foundations for the national standards for school leaders in the USA. The author examines some of the background of the Standards from 1996 to 2015. The author explores the two foundations on which the ISLLC Standards rest, academic press and supportive community. Design/methodology/approach:…

  10. Moving to the Next Generation of Standards for Science: Building on Recent Practices. CRESST Report 762

    ERIC Educational Resources Information Center

    Herman, Joan L.

    2009-01-01

    In this report, Joan Herman, director of the National Center for Research on Evaluation, Standards, & Student Testing (CRESST), recommends that the new generation of science standards be based on lessons learned from current practice and on recent examples of standards-development methodology. In support of this, recent, promising efforts to…

  11. Cryogenic insulation standard data and methodologies

    NASA Astrophysics Data System (ADS)

    Demko, J. A.; Fesmire, J. E.; Johnson, W. L.; Swanger, A. M.

    2014-01-01

    Although some standards exist for thermal insulation, few address the sub-ambient temperature range and cold-side temperatures below 100 K. Standards for cryogenic insulation systems require cryostat testing and data analysis that will allow the development of the tools needed by design engineers and thermal analysts for the design of practical cryogenic systems. Thus, this critically important information can provide reliable data and methodologies for industrial efficiency and energy conservation. Two Task Groups have been established in the area of cryogenic insulation systems under ASTM International's Committee C16 on Thermal Insulation. These are WK29609 - New Standard for Thermal Performance Testing of Cryogenic Insulation Systems and WK29608 - Standard Practice for Multilayer Insulation in Cryogenic Service. The Cryogenics Test Laboratory of NASA Kennedy Space Center and the Thermal Energy Laboratory of LeTourneau University are conducting an Inter-Laboratory Study (ILS) of selected insulation materials. Each lab carries out the measurements of thermal properties of these materials using identical flat-plate boil-off calorimeter instruments. Parallel testing will provide the comparisons necessary to validate the measurements and methodologies. Here we discuss test methods, some initial data in relation to the experimental approach, and the manner of reporting the thermal performance data. This initial study of insulation materials for sub-ambient temperature applications is aimed at paving the way for further ILS comparative efforts that will produce standard data sets for several commercial materials. Discrepancies found between measurements will be used to improve the testing and data reduction techniques being developed as part of the future ASTM International standards.

  12. Assessment of the Impacts of Standards and Labeling Programs inMexico (four products).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanchez, Itha; Pulido, Henry; McNeil, Michael A.

    2007-06-12

    This study analyzes impacts from energy efficiency standards and labeling in Mexico from 1994 through 2005 for four major products: household refrigerators, room air conditioners, three-phase (squirrel cage) induction motors, and clothes washers. It is a retrospective analysis, seeking to assess verified impacts on product efficiency in the Mexican market in the first ten years after standards were implemented. Such an analysis allows the Mexican government to compare actual to originally forecast program benefits. In addition, it provides an extremely valuable benchmark for other countries considering standards, and to the energy policy community as a whole. The methodology for evaluation begins with historical test data taken for a large number of models of each product type between 1994 and 2005. The pre-standard efficiency of models in 1994 is taken as a baseline throughout the analysis. Model efficiency data were provided by an independent certification laboratory (ANCE), which tested products as part of the certification and enforcement mechanism defined by the standards program. Using this data, together with economic and market data provided by both government and private sector sources, the analysis considers several types of national level program impacts. These include: Energy savings; Environmental (emissions) impacts, and Net financial impacts to consumers, manufacturers and utilities. Energy savings impacts are calculated using the same methodology as the original projections, allowing a comparison. Other impacts are calculated using a robust and sophisticated methodology developed by the Instituto de Investigaciones Electricas (IIE) and Lawrence Berkeley National Laboratory (LBNL), in a collaboration supported by the Collaborative Labeling and Standards Program (CLASP).

  13. Bridging the gap in complementary and alternative medicine research: manualization as a means of promoting standardization and flexibility of treatment in clinical trials of acupuncture.

    PubMed

    Schnyer, Rosa N; Allen, John J B

    2002-10-01

    An important methodological challenge encountered in acupuncture clinical research involves the design of treatment protocols that help ensure standardization and replicability while allowing for the necessary flexibility to tailor treatments to each individual. Manualization of protocols used in clinical trials of acupuncture and other traditionally-based complementary and alternative medicine (CAM) systems facilitates the systematic delivery of replicable and standardized, yet individually-tailored treatments. To facilitate high-quality CAM acupuncture research by outlining a method for the systematic design and implementation of protocols used in CAM clinical trials based on the concept of treatment manualization. A series of treatment manuals was developed to systematically articulate the Chinese medical theoretical and clinical framework for a given Western-defined illness, to increase the quality and consistency of treatment, and to standardize the technical aspects of the protocol. In all, three manuals were developed for National Institutes of Health (NIH)-funded clinical trials of acupuncture for depression, spasticity in cerebral palsy, and repetitive stress injury. In Part I, the rationale underlying these manuals and the challenges encountered in creating them are discussed, and qualitative assessments of their utility are provided. In Part II, a methodology to develop treatment manuals for use in clinical trials is detailed, and examples are given. A treatment manual provides a precise way to train and supervise practitioners, enable evaluation of conformity and competence, facilitate the training process, and increase the ability to identify the active therapeutic ingredients in clinical trials of acupuncture.

  14. An international marine-atmospheric {sup 222}Rn measurement intercomparison in Bermuda. Part 1: NIST calibration and methodology for standardized sample additions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colle, R.; Unterweger, M.P.; Hodge, P.A.

    1996-01-01

    As part of an international 222Rn measurement intercomparison conducted at Bermuda in October 1991, NIST provided standardized sample additions of known, but undisclosed (blind) 222Rn concentrations that could be related to US national standards. The standardized sample additions were obtained with a calibrated 226Ra source and a specially-designed manifold used to obtain well-known dilution factors from simultaneous flow-rate measurements. The additions were introduced over sampling periods of several hours (typically 4 h) into a common streamline on a sampling tower used by the participating laboratories for their measurements. The standardized 222Rn activity concentrations for the intercomparison ranged from approximately 2.5 Bq · m−3 to 35 Bq · m−3 (of which the lower end of this range approached concentration levels for ambient Bermudian air) and had overall uncertainties, approximating a 3 standard deviation uncertainty interval, of about 6% to 13%. This paper describes the calibration and methodology for the standardized sample additions.

  15. An International Marine-Atmospheric 222Rn Measurement Intercomparison in Bermuda Part I: NIST Calibration and Methodology for Standardized Sample Additions

    PubMed Central

    Collé, R.; Unterweger, M. P.; Hodge, P. A.; Hutchinson, J. M. R.

    1996-01-01

    As part of an international 222Rn measurement intercomparison conducted at Bermuda in October 1991, NIST provided standardized sample additions of known, but undisclosed (“blind”) 222Rn concentrations that could be related to U.S. national standards. The standardized sample additions were obtained with a calibrated 226Ra source and a specially-designed manifold used to obtain well-known dilution factors from simultaneous flow-rate measurements. The additions were introduced over sampling periods of several hours (typically 4 h) into a common streamline on a sampling tower used by the participating laboratories for their measurements. The standardized 222Rn activity concentrations for the intercomparison ranged from approximately 2.5 Bq · m−3 to 35 Bq · m−3 (of which the lower end of this range approached concentration levels for ambient Bermudian air) and had overall uncertainties, approximating a 3 standard deviation uncertainty interval, of about 6 % to 13 %. This paper describes the calibration and methodology for the standardized sample additions. PMID:27805090
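
    A back-of-envelope sketch of the dilution arithmetic behind such a standardized addition: a known radon activity delivery rate is mixed into the sampled air stream, so the added concentration is the delivery rate divided by the total volumetric flow. The numbers below are illustrative, not the NIST values.

```python
# Sketch of the standardized-addition dilution arithmetic; inputs are hypothetical.

def added_concentration(delivery_rate_bq_per_s, total_flow_m3_per_s):
    """Added 222Rn concentration (Bq/m3) from simultaneous flow-rate measurements."""
    return delivery_rate_bq_per_s / total_flow_m3_per_s

# e.g. 0.05 Bq/s injected into a 2.0e-3 m3/s stream -> 25 Bq/m3
print(added_concentration(0.05, 2.0e-3))
```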

  16. Collaborative Initiative on Fetal Alcohol Spectrum Disorders: Methodology of Clinical Projects

    PubMed Central

    Mattson, Sarah N.; Foroud, Tatiana; Sowell, Elizabeth R.; Jones, Kenneth Lyons; Coles, Claire D.; Fagerlund, Åse; Autti-Rämö, Ilona; May, Philip A.; Adnams, Colleen M.; Konovalova, Valentina; Wetherill, Leah; Arenson, Andrew D.; Barnett, William K.; Riley, Edward P.

    2009-01-01

    The Collaborative Initiative on Fetal Alcohol Spectrum Disorders (CIFASD) was created in 2003 to further understanding of fetal alcohol spectrum disorders. Clinical and basic science projects collect data across multiple sites using standardized methodology. This paper describes the methodology being used by the clinical projects that pertain to assessment of children and adolescents. Domains being addressed are dysmorphology, neurobehavior, 3D facial imaging, and brain imaging. PMID:20036488

  17. Systematic reviews identify important methodological flaws in stroke rehabilitation therapy primary studies: review of reviews.

    PubMed

    Santaguida, Pasqualina; Oremus, Mark; Walker, Kathryn; Wishart, Laurie R; Siegel, Karen Lohmann; Raina, Parminder

    2012-04-01

    A "review of reviews" was undertaken to assess methodological issues in studies evaluating nondrug rehabilitation interventions in stroke patients. MEDLINE, CINAHL, PsycINFO, and the Cochrane Database of Systematic Reviews were searched from January 2000 to January 2008 within the stroke rehabilitation setting. Electronic searches were supplemented by reviews of reference lists and citations identified by experts. Eligible studies were systematic reviews; excluded citations were narrative reviews or reviews of reviews. Review characteristics and criteria for assessing methodological quality of primary studies within them were extracted. The search yielded 949 English-language citations. We included a final set of 38 systematic reviews. Cochrane reviews, which have a standardized methodology, were generally of higher methodological quality than non-Cochrane reviews. Most systematic reviews used standardized quality assessment criteria for primary studies, but not all were comprehensive. Reviews showed that primary studies had problems with randomization, allocation concealment, and blinding. Baseline comparability, adverse events, and co-intervention or contamination were not consistently assessed. Blinding of patients and providers was often not feasible and was not evaluated as a source of bias. The eligible systematic reviews identified important methodological flaws in the evaluated primary studies, suggesting the need for improvement of research methods and reporting. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. Toward the First Data Acquisition Standard in Synthetic Biology.

    PubMed

    Sainz de Murieta, Iñaki; Bultelle, Matthieu; Kitney, Richard I

    2016-08-19

    This paper describes the development of a new data acquisition standard for synthetic biology. This comprises the creation of a methodology that is designed to capture all the data, metadata, and protocol information associated with biopart characterization experiments. The new standard, called DICOM-SB, is based on the highly successful Digital Imaging and Communications in Medicine (DICOM) standard in medicine. A data model is described which has been specifically developed for synthetic biology. The model is a modular, extensible data model for the experimental process, which can optimize data storage for large amounts of data. DICOM-SB also includes services orientated toward the automatic exchange of data and information between modalities and repositories. DICOM-SB has been developed in the context of systematic design in synthetic biology, which is based on the engineering principles of modularity, standardization, and characterization. The systematic design approach utilizes the design, build, test, and learn design cycle paradigm. DICOM-SB has been designed to be compatible with and complementary to other standards in synthetic biology, including SBOL. In this regard, the software provides effective interoperability. The new standard has been tested by experiments and data exchange between Nanyang Technological University in Singapore and Imperial College London.

  19. Improving sexuality education: the development of teacher-preparation standards.

    PubMed

    Barr, Elissa M; Goldfarb, Eva S; Russell, Susan; Seabert, Denise; Wallen, Michele; Wilson, Kelly L

    2014-06-01

    Teaching sexuality education to support young people's sexual development and overall sexual health is both needed and supported. Data continue to highlight the high rates of teen pregnancy, sexually transmitted disease, including human immunodeficiency virus (HIV) infections, among young people in the United States as well as the overwhelming public support for sexuality education instruction. In support of the implementation of the National Sexuality Education Standards, the current effort focuses on better preparing teachers to deliver sexuality education. An expert panel was convened by the Future of Sex Education Initiative to develop teacher-preparation standards for sexuality education. Their task was to develop standards and indicators that addressed the unique elements intrinsic to sexuality education instruction. Seven standards and associated indicators were developed that address professional disposition, diversity and equity, content knowledge, legal and professional ethics, planning, implementation, and assessment. The National Teacher-Preparation Standards for Sexuality Education represent an unprecedented unified effort to enable prospective health education teachers to become competent in teaching methodology, theory, practice of pedagogy, content, and skills, specific to sexuality education. Higher education will play a key role in ensuring the success of these standards. © 2014, American School Health Association.

  20. [HL7 standard--features, principles, and methodology].

    PubMed

    Koncar, Miroslav

    2005-01-01

    The mission of the HL7 Inc. non-profit organization is to provide standards for the exchange, management and integration of data that support clinical patient care, and the management, delivery and evaluation of healthcare services. As the standards developed by HL7 Inc. represent the world's most influential standardization efforts in the field of medical informatics, the HL7 family of standards has been recognized by the technical and scientific community as the foundation for the next generation of healthcare information systems. Versions 1 and 2 of the HL7 standard have solved many issues, but also demonstrated the size and complexity of the health information sharing problem. As the solution, a completely new methodology has been adopted, encompassed in the HL7 Version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all derived domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project in the Republic of Croatia in 2002, the decision was to go directly to Version 3. The target scope of work includes clinical, financial and administrative data management in the domain of healthcare processes. By using the HL7v3 standardized methodology we were able to completely map the Croatian primary healthcare domain to HL7v3 artefacts. Further refinement processes planned for the future will provide semantic interoperability and a detailed description of all elements in HL7 messages. Our HL7 Business Component is continuously studying different legacy applications, building a solid foundation for their integration into an HL7-enabled communication environment.

  1. Energy efficiency analysis and implementation of AES on an FPGA

    NASA Astrophysics Data System (ADS)

    Kenney, David

    The Advanced Encryption Standard (AES) was developed by Joan Daemen and Vincent Rjimen and endorsed by the National Institute of Standards and Technology in 2001. It was designed to replace the aging Data Encryption Standard (DES) and be useful for a wide range of applications with varying throughput, area, power dissipation and energy consumption requirements. Field Programmable Gate Arrays (FPGAs) are flexible and reconfigurable integrated circuits that are useful for many different applications including the implementation of AES. Though they are highly flexible, FPGAs are often less efficient than Application Specific Integrated Circuits (ASICs); they tend to operate slower, take up more space and dissipate more power. There have been many FPGA AES implementations that focus on obtaining high throughput or low area usage, but very little research done in the area of low power or energy efficient FPGA based AES; in fact, it is rare for estimates on power dissipation to be made at all. This thesis presents a methodology to evaluate the energy efficiency of FPGA based AES designs and proposes a novel FPGA AES implementation which is highly flexible and energy efficient. The proposed methodology is implemented as part of a novel scripting tool, the AES Energy Analyzer, which is able to fully characterize the power dissipation and energy efficiency of FPGA based AES designs. Additionally, this thesis introduces a new FPGA power reduction technique called Opportunistic Combinational Operand Gating (OCOG) which is used in the proposed energy efficient implementation. The AES Energy Analyzer was able to estimate the power dissipation and energy efficiency of the proposed AES design during its most commonly performed operations. It was found that the proposed implementation consumes less energy per operation than any previous FPGA based AES implementations that included power estimations. Finally, the use of Opportunistic Combinational Operand Gating on an AES cipher was found to reduce its dynamic power consumption by up to 17% when compared to an identical design that did not employ the technique.
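
    The energy-efficiency figure of merit used in this kind of evaluation reduces to simple arithmetic: energy per encrypted block is power multiplied by the time per block, and throughput per watt gives an efficiency metric. The sketch below uses hypothetical design numbers, not results from the thesis or the AES Energy Analyzer tool.

```python
# Sketch of energy-per-block and throughput-per-watt metrics for an FPGA AES
# core. Power, clock, and cycle counts are hypothetical placeholders.

def energy_per_block_nj(dynamic_power_mw, clock_mhz, cycles_per_block):
    time_per_block_us = cycles_per_block / clock_mhz    # microseconds per block
    return dynamic_power_mw * time_per_block_us         # mW * us = nJ

def throughput_mbps(clock_mhz, cycles_per_block, block_bits=128):
    return block_bits * clock_mhz / cycles_per_block

power_mw, clock_mhz, cycles = 120.0, 100.0, 11           # hypothetical design point
print(energy_per_block_nj(power_mw, clock_mhz, cycles), "nJ per 128-bit block")
print(throughput_mbps(clock_mhz, cycles) / (power_mw / 1000.0), "Mbps per W")
```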

  2. Appendix B: Methodology. [2014 Teacher Prep Review

    ERIC Educational Resources Information Center

    Greenberg, Julie; Walsh, Kate; McKee, Arthur

    2014-01-01

    The "NCTQ Teacher Prep Review" evaluates the quality of programs that provide preservice preparation of public school teachers. This appendix describes the scope, methodology, timeline, staff, and standards involved in the production of "Teacher Prep Review 2014." Data collection, validation, and analysis for the report are…

  3. 12 CFR 252.155 - Methodologies and practices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... SYSTEM (CONTINUED) ENHANCED PRUDENTIAL STANDARDS (REGULATION YY) Company-Run Stress Test Requirements for....155 Methodologies and practices. (a) Potential impact on capital. In conducting a stress test under...) Losses, pre-provision net revenue, provision for loan and lease losses, and net income; and (2) The...

  4. 12 CFR 252.155 - Methodologies and practices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... SYSTEM (CONTINUED) ENHANCED PRUDENTIAL STANDARDS (REGULATION YY) Company-Run Stress Test Requirements for....155 Methodologies and practices. (a) Potential impact on capital. In conducting a stress test under...) Losses, pre-provision net revenue, provision for loan and lease losses, and net income; and (2) The...

  5. Weighted Ensemble Simulation: Review of Methodology, Applications, and Software

    PubMed Central

    Zuckerman, Daniel M.; Chong, Lillian T.

    2018-01-01

    The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling—the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes—protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation. PMID:28301772
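
    A toy sketch of the WE bookkeeping described above follows: walkers carry statistical weights, and at each resampling interval over-populated bins are merged and under-populated bins are split so that total weight is conserved. This is a schematic of the idea only, not the API of any WE software package.

```python
# Schematic weighted-ensemble split/merge within one bin; weight is conserved.
import random

def resample_bin(walkers, target_count):
    """walkers: list of (state, weight) pairs. Split/merge to target_count."""
    walkers = list(walkers)
    # Split: duplicate the heaviest walker, halving its weight.
    while len(walkers) < target_count:
        walkers.sort(key=lambda w: w[1], reverse=True)
        state, weight = walkers[0]
        walkers[0] = (state, weight / 2)
        walkers.append((state, weight / 2))
    # Merge: combine the two lightest walkers, keeping one state at random
    # in proportion to its weight.
    while len(walkers) > target_count:
        walkers.sort(key=lambda w: w[1])
        (s1, w1), (s2, w2) = walkers[0], walkers[1]
        keep = s1 if random.random() < w1 / (w1 + w2) else s2
        walkers = [(keep, w1 + w2)] + walkers[2:]
    return walkers

bin_walkers = [("a", 0.5), ("b", 0.25), ("c", 0.25)]
print(resample_bin(bin_walkers, target_count=2))
assert abs(sum(w for _, w in resample_bin(bin_walkers, 4)) - 1.0) < 1e-12
```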

  6. Methodological standards in single-case experimental design: Raising the bar.

    PubMed

    Ganz, Jennifer B; Ayres, Kevin M

    2018-04-12

    Single-case experimental designs (SCEDs), or small-n experimental research, are frequently implemented to assess approaches to improving outcomes for people with disabilities, particularly those with low-incidence disabilities, such as some developmental disabilities. SCED has become increasingly accepted as a research design. As this literature base is needed to determine which interventions are evidence-based practices, the acceptance of SCED has resulted in increased critiques with regard to methodological quality. Recent trends include recommendations from a number of expert scholars and institutions. The purpose of this article is to summarize the recent history of methodological quality considerations, synthesize the recommendations found in the SCED literature, and provide recommendations to researchers designing SCEDs with regard to essential and aspirational standards for methodological quality. Conclusions include imploring SCED researchers to increase the quality of their experiments, with particular consideration of the applied nature of SCED research to be published in Research in Developmental Disabilities and beyond. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Methodological Issues in Trials of Complementary and Alternative Medicine Interventions

    PubMed Central

    Sikorskii, Alla; Wyatt, Gwen; Victorson, David; Faulkner, Gwen; Rahbar, Mohammad Hossein

    2010-01-01

    Background Complementary and alternative medicine (CAM) use is widespread among cancer patients. Information on safety and efficacy of CAM therapies is needed for both patients and health care providers. Well-designed randomized clinical trials (RCTs) of CAM therapy interventions can inform both clinical research and practice. Objectives To review important issues that affect the design of RCTs for CAM interventions. Methods Using the methods component of the Consolidated Standards for Reporting Trials (CONSORT) as a guiding framework, and a National Cancer Institute-funded reflexology study as an exemplar, methodological issues related to participants, intervention, objectives, outcomes, sample size, randomization, blinding, and statistical methods were reviewed. Discussion Trials of CAM interventions designed and implemented according to appropriate methodological standards will facilitate the needed scientific rigor in CAM research. Interventions in CAM can be tested using proposed methodology, and the results of testing will inform nursing practice in providing safe and effective supportive care and improving the well-being of patients. PMID:19918155

  8. Weighted Ensemble Simulation: Review of Methodology, Applications, and Software.

    PubMed

    Zuckerman, Daniel M; Chong, Lillian T

    2017-05-22

    The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling-the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes-protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation.

  9. Different doses of prophylactic platelet transfusion for preventing bleeding in people with haematological disorders after myelosuppressive chemotherapy or stem cell transplantation

    PubMed Central

    Estcourt, Lise J; Stanworth, Simon; Doree, Carolyn; Trivella, Marialena; Hopewell, Sally; Blanco, Patricia; Murphy, Michael F

    2015-01-01

    Background Platelet transfusions are used in modern clinical practice to prevent and treat bleeding in people who are thrombocytopenic due to bone marrow failure. Although considerable advances have been made in platelet transfusion therapy in the last 40 years, some areas continue to provoke debate, especially concerning the use of prophylactic platelet transfusions for the prevention of thrombocytopenic bleeding. This is an update of a Cochrane review first published in 2004, and updated in 2012 that addressed four separate questions: prophylactic versus therapeutic-only platelet transfusion policy; prophylactic platelet transfusion threshold; prophylactic platelet transfusion dose; and platelet transfusions compared to alternative treatments. This review has now been split into four smaller reviews; this review compares different platelet transfusion doses. Objectives To determine whether different doses of prophylactic platelet transfusions (platelet transfusions given to prevent bleeding) affect their efficacy and safety in preventing bleeding in people with haematological disorders undergoing myelosuppressive chemotherapy with or without haematopoietic stem cell transplantation (HSCT). Search methods We searched for randomised controlled trials in the Cochrane Central Register of Controlled Trials (CENTRAL) (Cochrane Library 2015, Issue 6), MEDLINE (from 1946), Embase (from 1974), CINAHL (from 1937), the Transfusion Evidence Library (from 1950), and ongoing trial databases to 23 July 2015. Selection criteria Randomised controlled trials involving transfusions of platelet concentrates, prepared either from individual units of whole blood or by apheresis, and given to prevent bleeding in people with malignant haematological disorders or undergoing HSCT that compared different platelet component doses (low dose 1.1 × 1011/m2 ± 25%, standard dose 2.2 × 1011/m2 ± 25%, high dose 4.4 × 1011/m2 ± 25%). Data collection and analysis We used the standard methodological procedures expected by The Cochrane Collaboration. Main results We included seven trials (1814 participants) in this review; six were conducted during one course of treatment (chemotherapy or HSCT). Overall the methodological quality of studies was low to moderate across different outcomes according to GRADE methodology. None of the included studies were at low risk of bias in every domain, and all the included studies had some threats to validity. Five studies reported the number of participants with at least one clinically significant bleeding episode within 30 days from the start of the study. There was no difference in the number of participants with a clinically significant bleeding episode between the low-dose and standard-dose groups (four studies; 1170 participants; risk ratio (RR) 1.04, 95% confidence interval (CI) 0.95 to 1.13; moderate-quality evidence); low-dose and high-dose groups (one study; 849 participants; RR 1.02, 95% CI 0.93 to 1.11; moderate-quality evidence); or high-dose and standard-dose groups (two studies; 951 participants; RR 1.02, 95% CI 0.93 to 1.11; moderate-quality evidence). Three studies reported the number of days with a clinically significant bleeding event per participant. There was no difference in the number of days of bleeding per participant between the low-dose and standard-dose groups (two studies; 230 participants; mean difference −0.17, 95% CI −0.51 to 0.17; low quality evidence). 
One study (855 participants) showed no difference in the number of days of bleeding per participant between high-dose and standard-dose groups, or between low-dose and high-dose groups (849 participants). Three studies reported the number of participants with severe or life-threatening bleeding. There was no difference in the number of participants with severe or life-threatening bleeding between a low-dose and a standard-dose platelet transfusion policy (three studies; 1059 participants; RR 1.33, 95% CI 0.91 to 1.92; low-quality evidence); low-dose and high-dose groups (one study; 849 participants; RR 1.20, 95% CI 0.82 to 1.77; low-quality evidence); or high-dose and standard-dose groups (one study; 855 participants; RR 1.11, 95% CI 0.73 to 1.68; low-quality evidence). Two studies reported the time to first bleeding episodes; we were unable to perform a meta-analysis. Both studies (959 participants) individually found that the time to first bleeding episode was either the same, or longer, in the low-dose group compared to the standard-dose group. One study (855 participants) found that the time to the first bleeding episode was the same in the high-dose group compared to the standard-dose group. Three studies reported all-cause mortality within 30 days from the start of the study. There was no difference in all-cause mortality between treatment arms (low-dose versus standard-dose: three studies; 1070 participants; RR 2.04, 95% CI 0.70 to 5.93; low-quality evidence; low-dose versus high-dose: one study; 849 participants; RR 1.33, 95% CI 0.50 to 3.54; low-quality evidence; and high-dose versus standard-dose: one study; 855 participants; RR 1.71, 95% CI 0.51 to 5.81; low-quality evidence). Six studies reported the number of platelet transfusions; we were unable to perform a meta-analysis. Two studies (959 participants) out of three (1070 participants) found that a low-dose transfusion strategy led to more transfusion episodes than a standard-dose. One study (849 participants) found that a low-dose transfusion strategy led to more transfusion episodes than a high-dose strategy. One study (855 participants) out of three (1007 participants) found no difference in the number of platelet transfusions between the high-dose and standard-dose groups. One study reported on transfusion reactions. This study’s authors suggested that a high-dose platelet transfusion strategy may lead to a higher rate of transfusion-related adverse events. None of the studies reported quality-of-life. Authors’ conclusions In haematology patients who are thrombocytopenic due to myelosuppressive chemotherapy or HSCT, we found no evidence to suggest that a low-dose platelet transfusion policy is associated with an increased bleeding risk compared to a standard-dose or high-dose policy, or that a high-dose platelet transfusion policy is associated with a decreased risk of bleeding when compared to a standard-dose policy. A low-dose platelet transfusion strategy leads to an increased number of transfusion episodes compared to a standard-dose strategy. A high-dose platelet transfusion strategy does not decrease the number of transfusion episodes per participant compared to a standard-dose regimen, and it may increase the number of transfusion-related adverse events. Findings from this review would suggest a change from current practice, with low-dose platelet transfusions used for people receiving in-patient treatment for their haematological disorder and high-dose platelet transfusion strategies not being used routinely. 
PMID:26505729
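    As an aside on the statistics quoted above, the risk ratios (RR) and 95% confidence intervals reported for each comparison are typically derived from simple two-arm event counts. A minimal sketch of that calculation follows; the counts used are hypothetical and are not taken from the review.

    import math

    def risk_ratio(events_a, n_a, events_b, n_b):
        """Risk ratio and Wald-type 95% CI for a single two-arm comparison."""
        rr = (events_a / n_a) / (events_b / n_b)
        # Standard error of log(RR) for a 2x2 table.
        se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
        lo = math.exp(math.log(rr) - 1.96 * se)
        hi = math.exp(math.log(rr) + 1.96 * se)
        return rr, (lo, hi)

    print(risk_ratio(events_a=300, n_a=580, events_b=290, n_b=590))
    # -> roughly (1.05, (0.94, 1.18)): no clear difference between the arms.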

  10. Preparation of TNT, RDX and Ammonium Nitrate Standards on Gold-on-Silicon Surfaces by Thermal Inkjet Technology

    NASA Astrophysics Data System (ADS)

    Wrable-Rose, Madeline; Primera-Pedrozo, Oliva M.; Pacheco-Londoño, Leonardo C.; Hernandez-Rivera, Samuel P.

    2010-12-01

    This research examines surface contamination properties, trace sample preparation methodologies, detection system response, and the generation of explosive contamination standards for trace detection systems. Homogeneous and reproducible sample preparation is relevant for trace detection of chemical threats, such as warfare agents, highly energetic materials (HEM) and toxic industrial chemicals. The objective of this research was to develop a technology capable of producing samples and standards of HEM with controlled size and distribution on a substrate to generate specimens that would reproduce real contamination conditions. The research activities included (1) a study of the properties of particles generated by two deposition techniques: sample smearing deposition and inkjet deposition, on gold-coated silicon, glass and stainless steel substrates; (2) characterization of composition, distribution and adhesion characteristics of deposits; (3) evaluation of accuracy and reproducibility for depositing neat highly energetic materials such as TNT, RDX and ammonium nitrate; (4) a study of HEM-surface interactions using FTIR-RAIRS; and (5) establishment of protocols for validation of surface concentration using destructive methods such as HPLC.

  11. Crowdsourcing-based evaluation of privacy in HDR images

    NASA Astrophysics Data System (ADS)

    Korshunov, Pavel; Nemoto, Hiromi; Skodras, Athanassios; Ebrahimi, Touradj

    2014-05-01

    The ability of High Dynamic Range imaging (HDRi) to capture details in high-contrast environments, making both dark and bright regions clearly visible, has strong implications for privacy. However, the extent to which HDRi affects privacy when it is used instead of typical Standard Dynamic Range imaging (SDRi) is not yet clear. In this paper, we investigate the effect of HDRi on privacy via crowdsourcing evaluation using the Microworkers platform. Due to the lack of a standard HDRi privacy evaluation dataset, we created such a dataset, containing people of varying gender, race, and age, shot indoors and outdoors under a wide range of lighting conditions. We evaluate the tone-mapped versions of these images, obtained by several representative tone-mapping algorithms, using a subjective privacy evaluation methodology. The evaluation was performed using a crowdsourcing-based framework, because it is a popular and effective alternative to traditional lab-based assessment. The results of the experiments demonstrate a significant loss of privacy when even tone-mapped versions of HDR images are used compared to typical SDR images shot with a standard exposure.

  12. Accelerator controls at CERN: Some converging trends

    NASA Astrophysics Data System (ADS)

    Kuiper, B.

    1990-08-01

    CERN's growing services to the high-energy physics community using frozen resources have led to the implementation of "Technical Boards", mandated to assist the management by making recommendations for rationalizations in various technological domains. The Board on Process Control and Electronics for Accelerators, TEBOCO, has emphasized four main lines which might yield economy in resources. First, a common architecture for accelerator controls has been agreed between the three accelerator divisions. Second, a common hardware/software kit has been defined, from which the large majority of future process interfacing may be composed. A support service for this kit is an essential part of the plan. Third, high-level protocols have been developed for standardizing access to process devices. They derive from agreed standard models of the devices and involve a standard control message. This should ease application development and mobility of equipment. Fourth, a common software engineering methodology and a commercial package of application development tools have been adopted. Some rationalization in the field of the man-machine interface and in matters of synchronization is also under way.

  13. Integrating remote sensing and local vegetation information for a high-resolution biogenic emissions inventory--application to an urbanized, semiarid region.

    PubMed

    Diem, J E; Comrie, A C

    2000-11-01

    This paper presents a methodology for the development of a high-resolution (30-m), standardized biogenic volatile organic compound (BVOC) emissions inventory and a subsequent application of the methodology to Tucson, AZ. The region's heterogeneous vegetation cover cannot be modeled accurately with low-resolution (e.g., 1-km) land cover and vegetation information. Instead, local vegetation data are used in conjunction with multispectral satellite data to generate a detailed vegetation-based land-cover database of the region. A high-resolution emissions inventory is assembled by associating the vegetation data with appropriate emissions factors. The inventory reveals a substantial variation in BVOC emissions across the region, resulting from the region's diversity of both native and exotic vegetation. The importance of BVOC emissions from forest lands, desert lands, and the urban forest changes according to regional, metropolitan, and urban scales. Within the entire Tucson region, the average isoprene, monoterpene, and OVOC fluxes observed were 454, 248, and 91 micrograms/m2/hr, respectively, with forest and desert lands emitting nearly all of the BVOCs. Within the metropolitan area, which does not include the forest lands, the average fluxes were 323, 181, and 70 micrograms/m2/hr, respectively. Within the urban area, the average fluxes were 801, 100, and 100 micrograms/m2/hr, respectively, with exotic trees such as eucalyptus, pine, and palm emitting most of the urban BVOCs. The methods presented in this paper can be modified to create detailed, standardized BVOC emissions inventories for other regions, especially those with spatially complex vegetation patterns.
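    The core inventory step described above (attaching per-class emission factors to a high-resolution land-cover grid and aggregating area-averaged fluxes) can be illustrated with a short sketch. The land-cover classes and emission-factor values below are hypothetical placeholders, not the study's data.

    # Hypothetical emission factors in micrograms/m2/hr per land-cover class.
    EMISSION_FACTORS = {
        "forest":       {"isoprene": 900.0, "monoterpene": 400.0, "ovoc": 150.0},
        "desert_scrub": {"isoprene": 300.0, "monoterpene": 200.0, "ovoc":  80.0},
        "urban_forest": {"isoprene": 800.0, "monoterpene": 100.0, "ovoc": 100.0},
        "bare_built":   {"isoprene":   0.0, "monoterpene":   0.0, "ovoc":   0.0},
    }

    def mean_fluxes(landcover_grid):
        """Average BVOC fluxes over a grid of 30-m land-cover class labels."""
        totals = {"isoprene": 0.0, "monoterpene": 0.0, "ovoc": 0.0}
        n = 0
        for row in landcover_grid:
            for cls in row:
                for compound, factor in EMISSION_FACTORS[cls].items():
                    totals[compound] += factor
                n += 1
        return {compound: total / n for compound, total in totals.items()}

    grid = [["forest", "desert_scrub"], ["urban_forest", "bare_built"]]
    print(mean_fluxes(grid))   # e.g., {'isoprene': 500.0, ...}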

  14. A quality improvement initiative to reduce necrotizing enterocolitis across hospital systems.

    PubMed

    Nathan, Amy T; Ward, Laura; Schibler, Kurt; Moyer, Laurel; South, Andrew; Kaplan, Heather C

    2018-04-20

    Necrotizing enterocolitis (NEC) is a devastating intestinal disease in premature infants. Local rates of NEC were unacceptably high. We hypothesized that utilizing quality improvement methodology to standardize care and apply evidence-based practices would reduce our rate of NEC. A multidisciplinary team used the model for improvement to prioritize interventions. Three neonatal intensive care units (NICUs) developed a standardized feeding protocol for very low birth weight (VLBW) infants, and employed strategies to increase the use of human milk, maximize intestinal perfusion, and promote a healthy microbiome. The primary outcome measure, NEC in VLBW infants, decreased from 0.17 cases/100 VLBW patient days to 0.029, an 83% reduction, while the compliance with a standardized feeding protocol improved. Through reliable implementation of evidence-based practices, this project reduced the regional rate of NEC by 83%. A key outcome and primary driver of success was standardization across multiple NICUs, resulting in consistent application of best practices and reduction in variation.

  15. Using inductively coupled plasma-mass spectrometry for calibration transfer between environmental CRMs.

    PubMed

    Turk, G C; Yu, L L; Salit, M L; Guthrie, W F

    2001-06-01

    Multielement analyses of environmental reference materials have been performed using existing certified reference materials (CRMs) as calibration standards for inductively coupled plasma-mass spectrometry. The analyses have been performed using a high-performance methodology that results in comparison measurement uncertainties that are significantly less than the uncertainties of the certified values of the calibration CRM. Consequently, the determined values have uncertainties that are very nearly equivalent to the uncertainties of the calibration CRM. Several uses of this calibration transfer are proposed, including re-certification measurements of replacement CRMs, establishing traceability of one CRM to another, and demonstrating the equivalence of two CRMs. RM 8704, a river sediment, was analyzed using SRM 2704, Buffalo River Sediment, as the calibration standard. SRM 1632c, Trace Elements in Bituminous Coal, which is a replacement for SRM 1632b, was analyzed using SRM 1632b as the standard. SRM 1635, Trace Elements in Subbituminous Coal, was also analyzed using SRM 1632b as the standard.

  16. Speech-language pathologists and the Common Core Standards initiative: an opportunity for leadership and organizational change.

    PubMed

    Dunkle, Jennifer; Flynn, Perry

    2012-05-01

    The Common Core State Standards initiative within public school education is designed to provide uniform guidelines for academic standards, including more explicit language targets. Speech-language pathologists (SLPs) are highly qualified language experts who may find new leadership roles within their clinical practice using the Common Core Standards. However, how SLPs actually use these standards in clinical practice still needs to be examined. This article examines the social context of organizations and organizational change in relation to clinical practice. Specifically, this article presents the diffusion of innovations theory to explain how initiatives move from ideas to institutionalization and the importance of the social context in which these initiatives are introduced. Next, the values of both SLPs and organizations are discussed. Finally, this article provides information on how to effect organizational change through the value of an affirmative, socially based theoretical perspective and methodology, appreciative inquiry.

  17. Methodology for sensitivity analysis, approximate analysis, and design optimization in CFD for multidisciplinary applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1993-01-01

    In this study involving advanced fluid flow codes, an incremental iterative formulation (also known as the delta or correction form), together with the well-known spatially split approximate factorization algorithm, is presented for solving the very large sparse systems of linear equations that are associated with aerodynamic sensitivity analysis. For smaller 2D problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. Iterative methods are needed for larger 2D and future 3D applications, however, because direct methods require much more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioning of the coefficient matrix; this problem can be overcome when these equations are cast in the incremental form. These and other benefits are discussed. The methodology is successfully implemented and tested in 2D using an upwind, cell-centered, finite volume formulation applied to the thin-layer Navier-Stokes equations. Results are presented for two sample airfoil problems: (1) subsonic low Reynolds number laminar flow; and (2) transonic high Reynolds number turbulent flow.
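    As a worked sketch of the distinction drawn above (under the generic assumption of a sparse sensitivity system $A x = b$, notation that is assumed here rather than taken from the report), the standard form solves for $x$ directly, whereas the incremental (delta or correction) form iterates on a correction driven by the current residual:

    \[
    P\,\Delta x^{k} = b - A x^{k}, \qquad x^{k+1} = x^{k} + \Delta x^{k},
    \]

    where $P \approx A$ is an approximately factored (e.g., spatially split) operator that is cheap to invert. Because the right-hand side is the true residual, the converged solution satisfies $A x = b$ regardless of the approximations built into $P$, which is why the incremental form tolerates the ill-conditioning that defeats iterative solvers applied to the standard form.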

  18. An insight into morphometric descriptors of cell shape that pertain to regenerative medicine.

    PubMed

    Lobo, Joana; See, Eugene Yong-Shun; Biggs, Manus; Pandit, Abhay

    2016-07-01

    Cellular morphology has recently been recognized as a powerful indicator of cellular function. The analysis of cell shape has evolved from rudimentary forms of microscopic visual inspection to more advanced methodologies that utilize high-resolution microscopy coupled with sophisticated computer hardware and software for data analysis. Despite this progress, there is still a lack of standardization in quantification of morphometric parameters. In addition, uncertainty remains as to which methodologies and parameters of cell morphology will yield meaningful data, which methods should be utilized to categorize cell shape, and the extent of reliability of measurements and the interpretation of the resulting analysis. A large range of descriptors has been employed to objectively assess cellular morphology in two-dimensional and three-dimensional domains. Intuitively, simple and applicable morphometric descriptors are preferable, and standardized protocols for cell shape analysis can be achieved with the help of computerized tools. In this review, cellular morphology is discussed as a descriptor of cellular function, and the current morphometric parameters that are used quantitatively in two- and three-dimensional environments are described. Furthermore, the current problems associated with these morphometric measurements are addressed. Copyright © 2015 John Wiley & Sons, Ltd.

  19. Validation of a fast and accurate chromatographic method for detailed quantification of vitamin E in green leafy vegetables.

    PubMed

    Cruz, Rebeca; Casal, Susana

    2013-11-15

    Vitamin E analysis in green vegetables is performed by an array of different methods, making it difficult to compare published data or to choose an adequate method for a particular sample. Aiming to achieve a consistent method with wide applicability, the current study reports the development and validation of a fast micro-method for quantification of vitamin E in green leafy vegetables. The methodology uses solid-liquid extraction based on the Folch method, with tocol as internal standard, and normal-phase HPLC with fluorescence detection. A large linear working range was confirmed, being highly reproducible, with inter-day precisions below 5% (RSD). Method sensitivity was established (below 0.02 μg/g fresh weight), and accuracy was assessed by recovery tests (>96%). The method was tested in different green leafy vegetables, evidencing diverse tocochromanol profiles, with variable ratios and amounts of α- and γ-tocopherol, and other minor compounds. The methodology is adequate for routine analyses, with a reduced chromatographic run (<7 min) and organic solvent consumption, and requires only standard chromatographic equipment available in most laboratories. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. The relationship between return on investment and quality of study methodology in workplace health promotion programs.

    PubMed

    Baxter, Siyan; Sanderson, Kristy; Venn, Alison J; Blizzard, C Leigh; Palmer, Andrew J

    2014-01-01

    To determine the relationship between return on investment (ROI) and quality of study methodology in workplace health promotion programs. Data were obtained through a systematic literature search of the National Health Service Economic Evaluation Database (NHS EED), Database of Abstracts of Reviews of Effects (DARE), Health Technology Database (HTA), Cost Effectiveness Analysis (CEA) Registry, EconLit, PubMed, Embase, Wiley, and Scopus. Included were articles written in English or German reporting cost(s) and benefit(s) of single- or multicomponent health promotion programs in working adults. Return-to-work and workplace injury prevention studies were excluded. Methodological quality was graded using the British Medical Journal Economic Evaluation Working Party checklist. Economic outcomes were presented as ROI. ROI was calculated as ROI = (benefits - costs of program)/costs of program. Results were weighted by study size and combined using meta-analysis techniques. Sensitivity analysis was performed using two additional methodological quality checklists. The influences of quality score and important study characteristics on ROI were explored. Fifty-one studies (61 intervention arms) published between 1984 and 2012 included 261,901 participants and 122,242 controls from nine industry types across 12 countries. Methodological quality scores were highly correlated between checklists (r = .84-.93). Methodological quality improved over time. Overall weighted ROI [mean ± standard deviation (confidence interval)] was 1.38 ± 1.97 (1.38-1.39), which indicated a 138% return on investment. When accounting for methodological quality, an inverse relationship to ROI was found. High-quality studies (n = 18) had a smaller mean ROI, 0.26 ± 1.74 (.23-.30), compared to moderate (n = 16) 0.90 ± 1.25 (.90-.91) and low-quality (n = 27) 2.32 ± 2.14 (2.30-2.33) studies. Randomized controlled trials (RCTs) (n = 12) exhibited a negative ROI, -0.22 ± 2.41 (-.27 to -.16). Financial returns became increasingly positive across quasi-experimental, nonexperimental, and modeled studies: 1.12 ± 2.16 (1.11-1.14), 1.61 ± 0.91 (1.56-1.65), and 2.05 ± 0.88 (2.04-2.06), respectively. Overall, the mean weighted ROI in workplace health promotion was positive. Higher methodological quality studies provided evidence of smaller financial returns. Methodological quality and study design are important determinants.
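    The ROI definition quoted above is simple arithmetic; a quick sketch follows, with made-up figures rather than values from the review.

    def roi(benefits, program_costs):
        """Return on investment: (benefits - costs of program) / costs of program."""
        return (benefits - program_costs) / program_costs

    # A hypothetical program costing $100,000 that yields $238,000 in benefits:
    print(roi(238_000, 100_000))   # 1.38, i.e., a 138% return on investment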

  1. 42 CFR 493.1281 - Standard: Comparison of test results.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 5 2012-10-01 2012-10-01 false Standard: Comparison of test results. 493.1281... Testing Analytic Systems § 493.1281 Standard: Comparison of test results. (a) If a laboratory performs the... between test results using the different methodologies, instruments, or testing sites. (b) The laboratory...

  2. 42 CFR 493.1281 - Standard: Comparison of test results.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Comparison of test results. 493.1281... Testing Analytic Systems § 493.1281 Standard: Comparison of test results. (a) If a laboratory performs the... between test results using the different methodologies, instruments, or testing sites. (b) The laboratory...

  3. 42 CFR 493.1281 - Standard: Comparison of test results.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 5 2013-10-01 2013-10-01 false Standard: Comparison of test results. 493.1281... Testing Analytic Systems § 493.1281 Standard: Comparison of test results. (a) If a laboratory performs the... between test results using the different methodologies, instruments, or testing sites. (b) The laboratory...

  4. 42 CFR 493.1281 - Standard: Comparison of test results.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Standard: Comparison of test results. 493.1281... Testing Analytic Systems § 493.1281 Standard: Comparison of test results. (a) If a laboratory performs the... between test results using the different methodologies, instruments, or testing sites. (b) The laboratory...

  5. 42 CFR 493.1281 - Standard: Comparison of test results.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 5 2014-10-01 2014-10-01 false Standard: Comparison of test results. 493.1281... Testing Analytic Systems § 493.1281 Standard: Comparison of test results. (a) If a laboratory performs the... between test results using the different methodologies, instruments, or testing sites. (b) The laboratory...

  6. Pediatric Cancer Survivorship Research: Experience of the Childhood Cancer Survivor Study

    PubMed Central

    Leisenring, Wendy M.; Mertens, Ann C.; Armstrong, Gregory T.; Stovall, Marilyn A.; Neglia, Joseph P.; Lanctot, Jennifer Q.; Boice, John D.; Whitton, John A.; Yasui, Yutaka

    2009-01-01

    The Childhood Cancer Survivor Study (CCSS) is a comprehensive multicenter study designed to quantify and better understand the effects of pediatric cancer and its treatment on later health, including behavioral and sociodemographic outcomes. The CCSS investigators have published more than 100 articles in the scientific literature related to the study. As with any large cohort study, high standards for methodologic approaches are imperative for valid and generalizable results. In this article we describe methodological issues of study design, exposure assessment, outcome validation, and statistical analysis. Methods for handling missing data, intrafamily correlation, and competing risks analysis are addressed; each with particular relevance to pediatric cancer survivorship research. Our goal in this article is to provide a resource and reference for other researchers working in the area of long-term cancer survivorship. PMID:19364957

  7. MapEdit: solution to continuous raster map creation

    NASA Astrophysics Data System (ADS)

    Rančić, Dejan; Djordjevi-Kajan, Slobodanka

    2003-03-01

    The paper describes MapEdit, MS Windows TM software for georeferencing and rectification of scanned paper maps. The software produces continuous raster maps which can be used as background in geographical information systems. The process of continuous raster map creation using the MapEdit "mosaicking" function is also described, as well as the georeferencing and rectification algorithms which are used in MapEdit. Our approach for georeferencing and rectification, using four control points and two linear transformations for each scanned map part together with the nearest neighbor resampling method, represents a low-cost, high-speed solution that produces continuous raster maps with satisfactory quality for many purposes (±1 pixel). Quality assessment of several continuous raster maps at different scales that have been created using our software and methodology has been undertaken, and the results are presented in the paper. For the quality control of the produced raster maps we referred to three widely adopted standards: US Standard for Digital Cartographic Data, National Standard for Spatial Data Accuracy and US National Map Accuracy Standard. The results obtained during the quality assessment process are given in the paper and show that our maps meet all three standards.
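    The georeferencing step described above (a linear transformation fitted to control points, followed by nearest-neighbour resampling) can be sketched generically as follows. This is an illustrative outline under assumed coordinates, not the MapEdit implementation; the control-point values are hypothetical.

    import numpy as np

    def fit_affine(src_pts, dst_pts):
        """Least-squares affine mapping: [u, v] ~= [x, y, 1] @ M (M is 3x2)."""
        A = np.hstack([src_pts, np.ones((len(src_pts), 1))])
        M, *_ = np.linalg.lstsq(A, dst_pts, rcond=None)
        return M

    def apply_affine(M, pts):
        A = np.hstack([pts, np.ones((len(pts), 1))])
        return A @ M

    # Four hypothetical control points: scanned-pixel coords -> map (ground) coords.
    pixel_pts = np.array([[0, 0], [1000, 0], [0, 800], [1000, 800]], float)
    map_pts = np.array([[500000.0, 4500000.0], [501000.0, 4500000.0],
                        [500000.0, 4499200.0], [501000.0, 4499200.0]])

    to_map = fit_affine(pixel_pts, map_pts)      # georeference: pixel -> map
    to_pixel = fit_affine(map_pts, pixel_pts)    # inverse mapping for resampling

    def resample_nearest(image, map_xy):
        """Nearest-neighbour lookup of the scanned image at a map coordinate."""
        col, row = apply_affine(to_pixel, np.array([map_xy]))[0]
        return image[int(round(row)), int(round(col))]

    image = np.zeros((801, 1001), dtype=np.uint8)   # stand-in for the scanned map
    print(resample_nearest(image, (500500.0, 4499600.0)))   # value at map centre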

  8. 45 CFR 153.320 - Federally certified risk adjustment methodology.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Federally certified risk adjustment methodology. 153.320 Section 153.320 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...

  9. 45 CFR 153.330 - State alternate risk adjustment methodology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false State alternate risk adjustment methodology. 153.330 Section 153.330 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...

  10. 45 CFR 153.330 - State alternate risk adjustment methodology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false State alternate risk adjustment methodology. 153.330 Section 153.330 Public Welfare Department of Health and Human Services REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...

  11. 45 CFR 153.320 - Federally certified risk adjustment methodology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Federally certified risk adjustment methodology. 153.320 Section 153.320 Public Welfare Department of Health and Human Services REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...

  12. 45 CFR 153.320 - Federally certified risk adjustment methodology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Federally certified risk adjustment methodology. 153.320 Section 153.320 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...

  13. 45 CFR 153.330 - State alternate risk adjustment methodology.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false State alternate risk adjustment methodology. 153.330 Section 153.330 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...

  14. A comprehensive evaluation of tyrosol and hydroxytyrosol derivatives in extra virgin olive oil by microwave-assisted hydrolysis and HPLC-MS/MS.

    PubMed

    Bartella, Lucia; Mazzotti, Fabio; Napoli, Anna; Sindona, Giovanni; Di Donna, Leonardo

    2018-03-01

    A rapid and reliable method to assay the total amount of tyrosol and hydroxytyrosol derivatives in extra virgin olive oil has been developed. The methodology is intended to establish the nutritional quality of this edible oil, addressing recent international health claim legislation (European Commission Regulation No. 432/2012) and supporting the classification of extra virgin olive oil as a nutraceutical. The method is based on the use of high-performance liquid chromatography coupled with tandem mass spectrometry and labeled internal standards, preceded by a fast hydrolysis step performed with the aid of microwaves under acid conditions. The overall process is particularly time saving, much shorter than any methodology previously reported. The developed approach combines rapidity with accuracy: recoveries were found to be near 100% on different fortified vegetable oils, while the RSD% values, calculated from repeatability and reproducibility experiments, are in all cases under 7%. Graphical abstract: Schematic of the methodology applied to the determination of tyrosol and hydroxytyrosol ester conjugates.

  15. Determination of iodopropynyl butylcarbamate in cosmetic formulations utilizing pulsed splitless injection, gas chromatography with electron capture detector.

    PubMed

    Palmer, Kevin B; LaFon, William; Burford, Mark D

    2017-09-22

    Current analytical methodology for iodopropynyl butylcarbamate (IPBC) analysis focuses on the use of liquid chromatography coupled with mass spectrometry (LC-MS), but the high instrumentation and operator investment required has resulted in the need for a cost-effective alternative methodology. Past publications investigating gas chromatography with electron capture detection (GC-ECD) for IPBC quantitation proved largely unsuccessful, likely due to the preservative's limited thermal stability. Pulsed injection techniques commonly used for trace analysis of thermally labile pharmaceutical compounds were successfully adapted for IPBC analysis, exploiting the selectivity of GC-ECD detection. System optimization and sample preparation improvements resulted in substantial performance and reproducibility gains. Cosmetic formulations preserved with IPBC (50-100 ppm) were solvated in toluene/isopropyl alcohol and quantified over the 0.3-1.3 μg/ml calibration range. The methodology was robust (relative standard deviation 4%), accurate (98% recovery), and sensitive (limit of detection 0.25 ng/ml) for use in routine testing of cosmetic formulation preservation. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. The secret lives of experiments: methods reporting in the fMRI literature.

    PubMed

    Carp, Joshua

    2012-10-15

    Replication of research findings is critical to the progress of scientific understanding. Accordingly, most scientific journals require authors to report experimental procedures in sufficient detail for independent researchers to replicate their work. To what extent do research reports in the functional neuroimaging literature live up to this standard? The present study evaluated methods reporting and methodological choices across 241 recent fMRI articles. Many studies did not report critical methodological details with regard to experimental design, data acquisition, and analysis. Further, many studies were underpowered to detect any but the largest statistical effects. Finally, data collection and analysis methods were highly flexible across studies, with nearly as many unique analysis pipelines as there were studies in the sample. Because the rate of false positive results is thought to increase with the flexibility of experimental designs, the field of functional neuroimaging may be particularly vulnerable to false positives. In sum, the present study documented significant gaps in methods reporting among fMRI studies. Improved methodological descriptions in research reports would yield significant benefits for the field. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. Creating a standardized process to offer the standard of care: continuous process improvement methodology is associated with increased rates of sperm cryopreservation among adolescent and young adult males with cancer.

    PubMed

    Shnorhavorian, Margarett; Kroon, Leah; Jeffries, Howard; Johnson, Rebecca

    2012-11-01

    There is limited literature on strategies to overcome the barriers to sperm banking among adolescent and young adult (AYA) males with cancer. By standardizing our process for offering sperm banking to AYA males before cancer treatment, we aimed to improve rates of sperm banking at our institution. Continuous process improvement is a technique that has recently been applied to improve health care delivery. We used continuous process improvement methodologies to create a standard process for fertility preservation for AYA males with cancer at our institution. We compared rates of sperm banking before and after standardization. In the 12-month period after implementation of a standardized process, 90% of patients were offered sperm banking. We demonstrated an 8-fold increase in the proportion of AYA males' sperm banking, and a 5-fold increase in the rate of sperm banking at our institution. Implementation of a standardized process for sperm banking for AYA males with cancer was associated with increased rates of sperm banking at our institution. This study supports the role of standardized health care in decreasing barriers to sperm banking.

  18. Product environmental footprint in policy and market decisions: Applicability and impact assessment.

    PubMed

    Lehmann, Annekatrin; Bach, Vanessa; Finkbeiner, Matthias

    2015-07-01

    In April 2013, the European Commission published the Product and Organisation Environmental Footprint (PEF/OEF) methodology, a life-cycle-based multicriteria measure of the environmental performance of products, services, and organizations. With its approach of "comparability over flexibility," the PEF/OEF methodology aims at harmonizing existing methods while decreasing the flexibility provided by the International Organization for Standardization (ISO) standards regarding methodological choices. Currently, a 3-year pilot phase is running, aiming at testing the methodology and developing product category and organization sector rules (PEFCR/OEFSR). Although a harmonized method is in theory a good idea, the PEF/OEF methodology presents challenges, including a risk of confusion and limitations in applicability to practice. The paper discusses the main differences between the PEF and ISO methodologies and highlights challenges regarding PEF applicability, with a focus on impact assessment. Some methodological aspects of the PEF and PEFCR Guides are found to contradict ISO 14044 (2006) and ISO 14025 (2006). Others, such as the prohibition of inventory cutoffs, are impractical. The evaluation of the impact assessment methods proposed in the PEF/OEF Guide showed that the predefined methods for water consumption, land use, and abiotic resources are not adequate because of modeling artefacts, missing inventory data, or incomplete characterization factors. However, the methods for global warming and ozone depletion perform very well. The results of this study are relevant for the PEF (and OEF) pilot phase, which aims at testing the PEF (OEF) methodology (and potentially adapting it) as well as addressing challenges and coping with them. © 2015 SETAC.

  19. A Novel Performance Evaluation Methodology for Single-Target Trackers.

    PubMed

    Kristan, Matej; Matas, Jiri; Leonardis, Ales; Vojir, Tomas; Pflugfelder, Roman; Fernandez, Gustavo; Nebehay, Georg; Porikli, Fatih; Cehovin, Luka

    2016-11-01

    This paper addresses the problem of single-target tracker performance evaluation. We consider the performance measures, the dataset and the evaluation system to be the most important components of tracker evaluation and propose requirements for each of them. The requirements are the basis of a new evaluation methodology that aims at a simple and easily interpretable tracker comparison. The ranking-based methodology addresses tracker equivalence in terms of statistical significance and practical differences. A fully annotated dataset with per-frame annotation of several visual attributes is introduced. The diversity of its visual properties is maximized in a novel way by clustering a large number of videos according to their visual attributes. This makes it the most carefully constructed and annotated dataset to date. A multi-platform evaluation system allowing easy integration of third-party trackers is presented as well. The proposed evaluation methodology was tested on the VOT2014 challenge on the new dataset and 38 trackers, making it the largest benchmark to date. Most of the tested trackers are indeed state-of-the-art since they outperform the standard baselines, resulting in a highly challenging benchmark. An exhaustive analysis of the dataset from the perspective of tracking difficulty is carried out. To facilitate tracker comparison, a new performance visualization technique is proposed.

  20. Large scale nonlinear programming for the optimization of spacecraft trajectories

    NASA Astrophysics Data System (ADS)

    Arrieta-Camacho, Juan Jose

    Despite the availability of high-fidelity mathematical models, the computation of accurate optimal spacecraft trajectories has never been an easy task. While simplified models of spacecraft motion can provide useful estimates of energy requirements, sizing, and cost, the actual launch window and maneuver scheduling must rely on more accurate representations. We propose an alternative for the computation of optimal transfers that uses an accurate representation of the spacecraft dynamics. Like other methodologies for trajectory optimization, this alternative is able to consider all major disturbances. In contrast, it can explicitly handle equality and inequality constraints throughout the trajectory; it requires neither the derivation of costate equations nor the identification of the constrained arcs. The alternative consists of two steps: (1) discretizing the dynamic model using high-order collocation at Radau points, which displays numerical advantages, and (2) solving the resulting Nonlinear Programming (NLP) problem using an interior point method, which does not suffer from the performance bottleneck associated with identifying the active set, as required by sequential quadratic programming methods; in this way the methodology exploits the availability of sound numerical methods and next-generation NLP solvers. In practice the methodology is versatile; it can be applied to a variety of aerospace problems such as homing, guidance, and aircraft collision avoidance, and it is particularly well suited for low-thrust spacecraft trajectory optimization. Examples are presented which consider the optimization of a low-thrust orbit transfer subject to the main disturbances due to Earth's gravity field together with Lunar and Solar attraction. Another example considers the optimization of a multiple asteroid rendezvous problem. In both cases, the ability of our proposed methodology to consider non-standard objective functions and constraints is illustrated. Future research directions are identified, involving the automatic scheduling and optimization of trajectory correction maneuvers. The sensitivity information provided by the methodology is expected to be invaluable in such research pursuits. The collocation scheme and nonlinear programming algorithm presented in this work complement other existing methodologies by providing reliable and efficient numerical methods able to handle large-scale, nonlinear dynamic models.
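    To illustrate the collocation step in the first stage of this approach, a generic sketch follows; the notation is assumed for illustration and is not taken from the dissertation. With dynamics $\dot{x} = f(x, u, t)$ and the state approximated by an interpolating polynomial on each interval of length $h_k$, the polynomial's derivative is matched to the dynamics at the Radau collocation points $t_{ki}$:

    \[
    \sum_{j=0}^{N} D_{ij}\, x_{kj} \;=\; \frac{h_k}{2}\, f\big(x_{ki},\, u_{ki},\, t_{ki}\big), \qquad i = 1, \dots, N,
    \]

    where $D$ is the differentiation matrix of the interpolating polynomial. These defect constraints, together with path constraints $g(x_{ki}, u_{ki}) \le 0$ and a discretized objective, form the large sparse NLP that the interior point solver then handles.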

  1. Evaluation of the Turkish translation of the Minimal Standard Terminology for Digestive Endoscopy by development of an endoscopic information system.

    PubMed

    Atalağ, Koray; Bilgen, Semih; Gür, Gürden; Boyacioğlu, Sedat

    2007-09-01

    There are very few evaluation studies for the Minimal Standard Terminology for Digestive Endoscopy. This study aims to evaluate the usage of the Turkish translation of Minimal Standard Terminology by developing an endoscopic information system. After elicitation of requirements, database modeling and software development were performed. Minimal Standard Terminology-driven forms were designed for rapid data entry. The endoscopic report was rapidly created by applying basic Turkish syntax and grammar rules. Entering free text and editing the final report were also possible. After three years of live usage, data analysis was performed and results were evaluated. The system has been used for reporting of all endoscopic examinations. A total of 15,638 valid records were analyzed, including 11,381 esophagogastroduodenoscopies, 2,616 colonoscopies, 1,079 rectoscopies and 562 endoscopic retrograde cholangiopancreatographies. In accordance with other previous validation studies, the overall usage of Minimal Standard Terminology terms was very high: 85% for examination characteristics, 94% for endoscopic findings and 94% for endoscopic diagnoses. Some new terms, attributes and allowed values were also added for better clinical coverage. Minimal Standard Terminology has been shown to cover a high proportion of routine endoscopy reports. Good user acceptance proves that both the terms and structure of Minimal Standard Terminology were consistent with usual clinical thinking. However, future work on Minimal Standard Terminology is mandatory for better coverage of endoscopic retrograde cholangiopancreatography examinations. Technically, new software development methodologies have to be sought to lower the cost of development and of the maintenance phase. They should also address integration and interoperability of disparate information systems.

  2. Quality assessment of recent evidence-based clinical practice guidelines for management of type 2 diabetes mellitus in adults using the AGREE II instrument.

    PubMed

    Anwer, Muhammad A; Al-Fahed, Ousama B; Arif, Samir I; Amer, Yasser S; Titi, Maher A; Al-Rukban, Mohammed O

    2018-02-01

    Type 2 diabetes mellitus (T2DM) is a worldwide and national public health problem that has a great impact on the population in Saudi Arabia. High-quality clinical practice guidelines (CPGs) are cornerstones in improving the health care provided for patients with diabetes. This study evaluated the methodological rigour, transparency, and applicability of recently published CPGs. Our group conducted a systematic search for recently published CPGs for T2DM. The searching and screening for Source CPGs were guided by tools from the ADAPTE methods with specific inclusion/exclusion criteria. Five reviewers using the second version of the Appraisal of Guidelines for Research and Evaluation (AGREE II) Instrument independently assessed the quality of the retrieved Source CPGs. The domains 'Scope and purpose' and 'Clarity of presentation' received the highest scores in all CPGs. Most of the assessed CPGs (86%) were considered to be of high overall quality and were recommended for use. The 'Rigour of development' and 'Applicability' domains were together highest in 3 CPGs (43%). The overall high quality of DM CPGs published in the last 3 years demonstrated the continuous development and improvement in CPG methodologies and standards. Health care professionals should consider the quality of any CPG for T2DM before deciding to use it in their daily clinical practice. Three CPGs have been identified, using the AGREE criteria, as high-quality and trustworthy. Ideally, the resources provided by the AGREE Trust, including the AGREE II Instrument, should be used by a clinician to scan through the large number of published T2DM CPGs to identify the CPGs with high methodological quality and applicability. © 2017 John Wiley & Sons, Ltd.

  3. A new methodology for accurate 3-dimensional coronary artery reconstruction using routine intravascular ultrasound and angiographic data: implications for widespread assessment of endothelial shear stress in humans.

    PubMed

    Bourantas, Christos V; Papafaklis, Michail I; Athanasiou, Lambros; Kalatzis, Fanis G; Naka, Katerina K; Siogkas, Panagiotis K; Takahashi, Saeko; Saito, Shigeru; Fotiadis, Dimitrios I; Feldman, Charles L; Stone, Peter H; Michalis, Lampros K

    2013-09-01

    To develop and validate a new methodology that allows accurate 3-dimensional (3-D) coronary artery reconstruction using standard, simple angiographic and intravascular ultrasound (IVUS) data acquired during routine catheterisation, enabling reliable assessment of the endothelial shear stress (ESS) distribution. Twenty-two patients (22 arteries: 7 LAD; 7 LCx; 8 RCA) who underwent angiography and IVUS examination were included. The acquired data were used for 3-D reconstruction using a conventional method and a new methodology that utilised the luminal 3-D centreline to place the detected IVUS borders and anatomical landmarks to estimate their orientation. The local ESS distribution was assessed by computational fluid dynamics. In corresponding consecutive 3 mm segments, lumen, plaque and ESS measurements in the 3-D models derived by the centreline approach were highly correlated to those derived from the conventional method (r>0.98 for all). The centreline methodology had a 99.5% diagnostic accuracy for identifying segments exposed to low ESS and provided similar estimations to the conventional method for the association between the change in plaque burden and ESS (centreline method: slope = -1.65%/Pa, p=0.078; conventional method: slope = -1.64%/Pa, p=0.084; p=0.69 for the difference between the two methodologies). The centreline methodology provides geometrically correct models and permits reliable ESS computation. The ability to utilise data acquired during routine coronary angiography and IVUS examination will facilitate clinical investigation of the role of local ESS patterns in the natural history of coronary atherosclerosis.

  4. Are normative sonographic values of kidney size in children valid and reliable? A systematic review of the methodological quality of ultrasound studies using the Anatomical Quality Assessment (AQUA) tool.

    PubMed

    Chhapola, Viswas; Tiwari, Soumya; Deepthi, Bobbity; Henry, Brandon Michael; Brar, Rekha; Kanwal, Sandeep Kumar

    2018-06-01

    A plethora of research is available on ultrasonographic kidney size standards. We performed a systematic review of the methodological quality of ultrasound studies aimed at developing normative renal parameters in healthy children, by evaluating the risk of bias (ROB) using the 'Anatomical Quality Assessment (AQUA)' tool. We searched Medline, Scopus, CINAHL, and Google Scholar on June 4, 2018, and observational studies measuring kidney size by ultrasonography in healthy children (0-18 years) were included. The ROB of each study was evaluated in five domains using a 20-item coding scheme based on the AQUA tool framework. Fifty-four studies were included. Domain 1 (subject characteristics) had a high ROB in 63% of studies due to the unclear description of age, sex, and ethnicity. The performance in Domain 2 (study design) was the best, with 85% of studies having a prospective design. Methodological characterization (Domain 3) was poor across the studies (< 10% compliance), with suboptimal performance in the description of patient positioning, operator experience, and assessment of intra/inter-observer reliability. About three-fourths of the studies had a low ROB in Domain 4 (descriptive anatomy). Domain 5 (reporting of results) had a high ROB in approximately half of the studies, the majority reporting results in the form of central tendency measures. Significant deficiencies and heterogeneity were observed in the methodological quality of USG studies performed to date for measurement of kidney size in children. We hereby provide a framework for conducting such studies in the future. PROSPERO (CRD42017071601).

  5. Appendix C: Background and Methodology for Alternative Certification Pilot. [2014 Teacher Prep Review

    ERIC Educational Resources Information Center

    Greenberg, Julie; Walsh, Kate; McKee, Arthur

    2014-01-01

    The "NCTQ Teacher Prep Review" evaluates the quality of programs that provide preservice preparation of public school teachers. As part of the "Review," this appendix reports on a pilot study of new standards for assessing the quality of alternative certification programs. Background and methodology for alternative…

  6. Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards

    ERIC Educational Resources Information Center

    Smith, Justin D.

    2012-01-01

    This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have…

  7. School Climate Reports from Norwegian Teachers: A Methodological and Substantive Study.

    ERIC Educational Resources Information Center

    Kallestad, Jan Helge; Olweus, Dan; Alsaker, Francoise

    1998-01-01

    Explores methodological and substantive issues relating to school climate, using a dataset derived from 42 Norwegian schools at two points of time and a standard definition of organizational climate. Identifies and analyzes four school-climate dimensions. Three dimensions (collegial communication, orientation to change, and teacher influence over…

  8. Culturally Competent Social Work Research: Methodological Considerations for Research with Language Minorities

    ERIC Educational Resources Information Center

    Casado, Banghwa Lee; Negi, Nalini Junko; Hong, Michin

    2012-01-01

    Despite the growing number of language minorities, foreign-born individuals with limited English proficiency, this population has been largely left out of social work research, often due to methodological challenges involved in conducting research with this population. Whereas the professional standard calls for cultural competence, a discussion…

  9. Public Relations Telephone Surveys: Avoiding Methodological Debacles.

    ERIC Educational Resources Information Center

    Stone, Gerald C.

    1996-01-01

    Reports that a study revealed a serious methodological flaw in interviewer bias in telephone surveys. States that most surveys, using standard detection measures, would not find the defect, but outcomes were so misleading that a campaign using the results would be doomed. Warns about practitioner telephone surveys; suggests special precautions if…

  10. Alignment of Standards and Assessments as an Accountability Criterion.

    ERIC Educational Resources Information Center

    La Marca, Paul M.

    2001-01-01

    Provides an overview of the concept of alignment and the role it plays in assessment and accountability systems. Discusses some methodological issues affecting the study of alignment and explores the relationship between alignment and test score interpretation. Alignment is not only a methodological requirement but also an ethical requirement.…

  11. Establishment of Requirements and Methodology for the Development and Implementation of GreyMatters, a Memory Clinic Information System.

    PubMed

    Tapuria, Archana; Evans, Matt; Curcin, Vasa; Austin, Tony; Lea, Nathan; Kalra, Dipak

    2017-01-01

    The aim of the paper is to establish the requirements and methodology for the development process of GreyMatters, a memory clinic system, outlining the conceptual, practical, technical and ethical challenges, and the experiences of capturing clinical and research oriented data along with the implementation of the system. The methodology for development of the information system involved phases of requirements gathering, modeling and prototype creation, and 'bench testing' the prototype with experts. The standard Institute of Electrical and Electronics Engineers (IEEE) recommended approach for the specifications of software requirements was adopted. An electronic health record (EHR) standard, EN13606 was used, and clinical modelling was done through archetypes and the project complied with data protection and privacy legislation. The requirements for GreyMatters were established. Though the initial development was complex, the requirements, methodology and standards adopted made the construction, deployment, adoption and population of a memory clinic and research database feasible. The electronic patient data including the assessment scales provides a rich source of objective data for audits and research and to establish study feasibility and identify potential participants for the clinical trials. The establishment of requirements and methodology, addressing issues of data security and confidentiality, future data compatibility and interoperability and medico-legal aspects such as access controls and audit trails, led to a robust and useful system. The evaluation supports that the system is an acceptable tool for clinical, administrative, and research use and forms a useful part of the wider information architecture.

  12. 47 CFR 51.503 - General pricing standard.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false General pricing standard. 51.503 Section 51.503... Pricing of Elements § 51.503 General pricing standard. (a) An incumbent LEC shall offer elements to... commission— (1) Pursuant to the forward-looking economic cost-based pricing methodology set forth in §§ 51...

  13. Higher Education in Non-Standard Wage Contracts

    ERIC Educational Resources Information Center

    Rosti, Luisa; Chelli, Francesco

    2012-01-01

    Purpose: The purpose of this paper is to verify whether higher education increases the likelihood of young Italian workers moving from non-standard to standard wage contracts. Design/methodology/approach: The authors exploit a data set on labour market flows, produced by the Italian National Statistical Office, by interviewing about 85,000…

  14. Situating Standard Setting within Argument-Based Validity

    ERIC Educational Resources Information Center

    Papageorgiou, Spiros; Tannenbaum, Richard J.

    2016-01-01

    Although there has been substantial work on argument-based approaches to validation as well as standard-setting methodologies, it might not always be clear how standard setting fits into argument-based validity. The purpose of this article is to address this lack in the literature, with a specific focus on topics related to argument-based…

  15. The Benefits of Standards-Based Grading: A Critical Evaluation of Modern Grading Practices

    ERIC Educational Resources Information Center

    Iamarino, Danielle L.

    2014-01-01

    This paper explores the methodology and application of an assessment philosophy known as standards-based grading, via a critical comparison of standards-based grading to other assessment philosophies commonly employed at the elementary, secondary, and post-secondary levels of education. Evidenced by examples of increased student engagement and…

  16. Full-Envelope Launch Abort System Performance Analysis Methodology

    NASA Technical Reports Server (NTRS)

    Aubuchon, Vanessa V.

    2014-01-01

    The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at the Mach 1 altitude, at the maximum dynamic pressure altitude, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of the initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
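
    The contrast between the two dispersion strategies can be sketched in a few lines of Monte Carlo code. The toy model, parameter names and distributions below are invented placeholders, not the NASA simulation; the point is only that fixing the initiation condition samples one slice of the ascent, while dispersing it covers the whole envelope.

    ```python
    import random

    def simulate_abort(initiation_altitude_m, wind_bias_mps, thrust_factor):
        """Toy stand-in for an abort simulation: returns a notional
        separation-distance 'performance' metric (not a real flight model)."""
        return (0.004 * initiation_altitude_m
                + 120.0 * thrust_factor
                - 3.0 * abs(wind_bias_mps))

    def run_monte_carlo(n, disperse_initiation):
        results = []
        for _ in range(n):
            # Other LAS parameters are always dispersed.
            wind = random.gauss(0.0, 5.0)
            thrust = random.uniform(0.95, 1.05)
            if disperse_initiation:
                # Full-envelope method: initiation altitude drawn over the whole ascent.
                alt = random.uniform(0.0, 40_000.0)
            else:
                # Standard method: initiation fixed at one discrete condition
                # (e.g., the altitude assumed for maximum dynamic pressure).
                alt = 11_000.0
            results.append(simulate_abort(alt, wind, thrust))
        return results

    standard = run_monte_carlo(10_000, disperse_initiation=False)
    full_envelope = run_monte_carlo(10_000, disperse_initiation=True)
    print(min(standard), min(full_envelope))  # full-envelope exposes worst-case "pinch points"
    ```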

  17. Decision analysis to complete diagnostic research by closing the gap between test characteristics and cost-effectiveness.

    PubMed

    Schaafsma, Joanna D; van der Graaf, Yolanda; Rinkel, Gabriel J E; Buskens, Erik

    2009-12-01

    The lack of a standard methodology in diagnostic research impedes adequate evaluation before implementation of constantly developing diagnostic techniques. We discuss the methodology of diagnostic research and underscore the relevance of decision analysis in the process of evaluation of diagnostic tests. Overview and conceptual discussion. Diagnostic research requires a stepwise approach comprising assessment of test characteristics followed by evaluation of added value, clinical outcome, and cost-effectiveness. These multiple goals are generally incompatible with a randomized design. Decision-analytic models provide an important alternative through integration of the best available evidence. Thus, critical assessment of clinical value and efficient use of resources can be achieved. Decision-analytic models should be considered part of the standard methodology in diagnostic research. They can serve as a valid alternative to diagnostic randomized clinical trials (RCTs).
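
    A decision-analytic model of the kind advocated here reduces, in its simplest form, to comparing expected outcomes across strategies. The sketch below uses invented probabilities and utility values for a hypothetical test-then-treat versus treat-all comparison; a real model would add costs, more branches and sensitivity analyses.

    ```python
    # Minimal decision-analytic sketch: expected utility of "test then treat"
    # versus "treat all", using invented probabilities and utilities.
    prevalence = 0.10
    sensitivity, specificity = 0.90, 0.85
    utility_treated_disease = 0.95      # QALY-like payoffs, illustrative only
    utility_untreated_disease = 0.70
    utility_no_disease_treated = 0.93   # small harm from unnecessary treatment
    utility_no_disease_untreated = 1.00

    def expected_utility_test_then_treat():
        tp = prevalence * sensitivity
        fn = prevalence * (1 - sensitivity)
        fp = (1 - prevalence) * (1 - specificity)
        tn = (1 - prevalence) * specificity
        return (tp * utility_treated_disease
                + fn * utility_untreated_disease
                + fp * utility_no_disease_treated
                + tn * utility_no_disease_untreated)

    def expected_utility_treat_all():
        return (prevalence * utility_treated_disease
                + (1 - prevalence) * utility_no_disease_treated)

    print(expected_utility_test_then_treat(), expected_utility_treat_all())
    ```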

  18. 77 FR 53059 - Risk-Based Capital Guidelines: Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-30

    ...The Office of the Comptroller of the Currency (OCC), Board of Governors of the Federal Reserve System (Board), and Federal Deposit Insurance Corporation (FDIC) are revising their market risk capital rules to better capture positions for which the market risk capital rules are appropriate; reduce procyclicality; enhance the rules' sensitivity to risks that are not adequately captured under current methodologies; and increase transparency through enhanced disclosures. The final rule does not include all of the methodologies adopted by the Basel Committee on Banking Supervision for calculating the standardized specific risk capital requirements for debt and securitization positions due to their reliance on credit ratings, which is impermissible under the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010. Instead, the final rule includes alternative methodologies for calculating standardized specific risk capital requirements for debt and securitization positions.

  19. Standard operating procedures for serum and plasma collection: early detection research network consensus statement standard operating procedure integration working group.

    PubMed

    Tuck, Melissa K; Chan, Daniel W; Chia, David; Godwin, Andrew K; Grizzle, William E; Krueger, Karl E; Rom, William; Sanda, Martin; Sorbara, Lynn; Stass, Sanford; Wang, Wendy; Brenner, Dean E

    2009-01-01

    Specimen collection is an integral component of clinical research. Specimens from subjects with various stages of cancers or other conditions, as well as those without disease, are critical tools in the hunt for biomarkers, predictors, or tests that will detect serious diseases earlier or more readily than currently possible. Analytic methodologies evolve quickly. Access to high-quality specimens, collected and handled in standardized ways that minimize potential bias or confounding factors, is key to the "bench to bedside" aim of translational research. It is essential that standard operating procedures, "the how" of creating the repositories, be defined prospectively when designing clinical trials. Small differences in the processing or handling of a specimen can have dramatic effects in analytical reliability and reproducibility, especially when multiplex methods are used. A representative working group, Standard Operating Procedures Internal Working Group (SOPIWG), comprised of members from across Early Detection Research Network (EDRN) was formed to develop standard operating procedures (SOPs) for various types of specimens collected and managed for our biomarker discovery and validation work. This report presents our consensus on SOPs for the collection, processing, handling, and storage of serum and plasma for biomarker discovery and validation.

  20. Visual characterization and diversity quantification of chemical libraries: 2. Analysis and selection of size-independent, subspace-specific diversity indices.

    PubMed

    Colliandre, Lionel; Le Guilloux, Vincent; Bourg, Stephane; Morin-Allory, Luc

    2012-02-27

    High Throughput Screening (HTS) is a standard technique widely used to find hit compounds in drug discovery projects. The high costs associated with such experiments have highlighted the need to carefully design screening libraries in order to avoid wasting resources. Molecular diversity is an established concept that has been used to this end for many years. In this article, a new approach to quantify the molecular diversity of screening libraries is presented. The approach is based on the Delimited Reference Chemical Subspace (DRCS) methodology, a new method that can be used to delimit the densest subspace spanned by a reference library in a reduced 2D continuous space. A total of 22 diversity indices were implemented or adapted to this methodology, which is used here to remove outliers and obtain a relevant cell-based partition of the subspace. The behavior of these indices was assessed and compared in various extreme situations and with respect to a set of theoretical rules that a diversity function should satisfy when libraries of different sizes have to be compared. Some gold standard indices are found inappropriate in such a context, while none of the tested indices behave perfectly in all cases. Five DRCS-based indices accounting for different aspects of diversity were finally selected, and a simple framework is proposed to use them effectively. Various libraries have been profiled with respect to more specific subspaces, which further illustrates the usefulness of the method.
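
    As a concrete (and deliberately naive) illustration of a cell-based index, the sketch below computes the fraction of occupied cells in a fixed 2D partition. It also shows the size dependence that makes such raw coverage measures problematic when libraries of different sizes are compared, which is exactly the behaviour the article's theoretical rules are designed to expose. The grid, data and index are invented for illustration and are not one of the 22 DRCS indices.

    ```python
    import numpy as np

    def cell_coverage(points_2d, grid_min, grid_max, n_cells=20):
        """Fraction of cells of a fixed 2D partition occupied by at least one
        compound; a toy cell-based diversity index over a reduced 2D space."""
        edges = np.linspace(grid_min, grid_max, n_cells + 1)
        ix = np.clip(np.digitize(points_2d[:, 0], edges) - 1, 0, n_cells - 1)
        iy = np.clip(np.digitize(points_2d[:, 1], edges) - 1, 0, n_cells - 1)
        occupied = len(set(zip(ix.tolist(), iy.tolist())))
        return occupied / n_cells**2

    # Two random "libraries" of very different sizes in the same 2D subspace:
    small_lib = np.random.default_rng(0).uniform(-1, 1, size=(100, 2))
    large_lib = np.random.default_rng(1).uniform(-1, 1, size=(10_000, 2))
    # The larger library scores far higher even though both are drawn from the
    # same distribution, illustrating the size dependence of raw coverage.
    print(cell_coverage(small_lib, -1, 1), cell_coverage(large_lib, -1, 1))
    ```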

  1. Investigation of the Surface Stress in SiC and Diamond Nanocrystals by In-situ High Pressure Powder Diffraction Technique

    NASA Technical Reports Server (NTRS)

    Palosz, B.; Stelmakh, S.; Grzanka, E.; Gierlotka, S.; Zhao, Y.; Palosz, W.

    2003-01-01

    The real atomic structure of nanocrystals determines key properties of the materials. For such materials the serious experimental problem lies in obtaining sufficiently accurate measurements of the structural parameters of the crystals, since very small crystals constitute a two-phase rather than a uniform crystallographic system. As a result, the elastic properties of nanograins may be expected to reflect the dual nature of their structure, with a corresponding set of different elastic property parameters. We studied those properties by the in-situ high-pressure powder diffraction technique. For nanocrystalline, even one-phase, materials such measurements are particularly difficult to make, since determination of the lattice parameters of very small crystals presents a challenge due to inherent limitations of the standard evaluation of powder diffractograms. In this investigation we used our methodology of structural analysis, the 'apparent lattice parameter' (alp) concept. The methodology allowed us to avoid the traps of standard powder diffraction evaluation techniques when they are applied to nanocrystals. The experiments were performed on nanocrystalline SiC and GaN powders using synchrotron sources. We applied both hydrostatic and isostatic pressures in the range of up to 40 GPa. Elastic properties of the samples were examined based on measurements of the change of the lattice parameters with pressure. The results show a dual nature of the mechanical properties (compressibilities) of the materials, indicating a complex, core-shell structure of the grains.
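
    The last step, turning lattice parameters measured as a function of pressure into a compressibility, can be illustrated with a minimal fit. The lattice parameters below are hypothetical values for a cubic phase, and the linear pressure-strain fit is a simplification of the equation-of-state analysis used in such studies; it is only meant to show how a bulk modulus estimate falls out of a(P) data.

    ```python
    import numpy as np

    # Illustrative only: estimate a bulk modulus K ~ -V dP/dV from hypothetical
    # cubic lattice parameters a(P). The paper's 'alp' analysis is more involved
    # and distinguishes core and shell contributions of the nanograins.
    pressures_gpa = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
    lattice_a_nm = np.array([0.4359, 0.4327, 0.4294, 0.4262, 0.4228])  # invented values

    volumes = lattice_a_nm**3
    strain = 1.0 - volumes / volumes[0]          # relative volume compression
    slope, _ = np.polyfit(strain, pressures_gpa, 1)   # P ~ K * (dV/V0) for small strain
    print(f"approximate bulk modulus: {slope:.0f} GPa")
    ```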

  2. Collecting standardized urban health indicator data at an individual level for school-aged children living in urban areas: methods from EURO-URHIS 2.

    PubMed

    Pope, D; Katreniak, Z; Guha, J; Puzzolo, E; Higgerson, J; Steels, S; Woode-Owusu, M; Bruce, N; Birt, Christopher A; Ameijden, E van; Verma, A

    2017-05-01

    Measuring health and its determinants in urban populations is essential to effectively develop public health policies maximizing health gain within this context. Adolescents are important in this regard given that the origins of the leading causes of morbidity and mortality develop before adulthood. Obtaining comprehensive, accurate and comparable information on adolescent urban health indicators from heterogeneous urban contexts is an important challenge. EURO-URHIS 2 aimed to develop standardized tools and methodologies for collecting data from adolescents across heterogeneous European urban contexts. Questionnaire development included (i) comprehensive assessment of urban health indicators from 7 pre-defined domains, (ii) use of previously validated questions from a literature review and other European surveys, (iii) translation/back-translation into European languages and (iv) piloting. Urban area-specific data collection methodologies were established through literature review, consultation and piloting. School-based surveys of 14-16-year-olds (400-800 per urban area) were conducted in 13 European countries (33 urban areas). Participation rates were high (80-100%) for students from schools taking part in the surveys in all urban areas, and data quality was generally good (low rates of missing/spoiled data). Overall, 13 850 questionnaires were collected, coded and entered for EURO-URHIS 2. Dissemination included production of urban area health profiles (allowing benchmarking of a number of important public health indicators in young people) and use of visualization tools as part of the EURO-URHIS 2 project. EURO-URHIS 2 has developed standardized survey tools and methodologies for assessing key measures of health and its determinants in adolescents from heterogeneous urban contexts and demonstrated the utility of these data to public health practitioners and policy makers. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  3. Multi-class methodology to determine pesticides and mycotoxins in green tea and royal jelly supplements by liquid chromatography coupled to Orbitrap high resolution mass spectrometry.

    PubMed

    Martínez-Domínguez, Gerardo; Romero-González, Roberto; Garrido Frenich, Antonia

    2016-04-15

    A multi-class methodology was developed to determine pesticides and mycotoxins in food supplements. The extraction was performed using acetonitrile acidified with formic acid (1%, v/v). Different clean-up sorbents were tested, and the best results were obtained using C18 and zirconium oxide for green tea and royal jelly, respectively. The compounds were determined using ultra high performance liquid chromatography (UHPLC) coupled to Exactive-Orbitrap high resolution mass spectrometry (HRMS). The recovery rates obtained were between 70% and 120% for most of the compounds studied, with a relative standard deviation <25%, at three different concentration levels. The calculated limits of quantification (LOQ) were <10 μg/kg. The method was applied to green tea (10) and royal jelly (8) samples. Nine samples (eight of green tea and one of royal jelly) were found to be positive for pesticides at concentrations ranging from 10.6 (cinosulfuron) to 47.9 μg/kg (paclobutrazol). Aflatoxin B1 (5.4 μg/kg) was also found in one of the green tea samples. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Office of Space Terrestrial Applications (OSTA)/Applications Data Service (ADS) data systems standards

    NASA Technical Reports Server (NTRS)

    Walton, B. A. (Editor)

    1981-01-01

    Standards needed to interconnect applications data service pilots for data sharing were identified. Current pilot methodologies are assessed. Recommendations for future work are made. A preliminary set of requirements for guidelines and standards for catalogues, directories, and dictionaries was identified. The user was considered to be a scientist at a terminal. Existing and emerging national and international telecommunication standards were adopted where possible in view of new and unproven standards.

  5. Combat Stress: A Collateral Effect in the Operational Effectiveness Loss Multiplier (OELM) Methodology

    DTIC Science & Technology

    2015-02-01

    Organization (NATO) Standardization Agency (NSA), NATO Glossary of Terms and Definitions (English and French), Allied Administrative Publication (AAP)-06, Edition 2012 Version 2 (hereafter referred to as AAP-06) (Belgium: NSA, 2012)… Disraelly et al., "A New Methodology for CBRN Casualty…"… Disraelly et al., A Methodology for Examining Collateral Effects on Military Operations during…

  6. A proposed standard methodology for estimating the wounding capacity of small calibre projectiles or other missiles.

    PubMed

    Berlin, R H; Janzon, B; Rybeck, B; Schantz, B; Seeman, T

    1982-01-01

    A standard methodology for estimating the energy transfer characteristics of small calibre bullets and other fast missiles is proposed, consisting of firings against targets made of soft soap. The target is evaluated by measuring the size of the permanent cavity remaining in it after the shot. The method is very simple to use and does not require access to any sophisticated measuring equipment. It can be applied under all circumstances, even under field conditions. Adequate methods of calibration to ensure good accuracy are suggested. The precision and limitations of the method are discussed.

  7. [Methodologic developmental principles of standardized surveys within the scope of social gerontologic studies].

    PubMed

    Bansemir, G

    1987-01-01

    The design and evaluation of standardized oral or written questioning as quantifying research instruments are oriented by the basic premises of the Marxist-Leninist theory of knowledge and of general scientific logic. In the present contribution the socio-gerontological research process is outlined in excerpts. By referring to the intrinsic connection between some of its essential components (problem, formation of hypotheses, derivation of indicators and measurement, preliminary examination, evaluation), as well as to typical errors and (fictitious) examples from research practice, this contribution contrasts the natural, apparently uncomplicated course of structured questioning with its qualitative methodological fundamentals and demands.

  8. Minimum Information about a Genotyping Experiment (MIGEN)

    PubMed Central

    Huang, Jie; Mirel, Daniel; Pugh, Elizabeth; Xing, Chao; Robinson, Peter N.; Pertsemlidis, Alexander; Ding, LiangHao; Kozlitina, Julia; Maher, Joseph; Rios, Jonathan; Story, Michael; Marthandan, Nishanth; Scheuermann, Richard H.

    2011-01-01

    Genotyping experiments are widely used in clinical and basic research laboratories to identify associations between genetic variations and normal/abnormal phenotypes. Genotyping assay techniques vary from single genomic regions that are interrogated using PCR reactions to high-throughput assays examining genome-wide sequence and structural variation. The resulting genotype data may include millions of markers for thousands of individuals, requiring various statistical, modeling or other data analysis methodologies to interpret the results. To date, there are no standards for reporting genotyping experiments. Here we present the Minimum Information about a Genotyping Experiment (MIGen) standard, defining the minimum information required for reporting genotyping experiments. The MIGen standard covers experimental design, subject description, genotyping procedure, quality control and data analysis. MIGen is a registered project under MIBBI (Minimum Information for Biological and Biomedical Investigations) and is being developed by an interdisciplinary group of experts in basic biomedical science, clinical science, biostatistics and bioinformatics. To accommodate the wide variety of techniques and methodologies applied in current and future genotyping experiments, MIGen leverages foundational concepts from the Ontology for Biomedical Investigations (OBI) for the description of the various types of planned processes and implements a hierarchical document structure. The adoption of MIGen by the research community will facilitate consistent genotyping data interpretation and independent data validation. MIGen can also serve as a framework for the development of data models for capturing and storing genotyping results and experiment metadata in a structured way, to facilitate the exchange of metadata. PMID:22180825
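
    To make the reporting areas concrete, the sketch below captures one genotyping experiment's minimum metadata as a structured record. The field names and example values are invented for illustration; they loosely follow the five MIGen reporting areas listed above but are not the actual MIGen schema.

    ```python
    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical field names loosely mirroring the five MIGen reporting areas
    # (experimental design, subject description, genotyping procedure,
    #  quality control, data analysis); not the actual MIGen document structure.
    @dataclass
    class GenotypingExperimentMetadata:
        study_design: str
        subject_description: str
        genotyping_platform: str
        quality_control_steps: List[str] = field(default_factory=list)
        analysis_methods: List[str] = field(default_factory=list)

    record = GenotypingExperimentMetadata(
        study_design="case-control GWAS",
        subject_description="2,000 unrelated adults, invented cohort",
        genotyping_platform="genome-wide SNP array",
        quality_control_steps=["call rate > 95%", "HWE filter", "sex check"],
        analysis_methods=["logistic regression, additive model"],
    )
    print(record)
    ```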

  9. Development of a Standardized Methodology for the Use of COSI-Corr Sub-Pixel Image Correlation to Determine Surface Deformation Patterns in Large Magnitude Earthquakes.

    NASA Astrophysics Data System (ADS)

    Milliner, C. W. D.; Dolan, J. F.; Hollingsworth, J.; Leprince, S.; Ayoub, F.

    2014-12-01

    Coseismic surface deformation is typically measured in the field by geologists and with a range of geophysical methods such as InSAR, LiDAR and GPS. Current methods, however, either fail to capture the near-field coseismic surface deformation pattern where vital information is needed, or lack pre-event data. We develop a standardized and reproducible methodology to fully constrain the surface, near-field, coseismic deformation pattern in high resolution using aerial photography. We apply our methodology using the program COSI-corr to successfully cross-correlate pairs of aerial, optical imagery acquired before and after the 1992 Mw 7.3 Landers and 1999 Mw 7.1 Hector Mine earthquakes. This technique allows measurement of the coseismic slip distribution and of the magnitude and width of off-fault deformation with sub-pixel precision. The technique can be applied in a cost-effective manner to recent and historic earthquakes using archive aerial imagery. We also use synthetic tests to constrain and correct for the bias imposed on the result by the use of a sliding window during correlation. Correcting for artificial smearing of the tectonic signal allows us to robustly measure the fault zone width along a surface rupture. Furthermore, the synthetic tests have constrained, for the first time, the measurement precision and accuracy of estimated fault displacements and fault-zone width. Our methodology provides the unique ability to robustly understand the kinematics of surface faulting while at the same time accounting for both off-fault deformation and the measurement biases that typically complicate such data. For both earthquakes we find that our displacement measurements derived from cross-correlation are systematically larger than the field displacement measurements, indicating the presence of off-fault deformation. We show that the Landers and Hector Mine earthquakes accommodated 46% and 38% of displacement away from the primary rupture as off-fault deformation, over mean deformation widths of 183 m and 133 m, respectively. We envisage that correlation results derived from our methodology will provide vital data on near-field deformation patterns and will be of significant use for constraining inversion solutions for fault slip at depth.
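
    The core idea behind sub-pixel image correlation, locating a correlation peak and then refining its position below the sample spacing, can be shown with a one-dimensional toy example. The sketch below uses plain normalized cross-correlation plus parabolic peak interpolation on synthetic signals; COSI-corr's actual frequency-domain correlator and the 2D imagery workflow are considerably more sophisticated.

    ```python
    import numpy as np

    def subpixel_shift_1d(reference, shifted):
        """Estimate a sub-pixel shift between two 1D signals: integer-lag
        cross-correlation followed by parabolic interpolation of the peak."""
        ref = (reference - reference.mean()) / reference.std()
        mov = (shifted - shifted.mean()) / shifted.std()
        corr = np.correlate(mov, ref, mode="full")
        k = int(np.argmax(corr))
        # Parabola through the peak and its neighbours gives the fractional offset.
        y0, y1, y2 = corr[k - 1], corr[k], corr[k + 1]
        delta = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
        return (k + delta) - (len(ref) - 1)

    x = np.linspace(0.0, 10.0, 500)
    dx = x[1] - x[0]
    ref = np.sin(x) + 0.3 * np.sin(3.1 * x)
    mov = np.interp(x - 0.4 * dx, x, ref)   # synthetic shift of 0.4 samples
    print(subpixel_shift_1d(ref, mov))      # expected to recover roughly +0.4
    ```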

  10. What can we learn from international comparisons of costs by DRG?

    PubMed

    Pirson, M; Schenker, L; Martins, D; Dung, Duong; Chalé, J J; Leclercq, P

    2013-02-01

    The objective of this study was to compare cost data by diagnosis related group (DRG) between Belgium and Switzerland. Our hypotheses were that differences between countries can probably be explained by methodological differences in cost calculations, by differences in medical practices and by differences in cost structures within the two countries. The classifications of DRGs used in the two countries differ (AP-DRGs version 1.7 in Switzerland and APR-DRGs version 15.0 in Belgium). The first step of this study was to transform Belgian summaries into Swiss AP-DRGs. Belgian and Swiss data were calculated with a clinical costing methodology (full costing). Belgian and Swiss costs were converted into US$ PPP (purchasing power parity) in order to neutralize differences in purchasing power between the countries. The results of this study showed higher costs in Switzerland despite standardization of cost data according to PPP. The difference is not explained by the case-mix index, because this was similar for inliers in the two countries. The length of stay (LOS) was also quite similar for inliers in the two countries. The case-mix index was, however, higher for high outliers in Belgium, as reflected in a higher LOS for these patients. Higher costs in Switzerland are thus probably explained mainly by the higher number of agency staff per service in this country or by differences in medical practices. It is possible to make international comparisons, but only if there is standardization of the case-mix between countries and only if comparable accountancy methodologies are used. Harmonization of DRG groups, nomenclature and accountancy is thus required.

  11. Preliminary estimates of the direct costs associated with endemic diseases of livestock in Great Britain.

    PubMed

    Bennett, R; Christiansen, K; Clifton-Hadley, R

    1999-04-09

    Many 'economic' studies of livestock diseases in Great Britain have been carried out over time. Most studies have considered just one or two diseases and used a different methodology and valuation base from other studies, hampering any comparative assessment of the economic impact of diseases. A standardized methodology was applied to the estimation of the direct costs to livestock production of some 30 endemic diseases/conditions of farm animals in Great Britain. This involved identification of the livestock populations at risk, estimation of the annual incidence of each disease in these populations, identification of the range and incidence of physical effects of each disease on production, valuation of the physical effects of each disease and estimation of the financial value of output losses/resource wastage due to a disease and the costs of specific treatment and prevention measures. The wider economic impacts of disease (such as the implications for human health, animal welfare and markets) were not included in the assessments. Using this standardized methodology with common financial values, a simple spreadsheet model was constructed for each disease. Given the paucity of appropriate disease data for economic assessment, 'low' and 'high' values were used to reflect uncertainties surrounding key disease parameters. Preliminary estimates of the value of disease output losses/resource wastage, treatment and prevention costs are presented for each disease. Despite the limitations of the spreadsheet models and of the estimates derived from them, we conclude that the models represent a useful start in developing a system for the comparative economic assessment of livestock diseases in Great Britain.
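
    The standardized calculation described, population at risk, incidence, per-case physical effects valued financially, plus treatment and prevention costs, maps naturally onto a small spreadsheet-style function. All numbers below are invented, and the 'low'/'high' incidence values stand in for the uncertainty ranges the authors used.

    ```python
    def disease_direct_cost(population_at_risk, incidence, loss_per_case,
                            treatment_cost_per_case, prevention_cost_per_head):
        """Annual direct cost = output losses + treatment + prevention,
        in the spirit of the spreadsheet models described (illustrative only)."""
        cases = population_at_risk * incidence
        output_losses = cases * loss_per_case
        treatment = cases * treatment_cost_per_case
        prevention = population_at_risk * prevention_cost_per_head
        return output_losses + treatment + prevention

    # Hypothetical disease, with 'low' and 'high' incidence reflecting uncertainty.
    for label, inc in [("low", 0.02), ("high", 0.06)]:
        cost = disease_direct_cost(population_at_risk=1_000_000, incidence=inc,
                                   loss_per_case=85.0, treatment_cost_per_case=12.0,
                                   prevention_cost_per_head=1.5)
        print(label, f"£{cost:,.0f}")
    ```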

  12. Methodologic European external quality assurance for DNA sequencing: the EQUALseq program.

    PubMed

    Ahmad-Nejad, Parviz; Dorn-Beineke, Alexandra; Pfeiffer, Ulrike; Brade, Joachim; Geilenkeuser, Wolf-Jochen; Ramsden, Simon; Pazzagli, Mario; Neumaier, Michael

    2006-04-01

    DNA sequencing is a key technique in molecular diagnostics, but to date no comprehensive methodologic external quality assessment (EQA) programs have been instituted. Between 2003 and 2005, the European Union funded, as specific support actions, the EQUAL initiative to develop methodologic EQA schemes for genotyping (EQUALqual), quantitative PCR (EQUALquant), and sequencing (EQUALseq). Here we report on the results of the EQUALseq program. The participating laboratories received a 4-sample set comprising 2 DNA plasmids, a PCR product, and a finished sequencing reaction to be analyzed. Data and information from detailed questionnaires were uploaded online and evaluated by use of a scoring system for technical skills and proficiency of data interpretation. Sixty laboratories from 21 European countries registered, and 43 participants (72%) returned data and samples. Capillary electrophoresis was the predominant platform (n = 39; 91%). The median contiguous correct sequence stretch was 527 nucleotides with considerable variation in quality of both primary data and data evaluation. The association between laboratory performance and the number of sequencing assays/year was statistically significant (P <0.05). Interestingly, more than 30% of participants neither added comments to their data nor made efforts to identify the gene sequences or mutational positions. Considerable variations exist even in a highly standardized methodology such as DNA sequencing. Methodologic EQAs are appropriate tools to uncover strengths and weaknesses in both technique and proficiency, and our results emphasize the need for mandatory EQAs. The results of EQUALseq should help improve the overall quality of molecular genetics findings obtained by DNA sequencing.

  13. Overview of systematic reviews of therapeutic ranges: methodologies and recommendations for practice.

    PubMed

    Cooney, Lewis; Loke, Yoon K; Golder, Su; Kirkham, Jamie; Jorgensen, Andrea; Sinha, Ian; Hawcutt, Daniel

    2017-06-02

    Many medicines are dosed to achieve a particular therapeutic range, and monitored using therapeutic drug monitoring (TDM). The evidence base for a therapeutic range can be evaluated using systematic reviews, to ensure it continues to reflect current indications, doses, routes and formulations, as well as updated adverse effect data. There is no consensus on the optimal methodology for systematic reviews of therapeutic ranges. An overview of systematic reviews of therapeutic ranges was undertaken. The following databases were used: Cochrane Database of Systematic Reviews (CDSR), Database of Abstracts of Reviews of Effects (DARE) and MEDLINE. The published methodologies used when systematically reviewing the therapeutic range of a drug were analyzed. Step-by-step recommendations to optimize such systematic reviews are proposed. Ten systematic reviews that investigated the correlation between serum concentrations and clinical outcomes, encompassing a variety of medicines and indications, were assessed. There were significant variations in the methodologies used (including the search terms used, data extraction methods, assessment of bias, and statistical analyses undertaken). Therapeutic ranges should be population and indication specific and based on clinically relevant outcomes. Recommendations for future systematic reviews based on these findings have been developed. Evidence-based therapeutic ranges have the potential to improve TDM practice. Current systematic reviews investigating therapeutic ranges have highly variable methodologies and there is no consensus of best practice when undertaking systematic reviews in this field. These recommendations meet a need not addressed by standard protocols.

  14. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    NASA Technical Reports Server (NTRS)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no single standard methodology or technique used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.

  15. A methodology to ensure and improve accuracy of Ki67 labelling index estimation by automated digital image analysis in breast cancer tissue.

    PubMed

    Laurinavicius, Arvydas; Plancoulaine, Benoit; Laurinaviciene, Aida; Herlin, Paulette; Meskauskas, Raimundas; Baltrusaityte, Indra; Besusparis, Justinas; Dasevicius, Darius; Elie, Nicolas; Iqbal, Yasir; Bor, Catherine

    2014-01-01

    Immunohistochemical Ki67 labelling index (Ki67 LI) reflects proliferative activity and is a potential prognostic/predictive marker of breast cancer. However, its clinical utility is hindered by the lack of standardized measurement methodologies. Besides tissue heterogeneity aspects, the key element of methodology remains accurate estimation of Ki67-stained/counterstained tumour cell profiles. We aimed to develop a methodology to ensure and improve accuracy of the digital image analysis (DIA) approach. Tissue microarrays (one 1-mm spot per patient, n = 164) from invasive ductal breast carcinoma were stained for Ki67 and scanned. Criterion standard (Ki67-Count) was obtained by counting positive and negative tumour cell profiles using a stereology grid overlaid on a spot image. DIA was performed with Aperio Genie/Nuclear algorithms. A bias was estimated by ANOVA, correlation and regression analyses. Calibration steps of the DIA by adjusting the algorithm settings were performed: first, by subjective DIA quality assessment (DIA-1), and second, to compensate the bias established (DIA-2). Visual estimate (Ki67-VE) on the same images was performed by five pathologists independently. ANOVA revealed significant underestimation bias (P < 0.05) for DIA-0, DIA-1 and two pathologists' VE, while DIA-2, VE-median and three other VEs were within the same range. Regression analyses revealed best accuracy for the DIA-2 (R-square = 0.90) exceeding that of VE-median, individual VEs and other DIA settings. Bidirectional bias for the DIA-2 with overestimation at low, and underestimation at high ends of the scale was detected. Measurement error correction by inverse regression was applied to improve DIA-2-based prediction of the Ki67-Count, in particular for the clinically relevant interval of Ki67-Count < 40%. Potential clinical impact of the prediction was tested by dichotomising the cases at the cut-off values of 10, 15, and 20%. A misclassification rate of 5-7% was achieved, compared to that of 11-18% for the VE-median-based prediction. Our experiments provide methodology to achieve accurate Ki67-LI estimation by DIA, based on proper validation, calibration, and measurement error correction procedures, guided by quantified bias from reference values obtained by stereology grid count. This basic validation step is an important prerequisite for high-throughput automated DIA applications to investigate tissue heterogeneity and clinical utility aspects of Ki67 and other immunohistochemistry (IHC) biomarkers.
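
    The measurement-error correction step can be pictured with synthetic numbers: fit a line relating the reference count to the DIA output, then use that line to map new DIA readings back onto the reference scale. The data, bias model and simple linear fit below are illustrative only; the published calibration and correction procedure differs in detail.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic data standing in for the paper's setting: a criterion-standard
    # Ki67 count (%) and a biased, noisy DIA estimate of the same quantity.
    ki67_count = rng.uniform(0, 40, size=164)
    dia = 2.0 + 0.85 * ki67_count + rng.normal(0.0, 2.0, size=164)

    # Inverse-regression-style correction: regress the reference value on the
    # DIA output, then use the fitted line to map new DIA readings back onto
    # the reference scale.
    slope, intercept = np.polyfit(dia, ki67_count, 1)

    def corrected_ki67(dia_value):
        return slope * dia_value + intercept

    print(corrected_ki67(15.0))   # corrected estimate for a DIA reading of 15%
    ```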

  16. A methodology to ensure and improve accuracy of Ki67 labelling index estimation by automated digital image analysis in breast cancer tissue

    PubMed Central

    2014-01-01

    Introduction Immunohistochemical Ki67 labelling index (Ki67 LI) reflects proliferative activity and is a potential prognostic/predictive marker of breast cancer. However, its clinical utility is hindered by the lack of standardized measurement methodologies. Besides tissue heterogeneity aspects, the key element of methodology remains accurate estimation of Ki67-stained/counterstained tumour cell profiles. We aimed to develop a methodology to ensure and improve accuracy of the digital image analysis (DIA) approach. Methods Tissue microarrays (one 1-mm spot per patient, n = 164) from invasive ductal breast carcinoma were stained for Ki67 and scanned. Criterion standard (Ki67-Count) was obtained by counting positive and negative tumour cell profiles using a stereology grid overlaid on a spot image. DIA was performed with Aperio Genie/Nuclear algorithms. A bias was estimated by ANOVA, correlation and regression analyses. Calibration steps of the DIA by adjusting the algorithm settings were performed: first, by subjective DIA quality assessment (DIA-1), and second, to compensate the bias established (DIA-2). Visual estimate (Ki67-VE) on the same images was performed by five pathologists independently. Results ANOVA revealed significant underestimation bias (P < 0.05) for DIA-0, DIA-1 and two pathologists’ VE, while DIA-2, VE-median and three other VEs were within the same range. Regression analyses revealed best accuracy for the DIA-2 (R-square = 0.90) exceeding that of VE-median, individual VEs and other DIA settings. Bidirectional bias for the DIA-2 with overestimation at low, and underestimation at high ends of the scale was detected. Measurement error correction by inverse regression was applied to improve DIA-2-based prediction of the Ki67-Count, in particular for the clinically relevant interval of Ki67-Count < 40%. Potential clinical impact of the prediction was tested by dichotomising the cases at the cut-off values of 10, 15, and 20%. Misclassification rate of 5-7% was achieved, compared to that of 11-18% for the VE-median-based prediction. Conclusions Our experiments provide methodology to achieve accurate Ki67-LI estimation by DIA, based on proper validation, calibration, and measurement error correction procedures, guided by quantified bias from reference values obtained by stereology grid count. This basic validation step is an important prerequisite for high-throughput automated DIA applications to investigate tissue heterogeneity and clinical utility aspects of Ki67 and other immunohistochemistry (IHC) biomarkers. PMID:24708745

  17. Investigation of Radiation Protection Methodologies for Radiation Therapy Shielding Using Monte Carlo Simulation and Measurement

    NASA Astrophysics Data System (ADS)

    Tanny, Sean

    The advent of high-energy linear accelerators for dedicated medical use in the 1950s, by Henry Kaplan and the Stanford University physics department, began a revolution in radiation oncology. Today, linear accelerators are the standard of care for modern radiation therapy and can generate high-energy beams that produce tens of Gy per minute at isocenter. This creates a need for a large amount of shielding material to properly protect members of the public and hospital staff. Standardized vault designs and guidance on the shielding properties of various materials are provided by the National Council on Radiation Protection (NCRP) Report 151. However, physicists are seeking ways to minimize the footprint and volume of shielding material needed, which leads to the use of non-standard vault configurations and less-studied materials, such as high-density concrete. The University of Toledo Dana Cancer Center has utilized both of these methods to minimize the cost and spatial footprint of the requisite radiation shielding. To ensure a safe work environment, computer simulations were performed to verify the attenuation properties and shielding workloads produced by a variety of situations where standard recommendations and guidance documents were insufficient. This project studies two areas of concern that are not addressed by NCRP 151: the radiation shielding workload for a vault door with a non-standard design, and the attenuation properties of high-density concrete for both photon and neutron radiation. Simulations have been performed using a Monte Carlo code produced by the Los Alamos National Laboratory (LANL), Monte Carlo N-Particle 5 (MCNP5). Measurements have been performed using a shielding test port designed into the maze of the Varian Edge treatment vault.
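
    For orientation, the conventional NCRP 151 primary-barrier calculation that such vault designs start from can be written in a few lines: the required transmission factor follows from the design dose limit, distance, workload, use and occupancy factors, and the barrier thickness then follows from tenth-value layers. The numbers below (a hypothetical 6 MV wall in ordinary concrete) are illustrative only and are unrelated to the thesis's non-standard door or high-density concrete results.

    ```python
    import math

    def primary_barrier_thickness(P, d, W, U, T, tvl1, tvle):
        """Sketch of an NCRP-151-style primary barrier calculation:
        P    design dose limit beyond the barrier (Sv/week)
        d    distance from target to the protected point (m)
        W    workload at 1 m (Gy/week); U use factor; T occupancy factor
        tvl1, tvle  first and equilibrium tenth-value layers (m)."""
        B = P * d**2 / (W * U * T)          # required transmission factor
        n = -math.log10(B)                  # number of tenth-value layers
        return tvl1 + (n - 1.0) * tvle

    # Hypothetical 6 MV vault wall in ordinary concrete (illustrative inputs).
    t = primary_barrier_thickness(P=1e-4, d=6.0, W=450.0, U=0.25, T=1.0,
                                  tvl1=0.37, tvle=0.33)
    print(f"{t:.2f} m of concrete")
    ```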

  18. Use of simulated experiments for material characterization of brittle materials subjected to high strain rate dynamic tension

    PubMed Central

    Saletti, Dominique

    2017-01-01

    Rapid progress in ultra-high-speed imaging has allowed material properties to be studied at high strain rates by applying full-field measurements and inverse identification methods. Nevertheless, the sensitivity of these techniques still requires a better understanding, since various extrinsic factors present during an actual experiment make it difficult to separate different sources of error that can significantly affect the quality of the identified results. This study presents a methodology using simulated experiments to investigate the accuracy of the so-called spalling technique (used to study tensile properties of concrete subjected to high strain rates) by numerically simulating the entire identification process. The experimental technique uses the virtual fields method and the grid method. The methodology consists of reproducing the recording process of an ultra-high-speed camera by generating sequences of synthetically deformed images of a sample surface, which are then analysed using the standard tools. The investigation of the uncertainty of the identified parameters, such as Young's modulus along with the stress–strain constitutive response, is addressed by introducing the most significant user-dependent parameters (i.e. acquisition speed, camera dynamic range, grid sampling, blurring), demonstrating that the technique used can be an effective tool for error investigation. This article is part of the themed issue ‘Experimental testing and modelling of brittle materials at high strain rates’. PMID:27956505

  19. Evaluation of Behaviours of Laminated Glass

    NASA Astrophysics Data System (ADS)

    Sable, L.; Japins, G.; Kalnins, K.

    2015-11-01

    The visual appearance of building facades and other load-bearing structures, which are now part of modern architecture, is the reason why it is important to investigate the reliability of laminated glass for civil structures in more detail. Laminated glass in particular has become one of the fashionable materials; Apple© stores, for example, combine load-carrying capacity with a transparent appearance. Glass has high mechanical strength and moderate density; however, the risk of sudden brittle failure, as with concrete or other ceramics, leads to relatively high conservatism in the design practice for glass structures. This should change as consumer requirements evolve, calling for a safe and reliable design methodology and corresponding building standards. A design methodology for glass and glass laminates should be urgently developed and included as a chapter in the Eurocode. This paper presents an initial experimental investigation of the behaviour of simple glass sheets and laminated glass samples in a four-point bending test. The aim of the current research is to investigate the characteristic values of laminated glass and to verify the experimental results obtained against the finite element method for glass and EVA material, in line with a future European Structural Design of Glass Components code.
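
    As background to the test itself, the maximum tensile stress in a four-point bending test follows from elementary beam theory. The sketch below applies the generic formula to an invented glass strip treated as a monolithic section; real laminated glass additionally requires accounting for interlayer shear coupling, which is precisely what such experiments and a future design code must address.

    ```python
    def four_point_bending_stress(F_total, support_span, load_span, width, thickness):
        """Maximum tensile stress (Pa) in a prismatic beam under four-point bending,
        from elementary beam theory: sigma = 3 F (L - Li) / (2 b h^2),
        where F_total is the combined load on both loading rollers."""
        return 3.0 * F_total * (support_span - load_span) / (2.0 * width * thickness**2)

    # Hypothetical glass strip: 1 kN total load, 1.0 m support span, 0.5 m load span,
    # 0.36 m wide, 12 mm thick, treated as a single monolithic section.
    sigma = four_point_bending_stress(1_000.0, 1.0, 0.5, 0.36, 0.012)
    print(f"{sigma / 1e6:.1f} MPa")
    ```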

  20. Peptide biomarkers as a way to determine meat authenticity.

    PubMed

    Sentandreu, Miguel Angel; Sentandreu, Enrique

    2011-11-01

    Meat fraud involves many illegal procedures affecting the composition of meat and meat products, commonly carried out with the aim of increasing profit. These practices need to be controlled by legal authorities by means of robust, accurate and sensitive methodologies capable of ensuring that fraudulent or accidental mislabelling does not occur. Common strategies traditionally used to assess meat authenticity have been based on methods such as chemometric analysis of large analytical data sets, immunoassays or DNA analysis. The identification of peptide biomarkers specific to a particular meat species, tissue or ingredient by proteomic technologies constitutes an interesting and promising alternative to existing methodologies due to its high discriminating power, robustness and sensitivity. The possibility of developing standardized protein extraction protocols, together with the considerably higher resistance of peptide sequences to food processing as compared with DNA sequences, would overcome some of the limitations currently existing for quantitative determinations in highly processed food samples. The use of routine mass spectrometry equipment would make the technology suitable for control laboratories. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Biologically-inspired data decorrelation for hyper-spectral imaging

    NASA Astrophysics Data System (ADS)

    Picon, Artzai; Ghita, Ovidiu; Rodriguez-Vaamonde, Sergio; Iriondo, Pedro Ma; Whelan, Paul F.

    2011-12-01

    Hyper-spectral data allow the construction of more robust statistical models to sample the material properties than the standard tri-chromatic color representation. However, because of the large dimensionality and complexity of the hyper-spectral data, the extraction of robust features (image descriptors) is not a trivial issue. Thus, to facilitate efficient feature extraction, decorrelation techniques are commonly applied to reduce the dimensionality of the hyper-spectral data with the aim of generating compact and highly discriminative image descriptors. Current methodologies for data decorrelation, such as principal component analysis (PCA), linear discriminant analysis (LDA), wavelet decomposition (WD), or band selection methods, require complex and subjective training procedures; in addition, the compressed spectral information is not directly related to the physical (spectral) characteristics associated with the analyzed materials. The major objective of this article is to introduce and evaluate a new data decorrelation methodology using an approach that closely emulates human vision. The proposed data decorrelation scheme has been employed to optimally minimize the amount of redundant information contained in the highly correlated hyper-spectral bands and has been comprehensively evaluated in the context of non-ferrous material classification.
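
    For reference, the PCA baseline named above can be written compactly: flatten the cube to pixels-by-bands, centre it, and project onto the leading eigenvectors of the band covariance matrix. The sketch below is that standard baseline applied to random data, not the article's biologically-inspired scheme.

    ```python
    import numpy as np

    def pca_decorrelate(hyperspectral_cube, n_components=10):
        """Reduce a (rows, cols, bands) hyperspectral cube to n_components
        decorrelated channels with plain PCA -- the standard baseline the
        article compares against, not its biologically-inspired method."""
        rows, cols, bands = hyperspectral_cube.shape
        X = hyperspectral_cube.reshape(-1, bands).astype(float)
        X -= X.mean(axis=0)                             # centre each band
        cov = np.cov(X, rowvar=False)                   # band covariance matrix
        eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
        top = eigvecs[:, ::-1][:, :n_components]        # leading components
        return (X @ top).reshape(rows, cols, n_components)

    cube = np.random.default_rng(0).normal(size=(64, 64, 120))
    print(pca_decorrelate(cube, n_components=5).shape)  # (64, 64, 5)
    ```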

  2. Teaching mathematical word problem solving: the quality of evidence for strategy instruction priming the problem structure.

    PubMed

    Jitendra, Asha K; Petersen-Brown, Shawna; Lein, Amy E; Zaslofsky, Anne F; Kunkel, Amy K; Jung, Pyung-Gang; Egan, Andrea M

    2015-01-01

    This study examined the quality of the research base related to strategy instruction priming the underlying mathematical problem structure for students with learning disabilities and those at risk for mathematics difficulties. We evaluated the quality of methodological rigor of 18 group research studies using the criteria proposed by Gersten et al. and 10 single case design (SCD) research studies using criteria suggested by Horner et al. and the What Works Clearinghouse. Results indicated that 14 group design studies met the criteria for high-quality or acceptable research, whereas SCD studies did not meet the standards for an evidence-based practice. Based on these findings, strategy instruction priming the mathematics problem structure is considered an evidence-based practice using only group design methodological criteria. Implications for future research and for practice are discussed. © Hammill Institute on Disabilities 2013.

  3. Workplace Learning for the Public Good: Implementation of a Standardized, Competency-Based Curriculum in Texas WIC

    ERIC Educational Resources Information Center

    Kessler, Seth A.; Horton, Karissa D.; Gottlieb, Nell H.; Atwood, Robin

    2012-01-01

    Purpose: The purpose of this study is to describe preceptors' experiences after implementing a workplace learning program in Texas WIC (Women, Infants, and Children) agencies and identify implementation best practices. Design/methodology/approach: This research used qualitative description methodology. Data collection consisted of 11…

  4. Theater Level War Games.

    DTIC Science & Technology

    1982-06-02

    to Army Modeling efforts. Include design for future priorities and specific actions. (13) Establish standards, methodology and formats for exter…with models and the wider technological-scientific-academic community, (4) increased centralized management of data, and (5) design of a proactive… [table-of-contents fragments: …and Objectives; Purposes and Preliminary Results; Scope of Study; Methodology]

  5. Application of Resource Description Framework to Personalise Learning: Systematic Review and Methodology

    ERIC Educational Resources Information Center

    Jevsikova, Tatjana; Berniukevicius, Andrius; Kurilovas, Eugenijus

    2017-01-01

    The paper aims to present a methodology of learning personalisation based on applying the Resource Description Framework (RDF) standard model. Research results are two-fold: first, the results of a systematic literature review on Linked Data, RDF "subject-predicate-object" triples, and Web Ontology Language (OWL) application in education…
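
    The RDF model referred to above expresses every fact as a subject-predicate-object triple. The toy example below writes a few such triples as plain Python tuples (a real implementation would use IRIs and an RDF library) and shows how a personalisation rule can be phrased as a query over them; all resource names are invented.

    ```python
    # Invented example resources written as plain tuples; real RDF would use
    # IRIs and a library such as rdflib.
    triples = [
        ("ex:LearningObject42", "ex:coversTopic",        "ex:Fractions"),
        ("ex:LearningObject42", "ex:hasDifficultyLevel", "intermediate"),
        ("ex:StudentA",         "ex:currentLevel",       "intermediate"),
    ]

    # A simple personalisation rule as a query over the triples: find learning
    # objects on the wanted topic at the student's current level.
    wanted_topic, wanted_level = "ex:Fractions", "intermediate"
    on_topic = {s for s, p, o in triples if p == "ex:coversTopic" and o == wanted_topic}
    at_level = {s for s, p, o in triples if p == "ex:hasDifficultyLevel" and o == wanted_level}
    print(on_topic & at_level)
    ```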

  6. Classification of Word Levels with Usage Frequency, Expert Opinions and Machine Learning

    ERIC Educational Resources Information Center

    Sohsah, Gihad N.; Ünal, Muhammed Esad; Güzey, Onur

    2015-01-01

    Educational applications for language teaching can utilize the language levels of words to target proficiency levels of students. This paper and the accompanying data provide a methodology for making educational standard-aligned language-level predictions for all English words. The methodology involves expert opinions on language levels and…

  7. Towards a Trans-Disciplinary Methodology for a Game-Based Intervention Development Process

    ERIC Educational Resources Information Center

    Arnab, Sylvester; Clarke, Samantha

    2017-01-01

    The application of game-based learning adds play into educational and instructional contexts. Even though there is a lack of standard methodologies or formulaic frameworks to better inform game-based intervention development, there exist scientific and empirical studies that can serve as benchmarks for establishing scientific validity in terms of…

  8. Measurement of Workforce Readiness Competencies: Design of Prototype Measures.

    ERIC Educational Resources Information Center

    O'Neil, Harold F., Jr.; And Others

    A general methodological approach is suggested for the measurement of workforce readiness competencies, in the context of overall work by the National Center for Research on Evaluation, Standards, and Student Testing on the domain-independent measurement of workforce readiness skills. The methodology consists of 14 steps, from the initial selection of a…

  9. Traditional vs. Experiential: A Comparative Study of Instructional Methodologies on Student Achievement in New York City Public Schools

    ERIC Educational Resources Information Center

    Mohan, Subhas

    2015-01-01

    This study explores the differences in student achievement on state standardized tests between experiential learning and direct learning instructional methodologies. Specifically, the study compares student performances in Expeditionary Learning schools, which is a Comprehensive School Reform model that utilizes experiential learning, to their…

  10. Addressing the English Language Arts Technology Standard in a Secondary Reading Methodology Course.

    ERIC Educational Resources Information Center

    Merkley, Donna J.; Schmidt, Denise A.; Allen, Gayle

    2001-01-01

    Describes efforts to integrate technology into a reading methodology course for secondary English majors. Discusses the use of e-mail, multimedia, distance education for videoconferences, online discussion technology, subject-specific software, desktop publishing, a database management system, a concept mapping program, and the use of the World…

  11. A Protean Practice? Perspectives on the Practice of Action Learning

    ERIC Educational Resources Information Center

    Brook, Cheryl; Pedler, Mike; Burgoyne, John G

    2013-01-01

    Purpose: The purpose of the paper is to assess the extent to which these practitioners' perspectives and practices match Willis's conception of a Revans "gold standard" of action learning. Design/methodology/approach: This study adopts a qualitative design and methodology based on interviews and the collection of cases or accounts of…

  12. Decoding the Disciplines: An Approach to Scientific Thinking

    ERIC Educational Resources Information Center

    Pinnow, Eleni

    2016-01-01

    The Decoding the Disciplines methodology aims to teach students to think like experts in discipline-specific tasks. The central aspect of the methodology is to identify a bottleneck in the course content: a particular topic that a substantial number of students struggle to master. The current study compared the efficacy of standard lecture and…

  13. Standardized Analytical Methods for Environmental Restoration Following Homeland Security Events

    USDA-ARS?s Scientific Manuscript database

    Methodology was formulated for use in the event of a terrorist attack using a variety of chemical, radioactive, biological, and toxic agents. Standardized analysis procedures were determined for use should these events occur. This publication is annually updated....

  14. 40 CFR 230.95 - Ecological performance standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Ecological performance standards. 230.95 Section 230.95 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) OCEAN DUMPING... functional capacity described in functional assessment methodologies, measurements of hydrology or other...

  15. Lessons Learned about the Methodology of Economic Impact Studies: The NIST Experience.

    ERIC Educational Resources Information Center

    Tassey, Gregory

    1999-01-01

    Summarizes ongoing economic impact assessment activities at the National Institute of Standards and Technology (NIST) for its Measurement and Standards Laboratory Program. Explores designing economic impact studies for integration into assessments of broader programmatic objectives. (SLD)

  16. Standards to Assure Quality in Tertiary Education: The Case of Tanzania

    ERIC Educational Resources Information Center

    Manyaga, Timothy

    2008-01-01

    Purpose: The purpose of this paper is to provide information on development of standards in Tanzania which may be of help to training providers in other countries as they seek to improve the quality and standards of their provision. Design/methodology/approach: The need to provide quality assured tertiary qualifications in Tanzania to win both…

  17. A review of the solar array manufacturing industry costing standards

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The solar array manufacturing industry costing standards model is designed to compare the cost of producing solar arrays using alternative manufacturing processes. Constructive criticism of the methodology used is intended to enhance its implementation as a practical design tool. The three main elements of the procedure are workbook format and presentation, theoretical model validity, and standard financial parameters.

  18. Issues of E-Learning Standards and Identity Management for Mobility and Collaboration in Higher Education

    ERIC Educational Resources Information Center

    Alves, Paulo; Uhomoibhi, James

    2010-01-01

    Purpose: This paper seeks to investigate and report on the status of identity management systems and e-learning standards across Europe for promoting mobility, collaboration and the sharing of contents and services in higher education institutions. Design/methodology/approach: The present research work examines existing e-learning standards and…

  19. Peculiarities of the Application of Income Tax Standards by the Subsidiary Company in the Russian Accounting Practice

    ERIC Educational Resources Information Center

    Ermakova, Natalya A.; Gudshatullaeva, Elena M.

    2016-01-01

    The aim of this work is to analyze a subsidiary company's practical application of the accounting regulation "Accounting of settlements on income tax" (AR 18/02) and the correlation of the methodology for the resulting indicators with International Accounting Standard (IAS) 12, "Income taxes", in the formation of the…

  20. Methodological Choices in the Content Analysis of Textbooks for Measuring Alignment with Standards

    ERIC Educational Resources Information Center

    Polikoff, Morgan S.; Zhou, Nan; Campbell, Shauna E.

    2015-01-01

    With the recent adoption of the Common Core standards in many states, there is a need for quality information about textbook alignment to standards. While there are many existing content analysis procedures, these generally have little, if any, validity or reliability evidence. One exception is the Surveys of Enacted Curriculum (SEC), which has…

  1. Methodologic quality and relevance of references in pharmaceutical advertisements in a Canadian medical journal.

    PubMed

    Lexchin, J; Holbrook, A

    1994-07-01

    To evaluate the methodologic quality and relevance of references in pharmaceutical advertisements in the Canadian Medical Association Journal (CMAJ). Analytic study. All 114 references cited in the first 22 distinct pharmaceutical advertisements in volume 146 of CMAJ. Mean methodologic quality score (modified from the 6-point scale used to assess articles in the American College of Physicians' Journal Club) and mean relevance score (based on a new 5-point scale) for all references in each advertisement. Twenty of the 22 companies responded, sending 78 (90%) of the 87 references requested. The mean methodologic quality score was 58% (95% confidence limits [CL] 51% and 65%) and the mean relevance score 76% (95% CL 72% and 80%). The two mean scores were statistically lower than the acceptable score of 80% (p < 0.05), and the methodologic quality score was outside the preset clinically significant difference of 15%. The poor rating for methodologic quality was primarily because of the citation of references to low-quality review articles and "other" sources (i.e., other than reports of clinical trials). Half of the advertisements had a methodologic quality score of less than 65%, but only five had a relevance score of less than 65%. Although the relevance of most of the references was within minimal acceptable limits, the methodologic quality was often unacceptable. Because advertisements are an important part of pharmaceutical marketing and education, we suggest that companies develop written standards for their advertisements and monitor their advertisements for adherence to these standards. We also suggest that the Pharmaceutical Advertising Advisory Board develop more stringent guidelines for advertising and that it enforce these guidelines in a consistent, rigorous fashion.

  2. Methodologic quality and relevance of references in pharmaceutical advertisements in a Canadian medical journal.

    PubMed Central

    Lexchin, J; Holbrook, A

    1994-01-01

    OBJECTIVE: To evaluate the methodologic quality and relevance of references in pharmaceutical advertisements in the Canadian Medical Association Journal (CMAJ). DESIGN: Analytic study. DATA SOURCE: All 114 references cited in the first 22 distinct pharmaceutical advertisements in volume 146 of CMAJ. MAIN OUTCOME MEASURES: Mean methodologic quality score (modified from the 6-point scale used to assess articles in the American College of Physicians' Journal Club) and mean relevance score (based on a new 5-point scale) for all references in each advertisement. MAIN RESULTS: Twenty of the 22 companies responded, sending 78 (90%) of the 87 references requested. The mean methodologic quality score was 58% (95% confidence limits [CL] 51% and 65%) and the mean relevance score 76% (95% CL 72% and 80%). The two mean scores were statistically lower than the acceptable score of 80% (p < 0.05), and the methodologic quality score was outside the preset clinically significant difference of 15%. The poor rating for methodologic quality was primarily because of the citation of references to low-quality review articles and "other" sources (i.e., other than reports of clinical trials). Half of the advertisements had a methodologic quality score of less than 65%, but only five had a relevance score of less than 65%. CONCLUSIONS: Although the relevance of most of the references was within minimal acceptable limits, the methodologic quality was often unacceptable. Because advertisements are an important part of pharmaceutical marketing and education, we suggest that companies develop written standards for their advertisements and monitor their advertisements for adherence to these standards. We also suggest that the Pharmaceutical Advertising Advisory Board develop more stringent guidelines for advertising and that it enforce these guidelines in a consistent, rigorous fashion. PMID:8004560

  3. Procurement Contracting Officer’s Guide to Cost Accounting Standards,

    DTIC Science & Technology

    1977-09-01

    ...discussing the history and development of Cost Accounting Standards, the functions of the Cost Accounting Standards Board, and the methodology... the tasks that Cost Accounting Standards have placed on the procurement officer. By understanding these tasks the...

  4. Protocol-developing meta-ethnography reporting guidelines (eMERGe).

    PubMed

    France, E F; Ring, N; Noyes, J; Maxwell, M; Jepson, R; Duncan, E; Turley, R; Jones, D; Uny, I

    2015-11-25

    Designing and implementing high-quality health care services and interventions requires robustly synthesised evidence. Syntheses of qualitative research studies can provide evidence of patients' experiences of health conditions; intervention feasibility, appropriateness and acceptability to patients; and advance understanding of health care issues. The unique, interpretive, theory-based meta-ethnography synthesis approach is suited to conveying patients' views and developing theory to inform service design and delivery. However, meta-ethnography reporting is often poor quality, which discourages trust in, and use of, meta-ethnography findings. Users of evidence syntheses require reports that clearly articulate analytical processes and findings. Tailored research reporting guidelines can raise reporting standards but none exists for meta-ethnography. This study aims to create an evidence-based meta-ethnography reporting guideline articulating the methodological standards and depth of reporting required to improve reporting quality. The mixed-methods design of this National Institute of Health Research-funded study (http://www.stir.ac.uk/emerge/) follows good practice in research reporting guideline development comprising: (1) a methodological systematic review (PROSPERO registration: CRD42015024709) to identify recommendations and guidance in conducting/reporting meta-ethnography; (2) a review and audit of published meta-ethnographies to identify good practice principles and develop standards in conduct/reporting; (3) an online workshop and Delphi studies to agree guideline content with 45 international qualitative synthesis experts and 45 other stakeholders including patients; (4) development and wide dissemination of the guideline and its accompanying detailed explanatory document, a report template for National Institute of Health Research commissioned meta-ethnographies, and training materials on guideline use. Meta-ethnography, devised in the field of education, is now used widely in other disciplines. Methodological advances relevant to meta-ethnography conduct exist. The extent of discipline-specific adaptations of meta-ethnography and the fit of any adaptions with the underpinning philosophy of meta-ethnography require investigation. Well-reported meta-ethnography findings could inform clinical decision-making. A bespoke meta-ethnography reporting guideline is needed to improve reporting quality, but to be effective potential users must know it exists, trust it and use it. Therefore, a rigorous study has been designed to develop and promote a guideline. By raising reporting quality, the guideline will maximise the likelihood that high-quality meta-ethnographies will contribute robust evidence to improve health care and patient outcomes.

  5. Application of acetone acetals as water scavengers and derivatization agents prior to the gas chromatographic analysis of polar residual solvents in aqueous samples.

    PubMed

    van Boxtel, Niels; Wolfs, Kris; Van Schepdael, Ann; Adams, Erwin

    2015-12-18

    The sensitivity of gas chromatography (GC) combined with the full evaporation technique (FET) for the analysis of aqueous samples is limited due to the maximum tolerable sample volume in a headspace vial. Using an acetone acetal as water scavenger prior to FET-GC analysis proved to be a useful and versatile tool for the analysis of high boiling analytes in aqueous samples. 2,2-Dimethoxypropane (DMP) was used in this case resulting in methanol and acetone as reaction products with water. These solvents are relatively volatile and were easily removed by evaporation enabling sample enrichment leading to 10-fold improvement in sensitivity compared to the standard 10μL FET sample volumes for a selection of typical high boiling polar residual solvents in water. This could be improved even further if more sample is used. The method was applied for the determination of residual NMP in an aqueous solution of a cefotaxime analogue and proved to be considerably better than conventional static headspace (sHS) and the standard FET approach. The methodology was also applied to determine trace amounts of ethylene glycol (EG) in aqueous samples like contact lens fluids, where scavenging of the water would avoid laborious extraction prior to derivatization. During this experiment it was revealed that DMP reacts quantitatively with EG to form 2,2-dimethyl-1,3-dioxolane (2,2-DD) under the proposed reaction conditions. The relatively high volatility (bp 93°C) of 2,2-DD makes it possible to perform analysis of EG using the sHS methodology making additional derivatization reactions superfluous. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Estimating chlorophyll content and photochemical yield of photosystem II (ΦPSII) using solar-induced chlorophyll fluorescence measurements at different growing stages of attached leaves

    PubMed Central

    Tubuxin, Bayaer; Rahimzadeh-Bajgiran, Parinaz; Ginnan, Yusaku; Hosoi, Fumiki; Omasa, Kenji

    2015-01-01

    This paper illustrates the possibility of measuring chlorophyll (Chl) content and Chl fluorescence parameters by the solar-induced Chl fluorescence (SIF) method using the Fraunhofer line depth (FLD) principle, and compares the results with the standard measurement methods. A high-spectral resolution HR2000+ and an ordinary USB4000 spectrometer were used to measure leaf reflectance under solar and artificial light, respectively, to estimate Chl fluorescence. Using leaves of Capsicum annuum cv. ‘Sven’ (paprika), the relationships between the Chl content and the steady-state Chl fluorescence near oxygen absorption bands of O2B (686nm) and O2A (760nm), measured under artificial and solar light at different growing stages of leaves, were evaluated. The Chl fluorescence yields of ΦF 686nm/ΦF 760nm ratios obtained from both methods correlated well with the Chl content (steady-state solar light: R2 = 0.73; artificial light: R2 = 0.94). The SIF method was less accurate for Chl content estimation when Chl content was high. The steady-state solar-induced Chl fluorescence yield ratio correlated very well with the artificial-light-induced one (R2 = 0.84). A new methodology is then presented to estimate photochemical yield of photosystem II (ΦPSII) from the SIF measurements, which was verified against the standard Chl fluorescence measurement method (pulse-amplitude modulated method). The high coefficient of determination (R2 = 0.74) between the ΦPSII of the two methods shows that photosynthesis process parameters can be successfully estimated using the presented methodology. PMID:26071530
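
    As an illustration of the Fraunhofer line depth (FLD) principle referred to above, the sketch below shows the standard FLD retrieval of fluorescence from irradiance (E) and radiance (L) measured inside and just outside an oxygen absorption band. It is a minimal, hypothetical Python example; it is not code or data from the study, and the input values are invented.

    def fld_fluorescence(e_in, l_in, e_out, l_out):
        """Estimate solar-induced fluorescence with the standard FLD formula."""
        # E = solar irradiance, L = target radiance; "in" = inside the absorption
        # band (e.g. O2A at 760 nm), "out" = a nearby band shoulder.
        return (e_out * l_in - e_in * l_out) / (e_out - e_in)

    # Invented example values (mW m-2 sr-1 nm-1), for illustration only.
    print(fld_fluorescence(e_in=30.0, l_in=8.5, e_out=120.0, l_out=31.0))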

  7. Novel methodology to isolate microplastics from vegetal-rich samples.

    PubMed

    Herrera, Alicia; Garrido-Amador, Paloma; Martínez, Ico; Samper, María Dolores; López-Martínez, Juan; Gómez, May; Packard, Theodore T

    2018-04-01

    Microplastics are small plastic particles, globally distributed throughout the oceans. To properly study them, all the methodologies for their sampling, extraction, and measurement should be standardized. For heterogeneous samples containing sediments, animal tissues and zooplankton, several procedures have been described. However, definitive methodologies for samples rich in algae and plant material have not yet been developed. The aim of this study was to find the best extraction protocol for vegetal-rich samples by comparing the efficacies of five previously described digestion methods, and a novel density separation method. A protocol using 96% ethanol for density separation was better than the five digestion methods tested, even better than using H2O2 digestion. As it was the most efficient, simple, safe and inexpensive method for isolating microplastics from vegetal-rich samples, we recommend it as a standard separation method. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Traumatic brain injury: methodological approaches to estimate health and economic outcomes.

    PubMed

    Lu, Juan; Roe, Cecilie; Aas, Eline; Lapane, Kate L; Niemeier, Janet; Arango-Lasprilla, Juan Carlos; Andelic, Nada

    2013-12-01

    The effort to standardize the methodology and adherence to recommended principles for all economic evaluations has been emphasized in medical literature. The objective of this review is to examine whether economic evaluations in traumatic brain injury (TBI) research have been compliant with existing guidelines. Medline search was performed between January 1, 1995 and August 11, 2012. All original TBI-related full economic evaluations were included in the study. Two authors independently rated each study's methodology and data presentation to determine compliance to the 10 methodological principles recommended by Blackmore et al. Descriptive analysis was used to summarize the data. Inter-rater reliability was assessed with Kappa statistics. A total of 28 studies met the inclusion criteria. Eighteen of these studies described cost-effectiveness, seven cost-benefit, and three cost-utility analyses. The results showed a rapid growth in the number of published articles on the economic impact of TBI since 2000 and an improvement in their methodological quality. However, overall compliance with recommended methodological principles of TBI-related economic evaluation has been deficient. On average, about six of the 10 criteria were followed in these publications, and only two articles met all 10 criteria. These findings call for an increased awareness of the methodological standards that should be followed by investigators both in performance of economic evaluation and in reviews of evaluation reports prior to publication. The results also suggest that all economic evaluations should be made by following the guidelines within a conceptual framework, in order to facilitate evidence-based practices in the field of TBI.

  9. Application of quality improvement analytic methodology in emergency medicine research: A comparative evaluation.

    PubMed

    Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B

    2018-05-30

    Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
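
    The control-chart analysis described above can be illustrated with a minimal individuals (XmR) chart, a common QI tool that preserves the time sequence of the data. The sketch below is hypothetical Python, not the authors' analysis; the length-of-stay values are invented and 2.66 is the standard XmR moving-range constant.

    import statistics

    def xmr_limits(values):
        """Centre line and natural process limits for an individuals (XmR) chart."""
        centre = statistics.mean(values)
        moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
        mr_bar = statistics.mean(moving_ranges)
        # 2.66 * mean moving range approximates three-sigma limits for individual points.
        return centre, centre - 2.66 * mr_bar, centre + 2.66 * mr_bar

    # Invented monthly mean ED length-of-stay values (minutes).
    centre, lcl, ucl = xmr_limits([128, 131, 125, 129, 122, 118, 115, 117, 112, 114, 110])
    print(f"centre={centre:.1f}  LCL={lcl:.1f}  UCL={ucl:.1f}")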

  10. User's manual for PRESTO: A computer code for the performance of regenerative steam turbine cycles

    NASA Technical Reports Server (NTRS)

    Fuller, L. C.; Stovall, T. K.

    1979-01-01

    Standard turbine cycles for baseload power plants and cycles with such additional features as process steam extraction and induction and feedwater heating by external heat sources may be modeled. Peaking and high back pressure cycles are also included. The code's methodology is to use the expansion line efficiencies, exhaust loss, leakages, mechanical losses, and generator losses to calculate the heat rate and generator output. A general description of the code is given as well as the instructions for input data preparation. Appended are two complete example cases.
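
    A minimal sketch of the kind of bookkeeping such a cycle code performs, assuming a simple relation between cycle heat input, turbine shaft work after losses, and generator output. The function and all numbers are illustrative assumptions, not taken from PRESTO itself.

    def generator_output_and_heat_rate(heat_input_kw, turbine_work_kw,
                                       mechanical_losses_kw, generator_efficiency):
        """Illustrative heat-rate bookkeeping: shaft work minus mechanical losses,
        scaled by generator efficiency, gives output; heat rate is heat input per
        unit of electrical output."""
        output_kw = (turbine_work_kw - mechanical_losses_kw) * generator_efficiency
        heat_rate_kj_per_kwh = heat_input_kw * 3600.0 / output_kw
        return output_kw, heat_rate_kj_per_kwh

    # Invented round numbers for a large baseload unit.
    print(generator_output_and_heat_rate(heat_input_kw=2_200_000,
                                         turbine_work_kw=1_020_000,
                                         mechanical_losses_kw=8_000,
                                         generator_efficiency=0.985))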

  11. Computation of Nonlinear Backscattering Using a High-Order Numerical Method

    NASA Technical Reports Server (NTRS)

    Fibich, G.; Ilan, B.; Tsynkov, S.

    2001-01-01

    The nonlinear Schrodinger equation (NLS) is the standard model for propagation of intense laser beams in Kerr media. The NLS is derived from the nonlinear Helmholtz equation (NLH) by employing the paraxial approximation and neglecting the backscattered waves. In this study we use a fourth-order finite-difference method supplemented by special two-way artificial boundary conditions (ABCs) to solve the NLH as a boundary value problem. Our numerical methodology allows for a direct comparison of the NLH and NLS models and for an accurate quantitative assessment of the backscattered signal.
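
    For readers unfamiliar with high-order stencils, the sketch below shows a standard fourth-order central-difference approximation of a second derivative, the kind of building block such a scheme rests on. It is a generic Python illustration, not the NLH solver or its two-way artificial boundary conditions.

    import numpy as np

    def second_derivative_4th_order(f, h):
        """Fourth-order central-difference approximation of f'' on a uniform grid.

        Interior stencil: (-f[i-2] + 16 f[i-1] - 30 f[i] + 16 f[i+1] - f[i+2]) / (12 h**2).
        The two points at each end are left as NaN for brevity.
        """
        d2 = np.full(f.shape, np.nan)
        d2[2:-2] = (-f[:-4] + 16 * f[1:-3] - 30 * f[2:-2]
                    + 16 * f[3:-1] - f[4:]) / (12 * h**2)
        return d2

    # Sanity check on sin(x), whose exact second derivative is -sin(x).
    x = np.linspace(0.0, 2.0 * np.pi, 201)
    approx = second_derivative_4th_order(np.sin(x), x[1] - x[0])
    print(np.nanmax(np.abs(approx + np.sin(x))))  # small: the error scales as h**4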

  12. Low cost composite manufacturing utilizing intelligent pultrusion and resin transfer molding (IPRTM)

    NASA Astrophysics Data System (ADS)

    Bradley, James E.; Wysocki, Tadeusz S., Jr.

    1993-02-01

    This article describes an innovative method for the economical manufacturing of large, intricately-shaped tubular composite parts. Proprietary intelligent process control techniques are combined with standard pultrusion and RTM methodologies to provide high part throughput, performance, and quality while substantially reducing scrap, rework costs, and labor requirements. On-line process monitoring and control is achieved through a smart tooling interface consisting of modular zone tiles installed on part-specific die assemblies. Real-time archiving of process run parameters provides enhanced SPC and SQC capabilities.

  13. Management methodology for pressure equipment

    NASA Astrophysics Data System (ADS)

    Bletchly, P. J.

    Pressure equipment constitutes a significant investment in capital and a major proportion of potential high-risk plant in many operations and this is particularly so in an alumina refinery. In many jurisdictions pressure equipment is also subject to statutory regulation that imposes obligations on Owners of the equipment with respect to workplace safety. Most modern technical standards and industry codes of practice employ a risk-based approach to support better decision making with respect to pressure equipment. For a management system to be effective it must demonstrate that risk is being managed within acceptable limits.

  14. Heat Transfer Modeling for Rigid High-Temperature Fibrous Insulation

    NASA Technical Reports Server (NTRS)

    Daryabeigi, Kamran; Cunnington, George R.; Knutson, Jeffrey R.

    2012-01-01

    Combined radiation and conduction heat transfer through a high-temperature, high-porosity, rigid multiple-fiber fibrous insulation was modeled using a thermal model previously used to model heat transfer in flexible single-fiber fibrous insulation. The rigid insulation studied was alumina enhanced thermal barrier (AETB) at densities between 130 and 260 kilograms per cubic meter. The model consists of using the diffusion approximation for radiation heat transfer, a semi-empirical solid conduction model, and a standard gas conduction model. The relevant parameters needed for the heat transfer model were estimated from steady-state thermal measurements in nitrogen gas at various temperatures and environmental pressures. The heat transfer modeling methodology was evaluated by comparison with standard thermal conductivity measurements, and steady-state thermal measurements in helium and carbon dioxide gases. The heat transfer model is applicable over the temperature range of 300 to 1360 K, pressure range of 0.133 to 101.3 x 10(exp 3) Pa, and over the insulation density range of 130 to 260 kilograms per cubic meter in various gaseous environments.
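
    A minimal sketch of the combined-mode structure described above, assuming the optically thick diffusion approximation for the radiative term added to caller-supplied solid and gas conduction terms. The functional form and every number below are illustrative assumptions, not the calibrated AETB model.

    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m-2 K-4

    def effective_conductivity(temp_k, k_solid, k_gas,
                               specific_extinction_m2_per_kg, density_kg_m3,
                               refractive_index=1.0):
        """Sum of radiative (diffusion approximation), solid and gas conduction terms:
        k_rad = 16 n^2 sigma T^3 / (3 e rho)."""
        k_rad = (16.0 * refractive_index**2 * SIGMA * temp_k**3
                 / (3.0 * specific_extinction_m2_per_kg * density_kg_m3))
        return k_rad + k_solid + k_gas

    # Invented inputs, roughly in the range of a rigid fibrous insulation.
    print(effective_conductivity(temp_k=1000.0, k_solid=0.035, k_gas=0.045,
                                 specific_extinction_m2_per_kg=30.0,
                                 density_kg_m3=200.0))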

  15. Proposed Risk-Informed Seismic Hazard Periodic Reevaluation Methodology for Complying with DOE Order 420.1C

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kammerer, Annie

    Department of Energy (DOE) nuclear facilities must comply with DOE Order 420.1C Facility Safety, which requires that all such facilities review their natural phenomena hazards (NPH) assessments no less frequently than every ten years. The Order points the reader to Standard DOE-STD-1020-2012. In addition to providing a discussion of the applicable evaluation criteria, the Standard references other documents, including ANSI/ANS-2.29-2008 and NUREG-2117. These documents provide supporting criteria and approaches for evaluating the need to update an existing probabilistic seismic hazard analysis (PSHA). All of the documents are consistent at a high level regarding the general conceptual criteria that should be considered. However, none of the documents provides step-by-step detailed guidance on the required or recommended approach for evaluating the significance of new information and determining whether or not an existing PSHA should be updated. Further, all of the conceptual approaches and criteria given in these documents deal with changes that may have occurred in the knowledge base that might impact the inputs to the PSHA, the calculated hazard itself, or the technical basis for the hazard inputs. Given that the DOE Order is aimed at achieving and assuring the safety of nuclear facilities—which is a function not only of the level of the seismic hazard but also the capacity of the facility to withstand vibratory ground motions—the inclusion of risk information in the evaluation process would appear to be both prudent and in line with the objectives of the Order. The purpose of this white paper is to describe a risk-informed methodology for evaluating the need for an update of an existing PSHA consistent with the DOE Order. While the development of the proposed methodology was undertaken as a result of assessments for specific SDC-3 facilities at Idaho National Laboratory (INL), and it is expected that the application at INL will provide a demonstration of the methodology, there is potential for general applicability to other facilities across the DOE complex. As such, both a general methodology and a specific approach intended for INL are described in this document. The general methodology proposed in this white paper is referred to as the “seismic hazard periodic review methodology,” or SHPRM. It presents a graded approach for SDC-3, SDC-4 and SDC-5 facilities that can be applied in any risk-informed regulatory environment once risk-objectives appropriate for the framework are developed. While the methodology was developed for seismic hazard considerations, it can also be directly applied to other types of natural hazards.

  16. Proposed Risk-Informed Seismic Hazard Periodic Reevaluation Methodology for Complying with DOE Order 420.1C

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kammerer, Annie

    Department of Energy (DOE) nuclear facilities must comply with DOE Order 420.1C Facility Safety, which requires that all such facilities review their natural phenomena hazards (NPH) assessments no less frequently than every ten years. The Order points the reader to Standard DOE-STD-1020-2012. In addition to providing a discussion of the applicable evaluation criteria, the Standard references other documents, including ANSI/ANS-2.29-2008 and NUREG-2117. These documents provide supporting criteria and approaches for evaluating the need to update an existing probabilistic seismic hazard analysis (PSHA). All of the documents are consistent at a high level regarding the general conceptual criteria that should be considered. However, none of the documents provides step-by-step detailed guidance on the required or recommended approach for evaluating the significance of new information and determining whether or not an existing PSHA should be updated. Further, all of the conceptual approaches and criteria given in these documents deal with changes that may have occurred in the knowledge base that might impact the inputs to the PSHA, the calculated hazard itself, or the technical basis for the hazard inputs. Given that the DOE Order is aimed at achieving and assuring the safety of nuclear facilities—which is a function not only of the level of the seismic hazard but also the capacity of the facility to withstand vibratory ground motions—the inclusion of risk information in the evaluation process would appear to be both prudent and in line with the objectives of the Order. The purpose of this white paper is to describe a risk-informed methodology for evaluating the need for an update of an existing PSHA consistent with the DOE Order. While the development of the proposed methodology was undertaken as a result of assessments for specific SDC-3 facilities at Idaho National Laboratory (INL), and it is expected that the application at INL will provide a demonstration of the methodology, there is potential for general applicability to other facilities across the DOE complex. As such, both a general methodology and a specific approach intended for INL are described in this document. The general methodology proposed in this white paper is referred to as the “seismic hazard periodic review methodology,” or SHPRM. It presents a graded approach for SDC-3, SDC-4 and SDC-5 facilities that can be applied in any risk-informed regulatory environment once risk-objectives appropriate for the framework are developed. While the methodology was developed for seismic hazard considerations, it can also be directly applied to other types of natural hazards.

  17. Whole-Genome Sequencing and Assembly with High-Throughput, Short-Read Technologies

    PubMed Central

    Sundquist, Andreas; Ronaghi, Mostafa; Tang, Haixu; Pevzner, Pavel; Batzoglou, Serafim

    2007-01-01

    While recently developed short-read sequencing technologies may dramatically reduce the sequencing cost and eventually achieve the $1000 goal for re-sequencing, their limitations prevent the de novo sequencing of eukaryotic genomes with the standard shotgun sequencing protocol. We present SHRAP (SHort Read Assembly Protocol), a sequencing protocol and assembly methodology that utilizes high-throughput short-read technologies. We describe a variation on hierarchical sequencing with two crucial differences: (1) we select a clone library from the genome randomly rather than as a tiling path and (2) we sample clones from the genome at high coverage and reads from the clones at low coverage. We assume that 200 bp read lengths with a 1% error rate and inexpensive random fragment cloning on whole mammalian genomes is feasible. Our assembly methodology is based on first ordering the clones and subsequently performing read assembly in three stages: (1) local assemblies of regions significantly smaller than a clone size, (2) clone-sized assemblies of the results of stage 1, and (3) chromosome-sized assemblies. By aggressively localizing the assembly problem during the first stage, our method succeeds in assembling short, unpaired reads sampled from repetitive genomes. We tested our assembler using simulated reads from D. melanogaster and human chromosomes 1, 11, and 21, and produced assemblies with large sets of contiguous sequence and a misassembly rate comparable to other draft assemblies. Tested on D. melanogaster and the entire human genome, our clone-ordering method produces accurate maps, thereby localizing fragment assembly and enabling the parallelization of the subsequent steps of our pipeline. Thus, we have demonstrated that truly inexpensive de novo sequencing of mammalian genomes will soon be possible with high-throughput, short-read technologies using our methodology. PMID:17534434

  18. FIELD EVALUATION OF A SAMPLING APPROACH FOR PM-COARSE AEROSOLS

    EPA Science Inventory

    Subsequent to a 1997 revision of the national ambient air quality standards (NAAQS) for particulate matter (PM), the US Environmental Protection Agency is investigating the development of sampling methodology for a possible new coarse particle standard. When developed, this me...

  19. Movie Mitosis

    ERIC Educational Resources Information Center

    Bogiages, Christopher; Hitt, Austin M.

    2008-01-01

    Mitosis and meiosis are essential for the growth, development, and reproduction of organisms. Because these processes are essential to life, both are emphasized in biology texts, state standards, and the National Science Education Standards. In this article, the authors present their methodology for teaching mitosis by having students produce…

  20. MULTIRESIDUE DETERMINATION OF ACIDIC PESTICIDES ...

    EPA Pesticide Factsheets

    A multiresidue pesticide methodology has been studied and results for acidics are reported here, with base/neutral to follow. This work studies a literature procedure as a possible general approach to many pesticides and potentially other analytes that are considered to be liquid chromatographic candidates rather than gas chromatographic ones. The analysis of the sewage effluent of a major southwestern US city serves as an example of the application of the methodology to a real sample. Recovery studies were also conducted to validate the proposed extraction step. A gradient elution program was followed for the high performance liquid chromatography, leading to a general approach for acidics. Confirmation of identity was by EI GC/MS after conversion of the acids to the methyl ester (or other appropriate methylation) by means of trimethylsilyldiazomethane. 3,4-Dichlorophenoxyacetic acid was used as an internal standard to monitor the reaction, and PCB #19 was used as the quantitation internal standard. Although others have reported similar analyses of acids, conversion to the methyl ester was by means of diazomethane itself rather than by the more convenient and safer trimethylsilyldiazomethane. Thus, the present paper supports the use of trimethylsilyldiazomethane with all of these acids (trimethylsilyldiazomethane has been used in environmental work with some phenoxyacetic acid herbicides) and further supports the usefulness of this reagent as a potential re

  1. Absolute pitch memory: its prevalence among musicians and dependence on the testing context.

    PubMed

    Wong, Yetta Kwailing; Wong, Alan C-N

    2014-04-01

    Absolute pitch (AP) is widely believed to be a rare ability possessed by only a small group of gifted and special individuals (AP possessors). While AP has fascinated psychologists, neuroscientists, and musicians for more than a century, no theory can satisfactorily explain why this ability is so rare and difficult to learn. Here, we show that AP ability appears rare because of the methodological issues of the standard pitch-naming test. Specifically, the standard test unnecessarily poses a high decisional demand on AP judgments and uses a testing context that is highly inconsistent with one's musical training. These extra cognitive challenges are not central to AP memory per se and have thus led to consistent underestimation of AP ability in the population. Using the standard test, we replicated the typical findings that the accuracy for general violinists was low (12.38 %; chance level = 0 %). With identical stimuli, scoring criteria, and participants, violinists attained 25 % accuracy in a pitch verification test in which the decisional demand of AP judgment was reduced. When the testing context was increasingly similar to their musical experience, verification accuracy improved further and reached 39 %, three times higher than that for the standard test. Results were replicated with a separate group of pianists. Our findings challenge current theories about AP and suggest that the prevalence of AP among musicians has been highly underestimated in prior work. A multimodal framework is proposed to better explain AP memory.

  2. Tools and methodologies to support more sustainable biofuel feedstock production.

    PubMed

    Dragisic, Christine; Ashkenazi, Erica; Bede, Lucio; Honzák, Miroslav; Killeen, Tim; Paglia, Adriano; Semroc, Bambi; Savy, Conrad

    2011-02-01

    Increasingly, government regulations, voluntary standards, and company guidelines require that biofuel production complies with sustainability criteria. For some stakeholders, however, compliance with these criteria may seem complex, costly, or unfeasible. What existing tools, then, might facilitate compliance with a variety of biofuel-related sustainability criteria? This paper presents four existing tools and methodologies that can help stakeholders assess (and mitigate) potential risks associated with feedstock production, and can thus facilitate compliance with requirements under different requirement systems. These include the Integrated Biodiversity Assessment Tool (IBAT), the ARtificial Intelligence for Ecosystem Services (ARIES) tool, the Responsible Cultivation Areas (RCA) methodology, and the related Biofuels + Forest Carbon (Biofuel + FC) methodology.

  3. Determination of absorbed dose to water for high-energy photon and electron beams-comparison of the standards DIN 6800-2 (1997), IAEA TRS 398 (2000) and DIN 6800-2 (2006)

    PubMed Central

    Zakaria, Golam Abu; Schuette, Wilhelm

    2007-01-01

    For the determination of the absorbed dose to water for high-energy photon and electron beams, the IAEA code of practice TRS-398 (2000) is applied internationally. In Germany, the German dosimetry protocol DIN 6800-2 (1997) is used. Recently, the DIN standard has been revised and published as Draft National Standard DIN 6800-2 (2006). It has widely adopted the methodology and dosimetric data of the code of practice. This paper compares these three dosimetry protocols systematically and identifies similarities as well as differences. The investigation was done with 6 and 18 MV photon as well as 5 to 21 MeV electron beams. While only cylindrical chambers were used for photon beams, measurements of electron beams were performed using cylindrical as well as plane-parallel chambers. The discrepancies in the determination of absorbed dose to water between the three protocols were 0.4% for photon beams and 1.5% for electron beams. Comparative measurements showed a deviation of less than 0.5% between our measurements following protocol DIN 6800-2 (2006) and a TLD inter-comparison procedure in an external audit. PMID:21217912

  4. Determination of absorbed dose to water for high-energy photon and electron beams-comparison of the standards DIN 6800-2 (1997), IAEA TRS 398 (2000) and DIN 6800-2 (2006).

    PubMed

    Zakaria, Golam Abu; Schuette, Wilhelm

    2007-01-01

    For the determination of the absorbed dose to water for high-energy photon and electron beams, the IAEA code of practice TRS-398 (2000) is applied internationally. In Germany, the German dosimetry protocol DIN 6800-2 (1997) is used. Recently, the DIN standard has been revised and published as Draft National Standard DIN 6800-2 (2006). It has widely adopted the methodology and dosimetric data of the code of practice. This paper compares these three dosimetry protocols systematically and identifies similarities as well as differences. The investigation was done with 6 and 18 MV photon as well as 5 to 21 MeV electron beams. While only cylindrical chambers were used for photon beams, measurements of electron beams were performed using cylindrical as well as plane-parallel chambers. The discrepancies in the determination of absorbed dose to water between the three protocols were 0.4% for photon beams and 1.5% for electron beams. Comparative measurements showed a deviation of less than 0.5% between our measurements following protocol DIN 6800-2 (2006) and a TLD inter-comparison procedure in an external audit.

  5. Laboratory diagnostics of malaria

    NASA Astrophysics Data System (ADS)

    Siahaan, L.

    2018-03-01

    Even now, malaria treatment should only be administered after laboratory confirmation. There are several principal methods for diagnosing malaria, and all of them have their disadvantages. Presumptive treatment of malaria is widely practiced where laboratory tests are not readily available. Microscopy of Giemsa-stained thick and thin blood films remains the gold standard for the diagnosis of malaria infection. The techniques of slide preparation, staining and reading are well known and standardized, as is the estimation of parasite density and parasite stages. Microscopy is not always available or feasible at primary health services in limited-resource settings due to cost and a lack of the skilled manpower, accessories and reagents required. Rapid diagnostic tests (RDTs) are potential tools for parasite-based diagnosis since the tests are accurate in detecting malaria infections and are easy to use. The test is based on the capture of parasite antigen released from parasitized red blood cells, using monoclonal antibodies prepared against the malaria antigen target. Polymerase Chain Reaction (PCR) relies on DNA amplification and has higher sensitivity than microscopy, but it is not widely used due to the lack of a standardized methodology, high costs, and the need for highly trained staff.

  6. Development of an accurate and high-throughput methodology for structural comprehension of chlorophylls derivatives. (II) Dephytylated derivatives.

    PubMed

    Chen, Kewei; Ríos, José Julián; Roca, María; Pérez-Gálvez, Antonio

    2015-09-18

    Dephytylated chlorophylls (chlorophyllides and pheophorbides) are the starting point of chlorophyll catabolism in green tissues, components of the chlorophyll pattern in stored/processed food vegetables, and the favoured structural arrangement for chlorophyll absorption. In addition, dephytylated native chlorophylls are prone to several structural modifications yielding pyro-, 13(2)-hydroxy- and 15(1)-hydroxy-lactone derivatives. Despite their importance, only a few of them have been analysed by MS(n). Besides new protocols for obtaining standards, we have developed a new high-throughput methodology able to determine the fragmentation pathway of 16 dephytylated chlorophyll derivatives, elucidating the structures of the new product ions and new mechanisms of fragmentation. The new methodology combines, for the first time, high-resolution time-of-flight mass spectrometry and powerful post-processing software. Native chlorophyllides and pheophorbides mainly exhibit product ions that involve fragmentation of the D ring, as well as additional exclusive product ions. The introduction of an oxygenated function at the E ring promotes fragmentation reactions through the β-keto ester group, also producing exclusive product ions for 13(2)-hydroxy derivatives and for 15(1)-hydroxy-lactone ones. Consequently, while MS(2)-based reactions of phytylated chlorophyll derivatives point to fragmentations at the phytyl and propionic chains, dephytylated chlorophyll derivatives behave differently, as the absence of phytyl makes the β-keto ester group and E ring more prone to fragmentation. Proposals of the key reaction mechanisms underlying the origin of the new product ions are made. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Methodological reporting of randomized controlled trials in major hepato-gastroenterology journals in 2008 and 1998: a comparative study

    PubMed Central

    2011-01-01

    Background It was still unclear whether the methodological reporting quality of randomized controlled trials (RCTs) in major hepato-gastroenterology journals improved after the Consolidated Standards of Reporting Trials (CONSORT) Statement was revised in 2001. Methods RCTs in five major hepato-gastroenterology journals published in 1998 or 2008 were retrieved from MEDLINE using a high-sensitivity search method, and the reporting quality of their methodological details was evaluated based on the CONSORT Statement and the Cochrane Handbook for Systematic Reviews of Interventions. Changes in methodological reporting quality between 2008 and 1998 were calculated as risk ratios with 95% confidence intervals. Results A total of 107 RCTs published in 2008 and 99 RCTs published in 1998 were found. Compared to those in 1998, the proportion of RCTs that reported sequence generation (RR, 5.70; 95%CI 3.11-10.42), allocation concealment (RR, 4.08; 95%CI 2.25-7.39), sample size calculation (RR, 3.83; 95%CI 2.10-6.98), incomplete outcome data addressed (RR, 1.81; 95%CI, 1.03-3.17), and intention-to-treat analyses (RR, 3.04; 95%CI 1.72-5.39) increased in 2008. Blinding and intention-to-treat analysis were reported better in multi-center trials than in single-center trials. The reporting of allocation concealment and blinding was better in industry-sponsored trials than in publicly funded trials. Compared with historical studies, the methodological reporting quality improved with time. Conclusion Although the reporting of several important methodological aspects improved in 2008 compared with 1998, which may indicate that researchers had increased awareness of and compliance with the revised CONSORT statement, some items were still reported poorly. There is much room for future improvement. PMID:21801429
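
    The comparison statistic used above, a risk ratio with a 95% confidence interval, can be reproduced with the usual log-normal approximation. The sketch below is a generic Python illustration with invented counts; it is not the study's data or code.

    import math

    def risk_ratio(events_1, total_1, events_2, total_2):
        """Risk ratio of group 1 vs group 2 with a 95% CI (log-normal approximation)."""
        rr = (events_1 / total_1) / (events_2 / total_2)
        se_log_rr = math.sqrt(1 / events_1 - 1 / total_1 + 1 / events_2 - 1 / total_2)
        lower = math.exp(math.log(rr) - 1.96 * se_log_rr)
        upper = math.exp(math.log(rr) + 1.96 * se_log_rr)
        return rr, lower, upper

    # Invented counts: an item reported by 60/107 trials in 2008 vs 10/99 in 1998.
    print(risk_ratio(60, 107, 10, 99))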

  8. A Methodology for the Integration of a Mechanistic Source Term Analysis in a Probabilistic Framework for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.

  9. Methodological accuracy of image-based electron density assessment using dual-energy computed tomography.

    PubMed

    Möhler, Christian; Wohlfahrt, Patrick; Richter, Christian; Greilich, Steffen

    2017-06-01

    Electron density is the most important tissue property influencing photon and ion dose distributions in radiotherapy patients. Dual-energy computed tomography (DECT) enables the determination of electron density by combining the information on photon attenuation obtained at two different effective x-ray energy spectra. Most algorithms suggested so far use the CT numbers provided after image reconstruction as input parameters, i.e., are image-based. To explore the accuracy that can be achieved with these approaches, we quantify the intrinsic methodological and calibration uncertainty of the seemingly simplest approach. In the studied approach, electron density is calculated with a one-parametric linear superposition ('alpha blending') of the two DECT images, which is shown to be equivalent to an affine relation between the photon attenuation cross sections of the two x-ray energy spectra. We propose to use the latter relation for empirical calibration of the spectrum-dependent blending parameter. For a conclusive assessment of the electron density uncertainty, we chose to isolate the purely methodological uncertainty component from CT-related effects such as noise and beam hardening. Analyzing calculated spectrally weighted attenuation coefficients, we find universal applicability of the investigated approach to arbitrary mixtures of human tissue with an upper limit of the methodological uncertainty component of 0.2%, excluding high-Z elements such as iodine. The proposed calibration procedure is bias-free and straightforward to perform using standard equipment. Testing the calibration on five published data sets, we obtain very small differences in the calibration result in spite of different experimental setups and CT protocols used. Employing a general calibration per scanner type and voltage combination is thus conceivable. Given the high suitability for clinical application of the alpha-blending approach in combination with a very small methodological uncertainty, we conclude that further refinement of image-based DECT-algorithms for electron density assessment is not advisable. © 2017 American Association of Physicists in Medicine.
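
    A minimal sketch of the one-parametric 'alpha blending' described above, assuming CT numbers are first converted to attenuation relative to water (u = HU/1000 + 1) and then combined with a single spectrum-dependent weight. The weight and the voxel values below are invented for illustration, not a calibrated result.

    def electron_density_alpha_blend(hu_low, hu_high, alpha):
        """Relative electron density from a dual-energy CT voxel pair by alpha blending."""
        u_low = hu_low / 1000.0 + 1.0    # attenuation relative to water, low-kVp image
        u_high = hu_high / 1000.0 + 1.0  # attenuation relative to water, high-kVp image
        return alpha * u_low + (1.0 - alpha) * u_high

    # Invented soft-tissue CT numbers and a hypothetical calibration weight.
    print(electron_density_alpha_blend(hu_low=60.0, hu_high=45.0, alpha=0.6))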

  10. A methodology for estimating risks associated with landslides of contaminated soil into rivers.

    PubMed

    Göransson, Gunnel; Norrman, Jenny; Larson, Magnus; Alén, Claes; Rosén, Lars

    2014-02-15

    Urban areas adjacent to surface water are exposed to soil movements such as erosion and slope failures (landslides). A landslide is a potential mechanism for mobilisation and spreading of pollutants. This mechanism is in general not included in environmental risk assessments for contaminated sites, and the consequences associated with contamination in the soil are typically not considered in landslide risk assessments. This study suggests a methodology to estimate the environmental risks associated with landslides in contaminated sites adjacent to rivers. The methodology is probabilistic and allows for datasets with large uncertainties and the use of expert judgements, providing quantitative estimates of probabilities for defined failures. The approach is illustrated by a case study along the river Göta Älv, Sweden, where failures are defined and probabilities for those failures are estimated. Failures are defined from a pollution perspective and in terms of exceeding environmental quality standards (EQSs) and acceptable contaminant loads. Models are then suggested to estimate probabilities of these failures. A landslide analysis is carried out to assess landslide probabilities based on data from a recent landslide risk classification study along the river Göta Älv. The suggested methodology is meant to be a supplement to either landslide risk assessment (LRA) or environmental risk assessment (ERA), providing quantitative estimates of the risks associated with landslide in contaminated sites. The proposed methodology can also act as a basis for communication and discussion, thereby contributing to intersectoral management solutions. From the case study it was found that the defined failures are governed primarily by the probability of a landslide occurring. The overall probabilities for failure are low; however, if a landslide occurs the probabilities of exceeding EQS are high and the probability of having at least a 10% increase in the contamination load within one year is also high. Copyright © 2013 Elsevier B.V. All rights reserved.
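
    The failure probabilities discussed above combine the probability that a landslide occurs with the conditional probability that an environmental quality standard (EQS) is then exceeded. The sketch below shows that chain in Python with invented numbers; it is not the Göta Älv case-study model.

    def annual_failure_probability(p_landslide, p_exceed_eqs_given_landslide):
        """Probability that a landslide occurs and the EQS is then exceeded."""
        return p_landslide * p_exceed_eqs_given_landslide

    # Invented values: a rare landslide, but a high conditional probability of
    # exceeding the environmental quality standard if it does occur.
    print(annual_failure_probability(p_landslide=0.002,
                                     p_exceed_eqs_given_landslide=0.9))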

  11. Expectations for methodology and translation of animal research: a survey of health care workers.

    PubMed

    Joffe, Ari R; Bara, Meredith; Anton, Natalie; Nobis, Nathan

    2015-05-07

    Health care workers (HCW) often perform, promote, and advocate use of public funds for animal research (AR); therefore, an awareness of the empirical costs and benefits of animal research is an important issue for HCW. We aim to determine what health-care-workers consider should be acceptable standards of AR methodology and translation rate to humans. After development and validation, an e-mail survey was sent to all pediatricians and pediatric intensive care unit nurses and respiratory-therapists (RTs) affiliated with a Canadian University. We presented questions about demographics, methodology of AR, and expectations from AR. Responses of pediatricians and nurses/RTs were compared using Chi-square, with P < .05 considered significant. Response rate was 44/114(39%) (pediatricians), and 69/120 (58%) (nurses/RTs). Asked about methodological quality, most respondents expect that: AR is done to high quality; costs and difficulty are not acceptable justifications for low quality; findings should be reproducible between laboratories and strains of the same species; and guidelines for AR funded with public money should be consistent with these expectations. Asked about benefits of AR, most thought that there are sometimes/often large benefits to humans from AR, and disagreed that "AR rarely produces benefit to humans." Asked about expectations of translation to humans (of toxicity, carcinogenicity, teratogenicity, and treatment findings), most: expect translation >40% of the time; thought that misleading AR results should occur <21% of the time; and that if translation was to occur <20% of the time, they would be less supportive of AR. There were few differences between pediatricians and nurses/RTs. HCW have high expectations for the methodological quality of, and the translation rate to humans of findings from AR. These expectations are higher than the empirical data show having been achieved. Unless these areas of AR significantly improve, HCW support of AR may be tenuous.

  12. Assessment of clinical practice guideline methodology for the treatment of knee osteoarthritis with intra-articular hyaluronic acid.

    PubMed

    Altman, Roy D; Schemitsch, Emil; Bedi, Asheesh

    2015-10-01

    Clinical practice guidelines are of increasing importance in decision making for the treatment of knee osteoarthritis. Inconsistent recommendations regarding the use of intra-articular hyaluronic acid for the treatment of knee osteoarthritis have led to confusion among treating physicians. A literature search was conducted to identify clinical practice guidelines that provide recommendations regarding the use of intra-articular hyaluronic acid treatment for knee osteoarthritis. Included guidelines were appraised using the AGREE II instrument. Guideline development methodologies, how the results were assessed, the recommendation formation, and work group composition were summarized. Overall, 10 clinical practice guidelines were identified that met our inclusion criteria. AGREE II domain scores were variable across the included guidelines. The methodology utilized across the guidelines was heterogeneous regarding the evidence inclusion criteria, analysis of evidence results, formulation of clinical practice recommendations, and work group composition. The recommendations provided by the guidelines for intra-articular hyaluronic acid treatment for knee osteoarthritis are highly inconsistent as a result of the variability in guideline methodology. Overall, 30% of the included guidelines recommended against the use of intra-articular hyaluronic acid in the treatment of knee osteoarthritis, while 30% deemed the treatment an appropriate intervention under certain scenarios. The remaining 40% of the guidelines provided either an uncertain recommendation or no recommendation at all, based on the high variability in reviewed evidence regarding efficacy and trial quality. There is a need for a standard "appropriate methodology" that is agreed upon for osteoarthritis clinical practice guidelines in order to prevent the development of conflicting recommendations for intra-articular hyaluronic acid treatment for knee osteoarthritis, and to assure that treating physicians who are utilizing these guidelines are making their clinical decisions on the best available evidence. At present, the inconsistent recommendations provided for intra-articular hyaluronic acid treatment make it difficult for clinical professionals to determine its appropriateness when treating patients with knee osteoarthritis. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Muscle dysmorphia: methodological issues, implications for research.

    PubMed

    Suffolk, Mark T; Dovey, Terence M; Goodwin, Huw; Meyer, Caroline

    2013-01-01

    Muscle dysmorphia is a male-dominated, body image-related psychological condition. Despite continued investigation, contention surrounds the nosological status of this disorder. The aim of this article was to review the literature on muscle dysmorphia to provide a qualitative account of methodological issues that may inhibit our understanding. Key areas relating to non-standardized participant groups, measuring instruments, and terminology were identified as potentially inhibiting symptom coherence and diagnostic reliability. New measuring instruments validated with clinical samples and carefully described participant groups, standardized terminology, and a greater emphasis on prospective longitudinal research with specific sub groups of the weight training community would be of interest to the field.

  14. On the evidentiary standards for nutrition advice.

    PubMed

    Jukola, Saana

    2018-06-01

    This paper evaluates the application of evidentiary standards originating from evidence-based medicine in nutrition advice. It shows that it is problematic to criticize nutrition recommendations for not being based on randomized controlled trials. Due to practical, ethical and methodological reasons, it is difficult to conduct rigorous randomized controlled trials for acquiring evidence that is relevant for achieving the goals of population-level nutrition recommendations. Given the non-epistemic goals of the dietary recommendations, criteria of acceptable evidence should be adapted to the goals of the practice and the practical, ethical, and methodological constraints of the situation. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Evaluating alternative service contracts for medical equipment.

    PubMed

    De Vivo, L; Derrico, P; Tomaiuolo, D; Capussotto, C; Reali, A

    2004-01-01

    Managing medical equipment is a formidable task that has to be pursued while maximizing benefits within a highly regulated and cost-constrained environment. Clinical engineers are uniquely equipped to determine which policies are the most efficacious and cost-effective for a health care institution to ensure that medical devices meet appropriate standards of safety, quality and performance. Part of this support is a strategy for preventive and corrective maintenance. This paper describes an alternative scheme of OEM (Original Equipment Manufacturer) service contract for medical equipment that combines the manufacturer's technical support and in-house maintenance. An efficient and efficacious organization can reduce the high cost of medical equipment maintenance while raising reliability and quality. The methodology and results are discussed.

  16. Sensitivity assessment of sea lice to chemotherapeutants: Current bioassays and best practices.

    PubMed

    Marín, S L; Mancilla, J; Hausdorf, M A; Bouchard, D; Tudor, M S; Kane, F

    2017-12-18

    Traditional bioassays are still necessary to test sensitivity of sea lice species to chemotherapeutants, but the methodology applied by the different scientists has varied over time in respect to that proposed in "Sea lice resistance to chemotherapeutants: A handbook in resistance management" (2006). These divergences motivated the organization of a workshop during the Sea Lice 2016 conference "Standardization of traditional bioassay process by sharing best practices." There was an agreement by the attendants to update the handbook. The objective of this article is to provide a baseline analysis of the methodology for traditional bioassays and to identify procedures that need to be addressed to standardize the protocol. The methodology was divided into the following steps: bioassay design; material and equipment; sea lice collection, transportation and laboratory reception; preparation of dilution; parasite exposure; response evaluation; data analysis; and reporting. Information from the presentations of the workshop, and also from other studies, allowed for the identification of procedures inside a given step that need to be standardized as they were reported to be performed differently by the different working groups. Bioassay design and response evaluation were the targeted steps where more procedures need to be analysed and agreed upon. © 2017 John Wiley & Sons Ltd.

  17. Methodology to estimate particulate matter emissions from certified commercial aircraft engines.

    PubMed

    Wayson, Roger L; Fleming, Gregg G; Lovinelli, Ralph

    2009-01-01

    Today, about one-fourth of U.S. commercial service airports, including 41 of the busiest 50, are either in nonattainment or maintenance areas per the National Ambient Air Quality Standards. U.S. aviation activity is forecasted to triple by 2025, while at the same time, the U.S. Environmental Protection Agency (EPA) is evaluating stricter particulate matter (PM) standards on the basis of documented human health and welfare impacts. Stricter federal standards are expected to impede capacity and limit aviation growth if regulatory mandated emission reductions occur as for other non-aviation sources (i.e., automobiles, power plants, etc.). In addition, strong interest exists as to the role aviation emissions play in air quality and climate change issues. These reasons underpin the need to quantify and understand PM emissions from certified commercial aircraft engines, which has led to the need for a methodology to predict these emissions. Standardized sampling techniques to measure volatile and nonvolatile PM emissions from aircraft engines do not exist. As such, a first-order approximation (FOA) was derived to fill this need based on available information. FOA1.0 only allowed prediction of nonvolatile PM. FOA2.0 was a change to include volatile PM emissions on the basis of the ratio of nonvolatile to volatile emissions. Recent collaborative efforts by industry (manufacturers and airlines), research establishments, and regulators have begun to provide further insight into the estimation of the PM emissions. The resultant PM measurement datasets are being analyzed to refine sampling techniques and progress towards standardized PM measurements. These preliminary measurement datasets also support the continued refinement of the FOA methodology. FOA3.0 disaggregated the prediction techniques to allow for independent prediction of nonvolatile and volatile emissions on a more theoretical basis. The Committee for Aviation Environmental Protection of the International Civil Aviation Organization endorsed the use of FOA3.0 in February 2007. Further commitment was made to improve the FOA as new data become available, until such time the methodology is rendered obsolete by a fully validated database of PM emission indices for today's certified commercial fleet. This paper discusses related assumptions and derived equations for the FOA3.0 methodology used worldwide to estimate PM emissions from certified commercial aircraft engines within the vicinity of airports.

  18. A Conceptual Framework for Systematic Reviews of Research in Educational Leadership and Management

    ERIC Educational Resources Information Center

    Hallinger, Philip

    2013-01-01

    Purpose: The purpose of this paper is to present a framework for scholars carrying out reviews of research that meet international standards for publication. Design/methodology/approach: This is primarily a conceptual paper focusing on the methodology of conducting systematic reviews of research. However, the paper draws on a database of reviews…

  19. The Story of Schooling: Critical Race Theory and the Educational Racial Contract

    ERIC Educational Resources Information Center

    Leonardo, Zeus

    2013-01-01

    This article is an engagement of methodology as an ideologico-racial practice through Critical Race Theory's practice of storytelling. It is a conceptual extension of this practice as explained through Charles Mills' use of the "racial contract (RC) as methodology" in order to explain the Herrenvolk Education--one standard for…

  20. Using Six Sigma for Performance Improvement in Business Curriculum: A Case Study

    ERIC Educational Resources Information Center

    Kukreja, Anil; Ricks, Joe M., Jr.; Meyer, Jean A.

    2009-01-01

    During the last few decades, a number of quality improvement methodologies have been used by organizations. This article provides a brief review of the quality improvement literature related to academia and a case study using Six Sigma methodology to analyze students' performance in a standardized examination. We found Six Sigma to be an effective…

  1. Development of Management Methodology for Engineering Production Quality

    NASA Astrophysics Data System (ADS)

    Gorlenko, O.; Miroshnikov, V.; Borbatc, N.

    2016-04-01

    The authors of the paper propose four directions for developing a quality management methodology for engineering products that implements the requirements of the new international standard ISO 9001:2015: analysis of the organisational context taking stakeholders into account, the use of risk management, management of in-house knowledge, and assessment of enterprise activity according to effectiveness criteria.

  2. The Standard of Quality for HEIs in Vietnam: A Step in the Right Direction?

    ERIC Educational Resources Information Center

    Tran, Nga D.; Nguyen, Thanh T.; Nguyen, My T. N.

    2011-01-01

    Purpose: The purpose of this paper is to provide a critical analysis of the Standard of Quality for higher education institutions in Vietnam which was developed in response to an urgent call for a fundamental reform to enhance the quality of educational provision, particularly of teaching and learning. Design/methodology/approach: The standard and…

  3. Health Data Standards and Adoption Process: Preliminary Findings of a Qualitative Study in Saudi Arabia

    ERIC Educational Resources Information Center

    Alkraiji, Abdullah; Jackson, Thomas; Murray, Ian

    2011-01-01

    Purpose: This paper seeks to carry out a critical study of health data standards and adoption process with a focus on Saudi Arabia. Design/methodology/approach: Many developed nations have initiated programs to develop, promote, adopt and customise international health data standards to the local needs. The current status of, and future plans for,…

  4. A Content Analysis of Immigration in Traditional, New, and Non-Gateway State Standards for U.S. History and Civics

    ERIC Educational Resources Information Center

    Hilburn, Jeremy; Journell, Wayne; Buchanan, Lisa Brown

    2016-01-01

    In this content analysis of state U.S. History and Civics standards, we compared the treatment of immigration across three types of states with differing immigration demographics. Analyzing standards from 18 states from a critical race methodology perspective, our findings indicated three sets of tensions: a unified American story versus local…

  5. Credit risk migration rates modeling as open systems: A micro-simulation approach

    NASA Astrophysics Data System (ADS)

    Landini, S.; Uberti, M.; Casellina, S.

    2018-05-01

    The financial crisis of 2008 stimulated the development of new regulatory criteria (commonly known as Basel III) that pushed banking activity to become more prudential in both the short and the long run. As is well known, in 2014 the International Accounting Standards Board (IASB) promulgated the new International Financial Reporting Standard 9 (IFRS 9) for financial instruments, effective from January 2018. Since the delayed recognition of credit losses on loans was identified as a weakness in existing accounting standards, the IASB introduced an Expected Loss model that requires more timely recognition of credit losses. Specifically, the new standards require entities to account for expected losses both from when impairments are first recognized and over the full loan lifetime; moreover, a clear preference for forward-looking models is expressed. In this new framework, a re-thinking of the widespread standard theoretical approach on which the well-known prudential model is founded is necessary. The aim of this paper is therefore to define an original methodological approach to migration rates modeling for credit risk that is innovative with respect to the standard method, from the point of view of a bank as well as from a regulatory perspective. Accordingly, the proposed non-standard approach considers a portfolio as an open sample, allowing for entries and exits as well as migrations of stayers. While being consistent with empirical observations, this open-sample approach contrasts with the standard closed-sample method. In particular, this paper offers a methodology to integrate the outcomes of the standard closed-sample method within the open-sample perspective while removing some of the assumptions of the standard method. Three main conclusions can be drawn in terms of economic capital provision: (a) based on the Markovian hypothesis with an a-priori absorbing state at default, the standard closed-sample method should be abandoned so as not to predict lenders' bankruptcy by construction; (b) to obtain more reliable estimates in line with the new regulatory standards, the sample used to estimate migration rates matrices for credit risk should include both entries and exits; (c) the static eigen-decomposition procedure commonly used to forecast migration rates should be replaced with a stochastic process dynamics methodology that conditions forecasts on macroeconomic scenarios.
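
    A minimal sketch of the open-sample idea, using hypothetical ratings and counts rather than the authors' estimator: a closed-sample migration matrix is built only from obligors observed at both dates, while the open-sample tabulation adds an explicit exit state and a separate tally of entries.

      # Hedged sketch with hypothetical data: closed- vs open-sample migration rates.
      # Ratings: A, B, D (default); None marks an obligor absent from the portfolio.
      import numpy as np

      states = ["A", "B", "D"]
      observations = [                       # (rating at t0, rating at t1)
          ("A", "A"), ("A", "B"), ("B", "B"), ("B", "D"), ("D", "D"),  # stayers
          ("A", None), ("B", None),                                    # exits (e.g. repaid loans)
          (None, "A"), (None, "B"),                                    # entries (new loans)
      ]

      def safe_row_normalize(counts):
          sums = counts.sum(axis=1, keepdims=True)
          return np.divide(counts, sums, out=np.zeros_like(counts), where=sums > 0)

      def closed_sample(obs):
          """Uses only obligors observed at both dates (the standard approach)."""
          idx = {s: i for i, s in enumerate(states)}
          counts = np.zeros((len(states), len(states)))
          for r0, r1 in obs:
              if r0 is not None and r1 is not None:
                  counts[idx[r0], idx[r1]] += 1
          return safe_row_normalize(counts)

      def open_sample(obs):
          """Adds an explicit EXIT state and counts entries separately."""
          ext = states + ["EXIT"]
          idx = {s: i for i, s in enumerate(ext)}
          counts = np.zeros((len(ext), len(ext)))
          entries = np.zeros(len(ext))
          for r0, r1 in obs:
              if r0 is None:
                  entries[idx[r1]] += 1                  # entry into the portfolio
              else:
                  counts[idx[r0], idx[r1] if r1 is not None else idx["EXIT"]] += 1
          return safe_row_normalize(counts), entries

      print(closed_sample(observations))
      print(open_sample(observations))

    In the closed-sample matrix the exits simply disappear from the denominators, which is one of the assumptions the open-sample approach removes.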

  6. Prevalence of self-medication in the adult population of Brazil: a systematic review

    PubMed Central

    Domingues, Paulo Henrique Faria; Galvão, Taís Freire; de Andrade, Keitty Regina Cordeiro; de Sá, Pedro Terra Teles; Silva, Marcus Tolentino; Pereira, Mauricio Gomes

    2015-01-01

    OBJECTIVE To evaluate the prevalence of self-medication in Brazil’s adult population. METHODS Systematic review of cross-sectional population-based studies. The following databases were used: Medline, Embase, Scopus, ISI, CINAHL, Cochrane Library, CRD, Lilacs, SciELO, the Banco de teses brasileiras (Brazilian theses database) (Capes) and files from the Portal Domínio Público (Brazilian Public Domain). In addition, the reference lists from relevant studies were examined to identify potentially eligible articles. No restrictions were applied regarding publication date, language or publication status. Data related to publication, population, methods and prevalence of self-medication were extracted by three independent researchers. Methodological quality was assessed following eight criteria related to sampling, measurement and presentation of results. Prevalences were measured among participants who used at least one medication during the recall period of the studies. RESULTS The literature screening identified 2,778 records, of which 12 were included for analysis. Most studies were conducted in the Southeastern region of Brazil, after 2000 and with a 15-day recall period. Only five studies achieved high methodological quality, of which one study had a 7-day recall period, in which the prevalence of self-medication was 22.9% (95%CI 14.6;33.9). The prevalence of self-medication in three studies of high methodological quality with a 15-day recall period was 35.0% (95%CI 29.0;40.0, I2 = 83.9%) in the adult Brazilian population. CONCLUSIONS Despite differences in the methodologies of the included studies, the results of this systematic review indicate that a significant proportion of the adult Brazilian population self-medicates. It is suggested that future research projects that assess self-medication in Brazil standardize their methods. PMID:26083944
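
    As a hedged illustration of the pooling step behind figures such as 35.0% (95%CI 29.0;40.0, I2 = 83.9%), the sketch below applies a DerSimonian-Laird random-effects model to logit-transformed prevalences; the three (prevalence, sample size) pairs are hypothetical and do not reproduce the review's data.

      # Hedged sketch: DerSimonian-Laird random-effects pooling of prevalences on the
      # logit scale, with Cochran's Q and the I2 heterogeneity statistic.
      import numpy as np

      studies = [(0.33, 800), (0.31, 1200), (0.41, 600)]   # hypothetical (p, n)

      y = np.array([np.log(p / (1 - p)) for p, n in studies])              # logit(p)
      v = np.array([1 / (n * p) + 1 / (n * (1 - p)) for p, n in studies])  # approx. variance of logit

      w = 1 / v
      y_fixed = np.sum(w * y) / np.sum(w)
      Q = np.sum(w * (y - y_fixed) ** 2)                                   # Cochran's Q
      k = len(studies)
      tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
      I2 = max(0.0, (Q - (k - 1)) / Q) * 100                               # heterogeneity, %

      w_star = 1 / (v + tau2)
      mu = np.sum(w_star * y) / np.sum(w_star)
      se = np.sqrt(1 / np.sum(w_star))
      lo, hi = mu - 1.96 * se, mu + 1.96 * se
      expit = lambda x: 1 / (1 + np.exp(-x))
      print(f"pooled prevalence {expit(mu):.3f} "
            f"(95% CI {expit(lo):.3f}-{expit(hi):.3f}), I2 = {I2:.1f}%")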

  7. Operational Impacts of Wind Energy Resources in the Bonneville Power Administration Control Area - Phase I Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Lu, Shuai

    2008-07-15

    This report presents a methodology developed to study the future impact of wind on BPA power system load following and regulation requirements. The methodology uses historical data and stochastic processes to simulate the load balancing processes in the BPA power system, by mimicking the actual power system operations. Therefore, the results are close to reality, yet the study based on this methodology is convenient to conduct. Compared with the proposed methodology, existing methodologies for doing similar analysis include dispatch model simulation and standard deviation evaluation on load and wind data. Dispatch model simulation is constrained by the design of the dispatch program, and standard deviation evaluation is artificial in separating the load following and regulation requirements, both of which usually do not reflect actual operational practice. The methodology used in this study provides not only capacity requirement information but also analyzes the ramp rate requirements for system load following and regulation processes. The ramp rate data can be used to evaluate generator response/maneuverability requirements, which is another necessary capability of the generation fleet for the smooth integration of wind energy. The study results are presented in an innovative way such that the increased generation capacity or ramp requirements are compared for two different years, across 24 hours a day. Therefore, the impact of different levels of wind energy on generation requirements at different times can be easily visualized.
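
    The sketch below illustrates one common way of separating load following from regulation, i.e. slow versus fast deviations of net load around an hourly schedule, and of reading off capacity and ramp-rate envelopes. It uses synthetic minute data and is not the report's actual algorithm.

      # Hedged sketch: decompose synthetic minute-level net load (load minus wind)
      # into load-following and regulation components, then derive simple envelopes.
      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(0)
      t = pd.date_range("2008-01-01", periods=24 * 60, freq="min")
      net_load = 6000 + 800 * np.sin(2 * np.pi * t.hour / 24) + rng.normal(0, 40, len(t))
      net_load = pd.Series(net_load, index=t)                 # MW, hypothetical

      hourly_schedule = net_load.resample("60min").mean().reindex(t, method="ffill")
      ten_min_trend = net_load.rolling("10min").mean()        # slow trend

      load_following = ten_min_trend - hourly_schedule        # slower deviations from schedule
      regulation = net_load - ten_min_trend                   # fast minute-to-minute deviations

      capacity_req = regulation.abs().quantile(0.997)         # MW capacity envelope
      ramp_req = regulation.diff().abs().quantile(0.997)      # MW/min ramping envelope
      print(f"regulation capacity ~ +/-{capacity_req:.0f} MW, ramp ~ {ramp_req:.1f} MW/min")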

  8. Melanins and melanogenesis: methods, standards, protocols.

    PubMed

    d'Ischia, Marco; Wakamatsu, Kazumasa; Napolitano, Alessandra; Briganti, Stefania; Garcia-Borron, José-Carlos; Kovacs, Daniela; Meredith, Paul; Pezzella, Alessandro; Picardo, Mauro; Sarna, Tadeusz; Simon, John D; Ito, Shosuke

    2013-09-01

    Despite considerable advances in the past decade, melanin research still suffers from the lack of universally accepted and shared nomenclature, methodologies, and structural models. This paper stems from the joint efforts of chemists, biochemists, physicists, biologists, and physicians with recognized and consolidated expertise in the field of melanins and melanogenesis, who critically reviewed and experimentally revisited methods, standards, and protocols to provide for the first time a consensus set of recommended procedures to be adopted and shared by researchers involved in pigment cell research. The aim of the paper was to define an unprecedented frame of reference built on cutting-edge knowledge and state-of-the-art methodology, to enable reliable comparison of results among laboratories and new progress in the field based on standardized methods and shared information. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. Development of ASTM Standard for SiC-SiC Joint Testing Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobsen, George; Back, Christina

    2015-10-30

    As the nuclear industry moves to advanced ceramic based materials for cladding and core structural materials for a variety of advanced reactors, new standards and test methods are required for material development and licensing purposes. For example, General Atomics (GA) is actively developing silicon carbide (SiC) based composite cladding (SiC-SiC) for its Energy Multiplier Module (EM2), a high efficiency gas cooled fast reactor. Through DOE funding via the advanced reactor concept program, GA developed a new test method for the nominal joint strength of an endplug sealed to advanced ceramic tubes, Fig. 1-1, at ambient and elevated temperatures called the endplug pushout (EPPO) test. This test utilizes widely available universal mechanical testers coupled with clam shell heaters, and specimen size is relatively small, making it a viable post irradiation test method. The culmination of this effort was a draft of an ASTM test standard that will be submitted for approval to the ASTM C28 ceramic committee. Once the standard has been vetted by the ceramics test community, an industry wide standard methodology to test joined tubular ceramic components will be available for the entire nuclear materials community.

  10. Methodological approach for the collection and simultaneous estimation of greenhouse gases emission from aquaculture ponds.

    PubMed

    Vasanth, Muthuraman; Muralidhar, Moturi; Saraswathy, Ramamoorthy; Nagavel, Arunachalam; Dayal, Jagabattula Syama; Jayanthi, Marappan; Lalitha, Natarajan; Kumararaja, Periyamuthu; Vijayan, Koyadan Kizhakkedath

    2016-12-01

    Global warming/climate change is the greatest environmental threat of our time. The rapidly developing aquaculture sector is an anthropogenic activity whose contribution to global warming is little understood, and estimation of greenhouse gas (GHG) emissions from aquaculture ponds is a key practice in predicting the impact of aquaculture on global warming. A comprehensive methodology was developed for sampling and simultaneous analysis of the GHGs carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) from aquaculture ponds. The GHG fluxes were collected using a cylindrical acrylic chamber, an air pump, and Tedlar bags. A cylindrical acrylic floating chamber was fabricated to collect the GHGs emanating from the surface of aquaculture ponds. The sampling methodology was standardized and in-house method validation was established by achieving linearity, accuracy, precision, and specificity. GHG flux samples were found to be stable for 3 days when stored at 10 ± 2 °C. The developed methodology was used to quantify GHGs in Pacific white shrimp Penaeus vannamei and black tiger shrimp Penaeus monodon culture ponds for a period of 4 months. The rate of emission of carbon dioxide was found to be much greater than that of the other two GHGs. Average GHG emission in g ha-1 day-1 during the culture was comparatively high in P. vannamei culture ponds.
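
    A minimal sketch of the standard static-chamber flux calculation that such a methodology typically relies on: fit the rise in headspace concentration over time, then convert the slope to an areal flux with the ideal gas law. Chamber dimensions and concentrations below are hypothetical, not values from the study.

      # Hedged sketch of a static-chamber flux calculation for CO2 (hypothetical data).
      import numpy as np

      times_h = np.array([0.0, 0.25, 0.5, 0.75, 1.0])          # sampling times (h)
      co2_ppm = np.array([410.0, 455.0, 502.0, 548.0, 590.0])   # chamber headspace CO2

      slope_ppm_per_h = np.polyfit(times_h, co2_ppm, 1)[0]      # dC/dt

      V = 0.030        # chamber headspace volume, m^3
      A = 0.0707       # chamber footprint area, m^2 (about 30 cm diameter)
      P = 101325.0     # Pa
      T = 273.15 + 30  # K
      R = 8.314        # J mol^-1 K^-1
      M_CO2 = 44.01    # g mol^-1

      flux_mol = slope_ppm_per_h * 1e-6 * P / (R * T) * V / A   # mol m^-2 h^-1
      flux_mg = flux_mol * M_CO2 * 1000                         # mg m^-2 h^-1
      print(f"CO2 flux ~ {flux_mg:.1f} mg m^-2 h^-1")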

  11. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  12. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  13. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  14. Automotive Manufacturer Risk Analysis : Meeting the Automotive Fuel Economy Standards

    DOT National Transportation Integrated Search

    1979-08-01

    An overview of the methodology and some findings are presented of a study which assessed the impact of the automotive fuel economy standards (AFES) on the four major U.S. automakers. A risk model was used to estimate the financial performance of the ...

  15. 75 FR 18751 - FBI Criminal Justice Information Services Division User Fees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-13

    ... Standards (SFFAS-4): Managerial Cost Accounting Concepts and Standards for the Federal Government; and other relevant financial management directives, BearingPoint developed a cost accounting methodology and related... management process that provides information about the relationships between inputs (costs) and outputs...

  16. Contracting Officer Technical Representative Briefing

    NASA Technical Reports Server (NTRS)

    Gettleman, Alan

    2001-01-01

    This viewgraph presentation gives an overview of the Agency Occupational Health Program, including details on organizational and personnel changes, medical program standardization, programmatic status, policies, standards, and guides and resources, industrial hygiene and radiological health, assessment schedule and methodology, upcoming events, and the future of the program.

  17. Improving Accuracy and Relevance of Race/Ethnicity Data: Results of a Statewide Collaboration in Hawaii.

    PubMed

    Pellegrin, Karen L; Miyamura, Jill B; Ma, Carolyn; Taniguchi, Ronald

    2016-01-01

    Current race/ethnicity categories established by the U.S. Office of Management and Budget are neither reliable nor valid for understanding health disparities or for tracking improvements in this area. In Hawaii, statewide hospitals have collaborated to collect race/ethnicity data using a standardized method consistent with recommended practices that overcome the problems with the federal categories. The purpose of this observational study was to determine the impact of this collaboration on key measures of race/ethnicity documentation. After this collaborative effort, the number of standardized categories available across hospitals increased from 6 to 34, and the percent of inpatients with documented race/ethnicity increased from 88 to 96%. This improved standardized methodology is now the foundation for tracking population health indicators statewide and focusing quality improvement efforts. The approach used in Hawaii can serve as a model for other states and regions. Ultimately, the ability to standardize data collection methodology across states and regions will be needed to track improvements nationally.

  18. Enantiomer fractions of polychlorinated biphenyls in three selected Standard Reference Materials.

    PubMed

    Morrissey, Joshua A; Bleackley, Derek S; Warner, Nicholas A; Wong, Charles S

    2007-01-01

    The enantiomer composition of six chiral polychlorinated biphenyls (PCBs) was measured in three different certified Standard Reference Materials (SRMs) from the US National Institute of Standards and Technology (NIST): SRM 1946 (Lake Superior fish tissue), SRM 1939a (PCB Congeners in Hudson River Sediment), and SRM 2978 (organic contaminants in mussel tissue--Raritan Bay, New Jersey) to aid in quality assurance/quality control methodologies in the study of chiral pollutants in sediments and biota. Enantiomer fractions (EFs) of PCBs 91, 95, 136, 149, 174, and 183 were measured using a suite of chiral columns by gas chromatography/mass spectrometry. Concentrations of target analytes were in agreement with certified values. Target analyte EFs in reference materials were measured precisely (<2% relative standard deviation), indicating the utility of SRM in quality assurance/control methodologies for analyses of chiral compounds in environmental samples. Measured EFs were also in agreement with previously published analyses of similar samples, indicating that similar enantioselective processes were taking place in these environmental matrices.
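
    A small sketch of the enantiomer fraction (EF) metric itself, EF = A(+)/(A(+) + A(-)), with EF = 0.5 indicating a racemic composition, together with the relative standard deviation used to quote precision; the peak areas are hypothetical, not SRM measurements.

      # Hedged sketch: enantiomer fraction from chromatographic peak areas (hypothetical).
      def enantiomer_fraction(area_plus: float, area_minus: float) -> float:
          return area_plus / (area_plus + area_minus)

      replicates = [(1520.0, 1490.0), (1480.0, 1460.0), (1510.0, 1500.0)]
      efs = [enantiomer_fraction(a, b) for a, b in replicates]
      mean_ef = sum(efs) / len(efs)
      rsd = (sum((x - mean_ef) ** 2 for x in efs) / (len(efs) - 1)) ** 0.5 / mean_ef * 100
      print(f"EF = {mean_ef:.3f} (RSD {rsd:.2f}%)")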

  19. Reef Fish Survey Techniques: Assessing the Potential for Standardizing Methodologies.

    PubMed

    Caldwell, Zachary R; Zgliczynski, Brian J; Williams, Gareth J; Sandin, Stuart A

    2016-01-01

    Dramatic changes in populations of fishes living on coral reefs have been documented globally and, in response, the research community has initiated efforts to assess and monitor reef fish assemblages. A variety of visual census techniques are employed; however, results are often incomparable due to differential methodological performance. Although comparability of data may promote improved assessment of fish populations, and thus management of often critically important nearshore fisheries, to date no standardized and agreed-upon survey method has emerged. This study describes the use of methods across the research community and identifies potential drivers of method selection. An online survey was distributed to researchers from academic, governmental, and non-governmental organizations internationally. Although many methods were identified, 89% of survey-based projects employed one of three methods: belt transect, stationary point count, and some variation of the timed swim method. The selection of survey method was independent of the research design (i.e., assessment goal) and region of study, but was related to the researcher's home institution. While some researchers expressed willingness to modify their current survey protocols to more standardized protocols (76%), their willingness decreased when methodologies were tied to long-term datasets spanning five or more years. Willingness to modify current methodologies was also less common among academic researchers than resource managers. By understanding both the current application of methods and the reported motivations for method selection, we hope to focus discussions towards increasing the comparability of quantitative reef fish survey data.

  20. Large-scale optimization-based non-negative computational framework for diffusion equations: Parallel implementation and performance studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Justin; Karra, Satish; Nakshatrala, Kalyana B.

    It is well-known that the standard Galerkin formulation, which is often the formulation of choice under the finite element method for solving self-adjoint diffusion equations, does not meet maximum principles and the non-negative constraint for anisotropic diffusion equations. Recently, optimization-based methodologies that satisfy maximum principles and the non-negative constraint for steady-state and transient diffusion-type equations have been proposed. To date, these methodologies have been tested only on small-scale academic problems. The purpose of this paper is to systematically study the performance of the non-negative methodology in the context of high performance computing (HPC). PETSc and TAO libraries are, respectively, used for the parallel environment and optimization solvers. For large-scale problems, it is important for computational scientists to understand the computational performance of current algorithms available in these scientific libraries. The numerical experiments are conducted on the state-of-the-art HPC systems, and a single-core performance model is used to better characterize the efficiency of the solvers. Furthermore, our studies indicate that the proposed non-negative computational framework for diffusion-type equations exhibits excellent strong scaling for real-world large-scale problems.

  1. Large-scale optimization-based non-negative computational framework for diffusion equations: Parallel implementation and performance studies

    DOE PAGES

    Chang, Justin; Karra, Satish; Nakshatrala, Kalyana B.

    2016-07-26

    It is well-known that the standard Galerkin formulation, which is often the formulation of choice under the finite element method for solving self-adjoint diffusion equations, does not meet maximum principles and the non-negative constraint for anisotropic diffusion equations. Recently, optimization-based methodologies that satisfy maximum principles and the non-negative constraint for steady-state and transient diffusion-type equations have been proposed. To date, these methodologies have been tested only on small-scale academic problems. The purpose of this paper is to systematically study the performance of the non-negative methodology in the context of high performance computing (HPC). PETSc and TAO libraries are, respectively, used for the parallel environment and optimization solvers. For large-scale problems, it is important for computational scientists to understand the computational performance of current algorithms available in these scientific libraries. The numerical experiments are conducted on the state-of-the-art HPC systems, and a single-core performance model is used to better characterize the efficiency of the solvers. Furthermore, our studies indicate that the proposed non-negative computational framework for diffusion-type equations exhibits excellent strong scaling for real-world large-scale problems.
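
    A toy sketch of the optimization-based idea on a 1-D problem, assuming an implicit-Euler step recast as a bound-constrained least-squares solve (SciPy's lsq_linear) so the update cannot go negative; the paper's PETSc/TAO framework and its anisotropic test cases are far more general. For this isotropic 1-D toy the unconstrained solve already stays non-negative, so the comparison only confirms that the constrained solve reproduces it while guaranteeing the bound.

      # Hedged toy example: implicit diffusion step as bound-constrained least squares.
      import numpy as np
      from scipy.optimize import lsq_linear

      n, dt, kappa = 50, 1e-3, 1.0
      h = 1.0 / (n - 1)

      # 1-D Laplacian with homogeneous Dirichlet boundaries
      L = np.zeros((n, n))
      for i in range(1, n - 1):
          L[i, i - 1], L[i, i], L[i, i + 1] = 1.0, -2.0, 1.0
      L /= h**2

      A = np.eye(n) - dt * kappa * L                            # implicit Euler system matrix
      u0 = np.maximum(0.0, np.sin(np.pi * np.linspace(0, 1, n)) - 0.5)

      u_plain = np.linalg.solve(A, u0)                          # unconstrained step
      u_nonneg = lsq_linear(A, u0, bounds=(0.0, np.inf)).x      # non-negativity enforced

      print("min (unconstrained):", u_plain.min())
      print("min (constrained):  ", u_nonneg.min())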

  2. Drought risk assessment under climate change is sensitive to methodological choices for the estimation of evaporative demand.

    PubMed

    Dewes, Candida F; Rangwala, Imtiaz; Barsugli, Joseph J; Hobbins, Michael T; Kumar, Sanjiv

    2017-01-01

    Several studies have projected increases in drought severity, extent and duration in many parts of the world under climate change. We examine sources of uncertainty arising from the methodological choices for the assessment of future drought risk in the continental US (CONUS). One such uncertainty is in the climate models' expression of evaporative demand (E0), which is not a direct climate model output but has been traditionally estimated using several different formulations. Here we analyze daily output from two CMIP5 GCMs to evaluate how differences in E0 formulation, treatment of meteorological driving data, choice of GCM, and standardization of time series influence the estimation of E0. These methodological choices yield different assessments of spatio-temporal variability in E0 and different trends in 21st century drought risk. First, we estimate E0 using three widely used E0 formulations: Penman-Monteith; Hargreaves-Samani; and Priestley-Taylor. Our analysis, which primarily focuses on the May-September warm-season period, shows that E0 climatology and its spatial pattern differ substantially between these three formulations. Overall, we find higher magnitudes of E0 and its interannual variability using Penman-Monteith, in particular for regions like the Great Plains and southwestern US where E0 is strongly influenced by variations in wind and relative humidity. When examining projected changes in E0 during the 21st century, there are also large differences among the three formulations, particularly the Penman-Monteith relative to the other two formulations. The 21st century E0 trends, particularly in percent change and standardized anomalies of E0, are found to be sensitive to the long-term mean value and the amplitude of interannual variability, i.e. if the magnitude of E0 and its interannual variability are relatively low for a particular E0 formulation, then the normalized or standardized 21st century trend based on that formulation is amplified relative to other formulations. This is the case for the use of Hargreaves-Samani and Priestley-Taylor, where future E0 trends are comparatively much larger than for Penman-Monteith. When comparing Penman-Monteith E0 responses between different choices of input variables related to wind speed, surface roughness, and net radiation, we found differences in E0 trends, although these choices had a much smaller influence on E0 trends than did the E0 formulation choices. These methodological choices and specific climate model selection also have a large influence on the estimation of trends in standardized drought indices used for drought assessment operationally. We find that standardization tends to amplify divergences between the E0 trends calculated using different E0 formulations, because standardization is sensitive to both the climatology and amplitude of interannual variability of E0. For different methodological choices and GCM output considered in estimating E0, we examine potential sources of uncertainty in 21st century trends in the Standardized Precipitation Evapotranspiration Index (SPEI) and Evaporative Demand Drought Index (EDDI) over selected regions of the CONUS to demonstrate the practical implications of these methodological choices for the quantification of drought risk under climate change.
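
    To make the formulation-dependence concrete, the sketch below implements the Hargreaves-Samani reference formulation of E0 (one of the three compared) and the standardization step whose sensitivity to the long-term mean and variability the study highlights; the temperature and radiation inputs are hypothetical.

      # Hedged sketch: Hargreaves-Samani E0 and standardized anomalies (hypothetical inputs).
      import numpy as np

      def hargreaves_samani(tmax_c, tmin_c, ra_mm_day):
          """ET0 (mm/day); Ra is extraterrestrial radiation in mm/day evaporation equivalent."""
          tmean = (tmax_c + tmin_c) / 2.0
          return 0.0023 * ra_mm_day * np.sqrt(tmax_c - tmin_c) * (tmean + 17.8)

      tmax = np.array([30.0, 33.0, 35.0, 37.0])
      tmin = np.array([15.0, 16.0, 18.0, 19.0])
      ra = np.full(4, 16.0)

      e0 = hargreaves_samani(tmax, tmin, ra)
      z = (e0 - e0.mean()) / e0.std(ddof=1)   # standardized anomalies: when variability is
      print(e0, z)                            # small, a modest absolute trend looks large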

  3. Sharing behavioral data through a grid infrastructure using data standards

    PubMed Central

    Min, Hua; Ohira, Riki; Collins, Michael A; Bondy, Jessica; Avis, Nancy E; Tchuvatkina, Olga; Courtney, Paul K; Moser, Richard P; Shaikh, Abdul R; Hesse, Bradford W; Cooper, Mary; Reeves, Dianne; Lanese, Bob; Helba, Cindy; Miller, Suzanne M; Ross, Eric A

    2014-01-01

    Objective In an effort to standardize behavioral measures and their data representation, the present study develops a methodology for incorporating measures found in the National Cancer Institute's (NCI) grid-enabled measures (GEM) portal, a repository for behavioral and social measures, into the cancer data standards registry and repository (caDSR). Methods The methodology consists of four parts for curating GEM measures into the caDSR: (1) develop unified modeling language (UML) models for behavioral measures; (2) create common data elements (CDE) for UML components; (3) bind CDE with concepts from the NCI thesaurus; and (4) register CDE in the caDSR. Results UML models have been developed for four GEM measures, which have been registered in the caDSR as CDE. New behavioral concepts related to these measures have been created and incorporated into the NCI thesaurus. Best practices for representing measures using UML models have been utilized in the practice (eg, caDSR). One dataset based on a GEM-curated measure is available for use by other systems and users connected to the grid. Conclusions Behavioral and population science data can be standardized by using and extending current standards. A new branch of CDE for behavioral science was developed for the caDSR. It expands the caDSR domain coverage beyond the clinical and biological areas. In addition, missing terms and concepts specific to the behavioral measures addressed in this paper were added to the NCI thesaurus. A methodology was developed and refined for curation of behavioral and population science data. PMID:24076749

  4. Determination of boron in uranium aluminum silicon alloy by spectrophotometry and estimation of expanded uncertainty in measurement

    NASA Astrophysics Data System (ADS)

    Ramanjaneyulu, P. S.; Sayi, Y. S.; Ramakumar, K. L.

    2008-08-01

    Quantification of boron in diverse materials of relevance in nuclear technology is essential in view of its high thermal neutron absorption cross section. A simple and sensitive method has been developed for the determination of boron in uranium-aluminum-silicon alloy, based on leaching of boron with 6 M HCl and H2O2, its selective separation by solvent extraction with 2-ethyl hexane 1,3-diol and quantification by spectrophotometry using curcumin. The method has been evaluated by the standard addition method and validated by inductively coupled plasma-atomic emission spectroscopy. Relative standard deviation and absolute detection limit of the method are 3.0% (at 1 σ level) and 12 ng, respectively. All possible sources of uncertainties in the methodology have been individually assessed, following the International Organization for Standardization guidelines. The combined uncertainty is calculated employing uncertainty propagation formulae. The expanded uncertainty in the measurement at 95% confidence level (coverage factor 2) is 8.840%.
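
    A minimal sketch of the GUM-style uncertainty combination behind an expanded uncertainty quoted at a coverage factor of 2; the four relative-uncertainty components below are hypothetical placeholders, not the paper's actual budget.

      # Hedged sketch: combine relative standard uncertainties and expand with k = 2.
      import math

      boron_ng = 250.0                     # measured boron content (hypothetical)
      rel_u = {                            # relative standard uncertainties (fractions)
          "calibration": 0.020,
          "sample mass": 0.005,
          "extraction recovery": 0.030,
          "absorbance repeatability": 0.015,
      }

      u_c_rel = math.sqrt(sum(u**2 for u in rel_u.values()))   # combined (relative)
      U_rel = 2 * u_c_rel                                       # expanded, k = 2 (~95 %)
      print(f"combined: {100*u_c_rel:.2f} %, expanded (k=2): {100*U_rel:.2f} %, "
            f"result: {boron_ng:.0f} ± {boron_ng*U_rel:.0f} ng")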

  5. A comparison of approaches for estimating relative impacts of nonnative fishes

    USGS Publications Warehouse

    Lapointe, N.W.R.; Pendleton, R. M.; Angermeier, Paul

    2012-01-01

    Lack of standard methods for quantifying impact has hindered risk assessments of high-impact invaders. To understand methodological strengths and weaknesses, we compared five approaches (in parentheses) for quantifying impact of nonnative fishes: reviewing documented impacts in a large-scale database (review); surveying fish biologists regarding three categories of impact (socioeconomic, ecological, abundance); and estimating frequency of occurrence from existing collection records (collection). In addition, we compared game and nongame biologists’ ratings of game and nongame species. Although mean species ratings were generally correlated among approaches, we documented important discrepancies. The review approach required little effort but often inaccurately estimated impact in our study region (Mid-Atlantic United States). Game fishes received lower ratings from the socioeconomic approach, which yielded the greatest consistency among respondents. The ecological approach exhibited lower respondent bias but was sensitive to pre-existing perceptions of high-impact invaders. The abundance approach provided the least-biased assessment of region-specific impact but did not account for differences in per-capita effects among species. The collection approach required the most effort and did not provide reliable estimates of impact. Multiple approaches to assessing a species’ impact are instructive, but impact ratings must be interpreted in the context of methodological strengths and weaknesses and key management issues. A combination of our ecological and abundance approaches may be most appropriate for assessing ecological impact, whereas our socioeconomic approach is more useful for understanding social dimensions. These approaches are readily transferrable to other regions and taxa; if refined, they can help standardize the assessment of impacts of nonnative species.

  6. Implementing and Evaluating a National Certification Technical Skills Examination: The Colorectal Objective Structured Assessment of Technical Skill.

    PubMed

    de Montbrun, Sandra; Roberts, Patricia L; Satterthwaite, Lisa; MacRae, Helen

    2016-07-01

    To implement the Colorectal Objective Structured Assessment of Technical skill (COSATS) into American Board of Colon and Rectal Surgery (ABCRS) certification and build evidence of validity for the interpretation of the scores of this high-stakes assessment tool. Currently, technical skill assessment is not a formal component of board certification. With the technical demands of surgical specialties, documenting competence in technical skill at the time of certification with a valid tool is ideal. In September 2014, the COSATS became a mandatory component of ABCRS certification. Seventy candidates took the examination, with their performance evaluated by expert colorectal surgeons using a task-specific checklist, global rating scale, and overall performance scale. Passing scores were set and compared using 2 standard-setting methodologies, a compensatory and a conjunctive model. Inter-rater reliability and the reliability of the pass/fail decision were calculated using Cronbach alpha and Subkoviak methodology, respectively. Overall COSATS scores and pass/fail status were compared with results on the ABCRS oral examination. The pass rate ranged from 85.7% to 90%. Inter-rater reliability (0.85) and reliability of the pass/fail decision (0.87 and 0.84) were high. A low positive correlation (r = 0.25) was seen between the COSATS and oral examination. All individuals who failed the COSATS passed the ABCRS oral examination. COSATS is the first technical skill examination used in national surgical board certification. This study suggests that the current certification process may be failing to identify individuals who have demonstrated technical deficiencies on this standardized assessment tool.
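
    A small sketch of the Cronbach alpha statistic used here for inter-rater reliability, treating each rater as an item; the candidate-by-rater scores are hypothetical and unrelated to the COSATS data.

      # Hedged sketch: Cronbach's alpha across raters (hypothetical global rating scores).
      import numpy as np

      scores = np.array([          # rows = candidates, columns = raters
          [4.0, 4.5, 4.0],
          [3.0, 3.5, 3.0],
          [5.0, 4.5, 5.0],
          [2.5, 3.0, 2.5],
          [4.0, 4.0, 4.5],
      ])

      k = scores.shape[1]
      item_vars = scores.var(axis=0, ddof=1).sum()     # sum of per-rater variances
      total_var = scores.sum(axis=1).var(ddof=1)       # variance of candidate totals
      alpha = k / (k - 1) * (1 - item_vars / total_var)
      print(f"Cronbach's alpha = {alpha:.2f}")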

  7. Bromine isotope ratio measurements in seawater by multi-collector inductively coupled plasma-mass spectrometry with a conventional sample introduction system.

    PubMed

    de Gois, Jefferson S; Vallelonga, Paul; Spolaor, Andrea; Devulder, Veerle; Borges, Daniel L G; Vanhaecke, Frank

    2016-01-01

    A simple and accurate methodology for Br isotope ratio measurements in seawater by multi-collector inductively coupled plasma-mass spectrometry (MC-ICP-MS) with pneumatic nebulization for sample introduction was developed. The Br(+) signals could be measured interference-free at high mass resolution. Memory effects for Br were counteracted using 5 mmol L(-1) of NH4OH in sample, standard, and wash solutions. The major cation load of seawater was removed via cation exchange chromatography using Dowex 50WX8 resin. Subsequent Br preconcentration was accomplished via evaporation of the sample solution at 90 °C, which did not induce Br losses or isotope fractionation. Mass discrimination was corrected for by external correction using a Cl-matched standard measured in a sample-standard bracketing approach, although Sr, Ge, and Se were also tested as potential internal standards for internal correction for mass discrimination. The δ(81)Br (versus standard mean ocean bromide (SMOB)) values thus obtained for the NaBr isotopic reference material NIST SRM 977 and for IRMM BCR-403 seawater certified reference material are in agreement with literature values. For NIST SRM 977, the (81)Br/(79)Br ratio (0.97291) was determined with a precision ≤0.08‰ relative standard deviation (RSD).
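
    A short sketch of delta notation and sample-standard bracketing as commonly applied in such MC-ICP-MS work: the sample ratio is referenced to the mean of the bracketing standard runs; the ratios below are hypothetical, not measured values from the paper.

      # Hedged sketch: delta-81Br via sample-standard bracketing (hypothetical ratios).
      r_std_before = 0.97280     # 81Br/79Br of bracketing standard, run before the sample
      r_std_after = 0.97300      # same standard, run after the sample
      r_sample = 0.97335         # raw sample ratio

      r_std = (r_std_before + r_std_after) / 2          # interpolated standard ratio
      delta81 = (r_sample / r_std - 1) * 1000           # per mil vs bracketing standard
      print(f"delta81Br = {delta81:+.2f} permil")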

  8. HETEROTROPHIC PLATE COUNT (HPC) METHODOLOGY IN THE UNITED STATES

    EPA Science Inventory

    ABSTRACT

    In the United States (U.S.), the history of bacterial plate counting methods used for water can be traced largely through Standard Methods for the Examination of Water and Wastewater (Standard Methods). The bacterial count method has evolved from the original St...

  9. Methodological Pluralism: The Gold Standard of STEM Evaluation

    ERIC Educational Resources Information Center

    Lawrenz, Frances; Huffman, Douglas

    2006-01-01

    Nationally, there is continuing debate about appropriate methods for conducting educational evaluations. The U.S. Department of Education has placed a priority on "scientifically" based evaluation methods and has advocated a "gold standard" of randomized controlled experimentation. The priority suggests that randomized control methods are best,…

  10. Assessing Conformity to Standards for Treatment Foster Care.

    ERIC Educational Resources Information Center

    Farmer, Elizabeth M. Z.; Burns, Barbara J.; Dubs, Melanie S.; Thompson, Shealy

    2002-01-01

    This study examined conformity to the Program Standards for Treatment Foster Care among 42 statewide programs. Findings suggest fair to good overall conformity, with considerable variation among programs. A discussion of methodological and substantive considerations for future research and evaluation using this approach is included. (Contains…

  11. ESTABLISH AND STANDARDIZE METHODOLOGY FOR DETECTION OF WATERBORNE VIRUSES FROM HUMAN SOURCES

    EPA Science Inventory

    Research is conducted to develop and standardize methods to detect and measure occurrence of human enteric viruses that cause waterborne disease. The viruses of concern include the emerging pathogens--hepatitis E virus and group B rotaviruses. Also of concern are the coxsackiev...

  12. Methodologic quality of meta-analyses and systematic reviews on the Mediterranean diet and cardiovascular disease outcomes: a review.

    PubMed

    Huedo-Medina, Tania B; Garcia, Marissa; Bihuniak, Jessica D; Kenny, Anne; Kerstetter, Jane

    2016-03-01

    Several systematic reviews/meta-analyses published within the past 10 y have examined the associations of Mediterranean-style diets (MedSDs) on cardiovascular disease (CVD) risk. However, these reviews have not been evaluated for satisfying contemporary methodologic quality standards. This study evaluated the quality of recent systematic reviews/meta-analyses on MedSD and CVD risk outcomes by using an established methodologic quality scale. The relation between review quality and impact per publication value of the journal in which the article had been published was also evaluated. To assess compliance with current standards, we applied a modified version of the Assessment of Multiple Systematic Reviews (AMSTARMedSD) quality scale to systematic reviews/meta-analyses retrieved from electronic databases that had met our selection criteria: 1) used systematic or meta-analytic procedures to review the literature, 2) examined MedSD trials, and 3) had MedSD interventions independently or combined with other interventions. Reviews completely satisfied from 8% to 75% of the AMSTARMedSD items (mean ± SD: 31.2% ± 19.4%), with those published in higher-impact journals having greater quality scores. At a minimum, 60% of the 24 reviews did not disclose full search details or apply appropriate statistical methods to combine study findings. Only 5 of the reviews included participant or study characteristics in their analyses, and none evaluated MedSD diet characteristics. These data suggest that current meta-analyses/systematic reviews evaluating the effect of MedSD on CVD risk do not fully comply with contemporary methodologic quality standards. As a result, there are more research questions to answer to enhance our understanding of how MedSD affects CVD risk or how these effects may be modified by the participant or MedSD characteristics. To clarify the associations between MedSD and CVD risk, future meta-analyses and systematic reviews should not only follow methodologic quality standards but also include more statistical modeling results when data allow. © 2016 American Society for Nutrition.

  13. A systematic review of methodology applied during preclinical anesthetic neurotoxicity studies: important issues and lessons relevant to the design of future clinical research.

    PubMed

    Disma, Nicola; Mondardini, Maria C; Terrando, Niccolò; Absalom, Anthony R; Bilotta, Federico

    2016-01-01

    Preclinical evidence suggests that anesthetic agents harm the developing brain, thereby causing long-term neurocognitive impairments. It is not clear if these findings apply to humans, and retrospective epidemiological studies thus far have failed to show definitive evidence that anesthetic agents are harmful to the developing human brain. The aim of this systematic review was to summarize the preclinical studies published over the past decade, with a focus on methodological issues, to facilitate the comparison between different preclinical studies and inform better design of future trials. The literature search identified 941 articles related to the topic of neurotoxicity. As the primary aim of this systematic review was to compare methodologies applied in animal studies to inform future trials, we excluded a priori all articles focused on putative mechanisms of neurotoxicity and on neuroprotective agents. Forty-seven preclinical studies were finally included in this review. Methods used in these studies were highly heterogeneous: animals were exposed to anesthetic agents at different developmental stages, in various doses and in various combinations with other drugs, and overall showed diverse toxicity profiles. Physiological monitoring and maintenance of physiological homeostasis was variable and the use of cognitive tests was generally limited to assessment of specific brain areas, with restricted translational relevance to humans. Comparison between studies is thus complicated by this heterogeneous methodology and the relevance of the combined body of literature to humans remains uncertain. Future preclinical studies should use better standardized methodologies to facilitate transferability of findings from preclinical into clinical science. © 2015 John Wiley & Sons Ltd.

  14. Non-invasive monitoring of chewing and swallowing for objective quantification of ingestive behavior

    PubMed Central

    Sazonov, Edward; Schuckers, Stephanie; Lopez-Meyer, Paulo; Makeyev, Oleksandr; Sazonova, Nadezhda; Melanson, Edward L.; Neuman, Michael

    2008-01-01

    A methodology for studying ingestive behavior by non-invasive monitoring of swallowing (deglutition) and chewing (mastication) has been developed. The target application for the developed methodology is to study the behavioral patterns of food consumption and to produce volumetric and weight estimates of energy intake. Monitoring is non-invasive, based on detecting swallowing by a sound sensor located over the laryngopharynx or by a bone conduction microphone and detecting chewing through a below-the-ear strain sensor. Proposed sensors may be implemented in a wearable monitoring device, thus enabling monitoring of ingestive behavior in free-living individuals. In this paper, the goals in the development of this methodology are two-fold. First, a system comprising sensors, related hardware and software for multimodal data capture is designed for data collection in a controlled environment. Second, a protocol is developed for manual scoring of chewing and swallowing for use as a gold standard. The multi-modal data capture was tested by measuring chewing and swallowing in twenty-one volunteers during periods of food intake and quiet sitting (no food intake). Video footage and sensor signals were manually scored by trained raters. An inter-rater reliability study with three raters, conducted on a sample set of 5 subjects, resulted in high average intra-class correlation coefficients of 0.996 for bites, 0.988 for chews, and 0.98 for swallows. The collected sensor signals and the resulting manual scores will be used in future research as a gold standard for further assessment of sensor design, development of automatic pattern recognition routines, and study of the relationship between swallowing/chewing and ingestive behavior. PMID:18427161

  15. Non-invasive monitoring of chewing and swallowing for objective quantification of ingestive behavior.

    PubMed

    Sazonov, Edward; Schuckers, Stephanie; Lopez-Meyer, Paulo; Makeyev, Oleksandr; Sazonova, Nadezhda; Melanson, Edward L; Neuman, Michael

    2008-05-01

    A methodology for studying ingestive behavior by non-invasive monitoring of swallowing (deglutition) and chewing (mastication) has been developed. The target application for the developed methodology is to study the behavioral patterns of food consumption and to produce volumetric and weight estimates of energy intake. Monitoring is non-invasive, based on detecting swallowing by a sound sensor located over the laryngopharynx or by a bone-conduction microphone and detecting chewing through a below-the-ear strain sensor. Proposed sensors may be implemented in a wearable monitoring device, thus enabling monitoring of ingestive behavior in free-living individuals. In this paper, the goals in the development of this methodology are two-fold. First, a system comprising sensors, related hardware and software for multi-modal data capture is designed for data collection in a controlled environment. Second, a protocol is developed for manual scoring of chewing and swallowing for use as a gold standard. The multi-modal data capture was tested by measuring chewing and swallowing in 21 volunteers during periods of food intake and quiet sitting (no food intake). Video footage and sensor signals were manually scored by trained raters. An inter-rater reliability study with three raters, conducted on a sample set of five subjects, resulted in high average intra-class correlation coefficients of 0.996 for bites, 0.988 for chews and 0.98 for swallows. The collected sensor signals and the resulting manual scores will be used in future research as a gold standard for further assessment of sensor design, development of automatic pattern recognition routines and study of the relationship between swallowing/chewing and ingestive behavior.

  16. Architecture and implementation considerations of a high-speed Viterbi decoder for a Reed-Muller subcode

    NASA Technical Reports Server (NTRS)

    Lin, Shu (Principal Investigator); Uehara, Gregory T.; Nakamura, Eric; Chu, Cecilia W. P.

    1996-01-01

    The (64, 40, 8) subcode of the third-order Reed-Muller (RM) code for high-speed satellite communications is proposed. The RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth-efficient coded modulation system to achieve reliable bandwidth-efficient data transmission. The progress made toward achieving the goal of implementing a decoder system based upon this code is summarized. The development of the integrated circuit prototype sub-trellis IC, particularly focusing on the design methodology, is addressed.
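
    A back-of-the-envelope sketch of the concatenated scheme's overall parameters, assuming the classical serial-concatenation relations (overall rate is the product of the component rates; minimum distance is at least the product of the component distances); exact figures for the implemented system may differ.

      # Hedged sketch: overall rate and distance bound of the concatenated scheme.
      inner_n, inner_k, inner_d = 64, 40, 8       # Reed-Muller subcode (inner)
      outer_n, outer_k, outer_d = 255, 223, 33    # Reed-Solomon code (outer)

      overall_rate = (inner_k / inner_n) * (outer_k / outer_n)
      min_distance_bound = inner_d * outer_d
      print(f"overall code rate ~ {overall_rate:.3f}, d_min >= {min_distance_bound}")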

  17. Factors Influencing the Reliability of the Glasgow Coma Scale: A Systematic Review.

    PubMed

    Reith, Florence CM; Synnot, Anneliese; van den Brande, Ruben; Gruen, Russell L; Maas, Andrew IR

    2017-06-01

    The Glasgow Coma Scale (GCS) characterizes patients with diminished consciousness. In a recent systematic review, we found overall adequate reliability across different clinical settings, but reliability estimates varied considerably between studies, and methodological quality of studies was overall poor. Identifying and understanding factors that can affect its reliability is important, in order to promote high standards for clinical use of the GCS. The aim of this systematic review was to identify factors that influence reliability and to provide an evidence base for promoting consistent and reliable application of the GCS. A comprehensive literature search was undertaken in MEDLINE, EMBASE, and CINAHL from 1974 to July 2016. Studies assessing the reliability of the GCS in adults or describing any factor that influences reliability were included. Two reviewers independently screened citations, selected full texts, and undertook data extraction and critical appraisal. Methodological quality of studies was evaluated with the consensus-based standards for the selection of health measurement instruments checklist. Data were synthesized narratively and presented in tables. Forty-one studies were included for analysis. Factors identified that may influence reliability are education and training, the level of consciousness, and type of stimuli used. Conflicting results were found for experience of the observer, the pathology causing the reduced consciousness, and intubation/sedation. No clear influence was found for the professional background of observers. Reliability of the GCS is influenced by multiple factors and as such is context dependent. This review points to the potential for improvement from training and education and standardization of assessment methods, for which recommendations are presented. Copyright © 2017 by the Congress of Neurological Surgeons.

  18. Ballast water regulations and the move toward concentration-based numeric discharge limits.

    PubMed

    Albert, Ryan J; Lishman, John M; Saxena, Juhi R

    2013-03-01

    Ballast water from shipping is a principal source for the introduction of nonindigenous species. As a result, numerous government bodies have adopted various ballast water management practices and discharge standards to slow or eliminate the future introduction and dispersal of these nonindigenous species. For researchers studying ballast water issues, understanding the regulatory framework is helpful to define the scope of research needed by policy makers to develop effective regulations. However, for most scientists, this information is difficult to obtain because it is outside the standard scientific literature and often difficult to interpret. This paper provides a brief review of the regulatory framework directed toward scientists studying ballast water and aquatic invasive species issues. We describe different approaches to ballast water management in international, U.S. federal and state, and domestic ballast water regulation. Specifically, we discuss standards established by the International Maritime Organization (IMO), the U.S. Coast Guard and U.S. Environmental Protection Agency, and individual states in the United States including California, New York, and Minnesota. Additionally, outside the United States, countries such as Australia, Canada, and New Zealand have well-established domestic ballast water regulatory regimes. Different approaches to regulation have recently resulted in variations between numeric concentration-based ballast water discharge limits, particularly in the United States, as well as reliance on use of ballast water exchange pending development and adoption of rigorous science-based discharge standards. To date, numeric concentration-based discharge limits have not generally been based upon a thorough application of risk-assessment methodologies. Regulators, making decisions based on the available information and methodologies before them, have consequently established varying standards, or not established standards at all. The review and refinement of ballast water discharge standards by regulatory agencies will benefit from activity by the scientific community to improve and develop more precise risk-assessment methodologies.

  19. ATS-F and Man: A Course of Study: An Experiment in Satellite Application to Statewide Instructional Methodology.

    ERIC Educational Resources Information Center

    Gaven, Patricia; Williams, R. David

    An experiment is proposed which will study the advantages of satellite technology as a means for the standardization of teaching methodology in an attempt to socially integrate the rural Alaskan native. With "Man: A Course of Study" as the curricular base of the experiment, there will be a Library Experiment Program for Adults using…

  20. Methodological Validation of Quality of Life Questionnaire for Coal Mining Groups-Indian Scenario

    ERIC Educational Resources Information Center

    Sen, Sayanti; Sen, Goutam; Tewary, B. K.

    2012-01-01

    Maslow's hierarchy-of-needs theory has been used to predict the development of Quality of Life (QOL) in countries over time. In this paper, an attempt has been made to methodologically validate a quality-of-life questionnaire prepared for the study area. The objective of the study is to standardize a questionnaire tool to…

  1. Nuclear power plant life extension using subsize surveillance specimens. Performance report (4/15/92 - 4/14/98)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Arvind S.

    2001-03-05

    A new methodology for predicting the Upper Shelf Energy (USE) of standard full-size Charpy specimens from subsize specimen data has been developed. The prediction methodology uses finite element modeling (FEM) to model the fracture behavior. The inputs to the FEM are the tensile properties of the material and the subsize Charpy specimen test data.

  2. Application of Lean Healthcare methodology in a urology department of a tertiary hospital as a tool for improving efficiency.

    PubMed

    Boronat, F; Budia, A; Broseta, E; Ruiz-Cerdá, J L; Vivas-Consuelo, D

    To describe the application of the Lean methodology as a method for continuously improving the efficiency of a urology department in a tertiary hospital. The Lean Healthcare methodology was implemented in the urology department in 3 phases: 1) team training and improvement of feedback among practitioners, 2) management by process and superspecialisation, and 3) improvement of indicators (continuous improvement). The indicators were obtained from the hospital's information systems. The main source of information was the Balanced Scorecard for health systems management (CUIDISS). Comparison with other regional and national urology departments was performed through the same platform with the help of the hospital's records department (IASIST). A baseline was established with the 2011 indicators for the comparative analysis of results after implementing the Lean Healthcare methodology. The implementation of this methodology translated into high practitioner satisfaction and improved quality indicators, reaching a risk-adjusted complication index (RACI) of 0.59 and a risk-adjusted mortality rate (RAMR) of 0.24 in 4 years. The efficiency indicator (risk-adjusted length of stay [RALOS] index) reached a value of 0.61, with a savings of 2869 stays compared with the national benchmark (IASIST). The risk-adjusted readmissions index (RARI) was the only indicator above the standard, with a value of 1.36, though it showed progressive annual improvement. The Lean methodology can be effectively applied to the urology department of a tertiary hospital to improve efficiency, obtaining significant and continuous improvement in all its indicators, as well as practitioner satisfaction. Team training, management by process, continuous improvement, and delegation of responsibilities have been shown to be the fundamental pillars of this methodology. Copyright © 2017 AEU. Published by Elsevier España, S.L.U. All rights reserved.
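    Risk-adjusted indicators such as the RACI, RAMR, RALOS, and RARI cited above are commonly constructed as observed-to-expected ratios, where values below 1 indicate better-than-expected performance for the department's case mix. The snippet below is only a generic illustration of that arithmetic with invented numbers; it is not the CUIDISS/IASIST computation used in the study.

    ```python
    # Generic sketch of a risk-adjusted index as an observed/expected (O/E) ratio.
    # The expected count would normally come from a case-mix risk model.
    def risk_adjusted_index(observed_events: float, expected_events: float) -> float:
        """O/E ratio: values below 1 mean fewer events than the case mix predicts."""
        return observed_events / expected_events

    # Hypothetical example: 30 observed complications where the risk model predicts 51.
    print(round(risk_adjusted_index(30, 51), 2))  # 0.59
    ```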

  3. Electron-beam lithography with character projection technique for high-throughput exposure with line-edge quality control

    NASA Astrophysics Data System (ADS)

    Ikeno, Rimon; Maruyama, Satoshi; Mita, Yoshio; Ikeda, Makoto; Asada, Kunihiro

    2016-07-01

    The high throughput of character projection (CP) electron-beam (EB) lithography makes it a promising technique for low-to-medium-volume device fabrication with regularly arranged layouts, such as standard-cell logic and memory arrays. However, non-VLSI applications such as MEMS and MOEMS may not be able to fully utilize the benefits of the CP method because of the wide variety of layout figures, including curved and oblique edges. In addition, the stepwise shapes that arise from the EB exposure process often result in intolerable edge roughness, which degrades device performance. In this study, we propose a general EB lithography methodology for such applications utilizing a combination of the CP and variable-shaped-beam methods. In the process of layout data conversion with CP character instantiation, several control parameters were optimized to minimize the shot count, improve the edge quality, and enhance the overall device performance. We demonstrated EB shot reduction and edge-quality improvement with our methodology using a leading-edge EB exposure tool, the ADVANTEST F7000S-VD02, and a high-resolution hydrogen silsesquioxane resist. Atomic force microscope observations were used to analyze the quality of the resist edge profiles and to determine the influence of the control parameters used in the data conversion process.

  4. Automatic and robust extrinsic camera calibration for high-accuracy mobile mapping

    NASA Astrophysics Data System (ADS)

    Goeman, Werner; Douterloigne, Koen; Bogaert, Peter; Pires, Rui; Gautama, Sidharta

    2012-10-01

    A mobile mapping system (MMS) is the geoinformation community's answer to the exponentially growing demand for various geospatial data with increasingly higher accuracies, captured by multiple sensors. As mobile mapping technology is pushed toward applications on water, rail, and road, the need emerges for an external sensor calibration procedure that is portable, fast, and easy to perform. This way, sensors can be mounted and demounted depending on the application requirements without the need for time-consuming calibration procedures. A new methodology is presented to provide a high-quality external calibration of cameras that is automatic, robust, and foolproof. The MMS uses an Applanix POSLV420, which is a tightly coupled GPS/INS positioning system. The cameras used are Point Grey color video cameras synchronized with the GPS/INS system. The method uses a portable, standard ranging pole, which needs to be positioned on a known ground control point. For calibration, a well-studied absolute orientation problem needs to be solved. Here, a mutual-information-based image registration technique is studied for automatic alignment of the ranging pole. Finally, a few benchmarking tests performed under various lighting conditions demonstrate the methodology's robustness, showing high absolute stereo measurement accuracies of a few centimeters.
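    The automatic alignment step above relies on mutual information (MI) between images. As a rough illustration of the underlying quantity only, the sketch below estimates MI from a joint intensity histogram; the image arrays, bin count, and natural-log units are assumptions for the example, and this is not the registration pipeline used with the POSLV420 system.

    ```python
    # Toy mutual-information estimate between two grayscale images of equal size.
    import numpy as np

    def mutual_information(img_a: np.ndarray, img_b: np.ndarray, bins: int = 64) -> float:
        """MI estimated from the joint intensity histogram (in nats)."""
        joint_hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
        p_ab = joint_hist / joint_hist.sum()    # joint probability
        p_a = p_ab.sum(axis=1, keepdims=True)   # marginal of image A
        p_b = p_ab.sum(axis=0, keepdims=True)   # marginal of image B
        nonzero = p_ab > 0
        return float(np.sum(p_ab[nonzero] * np.log(p_ab[nonzero] / (p_a @ p_b)[nonzero])))

    # In registration, one would search over candidate alignments of the two views
    # and keep the alignment that maximizes this score.
    ```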

  5. Do Heat Pump Clothes Dryers Make Sense for the U.S. Market

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyers, Steve; Franco, Victor; Lekov, Alex

    Heat pump clothes dryers (HPCDs) can be as much as 50 percent more energy-efficient than conventional electric resistance clothes dryers and therefore have the potential to save substantial amounts of electricity. While not currently available in the U.S., manufacturers in Europe and Japan produce units for those markets. Drawing on analysis conducted for the U.S. Department of Energy's (DOE) current rulemaking on amended standards for clothes dryers, this paper evaluates the cost-effectiveness of HPCDs in American homes and presents a national impact analysis for different market-share scenarios. In order to obtain an accurate measurement of the real energy savings potential, the paper offers a new energy use calculation methodology that takes into account the most current data on clothes washer cycles, clothes dryer usage frequency, remaining moisture content, and load weight per cycle, which differ substantially from current test procedure values. Using this methodology along with product cost estimates developed by DOE, the paper presents the results of a life-cycle cost analysis of the adoption of HPCDs in a representative sample of American homes. The results show that HPCDs have positive economic benefits only for households with high clothes dryer usage or for households with high electricity prices and moderately high utilization.
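    As a rough illustration of the kind of usage-based energy calculation described above, the sketch below estimates per-cycle dryer energy from load weight, remaining moisture content (RMC), and an assumed dryer efficiency. All constants and parameter values are placeholders chosen for the example, not DOE test-procedure or rulemaking figures.

    ```python
    # Simplified per-cycle dryer energy estimate (illustrative only).
    LATENT_HEAT_KWH_PER_KG = 0.70  # rough energy to evaporate 1 kg of water, incl. heat-up losses

    def cycle_energy_kwh(load_kg: float, rmc: float, dryer_efficiency: float) -> float:
        """Energy per cycle: water to evaporate times latent heat, divided by efficiency."""
        water_kg = load_kg * rmc
        return water_kg * LATENT_HEAT_KWH_PER_KG / dryer_efficiency

    # Hypothetical comparison: a heat pump dryer modeled as roughly twice as efficient
    # (its COP-like efficiency can exceed 1 because heat is recovered and recycled).
    conventional = cycle_energy_kwh(load_kg=3.5, rmc=0.55, dryer_efficiency=0.55)
    heat_pump = cycle_energy_kwh(load_kg=3.5, rmc=0.55, dryer_efficiency=1.10)
    print(f"conventional ~ {conventional:.2f} kWh/cycle, heat pump ~ {heat_pump:.2f} kWh/cycle")
    ```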

  6. Calibration methodology for proportional counters applied to yield measurements of a neutron burst.

    PubMed

    Tarifeño-Saldivia, Ariel; Mayer, Roberto E; Pavez, Cristian; Soto, Leopoldo

    2014-01-01

    This paper introduces a methodology for measuring the yield of a neutron burst using neutron proportional counters. The methodology is to be applied when single neutron events cannot be resolved in time by standard nuclear electronics, or when a continuous current cannot be measured at the output of the counter. It is based on calibrating the counter in pulse mode and on using a statistical model to estimate the number of detected events from the charge accumulated during detection of the burst of neutrons. The model is developed and presented in full detail. The implementation of the methodology for measuring fast neutron yields from plasma focus experiments with a moderated proportional counter is discussed, and an experimental verification of its accuracy is presented. Using this methodology, the accuracy of the detection system is improved by more than one order of magnitude with respect to previous calibration methods.
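    As an illustration of the general idea only (not the paper's statistical model), the number of detected neutrons in a burst can be estimated by dividing the accumulated charge by the mean charge per single-event pulse obtained from the pulse-mode calibration. The variable names and the simple uncertainty propagation below are assumptions made for the example.

    ```python
    # Toy estimate of detected events in a neutron burst from integrated charge.
    import numpy as np

    def calibrate_mean_charge(single_event_charges_nC):
        """Pulse-mode calibration: mean and spread of the charge per detected neutron."""
        q = np.asarray(single_event_charges_nC, dtype=float)
        return q.mean(), q.std(ddof=1)

    def events_from_burst_charge(total_charge_nC, q_mean, q_std):
        """Estimated event count and a rough 1-sigma uncertainty."""
        n_hat = total_charge_nC / q_mean
        # combine counting statistics with the spread of the charge per event
        sigma_n = np.sqrt(n_hat * (1.0 + (q_std / q_mean) ** 2))
        return n_hat, sigma_n
    ```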

  7. The Economic Impact of Closed-Incision Negative-Pressure Therapy in High-Risk Abdominal Incisions: A Cost-Utility Analysis.

    PubMed

    Chopra, Karan; Gowda, Arvind U; Morrow, Chris; Holton, Luther; Singh, Devinder P

    2016-04-01

    Complex abdominal wall reconstruction is beset by postoperative complications. A recent meta-analysis comparing the use of closed-incision negative-pressure therapy to standard dressings found a statistically significant reduction in surgical-site infection. The use of closed-incision negative-pressure therapy is gaining acceptance in this population; however, the economic impact of this innovative dressing remains unknown. In this study, a cost-utility analysis was performed assessing closed-incision negative-pressure therapy and standard dressings following closure of abdominal incisions in high-risk patients. Cost-utility methodology involved reviewing literature related to closed-incision negative-pressure therapy in abdominal wall surgery, obtaining utility estimates to calculate quality-adjusted life-year scores for successful surgery and surgery complicated by surgical-site infection, summing costs using Medicare Current Procedural Terminology codes, and creating a decision tree illuminating the most cost-effective dressing strategy. One-way sensitivity analysis was performed to assess the robustness of the results. The aforementioned meta-analysis comparing closed-incision negative-pressure therapy to standard dressings included a subset of five studies assessing abdominal wall surgery in 829 patients (260 closed-incision negative-pressure therapy and 569 standard dressings). Decision tree analysis revealed an estimated savings of $1546.52 and a gain of 0.0024 quality-adjusted life-year with closed-incision negative-pressure therapy compared with standard dressings; therefore, closed-incision negative-pressure therapy is a dominant treatment strategy. One-way sensitivity analysis revealed that closed-incision negative-pressure therapy is a cost-effective option when the surgical-site infection rate is greater than 16.39 percent. The use of closed-incision negative-pressure therapy is cost-saving following closure of abdominal incisions in high-risk patients.
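    The decision-tree arithmetic behind such a cost-utility comparison reduces to expected costs and expected quality-adjusted life-years under each dressing strategy. The sketch below uses invented placeholder inputs purely to show that arithmetic; it does not reproduce the study's costs, utilities, or probabilities.

    ```python
    # Expected cost and QALYs for one arm of a simple two-outcome decision tree.
    def expected_outcomes(p_ssi, cost_dressing, cost_ssi, qaly_ok, qaly_ssi):
        cost = cost_dressing + p_ssi * cost_ssi
        qaly = (1 - p_ssi) * qaly_ok + p_ssi * qaly_ssi
        return cost, qaly

    # Hypothetical inputs: ciNPT costs more up front but lowers the SSI probability.
    std_cost, std_qaly = expected_outcomes(p_ssi=0.25, cost_dressing=50, cost_ssi=15000, qaly_ok=0.80, qaly_ssi=0.72)
    npt_cost, npt_qaly = expected_outcomes(p_ssi=0.15, cost_dressing=500, cost_ssi=15000, qaly_ok=0.80, qaly_ssi=0.72)
    # A strategy is "dominant" when it is both cheaper and yields more QALYs.
    print(std_cost - npt_cost, npt_qaly - std_qaly)
    ```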

  8. Opening a Gateway for Chemiluminescence Cell Imaging: Distinctive Methodology for Design of Bright Chemiluminescent Dioxetane Probes

    PubMed Central

    2017-01-01

    Chemiluminescence probes are considered to be among the most sensitive diagnostic tools that provide high signal-to-noise ratio for various applications such as DNA detection and immunoassays. We have developed a new molecular methodology to design and foresee light-emission properties of turn-ON chemiluminescence dioxetane probes suitable for use under physiological conditions. The methodology is based on incorporation of a substituent on the benzoate species obtained during the chemiexcitation pathway of Schaap’s adamantylidene–dioxetane probe. The substituent effect was initially evaluated on the fluorescence emission generated by the benzoate species and then on the chemiluminescence of the dioxetane luminophores. A striking substituent effect on the chemiluminescence efficiency of the probes was obtained when acrylate and acrylonitrile electron-withdrawing groups were installed. The chemiluminescence quantum yield of the best probe was more than 3 orders of magnitude higher than that of a standard, commercially available adamantylidene–dioxetane probe. These are the most powerful chemiluminescence dioxetane probes synthesized to date that are suitable for use under aqueous conditions. One of our probes was capable of providing high-quality chemiluminescence cell images based on endogenous activity of β-galactosidase. This is the first demonstration of cell imaging achieved by a non-luciferin small-molecule probe with direct chemiluminescence mode of emission. We anticipate that the strategy presented here will lead to development of efficient chemiluminescence probes for various applications in the field of sensing and imaging. PMID:28470053

  9. International consensus for neuroblastoma molecular diagnostics: report from the International Neuroblastoma Risk Group (INRG) Biology Committee

    PubMed Central

    Ambros, P F; Ambros, I M; Brodeur, G M; Haber, M; Khan, J; Nakagawara, A; Schleiermacher, G; Speleman, F; Spitz, R; London, W B; Cohn, S L; Pearson, A D J; Maris, J M

    2009-01-01

    Neuroblastoma serves as a paradigm for utilising tumour genomic data for determining patient prognosis and treatment allocation. However, before the establishment of the International Neuroblastoma Risk Group (INRG) Task Force in 2004, international consensus on markers, methodology, and data interpretation did not exist, compromising the reliability of decisive genetic markers and inhibiting translational research efforts. The objectives of the INRG Biology Committee were to identify highly prognostic genetic aberrations to be included in the new INRG risk classification schema and to develop precise definitions, decisive biomarkers, and technique standardisation. The review of the INRG database (n=8800 patients) by the INRG Task Force enabled the identification of the most significant neuroblastoma biomarkers. In addition, the Biology Committee compared the standard operating procedures of different cooperative groups to arrive at an international consensus on methodology, nomenclature, and future directions. Consensus was reached to include MYCN status, 11q23 allelic status, and ploidy in the INRG classification system on the basis of an evidence-based review of the INRG database. Standardised operating procedures for analysing these genetic factors were adopted, and criteria for proper nomenclature were developed. Neuroblastoma treatment planning is highly dependent on tumour cell genomic features, and it is likely that a comprehensive panel of DNA-based biomarkers will be used in future risk assignment algorithms applying genome-wide techniques. Consensus on methodology and interpretation is essential for uniform INRG classification and will greatly facilitate international and cooperative clinical and translational research studies. PMID:19401703

  10. Current trends in protein crystallization.

    PubMed

    Gavira, José A

    2016-07-15

    Proteins belong to the most complex colloidal systems in terms of their physicochemical properties, size, and conformational flexibility. This complexity contributes to their great sensitivity to any external change and dictates the uncertainty of crystallization. The need for 3D models to understand their functionality and their interaction mechanisms with neighbouring (macro)molecules has driven the tremendous effort put into the field of crystallography, an effort that has also permeated other fields trying to shed some light on reluctant-to-crystallize proteins. This review is aimed at reviewing protein crystallization from a regular-laboratory point of view. It is also devoted to highlighting the latest developments and achievements to produce, identify, and deliver high-quality protein crystals for XFEL, Micro-ED, or neutron diffraction. The low likelihood of protein crystallization is rationalized by considering the intrinsic polypeptide nature (folded state, surface charge, etc.), followed by a description of the standard crystallization methods (batch, vapour diffusion, and counter-diffusion), including high-throughput advances. Other methodologies aimed at determining protein features in solution (NMR, SAS, DLS) or at gathering structural information from single particles, such as Cryo-EM, are also discussed. Finally, current approaches showing the convergence of different structural biology techniques and the adaptation of cross-methodologies to tackle the most difficult problems are presented. Current advances in biomacromolecule crystallization, from nanocrystals for XFEL and Micro-ED to large crystals for neutron diffraction, are covered, with special emphasis on methodologies applicable at laboratory scale. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. High Mitochondrial DNA Stability in B-Cell Chronic Lymphocytic Leukemia

    PubMed Central

    Cerezo, María; Bandelt, Hans-Jürgen; Martín-Guerrero, Idoia; Ardanaz, Maite; Vega, Ana; Carracedo, Ángel; García-Orad, África; Salas, Antonio

    2009-01-01

    Background: Chronic Lymphocytic Leukemia (CLL) leads to progressive accumulation of lymphocytes in the blood, bone marrow, and lymphatic tissues. Previous findings have suggested that the mtDNA could play an important role in CLL. Methodology/Principal Findings: The mitochondrial DNA (mtDNA) control region was analyzed in lymphocyte DNA extracts and compared with the granulocyte counterpart extracts of 146 patients suffering from B-cell CLL (B-CLL), all recruited from the Basque Country. Major efforts were undertaken to rule out methodological artefacts that would yield a high false positive rate for mtDNA instabilities and thus lead to erroneous interpretation of sequence instabilities. Only twenty instabilities were finally confirmed, most of them affecting the homopolymeric stretch located in the second hypervariable segment (HVS-II) around position 310, which is well known to constitute an extreme mutational hotspot of length polymorphism, as these mutations are frequently observed in the general human population. A critical revision of the findings of previous studies indicates a lack of proper methodological standards, which eventually led to an overinterpretation of the role of the mtDNA in CLL tumorigenesis. Conclusions/Significance: Our results suggest that mtDNA instability is not the primary causal factor in B-CLL. A secondary role of mtDNA mutations cannot be fully ruled out under the hypothesis that the progressive accumulation of mtDNA instabilities could finally contribute to the tumoral process. Recommendations are given that would help to minimize erroneous interpretation of sequencing results in mtDNA studies of tumorigenesis. PMID:19924307

  12. The role of reporting standards in producing robust literature reviews

    NASA Astrophysics Data System (ADS)

    Haddaway, Neal Robert; Macura, Biljana

    2018-06-01

    Literature reviews can help to inform decision-making, yet they may be subject to fatal bias if not conducted rigorously as 'systematic reviews'. Reporting standards help authors to provide sufficient methodological detail to allow verification and replication, clarifying when key steps, such as critical appraisal, have been omitted.

  13. DARPA ANTIBODY TECHNOLOGY PROGRAM STANDARDIZED TEST BED FOR ANTIBODY CHARACTERIZATION: CHARACTERIZATION OF TWO MS2 SCFV ANTIBODIES PRODUCED BY THE UNIVERSITY OF TEXAS

    DTIC Science & Technology

    2017-05-01

    …a quality program for the standardization of test methods to support comprehensive characterization and comparison of the physical and functional… [fragmentary record; recoverable table-of-contents entries: 2. MATERIALS AND METHODS; 2.8 SPR Methodology]

  14. Development of a Methodology for Assessing Aircrew Workloads.

    DTIC Science & Technology

    1981-11-01

    [Fragmentary record; recoverable table-of-contents entries: Workload Feasibility Study; Subjects; Equipment; Data Analysis] Keywords: analysis; simulation; standard time systems; switching synthetic time systems; task activities; task interference; time study; tracking; workload; work sampling. …standard data systems, information content analysis, work sampling, and job evaluation. Conventional methods were found to be deficient in accounting…

  15. A Comparison of Approaches for Setting Proficiency Standards.

    ERIC Educational Resources Information Center

    Koffler, Stephen L.

    This research compared the cut-off scores estimated from an empirical procedure (Contrasting group method) to those determined from a more theoretical process (Nedelsky method). A methodological and statistical framework was also provided for analysis of the data to obtain the most appropriate standard using the empirical procedure. Data were…

  16. Children's Spirit: Leadership Standards and Chief School Executives

    ERIC Educational Resources Information Center

    Boske, Christa

    2009-01-01

    Purpose: The purpose of this study is to increase awareness of the interactions among school leadership standards, cultural competence, and decision-making practices for chief school executives. Design/methodology/approach: To achieve this objective, 1,087 chief school executives, who were members of the American Association of School…

  17. About the necessity of standardizing no-tillage research

    USDA-ARS?s Scientific Manuscript database

    No-tillage / zero tillage research has now been performed for more than half a century in many countries around the world but few efforts have been made to standardize research methodology. This has led to a situation where no-tillage research results obtained until now often can not be compared bec...

  18. Why do we need to standardize no-tillage research?

    USDA-ARS?s Scientific Manuscript database

    No-tillage / conservation agricultural systems research has now been performed for more than half a century in many countries around the world, but few efforts have been made to standardize research methodology. This has led to a situation where no-tillage research results have often not been direct...

  19. Instructional Alignment under No Child Left Behind

    ERIC Educational Resources Information Center

    Polikoff, Morgan S.

    2012-01-01

    The alignment of instruction with the content of standards and assessments is the key mediating variable separating the policy of standards-based reform (SBR) from the outcome of improved student achievement. Few studies have investigated SBR's effects on instructional alignment, and most have serious methodological limitations. This research uses…

  20. Combining Archetypes with Fast Health Interoperability Resources in Future-proof Health Information Systems.

    PubMed

    Bosca, Diego; Moner, David; Maldonado, Jose Alberto; Robles, Montserrat

    2015-01-01

    Messaging standards, and specifically HL7 v2, are heavily used for the communication and interoperability of Health Information Systems. HL7 FHIR was created as an evolution of the messaging standards to achieve semantic interoperability. FHIR is somewhat similar to other approaches, such as the dual model methodology, in that both are based on the precise modeling of clinical information. In this paper, we demonstrate how the dual model methodology can be applied to standards like FHIR. We show the usefulness of this approach for data transformation between FHIR and other specifications such as HL7 CDA, EN ISO 13606, and openEHR. We also discuss the advantages and disadvantages of defining archetypes over FHIR, and the consequences and outcomes of this approach. Finally, we exemplify this approach by creating a testing data server that supports both FHIR resources and archetypes.
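    A minimal sketch of the flavour of layering archetype-style constraints over a FHIR resource is given below. The resource content and the constraint set are invented for the illustration and do not represent the authors' tooling or any normative archetype.

    ```python
    # Toy example: checking archetype-like constraints against a FHIR Observation.
    blood_pressure = {  # a minimal, hypothetical FHIR Observation resource
        "resourceType": "Observation",
        "status": "final",
        "code": {"coding": [{"system": "http://loinc.org", "code": "8480-6"}]},
        "valueQuantity": {"value": 120, "unit": "mmHg"},
    }

    # Here an "archetype" is simply a list of (path, predicate) constraints.
    constraints = [
        (["resourceType"], lambda v: v == "Observation"),
        (["status"], lambda v: v in {"final", "amended"}),
        (["valueQuantity", "unit"], lambda v: v == "mmHg"),
        (["valueQuantity", "value"], lambda v: 0 < v < 300),
    ]

    def get_path(resource, path):
        """Walk nested dictionary keys to reach the constrained element."""
        for key in path:
            resource = resource[key]
        return resource

    violations = [p for p, ok in constraints if not ok(get_path(blood_pressure, p))]
    print("valid" if not violations else f"violated: {violations}")
    ```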
